GPU

Connecting Stable Diffusion WebUI to your locally running Open WebUI


Connecting Stable Diffusion WebUI to Ollama and Open WebUI, so your locally running LLM can generate …

 · 6 min · torgeir

Running LLMs locally with Ollama and open-webui


Enjoying LLMs but don't care for giving away all your data? Here's how to run your own little …

 · 4 min · torgeir