Nostr Archives
Max Trotter · 32d ago
Want to run powerful LLMs 100% locally & offline? Here's a quick install for Ollama + Open WebUI on Linux:

1. curl -fsSL https://ollama.com/install.sh | sh
2. sudo systemctl enable --now ollama
3. sudo apt update && sudo apt install docker.io docker-compose -y
4. sudo usermod -aG docker $USER  # then log out & back in
5. docker run -d --name open-webui --network host --restart unless-stopped \
     -v open-webui:/app/backend/data \
     -e OLLAMA_BASE_URL=http://127.0.0.1:11434 \
     ghcr.io/open-webui/open-webui:main

→ Open http://localhost:8080 (create admin account)

Enjoy private AI on your machine. No cloud. No tracking. šŸš€

#nostr #AI #selfhosted #ollama #openwebui #localLLM
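If you prefer a declarative setup over the one-off `docker run`, the same container can be described in a Compose file. This is a sketch that mirrors the flags from the post (image, volume name, host networking, restart policy, OLLAMA_BASE_URL); the filename `docker-compose.yml` and the service name `open-webui` are illustrative choices, not part of the original instructions.

```yaml
# docker-compose.yml — sketch equivalent of the `docker run` command above.
# Assumes Ollama is already running on the host at 127.0.0.1:11434.
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    network_mode: host          # same as --network host
    restart: unless-stopped     # same as --restart unless-stopped
    volumes:
      - open-webui:/app/backend/data
    environment:
      - OLLAMA_BASE_URL=http://127.0.0.1:11434

volumes:
  open-webui:                   # named volume, matches -v open-webui:/app/backend/data
```

Start it with `docker-compose up -d` (or `docker compose up -d` on newer Docker) from the directory containing the file; the upside over `docker run` is that the configuration is versionable and re-creatable with one command.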

Replies (0)

No replies yet.