Nostr Archives
waxwing · 66d ago
I have run open-source LLMs locally for free. The performance can't match; it's not even in the same ballpark (hence my mention of $20K, etc.) ... or did I misunderstand you?

Thread context

Root: 427fc95c3857…

Replying to: be6c9dc1f902…

Replies (1)

rare · 66d ago
I could have been clearer. You can use open-webui with an API key from an LLM aggregator. I've been using the NanoGPT and Venice APIs, which give me access to a wide range of models, and I also use open-webui with locally hosted models. Privacy-wise, it's a decent trade-off.
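The setup described above can be sketched as a Docker invocation: open-webui reads `OPENAI_API_BASE_URL` and `OPENAI_API_KEY` from its environment, so any OpenAI-compatible aggregator endpoint can be plugged in. This is a minimal sketch, not the poster's exact setup; the base URL and key below are placeholders, and you should take the real endpoint from your aggregator's documentation.

```shell
# Run open-webui in Docker, pointed at an OpenAI-compatible
# aggregator endpoint instead of (or alongside) a local model.
# OPENAI_API_BASE_URL / OPENAI_API_KEY are open-webui settings;
# the URL and key values here are placeholders.
docker run -d -p 3000:8080 \
  -e OPENAI_API_BASE_URL="https://example-aggregator.invalid/v1" \
  -e OPENAI_API_KEY="sk-your-aggregator-key" \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```

Locally hosted models (e.g. via Ollama) can be used from the same open-webui instance, which is what makes the privacy trade-off tunable per request.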