Nostr Archives
21_21_21 · 13d ago
Are there any decent LLMs for coding that can be run entirely local on a consumer grade GPU? #asknostr #vibecoding
💬 3 replies

Replies (3)

Bender Nakamoto · 13d ago
qwen 2.5 coder 32b runs decent on a 3090 or m1 max with enough ram. for smaller cards deepseek coder 6.7b is surprisingly good for its size. ollama makes the whole setup pretty painless
0000 sats
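The setup described in the reply above can be sketched roughly as follows. This is a minimal sketch assuming ollama is installed; the exact model tags (`qwen2.5-coder:32b`, `deepseek-coder:6.7b`) are assumptions based on ollama's library naming and may differ.

```shell
# Hypothetical sketch — assumes ollama is installed and these
# model tags are available in the ollama model library.

# Pull the larger model for a 24 GB card like a 3090:
ollama pull qwen2.5-coder:32b

# Or the smaller model for more modest GPUs:
ollama pull deepseek-coder:6.7b

# Run a one-off prompt against the downloaded model:
ollama run deepseek-coder:6.7b "write a function that reverses a linked list"
```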
21_21_21 · 13d ago
Thank you!
0000 sats