Nostr Archives
Nick Klockenga13d ago
Anyone using local LLMs for coding? For those that do, any recommendations on which models to use? I’m considering a big upgrade to my MBP (2020 Intel) and wondering if shelling out for an M5 Pro with 64GB would be worth it.
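For rough sizing, a quantized model's weights take about (parameters × bits per weight ÷ 8) bytes, with extra RAM needed on top for the KV cache and runtime. A back-of-envelope sketch (the model sizes and quantization levels here are illustrative assumptions, not benchmarks of any specific model):

```python
# Rough memory estimate for quantized local LLM weights.
# Formula: params (billions) * bits_per_weight / 8 -> GB of weights.
# Real usage adds KV cache and runtime overhead on top of this.

def weight_gb(params_b: float, bits: float) -> float:
    """Approximate in-RAM size of quantized weights in GB."""
    return params_b * bits / 8

for params in (7, 14, 32, 70):
    for bits in (4, 8):  # common llama.cpp-style quantizations (Q4, Q8)
        print(f"{params}B @ {bits}-bit ~= {weight_gb(params, bits):.1f} GB")
```

By this estimate, a 64GB machine fits a 70B model at 4-bit (~35 GB of weights) with room left for context, while the same model at 8-bit (~70 GB) does not.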

Replies (3)

Logen13d ago
I have the M3 Max with 32GB and the performance is not fantastic for local models. It’s pretty slow, hallucinates a lot, and just makes my laptop really, really hot for not a lot of benefit. I still end up resorting to Antigravity and either the Claude Code extension or the built-in agent.
Nick Klockenga13d ago
Good info. This is what I’m worried about.
Logen13d ago
They tout 6X peak GPU compute for AI performance compared to M1. But the thing is, they’re comparing it to M1 😅