Nostr Archives
OKIN · 27d ago
I don’t know about the rest of you but I’m enjoying my OpenClaw bot, Slick, thus far. Made homie run using Ollama & open-source models & things are great for now. Still have some configuration to do, but we’re figuring out how best to get it done. No tokens running out around me, got speech-to-text working so I can send voicenotes & am busy refining my Mission Control, all while lying in bed & using Telegram 😍

Replies (20)

OKIN · 27d ago
Running this bad boy off a 2015 Dell Laptop with Kali Linux on it 🤓
OKIN · 27d ago
Since I have 1TB I guess I’ll just download this 51GB model to see if it works. Should probably download a few other smaller ones too, since it’s an HDD 😅
OKIN · 27d ago
Using a few different ones: minimax, qwen3, Kimi 2.5 & glm4.7-flash. Basically checking out all the free ones I can download on my computer and a few cloud ones too so I can compare the results.
Diego Valley · 26d ago
I started using Ollama yesterday. Are you running OpenClaw locally then?
Gigi · 26d ago
This is the way.
OKIN · 26d ago
Blessings! It’s a wonderful time to be alive indeed. I now speak to my openclaw & the open source models through telegram instead of typing. All my Open Source models run as subagents of the main model, currently minimax, to achieve different tasks simultaneously. Plus I just got my agent, Slick, to install tts so she can send me voicenotes as responses as well. 🥳 Soon one of us humans shall build better than telegram for this use case too 🫡
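The subagent setup described above — several local models handling different tasks at the same time under one main model — boils down to a concurrent fan-out. Here is a minimal, hypothetical sketch of that pattern: `run_subagents` and `fake_ask` are illustrative names (not OpenClaw’s actual API), the model names are taken from the thread, and in a real setup the `ask` callable would hit the local Ollama server instead of a stub.

```python
from concurrent.futures import ThreadPoolExecutor

def run_subagents(tasks, ask):
    """Fan tasks out to model-specific subagents concurrently.

    tasks: dict mapping a model name to the prompt it should handle.
    ask:   callable (model, prompt) -> reply; in a real setup this
           would call a locally running Ollama model.
    """
    with ThreadPoolExecutor(max_workers=len(tasks)) as pool:
        futures = {model: pool.submit(ask, model, prompt)
                   for model, prompt in tasks.items()}
        # Collect each subagent's reply once its future resolves.
        return {model: fut.result() for model, fut in futures.items()}

# Stub standing in for a real model call, so the sketch is self-contained:
def fake_ask(model, prompt):
    return f"[{model}] done: {prompt}"

replies = run_subagents(
    {"minimax": "summarise my inbox",
     "qwen3": "draft a todo list"},
    fake_ask,
)
```

Because the dispatch function takes the model-calling function as a parameter, the same fan-out works unchanged whether the backends are local Ollama models or cloud ones.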
Diego Valley · 26d ago
This is brilliant! And great to read. I started running Ollama and qwen3. I’m probably overthinking running OpenClaw because of the security issues I’ve read about, and haven’t taken the plunge yet. Are there guides or YT vids you’d recommend?
Seinz Fiction · 26d ago
I’m trying different local models as well with my M4 Mac Mini. However, I’m still having trouble when telling OpenClaw to do things on its own, like setting up a Git repository. It then does stuff and asks me to check in the file system whether it has been created. It should be able to check this on its own.
OKIN · 26d ago
That I am. Only way I’d do it for now tbh
OKIN · 26d ago
I took the plunge with an old laptop I already had, to see what could and couldn’t be done. There are quite a few guides out there, but generally Ollama has one-liners to run almost all of its models, so you really don’t have to do much: run one of them before you install OpenClaw, and the installer will ask if you want to use the Ollama model before you even see the other options. It takes a bit of tinkering, so if you have an old laptop, try stuff out on that one first. It’s a lot of fun crafting my agent into the amazing assistant I know she’ll be. This is probably one of the better videos I found, as most are about VPS setups -> https://youtu.be/eDIDysgEHUU
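For context on the one-liner flow mentioned above: once an Ollama model is running (e.g. after `ollama run qwen3:8b`), anything on the machine can talk to it over Ollama’s local HTTP API, which listens on port 11434 by default. A rough sketch, assuming a local server is up — `build_request` and `ask_ollama` are illustrative helper names, not part of Ollama itself:

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model, prompt):
    """Build the JSON payload for Ollama's /api/generate endpoint.

    stream=False asks for a single complete JSON response instead
    of a stream of partial chunks.
    """
    return {"model": model, "prompt": prompt, "stream": False}

def ask_ollama(model, prompt):
    """Send a prompt to a locally running Ollama and return the reply text."""
    data = json.dumps(build_request(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Building the payload needs no server; sending it does.
payload = build_request("qwen3:8b", "Say hello in one word.")
```

Swapping models is then just a string change in the payload, which is why comparing minimax, qwen3, and the others from the same harness is cheap.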
Diego Valley · 25d ago
Thanks 🙏 any updates?
Seinz Fiction · 24d ago
I’m tinkering around with the local Qwen3:30b model. I’m working on getting a Morning Briefing working that gives me information on important news from the Forex Factory calendar. It generated a Python script for scraping the data from the site.
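The briefing step after the scrape — filter the calendar down to the events worth waking up for and format them — is the easy part to pin down. A minimal sketch, assuming the scraper emits one dict per event; the field names (`time`, `currency`, `impact`, `title`) and the `morning_briefing` helper are illustrative, not Forex Factory’s actual markup:

```python
def morning_briefing(events, min_impact="high"):
    """Format high-impact calendar events into a short briefing.

    events: list of dicts like
        {"time": "08:30", "currency": "USD", "impact": "high", "title": "CPI y/y"}
    — the shape a calendar scraper might emit.
    """
    picked = [e for e in events if e["impact"] == min_impact]
    lines = [f'{e["time"]} {e["currency"]}: {e["title"]}' for e in picked]
    return "\n".join(lines) if lines else "No high-impact events today."

briefing = morning_briefing([
    {"time": "08:30", "currency": "USD", "impact": "high", "title": "CPI y/y"},
    {"time": "10:00", "currency": "EUR", "impact": "low", "title": "Trade Balance"},
])
```

Keeping the formatting separate from the scraping also means the briefing logic survives unchanged if the site’s markup shifts and the scraper has to be regenerated.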
Seinz Fiction · 24d ago
With the local Qwen3:30b model I felt my #openclaw was hallucinating and not really doing what I wanted it to do. So I started my Clawdi fresh with the Ollama kimi-k2.5:cloud model and it feels much better than before.
Diego Valley · 24d ago
Thx. What machine are you running on? I’ve not downloaded the 30b model; not sure if my machine will handle it. Just running qwen3 8b locally at the minute.
Seinz Fiction · 24d ago
I’m running a Mac Mini M4 with 32 GB of memory. Qwen3:30b generally works on it.
Diego Valley · 24d ago
Ok, I’m on an m1 16gb.
Diego Valley · 24d ago
Have you done anything with n8n to create agents?