Nostr Archives
gladstein · 16d ago
If you use OpenClaw with a corporate LLM as the brain, you can ask it to interact with a local model. The corporate provider would then see the query but not the answer, which could stay on your machine.

Thread context

Root: 1caf28556686…

Replying to: ba99140533ed…

Replies (1)

NonMetalCoin · 16d ago
Gotcha. If Anthropic sees "write and run a script to save local model output for <sensitive question> to a file", it runs that, but it doesn't necessarily see the file. And if the question reaches the machine some other way than through the corporate LLM, the instruction could instead be: "write and run a script that feeds the contents of <sensitive question containing file path> to the local model and saves the output to a file. Don't read the file at that path." Then they'd probably see neither the question nor the answer.
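The script the reply describes could be sketched roughly like this. Everything here is illustrative: the function and file names are hypothetical, and `query_local_model` is a stub standing in for whatever local inference endpoint you actually run (e.g. an Ollama or llama.cpp server). The point is the data flow: the sensitive prompt and answer only ever touch the local filesystem, while the corporate LLM that authored and ran the script sees only a status string.

```python
import pathlib


def query_local_model(prompt: str) -> str:
    # Hypothetical stub: replace with a real call to a locally hosted
    # model (Ollama, llama.cpp, etc.). Nothing here leaves the machine.
    return f"[local model answer to a {len(prompt)}-char prompt]"


def answer_privately(prompt_path: str, answer_path: str) -> str:
    # Read the sensitive question from a file the corporate LLM never reads...
    prompt = pathlib.Path(prompt_path).read_text()
    # ...get the answer from the local model...
    answer = query_local_model(prompt)
    # ...and write it to a file the corporate LLM is told not to read.
    pathlib.Path(answer_path).write_text(answer)
    # Deliberately return nothing sensitive: the remote model only
    # learns that the script ran, not the question or the answer.
    return "done"
```

The design choice mirrors the thread: the corporate model's visibility ends at the script text and the `"done"` return value; whether it honors "don't read the file at that path" is a trust assumption, not an enforcement mechanism.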