I don't get it. It says nothing leaves your computer, but it's sending things to OpenRouter, not running models locally. Perhaps I'm dumb (tbf, I always feel dumb after reading an AI-generated README for yet another AI tool).
> Yes. OpenYak is local-first. Your conversations and files are stored only on your machine. When using cloud models, only API calls to LLM providers leave your computer.
So local-first and still upload files to cloud models if you configure it.
Yes, it appears your personal data IS being sent to OpenRouter and the model provider here. The problem, I think, is that a lot of people (especially in the OpenClaw community) take "I run it on my Mac mini" to mean their data is private. Meanwhile all the data is being shipped off for training to Anthropic via OpenRouter, and both of those parties see everything.
I guess you could theoretically plug in a local model here, but of course the README should be more precise when talking about privacy.
This is a weird phenomenon with some dictation apps too.
They talk about privacy first, they talk up local-first, and then their default settings send every syllable to someone else's computer. Once you understand the app it's trivial to make it local, but there's a good chance your first transcript is coming off a server, contrary to what the marketing material suggested.
Not to be too conspiratorial here but since the founder of OpenClaw was snatched up, there seems to be a rush of “open source” AI projects desperately bidding to be alternatives. Which can generate huge returns if one of the major players decides that “they also need a cowork-style product”
So it's uniquely viable to be a sellout here and attempt to clone a major lab's product on the off-chance you get acquired later.
I've got the strong feeling that AI models and agents require a different operating system (OS) paradigm, one that's data-centric rather than file-system-centric, for more efficient, effective, and trustworthy operation. This new OS should work seamlessly with data natively across different processors, for example CPUs, GPUs, TPUs, NPUs, accelerators, etc.
For a working example, check TabulaROSA (Tabular Operating System Architecture), proposed by the MIT team. Instead of normal OS system calls, it uses data-based operations with D4M, which work mathematically via associative arrays on structured or unstructured data [1],[2].
With the advent of new CPU acceleration for fully homomorphic encryption, as demonstrated by Intel, AI models and agents can analyze data without ever decrypting it [3],[4].
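The associative-array idea can be sketched in a few lines. This is only a toy illustration of the general concept (keyed (row, col) entries that support mathematical operations), not D4M's actual API:

```python
# Toy associative array: maps (row, col) keys to values, so
# lookups and merges become operations on the array itself.
# NOT D4M's real interface -- just an illustration of the idea.
from collections import defaultdict


class AssocArray:
    def __init__(self, entries=None):
        # entries: dict mapping (row, col) -> value
        self.data = dict(entries or {})

    def row(self, row_key):
        """Select all entries in one row (a keyed lookup, not a syscall)."""
        return AssocArray({(r, c): v for (r, c), v in self.data.items()
                           if r == row_key})

    def __add__(self, other):
        """Element-wise '+' acts like a union/merge of two arrays."""
        out = defaultdict(int)
        for k, v in list(self.data.items()) + list(other.data.items()):
            out[k] += v
        return AssocArray(out)


# Two small "tables" as associative arrays (hypothetical data)
a = AssocArray({("doc1", "word|cat"): 1, ("doc1", "word|dog"): 2})
b = AssocArray({("doc1", "word|cat"): 3, ("doc2", "word|fish"): 1})

merged = a + b
print(merged.data[("doc1", "word|cat")])  # prints 4
```

The point of the paradigm is that the same `+`-style operation works uniformly over structured and unstructured data, instead of routing everything through file-system calls.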
The prompts contain e.g. a terminal UI, which gives you root access to the machine. If someone can access that UI and its backend, they can do whatever they want! So make sure to put it behind a firewall, basic auth, or something similar.
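For the basic-auth option, here is a minimal sketch of what that gate looks like, using only the Python standard library. The credentials, port, and realm name are placeholders; in practice you'd more likely put a reverse proxy such as nginx or Caddy in front of the backend instead:

```python
# Minimal sketch: gating a local web UI behind HTTP Basic Auth.
# Placeholder credentials -- replace with real secrets in practice.
import base64
from http.server import BaseHTTPRequestHandler, HTTPServer

USER, PASSWORD = "admin", "change-me"  # placeholders
EXPECTED = "Basic " + base64.b64encode(f"{USER}:{PASSWORD}".encode()).decode()


class AuthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Reject any request that lacks the right Authorization header
        if self.headers.get("Authorization") != EXPECTED:
            self.send_response(401)
            self.send_header("WWW-Authenticate", 'Basic realm="agent-ui"')
            self.end_headers()
            return
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"authenticated\n")


# To actually serve on localhost, uncomment:
# HTTPServer(("127.0.0.1", 8080), AuthHandler).serve_forever()
```

Note this only gates HTTP access: anyone who has the credentials (or finds another route to the backend) still gets the full power of the underlying agent, so a firewall rule restricting who can reach the port at all is still worthwhile.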
Barbing | a day ago
Given the software's broad appeal, I'd rephrase to make it clearer that every word/file you send would leave your computer.
raincole | a day ago
Are you saying this part is a lie?
girvo | a day ago
In that I'm saying their AI generated slop README is misleading enough to be lying.
When you prominently say "all without uploading anything to the cloud." but the default does exactly that, a single half-sentence mentioning Ollama doesn't cut it.
jstummbillig | a day ago
Where does it say that?
It sends to OpenRouter if you choose to use OpenRouter. You can use Ollama. Idk how to get more local than that? Any tool will be non-local when you do something explicitly non-local.
gbalduzzi | a day ago
I agree that someone may misunderstand their phrasing though
teleforce | a day ago
[1] TabulaROSA: Tabular Operating System Architecture for Massively Parallel Heterogeneous Compute Engines
https://dspace.mit.edu/handle/1721.1/126114
[2] D4M: Dynamic Distributed Dimensional Data Model
https://d4m.mit.edu/
[3] Intel Demos Chip to Compute with Encrypted Data (Hacker News discussion)
https://news.ycombinator.com/item?id=47322815
[4] Intel Demos Chip to Compute With Encrypted Data: Fully homomorphic encryption chip speeds operations 5,000-fold
https://spectrum.ieee.org/fhe-intel
rbren | a day ago
Here are the prompts I use for my AI environment, though it's changed a bunch since the last snapshot
https://github.com/rbren/personal-ai-devbox
m_kos | a day ago
What do you mean by interfaces in "These interfaces can do literally anything on the host machine. You're responsible for your own security"?
Also, your backdooring image links to a 404.
coldtrait | a day ago
https://news.ycombinator.com/item?id=47560380#47560381
When it's clear he is one of the major contributors to the project?
https://github.com/openyak/desktop/graphs/contributors
zombot | a day ago
Just when I thought it couldn't get worse than OpenClaw, someone proposes this, in all seriousness. I see a stellar future for them at OpenAI.