OpenYak – An open-source Cowork that runs any model and owns your filesystem

80 points by wangzhangwu a day ago on Hacker News | 39 comments

SilverElfin | a day ago

What does “owns your filesystem” mean? That sounds dangerous.

h05sz487b | a day ago

It's your filesystem which is now, also, owned.

spiderfarmer | a day ago

I have used Cowork so much over the last couple of months and I have no reason to switch. But I’ll definitely give this a try.

girvo | a day ago

I don’t get it. It says nothing leaves your computer, but it’s sending things to OpenRouter, not running models locally. Perhaps I am dumb (and I always feel dumb after reading an AI-generated README for yet another AI tool, tbf)

_ache_ | a day ago

> Yes. OpenYak is local-first. Your conversations and files are stored only on your machine. When using cloud models, only API calls to LLM providers leave your computer.

So it's local-first, and it still uploads files to cloud models if you configure it that way.

Barbing | a day ago

>only API calls

Given the software's broad appeal, I’d rephrase to make it clearer that every word/file you send would leave your computer.

_ache_ | a day ago

Absolutely, that is misleading to less technical people, maybe intentionally. It does not inspire confidence.
Yes, it appears your personal data IS being sent to OpenRouter and the model provider here. The problem, I think, is that a lot of people (especially in the OpenClaw community) mistake “I run it on my mac mini” to mean their data is private. Meanwhile, all data is being shipped off for training to Anthropic via OpenRouter, and both of those parties see everything.

I guess you could theoretically plug in a local model here, but of course the README should be more precise when talking about privacy.

raincole | a day ago

> run locally via Ollama

Are you saying this part is a lie?

girvo | a day ago

Yes.

In that I'm saying their AI-generated slop README is misleading enough to be lying.

When you prominently say "all without uploading anything to the cloud" but the default does exactly that, a single half-sentence mentioning Ollama doesn't cut it.

Barbing | 18 hours ago

This is a weird phenomenon with some dictation apps too.

They talk about privacy first, they talk up local-first, then their default settings send every syllable to someone else’s computer. Once you understand the app it’s trivial to make it local, but there’s a good chance your first transcript is coming off a server, unlike what the marketing material suggested.

jstummbillig | a day ago

> It says nothing leaves your computer

Where does it say that?

It sends to OpenRouter if you choose to use OpenRouter. It can use Ollama. Idk how to get more local than that? Any tool will be non-local when you do something explicitly non-local.
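For what it's worth, the local path really is just a matter of which endpoint the client talks to. A minimal sketch, assuming Ollama's default OpenAI-compatible endpoint on localhost (the `chat_request` helper and model name are hypothetical illustrations, not OpenYak's actual config):

```python
# Hypothetical sketch: a truly local setup points the client at
# Ollama's OpenAI-compatible endpoint instead of OpenRouter.
import json

def chat_request(prompt, base_url="http://localhost:11434/v1",
                 model="llama3.2"):
    """Build the request that would go to a local Ollama server.

    localhost:11434 is Ollama's default listen address, so the
    round trip stays on-box; nothing reaches a cloud provider.
    """
    return {
        "url": f"{base_url}/chat/completions",
        "body": json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        }),
    }

req = chat_request("hi")
print(req["url"])  # http://localhost:11434/v1/chat/completions
```

Swapping `base_url` for an OpenRouter URL is exactly the step where data starts leaving the machine, which is why the README wording matters.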

gbalduzzi | a day ago

I read it as "everything controlled by us is local first and we do not collect any data about you"

I agree that someone may misunderstand their phrasing, though.

hrmtst93837 | a day ago

You're reading it correctly: it's a thin OpenRouter wrapper calling itself local while your prompts still leave the machine.

the_real_cher | a day ago

So it's like open claw but you have to pay for it?

kennywinker | a day ago

It looks free / open source to me?

SomaticPirate | a day ago

Not to be too conspiratorial here, but since the founder of OpenClaw was snatched up, there seems to be a rush of “open source” AI projects desperately bidding to be alternatives. That can generate huge returns if one of the major players decides that “they also need a Cowork-style product”.

So it's uniquely viable to be a sellout here and attempt to clone a major lab’s product on the off-chance you get acquired later.

rakag | a day ago

What's the difference between this and OpenWork which has existed for a while?

jaimex2 | a day ago

OpenWork supports Linux, whereas this does not.

teleforce | a day ago

I've got a strong feeling that AI models and agents require a different operating system (OS) paradigm, one that's data-centric rather than file-system-centric, for more efficient, effective, and trustworthy operation. This new OS should work natively with data across different processors, for example CPUs, GPUs, TPUs, NPUs, accelerators, etc.

For a working example, see TabulaROSA (Tabular Operating System Architecture), proposed by an MIT team. Instead of normal OS system calls, it uses data-based operations with D4M, which works mathematically via associative arrays over structured or unstructured data [1], [2].

With the advent of CPU acceleration for fully homomorphic encryption, as demonstrated by Intel, AI models and agents could even analyze data without ever decrypting it [3], [4].

[1] TabulaROSA: Tabular Operating System Architecture for Massively Parallel Heterogeneous Compute Engines

https://dspace.mit.edu/handle/1721.1/126114

[2] D4M: Dynamic Distributed Dimensional Data Model:

https://d4m.mit.edu/

[3] Intel Demos Chip to Compute with Encrypted Data (121 comments):

https://news.ycombinator.com/item?id=47322815

[4] Intel Demos Chip to Compute With Encrypted Data: Fully homomorphic encryption chip speeds operations 5,000-fold:

https://spectrum.ieee.org/fhe-intel
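For readers unfamiliar with D4M, the associative-array idea can be sketched in a few lines. This is a hypothetical toy (the `Assoc` class and the keys are mine, not D4M's actual API): data lives in a sparse (row, column) map, and matrix-style algebra over those keys, rather than file-system calls, is the basic operation.

```python
# Toy sketch of a D4M-style associative array: a sparse 2-D map
# keyed by (row, column) strings, supporting element-wise addition
# and a matrix-multiply-like contraction over shared inner keys.
from collections import defaultdict

class Assoc:
    def __init__(self, triples=()):
        # Store (row, col) -> value sparsely.
        self.data = {(r, c): v for r, c, v in triples}

    def __add__(self, other):
        # Element-wise sum over the union of keys.
        out = dict(self.data)
        for k, v in other.data.items():
            out[k] = out.get(k, 0) + v
        return Assoc((r, c, v) for (r, c), v in out.items())

    def matmul(self, other):
        # Contract matching inner keys, like sparse matrix multiply.
        out = defaultdict(int)
        for (r, k1), v1 in self.data.items():
            for (k2, c), v2 in other.data.items():
                if k1 == k2:
                    out[(r, c)] += v1 * v2
        return Assoc((r, c, v) for (r, c), v in out.items())

# Link documents to topics through shared words.
A = Assoc([("doc1", "word:ai", 1), ("doc2", "word:ai", 1)])
B = Assoc([("word:ai", "topic:ml", 1)])
print(A.matmul(B).data)  # {('doc1', 'topic:ml'): 1, ('doc2', 'topic:ml'): 1}
```

The point of the paradigm is that queries, joins, and graph traversals all reduce to this one algebra, regardless of whether the underlying data is tabular or not.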

rbren | a day ago

I still strongly believe every developer should be vibecoding their own cowork/openclaw/devin

Here are the prompts I use for my AI environment, though it's changed a bunch since the last snapshot:

https://github.com/rbren/personal-ai-devbox

Barbing | a day ago

Thanks for the link. You mention security; is the _average_ developer safer going with OpenClaw?

rbren | 17 hours ago

It's probably just as hard to secure OpenClaw as this, but you'll find better tutorials for securing OpenClaw.

m_kos | a day ago

Neat! I might give it a try.

What do you mean by interfaces in "These interfaces can do literally anything on the host machine. You're responsible for your own security"?

Also, your backdooring image links to a 404.

rbren | 17 hours ago

The prompts include, e.g., a terminal UI, which gives you root access to the machine. If someone can access that UI and its backend, they can do whatever they want! So make sure to put it behind a firewall, basic auth, or something else.

coldtrait | a day ago

Why did the OP make a comment about the project like he was someone else?

https://news.ycombinator.com/item?id=47560380#47560381

When it's clear he is one of the major contributors to the project?

https://github.com/openyak/desktop/graphs/contributors

stingraycharles | a day ago

Because it’s just silly AI-generated spam; don’t read too much into it.

ares623 | a day ago

aren't these supposed to be bannable offences?

girvo | a day ago

They are, dang et al deal with it pretty strictly. Which I thank them for.

october8140 | 23 hours ago

This was posted earlier this week.

ThrowawayR2 | 21 hours ago

They are, but somebody has to bring it to the attention of the HN moderators by emailing them for that to happen.

Sathwickp | a day ago

A simpler version of OpenClaw?

zombot | a day ago

> owns your filesystem

Just when I thought it couldn't get worse than OpenClaw, someone proposes this, in all seriousness. I see a stellar future for them at OpenAI.

Factor1177 | a day ago

Anyone else getting a 404 when trying to download?

kvakkefly | a day ago

Nice! The macOS download link is a 404.

systima | a day ago

How does this differ from Open Code Desktop?

jaimex2 | a day ago

This doesn't support Linux, whereas Open Code does.

october8140 | 23 hours ago

Flag this garbage.

imiric | 11 hours ago

It looks like HN's new AI guideline is working as intended.