This might be a good thing for homebrew to adopt for the download/install process, but if it doesn't include a ruby interpreter, I have a hard time seeing how it's going to be compatible with anything but searching and installing bottles. I install most of my packages from a Brewfile, which itself is Ruby code.
If it doesn’t ever execute Ruby: it cannot be compatible with Homebrew. “Compatible” is doing a bit of work here when it also means “implicitly relies on Homebrew’s CDN, CI, packaging infrastructure and maintainers who keep all this running”.
There’s a new vibe coded Homebrew frontend with partial compatibility and improved speed every few weeks.
Homebrew is working on an official Rust frontend that will actually have full compatibility. Hopefully this will help share effort across the wider ecosystem.
It is really cool that Homebrew provides a comprehensive enough JSON API to let people build on Homebrew in useful ways without directly running Ruby, despite everything being built in a Ruby DSL. That really does seem like a "best of both worlds" deal, and it's cool that alternative clients can take advantage of that.
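As a sketch of what building on that JSON API can look like: the helper below resolves a bottle (pre-built binary) URL from a formula's metadata. The URL pattern and the `bottle`/`stable`/`files` layout are my reading of the public API at formulae.brew.sh, so treat the schema details as assumptions rather than gospel.

```python
import json
from urllib.request import urlopen

# Assumed endpoint shape for Homebrew's public formula metadata API.
API = "https://formulae.brew.sh/api/formula/{}.json"

def bottle_url(formula: dict, tag: str):
    """Return the bottle download URL for a platform tag
    (e.g. 'arm64_sonoma'), or None if no bottle exists for that tag."""
    files = formula.get("bottle", {}).get("stable", {}).get("files", {})
    entry = files.get(tag)
    return entry.get("url") if entry else None

def fetch_formula(name: str) -> dict:
    """Fetch a formula's JSON metadata (network call, not exercised here)."""
    with urlopen(API.format(name)) as resp:
        return json.load(resp)
```

A client that only ever resolves bottles this way never has to evaluate the Ruby formula at all, which is exactly the "best of both worlds" trade-off described above.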
I didn't know about the pending, official Rust frontend! That's very interesting.
Yeah I don't know why people are saying that speed doesn't matter. I use Homebrew and it is slow.
It's like yum vs apt in the Linux world. APT (C++) is fast and yum (Python) was slow. Both work fine, but yum would just add a few seconds, or a minute, of little frustrations multiple times a day. It adds up. They finally fixed it with dnf (C++) and now yum is deprecated.
Glad to hear a Rust rewrite is coming to Homebrew soon.
* it’s purpose built for mega-sized monorepo models like Google (the same company that created it)
* it’s not at all beginner friendly; it’s a complex mishmash of three separate constructs in their own right (build files, workspace setup, Starlark), which makes it slow to ramp up new engineers.
* even simple projects require a ton of setup
* requires dedicated remote cache to be performant, which is also not trivial to configure
* requires deep bazel knowledge to troubleshoot through its verbose unclear error logs.
Because of all that, it’s extremely painful to use for anything small/medium in scale.
One of the reasons I switched to arch from debian based distros was precisely how much faster pacman was compared to APT -- system updates shouldn't take over half an hour when I have a (multi)gigabit connection and an SSD.
It was mostly precipitated when containers came in and I was honestly shocked at how fast apk installs packages on Alpine compared to my Ubuntu boxes (using apt).
pacman is faster simply because it does fewer things and supports fewer use cases.
For example pacman does not need to validate the system for partial upgrades because those are unsupported on Arch and if the system is borked then it’s yours to fix.
Less charitably, pacman is fast because it's wrong. The dependency resolver is wrong; it fails to find correct answers to dependency resolution problems even when correct answers are available.
> dnf < 5 was still performing similarly to yum (and it was also implemented in python)
I'm perhaps not properly understanding your comment. If the algorithmic changes were responsible for the improved speed, why did the Python version of dnf perform similarly to yum?
Because dnf4 used the same dependency resolution as yum but they revamped it in dnf5 (it was initially supposed to be a whole new package manager with a different name)
> Yeah I don't know why people are saying that speed doesn't matter. I use Homebrew and it is slow
Because how often are you running it where it's anything but an opportunity to take a little breather in your day? And I do mean little; the speedups being touted here are seconds.
I have the same response to the obsession with boot times, how often are you booting your machine where it is actually impacting anything? How often are you installing packages?
Do you have the same revulsion for the time spent going to the bathroom? Or getting a glass of water? Or basically everything in life that isn't instantaneous?
I would guess this change builds on the existing json endpoints for package metadata but that the Ruby DSL is remaining intact.
I think how to marry the Ruby formulas and a Rust frontend is something the Homebrew devs can figure out and I'm interested to see where it goes, but I don't really care whether Ruby "goes away" from Homebrew in the end or not. It's a lovely language, so if they can keep it for their DSL but improve client performance I think that's great.
Is Ruby really the speed bottleneck in Homebrew? I would assume it would be due to file operations (and download operations), not choice of programming language.
Largely agree, though some things are notably difficult in some languages. True concurrency, for example, didn’t come as naturally in Ruby because of the global interpreter lock. Of course there are third-party libs and workarounds. Newer versions of Ruby support it more natively, and as we’ve seen, Homebrew used that experimentally for a while and made it the default relatively recently.
I can’t say that’s the only reason it’s slow of course. I’m on the “I don’t use it often enough for it to be a problem at all” side of the fence.
I appreciate the push for an official rust frontend. I've personally been migrating (slowly) to using nix to manage my Mac's software, but there are a ton of limitations which lead me to rely on homebrew anyway. The speed ups will be appreciated.
> I appreciate the push for an official rust frontend
Why? I think I am seriously starting to contract a case of FOMO. I feel like Rust is rapidly gaining territory every day. I mean, that's fine and all, I suppose. I have never used it, so I have no real opinions on the language.
Sorry, examples of what? Package managers that present themselves as replacements for other package managers? Or package managers that aren't compatible with the registry they're supposed to be compatible with? Your use of scare quotes is confusing.
pnpm, npm, and yarn all have different lockfiles, all use the same registry format (and the same registry itself), and all try to stay compatible in other ways.
You won't have a situation where one person uses yarn and another uses pnpm on the same project, though.
Is this still true since they swapped to distributing binaries rather than building from source on each install? It's been years since I last installed something from homebrew that built from source, so something that could install the same binaries would be compatible from my standpoint.
That said, it's also been a while since I've really had any huge complaints about brew's speed. I use Linux on my personal machines, and the difference in experience with my preferred Linux distro's package manager and brew used to be laughable. To their credit, nowadays, brew largely feels "good enough", so I honestly wouldn't even argue for porting from Ruby based on performance needs at this point. I suspect part of the motivation might be around concerns about relying on the runtime to be available. Brew's use of Ruby comes from a time when it was more typical for people to rely on the versions of Python and Ruby that were shipped with MacOS, but nowadays a lot of people are probably more likely to use tooling from brew itself to manage those, and making everything native avoids the need to bootstrap from an existing runtime.
It can fall back to building from source in some cases, and I still think that even when doing binary downloads it will execute install hooks, which are Ruby code inside the recipe.
I would agree with you that Ruby itself is probably not the bottleneck (except maybe for depsolving, since that's CPU-bound).
Good to know! I was doing this with a hacky one-liner but wasn't aware of this flag. I think the sequential build/install process is the agonizing bit though.
Yeah, tbh homebrew is slow as fuck. It literally took 30 minutes to install aws cli on my 2020 mbp. I will happily flock to every new version that's faster.
I don't see where he said it's a bad thing, or even implied it. As I see it, he did imply that superlatives like THE FASTEST PACKAGE MANAGER aren't worth much in this environment.
> People are free and probably do this because it is slow. Alternatives often are not a bad thing.
Alternatives are always good but IMO brew is just not something I interact with all that much and to me it's "good enough". It works and does what I expect, although to be fair maybe I'm on the happy path <shrug>.
Indeed, everyone's free to do what they want, that's the beauty of open source.
I have zero issues with people vibe coding alternative Homebrew frontends, it's good for the ecosystem for there to be more experimentation.
What I take objection to is when one or more of these happen:
- incorrect compatibility claims are made (e.g. if you're not running Ruby, no post-install blocks in formulae are gonna work)
- synthetic benchmarks are used to demonstrate speed (e.g. running `brew reinstall openssl` in a loop is not a terribly representative case; a cold `brew upgrade` of >10 packages would be). To be clear, I'm sure most of these projects are faster than Homebrew in fair benchmarks too!
- incorrect claims about why Homebrew is slow are made (e.g. "we do concurrent downloads and Homebrew doesn't": true a year ago, not true since 5.0.0 in November 2025)
- it's pitched as a "replacement for Homebrew" rather than "an alternative frontend for Homebrew" when it's entirely reliant on our infrastructure, maintainers, update process, API, etc.
Even on the above: of course people are free to do whatever they want! It's just at least some of the above hinders rather than helps the ecosystem and makes it harder rather than easier for us as a wider open source ecosystem to solve the problem "Homebrew is slow" (which, to be clear, it is in many cases).
Thanks for all the hard work. I think brew is what makes the Mac the best “unix” machine choice as far as being stable and not having to take up maintaining my OS as a multi-hour per week hobby. I have been using it daily for three years and have never had any problems.
> Homebrew is working on an official Rust frontend that will actually have full compatibility.
When you say "Rust frontend", is the vision that Homebrew's frontend would eventually transition to being a pure Rust project — no end-user install of portable-ruby and so forth?
If so (ignore everything below if not):
I can see how that would work for most "boring" formulae: formula JSON gets pre-baked at formula publish time; Rust frontend pulls it; discovers formula is installable via bottle; pulls bottle; never needs to execute any Ruby.
But what happens in the edge-cases there — formulae with no bottles, Ruby `post_install` blocks, and so forth? (And also, how is local formula development done?)
Is the ultimate aim of this effort to build and embed a tiny little "Formula Ruby DSL" interpreter into the Rust frontend, one that supports just enough of Ruby's syntax + semantics to execute the code that appears in practice in the bodies of real formulae methods/blocks? (I personally think that would be pretty tractable, but I imagine you might disagree.)
We will never be 100% Rust and 0% Ruby. It’s possible that 99% of users end up never running any Ruby, though. It’ll still be needed for local development and our CI. We’re optimising for speeding up the 99% case as much as possible.
McQuaid is correct in the strict sense, but the interesting question is what percentage of actually installed formulae use Ruby DSL features beyond archive extraction and path manipulation. My guess is it's a small minority of what most developers have on their machines.
The real compatibility test isn't "runs all Homebrew formulae" — it's "runs the 15-20 formulae each developer actually uses." A tool that handles those correctly and fails clearly on edge cases is more useful in practice than a technically complete implementation that's slower.
What's missing from this thread is any data on that surface area, not more benchmark numbers.
> Homebrew is working on an official Rust frontend that will actually have full compatibility. Hopefully this will help share effort across the wider ecosystem.
> Compatible” is doing a bit of work here when it also means “implicitly relies on Homebrew’s CDN, CI, packaging infrastructure and maintainers who keep all this running”.
This is literally what "compatible" means; how else did you expect them to frame it?
That is great news! Would be even more awesome if it was being ported to a more approachable language like Go or Zig, or somehow rearchitected in Ruby, but I take it that ship has sailed long ago. Ruby -> Rust is a brutal move.
Wait a minute, Homebrew is slow? I thought most of the time it takes for me is downloading and installing. I haven't noticed slowdowns anywhere else, even for the ones mentioned.
In its current form, Homebrew is amazing. It's not that slow, and recent updates have made it really good to use. May I know the reasons for a Rust rewrite?
This feels like a solution looking for a problem. I have a couple hundred brew packages on my system and I’ve never sat there thinking “If this was only 2 seconds faster…” while doing an update. I’m sure the Homebrew folks could mine this for a few ideas of how to further optimize brew, but I don’t think I’ll be adopting it anytime soon. Compatibility is more important than speed in this case.
If you use the Homebrew module for Nix-Darwin, running `brew` against the generated brewfile becomes the slowest part of a `darwin-rebuild switch` by far. In the fast cases, it turns something that could take 1 second into something that takes 10, which is definitely annoying when running that command is part of your process for configuration changes even when you don't update anything. Homebrew no-ops against an unchanging Brewfile are really slow.
Agreed on horses for courses. Different people have different tolerances. And yea, all things being equal, faster is better, but they are almost never equal. If you don’t mind me asking, what does “too slow” mean for you in this context? Do you have a particularly complex setup? And what do you use now as an alternative and how has that impacted the update speed?
I wish I could remember the details -- I know I got annoyed with things being slow and when I got a new computer decided to go the no-homebrew route. I'm using nix, and it seems fine so far, but I also really don't understand it at all, which is a little concerning. :-)
FWIW this seems to have improved in recent years. Back in the dark times of non-parallelized downloads I would purposefully wait until the end of the day and fire the thing off before leaving.
But you can turn that behavior off, IIRC it tells you the environment variable to set if you don’t want it to do that every time it runs.
I agree it’s annoying, but I haven’t turned it off because it’s only annoying because I’m not keeping my computer (brew packages) up-to-date normally (aka, it’s my own fault).
I'm not sure if I just have way fewer things installed than most people or I just update more often, but I haven't experienced anything like this for years. I run `brew upgrade` probably around once every (work)day, usually right before doing a git pull or something, and then I'll quickly look at a couple emails or slack messages, and then it's always done by the time I switch back
But why do you want daily upgrades? Most of us want to wait a little while for the bugs to be worked out of fresh releases. And hey, if everything is working today... why would I want to risk potential breaking changes?
> Most of us want to wait a little while for the bugs to be worked out of fresh releases.
This is not something that's solved by updating less frequently though. It would be solved by a 'minimum age' setting, but `brew` aren't planning on implementing that, with arguably valid reasoning: https://github.com/Homebrew/brew/issues/21421
Minimum age solves a related problem - it gives maintainers some margin of time in which to discover vulnerabilities and yank the affected versions.
However, minimum age also delays you getting bug fixes (since those also need to age out).
In an ideal world one would probably be able to configure a minimum-age-or-subsequent-patch-count rule. i.e. don't adopt new major/minor package versions until either 1 month has elapsed, or a minimum of 2 patch versions have been released for that version.
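A minimal sketch of that "minimum-age-or-subsequent-patch-count" rule (the function name and default thresholds are illustrative, not anything brew actually implements):

```python
from datetime import date, timedelta

def should_adopt(released: date, today: date, patches_since: int,
                 min_age: timedelta = timedelta(days=30),
                 min_patches: int = 2) -> bool:
    """Adopt a new major/minor version once it has either aged past
    min_age or accumulated min_patches follow-up patch releases,
    whichever comes first."""
    return (today - released) >= min_age or patches_since >= min_patches
```

The disjunction is what resolves the tension described above: the age clause gives maintainers time to yank bad releases, while the patch-count clause lets bug fixes reach you faster than a pure age rule would.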
The same criticism has been leveled at Deno, pnpm, and Bun, and yet, despite all the years since their respective releases, Node and npm remain slower than all three.
Yeah, but do they work? Last time I gave bun a chance their runtime had serious issues with frequent crashes. Faster package installation or spin-up time is meaningless if it comes at the cost of stability and compatibility.
Agreed here. The speed bottleneck I run into is simply that there's often a lot of packages that need updating, so there's a lot to download. And if anything needs to be compiled from source then the time that takes will dominate (though I think everything I currently run is thankfully pre-built)
Brew definitely used to be a lot slower, and I used to find it very tedious. I feel like they've done a reasonably good job in improving that over the years though (with the switch to distributing binaries by default being a huge win in terms of speed). I have to wonder if stuff like this is more due to lingering feelings from before combined with the easy access to vibe coding tools. If LLM coding came a few years earlier, maybe projects like this one would have made more sense to me.
I've been looking for something like this, especially to use only with casks now that Homebrew has removed support for not adding the quarantine bit. Looking forward to giving it a try!
And zerobrew, like the original Homebrew, is compatible with Linux.
It appears that Nanobrew is not.
I care about the light-weight efficiency of these new native code variants much more when I want to use brew on some little Linux container or VM or CI, than I do for my macOS development machine.
> Zerobrew is experimental. We recommend running it alongside Homebrew rather than as a replacement, and do not recommend purging homebrew and replacing it with zerobrew unless you are absolutely sure about the implications of doing so.
So I guess it's fine to run this alongside Homebrew and they don't conflict.
I'm not a Python dev, but I appreciate the motivation uv has inspired across other package managers. I tried another brew replacement called zerobrew last month. It installed packages to a different directory from Homebrew, so I didn't actually test-drive it after seeing that. Regardless, I look forward to the competition pushing mainstream tools to improve their performance.
What would be great is a Homebrew-compatible system that doesn't cut off support for older machines. I have a 3.8 GHz Quad core i5 iMac that still crushes, yet Homebrew has determined that I'm just too old and icky[1] to work with anymore. I had to move over to MacPorts, which is surprisingly nice, but I still miss brew.
Yea, I know. It's open source. They can do what they want. Still sucks.
I still think that's entirely fair for a power user tool like Homebrew. With the upgrade rates of macOS, that probably means 98% of users are covered. Expecting an open source project to accept bug reports across a wide variety of OS versions, which would then require test devices on those versions to replicate issues, sounds unrealistic. Bigger companies, or Apple itself, I would hold to much higher standards in that regard.
brew used to say, more or less, "This OS is old and unsupported. Don't submit bug reports. If you have problems, too bad. If you submit a PR to fix something, we might merge it". Fair enough, right? Now it just says, "Go fuck yourself, grandpa."
That makes no sense then. A power user may still want to run older OS versions for a reason. Take the training wheels off it and then it'll be a power user tool.
> A power user may still want to run older OS versions for a reason.
No doubt there are edge cases like that, but I don't fault a project for not catering to the < 1% of users who would fall into that bucket and would probably be the ones that cause trickier support cases. These would maybe also be the user that could just install it without homebrew then, it's not like homebrew is the only way to install software.
This is not an edge case. Most HN commenters describe the latest two versions of macOS as being objectively worse than earlier versions: slower, less stable, more broken. There are significant numbers of “power users” who deliberately avoid upgrading or have actively downgraded macOS to Sonoma because they care about their computing experience.
People who downgraded to Sonoma are the definition of an edge case, maybe you hear from some of them on HN and it sounds like a big group but this is a niche of a niche.
I think MacPorts still supports PowerPC Macs. I would need to rebuild my G5 to verify it because the hard disk is long dead, but last time I checked, it worked.
I get it - it’s a different beast with very different ideas behind it, but MacPorts is BSD-solid, and that’s a lot.
MacPorts has some level of support for PowerPC, but anything that isn't in the most recent ~3-4 releases is likely to be cut off from any number of packages at useful versions. (There's substantial work done to support Rust on much older versions of macOS, but there are also versions above which Rust has cut off older macOS releases.)
I believe that there's a recommended stream for when you need older versions support, but it's definitely a secondary target from what I've been reading on the MLs.
True, but I think you still want to avoid Homebrew if you're interested in older Mac versions. A specific project might have some support for the version you're interested in. For example, the Go 1.23 toolchain (which isn't the latest version) supports Mac releases back to Big Sur.
Yes, MacPorts is the way. I switched after a new macOS release meant mine was too old: brew update uninstalled a bunch of stuff I had been using, then it stopped and let me know.
Sure, but this might win you a couple of years max. Homebrew's "Support Tiers" page, which I linked, also addresses OCLP users, going so far as to specify a minimum Intel architecture. So, even if you use OCLP to allow support for newer OS versions, eventually your CPU architecture will be too old and you're back in Tier 3.
Also, the writing is on the wall: Ultimately, Homebrew will be ARM-only, once Apple's legacy support becomes ARM-only. At which point it's game-over for Intel Macs.
Homebrew solves the "availability of software" problem in the Mac ecosystem, but it does not solve the "Need to stay on the new hardware treadmill" problem.
The current version of brew has a flaw where the installer can't install isolated dependency trees in a sterile manner. If you have packages A, B, C, and D that all have updates, and assuming A,B,C depend on each other and come out to a total of say 1MB, and D is 1000MB, brew works in a MapReduce manner where it will attempt to finish downloading everything in parallel (even though the real bottleneck is D) before doing any installation.
Since the first three have no dependency on D, a better approach would be to install them in parallel while D is still downloading.
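A toy sketch of that pipelining idea (the package graph, timings, and function names are made up for illustration; this is not how brew is structured internally). Downloads run in parallel, but each package is installed as soon as it and its dependencies are ready, so the small A/B/C chain never waits on the huge D download:

```python
import threading
from concurrent.futures import ThreadPoolExecutor

# Hypothetical dependency graph: name -> list of dependencies.
DEPS = {"a": [], "b": ["a"], "c": ["b"], "d": []}

def pipelined_install(deps, download, install):
    """Download everything in parallel, but install each package as soon
    as it is downloaded and its deps are installed, instead of waiting
    for every download (e.g. a huge package D) to finish first."""
    downloaded, installed = set(), []
    cond = threading.Condition()

    def fetch(name):
        download(name)
        with cond:
            downloaded.add(name)
            cond.notify_all()

    with ThreadPoolExecutor() as pool:
        for name in deps:
            pool.submit(fetch, name)
        while len(installed) < len(deps):
            with cond:
                ready = [n for n in deps
                         if n in downloaded and n not in installed
                         and all(d in installed for d in deps[n])]
                if not ready:
                    cond.wait(timeout=0.05)
                    continue
            for n in ready:
                install(n)  # serial install step, like brew's linking
                installed.append(n)
    return installed
```

With a slow download for "d" and fast ones for the rest, this installs a, b, and c while d is still in flight, then installs d last.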
What happens if I test this tool by installing some packages and then remove the tool? Will I still be able to use Homebrew to manage these new packages?
I naively assumed it would work on the already installed homebrew packages. No such luck.
After installing, 'nb list' and thus e.g. 'nb outdated' will yield the empty list!
I have absolutely no use for a competing homebrew installation that is mostly compatible ..
OT: speaking of Homebrew, I made an incorrect assumption about it that eventually led to some problems. It was me being stupid, but I bet others have made the same mistake but not yet hit problems. Hence this comment.
My mistake was when I upgraded from my 2017 iMac (Intel processor) to an Apple silicon Mac at the start of 2024 and migrated via Time Machine I did not do anything extra specifically for Homebrew. I just assumed that as things got updated via the normal periodic Homebrew updates I run it would start grabbing the Apple silicon binaries for binary things it installed.
It turns out that is wrong. They made Apple silicon Homebrew kind of independent of Intel Homebrew. Intel Homebrew uses /usr/local and Apple silicon Homebrew uses /opt/homebrew. This allows having both native and Intel Homebrew installed at the same time if you need both.
The correct way to migrate from an Intel Mac to an Apple silicon Mac is to install Apple silicon Homebrew on the new Mac, and then install all the packages you want. Intel Homebrew works fine on Apple silicon Macs so you can use the Intel Homebrew that migrated via Time Machine to make the package list to use with Apple silicon Homebrew (or you can make it on the old Mac).
I only noticed this because I was trying to build something from source using some libraries that were installed via Homebrew and running into problems. An LLM was helping figure this out and it was telling me I might have to manually symlink those libraries from where they were in /opt/homebrew to where the build process for the thing I was building expected to find them, and I didn't have a /opt/homebrew. The libraries were somewhere in /usr/local. I then noticed those libraries were not for Apple silicon, checked other things installed via Homebrew and saw nothing was for Apple silicon, and realized I had the wrong Homebrew.
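One quick way to catch this mismatch is to compare the machine's CPU architecture against where the `brew` on your PATH lives. The prefix convention (/opt/homebrew for Apple silicon, /usr/local for Intel) is documented Homebrew behavior; the helper functions themselves are a hypothetical sketch:

```python
import platform
import shutil

def expected_brew_prefix(machine: str) -> str:
    # Apple silicon Homebrew installs under /opt/homebrew; Intel under /usr/local.
    return "/opt/homebrew" if machine == "arm64" else "/usr/local"

def brew_matches_arch() -> bool:
    """True if the brew on PATH lives under the prefix expected for this CPU."""
    brew = shutil.which("brew") or ""
    return brew.startswith(expected_brew_prefix(platform.machine()))
```

A Time Machine migration like the one described above would leave `brew_matches_arch()` returning False on an Apple silicon Mac, since the migrated binary still resolves to /usr/local.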
Why does the speed of a package manager matter? I'm being sincere too. I have used countless package managers, and speed is not an attribute of any of them that I have noticed.
Brew is a leaky piece of crap. nix 4 lyf. Seriously, I used to hate any time I needed to install something with brew not knowing if it was going to break everything else, since using nixpkgs for my macos dev requirements it's been so much nicer.
I've found brew so painful that I switched to nix. Nix unfortunately is painful in its own way. However, I recently discovered devbox which is a wrapper around nix. It works really well as a package manager. Just run "devbox global add <package>"
This thread invigorated my interest in Nix to manage my Mac environment.
I already have a Brewfile in my dotfiles stored in git, but wanted a way to setup all the little things on my Mac like trackpad settings, dock settings, file associations, etc. nix-darwin is the obvious solution.
Gave the task to ChatGPT and it came back saying it's a good way to get started, but then offered a middle-ground of an idempotent script to set things up. So I investigated the latter, and after a couple of minutes I now have a setupmac() function in my .bash_profile (yeah I use bash) which mostly consists of a bunch of 'defaults' commands and a few other things, and now continue with brew for managing software and setupmac() to setup everything else, and of course manually manage my dotfiles for ghostty/nvim.
I wish I had this earlier, because I just set myself up on 3 different Macs in the last week or so. I'm also glad I don't need to learn a new language and tooling for something pretty simple. Everything is a bit disjointed and not as automated as a proper nix setup and doesn't have that fidelity that nix has, but it's straight-forward, compact in that it sits in my brain easily, and easy to execute.
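For anyone curious what such an idempotent setup function amounts to: a sketch of the same idea (the original is a bash function of `defaults` commands; this expresses it in Python, and the two settings shown are real, commonly tweaked keys, though any list here is obviously illustrative):

```python
import subprocess

# Illustrative settings: (domain, key, type flag, value) tuples
# for `defaults write`.
SETTINGS = [
    ("com.apple.dock", "autohide", "-bool", "true"),
    ("com.apple.finder", "AppleShowAllFiles", "-bool", "true"),
]

def setup_mac(settings=SETTINGS, run=subprocess.run):
    """Apply each setting. `defaults write` is idempotent, so re-running
    this converges to the same state rather than stacking changes."""
    for domain, key, flag, value in settings:
        run(["defaults", "write", domain, key, flag, value], check=True)
```

The idempotence is what makes this a reasonable middle ground short of nix-darwin: you can run it on a fresh Mac or an already-configured one and end up in the same place either way.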
Same. Whatever happens, the new version should support Brewfile.
Anyway, the Python program would call into libsolv, which is implemented in C.
dnf5 is much faster, but the authors credit the algorithmic changes rather than the fact that it is written in C++.
dnf < 5 was still performing similarly to yum (and it was also implemented in python)
> nanobrew
> The fastest macOS package manager. Written in Zig.
> 3.5ms warm install time
> 7,000x faster than Homebrew · faster than echo
It presents itself as an alternative to Homebrew.
halapro | a day ago
nozzlegear | a day ago
0x457 | a day ago
You won't have a situation where one person uses yarn and another uses pnpm on the same project, though.
akdev1l | a day ago
You cannot really be compatible with this unless you run the Ruby, as the install scripts can do arbitrary computation.
In reality most recipes contain a simple declarative config, but nothing stops you from doing Ruby in there.
Hence, to achieve total compatibility, one would need to run Ruby.
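To illustrate the point (this is a hypothetical formula, not one from homebrew-core): the formula DSL is a Ruby class body, so arbitrary logic can appear in `install` or `post_install` blocks, and those can only be evaluated by a Ruby interpreter. This fragment isn't standalone-runnable since it inherits from Homebrew's `Formula` base class:

```ruby
# Hypothetical formula: the class body is ordinary Ruby, so a
# non-Ruby frontend cannot faithfully evaluate blocks like these.
class Example < Formula
  desc "Illustrative formula with arbitrary Ruby in it"
  homepage "https://example.com"
  url "https://example.com/example-1.0.tar.gz"
  sha256 "0" * 64

  def install
    # Arbitrary computation at install time, not declarative config
    bin.install "example" if build.stable?
  end

  def post_install
    # Runs on the user's machine after install; a JSON-only client
    # has no way to execute this
    (var/"example").mkpath
  end
end
```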
saghm | 23 hours ago
That said, it's also been a while since I've really had any huge complaints about brew's speed. I use Linux on my personal machines, and the difference in experience with my preferred Linux distro's package manager and brew used to be laughable. To their credit, nowadays, brew largely feels "good enough", so I honestly wouldn't even argue for porting from Ruby based on performance needs at this point. I suspect part of the motivation might be around concerns about relying on the runtime to be available. Brew's use of Ruby comes from a time when it was more typical for people to rely on the versions of Python and Ruby that were shipped with MacOS, but nowadays a lot of people are probably more likely to use tooling from brew itself to manage those, and making everything native avoids the need to bootstrap from an existing runtime.
akdev1l | 22 hours ago
I would agree with you that Ruby itself is probably not the bottleneck (except maybe for depsolving, since that's CPU-bound)
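Depsolving being the CPU-bound step makes sense: at its core it's a topological sort over the dependency graph. A minimal sketch using Ruby's stdlib `TSort` (package names and dependencies here are invented for illustration):

```ruby
require "tsort"

# Minimal depsolve sketch: compute an install order so every
# package's dependencies come before the package itself.
class DepGraph
  include TSort

  def initialize(deps)
    @deps = deps
  end

  def tsort_each_node(&block)
    @deps.each_key(&block)
  end

  def tsort_each_child(node, &block)
    @deps.fetch(node, []).each(&block)
  end
end

deps = {
  "ffmpeg" => ["x264", "lame"],
  "x264"   => ["nasm"],
  "lame"   => [],
  "nasm"   => [],
}

order = DepGraph.new(deps).tsort
puts order.inspect # dependencies appear before their dependents
```

Real solvers also handle version constraints and conflicts, which is where the CPU time actually goes; this only captures the ordering part.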
orf | a day ago
runjake | a day ago
Xunjin | a day ago
> There’s a new vibe coded Homebrew frontend with partial compatibility and improved speed every few weeks.
People are free to do this, and probably do, because it is slow. Alternatives are often not a bad thing.
runjake | a day ago
alwillis | a day ago
brailsafe | 23 hours ago
firecall | 19 hours ago
https://brew.sh/2025/11/12/homebrew-5.0.0/
pugio | 18 hours ago
naikrovek | 19 hours ago
jazzpush2 | 23 hours ago
mpalmer | 22 hours ago
rbanffy | 22 hours ago
Exactly. I’ve been using MacPorts for ages and I love it.
/me ducks.
bentcorner | 17 hours ago
Alternatives are always good but IMO brew is just not something I interact with all that much and to me it's "good enough". It works and does what I expect, although to be fair maybe I'm on the happy path <shrug>.
mikemcquaid | 6 hours ago
I have zero issues with people vibe coding alternative Homebrew frontends, it's good for the ecosystem for there to be more experimentation.
What I take objection to is when one or more of these happen:
- incorrect compatibility claims are made (e.g. if you're not running Ruby, no post-install blocks in formulae are gonna work)
- synthetic benchmarks are used to demonstrate speed (e.g. running `brew reinstall openssl` in a loop is not a terribly representative case; instead, e.g. a cold `brew upgrade` of >10 packages would be). To be clear, I'm sure most of these projects are faster than Homebrew in fair benchmarks too!
- incorrect claims about why Homebrew is slow are made (e.g. "we do concurrent downloads and Homebrew doesn't": true a year ago, not true since 5.0.0 in November 2025)
- it's pitched as a "replacement for Homebrew" rather than "an alternative frontend for Homebrew" when it's entirely reliant on our infrastructure, maintainers, update process, API, etc.
Even on the above: of course people are free to do whatever they want! It's just at least some of the above hinders rather than helps the ecosystem and makes it harder rather than easier for us as a wider open source ecosystem to solve the problem "Homebrew is slow" (which, to be clear, it is in many cases).
steve-atx-7600 | 5 hours ago
derefr | 22 hours ago
When you say "Rust frontend", is the vision that Homebrew's frontend would eventually transition to being a pure Rust project — no end-user install of portable-ruby and so forth?
If so (ignore everything below if not):
I can see how that would work for most "boring" formulae: formula JSON gets pre-baked at formula publish time; Rust frontend pulls it; discovers formula is installable via bottle; pulls bottle; never needs to execute any Ruby.
But what happens in the edge-cases there — formulae with no bottles, Ruby `post_install` blocks, and so forth? (And also, how is local formula development done?)
Is the ultimate aim of this effort to build and embed a tiny little "Formula Ruby DSL" interpreter into the Rust frontend, one that supports just enough of Ruby's syntax + semantics to execute the code that appears in practice in the bodies of real formulae methods/blocks? (I personally think that would be pretty tractable, but I imagine you might disagree.)
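A minimal sketch of that "boring formula" happy path, assuming the pre-baked JSON carries a `bottle` section and a flag like `post_install_defined` (the field names and shape below are illustrative, not a guaranteed API contract):

```ruby
require "json"

# Decide whether a formula can be installed without ever executing
# Ruby: it has a bottle and declares no post-install hook. The JSON
# shape mimics Homebrew's formula API; exact field names are assumed.
def pure_bottle_install?(formula_json)
  bottle_files = formula_json.dig("bottle", "stable", "files") || {}
  has_bottle   = !bottle_files.empty?
  needs_ruby   = formula_json.fetch("post_install_defined", false)
  has_bottle && !needs_ruby
end

boring = JSON.parse(<<~JSON)
  {
    "name": "jq",
    "bottle": { "stable": { "files": { "arm64": { "url": "https://example.invalid/jq.bottle.tar.gz" } } } },
    "post_install_defined": false
  }
JSON

edge = JSON.parse(<<~JSON)
  { "name": "from-source-only", "bottle": {}, "post_install_defined": true }
JSON

puts pure_bottle_install?(boring) # prints true
puts pure_bottle_install?(edge)   # prints false
```

Everything in the second category (no bottle, or a real `post_install`) is exactly where a pure-Rust frontend would have to either shell out to Ruby or fail loudly.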
mikemcquaid | 11 hours ago
yokoprime | 19 hours ago
hirvi74 | 19 hours ago
AbanoubRodolf | 17 hours ago
The real compatibility test isn't "runs all Homebrew formulae" — it's "runs the 15-20 formulae each developer actually uses." A tool that handles those correctly and fails clearly on edge cases is more useful in practice than a technically complete implementation that's slower.
What's missing from this thread is any data on that surface area, not more benchmark numbers.
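Measuring that surface area is cheap: most developers' actual usage is already written down in a Brewfile, and the common case is a flat list of `brew`/`cask` lines that a simple scan can extract (the Brewfile contents here are made up):

```ruby
# A Brewfile is itself Ruby, but in practice it's usually a flat
# list of tap/brew/cask lines that a regex scan can pull out.
brewfile = <<~RUBY
  tap "homebrew/bundle"
  brew "git"
  brew "jq"
  cask "ghostty"
RUBY

formulae = brewfile.scan(/^brew "([^"]+)"/).flatten
casks    = brewfile.scan(/^cask "([^"]+)"/).flatten

puts "formulae: #{formulae.inspect}" # prints formulae: ["git", "jq"]
puts "casks:    #{casks.inspect}"    # prints casks:    ["ghostty"]
```

Running something like this over a corpus of real Brewfiles would give the "15-20 formulae people actually use" data the parent comment is asking for.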
waterTanuki | 14 hours ago
Where can I read more on this effort?
cromka | 10 hours ago
This is literally what "compatible" means; how else did you expect them to frame it?
ricardobeat | 9 hours ago
breppp | 9 hours ago
However, how is this effort different from uv vs PyPI? Why is this a bad thing?
ricardobeat | 8 hours ago
anbotero | 4 hours ago
adityamwagh | 3 hours ago
patabyte | 3 hours ago
drob518 | a day ago
pxc | a day ago
dilap | a day ago
Edit: no, it won't...
drob518 | a day ago
dilap | a day ago
swiftcoder | a day ago
I definitely have thought something along those lines (mostly when I go to install a small tool, and get hit with 20 minutes of auto-updates first).
Pretty sure I also will not be adopting this particular solution, however
bombcar | a day ago
SOLAR_FIELDS | a day ago
joshstrange | a day ago
I agree it’s annoying, but I haven’t turned it off because it’s only annoying because I’m not keeping my computer (brew packages) up-to-date normally (aka, it’s my own fault).
slackfan | a day ago
swiftcoder | a day ago
menno-dot-ai | 5 hours ago
saghm | 23 hours ago
what | 15 hours ago
saghm | 11 hours ago
swiftcoder | 10 hours ago
menno-dot-ai | 5 hours ago
This is not something that's solved by updating less frequently though. It would be solved by a 'minimum age' setting, but `brew` aren't planning on implementing that, with arguably valid reasoning: https://github.com/Homebrew/brew/issues/21421
swiftcoder | 5 hours ago
Minimum age solves a related problem - it gives maintainers some margin of time in which to discover vulnerabilities and yank the affected versions.
However, minimum age also delays you getting bug fixes (since those also need to age out).
In an ideal world one would probably be able to configure a minimum-age-or-subsequent-patch-count rule. i.e. don't adopt new major/minor package versions until either 1 month has elapsed, or a minimum of 2 patch versions have been released for that version.
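That policy is small enough to sketch directly. Everything below is illustrative: the thresholds are the ones proposed above, and no such setting exists in Homebrew today.

```ruby
# Adopt a new major/minor release only once it has either aged past
# a minimum number of days or accumulated enough follow-up patch
# releases. Purely illustrative; Homebrew has no such setting.
MIN_AGE_DAYS = 30
MIN_PATCHES  = 2

def adopt?(released_at, patch_count, now: Time.now)
  age_days = (now - released_at) / 86_400.0
  age_days >= MIN_AGE_DAYS || patch_count >= MIN_PATCHES
end

now = Time.utc(2026, 2, 1)
puts adopt?(Time.utc(2026, 1, 25), 0, now: now) # fresh, unpatched: prints false
puts adopt?(Time.utc(2026, 1, 25), 2, now: now) # fresh but patched twice: prints true
puts adopt?(Time.utc(2025, 12, 1), 0, now: now) # aged past 30 days: prints true
```

The `patch_count` branch is what keeps the bug-fix delay bounded: a release that's being actively patched gets adopted without waiting out the full age window.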
mproud | a day ago
noahbp | a day ago
never_inline | a day ago
fleebee | a day ago
alwillis | a day ago
Never had any issues.
ziml77 | a day ago
motorpixel | a day ago
staticassertion | a day ago
password4321 | a day ago
https://github.com/asdf-vm/asdf/issues/290#issuecomment-2365...
rconti | a day ago
It constantly blows my mind how insanely long it takes just to do a few simple things on the fastest hardware I've ever owned in my life.
zerd | 14 hours ago
saghm | 23 hours ago
karel-3d | 5 hours ago
However, this is a vibe-coded app, with around 30 commits per day, which I don't let install packages on my machine.
pxc | a day ago
kassadin | a day ago
nb info --cask codex-app
nb: formula '--cask' not found
nb: formula 'codex-app' not found
luizfelberti | a day ago
[0] https://github.com/lucasgelfond/zerobrew
tomComb | a day ago
It appears that Nanobrew is not.
I care about the light-weight efficiency of these new native code variants much more when I want to use brew on some little Linux container or VM or CI, than I do for my macOS development machine.
Alifatisk | a day ago
Btw, I noted this:
> Zerobrew is experimental. We recommend running it alongside Homebrew rather than as a replacement, and do not recommend purging homebrew and replacing it with zerobrew unless you are absolutely sure about the implications of doing so.
So I guess it's fine to run this alongside Homebrew and they don't conflict.
phist_mcgee | 22 hours ago
>Immediately get an error saying the install path is too long and needs to be fixed as /opt/zerobrew/prefix is too many bytes.
Yeah gonna need some work.
alsetmusic | a day ago
ryandrake | a day ago
Yea, I know. It's open source. They can do what they want. Still sucks.
1: https://docs.brew.sh/Support-Tiers
happyopossum | a day ago
I don’t think it’s reasonable to expect an open source project to support everything
gabagool | a day ago
dewey | a day ago
ksherlock | a day ago
bsagdiyev | a day ago
That makes no sense then. A power user may still want to run older OS versions for a reason. Take the training wheels off it and then it'll be a power user tool.
dewey | a day ago
No doubt there are edge cases like that, but I don't fault a project for not catering to the < 1% of users who would fall into that bucket and would probably be the ones that cause trickier support cases. These might also be the users who could just install the software without Homebrew; it's not like Homebrew is the only way to install software.
edschofield | 20 hours ago
dewey | 20 hours ago
https://telemetrydeck.com/survey/apple/macOS/versions/
rbanffy | 22 hours ago
I get it - it’s a different beast with very different ideas behind it, but MacPorts is BSD-solid, and that’s a lot.
halostatue | 3 hours ago
MacPorts has some level of support for PowerPC, but anything that isn't in the most recent ~3-4 releases is likely to be cut off from any number of packages at useful versions. (There's substantial work done to support Rust on much older versions of macOS, but Rust itself has also dropped support for older macOS versions.)
I believe that there's a recommended stream for when you need older versions support, but it's definitely a secondary target from what I've been reading on the MLs.
skybrian | 17 hours ago
password4321 | a day ago
There's also https://github.com/dortania/OpenCore-Legacy-Patcher for the adventurous.
maxkfranz | a day ago
ryandrake | a day ago
Also, the writing is on the wall: ultimately, Homebrew will be ARM-only, once Apple's legacy support becomes ARM-only. At which point it's game over for Intel Macs.
Homebrew solves the "availability of software" problem in the Mac ecosystem, but it does not solve the "Need to stay on the new hardware treadmill" problem.
yabutlivnWoods | a day ago
tantalor | a day ago
manlymuppet | a day ago
maxloh | a day ago
Do they use some kind of Ruby parser to parse formulae?
[0]: https://github.com/Homebrew/homebrew-core/blob/26-tahoe/Form...
fny | a day ago
Onavo | a day ago
Since the first 3 have no dependency on D, a better way would be to install them in parallel while D is still downloading.
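A sketch of that pipelining idea, with simulated downloads (package names and timings are made up; a real installer also has to serialize steps that touch shared state like the Cellar):

```ruby
# Download every package concurrently, and "install" each one as
# soon as its own download finishes, rather than blocking the fast
# packages on the slowest download (here "d").
packages = { "a" => 0.01, "b" => 0.01, "c" => 0.01, "d" => 0.3 } # fake download times (seconds)
installed = Queue.new

threads = packages.map do |name, download_time|
  Thread.new do
    sleep download_time # simulate the download
    installed << name   # install immediately after its own download
  end
end
threads.each(&:join)

order = []
order << installed.pop until installed.empty?
puts order.inspect # "d" lands last; a/b/c were never blocked on it
```

The key property: total wall time is bounded by the slowest download plus one install, not the sum of everything done serially.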
marksully | a day ago
12_throw_away | a day ago
themadsens | a day ago
After installing, 'nb list' and thus e.g. 'nb outdated' will yield an empty list! I have absolutely no use for a competing Homebrew installation that is mostly compatible...
hsaliak | 23 hours ago
MoonWalk | 23 hours ago
tzs | 22 hours ago
My mistake was that when I upgraded from my 2017 iMac (Intel processor) to an Apple silicon Mac at the start of 2024 and migrated via Time Machine, I did not do anything extra specifically for Homebrew. I just assumed that as things got updated via the normal periodic Homebrew updates I run, it would start grabbing the Apple silicon binaries for binary things it installed.
It turns out that was wrong. They made Apple silicon Homebrew kind of independent of Intel Homebrew: Intel Homebrew uses /usr/local and Apple silicon Homebrew uses /opt/homebrew. This allows having both native and Intel Homebrew installed at the same time if you need both.
The correct way to migrate from an Intel Mac to an Apple silicon Mac is to install Apple silicon Homebrew on the new Mac, and then install all the packages you want. Intel Homebrew works fine on Apple silicon Macs so you can use the Intel Homebrew that migrated via Time Machine to make the package list to use with Apple silicon Homebrew (or you can make it on the old Mac).
I only noticed this because I was trying to build something from source using some libraries that were installed via Homebrew and running into problems. An LLM was helping me figure this out, and it was telling me I might have to manually symlink those libraries from where they were in /opt/homebrew to where the build process for the thing I was building expected to find them, but I didn't have a /opt/homebrew. The libraries were somewhere in /usr/local. I then noticed those libraries were not built for Apple silicon, checked other things installed via Homebrew, saw nothing was for Apple silicon, and realized I had the wrong Homebrew.
hirvi74 | 19 hours ago
coldtea | 18 hours ago
Tried the same package with brew. Worked like a charm.
Uninstalled nanobrew.
denkmoon | 18 hours ago
sanderwebs | 16 hours ago
commandersaki | 11 hours ago
I already have a Brewfile in my dotfiles stored in git, but wanted a way to set up all the little things on my Mac like trackpad settings, dock settings, file associations, etc. nix-darwin is the obvious solution.
Gave the task to ChatGPT and it came back saying it's a good way to get started, but then offered a middle-ground of an idempotent script to set things up. So I investigated the latter, and after a couple of minutes I now have a setupmac() function in my .bash_profile (yeah I use bash) which mostly consists of a bunch of 'defaults' commands and a few other things, and now continue with brew for managing software and setupmac() to setup everything else, and of course manually manage my dotfiles for ghostty/nvim.
I wish I had this earlier, because I just set myself up on 3 different Macs in the last week or so. I'm also glad I don't need to learn a new language and tooling for something pretty simple. Everything is a bit disjointed and not as automated as a proper nix setup and doesn't have that fidelity that nix has, but it's straight-forward, compact in that it sits in my brain easily, and easy to execute.
bibimsz | 15 hours ago
orefalo | 12 hours ago
NamlchakKhandro | 11 hours ago
It's terrible