If it doesn’t ever execute Ruby: it cannot be compatible with Homebrew. “Compatible” is doing a bit of work here when it also means “implicitly relies on Homebrew’s CDN, CI, packaging infrastructure and maintainers who keep all this running”.
There’s a new vibe coded Homebrew frontend with partial compatibility and improved speed every few weeks.
Homebrew is working on an official Rust frontend that will actually have full compatibility. Hopefully this will help share effort across the wider ecosystem.
Context for those unaware: the commenter, mikemcquaid, is the project lead for Homebrew.
Thank you, his arguments totally make sense; the only part that makes me icky is:
> There’s a new vibe coded Homebrew frontend with partial compatibility and improved speed every few weeks.
People are free to do this, and probably do it because Homebrew is slow. Alternatives are often not a bad thing.
Yeah, tbh homebrew is slow as fuck. It literally took 30 minutes to install aws cli on my 2020 mbp. I will happily flock to every new version that's faster.
Since I enabled HOMEBREW_DOWNLOAD_CONCURRENCY, downloads have improved for me to the point where download speed is no longer an issue.
Good to know! I was doing this with a hacky one-liner but wasn't aware of this flag. I think the sequential build/install process is the agonizing bit though.
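For anyone wanting to try it, a minimal sketch of enabling the flag in a shell rc file (the variable name is from Homebrew's docs; I believe it accepts a number or "auto", but check `man brew` to confirm):

```shell
# In ~/.zshrc or ~/.bashrc: let brew fetch multiple bottles at once.
# "auto" lets Homebrew pick a pool size; a number sets it explicitly.
export HOMEBREW_DOWNLOAD_CONCURRENCY=auto
```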
Point noted! I took it as a tongue-in-cheek phrasing of "agentically coded". Hopefully, that's right.
It is really cool that Homebrew provides a comprehensive enough JSON API to let people build on Homebrew in useful ways without directly running Ruby, despite everything being built in a Ruby DSL. That really does seem like a "best of both worlds" deal, and it's great that alternative clients can take advantage of it.
I didn't know about the pending, official Rust frontend! That's very interesting.
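To make the "build on Homebrew without running Ruby" idea concrete, here's a minimal sketch of consuming that JSON API. The payload below is abridged and illustrative: the field names follow the general shape of `https://formulae.brew.sh/api/formula/<name>.json`, but a real response carries many more fields, so treat the exact schema here as an assumption.

```ruby
require "json"

# Abridged, illustrative payload; a real client would GET
# https://formulae.brew.sh/api/formula/jq.json instead.
sample = <<~JSON
  {
    "name": "jq",
    "versions": { "stable": "1.7.1" },
    "dependencies": ["oniguruma"]
  }
JSON

formula = JSON.parse(sample)
# Print the formula name and stable version, then its dependencies,
# which a non-Ruby client could resolve recursively against the API.
puts "#{formula["name"]} #{formula.dig("versions", "stable")}"
puts "deps: #{formula["dependencies"].join(", ")}"
```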
Wow they are finally getting away from Ruby? Awesome. The speed will be a nice boon
Yeah I don't know why people are saying that speed doesn't matter. I use Homebrew and it is slow.
It's like yum vs apt in the Linux world. APT (C++) is fast and yum (Python) was slow. Both work fine, but yum would just add a few seconds, or a minute, of little frustrations multiple times a day. It adds up. They finally fixed it with dnf (C++) and now yum is deprecated.
Glad to hear a Rust rewrite is coming to Homebrew soon.
One of the reasons I switched to arch from debian based distros was precisely how much faster pacman was compared to APT -- system updates shouldn't take over half an hour when I have a (multi)gigabit connection and an SSD.
It was mostly precipitated by when containers came in and I was honestly shocked at how fast apk installs packages on alpine compared to my Ubuntu boxes (using apt)
pacman is faster simply because it does fewer things and supports fewer use cases.
For example pacman does not need to validate the system for partial upgrades because those are unsupported on Arch and if the system is borked then it’s yours to fix.
Ruby doesn't have to be the slow part; Bazel uses Starlark, which is mostly Python, and it's very fast.
yum was slow not because of python but because of the algorithm used to solve dependencies
Anyway the python program would call into libsolv which is implemented in C.
dnf5 is much faster, but the authors credit the algorithmic changes rather than the rewrite in C++
dnf < 5 was still performing similarly to yum (and it was also implemented in python)
> dnf < 5 was still performing similarly to yum (and it was also implemented in python)
I'm perhaps not properly understanding your comment. If the algorithmic changes were responsible for the improved speed, why did the Python version of dnf perform similarly to yum?
> Yeah I don't know why people are saying that speed doesn't matter. I use Homebrew and it is slow
Because how often are you running it where it's anything but an opportunity to take a little breather in your day? And I do mean little: the speedups being touted here are seconds.
I have the same response to the obsession with boot times, how often are you booting your machine where it is actually impacting anything? How often are you installing packages?
Do you have the same revulsion for the time spent going to the bathroom? Or getting a glass of water? Or basically everything in life that isn't instantaneous?
I would guess this change builds on the existing json endpoints for package metadata but that the Ruby DSL is remaining intact.
I think how to marry the Ruby formulas and a Rust frontend is something the Homebrew devs can figure out and I'm interested to see where it goes, but I don't really care whether Ruby "goes away" from Homebrew in the end or not. It's a lovely language, so if they can keep it for their DSL but improve client performance I think that's great.
I appreciate the push for an official rust frontend. I've personally been migrating (slowly) to using nix to manage my Mac's software, but there are a ton of limitations which lead me to rely on homebrew anyway. The speed ups will be appreciated.
Heyyyy, who are you to tell us what is and isn't compatible with homebrew?
(Just kidding, thank you for creating homebrew and your continued work on it!)
I think Max Howell created Homebrew. I think McQuaid is the current maintainer
Makes no sense, the wording suggests it can use Homebrew's backend, not that it's a complete alternative to Homebrew. Nobody is confused about that.
The recipes for building and installing homebrew packages are written in Ruby
You cannot really be compatible with this unless you run the Ruby, since the install scripts can do arbitrary computation
In reality most recipes are a simple declarative config, but nothing stops you from doing arbitrary Ruby in there.
Hence to achieve total compatibility one would need to run Ruby
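A self-contained sketch of that point. The `Formula` class below is a tiny stand-in for Homebrew's real DSL base class, defined only so this file runs on its own; the `Example` formula shows how a recipe that looks declarative can still compute values at load time, which is exactly what a non-Ruby client can't evaluate faithfully.

```ruby
# Tiny stand-in for Homebrew's Formula DSL class (not the real thing).
class Formula
  def self.url(u)
    @url = u
  end

  def self.resolved_url
    @url
  end
end

class Example < Formula
  # Nothing stops a formula from running arbitrary Ruby at load time:
  arch = RUBY_PLATFORM.include?("arm64") ? "arm64" : "x86_64"
  url "https://example.com/tool-#{arch}.tar.gz"
end

# A pure-JSON client would only ever see the already-evaluated URL.
puts Example.resolved_url
```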
Is this still true since they swapped to distributing binaries rather than building from source on each install? It's been years since I last installed something from homebrew that built from source, so something that could install the same binaries would be compatible from my standpoint.
That said, it's also been a while since I've really had any huge complaints about brew's speed. I use Linux on my personal machines, and the difference in experience with my preferred Linux distro's package manager and brew used to be laughable. To their credit, nowadays, brew largely feels "good enough", so I honestly wouldn't even argue for porting from Ruby based on performance needs at this point. I suspect part of the motivation might be around concerns about relying on the runtime to be available. Brew's use of Ruby comes from a time when it was more typical for people to rely on the versions of Python and Ruby that were shipped with MacOS, but nowadays a lot of people are probably more likely to use tooling from brew itself to manage those, and making everything native avoids the need to bootstrap from an existing runtime.
I mean, I'm confused about it. The nanobrew homepage says this:
> nanobrew
> The fastest macOS package manager. Written in Zig.
> 3.5ms warm install time
> 7,000x faster than Homebrew · faster than echo
It presents itself as an alternative to Homebrew.
There are many such examples for npm as well: many "compatible" managers, one registry.
Sorry, examples of what? Package managers that present themselves as replacements for other package managers? Or package managers that aren't compatible with the registry they're supposed to be compatible with? Your use of scare quotes is confusing.
pnpm, npm, and yarn all have different lockfiles, all use the same registry format (and the same registry itself), and all try to stay compatible in other ways.
You won't have a situation where one person uses yarn and another uses pnpm on the same project, though.
Please, don't remove bottles and casks that are blocked by Gatekeeper. :~(
What would be great is a Homebrew-compatible system that doesn't cut off support for older machines. I have a 3.8 GHz Quad core i5 iMac that still crushes, yet Homebrew has determined that I'm just too old and icky[1] to work with anymore. I had to move over to MacPorts, which is surprisingly nice, but I still miss brew.
Yea, I know. It's open source. They can do what they want. Still sucks.
To be fair, Apple stopped providing security fixes for Mojave ~4+ years ago, and there have been 7 or 8 new os releases since then…
I don’t think it’s reasonable to expect an open source project to support everything
I agree in principle but Homebrew only supports the latest 3 versions of macOS. Right now Ventura 13 which came out in October 2022 is unsupported.
I still think that's entirely fair for a power user tool like Homebrew. With the upgrade rates of macOS, that probably means 98% of users would be covered. Expecting an open source project to accept bug reports from a wider variety of OS versions, which would then require test devices on those versions to reproduce issues, sounds unrealistic. Bigger companies, or Apple itself, I would hold to much higher standards.
> power user tool like homebrew.
That makes no sense then. A power user may still want to run older OS versions for a reason. Take the training wheels off it and then it'll be a power user tool.
> A power user may still want to run older OS versions for a reason.
No doubt there are edge cases like that, but I don't fault a project for not catering to the < 1% of users who fall into that bucket and who would probably cause the trickier support cases. These users could also just install software without Homebrew; it's not like Homebrew is the only way to install software.
brew used to say, more or less, "This OS is old and unsupported. Don't submit bug reports. If you have problems, too bad. If you submit a PR to fix something, we might merge it". Fair enough, right? Now it just says, "Go fuck yourself, grandpa."
Run Linux on it. Apple has cut that OS off anyway. Would be safer security wise to have an OS that's updated
Yes MacPorts is the way. I switched after a new MacOS release meant mine was too old - brew update uninstalled a bunch of stuff I had been using then it stopped and let me know.
There's also https://github.com/dortania/OpenCore-Legacy-Patcher for the adventurous.
You could use the OpenCore Legacy Patcher to upgrade to v15/Sequoia: https://dortania.github.io/OpenCore-Legacy-Patcher/
Sure, but this might win you a couple of years max. Homebrew's "Support Tiers" page, which I linked, also addresses OCLP users, going so far as to specify a minimum Intel architecture. So, even if you use OCLP to allow support for newer OS versions, eventually your CPU architecture will be too old and you're back in Tier 3.
Also, the writing is on the wall: Ultimately, Homebrew will be ARM-only, once Apple's legacy support becomes ARM-only. At which point it's game-over for Intel Macs.
Homebrew solves the "availability of software" problem in the Mac ecosystem, but it does not solve the "Need to stay on the new hardware treadmill" problem.
This feels like a solution looking for a problem. I have a couple hundred brew packages on my system and I’ve never sat there thinking “If this was only 2 seconds faster…” while doing an update. I’m sure the Homebrew folks could mine this for a few ideas of how to further optimize brew, but I don’t think I’ll be adopting it anytime soon. Compatibility is more important than speed in this case.
Brew definitely used to be a lot slower, and I used to find it very tedious. I feel like they've done a reasonably good job in improving that over the years though (with the switch to distributing binaries by default being a huge win in terms of speed). I have to wonder if stuff like this is more due to lingering feelings from before combined with the easy access to vibe coding tools. If LLM coding came a few years earlier, maybe projects like this one would have made more sense to me.
> I’ve never sat there thinking “If this was only 2 seconds faster…” while doing an update
I definitely have thought something along those lines (mostly when I go to install a small tool, and get hit with 20 minutes of auto-updates first).
Pretty sure I also will not be adopting this particular solution, however
I'm not sure if I just have way fewer things installed than most people or I just update more often, but I haven't experienced anything like this for years. I run `brew upgrade` probably around once every (work)day, usually right before doing a git pull or something, and then I'll quickly look at a couple emails or slack messages, and then it's always done by the time I switch back
I've never thought "only 2 seconds faster" - I've certainly thought "why is this taking half the time it takes Gentoo to recompile an entire server".
But you can turn that behavior off, IIRC it tells you the environment variable to set if you don’t want it to do that every time it runs.
I agree it’s annoying, but I haven’t turned it off because it’s only annoying because I’m not keeping my computer (brew packages) up-to-date normally (aka, it’s my own fault).
I'd be much happier if it were on a background job, than arbitrarily running when I invoke a command
Terrible default behavior is a great reason to abandon a software package.
FWIW this seems to have improved in recent years. Back in the dark times of non-parallelized downloads, I would purposefully wait until the end of the day and fire the thing off before leaving.
I've been a lightweight homebrew user for many, many, many years now. I just use it to download or update a thing I need, once every 3-6mo.
It constantly blows my mind how insanely long it takes just to do a few simple things on the fastest hardware I've ever owned in my life.
If you use the Homebrew module for Nix-Darwin, running `brew` against the generated brewfile becomes the slowest part of a `darwin-rebuild switch` by far. In the fast cases, it turns something that could take 1 second into something that takes 10, which is definitely annoying when running that command is part of your process for configuration changes even when you don't update anything. Homebrew no-ops against an unchanging Brewfile are really slow.
Agreed here. The speed bottleneck I run into is simply that there's often a lot of packages that need updating, so there's a lot to download. And if anything needs to be compiled from source then the time that takes will dominate (though I think everything I currently run is thankfully pre-built)
The same criticism has been leveled at Deno, pnpm, and Bun, and yet, despite all the years since their respective releases, node and npm remain slower than all three.
Yeah, but do they work? Last time I gave bun a chance their runtime had serious issues with frequent crashes. Faster package installation or spin-up time is meaningless if it comes at the cost of stability and compatibility.
bun is my go to for npm packages; it’s so much better and faster than npm, it’s not funny.
Never had any issues.
Well, pnpm solves the storage issue, which is a more pressing reason to use it. (I don't know about deno/bun)
If I have to deal with even the mention of another package manager in the cross-platform dev ecosystem I am going to snap
Horses for courses, but I've stopped using brew 'cuz it's too slow, so this might bring me back!
Edit: no, it won't...
Agreed on horses for courses. Different people have different tolerances. And yea, all things being equal, faster is better, but they are almost never equal. If you don’t mind me asking, what does “too slow” mean for you in this context? Do you have a particularly complex setup? And what do you use now as an alternative and how has that impacted the update speed?
I wish I could remember the details -- I know I got annoyed with things being slow and when I got a new computer decided to go the no-homebrew route. I'm using nix, and it seems fine so far, but I also really don't understand it at all, which is a little concerning. :-)
My brew update/upgrade takes forever
I've wanted brew to be faster. It would be a nice QoL for me.
See also: asdf and mise
https://github.com/asdf-vm/asdf/issues/290#issuecomment-2365...
I naively assumed it would work on the already installed homebrew packages. No such luck.
After installing, 'nb list' and thus e.g. 'nb outdated' will yield the empty list! I have absolutely no use for a competing Homebrew installation that is only mostly compatible.
This might be a good thing for homebrew to adopt for the download/install process, but if it doesn't include a ruby interpreter, I have a hard time seeing how it's going to be compatible with anything but searching and installing bottles. I install most of my packages from a Brewfile, which itself is Ruby code.
> I install most of my packages from a Brewfile, which itself is Ruby code.
Same. Whatever happens, the new version should support Brewfile.
It might be good to explain how this differs from zerobrew [0], which is trying to accomplish the same thing
Zerobrew looks mature, I'll check it out.
Btw, I noted this:
> Zerobrew is experimental. We recommend running it alongside Homebrew rather than as a replacement, and do not recommend purging homebrew and replacing it with zerobrew unless you are absolutely sure about the implications of doing so.
So I guess it's fine to run this alongside Homebrew; they don't conflict.
And zerobrew, like the original Homebrew, is compatible with Linux.
It appears that Nanobrew is not.
I care about the light-weight efficiency of these new native code variants much more when I want to use brew on some little Linux container or VM or CI, than I do for my macOS development machine.
This is most certainly vibed with a few optimization focused prompts. Yes - performance is a feature, but so is lack of risk.
So, A) to what extent is this vibe coded? And B) what is "trilok.ai" where you download it from?
Do you choose compatibility or speed?
nb info --cask codex-app
nb: formula '--cask' not found
nb: formula 'codex-app' not found
How does this work? AFAIK Homebrew formulae are written in Ruby [0].
Do they use some kind of Ruby parser to parse formulae?
[0]: https://github.com/Homebrew/homebrew-core/blob/26-tahoe/Form...
It uses the Homebrew API and uses its own dependency resolver and linker to pull Homebrew's precompiled packages.
If we get the Bun-ification of every package manager and language ecosystem that would be an awesome thing. This is a great trend.
I'm not a Python dev, but I appreciate the motivation uv has inspired across other package managers. I tried another brew replacement called zerobrew last month. It installed packages to a different directory from homebrew, so I didn't actually test drive after seeing that. Regardless, I look forward to the competition pushing mainstream tools to improve their performance.
And why does speed matter in this case?
Does it reinstall postgres for every package install?
HOMEBREW_NO_AUTO_UPDATE=1 will disable this (annoying) behavior. Set it in your bashrc or zshrc.
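A minimal sketch of what that looks like in a shell rc file:

```shell
# In ~/.zshrc or ~/.bashrc: skip the implicit `brew update` that
# otherwise runs before install/upgrade commands.
export HOMEBREW_NO_AUTO_UPDATE=1
```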
(report card for an0malous): "Does not play nice with other students."
It's true :')
I've been looking for something like this, especially to use only with casks now that Homebrew has removed support for not adding the quarantine bit. Looking forward to giving it a try!
what happens if I test this tool by installing some packages and then remove (the tool)? will I still be able to use Homebrew to manage these new packages?
The current version of brew has a flaw: the installer can't handle isolated dependency trees independently. If packages A, B, C, and D all have updates, where A, B, and C depend on each other and total, say, 1 MB, while D is 1000 MB, brew works in a map-reduce manner: it attempts to finish downloading everything in parallel (even though the real bottleneck is D) before doing any installation.
Since the first three have no dependency on D, a better way would be to install them in parallel while D is still downloading.
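A toy sketch of the suggested pipelining, not Homebrew's actual code: a downloader thread pushes each package onto a queue as soon as its (simulated) download finishes, and the installer consumes from the queue immediately, so A, B, and C get installed while the big package D is still in flight. Package names, sizes, and sleeps are all illustrative stand-ins.

```ruby
sizes = { "A" => 1, "B" => 2, "C" => 3, "D" => 1000 } # MB, illustrative
done = Queue.new

downloader = Thread.new do
  # Small packages finish downloading long before the big one.
  sizes.sort_by { |_, mb| mb }.each do |name, mb|
    sleep(mb / 100_000.0) # stand-in for network time
    done << name
  end
  done << :eof # signal that all downloads have completed
end

install_order = []
# The "installer" starts on each package the moment its download is
# done, overlapping with D's still-running download.
while (pkg = done.pop) != :eof
  install_order << pkg
end
downloader.join

puts install_order.join(" -> ")
```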