Faustian bargains for rotten apples

If I don’t publish at least one thing here in 2025 then I’ll break a tradition I started back in 2011 when I restarted writing. Every year since then I’ve put fingers to keyboard for at least one post, and I don’t quite feel like mothballing the blog or putting it out to pasture just yet, despite the morass and chilling effect of putting decent written content online these days. That’s a great topic for 2026.

So when thinking about what else I could write about in 2025 to keep the streak going, given so much is off the table now for so many reasons, I realised that one topic for me this year is still fair game: the big changes in my personal computing setup outside of work.

I’ve been using Apple’s Mac as my daily driver since 2006, starting with a white polycarbonate MacBook. Apple rebooted that product line as part of the Mac’s transition from PowerPC to x86, back when the operating system was still called Mac OS X. Between 2002 and 2006 I’d used Linux across the desktop and laptop systems in my personal computing life, and I’d come to see Mac OS X as a somewhat ideal embodiment of what I loved about Linux: it paired a UNIX-like core and userland with a slick GUI that was way more in sync with the way I think and work than Windows.

The Mac brought a serious level of polish to the kind of personal computing software I’m drawn to, by taking advantage of the fact that the OS only ran on hardware Apple shipped and could therefore tightly optimise for. That gave them something very compelling that they’ve cashed in on ever since, dragging legions of folks like me—folks who can’t gel with Windows and don’t want to daily-drive Linux, but who want a powerful, stable and malleable personal computer to create things on—to their platform.

There’s a Faustian bargain at play when you use a Mac, though, since you need to trade the free soul of software for those diabolical favours of polish, performance and stability. Sadly for Macphistopheles, now indubitably given over to the power of the endless riches provided by the others he made the same bargain with, he’s not holding up his end any more. Drop a hand into his barrel of fresh apples these days and you’re highly likely to grab one that’s rotten. Thus ends the ropey pun section of this year’s streak-keeping technological lament.

Between 2006 and 2019, though, the bargain held firm. I don’t buy new personal computers very often, so when I do I don’t mind investing what I judge to be the right money into the next one. 2006’s polycarbonate MacBook was parlayed into the first of a trio of desktop systems in 2010: a 27-inch iMac. I can’t remember exactly what I paid for it but it wasn’t bought new. I loved the all-in-one design that put pretty good hardware behind a great-for-the-time screen, so when the 27-inch “5K Retina Display” iMac showed up in 2014, putting pretty good hardware behind a best-in-class display heavily optimised for desktop usage and reading text, it was an easy upgrade to one of those in 2015.

Then in late 2019, flush with the spoils of a good bonus at work, I decided to heavily increase my stake in the Mac. Apple had given me three really good computers in the last decade and a bit, each improving on the last in key ways that truly felt like upgrades. So even though I knew at the time of buying that they were transitioning every Mac to Apple Silicon at some point in the future, I was still happy to invest in what would turn out to be the swansong for Intel in an Apple product: the 2019 Mac Pro.

I was thirsty for all of the claims from Apple that started in 2017 around re-designing the then poorly-regarded Mac Pro, the model line that replaced the Power Mac during the PowerPC to Intel transition. They convinced the world (or at least me) that they were taking high-end workstation computing seriously again as a product class, and that the new Mac Pro they were working on would be a love letter to that kind of system.

Following the promises throughout 2017 and 2018, in 2019 it looked for all the world like they’d delivered. With an Apple Silicon version some nebulous amount of time away in the future (which turned out to be 2023), I felt that even though they were already actively moving away from Intel in other product lines, one last Intel model released in 2019 would be supported well enough to last the 5-6 year cadence that I tend to buy new personal machines on.

So I tipped an obscene amount of money into a 2019 Mac Pro plus big upgrades and accessories to go with it, and felt good about the continuing state of my deal with the Mac division of Apple. At the time I bought it I thought I’d be over the moon with the system, get all the way to about 2025 before feeling like upgrading, and, with almost 20 years of Macs under my belt by then, finish my stint with the 2019 Mac Pro wanting the direct Apple Silicon-based replacement.

Instead, I type this to you from a modern x86-64 system running Linux, the Mac Pro hasn’t been switched on for anything serious in at least half a year now, and I sold the Apple Studio Displays I bought to go with it. The second-hand market for the 2019 Mac Pro is so poor that I got more for the Studio Displays than I could ever get for the Mac, and that’s despite loading it with upgrades.

I really went all out. I upgraded to the 24-core Intel Xeon W-3265M plus 192GB of ECC DDR4-2933 (which at the time of writing is probably worth more than the rest of the system, due to datacentre deployments of ML systems turning the commodity DRAM market into a shambles), and added a Sonnet 4x4 brimmed with enough high-end gen3x4 NVMe M.2 storage to have it benchmark at 12GB/s read and 10GB/s write. I replaced the default Radeon Pro 580X MPX with the Radeon Pro W5700X MPX to get access to its 16GB of dedicated VRAM, and spent almost $500 on Thunderbolt 4 Pro cables for the two Studio Displays. I even bought the stupid wheels. By most measures it remains a beast of a machine that I should still relish using today.

So, what went so wrong in the five years that I used it that it had me run screaming away from the platform I’d used for almost 20 years? There’s no denying Apple Silicon is (very) good, but I’d have to be out of my mind to blame it on the “legacy” hardware I chose to invest in one last time.

Given it’s what brought me to the Mac in the first place, it’s absolutely maddening to write that it was the now terrible quality of the software that really got under my skin, across both macOS and the software stack running on the Apple Studio Displays. The real kicker is that the closed-source, proprietary nature of it all means there was nothing I could do about it.

I have the privilege of access to key people at Apple responsible for some of the software running at the beating heart of the bits of macOS that went wrong for me, but—and I say this with the utmost respect for those very talented engineers who tried to help me get some of my bugs fixed—even they weren’t able to do anything about what I reported to them directly. And if you don’t have a direct line to folks working there like I do then you’re left filing “Feedback”, which I wouldn’t even bother wasting my time on. Getting a response is so rare that it’s completely useless as a means of getting bugs fixed in macOS.

The functional lowlights were super low:

  • The firmware on the Apple Studio Displays wouldn’t tolerate them being used with a DisplayPort switch, going blank when switching from Mac to PC and back to a Mac. That happened regardless of the Mac used, be it the Mac Pro or an M1 Pro-based MacBook Pro that I also have. Given the Studio Displays only have a single input, I had to physically swap the cable to move the displays between machines, often several times a day.
  • macOS 14 Sonoma broke compatibility with the Sonnet 4x4, and it wasn’t fixed until macOS 15 Sequoia. The failure mode was the installed SSDs not showing up on boot, cold or warm. It’s hard to use a computer when your main high-performance storage array isn’t there, and that was broken for a full year.

Then there are the myriad smaller bugs and problems that compound on top of the functional ones that stopped me from actually using the system:

  • macOS is incredibly slow almost everywhere you look: cold boot speed to a functional desktop, general I/O performance due to its security posture, fetching and applying software updates (especially major version upgrades). It’s hard to overstate this once you get to use comparable hardware with Linux. Windows is a similarly unoptimised morass, but for different reasons.
  • The incessant design changes and feature creep that often seem anti-user. Liquid Glass makes fundamentally bad usability mistakes, especially around legibility of text. ML-powered features are often inaccurate and misleading (especially the text summaries), or are plain creepy and weird (generative image things like Genmoji).
  • FaceTime won’t tolerate me taking off my Bluetooth headphones mid-call to go and use the bathroom. They become unusable as an output target as soon as I take them off my head, and I have to restart the call to fix that. Sometimes I need to completely restart the whole Mac. That’s been broken for almost 2 years.

I’ve already mentioned the next point but I have to repeat it since it’s so critical. Given the proprietary nature of almost everything about macOS, from the bottom layer where nobody bar Apple can construct and sign a macOS kernel and boot it on the hardware, all the way up through their key substrate software like CoreGraphics and Foundation that most application software sits on top of, and given the opacity of the support that they provide when it goes wrong, it’s impossible for anyone bar Apple to fix any bugs or modify the system to suit their needs.

So the bargain only holds while a platform operator like Apple is a good steward of what they control. That means responding diligently and rapidly to feedback about problems, and listening to users to figure out what we want from their hardware and software. When you’re in charge of everything like that and don’t allow any user malleability, or anyone outside to poke and prod and fix things, you need to steward it almost perfectly. I used to feel like they put me first when it came to crafting the whole system, so that even though it was proprietary and I had no say in how the hardware and software intersected, I’d get a fast, powerful, thoughtful system that did what I needed and moulded itself to my style of using a computer, with no showstopping rough edges causing serious friction.

This time around I got something that doesn’t fit what I’m looking for, especially around Liquid Glass and the ML feature creep, along with serious bugs that at times made it impossible to use the computer as intended. It’s hard at this point not to hit up against the realisation that if the software were more open source I’d be able to help fix it, possibly even directly. I actually enjoy getting my hands dirty doing just that.

The bugs that involved the hardware feel particularly inexcusable given the money spent. Despite the Mac being a proprietary platform, the Mac Pro and the Studio Displays use industry standards for connecting add-in cards and displays to a computer, allowing for the use of something like the Sonnet 4x4 (a product Sonnet designed specifically for the 2019 Mac Pro), and for connecting the displays to a PC.

In fact, Apple are often pioneers of the modern high-speed interface standards in the computer industry for peripheral device, display and add-in card connectivity. They’ve had a big hand in PCI Express, DisplayPort, USB and Thunderbolt, all of which feature in modern Mac systems just as they do in contemporary competing systems.

Despite that use of industry standards that should confer some basic interoperability with non-Apple hardware, I know that it’s my own fault for handing over the money in the first place, especially for the displays. I should have known that any company that sees fit to outfit a $1,600 segment-flagship display with just a single input wouldn’t care about how it might be used with a DisplayPort switch, despite that being the interface standard the display uses. They pull the same single-input bullshit with the $4,999 Pro Display XDR.

My post-Mac clarity comes in the form of a very high-performance x86-64 system running Linux, and it’s an enormous difference in terms of interactivity, responsiveness and my ability to mould it to the way I want to work. I’ve paired that new system with a duo of LG 4K OLEDs that run at 240Hz. Its CPU (an AMD Ryzen 9 9950X3D) has a third fewer cores than the Xeon in the Mac Pro (16 versus 24), but each core is much faster in raw performance and is fed by a much higher-performance cache and memory system (although I only have half the RAM capacity in the new system). Each of the individual Crucial T705 gen5x4 NVMe drives in the new system is as fast as the complete aggregate storage array in the Mac Pro’s Sonnet 4x4. NVMe bandwidth has quadrupled like-for-like in the time since the Mac Pro was released.

So with my new system’s hardware in the same kind of performance ballpark as the Mac Pro, it feels at least an order of magnitude faster and more responsive in normal use, especially when it comes to anything bound by I/O. The Linux distribution I use is Arch btw, and Arch is a rolling distribution that gets constantly upgraded with new software. I have a multi-gigabit connection to the Internet, which allows that new Linux system to download, validate, extract, and apply very large updates, even those spread across thousands of files, usually in the low tens of seconds.

macOS takes several times longer than that just to validate the data it downloads as part of a system update, never mind getting it all extracted, prepared and applied to disk, even on a system with high-performance CPUs and very fast raw disk I/O like the Mac Pro. I know the things each OS does during system updates aren’t 100% equivalent, but it highlights how fast a computer can be if the philosophy over who owns and controls the computer is flipped in favour of both the user and technical simplicity.
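
If you want a feel for what the validation step alone should cost, here’s a minimal Python sketch that hashes a large file the way an update pipeline might verify a download before touching it. The path is a placeholder for any multi-gigabyte file you have lying around; on a modern CPU sat on fast NVMe, this class of work runs at multiple gigabytes per second.

```python
# Rough probe of raw "validate the download" throughput: SHA-256 over a
# large local file, the same class of work an OS update pipeline does
# before extracting anything. The path below is a placeholder.
import hashlib
import os
import time

PATH = "/tmp/big-update-blob"  # substitute any multi-gigabyte file

h = hashlib.sha256()
start = time.perf_counter()
with open(PATH, "rb") as f:
    while chunk := f.read(1 << 20):  # read in 1 MiB chunks
        h.update(chunk)
elapsed = time.perf_counter() - start

size_gb = os.path.getsize(PATH) / 1e9
print(f"hashed {size_gb:.2f} GB in {elapsed:.2f}s "
      f"({size_gb / elapsed:.2f} GB/s)")
```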

On the Linux system all applications, including binaries I use from my shell, launch almost instantly, whereas on macOS you need to wait for the system to effectively phone home to Cupertino to ask for permission to run any code. I know that macOS does that in the name of system safety, but it results in a system that makes you feel like it’s wading through treacle. It’s gotten to the point where it’s even slow running code for notarised apps Apple have already been given signatures for, and that have already cleared the first-run quarantine process that macOS puts applications through.
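
You can make that treacle feeling concrete with a crude experiment. This is just a sketch, not a rigorous benchmark: it spawns a do-nothing binary over and over and reports the average launch time, so the number it prints only means something when you compare the same run across the two systems. Run it a few times on each, since first runs and cold caches matter a lot more on macOS.

```python
# Crude launch-latency probe: spawn and reap a trivial binary many times
# and report the average. Compare the result on macOS vs Linux.
import subprocess
import time

RUNS = 50
BINARY = "/usr/bin/true"  # present on macOS and most Linux distributions

start = time.perf_counter()
for _ in range(RUNS):
    subprocess.run([BINARY], check=True)
elapsed = time.perf_counter() - start

print(f"{RUNS} launches in {elapsed:.2f}s "
      f"({elapsed / RUNS * 1000:.1f} ms per launch)")
```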

Technologically I think we all know deep down that macOS doesn’t need to be that way, doing so much behind the scenes all the time that it’s noticeably detrimental to overall system responsiveness. Open Console.app on macOS and ask it to show you the current streaming system logs, something it doesn’t do by default now when you open it because of the usability and performance hit, and you’ll see so much log spam from everything running that it makes you feel like the system is broken.
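
If you’d rather put a number on the spam than eyeball it in Console.app, a rough sketch like the one below reads from the same unified logging system via the `log stream` command-line tool and counts how many lines arrive in ten seconds on a supposedly idle Mac. Treat the line count as a loose proxy for message volume, nothing more.

```python
# Rough measure of unified-log chatter on an otherwise idle Mac: read
# from the system `log stream` CLI for ten seconds and count the lines.
import subprocess
import time

proc = subprocess.Popen(
    ["log", "stream", "--style", "compact"],
    stdout=subprocess.PIPE,
    text=True,
)

count = 0
deadline = time.monotonic() + 10
for _ in proc.stdout:
    count += 1
    if time.monotonic() > deadline:
        break
proc.terminate()

print(f"roughly {count} log lines in 10 seconds")
```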

Given how good modern Linux distributions like Arch are these days, Apple and Microsoft have to be worried. Especially Microsoft, who are currently somehow doing even worse than Apple when it comes to stewardship of their flagship consumer operating system, brimming it with intrusive ads—thankfully something Apple has stayed away from in macOS—and ML capabilities nobody seems to have asked for or wants to use.

Put it all together—the big functional bugs that persisted for full major version runs, the smaller yet significant usability issues, the proprietary closed-source nature of the software, the really bad Liquid Glass UI redesign, the ML features that don’t work well on behalf of the user—and they’ve lost me as a desktop workstation customer, at least for the foreseeable future. Realistically, I can’t see a way back either, unless they open source a majority of the substrate frameworks, undo or make truly optional years of excess feature creep, fix a majority of the smaller operational bugs, make the OS much faster and more responsive, and repeal the recent poor user interface design and experience.