Operating systems are worthy of our attention
Regular readers will know The Guardian’s misreporting of computer science is a frequent stimulus for my posts. Well, it’s happened again. This time, an article described Unix as a programming language. Argh!
Okay, I struggle to understand how someone who specialises in computer-related reporting could make this mistake, but I’m not going to dwell on it[1]. Instead, I’m going to use it as an excuse to make up for the recent lapse in my posting schedule[2] and talk about why operating systems (OSs) should be worthy of our attention, rather than being an obscure thing that can readily be confused with something else. I’m also going to say a little bit about how the future of OSs might pan out in an increasingly AI-centric era.
I should start by saying that not everything about OSs excites me. In fact, they clearly separate into things I find interesting and things I find boring. The boring bit includes resource allocation algorithms, which spend their time divvying up the hardware resources between everything that wants to use them. They’re basically managers. Without them, the bits and pieces of your computer would spend all their time squabbling. So, they’re definitely very important, but they bring back distant memories of sleeping through lectures.
More interesting to me are the services provided by OSs. You want your program to have a user interface? You want to draw something? You want some animation? Show a video? Download from the internet? Make some nice sounds? And you don’t want to implement all this from scratch by yourself? Well, the OS is waiting to provide all of this and more with a mere function call, saving you the decades of man hours that it took experts to put all of this stuff together. Which I think is pretty neat. And OSs are always growing, providing more and more services to reflect the growing needs of developers. Take Apple Intelligence as a recent example — this is basically an OS service that gives applications access to LLMs, allowing them to do things like text translation and image generation with just a few lines of code.
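To make that concrete, here’s a small sketch (in Python, though any language with OS bindings would do) of three unrelated OS services, each reachable with a single call. The file name is made up for illustration; everything else is standard-library Python sitting on top of ordinary OS facilities:

```python
import os
import tempfile

# Filesystem service: the kernel handles allocation, buffering and
# device I/O; we just ask it to create, write and read a file.
path = os.path.join(tempfile.gettempdir(), "os_demo.txt")  # hypothetical name
with open(path, "w") as f:
    f.write("hello from the OS\n")
with open(path) as f:
    contents = f.read()
os.remove(path)

# Entropy service: cryptographic-quality random bytes, courtesy of the kernel.
noise = os.urandom(16)

# Introspection service: ask the OS how many CPUs are available.
cpus = os.cpu_count()
```

Each of those one-liners stands in for a large pile of kernel and driver code that somebody else wrote, debugged and maintains on your behalf.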
But what really excites me is the sheer complexity of modern OSs. Take Unix as an example. This thing’s been around since 1969, growing more and more complex every year, gradually morphing into modern incarnations like macOS and Linux. It’s not just a program — it’s a rich ecosystem of programs and code written by generations of perhaps hundreds of thousands of programmers in over 100 programming languages[3]. Most of its components are blissfully unaware of the rest, but thanks to the way in which OSs are designed, it all hangs together and gets things done. Really, it’s a masterclass in how software engineering principles can assimilate over a billion lines of code into one cohesive system. And they more-or-less give it away for free[4].
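That “blissfully unaware of the rest” point is worth a tiny demonstration. Below is a Python sketch (assuming a Unix-like system with the standard `sort` utility) in which our program and `sort` know nothing about each other; the OS’s pipe abstraction is the only thing connecting them:

```python
import subprocess

# Run the independent `sort` program, feeding it text through an
# OS-managed pipe and collecting its output through another.
result = subprocess.run(
    ["sort"],
    input="banana\napple\ncherry\n",
    capture_output=True,
    text=True,
)
sorted_lines = result.stdout
```

Neither program was written with the other in mind, yet the OS lets them cooperate through a simple, shared interface — which is essentially how the whole ecosystem hangs together.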
And all this man-made complexity is completely hidden behind the shiny exteriors of computers, mobile devices and the other electronic devices that litter our houses. Thousands of programs, continuously doing things that users are unaware of, just so that you and I can watch content, doom scroll, incarcerate Pokémon, or whatever. So next time you pluck your smartphone from your pocket, think about all that your OS is doing on your behalf, and please don’t accuse it of being a mere programming language.
Not that programming languages are unimpressive, by the way. Those of you who’ve been here a while know I have a fetish for such things. And an interesting fact is that both Unix and the great C programming language were created at Bell Labs, with the latter used to write the former.
And what of the future? AI is influencing OSs as much as anything else, and over the last couple of years there’s been a lot of talk of a convergence between AIs and OSs. Much of this is quite speculative, but in principle an LLM could take on many of the activities currently done by man-made code and algorithms, and potentially lead to efficiency gains in the process. Or maybe it would just replace perfectly adequate code with an unreliable black box that sometimes works miracles and sometimes causes Armageddon[5]. Who knows?
But at the moment, the real action lies in adapting operating systems to support AI infrastructure, rather than transforming them into AIs. I mentioned Apple Intelligence already — this is an example of developments on the consumer-facing side of things, essentially an OS-level service that gives apps seamless access to LLMs. On the enterprise side, new operating systems, like Vast AI OS, are being built specifically for data centres, optimised to manage the demands of modern AI workloads more efficiently than traditional OSs. In a way, it’s a case of history coming full circle: as computing shifts back from personal devices to remote servers, we’re revisiting the environment Unix was originally designed for.
1. I’ve already been down this rabbit hole in “Big computers” will save us all!
2. Sorry about that. Teaching and paper deadlines have been absorbing all my processing cycles recently.
3. The figures in this paragraph are rough estimates for Unix-like OSs based on some Googling and a follow-up discussion with Claude. Don’t quote me!
4. Linux has always been free, and Apple basically give it away with the hardware. Microsoft still charge, but it’s pretty cheap considering the amount of work that went into it. You could also argue that a lot of the code in modern OSs wasn’t written by the companies that market it, so perhaps this is fair enough.
5. A trend that I discussed in Will everything become a neural network?


