IT systems become more complex over generations. There's always something that could be made simpler by adding another abstraction layer. This cannot go on
indefinitely. But it will probably go on for longer than we all wish it would.
Sorry, this might be a totally stupid and banal thought. But it feels relevant right now and I don't know how to explain my point more concisely. Let me know if you think I
should. (Or if it's because I don't really have a point.)
When you think back to how computers were used in the 1960s, the 1970s, the 1980s, the people who used them back then really knew their systems, as it's often put. With
each generation, more people began using computers. So while some were still designing, improving and expanding circuits, others would work on inventing higher-level
languages and operating systems. Moving a bit further, into the 1990s, microcode became more complex, operating systems started to become more complex, and software
interfaces between applications were developed. But we still needed people who maintained OS kernels, worked on processor architectures and knew serial and parallel
interfaces on a low level.
You can look at any small part of a computer and will find that new abstraction layers have formed over time. Take your fan speed controller, for example. I bet you don't
even know what processor it is utilising or what it's capable of. Even if you study an open source driver for it, you'll likely just see an imitation of some things a
Windows driver is doing. There are probably only a handful of people in the world who really understand that tiny part of your computer.
Let's not get into networking, the internet and the complexity that was added to everything in the last couple of decades.
My point is: We will need people who understand
every little part of these hugely complex systems, at least at some point. Otherwise systems will not run smoothly or
reliably. In the silly little example of the fan speed controller: if it stopped working with newer systems for some reason, documentation would probably be good
enough to find a workaround. If it had to be replaced in future systems, that's also doable, or you can just run fans at full speed all the time. But there are so
many other components (I'm mainly thinking of software) that don't just have to work reliably on their own but interact with other, evolving components.
Not every little
thing can be maintained continuously with the amount of attention it deserves. Be it the often-used example of a small open source software component that
90% of software somehow relies on, maintained by a single person at the risk of, well, anything that might happen to a human. Or a commercial product that's driven to make
as much money as possible with next to no work hours. Or the end of life of some software that still runs on millions of machines.
These are disruptions in IT that are happening right now. With the increasing complexity of systems, failures that have not been properly planned for will probably happen
more and more often. It's not even unusual today that when a service goes down, the people responsible for keeping it up don't understand what has happened. They have to
start researching the matter; if they have the time, or it's deemed important enough. Because there's a gap between the coders, who know the languages, frameworks and tools
they're working with, and the system administrators, who know their OS and its config, containers with other OSs and their configs, and somewhat the services that are
running. But in between there are frameworks, huge libraries that depend on other libraries you don't even know about, cloud services you have no insight into. The code
written, if it's still written by a coder at all, may be compiled into another language that's interpreted, each layer adding gigabytes of dependencies you have never read
and can't possibly stay up to date on.
All of those components have bugs. The more we add, the more failures will occur. More projects will be kept barely alive because they're still needed to delay another
failure. Just as you – even as a computer enthusiast – likely don't know what physical signals are needed on your USB port to make it do what it does, the people
responsible for keeping a service you rely on running don't know how most of the systems work that they are keeping online.
This growing complexity can be seen in almost every field. But I think it is growing especially fast in medical science and IT. In both, it will have negative effects on
our lives. But with medicine it is a side effect of a science that's working to improve and prolong our lives. So it might be worth it. IT does not have that noble goal.