Today’s infrastructure is not your grandparents’ IT infrastructure, nor is it the infrastructure from a generation ago. The days of punch cards, vacuum tubes, ferrite core memory, floppies, and dial-up Internet are over.
Today’s infrastructure is also not the IT infrastructure that it was five years ago, or even a year ago, for that matter. Modern infrastructure is changing constantly, and all that we can do is provide a snapshot of it at this moment, along with a general picture of where it’s going.
If you are going to monitor infrastructure effectively, you need to understand what infrastructure looks like today, how it is changing, and what it will include tomorrow.
Let’s start by making a basic distinction: Hardware infrastructure is relatively stable (with a strong emphasis on the word “relatively”), and has been in a state of semi-stability for a few years. While any speculation about Moore’s Law reaching the end of the line is premature, the Moore’s Law curve appears to have at least partially leveled off for the moment with regard to processor speed and RAM capacity (mass storage may be another story).
This leveling off means that the most substantial and important changes in IT infrastructure have been on the software side. This shouldn’t be surprising, since to a considerable degree, modern infrastructure is software. Software-defined networking, virtual machines, containers, and the like mean that the line between hardware and software today is quite blurry.
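To make this concrete, here is a minimal sketch of what “infrastructure as software” can look like in practice, using the Python Docker SDK. It assumes Docker is running locally and the docker package is installed; the image and port choices are arbitrary illustrations, not recommendations.

import docker

# Connect to the local Docker daemon (assumes Docker is running and the
# docker-py SDK is installed: pip install docker).
client = docker.from_env()

# "Provision" a web server with a single call, instead of racking a machine.
container = client.containers.run(
    "nginx:latest",          # arbitrary image chosen for illustration
    detach=True,
    ports={"80/tcp": 8080},  # map container port 80 to host port 8080
)
print(f"Started {container.name} ({container.short_id})")

# "Decommission" it just as easily.
container.stop()
container.remove()

The same pattern scales up: entire networks, clusters, and deployment environments can be declared, created, and torn down in code, which is much of what makes the line between hardware and software so blurry.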
The fact that IT infrastructure can be seen largely as software is itself a key element of modern computing, and it should come as no surprise. Hardware, after all, is basically a framework, a structure designed to make things possible. What one does with those possibilities can make all the difference in the world.
The shift to software-based infrastructure has implications that go far beyond a typical change of platform. For one thing, hardware itself imposes a serious lag on the rate of change. It is expensive and time-consuming to replace or upgrade physical servers, networks, and peripherals, so many organizations have traditionally waited until it is obviously necessary (or even later) before making such changes. This lag may only be a matter of a few years, but it has typically affected the software level, as well as the infrastructure hardware itself, by imposing the need to accommodate both legacy hardware and the legacy software that it requires.
In modern software-based infrastructure, however, both application software and the elements that make up the infrastructure are insulated from most (if not all) of the underlying hardware elements, often by several layers of abstraction. As long as the hardware can support the requirements of the abstraction layer, the infrastructure itself is now largely free of hardware-imposed lag.
As a result, the rate of change in both infrastructure and application software is now governed by other factors, such as organizational culture and the practical limits on the speed of software design and development. These factors are generally “soft,” and the kind of lag that they tend to impose is both much shorter and much more dependent on conditions prevailing within a specific organization.
This means that any understanding of how we compute today can only be a snapshot that captures the state of modern IT infrastructure at the current moment. And what would such a snapshot contain? The key elements look something like this:
We compute largely in an environment that is virtualized and insulated from the hardware level by multiple layers of abstraction, and our development and deployment pipeline is continuous and managed by event-driven automation. In many respects, the modern IT environment is a virtual world, insulated from the traditional hardware-based IT world to the point where many of the concerns that dominated IT just a few years ago have become irrelevant.
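As one illustration of event-driven automation in such a pipeline, the following sketch (standard-library Python only) listens for a webhook event and triggers a deployment step when a push to the main branch arrives. The endpoint, port, payload fields, and deploy.sh script are all assumptions made for the sake of the example, not any particular CI/CD product’s interface.

import json
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer

DEPLOY_COMMAND = ["./deploy.sh"]  # hypothetical deployment script


class WebhookHandler(BaseHTTPRequestHandler):
    """Reacts to incoming events rather than running on a fixed schedule."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = self.rfile.read(length) if length else b"{}"
        event = json.loads(payload)

        # A push event on the main branch triggers a deployment step.
        if event.get("ref") == "refs/heads/main":
            subprocess.run(DEPLOY_COMMAND, check=False)

        self.send_response(202)  # accepted; the pipeline continues asynchronously
        self.end_headers()


if __name__ == "__main__":
    HTTPServer(("", 8000), WebhookHandler).serve_forever()

In a real pipeline this event-driven pattern is usually provided by a CI/CD service rather than hand-rolled, but the principle is the same: changes flow from commit to deployment automatically, with no hardware-imposed lag in between.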
If that’s a snapshot of today, what will the picture look like tomorrow, or in five or ten years? There’s no real way to know, of course. Any prediction made today is likely to look increasingly foolish as time goes on.
But here are some predictions anyway. It is likely that we have only begun to see the effects of freeing the virtual-computing environment from the constraints imposed by hardware. It is also likely that the distinctions between virtualized computing, virtual reality, and the traditional world of physical experience will break down even more. In many respects, the rate of change in computing today is limited by our ability to assimilate changes as they occur, and to make full use of new capabilities as they develop. But automation and intelligence capabilities will likely disrupt nearly every function, vertical, and domain, unleashing new potential for efficiency and dramatically altering the focus of people’s work.
Perhaps the virtualization of both computing and everyday experiences will increase the rate at which we can assimilate future change. If this is the case, future computing and future life in general might be completely unrecognizable to us if we were to catch a glimpse of it now, even though we are likely to be both the creators of and participants in that future. As we change the world, we change ourselves.