By Andy Jonak
If there's one thing most in IT can agree upon, it's change. In what we do in IT, change is constant, whether we like it or not. If you don't like change, then I recommend staying far away from the IT world. However, if you are the type of person who embraces or even thrives on change, there's a good chance IT is where you belong. Let's explore change in the context of IT this month.
As someone who has been in IT since 1994—going on 27 years now—I've seen the IT world progress, change, and evolve. To be clear, I like change. Change can be exciting, thrilling, and even downright scary at times. I believe that change, generally, but not always, brings out the best in us and keeps us on our toes.
With that said, change starts this year with our firm, Vicom. At the end of 2020, Vicom was acquired by Converge Technology Solutions, something we're all incredibly excited about. This acquisition allows us to tap into the vast resources that Converge brings to bear and enables us in ways that we never had before at Vicom. It's excellent news and, as my post title states, a significant change. Let's talk about some of the significant changes I've seen in IT throughout the years.
When I first started in IT in 1994 (yeah, that long ago), PCs were a huge deal, remember? Today, the PC seems to be the least essential device behind phones, tablets, and notebooks, but PCs were the thing back then, especially since those other devices were rare or non-existent. Look how that has changed.
This was also the start of client-server, where things evolved from very powerful central systems to distributed systems that were barely used in terms of capacity and performance. Remember when firms bought racks and racks of servers to house their applications? (I remember many of our customers with racks full of 1U servers, the classic pizza boxes.) Since most applications did not play nicely on the same server, each required a separate server with a separate Windows or Linux instance.
We all remember how this created enormous inefficiencies, where these servers would at most use 5-6% of their processing capacity. If you got to 8 or 10%, that was considered efficient. You bought the whole server but barely used any of it—what a terrible investment.
These server inefficiencies led to virtualization, which, as we all know, is the simple concept that a piece of software (the hypervisor) can run multiple OS instances, and therefore multiple applications and workloads, as virtual machines (VMs), with many of them sharing the same physical system. We all remember how powerful this was when first introduced and how it was embraced across data centers and IT departments everywhere.
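To make the concept concrete, here's a minimal sketch, assuming the Python libvirt bindings and a local QEMU/KVM hypervisor (both assumptions on my part, not anything specific to the solutions discussed here), that simply lists the VMs sharing a single physical host:

```python
# Minimal sketch: list the VMs (domains) running on one physical host,
# illustrating many OS instances sharing the same hardware.
# Assumes the libvirt-python package and a local QEMU/KVM hypervisor.
import libvirt

# Connection URI is an assumption; adjust for your own environment.
conn = libvirt.open("qemu:///system")
try:
    for domain in conn.listAllDomains():
        state = "running" if domain.isActive() else "stopped"
        print(f"{domain.name()}: {state}")
finally:
    conn.close()
```

Other hypervisors (ESXi, Hyper-V, Xen, and so on) expose their own management APIs for the same purpose; the idea is the same regardless of vendor.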
Virtualization resolved those server inefficiencies: servers could now run at 60-80% of processing capacity. Now you could finally use the full server you paid for, and it was straightforward to do. Virtualization was a significant change, a game-changer really, and laid the foundation for cloud computing, which is so pervasive today.
A consequence of virtualization was how easy it became to create and provision VMs, and people did, a lot. It used to be challenging and time-consuming to set up new servers and Windows or Linux instances, but no longer. This "VM sprawl" (as we used to call it) created the need for super dense computing platforms (blade servers) and large scalable systems that could each hold lots of VMs, and firms bought them (and we sold them) by the truckload.
Firms realized how expensive VMware was to run and that it wasn't OK to provision unlimited VMs. Along came lower-cost alternatives such as Hyper-V, KVM, and others like Acropolis, which let firms accomplish the same thing for less. We saw many new and reimagined management solutions designed to bring controls that helped rein in VM sprawl. Again, another significant change.
Then came different types of computing platforms that were easy to implement, where the goal was simplicity in provisioning hardware and VMs. We saw combined compute, network, and storage solutions, such as NetApp's FlexPod and EMC's Vblock, which, while positioned as solutions, tended to be more of an architecture of different products working together, with predictable (and sometimes guaranteed) performance. It was an "appliance-ized" approach to IT infrastructure.
What followed was a natural extension of what came before it: CI/HCI solutions, which refined and developed this "appliance-ized" approach further with tightly integrated systems that were compact, predictably scalable, and provided predictable performance. Many of these started as single-purpose solutions, in areas such as VDI and backup, but then moved to the more general-purpose infrastructure solutions that are commonplace today, such as Nutanix, HPE SimpliVity, Dell VxRail, and Cisco HyperFlex, all of which are outstanding solutions, BTW. These solutions, and changes, were driven by market and customer needs.
While all of this was happening, the past few years ushered in the cloud era, where everything is about cloud. The idea of not having technology on-premises anymore and just putting everything in the cloud is logical and desirable. It seems reasonable and practical on the surface, but it's not always as easy as it sounds, nor as cheap as it sounds, as we all know. Don't get me wrong; cloud makes sense for the right firms, the right workloads, and the right applications. But it isn't the be-all, end-all replacement for everyone and everything in IT; it generally isn't cheaper in the long run, as we've all found out, and it must be tightly managed.
Cloud is just IT infrastructure, as are all of the technologies I've talked about previously, but it's delivered differently. Where it makes sense, it performs well, but that is for each firm to decide, and if you need help (shameless plug), reach out to your trusted partners, such as Vicom/Converge, as we can help guide you. And talk about change: the cloud has provided one of the most significant IT changes for all of us, customers and solution providers alike.
These technologies have brought us to the era of Machine Learning and Artificial Intelligence. This is fascinating and, as we all know, probably the biggest game-changer of all. ML and AI capabilities have helped firms transform their businesses, and what's fascinating is that an entire ecosystem of firms has emerged to provide primarily AI- and ML-based solutions. ML and AI have brought new ways of doing business and new opportunities, too many to mention in a single blog post. They will continue to change and disrupt the IT industry, and that's a good thing. Firms ask us repeatedly: how can ML and AI help their specific business and their particular needs? Firms and industries as a whole are still trying to figure that out, but figure it out they will. That makes it one of the most significant changes since I joined the IT world 27 years ago.
Inevitably that leads to a question: what's next? I think you'll see many of the things I talked about continue to evolve into new solutions, potentially in ways we haven't even thought of yet, with one solution compounding or building off another. But whatever these new solutions become, firms will embrace them, implement them, and use them to make their businesses better in true entrepreneurial form. Isn't that what IT is all about? As you've heard me say for years, IT and technology are just tools, and those tools need to help make business better. If not, then what's their point?
Whatever's next will no doubt involve change, as IT has always been about change. Perhaps small change, or perhaps large change, but change nonetheless. Change is good and leads to what's next, which I know sounds like circular reasoning, but it's true. Are you OK with and ready for change? If not, then, as I mentioned at the beginning of my post, IT is a tough place to be.
Since I am a big fan of change, I look forward to seeing, and being part of, what's next, and I'll roll with whatever changes come. I'm looking forward to it.
Until next month.
Andy