What does cloud computing have in store next?
Today in 2018, the age of “cloud computing” is well underway. The origin of the term is a matter of debate – Google CEO Eric Schmidt first used the term “cloud computing” in August 2006, while MIT’s Technology Review traces it to a Compaq Computer business plan authored in 1996. Wherever the term originated, the concept of the cloud goes back to the 1960s, when IBM and DEC offered time-shared systems. However, the cloud couldn’t become what we see today without fundamental breakthroughs in technology. What’s most important now is what cloud computing has become and where it’s headed – where we are headed.
The concept of IT, particularly for large enterprises, has been built on a physical presence – data centers. In the past, developing new applications meant purchasing new compute capacity (servers), wiring it up to the network, adding capacity to the ever-expanding storage array and ensuring appropriate backup capabilities. It was an expensive and highly specialized process, which often led to data center sprawl and additional management complexities.
Then, in 2001, VMware introduced x86 server virtualization. At first, the industry was skeptical, but it didn’t take long before the benefits of the technology were clear. Server virtualization eventually led to network and storage virtualization – an important step towards cloud computing.
One of the many attempts at defining the cloud started with: “cloud computing can and does mean different things to different people.” Precisely. What’s funny is that in the early days there were ferocious arguments about just what the cloud was. And although concise definitions eluded us, everyone sensed what it fundamentally meant: we were in for significant changes, and the business opportunities would be enormous.
According to ESG research, 85% of surveyed organizations are using some sort of public cloud, while only 3% of organizations have no intention of ever adopting the technology. Further evidence that the IT world has made the big switch: a Google search for “cloud computing” returns about 213,000,000 results.
Cloud hasn’t just changed IT. It has changed how organizations are born and how business is conducted. Cloud has changed the way information is created, stored, altered, distributed and secured. It’s not only that traditional approaches no longer hold; the nature of our data – where it is, who owns it, how we protect it – has shifted as well.
It’s clear that this isn’t just another turn of the IT crank. So, how closely are we watching and managing how it all unfolds?
Ten years ago, Nicholas Carr’s book, “The Big Switch: Rewiring the World, from Edison to Google,” was described as “the most influential book on the cloud computing movement” (Christian Science Monitor). At the time, there was a good deal of healthy skepticism about the cloud’s potential and practicality. The book was an early examination of cloud computing developments and the parallels to the creation of the electrical grid. In his February 2018 blog, Carr notes the ten-year anniversary of the book and reflects particularly on its second part, “Living in the Cloud.”
“‘Living in the Cloud’ is darker. In fact, it was during the course of writing it that my view of the future of computing changed. I began The Big Switch believing that the new computing grid would democratize the use of computing power even as it centralized the machinery of data processing. That is, after all, what the electric grid did. By industrializing the generation and distribution of electricity, it made power a cheap resource that everyone could use simply by sticking a plug into a wall socket. But data is fundamentally different from electric current, I belatedly realized, and centralizing the provision of computing would also mean centralizing control over information. The owners of the server farms would not be faceless utilities; they would be our overseers.”
Sometimes, technology can get away from us. Unforeseen developments have dramatic results, even outside the data center. Some results can be good, improving business outcomes and raising the competitive bar. However, some outcomes give cause for concern, or at least the need for further detailed evaluation. What do we need to do to ensure the future of technology delivers the type of results expected? Can we be sure that thoughtful planning is always the foundation of how we use technology?
In the last century, we’ve witnessed humanity achieve several feats – nuclear energy, space travel, smartphones, electric cars – just to name a few. If previous generations could have foreseen the long-term implications of some of these achievements, how might that have influenced the things they did to manage our collective future? How would all the competing interests have changed the innovation that we’ve achieved thus far?
Each technological advancement comes with excitement and goes through several phases of adoption – from early adopters to laggards. Cloud computing, well past the early adopter stage, is bound to undergo further innovation as adoption increases and new applications and use cases are uncovered. It will be fascinating to see the advancements these innovations bring and the impacts they have on businesses and humanity.