Ray O’Farrell of VMware’s cloud infrastructure R&D took the stage for Thursday’s LSI AIS 2013 opening keynote to outline how virtualization and cloud computing intersect today, and how that relationship will extend to the IT of 2020: a faster, more efficient, and more multifaceted service than what we see at present.
The estimated numbers were staggering, especially considering the projections span only seven years. Looking out from 2013, connected devices are expected to grow from 13 billion to 24 billion; computing capacity from 80 million to 2.6 billion; virtual machines from 68 million to 240 million; server ports from 63 million to 206 million; and, finally, overall data from 4 zettabytes to 40 zettabytes.
From a next-generation application point of view, exemplified by hyperscale companies such as Google and Facebook, virtual machines are deployed across data centers with scale, fault tolerance, and dynamic growth built into the applications themselves. On the networking side, an interesting stat within the 63 million server ports is that 57% of them are already virtualized today, so the projected jump to 77% is less dramatic than it might seem: the server port is effectively becoming a software concept living in a virtualized environment.
Another key point to remember is that, while a staggering amount, the entire 40 zettabytes will not be “hot data” — that is, data that needs to be accessed immediately and regularly. Here the fundamental change comes in the form of “hot servers” and “cold storage”, which allow data to be pulled quickly from a massive pool: ideal for users who cannot predict when they will need fast access to their data yet are unwilling to wait for it (essentially everyone on the planet). From a cloud and software perspective, this is where the need for automated, software-defined data centers comes into play, since the data is aggregated across a wide array of server clusters.
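The hot/cold distinction above boils down to a tiering policy: recently or frequently accessed data stays on fast “hot” servers, while the rest migrates to cheap bulk storage. A minimal sketch of such a policy is below; the thresholds, class, and function names are illustrative assumptions for this article, not anything presented in the keynote.

```python
import time
from dataclasses import dataclass

# Illustrative thresholds -- real tiering policies are far more nuanced.
HOT_WINDOW_SECS = 24 * 3600   # accessed within the last day -> keep hot
HOT_MIN_ACCESSES = 3          # or accessed often enough in the window

@dataclass
class DataObject:
    key: str
    last_access: float          # Unix timestamp of the most recent read
    access_count: int = 0       # reads within the tracking window

def assign_tier(obj: DataObject, now: float) -> str:
    """Return 'hot' for data that should stay on fast servers,
    'cold' for data that can move to bulk storage."""
    recently_used = (now - obj.last_access) < HOT_WINDOW_SECS
    frequently_used = obj.access_count >= HOT_MIN_ACCESSES
    return "hot" if (recently_used or frequently_used) else "cold"

# Example: a fresh log record stays hot; a year-old archive goes cold.
now = time.time()
fresh = DataObject("today.log", last_access=now, access_count=1)
stale = DataObject("2012-backup.tar", last_access=now - 365 * 86400)
print(assign_tier(fresh, now), assign_tier(stale, now))  # hot cold
```

In practice this decision is made continuously and automatically across many server clusters, which is exactly why the keynote tied cold storage to software-defined data centers rather than manual administration.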