Is Virtualization Needed for Cloud?

Estimated reading time: 9 minutes

This post was originally a vlog post from October of 2020. For reference, we post our vlogs here with the full transcript below along with clarifying comments and graphics. You can check out all our vlogs on our YouTube channel.

This week we tackle the question, “Is Virtualization Needed for Cloud?” which is pretty interesting and of course requires several long-winded corporate stories. If you don’t see the embedded video above, here’s a direct link.

Is Virtualization Needed for Cloud Computing?

Today we want to answer the question, Is Virtualization Necessary for Cloud Computing? Hopefully, by the end of this short video, I'll be able to outline what virtualization is and whether it's necessary for cloud computing. Just a quick caveat: this video is going to exclude virtualization as it applies to microservices and networks.

I can't tell you the answer to the question Is Virtualization Needed for Cloud Computing until I recount a sordid tale of my corporate past. When I first got into the server business a long time ago, enterprises shied away from putting 3-tier architecture components on the same server. Your 3-tier architecture had web, app, and database. These were almost always 3 different machines for reliability and better intercommunication. When apps were being developed or tested, you might have seen them all on 1 machine.

As we've discussed in the past, in 3-tier architecture, web servers only presented data, application servers ran the business logic, and database servers stored the data. These servers were also inherently redundant, so they were costly. Once you had them configured, they'd sit there, idling. So you had 3 servers that were at least $2-$5k apiece, sitting there idling. If you measured the overall usage of these machines, it wouldn't be unusual for them to be at 1-10% utilization. If your marketing department thought that a million visitors were going to come to your website, you tended to do something called building to peak, which meant building the scale of your environment to accommodate the worst-case scenario on the high side. So if you thought you were going to have a million visitors, you'd have to build for 1 million visitors even though only 50,000 visitors may have made it to the site. This was super costly, but the alternative had its own risk: there were early startups that got media attention (usually on TechCrunch) and the site would literally crash, which was a bit embarrassing for a startup, especially one in tech.
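To make the waste concrete, here's a rough back-of-the-envelope sketch in Python using hypothetical numbers in the ranges mentioned above (3 servers, a few thousand dollars apiece, single-digit utilization); none of these figures come from a real deployment.

```python
# Rough sketch of what "building to peak" costs (hypothetical numbers only).
servers = 3                    # web, app, and database tiers
cost_per_server = 3500         # USD, somewhere in the $2k-$5k range cited above
expected_peak_visitors = 1_000_000
actual_visitors = 50_000
average_utilization = 0.05     # "1-10% utilization", call it 5%

total_capital = servers * cost_per_server
idle_capacity_value = total_capital * (1 - average_utilization)

print(f"Capital spent:         ${total_capital:,}")
print(f"Capacity sitting idle: ${idle_capacity_value:,.0f} worth of hardware")
print(f"Traffic overbuild:     {expected_peak_visitors / actual_visitors:.0f}x "
      f"the visitors that actually showed up")
```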

So, to answer the question, "is virtualization necessary for cloud computing," we need to wait a bit longer. I'd like to do a quick recap on what virtualization is. Virtualization allows you to put multiple logical servers on a single physical server. So you have this large server, you install a hypervisor, and then you divide the resources of that ONE server into small chunks which operate independently of each other logically. This way you get the specialization that enterprise computing demands without having to allocate a single server to each discrete function.
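As a conceptual sketch only (not a real hypervisor API), here's roughly what carving one physical server into independent logical servers looks like; the Host class and the capacity numbers are made up for illustration.

```python
# Conceptual sketch: a hypervisor divides one physical host into isolated
# logical servers. Real hypervisors (VMware ESXi, KVM, Hyper-V) do far more;
# this class and these numbers are illustrative, not a real API.
from dataclasses import dataclass, field

@dataclass
class Host:
    cpus: int
    ram_gb: int
    vms: list = field(default_factory=list)

    def allocate_vm(self, name: str, cpus: int, ram_gb: int) -> None:
        used_cpus = sum(vm["cpus"] for vm in self.vms)
        used_ram = sum(vm["ram_gb"] for vm in self.vms)
        if used_cpus + cpus > self.cpus or used_ram + ram_gb > self.ram_gb:
            raise ValueError("host is out of capacity")
        # Each VM gets its own isolated slice of the same physical box.
        self.vms.append({"name": name, "cpus": cpus, "ram_gb": ram_gb})

host = Host(cpus=32, ram_gb=128)
host.allocate_vm("web", cpus=4, ram_gb=8)
host.allocate_vm("app", cpus=8, ram_gb=32)
host.allocate_vm("db", cpus=8, ram_gb=64)
print([vm["name"] for vm in host.vms])   # ['web', 'app', 'db'] on one server
```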

Why Normalization Is Good

Here's another problem that virtualization solved, and it's probably the most amazing benefit we get from virtualization. So strap in for another sordid corporate story!

When I worked in the managed hosting business back in the late '90s, we would manage systems for clients. Let's say you had a very specific HP server sitting in a cabinet and you had an identical cold spare sitting next to it. Sometimes the running server would fail, so you'd take the hard drives out of the failing machine, put them into the cold spare, and turn it on. In some cases, that computer would not turn on and start working, although in theory it really should have. There were a bunch of small nuances that could have caused this cold spare not to work. For instance, it could have been a difference in firmware between the servers, or some slight change in manufacturing versions, even though the model numbers were identical. It may have been a patch that was applied to one machine but never initialized because the server hadn't been rebooted in months. It could have been something else entirely, so you were never guaranteed an easy recovery. In the end, this meant that you had to image a new machine from scratch, carefully reinstall all the applications, copy the data off those hard drives and back onto the new servers, test, and then run.

So at the most basic level, what does virtualization do for us? It allows you to have those single-purpose servers, still separated in every regard, but sitting on the same piece of hardware. This allows you to almost fully consume the resources of each server you put into production with very little idling. But if you are to take anything away from this video, it's the resolution to the story I just told: virtualization allows us to "normalize" CPU, RAM, and storage resources. Normalization is the idea that types and origins no longer matter. CPU, RAM, storage space, and network are made common. Where they come from and what brand of hardware they sit on no longer matters. In other words, the origin of those resources isn't taken into account; they just need to be there.
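Here's a tiny illustrative sketch of that normalization idea: capacity from different vendors gets summed into one common pool, and the vendor field is deliberately thrown away. The hardware specs below are invented.

```python
# Illustrative sketch of "normalization": once virtualized, CPU, RAM, and
# storage become a common pool and the brand or origin of the hardware no
# longer matters. These hardware specs are made up.
hardware = [
    {"vendor": "HP",   "cpus": 16, "ram_gb": 64,  "storage_gb": 2000},
    {"vendor": "Dell", "cpus": 24, "ram_gb": 128, "storage_gb": 4000},
    {"vendor": "IBM",  "cpus": 8,  "ram_gb": 32,  "storage_gb": 1000},
]

# The pool sums capacity and deliberately discards the vendor.
pool = {
    "cpus": sum(h["cpus"] for h in hardware),
    "ram_gb": sum(h["ram_gb"] for h in hardware),
    "storage_gb": sum(h["storage_gb"] for h in hardware),
}

# A workload asks for resources, not for a specific brand of server.
print(pool)   # {'cpus': 48, 'ram_gb': 224, 'storage_gb': 7000}
```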

Virtualization allowed the modern cloud

Where Virtualization Helps

Here is why (finally) virtualization is needed for cloud computing. When we built systems for clients before, we'd allocate specific servers to them. Whether those servers sat at 1% utilization or were perpetually clobbered 24 hours a day didn't matter. Those servers cost the service provider the full capital cost of the server, plus network, software licensing, monitoring, management, space, power, and cooling. This server also could only be used by 1 client. Putting someone else on this server would be impossible, and there was no easy way to logically separate this single resource amongst clients. (There was something called shared hosting, but it wasn't something most enterprises would tolerate unless it was a simple brochure type of website.) This is akin to renting an apartment to a family: if the family goes out to dinner for 3 hours, you can't rent that apartment to another family while they're out.

Now go back to cloud computing, where you build a platform full of servers and sell access to the cumulative resources that those servers offer. So you buy a handful of servers, and that could give you 10 CPUs, 10 GB of RAM, and 10 GB of storage. You can sell this at an hourly rate, and when it's given back to you, you can have it virtually "rented" by another entity. So the buyer benefits because they can buy in bite-sized chunks, and the service provider benefits because they fully utilize their capital investment in servers. Of course, they will need to build in a little overhead for surges of business, but the more automation they employ, the more competitive their rates can be. From an uptime perspective, since these instances are virtual, fully inclusive, and lighter, you can start, stop, clone, and move them around in a way not possible before.
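Here's a toy sketch of that "rent it by the hour" model: one client hands capacity back and another client immediately rents the same capacity. The hourly rate, pool size, and Pool class are all hypothetical.

```python
# Toy sketch of renting compute by the hour from a shared pool.
# Prices and capacity are invented; the point is that capacity handed back
# by one client is immediately rentable by the next.
HOURLY_RATE_PER_CPU = 0.05   # hypothetical price in USD

class Pool:
    def __init__(self, cpus: int):
        self.free_cpus = cpus

    def rent(self, cpus: int, hours: int) -> float:
        if cpus > self.free_cpus:
            raise ValueError("not enough free capacity")
        self.free_cpus -= cpus
        return cpus * hours * HOURLY_RATE_PER_CPU

    def release(self, cpus: int) -> None:
        self.free_cpus += cpus

pool = Pool(cpus=10)
bill_a = pool.rent(cpus=6, hours=3)   # client A uses 6 CPUs for 3 hours
pool.release(6)                        # ...then hands them back
bill_b = pool.rent(cpus=6, hours=2)   # client B reuses the same capacity
print(f"Client A owes ${bill_a:.2f}, client B owes ${bill_b:.2f}; "
      f"{pool.free_cpus} CPUs still free")
```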

Buy Your Compute By the Drink

So cloud computing is really predicated on the premise that you have virtualization. Virtualization doesn't necessarily make the raw compute cheaper, but it does 2 important things: 1. Buyers can buy in small chunks with little to no commitment. 2. Sellers can sell their entire capacity and scale their environment without having to match their hardware, thanks to the normalization we discussed earlier. If you need 10 racks of hardware, buying that in the cloud will probably be more expensive, but the fact that you could have slowly scaled up to 10 racks of gear 1 CPU at a time is the real benefit here.
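A simple hypothetical comparison shows the pay-as-you-grow side of that tradeoff: what you'd have spent after 6 months if you bought all 10 racks up front versus renting only what you actually need each month. Every price and growth figure below is made up purely for illustration.

```python
# Hypothetical spend comparison: buy 10 racks on day one vs. rent as you grow.
cost_per_rack_upfront = 100_000        # capital cost, paid immediately
monthly_cloud_cost_per_rack = 12_000   # deliberately pricier per rack-month

upfront_spend_by_month_6 = 10 * cost_per_rack_upfront   # all 10 racks, day one
racks_needed_each_month = [1, 1, 2, 2, 3, 3]            # gradual growth
cloud_spend_by_month_6 = sum(
    r * monthly_cloud_cost_per_rack for r in racks_needed_each_month
)

print(f"Upfront build-out, spent by month 6:  ${upfront_spend_by_month_6:,}")
print(f"Cloud pay-as-you-grow, month 6 total: ${cloud_spend_by_month_6:,}")
```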

One last little story that involves virtualization and VMware. The first time I saw VMware run was on a friend's laptop. He was an engineer for a software company, and his role was to take some of the applications his prospective client was using and create a proof of concept of how his software would help them integrate these various applications. So he'd literally mock up their environment on his laptop, load test data, and demonstrate to the client what the integration would look like using their actual applications and test data.

Installing a Windows Server OS, SQL Server, and several enterprise applications on a laptop is not the best idea if you still need to check your email, fill out expense reports, collaborate with your colleagues, and not have IT run down the hall and tackle you. So he would build these proofs of concept in a virtual instance that was fully isolated on his laptop. Once the client saw the proof of concept, he could delete that VM instance, which was literally a single file, or copy it off to another device. Meanwhile, his pristine laptop, imaged by IT the day he was hired, was in perfect shape, since all that craziness lived in its own self-contained instance on his laptop in a single virtual hard drive file.
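Conceptually, because the whole guest lives in one virtual disk file, cloning or deleting a proof of concept is just a file operation. The sketch below mimics that idea with an ordinary file; the file name and .vmdk extension are illustrative only, not actual VMware tooling.

```python
# Conceptual sketch: the entire guest environment lives in one virtual disk
# file, so "clone" and "delete" are just file operations. Names are made up.
import shutil
from pathlib import Path

poc_disk = Path("client_poc.vmdk")        # hypothetical virtual hard drive file
poc_disk.write_bytes(b"...entire guest OS, SQL Server, apps, test data...")

# Hand the demo environment to a colleague: copy one file.
shutil.copy(poc_disk, "client_poc_backup.vmdk")

# Demo is over: delete the files and the laptop is pristine again.
poc_disk.unlink()
Path("client_poc_backup.vmdk").unlink()
```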

I hope this video helped explain how virtualization helped the cloud become the cloud, and hopefully I answered the question Is Virtualization Needed for Cloud for you. The overall idea is that you are abstracting operating system instances from hardware. With microservices, which we covered last time, you are abstracting code from the operating system, but we'll cover that in a future video when we go into the world of DevOps. Thank you again, and I look forward to seeing you when we release our next video!

We have an evolving library of Cloud Computing resources that you can use for any research you have.
