The evolution of computing has seen a steady progression from expensive dedicated solutions to less costly, resource-light options. Before we discuss that evolution, let’s first define what computing is. Compute, in the cloud computing world, refers to the servers comprising the processing, memory, and storage required to run a cloud-based service. In the early days of computing, expensive dedicated servers were the only option available to businesses. Soon after, Virtual Machines were invented, allowing several servers to exist on one physical piece of hardware. Then came containers, which virtualize right up to the application level. Finally, in the modern era, companies can dissect their workloads down to the smallest computational unit: a simple function, like the ones you write in your favorite programming language.
History Of The Cloud
How did the idea of cloud computing develop? To start from the beginning, we have to go back to the 1950s and the invention of mainframe computing. Mainframe computing is the concept of having a central computer accessed by numerous user devices. The main computer with all the computing capabilities was called the mainframe computer, and the user devices that sent requests to it were called dumb terminals. These days, if you peek into a college computer lab, there is a computer at every desk, each fully independent of the ones around it. In the ’50s, however, computers were costly to buy and maintain. So instead of placing one at every seat, organizations would buy one mainframe computer and let the dumb terminals share its computing resources.

In the ’70s, the concept of virtual machines emerged. Virtual machines are multiple complete operating systems running on a single piece of hardware. For example, you can have several Windows virtual machines on a single Mac laptop. Suddenly, one mainframe computer could run multiple operating systems simultaneously, each doing different things. Then a new idea emerged: what if we could use many mainframe computers’ resources as if they were one computer? This was the beginning of the modern concept of cloud computing.

To make pooling resources a reality, developers created hypervisor software that could be installed on multiple pieces of hardware, such as servers, linking them together so that their combined compute and storage could be used as one giant resource. Imagine the storage and computing power you could harness by adding up all the memory and hard drive space of every computer in your office: programs would run faster, you could store far more files, and you could analyze data at blazing speed.
This is what cloud computing allows people to do on an enormous scale, using the Internet to connect end users to the massive computational hardware in providers’ data centers.
Why look at the evolution of compute? Because it helps us better understand all of the compute layers in the modern stack. The first paradigm is the Dedicated Server approach. When a business chooses a dedicated server, it signs up for a physical server dedicated to that customer alone. The client must therefore guess capacity in the initial stages and will likely overpay for an underutilized server and the wasted space that comes with it. It is akin to buying clothes that are too big and waiting to grow into them: it works, but it is not the most efficient approach. Upgrading must also be considered, since it can be slow and expensive with a dedicated server. The operating system installed on the physical server is what you get, so the approach lacks flexibility, and if the business runs multiple applications on it, resource-sharing conflicts may result. A dedicated server does have the great benefit of guaranteed security, privacy, and total use of the underlying resources. That last reason is why some companies still prefer the dedicated server method in their cloud computing stack. This is sometimes referred to as a “Bare Metal” approach, as it mimics having your own physical server in your own data center.
Virtual Machines are the next iteration of computing in the cloud. It’s a fantastic technology that makes it possible to run multiple virtual machines on one physical machine. This is made possible by what is known as a Hypervisor: the software layer that enables running virtualized computers. When taking the Virtual Machine approach in cloud computing, a physical server is shared among many customers. The benefit of this method is that each customer pays for only a fraction of the server, since hardware costs are spread across more businesses. A Virtual Machine can still have the problem of overpaying if the VM is underutilized, which is always possible. As with the dedicated server, only one guest operating system is in use, and multiple applications on the same virtual machine may run into the same resource-sharing conflicts a dedicated server sometimes does.
Containers are the next iteration of computing in the cloud after virtual machines. They are an incredible technology with several benefits. A container packages an application’s code together with its dependencies, and each container runs isolated from the others. Hence, the application runs consistently from one computing environment to another. Now you can have a virtual machine that is further fractionalized by running multiple containers on the same VM. Docker is the most commonly used software layer for running containers. By subdividing a VM into containers, customers can utilize the maximum available capacity, which is very cost-effective. All running containers share the same underlying operating system, which is why containers can be more efficient than virtual machines. The great thing is that each container can still package a different OS environment (for example, different Linux distributions), so multiple applications can run side by side without being limited to the same underlying OS setup and without causing resource-sharing conflicts.
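To make “packaging code and its dependencies” concrete, here is a minimal, hypothetical Dockerfile sketch for a Python application. The base image, file names, and start command are illustrative assumptions, not a prescribed setup:

```dockerfile
# A minimal, hypothetical Dockerfile for a Python app.
# The base image provides the OS userland and the Python runtime.
FROM python:3.12-slim

WORKDIR /app

# Install the app's declared dependencies first (better layer caching).
COPY requirements.txt .
RUN pip install -r requirements.txt

# Copy the application code and define the container's start command.
COPY . .
CMD ["python", "app.py"]
```

Building this file produces an image that bundles the code and its dependencies, so the resulting container runs the same way on a laptop, a VM, or a cloud host.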
Last up in the evolution of computing (at least at the time of this writing) are functions. Amazingly, containers are not the smallest unit of subdivision in cloud computing. That award goes to Functions. Functions result from breaking up your applications into small pieces of code, each responsible for one well-defined task. This offers even better utilization of compute than standard containers provide. By leveraging functions in the cloud, you will find these pros and cons:
- Managed Virtual Machines Running Managed Containers
- Defined As Serverless Compute
- Only responsible for Code and Data – nothing else
- Very cost-effective
- Cold Starts can be troublesome
Serverless computing means you don’t have to worry about running and configuring virtual machines or containers; the cloud service provider takes care of all of that for you. You don’t set up anything: you put your code online, and it should work. The only things you might need to consider are memory allocation and execution duration. What makes serverless computing so cost-effective is that you only pay while a function runs; any underutilized resources are no longer your concern, because the cloud service provider deals with them. The one downside to serverless computing is that you can run into what are known as Cold Starts. A cold start happens when a function is triggered but no underlying container is yet running to support that code, so the first time the serverless code runs, there may be a slight delay while the underlying support structure ramps up. You can minimize this phenomenon by keeping functions small, reducing initialization work, and triggering the function automatically every 15 minutes or so to keep it warm.
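To make the function model concrete, here is a minimal sketch of a serverless function in Python using AWS Lambda’s handler signature. The greeting payload and the `CONFIG` dict are illustrative assumptions; the point is that module-level work runs once per cold start, so keeping it small is one way to reduce cold-start delay:

```python
import json
import time

# Work done at module level runs once per "cold start", when the provider
# spins up a fresh container for the function. Warm invocations reuse it,
# so heavy setup (config loading, client creation) belongs here, kept small.
COLD_START_TIME = time.time()
CONFIG = {"greeting": "Hello"}  # stands in for loading config or SDK clients

def lambda_handler(event, context):
    # AWS Lambda's Python entry point: an event dict and a context object.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"{CONFIG['greeting']}, {name}!"}),
    }
```

You pay only for the time `lambda_handler` actually executes; the provider decides when to create or tear down the container underneath it.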
Dedicated, Virtual Machine, Container, Functions Resources
- Cloud Vs. Dedicated (rackspace.com)
- Difference Between Cloud Servers Dedicated Servers (cloudacademy.com)
- Dedicated Servers (ionos.com)
- Virtual Machines (azure.microsoft.com)
- Learning Cloud What Is A Virtual Machine (cloudflare.com)
- Learn Virtual Machines (ibm.com)
- Virtual Machine In Cloud Computing (jigsawacademy.com)
- Learn What Are Containers (cloud.google.com)
- Cloud Containers What They Are And How They Work (searchcloudsecurity.techtarget.com)
- Containers For Everyone (docker.com)
- Learn Cloud Containers (ibm.com)
- What Is A Container (azure.microsoft.com)
- Docker Container Container Cloud Computing (aquasec.com)
- Cloud Native Apps What Is Serverless (redhat.com)
- What Is Serverless Computing And Which Enterprises Are Adopting It (computerworld.com)
- State Of Serverless (datadoghq.com)
- Serverless Solutions (azure.microsoft.com)
- Serverless: Develop & Monitor Apps On AWS Lambda (serverless.com)
- Why Use Serverless (cloudflare.com)