What does it mean to embark on a Cloud Transformation today? The key lies in the adoption of Cloud Computing and of a Cloud Native approach: let’s take a closer look at these two concepts together.

Today, enterprises are increasingly looking to modernize their existing applications or build new Cloud-compatible ones, with a particular focus on exploring the Cloud Native concept. The problem arises when these two concepts are confused or even used as synonyms due to the similarity in their names (basically both refer to the Cloud). In fact, it is very important to remember that their meanings are significantly different.

What is Cloud Computing?

We have been hearing and talking about the Cloud for many years now. The simplest and at the same time the most inclusive definition is the one offered by NIST, the National Institute of Standards and Technology. According to NIST, Cloud Computing is a set of IT services accessible on demand and in self-service mode over the Internet, built on shared resources and characterized by the ability to scale very quickly and by the timely measurement of service levels, so that customers can pay based on consumption.

Simply put, the Cloud is an infrastructure made up of hardware, servers, databases and applications whose services are consumed on demand via the Internet, guaranteeing scalability and flexibility. The best-known and most popular platforms that deliver Cloud services are Amazon Web Services (AWS), Google Cloud Platform and Microsoft Azure.

What is Cloud Native?

Unlike Cloud Computing which, as we have seen, is an infrastructure that delivers services, Cloud Native is an architecture aimed at assembling components developed natively for the Cloud, in order to obtain maximum benefits. As Pini Reznik, Jamie Dobson and Michelle Gienow write in their book "Cloud Native Transformation" (O'Reilly 2019), “Essentially, cloud native is the name of a particular approach to designing, building and running computer applications (...) based on an Infrastructure-as-a-Service, combined with new operational tools and services like continuous integration, container engines and orchestrators.”

So the difference between Cloud Computing and Cloud Native could not be more substantial. While the former is a set of technologies and services and a way of using computing resources for different applications, the latter is an articulated and complex architecture aimed at the enterprise that wants to modernize its technological infrastructure and processes. To be able to do this, however, an upstream cultural change is necessary, with a direct impact on organizational models and working methods.

Today, the Cloud Native model is increasingly popular and widely adopted, thanks also to the work of the CNCF (Cloud Native Computing Foundation). This international consortium, part of the Linux Foundation, brings together the various technologies of the Cloud Native ecosystem. Indeed, more than 500 members (including SparkFabrik, as a Silver Partner) contribute to the growth of open-source solutions such as Kubernetes and to the diffusion of the Cloud Native paradigm around the world.

Cloud services and the Cloud Native model

There is more and more familiarity these days with terms such as IaaS, PaaS and SaaS. And, of course, it has become common knowledge that services based on Cloud infrastructure represent a critical element in digital transformation due to the ease of use and flexibility that they offer to companies, from small players to large enterprises. But what exactly are we talking about?

  • Infrastructure-as-a-Service: The provider offers cloud computing infrastructure, including servers, networks, operating systems and storage, through virtualization technology. The key benefit? The company can take advantage of the same technologies and features of a traditional data center, without having to physically maintain or manage them. Amazon Web Services (AWS), Microsoft Azure and the Google Compute Engine (GCE) are all examples of IaaS solutions.
  • Platform-as-a-Service: a hybrid between IaaS and SaaS, in which the provider offers the platform on which software is built. This solution gives developers the freedom to focus on software development without having to worry about operating systems, software updates, storage or infrastructure. Some examples of PaaS? AWS Elastic Beanstalk, Windows Azure, Google App Engine and OpenShift.
  • Software-as-a-Service: the provider makes the entire application available via the web and there is no need on your part to know how to write code to be able to use it. To give you a better idea, examples of SaaS include services such as Gmail, Google Docs, Salesforce, Dropbox, Microsoft Office 365 and iCloud: all applications accessible via the Internet for free or subject to the purchase of a license, without the need to download and install any files.

The advantage of Cloud Computing is that it provides services on demand, allowing customers to pay only for what they use and to scale up or down almost instantaneously. What you need in order to use these services most effectively – and this is precisely the role of Cloud Native – is to assemble them into the right architecture for the needs of each company and use case.

The elements of a Cloud Native architecture

Cloud Native architectures are built using a series of elements, essential for this design approach. Let’s see what they are.

Microservices

Microservices are the basis of an architectural approach in which an application is built by breaking it down into its fundamental functions, each of which becomes a small service, independent from the others (or with minimal interdependencies) in terms of code, deployment and maintenance.

This brings a considerable advantage in the evolutionary development and ordinary maintenance of software built with this approach. Different teams can work on distinct microservices independently and, if necessary, even in parallel, without impacting the project as a whole.
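As a purely illustrative sketch, the microservices idea can be shown in a few lines of Python using only the standard library: two tiny HTTP services, each running on its own port and answerable independently. The service names "catalog" and "orders" are hypothetical; a real system would use a web framework and separate deployments per service.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

def make_service(payload):
    """Build a tiny HTTP handler that answers every GET with a fixed JSON payload."""
    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            body = json.dumps(payload).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)

        def log_message(self, *args):  # silence per-request logging
            pass
    return Handler

# Two services, each independently runnable and maintainable
# (hypothetical names, for illustration only).
catalog = HTTPServer(("127.0.0.1", 0), make_service({"service": "catalog"}))
orders = HTTPServer(("127.0.0.1", 0), make_service({"service": "orders"}))
for srv in (catalog, orders):
    threading.Thread(target=srv.serve_forever, daemon=True).start()

def call(srv):
    """Query one service over HTTP and decode its JSON response."""
    port = srv.server_address[1]
    with urllib.request.urlopen(f"http://127.0.0.1:{port}/") as resp:
        return json.loads(resp.read())

catalog_resp = call(catalog)
orders_resp = call(orders)
print(catalog_resp["service"], orders_resp["service"])  # catalog orders

catalog.shutdown()
orders.shutdown()
```

Each "service" here could be developed, tested and redeployed by a different team without touching the other – the essence of the microservices model.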

Containerization

Containers are a technology born as a natural evolution of the traditional idea of virtualization. By taking advantage of the ability of an operating system kernel, typically Linux, to run multiple isolated and autonomous instances in user space, containers make it possible to isolate an application and all its dependencies in a self-contained environment.

This architecture guarantees portability: in other words, containers can be run on any type of platform and server, whether for development and testing or for deployment to production. Today, the use of containers is a competitive advantage for many enterprises.
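To make the idea concrete, here is a minimal, illustrative Dockerfile that packages a small Python service together with its dependencies into a portable image (the file names app.py and requirements.txt are assumptions made for this sketch, not something from a specific project):

```dockerfile
# Illustrative only: package a small Python service as a container image.
FROM python:3.12-slim
WORKDIR /app
# Install dependencies first so this layer is cached between builds.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
# Copy the application code itself.
COPY . .
CMD ["python", "app.py"]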

DevOps

DevOps is the development philosophy that has led to a real revolution in the approach to software creation. It combines development (Dev) with IT operations (Ops), following the Agile development methodology.

This approach makes it possible to reduce development times and enables continuous and automatic production deployments, significantly improving the time-to-market. Thanks to DevOps, it is in fact possible to obtain high-quality software with fewer bugs and more features. DevOps is an indispensable element of Cloud Native architectures.
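By way of illustration, a minimal continuous-integration pipeline might look like this in GitHub Actions syntax (the commands and file names are assumptions; any comparable CI system could play the same role):

```yaml
# Illustrative CI pipeline: on every push, check out the code,
# install dependencies and run the test suite.
name: ci
on: [push]
jobs:
  build-and-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - run: pip install -r requirements.txt
      - run: pytest
```

Pipelines like this are what turn DevOps principles into continuous, automatic deployments: every change is built and tested the moment it is pushed.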

Automation

Automation is often underestimated, but in reality, it is a fundamental element for Cloud Native architectures. There are two main benefits of having an automated infrastructure. First, it becomes possible to save and allocate more time for actual development work. Second, automation makes it possible to manage complex and diverse environments and enables rapid scalability.

The positive impact is felt as much by the Ops team as by the Dev team. The former can spend far less time on repetitive support activities and more time on continuous system improvement. The latter can shorten the time required to make changes and perform testing, for example by designing efficient CI/CD pipelines.
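One common form of this automation is infrastructure-as-code, where the environment itself is described declaratively and provisioned automatically. A purely illustrative Terraform fragment might look like this (the values are placeholders, not a working configuration):

```hcl
# Illustrative only: declare a virtual machine as code instead of
# creating it by hand. Values below are placeholders.
resource "aws_instance" "app_server" {
  ami           = "ami-0123456789abcdef0" # placeholder image ID
  instance_type = "t3.micro"

  tags = {
    Name = "cloud-native-demo"
  }
}
```

Because the infrastructure is versioned text, it can be reviewed, reproduced and scaled like any other code.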

Orchestration

Modern applications span multiple containerized microservices distributed across a variety of public and private clouds. These microservices must be deployed, managed and scaled without ever sacrificing availability: a process that clearly entails a fair bit of complexity. This is precisely where container orchestrators like Kubernetes come into play.

This tool allows you to automate and control a variety of tasks, ranging from container management to automatic provisioning, from balancing workloads across the infrastructure to quickly scaling up or down. These aspects make Kubernetes an indispensable tool in Cloud Native environments.
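As a small example, asking Kubernetes to run and maintain three replicas of a containerized service takes only a short declarative manifest (the names and the image below are placeholders):

```yaml
# Minimal Kubernetes Deployment: keep three replicas of a container running.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: demo-app
spec:
  replicas: 3                 # Kubernetes keeps three copies alive
  selector:
    matchLabels:
      app: demo-app
  template:
    metadata:
      labels:
        app: demo-app
    spec:
      containers:
        - name: demo-app
          image: registry.example.com/demo-app:1.0  # placeholder image
          ports:
            - containerPort: 8080
```

If a replica crashes or a node fails, the orchestrator automatically restores the declared state – no manual intervention required.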

Cloud Computing and Cloud Native: a winning combination for digital transformation

The evolution of business applications towards Cloud Computing through the adoption of a Cloud Native architecture requires a change not only in technologies, but also in the processes and culture used by the company to work on software.

After all, as the authors of “Cloud Native Transformation” write, Cloud Native “is a philosophical approach for building applications that take full advantage of Cloud computing. This new paradigm requires embracing new technologies and a new way of working, which makes going Cloud Native an extensive undertaking. The payoff however is immense”.

For the enterprise, the advantages include setting up a very rapid mechanism for creating new software features. This allows an application to be brought to market much faster, collecting user feedback immediately, and thus to be innovated and improved just as quickly, while reducing risk.

The Cloud Native approach, in fact, enables very complex projects to be built by small teams, each focused on individual, autonomous components of the project. It therefore becomes much easier to reduce complexity and bottlenecks, to move forward quickly and, at the same time, to be ready to change direction when problems are identified, or to restore individual functionality when bugs or errors are encountered.