DevOps Approach to Microservices Using Kubernetes
Monolithic vs SOA vs Microservices
In my previous blog, I discussed how DevOps and SRE can make operations reliable across the spectrum. In case you have not read it, I would highly recommend reading that article before this one. In this blog, I will explain how the DevOps approach can be used to create Microservices using Kubernetes. It also presumes that you have a good understanding of container concepts such as Docker images and containers, and of CI/CD tools such as Jenkins, GitLab, and Jira.
So, before diving in, let me introduce some of the keywords, or rather the jargon, that will be used in this article.
- Monolithic Application: An architectural pattern where all the components, for example the Web tier, App tier, and DB tier, are packaged into a single artefact and deployed as a single unit.
- SOA (Service Oriented Architecture): An architectural style where each component, for example the Web, App, and DB tier, is packaged and deployed as an individual, standalone service. Services communicate with each other via an ESB (Enterprise Service Bus) or over the network, i.e. TCP/UDP or HTTP/S.
- Microservices: An architectural pattern where each component, for example the Web, App, and DB tier, is further divided into loosely coupled services that run individually as microservices, each providing its part of the overall functionality.
The diagram below illustrates the distinction between Monolithic, SOA, and Microservices.
Each of these architectural patterns also demands its own baseline of virtual machines and infrastructure on which to install and deploy the model.
But with the advent of cloud-native technologies and the containerization of applications, Monolithic and SOA architectures no longer fit the bill of providing a highly available, elastic, scalable, and recoverable architecture. Let us explore why.
Here are the challenges with adopting a Monolithic architecture:
- As all the components are packaged in a single unit, the application is a single point of failure with no easy recovery.
- If the web tier needs an update, the entire package must be updated.
- Because of the previous point, there is no flexibility to update a component individually.
Here are the challenges with adopting SOA:
- A function cannot be deployed independently; SOA always requires a component-sharing model.
- If one of the services has issues in relation to the ESB (communication layer), it can negatively affect the other connected services.
- An SOA service is relatively large, resembling a combination of different modules of the application. SOA services cannot be divided further to achieve per-module granularity.
- All the services share the same database storage, with no flexibility for each service to have its own database.
- SOA protocols such as AMQP are heavyweight compared to the lightweight protocols typically used by Microservices.
So, beyond a certain threshold, both Monolithic and SOA fail to provide the required capabilities: high availability, a scalable architecture, a fault-tolerant solution and, most importantly, a self-recoverable application with no compromise on performance.
And this is where the scale tilts in favor of Microservices, as the pattern not only provides the above capabilities but also helps in:
- Making version upgrades seamless, without any impact on the other components of the application ecosystem
- Providing fine-grained services that can be easily maintained and scaled depending on the load on the system.
- Giving each component its own data storage services, so there is never a single point of failure.
So, what are Microservices?
Martin Fowler, one of the industry's well-known thought leaders on software architecture, describes Microservices as follows:
“The term "Microservice Architecture" has sprung up over the last few years to describe a particular way of designing software applications as suites of independently deployable services. While there is no precise definition of this architectural style, there are certain common characteristics around organization around business capability, automated deployment, intelligence in the endpoints, and decentralized control of languages and data”.
Source: https://martinfowler.com/articles/microservices.html
Microservices, in my mind, is an architectural framework wherein the different components of an application run in a fine-grained fashion as small (in other words, micro) independent services, each loosely coupled yet cohering with the other services to form the entire application.
Now that we understand the advantages of Microservices over SOA and Monolithic architectures, the other important aspect of this architecture that needs to be considered is deployment. This is important because the paradigm changes abruptly here: we need a platform that not only orchestrates the creation of these loosely coupled microservices but also provides built-in recoverability.
Kubernetes is one such orchestration platform, and it is well suited to deploying and running a microservices-based application.
It provides the following benefits:
- Containerizing every fine-grained module of your app into a microservice
- Versioning and storing these containerized apps in a registry, from where they can be pulled and run as microservices
- Self-healing systems in place
- Zero downtime through rolling updates of your microservice
- Service discovery that enables one component of an application to automatically discover a service belonging to another component of the same application
- Running microservices built on different frameworks, such as Java, .NET, or Python Flask, side by side; it is development-framework independent
And many more…
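To make a few of these benefits concrete, here is a minimal sketch of how a single microservice might be declared on Kubernetes. This is an illustration only: the name `service-a`, the registry URL, the image tag, and the port are my assumptions, not something from the original architecture. The Deployment gives you replicas (self-healing) and rolling updates; the Service gives the other components a stable name for service discovery.

```yaml
# Hypothetical manifests for illustration; names, image, and ports are assumptions.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: service-a
  labels:
    app: service-a
spec:
  replicas: 3                      # Kubernetes keeps 3 pods running (self-healing)
  selector:
    matchLabels:
      app: service-a
  template:
    metadata:
      labels:
        app: service-a
    spec:
      containers:
        - name: service-a
          image: registry.example.com/shop/service-a:v1   # assumed registry and tag
          ports:
            - containerPort: 8080
---
apiVersion: v1
kind: Service
metadata:
  name: service-a                  # other microservices can reach this at http://service-a
spec:
  selector:
    app: service-a
  ports:
    - port: 80
      targetPort: 8080
```

With a manifest like this applied, another microservice in the same namespace can call Service A simply at `http://service-a`, which is the built-in service discovery mentioned above.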
But then, how do we push a software change into a Microservices application using the DevOps model of continuous integration, continuous testing, and continuous deployment, while ensuring the service stays accessible and available to end users?
Enabling Microservices Deployment with DevOps
Here is a diagram that illustrates how DevOps applies to a Microservices application.
So, to apply the above end-to-end DevOps framework, let us consider a simple e-commerce application that has the following services running as microservices:
- Service A
  - User Login Page
  - Catalogue Service
  - Catalogue Database
- Service B
  - Cart Service
  - Payment Service
- Service C
  - Order Service
  - Order Database
The developers of services A, B, and C can each develop their code independently without disturbing the other services.
Here are the DevOps steps involved from the beginning:
- Let us assume that:
  a. The e-commerce application is running on a Kubernetes platform.
  b. Service A needs a version upgrade from V1 to V2.
- A business user/product owner raises an enhancement/bug/user story in JIRA, for which a corresponding feature needs to be developed in Service A.
- The developer is assigned the ticket and starts developing the feature in a feature branch, following the branching and merging strategy of the repository tool, in this case GitLab.
- This triggers a CI build on the Jenkins CI server, which automatically runs a pipeline to compile the code, validate code quality, perform static code analysis, and check for security vulnerabilities.
- If the above step succeeds, the service, in this example Service A, is packaged into a Docker image with a version number using its Dockerfile. The image, tagged with that version number, now contains the changes the developer made to Service A.
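As a rough sketch of what such a build-and-package pipeline could look like, here it is expressed in GitLab CI's YAML notation; the post's pipeline runs on Jenkins, so this only illustrates the same stages, and the build script, image name, registry URL, and tag are all assumptions.

```yaml
# Hypothetical GitLab CI snippet shown only to illustrate the stages described above;
# the post's pipeline runs on Jenkins. Script names, registry, and tags are assumptions.
stages:
  - build
  - package

build-and-test:
  stage: build
  script:
    - ./build.sh   # assumed script: compile, unit tests, static analysis, security scan

package-image:
  stage: package
  script:
    - docker build -t registry.example.com/shop/service-a:v2 .   # bake the change into a versioned image
    - docker push registry.example.com/shop/service-a:v2         # publish the image to the registry
```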
- All that is required now is a rolling update, so that the change to Service A is rolled out gradually without the service going down.
- There are several deployment strategies that make this possible. Let us look at them briefly:
- Ramped/Rolling Update: A deployment pattern where the new version is rolled out while instances of the old version are gradually terminated, so the application stays available on the old version until the new version is fully ready and stable (see the sketch after this list).
- B/G (Blue/Green) Deployments: A deployment pattern where the current version (green) and the next version (blue) run side by side. End users are introduced to the new look and feel of the application on the blue version, and traffic is switched over to it only once it has been verified, before the green version is retired/decommissioned.
- Canary Release: A deployment pattern where the new version of the application is rolled out to only a subset of users (for example, premium users) while the rest still access the old one. This way, feedback is gathered on the new features and their stability.
- A/B Testing: Like a canary, a new version of the application is rolled out to a specific group of end users, but with the intention of measuring how effective the version is at achieving business goals. An informed business decision can then be taken about rolling the feature out more widely.
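As a sketch of how the rolling update described above can be declared (the replica count and surge numbers below are illustrative assumptions), Kubernetes lets you set the strategy directly on the Deployment for Service A:

```yaml
# Hypothetical fragment of the service-a Deployment spec; the numbers are assumptions.
spec:
  replicas: 3
  strategy:
    type: RollingUpdate
    rollingUpdate:
      maxSurge: 1            # bring up at most one extra pod of the new version at a time
      maxUnavailable: 0      # never drop below the desired replica count
```

With this in place, changing the image tag from V1 to V2 (for example with `kubectl set image deployment/service-a service-a=registry.example.com/shop/service-a:v2`) replaces the pods one by one, so Service A stays available throughout the upgrade. A simple canary can be approximated in the same spirit by running a second, smaller Deployment of the new version behind the same Service.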
- After choosing a deployment strategy, we must also consider other aspects the application needs in order to be deployed:
  a. Security: HashiCorp Vault stores secrets in encrypted form and makes them accessible only to the application, rather than secrets being hard-coded into the application.
  b. Packaging: In Kubernetes, several objects need to be created when deploying a service, so all of these objects are packaged with a package-manager tool such as Helm for easy installation and deployment.
  c. Load Balancer/Ingress: After a component such as Service A is deployed as a microservice, other services like Service B and Service C need to discover it automatically, and end users need to be able to reach it externally. Ingress/load-balancing tools such as NGINX provide that external access, complementing Kubernetes' built-in service discovery, as sketched below.
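As a sketch of point (c), an NGINX Ingress resource can expose Service A externally while the in-cluster Service name continues to handle discovery between Services A, B, and C. The host name, path, and ingress class below are assumptions, and the example presumes the NGINX ingress controller is installed in the cluster.

```yaml
# Hypothetical Ingress; host, path, and class names are assumptions.
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: service-a
spec:
  ingressClassName: nginx          # assumes the NGINX ingress controller is installed
  rules:
    - host: shop.example.com
      http:
        paths:
          - path: /service-a
            pathType: Prefix
            backend:
              service:
                name: service-a    # routes external traffic to the in-cluster Service
                port:
                  number: 80
```

In practice, the Deployment, Service, and Ingress objects would typically be templated into a Helm chart (point b) and installed or upgraded with a single `helm upgrade --install` command.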
This way, via CI (Continuous Integration), CT (Continuous Testing), CS (Continuous Security), and CD (Continuous Delivery), DevOps can be leveraged to implement an end-to-end automation framework that simplifies the deployment of a Microservices application running on Kubernetes.
DevOps can further be extended towards Artificial Intelligence and Machine Learning by bringing in industry-standard monitoring services for Microservices, such as Prometheus and Grafana, which provide detailed insight into how the microservices are functioning. I will write a separate blog on Continuous Monitoring of Microservices using Kubernetes.
Feel free to drop your comments, feedback, and queries on this article; I will try to answer each of them at my earliest convenience.