We all love buzzwords, and one of the biggest over the last few years has been DevOps. What in the world does it mean? I have talked to people who think it means the Agile/Scrum methodology, while others think it is just Docker containers. To some people it is just scripts to manage their network infrastructure and Linux servers, and to others it is a Continuous Integration/Continuous Deployment (CI/CD) pipeline built around git repositories. Wikipedia says DevOps "is a set of practices that combines software development (Dev) and IT operations (Ops)" and "aims to shorten the systems development life cycle and provide continuous delivery with high software quality."
So which one is right? As we work internally and with clients, the definition that works best for me is a set of practices, techniques, and tools that make automation a reality. That may be Ansible/Chef/Puppet checking and enforcing configuration on network infrastructure, Linux servers, and Windows servers. It also covers the software development process itself. At the end of the day, it is looking at what is possible and putting it into action with the appropriate tools.
So, now we have the age-old "tools discussion." It is a holy war, but I would say don't start there. Instead, do this:
Whiteboard out exactly what you want to do.
Ask why. A LOT. Use the Five Whys method to get to the root cause of existing problems with your business processes
Take an inventory of your current tools, especially ones that already have agents installed or proper permissions
Get and use a source code repository
Start simple and modular, allowing for code/technique reuse
RUTHLESSLY ELIMINATE all manual steps wherever possible
Refactor and look for efficiencies.
Rinse and repeat
So, what are some examples? Reach out to us; we are happy to help you put together some ideas and share best practices. Do not limit yourself. Treat this as an opportunity to show the Art of the Possible. To get you thinking, below are a couple of DevOps projects that we have successfully completed:
EXAMPLE 1 - CI/CD Pipeline for Software Deployment
This one is pretty "standard," but saves a ton of time and leverages several stages/additional pipelines throughout the process. Reach out and we can go into more detail, but here are the high-level pieces (a simplified sketch of these pre-merge checks follows the list):
Developer submits a PR (GitHub) or Merge Request (GitLab) to the "dev" branch of an Angular/.NET web application.
Run .NET unit tests and report these results back to the GitHub PR
Run Angular unit tests and report these results back to the GitHub PR
Build a Docker Container
Push it to Docker Hub or another container registry, tagged with the commit hash
Run an npm audit against the installed npm packages and report these results back to the GitHub PR
Run container vulnerability scanning against the built container and report these results back to the GitHub PR
Run static code analysis and publish the results to SonarQube (e.g., code quality or Section 508 issues)
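To make these pre-merge stages more concrete, here is a minimal sketch of them written as a single glue script. In a real pipeline each step would run as its own CI job (GitHub Actions, GitLab CI, etc.) that reports back to the PR; the image name example/webapp, the GIT_COMMIT_SHA environment variable, and the choice of Trivy for container scanning are illustrative assumptions, not our exact setup.

```python
#!/usr/bin/env python3
"""Hypothetical sketch of the pre-merge CI stages as one glue script.
In practice each step runs as a separate pipeline job that posts results to the PR."""
import os
import subprocess

def run(*cmd):
    """Echo and run a command, failing the pipeline on a non-zero exit code."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

sha = os.environ.get("GIT_COMMIT_SHA", "local")   # assumed to be injected by the CI system
image = f"example/webapp:{sha}"                   # hypothetical image name, tagged with the commit hash

run("dotnet", "test", "--logger", "trx")          # .NET unit tests
run("npm", "ci")                                  # restore Angular dependencies
run("npm", "test", "--", "--watch=false")         # Angular unit tests
run("npm", "audit", "--audit-level=high")         # audit the installed npm packages

run("docker", "build", "-t", image, ".")          # build the container
run("docker", "push", image)                      # push to Docker Hub or another registry
run("trivy", "image", "--exit-code", "1", image)  # container vulnerability scan (Trivy shown as one option)
```

The SonarQube analysis step is omitted here; it would typically invoke the sonar-scanner CLI with a project key and token supplied by the pipeline.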
The person approving the PR then has relevant data and results to review in addition to the code itself. If they approve the PR, the following happens (sketched in code after the list):
Download the latest Secrets and ConfigMap (environment variables) and deploy them to Kubernetes
Update the image of the running pod in the DEV namespace of Kubernetes with the newly built image/commit hash
Run Cucumber tests against DEV for basic smoke tests and other test cases
Publish the Cucumber report to the pipeline
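Here is a similar minimal sketch of those post-approval deployment steps. The Deployment name webapp, the dev namespace, and the manifest paths under k8s/dev/ are placeholders chosen for illustration, and the Cucumber invocation assumes the JavaScript cucumber-js runner is installed in the project.

```python
#!/usr/bin/env python3
"""Hypothetical sketch of the post-approval deployment to the DEV namespace."""
import os
import subprocess

def run(*cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

sha = os.environ.get("GIT_COMMIT_SHA", "local")
image = f"example/webapp:{sha}"                   # same image built earlier in the pipeline

# Apply the latest Secrets and ConfigMap (environment variables) to DEV
run("kubectl", "apply", "-n", "dev", "-f", "k8s/dev/secrets.yaml")    # hypothetical manifest paths
run("kubectl", "apply", "-n", "dev", "-f", "k8s/dev/configmap.yaml")

# Roll the running pods over to the newly built image/commit hash
run("kubectl", "set", "image", "deployment/webapp", f"webapp={image}", "-n", "dev")
run("kubectl", "rollout", "status", "deployment/webapp", "-n", "dev")

# Run Cucumber smoke tests against DEV and write an HTML report for the pipeline to publish
run("npx", "cucumber-js", "--format", f"html:reports/cucumber-{sha}.html")
```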
Now the app is up and running in DEV with nothing being done manually outside of the normal PR approval process. Developers and decision makers see more data and can make more informed decisions. This approach lowers costs by eliminating manual labor, improves software quality, and ensures security vulnerabilities do not escape to production. The pipeline then continues all the way through to Production and releases for customers.
EXAMPLE 2 - Extending This Pipeline
So, how can we take this even further? Our software can run in a Docker container, but it can also be deployed as a standalone virtual appliance. We leverage the pipelines above to assist with this as well (a sketch of the appliance stage follows the list):
A release tag is created in GitHub
The release pushes the production container to Docker Hub for customers to deploy/update
This process also creates a release in our Appliance pipeline
This pipeline gets the release version as an input variable
It updates the necessary files in its git repository
It spins up a custom Linux build box running in Azure/AWS/wherever
It builds the appliance, creating an ISO
It automatically uploads this ISO to an Azure Blob which is referenced from a URL or website
It shuts down the Linux box to save compute costs within Azure/AWS
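A stripped-down sketch of the appliance stage might look like the following, using the Azure CLI. The resource group, VM name, storage account, and blob container names are placeholders, and the actual ISO build on the box (for example, over SSH) is elided.

```python
#!/usr/bin/env python3
"""Hypothetical sketch of the appliance release stage, driven by the release tag."""
import os
import subprocess

def run(*cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

tag = os.environ["RELEASE_TAG"]        # release version passed in as an input variable
iso = f"appliance-{tag}.iso"

# Start the dedicated Linux build box (placeholder resource group / VM name)
run("az", "vm", "start", "--resource-group", "build-rg", "--name", "appliance-builder")

# ... the ISO build itself runs on that box (e.g. over SSH) and is elided here ...

# Upload the resulting ISO to an Azure Blob container that the download URL points at
run("az", "storage", "blob", "upload",
    "--account-name", "examplestorage",        # placeholder storage account
    "--container-name", "releases",
    "--name", iso,
    "--file", iso,
    "--overwrite")

# Deallocate the build box so it stops accruing compute costs
run("az", "vm", "deallocate", "--resource-group", "build-rg", "--name", "appliance-builder")
```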
All of this occurs, once again, from a single action by an authorized individual: creating a release in GitHub. Everything is 100% automated; the only thing required is a simple governance process to approve the release.
I hope this gave you a couple of ideas of how DevOps can benefit you. In a future post, I will dive into another example with more of an infrastructure focus. The purpose of DevOps is putting automation into action. Ruthlessly eliminate every manual step possible. Reach out to us. Or better yet, schedule a free initial consultation with me (Craig Thomas) here. We would love to partner with you as you put these techniques and others into practice to eliminate manual steps and focus on the more important areas of your work.