Things I Wished I Knew About DevOps Practices and Cloud Technologies When I Started my First Role in Tech


It’s 2021 and I’m just over a month into my third role as a Software Engineer & Tech Coach. It’s been a whirlwind of a journey so far! Here are some of the things I wish I had known about DevOps practices and cloud technologies when I started my first role in tech.

My role wasn’t just about full-stack Software Engineering in C#, but also involved DevOps practices and Cloud technologies

During my career switch into tech, I thought that DevOps practices and Cloud technologies were utilised solely by DevOps Engineers and Cloud Engineers. I underappreciated how much of my role would involve DevOps practices and the Cloud.

When I spoke to people in my network, especially those who had recently started their first roles in technology, it seemed like a mixed bag. Some were not involved in DevOps and Cloud at all, though they mentioned some of their colleagues were. Others, like myself, had more of a hybrid role, and some people were doing DevOps and Cloud work every single day!

What is DevOps in a nutshell?

AWS states, “DevOps is the combination of…philosophies, practices, and tools that increases an organisation’s ability to deliver applications and services…”. The infrastructure and process that sits behind software ensures a smoother experience for building code, testing it, shipping it out and monitoring it.

DevOps and Cloud is there to help Developers

Some Software Engineers would say that DevOps and Cloud are not part of their role, so why should they bother? They do have a point. It’s a massive world, and product offerings like AWS Amplify now help those who major on the front-end and API domains build mobile/web apps quickly. However, there’s value in learning some of the key concepts of how DevOps and Cloud can help you.

In my first role in tech, I wanted to learn some fundamentals of DevOps and Cloud that would support me in my role as a C# Full-Stack Software Engineer.

In my team at the time, one of the projects we were tasked with was rewriting a legacy Excel application as a .NET Core 3.1 C# web application (at the time of writing this post, it’s .NET 5). I really liked the way my team worked together on this: all the developers/testers, business analysts, our product owner and scrum master mobbed on it.

Something popped into my head at the time: “Why can’t we just build the web application and then just deploy it to production for the users, easy right? I can just click around on the Azure Portal and just manually make my resources there and then manually deploy.”

Well, when we started mob programming on the cloud infrastructure process, I realised there was more to it than just ‘making something work’.

Automated Continuous Integration & Continuous Deployments Using Azure Repos & Pipelines

One of the things that stuck with me was CI/CD (Continuous Integration / Continuous Deployment). According to the AWS DevOps blog, “An integral part of DevOps is adopting the culture of continuous integration and continuous delivery/deployment (CI/CD), where a commit or change to code passes through various automated stage gates, all the way from building and testing to deploying applications, from development to production environments.”

I got to appreciate this by learning about git, git repositories on Azure repos, managing branches and creating pipelines to build and deploy our C# solution.

During my learning process, I had a sneak peek at how different teams were utilising Azure Pipelines. At first I was hard-coding values, and this sort of worked, but I found myself copying and pasting all the time. I then realised parameterisation was helpful because it let me supply different values to the same pipeline variables. This helped me and the other developers on my team because it meant we could replicate the same setup across the development, testing, pre-production and production environments of the pipeline. We could configure things to be switched ‘on’ and ‘off’ through code.
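Parameterisation in Azure Pipelines looks roughly like this. This is only a sketch: the parameter names, environment values and steps below are hypothetical, not taken from the actual project.

```yaml
# Hypothetical parameterised Azure Pipelines template.
parameters:
  - name: environment
    type: string
    default: dev
    values: [dev, test, preprod, prod]
  - name: runSmokeTests
    type: boolean
    default: true

steps:
  # The same steps run for every environment; only the supplied values differ.
  - script: echo "Deploying to ${{ parameters.environment }}"
  # Behaviour can be switched 'on' and 'off' through code.
  - ${{ if eq(parameters.runSmokeTests, true) }}:
      - script: echo "Running smoke tests"
```

The same template can then be invoked once per environment with different parameter values, instead of copy-pasting near-identical pipelines.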

Separation of concerns was important here. We decided to go with an infrastructure pipeline and an app pipeline. If there were changes to the web application on a branch, CI/CD would automatically detect this and trigger a build and deployment to the relevant environments using the relevant pipelines. Test suites would run automatically too. Once the Pull Request (PR) for the branch had been approved and merged, the CI/CD pipeline would build and deploy to the environments. No more of the arduous manual deployments we had to deal with for the original Excel application. Great!
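A path-based trigger is one common way to make each pipeline react only to its own changes. The branch name and directory layout below are illustrative, not the actual repository structure:

```yaml
# Illustrative CI trigger for the app pipeline:
# build only when application code changes.
trigger:
  branches:
    include: [main]
  paths:
    include:
      - src/WebApp/*      # hypothetical path to the C# web application
    exclude:
      - infrastructure/*  # changes here belong to the infrastructure pipeline
```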

Infrastructure-as-Code

During my first role, I realised that clicking around the settings on the Azure Portal to create and configure resources was helpful for me, but not helpful for others: it wasn’t repeatable. We had to think as a team about how we could define and configure the infrastructure using a better approach. This was where Azure Resource Manager (ARM) templates came in handy. They enabled us to define what infrastructure we wanted to create, and how we wanted to create and configure it.

The ARM templates were useful as they could be version controlled through git as well; just like we would version control code. There were also helpful extensions on Visual Studio for structuring and validating these templates.

Most importantly, it enabled a repeatable and testable process for our infrastructure.
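For a flavour of what this looks like, here is a minimal, hypothetical ARM template fragment declaring a single App Service; the parameter name is made up for illustration:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "appServiceName": { "type": "string" }
  },
  "resources": [
    {
      "type": "Microsoft.Web/sites",
      "apiVersion": "2020-06-01",
      "name": "[parameters('appServiceName')]",
      "location": "[resourceGroup().location]"
    }
  ]
}
```

Because this is just a JSON file, it lives in git alongside the application code, and the same template produces the same infrastructure every time it is deployed.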

Logging & Monitoring

So why do we need logging & monitoring? Let me put it this way: when you release a new feature for your product, that’s just the start. Just as a plane has a suite of telemetry to record readings from its instruments, software needs the same to ensure everything is operating as it should. Try to think about where logging and monitoring make sense for you.

We used Azure Monitor to add observability into our applications, infrastructure and network.
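Azure Monitor works at the platform level, but observability also starts with what your application emits. As a language-neutral sketch (using Python’s standard library here rather than our C# stack, and a made-up `process_order` operation), logging at key points of an operation might look like this:

```python
import logging

# Configure a logger that records timestamp, level and message;
# in production these lines would be shipped to a backend such as Azure Monitor.
logger = logging.getLogger("orders")
handler = logging.StreamHandler()
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(name)s %(message)s"))
logger.addHandler(handler)
logger.setLevel(logging.INFO)

def process_order(order_id: str, amount: float) -> bool:
    """Hypothetical operation instrumented with logs at its key points."""
    logger.info("processing order %s for %.2f", order_id, amount)
    if amount <= 0:
        logger.warning("rejected order %s: non-positive amount", order_id)
        return False
    logger.info("completed order %s", order_id)
    return True

process_order("A-123", 49.99)
process_order("A-124", -5.00)
```

The point is not the library, it’s the habit: log the start, outcome and failure modes of each operation, so the monitoring system has something meaningful to alert on.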

Final Thoughts

This is just the surface of what DevOps and Cloud technologies can offer developers; of course, there are specialists who go deeper into more concepts than those I’ve covered here. If you are working in tech, there is real benefit in learning some of the fundamentals of the infrastructure and process that sits behind software, to ensure smoother experiences for building code, testing it, shipping it out and monitoring it.

Intro to Docker Containers & Microsoft Azure Part 1 – Beginner’s Guide to Containerising a Hello World Python Web App 


Greetings!

Hi everyone, it’s great to be back again! This blog is part one of a two-part series on Docker and Microsoft Azure. In Part 1, we will containerise a Hello World Python web app using Docker. In Part 2, we will learn how to build and push the container image using DevOps pipelines on Microsoft Azure.

Prerequisites:

Before we get stuck in, here are some prerequisites:

Containers

Here is a useful link if you would like to have a quick 5 min intro to containers.

Docker

Docker is a set of platform-as-a-service products that use OS-level virtualization to deliver software in packages called containers. (Ref: https://en.wikipedia.org/wiki/Docker_(software))

It mitigates the classic “but it works on my machine!” problem and streamlines the development lifecycle by allowing developers to work in standardised environments using local containers which provide the applications and services. (Ref: https://docs.docker.com/engine/docker-overview/).

Containers are great for continuous integration and continuous delivery (CI/CD) workflows. Did I mention we can even integrate it with Azure? ☁️

You can get started on the Docker documentation here.

Docker Desktop

This blog assumes you have Docker Desktop installed. To install Docker Desktop, you can use this link. I’m using my lovely MacBook Pro 💻 for this – hehe! 😉; but you can choose whether you want to download Docker Desktop for Mac or Windows.

Python

For this tutorial, you will need Python 3.7+, which you can download from the following link.

Pip

You will also need the latest version of pip which is a recommended tool for installing Python packages.

To check you have the right Python and pip versions, you can use the commands:

python --version
pip --version
Checking Python and Pip Versions

Now, onto the fun stuff! 🏄‍♀️

Step 1: Project Setup – Create a Project Directory

First, let’s create a new project directory called hi-there-docker. Please feel free to call your project directory any name you want, but just remember to reference it throughout this blog.

Open up the project directory in your favourite code editor. I find Visual Studio Code works quite well if you’re starting out.

Step 2: Project Setup – File Setup

Next, let’s create a requirements.txt file in the directory; it is good practice in Python projects to have this file to manage packages.

Top Tip! 💡

If you are using Windows, you can create a ‘touch’ function (for the UNIX fans out there) which enables you to use the touch command to create a new file in Windows PowerShell. Enter the following command in PowerShell to enable it:

function touch { Set-Content -Path ($args[0]) -Value ($null) }

In the requirements.txt file, enter the package ‘Flask==1.0.2’, as we will need Flask to create the hello world application.

requirements.txt

Finally, enter the following in the terminal to install the packages listed in requirements.txt.

pip install -r requirements.txt

Step 3: Python – Create a Hello World Flask App

I won’t be going into detail on how Flask works, but you should check it out if you’re interested.

We’re now onto Step 3: let’s create a new file called main.py.

In the main.py file, enter the following:

from flask import Flask

app = Flask(__name__)

@app.route("/")
def hello():
    return "Hello World!"

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000, debug=True)

This is a very simple app with a single route that returns ‘Hello World!’.

main.py

Step 4: Create a Dockerfile

Once that’s complete, we can move onto Docker! Let’s create a Dockerfile. The Dockerfile defines how the container image is built: the base image (which pins the Python version), the files to copy in, and the dependencies to install.

Create a Dockerfile in the root directory of your project folder and enter the following:

# Start from the official Python 3.7 base image
FROM python:3.7
# Copy the project files into the /app directory of the image
COPY . /app
# Set /app as the working directory for the commands below
WORKDIR /app
# Install the Python dependencies listed in requirements.txt
RUN pip install -r requirements.txt
# Document that the app listens on port 5000
EXPOSE 5000
# Run the Flask app when the container starts
CMD python ./main.py

Your Dockerfile should look something like this:

Dockerfile

Step 5: Let’s Build the Image

Before we can build the image using Docker, let’s confirm the Docker CLI is working by typing the following into your terminal:

docker --version
Checking Docker Version

Also, check that Docker Desktop is up and running; there should be a cute little icon of the Docker whale in your menu bar (or system tray on Windows).

Docker Whale

Now we are ready to build the image! 💿

You can tag the image with a name of your choosing, but let’s use the name hi-there-image.

Ensure you are in the root directory of the project and enter the following into the terminal:

docker build --tag hi-there-image .
Building the Image

🎉 You’ve built your first image using Docker and tagged it with a name – woohoo! 🎉


Step 6: Running the Image as a Container

Once the Docker image has been built, you are now ready to run the image as a container.

We will call our container hi-there-container and run it from the hi-there-image image we built earlier.

To start the application as a container, mapping port 5000 inside the container to port 5000 on your host, enter the following into the terminal:

docker run --name hi-there-container -p 5000:5000 hi-there-image
Running the Image as a Container

🎈That’s it! You are now running the image as a container! 🎈

Step 7: Go to the App

🥳 Now you can go to your app at http://localhost:5000 🥳

Go to the App

Final Step: Viewing and Managing Containers using the Docker CLI

#To display a list of running containers:
docker ps

#To stop one or more running containers:
docker container stop [container name(s)]

#For example, to stop hi-there-container:
docker container stop hi-there-container

#To remove one or more containers:
docker container rm [container name(s)]

#For example, to remove hi-there-container:
docker container rm hi-there-container

#To confirm the container is no longer running, check that it no longer appears in the list:
docker ps

😊 Congratulations, you have just built and containerised a Hello World App using Docker! 😊

🤔 What’s next?

If you want to explore more on how to build and push the images using Docker as tasks within the Microsoft Azure Pipelines, watch out for Part 2 of this blog series. ☁️