Automated deployment with Google Compute Engine and Cloud Build
and a cup of tea
This article shows how to use tools available on Google Cloud Platform to set up a development, automated build, and continuous delivery pipeline using Google Compute Engine and Google Cloud Build.
- A server running on Google Compute Engine.
- PM2 installed and running.
- For production use, read this article to create a clean SSH connection.
If you don’t have that, follow this brilliant and easy tutorial to deploy a Node.js application online.
Deploy a Node.js server using Google Cloud Compute Engine
Getting started with a minimal Node.js “ping” app
What is CI/CD?
CI/CD stands for Continuous Integration and Continuous Deployment.
Big words for a simple concept: automated deployment and testing.
Continuous integration improves collaboration and quality.
Teams using CI/CD go through a few steps before deploying code to the end user.
The GitLab Flow proposes pre-production and production branches before deploying code online.
With that kind of flow, automated deployment is essential to avoid spending hours on a server…
The pipeline will be as follows:
- Push to the GitHub main branch
- Trigger fired by Google Cloud Build
- pull.sh script executed
- Code pulled on the server and the server restarted
Create a pull.sh file
The pull.sh file is a bash script that performs two tasks:
- Pull the latest code from the GitHub repository
- Restart PM2
This file will be executed after every push to GitHub (see the next sections).
Google Cloud Build uses the root user to access our server, so we first need to act as root.
Run the following commands:
sudo su root
touch pull.sh
sudo vim pull.sh
In Vim, press “i” and paste the following code (adapt it to your setup):
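The embedded script itself isn’t reproduced in this article; below is a minimal sketch of what pull.sh can look like. The repository path, branch name, and PM2 target are assumptions you must adapt to your own server:

```shell
#!/bin/bash
# pull.sh — fetch the latest code and restart the Node process.
# /home/your_user/your-repo, "main" and "all" are placeholders.
cd /home/your_user/your-repo || exit 1   # path to the cloned repository
git pull origin main                     # pull the latest code from GitHub
pm2 restart all                          # restart every PM2-managed process
```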
Press Esc, type “:wq”, and press Enter.
You should have something similar to this:
Now, type:
chmod +x pull.sh
This gives the root user permission to execute the script.
To make sure everything is set up correctly, execute pull.sh:
The script runs successfully: first it pulls the latest updates from GitHub, then it restarts the process manager.
First task… Done! ✅
Create a Google Source Repository
You need a GitHub repository to push code to the main branch.
If you have your own repo, all good; if not, fork mine.
Once you have a GitHub repository, link it at https://source.cloud.google.com/repos.
Click on “Add repository” in the top-right corner.
Then connect to an external repository.
Select a project and authorize Google to access your GitHub repository.
Once this is done, you can see the new repository in the list:
Any update made on GitHub will be mirrored to this Google Source Repository.
We can now create a trigger on pushes to the main branch.
You might have noticed the cloudbuild.yml file in my repository.
This file contains the commands that execute the pull.sh file we just created.
DON’T FORGET to change your server name, the zone, and the user name.
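The cloudbuild.yml itself isn’t reproduced here; below is a minimal sketch of what such a file can look like, assuming the build reaches the server with gcloud compute ssh — the user, instance name, and zone are placeholders to replace:

```yaml
steps:
  # Connect to the Compute Engine instance and run the deployment script.
  # "your_user", "your-instance-name" and the zone are placeholders.
  - name: 'gcr.io/cloud-builders/gcloud'
    args:
      - compute
      - ssh
      - your_user@your-instance-name
      - --zone=europe-west1-b
      - --command=./pull.sh
```

For this to work, Cloud Build needs SSH access to the instance, which is exactly why a clean SSH setup matters in production.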
Create a trigger
Short reminder of the process :
Push on the main branch => pull.sh bash script executed by the root user => new code pulled on the server and the server restarted
How will pull.sh be executed? By cloudbuild.yml.
How will cloudbuild.yml be executed? By a trigger.
How will the trigger be executed? By a push on the main branch.
Let’s create a trigger!
Go to https://console.cloud.google.com/cloud-build/builds and click on “Triggers”
Then, click on “+ Create a trigger”
Copy the following configuration:
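If you prefer the command line, an equivalent trigger can be sketched with the gcloud CLI; the repository name is an assumption matching this tutorial’s mirrored repo, and the branch pattern and build config mirror the console settings:

```shell
# Create a Cloud Build trigger on the mirrored Cloud Source Repository.
# "your-mirrored-repo" is a placeholder for your repository name.
gcloud builds triggers create cloud-source-repositories \
  --repo=your-mirrored-repo \
  --branch-pattern="^main$" \
  --build-config=cloudbuild.yml
```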
And you are all set!!
We can test our trigger by clicking the RUN button:
A snackbar should be displayed: click on SHOW.
In the build log, you can see the cloudbuild file executing and finally running the pull.sh file.
Verify the entire CI/CD process
When we GET http://220.127.116.11/api/v1/ping, we get “pong”.
Let’s change the response to “pong from CI CD process” and push the change to main:
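The check can be reproduced from any terminal; the IP below is the one used in this article, so replace it with your own server’s address:

```shell
# Query the ping endpoint; once the pipeline has run, the response
# should change from "pong" to "pong from CI CD process".
curl http://220.127.116.11/api/v1/ping
```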
The trigger is now running on cloud build :
And the response is magically updated online :
If you plan to use this in production, please read this article so your connection to Compute Engine is set up cleanly using SSH: