Troubleshooting Etherpad In Google Cloud

There is quite a lot that could go wrong with the work in the previous post, and I feel I experienced most of the issues! We need to remember there are two projects in play here: one to create the container inside the infrastructure, and the other to create the infrastructure the container lives in. A problem could be caused by either of these, so the correct action needs to be taken to fix the issue.

Connecting To The Database

Now that we have a database it is time to connect Etherpad to it. We see in the Hands on Guide to Google Cloud that the way to achieve this is by setting some environment variables to be read by Etherpad when it starts. Let's have a look at how to do that. Adding Environment Variables: once again this is set up by the cloud run app module, which, as I noted before, is the basis of my webapp.
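The excerpt above stops before showing the variables themselves, so here is a minimal sketch of the idea rather than the boilerplate's actual interface. Etherpad's Docker image reads its database settings from DB_* environment variables at start-up; the module source placeholder, the environment_variables argument name and the sql module output referenced below are all assumptions for illustration.

# Sketch only: argument and output names are assumptions, not the real module interface.
module "webapp" {
  source = "..." # the boilerplate's Cloud Run app module

  environment_variables = {
    DB_TYPE = "postgres"
    DB_HOST = module.sql.private_ip_address # hypothetical output name
    DB_PORT = "5432"
    DB_NAME = "etherpad"
    DB_USER = "etherpad"
    DB_PASS = var.etherpad_db_password # hypothetical variable; kept as a secret in practice
  }
}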

Creating The Database

The next step in the Hands on Guide to Google Cloud says we should connect the Etherpad instance to a database. Let's see how we create the database using the automated processes we are developing. Creating the database using the Terraform Cloud SQL module: a SQL instance was created by the Terraform code that was generated from the boilerplate. University members can see the sql.tf template. It contains the following:

resource "random_id" "sql_instance_name" {
  byte_length = 4
  prefix      = "sql-"
}

# We make use of the opinionated Cloud SQL module provided by Google at
# https://registry.
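The excerpt is truncated before the module block itself, so purely as an illustration of what such a module provisions, here is a sketch written against the plain google_sql_database_instance and google_sql_database resources rather than the opinionated Cloud SQL module the post actually uses; the database version, region and tier are placeholder choices.

# Illustration only: roughly what the Cloud SQL module wraps, not the module call in sql.tf.
resource "google_sql_database_instance" "etherpad" {
  name             = random_id.sql_instance_name.hex # derived from the random_id above
  database_version = "POSTGRES_13"                   # placeholder version
  region           = "europe-west2"                  # placeholder region

  settings {
    tier = "db-f1-micro" # placeholder machine tier
  }
}

resource "google_sql_database" "etherpad" {
  name     = "etherpad"
  instance = google_sql_database_instance.etherpad.name
}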

Creating Infrastructure With Terraform

This is an enormous and very complicated area. The hands-on guide says to change the memory limit for the application to make it run faster. So, how do we do this in Terraform? In the project generated from cookiecutter, I can simply edit webapp.tf and change the webapp module to add memory_limit="1024M". Simple, but how do I know I can do that? I need to read up on the Terraform language to understand what is going on.
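Since the excerpt is terse at this point, here is what that one-line edit might look like in webapp.tf; the memory_limit argument and value come from the text above, while the module source placeholder and the omission of every other argument are mine.

# Sketch of the edit to webapp.tf; all other arguments omitted.
module "webapp" {
  source = "..." # the boilerplate's Cloud Run app module

  # Raise the Cloud Run memory limit for the Etherpad container.
  memory_limit = "1024M"
}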

Setting Up Terraform

Before starting to change the configuration with Terraform, there is some set-up work that needs to be done. While the getting-started guides are fine, in practice this leaves a problem of how to work with colleagues and how to manage secrets. My colleagues have created a tool called Logan which they use to run Terraform. It is installed using pip, but it runs as a Docker container, so it will require a working Docker installation to run properly.

Deploying a Container to Google Cloud

Having created the container, we now need to deploy it to Google Cloud. We create a deploy project. This was created for me, I believe from our Boilerplate Google Cloud Deployment project. This project can currently only be viewed from within the University. The .gitlab-ci.yml contained the following at the time of writing:

# Use UIS deployment workflow, adding jobs based on the templates to deploy
# synchronisation images
include:
  - project: 'uis/devops/continuous-delivery/ci-templates'
    file: '/auto-devops/deploy.

Building A Container with Auto DevOps in GitLab

In the last post I set up a runner. Now let's see if we can use it to create a container. I am looking into how my colleagues at Cambridge do things. They have a handy guide to Google Cloud using what colleagues call "click ops". When I ran through it I was given a project which already had an ID, so I had to make sure I used the project ID I was given rather than the one in the document.

GitLab Runner for Auto DevOps

The Auto DevOps pipeline in GitLab is supposed to be magic! It just reads your mind and does whatever you want! At least that seems to be what the sales blurb says! Over the next few posts I will investigate how we use it. It is pretty clever, but we do need to set up a couple of things first. Requirements: this assumes Docker is installed and working on the machine that will be used for the runner.