A while back I was mindlessly scrolling through LinkedIn when I saw a promoted job post. Specifically, for DevOps Engineers with GCP, RabbitMQ, and Terraform. Best of all? 100% remote.
Let it be said, I know absolutely nothing about this company or how they operate, and I'm not promoting them. However, this is the future of the industry: message brokers, Terraform, and the public cloud are everything you need to secure a career in 2021 and for the foreseeable future.

If you are new to the industry, the popular thing lately is "button click" environment builds. Believe me when I tell you it sounds way more annoying coming from a non-technical manager. But it is important. We want to automate as many services as we can so that when catastrophe hits, we can quickly destroy and rebuild with as few clicks as possible and, in theory, minimal experience required. If the least technical person in the room can't rebuild your service, then you haven't automated enough.

Going off of this simple post, here is what I want to use:
- AWS for our Cloud Provider (GCP is my weakest, and although I could get through it, I want to make sure you take away the core concepts of cloud)
- Terraform for our Infrastructure as Code
- Ansible for our server configuration (With Dynamic Inventory for our RMQ EC2)
- Jenkins (host and jobs) to deploy the Terraform and Ansible files
- A RabbitMQ instance in EC2, and a direction forward for learning RMQ specifics
Reference the below visual for our flow:
Some of this we have completed in other tutorials, but I don't want to use up space here going through each install. Reference each one below if needed!
AWS account (this tutorial stays within the free tier)
Once you have those, begin below.
Alternatively, check out the video.
Part 1: Setting Up AWS for our project
The Preliminary AWS steps we need to take are:
- Create an IAM role for this project
- Create an EC2 Key pair to ssh to our instances
- EC2 Security Group
IMPORTANT NOTE: I will not be covering remote state in this project. Although there are several approaches, including Terraform Cloud, Gruntwork has some great information and a tutorial on managing remote state. Check it out if you want to set that up.
Log into the AWS Console and navigate to Services > Security, Identity, & Compliance > IAM
- In 'Users', click 'Add User'.
- Enter the username (I will be using DevOpsUser) and then select "Programmatic Access".
- Select 'Attach existing policies directly' and add AmazonEC2FullAccess
- Click through the remaining steps and retrieve the Access and Secret keys (hint: keep these safe)
- Open a terminal and configure your user with your keys like:
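For example, using the default profile (the region is an assumption, and the key values are placeholders for your real keys):

```shell
# Either run `aws configure` interactively, or write the files directly:
mkdir -p ~/.aws
cat > ~/.aws/credentials <<'EOF'
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY
EOF
cat > ~/.aws/config <<'EOF'
[default]
region = us-east-1
EOF
```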
Since we will be using Jenkins locally for this tutorial, let's copy that credential to our jenkins user. (You could also run the configure step as the jenkins user if you want, but I figure you'd want to be able to access the EC2 instance quickly if needed.)
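The copy can look like this (assuming the Debian/Ubuntu default Jenkins home of /var/lib/jenkins; adjust to match your install):

```shell
# Give the jenkins user its own copy of the AWS credentials.
sudo mkdir -p /var/lib/jenkins/.aws
sudo cp ~/.aws/credentials /var/lib/jenkins/.aws/credentials
sudo chown -R jenkins:jenkins /var/lib/jenkins/.aws
```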
Ok, now that we have our IAM user, let's grab a key pair. Navigate to the EC2 Dashboard (via services or the search bar)
- Scroll down the left menu and select 'Key Pairs'
- Click 'Create Key Pair' and name it what you'd like. I will be naming mine rabbitmq since I will be using this key pair for those instances. Up to you. Some companies use one key pair for all instances (yikes).
While in the EC2 Dashboard, let's navigate to security groups as well.
Create a new security group and name it something you will recognize. Add a Custom TCP rule for port 15672, open to anywhere (0.0.0.0/0); this will be important for RabbitMQ later. Also add an SSH rule to your preference (either your local IP or anywhere).
After creating it, make sure to click on it and take note of the ID. We will need this in the Terraform step.
Lastly, in this tutorial my laptop is going to act as my Jenkins/Ansible server, so I will be moving the downloaded pem file to my jenkins user's ~/.ssh directory and chmoding it. Then we will want to change ownership to make sure jenkins can use it.
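Those steps might look like the following (the download path and the Jenkins home are assumptions; adjust to your machine):

```shell
sudo mkdir -p /var/lib/jenkins/.ssh
sudo mv ~/Downloads/rabbitmq.pem /var/lib/jenkins/.ssh/rabbitmq.pem
sudo chmod 400 /var/lib/jenkins/.ssh/rabbitmq.pem
sudo chown jenkins:jenkins /var/lib/jenkins/.ssh/rabbitmq.pem
```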
And we are done with the AWS prelim setup.
Part 2: GitHub and Jenkins Setup
For this project (and most professional settings) you will want to keep your infra files in their own repository and then have Jenkins checkout those files and run them as needed.
So for this, create a GitHub repo now for your project. You can call it whatever you'd like; I am naming mine devops3-Terraform-RMQ-AWS.
I've also created this repo with a README and a .gitignore using the Terraform template.
In your project directory on your machine, clone that repo.
Next, for Jenkins to have access, we need to provide an ssh key.
In your terminal:
When prompted, you can rename the key (within the .ssh dir) to something that coincides with Jenkins (i.e. mine is jenkins_rsa).
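The key generation above can be sketched as follows (the key name and the empty passphrase are assumptions, shown non-interactively for clarity):

```shell
# Generate a dedicated key pair for Jenkins and print the public half.
mkdir -p ~/.ssh
ssh-keygen -t rsa -b 4096 -C "jenkins" -f ~/.ssh/jenkins_rsa -N ""
cat ~/.ssh/jenkins_rsa.pub
```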
Copy that output and jump over to GitHub, then navigate to your user settings (click on your icon in the top right and select Settings > SSH and GPG keys).
Add a new ssh key and name it something like "Jenkins" and then paste the output you copied previously.
Save it and continue to Jenkins.
If Jenkins is not started on your machine, go ahead and fire it up with `sudo systemctl start jenkins` (or however you normally launch it, e.g. `java -jar jenkins.war`).
Once that is running, navigate to http://localhost:8080 and complete the standard prompted setup. Grab the initial password at /var/lib/jenkins/secrets/initialAdminPassword (or ~/.jenkins/secrets/initialAdminPassword for a local install).
Once you are in, navigate to "Manage Jenkins" > "Manage Plugins". We need to add the following under "Available":
- CloudBees AWS Credentials
Now navigate back to 'Manage Jenkins' > 'Manage Credentials' and drill down to global creds.
Click on Add Credential and select SSH Username with Private Key.
Paste your private key output (`cat ~/.ssh/jenkins_rsa`) into this field, along with your username.
Add another credential and this time select "AWS Credentials". Enter your Access Key and Secret Key from your ~/.aws/credentials file and set the ID to the IAM username for reference.
Finally, navigate back to "Manage Jenkins" > "Manage Users" and click on your user. Now click "Configure" on the left. Under "API Token", add and generate a new token. Save this and keep it SAFE; we will need it at the end.
Part 3: Terraform
Before we dive in, review the end result of our project, and I recommend for this project you use the same layout:

    ├── ansible
    │   └── rmq
    │       └── rmq_playbook.yml
    ├── main.tf
    └── variables.tf
So inside your main project directory do:
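Assuming the two Terraform files live in the repo root, that's just:

```shell
# Create the two Terraform files we'll fill in next.
touch main.tf variables.tf
```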
If you didn't notice from the dir structure, this is for our EC2 RabbitMQ instance.
Open both of those files in your editor and add the following:
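A minimal sketch of what those two files can contain; the AMI and security group IDs are placeholders, the region is an assumption, and everything else comes from the earlier steps:

```hcl
# variables.tf
variable "name" {
  description = "Name tag for the instance"
  default     = "rabbitmq"
}

variable "profile" {
  description = "AWS CLI profile to use"
  default     = "default"
}

# main.tf
provider "aws" {
  region  = "us-east-1" # assumption -- use your region
  profile = var.profile
}

resource "aws_instance" "rabbitmq" {
  ami                    = "ami-xxxxxxxx"   # placeholder -- pick an Ubuntu AMI in your region
  instance_type          = "t2.micro"       # free-tier eligible
  key_name               = "rabbitmq"       # the key pair from Part 1
  vpc_security_group_ids = ["sg-xxxxxxxx"]  # placeholder -- the SG ID you noted in Part 1

  tags = {
    Name  = var.name
    group = "rmq" # picked up later by the Ansible dynamic inventory
  }
}
```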
Pretty simple in this case: we have the provider and the resource. Note the security group ID from the SG we created back in the preliminary setup, and that key_name is the rabbitmq key pair we created in Part 1. In addition, we want to update our variables.tf file with the name and profile vars.
Now, a very important piece here is to push this branch to your repo, as Jenkins will need it: `git add .`, `git commit -m "terraform files"`, then `git push origin main`.
I used "main" here so I could skip a merge step, but it's recommended you create feature branches.
That's it for our Terraform segment.
Part 4: Jenkins Job
Now that we have something in our repo, let's start on Jenkins by opening http://localhost:8080
Head to the dashboard and create a new job (freestyle project) and name it:
In the first section, check the box "This project is parameterized" and add the following params: Action (Choice), Ansible (Boolean), Name (String), Group (String):
Now, scroll to "Source Code Management" and select Git. Enter your repo information here and use the SSH key credentials we made earlier. Set it to build off of main (or a branch of your choice).
Jump all the way down to Build Environment and select "Use secret text(s) or file(s)". Add an "AWS access key and secret key" binding and choose the credentials you created in the prelim steps.
In Build Steps, add an "Execute shell" step from the drop-down. In this text box we will add our commands.
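A sketch of what that shell step can contain ($Action and $Name are the job parameters defined above; passing Name through -var is an assumption about how you wire the variable in):

```shell
terraform init -input=false

if [ "$Action" = "apply" ]; then
  terraform apply -auto-approve -var "name=$Name"
elif [ "$Action" = "plan" ]; then
  terraform plan -var "name=$Name"
else
  terraform destroy -auto-approve -var "name=$Name"
fi
```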
Don't get too hung up on what I did here; in fact, I hope you will make it your own and make it better. I simply have an if/elif/else statement: one branch each for apply, plan, and destroy. (You will see in the next step why we have a separate branch for apply.) Note the -auto-approve on apply and destroy; this gets us past the interactive Terraform prompt.
Now save it and run it with "plan" selected. You should see your Terraform plan output in the Jenkins console output.
Part 5: Ansible with a Dynamic inventory
If you have used ansible with the /etc/ansible/hosts file, you can imagine that it gets out of hand adding hosts as they are created. So why not make use of a dynamic inventory?
From the main project dir, I want to `mkdir -p ansible/rmq` and then create my playbook (`touch ansible/rmq/rmq_playbook.yml`).
Now, we also need to add an ansible.cfg (if it's not already there) and a dir called group_vars/ with a file inside called tag_group_rmq.yaml, alongside the inventory file in /etc/ansible.
The config is used here to disable the prompt asking whether you want to add a new host to known_hosts.
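That's a one-line setting (shown here in /etc/ansible/ansible.cfg, which is the default location):

```ini
# /etc/ansible/ansible.cfg
[defaults]
host_key_checking = False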
The group vars are used by Ansible to apply a specific set of params to the group tag on our EC2; in this case, the SSH user and private key.
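That file can be as small as this (ubuntu assumes an Ubuntu AMI, so swap in ec2-user for Amazon Linux, and point the key file at wherever you put the pem):

```yaml
# /etc/ansible/group_vars/tag_group_rmq.yaml
ansible_user: ubuntu
ansible_ssh_private_key_file: /var/lib/jenkins/.ssh/rabbitmq.pem
```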
We also want to create a file /etc/ansible/aws_ec2.yaml and open it.
Straight from Ansible's documentation (with some tweaks), we want to add:
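Something along these lines (the region is an assumption; keyed_groups is what turns the EC2 tag group=rmq into the inventory group tag_group_rmq):

```yaml
# /etc/ansible/aws_ec2.yaml
plugin: aws_ec2
regions:
  - us-east-1
filters:
  instance-state-name: running
keyed_groups:
  - key: tags
    prefix: tag
```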
Then in the playbook (in our project directory, ansible/rmq), let's add our basic RabbitMQ install and then run it with the start command. The last two tasks will be for our admin panel and initial user.
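A sketch of that playbook; the admin username and password are placeholders, and the apt module assumes an Ubuntu instance:

```yaml
# ansible/rmq/rmq_playbook.yml
- hosts: tag_group_rmq
  become: yes
  tasks:
    - name: Install RabbitMQ
      apt:
        name: rabbitmq-server
        update_cache: yes
        state: present

    - name: Ensure RabbitMQ is started
      service:
        name: rabbitmq-server
        state: started
        enabled: yes

    - name: Enable the management plugin (admin panel on 15672)
      rabbitmq_plugin:
        names: rabbitmq_management
        state: enabled

    - name: Create an initial admin user
      rabbitmq_user:
        user: admin
        password: changeme  # placeholder -- use your own
        vhost: /
        configure_priv: ".*"
        read_priv: ".*"
        write_priv: ".*"
        tags: administrator
```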
Part 6: Final steps and launch
Let's add one last job for our RMQ and call it RabbitMQ_Configuration. All we need to do for this job is add our Git URL like in our first Jenkins job, add a build trigger, and then add a shell build step with the following:
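The shell step for this job can be a single command (paths match the files we created in Part 5):

```shell
ansible-playbook -i /etc/ansible/aws_ec2.yaml ansible/rmq/rmq_playbook.yml
```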
To trigger this, under the Build Trigger section, check the box that says Trigger builds remotely (e.g., from scripts) and then add an authentication token like...verysecureansibletoken
Use something more secure 🙂
Lastly, navigate back over to our first job and add an if statement nested inside the apply branch, as well as a new parameter.
So, in the RabbitMQ_Terraform_Build configure panel, add a boolean parameter called Ansible and then make sure your shell command looks like this (replace USER:TOKEN with your username and the token from the setup):
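With the nested check, the apply branch might look like this (USER:TOKEN and the trigger token are the values from your own setup; passing Name through -var is an assumption about how you wired the variable):

```shell
terraform init -input=false

if [ "$Action" = "apply" ]; then
  terraform apply -auto-approve -var "name=$Name"
  if [ "$Ansible" = "true" ]; then
    # Kick off the RabbitMQ_Configuration job via its remote trigger token.
    curl -u USER:TOKEN "http://localhost:8080/job/RabbitMQ_Configuration/build?token=verysecureansibletoken"
  fi
elif [ "$Action" = "plan" ]; then
  terraform plan -var "name=$Name"
else
  terraform destroy -auto-approve -var "name=$Name"
fi
```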
Now you are done. If you check that Ansible box on apply and run it, Terraform will build your EC2, and then on success run the Ansible playbook!
At this point, you can navigate over to RabbitMQ's tutorial and complete it using your EC2 instance!
Try to think critically about ways you can extend the configurations we made above. I do plan on doing more real-world RMQ examples down the road, but this will get you started. Happy learning!