Terraform 101

A simple guide to the underlying concepts of Terraform

Shanika Perera
Towards Data Science


If you’re a DevOps engineer, or someone who deals with DevOps-related work in your day-to-day job, I’m certain you have heard about the Infrastructure as Code (IaC) concept. Simply put, IaC is something that has fallen straight out of heaven to lend a helping hand to everyday struggling DevOps engineers. IaC is the practice of using machine-readable definition files to manage and provision an entire IT infrastructure. Defining infrastructure as code makes it possible to automate the whole IT infrastructure. IaC has many benefits: it allows faster execution when configuring infrastructure, helps reduce the cost and the risk associated with implementing infrastructure, provides full traceability of changes, and more.

There are several IaC tools on the market right now. Today I’m going to explain Terraform, which has caught the eye of many IT engineers.

According to the official Terraform documentation,

Terraform is a tool for building, changing, and versioning infrastructure safely and efficiently. Terraform can manage existing and popular service providers as well as custom in-house solutions.

It is an open-source IaC tool developed by HashiCorp. It is similar to CloudFormation, which is used to automate AWS infrastructure, but Terraform can be used with other cloud providers as well. Terraform creates an execution plan that explains how it will reach the desired state and then applies it to build the specified infrastructure. As the configuration changes, Terraform can determine what changed and create incremental execution plans that can be applied.

Fundamentals

Language

  • Input variables — Serve as parameters for a Terraform module, so users can customize behavior without editing the source
  • Modules — Act as containers for multiple resources that are used together; a way to package and reuse resource configurations
  • Resources — Describe the infrastructure objects to be managed, such as compute instances, networks, or DNS records
  • Data sources — Allow data to be fetched or computed for use elsewhere in the Terraform configuration
  • Output values — Return values from a Terraform module
  • Local values — A convenience feature for assigning a short name to an expression

Commands

  • terraform init — Initializes the working directory that contains the configuration files
  • terraform validate — Validates the configuration files in the directory
  • terraform plan — Creates an execution plan to reach the desired state of the infrastructure
  • terraform apply — Makes the changes to the infrastructure as defined in the plan
  • terraform destroy — Deletes all the infrastructure resources managed by the configuration

Let’s get started!


1. Set the AWS credentials

For this tutorial, I’m going to work with the AWS platform. To connect our Terraform code with our AWS account, we need to set up the AWS user credentials. Ensure that the user credentials you’re using have access to the resources you’re going to provision. Get hold of the AWS access key and the secret access key of an IAM user with the right permissions and save them in a secure place. We will need them later in the tutorial.

2. Install Terraform

As I’m using Linux, I’ll provide the commands for installing Terraform on a Debian-based Linux distribution in this tutorial. For other operating systems, please refer to the official Terraform installation documentation.

To add the HashiCorp GPG key —

curl -fsSL https://apt.releases.hashicorp.com/gpg | sudo apt-key add -

To add the official HashiCorp Linux repository —

sudo apt-add-repository "deb [arch=amd64] https://apt.releases.hashicorp.com $(lsb_release -cs) main"

Update and install —

sudo apt-get update && sudo apt-get install terraform

Verify the installation —

terraform -help

3. Write your first Terraform script

Since we’re dealing with AWS, we need to provide the credentials we retrieved earlier to connect to our AWS resources. To set the AWS credentials in your command-line session, run —

export AWS_ACCESS_KEY_ID=(access key)
export AWS_SECRET_ACCESS_KEY=(secret access key)

Terraform can build infrastructure on several platforms, or providers, such as AWS, Azure, Google Cloud, DigitalOcean, and others. Usually, the first step in using Terraform is to set up the provider you want to use. Make a file named aws.tf and insert the code below into it.
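A minimal provider block along the following lines will do (the region shown is only an example; use the region you want to deploy into):

provider "aws" {
  # Example region; change it to the region you deploy into.
  # Credentials are read from the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY
  # environment variables we exported above.
  region = "us-east-1"
}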

As the first step, we’ll learn how to deploy a simple EC2 instance on AWS. Terraform has well-written documentation covering its language, syntax, resources, etc. Below is the code you need to deploy a t2.micro instance.
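A minimal ec2.tf along these lines creates a t2.micro instance (the AMI ID and the Name tag below are placeholders; use an AMI that exists in your region):

resource "aws_instance" "example" {
  # Placeholder AMI ID; replace it with a valid AMI for your region.
  ami           = "ami-0123456789abcdef0"
  instance_type = "t2.micro"

  tags = {
    Name = "terraform-101-demo"
  }
}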

Note that this is just a simple code snippet to create an EC2 instance. There are many other attributes you can specify, such as the availability zone, security group, EBS information, key pair, etc. These attributes can be found on the respective resource pages of the official HashiCorp documentation.

(Please note that I’m using separate files for each resource because it’s easier to manage and understand. You can copy and paste the contents of both the aws.tf and ec2.tf files into one file called main.tf and get the same output.)

In a terminal, go to the folder where you created ec2.tf and aws.tf and run the terraform init command.

The terraform init command initializes a working directory containing Terraform configuration files. After creating a new Terraform configuration or cloning an existing one from version control, this is the first command you should run. It is safe to run this command more than once.

You can run terraform validate to make sure that your code is valid. If there are any syntax errors or missing data, this command will display them.

Our code is now initialized and validated. We can run terraform plan to see the execution plan. This is a great way to double-check changes before executing them. If you don’t specify a VPC or subnet IDs, this configuration will create the resources in the default VPC with the respective default values.

To actually create the resources, run terraform apply.

As you can see, we have now successfully created our first EC2 instance with the help of Terraform. If you navigate to your AWS console, you’ll see something like this.

Let’s say you need to change something in your instance. You can edit the ec2.tf file and run terraform plan again. It will show you what’s going to be created, modified, or deleted. If I want to add a new Env tag, this is how the change and the resulting execution plan look.
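As an illustration, the tags block of the instance would change roughly like this (the tag names and values are only examples); terraform plan will then report the tag change as an in-place update of the instance:

  tags = {
    Name = "terraform-101-demo"
    Env  = "Dev"   # newly added tag
  }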

4. How to write a module

Now that you have a basic idea of how Terraform works, in this section I’ll explain how you can write and use Terraform modules.

Imagine you’re working in a company that handles multiple AWS resources. For example, you have 5 instances for the Dev environment, another 5 instances for the QA environment, and so on. Instead of writing the same Terraform resource block over and over, we can create modules that act as templates and reuse that code.

In your current working directory, create a folder named “modules”. Inside it, create two folders named “dev-instance” and “qa-instance”. I’ll leave the GitHub link for this project at the end of the tutorial. You can find the file structure and all of these files in that repository.
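Assuming the file names used throughout this tutorial, the directory layout ends up looking roughly like this (check the linked repository for the exact structure):

.
├── aws.tf
├── ec2.tf
├── variables.tf
├── outputs.tf
└── modules/
    ├── dev-instance/
    │   ├── main.tf
    │   └── outputs.tf
    └── qa-instance/
        ├── main.tf
        └── outputs.tf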

To separate the modules from each other, I have specified a tag named “Dev” in the dev-instance module and “QA” in the qa-instance module. So, when we use these modules, these tags will be added to the respective resources. I have also changed the root EBS volume size in both modules.

dev-instance module main.tf file
qa-instance module main.tf file
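A sketch of what the two module files contain, assuming input variables named ami_id and instance_type (the volume sizes are illustrative; only the Env tag and the root volume size differ between the two modules):

# modules/dev-instance/main.tf

variable "ami_id" {}          # these could also live in a separate variables.tf inside the module
variable "instance_type" {}

resource "aws_instance" "dev" {
  ami           = var.ami_id
  instance_type = var.instance_type

  root_block_device {
    volume_size = 10          # illustrative size
  }

  tags = {
    Env = "Dev"
  }
}

# modules/qa-instance/main.tf

variable "ami_id" {}
variable "instance_type" {}

resource "aws_instance" "qa" {
  ami           = var.ami_id
  instance_type = var.instance_type

  root_block_device {
    volume_size = 20          # illustrative size
  }

  tags = {
    Env = "QA"
  }
}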

This is how the ec2.tf file changes when we use modules. Unlike in the previous section, we now use a module block to create resources. When you use a module block, you must provide the source of that module. In our case, these are the modules we defined locally earlier. You can provide different source types such as GitHub, Bitbucket, HTTP URLs, the Terraform Registry, etc.

ec2.tf file
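A sketch of the module-based ec2.tf, assuming the local module paths created above and the variables defined in variables.tf:

module "dev_server" {
  source        = "./modules/dev-instance"
  ami_id        = var.ami_id          # or a hard-coded AMI ID
  instance_type = var.instance_type
}

module "qa_server" {
  source        = "./modules/qa-instance"
  ami_id        = var.ami_id
  instance_type = var.instance_type
}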

As you can see, I have used variables in this script. Variables are defined in a separate file called variables.tf. You can use them anywhere in your script with the unique names you provide, e.g., var.ami_id and var.instance_type. You can provide a default AMI ID in the module definition or in the variables.tf file as a variable, or manually supply an ID as I have done here.

variables.tf file
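A sketch of variables.tf declaring the two variables used above (the default values are placeholders):

variable "ami_id" {
  description = "AMI ID for the EC2 instances"
  type        = string
  default     = "ami-0123456789abcdef0"   # placeholder; use a valid AMI for your region
}

variable "instance_type" {
  description = "EC2 instance type"
  type        = string
  default     = "t2.micro"
}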

The outputs.tf file is shown below. Each output value exported by a module must be declared using an output block. If you look at the modules’ output files, I have declared two outputs in each module, which return the instance ID and the public IP of the instance. Now that we’re using modules, we can refer to the output values by the respective module names, as shown below.

Syntax - module.<module-name>.<output-name-mentioned-in-module>

outputs.tf file
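A sketch of the root outputs.tf, assuming each module exports outputs named instance_id and public_ip from its own outputs.tf:

output "dev_instance_id" {
  value = module.dev_server.instance_id
}

output "dev_instance_public_ip" {
  value = module.dev_server.public_ip
}

output "qa_instance_id" {
  value = module.qa_server.instance_id
}

output "qa_instance_public_ip" {
  value = module.qa_server.public_ip
}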

It’s time to create these resources. You have to run terraform init again to initialize the newly created modules.

Run terraform validate to check whether the configuration is valid. If it is, run terraform plan to see the execution plan.

module.dev_server has the environment tag as “Dev” while module.qa_server has its environment tag as “QA” as we specified separately in the modules.

When you have an outputs.tf file, you will see the return values in the terraform plan output, as shown below.

When you’re done experimenting with Terraform, make sure to clean up the environment to prevent unnecessary costs. Simply run terraform destroy; it will display the resources that are going to be destroyed. If you’re certain, type yes and the specified resources will be destroyed.

GitHub link — https://github.com/Shani1116/terraform-101

Congratulations 😻 At last, you now have a basic understanding of using Terraform. But we’ve just begun to scratch the surface. In the future, I’ll publish more Terraform-related blog posts. What am I missing here? Let me know in the comments and I’ll add it in. Keep in touch for more cool stuff!
