Multi-Cloud with Terraform – Provision any infra, anywhere

This post is about provisioning Amazon Web Services EC2 instances using Terraform, an Infrastructure as Code (IaC) tool. With automated Terraform scripts we can create reproducible infrastructure for Dev, UAT, SIT, and Production environments. Terraform also allows you to split your configuration across as many files as you wish.

We can run the same workflow against multiple cloud providers – Microsoft Azure, Amazon Web Services, GCP and so on. As an Infrastructure as Code tool, Terraform makes this task simple and easy to perform: we can use it to provision and manage any cloud, infrastructure, or service.

Terraform infrastructure is configured and provisioned using HCL – the HashiCorp Configuration Language. Below are the commands most commonly used when provisioning infrastructure:

Command   | Purpose
----------|---------------------------------------------------------------------
init      | Initialises the working directory
plan      | Creates an execution plan for the resources managed by the .tf files
apply     | Builds or changes the infrastructure defined in the configuration
version   | Prints the Terraform version
destroy   | Destroys the infrastructure managed by Terraform
validate  | Validates the Terraform files for syntax and internal consistency
graph     | Outputs a visual graph of Terraform resources
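
For example, the graph output can be rendered to an image with Graphviz (this assumes the dot tool from Graphviz is installed on your machine):

terraform graph | dot -Tsvg > graph.svg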

There can be multiple reasons for a client to go for a multi-cloud approach:

  • Cost considerations
  • Vendor diversity
  • Vendor leverage
  • Desire to use unique services
  • Need for a particular region (low latency)

Prerequisites

Once you have downloaded the Terraform binaries, make sure the environment variables are set up. I have set up my variables assuming I have placed the binary in the C folder. Run a version check on the command prompt to confirm the installation.
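
A minimal sketch of that setup from the Windows command prompt (the C:\terraform path is my assumption about where the binary was unzipped):

rem Add the Terraform folder to the PATH for the current session
set PATH=%PATH%;C:\terraform

rem Confirm the binary is picked up
terraform version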

Implementation – Demo for AWS provisioning

To keep the length of this post limited, and since most of the code is very similar, I will limit this demo to AWS. A similar approach can be adopted for Azure: Terraform has a separate provider for Microsoft Azure, which can be used to configure infrastructure in Microsoft Azure through the Azure Resource Manager APIs.

More info here – https://www.terraform.io/docs/providers/azurerm/index.html
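
To illustrate the point, here is a minimal sketch of an equivalent Azure provider configuration (the resource group name and location are placeholders, and authentication is assumed to come from the Azure CLI or service principal environment variables):

provider "azurerm" {}

# A simple resource group, the usual starting point for Azure resources
resource "azurerm_resource_group" "demo" {
  name     = "demo-rg"
  location = "West Europe"
}

The provider block and resources differ per cloud, but the init/plan/apply workflow stays the same.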

This configuration provisions infrastructure on Amazon Web Services. I will post the .tf file for AWS below; it needs to be executed with an AWS access key and secret access key configured.

I have added the Azure Terraform extension to Visual Studio Code for IntelliSense, but you can use any other tool for the same.

Note – AWS Access Key and AWS Secret Access Key should be configured on the host running this Terraform configuration.

export AWS_ACCESS_KEY_ID=XXXXXXXXXXXXXXXXXXXX
export AWS_SECRET_ACCESS_KEY=XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX

Below is the code required to provision the AWS infrastructure:

# Configure the AWS provider; credentials come from the environment variables above
provider "aws" {
  region = "eu-west-1"
}

# VPC with public subnets spread across three availability zones
module "aws_vpc" {
  source  = "terraform-aws-modules/vpc/aws"
  version = "1.5.1"

  name                 = "${var.configuration_name}-vpc"
  cidr                 = "10.0.0.0/16"
  enable_dns_hostnames = true
  enable_dns_support   = true
  azs                  = ["eu-west-1a", "eu-west-1b", "eu-west-1c"]
  public_subnets       = ["10.0.101.0/24", "10.0.102.0/24", "10.0.103.0/24"]
}

# Auto Scaling Group running three t2.nano instances behind the ELB
module "aws_asg" {
  source  = "terraform-aws-modules/autoscaling/aws"
  version = "2.0.0"

  name            = "${var.configuration_name}-asg"
  image_id        = "${data.aws_ami.amazon_linux.id}"
  instance_type   = "t2.nano"
  security_groups = ["${aws_security_group.sg.id}"]
  user_data       = "${data.template_file.web_server_aws.rendered}"
  load_balancers  = ["${module.aws_elb.this_elb_id}"]

  root_block_device = [
    {
      volume_size = "8"
      volume_type = "gp2"
    },
  ]

  vpc_zone_identifier = "${module.aws_vpc.public_subnets}"

  health_check_type         = "EC2"
  min_size                  = 3
  max_size                  = 3
  desired_capacity          = 3
  wait_for_capacity_timeout = 0
}

# Classic ELB listening for HTTP traffic on port 80
module "aws_elb" {
  source  = "terraform-aws-modules/elb/aws"
  version = "1.4.1"

  name            = "elb"
  subnets         = ["${module.aws_vpc.public_subnets}"]
  security_groups = ["${aws_security_group.sg.id}"]
  internal        = false

  listener = [
    {
      instance_port     = "80"
      instance_protocol = "HTTP"
      lb_port           = "80"
      lb_protocol       = "HTTP"
    },
  ]

  health_check = [
    {
      target              = "HTTP:80/"
      interval            = 30
      healthy_threshold   = 2
      unhealthy_threshold = 2
      timeout             = 5
    },
  ]
}

# Security group allowing inbound HTTP and all outbound TCP/UDP traffic
resource "aws_security_group" "sg" {
  name        = "${var.configuration_name}-sg"
  description = "security group for ${var.configuration_name}"
  vpc_id      = "${module.aws_vpc.vpc_id}"

  ingress {
    from_port   = 80
    to_port     = 80
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  egress {
    from_port   = 0
    to_port     = 65535
    protocol    = "udp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  egress {
    from_port   = 0
    to_port     = 65535
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }
}

# Most recent Amazon Linux AMI, used as the ASG image
data "aws_ami" "amazon_linux" {
  most_recent = true

  filter {
    name   = "name"
    values = ["amzn-ami-hvm-*-x86_64-gp2"]
  }

  filter {
    name   = "owner-alias"
    values = ["amazon"]
  }
}

# Renders the web server bootstrap script passed in as user data
data "template_file" "web_server_aws" {
  template = "${file("${path.module}/web-server.tpl")}"

  vars {
    cloud = "aws"
  }
}

data "aws_availability_zones" "available" {
  state = "available"
}

The Terraform commands to execute:

  1. Initialise Terraform: terraform init
  2. Execute a plan: terraform plan -out=1.tfplan
  3. Execute an apply: terraform apply 1.tfplan
  4. Destroy once you are done with the task in hand and no longer need the resources: terraform destroy

Upcoming Demo – Azure provisioning

I will cover this in a separate post, as a demo with screenshots.

Azure Active Directory – Bulk user import to a new organisation via .csv with PowerShell

In this demo post, the objective is to use PowerShell to do a bulk user import into Azure Active Directory. We have a specific TenantId for an organisation, and we would like to use PowerShell since it makes things a bit easier.

As a test sample, we will take the 10 users listed in the .csv file to be imported via PowerShell.

For this demo, we create the new organisation below, with a separate domain.

  • Organization Name – VanillaCaffeine
  • Domain Name – VMADDemo.OnMicrosoft.com



Open Windows PowerShell ISE with admin rights and install the module below. It will be used to connect to Azure Active Directory from your local machine.

Install-Module AzureAD

We connect to the Azure cloud and check the tenant id for this demo post.
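
A quick sketch of that check once the module is installed (sign in interactively with an account that can manage users):

# Sign in to Azure AD
Connect-AzureAD

# Show the tenant details; ObjectId here is the TenantId used in the script below
Get-AzureADTenantDetail | Select-Object ObjectId, DisplayName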


The PowerShell below reads the .csv file from your local drive, iterates through the collection, and imports each entry into Azure Active Directory.

The PowerShell scripts and template are on GitHub:

https://github.com/varunmaggo/PowerShellScripts
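
For reference, here is a minimal sample of the expected .csv layout. The column names are inferred from the script below; the values are placeholders only:

DisplayName,MailNickName,UserPrincipalName,PasswordProfile,City,Country,Department,JobTitle,Mobile
Jane Doe,jane.doe,jane.doe@VMADDemo.OnMicrosoft.com,P@ssw0rd123!,London,United Kingdom,IT,Engineer,07700900000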

[CmdletBinding()]
Param(
    [Parameter(Position = 0, Mandatory = $True, HelpMessage = 'CSV file')]
    [Alias('CSVFile')]
    [string]$FilePath,
    [Parameter(Position = 1, Mandatory = $false, HelpMessage = 'Put Credentials')]
    [Alias('Cred')]
    [PSCredential]$Credential,
    # MFA account for the Azure AD account
    [Parameter(Position = 2, Mandatory = $false, HelpMessage = 'MFA enabled?')]
    [Alias('2FA')]
    [Switch]$MFA,
    [Parameter(Position = 3, Mandatory = $false, HelpMessage = 'Azure AD Group Name')]
    [Alias('AADGN')]
    [string]$AadGroupName
)

Function Install-AzureAD {
    Set-PSRepository -Name PSGallery -InstallationPolicy Trusted -Verbose:$false
    Install-Module -Name AzureAD -AllowClobber -Verbose:$false
}

# Import the user entries from the CSV file
Try {
    $CSVData = @(Import-CSV -Path $FilePath -ErrorAction Stop)
    Write-Verbose "Successfully imported entries from $FilePath"
    Write-Verbose "Total no. of entries in CSV are : $($CSVData.count)"
}
Catch {
    Write-Verbose "Failed to read from the CSV file $FilePath. Exiting!"
    Break
}

# Load the AzureAD module, installing it first if it is missing
Try {
    Import-Module -Name AzureAD -ErrorAction Stop -Verbose:$false | Out-Null
}
Catch {
    Write-Verbose "Azure AD PowerShell Module not found…"
    Write-Verbose "Installing Azure AD PowerShell Module…"
    Install-AzureAD
}

Try {
    Write-Verbose "Connecting to Azure AD…"
    if ($MFA) {
        Connect-AzureAD -TenantId efcb2733-e012-4628-bae4-a96147285b5a -ErrorAction Stop | Out-Null
    }
    Else {
        Connect-AzureAD -TenantId efcb2733-e012-4628-bae4-a96147285b5a
    }
}
Catch {
    Write-Verbose "Cannot connect to Azure AD. Please check your credentials. Exiting!"
    Break
}

Foreach ($Entry in $CSVData) {
    # Verify that mandatory properties are defined for each object
    $DisplayName = $Entry.DisplayName
    $MailNickName = $Entry.MailNickName
    $UserPrincipalName = $Entry.UserPrincipalName
    $Password = $Entry.PasswordProfile

    If (!$DisplayName) {
        Write-Warning '$DisplayName is not provided. Continue to the next record'
        Continue
    }
    If (!$MailNickName) {
        Write-Warning '$MailNickName is not provided. Continue to the next record'
        Continue
    }
    If (!$UserPrincipalName) {
        Write-Warning '$UserPrincipalName is not provided. Continue to the next record'
        Continue
    }
    If (!$Password) {
        # No password in the CSV: prompt for one and convert it back to plain text for the profile
        Write-Warning "Password is not provided for $DisplayName in the CSV file!"
        $Password = Read-Host -Prompt "Enter desired Password" -AsSecureString
        $BSTR = [System.Runtime.InteropServices.Marshal]::SecureStringToBSTR($Password)
        $Password = [System.Runtime.InteropServices.Marshal]::PtrToStringAuto($BSTR)
        $PasswordProfile = New-Object -TypeName Microsoft.Open.AzureAD.Model.PasswordProfile
        $PasswordProfile.Password = $Password
        $PasswordProfile.EnforceChangePasswordPolicy = 1
        $PasswordProfile.ForceChangePasswordNextLogin = 1
    }
    Else {
        $PasswordProfile = New-Object -TypeName Microsoft.Open.AzureAD.Model.PasswordProfile
        $PasswordProfile.Password = $Password
        $PasswordProfile.EnforceChangePasswordPolicy = 1
        $PasswordProfile.ForceChangePasswordNextLogin = 1
    }

    Try {
        New-AzureADUser -DisplayName $DisplayName `
            -AccountEnabled $true `
            -MailNickName $MailNickName `
            -UserPrincipalName $UserPrincipalName `
            -PasswordProfile $PasswordProfile `
            -City $Entry.City `
            -Country $Entry.Country `
            -Department $Entry.Department `
            -JobTitle $Entry.JobTitle `
            -Mobile $Entry.Mobile | Out-Null
        Write-Verbose "$DisplayName : AAD Account is created successfully!"

        # Optionally add the new user to an Azure AD group
        If ($AadGroupName) {
            Try {
                $AadGroupID = Get-AzureADGroup -SearchString "$AadGroupName"
            }
            Catch {
                Write-Error "$AadGroupName : does not exist. $_"
                Break
            }
            $ADuser = Get-AzureADUser -ObjectId "$UserPrincipalName"
            Add-AzureADGroupMember -ObjectId $AadGroupID.ObjectID -RefObjectId $ADuser.ObjectID
            Write-Verbose "Assigning the user $DisplayName to Azure AD Group $AadGroupName"
        }
    }
    Catch {
        Write-Error "$DisplayName : Error occurred $_"
    }
}

Once authentication is successful, we can log in to the Azure portal to cross-check the list of users. In our sandbox area, all the sampled users are present under Users in Azure Active Directory.
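
You can also cross-check from PowerShell itself; a small sketch, assuming the demo domain created above:

# List the imported demo users by filtering on the demo domain
Get-AzureADUser -All $true |
    Where-Object { $_.UserPrincipalName -like '*@VMADDemo.OnMicrosoft.com' } |
    Select-Object DisplayName, UserPrincipalName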

Azure DevOps Build Pipeline – OWASP Dependency Check with universal packages

G’day, this blog post is about adding an OWASP Dependency Check task to an Azure build pipeline, with universal packages used to hold all the build output. The SonarQube plugin runs through MSBuild.exe and will handle the quality checks for continuous integration.

OWASP Dependency Check – the purpose of Dependency Check is to scan your dependencies for known vulnerabilities. It works cross-platform, integrates well with SonarQube, and works both in Azure DevOps (online) and Azure DevOps Server (on-premises).

This plugin can be downloaded here – https://marketplace.visualstudio.com/items?itemName=InfoSupport.infosupport-owasp-dependecy-checker

For this demo, we create a new project in Azure DevOps, LegacyToVogue, and scope it to the build pipeline for the continuous integration part.

We have an empty Azure repository here, so we will quickly import the code from GitHub.

After importing the source code, we have all the required files from GitHub in the repository.

Once the source code is available, we will create the Azure build pipeline and later add the required plugins. Please also head over to the Azure Marketplace and download the required plugins.

For universal packages – according to Microsoft's definition, Universal Packages store one or more files together in a single unit that has a name and version. You can publish Universal Packages from the command line by using the Azure CLI. In simple terms, we will use one to hold the build output and later consume it from the release pipeline.
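
As a sketch of that command-line route (the organisation URL, feed, and package names are placeholders; this assumes the Azure DevOps extension for the Azure CLI is installed and you are signed in):

az artifacts universal publish \
  --organization https://dev.azure.com/<your-org> \
  --feed legacytovogue-feed \
  --name build-output \
  --version 0.0.1 \
  --description "Build output for the release pipeline" \
  --path ./drop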

Let's create an artefacts feed; we will use this for storing the build output.



Let's move over to our Azure build pipeline and add a new task for OWASP Dependency Check.



Let's also add the Universal Packages task, to ensure we are storing the build output with all the required dependencies for the release pipeline. Please have the destination feed ready in Azure Artifacts.
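
For orientation, a rough YAML sketch of these two tasks. The OWASP task name and inputs vary by extension and version, so treat them as assumptions and copy the exact YAML from the pipeline task assistant; the feed and package names are the placeholders used above:

steps:
  # OWASP Dependency Check scan (task name and inputs are assumptions; verify against the installed extension)
  - task: dependency-check-build-task@6
    inputs:
      projectName: 'LegacyToVogue'
      scanPath: '$(Build.SourcesDirectory)'
      format: 'HTML'

  # Publish the build output as a Universal Package to the artefacts feed created earlier
  - task: UniversalPackages@0
    inputs:
      command: 'publish'
      publishDirectory: '$(Build.ArtifactStagingDirectory)'
      feedsToUsePublish: 'internal'
      vstsFeedPublish: 'legacytovogue-feed'
      vstsFeedPackagePublish: 'build-output'
      versionOption: 'patch'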

Once the tasks are in place and your pipeline's YAML matches the above, we trigger the build and review the outcome.