Recently I had a requirement that I couldn’t find documented outside of the abstract: migrating a single private DNS zone to AWS’ hosted DNS service, Route 53, and conditionally forwarding queries for that zone from an existing Windows DNS infrastructure. This isn’t something I expected to be broken down blow by blow in the AWS documentation, but there are plenty of Windows DNS infrastructures out there in the wild and . . .
In a previous post we looked at setting up centralised Terraform state management using S3 for AWS provisioning (as well as using Azure Object Storage for the same solution in Azure before that). What our S3 solution lacked, however, was a means to achieve State Locking, i.e. a method to prevent two operators or systems from writing to a state at the same time and thus running the risk of . . .
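The full locking setup is in the post itself; as a rough sketch, Terraform’s S3 backend can be pointed at a DynamoDB table whose LockID key it uses to hold the lock before writing state. All bucket, key and table names below are placeholders:

```hcl
# Hypothetical names throughout. The S3 backend acquires a lock row
# (keyed on LockID) in the DynamoDB table before any write to state.
terraform {
  backend "s3" {
    bucket         = "example-terraform-states"
    key            = "environments/dev/terraform.tfstate"
    region         = "eu-west-2"
    encrypt        = true
    dynamodb_table = "terraform-state-lock"
  }
}

# The lock table itself just needs a string hash key named LockID.
# In practice it is usually created ahead of time, outside the
# configuration that uses it as a backend.
resource "aws_dynamodb_table" "terraform_state_lock" {
  name         = "terraform-state-lock"
  billing_mode = "PAY_PER_REQUEST"
  hash_key     = "LockID"

  attribute {
    name = "LockID"
    type = "S"
  }
}
```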
Previously I’ve looked at how to look up secrets from HashiCorp Vault using Ansible Tower, however whilst that functionality is incredibly valuable it doesn’t really tackle the issue of how to write Playbooks which can interact with Vault. In this post we’ll look at how we can use some excellent lookup functionality provided as part of Ansible which fills that gap. Some Assumptions For this article, I’m going to be . . .
In a previous post we looked at a method to use Terraform’s output function to export return data and load it into an external YAML file for consumption by Ansible. While this is a useful function it’s a little top-heavy, and if we just want to pass data into another Terraform configuration in order to run an apply operation, we have a means to work a lot more . . .
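The exact approach the post lands on is cut off above; one common way to consume another configuration’s outputs directly, without an intermediate file, is the terraform_remote_state data source. The bucket, key and output names here are purely illustrative:

```hcl
# Read the outputs of a separate "network" configuration straight from
# its remote state, rather than exporting them to a YAML file first.
data "terraform_remote_state" "network" {
  backend = "s3"
  config = {
    bucket = "example-terraform-states"   # hypothetical bucket
    key    = "network/terraform.tfstate"  # hypothetical state key
    region = "eu-west-2"
  }
}

resource "aws_instance" "app" {
  ami           = "ami-0123456789abcdef0" # placeholder AMI
  instance_type = "t3.micro"
  # Assumes the network configuration exposes a subnet_id output.
  subnet_id     = data.terraform_remote_state.network.outputs.subnet_id
}
```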
UPDATED 11/2020: Have a look at a different method for this configuration better suited to CI/CD. In a previous post we looked at how to use Terraform to provision and authenticate with clusters in AWS’ Elastic Kubernetes Service (EKS), using its somewhat unique webhook token authentication method leveraging aws-iam-authenticator. Once we get past that point, however, we still have another permission hurdle to overcome, specifically how we handle . . .
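The specific hurdle is truncated in the excerpt above, but the usual one at this stage is mapping IAM principals to Kubernetes RBAC via the aws-auth ConfigMap. A minimal sketch using Terraform’s kubernetes provider (the role ARN, username and group mapping are all assumptions) might look like:

```hcl
# Sketch only: map an IAM role onto a Kubernetes group so that principals
# assuming the role get RBAC permissions inside the EKS cluster.
# Assumes a kubernetes provider already configured against the cluster.
resource "kubernetes_config_map" "aws_auth" {
  metadata {
    name      = "aws-auth"
    namespace = "kube-system"
  }

  data = {
    mapRoles = yamlencode([
      {
        rolearn  = "arn:aws:iam::111111111111:role/eks-admins" # hypothetical role
        username = "eks-admin"
        groups   = ["system:masters"]
      }
    ])
  }
}
```

Note that EKS typically creates aws-auth itself once nodes join the cluster, so in a real configuration this resource often has to be imported or managed with care.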
NOTE: The sample code used here is hosted in my GitHub here. Recently I’ve been getting my hands dirtier and dirtier with Kubernetes, but there are some interesting oddities that only occur in Elastic Kubernetes Service (EKS), the AWS PaaS Kubernetes platform, especially when it comes to how you can authenticate. As Kubernetes is strongly driven by a declarative (and by extension Infrastructure as Code) philosophy, it makes perfect sense that . . .
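As a rough illustration of the authentication quirk (the full walkthrough is in the post and the linked repository), Terraform can fetch the short-lived IAM-derived token for the cluster, the same token aws-iam-authenticator would produce, and feed it to the kubernetes provider. The cluster name below is a placeholder:

```hcl
# Look up the cluster endpoint, CA certificate and a short-lived
# webhook token, then use them to configure the kubernetes provider.
# "example-cluster" is hypothetical.
data "aws_eks_cluster" "this" {
  name = "example-cluster"
}

data "aws_eks_cluster_auth" "this" {
  name = "example-cluster"
}

provider "kubernetes" {
  host                   = data.aws_eks_cluster.this.endpoint
  cluster_ca_certificate = base64decode(data.aws_eks_cluster.this.certificate_authority[0].data)
  token                  = data.aws_eks_cluster_auth.this.token
}
```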
Vault offers an array of flexible storage backends with a view to providing a highly available location to store secrets. This is a great baked-in design choice, as once you make Vault an integral part of your infrastructure you can ill afford a sudden outage. A perfect platform for storing structured data is, of course, an RDBMS (Relational Database Management System), as many of the mainstays are scalable and . . .
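To give a flavour of what that looks like (all connection details here are invented), the storage backend is declared in Vault’s HCL server configuration file, for example using the MySQL backend:

```hcl
# Hypothetical values throughout: point Vault's storage at a MySQL
# database so the RDBMS provides the durability behind the secrets.
storage "mysql" {
  address  = "mysql.example.internal:3306"
  username = "vault"
  password = "example-only-use-a-real-secret"
  database = "vault"
}

listener "tcp" {
  address     = "0.0.0.0:8200"
  tls_disable = 1 # lab sketch only; terminate TLS properly anywhere real
}
```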
In a previous post we looked at how to build Azure infrastructure with Terraform, handle sensitive secrets by storing them within Vault, and centrally manage states within Azure Object Storage (confusingly called Containers). In this post we’ll take a look at the same solution but leverage the same technology within AWS, making use of AWS’ S3 object storage platform and using Terraform to provision further AWS resources. Sample code for . . .
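As a sketch of how those pieces fit together in the AWS version (Vault address, secret path and bucket names are all placeholders), the state lives in an S3 backend while the AWS provider’s credentials are read from Vault rather than hard-coded:

```hcl
# Centralise state in S3 rather than on whichever machine runs
# terraform apply. Bucket and key are hypothetical.
terraform {
  backend "s3" {
    bucket = "example-terraform-states"
    key    = "aws/core/terraform.tfstate"
    region = "eu-west-2"
  }
}

provider "vault" {
  address = "https://vault.example.internal:8200" # hypothetical Vault server
}

# Hypothetical secret path holding AWS credentials as key/value pairs.
data "vault_generic_secret" "aws_creds" {
  path = "secret/aws"
}

provider "aws" {
  region     = "eu-west-2"
  access_key = data.vault_generic_secret.aws_creds.data["access_key"]
  secret_key = data.vault_generic_secret.aws_creds.data["secret_key"]
}
```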
Previously we looked at implementing a CI/CD pipeline using both Terraform and Ansible for provisioning and Configuration Management. In that deployment we relied on an official Python Docker image to build our Ansible environment; however, this added a few top-heavy steps that could be avoided by creating our own Docker image instead. The sample code for this post is in my GitHub here. Speeding up . . .
In previous posts we looked at a basic example of creating Immutable Infrastructure via BitBucket Pipelines using Terraform, as well as why we would want to use Immutable Infrastructure and what benefits it brings. However, we didn’t look at how to extend the pipeline into Configuration Management. We’re going to look at that now, leveraging Ansible within the pipeline to automatically configure the instances we create immediately after they . . .