Create Terraform Array In a Loop

The other day, I needed to create a multi-SAN TLS certificate in Terraform. The DNS names ended in a number sequence, so they could easily be generated in a loop. Now, Terraform has the count and for_each meta-arguments to create multiple resources. But what if you want to create a variable in a loop? (I am sure some people are screaming Pulumi now.)

Luckily, Terraform has the null_resource available, which is exactly what I needed. It creates a virtual resource that does nothing, but its triggers argument holds a map of strings. So, you can define it in a loop with count and then use the results to declare a local variable.

Here’s example code:

locals {
  dns_sans = null_resource.dns_sans[*].triggers.dns_name
}

resource "null_resource" "dns_sans" {
  count = var.replicas

  triggers = {
    dns_name = "service-${count.index}.service-internal"
  }
}
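
The resulting local can then feed the certificate itself. Here is a minimal sketch of how that might look with the hashicorp/tls provider (the key and certificate request resources below are illustrative, not taken from my actual configuration):

resource "tls_private_key" "cert" {
  algorithm = "RSA"
  rsa_bits  = 4096
}

resource "tls_cert_request" "cert" {
  # The SANs built by the null_resource loop above
  dns_names       = local.dns_sans
  private_key_pem = tls_private_key.cert.private_key_pem

  subject {
    common_name = "service-internal"
  }
}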

This made my life easier. Maybe it will make yours too.

Start AWX Job Via API

If you are using Ansible, there is a high chance you are also using AWX (or Ansible Tower) to orchestrate your jobs. And in some cases, such as from your CI pipeline, you might want to trigger AWX jobs externally. Luckily, AWX has an API that allows you to do just that.

To run AWX jobs remotely, you will need to make three API calls: one to start the job itself, another to monitor its progress, and lastly one to print the output. You can see sample Bash code that does this below. To keep things simple, it uses an authentication token, but you could also use OAuth 2.

POLLING_SLEEP=30

# Helper for authenticated calls against the AWX API; extra arguments are passed through to curl
function api_request() {
  local method=$1 path=$2
  shift 2
  curl -s -X "${method}" "${AWX_API}${path}" \
    -H "Authorization: Bearer ${API_TOKEN}" \
    -H "Content-Type: application/json" \
    "$@"
}

function print_job_output() {
  api_request GET "/jobs/${id}/stdout/?format=txt"
}

response=$(
  curl -s -i -o - "${AWX_API}/job_templates/${TEMPLATE_ID}/launch/" \
    -H "Authorization: Bearer ${API_TOKEN}" \
    -H "Content-Type: application/json" \
    -XPOST
)

http_status=$(head -n 1 <<<"${response}" | awk '{print $2}')
body=$(grep '^{' <<<"${response}")

if [[ ${http_status} != 201 ]]; then
  echo "AWX returned error."
  exit 1
fi

id=$(jq <<<"${body}" '.id')
echo "Monitoring job ID: ${id}"
while true; do
  echo "Sleeping for ${POLLING_SLEEP}s"
  sleep $POLLING_SLEEP

  response="$(api_request GET "/jobs/${id}/" -i)"

  http_status=$(head -n 1 <<<"${response}" | awk '{print $2}')
  body=$(grep '^{' <<<"${response}")
  if [[ ${http_status} != 200 ]]; then
    echo "AWX returned error."
    exit 1
  fi

  status="$(jq -r '.status' <<<"${body}")"
  if [[ "${status}" == "failed" ]]; then
    echo "Job failed."
    print_job_output
    exit 1
  elif [[ "${status}" == "successful" ]]; then
    echo "Job has finished successfully."
    print_job_output
    exit 0
  fi
done

Postfix SASL Authentication in Alpine Linux

Most of the popular Postfix Docker images assume that you run the service as a local SMTP forwarder. Therefore, they do not bother with authentication. So, if you want to use Postfix as your central mail sending agent, you need to roll your own. This post will walk you through the setup of Postfix with SASL authentication on Alpine Linux, my container distro of choice.
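
To give you an idea of what is involved, the server-side part boils down to a handful of main.cf settings. A rough sketch, assuming the Cyrus SASL backend; the Alpine packages and the SASL user database setup are left out here:

# /etc/postfix/main.cf (sketch; Cyrus SASL backend assumed)
smtpd_sasl_auth_enable = yes
smtpd_sasl_type = cyrus
smtpd_sasl_path = smtpd
smtpd_sasl_security_options = noanonymous
smtpd_relay_restrictions = permit_sasl_authenticated, reject_unauth_destination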

Read more…

Choosing the Right Programming Font


Programmers and other IT professionals often customize their desktops to increase productivity. However, fonts often remain overlooked. Since you spend around 8-10 hours per day staring at a screen, it is an important choice. The right font increases legibility. That in turn helps battle eye fatigue and lowers the risk of typos (which can lead to nasty bugs). This article compares some of the most popular programming fonts.

Criteria for choosing a font

First of all, a programming font should be monospaced. I don't think this needs much explanation; the main reason is to keep the structures in your code aligned.

As mentioned above, the font should also be legible, especially when it comes to characters that are easily confused, such as lowercase l and uppercase I. To compare the various fonts, I am using a modified programming font test pattern from Martinus. It looks like this:

o0O s5S z2Z qg9 !|l1Iij {([|])} .,;: ``''""
a@#* vVuUwW <>;^°=-~ öÖüÜäÄßµ \/\/ -- == __
the quick brown fox jumps over the lazy dog
THE QUICK BROWN FOX JUMPS OVER THE LAZY DOG
0123456789 &-+@ for (int i=0; i<=j; ++i) {}

Last, but not least, you should consider the license. As more and more people use some mix of Linux, Windows and macOS to do their work, I have picked only open-source fonts. You can easily get those on any OS. This disqualifies some popular choices, such as Consolas and Menlo.

Read more…

Elasticsearch Index Lifecycle Management for Fluentd

External tools, such as Curator, used to be a necessity for managing Elasticsearch indexes. This has changed with the introduction of Index Lifecycle Management (ILM) in Elasticsearch 6.6, which has all but eliminated the need for other tools. While ILM has been developed primarily with Logstash in mind, you can also take advantage of it when using Fluentd. It works with both data streams and regular indexes. Because most people are probably familiar with the latter, this post will explain how to set up ILM for your Fluentd indexes.
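
To give a flavour of what ILM looks like, a policy is just a JSON document that you PUT to the cluster. A minimal sketch (the policy name, rollover thresholds and retention below are made up for illustration):

PUT _ilm/policy/fluentd-policy
{
  "policy": {
    "phases": {
      "hot": {
        "actions": {
          "rollover": { "max_size": "10gb", "max_age": "1d" }
        }
      },
      "delete": {
        "min_age": "30d",
        "actions": { "delete": {} }
      }
    }
  }
}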

Read more…

Control Whitespace in Ansible Templates

Ansible uses the powerful Jinja templating engine. However, the way it handles whitespace in templates is not ideal: by default, it preserves leading whitespace around Jinja blocks. So, you can either leave the Jinja syntax unindented, making the templates hard to comprehend, or accept broken indentation in the resulting file (not an option with whitespace-sensitive formats such as YAML).

Luckily, there is a third option. Jinja has two configuration options regarding whitespace:
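
For the impatient, those are trim_blocks and lstrip_blocks, and Ansible's template module exposes both of them per task. Here is a minimal sketch (the file names are illustrative):

# Render a template with indented Jinja blocks without leaking the indentation
- name: Render application config
  ansible.builtin.template:
    src: app.conf.j2
    dest: /etc/app/app.conf
    trim_blocks: true
    lstrip_blocks: true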

Read more…

Import Scaleway Infrastructure to Terraform

Terraform is one of the leading tools for managing your infrastructure as code. It defines your server instances and accompanying services using a simple declarative language. Moreover, the infrastructure state is kept in a separate file. So, whenever you make a change in your configuration, Terraform compares it to the current state and only performs the necessary changes.

Terraform has plugins for all the major IaaS providers, so you should be covered there. However, you are probably not starting from scratch, but already have some infrastructure running. Personally, I use Scaleway as my cloud provider, so I will show you how to import their resources into Terraform. I will demonstrate the process on a single server instance.
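
The core of that process is the terraform import command: you declare a matching resource block first and then pull the existing instance into the state by its ID. A rough sketch (the scaleway_instance_server resource type and the zone/ID format are assumptions about the Scaleway provider, so check its documentation):

# 1. Declare a resource block for the existing server in your .tf files
# 2. Import it into the state, then fill in the arguments until `terraform plan` is clean
terraform import scaleway_instance_server.web fr-par-1/<instance-id>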

Read more…

Deploy Docker Container from Gitlab CI

Containers are all the rage nowadays, and for a good reason. They help in unifying development and production environments. They also provide application encapsulation and isolation, among other things. But to get the most out of them, you should build and deploy them automatically. This post will show you how to do it using GitLab CI and docker-compose.
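
As a preview, the pipeline boils down to two stages: build and push the image, then deploy it. A rough sketch of the .gitlab-ci.yml (the stage names, tags and deploy step below are assumptions, not taken from the full post):

stages:
  - build
  - deploy

build-image:
  stage: build
  image: docker:latest
  services:
    - docker:dind
  script:
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
    - docker build -t "$CI_REGISTRY_IMAGE:latest" .
    - docker push "$CI_REGISTRY_IMAGE:latest"

deploy:
  stage: deploy
  script:
    - docker-compose pull
    - docker-compose up -d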

Read more…

Run Nikola Blog in Docker

A lot of you might have a blog or a personal website created by a static site generator. Thanks to their simple requirements (just a webserver, really), such sites are an ideal starting point for your dockerization journey. In this post, I will explain how to run a Nikola website in a container. Nikola powers this website and is my static generator of choice, but the steps should be fairly similar for other generators out there.

Dockerfile

The Dockerfile I am using looks like this:

FROM python:latest AS builder

# Copy the whole repository into Docker container
COPY . . 

# Build the blog
RUN pip install nikola \
    && nikola build


FROM nginx:alpine

# Copy output to the default nginx directory
COPY --from=builder output /usr/share/nginx/html

# Copy nginx host configuration
COPY nginx/default.conf /etc/nginx/conf.d/
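
Building and running the image locally is then just two commands (the image name and host port are arbitrary):

docker build -t nikola-blog .
docker run -d -p 8080:80 nikola-blog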

Read more…

Modify All Items in Ansible List

Ansible lets you easily interpolate list items within values (like interpolated_{{ item }}_value). However, sometimes you need a more powerful transformation. This is where the map filter comes to the rescue again. You can use it to perform a regular expression replace on each item in a list. As you can see, the syntax is relatively simple:

map('regex_replace', REGEX_PATTERN, OUTPUT)

For a concrete example, let us say you want to extract the network mask (prefix length) from a list of IP addresses in CIDR notation (192.168.0.100/24, for example). Assuming this list is stored in the ip_addresses variable, the regex replace would look like this:

{{ ip_addresses | map('regex_replace', '.*/([0-9]{1,2})', '\\1') | list }}
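
For completeness, here is how the filter might look inside a play; the variable values and task names below are made up for illustration:

- name: Demonstrate regex_replace over a list
  hosts: localhost
  gather_facts: false
  vars:
    ip_addresses:
      - 192.168.0.100/24
      - 10.0.0.5/16
  tasks:
    - name: Show the extracted prefix lengths
      ansible.builtin.debug:
        msg: "{{ ip_addresses | map('regex_replace', '.*/([0-9]{1,2})', '\\1') | list }}"

Running it prints the list ["24", "16"].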

Of course, you can easily use it as a part of longer Jinja2 pipelines. If you also want to learn how to loop over dictionary attributes, or see other Ansible tips, take a look here.