Apache Airflow Tutorial – Part 4 DAG Patterns


In the previous parts of this series, I introduced Apache Airflow in general, demonstrated my docker dev stack, and built out a simple linear DAG definition. I want to wrap up the series by showing a few other common DAG patterns I regularly use.

In order to follow along, get the source code!

Bring up your Airflow Development Environment

unzip airflow-template.zip
cd airflow-template
docker-compose up -d
docker-compose logs airflow_webserver

This will take a few minutes to get everything initialized, but once it's up you will see something like this:

DAG Patterns

I use three main DAG patterns: Simple (shown in Part 3), Linear, and Gather. Of course, once you master these patterns, you can combine them to build much more complex pipelines.

Simple DAG Pattern

What I call a simple pattern (and I have no idea if any of these patterns have official names) is a chain of tasks where each task depends upon the previous task. In this...

Continue Reading...

Apache Airflow Tutorial – Part 3 Start Building


If you've read this far you should have a reasonable understanding of the Apache Airflow layout and be up and running with your own docker dev environment. Well done! This part of the series will cover building an actual simple pipeline in Airflow.

Start building by getting the source code!

Build a Simple DAG

The simplest DAG is just a list of tasks, where each task depends upon the previous one. If you've spun up the airflow instance and taken a look, it looks like this:

Now, if you're asking why I would choose making an ice cream sundae as my DAG, you may need to reevaluate your priorities.

Generally, if you order ice cream, the lovely deliverer of the ice cream will first ask you what kind of cone (or cup, you heathen) you want, then your flavor (or flavors!), then what toppings, and then will put them all together into sweet, creamy, cold deliciousness.

You would accomplish this awesomeness with the following Airflow code:


Continue Reading...

Apache Airflow Tutorial – Part 2 Install with Docker

Install Apache Airflow With Docker Overview

In this part of the series I will cover how to get a nice Apache Airflow instance up and running with docker. You won't need to have anything installed locally besides docker, which is fantastic, because configuring all these pieces individually would be kind of awful!

This is the exact same setup and configuration I use for my own Apache Airflow instances. When I run Apache Airflow in production I don't use Postgres in a docker container, as that is not recommended, but this setup is absolutely perfect for dev and will very closely match your production requirements!

Following along with a blog post is great, but the best way to learn is to just jump in and start building. Get the Apache Airflow Docker Dev Stack here.

Celery Job Queue

Getting an instance of Apache Airflow up and running looks very similar to setting up a Celery instance. This is because Airflow uses Celery behind the scenes to execute tasks. Read more...

Continue Reading...

Apache Airflow Tutorial – Part 1 Introduction

What is Apache Airflow?

Briefly, Apache Airflow is a workflow management system (WMS). It groups tasks into analyses, and defines a logical template for when these analyses should be run. Then it gives you all kinds of amazing logging, reporting, and a nice graphical view of your analyses. I'll let you hear it directly from the folks at Apache Airflow:

Apache Airflow is a platform to programmatically author, schedule and monitor workflows.

Use airflow to author workflows as directed acyclic graphs (DAGs) of tasks. The airflow scheduler executes your tasks on an array of workers while following the specified dependencies. Rich command line utilities make performing complex surgeries on DAGs a snap. The rich user interface makes it easy to visualize pipelines running in production, monitor progress, and troubleshoot issues when needed.

When workflows are defined as code, they become more maintainable, versionable, testable, and collaborative.

Source - ...

Continue Reading...

Setting up a Local Spark Development Environment using Docker

Every time I want to get started with new tech I figure out how to get a stack up and running that closely resembles a real-world production instance as much as possible. 

This is a get up and running post. It does not get into the nitty gritty details of developing with Spark, since I am only just getting comfortable with Spark myself. Mostly I wanted to get up and running, and write a post about some of the issues that came up along the way.


What is Spark?

Spark is a distributed computing library with support for Java, Scala, Python, and R. It's what I refer to as a world domination technology: you want to do lots of computations, and you want to do them fast. You can run everything from embarrassingly parallel computations, such as parallelizing a for loop, to complex workflows, with support for distributed machine learning as well. You can transparently scale out your computations not only to multiple cores, but even to multiple machines by creating a spark cluster....

Continue Reading...

Deploy a Celery Job Queue With Docker – Part 2 Deploy with Docker Swarm on AWS


In Part 1 of this series we went over the Celery Architecture, how to separate out the components in a docker-compose file, and laid the ground for deployment.

Deploy With AWS CloudFormation

This portion of the blog post assumes you have an ssh key set up. If you don't, go to the AWS docs here.

What is CloudFormation?

AWS CloudFormation is an infrastructure design tool that allows users to design their infrastructure by defining file systems, compute requirements, networking, etc. If you have no interest in designing infrastructure, you probably don't need to worry. CloudFormation configurations are shareable through templates.

Docker AWS CloudFormation

Getting Started

Docker has come to our rescue here, with a Docker for AWS CloudFormation template. This will, with the click of a few buttons, deploy a docker swarm on AWS for us!

Click on the page, and scroll down to quick start. Under 'Stable Channel' select '...

Continue Reading...

Deploy a Celery Job Queue With Docker – Part 1 Develop


In this post I will hopefully show you how to organize a large docker-compose project, specifically a project related to a job queue. In this instance we will use Celery, but hopefully you can see how the concepts relate to any project with a job queue, or just a large number of moving pieces.

This post will be in two parts. The first will give a very brief overview of Celery, the architecture of a Celery job queue, and how to set up a Celery task, a worker, and the Celery Flower interface with docker and docker-compose. Part 2 will go over deployment using docker swarm.


What is Celery?

Celery is a distributed job queuing system that allows us to queue up oodles of tasks and execute them as we have resources.

From celeryproject.org - 

Celery is an asynchronous task queue/job queue based on distributed message passing. It is focused on real-time operation, but supports scheduling as well.
The execution units, called tasks, are executed...
Continue Reading...

Deploy a Full Stack Web Application with Docker and Traefik

docker python traefik Feb 05, 2019


Have you ever had to deploy a full stack web application? With different services, some mapping to different ports, some static html, maybe a couple databases thrown in there? Have you ever banged your head against your desk trying to figure out how to expose different ports on a remote server, or through a cloud platform?

If you have, you understand

I'm going to go over how I deploy such an application, with nice path names instead of annoying port numbers, using traefik and docker. I'm not going to go into the specific code so much, but I may go crazy and do that at a later time. This setup makes it much easier to deploy your application to cloud platforms that don't expose ports besides 80 by default, because you don't need any other ports!

Get the Code

Like I said, I'm not going to go much into the code itself; this post is all about Traefik. But it's all here on github.

The final website, deployed to AWS, is here.


Continue Reading...

Docker Ecosystem

Uncategorized Feb 01, 2019

Containers, Compose, Swarm, Desktop?

Docker isn't simply a container or a virtual machine; it's an entire ecosystem. Luckily, each component is a layer that builds upon the previous one. There are definitely a lot of moving pieces, but from a bird's-eye perspective there are three main components.

This post is only meant to be a very brief overview. For more in-depth information check out the official getting started docs.

Docker - A single image

A docker image is very much like a virtual machine. If you haven't worked with virtual machines before, it's basically your computer, without a desktop. You can't run around clicking on start or finder icons, but you can install packages, start web servers, create users, run applications, and spin up databases all without installing any software to your local machine. Besides docker, of course!


  •  Web Applications - python flask, node.js express, static html sites, etc.
  •  Machine Learning...
Continue Reading...

Develop a Python Flask Application with Docker and Deploy it to AWS – Part 4 Deploy our docker images

Uncategorized Jan 18, 2019


Now that our docker-compose.yml file is all ready, we have almost all the pieces we need to deploy our application to AWS. The last thing is to deploy our docker images to a docker storage provider.

I personally like quay.io, but docker hub is an equally excellent choice that is rapidly improving.

Get the Code

If you haven't already, clone the github repo to see the full code.

Upload your docker containers

AWS can't work with images just hanging out on your OS, so you need to upload (or push) them to the cloud.

Create a Docker Account

Create an account on either quay.io or docker hub. It really doesn't matter where. Just pick one and go for it. Quay.io has a nice tutorial to get started pushing (uploading) and pulling (downloading) your docker images.

If you're like me, and you're really terrible at remembering things, you will want to save your password somewhere. Probably in a file. Possibly even in a file that is backed up...

Continue Reading...
