Dask on HPC

Recently I saw that Dask, a distributed computing library for Python, created some really handy wrappers for running Dask projects on a High Performance Computing (HPC) cluster.

Most people who use HPC are pretty well versed in technologies like MPI, and just generally abusing multiple compute nodes all at once, but I think technologies like Dask are really going to be game changers in the way we all work. Because really, who wants to write MPI code or vectorize?

If you've never heard of Dask and its awesomeness before, I think the easiest way to get started is to look at their Embarrassingly Parallel example, and don't listen to the haters who think speeding up for loops is lame. It's a superpower!
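To see what that looks like in practice, here is a minimal sketch of speeding up a for loop with dask.delayed (the slow_square function below is just a stand-in for real work):

import time

import dask
from dask import delayed


def slow_square(x):
    time.sleep(1)  # pretend this is real work
    return x ** 2


# Build the task graph lazily; nothing runs yet.
lazy_results = [delayed(slow_square)(i) for i in range(10)]

# Execute the whole graph in parallel and collect the results.
results = dask.compute(*lazy_results)
print(results)
Python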

Onward with examples!

Client and Scheduler

First off, these examples are all pretty much borrowed from the Dask Job Queues page. What you do is write your Python code as usual. Then, when you need to scale across nodes, you leverage your HPC scheduler to get you some...
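To give a flavor of what that looks like, here is a minimal sketch using dask-jobqueue with a SLURM scheduler; the queue name, cores, memory, and walltime are placeholders you would swap for your own cluster's settings:

# Spin up Dask workers as SLURM jobs with dask-jobqueue.
# Queue name, cores, memory, and walltime are placeholders.
from dask.distributed import Client
from dask_jobqueue import SLURMCluster

cluster = SLURMCluster(
    queue="general",      # your cluster's partition/queue
    cores=4,              # cores per worker job
    memory="16GB",        # memory per worker job
    walltime="01:00:00",
)
cluster.scale(jobs=10)    # ask the scheduler for 10 worker jobs

client = Client(cluster)  # point Dask at the new cluster

# From here on, regular Dask code runs across the HPC nodes.
Python

There are matching cluster classes for PBS, SGE, LSF, and friends, so the same pattern carries over to whichever scheduler your site runs.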

Continue Reading...

Apache Airflow Tutorial – Part 4 DAG Patterns

Overview

In the previous parts of this series, I introduced Apache Airflow in general, demonstrated my docker dev stack, and built out a simple linear DAG definition. I want to wrap up the series by showing a few other common DAG patterns I regularly use.

In order to follow along, get the source code!

Bring up your Airflow Development Environment

unzip airflow-template.zip
cd airflow-template
docker-compose up -d
docker-compose logs airflow_webserver
Bash

This will take a few minutes to get everything initialized, but once it's up you will see something like this:

DAG Patterns

I use 3 main DAG patterns: Simple (shown in Part 3), Linear, and Gather. Of course, once you master these patterns, you can combine them to make much more complex pipelines.
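To make those names concrete, here is a rough sketch of how the dependency wiring tends to look; the task names are placeholders, and reading "Gather" as several tasks fanning into one downstream task is my own interpretation:

# Rough sketch of the dependency wiring (Airflow 1.x style imports).
from datetime import datetime

from airflow import DAG
from airflow.operators.dummy_operator import DummyOperator

dag = DAG(
    "dag_patterns",
    start_date=datetime(2019, 1, 1),
    schedule_interval="@daily",
)

step_1 = DummyOperator(task_id="step_1", dag=dag)
step_2 = DummyOperator(task_id="step_2", dag=dag)
step_3 = DummyOperator(task_id="step_3", dag=dag)

# Simple: a chain where each task depends on the previous one.
step_1 >> step_2 >> step_3

branch_a = DummyOperator(task_id="branch_a", dag=dag)
branch_b = DummyOperator(task_id="branch_b", dag=dag)
gather = DummyOperator(task_id="gather", dag=dag)

# Gather: independent tasks all feeding into a single downstream task.
[branch_a, branch_b] >> gather
Python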

Simple DAG Pattern

What I call a simple pattern (and I have no idea if any of these patterns have official names) is a chain of tasks where each task depends upon the previous task. In this...

Continue Reading...

Apache Airflow Tutorial – Part 3 Start Building

Overview

If you've read this far you should have a reasonable understanding of the Apache Airflow layout and be up and running with your own docker dev environment. Well done! This part of the series will cover building an actual, simple pipeline in Airflow.

Start building by getting the source code!

Build a Simple DAG

The simplest DAG is just a list of tasks, where each task depends upon the previous one. If you've spun up the Airflow instance and taken a look, it looks like this:

Now, if you're asking why I would choose making an ice cream sundae as my DAG, you may need to reevaluate your priorities.

Generally, if you order ice cream, the lovely deliverer of the ice cream will first ask you what kind of cone (or cup, you heathen) you want, then your flavor (or flavors!), then what toppings, and then will put them all together into sweet, creamy, cold deliciousness.

You would accomplish this awesomeness with the following Airflow code:
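A minimal sketch of what that DAG could look like is below; the BashOperator commands and task names are my own placeholders, not necessarily the post's exact code:

# Ice cream sundae DAG sketch (Airflow 1.x style imports).
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

dag = DAG(
    "ice_cream_sundae",
    start_date=datetime(2019, 1, 1),
    schedule_interval="@daily",
)

get_cone = BashOperator(task_id="get_cone", bash_command="echo 'waffle cone'", dag=dag)
add_ice_cream = BashOperator(task_id="add_ice_cream", bash_command="echo 'two scoops'", dag=dag)
add_toppings = BashOperator(task_id="add_toppings", bash_command="echo 'sprinkles'", dag=dag)
serve = BashOperator(task_id="serve", bash_command="echo 'enjoy'", dag=dag)

# Each task depends on the previous one: cone, then ice cream, then toppings, then serve.
get_cone >> add_ice_cream >> add_toppings >> serve
Python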

Now,...

Continue Reading...

Apache Airflow Tutorial – Part 2 Install with Docker

Install Apache Airflow With Docker Overview

In this part of the series I will cover how to get a nice Apache Airflow instance up and running with docker. You won't need to have anything installed locally besides docker, which is fantastic, because configuring all these pieces individually would be kind of awful!

This is the exact same setup and configuration I use for my own Apache Airflow instances. When I run Apache Airflow in production I don't use Postgres in a docker container, as that is not recommended, but this setup is absolutely perfect for dev and will very closely match your production requirements!

Following along with a blog post is great, but the best way to learn is to just jump in and start building. Get the Apache Airflow Docker Dev Stack here.

Celery Job Queue

Getting an instance of Apache Airflow up and running looks very similar to a Celery instance. This is because Airflow uses Celery behind the scenes to execute tasks. Read more...

Continue Reading...

Apache Airflow Tutorial – Part 1 Introduction

What is Apache Airflow?

Briefly, Apache Airflow is a workflow management system (WMS). It groups tasks into analyses, and defines a logical template for when these analyses should be run. Then it gives you all kinds of amazing logging, reporting, and a nice graphical view of your analyses. I'll let you hear it directly from the folks at Apache Airflow:

Apache Airflow is a platform to programmatically author, schedule and monitor workflows.

Use airflow to author workflows as directed acyclic graphs (DAGs) of tasks. The airflow scheduler executes your tasks on an array of workers while following the specified dependencies. Rich command line utilities make performing complex surgeries on DAGs a snap. The rich user interface makes it easy to visualize pipelines running in production, monitor progress, and troubleshoot issues when needed.

When workflows are defined as code, they become more maintainable, versionable, testable, and collaborative.

Source - ...

Continue Reading...

Setting up a Local Spark Development Environment using Docker

Introduction

This is a get-up-and-running post. It does not get into the nitty-gritty details of developing with Spark, since I am only just getting comfortable with Spark myself. Mostly I wanted to get up and running, and write a post about some of the issues that came up along the way.

 

What is Spark?

Spark is a distributed computing framework with support for Java, Scala, Python, and R. It's what I refer to as a world domination technology: you want to do lots of computations, and you want to do them fast. You can run anything from embarrassingly parallel computations, such as parallelizing a for loop, to complex workflows, with support for distributed machine learning as well. You can transparently scale out your computations to not only multiple cores, but even multiple machines by creating a Spark cluster. How cool is that?
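For a quick taste, here is a minimal PySpark sketch that runs an embarrassingly parallel computation on all local cores; the master URL and the toy computation are placeholders:

from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .master("local[*]")      # use all local cores; swap for a cluster URL to scale out
    .appName("spark-sketch")
    .getOrCreate()
)

sc = spark.sparkContext

# Embarrassingly parallel: square some numbers across the available cores.
squares = sc.parallelize(range(10)).map(lambda x: x * x).collect()
print(squares)

spark.stop()
Python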

My favorite introduction to Spark and the Spark ecosystem is here at the MapR blog.

 

Why should I learn Spark?

Well, I'm not sure...

Continue Reading...

Deploy a Celery Job Queue With Docker – Part 2 Deploy with Docker Swarm on AWS

Overview

In Part 1 of this series we went over the Celery architecture, how to separate out the components in a docker-compose file, and laid the groundwork for deployment.

Deploy With AWS CloudFormation

This portion of the blog post assumes you have an SSH key set up. If you don't, go to the AWS docs here.

What is CloudFormation?

AWS CloudFormation is an infrastructure design tool that allows users to define their infrastructure: file systems, compute requirements, networking, etc. If you have no interest in designing infrastructure, you probably don't need to worry. CloudFormation configurations are shareable through templates.

Docker AWS CloudFormation

Getting Started

Docker has come to our rescue here, with a Docker for AWS CloudFormation template. This will, with the click of a few buttons, deploy a docker swarm on AWS for us!! 

Click through to the page and scroll down to Quick Start. Under 'Stable Channel', select '...

Continue Reading...

Deploy a Celery Job Queue With Docker – Part 1 Develop

Overview

In this post I will hopefully show you how to organize a large docker-compose project, specifically a project related to a job queue. In this instance we will use Celery, but hopefully you can see how the concepts relate to any project with a job queue, or just a large number of moving pieces.

This post will be in two parts. The first will give a very brief overview of Celery, the architecture of a Celery job queue, and how to set up a Celery task, worker, and Celery Flower interface with docker and docker-compose. Part 2 will go over deployment using docker swarm.

 

What is Celery?

Celery is a distributed job queuing system that allows us to queue up oodles of tasks and execute them as we have resources.
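As a minimal sketch (the Redis broker URL is a placeholder), a Celery task is just a decorated Python function:

# tasks.py
from celery import Celery

app = Celery("tasks", broker="redis://localhost:6379/0")


@app.task
def add(x, y):
    return x + y
Python

You would start a worker with celery -A tasks worker and then queue work from any other process by calling add.delay(2, 2); the worker picks the task up off the broker and executes it.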

From celeryproject.org - 

Celery is an asynchronous task queue/job queue based on distributed message passing. It is focused on real-time operation, but supports scheduling as well.
The execution units, called tasks, are executed...
Continue Reading...