The ELK stack powered by Docker

UPDATE: The stack is now powered by docker-compose and using the latest official images for Elasticsearch/Logstash/Kibana. See my new article — 23/07/2015


I’ve recently created a solution to set up an ELK stack using the Docker engine. It can be used to:

  • Quickly boot an ELK stack for demo purposes
  • Test your Logstash configurations and see the imported data in Elasticsearch
  • Import a subset of data into Elasticsearch and prepare dashboards in Kibana 3
  • Use Kibana 4!

UPDATE: The stack is now fully functional with docker-compose as a replacement for Fig. See the docker-compose installation documentation. — 27/02/2015

The solution is available on Github:

It is based on multiple Docker images available on Docker Hub:

  • elk-elasticsearch: Latest 1.5 stable version of Elasticsearch + Marvel with Oracle Java JDK 7
  • elk-logstash: Latest 1.4 stable version of Logstash with Oracle Java JDK 7
  • elk-kibana: Kibana 3.1.2 or Kibana 4


You’ll need Docker and Fig.

The following installation procedures have been tested on Ubuntu 14.04.

Docker installation

Use the following command to install Docker:

$ curl -sSL | sudo sh

Fig installation

Fig is available via pip, a tool for managing Python packages.

First, you’ll need to install pip if it is not already present on your system:

$ sudo apt-get install python-pip

Then, install fig:

$ sudo pip install -U fig

Use the stack

First, you’ll need to checkout the git repository:

$ git clone

By default, the stack ships with a simple Logstash configuration that listens for any TCP input on port 5000.

You can update the Logstash configuration by editing the file logstash-conf/logstash.conf (to test your filters, for example).
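In its simplest form, a configuration matching that behaviour looks roughly like the sketch below, using Logstash 1.4 syntax. The `host` value is an assumption; check the file shipped in the repository for the actual settings.

```conf
# Listen for raw TCP input on port 5000 and forward each line
# as an event to Elasticsearch.
input {
  tcp {
    port => 5000
  }
}

output {
  elasticsearch {
    # hostname is an assumption; the repository's config may differ
    host => "localhost"
  }
}
```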

Then start the stack using fig:

$ cd fig-elk
$ fig up

Fig will start a container for each service of the ELK stack and output their logs.

If you’re still using the default input configuration for Logstash, you can inject some data into Elasticsearch from a file:

$ nc localhost 5000 < /some/log/file.log
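If you don’t have a log file at hand, you can generate a small sample first. The file name and contents here are just an illustration:

```shell
# Generate a few timestamped sample log lines; each line becomes
# one Logstash event when piped through the TCP input.
for i in 1 2 3; do
  echo "$(date -u +%Y-%m-%dT%H:%M:%SZ) INFO sample message $i"
done > sample.log

# Sanity check: the file should contain 3 lines.
wc -l sample.log
```

You can then inject it with the nc command above.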

Then you can check the results in Kibana 3 by hitting the following URL in your browser: http://localhost:8080

Or, if you’d like to use Kibana 4, hit the following URL: http://localhost:5601

Elasticsearch is also shipped with Marvel, so you have access to cluster monitoring at the following URL: http://localhost:9200/_plugin/marvel

Have fun with the ELK stack 🙂


11 thoughts on “The ELK stack powered by Docker”

  1. Thanks for the write-up.
    I have a little problem though, when I do ‘fig up’, I get the following:

    adbab46a8af4: Error pulling image (latest) from deviantony/elk-elasticsearch, Untar exit status 1 open /usr/include/linux/firewire-cdev.h: no space left on device

    firewire? ehm?

      1. Hmm. Actually, I forgot about the problem with Docker consuming so many inodes

        df vs df -i
        shows free space, but 100% inode use…

  2. I love it!
    I have a question, it is probably more about Docker than your tool specifically, but is there an easy way to persist the ELK data outside Docker, so that it isn’t lost when either Docker or the VM is restarted?

    1. Hey Michael, sure you can!

      All you need to do is update the elasticsearch container definition in the docker-compose.yml file to add a volume mapping:

      elasticsearch:
        image: deviantony/elk-elasticsearch
        volumes:
          - /tmp/elastic:/usr/share/elasticsearch/data
        ports:
          - "9200:9200"

      The elasticsearch data will be persisted in the /tmp/elastic folder 🙂
