UPDATE: The stack is now powered by docker-compose and using the latest official images for Elasticsearch/Logstash/Kibana. See my new article https://deviantony.wordpress.com/2015/07/23/the-elk-stack-powered-by-docker-updated/ — 23/07/2015
I’ve recently created a solution to set up an ELK stack using the Docker engine. It can be used to:
- Quickly boot an ELK stack for demo purposes
- Test your Logstash configurations and see the imported data in Elasticsearch
- Import a subset of data into Elasticsearch and prepare dashboards on Kibana 3
- Use Kibana 4!
UPDATE: The stack is now fully functional with docker-compose as a replacement for fig. See http://docs.docker.com/compose/install/ for docker-compose installation instructions. — 27/02/2015
The solution is available on GitHub: https://github.com/deviantony/docker-elk
It is based on multiple Docker images available on Docker Hub:
- elk-elasticsearch: Latest 1.5 stable version of Elasticsearch + Marvel with Oracle Java JDK 7
- elk-logstash: Latest 1.4 stable version of Logstash with Oracle Java JDK 7
- elk-kibana: Kibana 3.1.2 or Kibana 4
The following installation procedures have been tested on Ubuntu 14.04.
Use the following command to install Docker:
$ curl -sSL https://get.docker.com/ubuntu/ | sudo sh
Fig is available via pip, a tool for managing Python packages.
First, you’ll need to install pip if it is not already present on your system:
$ sudo apt-get install python-pip
Then, install fig:
$ sudo pip install -U fig
Use the stack
First, you’ll need to check out the git repository:
$ git clone https://github.com/deviantony/fig-elk.git
By default, the stack ships with a simple Logstash configuration that listens for any TCP input on port 5000.
You can change the Logstash configuration by editing the file logstash-conf/logstash.conf (to test your filters, for example).
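For reference, a minimal configuration matching that default behaviour might look like the sketch below (the exact shipped file may differ, and the `host` value is an assumption based on how the containers are linked; check logstash-conf/logstash.conf in the repository):

```
input {
  tcp {
    port => 5000
  }
}

output {
  elasticsearch {
    host => "elasticsearch"
  }
}
```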
Then start the stack using fig:
$ cd fig-elk
$ fig up
Fig will start a container for each service of the ELK stack and output their logs.
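To give an idea of how the services are wired together, a fig.yml for such a stack typically looks like the sketch below. The image names, port mappings, and volume paths here are illustrative assumptions; the actual file lives at the root of the repository:

```yaml
elasticsearch:
  image: deviantony/elk-elasticsearch
  ports:
    - "9200:9200"
logstash:
  image: deviantony/elk-logstash
  ports:
    - "5000:5000"
  volumes:
    - ./logstash-conf:/etc/logstash
  links:
    - elasticsearch
kibana:
  image: deviantony/elk-kibana
  ports:
    - "8080:80"
  links:
    - elasticsearch
```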
If you’re still using the default input configuration for Logstash, you can inject some data into Elasticsearch from a file:
$ nc localhost 5000 < /some/log/file.log
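If you don’t have a log file at hand, you can generate a few sample lines first (the file name and message format below are arbitrary):

```shell
# Write three sample syslog-style lines to a file.
for i in 1 2 3; do
  echo "$(date '+%b %d %H:%M:%S') myhost demo[$i]: sample message $i"
done > sample.log

# Confirm the file contains the expected number of lines.
wc -l < sample.log
```

Then feed them to Logstash with nc localhost 5000 < sample.log as shown above.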
Then you can check the results in Kibana 3 by hitting the following URL in your browser: http://localhost:8080
Or, if you’d like to use Kibana 4, hit the following URL: http://localhost:5601
Elasticsearch also ships with Marvel, so you have access to cluster monitoring at the following URL: http://localhost:9200/_plugin/marvel
Have fun with the ELK stack 🙂