Automation Ninja's Dojo

Running ElasticSearch/Kibana and Logstash on Docker

In today's world, if the combination of words in the subject is new to you, it means you need to catch up quickly 😀 In the IT world Docker is introducing a new way of operating. The days when you needed 20 sysadmins to make a deployment successful are long gone. You could say that nowadays we get DevOps who change the world with the click of a button 😀

Today we will discuss how, by running ElasticSearch + Logstash and Kibana with Docker, you can visualise your environment's behaviour and events. At this stage I would like to point out that this is useful not only in IT, where it gives you insight into what is going on with your infrastructure, but it also has great potential in the era of IoT. In a single "go" you will build the required components to see its potential.

Since this will only touch the real basics, I will try to point you to more interesting sources of information.

The whole exercise will be done on a host running Ubuntu with the following version installed

Distributor ID: Ubuntu
Description: Ubuntu 14.04.3 LTS
Release: 14.04
Codename: trusty

I have already followed the Docker docs on installing the Docker engine on this OS, so make sure you have the engine installed.

As a quick verification, this is the version of Docker running during the write-up of this post

Client:
Version: 1.8.2
API version: 1.20
Go version: go1.4.2
Git commit: 0a8c2e3
Built: Thu Sep 10 19:19:00 UTC 2015
OS/Arch: linux/amd64

Server:
Version: 1.8.2
API version: 1.20
Go version: go1.4.2
Git commit: 0a8c2e3
Built: Thu Sep 10 19:19:00 UTC 2015
OS/Arch: linux/amd64


So since we have got that ready, let's fire up an instance of ElasticSearch. Since we would like to store data outside of the container, we need to create a folder somewhere on the host. As this is only a non-production exercise, I will just use a simple folder in the root structure. For this purpose I have created a folder called cDocker and within it a subfolder data/elasticsearch. This can be achieved by running the following in the console
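
Something along these lines should do (assuming the folder sits directly under the filesystem root):

sudo mkdir -p /cDocker/data/elasticsearch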

Once that is ready we can kick off the creation of our container
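
A sketch of the command, using the official elasticsearch image from Docker Hub (the container name is my choice; /usr/share/elasticsearch/data is where the official image keeps its indices, so that is where we mount our host folder):

docker run -d --name elasticsearch \
  -p 9200:9200 \
  -v /cDocker/data/elasticsearch:/usr/share/elasticsearch/data \
  elasticsearch

The -d flag sends the container to the background, and --name gives us a handle we can link to later.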

After a moment of pulling all the required image layers we can see the container running on our Docker host

[Screenshot: the ElasticSearch container up and running on the Docker host]


As you can see, we have exposed port 9200 for communicating with the API. For ease of making API calls I will be using the Postman addon for Chrome. With that we will send a GET request to http(s)://<IP>:9200/_status, which should come back with our instance status. In my case everything works out of the box, so the reply looks as follows

[Screenshot: the ElasticSearch _status API reply in Postman]
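
If you are not a Postman fan, curl from the console does the same check (substitute your Docker host address for <IP>):

curl -XGET http://<IP>:9200/_status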


For the next part we will create the Logstash container, based on the Logstash image. The main difference here is that we will link our elasticsearch container so they will be able to talk to each other.
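
A sketch of that command (the /conf mount point inside the container is my choice; the official logstash image accepts the logstash command with -f pointing at a config file):

docker run -d --name logstash \
  -p 25826:25826 \
  -p 25826:25826/udp \
  -v $(pwd):/conf \
  --link elasticsearch:db \
  logstash logstash -f /conf/first.conf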

In the above we expose port 25826 over TCP and UDP and mount a volume for the configuration (here I use $(pwd), i.e. the folder of my current console session). Next we link our elasticsearch container and give it the alias db. What remains is the name of the image and the initial command to be executed.

Now, if you paid close attention, I specified that we will be using a config file called first.conf. Since that file does not exist yet, we must create it. Its contents come directly from the Logstash documentation and form a really basic configuration that lets us see the solution working
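
A minimal configuration in that spirit: listen on 25826 and ship everything to the linked elasticsearch container (db is the link alias from the run command above):

input {
  tcp {
    port => 25826
  }
  udp {
    port => 25826
  }
}

output {
  elasticsearch {
    # option is "host" in Logstash 1.x, "hosts" in 2.x
    host => "db"
  }
}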

Now if I open two session windows, one to tail the logstash container logs and the other to create a telnet connection to 25826, we will see that the messages I type into the telnet session get translated and forwarded to elasticsearch.
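
Something like this, with <IP> again being the address of the Docker host:

# session window 1 - watch what logstash is doing
docker logs -f logstash

# session window 2 - talk to logstash and type a test message
telnet <IP> 25826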

[Screenshot: the test message from the telnet session showing up in the logstash logs]


Of course this kind of configuration is only good for an exercise, but it quickly shows how nicely we can get the system running

So since that's ready, it's time to set up Kibana. It's quite easy using the default image from Docker Hub. I have chosen to link the containers for the ease of this exercise
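
A sketch of the run command: the official kibana image looks for Elasticsearch at http://elasticsearch:9200 by default, which is why the link alias here is elasticsearch, and 5601 is Kibana's default port:

docker run -d --name kibana \
  -p 5601:5601 \
  --link elasticsearch:elasticsearch \
  kibana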

And seconds later we can log in to our Kibana server and take a look at our forensic details 🙂 The message we sent before as a test is already visible! How cool is that 😀 ?

[Screenshot: the first test event visible in Kibana]

Let's add some extra fake messages so we have something to visualise. I will be doing that using the telnet command, sending some dummy messages to logstash
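
Typing them one by one into telnet works, but a throwaway netcat loop is quicker if you have nc available (again <IP> is the Docker host):

for i in $(seq 1 20); do
  # -q 1: close the connection a second after stdin hits EOF
  echo "dummy message number $i" | nc -q 1 <IP> 25826
done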

After that's done 🙂 we can create visualisations, and from there onwards... some awesome dashboards. For the purposes of this exercise I have just created basic pie charts to show you how it can look. Of course there is much more power in there, and you should explore the available resources if you want to do more 😀

[Screenshot: a first test dashboard in Kibana built from basic pie charts]


Well, that concludes this short introduction to logging with the ELK stack. There are of course a lot of other considerations when setting this up for production, such as using Redis as a buffer to avoid losing messages or to offload complex message parsing. We will try to look into some of those in upcoming posts!
