This post contains information which has been updated in the post
docker ELK with compose v2
However, to get an idea of how the solution works, I recommend reading through anyway 🙂
Hello!
It's been a bit quiet here for quite a long time, but that's because there was a lot keeping me busy. And as you know, in the majority of scenarios it's time we are short on 🙂 Today I wanted to share with you an update to one of my previous posts, where we set up ELK in an automated way.
When I originally finished the post it of course "was working like a charm!", and then I just left it for a while and focused on a couple of other projects. Recently I went back to that page because I wanted to quickly deploy an ELK stack for my testing… and then surprise – it does not work?! Of course the IT world is like a high-speed train 🙂 and it seems like I just stepped off at a station and forgot to jump back on 🙂
So from my perspective it was a great opportunity to sharpen some extra bash skills and refresh my knowledge of Elasticsearch, Logstash and Kibana.
So what’s changed?
First of all, there is now one main script which gets the job done. The only thing you need to do is specify a cluster name for Elasticsearch.
I have also added some folder existence checks, so it no longer throws spurious error messages about folders that already exist.
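To give you an idea, here is a minimal sketch of those two changes – the cluster name argument and the folder checks. It is not the actual build_elk.sh, and the folder names are only examples taken from the output further down:

#!/bin/bash
# Sketch only - illustrates the idea, not the real build_elk.sh

# First argument = desired Elasticsearch cluster name
CLUSTER_NAME="$1"
if [ -z "$CLUSTER_NAME" ]; then
  echo "Usage: $0 <elasticsearch-cluster-name>" >&2
  exit 1
fi

# Create the component folders only when they are missing,
# so re-running the script does not complain that they already exist
for dir in compose/elk_stack logstash/central logstash/agent elasticsearch/config; do
  [ -d "$HOME/$dir" ] || mkdir -p "$HOME/$dir"
done

echo "Folders ready, Elasticsearch cluster name: $CLUSTER_NAME"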
How to run it now?
Start by downloading the script locally into the folder under which we will create the remaining folders for our components
curl -L http://git.io/vBPqC >> build_elk.sh
The -L option is there to follow redirects (as that’s what git.io is doing for us).
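If you are curious where the short link actually points before running anything, you can peek at the redirect yourself (purely optional; the exact headers depend on the shortener):

curl -sI http://git.io/vBPqC | grep -i '^location'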
Once done you might need to make it executable
sudo chmod +x build_elk.sh
And that’s all 🙂 The last thing to do is to execute the script with the first argument being the desired name of your Elasticsearch cluster. Output is almost instant and promising 🙂
[email protected]~$ sudo ./build_elk.sh myelastico
Cloning into '/home/bar/compose/elk_stack'...
remote: Counting objects: 26, done.
remote: Compressing objects: 100% (26/26), done.
remote: Total 26 (delta 7), reused 0 (delta 0), pack-reused 0
Unpacking objects: 100% (26/26), done.
Checking connectivity... done.
Cloning into '/home/bar/logstash/central'...
remote: Counting objects: 17, done.
remote: Compressing objects: 100% (17/17), done.
remote: Total 17 (delta 4), reused 0 (delta 0), pack-reused 0
Unpacking objects: 100% (17/17), done.
Checking connectivity... done.
Cloning into '/home/bar/logstash/agent'...
remote: Counting objects: 8, done.
remote: Compressing objects: 100% (8/8), done.
remote: Total 8 (delta 1), reused 0 (delta 0), pack-reused 0
Unpacking objects: 100% (8/8), done.
Checking connectivity... done.
Cloning into '/home/bar/elasticsearch/config'...
remote: Counting objects: 8, done.
remote: Compressing objects: 100% (8/8), done.
remote: Total 8 (delta 0), reused 0 (delta 0), pack-reused 0
Unpacking objects: 100% (8/8), done.
Checking connectivity... done.
Creating redis-cache
Creating elasticsearch-central
Creating logstash-central
Creating kibana-frontend
Creating logstash-agent
Let’s check with the Docker daemon whether our containers are indeed running…
[email protected]:~$ sudo docker ps
CONTAINER ID   IMAGE                 COMMAND                  CREATED         STATUS         PORTS                                                NAMES
ff4f41753d6e   logstash:latest       "/docker-entrypoint.s"   2 minutes ago   Up 2 minutes   0.0.0.0:25827->25827/udp, 0.0.0.0:25827->25827/tcp   logstash-agent
be97a16cdb1e   kibana:latest         "/docker-entrypoint.s"   2 minutes ago   Up 2 minutes   0.0.0.0:5601->5601/tcp                               kibana-frontend
d3535a6d9df8   logstash:latest       "/docker-entrypoint.s"   2 minutes ago   Up 2 minutes   0.0.0.0:25826->25826/tcp, 0.0.0.0:25826->25826/udp   logstash-central
38b47ffbb3e7   elasticsearch:2       "/docker-entrypoint.s"   2 minutes ago   Up 2 minutes   0.0.0.0:9200->9200/tcp, 0.0.0.0:9300->9300/tcp       elasticsearch-central
100227df0b50   redis:latest          "/entrypoint.sh redis"   2 minutes ago   Up 2 minutes   0.0.0.0:6379->6379/tcp                               redis-cache
[email protected]:~$
They all are 🙂 That took less than a second (although I had the images already on my host…). And if we just check the browser?
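If you prefer the shell over the browser, a quick sanity check against the ports published in the docker ps output above should do (these are the default ports; adjust if you changed them):

# Elasticsearch should answer with a JSON blob that includes your cluster name
curl http://localhost:9200
# Kibana should answer on its default port
curl -I http://localhost:5601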
And if anything changes? Well, this is all in git… 🙂 so just pull the changes and you will definitely get the most up to date version. But maybe you have some suggestions or improvements? Then just push them – I’m sure it would be beneficial 🙂
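For example, to refresh the configuration the script cloned earlier, just pull inside the folders it created (paths taken from the output above; adjust them to wherever you ran the script from):

cd ~/compose/elk_stack && git pull
cd ~/logstash/central && git pull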
Below is the view of the gist 🙂