ChatOps using Hubot – Zabbix maintenance





This post is a supplement to the GitHub repo available at https://github.com/RafPe/hubot-zabbix-scripts




So finally the day has come when I can write about my recent involvement in automating 🙂 this time with the use of Hubot ( in this role my favorite, Bender ) and the good Rocket.Chat.


Simple idea:

If we need to do it once – let's automate it, as for sure someone else will need to do it at least once as well


And in most cases it's true 🙂 So one day I just woke up quite early. Really too early to go to work already 🙂 and too late to still get really good sleep. So I got the thing which we all think of in the morning ….. yezzzz coffee 🙂 And then I thought about the things that people around me have been doing manually for quite a while :/

The challenge which came out of that short moment of thinking was: “setting Zabbix server maintenance with Hubot ( Bender )“


Getting pieces together:

Now I really liked that idea. It was around 6AM, my coffee was halfway through, so I geared up and was ready when I opened my laptop. What was really challenging here is the fact that I had never programmed in CoffeeScript nor in Python, and those are the 2 main components used to bake this solution. However, at the end of the day it's only a different grammar for getting things done 🙂

I decided not to reinvent the wheel and looked at things that already work. Since at the moment I have been automating a lot with Ansible, I looked at their GitHub page with extra modules.

And that was exactly what I needed. Then I just went ahead and downloaded Hubot – following the nice and simple documentation. Based on the info there, getting the CoffeeScript to do exactly what I need was just a matter of minutes 🙂 ( at least I hoped so )


So this is a proxy ?

Exactly. The CoffeeScript in Hubot makes sure we respond to properly defined regex patterns which correspond to commands given to our Hubot. From there we execute the Python script.

So I placed the biggest effort on getting the Python script running. I googled around and managed to get it running with arguments, which in turn opened the door to properly proxying from the CoffeeScript.


The final version of the Python script ( final as of this write-up ) has the following syntax:

python zbx-maint.py

usage: zbx-maint.py [-h] -u USER -p PASSWORD [-t TARGET] [-s SERVER] -a ACTION
                    [-l LENGTH] [-d DESC] [-r REQUESTOR] [-i ID]

 -u USER      : used to connect to zabbix - needs perm to create/delete maintenance
 -p PASSWORD  : password for the user above
 -t TARGET    : host/groups to create maintenance on
 -s SERVER    : URL of the zabbix server
 -a ACTION    : del or set
 -l LENGTH    : number of minutes to have maintenance for
 -d DESC      : additional description added to maintenance
 -r REQUESTOR : used to pass who has requested the action
 -i ID        : name of maintenance - used for deletion
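Wiring up flags like these in Python is a job for argparse. The sketch below only mirrors the usage text above – the long option names, defaults and example values are my assumptions, not necessarily what the real zbx-maint.py does:

```python
import argparse

def build_parser():
    # Mirrors the usage shown above; help text and defaults are assumptions
    p = argparse.ArgumentParser(prog="zbx-maint.py")
    p.add_argument("-u", dest="user", required=True, help="user with maintenance permissions")
    p.add_argument("-p", dest="password", required=True, help="password for the user above")
    p.add_argument("-t", dest="target", help="host/group to create maintenance on")
    p.add_argument("-s", dest="server", help="URL of the Zabbix server")
    p.add_argument("-a", dest="action", required=True, choices=["set", "del"], help="del or set")
    p.add_argument("-l", dest="length", type=int, default=60, help="maintenance window in minutes")
    p.add_argument("-d", dest="desc", default="", help="additional description")
    p.add_argument("-r", dest="requestor", default="", help="who requested the action")
    p.add_argument("-i", dest="id", help="name of maintenance - used for deletion")
    return p

# Example invocation, roughly as the coffee script would call it:
args = build_parser().parse_args(
    ["-u", "api-user", "-p", "secret", "-a", "set", "-t", "web01", "-l", "30"])
print(args.action, args.target, args.length)
```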


What about security ?

All passwords and links used within the Hubot script are passed using environment variables. For proper control of processes and isolation I have been using supervisord here ( which is a great tool for this ).


HUBOT_ZBX_USER      : user accessing zabbix
HUBOT_ZBX_PW        : password for the user
HUBOT_ZBX_URL       : zabbix server URL
HUBOT_ZBX_PYMAINT   : full path to zbx-maint.py script (used by coffee script)
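For completeness, a supervisord program entry passing those variables could look like this ( the paths and values here are placeholders, not the ones from my setup ):

```ini
[program:hubot]
command=/opt/hubot/bin/hubot --adapter rocketchat
directory=/opt/hubot
user=hubot
autorestart=true
environment=HUBOT_ZBX_USER="api-user",HUBOT_ZBX_PW="secret",HUBOT_ZBX_URL="https://zabbix.example.com",HUBOT_ZBX_PYMAINT="/opt/hubot/scripts/zbx-maint.py"
```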


Bender in action:

So without any further delay this is how it looks in action ….






Being considered:

I’m still looking for feedback from other people to see what can be done better. Most likely I will be publishing some more Zabbix automations to enrich ChatOps and make life more interesting 🙂




Building own datacenter – this time at home

Datacenter in home ? Why ?

So a natural question would be why on earth you would like to build a “datacenter“ ( quoted on purpose ) in the peace of your own home. Well, for 99.9% of people the answer would be “I would never have one“. Well, I’m in that 0.1% and the reason is simple … well, it's even more than one: a passion for IT and the drive to learn & do cool stuff.

For this reason several things at my place have changed:

  • Internet connection upgraded to 0.5 Gbps fibre
  • Public IP address space of /29
  • Internal home network secured with a customised APU board acting as edge router
  • Managed switch to introduce VLANs
  • Strong wireless with Ubiquiti
  • And I think the most interesting ….. the server


Networking with “learn as you go”

Since apart from application/server/development work electronics is also an interesting subject for me, I decided that the router given to me by my provider is far away from being “cool“ and fully under my control. So I started off with a good old desktop station. But that kicked me back to the so-called “router on a stick“.

Since I wanted a better experience I decided to move on, and by pure luck I found this board. Since then I have already got 3 of them. How come? Well, they do much more than just networking. I can use them to learn Linux kernel patching skills, I can use the board to connect the world of software to the world of electronic devices made by me, and what else …. ohhh yes … it is also my edge router with 3x gigabit Ethernet ports, which in return allows me to learn all the tricks about networking/routing/VLANs/troubleshooting 🙂

If that were not enough, I harvested my old Dell laptop and plugged in an Ericsson mobile modem (3G) which now gives me alternative internet in case of failure 🙂  wow + wow + wow 🙂

So here is how one looks without an enclosure



And there it is 🙂 If you have any questions about this small devil – just let me know 🙂 I will be happy to provide you with more answers.


That was fun – but where is the server ?

So the whole point here would be having a server which, I strongly believe, is not about “how much did it cost?“ but all about “what can you learn on it?“. If you already see this difference then you're most probably one step ahead of others.

Now at this point I will not pretend that I know hardware really well … I don’t 🙂 and that's why a good friend of mine with super uber eXperience helped me put together a kit list which turned out to be a great server for learning purposes. Below you can see the list of what we concluded to be best:

  • Motherboard: Asus Q170M2 – chosen for its 2 built-in Ethernet ports – https://www.asus.com/Motherboards/Q170M2/
  • PSU: Corsair VS550 – 550W – to have enough spare power
  • Enclosure: Antec ISK 600m – just because it looks cool 🙂
  • HDD ( data ): HGST 4TB 3.5″ 7200RPM – 3x of them – RAID5 – used for data
  • HDD ( os ): HGST 500GB 2.5″ 7200RPM – 2x of them – RAID1 – used for OS
  • Processor: Intel i7 6700 – to get the most out of it
  • Memory: Hyper Fusion [email protected] 16GB – 4x of them – to max out the board. Max the fun

Now this is what we call a server for fun. At this stage you might ask what will be running on that box …. well, KVM for virtualisation and Open vSwitch to play around with SDN.

So was it scary to put it all together ?

Ohhh hell it was 🙂 I felt like I was handling really fragile Lego pieces, and the fact that I was so excited didn't really help 🙂 So I'm attaching a couple of the better photos from the build, up till the first boot 🙂 enjoy!





That's it for now, folks 🙂

Hope you enjoyed this short adventure. We will be using the toys mentioned in this post quite soon and will definitely have more fun! So stay tuned!


DevOpsdays 2016 Amsterdam – Videos are here

If you have missed DevOpsdays in Amsterdam this year for whatever reason – you can watch all the published videos on the Vimeo channel! Just head out and go HERE

Some of my favorites :

DevOpsdays Amsterdam 2016 Day 1 – Adam Jacob from [email protected] on Vimeo.

DevOpsdays Amsterdam 2016 Day 1 – Avishai Ish-Shalom from [email protected] on Vimeo.

DevOpsdays Amsterdam 2016 Day 1 – Daniël van Gils from [email protected] on Vimeo.


Hope you will enjoy as well!



Git – Visualize your repository

Working with some kind of version control system in today's world of IT should not even be a question. One question may remain – which one 🙂 I personally use GitHub / BitBucket, and for small-factor use Gogs ( the last one deserves a post of its own … but that's for the future 🙂 )

Now once you already use a versioning system, another interesting discussion/challenge can occur, and that is “to branch or not to branch“. Since I know we could write a complete post on this subject alone, I will just stick to my personal opinion at the moment of writing this blog – which is “yep – branch 🙂“

Ok – that was easy. So now we have reached the stage where we have our “code“ which we version, and if we do features we even branch – so now how can we visualize it, nicely cherry-pick the changes we are interested in, browse history and create tags?

By default Git offers you the “git log“ command. While browsing the internet for a cool approach to this I came across the following post which showed how to do ( at least basic ) visualizations.

To make this easy ( in case the post for some reason disappears ) I have made a gist out of it, and it's below
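The idea boils down to a decorated, graph-style git log. A basic variant of such a command ( not necessarily the exact gist contents ) looks like this, shown here against a throwaway repo:

```shell
# Create a throwaway repo just to have something to visualize
cd "$(mktemp -d)" && git init -q
git -c user.name=demo -c user.email=demo@example.com commit -q --allow-empty -m "initial commit"

# ASCII graph of all branches with decorated refs
git log --graph --oneline --decorate --all
```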


And I think I would have just stopped there, if not for the fact that reading a bit more I came across a really great tool called ungit. All it takes to install it is

npm install -g ungit


Once installed, just invoke it from the console



and off you go … In theory this is nothing more than a UI for your Git repository – but look at this already great view of one of my repos



Now the moment I saw this I thought – “Ok – I think this will be one of my core tools“. And I was not wrong. I could nicely expand any of my commits and get access to options further down – and of course the details of every one of those commits



For me this makes my everyday work so much easier 🙂 and a bit cooler! If you are using some different tools – just leave a comment and share your opinion 🙂




MySQL SSL – require client certificate for user

When working with a MySQL database where you have set up encryption following one of the many guides on the internet, you then have a choice between just requiring SSL to be used, or requiring that the client also presents a certificate. I followed the complete guide from MySQL dev docs, which allowed me to quickly get the certificates and SSL set up for my database.

Then, depending on your choice, you can create users using snippets like the ones below:
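For example ( MySQL 5.7 syntax; the account names, host wildcard and passwords are placeholders ):

```sql
-- User that must connect over SSL, but without a client certificate
CREATE USER 'appuser'@'%' IDENTIFIED BY 'S3cretPw!' REQUIRE SSL;

-- User that must additionally present a valid X509 client certificate
CREATE USER 'certuser'@'%' IDENTIFIED BY 'S3cretPw!' REQUIRE X509;
```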


Ansible role for Redhat 7 CIS baseline



If you are working with environments where certain policies and rules need to be applied, something like the CIS baselines will be well known to you.

It works on the basis that you define which points you will apply to your system, and from that point onwards you are expected to deliver proof of how your systems are compliant ( or not ) – and, if you do not apply certain settings, what the reason for that is.

However, the problem comes when you need to enforce this compliance on multiple systems and make sure they are all happily running these policies.


And here comes the really good part – where you take a configuration management tool like Ansible and create a reusable piece of code which defines your infrastructure. Although looking at the CIS baseline documents – if you were to start from zero that would be a lot of work … but …. a good friend of mine has spent his time preparing a CIS baseline for Redhat 7 which is now available on GitHub in his repository HERE 🙂


And for much more interesting info you can always look at his blog under https://blog.verhaar.io








Atom cheatsheet

If you are like me and appreciate tools which enable you to work with syntax highlighting for multiple standards, as well as enabling you to be quick and efficient, then I recommend using atom.io

And since we want to be as fast as possible, below you can find a cheatsheet that I came across.







C# – Active Directory changes synchronization with cookie

In a recent post we discussed how to track Active Directory changes efficiently with PowerShell.

Now we can achieve the same thing with C#. And if you wonder why C#, since we already had it in PowerShell? Well, maybe you are writing a form of REST API for your enterprise? Or writing an application for personnel who are not fluent with scripting ( the people that do use a GUI 🙂 )

Nevertheless, this is going to be nice and easy. I will not be using screenshots of Visual Studio in this post, but just providing you with the information needed.


The architecture and design are totally up to you 🙂 I will introduce you to the basics needed to put the bits and pieces together. To hold the information which we receive, it is best to create a class with the properties we are interested in and keep the instances in a list.

public class adresult
{
    public string objName { get; set; }
    public string objDN   { get; set; }
    public string objXYZ  { get; set; }   // whatever other properties you are interested in
}


That was easy 🙂 Now let’s get to writing our application. I focus here on a console application, but you can use whatever type is suitable for you.

Let’s prepare the LDAP connection:

string ldapSrv = "LDAP://<LDAP-path>";
string ldapFilter = "(objectClass=user)";

// File to store our cookie
string ldapCookie = @"c:\adsync-cookie.dat";

// set up search
DirectoryEntry dir = new DirectoryEntry(ldapSrv);
DirectorySearcher searcher = new DirectorySearcher(dir);

searcher.Filter = ldapFilter;
searcher.SearchScope = SearchScope.Subtree;
searcher.ExtendedDN = ExtendedDN.Standard;


Next is the interesting part – the synchronization object

// create directory synchronization object
DirectorySynchronization sync = new DirectorySynchronization();

// check whether a cookie file exists and, if so, initialize the dirsync from it
if (File.Exists(ldapCookie))
{
    byte[] byteCookie = File.ReadAllBytes(ldapCookie);
    sync = new DirectorySynchronization(byteCookie);
}


Lastly, we combine what we have prepared and execute the search

// Assign previously created object to searcher
searcher.DirectorySynchronization = sync;

// Create a list to hold our result objects
List<adresult> ADresults = new List<adresult>();

foreach (SearchResult result in searcher.FindAll())
{
    adresult objAdresult = new adresult();
    objAdresult.objName  = (string)result.Properties["name"][0];

    // with ExtendedDN.Standard the DN comes back as <GUID=...>;<SID=...>;distinguishedName
    string[] sExtendedDn = ((string)result.Properties["distinguishedName"][0]).Split(new Char[] { ';' });
    objAdresult.objDN    = sExtendedDn[2];

    ADresults.Add(objAdresult);
}


// write new cookie value to file
File.WriteAllBytes(ldapCookie, sync.GetDirectorySynchronizationCookie());

// Return results 
return ADresults;


This concludes this short post. I hope you will be able to use it for your complex Active Directory scenarios.




Ansible – ‘DEFAULT_DOCKER_API_VERSION’ is not defined

Working with DevOps on a daily basis causes you to automate a lot of work. One of my recent orchestration tools is Ansible. I'm not saying THIS IS THE TOOL to go for :) but it sure has a lot of potential.

So I decided to use it to deploy services also leveraging Docker … but then I received this error message:


NameError: global name ‘DEFAULT_DOCKER_API_VERSION’ is not defined

This comes from the docker-py library being missing ( or too old ) on the target host, so the workaround is to install it before using the docker module:

  - name: install the required packages
    apt: name=python-pip state=present update_cache=yes

  - name: Install docker-py as a workaround for Ansible issue
    pip: name=docker-py version=1.2.3