
Logstash – Filtering VyOS syslog data

Hey! So in the last days/weeks 🙂 I have been working quite a lot with the ELK stack, especially on getting data from my systems into Elasticsearch. There would not be any problem if not for the fact that the default parsing did not quite do the job. But what would IT life be without challenges?

So in this post I will explain in short how I have overcome this problem. And I'm sure you will be able to use this, or even make it better.

We will look into the following:

* Incoming raw data

* Creating filter

* Enjoying results

 

Incoming raw data:

So you have got your VyOS box doing the hard work on the edge of your network. Now you would like to know when someone is knocking on your door, or to find the root cause when troubleshooting firewall rules.

An example of incoming data from my box looks similar to the following:

<4>Dec  6 01:36:00 myfwname kernel: [465183.670329] 
[internet_local-default-D]IN=eth2 OUT= 
MAC=00:aa:aa:aa:aa:aa:aa:aa:aa:aa:aa:aa:aa:00 
SRC=5.6.7.8 DST=1.2.3.4 LEN=64 TOS=0x00 
PREC=0x00 TTL=56 ID=10434 DF PROTO=TCP 
SPT=51790 DPT=80 WINDOW=65535 RES=0x00 SYN URGP=0

If we just apply basic syslog filtering, it will not give us all the required fields. The challenge here is that we first need to extract the firewall rule name, and once that is done the remaining key=value pairs are a clear use case for the key-value (kv) filter.

 

Creating filter:

So the time has come to use some magical skills and create the configuration for the Logstash filter. I would like to stress that different approaches can have an impact on performance, both negative and positive. As I'm still getting familiar with Logstash, this might not be the best solution, but I will definitely keep exploring.

You will notice that my filters use conditional statements so that I do not process data unnecessarily. In my case the VyOS traffic is tagged as syslog and contains a specific string in the message.

So without further ado… we begin by parsing out of the message the data that is sure to be there.

  if [type] == "syslog" and [message] =~ "myfw" {
    grok {
      break_on_match => false
      match => [
      "message",  "<%{POSINT:syslog_pri}>%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: \[(?<syslog_pid>.*?)\] %{GREEDYDATA:syslog_message}",
      "syslog_message", "\[(?<firewall_rule>.*?)\]"
      ]
    }

Points of interest here :

  • the grok filter DOES NOT break on match (break_on_match => false), so both patterns get applied
  • we match on message and then on the extracted syslog_message to get our firewall rule from between [ ]
  • the bracketed kernel uptime is captured as kernel_timestamp so it does not collide with the syslog_pid field

Next we will change our fields on the fly using mutate.

    # mutate our values
    mutate {
      add_field => [ "event_type", "firewall" ]
      rename => { "firewall_rule" => "[firewall][rule]" }
      gsub => [ "message", "= ", "=xxx" ]            # Here we remove scenario where this value is empty
    }

Points of interest :

  • I add a field called event_type with the value firewall so in the future I can quickly query for those events
  • I rename my previously extracted field (firewall_rule) to a nested field
  • And lastly I use gsub to mitigate the problem of a missing value in a key-value pair (the empty OUT=)

Once this is done, I extract the remaining values using the kv filter, which is configured as follows:

    # Apply key value pair
    kv {
      include_keys => ["SRC","DST","PROTO","IN","MAC","SPT","DPT"]
      field_split => " \[\]"
      add_field => {
        "[firewall][source_address]" => "%{SRC}"
        "[firewall][destination_address]" => "%{DST}"
        "[firewall][protocol]" => "%{PROTO}"
        "[firewall][source_port]" => "%{SPT}"
        "[firewall][destination_port]" => "%{DPT}"
        "[firewall][interface_in]" => "%{IN}"
        "[firewall][mac_address]" => "%{MAC}"
      }
    }

Points of interest :

  • I use include_keys so only the fields listed in the array get extracted (positive impact on performance)
  • I tried field_split to help out with one of the earlier challenges, but it did not make much of a difference
  • And lastly I specify my new nested fields for the extracted values

 

So that's it! The complete filter looks as follows:

filter {
  if [type] == "syslog" and [message] =~ "vmfw" {
    grok {
      break_on_match => false
      match => [
      "message",  "<%{POSINT:syslog_pri}>%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: \[(?<syslog_pid>.*?)\] %{GREEDYDATA:syslog_message}",
      "syslog_message", "\[(?<firewall_rule>.*?)\]"
      ]
    }
    
    # mutate our values
    mutate {
      add_field => [ "event_type", "firewall" ]
      rename => { "firewall_rule" => "[firewall][rule]" }
      gsub => [ "message", "OUT= MAC=", "MAC=" ]            # Here we remove scenario where this value is empty
    }

    # Apply key value pair
    kv {
      include_keys => ["SRC","DST","PROTO","IN","MAC","SPT","DPT"]
      field_split => " \[\]"
      add_field => {
        "[firewall][source_address]" => "%{SRC}"
        "[firewall][destination_address]" => "%{DST}"
        "[firewall][protocol]" => "%{PROTO}"
        "[firewall][source_port]" => "%{SPT}"
        "[firewall][destination_port]" => "%{DPT}"
        "[firewall][interface_in]" => "%{IN}"
        "[firewall][mac_address]" => "%{MAC}"
      }
    }
  }
}

 

Enjoying results:

Now we need to test whether this really works as we expect. For this check we will, of course, use Docker.

First I create the aforementioned config file and name it logstash.conf. Once that's done, we bring the container up with the following command:

docker run -d -p 25666:25666 -v "$PWD":/config-dir logstash logstash -f /config-dir/logstash.conf

This creates a container which I can then test locally. Now, for this to work the pipeline also needs an input source (e.g. tcp or udp) and an output (e.g. stdout with a debug codec), as sketched below.
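For reference, a minimal input/output wrapper around the filter could look like the following (the UDP port and the syslog type are assumptions matching the docker command and the conditionals used above):

input {
  udp {
    port => 25666
    type => "syslog"
  }
}

output {
  stdout { codec => rubydebug }
}

With this in place, you can fire the sample message from the beginning of the post at UDP port 25666 and watch the parsed event appear in the container output.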

Then I split my screen using tmux and send a test message while looking at the results from docker logs.

[Screenshot: tmux split showing the test message and the parsed output from docker logs]

 

And that's it! You have beautifully working parsing for your VyOS box! If you have any comments or improvements, feel free to share!

 

 


PowerShell – Network cmdlets

In an effort to move away from old-school habits of using e.g. nslookup instead of PS cmdlets, I thought it would be beneficial to reblog, for reference, a quite interesting article about replacing those commands with pure PowerShell cmdlets. The original article can be found here.

IPCONFIG

Used to get IP configuration.

Get-NetIPConfiguration
Get-NetIPAddress

Get-NetIPConfiguration
Get-NetIPAddress | Sort InterfaceIndex | FT InterfaceIndex, InterfaceAlias, AddressFamily, IPAddress, PrefixLength -Autosize
Get-NetIPAddress | ? AddressFamily -eq IPv4 | FT -AutoSize
Get-NetAdapter Wi-Fi | Get-NetIPAddress | FT -AutoSize

 

PING

Check connectivity to target host.

Test-NetConnection

Test-NetConnection www.microsoft.com
Test-NetConnection -ComputerName www.microsoft.com -InformationLevel Detailed
Test-NetConnection -ComputerName www.microsoft.com | Select -ExpandProperty PingReplyDetails | FT Address, Status, RoundTripTime
1..10 | % { Test-NetConnection -ComputerName www.microsoft.com -RemotePort 80 } | FT -AutoSize

 

NSLOOKUP

Translate IP to name or vice versa

Resolve-DnsName

Resolve-DnsName www.microsoft.com
Resolve-DnsName microsoft.com -type SOA
Resolve-DnsName microsoft.com -Server 8.8.8.8 -Type A

 

ROUTE

Shows the IP routes (can also be used to add/remove routes).

Get-NetRoute
New-NetRoute
Remove-NetRoute

Get-NetRoute -Protocol Local -DestinationPrefix 192.168*
Get-NetAdapter Wi-Fi | Get-NetRoute

 

TRACERT

Trace route. Shows the IP route to a host, including all the hops between your computer and that host.

Test-NetConnection -TraceRoute

Test-NetConnection www.microsoft.com -TraceRoute
Test-NetConnection outlook.com -TraceRoute | Select -ExpandProperty TraceRoute | % { Resolve-DnsName $_ -type PTR -ErrorAction SilentlyContinue }

 

 

NETSTAT

Shows current TCP/IP network connections.

Get-NetTCPConnection

Get-NetTCPConnection | Group State, RemotePort | Sort Count | FT Count, Name -AutoSize
Get-NetTCPConnection | ? State -eq Established | FT -AutoSize
Get-NetTCPConnection | ? State -eq Established | ? RemoteAddress -notlike 127* | % { $_; Resolve-DnsName $_.RemoteAddress -type PTR -ErrorAction SilentlyContinue }

 

 

So happy moving into the object-oriented world of PowerShell 🙂

 


PowerShell – Active Directory changes synchronization with cookie

In today's post I want to show you something that can be of interest to those who need to find recent Active Directory changes but are challenged by e.g. a big AD forest with a large number of objects, and are hitting performance problems when executing queries. So where does this problem come from? Well, if you have an Active Directory with a lot (really a lot) of objects, then frequently querying for changes can be troublesome.

But don't worry, there are a couple of ways to tackle this challenge. If you look for more details, you will find that you can just query for information (duh?!), subscribe to be notified when changes occur (push), or make incremental queries (pull). And today we will investigate exactly the last one: querying using a synchronization cookie.

The principle here is to use a cookie which allows us to poll only for changes since the last time we queried AD. This way we can keep the query very specific and return only the subset of properties we are really interested in.

 

The whole code is quite simple to implement and consists of the following:
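A minimal sketch of this approach in PowerShell, using System.DirectoryServices, could look like the one below (the LDAP path, the filter and the cookie file location are assumptions you will want to adjust; the querying account also needs the "Replicating Directory Changes" permission):

# bind to the domain and prepare a searcher for the objects we care about
$root     = New-Object System.DirectoryServices.DirectoryEntry("LDAP://DC=contoso,DC=com")
$searcher = New-Object System.DirectoryServices.DirectorySearcher($root, "(objectClass=user)")
[void]$searcher.PropertiesToLoad.AddRange(@("displayName","whenChanged"))

$cookieFile = "C:\temp\ad-sync.cookie"

if (Test-Path $cookieFile) {
    # subsequent runs: reuse the cookie so only changes since the last poll come back
    $cookie = [System.IO.File]::ReadAllBytes($cookieFile)
    $sync   = New-Object System.DirectoryServices.DirectorySynchronization -ArgumentList (,$cookie)
}
else {
    # first run: full enumeration which also initializes the cookie
    $sync = New-Object System.DirectoryServices.DirectorySynchronization
}
$searcher.DirectorySynchronization = $sync

# objects changed since the last poll (or everything on the first run)
$searcher.FindAll() | ForEach-Object { $_.Properties["displayname"] }

# persist the updated cookie for the next poll
[System.IO.File]::WriteAllBytes($cookieFile, $sync.GetDirectorySynchronizationCookie())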

And that would be all for this one. From the code above you can see that subsequent requests return only changes since the last poll (scoped, of course, by the query you provided). In one of the next posts we will focus on doing this in C#, as some of you may want to do more DevOps.

 

 

 

 

 


OpenSSL – generate a self-signed certificate

Quite often, to test different aspects of IT or security, we use certificates which are self-signed. As the name implies, we are responsible for generating them ourselves. In this post we will go through a short explanation of how to generate one using openssl.

To create one, we issue the following command:

openssl req -x509 -newkey rsa:2048 -keyout certificate-key.pem -out certificate.pem -days 365 -nodes

In order to better understand the above command, let's break it down:

  • req : PKCS#10 certificate request and certificate generating utility
  • -x509 : outputs a self-signed certificate instead of a certificate request
  • -newkey rsa:#### : creates a new certificate request and a new private key; in this instance RSA with a size of #### bits
  • -keyout file.name : writes the newly created private key to the given file
  • -out file.name : specifies the output file name
  • -days # : specifies how many days the certificate will be valid when the -x509 option is used; the default value is 30 days
  • -nodes : indicates that the private key should not be encrypted (no passphrase)
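To sanity-check the result, you can print the generated certificate back in human-readable form:

openssl x509 -in certificate.pem -text -noout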

 

For those on Windows, we sometimes need a PFX (which contains the private and the public key). The easiest way is to use OpenSSL in the following form:

openssl pkcs12 -inkey bob_key.pem -in bob_cert.cert -export -out bob_pfx.pfx

 

Since some of you will be working on Windows, you might come across the following error:

WARNING: can't open config file: /usr/local/ssl/openssl.cnf

What you are missing then is an environment variable (make sure to adjust the path to your cfg file):

set OPENSSL_CONF=c:\OpenSSL-Win32\bin\openssl.cfg

 

And that's it for the self-signed certificate. In the next post we will combine this knowledge of certificates with the power of Docker and set up our own registry.

 


Azure Files on Ubuntu

If you have not seen the recent post on the Azure blog, I would like to let you know that Azure Files is now GA. Details are available here.

Since I would not like to duplicate content, I'm going to show you how you can get an Azure Files share mapped on your Linux boxes. Why Linux boxes? I already have a zillion ideas for using this; the major one is Docker and containers which I would like to make HA, or my own Docker registry.

 

Creating the file share via the portal is extremely easy and intuitive:

[Screenshot: creating an Azure Files share in the portal]

 

Install tools

We need to install the following package if it is not already present (I have become a fan of Ubuntu 🙂):

sudo apt-get install cifs-utils

 

Mount fileshare

The next step is mounting the share. This has some limitations based on the SMB protocol version being used (for more detailed info look at the Azure blog post linked above). In this instance I will be using SMB v3, so we are good to go on using Azure Files from on-premises too.

sudo mount -t cifs //rafpeninja.file.core.windows.net/docker-demo-data ./dockerdemodata -o vers=3.0,username=rafpeninja,password=YourAwesomeStorageKey==,dir_mode=0777,file_mode=0777

 

As I did not want to play with any restrictions yet, the permissions are kind of high 🙂 but you can modify them as you need.
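If you want the mount to survive reboots, the usual approach is a credentials file plus an /etc/fstab entry; a sketch under the assumption of the share used above (the paths are assumptions, and remember to chmod 600 the credentials file):

# /etc/smbcredentials/rafpeninja.cred
username=rafpeninja
password=YourAwesomeStorageKey==

# /etc/fstab entry (all on one line)
//rafpeninja.file.core.windows.net/docker-demo-data /mnt/dockerdemodata cifs nofail,vers=3.0,credentials=/etc/smbcredentials/rafpeninja.cred,dir_mode=0777,file_mode=0777 0 0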

 

Simple test

Once this is done you can head to the folder and create a sample file.

sudo touch test.me

 

When done, you can see the file instantly via the portal:

[Screenshot: the test file visible in the Azure portal]

 

 

And here you go, your file is immediately available. If you have any scenarios where you already use this, I'm keen to hear about them!

 

 

 


Mac OS – Multiple terminals with customised color scheme

If you are like me 😀 and do not close yourself into only one operating system, then you probably operate between the world of Windows and the world of Linux 😀

At the moment I have set up my working environment in a way that allows me to work with both systems. On one end I have the new and shiny Windows 10, and on the other boot I have Mac OS.

On Mac OS I have been looking for software that would give me better control and visibility of my sessions than the standard Terminal app. With a bit of looking around I found a couple of alternatives, and the one that really got my attention is iTerm.

The way it looks is more than satisfying 🙂 You can see a detailed view of a horizontal split on the screenshot below:

[Screenshot: iTerm with a horizontal split]

 

The program has a lot of cool color schemes on offer. What I also did was edit my profile file according to the solutions mentioned in this post.

If you would rather just get the details of the modification, here it is:

export CLICOLOR=1
export LSCOLORS=GxFxCxDxBxegedabagaced
export PS1='\[\033[01;32m\]\u@\h\[\033[00m\]:\[\033[01;34m\]\w\[\033[00m\]\$ '

 

Hope this helps you customise your environment to your needs 😀 If you are using other helpful tools, feel free to share them in the comments!

 


Manage your GitHub gists with GistBox

If you are like me 😀 and understand that sharing your knowledge is something great, then this application might be for you. I don't have that many gists on GitHub yet, but getting them properly managed is something I wanted.

That's how I came across GistBox, which I personally think is great added value for anyone who has to manage gists. It gives you the option of creating labels, adding files, editing… and much more. If you use GitHub, you might want to give it a spin.

[Screenshot: the GistBox application]

 

 

It also has a cool Chrome add-on which enables you to quickly create gists on the fly!

 

Looking forward to hearing what else you are using 😀


Splunk – Network components diagram

If you have been using Splunk (or need to make a design for that system), it is handy to visualize how Splunk works and which ports you are looking at.

While browsing the Splunk forums I came across this great diagram (link to source; thanks for finding that, Matthijs).

[Diagram: Splunk network components and the ports they communicate on]

 

 

I think this is really helpful to see exactly how the components work with each other.


PowerShell – Automate multilanguage maintenance page with culture objects

Maintenance… doesn't that word just sound like fun in the world of IT people 🙂 I think it all depends on you and how you approach the situation.

I recently have been overseeing maintenance where we had a single maintenance page for one of my customers. And that would be just fine if not for the fact that within that single web page there were HTML containers for multiple languages and multiple countries.

OK, but still, where is the problem with that? For purposes of illustration I will try to come up with some text to give you an idea of what this looks like:

# -------- other code removed for visibility 

<p> On Monday August 17 2015 between 00:01 and 00:02 we will be undergoing maintenance.

Banana!
</p>

# --------- some other HTML here 

<p> El lunes 17 de agosto 2015 entre las 00:01 y las 00:02 estaremos experimentando mantenimiento.

¡Plátano! </p>

# --------- and again a lot of code here and there 

<p>Le lundi 17 Août 2015 0 heures 01-00h02 nous serons en cours de maintenance.

Banane! </p>


# --------- and again a lot of code here and there ... and so on and so on :O

 

Yey :O So I asked: OK, how do you generate that file? Unfortunately… this was manual work, which means someone would just go and modify those dates by hand, also using some Google Translate to get the appropriate day/month names.

 

We must automate this!

Yes! I cannot agree more that this cannot stay like that. I'm fine with the everlasting static code (although I think the whole page should be dynamically generated), but let's start small 🙂

So what can we do here… we must identify the moving parts. In our case the first moving part is the country. Then we can have multiple different locales per country. Example? Belgium… we can have Dutch, French, German and English. Lastly we identify the property of a locale, like day of week, month, etc.

Now our code in the source could look like this:

# -------- other code removed for visibility 

<p> On {US_en-US_DayOfWeek} {US_en-US_Month} {US_en-US_Day} {US_en-US_Year} between {US_en-US_StartDate} and {US_en-US_StopDate} we will be undergoing maintenance.

Banana!
</p>

# --------- some other HTML here 

<p> El {ES_es-ES_DayOfWeek} {ES_es-ES_Day} de {ES_es-ES_Month} {ES_es-ES_Year} entre las {ES_es-ES_StartDate} y las {ES_es-ES_StopDate} estaremos experimentando mantenimiento.

¡Plátano! </p>

# --------- and again a lot of code here and there 

<p>Le {FR_fr-FR_DayOfWeek} {FR_fr-FR_Day} {FR_fr-FR_Month} {FR_fr-FR_Year} {FR_fr-FR_StartDate}-{FR_fr-FR_StopDate} nous serons en cours de maintenance.

Banane! </p>


# --------- and again a lot of code here and there ... and so on and so on :O

 

So what have we done here? Well, we have introduced variables that allow us to modify the moving parts. Look at a single one: {ES_es-ES_DayOfWeek}

It is wrapped in '{ }', which allows for a quick search within the file contents. Inside, we have the country in capitals, followed by the locale, and lastly the property name.

All of those divided by '_'. Easy, isn't it?

 

Let the coding begin!

Since I want to avoid a situation where I would have 50 'if' or 'replace' statements in my code, I will code with the following in mind:

  • modularity of the code
  • ease of extending this

 

Great! So now we have the prepared file contents with our customized variables, and we need to figure out a way of putting them into the game 😀 First, the template has to be loaded, as sketched below.
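A minimal way to get the template into memory as a single string (the file name here is an assumption):

# read the whole template as one string so .Replace() works across lines
$maintanancePage = Get-Content -Path .\maintenance-template.html -Raw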

 

So let's see what is happening here.

Here I have created a hashtable array to be used within the script. As you can see, the country is the primary unique key for us. Each country can have multiple locales and maybe, in the future, other extra settings.

# we define our countries
$pageCountries= @(    @{country="NL";locale=@("nl-NL","en-US")}, `
                      @{country="BE";locale=@("nl-BE","fr-BE","de-DE","en-US")},`
                      @{country="FR";locale=@("fr-FR")},`
                      @{country="DE";locale=@("de-DE")},`
                      @{country="UK";locale=@("en-GB")},`
                      @{country="US";locale=@("en-US")} 
)

 

Next I define the time slot for this maintenance. On purpose I use a [datetime] object, as I like the fact of just passing values and not trying to parse from a string 🙂 At the moment of writing the duration applies to all countries, but as you can see it could be customized per country.

# maintenance start date 
$maintananceStartDate=[datetime]::new(2015,8,16,1,0,0) # year,month,day,hour,minute,second

# maintenance duration in hours (should be per country maybe?)
[int]$maintananceDuration = 4

# stop time is 
$maintananceStopDate = $maintananceStartDate.AddHours($maintananceDuration)

 

Next we iterate. We start with the countries, and then for each country we go through its locales:

# We start with each country
foreach($singleCountry in $pageCountries)
{
   
   # we then go for each locale
   foreach($languageLocale in $singleCountry.locale)
   {

 

From here we will be creating the customized values for the replacements. We start by getting the culture for the locale of the current iteration:

# get culture 
        $cultureCurrent = New-Object system.globalization.cultureinfo($languageLocale)

 

Having that, we go ahead and create our properties and assign their values accordingly. As you will notice later, just by adding properties here we auto-extend the possible scope of variables in the file…

# We define our props - the keys must match the property names used in the page placeholders
        $props = @{ DayOfWeek           = $cultureCurrent.DateTimeFormat.DayNames[ [int]$maintananceStartDate.DayOfWeek ]; 
                    Day                 = $maintananceStartDate.Day; 
                    Month               = $cultureCurrent.DateTimeFormat.MonthNames[ $maintananceStartDate.Month - 1 ]; # MonthNames is zero-based
                    Year                = $maintananceStartDate.Year;
                    StartDate           = $maintananceStartDate.ToString("t", $cultureCurrent); # locale-aware short time
                    StopDate            = $maintananceStopDate.ToString("t", $cultureCurrent) }

What is interesting here is that DayOfWeek comes from the culture's array of day names, indexed by the integer value of the DayOfWeek enum for our maintenance day. Month works the same way, except MonthNames is zero-based, hence the -1.

DayNames looks as follows in PS:

DayNames                         : {Sunday, Monday, Tuesday, Wednesday...}

 

Cool, so lastly we go into replacing mode 🙂 As mentioned a moment ago, just by adding a single property to that hashtable we get it auto-added. This is done by iterating over every named property in there:

 # We now iterate over each of the props and make the appropriate replacements
        foreach($item in $props.Keys) 
        {

 

And then there is not much left except actually replacing those values:

            $filter = "{" + [string]::Format('{0}_{1}_{2}',$singleCountry.country, $languageLocale, $item) + "}"

            Write-Host "Our filter is $filter" -ForegroundColor Yellow
            Write-Host "Target Value is $($props[ $item ] )" -ForegroundColor Yellow

            $maintanancePage = $maintanancePage.Replace( $filter, $props[ $item ] )
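Once every placeholder has been replaced, the only thing left is writing the page back out, for example (the output file name is an assumption):

# persist the localized maintenance page
$maintanancePage | Set-Content -Path .\maintenance.html -Encoding UTF8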

 

And that's IT! Automated, streamlined, and no longer prone to user error 😀

Future todo

At the moment I think it is good to have a basic version working before we start rumbling with changes 😀 I definitely will add Pester tests to it and make sure it can be customized further. I'm thinking of per-country advanced settings maybe…? We will see; I will keep you updated.

 

 

 


PowerShell – DSC checklist

As you remember, we are in the middle of the series on DSC module creation. I maybe should have mentioned this before, but better now than never. What I'm talking about… well, it's all about good practices within DSC. Some of them we already apply and some we will apply later in our DSC series.

As it is good to have it around, I decided to do a repost. Source: http://blogs.msdn.com/b/powershell/archive/2014/11/18/powershell-dsc-resource-design-and-testing-checklist.aspx

1. Resource module contains a .psd1 file and a schema.mof for every resource
2. Resource and schema are correct and have been verified using DscResourceDesigner cmdlets
3. Resource loads without errors
4. Resource is idempotent in the positive case
5. User modification scenario was tested
6. Get-TargetResource functionality was verified using Get-DscConfiguration
7. Resource was verified by calling Get/Set/Test-TargetResource functions directly
8. Resource was verified end to end using Start-DscConfiguration
9. Resource behaves correctly on all DSC supported platforms (or returns a specific error otherwise)
10. Resource functionality was verified on Windows Client (if applicable)
11. Get-DscResource lists the resource
12. Resource module contains examples
13. Error messages are easy to understand and help users solve problems
14. Log messages are easy to understand and informative (including -Verbose, -Debug and ETW logs)
15. Resource implementation does not contain hardcoded paths
16. Resource implementation does not contain user information
17. Resource was tested with valid/invalid credentials
18. Resource is not using cmdlets requiring interactive input
19. Resource functionality was thoroughly tested
20. Best practice: Resource module contains a Tests folder with a ResourceDesignerTests.ps1 script
21. Best practice: Resource folder contains a resource designer script for generating the schema
22. Best practice: Resource supports -WhatIf