PowerShell – Connect to TFS with alternative credentials

In this short post I will just show you how you can connect to your TFS server (from PowerShell) using alternative credentials, as it might be useful when you automate your infra to perform certain tasks. This script assumes you have the required TFS DLLs available on your machine. If not, you need to get them and load them at runtime.

This is how I load them in separate parts of my scripts:

        # load the required dll


Here is the script itself …
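Since the original snippets were hosted externally, here is a minimal sketch covering both parts – loading the client DLL and connecting with alternative credentials. The assembly path, server URL and account details are assumptions; adjust them to your environment.

```powershell
# Assumed install path of the TFS object model DLLs - adjust to your version/location
$tfsAssemblyDir = "${env:ProgramFiles(x86)}\Microsoft Visual Studio 12.0\Common7\IDE\ReferenceAssemblies\v2.0"

# load the required dll
Add-Type -Path (Join-Path $tfsAssemblyDir "Microsoft.TeamFoundation.Client.dll")

# alternative credentials we want to connect with (hypothetical values)
$credential = New-Object System.Net.NetworkCredential("svc_build","P@ssw0rd!","CONTOSO")

# connect to the collection using those credentials
$collectionUri = New-Object Uri("http://tfs.contoso.local:8080/tfs/DefaultCollection")
$tfsCollection = New-Object Microsoft.TeamFoundation.Client.TfsTeamProjectCollection($collectionUri, $credential)
$tfsCollection.EnsureAuthenticated()

Write-Host "Authenticated as $($tfsCollection.AuthorizedIdentity.DisplayName)"
```

`EnsureAuthenticated()` will throw if the credentials are rejected, so you get an immediate, clear failure when automating.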


PowerShell – Automate multilanguage maintenance page with culture objects

Maintenance … doesn't that word just sound like fun in the world of IT people 🙂 I think it all depends on you and how you set up your approach to the situation.

I have recently been overseeing maintenance where we had a single maintenance page for one of my customers. That would be just fine and without a problem, if not for the fact that within that single web page there were HTML containers for multiple languages and multiple countries.

Ok – but still? Where is the problem with that? For purposes of illustration I will try to come up with some text to give you an idea of what this looked like:

# -------- other code removed for visibility 

<p> On Monday August 17 2015 between 00:01 and 00:02 we will be undergoing maintenance.


# --------- some other HTML here 

<p> El lunes 17 de agosto 2015 entre las 00:01 y las 00:02 estaremos experimentando mantenimiento.

¡Plátano! </p>

# --------- and again a lot of code here and there 

<p>Le lundi 17 Août 2015 0 heures 01-00h02 nous serons en cours de maintenance.

Banane! </p>

# --------- and again a lot of code here and there ... and so on and so on :O


Yey :O So I asked – ok, how do you generate that file? … Unfortunately, this was manual work, which means someone would just go and modify those dates manually, also using some Google Translate to get the appropriate day/month names.


We must automate this!

Yes! I cannot agree more that this cannot stay like that. I'm fine with everlasting static code (although I think the whole page should be dynamically generated), however let's start small 🙂

So what can we do here? … We must identify the moving parts. In our case the first moving part is the country. Then we can have multiple different locales per country. Example? Belgium … we can have English, French and German. Lastly we identify the property of the locale, like day of week, month, etc.

Now our code in source could look like this:

# -------- other code removed for visibility 

<p> On {US_en-US_DayOfWeek} {US_en-US_Month} {US_en-US_Day} {US_en-US_Year} between {US_en-US_StartDate} and {US_en-US_StopDate} we will be undergoing maintenance.


# --------- some other HTML here 

<p> El {ES_es-ES_DayOfWeek} {ES_es-ES_Day} de {ES_es-ES_Month} {ES_es-ES_Year} entre las {ES_es-ES_StartDate} y las {ES_es-ES_StopDate} estaremos experimentando mantenimiento.

¡Plátano! </p>

# --------- and again a lot of code here and there 

<p>Le {FR_fr-FR_DayOfWeek} {FR_fr-FR_Day} {FR_fr-FR_Month} {FR_fr-FR_Year} {FR_fr-FR_StartDate}-{FR_fr-FR_StopDate} nous serons en cours de maintenance.

Banane! </p>

# --------- and again a lot of code here and there ... and so on and so on :O


So what have we done here? Well, we have introduced variables that will allow us to modify the moving parts. Look at a single one: {ES_es-ES_DayOfWeek}

We have it in '{ }', which will allow for a quick search within the contents of files. Then we have the country in capitals, followed by the locale, and lastly by the property name.

All of those are divided using '_'. Easy, isn't it?


Let the coding begins!

Since I want to avoid a situation where I would have 50 'if' or 'replace' statements in my code, I will code with the following in mind:

  • modularity of the code
  • ease of extending this


Great! So now we have already prepared the file contents with our customized variables, and we need to figure out a way of putting that into the game 😀


So let's see what happens here.

I have created my own hash array to be used within the script. As you can see, the country is the primary unique key for us. Each country can have multiple locales, and maybe in future extra other settings.

# we define our countries
$pageCountries= @(    @{country="NL";locale=@("nl-NL","en-US")}, `
                      @{country="BE";locale=@("nl-BE","fr-BE","de-DE","en-US")},`
                      @{country="FR";locale=@("fr-FR")},`
                      @{country="DE";locale=@("de-DE")},`
                      @{country="UK";locale=@("en-GB")},`
                      @{country="US";locale=@("en-US")} )


Next I defined the time slots for this maintenance. On purpose I used a [datetime] object, as I like the fact of just passing variables and not trying to parse from a string 🙂 At the moment of writing the duration is applied to all countries, but as you can see it could be customized for each country.

# maintenance start date 
$maintananceStartDate=[datetime]::new(2015,8,16,1,0,0) # year,month,day,hour,minute,second

# maintenance duration (should be per country maybe?)
[int]$maintananceDuration = 4

# stop time is 
$maintananceStopDate = $maintananceStartDate.AddHours($maintananceDuration)


Next we do iterations. We start off with the countries, and then for each of the countries we go through the country's locales:

# We start with each country
foreach($singleCountry in $pageCountries)
{
   # we then go for each locale
   foreach($languageLocale in $singleCountry.locale)
   {
       # ... body shown in the snippets below ...
   }
}


From here on we will be creating customized variables for the replacements. We start off by getting the locale culture for our current iteration:

# get culture 
        $cultureCurrent = New-Object system.globalization.cultureinfo($languageLocale)


Having that, we go ahead and create our properties and assign their values accordingly. Notice that later on, by just adding properties, we will be auto-extending the possible scope of variables in the file …

# We define our props (keys match the placeholder names, as String.Replace is case-sensitive)
        $props = @{ DayOfWeek           = $cultureCurrent.DateTimeFormat.DayNames[ [int]$maintananceStartDate.DayOfWeek ]; 
                    Day                 = $maintananceStartDate.Day; 
                    Month               = $cultureCurrent.DateTimeFormat.MonthNames[ $maintananceStartDate.Month - 1 ]; # MonthNames is zero-based
                    Year                = $maintananceStartDate.Year;
                    StartDate           = $maintananceStartDate.ToShortTimeString();
                    StopDate            = $maintananceStopDate.ToShortTimeString() }

What is interesting here is that the day of week comes from the array of localized day names for the selected language, indexed by the integer value of our maintenance date's DayOfWeek.

DayNames looks as follows in PS:

DayNames                         : {Sunday, Monday, Tuesday, Wednesday...}
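To see this in isolation, here is a quick sketch for es-ES – the DayOfWeek enum value (Sunday = 0, Monday = 1, …) indexes straight into the localized DayNames array, while MonthNames is zero-based and needs a `- 1`:

```powershell
# culture object for Spanish (Spain)
$culture = New-Object System.Globalization.CultureInfo("es-ES")

# Monday, 17 August 2015
$date = [datetime]::new(2015,8,17,0,1,0)

# DayOfWeek enum indexes straight into DayNames
$culture.DateTimeFormat.DayNames[ [int]$date.DayOfWeek ]   # -> lunes

# MonthNames is zero-based, hence Month - 1
$culture.DateTimeFormat.MonthNames[ $date.Month - 1 ]      # -> agosto
```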


Cool – so lastly we go into replacing mode 🙂 As mentioned just a moment ago, by adding a single property to that hashtable we get it auto-added. This is done by iterating over every named property in there:

 # We now need to iterate each of the props and make appropriate replacements
        foreach($item in $props.GetEnumerator() | Select-Object -ExpandProperty Name) 


And then there is not much left except replacing those values:

            $filter = "{" + [string]::Format('{0}_{1}_{2}',$singleCountry.country, $languageLocale, $item) + "}"

            Write-Host "Our filter is $filter" -ForegroundColor Yellow
            Write-Host "Target Value is $($props[ $item ] )" -ForegroundColor Yellow

            $maintanancePage = $maintanancePage.Replace( $filter, $props[ $item ] )
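Putting all the pieces together, a minimal end-to-end sketch (with the template trimmed to a single Spanish line, and hashtable keys spelled exactly like the placeholders, since String.Replace is case-sensitive) could look like this:

```powershell
# trimmed template: one localized line with our {Country_locale_Property} variables
$maintanancePage = "<p> El {ES_es-ES_DayOfWeek} {ES_es-ES_Day} de {ES_es-ES_Month} {ES_es-ES_Year} entre las {ES_es-ES_StartDate} y las {ES_es-ES_StopDate} estaremos experimentando mantenimiento. </p>"

$pageCountries = @( @{country="ES";locale=@("es-ES")} )

$maintananceStartDate = [datetime]::new(2015,8,17,1,0,0)
$maintananceStopDate  = $maintananceStartDate.AddHours(4)

foreach($singleCountry in $pageCountries)
{
    foreach($languageLocale in $singleCountry.locale)
    {
        $cultureCurrent = New-Object System.Globalization.CultureInfo($languageLocale)

        $props = @{ DayOfWeek = $cultureCurrent.DateTimeFormat.DayNames[ [int]$maintananceStartDate.DayOfWeek ];
                    Day       = $maintananceStartDate.Day;
                    Month     = $cultureCurrent.DateTimeFormat.MonthNames[ $maintananceStartDate.Month - 1 ]; # zero-based
                    Year      = $maintananceStartDate.Year;
                    StartDate = $maintananceStartDate.ToString("HH:mm");
                    StopDate  = $maintananceStopDate.ToString("HH:mm") }

        foreach($item in $props.GetEnumerator() | Select-Object -ExpandProperty Name)
        {
            $filter = "{" + [string]::Format('{0}_{1}_{2}', $singleCountry.country, $languageLocale, $item) + "}"
            $maintanancePage = $maintanancePage.Replace($filter, [string]$props[$item])
        }
    }
}

$maintanancePage   # -> "<p> El lunes 17 de agosto 2015 entre las 01:00 y las 05:00 ..."
```

Adding a new country is then only a matter of adding an entry to `$pageCountries` and the matching placeholders to the template.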


And that's IT! Automated, streamlined, no longer prone to user error 😀

Future todo

At the moment I think it is good to have a basic version working before we start rumbling with changes 😀 I will definitely add Pester tests to it and make sure it can be customized further. I'm thinking of advanced per-country settings maybe …? We will see – I will keep you updated.





PowerShell – build a complex module with a function dependency tree

Hey! So the time has come to share with you one of my recent achievements within automation. As you may have noticed in the subject of the post, we will be focusing on complex functions with dependencies on other functions within a module. If that were not enough … we will execute them remotely within a session. Isn't that just uber cool?

So in my last post I focused on a brief intro to what exporting of functions could look like. Today we start off with a definition of how we set up our functions and what we will use to build dependencies (of course keeping it all under control).

How do you build complex functions ?

This might sound trivial, however it is quite important to get this idea before you go ahead and build a module with 30 cmdlets and 3000 lines of code. From my DevOps experience I can tell you that wherever possible I start building from generic functions, which I then use in more specific functions (a form of wrappers one level above).

I think if we were to visualise it, we would get something along the lines of:



Someone just looking at it could say it looks really promising and professional 😀 Well, it does. It is all about low-level functions (the generic ones) working more or less with raw data objects without performing complex validations. The validations and remaining high-level functionality would be done by the high-level functions.


Sample code which I could use for executing such dependent functions could look as follows:

                # Required functions on the remote host
                $exportNewObjGeneric  = Export-FunctionRemote New-ObjGeneric                                                                                                                                             
                $exportAddProps       = Export-FunctionRemote Add-ObjProperties 
                $exportGetAdSite      = Export-FunctionRemote Get-MyAdSite

                $remoteFunctions = @($exportGetAdSite, $exportNewObjGeneric, $exportAddProps)

Invoke-Command -Session $Session -ScriptBlock {   
                    # we recreate each of the required functions
                    foreach($singleFunction in $using:remoteFunctions)
                    {
                        . ([ScriptBlock]::Create($singleFunction));
                    }
                    # ---- some other code doing magic -----
                }



Great! Since it sounds so easy to implement … where is the catch? … 😀 Of course there is one … look what could potentially happen if you just go ahead and start referencing functions within a module …



It will not take long before you lose track of what references what and where your dependencies are. This is just asking for trouble, as you will practically not be able to assess the consequences of your changes on one end. I can guarantee you would get a butterfly effect in this scenario!

You will quickly lose the ability to properly manage the code, and automation will become more of a nightmare than a pleasure!


I have seen that too – however a bit differently. I thought we could utilize something that every function of mine has – comment-based help!

Look at the example of a function below – for the purposes of this post I have amended the comment-based help.
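The original screenshot is not available in this copy of the post, so below is a hypothetical function illustrating the idea – the only special part is the RequiredFunction<...> tags tucked into the comment-based help:

```powershell
function Get-MyAdSiteReport
{
    <#
    .SYNOPSIS
        Builds a report object for the current AD site.
    .DESCRIPTION
        High-level function - depends on the generic helpers listed below.
        RequiredFunction<New-ObjGeneric>
        RequiredFunction<Add-ObjProperties>
        RequiredFunction<Get-MyAdSite>
    #>
    [CmdletBinding()]
    param()

    # ... function body doing the actual work ...
}
```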




Have you noticed anything special? … Well, if you didn't, let me explain …


Reference functions which are dependencies

Yes! This is the road to ultimate automation. I have come up with an idea which can be described as follows:

  • Functions can nest (have a dependency) only one level deep – so no dependency within a dependency within a dependency (but maybe you will come up with some more elegant way to overcome this 😀 )
  • Generic functions should not have dependencies on custom functions

With those 2 points I was able to continue with this solution. Therefore I have amended a bit of the code we used last time and came up with the following:
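The amended code was embedded as a gist in the original post; as a stand-in, here is a sketch of what such a Get-RemoteRequiredFunctions could look like – it scans the function's definition (comment-based help included) for RequiredFunction<Name> tags and exports each match as a recreatable definition string. The real version on GitHub may differ:

```powershell
function Get-RemoteRequiredFunctions
{
    [CmdletBinding()]
    param([Parameter(Mandatory=$true)][string]$functionName)

    # full text of the function, comment-based help included
    $definition = (Get-Command -Name $functionName -CommandType Function).Definition

    # pull out every RequiredFunction<Some-Name> tag
    $required = [regex]::Matches($definition, 'RequiredFunction<([\w-]+)>') |
                    ForEach-Object { $_.Groups[1].Value } |
                    Select-Object -Unique

    # export each dependency so it can be recreated in a remote session
    foreach($name in $required)
    {
        "function $name { $((Get-Command -Name $name -CommandType Function).Definition) }"
    }
}
```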


Now this function references one already discussed, called 'Export-FunctionRemote' (available @ GitHub).

So what do we get from the above? Well, we get something really great. In a controlled way, by decorating our function with comment-based help and specifying RequiredFunction<Some-FunctionName>, it will be considered a dependency in our script.




Finally – a usage example

So we need to use what we have just created. It won't take long to explain – this is the pure awesomeness of automation 🙂 …

                # Acquire function name
                $functionName = $MyInvocation.MyCommand
                # get remote functions into the remote script block
                Write-Verbose "Exporting functions for $functionName"
                $remoteFunctions = Get-RemoteRequiredFunctions -functionName $functionName


                Invoke-Command -Session $Session -ScriptBlock {   
                    # we recreate each of the required functions
                    foreach($singleFunction in $using:remoteFunctions)
                    {
                        . ([ScriptBlock]::Create($singleFunction));
                    }

                    # ---- do magic -----
                }




I hope you like the idea of automating your functions in a generic way to work with raw data and then using high-level functions to really utilize their potential. On top of that, you now also have a way to perform advanced operations by creating function dependencies.

Of course this is more than extendible – you can build dependency trees, do more complex unit testing with Pester … there are no limits 😀

Got feedback or an issue? As usual, leave a comment or jump to GitHub / Gists.



PowerShell – Local functions in remote sessions with ScriptBlock

I think all of us have been in a situation when we wanted to run local PowerShell functions in remote sessions. If someone is new to the subject, you would think there is no challenge here 😀

But wait … there is 😀 I have personally seen a couple of implementations of such 'workarounds', and in my opinion the most important thing is to choose the one that suits you.

Option 1: Pass the function in a script block

You can directly invoke something along the lines of:

Invoke-Command -ScriptBlock ${function:foo} -argumentlist "Bye!"

But when you do a lot of automation like I do, this doesn't do any good for me :O


Option 2: Use a ScriptBlock

The script block, you could say, will be our 'precious' 🙂 It seems a lot of people underestimate its potential. We should maybe first start off with how we define a ScriptBlock.

I think the easiest way to put it would be something along the lines of '… it's a container for statements and/or expressions which can accept parameters and return a value'.

If you would like to know more, please take a look at the TechNet page.

Now you would ask: but how does this help us with the remote creation of PowerShell functions? Ohhh, it really does! It opens the door to total automation …

Look at this simple example (the snippet below comes from StackOverflow):

$fooDef = "function foo { ${function:foo} }"

Invoke-Command -ComputerName someserver.example.com -ScriptBlock {
    . ([ScriptBlock]::Create($using:fooDef))

    Write-Host "You can call the function as often as you like:"
    foo "Bye"
    foo "Adieu!"
}

So what happened here? As we said a moment ago, a scriptblock is a container for expressions/statements and can accept input and return value(s) … So we have defined our container of statements as:

$fooDef = "function foo { ${function:foo} }"

It's perfectly valid to do so 😀

Next is the execution of our script in the remote session, where we recreate our scriptblock based on its definition:

. ([ScriptBlock]::Create($using:fooDef))

And voilà – from that moment you can use your function in the remote session as if it were local!


One step further – automate exporting of functions

I'm a bit excited writing about this: using the above we will be able to create amazing automations. We will focus on creating multilayer (dependent on each other) functions that we will automatically recreate when necessary.

But before we get there, we need to automate the exporting of our functions. For this we will use the following PS script:
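The script itself was embedded as a gist in the original post; a minimal sketch of such an Export-FunctionRemote could look like this (the real version on GitHub may differ):

```powershell
function Export-FunctionRemote
{
    <# Turns a local function into a definition string that can be
       recreated remotely with [ScriptBlock]::Create() #>
    [CmdletBinding()]
    param([Parameter(Mandatory=$true)][string]$Name)

    $command = Get-Command -Name $Name -CommandType Function -ErrorAction Stop

    # full "function X { ... }" text, ready for dot-sourcing on the other side
    "function $Name { $($command.Definition) }"
}
```

With that in place, building `$fooDef` from the earlier example becomes simply `Export-FunctionRemote foo`.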




PowerShell – Creating DSC module – Part 1


Hey ,

As you probably know, in the world of DevOps a neat feature such as DSC is a must-have and must-do, considering what it is capable of. Since I have recently been working a lot with web servers, I decided it took too long to get the appropriate configuration in place. Even though we had some legacy scripts doing the work, sometimes we had discrepancies which were really hard to solve (as finding the root cause of issues happening randomly on servers was fun).

So with this post I will be introducing you to the creation of a new DSC module, with best practices and experience from myself and people from all over the world (thanks, GitHub). In many cases I will be referring you to external sources of information, but the reason for that is simple – I don't want to duplicate the information contained there.

The repo used for this exercise is available at https://github.com/RafPe/cWebAdmin so feel free to comment/contribute, as it might be that you will find a better approach.

To be clear from the beginning, I will be using PSv5 as it offers me the most up-to-date access to cmdlets:

PS C:\Program Files\WindowsPowerShell\Modules> $PSVersionTable

Name                           Value                                                       
----                           -----                                                       
PSVersion                      5.0.10105.0                                                 
WSManStackVersion              3.0                                                         
CLRVersion                     4.0.30319.42000                                             
BuildVersion                   10.0.10105.0                                                
PSCompatibleVersions           {1.0, 2.0, 3.0, 4.0...}                                     
PSRemotingProtocolVersion      2.3                                                         


For creation of the resource we will be using a DSC module called xDscResourceDesigner. If you check that module you will see the latest updates and descriptions of its cmdlets.

If you download the tool, make sure it's available under $env:ProgramFiles\WindowsPowerShell\Modules

Once confirmed, you can just import the module by typing:

Import-Module xDSCResourceDesigner
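As a quick taste of the designer (the details follow in the next part), scaffolding a resource could look roughly like this – the resource and property names here are made up for illustration:

```powershell
# properties of our hypothetical resource
$name   = New-xDscResourceProperty -Name Name   -Type String -Attribute Key
$ensure = New-xDscResourceProperty -Name Ensure -Type String -Attribute Write -ValidateSet "Present","Absent"

# scaffold the resource folder, .psm1 and .schema.mof
New-xDscResource -Name RafPe_cWebSiteConfig `
                 -FriendlyName cWebSiteConfig `
                 -Property $name,$ensure `
                 -Path "$env:ProgramFiles\WindowsPowerShell\Modules\cWebAdmin"
```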

Plan your resource:

Before we go any further it is important to plan your DSC resource accordingly. Think "LEGO" here 🙂 Ahhh yes – the good days when you used your imagination to build skyscrapers and pirate ships. The two cases are much alike 😀 a lot of reusable pieces that interconnect and create amazing things.

Firstly we will prepare our folder structure. As you can see in the example below (taken from Microsoft's GitHub) we have:

  • The root folder with the important module manifest.
  • Then a folder containing our resources, called DSCResources.
  • Every resource has its own folder, and the folder name is the full name of the resource (in most cases this is longer than the friendly name).
  • Within each of the folders you have a script module (.psm1) and a schema (.schema.mof) file.

The decision "to bundle or not to bundle" belongs to you. Since I want to control the resources from a single module, I go for bundling.



It's all about your resources:

So in order to get going with DSC we will need our resources. Using the aforementioned xDscResourceDesigner we will be able to properly create them. In the next post I will go into the details of creating resources – their parameters and how to update them if necessary.


What's in DSC:

If we open a DSC module we will find 3 primary DSC TargetResource functions (every DSC resource will have them):

  • Get – this will retrieve the current state of whatever we need to validate
  • Set – this will set whatever we need to set
  • Test – this will validate whatever it is we need to validate and will return a true or false state

The diagram below shows the flow in more detail:

DSC flow

Test-TargetResource is essential, as this function is responsible for checking whether your target resource exists and is in the desired state. If it is, this function must return True, otherwise False. Writing this function properly will allow DSC to deliver the desired results.

The Set-TargetResource function makes the required changes on the target resource. It gets called only if the target resource is not in the desired state.

The Get-TargetResource function is used to retrieve the current state of the resource instance.
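Put together, the skeleton of a resource's .psm1 follows this shape – a bare-bones sketch, with the parameter names purely illustrative:

```powershell
function Get-TargetResource
{
    [CmdletBinding()]
    [OutputType([System.Collections.Hashtable])]
    param([Parameter(Mandatory=$true)][string]$Name)

    # retrieve and return the current state of the resource instance
    return @{ Name = $Name; Ensure = "Absent" }
}

function Test-TargetResource
{
    [CmdletBinding()]
    [OutputType([System.Boolean])]
    param([Parameter(Mandatory=$true)][string]$Name)

    # compare current state with desired state; $true means "already compliant"
    $current = Get-TargetResource -Name $Name
    return ($current.Ensure -eq "Present")
}

function Set-TargetResource
{
    [CmdletBinding()]
    param([Parameter(Mandatory=$true)][string]$Name)

    # make the changes that bring the resource into the desired state
    Write-Verbose "Configuring resource '$Name'"
}
```

The Local Configuration Manager calls Test first, and invokes Set only when Test returns $false.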



Links :

There are several links worth looking at (that I can recommend):

There are probably more (and newer ones), so drop a comment and I will update the list so people can find them.


Summary for part 1:

So this is it for today 🙂 In the next post we will dive into the details of:

  • Creating DSC resources and parameters for New-xDscResourceProperty
  • Creating module manifest
  • Talk about reusable scripting and what to avoid when writing scripts
  • …. and much more 🙂








IIS advanced logging – first approach


We start off with a quite popular subject, which is IIS Advanced Logging. This extra package provided by Microsoft enables, for example, logging of date/time with milliseconds, or logging CPU utilization.

Research and preparations:

So I came across the challenge of quickly coming up with a way of installing it and configuring the servers so we would get the required data. Before proceeding, I did some research and determined the following requirements to complete this challenge:

  1. Obtain MSI to install IIS advanced logging
  2. Determine log file locations
  3. Specify log fields used
  4. Mitigate problem with username in advanced IIS logging module
  5. Specify any extra fields used
  6. Specify fields order in log
  7. Specify any filters


1 – This is fairly simple: just download it from the Microsoft Download Center.

2 – This will be the folder where you want to keep your logs. I have not tried network locations, so folders were always local.

3 – From the available fields you can specify which ones you are interested in. I have chosen the ones that bring the most value in my environment.

4 – There is a known problem with the username in IIS Advanced Logging where the username is blank. I have of course come across this, and used the following PowerShell command to mitigate it:

# Reconfigure username property
Set-WebConfigurationProperty -pspath 'MACHINE/WEBROOT/APPHOST'  -filter "system.webServer/advancedLogging/server/fields/field[@id='UserName']" -name "sourceType" -value "BuiltIn"

5 – It might be that you need to log more than the standard fields available out of the box. That is fairly easy and requires the following code to be executed (in this instance I'm adding the X-Forwarded-Proto header to the available fields):

# Add custom header X-Forwarded-Proto to available logging fields
Add-WebConfiguration "system.webServer/advancedLogging/server/fields" -value @{id="X-Forwarded-Proto";sourceName="X-Forwarded-Proto";sourceType="RequestHeader";logHeaderName="X-Forwarded-Proto";category="Default";loggingDataType="TypeLPCSTR"} 

6 – The easiest thing to do is to use an array which will be processed in order of its items. I have defined mine as:

# Array of fields we want to be logging 
$RequIisAdvLogFields = "Date-UTC","Time-UTC","Client-IP","Host","Referer","User Agent","Bytes Received","Method","URI-Querystring","URI-Stem","UserName","Protocol Version","BeginRequest-UTC","EndRequest-UTC","Time Taken","Bytes Sent","Status","Substatus","Server-IP","Server Port","Win32Status","X-Forwarded-For","X-Forwarded-Proto","CPU-Utilization"

7 – This is work in progress; I will come back to this part.



Since I have defined my objectives and know what I'm working towards, it is easy to create the script. As I would not like to duplicate the content, I will just outline the steps here; the complete script you will find at the bottom of the page (hosted on GitHub).

  1. I define folder log location and frequency of log rotation
  2. We check if folder exists – if it does not we create it
  3. We need to assign custom permissions in order to be able to save to that folder (the IIS_IUSRS group needs to have write access)
  4. Reconfigure username property
  5. Define required fields in log
  6. Get server available fields
  7. Next, for each website we will add a log definition (this is something you may want to change)
  8. We delay commits of changes – this is quite important, as if you don't do that the script will take a really long time to execute
    # Begin delaying the commit of changes
    Start-WebCommitDelay
    # Do some work and change configs 
    Stop-WebCommitDelay -Commit $true


  9. For each of the available fields, we add it to our defined log definition
  10. Then as the last steps I disable standard logging and IIS Advanced Logging on the server level. Lastly, I re-enable the advanced logging.
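The steps above, stripped down, could be sketched as follows – the paths are assumptions, and the per-site handling is elided; the full version lives in the GitHub repo:

```powershell
Import-Module WebAdministration

# 1. log location
$logDir = "D:\Logs\AdvancedLogging"

# 2. create the folder if it does not exist
if (-not (Test-Path $logDir)) { New-Item -Path $logDir -ItemType Directory | Out-Null }

# 3. grant IIS_IUSRS write access
$acl  = Get-Acl $logDir
$rule = New-Object System.Security.AccessControl.FileSystemAccessRule(
            "IIS_IUSRS","Write","ContainerInherit,ObjectInherit","None","Allow")
$acl.AddAccessRule($rule)
Set-Acl -Path $logDir -AclObject $acl

# 8. batch all IIS config changes into a single commit
Start-WebCommitDelay

# 4. fix the blank UserName field
Set-WebConfigurationProperty -pspath 'MACHINE/WEBROOT/APPHOST' `
    -filter "system.webServer/advancedLogging/server/fields/field[@id='UserName']" `
    -name "sourceType" -value "BuiltIn"

# ... steps 5-7, 9-10: add custom fields, build log definitions per site,
#     disable standard logging, enable advanced logging ...

Stop-WebCommitDelay -Commit $true
```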


The whole script is available on GitHub, so you are more than welcome to contribute. There are points which still require work, definitely better error control and adding filtering. However, I will be actively developing this to have a fully operational tool for IIS advanced configuration.