
PowerShell – build a complex module with a function dependency tree

Hey! So the time has come to share with you one of my recent achievements in automation. As you may have noticed in the subject of the post, we will be focusing on complex functions with dependencies on other functions within a module. If that were not enough … we will execute them remotely within a session. Isn't that just uber cool?

In my last post I focused on a brief intro to what exporting of functions could look like. Today we start off with a definition of how we set up our functions and what we will use to build dependencies (of course keeping it all under control).

How do you build complex functions?

This might sound trivial, however it is quite important to get this idea before you go ahead and build a module with 30 cmdlets and 3000 lines of code. From my DevOps experience I can tell you that wherever possible I start by building generic functions, which I then use in more specific functions (a form of wrapper one level above).

If we were to visualise it, we would get something along the lines of:

[Diagram: generic low-level functions at the base, with more specific high-level wrapper functions built on top]

 

Someone just looking at it could say it looks really promising and professional 😀 Well, it does. The idea is for the low-level functions (the generic ones) to work more or less with raw data objects without performing complex validations. Those validations and the remaining high-level functionality are done by the high-level functions.
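To make that split concrete – a minimal sketch (New-ServerObject is a hypothetical wrapper; New-ObjGeneric mirrors the generic function used later in this post):

# Low-level generic function: works with raw data, no complex validation
function New-ObjGeneric {
    param([string]$Name)

    New-Object -TypeName PSObject -Property @{ Name = $Name }
}

# High-level wrapper: performs the validation, then delegates to the generic function
function New-ServerObject {
    param([Parameter(Mandatory)][string]$ComputerName)

    if ($ComputerName -notmatch '^[a-zA-Z0-9-]+$') { throw "Invalid computer name: $ComputerName" }

    New-ObjGeneric -Name $ComputerName
}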

 

Sample code for executing such dependent functions could look as follows:

# Required functions on the remote host
$exportNewObjGeneric = Export-FunctionRemote New-ObjGeneric
$exportAddProps      = Export-FunctionRemote Add-ObjProperties
$exportGetAdSite     = Export-FunctionRemote Get-MyAdSite

$remoteFunctions = @($exportGetAdSite, $exportNewObjGeneric, $exportAddProps)

Invoke-Command -Session $Session -ScriptBlock {

    # we recreate each of the required functions
    foreach($singleFunction in $using:remoteFunctions)
    {
        . ([ScriptBlock]::Create($singleFunction));
    }

    # ---- some other code doing magic -----
}


Great! Since it sounds so easy to implement … where is the catch? 😀 Of course there is one … look what could happen if you just go ahead and start referencing functions within a module …

[Diagram: a tangle of functions cross-referencing each other within a module]

 

It will not take long before you lose track of what references what and where your dependencies are. This is just asking for trouble, as you will practically not be able to assert what the consequences of your changes on one end will be. I can guarantee you would get a butterfly effect in this scenario!

You will quickly lose the ability to properly manage the code, and automation will become more of a nightmare than a pleasure!

 

I have seen that too – however a bit differently. I thought we could utilize something that every function of mine has – comment-based help!

Look at the example function below – for the purposes of this post I have amended the comment-based help.

[Screenshot: a function whose comment-based help declares its required functions]


Have you noticed anything special? … Well, if you didn't, let me explain …

 

Reference functions which are dependencies

Yes! This is the road to ultimate automation. I have come up with an idea which can be described as follows:

  • Functions can nest (have a dependency) only one level deep – so no dependency within a dependency within a dependency (but maybe you will come up with a more elegant way to overcome this 😀 )
  • Generic functions should not have dependencies on custom functions

With those 2 points I was able to continue with this solution. I have therefore amended a bit of the code we used last time and came up with the following:

 

This function references one we have already discussed, called ‘Export-FunctionRemote’ (available on GitHub).

So what do we get from the above? Well, something really great. In a controlled way, by decorating our function with comment-based help and specifying RequiredFunction<Some-FunctionName>, it will be considered a dependency in our script.

    <#
        .SYNOPSIS
        Do xyz 

        .FUNCTIONALITY
        RequiredFunction<Get-NobelPrice>
    #>
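The cmdlet that picks this tag up – Get-RemoteRequiredFunctions, used in the example below – lives in the module, but a minimal sketch of the idea (assuming the help block sits inside the function body, so it is part of the function's definition text) could look like this:

function Get-RemoteRequiredFunctions
{
    param([Parameter(Mandatory)][string]$functionName)

    # the definition text of a function includes its comment-based help block
    $definition = (Get-Command -Name $functionName -CommandType Function).Definition

    # pick up every RequiredFunction<Name> tag and export the named dependency
    [regex]::Matches($definition, 'RequiredFunction<([^>]+)>') |
        ForEach-Object { Export-FunctionRemote $_.Groups[1].Value }
}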

 

Finally – a usage example

So we need to use what we have just built. I won't take long to explain – this is the pure awesomeness of automation 🙂 …

# Acquire function name
$functionName = $MyInvocation.MyCommand

# get remote functions into the remote script block
Write-Verbose "Exporting functions for $functionName"
$remoteFunctions = Get-RemoteRequiredFunctions -functionName $functionName

Invoke-Command -Session $Session -ScriptBlock {

    # we recreate each of the required functions
    foreach($singleFunction in $using:remoteFunctions)
    {
        . ([ScriptBlock]::Create($singleFunction));
    }

    # ---- do magic -----
}


Summary

I hope you like the idea of writing your functions generically to work with raw data and then using high-level functions to really utilize their potential. On top of that, you now have a way to perform advanced operations by creating function dependencies.

Of course this is more than extendible – you can build dependency trees, do more complex unit testing with Pester … there are no limits 😀

Got feedback or an issue? As usual, leave a comment or jump to GitHub / Gists.

 


PowerShell – DSC checklist

As you remember, we are in the middle of the series on DSC module creation. I probably should have mentioned this before, but better late than never. What am I talking about? Well, it's all about good practices within DSC. Some of them we already apply and some we will apply in our DSC series.

As it is good to have it around, I decided to do a repost. Source: http://blogs.msdn.com/b/powershell/archive/2014/11/18/powershell-dsc-resource-design-and-testing-checklist.aspx

  1. Resource module contains .psd1 file and schema.mof for every resource
  2. Resource and schema are correct and have been verified using DscResourceDesigner cmdlets
  3. Resource loads without errors
  4. Resource is idempotent in the positive case
  5. User modification scenario was tested
  6. Get-TargetResource functionality was verified using Get-DscConfiguration
  7. Resource was verified by calling Get/Set/Test-TargetResource functions directly
  8. Resource was verified end to end using Start-DscConfiguration
  9. Resource behaves correctly on all DSC supported platforms (or returns a specific error otherwise)
  10. Resource functionality was verified on Windows Client (if applicable)
  11. Get-DscResource lists the resource
  12. Resource module contains examples
  13. Error messages are easy to understand and help users solve problems
  14. Log messages are easy to understand and informative (including -Verbose, -Debug and ETW logs)
  15. Resource implementation does not contain hardcoded paths
  16. Resource implementation does not contain user information
  17. Resource was tested with valid/invalid credentials
  18. Resource is not using cmdlets requiring interactive input
  19. Resource functionality was thoroughly tested
  20. Best practice: Resource module contains a Tests folder with a ResourceDesignerTests.ps1 script
  21. Best practice: Resource folder contains the resource designer script for generating the schema
  22. Best practice: Resource supports -WhatIf
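For points 4 and 8 a quick sanity check is to apply the configuration and test it again – the second pass should report the system is already in the desired state (a sketch, assuming the compiled MOF sits in C:\DSC\MyConfig):

# apply the configuration end to end
Start-DscConfiguration -Path 'C:\DSC\MyConfig' -Wait -Verbose

# an idempotent resource should now report True
Test-DscConfiguration -Verbose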


PowerShell – Local functions in remote sessions with ScriptBlock

I think all of us have been in a situation where we wanted to run local PowerShell functions in remote sessions. If you are new to the subject, you would think there is no challenge here 😀

But wait … there is 😀 I have personally seen a couple of implementations of such ‘workarounds’, and in my opinion the most important thing is to choose the one that suits you.

Option 1: Pass the function in a script block

You can directly invoke something along the lines of:

Invoke-Command -ComputerName someserver.example.com -ScriptBlock ${function:foo} -ArgumentList "Bye!"
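For this to work the function must of course exist in your local session first – a trivial stand-in for this post could be:

function foo {
    param($message)
    Write-Host "foo says: $message"
}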

But when you do a lot of automation like I do, this doesn't really cut it :O

 

Option 2: Use ScriptBlock

ScriptBlock, you could say, will be our ‘precious‘ 🙂 It seems a lot of people underestimate its potential. We should maybe first start off with how we define a ScriptBlock.

I think the easiest way to put it would be something along the lines of ‘… it's a container for statements and/or expressions which can accept parameters and return a value’.

If you would like to know more, please take a look at the TechNet page.
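In code that definition reads quite naturally – a quick illustration:

# a container of statements which accepts parameters and returns a value
$multiply = {
    param([int]$a, [int]$b)
    $a * $b
}

& $multiply 6 7   # invokes the scriptblock and returns 42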

Now you might ask: how does this help us with remote creation of PowerShell functions? Oh, it really does! It opens the door to total automation …

Look at this simple example (the snippet below comes from Stack Overflow):

$fooDef = "function foo { ${function:foo} }"

Invoke-Command -ComputerName someserver.example.com -ScriptBlock {
    . ([ScriptBlock]::Create($using:fooDef))

    Write-Host "You can call the function as often as you like:"
    foo "Bye"
    foo "Adieu!"
}

 

So what happened here? As we said a moment ago, a scriptblock is a container for expressions/statements and can accept input and return value(s) … So we have defined our container of statements as

$fooDef = "function foo { ${function:foo} }"

It's perfectly valid to do so 😀

Next comes the execution of our script in the remote session, where we recreate our scriptblock based on its definition

. ([ScriptBlock]::Create($using:fooDef))

And voilà – from that moment on you can use your function in the remote session as if it were local!

 

One step further – automate exporting of functions

I'm a bit excited writing about this, because using the above we will be able to create amazing automations. We will focus on creating multilayer functions (dependent on each other) that we will automatically recreate when necessary.

But before we get there we need to automate exporting of our functions. For this we will use the following PS script:
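(The full script is embedded from GitHub; as a minimal sketch, following the wrapping pattern shown above, Export-FunctionRemote boils down to something like this:)

function Export-FunctionRemote
{
    param([Parameter(Mandatory)][string]$FunctionName)

    # wrap the local function body back into a complete definition string
    $definition = (Get-Command -Name $FunctionName -CommandType Function).Definition
    "function $FunctionName { $definition }"
}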



PowerShell – Creating DSC module – Part 2

Hey,

Without any further delay we will continue our exercise to write up a DSC module containing multiple resources. As this is a multi-part post, you might want to jump back to the first post to read up on preparations and considerations.

 

Objectives for this post:

In this particular post we will focus on creating DSC resource properties and modifying our root module manifest. Source code in ps1 format for what we will be doing today is available as usual 🙂 on GitHub (https://github.com/RafPe/cWebAdmin/blob/master/src/createDscResources.ps1), so if you would like to help or have feedback or suggestions, leave a comment or just fork the script and request a pull 😀

If you are new to this series of posts: I'm aiming at a fully operational web app pool and website admin DSC module (at the moment of writing it will support only IIS 8.5). If you would say “Hey! But there is already one from MS available even on GitHub! So why reinvent the wheel?” then my answer would be ‘The ones available do not perform the scope of configurations I want – therefore I'm creating one to fulfill my technical needs, and on the other hand I get a nice tech exercise!‘

 

Investigate your target resource properties:

Let's go ahead and see what's available for us. As usual there are multiple ways to achieve your goal 😀 – it's all about selecting the optimal one. In the majority of cases you could go ahead and investigate the GUI to see which ones are available. Something along the lines of:

[Screenshot: IIS Manager – application pool advanced settings]

 

Hmmmm … but really – how much time would it take you to compile a list of all available properties? So let's go ahead and do it a bit differently.

$props = Get-Item IIS:\AppPools\<YourAppPoolName> | Get-Member -MemberType NoteProperty

That gives you output more or less similar to the following:

[Screenshot: Get-Member output listing the app pool NoteProperty members]

 

So is that all? 🙂 Of course it isn't – we are missing quite a lot. The remaining properties are hidden in child element collections. We get to them by calling:

(Get-Item IIS:\AppPools\<YourAppPoolName>).ChildElements

And that gives you the following output :

[Screenshot: ChildElements output for the application pool]

 

And you access the properties of each collection by calling

(Get-Item IIS:\AppPools\<YourAppPoolName>).ChildElements['processmodel'].Attributes | select Name,TypeName

and our output will look similar to the following:

[Screenshot: processModel attributes with Name and TypeName]


Fine 🙂 we have our resources, and we more or less know their types (keep in mind that some of the int types here are enums within the GUI).
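If you want the full picture in one go, you can also walk every child element collection at once (a sketch – it assumes each element exposes its attributes the same way processModel does, and that the schema name identifies the element):

# dump attribute names and types for every child element collection
(Get-Item IIS:\AppPools\<YourAppPoolName>).ChildElements | ForEach-Object {
    $element = $_
    $element.Attributes | Select-Object @{ n='Element'; e={ $element.Schema.Name } }, Name, TypeName
}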

 

Get me the resource!

Ok – so you are hungry for coding 😀 I totally understand – so let's discuss the cmdlet that will be playing the main role here for us: New-xDscResourceProperty.

This cmdlet has a couple of parameters which we need to discuss. Let's take a look at an example of usage:

$managedRuntimeVersion = New-xDscResourceProperty -Name managedRuntimeVersion -Type string -Attribute Write -ValidateSet "v4.0","v2.0",""

As you can see, we specify a couple of important switches:

  1. Name: I think this one is self-explanatory 🙂 What you need to keep in mind is that a single DSC resource CANNOT HAVE 2 properties with the same name (but you can imagine why 🙂 )
  2. Type: defines what type the property we will be working with is
  3. Attribute:
    [ Key ]: signals that the property uniquely identifies the resource
    [ Write ]: means we can assign a value in our configuration
    [ Read ]: indicates that the property cannot be changed, nor can we assign a value to it
  4. Description: allows you to describe the property
  5. Values and ValuesMap: restrict possible property values to those specified in ValuesMap – see the sketch right after this list
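Putting a few of those switches together – a short sketch of a Key property next to a described Write property (the same $name and $autoStart variables appear in the bigger script below):

# Key property uniquely identifying the resource instance
$name      = New-xDscResourceProperty -Name Name -Type String -Attribute Key -Description 'Name of the application pool'

# Write property with a restricted value set and a description
$autoStart = New-xDscResourceProperty -Name autoStart -Type String -Attribute Write -ValidateSet 'true','false' -Description 'Should the pool start automatically'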

 

It’s coding time 😀

 

Now that you know how to go about this – let's go ahead and create our resource properties and finally our resource. The decision of how to tackle all the properties and create each one I leave up to you. At the moment of writing this post my script had all of them defined (however this could have changed 🙂 )

Since I want to create multiple resource properties I define them as following :

   $autoStart             = New-xDscResourceProperty -Name autoStart -Type String -Attribute Write
   $enable32BitAppOnWin64 = New-xDscResourceProperty -Name enable32BitAppOnWin64 -Type string -Attribute Write -ValidateSet "true","false"
   $managedRuntimeVersion = New-xDscResourceProperty -Name managedRuntimeVersion -Type string -Attribute Write -ValidateSet "v4.0","v2.0",""
   $managedRuntimeLoader  = New-xDscResourceProperty -Name managedRuntimeLoader -Type string -Attribute Write

# -------------- Rest is removed for clarity of this example
# -------------- script is available @ https://github.com/RafPe/cWebAdmin/blob/master/src/createDscResources.ps1

 

Next, to ease assigning them to the new resource, I create an array:

$xDscProperties = @(
    $name,
    # -------------------------- Removed for visibility
    $numaNodeAffinityMode
)

 

And once all of that is done we can go ahead and finally create our new resource! In my case this is a resource to manage web app pool default settings.

# Create resource that will be defining defaults for the application pool
New-xDscResource -Name RafPe_cWebAppPoolDefaults `
                 -FriendlyName cWebAppPoolDefaults `
                 -ModuleName cWebAdmin `
                 -Path 'C:\Program Files\WindowsPowerShell\Modules' `
                 -Property $xDscProperties -Verbose

As you can see, I'm prefixing the resource name with RafPe_ because a lot of other people could come up with the same name. The FriendlyName, though, I'm choosing to be cWebAppPoolDefaults, as that's the way I want the resource to show up. Next we define our root module name, which in my case is cWebAdmin, followed by Path, which in this case points to the default modules folder. And of course the most important part – I specify the resource properties which we have just defined a moment ago.

Once you are happy with your module just hit enter! After a couple of seconds your new module should be ready and good to go!

You should now have your module already available for import .

[Screenshot: the new module visible and available for import]

 

Modify root DSC module manifest:

What's left now – we need to modify the module manifest. The easiest way (and a lot of people do it) is to copy one of the already existing ones and just change it a bit. What is extremely important is to generate your own new GUID.

The module manifest for the root module we are working with looks as follows:

#
# Module manifest for module 'cWebAdmin'
# Author RafPe

@{

# Version number of this module.
ModuleVersion = '1.0'

# ID used to uniquely identify this module
GUID = 'f3e1b30a-9292-4ca3-a1f1-572bd64cf460'

# Author of this module
Author = 'RafPe'

# Company or vendor of this module
CompanyName = 'RafPe'

# Copyright statement for this module
Copyright = '(c) 2015 RafPe. All rights reserved.'

# Description of the functionality provided by this module
Description = 'Module with DSC Resources for Web Administration'

# Minimum version of the Windows PowerShell engine required by this module
PowerShellVersion = '4.0'

# Minimum version of the common language runtime (CLR) required by this module
CLRVersion = '4.0'

# Private data to pass to the module specified in RootModule/ModuleToProcess. This may also contain a PSData hashtable with additional module metadata used by PowerShell.
PrivateData = @{

    PSData = @{

    # A URL to the license for this module.
    LicenseUri = 'https://rafpe.ninja'

    # A URL to the main website for this project.
    ProjectUri = 'https://rafpe.ninja'

    } # End of PSData hashtable

} # End of PrivateData hashtable

# Functions to export from this module
FunctionsToExport = '*'

# Cmdlets to export from this module
CmdletsToExport = '*'
}

 

And how do you generate a GUID? Simple … use .NET methods to quickly do that:

[guid]::NewGuid().Guid

 

Summary:

Well, that's it for today 🙂 We have narrowed down our resources and evaluated their types. We discussed how to create a resource property and what parameters we can use for that. And lastly we created our module and modified its manifest. So I think we are on a really good track to start writing our resource methods in the next post in this series.

 

Stay tuned for more! Catch up with all changes / comment / pull -> on GitHub (https://github.com/RafPe/cWebAdmin) and till then!



PowerShell – Creating DSC module – Part 1


Hey,

As you probably know, in the world of DevOps a neat feature like DSC is a must-have, considering what it is capable of. Since recently I have been working a lot with webservers, I decided it took too long to get the appropriate configuration in place. Even though we had some legacy scripts doing the work, we sometimes had discrepancies which were really hard to solve (as finding a root cause that occurred randomly on servers was fun).

So with this post I will be introducing you to the creation of a new DSC module, with best practices and the experience of myself and people from all over the world (thanks GitHub). In many cases I will be referring you to external sources of information, but the reason for that is simple – I don't want to duplicate the information contained there.

The repo used for this exercise is available under https://github.com/RafPe/cWebAdmin so feel free to comment/contribute, as it might be that you will find a better approach.

To be clear from the beginning, I will be using PSv5 as it offers me the most up-to-date access to cmdlets:

PS C:\Program Files\WindowsPowerShell\Modules> $PSVersionTable

Name                           Value                                                       
----                           -----                                                       
PSVersion                      5.0.10105.0                                                 
WSManStackVersion              3.0                                                         
SerializationVersion           1.1.0.1                                                     
CLRVersion                     4.0.30319.42000                                             
BuildVersion                   10.0.10105.0                                                
PSCompatibleVersions           {1.0, 2.0, 3.0, 4.0...}                                     
PSRemotingProtocolVersion      2.3                                                         

 

For creation of the resource we will be using a DSC module called xDscResourceDesigner. If you check that module you will see the latest updates and descriptions of its cmdlets.

If you download the tool, make sure it's available under $env:ProgramFiles\WindowsPowerShell\Modules

Once confirmed you can just import the module by typing :

Import-Module xDSCResourceDesigner

Plan your resource:

Before we go any further it is important to plan your DSC resource accordingly. Think “LEGO” here 🙂 Ahhh yes – the good old days when you used your imagination to build skyscrapers and pirate ships. The two cases are much alike 😀 a lot of reusable pieces that interconnect and create amazing things.

Firstly we will prepare our folder structure. As you can see in the example below (taken from Microsoft GitHub) we have:

xNetworking
   xNetworking.psd1
   DSCResources
       MSFT_xDNSServerAddress
           MSFT_xDNSServerAddress.psm1
           MSFT_xDNSServerAddress.schema.mof
       MSFT_xIPAddress
           MSFT_xIPAddress.psm1
           MSFT_xIPAddress.schema.mof
  • the root folder with the important module manifest,
  • a folder containing our resources, called DSCResources,
  • every resource has its own folder, and the folder name is the full name of the resource (in most cases this is longer than the friendly name),
  • within each of those folders there is a script module (.psm1) and a schema (.schema.mof) file.

The decision “To bundle or not to bundle” belongs to you. Since I want to control the resources from a single module, I go for the bundle.
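To scaffold that layout for the module in this series, something along these lines does the trick (a sketch – RafPe_cWebAppPoolDefaults is the resource we will create in part 2):

$moduleRoot = "$env:ProgramFiles\WindowsPowerShell\Modules\cWebAdmin"

# the root folder holds the manifest; each resource gets its own folder under DSCResources
New-Item -ItemType Directory -Path "$moduleRoot\DSCResources\RafPe_cWebAppPoolDefaults" -Force | Out-Null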


It's all about your resources:

In order to get going with DSC we will need our resources. Using the aforementioned xDscResourceDesigner we will be able to properly create them. In the next post I will go into the details of creating resources – their parameters and how to update them if necessary.

 

What's in DSC:

If we open a DSC module we will find 3 primary TargetResource functions (every DSC resource will have them):

  • Get – this will get whatever we need to validate
  • Set – this will set whatever we need to set
  • Test – this will validate whatever it is we need to validate and will return a true or false state

The diagram below shows the flow in more detail:

[Diagram: DSC Get/Set/Test-TargetResource flow]

Test-TargetResource is essential, as this function is responsible for checking whether your target resource is in the desired state. If it is, this function must return True, otherwise False. Writing this function properly will allow DSC to deliver the desired results.

Set-TargetResource makes the required changes on the target resource. It gets called only if the target resource is not in the desired state.

The Get-TargetResource function is used to retrieve the current state of the resource instance.
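As a bare skeleton (a sketch – the real parameters come from your resource schema) the three functions look like this:

function Get-TargetResource
{
    param([Parameter(Mandatory)][string]$Name)

    # return a hashtable describing the current state of the instance
    @{ Name = $Name; Ensure = 'Present' }
}

function Test-TargetResource
{
    param([Parameter(Mandatory)][string]$Name)

    # return $true when the resource is already in the desired state
    $false
}

function Set-TargetResource
{
    param([Parameter(Mandatory)][string]$Name)

    # make the changes required to bring the resource to the desired state
}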


Links :

There are several links worth looking at (that I can recommend):

There are probably more (and newer ones), so drop a comment and I will update the list so people can find them.

 

Summary for part 1:

So this is it for today 🙂 In the next post we will dive into the details of:

  • Creating DSC resources and parameters for New-xDscResourceProperty
  • Creating module manifest
  • Talk about reusable scripting and what to avoid when writing scripts
  • …. and much more 🙂

 

Thanks!

RafPe



x509Certificate – System.Security.Cryptography.CryptographicException “Object was not found”

Hey,

So recently I have been working with JSON Web Token authentication and wanted to take an extra step with security. I decided to sign my tokens with certificates.

So without any further delay I happily placed the certificate within my storage location (for the sake of this post let's say it was the local filesystem) and created a simple method to create my object from the byte array of that certificate and my password.

byte[] binaryData = new byte[1];
// ... Removed for code visibility - binaryData contains raw certificate byte array 

var cert          = new X509Certificate2(binaryData, password);

The problem :

However, when I tried to invoke the ctor on X509Certificate2, passing my raw array of certificate bytes, I received a nasty error saying:

System.Security.Cryptography.CryptographicException
Object was not found.
at System.Security.Cryptography.CryptographicException.ThrowCryptographicException(Int32 hr)
at System.Security.Cryptography.X509Certificates.X509Utils._LoadCertFromBlob(Byte[] rawData, IntPtr password, UInt32 dwFlags, Boolean persistKeySet, SafeCertContextHandle& pCertCtx)
at System.Security.Cryptography.X509Certificates.X509Certificate.LoadCertificateFromBlob(Byte[] rawData, Object password, X509KeyStorageFlags keyStorageFlags)
at System.Security.Cryptography.X509Certificates.X509Certificate2..ctor(Byte[] rawData, String password)
//my code here

 

Tackling the challenge:

The solution to the problem starts with understanding what's going on in this instance.

To give you more details: the same problem occurred on my local development environment and on my designated Azure WebApp.

My local website has a dedicated application pool with a specified domain user which the app pool uses as its identity.

It appears that even though I was loading the certificate from a byte[], the underlying Windows Cryptographic Service Provider tried to use the user store, and since my application pool account's profile was not loaded, a cryptographic context was not available.

So initially it seems like setting Load User Profile to true solves the problem. But wait …? Does it really?

What happens when you change that setting? Well, the ApplicationPool calls LoadProfile, and all the related implications of doing that follow. This of course includes possible security vulnerabilities, performance costs, etc.

Other approach:

* this will also work in Azure WebApp *

The X509Certificate2 ctor has extra flags (X509KeyStorageFlags) that can be used. If you investigate them you will notice one particularly interesting:

MachineKeySet – the key is written to a folder owned by the machine.

var cert = new X509Certificate2(bytes, password, X509KeyStorageFlags.MachineKeySet);

More info is available under this link to a great post that discusses this in detail.

 

Good practice:

It's good to clean up after yourself. If you have read the aforementioned blog you will find more info about temp files left behind when using byte[] within the X509Certificate ctor.

So I have adapted the method mentioned there and now use:

var file = Path.Combine(Path.GetTempPath(), "rafpe-" + Guid.NewGuid());
try
{
    File.WriteAllBytes(file, bytes);
    return new X509Certificate2(file, password, X509KeyStorageFlags.MachineKeySet);
}
finally
{
    File.Delete(file);
}

 

Happy coding 😀


IIS advanced logging – first approach

Hello,

We start off with quite a popular subject, which is IIS Advanced Logging. This extra package provided by Microsoft enables, for example, logging date/time with milliseconds or logging CPU utilization.

Research and preparations:

I came across the challenge of quickly coming up with a way of installing it and configuring the servers so we would get the required data. Before proceeding I did some research and determined the following requirements to complete this challenge:

  1. Obtain MSI to install IIS advanced logging
  2. Determine log file locations
  3. Specify log fields used
  4. Mitigate problem with username in advanced IIS logging module
  5. Specify any extra fields used
  6. Specify fields order in log
  7. Specify any filters

 

1 – This is as simple as downloading it from Microsoft Download.

2 – This will be the folder where you want to keep your logs. I have not tried network locations – my folders were always local.

3 – From the available fields you can specify which ones you are interested in. I have chosen the ones that bring the most value in my environment.

4 – There is a known problem with the username in IIS Advanced Logging, where the username is blank. I have of course come across this and used the following PowerShell command to mitigate it:

# Reconfigure username property
Set-WebConfigurationProperty -pspath 'MACHINE/WEBROOT/APPHOST'  -filter "system.webServer/advancedLogging/server/fields/field[@id='UserName']" -name "sourceType" -value "BuiltIn"

5 – It might be that you need to log more than the standard fields available out of the box. That is fairly easy and requires the following code to be executed (in this instance I'm adding the X-Forwarded-Proto header to the available fields):

# Add custom header X-Forwarded-Proto to available logging fields
Add-WebConfiguration "system.webServer/advancedLogging/server/fields" -value @{id="X-Forwarded-Proto";sourceName="X-Forwarded-Proto";sourceType="RequestHeader";logHeaderName="X-Forwarded-Proto";category="Default";loggingDataType="TypeLPCSTR"} 

6 – The easiest thing to do is to use an array, which will be processed in the order of its items. I have defined mine as:

# Array of fields we want to be logging 
$RequIisAdvLogFields = "Date-UTC","Time-UTC","Client-IP","Host","Referer","User Agent","Bytes Received","Method","URI-Querystring","URI-Stem","UserName","Protocol Version","BeginRequest-UTC","EndRequest-UTC","Time Taken","Bytes Sent","Status","Substatus","Server-IP","Server Port","Win32Status","X-Forwarded-For","X-Forwarded-Proto","CPU-Utilization"

7 – This is work in progress; I will come back to this part.

 

Execution:

Since I have defined my objectives and know what I'm working towards, it is easy to create the script. As I would not like to duplicate the content, I will just outline the steps here; the complete script you will find at the bottom of the page (hosted on GitHub).

  1. I define folder log location and frequency of log rotation
  2. We check if folder exists – if it does not we create it
  3. We assign custom permissions in order to be able to save to that folder (the IIS_IUSRS group needs write access – see the permissions sketch after this list)
  4. Reconfigure username property
  5. Define required fields in log
  6. Get server available fields
  7. Next for each website we will add log definition (this is something you may want to change )
  8. We delay commits of changes – this is quite important, as if you don't do that the script will take a really long time to execute:
    Start-WebCommitDelay
    
    # Do some work and change configs 
    
    Stop-WebCommitDelay -Commit $true

     

  9. For each of the available fields we add it to our defined log definition
  10. Then as the last steps I disable standard logging and IIS Advanced Logging at the server level. Lastly I enable advanced logging.
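For step 3 the write access can be granted like this (a sketch – assuming $logDir holds your log folder path):

# give IIS_IUSRS modify rights, inherited by subfolders and files
& icacls.exe $logDir /grant 'IIS_IUSRS:(OI)(CI)M'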

 

The whole script is available on GitHub, so you are more than welcome to contribute. There are points which still require work – definitely better error control and adding filtering. However, I will be actively developing this to have a fully operational tool for IIS advanced configuration.

 

 


Road to challenges in IT

Hey,

It has been long and quiet for the last 2 years I think, but this time comes to an end. A lot has been happening with regards to the learning curve of SCCM/SCOM/PowerShell (especially the DSC part) and REST APIs.

Nowadays we cannot forget about the importance of cloud and hybrid environments, and Docker technology!

With all of that I can assure you that from now on I will be sharing, on a regular basis, as much as possible from the challenges I have come across and from the news I get from the engineering world!

As usual, the primary focus of my experience is providing advanced automation solutions while maintaining security and availability of your services (nope – didn't forget -> scalability as well 😀 )

So stay tuned / fork GitHub and enjoy the automation!