
C# – Active Directory changes synchronization with cookie

In a recent post we discussed how to track Active Directory changes efficiently with PowerShell.

Now we can achieve the same thing with C#. And if you wonder why C#, since we already have it in PowerShell: well, maybe you are writing a form of REST API for your enterprise? Or writing an application for personnel who are not fluent with scripting (the people that do use a GUI 🙂 )

Nevertheless, this is going to be nice and easy. I will not be using screenshots of Visual Studio in this post, just providing you with the information needed.

 

The architecture and design is totally up to you 🙂 I will introduce you to the basics needed to put the bits and pieces together. To hold the information we receive, it is best to create a class with the properties we are interested in and keep instances of it in a list.

public class adresult
{
   public string objName {get;set;}
   public string objDN   {get;set;}
   ...
   public string objXYZ  {get;set;} // whatever other properties you are interested in
}

 

That was easy 🙂 Now let’s get to writing our application. I focus here on a console application, but you can use whatever other type suits you.

Let’s prepare the LDAP connection:

string ldapSrv = "LDAP://<LDAP-path>";
string ldapFilter = "(objectClass=user)";

// File to store our cookie
string ldapCookie = @"c:\adsync-cookie.dat";

// Set up the search
DirectoryEntry dir = new DirectoryEntry(ldapSrv);
DirectorySearcher searcher = new DirectorySearcher(dir);

searcher.Filter = ldapFilter;
searcher.PropertiesToLoad.Add("name");
searcher.PropertiesToLoad.Add("distinguishedName");
searcher.SearchScope = SearchScope.Subtree;
searcher.ExtendedDN = ExtendedDN.Standard;
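
For these snippets to compile you need a reference to System.DirectoryServices and, as a minimal assumption for a console application, the following using directives:

using System;                       // basic types
using System.Collections.Generic;   // List<T>
using System.IO;                    // File - cookie persistence
using System.DirectoryServices;     // DirectoryEntry, DirectorySearcher, DirectorySynchronization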

 

Next comes the interesting part – the directory synchronization object:

// Create directory synchronization object
DirectorySynchronization sync = new DirectorySynchronization();

// Check whether a cookie file exists and, if so, set the dirsync to use it
if (File.Exists(ldapCookie))
{
    byte[] byteCookie = File.ReadAllBytes(ldapCookie);
    sync.ResetDirectorySynchronizationCookie(byteCookie);
}

 

Lastly, we combine what we have prepared and execute the search:

// Assign the previously created object to the searcher
searcher.DirectorySynchronization = sync;

// Create a list for our result objects
List<adresult> ADresults = new List<adresult>();

foreach (SearchResult result in searcher.FindAll())
{
    adresult objAdresult = new adresult();
    objAdresult.objName  = (string)result.Properties["name"][0];

    // With ExtendedDN.Standard the DN comes back as <GUID=...>;<SID=...>;DN;
    // user objects carry a SID, so after splitting on ';' the plain DN is the third element
    string[] sExtendedDn = ((string)result.Properties["distinguishedName"][0]).Split(new Char[] { ';' });
    objAdresult.objDN    = sExtendedDn[2];

    ADresults.Add(objAdresult);
}

// Write the new cookie value to file
File.WriteAllBytes(ldapCookie, sync.GetDirectorySynchronizationCookie());

// Return results
return ADresults;
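
Assuming you wrap the snippets above in a single method – hypothetically named GetAdChanges – the cookie gives you incremental results between calls:

// First run: no cookie file exists yet, so every object matching
// the filter is returned and a fresh cookie is written to disk.
List<adresult> fullSync = GetAdChanges();

// Subsequent runs: the stored cookie is replayed, so only objects
// changed since the previous call are returned.
List<adresult> delta = GetAdChanges();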

 

This concludes this short post. I hope you will be able to use it in your complex Active Directory scenarios.

 

 


C# – Generate Entity Framework SQL script

This one is going to be really short. As it happens that I need to enable others to recreate the databases for the API models I create, I usually deliver a SQL script that does the work. So how do you generate one in VS?

Well, using the Package Manager Console you just call:

Update-Database -Script -SourceMigration:0

 

And that creates your SQL script – of course without any seed data 🙂
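
If you only need the script up to a specific point, the same cmdlet also accepts a target migration (the migration name below is illustrative):

Update-Database -Script -SourceMigration:0 -TargetMigration:AddCustomerTable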


SSL file standards explained

While browsing the net I came across an interesting post on Server Fault, and I thought it would be nice to have it as a point of reference, especially when working with certificates.

Below you will find the most popular standards:

  • .csr This is a Certificate Signing Request. Some applications can generate these for submission to certificate-authorities. The actual format is PKCS10 which is defined in RFC 2986. It includes some/all of the key details of the requested certificate such as subject, organization, state, whatnot, as well as the public key of the certificate to get signed. These get signed by the CA and a certificate is returned. The returned certificate is the public certificate (not the key), which itself can be in a couple of formats.
  • .pem Defined in RFC’s 1421 through 1424, this is a container format that may include just the public certificate (such as with Apache installs, and CA certificate files /etc/ssl/certs), or may include an entire certificate chain including public key, private key, and root certificates. The name is from Privacy Enhanced Mail (PEM), a failed method for secure email but the container format it used lives on, and is a base64 translation of the x509 ASN.1 keys.
  • .key This is a PEM formatted file containing just the private-key of a specific certificate and is merely a conventional name and not a standardized one. In Apache installs, this frequently resides in /etc/ssl/private. The rights on these files are very important, and some programs will refuse to load these certificates if they are set wrong.
  • .pkcs12 .pfx .p12 Originally defined by RSA in the Public-Key Cryptography Standards, the “12” variant was enhanced by Microsoft. This is a passworded container format that contains both public and private certificate pairs. Unlike .pem files, this container is fully encrypted. Openssl can turn this into a .pem file with both public and private keys: openssl pkcs12 -in file-to-convert.p12 -out converted-file.pem -nodes

A few other formats that show up from time to time:

  • .der A way to encode ASN.1 syntax in binary, a .pem file is just a Base64 encoded .der file. OpenSSL can convert these to .pem (openssl x509 -inform der -in to-convert.der -out converted.pem). Windows sees these as Certificate files. By default, Windows will export certificates as .DER formatted files with a different extension. Like…
  • .cert .cer .crt A .pem (or rarely .der) formatted file with a different extension, one that is recognized by Windows Explorer as a certificate, which .pem is not.
  • .p7b Defined in RFC 2315, this is a format used by windows for certificate interchange. Java understands these natively. Unlike .pem style certificates, this format has a defined way to include certification-path certificates.
  • .crl A certificate revocation list. Certificate Authorities produce these as a way to de-authorize certificates before expiration. You can sometimes download them from CA websites.

In summary, there are four different ways to present certificates and their components:

  • PEM Governed by RFCs, it’s used preferentially by open-source software. It can have a variety of extensions (.pem, .key, .cer, .cert, more)
  • PKCS7 An open standard used by Java and supported by Windows. Does not contain private key material.
  • PKCS12 A private standard that provides enhanced security versus the plain-text PEM format. This can contain private key material. It’s used preferentially by Windows systems, and can be freely converted to PEM format through use of openssl.
  • DER The parent format of PEM. It’s useful to think of it as a binary version of the base64-encoded PEM file. Not routinely used by much outside of Windows.
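
As a quick reference, the conversions mentioned above can be done with openssl along these lines (file names are illustrative):

# PEM -> DER
openssl x509 -in cert.pem -outform der -out cert.der

# DER -> PEM
openssl x509 -inform der -in cert.der -out cert.pem

# PKCS#12 -> PEM, with private keys left unencrypted (-nodes)
openssl pkcs12 -in bundle.p12 -out bundle.pem -nodes

# PEM certificate + key -> PKCS#12
openssl pkcs12 -export -in cert.pem -inkey cert.key -out bundle.p12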

ASP.NET 5 – Dependency injection with AutoFac

Today we will shift a bit from previous tracks in order to look more at Visual Studio 2015 powering us with MVC 6 / ASP.NET 5. I personally find that Microsoft is going in the right direction – especially being so open source.

But coming back to the original subject of this post. When you create a new project in VS 2015 and select .NET 5, we can see that this is still in preview – therefore it might be that the information provided in this post is already out of date! I recommend you take that into account.

For the .NET 5 documentation look here. And if you are more interested in Autofac, check the documentation here.

Startup.cs

        public IServiceProvider ConfigureServices(IServiceCollection services)
        {
            services.AddMvc();

            //create Autofac container build
            var builder = new ContainerBuilder();

            //populate the container with services here..
            builder.RegisterType<DemoService>().As<IProjectDemo>();
            builder.Populate(services);

            //build container
            var container = builder.Build();

            //return service provider
            return container.ResolveOptional<IServiceProvider>();
        }
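
With DemoService registered as IProjectDemo, any controller resolved through Autofac can take the dependency in its constructor. A minimal sketch (the interface and service names are the ones registered above; the controller itself is illustrative):

public class DemoController : Controller
{
    private readonly IProjectDemo _projectDemo;

    // Autofac resolves IProjectDemo to DemoService,
    // as registered in ConfigureServices above
    public DemoController(IProjectDemo projectDemo)
    {
        _projectDemo = projectDemo;
    }
}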

 

Project.json

  "dependencies": {
    "Autofac": "4.0.0-beta6-110",
    "Autofac.Framework.DependencyInjection": "4.0.0-beta6-110",
    "Microsoft.AspNet.Mvc": "6.0.0-beta6",
    "Microsoft.AspNet.Server.IIS": "1.0.0-beta6",
    "Microsoft.AspNet.Server.WebListener": "1.0.0-beta6",
    "Microsoft.AspNet.StaticFiles": "1.0.0-beta6"

  },

 

What I also learned at this stage: it is not smart to mix different beta versions. So if possible, try to keep them at the same level. Hope this helps and gets you going!

 

We will definitely be visiting Autofac in later posts when we play around with creating REST services or other apps!


PowerShell – using Nlog to create logs

If you are after a logging framework, I can recommend one I have been using not only on Windows but also in C# development for web projects. It’s called NLog and it is quite powerful, allowing you to log not only in a specific format or layout, but also to make logging reliable (by having e.g. multiple targets with failover) with the required performance (e.g. async writes). That’s not all! Thanks to out-of-the-box features you can log to flat files, databases, network endpoints, web APIs… that’s just great!

NLog is available on GitHub here, so I recommend you go there and get yourself familiar with the wiki explaining usage and showing some examples.

At this point I can tell you that you can either use an XML config file or configure the logger on the fly before creation. In this post I will show you both options so you can choose the one that suits you best.

 

The high-level process looks as follows:

  1. Load the assembly
  2. Get the configuration (or create it)
  3. Create the logger
  4. Start logging

 

Nlog with XML configuration file

The whole PowerShell script, along with its configuration, revolves around the pieces below.
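
As a sketch, a minimal NLog.config for a file target – mirroring the layout and path used in the on-the-fly example later in this post – could look like this (values illustrative):

<?xml version="1.0" encoding="utf-8" ?>
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <targets>
    <!-- File target; layout and file name mirror the on-the-fly example below -->
    <target name="file" xsi:type="File"
            fileName="D:\Tools\${date:format=yyyyMMdd}.log"
            layout="timestamp=${longdate} host=${machinename} logger=${logger} loglevel=${level} message=${message}" />
  </targets>
  <rules>
    <!-- Route Info and above from all loggers to the file target -->
    <logger name="*" minlevel="Info" writeTo="file" />
  </rules>
</nlog>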

Now, the thing that may be of interest to you is the way we load our assembly. What I use here is reading the file into a byte array and then passing that as a parameter to the assembly Load method.

# Read the DLL into memory and load the assembly from the byte array
$dllBytes = [System.IO.File]::ReadAllBytes("C:\NLog.dll")
[System.Reflection.Assembly]::Load($dllBytes)

The reason to do it this way is to avoid situations where the file would be locked by ‘another process’. I have run into that in the past, and with this approach it will not happen 🙂

 

The next part, with customized data, is used when we would like to pass custom fields into our log. The details are described here on the NLog page.

 

After that I’m loading the configuration and assigning it:

$xmlConfig                       = New-Object NLog.Config.XmlLoggingConfiguration("\\pathToConfig\NLog.config")
[NLog.LogManager]::Configuration = $xmlConfig
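
With the configuration assigned, you can request a logger instance; the name is arbitrary. This is the $logger_psmodule instance used in the logging examples at the end of this post:

# Get a named logger from the XML-based configuration
$logger_psmodule = [NLog.LogManager]::GetLogger('psmodule')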

 

Nlog with configuration declared on the fly

As promised, it might be that you would like to use NLog with the configuration done on the fly instead of a centralized one. In the example below I will show you the file target as one of the options. There is much more, so you may want to explore the remaining options.

    # Create file target
    $target = New-Object NLog.Targets.FileTarget

    # Define layout
    $target.Layout       = 'timestamp=${longdate} host=${machinename} logger=${logger} loglevel=${level} message=${message}'
    $target.FileName     = 'D:\Tools\${date:format=yyyyMMdd}.log'
    $target.KeepFileOpen = $false

    # Init config
    $config = New-Object NLog.Config.LoggingConfiguration

    # Add target
    $config.AddTarget('File',$target)

    # Rule: log Info and above to the file target
    $rule1 = New-Object NLog.Config.LoggingRule('*', [NLog.LogLevel]::Info,$target)
    $config.LoggingRules.Add($rule1)

    # Rule with level Off: enables no levels, effectively a no-op here
    $rule2 = New-Object NLog.Config.LoggingRule('*', [NLog.LogLevel]::Off,$target)
    $config.LoggingRules.Add($rule2)

    # Rule: log Error and above to the file target
    $rule3 = New-Object NLog.Config.LoggingRule('*', [NLog.LogLevel]::Error,$target)
    $config.LoggingRules.Add($rule3)

    # Assign config
    [NLog.LogManager]::Configuration = $config

    # Get a logger instance
    $logger = [NLog.LogManager]::GetLogger('logger.name')

 

Engineers…. Start your logging 🙂

Once done, not much is left 😀 You can just start logging by typing:

$logger_psmodule.Info('some info message')
$logger_psmodule.Warn('some warn message')
$logger_psmodule.Error('some error message')

 

 


x509Certificate – System.Security.Cryptography.CryptographicException “Object was not found”

Hey,

So recently I have been working with JSON Web Token authentication and wanted to take an extra step with security. I decided to sign my tokens with certificates.

So without any further delay, I happily placed the certificate within my storage location (for the sake of this post let’s say it was the local filesystem) and created a simple method to create my object from the byte array of that certificate and my password.

byte[] binaryData = new byte[1];
// ... Removed for code visibility - binaryData contains raw certificate byte array 

var cert          = new X509Certificate2(binaryData, password);

The problem:

However, when I tried to invoke the ctor on X509Certificate2, passing my raw array of certificate bytes, I received a nasty error saying:

System.Security.Cryptography.CryptographicException
Object was not found.
at System.Security.Cryptography.CryptographicException.ThrowCryptographicException(Int32 hr)
at System.Security.Cryptography.X509Certificates.X509Utils._LoadCertFromBlob(Byte[] rawData, IntPtr password, UInt32 dwFlags, Boolean persistKeySet, SafeCertContextHandle& pCertCtx)
at System.Security.Cryptography.X509Certificates.X509Certificate.LoadCertificateFromBlob(Byte[] rawData, Object password, X509KeyStorageFlags keyStorageFlags)
at System.Security.Cryptography.X509Certificates.X509Certificate2..ctor(Byte[] rawData, String password)
//my code here

 

Tackling the challenge:

In this instance, solving the problem starts with understanding what is actually going on.

To give you more details, the same problem occurred in my local development environment and in my designated Azure web app.

My local website has a dedicated application pool with a specified domain user which the app pool uses as its identity.

It appears that even though I was loading the certificate from a byte[], the underlying Windows cryptographic service provider tried to use the user store, and since my application pool account’s profile was not available, a cryptographic context was not available either.

So initially it seems like setting Load User Profile to true solves the problem. But wait…? Does it really?

What happens when you change that setting? Well, the application pool then loads the user profile, and all the related implications of doing that follow. This of course includes possible security vulnerabilities, performance costs, etc.

Another approach:

* this will also work in Azure WebApp *

The X509Certificate2 ctor has extra flags (X509KeyStorageFlags) that can be used. If you investigate them you will notice one particularly interesting:

MachineKeySet – the key is written to a folder owned by the machine.

var cert = new X509Certificate2(bytes, password, X509KeyStorageFlags.MachineKeySet);

More info is available under the link to a great post that discusses this in detail.

 

Good practice:

It’s good to clean up after yourself. If you have read the aforementioned blog, you will find more info about the temp files left behind when using the byte[] overload of the X509Certificate ctor.

So I have adapted the method mentioned there and now use:

var file = Path.Combine(Path.GetTempPath(), "rafpe-" + Guid.NewGuid());
try
{
    File.WriteAllBytes(file, bytes);
    return new X509Certificate2(file, password, X509KeyStorageFlags.MachineKeySet);
}
finally
{
    File.Delete(file);
}
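
For reuse, here is the same snippet wrapped in a small helper – a sketch; the class and method names are illustrative:

using System;
using System.IO;
using System.Security.Cryptography.X509Certificates;

public static class CertificateLoader
{
    // Writes the raw bytes to a temp file, loads the certificate with
    // the machine key set, and always removes the temp file afterwards.
    public static X509Certificate2 FromBytes(byte[] bytes, string password)
    {
        var file = Path.Combine(Path.GetTempPath(), "rafpe-" + Guid.NewGuid());
        try
        {
            File.WriteAllBytes(file, bytes);
            return new X509Certificate2(file, password, X509KeyStorageFlags.MachineKeySet);
        }
        finally
        {
            File.Delete(file);
        }
    }
}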

 

Happy coding 😀