
PCEngines – APU Board and nct5104d gpio driver

The board 🙂

Today I will explain how I managed to write my own custom driver for the nct5104d under CentOS running on a PCEngines APU board. But before we go any further, I wanted to share a big "wow" to the makers of the board. For anyone doing home automation, tinkering around, or just interested in engineering, it is something I can wholeheartedly recommend. Among many cool perks, it features 3 Gigabit Ethernet ports, 16 (18) GPIO pins, I2C, and 2x RS232 (one with RX/TX only). For me it's a 5/5 star rating 🙂

 

In a place far far away…

… I started this post some time ago, since I thought it would be a great idea to share my experience as I went through the whole process of learning how to write a Linux driver for the nct5104d (sitting on the APU board).

Before I decided to do anything crazy like that, I would like to let you know that there is already a driver for the device, which you can find at https://github.com/tasanakorn/linux-gpio-nct5104d . What made me think of writing my own version was the way I would need to interact with the GPIOs, using funky commands like:

echo 1 > /sys/class/gpio/gpio0/value

At that moment I knew I could make it easier for my automation purposes 🙂

 

Writing your own driver … where to start?

So this is a good question to ask yourself. It took me many hours of reading articles and forums, but also talking to people who have done such things before. From a high-level perspective it's simple: read the basics and then start small with a hello world. Once you start understanding it, the results will follow.

I can recommend taking a look at the following resources (which I found very useful for getting my head around it 🙂):

 

Step by step ?

 

In most of my articles we would dive straight into the technical details of the challenge. But in this instance I will point you to my GitHub repository and ask you to take a look. I have put a big amount of work into this, and if you have specific questions I will be here to try and answer them!

 

Just as an interesting aside – this is how my work looked over the last 2 weeks (from start to finish 🙂).

Where is the code ?

 
The complete repository is available on my GitHub: https://github.com/RafPe/gpio-driver-nct5104d

 

So how does it work ?

Now we are talking 🙂 Using the driver is really nice. Once you go through the steps of compiling it and installing it on your system, you have access to the device via ioctl.

I have exposed methods for interacting with registers and with pins. What is important here: the driver automatically uses logical device 7, which is GPIO. If you have other needs, you would most likely need to build some extra logic around it.

Since not everyone is a guru at creating binaries 🙂 I have created 2 apps, for managing pins and registers respectively.

 

Managing pins

With simple commands you can manage pins instantly:

nct5104dpin [--get|--set] --pin <PIN> [--val <0|1>] [--dir <out|in>]

Get a pin value: nct5104dpin --pin 7 
Get a pin value: nct5104dpin --get --pin 14
Set a pin value: nct5104dpin --set --pin 14 --val 1
Set a pin direction: nct5104dpin --set --pin 14 --dir out

The cool thing is that I built it in such a way that parsing the output with e.g. jq is straightforward.

$ nct5104dpin --pin 1 | jq
[
  {
    "pin": 1,
    "value": 0
  }
]
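And if jq is not at hand, the JSON output is just as easy to consume from any scripting language – a quick Python sketch, using the sample output above in place of actually invoking the binary:

```python
import json

# Output as produced by: nct5104dpin --pin 1
sample = '[{"pin": 1, "value": 0}]'

def pin_values(output):
    """Map pin number -> value from the tool's JSON output."""
    return {entry["pin"]: entry["value"] for entry in json.loads(output)}

print(pin_values(sample))  # {1: 0}
```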

Managing registers

The same applies to managing registers. I have aimed to keep it simple and specific.

nct5104dreg [--get|--set] --reg <HEX> [--val <DECIMAL>]

Get a reg value: nct5104dreg --reg 0x07 
Get a reg value: nct5104dreg --get --reg 0x07
Set a reg value: nct5104dreg --set --reg 0xE0 --val 252
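To make sense of a value like 252: configuration registers on Super I/O chips such as the NCT5104D are generally bitmasks, one bit per pin. Consult the datasheet for what each register and bit actually controls – the helper below is purely illustrative:

```python
def bits_of(value, width=8):
    """Return {bit_index: 0 or 1} for a register value of the given bit width."""
    return {bit: (value >> bit) & 1 for bit in range(width)}

# 252 == 0xFC == 0b11111100: bits 0 and 1 are clear, bits 2-7 are set
print(bits_of(252))
```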

Here I also made sure the output can easily be parsed:

$ nct5104dreg --reg 0xE1 | jq
[
  {
    "registry": "0xe1",
    "value": 248
  }
]

 

Adventure begins here

I hope that by sharing this I will enable you, or maybe someone else, to do things you have not done before 🙂 or at least get you interested.

 


C# – Active Directory changes synchronization with cookie

In a recent post we discussed how to track Active Directory changes efficiently with PowerShell.

Now we can achieve the same thing with C#. And if you wonder why C#, since we already had it in PowerShell? Well, maybe you are writing a form of REST API for your enterprise? Or an application for personnel who are not fluent with scripting (the people who do use a GUI 🙂)

Nevertheless, this is going to be nice and easy. I will not be using screenshots of Visual Studio in this post, just providing you with the information needed.

 

The architecture and design are totally up to you 🙂 I will introduce the basics needed to put the bits and pieces together. To hold the information we receive, it is best to create a class with the properties we are interested in, and keep instances in a list.

public class adresult
{
   public string objName {get;set;}
   public string objDN   {get;set;}
   ...
   public string objXYZ  {get;set;} // Whatever other properties you are interested in
}

 

That was easy 🙂 Now let's write our application. I focus here on a console application, but you can use whatever type suits you.

Let's prepare the LDAP connection:

string ldapSrv = "LDAP://<LDAP-path>";
string ldapFilter = "(objectClass=user)";

// File to store our cookie
string ldapCookie = @"c:\adsync-cookie.dat";

// set up the search
DirectoryEntry dir = new DirectoryEntry(ldapSrv);
DirectorySearcher searcher = new DirectorySearcher(dir);

searcher.Filter = ldapFilter;
searcher.PropertiesToLoad.Add("name");
searcher.PropertiesToLoad.Add("distinguishedName");
searcher.SearchScope = SearchScope.Subtree;
searcher.ExtendedDN = ExtendedDN.Standard;

 

Next is the interesting part – the synchronization object:

// create directory synchronization object
DirectorySynchronization sync = new DirectorySynchronization();

// check whether a cookie file exists and if so, set the dirsync to use it
if (File.Exists(ldapCookie))
   {
      byte[] byteCookie = File.ReadAllBytes(ldapCookie);
      sync.ResetDirectorySynchronizationCookie(byteCookie);
   }

 

Lastly, we combine what we have prepared and execute the search:

// Assign previously created object to searcher 
searcher.DirectorySynchronization = sync;

// Create group of our objects
List<adresult> ADresults = new List<adresult>();

foreach (SearchResult result in searcher.FindAll())
{
    adresult objAdresult = new adresult();
    objAdresult.objName = (string)result.Properties["name"][0];

    // Extended DN comes back as <GUID=...>;<SID=...>;DN – keep only the DN part
    string[] sExtendedDn = ((string)result.Properties["distinguishedName"][0]).Split(new Char[] { ';' });
    objAdresult.objDN = sExtendedDn[2];

    ADresults.Add(objAdresult);
}

// write new cookie value to file
File.WriteAllBytes(ldapCookie, sync.GetDirectorySynchronizationCookie());

// Return results 
return ADresults;
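Why index 2 after the split? With ExtendedDN.Standard the distinguishedName comes back in the Extended DN format `<GUID=...>;<SID=...>;plainDN`, where the SID element is present only for security principals such as the user objects our filter matches. A quick sketch of that split, in Python for brevity, with invented GUID/SID values:

```python
# Extended DN as returned for a user object (GUID and SID values are made up)
extended_dn = ("<GUID=90efac44-50bc-4e2a-8a5a-5c336e6870b4>;"
               "<SID=S-1-5-21-1004336348-1177238915-682003330-1104>;"
               "CN=John Doe,OU=Users,DC=example,DC=com")

parts = extended_dn.split(";")
dn = parts[2]  # the plain distinguishedName
print(dn)  # CN=John Doe,OU=Users,DC=example,DC=com
```

A naive split like this is fine for typical DNs, though an escaped ';' inside the DN itself would of course break it.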

 

This concludes this short post. I hope you will be able to use it in your complex Active Directory scenarios.

 

 


C# – Generate Entity Framework SQL script

This one is going to be a really short one. As it happens, I need to enable others to recreate DBs for the API models I create, so I usually deliver a SQL script that does the work. So how do you generate one in VS?

Well, using the Package Manager Console you just call:

Update-Database -Script -SourceMigration:0

 

And that creates a SQL script for you – of course without any seed data 🙂


ASP.NET 5 – Dependency injection with AutoFac

Today we will shift a bit from previous tracks to look more at Visual Studio 2015 powering us with MVC 6 / ASP.NET 5. I personally find that Microsoft is going in the right direction – especially being so open source.

But coming back to the original subject of this post. When you create a new project in VS2015 and select .NET 5, we can see that this is still in preview – therefore the information provided in this post may already be out of date! I recommend you take that into account.

For .NET 5 documentation look here. And if you are more interested in Autofac, check the documentation here.

Startup.cs

        public IServiceProvider ConfigureServices(IServiceCollection services)
        {
            services.AddMvc();

            //create Autofac container build
            var builder = new ContainerBuilder();

            //populate the container with services here..
            builder.RegisterType<DemoService>().As<IProjectDemo>();
            builder.Populate(services);

            //build container
            var container = builder.Build();

            //return service provider
            return container.ResolveOptional<IServiceProvider>();
        }

 

Project.json

  "dependencies": {
    "Autofac": "4.0.0-beta6-110",
    "Autofac.Framework.DependencyInjection": "4.0.0-beta6-110",
    "Microsoft.AspNet.Mvc": "6.0.0-beta6",
    "Microsoft.AspNet.Server.IIS": "1.0.0-beta6",
    "Microsoft.AspNet.Server.WebListener": "1.0.0-beta6",
    "Microsoft.AspNet.StaticFiles": "1.0.0-beta6"

  },

 

What I also learned at this stage: it is not smart to mix different beta versions. So if possible, try to keep them at the same level. Hope this helps and gets you going!

 

We will definitely be revisiting Autofac in later posts when we play around with creating REST services and other apps!


x509Certificate – System.Security.Cryptography.CryptographicException “Object was not found”

Hey,

So recently I have been working with JSON Web Token authentication and wanted to take an extra step with security. I decided to sign my tokens with certificates.

So without any further delay, I happily placed the certificate in my storage location (for the sake of this post let's say it was the local filesystem) and created a simple method to create my object from the byte array of that certificate and my password.

byte[] binaryData = new byte[1];
// ... Removed for code visibility - binaryData contains raw certificate byte array 

var cert          = new X509Certificate2(binaryData, password);

The problem :

However, when I tried to invoke the ctor on X509Certificate2, passing my raw array of certificate bytes, I received a nasty error saying:

System.Security.Cryptography.CryptographicException
Object was not found.
at System.Security.Cryptography.CryptographicException.ThrowCryptographicException(Int32 hr)
at System.Security.Cryptography.X509Certificates.X509Utils._LoadCertFromBlob(Byte[] rawData, IntPtr password, UInt32 dwFlags, Boolean persistKeySet, SafeCertContextHandle& pCertCtx)
at System.Security.Cryptography.X509Certificates.X509Certificate.LoadCertificateFromBlob(Byte[] rawData, Object password, X509KeyStorageFlags keyStorageFlags)
at System.Security.Cryptography.X509Certificates.X509Certificate2..ctor(Byte[] rawData, String password)
//my code here

 

Tackling the challenge:

In this instance, the solution to the problem starts with understanding what is actually going on.

To give you more details: the same problem occurred in my local development environment and in my Azure WebApp.

My local website has a dedicated application pool with a specified domain user that the app pool uses as its identity.

It appears that even though I was loading the certificate from byte[], the underlying Windows Cryptographic Service Provider tried to use the user store, and since my application pool account's profile was not available, a cryptographic context was not available either.

So initially it seems like setting Load User Profile to true solves the problem. But wait…? Does it really?

What happens when you change that setting? Well, the application pool calls LoadProfile, and all the implications of doing that follow. This of course includes possible security vulnerabilities, performance costs, etc.

Other approach:

*This will also work in an Azure WebApp*

The X509Certificate2 ctor has extra flags (X509KeyStorageFlags) that can be used. If you investigate them you will notice a particularly interesting one:

MachineKeySet – the key is written to a folder owned by the machine.

var cert = new X509Certificate2(bytes, password, X509KeyStorageFlags.MachineKeySet);

More info is available under this link to a great post that discusses it in detail.

 

Good practice:

It's good to clean up after yourself. If you read the aforementioned blog, you will find more info about temp files left behind when using byte[] with the X509Certificate ctor.

So I have adapted the method mentioned there and now use:

var file = Path.Combine(Path.GetTempPath(), "rafpe-" + Guid.NewGuid());
try
{
    File.WriteAllBytes(file, bytes);
    return new X509Certificate2(file, password, X509KeyStorageFlags.MachineKeySet);
}
finally
{
    // Remove the temp copy of the certificate regardless of the outcome
    File.Delete(file);
}

 

Happy coding 😀


Road to challenges in IT

Hey,

It has been long and quiet for the last 2 years I think, but that time comes to an end. A lot has been happening regarding my learning curve with SCCM / SCOM / PowerShell (especially the DSC part) and REST APIs.

Nowadays we cannot forget about the importance of the cloud, hybrid environments, and Docker technology!

With all of that, I can assure you that from now on I will be regularly sharing as much as possible from the challenges I have come across and from the news I get from the engineering world!

As usual, the primary focus of my experience is providing advanced automation solutions while maintaining the security and availability of your services (nope – didn't forget -> scalability as well 😀)

So stay tuned, fork the GitHub repo, and enjoy the automation!