
OpenSSL p7b certificate: unable to load certificate

Recently, when working with certificates, I received them in p7b format. If you try to use them as-is, you might get:

unable to load certificate
140735207381436:error:0D0680A8:asn1 encoding routines:ASN1_CHECK_TLEN:wrong tag:tasn_dec.c:1319:
140735207381436:error:0D07803A:asn1 encoding routines:ASN1_ITEM_EX_D2I:nested asn1 error:tasn_dec.c:381:Type=X509_CINF
140735207381436:error:0D08303A:asn1 encoding routines:ASN1_TEMPLATE_NOEXP_D2I:nested asn1 error:tasn_dec.c:751:Field=cert_info, Type=X509
140735207381436:error:0906700D:PEM routines:PEM_ASN1_read_bio:ASN1 lib:pem_oth.c:83:

When trying to validate a certificate using OpenSSL, this happens because the certificate is in the wrong format. While the file visually appears to be in X.509 format, you will find it contains a far longer Base64 string than X.509 certificates of the same bit length.
The format in this case is p7b (PKCS #7); to use the certificate with Apache you are going to have to convert it.

openssl pkcs7 -print_certs -in certificate.p7b -out certificate.cer

Within the resulting .cer file you will find your X.509 certificate bundled with the relevant CA certificates. Break these out into your .crt and ca.crt files and load them as normal into Apache.
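If you would rather split the bundle from the command line than by hand, a rough sketch like the following should do it (the output names server.crt and ca.crt are just examples):

# first certificate block in the bundle = your server certificate
awk '/BEGIN CERTIFICATE/{n++} /BEGIN CERTIFICATE/,/END CERTIFICATE/{ if (n==1) print }' certificate.cer > server.crt

# remaining certificate blocks = the CA chain
awk '/BEGIN CERTIFICATE/{n++} /BEGIN CERTIFICATE/,/END CERTIFICATE/{ if (n>1) print }' certificate.cer > ca.crt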


PKI infrastructure using HashiCorp Vault

So today we will quickly go through setting up Vault as our PKI backend. Vault is capable of much more than what is shown here; we are only touching a few of the many options HashiCorp Vault offers.

The idea here is to create a root CA and then an intermediate CA to provide our users/servers with certificates based on our needs. Since I have already been playing a bit with Vault, I prepared a quick script for myself. But before we go there, there are a couple of prerequisites needed for all of this to work.

Building a Vault server quickly when you have a Docker engine available is as easy as running

docker run -d --name vault -P --cap-add IPC_LOCK rafpe/docker-vault:latest server -dev-listen-address=0.0.0.0:8200 -dev

which will bring up our container. From there we need to grab the token ID, which we will use later for calls to our server.
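In -dev mode the root token is printed in the server output, so one way to grab it (assuming the container is named vault as above) is:

docker logs vault 2>&1 | grep 'Root Token'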

 

Export the values

export VAULT_ADDR="http://my-server-address:my-port"
export VAULT_TOKEN="my-token"

 

Once that is done you can grab my init script below.

Be sure to modify the URL for your Vault server and off you go 🙂
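If you want a rough idea of what such an init script does, here is a minimal sketch (not my exact script) that mounts a root and an intermediate PKI backend, generates the root CA and signs the intermediate with it. The rafpe_root mount name, the common names and the TTLs are just examples; the rafpe_intermediate mount matches the role commands further below.

# mount two pki backends: one for the root CA, one for the intermediate
vault mount -path=rafpe_root pki
vault mount-tune -max-lease-ttl=87600h rafpe_root
vault mount -path=rafpe_intermediate pki
vault mount-tune -max-lease-ttl=43800h rafpe_intermediate

# generate the root CA (the private key never leaves Vault)
vault write rafpe_root/root/generate/internal common_name="rafpe.engineer Root CA" ttl=87600h

# create a CSR for the intermediate CA (copy the csr from the output into intermediate.csr)
vault write rafpe_intermediate/intermediate/generate/internal common_name="rafpe.engineer Intermediate CA" ttl=43800h

# sign the intermediate CSR with the root (copy the certificate from the output into intermediate.pem)
vault write rafpe_root/root/sign-intermediate csr=@intermediate.csr format=pem_bundle ttl=43800h

# hand the signed certificate back to the intermediate backend
vault write rafpe_intermediate/intermediate/set-signed certificate=@intermediate.pem

Note that the mount commands above use the older CLI syntax; on newer Vault versions you would use vault secrets enable and vault secrets tune instead, but the PKI paths stay the same.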

 

To create a certificate you need to create a role and then make a request to issue one:

vault write rafpe_intermediate/roles/rafpe-engineer lease_max="336h" lease="336h" key_type="rsa" key_bits="2048" allow_any_name=true

vault write rafpe_intermediate/issue/rafpe-engineer common_name="ninja.rafpe.engineer:rafpe" ttl=720h format=pem

 

This will get you started. In one of the next posts we will use this infrastructure for our HAProxy.


Ansible – Using a dictionary to deploy PEM certificates

When automating certificate deployments I wanted a smart way of deploying them, so I went ahead and decided to use dictionaries.

For this example my variables looked as follows:

ssl_certificates:
    domain_uno_com:
      owner: haproxy
      group: haproxy
      mode: "u=r,go="
      certificate: |
        -----BEGIN CERTIFICATE-----
        dslkajfafak234h23o4h32jkh43jqtghkjafhads;fhd89fuad9f6a8s7f6adsf
        < ..................... bogus info uno ...................... >
        yjEdslkajfafak234h23o4h32jkh43jZlcmlTaWduLCBJbmMuMR8wHQYDVQQL23
        -----END CERTIFICATE-----
      key: |
        -----BEGIN PRIVATE KEY-----
        dslkajfafak234h23o4h32jkh43jqtghkjafhads;fhd89fuad9f6a8s7f6adsf
        < ..................... bogus info uno ...................... >
        yjEdslkajfafak234h23o4h32jkh43jZlcmlTaWduLCBJbmMuMR8wHQYDVQQL23
        Edslkajfafak234h==
        -----END PRIVATE KEY-----

    domain_duo_com:
      owner: haproxy
      group: haproxy
      mode: "u=r,go="
      certificate: |
        -----BEGIN CERTIFICATE-----
        dslkajfafak234h23o4h32jkh43jqtghkjafhads;fhd89fuad9f6a8s7f6adsf
        < ..................... bogus info duo ...................... >
        yjEdslkajfafak234h23o4h32jkh43jZlcmlTaWduLCBJbmMuMR8wHQYDVQQL23
        -----END CERTIFICATE-----
      key: |
        -----BEGIN PRIVATE KEY-----
        dslkajfafak234h23o4h32jkh43jqtghkjafhads;fhd89fuad9f6a8s7f6adsf
        < ..................... bogus info duo ...................... >
        yjEdslkajfafak234h23o4h32jkh43jZlcmlTaWduLCBJbmMuMR8wHQYDVQQL23
        Edslkajfafak234h==
        -----END PRIVATE KEY-----
 

Once we have that, within our playbook we use the following task to create the PEM files:

       - name: SSL certificates Web | Create certificate key files
         copy:
           dest: "{{ web_ssl_folder }}/{{ item.key.replace('_','.') }}.pem"
           content: "{{ item.value.certificate + '\n' + item.value.key }}"
           owner: "{{ item.value.owner }}"
           group: "{{ item.value.group }}"
           mode: "{{ item.value.mode }}"
         with_dict: "{{ ssl_certificates }}"
         no_log: true
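Assuming web_ssl_folder is defined somewhere in your variables (for example /etc/haproxy/ssl), running the play is nothing special; the playbook and inventory names below are just placeholders:

ansible-playbook -i inventory site.yml --extra-vars "web_ssl_folder=/etc/haproxy/ssl"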

 

Now when we run our playbook, we will get new certificates in the folder defined under web_ssl_folder, called domain.uno.com.pem and domain.duo.com.pem respectively.

Of course, if you add more entries, more files will be created. The only things to change from here are the owner and possibly the permissions (although think twice 🙂).


OpenSSL – generate a self-signed certificate

Quite often, to test different aspects of IT or security, we use certificates which are self-signed. As the name implies, we are responsible for generating them ourselves. In this post we will go through a short explanation of how to generate one using OpenSSL.

To create one we issue the following command:

openssl req -x509 -newkey rsa:2048 -keyout certificate-key.pem -out certificate.pem -days 365

In order to better understand the above command, let's break it down:

  • req : PKCS#10 certificate request and certificate generating utility
  • -x509 : outputs a self-signed certificate instead of a certificate request
  • -newkey rsa:#### : creates a new certificate request and a new private key; in this instance we use RSA with a key size of #### bits
  • -keyout file.name : writes the newly created private key to the given file
  • -out file.name : specifies the output file name
  • -days # : specifies how many days the certificate will be valid when the -x509 option has been used; the default value is 30 days
  • -nodes : indicates that the private key should not be encrypted (without it you will be prompted for a passphrase); an example using it follows below
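As a concrete example, a variant that skips the passphrase prompt and then prints the resulting certificate for inspection could look like this (file names are just examples):

# self-signed certificate valid for one year, with an unencrypted 2048-bit RSA key
openssl req -x509 -newkey rsa:2048 -nodes -keyout certificate-key.pem -out certificate.pem -days 365

# inspect what was generated
openssl x509 -in certificate.pem -text -noout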

 

For those on Windows, we sometimes need a PFX (which contains both the private and the public key). The easiest way is to use OpenSSL in the following form:

openssl pkcs12 -inkey bob_key.pem -in bob_cert.cert -export -out bob_pfx.pfx

 

Since some of you will be working on Windows, you might come across the following error:

WARNING: can't open config file: /usr/local/ssl/openssl.cnf

What you are missing is an environment variable (make sure to adjust the path to your cfg file):

set OPENSSL_CONF=c:\OpenSSL-Win32\bin\openssl.cfg

 

And that's it for self-signed certificates. In the next post we will combine this knowledge of certificates with the power of Docker and set up our own registry.

 


X509Certificate – System.Security.Cryptography.CryptographicException "Object was not found"

Hey ,

So recently I have been working with JSON Web Token authentication and wanted to take an extra step with security: I decided to sign my tokens with certificates.

So without any further delay I happily placed the certificate in my storage location (for the sake of this post let's say it was the local filesystem) and created a simple method to build my object from the certificate's byte array and my password.

byte[] binaryData = new byte[1];
// ... Removed for code visibility - binaryData contains raw certificate byte array 

var cert          = new X509Certificate2(binaryData, password);

The problem :

However, when I tried to invoke the X509Certificate2 constructor, passing my raw array of certificate bytes, I received a nasty error saying:

System.Security.Cryptography.CryptographicException
Object was not found.
at System.Security.Cryptography.CryptographicException.ThrowCryptographicException(Int32 hr)
at System.Security.Cryptography.X509Certificates.X509Utils._LoadCertFromBlob(Byte[] rawData, IntPtr password, UInt32 dwFlags, Boolean persistKeySet, SafeCertContextHandle& pCertCtx)
at System.Security.Cryptography.X509Certificates.X509Certificate.LoadCertificateFromBlob(Byte[] rawData, Object password, X509KeyStorageFlags keyStorageFlags)
at System.Security.Cryptography.X509Certificates.X509Certificate2..ctor(Byte[] rawData, String password)
//my code here

 

Tackling the challenge:

In this case, the solution starts with understanding what is actually going on.

To give you more details, the same problem occurred both in my local development environment and in my designated Azure Web App.

My local website has a dedicated application pool with a specified domain user which the app pool uses as its identity.

It appears that even though I was loading the certificate from a byte[], the underlying Windows cryptographic service provider tried to use the user store, and since my application pool account's profile was not loaded, a cryptographic context was not available.

So initially it seems like setting Load User Profile to true solves the problem. But wait… does it really?

What happens when you change that setting? Well, the application pool loads the user profile, and all the implications of doing so follow. This of course includes possible security vulnerabilities, performance costs, etc.

Other approach:

*this will also work in an Azure Web App*

The X509Certificate2 constructor has extra flags (X509KeyStorageFlags) that can be used. If you investigate them you will notice one that is particularly interesting:

MachineKeySet – the key is written to a folder owned by the machine.

var cert = new X509Certificate2(bytes, password, X509KeyStorageFlags.MachineKeySet);

More info is available under the link to a great post that discusses this in detail.

 

Good practice:

It's good to clean up after yourself. If you have read the aforementioned blog post you will find more info about the temp files left behind when using the byte[] X509Certificate constructor.

So I have adapted the method mentioned there and now use:

var file = Path.Combine(Path.GetTempPath(), "rafpe-" + Guid.NewGuid());
try
{
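    // persist the raw certificate bytes to a temporary file so the certificate can be loaded from disk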
    File.WriteAllBytes(file, bytes);
    return new X509Certificate2(file, password, X509KeyStorageFlags.MachineKeySet);

}
finally
{
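    // always remove the temporary file, even if loading the certificate throws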
    File.Delete(file);
}

 

Happy coding 😀