What's new in Validator for Identity Manager 1.4 - Part 3


I have been discussing the new features in Validator 1.4. I am working through the list of enhancements in the readme, trying to explain what each of those items really means.

* HTTP and UserApp Connectors will import https certs into the keystore when you do Test Connection.

This is a great feature. If you have not enjoyed the lovely fun of working with SSL certificates, then you should cheer every time someone adds a way to get the trusted roots in automatically. It really should be the default, and I wish a bunch of common low level toolkits would just build it in, but not all have.

This issue confuses the heck out of people all the time, and it is worth spending a minute clearing up that confusion. If you take away nothing else from these articles, let it be an understanding of this issue; your life in IT will be better for it, I guarantee it.

The short answer is that when you use SSL or TLS (SSL is now thoroughly broken and should be replaced entirely with TLS) you need some way to exchange the very first shared secret. We do that by trusting the Certificate Authority that signed the certificate in use. How do you come to trust a certificate authority? The big browser vendors charge a lot of money to get your CA's public key into their default certificate stores. Years ago it was on the order of $15 million; no clue what it is these days.

But what if you self signed? Or used the eDirectory Certificate Authority in your tree? Then you need to import the certificate into the trusted keystore, i.e. get it into the list of trusted CAs.

Which keystore? Aye there's the rub. That is usually the hardest part to figure out.

The issue is that the SSL and TLS protocols use a fast symmetric cipher to encrypt all the real traffic between devices. But anyone who had that shared secret could decode everything in the stream. So how do you transfer a shared secret to a web server you have never seen before? You use a slower, asymmetric cipher, like the RSA key pair model. (NetWare started with it, and the NDS password in eDirectory uses the same algorithm to store passwords.) In this case a private key and a public key are generated as a pair: anything encrypted with the public key can only be decrypted with the matching private key, and a signature made with my private key can be verified with my public key. Combine the two and we can be sure that I sent the message, and that only you can read it.

But this approach is fairly slow, especially when you are talking about encrypting 100 Megabits/second of server output. Instead, the slower method is used only to encrypt the shared secret. This way you can be sure the message containing the shared secret came from the web server you think it did, and that only you can read it.
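The exchange described above can be sketched with openssl. This is a hypothetical stand-in for what the TLS handshake does internally; the file names are invented for illustration:

```shell
# 1. The "web server" generates an RSA key pair and publishes the public half.
openssl genpkey -algorithm RSA -pkeyopt rsa_keygen_bits:2048 -out server.key 2>/dev/null
openssl pkey -in server.key -pubout -out server.pub

# 2. The client invents a random shared secret for the fast symmetric cipher.
openssl rand -hex 32 > secret.txt

# 3. The client encrypts the secret with the server's PUBLIC key; only the
#    matching private key can recover it, so eavesdroppers learn nothing.
openssl pkeyutl -encrypt -pubin -inkey server.pub -in secret.txt -out secret.enc

# 4. The server decrypts the shared secret with its PRIVATE key. From here on,
#    both sides use the fast symmetric cipher with this shared secret.
openssl pkeyutl -decrypt -inkey server.key -in secret.enc -out secret.dec

cmp secret.txt secret.dec && echo "shared secret exchanged"
```

(Modern TLS actually prefers ephemeral key agreement over encrypting the secret directly with RSA, but the trust problem this article is about is the same either way.)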

But there is still a missing step in that chain. How do you know that the web server you think you are talking to is really the web server you think you are talking to? In theory I could mint a certificate that is valid and claims to be any IP address or DNS name I want.

Thus we introduce the notion of a Certificate Authority. The CA is someone or something you trust for some reason, even if that reason is only that Mozilla trusts them enough to take their money and include them in your browser's set of default CAs. For a scammer, going to those lengths would be quite extreme indeed.

Thus trusting the CA is of critical importance. Want me to make a certificate for you that, on a web server, will claim to be www.gmail.com? Here is the command line to do it using Java:

/opt/novell/eDirectory/lib64/nds-modules/jre/bin/keytool -keysize 2048 -genkey -alias gmail -keyalg RSA -validity 10950 -keystore /tmp/tomcat.keystore -storepass changeit -dname "CN=www.gmail.com" -ext san=dns:www.gmail.com,dns:www.microsoft.com,dns:www.apple.com

If you take the private key that generates (good for 30 years, by the way; 10950 days is 30 years) and install it in a web server, it will pass the hostname test at least. Have fun on an isolated network: you can call your web server www.gmail.com or www.apple.com or any of the other names I listed.

But hostname matching the certificate name is not the only test. There is also who signed it, and do I trust them? Well my cert example above is completely untrustworthy and all browsers will report it that way.

Now User App and other HTTPS sites you might be testing may be using certificates from well known CA's or might not. Your production environment might use one, but maybe your development environment does not.

What this new enhancement adds is code so that, if your browser errors when you go to the web page and says it does not trust the certificate, you can choose to trust it now. Before, the HTTPS connectors would not work over SSL if the CA that signed the web site's certificate was not trusted.

Now it turns out that adding it by hand is not that hard, but it requires a couple of pieces of knowledge:

    1. Where is the keystore? (In Java it is in jre/lib/security/cacerts and the default password is 'changeit'.)

    2. If Java, which JVM is being used?

    3. Where do I get the CA public key?

    4. What is the command to make this a trusted key?

For Validator, the JRE is included in the download, so it is the one in the Validator directory, in the jre/lib/security/cacerts.

Go to the web site, and when you get the HTTPS lock icon, click on it and look at the certificate info; you should see a chain of certificates. You need to select each parent of the final certificate and export it, Base64 encoded, into a file.

Watch out: there may be an intermediate authority as well. That is, one CA signed another CA, and that second one is the actual signer of the final certificate.
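If you prefer the command line to the browser, openssl can pull the whole chain for you. Against a live server you would run something like `openssl s_client -connect userapp.example.com:8443 -showcerts </dev/null > chain.pem` (the host and port are placeholders for your own server). The sketch below fakes a two-certificate chain with self-signed certs so the splitting step is runnable anywhere:

```shell
# Fake a two-certificate chain (stand-in for s_client -showcerts output).
openssl req -x509 -newkey rsa:2048 -nodes -keyout k1.pem -subj "/CN=demo-server" -days 1 -out c1.pem 2>/dev/null
openssl req -x509 -newkey rsa:2048 -nodes -keyout k2.pem -subj "/CN=demo-ca" -days 1 -out c2.pem 2>/dev/null
cat c1.pem c2.pem > chain.pem

# Split chain.pem into one file per certificate: chain-0.pem is the server
# cert; chain-1.pem (and up) are the intermediate/root CAs you would import
# with keytool.
awk -v n=0 '/BEGIN CERTIFICATE/,/END CERTIFICATE/ { print > ("chain-" n ".pem") }
            /END CERTIFICATE/ { n++ }' chain.pem

# Show the subject of each extracted certificate.
for f in chain-*.pem; do openssl x509 -in "$f" -noout -subject; done
```

Each resulting file is already in the Base64 (PEM) form that keytool's -import expects.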

The command would then be

%INSTALL_DIR%\jre\bin\keytool.exe -keystore %INSTALL_DIR%\jre\lib\security\cacerts -storepass changeit -import -alias MyTrustedCA -file c:\path\to\cert-file.b64 -trustcacerts

Then to ensure you imported it right, try:

%INSTALL_DIR%\jre\bin\keytool.exe -keystore %INSTALL_DIR%\jre\lib\security\cacerts -storepass changeit -list -v

and look for MyTrustedCA in that output.

But now, it does it automatically. This is a very good thing. Yay!

* Created a new add-on product called Scheduler

I talked about Scheduler in the first article in this series. This is basically a model where the consultant/developer uses the more expensive full Validator license and leaves behind a test suite. Then you sell the customer a cheaper Scheduler license, and the management team is happy since they have a button to push and get back a report on which functions in their IDM system are working. (Hopefully all of them!)

It is a very clever model, and it works well for Novell Consulting, since they require all projects they deliver to have a Validator test suite available to the client at the end. If the client wants to buy Validator, cool. If not, they can buy the cheaper Scheduler instead, but cannot easily make new versions of the tests.

* Added a Scheduler button to try the new Scheduler

This is basically a link to show you what Scheduler looks like by opening it as a 30 day demo copy. (The actual product comes with Validator; it is the license that you buy that is important.)

You can see they spent a bit of time making the interface prettier for bosses, and setting it up to easily send emails of the various tests that were run on some schedule.

* Rewrote Starter Tests to include the use of templates and made optimal use of variables

A few years ago, in the earlier versions, I pointed out to the developers that shipping Validator tests that exercise all the functionality of the shipping IDM drivers would be a great idea. I even suggested that they include the JSON file as part of the driver package. Maybe have an add-on package for each driver that includes a DirXML-Resource object with a DirXML-ContentType of text (or even define a text/json type) that you could point Validator at, to read out the test suite.

This way every driver could be shipped ready to test. They eventually added some basic examples of testing for a bunch of drivers with Validator. (Look in the tests directory and you will see a bunch of test suites shipped with Validator.)

To be honest, the ones shipped in 1.3 were kind of half-hearted and sort of barely there. In Validator 1.4 only one of my starter suites is open, and it is a much nicer example of most of the connector functionality.

I stand by my add-on package idea with tests that Validator can read, but so be it. I will go add that in Ideascale; go vote for it and add your own ideas!

Go take a look at the Validator Ideascale instance. This is basically a way to propose ideas and vote on them. The hope is that actual customers will tell NetIQ which ones to work on next. You may recall the RMS system at Novell, which stands for Requirements Management System; alas, some products paid attention and used it well (ZENworks), while others did not use it well at all (looking at you, IDM and Access Management!). There were many critiques of that system, and it is mostly being phased out at Micro Focus now.

Seems like the successor is known as Ideas, which you can see at:

Same basic idea: it lets you submit ideas and vote on them. The RMS system did not let you see other people's suggestions, so the product managers had to go look at it on their own and got drowned in a sea of suggestions. This way, in principle, some of the ideas bubble to the top. RMS also did not do a good job of avoiding duplication, so the PM would have to hunt through many similar suggestions and try to merge them together, which was not easy in that system. Hopefully this Ideas or Ideascale approach will work and get some traction.

Right now, the starter tests are more focused on showing off examples of what each Validator connector can do, as opposed to what each IDM Driver delivers as functionality. I will take this as a good starting point and keep pressuring for more. (Go vote on my idea maybe it will happen!)

Regardless, they are helpful to see how you might structure a test and what the functions of each connector might offer.

Using Templates is a good thing since code reuse is the way to go where possible, and using variables makes things much easier.

Additionally, using variables is great, since in 1.3 they introduced the notion of variables that are local to each stage. That is, there are Dev, QA, and Prod stages out of the box. If you define a variable, you can set a separate value for it in each stage.

Thus you could have a variable eDirServerA with the IP address of the dev server in the Dev stage. When it is time to test in QA or Prod, switch the variable set over to QA or Prod, set the variables to values that make sense in those stages, and away you go, ready to test. If you are one of those crazy companies with more than three stages, no problem: define a 4th, 5th, and maybe a 19th if you are that crazy. Go wild, be happy.
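Conceptually, stage-scoped variables behave like the sketch below. This is purely illustrative JSON, not Validator's actual on-disk format, and all the names and addresses are invented; the point is that the same variable name resolves to a different value depending on which stage is active:

```json
{
  "variables": {
    "eDirServerA": {
      "Dev":  "192.168.10.5",
      "QA":   "192.168.20.5",
      "Prod": "10.1.1.5"
    },
    "adminDN": {
      "Dev":  "cn=admin,o=devtree",
      "QA":   "cn=admin,o=qatree",
      "Prod": "cn=admin,o=prodtree"
    }
  },
  "activeStage": "Dev"
}
```

Flipping activeStage to QA would make every test that references eDirServerA point at the QA server, without touching the tests themselves.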

That is about it for part 3; stay tuned for part 4, lots of great stuff coming up!


