When Identity Manager was first released by Novell as DirXML 1.0 the product was focused exclusively on synchronizing objects between different directories, databases, text files, and more. As time rolled on more features got added. With Novell Nsure Identity Manager 2.0 we got DirXML Script, a great language that provided a GUI and XML language for manipulating XML events. With Novell Identity Manager 3.5 we started to see the beginnings of the User Application.
This was the beginning of a user-facing aspect of IDM. Previously, IDM was really just plumbing in the background; no one cared until it broke, but there was not even a sink at the front end to get water out of.
The User Application, as the Identity front end, provided functionality in many different areas. On the one hand there were Workflows, where the user could choose a Workflow, be presented with a form to fill in, and have actions taken based on that data. Approvals by a manager or an application owner, in parallel or in serial, could be required. I like to say that IDM automates the user provisioning process, but the User Application inserts a manual step in the middle. Seems counterintuitive, but there is a good reason: blame. You need someone to make a decision, take responsibility for it, and accept the blame when it causes an issue. You may have heard of multicasting for imaging workstations, or streaming video. This is all about blamecasting.
There is a portal where users can change their passwords and, if given sufficient permissions, modify attributes about themselves in the Identity Vault. Of course, once those changes get written they propagate to other systems via the IDM engine. It could also accept portlets, making it more of a multipurpose one-stop shop for IT services.
Over time, the portal functionality has been removed release by release, as the interface aged and NetIQ did not want to keep supporting it in the User Application.
Over the years, the interface to the User Application has been fairly criticized for its look and feel; modern interfaces moved on, but the User Application interface stayed much the same. It worked, but it was just not modern looking, and in some ways it was missing functionality that was now expected. All along, the response had been that the current interface is a reference: you can always go write your own in whatever framework you desire, and just make the SOAP calls to the back end to perform the actions you need.
In Identity Manager 4.5 NetIQ went out and actually did that. You can see the results as the Identity Manager Home and Provisioning Dashboard. (Technically they first released it for IDM 4.02 as an Enhancement Pack, but it is fully part of 4.5, and support for 4.02 has ended.)
Using more modern Web 2.0-style approaches in the Home/Dash interfaces allowed them to build something that looks more like today's web pages, leverages modern web toolkits and frameworks, and looks good on a tablet or a laptop display. However, it also changed authentication: before, the user would authenticate to the User Application itself, perhaps via an SSO method, perhaps with just a username and password, and that was it. Now, each module needs to authenticate the user.
Adding in a new login for every single module they added (Home, Dash, SSPR, Reporting, Catalog Access, Access Review) would be ludicrous and unacceptable, so they looked around the product suite, and the general market, for an approach to link them all together. On the one hand, NetIQ owns NetIQ Access Manager, so they could have included the Access Manager Appliance, requiring yet another server to run, to manage the Single Sign-On aspects of the product. They could have written something new from scratch, or simply required some other SSO product to be in place.
Instead they looked at the xAccess product line (x here meaning Mobile, Social, or Cloud) which had a similar issue and they took the federation components out of Access Manager, and made a much thinner, simpler product called OSP (One SSO Provider). Amusingly, I was at a conference talking with the architects of OSP and NAM, and they apparently sit across the hall from each other in Provo, so that is a good sign.
In order to simplify OSP, they really took only the SAML federation, Kerberos, and OAuth provider components out of NAM, and left behind form fill, identity injection, and some of the other cool stuff NAM can perform. (In some ways, they stripped NAM down until it could only do as little as a product like Okta can.) As a full-fledged Access Management platform, OSP would thus be considered lacking; as a simple federation tool, it is actually pretty good.
Next they made their new (and old) applications use OAuth for ticketing of permissions, with OSP as the ticket granter and authentication front end. This meant that the User Application itself, which is still there under the covers and still used to render workflow forms and process workflows, had to accept OAuth for authentication. To replace the password management and user profile management aspects, SSPR (Self Service Password Reset) was added to the suite, and it too can accept OAuth tickets.
With all that said, this means installing IDM 4.5 has new components that are simple once you understand them, but getting that understanding can be tricky. It is no longer just one Web App as the front end; it is a series of them that all need to work with OSP.
In the xAccess product line there is a nice configuration GUI for OSP that does all the work that is needed. In IDM it is just a single configuration file. The very good news in IDM 4.5 is that almost everything needed to configure the web apps and OSP is stored in the ism-configuration.properties file. No more unzipping the WAR, editing files, and repacking it. No more Java tools that do that in the background for you either. This makes patching and management much easier.
Bringing in OSP brings in some of OSP's dependencies, however, which you need to be aware of when installing.
The first thing I would suggest: if you plan to use a public certificate bought from a well-known CA, get it first by generating a CSR (Certificate Signing Request), and have it ready during the installation. The Subject Name of the certificate MUST match the DNS name users will use to access OSP. I have tried Subject Alternative Names in certificates with mixed success. A wildcard cert is great for this, if your organization has one.
If you let keytool prompt you rather than passing the password on the command line, it will ask you to set a keystore password when you execute the command.
Additionally, you actually need at least one, and possibly two, certificates. You can share one, but sometimes it is better to have two. Tomcat itself, which presents the web pages people will see, needs a certificate; this should be signed by a well-known CA so browsers do not complain.
The second is for OSP itself, which is used for SAML federation and is rarely publicly exposed. The Tomcat certificate might have a 1-year term and need to be renewed yearly. That requires updating the cert in the keystore, which you could do in a second copy offline, then apply late at night with a restart of Tomcat. The OSP certificate, however, is shared with the federated IDPs (Identity Providers), who need to update as well. Thus it is common to generate a self-signed 10- or 20-year cert for OSP, so that the SAML federation trust rarely times out. Of course, if your IDP is using a self-signed 1-year cert, you could run into the same problem.
The issue is one of trust. An SSL connection starts by using RSA key pairs to exchange a shared secret; all further encryption is done with that shared secret symmetrically, which is much faster than RSA's asymmetric approach. But you need to securely exchange that first secret. Thus the server has a private key, and your browser trusts that it is real because the CA that signed the certificate is explicitly trusted. If it is a self-signed certificate, the signer is by definition neither well known nor well trusted, so you have to import the public key as a trusted signer (the -trustcacerts switch on keytool).
If the IDP or the OSP certificate is a 1-year cert signed by a well-known CA, then renewing it via that CA is not a big deal: once it is close to expiring, renew it and restart both ends of the connection. The signer (the well-known CA) is still trusted, so no changes are needed. If the IDP or OSP uses a self-signed cert, then once it expires, the new certificate has to be explicitly trusted as a signer.
Thus you could use a single certificate for both Tomcat and OSP, but when it renews you want to know in advance what work needs to be done on the IDP end, to be ready for it, since well-known CAs rarely sign for longer than 1-3 years.
My keytool example above shows only one keystore, but in reality there are two: the one the OSP instance uses, and the one the rest of the Java stack uses (/opt/netiq/idm/apps/jre/lib/security/cacerts). Both keystores need to trust a few certs.
1) The eDirectory tree CA. 2) The Tomcat certificate. 3) The OSP certificate's signer.
It seems odd that although the Java stack has the certificates, the OSP keystore still needs them, and vice versa, but it is true: each operates in a separate security domain and needs its own trust.
URL consistency is important: when you install OSP you are asked for some URLs, and they must match what the user will be typing. If you protect OSP via a reverse proxy using NAM or any other Access Management product, that needs to match as well.
Additionally, although you can run User App, Home, and Dash on web application servers like JBoss Enterprise or WebSphere, OSP is only supported on Tomcat. So you will need a Tomcat instance regardless.
Using the installer from the DVD, it is very easy to install the JVM, a Tomcat instance, and the database on a fresh Linux server in a single pass. Once you have that done, the first thing I would do is install the certificate.
For Tomcat this means editing the /opt/netiq/idm/apps/tomcat/conf/server.xml file and finding the HTTPS connector that is commented out. You need to specify the keystore your certificate is stored in, and the store password. For reasons I do not fully understand, I can only get this to work if the keystore and private key passwords are the same.
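A reconstructed sketch of what that connector ends up looking like (the keystore path and password are placeholders):

```xml
<!-- HTTPS connector in /opt/netiq/idm/apps/tomcat/conf/server.xml.
     sslEnabledProtocols drops SSLv3 and TLS 1.0 by listing only 1.1/1.2.
     keystorePass must match the private key password. -->
<Connector port="8443" protocol="org.apache.coyote.http11.Http11NioProtocol"
           maxThreads="150" SSLEnabled="true" scheme="https" secure="true"
           clientAuth="false"
           sslEnabledProtocols="TLSv1.1,TLSv1.2"
           keystoreFile="/opt/netiq/idm/apps/tomcat/conf/tomcat.ks"
           keystorePass="changeit" />
```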
Note that above I only disabled SSL and TLS 1.0 (by allowing only 1.1 and 1.2); you will need to disable the weak ciphers as well, by explicitly listing all the allowed ciphers. A Google search will turn up the appropriate settings for Tomcat; the list is just too long and ugly for an article like this.
In my keytool example, I called my server myID.myDomain.com, and this is critical. It means that when you enter a URL in the configupdate.sh utility, or manually edit the ism-configuration.properties file, every URL has to start with https://myID.myDomain.com/ at a minimum. Out of the box, Tomcat listens on port 8443, and you may wish to redirect port 443, standard HTTPS, to it. Ports below 1024 require root permissions to bind, so you would potentially need to run Tomcat as root, which would mean that an exploit could expose root access to the web. Therefore Tomcat runs as the novlwww user, not as root, but cannot bind to port 443. There are many ways around this; my colleague wrote a short shell script, run as root rather than by Tomcat, that uses iptables to reroute all inbound 443 traffic to 8443.
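The iptables approach could look something like the rules below (run as root, e.g. from a boot script; the exact script my colleague wrote is not reproduced here, this is just the general technique):

```shell
# Redirect inbound HTTPS on 443 to Tomcat's 8443, so Tomcat can keep
# running as the unprivileged novlwww user.
iptables -t nat -A PREROUTING -p tcp --dport 443 -j REDIRECT --to-port 8443

# PREROUTING is skipped for locally generated traffic; cover connections
# made from the server itself as well:
iptables -t nat -A OUTPUT -o lo -p tcp --dport 443 -j REDIRECT --to-port 8443
```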
If you redirect to 443, the URL is oddly NOT https://myID.myDomain.com:443/, because browsers like to be clever and drop the 443; it is implied by HTTPS with no port. But that would make the URL in the browser different from the one in the OSP config. That is how sensitive OSP can be to the URL.
What I usually do is get OSP working with the Tomcat SSL cert and everything else (SAML, if you are using it), and only then install the User Application. It is much easier to troubleshoot OSP by itself; Tomcat restarts faster with only one small web app, and so on.
Hopefully you will find this helpful as you get started with IDM 4.5.