Getting Started Building a SOAP Driver for IDM - Part 8

Starting A SOAP Driver for IDM Part 8:

Novell Identity Manager comes with a bunch of prebuilt, out of the box drivers that do what is needed in most cases. However, some drivers allow for so much flexibility that no out of the box configuration will ever be complete. The JDBC driver, which can connect to many different databases, comes with some of the big ones (Oracle, Microsoft SQL Server) configured, but the rest are sort of up to you, mostly because almost everyone uses databases differently. Novell has talked about a fan-out configuration for the JDBC driver that will come out after the formal release of Identity Manager 4.0. That will probably handle the case of out of the box database models, like using Oracle with 'Oracle Users'. In that case you can fairly easily imagine a model where the driver is set up to push users into many dozens of such Oracle databases that all use the same basic model of users, really differing only in host information (DB server, port, database name, etc.), with entitlements specifying which database the user gets access to.



The SOAP driver is even harder to provide useful default configurations for. It ships with DSML and SPML 1.0 configurations, since those are the only really mature standards for SOAP operations involving users. But SOAP is basically as open ended as you want it to be, and everyone does whatever they want.



For example, the User Application is more about Provisioning than about user events directly, but if you really wanted to, you could use the SOAP driver to talk to the User Application. (Actually, it is almost as much fun to use a Workflow with a SOAP integration activity to talk to the User Application.)



With that said, there are some big targets for SOAP that you could develop configurations for. I started this series to try to provide notions on how you might do that, using Salesforce.com as the target, since it is a pretty big target to aim at. However, the concepts involved would be much the same for any other SOAP system. Heck, I used the ideas I developed here in a SOAP integration activity in a Workflow to call User Application web services. Sounds silly, but a Provisioning Request Definition (PRD) that cancels another PRD can be quite useful! (There is a Start Workflow token, but there is no Stop Workflow token. So to make one, you use Start Workflow to call a Workflow or PRD that stops the specified workflow. The same trick solves the issue that there is no Approve Workflow token: use Start Workflow with a PRD that approves a running workflow.) I need to find the time to write an article about that whole concept.



In part 1 of this series, Getting Started Building a SOAP Driver for IDM - Part 1 I discussed some of the things you need to get started building a SOAP driver. I was using the example of Salesforce.com (henceforth known as SFDC, since typing the full name is too much of a pain each time). In Part 1 I focused on how you might start connecting via SOAP to get a session ID.



In Part 2 of this series Getting Started Building a SOAP Driver for IDM - Part 2 I discussed how you might process the results from SFDC after you submit a login request, and convert them into an <instance> document.



In Part 3 of this series Getting Started Building a SOAP Driver for IDM - Part 3 I discussed how you might handle query events and their responses.



In Part 4 of this series Getting Started Building a SOAP Driver for IDM - Part 4 I talked about some of the background stuff you need to manage, like attribute syntaxes, and left two more concepts hanging: Subscriber channel write events, like <add> or <modify> events, that need to be sent to SFDC, and the ability to get events onto the Publisher channel.



In Part 5 of this series Getting Started Building a SOAP Driver for IDM - Part 5 I started talking about how you would map add and modify events from Identity Manager into SFDC events. This would allow you to write changes (modify events) back to SFDC, or add new users to SFDC.


I started talking about how you would handle modify events, and left add events as an exercise. However, I did not finish the modify discussion: I showed some sample code to manage it, but I would like to discuss the actual process the code sample uses.



In Part 6 of this series Getting Started Building a SOAP Driver for IDM - Part 6 I finished talking about how to handle modify events. However add events were not addressed.



In Part 7 of this series Getting Started Building a SOAP Driver for IDM - Part 7 I discussed the issues you would need to address to handle <add> events, even though I did not actually implement it in practice. I also discussed that Novell has released Identity Manager 4.0 Advanced Edition, which has an integration module available for Salesforce.com.



As I said in Part 7, I still think this series is of value, even though Novell has shipped a configured driver for SFDC, primarily because the Novell driver in its current configuration only supports pushing users into SFDC on the Subscriber channel. It does not support much of anything on the Publisher channel. This approach is aimed at supporting the Publisher channel.



Now what is needed to get events out of SFDC? Well, this is going to be different from the usual driver work. SFDC has an API call, getUpdated, that lets you specify an object type, a start date, and an end date. The result is a list of database IDs for objects of the specified object class that changed in the specified time period.



This looks something like:



<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:urn="urn:enterprise.soap.sforce.com">
   <soapenv:Header>
      <urn:SessionHeader>
         <urn:sessionId>SomeSessionIDValue</urn:sessionId>
      </urn:SessionHeader>
   </soapenv:Header>
   <soapenv:Body>
      <urn:getUpdated>
         <urn:sObjectType>User</urn:sObjectType>
         <urn:startDate>?</urn:startDate>
         <urn:endDate>?</urn:endDate>
      </urn:getUpdated>
   </soapenv:Body>
</soapenv:Envelope>


The sObjectType is of course the SFDC object class name, and the startDate and endDate are where you specify the time range for which to get updated objects.



It is worth noting that you get an error if you ask for a range starting more than thirty-odd days ago, so be careful of that. The next question is about the time values: in my sample document I have question marks (?) in their place, since that is how soapUI shows them based on the WSDL. So what format is the time string in?



The good news is that you can figure it out using the getServerTimestamp API call in soapUI: the response contains a string that shows you how to format the time value.



Here is the getServerTimestamp call example:



<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:urn="urn:enterprise.soap.sforce.com">
   <soapenv:Header>
      <urn:SessionHeader>
         <urn:sessionId>?</urn:sessionId>
      </urn:SessionHeader>
   </soapenv:Header>
   <soapenv:Body>
      <urn:getServerTimestamp/>
   </soapenv:Body>
</soapenv:Envelope>


Try it in soapUI and the response will tell you that the current time looks something like:



2010-08-26T20:55:31.195Z



That is pretty easy to manage with the Convert Time token. One little trick is that the Z means Zulu, for GMT/UTC time. (I know those two are not exactly the same, but basically UTC is what is needed.) The letter T in the middle is just the ISO 8601 separator between the date and the time, and it is easy to get Convert Time to accept it in a pattern: surround it in single quotes in the time pattern string. So something like yyyy-MM-dd'T'HH:mm:ss.SSS'Z' gets the right formatting.
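
For reference, here is roughly what that looks like in the underlying DirXML Script, converting the SFDC string into CTIME format (a sketch only; the local variable name sfdc-time is just for illustration):

<token-convert-time dest-format="!CTIME" dest-tz="UTC" src-format="yyyy-MM-dd'T'HH:mm:ss.SSS'Z'" src-tz="UTC">
   <!-- local variable assumed to hold the string returned by getServerTimestamp -->
   <token-local-variable name="sfdc-time"/>
</token-convert-time>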



Now it turns out that implementing support for getServerTimestamp in your driver is really useful as well. After all, how do you ensure that the time in your lab or production system matches the time in the SFDC system? Simple: use this function call.



So I made a decision: I would overload the class-name attribute in <query> docs when I supported Query events. If it is a known object class, great, just convert it to a query as we talked about in previous articles. This means that with no extra work on the IDM side, I just make a query via the Query token and specify an API name instead of a real object class. In fact, in my Query handling rule, if the object class is unknown, I let the document pass through unmodified, so that a later rule can check whether the class-name is an API name being handled.
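
For instance, a policy that wants the SFDC server time could call the Query token with the API name as the class, something like this (a sketch; the variable name server-time-result is made up, and the scope hardly matters since my later rules only look at the class-name):

<do-set-local-variable name="server-time-result" scope="policy">
   <arg-node-set>
      <!-- class-name carries the SFDC API call name instead of a real object class -->
      <token-query class-name="getServerTimestamp" scope="entry"/>
   </arg-node-set>
</do-set-local-variable>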



The next set of rules handles the different API calls. For example, I implemented a directSOQL function, which takes the <read-attr> node value string and uses that as the SOQL call. This was VERY useful for migrating users. In my example case, the SFDC system had 30,000 Contacts in it, but only 1000 were relevant for my project. From a licensing perspective, migrating in all 30,000 of them would have been ridiculous and expensive, ignoring how much time it would have taken.
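
As a sketch, the <query> document my rules look for carries the SOQL statement in the <read-attr> node, something like this (I show it in the attr-name attribute here; the SOQL statement itself is just an example I made up):

<query class-name="directSOQL" scope="subtree">
   <read-attr attr-name="SELECT Id, FirstName, LastName, Email FROM Contact WHERE Department = 'Sales'"/>
</query>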



But in SOQL (SFDC's query language, very much modeled after SQL) I could define a complex query to get my objects of interest. However, I could not implement that kind of query using the Query token's options. (Or I could have, but it would have been so contrived, why bother?) Instead, if in the Output transform I saw a <query class-name="directSOQL">, I had a separate policy that generated the correct query to SFDC. I would have had to write a policy in the Input Transform as well to handle the conversion back to an <instance> document, but since all the <query> docs used SOQL at the end of the day, that case was already handled.



I then used the same approach to handle getServerTimestamp. If in the Output transform I saw a <query class-name="getServerTimestamp"> event, I had a rule to build the above XML doc to send to SFDC. Of course this did require a rule in the Input transform to handle a response document with a <urn:getServerTimestampResponse> node in it, just like I had a rule to handle <urn:queryResponse> XML documents.
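
The response document that rule needs to catch looks roughly like this (a sketch based on the enterprise WSDL; the element names may differ slightly in yours, and the timestamp is the example value from above):

<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:urn="urn:enterprise.soap.sforce.com">
   <soapenv:Body>
      <urn:getServerTimestampResponse>
         <urn:result>
            <urn:timestamp>2010-08-26T20:55:31.195Z</urn:timestamp>
         </urn:result>
      </urn:getServerTimestampResponse>
   </soapenv:Body>
</soapenv:Envelope>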



Having getServerTimestamp is very useful, as this way you are always sure that the end date you are sending is the current correct time. This means that if your system falls out of time sync with SFDC, you are still fine and do not really care, since you are working from SFDC's own clock. As with eDirectory, it does not matter what time SFDC thinks it is, so long as you use the same time it does.

Once the query for updated objects is completed, I make sure to store the end date timestamp (now converted back to CTIME format) into a value on the Driver object. I used the attribute Last Referenced Time, as it is part of base schema and available. I got the idea from Lothar Haeger's Password Notification Driver, IDM 3.5 Password Notification Service Driver, which I strongly recommend everyone learning IDM pick apart as a learning exercise. I guarantee you will learn some neat new tricks; Lothar is a sneaky guy, and very clever in his use of Policy. I know I learned at least four things from it. Anyway, Last Referenced Time is a structured attribute, so you have to specify the two components, one of which is a timestamp value, so it looks like a Date/Time attribute when you look at it in ConsoleOne or iManager.



This way you store the last time you succeeded on the Driver object, and on each driver startup you read that time back and use it as your start time. Then, even if the driver is down (for less than 30 days!), you recover and collect all the events that happened since you last polled for changes.



The response from getUpdated is just a list of <Id> nodes, each with a database ID value. What is quite annoying is that the object class we sent in with the query as the sObjectType is not returned in the result. This means that you cannot tell from the response alone what object class it is referring to.
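
So a getUpdated response looks something like this (a sketch; the element names follow the description above and may differ in your WSDL, and the ID values are made up). Notice there is nothing in it to tell you these are User objects; your policy has to remember which sObjectType it asked about:

<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:urn="urn:enterprise.soap.sforce.com">
   <soapenv:Body>
      <urn:getUpdatedResponse>
         <urn:result>
            <urn:Id>00530000000AbCdAAA</urn:Id>
            <urn:Id>00530000000WxYzAAA</urn:Id>
         </urn:result>
      </urn:getUpdatedResponse>
   </soapenv:Body>
</soapenv:Envelope>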



In my rule in the Input Transform that handles the getUpdatedResponse documents, I make sure to set a flag in the <instance> doc to indicate that this is a document generated from a getUpdatedResponse.
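
A minimal sketch of setting such a flag in DirXML Script (the flag name from-getUpdated is made up for illustration, and the XPath assumes the current operation node is the <instance> you just built):

<do-set-xml-attr expression="." name="from-getUpdated">
   <arg-string>
      <token-text xml:space="preserve">true</token-text>
   </arg-string>
</do-set-xml-attr>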



You should be able, using the ideas in the past few articles, to create these rules on your own. They take basically the same approach as the Query and Upsert functions discussed in past articles in the series.



However, some tricks are still needed. How do I check on a regular basis? Just use another trick I learned from Lothar: the driver heartbeat!



As of IDM 3.5, I think, Novell added a driver heartbeat on the Publisher channel that generates an event every X minutes, where X is defined in a configuration value. Alas, you cannot go lower than one minute, which might otherwise be helpful in some cases.



Anyway, you can detect such an event in the Publisher Event Transform policy set with a test for operation equal to status and XPath is true of @level='heartbeat'. Thus you have a polling cycle triggering your event.
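
In DirXML Script, the skeleton of such a rule looks roughly like this (the actions are left as a placeholder; that is where you would build and send your getUpdated query):

<rule>
   <description>Heartbeat: poll SFDC for updated objects</description>
   <conditions>
      <and>
         <if-operation op="equal">status</if-operation>
         <if-xpath op="true">@level='heartbeat'</if-xpath>
      </and>
   </conditions>
   <actions>
      <!-- build the getUpdated query here -->
   </actions>
</rule>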



Now there are a bunch of subtle things to consider here, so let's take a break and talk about them in Part 9 of this series.




