
Lotus integration and newly created DBs

Dear all!
In my Data Protector 6 environment (cell server up-to-date, on HP-UX 11.31) I set up a new Lotus integration backup on a Domino server.

I discovered really bad behaviour in the way the Lotus integration works: it does not automatically add new DBs created on the Lotus server to previously created backup specifications, nor does it exclude DBs deleted from the Lotus server (which causes a "file not found" error).

This is really bad: in a big enterprise environment, where the people who control backups and the people who administer Domino are different, this behaviour can lead to missed backups or to several backup errors...

Is there a way to avoid this?
Should I open a ticket to HP support?

Thanks, Alessandro

Re: Lotus integration and newly created DBs

OK, I solved it on my own.

A note: this is _not_ documented in the IBM integrations manual. Shame on you, HP!

PREMISE:
My situation: two W2003 servers with a Domino cluster on them (not an MS cluster), with many DBs replicated between them. But NOT ALL DBs are replicated, so I need to back up SOME DBs on the 1st server and SOME DBs on the 2nd one.

THE PROBLEM:
I created the two backup jobs for these servers by choosing (for each server) all the NSFs and un-checking the individual DBs I didn't need to back up.
I then noticed that the backup jobs did not automatically add new DBs created on the Lotus servers to the previously created backup specifications, nor did they exclude DBs deleted from the Lotus servers (giving a "file not found" error).

TROUBLESHOOTING:
During various tests I noticed these details:
- When you select the whole NSF root for backup, DP creates a backup script in which the individual DBs are not mentioned: this script backs up ALL the NSFs on the server, without searching for or excluding individual DBs.
- Starting from that selection, if you un-check even ONE single DB in the list below the NSF root, Data Protector changes the script, creating a list of ALL the individual DBs it has to back up (obviously without the unchecked one). This makes the system search for every (and ONLY the) DB listed in the script: if one of the DBs in the list has been deleted from the server, you get an "NSFDBOpen File not found" error. Even worse: a new DB added to the system does not automatically go into the backup, and you must add it manually. Practically a Hell...
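To make the two behaviours concrete, here is a small illustrative sketch in plain Python. This is NOT Data Protector code or its script syntax; it just models the logic described above: a "generalist" selection enumerates whatever exists at run time, while a fixed per-DB list breaks as soon as the filesystem drifts.

```python
# Illustrative model (plain Python, NOT Data Protector code or syntax) of the
# two selection behaviours described above.

def generalist_backup(dbs_on_server):
    """'Whole NSF root' selection: back up whatever NSF DBs exist right now,
    so new DBs are picked up and deleted DBs simply disappear from the run."""
    return sorted(dbs_on_server)

def fixed_list_backup(dbs_on_server, dbs_in_script):
    """Per-DB list selection: back up only the DBs enumerated in the script.
    Deleted DBs produce errors; newly created DBs are silently missed."""
    backed_up = sorted(db for db in dbs_in_script if db in dbs_on_server)
    errors = ['NSFDBOpen "File not found": ' + db
              for db in sorted(dbs_in_script) if db not in dbs_on_server]
    missed = sorted(db for db in dbs_on_server if db not in dbs_in_script)
    return backed_up, errors, missed

# The spec was created when a.nsf and b.nsf existed; since then b.nsf was
# deleted from the server and c.nsf was created.
server_now = {"a.nsf", "c.nsf"}
print(generalist_backup(server_now))                      # both current DBs
print(fixed_list_backup(server_now, ["a.nsf", "b.nsf"]))  # error + missed DB
```

With the fixed list, the deleted `b.nsf` yields an open error and the new `c.nsf` is never backed up, which is exactly the drift problem described above.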

SOLUTION:
If you choose to back up the WHOLE NSF root (creating a generalist script that backs up ALL the NSF DBs), you can then add the directories containing the DBs you do NOT want backed up to the "exclusion list" in the "Application Specific Options" (click the Help button for the format in which these directories must be written). Of course, the DBs to exclude must live in specific folders.
And TAKE CARE: the exclusion list DOES NOT WORK if the backup script contains a list of the individual DBs to back up. The script has to be "generalist": just put the check mark on "NSF" in the GUI.
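The exclusion mechanism can be sketched as path-prefix filtering applied on top of a generalist selection. Again, this is illustrative Python, not the real Data Protector exclusion-list syntax (check the Help button for the actual format), and the directory layout is hypothetical:

```python
# Illustrative model (plain Python, NOT the real Data Protector exclusion
# syntax) of a generalist selection combined with a directory exclusion list.

def select_for_backup(all_nsf_paths, excluded_dirs):
    """Take every NSF found under the root, then drop those that live
    under one of the excluded directories."""
    def is_excluded(path):
        return any(path.startswith(d.rstrip("/") + "/") for d in excluded_dirs)
    return sorted(p for p in all_nsf_paths if not is_excluded(p))

# Hypothetical layout: DBs already backed up on the other cluster node sit
# in archive/, so that whole directory can be excluded on this node.
server_dbs = ["mail/user1.nsf", "mail/user2.nsf",
              "archive/old1.nsf", "names.nsf"]
print(select_for_backup(server_dbs, ["archive"]))
# -> ['mail/user1.nsf', 'mail/user2.nsf', 'names.nsf']
```

Because the selection is re-enumerated on every run, a DB created tomorrow under `mail/` is backed up automatically, and a deleted one simply stops appearing, with no stale per-DB list to maintain.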

Best regards,
Alessandro - Milan