If I use a filter it runs OK.
I have around 550 servers and 20,000 jobs running. What can I do to keep the script running across all of my environment?
Update 30-Apr-12: By default the script now (re)parses only Jobs which have been modified since its last execution. This enables the script to be run on a regular basis to keep the data up to date. The zip file also includes a new SQL query which outputs the main columns from each table. Note: if you have previously run an older version of this script, then the first time you run this new one you must overwrite the previous output by running the script with the switch /modifiedonly=false
The ParseJobs script reads all of the properties of each Job and inserts them into a set of database tables for later viewing and analysis. The information may then be used in a number of ways, for example to identify which Knowledge Scripts and/or Servers a particular email recipient will receive, which Jobs have Monitoring Policy overrides, which Jobs may raise high severity Events, or the Jobs where Event Collapsing is disabled.
Separate tables are created to store Job information (e.g. JobID and host name), run parameters, schedule, advanced properties and action details, and the information may be saved to a different database from the source repository. In addition, the information from multiple repositories may be stored in the same tables, allowing Job properties to be compared across repositories. Each time the script is run, the existing records are deleted and recreated unless the append setting is changed. Where Monitoring Policy overrides are in place, the override value may be recorded instead of the default policy setting.
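To illustrate the kind of analysis these tables enable, here is a minimal sketch using Python's sqlite3 module. The table and column names (Jobs, JobAdvanced, EventCollapsing, and so on) are hypothetical stand-ins for this example only; the actual schema and example queries are documented in the help file included in the zip.

```python
import sqlite3

# Hypothetical miniature version of the ParseJobs output tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# One row per Job: JobID, the monitored host, and the Knowledge Script used.
cur.execute("CREATE TABLE Jobs (JobID INTEGER, HostName TEXT, KnowledgeScript TEXT)")
# Advanced properties are kept in a separate table, keyed by JobID.
cur.execute("CREATE TABLE JobAdvanced (JobID INTEGER, EventCollapsing TEXT, EventSeverity INTEGER)")

cur.executemany("INSERT INTO Jobs VALUES (?, ?, ?)", [
    (101, "SERVER01", "NT_CpuLoaded"),
    (102, "SERVER02", "SQL_Accessibility"),
])
cur.executemany("INSERT INTO JobAdvanced VALUES (?, ?, ?)", [
    (101, "Disabled", 5),
    (102, "Enabled", 25),
])

# Example query: find the Jobs where Event Collapsing is disabled,
# joining the Job table to its advanced-properties table on JobID.
rows = cur.execute("""
    SELECT j.JobID, j.HostName, j.KnowledgeScript
    FROM Jobs j
    JOIN JobAdvanced a ON a.JobID = j.JobID
    WHERE a.EventCollapsing = 'Disabled'
""").fetchall()
print(rows)  # [(101, 'SERVER01', 'NT_CpuLoaded')]
```

The same join pattern extends to the other tables described above (schedule, run parameters, actions), and because multiple repositories may share the tables, adding a repository column to the WHERE clause would let you compare Job properties across repositories.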
The attached zip file contains the script and a help file which describes its usage, output information and example queries.