MilanJ (Absent Member)
1245 views

Content development - clarification of Operations Agent data source use-case in simplified OBR CDE

Hi,

I am trying to use simplified OBR CDE 10.01 to get custom data collected by the Operations Agent.

Here is an example of the data on the OA agent:

[Image: OA custom data]

I want to get an OBR report similar to the one created in OMi Performance Graphing:

[Image: OA data in OMi Performance Grapher]

I have tried to implement this in simplified CDE. From the CDE UI point of view there were no problems, and I was also able to compile and deploy the content:

[Image: OA custom data in CDE]

But the result created by CDE does not seem to be correct. CDE creates only the fact table Rate_CSAS_DB (plus hourly and daily aggregation tables), but no dimension table for the Operations Agent CSAS_DB class.

Getting a custom class from OA looks like a basic OBR CDE use case, so I would expect CDE to also create a CSAS_DB dimension table and the relevant mapping to the K_CI_SYSTEM dimension, so that the result would be similar to what OBR provides for the Operations Agent SCOPE:FILESYSTEM, SCOPE:CPU, … data structures.

I tried to look at the out-of-the-box data model and the related Vertica tables for OA SCOPE:Filesystem:

  • Fact table/column SR_SM_FILESYSTEM / dsi_key_id_ ---> dim table/column K_SM_Filesystem / dsi_key_id (primary key)
  • Dim table/column K_SM_Filesystem / SystemRef ---> dim table/column K_CI_System / dsi_key_id (primary key)

There is a relation between the two dimensions (K_SM_Filesystem -> K_CI_System), so this looks like a SNOWFLAKE schema rather than the STAR schema, which is what simplified CDE currently supports. That is most probably the reason for not getting the right data after deployment of the content created by CDE.
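The snowflake relation above can be sketched with a toy example (SQLite standing in for Vertica; the sample rows are invented for illustration, only the table and column names come from the post):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Conformed dimension: system CIs
cur.execute("CREATE TABLE K_CI_System (dsi_key_id INTEGER PRIMARY KEY, system_name TEXT)")
# Dimension: filesystem instances, each pointing at its host system via SystemRef
cur.execute("CREATE TABLE K_SM_Filesystem (dsi_key_id INTEGER PRIMARY KEY, SystemRef INTEGER, fs_name TEXT)")
# Fact: filesystem samples, keyed only to the filesystem dimension
cur.execute("CREATE TABLE SR_SM_FILESYSTEM (dsi_key_id_ INTEGER, util REAL)")

cur.execute("INSERT INTO K_CI_System VALUES (1, 'hostA')")
cur.execute("INSERT INTO K_SM_Filesystem VALUES (10, 1, '/var')")
cur.execute("INSERT INTO SR_SM_FILESYSTEM VALUES (10, 73.5)")

# Snowflake: reaching the system requires a second hop through the
# filesystem dimension -- fact -> K_SM_Filesystem -> K_CI_System.
rows = cur.execute("""
    SELECT s.system_name, f.fs_name, sr.util
    FROM SR_SM_FILESYSTEM sr
    JOIN K_SM_Filesystem f ON f.dsi_key_id = sr.dsi_key_id_
    JOIN K_CI_System   s ON s.dsi_key_id = f.SystemRef
""").fetchall()
print(rows)  # [('hostA', '/var', 73.5)]
```

In a pure star schema the fact table would reference K_CI_System directly, which is the only shape simplified CDE generates for you.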

Can you please clarify what kind of data we can get from the Operations Agent using simplified CDE?

Thanks,

Milan

6 Replies
RK36 (Valued Contributor)

Re: Content development - clarification of Operations Agent data source use-case in simplified OBR C

Hi Milan,

Did you manage to get this working?

We have an Operations Connector system monitoring third-party systems like OEM and SCOM. I am trying to create custom content packs to collect the data from the Operations Connector agent. I am able to successfully create and deploy the content pack, but it seems no data is coming through to OBR.

Thanks,

Rajesh

MilanJ (Absent Member)

Re: Content development - clarification of Operations Agent data source use-case in simplified OBR C

Hi Rajesh,

you are facing exactly the same situation I was in some time ago - everything looks OK in simplified CDE, you get deployable content as a result, ... but in the end no data. As I tried to explain in this post, simplified CDE creates the wrong model for this use case. In the end I managed to get it working. You have to follow the non-simplified approach in CDE. This means:

  • Create the model and ABC stream XMLs
  • Create the ETL and ABC stream XMLs. For the collection ETLs you will need both an RTSM and a PA collection policy keeping the same "domain_name" attribute. This is important, and it's not mentioned in the doc. The rest is more or less described in the doc.
  • Create the universe and report in BO. I had trouble generating the universe in CDE on Windows (there is another post on HPLN from me about it). The BO universe generated by CDE was not perfect, so I had to adjust a couple of things in the BO client.
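A tiny sanity check for the domain_name requirement in the second step - purely illustrative, assuming the policies are plain XML files whose root element carries the domain_name attribute (the element name and the inline documents below are invented, not the real CDE policy schema):

```python
import io
import xml.etree.ElementTree as ET

# Stand-ins for the RTSM and PA collection policy files; in practice,
# open the actual policy XMLs instead of these inline strings.
rtsm_policy = io.StringIO('<etlcollector domain_name="CSAS_DB"/>')
pa_policy = io.StringIO('<etlcollector domain_name="CSAS_DB"/>')

# Both policies must carry the same domain_name, or collection and load
# will not line up.
domains = [ET.parse(f).getroot().get("domain_name") for f in (rtsm_policy, pa_policy)]
print(domains)  # ['CSAS_DB', 'CSAS_DB']
assert domains[0] == domains[1], "RTSM and PA policies must share the same domain_name"
```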

For MS SCOM, stay tuned. HPE announced that MS SCOM ETLs should be published on HPLN as part of the Operations Bridge 2017.01 release. I hope it will happen soon.

Regards,

Milan

RK36 (Valued Contributor)

Re: Content development - clarification of Operations Agent data source use-case in simplified OBR C

Thanks Milan

RK36 (Valued Contributor)

Re: Content development - clarification of Operations Agent data source use-case in simplified OBR C

Hi Milan,

I managed to get the data collected successfully and I can see it in the stage tables, but it fails to load into the actual tables.

2017-02-14 17:07:48,739 FATAL [ ABCBatchID:415389, ABCStreamID:TestDomain@load_Fact_stream, ABCStepID:DataLoad_load_Fact, ABCProcessID:418049 ][load_Fact]:[unixODBC]ERROR 3429:  For 'nvl', types varchar and int are inconsistent
DETAIL:  Columns: unknown and unknown
 (SQL-42804) ;return value: 1

I found this error in the loader.log file. Any hints on where to look and what the issue could be?

Thanks,

Rajesh


MilanJ (Absent Member)

Re: Content development - clarification of Operations Agent data source use-case in simplified OBR C

Hi Rajesh,

based on the error message it looks like a wrong datatype on the "load_Fact" table - you are most probably trying to store a string into an integer column. I found the datatype mentioned in two places: the domain model XML and the ETL collection policy XMLs. Check the datatype in these XMLs, look at the collected CSV, and compare its content with the table/columns in Vertica. You should find the inconsistency.
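One way to hunt for that inconsistency is to scan the collected CSV for values that will not cast to the declared type - a hedged sketch, assuming a hand-written map of the column types declared in the XMLs (the column names and sample data below are hypothetical):

```python
import csv
import io

# Expected types as declared in the domain model / collection policy XMLs
# (hypothetical column names for illustration).
expected = {"system_id": int, "fs_util": float, "fs_name": str}

# Stand-in for a collected CSV; in practice, open the agent's CSV file.
sample = io.StringIO("system_id,fs_util,fs_name\n"
                     "1,73.5,/var\n"
                     "hostA,12.0,/tmp\n")

# Flag every value that cannot be converted to its declared type.
bad = []
for lineno, row in enumerate(csv.DictReader(sample), start=2):
    for col, typ in expected.items():
        if typ is str:
            continue  # strings always load
        try:
            typ(row[col])
        except ValueError:
            bad.append((lineno, col, row[col]))

print(bad)  # [(3, 'system_id', 'hostA')] -- a string headed for an integer column
```

Any hit in `bad` points at the CSV line and column whose content disagrees with the datatype declared in the XMLs and the Vertica table.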

Milan

The opinions expressed above are the personal opinions of the authors, not of Micro Focus. By using this site, you accept the Terms of Use and Rules of Participation. Certain versions of content ("Material") accessible here may contain branding from Hewlett-Packard Company (now HP Inc.) and Hewlett Packard Enterprise Company. As of September 1, 2017, the Material is now offered by Micro Focus, a separately owned and operated company. Any reference to the HP and Hewlett Packard Enterprise/HPE marks is historical in nature, and the HP and Hewlett Packard Enterprise/HPE marks are the property of their respective owners.