Absent Member.

Aggregation is running me out of disk space

Hi,

I have some SmartConnectors for the Check Point provider. They receive a huge number of events, so I have configured aggregation on these SmartConnectors.

Now I have noticed that some of them have stopped receiving events, and looking locally on the SmartConnector server I saw that it is running out of disk space.

If I look in "$ARCSIGHT_HOME\current\user\agent\agentdata" I see lots of files with names similar to "xxxxxxxxxxx==.prstdout.12345". These files are about 10 MB each.

On one SmartConnector they take up 30 GB of space; on another, 15 GB.
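(For reference, here is a quick way to total up what these files occupy — a sketch, assuming Python is available on the connector host; the function name is my own:)

```python
from pathlib import Path

def prstdout_usage(agentdata_dir):
    """Count and total the size (in bytes) of *.prstdout.* / *.prstderr.*
    files sitting directly in the agentdata directory."""
    count, total = 0, 0
    for p in Path(agentdata_dir).iterdir():
        if p.is_file() and (".prstdout." in p.name or ".prstderr." in p.name):
            count += 1
            total += p.stat().st_size
    return count, total

# Usage: point it at the connector's agentdata directory, e.g.
# n, size = prstdout_usage(r"$ARCSIGHT_HOME\current\user\agent\agentdata")
```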

Does anyone know why this happens, or is there anything I can do?

Micro Focus Expert

Re: Aggregation is running me out of disk space

Hi!

Sounds like caching...

What is the current/logs/agent.out.wrapper.log telling you?

The file type *prstdout.12345 doesn't sound familiar to me, but on our systems, aggregation had a huge impact on our inbound cache (which is not directly visible), so we had to increase the JVM memory from 256 MB to 1 GB. Maybe this also affects your environment. You should also check the timestamps, e.g. end time vs. manager receipt time.

Cheers Tobias

Absent Member.

Re: Aggregation is running me out of disk space

Hi Tobias,

How did you increase the JVM memory from 256 MB to 1 GB?

Is there any way to monitor the resources available to this SmartConnector so I can detect this behaviour?
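(One low-tech way to catch this before the disk fills is to watch free space on the partition holding agentdata — a sketch, assuming Python on the connector host; the 5 GB threshold is an arbitrary example:)

```python
import shutil

def free_space_ok(path, min_free_gb=5.0):
    """Return (free_gb, ok) for the filesystem containing `path`.
    `ok` is False once free space drops below the threshold."""
    free_gb = shutil.disk_usage(path).free / 1024**3
    return free_gb, free_gb >= min_free_gb
```

Scheduled via cron or Task Scheduler, something like this could raise an alert before the connector starts failing writes.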

Here are some of the errors it shows:

*************************************

FATAL EXCEPTION:
INFO   | jvm 2    | 2011/06/20 05:51:00 | com.arcsight.agent.lc.k: Serializer error while reading cache data
INFO   | jvm 2    | 2011/06/20 05:51:00 |  at com.arcsight.agent.lc.c.e(c.java:833)
INFO   | jvm 2    | 2011/06/20 05:51:00 |  at com.arcsight.agent.lc.c.a(c.java:769)
INFO   | jvm 2    | 2011/06/20 05:51:00 |  at com.arcsight.agent.lc.c.d(c.java:668)
INFO   | jvm 2    | 2011/06/20 05:51:00 |  at com.arcsight.agent.lc.c.n(c.java:637)
INFO   | jvm 2    | 2011/06/20 05:51:00 |  at com.arcsight.agent.eg.d.d(d.java:312)
INFO   | jvm 2    | 2011/06/20 05:51:00 |  at com.arcsight.agent.eg.a.run(a.java:89)
INFO   | jvm 2    | 2011/06/20 05:51:00 | Caused by: com.arcsight.common.serialize.SerializationException: Could not read content type
INFO   | jvm 2    | 2011/06/20 05:51:00 |  at com.arcsight.common.serialize.StreamHandler._readContentType(StreamHandler.java:173)
INFO   | jvm 2    | 2011/06/20 05:51:00 |  at com.arcsight.common.serialize.StreamHandler.unmarshallAll(StreamHandler.java:421)
INFO   | jvm 2    | 2011/06/20 05:51:00 |  at com.arcsight.agent.lc.c.a(c.java:857)
INFO   | jvm 2    | 2011/06/20 05:51:00 |  at com.arcsight.agent.lc.c.e(c.java:831)
INFO   | jvm 2    | 2011/06/20 05:51:00 |  ... 5 more
INFO   | jvm 2    | 2011/06/20 05:51:00 | Caused by: java.util.zip.ZipException: unknown compression method
INFO   | jvm 2    | 2011/06/20 05:51:00 |  at java.util.zip.InflaterInputStream.read(InflaterInputStream.java:147)
INFO   | jvm 2    | 2011/06/20 05:51:00 |  at sun.nio.cs.StreamDecoder.readBytes(StreamDecoder.java:264)
INFO   | jvm 2    | 2011/06/20 05:51:00 |  at sun.nio.cs.StreamDecoder.implRead(StreamDecoder.java:306)
INFO   | jvm 2    | 2011/06/20 05:51:00 |  at sun.nio.cs.StreamDecoder.read(StreamDecoder.java:158)
INFO   | jvm 2    | 2011/06/20 05:51:00 |  at java.io.InputStreamReader.read(InputStreamReader.java:167)
INFO   | jvm 2    | 2011/06/20 05:51:00 |  at java.io.BufferedReader.fill(BufferedReader.java:136)
INFO   | jvm 2    | 2011/06/20 05:51:00 |  at java.io.BufferedReader.read(BufferedReader.java:157)
INFO   | jvm 2    | 2011/06/20 05:51:00 |  at com.arcsight.common.serialize.csv.CSVSerializerUtil.readLine(CSVSerializerUtil.java:824)
INFO   | jvm 2    | 2011/06/20 05:51:00 |  at com.arcsight.common.serialize.csv.CSVSerializerUtil.readLine(CSVSerializerUtil.java:806)
INFO   | jvm 2    | 2011/06/20 05:51:00 |  at com.arcsight.common.serialize.StreamHandler._readContentType(StreamHandler.java:170)
INFO   | jvm 2    | 2011/06/20 05:51:00 |  ... 8 more

****************************************

Insufficient disk space ("Espacio en disco insuficiente" in the trace below is Spanish for "insufficient disk space"):

log4j:ERROR Failed to flush writer,
INFO   | jvm 2    | 2011/06/20 04:32:36 | java.io.IOException: Espacio en disco insuficiente
INFO   | jvm 2    | 2011/06/20 04:32:36 |  at java.io.FileOutputStream.writeBytes(Native Method)
INFO   | jvm 2    | 2011/06/20 04:32:36 |  at java.io.FileOutputStream.write(FileOutputStream.java:260)
INFO   | jvm 2    | 2011/06/20 04:32:36 |  at sun.nio.cs.StreamEncoder.writeBytes(StreamEncoder.java:202)
INFO   | jvm 2    | 2011/06/20 04:32:36 |  at sun.nio.cs.StreamEncoder.implFlushBuffer(StreamEncoder.java:272)
INFO   | jvm 2    | 2011/06/20 04:32:36 |  at sun.nio.cs.StreamEncoder.implFlush(StreamEncoder.java:276)
INFO   | jvm 2    | 2011/06/20 04:32:36 |  at sun.nio.cs.StreamEncoder.flush(StreamEncoder.java:122)
INFO   | jvm 2    | 2011/06/20 04:32:36 |  at java.io.OutputStreamWriter.flush(OutputStreamWriter.java:212)
INFO   | jvm 2    | 2011/06/20 04:32:36 |  at org.apache.log4j.helpers.QuietWriter.flush(QuietWriter.java:49)
INFO   | jvm 2    | 2011/06/20 04:32:36 |  at org.apache.log4j.WriterAppender.subAppend(WriterAppender.java:309)
INFO   | jvm 2    | 2011/06/20 04:32:36 |  at org.apache.log4j.RollingFileAppender.subAppend(RollingFileAppender.java:294)
INFO   | jvm 2    | 2011/06/20 04:32:36 |  at org.apache.log4j.WriterAppender.append(WriterAppender.java:157)
INFO   | jvm 2    | 2011/06/20 04:32:36 |  at com.arcsight.common.log.n.append(n.java:80)
INFO   | jvm 2    | 2011/06/20 04:32:36 |  at org.apache.log4j.AppenderSkeleton.doAppend(AppenderSkeleton.java:251)
INFO   | jvm 2    | 2011/06/20 04:32:36 |  at org.apache.log4j.helpers.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:57)
INFO   | jvm 2    | 2011/06/20 04:32:36 |  at org.apache.log4j.Category.callAppenders(Category.java:255)
INFO   | jvm 2    | 2011/06/20 04:32:36 |  at org.apache.log4j.Category.forcedLog(Category.java:445)
INFO   | jvm 2    | 2011/06/20 04:32:36 |  at org.apache.log4j.Category.fatal(Category.java:436)
INFO   | jvm 2    | 2011/06/20 04:32:36 |  at com.arcsight.common.log.f.fatal(f.java:397)
INFO   | jvm 2    | 2011/06/20 04:32:36 |  at com.arcsight.common.log.LogManager.fatal(LogManager.java:1059)
INFO   | jvm 2    | 2011/06/20 04:32:36 |  at com.arcsight.agent.util.c.b.run(b.java:210)

*************************************

Best Regards

Micro Focus Expert

Re: Aggregation is running me out of disk space

I would recommend:

  • stop the connector
  • adjust the following lines in current/user/agent/agent.wrapper.conf
    • # Initial Java Heap Size (in MB)
      wrapper.java.initmemory=1024

      # Maximum Java Heap Size (in MB)
      wrapper.java.maxmemory=1024
  • delete the files in user/agent/agentdata (ArcSight KB #1989)
  • restart the connector
  • watch errors in agent.log and agent.out.wrapper.log
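(The deletion step above can be scripted with a dry-run safeguard — a sketch; run it only while the connector is stopped, and note that the helper name and `dry_run` flag are my own, not an ArcSight tool:)

```python
from pathlib import Path

def purge_agentdata(agentdata_dir, dry_run=True):
    """List (and, when dry_run=False, delete) the files under agentdata.
    Only run the destructive pass while the connector is stopped."""
    touched = []
    for p in Path(agentdata_dir).iterdir():
        if p.is_file():
            if not dry_run:
                p.unlink()
            touched.append(p.name)
    return touched

# Usage: review purge_agentdata(path) first, then purge_agentdata(path, dry_run=False)
```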


br Tobias


Absent Member.

Re: Aggregation is running me out of disk space

Did you find a solution for this problem?

I also have lots of "xxxxxxxxxxx==.prstdout.12345" files.

The maxmemory is already set to 1024 MB.

New Member.

Re: Aggregation is running me out of disk space

To explain the filename:

“connectorid_n.prstdout.f” (or .prstderr) files are process files, where connectorid is the connector ID, n is the thread index, and f is the file number in the queue.
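(Based on that naming scheme, the components can be pulled apart with a regular expression — a sketch matching only the format described above; real file names may vary:)

```python
import re

# connectorid_n.prstdout.f  (or .prstderr), per the description above
PROC_FILE = re.compile(
    r"^(?P<connector_id>.+)_(?P<thread>\d+)"
    r"\.prstd(?:out|err)\.(?P<filenum>\d+)$"
)

def parse_proc_file(name):
    """Return a dict of name components, or None if `name` is not a process file."""
    m = PROC_FILE.match(name)
    return m.groupdict() if m else None
```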

To monitor whether you are running out of memory due to aggregation, I recommend using logfu.

If the memory never hits the JVM's upper limit, then the root cause is probably something else.

Till

Absent Member.

Re: Aggregation is running me out of disk space

The connector does not run out of memory, but I cannot find out why the events are processed so slowly and why there are so many process files.

I tried multithreading and upgrading to the latest connector version, but I have not found a solution to this problem.

Hannes

EDIT: I deleted the complete container and rebuilt it, and now the connector is no longer caching.
