Absent Member.

Automation for "import csv file to active list"


Hi,

I have a CSV file on my local system. I need to import it into an Active List in ESM via a script (automation).

I have tried the ESM ActiveList Import Script by Allen Pomeroy for this task, but with no luck.

I would appreciate any helpful input.

Regards,

Priya Malik


Outstanding Contributor.

Hello Priya,

I can't open your link, I get a 403 Forbidden.

However, how I usually import information from files into ArcSight:

1. I create a Flex Connector to parse the file and map the information into the ArcSight schema. The guide to develop Flex Connectors is here:

If you have never done that before, it might take some practice.

2. Basically, each line of your file will be represented by an event sent to ArcSight by the Flex Connector. Thus, the next step is to create a rule in ESM that, for every event coming in from your Flex Connector, adds the relevant information from that event to your Active List.

When your file is modified, the Flex Connector will automatically parse the new information, send it to ESM and the Rule will add the information to the Active List.

Of course, the implementation of the steps above depends on your specific scenario; but that is the basic idea and I guess it should work in most cases.
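To make the idea concrete outside of ArcSight, here is a tiny Python sketch of that flow; the file name and the "IP" column are just example values for a one-column file, not ArcSight configuration:

# Illustration only (not ArcSight code): mimics the FlexConnector -> rule -> Active List
# flow in plain Python. "test.csv" and the "IP" column name are example values.
import csv

active_list = set()  # stands in for the ESM Active List

with open("test.csv", newline="") as f:
    for row in csv.DictReader(f):            # one CSV row becomes one "event"
        event = {                             # the kind of mapping a Flex parser would do
            "targetAddress": row["IP"],
            "deviceVendor": "FlexFile",
            "deviceProduct": "FlexFile",
        }
        active_list.add(event["targetAddress"])  # what the ESM rule would then do

print(sorted(active_list))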

All the best,

Stefan

Absent Member.

Hi Stefan,

I have created a FlexConnector (thanks for the suggestion). But in the Active Channel of ESM (ArcSight), I am getting the CSV file name (test.csv) instead of the contents of the CSV file.

Could you help me with this?

I am a novice in ArcSight.

Thanks and Regards,

Priya Malik

Outstanding Contributor.

Hello Priya,

If everything else is working OK, you didn't get any ERRORS during the Flex Connector installation, and your service and connector appear as "running", then I guess it must be an issue with your parser file.

My best guess is that the entry you see in the Active Channel with the .csv file name is actually the entry that says something like "file opened" or "started reading file". What I would do: check the log files agent.log and agent.out.wrapper.log in ~/current/logs/ and look for an entry saying that the connector started processing your file; also look for any ERROR entries to see whether there are issues (see the small check script after the entries below). The entries telling you that the file is at least opened should be similar to:

agent.log: "Started processing file"

agent.out.wrapper.log: "Created all streams/readers for the file"
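If you want to check those entries quickly, a small script along these lines can scan both logs for the "file opened" messages and for ERROR lines; the log directory is an assumption (~/current/logs/ under the connector install), so adjust it to your environment:

# Quick check of the connector logs for the "file opened" entries and any ERROR lines.
# LOG_DIR is an assumption (~/current/logs/ under the connector install); adjust it.
from pathlib import Path

LOG_DIR = Path.home() / "current" / "logs"
PATTERNS = ("Started processing file", "Created all streams/readers for the file", "ERROR")

for log_name in ("agent.log", "agent.out.wrapper.log"):
    log_path = LOG_DIR / log_name
    if not log_path.exists():
        print(f"{log_name}: not found at {log_path}")
        continue
    for line in log_path.read_text(errors="ignore").splitlines():
        if any(p in line for p in PATTERNS):
            print(f"{log_name}: {line}")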

You could also send me an example of the .csv file that you are using, with 2 or 3 entries, and if I have time today I will try to quickly recreate your scenario. Don't just send me the entries, but the exact file; you can erase any confidential information from the logs.

By the way, what kind of Flex Connector did you choose?

All the best,

Stefan

Absent Member.

Hello Stefan,

Thanks for the quick response.

Previously, I was using the 'file reader' connector, but yesterday I created another connector (an XML one). I checked the logs and got the entries below:

agent.log: "Started processing file"

agent.out.wrapper.log: "Created all streams/readers for the file"

But currently we are facing some issues in ESM and are not able to get any logs in ESM.

My CSV file looks like the following (it only has one field, named "IP"):

IP
2.2.3.4
4.5.6.7
120.23.34.67
111.23.34.45

Thanks and Regards,

Priya Malik

Outstanding Contributor.

No need for an XML connector for a file as basic as yours.

One more thing: how is this file going to be updated? Are entries going to be added at the end of the file, or are you going to erase the current file and replace it with a new one with the same name?

All the best,

Stefan

New Member.

You can convert the CSV file into an XML import file and use the ArcSight Archive command.

All the comments above seem to be about using a flex agent to convert your CSV into events that get sent into the system. From there you'd need rules to get the data into the lists. I'm using this method along with two other methods.

Method 1: Use the ArcSight API to write data to a list

- This works, but it's hard to get working well

Method 2: Use the "arcsight archive" command to import an xml format file

- This also works, but you need to convert the CSV into an ArcSight XML format that can be imported like an .arb (a rough conversion sketch follows after this list).

Method 3: Use a flex agent to read your data into events, then use rules to write the data to lists.

- This works, but has issues when values in your list are removed. For example, with an AD group list, as a member is removed it is difficult to keep the values up to date in real time.
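Regarding Method 2 (referenced above), the mechanical CSV-to-XML part could look roughly like the Python sketch below. The element and attribute names are placeholders only, not the real ArcSight archive schema; the actual structure should be copied from an XML export of an existing Active List made with the arcsight archive utility:

# Rough sketch only: turns a one-column CSV of IPs into a generic XML file.
# The element and attribute names below are PLACEHOLDERS, not the real ArcSight
# archive schema; copy the actual structure from an XML export of an existing
# Active List made with the "arcsight archive" utility mentioned above.
import csv
import xml.etree.ElementTree as ET

def csv_to_xml(csv_path, xml_path):
    root = ET.Element("ActiveList", attrib={"name": "Imported IPs"})  # placeholder names
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):                                 # expects an "IP" header
            entry = ET.SubElement(root, "Entry")
            ET.SubElement(entry, "Column", attrib={"name": "IP"}).text = row["IP"]
    ET.ElementTree(root).write(xml_path, encoding="utf-8", xml_declaration=True)

csv_to_xml("test.csv", "activelist_import.xml")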

Absent Member.

Hi Stefan,

Good Morning!!

I will erase the current file and replace it with a new one with the same name.

P.S.: When I was using the file reader connector, I was not getting the output below.

agent.log: "Started processing file"

agent.out.wrapper.log: "Created all streams/readers for the file"

But when I converted my CSV file into an XML file and used the XML connector, I was getting the output above.

Please suggest which connector I should use.

Thanks and Regards,

Priya Malik

Absent Member.

Hello Grant,

In Method 2, how do I convert a CSV file into the ArcSight XML format?

I need your suggestion on this.

Thanks and Regards,

Priya Malik

Outstanding Contributor. (Accepted Solution)

Hello Priya,

I have just recreated your scenario; it works for me. The steps are below:

1. Install the Regex Folder Follower FlexConnector (in agent.properties the type should appear as agents[0].type=sdkfolderreader).

2. During installation, fill in the following step with:

[screenshot: options.jpg]

Log Folder - where you will have your file

Configuration File - the first part of the name of your flex parser

You will get an error saying it can't find the parser; ignore it, you will add the parser afterwards.

3. Create a parser file (in my scenario it is called flexFile.sdkrfilereader.properties) and place it in ~\current\user\agent\flexagent\. The parser can be very simple for the file you sent me, just identifying one IP address per row (a quick way to sanity-check the regex is sketched after the steps below):

# flexFile.sdkrfilereader.properties - one token per row: the IP address
regex = (\\d{1,3}\\.\\d{1,3}\\.\\d{1,3}\\.\\d{1,3})

token.count = 1
token[0].name = address
token[0].type = IPAddress

# map the token into the ArcSight event schema
event.targetAddress = address
event.deviceVendor = __stringConstant("FlexFile")
event.deviceProduct = __stringConstant("FlexFile")

4. Place the file you want to parse at the location mentioned in point 2, "Log Folder".

5. Edit agent.properties at ~\current\user\agent\ by changing the following line to the name of the file you want to parse:

agents[0].wildcard=<name_of_your_file.csv>
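As mentioned in step 3, you can sanity-check the regex outside the connector before restarting it. A small Python sketch, using the sample rows from the file posted earlier in this thread (note that the .properties file doubles the backslashes, while Python uses single ones):

# Sanity-check of the step 3 regex against sample rows from the CSV.
# Note: the .properties file doubles the backslashes (\\d); in Python you write \d.
import re

PARSER_REGEX = re.compile(r"(\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})")
sample_rows = ["IP", "2.2.3.4", "4.5.6.7", "120.23.34.67", "111.23.34.45"]

for row in sample_rows:
    match = PARSER_REGEX.search(row)
    # The "IP" header row does not match the regex; only the address rows yield a token.
    print(f"{row!r:>16} -> {'token: ' + match.group(1) if match else 'no match'}")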

Start your connector, open an Active Channel, and you should already see it importing events. When you delete the current file with the IPs and add a new one, the connector will automatically read the new file and import the events. Proof is in the screenshot below, for a simple file with 3 IP addresses which I have imported twice:

[screenshot: working.jpg]
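Since the connector re-reads the file whenever it is replaced, the automation side can be a small script that rewrites the CSV in the Log Folder whenever a new set of IPs is available. A minimal Python sketch, with the folder path, file name, and IP list as placeholder values:

# Sketch of the file-replacement automation: writes the new IP list to a temporary
# file and swaps it into the Log Folder so the connector picks it up as a new file.
# LOG_FOLDER, TARGET_NAME and the source of new_ips are assumptions; adjust them.
import os
import tempfile

LOG_FOLDER = r"C:\flex\input"     # the "Log Folder" chosen in step 2 (example path)
TARGET_NAME = "test.csv"          # must match agents[0].wildcard from step 5

def replace_csv(new_ips):
    fd, tmp_path = tempfile.mkstemp(dir=LOG_FOLDER, suffix=".tmp")
    with os.fdopen(fd, "w") as tmp:
        tmp.write("IP\n")
        tmp.writelines(ip + "\n" for ip in new_ips)
    target = os.path.join(LOG_FOLDER, TARGET_NAME)
    if os.path.exists(target):
        os.remove(target)         # delete the old file, as described above
    os.rename(tmp_path, target)   # then drop in the new one with the same name

replace_csv(["2.2.3.4", "4.5.6.7", "120.23.34.67"])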

From here, create a Rule identifying events coming from your connector, create an Active List for the relevant information, and instruct the Rule to add to the Active List whenever it sees a new event from your connector. If you have issues with these steps, let me know.

Hope this works for you.

All the best,

Stefan

Absent Member.

Hey Stefan,

Good Morning!! and Thanks a ton 🙂

I will do it today and will let you know the result.

(As you have explained the whole process, I assume 99.9% of my work is already done.)

Thanks again 🙂

Thanks and Regards,

Priya Malik

Absent Member.

Hi Stefan,

Task done 🙂

[screenshot: csvData.png]

Again Thank you so much.

Regards,

Priya Malik
