Respected Contributor

What is the step-by-step setup for FlexConnector?

Hello everyone,

I have a parser configuration for my connector, which reads IPS connection logs, but the log file is not being read. I remember that there is a configuration setting that allows reading for my "snort.sdkrfilereader.properties".


Can someone help me?


2 Replies
Respected Contributor

Here are links to the Snort connector documentation:


And if you have a problem with your connector, check the connector logs in the /current/logs directory for errors.
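A quick way to scan those logs is to grep for error-level lines. This is a sketch: the sample log lines and the /tmp path below are made up for illustration; on a real system you would run the same grep against agent.log under [ConnectorHome]/current/logs.

```shell
# Create a fake connector log so the filter can be demonstrated end-to-end.
# (On a real install, point grep at [ConnectorHome]/current/logs/agent.log.)
mkdir -p /tmp/demo_logs
printf '%s\n' \
  '[2019-01-02 10:00:00,123][INFO ][default.file] File reader started' \
  '[2019-01-02 10:00:01,456][ERROR][default.file] Unable to read file: permission denied' \
  > /tmp/demo_logs/agent.log

# Show only error/fatal lines, case-insensitively.
grep -iE 'error|fatal' /tmp/demo_logs/agent.log
```

Permission errors like the simulated one above are a common reason a file-reader connector silently reads nothing.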

Knowledge Partner

Step by step, in a practical way: this is how I do it, and I have created over 70 parsers; SecLex (my company) created more than 15 for three customers in December 2018 alone. I am a bit tired at the moment, so a disclaimer: there might be some spelling mistakes. This is very high level.

  1. Get Use Case requirements
  2. What is the logging mechanism of the log source, so you can select the proper connector?
    1. File
    2. XML
    3. JSON
    4. Syslog
    5. API
    6. etc...
  3. Select log sources based on these requirements and specify possible Events of Interest (EoI)
    1. IAAA (Identification, Authentication, Authorization, Auditing) events
    2. System events
    3. Security Events
    4. Etc.. 
  4. Perform log analysis. I normally import the logs into Excel or Notepad++ and first analyse the pattern of the logs
    1. Structured? 
      1. JSON
      2. XML
      3. CSV
      4. Delimited with
      5. DB
        1. Timebased
        2. ID-based
        3. etc...
    2. Non-structured (free format): complete sentences, unique per event type or activity (regex parser)
      1. Run [ConnectorHome]/current/bin/arcsight regex (Linux) or
        [ConnectorHome]\current\bin\arcsight regex (Windows) to open the Regex Tool
      2. Find the commonalities of the free-format strings, normally "timestamp [object.class] [severity] [free format log string]"
        1. e.g. 12:34 01-01-2019 [lalala.random] [Warning] this happened somewhere on whatever whatnot device
          (\d{2}:\d{2}\s+\d{2}-\d{2}-\d{4}) \[([^\]]+)\] \[(Info|Warning|Error|Fatal)\] (.*)
        2. Test it against your log sample and adjust accordingly. Make your regex not too greedy but also not too strict; try to find the middle ground.
        3. So when creating the regex parser, think about this:
          1. Define your Main Regex
          2. Define your tokens properly: give them logical names and proper types corresponding to the fields you're going to map them to.
          3. Map the relevant fields from your tokens
          4. As it is free format, use regexTokens or submessages to map additional info to other fields, to ensure proper information and context for the event. This ensures that you can correlate and/or effectively create use cases.
          5. Don't forget to add your severity mapping
          6. Based on your log analysis, create a categorization file; use categories.txt as an example.
          7. If necessary, use a mapping file to further enrich the CEF events
            1. e.g. a webserver log provides the HTTP status code; I used a mapping file to populate the name field with the meaning of the HTTP code, which I pulled from the Apache website and/or Wikipedia
      3. Now that you have developed everything: test it, iterate, optimize; test it again, iterate, optimize; test it again.
      4. Now document the parser: the regex, field mapping, map file, and categorization. Create an MD5 hash, package the documentation together with the .properties file, categories.csv, and map.#.properties, and hand it over.
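The regex-parser steps above (main regex, named tokens, field mapping, severity mapping) can be sketched in Python. This is illustrative only: a real FlexConnector defines all of this declaratively in a .properties file, not in code, and the field-mapping choices below are my own assumptions, not the author's.

```python
import re

# Main regex from the example above, with named groups playing
# the role of FlexConnector tokens.
MAIN_REGEX = re.compile(
    r'(?P<ts>\d{2}:\d{2}\s+\d{2}-\d{2}-\d{4})\s+'
    r'\[(?P<obj>[^\]]+)\]\s+'
    r'\[(?P<level>Info|Warning|Error|Fatal)\]\s+'
    r'(?P<msg>.*)'
)

# Severity mapping, as in step 5 (the target values are illustrative).
SEVERITY_MAP = {
    'Info': 'Low',
    'Warning': 'Medium',
    'Error': 'High',
    'Fatal': 'Very-High',
}

def parse(line):
    """Return a dict of mapped event fields, or None if the line does not match."""
    m = MAIN_REGEX.match(line)
    if m is None:
        return None
    # Token-to-field mapping; the CEF-style field names are assumptions.
    return {
        'deviceReceiptTime': m.group('ts'),
        'deviceEventClassId': m.group('obj'),
        'deviceSeverity': m.group('level'),
        'agentSeverity': SEVERITY_MAP.get(m.group('level'), 'Unknown'),
        'message': m.group('msg'),
    }

sample = ('12:34 01-01-2019 [lalala.random] [Warning] '
          'this happened somewhere on whatever whatnot device')
print(parse(sample))
```

Running the sample line through `parse` shows each token landing in its mapped field; a non-matching line returns None, which is the case you iterate on when tuning the regex against your log sample.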

