
"Syslog File" connector issue ( one time read )

Hello Guys,

 

I have created a "Syslog File" connector to read some pulled Linux audit.log files. 

I created a script that fetches the logs via SFTP to /root/ServerLog1/ on a scheduled job every 15 minutes and replaces the existing file with the new one. I am seeing the problem below:

- The connector reads the file only once. I can see the parsed logs in my Logger, but after the file is updated 15 minutes later and new entries are added, no new logs show up.
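
For context, the scheduled fetch job looks roughly like the sketch below (simplified; the hostname, user, and remote path are placeholders rather than my real values, and key-based SFTP authentication is assumed). It is run from cron every 15 minutes:

#!/bin/bash
# Simplified sketch of the fetch job (host, user, and remote path are placeholders).
# Scheduled via cron, e.g.: */15 * * * * /root/fetch_audit.sh
REMOTE_HOST=linuxserver01.example.com
REMOTE_FILE=/var/log/audit/audit.log
LOCAL_DIR=/root/ServerLog1

# Pull the current audit.log over SFTP (batch mode needs non-interactive auth),
# then replace the previous local copy in a single move.
sftp -b - "loguser@${REMOTE_HOST}" <<EOF
get ${REMOTE_FILE} ${LOCAL_DIR}/audit.log.tmp
EOF
mv -f "${LOCAL_DIR}/audit.log.tmp" "${LOCAL_DIR}/audit.log"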

Below is my connector properties file:

agents[1].aggregationcachesize=1000
agents[1].configrestartsleeptime=5000
agents[1].customsubagentlist=ciscopix_syslog|netscreen_syslog|cyberguard_syslog|niksun_syslog|sourcefire_syslog|intrushield_syslog|ciscovpnios_syslog|sonicwall_syslog|apache_syslog|netscreen_idp_syslog|ciscovpnnoios_syslog|attackmitigator_syslog|rsaace_syslog|ciscoaironet_syslog|ciscoworks_syslog|ciscorouter_syslog|nortelvpn_syslog|pf_syslog|coreguard_syslog|watchguard_syslog|fortigate_syslog|peakflow_syslog|honeyd_syslog|neoteris_syslog|prosafe_syslog|trushield_syslog|alcatel_syslog|extreme_syslog|tippingpoint_syslog|nokiasecurityplatform_syslog|whatsup_syslog|airdefense_syslog|stealthwatch_syslog|nagios_syslog|netcontinuum_syslog|cef_syslog|tlattackmitigator_ng_syslog|airmagnet_enterprise_syslog|manhunt_syslog|m40e_aspic_syslog|ironmail_syslog|ciscorouter_nonios_syslog|ingrian_syslog|nitrosecurity_syslog|junipernetscreenvpn_syslog|catos_syslog|ipolicy_syslog|symantecnetworksecurity_syslog|bigiron_syslog|type80_syslog|miragecounterpoint_syslog|newbury_syslog|packetalarm_syslog|cyberguard6_syslog|neowatcher_syslog|netkeeper_syslog|snare_syslog|ntsyslog_syslog|f5bigip_syslog|sms_syslog|ciscocss_syslog|barracuda_spamfw_syslog|radware_defensepro_syslog|barracuda_spamfw_ng_syslog|bluecoatsg_syslog|peakflowx_syslog|aruba_syslog|mcafeesig_syslog|stonegate_syslog|ciscosecureacs_syslog|tripwire_enterprise_7_7_syslog|tripwire_enterprise_syslog|datagram_iis_syslog|oracle_audit_syslog|sms7x_syslog|messagegate_syslog|cyberguard52_syslog|symantecendpointprotection_syslog|cisco_mse|junipernetscreenvpn_6x_syslog|netscreen_idp5_syslog|bsm_syslog|junipernetscreenvpn_keyvalue_syslog|citrix_syslog|linux_auditd_syslog|netappfiler_syslog|vmwareesx_syslog|ciscoise_monitoringaudit_syslog|aixaudit_syslog|ciscoairspace76_syslog|junos_sdsyslog|type80v3_syslog|vormetricdatasecurity_syslog|citrixnetscaler_syslog|tippingpoint_sms_2_5_syslog|tippingpoint_sms_audit_syslog|tippingpoint_device_audit_syslog|vmwareesx_4_1_syslog|infobloxnios_syslog|proofpoint_syslog|junos_syslog|cisco_nxos_syslog|ciscoise_syslog|hpprinter_syslog|hp_c7000_syslog|pulseconnectsecure_syslog|pulsepolicysecure_syslog|pulseconnectsecure_keyvalue_syslog|snare_syslog_heartbeat|ilo_syslog|ironport_websecurity_syslog|ironport_syslog|sidewinder_syslog|gauntlet_syslog|flexagent_syslog|sendmail_syslog|nsm_syslog|nsm2009_syslog|ciscosecureacs51_syslog|hph3c_syslog|hp_ux_syslog|checkpoint_syslog|hpprocurve_syslog|mcafee_webgateway_syslog|isamaudit|isamsystem|barracuda_ng_firewall_f|barracuda_ng_firewall_f_streaming|generic_syslog|ciscoairspace_syslog
agents[1].destination.count=1
agents[1].destination[0].agentid=3QuYe724BABCT8RjftMCyRg\=\=
agents[1].destination[0].failover.count=0
agents[1].destination[0].params=<?xml version\="1.0" encoding\="UTF-8"?>\n<ParameterValues>\n <Parameter Name\="cefver" Value\="0.1"/>\n <Parameter Name\="port" Value\="443"/>\n <Parameter Name\="host" Value\="LG04.company.com"/>\n <Parameter Name\="rcvrname" Value\="Flex_Files"/>\n <Parameter Name\="compression" Value\="Disabled"/>\n</ParameterValues>\n
agents[1].destination[0].type=loggersecure
agents[1].deviceconnectionalertinterval=60000
agents[1].enabled=true
agents[1].entityid=VO4e724BABCT8hjftMCyRg\=\=
agents[1].fcp.version=0
agents[1].filequeuemaxfilecount=100
agents[1].filequeuemaxfilesize=10000000
agents[1].forwarder=false
agents[1].forwardmode=false
agents[1].id=3QuYe724BABCT8RjftMCyRg\=\=
agents[1].internalevent.filecount.duration=-1
agents[1].internalevent.filecount.enable=false
agents[1].internalevent.filecount.minfilecount=-1
agents[1].internalevent.filecount.timer.delay=60
agents[1].internalevent.fileend.enable=true
agents[1].internalevent.filestart.enable=true
agents[1].ipaddress=(ALL)
agents[1].onrotation=None
agents[1].onrotationoptions=processed
agents[1].overwriterawevent=false
agents[1].persistenceinterval=0
agents[1].pipename=/tmp/XXXX/audit.log
agents[1].port=514
agents[1].preservedstatecount=10
agents[1].preservedstateinterval=30000
agents[1].preservestate=true
agents[1].processingmode=batch
agents[1].protocol=UDP
agents[1].sleeptime=5
agents[1].solarissyslogconfigrestartcommand=kill -HUP `cat /etc/syslog.pid`
agents[1].startatend=true
agents[1].syslog.subagent.parsers=
agents[1].tcpbindretrytime=5000
agents[1].tcpbuffersize=10240
agents[1].tcpcleanupdelay=-1
agents[1].tcpmaxbuffersize=1048576
agents[1].tcpmaxidletime=-1
agents[1].tcpmaxsockets=1000
agents[1].type=syslog_file
agents[1].unparsedevents.log.enabled=false
agents[1].usecustomsubagentlist=false
agents[1].usefilequeue=true
agents[1].usenonlockingwindowsfilereader=false

 

Below are the connector parameters:

[Screenshot attachment: Capture.PNG]

 

Regards

4 Replies
Commander

agents[1].pipename=/tmp/XXX/audit.log
agents[1].port=514
agents[1].protocol=UDP

I've never seen UDP used with a file reader... but maybe I'm wrong.

Commodore

Hi @Miran Arsalan Sleman 

The connector is probably keeping a handle on the old file and still trying to read logs from it.

Try adding an agent mode to your agent.properties to either delete or rename the file after processing.

Delete file

agents[0].mode=DeleteFile

Rename file

agents[0].mode=RenameFileInTheSameDirectory

agents[0].modeoptions=processed

Hope this helps!!

 

Regards

Ajith K S


Hello, I'm not sure about adding the above. Will the connector duplicate logs by starting to read the new file from its first line, or will it continue from the last line it read in the previous (deleted) file? Regards,
Commodore

Hi @Miran Arsalan Sleman 

I understand from your post that you developed a script that fetches the logs every 15 minutes and replaces the existing file with the new one.

If you add either DeleteFile or RenameFileInTheSameDirectory, the connector will delete or rename the file once it has been processed. When the new file appears after 15 minutes, the connector should start reading it.

 

There is still a possibility of missing logs. If the connector does not read the audit.log file for some reason, that file will still be sitting in the folder, and after 15 minutes the new file will replace the existing one before the connector has processed it.

As a solution to this problem, you can do the following:

1. Use DeleteFile as agent mode.

2. Write the script so it checks whether audit.log is already present in the folder; if it is, rename it to audit.log.1, audit.log.2, and so on (a rough sketch follows after this list).

3. Use the wildcard pattern 'audit.log*' in the connector configuration.
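
Putting the three steps together, the fetch script could rotate any unprocessed file before dropping in the new one. A rough sketch (the directory and naming are only illustrative):

#!/bin/bash
# Illustrative rotation step for the fetch script (directory is a placeholder).
DIR=/root/ServerLog1

# Step 2: if the previous audit.log is still there (i.e. the connector has not
# yet processed and deleted it), keep it under the next free audit.log.N name.
if [ -f "${DIR}/audit.log" ]; then
    n=1
    while [ -f "${DIR}/audit.log.${n}" ]; do
        n=$((n + 1))
    done
    mv "${DIR}/audit.log" "${DIR}/audit.log.${n}"
fi

# The new file is then fetched as ${DIR}/audit.log. With DeleteFile mode (step 1)
# and the wildcard 'audit.log*' in the connector configuration (step 3), the
# connector picks up the new file as well as any rotated leftovers and deletes
# each one after processing.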

 

Regards

Ajith K S
