
FlexConnector for a file that is reloaded every 5 min - is there any chance to read only the new entries?

I am going to develop a FlexConnector to read a file that is produced as the output of a SQL script. The script runs every 5 minutes; the old file is deleted and a new file (same name, with the new entries concatenated onto the previous content) is reloaded into the directory. My question is: how can I parse just the new entries, rather than re-reading from the beginning the lines that had already been read in the old file? Is there any way to work around this so that the FlexConnector parses the file and follows only the new lines in the reloaded file?

Thank you in advance for your response.


Hi,

Keep the parameter "agents[0].startatend=true" in "agent.properties" and try.
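For context, this is the single line that would be added or changed in agent.properties; the index 0 is an assumption for a connector with one configured file reader:

# skip existing content and begin reading at the end of the file on startup
agents[0].startatend=true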

Regards,

Sumanth M


Thank you Sumanth,

Here the file is not the same file; it is replaced when the new one, with the same name, is loaded. I think the connector puts a flag at the end of the file when it has been read, and if new lines are appended to it then the connector notices and processes them. But in this case the new file has no flag on it, I suppose.

Br//Fred


Farok,

The property you need is preservestate. This defaults to false, but you must set it to true so that the SmartConnector remembers where it was in processing the file.

Below is copied from the Flex Connectors Developer's Guide, Appendix G:

agents[x].preservestate

If set to true, the connector periodically remembers the last location read in the file, depending on the values set for the preservedstatecount and preservedstateinterval properties.

If set to false, then nothing is written and the connector has no record of where it left off. In this case, the values of preservedstatecount and preservedstateinterval are ignored.
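As an illustration, a minimal agent.properties sketch, assuming a single file reader at index 0 (the count and interval values below are purely illustrative; check the Developer's Guide for defaults and units):

# remember the last position read in the file
agents[0].preservestate=true
# how many state entries to keep and how often to persist them (illustrative values)
agents[0].preservedstatecount=10
agents[0].preservedstateinterval=30000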

Regards,

Martyn


I would add that I have done this in exactly the same circumstances, where an SQL script overwrote the file (in my case every hour), and it worked perfectly. BUT have you thought about how you will roll the file over every day? The connector can throw an error if a file that was, say, 800 records long at 23:59 suddenly becomes 2 records long at 00:04.

Martyn


Thank you for your responses. 

I implemented it as a MultiFile reader connector. The script runs every 15 minutes and a new file with unique details such as yyyyhhss in the name (file_xxxxxxxx) is created in the directory, containing only the last 15 minutes of data.
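To illustrate the idea (the naming below is an assumption, not necessarily the exact format used): each 15-minute run writes a fresh file such as file_<timestamp> containing only the events since the previous run, so no file ever shrinks or is overwritten, which avoids the rollover problem Martyn described.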

Tested, and no problems with duplicates or rotation 🙂

All the best//Fred
