Memory issue while using SetDocument() method

Hello,

Recently I created a Windows service that monitors selected directories and loads files into CM (with some simple logic). During testing I observed that memory usage increases linearly and is never released. After some troubleshooting, it turned out that when I commented out the line responsible for attaching the electronic document to a record, leaving the rest of the code as it is, the memory stopped increasing. I ran some tests with memory profilers and they showed that the growth comes from some unidentified (unmanaged) type.

To eliminate any possible impact from other parts of my code, I've created a simple script which loads files with electronic documents in a loop. The behaviour was exactly the same (PFA - the difference between the script with and without the SetDocument line). As expected, forcing GC does not help with the unmanaged memory. I also drop the CM db connection after every 50 files, but it still doesn't help.
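
For clarity, by "forcing GC" I mean the usual full-collection sequence, which of course can only reclaim managed memory:

GC.Collect();
GC.WaitForPendingFinalizers();
GC.Collect();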

Could it be that the memory leak comes from the SDK's SetDocument() method, or am I missing something? I'm trying to dispose everything that should be disposed, yet the memory occupied by the app keeps increasing until it crashes.

Here is the code of a simple test app:

using HP.HPTRIM.SDK;
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

namespace CMRecordLoadTest
{
    public class Program
    {
        private static readonly log4net.ILog log = log4net.LogManager.GetLogger(System.Reflection.MethodBase.GetCurrentMethod().DeclaringType);

        static void Main(string[] args)
        {
            List<string> files = Directory.GetFiles(args[0]).ToList();
            for (int i = 0; i < 50; i++)
            {
                using (Database db = new Database())
                {
                    InitializeDatabase(db);
                    foreach (string file in files)
                    {
                        try
                        {
                            Record rec = new Record(db, new RecordType(db, "TestDoc"));
                            rec.Title = Path.GetFileNameWithoutExtension(file);
                            AttachElectronicDocument(rec, file);
                            rec.Save();

                            if (rec.Uri != null)
                            {
                                log.Info("Record "   rec.Title   " created.");
                            }
                        }
                        catch (Exception ex)
                        {
                            log.Error(ex.Message, ex);
                        }
                    }
                }               
            }
        }

        public static void AttachElectronicDocument(Record rec, string filePath)
        {
            if (File.Exists(filePath))
            {
                InputDocument objDoc = new InputDocument();
                objDoc.SetAsFile(filePath);
                rec.SetDocument(objDoc, false, false, rec.Title);
            }
            else
            {
                throw new Exception(string.Format("Invalid path to file {0}. Path provided: {1}", rec.Title, filePath));
            }
        }

        private static void InitializeDatabase(Database db)
        {
            TrimApplicationBase.Initialize();
            db.WorkgroupServerName = "localhost";
            db.WorkgroupServerPort = 1137;
            db.Id = "AP";
            db.Connect();
            Console.WriteLine("Successfully connected to CM database");
        }
    }
}
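
For completeness: in the original service I also try to dispose the per-file SDK objects explicitly, with no effect on the memory growth. A sketch of that variant of the inner loop body, assuming Record and InputDocument implement IDisposable the same way Database does:

using (Record rec = new Record(db, new RecordType(db, "TestDoc")))
using (InputDocument objDoc = new InputDocument())
{
    rec.Title = Path.GetFileNameWithoutExtension(file);
    objDoc.SetAsFile(file);
    rec.SetDocument(objDoc, false, false, rec.Title);
    rec.Save();
    // The unmanaged memory still grows even with this explicit disposal.
}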

Kind regards,
Mateusz Michałkiewicz

  • Hi Mateusz,

    You say, "I also drop the CM db connection after every 50 files but it still doesn't help", but the code you posted actually tries to create a record from every file in the directory 50 times?

    As a general principle, you should only need one Database connection for what you are doing. I prefer to use new InputDocument(filename) rather than InputDocument.SetAsFile(name), and you could create a single instance of RecordType before the loop rather than creating a new one with each iteration, but neither of those is likely to make any difference to memory leaks.
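
    Roughly what I have in mind (an untested sketch, reusing the connection details from your example; it won't fix a leak, but it cuts the per-file object churn):

    TrimApplicationBase.Initialize();
    using (Database db = new Database())
    {
        db.WorkgroupServerName = "localhost";
        db.WorkgroupServerPort = 1137;
        db.Id = "AP";
        db.Connect();

        // Create the record type once, outside the file loop
        RecordType docType = new RecordType(db, "TestDoc");

        foreach (string file in files)
        {
            Record rec = new Record(db, docType);
            rec.Title = Path.GetFileNameWithoutExtension(file);

            // Build the InputDocument directly from the file name
            rec.SetDocument(new InputDocument(file), false, false, rec.Title);
            rec.Save();
        }
    }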

    If you can reliably reproduce this in a couple of different environments, then you should log a fault report with Micro Focus.

  • but the code you posted actually tries to create a record from every file in the directory 50 times?

    That's right. The code in the screenshot is slightly different from the one I pasted, but that shouldn't be the cause of the memory leaks. In my original app I use only one db connection, but I wanted to test whether disposing the Database object helps with releasing memory. I used the loop instead of creating 10k files so I could monitor the behaviour for more than one minute.

    I have an issue raised, but it's taking ages, so I wanted to check whether anyone has experienced similar issues before, or whether there is some flaw in my code. I've already tried it in 3 environments.

  • How much memory did you observe it allocating?

    I could reproduce the issue in 9.3.2.418, but when I simplified the example to just do this, it appeared to eventually GC (this was after 50k or so instantiations):

    for (int i = 0; i < 500000; i++)
    {
        foreach (string file in files)
        {
             var objDoc = new InputDocument(file);
        }
    }

    It might be worth looking into the function TrimApplicationBase.EnableSdkLeakTracking, although I have never used it myself.

    It was allocating memory until it crashed, at which point memory usage was at 100%. We have 16 GB on this server, but some other processes were also running; this app alone used more than 10 GB. After one crash we were even unable to restart the server, and it had to be restarted on the physical host.

    I used the leak tracking function, but it wasn't of any help. I suppose it was designed to trace leaks that are known and that come from SDK objects (for example, when a Database object is not disposed). In this case it looks like some unknown issue, so I'm not sure how this leak could possibly be traced by that method.

  • I got confirmation that the issue will be fixed in the next release. Thanks for the help.