What is "cloud support" for the dedupe device?

 

Does anyone know what the "cloud support" option for creating a store in the dedupe device actually does?

I've posted this before, and neither MF support nor the documentation has anything about it.

Several things are not documented here, including the purpose of the cache and the sizing of a dedupe store installed locally versus one that is cloud-backed.

Picture below.

Any ideas appreciated, as having an undocumented feature in your product is never a good look.

Top Replies

  • In reply to Koen Verbelen

    Many thanks again,

    I uploaded that file to the case.

    It means nothing to me as this feature is undocumented.

    The drive is 200GB in size and the cache is set to 100GB.

    It currently has no space left, but I assume that's because the backups keep failing once there's no space. How does it reclaim the space?

    It's a bit of a chicken-and-egg situation unless I just start again, but I can see the same thing happening, as I don't know what those settings should be and there are no docs.

    Thanks.

  • In reply to HowardArch

    Max percentage full is set to: maximum-percentage-full="0.95"
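
    If that attribute behaves the way percentage-full limits usually do, the arithmetic would look like the sketch below. That interpretation is an assumption on my part; the feature is undocumented, so the threshold may apply to something other than the whole volume.

        # Rough arithmetic, assuming maximum-percentage-full applies to the
        # whole volume hosting the store (an assumption -- it's undocumented).
        DISK_GB = 200        # reported drive size
        MAX_PCT_FULL = 0.95  # maximum-percentage-full from the config

        threshold_gb = DISK_GB * MAX_PCT_FULL
        print(f"store should stop growing at ~{threshold_gb:.0f} GB used")
        # -> ~190 GB, yet the drive is reported 100% full, so either the
        #    limit isn't honoured or it measures something else.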

  • In reply to Koen Verbelen

    What I have seen is that the drive is completely full. What you forgot to answer is:

    • What's the disk usage of the "Dedupe_store" directory now?
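
    For example, here is a quick generic way to total it up (a rough sketch, not a product tool; the path is hypothetical, so point it at wherever the store actually lives):

        import os

        def dir_size_gb(path):
            # Sum the size of every regular file under the store directory.
            total = 0
            for root, _dirs, files in os.walk(path):
                for name in files:
                    fp = os.path.join(root, name)
                    if not os.path.islink(fp):
                        total += os.path.getsize(fp)
            return total / 1024**3

        store = r"D:\Dedupe_store"  # hypothetical location; adjust as needed
        print(f"{dir_size_gb(store):.1f} GB")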

    Koen Verbelen
    Micro Focus (now OpenText) Customer Care Specialist

  • In reply to Koen Verbelen

    More than the cache, so around 179GB total.

    Many thanks,

  • In reply to Koen Verbelen

    I see the case is not really moving yet, so I want to make sure we have captured the information correctly and the support engineer can move on with it. As I understand it, the cache limit has been configured to 100GB, but the directory content is growing above that level. That really means the content of the directory, not just overall disk usage, right? The other thing I wanted to check: looking at the Azure side, can you see all the data being uploaded there?


    Koen Verbelen
    Micro Focus (now OpenText) Customer Care Specialist

  • In reply to Koen Verbelen

    No problem,

    I can see data being uploaded to Azure, so that part is clearly working until the on-prem device runs out of space. The issue seems to be purely on the designated dedupe server.

    The cache limit has been configured to 100GB and the disk is 200GB. The cache fills to 179GB, at which point the disk runs out of space; the remaining 21GB is the OS and other files.
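
    Putting rough numbers on it (just restating the figures above, assuming cache plus OS has to fit on the disk):

        DISK_GB = 200         # disk size
        OS_GB = 21            # OS and other files
        CACHE_LIMIT_GB = 100  # configured cache limit
        CACHE_SEEN_GB = 179   # what the cache actually grows to

        # If the limit were honoured, worst-case usage would be:
        expected_gb = OS_GB + CACHE_LIMIT_GB  # 121 GB, well under 200
        # What actually happens:
        actual_gb = OS_GB + CACHE_SEEN_GB     # 200 GB -> disk full
        print(expected_gb, actual_gb)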

    This has been happening since it was set up. I'm sure it's just a config problem, as stated, but there is no accompanying documentation on the feature regarding the local cache/dedupe store for Azure.

    Many thanks.

  • In reply to HowardArch

    Frankly and unfortunately, I would say it does not really look like just a config issue. I admit the documentation is limited, but there is not much to configure. With the default cache of 30GB it should just work fine in most cases. You have 100GB configured, and for one reason or another this limit is not being respected. I'll have a chat with the support case owner and suggest he get help from our labs on this.


    Koen Verbelen
    Micro Focus (now OpenText) Customer Care Specialist

  • In reply to Koen Verbelen

    OK, I know the case has been escalated to our labs, as I just had a chat with the lab support qualifier and provided him with all the background information I have. One detail we don't know yet (and to be clear, it actually shouldn't matter, but we are curious to know): how much data are you actually backing up per session, per day, ...? Can we get a rough estimate?


    Koen Verbelen
    Micro Focus (now OpenText) Customer Care Specialist

  • In reply to Koen Verbelen

    No problem,

    Currently we back up between 100GB and 1TB a day, but that can vary, e.g. more at the end of the week or month.

    We are also looking to use this primarily for long-term retention to Azure blob initially, before moving other backup workloads across.

    During my testing I tried various data sets up to 1TB; the smaller ones under 100GB worked and the larger ones did not.
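
    That pattern fits the numbers earlier in the thread: with 21GB of OS files on a 200GB disk, anything that pushes the cache past about 179GB fills the drive. A toy model of that hypothesis (purely illustrative; it assumes nothing is evicted from the cache during a session, which is not confirmed anywhere):

        DISK_GB, OS_GB, CACHE_LIMIT_GB = 200, 21, 100

        def disk_fills(session_gb, eviction_works):
            # Assumed behaviour: with working eviction the cache stays at
            # the limit; without it, the whole session lands on disk.
            cache_gb = min(session_gb, CACHE_LIMIT_GB) if eviction_works else session_gb
            return OS_GB + cache_gb > DISK_GB

        for size_gb in (50, 100, 500, 1000):
            print(size_gb, disk_fills(size_gb, eviction_works=False))
        # -> under this model anything above ~179GB fills the disk,
        #    which matches the larger test sets failing.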