Large filesystem backup

Idea ID 2771821


Data Protector should provide a "file system connector" to back up large amounts of data efficiently and quickly (GPFS, NTFS, cluster technologies, …, big data lakes). Parallel streams and complementary technologies to maximize backup performance are critical.

The goal is minimal reads, minimal (ideally no) tree-walk, and minimal transmitted data for very large file systems.

 

This feature should be supported by all major deduplication device providers and, in addition, by a Data Protector software deduplication target (VSA).

2 Comments
Micro Focus Expert
Status changed to: Waiting for Votes
 
Respected Contributor

Hi,

So far, a commonly used workaround is to manually split the data on a filesystem so that it is backed up with multiple streams. It would be great if DP could do this on its own, given just the number of streams for the filesystem.
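The splitting logic described above could be sketched roughly as follows. This is a minimal illustration, not Data Protector functionality: it assumes you already have the top-level directories and their sizes (the paths and sizes here are made up), and it greedily assigns them to a chosen number of streams so the streams stay roughly balanced.

```python
import heapq

def split_into_streams(entries, num_streams):
    """Greedily assign (path, size) entries to num_streams buckets,
    placing the largest entries first into the currently lightest
    stream, so total stream sizes stay roughly balanced."""
    # Min-heap of (total_size, stream_index): pop the lightest stream each time.
    heap = [(0, i) for i in range(num_streams)]
    heapq.heapify(heap)
    streams = [[] for _ in range(num_streams)]
    for path, size in sorted(entries, key=lambda e: e[1], reverse=True):
        total, idx = heapq.heappop(heap)
        streams[idx].append(path)
        heapq.heappush(heap, (total + size, idx))
    return streams

# Hypothetical example: four top-level directories (sizes in GB), two streams.
entries = [("/data/a", 800), ("/data/b", 500), ("/data/c", 300), ("/data/d", 200)]
streams = split_into_streams(entries, 2)
# streams[0] and streams[1] would then each be backed up by one parallel stream.
```

Each resulting list would then feed one backup stream; the win over a naive alphabetical split is that no single stream ends up carrying most of the data.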

 

Regards

Jesko
