
Big Data doesn't have to be a challenge in today's world

by Micro Focus Employee in Information Management & Governance

Authors: Juan Niekerk, Data Protector Channel Manager, and Sandrine Avenier, Field Marketing Manager, Micro Focus

Data is everywhere, and no longer confined to the physical boundaries of the datacenter. No one questions that digital data is growing, and estimates keep climbing. What is really relevant nowadays is that businesses are harvesting and using this data to improve market knowledge, enhance competitiveness, and transform their operations and even their business models. This volume of digital data being created, analyzed, and stored is what we call big data.

Big data typically refers to a volume of data so great that traditional IT can no longer store, manage, and process it. But this isn’t just a case of data growth outstripping technology growth. Big data embodies fundamental differences that necessitate new approaches and technologies.

Much of this data is stored, managed, and processed by disparate systems. And much of the big data value is derived from simply bringing together data from many different sources to achieve a 360-degree view of customers, products, and business operations.

Data has acquired a new value: it is not just growing in size, it has become the lifeblood of the company. But to realize that value, companies must evolve from legacy processes to solutions that enable a proactive approach to governing those assets.

For this reason, companies are immersed in a digital transformation to take advantage of Big Data, through solutions that enable information governance and management while also allowing them to analyze the data they collect. See IDC Infobrief: "Turning Data into Action". But they also know that one of the challenges is compliance, and making sure the data lake is secure. And in a hybrid world, the lake is all around us, moving every second: a new email, a modified document, a new application in the Cloud that produces additional data in another context, and so on.

Because of this new value, and even more so since 2020, ransomware has become a global pandemic for the IT world, spreading like wildfire¹ and targeting the data lakes that have become a company's lifeblood. Watch video here. Securing and protecting those assets has therefore become a business priority and has placed a tremendous responsibility on IT teams. Read also "Unified Protection with Micro Focus Data Protector".

¹ https://www.idgconnect.com/article/3674314/which-countries-and-industries-are-suffering-the-worst-cyber-attacks.html

² Source: IDC Digital Universe in 2020

³ https://www.statista.com/statistics/871513/worldwide-data-created

In addition, time has now become public enemy #1: we need to radically change the time it takes to get value from our processes, and that matters even more when talking about backup and recovery. When running backup and recovery processes, a minute is a minute. We can't change the element of time; it is constant. What we can change is what gets done in that time. Can you protect your application up to the last minute? Can you implement true application-level data consistency? Can you reduce the data sent over the network to speed up and optimize backup time?
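As a rough illustration of why reducing the data sent matters, here is a hypothetical back-of-the-envelope calculation in Python; the 10 TB dataset, the 1 Gbit/s link, and the 10:1 reduction ratio are assumptions chosen for the example, not measurements of any particular product.

# Illustrative backup-window estimate; all figures are assumed for the example.
def backup_window_hours(data_gb: float, link_gbps: float, reduction_ratio: float = 1.0) -> float:
    """Hours needed to move data_gb over a link_gbps network link.

    reduction_ratio models source-side deduplication/compression,
    e.g. 10.0 means only one tenth of the data crosses the wire.
    """
    effective_gb = data_gb / reduction_ratio
    throughput_gb_per_hour = link_gbps / 8 * 3600  # Gbit/s -> GB/h
    return effective_gb / throughput_gb_per_hour

print(backup_window_hours(10_000, 1.0))        # ~22 hours to send a full 10 TB copy
print(backup_window_hours(10_000, 1.0, 10.0))  # ~2.2 hours when only 10% is sent

In this simple model, sending less data is the single biggest lever on the backup window, which is why deduplication and incremental approaches matter so much.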

It’s not enough to just be hybrid, hyper-converged, optimized, or agile; we need to start thinking in terms of a global, 360-degree view. The key for customers is to be able to react immediately. Read also "Virtual Machine Backup - The Choice is Yours".

Growing information and its impact on the backup strategy

In a world where Big Data is the norm (2.5 quintillion bytes of data are produced by humans every day⁴) and where the drive to harvest value from big data is impacting traditional backup and recovery processes, companies are forced to rethink their data protection approach and find the balance between serving the organization’s appetite for big data, seeking more value from the information they create (through data mining and analysis), and the age-old, essential requirement of protecting information from a disaster, a cyber-attack, or a logical or physical system failure.

This is a logical conclusion, as backup is the one location in every organization where at least one copy of what is considered important is stored and cataloged for future protection and use. IT is not only being asked to “protect everything forever” (many IT organizations have been dealing with this issue for decades), but now there is a mandate to “protect everything from everywhere”.

Even more important than protecting the massive amounts of data that arrive in the IT environment daily, weekly, and monthly is considering what is required to restore operations at that scale. It is one thing to protect everything from everywhere, but how will the company restore huge volumes of data stored in a data center, in remote offices, or even in the cloud?

So how do you manage this complexity that nobody can stop?

⁴ https://explodingtopics.com/blog/big-data-stats

Key things to consider about big data’s impact on backup and recovery

  • More data (within applications, databases, file systems, etc.) means tougher choices about what we protect and when. This influx of new information requires organizations to reconsider backup schedules and the mechanisms used to capture and protect data, including file system/application backup agents, storage array or hardware-assisted snapshots/replication, and hypervisor API integration for virtual servers.

  • Massive data growth makes it harder to stay within a defined backup window and meet data protection SLAs (for both traditional data protection and disaster recovery). With more information to protect, disaster recovery becomes inherently more complex, and IT organizations must be more selective about what to protect and when.

  • Remote and cloud environments pose new challenges: many organizations consolidate IT resources into a central data center operation but lack trained staff or, in some cases, dedicated backup infrastructure at the remote sites, or fail to take into account the data protection policies defined by the cloud provider.

  • While backup performance is always top of mind, it’s the restore that matters at the end of the day (remember, a minute is a minute; it is constant, but what gets done in that time can be optimized, especially in critical situations). Restoring information is the capability that keeps a business moving forward. For some, this might mean recovering an entire environment; for others, it might require only a select group of applications or servers, or even just a few files.

Addressing backup challenges in the era of big data

Big data clearly opens up new possibilities for leveraging information as a valuable asset. At the same time, in many organizations the assumption is that their backup solution will be a safe harbor from which to restore data in the event of a ransomware attack.

Data Protector saves IT administrators time and ensures business continuity by providing a disaster recovery process that is built for modern enterprise IT environments. It is a comprehensive backup solution that supports the 3-2-1 rule for backup (at least three copies of the data, on two different types of media, with one copy off site), with integrated management and reporting capabilities that ensure backups complete on schedule and within their service level agreements.
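To make the rule concrete, here is a minimal, hypothetical sketch in Python (not Data Protector's implementation) that checks whether a set of backup copies satisfies 3-2-1; the BackupCopy type and the sample copies are invented for the example.

from dataclasses import dataclass

@dataclass
class BackupCopy:
    media: str      # e.g. "disk", "tape", "object-storage"
    offsite: bool   # stored outside the primary data center?

def satisfies_3_2_1(copies: list[BackupCopy]) -> bool:
    """At least 3 copies, on at least 2 media types, with at least 1 off site."""
    return (
        len(copies) >= 3
        and len({c.media for c in copies}) >= 2
        and any(c.offsite for c in copies)
    )

copies = [
    BackupCopy("disk", offsite=False),           # primary backup copy on local disk
    BackupCopy("tape", offsite=False),           # second copy on different media
    BackupCopy("object-storage", offsite=True),  # third copy off site, e.g. in the cloud
]
print(satisfies_3_2_1(copies))  # True

The 3-2-1-1 variant mentioned further below is commonly understood to add one more requirement: one of the copies should be offline, air-gapped, or immutable.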

Velocity, variety, and volume are some of the defining characteristics of big data, and one of the key mechanisms to reduce the footprint of Big Data backups is deduplication, one of Data Protector's most relevant features. Along with many other backup features, data deduplication remains one of the most important and fastest-growing storage optimization techniques. During the deduplication process, duplicate data is removed, leaving only one copy of the data to be stored, which reduces storage space consumption. Besides lowering power consumption, it also decreases bandwidth consumption. Another goal of data deduplication is to provide better performance for data-intensive applications by optimizing response and data access times.
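To illustrate the general idea (a simplified, hypothetical sketch, not how Data Protector implements deduplication), the following splits a data stream into fixed-size blocks, fingerprints each block, and stores every unique block only once:

import hashlib

BLOCK_SIZE = 4096  # fixed-size chunking; real products often use variable-size chunks

def deduplicate(data: bytes, store: dict[str, bytes]) -> list[str]:
    """Store each unique block once and return the list of block fingerprints."""
    recipe = []
    for offset in range(0, len(data), BLOCK_SIZE):
        block = data[offset:offset + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        if digest not in store:   # only previously unseen blocks consume storage
            store[digest] = block
        recipe.append(digest)     # the "recipe" is enough to rebuild the original stream
    return recipe

def restore(recipe: list[str], store: dict[str, bytes]) -> bytes:
    return b"".join(store[digest] for digest in recipe)

store: dict[str, bytes] = {}
original = b"A" * 8192 + b"B" * 4096 + b"A" * 4096   # repeated content across the stream
recipe = deduplicate(original, store)
print(len(recipe), "blocks referenced,", len(store), "unique blocks stored")  # 4 vs 2
assert restore(recipe, store) == original

Because only unique blocks are written, and with source-side deduplication only unique blocks travel over the network, both storage and bandwidth consumption drop.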

Big data challenges are unavoidable, and how these challenges are managed can have a significant impact on the strategic and tactical performance of an organization. It’s essential that the backup/recovery solution can address the volume, complexity, and diversity of data that the big data challenge presents.

Micro Focus enables organizations to protect big data at scale

Meeting the demands of big data backup requires both retooling our approach and using optimization technologies. In addition, and unique to data protection, these solutions are the only ones that span all data types, applications, locations, and organizational departments.

With over 30 years of experience in the backup market, Micro Focus data protection solutions are uniquely positioned to provide a trusted source of innovation, experience, and knowledge. That experience has produced a solution with which customers benefit from improved SLAs, better storage and performance efficiency, and reduced storage requirements compared to other vendor offerings in the space.

  • A more secure architecture to deal with today's threats. Data Protector can help customers recover business operations after a cyber-attack much faster than other solutions, thanks to its deployment flexibility and capabilities, which allow a true 3-2-1-1 backup rule implementation, developed and proven over the years.
  • Increased reduction in storage space. New deduplication capabilities provide greater flexibility and features to achieve comprehensive data protection with maximum storage efficiency, enabling teams to handle data growth on the same storage platform or using dedicated deduplication appliances.
  • Improved total cost of ownership. Through different licensing models, Data Protector can better adapt to customers' needs and optimize their available budget, facilitating decision-making about their data protection solution.

Customers can address the cost and complexity of backup and recovery in today’s borderless datacenter. Data Protector is helping organizations of all sizes meet these challenges with adaptive and dynamic data protection that is just as elastic as their environments.

The Micro Focus data protection solution helps customers protect any data from virtually any source, across applications and across virtual, physical, and cloud environments. And our committed team comprises both internal experts and our channel community; we emphasize training and development of that community so our solutions are better tailored to the specific needs of our customers.


Be sure to connect with Micro Focus on Twitter and LinkedIn.

We’d love to hear your thoughts on this blog. Comment below.

The Micro Focus IM&G team

Know your data | empower your people | drive your future

Join our community | @microfocusimg | www.microfocus.com 
