How many times do you store the same data? (Hint: more often than you think!)

As storage infrastructure becomes ever harder to manage, businesses are facing massive data fragmentation. It is time to rethink practices in this area.

Companies generate ever more data; that much is a fact. But what proportion of those volumes is actually unique? As employees consume more and more IT tools, data is regularly duplicated for different uses. Take the example of an Oracle database: it is backed up regularly, used for test/dev operations, or consumed by a business unit that wants to run its own analysis on it.

Not to mention the cautious user who exports it to yet another medium to be sure of having access if a problem arises. According to a study by the analyst firm ESG, CIOs estimate that the same data is copied nearly seven times on average. And the cloud has given this phenomenon new momentum: its services are so simple for users to deploy that they, in turn, create yet more silos, always holding the same data.

So much so that organizations today face massive fragmentation of their data. According to the ESG study, 73% of companies host data in multiple public clouds in addition to their own data center. "Not only are there massive amounts of duplicate data, but that data is also completely dispersed," says the analyst firm.

The attack of the clones

The equation is simple: the more data is replicated, the more systems the company must deploy to manage it, and the more human and financial resources it must allocate to the task. Those are resources that will not go to higher-value transformation projects for the business. It is therefore high time to adopt a new approach.

Until now, the strategy has been to bring the data to the application. It is now far more efficient to bring the application to the data. How? Simply by exploiting backups. The Cohesity platform proposes waking up the dormant data sitting on the backup infrastructure: when users need data, rather than duplicating it, they create a clone from the backup and work on that.

Thanks to the built-in deduplication engine, this clone takes up very little space: only the changes made by the application result in new data being written. A clone of a one-terabyte database therefore adds almost nothing to the storage footprint, while still giving the user access to all the information they need.
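To make the idea concrete, here is a minimal Python sketch of the underlying principle, copy-on-write cloning on top of a deduplicated block store. It assumes a simplified, hypothetical block-store model and is not Cohesity's actual API: the clone initially shares every block with its source, and only modified blocks create new data.

```python
import hashlib


class BlockStore:
    """Content-addressed store: identical blocks are stored only once (deduplication)."""

    def __init__(self):
        self.blocks = {}  # digest -> block bytes

    def put(self, data: bytes) -> str:
        digest = hashlib.sha256(data).hexdigest()
        self.blocks.setdefault(digest, data)  # reuse the block if it already exists
        return digest


class Volume:
    """A volume is just an ordered list of references into the block store."""

    def __init__(self, store: BlockStore, refs=None):
        self.store = store
        self.refs = list(refs or [])

    def write_block(self, index: int, data: bytes):
        # Copy-on-write: only the changed block produces new data.
        self.refs[index] = self.store.put(data)

    def clone(self) -> "Volume":
        # A clone is only a new list of references; no block is copied.
        return Volume(self.store, self.refs)


# Usage: cloning a "backup" costs nothing until the clone diverges.
store = BlockStore()
backup = Volume(store, [store.put(b"block-%d" % i) for i in range(1000)])
blocks_before = len(store.blocks)

dev_copy = backup.clone()            # instant, no data copied
dev_copy.write_block(0, b"patched")  # only this change creates a new block

print(len(store.blocks) - blocks_before)  # -> 1
```

The point of the sketch is the ratio: a thousand blocks of source data, one block of new storage for the clone's changes, which is why a cloned database barely weighs anything compared with a full copy.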

Simplify to secure

This new approach has several advantages. The first is obviously economic, thanks on the one hand to a drastic reduction in storage requirements, and on the other to better exploitation of the data. "The proliferation of copied data is not only a nightmare in terms of management costs, it also prevents the information from delivering its full value," says ESG.

At a time when digital mastery sets leaders apart, simplifying access to information is a key business challenge for every organization. Simplify, but also secure: this is another important advantage of the approach. More than eight out of ten companies report that data fragmentation creates a visibility problem.

With greater centralization, it becomes much easier to monitor the information system and keep control over how data is used. The Cohesity file system also maps all the data and makes it accessible through a search engine, a valuable tool for keeping the organization compliant with regulatory constraints such as the GDPR.

A simple query can, for example, locate personal or banking information stored on the platform, provided it was not encrypted upstream. Usable on-premises or in multi-tenant cloud mode, the solution can be deployed across different sites and platforms and managed as a single environment. And in this ultra-fragmented world, uniqueness is strength.
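For illustration, here is what such a query might look like in spirit: a generic Python scan over indexed text for unencrypted personal or banking identifiers. The patterns and the toy corpus are assumptions for the example; this is not Cohesity's search API.

```python
import re

# Hypothetical patterns for the kind of query described above.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "iban": re.compile(r"\b[A-Z]{2}\d{2}(?: ?[A-Z0-9]{4}){3,7}\b"),
}


def scan_document(doc_id, text):
    """Return one finding per pattern match, tagged with the document id."""
    findings = []
    for label, pattern in PATTERNS.items():
        for match in pattern.finditer(text):
            findings.append({"doc": doc_id, "type": label, "value": match.group()})
    return findings


# Usage on a toy corpus standing in for indexed backup content.
corpus = {
    "hr/contract-042.txt": "Contact: jane.doe@example.com, IBAN FR76 3000 6000 0112 3456 7890",
    "sales/notes.txt": "Nothing sensitive here.",
}
for doc_id, text in corpus.items():
    for finding in scan_document(doc_id, text):
        print(finding)
```

Because the platform already holds a deduplicated, indexed copy of the data, this kind of sweep runs against one environment instead of being repeated across every silo.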
