Handling Data Growth

Aiming to create a dynamic, non-disruptive environment that supports data growth? Read on...

As IT becomes pervasive, data is growing in geometric progression. Mobility adds another dimension to this growth: consider that 48 million people worldwide do not have access to electricity yet have mobile phones, a number expected to reach 125 million by 2015. And beyond mobility, several other factors are making data pile up from just about everywhere.

In the enterprise space, what is it that's bothering IT managers with regard to storage? Data within an organisation is growing as users multiply, and it is stored all over the place and in many manifestations: surely a recipe for major stress.

IT managers have been looking at various solutions in various permutations and combinations to manage this data tsunami. The buzz around big data is further adding to their nervousness and increasing the challenges of data management.

Inherent Concerns
An IT Next survey revealed that 50 per cent of the participants found scaling up capacity and performance of storage systems and implementing storage pooling or storage virtualisation to be the greatest challenges in managing data growth.

Syed Masroor, Head, NetApp India, says the main challenge in storage management is that the storage architecture, which has gone through several evolutions in the last 10 years, is still changing fast and itself poses great challenges for IT managers.

"The challenges relate to the allocation of resources and to experimentation with virtualisation technologies and tools, which compound security issues, even as IT managers grapple with changing storage architecture and aim at performance enhancement," says Masroor.

V Srinivas, CIO, Nagarjuna Fertilisers and Chemicals Ltd (NFCL), says the challenge in storage management, as in all others, is to identify the data from the existing workload, as part of the capacity management activity, and see what can be virtualised.

According to Srinivas, storage management, as part of the infrastructure lifecycle management process, has become a separate profit centre, with the entire data estate requiring joint stakeholders from IT and business.

Amit Phadke, Head, Systems & Technology, Kale Consultants Ltd, is concerned: "As the demand for storage escalates, it brings with it the challenge of effectively controlling the vast data being created, stored and accessed."

Phadke further says, "To address data growth without interrupting business operations, faster deployment of storage and IT resources to meet increasing demand becomes a function of scalability." The other challenge for Phadke relates to forecasting relative data growth, faster provisioning of capacity, and improving system performance at the same or reduced operational cost, challenges one also faces with regard to virtualisation.
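The growth-forecasting challenge Phadke mentions can be approximated with even a naive projection from past capacity readings. The sketch below uses invented sample figures and an assumed average month-over-month growth rate, purely for illustration:

```python
def forecast_capacity(history_tb, months_ahead):
    """Project storage demand forward using the average month-over-month
    growth rate observed in past capacity readings (a deliberately naive
    model; all figures here are hypothetical)."""
    # Ratio of each month's usage to the previous month's
    growth_rates = [b / a for a, b in zip(history_tb, history_tb[1:])]
    avg_rate = sum(growth_rates) / len(growth_rates)
    projection = history_tb[-1]
    for _ in range(months_ahead):
        projection *= avg_rate  # compound the average growth rate
    return projection

# Example: monthly used-capacity samples in TB, growing ~10% per month
history = [20.0, 22.0, 24.2, 26.6]
print(f"{forecast_capacity(history, 6):.1f} TB expected in 6 months")
```

A real forecast would weight recent months more heavily and separate workloads, but even a crude compound-growth estimate makes provisioning conversations concrete.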

For NR Satyapalan, GM-MS & CIO, National Fertilisers Ltd, the key challenges are around unified management of storage systems, scaling up capacity and performance of storage systems, implementing storage pooling and storage virtualisation, ensuring security of storage and better utilisation of storage systems among many others.

The challenge for IT managers also lies in identifying suitable solutions, though most participants said they looked at best-of-breed solutions to justify their requirements.

Fine Print
It is obvious that organisations have an insatiable appetite for data consumption, and managing it has become the IT manager's top agenda. While adding more storage is not a concern, given that the cost of storage keeps dropping, IT managers want to derive more from storage and drive optimal utilisation of existing capacity, making room for new data.

Gartner says a storage pool is like a dead sea: almost 90 per cent of the data in it is never taken stock of.

NetApp's Masroor recommends that IT managers go for a single source that is easy to deploy, with stringent SLAs on deliverables. "I would suggest that IT managers opt for a two-tier model of storing data, on disk and in flash cache, for optimal performance," says Masroor.

"Flash cache is used to optimise the performance of the storage system without adding disk drives, and it also helps conserve power, cooling and space," adds Masroor.

According to him, IT managers can use Flash Cache in combination with SATA drives for many workloads to increase storage capacity without compromising performance. About 16 TB of read cache can be configured in a storage system using Flash Cache cards.
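The idea behind a flash read cache in front of slower SATA drives can be illustrated with a small simulation. The class below is a hypothetical LRU-cache sketch, not NetApp's implementation; the tiers, block counts and capacity are invented for illustration:

```python
from collections import OrderedDict

class FlashReadCache:
    """Minimal LRU read-cache sketch: a small, fast 'flash' tier in front
    of a larger, slower 'SATA' tier (both simulated as plain dicts)."""

    def __init__(self, capacity_blocks):
        self.capacity = capacity_blocks
        self.cache = OrderedDict()  # block_id -> data, kept in LRU order
        self.hits = 0
        self.misses = 0

    def read(self, block_id, sata_tier):
        if block_id in self.cache:
            self.cache.move_to_end(block_id)  # mark most recently used
            self.hits += 1
            return self.cache[block_id]
        # Cache miss: fetch from the slow tier and populate the cache
        self.misses += 1
        data = sata_tier[block_id]
        self.cache[block_id] = data
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)    # evict least recently used
        return data

sata = {i: f"block-{i}" for i in range(10)}
cache = FlashReadCache(capacity_blocks=3)
for block in [0, 1, 2, 0, 0, 3, 0]:
    cache.read(block, sata)
print(cache.hits, cache.misses)  # prints "3 4": hot block 0 stays in flash
```

The point of the sketch is the economics Masroor describes: repeated reads of hot blocks are served from the small fast tier, so performance improves without adding disk spindles.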

About 75 per cent of the IT Next survey participants said they would adopt data de-duplication technology as a storage management practice this year.

NFCL's Srinivas recommends using data de-duplication tools for backup. Dynamic provisioning and automatic tiering are other practices that Srinivas says yield positive results. "I am using 24 TB of data storage and will add another 10 TB this year, distributed using the above methods," says Srinivas. "While solid state drives (SSDs) are gaining momentum and vendors are creating a buzz, I see them as an expensive proposition at this point of time, nearly one and a half times the cost of other options," Srinivas warns.
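The backup de-duplication Srinivas recommends rests on a simple idea: split data into chunks, hash each chunk, and store identical chunks only once. The following is a minimal content-addressed sketch with an invented class and a toy chunk size, not any vendor's product:

```python
import hashlib

class DedupStore:
    """Sketch of content-addressed block de-duplication for backups:
    identical chunks are stored once and referenced by their hash."""

    def __init__(self, chunk_size=4096):
        self.chunk_size = chunk_size
        self.chunks = {}        # sha256 digest -> chunk bytes
        self.logical_bytes = 0  # bytes written by clients
        self.stored_bytes = 0   # bytes actually kept on disk

    def write(self, data: bytes) -> list:
        """Store data; return the list of chunk digests (the 'recipe')."""
        recipe = []
        for i in range(0, len(data), self.chunk_size):
            chunk = data[i:i + self.chunk_size]
            digest = hashlib.sha256(chunk).hexdigest()
            if digest not in self.chunks:   # only new content costs space
                self.chunks[digest] = chunk
                self.stored_bytes += len(chunk)
            self.logical_bytes += len(chunk)
            recipe.append(digest)
        return recipe

    def read(self, recipe: list) -> bytes:
        """Reassemble the original data from its chunk recipe."""
        return b"".join(self.chunks[d] for d in recipe)

store = DedupStore(chunk_size=4)
backup1 = store.write(b"AAAABBBBCCCC")  # three unique 4-byte chunks
backup2 = store.write(b"AAAABBBBDDDD")  # two chunks already stored
print(store.logical_bytes, store.stored_bytes)  # prints "24 16"
```

Backups are an ideal fit because successive runs share most of their content, so the stored-to-logical ratio keeps improving as history accumulates.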

Phadke's primary objective on the scaling front, however, is to create a dynamic, non-disruptive environment that supports data growth in such a way that system capabilities remain as balanced and productive as possible.

The key for any organisation searching for an optimum solution is to understand which of its goals are most important. This depends on factors such as available topologies and their associations with the different scaling approaches. The eventual decision will almost certainly take other factors into account, such as functionality, price, TCO and skill/comfort levels, as well as predicted changes and growth across the organisation.

Scale-out and scale-up vendors will battle over their relative merits in terms of overall economic efficiency, operational efficiency and greenness, which further complicates decision-making.

"I personally favour scale-out storage for advantages such as seamless capacity addition: it eliminates the need to pre-determine a performance or capacity ceiling or to restrict compute power, and more processing can be added for higher I/O performance at any stage," says Phadke.
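One reason scale-out systems can add capacity seamlessly, as Phadke describes, is a placement scheme such as consistent hashing, used by many distributed storage systems. The sketch below (node names, object count and vnode count are assumptions, not Phadke's setup) shows why adding a node remaps only a fraction of objects rather than all of them:

```python
import bisect
import hashlib

class ConsistentHashRing:
    """Sketch of consistent hashing: each node owns many points ('virtual
    nodes') on a hash ring, and a key maps to the next point clockwise."""

    def __init__(self, vnodes=100):
        self.vnodes = vnodes
        self.ring = []  # sorted list of (hash_position, node_name)

    def _hash(self, key: str) -> int:
        return int(hashlib.md5(key.encode()).hexdigest(), 16)

    def add_node(self, node: str):
        # Scatter the node across the ring via its virtual nodes
        for v in range(self.vnodes):
            bisect.insort(self.ring, (self._hash(f"{node}#{v}"), node))

    def locate(self, key: str) -> str:
        # First ring position at or after the key's hash (wrapping around)
        pos = bisect.bisect(self.ring, (self._hash(key), ""))
        return self.ring[pos % len(self.ring)][1]

ring = ConsistentHashRing()
for node in ("node-a", "node-b", "node-c"):
    ring.add_node(node)
before = {k: ring.locate(k) for k in (f"obj-{i}" for i in range(1000))}
ring.add_node("node-d")  # scale out: add a fourth node
moved = sum(1 for k, v in before.items() if ring.locate(k) != v)
print(f"{moved / 1000:.0%} of objects moved")  # roughly a quarter, not all
```

With naive modulo placement (`hash(key) % node_count`), adding a node would remap nearly every object; here only the share claimed by the new node moves, which is what makes non-disruptive capacity addition possible.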

Kale Consultants, which uses 80 TB of storage capacity, will add about 25 TB in the next six months. "I will implement server virtualisation, infrastructure as a service (IaaS) and storage as a service for backup," says Phadke.

In addition, Phadke recommends automatic tiering, SSDs, caching and data de-duplication, among other measures.

Tips to Tackle Data Deluge:
Understand data structure and growth
Get clear visibility into physical, logical and tiered storage
Since virtualised environments have high interdependencies, spell out rapid-isolation and problem-management processes clearly
Use storage management software that provides capacity reporting and analysis, performance monitoring, an end-to-end infrastructure view, alerting, etc.
Go for a single, best-of-breed solution that is easy to deploy and manage
Scale-out storage architecture is ideal for capacity planning
Create dynamic, non-disruptive environment that supports data growth.
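The capacity-reporting tip above can be prototyped with the standard library alone. This is a hypothetical sketch: the 80 per cent utilisation threshold is an assumed example, not a universal rule, and a real tool would also cover logical and tiered storage:

```python
import shutil

def capacity_report(mount_points, threshold_pct=80):
    """Print used/free space per filesystem and flag any above an assumed
    utilisation threshold, so expansion can be planned before it is urgent."""
    rows = []
    for mount in mount_points:
        usage = shutil.disk_usage(mount)         # (total, used, free) bytes
        pct = usage.used / usage.total * 100
        rows.append((mount, usage.total // 2**30, usage.free // 2**30, pct))
    for mount, total_gib, free_gib, pct in rows:
        flag = "  <-- over threshold, plan expansion" if pct > threshold_pct else ""
        print(f"{mount:<12} {total_gib:>6} GiB total {free_gib:>6} GiB free "
              f"{pct:5.1f}% used{flag}")
    return rows

capacity_report(["/"])
```

Scheduling a report like this daily and trending the results over time gives the "clear visibility" the tips call for, and feeds directly into growth forecasting.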
