Top 10 Storage Trends To Watch Out For

Hu Yoshida, Vice President and CTO, Hitachi Data Systems, highlights the top storage trends

You have been forecasting the top IT industry trends for storage for Hitachi. What then are the directions you want Hitachi to take?

Big data will continue to be the primary concern for the IT industry. For example, exabytes will enter into planning discussions and petabytes will be the new norm for large data stores. Much attention will be on secondary data generated for copies and backups. The total cost of ownership (TCO) for storage will change as operational costs decrease and capital costs creep up.

IT professionals will have to tackle these challenges within budget and time constraints. Simultaneously, they must extract business value from big data to support growth and development. Hitachi should take these new trends into account and align its customer strategy accordingly.

1. Dramatic Changes in OPEX and CAPEX: Over the past 10 years, the total cost of storage has increased by about 7 per cent annually, mainly due to operational costs (OPEX), while the cost of hardware (CAPEX) has been relatively flat.

2. New Consumption Models: Instead of buying all their storage today and spreading CAPEX over the next 4 to 5 years, organisations will buy what they need when they need it. So, they must leverage technologies and capabilities like dynamic storage provisioning, virtualisation and non-disruptive data migration.

3. Managing the Explosion of Data Replication: Replication multiplies data growth, and backups are the biggest driver of data replication.

4. The Emergence of Enterprise Flash Controllers: The use of high-performance flash solid state drives (SSDs) has been slow due to their high price and limited durability compared to hard disk drives. 2013 will see the introduction of flash controllers with advanced processors built specifically for enterprise storage systems, which will increase the durability, performance and capacity of flash memory.

5. New Requirements for Entry Enterprise Storage Systems: The increasing use of hypervisors like VMware and applications such as VDI have changed the requirements for midrange storage systems. The gap between enterprise and midrange storage architectures is narrowing as the industry begins to demand entry enterprise storage systems.

6. The Need for Object-Based File Systems: The growth of unstructured data will require larger, more scalable file systems; hence the move to object-based file systems.

7. Accelerating Use of Content Platforms for Data Archives and Data Sharing: This will accelerate as users try to correlate information from different applications.

8. Hardware Assist Controllers to Satisfy Increasingly Complex Workloads: Storage controllers will be equipped with advanced processors and hardware assist ASICs to address increasingly complex workloads and higher throughput.

9. Creating a Secure Platform for the Adoption of Mobile Devices: Adoption of mobile devices increases productivity and innovation, but also creates a nightmare for corporate data centres.

10. More Tightly Integrated Converged Solutions: Certified, pre-configured and pre-tested converged infrastructure solutions are gaining traction.

So, what initiatives have you planned for Hitachi?

Technology-wise, there is a sea change. We’re hearing a lot about big data, and cloud is getting more real now. The next big thing would be machine-to-machine; some call it the industrial internet. Cisco is building the Internet of Everything, where machines talk to machines with intelligence at the endpoint. But the danger is that vendors will then deal more directly with end users than with IT, creating more silos. For example, in health care there is a standard around medical imaging. Fujitsu makes it for cardiology, Siemens for oncology... all the information gathered sits in silos, so the patient cannot get a holistic view. Hence, they are approaching IT/ITeS because they have realised the need for vendor-neutral archives. Hitachi is moving towards big data. We have the verticals, we have the IT. So, we are going to enable enterprises to offer solutions. We’d like to work with the GEs and the Siemenses. If we don’t start doing it now, then we end up bypassing IT.

How will these trends impact the role of a CIO?

In data centres, we hear of shadow IT: application people bringing in the public cloud, the BYOD trend, etc. Employees today need to be given that kind of flexibility and capability; however, it has to be done behind the firewalls. That’s one area CIOs must focus on. Another thrust area is enabling BYOD as a business trend rather than controlling it. This will be driven by CEOs because they are using Apple products (smiles)... So, mobility is the big trend. A recent 2013 survey suggested that internet traffic on mobile devices has surpassed internet usage on desktops. Integration and access to data anywhere have become the need. We have the HCP Anywhere solution from Hitachi. It’s a fully integrated, on-premises solution for safe, secure file synchronisation and sharing. It’s built end-to-end to be enterprise-ready and hardened for the uncontrolled Internet. HCP Anywhere uses the Hitachi Content Platform object store to store, protect, secure and manage data in a highly efficient, easily scalable and high-density object storage platform.

Against this backdrop, how is the role of CIOs changing?

It was thought that CIOs would get a seat at the corporate table as technology people. Now, the role is evolving into that of an implementer. In tough economic times, the CFO’s role becomes more crucial. The problem then is that the CFO does not look at the back-end integration part, which the CIO is more responsible for.

It wouldn’t be wise to give less weight to CIOs. CIOs need to look at so much more technology: there’s server, storage, networking, BYOD, security, etc. In Mumbai, we work with a successful CIO who initiates the conversation with us, collaborates in planning and strategising, and throws challenges at us, and we come up with solutions that suit his needs. He partners with all vendors and leverages their expertise.

But there are others who keep vendors away. They believe vendors bring a set of problems by each offering a different set of solutions; they call it ankle biting. CIOs need to manage vendor relationships by being open with them, sharing their vision, getting the right insights from them and complementing their vision. That’s the key. Vendors know the technologies best; they are subject experts because they created the technologies. Also, with a lot of technologies coming to the fore, CIOs must be at the forefront and in control.

What’s new in storage that could transform the work environment?

Flash has been extremely transformative. However, its widespread adoption as a mainstream storage option has been hampered by high costs, limited endurance and suboptimal write performance. SSDs were not designed for enterprise storage, but for PCs and commodity devices. They were optimised for manufacturing cost, and were expensive initially. They were very limited in processing power and multi-threading capability, and had to be changed for enterprise storage purposes. Today’s flash technology is not very durable, with only 2,000 to 3,000 writes per cell. Also, the writes are multiplied. With a hard disk, you write the data just once. But with flash, you write to a block; when the block fills up, you must erase it, and first move the valid data somewhere else. Also, electrons leak all the time, so over time the cells must be refreshed, and you need a large ECC to delay refreshing them.
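To make that arithmetic concrete, here is a minimal, hypothetical sketch (in Python) of how limited program/erase cycles and write amplification bound a flash drive’s life. The capacities, cycle counts and daily write volumes are illustrative assumptions, not figures from Hitachi.

# Illustrative sketch only: how limited program/erase (P/E) cycles and
# write amplification bound flash endurance. All numbers are hypothetical.

def drive_lifetime_years(capacity_gb, pe_cycles, write_amplification, host_writes_gb_per_day):
    """Rough estimate of how long a flash drive lasts before its cells wear out."""
    # Total data the NAND can physically absorb: capacity times P/E cycles.
    nand_write_budget_gb = capacity_gb * pe_cycles
    # Writes are multiplied inside the drive: blocks must be erased and valid
    # data relocated first, so the NAND sees more writes than the host issues.
    nand_writes_gb_per_day = host_writes_gb_per_day * write_amplification
    return nand_write_budget_gb / nand_writes_gb_per_day / 365.0

# Example: a 400 GB drive rated for 3,000 P/E cycles (the range quoted above),
# a write amplification factor of 3 and 500 GB of host writes per day.
print(round(drive_lifetime_years(400, 3000, 3.0, 500), 1), "years")  # ~2.2 years

More spare (over-provisioned) capacity and a smarter controller reduce the write amplification factor, which is the lever the next answer describes.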

We have quad-core processing in the controller of our flash modules. By adding more intelligence and building in enterprise storage technologies, we can increase the endurance and performance; and with quad core, we can do multi-threading. That’s why SSDs are 200 or 800 gigabytes: they have to have a lot of spares and a lot of processing.

With more powerful processors, I can put more RAM in there. So, we came up with 1.6 TB. Now, we are at 3.2 TB and next year, it will be 6.4 TB.

At Hitachi, we are able to build controllers as we are an engineering company.

How do you see consumer behaviour patterns changing, from the storage perspective?

One change is eliminating the Do-It-Yourself (DIY) approach. It takes three months to do that kind of integration yourself. Today, with so many applications, you need to spin things up very quickly. So, in our Unified Compute Platform, which is very different from Vblock or VCE because those are consortiums, we have our own blade server and storage, we OEM the Cisco and Brocade switches, and now we also OEM VMware... the whole stack is there on one service call. Some people object to that; they call it vendor lock-in. But we do virtualised storage, so we can use the existing storage behind it. We can support Cisco servers and Brocade switches. But the application user does not really care what’s underneath; all he cares about is how to spin up his application and who can fix it quickly.

So, DIY doesn’t make sense. It’s time consuming, expensive and a management challenge.

What are the tangible and intangible benefits in this model that IT managers look at?

Cost reduction, greater agility because you can do things faster, and fewer errors because it’s automated. For us, it’s easy, as all our products (server, block, file, etc.) are managed by one set of tools. We just need to provide that interface to vCenter, and vCenter manages everything. So, we are going to stay compatible with vCenter’s roadmap.
