No one working in IT can have escaped the fact that data is growing. To give just one example: in the first 26 years of its existence, EMC shipped one exabyte of storage, equal to the amount we shipped in a single month this year.
When it comes to storage, this data challenge is crystallising around two concerns: how to manage the complexity that data growth brings to storage infrastructures, and how to control spiralling costs.
The age of data in which we now operate calls for a completely new approach to storage: one in which resources can be deployed on a vast scale and managed with ease, without the prohibitive costs associated with traditional methods of managing and delivering storage.
The public cloud attempts to solve these challenges but, as we shall see, it is not the whole solution.
The public cloud
Public clouds have shown that by standardising and virtualising technology infrastructure, it is possible to simplify and automate, thereby achieving operational improvements.
The public cloud operational model targets real customer pain points, particularly in reducing complexity. But it is not without its own set of challenges and can even constrain businesses.
For instance, the public cloud requires businesses to move their data off-premises and forces them into a 'one-size-fits-all' approach to their storage requirements. Some businesses get around this by building their own private cloud, but this is an expensive option and therefore not suitable for the majority.
That is the key point: businesses want to run complex environments efficiently and lower their costs. If they can do this without giving up their choice of IT infrastructure, the one that they believe best meets their business needs, then so much the better.
Nor do businesses want to run the risks of sending their intellectual property off-premises. What they are really looking for is the ease of use of the public cloud operational model without compromising on security, control, or total cost of ownership.
Until recently it was simply not possible to meet all of these requirements, but with the advent of software-defined storage, businesses are set to benefit from a storage infrastructure that is truly fit for our data age.
Software-defined storage
Software-defined storage is a completely new approach to managing data growth. The fundamental idea is to deliver enterprise storage in much the same way that public clouds offer virtual machines: by enabling developers to provision storage through self-service.
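As a purely illustrative sketch of what self-service provisioning might look like, the snippet below calls a hypothetical REST endpoint from Python. The base URL, token, /volumes path and request fields are all assumptions made for the example; a real software-defined storage platform will define its own API.

# A minimal sketch of self-service storage provisioning. The endpoint,
# token and request fields are hypothetical, for illustration only.
import requests

API_BASE = "https://storage.example.com/api/v1"   # hypothetical endpoint
TOKEN = "replace-with-your-api-token"             # hypothetical credential

def provision_volume(name: str, size_gb: int, tier: str = "standard") -> dict:
    """Ask the storage platform for a new volume via self-service."""
    response = requests.post(
        f"{API_BASE}/volumes",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"name": name, "size_gb": size_gb, "tier": tier},
        timeout=30,
    )
    response.raise_for_status()   # surface rejected or failed requests
    return response.json()        # e.g. {"id": "...", "status": "provisioning"}

if __name__ == "__main__":
    print(provision_volume("dev-scratch-01", size_gb=500))

The point is less the specific calls than the pattern: a developer requests capacity through an API and gets it in minutes, with no ticket to a storage administrator.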
Furthermore, because it is an open environment, multiple vendors and partners can work together as a community to redefine how storage is managed and delivered. As such, software-defined storage can evolve alongside storage innovations and the new capabilities that emerging business requirements demand.
With software-defined storage, service providers and IT departments are able to drive towards the operational model of web-scale data centres without hiring thousands of technical experts to build a custom environment.
Crucially, the questions on which IT has long been forced to compromise are no longer an issue. Businesses will no longer have to choose between best-of-breed storage capability and operational simplicity; they can have both.
In short, software-defined storage enables vastly simplified storage automation, management and delivery while maintaining and even extending the capabilities and inherent value of existing storage investments.
These capabilities are available to businesses today and will go a long way towards solving the challenges of managing data growth. This is just the start, however.
New services
In the near future, software-defined storage will enable IT departments and service providers to deliver easily consumable storage services, but with the control, security and reliability of a private cloud.
This will be provided via a streamlined and highly automated user experience, from purchase through to deployment and consumption, with 'Click and Go' access to block, file, object, and other storage data types depending on a customer's workload needs.
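To make the workload-to-storage-type idea concrete, here is a small illustrative sketch; the workload categories and the mappings chosen are assumptions for demonstration, not any product's actual policy.

# Illustrative only: a toy mapping from workload type to storage data type.
WORKLOAD_TO_STORAGE = {
    "transactional_database": "block",   # low-latency random I/O
    "shared_home_directories": "file",   # POSIX semantics, shared access
    "media_archive": "object",           # massive scale, geo-distribution
}

def storage_type_for(workload: str) -> str:
    """Pick a storage data type for a workload, defaulting to object."""
    return WORKLOAD_TO_STORAGE.get(workload, "object")

print(storage_type_for("transactional_database"))  # -> block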
The storage of the future will be designed for massive scale, geo-distribution and elasticity, giving it all the benefits of the public cloud. Finally, software-defined storage will in the near future be offered at a very aggressive price point, redefining the economics of on-premise Web-scale storage deployments for businesses.
Thanks to the power of virtualisation, businesses will therefore be able not only to accommodate their ever-growing volumes of data, but to do so comfortably and cost-effectively.
In part this may well be due to the adoption of public and private cloud elements, but of most importance will be the evolution of software-defined storage. Businesses will soon find that not only is storage easier than ever to manage, but it is also more flexible and feature-rich.
- Vice President & Chief Technology Officer, Advanced Software Division at EMC.