Bob Hankins, vice president, Storage Solutions, Logicalis US, sets out why legacy storage just won’t cut it in tomorrow’s workplace and provides five reasons to conduct a storage assessment.
Lurking inside corporate data centres worldwide are legacy storage devices just waiting to expire during a critical task. Some of these older storage devices may be insufficient to handle today’s increased data storage and access requirements, let alone the requirements of tomorrow – and the associated challenges are particularly acute as data analytics become increasingly important.
The fact is legacy storage simply wasn’t designed with today’s uses in mind. That means there are systemic gaps that may leave an organisation exposed to new pressures placed upon it by unpredictable workloads and unrestrained data growth.
Why examine storage now? There are a variety of reasons, arising from important trends that range from mobility and social media to cloud computing and ‘big data’. One thing is clear: storage requirements have increased, and they are not going backward. So here are my five reasons to conduct a storage assessment today:
1. Now is the time to get in front of the important trends impacting storage requirements. CIOs and other IT professionals responsible for data storage management need to take time today to assess their organisations’ current storage requirements and plot them against the backdrop of other technology trends such as mobility, social media, cloud computing and ‘big data’. Each of these has a significant impact on legacy storage architectures.
2. Typical legacy storage is actually anything but typical. Different drives, different speeds and different media all make up the legacy storage landscape, which spans drives, disks, optical storage and tape, often from different manufacturers. By assessing that landscape today, organisations can plan to strip management complexity out of their data storage environments.
3. Newer technologies such as software-defined storage can deliver solutions that scale from entry level to high end, sharing common data services across physical storage systems and software-defined storage alike. This enables end users to access storage that supports all applications and data types across physical, virtual and cloud resources.
4. Today, ‘return on investment’ as applied to storage is commonly defined as a reduction in complexity that allows workloads to be consolidated onto less equipment and hidden value to be uncovered inside existing data. The ability to access data from multiple sources and in multiple forms (structured and unstructured, SQL, NoSQL and so on) is critical to increasing the ROI of corporate storage environments.
5. Storage is at the centre of the corporate universe today and for the foreseeable future, and creating an environment that enables information retention and business analytics is critical if organisations are to survive, thrive and compete. This includes developing the ability to find the proverbial ‘needle in a haystack’ with unprecedented speed and ease.
The bottom line is this: no matter which vendor’s products a CIO selects for storage solutions moving forward, it’s important to ensure that the chosen technology is streamlined, efficient and future-proof. It must enable a simpler storage environment, yet one capable of supporting growing data storage requirements and ready for tomorrow’s converged and software-defined environments.