by Rebecca Wettemann

Maximizing ROI from Storage Management

Feature
Apr 18, 2003 | 7 mins
CSO and CISO | Data and Information Security

When storage was thought of as just infrastructure – an unavoidable cost of doing business – organizations could simply shop for disks that offered adequate performance at the lowest possible price. Today, however, new storage alternatives and management tools present opportunities to save money by reducing burdens on IT staff and by reducing the need to spend on storage capacity itself. Smart storage administrators and CIOs will know the ROI story of their storage strategy – and take advantage of tools and technologies to maximize the returns from their storage investment.

Beyond the obvious technical complexity of an enterprise storage environment, companies should keep in mind two key themes when they are evaluating the bottom-line impact of storage management:

  • Maximizing ROI from storage investments requires balancing storage capacity needs, personnel requirements, and system reliability.
  • The ideal balance is different for every company.

The good news? The more information IT personnel have about their storage environment, the better they can manage and plan to maximize returns from their storage investment. The more automated the storage monitoring and provisioning process becomes, the less IT staff investment will be needed to support effective storage management. Nucleus Research’s recent storage benchmarking report found that storage managers can work toward bottom-line gains in three ways: by reducing personnel time devoted to storage, by utilizing storage more efficiently to reduce purchases, and by limiting downtime related to storage issues.

Reducing Storage Personnel Costs

Growing storage volumes and increasing executive demands to monitor, measure, and optimize have forced companies to shoulder heavier administrative burdens. There are two main strategies for reducing personnel costs: spend more on storage capacity, or invest in a SAN.

Spending more on storage may be seen as the lazy choice – but it may be the best choice from an ROI perspective when cheaper, direct-attached storage is sufficient to satisfy application requirements and uptime expectations – and staff aren’t available or appropriate to monitor utilization on an ongoing basis.

Companies that have decreased personnel burdens by investing in a SAN solution also increase system reliability and simplify duties like allocation. For those that have already invested in a SAN – especially those monitoring capacity and SLAs – storage management software has emerged as the best way to keep the number of terabytes per administrator above average. These tools help simplify allocation, volume management, performance monitoring, and report generation.

Reducing Storage Capacity Costs

Companies have relied on three different tactics to reduce their spending on storage capacity: implementing a chargeback system, consolidating to a SAN, and using storage management software.

Chargeback systems can reduce the storage demands of internal clients by forcing them to “pay” for the storage they use – driving less “padding” of storage requests. In many cases, however, the accounting and reporting burden of a credible chargeback system outweighs the savings unless automated storage management tools carry part of the load. Because of the additional administrative costs, companies should consider chargeback only if current storage overbudgeting is estimated to cost at least half of one employee’s salary.
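The chargeback rule of thumb above can be expressed as a simple break-even check. This is a minimal sketch with illustrative figures; the function name, the per-terabyte price, and the salary are assumptions, not figures from the article.

```python
# Hedged sketch: is chargeback worth the administrative overhead?
# All figures below are illustrative assumptions, not Nucleus Research data.

def chargeback_worthwhile(overbudgeted_tb: float,
                          cost_per_tb: float,
                          admin_salary: float) -> bool:
    """Return True if the storage spend recoverable via chargeback
    meets the rule-of-thumb threshold of half an administrator's
    salary (a proxy for the accounting and reporting burden)."""
    recoverable = overbudgeted_tb * cost_per_tb
    return recoverable >= admin_salary / 2

# Example: 5 TB of padded requests at an assumed $15,000/TB,
# weighed against an $80,000 administrator salary.
print(chargeback_worthwhile(5, 15_000, 80_000))  # 75,000 >= 40,000 -> True
```

The point of the sketch is only that the comparison is mechanical once overbudgeting has been estimated – which, as the article notes, is the hard part without automated tools.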

A large number of companies have saved storage costs by consolidating to a SAN – reallocating storage from individual servers to one SAN fabric and supporting a higher utilization rate. In cases where companies have been purchasing new servers just for their storage capacity, a SAN can also yield the side benefit of freeing up these servers for redeployment or resale. But a SAN is not necessarily a cure-all: system performance concerns still put a limit on how high utilization can go, and storage “mirroring” can lead storage consumption to rise rather than fall. Because of the costs associated with deploying a SAN, this strategy is best pursued by companies that can also derive other bottom-line benefits, such as increased uptime and centralized management.

For companies that have had a SAN in place for some time, storage management software provides another means to reduce storage spending – and not just because improved monitoring tools can help keep utilization rates high. Better reporting of growth trends puts IT staff in a position to time storage purchases. Better timing not only keeps cash in pocket longer but also takes advantage of the ever-declining cost of storage. Because these applications also help reduce personnel costs, SAN owners should take a serious look at storage management software options.
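The timing benefit described above can be made concrete with a small calculation. The monthly price-decline rate here is an assumption chosen for illustration; actual declines vary by hardware class and year.

```python
# Hedged sketch: value of deferring a storage purchase when unit
# prices are declining. The decline rate is an assumed constant.

def deferred_price(price_today: float,
                   monthly_decline: float,
                   months_deferred: int) -> float:
    """Price paid for the same capacity bought later, assuming a
    constant fractional price decline per month."""
    return price_today * (1 - monthly_decline) ** months_deferred

# Example: $100,000 of capacity, prices assumed to fall ~5%/month,
# purchase deferred six months on the strength of growth-trend reports.
later = deferred_price(100_000, 0.05, 6)
print(round(later, 2))            # 73509.19
print(round(100_000 - later, 2))  # savings from waiting
```

Deferral also keeps cash in pocket longer, so the true benefit is slightly larger than the raw price difference.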

Reducing Costs Associated with Downtime

Storage options that foster increased uptime give storage managers a third way to improve the bottom line – but different companies will see a different ROI impact. In one camp are those who don’t worry much about downtime, either because their applications place few demands on storage and breakdowns are rare, or because the applications are not mission critical and a certain amount of downtime is tolerable.

In the other camp, if a company loses six figures every time the system goes down for an hour, investing in a more reliable architecture – for example, not just a SAN but a SAN with top-of-the-line storage devices and a redundant architecture – will likely yield positive ROI. Deploying storage management software within a SAN may improve a data center’s ability to meet uptime requirements as well.

Before investing in increased uptime, companies should compare the cost of current downtime with the cost of upgrading storage.
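That comparison can be sketched as an annualized calculation. The function names and every figure below are illustrative assumptions; real analyses would also count lost productivity and recovery labor.

```python
# Hedged sketch: does avoided downtime pay for a storage upgrade?
# All figures are illustrative assumptions.

def annual_downtime_cost(outage_hours_per_year: float,
                         cost_per_hour: float) -> float:
    """Annual cost of outages at a given hourly business impact."""
    return outage_hours_per_year * cost_per_hour

def upgrade_pays_off(current_outage_hours: float,
                     expected_outage_hours: float,
                     cost_per_hour: float,
                     annualized_upgrade_cost: float) -> bool:
    """True if the downtime avoided each year is worth more than
    the upgrade's annualized cost."""
    avoided = (annual_downtime_cost(current_outage_hours, cost_per_hour)
               - annual_downtime_cost(expected_outage_hours, cost_per_hour))
    return avoided > annualized_upgrade_cost

# Example: 10 outage hours/year at $100,000/hour, cut to 2 hours by a
# redundant SAN assumed to cost $500,000/year amortized.
print(upgrade_pays_off(10, 2, 100_000, 500_000))  # $800k avoided > $500k -> True
```

For the company in the article’s six-figure-per-hour camp, even a costly redundant architecture clears the bar; for the low-impact camp, the same arithmetic usually argues against it.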

COSTS

Three main cost areas associated with a storage investment that companies should consider are capacity, personnel, and management software.

Storage capacity costs depend on the environment as well as two main factors:

  • Variance in the timing of procurement. A company that bought at the end of the year often spent half as much as a peer that bought the same kind of storage 10 months earlier.
  • Differences in hardware quality. Companies that bought high-end disks to feed SAN networks with high uptime requirements spent far more than those that purchased cheap direct-attached storage boxes.

Careful storage performance monitoring and frequent utilization reports do bring benefits, but companies need to weigh the cost advantages of these administrative practices against the cost of personnel needed to perform those tasks. Administrator salaries are fairly consistent; very few companies spend outside a range of $75,000 to $100,000 per administrator. Obviously, salaries vary from region to region and according to experience.

Although it’s clear there is a cost associated with storage management software, many companies currently using such tools don’t have a clear view of its cost, because the software was bundled into a larger storage contract. Companies should demand separate pricing for the software so that a careful cost-benefit analysis becomes possible. Companies should also take advantage of the modular approach that most vendors have taken in engineering their management suites – data managers ought to purchase only those modules that can be justified through their individual impact on the bottom line.

CONCLUSION

Building and maintaining a cost-efficient storage environment is no longer just a matter of selecting equipment and disks based on total cost of ownership. Architectural alternatives and rapidly improving management software present IT directors with multiple ways to improve the ledger sheet for the entire company as well as for the data center. Companies should model several different scenarios when they are looking for ROI opportunities within the storage environment. The smartest companies will also reevaluate their opportunities on an ongoing basis. Because costs continue to decrease, new software emerges, and architectures become more and more stable, striking the right balance and getting the best bottom-line results will require ongoing review and adjustment.