
Despite plenty of evidence demonstrating the value of automated management processes, 43% of data centers still rely on manual capacity planning and forecasting methods. These are the findings from a recent Intel survey of 200 IT decision makers in the US and UK. What is interesting is that when asked why they employed manual approaches, 46% of respondents said it was because they felt alternatives would be too expensive. A further 35% feared they lacked the resources to implement a more automated management process.

At face value these might seem like acceptable reasons, but even the most cursory research shows that basic automation improves operational efficiency, enables real-time asset management, mitigates risk and reduces the costs associated with human labor. These are not unattainable outcomes; they are realistic objectives.

IBM clearly demonstrates what a successful strategy looks like. The company migrated from manual tracking processes to real-time automated data collection and analysis, raising its inventory accuracy from 71.8% to 99.7%. The organization saved $8M in the first year, achieved a return on investment in just nine months and, over five years, has delivered total savings of $42M.

Resource Allocation

A perceived lack of resources can make increasing automation seem out of reach, particularly if a large proportion of staff time is already consumed by manual capacity management, but it is vital to find a way out of this inefficient spiral. The escape often comes from bringing multiple teams together and rethinking how processes are controlled and monitored.

Take another customer as an example: a global mobile communications carrier that deployed a real-time monitoring and asset management system integrated with its ERP software.

This was a time-intensive task initially, but the improved business intelligence has streamlined financial accountability and strategic planning to such an extent that the organization is now planning to automate its asset lifecycle management practices across its global data center estate.

The most significant benefit is that the company now knows precisely which assets are available and whether they are functioning safely. This level of control far outweighs the initial investment required.

Lights On

Intel’s survey also highlighted the impact of outages, finding that 59% of data center managers were unable to quantify the cost of downtime at their facilities. Yet, according to the Ponemon Institute, the per-minute cost of unplanned downtime rose 41% between 2010 and 2013, and the average cost of a single outage approached $700,000 in 2013. The risk of outages, particularly in larger data centers, is ever present, and managers need to minimize it as much as possible, not just to satisfy regulatory requirements or customer service level agreements, but also to meet the expectations of senior executives.
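For managers struggling to put a number on downtime, even a rough estimate is better than none. The short Python sketch below simply multiplies an assumed per-minute cost by an assumed outage frequency and duration; every input is a hypothetical placeholder to be replaced with a facility's own figures, not a benchmark drawn from the survey or the Ponemon data.

    # Hypothetical back-of-envelope model for annual downtime cost.
    # All inputs are illustrative assumptions; substitute your own facility's numbers.

    cost_per_minute = 7_900      # assumed cost of one minute of unplanned downtime (USD)
    outages_per_year = 2         # assumed number of unplanned outages per year
    avg_outage_minutes = 90      # assumed average outage duration (minutes)

    annual_downtime_cost = cost_per_minute * outages_per_year * avg_outage_minutes
    print(f"Estimated annual downtime cost: ${annual_downtime_cost:,.0f}")

Even a simple model like this gives a baseline against which the cost of automation, and the value of faster recovery, can be weighed.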

Automation is one of the most effective ways to achieve this. It gives managers control over the entire data center estate through accurate information on asset location, movement, security and efficiency. It may not prevent an outage, given that causes range from extreme weather to squirrels chewing through cables, but it can significantly accelerate the recovery process.

However, despite all the benefits to the wider business, perhaps the most valuable advantage is the time saved by those responsible for the daily management of a facility, especially during audits. Knowing exactly where assets are at any moment means managers no longer have to walk rack to rack, server to server, confirming that assets are where they should be.

Maybe it is time you ditched the clipboard for the dashboard?

Learn how real-time monitoring, capacity planning and predictive analysis technologies improve data center agility and efficiency, ensuring higher performance at a lower cost. Download our whitepaper, "IT Starts with Information," today.
