Data centers are power hogs. We’ve all seen the statistics: US data centers consume about 90 billion kilowatt-hours of electricity each year, and worldwide, data centers account for roughly 3% of electricity use and 2% of greenhouse gas emissions. And IP traffic, data center storage capacity, and computing loads keep climbing – thanks to streaming video, high-density computing, real-time data analysis, and consumers’ penchant for instant access to anything and everything. With that trajectory, you’d expect the predictions to get worse over time.

Yet energy use in data centers around the globe has increased only modestly, from 194 TWh in 2010 to 202 TWh in 2018. A recent analysis by Northwestern University, Lawrence Berkeley National Laboratory, and Koomey Analytics suggests that may not be all bad news for energy use and data centers. The study – Recalibrating Global Data Center Energy-Use Estimates – indicates that efficiency improvements have largely kept pace with the growing energy demands of data centers over the past decade.
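To put those figures in perspective, a quick back-of-the-envelope calculation using the numbers quoted above shows how modest the growth really is – roughly half a percent per year:

```python
# Implied growth in global data center energy use, from the figures
# quoted above: 194 TWh in 2010 and 202 TWh in 2018.
start_twh, end_twh = 194, 202
years = 2018 - 2010

total_growth = end_twh / start_twh - 1                     # growth over the full period
annual_growth = (end_twh / start_twh) ** (1 / years) - 1   # compound annual growth rate

print(f"Total growth 2010-2018: {total_growth:.1%}")   # ~4.1%
print(f"Annualized growth:      {annual_growth:.2%}")  # ~0.51% per year
```

Given how sharply computing workloads rose over the same period, that flat curve is the efficiency story the study describes.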

That doesn’t mean data center operators can rest on their laurels. Demand for electricity will continue to increase as computing loads intensify. Energy efficiencies come from practices like proper airflow management and cooling – and cooling alone accounts for up to 40% of total data center energy use. One critical way to stay on top of energy use – and then find ways to increase efficiency – is investing in monitoring for your data center that tracks environmental data like temperature, humidity, and air pressure in real time. Reliable, accurate data is the key to finding efficiencies and reducing your energy footprint.
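As a minimal sketch of what that monitoring can look like in practice, the snippet below checks environmental readings against a recommended operating envelope. The thresholds here follow the commonly cited ASHRAE recommended inlet-temperature range (18–27 °C); the humidity bounds, the metric names, and the sample readings are illustrative assumptions, not values from any particular product:

```python
# Sketch of an environmental-monitoring check for a data center.
# Thresholds approximate the ASHRAE recommended envelope for inlet
# temperature; humidity bounds are illustrative. Adjust both to your
# own facility's targets.

RECOMMENDED = {
    "temperature_c": (18.0, 27.0),   # inlet air temperature, deg C
    "humidity_pct": (20.0, 80.0),    # relative humidity, % (illustrative)
}

def out_of_range(metric: str, value: float) -> bool:
    """Return True if a reading falls outside the recommended envelope."""
    low, high = RECOMMENDED[metric]
    return not (low <= value <= high)

def check_readings(readings: dict) -> list:
    """Flag every monitored metric that needs attention."""
    return [m for m, v in readings.items()
            if m in RECOMMENDED and out_of_range(m, v)]

# Example: an inlet temperature of 16 C suggests overcooling -- raising
# the cooling setpoint is one of the cheapest efficiency wins available.
alerts = check_readings({"temperature_c": 16.0, "humidity_pct": 45.0})
print(alerts)  # ['temperature_c']
```

In a real deployment these readings would stream continuously from rack-level sensors, and the interesting efficiency finding is often the low side of the range: a facility running well below 27 °C is usually spending more on cooling than it needs to.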