Organisations around the world are searching for rapid and effective ways to boost their sustainability by cutting carbon emissions – but are they missing the obvious?
Notoriously energy-hungry data centres are an issue that every enterprise can tackle. Using edge computing to move data processing closer to where the data is created can reduce the amount of data centre capacity you need.
Added to this, edge computing can support algorithms powered by Artificial Intelligence (AI) that optimise data centre energy use.
Data centres are a sustainability concern
Organisations that are making real progress on sustainability are considering all the options, from longer-term plans that only become possible when infrastructure is replaced, to short-term wins they can achieve within their current operating environment.
Data centres offer huge potential for improved sustainability in the short term. That is welcome news because, right now, the picture is fairly bleak: data centres are estimated to emit roughly as much CO2 as the airline industry, and they are projected to account for 3.2% of total worldwide carbon emissions by 2025, consuming no less than a fifth of global electricity.
Edge computing supports data centre shrinkage
The first shift you can make towards reducing data centre energy use is to shrink your data centre estate, and edge computing has a fundamental role to play in this.
By keeping processing at the edge, you significantly reduce the volume of data that’s sent to the data centre for processing and storage. Potentially, this can open the way for data centre consolidation, reducing energy consumption and accelerating decarbonisation.
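To make the idea concrete, here is a minimal sketch of edge-side aggregation, assuming a simple `SensorReading` record and a fixed summarisation window; the names and the window size are illustrative, not any particular product’s API. Only the compact summaries travel on to the data centre.

```python
from dataclasses import dataclass
from statistics import mean
from typing import Iterator, List

@dataclass
class SensorReading:
    sensor_id: str
    value: float            # e.g. temperature, humidity or power draw

def summarise(window: List[SensorReading]) -> dict:
    """Reduce a window of raw readings to a compact summary."""
    # For simplicity this sketch assumes the batch comes from a single sensor.
    values = [r.value for r in window]
    return {
        "sensor_id": window[0].sensor_id,
        "count": len(values),
        "mean": mean(values),
        "min": min(values),
        "max": max(values),
    }

def process_at_edge(readings: List[SensorReading],
                    window_size: int = 100) -> Iterator[dict]:
    """Aggregate locally; only the summaries leave the edge node."""
    for i in range(0, len(readings), window_size):
        window = readings[i:i + window_size]
        # Forward one summary upstream instead of window_size raw records.
        yield summarise(window)
```

With a window of 100 readings, roughly 1% of the raw records ever leave the edge node, which is where the reduction in central processing and storage comes from.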
Optimise energy use with edge computing
However, edge computing offers more than this one sustainability-boosting option. Increasingly, organisations across all industries are using edge computing as a platform for the AI and machine learning tools that can identify energy efficiencies.
Connected to a network of sensors, edge computing delivers the low-latency processing that is critical for AI-driven, real-time decision making.
It supports sustainability apps that take the guesswork out of energy optimisation. The apps build dynamic AI models that compare activity levels with the energy used, then model scenarios and outcomes to find your data centre’s operating ‘sweet spot’.
The AI learns the energy consumption pattern at the rack, rack Power Distribution Unit (PDU) and rack component or U level, creating an aggregated Energy Efficiency Index (EEI) for the entire data centre.
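Definitions of an EEI differ between tools, but as a rough sketch, assume each rack reports the energy drawn at its PDU over an interval alongside a proxy for useful work (completed jobs, CPU-seconds, or similar); an energy-weighted roll-up of per-rack efficiency into a data-centre-wide index might look like this. The field names and the work proxy are assumptions for illustration, not a standard definition.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class RackMeasurement:
    rack_id: str
    energy_kwh: float    # energy drawn over the interval, read from the rack PDU
    useful_work: float   # activity proxy, e.g. completed jobs or CPU-seconds

def rack_eei(m: RackMeasurement) -> float:
    """Efficiency of a single rack: useful work delivered per kWh consumed."""
    return m.useful_work / m.energy_kwh if m.energy_kwh else 0.0

def aggregate_eei(racks: List[RackMeasurement]) -> float:
    """Energy-weighted index for the whole data centre."""
    total_energy = sum(m.energy_kwh for m in racks)
    if not total_energy:
        return 0.0
    return sum(rack_eei(m) * m.energy_kwh for m in racks) / total_energy
```

Weighting by energy means a poorly performing, high-draw rack pulls the overall index down more than an idle one, which is usually the behaviour you want from a site-wide figure.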
The models provide continual guidance on how to save energy and carbon, delivering visibility into power consumption per server, node and application.
They can recognise power drains such as stranded servers or high power and cooling scenarios, and they can identify the most efficient placement of workloads. Then, once you trust the model, it can run automatically to fine-tune energy usage.
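As an illustration of the ‘power drain’ checks, here is a hedged sketch that flags servers drawing meaningful power while doing almost no work. The thresholds and telemetry fields are assumptions; in practice the model would learn them from historical data rather than hard-code them.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ServerSample:
    server_id: str
    power_watts: float       # average draw over the sample window
    cpu_utilisation: float   # 0.0 - 1.0 average utilisation

def find_stranded_servers(samples: List[ServerSample],
                          idle_threshold: float = 0.05,
                          min_power_watts: float = 100.0) -> List[str]:
    """Flag servers that draw meaningful power while doing almost no work."""
    return [
        s.server_id
        for s in samples
        if s.cpu_utilisation < idle_threshold and s.power_watts > min_power_watts
    ]
```

Servers flagged this way become candidates for consolidation, power-down or workload placement, which is where the automated fine-tuning comes in.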
Running on an edge computing platform, these sustainability apps optimise performance, reducing energy consumption and your carbon footprint. They also deliver significant cost savings in today’s high-cost energy environment.