A recent report from the US Department of Energy predicts that the increased load on data centres could see them consuming up to 12% of total US electricity by 2028, up from 4.4% in 2023.  

In Ireland – a European hub for tech multinationals – data centres consumed 21% of the country’s electricity in 2023. The race to build out infrastructure to support AI-powered services has also seen a revival of interest in nuclear power, with Microsoft, Amazon and Google all investing in schemes to secure access to nuclear energy to power their data centres. 

An uptick in renewable energy generation and more sustainable practices among data centre operators – something Datacom has undertaken – will help meet the growing energy demand of AI, but these changes alone won’t get us out of the woods. The only way to ensure AI use grows at a sustainable pace is to rethink how we use the technology.

Decoupling AI tech from its use cases 

Simply put, we need to decouple AI usage from the technology infrastructure itself. Right now, GenAI tasks are routinely being processed in the cloud – adding to that data centre load. But some AI tasks lend themselves to being processed on devices and infrastructure closer to the people who need the AI-driven output – tasks like facial recognition, object detection, real-time sensor data processing, and basic chatbot and virtual assistant queries with common responses.

There is an opportunity to move away from simply feeding queries into the major GenAI engines powered by large language models (LLMs) and to be more selective about which AI service we use for each task.

"Right now, AI tasks are routinely being processed in the cloud, adding to the data centre load and energy demands, but some AI tasks lend themselves to be processed on devices and infrastructure that is closer to the people who need the AI-driven output," says Datacom Director of AI, Lou Compagnone.

We call this edge AI, or “AI at the edge”, and it is a big part of the answer to scaling up the use of AI sustainably. We are already seeing edge AI deployed in personal devices, with the likes of Apple Intelligence in the latest iPhone models and the Copilot+ PCs that hit the market in 2024.

These devices are equipped with neural processors that are optimised to handle AI workloads on the hardware, rather than sending everything to the cloud to be processed. This has security and privacy benefits but is also far more efficient.   

On-device AI can scan the emails in your mail app and summarise them for you, or edit the photos in your camera roll. You only need to go to the cloud, and anywhere near an LLM, for more complex tasks that tap into the vast information those models have been trained on and the powerful transformer architecture that underpins them.
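To illustrate the pattern, here is a minimal sketch of edge-first task routing in Python. The function names and task categories are illustrative assumptions for the example, not any particular vendor’s API:

```python
# Minimal sketch of edge-first task routing. The function names and task
# categories are illustrative assumptions, not a specific product's API.

# Tasks simple enough for a small on-device model stay local;
# everything else escalates to a cloud-hosted LLM.
EDGE_TASKS = {"summarise_email", "classify_photo", "faq_response"}

def run_on_device(task: str, payload: str) -> str:
    """Placeholder for a small local model running on the device's NPU."""
    return f"[on-device result for {task}]"

def call_cloud_llm(task: str, payload: str) -> str:
    """Placeholder for a request to a cloud-hosted large language model."""
    return f"[cloud LLM result for {task}]"

def route(task: str, payload: str) -> str:
    # Edge-first: only go to the cloud when the task needs the broader
    # knowledge and reasoning of a large model.
    if task in EDGE_TASKS:
        return run_on_device(task, payload)
    return call_cloud_llm(task, payload)

print(route("summarise_email", "Three new messages..."))  # handled on device
print(route("draft_market_report", "Q3 analysis"))        # sent to the cloud
```

The design choice is the point: the default path stays on the device, and the cloud becomes the exception rather than the rule.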

As more applications become available that tap into the power of these edge AI devices, we will start to realise significant energy and cost savings. Organisations need to do the same when it comes to enterprise use of AI.

The right tool for the job

Copilot is a great AI tool for personal productivity in the Microsoft 365 environment, allowing you to complete knowledge-based tasks. But for more functional tasks, such as analysing your customer data for insights, using a dedicated AI service built for that purpose is likely to be more efficient than getting Copilot to sift through your Excel spreadsheets. 

We need to get a lot more deliberate in our use of AI, using the infrastructure that’s most appropriate for the task, rather than treating AI like one massive Google search engine we can feed everything into. Across New Zealand and Australia, organisations are emerging from their experimentation phase to deploy AI at a more fundamental level. Now is the time to be thinking about the intended use cases and then investigating what infrastructure is needed to support them. 

For instance, in the healthcare space, you could use edge AI in clinics and in patients’ homes to monitor, analyse and report on health indicators. Some of that data could later be sent to a cloud-based AI platform for processing. But you don’t need to send all of the data to your AI platform of choice all of the time.
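As a rough sketch of that filtering at the edge, a local monitor might analyse every reading but forward only the out-of-range ones. The thresholds and function names below are assumptions for illustration, not clinical guidance:

```python
# Illustrative sketch: an edge monitor analyses readings locally and only
# uploads out-of-range events to a cloud AI platform. The thresholds and
# function names are assumptions for the example, not clinical guidance.

NORMAL_HEART_RATE = range(50, 110)  # beats per minute; illustrative bounds

def upload_to_cloud(event: dict) -> None:
    """Placeholder for forwarding a flagged event to a cloud AI platform."""
    print(f"uploading anomaly: {event}")

def process_reading(patient_id: str, heart_rate: int) -> None:
    # The routine case stays entirely at the edge: nothing crosses the network.
    if heart_rate in NORMAL_HEART_RATE:
        return
    # Only the exceptional reading travels to the cloud for deeper analysis.
    upload_to_cloud({"patient": patient_id, "heart_rate": heart_rate})

for reading in [72, 68, 131, 75]:
    process_reading("patient-001", reading)  # only the 131 bpm reading uploads
```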

We are seeing many customers deploying AI in manufacturing for Industry 4.0 applications, such as process automation and robotics. Internet of Things (IoT) devices are being used in agriculture to gather valuable data that helps farmers and growers manage production. The computing requirements for these workloads are generally low, and the data can often be processed locally to yield actionable insights.
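To make that concrete, here is a minimal sketch of local processing on a farm gateway, assuming a hypothetical soil-moisture sensor feed; the readings and threshold are invented for the example:

```python
# Illustrative sketch: raw IoT sensor readings become an actionable insight
# on a local gateway, with no cloud round trip. The readings and threshold
# are assumptions for the example.

from statistics import mean

MOISTURE_THRESHOLD = 0.30  # volumetric water content below which to irrigate

def irrigation_needed(readings: list[float]) -> bool:
    # Aggregate locally; only the decision matters, not the raw data stream.
    return mean(readings) < MOISTURE_THRESHOLD

hourly_readings = [0.28, 0.31, 0.27, 0.25]
if irrigation_needed(hourly_readings):
    print("start irrigation")  # decision made entirely at the edge
```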

Cloud platforms will continue to be central to AI use. But I can envisage a time, not far off, when a sizeable chunk of AI workloads is processed at the edge, employing small AI models and stripped-down applications built for efficiency.

The obvious first place to take this hybrid approach to AI is use cases that require low-latency connectivity for real-time decision-making. Another is where privacy and security are paramount.

Assessing your AI readiness

While data security is usually front of mind for CIOs when it comes to enterprise AI, the energy use of the technology is also becoming a significant consideration for organisations targeting emissions reductions. 

This is a complex landscape, which is why we’ve developed the Datacom AI Academy – a series of courses and workshops that aim to increase AI literacy in organisations.

Through this work we’ve helped executives grasp basic AI concepts so they can appreciate the technology’s potential, helped teams develop AI proof-of-concept projects, and run formal courses building skills in various aspects of AI.

The aim is to help customers develop their AI readiness so that they can find an appropriate stance on AI that incorporates effective governance and processes, while allowing them to safely experiment with the technology.  

Datacom’s AI Ignite is focused on helping organisations identify and prioritise golden use cases for AI, and then helping them build out AI proofs of concept (POCs) to bring those use cases to life.

As Datacom’s AI Academy evolves, we’ll also be helping customers take this hybrid approach to using AI, so they can more efficiently accomplish tasks at the edge while leveraging the power of LLMs in the cloud.  
