AI's future: On-device power over remote data centres
Artificial intelligence could soon shift from massive data centres to personal devices, according to Perplexity CEO Aravind Srinivas. In a recent podcast, he suggested that advanced, tailored AI tools might eventually run on the hardware already inside smartphones and laptops, eliminating the need to transmit data to remote servers.
Current on-device AI trends
Tech giants are already exploring this model. Apple's Apple Intelligence system processes some features directly on its latest devices using specialised chips, improving speed and data security. Microsoft's Copilot+ laptops also support on-device AI. However, these capabilities remain limited to high-end gadgets, as most consumer devices lack the processing power for AI tasks.
Jonathan Evans, director of Total Data Centre Solutions, described the shift as a long-term possibility contingent on advancements in efficiency and local hardware.
The rise of micro data centres
While demand for traditional data centres (sprawling facilities handling everything from video streaming to AI) continues to grow, smaller alternatives are emerging. A UK-based project in Devon demonstrated this by powering a public swimming pool with waste heat from a washing-machine-sized data centre. Similar setups have since appeared, including a British couple heating their home with a garden-shed data centre and a university professor using a GPU under his desk to warm his office.
Despite these innovations, tech firms are still investing billions in large-scale data centres. Nvidia CEO Jensen Huang calls them "AI factories," arguing they are essential for advancing AI. However, critics question whether such massive infrastructure is sustainable, given its energy consumption and environmental impact.
Edge computing and distributed networks
Experts like Mark Bjornsgaard, founder of DeepGreen, advocate for smaller "edge" data centres near urban areas to reduce latency and improve response times. Bjornsgaard envisions a network of micro data centres in public buildings, working together while providing heat as a byproduct. Amanda Brock of OpenUK agrees, predicting that the dominance of large data centres may fade as processing moves to devices like smartphones, routers, or set-top boxes.
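The latency argument for edge computing comes down to physics: a signal cannot travel faster than light in fibre, so distance sets a hard floor on round-trip time. A rough back-of-the-envelope sketch (the distances below are illustrative assumptions, not figures from the article):

```python
FIBRE_SPEED_KM_S = 200_000  # signals in optical fibre travel at roughly 2/3 the speed of light


def min_round_trip_ms(distance_km: float) -> float:
    """Lower bound on round-trip time from propagation delay alone.

    Real-world latency adds routing, queueing, and server processing
    on top of this physical floor.
    """
    return 2 * distance_km / FIBRE_SPEED_KM_S * 1000


# A distant hyperscale facility ~2,000 km away vs. an edge
# micro data centre ~20 km away in a nearby public building:
print(f"Remote data centre: ≥{min_round_trip_ms(2000):.1f} ms")
print(f"Edge data centre:   ≥{min_round_trip_ms(20):.1f} ms")
```

Even before any processing happens, the nearby site starts with a roughly hundredfold advantage in propagation delay, which is the core of Bjornsgaard's case for siting compute close to users.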
Some companies are even exploring orbital data centres. Avi Shabtai, CEO of Ramon Space, suggests that small, scalable data centres in space could offer efficiency and flexibility.
Smaller AI models gain traction
The push toward localised AI aligns with a broader trend: the rise of smaller, specialised AI models. Large language models (LLMs), which power chatbots, are often criticised for their broad scope and tendency to make errors. Businesses are increasingly adopting bespoke AI tools trained on proprietary data, tailored to specific tasks. These models require less computing power and can run on local hardware.
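A quick way to see why smaller models can run on local hardware is to estimate the memory needed just to hold their weights. The model sizes and precision levels below are illustrative assumptions, not figures from the article:

```python
def weight_memory_gb(params_billions: float, bytes_per_param: float) -> float:
    """Approximate memory needed to store model weights alone.

    Ignores activations, caches, and runtime overhead, so real
    requirements are somewhat higher.
    """
    return params_billions * 1e9 * bytes_per_param / 1e9


# A large general-purpose LLM at 16-bit precision (2 bytes/parameter)
# needs data-centre-class hardware:
print(f"70B model @ fp16: ~{weight_memory_gb(70, 2.0):.0f} GB")

# A small task-specific model, quantised to 4 bits (0.5 bytes/parameter),
# fits comfortably in a modern phone or laptop's memory:
print(f"3B model @ int4:  ~{weight_memory_gb(3, 0.5):.1f} GB")
```

The two-orders-of-magnitude gap in memory footprint is what makes the shift Luccioni describes, from resource-heavy large models to smaller bespoke ones running locally, practical on consumer devices.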
"I've spoken to multiple people who aren't seeing the benefits of using generic AI tools. We're shifting from resource-heavy large models to smaller, bespoke ones running locally."
Dr. Sasha Luccioni, AI and Climate Lead at Hugging Face
Security and environmental benefits
Distributed networks of small data centres could also enhance security. Professor Alan Woodward of the University of Surrey notes that smaller targets present less risk if compromised, unlike large centres that can become single points of failure. Environmentally, reducing reliance on massive data centres could lower energy consumption, a concern highlighted by Luccioni.
While the future of AI infrastructure remains uncertain, the debate underscores a growing interest in decentralised, efficient alternatives to today's data centre-dominated model.