The AI disruptions of the past few years have only been the prologue to what’s coming in 2026: AI’s full integration into data centre processes and builds. It’s a moment we’ve been building up to ever since OpenAI’s ChatGPT brought AI into the mainstream in late 2022, sending shockwaves through every type and size of business.

A truly profound transformation will begin to take hold in 2026 as AI becomes ever more ingrained in every aspect of life, and the focus shifts from training large language models (LLMs) to AI inferencing. In some ways, 2026 will be the year the rubber truly hits the road when it comes to AI.
AI rewires functions and industries
According to McKinsey’s latest State of AI survey, 78% of organisations use AI in at least one business function. This is up from 72% in early 2024 and 55% the year before. While most adoption remains in sales and marketing, AI is expanding rapidly across manufacturing, healthcare, finance, and data centres.
• Manufacturers using AI to support demand forecasting have improved accuracy by a median 30 percentage points.
• Financial institutions are harnessing AI for fraud detection, payment optimisation and risk management.
• Data centres are increasingly using AI-driven cooling systems and predictive analytics to minimise overheating, reduce energy waste and improve grid efficiency through better balancing of electricity supply and demand (see the sketch after this list).
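To give a rough flavour of the predictive analytics involved, the minimal sketch below flags racks whose inlet temperature is trending towards an overheating limit so cooling can be adjusted pre-emptively. The threshold, the simple trend extrapolation and the telemetry values are illustrative assumptions, not any vendor’s actual control logic.

```python
# Minimal sketch: extrapolate the recent inlet-temperature trend per rack
# and flag racks heading towards an overheating limit.
# The limit and telemetry values are illustrative assumptions.

OVERHEAT_LIMIT_C = 32.0   # hypothetical inlet-temperature limit
FORECAST_STEPS = 3        # look three samples ahead

def forecast_next(temps, steps=FORECAST_STEPS):
    """Extrapolate the recent linear trend of a temperature series."""
    if len(temps) < 2:
        return temps[-1]
    recent = temps[-4:]
    slope = (recent[-1] - recent[0]) / (len(recent) - 1)
    return temps[-1] + slope * steps

def racks_at_risk(telemetry):
    """Return rack IDs whose forecast inlet temperature exceeds the limit."""
    return [rack for rack, temps in telemetry.items()
            if forecast_next(temps) >= OVERHEAT_LIMIT_C]

if __name__ == "__main__":
    telemetry = {
        "rack-01": [24.1, 24.3, 24.2, 24.4],   # stable
        "rack-02": [27.0, 28.1, 29.4, 30.6],   # trending up quickly
    }
    print(racks_at_risk(telemetry))  # -> ['rack-02']
```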
As adoption deepens, AI will transform industries. For example, AI agents operating with little or no supervision will become central to operations, relying on multiple models and demanding vast compute capacity within AI factories.
The rise of AI factories
An AI factory is a data centre that not only stores data, but also outputs intelligence. As the industry moves beyond model training to inferencing, the stage at which AI’s ROI is realised, these environments become essential.
While inferencing workloads typically require less power per server than training, they are increasingly varied and pervasive, ranging from simple chatbot prompts to complex real-time analysis in industries deploying autonomous systems and agentic AI. Depending on the deployment and workload, inference environments can run at less than 20 kW per rack for compressed or tuned models, up to 140 kW per rack for more advanced agentic use cases.
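To make those densities concrete, here is a back-of-the-envelope sizing sketch across the 20 kW to 140 kW per-rack range quoted above. The total IT load and the PUE figure are illustrative assumptions, not figures from the article.

```python
# Back-of-the-envelope sizing for an inference deployment.
# The IT load and PUE are illustrative assumptions, not vendor figures.

def size_inference_fleet(total_it_load_kw, kw_per_rack, pue=1.2):
    """Estimate rack count and total facility power for an inference fleet."""
    racks = -(-total_it_load_kw // kw_per_rack)   # ceiling division
    facility_kw = total_it_load_kw * pue          # include cooling and overheads
    return int(racks), facility_kw

if __name__ == "__main__":
    for density in (20, 60, 140):                 # kW-per-rack scenarios
        racks, facility = size_inference_fleet(total_it_load_kw=2_000,
                                               kw_per_rack=density)
        print(f"{density:>3} kW/rack -> {racks:>3} racks, "
              f"{facility/1_000:.1f} MW facility power")
```

At a hypothetical 2 MW of IT load, the same fleet shrinks from 100 racks at 20 kW each to 15 racks at 140 kW each, which is exactly why per-rack density dominates facility design.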
To keep pace, operators will adopt next-generation GPUs such as NVIDIA Rubin CPX, due in late 2026. Paired with NVIDIA Vera CPUs and Rubin GPUs in the NVIDIA Vera Rubin NVL144 CPX platform, this system delivers 8 exaflops of AI compute and 7,5 times more AI performance than the NVIDIA GB300 NVL72.
Robotics becomes highly advanced
AI-driven robotics will surge in 2026. Beyond longstanding applications such as radiation detection and bomb disposal, AI will expand automation into drones, firefighting systems, search-and-rescue tools, healthcare robotics and even passenger transport.
Again, these technologies require immense processing and network capacity because they rely heavily on high-definition video as an input.
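As a rough sense of scale, the sketch below estimates the aggregate network bandwidth a fleet of vision-driven robots might generate. The fleet size, camera counts and per-stream bitrates are assumptions for illustration only.

```python
# Rough aggregate-bandwidth estimate for vision-driven robots.
# Stream counts and bitrates are illustrative assumptions.

def fleet_bandwidth_gbps(robots, cameras_per_robot, mbps_per_stream):
    """Total sustained uplink bandwidth in Gbps for continuously streamed video."""
    total_mbps = robots * cameras_per_robot * mbps_per_stream
    return total_mbps / 1_000

if __name__ == "__main__":
    # e.g. 200 robots, each with 4 HD cameras at roughly 8 Mbps per stream
    print(f"{fleet_bandwidth_gbps(200, 4, 8):.1f} Gbps sustained uplink")
    # -> 6.4 Gbps before any retransmission or sensor-fusion overhead
```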
Furthermore, we will see data centres increasingly deploying robotics for security monitoring, server installation, maintenance, cable organisation, drive replacement and optimisation of liquid cooling systems.
Digital twins take centre stage
In 2026, we will see the rise of digital twins as processing power continues to evolve in AI data centres and advanced platforms such as NVIDIA’s Omniverse and Cosmos mature. Data centre operators will use digital twins to achieve greater efficiency and accelerate development by designing and simulating highly complex physical objects, systems and processes.
Take, for example, a data centre’s power system itself. ETAP’s sophisticated modelling technology can create a virtual replica of a data centre’s electrical infrastructure through integration with NVIDIA Omniverse.
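The sketch below is a deliberately simplified, vendor-neutral illustration of the digital-twin idea; it does not use the ETAP or Omniverse APIs. A modelled electrical load profile is compared against live telemetry, and any divergence beyond a tolerance is flagged for investigation. The tolerance and the load values are hypothetical.

```python
# Vendor-neutral digital-twin sketch: compare a modelled load profile
# against measured telemetry and flag divergence. Values are illustrative.

TOLERANCE_PCT = 5.0  # hypothetical acceptable deviation between model and reality

def divergences(modelled_kw, measured_kw, tolerance_pct=TOLERANCE_PCT):
    """Yield (hour, modelled, measured, deviation %) where twin and reality disagree."""
    for hour, (model, actual) in enumerate(zip(modelled_kw, measured_kw)):
        deviation = abs(actual - model) / model * 100
        if deviation > tolerance_pct:
            yield hour, model, actual, deviation

if __name__ == "__main__":
    modelled = [900, 920, 950, 1000, 1050]   # kW predicted by the twin
    measured = [905, 918, 1020, 1003, 1190]  # kW from live metering
    for hour, model, actual, dev in divergences(modelled, measured):
        print(f"hour {hour}: model {model} kW vs measured {actual} kW ({dev:.1f}% off)")
```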
Liquid cooling goes mainstream
Traditional cooling simply cannot support next-generation compute density. In 2026, rack densities are expected to reach 240 kW per rack, rising to 1 MW per rack by 2028, with research exploring the feasibility of 1,5 MW per rack.
This makes liquid cooling unavoidable; it will shift from niche to mainstream as high-density AI clusters continue to dominate.
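A quick worked example shows why. Using the standard heat-transfer relation Q = ṁ · cp · ΔT, the sketch below estimates the water flow needed to carry away 240 kW of rack heat at a typical coolant temperature rise. The 10 K temperature rise and the coolant properties are textbook assumptions, not figures from the article.

```python
# Required coolant flow to remove a rack's heat load, using Q = m_dot * cp * dT.
# The 10 K temperature rise and water properties are textbook assumptions.

CP_WATER = 4186.0      # J/(kg*K), specific heat of water
RHO_WATER = 997.0      # kg/m^3, density of water

def required_flow_lpm(heat_kw, delta_t_k=10.0):
    """Litres per minute of water needed to absorb heat_kw at a delta_t_k rise."""
    m_dot = heat_kw * 1_000 / (CP_WATER * delta_t_k)   # mass flow in kg/s
    return m_dot / RHO_WATER * 1_000 * 60              # convert to L/min

if __name__ == "__main__":
    for rack_kw in (240, 1_000):
        print(f"{rack_kw} kW rack -> ~{required_flow_lpm(rack_kw):.0f} L/min at a 10 K rise")
    # 240 kW needs roughly 345 L/min; a 1 MW rack needs about 1440 L/min
```

Moving hundreds of litres of water per minute through a single rack is simply not something air-based systems can approximate, which is why direct liquid cooling becomes the default at these densities.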
Sustainability remains critical
Power sourcing will continue to be a major challenge in 2026. Operators will rely on a diverse energy mix, including natural gas turbines with carbon capture, HVO-fuelled backup generators, wind, solar, geothermal, and battery storage. According to the International Energy Agency (IEA), renewables currently supply 27% of electricity consumed by data centres and are expected to meet nearly half of additional demand growth through 2030.
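Plugging the IEA figures into a simple projection shows what “nearly half of additional demand growth” means for the overall renewable share. The baseline demand and growth numbers below are hypothetical round figures chosen only to illustrate the arithmetic.

```python
# Illustrative projection of the renewable share of data centre electricity,
# based on the IEA figures cited above: ~27% renewable today, with renewables
# meeting ~50% of additional demand growth through 2030. The baseline demand
# and growth values are hypothetical, not IEA data.

def projected_renewable_share(base_twh, growth_twh,
                              current_share=0.27, growth_share=0.50):
    renewable_twh = base_twh * current_share + growth_twh * growth_share
    return renewable_twh / (base_twh + growth_twh)

if __name__ == "__main__":
    # e.g. demand doubling from a hypothetical 400 TWh to 800 TWh by 2030
    share = projected_renewable_share(base_twh=400, growth_twh=400)
    print(f"Projected renewable share by 2030: {share:.1%}")  # -> 38.5%
```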
Expect 2026 to be a critical year in which AI’s impact moves from disruptive force to foundational element of business and technology. As AI reshapes every layer of digital infrastructure, tomorrow’s data centres will not simply support technology; they will enable intelligence itself.
For more information, contact Schneider Electric South Africa, +27 11 254 6400, [email protected], www.se.com/za/en/