
The “common cloud”: What MWC 2025 revealed about the future of AI and connectivity

Ivo Ivanov, CEO of DE-CIX

Barcelona welcomed attendees to Mobile World Congress 2025 with its usual mix of interest and anticipation. The weather remained mild for the time of year – good news for attendees travelling from colder climes – but the ongoing roadworks near the Fira exhibition center at Gran Via tested the patience of bus and taxi drivers. The logistical headaches soon subsided, however, as crowds left the sunshine to enjoy some of the breathtaking demonstrations on display.

As is often the case with MWC, a nice balance was struck between serious technological innovation and tech industry showmanship. Agility Robotics unveiled a new humanoid robot that had been “supercharged” by the same technology that powers ChatGPT, enabling it to make judgements, identify colors, and even learn new instructions on the fly. Lenovo revealed a new solar-powered laptop, Harman showcased its new “sight beyond sight” vehicle-to-cloud software service, and the world’s first “AI Phone” was officially announced.

Yet, beneath the buzz of futuristic tech, this year’s MWC had a clear focus: not 6G or the continued convergence of mobile and broadband as many were expecting, but AI and how to make it viable at scale. Discussions often centered around one critical challenge – how networks can support AI’s rapid expansion in a way that ensures reliability, efficiency, and performance. With telecom giants, cloud providers, and AI specialists all offering their two cents, the event evolved from being simply “MWC 2025” into something much greater – an attempt to solve the infrastructure puzzle that will determine the trajectory of AI for decades to come.

AI: Everything, everywhere, all at once?

If there was one dominant theme at MWC 2025, it was AI – everywhere, and in everything. From AI-powered assistants promising to revolutionize the smartphone experience to telecom operators leveraging AI for network automation, the message was clear: AI is happening now.

But beyond the polished announcements, a more fundamental question loomed: Where should AI actually live? Speaking at the event, Federated Wireless CEO Iyad Tarazi laid out the dilemma: should AI workloads remain in hyperscale cloud environments, be processed locally in private data centers, or be embedded directly within network infrastructure – on radios, routers, or edge computing nodes? As companies weigh the trade-offs between cost, latency, and control, this debate will shape how AI scales in the coming years. What’s clear is that AI’s future depends not just on more powerful models, but on high-performance connectivity and interconnection strategies to ensure real-time access to data, wherever it’s needed.

Catching the “common cloud”

What is MWC without a new buzzword? In 2023 we had “phygital convergence”; in 2024 we were introduced to “AIoT”, a mashup of Artificial Intelligence and the Internet of Things; and this year “the common cloud” seems to have gained some traction. Red Hat’s telecom chief architect, Rimma Iontel, described it as a growing trend in which telecom operators are shifting toward shared cloud infrastructure to handle both IT and network functions. So rather than relying on siloed environments, operators are now leveraging multi-tenant cloud platforms to optimize costs and efficiency, resulting in improved scalability, easier network management, and reduced reliance on proprietary, single-vendor ecosystems.

However, this shift in infrastructure raises challenges of its own. If network operators and enterprises are consolidating workloads into shared cloud environments, they need reliable connectivity to ensure consistent, low-latency access to data. Public Internet connections alone won’t cut it – especially for latency-sensitive AI inference workloads such as real-time analytics or LLM queries, where every millisecond counts. That’s why we’re seeing a pivot toward direct, high-speed interconnection between cloud providers, telcos, and enterprise networks in 2025. As AI adoption accelerates, ensuring that these interconnected ecosystems can handle the demands of next-gen applications will be key to establishing the “common cloud,” as Rimma puts it.

Building an AI superhighway

One of the most significant announcements came from Nokia, AMD, Cisco, and Jio, which together unveiled plans for an AI-driven platform to “redefine telecom operations”, aiming to streamline network operations and optimize performance in real-time. This brought to mind another trend we’re seeing emerge – that AI isn’t just a consumer-facing technology, but an embedded component in the very fabric of infrastructure.

AI’s appetite for computing power is forcing the industry to rethink this infrastructure. While Iyad Tarazi asked the important question of where AI workloads should be processed, as noted above, how they are processed is of equal concern. With AI models growing in size and complexity, traditional cloud architectures may not be enough. Edge computing, colocation, cloud exchanges, and direct interconnection between the cloud and AI services are emerging as essential components of tomorrow’s “AI-ready” networks.

MWC 2025 gave us a glimpse of what’s coming – autonomous networks, AI smartphones, robots that can smile, and even solar-powered laptops. But none of these innovations will gain a real foothold in the market without the right infrastructure to support them. Innovation is the easy part; supporting and scaling that innovation is where the real work starts.
