IT Brief Ireland - Technology news for CIOs & IT decision-makers

Equinix unveils vendor-neutral hub for distributed AI

Thu, 12th Mar 2026

Equinix has launched a Distributed AI Hub, which it says provides a single framework for organisations running AI workloads across multiple locations and providers.

The hub runs on Equinix Fabric Intelligence and targets businesses that spread training data, inference, and related services across public cloud, private data centres, and edge sites. Equinix said the hub is designed to reduce the operational complexity and governance issues that can emerge as organisations adopt what it calls distributed intelligence.

The Distributed AI Hub lets customers connect to and manage an ecosystem of AI infrastructure providers through private connectivity across Equinix's data centre footprint. It is available across 280 Equinix data centres globally, including 17 sites in Australia.

Vendor-neutral model

Equinix positioned the hub as a vendor-neutral alternative to AI marketplaces run by large cloud providers. It said the service lets customers integrate AI models, data, and platforms from different suppliers without being tied to a single provider's catalogue.

The hub offers connectivity options for model companies, GPU cloud providers, data platforms, network services, security services, and AI frameworks. Equinix said customers can assemble their AI stack from multiple suppliers and manage those connections through a single approach.

Enterprises increasingly run AI systems close to where data resides or where users access applications. The shift reflects latency requirements, data residency concerns, and the cost and complexity of moving large datasets between environments.
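The economics behind this shift can be sketched with a simple, illustrative calculation (the dataset size and link speed below are assumptions for this example, not figures from the announcement): moving a large training set across a wide-area link can take the better part of a day, which is one reason organisations prefer to run inference where the data already lives.

```python
# Illustrative back-of-envelope estimate (assumed figures, not from the
# article): the cost of moving large datasets between environments,
# one driver of running AI close to where data resides.

def transfer_time_hours(dataset_tb: float, link_gbps: float) -> float:
    """Hours to move `dataset_tb` terabytes over a `link_gbps` gigabit-per-
    second link, ignoring protocol overhead and contention."""
    bits = dataset_tb * 8e12            # 1 TB = 8e12 bits (decimal units)
    seconds = bits / (link_gbps * 1e9)  # link speed in bits per second
    return seconds / 3600

# Moving a hypothetical 100 TB training set over a 10 Gbps WAN link:
print(f"{transfer_time_hours(100, 10):.1f} hours")  # roughly 22 hours
```

Even this idealised figure ignores egress charges and retransmissions, which in practice push organisations further towards processing data in place.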

Mary Johnston Turner, Research Vice President, Digital Infrastructure Strategies at IDC, linked this shift to the growing deployment of edge infrastructure for AI use cases.

"Enterprises are racing to deploy agentic AI but are finding that their existing infrastructure was never designed for the complexities of distributed intelligence," Turner said. "By 2027, IDC expects 80% of enterprises will deploy distributed edge infrastructure to improve the latency and responsiveness of AI applications. Enterprises will need solutions like Equinix's Distributed AI Hub to enable them to unify these disparate systems."

Security integration

The first major integration for the hub is with Palo Alto Networks. Equinix said the combination delivers real-time security for AI workloads and supports central policy controls across locations.

The integration focuses on protecting interactions between AI agents or models and external tools and data sources. Equinix said that, without consistent controls across environments, these interactions can create new pathways for data exposure and misuse.

The security component uses Palo Alto Networks Prisma AIRS for what the companies describe as real-time AI security and centralised policy enforcement. Equinix said it provides visibility into AI applications, data, and interactions across sites.

Prisma AIRS will also be available on Equinix Network Edge, according to Equinix. Network Edge is the company's platform for deploying virtual network services at the edge of networks. Equinix said this enables central management of AI-related security services closer to users and workloads.

Operational focus

Equinix described the Distributed AI Hub as an operational response to AI deployments that span multiple infrastructure domains. It said the hub provides a consistent way to connect models and platforms, move data, and run inference across distributed environments.

Governance and control are also central themes for the service. In practice, organisations often face a patchwork of policies and tools when running AI workloads across multiple clouds, as well as on-premises and edge environments.

Jon Lin, Chief Business Officer at Equinix, said organisations increasingly rely on multiple locations for AI deployments, even when they want a consistent experience.

"AI isn't centralised - but the right infrastructure can make it run as seamlessly as if it were," Lin said. "Equinix is the neutral ground where AI, cloud and networking infrastructure converge. We are providing enterprises the freedom to build and scale AI wherever their data, partners, and teams already live, while running inference close to the data and users that depend on it, without the operational drag that comes from stitching together complex, distributed systems. With our Distributed AI Hub, we're giving customers a simpler, smarter, and far more connected way to run and scale their AI today. We are building one of the most expansive and neutral AI ecosystems."

Alembic, a customer referenced in the announcement, said the industry has moved from discussing distributed AI in theory to addressing practical requirements such as placement and governance.

"The conversation around distributed AI is finally getting real," said Lloyd Taylor, CTO/CISO at Alembic. "It's more than compute and data, it's controlling where the data lives and how the compute runs. Equinix is framing that problem the right way, by bringing placement, governance, and predictable performance into the same architecture with the Distributed AI Hub. This is what makes distributed AI viable at enterprise scale."

Equinix said the Distributed AI Hub is available across its global data centre locations, and that it expects customers to use it to deploy consistent AI infrastructure patterns across regions.