EDB pitches 'intelligence per watt' for AI data layer
EDB has introduced what it calls an "intelligence per watt" framework for EDB Postgres AI, aimed at reducing token consumption and data centre emissions tied to AI workloads.
The announcement focuses on performance claims across data-layer operations, including vector indexing, query handling and infrastructure use, as businesses deploy more AI agents and retrieval systems.
EDB argues that the energy debate around artificial intelligence has focused too heavily on models and graphics processors, while giving less attention to the databases and data movement behind AI systems. It says enterprises have more direct control over efficiency in the data layer than in the model layer.
The platform targets energy use on two fronts: the core infrastructure needed to run enterprise applications, and the database activity behind agentic AI, including search, retrieval and vector indexing.
An analysis of three banking, financial services and insurance customers operating more than 120 data centres found substantial reductions in computing resources. The work, which EDB says was independently validated by Incendium Consulting, showed up to a 94% reduction in compute cores in one case and an expected emissions reduction of up to 87%.
EDB said that level of emissions reduction would amount to about 153,000 metric tons of avoided CO2e, roughly equivalent to removing 33,000 cars from the road.
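The cars-off-the-road equivalence can be sanity-checked with simple arithmetic. The per-vehicle figure below is an assumption (roughly 4.6 tonnes of CO2e per car per year, a commonly cited average), not a number EDB published:

```python
# Rough check on the cars-equivalent claim; the per-car emissions
# figure is an assumed average, not taken from EDB's announcement.
avoided_tonnes = 153_000
tonnes_per_car_per_year = 4.6  # assumed annual average per passenger car

cars_equivalent = avoided_tonnes / tonnes_per_car_per_year
print(f"~{cars_equivalent:,.0f} cars")  # lands close to the ~33,000 quoted
```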
Data layer focus
EDB also released benchmark figures for vector indexing, a process used in AI search and retrieval systems. It said EDB Postgres AI built vector indexes 5x to 12x faster than traditional vector engines, and handled 1 billion vectors on a server with 128 GB of memory, where it said traditional engines require more than 1,000 GB.
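To see why fitting 1 billion vectors into 128 GB is notable, a back-of-envelope footprint calculation helps. The embedding dimension and precisions below are illustrative assumptions, not EDB's figures; the point is that full-precision vectors at a typical width need terabytes, so a 128 GB footprint implies aggressive quantization or compression:

```python
# Back-of-envelope memory estimate for vector storage.
# Dimension (768) and byte widths are illustrative assumptions.

def index_footprint_gb(num_vectors: int, dims: int, bytes_per_value: float) -> float:
    """Raw storage for the vectors alone, excluding index overhead."""
    return num_vectors * dims * bytes_per_value / 1e9

ONE_BILLION = 1_000_000_000

raw = index_footprint_gb(ONE_BILLION, 768, 4)         # float32: ~3072 GB
quantized = index_footprint_gb(ONE_BILLION, 768, 1)   # int8:    ~768 GB
binary = index_footprint_gb(ONE_BILLION, 768, 1 / 8)  # 1-bit:   ~96 GB

print(f"float32: {raw:.0f} GB, int8: {quantized:.0f} GB, binary: {binary:.0f} GB")
```

Only something near the 1-bit end of that range fits the 128 GB server in EDB's claim, which is consistent with the 1,000+ GB it attributes to engines storing less compressed representations.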
Another result came from a pilot with a global telecoms provider. In that test, EDB said its software achieved up to 57% lower AI token usage while preserving 90% of response quality and winning 72% of test scenarios.
Those figures matter because token usage is closely linked to the running cost of large language model-based applications. As companies put more AI agents into production, repeated database calls, indexing tasks and data transfers can increase both electricity demand and operating costs.
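The link between token usage and running cost is direct multiplication. The daily volume and per-token price below are assumptions chosen for the sketch; only the 57% reduction comes from the pilot EDB described:

```python
# Illustrative cost model for LLM token consumption. The daily token
# volume and the per-million-token price are assumed values; the 57%
# reduction is the figure EDB reported from its telecoms pilot.

def monthly_token_cost(tokens_per_day: float, usd_per_million_tokens: float,
                       days: int = 30) -> float:
    return tokens_per_day / 1e6 * usd_per_million_tokens * days

baseline = monthly_token_cost(tokens_per_day=2e9, usd_per_million_tokens=3.0)
reduced = baseline * (1 - 0.57)  # apply the reported 57% reduction

print(f"baseline: ${baseline:,.0f}/month, after 57% cut: ${reduced:,.0f}/month")
```

At those assumed rates, a 57% cut takes a $180,000 monthly bill to roughly $77,000, which is why token efficiency at the data layer translates straight into operating cost.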
EDB placed the announcement against a broader rise in AI infrastructure consumption. It said enterprises are expected to make wider use of AI agents over the next several years, with those systems consuming trillions of tokens each day as deployment scales.
It also cited forecasts that global data centre electricity demand will more than double by the end of the decade, with AI as a major contributor. That backdrop is shifting attention beyond the cost of training and inference to the storage, retrieval and orchestration work that supports those processes.
"The AI energy conversation has been about what happens with the models and GPUs. Almost nobody is talking about what happens at the data layer that every agent, every model, every inference call depends on," said Quais Taraki, Chief Technology Officer, EDB.
"You can't control consumption at the model layer. Agents consume what they consume. But you can control efficiency at the data layer, and for most enterprises, that's the only lever they actually have," Taraki said.
Three principles
EDB said the framework rests on three principles: measure, optimise and govern. In practice, that means quantifying the energy and infrastructure cost of AI output, reducing compute, storage and network demand for each operation, and maintaining oversight of databases, indexes and pipelines created by autonomous systems.
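The "measure" principle implies reducing intelligence per watt to a concrete ratio. One minimal way to express it, with all field names and the metric definition as illustrative assumptions rather than anything EDB has specified, is useful data-layer output per unit of metered energy:

```python
from dataclasses import dataclass

# A hypothetical "intelligence per watt" metric: useful AI output
# (here, completed retrieval operations) per watt-hour consumed.
# The class, fields and metric definition are illustrative assumptions.

@dataclass
class DataLayerSample:
    operations_completed: int  # e.g. vector searches served in the window
    energy_watt_hours: float   # metered energy over the same window

    def intelligence_per_watt_hour(self) -> float:
        return self.operations_completed / self.energy_watt_hours

before = DataLayerSample(operations_completed=1_200_000, energy_watt_hours=5_000)
after = DataLayerSample(operations_completed=1_200_000, energy_watt_hours=1_800)

print(f"before: {before.intelligence_per_watt_hour():.0f} ops/Wh")
print(f"after:  {after.intelligence_per_watt_hour():.0f} ops/Wh")
```

Holding output constant while cutting energy, as in this sketch, is the "optimise" half of the framework; tracking the ratio over the databases and pipelines that agents create is the "govern" half.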
The company linked that framework to broader claims about operational efficiency in its platform. It said EDB Postgres AI has also shown analytical workloads completing 50x to 100x faster on live operational data, with up to 58% lower cost and limited performance degradation under concurrent use, compared with rival cloud analytics platforms.
For EDB, AI economics and environmental impact increasingly depend on how data systems are designed. That view reflects a wider industry debate over whether improvements in infrastructure efficiency can offset rapid growth in demand from generative AI and agent-based software.
Kevin Dallas, Chief Executive Officer of EDB, linked energy efficiency to financial returns from AI projects.
"Enterprises succeeding with AI at scale are 275% more likely to prioritize energy-efficient data infrastructure than the rest of the market. They're also seeing 5x the ROI. That's the connection most of this industry is missing. This idea of 'intelligence per watt' isn't just an environmental metric; it's a performance indicator. The companies getting the most from AI are the ones demanding the most from their data layer," Dallas said.