
Yet revenue trails far behind and the economics remain unsettled. Unlike earlier technology waves, where scale drove costs down, artificial intelligence grows more expensive the more it is used. That combination of extraordinary adoption and extraordinary cost is the paradox shaping today’s AI economy.
In an op-ed by the law firm Baker Botts LLP, the authors warned that “the gap between significant investment and modest monetization lies at the heart of AI’s business challenge.” They drew a direct line to the dot-com era, when companies with rapid user growth but weak monetization failed to deliver sustainable returns.
Building Faster Than the Business Case
OpenAI sits at the center of the story. The company now carries a $500 billion valuation, following a secondary share sale reported by PYMNTS. Half-year revenue rose 16% to $4.3 billion. Yet those gains remain modest when set against its obligations. OpenAI signed a $300 billion cloud agreement with Oracle to secure compute capacity, while Nvidia is preparing to invest up to $100 billion, setting a private funding record.
The op-ed from Baker Botts LLP also illustrated the scale of the imbalance between revenue and investment with simple math. Even if OpenAI were able to convert 100 million users into paid subscribers at $30 a month, annual revenue would reach only $36 billion, a figure that barely begins to cover long-term infrastructure contracts worth hundreds of billions of dollars. At more common subscription rates of under $10 per month, the gap would widen dramatically.
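For readers who want to check the arithmetic, a minimal sketch of that back-of-envelope calculation is below; the 100 million subscriber count and the $30 and $10 price points are the op-ed's illustrative assumptions, not reported OpenAI figures.

```python
# Minimal sketch of the op-ed's back-of-envelope subscription math.
# Subscriber counts and price points are illustrative assumptions,
# not reported OpenAI figures.

def annual_subscription_revenue(subscribers: int, monthly_price: float) -> float:
    """Annual revenue from a flat monthly subscription, in dollars."""
    return subscribers * monthly_price * 12

# 100 million paid subscribers at $30/month -> $36 billion per year
optimistic = annual_subscription_revenue(100_000_000, 30.0)

# The same subscriber base at a $10/month price point -> $12 billion per year
common_rate = annual_subscription_revenue(100_000_000, 10.0)

print(f"$30/month case: ${optimistic / 1e9:.0f}B per year")    # $36B per year
print(f"$10/month case: ${common_rate / 1e9:.0f}B per year")   # $12B per year
```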
The same pressures face the industry’s largest players. The Wall Street Journal has reported that Microsoft, Google and Amazon are expanding their data center footprints at unprecedented speed to support AI workloads. PYMNTS has reported that these expansions represent one of the most aggressive infrastructure buildouts in modern technology, with AI-related infrastructure spending by Big Tech expected to surpass $2.8 trillion through 2029.
Adoption at Record Speed
Karen Webster, CEO of PYMNTS, has emphasized that generative AI is already unique among technologies because of how fast it has spread. In “Gen AI: The Technology That Broke the Adoption Curve,” she noted that AI adoption bypassed traditional barriers: no new hardware, no merchant upgrades, no network rollouts. Consumers could experiment instantly, and they did.
That acceleration has created pressure for enterprises to catch up. Webster’s point is that speed alone is not the milestone. The real test is whether AI becomes a habit, embedded in workflows the way email and cloud storage did. “Technologies that last embed themselves into daily routines,” she wrote. The distinction between novelty and habit is where long-term value will be proven.
ROI Beyond Queries
Metrics complicate the adoption story. Today’s shorthand is the query, often used as a measure of AI engagement. As Webster argued in “How Leading Enterprises Really Measure Gen AI ROI,” however, queries risk becoming vanity metrics. “Queries are the new eyeballs,” she observed. Activity is not the same as impact. What matters is whether AI projects produce measurable outcomes such as cost savings, efficiency gains or revenue.
There are early signs of progress. PYMNTS research shows enterprises reporting up to 40% time savings in compliance documentation when AI is paired with structured oversight. Banks are piloting AI to reduce false positives in fraud detection, which reduces both costs and customer friction. Retailers are experimenting with AI personalization to lift conversion rates without raising acquisition costs. These are measurable efficiencies that move AI beyond experimentation.
CFOs are also starting to put tighter discipline on AI investments. A recent PYMNTS Intelligence survey found that only 26.7% of CFOs plan to increase generative AI budgets in the next 12 months, down from 53.3% a year earlier. The shift signals a move away from hype and toward results-driven spending. Among firms reporting very positive returns from generative AI, half plan to expand budgets further, while only 16.7% of those seeing negligible returns will do so. The ROI filter is hardening, and future investments will be judged by whether they consistently demonstrate financial or operational value.
The pressure to meet the energy demands of AI infrastructure is also real. By 2035, data centers are projected to account for 8.6% of all U.S. electricity demand, more than double their 3.5% share today, according to BloombergNEF. It is not just electricity that matters. Stanford University has reported that AI data centers are already straining water supplies and local grids in the western United States, competing for land, plumbing and power capacity.
The Next Chapter
The story of AI today is one of juxtaposition. On one side, record-setting valuations, historic adoption and infrastructure contracts measured in the hundreds of billions. On the other, an ROI equation that remains unsettled, strained by infrastructure costs and resource realities.
But there is also momentum. OpenAI’s revenues are climbing. Enterprises are reporting measurable efficiency gains. And as Webster has argued, adoption is already breaking historical curves. If these gains scale from pilots to enterprise-wide deployment, the investments will not just be justified; they may unlock a new wave of productivity growth.
Source: https://www.pymnts.com/