Executives and researchers warn that AI's short-term economics favor human labor: Nvidia VP Bryan Catanzaro says compute (specialized GPUs, data-center capex and energy) often costs more than paying employees, and a 2024 MIT study found only about 23% of vision-heavy roles economically automatable today. Despite massive Big Tech capex (Morgan Stanley cites roughly $740 billion) and ongoing model improvements, rising hardware and software fees mean firms treat AI as a complement rather than a substitute. Experts say the industry will need cheaper inference, more efficient hardware and better pricing models before automation's labor savings materialize at scale.
Tech leaders and practitioners must plan investments knowing compute costs can exceed labor costs today, affecting ROI and deployment timelines. Decisions on hiring, architecture and vendor selection hinge on when hardware and inference pricing improve.
Dossier last updated: 2026-05-13 14:58:14
NVIDIA CEO Jensen Huang said older GPUs from around five years ago are behaving like "fine wine": their value and demand keep rising as AI workloads drive explosive demand for accelerator compute. Huang used the term to capture both the performance gains older GPUs have historically received from software optimizations and, increasingly, their sustained price appreciation as supply struggles to keep pace. Industry players including CoreWeave report inventory tightness and rising prices for models such as the H100, H200, L40S and A100; data-center AI workloads, constrained wafer and memory capacity, and full-stack capacity shortages are cited as causes. The trend matters because it reshapes procurement, secondary markets, cloud capacity planning and the total cost of AI deployment.
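The procurement math behind the "fine wine" effect is easy to sketch. The snippet below is a minimal illustration, not from the article: the purchase price, power draw, electricity rate, utilization and resale figures are all placeholder assumptions. It shows how a rising residual value for an accelerator lowers its effective hourly cost, which is why secondary-market appreciation changes total cost of ownership.

```python
# Back-of-envelope GPU total-cost-of-ownership sketch.
# All numbers are illustrative assumptions, not figures from the article.

HOURS_PER_YEAR = 8760

def effective_hourly_cost(purchase_price, resale_value, years_held,
                          power_kw, electricity_per_kwh, utilization):
    """Effective $/hour of owning an accelerator: depreciation (or
    appreciation, if resale_value > purchase_price) plus energy."""
    active_hours = HOURS_PER_YEAR * years_held * utilization
    depreciation = (purchase_price - resale_value) / active_hours
    energy = power_kw * electricity_per_kwh  # energy cost per active hour
    return depreciation + energy

# Hypothetical H100-class card: $30k new, 0.7 kW, $0.10/kWh, 80% utilized.
falling = effective_hourly_cost(30_000, 10_000, 3, 0.7, 0.10, 0.8)
rising = effective_hourly_cost(30_000, 32_000, 3, 0.7, 0.10, 0.8)

print(f"resale falls to $10k: ${falling:.2f}/hr")
# A negative result means appreciation more than offsets energy spend.
print(f"resale rises to $32k: ${rising:.2f}/hr")
```

Under these assumed numbers, a card whose resale value drops to $10k costs about $1/hour to own, while one that appreciates to $32k effectively pays for its own electricity, which is one way to read the reported reluctance to retire older inventory.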
Nvidia VP Bryan Catanzaro told Axios that compute costs for AI teams currently exceed employee salaries, highlighting a gap between AI investment and labor savings. Studies and industry voices back this: a 2024 MIT analysis found only 23% of vision-centric roles could be economically automated, while Yale and McKinsey data show rising AI capital and data center spending. Big Tech continues heavy AI spending—Morgan Stanley reports $740 billion in capex so far—coinciding with widespread tech layoffs in 2026. Experts say a short-term mismatch exists because hardware, energy, and software pricing keep AI costlier than human labor, so many firms treat AI as a complement until inference and infrastructure costs fall.
An Nvidia executive says training and running AI models currently costs more in compute than the equivalent human labor, arguing that infrastructure, GPU demand and electricity make AI development expensive today. Speaking publicly, the executive highlighted that while model performance keeps improving, the capital costs of data centers, specialized GPUs (like Nvidia's own), and energy consumption push total expenses above the cost of paying employees for many tasks. The comment matters because it reframes the short-term economics of AI adoption: firms weighing automation must consider high up-front and operational compute costs, not just potential labor savings. The remarks could influence enterprise procurement, cloud pricing, and investment in efficiency-focused hardware and software optimizations.
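To make that compute-versus-labor comparison concrete, here is a minimal, hypothetical sketch (not from the remarks): it stacks the annualized cost of an AI deployment (amortized capital spend, metered inference, electricity) against a salary, using placeholder numbers throughout, to show how the break-even point moves as inference prices fall.

```python
# Hypothetical compute-vs-labor comparison. Every figure below is an
# illustrative assumption; none come from Nvidia, Axios, or MIT.

def annual_ai_cost(capex, amort_years, tokens_per_year, price_per_mtok,
                   energy_cost_per_year):
    """Annualized cost of an AI deployment: amortized capital spend,
    metered inference, and electricity."""
    amortized = capex / amort_years
    inference = (tokens_per_year / 1e6) * price_per_mtok
    return amortized + inference + energy_cost_per_year

salary = 90_000  # assumed fully loaded annual cost of one employee

for price_per_mtok in (15.0, 5.0, 1.0):  # falling inference prices, $/1M tokens
    ai = annual_ai_cost(capex=200_000, amort_years=4,
                        tokens_per_year=5_000_000_000,
                        price_per_mtok=price_per_mtok,
                        energy_cost_per_year=12_000)
    verdict = "cheaper than" if ai < salary else "costlier than"
    print(f"${price_per_mtok:.2f}/Mtok -> AI ${ai:,.0f}/yr, {verdict} ${salary:,} salary")
```

With these assumed inputs, the AI pipeline only undercuts the salary once inference drops from $15 to roughly $5 per million tokens, which mirrors the argument that labor savings arrive only after inference and infrastructure costs fall.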
Nvidia VP Bryan Catanzaro told Axios that compute costs for AI exceed the cost of human employees, a claim supported by a 2024 MIT study finding AI automation is economically viable for only 23% of vision-heavy roles. Despite that, Big Tech continues to pour capital into AI (Morgan Stanley reports $740 billion in capex this year) while companies such as Meta and Microsoft carry out large layoffs. Experts and studies cited (Yale Budget Lab, McKinsey, Tropic) warn that hardware, energy, and rising software fees keep AI more expensive than labor today, prompting firms to treat AI as a complementary tool until operating costs and pricing models stabilize. The piece highlights a short-term mismatch between AI investment and economic returns.