Auto Industry Enters Era of Full-Scale AI Inference, Driving Surge in Computing Demand and Accelerating Domestic Substitution

From: Internet Info Agency | 2026-04-11 20:53:00

At the High-Level Forum on Intelligent Electric Vehicle Development held on April 11, 2026, Shi Qinghua, Vice President of Baidu, stated that the automotive industry is transitioning comprehensively from the AI training phase into an era of "full-scale inference," driving a sharp surge in demand for inference computing power and pushing the sector into a "computing power crunch." According to forecasts, inference will account for 73% of total computing demand in the automotive field by 2028, up significantly from roughly one-third in 2023.

Currently, three workloads are the primary drivers of computing power consumption: intelligent cockpits, end-to-end intelligent transformation across the entire R&D chain, and Vibe Coding (AI-driven software development). Intelligent cockpits are evolving into native platforms for large models, enabling generative human-machine interaction, multimodal perception, and long-term memory. Meanwhile, AI is being integrated across all stages of the business, including vehicle R&D, manufacturing, supply chain, sales, and after-sales service, to improve efficiency. In addition, AI-powered automated coding is substantially boosting software development productivity, further escalating inference workloads.

With the supply of high-end chips from overseas constrained, domestic computing solutions are rapidly stepping in to fill the gap. Shi Qinghua advised automakers to plan their computing infrastructure proactively, build enterprise-grade large model platforms, and strengthen data governance. He emphasized that computing power should be treated as a core production resource rather than merely an R&D cost, and suggested commercializing cockpit services to resolve the paradox of "enhanced user experience but declining profitability."
Baidu also announced that it has launched the Tianchi Super Node and plans to release the Kunlunxin M100 dedicated inference chip, integrating hardware, chips, and computing networks into an efficient, low-cost, purpose-built inference foundation that supports the deployment of autonomous driving and intelligent cockpit applications.

Editor: NewsAssistant