
2026 Intelligent Driving Battle Shifts to AI Ecosystem Capabilities as Legacy Automakers Rush to Build Independent Smart Architectures

From: Internet Info Agency, 2026-04-22 07:30:00

By 2026, intelligent driving has evolved from an optional feature into a core component of vehicles in China’s automotive market. Data shows that user engagement with Navigation on Autopilot (NOA) is rising rapidly, and intelligent driving systems are shifting from mere functional implementation toward delivering experiential, even emotional, value.

However, the industry faces a homogenization dilemma: mainstream intelligent cockpit interfaces are highly similar, with UI design, voice-assistant personas, and app ecosystems serving as the primary differentiators, yet these alone struggle to create meaningful competitive advantages. She Shidong, Deputy General Manager of Intelligent Products at Great Wall Motor, noted that large-model applications in intelligent cockpits remain in their early stages, lagging significantly behind the iteration pace of the internet industry. A growing industry consensus holds that future competitiveness will hinge not on computing power, algorithms, or parameter scale, but on whether automakers possess independent, end-to-end intelligent system capabilities.

Large models are transitioning from add-on features to the foundational core of cockpit systems. This evolution has unfolded in three phases: from 2022 to 2023, “post-hoc large models” were used primarily for content generation; next came voice agents capable of contextual understanding and memory; and by the second half of 2025, large models began serving as native conversational entry points, deeply integrated into the vehicle’s overall architecture. This shift transforms the competitive logic from “quantity of features” to “depth of user understanding.”

Traditional automakers face three major challenges in this transition. First, edge-side computing power is limited: mainstream chips currently support only sub-billion-parameter small models, creating a significant cognitive gap compared with mainstream large models.
Second, there is no mature solution for modeling spatial relationships within the vehicle cabin. Third, automakers lack sufficient capability to incorporate their accumulated domain-specific automotive knowledge (e.g., HVAC control strategies) into AI training.

Intelligent driving competition can be broken down into three fronts. **First**, algorithmic approaches are evolving rapidly: from multi-stage BEV + Transformer architectures, to end-to-end systems, and now to Vision-Language-Action (VLA) models. Li Auto has launched its VLA Driver large model, while the IM LS8 integrates Alibaba’s Qwen. Yet the core challenge lies not in the algorithms themselves but in full-stack engineering capability, including systematic coordination across data annotation, training, and evaluation frameworks.

**Second**, edge-side computing power has entered an arms race. High-end models now exceed 700 TOPS in the ADAS domain and 300 TOPS in the cockpit domain; Great Wall’s Guiyuan platform uses NVIDIA’s Thor chip, delivering a combined ~1,000 TOPS across both domains. The key, however, lies not in raw compute stacking but in deployment efficiency and power-consumption control.

**Third**, competition over ecosystem control is intensifying. Major large-model providers, including Baidu’s ERNIE, Alibaba’s Qwen, ByteDance’s Doubao, Huawei’s Pangu, and xAI’s Grok, have all entered the automotive space. Automakers must strike a balance between in-house development and external partnerships to avoid ceding control of critical links to suppliers and to retain ownership of iterative development.

Interaction paradigms are also being redefined. The traditional binary “human-machine” model is giving way to a new “human-agent-vehicle” paradigm: users converse with an intelligent agent in natural language, and the agent orchestrates vehicle functions on their behalf.
For example, when a user says “Turn on my seat heater,” the system automatically identifies the user’s location, environmental conditions, and personal preferences to execute the optimal action.

Based on data from nearly 10 million users, Great Wall found that human-vehicle interactions occur roughly 4–5 times per hour, a sparse behavior pattern that highlights where the true value of intelligent cockpits lies: precise responsiveness, not feature overload.

Multiple automakers are advancing agent-centric strategies: IM Motors has launched the IM Ultra Agent, Li Auto is building its VLA Driver large model, and Baidu Maps is developing an AI cockpit agent, all aiming to unify the cockpit and driving experiences under a single intelligent agent. The competitive focus is thus shifting to “experiential depth”: whoever can deliver the feeling that “this car truly understands me” will gain the upper hand in the next phase.

Differentiated “personality” is emerging as a new moat. Brands can train unique agents based on distinct user demographics, brand DNA, and scenario-specific data: Huawei emphasizes safety redundancy, XPeng prioritizes commuting efficiency, Li Auto leans into family-friendliness, and Great Wall infuses off-road character. These traits stem from accumulated real-world scenario data, not merely algorithmic tuning.

Great Wall Motor has structured its intelligent transformation into three phases. From 2021 to 2023, it unified its software platform and established OTA capabilities. From 2023 to 2025, through more than 1,000 user conversations and dozens of deep ethnographic studies, it achieved high user satisfaction. Starting in 2025, it entered the AI-agent era, reconstructing the user experience around the “human-agent-vehicle” paradigm. This roadmap reflects the broader evolution of traditional automakers: from selling hardware, to selling software, and ultimately to selling experiences.

She Shidong emphasized that automakers must not outsource their connection with users to suppliers.
As AI becomes the intermediary between humans and vehicles, controlling the intelligent agent means controlling the user relationship, a proposition now seen as central to successful intelligent transformation.
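The “human-agent-vehicle” loop described above, where a request like “Turn on my seat heater” is resolved against the user’s location, the cabin environment, and learned preferences before any vehicle function is called, can be sketched roughly as follows. This is a minimal illustration only: the class and function names are hypothetical, not any automaker’s actual API.

```python
# Hypothetical sketch of an agent resolving a natural-language request
# against cabin context and user preferences before issuing a vehicle
# command. All names here are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class CabinContext:
    speaker_seat: str     # e.g., resolved from the microphone zone that heard the request
    cabin_temp_c: float   # current cabin temperature from climate sensors
    preferred_level: int  # learned per-user seat-heater setting (1-3)


def plan_seat_heater(request: str, ctx: CabinContext) -> dict:
    """Map a spoken request to a concrete, context-aware vehicle command."""
    if "seat heater" not in request.lower():
        return {"action": "none"}
    # "Optimal action": skip heating entirely if the cabin is already warm.
    level = 0 if ctx.cabin_temp_c >= 26.0 else ctx.preferred_level
    return {"action": "set_seat_heater", "seat": ctx.speaker_seat, "level": level}


ctx = CabinContext(speaker_seat="driver", cabin_temp_c=4.0, preferred_level=2)
print(plan_seat_heater("Turn on my seat heater", ctx))
# → {'action': 'set_seat_heater', 'seat': 'driver', 'level': 2}
```

The point of the sketch is the division of labor: the agent owns intent resolution and context, while the vehicle exposes only narrow function calls, which is exactly the control point the article argues automakers should not cede to suppliers.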

Editor: NewsAssistant