News
Mar 9, 2026
NVIDIA DRIVE Hyperion Adopted by BYD, Geely, Isuzu, and Nissan for Level 4 Autonomous Vehicle Programs

Image by NVIDIA
NVIDIA confirmed at its GTC event that global automotive leaders BYD, Geely, Isuzu, and Nissan are adopting the NVIDIA DRIVE Hyperion platform for their next-generation Level 4 autonomous vehicle programs. This standardized reference architecture integrates high-performance compute, sensors, and networking to streamline the path to full autonomy. The move signals a major consolidation in the industry toward a unified hardware and software stack that accelerates validation and global deployment.
The momentum extends to the mobility sector, where Uber plans to launch a massive robotaxi fleet powered by NVIDIA DRIVE AV software across 28 global markets by 2028. This rollout begins in San Francisco and Los Angeles in early 2027, utilizing vehicles built on the Hyperion foundation. Other major providers including Lyft, Bolt, and Grab are also leveraging this platform to scale their autonomous ride-hailing initiatives and software-defined fleets.
Central to these advancements is the launch of Alpamayo 1.5, an open AI reasoning engine that allows vehicles to process complex scenarios using natural language prompts. This steerable model enables developers to refine driving trajectories by replaying rare road events and providing updated behavioral guidance. Over 100,000 developers have already downloaded the Alpamayo suite to build more transparent and safer autonomous systems.

Safety remains the core priority with the introduction of NVIDIA Halos OS, a production-ready safety foundation for Level 4 autonomy. The OS features a unified architecture built on ASIL D-certified foundations and an active safety stack designed for five-star ratings. Partners such as Gatik and Lucid have joined the new Halos AI Systems Inspection Lab to ensure these systems meet rigorous automotive-grade standards.

To validate these systems, NVIDIA unveiled Omniverse NuRec, which uses 3D Gaussian Splatting to turn real-world data into interactive simulations. This technology allows manufacturers to stress-test reasoning-based AI without the immense costs of manual world-building. Providers like dSPACE and 51WORLD are already integrating NuRec to close the loop between real-world failures and digital training.

Beyond the driving task, NVIDIA is also enhancing the passenger experience through a collaboration with Amazon. The partnership integrates Alexa Custom Assistant with multimodal AI on the DRIVE AGX platform, offering privacy-focused in-cabin intelligence. This ensures that while the vehicle handles the road, the interior provides a sophisticated and secure ambient environment for occupants.
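For readers unfamiliar with the technique behind NuRec: Gaussian Splatting represents a scene as a cloud of translucent Gaussian blobs that are composited front-to-back onto the image plane. The sketch below is a minimal, illustrative 2D version in NumPy (the scene values and function names are invented for this example, not part of NVIDIA's NuRec API):

```python
# Minimal 2D sketch of Gaussian splatting: composite a few anisotropic
# Gaussians onto a pixel grid with front-to-back alpha blending.
# All scene values are illustrative; real 3D pipelines also project
# 3D covariances to screen space and depth-sort the splats.
import numpy as np

def render_gaussians(gaussians, h, w):
    """Composite 2D Gaussians (given near-to-far) into an RGB image."""
    ys, xs = np.mgrid[0:h, 0:w]
    img = np.zeros((h, w, 3))
    transmittance = np.ones((h, w))  # fraction of light not yet absorbed
    for g in gaussians:              # front-to-back order
        d = np.stack([xs - g["mu"][0], ys - g["mu"][1]], axis=-1)
        inv_cov = np.linalg.inv(g["cov"])
        # Mahalanobis distance -> smooth elliptical falloff per pixel
        m = np.einsum("...i,ij,...j->...", d, inv_cov, d)
        alpha = g["opacity"] * np.exp(-0.5 * m)
        img += (transmittance * alpha)[..., None] * np.asarray(g["color"], float)
        transmittance *= 1.0 - alpha
    return img

splats = [
    {"mu": (20, 15), "cov": np.diag([40.0, 10.0]), "opacity": 0.9, "color": (1, 0, 0)},
    {"mu": (25, 18), "cov": np.diag([15.0, 15.0]), "opacity": 0.7, "color": (0, 0, 1)},
]
image = render_gaussians(splats, 32, 48)
print(image.shape)  # (32, 48, 3)
```

Because each splat is differentiable with respect to its position, covariance, opacity, and color, the representation can be fit to real camera footage by gradient descent, which is what makes it attractive for turning drive logs into replayable simulation scenes.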
This announcement marks a definitive end to the era of fragmented, proprietary autonomous stacks. By coalescing around NVIDIA's standardized hardware and the new Halos OS safety architecture, industry giants like BYD and Nissan are prioritizing interoperability and rapid scaling over isolated development. This shift suggests that the race for Level 4 autonomy will no longer be won by those with the most secret code, but by those who can best leverage shared platforms and unified simulation environments to solve the complex "long tail" of real-world driving hazards.