Explore high-fidelity and diverse sensor simulation for safe autonomous vehicle development.
Simulation / Modeling / Design
Automotive and Transportation
Return on Investment
Risk Mitigation
NVIDIA Omniverse Enterprise
NVIDIA OVX
NVIDIA DGX
Developing autonomous vehicles (AVs) requires vast amounts of training data that mirrors the real-world diversity they’ll face on the road. Sensor simulation addresses this challenge by rendering physically based sensor data in virtual environments. Building on this physical grounding, world models add variation to simulated sensor data, varying lighting, weather, geolocation, and more. With these capabilities, you can train, test, and validate AVs at scale without having to encounter rare and dangerous scenarios in the real world. This precision and diversity in sensor data and environmental interaction are crucial for developing physical AI.
Why AV Simulation Matters:
Render diverse driving conditions—such as adverse weather, traffic changes, and rare or dangerous scenarios—without having to encounter them in the real world.
Accelerate development and reduce reliance on costly data-collection fleets by generating data to meet model needs.
Deploy a virtual fleet to configure new sensors and stacks before physical prototyping.
Developers can get started building AV simulation pipelines by taking the following steps.
NVIDIA Omniverse™ NuRec provides APIs and tools for neural reconstruction and rendering, allowing developers to turn their sensor data into high-fidelity 3D digital twins, simulate new events, and render datasets from new perspectives.
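NuRec's actual API is not shown here, but the core idea of "rendering datasets from new perspectives" can be sketched as transforming camera poses within a reconstructed scene. The snippet below is a minimal NumPy sketch with illustrative function names (not NuRec calls): it offsets a recorded ego-camera pose to obtain a novel viewpoint that a neural renderer could then render from.

```python
import numpy as np

def make_pose(translation, yaw_deg):
    """Build a 4x4 camera-to-world pose from a translation and a yaw angle."""
    yaw = np.radians(yaw_deg)
    pose = np.eye(4)
    pose[:3, :3] = np.array([
        [np.cos(yaw), -np.sin(yaw), 0.0],
        [np.sin(yaw),  np.cos(yaw), 0.0],
        [0.0,          0.0,         1.0],
    ])
    pose[:3, 3] = translation
    return pose

def offset_pose(recorded_pose, lateral_m=2.0, yaw_deg=5.0):
    """Shift a recorded sensor pose sideways and rotate it slightly,
    yielding a novel viewpoint to re-render the reconstruction from."""
    delta = make_pose([0.0, lateral_m, 0.0], yaw_deg)
    return recorded_pose @ delta

recorded = make_pose([10.0, 0.0, 1.6], 0.0)  # original ego camera pose
novel = offset_pose(recorded)                # e.g., view from an adjacent lane
```

A real pipeline would pass each novel pose to the neural renderer to produce the corresponding sensor frame; the pose math above is the part that generalizes.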
Cosmos Transfer is conditioned on ground-truth and structured data inputs to generate new lighting, weather, and terrain, turning a single driving scenario into hundreds. Developers can use text prompts as well as sensor data as inputs to generate different variants of an existing scene.
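Cosmos Transfer's actual API is not reproduced here; as a sketch of the idea, the snippet below enumerates prompt variants (the condition axes and strings are hypothetical) that could each drive one generation pass over the same source clip, expanding a single scenario into many.

```python
from itertools import product

# Hypothetical condition axes; Cosmos Transfer's real prompt
# vocabulary and API are not reproduced here.
WEATHER = ["clear", "heavy rain", "dense fog", "snow"]
LIGHTING = ["midday sun", "golden hour", "night with streetlights"]

def variant_prompts(base_scene: str):
    """Expand one scene description into prompt variants,
    one per weather/lighting combination."""
    return [
        f"{base_scene}, {weather}, {lighting}"
        for weather, lighting in product(WEATHER, LIGHTING)
    ]

prompts = variant_prompts("four-way urban intersection with crossing pedestrians")
# 4 weather x 3 lighting = 12 variants of the same scenario
```

Each prompt, paired with the original sensor data as conditioning input, would yield a distinct variant of the same drive.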
Both NuRec and Cosmos Transfer are integrated with CARLA, a leading open-source AV simulator. This integration allows developers to generate sensor data from Gaussian-based reconstructions using ray tracing and to increase scenario diversity with Cosmos world foundation models (WFMs).
With these tools, plus a starter pack of pre-reconstructed scenes included in the integration, developers can rapidly create diverse, corner-case datasets for AV development.
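As a concrete starting point, CARLA's Python API exposes weather as `carla.WeatherParameters`, which makes condition sweeps straightforward. Because applying weather requires a running CARLA server, the sketch below keeps the presets as plain dicts (the keys mirror real `WeatherParameters` fields; the preset names and values are illustrative) and only touches the API when a live `world` is supplied.

```python
# Weather presets as plain dicts; the keys mirror fields of CARLA's
# carla.WeatherParameters. Applying one with a connected client:
#   world.set_weather(carla.WeatherParameters(**preset))
PRESETS = {
    "clear_noon":  {"cloudiness": 10.0, "precipitation": 0.0,
                    "fog_density": 0.0,  "sun_altitude_angle": 75.0},
    "hard_rain":   {"cloudiness": 90.0, "precipitation": 80.0,
                    "fog_density": 5.0,  "sun_altitude_angle": 45.0},
    "foggy_night": {"cloudiness": 60.0, "precipitation": 0.0,
                    "fog_density": 70.0, "sun_altitude_angle": -10.0},
}

def sweep(world=None):
    """Iterate over presets; with a live CARLA `world`, apply each one.
    Returns the preset names so the sweep can drive data collection."""
    applied = []
    for name, params in PRESETS.items():
        if world is not None:          # only touch the API when connected
            import carla               # CARLA's Python package
            world.set_weather(carla.WeatherParameters(**params))
        applied.append(name)
    return applied
```

Looping a recorded or reconstructed scenario through such a sweep is one simple way to multiply coverage before layering on WFM-generated variation.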
Developers can use the latest NVIDIA Cosmos data processing workflows and world foundation models to enhance AV development with faster, scalable synthetic data generation. Filter, annotate, and de-duplicate massive datasets with Cosmos Curator, and quickly create tailored post-training datasets with Cosmos Dataset Search. Cosmos Predict and Cosmos Transfer WFMs generate new video data for testing and validation, scaling across weather, lighting, and terrain conditions.
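Cosmos Curator's actual interface is not shown here; the snippet below only illustrates the de-duplication step with a simple content fingerprint (hashing coarse per-frame statistics), a stand-in for the embedding-based similarity a real curation pipeline would use.

```python
import hashlib

def fingerprint(frames):
    """Hash a coarse summary of a clip (here: rounded per-frame means).
    A real pipeline would compare learned video embeddings instead."""
    summary = ",".join(f"{sum(f) / len(f):.1f}" for f in frames)
    return hashlib.sha256(summary.encode()).hexdigest()

def deduplicate(clips):
    """Keep the first clip for each fingerprint, dropping near-verbatim repeats."""
    seen, kept = set(), []
    for clip_id, frames in clips:
        fp = fingerprint(frames)
        if fp not in seen:
            seen.add(fp)
            kept.append(clip_id)
    return kept

clips = [
    ("a", [[10, 10], [20, 20]]),
    ("b", [[10, 10], [20, 20]]),   # duplicate of "a"
    ("c", [[90, 90], [20, 20]]),
]
```

Running `deduplicate(clips)` keeps only `"a"` and `"c"`, shrinking the dataset before annotation and post-training dataset assembly.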
Learn how our partners are delivering physically-based simulation for safe and efficient autonomous vehicle development.
Please sign up using our interest form to get the latest information.