Sensor Realistic Simulation with rFpro
Physics-based sensor models for camera, LiDAR and radar to enable ADAS and autonomous vehicle simulation
Camera Simulation
rFpro offers a wide range of camera simulation capabilities, including lens distortion models, tone mapping, exposure control and much more, in both real-time and non-real-time modes. One of the major challenges facing autonomous vehicle developers is generating training data: when it is based on real-world camera footage, the process is expensive, time-consuming and inflexible. With rFpro we can create an unlimited number of scenarios with variations in weather, lighting, traffic and pedestrian motion, and all of the data comes fully annotated. Find out more about the data farming possibilities offered by rFpro. The non-real-time Synchro-Step mode available in rFpro allows extremely high-fidelity camera simulation: a single camera frame is built up over multiple time steps running at the CCD sample rate, which can be 15-30 kHz. This means we can run camera simulations that generate HDR images with true motion blur, and run fully synchronised stereo cameras.
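The sub-step accumulation idea can be sketched as follows. This is an illustrative model rather than rFpro's actual API: `render_substep` is a hypothetical stand-in for whatever produces the linear HDR radiance at each CCD sample, and the Reinhard tone-mapping step is just one common choice.

```python
import numpy as np

def accumulate_frame(render_substep, n_substeps, exposure=1.0):
    """Build one camera frame by integrating many short sub-steps,
    mimicking a CCD sampled far faster than the output frame rate.
    render_substep(i) returns the linear HDR radiance at sub-step i."""
    frame = np.zeros_like(render_substep(0), dtype=np.float64)
    for i in range(n_substeps):
        frame += render_substep(i)          # moving objects smear across sub-steps -> true motion blur
    frame = exposure * frame / n_substeps   # average radiance over the exposure window
    # simple global (Reinhard) tone mapping from HDR down to a displayable range
    return frame / (1.0 + frame)

# e.g. a 20 kHz CCD feeding a 30 fps camera would integrate ~666 sub-steps per frame
```

Because the accumulation happens before tone mapping, bright moving highlights blur in linear radiance, which is what distinguishes true motion blur from a post-process blur of the final image.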
The video to the right shows a simulation of a camera with an RCCC colour filter and a lens distortion model. Here the vehicle is driven around Paris; notice the effect the filter has on the colours of objects in the scene. Green trees appear grey, while red brake lights and traffic lights still shine through in their expected colour. The effect of the lens distortion model is most apparent at the edges of the view, where buildings and lampposts can be seen to curve. All of these effects can be calibrated so the simulation represents your camera and your lens.
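Why green scenery turns grey under an RCCC filter can be shown with a toy mosaic function. This is a simplified sketch, not the rFpro implementation: one red-filtered site per 2x2 block, with the other three "clear" sites approximated here by the channel mean as a crude luminance.

```python
import numpy as np

def rccc_mosaic(rgb):
    """Sample an RGB image through an RCCC colour filter array:
    one red pixel per 2x2 block, the other three are 'clear'
    (broadband luminance). Green scenery loses its hue and reads
    as grey, while red lights keep their signal at the R sites."""
    clear = rgb.mean(axis=2)              # crude luminance for the clear pixels
    raw = clear.copy()
    raw[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red-filtered site in each 2x2 block
    return raw
```

A pure green patch contributes nothing at the red sites, so it is reconstructed without hue, whereas a red brake light produces a strong response at those sites and survives demosaicing in colour.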
LiDAR Sensor Simulation
Taking advantage of the sensor model API provided by rFpro, we can create physics-based models of real LiDAR devices. These models rely on the same physics-based rendering used to generate the camera images, but use a different set of material property data suitable for the wavelengths at which LiDAR operates. Combining this capability with models derived from physical testing of real sensors, we produce realistic sensor models that take into account weather and other noise effects. Some of this work was carried out as part of our participation in the Streetwise project, a collaborative R&D project part-funded by CCAV through UKRI and led by Five AI.
The LiDAR video to the left shows the output from our Velodyne Ultra Puck® model as the vehicle is driven around Paris. The sensor output is visualised using VeloView, Velodyne's own software, so that we can verify the simulated LiDAR data stream is indistinguishable from that of the real device. VeloView updates the visualisation at the rotational speed of the sensor, in this case about 10 Hz. Similar work has been done with other LiDAR sensors, such as those from Ouster and Ibeo Automotive Systems.
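The geometry of a spinning LiDAR's beam pattern can be sketched as below. This is an illustrative layout only: the channel count, vertical field of view and firing rate are assumptions chosen for the example, and `range_fn` stands in for the ray cast that rFpro's rendering would actually perform.

```python
import numpy as np

def lidar_scan(range_fn, n_channels=32, rpm=600, firing_rate=20_000):
    """Lay out one revolution of a spinning LiDAR (600 rpm = 10 Hz).
    range_fn(azimuth, elevation) returns the measured range in metres
    for that ray, or None for no return; only the beam geometry is
    modelled here, not the scene rendering."""
    rev_hz = rpm / 60.0
    az_steps = int(firing_rate / rev_hz)   # azimuth columns per revolution
    azimuths = np.linspace(0, 2 * np.pi, az_steps, endpoint=False)
    elevations = np.deg2rad(np.linspace(-25, 15, n_channels))  # assumed vertical FOV
    pts = []
    for az in azimuths:
        for el in elevations:
            r = range_fn(az, el)
            if r is not None:              # convert spherical return to Cartesian
                pts.append((r * np.cos(el) * np.cos(az),
                            r * np.cos(el) * np.sin(az),
                            r * np.sin(el)))
    return np.array(pts)
```

Emitting points column by column like this is also what makes a spinning LiDAR's point cloud "rolling": points on opposite sides of the scan were captured up to half a revolution apart, which matters for fast-moving scenes.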
Radar Sensor Simulation
Radar is perhaps the most challenging sensor to model in a real-time capable simulation environment like rFpro. We deal with this challenge by scaling the radar simulation fidelity to match rFpro's real-time and fully synchronous (non-real-time) simulation modes. At Claytex, we have real-time radar sensors that use a line-of-sight detection method to find targets in the scene, as well as higher-fidelity, ray-tracing based sensors for when the simulation is not constrained by real-time operation. In our first-generation real-time capable sensors, the radar cross section of each pixel in the scene is calculated using a flat-plate approximation combined with an approximation of the material properties, to determine the strongest returns within the scene. This is all wrapped up in our generic radar sensor model, allowing you to configure the fields of view, resolutions and number of targets.
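The flat-plate approximation mentioned above can be illustrated with the standard peak formula for a flat plate, sigma = 4*pi*A^2/lambda^2. The cos^2 roll-off used here for off-normal incidence is a deliberate simplification of the real sinc-shaped lobe pattern, and the plate size is just an example.

```python
import math

def flat_plate_rcs(area_m2, wavelength_m, incidence_rad=0.0):
    """Flat-plate radar cross section: 4*pi*A^2/lambda^2 at normal
    incidence, rolled off with cos^2 of the incidence angle as a
    crude stand-in for the true sinc-shaped specular lobe."""
    peak = 4 * math.pi * area_m2**2 / wavelength_m**2
    return peak * math.cos(incidence_rad)**2

# a 77 GHz automotive radar has a wavelength of about 3.9 mm
wavelength = 3e8 / 77e9
plate = flat_plate_rcs(0.01, wavelength)  # 10 cm x 10 cm plate, normal incidence
```

Even this crude model captures the key behaviour a line-of-sight radar needs: returns grow rapidly with facet area and wavelength, and fall away as surfaces tilt out of the specular direction.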
The second-generation radar sensor models use a full multi-path modelling approach to detect objects in the scene. The new ray-tracing support within rFpro, combined with the ability to supply different material properties for every object in the scene, enables these new models to demonstrate very good correlation results, as presented here. Using a similar approach to that employed for the LiDAR sensors, we are incorporating noise effects based on data from physical testing of multiple radar sensors. If you would like to know more about these second-generation radar sensor models, please contact us.
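The simplest multi-path effect, which a pure line-of-sight model cannot reproduce, is interference between the direct ray and a single ground reflection. The two-ray sketch below is a textbook illustration of that effect, not the Claytex model; a perfect reflection coefficient of -1 is assumed for the ground.

```python
import numpy as np

def two_ray_gain(d, h_tx, h_rx, wavelength):
    """Relative received field strength for the simplest multi-path
    case: a direct ray plus one ground reflection (reflection
    coefficient -1). Interference between the two path lengths
    produces the range-dependent lobing that single-ray models miss."""
    d_direct = np.hypot(d, h_tx - h_rx)
    d_reflect = np.hypot(d, h_tx + h_rx)   # path via the ground bounce
    k = 2 * np.pi / wavelength             # wavenumber
    field = (np.exp(-1j * k * d_direct) / d_direct
             - np.exp(-1j * k * d_reflect) / d_reflect)
    return np.abs(field)
```

Sweeping `d` shows deep fades wherever the two paths arrive half a wavelength apart, which is one reason correlating a multi-path radar model against physical test data is so demanding.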
Ground Truth and Training Data
One major advantage of simulation is the ability to generate fully labelled training data for every type of sensor. Labels generated from simulation are always pixel accurate because the simulation knows exactly where every object is in the scene. This approach is called Data Farming, by analogy with the render farming that revolutionised the economics of animation. Data Farming allows you to build complete datasets covering the full vehicle system, with every sensor simulated at the same time. The data is synchronised across all the sensors and automatically labelled. You can also access object-level ground truth, such as 2D screen-space and 3D bounding boxes for every object in the sensors' view, giving you many different ways to use rFpro to support the development, training and validation of your systems.
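The link between 3D and 2D ground truth can be sketched with a pinhole projection: given the known 3D bounding box of an object, the tight 2D screen-space label falls out of projecting its corners. The camera pose and intrinsics below are hypothetical example values, not rFpro parameters.

```python
import numpy as np
from itertools import product

def project_bbox(corners_world, world_to_cam, fx, fy, cx, cy):
    """Project the 8 corners of a 3D bounding box into screen space
    and take the min/max to obtain the tight 2D box label.
    world_to_cam: 4x4 homogeneous transform; fx, fy, cx, cy: pinhole
    focal lengths and principal point in pixels."""
    pts = np.c_[corners_world, np.ones(len(corners_world))] @ world_to_cam.T
    u = fx * pts[:, 0] / pts[:, 2] + cx   # perspective divide by depth z
    v = fy * pts[:, 1] / pts[:, 2] + cy
    return (u.min(), v.min()), (u.max(), v.max())

# unit cube centred 5 m in front of a camera at the world origin
corners = np.array(list(product([-0.5, 0.5], repeat=3))) + [0.0, 0.0, 5.0]
top_left, bottom_right = project_bbox(corners, np.eye(4), 100, 100, 50, 50)
```

Because the box comes from the same scene state that drove the render, the 2D label is exact by construction; there is no human annotation step to introduce error.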
Combining this approach with rFpro's library of digital twins of public roads, test tracks and proving grounds, spanning North America, Asia and Europe, means huge quantities of training data can be generated quickly. The digital twins include multi-lane highways and urban, rural and mountain routes, all replicated faithfully from the real world using rFpro's unique 3D reconstruction process.
Full Suite of Sensor Models
It’s important not to forget all the other sensors that your vehicle uses to perceive the world around it. We also offer sensor models for ultrasound, GPS and IMUs, so that your vehicle control system can be fully immersed in the simulation environment. Please get in touch if you want to know more.
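As a flavour of what an IMU model involves, the sketch below corrupts an ideal accelerometer trace with the dominant error terms of a simple model: a constant bias, white measurement noise and a slowly drifting random-walk bias. The noise magnitudes are illustrative assumptions, not parameters of any particular device.

```python
import numpy as np

def imu_noisy(true_accel, dt, rng,
              sigma_noise=0.02, sigma_walk=0.001, bias0=0.05):
    """Add a constant bias, white noise and a random-walk bias drift
    (all in m/s^2) to an ideal accelerometer trace sampled at dt."""
    n = len(true_accel)
    walk = np.cumsum(rng.normal(0.0, sigma_walk * np.sqrt(dt), n))  # bias drift
    noise = rng.normal(0.0, sigma_noise, n)                         # per-sample noise
    return true_accel + bias0 + walk + noise
```

Feeding a control system measurements shaped like this, rather than perfect state, is what makes closed-loop validation in simulation meaningful.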
Sensors and Vehicle Dynamics
Your sensors have to operate in a noisy environment: a vehicle driving along a road generates a great deal of noise that the sensors must be able to cope with. When it comes to validating your control system, you need to make sure all of these noise factors are included in your simulation, and rFpro provides interfaces to a wide range of vehicle dynamics models, including our own VeSyMA solution.
Related Articles
- Automating the animation of pedestrian motion
To train autonomous driving systems for critical collision scenarios, the DeepSafe project aims to address the “reality gap” in these simulated scenarios. A crucial component of these scenarios is the realistic behavior of pedestrians interacting with their environment. AVSandbox uses…
- AVSandbox’s Advanced LiDAR Simulation for Safe and Effective Testing
In the ever-evolving landscape of autonomous driving, ensuring the safety and efficacy of self-driving systems is paramount. This is where simulation tools like AVSandbox come into play, offering a revolutionary solution for testing and training autonomous vehicles in a virtual…
- Emergency vehicle scenarios in AVSandbox
Incidents involving autonomous vehicles (AVs) and emergency vehicles raise many questions about how an AV should behave when it has to deal with this type of traffic vehicle in different situations, such as high-speed motorways or low-speed intersections (both…