Many companies are currently developing radar sensors for autonomous driving applications, and each system made available will have unique specifications and therefore capabilities. Gathering enough real-world radar measurements to confirm that a given sensor meets certain requirements is both time consuming and expensive. The virtual environment in rFpro allows you to bypass the costs associated with purchasing equipment and acquiring a test facility, whilst also incorporating realistic roadside factors such as the behaviour of other vehicles. Simulations within this environment can be run both continuously and concurrently, allowing a large amount of test data to be acquired in a relatively short period of time. ADAS (Advanced Driver-Assistance Systems) that rely on AI devices for cognitive behaviour require a large amount of training data prior to deployment, and the sensor catalogue developed at Claytex can be used alongside rFpro to make this data obtainable.
The systems simulated here are range-Doppler radars that scan in the azimuth plane. This type of sensing allows the range, angle and velocity of detected targets to be estimated, and hence can be used to construct an image of the scenario being measured. The important parameters set for each device are:
- Minimum and maximum range, velocity and FOV (Field-Of-View)
- Range, velocity and angular resolution
- Output image refresh rate
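As a rough illustration of how these parameters define the output image, one operating mode can be captured in a small data structure. The sketch below is hypothetical Python, not the Claytex sensor catalogue API; all field and property names are assumptions made for this example:

```python
from dataclasses import dataclass

@dataclass
class RadarSpec:
    """Illustrative parameter set for one range-Doppler radar operating mode."""
    range_res_m: float        # range resolution (m)
    max_range_m: float        # maximum detection range (m)
    ang_res_deg: float        # azimuth angular resolution (deg)
    half_fov_deg: float       # half horizontal FOV, i.e. the +/- extent (deg)
    frame_rate_hz: float      # output image refresh rate (Hz)
    vel_res_mps: float = 0.0  # velocity (Doppler) resolution (m/s), if known
    max_vel_mps: float = 0.0  # maximum unambiguous velocity (m/s), if known

    @property
    def n_range_bins(self) -> int:
        # number of range cells along one image axis
        return round(self.max_range_m / self.range_res_m)

    @property
    def n_angle_bins(self) -> int:
        # number of azimuth cells across the full field of view
        return round(2 * self.half_fov_deg / self.ang_res_deg)
```

The derived bin counts give a feel for the size of image each mode produces: coarser resolution or a narrower FOV means fewer cells per frame.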
This demo uses the manufacturer specifications of three commercially available radar systems and simulates the output image produced by each. One of these systems (Model C) has two modes of operation to meet the requirements of short-, medium- and long-range applications, whereas the other two devices (Model A and Model B) operate in a single mode.
Table 1 below shows the range, angular and frame-rate parameters of the three radar systems under test:
| Manufacturer | Model | Mode | Range resolution (m) | Max range (m) | Angular resolution (deg) | Hor. FOV (deg) | Frame rate (Hz) |
|---|---|---|---|---|---|---|---|
| 1 | A | Long/Med range | 1.5 | 200 | 2 | ±25 | 20 |
| 1 | B | Short range | 1.2 | 80 | 1.5 | ±60 | 20 |
| 2 | C | Med/Short range | 0.66 | 64 | 1.5 | ±50 | 18 |
| 2 | C | Long range | 1.8 | 175 | 4 | ±16 | 18 |
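Encoded with the hypothetical `RadarSpec` structure sketched above, the four operating modes in Table 1 become directly comparable in automated test scripts:

```python
# Table 1 values; the velocity parameters are left at their defaults as they
# are not listed in the table.
model_a       = RadarSpec(range_res_m=1.5,  max_range_m=200, ang_res_deg=2.0, half_fov_deg=25, frame_rate_hz=20)
model_b       = RadarSpec(range_res_m=1.2,  max_range_m=80,  ang_res_deg=1.5, half_fov_deg=60, frame_rate_hz=20)
model_c_short = RadarSpec(range_res_m=0.66, max_range_m=64,  ang_res_deg=1.5, half_fov_deg=50, frame_rate_hz=18)
model_c_long  = RadarSpec(range_res_m=1.8,  max_range_m=175, ang_res_deg=4.0, half_fov_deg=16, frame_rate_hz=18)

print(model_a.n_range_bins, model_a.n_angle_bins)  # 133 range cells, 25 azimuth cells
```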
This video shows the raw output images in real time when using the Model A and Model B devices on board a moving vehicle within the Paris Streets map in rFpro:
The video below shows the Model C system collecting data on the same map whilst operating in the short-range and long-range modes:
The colour of each detection within a radar image relates to the relative velocity between the ego vehicle and the detection. Moving vehicles become distinct from stationary objects due to the Doppler effect, and any acceleration or deceleration of the ego vehicle will also change the relative velocity of every detection.
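The exact colour mapping is tool specific, but the idea can be sketched as follows: project the relative velocity of a detection onto the line of sight, then map the result onto a diverging colour scale. Both function names below are hypothetical:

```python
import numpy as np

def relative_radial_velocity(ego_pos, ego_vel, tgt_pos, tgt_vel):
    """Radial component of a target's velocity relative to the ego vehicle.

    All inputs are 2D numpy arrays (x, y) in a common world frame.
    Negative values mean the target is closing, positive means receding;
    a stationary world object appears to move at -ego_vel.
    """
    los = tgt_pos - ego_pos           # line of sight to the target
    los = los / np.linalg.norm(los)   # unit vector
    return float(np.dot(tgt_vel - ego_vel, los))

def velocity_to_rgb(v_rel, v_max):
    """Map relative radial velocity onto a blue (closing) / red (receding) scale."""
    t = float(np.clip(v_rel / v_max, -1.0, 1.0))  # normalise to [-1, 1]
    r = 1.0 if t >= 0 else 1.0 + t    # red fades out for closing targets
    b = 1.0 if t <= 0 else 1.0 - t    # blue fades out for receding targets
    g = 1.0 - abs(t)                  # zero relative velocity renders white
    return (r, g, b)
```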
Analysing the performance of a cognitive system that consumes the output images of a simulated sensor set will tell an ADAS system designer whether the configuration under test meets the requirements set. In terms of the sensors in Table 1, if the two-mode system from Manufacturer 2 is deemed adequate then both a cost and a packaging-space saving can be made, since one device replaces two. A virtual environment with accurate sensor models makes this assessment possible at a much lower cost than physical testing.
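Such a pass/fail check of each mode against the application requirements can itself be automated; the function below is a minimal sketch using the hypothetical objects from the earlier examples, with thresholds made up for illustration:

```python
def meets_requirements(spec, req_max_range_m, req_half_fov_deg, req_frame_rate_hz):
    """Hypothetical check of one radar mode against minimum application requirements."""
    return (spec.max_range_m >= req_max_range_m
            and spec.half_fov_deg >= req_half_fov_deg
            and spec.frame_rate_hz >= req_frame_rate_hz)

# Does Model C's long-range mode cover a 150 m, ±15 deg, 15 Hz requirement?
print(meets_requirements(model_c_long, 150, 15, 15))  # True
```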
Written by Ben Willetts – Project Engineer: Sensor Modelling
Please get in touch if you have any questions or suggestions relating to this or any other blog post. You can submit your questions / suggestions via: Tech Blog Questions / Topic Suggestion.