Article 5 of 7 in our Content Series: How can content from third-party ADAS or vehicle dynamics engineering tools be converted for use in Cruden’s driving simulators? Dennis Marcus reveals all.
The Panthera software suite is the beating heart of any Cruden driving simulator. It does much more than control the motion platform or provide the right torque feedback to the steering wheel, however. It also renders the visual element of the simulation.
The motion is the most intriguing – especially to mechanical engineers! – but the visual input is what brings a simulator to life. All other inputs are secondary, and they must be aligned properly with the visual cues.
As with so much of what Cruden does, Panthera’s ability to integrate all the different elements of the simulator represents a unique approach. This seamless integration is key to achieving low latency and to synchronizing all the feedback channels properly. For the most immersive experience, the simulator must respond immediately to the driver’s commands, not a few milliseconds later.
Panthera also integrates third-party automotive engineering tools. Much of a driving simulator’s value lies in its ability to integrate with vehicle dynamics models such as IPG CarMaker, CarSim or dSPACE ASM, as well as with tools for sensor simulation, traffic simulation, hardware integration and more. All of these tools are also used in offline simulation – without the driving simulator – and not necessarily in real time. Some simulations can be run much faster than real time and therefore more efficiently.
Offline simulations are often conducted in 3D environments that come ready-made with the tools, or that have been created with the editors bundled with them. These 3D sceneries may be optimized for a particular sensor – such as LiDAR, radar or ultrasonic – that works differently to the human eye. The tools also generate a visualization of the 3D world so that, for example, the ADAS sensor engineer can verify the experiment, even when the car is driving automatically on a predefined path.
Lots of work goes into creating 3D environments that are relevant to certain types of testing and validation, so when you connect the engineering tools to the driving simulator, it can be useful to run experiments over the same routes that were used before. The problem is that the graphics in the engineering tools are of much lower quality than those of a driving simulator. That’s not necessarily an issue for the sensor engineer, but the graphics won’t prompt an authentic response from an inexperienced human driver. (To read why it’s important to distinguish between these two different needs, see our earlier article.)
To do the job properly, the 3D worlds from the engineering tools must be converted into something of a higher quality. This can be achieved in two ways:
The first is conversion in-house by the graphics artists of the Cruden Content Studio. Engineers send in their scenery files and our 3D artists convert them into something with the immersive detail of the content we create from scratch. The starting point, often contained in an OpenDRIVE file, is usually only the geometry – the trajectory of the road, elevation changes and so on.
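To make the OpenDRIVE starting point concrete: an .xodr file is XML that describes a road’s reference line as a chain of geometry segments (lines, arcs, spirals) and its elevation as piecewise cubic polynomials in the distance along the road. The sample file and helper functions below are an illustrative sketch only – they are not part of Panthera or any Cruden tool – showing how this geometry can be read with Python’s standard XML parser.

```python
# Minimal sketch: reading road geometry from an OpenDRIVE (.xodr) file.
# The reference line is stored as <geometry> segments whose first child
# names the primitive (line, arc, spiral, ...); elevation is stored as
# cubic polynomials a + b*ds + c*ds^2 + d*ds^3 in the arc length ds.
import xml.etree.ElementTree as ET

# Hand-written toy example, not from a real tool export.
SAMPLE = """<OpenDRIVE>
  <road name="testTrack" length="100.0" id="1" junction="-1">
    <planView>
      <geometry s="0.0" x="0.0" y="0.0" hdg="0.0" length="60.0"><line/></geometry>
      <geometry s="60.0" x="60.0" y="0.0" hdg="0.0" length="40.0"><arc curvature="0.01"/></geometry>
    </planView>
    <elevationProfile>
      <elevation s="0.0" a="0.0" b="0.02" c="0.0" d="0.0"/>
    </elevationProfile>
  </road>
</OpenDRIVE>"""

def road_segments(root):
    """Yield (start s, primitive type, length) for each reference-line piece."""
    for geom in root.iter("geometry"):
        kind = geom[0].tag  # first child element names the primitive
        yield float(geom.get("s")), kind, float(geom.get("length"))

def elevation_at(root, s):
    """Evaluate the cubic elevation polynomial covering arc length s."""
    active = None
    for e in root.iter("elevation"):
        if float(e.get("s")) <= s:
            active = e  # last record starting before s applies
    ds = s - float(active.get("s"))
    a, b, c, d = (float(active.get(k)) for k in "abcd")
    return a + b * ds + c * ds ** 2 + d * ds ** 3

root = ET.fromstring(SAMPLE)
print(list(road_segments(root)))  # layout of the reference line
print(elevation_at(root, 50.0))   # road height 50 m along the road
```

Note how little is in the file: a centerline and a height profile, but no textures, roadside objects or lighting. That gap is exactly what the conversion work described here has to fill.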
Converting the graphics into a format that can be read by Panthera’s faster rendering engine immediately raises the frame rate from 60 frames per second to 120 (see our previous article for more on frame rates). Next comes the manual labor – sometimes weeks of work, adding realistic asphalt and tar lines to plain, grey road surfaces, for example – to create a more immersive environment for the human drivers in the simulator.
When the 3D world is finished, the engineer can run tests on sensor models, autonomous driving or ADAS systems with a driver in the loop, using exactly the same scenarios that were used in their original experiments.
The second is to use an interactive scene-building tool such as VectorZero RoadRunner. Which route you choose depends on your expectations for the 3D world that you’ll be using in a particular experiment. Expert drivers won’t need such detailed graphics, but if you’re trying to figure out how people interact with an ADAS system that is almost ready to roll out, you’ll typically experiment with large groups of inexperienced drivers. The purpose is to monitor how the test participants respond to the ADAS system, so it’s essential to provide a much more realistic, immersive virtual world in which the simulator drivers behave naturally. Either way, Cruden has you covered!
For more information on the topics covered in this article, please contact Dennis Marcus via d.marcus@cruden.com or on +31 20 707 4646.
Other articles in the series:
Article 1: 3D content for driving simulators – all you need to know! (Intro)
Article 2: How we build 3D tracks and geographic databases for driving simulators
Article 3: Engineering v human-centric visuals for simulation
Article 4: Blockbuster content on a driving simulator near you!
Article 6: Rendering a world of new possibilities
Article 7: Not just for billboards - why LED panel technology is the future for high-end driving simulators