
Simulation

Simulation provides safe, repeatable conditions in which users can develop, test, and train autonomous algorithms. Autonomy algorithms can be trained and debugged in simulation before the robot is brought to the physical test track, or even before the hardware is complete. Test time can be optimized by running thousands of simulated tests in place of field tests. Simulation can also make some safety tests more realistic than field testing allows. For example, testing obstacle detection and avoidance algorithms is much safer when a pedestrian walking, falling, or kneeling in front of the robot is simulated rather than played by a human test subject.

CAVS is developing a physics-based simulator for sensors, environments, and vehicle dynamics and mobility that will enable real-time simulation of closed-loop autonomous performance. Featuring MPI-based communication between components, the environment will be scalable and able to simulate large numbers of robots interacting with one another. The software framework will also allow different sensor and vehicle-dynamics codes to be used within the simulator. The sensor simulator will feature a high-fidelity ray-tracing engine for simulating realistic LIDAR returns from vegetation, GPS pop and drift error, and complex lighting effects, using MSU's supercomputing resources to perform all computations in real time.
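To make the closed-loop, message-passing structure described above more concrete, the sketch below shows a minimal two-process co-simulation loop. It is a hypothetical illustration using mpi4py, not the CAVS framework's actual code or API: rank 0 integrates a toy point-mass vehicle model, rank 1 stands in for a sensor/autonomy process that corrupts the true position with a random-walk ("drift") GPS error plus measurement noise and replies with a control command, and the two exchange messages once per simulated time step.

    # Hypothetical MPI co-simulation sketch (not the CAVS framework).
    # Rank 0: vehicle dynamics.  Rank 1: sensor model + toy autonomy.
    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()

    DT = 0.01      # simulation time step [s]
    STEPS = 1000   # number of closed-loop iterations

    if rank == 0:
        # Vehicle-dynamics process: integrate a trivial point-mass model and
        # send the true vehicle state to the sensor process each step.
        state = np.zeros(4)                      # [x, y, vx, vy]
        for step in range(STEPS):
            state[:2] += state[2:] * DT          # advance position
            comm.send(state.copy(), dest=1, tag=step)
            cmd = comm.recv(source=1, tag=step)  # acceleration command back
            state[2:] += cmd * DT
    elif rank == 1:
        # Sensor/autonomy process: receive the true state, simulate a GPS fix
        # with slow random-walk drift plus per-fix noise, and reply with a
        # toy "drive toward the origin" control command.
        drift = np.zeros(2)
        rng = np.random.default_rng(0)
        for step in range(STEPS):
            state = comm.recv(source=0, tag=step)
            drift += rng.normal(scale=0.02, size=2)                  # accumulated drift
            gps_fix = state[:2] + drift + rng.normal(scale=0.5, size=2)
            cmd = -0.1 * gps_fix
            comm.send(cmd, dest=0, tag=step)

With an MPI implementation and mpi4py installed, such a script could be launched as, for example, mpiexec -n 2 python cosim_sketch.py (hypothetical filename). A real framework would replace the toy models with full vehicle-dynamics and ray-traced sensor codes, but the per-step exchange pattern is the same.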


Figures taken from: C. Goodin et al., "Unmanned Ground Vehicle Simulation with the Virtual Autonomous Navigation Environment," 2017 International Conference on Military Technologies (ICMT), IEEE, 2017.