Data fusion from a range of sensors — GNSS, inertial, LiDAR, radar and more — to enable autonomous navigation is a highly complex operation that requires sophisticated software stacks. A free webinar on October 27 presents leading-edge research and development in this area, highlighting modular, ready-to-use solutions. Three expert panelists offer diverse, complementary perspectives on this dynamic, rewarding field.
Fusion with Installed Sensors
Sensors already installed on most modern vehicles can be exploited via the CAN bus for positioning. These include low-resolution odometry (DMI) and consumer-grade IMUs currently used for dynamic stability control and wheel-slip detection. A novel approach to combining vehicle speed, steering angles, transmission settings and multiple odometry inputs demonstrates what is achievable in a GNSS-denied environment. A 90% performance improvement over a standalone GNSS/INS solution shows promise for future production models.
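To make the idea concrete, here is a minimal sketch of the kind of dead reckoning that CAN-bus speed and steering-angle signals enable. The kinematic bicycle model, the function name and the wheelbase value are illustrative assumptions for this sketch, not the panelist's actual method:

```python
import math

def dead_reckon(pose, speed, steering_angle, dt, wheelbase=2.7):
    """Propagate a 2-D pose (x, y, heading) one time step using a
    kinematic bicycle model.

    speed          -- vehicle speed from the CAN bus, m/s
    steering_angle -- front-wheel steering angle, rad
    dt             -- time step, s
    wheelbase      -- front-to-rear axle distance, m (illustrative value)
    """
    x, y, theta = pose
    x += speed * math.cos(theta) * dt
    y += speed * math.sin(theta) * dt
    theta += speed * math.tan(steering_angle) / wheelbase * dt
    return (x, y, theta)

# Drive straight for 10 s at 10 m/s with no steering input.
pose = (0.0, 0.0, 0.0)
for _ in range(100):
    pose = dead_reckon(pose, 10.0, 0.0, 0.1)
print(pose)  # advances 100 m along the initial heading
```

A production system would fuse these propagated states with GNSS, IMU and per-wheel odometry measurements in a filter; this sketch shows only the CAN-signal propagation step.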
Radar-based Parking
An integrated radar-based localization system that supports Level 4 autonomous driving performs well in automated parking inside covered parking garages. The system integrates automotive radars and dead reckoning technologies supported by high-definition (HD) maps to offer decimeter-level positioning accuracy.
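One way such a system can exploit an HD map is to match radar returns against surveyed landmarks (for example, garage pillars) and correct the dead-reckoned position with the residuals. The following is a toy sketch of that matching step under simplifying assumptions (known map frame, nearest-neighbor association, mean-residual correction); it is not the integrated system described above:

```python
def map_match_correction(estimate, detections, landmarks):
    """Nudge a dead-reckoned 2-D position toward an HD-map landmark set.

    estimate   -- (x, y) dead-reckoned position
    detections -- radar returns as (x, y) points in the map frame
    landmarks  -- surveyed landmark positions from the HD map

    Each detection is associated with its nearest map landmark, and the
    mean residual is applied as a position correction.
    """
    ex, ey = estimate
    dx_sum = dy_sum = 0.0
    for px, py in detections:
        nearest = min(landmarks,
                      key=lambda lm: (lm[0] - px) ** 2 + (lm[1] - py) ** 2)
        dx_sum += nearest[0] - px
        dy_sum += nearest[1] - py
    n = len(detections)
    return (ex + dx_sum / n, ey + dy_sum / n)

# Two garage pillars surveyed at (0, 0) and (10, 0); radar sees both
# shifted by the same dead-reckoning drift of (+0.3, +0.1).
corrected = map_match_correction(
    estimate=(5.3, 0.1),
    detections=[(0.3, 0.1), (10.3, 0.1)],
    landmarks=[(0.0, 0.0), (10.0, 0.0)],
)
print(corrected)  # drift removed: (5.0, 0.0)
```

Real implementations weight these residuals in a filter alongside the dead-reckoning states rather than applying them directly.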
What’s Out There
Several sources of applicable data and versatile software are available for autonomous vehicle applications: full software stacks, convolutional neural networks (CNNs) ready to integrate into those stacks, datasets for training new CNNs, simulators and simulation environments, recorded data, annotated data, and more. For example, Autoware is a well-supported open-source project, with many large companies collaborating on its continued improvement and development.
Register now for the free October 27 webinar, Sensor Fusion in Autonomous Vehicles.
Expert panelists:
Ryan Dixon is the Sensor Fusion and Autonomy Lead in NovAtel’s Applied Research group. In this role he is responsible for exploring sensor fusion methods and relating them to autonomy applications. Prior to this he was Chief Engineer of the SPAN GNSS/INS products group at NovAtel, responsible for the dedicated team maintaining and enhancing NovAtel’s inertial product portfolio.
Dr. Aboelmagd Noureldin received bachelor's and master's degrees in engineering physics from Cairo University and a Ph.D. in electrical and computer engineering from the University of Calgary. He is currently a Cross-Appointment Professor with the Departments of Electrical and Computer Engineering at the Royal Military College of Canada and Queen's University, Kingston, Ontario, Canada.
David van Geyn is the Open Autonomy Engineering Manager in the Products and Services group at Hexagon | AutonomouStuff, based in Ottawa, Ontario, Canada. He manages a team of software engineers that work on multiple open-source autonomous vehicle stacks, customer deployments/projects using those stacks, and sensor drivers, as well as other products such as the AutonomouStuff Shuttle. David has a B.Cmp.H. and M.Sc. from Queen’s University at Kingston and over 10 years of experience in the industry, having worked on automotive software and research for autonomous vehicles.
Register now for the Tuesday, October 27 webinar.