Learning Robot Navigation: from Perception to Action
Davide Scaramuzza
ETH Zürich, CH
Abstract
What does it take to build autonomous mobile robotic systems that are safer and more efficient than humans? I will present the standard robotic autonomy architecture and show how perception and control are tightly coupled. I will then discuss the role of machine learning in improving perception, modeling, and control, and how simulation can be used to train robust algorithms that transfer to the real world with minimal adaptation. Finally, I will introduce neuromorphic, event-based cameras: bio-inspired vision sensors with much lower latency, higher dynamic range, and much lower power consumption than standard cameras, which open the door to new types of applications and opportunities for robotics and computer vision.