Title: DragonFly: A Localization and Perception Module for Reliable and Affordable
Autonomous Driving
Abstract: Autonomous driving technologies have attracted enormous interest from both
academia and industry in the last few years. By equipping vehicles with multi-beam
LiDAR, complicated tasks can be handled autonomously, but the high cost of the LiDAR
module prevents widespread adoption of this solution. To address this, we propose an
affordable alternative that uses our DragonFly module to fulfill autonomous driving
tasks. This talk gives a thorough introduction to the DragonFly module, which
comprises four HD (1080p) global-shutter cameras (a stereo pair in the front and a
stereo pair in the back), an IMU, an NVIDIA Jetson TX1 computing submodule, and a
rich set of I/O interfaces. It provides three functions for autonomous driving:
perception, localization, and panoramic video streaming, while maintaining solid
performance. Several demos will be shown as well, including localization via a
visual-inertial odometry (VIO) SLAM algorithm and perception of real-time road
conditions, covering vehicle and pedestrian detection, spatial information about the
environment, and depth estimates for vehicles and pedestrians.
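
For context on how a stereo pair yields depth information, the sketch below shows a minimal disparity-to-depth computation with OpenCV. It is an illustration only, not the DragonFly API: the image file names, focal length, and baseline are placeholder assumptions.

import cv2
import numpy as np

# Placeholder file names; any rectified stereo pair will do.
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Semi-global block matching; numDisparities must be a multiple of 16.
matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # fixed-point output

# Depth from disparity: Z = f * B / d, with focal length f in pixels and
# baseline B in meters (both values below are assumed, not from the talk).
focal_px = 700.0
baseline_m = 0.12
valid = disparity > 0
depth_m = np.zeros_like(disparity)
depth_m[valid] = focal_px * baseline_m / disparity[valid]

print("median depth of valid pixels: %.2f m" % np.median(depth_m[valid]))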
Bio: Zhiliu Yang is currently a researcher at PerceptIn. He received his B.E. in
Electrical Engineering from Yunnan University and his M.S. in Electrical Engineering
from Beijing Institute of Technology. He is interested in computer vision, deep
learning, and high-performance computing.