TY - GEN
T1 - Fleye on the car
T2 - 14th International Symposium on Information Processing in Sensor Networks, IPSN 2015
AU - Nasser, Soliman
AU - Barry, Andrew
AU - Doniec, Marek
AU - Peled, Guy
AU - Rosman, Guy
AU - Rus, Daniela
AU - Volkov, Mikhail
AU - Feldman, Dan
PY - 2015/4/13
Y1 - 2015/4/13
N2 - Vehicle-based vision algorithms, such as collision alert systems [4], can interpret a scene in real time and provide drivers with immediate feedback. However, such technologies rely on cameras mounted on the car and are therefore limited to its immediate vicinity, which severely limits their potential. They cannot find empty parking slots, bypass traffic jams, or warn about dangers beyond the car's immediate surroundings. An intelligent driving system augmented with additional sensors and network inputs may significantly reduce the number of accidents, relieve traffic congestion, and improve the safety and quality of people's lives. We propose an open-source system, called Fleye, that consists of an autonomous drone (a nano quadrotor) that carries a radio camera and flies a few meters in front of and above the car. The streaming video is transmitted in real time from the quadcopter to Amazon's EC2 cloud, together with information about the driver, the drone, and the car's state. The output is then transmitted to the driver's "smart glasses". The control of the drone, as well as the collection of sensor data from the driver, is handled by a low-cost (<$30) minicomputer. Most computation is done in the cloud, allowing straightforward integration of multiple vehicles' behaviors and additional sensors, as well as greater computational capability.
AB - Vehicle-based vision algorithms, such as collision alert systems [4], can interpret a scene in real time and provide drivers with immediate feedback. However, such technologies rely on cameras mounted on the car and are therefore limited to its immediate vicinity, which severely limits their potential. They cannot find empty parking slots, bypass traffic jams, or warn about dangers beyond the car's immediate surroundings. An intelligent driving system augmented with additional sensors and network inputs may significantly reduce the number of accidents, relieve traffic congestion, and improve the safety and quality of people's lives. We propose an open-source system, called Fleye, that consists of an autonomous drone (a nano quadrotor) that carries a radio camera and flies a few meters in front of and above the car. The streaming video is transmitted in real time from the quadcopter to Amazon's EC2 cloud, together with information about the driver, the drone, and the car's state. The output is then transmitted to the driver's "smart glasses". The control of the drone, as well as the collection of sensor data from the driver, is handled by a low-cost (<$30) minicomputer. Most computation is done in the cloud, allowing straightforward integration of multiple vehicles' behaviors and additional sensors, as well as greater computational capability.
KW - Collision alert system
KW - Internet of Things
KW - Quadrotors
KW - Video streaming
UR - http://www.scopus.com/inward/record.url?scp=84954109060&partnerID=8YFLogxK
U2 - https://doi.org/10.1145/2737095.2742919
DO - 10.1145/2737095.2742919
M3 - Conference contribution
T3 - IPSN 2015 - Proceedings of the 14th International Symposium on Information Processing in Sensor Networks (Part of CPS Week)
SP - 382
EP - 383
BT - IPSN 2015 - Proceedings of the 14th International Symposium on Information Processing in Sensor Networks (Part of CPS Week)
Y2 - 13 April 2015 through 16 April 2015
ER -