In 2018, the first unmanned vehicles in Russia appeared on the roads of Innopolis. According to the developers of intelligent transport, in 5-10 years, any manager will be able to replace an ordinary car with a full-time driver with an autonomous vehicle.
Salimzhan Gafurov, head of the Autonomous Transport Systems Laboratory at Innopolis University, which is part of the NTI Competence Center in Robotics and Mechatronics, told us how the drone works and who is responsible for driving safety.
How the robotic car copes with the problems of Russian roads: reckless drivers and off-road conditions
What Does A Drone Have Instead Of Eyes And Driving Skills?
The car can drive without human intervention thanks to its autonomous control system. Dedicated automation systems collect information about the car’s position on the road, analyze it, and select the optimal driving scenario. Let’s see how this happens in practice.
Instead of the experience and knowledge of an ordinary driver, the drone has a software-controlled server. The program of the car includes the rules of the road.
An infinite number of situations can occur on the road, and for each one there are scenario options describing how to act.
The server receives information from external sensors. The computer analyzes it, compares it with its database, and selects an action scenario. The database is constantly updated: if an unusual situation occurs on the road, the operator describes it, experts list all possible options for action, and developers update the database.
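The lookup-and-fallback logic described above can be sketched in a few lines. This is an illustrative toy, not the actual Innopolis software: the situation fields, the database entries, and the similarity threshold are all invented for the example.

```python
# Toy sketch: match the current road situation against a database of known
# situations and fall back to a safe stop (reported to operators) when
# nothing in the database is similar enough. All names and numbers here
# are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Situation:
    obstacle_distance_m: float   # distance to the nearest obstacle
    obstacle_speed_mps: float    # obstacle speed relative to the car

# Database of known situations mapped to driving scenarios.
SCENARIO_DB = [
    (Situation(5.0, 0.0), "emergency_brake"),
    (Situation(30.0, 0.0), "slow_down"),
    (Situation(100.0, 10.0), "keep_lane"),
]

def similarity(a: Situation, b: Situation) -> float:
    """Smaller value = more similar situations."""
    return (abs(a.obstacle_distance_m - b.obstacle_distance_m)
            + abs(a.obstacle_speed_mps - b.obstacle_speed_mps))

def select_scenario(current: Situation, max_score: float = 50.0) -> str:
    best, best_score = None, float("inf")
    for known, scenario in SCENARIO_DB:
        score = similarity(current, known)
        if score < best_score:
            best, best_score = scenario, score
    # Unfamiliar situation: stop safely and report it, mirroring how
    # operators and experts later add the new case to the database.
    if best_score > max_score:
        return "safe_stop_and_report"
    return best

print(select_scenario(Situation(6.0, 0.0)))     # close, stationary obstacle
print(select_scenario(Situation(500.0, 50.0)))  # nothing similar in the DB
```

The key point the sketch captures is that an unrecognized situation is not handled blindly; it triggers a safe default and a report, which is how the database grows.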
How a Self-Driving Car “Sees” What Is Happening Around
Radars, antennas, and cameras replace the eyes of the drone. The camera lets the drone see an object; the radar lets it “feel” one. The radar receives the reflected signal, converts it into a three-dimensional image, and selects a similar object from the database. The radar can distinguish a person from a car and works in poor visibility conditions.
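The physics behind the radar’s “feeling” can be shown with two textbook formulas: the echo’s round-trip time gives the range, and its Doppler shift gives the target’s radial speed. The 77 GHz carrier and the sample values are illustrative assumptions for an automotive radar, not specifics from the article.

```python
# Minimal sketch of radar ranging and Doppler speed measurement.
# Values and the 77 GHz carrier frequency are illustrative assumptions.
C = 299_792_458.0  # speed of light, m/s

def range_from_delay(round_trip_s: float) -> float:
    """Distance to the target from the echo's round-trip time."""
    return C * round_trip_s / 2.0

def speed_from_doppler(doppler_hz: float, carrier_hz: float = 77e9) -> float:
    """Radial speed of the target from the Doppler shift of the echo."""
    return doppler_hz * C / (2.0 * carrier_hz)

print(range_from_delay(2.0e-7))    # 200 ns echo -> roughly 30 m
print(speed_from_doppler(5133.3))  # roughly 10 m/s closing speed
```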
But the central organ of vision of the drone is not the camera – it is a lidar (laser radar) mounted on the roof. It scans the surrounding space, forms a three-dimensional image, and transmits it to the car’s brain. The viewing angle of the lidar is 360°.
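How a 360° sweep becomes a three-dimensional image can be sketched with basic trigonometry: each lidar return is a distance measured at a known horizontal (azimuth) and vertical (elevation) angle, which converts directly to Cartesian coordinates. The scan values below are made up for illustration.

```python
# Toy sketch: convert lidar returns (distance + beam angles) into 3D points.
# The sparse "revolution" below is invented example data.
import math

def polar_to_xyz(distance_m, azimuth_deg, elevation_deg):
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = distance_m * math.cos(el) * math.cos(az)
    y = distance_m * math.cos(el) * math.sin(az)
    z = distance_m * math.sin(el)
    return (x, y, z)

# A few returns from one revolution: (distance, azimuth 0-360°, elevation).
scan = [(10.0, 0.0, 0.0), (10.0, 90.0, 0.0), (5.0, 180.0, -15.0)]
point_cloud = [polar_to_xyz(*ret) for ret in scan]
for point in point_cloud:
    print(point)
```

A real lidar fires tens of beams thousands of times per revolution; the resulting point cloud is what the car’s computer interprets as its three-dimensional surroundings.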
The car determines its location and direction of movement using the GLONASS, GPS, BeiDou, and Galileo satellite navigation systems – it all depends on the number of visible satellites. The built-in GNSS (global navigation satellite system) receiver itself determines which service to choose. A localization system is built into the car to prevent the drone from losing orientation in conditions of weak signal.
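Since the article says the choice of service depends on the number of visible satellites, the simplest form of that decision looks like this. Real receivers typically fuse several constellations at once; the visibility counts below are invented for the example.

```python
# Illustrative sketch: pick the satellite constellation with the most
# visible satellites. The counts in `visible` are invented example data.
visible = {"GPS": 7, "GLONASS": 9, "Galileo": 5, "BeiDou": 6}

def best_constellation(counts: dict) -> str:
    """Return the constellation name with the highest visible-satellite count."""
    return max(counts, key=counts.get)

print(best_constellation(visible))  # GLONASS in this example
```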
In addition to information from the satellites, it considers data from visual and wheel odometry (the speed of each of the car’s wheels) and from the inertial measurement unit (IMU) of the internal inertial navigation system.
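The fallback described above is classic dead reckoning: when the satellite signal is weak, the car integrates wheel-odometry speed along the heading reported by the IMU to keep its position estimate alive. The timestep and sensor values below are illustrative assumptions, not parameters of the Innopolis system.

```python
# Minimal dead-reckoning sketch for the GNSS-denied case: advance the
# position estimate from wheel speed and IMU heading. Example values
# are invented for illustration.
import math

def dead_reckon(x, y, speed_mps, heading_rad, dt_s):
    """One integration step: move along the current heading at wheel speed."""
    x += speed_mps * math.cos(heading_rad) * dt_s
    y += speed_mps * math.sin(heading_rad) * dt_s
    return x, y

# Drive straight east (heading 0 rad) at 10 m/s for five one-second steps.
x, y = 0.0, 0.0
for _ in range(5):
    x, y = dead_reckon(x, y, 10.0, 0.0, 1.0)
print(x, y)  # (50.0, 0.0)
```

Dead reckoning drifts over time, which is why it is only a bridge: once the satellite fix returns, it corrects the accumulated error.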