Tesla has revealed that the latest version of its semi-autonomous Autopilot system will be even more reliant on radar – and that’s both a good and a bad thing. In a new blog post, Tesla says the latest version of Autopilot will use “advanced signal processing to create a picture of the world using the onboard radar,” and will use machine learning to improve over time.
At the moment, Tesla’s Autopilot uses a combination of camera and radar sensors, with the camera as the primary sensor and the radar in a supporting role. With the new change, Tesla says: “The car should almost always hit the brakes correctly even if a UFO were to land on the freeway in zero-visibility conditions.” What’s more, Tesla thinks the system could eventually be used to bounce signals under other vehicles, and stop for hazards in front of other cars.
The reason for the change is simple, but getting it to work is much harder. Unlike cameras, radar is great at detecting objects in fog, dust, rain and snow – but it’s also fooled by objects a camera can easily identify. For example, Tesla’s blog says: “The radar can see people, but they appear partially translucent. Something made of wood or painted plastic, though opaque to a person, is almost as transparent as glass to radar.”
To get around these issues, Tesla has introduced several tweaks to make its radar system more intelligent. First, the radar in Tesla cars will now sample six times as many points as before, allowing the Autopilot system to build up a significantly more detailed picture of the situation – despite using the same hardware.
Second, Tesla’s new Autopilot will assemble these radar snapshots into a 3D picture every tenth of a second, letting it reconstruct a far richer virtual environment. By comparing successive frames and factoring in data such as the car’s velocity and direction, Autopilot will be better at predicting what to avoid and what to brake for.
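The frame-comparison idea can be illustrated with a toy sketch: take two radar “snapshots” a tenth of a second apart, work out how fast a detected object is closing, and decide whether braking is needed. The thresholds and function names below are illustrative assumptions, not Tesla’s actual code.

```python
FRAME_INTERVAL_S = 0.1  # one 3D radar picture every tenth of a second

def closing_speed(prev_range_m, curr_range_m):
    """How fast a detected object is approaching, in m/s.
    Range is the radar-measured distance to the object; the real
    system would also fuse the car's own velocity and heading."""
    return (prev_range_m - curr_range_m) / FRAME_INTERVAL_S

def should_brake(prev_range_m, curr_range_m, min_gap_m=30.0,
                 min_ttc_s=2.0):
    """Brake if the object is too close or closing too quickly."""
    speed = closing_speed(prev_range_m, curr_range_m)
    if speed <= 0:
        return False  # object is holding distance or pulling away
    time_to_collision = curr_range_m / speed
    return curr_range_m < min_gap_m or time_to_collision < min_ttc_s

# A car 50m ahead that was 52m away one frame ago is closing at
# 20 m/s: 2.5s to impact and outside the 30m danger gap, so no
# braking yet.
print(should_brake(52.0, 50.0))  # False
```

The point of comparing frames is that a single radar return can’t distinguish a stationary overhead sign from a stopped car; the change in range over time is what carries the information.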
Lastly, Tesla’s Autopilot will also use a form of machine learning. Tesla says the radar will initially log information in the background, comparing what it would have done with what the driver actually does. For example, if a driver using Autopilot has to overrule the system by braking, the data and location of the incident will be recorded and sent to the cloud. Once fleet data shows the system would no longer brake falsely at a given spot, the object and area will be added to Tesla’s geocoded whitelist. That way, Tesla’s Autopilot will slowly learn to differentiate hazards from other objects over time.
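The fleet-learning loop described above can be sketched as a geocoded whitelist keyed by rounded GPS coordinates. The threshold, rounding precision and class design here are illustrative guesses, not Tesla’s actual implementation.

```python
from collections import defaultdict

WHITELIST_AFTER = 100  # safe passes needed before trusting a location

class GeocodedWhitelist:
    """Toy model of learning where radar braking alerts are false alarms."""

    def __init__(self):
        self.safe_passes = defaultdict(int)
        self.whitelisted = set()

    def _key(self, lat, lon):
        # Round coordinates so nearby detections share one bucket
        # (an assumed simplification, roughly an 11m grid).
        return (round(lat, 4), round(lon, 4))

    def record_pass(self, lat, lon, radar_wanted_brake, driver_braked):
        """Log what the system would have done vs. what the driver did."""
        key = self._key(lat, lon)
        if radar_wanted_brake and not driver_braked:
            # A false alarm the driver ignored: evidence this spot
            # is safe to pass without braking.
            self.safe_passes[key] += 1
            if self.safe_passes[key] >= WHITELIST_AFTER:
                self.whitelisted.add(key)
        elif driver_braked:
            # The driver overruled the system: reset and keep
            # treating this location as a genuine hazard.
            self.safe_passes[key] = 0
            self.whitelisted.discard(key)

    def is_whitelisted(self, lat, lon):
        return self._key(lat, lon) in self.whitelisted
```

Requiring many consistent passes before whitelisting a location is what keeps one driver’s mistake from teaching the whole fleet to ignore a real hazard.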
Cameras out, radar in
Although it sounds pretty mundane, this might be one of the most important Tesla Autopilot updates ever. As explained, moving to radar is a great idea, reducing Autopilot’s reliance on cameras and thereby making it almost weatherproof. Throw in the possibility of eventually reading hazards in front of the car ahead, and it’s clear this technology will eventually make Autopilot much safer.