Forget about radar for self-driving tech. Tesla announced yesterday that it’s officially transitioning to its camera-based autonomous driving system, known as Tesla Vision.
Starting this month, Model 3 and Model Y vehicles delivered to the North American market will be the first cars to ditch radar entirely, relying on camera vision and neural net processing to deliver “Autopilot, Full Self-Driving, and certain active safety features.”
Naturally, one wonders why on earth Tesla would make such a change. And when Elon Musk tweeted about the launch of “pure vision Autopilot,” several comments raised the same question.
there will never be secure autonoumous driving with vision only. obviously even we humans use a lot more senses which are not technically substituteable easily. Fear and Adrenalin for example, or Audio.
But only vision is an approach that will never be secure.
— gewure (@gewure) May 26, 2021
Is pure vision enough? After all humans use sound to drive safely- car horns, emergency vehicle sirens to name just a couple. In India for example drivers use their car horns extensively to communicate with other drivers.
— Anne-Marie Hancock (@AnneMarieHanco7) May 26, 2021
What are the primary benefits of pure vision for Tesla? Higher production? Lower cost? Improved software performance? No need to integrate external suppliers?
— Whole Mars Catalog (@WholeMarsBlog) May 26, 2021
In fact, it makes no sense to skip radar and rely solely on cameras.
Cameras may offer higher resolution and lower production cost, but they have significant limitations. They are less effective in bad weather and less accurate at night. Most notably, the neural networks that interpret what the cameras see require extensive training and processing power – and both are inherently limited within a car’s computer system.
Radar sensors, on the other hand, are much more reliable: they offer better range and higher accuracy when measuring an object’s distance and speed, regardless of weather or lighting conditions. They do have lower resolution, however, and improving it requires hardware that runs at higher frequencies, which is costly.
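To make the trade-off concrete, here is a minimal, purely illustrative Python sketch of sensor fusion using inverse-variance weighting. The sensor readings and noise figures are assumptions invented for this example and say nothing about Tesla’s actual software; the point is simply that combining a noisy camera depth estimate with a precise radar range yields a tighter distance estimate than the camera can produce on its own.

```python
# Illustrative only: fusing a camera and a radar distance estimate with
# inverse-variance weighting. All numbers below are made up to show why
# dropping the more precise sensor widens the uncertainty.

def fuse(estimates):
    """Combine (value, variance) pairs into a single estimate.

    Each measurement is weighted by 1/variance, so the more precise
    sensor dominates, and the fused variance is smaller than either
    input's variance on its own.
    """
    weights = [1.0 / var for _, var in estimates]
    fused_value = sum(w * val for w, (val, _) in zip(weights, estimates)) / sum(weights)
    fused_variance = 1.0 / sum(weights)
    return fused_value, fused_variance

# Hypothetical readings for a car roughly 50 m ahead at night:
camera = (48.0, 9.0)   # vision depth estimate, noisy in low light (variance in m^2)
radar  = (50.2, 0.25)  # radar range, largely unaffected by lighting

print(fuse([camera, radar]))   # fused estimate hugs the radar reading, variance ~0.24
print(fuse([camera]))          # camera alone: the same noisy 48 m, variance 9.0
```

In this toy setup, the fused estimate sits close to the radar reading with a fraction of the camera’s uncertainty; remove the radar and the system is left with whatever the camera alone can resolve.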
In a nutshell, the transition to cameras reduces cost, which fits a broader cost-cutting trend at Tesla. Still, it’s hard to see how cameras alone can match the level of safety radar provides, and that raises serious vehicle and road safety concerns. Ed Niedermeyer’s excellent thread in the tweet below is illuminating:
The vast majority of pedestrian/cyclist fatalities take place in low-light conditions, where vision-based features perform poorly. Tesla’s decision to use “pure vision” for its driving automation could disproportionately impact vulnerable road users’ safety.
— E.W. Niedermeyer (@Tweetermeyer) May 25, 2021
To make matters worse, Tesla warns that vehicles with Tesla Vision will be delivered with some features temporarily limited or inactive:
- Autosteer will be limited to a maximum speed of 75 mph and a longer minimum following distance.
- Smart Summon (if equipped) and Emergency Lane Departure Avoidance may be disabled at delivery.
What’s more, customers don’t get a choice between radar and camera-only hardware for their prospective vehicles. Those who ordered a car before May 2021 and are matched to a vehicle with Tesla Vision will be notified of the change through their Tesla accounts prior to delivery.
Whether this switch pays off for Tesla is highly doubtful. Let’s hope it doesn’t come at the expense of vehicle safety. Nobody wants that.