Tesla’s latest Autopilot feature, Navigate on Autopilot, has been trashed in a review by Consumer Reports, which said the system is “less competent than a human driver”, makes “poor decisions”, and creates “potential safety risks.”
Navigate on Autopilot enables on-ramp to off-ramp driving on the highway, with the system making its own lane changes based on speed and the destination entered in the navigation system.
It’s supposed to handle interchanges and off-ramps.
In their latest review of Navigate on Autopilot, Consumer Reports found the system to be poor:
“In practice, we found that Navigate on Autopilot lagged far behind a human driver’s skill set: The feature cut off cars without leaving enough space and even passed other cars in ways that violate state laws, according to several law enforcement representatives CR interviewed for this report. As a result, the driver often had to prevent the system from making poor decisions.”
Jake Fisher, Consumer Reports’ senior director of auto testing, added:
“The system’s role should be to assist the driver, but the way this technology is deployed, it’s the other way around. It’s incredibly nearsighted. It doesn’t appear to react to brake lights or turn signals, it can’t anticipate what other drivers will do, and as a result, you constantly have to be one step ahead of it.”
The magazine argues that the state of the system points to Tesla being late on its goal of delivering a full self-driving system by the end of the year.
That was a reasonably harsh review.
Based on my several hundred miles of experience with NoA, I agree with many of the weaknesses highlighted in the review, but I think they miss the point by calling it “less competent than a human driver” and a “safety risk.”
They make it sound like it is useless at the moment, and I disagree.
It’s still very much a driver assist system and therefore, the fact that it is less competent than humans isn’t really the point. And if it’s used as a driver assist system, then it doesn’t really result in a safety risk.
Of course, that’s aside from clear bugs like suggesting a lane change into oncoming traffic.
In my own experience, I rarely have to intervene when using the system. The most common intervention is to come back into the right lane faster after passing, because I feel NoA is a bit too careful about getting far ahead of the passed vehicle.
But I always pay attention and try to understand the system’s intentions.
It always appears to be trying to do the right thing, but if I don’t like the hesitation, I take over.
For the most part, I found that it reduces my driving workload and I remain safe as long as I pay attention. It even prevented me from missing my exit on two occasions.
I think there’s a lot of room for improvement, as highlighted by CR, but I disagree with portraying the feature as dangerous. It simply isn’t when used as intended.