Energy News Today

Tesla FSD Beta users show how the system works — and doesn’t

Tesla CEO Elon Musk has been promising customers a driverless vehicle since at least 2016.

While the company hasn’t delivered on that promise, Tesla lets thousands of employees and customers try new and unfinished driver assistance features on public roads in the U.S. through a program called Full Self Driving Beta, or FSD Beta.  

Only Tesla owners who have the company’s premium FSD driver assistance system installed in their cars can join the FSD Beta program. (That option costs $12,000 up front or $199 per month in the U.S. today.) Owners must then obtain a high driver-safety score, as determined by Tesla software that monitors their driving habits, and maintain it to keep FSD Beta access. No safety certification or professional training is required.

FSD Beta can best be summarized as a host of new features that are not yet debugged. Chief among them is “autosteer on city streets,” which lets the car navigate around complex urban environments automatically.

In January and February, CNBC rode along with three Tesla owners who are participants in FSD Beta to get an uncensored look at how the system works — and doesn’t — today.

All three drivers understood that the technology does not make their Tesla electric cars fully self-driving, despite the brand name. The vehicles with FSD Beta engaged maneuvered around some suburban and rural roads successfully, but suffered some dramatic glitches as well, especially in crowded urban environments.

Tesla bull, FSD skeptic

Cost of making it better

Another Tesla Model Y owner and FSD Beta participant, Kevin Smith, in Murfreesboro, Tennessee, sees glitches and disengagements during drives as inevitable, and part of the process of making FSD Beta into a truly autonomous system someday.

Smith has driven more than 5,000 miles with FSD Beta, he told CNBC.

“Any time the car could just make a mistake,” he said. “And I have to be ready for that. My stress levels go up, not down, from using Full Self-Driving… But that’s the cost of making it better.”

On the ride-along, CNBC witnessed his vehicle automatically stopping and navigating through an intersection, without Smith having to steer.

He’s generally impressed with the technology so far, but notes it hasn’t worked in snow or inclement weather, and that every new version of FSD Beta, released via over-the-air software updates to his car, can solve one problem while introducing a new one.

By using the FSD beta on public streets, Smith says, “I don’t feel I’m increasing the assumed risk that people are putting themselves in by also being on those public streets. We share those streets with people who are, you know, using a car for the first time with their learner’s permit.”

Another Tesla owner, Dan Eldridge, took CNBC for a ride in his Model 3 in San Francisco on Feb. 1. He said that he’s been able to use the feature safely by remaining attentive.

“I haven’t really been in a situation where I felt like I couldn’t gain control, like I didn’t have enough time to gain control,” he explained at the outset.

The car navigated some thoroughfares well, but Eldridge had to stop it from rolling through a stop sign, which it nearly did without warning him to take over steering. The car didn’t navigate roundabouts properly. It also nearly cut off another driver by trying to automatically change lanes, requiring Eldridge to disengage the system.

While he was vigilant and avoided any accidents, Eldridge said, “When I’m using the FSD, I’d say I’m less worried about hitting a pedestrian than I am about being the victim of a road rage incident, because I’m just not driving in a courteous way.”

Experimenting in the real world

While there are no federal laws barring Tesla from letting customers test unfinished software this way, transportation experts are not convinced Tesla’s experiment on public roads is a safe or sensible one.

So far, two Tesla owners have submitted complaints to the federal vehicle safety authority NHTSA saying they thought FSD Beta contributed to or caused crashes they experienced. The FSD Beta program and technology are under investigation by NHTSA and the California Department of Motor Vehicles.

Kelly Funkhouser has been testing Tesla’s systems, including FSD Beta, on a closed route for Consumer Reports. She found that a lot of the time, Tesla’s driver monitoring systems, including cabin cameras that are supposed to track drivers’ attentiveness, don’t work.

“For Full Self-Driving beta, that’s when Tesla claims that they have these additional messages that say things like the camera’s blocked or to please pay attention. We’ve never experienced any of those warnings,” she said.

She added, “I understand why Tesla might want to get novice testers out there experiencing it. But I definitely think that’s a huge risk that consumers take when they are doing this on public roads.”

Destiny Thomas, the founder and CEO of Thrivance Group, an urban-planning organization with a focus on marginalized communities, said she’s concerned that Tesla is thinking about what drivers want, but not about the safety of people who share the roads.

“How is this car going to recognize someone using an assistive device that maybe isn’t a wheelchair and doesn’t look like one? How is this technology going to be able to recognize someone who has purple undertones in their skin that don’t react to the sensors that are in the car?” she asks.

She would like to see Tesla do more community engagement before putting experimental vehicles with novice testers behind the wheel on city streets.

CNBC reached out to Tesla, but the company did not reply to a request for comment.



2022-02-15 17:48:00
