Tesla driver completed a 6,392 km trip using Autopilot and Full Self-Driving software, but ran into some issues

Tesla Model Y. Tim Levin/Insider

  • Tesla driver Tim Heckman covered 6,392 kilometres, mostly using Autopilot and Full Self-Driving.

  • He said Autopilot had “deteriorated” over the years and FSD was “extremely poor outside of California”.

  • However, the Tesla driver said the software was a “lifesaver” when it came to long road trips.

The Tesla owner made the 6,392-kilometre journey mostly using Tesla Autopilot and Full Self-Driving (FSD), and said that while the software was a “lifesaver”, there were some issues along the way.

In December, Tim Heckman drove a Model S Plaid from Los Angeles to Pennsylvania and back, using the driver-assistance software for 99% of the trip, and documented the journey on Twitter.

Heckman, a site reliability engineer, told Insider that while the software proved helpful during his travels, it also made for “stressful driving” at times, detailing incidents where the tech would phantom-brake or struggle to keep to the speed limit, maintain a proper following distance, or stay in its own lane.

The pros and cons of autonomous driving

While Autopilot is driver assistance software built into all Teslas and designed for highway driving, FSD is a beta add-on that can run in urban environments and is designed for lane changing, stop sign and traffic light recognition, and parking.

Tesla CEO Elon Musk has said the software will eventually be able to operate completely on its own and be safer than human drivers, but the beta program still requires a licensed driver to monitor it constantly.

Tesla Model S Plaid. Tesla

Heckman told Insider that the software sometimes registered cars on the screen that weren’t there or had difficulty identifying lane markings when there was salt on the road.

“Sometimes it’s like riding with a 15- or 16-year-old driver,” Heckman said of using FSD on city streets outside of California. “There are strange, violent maneuvers. It stops or enters the turn lane too early. In a way, it’s just a general lack of awareness of the environment.”

On the other hand, Heckman said Autopilot “saves lives” on highways, adding that while he had to disable FSD many times, Autopilot only disengaged once, when the car in front of him on the highway hit its brakes.

“It can be a huge cognitive relief. Long trips can take a mental toll,” Heckman said, noting that he had used Autopilot on previous trips and found he could keep driving without getting tired.

The software has also helped him avoid highway collisions in the past.

“I realize that sometimes I switch off when I’m driving,” Heckman said. “[The software] may increase that, but if I do switch off, at least I know the vehicle is backing me up.”

Getting worse, not better

In his Twitter thread about the experience, Heckman wrote that Autopilot was “worse” than when he bought his first Tesla in 2019, and that FSD was “extremely poor outside of California.”

Hedges & Company, a digital marketing firm for automakers, found in a 2019 analysis of more than 175 million car owners that most Tesla owners live in California, which could give the AI software more opportunities to learn from Californian roads.

Ultimately, Heckman said he doesn’t see himself buying a non-Tesla electric car, at least until other charging networks catch up with Tesla’s, but he would like the automaker to use lidar and radar, sensors that can help vehicles detect nearby objects.

Tesla Motors CEO Elon Musk speaks to the media next to a Model S in Hong Kong, January 25, 2016. Nora Tam/South China Morning Post via Getty Images

Musk has spoken out against expensive hardware in the past and has reportedly pushed for cameras over radar because he wants the autonomous software to see the road the way human eyes do. The company stopped installing radar sensors in its cars in 2021.

Heckman is not the first person to detail problems with Tesla’s Autopilot or the FSD add-on. Many FSD testers have posted videos showing bugs in the software. The National Highway Traffic Safety Administration is investigating Autopilot and its potential involvement in several crashes.

“Ultimately, I think this technology has great potential,” Heckman wrote on Twitter. “But at this point, the focus has to be on doing it well without causing a regression in the experience, especially in features that affect your safety and that of others on the road.”

Do you drive a Tesla or have insights you want to share? Contact the reporter from a non-work email at [email protected]

Read the original article on Business Insider
