
Premature release of Tesla autopilot sparks discussions on safety implications

Art by Grace He

Teslas equipped with the company’s advanced driver assistance system, also known as autopilot, have been involved in at least 18 fatal crashes since the summer of 2021, according to the National Highway Traffic Safety Administration, leading some to doubt the future of self-driving cars.

As recently as December, a Tesla Model S with the brand-new Full Self-Driving system caused an eight-car pileup on the Bay Bridge, injuring nine people.

Security footage from the tunnel, released on Jan. 11, shows the self-driving Tesla attempting a dangerous lane change before slamming on the brakes, causing the pileup.

In recent months, the NHTSA has investigated 35 incidents involving Teslas with FSD, or autopilot, activated. These incidents have collectively killed 19 people.

Tesla first introduced its autopilot feature in 2016, equipping its cars with cameras and sensors that software updates used to self-drive on freeways.

After years of testing and delays, the FSD system became available in November 2022. These AI features, however, have built up a questionable reputation.

Courtney Mitchell, who has owned a Tesla for five years, defends the company, saying FSD isn’t meant to let drivers stop paying attention.

Instead, Mitchell said the negligence of the driver is to blame in most crashes involving FSD.

“When you’re choosing to move into a hands-off-the-wheel mode on the freeway, you need to still pay attention and be ready to grab that wheel instantly,” Mitchell said.

Steve Beck, another Tesla owner in Palo Alto, isn’t so sure.

“I’m not ready to trust full self-driving,” Beck said. “When it gets to the point that it is demonstrably safe, I would consider it at that point.”

A MotorTrend article from January, reporting on crashes involving Teslas on autopilot, said it’s possible Tesla is trying to get FSD on the market before it’s ready.

The excitement surrounding FSD traces back to a video Tesla released in 2016 demonstrating the cars’ ability to drive around city streets without a human in the driver’s seat.

However, on Jan. 17, Tesla’s director of Autopilot software, Ashok Elluswamy, testified that the promotional video was staged.

An article from Fortune Magazine said Tesla built excitement too early in the development of FSD, which led Tesla to prematurely release a version of it.

The car’s shortcomings on the road have given rise to concerns over a potential threat to drivers and pedestrians, especially in the Bay Area, where many people own Teslas. The company is based in Fremont.

Palo Alto’s Transportation Planning Manager Sylvia Star-Lack said there’s not much a city like Palo Alto can do about autonomous vehicles.

“On a staff level, we can’t really do much (about autonomous vehicles),” Star-Lack said. “We talk about it in the office, but we don’t have any additional insight into the testing or regulation of those vehicles.”

Using autopilot is a personal choice, however, and there is a community of people who decide not to use it.

Senior Anna Gurthet is a part of this community and said she prefers gas cars.

“I like the feel of an engine and the actual driving experience more of a gas car, (and) I personally like driving, so autopilot is useless to me,” Gurthet said.

Regardless of people’s preferences, Beck said he is concerned for the future of Tesla as a company because of some of its recent missteps, including those of its CEO.

Beck said, “As a Tesla owner, I want the company to be healthy and still be there if I ever need service. (Elon Musk) is neglecting what he needs to do.”
