"After crashes, Tesla has used these data to shift blame onto drivers."

Mar 17, 2026 2:49 PM

Antininny

Views 1100 | Likes 33 | Dislikes 8

https://archive.ph/VQQs1

everything_is_computer

tesler

I was going to comment, "Are people really so stupid that they trust 'AI' of any sort...?" but the answer is plain to see.
Any moron contributing to "AI" is equally stupid, and the assumption that "AI" controlling everything is a foregone conclusion is lunacy.

1 week ago | Likes 4 Dislikes 1

JFC. Why would anyone buy a Tesla at this point?

1 week ago | Likes 24 Dislikes 1

New ones, sure. But I can imagine people who bought them a couple of years ago, when Musk's batshit craziness and problems like these weren't yet common knowledge among regular folk, might be stuck with them. It's not like the resale value would come close to a replacement EV.
Anyone still buying Teslas this day and age? Yeah, morons.

1 week ago | Likes 4 Dislikes 0

I have no idea, yet when I drive on the highways in the Bay Area, they are fucking EVERYWHERE.

1 week ago | Likes 8 Dislikes 2

Whenever I see a Tesla on the road I say, "oh look, a dumbass."

1 week ago | Likes 5 Dislikes 1

In short: they disable the 'self driving' 0.0000000000000001 seconds before the crash, so technically YOU were driving, and thus Tesla is innocent. Sure, you can go to court, but you just had a crash, you have a hospital bill, you need to buy a new car, and you have trauma. Tesla just ties you up in court for the next 12 months, costing you tons of money. Sure, if you win you might get a wad of $$$, but if you lose or simply run out of money, you're fcked.

1 week ago | Likes 15 Dislikes 6

No, they don't. It is bizarre that people think this. Teslas are all L2 automation (fancy cruise control), and the driver is ALWAYS completely and fully responsible for the vehicle at all times.

1 week ago | Likes 5 Dislikes 0

No, they don't disable anything or claim that it was disabled. Crash telemetry reviewed by regulators like the NHTSA shows the system’s state seconds before impact, not microseconds. Tesla's legal defense doesn't rely on timing tricks. Instead, they cite the "Level 2" status of the car, which legally mandates that the driver remain fully attentive. They blame drivers by arguing that human oversight is required at all times, regardless of system activity.

1 week ago | Likes 14 Dislikes 1

It's like the use of LLM AIs: what's the point of these things if you still have to check and control everything yourself? Especially when it's marketed as the opposite. These things are simply not good enough for what they get sold as.

1 week ago | Likes 1 Dislikes 0

Like the article says, "To lull them and blame them when things go wrong"...

1 week ago | Likes 9 Dislikes 0

fwiw, I would never use the smart cruise control on my Kia Niro if I thought it took me an entire five seconds to "mentally reengage". I think there are too many people who use Level 2 autopilot without fully understanding what it does and doesn't do.

1 week ago | Likes 6 Dislikes 1

I used to use cruise control all the time. I stopped back in like 2008, after two friends within a year of each other had their cruise controls flat-out refuse to disengage, in two completely different makes of vehicle. Both friends had to resort to some desperate dangerous shenanigans that damaged their tires to stop. My pedal foot now gets full exercise when I drive, forever.

1 week ago | Likes 1 Dislikes 2

Your fear stems from malfunctions that occurred nearly two decades ago, but modern cruise control and Level 2 systems use fundamentally different architectures with multiple fail-safes. Modern vehicles are designed so that depressing the brake pedal provides a mechanical or electronic override that physically disengages the system. Using 2008-era failures to dismiss today's safety tech is a bit like avoiding modern smartphones because a 2008 flip phone once dropped a call. Reliability has evolved.

1 week ago | Likes 1 Dislikes 0