
(The Verge)   Tesla provides data records that show the Tesla Model S autopilot crash in April outside Houston was not, in fact, an autopilot crash, wasn't a Model S, didn't happen in Texas, and actually happened in 1987   (theverge.com)
    More: Followup, Tesla Motors, Tesla Model S, National Transportation Safety Board, Nikola Tesla, data logs, passenger seats, Tesla Roadster, Transport  

1698 clicks; posted to STEM » on 21 Oct 2021 at 8:54 PM



41 Comments
 
2021-10-21 8:58:09 PM  
But we should still short Tesla stock, right?
 
2021-10-21 9:03:01 PM  

anuran: But we should still short Tesla stock, right?


I haven't been paying much attention, but it still seems to be inflated, doesn't it?
 
2021-10-21 9:03:10 PM  
Weird.  This is, like, the first time I've seen evidence that people will jump to ill-informed conclusions just to have a good story to tell.
 
2021-10-21 9:06:27 PM  

New Farkin User Name: anuran: But we should still short Tesla stock, right?

I haven't been paying much attention, but it still seems to be inflated doesn't it?


Along with a snake-ball of other stocks that'll tank in a hurry once all this inflation and the jobs weirdness mate to form a new recession.
 
2021-10-21 9:28:56 PM  
So, after the local yokels say they're "100% certain" the driver's seat was unoccupied and the media runs with that, the pros come in and say "Nope, just dumbasses"?

Where's my fainting couch?
 
2021-10-21 9:43:09 PM  

scanman61: the media runs with that


Media carries with it a credibility that is totally undeserved. You have all experienced this, in what I call the Murray Gell-Mann Amnesia effect. (I call it by this name simply because I once discussed it with Murray Gell-Mann, and by dropping a famous name I imply greater importance to myself, and to the effect, than it would otherwise have.)

Briefly stated, the Gell-Mann Amnesia effect is as follows. You open the newspaper to an article on some subject you know well. In Murray's case, physics. In mine, show business. You read the article and see the journalist has absolutely no understanding of either the facts or the issues. Often, the article is so wrong it actually presents the story backward, reversing cause and effect. I call these the "wet streets cause rain" stories. Paper's full of them.

In any case, you read with exasperation or amusement the multiple errors in a story, and then turn the page to national or international affairs, and read as if the rest of the newspaper was somehow more accurate about Palestine than the baloney you just read. You turn the page, and forget what you know.

That is the Gell-Mann Amnesia effect. I'd point out it does not operate in other arenas of life. In ordinary life, if somebody consistently exaggerates or lies to you, you soon discount everything they say. In court there is the legal doctrine of falsus in uno, falsus in omnibus, which means untruthful in one part, untruthful in all.

But when it comes to the media, we believe against evidence that it is probably worth our time to read other parts of the paper. When, in fact, it almost certainly isn't. The only possible explanation for our behavior is amnesia.

~ Michael Crichton
 
2021-10-21 9:54:49 PM  
"Also, security camera footage from the vehicle owner's home captured him entering the driver's side door, while his companion got in on the passenger side "

Yeah, that's a pretty big clue someone was driving
 
2021-10-21 10:10:33 PM  
Those two were trying to get the fark out of Texas, before the rolling coal guys sobered up.
 
2021-10-21 10:17:36 PM  
Tesla provided the data?

There was a report this morning that a Dutch institute had managed to decode it, and found there was much more information in there than Tesla had been passing on:

https://www.autoblog.com/2021/10/21/tesla-driving-data-decrypted-dutch-forensics-lab/

I'd be interested to take Tesla out of the loop when investigating their car crashes, especially if they're claiming it's driver error.
 
2021-10-21 10:30:27 PM  
Wait, that one guy who always shiats these threads said that the authorities weren't even investigating this. I was led to believe that Elon owns our government and the poor, poor oil companies who try to compete fairly can't keep up with his captured regulators...

I can't believe that wasn't true
 
2021-10-21 10:43:18 PM  

Oneiros: Tesla provided the data?

There was a report this morning that a Dutch institute had managed to decode it, and found there was much more information in there than Tesla had been passing on:

https://www.autoblog.com/2021/10/21/tesla-driving-data-decrypted-dutch-forensics-lab/

I'd be interested to take Tesla out of the loop when investigating their car crashes, especially if they're claiming it's driver error.


NTSB not good enough for ya?

With the assistance of the [event data recorder] module manufacturer, the NTSB Recorders Laboratory repaired and downloaded the fire-damaged EDR. Data from the module indicate that both the driver and the passenger seats were occupied, and that the seat belts were buckled when the EDR recorded the crash.

The EDR is part of the airbag module.  EDR recovery has been a standard part of the NTSB's toolbox for a while.
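For the curious, EDR payloads are fixed-layout binary records that get unpacked field by field. Here's a toy decoder to show the idea; the field layout below is entirely made up for illustration and is not Tesla's, Bosch's, or any real EDR format:

```python
import struct

# Hypothetical fixed-layout crash record: speed (km/h), then four flags for
# driver belt, passenger belt, driver seat occupied, passenger seat occupied.
# NOT a real EDR format -- just an illustration of fixed-layout decoding.
RECORD_FMT = "<f????"  # little-endian: one float32 followed by four booleans

def decode_record(raw: bytes) -> dict:
    """Unpack one toy crash record from raw EDR bytes into named fields."""
    speed, drv_belt, pax_belt, drv_seat, pax_seat = struct.unpack(RECORD_FMT, raw)
    return {
        "speed_kmh": speed,
        "driver_belted": drv_belt,
        "passenger_belted": pax_belt,
        "driver_seat_occupied": drv_seat,
        "passenger_seat_occupied": pax_seat,
    }

# Example record: both seats occupied, both belts buckled, 120 km/h.
raw = struct.pack(RECORD_FMT, 120.0, True, True, True, True)
rec = decode_record(raw)
```

Real recorders add checksums, multiple pre-crash samples, and vendor-specific framing, which is why labs like the NTSB's (or the Dutch institute mentioned above) do the decoding.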
 
 
2021-10-21 11:47:57 PM  

SansNeural: Weird.  This is, like, the first time I've seen evidence that people will jump to ill-informed conclusions just to have a good story to tell.


I told it to my wife, Morgan Fairchild, and she tells me people go on the internet and lie all the time.
 
2021-10-21 11:57:18 PM  
Once I saw that picture of Elon hitting that joint I knew it wasn't the car's fault.
 
2021-10-22 12:08:11 AM  

mrmopar5287: scanman61: the media runs with that

Media carries with it a credibility that is totally undeserved. You have all experienced this, in what I call the Murray Gell-Mann Amnesia effect. (I call it by this name simply because I once discussed it with Murray Gell-Mann, and by dropping a famous name I imply greater importance to myself, and to the effect, than it would otherwise have.)

Briefly stated, the Gell-Mann Amnesia effect is as follows. You open the newspaper to an article on some subject you know well. In Murray's case, physics. In mine, show business. You read the article and see the journalist has absolutely no understanding of either the facts or the issues. Often, the article is so wrong it actually presents the story backward, reversing cause and effect. I call these the "wet streets cause rain" stories. Paper's full of them.

In any case, you read with exasperation or amusement the multiple errors in a story, and then turn the page to national or international affairs, and read as if the rest of the newspaper was somehow more accurate about Palestine than the baloney you just read. You turn the page, and forget what you know.

That is the Gell-Mann Amnesia effect. I'd point out it does not operate in other arenas of life. In ordinary life, if somebody consistently exaggerates or lies to you, you soon discount everything they say. In court there is the legal doctrine of falsus in uno, falsus in omnibus, which means untruthful in one part, untruthful in all.

But when it comes to the media, we believe against evidence that it is probably worth our time to read other parts of the paper. When, in fact, it almost certainly isn't. The only possible explanation for our behavior is amnesia.

~ Michael Crichton


Offer not valid for conservative politicians.
 
2021-10-22 1:14:38 AM  
This is the problem with "self-driving" as currently on offer, and why it's so wrong of Tesla or anyone else to call it "self-driving."

There's driving automation level 0 (you drive it) and level 1 (lane departure, adaptive cruise), there's level 5 (the car is able to drive itself, on its own, as well as you if not better), and then there's a giant ass black hole of not acceptable in between those levels.

The reason it's a giant ass black hole of not acceptable is simple: Have you noticed that even with zero driving automation, every day is International Drive Like An Idiot Day? Now you want to give the same idiots who can barely manage to pay enough attention to aim their self-propelled couches when they know they have to pay attention 100% of the time, a car that says "you can ignore everything 99% of the time"?

Oh HELL no.
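For reference, the levels being gestured at are the SAE J3016 taxonomy. A rough lookup table (descriptions paraphrased, not official J3016 wording) makes the "black hole" argument concrete:

```python
# SAE J3016 driving automation levels, paraphrased.
# The key divide: through level 3 a human must be ready to take over;
# only at 4-5 can the human genuinely stop supervising (within the
# system's operational design domain).
SAE_LEVELS = {
    0: ("No automation", "human drives; system may warn"),
    1: ("Driver assistance", "steering OR speed assist, e.g. adaptive cruise"),
    2: ("Partial automation", "steering AND speed assist; human supervises constantly"),
    3: ("Conditional automation", "system drives; human must take over on request"),
    4: ("High automation", "system drives and can reach a safe stop on its own"),
    5: ("Full automation", "system drives anywhere a human could"),
}

def human_must_supervise(level: int) -> bool:
    """The comment's 'black hole': below level 4, a human is still in the loop."""
    return level < 4
```

Under this framing, levels 2-3 are exactly the zone where the car invites inattention while still depending on it.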
 
2021-10-22 2:08:46 AM  

erik-k: This is the problem with "self-driving" as currently on offer, and why it's so wrong of Tesla or anyone else to call it "self-driving."

There's driving automation level 0 (you drive it) and level 1 (lane departure, adaptive cruise), there's level 5 (the car is able to drive itself, on its own, as good as you if not better), and then there's a giant ass black hole of not acceptable in between those levels.

The reason it's a giant ass black hole of not acceptable is simple: Have you noticed that even with zero driving automation, every day is International Drive Like An Idiot Day? Now you want to give the same idiots who can barely manage to pay enough attention to aim their self propelled couches when they know they have to pay attention 100% of the time, a car that says "you can ignore everything 99% of the time"?

Oh HELL no.


This accident did not involve FSD. In this accident the car was under the direct control of the driver.

The driver killed himself and his passenger. Should drivers be banned?
 
2021-10-22 2:20:25 AM  

scanman61: Oneiros: Tesla provided the data?

There was a report this morning that a Dutch institute had managed to decode it, and found there was much more information in there than Tesla had been passing on:

https://www.autoblog.com/2021/10/21/tesla-driving-data-decrypted-dutch-forensics-lab/

I'd be interested to take Tesla out of the loop when investigating their car crashes, especially if they're claiming it's driver error.

NTSB not good enough for ya?

With the assistance of the [event data recorder] module manufacturer, the NTSB Recorders Laboratory repaired and downloaded the fire-damaged EDR. Data from the module indicate that both the driver and the passenger seats were occupied, and that the seat belts were buckled when the EDR recorded the crash.

The EDR is part of the airbag module.  EDR recovery has been a standard part of the NTSB's toolbox for a while.


Besides, anyone with a Tesla knows this was not autopilot. Autopilot plays by certain rules and requirements, and this crash did not fit them.
 
2021-10-22 3:17:29 AM  

erik-k: This is the problem with "self-driving" as currently on offer, and why it's so wrong of Tesla or anyone else to call it "self-driving."


Except that the car was under their control and didn't even have FSD installed.
 
2021-10-22 3:39:33 AM  

anuran: But we should still short Tesla stock, right?


If you have the liquidity from covered calls, buying naked puts on Tesla might not be a bad idea. But don't go short, the market can stay irrational longer than you can stay solvent.
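To make the short-vs-puts point concrete, here's a toy payoff comparison (made-up numbers, obviously not investment advice): a short position's loss grows without bound as the price rises, while a long put can never lose more than the premium paid.

```python
def short_pnl(entry: float, price: float, shares: int = 100) -> float:
    """Short sale P&L per position: loss is unbounded as price rises."""
    return (entry - price) * shares

def long_put_pnl(strike: float, premium: float, price: float,
                 contracts: int = 1) -> float:
    """Long put P&L: worst case is losing the premium paid (x100 shares/contract)."""
    intrinsic = max(strike - price, 0.0)
    return (intrinsic - premium) * 100 * contracts

# Toy numbers: stock at 900, one put struck at 900 costing $50/share.
# If the "irrational" market doubles the stock, the 100-share short loses
# $90,000, while the put holder is out only the $5,000 premium.
short_loss = short_pnl(entry=900, price=1800)
put_loss = long_put_pnl(strike=900, premium=50, price=1800)
```

That bounded downside is the whole point of the "puts, not shorts" advice when a stock can stay irrational longer than you can stay solvent.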
 
2021-10-22 4:07:08 AM  

wildcardjack: anuran: But we should still short Tesla stock, right?

If you have the liquidity from covered calls, buying naked puts on Tesla might not be a bad idea. But don't go short, the market can stay irrational longer than you can stay solvent.


The biggest lie of economics is "Markets are rational".
 
2021-10-22 4:18:57 AM  

erik-k: This is the problem with "self-driving" as currently on offer, and why it's so wrong of Tesla or anyone else to call it "self-driving."

There's driving automation level 0 (you drive it) and level 1 (lane departure, adaptive cruise), there's level 5 (the car is able to drive itself, on its own, as good as you if not better), and then there's a giant ass black hole of not acceptable in between those levels.

The reason it's a giant ass black hole of not acceptable is simple: Have you noticed that even with zero driving automation, every day is International Drive Like An Idiot Day? Now you want to give the same idiots who can barely manage to pay enough attention to aim their self propelled couches when they know they have to pay attention 100% of the time, a car that says "you can ignore everything 99% of the time"?

Oh HELL no.


Actually, autonomous driving is much safer than human driving. All the time, every time.
 
2021-10-22 5:38:54 AM  

dericwater: erik-k: This is the problem with "self-driving" as currently on offer, and why it's so wrong of Tesla or anyone else to call it "self-driving."

There's driving automation level 0 (you drive it) and level 1 (lane departure, adaptive cruise), there's level 5 (the car is able to drive itself, on its own, as good as you if not better), and then there's a giant ass black hole of not acceptable in between those levels.

The reason it's a giant ass black hole of not acceptable is simple: Have you noticed that even with zero driving automation, every day is International Drive Like An Idiot Day? Now you want to give the same idiots who can barely manage to pay enough attention to aim their self propelled couches when they know they have to pay attention 100% of the time, a car that says "you can ignore everything 99% of the time"?

Oh HELL no.

Actually, autonomous driving is much safer than human driving. All the time, every time.


lulz
 
2021-10-22 5:43:20 AM  
When Toyota had its acceleration problems, they let NASA look for a problem. NASA never found anything.

I think NASA might be able to find some problems with a Tesla vehicle if they looked. Maybe NASA should give it a try.
 
2021-10-22 5:46:58 AM  

anuran: wildcardjack: anuran: But we should still short Tesla stock, right?

If you have the liquidity from covered calls, buying naked puts on Tesla might not be a bad idea. But don't go short, the market can stay irrational longer than you can stay solvent.

The biggest lie of economics is "Markets are rational".


I think the lie is that people always assume that they know what "rational" is. And maybe the key word there is not RATIONAL but rather ASSUME. One might find that few economists claim rationality as a conclusion, even though very many will claim rationality as an assumption. One needs to recognize the difference.
 
2021-10-22 5:55:21 AM  

Oneiros: Tesla provided the data?

There was a report this morning that a Dutch institute had managed to decode it, and found there was much more information in there than Tesla had been passing on:

https://www.autoblog.com/2021/10/21/tesla-driving-data-decrypted-dutch-forensics-lab/

I'd be interested to take Tesla out of the loop when investigating their car crashes, especially if they're claiming it's driver error.


If you can stifle any whistleblowers or opposition using NDAs, internet campaigns, legal threats, etc.

and

If you can stop any lawsuit or investigation or safety inquiry by claiming that you have data that can only be released through discovery, and which might be falsified by some opaque means.

and

If you are allowed to make vague claims that can be neither proven nor disproven with evidence.

Then you have probably got a nearly bulletproof company when it comes to safety claims, wrongful death, negligent homicide, etc.

Where is the vulnerability? Just like with tobacco companies, once someone figures out the whole thing has been structured to prevent people with valid claims from making claims, then you will be sued for billions.
 
2021-10-22 7:32:46 AM  

DrunkenIrishOD: "Also, security camera footage from the vehicle owner's home captured him entering the driver's side door, while his companion got in on the passenger side "

Yeah that's a pretty big clue someone was driving


Well, not really. That only proves someone drove the car out of the driveway.

I'm still going with the 2nd investigation that says both front seats were occupied.
This sounds like someone just did something stupid in their car.
 
2021-10-22 8:05:26 AM  
Ah, the old "we investigated ourselves and found nothing wrong" ploy.
 
2021-10-22 8:09:09 AM  

Tyrone Slothrop: Ah, the old "we investigated ourselves and found nothing wrong" ploy.


The NTSB concluded the driver was operating the vehicle...
 
2021-10-22 8:43:09 AM  
The next of Greg Abbott's Special Sessions of the Texas legislature will put in stone that a car driven by a woman shall be considered a driverless car.
 
2021-10-22 11:17:03 AM  
Well, that settles it. They really are building an android, and it's not just a guy in a costume.
 
2021-10-22 11:29:18 AM  
The "evidence" doesn't seem as convincing as the title makes it out to be.
 
2021-10-22 11:56:54 AM  

anuran: wildcardjack: anuran: But we should still short Tesla stock, right?

If you have the liquidity from covered calls, buying naked puts on Tesla might not be a bad idea. But don't go short, the market can stay irrational longer than you can stay solvent.

The biggest lie of economics is "Markets are rational".


"A Random Crash Down Wall Street"
 
2021-10-22 11:58:54 AM  

2fardownthread: When Toyota had its acceleration problems, they let NASA look for a problem. NASA never found anything.

I think NASA might be able to find some problems with a Tesla vehicle if they looked. Maybe NASA should give it a try.


Toyota didn't have a financial relationship with NASA.
 
2021-10-22 2:07:15 PM  

tzzhc4: The "evidence" doesn't seem as convincing as the title makes it out to be.


That's because you don't understand what an Event Data Recorder does.
 
2021-10-22 2:08:16 PM  

I am Tom Joad's Complete Lack of Surprise: 2fardownthread: When Toyota had its acceleration problems, they let NASA look for a problem. NASA never found anything.

I think NASA might be able to find some problems with a Tesla vehicle if they looked. Maybe NASA should give it a try.

Toyota didn't have a financial relationship with NASA.


And Tesla doesn't have one with the NTSB
 
2021-10-22 2:12:25 PM  

erik-k: This is the problem with "self-driving" as currently on offer, and why it's so wrong of Tesla or anyone else to call it "self-driving."

There's driving automation level 0 (you drive it) and level 1 (lane departure, adaptive cruise), there's level 5 (the car is able to drive itself, on its own, as good as you if not better), and then there's a giant ass black hole of not acceptable in between those levels.

The reason it's a giant ass black hole of not acceptable is simple: Have you noticed that even with zero driving automation, every day is International Drive Like An Idiot Day? Now you want to give the same idiots who can barely manage to pay enough attention to aim their self propelled couches when they know they have to pay attention 100% of the time, a car that says "you can ignore everything 99% of the time"?

Oh HELL no.


Level 4 is tolerable.  It knows enough to nope out of situations it can't handle.  It might just park itself in the road, but it's not going to run into anything.  Levels 2 and 3 are the real dangers.

2fardownthread: When Toyota had its acceleration problems, they let NASA look for a problem. NASA never found anything.

I think NASA might be able to find some problems with a Tesla vehicle if they looked. Maybe NASA should give it a try.


Yeah, the demographic distribution of the Toyota problem strongly suggested it was driver error.  Wrong pedal accidents happen and object-in-the-pedals accidents happen.  A driver should be aware of both scenarios.

I've had a brakes-don't-work scenario happen to me at low speed, heading straight for a gas pump.  Stopped it with the parking brake and went to yell at the mechanic.  Feces happens, you need to be able to deal with it.
 
2021-10-22 4:18:17 PM  
What could possibly go wrong?
 
2021-10-22 8:57:54 PM  

New Farkin User Name: anuran: But we should still short Tesla stock, right?

I haven't been paying much attention, but it still seems to be inflated doesn't it?


Yes, but you see, they've decided that the problem with all the other bubbles was that they popped, so they're just not gonna pop this one this time, so what you should really do is buy all the stonks you can. To the moon, baby!
 
2021-10-23 6:53:52 AM  

Likwit: Wait, that one guy who always shiats these threads said that the authorities weren't even investigating this. I was led to believe that Elon owns our government and the poor, poor oil companies who try to compete fairly can't keep up with his captured regulators...

I can't believe that wasn't true


Elon still won't sleep with you.
 
2021-10-23 6:55:49 AM  

Rambino: scanman61: Oneiros: Tesla provided the data?

There was a report this morning that a Dutch institute had managed to decode it, and found there was much more information in there than Tesla had been passing on:

https://www.autoblog.com/2021/10/21/tesla-driving-data-decrypted-dutch-forensics-lab/

I'd be interested to take Tesla out of the loop when investigating their car crashes, especially if they're claiming it's driver error.

NTSB not good enough for ya?

With the assistance of the [event data recorder] module manufacturer, the NTSB Recorders Laboratory repaired and downloaded the fire-damaged EDR. Data from the module indicate that both the driver and the passenger seats were occupied, and that the seat belts were buckled when the EDR recorded the crash.

The EDR is part of the airbag module.  EDR recovery has been a standard part of the NTSB's toolbox for a while.

Besides, anyone with a Tesla knows this was not autopilot. Autopilot plays by certain rules and requirements, and this crash did not fit.


Even if it spots someone who looks like Sarah Connor on the roadside?
 