adamjcook, (edited )

I was at Pride all weekend with my wife (videos and photos soon!), so I missed the drama that erupted.

Ok.

Let us, again, all put on our systems safety hats and take a look at the situation here as I understand it.

Below is the video that kicked the beehive between Tesla defenders and detractors on "what really happened?".

This clearly chaotic video was taken from a larger drive sequence in which FSD Beta was active.

🧵👇

adamjcook,

The video above was extracted from the Tweet shown below.

If you do not know the players involved here...

Ross Gerber is the individual driving this vehicle. Ross is an ardent Tesla supporter, Tesla investor and a prominent voice within the Tesla Community.

Dan O'Dowd is the individual in the passenger seat. Dan is the creator of The Dawn Project, which primarily targets Tesla's FSD Beta program over what it perceives as public safety issues.

adamjcook,

First off, one really has to "love" the placement of the mobile device charging pads in this vehicle - a vehicle equipped with an unvalidated, highly complex automated driving system, no less.

The placement of these pads ensures maximum driver inattentiveness to the roadway and to the dynamic driving task.

This is not a "side issue". The components are an integral part of the human-machine system in terms of a validation process.

adamjcook,

Alright. A few background notes.

With respect to safety-critical systems, "positive assumptions of safety" are incompatible with the analysis of these systems - particularly analysis by those outside of a systems safety lifecycle.

Tesla's FSD Beta program is included here as well since, per my previous threads on the matter, Tesla is not maintaining a systems safety lifecycle with that program.

The assumption must be made that the Tesla vehicle would have blown the stop sign.

Case closed on that.

adamjcook,

The other thing here is that Dan O'Dowd, as an individual who has claimed to be advocating for the public's safety, should not have participated in this FSD Beta-active drive.

There was no public safety upside here.

Zero.

None.

Only a downside.

The drive shown above revealed nothing that was not known before.

The program is unsafe because, at the very least, individuals like Ross Gerber are not under a well-maintained Safety Management System when operating these "test" vehicles.

adamjcook, (edited )

Now, after Dan O'Dowd published the above video, all kinds of vehicle owners are rushing to this exact same area in order to "test" it for themselves (to see if their vehicles attempt to run the same stop sign).

All that does is multiply the danger to the public.

And for what?

We already know, again, that the program is unsafe.

All this "testing" is pointless.

And dangerous!

adamjcook, (edited )

Here is what Ross (the driver of the FSD Beta-active vehicle) stated after Dan O'Dowd published this video clip.

This is exactly what I am talking about.

How can the vehicle be capable of "self-driving" if the human driver must remain the fallback at all times and under all circumstances?

Do people not understand the meaning of the word "self"?

Ross apparently does not.

So, this "test operator" cannot even define the capabilities of the vehicle that they are operating...🤔

adamjcook,

Frankly, all I hear from the various contortions of anyone defending Tesla or this FSD Beta-active vehicle is "blah, blah, blah".

I do not need a video to tell me that this program is unsafe.

It is unsafe - structurally and terminally.

adamjcook,

I suppose that I should also note that no system can ever be "perfectly safe".

That is not possible.

And the concept of "perfection" is not relevant to systems safety.

Ross submits that "at no time was anybody at any risk of crashing".

No.

There is always risk!

Systems safety is about maintaining processes such that always-present, finite risk is continuously and exhaustively identified and managed.

It is about appreciating that risk exists - the opposite of what Ross submits.

DoesntExist,

@adamjcook
A bunch of tech bros who couldn't pass muster in aviation are instead building cars and submarines.

chargrille,
@chargrille@progressives.social avatar

@adamjcook

He knowingly ran a stop sign at speed.

What an asshole. Typical libertarian bro treating safety rules as if they are optional and don't apply to him because he knows better.

jamiesaker,

@adamjcook An inattentive human who has handed over the primary task of driving is a terrible backup control in the event sudden response is required.

medains,

@jamiesaker @adamjcook perhaps counterintuitively, the better the system is at driving, the more inattentive the human operator becomes, and the less likely they are to 'step in' when required. A self-driving system must be at least as good as a human driver, or it is worse than having none at all.

adamjcook,

@jamiesaker @medains Ah. So, you are definitely not wrong here, but I will have another thread up soon that discusses this in additional detail.

Great point.

ibreezy,

@adamjcook People don't understand that if you create a car that encourages people to be distracted, encourages people to allow the car to take over, you are taking on that liability. It is fundamentally unsustainable for drivers to carry that liability themselves, system-wide, and the only reason to have it set up that way is to maximize TSLA profits.

I have a car with limited "self-driving" capabilities. Adaptive cruise-control and lane-centering. There is no expectation that the car will recognize stop signs, so I must. If I take my hands off the wheel, the car refuses to engage those capabilities.

CAWguy,
@CAWguy@mstdn.ca avatar

@adamjcook The “fallback option” reminds me of that simple science experiment where a person has to catch a falling ruler. If YOU drop the ruler, it’s an easy catch, but if SOMEONE ELSE drops it, you likely won’t have the reaction time necessary to prevent it from hitting the floor.
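The ruler analogy has real numbers behind it: the free-fall relation d = ½gt² converts how far the ruler falls before the catch into a reaction time. A minimal sketch in Python (the 20 cm drop is an illustrative figure, not a measurement from the thread):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def reaction_time(drop_cm: float) -> float:
    """Time in seconds for a ruler to free-fall drop_cm centimetres: d = 1/2 g t^2."""
    return math.sqrt(2 * (drop_cm / 100) / G)

# Catching the ruler after ~20 cm of fall implies roughly 0.2 s of reaction time.
print(round(reaction_time(20), 3))  # prints 0.202
```

At 35 mph (about 15.6 m/s), even that 0.2 s consumes roughly 3 m of travel before any corrective input begins - and that is for an expected drop, not a surprise handover.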

adamjcook,

@CAWguy It is indeed similar in terms of the interaction between a human driver and an opaque, automated system.

rowdypixel,
@rowdypixel@hachyderm.io avatar

@adamjcook “Debate” favors the side that wants the status quo. Even if one side has already been established as more correct by a consensus of experts. The same thing happens in every field when the established powers that be would be harmed by a change that is beneficial for anyone else.

You see the same thing in the discourse about climate change or lowering car dependency or building housing, etc.

MisterMadge,

@adamjcook The assumption arises from the premise (or operational state), "the vehicle continues to follow the road at 35 mph"?

adamjcook,

@MisterMadge The assumption that some in the pro-Tesla camp are making (as best I understand it) is that:

  1. The FSD Beta-active vehicle would not have run the stop sign; and/or
  2. The white SUV already in the intersection "cut" the FSD Beta-active vehicle off which created an "artificial" disengagement.

If any assumption must be made here it is that the Tesla vehicle would have blown the stop sign and collided with the white SUV vehicle.

There is nothing else to go on.

MisterMadge,

@adamjcook
Just to get into the engineering side of it: 1 is a poor assumption, 2 is a hypothesis.

A better assumption would be "the car will return control to the driver in an ambiguous situation". But if ambiguous situations happen all the time while driving, that undermines the very idea of "full self driving"

chemoelectric,
@chemoelectric@masto.ai avatar

@MisterMadge @adamjcook

I think the first thing we do is write off ‘full self driving’ as a kind of fallacy of distraction, where they change the subject from the real one to that made up term and their lie-full definition of it.

The real question does not involve the term at all: ‘Does Tesla produce cars, of a kind and in a way, that should be allowed on public roads?’ And the answer is no, no matter what distractions they toss in front of the facts.

MisterMadge,

@chemoelectric
What is your reasoning for "no"? Don't Tesla cars without the self driving capability meet the safety standards required to be allowed on public roads?
I think the point @adamjcook is making is that the NHTSA is improperly overseeing Tesla's development and testing of these higher levels of autonomy.

adamjcook,

@MisterMadge @chemoelectric I did not mention it, but there is a case to be made that if Tesla is willing to so brazenly engage in these uniquely extreme, overt wrongdoings, then the systems safety of the entire vehicle lifecycle is in serious question.

The NHTSA has visibility on none of it.

That should be recognized.

But separately, sure, per the effectively non-existent auto regulations in the US (and, largely, globally) - Tesla vehicles are allowed for sale and use.

chemoelectric,
@chemoelectric@masto.ai avatar

@adamjcook @MisterMadge

I think the answer should be ‘We don’t know.’

But I live in a state that has no car inspections, so really here in Minnesota we have no idea whatsoever what’s on the road with us. We let motorcyclists zoom around with bare skin and no helmet, too. It’s insane out there.

KatM,
@KatM@mastodon.social avatar

deleted_by_author

  • adamjcook,

    @KatM I agree.

    In fact, it is my understanding that a full web browser is accessible while the vehicle is in motion from the center HMI.

    vfrmedia,
    @vfrmedia@social.tchncs.de avatar

    @adamjcook @KatM at best this clown car belongs only on a movie set (in the backlot, well away from anything else), not on the public roads!

    adamjcook,

    @vfrmedia @KatM Tesla is an organization that really does everything in its power to pretend that their vehicles are capable of "self driving" - marketing language, vehicle interior design, various technical presentations at industry conferences... everything.

    All Tesla is doing is setting up human drivers for failure.

    And when the completely avoidable failure inevitably arrives, the Tesla vehicle is no longer capable of "self driving" as far as Tesla is concerned.

    KatM,
    @KatM@mastodon.social avatar

    deleted_by_author

  • chargrille,
    @chargrille@progressives.social avatar

    @KatM @adamjcook @vfrmedia

    It's terrifying just knowing these cars are on the road. It's a giant uncontrolled experiment with other drivers' & passengers' lives written off as unimportant. Tesla drivers pretend like they're only putting themselves at risk - they are incredibly stupid or incredibly selfish, or both.

    chargrille,
    @chargrille@progressives.social avatar

    @KatM @adamjcook @vfrmedia

    Pedestrians too, of course! Sorry I forgot to mention them. I'm anxious every time I'm near a Tesla. They should at least be required to have giant red lights on the roof that are lit up anytime the car is in "self-driving" mode or not fully in control of the driver, to warn other drivers & pedestrians.

    KatM,
    @KatM@mastodon.social avatar

    @chargrille Ooooh, they should have to play ice cream truck music when in full auto mode.

    @adamjcook @vfrmedia

    chargrille,
    @chargrille@progressives.social avatar

    @KatM @adamjcook @vfrmedia

    More like the theme from Jaws, but yes.

    chargrille,
    @chargrille@progressives.social avatar

    @KatM @adamjcook @vfrmedia

    But I am serious about these uncontrolled experiments on the public being required to provide safety warnings for other people out existing in the world & subjected to risk because the NHTSA isn't doing its job properly [where's Buttigieg on this, by the way?]. I hadn't thought about sound cues; that would be important for protecting blind pedestrians.

    vfrmedia,
    @vfrmedia@social.tchncs.de avatar

    @chargrille @KatM @adamjcook

    here in the UK only certain functions of FSD are allowed - it only really works on motorways, suffers from "phantom detection" of traffic signals and can't handle roundabouts, so it's thankfully not that popular for the extra price (most Tesla drivers are old men who would previously have bought a diesel saloon (sedan) but picked the Tesla due to cheap finance deals)

    chargrille,
    @chargrille@progressives.social avatar

    @vfrmedia @KatM @adamjcook

    I don't understand why it's allowed by governments at all outside of controlled areas where other cars' drivers can provide consent & they don't endanger pedestrians - like race tracks or other closed courses.

    adamjcook,

    @vfrmedia @KatM @chargrille In the US, at least, we effectively have zero auto regulation.

    It is all “self-certifying”.

    The NHTSA, the US’s theoretical regulator for highway and vehicle safety, cannot understand my thread as an organization - and they do not want to understand it.

    vfrmedia,
    @vfrmedia@social.tchncs.de avatar

    @adamjcook @KatM @chargrille

    Seems to be mostly the US where this is allowed to happen. Other countries only allow these experimental vehicles on test tracks, whilst actively encouraging uptake of "normal" human driven EVs.

    Much the same attitude that led to those rich fools being crushed in their own tin can 4 km below the sea in the name of "market led innovation..."

    adamjcook,

    @KatM @chargrille @vfrmedia Probably.

    The NHTSA effectively has an “unofficial” dual mandate - which includes keeping personal vehicles affordable.

    Stripping out safety regulations is going to substantially lower direct vehicle costs to consumers - and so it does.

    One thing that the USDOT has spent decades doing is to “blame the human driver” for everything while ripping out regulatory structures behind-the-scenes.

    adamjcook,

    @KatM @chargrille @vfrmedia It is important to see the NHTSA for what it actually is - a faux-safety regulator that is really about managing the risk to itself.

    Plausible deniability is the name of the game.

    So, the NHTSA wants to be ignorant to this entire conversation because… to know something means added risk to the agency.

    If the shit hits the fan one day, the agency wants to say they were just in the dark.

    Works every time.

    chargrille,
    @chargrille@progressives.social avatar

    @adamjcook @vfrmedia @KatM

    And yet, it's my understanding that because of NHTSA regulation, I can't import an electric VW van from Europe that has been fully safety tested & approved by their authorities.

    licked,
    @licked@mastodon.social avatar

    @KatM @chargrille @adamjcook @vfrmedia I've had a tesla for a few years. I have CRPS in my right foot. Being able to lay off the gas pedal is priceless, especially in stop and go traffic.

    we're not all monsters. the frunk even fits my travel scooter. i test drive other assisted cars, but none are nearly as good.

    i fully support your ice cream/jaws music idea. my daughters, however, just said "what's jaws?" and "you only buy drugs from the ice cream man."

    chargrille,
    @chargrille@progressives.social avatar

    @licked @KatM @adamjcook @vfrmedia

    😂

    Not all Tesla owners, sure. I'm sure you're not a monster & would feel horrible if someone using Tesla's driving assistance software caused an accident & gave someone else an injury or a new case of CRPS.

    These systems are not ready for road use and experimentation on other people. The general inability to recognize that this is happening, or unwillingness to admit it, is what reinforces Tesla drivers' image as self-absorbed prats.

    As traditional car manufacturers enter the electric vehicle market, Tesla is increasingly under pressure to differentiate itself. Last year, Musk said that “Full Self-Driving” was an “essential” feature for Tesla to develop, going as far as saying, “It’s really the difference between Tesla being worth a lot of money or worth basically zero.”

    licked,
    @licked@mastodon.social avatar

    @chargrille @KatM @adamjcook @vfrmedia you're right to a point. cruise control from forever would just plow into anything the car was pointed at, same as a brick on the gas pedal. If anyone sells "accelerator pedal bricks," i don't know about it.

    i have experienced the "phantom braking" issue. like many of the other issues (including the name Autopilot), it's always been something i manage because i understand the limits.

    i'm 47. i've never had an accident, pre or post tesla.

    adamjcook,

    @chargrille @KatM @vfrmedia @licked I will need to do a separate thread on this (in general, not specifically to your comment).

    But one of the core issues at play here is that because you (and everyone) are outside of the systems safety lifecycle for these un-/under-validated systems… there is no possibility that you can ever appreciate their limitations at any given time.

    A failure to internalize that is how complacency forms and inevitably bites.

    licked,
    @licked@mastodon.social avatar

    @adamjcook @chargrille @KatM @vfrmedia

    thanks for mansplaining it to me.

    epicdemiologist,
    @epicdemiologist@wandering.shop avatar

    @adamjcook "Full Self Driving" except for the "Full", "Self" and "Driving" parts

    douglasvb,
    @douglasvb@mastodon.social avatar

    @adamjcook wow that's way worse than I thought. This is a great 🪡🧵 about these issues and how not to do systems safety. Absolutely crazy.
