adamjcook, to random

Please. Please. I am begging you. Stop doing this.

Tesla's product is not at all, "partially" or otherwise, capable of "self-driving".

The official term, if you need to use something, is a "partial automated driving system" and the control responsibilities of the human driver utilizing such a system are exactly the same as if they were driving a 1995 Dodge Neon.

No Tesla vehicle is capable of "driving itself".

https://www.axios.com/2023/05/10/elon-musk-cathie-wood-tesla-twitter

adamjcook, to random

This is really more of a story about OSHA and its immense effectiveness failures - a long-standing problem that underpins US construction activities.

An agency that exists on paper while doing nothing actively harms the public, because the public believes someone is looking out for their interests when no one is.

https://www.texasobserver.org/tesla-texas-worker-death-heat/

adamjcook,

It is just like the lack of effective auto regulation in the US (supposed to be maintained by yet another ineffective US regulator - the NHTSA) and how Tesla exploits that via its Autopilot and FSD Beta programs.

It is just like NCAP testing eclipsing what should be proper auto regulation via validation-process oversight.

Regulatory “illusions” designed to trick the public into thinking there is a cop on the beat.

adamjcook, to random

Ok. So recently, Tesla published an "Impact Report" which contained a slide presenting some "data" claiming that their Autopilot and FSD Beta products "enhance safety".

And one article and one Twitter thread scrutinizing these numbers caught my eye.

https://www.forbes.com/sites/bradtempleton/2023/04/26/tesla-again-paints-a-very-misleading-story-with-their-crash-data/

https://twitter.com/NoahGoodall/status/1651323363099553793

While the analyses and arguments in this article and thread are not necessarily wrong, there are more fundamental issues here that need to be surfaced in my view... so let's take a look.

adamjcook,

Here's the rub.

Safety is not at all about numbers on a page!

This makes the article and Twitter thread I posted above (again, not necessarily wrong) entirely moot.

Why?

Because Tesla does not have a systems safety lifecycle in place for their Autopilot and FSD Beta programs and products… therefore, those programs and products are structurally unsafe.

Safety is about continuously maintaining a process of quantifying and handling failure - both seen and unseen failures.
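To make that point concrete, here is a minimal, purely illustrative sketch of the loop a systems safety lifecycle implies: hazards are identified, their risk is quantified, mitigations are triggered, and the whole process repeats as previously unseen failure modes surface. All names, numbers, and thresholds here are hypothetical - real processes (e.g. ISO 26262-style hazard analysis) use formal classifications, not a bare severity-times-likelihood product.

```python
# Hypothetical sketch of one pass of a continuous safety lifecycle.
# Hazard names, scales, and the threshold are illustrative only.

from dataclasses import dataclass


@dataclass
class Hazard:
    name: str
    severity: int       # 1 (minor) .. 5 (catastrophic)
    likelihood: int     # 1 (rare)  .. 5 (frequent)
    mitigated: bool = False

    @property
    def risk(self) -> int:
        # Toy risk-matrix product; real standards use formal
        # classifications (e.g. ASILs) rather than a bare product.
        return self.severity * self.likelihood


def lifecycle_pass(known, newly_observed, threshold=6):
    """One iteration: fold in newly observed ("unseen") hazards,
    then flag every unmitigated hazard whose risk is too high."""
    known = known + newly_observed
    for h in known:
        if not h.mitigated and h.risk >= threshold:
            # In a real lifecycle this would trigger redesign,
            # validation, and re-assessment - not just a flag.
            h.mitigated = True
    return known


hazards = lifecycle_pass(
    [Hazard("phantom braking", severity=4, likelihood=3)],
    [Hazard("driver inattention", severity=5, likelihood=4)],
)
```

The point of the loop structure is the author's: the process never terminates - each pass must absorb failures that were invisible on the previous pass, which is exactly what a static "numbers on a page" safety claim cannot capture.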

adamjcook,

@CrackedWindscreen EuroNCAP assesses "active safety features" (as you mentioned: LKAS, AEB and so on).

To my knowledge, EuroNCAP does not incorporate assessments of automated driving systems (i.e. Highway/Enhanced Autopilot, FSD Beta and so on).

But assessing active safety features is moot if, when FSD Beta is activated, the characteristics of those features change - because, say, FSD Beta requires "enhanced" object detection and event response capabilities.

That is what I meant.

CrackedWindscreen, to random

This is an excellent thread outlining the issues with Tesla, driver assistance tech and the fact that the car industry is palming all responsibility onto the public whilst knowing full well we are being set up for failure.

Read it and understand what is happening as this covers the EU and UK, not just the US.

https://mastodon.social/@adamjcook/110275319409180523

adamjcook,

@CrackedWindscreen Note also the "made up" terminology that the jurors used when being interviewed after the trial...

"Self pilot"

"Auto assist"

Note also the illusionary concept of "taking control".

In a partial automated driving system, like Autopilot, the human driver is always driving... responsible for the exact same control tasks as if no automated driving system were equipped on the vehicle.

To these jurors, a division of control seems to exist... but it is an illusion.

adamjcook, to random

Alright, so last week, a jury in California rejected the claims brought against Tesla's Autopilot product.

I am not a lawyer and the detailed analysis of legal issues is orthogonal to the obligations of experts in educating the public and in criticizing regulators.

The responses from the jurors in this case are interesting, though... and very much expected.

Let's briefly break this down a bit.

https://www.autoblog.com/2023/04/21/jurors-in-lawsuit-say-tesla-never-claimed-autopilot-to-be-a-self-pilot/

adamjcook,

So, Autopilot is a partial automated driving system - that is, an automated driving system that requires a human fallback at all times, exactly as if no automated system existed on the vehicle at all.

This type of system is immediately problematic... and not just in terms of Tesla's system, but industry-wide.

Why?

Because of the point emphasized above: the human driver must remain the fallback at all times.

The mere presence of such a system makes the vehicle instantly seem different and more capable than it is!
