adamjcook,

Ok. So recently, Tesla published an "Impact Report" which contained a slide presenting some "data" claiming that their Autopilot and FSD Beta products "enhance safety".

And one article and one Twitter thread that scrutinize these numbers caught my eye.

https://www.forbes.com/sites/bradtempleton/2023/04/26/tesla-again-paints-a-very-misleading-story-with-their-crash-data/

https://twitter.com/NoahGoodall/status/1651323363099553793

While the analyses and arguments in this article and thread are not necessarily wrong, there are more fundamental issues here that need to be surfaced in my view... so let's take a look.

joule,
@joule@mastodon.social avatar

@adamjcook At my university, "safety-critical systems" are taught in a graduate class under the nuclear engineering track. So no one except the nuclear students is expected to know about "safety" in the technical sense, let alone the layman.

adamjcook,

Tesla is a divisive topic. Really divisive.

Two sides constantly at war - generally over the stock price.

And it is also a topic surrounded by laypeople who are not competent in systems safety - so fundamental concepts are often immediately lost.

Totally expected.

Safety-critical systems are an oftentimes very niche and inherently complex topic.

For the pro-Tesla camp, "data", any data, is desperately sought to punch back against critics of Tesla's safety culture.

adamjcook,

Here's the rub.

Safety is not at all about numbers on a page!

This makes the article and Twitter thread I posted above, again not necessarily wrong, but entirely moot.

Why?

Because Tesla does not have a systems safety lifecycle in place for their Autopilot and FSD Beta programs and products… therefore, the program and product are structurally unsafe.

Safety is about continuously maintaining a process of quantifying and handling failure - both seen and unseen failures.

adamjcook,

Aside: What is this about "unseen" failures?

Check out my previous thread here for Elk users: https://elk.zone/mastodon.social/@adamjcook/110162950844417313

And here for non-Elk users: https://mastodon.social/@adamjcook/110162950844417313

adamjcook,

Ok. Getting back to it.

Take the recent Boeing 737 MAX scandal for instance.

Between May 22, 2017 (when it entered commercial service) and October 29, 2018 (the first fatal incident), nearly everyone external to Boeing would have seen a perfect 737 MAX safety record.

The "numbers on the page" of fatalities linked to the 737 MAX would have been zero!

But the aircraft was never safe... because Boeing did not maintain an appropriate systems safety lifecycle for the aircraft.

The numbers were moot.
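The "zero fatalities" trap also has a well-known statistical face: zero observed events cannot distinguish a safe system from a merely lucky one. A minimal sketch (the flight count is an illustrative assumption, not Boeing's actual figure) using the "rule of three" for a 95% upper confidence bound:

```python
# "Rule of three": if 0 events are observed in n independent trials,
# the 95% upper confidence bound on the per-trial event probability
# is approximately 3 / n.

def upper_bound_95(n_trials: int, events_observed: int = 0) -> float:
    """Approximate 95% upper confidence bound on the event rate
    when zero events have been observed in n_trials."""
    if events_observed != 0:
        raise ValueError("rule of three applies only to zero observed events")
    return 3.0 / n_trials

# Hypothetical example: ~500,000 flights with zero fatal incidents.
n_flights = 500_000
bound = upper_bound_95(n_flights)
print(f"95% upper bound on fatal-incident rate: {bound:.2e} per flight")
# A perfect record still leaves the true rate anywhere below ~6e-6
# per flight - far above typical aviation targets for catastrophic
# failure conditions (on the order of 1e-9 per flight hour).
```

The point of the sketch: a clean record over a finite exposure is weak evidence, which is exactly why process scrutiny cannot be replaced by outcome counting.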

adamjcook,

It really is a broader problem with regulators globally.

The NHTSA collects (unaudited, effectively voluntary) "data" from automakers on automated driving system incidents that they happen to know about...

Testing "data" is viewed as a substitute for independent scrutiny of automaker validation processes...

Everyone is collecting data, but missing the big picture.

It is simply awaiting catastrophic, avoidable injury and death instead of proactively preventing it.

adamjcook,

Lastly, it should be noted that what Tesla has published here is not really "data"... it is more like conclusions, which cannot be quantifiably scrutinized.

Broadly, unaudited safety numbers provided by manufacturers should obviously not be trusted.

Human lives are on the line and that should demand a much higher bar.

Boeing, after the first 737 MAX fatal incident, adamantly insisted that the aircraft was safe.

Then a second fatal incident occurred with nearly identical root causes...

CrackedWindscreen,
@CrackedWindscreen@mastodon.online avatar

@adamjcook don’t get me started on NCAP. They’ve admitted they ‘didn’t expect manufacturers to game the ADAS tests’. Which shows how incompetent they are.

adamjcook,

@CrackedWindscreen Yup.

NCAP programs are deeply problematic for multiple reasons.

It is essentially used as cover for what should be regulatory rigor, which is totally unacceptable.

They are not rigorous, not exhaustive and, arguably, not independent assessments.

They also completely fall apart now that Over-The-Air updates are in the picture.

adamjcook,

@CrackedWindscreen No NCAP program I am aware of even incorporates automated driving system assessments in their core ratings… which is simply absurd in 2023.

There is no sense in assessing active safety features if their characteristics change when an automated driving mode is active.

CrackedWindscreen,
@CrackedWindscreen@mastodon.online avatar

@adamjcook They do. That's caused a huge ruckus over here. Cars are downgraded stars if they don't have the latest "safety systems" such as LKAS, AEB and more fitted. Cars that got 5 stars a few years ago look to be unsafe now as they get 1-3 stars when not fitted with all the tech (that doesn't work but that's by the by). However, structurally (which is all NCAP used to worry about) they are the same and just as strong and safe.

Here's something on what they look for. https://www.euroncap.com/en/vehicle-safety/the-ratings-explained/safety-assist/

adamjcook,

@CrackedWindscreen EuroNCAP assesses "active safety features" (as you mentioned, LKAS, AEB and so on).

To my knowledge, EuroNCAP does not incorporate assessments of automated driving systems (i.e. Highway/Enhanced Autopilot, FSD Beta and so on).

But assessing active safety features is moot if, when, say, FSD Beta is activated, the characteristics of those features change because FSD Beta requires "enhanced" object detection and event response capabilities.

That is what I meant.

CrackedWindscreen,
@CrackedWindscreen@mastodon.online avatar

@adamjcook Ah, sorry. I gotcha. Got caught up in the terminology. Yes, you are right, they look at aspects of such systems, individually, but not an overall system and certainly not from any specific backend perspective.

adamjcook,

@CrackedWindscreen "Funnily" enough, it is all moot when an Over-The-Air update can disqualify any assessment made at any previous time.

It just all goes back to fully relying on "the word" of the automaker and not actually independently scrutinizing whether or not the automaker has appropriate internal processes at all.

NCAP covers all of that up with a shallow "star rating" and the public eats it up (understandably so).

_dm,

deleted_by_author

    adamjcook,

    @_dm It is higher level than a statistical analysis, higher level than any particular engineered system.

    The point is that where a robust systems safety lifecycle does not exist, the downstream safety dynamics are purely speculative to everyone at any given time.

    There was literally no effort made to avoid unhandled system failure.

    A roll of the dice with human lives.

    That is what Tesla is doing.

    That is what Boeing did.

    That is how they are the same.

    _dm,

    deleted_by_author

    adamjcook,

    @_dm It is more that statistical analysis of endpoint safety is always (or should always be) a secondary concern.

    It is dangerous to breathe a sigh of relief based on some, let's say, extremely low incident rate if there was never an appropriate validation process in place during development and continuously after delivery.

    It is just a ticking time bomb of avoidable death and injury.

    I believe that is another way of saying what you wrote.

    adamjcook,

    @_dm Additionally, specific to collecting roadway safety data, there are more practical issues... which I outlined in a Reddit post some time ago in detail: https://www.reddit.com/r/RealTesla/comments/xqauju/the_no_news_is_good_news_fallacy_of_fsd_beta/?utm_source=share&utm_medium=web2x&context=3

    Public roadways consist of a myriad of interacting components, so Tesla's data, even if we want to assume it is being published in Good Faith (*), is at least one-half incomplete anyways.

    (*) Which should never be assumed, for any automaker.

    _dm,

    deleted_by_author

    adamjcook,

    @_dm I think something similar exists in safety-critical systems.

    At some point, during systems validation, there has to be a discussion of numbers, of statistics, of risk.

    Potential systems failures are identified and categorized. Margins of safety are established. And the system is exhaustively exercised under a controlled process to evaluate it.

    This process is continuous (never ends) in all safety-critical systems, as no system can ever be "perfectly safe" at any time.
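The identify-categorize-establish-margins steps above are often formalized with techniques like FMEA (Failure Mode and Effects Analysis). A minimal sketch, with hypothetical failure modes and a hypothetical action threshold, of how identified failures might be ranked by risk priority:

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    severity: int    # 1 (negligible) .. 10 (catastrophic)
    occurrence: int  # 1 (remote) .. 10 (frequent)
    detection: int   # 1 (certain to detect) .. 10 (undetectable)

    @property
    def rpn(self) -> int:
        # Risk Priority Number, as used in classical FMEA.
        return self.severity * self.occurrence * self.detection

# Hypothetical failure modes for an automated driving feature.
modes = [
    FailureMode("phantom braking", severity=7, occurrence=5, detection=4),
    FailureMode("missed stationary object", severity=10, occurrence=3, detection=8),
    FailureMode("lane marking misread", severity=6, occurrence=6, detection=3),
]

# Hypothetical threshold: anything above this RPN requires mitigation
# before fielding, and re-evaluation afterwards.
ACTION_THRESHOLD = 150

for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    flag = "MITIGATE" if m.rpn > ACTION_THRESHOLD else "monitor"
    print(f"{m.name:28s} RPN={m.rpn:3d} -> {flag}")
```

The scoring itself matters less than the discipline it enforces: every identified failure gets an explicit, revisitable risk judgment rather than an ad-hoc one.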

    adamjcook,

    @_dm Maintaining a systems safety lifecycle is nothing more than a Good Faith acknowledgement that unhandled failure will always exist (even in the best of internal processes) and that there needs to be an efficient "feedback loop" of re-evaluating unhandled failure and determining corrective actions, if any.

    It is indeed very much organic.
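The "feedback loop" described above can be sketched as a hazard log that never closes out for good (all names here are hypothetical illustrations, not any real tracking system):

```python
# Minimal sketch of a systems safety feedback loop: field failures
# reopen hazards, and previously unseen failures enter the log late.

class HazardLog:
    def __init__(self):
        self._hazards = {}  # name -> status ("open" | "mitigated")

    def identify(self, name: str):
        self._hazards[name] = "open"

    def mitigate(self, name: str):
        self._hazards[name] = "mitigated"

    def field_report(self, name: str):
        # New evidence of unhandled failure: even a "mitigated" hazard
        # goes back to "open" for re-evaluation, and a previously
        # unknown failure enters the log for the first time here.
        self._hazards[name] = "open"

    def open_hazards(self):
        return [n for n, s in self._hazards.items() if s == "open"]

log = HazardLog()
log.identify("unexpected lane departure")
log.mitigate("unexpected lane departure")
log.field_report("unexpected lane departure")   # a seen failure recurs
log.field_report("sensor blinding at sunset")   # an unseen failure surfaces
print(log.open_hazards())
# Both hazards are open again - the loop never terminates.
```

The state machine is trivial on purpose: the safety value is in the commitment that no hazard is ever permanently "done", not in any clever code.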
