J12t, to random
@J12t@social.coop avatar

At this point Elon Musk is not just an enabler, but a leading instigator.

“Elon Musk Calls to Imprison Therapists Helping Trans Kids.”

https://newrepublic.com/post/173201/elon-musk-calls-imprison-therapists-helping-trans-kids

adamjcook,

@peterlhansen @J12t Whatever Musk does, he needs to stay the hell away from safety-critical systems or from commenting on them.

Even when Musk was largely talking about Tesla back in the day, it was in large part to exaggerate the capabilities of Autopilot and FSD, which has negatively impacted roadway and public safety.

And we can see the recent public and environmental safety issues that have resulted from Musk's decision to re-insert himself into day-to-day operations.

adamjcook, to random

Oh memories.

Taking a break from Musk's Hate Train on the Hellsite to recall this series of Tweets from a few years ago.

While under-appreciated then and now, the Tweet thread by Musk posted below contains an extremely damning admission, and it displays the considerable blind spot associated with remotely updating safety-critical systems without oversight.

Musk has no clue what he admitted to here, but systems safety experts do.

adamjcook,

@indw Not about that specifically.

My thread is about how this series of Tweets from Musk reveals quite a bit about Tesla's internal engineering processes and how troubling those processes undoubtedly are.

adamjcook, to random

I have to say, it has really been a disappointing time under the Biden Administration for auto safety regulation - after having so much promise initially.

I do not know what the issues are here, but President Biden should ask Secretary Buttigieg for his resignation.

Sure, former Secretary Elaine Chao was ideologically worse than Buttigieg, but they are both virtually identical in ineffectiveness.

That is unacceptable.

Sad to say, but here we are.

adamjcook,

If I see Secretary Buttigieg do another interview where he pontificates with the press about the wisdom of Tesla's product name... I might just lose it.

We are far beyond that now in terms of outsized vehicle design dangers - and far beyond just Tesla's wrongdoings (although Tesla's wrongdoings do remain somewhat unique and extreme).

Secretary Buttigieg is still putzing around on a field that is nearly a decade old at this point.

adamjcook, to TeslaMotors

This is a lie. Musk is lying here.

At no time are Tesla vehicles capable of “driving themselves” - they are only capable of the illusion that they are, which Musk exploits.

This is a common Musk Lie, made in the interest of selling Tesla vehicles and pricey add-ons.

People have died under this lie and will continue to do so, completely avoidably.

Let’s explore the underlying foundations of this lie and why it is so dangerous.

🧵

adamjcook,

Tesla hand-waves this fact as well on their official Autopilot product website, as shown below.

That “relaxation”, that “workload reduction” you might feel when using Autopilot or FSD?

Guess what that really is?

Complacency.

Deadly complacency.

And it strikes when you least expect it, when the vehicle suddenly encounters a failure and it takes you an additional second or three to regain situational and operational awareness.
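To put that "second or three" in concrete terms, here is a minimal back-of-envelope sketch. The speeds and delays are hypothetical illustration values, not measurements from any crash data.

# Back-of-envelope: distance covered while a driver regains awareness
# after an unexpected system failure. All numbers are hypothetical.

MPH_TO_MPS = 0.44704  # metres per second per mile per hour

def takeover_distance_m(speed_mph: float, delay_s: float) -> float:
    """Distance travelled (in metres) during the takeover delay."""
    return speed_mph * MPH_TO_MPS * delay_s

for speed in (45, 70):            # arterial vs. highway speed
    for delay in (1.0, 3.0):      # "a second or three"
        d = takeover_distance_m(speed, delay)
        print(f"{speed} mph, {delay:.0f} s delay -> {d:.0f} m travelled")

At 70 mph, even the optimistic one-second end of that range is roughly 30 metres travelled before anyone is effectively back in control.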

Tesla washes its hands of that though.

adamjcook,

Now, let’s look at the underlying part.

Autopilot- and FSD-equipped vehicles are Level 2-capable vehicles - with the exact same limitations as many other vehicles on the market today.

Namely, the key limitation is that the human driver must remain the fallback, at all times and under all conditions, for the entire dynamic driving task and for any vehicle failures.

Effectively, that means that the human has the exact same control responsibilities between the two vehicles shown below.
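To make the fallback point concrete, here is a minimal sketch of the SAE J3016 levels, paraphrased and simplified for illustration - it is not the normative standard text.

# Simplified paraphrase of the SAE J3016 driving-automation levels:
# who is the fallback when the feature fails or leaves its limits.
# Illustrative only - not the normative standard wording.

FALLBACK_PARTY = {
    0: "human driver",      # no automation
    1: "human driver",      # driver assistance (steering OR speed)
    2: "human driver",      # partial automation (steering AND speed)
    3: "human driver",      # conditional automation: must take over on request
    4: "automated system",  # high automation, within a defined ODD
    5: "automated system",  # full automation
}

def must_supervise(level: int) -> bool:
    """True when the human must monitor and act as the fallback at all times."""
    return level <= 2

print(must_supervise(2))  # True - a Level 2 vehicle never relieves the driver

Whatever the marketing says, the Level 2 row is the one that governs both vehicles in that comparison.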

adamjcook,

First off, let us look at Tesla’s “official” statement, from their website, as to the true limitations of the Autopilot and FSD products.

Hmm. That does not sound like Tesla itself is confident that their vehicles can drive themselves at any time. 🤔

Because the vehicles cannot.

But Musk and Tesla play this dangerous, deceptive mind game with the public - and it is indeed a mind game.

adamjcook,

Hey Secretary Buttigieg,

If your investigators are still wondering, after years of twiddling their thumbs, why Autopilot-active Tesla vehicles keep slamming into the back of roadside emergency vehicles… I just provided you with the answer.

Free of charge.

https://jalopnik.com/tesla-autopilot-may-be-responsible-for-another-fatal-cr-1850204165

adamjcook, to ai

So, a couple of things here.

Firstly, we seem to be in some sort of arms race in "AI safety statements" nowadays.

Or more like a race to the bottom.

The continued perpetuation of the actually harmful AI Hype Train aside... what is even the point of this?

They did not even bother to write a letter.

Secondly, it is very hard to take this statement seriously given that personalities like Sam Harris, Chris Anderson and Grimes have signed it.

https://www.safe.ai/statement-on-ai-risk

adamjcook,

@CrackedWindscreen Indeed.

And, in that vein, I will say it again... where were these same individuals when Tesla was using AI as a masquerade to wholesale and sloppily experiment on the public via its Autopilot and FSD programs?

I see that several signatories from DeepMind are on here...

Hmm. Interesting.

I saw many of their DeepMind colleagues clapping like seals during Tesla's "AI Days" - farcical events that presented zero safety cases.

Zero introspection.

arstechnica, to random
@arstechnica@mastodon.social avatar

Angry Tesla customers sue firm over “grossly” exaggerated EV range

Three Tesla drivers launch class action, alleging fraud and false advertising.

https://arstechnica.com/tech-policy/2023/08/angry-tesla-customers-sue-firm-over-grossly-exaggerated-ev-range/?utm_brand=arstechnica&utm_social-type=owned&utm_source=mastodon&utm_medium=social

oliver_schafeld,
@oliver_schafeld@mastodon.online avatar

Imagine they'd find out that Autopilot and FSD were "exaggerated" as well.

Autopilot being just the overture to FSD, or whatever...

MissingThePt, to random
@MissingThePt@mastodon.social avatar

Elon Musk's brain-computer interface start-up Neuralink has begun recruiting people for its first human trial; discarded bodies will be used to train Tesla Autopilot.

adamjcook, (edited ) to tesla

This is a good piece, but the one small nit that I have again is bringing in a "data argument".

I know that it is tempting for an audience that is generally not at all experienced in safety-critical systems, but I would highly recommend resisting it.

In fact, there is no telling how much higher the real crash rate could be.

It is our obligation to assume that the "crash rate" of Tesla's FSD product is unquantifiably higher.

The sky is the limit.
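To show why a naive "data argument" falls apart, here is a minimal sketch with made-up numbers. It illustrates how a blended crashes-per-mile figure can flatter a driver-assistance system purely because of where its miles are accumulated - the kind of confounding that self-selected figures never control for.

# Hypothetical illustration (all numbers invented): a raw crashes-per-mile
# comparison says nothing by itself. If driver-assist miles are concentrated
# on highways, where crash rates are low for everyone, the blended rate can
# flatter the system even if it is worse on every road type.

# (crashes per million miles, millions of miles driven)
human  = {"highway": (1.0, 200.0), "city": (4.0, 200.0)}
assist = {"highway": (1.5,  90.0), "city": (6.0,  10.0)}  # worse in BOTH segments

def blended_rate(segments):
    crashes = sum(rate * miles for rate, miles in segments.values())
    miles = sum(miles for _, miles in segments.values())
    return crashes / miles

print(f"human  blended: {blended_rate(human):.2f} crashes per million miles")
print(f"assist blended: {blended_rate(assist):.2f} crashes per million miles")
# The assist system's blended rate comes out lower than the human's 2.50,
# despite being worse in both segments.

Which is exactly why no defensible safety claim, in either direction, can be squeezed out of such numbers.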

https://prospect.org/justice/06-13-2023-elon-musk-tesla-self-driving-bloodbath/

🧵👇

adamjcook,

And here is one other thing.

The systems-level distinction between Autopilot and FSD is nil.

Why?

Because both products contain the same systems-level safety flaws, as noted above in part.

"The software differences" between the two products do not matter if a safety discussion is to be had.

adamjcook, to tesla

🧵Here is my take on this development… this case should have never made it to judges and juries.

It does make it to judges and juries because of the deliberate disinterest of the US’s theoretical auto safety regulator - the NHTSA.

We are dealing with extraordinarily complex and highly-opaque system engineering issues here.

https://www.bloomberg.com/news/articles/2023-08-17/tesla-failed-to-fix-autopilot-after-fatal-crash-engineers-say

adamjcook,

Complex issues will surface in the courtroom - things like mode confusion and system internals - that, frankly, judges and juries are going to be strained to understand and appreciate.
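For anyone who has not run into the term, here is a minimal, deliberately toy sketch of mode confusion: the actual control mode changes while the driver's mental model lags behind it. It is hypothetical and simplified, not a model of Tesla's software.

# Toy illustration of mode confusion (hypothetical, not Tesla's logic):
# the vehicle's actual control mode changes, but nothing forces the
# driver's mental model to change with it.

from dataclasses import dataclass

@dataclass
class ControlState:
    actual_mode: str       # what the software is really doing
    driver_believes: str   # what the human thinks it is doing

s = ControlState(actual_mode="assist_engaged", driver_believes="assist_engaged")

# A silent or easily-missed disengagement (say, a brief chime while the
# driver glances away) changes only the actual mode.
s.actual_mode = "manual"

if s.actual_mode != s.driver_believes:
    print("Mode confusion: nobody is effectively steering, and the driver does not know it yet.")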

But the NHTSA, in its infinite wisdom, is just tickled pink to have private litigation deal with Tesla’s vast wrongdoings, because the agency can continue to do nothing - which it does well.

adamjcook,

The NHTSA should have arrested Tesla’s wrongdoings in their infancy, nearly a decade ago.

Now people are dead and a brutally-complex engineered system is just wholesale flopped on the back of a jury.

US auto safety regulation, if one wants to pretend it ever existed, is a joke.

End 🧵

BruceMirken, to random
@BruceMirken@mas.to avatar
adamjcook,

@BruceMirken This is the same Bad Faith shit that those of us in the systems safety community have been dealing with on the Tesla and Musk wrongdoings for years.

I am not going to lecture Dr. Hotez on how to deal with this... but I can say this... the strategy on Musk's side, and of his sycophants, is to passionately pretend like they want to have a Good Faith debate... but they are just looking for "gotcha soundbites".

chrisoffner3d, to llm

“I think we will get the hallucination problem to a much, much better place,” Altman said. “I think it will take us a year and a half, two years. Something like that. But at that point we won’t still talk about these. There’s a balance between creativity and perfect accuracy, and the model will need to learn when you want one or the other.”

What is Altman talking about? An LLM has no notion of truth or accuracy, so you can't just dial up some "truth coefficient."
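That point is easy to make concrete with a minimal sampling sketch (invented scores, not any real model): the only knob of this kind is a temperature applied to a probability distribution over tokens, and it trades determinism for variety without ever consulting reality.

import math, random

# Minimal next-token sampling sketch. A language model scores candidate
# tokens; temperature rescales the scores before sampling. Nothing here
# checks facts, so there is no "truth coefficient" to turn up.

def sample(logits, temperature=1.0):
    scaled = {tok: score / temperature for tok, score in logits.items()}
    z = sum(math.exp(v) for v in scaled.values())
    probs = {tok: math.exp(v) / z for tok, v in scaled.items()}
    r, acc = random.random(), 0.0
    for tok, p in probs.items():
        acc += p
        if r <= acc:
            return tok
    return tok  # numerical safety net

# Invented scores: the model can be confidently wrong. Lowering the
# temperature just makes it repeat its top guess more often.
logits = {"1889": 2.0, "1989": 1.5, "1789": 0.5}
print(sample(logits, temperature=0.2))  # almost always "1889", true or not
print(sample(logits, temperature=1.5))  # more varied, not more truthful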

https://fortune.com/2023/08/01/can-ai-chatgpt-hallucinations-be-fixed-experts-doubt-altman-openai/

adamjcook,

@kentindell @chrisoffner3d In fact, I see quite a few similarities between what Chris has mentioned in the post cited below and the extremely dangerous assumptions that Tesla has been using to underpin their FSD program - namely, the so-called "generalized self-driving" (no defined Operational Design Domain).

The Tesla Team has always embraced a very primitive safety strategy that they try to sell as "validation" - if I am being generous.
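To make the ODD point concrete: a defined Operational Design Domain acts as an explicit precondition the system must check before it is allowed to operate, and "generalized self-driving" amounts to deleting that check. Here is a minimal sketch with hypothetical conditions and limits - not Tesla's or anyone else's actual logic.

# Minimal illustration of an Operational Design Domain (ODD) gate.
# Conditions and limits are hypothetical. A bounded system refuses to
# engage outside the envelope it was validated for; a "generalized"
# system has no such envelope at all.

from dataclasses import dataclass

@dataclass
class Conditions:
    road_type: str   # e.g. "divided_highway", "urban_street"
    weather: str     # e.g. "clear", "heavy_rain", "snow"
    daylight: bool

APPROVED_ODD = {
    "road_types": {"divided_highway"},
    "weather": {"clear", "light_rain"},
    "daylight_only": True,
}

def may_engage(c: Conditions) -> bool:
    """True only when current conditions fall inside the validated ODD."""
    return (c.road_type in APPROVED_ODD["road_types"]
            and c.weather in APPROVED_ODD["weather"]
            and (c.daylight or not APPROVED_ODD["daylight_only"]))

print(may_engage(Conditions("divided_highway", "clear", True)))    # True
print(may_engage(Conditions("urban_street", "heavy_rain", False))) # False
# "Generalized self-driving" is, in effect, may_engage() hard-coded to True.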

https://sigmoid.social/@chrisoffner3d/110846996819755228

gabrielesvelto, to random
@gabrielesvelto@fosstodon.org avatar

The piece that the New York Times ran about the letter from tech executives and friends about the existential risks posed by AI has so much bull in it that it might have been written by a chatbot as well.

Seriously stuff like: "They say the technology has shown signs of advanced abilities and understanding [...]"

Sure, take the points made by an executive at face value, it's not like he's trying to hype the thing into the stratosphere and pump up his shares.

adamjcook,

@gabrielesvelto The media, with some small, notable exceptions, has embraced this same pattern for years with respect to Tesla's Autopilot and FSD programs (broadly, advancing Tesla's hype and lies without analysis)... and it has resulted in completely avoidable death and injury.

There was never a reckoning on that and, as a result, we are now at the next stage.

gtbarry, to tesla
@gtbarry@mastodon.social avatar

NHTSA Gives Tesla Two Weeks To Show Its Work On Autopilot And FSD

In a letter dated July 3, NHTSA asks Tesla to describe all changes to the systems in the “design, material composition, manufacture, quality control, supply, function, or installation of the subject system, from the start of production to date.”

Tesla must respond to the request by July 19.

https://jalopnik.com/nhtsa-gives-tesla-two-weeks-to-show-its-work-on-autopil-1850609577

itnewsbot, to Electricvehicles

Tesla’s misleading driving range claims trigger DOJ probe

https://arstechnica.com/?p=1978067 #departmentofjustice #electricvehicles #fullself-driving #evdrivingrange #autopilot #elonmusk #policy #tesla

itnewsbot, to cars

Tesla Autopilot not responsible for 2019 fatal crash, jury says

https://arstechnica.com/?p=1980233

deborahh, to tesla

Tesla's autopilot seems to be only semi-reliable. Far from foolproof, it relies on drivers to be vigilant in a new way.
I sometimes think: as a pedestrian and user of mass transit, a failure could impact me, too …

"The final 11 seconds of a fatal Tesla Autopilot crash

A reconstruction of the wreck shows how human error and emerging technology can collide with deadly results"


[The Washington Post]
https://wapo.st/45i7naQ

dorzig, to random
@dorzig@mastodon.social avatar

You guys, an F35 just landed in my backyard and made a joke about Tesla’s autopilot. Should I post to Twitter or no?

t3n, to tesla German
@t3n@t3n.social avatar

A stuffed animal gets a Tesla to switch to Autopilot

Tesla's Full Self-Driving beta software has recently come under fire and is struggling with numerous setbacks.

In addition to collisions and federal investigations, there now appears to be a new incident that could shake confidence in the safety of the system.

https://t3n.de/news/youtuber-tesla-autopilot-stofftier-1568187/?utm_source=mastodon&utm_medium=referral

(Image: Midjourney)

br00t4c, to tesla
@br00t4c@mastodon.social avatar

Tesla Autopilot not responsible for 2019 fatal crash, jury says

https://arstechnica.com/?p=1980233

pallenberg, (edited ) to tesla
@pallenberg@mastodon.social avatar

A former Tesla employee, who said he was harassed, threatened and eventually fired after expressing safety concerns, leaked personnel records and data about the company’s Autopilot driver-assistance software, including thousands of accident reports.

The German Handelsblatt really did an amazing job in investigating this story!

https://www.nytimes.com/2023/11/10/business/tesla-whistleblower-elon-musk.html?smtyp=cur&smid=bsky-nytimes

itnewsbot, to medical

Tesla on Autopilot crashed into stopped truck during highway lane closure

https://arstechnica.com/?p=1950267
