"Five police officers are suing Tesla after being injured by a Model X that plowed into them while they were conducting a routine traffic stop. The incident took place on February 27, 2021, when an allegedly impaired driver over-relied on the Model X's 'Autopilot' system, which reportedly gave off 150 warnings to take control of the vehicle in a 34-minute time span." https://www.autoblog.com/2023/08/10/video-tesla-driver-plowed-into-police-car-despite-150-warnings-from-autopilot/
“I think we will get the hallucination problem to a much, much better place,” Altman said. “I think it will take us a year and a half, two years. Something like that. But at that point we won’t still talk about these. There’s a balance between creativity and perfect accuracy, and the model will need to learn when you want one or the other.”
What is Altman talking about? An #LLM has no notion of truth or accuracy, so you can't just dial up some "truth coefficient."
@kentindell @chrisoffner3d In fact, I see quite a few similarities between what Chris has mentioned in the post cited below and the extremely dangerous assumptions that #Tesla has been using to underpin their #FSDBeta program - namely, the so-called "generalized self-driving" (no defined Operational Design Domain).
The Tesla #Autopilot Team has always embraced a very primitive safety strategy that they try to sell as "validation" - if I am being generous.
Tesla's Full Self-Driving Beta software has recently come under fire and has had to contend with numerous setbacks.
In addition to collisions and federal investigations, there now appears to be a new incident that could shake confidence in the system's safety.
#Autopilot and #FSDBeta-equipped vehicles are Level 2-capable vehicles - with the exact same limitations as many other vehicles on the market today.
Namely, the key limitation is that the human driver must remain the fallback for any dynamic driving task or vehicle failures at all times and under all conditions.
Effectively, that means that the human has the exact same control responsibilities between the two vehicles shown below.
#Tesla hand-waves this #HumanFactors fact as well on their official Autopilot product website, as shown below.
That “relaxation”, that “workload reduction” you might feel when using #Autopilot or #FSDBeta?
Guess what that really is?
Complacency.
Deadly complacency.
And it strikes when you least expect it, when the vehicle suddenly encounters a failure and it takes you an additional second or three to regain situational and operational awareness.
If you #NHTSA investigators are still wondering, after years of twiddling your thumbs, why #Autopilot-active #Tesla vehicles keep slamming into the back of roadside emergency vehicles… I just provided you with the answer.
#Musk’s completely unsupported and unhinged promises of future therapies, treatments, or prosthetic devices enabled by his #Neuralink firm and his #Tesla “#Optimus” humanoid robotics project are pretty damn disgusting.
A new, disgusting low.
Even relative to Tesla’s vast #Autopilot and #FSDBeta wrongdoings, which is saying something.
NHTSA Gives Tesla Two Weeks To Show Its Work On Autopilot And FSD
In a letter dated July 3, NHTSA asks Tesla to describe all changes to the systems in the “design, material composition, manufacture, quality control, supply, function, or installation of the subject system, from the start of production to date.”
I am not going to lecture Dr. Hotez on how to deal with this... but I can say this: the strategy on the #Musk side, and among his sycophants, is to passionately pretend that they want a good-faith debate... when they are really just looking for "gotcha" soundbites.