Mockrenocks,

Frankly, it speaks incredibly poorly to the NHTSA that this kind of behavior is allowed. “Beta testing” a machine learning driving assistance feature on active highways at 70+ miles an hour is a recipe for disaster. Calling it Full-Self Driving while also not having guardrails on its behavior is false advertising as well as just plain dangerous.

dragontamer,

NHTSA hasn’t had a permanent director in years.

I think NHTSA had a director for like… 2 or 3 months in 2022. But before that, it was blocked in the Senate. And before-before that, it was Elaine Chao’s Department of Transportation and she was incredibly anti-regulation.

Step 1 is that the citizens need to recognize what has happened to the federal apparatus. We’ve gutted our own government and safety regulators. Not just NHTSA, but also SEC, FTC, etc. etc. The anti-regulators / libertarians have the momentum with regards to laws in the past decade, and this is the natural result.

fosforus,

I like my Tesla but there’s no way I’ll be switching that thing on. They’re even calling it beta, what the fuck do people think that means?

megalodon,

FFS. He was testing a beta update at 73 miles per hour. Is he really expecting sympathy?

SomeRandomWords,

I thought all FSD updates were beta updates? Did I miss the announcement of FSD going GA and being stable?

If that’s the case, then yeah I probably wouldn’t test run a new update on the highway first. But I also have no idea if this issue happens at lower speeds as well.

megalodon,

Isn’t that the issue? He’s using something that’s still in beta on the highway.

SomeRandomWords,

Yes, 100%. Anyone is a fool to use Tesla “FSD Beta” pretty much anywhere. But Tesla markets it as totally safe to use anywhere and everywhere (but especially highways), so there’s a point where you have to stop calling everyone that owns a Tesla a fool and acknowledge that the common denominator is Tesla, not just the owners’ foolishness.

megalodon,

I didn’t call everyone that owns a Tesla a fool. I questioned whether someone who decides to risk their life to test a feature still in beta deserves sympathy.

SomeRandomWords,

My bad, I didn’t mean to insinuate that you’d call them a fool. I definitely would though.

Maybe just the recent buyers though.

spezz,

Maybe it shouldn’t be released for real world use with such major bugs then. Don’t give me the crap that iTs DiFfErEnT because Tesla is a “technology company” either. It’s a car; safety features on it should work damn near 100% of the time before it’s released.

megalodon,

What crap am I giving? I’m just saying it’s a stupid idea to beta test self driving technology on the highway.

hackitfast,
@hackitfast@lemmy.world avatar

It’s the same reason we don’t take drugs that haven’t been tested yet. You know, not in lab rats.

Google treats its users as beta testers all the time. Difference is a phone won’t kill me when it crashes and reboots.

lucidinferno,

“Some of you may die, but it’s a risk I’m willing to take.” - Lord Farquaad and Musk

Blum0108,

Musquaad

bezerker03,

I mean… They opted into a beta. Beta means this may happen.

BCat70,

I guarantee that the other drivers on that road didn’t opt for a “beta”.

ours,

This isn’t just some email web app that may have a few bugs, it’s putting lives at risk on the road. They shouldn’t be able to just label it a beta, overpromise its capabilities, and neglect any responsibility.

grue,

Even if those dipshits “opted in,” the rest of us sharing the road sure as Hell didn’t!

abcxyz,

I just can’t understand how regulators all over the world allow these things on the road. How the fuck do you allow the release of potentially deadly (for everyone involved, not just for the user) software en masse for the public to beta test for you… This is not Diablo IV…

foo,

Beta only means “buggy piece of shit” to people who use software, and mostly gamers at that. In industries where prototypes can kill people, a “beta” product is one that is safe for the intended use. For example, if you invented a new way to do internal scans of people, before you could even test it on humans you would have done extensive testing on animals to know what works, what doesn’t, and what gives them cancer, and done the modelling to have a strong understanding of whether it is safe for humans.

Nobody would tolerate a scanner that gave people cancer, oops

FlyingSquid,
@FlyingSquid@lemmy.world avatar

I’m not especially sympathetic to the Tesla drivers this might kill.

I’m worried about everyone else.

PsychedSy,

I consider the suicide attempts a feature. I’ll test for you, Tesla.

Asudox,
@Asudox@lemmy.world avatar

It shouldn’t even have been released for normal people to use in daily life, on real roads full of other cars. This poses a big risk to life if you ask me. I hope countries start banning this feature soon, otherwise many more deaths will happen, and Elon will somehow get away with them. What’s so hard about driving a real car manually? Did you all become fatass lazy people that don’t even have the willpower to drive a car? Ridiculous. ML is experimental, and for a machine it’s amazing, but it isn’t as good as a human YET, thus causing life-threatening accidents. FSD is literally still in beta, and people are driving full speed on roads with this beta software.

dufr,

It can’t be used in the EU. It would need to pass a review; Elon has claimed they are close to getting it through, but Elon says a lot of things.

tony,

In its current state it has basically no chance IMO.

If they’d concentrated on making AP/highway driving smarter first they might have got that through… there are already rules for that… but cities? I’d love to see the autonomous car that could drive through London or Manchester.

echodot,

Self-driving cars are actually only legal in a few countries. And those countries have tests.

It’s only the United States that just lets anyone do whatever on earth they want, even if it’s insanely dangerous.

Everywhere else any car company that’s espousing self-driving tech would actually have to prove that it is safe, and only a few companies have managed to do this and even then the cars are limited to predefined areas where they are sure they’re not going to come across difficult situations.

Ocelot, (edited )

Humans did not evolve to drive cars. ML did. It drives consistently with no distractions. It is never tired, drunk, or experiences road rage. It has super human reaction time and can see in a full 360 degrees. It is not about being a lazy fatass it is about safety. Hundreds of people in the US were killed in car accidents just today, and none of them were from self driving cars.

Also, please provide an example of a life-threatening accident caused by FSD.

Chocrates,

Self driving is not there, and it may never get there, but you are right. We can save so many lives if we get this right.

I don’t know if Musk is responsible enough to be the one to get us there, though.

Zummy,

The article listed two life-threatening near-accidents that were only prevented because the person behind the wheel took over and disengaged FSD. Read the article and then comment.

CmdrShepard,

Hilarious telling them to read the article first when you couldn’t even be bothered to read their question before replying.

Zummy,

I read it just fine. He asked for an example of a life-threatening accident caused by Full Self Driving. I noted that two examples were listed in the article. The ONLY difference was that the driver prevented the accidents by being aware. The FSD was going to cause accidents without intervention. I guess in your world people are supposed to do nothing to avoid a major accident. Hilarious that you love FSD so much that you’re willing to defend a billionaire who wouldn’t piss on you if you were on fire. Billionaires are not your friends. FSD is a BETA feature that doesn’t work properly. Take your love somewhere else and away from my comment, because you read it, didn’t understand it, and fired off a reply stating I didn’t do something I did because you can’t understand me. The next time you want to have a discussion, come prepared, or don’t come at all!

CmdrShepard,

Ah the “only difference” in your two examples of life-threatening accidents occurring is that no accident occurred in either example? That’s quite the difference if you ask me… this isn’t a level 4 or 5 system so driver intervention is required. These systems can’t improve without real world testing, meanwhile a hundred people die on the road every single day. I guess you’d prefer more people die on the road from drunk or distracted drivers than have manufacturers roll out solutions that aren’t absolutely 100% perfect even if they’re more perfect than human drivers most of the time.

Your obsession with Musk is clouding your judgment. I made no mention of him, nor do I like or defend him. This tech wasn’t built by Musk, so who gives a shit about him in this discussion?

Zummy,

I am not obsessed with Musk in any form, but the fact of the matter is, when you have FSD systems that fail to do the thing they are supposed to do, then maybe it’s not the best idea to roll them out to the entire world. Maybe it’s better to continue with more limited testing. You act as if all drunk/distracted driving will stop when FSD is used, and that simply isn’t the case. Many people still use gasoline powered cars and drink and drive even though it’s dangerous to do so. Furthermore, FSD will lead to more distracted driving, because people will assume that self driving means the car will take care of everything and there is no need to be vigilant.

The plain truth is that while FSD can be the future, rolling it out despite knowing that it isn’t ready is not the solution; it’s irresponsible and will cause harm. The near-accidents that you aren’t concerned with would most likely have killed the driver and probably other people too. Our difference of opinion here is that you believe it’s okay if people die as long as the testing shows there is a chance they won’t die in the future, and I think if anyone dies it’s too much. The feature clearly isn’t ready for prime time and needs more limited real world testing, but the fact of the matter is testing doesn’t bring in money.

Your inability to even consider that a worldwide rollout might not be the best idea right now, since the testing shows the car isn’t ready, shows that you really aren’t arguing in good faith. You have chosen the position that FSD is good and ready even when confronted with articles like the above showing it isn’t. I would wager that a lot of people want the era of FSD; they just want it when it works. Keep the rollout more limited and do further testing. When mistakes happen, take the time to figure out why and how they can be prevented in the future. You argue testing is needed, but are in favor of a rollout now even though we need lots more limited real world testing. Both can’t be true. Time to think about what you really want, because I don’t think you know… And accusing any person who doesn’t want a complete rollout of FSD today of having a bias against Musk shows that.

Ocelot,

Teslas have 360 degree dashcams that are recording all the time. Why didn’t they upload the video? I promise you they have it.

Such a video would go viral pretty easily. It would light a fire under tesla engineering to fix such a dangerous and life threatening situation. Where is it? Why is there never any footage attached to these articles? Why can’t I find a video ANYWHERE of such a thing? Why can nobody in this thread bashing the tech over and over produce any justification for their fear?

If I were tesla and I wanted to cover up dangers of FSD trying to kill people I wouldn’t give everyone a constantly running dashcam. It would really make them look bad.

Could it possibly be, just maybe, that the video disagrees with the “journalist” opinion that it was performing dangerously? Could it be that an article that says “Tesla FSD performs admirably, swerves to avoid obstacle that would have caused a blowout” might not get nearly as many clicks and ad revenue? Maybe?

FSD is aware of where barriers and medians are. If it needs to swerve to avoid an obstacle it will go in whatever direction is safest. Sometimes that means towards a barrier. Sometimes the driver panicking and disengaging and taking over interrupts the maneuver and causes danger that wasn’t otherwise present. We will never know what actually happened because there is no evidence. Evidence that I promise you exists but for whatever reason was omitted.

If a cop said something outrageous and dangerous happened to them and they say they are completely clear of fault and wrongdoing, would it not be reasonable to want to see the bodycam footage? If for whatever reason the police department says “we don’t have it” “its corrupted” or whatever other excuse would that not raise eyebrows? The same situation applies here.

There are plenty of youtube channels out there like dirtytesla, whole mars catalog, AI Driver, Chuck Cook, and many others that show and even livestream FSD. None of them have been in an accident, even in very early releases of the beta software. These people are comfortable with the beta and often don’t take over control of the vehicle under any circumstances, even in their torture test scenario.

Is it at all possible, just maybe, that FSD isn’t as dangerous as you might think? Fear is often a result of ignorance.

I am extremely open to changing my mind here just show me some convincing evidence. Every tesla is recording all the time so it should be really easy to find some, no?

I’m sure I’m just a Tesla shill or fanboy, whatever. The truth is I’m just looking for facts. I would like to know why people feel this way and are so afraid of new technology despite overwhelming evidence that it is saving lives.

wizardbeard,
@wizardbeard@lemmy.dbzer0.com avatar

Wow that’s sure a lot of text for someone that didn’t read the article.

The author states that despite having storage plugged in, he was not given the option to save a recording.

const_void,

Lol who would trust their life to Elon Musk? 🤣

sdoorex,

Well, this article is written by FredTesla, who used to mod the TeslaMotors subreddit. Not only did he drink the koolaid, he brewed the damn stuff.

silvercove,

idiots and losers

sdf05,

This is like that show “Upload”; the guy literally gets killed by a car

FlyingSquid,
@FlyingSquid@lemmy.world avatar

That was a really good show.

jabjoe,
@jabjoe@feddit.uk avatar

Well hold on there, he survived the crash, and would probably have been ok. It was the upload that killed him.

sdf05,

Yeah, my bad 🤣 I meant the car technically endangered him to not live longer 😔

III,

You should finish watching that first episode before making such bold statements.

Ocelot,

I mean I think its still a valid point. The car in the show was sabotaged, and that is definitely something that might be a thing once all cars self-drive. Especially once they remove controls like steering wheels.

There hasn’t been a Tesla FSD hack yet, but it would take spoofing a software update (and spoofing the authentication and certs, etc.)… The attacker would need access to a pretty massive supercomputer to make their own custom self-driving software, and today getting the certs and everything right is next to impossible… but even then it’s only next to impossible, not impossible.

8ender,

Don’t even need sabotage. You already share the road with cars that someone repaired under a tree with the cheapest parts they could find.

reddithalation,

oh look more anti right to repair sentiment.

no, cars repaired by people other than the manufacturer won’t kill you

8ender,

I’ve been repairing cars for over 15 years. There’s a massive spectrum of quality in almost any aftermarket replacement part. Literally the same part can range from $50 to $400, and the only difference is quality and durability.

Sometimes the cheap part is fine, sometimes they cause weird problems. Especially electrical parts.

reddithalation,

yeah sure, but that is unlikely to kill you or someone else, and diy repair is almost always good for the consumer

evatronic,

It may be difficult to spoof a certificate today, but tomorrow is a whole new day. To wit, OpenSSL has a pretty long history of serious vulnerabilities, despite being the best SSL library out there.

It is absolutely only a matter of time until the Tesla OTA functionality is compromised. There’s too many moving parts for it to not be.
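To make the point concrete, here’s a toy sketch of the gate an OTA update has to pass. This is not Tesla’s actual pipeline (which isn’t public), and a real system would use asymmetric signatures (e.g. Ed25519, with only the public key on the vehicle) rather than a shared secret; the HMAC here just shows the shape of the check, and that in the end one boolean stands between a payload and execution:

```python
import hashlib
import hmac
import os

def verify_update(payload: bytes, signature: bytes, key: bytes) -> bool:
    """Accept an update blob only if its MAC checks out."""
    expected = hmac.new(key, payload, hashlib.sha256).digest()
    # compare_digest avoids timing side channels in the comparison itself
    return hmac.compare_digest(expected, signature)

signing_key = os.urandom(32)            # stand-in for the vendor's signing secret
firmware = b"example-firmware-image"    # stand-in for an update blob
good_sig = hmac.new(signing_key, firmware, hashlib.sha256).digest()

print(verify_update(firmware, good_sig, signing_key))            # True
print(verify_update(b"tampered-image", good_sig, signing_key))   # False
```

The vulnerability history argument is about everything around that boolean: the library computing it, the code that parses the payload before the check, the key storage. Any of those failing makes the check itself irrelevant.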

Ocelot,

There are still a lot of other layers that would need to be compromised beyond the cert for such an attack to even be possible. Even so, I suspect when such an attack does happen it will probably be for stealing cars. Your car would just wake up in the middle of the night and drive itself somewhere to be cut up for parts. Less likely is any kind of safety issue, since it’s so easy to take over control of the car.

jabjoe,
@jabjoe@feddit.uk avatar

“Attack surface” is the term you want. Big software means big attack surface. So keep code lean for security as well as efficiency.

Ocelot,

Electrek has a long history of anti tesla clickbait. Take this with a grain of salt.

Teslas are factory equipped with a 360 degree dashcam yet we never see any footage of these alleged incidents.

DingoBilly,

Given your posts and rampant Tesla fanboyism, I honestly wouldn’t be surprised if you’re Elon himself just anxiously trying to save face.

Then again, Elon would just publicly spout misinformation about it all, so it probably isn’t. Still, it’s surprising that people are so obsessed with Tesla they can’t take the bad with the good.

Ocelot,

All I’m asking for is some evidence of the bad. Nobody can provide it. It really shouldn’t be that hard.

CaptainAniki,

deleted_by_author

    Ocelot, (edited )

    Please let me know where I stated anything inaccurate in the comment about the single incident that has been dug up in the 500k FSD cars and millions of miles traveled self-driving.

    Also, let’s please keep this civil and not name-call. I hate Elon as much as anyone else and he deserves pretty much all the hate he gets. However, it doesn’t change the facts. It’s not like he was responsible for writing even a single line of code in FSD, or even designed or built any of the cars himself.

    DingoBilly,

    You were provided evidence and disregarded it and made excuses for it. It’s hard to have a discussion if you just dismiss all the evidence.

    Think of it another way, you’re saying there’s absolutely no way that FSD has ever failed in its publicly available software, even with hundreds of thousands of cars on the road? Use a logic test on yourself and ask if that’s realistic.

    Kage520,

    Fsd makes a TON of mistakes. I’ve had the beta from the first public release. I don’t trust it to do anything more than lane holding and cruise control, with maybe some supervised lane changes. But it’s a beta. I understand that I am helping to test beta software.

    FSD in its current form should not be given to everyone. Tesla had it right when they gave it only to proven drivers (okay, it would have been better to test with paid employees, but I digress).

    FSD right now is like handing the keys to your 15 year old child and going to sleep in the back while they drive you home.

    CmdrShepard,

    Can you point to this evidence as I don’t see it anywhere?

    Also busting out a strawman argument one reply in to the discussion isn’t a good sign for the strength of your argument.

    DingoBilly,

    Look at the replies. I’m not going to sit and hand-pick them out for you, there’s plenty on there. Plenty of people either posting videos or stating first hand evidence of issues with FSD.

    Not sure where you see a strawman either. But whatever, if you aren’t seeing any evidence despite the many posts, and don’t see how impossible a perfect record is, then you won’t be convinced by any evidence regardless, or you’re just a troll.

    CmdrShepard,

    I have looked at the replies and there are only a couple links about Autopilot crashes from users who think this is the same as FSD when it isn’t.

    Your strawman is claiming that this user is saying FSD is perfect and never had a failure. Nobody is arguing that. You guys keep mentioning all the deaths related to FSD, yet nobody has been able to provide a single one as evidence.

    silvercove,

    Are you kidding me? Youtube is full of Tesla FSD/Autopilot doing batshit crazy things.

    Ocelot,

    so can you provide a link of an accident caused by FSD?

    naeemthm,

    Your posts here show you’re not interested in reality, but I’ll leave a link anyway

    motortrend.com/…/tesla-fsd-autopilot-crashes-inve…

    Excited to see your response about how this is all user error.

    Ocelot, (edited )

    I’m sure you’re just going to downvote this and move on without reading but I’m going to post it anyway for posterity.

    First, a little about me. I am a software engineer by trade with expertise in cloud and AI technologies. I have been an FSD beta tester since late 2020 with tens of thousands of incident-free miles logged on it.

    I’m familiar with all of these incidents. It’s great that they’re in chronological order; that will be important later.

    I need to set some context and history because it confuses many people when they refer to the capabilities of autopilot and FSD. Autopilot and FSD (Full Self-Driving) are not the same thing. FSD is a $12,000 option on top of any Tesla, and no Tesla built prior to 2016 has the hardware capability to run FSD.

    The second historical point is that FSD did not have any public release until mid-2022, with some waves of earlier releases going to the safest drivers starting in mid-2021. Prior to that it was exclusive to Tesla employees and select few trusted beta testers in specific areas. Any of the issues in this article prior to mid-2021 are completely irrelevant to the topic.

    Tesla’s Autopilot system is an LKAS (lane keep assist system). This is the same as is offered by Honda (Honda Sensing), Nissan (Pro Pilot Assist), Subaru, Cadillac, etc. Its capabilities are limited to keeping you in your lane (via a front-facing camera) and maintaining distance to the car in front of you (via radar, or cameras in later models). It does not automatically change lanes. It does not navigate for you. It does not make turns or take exits. It does not understand most road signs. Until 2020 it did not even know what a red light was. It is a glorified cruise control and has always been presented as such.

    Tesla has never advertised this as any sort of “hands-off” system where the driver does not need to pay attention. They do not allow the driver to stop paying attention in FSD either, requiring hands on the wheel with constant torque, as well as eyes on the road (via an interior camera), in order to work. If you are caught not paying attention enough times, the system will disengage, and with enough violations it will even kick you out of the program.

    OK, now that being said, lets dig in:

    November 24, 2022: FSD malfunction causes 8-car pile-up on Bay Bridge

    • I’m from the area and have driven this exact spot hundreds of times on FSD and have never experienced anything even remotely close to what is shown here
    • "Allegedly" with FSD engaged
    • Tesla FSD “phantom” braking does not behave like this, and never has in the past. Teslas have 360 degree vision and are aware of traffic in front of and behind them.
    • Notice at the beginning of the video that this car was in the process of a lane change, this introduces a couple of possibilities as to what happened here, namely:
    • Teslas do have a feature under Autopilot/FSD where, after multiple warnings for the driver to pay attention and no engagement, the car will slow down, pull over to the shoulder, and stop. This particular part of the Bay Bridge does not have a shoulder, so it stopped where it was. This seems unlikely, since neural networks are very capable of identifying what a shoulder is and that it’s in an active lane of traffic, and even with Tesla’s massive fleet of vehicles on FSD there are no other recorded instances of this happening anywhere else.
    • This particular spot on the bay bridge eastbound has a very sudden and sharp exit to Yerba Buena Island. What I think happened is that the driver was aiming for this exit, saw that they were about to miss it and tapped the brake and put on the turn signal not realizing that they just disengaged FSD. The car then engaged regen braking and came to a full stop.
    • When a tesla comes to a full stop automatically (an emergency stop) it puts the hazards on automatically. This has been a feature since the v1 autopilot days. This car’s hazards do not come on after the stop.
    • What seems especially weird to me is that the driver continued to let the car sit there at a full stop while traffic piled up behind them. In FSD you are always in control of your own car and all it would have taken is tapping the accelerator pedal to get moving again. FSD will always relinquish control over the car to you if you tap the brakes or grab and move the steering wheel hard enough. Unless there was some mechanical issue that brought the car to a stop and prevented it from moving, in which case this is not the fault of the FSD software.
    • Looking at how gradually the car slowed down, this seems very clearly to be the car using regen braking, not emergency braking. I’m almost positive this means that FSD was disengaged completely.
    • We don’t have all the facts on this case yet and I’ll be anxious to see how this plays out in court but there are definitely many red flags on this one that have me questioning what actually happened here, but I doubt if FSD has anything to do with it.
    • If my earlier point is true this is actually an instance of an accident being caused because the driver disengaged self-driving. The car would have been much safer if the driver wasn’t even there.

    April 22, 2022: Model Y in “summon mode” tries to drive through a $2 million jet

    • This one is a favorite among the tesla hate community. Understandably so.
    • Smart summon has 0 to do with FSD or even autopilot. It is a party trick to be used under very specific supervised processes
    • Smart summon relies exclusively on the front camera and ultrasonic sensors
    • While smart summon is engaged, the user still has full control over their car via the phone app. If the car does anything unexpected you only need to release your finger from the button and the car stops immediately. The “driver” did not do this and was not supervising the car, the car did not see the jet because it was entirely above the ultrasonic sensors, and as I’m sure you can understand the object recognition isn’t exactly trained on parked airplanes.
    • The app and the car remind the driver each and every time it is engaged that they need to be within a certain range and within eyesight of the car to use it. If you remote control your car into an obstacle and it causes an accident, its your fault, period.
    • Tesla is working on a new version of smart summon which will make this feature more useful in the future.

    February 8, 2022: FSD nearly takes out bicyclist as occupants brag about system’s safety

    • I suggest actually watching the video here. The claim is highly at odds with what is actually in the video, but the vid is just over an hour long, so I bet most people don’t bother watching it.
    • “It wouldn’t have hit them, it definitely wouldn’t have hit them. Do we need to cut that?” "No, you can keep it in"
    • If you look at what was happening on the car’s display, it detected someone entering the crosswalk and stepping out into traffic on the left side. The car hit the brake, sounded an alert and swerved to the right. There was a bicycle in front of where the car swerved but at no point was it about to “nearly take out a bicyclist”. It did definitely overreact here out of safety but at no point was anyone in danger.
    • Relatively speaking this is a very old version of FSD software, just after the first wave of semi-public release.

    December 6, 2021: Tesla accused of faking 2016 Full Self Driving video

    • lol

    March 17, 2021: Tesla on Autopilot slams into stationary Michigan cop car

    • Now we’re getting into pre-FSD Autopilot. See above comments about the capabilities of Autopilot. Feel free to compare these to other cars’ LKAS systems. You will see that there are still lots of accidents across the board even with LKAS. That is because it is an assist system and the driver is still fully responsible and in control of the car.

    June 1, 2020: Tesla Model 3 on Autopilot crashes into overturned truck

    • Again, pre-FSD. If the driver didn’t see the overturned truck and disengaged to stop then I’m not sure how anyone expects a basic LKAS system to be able to do that for them.

    March 1, 2019: NHTSA, NTSB investigating trio of fatal Tesla crashes

    • This one involves a fatality, unfortunately. However, the car was not self-driving. There is something else very important to point out here:
    • The feature that allows Teslas to change lanes automatically on the freeway (Navigate on Autopilot) was not released until a year after this accident happened. That means, that if AP was engaged in this accident, the driver deliberately instructed the car via engaging the turn signal to merge into that truck.

    May 7, 2016: First known fatality involving Tesla’s Autopilot system

    • Now we’re getting way back into the V1 Autopilot system, which wasn’t even made by Tesla. It uses a system called MobilEye, made by a third party, and is even less capable than V2 Autopilot.

    So, there we go. FSD has been out to the public for a few years now to a massive fleet of vehicles, driving collectively millions upon millions of miles and this is the best we’ve got in terms of a list showing how “Dangerous” it is? That is pretty remarkable.

    Excited to see your response.

    naeemthm,

    Interesting, you wrote an entire dissertation on why you think this is all a false alarm about Full Self Driving, but it seems to be mostly anecdotal, or what you think is happening. Being a “software by trade” doesn’t change the fact that something fishy is 100% going on with Tesla’s Autopilot system.

    “The last time NHTSA released information on fatalities connected to Autopilot, in June 2022, it only tied three deaths to the technology. Less than a year later, the most recent numbers suggest 17 fatalities, with 11 of them happening since May 2022. The Post notes that the increase in the number of crashes happened alongside a rapid expansion of Tesla’s “Full Self-Driving” software from around 12,000 vehicles to almost 400,000 in about a year”

    caranddriver.com/…/report-tesla-autopilot-crashes…

    You claim the timeline is important here and this is all post-2022.

    Ocelot, (edited )

    I am not a “Software by trade”; that was a typo. Believe it or not, I wrote that entire thing on mobile.

    Correlation does not equal causation. Tesla sold far more vehicles in the past two years than ever before. Also, in 2019, 2020, and part of 2021, not a lot of people were driving due to the pandemic.

    And, yes, a lot of the first incident I covered there was mostly anecdotal or what I think is happening. Importantly, what I think is happening as someone with years and tens of thousands of miles of experience using FSD beta. I do not have the facts and also importantly, neither do you. I am interested to see what comes out of that court case, but from where I sit I do not think FSD was involved at all.

    Please let me know where I have misrepresented facts, I will either correct them or cite sources.

    Again, Teslas come with a factory installed 360 dashcam. It records all the time. Where are all of the videos of these FSD related incidents?

    adeoxymus,

    Tbh the other side is also anecdotal. There’s no stats here.

    CmdrShepard,

    What’s fishy about it? You realize 40,000 people die every year from car accidents in the US, meaning about 110 die every single day, and you’re referencing 17 fatalities spread out over a few years as some big crisis. This tech (from any manufacturer) isn’t going to prevent 100% of accidents, and there’s not much you can do when drivers willingly drive their car into the side of a semi just like they did before this technology existed.

    I won’t argue that AP, FSD, or any other system doesn’t have its issues, but most of these responses are overblown sensationalism.
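The per-day figure in the comparison above is easy to check (illustrative arithmetic only; the 40,000 annual figure is the comment’s own number):

```python
# Sanity-check the fatality arithmetic cited above.
annual_us_traffic_deaths = 40_000  # figure cited in the comment
per_day = annual_us_traffic_deaths / 365

print(round(per_day))  # about 110 deaths per day
```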

    zeppo,
    @zeppo@lemmy.world avatar

    Musk just did a 20 minute video that ended with it trying to drive into traffic.

    Ocelot,

    this one? Where does it drive into traffic? youtu.be/aqsiWCLJ1ms?si=D9hbZtbC-XwtxjpX

    zeppo,
    @zeppo@lemmy.world avatar

    The video ended when he made an “intervention” at a red light. I’m not watching whatever link that is because I’m not a masochist.

    Ocelot, (edited )

    Here’s the specific timestamp of the incident you mentioned, in case you wanted to actually see it: youtu.be/aqsiWCLJ1ms?t=1190 The car wanted to move through the intersection on a green left-turn arrow. I’ve seen a lot of human drivers do the same. In any case, it’s fixed now and was never part of any public release.

    The video didn’t end there; that was near the middle. What you’re referring to is a regression specific to the HW3 Model S that failed to recognize one of the red lights. Now I’m sure that sounds like a huge deal, but here’s the thing…

    This was a demo of a very early alpha release of FSD 12 (the current public release is 11.4.7), representing a completely new and more efficient method of utilizing the neural network for driving, and it has already been fixed. It is not released to anyone outside of a select few Tesla employees. Other than that, it performed flawlessly for over 40 minutes in a live demo.

    zeppo,
    @zeppo@lemmy.world avatar

    It has to perform flawlessly 99.999999% of the time. The number of 9s matters. Otherwise, you are paying some moron to kill you and perhaps other people.

    Ocelot,

    OK, so I’m totally in agreement, but 99.999999% is one accident per hundred million miles traveled. I don’t think there should be any reasonable expectation that such a technology could ever possibly get that far without real-world testing. Which is precisely where we are now. Maybe at 4 or 5 9s currently.

    If you do actually want that level of safety, which let’s be honest we all do, or ideally 100% safety, how would you propose such a system be tested and deemed safe, if not how it’s currently being done?
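For reference, the “number of 9s” maps to miles between incidents roughly like this (a back-of-the-envelope sketch that treats the percentage as a per-mile success rate, as the comment does):

```python
# Relate "nines" of per-mile reliability to expected miles between incidents.
def miles_between_incidents(nines: int) -> float:
    # A success rate of (1 - 10**-nines) per mile implies roughly
    # one incident every 10**nines miles.
    return 10.0 ** nines

print(f"{miles_between_incidents(8):,.0f} miles")  # 99.999999% -> 100,000,000
print(f"{miles_between_incidents(5):,.0f} miles")  # 99.999%    -> 100,000
```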

    silvercove,

    One of many many examples: businessinsider.com/tesla-stops-tunnel-pileup-acc…

    Tesla has a huge problem with phantom braking.

    Ocelot,

    See my huge post about that very accident right below. Do you have any other “many, many examples”?

    silvercove,

    Here is more: motortrend.com/…/tesla-fsd-autopilot-crashes-inve…

    How many do you want?

    Ocelot,

    FFS, that is the exact same article again. Please read my other comment (the huge one) and let me know if anything doesn’t make sense or you find anything factually inaccurate.

    kinther,
    @kinther@lemmy.world avatar

    Wishful thinking that Tesla would publicly distribute footage of an accident caused by one of their cars…

    Ocelot, (edited )

    It’s saved onto a thumb drive; any user can pull the footage off and use or post it anywhere. It never gets uploaded to Tesla, only snapshots and telemetry.

    lol the anti tesla crew will downvote even the most basic facts.

    kinther,
    @kinther@lemmy.world avatar

    But is it technically the user’s data, or is there some clause in Tesla car ownership that says it is Tesla the company’s data?

    Forgive me I’m ignorant of the fine details. I purchased a Chevy Bolt but had been looking into a Tesla as an alternative until Elon tried to be the super-cool Twitter guy.

    Hotdogman,

    I saw the videos of them running over infants in strollers. Does that count?

    CmdrShepard,

    The ones from that guy who runs his own competing autonomous driving company who also refused to allow anyone else to perform the test with the car (which was all proven to be bullshit later because he was hitting the accelerator pedal)? There’s a lot of misinformation and FUD floating around out there.

    Ocelot,

    Dan O’Dowd of Green Hills Software. He spent millions of dollars on a Super Bowl ad hit piece and it backfired spectacularly, although clearly there are still a few people who believe it. You should listen to the Whole Mars Catalog podcast of him trying to explain himself. It’s really wild.

    Tesla took him to court and won

    Ocelot,

    on FSD? link please

    LibertyLizard,
    @LibertyLizard@slrpnk.net avatar

    Hilariously I’ve also seen them accused of a pro-Tesla bias. Personally I think they are pretty balanced.

    Ocelot,

    They do whatever gets them clicks. Facts do not matter.

    LibertyLizard,
    @LibertyLizard@slrpnk.net avatar

    And this opinion is based on what? Obviously every online news source is concerned with increasing readership. But I’m not aware of any consistent factual issues in their reporting.

    Ocelot,
    LibertyLizard, (edited )
    @LibertyLizard@slrpnk.net avatar

    Honestly the quality of journalism in this article is pretty low. Some of the points are valid but most are just nitpicks about little opinion pieces at the ends of the articles. I don’t find these particularly valuable, and they sometimes contain some bad takes as pointed out here, but that’s not an issue of factual reporting. So the worst they’ve identified is a few minor omissions which, sure, but if you write thousands of articles that’s going to happen.

    And by the way, this article is making the case that Electrek is deliberately biased towards Tesla, not away from them. So if anything it undermines your point.

    I think the scandal about car referrals was pretty suspicious, but again, when you look at their reporting it comes down as pretty balanced. Perhaps you could argue they talk too much about Tesla but they cover the good and the bad. And I would say almost everyone in America has been talking about Tesla too much for quite some time.

    drdabbles,
    @drdabbles@lemmy.world avatar

    They are for sure not balanced. Fred might have become more realistic about Elon and his bullshit once it became clear he would never get his Roadster. That doesn’t mean he’s balanced.

    drdabbles,
    @drdabbles@lemmy.world avatar

    Bud, we’ve seen literally thousands of videos of this happening, even from the Tesla simps. You’re seven years behind on your talking points.

    CmdrShepard,

    FSD has been out for less than 3 years.

    drdabbles,
    @drdabbles@lemmy.world avatar

    The first public release was much later than the smaller beta, which I had access to. And my reference to seven years was Josh Brown being killed by autopilot in 2016.

    Ocelot,

    Can you link a few? Something where FSD directly or indirectly causes an accident?

    drdabbles,
    @drdabbles@lemmy.world avatar

    You’re working very hard in this thread to remain in the dark. You could take two seconds to look for yourself, but it seems like you won’t. Hell, they performed a recall because it was driving through stops. Something it’ll still do, of course, but they performed a recall.

    Astroturfed,

    Elon literally had to hit the brakes manually in a livestream of the self-driving tech as the car was about to go straight through a red light. Like less than a week ago… SOOOO safe, all the news stories of it killing people are fake!

    CmdrShepard,

    Can you link to even a single news article about a death involving FSD?

    Astroturfed,

    caranddriver.com/…/report-tesla-autopilot-crashes…

    That was the top Google result, with a ton of other articles. I’m sure that somehow doesn’t qualify and you’ll move the goalposts in response. Go read some Qanon Elon jizrag news. That’s where the real facts are.

    CmdrShepard,

    And that’s an article about Autopilot which is a completely separate system. For someone with such strong opinions, you sure seem to lack even a basic understanding of the technology that you’re discussing here, but I’m sure you’ll just pull out more insults and keep making references to your current obsession, Musk, as if that makes your argument any more credible or factual.

    Astroturfed,

    Yup it’s so totally different. Sure, sure. Imma block you now bye. Go bullshit someone else, doesn’t look like anyone else is buying your musk cocksucking spam either tho.

    CmdrShepard,

    How cowardly of you rather than just admitting you’re wrong and misinformed.

    You even topped it off with some homophobic slur. How very on point for someone so uninformed and unhinged with such strong opinions.

    drdabbles,
    @drdabbles@lemmy.world avatar

    Yeah. These people aren’t even good liars, but they try their hardest to defend the complete nonsense and lies.

    Ocelot,

    The early alpha build not part of public release? That video? The one with the known regression in the model S?

    That video was a demo of the new FSD beta 12 software, which is the first time a neural network was in complete control of the car, resulting in a massive reduction in code and overall smoothness. Did I mention the part where it was unreleased to the public? Maybe there’s a reason for that?

    Other than that the car performed flawlessly in the entire 40 minute drive.

    Ocelot, (edited )

    The recall was most definitely not for “driving through stops”. It was to fix the behavior of doing a “rolling stop”, something 99.5% of drivers do, which is how it learned to do it in the first place. Where do you see that it still doesn’t make a complete stop at stop signs?

    forbes.com/…/feds-make-tesla-remove-rolling-stops…

    I’m not trying to remain in the dark here; I’m just presenting facts. I’m very open to changing my mind on this situation entirely, just give me the facts. You said there were thousands of these videos; I’m just asking for evidence. I just get downvoted and nobody posts any of it.

    I’m an AI professional and have been an FSD beta tester for almost 3 years with tens of thousands of miles logged. How can I possibly be the one “in the dark” here?

    drdabbles,
    @drdabbles@lemmy.world avatar

    “rolling stop”

    Or put another way by someone not desperate for Elon’s attention, not stopping. Driving through stops.

    Where do you see that it still does not make a complete stop at stop signs?

    Signs, lights, it’ll gladly not stop for any of them. Where do I see it? Real life. Actually owning one of these foolish gadgets for 5 years. Where do you see your examples?

    Also, don’t send me Brad Templeton opinion pieces; he’s a complete hack and has outed himself as such many times. He does have a nice video explaining what he thinks of Tesla stans like you, though. Did you watch that one, or do you only link his material when it’s convenient?

    I’m just presenting facts

    No you aren’t, you’re presenting a curated social media marketing campaign. Congrats, you fell for the ad. Do you think that beer is going to make you more attractive, too?

    I’m very open to change my mind on this situation entirely

    Ok. Tell us what evidence it would take for you to completely change your mind on this and realize Elon is a hack, running a dangerous con with low quality software being released to cars in the US and Canada? What evidence would you require to change your mind and accept that Tesla doesn’t properly test releases before they go out to customers?

    I’m an AI professional

    This has absolutely zero bearing on anything except that you’re probably extremely susceptible to Elon’s outright lies.

    have been an FSD beta tester for almost 3 years

    Doubt.

    How can I possibly be the one “in the dark” here?

    The term is “delusion”.

    Ocelot,

    Let’s not resort to name-calling or personal attacks here. You stated there are “thousands of videos” of FSD-related accidents; I only asked for a few examples. Please tell me where you’re getting this information. Help me change my mind.

    Do you understand what a “rolling stop” is? It’s when you don’t come to a complete stop at a stop sign; you slow down to 0.5 or 1 mph, check both ways, and move through. It has been shown time and time again that practically NOBODY on the road comes to a full and complete stop at stop signs. That is how the FSD beta worked in earlier releases, because that’s how it learned to drive. NHTSA said the cars had to come to a complete stop, so Tesla fixed it. You again said that Teslas are still rolling through stop signs, and I’m once again asking where you got that information?

    “Actually owning one of these foolish gadgets for 5 years”: I’m guessing you’re trying to say you own a Tesla? You clearly don’t have FSD, because if you did you’d know that it makes full stops 100% of the time. I have my doubts you actually own a Tesla, because if someone spends 40-60 grand on something they consider a “foolish gadget”, why on earth would they hold on to it for so long? Just sell it and get something else, move on with your life, and don’t bash people who like their cars.

    I’ve asked you, now three times, to present evidence: video evidence of FSD doing dangerous things. Given that all Teslas have 360 dashcams that are constantly recording, and we live in an age of easy video sharing, that really shouldn’t be a big ask if FSD is as dangerous as you’re implying. These incidents should be happening daily. That is what would change my mind. What would change yours?

    I bought my Model Y in 2020 with FSD and emailed Tesla for early beta access based on my engineering experience and the part of the country I’m located in. They granted it almost a year later, and I’ve been driving with it almost every day since. Why on earth would you doubt that? Do you need some kind of evidence?

    drdabbles,
    @drdabbles@lemmy.world avatar

    Lets not resort to name calling or personal attacks here

    If you’re going to be a liar, I’m going to call you one.

    Someone has already provided you samples and you contorted yourself trying to deny their existence still. That shows the caliber of person you are.

    Do you understand what a “rolling stop” is?

    Do you understand what a red light and a stop sign are? See, traffic control devices are to be obeyed properly, and creating software that intentionally breaks the law is irresponsible at best. Only a clown would attempt to defend this. Meanwhile, you’re ignoring Elon’s own video from a week ago because it instantly disproves your insane position.

    I’m guessing you’re trying to say you own a Tesla?

    I did. I learned my lesson and am a proud one-and-done former tesla owner.

    You clearly don’t have FSD

    I did. Swing and a miss.

    it makes full stops 100% of the time

    Except, you know, the fucking recall proves it didn’t. And it still doesn’t after the recall. And of course, it misses traffic control devices frequently, ignores them at speed, attempts to pull through them when stopped, etc. Please, do yourself a favor and end this now. Lying to me isn’t going to work.

    if someone spends 40-60

    2018 P3D with performance package. More like 70+, with EAP from the factory, and the $2k FSD upgrade when Elon was busy being an idiot about pricing. If you have any questions for someone that’s actually owned one, I’d be glad to answer them for you.

    why on earth would they hold on to it for so long?

    Waiting for my replacement.

    don’t bash people who like their cars.

    I didn’t. I bashed you for being a liar.

    I’ve asked you, now 3 times now to present evidence.

    I asked what evidence would change your mind, and I see you entirely dodged that question. Because there is none. There’s nothing that would change your mind, because your mind is made up. It’s religion, and you don’t convince someone their religion is nonsense. I’m not surprised, of course. All liars behave like this- they pretend there’s something that could completely shift their world view, and change a core piece of their identity… like simping for Musk. But deep down, they know. There’s no such evidence. The racism, the sexual assaults, the financial grift, the hard right bullshit, the transphobia and homophobia, none of that changes your mind. The untested nature of AP and FSD, the release of “smart” summon that immediately started crashing into things, the fact they sent engineers down to Chuck Cook’s intersection for three months to program a single behavior. None of that sinks in when you believe in the religion of Tesla.

    I bought my model Y in 2020

    lmao, so absolutely didn’t have FSD longer than me. Delightful. Hysterical and delightful.

    Ocelot,

    I genuinely feel like I’m losing my mind here. Maybe that was your entire goal. Maybe its that you’re not listening and just like projecting your own opinions. I don’t know who or what hurt you or why you’re so angry but I’m done here. I’ve asked repeatedly for any sort of evidence as to why you feel this way, why you think FSD behaves this way and instead of providing anything you just escalate it and now we’re at name calling and calling me a liar. This is far from being a productive conversation. There’s so much to unpack here in this… whatever this is… you just posted I just really don’t … I can’t spend any more energy on it.

    drdabbles,
    @drdabbles@lemmy.world avatar

    I’ve asked repeatedly for any sort of evidence

    People already posted links. Go look at them.

    why you think FSD behaves this way

    Why do I think FSD behaves poorly? Because the software is garbage, proof of concept level, implemented hastily under nonsense timelines, driven by a weirdo that has absolutely no idea WTF he’s talking about at any moment.

    calling me a liar.

    I prefer to use the term “identifying” you as a liar.

    Anyway, nice work providing what kind of evidence would convince you. I knew you’d never do that.

    CmdrShepard,

    FSD was released in late 2020 yet you’re claiming you bought and used it in 2018? Who’s the liar now?

    Also how can you have so much time to write your long winded rants about Musk, but can’t be bothered to actually discuss the technology nor provide a single shred of evidence backing up a single one of your claims?

    drdabbles,
    @drdabbles@lemmy.world avatar

    Someone has poor reading comprehension. See where I said I bought the car in 2018, with EAP equipped, and then said I bought FSD in 2019 when Elon lowered the price to $2k?

    Simping isn’t going to get you very far here.

    CmdrShepard,

    Cool. How’d you get access to it years before it came out? It wasn’t available in 2018 or 2019 (which you did not state in your comment), nor until roughly early 2021 with special permission from Tesla. Sounds like you’re not telling the truth here. There’s a term for people who don’t tell the truth…

    drdabbles,
    @drdabbles@lemmy.world avatar

    Oof. You reading the words you’re typing here? You realize that FSD has been on sale since long before people were invited to the beta, right? Like, you wouldn’t come here to attempt to argue with me and not even know that extremely basic fact, would you? Because that would be embarrassing for you.

    Sounds like you’re an alt for the account farming downvotes trying to smoke screen for yourself.

    CmdrShepard,

    I am reading what you’re typing, which is why I’m confused about your story here as it seems to be changing with each new comment.

    1. You bought what you referred to as “a foolish gadget” in 2018
    2. After driving this ‘foolish gadget’ for an entire year, you willingly decided to spend thousands of dollars extra for a software upgrade, even going as far as to state Musk was “being an idiot on the pricing” meaning you think this software was priced too low.
    3. You then laughed at the guy who bought his Tesla and FSD in 2020, stating you’d been using FSD “way longer” than he had, even though he’d bought it pre-release and had to apply for permission to use it in late 2020, the first time they’d allowed non-employees to begin using the software
    4. Now you claim you were only talking about having bought it earlier, conveniently ignoring the fact that you’d just previously talked about using said software earlier.

    So which is it? You bought it with the car like you initially stated? You bought it later in 2019 after driving the car you hated for a whole year, thinking Tesla deserved more of your money? Did you use the software early like you’d initially claimed or is it that you just bought it early like you’re now claiming?

    I think I’m beginning to see why you were so quick to resort to ad hominems and calling people liars when asked for evidence of your claims earlier. Almost as if you’re projecting your own personal feelings onto others because you know your argument isn’t backed by truth or facts.

    drdabbles,
    @drdabbles@lemmy.world avatar

    Nothing I’ve said has changed, you simply have no understanding of the topic at hand but you’re attempting to tell me I’m wrong. There’s a name for people like that.

    1. Yep.
    2. I owned it for 5 years, actually. You missed those comments in the other thread, no doubt. Also, I bought the $2k upgrade because I knew the cost of upgrading from HW2.5 to HW3 alone would be $2k. Turns out I actually saved money there.

    meaning you think this software was priced too low.

    No, I mean he was being an idiot. Again, because you have no idea WTF you’re talking about, you’re missing the required context here. Elon changed the price three times in the span of a month and a half. First he jacked it up, people got very angry, so he dropped it again and people that just paid the jacked up price got angry. So then he offered a discount on existing purchases in the form of a refund check, dropped the price to $2k for FSD, and gave a very loose timeline describing when he’d raise the price again. Knowing that I had HW 2.5 and that the HW3 computer alone could cost more than $2k to upgrade, I took that deal. Also got my $5k refund on top of it.

    If you knew anything about Tesla’s history, I wouldn’t have to explain that to you. But here you are, trying hard to be a White Knight and failing miserably.

    3. Yup. See, the guy that bought it on his Model Y? He didn’t get access as soon as I did. One of the perks of being an early buyer: some of us did actually get earlier access, even though Elon lied about that benefit.
    4. No I’m not. You came in here like a fool talking about when I bought it, obviously not realizing that you could buy it in 2019. Which I made fun of you for. I still got access to it before the other guy, but I also bought it before him. You know so little about the topic that you conflated the two until I explained the difference.

    You bought it with the car like you initially stated?

    I didn’t say that anywhere. Quote for me where I said that. You can’t because I didn’t. So now you’re resorting to your other account’s tactics? Neat.

    The funny thing here is that you can’t conceive of the fact that I bought the car, it was good for a year but I smelled the scam cooking. But I still wanted the hardware upgrade. And that as the “upgrades” to the vehicle’s software kept coming, the car got worse over time. You know so little about this topic that it’s almost funny to see you floundering like this. You can’t imagine someone buying a Tesla, enjoying it until it starts having problems, then realizing what the con was all along. And that’s why this is religion to you people.

    Anyway, bye now. I know it’s harsh you got all those downvotes on your other account, but that seems to be the way Musk simps get dealt with on Lemmy.

    FlyingSquid,
    @FlyingSquid@lemmy.world avatar

    Do you understand what a “rolling stop” is?

    I sure do. I got pulled over for doing one. Because they’re not legal.

    Ocelot,

    Correct. They’re not. And that’s why there was a recall, and that’s why FSD no longer does them.

    FlyingSquid,
    @FlyingSquid@lemmy.world avatar

    You were just defending them as something lots of people do. Make up your mind.

    Ocelot,

    Why can’t something that lots of people do still be illegal? Doing 1 mph over the speed limit is also illegal.

    Not coming to a complete stop is illegal, yes. But the overwhelming majority of drivers don’t do it, so other drivers don’t expect a full stop. Stopping fully increases the risk of being rear-ended and can cause road rage. A self-driving car is technically wrong in both cases, whether it does or doesn’t come to a complete stop, just as it is wrong in both cases whether or not it exceeds the speed limit to keep with the flow of traffic. At the end of it, I would side with what is legal, which is what FSD does now as a result of the recall.

    The reason it didn’t come to a full stop before is that the whole system is trained on human drivers, and that’s how human drivers drive.

    FlyingSquid,
    @FlyingSquid@lemmy.world avatar

    Again, you were defending it because a lot of people do it. You were defending it breaking the law.

    Astroturfed,

    Ah yes, there’s no readily available footage of the dead bodies flying into the street or being crushed under the wheels so it’s made up. Of course.

    Ocelot, (edited )

    Not all accidents are that violent. I would accept even a video of a simple fender bender to prove that FSD beta causes accidents with any sort of frequency. Those should be pretty common if FSD is as dangerous as a lot of people are implying, right?

    Astroturfed,

    Look, I don’t like children either but wanting more child mowing cars out on the road is pretty twisted.

    FlyingSquid,
    @FlyingSquid@lemmy.world avatar

    Wait, are you now suggesting you won’t accept that Teslas with FSD ever get into accidents without video evidence? FSD is perfect?

    Ocelot,

    No, I would never suggest that. The overwhelming consensus here is “FSD is dangerous. More dangerous than humans.” I’m asking for any proof of that here. So far, nothing. If they were getting into accidents all the time, there would be all kinds of footage, no? The fact is that even in this beta stage it’s already safer than human drivers. That apparently rubs people the wrong way for some reason. Don’t we all want safer roads?

    FlyingSquid,
    @FlyingSquid@lemmy.world avatar

    You sure did suggest that when you said you would even accept a fender bender.

    Ocelot,

    Do you have any footage to share of FSD fender benders? If not, how can you even claim it’s dangerous? Every car equipped with FSD hardware has 360 dashcams. It should be really easy to find footage where FSD is at fault for an accident.

    FlyingSquid,
    @FlyingSquid@lemmy.world avatar

    Again- you’re suggesting they’re perfect by implying they don’t even get into fender benders.

    Ocelot,

    No, what I’m suggesting is that currently, as of today, they don’t. There will come a day when FSD does cause an accident; any self-driving system is at risk of that as long as humans share the road. What is most important is that we don’t lose sight of the accidents it prevents. As it stands, over a hundred people die daily in auto accidents in the US, and NONE of them so far are from self-driving cars, even with 500k+ Teslas in the hands of everyday people. Any effort to dismiss or shut down self-driving car programs is an incredible disservice to road safety when there’s no evidence to suggest they’re as dangerous as, or more dangerous than, the average human driver.

    FlyingSquid,
    @FlyingSquid@lemmy.world avatar

    So your contention is that FSD has not caused a single accident of any sort? And the reason for that is that you’ve never seen video of it?

    Ocelot,

    No, I’m saying that if it was extraordinarily dangerous and causes “lots of accidents” as others have suggested, shouldn’t there at least be something to back that up? Some kind of footage? I mean, FFS, they’re recording all the time; it shouldn’t be hard.

    FlyingSquid,
    @FlyingSquid@lemmy.world avatar

    Except you keep bringing up fender benders, which are not dangerous at all, and suggesting there hasn’t been a single one ever with Tesla’s FSD cars being at fault.

    Ocelot,

    I said I would accept any form of proof of an accident. Any sort of accident. Major or minor. Are you just trolling?

    FlyingSquid,
    @FlyingSquid@lemmy.world avatar

    Again, implying that there has never been one since you haven’t seen it. I’m not trolling, I’m showing you what you’re implying. Either you admit that it is likely that a FSD car has caused at least one accident, no matter how minor, or you say you won’t accept it until you see it for yourself. You are doing the latter, which is a silly position to take.

    Ocelot,

    You’re missing the point. The claim to prove is that FSD is more dangerous than a human. I’m asking for dashcam proof because that should by far be the easiest thing to get. Given the number of clickbait, misinformed articles about the subject, I’m very confident an actual major accident caused by FSD would be all over the news. It would be front-page everywhere, all aboard the Elon hate train. The fact that nobody can produce evidence of even a minor FSD-related accident, and instead just calls me a Tesla fanboy and downvotes everything, is baffling to me. Is it at all possible, just maybe, that FSD is a safe technology? That’s what I’m suggesting. If you have evidence to the contrary, please present it. Unlike everyone else here, I’m actually open to changing my mind.

    FlyingSquid,
    @FlyingSquid@lemmy.world avatar

    You’re upset with people calling you a fanboy, yet you keep implying FSD is so perfect that there’s never been the mildest of accidents. That’s my point. It’s a ludicrous thing to suggest.

    coffeebiscuit,

    Autopilot beta? People are willing to test betas for cars? Are you insane? Insurance is going to have a field day.

    MrSqueezles,

    The craziest part of the article is just how much effort the author put into collecting data and filing feedback and really really hoping that Tesla could pull the videos (they can), then went on to actively try and succeeded in recreating the problem at high speed next to another car.

    drdabbles,
    @drdabbles@lemmy.world avatar

    Even better, several people have died using it or killed someone else. It also has a long history of driving underneath semi truck trailers. Only Europe was smart enough to ban this garbage.

    Ocelot,

    FSD has never driven under a truck; that was Autopilot, which is an LKAS (lane-keeping assist) system. The incident happened a year before “Navigate on Autopilot”, so the car in question was never even able to change lanes on its own. The driver deliberately instructed the car to drive into the trailer.

    FSD beta is currently available in most of Europe and has been for several months.

    drdabbles,
    @drdabbles@lemmy.world avatar

    FSD has never driven under a truck

    Yes it has. Well, into the back of one so fast that it went under at least.

    which is an LKAS system.

    So is FSD. 🤣 It’s level 2 bud, you’re really REALLY confused for someone pretending to own one.

    The driver deliberately instructed the car to drive into the trailer.

    Are you saying Josh Brown killed himself? Because if you are, that would be a new repulsive low even for you Elon simps.

    elxeno,

    From what I read, Autopilot (AP) just keeps you in your lane, while Full Self Driving (FSD) just switches lanes into oncoming traffic.

    Nomad,

    Funny how George Hotz of Comma.ai predicted this exact same issue years ago: “if I were Elon Musk I would not have shipped that lane change”.

    This issue likely arises because the car’s sensors cannot look far enough ahead in the lane it changes into. That can lead to being hit from behind by much faster cars and, in this case, to lane confusion, as the car cannot see oncoming traffic.

    meco03211,

    Not Autopilot (AP). There’s a difference between FSD and AP. AP will just keep you between the lane lines and pace the car in front of you. It can also change lanes when told to. There’s also Enhanced Autopilot (EAP). EAP was supposed to bridge the gap between AP and FSD: it would go “on ramp to off ramp,” so it could switch lanes as needed and get to exit ramps. FSD is the mode where you shouldn’t need to touch anything outside of answering the nag (the frequent prompt to “apply force to the steering wheel” to show you’re still alive and paying attention).*

    * At least I think that’s the same for FSD. I’m only on AP with AP1 hardware. Never had an issue that I’d blame on a “bug” or the software doing something “wrong”.

    coffeebiscuit,

    It’s the beta part that scares me the most, the type of assistance isn’t really relevant. People shouldn’t be driving around in betas. These aren’t phones.

    zeppo,
    @zeppo@lemmy.world avatar

    What bothers me is that I have to share the road with people running some braindead Elon Musk software.

    Ocelot,

    Have you seen how humans drive? It’s not a very high bar to do better.

    coffeebiscuit,

    But betas? Seriously?

    Uniquitous,

    I was taught to always drive defensively. You never know when someone’s going to get distracted, get stupid, have a stroke… add glitchy robots to the list; it doesn’t make a whole lot of difference.

    skyspydude1,

    And yet FSD is still worse than the one time I got in the car with an exchange student who had never driven before coming to the US and thought her learner’s permit was the same as a driver’s license.
