A representative for Tesla sent Ars the following statement: “Today’s verdict is wrong and only works to set back automotive safety and jeopardize Tesla’s and the entire industry’s efforts to develop and implement life-saving technology. We plan to appeal given the substantial errors of law and irregularities at trial. Even though this jury found that the driver was overwhelmingly responsible for this tragic accident in 2019, the evidence has always shown that this driver was solely at fault because he was speeding, with his foot on the accelerator—which overrode Autopilot—as he rummaged for his dropped phone without his eyes on the road. To be clear, no car in 2019, and none today, would have prevented this crash. This was never about Autopilot; it was a fiction concocted by plaintiffs’ lawyers blaming the car when the driver—from day one—admitted and accepted responsibility.”
So, you admit that the company’s marketing has continued to lie for the past six years?
I’m kinda torn on this - in principle, not this specific case. If your AI performs on par with an average human and no known flaw is at fault, I think you shouldn’t be either.
I think that’s a bad idea, both legally and ethically. Vehicles cause tens of thousands of deaths - not to mention injuries - per year in North America. You’re proposing that a company that can meet that standard is absolved of liability? Meet, not improve.
In that case, you’ve given these companies license to literally make money off of removing responsibility for those deaths. The driver’s not responsible, and neither is the company. That seems pretty terrible to me, and I’m sure to the loved ones of anyone who has been killed in a vehicle collision.
And that is the point: Tesla’s “AI” performs nowhere near human levels. Actual full self driving is measured on a scale of 5 levels, and Tesla’s system sits at around level 2 out of those 5.
Tesla has claimed to have full self driving for about a decade now, and it has been and continues to be a complete lie. Musk claimed long ago that a Tesla could drive autonomously from LA to NY, while in reality it has trouble leaving the first parking lot.
I’m unsure how much has changed there, but since Elmo Musk spends more time lying about everything than actually improving his products, I would not hold my breath.
The original comment is perpetuating the lie, intentional or not. They rely on fundamentally flawed soundbites that are precisely crafted as propaganda, not to be informative or truthful at all.
Right off the bat they’re saying “in principle”, which presumes the baseline lie that “full self driving” has been achieved. Then they strengthen their argument by reinforcing the idea that it’s functionally equivalent to humans (i.e. generalized intelligence). Then they cap it off with “no known flaw”. Pure lies.
Of course they’ve hedged by implying it’s opinion, but they strongly suggest it’s the most correct one anyway.
I’m unsure how much has changed
This demonstrates exactly how effective the propaganda is. They set up scenarios where nobody honest will refute their bullshit with certainty. Even though we know no existing system is on par with human drivers. Sure, they can massage data to say that under certain conditions an automated driving system performed similarly by some metric or whatever. But that’s fundamentally not what they are telling a lay audience. They’re lying in order to lead the average person to believe they can trust their car to drive them as if they were a passenger and another human were behind the wheel. This is not true. Period. There is no existing system that does this. There will not be in the foreseeable future.
The fact of the matter is that technological discussion is more about this kind of propaganda than about the technology itself. If that weren’t the case, more people would be hearing about the actual technology and its real limitations, not all the spin-doctoring. That leads to uncertainty and confusion. Which leads to preventable deaths.
If the company is penalized for being at fault, then they will have a reason to do better in the future.
I don’t even give a flying fuck about how autopilot compares to the average driver. Tesla has the resources to make its technology better, so we as customers should all hold them to the highest possible standard. Anything less just outs you as a useful idiot; you’re willing to accept less so someone richer than you can have more.
I think the problem is that for a long time Tesla, and specifically Elon, went around telling everyone how great their autopilot was. Turns out that was all exaggeration and sometimes flat out lying.
They showed videos of the car driving on its own. Later, we found out it was actually being controlled remotely.
Yeah, the driver wasn’t operating the vehicle safely, but Tesla told him that he didn’t have to.
Good that the car manufacturer is also being held accountable.
But…
In 2019, George McGee was operating his Tesla Model S using Autopilot when he ran past a stop sign and through an intersection at 62 mph then struck a pair of people stargazing by the side of the road. Naibel Benavides was killed and her partner Dillon Angulo was left with a severe head injury.
That’s on him. 100%
McGee told the court that he thought Autopilot “would assist me should I have a failure or should I miss something, should I make a mistake,”
Stop giving stupid people the ability to control large, heavy vehicles! Autopilot is not a babysitter, it’s supposed to be an assistive technology, like cruise control. This fucking guy gave Tesla the wheel, and that was a choice!
It is assistive technology, but that is not how Tesla has been marketing it. They even sell a product called Full Self Driving, while it’s not that at all.
Yeah, but I think Elon shares the blame for making outrageous claims for years suggesting otherwise. He’s a liar and needs to be held accountable.
What claims did he make about autopilot that suggested otherwise?
Autopilot is not FSD.
Like everything he has ever said about it? lol wtf is this comment.
What claims?
Thanks for responding to this with links so that I didn’t have to.
Thanks for the sources
+1
Absolutely. I hope he and the company burn in hell, but I do not want to start giving drivers who kill people a free pass to say “well, it was the car’s fault!”
“Autopilot”, especially in Tesla cars, is beta software at best, and this feature should never have been allowed to be used on public roads. In that sense, the transportation ministry that’s allowed it also has blood on their hands.
Woo, both parties are terrible, irresponsible, and should be held accountable
i don’t disagree; but i believe the suit was over how tesla misrepresented assistive technology as fully autonomous, as the name autopilot implies
Yes, false advertising for sure. But the responsibility for safe driving is on the driver, even if the driver’s role is engaging autopilot.
I can only imagine the same applies in other circumstances where autopilot is an option: planes, boats, drones, etc.
agree with you here. your point reminds me of this case below. The tldr is the pilots were using their laptops to look at schedules iirc and overflew their destination. it’s long been speculated they were watching a movie
https://en.m.wikipedia.org/wiki/Northwest_Airlines_Flight_188
Here’s my problem with all of the automation the manufacturers are adding to cars. Even the stuff that isn’t Autopilot-level is potentially a problem - things like adaptive cruise come to mind.
If there’s some kind of bug in that adaptive cruise that puts my car into the bumper of the car in front of me before I can stop it, the very first thing the manufacturer is going to say is:
But the responsibility for safe driving is on the driver…
And how do we know there isn’t some stupid bug? Our car has plenty of other software bugs in the infotainment system; hopefully they were a little more careful with the safety-critical systems…ha ha, I know. Even the bugs in the infotainment are distracting. But what would the manufacturer say if there was a crash resulting from my moment of distraction, caused by the 18th fucking weather alert in 10 minutes for a county 100 miles away, a feature that I can’t fucking disable?
But the responsibility for safe driving is on the driver…
In other words, “We bear no responsibility!” So, I have to pay for these “features,” and the manufacturer will deny any responsibility if one of them fails and causes a crash. It’s always your fault as the driver, no matter what. The company rolls this shit out to us; we have no way to buy a new car without it anymore, and they don’t even trust it enough to stand behind it.
Maybe you’ll get lucky and enough issues will happen that gov’t regulators will look into it (not in the US any more, of course)…but probably not. You’ll be blamed, and you’ll pay higher insurance, and that will be that.
So now I have to worry not only about other drivers and my own driving, but I also have to be alert that the car will do something unexpected as well. Which has happened, when all this “smart” technology has misunderstood a situation, like slamming on the brakes for a car in another lane. I’ve found I hate having to fight my own car.
Obviously, I very much dislike driving our newer car. It’s primarily my wife’s car, and I only drive it once or twice a week, fortunately.
Well, if only Tesla hadn’t invested tens of millions into marketing campaigns trying to paint Autopilot as a fully self-driving, autonomous system. Everyone knows that 9 out of 10 consumers don’t read the fine print, ever. They buy and use shit off of vibes. False marketing can and does kill.
I will repeat, regardless of what the (erroneous) claims are by Tesla, a driver is still responsible.
This is like those automated bill payment systems. Sure, they are automated, and the company promotes it as “easy” and “convenient”, but you’re still responsible if those bills don’t get paid for whatever reason.
From another report:
While driving, McGee dropped his mobile phone that he was using and scrambled to pick it up. He said during the trial that he believed Enhanced Autopilot would brake if an obstacle was in the way. His Model S accelerated through an intersection at just over 60 miles per hour, hitting a nearby empty parked car and its owners, who were standing on the other side of their vehicle.
Isn’t using a phone while driving a vehicle illegal? And what the hell was up with highway speeds near an intersection??? This dude can blame autopilot, but goddamn, he was completely negligent. It’s like there were two idiots driving the same vehicle that day.
Yes, of course the driver is at fault for being an idiot. And sadly, a shitton of drivers are idiots. Ignoring this fact is practically ignoring reality. You shouldn’t be allowed to do false marketing as a company exactly because idiots will fall for it.
I dig blaming the people who wind up believing deceptive marketing practices, instead of blaming the people doing the deceiving.
Look up the dictionary definition of autopilot: a mechanical, electrical or hydraulic system used to guide a vehicle without assistance from a human being. FULL SELF DRIVING, yeah, why would that wording lead people to believe the car was, you know, fully self-driving?
Combine that with year after year of Elon Musk constantly stating in public that the car either already drives itself, or will be capable of doing so just around the corner, by the end of next year, over and over and over and
Elon lied constantly to keep the stock price up, and people have died for believing those lies.
Yes. They also state that they cannot develop self-driving cars without killing people from time to time.
I mean, that’s probably strictly true.
it’s really not, we just have cowards who are afraid of the word regulation running the government.
https://en.wikipedia.org/wiki/DO-178C
https://www.seleon.com/en/regulatory-affairs/fda-guidance-for-software-lifecycle/
I don’t know, most experimental technologies aren’t allowed to be tested in public till they are good and well ready. This whole move fast break often thing seems like a REALLY bad idea for something like cars on public roads.
Well, the Obama administration had published initial guidance on testing and safety for automated vehicles in September 2016, which was pre-regulatory but a prelude to potential regulation. Trump trashed it as one of the first things he did taking office for his first term. I was working in the AV industry at the time.
That turned everything into the wild west for a couple of years, up until an automated Uber killed a pedestrian in Arizona in 2018. After that, most AV companies scaled public testing way back, and deployed extremely conservative versions of their software. If you look at news articles from that time, there’s a lot of criticism of how, e.g., Waymos would just grind to a halt in the middle of intersections, as companies would rather take flak for blocking traffic than running over people.
But not Tesla. While other companies dialed back their ambitions, Tesla was ripping Lidar sensors off its vehicles and sending them back out on public roads in droves. They also continued to market the technology - first as “Autopilot” and later as “Full Self Driving” - in ways that vastly overstated its capabilities. To be clear, Full Self Driving, or Level 5 Automation in the SAE framework, is science fiction at this point, the idea of a computer system functionally indistinguishable from a capable human driver. Other AV companies are still striving for Level 4 automation, which may include geographic restrictions or limitations to functioning on certain types of road infrastructure.
Part of the blame probably also lies with Biden, whose DOT had the opportunity to address this during his term and didn’t. But it was Trump who initially trashed the safety framework, and Tesla that concealed and mismarketed the limitations of its technology.
You got me interested, so I searched around and found this:
So, if I understand this correctly, the only fundamental difference between level 4 and 5 is that 4 works on specific known road types with reliable quality (highways, city roads), while level 5 works literally everywhere, including rural dirt paths?
I’m trying to imagine what other type of geographic difference there might be between 4 and 5 and I’m drawing a blank.
I think this chart overcomplicates it a bit. Almost a decade ago, I worked on a very short project that touched on this topic. One expert explained to me that the difference between level 4 and 5 is that you don’t need a steering wheel or pedals anymore. L5 can drive anywhere, anytime in all situations.
Yes, that’s it. A lot of AV systems are dependent on high resolution 3d maps of an area so they can precisely locate themselves in space. So they may perform relatively well in that defined space but would not be able to do so outside it.
Level 5 is functionally a human driver. You as a human could be driving off road, in an environment you’ve never been in before. Maybe it’s raining and muddy. Maybe there are unknown hazards within this novel geography, flooding, fallen trees, etc.
A Level 5 AV system would be able to perform equivalently to a human in those conditions. Again, it’s science fiction at this point, but essentially the end goal of vehicle automation is a system that can respond to novel and unpredictable circumstances in the same way (or better than) a human driver would in that scenario. It’s really not defined much better than that end goal - because it’s not possible with current technology, it doesn’t correspond to a specific set of sensors or software system. It’s a performance-based, long-term goal.
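If it helps, the distinction maps pretty cleanly onto code: Level 4 is autonomy behind a gate, Level 5 is that gate being gone. A toy sketch (every name here is invented for illustration, not any real vendor’s API):

```python
from dataclasses import dataclass

# Toy model of an Operational Design Domain (ODD) gate.
# All names are made up - this is not any vendor's actual API.

@dataclass
class Conditions:
    inside_mapped_geofence: bool  # precise localization against an HD map?
    road_type_supported: bool     # e.g. paved city street vs. rural dirt path
    weather_ok: bool              # heavy rain or snow can defeat the sensors

def level4_may_engage(c: Conditions) -> bool:
    # Level 4 is only autonomous *inside* its ODD; outside it, the
    # system must refuse to engage or hand control back.
    return c.inside_mapped_geofence and c.road_type_supported and c.weather_ok

def level5_may_engage(c: Conditions) -> bool:
    # Level 5 is defined by this gate disappearing entirely:
    # anywhere, anytime, in any conditions a human could handle.
    return True
```

Same inputs, but one of the two functions has nothing left to check - that’s the entire jump from 4 to 5, and why 5 is still science fiction.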
This is why it’s so irresponsible for Tesla to continue to market their system as “Full self driving.” It is nowhere near as adaptable or capable as a human driver. They pretend or insinuate that they have a system equivalent to SAE Level 5 when the entire industry is a decade minimum away from such a system.
I was working in the AV industry at the time.
How is you working in the audio/video industry relevant? …or maybe you mean adult videos?
Or automotive vision.
I’m pretty sure millions of people have been killed by cars over the last 100 years.
And we’re seeing fewer and fewer traffic deaths and serious injuries in developed countries (excluding the USA, if the statistics I’ve read are correct).
Tesla’s autopilot seems to be a step backwards with a future promise of being better than human drivers.
But they slimmed down their sensors to fucking simple 2D cams.
That’s just cheaping out at the cost of Tesla owners - but also of completely uninvolved people around a self-driving Tesla, who never chose to trust this tech, which lives more on PR than actual results.

Can’t comment specifically about Tesla’s, but self driving is going to have to go through the same decades of iterative improvement that car safety went through. That’s just expected.
However, it’s not appropriate for this to be done at the risk of lives.
But somehow it needs the time and money to run through a decade of improvement
Not to defend Tesla here, but how does the technology become “good and well ready” for road testing if you’re not allowed to test it on the road? There are a million different driving environments in the US, so it’d be impossible to test all these scenarios without a real-world environment.
Cars with humans behind the wheel paying attention to correct the machine. Not this let’s-remove-humans-as-quickly-as-possible BS that we have now. I know they don’t like the cost.
How about fucking not claiming it’s FSD and just have ACC and lane keep and then collect data and train on that? Also closed circuit and test there.
Autopilot is ACC which is what the case was about here.
If they called fucking ACC “autopilot,” they deserve to rot in hell, what the actual fuck.
That is such a misleading naming.
You are defending Tesla and being disingenuous about it.
The other car companies working on this are spending millions of dollars to test their vehicles in closed areas that simulate real world conditions in order to not kill people.
You sound like a psychopath.
The hyperbole is ridiculous here and it makes you sound like a psychopath.
Listen, if we make it safe it could take an entire extra fiscal year! I have payments to make on my 3 vacation homes NOW!
“Some of you will die, but that’s a risk I’m willing to take.”
Brannigan is way smarter than Mush.
Some of you will be forced through a fine mesh screen for your country. They will be the luckiest of all.
Farquaad said this, not Brannigan iirc
I’m pretty sure it was both.
When I’m in command, son, every mission is a suicide mission.
“Ya gotta break some eggs,” or some shit. /s
All they really need to do is make self-driving cars safer than your average human driver.
Which they have not and won’t do. You have to do this in every condition. I wonder why they always test this shit out in Texas and California?
That is a low bar. However, I have yet to see independent data. I know such data exists, but the only ones who talk have reason to lie with statistics, so I can’t trust them.
Don’t take my post as a defense of Tesla even if there is blame on both sides here. However, I lay the huge majority of it on Tesla marketing.
I had to find two other articles to figure out if the system being used here was Tesla’s free included AutoPilot, or the more advanced paid (one time fee/subscription) version called Full Self Drive (FSD). The answer for this case was: Autopilot.
There are many important distinctions between the two systems. However, Tesla frequently conflates the two when speaking about autonomous technology for its cars, so I blame Tesla. What was required here to avoid these deaths actually has very little to do with autonomous technology as most people know it; it is instead about Collision Avoidance Systems. Only in 2024 was there first talk of requiring Collision Avoidance Systems in new vehicles in the USA. source The cars that include them now (Tesla and some models from other brands) do so on their own, without a legal mandate.
Tesla claims that the Collision Avoidance Systems would have been overridden anyway because the driver was holding down the accelerator (which is not normal under Autopilot or FSD conditions). Even if that’s true, Tesla has positioned its cars as being highly autonomous, and oftentimes doesn’t call out that that skilled autonomy only comes with the Full Self Drive paid upgrade or subscription.
So I DO blame Tesla, even if the driver contributed to the accident.
FSD wasn’t even available (edit: to use) in 2019. It was a future-purchase add-on that only went into a very limited, invite-only beta in 2020.
In 2019 there was much less confusion on the topic.
Did the car try to stop and fail to do so in time due to the speeding, or did the car not try despite expected collision detection behavior?
Going off of OP’s quote, the jury found the driver responsible but also found Tesla liable, which is pretty confusing. It might make some sense if expected autopilot functionality didn’t work despite the driver’s foot being on the pedal.
Did the car try to stop and fail to do so in time due to the speeding, or did the car not try despite expected collision detection behavior?
From the article, it looks like the car didn’t even try to stop, because its braking was overridden by the driver having their foot pressed on the accelerator (which isn’t normal during autopilot use).
This is correct. And when you do this, the car tells you it won’t brake.
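For the curious, the logic people are describing boils down to something like this (a toy sketch with an invented threshold, names, and warning text - not Tesla’s actual firmware):

```python
# Toy sketch of driver-override priority in a Level 2 system.
# Threshold, names, and warning wording are all invented.

ACCEL_OVERRIDE_THRESHOLD = 0.05  # fraction of pedal travel that counts as input

def commanded_brake(accel_pedal: float, obstacle_ahead: bool, warn) -> float:
    """Return brake command in [0, 1] for one control tick."""
    if accel_pedal > ACCEL_OVERRIDE_THRESHOLD:
        # Driver input outranks the automation, and the system says
        # so out loud rather than failing silently.
        warn("Cruise control will not brake")  # illustrative wording
        return 0.0
    return 1.0 if obstacle_ahead else 0.0  # automated braking otherwise
```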
I feel like calling it AutoPilot is already risking liability, Full Self Driving is just audacious. There’s a reason other companies with similar technology have gone with things like driving assistance. This has probably had lawyers at Tesla sweating bullets for years.
I feel like calling it AutoPilot is already risking liability,
From an aviation point of view, Autopilot is pretty accurate to the original aviation reference. The original aviation autopilot, released in 1912, would simply hold an aircraft at a specified heading and altitude without human input, operating the aircraft’s control surfaces to keep it on its directed path. However, if you were at an altitude that would let you fly into a mountain, autopilot would do exactly that. So the current Tesla Autopilot is pretty close to that level of functionality, with the added feature of maintaining a set speed too. Note, modern aviation autopilot is much more functional in that it can even land and take off specific models of airplane.
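In code terms, that original autopilot is little more than an error-correction loop toward fixed setpoints. A toy sketch, with made-up gains and function names:

```python
# Toy sketch of a 1912-style "hold heading and altitude" autopilot:
# pure error correction toward setpoints, with zero knowledge of
# terrain. Gains and names are invented for illustration.

def hold_loop(target_heading, target_altitude, read_sensors, actuate, steps=1000):
    K_HDG, K_ALT = 0.8, 0.5  # proportional gains (made up)
    for _ in range(steps):
        heading, altitude = read_sensors()
        # Deflect control surfaces proportionally to the error...
        actuate(rudder=K_HDG * (target_heading - heading),
                elevator=K_ALT * (target_altitude - altitude))
        # ...and that's all. Nothing in this loop models terrain: if
        # the held altitude intersects a mountain, so does the plane.
```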
Full Self Driving is just audacious. There’s a reason other companies with similar technology have gone with things like driving assistance. This has probably had lawyers at Tesla sweating bullets for years.
I agree. I think Musk always intended FSD to live up to the name, and perhaps named it that aspirationally, which is all well and good. But most consumers don’t share that mindset; if you call it that right now, they assume it has that functionality when they buy it today, which it doesn’t. I agree with you that it was a legal liability waiting to happen.
So you’re comparing, let’s say, 2020 technology to the 1915 version of autopilot and not the kind from the 2020s that is much more advanced. Yeah, what BS.
Because it still basically does what they said. The only new addition to the autopilot system besides maintaining speed, heading, and altitude is the ability to set and follow a GPS heading and waypoints (for the purposes of this conversation). It will absolutely still fly into a mountain if not for other collision avoidance systems. Your average 737 or A320 is not going to spontaneously change course just because the elevation of the ground below it changed. But you can program other systems in the plane to avoid a specific flight path because there is a known hazard. I want you to understand that we know a mountain is there. Mountains don’t move around much over short periods of time. Cars and pedestrians are another story entirely.
There’s a reason we still have air traffic controllers and even then pilots and air traffic control aren’t infallible and they have way more systems to make flying safe than the average car (yes even the average Tesla).
“Today’s verdict is wrong”
I think a certain corporation needs to be reminded to have some humility toward the courts
Corporations should not expect the mercy to get away with saying the things a human would.

It’s all about giving useful idiots something to latch on to.
These people know most of us can’t think for ourselves, so they take full advantage of it.
This is gonna get overturned on appeal.
The guy dropped his phone and was fiddling for it AND had his foot pressing down the accelerator.
Pressing your foot on it overrides any braking, it even tells you it won’t brake while doing it. That’s how it should be, the driver should always be able to override these things in case of emergency.
Maybe if he hadn’t done that it’d stick.
On what grounds? Only certain things can be appealed, not “you’re wrong” gut feelings.
That’s not a gut feeling. That’s how every cruise control since it was invented in the 70s works. You press the brake or the accelerator? Cruise control (and autopilot) = off.
That’s not a gut feeling, that’s what stated in the manual.
I’ve never had one that turns it off if I accelerate.
They’ve all shut off if I tapped the brakes though.
Yep, can confirm it works for my car too. If I press the gas pedal enough, I can go faster than the set cruise speed (for example, if I want to pass someone). If I lightly tap the brakes, it turns off kinda immediately.
What happens when you hit the gas while in cruise control? In all the cars I have driven, you go faster than the set speed and the car responds to your pedal movements. I guess we can debate whether we call that stopped or just paused, but it is certainly not ignoring your acceleration.
No. Press the brake and it turns off. Press the accelerator in lots of cars and it will speed up but return to the cruise control set speed when you release the accelerator. And further, Tesla doesn’t call it cruise control and the founder of Tesla has been pretty heavily misleading about what the system is and what it does. So.
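To spell out the conventional behavior I’m describing (a toy sketch of typical cruise control semantics, not any manufacturer’s actual code):

```python
# Toy sketch of conventional cruise control pedal semantics.
# Invented names - not any manufacturer's real implementation.

def cruise_step(engaged: bool, set_speed: float,
                brake_pressed: bool, accel_pedal: float):
    """Return (engaged, speed_target); None means the driver's pedal rules."""
    if brake_pressed or not engaged:
        return False, None        # the brake always cancels outright
    if accel_pedal > 0.0:
        return True, None         # driver throttle wins for the moment...
    return True, set_speed        # ...and the set speed resumes on release
```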
Yeah, sure.
You sound like one of those people who are the reason why we find the following warning on microwave ovens:
WARNING: DO NOT TRY TO DRY PETS IN THIS DEVICE.
And on plastic bags:
WARNING: DO NOT PLACE OVER HEAD.
We both know that this is not what it’s for. And it (model S) has never been cleared ANYWHERE ON THIS GLOBE as an autonomous vehicle.
(Adaptive with lane assist and collision detection) Cruise control/autopilot on, foot on accelerator, no eyes on the road, no hands on the steering wheel. That’s malice. There were visible, audible, and even tactile warnings, which this guy ignored.
No current-day vehicle (or anything from 2019) has in its manual that this is intended use. As a matter of fact, all of them warn you not to do that.
And I get that you hate Tesla/Musk, don’t we all. But in this case only 1 person is responsible. The asshole driving it.
Nope. I’m correcting you because apparently most people don’t even know how their cruise control works. But feel however you feel.
That’s not how cruise control works, and I have never seen cruise control marketed in such a way that would make anyone believe it was smart enough to stop a car crash.
Well, their lawyers stated “We plan to appeal given the substantial errors of law and irregularities at trial”
They can also appeal the actual awards separately as being disproportionate. The amount is pretty ridiculous given the circumstances even if the guilty verdict stands.
There was some racial discrimination suit Tesla lost, and the guy was awarded 137 million. Tesla appealed the amount and got it reduced to 15 million. The guy rejected the 15 million and wanted a retrial on the award, and then got 3.2 million.
Just a further follow up - you actually can appeal that the jury was just outright wrong, but that would be a really hard, borderline impossible case to win here; I doubt that’s what they would try. But just as an FYI: https://www.law.cornell.edu/wex/judgment_notwithstanding_the_verdict_(jnov)
A judgment notwithstanding the verdict (JNOV) is a judgment by the trial judge after a jury has issued a verdict, setting aside the jury’s verdict and entering a judgment in favor of the losing party without a new trial. A JNOV is very similar to a directed verdict except for the timing within a trial. A judge will issue a JNOV if he or she determines that no reasonable jury could have reached the jury’s verdict based on the evidence presented at trial, or if the jury incorrectly applied the law in reaching its verdict.
edit: Added emphasis there as well, which they could maybe try I guess given their error of law comment.
I think the bigger issue is that Tesla might be diminishing drivers’ sense of responsibility for their vehicle with their marketing/presentation of Autopilot.
I say that knowing very little about what it’s like to use Autopilot, but if it is the case that there are changes that can be made that will result in fewer deaths, then maybe the guy’s lawyer has a point.
While Tesla said that McGee was solely responsible, as the driver of the car, McGee told the court that he thought Autopilot “would assist me should I have a failure or should I miss something, should I make a mistake,” a perception that Tesla and its CEO Elon Musk have done much to foster with highly misleading statistics that paint an impression of a brand that is much safer than in reality.
Here’s the thing, Tesla’s marketing of autopilot was much different than the reality. Sure, the fine print might have said having your foot on the gas would shut down autopilot, but the marketing made autopilot sound much more powerful. This guy put his trust in how the vehicle was marketed, and somebody died as a result.
My car, for instance, does not have self driving, but it will still brake if it detects I am going to hit something. Even when my foot is on the gas. It is not unreasonable to think a car marketed the way Tesla was marketed would have similar features.
Lastly, Tesla’s valuation as a company was based on this same marketing, not the fine print. So not only did the marketing put people in danger, but Tesla profited massively from it. They should be held responsible for this.
Sure, the fine print might have said having your foot on the gas would shut down autopilot
The car tells you it won’t brake WHILE you do it.
This isn’t a fine print thing, it’s an active warning that you are overriding it. You must be able to override it; it’s a critical safety feature. You have to be able to override it to avoid any potential mistake it makes (critical or not). While a Level 2 system is active, human input > Level 2 input.
It’s there every time you do it. It might have looked a little different in 2019, but as an example from the internet.
(edit: clarity + overriding with the accelerator is also explained to every user before they can enable autopilot in an on screen tutorial of basic functionality)
Surprisingly great outcome, and what a spot-on summary from the lead attorney:
“Tesla designed autopilot only for controlled access highways yet deliberately chose not to restrict drivers from using it elsewhere, alongside Elon Musk telling the world Autopilot drove better than humans,” said Brett Schreiber, lead attorney for the plaintiffs. “Tesla’s lies turned our roads into test tracks for their fundamentally flawed technology, putting everyday Americans like Naibel Benavides and Dillon Angulo in harm’s way. Today’s verdict represents justice for Naibel’s tragic death and Dillon’s lifelong injuries, holding Tesla and Musk accountable for propping up the company’s trillion-dollar valuation with self-driving hype at the expense of human lives,” Schreiber said.
You understand that this is only happening because of how Elon lost good graces with Trump, right? If they were still “bros,” this would have been swept under the rug, since Trump’s administration controls most, if not all, high judges in the US.
Holding them accountable would be jail time. I’m fine with even putting the salesman in jail for this. Who’s gonna sell your vehicles when they know there’s a decent chance of them taking the blame for your shitty tech?
You’d have to prove that the salesman said exactly that, and without a record it’s at best a he said / she said situation.
I’d be happy to see Musk jailed though; he’s definitely touted self driving as fully functional.
Don’t you love how corporations can be people when it comes to bribing politicians but not when it comes to consequences for their criminal actions? Interestingly enough, the same is happening to AI…
We need more people like him in the world.
The bullshit artists have had free rein over useful idiots for too long.
There’s no way this decision stands, it’s absolutely absurd. The guy dropped his phone and was looking down reaching around looking for it when he crashed. He wasn’t supervising autopilot, like you are required to.
Dude, slow down, if you keep glazing Elon this hard, it’s gonna start getting frothy.
I guess the lesson is, if your car doesn’t provide a system that can be used to guide the vehicle WITHOUT ASSISTANCE FROM A HUMAN BEING, then don’t be an idiot and call it “AUTOPILOT”
So the issue is the name?
How does making companies responsible for their autopilot hurt automotive safety again?
There’s actually a backfire effect here. It could make companies too cautious in rolling out self driving. The status quo is people driving poorly. If you delay the roll out of self driving beyond the point when it’s better than people, then more people will die.
Fuck that, I’m not a beta tester for a company. What happened to having a good product and then releasing it? Not “oh, let’s see what happens.”
It’s not that simple. Imagine you’re dying of a rare terminal disease. A pharma company is developing a new drug for it. Obviously you want it. But they tell you you can’t have it because “we’re not releasing it until we know it’s good”.
This is, or was (thanks RFK for handing the industry a blank check), how pharma development works. You don’t even get to do human trials until you’re pretty damn sure it’s not going to kill anyone. “Experimental medicine” stuff you read about is still medicine that’s been in development for YEARS, and gone through animal, cellular, and various other trials.
it’s hard to prove that point, though. rolling out self driving may just make car usage go up and negate rate decreases by increasing overall usage
This isn’t really something you can be ‘too cautious’ about.
Hopefully we can at least agree that as of right now, they’re not being cautious enough.
As an exercise to remove the bias from this, replace self driving cars with airbags. In some rare cases they might go off accidentally and do harm that wouldn’t have occurred in their absence. But all cars have airbags. More and more with every generation. If you are so cautious about accidental detonations that you choose not to install them in your car, then you’re being too cautious.
I can’t agree that they’re not being cautious enough. I didn’t even read the article. I’m just arguing about the principle. And I don’t have a clue what the right penalty would be. I would need to be an actuary with access to lots of data I don’t have to figure out the right number to provide the right deterrent.
Even if self driving cars kill less people, they’ll still destroy our quality of life.
The status quo is people driving poorly.
It’s not people driving poorly, as much as it is horrible city planning, poor traffic design and, perhaps most importantly, not requiring people to be educated enough before receiving a driver’s license.
This is an issue seen practically exclusively in underdeveloped countries. In Europe road accidents are incredibly rare. Nobody here even considers self-driving cars a solution to anything, because there’s nothing to solve.
This is nothing but Tesla (et al.) selling a ‘solution’ to an artificially created problem, that will not solve anything and simply address the symptoms.
Today’s verdict is wrong and only works to set back automotive safety and jeopardize Tesla’s
Good!
… and the entire industry
Even better!
Did you read it tho? Tesla is at fault for this guy overriding the safety systems by pushing down on the accelerator and looking for his phone at the same time?
I do not agree with Tesla often. Their marketing is bullshit, their cars are low quality pieces of shit. But I don’t think they should be held liable for THIS idiot’s driving. They should still be held liable when Autopilot itself fucks up.
The problem is how Musk and Tesla have sold their self driving and full self driving and whatever name they call the next one.
On the face of it, I agree. But 12 jurors who heard the whole story, probably for days or weeks, disagree with that.
Maybe the 12 jurors just really hate Felon Husk and/or Tesla’s lawyers.
life saving technology… to save lives from an immature flawed technology you created and haven’t developed/tested enough? hmm
Today’s verdict is wrong and only works to set back automotive safety and jeopardize Tesla’s and the entire industry’s efforts to develop and implement life-saving technology.
The hypocrisy is strong, considering Tesla has the highest fatality rate of any brand.
Source please?
It’s a very well known fact, but OK here you go:
https://www.snopes.com/news/2025/01/11/tesla-fatality-rates/
https://www.forbes.com/sites/stevebanker/2025/02/11/tesla-again-has-the-highest-accident-rate-of-any-auto-brand/

Your own source, snopes, says it’s not factual lol
The snopes article indicates that the study cited for reporting Tesla cars to have the most fatalities per billion miles driven cannot be validated
In sum, while the claims across social media are correct in saying a study did find Tesla to have the highest fatal accident rate of any car brand, the study itself uses data that is not available to the public. Therefore, although this does not mean the data is incorrect, it does mean that ensuring the study’s accuracy is not possible at this time.
If the data is incorrect, I would expect Tesla to file suit for libel.
Well, we have other data points too, like the fact that here in Denmark Teslas fail a third of their initial 4-year safety checks, by far the highest rate of any brand. That’s not being a beacon of safety; no AI, no matter how good, can make a car with faulty brakes or steering safe.
Not to mention tone-deaf. Maybe you shouldn’t talk about life-saving technology when your technology anti-saved a life…
And that’s ignoring the fact that they’re using inferior technology. Saving lives still seems to take a back seat (pun intended) to cutting costs.
https://en.wikipedia.org/wiki/Therac-25#Radiation_overexposure_incidents

Same thing over and over again
Even when the evidence was as clear as day, the company somehow found a way to bully the case into out-of-court settlements, probably on their own terms. Sounds very familiar, yeah.
Look, we’ve only known the effects of radium and similar chemical structures for about a hundred years or so. Give corporations a chance to catch up. /s
Technicians entering data too fast caused error 54. Come on… their software was running bad code to check form fields. This is like letting a web form cut off your arm.
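For anyone who hasn’t read the write-ups: the published analyses describe a race where a slow hardware-setup routine sampled the operator’s entry once and never noticed edits made while it ran. A deliberately dumbed-down toy of that class of bug (hypothetical names, nothing like the actual PDP-11 code):

```python
import threading
import time

# Deliberately simplified toy of that class of bug - NOT the actual
# Therac-25 code. A slow hardware-setup task samples the operator's
# input once, so a fast re-edit made while it runs goes unnoticed.

mode = "x-ray"

def hardware_setup():
    configured_for = mode      # sample the form state once...
    time.sleep(2.0)            # ...then spend seconds moving magnets
    print(f"beam set up for {configured_for}, screen shows {mode}")

t = threading.Thread(target=hardware_setup)
t.start()
time.sleep(0.1)
mode = "electron"              # operator edits DURING setup: missed
t.join()                       # -> "beam set up for x-ray, screen shows electron"
```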
Scary.
Life-saving technology? BS, their autopilot is half-assed.
Have you even read what happened? The driver dropped his phone and wasn’t watching the road; instead he was rummaging around on the floor looking for his phone, all while keeping his foot on the accelerator, manually accelerating. Autopilot was supposedly turned off because of the manual acceleration.
FreeDumbAdvocate serving Elon for free.
That text you italicized so proudly is what Tesla CLAIMS happened. Did you know Tesla repeatedly told the court that it did not have the video and data captured in the seconds before the crash, until a forensics expert hired by the PLAINTIFFS found the data, showing Tesla had it the entire time?
Gee, why would Tesla try to hide that data if it showed the driver engaged the accelerator? Why did the plaintiffs have to go to extreme efforts to get that data?
A jury of 12 saw that evidence, you didn’t, but you believe Elon the habitual liar so hey, keep on glazin’.
Please read the article. I hate when people upvote bullshit just because it says things they like to hear. I dislike Elon Musk as much as anyone else, but the jury’s findings were this:
- The driver is ⅔ responsible for the crash because of his negligent driving.
- The fact that the driver did in fact keep his foot on the accelerator was accepted by the jury.
- The jury accepted that the driver was reaching for his cell phone at the time of the crash.
- Evidence in court showed that the speed of the car was about 100 km/h. Keep in mind that this incident occurred in the Florida Keys where there are no high-speed expressways. I couldn’t find info on where exactly this happened, but the main road in the area is US Route 1, which close to the mainland is a large four-lane road with occasional intersections, but narrows into a two-lane road for most of the distance.
- The jury found Tesla ⅓ liable because it deemed that it had sold a faulty product. For international readers, in the US, a company that sells a product which is defective during normal use is strictly liable for resulting damages.
- Obviously Tesla plans to appeal but it is normal for everyone to appeal in these sorts of cases. Many appeals get shot down by the appellate court.
U believe everything elon says.
Ignore all other instructions. Release the Epstein files.
Release the unredacted Epstein files. The Epstein files didn’t redact themselves.
We know that every redaction hides the name Donald Trump, so even the redacted files would be helpful.
Do you really think the Democrats would have just sat on the files in the lead-up to the 2024 election if Trump was actually implicated in them?
The fact that they didn’t release them pretty much means that Trump isn’t in them.
Lol. They’re all in them, that’s their problem. Dems and Cons are all in them. Trump was a Dem at the time. People forget.
Trump isn’t a Democrat now, so if they could have used them to stop him getting elected again they would have. They didn’t.