> If I read the article it says autopilot, not FSD.
What's the difference? And does it matter?
Both are misleadingly named, per the OP:
> In December 2025, a California judge ruled that Tesla’s use of “Autopilot” in its marketing was misleading and violated state law, calling “Full Self-Driving” a name that is “actually, unambiguously false.”
> Just this week, Tesla avoided a 30-day California sales suspension only by agreeing to drop the “Autopilot” branding entirely. Tesla has since discontinued Autopilot as a standalone product in the U.S. and Canada.
> This lends weight to one of the main arguments used in lawsuits since the landmark case: Tesla has been misleading customers into thinking that its driver assist features (Autopilot and FSD) are more capable than they are – leading drivers to pay less attention.
My argument was that the sense that the name Autopilot is misleading comes not from Tesla naming it wrong but from what most people think "autopilots" on aircraft do. (And that is probably good enough to argue in court: it doesn't matter what's factually correct, it matters what people understand based on their knowledge.)
Autopilot on a Tesla historically did two things: traffic-aware cruise control (keeping a gap from the car in front of you) and staying in its lane. If you tell it to, it can suggest and change lanes. In some cases it'll also take an exit ramp (which was called Navigate on Autopilot).
Autopilots on planes do roughly the same. They keep speed and heading, and will also change heading to follow a GPS flight plan. Pilots still take off and land the plane (just as Tesla drivers still get onto the highway and off it).
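For intuition, that feature set reduces to a couple of independent feedback loops. Here is a minimal, purely illustrative sketch of the car side (made-up gains and limits, not Tesla's actual control law):

```python
def assist_step(speed_mps, set_speed_mps, gap_m, desired_gap_m, lane_offset_m):
    """One tick of a toy 'traffic-aware cruise + lane keep' controller.
    Everything here is invented for illustration only."""
    # Longitudinal: track the set speed, but back off as the gap shrinks.
    accel = 0.4 * (set_speed_mps - speed_mps) + 0.3 * (gap_m - desired_gap_m)
    accel = max(-3.0, min(1.5, accel))                   # m/s^2, clamped

    # Lateral: steer against the offset from the lane centre.
    steer = max(-0.3, min(0.3, -0.08 * lane_offset_m))   # radians at the wheel

    return accel, steer
```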
Full Self Driving (to which they've now added the word "Supervised", probably because of the court cases, though it was always quite obviously supervised: you had to keep shaking the steering wheel to prove you were alert, same as with Autopilot) is a different AI model that even stops at traffic lights, navigates parking lots, everything. That's the true "summon my car from LA to NY" dream, at least.
So to answer your question, "What's the difference" – it's huge. And I think they've covered that in earlier court cases.
But one could argue that maybe they should've restricted it to highways only (fewer traffic lights, no intersections)? I don't know the details of each recent crash, though.
Autopilots do a lot more than that because flying an aircraft safely is a lot more complicated than turning a steering wheel left and right and accelerating or braking.
Tesla’s Autopilot being unable to swap from one road to another makes it far less capable than decades-old civilian autopilots, which will get you to any arbitrary location as long as you have fuel. Calling the current FSD "Autopilot" would be overstating its capabilities, but reasonably fitting.
Recovering from upsets is the big thing. Maintaining flight level, speed, and heading while upside down isn’t acceptable.
Levels of safety are another consideration: car autopilots don’t use multiple levels of redundancy on everything because they can stop without falling out of the sky.
That's still massively simpler than making a self-driving car.
It's trivially easy to fly a plane in straight level flight, to the extent that you don't actually need any automation at all to do it. You simply trim the aircraft to fly in the attitude you want and over a reasonable timescale it will do just that.
> It's trivially easy to fly a plane in straight level flight, to the extent that you don't actually need any automation at all to do it. You simply trim the aircraft to fly in the attitude
That seemingly shifts the difficulty from the autopilot to the airframe. But that’s not actually good enough; it doesn’t keep an aircraft flying when it’s missing a large chunk of wing, for example. https://taskandpurpose.com/tech-tactics/1983-negev-mid-air-c...
Instead, you’re talking about the happy path, and if we accept the happy path as enough, there are weekend-project equivalents of self-driving cars built with minimal effort. Being production-worthy is about more than being occasionally useful.
Autopilot is difficult because you need to do several things well or people will definitely die. Self-driving cars are far more forgiving of occasional mistakes, but again it’s the "or people die" part that makes it difficult. Tesla isn’t actually ahead of the game; they are just willing to take more risks with their customers’ and the general public’s lives.
> Self driving cars are far more forgiving of occasional mistakes
I would say not, no.
It's almost impossible to crash a plane. There's nothing to hit except the ground, and you stay away from that unless you really really mean to get close.
It's very easy to crash a car, and if you do that most of the time you'll kill people outside the car, often quite a lot of them.
There are no production aircraft fitted with autopilots that can correct for breaking a wing off.
Autopilots have contributed to a significant number of crashes and that’s with a very safety conscious industry.
In a hypothetical Tesla-style "let’s take more risk" approach, buggy autopilots can surprisingly quickly get into a situation at cruising altitude which isn’t recoverable before hitting the ground. Asking "what is the worst possible thing an autopilot could do in this situation?" is eye-opening here.
> There are no production aircraft fitted with autopilots that can correct for breaking a wing off.
That was a production aircraft still in service. https://simpleflying.com/how-many-f-15-eagles-are-still-in-s...
Granted, that specific case depends on the aircraft being a lifting body etc., so it obviously doesn’t extend to commercial aviation. But my point was that lack of aerodynamic stability on its own isn’t enough that giving up is OK.
> Autopilots have contributed to a significant number of crashes and that’s with a very safety conscious industry.
"Contributed to", in the sense that the pilots decided to just blindly trust the autopilot and let it make a developing situation worse rather than, oh I don't know, maybe FLYING THE DAMN PLANE.
> buggy autopilots can surprisingly quickly get into a situation at cruising altitude which isn’t recoverable before hitting the ground
If you allow the autopilot to fly the plane into the ground, yes. If you're paying attention you ought to be able to recover just about anything, if most of the plane is still working. The vast majority of incidents where aircraft have departed controlled flight and crashed are because the pilots lost sight of the important thing - FLYING THE DAMN PLANE.
> But my point was lack of aerodynamic stability on its own isn’t enough that giving up is ok.
It's got nothing to do with aerodynamic stability. If you adjust the steering and suspension in a car correctly, it'll drive in a perfectly straight line with no user input for a surprisingly long way. With modern electronic power steering and throttle-by-wire systems it's actually surprisingly easy to turn an off-the-shelf car (even something cheap, secondhand, and quite old like a 2010s Vauxhall Corsa) into a simple line-following robot like we used to build at uni in the 80s and 90s in robotics class. Sure, you need a disused aerodrome to play with it, but it'll work.
There is the far greater problem that self-driving cars have to cope with a far more rapidly changing environment than an aircraft. A self-flying plane would be far easier to get right than a self-driving car.
A human driver can't just react, painfully slowly, in the way that current "self-driving" cars do, they have to anticipate and be "reacting" before the problem even begins to start. You do it yourself, even if you don't realise it. You hang back from that car because you know they're going to - there, right across two lanes, not so much as a glance in their mirror, what did I tell you? - they're going to do something boneheaded. That car's just pulled in, the passenger in the back is about to open their door right into your - nicely done, you moved out to the line and missed them by 50cm at least.
Self-driving cars can't do that, and probably never will. Self-flying aircraft won't need to do that.
And an autopilot is a surprisingly simple device that responds in simple and predictable ways to sensor inputs.
>"Autopilots do a lot more than that because flying an aircraft safely is a lot more complicated than turning a steering wheel left and right and accelerating or breaking."
Can you elaborate? My very limited but very real knowledge of airplane autopilots in little Cessnas and Pipers is that they are in fact far simpler than cars: they are a simple control feedback loop that maintains altitude and heading, that's it. You can crash into the ground, a mountain, or other traffic quite cheerfully. I would not be surprised to find that adaptive cruise in cars is a far more complex system than a basic aircraft "autopilot".
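That "simple control feedback loop" is roughly the following. A toy sketch of an altitude-and-heading hold (invented gains, nothing like certified avionics):

```python
def autopilot_step(alt_ft, target_alt_ft, hdg_deg, target_hdg_deg):
    """Toy altitude/heading hold: trim pitch against the altitude error and
    bank against the heading error. It knows nothing about terrain or traffic."""
    pitch_trim = max(-10, min(10, 0.01 * (target_alt_ft - alt_ft)))   # degrees

    # Wrap the heading error into [-180, 180) so 350 -> 10 turns the short way.
    hdg_error = (target_hdg_deg - hdg_deg + 180) % 360 - 180
    bank_cmd = max(-25, min(25, 0.5 * hdg_error))                     # degrees

    return pitch_trim, bank_cmd
```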
Autopilot is similar to cruise control that is aware of other cars, and lane keeping. I would fully expect the sort of accident that happened to happen (drop phone, stop controlling vehicle, it continues through an intersection).
FSD has much more sophisticated features, explicitly handling traffic stops and lights. I would not expect the sort of accident to happen with FSD.
The fact that Tesla misleads consumers is a different issue from Autopilot and FSD being different.
> FSD has much more sophisticated features, explicitly handling traffic stops and lights. I would not expect the sort of accident to happen with FSD.
FSD at one point had settings for whether it could roll through stop signs, or how much it could exceed the speed limit by. I've watched it interpret a railroad crossing as a weirdly malfunctioning red light with a convoy of intermittent trucks rolling by. It took the clearly delineated lanes of a roundabout as mere suggestions and has tried to barrel through them in a straight line.
I'd love to know where your confidence stems from.
My confidence comes only from what I hear people doing with the system. I have zero experience with it and consider most of the PR from Tesla to be junk.
"would not expect" is the way a cautious person demonstrates a lack of confidence.
Well, the other person in the comments said the guy literally held his accelerator to the floor the entire time. Is that actually a reasonable standard, or are you preemptively out for blood because you would never let reality get in the way of a good agenda? Ironic, given that you go out of your way to accuse others of this. Methinks you doth protest too much?
When my spouse worked in the area of determining "the value of an individual" (economically, not morally), it was computed as present value of lifetime earnings (PVLE): the cumulative income of the individual, converted back to its current value (using some sort of inflation model). IIRC, the PVLE averaged out to about $1-10M.
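A minimal sketch of that present-value calculation (the earnings figure and horizon below are hypothetical; real calculations also model wage growth, work-life expectancy, and benefits):

```python
def pvle(annual_earnings, years, discount_rate):
    """Present value of a stream of future earnings, discounted back to today."""
    return sum(annual_earnings / (1 + discount_rate) ** t
               for t in range(1, years + 1))

# e.g. $60k/year for 40 more working years
for rate in (0.03, 0.05, 0.07):
    print(f"{rate:.0%}: ${pvle(60_000, 40, rate):,.0f}")
# Higher discount rates yield lower present values.
```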
You shouldn't be downvoted. Regardless of the moral or technical issues involved, there are established formulas used to calculate damages in wrongful death civil suits. Your range is generally correct, although certain factors can push it higher. (Punitive damages are a separate issue.)
There are not "established formulas" or, to the extent that they are, the coefficients and exponents are not determined. The parties always argue about the discount rates and whatnot.
"""Results. At a discount rate of 3 percent, males and females aged 20-24 have the highest PVLE — $1,517,045 and $1,085,188 respectively. Lifetime earnings for men are higher than for women. Higher discount rates yield lower values at all ages."""
They claim to have a pretrial agreement to reduce it to 3x compensatory damages (which would make the total judgment 160 million instead of 243 million).
Appealing is expensive because they have to post a bond with 100% collateral, and you pay for it yearly.
In this case, probably around 8 million a year.
So in general it's not worth appealing for 5 years unless they think they will knock off 25-30% of the judgement.
Here it's the first case of its kind, so I'm sure they will appeal, but if they lose those appeals, most companies that aren't insane would cut their losses instead of trying to fight everything.
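A back-of-envelope version of that appeal math, using the figures given above (the 160 million capped amount, the roughly 8 million per year bond cost, and a 5-year appeal); everything else is arithmetic:

```python
judgment = 243_000_000
capped = 160_000_000                # if the claimed 3x-compensatory cap applies
bond_cost_per_year = 8_000_000
years_of_appeals = 5

overhead = bond_cost_per_year * years_of_appeals   # ~$40M just to keep appealing
print(f"bond overhead: ${overhead:,}")
print(f"break-even reduction vs. full judgment: {overhead / judgment:.0%}")  # ~16%
print(f"break-even reduction vs. capped amount: {overhead / capped:.0%}")    # 25%
```

Measured against the capped amount, the bond overhead is about 25%, which is presumably roughly where the 25-30% threshold above comes from.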
They will 100% fight it regardless. This is a massive verdict against them and it sets a bad precedent. There's no way they don't fight it. It's a ridiculously large verdict.
This is literally one of the 1-3 companies that have a decent strategy in the age of AI. The rest are pretending the changes will not affect them. Even with this judgment:
The guy decided to pick up his phone while driving a car not capable of red-light detection. It could have been any other car with similar auto-steer capabilities. Right now the same car, with OTA updates, would keep him alive. Sure, they are doing something wrong.
Ultimately, I believe there will need to be something catastrophic to oust Musk/change leadership. And by that point, it's questionable whether anything worthwhile will be left to salvage.
My current bet is that Optimus will fail spectacularly and Tesla gets left far behind as Rivian's R2 replaces it.
One thing I will note: I know folks that work at TSLA. Musk is more of a distraction. If he goes and if competent leadership is brought in, there's still enough people and momentum to make something happen...
Tesla has made some great cars, but their CEO is not making sound decisions. I really think a new CEO could turn around Tesla. It doesn't need to hit rock bottom first. Every major auto company has been through this.
This case will make settlement amounts higher, which is the main thing car companies care about when making decisions about driving features/marketing.
With Robotaxi it will get even higher, as it will clearly be 100% the company's fault.
You're already downvoted, but this quote from Fight Club always annoyed me as it misunderstands how recalls work.
1. Insurance companies price in the risk, and insurance pricing absolutely influences manufacturers (see the absolute crap that the Big 3 sold in the 70s)
2. The government can force a recall based on a flaw whether or not the manufacturer agrees
I'm not clear on what Tesla is doing these days. They've been left in the dust on autonomous driving, they've failed to update their successful car models, and their only new model was a spectacular failure.
>> I'm not clear on what Tesla is doing these days.
> Ask the CEO? Based on recent incentives and acquisitions, are they planning to remain a car company?
I believe Musk wants to hype humanoid robots, because he can't get away with irrationally hyping electric cars or self-driving technology like he used to.
Tesla was never a car company, their real product is sci-fi dreams.
Agreed, and he’s already behind in humanoid robots, so the hype there won’t last long. The problem is that China is obliterating him at every turn because they actually build things that work instead of just hyping things and quoting made-up numbers for how much money it could be worth if every human on the planet bought 20.
By which metrics has Tesla been left in the dust wrt autonomous driving? Right now they are the only brand where you can buy a car and have it do basically 90% (or sometimes 100%) of your daily driving. Sure, it's supervised, but the alternatives are literally extremely geofenced taxis.
It's not free, is it? You buy the car, subscribe to their arbitrarily-priced subscription service, and then it does 90% of your driving.
That's like paying for a "self-juicing juicer" that only works with proprietary juice packages sold through an overpriced subscription.
Edit: Mostly a criticism. I have no bone to pick with Elon, but subscription slopware is the reason why Chinese EVs are more desirable to average Joes like me.
You could argue, though, that Tesla isn’t targeting the average Joe, since they basically still haven’t made the affordable EV they have mentioned they are working on (I think the new affordable Cybertruck is still above the $40k promise, and they hinted the price will likely rise).
For luxury car owners or people who want the statement, subscriptions are a good way to capture additional payment, since if customers wanted the cheapest option, they would have already picked something else.
Tesla has a level 3 system that it's willing to gamble on not needing intervention for a handful of miles for a handful of Tesla fanboys. It's very telling that their "level 4" robotaxis are basically unicorns and only exist (existed? it's not clear they are even available anymore) in a single neighborhood subsection of the level 3 robotaxis' full area in Austin.
Waymo on the other hand has a level 4 system, and has for many years, in many cities, with large service areas.
Tesla is unquestionably in the dust here, and the delusional, er, faithful are holding out for this mythical switch flip where Elon snaps his fingers and every Tesla turns into a level 4 robotaxi (despite the compute power in these cars being on the level of a GTX 5090, and the robotaxis having custom hardware loadouts)
I don't understand the point of your reply. Waymo is geofenced taxis. You cannot buy a Waymo. It cannot drive basically wherever you want. Teslas mostly can. So, again, how is Tesla the one left in the dust?
> By which metrics has Tesla been left in the dust wrt autonomous driving
By the fact that they don't have autonomous driving. And this very judgement demonstrates that.
If you have to keep your full attention on the road at all times and constantly look out for the 10% case where the autopilot may spectacularly fail, it instantly turns off the vast majority of prospective users.
Funny enough the tech that Musk's tweets and the Tesla hype machine has been promising for the last decade is actually on the streets today. It's just being rolled out by Waymo.
In 2 years Tesla will be replacing most factory workers with fully autonomous robots that will do most of the work. This will generate trillions in revenue and is totally definitely trust me bro possible.
Expect huge updates on this coming in the near future, soon. Tesla will be the most valuable company on Earth. Get in the stock now.
(cars, solar panels, energy storage, and robotaxis are no longer part of the roadmap because optimus bots will bring in so much money in 2 years definitely that these things won't matter so don't ask about them or think about them thanks.)
I don't understand how that's a retort against the claim that they've "been left in the dust on autonomous driving". Are you contending that autonomous driving is the only reason that Tesla owners would like their cars?
> Tesla also claimed that references to CEO Elon Musk’s statements about Autopilot during the trial misled the jury....
> The company essentially argued that references to Elon Musk’s own public claims about Autopilot, claims that Tesla actively used to sell the feature for years, were somehow unfair to present to a jury. Judge Bloom was right to reject that argument.
Of course, since Elon Musk has lied and over-promised a lot about Tesla's self-driving technology. It's an interesting defense to admit your CEO is a liar and can't be trusted.
Yeah, when the SEC pushed on the same issue, the company's response was to add fine print that Elon's statements "may be aspirational" and "may not reflect engineering realities".
It's crazy that they weren't reeled in by a regulator and it had to make it all the way through the court system. People are dead. A court judgement can't change that. Preemptive action would have.
I’m not usually an apologist, and I’d agree with this judgement if the car was left to its own devices, but the driver of the car held his foot on the accelerator which is why it blew through those stop signs and lights.
In regards to the autopilot branding, would a reasonable person expect a plane on autopilot to fly safely if the pilot suddenly took over and pointed it at the ground?
The average person does not know how to fly a plane or what a plane autopilot does. It's a ridiculous superficial comparison. Planes have professional pilots who understand the capabilities and limits of aviation autopilot technology.
Tesla has had it both ways for ages - their stock price was based on "self-driving cars" and their liability was based on "asterisk asterisk the car cannot drive itself".
If the average person does not know what an autopilot does, why would they expect Tesla's 'autopilot' to take such good care of them? I am reminded of a case many years ago when a man turned on the cruise control in his RV and went to the back to make himself lunch, after which the RV went off some sort of hill or cliff.
Rudimentary 'autopilots' on aircraft have existed for about a century now, and the earlier versions (before transistorization) only controlled heading and attitude (if conditions and other settings allowed it), with little indication of failure.
According to your analogy, certified pilot = certified driving-license holder. It's not like Tesla is advertising that someone without a driving license, or an ineligible person, can drive using Autopilot. I wonder how you can even justify your statement.
Autopilot is part of a private pilot's license, and the systems are approved by the FAA. Tesla Autopilot isn't part of a driving license, nor did it undergo review by the NHTSA prior to launch, because Elon considered it "legal by default".
Autopilots in airplanes are kind of dumb (keep heading, speed, and altitude; they won’t do much of anything else), which is why Tesla doesn’t use the name as branding for its full self-driving software. People at least know that much.
But then again, even on HN people like the parent think that autopilot is the same as full self-driving, when it is and always has been just smarter cruise control. The payout was for Autopilot (a feature that most new cars have these days under various names), not Full Self Driving.
> That is absolutely not true. A plane on autopilot can land itself except for applying the brakes.
Autolanders are separate systems from autopilots, and there are definitely planes in production today with autopilots (pretty universal) and no autolanders (like almost all Cessnas; you need the Garmin Autoland system, and it’s only for emergencies). Autopilots have been a thing since the 1920s, when they were just a rope tied to a stick; they definitely didn’t do auto landing back then.
If you are trying to claim that all autopilots come with autolanders, that’s absolutely not true. Not even most do, and again, they are always separate systems even if the autolander can access the same servos used by the autopilot. Additionally, autolanders, unlike the autopilot, require the runway to support them (well, the ones used on commercial airplanes, where they’re sometimes used in low-visibility situations).
I really think only a few people on HN don’t get that autopilots are actually very much simple systems that have been around forever and do one thing well (keep the plane going in one direction at a specific altitude and speed).
I don't mean a reasonable pilot. Would a reasonable person expect autopilot in a plane to prevent it from crashing into something the pilot was accelerating towards while physically overriding the controls? The claim is that Autopilot should not have been able to crash even with the driver actively overriding it and accelerating into that crash.
To me, it's reasonable to assume that the "autopilot" in a car I drive (especially back in 2019) is going to defer to any input override that I provide. I wouldn't want it any other way.
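A toy version of that override policy (hypothetical, not Tesla's documented logic): an explicit driver throttle input wins over the assist's command, though the system can still flag what it would have done.

```python
def arbitrate(driver_accel_pct, assist_accel_mps2, obstacle_ahead):
    """Toy longitudinal arbitration for a driver-assist system (illustrative only)."""
    assist_wanted_to_brake = obstacle_ahead and assist_accel_mps2 < 0
    if driver_accel_pct > 0.0:
        # The human foot wins; at most, warn that the assist disagreed.
        return ("driver_throttle", driver_accel_pct, assist_wanted_to_brake)
    return ("assist", assist_accel_mps2, False)
```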
The original judgement held that the driver was 2/3 responsible, Tesla 1/3 responsible, which seems reasonable. The $243 million wasn't for causing the accident, but was a punitive amount for doing things that looked an awful lot like lying to the court and withholding evidence.
It seems clear that "autopilot" was a boisterous overclaim of its capabilities that led to people dying.
It may be minorly absurd to win founder-IPO-level wealth in a lawsuit, but it's also clear that smaller numbers don't act as an effective deterrent to people like Elon Musk.
Right! We demand engineering perfection! No autopilot until we guarantee it will NEVER kill a soul. Don't worry that human drivers kill humans all the time. The rubric is not better than a human driver, it is an Angelic Driver. Perfection is what we demand.
Tesla Autopilot seems to mostly drive hubris. The fine print says you're still supposed to maintain control. They don't have as sophisticated sensors as competitors because Elon decreed "humans don't have LiDAR, so we don't need to pay for it."
Nobody is saying it has to be perfect, but Tesla hasn't demonstrated that it's even trying.
I can see where they're coming from with the video-only concept, but even they admit it's not self-driving yet, so just don't call it self-driving (or "FSD**" or "autopilot") until it is.
I've always thought of it more as "Co-Pilot", but formally: "Autopilot" might truly be the better definition (lane-keeping, distance-keeping), whereas a "Co-Pilot" (in aviation) implies more active control, ie: pulling you up from a nose dive.
So... informally, "Tesla Co-Pilot" => "You're still the pilot but you have a helper", vs "Tesla Autopilot" => "Whelp, guess I can wash my hands and walk away b/c it's AuToMaTiC!"
...it's tough messaging for sure, especially putting these power tools into people's hands with no formal training required. Woulda-coulda-shoulda: similar to the 737 MAX crashes, should "pilots" of Teslas have required training in the safety and navigation systems before they were "licensed" to use them?
I'm so lost. The guy decided to pick up the phone from the floor while driving the car at high speed.
1. It could have been ANY car with similar (at that time) auto-steer capabilities.
2. Why the hate? Because of some false promise? As of today the same car would save the guy in the exact same situation, because FSD now handles red lights perfectly. Far better and safer than ANY other tech included at the average car price of the same segment ($40-50k).
Not sure if it’s using the same FSD decision matrix, but last night my Model S chimed at me to drive into the intersection while sitting at a red light, with absolutely zero possibility it saw a green light anywhere in the intersection.
Perfectly isn’t a descriptor I would use. But this is just anecdotal.
Another name for "false promise" when made for capital gain is "fraud". And when the fraud is in the context of vehicular autonomy, it becomes "fraud with reckless endangerment". And when it leads to someone's death, that makes it "proximate cause to manslaughter".
As the source article says, the jury did agree that the driver was mostly liable. They found Tesla partially liable because they felt that Tesla's false promise led to the driver picking up his phone. If they'd been more honest about the limitations of their Autopilot system, as other companies are about their assisted driving functionalities, the driver might have realized that he needed to stop the car before picking up his phone.
Will this have any effect on other companies developing self driving tech? It sets a very high precedent for fines, and may discourage companies from further working on such tech.
That's an old argument by corporations against liability. Should they not be fully liable?
It should discourage them from making unsafe products. If it's not economical for them to make safe products, it's good that they go bankrupt and the economic resources - talent, money - go to someone else. Bankruptcy and business failure are just as fundamental to capitalism as profit.
These product-liability lawsuits are out of control; perhaps this judgement is directionally correct, but the punitive damages seem insane. This reminds me of the lawsuits which drove Instant Pot bankrupt, where the users were clearly doing very stupid things, and suffered injuries because they were able to physically overpower the safety mechanisms on their pressure-cookers.
> These product-liability lawsuits are out of control
Businesses also claim that, all the time. We need some evidence.
I remember doctors claiming that malpractice lawsuits were out of control; research I read said that it wasn't an economic issue for doctors and that malpractice was out of control.
I invite you to read both the claims and the judgements related to the Instant Pot lawsuits yourself; they're all quite clear, and you can come to your own decision about how reasonable they are.
My read is that people overpowered the safety interlock, after which the lid (predictably) flew off, and they were injured (mostly by the hot steam and bits of food). I think it's ridiculous for people to expect safety mechanisms to be impossible to bypass, but maybe you disagree!
I've owned and used two of the affected models (for ~6 years in total), and also read the version of events presented by the plaintiffs (in their claims), as well as the judges' view of what happened.
I originally read the documents to understand whether the unit I own is safe, and believed in its safety strongly enough that I have continued to use it.
I’d make that bet, because you obviously don’t know much about pressure cookers or the law. Juries hear expert testimony; they do not perform independent investigations. I have inadvertently tested the safety mechanisms, and they are quite secure. I will not do what plaintiffs say they did, because it is so incredibly stupid.
My thought when it comes to self-driving vehicles, though, is how do you determine what is considered unsafe here? I'm not sure about Tesla's Autopilot, but Waymo, for example, has an excellent safety record that is better than most drivers when you compare the statistics. And yet they are still not able to avoid accidents entirely, as mistakes and unpredictable situations happen. Maybe we should consider a self-driving vehicle safe if it has been proven to drive more safely than the average driver? It is something I think should be considered.
Should they compensate the victims of such accidents? Of course. Should they pay out the equivalent of winning the lottery? That seems a bit much.
Developing, no, but once companies start releasing vehicles onto our shared public streets I have a lot less tolerance for launching science experiments that end up killing bystanders.
I can understand the argument that in the abstract over-regulation kills innovation but at the same time in the US the pendulum has swung so far in the other direction that it’s time for a correction.
I have no tolerance for bystanders being killed in general. If the science experiments kill fewer bystanders on average, I'm all for them; if they don't, they should be stopped until made safer.
In this case the judgement is so extreme because the judge had no tolerance for Tesla lying in relation to the server logs' existence and what they contained (namely that it was indeed their autopilot that was in full control, had been in full control for almost half an hour, and was not worried at all/not issuing warnings, at the time of the crash).
Almost all of these eye watering fines get reduced in further legal action. This has even happened to Tesla before with their news-making hostile workplace suit.
100% correct. Also, with 40,000 deaths due to car crashes each year in the U.S. (and 2 million+ severe injuries not resulting in death), I'd consider it a drop in the bucket.
For every story like this, there are 10 stories of people who died the old fashioned way behind the wheel.
Accidents like this are obviously tragic, but let's remember that self driving software is already ~10x safer than a human. Unfortunately, lawsuits like this will slow down the rollout of this life-saving technology, resulting in a greater loss of life.
Companies that roll it out responsibly aren’t having issues. Tesla deserves these judgments and should not be allowed to roll this software out in the irresponsible way they have been.
> https://driveteslacanada.ca/news/tesla-releases-new-fsd-safe...
You … can’t be serious, can you? Do you really think these statistics — even taken at face value — mean that self-driving software is “better than humans”?
This article is comparing miles with FSD engaged vs miles driven by humans. You do realize that those are driven under different conditions, yes? That FSD is engaged only for the easy stuff?
Also, you say “self driving software” but you’re linking to a Tesla press release. Do you think Tesla currently has the best self driving software? It seems to me that Waymo and Mercedes are way ahead, both of them putting their money where their mouth is insurance-wise, and using better tech. Yet neither Waymo nor Mercedes would claim their tech is anywhere near as good as a human.
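To make that confounding concrete, here is a toy calculation with entirely made-up numbers: pooling miles-per-crash across different road mixes can flatter a system that is engaged mostly on easy highway miles, even when the per-road-type comparison looks very different.

```python
# All figures below are invented purely to illustrate the statistical point.
assist = {"highway": {"miles": 9.0e8,  "crashes": 90},
          "city":    {"miles": 1.0e8,  "crashes": 100}}
humans = {"highway": {"miles": 1.0e12, "crashes": 300_000},
          "city":    {"miles": 2.0e12, "crashes": 1_800_000}}

def pooled_miles_per_crash(d):
    return sum(v["miles"] for v in d.values()) / sum(v["crashes"] for v in d.values())

def per_road_type(a, b):
    # Compare within each road type instead of pooling everything together.
    return {k: (a[k]["miles"] / a[k]["crashes"], b[k]["miles"] / b[k]["crashes"])
            for k in a}

print(pooled_miles_per_crash(assist), pooled_miles_per_crash(humans))  # ~5.3M vs ~1.4M
print(per_road_type(assist, humans))  # highway: better; city: roughly a wash or worse
```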
The comments on this post are very misinformed. The car did not issue a "Take over immediately" alert despite having the capability or claiming to do so[1]. Hence the partial blame.
The punitive damages are justified since Tesla, during the whole process, was found to have repeatedly lied about the existence of the server logs. @greentheonly on X found that the server logs were indeed uploaded to their servers[2], and after this they had to hand over the logs. The logs were crucial in determining that Autopilot/Autosteer was on and that no warnings were issued. Tesla knew that they had the server logs all along. For a corporation to engage in this behavior is almost criminal.
[1] https://electrek.co/2025/08/04/tesla-withheld-data-lied-misd...
[2] https://www.pcmag.com/articles/hacker-who-helped-score-243-m...
Tesla has been taking the legal position in these lawsuits that Tesla vehicles are not intended to be operated without an attentive driver with their hands on the steering wheel at all times.
I suspect it is technologically feasible to design a Tesla to detect:
(i) someone is sitting in the driver's seat (seatbelt systems have detected that for years);
(ii) hands are on the steering wheel;
(iii) the driver is attentive (not asleep and is actively monitoring the road).
Are Tesla vehicles designed to blare a loud annoying sound for as long as the system fails to detect an attentive driver?
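A sketch of what such a monitor might look like (the sensor and alert hooks are hypothetical placeholders; this is not Tesla's implementation):

```python
import time

def monitor_driver(read_sensors, warn, cut_assist, grace_s=10, escalate_s=25):
    """Toy attention monitor: chime when attention is lost, escalate to a
    continuous alarm, and finally disengage the assist. Illustrative only."""
    inattentive_since = None
    while True:
        seat_occupied, hands_on_wheel, eyes_on_road = read_sensors()
        if seat_occupied and hands_on_wheel and eyes_on_road:
            inattentive_since = None
        else:
            inattentive_since = inattentive_since or time.monotonic()
            lapse = time.monotonic() - inattentive_since
            if lapse > escalate_s:
                cut_assist()            # e.g. slow down / require manual control
            elif lapse > grace_s:
                warn(continuous=True)   # loud, persistent alarm
            else:
                warn(continuous=False)  # initial chime
        time.sleep(0.1)
```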