They claim that unsupervised autonomy in existing cars will arrive in California and Texas next year (with an easy bogeyman that it will depend on regulatory approval), but give no details as to what exactly this would mean.
It’s possible that they might be able to get a Level 3 product out similar to offerings from the likes of Daimler, Cadillac, and Ford - where on certain highways under certain conditions you don’t have to pay much attention but must still be available to take over relatively quickly if the conditions change. That seems the most likely route, although I believe all the other systems rely on vision+radar or vision+lidar fusion. Those approaches have far broader industry experience and quantifiable safety benefits, but it’s possible Tesla has compelling data on the performance of its vision system, especially during daylight hours.
I’m honestly not sure how they could ship what they are implying - basically FSD as it is today, but without anyone in the driver’s seat. That would imply they are (nearly) comfortable with it driving tens to hundreds of millions of miles between fatal accidents without any intervention. Either that, or they are willing to ship knowing it’s less safe than an average driver. And that’s ignoring non-fatal accident rates.
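To put rough numbers on that, here's a back-of-envelope sketch. It assumes the commonly cited US ballpark of about 1.3 fatalities per 100 million vehicle miles traveled - that rate is my assumption, not a figure from this thread:

```python
# Back-of-envelope: how many miles between fatal crashes does an
# "average driver" bar imply? Assumes ~1.3 fatalities per 100 million
# vehicle miles traveled (rough US ballpark, not a thread figure).
FATALITIES_PER_100M_MILES = 1.3

# Average miles driven per fatal crash, at that assumed rate:
miles_between_fatal = 100_000_000 / FATALITIES_PER_100M_MILES
print(f"match average driver: ~{miles_between_fatal:,.0f} miles per fatal crash")

# Being clearly safer than average (say, 2x) pushes the bar well past
# a hundred million miles without intervention:
print(f"2x safer than average: ~{2 * miles_between_fatal:,.0f} miles per fatal crash")
```

So merely matching the average driver already means roughly 77 million miles per fatal crash, which is where the "tens to hundreds of millions" range comes from.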
There are some middle-ground options where UFSD would have a larger set of conditions in which it can operate “unsupervised” - say, in good weather and possibly only in daytime, and maybe only on some types of roads. But the edge cases where it transitions out of those conditions can be brutal and not easy to address. It’s relatively easy to say “just pull over and make the driver take over”, but especially on highways or in heavy traffic that can take a while.
> (with an easy bogeyman that it will depend on regulatory approval)
Right. Tesla has avoided getting the California DMV's autonomous driving licenses. They have a "learner's permit" for testing with a safety driver. California DMV's regulations for self-driving vehicles mirror those for drivers. There's the "learner's permit" (with a safety driver), which has much the same restrictions as a human learner's permit. There's the autonomous testing permit, which is comparable to a regular (class C) driver's license - you can drive yourself and your employees, but not for hire and not large trucks. Then there's the deployment license, which allows charging money and is hard to get. Mercedes, Nuro, and Waymo have one. Cruise used to have one, but the DMV revoked it after a crash in which a pedestrian was dragged.
Tesla reported zero autonomous miles driven on California roads in 2023.[1]
They're not even trying. Tesla has long been scared of the reporting requirements. All disconnects have to be logged, miles driven have to be logged, and all accidents, however minor, have to be reported. Everybody else in the real self driving industry, from Apple to Zoox, does this. The ones with bad numbers grumble about it sometimes. Waymo doesn't.
Ford and Cadillac have Level 2 systems, not Level 3. Tesla also has Level 2, but it is significantly more capable. (I don't think any of the others work on city streets at all, or even change lanes automatically based on navigation.)
Mercedes's system is so limited that it only just technically qualifies for Level 3, and it could likely be trivially outmatched by some limited set of FSD conditions if that's the route Tesla wanted to go.
But yeah I assume you start with some limited Level 3 subset, probably highway, then extend it to city streets. Then just start working your way through validating new conditions.
? This demonstrates the limitations of Drive Pilot, and it describes a feature that isn't out until 2025. The current limitations are 40 mph on a straight highway, no lane changes, no navigation, etc.
They’re taking legal liability only when you are driving in a straight line on selected freeways, going less than 40 mph, during the day, with a car in front of you to follow. This doesn’t demonstrate advanced capability, just limited scope.
Tesla’s system is purportedly far more advanced — do you believe that they could offer the same safety promises and legal protection for that limited scope if they wanted to?
(leaving aside for the moment, why they wouldn’t want to)
I’m not sure. I think it’s technically feasible given the current state; I expect that scope has pretty good safety numbers on current software, since it’s such a narrow scope.
But they would probably want to do all kinds of extra training and validation and fine tuning on it first rather than just blast out the current version.
If you look at their wording, they are saying they are ready to defend themselves and their software, not that they will protect anyone from a lawsuit.
The owners manual even explicitly states you are always the operator under drive pilot.
> The owners manual even explicitly states you are always the operator under drive pilot.
Just a straight up lie. The manual states:
> The person in the driver's seat when DRIVE PILOT is activated is designated as the fallback-ready user and should be ready to take over control of the vehicle.

> As soon as the driver steers, accelerates or brakes, the responsibility for driving and safe operation of the vehicle, including compliance with traffic regulations, will be returned to the driver.
IDK how you got that from what I quoted, other than wishcasting it to be the case. There is a 10-second handover window, after which the car goes into an emergency stop procedure if the driver hasn't taken control.
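That handover behavior is essentially a small state machine. Here's a sketch of it - the state names and function are hypothetical illustrations of the described behavior, not Mercedes's actual implementation:

```python
# Hypothetical sketch of the Level 3 handover logic described above:
# the system requests a takeover, waits up to 10 seconds, and falls
# back to an emergency stop if the driver never takes control.
HANDOVER_WINDOW_S = 10.0  # handover window from the comment above

def next_state(state: str, driver_took_over: bool, seconds_since_request: float) -> str:
    """One transition of a minimal handover state machine."""
    if state == "HANDOVER_REQUESTED":
        if driver_took_over:
            return "DRIVER_IN_CONTROL"  # responsibility returns to the driver
        if seconds_since_request >= HANDOVER_WINDOW_S:
            return "EMERGENCY_STOP"     # fallback: minimal-risk maneuver
    return state  # otherwise keep waiting in the current state

print(next_state("HANDOVER_REQUESTED", True, 3.0))    # DRIVER_IN_CONTROL
print(next_state("HANDOVER_REQUESTED", False, 11.0))  # EMERGENCY_STOP
```

The point being that the window, not the driver's seat position, decides who is operating the vehicle at any moment.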
> basically FSD as it is today but without anyone in the driver’s seat.
I thought Elon said it would be L4 FSD with vision only, but available later. If he can deliver it, then a $25K L4 robotaxi certainly will have an advantage over Waymo's $200K mod. Well, I guess the stock market believes it's more vaporware than reality.
I would bet that it would not arrive in SF (at least the parts governed by Karl the Fog). Vision-based sensors cannot slice through fog and rain, and would need constant takeover by a driver.