Why not the other way around? You do the driving, and the computer is the backup in case you mess up?
Humans are extremely bad at monitoring monotonous tasks, even when we're trained to do them. That's why trains have elaborate dead man's switches to ensure the operator is paying attention.
From these accidents it's clear that Teslas aren't good enough at making sure the driver is paying attention.
And it's also pretty clear that these autopilot victims weren't aware how distracted they really were. I'm pretty sure they thought they had everything under control, right up to the point when their car did something stupid.
> Why not the other way around? You do the driving, and the computer is the backup in case you mess up?
Unfortunately, having the computer take over has the same issues. If it thinks you're following the wrong lane, it could choose to steer you into a barrier, for example.
Chris Urmson's team at Google ran user tests at the start of their autonomous vehicle program and found that users misbehave [1] when behind the wheel of semi-autonomous vehicles, despite having been given instruction to pay attention. We're seeing that scenario play out in slow motion in Teslas and perhaps other driver-assistance programs that get less press.