Two men stood in front of the autonomous vehicle, operated by ride-hailing company Waymo, and literally tipped a fedora at her while she told them to move out of the way.
If a man jumps out in front of my car in traffic and points a pistol at me after I stop, I am going around or through him; there is no other option. Anyone else trying to stop me, even without visible weapons, is getting evasive maneuvers because I am not dealing with that bullshit. That includes weaving far outside my travel lane or going over a sidewalk. That is self-defense, and it is a split-second decision any driver may have to make. Waymo prioritizes avoiding outside obstacles above everything else, which means it won't even leave its set travel lane, and that makes these cars trivial to stop like this with no recourse.
The point I am making is that self-driving systems have a really hard time interpreting traffic edge cases or passenger emergencies like this. A remote operator could make the call to drive over curbs or into other lanes, if they are free, to get the passenger out, and realistically should still avoid hitting pedestrians… but in the case of an armed attacker, well, you know. Like force for like force.
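To make the override idea concrete, here is a rough, purely hypothetical sketch of what relaxing the planner's constraints during a passenger emergency could look like. Every name here is invented for illustration; this is not Waymo's actual planning logic.

```python
# Hypothetical sketch: an "emergency override" that relaxes lane/curb
# constraints while keeping pedestrian safety as a hard limit.
# All names are made up for illustration, not any real vendor API.

from dataclasses import dataclass

@dataclass
class Maneuver:
    leaves_lane: bool
    crosses_curb: bool
    pedestrian_risk: float   # 0.0 = no risk, 1.0 = certain collision
    escapes_threat: bool

def maneuver_allowed(m: Maneuver, passenger_emergency: bool) -> bool:
    # Normal operation: never leave the lane or cross a curb.
    if not passenger_emergency:
        return not m.leaves_lane and not m.crosses_curb
    # Passenger emergency: lane and curb limits become soft; the only
    # remaining hard constraint is not striking a pedestrian.
    return m.pedestrian_risk < 0.05 and m.escapes_threat

# The evasive move a human driver would make around an attacker:
evasive = Maneuver(leaves_lane=True, crosses_curb=True,
                   pedestrian_risk=0.0, escapes_threat=True)
print(maneuver_allowed(evasive, passenger_emergency=False))  # False: car stays boxed in
print(maneuver_allowed(evasive, passenger_emergency=True))   # True: car can escape
```

The point of the sketch is just the ordering of priorities: pedestrians stay a hard constraint, but lane-keeping stops being one once a human (remote operator or panic button) flags an emergency.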
Calling the police would only be an auxiliary function; they cannot be depended on to respond fast enough to make a difference in most situations.
Would a remote operator interpret things accurately in ten seconds or less? Would anyone even want that job? How does the liability chain of command work? Who knows. But the current system makes no decision at all, and that is unacceptable. The medical point still stands too: a remote operator could immediately reroute the vehicle to a hospital and alert the medical staff. A panic button is absolutely needed.
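For what that panic-button flow might look like end to end, here is a minimal sketch under my own assumptions: the button always connects a human operator, a medical press reroutes to the nearest hospital and alerts staff, and 911 is dispatched in parallel but never relied on as the primary response. Every function and class name is invented; nothing here reflects an actual Waymo API.

```python
# Hypothetical panic-button handler. All names are illustrative only.
from dataclasses import dataclass

@dataclass
class Hospital:
    name: str
    address: str

def nearest_hospital(location: str) -> Hospital:
    # Stand-in for a real map/ER lookup.
    return Hospital(name="General Hospital", address="123 Example Ave")

def handle_panic_button(vehicle_id: str, location: str, reason: str) -> list[str]:
    """Return the ordered actions the system would take."""
    # 1. Always connect a human immediately; the car stops deciding alone.
    actions = [f"connect remote operator to {vehicle_id} (urgent)"]

    if reason == "medical":
        h = nearest_hospital(location)
        actions.append(f"reroute {vehicle_id} to {h.name}, {h.address}")
        actions.append(f"alert ER staff at {h.name} with the ETA")
    elif reason == "threat":
        # The operator, not the planner, decides whether curbs and other
        # lanes are fair game (see the override sketch above).
        actions.append("hand evasive-maneuver authority to the operator")

    # 2. Police dispatched in parallel, but only as an auxiliary response.
    actions.append(f"dispatch 911 to {location} ({reason})")
    return actions

print("\n".join(handle_panic_button("cab-42", "5th & Main", "medical")))
```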