This feels like the trolley problem applied at scale. Would you deploy a self-driving system that is perfect and stops all fatal accidents, but kills one randomly selected person every day?
Nope: there is no moral justification for potentially killing a person who isn't participating in the risky activity of driving just so other people can be driven around.
Would you sign up for such a system if you could volunteer to participate in it, with those random killings now restricted to the people who've signed up for it, including you?
In all traffic accidents, some irresponsibility led to the event, unpredictable natural disasters aside. A human, or ten, is always to blame.
Not to mention that the problems are hardly equivalent. For instance, a perfect system designed to stop all accidents would likely just crawl to a stop: stationary vehicles have a pretty low chance of accidents. And I can't think of anyone who would vote to increase their own chance of dying without any say in it, especially not as some computer-generated lottery.
> Would you sign up for such a system if you could volunteer to participate in it, with those random killings now restricted to the people who've signed up for it, including you?
I mean, we already have. By traveling on public roadways, you volunteer to participate in a system where ~40k people die in the US every year. If self-driving reduces that to 10k, that's a win. You're not really making any sense.
The USA-wide rate is about 1 in 7,800 people dying in a traffic accident each year, whereas NYC's rate is about 1 in 30,000. I am sure it's even lower for subway riders than for drivers. Even among drivers, somebody doing 4k miles a year faces different odds than somebody doing 40k. People usually adapt their driving style after having kids, too, which further reduces their chances of being in a collision.
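For concreteness, here's a back-of-the-envelope check of those per-capita rates (a minimal sketch; the death counts and populations below are rough round-number assumptions, not official figures):

    # Rough illustration of the "1 in N" per-capita rates cited above.
    # All inputs are approximate, for illustration only.

    def one_in_n(deaths_per_year: float, population: float) -> float:
        """Return N such that roughly 1 in N people die per year."""
        return population / deaths_per_year

    usa = one_in_n(deaths_per_year=40_000, population=330_000_000)  # ~1 in 8,250
    nyc = one_in_n(deaths_per_year=270, population=8_400_000)       # ~1 in 31,000

    print(f"USA-wide: about 1 in {usa:,.0f}")
    print(f"NYC:      about 1 in {nyc:,.0f}")
    print(f"NYC per-capita risk is roughly {nyc / usa:.1f}x lower")

The point isn't the exact numbers; it's that the same "chance of dying in traffic" varies several-fold depending on where and how you live.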
Basically, your life choices and circumstances influence your chances of dying in a traffic accident.
At the extreme, you can go live on a mountaintop, produce your own food, and never come into contact with a vehicle at all (and some cultures do exactly that).
FWIW, I responded to a rhetorical question about the killings being random: they are not random today, even if there is a random element to them!
If you want to sign up for a completely random but fully expected chance of death that you can't influence at all, good luck! I don't.
In traffic incidents, human drivers are rarely held accountable. It is notoriously difficult to get a conviction for vehicular manslaughter; crashes are almost always ruled accidents, and insurance pays rather than the human at fault.
Traffic fatalities often kill bystanders and other road users, not just the car's occupants. So if a self-driving system causes half as many fatalities as human drivers do, shouldn't the moral imperative be to expand self-driving and eventually ban human driving?