If you set impossible standards for driverless cars, or any other technology, it's no surprise when they fail to meet them. But I don't see how anyone can possibly believe that handing control of driving to a machine would make the situation worse. At the very least, people seem blind to, or unreasonably tolerant of, the current state of affairs, in which road accidents and deaths are commonplace. So many of these come down to driver error, and could therefore be avoided.
The transition phase could be awkward, and there will no doubt still be some accidents. But it betrays the irrationality of this debate that, after a single accident, the technology is written off by people happy to ignore the hundreds of other accidents that same day caused by humans who were tired, distracted, or asleep, or who simply overreacted.
As to Naomi's question about trusting a computer to land on a river: it depends on the programming. I don't know enough about the computing technology behind planes to say whether such a landing could be managed, now or in the future. But if a computer could be programmed to fly the plane in those circumstances, it would do a rather better job of it.