I don't think the legal obstacles are nearly as hard to overcome as you are making out -- if for no other reason than that we've only had to think about how to overcome them for, what, a couple of decades at most? And not very intensively at that, since so far self-driving cars aren't widespread enough for people to care much about the legislation. More concretely, it's hard to see why self-driving cars can't be treated in essentially the same way as any other piece of automated equipment, with responsibility lying in the hands of the person whose vehicle it is, or whoever is "driving" it (i.e., turned the thing on for that particular journey). I'm not a lawyer by any stretch of the imagination, but I just can't take seriously the idea that this is a legal obstacle that can *never* be overcome. In particular, as long as there is a manual override mechanism, responsibility for an accident caused by a self-driving car can lie with the person who was using the car.
As to the rest? We hold our software to incredibly high standards, it seems, while carrying on in blissful ignorance, or tolerance, of our own significantly worse flaws. Over a million people die in traffic accidents each year, and it's hard to see how that figure could be anything other than significantly improved by handing control of driving, in most circumstances, to a computer program capable of reacting rather a lot faster to stimuli than a human can -- not to mention the massively reduced risk of such a car getting tired, or distracted by kids shouting in the back or a pretty view through the window, or not being used to the particular model of car, or generally being a reckless maniac who either should never have been given a licence in the first place or who ignored their own inability in order to show off.
Inasmuch as I do agree that it may never happen, this is not because the technology will always be inadequate, but because humans as a whole have too pronounced a sense of their own superiority.