Self Driving Cars, 80% Off Insurance?
It has been estimated that, as 94% of accidents are caused by human error, insurance premiums could be reduced by 80% once 'autonomous cars' are the only ones on the road.
http://blogs.breakeryard.com/blog/legal-issues-autonomous-cars/?utm_source=Customers-10-2013&utm_campaign=ec2b4876a4-February_2016_p_to_z_Newsletter_2016&utm_medium=email&utm_term=0_c085751039-ec2b4876a4-78798353
What do you think?
Answers
Best Answer
No best answer has yet been selected by EDDIE51. Once a best answer has been selected, it will be shown here.
I am sure we will have 'driverless only' motorways within 30 years, with other traffic confined to other roads. That would solve most of the problems that have been raised over animals, pedestrians and cycles. 'Around town' I am not so sure.
How about 'Dual control' cars, where you drive them to the start of the driverless zone then the computer takes over?
The profits must stay the same or improve, so I doubt any dramatic drop in premiums will occur. They'll work it between them to ensure it doesn't; after all, it would take a lot of cash to enter the market as a new undercutting service provider. What will happen is a smaller drop as a bribe to start with, and a while later a massive increase on the premiums of those who didn't comply. It's the way things tend to go when the industry wants to control the population. But first, before self-driving cars are feasible, let's await compulsory spy-in-the-car insurance monitoring, forced by the same method.
So in the meantime we have to endure human drivers responsible for tens of thousands of deaths annually. But that's all right because at least once all those people are dead it's easy to assign blame. But woe betide anyone trying to fix the problems by handing them over to machines, whose accident and safety rate is so far so impressive that people even think it worth caring about a small scratch caused by hitting a bus at about ten mph.
I don't get people sometimes.
TTT, by your logic human drivers should have been abolished after the first few car crashes. No one seems to be arguing with the figure of 94% of accidents being caused by human error; surely a computer-controlled driverless car can do better than that?
Remember, it is fully possible (and actually happens) that an airline flight can be totally on 'autopilot' from taxiing out from the stand to landing and parking at the stand on the other side of an ocean! The day is fast approaching when most pilots will hardly ever actually fly a plane; they will just sit in the seat in case they have to take over from the autopilot!
No one would have believed that 26 years ago!
Oh stop being so patronising TTT. And as to the "has to happen at the same time", nonsense. In the end self-driving cars are just another class of vehicle, and multiple classes exist and have always existed at the same time. There's no reason to regard self-driving cars as somehow a special case that is either everything or nothing.
It's still some way from reality, but the road-testing so far has been fairly encouraging, with good signs that the cars are able to cope with real-world scenarios well.
No, I'm not saying that, eddie. All I'm saying is: who is responsible for the inevitable carnage? I refer both you gentlemen to my first post on this. All pie in the sky; it will never happen. Not patronising, common sense. I work in IT and I do not trust software, even my own, especially when it relies on subjective decisions about things like other road users and road furniture.
I don't think the legal obstacles are nearly as hard to overcome as you are making out -- if for no other reason than we've only had to think about how to overcome them for, what, a couple of decades at most? And not very intensively at that, as so far self-driving cars aren't widespread enough for people to care too much about the legislation. More concretely, it's hard to see why self-driving cars can't be treated in essentially the same way as any other piece of automated equipment, in that responsibility therefore lies in the hands of the person whose vehicle it is or whoever is "driving" it (i.e. whoever turned the thing on for that particular journey). I'm not a lawyer by any stretch of the imagination, but I just can't take seriously the idea that this is a legal obstacle that can *never* be overcome. Particularly as long as there is a manual override mechanism, responsibility for an accident caused by a self-driving car lies with the person who was using the car.
As to the rest? We place standards on our software that are incredibly high, it seems, while carrying on in blissful ignorance of, or tolerance of, our own significantly worse flaws. Over a million people die as a result of traffic accidents each year, and it's hard to see how that figure could be anything other than improved significantly by giving control of driving in most circumstances to a computer program capable of reacting rather a lot faster to stimuli than a human is -- not to mention the massively reduced risk of such a car getting tired, or distracted by kids shouting in the back or a pretty view through the window, or being not used to the particular model of car, or generally being a reckless maniac who either should never have been given a licence in the first place or who ignored their own inability in order to show off.
In as much as I do agree that it may never happen, this is not because the technology will always be inadequate, but because humans as a whole have too pronounced a sense of their superiority.
My car just swerved to avoid a dog and killed a bus queue. Whose fault is that? Who do the families of the dead claim from? Who do the police prosecute? Who does the council claim from for a new bus shelter? Who pays compo to the innocent bystanders covered in the brains of the victims? I'm still asleep on the back seat, having fitted the ACME self-drive system. The dog survived, by the way...