ChatterBank
Self-Drive Cars Next Year?
https://www.theguardian.com/technology/2020/aug/18/self-driving-cars-allowed-motorways-industry-risk
I'm in favour of the idea, given that cars are less likely to be drunk and aggressive, but I wonder if the technology will really be there by 2021?
Answers
There will be a robotic husband on the options list LB.
15:41 Wed 19th Aug 2020
Probably what will happen first is that motorways will be restricted to vehicles equipped for autonomous operation. The legislation already exists making motorways "special roads" rather than highways open to all; a tweak is all that's required to enact the new limitation.
As part of a parallel exercise, motorways will be equipped with overhead electric traction equipment, allowing hybrid HGVs to operate in electric mode and have their batteries charged while travelling on motorways. In one way that's good for the environment, but not in another, as it will see off railway freight forever. Hey-ho, at least they won't need HS2 then....
Of course, that can't happen until sufficient vehicles have been converted to make it worthwhile.
TTT:
//possibly but we'll never get away from the fact that a person must be responsible, if that's true then the whole thing is pointless.//
This is an interesting point, but why can't a company be responsible? If the business model is not an 'autonomous car' that somebody buys but rather an 'autonomous car service' that a company provides to its paying customers, then if that service is unsatisfactory, and it turns out not to be the customer's fault, there is recourse via the law.
We've discussed insurance problems here before; the other day I came across "someone involved" stating that it had been agreed that if the car were being driven autonomously the manufacturer of the control system would be considered to be responsible for any accident. I can't remember exactly where I saw it but I think it was on Breakfast TV and the "involved person" was a representative of the company manufacturing the system.
//We've discussed insurance problems here before; the other day I came across "someone involved" stating that it had been agreed that if the car were being driven autonomously the manufacturer of the control system would be considered to be responsible for any accident.//
There's not a chance of that. The driver will be expected to be in a position to take control as soon as he considers it necessary. You also have to ask yourself why the industry that is developing these systems would place itself in considerable jeopardy by being responsible for something over which it has absolutely no control once it has left the factory. A non-starter, I'm afraid.
archibaldy:
//If the business model is not an 'autonomous car' that somebody buys and rather an 'autonomous car service' that company provides it's paying customers then if that service is unsatisfactory, and it turns out that it's not the customers fault, then there is recourse via the law.//
It didn't work out that well for BP in the Gulf of Mexico.
BP leased the Deepwater Horizon drilling rig and crew from Transocean.
The Deepwater Horizon rig and crew caused a major oil spill.
BP fined $4.5 billion - Transocean fined $1.4 billion
i.e. liability deemed - customer 75% : service provider 25%
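For what it's worth, that 75:25 split is roughly the ratio of the two fines; a quick back-of-the-envelope check, using only the figures quoted above and nothing else:

```python
# Rough check of the liability split implied by the fines quoted above.
bp_fine = 4.5          # $ billion (BP, the customer leasing the rig)
transocean_fine = 1.4  # $ billion (Transocean, the service provider)

total = bp_fine + transocean_fine
print(f"customer share: {bp_fine / total:.0%}")   # ~76%
print(f"provider share: {transocean_fine / total:.0%}")  # ~24%
```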
bhg: "We've discussed insurance problems here before; the other day I came across "someone involved" stating that it had been agreed that if the car were being driven autonomously the manufacturer of the control system would be considered to be responsible for any accident" - no way jose, can you imagine! I'd like to see evidence of that.
Aberrant:
//It didn't work out that well for BP in the Gulf of Mexico.//
BP were determined to be mostly liable due to poor maintenance and other reasons. But the point is that the provider was also found to be liable to a certain degree.
So a 'service' model is one way to keep the provider accountable.
Another possible way to look at it is that the 'auto-driving' function is no different to any other part of a vehicle and manufacturers can be liable for defects that cause injury. I really don't think it'll be a problem.
I think a bigger issue will be the ethics programmed into autonomous systems and which option they take when faced only with options that must harm somebody (a choice between killing the passenger or a pedestrian, for instance).
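To make that worry concrete, here is a purely hypothetical sketch of the sort of rule people have in mind. It is not taken from any real vehicle's software; every option, probability and weight is invented for illustration. The system scores each available manoeuvre by expected harm and picks the lowest, so an ethical judgement ends up encoded as a number:

```python
# Purely hypothetical illustration of a harm-minimising choice rule.
# The options, probabilities and the pedestrian weight are all invented.
from dataclasses import dataclass

@dataclass
class Option:
    name: str
    p_harm_passenger: float   # estimated probability of harming the passenger
    p_harm_pedestrian: float  # estimated probability of harming a pedestrian

def expected_harm(opt: Option, pedestrian_weight: float = 1.0) -> float:
    # The weight is exactly the kind of ethical parameter being debated:
    # set it above 1.0 and the car "prefers" protecting pedestrians.
    return opt.p_harm_passenger + pedestrian_weight * opt.p_harm_pedestrian

options = [
    Option("brake hard in lane", 0.6, 0.0),
    Option("swerve towards the verge", 0.2, 0.5),
]

best = min(options, key=expected_harm)
print(f"chosen manoeuvre: {best.name}")
```

Whoever picks that weight is, in effect, answering the passenger-versus-pedestrian question in advance, which is exactly the point being argued here.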
I suppose those are really problems already, archibaldy, but they're not defined in ethical terms. If someone walks out in front of a car we respond by instinct, because we don't have time to check whether avoiding one person endangers another.
But because computers are assumed to have all the information in the world at hand and to be able to act on it in nanoseconds, we have to program them to make decisions no human would make in real life. We require them to be infinitely better than us, and that seems to set a very high bar.
I don't necessarily want a car with the highest ethics. I want it to drive as well as me, plus maybe 10%, and let me put my feet up, that's all.
archibaldy: "Another possible way to look at it is that the 'auto-driving' function is no different to any other part of a vehicle and manufacturers can be liable for defects that cause injury. I really don't think it'll be a problem." - err no, as now, manufacturers are not generally liable for defects that occur during the life of the vehicle. If my brakes fail and cause an accident the maker of the car is not blamed it's down to the driver to make sure the vehicle is safe and roadworthy.
"I think a bigger issue will be the ethics programmed into autonomous systems and which option do they take when faced with only options that must harm somebody (choice of killing passenger or pedestrian for instance). " - those are never black and white choices in reality, humans make choices unavailable to a computer, below is an example that I personally experienced, no computer could/would ever do what I did.
"I think a bigger issue will be the ethics programmed into autonomous systems and which option do they take when faced with only options that must harm somebody (choice of killing passenger or pedestrian for instance). " - those are never black and white choices in reality, humans make choices unavailable to a computer, below is an example that I personally experienced, no computer could/would ever do what I did.
I was delivering a car that I'd sold to a guy in a town less than 10 miles away. Locally there is a section of dual carriageway at the end of which there is often a queue. I reached the queue and waited. I heard tyres screeching; in my mirror I saw a car approaching fast, too fast in fact to stop, and it was definitely going to hit me. Dreading the whole fiasco of an accident, and therefore not selling the car, I had to think fast... How did I avoid the collision? The lane to my left was empty, so I turned into it and floored it; the car that had been behind me skidded to a halt inches from the car that was in front of me, which had moved on slowly with the traffic queue. No computer could do that. Give a human any of the "choice" scenarios and they'll find another way. Ignore all the twee cobblers above about computers being safer; in the end they are incredibly dangerous, easily surpassed by even a drunk driver.