#6
15-02-2016, 07:01 PM
Bassnut (Fred)
Yes, a googlebot will generally be safer, but consider this.
Five kids barge across a road without looking and your car is 2 m away. The googlebot has two choices: sacrifice the five kids, or swerve and possibly sacrifice you.

Everyone understands that a human driver has to make a fast, difficult choice, **** happens, and whatever ensues is just bad luck. But this scenario MUST be pre-programmed into a googlebot. So, do you buy a car with optional software (it would have to be stated as optional) that will 1/ always preserve the driver, or 2/ make a value judgement based on the amount of human destruction and act accordingly? Which would you buy? (See the sketch below for what that choice might look like in practice.)
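Purely as a hypothetical illustration (this is not real autonomous-vehicle code, and the policy names, Scenario fields and decide() function are all made up), the "pick your algorithm at purchase" idea boils down to something like this:

```python
# Hypothetical sketch only -- not real Google/self-driving-car code.
# It just illustrates the two purchase options described above.
from dataclasses import dataclass
from enum import Enum, auto


class Policy(Enum):
    PRESERVE_OCCUPANT = auto()      # option 1: always protect the driver
    MINIMISE_CASUALTIES = auto()    # option 2: weigh total human harm


@dataclass
class Scenario:
    pedestrians_at_risk: int   # people harmed if the car holds its course
    occupants_at_risk: int     # people harmed if the car swerves to avoid them


def decide(scenario: Scenario, policy: Policy) -> str:
    """Return 'hold_course' or 'swerve' under the chosen (hypothetical) policy."""
    if policy is Policy.PRESERVE_OCCUPANT:
        # Never trade the occupants, no matter how many pedestrians are at risk.
        return "hold_course"
    # MINIMISE_CASUALTIES: pick whichever outcome harms fewer people.
    if scenario.pedestrians_at_risk > scenario.occupants_at_risk:
        return "swerve"
    return "hold_course"


# The five-kids example from the post: 5 pedestrians vs 1 occupant.
example = Scenario(pedestrians_at_risk=5, occupants_at_risk=1)
print(decide(example, Policy.PRESERVE_OCCUPANT))    # hold_course
print(decide(example, Policy.MINIMISE_CASUALTIES))  # swerve
```

The point is that someone, somewhere, has to write that branch explicitly before the car ever meets the kids.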

Who sues whom? Insurance companies will go nuts. Sue the driver? No, the software made the choice. The coder? The car company? Or maybe you anyway, because you picked the algorithm (I bet car owners will want that choice at purchase).
What if you had a prang with another human-driven car that confused your googlebot, and it made the wrong choice because the situation wasn't predicted?

At a minimum, googlebot cars aren't possible unless ALL cars are googlebots and accident response is universally regulated by government.