My thought for the day...
Robot-driven car technology is coming along nicely: route finding, obstacle avoidance and general not-crashing-into-shit abilities are all improving. The one problem I foresee is this...
Creatively solving a dynamic problem at speed.
Given that situations on the road often involve poor weather, multiple moving objects, unpredictable drivers, variable surfaces, stationary obstacles of more or less value, dogs, kids and multiple physics models... what's a poor computer to do when things get complicated and it becomes a matter of picking the less bad outcome?
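To make that "less bad outcome" framing concrete, here's a minimal sketch of how a planner might phrase it, assuming each candidate maneuver has already been scored with some estimated-harm number; every name and value here is hypothetical, purely for illustration:

```python
# Hypothetical harm scores for candidate maneuvers (all made up).
candidate_maneuvers = {
    "brake_hard":      0.7,
    "swerve_left":     0.9,
    "swerve_right":    0.4,
    "maintain_course": 1.0,
}

# Pick the maneuver with the lowest estimated harm: the "less bad" outcome.
choice = min(candidate_maneuvers, key=candidate_maneuvers.get)
print(choice)  # -> "swerve_right"
```

The selection step is trivial; the hard part is producing those harm numbers in real time, under uncertainty, for things of very different value. Which is exactly the problem.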
It's not that I think humans are so much better in the same situation... it's just that we forgive them for their fuckups... or not, as the case may be. My point is: will we forgive a robot car if it makes a set of decisions that leads to fatalities? People accept that other people are flawed. They are generally very unforgiving of flawed machinery. It's a trust thing...
This is related to the setup in "I, Robot", where the lead character has developed an unshakable distrust of robots simply because he experienced an event where the robot's decision making apparently did not parallel human values. The robot calculated that it should save the man over the child; the human (carrying survivor's guilt) had a lot more trouble with that choice and chose to illogically blame some factor in the robot's 'nature'. He then extrapolated that belief to all robots having the same flawed 'nature'.
The point is not that people don't get into the same bad situations and have to make the same horrible choices... it's just that I think it will be easier to blame the choice on some perceived "difference" than for the survivors to accept that they could not have done any better. Survivors are like that. Blaming someone and trying to find a 'reason' or a pattern is just what we do.