Discussion in 'Article Discussion' started by bit-tech, 28 Mar 2018.
You say "ethical" I say "tomato".
Good intentions and all that.
That awful autonomous car death raises a fundamental question: if it had identified that it was going to hit a pedestrian, but taking avoiding action might have injured its passengers, what would it do?
People, I expect, would take reactive avoidance actions; what would/should AI do?
You're literally describing the trolley problem, early versions of which date back to 1905. If you're interested in the subject, MIT's Moral Machine asks you to judge which particular choice an autonomous car should make in a range of trolley problem scenarios - and you can even design your own and share 'em with your friends, which is a nice touch. There's also a very good write-up on the topic from the Alan Turing Institute, which is one of the groups behind the new Ada Lovelace Institute.
I always knew I was undervalued
I believe I read somewhere that most autonomous car developers would first and foremost try to protect the passengers. Don't hold me to that though, I don't remember my sources (though probably Jalopnik).
It's odd, as I can't recall making a moral judgement when I've had close calls or accidents - it's always been a matter of trying to avoid hitting anything. Then again, I suppose I'm lucky not to have been confronted with a no-win injury-or-death scenario.
I dunno. Maybe it's just me, but I don't think they will program complex moral scenarios into these cars. There will just be the base rules that are required to operate the vehicle normally, and that's about it. So if the brakes fail, it might just try to change gear to slow itself down and then maintain the other rules of the road - don't cross a continuous white line, don't mount the footpath, etc. - and then wait to stop/hit something.
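For what it's worth, the kind of plain rule-based fallback described above could look roughly like this - a toy sketch, with the class, rule list, and behaviour all invented for illustration, not anything a real manufacturer ships:

```python
class Vehicle:
    """Minimal stand-in for a car's state (hypothetical)."""
    def __init__(self, brakes_ok):
        self.brakes_ok = brakes_ok
        self.gear = 4  # assume a forward gear is engaged

    def downshift(self):
        # Engine braking: drop a gear, never below first.
        self.gear = max(1, self.gear - 1)


def emergency_slowdown(vehicle):
    """No moral reasoning - just ordered safety rules, as the post suggests."""
    actions = []
    if not vehicle.brakes_ok:
        vehicle.downshift()
        actions.append("engine braking")
    # The base rules of the road still apply while slowing down:
    actions += ["hold lane (no crossing the solid line)",
                "stay off the footpath",
                "coast until stopped or obstructed"]
    return actions
```

The point being: every step is a fixed rule, and "hit something" is just what's left when the rules run out, not a choice the car weighs.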
I don't know a lot about AI but maybe people are anthropomorphising it a bit when it comes to these edge case moral decisions.
AI is still pretty basic, but it does have goals - keep to the rules of the road, etc., as you say - but defining those goals has to go through a moral framework. Do you use utilitarianism and try to kill the fewest people possible in a trolley-problem scenario, even if that means killing the passengers? Would you buy a car that might opt to kill you to save others?
No anthropomorphising, I just want to know the answer to IF-THEN.
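The IF-THEN really can be written down; the hard part is which version you pick. Here's a toy sketch of the two policies from the post above - the casualty numbers, policy names, and actions are all made-up assumptions, purely to show how the same IF-THEN gives different answers:

```python
def choose(action_casualties, passenger_policy=False):
    """Pick an action from {action: (pedestrians_hurt, passengers_hurt)}.

    passenger_policy=False -> utilitarian: minimise total harm.
    passenger_policy=True  -> protect passengers first, then total harm.
    """
    if passenger_policy:
        key = lambda a: (action_casualties[a][1], sum(action_casualties[a]))
    else:
        key = lambda a: sum(action_casualties[a])
    return min(action_casualties, key=key)


# Hypothetical scenario: swerving harms one passenger,
# braking in a straight line harms three pedestrians.
options = {"swerve": (0, 1), "brake_straight": (3, 0)}
```

With those made-up numbers, the utilitarian rule swerves (one casualty beats three), while the passenger-first rule brakes straight on - which is exactly the "would you buy that car?" question.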
PISS, or ( Pedestrian Indiscriminate Strike System ) will obviously be a £1600.00 option on all 2020 Audi vehicles, but will come with complimentary bonnet wipes as part of your Audi customer experience.
Please contact your system administrator for further information.
Did someone say Trolley Problem?