Driverless cars are too cautious for chaotic human drivers, and they get into twice as many accidents as regular cars. Yet all of those accidents are minor scrapes, and the main cause is human error.
Self-driving cars are too cautious for their own good. They obey the law at all times, with absolutely no exceptions. This may sound like great news, but when the car has to merge onto a jammed highway with traffic flying past the speed limit, things start to break down.
The main debate among programmers at Carnegie Mellon University and Google Inc. is this: should they teach the A.I. to commit minor infractions here and there in order to avoid a more damaging accident?
Raj Rajkumar, co-director of the GM Carnegie Mellon Autonomous Driving Collaborative Research Lab in Pittsburgh, said his team has decided to stick with the speed limit. Yet drive at that speed limit on the highway, he added, and you'll immediately notice everyone zipping past you.
Just last year, Rajkumar performed a drive test with members of Congress in a driverless Cadillac SRX SUV. The self-driving vehicle performed incredibly well and didn't commit a single infraction.
However, when they reached I-395 South, the car had to merge into a lane, and that's when a huge problem surfaced. The vehicle had to swing across three lanes, a distance of about 150 yards. The car used its cameras and laser sensors to detect approaching traffic in a 360-degree view, but it didn't know what to make of the other drivers. Would they make room, or would they plow straight into it?
That's when the human driver had to take control of the vehicle and complete the merge.
According to reports by the University of Michigan's Transportation Research Institute in Ann Arbor, self-driving cars have an accident rate twice as high as that of regular cars. Yet the self-driving cars were never at fault: usually, they were hit from behind by inattentive or aggressive human drivers.
The state of California has urged caution in the deployment of self-driving cars on public roads. The proposed rules published this week require a human driver to be present inside the vehicle at all times, ready to take the wheel if something goes terribly wrong. They also require companies to submit a monthly report on the behavior of each driverless vehicle.
Google, which has built a driverless car with no gas pedal and no steering wheel, has said that it is gravely disappointed by these proposed rules, and that California could, unfortunately, set the standard for regulations across the country.
To close the article, here's another ethical issue self-driving car creators are struggling with: should they program autonomous vehicles to make life-or-death decisions in the event of an accident?
Let's say the car assesses that it can avoid hitting a school bus full of children if it swerves off a cliff. There are 20 lives inside the bus, but only four inside the car. Should the car do it?