The idea of a self-driving car is spectacular. Imagine sitting in traffic without having to endure a cramping leg, an aching back, and the rage-filled honking because someone dared to merge into your lane, just to sit there. Now that is the car's problem, while you stretch out in the back, have a snack, and wait for the rubbernecking traffic to clear. At the same time, that sounds horrifyingly irresponsible! What if a deer suddenly runs into the road, or debris from a truck scatters along the street? How can we rely on the reaction time of a self-driving car? What can we expect from a machine that cannot see?
I believe there are opportunities with every type of technology, and self-driving cars are no different. As explained in the video, the car 'sees' with LIDAR and a Mach-Zehnder modulator. These devices work together to create the 'eyes' of the machine, and that got me thinking. Any type of machine needs maintenance. Therefore, to continue to evolve this technology, there will need to be teams in place to execute that evolution. Driverless arsenal and supply trucks are already in development for the military, which means there will be companies that want to manufacture and maintain those trucks, opening opportunities for government contractors to create something incredible for the future.
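To make the LIDAR 'eyes' a little more concrete: the sensor times how long a laser pulse takes to bounce off an object and return, then converts that round trip into a distance. This is only an illustrative sketch of the time-of-flight principle, not how any particular car's software is written:

```python
# Illustrative sketch of LIDAR time-of-flight ranging.
# A laser pulse travels out to an object and back; the distance is
# half the round-trip time multiplied by the speed of light.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def lidar_distance(round_trip_seconds: float) -> float:
    """Convert a measured round-trip pulse time into a distance in meters."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse that returns after about 200 nanoseconds came from
# roughly 30 meters away.
print(round(lidar_distance(200e-9), 1))  # prints 30.0
```

Timings this short are why the supporting optics matter: a Mach-Zehnder modulator can switch light fast enough to shape pulses on that nanosecond scale.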
In the movie I, Robot there is a scene where Will Smith's character discusses the lack of ethics in robots. He and a family with a little girl got into a car accident and ended up in a river. A nearby robot saw the accident and calculated that Will Smith's character had a 45% chance of survival and the girl had an 11% chance. Since his character had the higher chance of survival, the robot saved him and let the girl drown. He drove the point home by saying, 'That was somebody's baby, 11% chance is more than enough. A human being would have known that.'
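The robot's cold calculus in that scene boils down to a one-line policy: save whoever has the higher estimated chance of survival. The toy sketch below only illustrates that rule; the names and numbers are taken from the scene as described, not from any real system:

```python
# Toy sketch of the decision rule the movie's robot applies:
# rescue whichever person has the highest estimated survival probability.
# Names and probabilities are illustrative, drawn from the scene above.

def choose_rescue(survival_odds: dict[str, float]) -> str:
    """Return the person with the highest estimated survival probability."""
    return max(survival_odds, key=survival_odds.get)

print(choose_rescue({"Spooner": 0.45, "girl": 0.11}))  # prints Spooner
```

Written out like this, it is easy to see what the character objects to: the rule has no room for any consideration other than the two percentages.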
Remembering this scene makes me think about the ethical dilemma of a car with its incredible 'eyes' maneuvering in traffic, weighing the safety of its driver against the safety of the cars, motorists, and pedestrians around it. While LIDAR and a Mach-Zehnder modulator can sense objects around the car or coming up on the road, can the system decide how to keep everyone in the car and on the road alive? The problem is that self-driving cars are not yet the norm, and the adjustment will be a rough one because humans and robots will be interacting and trying to co-exist.
Once again, there is an opportunity here. I would love to be able to lean back and let the car do its thing. However, knowing that the technology still needs to perfect its ethical 'human' side, it is not quite time for us to rely solely on self-driving cars. The opportunity for companies is to employ more robotics, mechanical, and driving experts to build in greater margins for error, so that the robot can function on a mechanical and a human level simultaneously and won't be driving blind, without ethics.