Death by software - crimes of programmers and their robot cars

Last Sunday night a programmer inadvertently killed a woman with software. We know where he works, but he has not been charged with any crime. The place of business was Uber's Advanced Technologies Center in Pittsburgh, Pennsylvania, where one of the 400 programmers directed a self-driving car in Tempe, AZ to ignore and strike Elaine Herzberg, a pedestrian crossing a wide four-lane street with stoplights and well-marked crosswalks.

We know that accidents happen because drivers make mistakes. We know accidents happen because pedestrians make mistakes. But this is, to my knowledge, the first time an interaction between a programmer and a pedestrian has ended in death.

A Washington Post article quoted a Duke engineering professor, Dr. Mary Cummings, saying that the federal government does not require programmers to perform any testing on their software before placing these nearly two-ton autonomous machines in close proximity to people who don't know that the software running these massive machines is not bug-free. The word bug was only mentioned by a lawyer at the end of the article.

Arizona Gov. Doug Ducey withheld regulatory oversight to attract these now confirmed lethal computerized vehicles back in February, after California kicked out the 16 robotic cars being tested there. Should Gov. Ducey be reminded of the $6 billion Ariane 5 rocket project that blew up on its maiden launch in 1996 because budget concerns cancelled a complete system test?
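For readers wondering what kind of bug can destroy a rocket: the Ariane 5 failure traced back to reused Ada guidance code that converted a 64-bit floating-point velocity value into a 16-bit signed integer without a range check. Ariane 4's flight profile never exceeded that range; Ariane 5's faster trajectory did. Here is a minimal sketch of that failure mode in Python (not the original Ada; the function names and values are mine for illustration):

```python
import ctypes

# Sketch of the Ariane 5 Flight 501 failure mode: narrowing a 64-bit
# float to a 16-bit signed integer without first checking its range.
# (The real code was Ada, where the unguarded conversion raised an
# unhandled Operand Error that shut down the guidance system.)

def narrow_unchecked(velocity: float) -> int:
    # ctypes.c_int16 silently wraps out-of-range values, the way an
    # unguarded 16-bit store loses information.
    return ctypes.c_int16(int(velocity)).value

def narrow_checked(velocity: float) -> int:
    # The guard that Ariane 4's slower flight profile made seem
    # unnecessary.
    if not -32768 <= velocity <= 32767:
        raise OverflowError(f"velocity {velocity} exceeds int16 range")
    return int(velocity)

print(narrow_unchecked(20000.0))  # 20000 -- fits, as on Ariane 4
print(narrow_unchecked(40000.0))  # -25536 -- silently wrong
```

The point is not the particular language but the habit of mind: a value that was always safe in the old system became lethal in the new one, and no test ever exercised it.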

The death of Elaine Herzberg was minimized in favor of excusing the technology behind self-driving cars, with language like "Growing up is really painful" from University of South Carolina law professor Bryant Walker Smith. He compared her tragic death by computer-driven car to the hundred fatalities a day resulting from 263 million drivers who did make a choice to drive their own cars. Are these the growing pains of a robot child we want to see mature?

The Post article describes the pedestrian fatality as a crash. I would describe it as a crime, because a programmer didn't understand the problem he was solving when Mrs. Herzberg stepped off the curb. The article also promotes the belief that if we can just get through this "curve bend" we will be saving lives in the long run. Essentially, a Notre Dame professor quoted in the article says that the deaths of a few people resulting from poor-quality software are necessary for the greater good: "It's going to be difficult to accept the deaths..."

This is our defense. We willingly blind ourselves in the hope of a better slave master. And then the cure is not to regain our sight but to blame a man-made machine for not being good enough. It's not the car, the driver, the pedestrian, Uber, or the programmer that is culpable. It's all of us who choose to trust other men and their promise of perfect control, when God would rather we stand up for those who need help. Should we ask why Elaine, a 49-year-old woman with a bicycle, was on the streets late at night?
This is what the LORD says: "Cursed is the one who trusts in man, who draws strength from mere flesh and whose heart turns away from the LORD." Jer 17:5

Choices

You have a choice. You go out for a walk. It takes a while but you decide to enjoy nature and keep your life simple. Your major decision is whether to bring an umbrella or not. Is it worth the risk of losing it if you bring one? It's a small risk of loss of property.

You have a choice. You can walk down the street slowly by yourself or you could build a skateboard-like machine to get you there faster. Do you think it's better to get there faster? Is the efficiency worth the risk of a scraped knee? It's a higher risk due to injury and possible bone fractures.

You have a choice. You can walk down the street by yourself, stopping for cars and stepping around obstacles in your way. Or you could close your eyes and put your trust in someone you haven't met, someone who only figured out how to identify cars and obstacles a few years ago, and let them tell you what to do. Do you trust them? Is the convenience worth the risk of hitting something your guide didn't see? Now you have the risk of death.
It is better to take refuge in the LORD than to trust in humans. Psalm 118:8

Imperfect machines

A machine is an extension of the human ability to perform an action. We substitute the action of movement by legs for a better, more efficient form of movement. The machine's power is under the control of the user/driver. The driver is responsible for keeping a motorcycle or car on the road. The machine may have constraints and flaws that cause a lack of control, but the driver has accepted those conditions as an acceptable risk for the faster speed he gains. He is also now responsible.

When you replace the driver with the decision making of another person, responsibility transfers from the driver to the decision maker. The user/driver of the machine has put their trust in the decision maker and is now just a passenger. The autonomous car is more than just a mechanical tool; it is a means of transferring responsibility from a person who earns a living from a taxi machine to an unseen and anonymous C++ developer in some Pittsburgh cubicle who will never know how his code affected the lives of people in Tempe.
Woe to those who go down to Egypt for help, who rely on horses, who trust in the multitude of their chariots and in the great strength of their horsemen, but do not look to the Holy One of Israel, or seek help from the LORD. Isa 31:1
Programming transfers decisions to a programmer so that others can benefit from them. We write code that performs other people's decisions and trust that the resulting actions are worth the risk. But these transfers also increase the complexity we have to deal with. More complexity in life forces us to pull back from managing an activity the way we used to and simplify it until it becomes manageable again.

When life was simpler, specifically in 1942, one of the ground-breaking authors of science fiction, Isaac Asimov, wrote the Three Laws of Robotics. His first rule was that a robot may not injure a human being or, through inaction, allow a human being to come to harm. That rule has now been broken, and the optimism for a digital world order seems dulled.

This is where I draw the line. But unfortunately, knowing the history of technology, that line will not be respected. Any technology that can be developed where others find a short-term benefit in either money or power will be used. Asimov is not a part of the real technology world but of a utopian view of what man could achieve with technology. The real-life version is darker and more depressing.

The common car, one of the machines blamed for climate change, is now at the center of another maelstrom. Hopefully, we won't just accept the deaths of family members as a necessary tribute to the gods of technology, but will push back against this utopia. Our dream of a perfect system of transportation automata will remain imperfect, because there is only one perfect One.


