As the Wall Street Journal reports, Google performed a series of experiments in which a human conversed with a computer program. The program had been fed a whole database of movie scripts. You might think, then, that this robot was something of a warmhearted romantic who believed in happy endings.
You might also think that yogurt is made from broccoli.
The conversations were recorded and presented in a paper pleasantly titled "A Neural Conversational Model."
One conversation stood out. This was because if the computer program had had arms and legs, I suspect it would have gone MMA on its human conversation partner.
They were chatting about morality. The human kept pressing his robot friend: What was it? What did it mean?
"I need to know how I should behave morally," says the human. The robot understands but replies: "I don't know how to tell you."
The human wonders if morality and ethics are the same. They are, says the robot. But when the human asks for examples of moral behavior, the robot suggests "integrity."
When pressed on the definition of integrity, the robot makes like an unfaithful lover trying to explain herself to her ex: "I don't know what else to say."
It gets worse. When the human exclaims: "Be moral!" the robot shrieks back: "Be a man!"
Below the belt, somewhat?
What followed was surely an exchange that most people reading this have experienced at least 1,412 times in their lives.
The human says: "I really like our discussion on morality and ethics." Yes, the human is a true resident of California, accentuating the positive. I bet he salutes the sun and makes his dog go downward.
However, the robot answers: "And how I'm not in the mood for a philosophical debate."
The exchange ends with both parties going to opposite sides of the bed to sulk. Metaphorically, that is.
"What do you like to talk about?" says the human.
"Nothing," replies the robot.
I understand that the human bought the robot flowers and suggested a visit to the symphony followed by a nice dinner out. The robot has yet to reply.