Sex, Robots and Bank Tellers: What AI Will Mean for Business

Ken Hardin

I got to thinking after reading the salacious press coverage this morning about an artificial intelligence researcher at a Dutch university who's claiming that in about 50 years humans will "marry" robots.

 

Honestly, I'm not just trying to get a cheap traffic bump from having the words "robots" and "sex" in the title of this post. Honestly.

 

At any rate, the researcher, David Levy of the University of Maastricht, contends that as artificial intelligence advances, people will not only use robots for sexual gratification -- as they use the Internet today -- but will also develop emotional bonds with machines they can program for personal compatibility and, I'd imagine, pliability. There's an unpleasant but undeniable candor about human nature in Mr. Levy's comments, as seen in this snippet at Fox News:

The main benefit of human-robot marriage could be to make people who otherwise could not get married happier, "people who find it hard to form relationships, because they are extremely shy, or have psychological problems, or are just plain ugly or have unpleasant personalities," Levy said. "Of course, such people who completely give up the idea of forming relationships with other people are going to be few and far between, but they will be out there."

Of course, the idea that the legality of "marriage" is somehow defined by the depth of emotional bonds, particularly in contemporary America, is kind of silly as a basic premise. Legal "marriage" is about estate rights and taxation. Sure, I can certainly imagine a Progressive Temple of Ultima popping up somewhere in Oregon to perform cyber-commitment rituals -- and sooner than 50 years from now -- but until society is ready to extend inheritance rights and the ability to sue for half the house to a robot, I think the idea of Massachusetts legalizing "marriage" to machines, as Levy suggests, is a bit of an overreach.

 

But, as I said in my opening apology for this post, some of his arguments got me thinking.

 


Another roboticist, Ronald Arkin, is quoted in the Fox News piece as saying AI could create a generation of machines that offer an outlet for behavior that's deemed inappropriate, or downright perverse, if it's conducted with an actual human being.

 

I, of course, jumped on this train of thought and made the quick leap from aberrant sexuality to customer service.

 

When some Japanese firm creates a life-like robot that can handle 98 percent of customer requests at a bank teller window, how will the law handle customer behavior that would result in a call to security if the teller in question had a mother and a father? Obviously, just punching the robot could be prosecuted as basic vandalism. But what about fits of vulgarity and threatening language? Or, in the tenor of this post, inappropriate "sexual" advances -- want to take bets that these robotic tellers are all gonna look kind of like Adriana Lima?

 

These issues translate into the workplace as well, assuming that companies will want the precision and cost savings that would come with a friendly-looking computer administering HR surveys, processing IT trouble tickets and the like. And how will companies regulate behavior between human-like robots and people on production lines? "Don't curse at the tolerance checker" seems like an odd addendum to an employee handbook.

 

More deeply, the question is whether people who become conditioned to act out against an inhuman thing they are interacting with will be able to flip a switch and revert (or ascend, depending on how you look at it) to codes of conduct that are appropriate between people. It's a serious question, and one that is already evident in the way many folks conduct themselves on Internet boards. You have to ask yourself, "Would they be doing this if that other person were standing four feet away?"

 

In the future, you'll have to modify that question to, "Would they be doing this if that other person were breathing?"



Jan 12, 2009 10:54 AM Pat says:
In expecting humans to communicate with robots through AI, we come to the ultimate insult of cruelty of permitting no personal interaction to substitute for personal communication. What could be worse than to become a nation of idiots who communicate only through computers in simulated personal encounters, condemned by our own ignorance. It is the equivalent of speaking to a dog, or a painting - the ultimate human disassociation with humanity. If that isn't perpetuating schizophrenia, what is?


 
