Artificial Intelligence? Yes. Artificial Humanity? No.

I recently saw a Facebook friend decry the ubiquitous end-of-year cards Facebook had created for each of its users, the ones that allow people to share the highlights of their year in the style of an e-card for all their Facebook friends to see. Most people have shared these with the default tagline “It’s been a great year. Thanks for being part of it,” as if the thirty-three percent of one’s Facebook friends who share vacuous political jingoisms and Dos Equis memes actually played an acknowledgeable role in filling the year with happiness. She concluded that “social media sucks” because it poorly represented her year and who is important to her. I decided not to share mine because, well, it showed a picture of my grandpa, who died in 2011, and who could only have played a positive role in my 2014 supernaturally. At least my loss was relatively distant compared with this story of a father who lost a daughter this year to cancer, only to see her picture show up prominently, surrounded by party favors, in his end-of-year Facebook card.

Clearly, something was not quite right about the algorithm Facebook used to produce these automatically created greeting cards. Did the algorithm consider only the number of likes or comments, and not their content or meaning, in deciding to include a picture as a point of celebration? Did it look at other comments or messages posted around the same time as the picture to determine its context? I suspect that Facebook took the more computationally efficient and simpler-to-code route of basing what to include in the end-of-year card strictly on the popularity of the post, as measured by the number of likes and comments. Because of that, the algorithm had terribly incomplete information with which to make reliable judgments about what to include in the card and what to omit. I’m sure there will be more appalling stories like this one.
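We can only guess at how the selection actually worked, of course, but the gap between a popularity-only heuristic and one that at least reads the comments is easy to sketch. The following Python toy is purely illustrative: the data structure, function names, and word list are invented for the example, and none of it reflects Facebook's real code.

```python
# Hypothetical sketch only: Facebook's actual Year in Review code is not public.
# It contrasts a purely popularity-based selector with one that also screens
# the text of comments for signs that a post marks a loss rather than a highlight.

CONDOLENCE_WORDS = {"sorry", "loss", "condolences", "rip", "passed", "miss", "sympathy"}

def popularity_only(posts, top_n=5):
    """Naive approach: rank strictly by likes plus comment count."""
    return sorted(posts, key=lambda p: p["likes"] + len(p["comments"]), reverse=True)[:top_n]

def popularity_with_context(posts, top_n=5):
    """Slightly less naive: skip posts whose comments read like condolences."""
    def looks_like_condolence(post):
        words = " ".join(post["comments"]).lower().split()
        hits = sum(1 for w in words if w.strip(".,!") in CONDOLENCE_WORDS)
        return hits >= 3  # arbitrary threshold, just for illustration
    candidates = [p for p in posts if not looks_like_condolence(p)]
    return sorted(candidates, key=lambda p: p["likes"] + len(p["comments"]), reverse=True)[:top_n]

if __name__ == "__main__":
    posts = [
        {"id": "beach_trip", "likes": 40, "comments": ["Looks fun!", "Great photo"]},
        {"id": "memorial", "likes": 180,
         "comments": ["So sorry for your loss", "Our condolences", "She will be missed", "RIP"]},
    ]
    print([p["id"] for p in popularity_only(posts)])          # the memorial post ranks first
    print([p["id"] for p in popularity_with_context(posts)])  # the memorial post is filtered out
```

Even that tiny keyword check is crude, and real sentiment analysis is much harder than matching a handful of words. The point is simply that a count of likes and comments carries no sense of why people responded.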

A dislike button surely would have helped Facebook improve this application. Get on it, Zuckerberg.

The more important point here, however, is that computers do not feel, cannot empathize, cannot commiserate, and cannot discern feeling from word or context. Certainly, computers can think. Artificial intelligence (AI) is a reality, computers can learn from their experiences, and, contrary to what the article claims, they can deviate from the script initially written by the programmer. Numerous AI algorithms have been written over the past forty years to help computers freelance based on the knowledge and experiences they acquire.

Computers can think, but they can’t feel.

Can a computer be taught to feel? Can it use AI to learn how to sense and respond appropriately and supportively to others who exhibit human emotion? That is a provocative question, and I’m only an engineer-turned-computer-scientist who is in danger of leaving his cube. However, if we answer the question affirmatively – if we say that computers can be taught to feel – we must then conclude that all our interactions are calculated: that we console a hurt loved one because we want something in return, or because we fear not doing so will cause us to lose points and imperil future opportunity. Some among us believe that is, in fact, how we humans operate: that our emotional responses are regulated by some form of invisible but important currency that alternately pulls and releases our heartstrings, and that we are puppets of some maniacal accountant-marionette hybrid hellbent on keeping karma at bay.

I don’t choose to believe that. Humanity is not an accounting exercise. Humanity is not a state machine. Humanity cannot be programmed.

You want proof? I have none. I’m human after all. For now, let’s celebrate. Thank you all, Facebook friends, for being part of yet another year in which we humans have thoroughly vanquished those unfeeling, inhuman boxes of silicon and wire. Here’s to another year of ruling the planet. Go humans!

About Ray Klump

Professor and Chair of Mathematics and Computer Science; Director, Master of Science in Information Security, Lewis University. http://online.lewisu.edu/ms-information-security.asp, http://online.lewisu.edu/resource/engineering-technology/articles.asp, http://cs.lewisu.edu. You can find him on Google+.
