Cake Is Good For Me

I am getting better looking with age.

I am smart enough to detect propaganda when I see it.

We have a tendency to lie to ourselves, don’t we? The lies we tell ourselves console us. They make us feel better about who we are. They give us hope and distract us from worry. Some of these lies are harmless: if we aren’t winning beauty pageants now, it is highly unlikely we will be in fifteen years, but so what? Other lies are … um … fake news, and they blind us to real danger.

Two-thirds of American adults get at least some of their news from social media sites like Facebook, according to one recent survey; the same survey found that nearly 80% of people under age 50 read news on these sites. With two billion users now on Facebook alone, social media has become a tool for disseminating information – and misinformation – with unprecedented reach.

Perhaps we were better off when the Internet was just a place to watch cat videos.

Facebook and a data mining firm called Cambridge Analytica have come under fire this week. Cambridge Analytica gained access to data on 50 million Facebook user accounts. It acquired this information through a psychology professor to whom Facebook had granted access to user data for his research; the professor then violated Facebook’s policies by sharing it. It is now alleged that Cambridge Analytica used that data to target particular individuals with fake news and misinformation in advance of the 2016 election.

As a data analytics firm, Cambridge Analytica would know how to use data to maximize the impact of such a campaign. The firm would likely build a mathematical model of which users were most susceptible to particular types of news, with parameters for slowing or accelerating the drip of misinformation so as not to set off propaganda alarm bells. If they could throttle the flow of misinformation so that it didn’t turn their targets off but instead gradually moved them to a particular point of view through repeated, reinforcing waves of increasingly believable anecdotes, they could succeed. With fifty million users to choose from, each of whom would share data within their bubble of like-minded friends, the space in which Cambridge Analytica could experiment was practically unbounded. Certainly, not all the news items they spread would stick, but those that did attract attention and win over some minds would find an incredibly vast audience. The more people who saw an item, the more traction it could gain through reposts, debate, and the growing sense that, if so many people are seeing it and discussing it, surely it must carry some truth.
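
To make the mechanics concrete, here is a minimal sketch of such a throttled drip, in Python. It is purely illustrative: the susceptibility scores, the cooldown, and the nudge size are all assumptions of mine for the sake of the sketch, not anything Cambridge Analytica is actually known to have used.

```python
import random

# A toy model of a "throttled drip" of misinformation. Every number here
# (initial beliefs, cooldown, nudge size, tipping point) is an assumption
# for illustration only.

COOLDOWN = 5    # minimum time steps between exposures for any one user
NUDGE = 0.05    # how far one believable item shifts a user's view

class User:
    def __init__(self):
        self.belief = random.uniform(0.0, 0.3)  # initial sympathy to the message
        self.last_seen = -COOLDOWN              # time of the last exposure

def drip(users, steps):
    """Expose users gradually, skipping anyone who was reached too recently."""
    for t in range(steps):
        for u in users:
            if t - u.last_seen < COOLDOWN:
                continue                   # throttle: too soon, might raise alarm
            if random.random() < 0.5:
                continue                   # not every item reaches every feed
            u.belief = min(1.0, u.belief + NUDGE)  # repetition compounds
            u.last_seen = t
    return sum(u.belief > 0.7 for u in users)      # users past a tipping point

users = [User() for _ in range(50_000)]
won_over = drip(users, steps=80)
print(f"{won_over} of {len(users)} simulated users pushed past the tipping point")
```

In a typical run, a large share of the simulated users drift past the tipping point, not because any single item convinced them, but because nothing ever arrived fast enough to trigger their skepticism.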

Facebook is in a difficult spot. They have pledged to double the number of people who review content to flag offensive or damaging material. But the scale of the problem far exceeds what even an eventual staff of 20,000 people can handle: almost 300,000 new status updates are posted every single minute. This is their Frankenstein’s monster. Their success has created a problem they cannot control. I would like to think they didn’t see this coming.
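
The arithmetic makes the mismatch plain. Taking those two figures at face value:

```python
# Back-of-the-envelope math using the figures above: 300,000 new status
# updates per minute, reviewed by an eventual staff of 20,000 people.
updates_per_minute = 300_000
reviewers = 20_000

per_reviewer = updates_per_minute / reviewers  # updates each reviewer must judge
seconds_each = 60 / per_reviewer               # time available per update

print(f"{per_reviewer:.0f} updates per reviewer per minute")
print(f"{seconds_each:.0f} seconds to judge each one, around the clock")
```

That works out to 15 updates per reviewer per minute, four seconds apiece, with no nights, weekends, or breaks. The monster is simply too big.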

Regrettably, Facebook’s problem is our society’s disease. It is perhaps easier for fake news to spread now than at any point in history. You don’t have to fall blindly for a pitch for it to have a significant impact; it will act like a time-release drug. Even a hardened skeptical response tends to weaken when you see something repeatedly, no matter how outrageous it may seem at first. You go about living your life when, all of a sudden, you see or hear something that reminds you of that outrageous article your now-crazy-seeming friend posted, and you begin wondering if, perhaps, there was at least a tiny shred of plausibility to it. All a piece of fake news needs to be credible today is a shred of plausibility.

The credibility bar is too low. But who sets that bar? Are the bar-setters credible, or are they simply partisan hacks? And who are they to set the bar for me anyway?

When no one knows what or whom to believe, it becomes increasingly difficult to believe in anything, including, most consequentially, our democratic institutions. We’re in a dangerous place.

I’m reminded of a line from Lou Reed’s “Last Great American Whale”:

They say things are done for the majority.
Don’t believe half of what you see
and none of what you hear.

And, evidently, less than nothing of what’s posted on your Facebook timeline.

As citizens, we have a responsibility to be mindful and vigilant that we are being pitched constantly. There are millions of voices all seeking to win us to their side. We have to discern the drip of data and realize its potentially lasting effects on us. We have to stick to our moral principles and beliefs, buoyed by our direct experiences, even as the waves of data undulate against us and occasionally wash over us. This is not within technology’s ability to solve. Tools exist to keep a drunkard off the road, but technology can’t keep an alcoholic sober. We can’t allow ourselves to become drunk on data.

The first step is admitting we have a problem.


About Ray Klump

Professor and Chair of Mathematics and Computer Science; Director, Master of Science in Information Security, Lewis University. http://online.lewisu.edu/ms-information-security.asp, http://online.lewisu.edu/resource/engineering-technology/articles.asp, http://cs.lewisu.edu. You can find him on Google+.
