It’s not the Russians or fake news, the overhyped threats du jour, I’m most worried about. It’s algorithms.
We’re all being sliced and diced as if on an autopsy table, analyzed and scrutinized so we can be messaged and manipulated. We are being told what we want to hear or what fits our biases. We accept lies because we’re being trained to do so.
As Howard Beale shrieked about television’s voice in the movie Network, “But, man, you’re never going to get any truth from us. We’ll tell you anything you want to hear; we lie like hell…We deal in illusions, man! None of it is true!”
We’ve gotten so used to the manipulation we usually don’t recognize it.
While recently strolling about the Washington Square mall’s new Amazon bookstore, I noticed that some of its racks had embraced the a fortiori tactic of many online sellers, “If you like …, you’ll love ….”
On one shelf, printed notes said that if I liked Emma Donoghue’s novel “Room”, I’d love Paul Pen’s “The Light of the Fireflies” (“which deals with some very deep and disturbing topics, including incest”), Gillian Flynn’s “Sharp Objects” (featuring “…an incredibly flawed and fragile character…”) and Wally Lamb’s “I Know This Much Is True” (an Oprah Book Club pick in 1998).
It looks like the store is just being helpful, but it is really steering your purchasing decisions in a particular direction based on your characteristics and previous behavior.
It’s like LinkedIn alerting you to job openings that might appeal to you and Twitter feeding you promoted tweets based on your profile information, mobile device location, IP address or apps on your device.
It’s like Facebook delivering information to you on topics you’ve already signaled an interest in with a bias you’ve already displayed, and cutting out contrasting views, or not showing you certain ads based on your ethnicity (as it did until recently).
In Sept. 2016, ProPublica, an independent non-profit that produces investigative journalism, reported that Facebook keeps comprehensive dossiers on its more than 2 billion members.
“Every time a Facebook member likes a post, tags a photo, updates their favorite movies in their profile, posts a comment about a politician, or changes their relationship status, Facebook logs it,” ProPublica said. “When they browse the Web, Facebook collects information about pages they visit that contain Facebook sharing buttons. When they use Instagram or WhatsApp on their phone, which are both owned by Facebook, they contribute more data to Facebook’s dossier.”
And in case that wasn’t enough, ProPublica said, Facebook also buys data about users’ mortgages, car ownership and shopping habits. Talk about invasive.
In a TED Talk, Eli Pariser, board president of MoveOn.org, called this the “invisible algorithmic editing of the web.”
It’s like Breitbart and The Daily Beast satisfying their conservative and progressive audiences with red meat, allowing each group to retreat to what University of Wisconsin Journalism Prof. James Baughman has called “safe harbors”.
Algorithms are being used to personalize all your communications, constantly reaffirming and constraining your current perspectives, establishing and solidifying your opinion silos. As they become more sophisticated and widely used, algorithms are creating what Pariser calls your “filter bubble”, accentuating rifts and perverting our democratic system.
When you log on to Facebook, an algorithm takes into account countless variables to predict what you want to see. Facebook also uses algorithms to categorize your political bent, taking into account your full range of interactions, including the pages you like and the political leanings of people who like the same pages you do.
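To make the idea concrete, here is a toy sketch of how a platform could infer a political category from page likes. This is not Facebook’s actual algorithm; the pages, leaning scores, and thresholds are all invented for illustration.

```python
# Toy sketch of like-based political categorization.
# NOT Facebook's real system: pages, scores, and cutoffs are invented.

# Hypothetical pages with assumed leanings (-1 = liberal, +1 = conservative).
PAGE_LEANINGS = {
    "Progressive Daily": -1.0,
    "Green Futures": -0.8,
    "Centrist Review": 0.0,
    "Liberty Watch": 0.9,
    "Heartland Report": 0.7,
}

def categorize(liked_pages):
    """Average the leanings of a user's liked pages and bucket the result."""
    scores = [PAGE_LEANINGS[p] for p in liked_pages if p in PAGE_LEANINGS]
    if not scores:
        return "unknown"
    mean = sum(scores) / len(scores)
    if mean < -0.33:
        return "liberal"
    if mean > 0.33:
        return "conservative"
    return "moderate"

print(categorize(["Progressive Daily", "Green Futures"]))  # liberal
print(categorize(["Centrist Review", "Liberty Watch"]))    # conservative
```

The point of the sketch is how little it takes: a handful of labeled pages and an average is enough to put a political tag on someone who never stated a political view.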
If you want to know how Facebook categorizes you, just go to facebook.com/ads/preferences. Under the “Interests” header, click the “Lifestyle and Culture” tab. You may have to click on “More” to find it. Then look for a box titled “US Politics.” In parentheses, it will describe how Facebook has categorized you, such as liberal, moderate or conservative.
This and other information is used by opinion influencers to target you. Among those influencers are media of all stripes and politicians of all persuasions.
Politicians have long sought to appeal to different segments of voters with targeted messaging and carefully constructed personas, but until recently the process has been fairly rudimentary.
The image-making tactics described in Joe McGinniss’ groundbreaking book “The Selling of the President 1968,” about the marketing of Richard Nixon in that year’s presidential race, came as a shock to a naive general public back then.
But the tactics that were pathbreaking almost 50 years ago are now old hat. They’ve been superseded by once unimaginable data collection and analysis and unforeseen content delivery systems.
Algorithm advocates are adamant that what’s being done is good for you. “Humans are facing an increasing number of choices in every aspect of their lives,” Netflix’s VP of Product Innovation Carlos A. Gomez-Uribe and Chief Product Officer Neil Hunt wrote in a paper they co-authored last year. “We are convinced that the field of recommender systems will continue to play a pivotal role in using the wealth of data now available to make these choices manageable, effectively guiding people to the truly best few options for them to be evaluated, resulting in better decisions.”
Gomez-Uribe and Hunt argued that Netflix’s impressive system, which breaks films down into more than 75,000 hyper-specific sub-genres and combines those with your past behavior to make recommendations, is plainly a good thing because about 80% of the hours streamed on Netflix come from recommended titles.
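The sub-genre approach can be caricatured in a few lines: tag each title, build a taste profile from what a user has watched, and rank unwatched titles by tag overlap. The catalog, tags, and scoring below are invented for illustration; Netflix’s real system is far more elaborate.

```python
# Minimal content-based recommender sketch in the spirit of
# sub-genre tagging. Titles and tags are hypothetical.

CATALOG = {
    "Midnight Case": {"cerebral", "crime", "thriller"},
    "Harbor Lights": {"romantic", "drama"},
    "Iron Orbit":    {"sci-fi", "thriller"},
    "Quiet Fields":  {"drama", "slow-burn"},
}

def recommend(watched, top_n=2):
    """Score unwatched titles by tag overlap with the user's viewing history."""
    profile = set()
    for title in watched:
        profile |= CATALOG[title]
    scored = [
        (len(CATALOG[t] & profile), t)
        for t in CATALOG if t not in watched
    ]
    scored.sort(key=lambda x: (-x[0], x[1]))  # highest overlap first
    return [t for score, t in scored[:top_n] if score > 0]

print(recommend(["Midnight Case"]))  # ['Iron Orbit']
```

Even this crude version illustrates the feedback loop the column is describing: the more you watch in one vein, the more the profile narrows, and the more of the same you are shown.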
But Issie Lapowsky, at Wired, is less sanguine about the implications of algorithms, arguing that there’s a dark side to their use. “This (2016) election has shown us how the same platforms that put a world of facts and information at our fingertips can just as easily be used to undermine basic truths,” she wrote on Nov. 7.
In “Weapons of Math Destruction,” Cathy O’Neil argued that algorithms pose as neutral tools but too often exploit people and distort the truth, contributing to the erosion of democracy.
“The social network (i.e. Facebook) may feel like a modern town square, but thanks to its tangle of algorithms, it’s nothing like the public forums of the past,” she said. “The company determines, according to its interests and those of its shareholders, what we see and learn on its social network. The result has been a loss of focus on critical national issues, an erosion of civil disagreement, and a threat to democracy itself.”
Algorithms cause us to “contribute to our own miseducation”, reinforcing echo chambers and making us more partisan, O’Neil said. “Thanks in part to filtering and personalization… our information has become deeply unbalanced, skewed, and has lost its mooring.”
The increasing sophistication of the data gathering and analysis behind algorithms is also allowing politicians to shape-shift for almost every individual voter. A politician used to be one person, or maybe two if you didn’t like him. It used to be that a presidential candidate delivered similar personas and key messages to all audiences; if he didn’t, his duplicity was exposed. Today, multiple personas and positions are carefully constructed, and messages are narrowly targeted so they can be delivered to tiny slices of the electorate, often with no broader public awareness.
Micro-messaging allows specific online messages to be delivered to a certain group, such as just to attendees of the 2016 National Right to Life Convention at the Hilton Washington Dulles Airport in Herndon, VA, or even to two members of a family in the same house with different views.
Often the dissection of voters allows a message to be massaged so that the recipient on social media or other channels believes she and the politician are in agreement, even when that’s not the case. For example, an anti-union congresswoman might tell a like-minded constituent about her support for a right-to-work bill, while telling a union supporter about her vote for higher infrastructure spending, which tends to benefit unions.
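In code, this kind of micro-targeting is trivially simple, which is part of what makes it so scalable. The sketch below picks a message framing from a voter profile; the politician, profile fields, and messages are all hypothetical.

```python
# Toy sketch of micro-targeted messaging: the same record, framed
# differently per recipient. All profiles and messages are invented.

MESSAGES = {
    "anti-union": "I proudly backed the right-to-work bill.",
    "pro-union":  "I voted for higher infrastructure spending that creates union jobs.",
    "default":    "I'm fighting for working families.",
}

def pick_message(voter_profile):
    """Select the framing assumed most likely to resonate with this voter."""
    stance = voter_profile.get("union_stance", "default")
    return MESSAGES.get(stance, MESSAGES["default"])

print(pick_message({"union_stance": "anti-union"}))
print(pick_message({"union_stance": "pro-union"}))
```

Each statement may be literally true; the manipulation lies in the fact that no single voter ever sees the full set.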
Stanford Prof. Neil Malhotra’s research led him to suspect that this kind of hypocrisy helps explain how members of Congress can get away with voting in a highly partisan or polarized way when their constituents are actually much more moderate.
“These people are good strategic communicators who can potentially take very extreme positions that are out of step with their constituents but then massage them with language,” Malhotra said in a Stanford Business article.
Of course, targeting voters is hardly a new thing; politicians have been doing it forever. But now the databases are substantially more comprehensive, sometimes scarily so, the messaging vehicles, such as social media, can be much more individualized and the political elite are fully embracing the new technology.
“Algorithms show us what we like, not what is ‘right,’” Sebastian Buckup wrote on Quartz. “As a result, they increase ideological segregation rather than creating a digital agora. Influencers no longer waste their time with facts…Rather than seeking truth, the age of data is creating its own.”
That new truth will put more power in the hands of manipulators who won’t have our best interests at heart.
In Ernest Hemingway’s “The Sun Also Rises,” Mike Campbell is asked, “How did you go bankrupt?” “Two ways,” he replies. “Gradually, then suddenly.”
That’s how our democracy will collapse, too, if algorithmic tools aren’t tamed to function in our best interest.