We don’t fully understand the novel coronavirus — SARS-CoV-2. We are still learning about the ways it attacks various organs of the body. We’re not sure at what stage, exactly, infected persons are contagious. We don’t know if successful recovery bestows immunity. We’re testing existing drugs to see if they offer relief, and we’re developing new drugs, as yet untested. We don’t understand why certain individuals experience very mild cases while others die.
We will know more in a few years.
We also don’t fully understand the psychological virus known as the Internet — but we do know that social media websites like Facebook and YouTube and Instagram are highly addictive. We don’t, however, know exactly what damage is being done to users, or to our society.
We will know more in a few years.
From an article by reporter Henry Farrell in the Washington Post, August 2018. Mr. Farrell is quoting Professor Siva Vaidhyanathan, director of the Center for Media and Citizenship at the University of Virginia:
Facebook engineers were for many years influenced by a strain of thought that emerged from Stanford University, where in the early 2000s scholars of human-computer interaction, design and behavioral economics were promoting the idea that games could generate “stickiness” among users, giving users just enough positive feedback to want to return to the game but denying users enough pleasure so that they don’t get satiated. As technology consultant Nir Eyal explains in his revealing and, frankly, frightening book, “Hooked: How to Build Habit-Forming Products,” this idea spread quickly through Silicon Valley, uniting game designers, application engineers, advertising professionals and marketing executives…
Facebook is in the social engineering business. It constantly tries to manipulate our experience and, thus, our perspective on our friends, issues and the world. It does so haphazardly and incoherently, it seems at first. But, in fact, there is a coherent driving force…
The consequences for politics are stunning: In 2012, the head of state of a country with massive surveillance and military power had sensitive personal data on millions of Americans, and no one cared. When my colleagues in the social media scholarship world and privacy world tried to raise this issue, no one responded with interest. We could not get reporters to pay attention or editors to run op-eds. The Obama campaign was seen as this supercool digital pioneer, a happy, friendly phenomenon. No one thought about what might happen if a not-so-friendly campaign got the same sort of information on millions of Americans…
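The ‘stickiness’ mechanic described at the top of that quote is, at bottom, the slot machine’s trick: an intermittent, unpredictable reward. Behavioral psychologists have long known this ‘variable-ratio’ schedule to be among the most habit-forming reward patterns there is. Here is a minimal sketch of the idea in Python; the 30 percent payout rate is an invented number, chosen purely for illustration, not anything Facebook has published.

    # A toy model of intermittent ("variable-ratio") reward, the
    # behavioral-economics idea behind "stickiness". The 30% payout
    # rate is an invented figure, used only to illustrate the mechanic.
    import random

    def check_for_notifications(payout_rate=0.3):
        # Sometimes checking the app pays off (new likes!); usually
        # it doesn't. The unpredictability is the hook.
        return random.random() < payout_rate

    checks = 20
    rewards = sum(check_for_notifications() for _ in range(checks))
    print(f"{checks} checks of the app, {rewards} small rewards:")
    print("just enough pleasure to come back, never enough to feel satisfied")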
We don’t fully understand what social media is doing to America, but we did see something unprecedented take place in 2020. A relatively small number of public health industry ‘experts’ were able to convince entire nations to shut down their economies and social institutions to hinder the spread of an unfamiliar — and apparently highly contagious — virus, for which no pharmaceutical treatment was available.
Yesterday in Part One, I mentioned George Orwell’s famous novel, Nineteen Eighty-Four, which depicts a dystopian future where everyone is constantly “watched” by Big Brother via ubiquitous ‘telescreens’ installed inside homes, on street corners, in offices, in restaurants. We have created a similar future, but it wasn’t imposed upon us by a totalitarian government. We created it willingly, by spending our hard-earned wages on iPhones and iPads, and then spending our leisure moments giving away personal information to Facebook and Google and other massive collections of semi-intelligent computer complexes.
But the voluntary sharing of personal information is only the tip of the iceberg, because the things you share, and the way you share them, play into a monstrous, highly profitable ‘feedback’ system — precisely constructed to reinforce every one of your prejudices and assumptions. The system is built to assure you, day after day, hour after hour, that your opinions and preferences are shared by many of the people you care about — to assure you that you’re… well… a remarkably intelligent person.
Here’s Professor Vaidhyanathan, talking about how Facebook checks to make sure you’re ‘happy’ with your online experience:
For Facebook, that proxy is “engagement,” the number of clicks, shares, “likes” and comments. If a post or a person generates a lot of these measurable actions, that post or person will be more visible in others’ News Feeds. You can already see how this could go wrong. Unsurprisingly, items advocating hatred and bigotry, conspiracy theories or wacky health misinformation generate massive reactions — both positive and negative. A false post about the danger of vaccines would generate hundreds of comments, most of them arguing with the post. But the very fact of that “engagement” would drive the post to more News Feeds. That’s why you can’t argue against the crazy. You just amplify the crazy. Such are algorithms and feedback mechanisms.
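To make the professor’s point concrete, here is a toy version of an engagement-ranked feed, sketched in Python. It is only a sketch: the scoring rule (every like, share and comment counts equally) and the sample numbers are invented for illustration, since Facebook’s actual ranking system is proprietary and far more complex.

    # A toy engagement-ranked feed. The scoring rule and numbers are
    # invented for illustration; this is not Facebook's real algorithm.
    from dataclasses import dataclass

    @dataclass
    class Post:
        text: str
        likes: int = 0
        shares: int = 0
        comments: int = 0

        @property
        def engagement(self):
            # Every measurable action counts the same, whether the
            # reaction is agreement or outrage.
            return self.likes + self.shares + self.comments

    def rank_feed(posts):
        # Higher engagement means higher placement in the News Feed.
        return sorted(posts, key=lambda p: p.engagement, reverse=True)

    feed = [
        Post("Local bake sale this Saturday", likes=40),
        Post("Vaccines are a secret plot!", likes=5),
    ]

    # 200 readers argue with the false post. The ranking can't tell
    # rebuttal from endorsement; it just sees 200 more comments.
    feed[1].comments += 200

    for post in rank_feed(feed):
        print(post.engagement, post.text)
    # The conspiracy post (205) now outranks the bake sale (40).

Run it, and the false post lands at the top of the feed precisely because people tried to correct it. The ranking cannot distinguish rebuttal from endorsement.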
If you believe that the global COVID crisis is a massive conspiracy to make us all into slaves, Facebook will deliver confirmation that your belief is incredibly interesting — by quickly delivering comments from “friends” concerned about those very same ideas. Very simply, that’s how Facebook makes its billions of dollars in advertising revenue: by convincing you that you’re a fascinating person who obviously understands how the world works.
If you believe Dr. Fauci is a gift from God, sent to save mankind from a modern plague, Facebook will reveal just how important your ideas are, by creating an instant conversation focused on whatever you just said — and by linking you up with people who believe the same way you do.
From Professor Vaidhyanathan:
Facebook’s ability to precisely target voters allows for massive amounts of political communication to occur without oversight or an opportunity to respond. It removes political communication from the gaze of the public. It’s ephemeral, and coded. Political communication moves even further from the … Jeffersonian ideal of public conversation about matters of policy, and more toward motivation.
Healthy republics need both motivation and deliberation…
Facebook is not a virus that kills people. It’s a virus that kills democratic political debate.
Enter, stage left, the coronavirus. SARS-CoV-2.
(If you’d like a bit more explanation from former Facebook executives Sean Parker and Chamath Palihapitiya, you can view this 11-minute video.)