Escaping the Matrix
The Social Dilemma, a Netflix documentary, calls for the collective will to demand regulation of social media, which it argues is hurting humanity.
“There are two industries that call their customers ‘users’: illegal drugs and software.”
-Edward Tufte, Yale University professor emeritus of political science, statistics and computer science
The Social Dilemma, a powerful Netflix documentary, examines the dystopian and utopian sides of social media’s relationship with society. Technologists who helped create Facebook, Instagram, Twitter and other social media platforms, along with academics and a venture capitalist, call for a collective demand for industry regulation. Otherwise, the only operating rule in that virtual arena is that advertisers make money at whatever cost. And the costs are high.
Social media is addictive. It makes a shared reality among people impossible. And it stunts the emotional growth of children.
The goal of social media is to keep you engaged, thereby rendering your feed more profitable when advertisers buy space on it.
“We want to, psychologically, figure out how to manipulate you as fast as possible and, then, give you back that (feel-good hormone) dopamine hit,” says Chamath Palihapitiya, former Facebook vice president of growth.
Facebook, Instagram, WhatsApp, Snapchat and Twitter all use the same general methods of hooking you, but personalize them to you. The documentary intersperses interviews with scenes of a fictional family grappling with the consequences. This personalization is creating polarization in families, countries and other social groups.
“We need to have a shared reality of a country, otherwise we aren’t a country,” says Tristan Harris, who worked as a design ethicist at Google, where he studied the ethics of human persuasion. “If we don’t agree on what is true or if there’s such a thing as truth, we’re toast. This is the problem beneath the problems because if we can’t agree on what’s true, then we can’t navigate out of our problems.”
Social media feeds users news, posts and advertisements that reinforce their worldview. The platforms have accumulated large amounts of data on users, which helps them interpret each user’s feelings and play into them.
“Over time, you have the false sense that everyone agrees with you because everyone in your news feed sounds just like you,” says Roger McNamee, an early investor in Facebook. “And that first time you’re in that state, it turns out you’re easily manipulated, the same way you would be manipulated by a magician. A magician shows you a card trick and says, ‘Pick a card, any card.’
“What you don’t realize is that they’ve done a set-up, so you pick the card they want you to pick. And that is how Facebook works. Facebook sits there and says, ‘Hey, you pick your friends. You pick the links that you follow.’ But that’s all nonsense,” McNamee says.
“It’s just like a magician. Facebook is in charge of your news feed.”
Rashida Richardson, director of policy at the AI Now Institute, says:
“We all are operating on a different set of facts. When that happens, at scale, you’re no longer able to reckon with or consume information that contradicts that worldview that you’ve created.”
“That means we aren’t actually being objective, constructive individuals,” says Richardson, an adjunct professor at New York University School of Law.
Harris, co-founder of the Center for Humane Technology in San Francisco, says:
“And then, you look over at the other side, and you start to think, ‘How are those people being so stupid? Look at all of this information that I’m constantly seeing. How are they not seeing that same information?'
“And the answer is: they’re not seeing that same information,” says Harris, whom The Atlantic magazine called the “closest thing Silicon Valley has to a conscience.”
The Social Dilemma makes the point that the business model is corrosive. Harris explains:
“We live in a world in which a tree is worth more dead than alive, in a world in which a whale is worth more dead than alive. For as long as our economy works in that way and corporations go unregulated, they’re going to continue to destroy trees, to kill whales, to mine the earth, and to continue to pull oil out of the ground, even though we know that it is destroying the planet. We know that it’s going to leave a worse world for future generations. This is short-term thinking based on this religion of profit at all costs.
“This has been affecting the environment for a long time. What is frightening and what is, hopefully, the last straw that will make us wake up as a civilization to how flawed this theory has been, in the first place, is to see that now we’re the tree, we’re the whale. Our attention can be mined. We are more profitable to a corporation if we’re spending time staring at a screen, staring at an ad, than if we’re spending that time living our life in a rich way.”
Two of the interviewed technologists do not allow their children any screen time at all. These are the people who helped to create a monster that has taken on a life of its own. Telling.
Jonathan Haidt, a social psychologist at New York University’s Stern School of Business, notes a correlation between social media use and rates of suicide and of hospitalization for cutting and self-harm, rates that are higher among teenaged girls and a few times higher among pre-teen girls. Haidt says:
“Generation Z, the kids born after 1996 or so, those kids are the first generation in history that got on social media in middle school. How do they spend their time?
“They come home from school, and they’re on their devices. A whole generation is more anxious, more fragile, more depressed. They’re much less comfortable taking risks. The rates at which they get driver’s licenses have been dropping. The number who have ever gone out on a date or had any kind of romantic interaction is rapidly dropping.
“This is a real change in a generation.”
Harris says: “We’re training and conditioning a whole new generation of people that when we are uncomfortable or lonely or uncertain or afraid, we have a digital pacifier for ourselves that’s kind of atrophying our ability to deal with that.”
The technologists did not foresee this future. Most believed that they were creating a tool that would be used for the betterment of the world. They point out that there is a good flip side to all this: searches for organ donors, relatives reunited, a car summoned that arrives in 30 seconds. Justin Rosenstein, a former Facebook engineer, said that he and his colleagues believed they were releasing love and positivity into the world when they created the Like button, not depression in young people or political polarization.
I applaud the Dr. Frankensteins who are calling for changes to their monster. These people say that they did not set out to create a dystopian environment.
It is heartening that they are taking responsibility. However, none of these engineers admit to the initial dopamine rush of creation, which can be overwhelming.
In my novel, Turn On, Tune Out, I ask readers to be conscious of their time on the Internet. Surely something insidious happens to our brains when we spend hours online:
Our minds are hijacked by words and images.