Jeff Orlowski, director of the new Netflix documentary about how tech giants affect everything from mental health to politics, says, “Only years later did we realize there are huge, huge consequences.”
There’s an oft-quoted saying about tech companies: If you aren’t paying for the product, you are the product. That idea is brought into stark reality in the new Netflix documentary The Social Dilemma, in which director Jeff Orlowski weaves together deft interviews with some of the very people who designed the networks and platforms that govern so much of the everyday life of billions of people around the globe today.
Orlowski, the director of the climate change documentaries Chasing Coral and Chasing Ice, hadn’t always thought of tech as pernicious. But he knew Tristan Harris, a former design ethicist at Google and the co-founder of the Center for Humane Technology, from his college days at Stanford, and began talking with him about “the problems of the technology companies and how their business model is misaligned with society,” says Orlowski.
In the film, which premiered earlier this year at Sundance, Harris is joined by an array of former tech executives (such as the co-creator of Facebook’s Like Button) and scholars (including an addiction specialist) breaking down the business models and effects of technology companies.
It covers such topics as the spread of conspiracy theories and misinformation, extreme polarization in politics, data mining, surveillance capitalism (the commodification of personal data), and the addictive qualities baked into the systems. As far-reaching as the impacts are, they are the result — as the film hammers home — of decisions made by a very small number of people in Silicon Valley.
“We’re in the midst of a massive societal shift where a handful of people in Silicon Valley have reprogrammed human civilization,” says Orlowski, a former Apple campus rep in college, who talks with The Hollywood Reporter about how hard it was to remove social media from his own life, the tactics one company in particular, Facebook, uses to try to lure users back, and the plug-ins he recommends to make social-network platforms less of a trip “down the rabbit hole.”
How did you come to do this film?
Orlowski: Tristan [Harris] and I went to Stanford. He ended up going to Google. Many of my friends went to work at different tech companies coming out of school. I went down the film route. Tristan started talking about the problems of the technology companies and how their business model is misaligned with society. He started to see some of the problems and began raising them internally, giving an internal presentation. The executives let him invent a new title; he became a design ethicist at Google, someone thinking about the ethics and the implications of what they are programming. He was working within Google to really think about these big questions. It didn’t go anywhere. We met up and I just really started to understand what was going on in a different way and how these companies were actually programming society, and that was just a huge light bulb for me.
Tristan Harris, former design ethicist at Google and co-founder of the Center for Humane Technology. (Courtesy of ExposureLab)
How did you feel about tech before that?
Orlowski: Tristan and I were both Apple campus reps at Stanford. I’m a huge fan of technology. A lot of great things have come from these technologies, and positive things can happen on these platforms. But what I really learned in this process is that some technology is designed for us, the public, and some technology has different customers in mind — it’s an advertising model. They are selling manipulation. Where do you see misinformation and conspiracy theories? They’re running rampant on Facebook and Twitter, and they’re not happening on Netflix and Hulu and HBO, where the business model is that we pay for entertainment and humans curate the content. The other model is, “We let everything come in.” Quantity is great. The more we can receive, the more we can spread, the more we collect eyeballs, the more money we can make. It’s the one system designed around quantity: of time, of ad placement, of eyeballs.
How does that play out, at, say, Facebook?
Orlowski: At Facebook, there’s an algorithm called People You May Know. It’s a very particular algorithm the company uses to try to grow the network, and it regularly shows you people you might know. They’ve taken the word friend and morphed it into something that has nothing to do with friendship. It’s a quantity that helps grow their business model. It’s not designed for genuine, meaningful connection between me and any of these people.
And are you yourself not using social media?
Orlowski: During the process of making this film, I took myself off social media. There was a period when I was having a hard time removing myself from the platform. At the same time, I learned about this thing the programmers have written. It’s an algorithm designed to resurrect you.
I’ve never heard of that before. What is that?
Orlowski: They call it a resurrection. There’s an entire team dedicated to getting you to come back to the platform. I started getting more and more emails from Facebook: “Oh, these friends miss you. This friend just posted about this.” Then they started sending me text messages and photographs. I saw photographs of former girlfriends show up in these resurrection emails. I don’t know if that’s still the case when you leave Facebook. It just made me recognize they don’t necessarily have my interests at heart. This technology isn’t designed to improve my life. It’s technology designed to get me to come back and spend more time there. That’s how a company offering a free product is worth hundreds of billions of dollars. These are among the richest companies on the planet, and theirs is the only industry where you can get something for free. They’ve created these digital versions of us. Facebook has a model of you. Google has a model of you. Twitter has a model of you. They are trying to make the best model of you possible to predict who you are and what you are going to buy. If they can successfully predict you will buy these things, they will make more money off of you. The reason they collect so much data on you — and this is the surveillance part of the phrase surveillance capitalism — is that the more data they have, the better they are at predicting and persuading you, both with advertisements and with content.
Who are some of the people you interviewed for the film?
Orlowski: Our interview subjects are all former Google, Facebook, Instagram, Twitter and YouTube employees. That was, for me, the most critical part. That’s why I thought the film was going to be interesting. We connected with former insiders and really crucial executives, like the former president of Pinterest, and the guy who built the business model at Facebook back in the day when Facebook wasn’t making any money. This was the guy who really pushed them to use the advertising model that Google had already perfected. It spread from Google to Facebook and then to Twitter. We interviewed the co-inventor of the Like button. They invented the Like button hoping to spread positivity and joy, like a thumbs-up, and it has so completely morphed into this beast now that it’s leading people to feel depressed and anxious about their relationship with the technology. We really tried to get a lot from the insiders, the people who wrote the code. It seemed so much more credible to have former employees saying this is how the product is designed.
So you don’t think their intentions were bad?
Orlowski: I don’t think they realized what the consequences of their actions would be. Knowing the people that I know who worked there, both subjects and people outside of the film, I do believe that their intentions are good in general. Their intentions are also driven at least at the company level by a capitalist system that’s trying to make as much profit as possible and I think they never really understood the scale and the power that they were going to have. I think many of them do regret not thinking more carefully about the exponential consequences.
Certainly, people at these companies must now understand what some of these negative consequences are at this point though, right?
Orlowski: They can change the code at any time. It’s just code. The problem is that they would probably lose a lot of money. That’s why they are not changing the code. They could use a business model where the public subscribes like we do for Netflix and HBO and Hulu. They could use a business model where people have to pay to have them store data, store my posts and 10 years of my photographs. The companies themselves could be taxed by the government for how much data they collect and be treated more as a utility. People with big followings could pay for how many followers they have. There are so many different business models they could be using. My understanding is they did the math and none of those would make as much money as the advertising model.
And what’s so interesting of course is that so many tech executives don’t let their own kids use social media and severely limit their screen time.
Orlowski: That comes in at the very end of the film. A number of people say, ‘I don’t let my kids use this.’ That should be a very good testament to the problems of the technology. There’s a phrase that programmers use, inherent vice, where they recognize there’s a problem with the premise of what they are designing but they have to do it anyway. They built the entire thing on a flawed model of extracting attention. I have an analogy I like to use around climate change and the fossil fuel industry. When we first discovered fossil fuels, it seemed like a really great thing. Look at how awesome this oil is. We can build cars and fly planes. Only years later did we realize there are huge, huge consequences. Similarly, when social media was just born — we can travel around the world digitally — it seemed so innocent. It seemed too good to be true. It’s turning out it is too good to be true.
Are you still off social media?
Orlowski: I’ve completely stopped using it personally. I stopped about two years ago. I haven’t felt the urge to go back. It took me a long time to wean myself off.
Do you have any advice for ways to limit social media use?
Orlowski: There’s a thing called Facebook News Feed Eradicator, and other plug-ins are referenced at the end of the film. If I were to open Facebook again, my News Feed wouldn’t even show up. I wouldn’t see any posts from anybody. And there’s a plug-in for YouTube where I don’t see any of the recommended videos on the side panel. I just see the one video I went there to see. It doesn’t send me down the rabbit hole.
THR's Degen Pener contributed to this post.