Facebook ignored that Instagram was harmful to kids, but so did we.
Understanding and responding to the negative impact of social media on children is a necessary first step, but it's just that: a first step.
My goal with this newsletter, and with Counterweight as a broader project, is not to report on news surrounding tech and tech legislation. But the recent reports about Facebook’s internal documents, and the congressional hearings that came about as a result, provide a useful lens for examining several aspects of what I’ll be calling around here “The Problem.” We’re eventually going to get into “The Problem” with social media and our evolving digital landscape in all its complexity and fullness here, but for today we’re going to focus on one aspect of that problem.
Instagram v. The Children
Here’s a quick overview if you’re not up to speed on the situation. The Wall Street Journal recently released a series of articles and podcasts called “The Facebook Files,” examining internal documents at Facebook that were leaked by whistleblower Frances Haugen. I haven’t had time to look through all of this reporting yet, but one of the biggest stories to come out of it was that Facebook’s own internal research showed Instagram was “toxic” for a percentage of teen girls and harmed their mental health. Here’s The Wall Street Journal quoting Facebook’s internal report:
“Teens blame Instagram for increases in the rate of anxiety and depression,” said another slide. “This reaction was unprompted and consistent across all groups.”
Among teens who reported suicidal thoughts, 13% of British users and 6% of American users traced the desire to kill themselves to Instagram, one presentation showed.
There’s a lot more, but the crux of the issue is that Facebook has had knowledge of significant detrimental effects of Instagram on teenage girls, and in response has done very little. If that isn’t concerning enough, the company is actively working on an Instagram app for even younger kids. The internal fallout from these revelations has been extensive, with the NYT reporting that Facebook is actively working to crack down on any additional document leaks. The company’s motivations seem clear: while some individuals at Facebook are trying to take responsibility and make improvements, there’s a broader culture of denial and of placing profit ahead of the wellbeing of its users, many of whom are children.
All this led to a Senate Commerce Committee hearing examining these issues, including testimony by whistleblower Frances Haugen.
What the report and the response tell us about “The Problem”
I think it’s an important story, and I encourage people to read about it themselves. The Wall Street Journal’s investigation is the kind of important work we need right now in this space. But what I’m most interested in examining here is how this whole story and the reaction indicates just how woefully behind the curve we are, how ill-equipped our society is for dealing with these issues, and how badly this conversation needed to be happening, yesterday.
That the Senate Commerce Committee is investigating this issue is a positive step, but what is the practical solution they’re considering as a result of their investigation? Updating the Children’s Online Privacy Protection Act from 1998. 1998! Social media as we know it now did not even exist when the current laws governing how companies like Facebook handle these issues were written. The government is moving achingly slowly on these issues, in part because many of our legislators are woefully out of touch with digital culture and the nuance and complexity of these issues. It’s good that the government is responding, but one of the key elements of what makes “The Problem” such a big one is the acceleration of change. We’re not just facing the individual changes represented by each new technology, but a larger-order change in how quickly technologies are developed and adopted en masse by society.
So while the government is looking at updating laws from 1998 to respond to Facebook’s years-old findings on how Instagram impacts children, younger people and people who pay attention to digital social trends know that for many young people Instagram is already old-school; they’ve moved on to TikTok. Meanwhile, the existing COPPA legislation from 1998, which is supposed to prevent children under the age of 13 from using these apps and services, is not even heavily enforced. Companies like Facebook know that children younger than 13 simply lie about their age and use the apps anyway, and largely choose to turn a blind eye. Updating legislation is great, but it will be largely meaningless if it’s not enforced or is utterly ineffective at preventing what it’s supposed to prevent. By the time action is taken to deal with the negative side-effects of issues that started cropping up years ago, new platforms and technologies will exist, potentially causing new issues.
Here’s one of the key points from Facebook whistleblower Frances Haugen’s testimony:
“Facebook’s closed design means it has no real oversight. Only Facebook knows how it personalizes your feed for you. At other large tech companies like Google, any independent researcher can download from the internet the company search results and write papers about what they find, and they do. But Facebook hides behind walls that keeps researchers and regulators from understanding the true dynamics of their system.”
As long as we have a “wait and see if there are any negative side-effects and then respond” attitude toward digital technology, we’ll constantly be behind the curve. One of the overlooked elements of this story is that people are becoming aware of the problem and responding because of research done by Facebook itself! Research we only know about because Haugen was bold enough to share it with the public. Haugen continues:
“This inability to see into Facebook’s actual systems and confirm that they work as communicated is like the Department of Transportation regulating cars by only watching them drive down the highway. Today, no regulator has a menu of solutions for how to fix Facebook, because Facebook didn’t want them to know enough about what’s causing the problems, otherwise there wouldn’t have been need for a whistleblower. How is the public supposed to assess if Facebook is resolving conflicts of interest in a way that is aligned with the public good if the public has no visibility into how Facebook operates? This must change.”
To properly understand and address these issues, we need independent research into the safety and impact of all these platforms. And we need regulators who are willing to examine that data and respond on the timescale at which modern technology develops. We shouldn’t be waiting around for these companies to discover their problems for themselves and for someone with a conscience within that company to leak the reports of those problems.
The darker reality is that Facebook and Instagram’s negative impact on society and children isn’t really news. People are up in arms now because Facebook knew Instagram was harmful. But that attitude ignores the fact that data on the harms of social media to children and society has been publicly available for years. It’s not just Facebook asleep at the wheel; it’s all of us.
Peer-reviewed research from 2017 showed that spending more time on Instagram makes you more likely to develop an eating disorder. Research from 2019 showed that social media usage was a significant predictor of depression levels in adolescents. And that’s just to name a few findings. Additionally, children on social media are often exposed to images of self-harm, sexual predators, and cyberbullying (exposure to which correlates heavily with risk-taking behavior later in life).
There’s another interesting angle to the response to this latest Facebook scandal and the company’s intention to build an “Instagram Kids” app. One of the key objections to allowing younger children onto these platforms that I’ve seen repeated is that it will “get them hooked from an early age.” I agree that this is a concern, but the implicit message is that being hooked at all is bad, so we must protect our children from the dangers of “getting hooked.” So what is being done about the negative effects of these platforms on people of all ages? Children are particularly vulnerable, yes, but so are you, and so am I. The negative side-effects of social media on children and young girls are alarming, and they make for enraging headlines, but the negative side-effects are much broader than that.
For many, this latest scandal is the first time they’re hearing about the negative side-effects of Facebook or Instagram. And this should be a wake-up call. But not just a wake-up call to the problems that Facebook and Instagram present, it should be a wake-up call to the larger order problems presented by unregulated monopolistic social media platforms, and a society that blindly engages with new technology without thinking through the ramifications. We don’t just need to update our legislation to better protect children, we need a comprehensive, holistic understanding of what the issues are, how we’re being affected, and what we might be able to do about it.
The issue at hand is not just how we respond to the realization that Facebook and Instagram can be harmful, or even that social media broadly can be harmful. The issue at hand is how we develop a system for detecting and responding to the potential harms presented by rapidly developed and adopted technologies.
We’re just now addressing the idea that there are ills caused by social media, and that we can and should do something to mitigate those ills. Are we prepared to face the problems that AI, Virtual and Augmented Reality, Blockchain, Deepfakes, and other coming technologies will pose?
Practical Takeaways
What can you do? I think the first step is realizing that we can’t wait around for Congress to take action to protect ourselves and our children. I am certainly in favor of increased regulation of tech companies, but taking what steps you can on your own now can protect you and the people you care about. This has to start with increased awareness of these issues. We can’t take action, as individuals or as a society, if we don’t know what’s happening and aren’t having conversations about what can be done. Resources like The Center For Humane Tech and Common Sense Media are valuable for learning about what’s going on, and my goal for this project, Counterweight, is to add another voice to the growing number examining these issues, raising awareness, and trying to find a way forward. In the future, we’ll delve more into defining “The Problem” as a whole, as well as theory that will give us better ways of thinking about the digital environment we interact with, and more practical techniques for approaching digital health and wellness. For now, I’ll leave you with this quote from Marshall McLuhan:
“There is no inevitability, so long as there is an ability to contemplate what is happening.”
If Counterweight is valuable to you, please consider sharing it with someone you think would appreciate it, or supporting the project financially. Counterweight is committed to never taking money from advertisers and is entirely reader-supported.