I’ve Been Trying to Solve The Social Dilemma for 3 Years

“These technology products were not designed by child psychologists who are trying to protect and nurture children,” says Tristan Harris. According to the tech ethicist, the people behind social media apps and platforms “were just designing to make these algorithms that were really good at recommending the next video to you.” Speaking in the recent documentary/drama hybrid, The Social Dilemma, Harris makes these statements as viewers watch a fictional tween girl delete a selfie because it only garnered two “Likes” (a phenomenon I’ve written about before). She promptly takes another selfie, adds filter after filter, and posts the new, heavily edited photo online. She’s briefly relieved when this version gets more engagement. Harris continues his narration in the background, explaining that “it’s not just that it’s controlling where they spend their attention… social media starts to dig down deeper and deeper into the brainstem and take over kids’ sense of self-worth and identity.” Then we see the girl on screen suddenly dejected as one of her followers makes a joke about the size of her ears.

For me, this was one of the more striking scenes in the documentary. It not only illustrated the effects of social validation metrics on the brain—it also highlighted the dangers social media can pose for our children specifically. Harris identifies the key reason I founded a company to build better products for kids: the technology we currently have was not designed with children in mind. It’s a thesis I explored in depth in my book, Screen Captured, which I published just over a year ago.

Divisive technology, divisive documentary

Love it or hate it, The Social Dilemma sparked intense responses—from people and tech companies alike. Filmmaker Jeff Orlowski features some of the most outspoken technology critics, many of whom are responsible for designing some of social media’s most manipulative features. Their main argument? That these tech platforms go to great lengths to capture our attention through manipulative design and advanced algorithms—ultimately threatening our wellbeing and our society. And even though I might not describe technology as an existential threat, I’ve been alarmed by big tech for a long time—especially where kids are concerned.

“I’m worried about my kids. And if you have kids, I’m worried about your kids.” We hear this statement from Dr. Anna Lembke, the Medical Director of Addiction Medicine at Stanford University, just before we see her own kids on screen. They proceed to estimate their own social media usage—and both her 15-year-old son and 17-year-old daughter discover that they have wildly underestimated the hours they spend on social media. And this is another symptom of the problem: these platforms are so effective at separating us from our time—without us even realizing it. And while the minutes and hours tick by, we’re not experiencing a proportional benefit. In fact, we’re often left feeling more disconnected than before.

Imagine my concern when I started to see these same patterns emerge on platforms that my 7-year-old daughter was interested in using to connect with family and friends. And this isn’t limited to apps designed for adults: I noticed it on the kids’ platforms she was using, like Popjam, which features a feed and rewards children for accumulating more “Likes” and followers. When these persuasive patterns and manipulative design tactics hit this close to home, they fueled a passion in me to get more involved—by writing my book and creating my company, Kinzoo.

I did a great deal of research before founding Kinzoo, and I encountered a lot of the same arguments that we hear in The Social Dilemma. I resolved to never intentionally put persuasive design into our product. The whole team also works hard to make sure we don’t inadvertently include mechanisms or patterns that are less obvious, like Snapstreaks or “Like” buttons.

I also became convinced that our kids especially need technology that prioritizes their wellbeing. I saw how the big tech companies use features like infinite scroll and “Like” buttons that give us a little hit of dopamine and keep us coming back for more. And while I’m careful about using the word “addiction” to describe the behavior, I definitely didn’t want to give my young daughter technology that operates that way.

Can we create a social solution?

This film raises some critical issues, and I think it’s important for people to watch. But it presents a lot of problems without many suggestions for what users can do about these platforms, short of deleting them altogether. That might work for some of us but definitely won’t work for everyone. I believe two things need to happen in order for us to begin seeing meaningful and positive change: We need better platforms built by founders who are motivated to serve the needs of their users, not the needs of their advertisers. And we need a critical mass of people to begin rejecting platforms built to persuade and manipulate. But in order for that to happen, they need viable alternatives.

In the interim, I don’t think we need to delete our accounts altogether—but many of us could tighten up our networks and start relying more on direct means of communication rather than social media. If we make the effort to have more meaningful interactions with family and friends—rather than just seeing them appear in our feeds once in a while—I suspect our relationships will benefit anyway. We can wait for regulatory change, but I wouldn’t hold my breath. The only parties that can catalyze change from the platforms are the advertisers (their customers) or us and our attention (their product).

It’s true that the status quo for technology needs to change for our children most of all. Currently, the onus is more or less on parents to protect and guide their kids. They’re left to navigate a complex and opaque landscape on their own. The things I learned in my own research helped me feel more confident about how we use technology in our family, and I hope that my book (and the work we’re doing at Kinzoo) can help other families do the same. Until tech companies and regulators start cleaning up the current mess, everyday people need a lifeline to help separate good technology from bad.

Image credit: REDPIXEL / Adobe Stock