University of Minnesota researchers cite the “perception of authenticity” as the top reason that TikTok’s almost 74 million U.S. users continue to scroll through the app’s infinite video feed.
Many people find solace in videos detailing personal mental health matters from creators like K.C. Davis (@domesticblisters), who shares down-to-earth tips for managing the mental load of life’s daily responsibilities, and Elyse Myers (@elysemyers), who speaks openly about her experience with depression, anxiety, and attention-deficit/hyperactivity disorder (ADHD) to her 6.7 million followers.
But while many individuals who struggle with mental illness find community, understanding, and empathy from this kind of content, some U of M researchers are referring to TikTok’s For You Page (FYP) as a “runaway train.”
“People were being sucked in,” says Ph.D. student Ashlee Milton, “and not having a whole lot of control over … what content they were being shown.”
This phenomenon was among the key findings of “I See Me Here: Mental Health Content, Community, and Algorithmic Curation on TikTok,” a research collaboration by Milton, assistant professor Stevie Chancellor, fellow U of M Ph.D. candidate Leah Ajmani, and Michael Ann DeVito, a postdoctoral fellow at the University of Colorado Boulder, published at the ACM Conference on Human Factors in Computing Systems, one of the most prestigious computer science conferences in the world. Their research comprised 16 semi-structured interviews with participants of diverse identities between the ages of 16 and 54.
The runaway train effect is a double-edged sword, Milton says. “Seeing other people having similar experiences, having gone through very similar things, made [users] feel like they weren’t alone… It was giving them that sense of community without having to interact with anyone.”
Taking a Swipe at Endless Scrolling
TikTok’s FYP creates what the researchers call “Online Mental Health Communities”: groups of people who share medical information about mental health (such as diagnostic criteria), exchange experiential information (a day in the life of someone who struggles with a certain mental illness), or seek out comfort content (coping mechanisms that people use to manage their mental illness).
But at the same time, being confronted with an endless stream of mental health content, often recommended based on someone’s past viewership or interactions, can become exhausting (even triggering!) in and of itself. “Your option is to swipe and hope you don’t pull the thing that’s actually really upsetting you in the moment—because there’s no way to get rid of those videos on your feed—or just stop using the app entirely,” says Chancellor, who works in the U of M’s department of computer science and engineering.
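To see how that feedback loop can take hold, consider a minimal sketch of engagement-weighted ranking, the general approach behind recommendation feeds like the FYP. Everything here is a hypothetical illustration, not TikTok’s actual (and proprietary) algorithm: the function names, topics, and weights are invented. The point is simply that watch time compounds, so a few fully watched videos can tilt the whole feed.

```python
# Toy sketch of engagement-driven recommendation (NOT TikTok's real
# algorithm; all names, topics, and weights are hypothetical).
from collections import defaultdict

def update_interest(profile, video_topic, watch_fraction):
    """Lingering on a video nudges its topic's weight upward."""
    profile[video_topic] += watch_fraction  # more watch time -> more weight
    return profile

def rank_feed(profile, candidates):
    """Videos on heavily weighted topics float to the top of the feed."""
    return sorted(candidates, key=lambda v: profile[v["topic"]], reverse=True)

profile = defaultdict(float)
# Watching a few mental health videos to completion...
for _ in range(3):
    update_interest(profile, "mental_health", watch_fraction=1.0)
update_interest(profile, "cooking", watch_fraction=0.2)

candidates = [{"id": 1, "topic": "cooking"},
              {"id": 2, "topic": "mental_health"},
              {"id": 3, "topic": "mental_health"}]

# ...and the feed now leads with more of the same, wanted or not:
# the "runaway train" effect the researchers describe.
print(rank_feed(profile, candidates))
```

Because distress and attention can look identical to a system like this, even swiping past upsetting videos may count as engagement, reinforcing exactly the content a user is trying to escape.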
There are options on TikTok to filter video keywords, refresh your FYP feed, and click a button that shows you’re “not interested,” but researchers say these tools are ineffective. “It doesn’t work in the way that people intend for it to,” Chancellor says.
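One plausible reason such tools underperform is sketched below, under the assumption of simple substring matching; TikTok’s real filtering logic is not public, and the captions and blocked-word list here are invented. Exact keyword filters can miss the altered spellings creators sometimes use to route around moderation.

```python
# Toy sketch of why exact keyword filters can fall short (hypothetical;
# not TikTok's real filtering logic).
def keyword_filter(captions, blocked):
    """Keep only captions containing none of the blocked words."""
    return [c for c in captions if not any(b in c.lower() for b in blocked)]

captions = [
    "coping with depression, day 4",
    "coping with d3pression, day 5",   # altered spelling slips through
    "why I stopped doomscrolling",
]

# Filtering on the plain word misses the variant entirely, so the
# "filtered" feed can still surface the upsetting content.
print(keyword_filter(captions, blocked=["depression"]))
# -> ['coping with d3pression, day 5', 'why I stopped doomscrolling']
```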
While some of the information provided by TikTok users can be insightful and educational, sometimes eerily surfacing parts of users’ personalities and identities that they may not have recognized in themselves, it can also spread misinformation and downright bad information, the researchers found.
For example, some TikTokers who regularly post content about ADHD claim that going out of your way to step on crunchy leaves may be symptomatic of the condition. “I would say that leans more toward bad information than misinformation,” Chancellor says. “There are things about sensory processing that are important for an ADHD diagnosis … Crunchy leaves could potentially be a stim [a self-stimulating behavior] for someone. But for the vast majority of people who see that video—myself included—I love stepping on crunchy leaves, and I’m neurotypical. So that kind of information is problematic but not in the same way that actively harmful information is.”
Future research, which Milton aims to publish in the next few years, will address where people go to seek out information on mental illness, the kind of information they find, and how they navigate the overwhelming amounts of information about various health issues on different platforms. “We dive more into credibility and the truthfulness of content and how people choose to go to different platforms,” Milton says.
They also plan to pursue another study in which they’ll conduct workshops with people diagnosed (or self-diagnosed) with mental illnesses and ultimately design an intervention for how TikTok and other social media sites present mental health content. “How can we help mitigate this, and make this space that is still navigable?” Milton says. “Kind of like a safe space, where they’re not having to make that choice between, am I going to get another traumatizing video? Or do I just need to shut the app for the day?”
This intervention, in theory, would give users more control over the content that appears on their feed, without deleting the app or logging off, and without the risk of being blindsided by triggering content.
“People need to be very cognizant of themselves,” Milton says. “How much are they using [TikTok]? Is this going from something that’s causing them to have increased wellbeing before it starts going into, Oh, this is actually harming me, and really having that self-check-in? Because everything in moderation, right?”