Written by Kathy Wheatley on August 15, 2024

Brief TikTok Use Linked To Harmful Body Image Effects, Study Shows

Short TikTok sessions featuring pro-anorexia content can notably weaken self-esteem and elevate the risk of eating disorders among young women, according to recent Australian research. According to the New York Post, the study from Charles Sturt University highlights how quickly body image can be harmed after less than ten minutes of exposure to damaging TikTok videos.

Researchers at Charles Sturt University in Australia undertook a detailed analysis of the influence of TikTok content promoting anorexia. Their findings were published in the journal PLOS ONE, shedding light on the significant psychological effects of short exposure to such material.

The 273 participants, university freshmen aged 18 to 28, showed concerning trends in their self-esteem and eating habits. The group was divided in half: one set viewed pro-anorexia content for about eight minutes, while the other watched neutral videos.

The comparison revealed that even brief engagement with damaging content led to a notable decline in body image satisfaction, particularly among those exposed to pro-anorexia material.

Study Exposes Immediate Damaging Impact Of Content

This recent study aligns with ongoing concerns about the potency of social media influence among youth. Although TikTok has implemented measures to curb access to harmful content, the persistence of such videos remains troubling.

“Our study showed that less than ten minutes of exposure to implicit and explicit pro-anorexia TikTok content had immediate negative consequences for body image states and internalization of appearance ideals,” noted the researchers, highlighting the urgency for more stringent control and regulation.

A TikTok spokesperson responded to the findings, saying the platform continually works to provide a safe and diverse viewing experience while remaining mindful of the varied impact content can have on individuals.

Regulatory Changes And The Role Of Social Platforms

The study’s findings underline the pressing need for platforms like TikTok to tighten their content filtering processes to prevent the spread of eating disorder-related content. TikTok has updated its community guidelines to block harmful weight-loss claims and behaviors associated with eating disorders.

On the regulatory front, President Biden has signed a law requiring ByteDance, TikTok's Chinese parent company, to either sell the app or face a ban by January 2025 over data security concerns, further complicating the platform's operating environment.

Despite these changes, researchers argue that merely blocking search terms such as '#anorexia' is insufficient as users often find ways around these restrictions. “Current steps are being taken to delete dangerous content, however, further regulation is required,” they emphasized.

User Behavior And Platform Responsibility Intersect

The study also observed a significant correlation between more than two hours of daily TikTok use and increased disordered eating behaviors. The association, while not overwhelmingly strong, points to a troubling trend that merits attention.

As platforms continue to evolve and adapt their policies, the emphasis remains on striking a balance between user freedom and protective measures. Responsibility is shared: users must navigate content thoughtfully, and platforms must enforce safe browsing environments.

This shared responsibility is central to mitigating risk and ensuring that TikTok and similar platforms do not become conduits for harm.

Concluding Insights: A Call For Broader Measures

In conclusion, the Charles Sturt University study demonstrates the considerable impact that even brief exposure to detrimental TikTok content can have on young women. Both the decline in body image satisfaction and the normalization of unhealthy appearance ideals call for an aggressive stance on content management and regulatory oversight.

The need for comprehensive strategies is clear: current measures, while steps in the right direction, are not enough to address the nuanced ways users interact with and circumvent content controls. Broader regulatory measures, combined with proactive platform management, form the foundation of safeguarding mental health in the digital age.
