
Can TikTok Really Make Social Media Less Harmful?

In the past decade alone, social media has come to rule our world. The vast majority of us spend every morning scrolling through Instagram, reading a couple of tweets, and, if you're old, checking Facebook, and it goes without saying that this constant use has had a major impact on our mental health and general wellbeing.

Facebook has faced some difficulties with its rebrand to Meta, and it isn't hard to understand why. As a platform, Facebook let misinformation and harmful content spread like wildfire, and as a result it gave rise to some pretty horrible groups. After Facebook purchased Instagram, its reluctance to address mental health and other issues drove people away from the platform. Now it is trying to address sexual harassment in the Metaverse, but how was that allowed to happen in the first place?

In the past few years, Instagram has gone from a photo-sharing app to a money-making platform for influencers and talent all over the world. All of a sudden, we're seeing overly retouched images, unrealistic body standards, and careless promotions of weight-loss supplements, vitamins, face-altering filters, and so on, without any warning or regulation.

Recently, Instagram made an effort to control the content we see on the app by labeling posts that have been altered or filtered, and by making it a rule to tag branded content. But it was too little, too late.

Now, TikTok has announced updates to its community guidelines, which aim to protect users from harmful content.

The new policy updates include:
- "Strengthening our dangerous acts and challenges policy"
- "Broadening our approach to eating disorders"
- "Adding clarity on the types of hateful ideologies prohibited on our platform"
- "Expanding our policy to protect the security, integrity, availability, and reliability of our platform"

Unlike other social media platforms, TikTok has been holding itself accountable and aims to be transparent about the work it is doing around its community guidelines. Every quarter, the platform releases a "Community Guidelines Enforcement Report"; the most recent showed that 91 million videos that violated its policies were removed in Q3 of 2021, roughly 1% of the videos uploaded to the platform. "Of those videos, 95.1% were removed before a user reported it, 88.8% before the video received any views, and 93.9% within 24 hours of being posted."

TikTok is far from perfect, but at least it is taking important steps toward becoming less harmful for users, and it doesn't shy away from tackling issues head-on, something plenty of other platforms struggle with.

At the end of the day, social media may never be truly good for us, but it can at least improve and do less harm. You can read TikTok's full outline here, and learn more about the actions the platform is taking.
