On Thursday, TikTok announced the release of new tools to tackle online bullying on its platform. Users can now report comments for possibly breaching TikTok's Community Guidelines or remove multiple comments at once.
The company is introducing a new way for creators to manage interactions with their content. On TikTok, people put their hearts and souls into making videos and entertaining others, and the company says it understands how discouraging it can be to receive negative feedback on those videos.
To control interactions on a video, users can open an options menu by long-pressing on a comment or tapping the pencil icon in the upper left corner of the screen.
Rather than addressing comments one by one, users can now select up to 100 comments or accounts at once.
Creators who enable the new "Filter All Comments" feature can choose which comments appear on their videos. Once the feature is turned on, comments are hidden by default, and creators approve each one individually through a new comment management tool.
This feature expands on TikTok's existing comment controls, which let creators filter spam and offensive comments and filter by keywords, similar to other social media platforms such as Instagram.
Users in the United Kingdom, South Korea, Spain, the United Arab Emirates, Vietnam, and Thailand were the first to receive the bulk-deletion option, which lets them remove multiple comments at once. It will roll out worldwide in the coming weeks. The company has also added the feature that allows only the video's creator to choose which comments appear on their video.
TikTok already imposes certain restrictions on its Duet and Stitch features, as well as on DMs and comments, for users aged 13 to 17.
The aim of the new features, according to TikTok, is to maintain a welcoming, positive atmosphere in which users can concentrate on being creative and finding community.
A second feature encourages users to think twice before posting negative comments, such as those that appear to be bullying or inappropriate. Users will be reminded of TikTok's Community Guidelines and given the chance to edit their comments before publishing them.
These types of "nudges" work by slowing people down, prompting them to consider what they're doing rather than reacting impulsively. In an effort to curb the spread of disinformation, TikTok also uses nudges to ask users whether they really want to share unsubstantiated claims that fact-checkers can't verify.
Other social media platforms took years to implement reminders that ask users to pause and deliberate before posting. Instagram, for example, launched in 2010, but it took the company nearly a decade to experiment with a feature prompting users to think twice before posting offensive comments.