Discord Introduces New Safety Features for Teenage Users

Discord, the popular online platform for gaming and socializing, has announced a new initiative called Teen Safety Assist to create a safer environment for its younger users. The initiative includes safety features that are enabled by default for teenagers on the app, such as proactive content filters and alerts.

Teen Safety Assist: What Is It and How Does It Work?

According to a blog post by Discord, Teen Safety Assist is a set of safety features that will be automatically enabled for users who are under 18 years old. These features are designed to protect teens from potentially harmful or inappropriate content and interactions on the app. Some of the features are:

  • Image blurring: Discord will automatically blur any images that are flagged as sensitive or explicit by its content moderation system. Teens will have the option to unblur the images if they want to, but they will see a warning message before doing so.
  • New sender alert: Discord will alert teens when they receive a message from someone they don’t know or haven’t interacted with before. The alert will prompt them to either reply, block, or report the sender. Teens will also be able to mute or block anyone they don’t want to talk to at any time.
  • Rule violation warning: Discord will send a warning message to users who break its community guidelines or terms of service. The message will inform them of the rule they violated, the consequences of their actions, and how to avoid repeating the offense. Discord said that these warnings are meant to educate users and help them become better digital citizens.

Discord said that Teen Safety Assist will roll out next week and will be the default setting for all teenage accounts. Users who are 18 years or older can also opt in to these features if they want to.

Why Did Discord Launch Teen Safety Assist?

Discord said that it launched Teen Safety Assist as part of its mission to create a safer internet for everyone. The company said that it recognizes the challenges and risks that teens face online, especially on platforms that allow them to communicate with strangers and share content.

“Discord is a place where people can express themselves, find their communities, and make friends. But we also know that not everyone on the internet has good intentions, and some people may try to take advantage of or harm others,” the company said in its blog post.

Discord said that it has been working hard to improve its safety and moderation tools over the years, such as adding server verification, trust and safety reporting, and parental controls. However, it also acknowledged that these tools are not enough to prevent all forms of abuse and harassment on the app.

“That’s why we’re launching Teen Safety Assist, a new initiative that aims to proactively protect our younger users from potentially harmful content and interactions on Discord,” the company said.

Discord also said that it hopes that Teen Safety Assist will encourage teens to have more open and honest conversations with their parents or guardians about their online activities and safety.

“We believe that parents and guardians play a vital role in helping teens navigate the online world safely and responsibly. We encourage teens to talk to their parents or guardians about what they do on Discord, who they talk to, and what kind of content they see,” the company said.

How Are Users Reacting to Teen Safety Assist?

The announcement of Teen Safety Assist has received mixed reactions from users on social media and online forums. Some users praised Discord for taking steps to protect its younger audience and create a more positive and inclusive community. They said that Teen Safety Assist will help prevent teens from being exposed to harmful or offensive content and messages on the app.

“I think this is a great idea. There are a lot of creeps and trolls on Discord who try to harass or scam young people. This will make it harder for them to do that,” one user commented on Reddit.

“I’m glad Discord is doing this. I have a younger brother who uses Discord a lot, and I worry about him sometimes. He’s very naive and trusting, and he doesn’t always know how to deal with strangers online. This will give him some extra protection and guidance,” another user wrote on Twitter.

However, some users criticized Discord for being too intrusive or paternalistic with its new safety features. They said that Teen Safety Assist will limit teens’ freedom and autonomy on the app, and that it will interfere with their personal preferences and privacy.

“This is ridiculous. Discord is treating teens like babies who can’t make their own decisions or handle anything online. Teens are not stupid or helpless. They can decide for themselves what they want to see or do on Discord,” one user argued on Reddit.

“This is an invasion of privacy. Discord is basically spying on teens’ messages and images without their consent. They have no right to do that. Teens have a right to privacy and confidentiality on the app,” another user complained on Twitter.

Some users also expressed doubts about the effectiveness and accuracy of Discord's content moderation system. They said that it may fail to detect or filter some types of sensitive or explicit content, while also blurring or blocking content that is harmless.

“How will Discord know what is sensitive or explicit? What if it blurs something that is not offensive or dangerous, like a meme or a joke? What if it misses something that is actually harmful or illegal, like child pornography or hate speech?” one user asked on Reddit.

“What if I want to see the blurred images? What if they are relevant or important to the conversation or the topic? What if they are part of my personal expression or identity? Discord is taking away my choice and my voice,” another user said on Twitter.