How Snapchat is Tackling the Rising Concerns About Content Moderation

Samuel David

2023-03-19


Amid growing concerns about content moderation on social networks, Snapchat has taken steps to safeguard its users from inappropriate content. It has added a new element to its Family Center that helps parents restrict the content their children see, and it has provided more transparency into its content guidelines so users can understand how the app ranks and distributes uploads.

The improved Family Center gives parents more peace of mind that their kids are not being exposed to offensive material. A new sensitive content toggle enables them to filter out any Stories marked as suggestive. In addition, Snapchat has published its Content Guidelines, which help people better understand the rules around what is and isn't allowed in the app.

These guidelines cover topics such as engagement bait, creative quality, and a prohibition on overt solicitation by non-approved creators. Snapchat is also looking to add another element to its Family Center that will allow parents to monitor their kids' use of its My AI feature.

My AI, released last month, lets users interact with an AI chatbot, and it has raised concerns after reportedly offering advice on drugs, alcohol, and hiding things from parents. By adding parental oversight to this feature, Snapchat can provide a higher level of safety and security.

Ultimately, all of this should help keep the Snapchat community safe and secure while giving parents extra tools for monitoring the content their children are exposed to. It also provides additional transparency for marketers looking to ensure their content meets the platform's standards and reaches the right audience.
