Ending Child Pornography on Social Media Platforms

Last November, it was discovered that child pornography was being shared on the popular social media platform Tumblr. As a result, Apple removed the Tumblr app from its App Store. A month later, Tumblr decided to remove all pornographic content from its site.

According to the National Center on Sexual Exploitation (NCOSE), WhatsApp, a popular messaging app owned by Facebook, is now facing the same issue of child pornography being shared through the app. WhatsApp is one of the most popular messaging apps in the world; early last year, it reached 1.5 billion monthly users, with over 60 billion messages sent per day.

There are two major factors NCOSE believes contribute to the spread of inappropriate content:

  1. WhatsApp allows adult pornography on its platform.
    While child pornography is banned on WhatsApp, adult pornography is allowed. As Tumblr learned firsthand, permitting adult pornography opens the door to a toxic environment: boundaries are slowly pushed, and content gradually becomes more extreme.
  2. WhatsApp lacks proper moderation.
    Facebook has a team of 20,000 employees monitoring content posted on its social media platform, but none of them work for the Facebook-owned WhatsApp. WhatsApp has only 300 employees to manage content independently on a large, ever-changing platform.

It is time for WhatsApp to follow Tumblr's example and remove all explicit adult content from its platform. Take action by raising awareness of this important issue. You can also leave a negative review for WhatsApp on the App Store, asking the company to put a stop to the sharing of all forms of exploitative content.