Here’s what you can do if your social media post gets taken down

You can appeal when your social media content gets taken down, but you need to know where to look.

Some social media content is easy to evaluate. Cute cat pictures? Great. Helpful how-to videos? Perfect. Unfortunately, parsing posts gets a lot more complicated when the content flowing through the tubes of social media involves news. Earlier this week, the House Judiciary Committee brought in representatives from Twitter, YouTube, and Facebook to discuss the companies’ policies and practices for evaluating and moderating content (mostly political in nature) and the internet scourge that is fake news.

During the hearing, Facebook’s rep, Monika Bickert, mentioned that the company revamped its process for users to appeal when their posts are removed from the site. The company is halfway into what it considers a three-year plan to address the spread of false information and figure out its tactics for identifying things like hate speech and propaganda. No one likes having their content moderated or removed, so we followed up with each company to find out what recourse users have in the case of a deletion.

Facebook

Earlier this year, Facebook publicly released the community guidelines it uses internally to decide when and how to moderate content. To go along with that disclosure, the company also revamped the way users can appeal when their posts get removed from the service.

Now, Facebook will notify you when your content is removed, and you’ll have an opportunity to appeal if it was removed for nudity, sexual activity, graphic violence, or hate speech—all of which the company detects, in part, using AI. The prompt comes in the form of a button in the alert about the initial removal. When you click appeal, your request goes to someone on Facebook’s community team. According to the company, every appeal goes through a human, which is in line with requirements set earlier this year as part of the General Data Protection Regulation that’s now in effect in the European Union.

As part of the GDPR, consumers have the right to an explanation of how and why an algorithm made a decision. While the appeal process—with a real human!—is a positive step in terms of content regulation, you shouldn’t expect a full-on hearing or a chance to argue your case in some grand Law & Order fashion. Rather, a person will look at your request and let you know whether or not your content fits the community guidelines.

YouTube

You can check out YouTube’s Community Guidelines page to see what kind of content will or won’t get your video removed from the service. Predictably, YouTube also has a video version that explains the top-level concepts of what will get your reel taken down.

When YouTube believes you’ve violated the guidelines, you’ll receive a strike on your account, which will show up in your messages along with an explanation of why the company took that action. If you want to appeal the removal, you’ll have to actively request it by digging into the menus.

As with Facebook, once you file an appeal, a human member of the community team will review the video to decide whether the violation stands.

Once the YouTube team member makes their decision, there are four possible outcomes, according to YouTube:

- “If we find that your video did not violate our Community Guidelines, we’ll reinstate it and remove the strike from your account.”
- “In some instances, it’s possible that we may remove the strike from your account, but keep the video down.”
- “In some instances, it’s possible that we may reinstate your video behind an age restriction. This will happen if a violation is not found, but the content isn’t appropriate for all audiences.”
- “If we find that your video was in violation of our Community Guidelines, we will uphold the strike and the video will remain down from the site.”

Before you appeal a removal, however, you should take a moment to carefully read the Community Guidelines as well as the Policy page, because if YouTube rejects your appeal, you won’t be able to appeal again for 60 days.

Twitter

YouTube’s and Facebook’s processes for removed posts are fairly simple, but Twitter’s is slightly more complicated. The service typically handles abuse and other bad behavior by temporarily or permanently banning entire accounts rather than removing or suppressing individual tweets.

The Twitter Rules page outlines what the service does and does not find acceptable, but the examples and the guidelines aren’t quite as specific as Facebook’s or YouTube’s.

Twitter uses AI systems to immediately detect some tweets that contain harassing language or what it considers hate speech. Flagged tweets may result in an account losing some functionality, such as having replies and tweet visibility restricted to followers only.

There is an appeal process for getting a suspended account reinstated, but the appeal form can be difficult to locate. Its fields give you an opportunity to explain your actions and make a case to a human reviewer, who can decide whether or not to give you back a fully functioning account.