Users have long been policing the sorts of posts creators are allowed to put up through tactics like bad-faith reporting. With the new, vague borderline policy, this culture of targeting suggestive content will likely proliferate. To sex educator and BDSM coach Cory B., the policy simply legitimizes targeted harassment, which she notes is ironic given Instagram's stated efforts to curb that very practice on its platform.
“Instagram has already been doing this to accounts,” Cory wrote in an email to The Outline. “They’ll remove a photo then place a shadow ban on the account for two weeks down to the minute. A few weeks ago there was a 24-hour period where a bunch of people in the sex-positive sphere were unable to post.”
Cory had her account taken down in January; after a month she was told it had been incorrectly disabled for pornography and she regained access. In the meantime she lost income streams, including a donation-based Q&A service she operated through her DMs, and access to her 9,700 followers, to whom she regularly promoted her workshops and events.
“The whole point of my business is to educate those who aren’t already in the know about things like sex-positivity,” Cory stated. “It doesn’t make sense for me to flock to a new platform full of a bunch of sex-positive people because then I’m just preaching to the choir. So it looks like I’m going to have to continue shifting my content so that I can continue to do this important work that clearly some people at Instagram would benefit from.”
In a similar vein, erotic model and artist Kasey, whose stage name is Skye Blue, points out that her issues with posts being deleted and her account being disabled started once she reached 100,000 followers. The removals happened one right after the other, which made her feel as though someone was reporting her to the point of harassment. Under the new policy, such practices will likely ramp up.
“I had even received emails from people saying they knew who was reporting me and wanting to personally take me off Instagram,” Kasey said in an email to The Outline. “Anything can be a trigger word for someone to report now. If someone doesn’t like a fetish, or even a word for a body part, they can report me and I can get deleted again. My work is mainly nude; I’m having to change the way I shoot just to make images for this stupid platform. The conformity is too much.”
“To say that if a post follows the guidelines, it will still get demoted if deemed ‘inappropriate’ — how does anybody work with that?” Cory stated. “Who decides what is inappropriate? I don’t see how sexism and sex-negativity aren’t going to end up in these algorithms.”
Platforms including Facebook, Twitter, and YouTube have all been called on at one point or another to perform some form of content moderation, often with regard to the spread of misinformation and hate speech. Of course, as these platforms have become massive public spheres intertwined with our daily lives, they do bear a great deal of responsibility in that regard. But amid cries for clear and consistent rules that will keep users safe, Instagram has developed a deliberately vague policy that just keeps us covered up. The problem is the Nazis, not the nipples.