The Future

Keep Instagram horny

The platform's latest announcement suggests the campaign to sanitize the internet of sex is going strong.

Facebook is for keeping up with your family, Twitter is for depression (both causing it and complaining about it), and Instagram is for being a freak. From artfully crafted thirst traps to extremely NSFW content, the most visual app is also the horniest. But that may be changing. Instagram announced last week it will be limiting the spread of what it deems “borderline” content. The move is part of Facebook’s broader updated “remove, reduce, and inform” strategy for its family of apps.

“We have begun reducing the spread of posts that are inappropriate but do not go against Instagram’s Community Guidelines, limiting those types of posts from being recommended on our Explore and hashtag pages,” the company said in a statement Wednesday. Essentially, a post does not have to outright violate the platform’s community guidelines — like spam posts or violently graphic photos — but if it is considered sexually suggestive, in bad taste, or hurtful, it could receive far less engagement, because the platform will not recommend it to other users through the Explore or hashtag pages. Users who already follow the account that made the “borderline” post will still be able to see it, but potential followers will not happen across it themselves while browsing.

So far, the company has not offered greater detail on what a borderline post actually looks like; a note in the Help section on the topic — titled, “Why are certain posts not appearing in Explore and hashtag pages?” — doesn’t say much to guide users on what might cause a post to be flagged as borderline. The platform’s Community Guidelines clearly state that if a user posts certain content, such as depictions of sex acts, genitals, “close-ups of fully-nude buttocks,” or female nipples (with exceptions), it is subject to deletion and the account overall may be disabled. In its note regarding borderline content, however, Instagram’s expectations are notably vague.

“While some posts on Instagram may not go against our Community Guidelines, they might not be appropriate for our global community, and we'll limit those types of posts,” the post reads. There is no bulleted list or visual aid to distinguish how content may conform to Community Guidelines but also be inappropriate enough to warrant limiting its exposure. Apparently, content creators will just be expected to stay in the dark on that front as they post, unsure of what might set off a decline in their engagement numbers.

This is especially troubling for users who post erotic, sex-positive, or just plain horny content on Instagram. Which, let’s face it, who among us hasn’t been a little thotty on there from time to time? The reality is Instagram created an environment that incentivized certain kinds of content, and everyone — from the users who lurk more than actually post, to micro-influencers, to big-name creators — has been able to capitalize on these trends to build communities and sometimes careers.

The company’s increasingly pearl-clutching stance feels like a concerted effort to further push conversations around sex and thirst, whether playful or educational, off its platform. This move on Instagram’s part also sends a message that even when content is not explicit, when it follows the rules laid out for all of us, the fact that it is merely suggestive is enough to limit who may happen upon it while casually browsing. Not to mention the possible impact this will have on the careers and livelihoods of users producing erotic or sex-positive content: If their posts are deemed borderline, bringing in new followers could become even more difficult.

Users have long been policing the sorts of posts creators are allowed to put up through tactics like bad-faith reporting. With the new, vague borderline policy, this culture of targeting suggestive content will likely proliferate. To sex educator and BDSM coach Cory B., this policy simply legitimizes targeted harassment, which she points out is ironic considering Instagram’s stated efforts to curb the practice on its platform.

“Instagram has already been doing this to accounts,” Cory wrote in an email to The Outline. “They’ll remove a photo then place a shadow ban on the account for two weeks down to the minute. A few weeks ago there was a 24-hour period where a bunch of people in the sex-positive sphere were unable to post.”

Cory had her account taken down in January; after a month she was told her account had been incorrectly disabled for pornography and she regained access. She lost out on income streams, including a donation-based Q&A service she operated through her DMs, and access to her 9,700 followers, to whom she regularly promoted her workshops and events.

“The whole point of my business is to educate those who aren’t already in the know about things like sex-positivity,” Cory stated. “It doesn’t make sense for me to flock to a new platform full of a bunch of sex-positive people because then I’m just preaching to the choir. So it looks like I’m going to have to continue shifting my content so that I can continue to do this important work that clearly some people at Instagram would benefit from.”

In a similar vein, erotic model and artist Kasey, whose stage name is Skye Blue, points to the fact that her issues with posts being deleted and her accounts disabled started once she reached 100,000 followers. The fact that the removals happened one right after the other made her feel as though someone was reporting her to the point of harassment. With this new policy, it can be assumed such practices will ramp up.

“I had even received emails from people saying they knew who was reporting me and wanting to personally take me off Instagram,” Kasey said in an email to The Outline. “Anything can be a trigger word for someone to report now. If someone doesn’t like a fetish, or even a word for a body part, they can report me and I can get deleted again. My work is mainly nude; I’m having to change the way I shoot just to make images for this stupid platform. The conformity is too much.”

“To say that if a post follows the guidelines, it will still get demoted if deemed ‘inappropriate’ — how does anybody work with that?” Cory stated. “Who decides what is inappropriate? I don’t see how sexism and sex-negativity aren’t going to end up in these algorithms.”

Platforms including Facebook, Twitter, and YouTube have all been called on at one point or another to perform some form of content moderation on their sites, often with regard to the spread of misinformation and hate speech. Of course, as these platforms have become massive public spheres intertwined with our daily lives, they do bear a great deal of responsibility in that regard. But amid cries for clear and consistent rules that will keep users safe, Instagram has developed a deliberately vague policy that just keeps us covered up. The problem is the Nazis, not the nipples.