To fight child pornography, Instagram restricts some hashtags

Some Instagram users exploited the social network to exchange child pornography. Quickly alerted by its community, the network responded by restricting certain hashtags.

Child pornography is a scourge that plagues the internet. The exchange of such images is obviously illegal and punishable by law, but that does not stop offenders from finding new ways to communicate with one another in public spaces. To do so, they use coded language, which sometimes surfaces in the open.
That is what happened on Instagram, Facebook's photo-sharing social network. Although the platform is not designed for private communication or file exchange, some users managed to divert it with seemingly innocuous hashtags. The main ones, #dropboxlinks and #tradedropbox, posted on otherwise unremarkable photos, allowed pedophiles to get in touch with each other and then privately exchange illegal content via file-sharing platforms like Dropbox.
These hashtags were sometimes accompanied by far more explicit variants, such as #gay13years, or by comments like "I have links" or "I trade boys for girls," reports TNW, with screenshots as evidence.

Instagram has addressed the problem in its own way by making the hashtags in question inaccessible. A message now reads: "Posts for #dropboxlinks have been limited because the community has reported content that may violate Instagram's community guidelines." Instagram's rules state that the social network must be used in compliance with the law, and that the platform applies "zero tolerance toward the sharing of sexual content involving minors."
Instagram also says it is "developing technology that proactively finds content related to child pornography and child exploitation as soon as it is uploaded, so it can react quickly."

Despite the hashtag restriction, several people who reported the accounts involved received a reply from Instagram stating that the users in question did not violate the platform's terms of use, since the content is not hosted on Instagram itself, and the accounts were not deleted, according to The Atlantic.
The community to the rescue
Contacted by TNW, Dropbox condemned this use of its service and promised to work with Instagram to ensure that this type of content is removed from its hosting service as quickly as possible.
As for the misuse of the social network itself, it was the Instagram community that tried to drown out the practice by massively posting memes under the hashtags in question, making it harder for pedophiles to get in touch.

This shows how difficult it is for platforms to curb this kind of behavior from users who always find a way around blocks and return with new methods.
