Instagram’s algorithm recommends sexual videos to minors

Between an ad from a well-known restaurant chain and an announcement from a dating app, adult videos and other sexual content may appear on Instagram. This is what the Wall Street Journal discovered, raising an alarm about how the Reels algorithm recommends sexual content and child pornography.

Adult videos that minors can access

The US newspaper's journalists reached this conclusion after running a test. They created several fake profiles that followed only teenage gymnasts, cheerleaders, and other young influencers active on the Meta-owned social network, and found that the Reels algorithm then served them a variety of content unsuitable for minors, including adult-oriented videos.

The journalists also discovered that many adult men were among the thousands of followers of accounts run by minors. The Canadian Centre for Child Protection performed a similar test and obtained the same results. A Meta spokesperson rejected the Wall Street Journal investigation's thesis, saying the journalists' testing methodology did not reflect the real experience of the platform's billions of users. Responding to the Wall Street Journal's request for clarification, Meta declined to explain why its algorithms were recommending streams of videos featuring children, sex, and advertising.

According to a company spokesperson, Meta introduced new safety tools last October to give advertisers more control over where their ads appear; large corporate brands can choose their ad placements, but the tool does not seem to be working properly. Following the Wall Street Journal's findings, Disney, Match Group, Bumble, and Hims suspended their ad campaigns on Instagram.

According to a spokesperson for the tech giant, the social network removes or reduces the visibility of four million videos suspected of violating its sensitive-content standards every month. The company nevertheless said it intended to investigate the results of the newspaper's tests. According to current and former employees, however, it is implausible that Meta is unaware of the problem.

To clarify: the algorithm is supposed to recommend content similar to what the user has engaged with before, based on previous activity; instead, adult videos that should not be on Instagram at all are surfacing. According to the Wall Street Journal, the goal is clear: to keep users hooked while also recommending videos from accounts they do not follow, as TikTok does.
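For readers unfamiliar with how engagement-based recommenders work in general, the sketch below illustrates the idea in Python. It is a toy model, not Meta's actual system: the embeddings, the `restricted` safety flags, and the function names are all hypothetical. The point is only that candidates are ranked by similarity to prior activity, so following one type of account concentrates recommendations around it, and any gap in the safety filter lets unsuitable items through.

```python
# Toy illustration of engagement-based ranking; hypothetical, not Meta's code.
import numpy as np

def recommend(watched: np.ndarray, candidates: np.ndarray,
              restricted: np.ndarray, k: int = 5) -> np.ndarray:
    """Rank candidate videos by similarity to the user's watch history.

    watched:    (n, d) embeddings of videos the user engaged with
    candidates: (m, d) embeddings of candidate videos
    restricted: (m,) boolean flags from a (hypothetical) safety classifier
    Returns the indices of the top-k allowed candidates.
    """
    # The user profile is just the mean of what they watched: following
    # only accounts of one kind concentrates the profile in that region.
    profile = watched.mean(axis=0)

    # Cosine similarity between the profile and every candidate video.
    sims = candidates @ profile / (
        np.linalg.norm(candidates, axis=1) * np.linalg.norm(profile) + 1e-9)

    # A safety filter is supposed to drop restricted items; if the
    # classifier misses a video, it competes on similarity alone.
    sims[restricted] = -np.inf

    return np.argsort(sims)[::-1][:k]
```

The failure mode the newspaper describes maps onto that last filtering step: if an adult video is never flagged as restricted, similarity alone decides whether a minor sees it.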

Instagram is where pedophiles meet

This is not the first time the Wall Street Journal has drawn attention to the proliferation of sexual content involving minors on Instagram. According to a study conducted last June by the US newspaper together with researchers from Stanford University and the University of Massachusetts Amherst, the Meta-owned social network is the platform most used by pedophile networks to promote and sell content depicting sexual abuse of minors. This is because Instagram's algorithm helps connect a vast network of profiles, making child pornographic, and therefore illegal, content easy to reach. Despite moderation, Instagram still allows users to search for specific hashtags, which makes it easy to connect with profiles openly dedicated to selling content related to child sexual abuse.

According to the US newspaper, a simple search for keywords such as #pedowhore or #preteensex leads to accounts using these terms to advertise content depicting sexual abuse of minors or sexual acts between minors and animals. In many cases, the profiles claimed to be run by the minors themselves. Even then, the newspaper had contacted Meta, since promoting and selling content related to child sexual abuse is a federal crime in the United States as well as a violation of the platform's guidelines.

Mark Zuckerberg's company acknowledged obstacles to moderating this content and created an internal task force to address the issue; as a result, it removed from the platform "thousands of hashtags" that pedophiles used to communicate with sellers and other sex offenders.

Source: Today IT
