Instagram accused over accounts that sexually exploit children: what's the story?

Meta's Instagram has failed to remove accounts that attract hundreds of sexualized comments for posting pictures of children in swimwear or revealing clothing, despite the parent company's claim that it takes a zero-tolerance approach to child exploitation.

The Guardian reported that although some of these accounts had been flagged through the in-app reporting tool, they remained active.

In one case, a researcher used the in-app reporting tool to flag an account that posted pictures of children in sexualized poses; the report was handled by automated moderation technology, which replied that it had been received.

"Due to the high volume of reports, ours has not been reviewed, but our technology has found that this account probably does not go against our community guidelines," the Instagram app said.

The user was advised to block the account, unfollow it, or report it again; the account remained active as of Saturday with more than 33,000 followers.

The newspaper notes that similar accounts were also found on Twitter, which did not remove them until rights groups publicized them.

More often than not, the accounts are used as a lure: offenders post photos that are technically legal, then arrange to move into private message groups where they share other material.

Andy Burrows, head of online safety policy at the UK's National Society for the Prevention of Cruelty to Children (NSPCC), described the accounts as a "shop window" for those seeking such content.

"Companies should proactively identify this content and then remove it themselves," he said.

He criticized the companies, saying that "even when they are informed of this, they judge that it is not a threat to children and should remain on the site."

A spokesperson for Meta, Instagram's owner, said the company has strict rules against content that sexually exploits children or puts them at risk, and that such accounts are removed when it becomes aware of them.

"We also focus on preventing harm by blocking suspicious profiles, restricting adults from messaging children they are not connected to, and defaulting the accounts of people under 18 to private," he told the newspaper.

Twitter also responded that the accounts the newspaper had reported have now been permanently suspended for violating its rules.

A company spokesperson said: "Twitter has zero tolerance for any material that features or promotes the sexual exploitation of children."