New York Times document exposes Facebook's actions to protect children from sexual abuse


The training document used by Facebook's content moderators raises questions about whether the social network places enough emphasis on preventing images of potential child sexual abuse from being shared on the platform.

According to a report in The New York Times, the training document instructs moderators who cannot determine the age of a person in a photo to "err on the side of adults" and treat the person as an adult when evaluating images, a practice that moderation supervisors objected to but which company executives defended.

The dispute centers on how Facebook's moderators handle photos in which the age of the people shown is not immediately clear.

The people interviewed, content moderators who have worked for Facebook, described a policy called "bumping up" with which they disagree.

The policy applies when a content moderator cannot determine whether the subject of a suspect photo is a minor or an adult.

In such cases, content moderators are asked to assume the subject is an adult, which allows more images to go unreported to the National Center for Missing and Exploited Children (NCMEC), which forwards images to law enforcement authorities in the United States.

Photos classified as depicting adults may still be removed from Facebook if they violate its rules, but they are not reported to outside authorities.

But, as The New York Times points out, there is no reliable way to determine a person's age from a photo. The report says moderators are trained to use a method that is more than 50 years old to identify "progressive stages of puberty", but that "it was not designed to determine a person's age".

Because Facebook's guidelines direct moderators to label as adults any photos they are unsure about, moderators believe the policy could allow too many images of children to remain on the platform.

Complicating matters further, Facebook's moderators, who work for outside contractors and do not receive the same benefits as full-time employees, may have only a few seconds to make a decision and may be penalized for making the wrong call.

A 3D-printed Meta logo and the Facebook logo placed on a laptop keyboard in an illustration taken on November 2, 2021 (REUTERS/Dado Ruvic/Illustration).
Facebook said it fears legal liability in the event of false reports on images (Reuters).
Facebook says it reports more child sexual abuse material than any other company, and that erring on the side of adults is aimed at protecting users' privacy and avoiding false reports that could hamper authorities' ability to investigate actual abuse.

Antigone Davis, Facebook's head of safety, said the policy also protects the company from legal liability for false reports.

Notably, other companies don't share Facebook's philosophy on the matter. Apple, Snap, and TikTok take the "opposite approach" and report photos when they're unsure of age.
