Issue with Facebook

Facebook’s Oversight Board on Wednesday upheld the social network’s ban on former president Trump for encouraging violence following the Jan. 6 attack on the Capitol, a decision that holds major implications for how the speech of political leaders should be policed online.

But the 20-member Oversight Board, which is largely independent and funded by the social network, also left open the door for Trump’s return. The expert panel took issue with Facebook’s “indefinite” suspension of Trump, calling it “vague and uncertain.” It sent the decision back to Facebook and said it had six months to clarify Trump’s punishment and come up with a response that fits its known rules.

In the ruling, the board agreed that Trump’s comments on the day of the insurrection “created an environment where a serious risk of violence was possible.” The board pointed to the former president calling the mob members “patriots,” “special,” and telling them to “remember this day forever.”

The board recommended that Facebook publish a report explaining its own role in fomenting the Jan. 6 attack.

The ruling opens a new chapter in the global debate over the power of social media giants. Critics are already calling into question the legitimacy and value of the Oversight Board, which was set up by Facebook to help hold it accountable in making such calls.

The board’s decision provoked swift responses from political leaders, advocates, and experts around the world. Many said that the decision merely kicked the can back to the social network and did not provide clear guidance for the treatment of boundary-pushing politicians.

“The practical effect of this decision will be that Facebook — and possibly other platforms that might have been watching the Oversight Board for unofficial guidance — will have to continue to grapple themselves with the problem of what to do about political leaders who abuse social media to spread lies and incite violence,” wrote Paul Barrett, deputy director of the NYU Stern Center for Business and Human Rights, in a statement.

Facebook currently exempts political figures from some hate speech rules on the grounds that those comments are newsworthy. The board took issue with that exemption, noting that “it is not always useful to draw a firm distinction between political leaders and other influential users,” and that such users have greater power than others to cause harm.

In its response to the decision, Facebook emphasized that Trump would remain off the social network for the time being, following the board’s order.

“We will now consider the board’s decision and determine an action that is clear and proportionate,” Nick Clegg, Facebook’s vice president of global affairs and communication, said in a blog post Wednesday. “In the meantime, Mr. Trump’s accounts remain suspended.”

Trump responded in a statement, saying that Facebook, Twitter and Google had embarrassed the United States.

“Free Speech has been taken away from the President of the United States because the Radical Left Lunatics are afraid of the truth, but the truth will come out anyway, bigger and stronger than ever before,” Trump said. “The People of our Country will not stand for it! These corrupt social media companies must pay a political price, and must never again be allowed to destroy and decimate our Electoral Process.”

Trump’s allies swiftly condemned the decision and Facebook’s vast power over public expression. Conservatives are already escalating their calls for antitrust action targeting the tech giant following the decision.

“Facebook’s status as a monopoly has led its leaders to believe it can silence and censor Americans’ speech with no repercussions,” said Rep. Ken Buck (R-Colo.), the top Republican on the House Judiciary antitrust subcommittee. “Now more than ever we need aggressive antitrust reform to break up Facebook’s monopoly.”

Critics have argued that Facebook should have banned Trump at different points throughout his presidency, saying that his inflammatory language and frequent promotion of misinformation — about the coronavirus in particular — constituted an abuse of his office and of Facebook’s own community standards. But chief executive Mark Zuckerberg felt strongly that politicians should be given wide latitude because their speech was in the public interest.

The last straw came on Jan. 6, when Trump’s comments on Twitter appeared to encourage the Capitol insurrection, Zuckerberg said, and the company said it would suspend him indefinitely.

Facebook referred its decision about Trump to the Oversight Board shortly afterward. The board, which is less than a year old and had yet to decide a case at the time, was first conceived by Zuckerberg in 2018 as a way to outsource the thorniest content moderation decisions without having the government intervene.

Over the past few months, members spanning time zones from Taiwan to San Francisco connected on videoconference calls to pore over more than 9,000 public comments on the matter, including from Trump himself, according to the board.

In a letter submitted on Trump’s behalf asking the board to reconsider the suspension, his allies said it was “inconceivable that either of those two posts can be viewed as a threat to public safety, or an incitement to violence.” The letter also claimed that all “genuine” Trump supporters at the Capitol that day were law-abiding, and that “outside forces” were involved.

In its decision, the board faulted Facebook for making “arbitrary” decisions on the fly, and said that the company had no published criteria for suspending a user indefinitely. Facebook’s normal penalties are removing a comment, a time-limited suspension or disabling the user’s account permanently, the board said.

“The Board has upheld Facebook’s decision on January 7, 2021, to restrict then-President Donald Trump’s access to posting content on his Facebook page and Instagram account,” the board wrote. “However, it was not appropriate for Facebook to impose an ‘indefinite’ suspension. It is not permissible for Facebook to keep a user off the platform for an undefined period, with no criteria for when or whether the account will be restored.”

Facebook has called the board an “experiment” in the policing of political speech online, and experts say that Wednesday’s decision was the first major test of a new system Facebook put in place to essentially front-run the possibility of the government stepping in to play that role.

“This doesn’t begin and end with Donald Trump,” said Nathaniel Persily, a Stanford Law School professor. “They’ve got all kinds of elections coming up around the world.”

If the board is viewed as a success, it could also become a template for new laws governing social media companies, said Rep. Ro Khanna (D-Calif.). He said that Congress should consider requiring social media companies of a certain size to have their own independent board focused on these decisions.

“You could see Congress requiring that kind of regulation in social media companies, recognizing that there is a public dimension to the digital town halls that they’ve created,” he said.

Under U.S. law, social media platforms are not held legally responsible for policing unwanted or even much illegal content on their services, with some exceptions for copyright issues and child pornography. But in recent years, Silicon Valley has dealt with a series of crises over enabling disinformation and the spread of extremism from both domestic and international forces, and the blowback has forced the companies to invest significantly in content moderation. That investment picked up in 2020, when the companies launched stronger policies aimed at combating misinformation surrounding the election and the coronavirus.

The crises have also led to new regulatory scrutiny around the world — especially in Washington, where Democrats have promised to use their new powers to update existing antitrust laws, crack down on misinformation and pass federal privacy legislation. The Oversight Board’s decision comes as Facebook is the target of a landmark Federal Trade Commission lawsuit, which focuses on the company’s practice of buying up rivals.

Zuckerberg and Facebook executives publicly floated the idea of creating the Oversight Board in 2018 as lawmakers around the world mulled new ways to regulate Facebook. The company faced broad criticism that it lacked accountability for content decisions that had wide-reaching social consequences, and that there were no checks on Zuckerberg’s power to determine what could be said on a service that had become the public square for billions of people.

“You can imagine some sort of structure, almost like a Supreme Court, that is made up of independent folks who don’t work for Facebook, who ultimately make the final judgment call on what should be acceptable speech in a community that reflects the social norms and values of people all around the world,” Zuckerberg told Vox in a 2018 interview.

Facebook then embarked on a months-long process collecting feedback on how to design the board and consulting more than 2,000 people in 88 countries. It released the rules and selected its first members in 2020. The board was a lightning rod for controversy during its formation, as Facebook’s critics warned its authority was too limited and that the company’s role in picking board members compromised its independence.

The board issued its first decisions in late January, a week after Facebook announced it would refer the high-profile Trump case. The initial round of decisions — which touched on alleged hate speech, coronavirus misinformation and references to dangerous organizations — signaled that the board would demand greater clarity from Facebook about its policies, as well as transparency. Before Wednesday’s decision, the board had overturned Facebook’s decisions six times, upheld them twice, and was unable to complete a ruling once. (Washington Post)
