A pair of verdicts held social media companies accountable for harming young users, highlighting a growing backlash as Congress struggles to pass legislation.
|
Meta didn't consult its Oversight Board last year when it announced sweeping policy changes to content moderation and a rollback of third-party fact checking in the United States in favor of Community Notes. But the company did ask the board for advice on how to expand the crowd-sourced fact checks to other countries.
Now the Oversight Board is publishing its advice to Meta. In a 15,000-word policy advisory opinion, the group urged Meta to be cautious with an international rollout, warning that an expansion of the program could "pose significant human rights risks and contribute to tangible harms" if safeguards are not put in place.
The board, notably, was asked to weigh in on a fairly narrow set of questions, including how Meta should evaluate whether to withhold the feature in certain countries. Meta "respectfully" asked the Oversight Board to avoid "general" critiques of the system, which the company has said is modeled after X's.
In its opinion, the Oversight Board said that Community Notes "could enhance users' freedom of expression and improve online discourse" with enough safeguards. But it recommended that Meta withhold the feature in countries with "high polarization," as well as countries in the midst of a crisis or "protracted conflict." The board also said that Meta should avoid countries with a history of organized disinformation networks, because the notes may be more easily manipulated in such places, and countries with "linguistic complexity" that Meta may be ill-equipped to understand.
Depending on how you interpret that advice, it could exclude quite a few countries, though the board stopped short of making country-specific recommendations. Still, the opinion raises questions about how closely Meta will follow the suggested guidelines. For example, the United
|
The UK government has conceded one of the more controversial parts of its Online Safety Bill, stating that the powers granted by the legislation will not be used to scan encrypted messaging apps for harmful content until it can be done in a targeted manner.
Companies will not be required to scan encrypted messages until it is "technically feasible and where technology has been accredited as meeting minimum standards of accuracy in detecting only child sexual abuse and exploitation content," said Stephen Parkinson, the Parliamentary Under-Secretary of State for Arts and Heritage, in a planned statement during the bill's third reading in the House of Lords on Wednesday afternoon.