Facebook said it would assess the feasibility of commissioning an independent human rights assessment of its work in Ethiopia, after the board recommended reviewing how Facebook and Instagram have spread content that heightens the risk of violence there.
Meta has been under pressure from lawmakers over user safety and its handling of abuses on its platforms around the world, particularly after whistleblower Frances Haugen revealed internal documents showing the company’s struggles to police content in countries where such speech was most likely to cause harm, including Ethiopia.
Facebook said it has “invested significant resources in Ethiopia to identify and remove potentially harmful content” as part of its response to the board’s recommendations in a case involving content posted in the country.
The board also recommended that Meta rewrite its value statement on safety to reflect that online speech can pose a risk to the physical security of persons and their right to life. The company said it would make changes to this value, partially implementing the recommendation.