Kenya has threatened to shut down Facebook over its failure to improve content moderation on hate speech in the run-up to the August 9 elections.
The National Cohesion and Integration Commission (NCIC) has given the social media giant seven days to adhere to recommendations on taming online hate speech on its platform in the country.
The state agency said Meta, Facebook’s parent company, has been reluctant to take action to combat the spread of hate speech, propaganda and disinformation, escalating the risk of violence ahead of the elections.
As such, the commission has asked Facebook to urgently increase the number of content moderators in Kenya, expand its capacity to cover content expressed even in indigenous languages, and deploy integrity systems to “mitigate risk before, during and after the upcoming Kenyan election.”
This follows an investigative report by the human rights organisation Global Witness, which revealed that Facebook approved several adverts promoting hate speech in both English and Kiswahili.
Jon Lloyd, a senior advisor at Global Witness, said that by approving the adverts, Facebook violated its own policies and community standards, since the content qualified as hate speech and ethnic-based calls to violence.
“Much of the speech was dehumanising, comparing specific tribal groups to animals and calling for rape, slaughter and beheading. We are deliberately not repeating the phrases used here as they are highly offensive,” Mr Lloyd said while presenting the findings to the NCIC.
NCIC Commissioner Dr Danvas Makori said Facebook’s inaction over the inappropriate content on its platform is an outright violation of the Kenyan Constitution and threatens the peace of the country, especially during this election period.
“The freedom of expression does not extend to propaganda, incitement to violence, hate speech, or advocacy of hatred,” he said. “Facebook violates our laws because they have allowed themselves to be a medium of hate speech, incitement, misinformation, and disinformation.”
Dr Makori said the commission has already engaged Meta’s representatives in the country and informed them of the requirements, failing which the company’s operations in the country will be suspended until it complies.
Last Wednesday, Facebook published a statement saying it is working to “ensure a safe and secure” general election in Kenya.
“We’re investing in people and technology to reduce the spread of misinformation and remove harmful content across our apps,” Mercy Ndegwa, Meta’s Director of Public Policy for East and Horn of Africa, said in the statement.
However, according to Global Witness’ report, Facebook still allowed hate speech and hateful adverts to run on the platform even after publicising its efforts to combat them.
Meanwhile, Meta is fighting a court battle with Daniel Motaung, a South African national who was employed as a content moderator for Facebook in Kenya.
Motaung’s petition, also filed against Meta’s local outsourcing partner Sama, alleges that workers moderating Facebook posts in Kenya are subjected to irregular pay, inadequate mental health support, denial of the right to join trade unions, and violations of their privacy and dignity.
Last week, a group of human rights organisations, including Global Witness and Article 19, criticised Meta, saying it was actively trying to silence Motaung.
Recently, a report by Mozilla also revealed how social media platforms, including Facebook, Twitter, and TikTok, were used to propagate disinformation, misinformation, and hate speech in Kenya during the electioneering period.
Dr Makori said Twitter and TikTok have taken quick action to curb the menace, but Facebook has been slow, even refusing to promote the commission’s peace messages while allowing inappropriate content to continue.