- Facebook and TikTok said they continue to view the Taliban as a terrorist organization and that content related to the group will remain banned on their platforms.
- Afghanistan fell to the Islamic militant group over the weekend, as it seized the capital, Kabul, as well as the Presidential Palace.
- Facebook said it has a team of content moderators that is monitoring and removing posts, images, videos and other content related to the Taliban.
Facebook and TikTok said Tuesday they won't lift bans on content that promotes the Taliban after the group took control of Afghanistan.
The social media giants told CNBC they consider the Afghan group, which has used social media platforms to project its messages for years, to be a terrorist organization.
Facebook said it has a dedicated team of content moderators that is monitoring and removing posts, images, videos and other content related to the Taliban. It's unclear how many people are on the team.
The Taliban's spokesman, Zabihullah Mujahid, criticized Facebook for censorship in a public press conference in the capital of Kabul on Tuesday, claiming the group's freedom of speech is being stifled by the tech giant's ban. Facebook reportedly removed several user accounts linked to Mujahid this week after they were flagged to the company by journalists at The New York Times.
Afghanistan fell to the Islamic militant group over the weekend as it seized Kabul, including the Presidential Palace. After President Joe Biden's April decision to withdraw U.S. troops from Afghanistan, the Taliban made stunning battlefield advances — and nearly the whole nation is now under the insurgents' control.
A Facebook spokesperson told CNBC: "The Taliban is sanctioned as a terrorist organization under U.S. law and we have banned them from our services under our Dangerous Organization policies."
The Taliban has been banned from Facebook for several years, the spokesperson said.
Facebook said this means it removes accounts that are maintained by or on behalf of the Taliban, as well as those that praise, support and represent them.
"We also have a dedicated team of Afghanistan experts, who are native Dari and Pashto speakers and have knowledge of local context, helping to identify and alert us to emerging issues on the platform," the Facebook spokesperson said.
Facebook said it does not decide whether it should recognize national governments. Instead, it follows the "authority of the international community."
TikTok declined to share a statement but told CNBC that it has designated the Taliban as a terrorist organization and that it continues to remove content that praises, glorifies or provides support to them.
Facebook's ban also applies to Instagram and WhatsApp, but reports suggest that the Taliban are still using WhatsApp to communicate. The chat platform is end-to-end encrypted, meaning Facebook cannot see what people are sharing on it.
"As a private messaging service, we do not have access to the contents of people's personal chats however, if we become aware that a sanctioned individual or organization may have a presence on WhatsApp we take action," a WhatsApp spokesperson told Vice on Monday.
A Facebook spokesperson told CNBC that WhatsApp uses AI software to evaluate nonencrypted group information including names, profile photos, and group descriptions to meet legal obligations.
Alphabet-owned YouTube said its community guidelines apply equally to everyone, and that it enforces its policies against the content and the context in which it's presented. The company said it allows content that provides sufficient educational, documentary, scientific and artistic context.
"The situation in Afghanistan is rapidly evolving," a Twitter spokesperson told CNBC. "We're also witnessing people in the country using Twitter to seek help and assistance. Twitter's top priority is keeping people safe, and we remain vigilant."
"We will continue to proactively enforce our rules and review content that may violate Twitter rules, specifically policies against glorification of violence, platform manipulation and spam," the spokesperson added.
Rasmus Nielsen, a professor of political communication at the University of Oxford, told CNBC it's important that social media companies act in crisis situations in a consistent manner.
"Every time someone is banned there is a risk they were only using the platform for legitimate purposes," he said.
"Given the disagreement over terms like 'terrorism' and who gets to designate individuals and groups as such, civil society groups and activists will want clarity about the nature and extent of collaboration with governments in making these decisions," Nielsen added. "And many users will seek reassurances that any technologies used for enforcement preserves their privacy."