Almost 150 German Facebook accounts and pages linked to anti-lockdown demonstrations have been shut down by the social media company.
They fell foul of new rules focused on groups that spread misinformation or incite violence.
It is part of a new crackdown on co-ordinated campaigns by real users that cause harm on and off its platforms.
The accounts on Facebook and Instagram spread content linked to the so-called Querdenken movement, a disparate group that has protested against lockdown measures in Germany and includes vaccine and mask opponents, conspiracy theorists and some far-right extremists.
Posts from the accounts included one making the debunked claim that vaccines create viral variants and another that wished death upon police officers who broke up violent anti-lockdown protests in Berlin.
The action is the first under Facebook’s new policy focused on preventing “coordinated social harm,” which company officials said is an attempt to address content from social media users who work together to spread harmful content and evade platform rules.
Under its long-standing guidelines, Facebook has removed accounts that use false personas or spread hate speech or make threats of violence. The new policy is intended to catch groups that work together in an attempt to get around the rules, while still spreading harmful content.
In the case of the Querdenken network, Facebook said multiple account holders used both individual and duplicate accounts to spread content that violated Facebook’s rules on Covid-19 misinformation, hate speech, bullying and incitement of violence.
It was that co-ordinated effort to deceive, along with the harmful content and a history of past violations, that prompted Facebook’s action, according to Nathaniel Gleicher, the company's head of security policy.
“Simply sharing a belief or affinity with a particular movement or group wouldn’t be enough” to warrant a similar response, he said.
Mr Gleicher said these campaigns typically involve tightly organised networks of real users who systematically violate Facebook's policies to cause harm, including away from the platform, but do not break the company's rules on influence operations using fake accounts or on dangerous organisations.
The changes could have major significance for how the world's largest social media network handles organised political and social movements on its sites.
In a recent report on influence operations, the company said a key trend it saw was the blurring between authentic public debate and manipulation by foreign and domestic campaigns.