Facebook de-platforms 4 Myanmar armed groups. Military behind Rohingya abuse didn't like them either.

Is Facebook following government orders in Myanmar?

Facebook said in a blog post today that it has banned four armed groups in Myanmar (Burma) and promised to continue removing all "related praise, support and representation."

Facebook is under heavy pressure to do the right thing in the Southeast Asian country, where the platform has been used to incite ethnic tension and violence.

Here is the entire Facebook post, links removed:

Over the past year, we have repeatedly taken action against violent actors and bad content on Facebook in Myanmar. The ethnic violence happening in Myanmar is horrific and we don't want our services to be used to spread hate, incite violence or fuel tension on the ground.

Our approach to this problem, like the problem itself, is multifaceted, but our purpose is clear: to reduce the likelihood that Facebook will be used to facilitate offline harm. Our tactics include identifying and removing fake accounts; finding and removing violent actors; building better tools and technology that allows us to proactively find bad content; evolving our policies; and continuing to build partnerships and programs on the ground. We have shared regular updates on this work. Since last August, we've taken down three networks who were misrepresenting who they were and what they were doing, banned Myanmar military officials, given an update on the steps we're taking to prevent the spread of hate and misinformation and released the findings of the Human Rights Impact Assessment on the role of our services in the country.

Today, we are taking more action, designating four more groups in Myanmar as dangerous organizations – the Arakan Army, the Myanmar National Democratic Alliance Army, Kachin Independence Army and the Ta'ang National Liberation Army. These armed groups are now banned from Facebook and all related praise, support and representation will be removed as soon as we become aware of it.

In an effort to prevent and disrupt offline harm, we do not allow organizations or individuals that proclaim a violent mission or engage in violence to have a presence on Facebook. This includes terrorist activity, organized hate, mass or serial murder, human trafficking, organized violence or criminal activity. There is clear evidence that these organizations have been responsible for attacks against civilians and have engaged in violence in Myanmar, and we want to prevent them from using our services to further inflame tensions on the ground.

We don't want anyone to use Facebook to incite or promote violence, no matter who they are. That's why we are always evaluating and analyzing our policies around violence committed by state and non-state actors. We recognize that the sources of ethnic violence in Myanmar are incredibly complex and cannot be resolved by a social media company, but we also want to do the best we can to limit incitement and hate that furthers an already deadly conflict.

Analysis from Reuters, a TechCrunch post below, and tweeted observations from voices out of Myanmar and a number of well-informed observers who are skeptical or critical of today's move by Facebook.

Facebook removes hundreds of accounts linked to fake news group in Indonesia