Britain First is the organization behind three anti-Muslim tweets that President Trump retweeted. Twitter then suspended the Britain First account, and the President sort-of apologized for his actions two months later.
The organization continually posted inflammatory racist content on its own Facebook Page, as well as on the Pages of leaders Paul Golding and Jayda Fransen. The social network notes that this content violated its Community Standards, and the Page administrators had been warned multiple times about their posts. After a final written warning went ignored, Facebook removed Britain First's Page, as well as those of its two leaders.
The mayor of London, Sadiq Khan, weighed in on Facebook's actions against Britain First via a tweet, saying "Britain First is a vile and hate-fuelled group whose sole purpose is to sow division. I welcome Facebook's decision to remove their content from its platform -- their sick intentions to incite hatred within our society via social media are reprehensible." Khan called out Facebook at SXSW earlier this week for not doing enough to stem hate speech on its platform.
It's an ongoing struggle for social networks, especially as anti-minority and racist views become increasingly tolerated in our polarized society. How do you distinguish between legitimate political views and those that spew hate? "We are an open platform for all ideas and political speech goes to the heart of free expression," Facebook said in a statement. "But political views can and should be expressed without hate." This removal has been a long time coming, and Facebook still has a lot more work to do when it comes to hate speech on its platform.