Recent events in Charlottesville, Virginia have prompted responses from a number of companies and government groups. Joining that list is Facebook, which recently condemned white supremacist groups and pledged to remove any threats of physical harm made using Facebook.
“There is no place for hate in our community. That’s why we’ve always taken down any post that promotes or celebrates hate crimes or acts of terrorism—including what happened in Charlottesville,” said Facebook CEO Mark Zuckerberg in a post on his Facebook page. “With the potential for more rallies, we’re watching the situation closely. We won’t always be perfect, but you have my commitment that we’ll keep working to make Facebook a place where everyone can feel safe.”
Zuckerberg Takes Action
After the deadly rallies in Charlottesville, Facebook removed pages belonging to a number of white supremacist groups including White Nationalists United and Right Wing Death Squad. The groups had previously been using Facebook to spread their messages, recruit new members, and organise events. In the emotion-filled days following the rallies, Zuckerberg acknowledged that it was a delicate process to monitor Facebook’s content but said the site would still be vigilant in protecting its users from hateful speech.
“It’s important that Facebook is a place where people with different views can share their ideas. Debate is part of a healthy society,” he wrote, while also making clear that Facebook won’t allow users to post everything they believe, especially when those beliefs could lead to people being hurt or killed. The site faces the difficult balance of protecting free speech without giving a platform to racism and hate. With 20,000 employees monitoring the posts of more than 2 billion users, some things are bound to slip through the cracks, but Facebook says it will put forward its best effort.
Zuckerberg condemned the attacks and asked Facebook users around the world to focus on love instead of hate. He wants Facebook to be a place where people can stand up against intolerance and hate instead of a place that harbours racism and dangerous thoughts.
Additional Tech Companies Stepping In
Social media has played a large role in organising hate groups and terrorists in recent years because it allows people from around the world to communicate and connect easily. As the world’s largest social network, Facebook is often thrust into the middle of public events. In the past, the company has been criticised for being too slow to respond to issues and remove hate speech. However, some organisations that monitor hate groups acknowledge that Facebook is better than many other tech companies at removing racist content, and it has recently increased the speed at which it responds to complaints and removes questionable content.
Other tech companies have also stepped up their removal of white supremacist content. In the wake of the events in Charlottesville, GoDaddy and Google cancelled the domain registration of a neo-Nazi website, and PayPal cut off service to white supremacist groups.
Facebook has a lot of work to do to keep its pledge to cut off racism and white supremacists from its site, but its actions could be a step in the right direction to make the world a less hateful place.