Facebook on Sunday announced steps to combat hate speech, harassment, and voter suppression on its platform. The changes came as part of an ongoing internal audit into the tech giant’s handling of civil rights and discrimination.
The audit, which got underway last May, has already resulted in numerous changes to the platform. The process is being run by Laura Murphy, former director of the ACLU's Washington legislative office, and has led Facebook to adjust its policies, including banning certain expressions of extremism and combating the most overt voter suppression tactics in the 2018 midterm elections.
On Sunday, Facebook released a report by Murphy that details the latest changes and includes recommendations for the road ahead. The full audit is expected to be completed next year.
In an interview with Mother Jones Friday, Murphy pointed to new updates in hate speech moderation and anti-voter suppression tactics, as well as major changes to its advertising platform that were already announced in March after Facebook settled multiple lawsuits over discrimination in advertising.
Sunday’s report also announces a new Facebook effort to fight disinformation around the 2020 census. The platform could play a pivotal role in the decennial count, which determines political representation and the allocation of federal funds, by either encouraging participation or allowing bad actors to discourage people from being counted. But, Murphy stressed, the audit remains a work in progress, a point echoed by multiple civil rights leaders involved in it.
“All of these are steps in the right direction and they required a tremendous amount of leadership from the civil rights community,” said Malkia Devich-Cyril, executive director of the racial justice group Media Justice. But, she added, the company will need to invest in long-term structural changes to make sure civil rights remain a priority at Facebook. “While these short-term investments are critical, without long-term investments, these changes go to waste.”
In the area of content moderation, the report highlights efforts to remove extremist content as well as new steps to correct a common mistake of removing posts that condemn hate speech. But Murphy’s report also points to a hole in Facebook’s hate speech enforcement that she is urging the company to close. Facebook’s prohibition on hate speech has a carveout for humor. The problem, the report states, is that “what qualifies as humor is not standard, objective, or clearly defined; what one user (or content reviewer) may find humorous may be perceived as a hateful, personal attack by another. Nor does labeling something as a joke or accompanying it with an emoji make it any less offensive or harmful.” This, Murphy warns, “runs the risk of allowing the exception to swallow the rule.”
Murphy told Mother Jones that Facebook has had trouble drawing a line between what is and isn’t humor, but said she expects progress on this front.
Another major area of change involves policies around voting rights and election interference. In 2016, Russia used Facebook and Instagram to suppress voter turnout among people of color. Facebook already took a number of steps ahead of the 2018 midterms to ban content that provides false information about how to cast a ballot. Sunday’s report announces a new policy banning ads that discourage people from voting.
Despite the audit, advocates continue to have reservations about Facebook’s commitment to civil rights. Over the past several years the company has compiled a record of promising changes that turn out to be toothless or incomplete. In March, for example, Facebook announced that it would ban “praise, support and representation of white nationalism and separatism on Facebook and Instagram.” This came months after the company agreed to ban expressions of white supremacy, even though the toxic ideologies are clearly connected, and appeared prompted at least in part by internal leaks that caused bad publicity.
Even Facebook’s decision to block white nationalist content came with an asterisk: Users could get around these new restrictions if they avoided using the terms “white nationalism” and “white supremacy.” In her report, Murphy urges Facebook to go further. “The narrow scope of the policy leaves up content that expressly espouses white nationalist ideology without using the term ‘white nationalist,’” Murphy writes. “As a result, content that would cause the same harm is permitted to remain on the platform.”
When Murphy took on the civil rights audit last year, Facebook faced a multitude of discrimination problems on its site. After 15 years of unregulated growth, the platform of more than 2 billion users had amplified the worst elements of the internet—hate speech and harassment, extremism, and voter suppression. Ads for housing, credit, and jobs could be targeted at white people to the exclusion of minorities in violation of federal civil rights laws.
For years, Facebook largely ignored the complaints of civil rights groups that tried to work with Facebook to address the issues, as Mother Jones chronicled in April. When multiple lawsuits, beginning in 2016, targeted ad discrimination, Facebook denied any harm to its users. (Facebook ultimately settled multiple lawsuits in March of this year.) Finally, a cascade of public relations crises forced Facebook’s hand.
In August 2017, the Charlottesville rallies shone a spotlight on the ways hate groups were operating unencumbered on many social media platforms, and prompted a crackdown on the most extreme users on Facebook and elsewhere. From the fall of 2017 until February 2018, a constant dribble of news revealed the extent of Russia’s activity on Facebook and Instagram to suppress minority turnout and sow chaos in the 2016 elections, including Special Counsel Robert Mueller’s indictment of 13 Russians and three Russian entities. The following month, Facebook faced a new scandal when news broke that Cambridge Analytica, a political consulting firm, had illicitly harvested the personal data of tens of millions of Facebook users to help Trump win. The crisis forced CEO Mark Zuckerberg, for the first time, to testify before Congress about his company’s actions. There, in April 2018, Sen. Cory Booker (D-NJ) got him to agree to a civil rights audit.
For activists, the promise of the audit was to illuminate, both for the public and for Facebook’s own leaders, the ways in which Facebook’s internal rules, policy enforcement, and products were harming minorities. As Mother Jones reported in April, after years of meetings with Facebook executives went nowhere, dozens of advocates came to believe that the company needed outside help to come to grips with how it was affecting minority communities in the US, not to mention its role in helping to ignite violence around the world.
Murphy began interviewing representatives of 90 civil rights organizations last summer. She divided her work into broad categories: policies around hate speech, harassment, and content moderation; ad discrimination; and voter suppression. She also acknowledged the need for more diversity and for new processes and teams inside the company to advocate for civil rights in the development of new products and policies.
An initial update in December was light on specifics, and civil rights advocates were unimpressed. This audit update announces more concrete changes and highlights areas in need of continued improvement. Over the past few months, according to Rashad Robinson, the president of Color of Change, a civil rights group deeply involved in the broader effort to reform the platform, Facebook’s attitude has changed. The company is more responsive. It’s spending more resources. And it’s responding to activists more constructively. “I do think that Facebook is taking it seriously,” he said.
This spring, Murphy told Mother Jones that the audit’s success will ultimately be determined by how it holds up after she completes her work. “I’m very much into structural change and accountability mechanisms that outlast the audit,” she said at the time. “I don’t see how this can be a serious, company-wide effort, unless structures are in place that outlast the timeframe of the audit.”
To that end, Sunday’s report announces the creation of an internal Civil Rights Task Force led by Chief Operating Officer Sheryl Sandberg and filled with division leaders from across the company. The goal is to create a body dedicated to civil rights that will field concerns from inside and outside the company and have the authority to make decisions. Whereas advocates have for years seen their complaints to Facebook go nowhere, the task force will theoretically function as a council that can take action. But the task force still lacks some obvious elements of meaningful, ongoing enforcement, including civil rights expertise and full-time dedication to the subject.
The task force is “a beginning, and it’s an important beginning” for structural change, Murphy told Mother Jones on Friday. “It establishes that civil rights is important to every area of the company. It establishes that the senior-most executives have to be engaged.” But, she said, “Does it go as far as I would like it to go? No. But I don’t think this part of the audit represents the end point in that discussion either.”
Civil rights advocates agree more should be done structurally. According to Robinson, who has had multiple conversations with Sandberg, her attention to the civil rights audit over the last few months has meant more progress and more resources for the effort. He finds her leadership of the task force encouraging.
“Sheryl Sandberg has been an extraordinary advocate,” said Devich-Cyril, who believes the company should hire a full-time civil rights expert to ensure the company’s dedication. “I don’t think that a lot of the things that we have asked for would have happened without her leadership.” But, she cautioned, “She can’t do that alone. There needs to be real civil rights expertise brought into the company from outside that can take this the rest of the way forward.”