For years, social media platforms have fueled political polarization and hosted an explosion of hate speech. Now, with four months until the U.S. presidential election and the country’s divisions reaching a boiling point, these companies are upping their game against bigotry and threats of violence.
On Monday, Ford Motor Co. put the brakes on all national social media advertising for the next 30 days. The company says hate speech, as well as posts advocating violence and racial injustice, needs to be eradicated from the sites.
Companies such as the consumer goods giant Unilever — one of the world's largest advertisers — as well as Verizon, Honda and many other brands have joined the boycott, some for the month of July and others for the rest of the year. (Autoblog is a Verizon Media company.) New companies have been signing on to the boycott almost every day. While some are pausing ads only on Facebook, others have also stepped back from advertising on Twitter and other platforms.
What's not yet clear is whether this action is too little, too late, or whether the pressure on these companies, including a growing advertiser boycott, will be enough to produce lasting change.
Reddit, an online comment forum that is one of the world's most popular websites, on Monday banned a forum that supported President Donald Trump as part of a crackdown on hate speech. Also on Monday, live-streaming site Twitch, which is owned by Amazon, temporarily suspended Trump’s campaign account for violating its hateful conduct rules.
YouTube, meanwhile, banned several prominent white nationalist figures from its platform, including Stefan Molyneux, David Duke and Richard Spencer.
Social media companies, led by Facebook, now face a reckoning over what critics call indefensible excuses for amplifying divisions, hate and misinformation on their platforms. Civil rights groups have called on large advertisers to stop Facebook ad campaigns during July, saying the social network isn’t doing enough to curtail racist and violent content on its platform.
While the ad boycott has dinged Facebook’s and Twitter’s shares, analysts who follow the social media business don’t see it as having a lasting effect.
Raymond James analyst Aaron Kessler noted that YouTube has faced several ad boycotts in the past over hate speech and other objectionable material. Each time, it adjusted its policies and the advertisers returned. In addition, July is generally a slow month for advertising. Companies have also been cutting their ad budgets due to COVID-19, so the spending declines are not a surprise for investors. Kessler called Facebook's stock pullback — its shares fell more than 8% on Friday, then rallied a bit Monday — a “buying opportunity.”
Reddit's action was part of a larger purge at the San Francisco-based site. The company said it took down a total of 2,000 forums, known on the site as “subreddits,” most of which it said were inactive or had few users.
The Trump Reddit forum, called The_Donald, was banned because it encouraged violence, regularly broke other Reddit rules, and defiantly “antagonized” both Reddit and other forums, the company said in a statement. Reddit had previously tried to discipline the forum.
“We are cautiously optimistic that Reddit is finally working with groups like ours to dismantle the systems that enable hateful rhetoric on their platform," Bridget Todd, a spokeswoman for the women's advocacy organization UltraViolet, said in an emailed statement.
The group said its members met with Reddit CEO Steve Huffman via Zoom last week, encouraging him to address racism and hate speech on the platform.
Despite optimism from some critics, others said it is not clear whether such measures will be enough. For years, racist groups “have successfully used social media to amplify their message and gain new recruits," said Sophie Bjork-James, an anthropology professor at Vanderbilt University who specializes in white nationalism, racism and hate crimes.
“However, limiting access to a broader public will have unintended negative consequences. Far-right and white nationalist groups are increasingly gathering on encrypted apps and social media sites that do not monitor for offensive speech or violent content," she added. “This shift allows for coordinating more violent and radical actions."
The algorithms tech companies developed to keep users glued to their services “have provided perhaps the biggest boon to organized racism in decades, as they help racist ideas find a much larger and potentially receptive audience," Bjork-James said, adding that she is hopeful that the same companies that “helped this anti-democratic movement expand" can now help limit its impact.