Facebook and Big Tech Are Facing Their ‘Big Tobacco’ Moment

In 1996, the popular and well-respected U.S. television news program “60 Minutes” aired a whistleblower’s devastating account of corporate malfeasance at America’s third-largest tobacco company. At the time, an estimated 25 percent of Americans smoked cigarettes, and the idea that smoking could be linked to cancer and heart disease or produce birth defects was still a matter of public debate. That changed after Jeffrey Wigand, a biochemist who was hired to oversee the science of making cigarettes more marketable at the Brown and Williamson Tobacco Corporation, told “60 Minutes” that the tobacco company, which he had left in 1993, was lying to the public and to Congress about the harmful effects of its products.

Big Tobacco companies knew, Wigand said then, that there was a problem, because their own research told them so, but they suppressed evidence about how cigarettes hurt public health. It was not until 1998—five years after Wigand left Brown and Williamson and two years after he had exposed its misdeeds—that 46 state attorneys general forced the company and three other Big Tobacco majors to pay out a multibillion-dollar settlement to cover Medicaid costs associated with illnesses linked to cigarette smoking.

That was about the same time that another future American whistleblower, Frances Haugen, was just starting high school in Iowa City. Now Haugen, a former Facebook data scientist who this week openly accused the social media company of misleading the public in complaints filed against the Silicon Valley giant with the U.S. Securities and Exchange Commission, may go down in history alongside Wigand as one of the most consequential whistleblowers in American history.

Wigand and Haugen may be a generation apart, but their paths could not be more alike. And after Haugen’s own interview on “60 Minutes” and testimony in Congress this week about Facebook’s corporate malfeasance, her impact on the regulation of Big Tech in the 21st century will likely be no less significant. But what really sets Haugen apart from the long line of whistleblowers who have gone before her is that she has a well-thought-out strategy for waging what could be a long, grinding war against one of the world’s biggest rogue tech companies.

When Haugen left Facebook in May after working there for two years, she secretly copied thousands of pages of damning evidence about the gap between what Facebook knows about the harm its products cause and what it tells the world. Now known as the “Facebook Files,” the cache of documents Haugen shared with The Wall Street Journal last month confirmed what many have suspected for years: Facebook is lying to the world about making progress on containing violent content, hate speech and misinformation on its platform. The fact that Haugen’s revelations coincided with a worldwide outage of Facebook, Instagram and WhatsApp that left the company’s billions of users disconnected for hours this week only underscores the magnitude of her claims by highlighting the virtual monopoly Facebook exercises.

Haugen’s trove of documents shows that Facebook has known for years, for instance, about the psychological damage Instagram causes to young women and girls struggling with body image issues. Then there is Facebook’s so-called whitelist, the index of influential VIPs exempted from the company’s fact-checking and content moderation rules because their online and offline behavior generates oceans of clicks and—presumably more importantly—ad revenues for the trillion-dollar tech company. The leaked documents show that Facebook acknowledged the whitelist policy undermines its own fairness standards and elevates the company’s legal and compliance risks.


Haugen and her lawyer, John Tye, clearly learned and applied a few important lessons from whistleblowers past. Haugen’s rollout of the damning evidence against Facebook has been flawless. Her credentials as a computer scientist, knowledge of the tech industry, exceptionally polished and poised demeanor, and empathetic tone when testifying on Capitol Hill also give her a lot of credibility. There are risks, though, with Haugen’s strategy, the biggest one being that amid the sheer volume of evidence she has provided Congress, the SEC and other federal authorities, they will miss the forest for the trees. So, it’s worth stating up front some of the most important but perhaps least obvious takeaways, beyond the fact that Facebook is not the socially conscious company it makes itself out to be. 

Content moderation on social media platforms is the least of our problems. Far more important are the algorithms that drive platform users into isolated caverns reflecting their unhealthy anxieties and obsessions back to them—sometimes exponentially. Right now, those algorithms are like airplane black boxes that haven’t been retrieved yet from the bottom of the ocean after a crash. Just as we need to be able to look at flight recorders after every failed flight to understand what went wrong, we need to be able to evaluate platform algorithms. It is these automated interventions, and not necessarily the messages contained in posts, that are pushing social media users into “funhouse mirror” corners of platforms that distort their sense of self. 

Facebook and many other social media platform providers want the public and lawmakers to stay focused on content moderation as the big issue. This grows out of the first takeaway, because social media platforms do not want regulation that forces more transparency on their algorithms and policies. This is why Facebook is seemingly much more amenable to taking down malign content after the fact, and eager to publicly tout these efforts, but much more defensive about claims made by the company’s own researchers that Facebook does not have the capacity or the will to proactively mop up all the hate speech and violence on its platform.

The Facebook Oversight Board is a total sham and should be shut down. With all due respect to the esteemed, accomplished and undoubtedly sincere experts on the Facebook Oversight Board, the body, which was created to monitor compliance with the company’s opaque content moderation policies, was set up to fail from the start. Haugen’s SEC complaints against Facebook show that the company has spent millions to entertain the world with the Kabuki theater of quasi-judicial, impartial rulings on appeals from errant users, instead of giving the board the power to audit the company’s algorithms. Either the board should be dissolved wholesale, or Facebook should transfer the board foundation’s $140 million in assets into a public trust, run much like the nonprofit Corporation for Public Broadcasting, that would be responsible for conducting real oversight of algorithms as well as company policies and practices.

There will be many more takeaways to come as the SEC weighs and responds to the complaints filed against Facebook this week. What matters now, however, is that lawmakers and the rest of the Big Tech industry start incorporating these top-line insights into their thinking on next steps. If they don’t, they’ll be discredited in much the same way that Big Tobacco was 25 years ago, but not before having caused untold damage to the health of their users—and the countries they operate in.

Candace Rondeaux is a senior fellow and professor of practice at the Center on the Future of War, a joint initiative of New America and Arizona State University. Her WPR column appears every Friday.
