Facebook Decency Rating, 2021: 3/20

Total: 3/20
Privacy: 0/5
Trust: 1/5
Fairness: 1/5
Impact: 1/5

Of the companies we studied, none scored worse than Facebook. Their startling three out of twenty gives them an Evil rating. That’s not a rating I hand out lightly, but Facebook’s track record does not fare well under even the slightest scrutiny.

Time and time again, Facebook has shown no respect for users’ privacy and no responsibility as a caretaker of personal information. They hold a monopoly over social networking, having unfairly consolidated companies such as Instagram and WhatsApp in the 2010s. And their impact on the world has been undeniably for the worse: enabling the spread of misinformation, bolstering extremist politics, and contributing to ethnic violence around the world.

But before we go further, let’s talk about Facebook’s revenue model and the corporate ideology that guides their decision-making.

For more information, click here to read the introduction to our Tech Decency Ratings series.

Facebook’s Revenue Model

Facebook is an advertising company. Of the $86 billion in revenue they made in 2020, $84.2 billion came from advertising, with $1.8 billion coming from other sources. Facebook does not break down its revenue distribution in greater depth, but the core business is clear enough: 98% advertising.
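As a quick sanity check on those figures, here is a minimal sketch that recomputes advertising’s share of revenue from the two numbers cited above; nothing else in it comes from Facebook’s reporting.

```python
# Facebook's 2020 revenue mix, using the figures cited above.
# All values in billions of US dollars; illustrative only.
total_revenue = 86.0
ad_revenue = 84.2
other_revenue = 1.8

print(f"Advertising: {ad_revenue / total_revenue:.1%} of revenue")    # roughly 97.9%, i.e. ~98%
print(f"Other:       {other_revenue / total_revenue:.1%} of revenue")  # roughly 2.1%
```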

The biggest segment of that advertising revenue likely comes from personalized advertisements on Facebook itself. From its inception, Facebook promised to offer a completely free service to users. Rather than charge for subscriptions, Facebook sells advertisers access to – and information about – its users.

This model has done well for them – $84 billion a year isn’t bad by any measure, amounting to roughly one fifth of all digital advertising. They’re outmatched only by Google; together, the two companies account for roughly 60% of all digital advertising.

Notably, Facebook’s revenue falls well short of its rivals in big tech. $86 billion is considerable, but it is less than half the revenue of Google, their closest competitor, roughly a quarter of Amazon’s annual revenue, and roughly a third of Apple’s. This position puts some pressure on Facebook, to the extent that it competes with its big tech rivals.

Facebook’s Corporate Ideology

Facebook’s stated mission is “to give people the power to build community and bring the world closer together”. It’s not hard to see how their core social media platforms do so. Bringing people closer together is admittedly a broad mission, but that’s exactly what Facebook does.

Facebook also lists five values on its website:

  • Be bold
  • Focus on impact
  • Move fast
  • Be open
  • Build social value

Not quite absent from that list is Facebook’s most famous motto: “Move fast and break things”. Facebook officially retired the mantra in 2014, but the mentality lives on, as their culture page makes clear:

We believe that it’s better to move fast and make mistakes than to move slowly and miss opportunities. Doing so enables us to build more things and learn faster.

While there is certainly merit in learning from your mistakes, Facebook takes this to an extreme. Facebook views mistakes as the natural byproduct of growth: the bigger the mistake you make, the better you’re doing. And Facebook’s trail of breaches and scandals would seem to show this value in effect.

But if Facebook really were learning from their mistakes, they would have presumably found a way to keep users’ data secure by now – unless they’re learning the wrong lessons. If you value “moving fast and making mistakes”, security measures and privacy controls that get in the way and prevent mistakes are, by their nature, opposed to your values. If a privacy review gets in the way of moving fast and innovating, that review would likely end up disregarded or dismantled.

Privacy: 0/5

Facebook’s record on Privacy has not been stellar.

The Cambridge Analytica scandal, in which a small company looted personal information from over 85 million Facebook users, remains their highest profile incident. This has sometimes been called a data breach, but it can’t quite be described as such: Facebook let it happen.

To recap, Cambridge Analytica acquired this data when roughly 300,000 Facebook users took an online personality quiz in the mid-2010s. In so doing, they gave up not only access to their own data, but to their friends’ data as well.

Facebook allowed this to happen. They knew it was happening as far back as 2015, but did not take action against Cambridge Analytica until damning media reports in 2018.

After the Cambridge Analytica scandal broke, Mark Zuckerberg apologized and Facebook repeatedly promised to do better. But the results have not been so promising. The company saw major data breaches in 2018 and 2019, leaving hundreds of millions of users exposed.

In June 2018, they also “accidentally” switched users’ posts from private to public – all while promising to do more to protect users’ privacy. This echoed a 2010 controversy, in which Facebook switched its 400 million user profiles from private to public.

In December 2018, scandal struck again when the New York Times uncovered how widely Facebook was sharing and selling user data without users’ permission. Even after Facebook promised the FTC it would not share user data without explicit permission, Facebook continued to sell users’ information to over 150 companies. Companies such as Netflix and Spotify could even read users’ “private” messages.

After lying to their users and the FTC, Facebook remained unapologetic. Facebook’s director of privacy, Steve Satterfield, justified this behavior and claimed users’ privacy had not been violated because the companies were asked to abide by Facebook’s policies.

Time and time again, Facebook has made public statements promising to take privacy seriously. But these can be hard to distinguish from PR, and hard to take seriously alongside their record.

More recently, a leaked internal memo pointed to a new direction for privacy at Facebook. In the memo, titled “The Big Shift”, Facebook VP Andrew Bosworth outlines the change:

Instead of imagining a product and trimming it down to fit modern standards of data privacy and security we are going to invert our process. We will start with the assumption that we can’t collect, use, or store any data. The burden is on us to demonstrate why certain data is truly required for the product to work.

If followed, this could lead to a sea change at Facebook. But that’s a pretty big “if”, and in the memo, Bosworth acknowledges that most of the work still lies ahead:

Local teams haven’t yet internalized the magnitude of the shift we are undergoing. The next step is for the priority of privacy to permeate the entirety of our culture, we’ve made inroads here but we have a long ways to go. . . The tools will only be effective insofar as we stop fighting them at the cultural level . . . We are on a long road to redemption.

If this effort is followed through, it is not impossible for Facebook to turn things around on privacy. But their business model, reliant on personalized ads, makes this difficult: at every turn, privacy will come into conflict with Facebook’s bottom line.

For now, we can only judge Facebook on what they’ve done, not what they promise in years to come. On that score, Facebook rates zero out of five on Privacy.

Trust: 1/5

Trust and privacy are closely connected. The repeated failures that undermine Facebook on privacy drag them down on trust as well. Unlike Google and Amazon, which harvest users’ personal data but at least take fairly good care of it, Facebook has been totally irresponsible with their users’ private data.

The fact that Facebook has repeatedly promised better while continually failing users only makes them harder to trust. At this point, it’s hard to take seriously any public statement Facebook makes on privacy. They have failed too many times to warrant your trust, which is why we rate them a one out of five.

Fairness: 1/5

Facebook has a monopoly over social media. Between Facebook and Instagram, they have over 3.5 billion monthly active users, roughly 60% of the world’s social media audience.

60% doesn’t quite approach the 94% share Google holds over North American search traffic. But crucially, Facebook and Instagram are the two networks actually used to connect real life friends and family. Other popular social media apps, such as TikTok or Pinterest, are more about sharing content than fostering connections. If you’re looking to keep up with people you actually know, Facebook and Instagram are likely your only two real options.

Short of anti-trust action, Facebook’s position is nigh-unassailable. That’s because of the network effect: these networks are viable because so many people are already on them. You can join another social network, but good luck connecting with friends and family if none of them are on it.

If you don’t like Facebook, your only alternative might be Instagram. But increasingly, Facebook has been asserting its control over Instagram and integrating the app into Facebook itself. When Facebook acquired Instagram and WhatsApp, it promised they would remain autonomous. But they haven’t held to that promise. By the end of 2018, the founders of both Instagram and WhatsApp had left, as Facebook exerted more and more influence over both apps.

More recently, Facebook has begun integrating both Instagram and WhatsApp with Facebook Messenger, opening both up to the privacy concerns that have racked Facebook. They also went back on the promise, made when they acquired Oculus, that Oculus headsets would never require a Facebook account to use.

Not only is Facebook unfairly consolidating these companies, but they have lied to both businesses and their users about this consolidation. When Facebook acquires a company, they promise not to tie it into Facebook. But based on their track record, users and regulators should never take them at their word.

Facebook also has no qualms about colluding with other tech companies. In 2018, they made an under-the-table deal with Google, their only serious competitor in digital advertising. Under this arrangement, Facebook agreed not to compete with Google through a rival bidding system, and in exchange received preferential treatment in Google’s ad auctions, giving them an advantage over other participants and publishers.

The anti-trust case against Facebook is clear-cut. Through a series of acquisitions, they have consolidated a monopoly over social media. In making these acquisitions, they lied to other companies, regulators, and the general public. In digital advertising, they’ve colluded with their biggest “competitor” to stake out an unfair advantage over smaller businesses and publishers. Facebook has unfairly leveraged its monopoly power, and it would probably be best to break them up.

Impact: 1/5

Much has been made of the proliferation of misinformation on Facebook since 2016. Time and time again, so-called “fake news” provokes more engagement than the real thing. Per Facebook’s algorithm, more engagement means more visibility – and so misinformation spreads like wildfire across Facebook.
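To make that dynamic concrete, here is a deliberately simplified sketch of engagement-weighted ranking. It is not Facebook’s actual News Feed algorithm, which is proprietary and far more complex, and the posts and weights are invented. It only illustrates why ranking purely on engagement tends to push the most provocative content to the top:

```python
# Toy illustration of engagement-based ranking -- not Facebook's real algorithm.
# Posts that provoke more reactions, comments, and shares float to the top,
# regardless of whether they are accurate.
posts = [
    {"headline": "Carefully reported local story", "reactions": 120, "comments": 15, "shares": 10},
    {"headline": "Outrage-bait misinformation", "reactions": 900, "comments": 400, "shares": 650},
]

def engagement_score(post):
    # Weight active engagement (comments, shares) more heavily than passive
    # reactions, as engagement-driven ranking systems commonly do.
    return post["reactions"] + 2 * post["comments"] + 3 * post["shares"]

for post in sorted(posts, key=engagement_score, reverse=True):
    print(engagement_score(post), post["headline"])
```

Under any scoring of this shape, whatever provokes the strongest reaction earns the most visibility, true or not.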

Unfortunately, Facebook has not been able to reduce the spread of misinformation since 2016. In 2020, fake news stories reached new heights: misinformation outlets doubled their reach on Facebook compared to the previous year, outpacing traffic growth to legitimate news sites.

Facebook users have also used groups and public pages to coordinate real-world action. Over 100,000 Facebook users piled on to insurrectionary hashtags in the run-up to the January 6th attack on the Capitol. So far, Facebook’s actions have been insufficient to shut down far-right organizing: typically, they only respond to these groups after violence has been done.

Facebook’s inability to control hateful and violent discourse has also had terrible effects outside the United States. In Myanmar, Facebook became a platform to promote ethnic violence against the Rohingya. Thousands of people were killed and over a million people have been displaced as part of the ongoing violence.

Whatever changes Facebook has made, they have not been enough to prevent it from happening again; even now, hate speech on Facebook is fostering ethnic violence in Ethiopia. Facebook remains behind the curve on moderation, only responding to these outbreaks of violence after they take place.

But Facebook does not value moderation. They contract this work out rather than handling it in-house, paying contractors less than $30,000 a year to filter through hate and gore for hours at a time. These workers’ time is so thoroughly micromanaged that they barely have time to go to the bathroom. Many workers on Facebook’s contracted moderation teams have developed PTSD symptoms, for which they receive no support.

Final Score: 3/20

Looking at Facebook’s record, they have done harm across the board. Time and time again, “moving fast and breaking things” has led the company to do terrible damage.

Facebook has repeatedly shown zero regard for users’ privacy. While promising better over and over again, they have continued to infringe on users’ personal data without their knowledge or clear consent. And across multiple breaches, they have left sensitive information exposed.

But Facebook can be difficult to avoid. If you want to stay connected to friends and family, they’re pretty much the only game in town. That’s because they’re a monopoly. Their acquisitions of Instagram and WhatsApp should never have happened, and by now, it is clear we would be better off if Facebook were broken up. They’ve even colluded with their so-called “rivals”, earning an advantaged position on Google’s advertising platforms at the expense of smaller companies.

It is hard to find this company’s upside. It really is. There has been some encouraging dialogue within the company on privacy and other subjects, but a mountain of work remains to be done if Facebook really wants to do better. For now, we must judge them by what they’ve done. And what Facebook has done cannot be described in charitable terms.

How Facebook Compares to Other Big Tech Companies

Facebook’s three out of twenty puts them at the bottom of the barrel. While no company is perfect, no other big tech company has been as irresponsible across the board as Facebook.

Company  | Total | Privacy | Trust | Fairness | Impact | Full Article
Amazon   | 7/20  | 1/5     | 3/5   | 2/5      | 1/5    | Click here
Apple    | 14/20 | 4/5     | 4/5   | 2/5      | 4/5    | Click here
Facebook | 3/20  | 0/5     | 1/5   | 1/5      | 1/5    | Click here
Google   | 7/20  | 1/5     | 3/5   | 1/5      | 2/5    | Click here
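For readers who want to double-check the math, here is a small sketch that recomputes each company’s total from the four subscores in the table above:

```python
# Recompute each total decency score as the sum of its four subscores,
# using the ratings from the table above.
scores = {
    "Amazon":   {"Privacy": 1, "Trust": 3, "Fairness": 2, "Impact": 1},
    "Apple":    {"Privacy": 4, "Trust": 4, "Fairness": 2, "Impact": 4},
    "Facebook": {"Privacy": 0, "Trust": 1, "Fairness": 1, "Impact": 1},
    "Google":   {"Privacy": 1, "Trust": 3, "Fairness": 1, "Impact": 2},
}

for company, subscores in scores.items():
    print(f"{company}: {sum(subscores.values())}/20")
```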

For more information, see our Big Tech Decency Ratings hub.

How Should Customers Handle Facebook?

Can you live without social media? Facebook holds such a dominant position that boycotting their products means stepping away from the two most popular social media platforms in the western world.

If you take your privacy at all seriously, you have a great reason to leave these apps behind. With Facebook bringing Instagram and WhatsApp under the umbrella of Facebook Messenger, you can no longer expect any measure of privacy on these apps – even when it comes to “private” messages.

If you do use any of these, you should assume anything you post or send is fully public. Better yet, switch to a secure app, such as Signal or Telegram, when you need to send messages. In January 2021, Signal skyrocketed to #1 on Apple and Google’s app stores. That could give it enough of a network effect to compete with Facebook, at least when it comes to messaging.

If you care about the economic and social impact of Facebook, the best thing you can do is find ways to agitate for change via government action or activism. A great place to start is by writing your representatives to express your support of the FTC anti-trust suit against Facebook, which will likely go on for two or more years.
