Is Facebook evil?
The answer to this simple question is not so simple. The tools that have enabled Facebook to enjoy its position are its access to massive amounts of data and its machine learning algorithms. And it is these two areas we need to examine to determine whether there is any wrongdoing on Facebook’s part.
Facebook, no doubt, is a giant of the online space. Despite its arguments that it is not a monopoly, many think otherwise. The role that Facebook plays in our lives, specifically in our democracy, has been heavily scrutinized and debated over the last few years, with the lawsuits brought by the federal government and dozens of state governments toward the end of 2020 being the latest examples. While many regulators and most ordinary people will argue that Facebook exerts unparalleled power over who shares what and how people are influenced by information and misinformation, many still don’t quite understand where the problem really lies. Is it the fact that Facebook is a monopoly? Is it that Facebook willingly takes ideological sides? Or is it Facebook’s grip on small businesses and its massive user base through data sharing and user tracking? It’s all of these and more. Specifically, it’s Facebook’s access to vast amounts of data through its connected services, and the algorithms that process this data in a profit-focused way to drive up user engagement and revenue.
Most people understand that there are algorithms driving systems such as Facebook. But their view of such algorithms is quite simplistic: an algorithm is a set of rules and step-by-step instructions that tells a system how to behave. In reality, hardly any critical aspect of today’s computational systems, least of all Facebook’s, is driven by such hand-written algorithms. Instead, they use machine learning, which by one definition means computers writing their own algorithms. Okay, but at least we’re controlling the computers, right? Not really.
The whole point of machine learning is that we humans don’t have the time, power, or ability to churn through massive amounts of data to find relevant patterns and make decisions in real time. Instead, these machine learning algorithms do that for us. But how can we tell if they are doing what we want them to do? This is where the biggest problem arises. Most of these algorithms optimize their learning against metrics such as user engagement. More user engagement leads to more usage of the system, which in turn drives up ad revenue and other business metrics. On the user side, higher engagement leads to even more engagement, like an addiction. On the business side, it yields more and richer data that Facebook can sell to vendors and partners.
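The dynamic described above can be sketched in a few lines of code. Everything here is hypothetical: the feature names, the weights, and the `rank_feed` function are illustrations of an engagement-maximizing objective, not Facebook’s actual system. The point is structural: nothing in the objective asks whether a post is accurate or good for the user.

```python
# Illustrative sketch (not Facebook's real code): a feed ranker that orders
# posts purely by a predicted-engagement score. The feature weights below are
# made up; real systems learn such weights from massive interaction logs.

def predicted_engagement(post):
    # Hypothetical learned weights: controversy and share rate correlate
    # strongly with clicks, so they dominate the score.
    return (3.0 * post["controversy"]
            + 2.0 * post["share_rate"]
            + 1.0 * post["relevance"])

def rank_feed(posts):
    # Sort solely by predicted engagement; accuracy and user well-being
    # appear nowhere in the objective.
    return sorted(posts, key=predicted_engagement, reverse=True)

posts = [
    {"id": "calm_news",    "controversy": 0.1, "share_rate": 0.2, "relevance": 0.9},
    {"id": "hot_take",     "controversy": 0.9, "share_rate": 0.8, "relevance": 0.3},
    {"id": "friend_photo", "controversy": 0.0, "share_rate": 0.5, "relevance": 0.8},
]

feed = rank_feed(posts)
print([p["id"] for p in feed])  # the most controversial post ranks first
```

Under these made-up weights, the “hot take” post outranks the more relevant, less controversial posts, which is exactly the amplification pattern discussed below.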
Facebook can use its passivity in this process to argue that it is not evil. After all, it doesn’t manually or purposefully discriminate against anyone, and it doesn’t intentionally plant misinformation in users’ feeds. But it doesn’t need to. Facebook holds a mirror up to our society and amplifies our worst instincts because of how its machine learning-powered algorithms learn and optimize for user engagement. Unfortunately, since controversy and misinformation tend to attract high engagement, the algorithms automatically prioritize such posts, because that is what they are designed to maximize.
A user is worth hundreds of dollars to Facebook, depending on how active they are on the platform. A user who is active on multiple platforms that Facebook owns is worth a lot more. Facebook can claim that keeping these platforms connected is best for users and businesses, and that may be true to some extent, but the one entity with the most to gain is Facebook itself.
There are reasonable alternatives to WhatsApp and Instagram, but none for Facebook. And it is that flagship service, and Facebook’s monopoly over it, that makes even those other apps far more compelling and much harder for their users to leave. Breaking up these three services would create real competition and drive up innovation and value for users. But it would also make it harder for Facebook to leverage its massive user base for the kind of data it currently collects (and sells) and the machine learning algorithms it could run. There is a reason Facebook has doubled its lobbying spending in the last five years. Facebook is also fighting Apple’s stance on informing users about tracking, arguing that giving users a choice about whether to be tracked will hurt small businesses. Even Facebook’s own employees don’t buy that argument.
I may be singling out Facebook here, but many of the same arguments can be made against Google and other monopolies. We see the same pattern: it starts with gaining users by giving them free services, then bringing in ads. There is nothing wrong with ads; television and radio have run them for decades. But given the way the digital ad market works, and the way these services train their machine learning algorithms, it’s easy for them to go after data at any cost, including user privacy. More data, more learning, more user engagement, more sales of ads and user data, and the cycle continues. At some point the algorithms take on a life of their own, disconnected from what’s good or right for the users. Some of these algorithms’ goals may align with those of users and businesses, but in the end, it is the job of these algorithms to increase the bottom line for their masters—in this case, Facebook.
To counteract this, we need more than just regulations. We also need education and awareness. Every time we post, click, or “like” something on these platforms, we are giving a vote. Can we exercise some discipline in this voting process? Can we inform ourselves before we vote? Can we think about a change? In the end, this isn’t just about free markets; it’s about free will.
About the Author
Dr. Chirag Shah is an associate professor in the Information School at the University of Washington.