This blog post was originally published by Colin at the eDeliberation Blog.
But things have changed in the past year or two. I don’t blame Facebook, at least not entirely. I worked at eBay and PayPal for eight years, and I know how a site can get so big you can’t really conceptualize everything that’s going on within it. At its peak during my tenure, eBay had about 250m accounts (though we learned later that a lot of those accounts were fake, set up by spammers and scammers). Facebook today has more than 2b accounts. We used to say at eBay, if you counted all our users as citizens, we’d be the fifth largest country in the world. These days, Facebook would be number one, by a long shot.
Imagine the challenge of policing a country with 2b people in it. You couldn’t do it. Facebook only has about 25,000 employees. Not even Facebook knows all the things that are happening within its website. At eBay, we had fraudsters who went to work every day in suits, in skyscrapers, with Ph.D.s in computer science, who were trying to defraud our users. We did more transactions on a daily basis than the NASDAQ, and we were expected to police all of them in real time. Can you imagine the same challenge, but for 2b users instead of 250m? All in different languages? At least when you’re dealing with ecommerce purchases you can look for anomalies in payments and buyer problem reports. When you’re dealing with people communicating with each other in free text, there’s no common structure to the information.
Facebook is running an ecosystem more than it is micro-managing all the different interactions within it. They can’t read more than .0001% of the communications passing through their platform every day. And now that their reach and influence is so global, big institutions (read: governments and corporations) are studying how to game Facebook’s rules to achieve their own ends.
I’ve started to feel how Facebook is manipulating me. And again, I’m not talking about some nefarious individual deciding how I should think and hand-selecting stories targeted to influence me. People want to personify the awesome power of technology, so they imagine Zuckerberg (or Jobs or Gates) as the actor, personally deciding what to do and when. But I’m not talking about Zuckerberg, or any employee or group of employees. I’m talking about the algorithms.
Facebook wants my attention. It wants my engagement. It wants my clicks and likes. And the sad reality is that, by playing on my anger and fear, it gets more of me. I want balance and peace in my life, but balance and peace don’t generate clicks. Anger is an energy. Fear is a powerful motivator. And I sense that the algorithms have discovered that. And what’s scary is that it works.
I am a strong believer in the core principle that people are good. Yes, people are capable of bad actions. And a very small segment of people, perhaps due to trauma or personal inclination, do act in intentionally harmful ways. But the vast, vast majority of people around the world want to be good, they want to do good things, they want to be a force for good in the world. And even when an individual does something that is hurtful, they usually rationalize it as being good (or just, or appropriate) in the larger scheme of things. This is why you can go to almost any country in the world and people will be nice and hospitable. It’s burned into our DNA from millions of years of co-evolution.
But kindness and empathy and understanding don’t get clicks. Outrage gets clicks. And the forces on Facebook that are gaming the platform for their own ends know that well. I can feel Facebook feeding my outrage. I can see how it corrals me into groups of similarly outraged people, where we stoke each other’s anger.
I can also see how complex discussions over hard topics, topics that touch on identity, security, safety — core principles for every person on the planet — are transformed into shouting matches on Facebook. I’ve had hundreds of political conversations on Facebook and I can’t think of a single one where someone has changed their mind. On the contrary, every political conversation plays out in front of all of one’s friends and family, so it’s rarely about listening to the other side and more about demonstrating to your social graph that you’re willing to fight for what’s right (an activity a friend of mine refers to as “virtue signalling”).
I could see the seeds Facebook was planting in my brain. And even though it contradicts a lot of my work in conflict resolution, I saw that I was susceptible to it. If I posted a political link, my conservative friends from eBay and back in Texas would respond, my progressive friends in New England and California would reply, and inevitably a fight would ensue. The exchanges rarely generated any insight or empathy. The Facebook platform encourages communication in short, SMS-like bursts, which makes nuance difficult. If you write something longer than just a few sentences, it’s even cut off with a “see more” link.
It’s clear I’m not the only one feeling this way. The recent backlash against Facebook (mostly focused on privacy and data sharing) has sparked a #deletefacebook movement, and celebrities are announcing their departures from the platform in a steady stream. But it’s not easy to get off Facebook. It’s hard to say goodbye to the regular interactions with friends and family, as Sarah Jeong describes so well. It’s easy to miss the photos of graduations and videos of new babies. There’s a strong sense of missing out. But that’s the bait Facebook uses to keep us logging in.
For now, I’ve only deactivated my account. I could still re-activate. I see some people have written scripts to systematically go through Facebook and delete all their past posts. I’m not ready for that. But I am definitely going to take a break to regain my balance and restore some perspective. I’ll think further in this blog about how Facebook might be able to change to address some of these issues — and if not Facebook, how other services might step into the breach.
Colin Rule is CEO of Mediate.com. From 2017 to 2020 Colin was Vice President for Online Dispute Resolution at Tyler Technologies. Tyler acquired Modria.com, an ODR provider Colin co-founded, in 2017. From 2003 to 2011 Colin was Director of Online Dispute Resolution for eBay and PayPal. Colin co-founded Online Resolution, one of the first online dispute resolution (ODR) providers, in 1999 and served as its CEO and President. Colin worked for several years with the National Institute for Dispute Resolution (now ACR) in Washington, D.C. and the Consensus Building Institute in Cambridge, MA.
Colin is the author of Online Dispute Resolution for Business, published by Jossey-Bass in September 2002, and co-author of The New Handshake: Online Dispute Resolution and the Future of Consumer Protection, published by the ABA in 2017. He received the first Frank Sander Award for Innovation in ADR from the American Bar Association in 2020, and the Mary Parker Follett Award from the Association for Conflict Resolution in 2013. He holds a Master’s degree in conflict resolution and technology from Harvard University’s Kennedy School of Government, a graduate certificate in dispute resolution from UMass-Boston, and a B.A. from Haverford College, and he served as a Peace Corps volunteer in Eritrea from 1995 to 1997.