The Lost Art of Critical Thinking

We have never been more connected in the history of mankind. And yet, ironically, we have never been more polarised as a society, living in silos and echo chambers of conformity.

Two billion people actively use Facebook every month. That is 26% of the planet’s population, continuously and interactively exchanging, sharing and posting private as well as public information. But over and above our innate need to share information, 67% of Americans report that they get at least some of their news on social media, with 45% naming Facebook as their source of news.

This reliance on social media rather than traditional journalism has far-reaching and alarming (albeit unintended) consequences, manifested most visibly during the 2016 US elections. Several factors come into play. Our innate tendencies towards “homophily” (connecting with people who share our interests) and “confirmation bias” (believing what we want to believe), together with our bias to trust news from people we know rather than from anonymous media, constitute a volatile cocktail. And social media is the match: a match that has ultimately altered our way of thinking.

It all started so innocently.

When the number of our Facebook friends grew too large, it became difficult to keep track of them all on our wall (indeed, the number of friends was limited to about 1,000 at the beginning). Because their posts were listed in chronological order, it became easy to miss important events if we stepped away from Facebook for a short while. So Facebook developed an algorithm that displayed posts according to their “relevance” to us, becoming more and more accurate as it followed us and learned our habits. We are of course feeding this algorithm, both willingly and unwittingly. As we give up our privacy and hand it nuggets of our most private information, our likes and dislikes, even our physical location, it hones its knowledge of us to a frightening degree.

Thanks to this ostensibly useful algorithm, we are exposed only to what we like. Coupled with our tendency towards homophily, we share only news of which we approve, positively reinforcing the algorithm, which in turn shows us even more of what we agree with. And the cycle continues, as we dig ourselves deeper and deeper into our own bubble. The danger is insidious, for we are no longer exposed to points of view different from our own. We unconsciously embolden our own perspective and, if this continues unchecked, it results in the polarisation of our society. A vibrant democracy is predicated upon exposure to diverse thoughts and opinions, where citizens are tolerant of the ideas of others and make informed decisions based on eclectic perspectives.
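To make that feedback loop concrete, here is a deliberately simplified sketch in Python. It is not Facebook’s actual algorithm; the topics, the numbers and the “affinity” score are all invented for illustration. It simply shows how ranking posts by a learned preference, updated by our own clicks, can collapse a feed’s diversity over a few weeks:

```python
import random
from collections import Counter

# Toy model of an engagement-ranked feed. This is NOT Facebook's real
# algorithm; it only illustrates the reinforcement loop described above.
TOPICS = ["politics_left", "politics_right", "sports", "science", "celebrity"]

# The platform's learned "relevance" score per topic, initially neutral.
affinity = {topic: 1.0 for topic in TOPICS}

# The user's hidden preferences: they engage far more with one viewpoint.
preference = {"politics_left": 0.9, "politics_right": 0.1,
              "sports": 0.4, "science": 0.3, "celebrity": 0.5}

def build_feed(n_posts=10):
    """Rank a pool of candidate posts by learned affinity, keep the top n."""
    candidates = [random.choice(TOPICS) for _ in range(50)]
    candidates.sort(key=lambda topic: affinity[topic], reverse=True)
    return candidates[:n_posts]

def simulate_day():
    """Show a feed, record what the user engages with, update the ranking."""
    feed = build_feed()
    for topic in feed:
        if random.random() < preference[topic]:  # a like, click or share...
            affinity[topic] *= 1.10              # ...reinforces the ranking
        else:
            affinity[topic] *= 0.98              # ignored topics slowly fade
    return feed

feed = []
for day in range(60):
    feed = simulate_day()

# After two months the feed is dominated by the topic the user already
# agreed with; the other viewpoints have effectively disappeared.
print(Counter(feed))
print({topic: round(score, 2) for topic, score in affinity.items()})
```

Nothing in this loop requires malice or manipulation: the narrowing is an emergent property of ranking by engagement, which is precisely what makes it so insidious.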

Confirmation bias, our tendency to believe what we want to believe, is so strong that we often hold on to our beliefs even in the face of contrary scientific evidence. It is the same characteristic that has us unconsciously change the orientation of a compass in order to make it point in the direction we believe is correct. In the context of social media, we take at face value news that reinforces our own beliefs, without fact-checking, and then we spread the word. The viral spread of “fake news” during the 2016 US elections is a testament to this tendency. “Pizzagate” is an excellent case in point. Edgar Welch, a 28-year-old from Salisbury, NC, believed the news that Hillary Clinton was running a child sex-trafficking operation from the back office of a pizzeria in Washington DC. So convinced was he that he took his AR-15 semi-automatic weapon and drove to the pizzeria to “free the children”. He fired three shots in the restaurant before surrendering to the police. Luckily no one was hurt. There are many other examples of fake news spreading virally, and debunking them is like trying to put the genie back in the bottle.

So how big is Facebook’s role in this tendency? As Facebook continues to mine our data, it starts to know us better than we know ourselves. Facebook “can predict a user’s race, sexual orientation, relationship status and drug use on the basis of ‘likes’ alone” (“World Without Mind: The Existential Threat of Big Tech”, Franklin Foer). The danger lies in how that data is used and to what purpose, for it is so easy to manipulate us. Russia did not need to do anything illegal when it attempted to influence the 2016 US elections. There was no “hacking” involved (at least not in social media). It just had to tap into people’s innate biases and fears, and Facebook obliged.

Facebook obliged because of its business model. Its only source of revenue is advertising, and therein lies the problem (remember, Facebook is “free” to the user). Everything Facebook does is designed to get you hooked and keep you on the page, so it can continue to serve you targeted ads, which in turn bring it revenue. This has become known as the “attention economy”. Facebook is one of the most effective and efficient advertising platforms, allowing advertisers to target even the most obscure groups of people with precision. It took ProPublica (a non-profit) less than 15 minutes and $10 to advertise on Facebook to people who openly call themselves “Jew haters” and want to know “how to burn Jews”. And Facebook obliged. They have since said that this was “a fail on our part”, and that they will put measures in place to avoid such an occurrence in the future. But how easy is it to monitor? Political advertising has become a sophisticated business, a far cry from the days of pamphlets in people’s mailboxes. As Tim Berners-Lee, the inventor of the Web, pointed out, “in the 2016 US election, as many as 50,000 variations of adverts were being served every single day on Facebook, a near-impossible situation to monitor”. The 2016 US election was indeed a watershed moment, where data bought from a data-mining company, Cambridge Analytica, came to supplement other sources and enabled political parties for the first time to target advertising based on “psychometrics” (i.e. personality traits) rather than pure demographics.

But it is not just about advertising. As Facebook has become a powerful source of traffic, people take advantage of it by posting content with the sole purpose of getting readers to click on links that send them to their own websites, sites filled with revenue-generating ads courtesy of Google’s AdSense programme. This phenomenon, called “clickbaiting”, can be done, for example, by opening fake Facebook accounts that appeal to the “alt-right” in the US, whose members then follow and share the content. These free accounts tapped into the racism and bigotry of extremists who readily believed stories of Syrian refugees inundating towns, of Muslim men collecting welfare for their four wives, and other surreal tales. They forwarded the news and people acted upon it, resulting in numerous organised demonstrations. Forced by Congress, Facebook disclosed the details of close to 125,000 such accounts, some of them Russian, others with no motive other than greed.

Katharine Viner, The Guardian’s Editor-in-Chief, drew a causal link between the degradation of the quality of digital journalism and the fact that the bulk of online advertising now goes to Google and Facebook. Since advertising, she argues, is now aimed at audiences with certain characteristics (“algorithmic adverts”) rather than placed on specific sites, as it traditionally was, publishers are “locked in a race to the bottom in pursuit of any audience they can find – desperately binge-publishing without checking facts, pushing out the most shrill and most extreme stories to boost clicks”.

The danger is real and pernicious and should not be underestimated. In a November 2016 study, researchers at Stanford University assessed 7,800 middle- and high-school students across 12 states to evaluate their “civic online reasoning—the ability to judge the credibility of information that floods young people’s smartphones, tablets, and computers.” They discovered that, “overall, young people’s ability to reason about the information on the Internet can be summed up in one word: bleak” (emphasis not mine). These are tomorrow’s professionals, politicians and business people…

Facebook was never intended to be a publisher, and indeed its owners have always fought against that classification. But in only ten years Facebook has morphed from a place to share information between friends into the most powerful publisher in history, and a totally unregulated one, by replacing editors with algorithms. And it always seems to be morphing one step ahead of its own owners, every new feature having unforeseen consequences. When live streaming was introduced, it was not only used by friends and family connecting across oceans; people also used it to film their suicides, as well as premeditated murders.

My biggest concern is that, with such a gold mine of user data at its fingertips, Facebook’s power to disturb the very fabric of our society is huge, and unchecked. Can we trust Facebook to self-regulate, as they claim they can? The answer is a clear no. Even with the best of intentions, and even if they were able to solve the logistical nightmare of parsing millions of posts (a near-impossible task), there is an inherent conflict of interest between self-regulation and Facebook’s business model. The very foundation of that model, in the attention economy, rests on its ability to grab our attention, hook us and get us to come back again and again.

I might seem to be singling out Facebook, but my comments apply just as easily to other platforms, such as Twitter and Google. Facebook stands out to me for two reasons: the sheer number of its active users (it also owns Instagram and WhatsApp), and the fact that we willingly feed it our own private information. Indeed, by giving up our privacy we have made a Faustian deal with GAFAM (Google, Amazon, Facebook, Apple, Microsoft), risking our souls for the gain of “convenience”. But what have we lost in return?

We can argue about regulation (and that is important), but at the end of the day it is up to us to regain control of our digital destinies, to harness technology rather than be harnessed by it. We may by all means continue to use social media to obtain our news, but as we move away from traditional media, the onus falls on us to do the journalist’s job: to check the sources of stories, to cross-check them against other sources, and to expose ourselves to a diversity of views.

It is up to us to regain the lost art of critical thinking.


Bibliography / Further Reading

  • ‘Who shared it?’: How Americans decide what news to trust on social media, American Press Institute, 20 March 2017
  • Evaluating Information: The Cornerstone of Civic Online Reasoning, Stanford History Education Group, with the support of The Robert R. McCormick Foundation, 22 November 2016
  • Amazon’s Antitrust Paradox, Lina M. Khan, yalelawjournal.org
  • ‘Our minds can be hijacked’: the tech insiders who fear a smartphone dystopia, Paul Lewis, The Guardian, 6 October 2017
  • Fake News Spreads Because People Trust Their Friends Too Much, New York Magazine, March 2017
  • Scienceblind: Why Our Intuitive Theories About the World Are So Often Wrong, Andrew Shtulman, Basic Books, ISBN 0465053947
  • Facebook’s Ad Scandal Isn’t a “Fail”, It’s a Feature, Zeynep Tufekci, The New York Times, 24 September 2017
  • A Mission for Journalism in a Time of Crisis, Katharine Viner, The Guardian, 16 November 2017
  • Former Facebook executive: social media is ripping society apart, Julia Carrie Wong, The Guardian
  • How We Built Our Bubble, Molly McHugh, theringer.com
  • How Technology Is Making Our Minds Redundant, Franklin Foer
  • Facebook Doesn’t Like What It Sees in the Mirror, Noam Cohen, The New York Times
