August 10, 2017

Facebook: How it meddles with your mind

Facebook is the mythical 800-lb gorilla of the media world that, as the original joke goes, “sits down wherever it wants to”. With 1.2 billion pairs of eyeballs on it every day, it has an audience greater than that of any American, European or Asian TV news network, newspaper or online news portal. This immense reach also makes it a hugely effective medium of entertainment. In societies where it has crossed a critical threshold of penetration, it has become the most potent mobilising force in politics. All of this eventually translates into Facebook being one of the most valuable companies in the world.
Image borrowed from https://mymuddledmind.blog/

We know that information is power. We also know that power corrupts and absolute power corrupts absolutely. Should we be wary of Facebook? Consider the following ...

In the Foundation series of iconic science fiction novels by Isaac Asimov, the villain, a mutant psychopath called the Mule, uses popular musical concerts as a medium to transmit subliminal messages that demoralize an unsuspecting population and break its resistance to his political hegemony. On December 17, 1997, in a chilling realisation of this fictional scenario, many news outlets, including The New York Times and CNN, reported from Tokyo that “The bright flashing lights of a popular TV cartoon became a serious matter Tuesday evening, when they triggered seizures in hundreds of Japanese children. In a national survey, the Tokyo fire department found that at least 618 children had suffered convulsions, vomiting, irritated eyes, and other symptoms after watching ‘Pokemon.’”

Can a mass media platform be used to meddle with, or influence, human minds en masse?

As an early adopter and ardent evangelist of social media, I had always thought that platforms like Facebook and Twitter were an excellent replacement for television and newspapers as channels for current news and diverse views. But after getting drawn into a series of unintentional and inconclusive spats and flame wars with strangers with whom I have little in common -- exchanges that left both sides as unconvinced of the other’s point of view as ever -- I am sceptical. Was the price I was paying for these “free” channels far too high, in terms of the collateral irritation and anger generated in an otherwise placid and cheerful person like me? Was this my fault? Was I not savvy enough to handle this new medium, just as an earlier generation is psychologically uncomfortable shopping on Flipkart or using an Android smartphone? How did the evangelist in me morph into a social media luddite, ranting against a technology? Was it just me? Or is this feeling universal?

In an April 2017 Harvard Business Review article summarising their peer-reviewed study (published in the American Journal of Epidemiology), Holly Shakya and Nicholas Christakis established what I had recently come to believe, namely, that “The More You Use Facebook, the Worse You Feel”! This is paradoxical, because social interaction is a necessary and healthy part of human existence, and many studies have shown that people thrive when they have strong, positive relationships with others. But when real-world, physical relationships are replaced by digital, virtual ones, the situation changes. The authors measured well-being -- through self-reported life satisfaction, mental and physical health, and body-mass index -- and Facebook usage -- through the number of likes, posts and clicks on links -- across three waves of data from 5,208 users over two years, and concluded that overall well-being was negatively associated with Facebook usage, with the results particularly strong for mental health. Moreover, the study showed that the decline in well-being is tied to the quantity of Facebook usage, and not just the quality of interactions, as had been believed in the past.
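
The statistical idea behind such a finding is simple enough to sketch. The toy Python snippet below runs the same kind of regression on entirely synthetic numbers; the variable names, coefficients and values are my own illustration, not the study’s data.

```python
# Illustrative only: a toy regression on synthetic data, showing how a
# negative usage/well-being association would surface in an analysis.
import numpy as np

rng = np.random.default_rng(0)
n = 5208  # matches the study's sample size, but these numbers are made up

likes_and_clicks = rng.poisson(30, n)  # stand-in for Facebook usage
mental_health = 70 - 0.2 * likes_and_clicks + rng.normal(0, 5, n)

# Ordinary least squares: mental_health ~ intercept + usage
X = np.column_stack([np.ones(n), likes_and_clicks])
beta, *_ = np.linalg.lstsq(X, mental_health, rcond=None)
print(f"slope: {beta[1]:.3f}  (negative => more usage, lower well-being)")
```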

While the authors offer no explanation for this negative association, it is not difficult to see why it might arise if we consider what shows up on your newsfeed. Depending on how many posts your friends, and the pages you have liked, have shared, there could be 2,000 or more items that Facebook could show you; since that would be an uncomfortable information overload, the actual number shown is possibly as low as 200. This selection, or curation, is performed not by any human editor but by an artificial intelligence (AI) program designed to maximise benefits for Facebook. Since it is in Facebook’s interest to stimulate conversations, its AI will naturally favour items that provoke a user to react -- just as, in a zoo, visitors throw stones at the animals instead of letting them rest in peace. Hence, while placid and informative items will not be totally ignored, there will always be a slight bias towards items that trigger a reaction. For example, a Hindutva follower -- and Facebook knows our preferences to the last detail -- will be shown more items on minority appeasement, these being far more likely to provoke a torrid response, and an equally torrid counter-response, than pictures of flowers and birds. Of course, this bias is neither obvious nor in-your-face. You will still see the usual quota of bland, feel-good quotes and pictures of friends holidaying in Goa or Singapore. Which is fine, except that you just might feel a tad disappointed that you are stuck in messy Mumbai instead of being in Goa -- which is another reason for feeling a bit sore with yourself! And since nobody posts about their problems, this too feeds the depressing belief that everyone except you is happy.
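
Nobody outside Facebook has seen the actual ranking code, but the engagement bias described above fits in a few lines. Everything in this sketch -- the field names, the scoring weights, the 2,000-to-200 funnel -- is a hypothetical illustration, not Facebook’s algorithm.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_reactions: float  # model's guess at comments/shares it will provoke
    informativeness: float      # how useful or calming the post is

def curate_feed(candidates, slots=200, provocation_weight=0.7):
    """Pick `slots` posts out of ~2,000 candidates, leaning towards
    whatever is predicted to provoke a reaction."""
    def score(post):
        return (provocation_weight * post.predicted_reactions
                + (1 - provocation_weight) * post.informativeness)
    return sorted(candidates, key=score, reverse=True)[:slots]
```

Even a small tilt in a weight like this, applied billions of times a day, is enough to change the overall temperature of what the world sees.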

In fact, playing with Facebook users’ emotions and deliberately trying to modify them is the subject of a very controversial paper, “Experimental evidence of massive-scale emotional contagion through social networks”, published in the June 2014 issue of the Proceedings of the National Academy of Sciences of the USA (PNAS) by a member of Facebook’s data science team and his academic collaborators. For this paper, the team deliberately introduced a bias in the items included in users’ newsfeeds and observed the impact on their subsequent behaviour. To quote the authors, “In an experiment with people who use Facebook, we test whether emotional contagion occurs outside of in-person interaction between individuals by reducing the amount of emotional content in the News Feed. When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.”
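
The manipulation itself was mechanically simple. The sketch below is my reconstruction of the idea, not the authors’ code; in the actual study, each emotional post had between a 10 and 90 per cent chance of being withheld, depending on the user.

```python
import random

def filter_feed(posts, suppress="positive", omit_prob=0.5, seed=42):
    """Toy version of the experiment's treatment: randomly withhold
    posts of one emotional flavour from a user's newsfeed."""
    rng = random.Random(seed)
    return [p for p in posts
            if not (p["sentiment"] == suppress and rng.random() < omit_prob)]

feed = [{"text": "Great day!", "sentiment": "positive"},
        {"text": "Feeling low.", "sentiment": "negative"},
        {"text": "Best news ever!", "sentiment": "positive"}]
print(filter_feed(feed))  # positive posts randomly thinned out
```

The outcome variable was simply what users posted afterwards: fewer positive words when positivity had been suppressed, and vice versa.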

This paper was criticised for violating basic ethical principles of psychology research, because no consent was sought from the subjects whose emotions were being tampered with. But that does not detract from the fundamental point: Facebook has the ability to modify the emotions of its users, and has done so in the past. What is even more disturbing is that Facebook has reportedly patented technology to use webcams and smartphone cameras to track emotions in real time by detecting and decoding facial expressions as we read posts! While there is no evidence of deliberate evil intent as yet, the fact that its AI-based news selection service can detect and tamper with the emotions of users is a big red flag because, as noted earlier, Facebook touches more people than any newspaper, television channel or news portal, and so has the ability to mould the emotions of a significant part of the global population.
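
Nor would this need exotic machinery. Here is a sketch of how per-frame emotion tracking could be wired up with OpenCV’s stock face detector; the emotion classifier itself (the `model.classify` call) is a hypothetical stand-in, since no such public Facebook API exists.

```python
# Illustrative wiring only: detect faces, then ask a (hypothetical)
# classifier how each face feels. `model` is not a real library object.
import cv2

face_finder = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def emotions_in_frame(frame, model):
    """Find faces in a webcam frame and classify each one's expression."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_finder.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    return [model.classify(gray[y:y + h, x:x + w]) for (x, y, w, h) in faces]
```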

While Facebook has been targeted for being a channel, or firehose, for fake and unsubstantiated news, the real danger lies in its ability to tamper with our emotions and, as the HBR article reports, make all of us feel angry, frustrated, jealous and upset with the world around us. Can we do anything to mitigate this unfortunate state of affairs? At a personal level, one could reduce the amount of time spent on the platform, but since Facebook is an addiction like tobacco or alcohol, with similar withdrawal symptoms, this may not be a feasible solution for everyone.

What users could ask for instead is greater transparency in the algorithm -- the procedure -- used to determine what they see and what they don’t. If I want to see posts about birds and flowers, I must not be shown pictures of stone-pelters in Kashmir. Some such process does exist -- you can indicate the kinds of posts that you want to see less of -- but a more direct method would go a long way towards restoring the sense of choice that newspapers and TV give us to read or ignore specific items of news and views.
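
Such a control need not be complicated. A direct version could be as simple as a user-maintained mute list applied before ranking; the topic labels and function below are purely illustrative.

```python
MUTED_TOPICS = {"stone-pelting", "communal politics"}  # chosen by the user

def respect_preferences(posts, muted=MUTED_TOPICS):
    """Drop posts on topics the user has explicitly asked not to see."""
    return [p for p in posts if p["topic"] not in muted]
```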

Social media is here to stay, and Facebook, with its unassailable reach and immense clout, is something that -- like the monsoon rain -- we have to learn to live with. However, knowing the danger it poses and working on ways to reduce its impact is something that needs urgent action.


This article originally appeared in Swarajya, the magazine that reads India right.
