Frances Rigby discusses social media and the polarisation of political views

Social Media and the Polarisation of Political Views


Over the last 20 years the world has seen a rapid rise in the use of social media platforms. At the start of 2020, approximately 3.8 billion people were using some type of social media, and it is predicted that this will have grown to more than half of the world’s population by the end of the year. The sheer scale of social media, and the simplicity of its method of delivery, give these companies a power unmatched by alternative media. For the first time, consumers can be monitored, and data about their interests and preferences stored, on such a large scale. As a consequence, social media provides a huge opportunity for advertisers, influencers, journalists and politicians.

One significant development in recent years is the appearance of ‘echo chambers’ across many social media platforms, created by the algorithms those platforms use. An echo chamber can be defined as an environment in which an individual’s thoughts and opinions are reflected back to them, so that no other opinion or viewpoint is heard or seen. Whenever we use social media, an algorithm keeps track of the things we like and dislike, what we choose to click on, our search history, and what our friends look at too. From this information the algorithm is able to personalise the content shown to us, based on what it believes we wish to see. Although this may sound like a good idea, and was probably created with good intentions, it produces a ‘filter bubble’ in which we are shown only things we have expressed an interest in, while things we have not are hidden away.
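The mechanism described above can be sketched in a few lines of code. This is a deliberately simplified illustration, not any platform's actual algorithm: the topic labels, scoring rule and threshold are all invented here, and real recommendation systems are vastly more complex and proprietary.

```python
# A toy model of interest-based feed filtering. All names and rules
# are invented for illustration; real platform algorithms differ.

def score(post_topics, liked_topics):
    """Score a post by how many of its topics the user has liked before."""
    return len(set(post_topics) & set(liked_topics))

def personalised_feed(posts, liked_topics, threshold=1):
    """Keep only posts whose score meets the threshold -- the 'filter bubble'."""
    return [p for p in posts if score(p["topics"], liked_topics) >= threshold]

posts = [
    {"id": 1, "topics": ["party_a", "economy"]},
    {"id": 2, "topics": ["party_b", "economy"]},
    {"id": 3, "topics": ["party_b", "healthcare"]},
]

# A user whose click history favours party_a never sees post 3 at all,
# even though nothing about it is objectively less relevant.
feed = personalised_feed(posts, liked_topics=["party_a", "economy"])
print([p["id"] for p in feed])  # -> [1, 2]
```

Even in this crude form, the effect the article describes is visible: the filter quietly removes content with no overlap with past behaviour, and the user has no way of knowing what was removed.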


By filtering what is shown online in this way, people see less and less of opposing viewpoints, because those viewpoints are simply filtered out. They must be actively sought instead, and will people bother to do so, especially if they are comfortable with what is already presented to them? This makes it difficult to see the whole picture of a certain event, or even to see information about the event in the first place. People end up surrounded by like-minded people, hearing only the beliefs they already hold, rather than being confronted by ideas that may challenge them. In some situations people may only want to see things they like and are interested in, for example, a particular hobby. But when it comes to political issues, filter bubbles present a serious problem.

On social media platforms such as Twitter and Facebook, a large proportion of the content available is of a political nature. Algorithms filter out anything with a viewpoint contrasting with the user’s. This can foster the impression that the user holds the ‘correct’ views and that anyone who disagrees is not worth listening to; those who disagree are even denied an opportunity to convey their opinions.

Similarly, this type of filtering can also manipulate the kind of news stories presented to the reader. Taking the most recent general election as an example, Labour supporters would be most likely to be presented with, and to click on, positive stories about Jeremy Corbyn and negative ones about Boris Johnson, and vice versa for Conservative supporters. To some extent this happened with printed media before social media existed, but the scale and interactive quality of social media transform what was a personal matter for the reader into a more public one, allowing immediate responses and the opportunity to attack those you disagree with. So as people fall into their filter bubbles, groups become more separated and polarised, with less opportunity to form a balanced understanding. The danger is that, with the middle ground disappearing, extreme positions will dictate the agenda. The problem for society is how social media platforms could be required to strike a balance, and which organisation would be the appropriate one to judge what that balance should be.