Think Change

Is social media’s hate problem beyond repair?

ODI

In the wake of our decision at ODI to suspend all activity on X (formerly Twitter), this episode examines the polarisation and hate so prevalent in today’s social media landscape and what is driving it.

Since Elon Musk took ownership of X two years ago, there has been a rise in hate speech, mis- and disinformation on the platform. The UK riots, which we dissected in a recent episode, are a case in point. This worrying trend – alongside the commercialisation of its validation functions – led many politicians and organisations to stop using X.

But are we just putting plasters on a much deeper issue? To what extent did this divide in society already exist?

This episode examines these questions, and asks what action is needed to build safer, fairer digital societies. Guests explore the potential of social media to tackle subconscious biases, and take a stand against misinformation and online hate.

We hear about the role that policy-makers, as well as social media platforms themselves, should play in tackling fake news and misinformation, and what role AI could play in influencing these dynamics.

Guests

  • Sara Pantuliano (host), Chief Executive, ODI
  • Nanjala Nyabola, Kenyan author, political analyst and activist
  • Sandra Pepera, Chief Strategy and Governance Officer, ODI


00:09 - Sara Pantuliano (Host) 

Welcome to Think Change. I'm Sara Pantuliano. Some of you may remember that one of our recent podcasts touched on the role of the platform X in the incitement of increased levels of violence during the recent riots in parts of the UK. Well, in this episode, we would like to take it a step farther and look more closely at the polarisation being manifested in our current social media landscape and its underlying drivers. Elon Musk's takeover of X has, de facto, seen the dismantling of mechanisms to tackle hate speech that Twitter had put in place, and the commercializing of the validation functions of the platform that mitigated against impersonation. As a result, many politicians and prominent organizations have decided to leave the platform and share their content on alternative platforms such as Blue Sky and Threads. ODI itself seized its activity on the platform yesterday. But are these actions enough, or are we merely putting plasters on an issue that is much deeper than the way it presents itself? In our society, has this strong divide always existed and has social media just given it a powerful platform to progress and grow? And how can we encourage citizens to use social media for productive discourse and give them the tools to understand and tackle subconscious bias and take a stand against misinformation and violent threats that affect themselves and others? 

I've invited two experts to explore this issue today and to discuss how we can collectively devise just and inclusive approaches to building safer and fairer digital societies. Joining me for this conversation are Nanjala Nyabola, a well-known Kenyan author, political analyst and activist and a Senior Research Associate at ODI, and Sandra Pepera, ODI's Chief Strategy and Governance Officer. Nanjala, let me start with you. Amongst many other things, you specialize in digital democracy. How do you think the use of online media varies across different cultures and geographical contexts? You've written a lot on Kenya, which is one of the most electronically advanced countries in Africa. How do you feel social media platforms have influenced the political and social discourse?

02:30 - Nanjala Nyabola (Guest) 

Thank you for having me and thank you for that question. I think one thing that I always emphasize is that none of these developments happen in a vacuum. Whatever new technology is deployed, and however it is used, it is usually deeply connected to what was there before and what vacuum existed, either in the information space or in the media space... What is social media? What space does it exist in and what space is it going to occupy? Digital platforms, what are they going to occupy?

03:07 

I think it's very important, and this is a point that I emphasize in my own research, in my own analysis instead of thinking about the platforms per se, I think it's really crucial to think about an information ecosystem, and that is the ecosystem within which people get their political, especially their political information, but really their information more broadly, and the context in which they are able to deploy that information. What are they able to do with these new things that they know? In authoritarian contexts, for example, where the information ecosystem might be dominated by a state-owned broadcaster or by a media space that is compromised by different forms of influence or pressure or coercion by the state, you might find that social media platforms then come to play this sort of quasi-media platform and the public accountability platform, because they become the only space within which people are able to publicly express disaffection with power because of the affordances of social media, and that includes anonymity. That includes being able to coalesce around groups without physically meeting. It includes even being able to make payments by circumventing centralized financial systems.  

So in an authoritarian context, we sometimes find that social media becomes almost like a substitute for the media.  

It becomes the main place through which people can get information about their political system, and this is the paradox that we're dealing with now, I think, with X, the platform formerly known as Twitter. In the early stages, when traditional media did not see social media as competition but rather as a way of getting their information out to large audiences, there were a lot of compromises and dependencies that developed, because journalists became dependent on social media to reach audiences. People no longer go on the websites that belong to the news outlets, and now that they're competing for the same advertisers, there's a little bit of friction there, in that the traditional media houses kind of want to take their power back. So this is just one example, but it's to underline the point that it's not really about the platforms per se. It's really about the ecosystem in which we get political information. The same dynamics will result in different outcomes in different contexts, because of all of the other issues that might exist within that society.

05:43 - Sara Pantuliano (Host) 

Thanks, Nanjala. That's really helpful and it really chimes with a lot of the analysis and research we have done at ODI, which has very much seen these same trends. But on the platform specifically, we've also seen how the commercial imperative has trampled a lot of the careful moderation that is so important to make sure that we don't see more nefarious content being promoted.

But obviously, as you say, the algorithm promotes or creates greater engagement, and we're seeing exactly that: more hateful speech gets promoted because it creates more engagement. But let me come to Sandra. Sandra, before joining ODI, you worked as Director for Gender, Women and Democracy at the National Democratic Institute in Washington and, of course, for the UK Government for many years, and you've dealt with these issues in depth. What do you think the rise of violent extremism online means for the political discourse in the UK, as in the US?

06:50 - Sandra Pepera (Guest) 

No, thanks, Sara, and hello to Nanjala. We haven't met for a while, but I've worked with a number of your colleagues in Kenya, people at Policy and other places that you'll know. I mean, I think, you know, we're at a point in time now when we have clearly understood that the digital world is changing everything. It's not just changing how we engage in politics; it's changing, in some ways, our underlying values and our behaviours as well. Those of us, if you like, who've been in that sort of feminist space have been the canaries in the coal mine, really, for over a decade now, watching every single kind of innovation undermine, absolutely undermine, the potential of the digital world to really, as Nanjala says, increase information, increase education, connect people, empower people in their own spaces, and allow them to undertake accountability and monitoring of government action. All those positive potentials have been undermined by the absence of any kind of regulation. You know, the digital space is almost synonymous with what was called at one point the Wild West. The frontier is endless, and the ability to engage on it in ways that are shunned in any other space has never been properly addressed. So I think what we've had in the last couple of years in particular, as some of my colleagues at NDI described it, is two different currents coming together. Definitely a greater interest in regulating the way people behave in that digital space, so governments coming under greater pressure to do something about what is going on. But on the other side you've also got a number of governments who, sometimes under the pretext of protection, are actually trying to constrain voices on the internet, particularly if they are voices that, as Nanjala suggested, are counter to their own interests.

You asked the question about the rise of violent extremism. For some people, in a way, this has always been the case. This has always been the state in which they've been able to engage on the internet and in the digital space. So for some, the question is: what standard are you applying to say that there's been a rise? A rise from where? We've already been at a very high level of violent extremism against particular groups, particular genders, particular interests for a long time. So I think what is happening now, perhaps, is that the genie has really been let out of the bottle, and others, if you like, the majority of the mainstream, have actually begun to understand the extent to which we are being manipulated by forces. Unfortunately, it is ridiculously easy and cheap to do this. We have to also get out of this idea that there is online and offline. We are not living separate lives. It's not like those Sliding Doors kind of films, right? We are living one life, and this technology has completely transformed it.

10:53 - Sara Pantuliano (Host) 

100%. I want to go back to Kenya. I mean, we know that digital spaces have allowed Kenyans to build new communities which, you know, transcend all ethnic and gender divisions. That has particularly been true for women and for people with disabilities. You explore this in your recent book, Digital Democracy, Analogue Politics, and you also talk about how Kenyan elites have tried to contain online activism. I mean, if you think about the 2017 elections, there was this reported recruitment of Cambridge Analytica by Uhuru Kenyatta, the then Kenyan president, which really contributed to tensions, and, in addition, of course, the failed digital vote-counting systems and other sources of fake news. Can you tell us more about that? That was an interesting example of where the two worlds really blended.

11:54 - Nanjala Nyabola (Guest) 

Yeah, I mean, I think that Kenya, really, is emblematic of the challenges that almost all the regions in the world are going to face because, again, as I said, the political system oscillates between huge gains in terms of democracy and in terms of openness, and then huge losses that can come almost immediately after, and that's really what the 2017 election represents. There has been, almost since the 1990s I would say, this cycle of booms and busts as far as democracy is concerned: the wins are big and the losses are just as big. And so what started with the promise of digital technologies was a space where people were able to really pay attention to the election in a way that was incredibly impressive, whether you're talking about voluntary election watching, you know, people going to their polling stations and taking pictures of results and posting them online in order to show that the results that were being posted on the website were inaccurate. So the government promised a digital election. By the time the election came around, about a third of the polling stations were going to vote manually. Even though they were in areas where there was mobile phone coverage (because most of the digital stuff in Kenya happens on mobile phones and GSM networks), the electoral commission said, you know, one third of these polling stations would not be voting electronically, and the response from the public was really a great deal of healthy scepticism. And so that's why you saw this wave of almost spontaneous, but also voluntary, action: people, on their own time, going to their polling station, standing in line, taking photographs, documenting, challenging, collecting, you know, taking information from the IEBC website and verifying it against the information that was happening in the polling station. These are the red flags that led people, the opposition especially, to challenge the final electoral outcome.

And it's the kind of participation that you want in a democracy. You want people to care enough about what's happening in the electoral system to feel like they can give up their free time and their leisure time in order to be part of the process, that they are holding officials accountable as taxpayers, as people who are part of the public sphere. And you mentioned the recruitment of Cambridge Analytica. That represents the response of power. Power always learns and power always adapts, and that is the main lesson that I wanted activists especially to take from that book: that democracy requires vigilance and there's no time for complacency, that the moment you become complacent, you become vulnerable, specifically to external factors, but also to people who are agnostic about democracy as a value but certainly have very strong feelings about profit. And this is the challenge that almost all countries in the global majority face: that those of us who are interested, at least in civil society or whatever, are trying to build democratic societies in a moment in which neoliberalism has really captured so many of the backstops that the global minority countries take for granted. So this was the situation in which Cambridge Analytica was introduced into the matrix.

Cambridge Analytica is a British company. It was licensed as a British company. It comprised British academics and American political consultants, and they were recruited by the former president to influence people's public perception of him and his candidature. This is the man who had been indicted at the International Criminal Court for crimes against humanity. It was on record, it was very much in the public discourse, and then, because of the combination of the capture of the local press and this massive spend on PR regionally and internationally, suddenly the narrative shifted. You know, it went from accountability, “don't be vague, go to The Hague”, to “the ICC is a tool for Western imperialism”. And that is probably one of the most successful PR campaigns. You know, I think Coca-Cola would be jealous of the successes that that campaign had in terms of shifting public perception. And so, you know, I want to complicate and trouble the narrative, and this is what I try to do in the book: the narrative that it was just the presence of the social media platforms that made this outcome possible.

16:46 - Sara Pantuliano (Host) 

Absolutely. And I want to go back to what you were saying about institutions and, you know, go back to Sandra. Given everything that Nanjala has said in the context of Kenya, which, as we know, applies to many other contexts, how can we build trust in our institutions in an age of fake news, misinformation and AI-led content, especially ahead of elections? I mean, what's the role of national and global policymakers, in addition to the social media channels themselves, in mitigating these effects?

17:28 - Sandra Pepera (Guest) 

The social media channels are not going to do this. You know, we have to understand, exactly as she described it, the sheer power of the profitability of the business model that these platforms have put in place, and, you know, the sort of clicking for profit around hate is the extreme cash machine, in some way, for these platforms. So I don't think we can rely on them to do anything, unless and until we're prepared to disrupt the business model, and there are people who are thinking about this. If you go back in history, for example in the United States, you'd be thinking about the antitrust and anti-cartel work that was done at the beginning of the 20th century to break them up, because for the longest time the internet, so to speak, has been sort of owned by, like, five or six people. In a way it's not a massively dispersed ownership model. Rather, as Nanjala has made the point, just as media outlets have become concentrated in the hands of a few people, the internet actually grew in that way.

I think tackling the business model is massively important. Yes, we will have to have a better mechanism or set of mechanisms that test the products and the software. I mean, you know, we don't allow cars and, I don't know, sewing machines and things to just be put on the market with no testing for harm and the possibility of danger. Shouldn't we be thinking a bit more about software in the same way? The ability to actually mislead people is unfortunately growing at pace and, as we've seen, fact-checking alone won't do it, not least because if you are using social cleavages, you need not be inaccurate. You need not actually be making an inaccurate statement to be manipulating people's feelings and norms and perceptions. That's one thing. The second thing is, we seem to be globally in a space where, if it's not a fact that I agree with, it cannot actually be a truth. It's my truth, everything is my truth. It's not a data point unless it's my data point, right? So we're in a situation, and this may be problematic for us as researchers, where presenting fact and data is in itself increasingly contested. It's not about the ideas anymore, it's about the actual data and the facts. So we are in this space where the challenges are great and growing.

One of the key things, though, I'm going to say to you, Sara, is that you were at UNGA just now, and there was the Summit of the Future and a Global Digital Compact and all these things which, at least nine months ago, or probably a year ago, we were all actually quite worried about, because the governance pieces that were being pushed by the voices that were driving this were not going to lead to an open internet which had accessibility, which allowed for the free flow of ideas, which gave space for freedom of expression, freedom of association and those things. So how do you see the outcomes of the recent Global Digital Compact and other things? How did that sit with you? What was the chatter in New York about this?

21:41 - Sara Pantuliano (Host) 

Well, it's interesting you say that, because I was just about to put one last question to all of you reflecting on some of that chatter and some of that discussion that didn't quite go as far as people were hoping. Because, yes, there was fear, but there was also hope that we could get towards some sort of global digital governance, which we didn't achieve, and so there was a lot of discussion on how we can, you know, reinforce measures to curb the influence of these networks and their owners, like Elon Musk, who will continue to influence public discourse in the UK, in Kenya, anywhere.

And so there was a lot of discussion about the need for a cross-border law regime to do this, some sort of global normative framework that could really regulate the digital space. Of course there was the fear that this could then turn into, you know, the stifling of the social media space, but, on the other hand, there is very much a recognition that some boundaries are needed and some level of regulation should be in place, especially when this allows the level of online violence, misogyny and whatnot that we have seen so far. So I was actually going to ask all of you, to conclude, as we need to wrap up: what do you think the value of such a cross-border law regime is? Should we push for that? Can different nations work together to achieve something that is balanced enough, and how could that be implemented, if it is desirable?

23:19 - Nanjala Nyabola (Guest) 

I think it's very important to tread carefully with the idea of a cross-border regime, and I think I understand the impetus behind it, because there is a great deal of threat that emerges; you know, the same mechanisms that make it easy for activists and feminists and, you know, people to find each other and organize for positive change also make it possible for people to find each other and organize to do crime. That is the unfortunate reality of the digital age. I still do urge caution, because one of the assumptions that is embedded in a lot of the political theory that we do in the world, and this includes in the digital space, in the human rights space, in international relations, is the assumption that the state is either a neutral or a positive actor. It's sort of a silent assumption, but it is present in the idea that we should create a tool or a platform or a space or whatever that gives the state more power in space XYZ, and for the vast majority of people who live in the world, that's just not an assumption that we can take for granted.

The Computer Misuse and Cybercrimes Act in Kenya was passed specifically to tackle not just cybercrimes in general but, you know, hate speech, misinformation, all of these things. I haven't run the exact numbers, but I've been monitoring it anecdotally, and I would say probably 98% of the time that it has been used, that a case has been brought and someone has been charged, it has been against someone who is criticizing the government, right? It's a blogger who said something a politician didn't like, and some of it is banal: one photographer was charged for publishing a photograph of a governor who was launching a bridge that was badly constructed. The bridge collapsed, the governor fell in the water, and the photographer took the picture and published it online, and he was charged with harassment, right? And it's like, well, you shouldn't build terrible bridges.

And these are the kinds of paradoxes that international rulemaking has to navigate: we don't want to create a digital regulatory framework that makes it easier to punish people who are pushing for greater accountability and transparency and openness from states. And I think this was the challenge: some of the documents that were on the table through the GA process didn't meet that standard. Really, a lot of authoritarian governments were salivating because they were excited about what power was on the table for them. And so, rather than push through an incomplete or poorly thought out document, let's take a step back. Do we need 197 countries, or would a regional approach work better? Does it make more sense to have an AU instrument, an EU instrument, and think about interoperability between those regional instruments that are more sensitive to local domestic realities? And I think those are the kinds of conversations that require a little bit more time than the deadlines that were tied to this particular GA made possible.

26:37 - Sara Pantuliano (Host) 

Those are very good points and exactly what Sandra was saying. You know, there was an element of relief, because many feared that the worst would come out of the process, rather than something that would, you know, improve the status quo. I think I'm going to use your expression that you have to think about laws with the worst of governments in mind, in the global normative framework, rather than the best ones. Sandra, anything to add to this?

27:04 - Sandra Pepera (Guest) 

I want to kind of reinforce that. I mean, it's clear the internet will not be better if only governments are in charge. It's just not going to happen that way, and certainly up until when I was with colleagues and we were thinking this through at NDI, you know we firmly believed that you had to have a multi-stakeholder framework for this. You know so the idea that you have yet another UN intergovernmental body that looks at this is absolutely dangerous to all of us, for all the reasons Nanjala said, and for many more.  

27:39 - Sara Pantuliano (Host) 

Thank you so much, Nanjala and Sandra, for this excellent conversation. I think the point you are making around the necessity for a multi-stakeholder framework for the digital space is a really important one. As we discussed, social media is highly commercialized, and it will focus on its survival by providing content that sells, which includes misinformation and disinformation, and so, as such, technology companies must be held to account by multiple parties to curb the spread of falsehoods and limit the impact that this has on our institutions. This cannot be exclusively governments' remit. If it's used right, social media has the potential to engage citizens in unprecedented ways, documenting their civic engagement in real time, collecting ideas and data to hold governmental institutions to account. But to achieve this, we must ensure that the environment for democracy remains conducive and provides people with access to various sources of information and opportunities so that they can interact with their elected leaders and their fellow citizens.

And social media should remain one of these channels. But we must make sure that we promote the skillset to use it wisely, that we tackle subconscious bias and that we have adequate accountability mechanisms in place to regulate its undue use. Thank you so much to our listeners for being with us today for this episode of Think Change. If you have any feedback, please make sure to let us know and do subscribe if you haven't already. Until next time.
