Sandra Peter and Kai Riemer
Protests, platforms and free speech on The Future, This Week
This week: protests, free speech and the responsibilities of social media platforms. Sandra Peter (Sydney Business Insights) and Kai Riemer (Digital Disruption Research Group) meet once a week to put their own spin on news that is impacting the future of business in The Future, This Week.
The stories this week
02:04 – Facebook chooses not to regulate online content
Other stories we bring up
Global #BlackLivesMatter protests across the UK, Germany, Italy, New Zealand, Canada, and more
33 former Facebook employees’ open letter in The New York Times
Zuckerberg defends hands-off approach to Trump’s posts on Facebook
The Twitter Trump repost experiment by @SuspendThePres
Snapchat stops promoting Trump’s account due to “racial violence”
Spotify, Apple Music, YouTube, and Amazon come out in support of #BlackoutTuesday
Black squares with hashtags meant to support #BlackLivesMatter overtake activist hashtags
30%-49% of people tweeting about the protests might be bots at any given time
K-pop fans are flooding extremist hashtags
You can subscribe to this podcast on iTunes, Spotify, Soundcloud, Stitcher, Libsyn, YouTube or wherever you get your podcasts. You can follow us online on Flipboard, Twitter, or sbi.sydney.edu.au.
Our theme music was composed and played by Linsey Pollak.
Send us your news ideas to sbi@sydney.edu.au.
Dr Sandra Peter is the Director of Sydney Executive Plus at the University of Sydney Business School. Her research and practice focuses on engaging with the future in productive ways, and the impact of emerging technologies on business and society.
Kai Riemer is Professor of Information Technology and Organisation, and Director of Sydney Executive Plus at the University of Sydney Business School. Kai's research interest is in Disruptive Technologies, Enterprise Social Media, Virtual Work, Collaborative Technologies and the Philosophy of Technology.
Share
We believe in open and honest access to knowledge. We use a Creative Commons Attribution NoDerivatives licence for our articles and podcasts, so you can republish them for free, online or in print.
Transcript
This transcript is the product of an artificial intelligence - human collaboration. Any mistakes are the human's fault. (Just saying. Accurately yours, AI)
Disclaimer We'd like to advise that the following program may contain real news, occasional philosophy and ideas that may offend some listeners.
Intro This is The Future, This Week. From Sydney Business Insights. I'm Sandra Peter and I'm Kai Riemer. Every week we get together and look at the news of the week. We discuss technology, the future of business, the weird and the wonderful and things that changed the world. Okay, let's start. Let's start!
Kai Today in The Future, This Week protests, free speech and the responsibilities of social media platforms.
Sandra I'm Sandra Peter. I'm the Director of Sydney Business Insights.
Kai I'm Kai Riemer, professor at the Business School and leader of the Digital Disruption Research Group.
Kai So, Sandra, what are we talking about today?
Sandra Well, there has been one thing that's been dominating the news other than the coronavirus, and that has been the protests associated with the Black Lives Matter movement. And they've started in the US, but they've now spread across the world.
Kai We've been looking for topics other than coronavirus for quite a while. We wish it was a different one, but this is one that we obviously have to deal with, and we thought about what do we do? Do we put out an empty episode, eight minutes, forty-six seconds long?
Sandra We could just observe eight minutes and forty-six seconds of silence, which is how long a police officer knelt on George Floyd's neck, which resulted in his death. But considering the role that social media platforms like Twitter and Facebook play in the aftermath of these events and during the ongoing protests, as we are recording this podcast, we thought we must discuss this.
Kai Yeah, you're right. And I think that should be the topic. The role of platforms in these public protests, the human rights movements off the back of this, but also more broadly misinformation and their role in upholding free speech, the democratic process. This is a complex matter, and let's try to unpack this.
Sandra Let's do it.
Sandra So Kai, what happened in The Future, This Week?
Kai Our main story, and there's many stories that we will bring up, but the main story we want to focus on comes from The Guardian, and it's titled "Zuckerberg says Facebook won't be arbiters of truth after Trump threat".
Sandra And just to make it clear, the Facebook story is one of many stories that have emerged around the role of platforms in the protests. We've had stories about Twitter, stories about Instagram, about Snapchat.
Kai There's been stories about music platforms joining the protest to Black Out Tuesday in honour of George Floyd's death and the Black Lives Matter movement.
Sandra So stories about Spotify, Apple Music, YouTube, Amazon.
Kai But they all go to the heart of the matter, which is what we want to focus on. What is the role of these platforms in propagating information around these topics? And there have been a number of issues around misinformation, and issues around inciting violence. Something we need to take a closer look at. So let's start from the beginning.
Sandra It's useful to place these events in a larger context. So the killing of George Floyd comes on top of years of cases of black people dying in police custody or dying at the hands of the police. This latest incident has led to protests that have started in Minneapolis.
Kai But we've seen them spread across the United States, but also around the world. There have been protests in solidarity in many European cities. There have been protests in large Australian cities, sometimes crossing over, raising issues in Australia as well with the indigenous population and similar issues of violence.
Sandra So across the world, we've seen protests condemning racism and police brutality. That's been the case in the UK, in Germany, Italy, Australia, New Zealand, as you've mentioned, Canada, France, Ireland, Denmark, the Netherlands.
Sandra And everywhere we must remember that social media platforms like Twitter and Facebook have played a central role in how we understand these protests to unfold. They have also played a large role in how the protests are being organised and how they are being represented, at the same time as being a place for misinformation and for glorifying violence.
Kai And of course, these events were further inflamed with, in particular, two tweets by President Donald Trump, where he suggested that he was about to dispatch military troops to disperse the protests.
Sandra And where he added that "when the looting starts, the shooting starts," which, of course, in the U.S. has a very racially charged history behind it going back to the civil rights era, when a white police chief threatened to crack down on protests.
Sandra What ensued was that different social media companies took different stances on this. On the one hand, Twitter has flagged the tweet for glorifying violence, so if you try to see it you will get a notification that says that this tweet violated the Twitter rules about glorifying violence. However, Twitter has determined that it may be in the public's interest for the tweet to remain accessible, and you actually have to press 'view' to get to the content of the tweet.
Kai And this, of course, follows Twitter's move just last week to flag one of Donald Trump's tweets around fraud in postal votes as basically misinformation, which at the time infuriated Donald Trump into signing an executive order that is threatening to take away some of the platform protections from social media companies, a move that many tech experts say doesn't have any teeth and will not pass muster in the courts. But it has ignited this controversy around the role of platforms. And then, of course, there's Facebook, which has taken a very different approach.
Sandra Facebook flat out refused to take down the inflammatory posts after CEO Mark Zuckerberg made it very clear that he does not believe that Facebook should be an arbiter of truth and said that his position on Trump's tweet is based on research and conversation. And whilst he expressed disgust with Trump's language, he concluded that Trump's tweets would not be read as a call for vigilante supporters to take justice into their own hands and promised to re-examine policies about the use of force.
Kai Now, we do not, of course, know who Mark Zuckerberg refers to when he says he had conversations around this or looked into research. What we do know is that many people disagreed with Zuckerberg's stance, including many of the Facebook employees who just this week staged a walkout from the company, much like we've seen with Silicon Valley companies before, such as Google back in the day, around working on controversial projects for the U.S. military. This time it was at Facebook, with the employees making it clear that many people do not agree with this stance.
Sandra And we should be clear here that this was a very different type of worker action. We've seen a lot of worker actions at big tech in previous years. We've seen Google walk out over sexual harassment, the protests at Amazon, at Microsoft, at Salesforce. This time, it was a virtual walkout, that means pretty much just not opening your Zoom and sending your calls to voicemail.
Kai Yeah, well, this raises some questions around how effective a walkout is when people are not actually walking out from a building into the street in front of cameras. You know, me not opening my Zoom in the morning, or walking from my living room to my bedroom, is not exactly the same. So what then is the role of platforms in this?
Sandra So on the one hand, they help us understand what is going on during the protest. So it's a way of getting information, people telling us what is happening in the street, people making sense of it. On the other hand, it's also a tool for mobilising action.
Sandra Many people in movements such as Black Lives Matter have used Twitter and have used Facebook to organise the protest to know where people are and to provide them with resources, with information.
Kai Which is an important point to understand, because on the one hand, these platforms are criticised for providing a platform to people who might be inciting violence. On the other hand, these very platforms are also used by the protesters, by civil rights movements, to self-organise and to mobilise people to help with the cause, which goes to show the platform's nature as a forum for all kinds of activities.
Sandra And we must remember that underpinning this model is the way these platforms make money. So Facebook, for instance, monetises attention on the platform. And the more inflammatory the content, the more outrageous the content, the more views it gets.
Kai And this is, of course, why these companies are often criticised because they make money from promoting divisive content often. What we want to focus on today is the controversy around what can or cannot be said on these platforms, the kind of information that is legitimately propagated on these platforms, for example, around inciting violence, as in the case of Donald Trump's tweets.
Sandra So this is not just about glorifying violence, but in particular, it's about what are the practicalities of policing content? How do we treat people equally on the platform? What constitutes freedom of speech on these platforms? And what is the disproportionate role that the CEOs still play in how these platforms are policed?
Kai Let's look at the practicalities first. So arguments, of course, have been made for a long time that these platforms needed to do a better job weeding out dangerous content, abusive content, misinformation, disinformation, fake news or those posts and users who tend to incite violence or propagate racist or otherwise hateful speech.
Sandra In this particular instance, the CEO of Facebook, Mark Zuckerberg, argued that on the one hand, yes, the post might be considered as glorifying violence, but that people do need to know if their government is planning to deploy force. So it becomes difficult to know which argument should trump in this case.
Kai And of course, there's always a grey area, you know, is something really misinformation or disinformation, or is it just misleading? Is something really meant to incite violence or is it only in certain contexts that it would be read this way? So, set aside the practicalities of determining clear cut "which is which" the question is also, what do you do with content that most people would think does incite violence, which might well be the case with some of Donald Trump's recent tweets.
Sandra So on the one hand, you could label the content like Twitter did. On the other hand, you could stop promoting the content altogether like Snapchat did, or you could potentially ban the account altogether and take the person off the platform, which is what has happened previously with other groups.
Kai Or you do nothing, which is also a choice, the choice that Facebook has made, to either say, you know, people deserve to know that this person has these views or to say we're just a platform, we're not arbiters of truth, we're just a neutral channel in which anyone can say anything because it's considered freedom of speech.
Sandra And this conversation is, of course, not new. We have spoken at length on a previous podcast that we will include in the shownotes about how difficult it is to identify whether or not a post complies with the terms and conditions of the platform.
Sandra So, for instance, Twitter decided last year that it would ban political advertising, but that put them in the position of deciding what is and what is not political. Identifying political issues in political ads is quite straightforward, but identifying whether or not certain conversations around certain legislative topics are or are not political ads is much more difficult.
Kai This could be issues like climate change, solar power, healthcare, immigration, taxes, national security, or indeed civil rights movements like Black Lives Matter, which are, of course, political.
Sandra But this issue is also not just about the content. It can also be about who has posted the content.
Kai There is always this argument around free speech and we should allow certain things on the platform, like Zuckerberg says around Donald Trump when he says something about deploying military force. But the question then is, is free speech really about what is said or is it not sometimes about who says it? So do the rules apply equally for anyone? Is one big question here.
Sandra And that's go back for a second to Donald Trump's tweet, because one of the arguments that Facebook has made was that it will refuse to take this down because it is Donald Trump, a president, who says it, it is newsworthy content.
Kai Which raises the question whether Donald Trump or people of power, people of public visibility, can get away with things that other people wouldn't.
Sandra And this was also pointed out in a public letter that was published in The New York Times, where 33 former employees at Facebook, including the ones who helped create Facebook's community guidelines, have said that it seems that today Facebook interprets freedom of expression to mean that they should do nothing to interfere with the political discourse and that they have decided that elected officials should be held to a lower standard than those that they govern. So that implies that one set of rules would apply for presidents or other politicians, and there would be different sets of rules that would apply to normal people.
Kai And interestingly, you found an experiment that put that hypothesis to the test.
Sandra Indeed, an article in MarketWatch reported on an experiment done by @SuspendThePres, which is a Twitter account that began copying Donald Trump's tweets verbatim and reposting them in an effort to find out whether they would also get a pass if they weren't coming from the president. And, a couple of days later, the account was actually suspended for glorifying violence, for the same tweet that we mentioned before that Donald Trump had put out around the looting and the shooting. It seems that Twitter will flag such tweets for violating rules about glorifying violence and will suspend accounts if they do not come from a person who holds public office.
Kai Which already poses some question marks around the idea that Facebook is not the arbiter of truth, or Facebook does not interfere in free speech or freedom of expression. But we also want to raise a further issue here, and that is around what counts as speech or what is speech. And interestingly, we've discussed this on the podcast before. In an episode in November last year, which was mostly around political advertising and the Democratic primary, but where we made the point that speech is more than what is being said. It also includes the audience.
Sandra And here's a bit of what we said.
Kai So Zuckerberg, in a conference call with journalists said, "in a democracy, I don't think it's right for private companies to censor politicians or the news." And that sounds like a lofty goal and a nice thing to say, to couch Facebook as the upholder of free speech and freedom of expression. But as the article in The New York Times put it, it's a little bit disingenuous in the first instance because Facebook is known to very much censor content in other countries. And the article mentions Pakistan, India, Turkey, where the governments have clear rules about what to censor, and Facebook happily complies with those rules to be able to trade in these countries. So to say we're all for freedom of speech is a little bit disingenuous. But we also wanted to take a bit of a look at what we actually mean by speech, because we tend to think about it exclusively in terms of what someone can say. And we can't stop people from saying what they want to say and forget the audience part of this.
Sandra For that, we would need to remember how does content and how do messages reach their audience on platforms like Twitter and Facebook? And the fact that this is done by an algorithm that for all intents and purposes, is opaque to the people who receive that message and to the person delivering that message as well. So when you say something on Twitter, when you post something on Facebook, the algorithm decides the optimal audience that your message should reach. And that is not optimised to ensure that your message reaches the people it needs to reach, but rather to drive engagement on the platform and indeed advertising spend.
Kai And I think this is a really important point. So a quick look at the dictionary tells us that "speech is a formal address or discourse delivered to an audience." We only tend to focus on one half of that definition, which is the delivery part or the content part. You know, we can express whatever we want, but forgetting that speech has an audience. And Facebook is very much involved in deciding who the audience is, who gets to see what message and to enable people to say different things to different audiences in the political discourse to pretty much manipulate them into voting for them, for example. So for Facebook to say that they provide a platform for free speech is an incomplete argument, problematic in so far as we tend to forget that they are the ones who control the audience part of that free speech equation.
Sandra And that analysis that we had back in November still holds true for the dispute today, where it is again a one-sided view of the freedom of speech argument.
Kai And so the audience part of the speech is really important here. Mark Zuckerberg seems to imply that when platforms interfere in free speech it's usually about censoring what can be said. But platforms are interfering in speech because they make decisions on who gets to see certain things. And we do know that algorithms capitalise on divisive content, on content that is more extreme because it gets more clicks. And so the algorithms that monetise content often privilege or prioritise more extreme content and therefore provide wider audiences for this content. So what platforms should do is make sure that while things can be said on a platform, that when something is misinformation or disinformation or controversial or of a potentially violence inciting nature that those algorithms do not unnecessarily propagate this content. The content might still be there, might still be seen by people who are following these accounts, but it might not also be promoted into the news feeds of those who do not already subscribe to an account of, for example, Donald Trump.
Sandra So in a way, through the very way these companies work, through their very business models, these companies are already arbiters of truth, whether or not they choose to flag content or block accounts.
Kai Which brings us to one more thing that we wanted to highlight about this topic.
Sandra Which is one that is often overshadowed by the free speech conversations that seem to dominate, which is the power of CEOs themselves. So people like Mark Zuckerberg and the power that they have in deciding what counts as free speech and even how we talk about it. The examples that are often given are Twitter and Facebook because of their prominent CEOs. But in our previous conversation, we had another example that really highlights the role that CEOs play in what truth looks like on these platforms.
Kai And the example that we had back then was CloudFlare, so let's have a listen.
Sandra Which brings us to one more thing we wanted to say about these stories, which I think is often overshadowed by the free speech conversations: the power that the CEOs of some of these companies have in deciding what counts as free speech and how we talk about it, and even what truth then looks like on these platforms. And our examples are always Twitter and Facebook, and Mark Zuckerberg is the one who always gets held up as the example for this, so we thought we'd bring in another example.
Kai And that example is a company called CloudFlare. CloudFlare provides infrastructure for the content that we see on the Internet. It basically sits between the actual servers that host the content and our browsers. And it is responsible for about 10 per cent of content distribution in the US, so it has a very key position in making sure that we all can see the websites that we want at a certain speed; it provides an important back-end service. So it's in a key position to control who gets to see what on the Internet, and that's the important bit here.
Sandra And last year, one fine morning in August, its CEO, Matthew Prince, sent an internal company email in which he had decided to physically kick some people off the Internet. And I quote from the internal email, "My rationale for making this decision was simple. The people behind the Daily Stormer are arseholes, and I've had enough." Prince wrote, "Let me be clear, this was an arbitrary decision," he continued, "I woke up this morning in a bad mood and decided to kick them off the Internet. It was a decision I could make because I'm the CEO of a major Internet infrastructure company."
Kai So we're talking about one guy in a bad mood one morning banning the Internet content of an entire group, in this case unashamed white supremacists, off the Internet. And while many people would and have applauded this decision, it goes to highlight how much power certain individuals in key positions providing the infrastructure for our daily conversations online hold in making such decisions. And of course, it's Zuckerberg, it's Dorsey, but it's also people that most of us have never heard of.
Sandra Who can just wake up one morning and decide to kick someone off the Internet because they were in a bad mood. And if it's a far right neo-Nazi group, white supremacist group, no one thinks twice about that decision. It still highlights that this can be an arbitrary decision that can be made overnight by a few people in the world.
Kai And so for a person like Mark Zuckerberg to say that Facebook is the defender of free speech and freedom of expression is problematic not only because he seems to be the person who can make that decision, but also because the platforms are not designed to actually deliver a level playing field for freedom of expression to occur, as we've discussed with the audience aspect of free speech.
Sandra But we don't want to end its podcast without discussing what can be done and how we can manage this better.
Kai So on the one hand, there is, of course, labelling tweets as Twitter has now shown: you're not actually censoring content, you don't take down the content, but you call out clear misinformation or disinformation by putting a fact-checking label onto the tweet, or you hide or mark content that incites violence as such. So Twitter really has been a leader here, not just doing this for any content, but also for powerful people like Donald Trump.
Sandra Another way to do this is to stop promoting certain accounts, and this is what Snapchat has done. Snapchat has stopped promoting Donald Trump's account due to concerns about "racial violence", saying that it will no longer feature in the app's Discover section and that they will not contribute to amplifying voices that incite racial violence and injustice.
Kai Which goes to the heart of the matter that we just discussed, so Snapchat realises that they have a role in deciding audiences for content that is being posted and they have decided not to promote certain content, something that Facebook and other companies could do. For example, you could combine the labelling with the promoting aspect by deciding that anything that gets labelled as potentially misinformation, inciting violence, is excluded from algorithmic propagation.
Sandra And last but not least, is what the rest of us can do. For instance, considering very carefully what the effects of our actions are online. On Instagram, there was a movement to post black squares in support of the Black Lives Matter movement. Early this Tuesday, social media users began posting these black box images with hashtags meant to support the movement. But they ended up drowning out the real conversation that was happening around that hashtag. So by the same evening, people were panicking that the Black Lives Matter hashtag was suddenly flooded with thousands and thousands of empty images.
Kai Which wasn't really helpful because many users used pages that aggregate posts for certain hashtags as resources to organise their movements.
Sandra On a slightly more positive note, K-pop fans have been flooding extremist hashtags with memes, GIFs and videos of their favourite K-pop artists.
Kai And then, of course, there are employees who organise to stage walkouts or exercise their rights to freedom of expression by quitting their jobs and calling out their employers for not taking action, such as in the case of Facebook.
Sandra And last but not least, we might want to confront our own biases in this case and think about how we interact with the world and of course, to remember to take part in the democratic process by taking the time to think what we're voting for.
Kai And this is where we want to leave it. This is, of course, not the end of this topic; as we're already continuing conversations from previous episodes, it is likely that we will come back to this in the future.
Kai See you soon.
Sandra On the future….
Kai Next week.
Sandra This week?
Kai Yes, but next week.
Sandra On Future, This Week. Next week. Thanks for listening.
Kai Thanks for listening.
Outro This was The Future, This Week made possible by the Sydney Business Insights team and members of the Digital Disruption Research Group. And every week right here with us, our sound editor Megan Wedge, who makes us sound good, and keeps us honest. Our theme music was composed and played live on a set of garden hoses by Linsey Pollak. You can subscribe to this podcast on iTunes, Stitcher, Spotify, YouTube, SoundCloud or wherever you get your podcasts. You can follow us online on Flipboard, Twitter or sbi.sydney.edu.au. If you have any news that you want us to discuss, please send it to sbi@sydney.edu.au.
Close transcript