Sandra Peter and Kai Riemer
The Social Dilemma and platforms on The Future, This Week
This week: The Social Dilemma, our first movie review on The Future, This Week. We bring in everything we’ve learnt about social media over the past three years.
Sandra Peter (Sydney Business Insights) and Kai Riemer (Digital Disruption Research Group) meet once a week to put their own spin on news that is impacting the future of business in The Future, This Week.
The stories this week
04:41 – The Social Dilemma (2020)
Other stories we bring up
Twitter’s photo preview algorithm seems to favour white faces
3 million+ users have registered to vote on Facebook, Instagram, Messenger and Snapchat
Microsoft gets exclusive license for OpenAI’s GPT-3
Microsoft president Brad Smith advocates for ethical AI
Facebook whistleblower claims tech giant ignored political manipulation
Facebook just invented … Facebook (Campus)
Disinformation campaigns used to require a lot of human effort, but in comes AI
The New York Times’ review of ‘The Social Dilemma’
Slate’s review of ‘The Social Dilemma’
Wired’s review of ‘The Social Dilemma’
The Verge’s review of ‘The Social Dilemma’
The Guardian’s review of ‘The Social Dilemma’
Our previous interview with Cathy O'Neil, author of 'Weapons of Math Destruction' on Sydney Business Insights
Our previous interview with Jonathan Haidt, author of ‘The Coddling of the American Mind: How Good Intentions and Bad Ideas Are Setting Up a Generation for Failure’ on Sydney Business Insights
Our previous discussion of platform competition on The Future, This Week
Our previous discussion of economic incentives on The Future, This Week
Our previous discussion of capitalism in crisis on The Future, This Week
Our response to the ACCC report into digital platforms
Follow the show on Apple Podcasts, Spotify, Overcast, Google Podcasts, Pocket Casts or wherever you get your podcasts. You can follow Sydney Business Insights on Flipboard, LinkedIn, Twitter and WeChat to keep updated with our latest insights.
Our theme music was composed and played by Linsey Pollak.
Send us your news ideas to sbi@sydney.edu.au.
Dr Sandra Peter is the Director of Sydney Executive Plus at the University of Sydney Business School. Her research and practice focus on engaging with the future in productive ways, and the impact of emerging technologies on business and society.
Kai Riemer is Professor of Information Technology and Organisation, and Director of Sydney Executive Plus at the University of Sydney Business School. Kai's research interest is in Disruptive Technologies, Enterprise Social Media, Virtual Work, Collaborative Technologies and the Philosophy of Technology.
We believe in open and honest access to knowledge. We use a Creative Commons Attribution NoDerivatives licence for our articles and podcasts, so you can republish them for free, online or in print.
Transcript
Disclaimer: We'd like to advise that the following program may contain real news, occasional philosophy and ideas that may offend some listeners.
Sandra So what shall we talk about this week?
Kai There's a whole bunch of stories around data, social media and AI.
Sandra Yeah, for example, Axios reported that in the US about 3 million voters have already registered through social media: about two and a half million on Facebook and Instagram, and almost another million via Snapchat. And the interesting thing here is that the social media platforms have such broad reach that they're proving very effective at engaging voters, especially young people, and getting them registered to vote. Obviously not an issue here in Australia, since we all get to vote, all the time.
Kai Because voting is compulsory. So that's a really good use of social media. But the dark side of social media has been in the news again, because Twitter has problems with its photo-cropping algorithm, which seems to favour white faces over black faces. Users have discovered this and have posted quite a few examples. One of the most popular ones is a user posting what he calls a 'horrible experiment', whereby the Twitter algorithm cropped Barack Obama out of a photo that showed him and Mitch McConnell. This algorithm is supposed to help people when they post photos, by showing only what is relevant in the photo. And oopsies. That's not good.
Sandra And in a related story, it seems that the Zoom algorithm that creates those interesting virtual backgrounds also sometimes struggles with people who have darker skin tones.
Kai Yeah, showing just empty shirts without their faces. So quite upsetting, again, showing that these machine learning algorithms that are trained on loads and loads of data can get it wrong when the training data has racial biases, and more white faces than black faces for example.
Sandra There was also an update on our GPT-3 saga. The Atlantic points out that the supply of this kind of information will soon be infinite, as the algorithm we discussed last week, the one that seemed to have written an op-ed for The Guardian, is about to be released onto the World Wide Web. And now it seems that Microsoft has got an exclusive license for OpenAI's GPT-3 model.
Kai So does that mean that Microsoft will control who gets access to GPT-3, and who will do what with it? At this point, it seems that's what it means. Well, that might actually be good news, given that Microsoft president Brad Smith has been touring the world outlining how Microsoft sees itself as an ethical force in the use of AI, and that they try to employ ethical principles. So maybe that is actually good news. Maybe they'll keep an eye on who can do what with GPT-3.
Sandra For now it remains to be seen, as the word from Microsoft was that it will "create new solutions that harness the power of natural language generation," whatever that might mean.
Kai And here's another social media story. Apparently, "Facebook just invented... Facebook".
Sandra I agree, that was an excellent headline from the MIT Tech Review, which pointed out that Facebook Campus, a service that aims to help college students find each other on campus, is just coming online. That is pretty much what Facebook was back in the day. So indeed, Facebook just invented Facebook.
Kai So that's a trip down memory lane, to 15 years ago or so when Facebook was The Facebook, and pretty much exactly what they're now relaunching: a platform exclusively for college students.
Sandra So you need a college email address, and it's meant to help students connect with classmates over shared interests, and... wait, we've seen this one before.
Kai Yeah, and probably also a move by Facebook to re-engage Gen Z, who have, you know, moved on to other platforms: Snapchat, TikTok, you name it.
Sandra In this flurry of articles around social media and algorithms, we must point to all the reviews we've seen for The Social Dilemma. This is, of course, the new documentary on Netflix, which has been in the top 10 in Australia and in the US over the past couple of weeks.
Kai So we've decided we'd do something different. We have mentioned movies before, but we have never done a movie review for an episode. But this one is so important to what we're talking about on this podcast.
Sandra So we're gonna give that a go today, the first movie review on The Future, This Week.
Kai Let's do it.
Sandra This is The Future, This Week from Sydney Business Insights. I'm Sandra Peter.
Kai And I'm Kai Riemer. Every week we sit down to rethink and unlearn trends in technology and business.
Sandra We discuss the news of the week, question the obvious, explore the weird and the wonderful and things that change the world.
Kai So Sandra, what happened in the future this week?
Sandra We both watched Netflix this week.
Kai Most definitely.
Sandra Yeah, we watched The Social Dilemma, which has been in the top 10 in Australia for a while, and likewise in the US. It's a 2020 documentary/drama directed by Jeff Orlowski.
Kai And it has also been all over the news, there have been a fair few reviews that both of us have read and we thought, hmm, we should talk about this because there's some really harsh reviews for what we both thought was actually quite a well-made documentary.
Sandra Yeah, so the documentary explores the impact of social media, and its underlying algorithms and business models, on society. And we should say from the very beginning that we're quite sympathetic to the topics the movie raises. We've been talking about the issues raised in this documentary for years now. We covered things like misinformation, disinformation and political polarisation almost three years ago, when we talked about the Cambridge Analytica scandal. We've interviewed many of the people featured in the documentary.
Kai Yeah, for example, Cathy O'Neil, the author of Weapons of Math Destruction. We had her on The Future, This Week. And you also interviewed Jonathan Haidt.
Sandra Who's a professor at NYU, and who wrote, among other books, The Coddling of the American Mind, looking, among other things, at the effects that social media, together with other societal changes, has on Gen Zed, or Gen Zee.
Kai And we've talked about issues such as privacy and bias in algorithms, how social media monetises its content, and the kind of unintended consequences this has had. So across a number of episodes, we've engaged with the content of the movie. So we thought we'd unpack this: engage with the criticism a little bit, look at what might be unfair critique, but also at where the movie could have gone further. And before we start, a spoiler alert, because we might bring up certain things from the movie. So if you haven't watched it, go and watch it; we think it's a good idea to have seen it.
Sandra So let's give a short synopsis of what the documentary is about.
Kai So the movie features a number of experts from inside big tech companies such as Facebook, Google and Instagram, who were involved in building these algorithms, alongside experts such as Jonathan Haidt and Cathy O'Neil, but also Shoshana Zuboff and Jaron Lanier, who for years have been vocal in the media calling out some of the consequences that social media has for society. And on the other hand, there's a hypothetical family, a dramatization to illustrate some of the effects that social media has on people's daily lives.
Sandra The movie calls out a number of problems. Some have to do with design ethics: things like addiction, body image, bullying and data privacy. Others have to do with algorithms. And many of them are underpinned by these companies' business models, which create bubbles and echo chambers that lead to political polarisation, to the proliferation of extremist content, and to misinformation and disinformation.
Kai And so we get to see how this family struggles with some of these issues. And one thing the movie does really well is bring in the very people who helped build some of these technologies. So we thought it's a good reminder to give a bit of the history of how we got there.
Sandra And one of the good things the movie achieves is to give enough of the history and background of companies like Facebook to put what happens today into context. It's not that these companies set out to build mass platforms for misinformation, or platforms for addiction. But historically, the way the companies evolved led them to a point where a focus purely on growth, engagement and monetisation traps them in a cycle that reinforces the problems we always discuss with these platforms.
Kai Which is still true to a certain extent, but misses by a long mile the fact that Facebook today is predominantly an advertising company that makes its money from corporations who advertise on the platform. So let's take a look at how we got there. There are really four big developments that have happened since Facebook first launched. Facebook launched in 2004, as we heard earlier in the Facebook Campus story, as a service for college students in the US. It was around 2006 that it opened up to the general public, so that anyone could join and then connect with and make friends on the platform. At that point, you would basically follow people and read their messages in what was then the early newsfeed. But since then, four really important developments have happened that are all featured in the movie. The first is the invention of the Like button in 2009. And we learn in the movie that the intention behind this was really to spread positivity, to give people a chance to value contributions by other members. But the designers at the network learned pretty quickly that from this they could also glean people's preferences, and learn what people are into, what their profile really is.
Sandra And it turns out that is one way you can make money on a social media platform: by selling that to advertisers.
Kai And that's the second big development: monetisation, which started around 2012 when Facebook had its IPO and, for the first time, actually had to make money to please shareholders. Around that time, it started to let companies put advertisements into the newsfeeds of users. And that turned out to be not only lucrative, but an opportunity, because you could target those advertisements to the preferences users had disclosed by using the Like button.
Sandra So what then happens is that Facebook and other social media companies, because Facebook is not the only one that does this, get trapped in a cycle where they have to constantly grow the number of users on the platform and increase engagement, that is, increase the time users spend on the platform, in order to increasingly monetise their activities there.
Kai And that's really the third thing: Facebook started optimising for engagement. They brought in emoticons for the reactions, and in about 2015 started to pull in news content from news organisations, making the platform stickier, giving users more to engage with and more reason to spend time on the platform. But it was the fourth move that really got them to where they are today, and which is also at the heart of many of the problems. And that is the creation of the algorithm that organises the newsfeed, the one that decides what content people get to see. They worked hard on this, with lots of modifications around 2015 and 2016 in the lead-up to the US elections, and it led to the kinds of problems that have emerged since then, the ones at the heart of this movie.
Sandra And we need to make it clear again here that it's not just the algorithm itself that creates these problems. But it's the algorithm in conjunction with the fact that all these activities online had to be monetised.
Kai Exactly. It's the combination of using the algorithm to engage people to keep them on the platform for longer, and to decide how this engagement can then be monetised for the benefit of the advertisers.
Sandra So the first big category that the movie takes issue with is things to do with design ethics, with how the four things that you've just mentioned were implemented to make the platform stickier and stickier. And the film interestingly notes that people who are addicted to drugs and people who use social media are both referred to as 'users'.
Kai Which is an interesting observation, because we also learn that many of the designers inside Facebook and Google have actively engaged with a field called persuasive computing, which merges insights from computer science with psychology and neuroscience around how technology can be made more engaging, more addictive. Research, incidentally, that has also been used in the design of slot machines.
Sandra So besides the constant stream of dopamine hits we get every time our phone buzzes in our pocket, which has been designed to make us addicted to our phones, the movie also focuses on design ethics to do with data privacy. And we've covered that in numerous previous episodes of The Future, This Week: from the Cambridge Analytica scandal, to how we're being tracked on the internet, to how companies are selling, or monetising, our privacy.
Kai And that builds the bridge to the business model, because the algorithms at the heart of driving engagement, of giving users the kind of content that keeps them engaged, are of course also the algorithms that collect more and more data about the users, data that can then be monetised. So what has really happened is that as the business model of these platforms crystallised around creating these data-based advertising services, the algorithms started optimising for this. And Facebook started to actively collect more and more data about its users, using, for example, Login with Facebook to follow users around the internet during their day-to-day activities, harvesting as much data as possible about each user.
Sandra And at this point we should say that as we were watching this documentary, we really had a déjà vu moment, because among the people in The Social Dilemma was Cathy O'Neil, who's the author of Weapons of Math Destruction, and who was a guest on this podcast almost two and a half years ago, speaking about this very aspect. So let's hear a bit from that interview with her, back in July 2018.
Cathy O'Neil Whoever builds the algorithm chooses the definition of success and chooses the way to train their data. So for example, when Facebook optimises its newsfeed algorithm, what it's really doing is choosing a definition of success. And that definition of success is you stay on Facebook longer. And when you think about what that looks like, staying on Facebook longer, engaging more with more articles, getting addicted to the experience of Facebook, all of that, of course, is highly correlated with their profit. That's why they choose that. So they choose to optimise to a success which is correlated to profit. But what it also does, is it privileges outrage, so you're going to more likely stay on Facebook, if you get into a really long, difficult conversation with someone about an outraged topic. In particular, it doesn't optimise to truth, it doesn't optimise to short efficient conversations, it doesn't optimise to all sorts of things that we actually would prefer over engagement and outrage. So when we give Facebook the power to secretly choose a definition of success, and without conferring with us, we're giving them power over us and our attention. And that's just one example of literally every algorithm.
Kai So what Cathy really outlines here is how optimising for engagement, which incidentally leads to addiction, and optimising for monetisation, which is at the heart of Facebook's business model, leads to the kind of polarisation, the echo chamber effect, and the proliferation of more and more extremist content on Facebook. Precisely because it not only helps engagement, but also further segments users, which in turn benefits how they can be targeted through advertising.
Sandra And enter here Shoshana Zuboff, who we've also mentioned on The Future, This Week, when we spoke about the crisis of capitalism, and we'll put all the episodes and all the links in the shownotes. Zuboff of course coined the term 'surveillance capitalism': the way in which data about our collective behaviours is monetised to basically commodify our everyday behaviour on these digital platforms, not just Facebook but also Amazon and Google, and to sell access to these behaviours and manipulate them for the financial gain of these organisations.
Kai And while this is a subtle point, I think it's an important one, because companies like Google and Facebook do not sell our data. They also don't just sell advertisements. What they sell, what the product is, is what Jaron Lanier in the movie calls the gradual, slight and imperceptible change in our own behaviour and perception. So these platforms have gone one step further. They use their algorithms to manipulate us into not only believing things, but also doing certain things, such as clicking on ads, engaging with content and buying products.
Sandra So this is what Zuboff terms the 'behavioural surplus'. That is, all the information about what we do and how we think, which can then be traded to create markets that are based not on producing this information, but rather on predicting, or indeed producing, our behaviours. So what is crucially different here is that this is not simply about exploiting our data, mining our data or stripping us of our data; it goes directly towards shaping, directing or controlling our activities online.
Kai And it's here that we need to give credit to the movie, because it does a really good job showing, much like Zuboff has argued in her book, how these platforms gradually discovered what they can do with these algorithms. There wasn't a grand master plan. It was by iterating forward, doing a lot of experiments with, you know, little changes and tweaks to the platform, that they discovered how they could not only glean more data from users and keep them engaged for longer, but nudge them in certain directions, so that they could predict what we would do on these platforms and then sell those predictions to advertisers. So it's really an emergent phenomenon that is at the heart not only of the business model, but also of the many unintended consequences, such as polarisation, the fact that misinformation and fake news content does much better at engaging users on the platform, and all the kinds of problems the movie highlights.
Sandra And as we said, we are sympathetic to the movie and what it does. But we do want to point out some things that could have been done better. And here we diverge from many of the reviews of the documentary we've seen online. The first thing we need to bring up, and we've never had one of these before, is the dramatization. And the issue we take there is not, as The Verge put it, the "after-school special-style dramatization of what happens when Johnny and Janey scroll through feeds all day".
Kai So many of the critics took issue with how this family was portrayed in the movie. We think that's actually a strong point, because it makes some of this content accessible. We take issue instead with the way the technology side is portrayed in the movie.
Sandra So Vincent Kartheiser plays all three sides of the algorithm, and he of course played a manipulative marketer in Mad Men. Here he is the face of the algorithm behind the scenes, controlling and responding to every attempt the humans make to distance themselves from the technology. And whilst this helps us understand what the algorithm actually does, and we empathise with the fact that this is really, really hard to portray, it also inadvertently gives the algorithm a face and humanises it, so that it now seems to have a mind of its own, one that could be changed.
Kai So of course Jeff Orlowski, the director, faced what he called the 'challenge of visualising the invisible', what the algorithm does. But in the process, we think he overstepped a little bit, because now there's the actor arguing with himself about, you know, what should we show? And at some point he almost has a moment of introspection: is this actually a good thing, what we're doing here? So it almost suggests that the algorithm consciously does what it does, and could have the capacity to step back and change. And that also suggests that, you know, we could potentially just tweak the algorithm and instil some ethics to make things better.
Sandra It also seems, in a way, to remove the algorithm from the control of the organisation, and to make the algorithm accountable rather than the organisation behind it. We never actually get to see the social media company behind the algorithm.
Kai And we've called this out on the podcast a number of times: the fact that these companies want to deflect responsibility and blame their algorithms, which is exactly what this way of portraying the algorithm invites.
Sandra And speaking of the company behind it, it was a bit weird that at the end of this movie about addiction and how we're nudged into more and more engagement, Netflix suggested a whole range of other content to me, and indeed started playing trailers for other things I should be watching.
Kai Yeah, so a similar recommendation algorithm wanted to keep you engaged, now with Netflix.
Sandra But back to our movie review. The other thing a lot of the internet took issue with was the range of experts used in the movie. There were a number of articles that raised the issue that the people who helped create the Like button, or create many of the strategies that are now being called out, are featured in the movie, while only a small range of outside experts like Jonathan Haidt or Shoshana Zuboff appear, and many critical internet and media scholars, like Safiya Noble and Sarah Roberts, were not included or interviewed.
Kai So for example, The Guardian took issue with the fact that the movie draws a lot of its insights from what they call people who built the addiction machines of social media, but have now repented, and now talk openly about their feelings of guilt about the harms they inadvertently inflicted on society.
Sandra And here, while we know that there are many, many people out there, including some of our colleagues here at the Business School, who do work on the nefarious effects of algorithms and social media, we do see an important role for people from the industry speaking up on this. First, it does help convince people, or make them understand, what is going on within these organisations. Second, as researchers we know how difficult it is to get to the data, the algorithms and the practices within many of these quite secretive organisations. Hence it often takes someone who has worked on these things inside to fully reveal their inner workings, rather than just the nefarious effects they have on consumers and society.
Kai And incidentally, in a world where the same social media has discredited experts and researchers and their opinions in conversations around climate change and what have you, those researchers' opinions might easily be dismissed. But when the very people who built the technology, people who, you know, invented the Like button or were in charge of monetisation at Facebook, bring their views to the movie, those insights are much harder to dismiss, because those people know the inner workings of the machines they created.
Sandra And incidentally, even this week in the news there was another former Facebook employee, Sophie Zhang, who in a six-and-a-half-thousand-word exposé talked about how she feels she has blood on her hands because she was part of a machine that failed to stop political meddling in elections around the world; how Facebook ignored evidence of fake accounts on its platform and of practices that were disrupting political events around the world; and how the actions she has taken, or rather not taken, have affected national presidents.
Kai But there's one point on which we agree with the critics, and that is that the movie concentrates very much on the role of social media in creating the societal ills it portrays, rather than taking a look at the bigger picture and the context in which social media operates.
Sandra And indeed, many articles have already pointed out that the documentary seems to ignore the world out there: the fact that social and economic inequality has continued to widen over the last 10 years, and that many people today are deeply sceptical towards those in power, towards international institutions, or towards scientists. And that has created the background against which many of these conspiracy theories, and much of this misinformation and disinformation, could take hold.
Kai For example, polarisation happens on cable television as well. It is just that social media, with its algorithms, is a very powerful accelerant, an amplifier of these effects.
Sandra And also not the only culprit. Let's remember that WhatsApp has equally spread misinformation, which inspired lynchings in India, that many far-right killers were incubated on forums like 4chan, and that there is a range of extremist websites that are not necessarily part of social networks.
Kai And there are, of course, the technologies we mentioned earlier, like GPT-3, that contribute to the creation and spread of fake news and misinformation.
Sandra And of course there are both state and private actors who might have political interests in weaponising some of these platforms.
Kai And another thing that is missing is what Jonathan Haidt, who you interviewed in July last year, mentions: how social media has changed the way we communicate in society, how we role-play, how it has warped the way in which we engage.
Sandra And here's a clip from that interview.
Jonathan Haidt Social media, I think needs to be separated from the internet, because social media connects people not in ways that encourage authentic communication. But in ways that put everyone on notice that what I say to you doesn't actually really matter very much how you react, you're just one person, what I say to you is really guided much more by what thousands of other unseen people will think. And this changes the nature of communication, it makes us much more fearful. It makes us much more concerned with display. And it therefore warps conversations, it's always been hard to find the truth. But in the age of social media, I'm almost in a state of despair, it's very hard to find the truth now.
Sandra And here's the point where it should become really obvious, to anyone who's seen the documentary or is listening to this episode of The Future, This Week, that the problems associated with social media are really, really complex. One of the things we could hurl at The Social Dilemma at this point is that even for the two of us, who are very knowledgeable about the topic, it's really difficult to summarise in one sentence what the problem with social media is; there seem to be so many of them. Technology seems to have gone wrong in so many ways, and there are so many ways we could attempt to fix it. And indeed, the end of the documentary has a laundry list: turning off your notifications, trying to engage more with people in real life, inventing more technology to fix this, or indeed regulating the social media giants.
Kai But it misses the one thing that is actually at the heart of this problem. And that is: if we want meaningful change, we have to change the business model. We need to understand that we're not dealing with a social media company that might not have enough competition, from TikTok or otherwise. We're actually dealing with an advertising company that is in fierce competition with other advertising companies, such as Google, and the very nature of those operations drives the effects we're talking about. If we want meaningful change, we would have to change the way in which monetisation drives and shapes interactions on these platforms.
Sandra And again, it's the same issue we've seen in Australia with the ACCC's report on digital platforms last year, where unfortunately neither regulators nor the public seem to have an awareness of how these platforms fundamentally operate, or of the fact that they are fundamentally advertising companies. All the recommendations, both here in Australia and in the US, seem to accept those business models and suggest that we tinker with things like data privacy or the amount of time we spend on the platforms, rather than challenging the business models themselves. And by proposing measures that merely monitor or curb these companies' activities, we tacitly accept that this is the only way for companies that have now become infrastructures to operate, and we give legitimacy to the business models underpinning these problems.
Kai And so, in the absence of anyone taking on the business models as such, we are left with the useful recommendations that The Social Dilemma gives in its end credits. That is, things individuals can do to remove themselves from the worst harms these platforms can do to our daily lives.
Sandra So we want to end this in the spirit of the documentary by adding our own recommendations to the list.
Kai And so the first one is to bring more diversity into the teams that design these kinds of technologies. That point has been made before: if the original design teams at Facebook had included people who had been exposed to cyberbullying or hate speech online, rather than almost exclusively white males in their twenties, then maybe many of the unintended effects could have been avoided.
Sandra And the second is to echo something that Jeff Orlowski, the director, has said in an interview, where he did acknowledge that platforms need to change their business models and build in different incentives. That is, we need to rebalance the value that derives from these platforms for all stakeholders: not just for advertisers or shareholders, but also for users, for media companies, and for democratic discourse, as these companies provide a vital infrastructure for all of these things.
Kai And that's where we want to leave it for this week. Now, please go away and feed more algorithms with your recommendations for The Future, This Week if you liked this episode, or tell your friends.
Sandra Until next week, thanks for listening.
Kai Thanks for listening.
Megan Sandra Peter is the Director of Sydney Business Insights. Kai Riemer is Professor of Information Technology and Organisation here at the University of Sydney Business School.
Sandra With us every week is our sound editor Megan Wedge.
Kai And our theme music was played live on a set of garden hoses by Linsey Pollak.
Sandra You can subscribe to The Future, This Week wherever you get your podcasts.
Kai If you have any weird and wonderful topics for us, send them to sbi@sydney.edu.au.
Sandra Should I bring up the Kim Kardashian story?
Kai What's that about?
Sandra She's temporarily leaving Facebook and Instagram to support the Stop Hate for Profit campaign.
Kai How long?
Sandra For Wednesday.
Kai One day?
Sandra One day.
Kai No, we're not bringing that one up.
Sandra "Please join me tomorrow when I will be freezing my Instagram and Facebook account to tell Facebook to stop hate for profit".
Kai Creating a better world by leaving Facebook for one day. Yeah, no. Let's leave that one.