This week: fake fact-checking videos take disinformation to a different level. For MisinfoDay we talk to expert Jevin West from the University of Washington’s Center for an Informed Public.

Sandra Peter (Sydney Business Insights) and Kai Riemer (Digital Futures Research Group) meet once a week to put their own spin on news that is impacting the future of business in The Future, This Week.

Our guest this week

Associate Professor Jevin West

MisinfoDay – 15 March 2022

Resources from the Center for an Informed Public

The story this week

00:59 – Fake fact-checking videos are being used to spread disinformation

Our previous discussions with Jevin on calling bullshit and algorithms and social media disinformation in the 2020 US elections

Videos with false claims about the crisis go viral

Groups are urging people to leave Google, Yandex or other reviews on restaurants, shops and tourist sites in Russia to counter misinformation

McDonald’s, Coca-Cola and Starbucks halt Russian sales


Follow the show on Apple Podcasts, Spotify, Overcast, Google Podcasts, Pocket Casts or wherever you get your podcasts. You can follow Sydney Business Insights on Flipboard, LinkedIn, Twitter and WeChat to keep updated with our latest insights.

Send us your news ideas to sbi@sydney.edu.au.

Music by Cinephonix.

Disclaimer We'd like to advise that the following program may contain real news, occasional philosophy and ideas that may offend some listeners.

Sandra In Australia, the floods have been declared a national emergency. The conflict in Ukraine still rages on, and many businesses, McDonald's, Coca-Cola, Starbucks, Universal Music, Unilever, L'Oréal, you name it, have stopped their operations in Russia.

Kai And another special day is coming up, Misinformation Day on the 15th of March. And again, we have just the right person to talk about it.

Sandra Because Misinformation Day, the annual event, was started by the Center for an Informed Public at the University of Washington.

Kai Jevin West is the co-founder. So let's do this.

Sandra Let's do this.

Intro From The University of Sydney Business School, this is Sydney Business Insights, an initiative that explores the future of business. And you're listening to The Future, This Week, where Sandra Peter and Kai Riemer sit down every week to rethink trends in technology and business.

Jevin West My name is Jevin West, I'm an Associate Professor at the Information School at the University of Washington, and I co-founded what's called the Center for an Informed Public where we spend day and night looking at misinformation/disinformation campaigns, finding ways to intervene, trying to understand the ways in which misinformation is amplified through its actors and processes that sort of spread these ideas. But we also spend a lot of time on policy and community engagement around the issue and stuff. That's why we're talking today.

Sandra Jevin, great to have you back.

Jevin West Thanks for having me again, this is a fun place to be.

Sandra Today we're gonna talk about a new type of misinformation that has to do with the Ukrainian conflict: fake fact-checking videos.

Kai So we found an article in ProPublica titled, "In the Ukraine Conflict, Fake Fact-Checks Are Being Used to Spread Disinformation". Now Jevin, that should go straight to the heart of what you're doing, because you're doing fact-checking and trying to uncover misinformation and disinformation. So even what you are doing is now being faked. What do we make of this? And can you explain how this works?

Jevin West I guess the question is, what can't be faked nowadays? Videos can be faked, text can be faked. Now even fact-checking can be faked. Yes, we are at that stage now. Because if something is there to intervene in misinformation, some will find a way to intervene in that intervention. Fact-checking, of course, is something that I yell from the rooftops and tell students and the public to be using when they come across a source that they're not sure of. Go to Snopes, go to FactCheck.org, go to a fact-checker that's done some of the investigative work to see whether what you're reading is true or not. And now that activity has been hijacked, and it makes it that much more difficult. There was all this concern, you know, a couple of years ago about the role that deepfakes could play, this idea of being able to create videos and audio files that are indistinguishable, really, from, you know, what would look like an authentic video from someone, but putting words in their mouth, and everyone was worried about that. Back then, back way then, two years ago, we would have never thought that even fact-checking itself could be faked. And yeah, that's where we are.

Sandra So the ProPublica article talks about a clip that shows kind of two videos of a large explosion in an urban area. And then the caption says that, well, actually Ukraine is circulating this video showing a Russian missile strike in one of the Ukrainian cities, but the footage is actually an arms depot explosion from about five years ago. The message is kind of clear there: you shouldn't trust any videos you see of explosions in urban areas or Ukrainian cities, because they're probably fake videos of something else. But it turns out the missile strike/depot explosion video never really circulated. Instead, it's the video that's kind of debunking it, the fake fact-checking video, that's being circulated. Can you talk a bit about what this new kind of misinformation does? Or is it disinformation?

Jevin West Oh, absolutely it's disinformation, because the intention is to deceive here. So I would definitely classify this as disinformation. And just real quickly in terms of definitions, just so we're on the same page here. Misinformation can really be any kind of false information. It can be nefarious, but it can also just be an honest mistake, which encompasses a lot of the information that we see, just like any kind of rumour can grow and spread through honest sense-making. The thing with disinformation, though, is that it has an intention, it's intended to deceive. And a lot of times disinformation is not just a discrete article. It's not just an article to debunk; it sometimes involves multiple actors with political motivations, financial motivations, propaganda objectives. And it's harder to detect because sometimes it has grains of truth. But there can be examples like this, where there's a video that supposedly needs debunking, when in fact it never did need to be debunked. It never was spreading. And using fact-checking, it certainly has the intention to deceive, or just confuse. And I think that's the thing that, of course, is scary. And actually, when you talk about policy, it's really important to distinguish between disinformation and misinformation, because they're two different things. So when I see lots of conversations about addressing disinformation at the national level for policymakers, I think it's important that they always distinguish between these two. But going back to this example, it's quite clever. Because when we talk to the public about fact-checking, especially during a crisis like the Ukrainian war that's going on, we always mention how videos and images are re-circulated from one crisis event to the next. And we've talked about that, we've published on it.
In fact, a lot of these disinformation campaigns are probably paying attention to the things that, you know, the community is talking about saying, 'Look, that shark that you see after every hurricane, floating in the streets of whatever city just had a major hurricane, is the same shark that was shared during Hurricane Katrina and other major hurricanes, probably seen in Australia as well. And so watch out for those because they're going to be recirculated'. Well, they're using the exact same thing here. And they're saying, 'Look, this was a video from another war, and you can't trust it'. And that then makes people then not trust a lot of the videos they might be seeing.

Kai It is quite creative, isn't it? I mean, they're faking the fake video, right? So they're asserting there's this fake video circulating, but the fake video is fake, and then they fake the fact-checking. So they're basically posing as people like yourself with your Center. And as we educate the public to look out for fact-checking, they basically go in there and try to paint this false picture of the Ukrainians, you know, spreading propaganda at a large scale, trying to dress up the war as worse than it supposedly is. So it really becomes very, very complicated to educate, to create an informed public. So what's your reaction?

Jevin West This is relatively new, so we haven't really had a chance to respond to it. I think there's two things. One, make people aware of it through podcasts and journalistic writing, so the word can spread and people at least know that these tactics are being used, so they can be a little sceptical of those tactics. I mean, it's like, we have to be sceptical of everything, which is unfortunate. You know, I'm sad that the world is that way. But it is that way.

Kai Including this podcast.

Jevin West Yeah. Right. Right. Exactly, you know, fake versions of this podcast maybe even. Okay, so making people aware, just like when deepfakes came on board, it wasn't that we were going to be able to teach people how to detect fake videos, per se, but we could at least let them know that they exist out there so they can do some extra sourcing if needed. So the first thing is just let people know about it, that's the biggest thing. And then they can make an assessment and at least be aware of that tactic. The second thing is, this is somewhat of a verifiable fact: if they claim that this has been spreading on Ukrainian social media, we can check that. 'We' being not just our Center, because we might not have access to all the data you'd need to check that, but that can be verified. And so if they make that claim, we can counter the claim, even though by the time we counter it, it's probably drowned out by the next fake event that's being thrown at the Ukrainian citizens or just the world more generally. But we could go and see whether it ever did go viral, how many posts it was shared on. And you can say, 'Look, this hasn't spread at all. You're just using this as an example', which is exactly how this one example was spotted. And so hopefully, we can just call out these groups. And here's the other thing, I guess a third thing is to remind people of legitimate sources. So they could check this supposed fact-checking organisation, they could check the individual and say, 'Hmm, how often have they done this before?' If you can show that they're not a real credible source, hopefully people don't use them again, although they'll probably just change their name, and change the website, and do things that make it hard to track them.

Kai But how do you check that? Because if they pose as a fake institute, and they've created lots of these videos, then there is a history of, you know, maybe they have already supposedly debunked 20 of these supposed Ukrainian propaganda videos. Then, you know, it becomes very hard to discern who's a credible fact-checker and who isn't.

Jevin West It is, and that's why, you know, we have to rely on at least those fact-checkers that have been around for the last 20 years or that have been doing it for a long time. Although there are attacks on some fact-checking organisations, because they have fact-checked groups and people and movements that aren't real popular, and so people will be looking for other fact-checkers. And that's where these new fake fact-checkers could take advantage and try to be that next reliable source. But yeah, it's hard to distinguish. So you have to rely on resources that have built some reputation. You know, even if you don't agree with everything that they're fact-checking, at least you know that they've been verified. There are some efforts in journalism to sort of verify some of the new sources, so you could do the same thing for fact-checkers. You may have heard of NewsGuard; NewsGuard is a pretty substantial effort in trying to do that for journalistic venues. If you run into a new journalistic venue, you can see whether it's got, you know, this seal of approval from some of these third-party organisations like NewsGuard. And then I would say, with fact-checkers, one thing that you could do is look to any fact-checker you trust and ask them, because there is an international organisation of fact-checkers, and you can see whether they are a part of that organisation. But again, the problem is, by the time you do all this, they're already doing the next tricky thing, whether it's around fact-checking or it's the next thing. And really what they're trying to do, when you look at this historically, is just to confuse. It's just to disorient. It's to add noise to the system, so you don't know who to trust, what to trust. And so it's not even just confusing you about that one individual video, like, 'look, there's bombings in this city'.
Really, the effort is in confusing and adding noise to the conversation, and then you lose faith in the system overall.

Sandra And we've seen this, right, in the context of Ukraine; this bombing wasn't the only thing that was kind of fake fact-checked. There was a video shot on a Ukrainian production set like two or three years ago, with a movie being made about an explosion, and people having fake blood applied and all that. And that was reshared as, you know, victims posing in Ukrainian cities in the aftermath of an explosion. And the actor had himself shared the scenes two years ago saying, 'Look, I'm filming a TV show'. There have been videos from Vienna from a climate change protest, where people, you know, were protesting in body bags, but still obviously moving around. And they were saying, look, it's fake victims, again, being reshared. But the problem is, then these videos get taken up by official sources. In the case of the explosion video, it was the Russian state television broadcaster, and again, a Russian government Twitter account, that then reshared it, and that then tends to give legitimacy to these videos. What do we do then? And also, what do we do given the fact that these are obviously not English-language sources, on kinds of platforms and even environments on the internet that we don't really have access to?

Jevin West And that's a big problem. I have colleagues that have written about this pretty extensively: if you look at Facebook, for example, which does fact-checking to some extent, and they've hired like 35,000 fact-checkers, the vast majority of them are in the English language, and a very small percentage are in other languages; all the other languages are a tiny sliver. So when you split that tiny percentage into tinier percentages for all the different languages, it's a big uphill battle. And so I think it really is a big problem. And sometimes we amplify things we don't even mean to amplify. So it's not that we're not going to talk about this, but by talking about it, and by the BBC doing it, and ProPublica doing it, we need to let the public know these tactics are being used. But by doing this, then you know it's happening, and then you're just more sceptical in general. I mean, I'm more sceptical, I have to be more careful. So then I don't know myself, and I do this kind of work, I do fact-checking. So it makes it even harder for me. It's just so disorienting, not just as a researcher in this space, but it's disorienting as a consumer more and more. And so there is a real opportunity out there for any listener that wants to solve a big societal challenge, and that is trying to figure out how to better identify authentic videos, authentic events, authentic text, people. And there are people working on this. It's not like it's not being worked on; you know, maybe there's some blockchain answer to it, or whatever, that can help solve it, and I hope someone does solve it. But right now, a lot of the work that's being done to authenticate these kinds of events, especially if they have a geographic element to them, is to train these journalists. So if you're a big newspaper, like the New York Times, you can afford to hire full-time staff to help authenticate quickly whether a video or image is real.
There's all sorts of forensic techniques they can use to authenticate it. But who has time for that? And how many other kinds of journalistic venues have budgets to be able to do that? But really, that's the most effective way. Even with satellite imagery, I have a colleague, his name's Bo Zhao, he's a researcher in the Geography Department on my campus, and he's looking at the ways in which images themselves, like our geographic images, can be deepfaked, essentially. You can use some of the same deepfake technology to generate images that look very authentic, images that are maps, essentially. And so then you use those as your verifier. But then if even those are synthetically created and not real, it makes it even more challenging. So I guess the point here is that we need to be able to train all consumers, like these journalists, to authenticate, but even if we train them, it's very difficult for them to do that quickly, in real time. So what we need to truly do is eventually come up with better techniques for authenticating real events, real people, real stuff. And you know, that's no easy task. But if there's an opportunity, maybe a business opportunity, I think people would pay good money for that.

Sandra But staying in that area of kind of two different internets, or two different spaces in which people interact. Obviously, you know, in Russia, people would be on different social media platforms; Facebook is now pretty much out of Russia as well. But what do you think of efforts that people have been making, kind of ground-up efforts, to inform citizens about the dangers of these, you know, fake fact-checking videos and other disinformation, by going to places like Google Maps, or by going to places where you can leave restaurant reviews or other things, and trying to explain what's going on, trying to inform citizens that way? I mean, there have even been calls for efforts to do that in places like Minecraft. Is that a solution, or is that part of the problem?

Jevin West Well, again, this is such an interesting arms race to watch, where you have those that are pushing disinformation and those that are responding, like, you know, going to Minecraft, going to Yelp or Google Maps. You know, even just the old-fashioned efforts of having people that have connections within their country, and just having a good old-fashioned conversation outside the country, that's helpful. But there's a whole area also, your listeners probably know about this, of intelligence being done by just regular everyday citizens. A lot of what was going on before bullets were shot a couple of weeks ago was intelligence. I mean, I don't know how much, but I have colleagues that sort of study this to a larger degree, looking at the ways in which regular citizens are using satellite imagery, and doing that human work that you can't do even if you have a huge CIA department. And I think there's work going on there where you can do some of that GIS kind of work, and you could report that to the world, you could do things in a gaming world. Yeah, there are other ways to sort of penetrate these borders that are now forming in the information world, as they certainly are in the physical world. But I think we need to be more clever. So maybe that would be the homework for your listeners too, to go out and think of ways to get around those information borders that are forming. I can see reasons why large technology platforms are shutting the doors, and governments are probably encouraging them to do that. But the door-shutting has costs too. You can't use those platforms to push certain narratives, but it also gets in the way of counter efforts. And I think that's a real cost.

Kai And we also see private companies now providing a lot of satellite imagery. So it's private companies posting a lot of satellite images from Ukraine, it's private citizens doing the job of intelligence analysts. So there's a lot of bottom-up effort to do some of these jobs and to bridge those boundaries. But I wanted to come back to the creation of these boundaries as well, shutting off access. That's the hard way: big tech pulling out of Russia creates a splinternet. But isn't what we discussed just earlier, this sowing of doubt in the information on the public internet, also a form of discrediting this channel? It basically tells large numbers of people, 'you cannot trust anything on the public internet, come and watch our TV channel', or come to this platform which can't be controlled, like Telegram or WhatsApp, where it's much, much harder to even become aware of the circulation of misinformation. So coming back to this idea of sowing doubt, what does that do to the future of the internet as an information space?

Jevin West To me, it's the thing we should be worrying about almost the most. Because we could lose trust in all institutions, and that is not an unrealistic endpoint for where we could be going more generally when it comes to our information spaces. That's the worst-case scenario. I don't think we're there. And I don't think we're even close to there yet. But we're inching closer towards full distrust of our system. In fact, you could probably do a random interview of people on the streets, and you do see this; I talk to the public a lot about this issue. I'll be talking to some high school students or middle school students, and I'll point someone out and I'll say, 'so what information sources do you trust?'. 'Well, I don't trust anything. I know that everything on the internet is not true'. And to me, that's the worst-case scenario. And so what do we do as we inch closer and closer to that? Well, first of all, we need to remove ourselves from those information environments and speak in person again to people, and find avenues to interact in real life as much as we can. I think businesses that have been remote for a long time are recognising the importance of bringing people back, and I think the same goes for society more generally; that's one step towards it. I think we need to find ways of authenticating. We need to figure out how we're going to do content moderation, because these platforms do have a lot of control, and there are things that probably do need to be removed. But we also need to protect these countermeasures as well. We need to protect free speech, and we need to protect free expression. I'm a huge advocate of those things, so there's no easy answer. I do think that we need to do a better job of educating our public; it's one thing to connect the world to the internet, it's another to train people on how to use it. There are a lot of efforts, even in the United States, I'll give you an example.
So in the United States, there's been a lot of attention on this Build Back Better policy. This is a bipartisan effort to build new bridges and new roads, but also information highways. So there's about a billion dollars in this policy to enhance connectivity across rural parts of the United States. And every country is dealing with this, unless you're one of the really small countries. I mean, a lot of countries, including Australia, have a real rural component, I'm sure, and there are efforts to connect people to the world. But you can't just connect them, and India's realising that. If you just connect people and give them their devices, and they jump on the internet for the first time without training, you get problems; a lot of the mob attacks that we were seeing a couple of years ago, driven a lot by WhatsApp, had to be addressed because it was that serious. I mean, people were being killed because of rumours. And that's a real-world implication of how rumours can spread, and rumours can spread without training in digital literacy, media literacy, and even just digital civics. There's lots of things I'd like to change in our education systems, but one would just be to have curriculum around digital civics in class, not just civics. Civics used to be a part of the curriculum; in the old-fashioned education systems, you had to learn about being a civic-minded member of society. But let's bring it back in a digital context and train people how to watch out for things. So I think, you know, ultimately it comes down to training consumers on how to address it. But if we get to the point where people just don't trust anything, then that gives more power to those that have the power, and it makes it easier for them to push the narratives that they're pushing. Because if you can't trust anything, you look to an autocrat, or someone who's at the top of the food chain. And that can be problematic, as we all know.

Kai It is Misinformation Day Jevin. So, if you had one piece of advice for us as universities or business schools, and also our listeners, many of our students, what should we do? What can every one of us do?

Jevin West I think everyone needs to realise that they play an important role, because they feed the algorithms that feed their friends and family. And so you might think, 'well, you know, it's not that big of a deal that I share this, and I didn't put the effort in. I didn't quite read the article, but I shared it anyway'. Everyone plays a role, and they play an important role for their community, they play an important role in the information ecosystem, everyone does. And if they just pause a little bit more and share a little bit less, they're not feeding those algorithms as much. That simple piece of advice, just think more, share less, is one thing that we talk a lot about on MisinfoDay. It's a day that we bring thousands of students in on this program, which actually can go international, so if anyone in Australia wants to do it, it's a free program we make available to teachers and librarians and students. And we create new modules every year, we share those modules, we co-create them. And those are resources for dealing with some of the topics around misinformation and media literacy more broadly. Then if I had a second piece of advice, I would say, 'if it gives you an emotional response, you should especially pause'. The things that go most viral tend to have that emotional hook. They're designed to spread, so watch out for those especially.

Sandra All the links for that will be in the show notes. We'll make sure that everyone can access all the resources that the Center provides. And we want to thank you so much for chatting with us today.

Jevin West Thank you so much. It's always fun. You guys are running a great podcast. Please keep spreading the good information out there.

Kai Thank you, Jevin. Appreciate that.

Sandra Thanks, Jevin. And we'll hear you again soon.

Kai Yes, and thanks everyone for listening.

Sandra Thanks for listening.

Outro You've been listening to The Future, This Week from The University of Sydney Business School. Sandra Peter is the Director of Sydney Business Insights and Kai Riemer is Professor of Information Technology and Organisation. Connect with us on LinkedIn, Twitter, and WeChat, and follow, like, or leave us a rating wherever you get your podcasts. If you have any weird or wonderful topics for us to discuss, send them to sbi@sydney.edu.au.
