This week: puppets reading news, fake stuff, and Sweden’s out of cash. Sandra Peter (Sydney Business Insights) and Kai Riemer (Digital Disruption Research Group) meet once a week to put their own spin on news that is impacting the future of business in The Future, This Week.
00:45 – Chinese news agency has created an AI news reader – or has it?
11:20 – Deep fakes – where truth goes to die.
16:59 – Sweden’s cashless society, the e-krona, and inequality
26:46 – Robot of the week: Inspirobot
The stories this week
Other stories we bring up
Our theme music was composed and played by Linsey Pollak.
Send us your news ideas to firstname.lastname@example.org.
Disclaimer: We'd like to advise that the following program contains real news, occasional philosophy and ideas that may offend some listeners.
Intro: This is The Future, This Week. On Sydney Business Insights. I'm Sandra Peter. And I'm Kai Riemer. And every week we get together and look at the news of the week. We discuss technology, the future of business, the weird and the wonderful, and things that change the world. Okay let's start. Let's start!
Sandra: Today on The Future, This Week: puppets reading news, fake stuff and Sweden's out of cash. I'm Sandra Peter, I'm the Director at Sydney Business Insights.
Kai: I'm Kai Riemer, professor at the Business School and leader of the Digital Disruption Research Group.
Sandra: So Kai, what happened in the future this week?
File audio: Hello everyone. I'm an English artificial intelligence anchor. This is my very first day in Xinhua News Agency. My voice and appearance are modelled on Qin Hao, a real anchor with Xinhua. The development of the media industry calls for continuous innovation and deep integration with the international advanced technologies. I will work tirelessly to keep you informed as texts will be typed into my system uninterrupted. I look forward to bringing you the brand-new news experiences.
Kai: So yet again, another attempt to replace me on the podcast, clearly, Sandra.
Sandra: And another failure to do so. Our first story comes from the South China Morning Post and it's titled "Xinhua News Agency debuts AI news anchors in partnership with search engine Sogou".
Kai: This has really been all over the news, including - and here's a shout out to my fan base over in Perth - on the ABC breakfast radio show in Perth, where I gave an interview on Monday at 5.40 in the morning. So I'm big in Perth. At least at that early time of the morning. But seriously, it has made headlines, particularly because of what I think is a little bit of hyperbole in the way in which Xinhua calls this thing AI news anchors. So what's behind this?
Sandra: So what the news agency introduced was basically composite news anchors based on real people, and then using AI technology to simulate their appearance and their voices. But what these composite news anchors are doing is basically reading out text that is input by journalists, and then using AI to render the face and the voice that is seen on the video clip, which we will include in the show notes.
Kai: They basically read out text. They are puppets. Some of the conversation in the media was led astray, I would say, by the fact that AI features prominently in the way in which it was announced. And indeed machine learning was used to produce the face animation and the synthetic voice. But there is no intelligence behind this. This thing does not produce the news, interpret the news, have an opinion on anything. In fact there is no "it" there. It's just a shopfront so to speak, which is being fed by text. And we've talked about these technologies on the podcast in the past, under the label of digital humans. These technologies have been coming for the past couple of years. It's a very fast-moving space and as you've just heard, the voice is not quite there yet. It sounds decidedly robotic. But my view is that within a matter of a few maybe weeks or months this will become much, much better.
Sandra: And let's be clear about the news anchors - there is a Chinese version and an English-speaking version - they are quite realistic in the way they look. And the voice, yes, as you mentioned, it's not quite there. But it's actually fairly close to being there, even though the delivery is fairly monotonous.
Kai: The bodies are a little stiff and that's because of how the technology currently works. It's what we call a hockey mask technology. There's a real body with the head and then the face cut out is being animated by the machine learning. And so it helps when the body is rigid and still and it doesn't move much. But over time we will see more animated puppets that become much more lifelike, to the point where soon enough these will be indistinguishable from real video footage of human news anchors.
Sandra: So let's talk a little bit about the implications of this, because Xinhua has said that the achievement is a breakthrough in the field. And let's not take away from the fact that this is actually a huge achievement to be able to do this, and soon to be able to do this in real time. It does also come with certain considerations about the cost of such news anchors. If we think about the remuneration of celebrity news anchors, whether they be in China or whether they be in Australia or in the US. If you think about CNN, the article reports on CNN anchor Anderson Cooper being paid about 100 million dollars. And they can't read news 24/7. So there could be something to be gained by introducing the digital puppets.
Kai: Well let's not forget that Xinhua does not claim that they will use this technology to replace their prime-time television news anchors. This technology at the present date is supposed to be just an interface for presenting a continuous news stream online. On the web, or on internet and social media platforms, where this kind of technology allows the company to present news in other forms than just readable text or just voice. So the comparison here at the present date is not between a real news anchor on television and this technology, but between a text representation of news and a more personable presentation via these puppets. But soon enough we will be able to create these synthetic humans. They will look believable, they will sound believable, they will read out whatever we feed them in terms of text. And because we are synthetically generating them, they will adhere to any standards of beauty and appearance that their designers want.
Sandra: Which also raises interesting questions about their status in the public sphere. Many of the news anchors that we see on TV we recognise as public figures. We know details about their private lives. We identify with them off the screen not just on the screen. But once we start creating these digital puppets, there is a question to be asked whether they will be standalone puppets, or they will be digital representations of actual news anchors and can be used interchangeably with those.
Kai: Let's not forget that the puppet in this instance is based on quite a famous news anchor employed by the company. So the question is will this put real news anchors out of their job? And I guess the answer to that depends on what you take a news anchor to be. If a news anchor job is exclusively to be the pretty face and the nice comforting voice that reads out whatever is given to them, then yes technology will be able to just simulate that. What the technology cannot do is provide commentary, have an opinion, empathise and have a dialogue. Interview people. So because there is no intelligence behind it, no cognition - it's pretty much an interface - it will do exactly that. It will just read out what is given to them. So most news anchors on television, they will read out the news. But they might also engage in an interview, or be more interactive, show a bit more of a human side to what they are doing and therefore they have value to the company as a person that engages in all kinds of activities.
Sandra: I want to take us a little bit now to what you could do with these technologies to actually enhance the work that real news anchors do. And this is another article that came out this week, and it comes from the BBC. And it reports on a BBC newsreader who speaks languages he can't. So the same type of technology that has been used to create the digital puppet has in this case been used on a BBC newsreader who only speaks English. But using AI, his face and voice have been manipulated so that he speaks Spanish, Mandarin or Hindi. And the technology replaces the movements of his face with a computer-generated version of the same face, with speech delivered in a language that this person doesn't speak. At this point they are puppeteered by a person actually speaking that other language. But of course you could use the same technology that we have seen in the Chinese case to manipulate the speech that is being delivered. In this case we see the guy speaking Spanish, Mandarin and Hindi.
Kai: Languages that the person doesn't know how to speak, but the AI is capable of animating the face, making it look as if they did. And we've mentioned this before: for the past 12 to 18 months this was actually one of the use cases we often speculatively mentioned as to what this technology can do. And lo and behold, we now see the first examples of this emerge in the mainstream media.
Sandra: And how these technologies come about is actually something that we not only do research on at the University of Sydney Business School, but we also teach in our classes. And as recently as a few weeks ago, in one of our classes, one of our groups of students investigating these technologies actually created an artifact, which was this podcast that we are on happening sometime in the future. And they generated my voice. And it's a genuine fake interview of my voice talking to somebody else. So these technologies are actually getting quite close to the point where people with very little training, who are not computer experts, are able to use them to impersonate other people or to manipulate footage that they have access to.
Kai: And our sound editor Megan is openly contemplating the possibility of just automating the two of us, not having the hassle of turning our garbled recordings every week into a coherent podcast. But just writing out a neat script that our two voices would then read out. Which certainly would make her job much easier. So maybe a real possibility for us to explore, because you know we could just go to the coffee shop and...
Sandra: Yes, thank you class.
Kai: And let our artificial doppelgangers do the job.
Sandra: But the ability to simulate voices and faces also opens up this entire field to the possibility of manipulation.
Kai: Fake stuff essentially, right. So someone's AI news anchor is someone else's deep fake.
Sandra: And we've spoken about a few examples that were done as public stunts or as entertainment to showcase the technology. For instance, Obama being puppeteered by a comedian and saying things that he would never say, clearly rendered side by side so you could see how this technology actually works. But a number of cases have already come up, not with huge stakes, but cases that show us the possibility of this type of technology being used for nefarious purposes.
Kai: And just this week, the Guardian in an article titled "You thought fake news was bad? Deep fakes are where truth goes to die" reports on a case from Belgium where a small political party, the Socialistische Partij Anders, had an agency produce a video which shows Donald Trump...
Sandra: Offering the people of Belgium much needed advice on climate change. And he looks directly into the camera and says: "As you know I had the balls to withdraw from the Paris climate agreement, and so should you."
Kai: The party thought that clearly people would see this as a joke, and they would see through the deception. The puppeteering of Trump again with machine learning wasn't even very good. But people took it to be the real thing. It created outrage and the company had to go into damage control mode. Which just shows how effective these deep fakes, these manipulations can be. To the extent that in this instance people really thought Donald Trump was interfering with the Belgian election.
Sandra: Ironically, this comes in the same week as the White House press secretary shared a video of a CNN reporter that appeared to have been altered to make his actions at the news conference, his actions towards a White House aide who was trying to take away his microphone, appear much more aggressive than they actually were.
Kai: And this video seems to have originated from a notorious source of fake news, Infowars. It was clearly manipulated, and users on social media have posted side-by-side videos which clearly show that the video was doctored to make the actions of Jim Acosta look much more violent than the brush of the hand actually was.
Sandra: But the reason we're bringing this up is really that we want to foreground the fact that this conversation should not be just centred around the technology that allows us to create deep fakes or to manipulate video footage in general.
Kai: And the Guardian article goes into a long discussion of the arms race between the creators of deep fake technology and those who develop technologies to prove that a video has been tampered with. But this ultimately will not be the solution, because the technology will get better and better, and it will be democratised soon enough. Anyone will be able to create fake videos and puppeteer avatars of themselves or other people, provided there is enough data available to train the technologies. And we've seen examples where sometimes a single picture might be enough to do a rough version of a puppeteered avatar. The real issue here is the social fabric of how we produce and share the news.
Sandra: So how do we come to know things, and how do we come to know collectively whether things are true or not? The fact that we are now able to deceive using such technologies will make people increasingly question whether what they see on TV, whether what they are exposed to on social media, is true or not. And this can lead to two things. On the one hand it can lead to them even questioning real footage and asking whether this is something that can be trusted or not.
Kai: Or perpetrators like Donald Trump retracting things and saying: I never said this. Someone fabricated this, this is not me in the video. This was doctored. This was manipulated. So basically the ability of people to just renounce reality.
Sandra: And what happens if this is perpetrated at scale, or if people come to know that these technologies are out there and being used often - the article refers to this as reality apathy - is that people come to regard everything as potentially deceptive and stop trusting what they see as a source of truth. So this leads us to asking: if we cannot trust the videos that we're seeing, if we cannot trust video footage, which up until now was the ultimate reliable source of truth, then what is it that we can trust? And this is where we want to highlight the potential for a resurgence in the role of trusted institutions, whether they be media institutions or academic institutions.
Kai: And remember we've discussed this on the podcast before. The way in which Facebook for example has decoupled news from its source, making it shareable, has led to every bit of information having to stand up for itself. Which then leads to things that come from reputable sources and things that come from questionable sources all being presented in the same stream. Which will have to change the moment we can no longer believe even video content in those shareable bits of information. So in my view, people, if they are indeed interested in gaining some sense of reality, will have to go back to reputable sources and engage in some critical thinking. So no surprise here that media or digital literacy skills will become more and more important. But there are grave concerns about the way in which many people just consume bits of information on social media.
Sandra: And we'll include our podcast with Alan Dennis on fake news and research around how easily we are being deceived and we deceive ourselves in the show notes.
Kai: So let's go to our next story which comes from the World Economic Forum. And is somewhat of an update on a story we have discussed previously which involved owls and Sweden's cashless society. And the World Economic Forum reports, "Why Sweden's Cashless Society is no longer a utopia". And it's written by Cecilia Skingsley, who's the Deputy Governor of the Central Bank of Sweden.
Sandra: So the story reports again on the fact that the Swedish market is rapidly moving away from cash. And right now the outstanding value of cash in circulation in the Swedish economy has dropped to about 1 percent of Swedish GDP. And this is something that we've spoken about before on The Future, This Week. We will include the link in the show notes.
Kai: So this story is significant because it shows us the future. I mean in Australia, cash is very much alive. There are still lots of mum and pop shops, takeaway shops, where you have "cash only" signs in the window. In Sweden the opposite happens. Lots of shops nowadays refuse to take cash, and they are legally allowed to do so. And cash is basically falling out of favour and disappearing from existence. And the question raised in the article is: what are the implications? What are the implications of a society where you no longer have cash to fall back on?
Sandra: Interestingly, the article really considers the role of the state in the payment market. Previously we discussed what this is doing to society, and in this case the article focuses more on the role of government within the new cashless economy. When we last spoke about this, we spoke about the fact that there are a number of communities that would be disadvantaged by the lack of cash or the lack of ability to pay with cash. And we especially focused on people from low income backgrounds, or elderly people who do not readily have access to the technology, or people who might be paid in cash. People really living at the fringes.
Kai: What is often called the unbanked minorities. Homeless people. People who have problems getting a bank account because of their credit history for example. And so the question here is what is the role of the Central Bank and the state when there is no longer cash in circulation? And payments are exclusively organised by the private sector.
Sandra: In the Swedish economy you can actually connect your bank account, whichever bank you're banking with, with a mobile phone number. And then use the Swedish app Swish to pay a restaurant bill, to collect money, to distribute money. Pretty much everything you would be able to do with cash. And settlement does take place with the help of the Central Bank.
Kai: Interestingly, Swish has become somewhat of an infrastructure, being widely adopted. So while in this country people might have the mobile phone app of their particular bank - and remember there are four big banks in Australia - in Sweden it's really this one app that people can use to settle and share a restaurant bill, to distribute pocket money to their children, or indeed to collect money for a birthday gift at the office. This thing is really used for everything, and it has become a verb. So "to Swish" is now a commonly used term in Sweden. Which points to the fact that this app has taken the place of what cash would normally do in many other places. And Swish is run by six large Swedish banks in cooperation with the Central Bank. But the problem that is raised in this article is what happens if all cash were to disappear? How could the state or the Central Bank still ensure that the payment system would be, and I quote, "safe, efficient and inclusive"? And the inclusive bit is important here. So at the moment cash is a way for the Central Bank and the state to ensure that everyone has access to money. That they can use money. Even if they were unable to get an account with a private bank, they'd still be able to do their transactions in cash.
Sandra: But then if cash stops working - so we take all cash out of the economy - that would mean that, as things currently stand in places like Sweden, the private sector alone would be giving people access to money and to payment methods. So individuals would have no alternative but to rely on the private sector both for access to money and for payment methods.
Kai: The fall-back option that exists in other jurisdictions would just fall away. Of course we could argue as long as there are any other countries with physical currency, I could always go and put my stash of money into U.S. dollars.
Sandra: But you still wouldn't be able to use your U.S. dollars to pay at the gas station in Sweden.
Kai: No, that's true. The article comes up with two main solutions. The first one would be to regulate: to force banks to give accounts to everyone, and then to control and monitor that payments between banks and between individuals using banking infrastructure are indeed inclusive, efficient and safe. But the article raises a second option.
Sandra: And the second alternative would be for the Central Bank of Sweden to issue money in a digital form rather than in the physical form.
Kai: They call it e-krona.
Sandra: So the idea for the e-krona would be the following. It would have a 1 to 1 conversion with the krona, the current Swedish currency. And it would be held in a bank account at the Central Bank. Or it would be held locally on a card or a digital wallet or on a mobile phone app.
Kai: And therefore you could hold e-krona as an individual without having to rely on any of the private institutions, any of the banks. And you'd be able to transfer e-krona from individual to individual based on this physical infrastructure: a card, a card reader, an app, or otherwise an electronic device.
Sandra: So of course for this the Central Bank would need to invest in providing an infrastructure for the new digital currency. It would also have to build in a payment system so that the end users, the citizens, would be able to make use of this without any reliance on the private sector.
Kai: Now there is of course a third option that isn't mentioned in the article, which would be for the Central Bank to actually open a retail banking arm and start offering banking services directly to the general public. Therefore allowing people to have bank accounts even though they might not be able to get those with private banks. But that raises a number of questions, such as competition with the private sector. Which might be unwanted, because obviously the Central Bank has an advantage with its access to money. And it would also potentially be very costly to set up the necessary infrastructure. Now the e-krona is interesting because, while it would be digital currency, the article makes it clear that it does not have to rely on a distributed ledger or blockchain. In this instance, because it would be issued by the Central Bank, it would not have to be distributed. Because remember, the whole idea of blockchain is to get rid of the middlemen or the Central Bank. In this instance it would be digital currency that is just like fiat currency but doesn't exist in physical form. Just the digital equivalent. It remains to be seen whether this is something that they will actually do.
Sandra: And it remains to be seen how they will iron out the details that surround this. Because remember, if you have cash you are earning zero interest. So would you then have a digital currency that again earns zero interest, or on the other hand would you have a digital currency that actually earns interest? Could you be developing new policy tools for the Central Bank to conduct monetary policy?
Kai: So obviously this is not a technology story. Whether there's blockchain or any other technology underpinning this payment system is not relevant. The crucial point will be how to establish an e-krona on a different infrastructure that should be as easy to use as, and somehow has to interact with, the already widely adopted Swish. But also importantly, how do you make sure that you still have access to a form of payment that allows people some privacy in making their transactions, so that you don't have either the banks or the state knowing, you know, what kind of shit you might be up to.
Sandra: And that is something that not only Sweden is grappling with, but indeed other places that have seen a huge rise in mobile payments. And especially countries that have seen these payments go through private operators. And I'm thinking here of course of China, where we have seen the role of cash diminishing very rapidly over the last few years. And where private companies like Tencent and Alibaba, through the WeChat platform and through Alipay, are controlling a very significant share of the payments that are now made through their respective apps. And of course the Central Bank in China is also asking itself whether a digital currency will improve its capacity to collect taxes or enforce foreign exchange controls or indeed track what the money is being used for.
Kai: We started on China. We end on China. But we will leave you with a few of this week's quotes from our friend Inspirobot. "You are about to turn into a chubby insect. Remember that".
Sandra: That's not nice.
Kai: It's not me, it's Inspirobot.
Sandra: Here's a better one. "It's never too late to ask".
Kai: Well I say, "Imagine normal people".
Sandra: And we could do this forever. My Inspirobot says "Keep an eye on humans".
Kai: "A hangover happens". And I think that's all we have time for today.
Sandra: Thanks for listening.
Kai: Thanks for listening.
Outro: This was The Future, This Week made possible by the Sydney Business Insights team and members of the Digital Disruption Research Group. And every week right here with us our sound editor Megan Wedge who makes our sound good and keeps us honest. Our theme music is composed and played live on a set of garden hoses by Linsey Pollak.
You can subscribe to this podcast on iTunes, Stitcher, Spotify, YouTube, SoundCloud or wherever you get your podcasts.
You can follow us online on Flipboard, Twitter or sbi.sydney.edu.au. If you have any news that you want us to discuss, please send them to email@example.com