This week: face recognition for a noble cause, the capability to find you in a crowd and DNA predictions. Sandra Peter (Sydney Business Insights) and Kai Riemer (Digital Disruption Research Group) meet once a week to put their own spin on news that is impacting the future of business in The Future, This Week.

The stories this week

Facial recognition at Sky’s royal wedding broadcast

Facial matching still makes a ton of mistakes

Facial prediction from DNA data

Facial recognition can be creepy

A BBC reporter’s China experiment

Facial recognition at Singapore airport

“The Capability” raises privacy concerns

Australian government could allow private access to “The Capability”

Facial recognition feature creep

Nvidia generates fake celebrity faces

One of our 2017 conversations about facial recognition 

Facebook can now find your face, even when it’s not tagged

Facial recognition catches criminals at China beer festival

KFC in China lets people pay by smiling at a camera

Facial recognition in China to authorise payments and catch trains


You can subscribe to this podcast on iTunes, Spotify, SoundCloud, Stitcher, Libsyn or wherever you get your podcasts. You can follow us online on Flipboard, Twitter, or sbi.sydney.edu.au.

Our theme music was composed and played by Linsey Pollak.

Send us your news ideas to sbi@sydney.edu.au.

Disclaimer: We'd like to advise that the following program may contain real news, occasional philosophy and ideas that may offend some listeners.

Intro: This is The Future, This Week. On Sydney Business Insights. I'm Sandra Peter and I'm Kai Riemer. Every week we get together and look at the news of the week. We discuss technology, the future of business, the weird and the wonderful and things that change the world. Okay let's start! Let's start!

Sandra: Today in The Future, This Week: face recognition for a noble cause, the capability to find you in the crowd and DNA predictions. I'm Sandra Peter. I'm the Director of Sydney Business Insights.

Kai: I'm Kai Riemer, Professor at the Business School and leader of the Digital Disruption Research Group. Welcome to The Future, This Week. I'm still all alone while Sandra is away on holidays swimming with turtles or something. We have pre-recorded another episode for you, this one on facial recognition. We will be back with a regular episode next week; for this week, we have this for you.

Sandra: Our first story is from The Washington Post, and who knew we would be talking about the Royal Wedding. The story's titled "Who's the Royals? Sky News will use artificial intelligence to ID guests at Prince Harry and Meghan Markle's wedding." UK broadcaster Sky News has announced that it is partnering with Amazon Web Services and its Rekognition software to identify the celebrities and the British nobility who will be attending the Royal Wedding next week. They will use AI to display the guests' names and details, and perhaps how they are connected to the royal couple, on screen during the live broadcast of the wedding.

Kai: And if you're following the broadcast on Sky's app you can then instantly access background information on those celebrities, to really indulge in the who's who of who is in attendance at what has been termed the event of the year. So, Sandra, you're not invited either, are you?

Sandra: Well I'll be on holiday.

Kai: Oh, you have an excuse; I'll be here by myself, neither swimming with turtles nor invited to the Royal Wedding. Think of this as a Doctor Who moment, because we have our timeline messed up: we're obviously recording this before the event has gone down, oblivious to how it actually turned out. But we're also only using it as our starting point to launch us into a discussion about facial recognition.

Sandra: So, while on the face of it this is an interesting new way to consume live events, where you would get so much more insight not only into the couple getting married but also into the audience, it raises a number of questions about other potential uses, say at high-profile events or concerts, and about the privacy concerns around getting permission from all these people to actually identify them in the broadcast. For a Royal Wedding, most of these people are arguably public figures or celebrities, though not all, and you could argue that consent is implied through the status they have in our society. But it is so far unclear whether Sky News has actually asked any of these guests for permission to be identified.

Kai: So, one could make the argument that because celebrities are in the public domain, part of what they do is to be recognised, right? You could argue that the use of this technology is quite in the spirit of what being a celebrity means. But we should also remember that privacy legislation applies to everyone, and so people have raised the question of whether it's okay, for example, to use picture material from past broadcasts to actually train these algorithms to recognise celebrities, or whether this is a misappropriation of data, because it was not collected for this purpose and its reuse is therefore not in the spirit of the law. So, with people in the public domain we're certainly in a grey area, and we might argue that the use is okay there. But it points to the capability of the technology to do live face recognition on crowds, and in the past few weeks we've seen a number of articles discussing how this is actually being done in all kinds of different areas.

Sandra: So, over the past two weeks we've seen an increasing number of stories about face recognition being used in a variety of settings. We've seen facial recognition technology being used by UK police at an Elvis festival and at the Champions League final last June in Cardiff, and we've seen UK police using it to monitor protests. We've seen news out of China, where over 170 million surveillance cameras are now installed and employed for facial recognition, and where, in a city of over four million people, the system managed to pick out one person in less than seven minutes. It has also been used at a pop concert with over 60,000 people to identify certain suspects. We've seen news out of Singapore, where Changi Airport is considering employing facial recognition beyond identifying people at the gates to make sure each traveller matches their passport, and using it more broadly throughout the airport, for instance to find passengers who are running late for their flights. And we've had Facebook coming out with updates on the facial recognition features it released last year, arguably one of the best such systems out there, where it uses templates to recognise users' faces in photographs that might be uploaded even by third parties or accessible through data brokers. All these stories complement stories we've discussed previously on The Future, This Week: China identifying jaywalkers, paying at KFC with your face, boarding trains with your face, and even stories out of Australia, where the New South Wales and Victorian governments are considering employing such technologies to identify potential threats on our streets.

Kai: So, we thought we'd catch up on this topic, and before we do, we'll try to bring some order into what is a very broad topic, given that face recognition is being discussed in so many different areas. We would like to offer a little taxonomy that shows how facial recognition underpins many different practices. We want to start with what we might call face verification. This is where a single person voluntarily uses their face to gain access to a service, most prominently on the new iPhone X, which unlocks through facial recognition.

Sandra: But also at Australian border control: when you enter Australia via one of our airports you can face a camera and present your passport, and the system will match your image to the picture in your passport. It's been used in similar ways to match people to their government IDs in China: upon boarding a high-speed train, when you present your ticket, your face is matched to a face in the database, so you can be identified as the person who paid for the ticket.

Kai: So technically this is the checking of one scanned face against a record for that person. The next step in complexity is what we might call face detection. This is where we're trying to pick one person out of a crowd, so we're looking for a particular face, but we have to scan through an often large number of faces presented on video footage, for example. There has been quite a high-profile case recently: the BBC reported last December that one of their reporters had put himself through a bit of a test in China, where he had his photograph taken and added to a most-wanted list. He then went for a walk through the Chinese city of Guiyang, and it took the system seven minutes to locate and apprehend him and bring him in for questioning.

Sandra: And we also want to highlight here that the facial recognition being used at the Royal Wedding is also being sold by Amazon to police departments in the US. It's currently used by police in Orlando and in Oregon, for instance, where it runs real-time facial recognition on a network of cameras used by the police department, and it can also tap into police body cams and other surveillance systems.

Kai: So again, this is a technology that works really well, because it is now quite reliable at spotting one particular face by scanning large crowds. The next step up from this is what we would call face matching. And this is where it gets a bit more complicated. This is where we're scanning a crowd of faces, say at a rock concert or at the Champions League final, but we're not looking for one particular person; we're matching whatever we scan in the crowd against a larger number of records, a database of the most wanted individuals we're looking for.

Sandra: So, there are a couple of examples out of the UK which show just how hit-and-miss this technology currently is. One of them concerns the 2017 Elvis festival, where police were trialling this system. The cameras spotted 17 faces that they believed matched people in the database; it turns out only 10 of those were correct and seven were wrongly identified. Since then the facial recognition system has been used at a variety of sporting events, concerts and other large festivals. But numbers from the South Wales Police, currently published on their website, show that this is far from a perfect technology. During the UEFA Champions League final week in Wales last June there were 2,297 false positives out of roughly 2,470 matches. That is, 92 percent of matches were incorrect; only 173 people were correctly identified.

Kai: Now technically that's to be expected. If we compare this to face detection, where we're looking for one particular face, it is very unlikely that we get a false positive, because there are not many doppelgängers or lookalikes out in public for one specific face. But once we're comparing every scanned face against a longer list of photographs, the likelihood that some of its features match something in the database increases dramatically.
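To put some illustrative numbers on that point (the crowd size, watchlist size and error rate below are assumptions of ours, not South Wales Police parameters):

```python
# Back-of-the-envelope false-positive arithmetic.
# All numbers below are illustrative assumptions, not real system specs.

crowd_size = 170_000        # faces scanned at a large event
watchlist_size = 500        # records on the most-wanted list
false_match_rate = 0.0001   # chance one face wrongly matches one record

# Detection: each scanned face is compared against ONE target record.
fp_detection = crowd_size * false_match_rate
print(f"Detection (one target): ~{fp_detection:.0f} false positives")

# Matching: each scanned face gets watchlist_size chances to go wrong,
# so the per-face error probability compounds with the list length.
p_any_false_hit = 1 - (1 - false_match_rate) ** watchlist_size
fp_matching = crowd_size * p_any_false_hit
print(f"Matching ({watchlist_size} targets): ~{fp_matching:.0f} false positives")
```

Even at one error in ten thousand per comparison, the same crowd produces a handful of false alarms when scanned for a single face, but thousands when matched against a 500-person watchlist, which is exactly the pattern in the South Wales figures.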

Sandra: And whilst many of these misses were blamed on low-quality images in the database and on the system not having been sufficiently trained, it still raises questions about what happens when you are wrongly identified and suspected of a crime you didn't commit, which, in some reported cases, has happened more than once to the same people.

Kai: A recent article in Wired raises these questions. It makes the point that a law enforcement agency might regard a system that produces false positives as a success: it doesn't cost them much to filter through the alerts, and even if they bring in for questioning someone who turns out to be innocent, they will still apprehend a number of people from the list. Things look very different for the people who end up on that false positive list.

Sandra: And just to be clear, even though there were reports of improvements in the system, the South Wales Police only dropped the false positive rate to 87.5 percent...

Kai: ...That's a lot!

Sandra: ...That is still a whole lot, and that was at a recent boxing match in Cardiff just two months ago. So, so far we've got three different types of facial recognition: facial recognition for verification, unlocking your iPhone; facial recognition for detection, finding the one person; and facial recognition for matching a number of people in a database against a large crowd. And at this point it's worth noting that the Wired article brings up a good point: quite often one of these systems is developed for verification or for detection and is then used for something else, in this case matching. So whilst a system might be trained for, and extremely good at, detection, identifying that one person, it might not be well suited for something else, and error rates that would be acceptable when you're trying to identify one person can balloon when the system is deployed for a purpose other than the one it was trained for, as the sketch below illustrates.
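As a minimal sketch of the underlying mechanics (the embedding function, names and thresholds here are our own illustrative assumptions; real systems differ in the details but share the embedding-plus-threshold design):

```python
import numpy as np

def embed(face_pixels):
    """Stand-in for a real face-embedding model (in practice a deep
    neural network); here we just normalise the input to a unit vector."""
    v = np.asarray(face_pixels, dtype=float).ravel()
    return v / np.linalg.norm(v)

def same_person(a, b, threshold):
    """Cosine similarity above the threshold counts as a match."""
    return float(a @ b) >= threshold

# Verification (1:1): one probe face against one enrolled template.
def verify(probe, enrolled_template, threshold=0.9):
    return same_person(probe, enrolled_template, threshold)

# Matching (1:N): the SAME comparison, but each probe face now gets
# N chances to clear the threshold - which is why a threshold tuned
# for verification can drown a watchlist system in false positives.
def match(probe, watchlist, threshold=0.9):
    return [name for name, template in watchlist.items()
            if same_person(probe, template, threshold)]
```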

Kai: And this is where we now have to talk about Australia. It was announced last year that the Federal government wants to use the states' databases of driver's licence photographs to build a national face recognition system as a way to combat terrorism, for homeland security purposes. They aptly named this system The Capability; I guess Skynet wasn't available as a name. But the point made in a more recent article is that earlier warnings of so-called feature creep, which you just alluded to, that a system built for one purpose will come to be used for a wider range of purposes, are already coming true.

Sandra: This was reported in a Guardian article titled "Coalition could allow firms to buy access to facial recognition data": the Federal government in Australia is considering allowing some private companies to use the national facial recognition database for a fee. The first example given is financial institutions wanting to use some of this data for identification purposes. We can imagine a future where I would just walk up to an ATM and the ATM would know it's me, but there is no obvious reason why it should stop there. The article points to the obvious questions around our ability as individuals to make choices about this, to opt into these schemes rather than having to opt out of them, or even to be aware that these things are happening. The facial recognition database, for instance, came to be by using driver's licence photos that we never agreed could be used for a national database and for such purposes. This is slowly feeding into other parts of our lives: not only claiming or updating driver's licences, but passports, Medicare cards, visas, citizenship certificates and other places where we need to be identified. And it could easily be used by banks, insurance companies and other organisations that can make a similar argument: to quickly and efficiently identify us, prevent fraud, prevent money laundering and so on.

Kai: And an additional point the Guardian article raises is that with each one of those interfaces we're building to allow third parties access to these services, we're increasing the risk of malicious access to that data, of hacking and data breaches. The article mentions the data breach at Equifax in the US, which exposed highly sensitive personal information, such as medical histories and credit scores, of over 140 million US citizens, and it raises the question of what would happen if the biometric data of Australian citizens ended up on the Internet, on the dark web, in the wrong hands, rendering obsolete many of the services that are currently being contemplated to be built on top of that facial recognition database.

Sandra: So, to recap: we've had verification making sure it's me; detection...

Kai: ...Finding you...

Sandra: ...and matching...

Kai: ...finding a number of people in a crowd.

Sandra: ...and systems that embody all three like the National Facial Recognition database that we might have in Australia...

Kai: ...The Capability, as they call it. But there is another layer which takes facial recognition to a whole other level: face mapping. This is the idea that we map every face we can detect in a crowd to a person in what might be a complete database of all citizens, and then track those faces across locations, across public spaces, or for that matter across photographs on social media platforms, to know where everyone is at any point in time. Which is the surveillance end game.

Sandra: Which is arguably some scary shit.

Kai: Yes, and my understanding is the aptly named Facebook is already doing this.

Sandra: Yep. Turns out Facebook will now find your face even if it's not tagged. They are actually on the way to becoming Face Book. In a feature that was rolled out late last year, the social network uses facial recognition for most of its users. Canada and parts of Europe are excluded, but almost everywhere else the face recognition software is used across the board. You are automatically opted in and would actively need to opt out of a service that identifies you in a Facebook photo even if you haven't been tagged in it. You receive a notification saying that someone has uploaded a picture with you in it, and you are asked to review it: you can tag yourself, message the person who uploaded the picture, tell Facebook that it isn't you (thereby helping it become a bit better at recognising you), or report it if someone is trying to use your picture, say, in their own profile.

Kai: So, we can only imagine what happens to photographs taken in tourist spots and other public places. Sandra or I might walk along Circular Quay here in Sydney, which we often do, with the Opera House and the Harbour Bridge in the background. Photos taken in those spaces end up on Facebook a lot. Will everyone visible in those photos be tagged and alerted, and will I therefore have a flurry of messages to deal with before I can, you know, maybe opt out? The point is that Facebook knows I was at Circular Quay, even though I never signed up to be photographed, or to have that photograph scanned for that matter.

Sandra: So, let's have a look at how Facebook does this, and let's not forget that Facebook has some of the best facial recognition systems in the world. After all, it's a company with hundreds of billions of photos uploaded to its service that it can use to analyse and distinguish faces. Facebook creates a template for each and every one of its users, and possibly also for people not currently using its platform: let's not forget that Facebook works with data brokers that supply it with additional data on various users, and that many of its users tag people in photographs whether or not those people are actual Facebook users. To be fair, Facebook does allow you to opt out, and it says it will delete the face template that is used to find and identify you in its products and services. But there is a clear argument to be made that this should be a service you opt into rather than have to opt out of.
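Facebook hasn't published its pipeline, but the "template" idea it describes can be sketched roughly like this (the averaging approach, function names and threshold are all our assumptions, not Facebook's actual method):

```python
import numpy as np

def build_template(tagged_embeddings):
    """A user's face 'template': here simply the mean of embeddings
    from photos they were previously tagged in (our assumption)."""
    t = np.mean(tagged_embeddings, axis=0)
    return t / np.linalg.norm(t)

def scan_upload(faces_in_photo, templates, threshold=0.8):
    """Compare every face found in a new upload against every stored
    template - the basis for 'you appear in a photo' notifications."""
    return [(user, i)
            for i, face in enumerate(faces_in_photo)
            for user, template in templates.items()
            if float(face @ template) >= threshold]

def opt_out(templates, user):
    """Opting out, as described in the episode, deletes the stored template."""
    templates.pop(user, None)
```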

Kai: And again, what we want to highlight is that decisions at the individual level, to opt in or, in this case, not to opt out of these services, will materialise at the systemic level. While it might be annoying, or slightly creepy and disturbing, for the individual to have their face tagged in other people's photographs, even incidentally, from Facebook's point of view what emerges is an almost complete picture of where its users are, what they are doing, who interacts with whom, and who shows up in other people's pictures, adding yet another layer to the social data network it is building by applying face mapping to people's photographs.

Sandra: So, let's summarise: we've talked about face verification, face detection, face matching and face mapping. The final one we want to mention here is face monitoring.

Kai: So, face monitoring comes out of a field called affective computing: basically, the idea that a computer algorithm can read a person's emotions off their face. This is one step beyond simply detecting a face in a crowd; it has more to do with reading someone's face and detecting different emotions on it, and that can be used in various contexts. People have experimented with reading users' emotions as they browse a website or an online shop, but the article we want to highlight reports on a different use: in a school classroom.

Sandra: So, a Chinese school has installed this face monitoring technology to see whether students are attentive in class. Theoretically the system can pick up emotions such as happiness, sadness, anger, disappointment and surprise, and it can alert the teacher that one of the students might be distracted so the teacher can take action.

Kai: So, the intention here is to read off students' faces, in a continuous way, how engaged they are, whether they are content and happy or showing signs of boredom. Now, this is certainly a next level in surveillance, and we could argue at length about how creepy or inappropriate it might be. But that is not the point we want to highlight here: face monitoring and facial expression reading is certainly technically possible and, as we can see, is something being done today.

Sandra: And we are experimenting with different ways of employing it.
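A rough illustration of what such emotion-reading systems do (the features, emotion labels and "attentive" set below are toy assumptions of ours, not the classroom system's actual design):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

EMOTIONS = ["happy", "sad", "angry", "surprised", "neutral"]

# Toy training data: feature vectors extracted from face images
# (in practice landmark geometry or CNN features) with emotion labels.
rng = np.random.default_rng(7)
features = rng.normal(size=(500, 32))
labels = rng.integers(0, len(EMOTIONS), size=500)

clf = LogisticRegression(max_iter=1000).fit(features, labels)

def monitor(face_features, attentive=frozenset({"happy", "surprised", "neutral"})):
    """Flag a student as possibly distracted when the predicted
    emotion falls outside an 'attentive' set (our assumption)."""
    emotion = EMOTIONS[clf.predict(face_features.reshape(1, -1))[0]]
    return emotion, emotion not in attentive
```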

Kai: And while it raises questions about how it might change people's behaviour, as with all facial surveillance technologies, this is a discussion for another time.

Sandra: And whilst we've spoken about face recognition mostly in terms of how we can recognise or identify faces, the next step for this kind of technology is the ability to either create faces or predict them.

Kai: We've discussed previously that the kind of machine learning algorithms that go into these technologies, in the form of deep learning, for example, are really good at pattern recognition, which is what is at work in face recognition. But they can also be turned around and used for face generation. We've seen some recent examples, with Nvidia demonstrating a technology for generating hypothetical faces from a dataset of face recognition data: faces of beautiful people, potential celebrity faces that look decidedly photoshopped but are not real faces. But we want to highlight another aspect of face generation: face prediction from DNA data.

Sandra: An article in The Conversation from last week, titled "DNA facial prediction could make protecting your privacy more difficult", highlights the possibility that DNA, and the large DNA databases we currently have, millions and millions of sequenced genomes that medical professionals have been using to treat rare diseases and to develop personalised medicine, could now be employed to predict traits such as eye colour, skin colour or hair colour, and potentially, eventually, to reconstruct faces from these traces relatively accurately.

Kai: And while this form of what's called DNA phenotyping is currently quite limited, and there is strong criticism of the feasibility of predicting someone's hair colour, facial features, eye colour, skin tone and those kinds of things on the basis of such databases, this is only the beginning. The end game would be to have available the photographs of an entire population matched with their DNA data, train a machine learning algorithm on the associations between DNA features and facial features, and then use it to predict what a single person's face would look like based on their DNA. The article actually points to the Australian Capability face database and raises the prospect of combining it with DNA databases to build such a system, which is hypothetical at this point in time, but could easily be envisioned, again in the name of counterterrorism and homeland security.
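Purely as a toy illustration of the kind of supervised learning being envisioned here (the data is random, and the model choice is our assumption; real DNA phenotyping research is far more involved):

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(42)

# Toy stand-ins: genotypes coded 0/1/2 per SNP, faces as landmark vectors.
n_people, n_snps, n_face_dims = 1_000, 5_000, 68
genotypes = rng.integers(0, 3, size=(n_people, n_snps)).astype(float)
face_landmarks = rng.normal(size=(n_people, n_face_dims))

# Learn an association from DNA features to facial features...
model = Ridge(alpha=1.0).fit(genotypes, face_landmarks)

# ...then 'predict' a face from DNA alone for a new, unseen genotype.
new_genotype = rng.integers(0, 3, size=(1, n_snps)).astype(float)
predicted_face = model.predict(new_genotype)
```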

Sandra: So, whilst we started with face recognition, we're actually seeing that there is a vast amount of data about us, encoded in our DNA or in our general appearance, that can be used for similar purposes and can feed into large systems that identify us, in the same way you could use DNA to make facial predictions. We saw hints from Facebook a few years ago that it can recognise you through your clothing, your posture, the back of your head or your hair, using a whole range of other cues to predict that it is you. So seemingly there is a lot of data we are giving up about ourselves that enables the emergence of new identification mechanisms.

Kai: So, we've taken you from face verification, detection, matching and mapping to the potentially brave new world of face prediction. And we also recognise that faces are just one way to identify a person: there's audio and voice recognition, and there are systems now that can recognise people in video footage by the way they walk. The Ministry of Silly Walks comes to mind here as a way to disguise your identity on video footage. And then, of course, the end game will be DNA, and questions remain: how much of our information will we have to give up to gain access to services such as air travel, for example? Will we have to give up our DNA to travel to foreign countries? We want to leave it here; it raises the prospect of many more discussions that we will no doubt have in the future. Next week it will be our regular episode again. Thank you for listening!

Sandra: Thanks for listening!

Outro: This was The Future, This Week, made possible by the Sydney Business Insights team and members of the Digital Disruption Research Group. And every week right here with us, our sound editor Megan Wedge, who makes us sound good and keeps us honest. Our theme music was composed and played live from a set of garden hoses by Linsey Pollak. You can subscribe to this podcast on iTunes, Stitcher, Spotify, SoundCloud or wherever you get your podcasts. You can follow us online on Flipboard, Twitter, or sbi.sydney.edu.au. If you have any news you want us to discuss, please send it to sbi@sydney.edu.au.

Extra: What do you call facial recognition for mountains? Glacial recognition.
