This week: classy Facebook, the end of the internet, and China, crypto and space stations in other news. Sandra Peter (Sydney Business Insights) and Kai Riemer (Digital Disruption Research Group) meet once a week to put their own spin on news that is impacting the future of business in The Future, This Week.

The stories this week

Facebook patents tech to determine social class

Aviv Ovadya is worried about an information apocalypse

Inside the two years that shook Facebook and the world

Early social media employees join forces to fight what they built

Facebook was designed to prey on fear and anger

Facebook funded experts who vetted its Messenger for Kids

Facebook employees are paranoid that company spies on them

Google tests bot to chat with friends for you

Our discussion of how Facebook figures out family secrets

Our discussion of fake reviews of fake videos

AI makes it easy to create fake videos

Fake hyper-realistic photos of objects, people and landscapes

Stanford’s real-time face capture and re-enactment of videos

The University of Washington synthesizes Obama – lip-sync from audio

Our latest research in digital human technology

Meet real time Mike

Future bites

Manchester City Council website caught cryptomining

Mining cryptocurrencies has led to a global shortage of GPUs and we now can’t look for alien life

FOR SALE: Orbiting space station

Billionaires are prepping for the apocalypse

China’s 10 new unicorns

Didi taking on the world

Fake reviews on Amazon mean you get free stuff


You can subscribe to this podcast on iTunes, Spotify, Soundcloud, Stitcher, Libsyn or wherever you get your podcasts. You can follow us online on Flipboard, Twitter, or sbi.sydney.edu.au.

Our theme music was composed and played by Linsey Pollak.

Send us your news ideas to sbi@sydney.edu.au.

Disclaimer: We'd like to advise that the following program may contain real news, occasional philosophy and ideas that may offend some listeners.

Introduction: This is The Future, This Week on Sydney Business Insights. I'm Sandra Peter and I'm Kai Riemer. Every week we get together and look at the news of the week. We discuss technology, the future of business, the weird and the wonderful and things that change the world. Okay let's start.

Kai: Let's start. Today in The Future, This Week: classy Facebook, the end of the Internet, and China, crypto and space stations in other news.

Sandra: I'm Sandra Peter, I'm the Director of Sydney Business Insights.

Kai: I'm Kai Riemer, professor at the Business School and leader of the Digital Disruption Research Group.

Sandra: So Kai, what happened in the future this week?

Kai: Well even though there are nuclear facilities mining crypto and 10 new unicorns in China, the biggest story of the week is Facebook. In fact there have been so many stories around Facebook this week that we really couldn't go past the topic. The article we picked is from Engadget and it's titled "Facebook patents tech to determine social class".

Sandra: So Facebook has just patented a plan for what they call "socioeconomic group classification based on user features", which is basically Facebook's way to patent technology that decides whether its users are upper class, middle class, or working class without using the usual markers, things like an individual's income. So what Facebook is doing instead is classifying people using signals like whether or not you own a home, what your level of education is, the number of gadgets you own. Do you have a phone? Do you have an iPad? Do you have a phone and two iPads? How much do you use the Internet? Potentially what restaurants you go to or what brand of shoes you wear. If you for instance have only one gadget and you don't use the Internet much, Facebook's algorithm will determine that you are probably a person who's not really well off. So the question then becomes: why does Facebook care?
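To make the patent's logic concrete, here is a minimal sketch of feature-based class scoring of the kind the filing describes. Every feature name, weight and threshold below is our own invented illustration, not anything taken from Facebook's actual patent.

```python
# A minimal, hypothetical sketch of feature-based socioeconomic scoring
# in the spirit of the patent. Every feature, weight and threshold here
# is invented for illustration; the real system would be far richer.

def classify_socioeconomic_group(user):
    score = 0
    score += 2 if user.get("owns_home") else 0
    score += {"high_school": 0, "bachelor": 1, "postgrad": 2}.get(
        user.get("education"), 0)
    score += min(user.get("device_count", 0), 5)   # more gadgets, higher score
    score += 1 if user.get("daily_internet_hours", 0) > 3 else 0

    if score >= 7:
        return "upper class"
    if score >= 4:
        return "middle class"
    return "working class"

# One gadget and little internet use gets classified as less well off.
print(classify_socioeconomic_group(
    {"owns_home": False, "education": "high_school",
     "device_count": 1, "daily_internet_hours": 1}))   # -> working class
```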

Kai: Well they do care because that's what they do. So there's this article and there's a significant number of other articles looking at Facebook from different angles that we will go into. So we will look at this story and then look at the bigger picture. Now first of all, Facebook is in the business of classifying and characterising its users because that's what they sell to the advertising industry. So from that point of view it's not really surprising that Facebook would engage in this type of classification. Pun intended. But the author really gets worked up about first of all what it reveals about Facebook itself, the way in which they seem to understand the world and class, but also what it will lead to when rolled out onto the platform.

Sandra: So given the way this is set up by Facebook, and all of the patent documents are there, you can have a look at them online, they detail the types of features that they would be looking at, things like demographic data or device ownership or even your travel history or other household data. It's not a stretch to think that you would be able to say these sorts of restaurants are attended by a particular class of people, or these brands of phones are owned by a particular type of person. So we can see how this would not only contribute greatly to the way Facebook already makes its money, by determining these groups of users and allowing people to advertise to them, but also how this will reinforce the way the algorithm actually creates and shapes these groups of people.

Kai: So what is significant for me is that Facebook seems to ignore, by and large, the huge body of research and literature that exists on milieus and class and social stratification in society. Instead they come up with their own criteria and they seem to have a ranking of cities you live in, for example. Meaning if you live in Palo Alto, it's better than if you live in Mountain View, which is better than living in San Jose. It seems to me a rather haphazard and naive approach. But more important is that if they roll this out onto the platform it will have material consequences for the kind of things that will be put in front of people. So depending on where you live, and whether you only have two and not seven devices that communicate with Facebook, you might be given fewer opportunities: you might not be shown that job ad, you might be shown ads for predatory loans. So really what happens here is that they come up with this crude definition of what class stratification in their user group looks like, which will then actually create that very reality, and the author points that out.

Sandra: So it's indeed the two points that you make. One is that in a ridiculous way this will create the simplistic reality that Facebook describes, by assigning people to these categories and then reinforcing those categories with those ads. And second, it's important to note that this is not out of any malicious intent on Facebook's part. The fact that poor people will be shown predatory loans is not something that Facebook is setting out to do, or that Facebook plans or wishes in any way; it's a byproduct of the way this is set up. The algorithm has to show those ads to the people who are most likely to click on them. And we know people who are desperate will most likely accept higher interest rates, simply because they cannot access ordinary loans. Those people have lower chances of getting a fair loan or access to normal loans anyway, so the algorithm will show them an ad for a loan they can access, and they're more likely to click on it. So again, this is not out of any intent that Facebook has but rather a byproduct of the algorithm.
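As a toy illustration of why no malice is needed: an ad server that simply ranks ads by predicted click probability will route predatory loan ads to exactly the people most likely to accept them. The ads and probabilities below are invented for illustration.

```python
# Toy sketch of click-through optimisation. The predicted click
# probabilities are invented; a real system would learn them from data.

ads = {
    "prime_mortgage": {"working class": 0.01, "middle class": 0.04, "upper class": 0.06},
    "payday_loan":    {"working class": 0.09, "middle class": 0.03, "upper class": 0.01},
}

def pick_ad(user_class):
    # Show whichever ad this user is predicted to click on most.
    return max(ads, key=lambda ad: ads[ad][user_class])

print(pick_ad("working class"))  # payday_loan - no intent, just optimisation
print(pick_ad("upper class"))    # prime_mortgage
```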

Kai: To me what makes this story significant is the timing at which it is released: in a week when we had articles in The Outline talking about how Facebook gave money to and funded most of the experts that vetted its Messenger Kids app, and the backlash around that has been in the media in the past couple of weeks. There was an article on early employees of Facebook and Google forming a coalition to fight the negative impacts of social media. A long article in Wired magazine on what it was like inside Facebook during the two years that shook them to the core, during the election, and what that revealed about how Facebook works. And then revelations that Facebook is spying on its own employees to try and keep them from talking to the media and revealing insider information. So it hasn't really been a good week for Facebook, so we thought we'd take a deeper look at the problem and summarise what the issues are and what it means for us.

Sandra: So what we want to do is have a look at what are basically four big conversations that are happening around the issues that social media platforms in general have. Facebook's been singled out this week, but we could say this of most of the social media platforms that we are part of. And we'll try to unpack what this means not only for the business model of companies like Facebook but also for the other organisations and companies using their services, and for the consumers themselves. So the first conversation we want to look at is this idea of the ability to exploit people who are part of these networks.

Kai: We've known this for a long time, right. So the saying that if the service is free, you are the product, or the old joke about Facebook that if it were a butchery we'd be the sausages. We've been telling these stories for a long time, but it is now becoming quite evident that the business model of Facebook is really to sell users and their information to advertisers and anyone who is prepared to pay money for using Facebook's information to target content towards users.

Sandra: This idea is not inherently wrong. We are accessing free services for which we should pay something, and the way we pay for some of these services is by disclosing some of our data. So for instance if I use Google Maps the service is free and adds a lot of value to my life, but I need to share some of my data in exchange for that free service. But something a little bit different is enabled on a platform like Facebook.

Kai: Because it is happening at a larger scale and it is done algorithmically, it's basically out of control, in the sense that even Facebook doesn't understand 100 percent how it works: who is shown which information, and what kinds of keywords the algorithm will allow people to utilise to discriminate between people in targeting content. And that has led to these unfortunate incidents where people were able to target users on racial grounds, for example, and that has been in the media.

Sandra: In addition to that, what happens is that companies who wish to enlist large numbers of people to achieve their own goals, and we've seen this with the US elections, can do so based on these algorithms: they can create types of content that will engage a certain category of people, make sure it gets shown to that particular type of people, who then will be tricked into behaving in ways that they didn't intend to.

Kai: Yes, so the second issue that is often singled out with Facebook and social media is that it leads to usage addiction, and that's at the heart of the initiative by former Silicon Valley employees who basically say these platforms are built just like poker machines: they are built in a way that maximises people's time on the platform, that entices them to click on things, that makes these platforms sticky, and people basically share more information as a result. But at the same time they can't really let go of the device, so it becomes an addiction, and it starts very early: we see this in children, we see this in teenagers, and it's linked to studies that show it has a measurable effect on brain development, for example.

Sandra: The third conversation is the one around social isolation, and again with this conversation we want to stress that this is not something that Facebook set out to do. This is not something that they consider a good thing.

Kai: No, ironically it's quite the opposite of what they set out to do. I mean, Zuckerberg has been outspoken that the job of Facebook is bringing people closer together.

Sandra: But by trying to bring people closer together it tries to show people content that they would want to engage with: news from their friends, news from their loved ones, news about topics that are of interest to them. The unintended side effect of that is that it creates these echo chambers where I have a certain belief, I click on articles that reinforce that belief or news that reinforces that belief, I befriend people who share those beliefs, who then again share stories that reinforce those beliefs. So I end up in my own bubble of news, in my own bubble of beliefs and interests, that I cannot get out of. And of course this is not a new argument, it's one that has been discussed before, and people are now starting to be aware of this, but they have to make a conscious effort to get out of that thought bubble, they have to go to other places to look for that news, which is neither easy to do nor something that the platform itself can help you do.
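A toy sketch of that feedback loop: a feed that keeps recommending more of whatever was clicked last will narrow toward a single topic over time. The topics and the bias parameter are invented for illustration.

```python
# Toy sketch of the echo chamber feedback loop: the feed biases toward
# whatever the user engaged with last. Topics and bias are invented.

import random

TOPICS = ["politics_left", "politics_right", "sport", "science", "cooking"]

def next_feed(last_clicked, n=5, bias=0.8):
    """With probability `bias`, serve more of the last-clicked topic."""
    return [last_clicked if random.random() < bias else random.choice(TOPICS)
            for _ in range(n)]

random.seed(1)
feed = next_feed("politics_left")
for step in range(3):
    clicked = feed[0]          # the user clicks the top item
    feed = next_feed(clicked)
    print(step, feed)          # the feed converges on one topic
```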

Kai: No, first of all you have to be aware that what you're being presented with is not the full picture but actually what the algorithm has deduced is most likely to get you clicking on stuff. So that's the first step, and then obviously it's quite convenient to just stay in your bubble in your daily life and consume whatever shows up in your feed. Actually there was a great article in Techcrunch a few weeks ago titled "Social media is giving us trypophobia", and don't Google trypophobia - the pictures that come up you cannot unsee. But anyway, it's called the fear of holes.

Sandra: It's not real. I looked it up on Wikipedia.

Kai: The pictures are still gross, but we digress. The article is excellent because it shows that this echo chamber works on different levels: it leads to social segmentation, it polarises, and we end up in these groups that are very homogenous but do not interact in any way with other homogenous groups, so it leads to social fragmentation.

Sandra: And these groups find it quite difficult to have a shared public conversation, because the arguments and the debates that they are having are quite distinct.

Kai: And we see this in the political landscape, but it also works at the individual level, where content is more and more targeted and what I am presented with is quite different from what Sandra is presented with, and we do not actually get the news as it happens but the news as the algorithm decides is relevant to me or to Sandra.

Sandra: So the last conversation around Facebook in particular this week, well social media in general, has been the privacy one, and this goes back to conversations we have had before on The Future, This Week, where we've talked about Facebook's uncanny ability to suggest and match up friends and contacts who you might not even know exist. Family members that you might not know exist, connections that you do not know how they are made. And here we discussed how Facebook uses up to 100 different data providers so that it can accurately aggregate your behaviour and data about you from sources other than your activity on Facebook, to have a full, sometimes eerily accurate, picture of who you are and who you interact with and what you've done previously. Every single minute detail of your life.

Kai: So what we're saying is that Facebook has been in the news for those four issues: the way in which users are exploited and fake news becomes a problem as a result; that it leads to social media addiction; to echo chambers and social isolation or fragmentation; and that it comes with large scale privacy invasion, or rather eradication: privacy is basically non-existent once you become part of this ecosystem. So the question is how did we get there, how is this happening?

Sandra: The way this comes about is really the business model that Facebook is built around. Facebook is built around the ability to provide this service to users, the ability to connect and read news and all of that, but it funds this by allowing corporations or individuals or governments or charities access to particular groups of people with particular traits. It allows them to advertise to these people, to put content in front of them, and it is based on the ability to do that really, really well.

Kai: So this harks back to last week's discussion about fiduciary business models, where we discussed that Facebook has two different groups it caters for: it has users that use the platform, but the income is generated from another group, the advertisers and corporations. So they optimise for two different groups, and balancing this out can be difficult, because turning the platform into something that is more profitable does not necessarily come with making it better, in whatever sense, for their user group.

Sandra: So how does Facebook generate income?

Kai: Well it's a combination of a number of factors. So the story that we've discussed around social class means that at the very heart of the business model is the ability to profile its users by way of complex psychometric analysis, which comes hand in hand with the collection of a very broad set of data, and then the discrimination of content that is being presented to users, be it news, be it ads, be it opportunities for jobs and things like that. And that is paired with the claim that Facebook is a platform where every bit of content is treated as equal, a hands-off approach where anything can be presented. So those factors in combination, applied in an automated way and at a large scale, get us the effects that we've discussed with fake news and echo chambers. So absolutely, Facebook never set out to deliberately do it in that way, but what we've learned in the past year, and Facebook has learned this as well, is that this system can be exploited. The system has been exploited by Russia meddling in the US election, but also by people who just optimise for income with no regard for the kind of information that will get engagement from users, be it real or be it fake.

Sandra: And there's potential for this to continue. So the story we started out with was a story around Facebook and its ability to determine social class. Now we talked about predatory lending, but you can think about how this would enable people to rent out their apartments only to a certain type of people, and how that would segregate neighbourhoods. You might be able to use the same thing for dating: you could date only a certain type of people, or show certain profiles only to people of a certain social class or social background. So the potential to do this is far greater than what we've seen so far.

Kai: And the argument that has been made in one of the Wired articles this week is that at the heart of the problem is Facebook's hands-off approach. So Facebook has basically tried to be a news company without engaging journalists and without making judgements about content: whether it's real, whether it's fake, whether it's good, whether it's engaging, whether it's of a particular genre, an opinion piece or a factual article. Facebook is doing none of that and has left it to its algorithm to basically discriminate among content and present it to its users. And this is now what's coming back to bite them.

Sandra: So where this leaves us at the moment is with a pretty decent understanding of what the problems are, and this is true across the various conversations that we've discussed with Facebook, across all four conversations. We have a pretty good understanding of what the problem is and why we have the problems that we do. They are baked into the business model. It doesn't leave us a lot further, however, in how we solve this. Facebook has been having goes at addressing all of these issues for almost a year now.

Kai: They've hired thousands of people to weed out the most obvious problems with content presentation for example.

Sandra: They've tried various models for how to present less fake news. One of them was adding a Wikipedia button where you could look up on Wikipedia whether a news source is trustworthy. They flirted with the option of promoting content that your family and friends think is relevant. A variety of different approaches, so it's not for lack of trying.

Kai: But if you look at the analysis and the complexity of the problems it's pretty clear that this is dealing with the symptoms more than addressing the systemic problem that is at the heart of what is happening.

Sandra: So because this is baked into the business model this is really just tinkering with the surface of it.

Kai: But what do we do about it? I mean we could just say that's just the way it is right, this is how social media works.

Sandra: But to be honest this is just the way it is with Facebook. So if you look at a company like Tencent, who owns WeChat, they have managed to create a business model that does not rely as strongly on advertising. They found ways to monetise the small transactions that occur in WeChat; let's remember WeChat also enables payment. So all those little transactions enable WeChat to have a variety of sources of income and not rely so heavily on advertising. The whole Tencent business model is also spread out across a variety of industries, including finance and infrastructure and so on. So their model is not solely dependent on selling users to companies and maximising clickthrough rates.

Kai: So my personal opinion is that Facebook will have to do something about it much more substantial than obfuscating or denying or selling bullshit to the media. They will have to actually look at the business model in order to recapture the positive image that Facebook had in its early days, when it was known as a way for friends to connect and for expats to stay in contact with friends and family, when the whole narrative of what Facebook is centred around the front end, what it was doing for its users. Facebook is at risk at the moment that this narrative shifts and that Facebook becomes known for what it does at its back end: being a large scale discrimination machine that basically sells users to the highest bidder. And if that were the case it might disengage its users and therefore erode the very basis on which it rests, and we have seen in the past with MySpace that large scale networks can fall apart and unravel. And just because Facebook is big doesn't necessarily mean that it is too big to fail if something came in its place, and we've seen that WeChat is making advances into other countries, it's not just operating in China. So who's to say that WeChat or something else isn't coming along to capture some of what Facebook used to be.

Sandra: So if this wasn't gloomy enough for you...

Kai: Here's our second story from BuzzFeed News titled "He predicted the 2016 fake news crisis, now he's worried about an information apocalypse". So who is he, Sandra?

Sandra: So he is Aviv Ovadya, and he's an MIT graduate who's done a few things at various tech companies around the Valley, companies like Quora. But he has now moved on and is Chief Technologist for the University of Michigan's Centre for Social Media Responsibility. And Aviv pretty much dropped everything in 2016, when he realised that the business models behind companies like Facebook and Twitter would enable the kind of onslaught of fake news that we've seen influence the American elections. And he has been trying to draw attention ever since to what he perceives as the inherent problems of the web and the information ecosystem that has developed around the social media platforms.

Kai: So the point that Aviv and this article are making is that on top of the kind of social media platforms that we've just discussed with Facebook, and the way in which they tend to spread all kinds of fake news and things like that, we are seeing a new wave of technological developments that make it easier to not just create but automate the creation of fake content, and not just text but video and audio, at a large scale.

Sandra: And he thinks that if conversations like the ones we've just had about Facebook scare us, what comes next should really scare the shit out of us.

Kai: So the stories collected in the article are truly worrying.

Sandra: So what he does is ask a very simple question: what happens when anyone can make it appear as if anything has happened, regardless of whether or not it did?

Kai: And he presents a collection of examples that show what is already possible, and then he imagines where things are going in a reasonably short amount of time. So for example: there are examples now, and this has been in the news, of fake pornographic videos where algorithms take faces of celebrities, sourced from Instagram for example, and put them on fairly explicit video material.

Sandra: There is also the research out of Stanford that we've discussed here before that combines recorded video footage with real time face tracking to manipulate that footage, and for instance puppeteer images of, let's say, Barack Obama in a very believable way and make it look like he said or did things that he hasn't really said or done.

Kai: And we've discussed previously the research that is being done here at the University of Sydney Business School into photorealistic, believable avatars, piloting the technology to basically create synthetic believable humans that can be puppeteered, which can be copies of real people or synthetically created people. So the prospect is that we now enter an era where I can make anyone say anything on video, I can create synthetic humans, and we would not even know whether we're conversing with a real or a fake human. We're entering an era where not even video or live video streams can be believed.

Sandra: So why does this spell the end of the Internet for Aviv?

Kai: So there's two steps to this. First of all, obviously, there are these scenarios where fake content can create serious problems: mass panic, diplomatic incidents, US presidents declaring war on other countries.

Sandra: And currently war game style disaster scenarios based on these technologies are being run to see how they would play out.

Kai: But the more worrying part is that the ability to create any type of fake content at the same time discredits what is real. So I can always turn around and say this piece of information could be fabricated, or this video isn't real, because we now have the technology to create this kind of video synthetically. Who gets to say what is real and what is fake as a result?

Sandra: Furthermore, these things will all start competing with real people and real voices for the same limited attention. So think about your representatives in government: they could be flooded with a range of messages from constituents that sound real, look real, asking them to behave in a certain way or vote for certain policies. These things will compete for real time and real attention from legislators and will come at the expense of other voices.

Kai: And even in the absence of large scale doomsday scenarios, it starts small. What happens when you can no longer distinguish between real messages from colleagues in your inbox and fake messages that are there as phishing attempts, as spam essentially? There's just this week been an article about Google testing a new service called Reply, which is essentially a bot that allows you to have messages from friends answered automatically.

Sandra: So say someone contacts me while I'm doing this podcast with Kai. This bot would see in my calendar that I'm scheduled to record this, and would send a message saying I'm just recording this with Kai, please call back between these and these hours when I'm free, and so on. But you can imagine the complexity of these things growing, and the accuracy with which they will be able to conduct conversations on our behalf growing too.
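A minimal sketch of how such a calendar-aware auto-reply could work; the function and message format here are our assumptions for illustration, not Google's actual Reply implementation.

```python
# Hypothetical sketch of a calendar-aware auto-reply bot, loosely in the
# spirit of Google's Reply. Nothing here reflects Google's actual API.

from datetime import datetime

def auto_reply(calendar, now):
    """Return an auto-reply string if busy at `now`, otherwise None."""
    for event in calendar:
        if event["start"] <= now < event["end"]:
            return ("I'm currently in '{0}'. Please call back "
                    "after {1:%H:%M}.".format(event["title"], event["end"]))
    return None  # not busy, so let the human answer

calendar = [{"title": "Recording the podcast with Kai",
             "start": datetime(2018, 2, 16, 10, 0),
             "end":   datetime(2018, 2, 16, 11, 0)}]

print(auto_reply(calendar, datetime(2018, 2, 16, 10, 30)))
```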

Kai: So the real issue is that when you can no longer discern what is true, what is real, what is fake, what is fabricated, in social conversations online, on the Internet, on news platforms, on social media, this will basically mean the end of the Internet as we know it: as a source of information, as a source of social interaction.

Sandra: So where does this leave us, in a world with a lot of fake content where social platforms like Facebook act as accelerators for distributing this content and amplify its effects? While we are trying to find solutions to this, and we've discussed some with our previous story, in the interim we have very few alternatives left. One of them arguably could be turning to trusted institutions: we know that certain news outlets have been trusted institutions for over 100 years, and we can turn to universities as sources of knowledge.

Kai: Yes, absolutely. I think we will see a move back to the future, if you wish, in the sense that for a long time we have decoupled content from its source; social media like Facebook trades in presenting all kinds of content decoupled from its originators. I think we will see a big move back to trusted brands such as The New York Times, or consumer brands like Amazon and Apple, to institutions who have a proven interest in presenting real and trusted information and services. So the Internet as a global democratic space where every voice is equal, I think we're about to see that destroyed, precisely because it's very hard to know what and who we can trust to tell us something about the world, or whether something is entirely fabricated to serve some ulterior motive and self-interest.

Sandra: Well that is slightly depressing.

Kai: That's more than slightly depressing, but luckily we have Future Bites, our quick-fire round, to finish off this podcast. So what did you learn this week, Sandra? Other than that we're doomed.

Sandra: I've learned some very interesting things about how cryptocurrencies get mined. An interesting case from Manchester: if you went to the city council's website, let's say to check when you should put your bins out for your garbage to be collected, your computer would be automatically recruited to mine cryptocurrencies. And if you think that's weird, apparently something similar happened at a Russian nuclear facility, where scientists actually used its super powerful computers to mine cryptocurrency.

Kai: And predictably got in trouble for it. Oh and apparently this whole crypto craze also challenges our search for extraterrestrial life.

Sandra: The SETI program cannot expand its search for alien life because the powerful graphics cards it needs are apparently all being bought up to mine cryptocurrencies.

Kai: Well speaking of extraterrestrial life, the space station apparently is up for sale. Donald Trump and his government are changing the funding for NASA, and apparently they are no longer prepared to foot the bill for the space station, so they are putting it up for sale. That's some exclusive private real estate, coming in at a handy 3 to 4 billion dollars. So who's going to buy that one?

Sandra: Well we had a think about this and given that Elon Musk has just launched his Tesla into space we think this could be a great opportunity for him to acquire it.

Kai: Build a charging station for it, or use it as a garage. But there's been another article about people like US billionaire Peter Thiel, who increasingly buy property around the world to build bunkers to hide out from the doomsday scenarios that we're so fond of discussing on The Future, This Week. And hey, wouldn't that be a great hideout spot?

Sandra: So space stations and earth bunkers.

Kai: But you want to pick them wisely right. While an earth bunker is great to hide out from alien invasions, the space station might be better in case of a zombie apocalypse so...

Sandra: Get one of each?

Kai: Hedge your bets.

Sandra: Speaking of bets and things to come, there's been a whole bunch of articles about China's tech revolution, which has only just begun, and they talk about the ascendance of this superpower through entrepreneurship, through innovation, through a number of start-ups that have reached unicorn status this year alone. There have been about 15 of them, and effectively about a third of the world's billion dollar companies were created in China last year alone. China's ride-sharing company Didi has surpassed Uber; we've talked a lot about Uber on this podcast, but we haven't spoken that much about similar, bigger companies coming out of China.

Kai: And we want to stress that Didi has not just surpassed Uber because of the scale at which China does business: Didi has ventured into many other countries such as India, Brazil and Mexico, and is also opening a lab in Silicon Valley. So it is really starting to compete with Uber and other ride-sharing companies in other jurisdictions.

Sandra: So any last stories Kai?

Kai: So remember we talked about fake reviews previously? Now, Amazon apparently has a feature where you can only leave a review if you actually buy stuff. Turns out that doesn't stop fake reviewers. There's been an article in The Outline that reports on a retired couple in Massachusetts who found that they were receiving a lot of free deliveries from Amazon, and they started to investigate. It turns out people who want to leave a fake review for a product buy a gift card, to leave no trace, then purchase the item along with a data set of names and addresses, and just send the stuff to random people, who then find that they receive a never ending stream of free stuff that they neither ordered nor want.

Sandra: And on that very strange note that's all we have time for this week.

Kai: Thanks for listening.

Sandra: Thanks for listening.

Kai: Well that was depressing.

Sandra: Yeah you thought Facebook was shit.

Kai: The whole Internet is gone.

Sandra: And you can't even afford to buy the space station.

Kai: Gonna wait for Musk to take it to Mars now.

Sandra: Better luck next week.

Kai: He's going up there you know.

Outro: This was The Future, This Week made awesome by the Sydney Business Insights team and members of the Digital Disruption Research Group. And every week right here with us our sound editor Megan Wedge who makes us sound good and keeps us honest. Our theme music was composed and played live from a set of garden hoses by Linsey Pollak. You can subscribe to this podcast on iTunes, Soundcloud, Stitcher, or wherever you get your podcasts. You can follow us online on Flipboard, Twitter or sbi.sydney.edu.au. If you have any news that you want us to discuss please send them to sbi@sydney.edu.au.
