Sandra Peter and Kai Riemer
The Future, This Week 19 May 2017
This week: the Frightful Five, white men designing apps, and the ten thousand phones enslaved to like you. Sandra Peter (Sydney Business Insights) and Kai Riemer (Digital Disruption Research Group) meet once a week to put their own spin on news that is impacting the future of business in The Future, This Week.
The stories this week
Google, Not the Government, Is Building the Future
Guy Gets Inside A Chinese Click Farm
Other stories we bring up
2015 Chinese App Store ranking manipulation farm
MIT grad student Joy Buolamwini TED talk
The Entrepreneurial State, Mariana Mazzucato (book)
The Future, This Week – Amazon is coming to Australia (special edition podcast)
You can subscribe to this podcast on iTunes, Soundcloud, Stitcher, Libsyn or wherever you get your podcasts. You can follow us online on Flipboard, Twitter, or sbi.sydney.edu.au.
Send us your news ideas to sbi@sydney.edu.au.
For more episodes of The Future, This Week see our playlists.
Dr Sandra Peter is the Director of Sydney Executive Plus at the University of Sydney Business School. Her research and practice focuses on engaging with the future in productive ways, and the impact of emerging technologies on business and society.
Kai Riemer is Professor of Information Technology and Organisation, and Director of Sydney Executive Plus at the University of Sydney Business School. Kai's research interest is in Disruptive Technologies, Enterprise Social Media, Virtual Work, Collaborative Technologies and the Philosophy of Technology.
We believe in open and honest access to knowledge. We use a Creative Commons Attribution NoDerivatives licence for our articles and podcasts, so you can republish them for free, online or in print.
Transcript
Introduction: The Future. This week. Sydney Business Insights. Do we introduce ourselves? I'm Sandra Peter, I'm Kai Riemer. Once a week we're going to get together and talk about the business news of the week. There's a whole lot I can talk about. OK let's do this.
Sandra: Today in The Future, This Week: the Frightful Five, white men designing apps and the ten thousand phones enslaved to like you.
Sandra: I'm Sandra Peter. I'm the director of Sydney Business Insights.
Kai: I'm Kai Riemer. I'm professor here at the business school. I'm also the leader of the Digital Disruption Research Group.
Sandra: So Kai, what happened in the future this week?
Kai: Our first story is from The New York Times, by an author we have discussed on The Future, This Week previously, Farhad Manjoo, and the article is titled "Google, Not the Government, Is Building the Future". So the article talks about investments in research and development and how, in the US in particular, Silicon Valley and tech companies are now investing almost as much in their research and development as the entire US government spends on non-military research and development. And this is just five companies here, which the author calls the Frightful Five, and that goes back to an earlier article that we will also discuss. So these are Alphabet, which is the parent company of Google, Amazon, Apple, Microsoft and Facebook. So Sandra, what's that all about?
Sandra: So this is playing on that persistent criticism of Silicon Valley, that what it invests in is not world-changing ideas but rather things like the Juicero, which we spoke about...
Kai:...Yes, the seven hundred, or four hundred, dollar juicer which we loved so much...
Sandra:...Because you could get the same result by squeezing the packet with your hands, basically. So the argument is that Silicon Valley is investing not in world-changing ideas but rather in things like the Juicero, and that this is a particular problem given that government investment (and this article in particular is looking at US government funding for big scientific breakthroughs, world-changing technologies and even infrastructure projects) just keeps falling, and will probably continue to fall under the current presidency. So the question is then: are these technology giants, and not the government, building the future? And the example that the article uses to discuss this is artificial intelligence.
Kai: Yeah. So the article says that we shouldn't be fooled by the expensive juice and that those companies also do other good things. And the article enumerates a few: self-driving cars, rockets and space programs as in Tesla, potentially flying cars, but also everything to do with the smartphone, like voice assistants, augmented and virtual reality, but also drones and of course artificial intelligence and all the good things that are supposedly to come from artificial intelligence. Now there is a little caveat in the article which says that the tech industry is in a bit of an awkward phase of transition, marked by incremental improvements to technologies that we think of as boring, and lots of exciting promises of far-off tech revolutions, as in artificial intelligence. So that's a little bit of what we're talking about. Right, so there are a lot of things that people are working on, but there is not too much to show for it in artificial intelligence at the moment. There are some things that are great, but even the examples in the article, like we can now find people hugging in photographs, I mean this is not world changing.
Sandra: But this clearly will be world changing, since the Frightful Five are investing 60 billion dollars in research and development and the government is investing 67 billion in all non-military research, and that is everything from looking at cancer or Alzheimer's disease to multiverses, basic research in every industry and domain you can think of, and artificial intelligence. I think the entire American government spent about one billion dollars on artificial intelligence last year, which is a very small number compared to what these companies are spending. So I think it's worth looking at the assumptions that this makes. First, there's this story that we tell ourselves, and that Silicon Valley is great at reinforcing, about where innovation happens. Innovation happens in someone's garage. It's someone like Steve Jobs or like Bill Gates who, in their free time, in their mom's garage, came up with these world-changing ideas, and all of the major breakthroughs that we have now, including our iPhone, are based on this sort of invention. But is that actually the case? If we're looking at most of the major innovations since the Second World War, a lot of risk was involved in developing those technologies, and that risk wasn't actually taken up in the first instance by companies or by lone entrepreneurs in the garage but rather by government investment. So even if we look at something like the iPhone: the iPhone couldn't exist without a lot of initial government investment. The core technologies that we have around the touchscreen, around GPS, around Siri, all of these technologies initially go back to publicly funded research that formed the basis for garage entrepreneurs.
Kai: Absolutely. The Internet was invented out of the military and at universities. It's not something that any one company could have thought up or brought into being, because then it would not be this open, shared infrastructure.
Sandra: And that is the point. This is not to say that the Internet would not have existed or appeared if it was only private companies doing these investments, but the idea of having a public, open internet that is not beholden to a specific country or a specific government or a specific organisation, that would probably not have happened.
Kai: Or take the MP3, which was invented, or developed really, at the Fraunhofer Institute, a public organisation in Germany, as a result of years and years, decades, of research into psychoacoustics, the science of how the human ear basically processes sound and how the brain takes up sound. So the MP3 would not exist if it wasn't for basic research that was publicly funded. So the concern here is that if governments keep reducing their contribution to research and development, we will have less and less basic fundamental research that is two or three steps removed from actual commercialization, and that will, down the track, actually impede commercialization and innovation as we have less and less of the basic technologies that underpin innovations.
Sandra: So we think this is both a question of investment, of how much money we are putting behind this, and also a case of public perception around what entrepreneurship is and where it comes from. And there is really quite a nice book by a researcher from Sussex University called Mariana Mazzucato, who writes about the entrepreneurial state. It's basically trying to debunk these public versus private sector myths of where innovation comes from, looking at decades of innovation where the state is actually the entrepreneur.
Kai: And this myth seems to be hard to debunk, because our entire everyday world seems to now revolve around those big five companies, which goes back to the article a week ago by the same author, titled "Tech's Frightful Five: They've Got Us". It talks about how those five companies underpin a lot of what we do on the Internet, how we consume the news, how we get our products. They're really very much an integral part of our lives. And so the author asks: can we actually live without those companies?
Sandra: Yes. And the author sets this evil dare: if you were forced to abandon these companies, which one would you let go of first? And how would your life deteriorate as a result?
Kai: So the article had a little quiz where you had to choose which ones you would abandon first, so say pick two of the five: which one would you let go of?
Sandra: Again, the five are Amazon, Apple, Facebook, Microsoft and Alphabet, which is the parent company of Google.
Kai: Yes. Which one would you let go of first, Sandra?
Sandra: For me this is a no brainer. Facebook would have to go first for me.
Kai: I could live without Facebook.
Sandra: I think it would make no significant difference to my life. I tend to socialize more on LinkedIn and Twitter and other platforms so I would have no problem.
Kai: Of course, with each of those companies we would have to take into account that they don't own, you know, just one service. Facebook owns Instagram, for example, which incidentally I could also do without, which actually I do without currently.
Sandra: Same here.
Kai: So number one, that's an easy one. Going to number two, we already find ourselves in a bit of a pickle, right?
Sandra: Yes, so what we have left is Apple, Microsoft, Amazon and Google.
Kai: OK. We actually need to make a few assumptions here, right? So first of all, do we say the world stays the same but I have to abstain from using those services, or are we talking about that evil monarch that the article talks about taking away the service, and just everyone has to do without?
Sandra: I think let's first look at which one we could do without personally, from a consumer perspective, and then let's just imagine that company does not exist. So as a consumer, what would be the next one to go for you?
Kai: Well, that's a hard one. I would like to say Microsoft, but then I use Microsoft products all the time, so I would have to go generic, which isn't easy, because, you know, the whole publishing practice and everything ties in with Microsoft Word. PowerPoint I could probably do without, I could go to Keynote, that'd be easy. But letting go of Microsoft is a bit of a hard one. They also own Skype now, and Yammer, and I built a good deal of my research career on Yammer and I'm still using Yammer. But I'd probably say Microsoft, and I'd have to go generic, because let's face it, I couldn't do without Apple, and Google is just so much part of the Internet, it's very hard not to use any Google services.
Sandra: You could let Amazon go.
Kai: Yeah.
Sandra: Now that they're coming to us yeah.
Kai: But that raises another question. Let's have you choose first: what would be your second one?
Sandra: My second one would probably, from a consumer perspective, be Amazon. We have become so used, down here, to living without Amazon from a shopping perspective. If we look at what else Amazon has got to offer, then it gets into a much more complicated question. I would have to give up all of my personal websites, we would have to give up the Sydney Business Insights website, probably a lot of our podcasts.
Kai: Absolutely. I mean, Amazon Web Services underpins so many services these days. If you had to stop using anything that runs on Amazon infrastructure, you'd pretty much stop using everything, I'd say.
Sandra: So we're down to Microsoft definitely going next. They're a goner.
Kai: But let's talk about Amazon for a moment. We had this special on Amazon a couple of weeks back, right? So at the moment, here in Australia, letting go of Amazon is actually not such a big deal. But if I look at what Amazon does in the US, what Amazon does in Europe, if I was in one of those countries it would probably be a lot harder, because you'd be using a lot more of their services, which are arguably really quite practical, maybe not to the extent that the author Farhad Manjoo is using Amazon. So I urge you to have a read of the article; it's quite frightful the extent to which Amazon can be part of your life, because they really have a lot of services, like Alexa, which you can have organise your entire home, you know, your lighting, your TV, your shopping, all your entertainment needs. Which points to the issue here, right? You can have those companies be a very big part of your everyday life. But for me, I think either Microsoft or Amazon, which leaves the discussion of Google or Apple.
Sandra: Very tough one. And unfortunately, even though the iPhone is probably the thing I use most every single day, and I cannot really imagine a life without my iPhone and my iPad and my MacBook, Apple will be the next one to go for me. It's a difficult life to imagine, but Google search, Google Maps...
Kai: Ah, look, here I think we part ways, because there is no way I will go back to using a Windows computer, and there's no way I'm going to...
Sandra: You can't, you've already given up Microsoft.
Kai: That's right, yeah, true. So I have given up Microsoft, you haven't, you have given up Amazon, so there you go. I've given up Microsoft, so I'm stuck with Apple, which serves me well because I like their products and I'm very much part of their ecosystem. I'm not, however, part of the Google ecosystem that much. So yeah, I'd have to look for a different search engine, but let's assume that there are search engines that do not utilize Google's search mechanism.
Sandra: I think there is a thing called Bing. You might want to google that.
Kai: That's Microsoft. And you know, I have already let go of them. So I'm now stuck with no search on the Internet, which is a bit of a pickle. But you know, I think it's Google for me, not Apple. Which means no more iMessages for us, Sandra.
Sandra: I was going to say, this will make it quite difficult. We will need smoke signals or carrier pigeons. But our conversation about how this changes our particular lives also obscures one other thing, which is that giving up any of these companies would also mean giving up the large infrastructure that they have built. So a huge problem with giving up something like Amazon is that most of the web now runs on Amazon Web Services, and if we were to lose that, a lot of what we today consider the Internet would disappear. Same with Google.
Kai: Absolutely. And also, this conversation, and thinking your way into this, shows how much some of those companies have managed to become part of that back-end infrastructure and therefore have made themselves indispensable. Whereas with Facebook, for example, you would only lose a few nice-to-have services you could do without, some of those other companies are really essential, and short of giving up using the Internet entirely, you wouldn't be able to do without them.
Sandra: So the most valuable companies on the planet, collectively worth trillions, are not going to go away anytime soon. And The New York Times article makes the case that whilst these tech giants are investing all this money in working on problems like artificial intelligence, they will continue to have a huge impact on society.
Kai: But the article also makes the point that when we leave the development of certain technologies to the private sector, they will then shape those technologies in their image. So take artificial intelligence: if it's the big companies that invest in artificial intelligence and shape what artificial intelligence will do, then this would benefit them at the expense of others.
Sandra: So the author urges the U.S. government to invest more in artificial intelligence.
Kai: Yeah and I can't see that happening at the moment. The White House is...
Sandra:...Has more pressing problems.
Kai: Yes they're not really interested in artificial intelligence. They're looking for real intelligence at the moment.
Sandra: So speaking about white men building the world in their image...
Kai:...That brings us to our second story. The article in TechCrunch is called "Design's exclusion problem", and it talks about the problem that when one group of people designs a service, a product, an app, they will invariably make certain assumptions that embody their view of the world. And there are a few nice examples that show what we mean by that.
Sandra: So the article gives a really good example with Apple's health tracker which was meant to track everything that is really important to you. However it forgot to account for the number one health and wellness item that women track...
Kai:...Fertility cycles and their period. You got me to say it.
Sandra: But it also shows in things like Facebook's affinity marketing, which actually enabled marketers to select based on racial or ethnic profiles and directly exclude certain categories, such as Asian Americans or African Americans, from their marketing efforts.
Kai: That's racism by design. It was basically part of the service to target certain ethnic or racial groups.
Sandra: And this is a reinforcing loop, because a lot of the big data that's gathered in this way then goes on to inform things like webcams. And there is a wonderful TED talk by an MIT grad student who was working with facial analysis software, where she noticed a big problem: the software didn't actually detect her face, because the people who had coded the algorithms hadn't taught it to identify darker skin. So she didn't show up as a person at all.
Kai: And we have a short clip for you.
Audio Clip: "So what's going on? Why isn't my face being detected? Well we have to look at how we give machines site. Computer vision user's machine learning techniques to do facial recognition. So how this works is you create a training set with examples of faces, the faces of faith. This is not a face. And over time you can teach a computer how to recognize other faces. However if the training sets aren't really that diverse any face that deviates too much from the established norm will be harder to detect which is what was happening to me."
Sandra: This raises a number of questions around the future of these businesses. One is to what extent you want to democratize your products or your services or are you chasing returns on the money that you have spent developing these products or services.
Kai: Yeah, and there's two issues here. One is to what extent the public tolerates the explicit exclusion of certain groups from certain services, and we talk a lot about the digital divide. The other is to what extent these are accidental problems, like training artificial intelligence algorithms with biased data sets. You know, the camera not recognising a black face is certainly not something that was deliberately done, but it shows the dangers of a certain monoculture of people designing services for the world. And we've certainly discussed at length the sexism and other discrimination or biases that come from a certain Silicon Valley culture, and also from a male-dominated and rather young group of people designing those services that are supposedly useful to the wider population.
Sandra: And these services will crop up in a variety of places in our society. So quite often we tend to think about consumer products, like webcams or facial recognition on your phone or in your digital camera. But the same facial recognition algorithms that these companies are developing (because the government is investing less and less money in this) will show up in airports, will show up in CCTV footage that has to track or recognise or profile a certain category of people, so we do have real dangers that will stem from this. The second problem is the one you've highlighted just now, with the young, white, male-dominated culture that develops these products and services. What are the opportunities for this to change? Because the moment we design things, we design them for the world around us, and if the world in Silicon Valley is less and less diverse, how will they overcome this problem?
Kai: Yes. And not only that: even if the design of an algorithm or a software or a data analytics engine is not biased per se, the data that we use might be. We often talk about big data and big data sets, and we always have to ask where that data comes from, who is represented in the data and who isn't. If we take big data sets that we derive from social media, from smartphone usage, from certain sensors, then it is more than likely that affluent people and younger people are more highly represented in these data than older people or people that do not have ready access to these technologies. And if we then use this data to make decisions about public services or government services, or even the creation of new services to be marketed, we are just reinforcing a spiral that will lead to further and further exclusion...
Sandra:...Or even private services: who gets insurance, to whom do you give out a loan, who gets admitted to university, how do we price certain products or services for certain categories of people?
Kai: So those services that are supposedly intelligent, because they are artificial intelligence-based services, will only reinforce the kind of stupid mistakes that we as humans make, even where the technology is supposedly so much more intelligent and unbiased than we are. And we've discussed this before.
Sandra: So don't believe everything the data tells you?
Kai: Absolutely. First of all, ask the question: what does the data not tell you, and what is it that the data cannot tell you, because it might not even be represented in that data in the first place? And second, the data might not even be true. It might be fake data. Because there are all kinds of services now on the Internet where you can buy data, where you can buy reviews, where you can buy likes.
Sandra: Which brings us to our last story.
Kai: This one we got from Digg and it's called "Guy gets inside a Chinese click farm, and holy crap..."...
Sandra:...You mean holy shit...
Kai: Yes. "..That's a lot of phones.”
Sandra: So this starts out like a joke: a man walks into a Chinese click farm. But it turns out to be less of a joke. It turns out that if you want to run a business and you want to inflate your app's ratings or write fake reviews, you can't just pretend to have this many phones downloading the app, you actually have to have the phones. So in China there are click farms with tens of thousands of phones, where manual labour is used to download apps and to write fake reviews.
Kai: And we have a little video that we will post in the show notes. But imagine this: there's a big shelf with hundreds of phones, they're all plugged in, front facing, and there's a person sitting in front of this, usually paid very little money, and all they do is download an app to each phone or click "like" on certain posts. So this is big business. As a developer of an app, or even to promote your content on platforms, you can buy "likes", you can buy downloads, you can enter into a contract with a click farm and say: keep me in the top 10 downloads in this category for three weeks. And then a battery of phones and an army of people will just do that for you, for comparatively little money compared to what you are likely to make from the customers that you are driving to your business by doing it.
Sandra: So let's first of all note that companies like Apple have been trying to prevent this from happening, which is why we actually see these click farms: you cannot really use an iPhone emulator to do this, you need to actually have that many phones and people manually doing this.
Kai: So you can't just find an algorithm that does that for you. You really have to sit people down in front of phones.
Sandra: You're sitting people down for probably less than a dollar per thousand clicks. These people are in countries like China but also in developing countries like India or Nepal or Sri Lanka or Bangladesh.
Kai: So companies try to prevent this. Apple has made a lot of effort, but not every one of the Frightful Five is doing it that way, right?
Sandra: No. So while Facebook explicitly forbids buying clicks from click farms, they do offer a legitimate way to pay for "likes": by advertising your page prominently in front of people who supposedly want to see your page. So you pay some money, and the promise is that you will get more "likes" from these people. So check out what happened when Derek Muller from Veritasium created a page for Virtual Cat, a pet like no other, which was supplying the worst, most annoying stuff on cats. No one would actually like this page. But he created this page and paid Facebook to actually get some "likes".
Audio Clip: "So the real mystery to me is why someone somewhere would click on ads that they didn't care about without making money from them. I mean I don't think these clicks came from bots. They're too easy to identify and eliminate. And I also don't think for a second Facebook would pay click farms to click on those ads to generate revenue for them. So it really seems like a mystery. And then in this article I found what I think is the most reasonable hypothesis click farms click the ads for free. In order to avoid detection by Facebook fraud algorithms, they like pages other than the ones they've been paid for to seem more genuine. I mean you can imagine a thousand likes on a particular page coming from one geographic area in a short period of time would seem pretty suspicious but buried in a torrent of other like activity they would be impossible to identify. So workers at these click farms will literally click anything. I mean where do you think Facebook security page is most popular. DHAKA Bangladesh. What about Google? Dhaka."
Sandra: So companies like Facebook might have an incentive to outlaw click farms but they actually have a financial incentive to keep the status quo with their own not so real "likes". Otherwise they really would have to admit that they've generated significant revenues from clicks that weren't really genuine.
Kai: Which goes to show that in any system, any business practice really, where you have a set of metrics like 'likes' or 'views' or 'number of downloads' which supposedly measure the success of something, once you tie this measurement to a financial incentive, you will have gaming of the system. So if you measure the performance of a CEO by the share price, their incentive will be to drive the share price, not necessarily to look after the company as a whole. If you measure academics by the number of their journal publications and tie their salary or incentives to those publications, you will drive up the number of journal publications, but that doesn't necessarily translate into better research, so you might end up with more mainstream research. And equally, if you measure the success of an app by the number of downloads and you rank them, then you give every incentive to drive the number of downloads to be in that ranking, which will then drive the number of genuine downloads and therefore business. So there's every incentive to game that system to be in that elusive top 10, because this is where the money is. So how do we deal with this?
Sandra: So to me, one answer would be the same as in the design exclusion problem, which is the issue of transparency, especially if you have an organisation like Facebook promoting these rankings, or the opportunity to pay for them, themselves, and you don't know what's happening behind the process. Or having multiple sources for compiling these ratings and rankings or likes, to try to ensure that you have some reliability in them. Or maybe not doing them at all.
Kai: Yeah. Maybe being comfortable judging quality qualitatively, rather than finding a single measure which supposedly encapsulates quality.
Sandra: So I think this is all we have time for today. Another quality podcast from The Future, This Week.
Kai: Thank you for listening.
Sandra: Thanks for listening.
Outro: This was The Future, This Week brought to you by Sydney Business Insights and the Digital Disruption Research Group. You can subscribe to this podcast on SoundCloud, iTunes or wherever you get your podcasts. You can follow us online on Twitter and on Flipboard. If you have any news you want us to discuss please send them to sbi@sydney.edu.au.