Sydney Business Insights
The trust shift
Rachel Botsman is a best-selling author, her TED Talk views are in the millions, she lectures at Oxford University on the sharing economy, and her new book "Who Can You Trust?" seeks to change our perception of trust. Today we talk to Rachel about whether we are on the cusp of one of the biggest social transformations in human history.
Show notes and links
You can subscribe to this podcast on iTunes, Spotify, Soundcloud, Stitcher, Libsyn, YouTube or wherever you get your podcasts. You can follow us online on Flipboard, Twitter, or on sbi.sydney.edu.au.
Sydney Business Insights is a University of Sydney Business School initiative aiming to provide the business community and the public, including our students, alumni and partners, with a deeper understanding of major issues and trends around the future of business.
Share
We believe in open and honest access to knowledge. We use a Creative Commons Attribution-NoDerivatives licence for our articles and podcasts, so you can republish them for free, online or in print.
Transcript
Introduction: Rachel Botsman is a best-selling author, her TED Talk views are in the millions, she lectures at Oxford University on the sharing economy, and her new book "Who Can You Trust?" seeks to change our perception of trust and our understanding of how traditional ideas of banking, media, politics and consumerism are being radically transformed. I'm Sandra Peter and today we ask whether we are on the cusp of one of the biggest social transformations in human history.
From the University of Sydney Business School, this is Sydney Business Insights - the podcast that explores the future of business.
Sandra: Welcome Rachel and thank you for talking to us today.
Rachel: It's lovely to be here.
Sandra: How did you start thinking about trust and how did this book come about?
Rachel: Well, I was working for the Clintons (Bill Clinton, not Hillary) and I think that was the start of me thinking about the relationship between trust and power. This was in 2008, when we were starting to use technology seriously in election campaigns to mobilise people. And I left to write my first book, "What's Mine Is Yours", about the so-called sharing economy, and the piece that always fascinated me was: how do you take these ideas that really shouldn't work on paper and get strangers all around the world to trust one another through technology? So I'd been researching that and looking at it in the context of marketplaces, and then about three years ago I had a hunch that it was part of something even bigger. The hunch was that trust in institutions, trust that used to flow upwards to experts and regulators and authorities and academics and CEOs, was starting to flow sideways, and that a lot of the patterns of disruption, a lot of the pain, a lot of the things we're seeing in politics were tied to this really big trust shift.
Sandra: Is there too much trust these days or too little trust? Is there a trust crisis? We see trust in the newspapers every single day.
Rachel: Yeah, I'm not a fan of that headline "trust is in crisis". I don't think it's a helpful narrative, because it actually just amplifies distrust. So to answer your question, I think there is actually plenty of trust out there, it's just flowing in a different direction, and often the people calling this a crisis are doing so because the trust is moving away from them. The most helpful metaphor, if you like, is that trust is like energy. Energy doesn't get destroyed; it continually changes form. And I think that's what happens to trust in society, and it's happening right now.
And so yes, trust is being eroded in many institutions, but that trust is not disappearing, it doesn't dissipate, it just flows in a different direction, to someone or something else. And sometimes this is a good thing, because you can see how it can democratise voices and power, but sometimes it means we end up placing our trust in the wrong people, in the wrong places, and we actually give our trust away too easily.
Sandra: How is the way we trust changing?
Rachel: The interesting thing is that the process of trust itself, what trust is, that's not changing, right. That's something I think is very innate to human beings, the way trust works, and I do think of it as a process. Whenever you're asking someone to trust, there's something known and then there's something unknown. The unknown thing could be a new person they've got to place their faith in, it could be a new technology like a self-driving car, it could be a new restaurant, a new place, a new concept, whatever it is. And the line between these two things is something we often talk about as risk. I feel like we're very comfortable talking about risk, because it's something quite hard, something that feels like it can be measured. But that's not what enables us to be vulnerable, it's not what enables us to place our faith in things and people, and that's trust. And so the way I define trust is that it is a confident relationship with the unknown. What's changing is how we get the information, who we can trust, where we place our trust. That's what's changing through the technology.
Sandra: So is technology helping us trust more or trust less?
Rachel: I think technology is actually speeding up the process of trust. So it's kind of like an accelerated age of trust, like trust on speed. If you've ever watched a friend on Tinder, you know: swipe, swipe, swipe, swipe, swipe. A friend was like "I haven't got a date" and I was like, it's 11am, it's okay, you've got like six hours to go. I'm guilty too: I say, oh, my Uber driver's three minutes away, right, five stars again, because if I rate you five out of five you'll rate me five. And so this is the interesting thing: what trust actually needs is a little bit of friction. I think enduring trust actually needs human contact and human judgement, because there are so many signals that you tune into, and technology and technology companies are very good at making that process seamless, to the point that we don't think about what we're trusting. Probably the area of my life where I've become most aware of this is information and media content. Where I used to trust traditional media institutions as to what I should read, I'm now just trusting algorithms; that process is becoming more and more automated, and you have to consciously slow it down.
Sandra: So how do you create that friction? How do you break out for instance of the Twitter bubble you might be in or the LinkedIn one or the Facebook one?
Rachel: The first thing, I think, is actually acknowledging or seeing that you're in the bubble, and that often happens, unfortunately, through something momentous. I realised after Trump that I hadn't interacted with a Trump supporter, and the only voice I'd really been listening to that said he would get elected was Michael Moore. So I think the first thing is an awareness that you are in a bubble, and then it's very, very conscious curation: you really have to go out there and find those different voices. I think this is a big problem, because we can self-select what we listen to, so to some degree we're not hearing those dissenting voices; we're less comfortable living in that place of discomfort and being challenged. I also think part of the decline of trust in experts is that we demand experts have an absolute black and white opinion on something, and it's not okay for them to come out and go "you know what, we're not certain, and here's why we're not certain". I think we actually have to be comfortable with that uncertainty, to encourage that uncertainty. And the other thing, I think, comes from the design of the products and services. I've never had a product or service experience that says "are you sure you want to click on this?", because we read those terms and conditions in 10 seconds. I was signing up to an account last night: "do you want to link this to your microphone, do you want to link this to your contacts, do you want to link this to your Facebook account?" All in a minute, right, all these things. Where is the "are you sure you want to give this app the ability to listen to your conversations?"
Sandra: And did you?
Rachel: No, I didn't, and that's the thing: I actually pause now and go "do I really want to do this?". And then: what is the benefit, why am I giving this away, and am I giving my trust away too easily?
Sandra: Is this also about literacy? About how we use these products? How we use these services?
Rachel: 100 percent. And I'm not saying the people designing these products and services are bad, but it's what the technology inherently wants: it wants to be seamless, it wants to be frictionless, it's measured by how much of our attention it gets every single day. So it is a case of literacy, and I think it's also a really hard challenge, because a key component of trust, or I should say trustworthiness, is integrity and intentions. So Sandra, you are very competent, and I know that you're dependable, and I know that you care, but your intentions, I think, are really the key piece in terms of enduring trust. I don't know how you trust the intentions of machines and algorithms; how you develop literacy to understand that is the challenge.
Sandra: To get around that we've tried to digitise things like reputation. We've tried to solve problems like the one in the childhood story of the nanny that you describe in your book; we've brought in technology to try to digitise reputation. I want to explore a little bit whether that is useful in any way, or whether the downsides that come with it are too great.
Rachel: We digitised music, film and identity, and I think it's inevitable that reputation is next. By reputation I mean what people think and say about you online, how you behave, so everything from did you pay on time, to what other people think of you if you're an eBay seller or an Airbnb host, to what people think of you at work. The story of the nanny was a really important story in the book, because I realised that this obsession with how we place our faith in strangers started when I was five, I think, when my mum hired this nanny. It's a long funny story, but she was competent, she was reliable, I think she even did care, but it turned out that she was running a very, very large drugs ring in north London. And then my dad was arrested, because she'd used our car as the getaway car in an armed bank robbery. So I started thinking: how did my parents make this bad decision, this poor judgement? And I realised that they faced this trust gap: they thought they had enough information, but there was an illusion of information. And then I started to wonder whether they would have made the same mistake in the digital age. One of my favourite quotes in the book, it's not mine, it's from Diego Gambetta, is that trust has two enemies, not one: bad character and poor information. The reason I raise that is because that's where the reputation systems become really interesting. Now, I'm the first one to say they are extremely flawed on many sites, because trust is really contextual. So just saying someone's five out of five isn't helpful, right.
And you're starting to see places like Airbnb rate accuracy and cleanliness and responsiveness, so you're starting to see people rated against things that are actually useful. And then the babysitting sites were really interesting to me, because 75 percent of all applicants were rejected: they're using machine learning to actually verify, do you have that childcare certificate, do you have a clean driving record, which I thought was really interesting. But what they're also doing is thinking about signals and reputation in a different way, like who do you know that I know, like parents at my son's school. The repeat-family badge, I think, is a really interesting one: did that family have that sitter back? And so I think the way we measure and aggregate reputations will be really different in 10 years' time, but it will become a currency.
Sandra: Not across the board right? We're seeing places like China develop a very different way of assessing or digitising reputation.
Rachel: We are, and this raises a really key question, and it's not just around reputation: who manages and owns the data around the reputation, what is done with it, and how much control do we have over it? The China chapter was the hardest one to write. In China there's this system called a Social Citizen Score. Think of it like a trust score, and it will be mandatory by 2020.
Sandra: And how does it work?
Rachel: So it has different inputs. Some of the inputs are pretty easy to understand, so: do you pay your bills on time. But then they're looking at purchasing histories, they're looking at things that you say. The piece that I find really frightening is the potential that if you say something online that the Chinese government didn't like, your score would actually impact my score, because we were linked in.
Sandra: Because you came on this podcast.
Rachel: Literally linked in. And there's a culture in China that you keep other people accountable. When you interview Chinese people about this, they think it's a good thing: there's a lack of traditional credit histories in China, there are high instances of fraud, so they can see the economic argument. And the reason the chapter was so hard to write is that it's really easy to point the finger at China and go, look at that Orwellian system, you know, we'd never have that in the West. But then...
Sandra: ...aren't we really doing it in the West?
Rachel: Exactly. It's just not visible. So at first I wrote it and I was like, this is completely wrong, but their argument is right: there is an element of control in the visibility of the information. The thing that deeply disturbed me writing that chapter was that the penalties do not fit the crime.
Sandra: So what are the penalties in this instance?
Rachel: I introduced it with the rewards, which I always find really interesting, and then: they've just banned more than six million Chinese people from going on aeroplanes and high-speed trains because their trust scores are too low. Your kids can't go to certain schools, you can't apply for certain jobs. It goes on.
Sandra: You can't get a nice table at a restaurant.
Rachel: You can't get a nice table, you can't go to certain hotels, certain golf clubs. Your world becomes a very small area.
Sandra: Is this a Black Mirror episode?
Rachel: It is. The thing about Black Mirror in general is that it's so clever, because as Charlie Brooker says, it's the future in 10 minutes' time if we're clumsy. You look at that "Nosedive" episode: she wants the apartment, which is why she's trying to improve her score.
Sandra: And smile better.
Rachel: I love that she practises her smile every morning. These systems understand that you need a goal, a goal that's near, right, and so what are you willing to give up to get to that goal? You need the carrots, and therefore you'll put up with the sticks. It really is gamified obedience, but I think it's more than that: I think it's a popularity contest that only a few people can win. And it's just not that far off, because if you look at your own behaviours around whatever the social media channel is, there's already a form of addiction there, in terms of a race to the top or a need to be liked.
Sandra: But also if we think about how that's playing out in the West, in places like Instagram with hashtag vanlife, where we manufacture these pictures of a life that doesn't really exist because we want to portray it as so perfect. How is this idea of digitising reputation working out with the for-profit companies we have in places like the US or Europe?
Rachel: They literally see this as liquid gold, because in the traditional insurance industry, for example, the whole business model is being able to predict how you behave, and they're the first ones to say that a lot of the systems they use, like the financial industry's, are retrospective. So if they can have things that are forward-looking and start to say, we can predict the propensity of your behaviour around a certain thing in the future, that's worth a fortune. One of the most interesting people I interviewed for the book was a guy called Savi Baveja, who founded this company called Trooly. He was the youngest partner ever made at Bain, he went to Harvard Business School, not that that makes him an ethical person, but just a really smart guy, and he invented this thing that does very, very deep crawling of the web to build a profile of you, which you could use to get more control over the way that you get a loan. And maybe naively he didn't realise how valuable this thing would become, that companies would want this because they could get a very deep portrait of people's traits. They did it on me, and it's really phenomenal how far and how deep they can go.
Sandra: So it's really a question of when it will happen for us. We're pointing the finger at China but...
Rachel: No, I think it's China today, somewhere near you tomorrow. I do believe this, and I think it is inevitable. People have high hopes around the blockchain, but I really think we need to be asking now: who is controlling all this data, where is it going, and how are other people extracting value from it? In China it's the government that controls this, which I don't think is going to happen in the West, but you've got to look at who is likely to control it, and it's going to be one of the big four: Facebook, Google, Microsoft or Apple. And I don't know how you prevent that from happening.
Sandra: What about Amazon? Do they have a different role to play?
Rachel: I do this very, very crude exercise with all kinds of audiences where I ask them to clap for the company they think is the most trustworthy. So on the screen is Google, Facebook and Amazon, and Amazon always wins. And this is so interesting to me, because I can trust Amazon to deliver my packages; I don't trust them to pay taxes. I think it's so interesting that they just launched Amazon Key. It's just astounding to me that people are going to buy a smart lock and then a camera in their homes so they can see when that parcel's delivered. Amazon, I think, is a classic case of convenience trumping trust: we will give our trust away because of the convenience of the services being offered, and we mistake that for it being a trustworthy company.
Sandra: How are we giving away that trust with things like smart speakers with Alexa?
Rachel: Helpful female assistants, as they're often called. I did a very quick experiment with my daughter, who is three and a half. I said to her "meet Alexa, you can do anything you want with her", and her first question was: is she like Siri? What was interesting, and I think it's because she's half British, was that she just asked so many questions about the weather, like we knew it was not going to rain that day, and then the songs; they test it. But then what frightened me was that by day two she realised she could order things. She thought it was magic that she could use her voice, and the next day a massive box of blueberries arrived. And then the next day was what worried me, because she loves picking her clothes, and she stood in front of Alexa and said: what should I wear today? This for me marks a transition point that we're going through at lightning speed. My trust in technology, like this recording equipment, is that it does something and it's reliable. Increasingly, our trust in technology won't be that it does something but that it decides something for us, and that's where that integrity piece really comes in, because how do you trust the intentions of the decision making? And with Amazon, the Alexa, the next version has a camera, so Alexa doesn't just hear you, she sees you.
And Amazon's just launched their fashion brands, because they want to do to fashion what they've done to books, and they will, which I think is very sad. They'll make recommendations. She could change, and the Style Check would say, like, 66 percent think you should wear your tiara outfit and 40 percent think you should wear that leopard print thing. But you know what, those shoes don't really go; would you like to order a pair?
Sandra: And they will come like the blueberries and the repeat songs from Frozen.
Rachel: They will come within an hour. Not in my house but some courier will come round and open the door and voila they'll be there.
Sandra: So why this very rapid and huge shift from trusting technology to do things for us to trusting technology to make decisions for us?
Rachel: It's a really good question. I think it has been waiting in the wings for a long time, and one of the things I talk about is trust leaps, when we take a risk to do something differently. And I think we've kind of been training for this. We use Netflix and we don't think about it, but that is outsourcing decision making about what we're going to watch; the Twitter feeds, whatever it is. And so I think now we're ready for these monumental leaps around decision making. I think it's rubbish when people say people won't trust self-driving cars. People will take the leap, and then millions will follow very, very quickly, and we will trust that car to decide, to the point where, within a short timeframe, human decision making will come into question.
Sandra: Why do we assume that these machines can behave ethically?
Rachel: Do we assume? Is it an assumption? Is it a desire? Or is it just a lack of thought, a lack of consciousness?
Sandra: Probably the latter. Maybe it's the speed at which we make these decisions that makes us trust, because it's much easier.
Rachel: Yeah because I don't think it's apathy or ignorance, I think it's that it's seamless. I think it's that there is no friction in there and it works so beautifully. We're naturally trusting beings. We want to give it away, we do, so weird things become magnets for our trust. And so you actually have to design things for people not to trust them. If that makes sense.
Sandra: Germany has recently developed a set of guidelines, they're not legally binding, for car manufacturers around autonomous vehicles, and they made a big case out of making sure that the people who design and build these machines are the ones who should be held accountable and responsible for what these machines do. How do we place accountability and responsibility back on platforms like Amazon or Google or Facebook?
Rachel: Accountability is one of the key questions of the decade, and I think we see this playing out in all different places. I'm not sure it's as simple as the creator is accountable, because who is the creator? To ask a really basic question: when you're turning ethics into code, is it the person that programs the car, is it BMW the car manufacturer? So where that trust really lies, I think, is an interesting question. And then I think more broadly with accountability and the platforms, Facebook is a classic example: they're starting to say, okay, we kind of acknowledge we're not a neutral pathway. Kind of, right? Do you agree?
Sandra: I agree with kind of.
Rachel: Yeah, there was a big emphasis on "kind of". But: "we're not a media company". Because if you're a media company, then you're exposed to all different kinds of regulation, right. So: we're not media companies, therefore we're only going to have four and a half thousand community analysts who are going to edit this content. Well, that's one analyst per 466 thousand users.
Sandra: They're up to about eight and a half thousand now but it's not making a difference.
Rachel: But it is a drop in the ocean, right, it really is. Is it 4,500 photos every second? What are they? And that's the troubling thing. And even with things like Uber: I was in London the other day when Transport for London said, we're not reissuing your licence. It's an argument about accountability, because they're not saying we don't believe in ride-sharing, they're saying, when something goes wrong, you the company have to take more responsibility. I do think they need to take more responsibility, and I do think we need to grow up, but there's this really delicate tightrope we're walking, because we don't want to overregulate these things; there has to be some personal responsibility, and so it's how we find that balance. And I think they have to take accountability for the things that are hidden, and this is where I think the notion of transparency is really interesting when it comes to trust, because people go "more transparency is the answer to trust, right?". But you've given up on trust when you need things to be transparent. What I think we need more information around is the intentions of these things, and those are really ethical questions. So again: who's training these people, how are they asking these questions, and then what is the right body to keep them accountable? Because I don't think it's traditional regulation.
Sandra: Nor is Wikipedia. Facebook's latest attempt is to say, well, Wikipedia will sort this out for us.
Rachel: Is that what they've said?
Sandra: Yes they will include links to Wikipedia on every article so you can look up the source if you want to.
Rachel: Oh really? Well the problem's solved then right.
Sandra: It's all sorted.
Rachel: But I do think the answer will come from the crowd. I do believe that. I think the International Consortium of Investigative Journalists is a really interesting model: it's not saying we don't need professional journalists and we don't need those traditional structures, but asking how people can work together in a more collaborative way that isn't stuck in a traditional media structure.
Sandra: So Rachel what worries you most about the future?
Rachel: It's going to sound quite bizarre, but it's that we forget what it is to be human. That's what worries me: that we won't be able to discern an avatar or a bot from a human being. And why that worries me is because human beings are messy and they're complicated and they make mistakes, and we're wired, not always, but to forgive people and to work through that. What worries me is that machines won't be perfect, but they won't have these moral complexities, these behavioural complexities. And so people laugh, but I genuinely worry that my daughter will fall in love with a bot.
Sandra: So the problem is not that we won't discern between a bot and a human, but that we won't care.
Rachel: Well, sometimes we won't be able to discern and we'll be fooled. I write in the book about the woman who fell in love with the cyber lover bot; she thought it was a real man. But in some cases we'll choose the bot over the human, because they will be the perfect boyfriend, so to speak. They'll know what to say, and they'll always be there, and they'll get smarter and understand you over time.
So that's what worries me: our ability to discern, but also that we'll choose one over the other, and that we really lose this sense of what it is to be human. And it's not a fear of robots taking over the world; it's about whether my children and future generations really understand what being human is.
Sandra: So our willingness to forgo the messiness for the illusion of perfection.
Rachel: Yeah and I don't know if perfection is the right word but the illusion of... is it efficiency? The illusion of something better. Yeah that is my concern.
Sandra: So what do you think will be our ability to discern fact from fiction, human from robot, and useful from right?
Rachel: It will be incredibly complicated. I don't know if we will, because you're already seeing this now: you're seeing people who know how to speak to people's feelings over facts, and in a weird way, it sounds funny to say, trust is trumping the truth, which is not a good thing. And this is where we started: that we'll place too much trust in the wrong people and in the wrong places is a real fear. And it's not even that we won't have the skills to discern; it's that they will be so good at fooling people. So how we emerge from this so-called post-truth society is a really big concern.
Sandra: And also that we might not have the skill to discern. Recently, I think it was the University of Washington that developed AI that could write restaurant reviews that were not only believable but also perceived as useful. So to what extent will we have the ability to actually discern fiction from truth?
Rachel: And do we care that they're true? This is why I think Gambetta's quote is so interesting, with bad character and poor information: do we care that it is poor information? We might care if we really understood the intentions of the person providing it, or, the other thing, if we understood the damage that it could do. So it's about how we actually understand the consequences of this decision making.
Sandra: What do you think we can do better?
Rachel: As I said, I think we need to take more personal responsibility for this. Some of it is an awareness thing: just really starting to think about how easily we're giving away our trust. I think there are some really exciting things happening in design, and I think a lot of it starts young, with education, with how you're training people to think this way. I was at St. Martin's and one of the students said they were going to do a project where they would design something that intentionally, slowly earned our trust over time, and this big smile came on my face and I said, well, give me an example. And he said, well, imagine I was a bank and you tried to give me ten thousand dollars, and I said, I don't want your ten thousand dollars, I want your ten dollars, because I'm going to prove that I'm trustworthy and then ask you for a hundred dollars. Obviously that's not a whole solution, but he was thinking in that way: he could do something immediate, but he was going to slowly earn that trust. The thing that actually gives me hope is that I think the organisations that are going to win are the ones that inject the most humanness, real humans, that really understand how this technology can amplify our emotional intelligence. That's what I think we need to be focusing on: how does it amplify us to be more and better human beings, versus outsourcing our trust to the technology?
Sandra: So what gives you most hope about the future?
Rachel: I am generally an optimist, I mean, I think I was naive; I think I'm less naive now. We have this tendency to think things are going to turn out a lot worse. Teaching gives me hope. I think we point our finger at millennials, which I find strange, because millennials are now like 38, right, so they're not young people anymore. But when you see how they think about the world, they are asking these questions. I think we're in this mess right now, but there's a generation coming through that will think completely differently about these things, and the face of media, the face of regulation, the face of banking will have a very different DNA to it.
Sandra: Rachel thank you so much for talking to us.
Rachel: I really enjoyed it. Thank you.
Outro: You've been listening to Sydney Business Insights, the University of Sydney Business School podcast about the future of business. You can subscribe to our podcasts on iTunes, SoundCloud or wherever you get your podcasts. And you can visit us at sbi.sydney.edu.au and hear our entire podcast archive, read articles, and watch video content that explore the future of business.