Innovation Files: Where Tech Meets Public Policy

Why Societal Trust Is Imperative For Innovation, With David Moschella

June 5, 2023 | Episode 78 | Information Technology and Innovation Foundation (ITIF)

Looking ahead to the technological challenges and opportunities of the next decade, social trust will be more important than ever for the tech industry. Rob and Jackie sat down with David Moschella, a nonresident senior fellow at ITIF and the author of ITIF’s “Defending Digital” series, to discuss how a lack of societal trust harms the U.S. innovation system.

Auto-Transcript

Rob Atkinson: Welcome to Innovation Files. I'm Rob Atkinson, founder and president of the Information Technology and Innovation Foundation.

Jackie Whisman: And I'm Jackie Whisman. I head development at ITIF, which I'm proud to say is the world's top ranked think tank for science and technology policy.

Rob Atkinson: This podcast is about the kinds of issues we cover at ITIF from the broad economics of innovation to specific policy and regulatory questions about new technologies. So if you're into this stuff, which you probably are because you're listening, be sure to subscribe and rate us, it really does help.

Today, we're going to talk about how America's declining social trust will harm its innovation system and also why tech is not the cause of declining social trust.

Jackie Whisman: Our guest is David Moschella, who is a non-resident senior fellow at ITIF. He's the author of our Defending Digital series, which examines popular criticisms, complaints and policy indictments against the tech industry. Our goal in this series is not to defend tech reflexively or categorically, but to scrutinize widely echoed claims that are driving the most consequential debates in tech policy. Before enacting new laws and regulations, we think it's important to ask if these claims hold water. Thanks, David, for being here.

David Moschella: Pleasure.

Jackie Whisman: This is your 14th installment of the Defending Digital series. What led you to write this particular piece?

David Moschella: Yeah, it's always a great question, and I think we all know that in the last few months there has just been an enormous amount of talk about trust, a divided country, and people not believing this, that, or the other thing.

And two things sort of popped out at me. The first was how many times I heard, well, the lack of trust in America today is due primarily to misinformation and social media, and in my view, that's just not true. And so as part of the Defending Digital sort of mission, I figured, all right, we got to at least take on that issue.

But there was a second issue, and this is really more related to the work that I've always done, which is to try to look ahead. And if you look ahead to the technology challenges and opportunities of, say, the next 10 years or so, many of those actually require more social trust than the tech industry has needed in the past.

And so that was sort of the two-edged sword that is in the title of the piece, that digital innovation isn't undermining societal trust but in fact a lack of societal trust could well undermine digital innovation.

Rob Atkinson: I want to talk about the first point. When I was working on Capitol Hill in the nineties, I was working for an organization called the Office of Technology Assessment. And that was eliminated largely because of Newt Gingrich, to be fair. And one of the things that's easy to forget is before that, the Republicans and Democrats were... They had differences, but they weren't fundamental differences. They were differences at the margin, at the edge.

Republicans tended to be a little bit more for fiscal responsibility and lighter-touch regulation, Democrats a little bit the other way. But anybody who says it was because of Facebook that the parties have gotten to this wide extreme, where you have a pretty aggressive and big progressive caucus that's really, really liberal, and on the other side now the Freedom Caucus, has it wrong: that polarization is both a cause and a reflection of the underlying breakdown of consensus in America. It wasn't caused by Facebook. So will you say a little bit more about that?

David Moschella: Yeah, there were sort of two parts to my research and thinking about this. The first was to try to understand as best I can how trust had changed over time. I think there's been a general consensus, really going back to the sixties, that America was moving from a high-trust to a lower-trust society. You look back at the sixties and people talk about the lies that were told about Vietnam, or what happened with Watergate, or doubts about what really happened with the Kennedy assassination, and in many ways the whole counterculture movement that said not to trust anyone over 30.

So it was there back then, but it has clearly bumped up in recent years. And I was just looking around for some metrics on that, and I came up with three. The first was a US News study that says America now ranks 30th in national trust, behind Italy, Greece, and most of Eastern Europe. And that's not so good.

There was a Reuters study showing that American media now ranks last out of 46 countries in trust, and that's pretty hard to do. And a Pew Research study that found just 20% of Americans trust the government to do the right thing always or most of the time. So those are pretty brutal numbers that I think broadly describe the world we're in today.

So as you said, Rob, the question is, well, why have we gotten to such a pretty dreadful state? Because it's pretty clear that that level of distrust can lead to the things you mentioned: noncompliance, stagnation, lack of problem solving, skepticism, all the things that we're seeing right now. And so my first task was to try to think about why this has happened and to dispense with this idea that somehow Facebook, social media, and misinformation are to blame.

Because the thing I immediately discovered is that it wasn't misinformation that was driving distrust; it was people actually learning the truth about what has happened, and these revelations have cut across virtually every traditional pillar of trust. When our intelligence agencies told us there were weapons of mass destruction that we went to war over and it wasn't true, or that the Steele dossier was highly credible when it wasn't; when we see the police videos of George Floyd and others; when we see the banks getting their bailouts in 2008.

And again, when you see the sickening abuse in the Catholic Church, the Boy Scouts, coaches, and even in the scientific community with what happened with COVID: the demonization of people who floated the lab-leak theory or perhaps wanted to follow the Swedish model, the really tough way they were all treated, and now seeing that maybe those arguments were better than people said. Even traditionally high-trust areas: the schools, with people's doubts about the school closings, the declining results, some complaining about indoctrination; the justice system.

For a long time, minorities felt like there was a two-tiered system, but now conservatives say that too, and liberals complain about the Supreme Court. And finally even the military, if you look at the debacle of leaving Americans behind in Afghanistan, along with all of that equipment. So really, across all of these institutions, most of which for many, many years were highly respected, trust has taken a big hit.

And none of that had anything to do with social media at all. It had to do with Americans seeing the stuff that's actually going on. Social media certainly can help amplify that, but it's certainly not the underlying cause of any of this. And so that was the first thing: to really just take that issue off the table.

Rob Atkinson: So David, just before we move on to why societal trust is important to innovation, it strikes me that one of the factors here is that you've had an elite class that in so many ways has really failed us as Americans. For example, you have people who just completely got it wrong on Iraq, paid no price, and still have not apologized. I get it, you try your best and you make a mistake, but there were no penalties, no admissions of mistakes.

And so I feel like it's a lot easier for the elite class to blame Facebook, and blame the rumors on Facebook and Twitter, for the problem when they won't admit that they were wrong and that they caused a lot of these problems. It's like, "Well, let's pass the buck over here, and then we're scot-free."

Another one you didn't mention was the significant hollowing out of US manufacturing and the complete denial of it, which goes on to this day.

David Moschella: Absolutely.

Rob Atkinson: And the average American who was in Ohio or whatever, looks at that and goes, "Oh, these people are not on my side."

David Moschella: Yeah, absolutely to all of that, and particularly the first point about accountability, that there's been none. The banks got bailed out while individual people didn't. And on that list you cited, that I cited, almost everybody has been taken care of and very rarely held to any account.

And so I think it's a very understandable reaction. And certainly with manufacturing, the way jobs were shipped to China and all the advocacy of unfettered globalization proved to be not such a great idea, and it powered China into the giant it has become. And this was driven for so long by our supposedly best and brightest leaders.

And as you say, it wasn't so bad that they made a mistake, and obviously they made that mistake ages ago, but they couldn't admit it, couldn't correct it, and even to this day haven't done much about it.

Jackie Whisman: In what ways can the lack of societal trust harm America's innovation system? We've talked about it and alluded to it, but maybe some examples would be helpful to-

David Moschella: And this is sort of-

Jackie Whisman: ... steer the conversation in the right direction.

David Moschella: And this is sort of the core of a lot of the work that I do, trying to understand what the drivers of innovation are out there. An easy way to see it is to look back at the first big wave of the internet, the whole dot-com era and its subsequent success: most of that was a consumer experience.

So if I'm using email or search or buying something from Amazon or using Microsoft Office or my smartphone apps and maps and such things, basically my use of that stuff doesn't affect many other people very much. And if it works well for me, in some ways, that's all I really care about.

So it was a consumer-driven market, and if you had consumer trust, you were in pretty good shape. That's what those companies did, and that's how they succeeded. But if you look ahead to the challenges that are coming, the 2020s issues, they're not primarily consumer issues. They're more collective, public-sphere, and B2B in nature. You can go through them; some are tech and some may not be.

Take climate change, for example. It's not really a consumer issue, but if you don't have trust and people don't believe what you're saying, it gets a lot harder to make the case to do things in that domain, and I think you certainly see that in the US. To get closer to the tech world, take something like facial recognition. I might use that to unlock my own phone, which is nice, but it's going to be deployed more broadly and do all the things it can do.

Well, that requires me to trust how people are going to use it. And of course, a lot of people don't, and therefore its use is being pushed back on. If you look at all the AI and ChatGPT activity going on now, do people trust that? Do they want to pursue it? How it's going to affect society is sort of the biggest question, and then there are all kinds of specific ones, easier ones in a way. You look at healthcare data: do people have the trust to allow that data to be shared? You look at something like digital IDs, which might solve a lot of problems in America around workers and immigration and voting and such things, but we don't do it because people don't trust the government.

You look at something like digital cash, which in some ways would also help with a lot of things, but people don't want to do that because they're afraid of the potential coercive power that's there. You look at electric vehicles right now, and people keep telling us there are going to be all these charging stations, but do people really believe them? Because it doesn't always seem to be happening.

So in all of these areas, it's not true that I as a consumer can just do my own thing. There are issues of critical mass and collective agreement and platforms and infrastructure and all of these things. And I would argue that the lack of trust is undermining America's position in most of those areas. It's certainly not too late, but you can see it. If you ask which countries are leading in some of these, well, in healthcare data you might look at Iceland. For digital IDs and such things, you might look at Estonia. For charging systems you have Norway, or smart cities in Singapore, or facial recognition in China, which is perhaps more coercion than trust.

But all of these other countries have mechanisms to build these systems that in many ways are likely to be more successful, at least initially, than ours. And so that's where I make the point that unless a certain level of trust can be restored, this will put America at risk of losing its tech edge in certain areas, and some very important ones.

Rob Atkinson: David, I couldn't agree more with you. I think this is the number one underappreciated issue in US tech policy. A few years ago, actually about a decade ago, we were asked by the Sloan Foundation to do a series of studies looking at a whole bunch of IT application areas: digital IDs, health, intelligent transportation, and a bunch of these. And we initially thought that large countries would dominate, because essentially you build a system once and then you can scale it to everybody, so your costs per user are quite low. We found exactly the opposite. It was the small countries that were dominating, because they had trust. They could bring everybody together, bring the key stakeholders together, and get acceptance, and people said, "Yeah, we're going to trust these systems."

And we don't have that now. I mean, it's almost as if we have an industry, and I'm going to get on my high horse and my soapbox now, but we have an industry that promotes a lack of trust. I'll give you an example of that: facial recognition. I'm speaking in Canada next week, although this podcast will air after that. The Canadian government has a filing or something on AI rules, and they're talking about why they need it. And one of the examples they use is, well, there was a study at MIT that showed that facial recognition was biased.

First of all, the study was on facial analysis, and facial analysis is essentially not a one-to-one match. I could take a picture of your face and try to say how old you are and what gender you are, things like that, which is really very hard to do. So yes, there was some bias in it, but nobody's using facial analysis at the airport. And it was just one of those things where the group that did it knew what they were doing. They knew they were pushing a false narrative because they wanted to advocate against the technology.

That's what I worry about, is that there's way more people out there pushing the narrative of distrust than there are the narrative of trust.

David Moschella: Overwhelmingly. Other than some of the work we're doing, I can't think of too many pushing back on these narratives, which are now the conventional wisdom in most of the areas we've covered. And I don't think the tech industry itself pushes back hard enough on many of these issues to really make its case. To me, they've often adopted sort of a passive posture of saying, "Yeah, we get it, we'll do better." But they haven't aggressively taken some of these things on, so the narratives become entrenched, and they're hard to get rid of once they reach that stage.

Rob Atkinson: Absolutely.

Jackie Whisman: How do you see this playing out going forward?

David Moschella: Well, that's the big question for America. If you look at the two parts of what I've been talking about, there's all the stuff with the government, the banks, the media, the science, none of which is really directly about tech, and how are they going to do? Are we going to get better? Are we going to get worse? That's not necessarily a question for me to answer, but all I can say is that so far the needle hasn't gone the other way. It still seems to me to be getting worse. So it's not a pretty picture there.

Within the tech sector, I think it's mixed, and I think it's a mistake to lump it all together. It's still true that with Apple, Microsoft, and Amazon in particular, from a consumer point of view, people have a fair amount of trust in using their products and services because they work so well. There's an underlying foundation of people relying on these things.

The challenges from a trust point of view are almost entirely in the information space: the Facebooks and Googles, Twitter, and now TikTok. And there, I think the tech industry has to do a better job. I think the Twitter Files were very damaging because they showed what people had suspected, that what you see on there is manipulated in ways that aren't transparent to the audience. And I think those companies need to do a better job of regaining that trust.

So that's the playing field as it is today. But the real question, and I think the bigger one, is what's going to happen in these forward-looking areas, because in the end, all the stuff Apple and the big giants do today is really about the past. How are these new industries going to play out? That is inherently unknowable. But I think it's fair to say that unless America, at the public, industry, and many other levels, changes the perception of these technologies and makes people see the pluses more than the downsides, you're just not going to see them take hold here. So I think we're in a period of risk. It's certainly not too late, but it's the sort of alarm that needs to go off, and I think the industry has a lot of work to do.

China is looking at this and saying, "Hey, they can debate these issues, but we're moving ahead in these areas. We're going to do it in our own way." We may not like the way they do it, but they will pursue these technologies, they'll pursue the capabilities, and those capabilities will be used in other countries. And if we don't pursue them in the same way, we will be left out. So I think there's a period of risk, but I don't want to be too pessimistic. There's certainly plenty of time and plenty of capability.

Rob Atkinson: And it's not just a question of risk to the overall US technology leadership ecosystem. It's a question of which system is going to be the dominant one in the world. As much as the privacy advocates and others among the tech haters would like to say, "Well, we don't want either one of them," first of all, that's wrong. And secondly, we have to make sure that the US system is the one that's adopted around the world. I don't want the Chinese system to be the one that everybody else uses, because it's not going to incorporate the right kinds of protections.

David Moschella: Yeah. I mean, look at the TikTok thing, and there are a lot of issues there. But if only America bans TikTok and the rest of the planet keeps using it, that's not a great win for us. We need to engage, and the threat from China in almost all the areas I mentioned is very real.

Rob Atkinson: Absolutely. So David, thank you. This was great as usual. I learned a lot and we really appreciate it.

David Moschella: Oh, pleasure to do it. Thanks to both of you.

Jackie Whisman: And that's it for this week. If you liked it, please be sure to rate us and subscribe. Feel free to email show ideas or questions to podcast@itif.org. You can find the show notes and sign up for our weekly email newsletter on our website, itif.org. And follow us on Twitter, Facebook, and LinkedIn @ITIFdc.

Rob Atkinson: And we have more episodes of great guests lined up. We hope you'll continue to tune in.