When journalist Nicholas Thompson became editor in chief of Wired this past January, it marked his return to the technology magazine, where he’d worked as a senior editor from 2005 to 2010. In between, he was the editor of The New Yorker’s website.
Thompson is also a frequent speaker, not only at conferences and universities but as a contributing technology and trends editor for CBS and CNN International. He likes talking to “hypertechnical” audiences, but also to “part-interested people who are engaged in how society is changing,” Thompson told Convene. “I fundamentally believe that digital technology is changing us, and redefining what humans are.”
At Convening Leaders 2018, Thompson will present a Thought Leader talk called “The Optimistic Technologist: Keeping the Digital Revolution Human Centric.”
What does it mean to be an “optimistic technologist”?
The most important thing is to understand and harness all of the things that are happening that make [technology] work for the good of society — like figuring out how actually our phones could make our lives better, or figuring out how Facebook and Google can improve democracy. Or how artificial intelligence can create new kinds of jobs without creating turmoil in the job market. There are choices that we can make now to get better. If we make the wrong choices, [things] won’t get better as quickly.
The revelations about the sources of ads on Facebook and Twitter during last year’s U.S. presidential election have everybody feeling a little off balance lately. Is that something that you’ve been thinking about?
The thing I’m working on right now the most is Facebook. I’m writing a big essay on Facebook and democracy, and that’s very much the question I’m dealing with the most every day. That is one of the reasons that Facebook has been so profoundly jarring, because it is so — “Wait, so Russian intelligence operatives are trying to infiltrate our conversations about race in America?”
How do we make good decisions about whom to trust?
That is really a Facebook problem, interestingly enough, because Facebook controls what we see. I would say two things. One, the media ecosystem has broken down. It started breaking down a while ago when it became very easy for others to publish. It used to be that it cost a lot of money to print newspapers… they all had to win our trust. If they didn’t win our trust, they weren’t in business.
Then it became very easy to write whatever you wanted on a website, and the relative advantage that newspapers and the publishing industry had — with their standards of hiring journalism-school [trained] people — that relative advantage kind of went away, and that broke things up a bit. Then Facebook came along and completely leveled the playing field.
Now anybody has the same shot at being heard. That’s why our news ecosystem was hijacked, not just by the Russians but by the Macedonians and other people peddling fake stories on all sides of the political spectrum. They just made it impossible to actually get real information during the campaign.
The question is, is that inevitable, or can it be solved? One hypothesis I have is that, yeah, it’s Facebook’s responsibility, and Facebook can actually solve it, and there are specific things Facebook can do that help us solve that problem and help us get rid of false information, and regain trust.
What role does design play in helping to increase trust and fact-based communication?
I think a lot of this goes back to social-media companies. It goes back to Facebook, it goes back to Twitter, and the way those products are designed. I’ve given very long talks about the social-media platforms and the way they’re designed and set up to addict you. I can explain the subtle ways they do it and the more obvious ways they do it — and that has a huge influence on the way we behave, right? Facebook’s design profoundly changes the way we use Facebook.
What role do you think face-to-face communication has in a world where technology is so dominant?
I had a conversation this summer with Tristan Harris [a former design ethicist at Google], who I think is one of the smartest design people analyzing technology today. He said what Facebook needs to do is whenever it sees a conversation getting heated or angry, it needs to insert a button that says, “Hey, meet face-to-face,” so you can actually talk.
When you’re behind a computer screen, you just act in a different way. When you’re behind a computer screen and you have an anonymous account, you act even worse. When you’re actually sitting with somebody, you can see the humanity, recognize that they’re like you, and then you’re much more likely to have a civil conversation or reach some kind of positive solution.