Seriously, Use Encrypted Messaging

This week, we talk about recent developments with messaging apps Telegram and Signal, and discuss the future of encrypted messaging, surveillance capitalism, and nonprofit business models in tech.

Encrypted messaging is a godsend for mobile communications, whether you’re sending texts to your friends that you want kept private, or engaging in interactions that are better kept secret for safety reasons. Apps like Signal and Telegram offer users the ability to trade messages that can be read by only the sender and the receiver. Of course, people can also use that privacy as a way to conduct unsavory dealings without having to worry about their communications getting exposed.
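The core idea behind end-to-end encryption can be sketched with a toy key exchange: the two endpoints derive a shared secret that the server relaying their messages never learns. This is a deliberately insecure, illustrative sketch using textbook-sized Diffie-Hellman numbers; real messengers use the Signal Protocol (X25519 key agreement plus the Double Ratchet), not anything like this.

```python
import hashlib

# Toy Diffie-Hellman key agreement: a deliberately insecure sketch of
# the idea behind end-to-end encryption. The only values that travel
# through the relay server are the public ones; the shared key never does.
p, g = 23, 5                      # textbook-sized parameters (insecure!)
alice_secret, bob_secret = 6, 15  # each party's private value, never sent

alice_public = pow(g, alice_secret, p)  # visible to the relay server
bob_public = pow(g, bob_secret, p)      # visible to the relay server

# Each endpoint combines its own secret with the other's public value.
alice_key = pow(bob_public, alice_secret, p)
bob_key = pow(alice_public, bob_secret, p)
assert alice_key == bob_key  # same key on both ends, never transmitted

# Derive a message key from the shared secret (again, illustrative only).
message_key = hashlib.sha256(str(alice_key).encode()).hexdigest()
```

Real end-to-end messengers layer authenticated encryption and continual key rotation on top of this key-agreement idea, which is why even the company running the server cannot read the traffic.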

Encrypted messaging has been in the news for the past couple of weeks, largely because of the arrest of Telegram CEO Pavel Durov, who is accused by the French government of failing to comply with law enforcement demands to help catch people who are using the app for criminal activity. Durov’s arrest also shines a light on the rising profile of Signal, a fully encrypted messaging app that’s always taken a stance against the collection of its users’ data.

This week on Gadget Lab, WIRED security writer Andy Greenberg joins us to talk about how encrypted messaging works, what can go wrong, and how, while Telegram and Signal may seem similar, they operate in very different ways, differences that might affect how liable they are for what users share on their platforms.

Show Notes

Read Andy’s interview with Signal president Meredith Whittaker. Read Lily Hay Newman and Morgan Meaker’s reporting on the arrest of Telegram’s founder and the broader criminal investigations. Follow all of WIRED’s coverage of Signal and Telegram.

Recommendations

Andy recommends the memoir My Glorious Defeats: Hacktivist, Narcissist, Anonymous, by Barrett Brown. Mike recommends taking a ride in a Waymo, just to get an idea of the future of driverless cars that is coming. Lauren recommends The Ringer’s story about a new baseball team, the Oakland Ballers.

Andy Greenberg can be found on social media @agreenberg.bsky.social. Lauren Goode is @LaurenGoode. Michael Calore is @snackfight@heads.social. Bling the main hotline at @GadgetLab. The show is produced by Boone Ashworth (@booneashworth). Our theme music is by Solar Keys.

How to Listen

You can always listen to this week's podcast through the audio player on this page, but if you want to subscribe for free to get every episode, here's how:

If you're on an iPhone or iPad, open the app called Podcasts, or just tap this link. You can also download an app like Overcast or Pocket Casts, and search for Gadget Lab. If you use Android, you can find us in the Google Podcasts app just by tapping here. We’re on Spotify too. And in case you really need it, here's the RSS feed.

Transcript

Note: This is an automated transcript, which may contain errors.

Lauren Goode: Mike.

Michael Calore: Lauren.

Lauren Goode: How essential is Signal to you?

Michael Calore: The messaging app?

Lauren Goode: Yeah, the messaging app.

Michael Calore: Completely essential.

Lauren Goode: Why is that?

Michael Calore: I use it all the time. I mean, not just because I'm a journalist and I value my privacy, but also I'm on Android. So a lot of times, when I message people from my default Android messaging app, I'm not sure what the encryption level is, and Signal brings a sense of sureness to that. So I pretty much use it for encryption, but also it's just better for group chats, because group chats between Android and iOS are still broken after all these years, and Signal fixes that for me.

Lauren Goode: I thought you were going to say that Signal's essential to you because that's where I message you.

Michael Calore: Yes. That also. Also, yes, it's how we hear from Lauren.

Lauren Goode: What about Telegram?

Michael Calore: Telegram? No. I don't know anybody who's on it, and I basically just ignore it anyway, because it's not as strongly encrypted. Do you use Telegram?

Lauren Goode: I did once while I was reporting out a story. And the moment I joined it, a friend got one of those alerts like, “Lauren joined Telegram,” and messaged me, “Welcome to the dark side.”

Michael Calore: Yeah, it can be pretty dark. There's also been a lot of news about Telegram over the last couple of weeks.

Lauren Goode: There has been, and we should talk about it.

Michael Calore: Let's do it.

Lauren Goode: Let's do it.

[Gadget Lab intro theme music plays]

Lauren Goode: Hi, everyone. Welcome to Gadget Lab. I am Lauren Goode. I'm a senior writer at WIRED.

Michael Calore: And I am Michael Calore. I am WIRED's director of consumer tech and culture.

Lauren Goode: And we're also joined this week by WIRED senior writer Andy Greenberg. Andy, welcome back to the show. It's been a while.

Andy Greenberg: Glad to talk to you all again.

Lauren Goode: We're usually talking to you about hackers and spy capers and exciting things like that. Today, we're bringing you on to talk about messaging apps. But there's a lot of drama around these particular apps right now. As some of our listeners may be aware by now, Telegram CEO Pavel Durov was recently arrested on charges that a lack of content moderation on his app has led to people using it for criminal activity. This opens up a whole can of worms about whether, quote-unquote, encrypted messaging and social apps should be regulated or considered responsible for the content that's posted on them. So, Andy, let's talk about this. We're going to get into Signal later, because you had this big exclusive interview with the Signal president recently, but let's first unwind what is going on with Telegram. Why was Pavel Durov arrested?

Andy Greenberg: Well, I think the popular idea of why he was arrested is because Telegram is so encrypted, so secure, that it flouts law enforcement, and they got tired of that and arrested him. I would say, in fact, it's the opposite of that. Telegram is not encrypted enough, and it's clear to everybody who uses Telegram, in fact, that it's actually just a platform for all sorts of criminal materials to be shared fully in public. Anybody can go into Telegram at any time and not through some secret encrypted channel, but fully in public unencrypted, find all kinds of criminal activity. What makes Telegram special is that it just doesn't comply with law enforcement's requests for information about who was sharing that stuff or any requests to take it down. Telegram is basically, as you said, this dark-ish social network sharing criminal stuff fully in public.

So anybody who watched this I think has thought it's just a matter of time until this collides with law enforcement. I certainly did not expect that Pavel Durov, the founder and CEO of Telegram, would be arrested in France, but that's what's happened. And he's been charged with basically every criminal activity that is being carried out on Telegram … complicity in it, rather, which includes drug dealing and child sexual abuse materials sharing. There's certainly part of this as well that I think has to do with the Russian state's use of Telegram in the war in Ukraine, and then there are also some technical charges against him. He didn't get the right certification for using cryptography in France where he's been arrested. But I think that that's a sideshow. What really has happened here is that he's provided a platform that criminals are using, and he has not complied with law enforcement requests to help in the investigations of those criminal acts.

Michael Calore: So typically, I know things are probably different in France than they are in the United States, and they're different in every territory, but typically when law enforcement goes to a platform and they say, “We want the goods on this list of users who we can see doing crimes on your platform,” what's the liability there? What does the platform typically have to provide law enforcement?

Andy Greenberg: Well, typically, they just have to comply. I mean, this is the law: If law enforcement asks for information in your jurisdiction, where you are a legal entity, you have to hand it over. And Google, Facebook, Twitter, all of these companies have to hand over stuff to law enforcement all the time. Sometimes they fight it, sometimes they go to court on behalf of their users, if they even have an opportunity to fight those requests in court. But generally, this is just the routine: These companies hand over all sorts of cloud data or online data to law enforcement all the time as a part of their investigations. Telegram is based in the UAE. I think that they perhaps thought they could, through jurisdictional evasion, get away with not complying, but Pavel Durov is even a French citizen. So when he flew to France, he took his life in his hands by having flouted these laws. The way that communications platforms actually get away with not complying with those laws usually is just to say, "Oh, well, we don't have what you're asking for."

Lauren Goode: Right. That was the case of Apple and the San Bernardino shooting several years ago, basically saying, “Our messaging is so encrypted, we don't even have access to this.”

Andy Greenberg: Exactly. Like, “Sorry, we don't even know what that data is. It's encrypted. It's end-to-end encrypted,” most importantly. That means it's encrypted from one end user to the other with the company in between not even being able to decrypt the information and look at it themselves or share it with law enforcement. That is part of why end-to-end encryption is so powerful. It's why Mark Zuckerberg has, at some point, said that he wants to make Facebook—all of Meta stuff, in fact, Instagram, WhatsApp—fully end-to-end encrypted, because then he doesn't have to worry about moderation. He doesn't have to worry about these data requests. Of course, it sounds good for privacy as well, but it also takes a huge legal compliance load off of his company too.

So that is part of the appeal of not just encryption, but end-to-end encryption. But Telegram, despite its weird reputation as an encrypted messenger, is not. It doesn't provide end-to-end encryption to, I would say, almost any of its users. If you open up Telegram and start chatting with someone, you are not end-to-end encrypting your communications. You're encrypting them on the way to Telegram, and then from Telegram to that person. But Telegram can fully access everything that you've sent, and that means that they could hand it over to law enforcement too, and governments know that.

Michael Calore: So let's quickly talk about what makes Telegram's use of encryption different from the way that fully encrypted platforms like Signal or WhatsApp operate.

Andy Greenberg: Right. So Signal is an end-to-end encrypted messenger. WhatsApp actually uses Signal's encryption. It's called the Signal Protocol. WhatsApp added that in 2016 and fully end-to-end encrypts all of its communications too, by default. That's the important thing here. You open the app and you are automatically end-to-end encrypting all of your communications with everybody you talk to. In Telegram, you can go into settings and turn on end-to-end encryption. I think very, very few people even know that you have to do that to get end-to-end encryption in Telegram. And if you don't do that, then you are using just what people call transit encryption. It's in transit to the Telegram server, and then from the server to your destination. But Telegram can read everything in between.

But also, when you use Telegram to talk to a whole group of people, there is no end-to-end encryption. WhatsApp offers group end-to-end encryption. Signal offers group-based end-to-end encryption. Telegram does not. That's a really basic missing feature. And so it's actually very hard to use Telegram in an end-to-end encrypted way. And really, what I see people using Telegram for, what makes it different, in fact, from Signal and WhatsApp in another respect is that it's not really meant to be a private messenger in that way. People create Telegram channels that anybody can join publicly. They're almost more like a Twitter account or something. In my experience as a reporter covering hackers, I'll join a hacker group's Telegram channel to see their broadcast of stuff, and they'll say like, “Here's who we hacked today. Here's all their data,” and post it all on Telegram. You don't have to get invited. You don't have to have some secret encryption key. You just join it like a Twitter feed, and there it is.

And part of what makes Telegram good for those hackers, those very often fully criminal hackers, is that Telegram just doesn't moderate or care about any crime happening on its channels, and they don't comply with law enforcement requests for information about it or details on the users either. So Telegram, in some ways, is not even comparable to one of these encrypted messengers, although that's what people always seem to think it is. It's more like—

Lauren Goode: Yeah, they're awfully conflated. Yeah.

Andy Greenberg: Yeah, it's more like just a total scofflaw Twitter more than anything.

Lauren Goode: So, Mike, that means all of your group chats on Telegram about all the shitcoins that you're boosting, those aren't encrypted, just an FYI.

Michael Calore: I mean, I got to pump them somewhere.

Lauren Goode: That's right. Andy, we do have to go to break, but I wanted to ask you a question. Based on something you said about how you are hanging out in these Telegram channels, because you want to get a sense of what the hacker communities are up to, is there any privacy danger to a user on Telegram who just lurks?

Andy Greenberg: That's an interesting idea. Anybody at Telegram for sure can see that I am interested in these hackers and I may be a fan of them. I'm not exactly, I'm a reporter, but other people might be. And that's sometimes sensitive information. And I think it's important to point out that it's not just law enforcement asking Telegram for information that users should be worried about. It's that law enforcement can get that data from Telegram. They can hack into Telegram servers. They can turn one of their employees into an informant. They can grab the data in transit and decrypt it if it's not properly encrypted. So all of those things are real privacy threats to Telegram users, and that includes just the fact that you might be in one of these channels that could be actually not as private as you think.

Lauren Goode: All right. We're going to take a quick break, and then we're going to come back with a chat about another messaging app, Signal, which might be a more private alternative.

[Break]

Lauren Goode: All right, we should talk about Signal. This is an app that we are all familiar with here because, as journalists, we use it a lot. We appreciate the truly end-to-end encrypted messaging. We're having lots of private conversations with sources, and of course, we like the idea that our data is not being hoovered up to show us ads or something based on what we're chatting about. But despite the fact that Signal is popular among journalists and dissidents and other activists around the world, and the fact that it established a protocol that is being used by companies like Meta to power WhatsApp, it's still a relatively small portion of the population that is using Signal.

Andy, you did a big interview recently with Signal president Meredith Whittaker, a former Googler who was motivated by some of the activities she saw at Google to leave and get involved with privacy-focused organizations. Based on your conversation with Meredith Whittaker, how is Signal's approach different from Telegram's overall, and what does the Telegram saga … actually, I'm going to use this word, signal for Whittaker's plans for growth of the app?

Andy Greenberg: Well, I think the simplest way to describe the difference between Signal and Telegram is that Signal … and I don't mean to sound biased here, but Signal just does everything right. They truly do end-to-end encrypt your communications, your calls and texts. It's a calling app as well. And they don't collect any metadata, and that's really crucial. I mean, that's not an encryption feature. That's a total lack of data collection that's just a choice by Signal. We have seen evidence that when law enforcement comes to Signal to ask for … Metadata, of course, is the information about who you are, and who you're talking to, and the social graph of your connections rather than the content of information. They just don't collect any of that graph, that metadata, those linkages. And of course, they cannot hand any content over because that's all fully end-to-end encrypted so that even Signal itself cannot read it.

And Signal is open source. So it's been audited for years and years. In fact, it's so reputable, so well regarded that WhatsApp has integrated the Signal Protocol as the way that it end-to-end encrypts all of the messages and calls on WhatsApp. The Signal Protocol is also even used in Facebook Secret Conversations as they're called, the end-to-end encryption feature you can turn on in Facebook Messenger that nobody ever probably even knows about.

Lauren Goode: I didn't know that existed. Is that actually called Secret Conversations?

Andy Greenberg: Yeah.

Lauren Goode: Wow, I didn't know that. Does that work for Marketplace too? If I'm buying a couch from someone, it's like, "Oh, can I keep it secret?"

Andy Greenberg: Once you're in Messenger, you can turn on Secret Conversations for sure, and you'll probably freak out whoever you do that with because they've never heard of it. But then also, even Google, at one point, integrated Signal, the protocol, into its Duo messenger. I don't think anybody remembers that thing. But that's just the degree to which this has become the gold standard for end-to-end encryption. All the Silicon Valley giants have recognized this thing is pretty bulletproof. It's been pored over and tested for many years. So Signal is just very widely trusted in that way. But then finally, the thing that makes Signal truly unique, not just different from Telegram, but different from really any communications platform in the tech world, is that it's a nonprofit, that it offers its services, its tools for free, and there is no profit motive whatsoever. There's no ad tracking, there's no data collection, there's no attempt to make money from users except through donations, if you want to make one.

Lauren Goode: And we're going to get to that because I have questions about whether or not they're going to eventually pull an OpenAI with regards to that. But I did want to ask you what the Telegram saga means for Signal's growth, because when you spoke to Meredith Whittaker, she was spending the summer in France, which sounds like a very luxurious thing to do, but there were business motives behind it: She's considering whether or not she may have to move Signal's headquarters to France at some point. What does this Telegram saga, and the idea that law enforcement is starting to get involved with essentially content moderation … I hate to keep using this word. What does that signal for Signal and its growth plans?

Andy Greenberg: Well, right. Meredith was hinting to me that perhaps the United States, we're in a pretty tumultuous political situation in the US. Who knows what's going to happen after November. Maybe there'll be, at some point, a government in the US that is not friendly to an end-to-end encrypted tech nonprofit, and she's looking at possible escape routes. I mean, it's a crazy thing to think about, but if encryption were to be banned in the United States or something like that, or a backdoor was mandated or something that Signal could not abide, they need a new home. And I think she's thinking about the EU, and perhaps France in particular, as a place that might be friendlier to be a new Signal home base.

And then in the midst of that, I mean, the wild thing is that as she was thinking about this, Pavel Durov was arrested ostensibly for running an encrypted messenger, although I think we all just talked about why that's not exactly the right story here. And a lot of people saw that and thought, "Wow, the crackdown on encryption in the EU has just begun and Signal is next, and WhatsApp is going to be next." And to be honest, I personally saw it very differently. I don't actually believe that the arrest of Pavel Durov is about encryption. It's about compliance with law enforcement and demands for fully unencrypted data to get handed over. Signal is not actually vulnerable to that problem as far as I can tell because nobody knows what is happening on Signal. Signal itself doesn't know what criminal activity might be happening there in addition to all the good journalism and activism and political dissidents, and people who use it to survive and evade surveillance. They don't know who might be selling drugs on Signal.

So if you go to Signal and you demand … I mean, you wouldn't even know where to start as an investigator. You don't have anything to latch onto in the first place. Signal cannot be held responsible for end-to-end encrypted communications, which is everything that happens on Signal. Governments even demand metadata from Signal and they say, “Oh, sorry, we don't have any.” That's better than WhatsApp even. WhatsApp actually does collect metadata and hand it over to governments. So Signal is actually invulnerable to the kinds of charges that Pavel Durov is facing.

Michael Calore: That actually gets to something that I noticed in the interview that you did with Meredith Whittaker. There's a phrase that she uses a lot and it becomes the topic of conversation. It's called surveillance capitalism. Can you break this down for us? She's not here, so it's up to you to tell the listeners what surveillance capitalism is all about.

Andy Greenberg: Well, surveillance capitalism is an almost academic phrase, but I think it's an important concept that basically is just capturing the ways that surveillance and profit motive have become intertwined in modern society. I mean, almost everything we use in the world of tech is powered by the profit motive of data collection. Google, Meta, Instagram, all of it. I mean, I guess that there are some things like Apple, which we just pay for with our own money, but I think those are almost an exception now.

Silicon Valley, the tech world, runs on surveillance, which is just a wild dystopian idea, and Meredith Whittaker is really determined to make Signal an exception. And she believes that the only way to do that is just to not have any profit, to be a nonprofit. And there is really, I would say, nothing else quite like Signal in the modern world. No app with hundreds of millions of users, which Signal has. I mean, of course, WhatsApp has billions, but there's nothing of that scale, a technology communications app platform, that infrastructure of our communications that is in this way, immune from the profit motive to collect data. And that really is Meredith's point. And I think it's really important and something that she wants to protect.

Michael Calore: I think it's important to note that she is determined to change that paradigm. Right? She's promoting the nonprofit, non-surveillance-capitalism model as something that is still viable. And something that she says to you at the end of the interview is that these alternative business models, the nonprofit business model, the one that does not rely on surveillance, has not succeeded yet because it has not been given the support. It has not been given the capital, it has not been given any of the same investment, and those alternative models are existing in an economy that forces them to, quote, swim upstream against the tide. So with the right support, we can break that paradigm. And I'm a very idealistic person, but even to me, this feels like a stretch. You have to have a lot of faith in humans giving up something that they have been getting for free that gives them a lot of value just to break that paradigm. People love Gmail because it's free and it comes with docs, and it comes with communications, and it comes with a calendar. They get all this stuff for free in exchange for their privacy.

Lauren Goode: Feeding the data machine.

Michael Calore: Yeah. People have been giving up their privacy for features since the dawn of the internet, and you're asking them to stop getting those free features in exchange for privacy. And it feels like, at this point, maybe the world is so broken that people will not be willing to take that leap to the alternative.

Lauren Goode: Right. Well, we should note too that Signal is free to use. It's donation-based. I've made donations, small donations to Signal before. I'm not writing venture capitalist checks by any means. But I think to your point, Mike, it's that Signal, even though it is a free app to use and it's really not hard for people to download and use, it's not getting that foundational support. It doesn't have even the marketing engine necessarily of one of the big tech companies.

Michael Calore: Right. And things like it. There are productivity tools that you can use that are privacy-focused. I think Proton offers note-taking, and mail, and calendars, and things like that, and they're an organization that is also nonprofit and interested in privacy. But I feel like Gmail has eaten the world. It's taken over everybody. Everybody's on Gmail. And to tell everybody like, “OK, you know what? It's time to transition away from that. It's time to transition to something that is not based on surveillance, something that is more interested in protecting your privacy.” People are like, "Yeah, but I love Gmail."

Lauren Goode: Right. Or they're using the default SMS messaging, which now, in many instances, is getting an upgrade to RCS, which is a more private standard, but just basic SMS from Android phone to Android phone.

Michael Calore: Yeah.

Lauren Goode: Not encrypted.

Michael Calore: So—

Lauren Goode: Wireless networks know what you're doing.

Michael Calore: ... big tech companies have built this empire that feels impenetrable at this point, is what I'm saying.

Andy Greenberg: Yeah, right. I mean, I asked Meredith this cynical, skeptical question also. It does seem like big tech is winning. The for-profit, the surveillance capitalism model is winning. And she would argue that … Well, she did argue, in fact, that, no, it's not that they've persuaded users to like what they offer more, it's that they're locked in. That this was … I don't know. That this has been a trap set for users since the beginning of the modern technological era that we now no longer have a choice. You can't escape Gmail. You can't escape using, I don't know, Instagram or WhatsApp, which still collects metadata. These things have network effects and they're hard to escape.

And I agree I guess with you, Mike and Lauren, and Meredith, that, yes, it's really hard to escape, and that's a huge problem. I don't have a lot of hope myself for more Signal-like things in the world, these pure ideological nonprofits that win over hundreds of millions of users, but I think Signal is an example that it's possible. It is, in fact, the only example I can think of that has wooed people away from competitors. I don't think people will ever switch out of some idealism, but perhaps they will for the selfish motivation of privacy. And Signal is proof that that's possible. If you offer people something that is truly not possible under surveillance capitalism in a new model, then they do switch to it and hundreds of millions of people have downloaded Signal. That's a small number, but it's this beacon of hope.

And Meredith's point is, I asked her like, “Does that mean that your ambition is for Signal to swallow the world, to take over these other markets, add docs, and drive, and all these other features?” And she says no, but that she hopes it can be a model for other people to do that, for other projects to offer people an alternative, and points actually to Proton, as you did, Mike. Proton, which started out with Proton Mail, but is now offering other kinds of end-to-end encrypted and private services, is transitioning to a nonprofit, which is super interesting to see and maybe a little sign that Meredith is right.

Lauren Goode: I actually loved her explanation for why Signal remains a nonprofit. She said, “Signal is a nonprofit because a for-profit structure leads to a scenario where one of my board members goes to Davos, talks to some guy, comes back excitedly telling me we need an AI strategy for profit.” It's pretty spot on. But can Signal really remain a nonprofit forever? If it swelled to the size of WhatsApp, would it have to pull an OpenAI? This is in the weeds here, but folks know that OpenAI started with all these ideals, and then it became a for-profit structure. Right? Would Signal end up being like OpenAI in terms of its structure?

Andy Greenberg: I definitely worry for the future of Signal when I'm sending videos and pictures on Signal, which is now my default communications app for so many things that are not at all sensitive or secret or whatever, I worry about all the money I'm costing Meredith and all of her colleagues. Signal's budget is approaching $50 million a year already. That's a lot for a nonprofit. It's chump change for a Silicon Valley giant, but it's hard to find that money. But I think that part of Meredith's goal is to find funding for Signal, and a lot of it I think has to come from Signal's users. I think one of her biggest projects is to convince Signal users to start donating, and maybe this Mother Jones/NPR model of funding could be something that scales. If Signal has a billion users, but then some of them are actually giving money, are just paying on a monthly basis, then maybe that's a sustainable future.

Lauren Goode: Andy, thank you so much for this. Everyone should go and read Andy's big interview with Meredith Whittaker online on WIRED.com right now. Also, Morgan Meaker has been covering the Pavel Durov saga for us, so go read that story as well. Andy, stick around because we're going to come back and do our recommendations.

[Break]

Lauren Goode: All right, Andy, what is your recommendation this week?

Andy Greenberg: Well, I'm only halfway through this book, but I have been reading My Glorious Defeats, which is a memoir by Barrett Brown. I don't know if people remember Barrett. He spent some time in prison, which is part of the story of his book, but he was one of the … I think he would hate this description. But one of the faces of Anonymous, the hacker collective, and also a journalist and propagandist, I would say, and just a general weirdo and troublemaker, and a brilliant, hilarious writer. His columns for The Intercept actually won a National Magazine Award, which he wrote from prison, often with pencil and paper. That's a big part of the book.

But this is really his whole autobiography. The subtitle is Hacktivist, Narcissist, Anonymous: A Memoir. And it is just hilarious. And also just a wonderful chronicle, a decade later, of the whole bizarre saga of Anonymous, but told from the inside with the most just ridiculous, and funny, and really combative and angry tone. He was also a very serious drug user. It is this Hunter S. Thompson sort of book, but funnier, I would say. It's a really incredible read.

Lauren Goode: And what's that called again?

Andy Greenberg: It's called My Glorious Defeats, by Barrett Brown.

Lauren Goode: My Glorious Defeats. OK. That sounds great. Andy, that's also very on-brand for you.

Andy Greenberg: Well, yeah, I mean, it took him a while to come out with this book, but, yeah, it felt like required reading for me on my beat.

Michael Calore: Nice.

Lauren Goode: I tend to feel that if someone on the WIRED staff is eventually going to be exposed as a spy or double agent, it's Andy.

Andy Greenberg: Well, thanks, Lauren. I'm not sure how I feel about that, but I'm going to pretend to be offended so that it doesn't come off as true.

Lauren Goode: That's right. You have to keep up the charade. Yes, I understand. Mike, what's your recommendation?

Michael Calore: I'm going to recommend that you take a Waymo. If you live in a city where Waymo robotaxi service is offered, which I think is Phoenix and San Francisco, and maybe a handful of other places, you should try it. And I think you should try it because you should experience it now. Because this is the future, right? This is the future that we've all wanted, and it's a complicated future because robotaxis, obviously, they take away business from companies that employ people to drive us around. They take people off the roads and they put robots on the roads instead. I've ridden in a Waymo five or six times now. I have found it to be, just in the city of San Francisco, a very calm and relaxing experience. It's around the same price as taking another ride-hail service like an Uber or a Lyft, or a taxi cab, but it's just more chill. You forget that you're in a car because there's a robot driving and you just don't have to pay attention to anything. There's nothing really to worry about.

It feels remarkably safe, safer than being in a car with a human pilot, no matter how well that human pilot drives. So the reason I'm recommending that you take it, even if … You may hate the idea of Waymo, and you may just reject the idea of robotaxis as bad, but you should take one anyway just so you understand what the experience is like. Because the companies that are putting these things on the road are doing so in order to take over the business of driving people around, right? Humans driving people around. They want that business to go away. They're trying to disrupt it. And they have all of this messaging around how much safer it is, how much more reliable it is, how much better it's going to be for society once we get humans away from the wheel. The entire industry of robotaxis and moving people is shifting toward autonomous vehicles. So you should experience it now while it's in its infancy, just so you can see how … If you're interested in it, you can see how it grows.

You might feel better about this if we have universal basic income, if we're taking jobs away from people, if we also provide for them in some other way. You may feel better about this if you're given the opportunity to experience it for free. These are all possible futures. There's many, many different ways that we could go from here. But we're at a starting point right now where these things are creeping into society, and I think it's important to understand how you feel about it earlier than later. Because if we need to guide via regulation or via voting with our dollars, if we need to guide this somewhere, then we should know what we're dealing with.

Lauren Goode: The labor equation is really interesting.

Michael Calore: Right.

Lauren Goode: This past weekend, I took an Uber, and the driver and I compared notes. He said, typically, he … How did he say it? He said, oftentimes the riders will not engage with him in this way, but of course, being a journalist, I was asking him, I said, “How much …” My fare was listed as $45.99, not including tip, I ended up tipping him 25 percent, but, “How much will you get of that?” And he said the app told him, “You would get about $18.”

Michael Calore: Wow.

Lauren Goode: And that changes too. It depends on the time of day, supply and demand, that kind of thing. But I was thinking about that, and I was thinking, so this driver drives all day and he's getting a fraction of what I'm paying him, and Uber is getting the rest, and he, meanwhile, has to pay for his car maintenance, gas, if it's a gas-fueled vehicle, all of that. I was like, “How is this sustainable?”

Michael Calore: It's not.

Lauren Goode: And I don't know if self-driving cars of the future … By the way, I also love Waymo. And we have taken Waymos together. Love them. But you're right in that we have to approach this differently than we approached Uber. When Uber came to market in 2009, 2010, they were pitching, “Oh, we solve driver downtime. There's all these gaps in between drivers' rides. And, look, if they can just constantly be driving, this is going to be better for them. It's going to disrupt the taxi cab lobby. We're doing good things.” Right? And now we see how that's shaken out. So you are absolutely correct, and we need to think critically about where this is going with robotaxis infiltrating our cities.

Michael Calore: So ride in one is what I'm saying. That's my recommendation. If you get the chance, take it, regardless of how you feel about it, just so you can experience it and understand what the appeal is and what the problems are.

Lauren Goode: And Mike is offering his home to you. You can come visit and crash on his couch in San Francisco if you need to. If you want to take a Waymo.

Michael Calore: I have a very uncomfortable love seat that you can sleep in sitting up if you'd like.

Lauren Goode: You have to take care of the cat too, because Mike's a cat lady. Yes.

Michael Calore: OK. Lauren, what is your recommendation?

Lauren Goode: So am I. My recommendation is an article from The Ringer. It's a story about the founding of the Oakland Ballers. I have not yet seen an Oakland Ballers game, but I really want to go. And this is just an amazing story about how these two entrepreneurs basically came together and did the impossible: in nine months, they started a small professional baseball team in Oakland. And it goes into the history of other people having done this in different markets before and the various challenges that they faced, and then even after. And the vibes, how much this is being embraced by the city of Oakland, but also some of the stumbles that the founders had. For example, they said that they would provide housing for the players and, very quickly, that housing was deemed very inadequate. They had to make adjustments along the way. They also solicited funding from the public. People could buy a share and become a, quote-unquote, baseball team owner, which is pretty fun. I missed that opportunity, but that'd be fun to put on your LinkedIn. Right?

Michael Calore: Yeah, sure.

Lauren Goode: You can slash ... Yes.

Michael Calore: Co-owner of the Oakland Ballers.

Lauren Goode: So check out this article in The Ringer. We'll link to it in the show notes and just feel inspired.

Michael Calore: My favorite thing about the Ballers is their logo and their hat. It's a B. So the A's, the Oakland A's, who are now soon to be known as the Las Vegas A's of Sacramento—

Lauren Goode: Correct. Right. Rolls off the tongue.

Michael Calore: So Oakland just goes from the A's to the B's.

Lauren Goode: Yeah. The Ballers name was actually inspired by a friend of theirs who died young. He died when he was 22 of a heart condition. But when he and one of the founders used to … I think they were playing pickup basketball together when they were younger, he would refer to himself as a baller. And so they did it in honor of their friend, Bobby.

Michael Calore: Nice.

Lauren Goode: Ballers. Yeah. Great story. Check it out. We'll link to it. Andy, thank you again for joining us. This has been great.

Andy Greenberg: Thank you all. Always fun.

Lauren Goode: And thanks to all of you for listening. If you have feedback, you can find all of us on Twitter where we link to our Signal handles in our Twitter bios. So you can find us there. That's another thing that Signal did. They gave us usernames instead of just phone numbers, which is … Good job, Meredith. Just check the show notes. We'll include all that information. Our producer is the excellent Boone Ashworth. Goodbye for now, and we'll be back next week.

[Gadget Lab outro theme music plays]