Why Signal won’t compromise on encryption, with president Meredith Whittaker

Signal messages are more private than iMessage and WhatsApp. Here’s how.

Meredith Whittaker is the president of Signal, the popular messaging app that offers encrypted communication. You might recognize Meredith’s name from a different context: in 2018, she was an AI researcher at Google and one of the organizers of the Google walkout, during which 20,000 employees protested the company’s handling of sexual misconduct. Meredith also protested the company’s work on military contracts before leaving in 2019.

Now she’s at Signal, which is a little different from the usual tech company: it’s operated by a nonprofit foundation and prides itself on collecting as little data as possible. For that reason, it’s popular with journalists, activists, and people who care about their privacy — Signal even popped up in the Elon Musk vs. Twitter trial because Musk was using it.

But messaging apps — especially encrypted messaging apps — are a complicated business. Governments around the world really dislike encrypted messaging and often push companies to put in backdoors for surveillance and law enforcement because, yeah, criminals use encrypted messaging for all sorts of deeply evil things. But there’s no half step to breaking encryption, so companies like Signal often find themselves in the difficult position of refusing to help governments. You might recall that Apple has often refused to help the government break into iPhones, for example. I wanted to know how that tradeoff plays out at Signal’s much smaller and more idealistic scale.

This is a good one, with lots of Decoder themes in the mix. Okay, Meredith Whittaker, president of Signal. Here we go.

Meredith Whittaker is the president of Signal. Welcome to Decoder.

Thank you. It is wonderful to be here.

There is quite a lot to talk about. The messaging market is pretty ferocious, and the encrypted messaging market comes with a lot of complications. Signal is an interesting company structured in an interesting way. One of your jobs as president is to hire a CEO, which is in itself a pretty fascinating Decoder question. Let’s start with the very basics. Explain what Signal is and where it fits into the messaging universe.

Absolutely. Signal is the most widely used, truly private messaging app on the market. It’s used by millions and millions of people globally, and for people who use Signal, it may feel similar to other messaging apps. You open it, you send a meme, you get party directions, and you close it when you’re done talking to your friends. 

But below the surface, Signal is very different. It is truly private. We go to great lengths not only to keep the contents of your messages and who you are talking to private, but to collect as little data as possible while providing a functional service. We differ from our competitors in that our mission is to provide a private app and in that we are not in any way connected to the surveillance business model. We have a very different model and a very different mission.

Signal is really interesting because it has this nonprofit foundation that sits over top of it. One of the reasons the surveillance business model exists is because it is an easy way to make a lot of money. Signal is obviously not doing that; there is this nonprofit instead. How is it structured? How does it all work?

The Signal Foundation is a nonprofit. Signal Messenger LLC sits under that nonprofit umbrella, and the foundation exists solely to support the messaging app. So in more colloquial terms, we can think of Signal as a nonprofit. This means we don’t have shareholders and we don’t have equity, so we are not structurally incentivized to prioritize profit and growth over our core mission. And you are not going to see a billion-dollar exit coming — we are not just biding our time until we can get rich and move to a superyacht. So it is a different structure, and a different model. 

That doesn’t mean it’s any cheaper to develop Signal than it is to develop a high-availability surveillance messaging service. We are counting on a sustainability model that relies on donations and our nonprofit structure, rather than secretly monetizing data in the background or participating in the surveillance business model, which is the dominant paradigm across the tech industry.

It is across the tech industry, but not so much in messaging specifically. I actually want to push on that a little bit. There are obviously messaging services that look at everything you send across their service and then aggressively try to monetize you based on what you are saying. I’m specifically thinking of dating apps, which really do read all of your messages to figure out when they should nudge you into going on a date. Every time I hear about that it just strikes me as completely bonkers, but that is their universe. Your head-to-head competitors, though, like iMessage and WhatsApp, are fully encrypted. Obviously, WhatsApp is owned by Facebook and there is a lot of controversy there. There is also a connection to Signal with Brian Acton, who was a co-founder of WhatsApp and is now on the Signal board. Those services are inherently encrypted. They are not reading your messages in the way that the surveillance business model is predicated on collecting a lot of data. What is the difference in your mind between the two things?

Well, let’s take WhatsApp as a specific example. Again, WhatsApp uses the Signal encryption protocol to provide encryption for its messages. That was absolutely a visionary choice that Brian and his team led back in the day — and big props to them for doing that. But you can’t just look at that and then stop at message protection. WhatsApp does not protect metadata the way that Signal does. Signal knows nothing about who you are. It doesn’t have your profile information, and it has introduced group encryption protections. We don’t know who you are talking to or who the members of a group are. We have gone above and beyond to minimize the collection of metadata. 

WhatsApp, on the other hand, collects information about your profile, your profile photo, who is talking to whom, and who is a group member. That is powerful metadata. It is particularly powerful — and this is where we have to back out into a structural argument — when the company collecting that data is owned by Meta/Facebook. Facebook has a huge amount, just unspeakable volumes, of intimate information about billions of people across the globe. 

It is not a trivial point that WhatsApp metadata could easily be joined with Facebook data, and that it could easily reveal extremely intimate information about people. The choice to remove or enhance the encryption protocols is still in the hands of Facebook. We have to look structurally at what that organization is, who actually has control over these decisions, and at some of these details that often do not get discussed when we talk about message encryption overall. 

Signal, again, is a nonprofit. We don’t have access to data the way Facebook does; we avoid having access to that data. We don’t buy, sell, or trade your data. It is a different paradigm. We can’t point to WhatsApp, however slick their marketing is, and say that it is truly secure and private. All of these details add up to the conclusion that it is not. Signal, on the other hand, exists solely for that purpose.
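
To make the metadata point concrete, here is a deliberately toy sketch of the idea behind "sealed sender"-style designs. This is not Signal's actual protocol or code, and the Fernet key standing in for per-device key material is purely illustrative: the point is that when the sender's identity travels inside the encrypted payload rather than on the envelope, a relay server only ever learns which account to hand an opaque blob to.

```python
# Toy illustration only; NOT Signal's real design or code.
# If the sender identity is part of the ciphertext, the relay only
# learns the recipient and an opaque blob, so it cannot build a
# who-talks-to-whom graph.
import json
from cryptography.fernet import Fernet  # stand-in for real end-to-end keys

# Pretend this key exists only on Alice's and Bob's devices.
bob_key = Fernet.generate_key()
bob_box = Fernet(bob_key)

def seal(sender_id: str, body: str) -> bytes:
    """Encrypt the sender identity together with the message body."""
    payload = json.dumps({"from": sender_id, "body": body}).encode()
    return bob_box.encrypt(payload)

# What actually crosses the wire to the relay:
envelope = {"to": "bob", "blob": seal("alice", "meet at 7?")}

# The server's entire view: a recipient and some opaque bytes.
print("server sees:", envelope["to"], f"{len(envelope['blob'])} bytes")

# Only Bob's device can open the blob and learn who wrote to him.
print("bob sees:", json.loads(bob_box.decrypt(envelope["blob"])))
```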

Let me ask you a hard question. You have a long history as a critic of Big Tech, and you obviously believe in those criticisms. I don’t believe that you will switch the business model while you are the president of Signal. Mark Zuckerberg has his own reputation. He can say things about privacy, and people can decide whether to believe Meta will actually do those things based on their evaluation of Mark Zuckerberg. How do you audit Signal as a consumer? How do I actually make sure that what you are saying is true?

Signal makes its code open-source. It makes the Signal protocol and the key cryptographic primitives that we use to ensure privacy and security open for review. A big part of our model is telling people not to take our word for it. People who have specialized training and skill have spent thousands and thousands of hours poring over our code. Every time a new piece of code drops on GitHub, there are people in the Signal community forums who look at it, comment on it, and deduce what features might be coming from it. There is an active and vigilant community that actually checks Signal’s claims against the code and against the cryptographic protocol we use, time and time again. 

Our cryptographic protocol is not just used by Signal. This is what other companies like Facebook have chosen to use because it is the best. We rely on that vigilant community, on transparency and the community auditing.

But that is the protocol, not the app.

That is the app.

The whole app? 

Yes, it’s open-source.

So if I want to fork Signal and make my own, I can just take the code and do it today?

People do it. There are many of those. We don’t endorse them because we can’t guarantee or validate them — we don’t have the time or the resources for that. But yes, there are many out there.
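
One concrete form the "don't take our word for it" idea can take is a reproducible-build-style spot check: compile the client yourself from the public source and compare the artifact against the binary you were shipped. The sketch below shows only the final hash comparison; the file names are placeholders rather than real Signal artifacts, and a real check also requires a deterministic build environment (and usually a diff step that ignores signing data).

```python
# Sketch of a reproducible-build-style spot check. File names are
# placeholders, not real Signal artifacts.
import hashlib
import sys

def sha256_of(path: str) -> str:
    """Hash a file in chunks so large binaries don't need to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

shipped = sha256_of("downloaded-app.apk")      # what the app store delivered
rebuilt = sha256_of("locally-built-app.apk")   # what you built from the public source

if shipped == rebuilt:
    print("match: the shipped binary corresponds to the published source")
else:
    print("mismatch: do not trust the shipped binary without investigating")
    sys.exit(1)
```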

That’s really cool. Let me ask you another question about the structure. It’s a nonprofit. You said you don’t have equity. Maybe not so much today as we speak, but in a different time, Facebook equity was really valuable. Facebook, Apple, Google, or whoever would pay a high base salary and then give engineers a ton of equity. If you work really hard, then the stock might go up and you might get rich. You don’t have that. Are you just paying people more in cash, or are you hoping that people take a discount because they believe in your values?

We do have competitive salaries. Signal as an organization also has labor politics. We want to make sure we are compensating people and that they aren’t being asked to sacrifice their lives or their standard of living to come work for Signal. We have competitive salaries because we want to hire the best people we can find. 

Equity may not be part of the package, but we are a fully distributed organization, so there is flexibility with which communities you might be able to live in. We also have other benefits that we think make it a great working environment for people who want to apply their talents to something outside of the surveillance tech ecosystem.

On a straight comp basis, do you match Big Tech or are you lower? Where are you?

I don’t have a spreadsheet in front of me and Big Tech is a big, amorphous entity. There are a lot of variables with Big Tech in Europe or Big Tech in Palo Alto. What I can say straight up is that we are competitive.

How many people are at Signal right now?

About 40 total. That’s the org.

How is that structured? Is that mostly engineers? Is it policy people? Is it the C-suite? How does that work?

It’s mostly engineers. It is not a very complex organization. I’m avoiding using the term “flat,” because it is not flat, but it doesn’t have many layers of bureaucracy. There is a leadership team. We have a COO, we have myself as president, we have a director of products, we have two engineering executives — one that looks more at architecture and one that looks more at people management — and then we have Brian Acton as acting CEO. I’m imagining the org chart in my head right now. Pardon me.

That is the whole show. That’s what we do here. By all means.

It is primarily developers, but of course, development isn’t just submitting a pull request. We also have what we call the “user voice team,” and those are the talented people who engage with the community, do a little bit of QA, and test for bugs. I am tasked in my new role with bringing policy awareness. We don’t have a policy team, but I am working on what the right calibration there is for Signal by bringing in my network and my many years of work on those topics.

Then we have what I would characterize as a “narrative team.” We have writers, and we have people who think about how we translate arcane concepts to people who rely on Signal in a way that they will actually understand. We do have folks like that, but it is primarily developers. What we do is produce a high-availability app across three platforms, which takes a lot of labor, constant vigilance, constantly squashing bugs, constantly thinking about new features, and making sure there is parity across all the versions. It is endless and difficult work, and I am happy to be working with the people who are doing it.

You have a new role at Signal as president, and I don’t think you were the president of a company before. President of a company is one of those roles that can be whatever you want it to be, as I understand it. How do you conceive of the role of president of Signal?

Well, I have core lanes in that role. To back up for a second, I have been on the board for a number of years and have worked with Moxie [Marlinspike, co-founder and previous CEO]. There is also an open-source community of folks who think hard about technical privacy preservation that I have been in and out of for almost a decade now. I am very familiar with the folks in this space and with the folks at Signal, so it was almost a gradual transition into this role — basically intensifying my attention to Signal until it became my whole work life. 

In this role, I am going to be focusing on the narrative aspects. How do we communicate what Signal is, and why it is so wonderful, to people who might want to use it? This is a particular environment where there is increasing understanding of the harms of the surveillance business model and increasing understanding of the monopoly power of Big Tech. But there are not many actions people can take when they say, “I feel really uncomfortable with this, but what do we do? It interpolates our entire life.”

I think it’s about getting the word out that Signal is truly different. By building that network effect of encrypted communications, anyone who picks up Signal can talk to anyone they want to without having to think about it and without having to be a privacy ideologue. My friends aren’t just cryptographers who live in Berlin. My friends are a number of people with varied interests. Some of them probably truly do not care about privacy, but nonetheless, they are cool people I like to hang out with. I need to be able to reach them on that app, and if they are not there, then Signal is significantly less useful to me. I am going to be working on that narrative aspect. 

I also mentioned policy awareness. That is just thinking about the global landscape, the regulatory and legislative landscape, and how that affects Signal. How do we think about that during a product development process? How do we think about that in terms of our high-level strategy? 

I will also be working with the leadership team to direct strategy. As an organization, we grew from what I would characterize as a passion-driven hypothesis project. This was an open-source project on a shoestring. Big thanks to Moxie Marlinspike, Tyler Reinhard, and a number of the original folks at Signal who made a bunch of sacrifices and worked extremely hard to get this effort off the ground.

Signal has matured in the last few years, and I would say it is at an inflection point as an organization. It’s time to take the next step. We have over 100 million downloads in the Play Store, but what will it look like when we reach the next stage and are serving hundreds of millions of people across the globe? How do we build a Signal that can really meet this moment? I think sustainability is definitely part of that. Is there a business model that has not been done yet that can sustain technology like Signal outside of the surveillance paradigm?

You talked about narrative and reaching people, and the network effect of everybody you know using Signal — you’re not even thinking about it, you don’t have to evangelize the service. Another way to characterize that is growth. Your job is to grow the product, and then the back end of it is what you just said, which is to figure out how to monetize that product against that growth and run a product at that scale in a way that is sustainable. Is growth the imperative here?

I have consciously avoided some of those terms because they are so closely aligned with profit motives, and I don’t want to be misunderstood. Yes, of course we want to grow, because our mission is to provide truly private communication to anyone who wants it, across the globe, at any time. We grow so that we can fulfill our mission. We are not looking at growth hacking or adding weird features to get an inflated boost. We are looking at how we can actually reach the people who need to use this tech and reach the people who want a convenient messaging service, and ensure that they are able to use Signal quickly and easily, that they know about it, that when they open it up all their friends are there, and that it is a seamless and pleasant experience. So yes, I think you could put it in those terms. However, I have intentionally not put it in those terms because I don’t even want to echo the language of the alternatives. They are doing something that may look the same on the surface, but is substantively very, very different.

Growth, regardless of the motive, comes with a pretty thick set of challenges once you get to that scale. I will want to come back to that because there is a lot to unpack, but before that, I have to ask you a classic Decoder question. You have been in a lot of different companies, a lot of different environments, and a lot of different roles, so you must have a pretty robust way of thinking about big decisions you have to make. How do you make decisions?

I don’t have a flow chart for decisions. I don’t have a VC Twitter thread like, “The three things I know about decision-making.”

That’s not what I meant. I hope that’s not what I implied. I just meant that you have made a lot of decisions with high stakes in your life. How do you think about it?

It is a combination of things. I do as much research as I need to until I’m satisfied I have the ground truth around something. I will ask very dumb questions until I am sure that there is not some trick or some sort of issue that I haven’t fully understood. That means I will read academic papers, I will call people who have worked in a certain sector, or I will reach out to a mentor who maybe doesn’t know much about the space but has a sense of the dynamics and might lend a different eye. I basically have as big a toolbox as I can and I will pull from any tool that feels useful. 

Then I think it will be some combination of instincts. When has this worked well in the past? When has it not? The benefit of having been in this industry for almost 20 years is that you just build up a big set of experiences. There are a lot of pitfalls you have fallen into before so you can avoid them.

I think it also needs to be accompanied with humility. “This is the decision we’re making, and here is the basis for this decision.” I am really committed to making sure everyone understands my basis. “This is why this decision was made. You don’t have to agree, but you can see the logic that led me to it.” 

How do we then measure that it is the right decision? What are the benchmarks we are looking at going forward? How do I remain willing to say, “Look, that was wrong. Let’s back up, because clearly it’s not going the direction we wanted it to go. Let’s recalibrate. Let’s look into our assumptions. Let’s do it over.” I think there is a combination of iterating and learning on the go, also while being sensitive to what our ultimate goals are and why this decision is being made. I bring everything I have to bear on that decision — my experience, my knowledge, and any research I need to do.

Let’s put that into practice. There is a big decision for Signal coming up. You mentioned Brian Acton is currently the interim CEO. Moxie Marlinspike — who is one of the co-founders — was the CEO, but stepped down last January. You have to hire a new CEO now. What are you looking for? How are you going to do it?

I think we’re looking for somebody with stellar product experience. We want somebody who can really focus in on the organization. How do we get our development practices and coordination as cleanly calibrated and well-oiled as possible? How do we think about scaling this organization, growing this organization, and growing our users? How do we ensure that we are choosing the right features and innovations to focus on, while understanding that this is not your average tech startup? That our growth is in service of something different, that our organization has the luxury to say no to certain choices and can reject the “move fast and break things” paradigm if it is not going to serve our ultimate mission? 

I think it will be somebody who has that experience and the sensibilities that will enable them to discern the differences between Signal and some social app that is just a shim for data collection that goes into some DoD algorithm. That is the truly dark side of tech.

I feel like you might have some history with not being happy with DoD-related projects.

I will just say, the more you know about so-called AI, the more skeptical you become.

Fair enough. Why is it not you? You are very passionate about Signal and it seems like you could do all those things. Why not just pick yourself?

One reason is that I want to go deep on the areas that I have experience in and that I love doing. I talked to the board for a long time about this role, and we shaped it around some of my interests. It’s not a recent realization; I think I have always been a massive Signal booster. I used Signal when it was called RedPhone and TextSecure way back in the day, before there was an iOS app. As I have moved through my career and occupied different positions in academia, in tech, and very briefly at the Federal Trade Commission, it became clear that this is a place where I could very meaningfully put my time. It is something I thought about a lot. It was clearly the most meaningful thing I could do with my energy and expertise.

I think what we want from a CEO is also somebody who has on-the-ground product experience, which I have some of, but I have not been in the messaging space. It would be great to work side by side with that person and to work on the leadership team. This was a very intentional choice and it was sort of shaped around what I think I do best. We want to shape the CEO role for somebody who fits those specific needs. 

It is really inward-focused, “getting everything on rails.” This is not to say things aren’t on rails now, but if you are preparing for growth — if you are preparing to meet this moment, to really recognize this inflection point and mature the organization — we need somebody who is able to focus inward on those issues.

How is the search going? Do you have a timeline?

We don’t have a timeline, but it is active.

What does it look like at the end? Do you have a board meeting where you all sit down and throw secret ballots on the table? Do you raise your hand? Most people are never going to pick a CEO, so give them a vision of what that process is like at the end.

There are interviews. We want to make sure that this is somebody that leadership and the team feel wildly enthusiastic about, somebody who lights up every interview and is clearly showing their knowledge of Signal, their vision, and their insight about the space. We get really excited about that person, we pop virtual champagne, then we have a board meeting and the board votes.

Then there is all sorts of background logistics. You would need to make sure somebody can transition out of another role. If you are hiring a CEO, they probably have significant responsibilities. You work with them to create an offboarding and onboarding timeline. You meet and make sure they have all of the documentation and information they need. 

You may think about the classic timeline in terms of “the first 100 days for executives,” but what is the timeline for impact? What is the initial vision? At that stage, I see myself as a champion and a supporter. How do I back this person up? How do I make sure they are elevated to do the best job they can, that they have all the resources and insight that both I and the org can offer?

Let’s talk about growth. We have mentioned it by different names several times. I understand exactly why you don’t want to use the word “growth,” because it implies a bunch of Silicon Valley tropes. But you want to add users to Signal, so user growth is fundamentally the thing you are talking about.

Yes.

Is everyone on the planet the goal?

The goal is everyone who wants to, needs to, and has a smartphone. Yes, everyone on the planet is the goal, but that almost abstracts it into the land of fantasy. I don’t have the latest numbers, but not everyone on the planet has access to the internet. Not everyone on the planet has access to a smartphone running an operating system that can support Signal. There are planetary distinctions, inequities, and contexts where I feel like throwing that out there would just be a bombastic tech founder goal that is not actually anchored in reality. 

However, everyone who wants to use Signal, we want them to be able to use it. Again, the premise there is that the more people who use Signal, the more useful it is for everyone who uses it. A messaging app that no one uses is useless. A hyper-secure, privacy-aware messaging app that only three people use is only secure, private, and useful to those three people. We really want that network effect because that is what makes messaging work.

I asked the question that way because there are a little under 8 billion people on the planet. About 1.4 billion of those people live in China, which has blocked Signal. Another 1.4 billion live in India, where the government does not like encryption. Just off the bat, are those markets that you want to go into? Are those fights you want to have? Are those compromises you would make? Or are those people just off the books for you?

Let’s be clear, we are not in the business of compromising on privacy, and we are not in the business of handing people who want and need Signal a compromised version of it. We are not going to do that. Are there people in South and East Asia who want to be able to talk privately, safely, and intimately outside of the gaze of corporate and state surveillance? Absolutely. Do we want them to have access to Signal? Absolutely, we do. Do we want Signal to be available there? Yes. Can we magically transform the geopolitical dynamics? No, we can’t. We will do what is within our power to make sure that Signal is available to as many people as possible, and we will do that without compromising our privacy promises.

The Chinese government has effectively blocked Signal. You don’t have any plans to go into China in some way?

I don’t know what going into China would look like. A ticket to Beijing? Hand out QR codes? I don’t know. That’s a joke. We’re not doing that, but…

Yeah, right. Sure. You’re not doing Uber-style guerrilla user acquisition in China. What I meant was that you are not actively talking to the Chinese government about what it would mean for Signal to be active in the Chinese market.

No. Full stop. We are not going to compromise. That would imply that we are in a negotiating stance. Again, I have been in tech almost 20 years, so I have seen this sort of magical thinking recur. It’s this desire, particularly by state actors, to break encryption for their purposes, without understanding that that breaks it fundamentally across the board. This may sound a little bit dated, but there is no compromising with math.

If encryption is broken, it is broken. If Signal doesn’t keep its privacy promises, then there is no real point for us to exist as a nonprofit whose sole mission is to provide a safe, private, pleasant place for messaging and communication in a world where those are vanishingly few and far between. There are a number of other services, but because very few people use them, they are much less useful to those who pick them up and try them.

Right. So I’m just putting this out there. Apple’s solution to this problem — because China is a gigantic market for Apple and its devices — is to say iMessage is encrypted, but then allow a state-operated company to actually run the iCloud data centers in China. It seems like they have threaded the needle in a way that allows them to claim the thing they want to claim, even though the government holds the encryption keys. You are not going to do that. You are not even allowing for a solution like that to exist.

Let the record show, hell no, we are not going to do that. No. 

Let’s flip this and talk about the business model. Apple is doing that because every quarter they have to report their growth and revenue stats to the board, and if those stats are not going up, if they are not acquiring new markets, then eventually their executive team is going to be fired and new people will be brought in. To put this in machine learning terms, the objective function of their company is increasing profit and growth forever. That is literally the definition of metastasis, right? 

That’s not us. We don’t need to do that. We do not need to make those compromises. Myself and our CEO will not get fired if we are not bringing new market strategies — however twisted the compromise is — to the board. We have a different mission and a different set of incentives, which makes it easy to say “hell no” to a question like that.

Signal is in India right now. India has claimed Signal is not in compliance with some regulations there that would require one of these magical-thinking backdoors. Are you going to leave India? Are you going to stay there? Are you going to fight? How is that going to work?

We are still available to people in India who want to use Signal. We are not going to compromise on privacy, and that is our stance. We will do everything we can to continue to be available to the people in India who want and need Signal.

If India passes a law or deems Signal to not be in compliance with whatever encryption regulation, will you walk?

I mean, if the choice is breaking Signal or walking… A lot of times, these policies, strategies, and discussions are not a Boolean. It’s not a cut-and-dry engineering decision — these are very muddy. Frankly, these are not things that are usually best to go into detail on publicly. You have to think about a lot of different political and social dynamics all at once and make up-to-the-minute choices based on dynamic situations. That is a very broad answer. 

I think we are going to be keeping our eye on it. We are going to be doing everything we can to remain available to as many people as possible without breaking Signal.

It’s a broad answer to a specific question. If a government in the world says, “In order to operate in our country, we want the keys to your encryption,” would you just walk?

Yes, we would walk. We will not hand over the keys to our encryption, we will not break the encryption. In fact, with the way we are built, we don’t have access to those keys.
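
A minimal sketch of why there is nothing to hand over: in an end-to-end design, each device generates its own key pair, and only public keys and ciphertext ever reach the service. This uses plain X25519 plus an AEAD from the Python cryptography library purely as an illustration; the actual Signal protocol layers X3DH and the double ratchet on top of primitives like these.

```python
# Minimal illustration (not the Signal protocol): private keys are created on
# devices and never leave them, so the service has no decryption key to hand over.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each device generates its own private key locally.
alice_priv = X25519PrivateKey.generate()
bob_priv = X25519PrivateKey.generate()

# Only the public halves would ever be uploaded to a server.
alice_pub, bob_pub = alice_priv.public_key(), bob_priv.public_key()

def session_key(my_priv, their_pub) -> bytes:
    """Derive a shared symmetric key from an X25519 Diffie-Hellman exchange."""
    shared = my_priv.exchange(their_pub)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"toy-e2ee-demo").derive(shared)

# Both devices derive the same key; the server cannot, because it only ever
# saw the public keys and the ciphertext below.
key = session_key(alice_priv, bob_pub)
assert key == session_key(bob_priv, alice_pub)

nonce = os.urandom(12)
ciphertext = ChaCha20Poly1305(key).encrypt(nonce, b"hi bob", None)
print(ChaCha20Poly1305(session_key(bob_priv, alice_pub)).decrypt(nonce, ciphertext, None))
```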

There is a flip side to this, which is internal to Signal, what values Signal has as a company, and what things you can and can’t do because you cannot see into the content. I think this is maybe the most difficult thing for any company to reckon with when they operate a service with lots and lots of users who might do lots and lots of things. 

In 2021, there was a story from Casey Newton about the group chat feature you mentioned, where you can share links to group chats and thousands of people can join them. Signal obviously cannot see what is going on inside of those group chats because it’s encrypted. That means bad actors can do bad things inside of Signal and spread their messages inside of Signal. Is that something that concerns you?

I think that story from Casey Newton was not a totally clear picture of the real dynamics inside Signal. I think the place where we really think about these issues is in the product direction, when we are thinking through new features and capabilities. There are a number of very smart people who have spent a lot of time thinking through the implications. 

“But what if bad actors did it?” I think that question is compelling and is often very emotionally charged. The truth remains, however, that you cannot provide a service that truly protects the privacy of good actors — many of whom often have a lot less power than the people they do not want to be surveilled and tracked by — while opening up that service to allow surveillance of bad actors. There is no squaring that circle. We are committed to providing a service that is truly private for both.

I can talk about when I was participating in labor organizing at Google. We used Signal. I knew because I had been at the company for over a decade that at that point, the company had teams that were looking for a pretext to fire me. Those pretexts exist. I was part of ethical whistleblowing networks. We were sharing information we thought was in the public interest with the public and journalists, which I stand behind. A lot of this information should not be behind the walls of proprietary tech companies where the decisions are being made based on profit and not on social good. Full stop.

Yes, I agree with you.

There’s the least controversial statement.

As someone who also participates in that dynamic, yes, I agree with you.

Right. I was participating, along with many others, in networks of ethical whistleblowing, which would have provided the pretext for an easy, “Pick this person off.” But we were using Signal and we knew Signal was secure. We were using it on our personal devices, so there was no device manager that was able to key log — a very important detail for those of you taking notes. That meant I could feel safe being part of those activities. 

It’s hard to describe in clear, analytical, sterile, technical terms what that meant. There is a stomach-dropping fear when you’re like, “Shit, did one of the most sophisticated technical adversaries just blow up my spot? Am I unsafe? Is my health insurance unsafe? Will it implicate some of my friends who are also working on this?” It is the difference between that and being able to clearly and securely participate in those ethical activities. 

There is no splitting the baby on this question. Either it is secure and private for everyone or it’s not. And then there is this big, existential question of, “Why do this at all?”

Like I said, I do agree with you, but there is idealism and then there is practice. There is an election coming up. If the Proud Boys post a Signal group chat link to recruit people to storm the Capitol because they don’t believe in the election results, what happens? Do you have a moderation team that takes it down? Do you just let it happen? How does that go?

I think looking back at January 6th is actually a pretty good example. That was planned in the open.

Sure. But I’m asking you specifically, if this happens on Signal, what happens?

I mean, we would not know. Signal is fully private and fully encrypted.

The links are not private, right? You can just post the links.

Well, the links could be posted in a forum. The groups have a limit of 1,000 people.

So you can’t even see that 1,000 people have clicked this link and started planning the thing?

No. We can see that a link exists to a group we don’t know about.

If the link is in a Proud Boys forum, would you not take any action against it, even if it’s like, “Click this link to help plan”?

Are you asking if we have people out there clicking every link and checking if the forum comports with the ideological position that Signal agrees with?

Yeah. I think in the most abstract way, I’m asking if you have a content moderation team.

No, we don’t have that. We are also not a social media platform. We don’t amplify content. We don’t have Telegram channels where you can broadcast to thousands and thousands of people. We have been really careful in our product development side not to develop Signal as a social network that has algorithmic amplification that allows that “one to millions” amplification of content. 

We are a messaging platform. We don’t have a content moderation team because (1) we are fully private, we don’t see your content, we don’t know who you’re talking to; and (2) we are not a content platform, so it is a different paradigm.

Signal has a terms of service. There is stuff you’re not allowed to do with it. How do you enforce that terms of service?

We don’t have access to your messaging or access to who you are talking to. We have minimized our access to information about you, about your conversations, about your friends, and about your networks. We are not out there policing who you talk to or what you talk about. That is anathema to the mission of Signal.

So you have added Stories, which are the ephemeral messages people are probably familiar with. The reason those are popular with the companies that add them is that they are sticky. They get people to come back and use the app more. You can measure it and say, “We need to add more sticky features.” Does Signal measure the stickiness of the app? Are you measuring how people use it to add features that are sticky like that?

No, we don’t do analytics or tracking, so we actually don’t have that information. That means we have to use other pieces of information and intuition when we are making product choices. We don’t measure that because we have very limited information; roughly when people last used the app is about it.

When you have product people and engineers deciding what features to add, they don’t have the data to back up their arguments? They just have to say, “This is a good idea”?

Yes. We don’t track or analyze use of specific features, but there are insights that are produced outside of Signal. There are basic sensibilities that come from folks having, oftentimes, decades of experience in the messaging space. We are not flying blind; we’re just not relying on surveilling our users to make our choices.

That seems extremely refreshing.

Usually when regulators — particularly in this country lately — want to break encryption, they immediately turn to child abuse. It is a new and somewhat startling trend that this is what regulators have focused on. They have pointed that gun at Apple really strongly. Apple developed a system, which it has not rolled out yet, to scan your devices in a way it claims protects your privacy, so that people can’t use its devices for child abuse material. Is that something that Signal would do? Or is the position, “To protect everyone, we accept that there will be some amount of child abuse material”? That is just the unnerving reality of all services at scale. Is Signal saying, “We can’t even see it, so we can’t take any action against it,” or is there something you would do to take action against it?

I would point to the work of folks like Riana Pfefferkorn, Matt Blaze, and Susan Landau, who have all looked at content scanning and what we might refer to as analog backdoors. The issue with Apple’s proposal to scan everything on your device is that they still control what they are scanning for. I would also point to a number of the sex worker organizers and people who are more on the margins, who are more fearful of being caught up in these scans. These are often arbitrarily enforced, and very rarely are there ways to contest these decisions. 

When you have a company like Apple, it is very unclear whether the US government or another state could mandate scanning for just a little bit more through some national security letter or another mechanism. It is an extremely dangerous, slippery slope that sits right at the nexus of state-corporate surveillance. These techniques, whatever you call them, need to be understood as backdoors into privacy and encryption. Signal has absolutely no plans to scan anyone’s messages to decide which messages are okay or not. That is our general stance there.

We have actually had Pfefferkorn on the show to talk about the Apple system before.

Fabulous. Hi, Riana.

She’s wonderful. Go listen to that episode after you finish this one. Apple’s position there is, “Well, we think this is bad and we are getting this government pressure. We have built this complicated system.” Microsoft has built a complicated system to hash against this known imagery. There are other ways to do it. It is all very complicated. It comes with a set of tradeoffs, but the goal of those tradeoffs is to eliminate the bad things. You can say the tradeoffs are too costly, but the goal is potentially a good one — well, probably a good one, right? “Don’t have this material on our service.” Are you saying the tradeoffs are far too costly, so you are going to just allow this material on the Signal service?

I am saying the tradeoffs aren’t tradeoffs. There is not some scale of justice that we are evening out. You either break it or you don’t. Either Signal’s core premise is intact or it’s not. That’s that. 

I think there are people better positioned than I am to make these arguments. There are a lot of techniques for law enforcement that don’t involve immediately turning to digital surveillance. I think we need to dig into a more troubling history of where these bad things are not being prosecuted. Who doesn’t get prosecuted for them? Where are they allowed to exist in the analog world? 

There is an unwillingness, or a lack of focus or political pressure, to really explore what other mechanisms exist to check these dynamics that are not often using the most emotionally stirring arguments. Really, I don’t think any of us can sit here and listen to stories about child abuse and not be moved unless we’re sociopaths. This really matters and it is horrific. Full stop. 

Too often, I think that pretext gets used to reflexively instill in people a response to these questions that’s like, “Break anything we have to break, because this is too emotionally meaningful for me to sit by.” It almost short-circuits that sort of deliberate and discerning analysis of the whole scope of the problem. I think that is also an issue with this debate.

We have talked about government pressure and we have talked about the content moderation problem. Those are problems that come with scale. As you get more and more scale, more and more governments are going to pressure you to break things. As you get more and more scale, you will get more and more pressure from your users and from your employees to moderate in some way. Let’s actually talk about how you would get that scale right now. 

In the United States, for example, almost everybody has a phone. There is a significant population of people who don’t have access, but it is reasonably fair to say that people who can get phones have phones in the United States. You have to take market share away from competitors in order to grow. People have to start using Signal and stop using SMS, iMessage, or WhatsApp. iMessage is pretty dominant. There is a trope about blue bubbles and green bubbles that exists for a reason. iMessage users are not willing to switch away from those blue bubbles. How do you get them to switch?

First, just install Signal. Use it with the other people who are using Signal. In a sense, yes, of course we want people to switch, but many people use many different, potentially overlapping services for many things. 

We first need to make it clear that Signal is different. What we offer is true privacy, not privacy claims with little caveats in a 15-page terms of service. We need to make it clear that this is extremely valuable, and that this is something that will protect you and allow for intimate, safe conversations with you and your friends. We do see people understanding those distinctions increasingly over the last five years. 

Then our task is to make Signal as pleasant and useful as possible. What are the features we can add that competitors might not be willing or able to, because of our unique business model, because our incentives are different, and because privacy is at the forefront of our product and our mission?

We just need to make sure people know about it and that they are able to quickly and easily use it. When you open Signal, you can believe that it’s important and know why you downloaded it, but the second you are using it to share directions, you shouldn’t be actively thinking about that. It should just work. There should be a seamless experience. You get in there, you share your story.

We have a new feature that is in beta right now called Stories. They are very cute. I encourage people to use them when they are fully available. It looks, feels, and acts like a messaging app, which is not easy. The norms and expectations about what messaging apps should do have been set by these big surveillance messaging apps and these big corporate structures.

Just by way of comparison, there is a stat I have thrown around in a couple of places. WhatsApp has over 1,000 engineers — and that is just their engineering team. If you added support, policy, et cetera, you are looking at many thousands of people just sustaining WhatsApp. That is not even counting the rest of Meta. Telegram has somewhere around 500 employees, so that is fairly big. Signal is 40 people. That is 40 people maintaining an app across three clients. It’s hard, thankless, constant work. I’m privileged to work with the brilliant people who do it, but nonetheless, they work really hard doing it.

It’s not any cheaper for us just because we don’t participate in the surveillance business model. It is tens of millions of dollars a year in hosting, transit, registration, et cetera. It is the cost of making sure Signal is available everywhere and always seamless, which are the expectations that have been set by the current tech ecology. We do need to continue developing and building Signal so it meets those expectations and figure out ways that a service like ours can sustain itself, given the forever cost and the labor requirements.

The forever cost is what I’m getting at. It is hard enough to get people to not use iMessage. Google has now failed for a decade to get people to switch off of iMessage.

Well, they have a truly appalling strategy.

It is in Google’s best interest to develop a messaging app that works. Google is Google and they can’t get out of their own way.

They developed 40,000 that didn’t work.

Sure, but they can’t do it. Microsoft can’t do it. Facebook can’t do it. Facebook will happily tell you that there is more action on Instagram in messaging than there is on the grid or in Stories, but they haven’t displaced iMessage. You have that problem, but then on top of that the users need to like it so much that they donate to the foundation to keep the thing running.

Not every user needs to donate. We are never going to charge to use Signal because privacy shouldn’t only be for people who want to or can pay for it. I don’t subscribe to the theory that people are idiots. I think people are very discerning and they get it when they hear it. People get that the surveillance business model is no good. They get that we are somewhere fairly scary with the power that has been ceded to these large surveillance tech giants. They understand that alternatives are necessary.

It doesn’t have to be everyone, but some percentage of the millions of people who use Signal donating $5 a month is what we’re looking at. It is a casual Patreon model that is at scale, so that we are able to support the significant maintenance costs for Signal. Frankly, that is the hardest to cut off at the knees. We are really fortunate that Brian Acton’s generous, long-term loan has allowed us this foundation to experiment with sustainability models to grow Signal and to get it in shape. We are really aiming for small donors, both because we think people will be willing to donate and because we want a model where one person pulling out wouldn’t capsize the ship.

Is that the model you have or is that the model you want? It seems like it’s the model you want, but right now you have Brian’s money and a bunch of other big donors.

We have a hybrid. We have only started experimenting with the donation model. For a while, there was a donation page buried on our website that people probably rarely found. Now we are experimenting with in-app nudges that are like, “Hey, if you want to kick down, kick down.” We have badges, which are cute, little signifiers that go on your profile image that demonstrate that you donated. People can click on your badge and they can click through to make their own donation. 

This is very recent, in the last year or so. We are iterating and experimenting with that model. Everyone listening, download Signal if you haven’t and make a little monthly donation. It’s easy. I could use some boring nonprofit trope that it’s like a cup of coffee or whatever.

Oh no.

But really, this is existentially important for a livable future. We have to have a private way to communicate. Folks, particularly those who are in and around tech, will understand that at a visceral level. Come on, join the community, kick in.

Is it going to be like Wikipedia? Are you going to ask us for money every three months?

Well, no. We are going to have a lot of chill. We want to remind people that we need money, but the app is a messaging app. What we are dedicated to, first and foremost, is that you can open it, it’s useful, it’s pleasant, and it’s not in your face. We want to remind you, but we want to be really subtle about it. That is actually something we have a lot of discussions about. What is the minimum viable notice to folks that we can get away with and still ensure that people who can donate know and can sign up easily?

Last year we reported — I think it was Moxie’s assessment — that for Signal to be self-sustaining, it would need 100 million users. Is that still the number in your mind or how close are you to that goal?

That’s a shorthand assessment from Moxie. What actually matters is the percentage of those 100 million users who also donate. More users mean more hosting, more transit, and more registrations, which is a cost if folks aren’t donating. It’s tens of millions of dollars a year, so we need a large enough percentage of users donating that we are able to cover those costs. 
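
As a back-of-the-envelope illustration of that shorthand: the cost figure below is an assumed placeholder for "tens of millions of dollars a year," not an official Signal number. At $5 a month per donor, covering roughly $50 million a year takes on the order of 800,000 donors, which is under 1 percent of 100 million users.

```python
# Back-of-the-envelope only; the cost figure is an assumed placeholder for
# "tens of millions of dollars a year," not an official Signal number.
annual_cost = 50_000_000      # hypothetical yearly hosting/transit/registration cost, in dollars
donation_per_donor = 5 * 12   # a $5/month donation sustained for a year
users = 100_000_000           # the shorthand user figure discussed above

donors_needed = annual_cost / donation_per_donor
print(f"donors needed: {donors_needed:,.0f}")          # ~833,333
print(f"share of users: {donors_needed / users:.2%}")  # ~0.83%
```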

Are you at 100 million users? How close are you to it?

We are not. We don’t share user numbers publicly. I guess the straight answer to that question is that we are not at 100 million users, but our user base is growing. You can see we have over 100 million downloads in the Play Store. We have a significant user base, which is increasing, and I definitely think we will get there.

You recently announced that you’re dropping SMS support from the Signal app. Google is pushing RCS really hard. Converting people into messaging apps is really hard. Getting people to not use iMessage is really hard. Why are you dropping SMS?

This is one of those decisions that has been a long time coming. The leadership team had been agonizing over it before I joined and kept agonizing after. It surfaced all the way up to the board level, so this was not an easy decision. And for a little bit of color on this: Signal is dropping SMS support on Android, not iPhone. Android allowed people to set Signal as their default messaging app. That meant someone could send Signal messages, which are fully encrypted, fully private, and secure, or they could also answer insecure SMS text messages. The SMS messages were kind of a guest in the messaging app and were answered through Signal alongside Signal messages.

This is a feature that has been in the Android client for almost a decade at this point, and in that decade, a lot has changed. SMS has always been insecure; it basically gives your messages in plain text to your telecom provider. That is the opposite of Signal’s stance and Signal’s mission, and frankly, we got a lot of reports that this was confusing to people. People didn’t realize the difference between an SMS and a Signal message. And we take that seriously, because that can be existentially dangerous for some people who are using Signal in high-risk situations.

There was also the issue (and this is not something that would have hit users in the US or other historically rich countries) that in a number of disinvested regions, people would confuse an SMS message for a Signal message, send a bunch of SMS texts, and, because SMS is billed at a very high rate, get a huge bill when they thought they were using their data plan with Signal. So those are a couple of our key reasons, but I think security was really the biggest one. 

Times have changed in the last 10 years. As you said, Google is pushing RCS. They hope, and it appears, that RCS is set to replace SMS at some point. That was actually leading to errors with the SMS integration; you might not receive a message if your phone defaulted to RCS, or something like that, and that was increasingly hard for us to deal with on the user report side. It was increasingly difficult to support SMS as a standard that is being deprecated. There is also still no official API that would let us put RCS into Signal even if we were considering it, which is not on our roadmap at this point.

So those are the considerations that went into making this choice. Again, I am a lifelong, or rather life-of-Signal-long, Android Signal user who has used it as my default messaging app the whole time, so this is at the front of my own pain points. But the security issues, the confusion, and the fact that SMS is a deprecated standard all weighed in the direction of removing it and moving on to a future where Signal is fully secure and there is no ambiguity.

But let me push on this a little bit. Obviously Apple has played this game for a long time. iMessages, depending on your settings, are encrypted end to end. They’re blue. They’re more feature-rich. SMS is green. Everyone understands green is worse than blue in iPhone world. Why can’t you just do a solution like that? Because what you’re losing is the opportunity to convert SMS users into Signal users by saying, “Just use this one app for everything. And by the way, if your messages turn blue, you’re in Signal. You’re encrypted and you get all these other features.”

Let’s be clear, Apple has advantages there that we don’t. They control the hardware, they are the gatekeeper for iOS and the iPhone, and they have a lot of levers they can pull that we can’t. They also don’t support RCS. 

And we’re talking about the Android ecosystem here, but right now there are two competing protocols: SMS and RCS. It is difficult to implement a third-party SMS app when the phone will default to RCS. So there are a lot of other issues we face because we are not a big tech company that controls the hardware and has that closed ecosystem at its disposal, which would allow us to reliably make those choices for users. We did a lot of work trying to disambiguate SMS from Signal messages, and this is no fault of the people who use Signal. 

It is simply that when people pick up tech, it’s not so that they can be taught small nuances; it’s so they can quickly communicate with their friends. Getting someone to clock the difference in a protocol-layer security property is a pretty steep education task. It is very difficult to accomplish, and it’s particularly difficult if, unlike Apple, you don’t control every part of the ecosystem you’re operating in.

You said RCS isn’t on the roadmap. You said Google doesn’t have an API for RCS on Android. If Google had an API for RCS on Android, would RCS support go back on the roadmap for Signal on Android?

I haven’t looked deeply enough at that to have a clear answer. I think the answer is TBD. Our goal is for Signal to offer unequivocal, casual, just completely reliable security and privacy, so we would want to make sure RCS wasn’t an issue vis-à-vis those goals. I know RCS is certainly much better than SMS, but I have not pored over the spec, because again, our primary motivation in removing SMS was to get rid of a confusing and inherently insecure option.

One of the criticisms that I read after the announcement came out was, “Hey, I was able to put Signal on my mom’s phone and she didn’t have to know anything. But I knew that I was now sending her Signal messages. This is how you grow the network. Now I need to have two apps.”

They need to have two apps. This is actually worse for Signal adoption, because you’re not seamlessly onboarding people onto the encrypted network and away from SMS. Are you worried about that, or do you think this is just a case where you have to market and make the thing better?

This is why this was a hard decision. Those folks are not wrong. That’s real, and I’m one of them. My dad uses Signal even though he doesn’t really know he uses Signal; he just uses the app where the messages come in. This will probably be very confusing. I think what we did is make a hard, kind of crappy choice. We were presented with two options we didn’t like, and we chose the one that privileged privacy, security, and a long-term roadmap where, again, SMS is being deprecated. People were confused. It was causing an increasing number of errors. And the development effort of maintaining that, in addition to doing all the other things, was nontrivial.

Those folks are right. But when we weighed all of the variables, this is what we came out with. And I do think people will continue to use Signal, of course. People will continue to adopt it, but as I put on Twitter, I’m not happy about pulling up an on-ramp to adoption. I’m not happy that it’s going to be harder for me to explain this to my dad and my brother and other folks, but we don’t create the reality that we’re operating in, and we had to face that.

One of the promises of RCS is that it will be encrypted. I’m not sure how well carriers around the world are going to keep that promise or Google will keep that promise, but that is one of the promises.

Yeah.

Do you think that RCS represents competition for Signal?

Not at the moment, no.

Why’s that?

Well, again, I haven’t pored over the RCS spec, so I want to be really careful with any answers. I could do that and come back on, and we could have a whole conversation about it. But Signal is not just encrypted. WhatsApp uses the Signal protocol to encrypt its messages. Signal doesn’t just encrypt the message content; it is encrypting metadata. It did something which I consider fairly revolutionary with its new group system and infrastructure, which was to figure out a way to prevent Signal from knowing who is in a group and who is talking to whom. These are huge, true scientific innovations that are also innovations in privacy: Signal trying as hard as we can to collect as little information as possible about you, about who you talk to, about what people are saying, who they are saying it to, and who is using our service.

So I would need to look at the entire end-to-end infrastructure and what incidental data or metadata is being collected. And then I think we have to consider the concept of privacy structurally, not just technologically. Companies use encryption for a number of things; they still collect data.

If we’re looking at an app that is controlled by Google, it is pretty trivial to join that metadata with a lot of the other wildly intimate and personal data that Google has and draw conclusions about people. 

So we need to also look at the organizational and structural differences between a Signal and a Google. And Signal doesn’t have any of that data. We don’t buy data from data brokers. We don’t scrape data from anywhere. We don’t have it. We don’t want it. We actually go out of our way, as I just described, to avoid having it or touching it or knowing it. 

So I think that we’re talking about a difference in kind. And that difference in kind is not just vis-à-vis our technological implementation or whether we’re using this variety of end-to-end encryption – although the people who are using the state-of-the-art messaging encryption system are using the Signal protocol. I think what we’re talking about also is what our incentives are and how we are structured to ensure that we live by our mission and not in the name of profit and growth.

What is next for Signal? What should people be looking for?

You should definitely look out for the Stories update; that is coming in a couple of weeks. We should be rolling out Stories, which are cute, little, ephemeral messages that you may be familiar with from other services, but on Signal, they will be fully private and secure. That is the next big feature launch. We are all using it inside Signal and we all love it. It’s going to be great when we can actually use it with all of our friends and colleagues.

Well, Meredith, it has been really great having you on Decoder. Thank you for all this time.

Thank you.
