
THE ENCRYPTED-COMMUNICATION APP Signal has a sterling reputation within the security and crypto community, but its critics point to a nagging flaw: The app asks for access to your phone’s contact list when you install it. Signal’s creators, like the developers of so many other social apps, consider that contact-sharing request a necessary evil, designed to make the app as easy to use as your normal calling and texting features. But it’s one that some of Signal’s most sensitive users resent as a breach of its privacy promises.

Now Open Whisper Systems, the nonprofit behind Signal, is launching an experimental new method to sew up that gap in its data protections, while still letting you flip through your existing address book to make encrypted calls and send encrypted texts. And its approach could serve as a model for other apps wrestling with the same address-book privacy problem.

Making Contact

Using a feature in the latest generation of Intel processors, the group plans to announce Tuesday that it’s testing a method that lets its servers mine your address book to find other Signal users, while provably deleting all the contact data those servers see without recording it. That means, in theory, no hacker, government agency, or even Signal developers themselves can access that sensitive data.

“When you install many apps today you get this little prompt that asks if you want to give someone access to your contacts. You get an uncomfortable feeling in that moment,” says Moxie Marlinspike, the founder of Open Whisper Systems and Signal’s creator. “This is an experiment in letting you not have that uncomfortable feeling.”

That new experimental protection for your Signal contacts, which Open Whisper Systems is testing now and hopes to roll out to users over the next few months, takes advantage of an Intel processor feature called Software Guard Extensions, or SGX. Intel chips that integrate that SGX component have a “secure enclave” in the processor, designed to run code that the rest of the computer’s operating system can’t alter. Any code running in that enclave is signed with a unique key that Intel, not the computer’s owner, controls. And a computer that connects to that machine running SGX can check its signature to make sure that the code in the enclave hasn’t changed, even if the rest of the computer is infected with malware, seized by the FBI, reprogrammed by its owners to sell out all its users’ data, or otherwise compromised.
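The attestation idea behind that check can be sketched in a few lines. The snippet below is a heavily simplified, hypothetical illustration, not Intel's actual protocol: the enclave reports a hash ("measurement") of the code it is running, signed with a key only the hardware vendor holds, and the client accepts the server only if the signature verifies and the measurement matches the expected open-source build. Real SGX remote attestation involves Intel's attestation service and a far more elaborate handshake; all names here are invented for illustration.

```python
import hashlib
import hmac

# Hypothetical expected measurement of the published open-source enclave build.
EXPECTED_MEASUREMENT = hashlib.sha256(b"signal-server-enclave-v1").hexdigest()

# Stand-in for the vendor-controlled signing key (in reality, Intel's key,
# which neither the server operator nor the client ever holds).
VENDOR_KEY = b"vendor-signing-key"

def sign(measurement: str) -> str:
    """Vendor-side: sign the enclave's code measurement."""
    return hmac.new(VENDOR_KEY, measurement.encode(), hashlib.sha256).hexdigest()

def verify_attestation(measurement: str, signature: str) -> bool:
    """Client-side: accept the server only if the signature is valid AND
    the reported code matches the build we expect."""
    signature_ok = hmac.compare_digest(sign(measurement), signature)
    code_unmodified = measurement == EXPECTED_MEASUREMENT
    return signature_ok and code_unmodified
```

A phone running Signal would accept `verify_attestation(EXPECTED_MEASUREMENT, sign(EXPECTED_MEASUREMENT))` and reject any server whose reported measurement differs, which is what lets clients detect a modified enclave even on a compromised machine.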

Much of the attention to SGX has focused on how it can enable practically unbreakable “digital rights management” anti-piracy measures: If it’s installed on your PC, it could prevent you from fully controlling the code of the videos or games you play on it, making it far harder to crack those files’ copy protections. But Open Whisper Systems is now turning SGX’s trust relationships around, and running it instead on Signal’s servers. As a result, Signal users will be able to check that those servers are behaving in a way that even its administrators, or an outside party who compromises the servers, can’t change.

When you share your contacts with Signal, its servers check your address book against all registered Signal users to assemble the list of contacts you can reach in the app. Now, that process will be performed within the Signal server's secure enclave. Every phone with Signal installed will in theory be able to check that Signal's open-source server code, which is designed to immediately erase that address book info after processing it, hasn't been changed to somehow store the data instead.
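Logically, the in-enclave lookup is a set intersection that returns only the matches and retains nothing else. The sketch below is illustrative, not Signal's actual server code; the function and variable names are invented, and the erasure guarantee in the comment is what the SGX enclave provides, not something plain Python enforces.

```python
def discover_contacts(uploaded_contacts, registered_users):
    """Toy sketch of the contact-discovery lookup: intersect the uploaded
    address book with the set of registered users, preserving the order of
    the caller's address book, and return only the matches."""
    registered = set(registered_users)
    matches = [number for number in uploaded_contacts if number in registered]
    # The uploaded address book goes out of scope here. Inside an SGX
    # enclave, the memory holding it is encrypted and never persisted,
    # and the attested code contains no path that stores or exports it.
    return matches
```

So a lookup like `discover_contacts(["+15550100", "+15550111"], {"+15550111", "+15550199"})` would return only `["+15550111"]`, and nothing about the non-matching numbers would leave the enclave.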

By running the contact lookup process in that SGX-enabled enclave, “we’re hiding the contacts from ourselves,” says Marlinspike, “in the sense that the code is unalterable, and it’s written in an unalterable way where it doesn’t reveal the contacts to anything else outside that enclave.”

That server-side use of SGX is still relatively untested, and the notion that the administrators of a server could prevent even themselves from undetectably fiddling with code in the secure enclave of a computer they physically control isn’t entirely proven, says Rafael Pass, a cryptography-focused professor of computer science at Cornell Tech who presented a paper on server-side SGX implementations for privacy at the Eurocrypt conference earlier this year. “They could potentially break their own SGX enclave. It’s not well understood how expensive that is,” says Pass. “In principle it seems like a viable design. It makes it better, but it’s not clear how much better.”

But Marlinspike argues that the new security measure will at least make it vastly harder for Signal to somehow sabotage its own privacy protections. In the past, the app has obscured users’ contacts by taking a cryptographic “hash” of them, converting them into a unique string of characters that can’t be deciphered to reveal the original information. But that hashing process alone was relatively easy to crack, since someone could simply hash all possible phone numbers and match them to the hashes Signal collects.
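The weakness of hashing alone is easy to demonstrate: the space of possible phone numbers is small enough to enumerate, so an attacker can hash every candidate and match the results against collected hashes. The toy example below uses a shortened number length purely to keep the demonstration fast; the numbers are hypothetical.

```python
import hashlib
from typing import Optional

def hash_number(number: str) -> str:
    """Hash a phone number the way a naive contact-discovery scheme might."""
    return hashlib.sha256(number.encode()).hexdigest()

def crack(target_hash: str, digits: int = 4) -> Optional[str]:
    """Dictionary attack: hash every possible number of the given length
    and return the one that matches. Real phone numbers have a larger but
    still very enumerable space."""
    for n in range(10 ** digits):
        candidate = str(n).zfill(digits)
        if hash_number(candidate) == target_hash:
            return candidate
    return None

# A hash the server has "collected" is trivially reversed:
recovered = crack(hash_number("0123"))
```

Here `recovered` comes back as the original `"0123"`, which is exactly why hashed phone numbers offer little real protection on their own.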

Now Signal users will have the extra assurance that Signal’s servers aren’t collecting those hashes in any permanent way, and in fact can’t, short of someone finding a new method of breaking Intel’s SGX protections. But Signal’s SGX implementation remains just a test, and it will require real scrutiny to ensure it truly hides all parts of a user’s contact list in its secure enclave and allows that code to be publicly verified in a meaningful way.

A Better Method

If it checks out, though, Signal’s use of SGX might offer a new alternative for social apps that seek to thread the needle of convenience and security. If social software wants to offer a calling or messaging experience better than a 1980s-style touchtone telephone sans speed dial, it generally either uploads your phone’s local contact list or stores its own list of your contacts on a server. Either option seriously impinges on the privacy of your personal social network.

Signal’s solution might offer a solid third option. “We want this to be something that’s accessible and generally deployable by everyone who has this problem, not just us,” Marlinspike says. “We’re trying to build something that will scale to more than a billion users.” The result might someday be that the privacy protections Signal has helped to extend to the contents of those billions of users’ communications could apply to the equally precious contents of their contact list, too.