Cornell Tech researchers have developed a mechanism for maintaining anonymity in encrypted messaging – which hides the content of a message but may not mask the identity of the sender – while simultaneously blocking spam and abusive messages.
PhD student and co-lead author Nirvan Tyagi presented the group’s paper, “Orca: Blocklisting in Sender-Anonymous Messaging,” at the 31st USENIX Security Symposium, held August 10-12 in Boston.
Co-authors included Thomas Ristenpart, professor of computer science at Cornell Tech and at the Cornell Ann S. Bowers College of Computing and Information Science; Julia Len, PhD student in computer science; and Ian Miers, assistant professor of computer science at the University of Maryland and former postdoctoral researcher at Cornell Tech.
This work is a continuation of research funded by a five-year, $3 million grant from the National Science Foundation, with the goal of taking important steps toward safer online communication. Ristenpart is the principal investigator of the project, “Privacy-Preserving Abuse Prevention for Encrypted Communication Platforms.”
Platforms such as Signal, WhatsApp, and Facebook Messenger rely on end-to-end encrypted (E2EE) messaging to maintain message confidentiality, but user anonymity is not guaranteed. Signal recently introduced an anonymity-preserving feature, but it was found to be vulnerable to attack.
“While E2EE protects message content from the platform, it does not prevent other types of metadata from leaking,” Tyagi said.
While E2EE messaging provides strong confidentiality for the messages being sent, the platform can learn the identities of both the sender and receiver of each message sent over its network. Signal, a messaging app released in 2014 that now boasts more than 40 million users, introduced a “sealed sender” protocol that hides the sender’s identity from the platform.
This highlights a major tension in sender-anonymous systems: anonymizing the sender while mitigating potentially abusive messages. E2EE by itself makes certain types of abuse mitigation more difficult, and sender anonymity only complicates those efforts. One example of an abuse mitigation complicated by sender anonymity is blocklisting.
“The [sender-anonymous blocklist] is kind of a paradoxical one, because we want the platform to be able to filter based on sender identities, but we also want to hide the identity of the sender from the platform,” Tyagi said.
With Orca, message recipients register an anonymous blocklist with the platform. Senders construct messages that the platform can verify as coming from someone not on the blocklist.
Verification is done through group signatures, which allow users to sign messages anonymously on behalf of a group. The platform registers individual users, and the group’s opening authority – here, the recipient – can trace the identity of each individual signer.
If the sender is on the blocklist, or if the message is malformed, the platform rejects it. But if the message is delivered, the recipient is guaranteed to be able to identify the sender.
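The blocklist flow described above can be sketched as a toy simulation. This is not real cryptography – a genuine group signature hides the signer from the verifier mathematically – and every class, function, and name below is illustrative; a per-recipient hash merely stands in for the anonymous credential the platform would actually check.

```python
# Toy simulation of a sender-anonymous blocklist check -- NOT real
# cryptography. The hash-based "pseudonym" stands in for a group
# signature: the platform sees only an opaque value, while the
# recipient (acting as opening authority) can map it back to a sender.
import hashlib

def pseudonym(sender_id: str, recipient_id: str) -> str:
    # Opaque per-recipient identifier; the platform never sees sender_id.
    return hashlib.sha256(f"{sender_id}|{recipient_id}".encode()).hexdigest()

class Platform:
    def __init__(self):
        self.blocklists = {}  # recipient -> set of blocked pseudonyms

    def register_blocklist(self, recipient, blocked_pseudonyms):
        # The recipient registers an (anonymous) blocklist with the platform.
        self.blocklists[recipient] = set(blocked_pseudonyms)

    def deliver(self, recipient, sender_pseudonym, message):
        # Reject if the anonymous sender is on the recipient's blocklist.
        if sender_pseudonym in self.blocklists.get(recipient, set()):
            return None  # rejected without learning who the sender is
        return (sender_pseudonym, message)  # delivered; recipient can "open"

platform = Platform()
# Recipient "bob" blocks "mallory" by registering her pseudonym.
platform.register_blocklist("bob", [pseudonym("mallory", "bob")])

assert platform.deliver("bob", pseudonym("alice", "bob"), "hi") is not None
assert platform.deliver("bob", pseudonym("mallory", "bob"), "spam") is None
```

The key property being imitated: the platform filters on sender identity without ever holding that identity in the clear.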
Orca takes efficiency one step further: instead of creating and verifying a group signature for every message sent, the group signature is used only periodically, to mint new batches of one-time-use sender tokens from the platform. A message can then be sent by including a valid token for the recipient; these tokens are much cheaper for the platform to verify, requiring only a check against the list of used or blocked tokens.
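The token optimization can be sketched in the same toy style. Again this is a simplified illustration, not Orca’s actual protocol: the group-signature proof that authorizes minting is omitted, and all names are invented for the example.

```python
# Toy sketch of one-time-use sender tokens -- the expensive per-message
# signature check is replaced by cheap set lookups.
import secrets

class TokenPlatform:
    def __init__(self):
        self.valid = {}   # recipient -> set of unspent token values
        self.spent = set()

    def mint_tokens(self, recipient, n):
        # In Orca, minting would require a group-signature proof that the
        # sender is authorized and not blocklisted; omitted in this sketch.
        tokens = [secrets.token_hex(16) for _ in range(n)]
        self.valid.setdefault(recipient, set()).update(tokens)
        return tokens

    def send(self, recipient, token, message):
        # Per-message verification is now just two set lookups.
        if token in self.spent or token not in self.valid.get(recipient, set()):
            return False  # reused, bogus, or revoked token
        self.spent.add(token)
        self.valid[recipient].discard(token)
        return True

p = TokenPlatform()
toks = p.mint_tokens("bob", 2)
assert p.send("bob", toks[0], "hello") is True
assert p.send("bob", toks[0], "replay") is False  # one-time use
assert p.send("bob", "bogus-token", "x") is False
```

The design point: the costly anonymous-credential check happens once per batch of tokens, not once per message.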
“When sending a message, the sender proves to the platform, using cryptography, that they are an authorized sender for the recipient and are not on the recipient’s blocklist,” Tyagi said. “And they can do that in a way that hides their identity from the platform.”
Tyagi said this type of precaution could be beneficial in a number of scenarios.
“You might be a whistleblower at a company, and you contact a journalist, which is not common behavior for most people,” Tyagi said. “Then a big story appears. The mere fact that someone from that company was recently in contact with the journalist could raise a red flag.”
“Or in the medical field, the mere fact that you’re communicating with, say, a cardiologist can reveal confidential information about your health,” he said.
Future work will address the computational challenge of ensuring that a single cryptographic identity corresponds to a single human. It’s just one of the many problems computer scientists face as they tackle the tension between anonymity and mitigating abuse.
“Increased privacy could harm the ability to do certain types of abuse mitigation and accountability,” Tyagi said. “The question is: Can we make this trade-off a little less costly with better cryptography? And in some cases, we can.”