Protecting messaging app users while safeguarding privacy
Cornell University-led team working to stop harassment on encrypted communications services
The internet is an essential part of modern life, and dependence on online services accelerated throughout the past two years as millions of people worked, shopped and interacted with others remotely during the pandemic.
This growth in online usage increased awareness about the importance of cybersecurity and privacy. For many, the concerns are primarily about protecting personal information from being collected and exploited by large online companies or stolen by hackers. But the U.S. National Science Foundation is driving research by Cornell University, in collaboration with researchers at the University of Washington, that looks at another aspect: mitigating harassment and cyberbullying conducted over encrypted communications services while also ensuring user privacy.
Thomas Ristenpart, an associate professor of computer science at Cornell, is working with a multidisciplinary team of researchers to navigate the technological, legal and social challenges of developing safer and more secure online communications for users of these messaging apps.
Tackling a broad challenge
Ristenpart’s work in computer security and cryptography has been interwoven with NSF throughout his career. His work at Cornell includes NSF-supported research into secure cloud computing and understanding technology abuse in domestic partner violence.
“This opened my eyes to ways interpersonal harms can arise and how abusers exploit technology to cause harm,” Ristenpart said. “It changed my thinking in that there are other things to be concerned about beyond traditional computer security, and it triggered an interest in how to think about the design of cryptographic mechanisms in light of interpersonal abuse.”
The addition of encryption improved security for messaging app users. But because the operators of these apps cannot see the contents of the messages their services carry, encryption also gives online bullies new means to harass others and gives groups engaged in illegal activity new ways to communicate.
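The property described above can be illustrated with a deliberately simplified sketch (a toy XOR one-time pad, not any real app's protocol): the service relays only ciphertext, which is unreadable without the key held by the sender and recipient.

```python
import secrets

# Conceptual illustration of end-to-end encryption: the messaging service
# relays ciphertext and, lacking the key, cannot read the contents.
# Toy XOR one-time pad for demonstration only -- real apps use vetted
# protocols (e.g., the Signal protocol), not this.

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR each message byte with the corresponding key byte."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at noon"
key = secrets.token_bytes(len(message))  # shared only by sender and recipient

ciphertext = xor_bytes(message, key)     # this is all the server ever sees
plaintext = xor_bytes(ciphertext, key)   # only a key holder can recover this

assert plaintext == message
```

Because the server handles only `ciphertext`, it cannot apply content-based moderation directly, which is exactly the tension the Cornell-led team is studying.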
The challenge for the team is developing ways to mitigate harassment and curtail the use of these apps to cause harm while also retaining the safety and security encryption provides for all users. “This is mostly framed as government lawful access versus liberty and confidentiality. I think it’s a more subtle landscape than that – with a lot of stakeholders involved,” Ristenpart said.
A diverse team is a good way to move forward on tricky problems like abuse in private messaging and rethink basic conceptions about the problem and potential approaches. - Thomas Ristenpart
To tackle this broad challenge, NSF is bringing together experts across several fields: Mor Naaman, professor of information science at Cornell; James Grimmelmann, professor of digital and information law at Cornell; J. Nathan Matias, assistant professor of communication at Cornell; and Amy Zhang, assistant professor at the University of Washington's Paul G. Allen School of Computer Science & Engineering.
“In science, communities tend to get siloed and focus on particular types of problems, which can be really useful for making rapid progress on them. But this doesn't work as well for problems that need tools and thinking from multiple backgrounds,” Ristenpart said. “NSF projects can and do bring together people who may not work together otherwise. A diverse team is a good way to move forward on tricky problems like abuse in private messaging and rethink basic conceptions about the problem and potential approaches. For me personally, I also get to learn from colleagues that are experts in topics I’m quite ignorant about and see how those thoughts impact my other research.”
The research will focus on developing better cryptographic tools for privacy-aware abuse detection in encrypted settings, such as detection of viral, fast-spreading content, while also ensuring these tools are consistent with applicable privacy and content-moderation laws.
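One way to picture viral-content detection without reading messages is a fingerprint-and-count scheme. The sketch below is a hypothetical illustration, not the Cornell team's design: clients attach a hash of the plaintext to each encrypted message, and the server counts repeated fingerprints against a threshold (`VIRALITY_THRESHOLD` is an invented parameter).

```python
import hashlib
from collections import Counter

# Hypothetical sketch (not the project's actual cryptography): the server
# never sees plaintext, only a deterministic fingerprint, and flags content
# that is forwarded more times than a threshold.

VIRALITY_THRESHOLD = 3  # example value; a real system would tune this

def fingerprint(plaintext: bytes) -> str:
    """Deterministic content fingerprint; identical messages match on purpose."""
    return hashlib.sha256(plaintext).hexdigest()

class ModerationCounter:
    """Server-side tally of fingerprints, with no access to message contents."""

    def __init__(self, threshold: int = VIRALITY_THRESHOLD):
        self.threshold = threshold
        self.counts: Counter = Counter()

    def observe(self, fp: str) -> bool:
        """Record one forwarding; return True once the content goes 'viral'."""
        self.counts[fp] += 1
        return self.counts[fp] >= self.threshold

counter = ModerationCounter()
meme = fingerprint(b"fast-spreading forwarded message")
flags = [counter.observe(meme) for _ in range(4)]
# flags: [False, False, True, True] -- flagged only after crossing the threshold
```

Note the limitation this toy exposes: deterministic hashing leaks which users sent identical content, which is precisely why the research aims at more careful cryptographic tools that balance detection against privacy.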
Ristenpart envisions several different outcomes from their work, including a better understanding of the privacy expectations of messaging app users, new ways to think about the law and policy implications around the use of such apps, and a basic toolkit of new cryptographic algorithms to help manage communications.
Secure and Trustworthy Cyberspace program
NSF has played a critical role in the development of encryption technology, dating back to work in the 1970s on the development of public-key cryptography, which makes secure, online communications possible. The research was advanced under NSF’s Secure and Trustworthy Cyberspace program, or SaTC, which supports projects designed to protect and preserve the growing social and economic benefits of cyber systems while ensuring security and privacy. In 2021, NSF invested more than $10 million across four large SaTC awards. Along with the Cornell-led team, other projects will focus on understanding how unsubstantiated information spreads online, mitigating the impact of online disinformation, and securing browser operations. Over the past year, SaTC’s total investment in cybersecurity and privacy research was nearly $80 million across more than 100 awards.
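The 1970s public-key idea mentioned above can be shown with a textbook Diffie-Hellman exchange, using insecurely small numbers purely for illustration: two parties derive the same shared secret over a public channel without ever transmitting it.

```python
# Textbook Diffie-Hellman key exchange with toy-sized, insecure parameters,
# for illustration only. Real deployments use large, standardized groups.

p, g = 23, 5              # public modulus and generator

a = 6                     # Alice's private key (kept secret)
b = 15                    # Bob's private key (kept secret)

A = pow(g, a, p)          # Alice's public value: g^a mod p
B = pow(g, b, p)          # Bob's public value: g^b mod p

shared_alice = pow(B, a, p)   # Alice computes (g^b)^a mod p
shared_bob = pow(A, b, p)     # Bob computes (g^a)^b mod p

assert shared_alice == shared_bob  # both arrive at the same secret
```

This key-agreement primitive is what lets messaging apps establish encryption keys in the first place.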
“To achieve a reliable cyberspace that enhances our nation’s economic, security, and socio-technical leadership requires investments in foundational research that seeks innovative ideas to address cybersecurity and privacy challenges and results in trustworthy and resilient computing systems and online services that enhance our digital experiences,” said NSF Program Director Jeremy Epstein.
Ristenpart also believes that the work initiated under the program will have longer-term impacts on the very nature of cybersecurity research. “We haven’t historically had a research community focus on these problems. They require a different research background than traditional computer security expertise. It’s a bit more human-facing. There are people in cybersecurity and privacy areas discussing how we mature and develop a research community focusing on tech harassment. Hopefully in 5 to 10 years, we have areas for those interested in security and trust issues.”