Comment

We'll protect privacy and prevent harm, writes Facebook safety boss

The head of safety at Meta, which owns Facebook and WhatsApp, says the company is working hard to get the balance right


Every day billions of people send private messages as an essential part of daily life – to check in with family and friends, talk to their child’s teacher, and even communicate with doctors. It’s why apps that use end-to-end encryption – where only the sender and recipient can access the contents of a message – are relied on by the overwhelming majority of Brits to keep their private messages, and all the personal information they contain, safe from hackers, fraudsters and criminals.

At Meta, which owns Facebook and WhatsApp, we know people expect us to use the most secure technology available, which is why all the personal messages you send on WhatsApp are already end-to-end encrypted, and why we’re working to make it the default across the rest of our apps.

As we do so, there’s an ongoing debate about how tech companies can continue to combat abuse and support the vital work of law enforcement if we can’t access your messages. We believe people shouldn’t have to choose between privacy and safety, which is why we are building strong safety measures into our plans and engaging with privacy and safety experts, civil society and governments to make sure we get this right.

Our three-pronged approach is focused on preventing harm from happening in the first place, giving people more control, and quickly responding should something occur.

First, we will prevent harm by using proactive detection technology that looks for suspicious patterns of activity and takes action on concerning accounts. If someone repeatedly sets up new profiles or messages a large number of people they don’t know, we quickly intervene to restrict or ban them. This technology is already in place and we’re working to improve its effectiveness.  

We are taking extra steps to protect under-18s such as defaulting them into private or “friends only” accounts and restricting adults from messaging them if they aren’t already connected. We’re also educating young people with in-app tips on avoiding unwanted interactions.

Second, alongside developing this behind-the-scenes technology, we’re giving people more ways to manage who they choose to speak with. Earlier this year, we rolled out controls to let people decide who can message them and who can’t. People can also automatically filter Direct Message requests on Instagram that contain potentially offensive words, phrases and emojis. Just like a spam filter blocks junk mail, these new controls help keep potentially harmful messages at bay. We will continue to improve these features to help protect people from messages they don’t want to see.

Third, we’re actively encouraging people to report harmful messages to us and will prompt them to do so when we think there could be a problem. Once they do, we can view the reported message, investigate the content, offer support where appropriate, and take action where necessary. Where we find abuse, we make referrals to the authorities and respond swiftly to valid requests for data to support law enforcement investigations – as we always will.

Even with billions of people already benefiting from end-to-end encryption, there is more data than ever for the police to use to investigate and prosecute criminals, including phone numbers, email addresses, and location data. In Europol’s most recent annual survey of police and judicial authorities, 85 per cent of those surveyed said this was the kind of data that was most often needed in investigations.

As we roll out end-to-end encryption we will use a combination of non-encrypted data across our apps, account information and reports from users to keep people safe in a privacy-protected way while assisting public safety efforts. This kind of work already enables us to make vital reports to child safety authorities from WhatsApp.

Our recent review of some historic cases showed that we would still have been able to provide critical information to the authorities, even if those services had been end-to-end encrypted. While no system is perfect, this shows that we can continue to stop criminals and support law enforcement.

We’ll continue engaging with outside experts and developing effective solutions to combat abuse because our work in this area is never done. We’re taking our time to get this right and we don’t plan to finish the global rollout of end-to-end encryption by default across all our messaging services until sometime in 2023. As a company that connects billions of people around the world and has built industry-leading technology, we’re determined to protect people’s private communications and keep people safe online.

Antigone Davis is the Global Head of Safety at Meta
