WhatsApp says it would rather be blocked in the UK than be forced by the Online Safety Bill to weaken its encrypted messaging system.
Its head, Will Cathcart, said the app would refuse any request to make its encrypted messages less private.
Earlier, the messaging app Signal said it might stop operating in the UK if the bill required it to scan messages.
The government said it is possible to have both privacy and child safety.
Ofcom, the communications regulator, says more than seven in ten adults who are online use WhatsApp to send messages.
Child abuse material
With end-to-end encryption, messages are scrambled so that even the company that runs the service can’t see what’s in them.
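As a rough illustration of that idea, the toy sketch below uses a simple XOR cipher; WhatsApp's real system is built on the far stronger Signal protocol, and the key exchange here is assumed away. The point is only that the relaying server handles ciphertext and never holds the key, so it cannot read the message.

```python
import os

def xor_encrypt(message: bytes, key: bytes) -> bytes:
    # Toy XOR cipher for illustration only; real messaging apps use
    # the Signal protocol, not anything like this.
    return bytes(m ^ k for m, k in zip(message, key))

# In a real app the two endpoints agree on this key via a handshake
# (e.g. Diffie-Hellman); here we simply generate it on the sender's side.
key = os.urandom(32)

plaintext = b"meet at noon"
ciphertext = xor_encrypt(plaintext, key)

# The server relays only `ciphertext` and never sees `key`,
# so the company running the service cannot read the content.

# The recipient, who holds the key, recovers the plaintext.
decrypted = xor_encrypt(ciphertext, key)
assert decrypted == plaintext
```

Because XOR encryption is symmetric, the same function both encrypts and decrypts; the security of the whole scheme rests entirely on the key never leaving the two endpoints.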
But critics of the Online Safety Bill say it grants Ofcom the power to require private encrypted-messaging apps and other services to adopt “accredited technology” to identify and remove child-abuse material.
Undermining the privacy of WhatsApp messages in the UK would undermine it for users everywhere, Mr. Cathcart said.
“Our users all over the world want security. 98% of our users are outside the UK, and they don’t want us to make the product less safe,” he said. The app would rather be blocked in the UK than compromise that security.
“We’ve recently been blocked in Iran, for example. We’ve never seen a liberal democracy do that,” he added.
Meredith Whittaker, the president of Signal, previously told BBC News that the company “would absolutely, 100% walk” and stop doing business in the UK if the bill forced it to make its encrypted messaging system less private.
She later tweeted she was “looking forward to working with @wcathcart and others to push back.”
A day later, Mr. Cathcart replied, “It’s very important that we work together to push back, and I’m honored to be able to do so.”
When asked if he would go as far as Signal, Mr. Cathcart replied, “We won’t make WhatsApp less safe. We’ve never done that, and we’re okay with being blocked in other parts of the world.” And he feared the UK would set an example other nations might follow.
“When a liberal democracy asks, ‘Is it OK to look through everyone’s private communications for illegal content?,’ it gives other countries with different ideas of what is illegal content the confidence to ask for the same thing,” Mr. Cathcart said.
The government and many organizations that work to protect children have said for a long time that encryption makes it harder to stop the growing problem of online child abuse.
The Home Office said, “It is important that tech companies make every effort to ensure that their platforms do not become a breeding ground for pedophiles.”
The National Society for the Prevention of Cruelty to Children (NSPCC) says that research shows crimes in the UK involving grooming and child abuse images have risen sharply.
Richard Collard from the charity said that the Online Safety Bill “will rightly make it a legal requirement for platforms to find and stop child sexual abuse happening on their sites and services, and companies could get ready by developing technological solutions that protect the safety and privacy of all users, including that of child abuse victims.”
“Experts have shown that it is possible to deal with child abuse material and grooming in environments where everything is encrypted from beginning to end,” he said.
“The Online Safety Bill does not ban end-to-end encryption,” the government said.
“It is not a choice between privacy or child safety – we can and we must have both.”
Critics, however, say that the only way to ensure encrypted messages contain no child sexual abuse material is for services to scan them on the user’s device, such as a phone, before they are encrypted and sent. This client-side scanning, they argue, undermines the privacy that encryption is supposed to provide.
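A minimal sketch of what such client-side scanning could look like, under simplified assumptions: real proposals use perceptual hashes that match near-duplicate images, whereas this toy version uses exact SHA-256 matching, and the blocklist, `encrypt`, and `transmit` stubs are hypothetical placeholders. The key structural point is that the check runs on the sender's device before any encryption happens.

```python
import hashlib

# Hypothetical on-device blocklist of hashes of known-illegal content.
# (Real systems use perceptual hashing; exact SHA-256 is a simplification.)
BLOCKLIST = {hashlib.sha256(b"known bad file").hexdigest()}

def client_side_scan(payload: bytes) -> bool:
    """Return True if the payload matches the blocklist.

    This runs on the sender's device *before* encryption, which is why
    critics say it sidesteps, rather than breaks, end-to-end encryption.
    """
    return hashlib.sha256(payload).hexdigest() in BLOCKLIST

def encrypt(payload: bytes) -> bytes:
    # Placeholder cipher standing in for the app's real encryption.
    return bytes(b ^ 0x5A for b in payload)

def transmit(ciphertext: bytes) -> None:
    # Placeholder for sending the ciphertext to the server.
    pass

def send(payload: bytes) -> None:
    if client_side_scan(payload):
        # Flagged content is blocked before it is ever encrypted.
        raise ValueError("payload matched blocklist; not sent")
    transmit(encrypt(payload))  # encryption happens only after the scan
```

Note that nothing in this mechanism is specific to child abuse material: swapping in a different blocklist repurposes the same code, which is precisely the concern Mr. Cathcart raises below about other countries arriving with different lists.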
Lawyer Graham Smith wrote on Twitter, “You could argue that you don’t break a fence by digging around the end of it, which is literally true, but what good does that do if the goal is to break into private property? And once the hole is dug, the fence might as well not be there.”
“Tool for mass surveillance”
And Mr. Cathcart asked, “If companies put software on people’s phones and computers to check the content of their communications against a list of illegal content, what happens when other countries show up with a different list of illegal content?”
Dr. Monica Horten of the Open Rights Group, which campaigns for digital rights, said, “With more than 40 million users of encrypted chat services in the UK, this turns it into a mass-surveillance tool, which could hurt privacy and free speech rights.”
The Information Commissioner’s Office, which says it works closely with Ofcom, told BBC News that any actions that could weaken encryption must be “necessary and proportionate.”
“Less invasive measures should be used when they are available,” it said. And it backed “technological solutions that make it easier to find illegal content without putting everyone’s privacy at risk.”