The rumour that WhatsApp had commenced an assault on privacy precipitated a mass exodus.
Most of the rumours were untrue, but it was easier to swallow the gossip than dig for the truth. Millions of people, believing that the content of their WhatsApp messages was about to be shared with its parent company, Facebook, decided to jump ship. It didn’t matter where; just anything but WhatsApp.
One competing service, Signal, received an influx of 7.5 million new users in the first three weeks of January. Another, Telegram, amassed 25 million. Even ICQ, the pioneering but largely forgotten messaging service from the 1990s, experienced a resurgence as people sought alternatives.
The result was a tumultuous shift in the messaging landscape, based on unfounded fears and clumsily handled PR. As WhatsApp continues its attempt to minimise the damage, both Signal and Telegram are sensing a huge opportunity to grow.
But that growth runs the risk of triggering the same problems that WhatsApp has faced: how to make a free service profitable, and how to ethically run a private messaging platform that could be used for criminal activity.
WhatsApp will have wondered how its attempt to be more transparent could have had such an extraordinary outcome. The update to its terms of service, announced on January 6, was intended to clarify existing policies, but was widely misinterpreted as opening a backchannel through which Facebook could read people’s messages.
“There are no changes to our data sharing with Facebook anywhere in the world,” explained Niamh Sweeney, WhatsApp’s director of public policy for Europe, the Middle East and Africa. But the damage had been done, and the service was forced to use its own app to plead with people to stay.
“WhatsApp can’t read or listen to your personal conversations,” states one message.
Those reminders fell on deaf ears. People could even be seen announcing on Facebook itself that they were leaving WhatsApp, seemingly unaware that those posts were yielding more data for Facebook than any WhatsApp chat could. In India, the government released a statement expressing concern over “the implications for the choice and autonomy of Indian citizens”, fuelling the fire further.
Ultimately, the implementation of WhatsApp’s new terms and conditions, due to take effect on February 8, was delayed until Saturday, May 15. Those three months, WhatsApp hopes, will “clear up the misinformation”.
However, people continue to be spooked. Last week, WhatsApp announced the introduction of biometric authentication – fingerprint or face ID – when linking a WhatsApp account to a computer. This was done to enhance security, using the phone’s own biometric system, which stores fingerprint and face data on the device rather than sharing it with apps. And yet many incorrectly saw it as a further invasion of privacy.
“There is no way I’m trusting Facebook with my fingerprint and face data,” read one online post. Clawing back trust will continue to be an uphill struggle.
In the meantime, Signal and Telegram have never been more popular. As they proclaim their commitment to privacy, people feel safer in their hands than in Facebook’s – although the influx of new users created some teething problems at Signal, with temporary outages due to sheer weight of numbers.
New users have been welcomed with changes designed to make them feel more at home; Telegram launched a tool that allows old WhatsApp chats to be imported directly into the app, while Signal introduced chat wallpapers, animated stickers and an “About” page, giving privacy-conscious users the golden opportunity to share information about themselves.
But in a report by Casey Newton for technology website The Verge, Signal employees expressed concern that as the platform grows – executives have set a target of 100 million users – so does the potential for it to be used for nefarious purposes. Users can create encrypted group chats of up to 1,000 people, and there is currently no mechanism for removing bad actors from the platform.
“It’s not only that Signal doesn’t have these policies in place,” said Gregg Bernstein, a former employee, to Newton. “But they’ve been resistant to even considering what a policy might look like.”
Telegram faces a similar problem. Leader of American right-wing group the Proud Boys, Enrique Tarrio, recently posted on the service: “Welcome, newcomers, to the darkest part of the web. You can be banned for spamming and porn. Everything else is fair game.”
The potential for using Signal and Telegram for mass organisation and activism means that they’re close to being social networks, and with that status comes responsibility.
In recent months that responsibility has weighed heavily on the shoulders of Facebook and Twitter, as calls for them to take a more active role in policing their platforms grow in volume. Failure to do so could mean going the way of Parler, the social network that saw a huge influx of right-wing activists around the recent US presidential election. It was effectively destroyed when big tech firms such as Apple, Google and Amazon withdrew their services – removing it from app stores and cutting off its web hosting – on account of its failure to moderate illegal activity.
The water is only going to get hotter for WhatsApp’s competitors as criminal activity proliferates. Towards the end of January, a Telegram bot was discovered that enabled users to search a database of phone numbers scraped from Facebook.
The same day, a former US ambassador, Marc Ginsberg, filed a lawsuit against Google for failing to remove Telegram from its app store, documenting numerous examples of extremist content and hate speech on the platform.
The Iranian government has already moved to ban Signal from the country for “criminal content”, and Iranians have reported difficulties using it in recent days.
It’s clear that people don’t want their private messages to be accessible. But nor do they want society to be destabilised. Striking a balance between the two will continue to be an intractable problem.