The beauty of WhatsApp was – and in many ways still is – the simplicity of its one-to-one messaging. With over 200 million users in India, it also has a massive fake news, misinformation and plain malicious rumour problem. The platform is now rife with misinformation of every kind: political, religious, historical, medical, social and legal.
Local administrations often don’t know what to do when mobs, incited by messages, start gathering and rioting, apart from shutting down Internet access. According to various internet shutdown trackers, India had 70 Internet shutdowns in 2017; in the first six months of 2018, we were already at 65.
More recently, mobs have attacked and killed people after the spread of a video clip warning about gangs kidnapping children. This is going to get worse. It is a complex problem with no single solution: there is no silver bullet here.
Solutions include counter-speech, user education and the debunking of misinformation by both government and media. We also need strong law enforcement to prevent mobs, as well as speedy justice for victims (as a deterrent).
The challenge in dealing with fake news and misinformation is that WhatsApp’s end-to-end encryption prevents even the platform from accessing messages. The encryption gives users privacy while messaging, but given the anonymity of forwarded messages, it also makes the platform an enabler for the spread of misinformation.
The challenge of enforcement on social media, as it was in the 66A case earlier this decade, is that messages online are both communication (person-to-person, and hence private) and media (for wider consumption, and hence public). The solution for WhatsApp as a platform lies in separating the public from the private: give users power over what they make public and allow to be forwarded, thereby holding them accountable for what they choose to make public.
Here are some changes that are recommended for the platform:
- Users should be able to mark messages as either public (media) or private (P2P message). The default setting for all messages should be private. This will hurt virality on the platform, but that’s a price it will have to pay for bringing in accountability. It will also create friction while forwarding: users will be frustrated when they cannot forward certain messages. WhatsApp could use a slightly different background colour for public messages.
- The original sender/creator of the message should have the power to allow a message to be forwarded (and made public). This ensures that a message that was meant to be private cannot be made public (by forwarding) without consent. It also attributes intent when an original sender/creator chooses to make a message public. To forward even your own message to multiple people, you would first have to make the message public.
- When a creator makes a message public, the message gets a unique ID, which is tagged with the creator’s ID. This means that the message, when public, is “media” with proper attribution to the creator, every time it is forwarded. A log is kept by WhatsApp only if the message is public. This allows both the platform and law enforcement agencies to trace the message back to the creator. From a platform perspective, this enables two things: WhatsApp can suspend the offending account, and it can disable the message wherever it has been forwarded.
- Allow users to report forwarded/public messages as misinformation, which can then be reviewed by WhatsApp. WhatsApp already reviews spam-related complaints. Users should be able to identify the sender and/or message ID by selecting the message and tapping on the information (i) icon that appears.
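The public/private scheme proposed above can be sketched as a simple data model. This is purely illustrative (the `Message` class, field names and log structure are hypothetical; WhatsApp's internals are not public), but it shows how a private-by-default flag, consent-based publishing and a public-only trace log would fit together:

```python
import uuid

class Message:
    """Hypothetical model of the proposed public/private message scheme."""

    def __init__(self, creator_id, text):
        self.creator_id = creator_id
        self.text = text
        self.public = False      # private (P2P) by default
        self.message_id = None   # assigned only when the creator opts in

    def make_public(self, trace_log):
        """Creator consents to forwarding; message gets a traceable ID.

        The platform logs the (message ID -> creator ID) mapping only for
        public messages, so private chats stay out of any log.
        """
        self.public = True
        self.message_id = str(uuid.uuid4())
        trace_log[self.message_id] = self.creator_id
        return self.message_id

    def forward(self):
        """Forwarding is allowed only for public messages."""
        if not self.public:
            raise PermissionError("private messages cannot be forwarded")
        # A forwarded copy carries the ID and attribution, not anonymity.
        return (self.message_id, self.creator_id, self.text)

# Example: a message starts private and becomes forwardable only on consent.
trace_log = {}
m = Message("user-123", "hello")
# m.forward() would raise PermissionError at this point
mid = m.make_public(trace_log)
forwarded = m.forward()
```

Disabling a reported message platform-wide then amounts to looking up its ID in the log and tombstoning every copy that carries that ID.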
What’s changed now?
Now, following the recent incidents, WhatsApp is moving to fix the problem and has introduced a new rule that changes how messages are forwarded in the app.
WhatsApp has already announced that it is limiting the number of chats to which a message can be forwarded at a time. Globally, the limit is 20 chats. In India, it is 5 chats, because India has a particularly severe problem of misinformation spreading through WhatsApp.
“Today, we’re launching a test to limit forwarding that will apply to everyone using WhatsApp. In India – where people forward more messages, photos, and videos than any other country in the world – we’ll also test a lower limit of 5 chats at once and we’ll remove the quick forward button next to media messages. We believe that these changes – which we’ll continue to evaluate – will help keep WhatsApp the way it was designed to be: a private messaging app,” the company announced in a blog post.
Even WhatsApp has realized that the ease of forwarding is being misused to spread fake audio and video clips. To control this, the chat app is removing the quick forward button. You will have to follow a longer method to forward any media file to someone: long-press the media and tap on the forward option to send it to others.
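The per-country forward limit described above is essentially a small configuration check at send time. A minimal sketch (function and table names are my own; only the 5/20 limits come from WhatsApp's announcement):

```python
# Announced forward limits: 5 chats at once in India, 20 elsewhere.
FORWARD_LIMITS = {"IN": 5, "default": 20}

def max_forward_chats(country_code):
    """Return how many chats a message may be forwarded to at once."""
    return FORWARD_LIMITS.get(country_code, FORWARD_LIMITS["default"])

def forward_to(chats, country_code):
    """Reject a forward that exceeds the country's limit."""
    limit = max_forward_chats(country_code)
    if len(chats) > limit:
        raise ValueError(f"can forward to at most {limit} chats at once")
    return chats
```

Note that such a limit throttles virality rather than blocking it: a determined user can still forward in multiple batches, which is why WhatsApp pairs it with added friction like removing the quick forward button.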
Making the internet and its allied technologies safe for users is the moral responsibility of technology companies. But what is WhatsApp’s business model, given that it carries no ads?
At the time of its launch, WhatsApp was a paid app. Priced around $1 per download, it made money from the sheer volume of downloads. In 2014, Facebook acquired WhatsApp for US$19 billion, and in 2016 the app became free, scrapping its subscription charges. Facebook maintains that WhatsApp messages are entirely safe owing to their end-to-end encryption: every message sent is encrypted, and only the receiver (not even WhatsApp) holds the key.
So, what’s the catch? If one goes by the adage “if you’re not paying for a product, you are the product”, then we are yet to see what plans Facebook has for WhatsApp.