Several Indian WhatsApp users and groups are among those globally that circulate child pornography on the messaging service, an Israeli company has found.
An investigation of public WhatsApp groups by Israeli online safety startup AntiToxin Technologies in December revealed that hundreds of such groups worldwide spread child pornography openly, shielded by the service’s end-to-end encryption. These groups were discovered via third-party apps that often included “adult” sections and offered invite links to users trading images of child exploitation.
“We did, in fact, find Indian users and groups that were disseminating child pornography. I don’t have an exact number, but there was a considerable amount of participants with ‘+91’ as the (India) international access code for the phone numbers that identified them as participants in these groups,” AntiToxin’s CMO Roi Carthy told ET.
The research was conducted only on public WhatsApp groups, where participants add themselves via an invite link. WhatsApp introduced an invite link feature for groups in late 2016, making it easier to discover and join groups without knowing any members.
While Google deleted these WhatsApp discovery apps from its Play Store, WhatsApp’s invite link feature is still live.
“In our opinion, hiding behind ‘encryption’ as an excuse for not monitoring these groups in particular is completely invalid and disingenuous,” Carthy added.
WhatsApp’s end-to-end encryption ensures that only those communicating with each other can read or see what’s sent, and nobody in between, not even WhatsApp. However, WhatsApp does have access to group names and their accompanying display photographs.
“We deploy our most advanced technology, including artificial intelligence to scan profile photos and actively ban accounts suspected of sharing this vile content. We also respond to law enforcement requests in India and around the world,” a WhatsApp spokesperson said in an email. “Sadly, because both app stores and communications services are being misused to spread abusive content, technology companies must work together to stop it.”
WhatsApp has 300 employees in total, and it is unclear how many are assigned to moderate content on a platform with 1.5 billion users. Parent company Facebook has 30,000 moderators keeping an eye on content shared by the 2.27 billion monthly active users of its social media site.
“If they are to avoid aggressive regulation, they will need to find a way to be more socially responsible, which may not mean prohibiting the sharing of illegal content, but might mean eliminating tools which allow their platform to enable anti-social activities en masse. For example, eliminating group messaging or mass forwarding capabilities,” said Brian Wieser, an analyst at Pivotal Research Group.
Wieser said WhatsApp needs to invest more aggressively to minimise “socially destructive activities” enabled by its platform. He added that messaging services may end up being regulated like telephones, where authorities always have a ‘backdoor’ to access communications.
The Indian government recently proposed to amend the IT Act to ask companies to enable traceability of messages on platforms such as WhatsApp to stem the circulation of fake news and child pornography. WhatsApp said its end-to-end encryption policy does not allow for traceability.
“Encryption is a wonderful invention to protect the privacy of individuals and their communications. It is neither sought nor required when persons share child pornography links on WhatsApp. Liability for such content will make WhatsApp and other intermediaries invest in HR and tech, which is currently, unfortunately, done only as a benefaction,” said Arghya Sengupta, founder of Vidhi Centre for Legal Policy.
India’s Supreme Court has asked global technology giants such as Google, Facebook and Yahoo! to take steps to control the circulation of child pornography and rape videos.
Originally Published – https://tech.economictimes.indiatimes.com/news/mobile/whatsapps-child-pornography-problem-in-india/67427022