As Meta continues to advance its full messaging integration plan – despite various governments and child safety groups advising against the initiative – the company has now slowed its progress somewhat, which could open the door to more discussion of the potential negative impacts of the project.
Back in 2019, Facebook announced its plan to merge the messaging functionalities of Messenger, Instagram and WhatsApp, which would provide users with a universal inbox, with all of their message threads from each app accessible on any of the three platforms. That would simplify cross-connection, while also opening the door to more opportunities for brands to connect with users in the messaging tool of their choice.
But child safety experts raised the alarm, and several months after Facebook’s initial announcement, representatives from the UK, US and Australian Governments sent an open letter to Facebook CEO Mark Zuckerberg requesting that the company abandon its integration plan, which would also, by necessity, include an expansion of end-to-end encryption to all of Facebook’s – now Meta’s – messaging options.
Because WhatsApp messages are encrypted by default, the only way to integrate the other platforms is to bring them up to the same level of security. On one hand, that's a positive, as it provides more messaging privacy, but on the other, it could also further shield criminal activity, as no one, not even Meta itself, is able to view encrypted discussions. In effect, then, the integration would mean a massive expansion of the company's encrypted communications network.
As noted, Meta has continued to make steady progress on the initiative despite opposition, but last weekend, in an opinion piece for The Telegraph in the UK, Meta's Global Head of Safety Antigone Davis said that the company is now slowing its progress somewhat, in order to ensure that ‘we get this right’.
As explained by Davis:
“At Meta, we know people expect us to use the most secure technology available, which is why all of the personal messages you send on WhatsApp are already end-to-end encrypted and why we’re working to make it the default across the rest of our apps. As we do so, there’s an ongoing debate about how tech companies can continue to combat abuse and support the vital work of law enforcement if we can’t access your messages. We believe people shouldn’t have to choose between privacy and safety, which is why we are building strong safety measures into our plans and engaging with privacy and safety experts, civil society and governments to make sure we get this right.”
Davis says that Meta’s using a ‘three-pronged approach’ to address these safety concerns while maintaining privacy. That strategy involves using proactive detection technology to look for suspicious patterns of activity in messaging, providing more controls for users to filter DM requests, and encouraging users to report concerning activity.
Davis says that by implementing these measures, Meta will be able to meet the requirements of law enforcement, without having to compromise on user privacy.
“As we roll out end-to-end encryption we will use a combination of non-encrypted data across our apps, account information and reports from users to keep them safe in a privacy-protected way while assisting public safety efforts. This kind of work already enables us to make vital reports to child safety authorities from WhatsApp.”
But that will take more development, which will slow progress. Meta had initially said that it planned to have the full integration in place by 2022, but now, Davis has pushed that time frame back.
“We’re taking our time to get this right and we don’t plan to finish the global rollout of end-to-end encryption by default across all our messaging services until sometime in 2023. As a company that connects billions of people around the world and has built industry-leading technology, we’re determined to protect people’s private communications and keep people safe online.”
Which may not be a significant extension, but it could still give authorities more time to plead their case to Meta, and to push for revisions to the plan, which will eventually see all messages in all of its apps encrypted by default.
But even so, it seems like Meta is pretty set on its messaging merger strategy. Part of the additional motivation here could be that welding together the company’s messaging back-end would enable Meta to argue that its platform, as a whole, can’t be broken up.
Meta is under various antitrust investigations, with some regulators recommending that its previous acquisitions of Instagram and WhatsApp be re-reviewed, and potentially rolled back if they’re found to have been motivated by anti-competitive intent. If any of those rulings don’t go Meta’s way, the company could be forced to sell off Instagram and/or WhatsApp – but if its messaging back-end is integrated, Meta could feasibly argue that its components can’t actually be split up, as they’re now all part of one broader platform.
Which could be another reason why Meta’s so keen to push ahead, despite opposition to the plan – but maybe, with an extra 12 months added to the process, more debate can be had, which could halt the change.
Again, the strongest counter-argument here is from child safety groups, who say that broader messaging encryption will provide more protection for criminal groups.
The National Society for the Prevention of Cruelty to Children, for example, has argued that any move to further restrict access to messaging platforms by law enforcement will increase the potential for use of these platforms among perpetrator groups.
“Private messaging is at the front line of child sexual abuse, but the current debate around end-to-end encryption risks leaving children unprotected where there is most harm.”
This is a major point of debate, but so too is individual privacy and choice, and the issue underlines the balance and nuance required in such discussions: from an optimistic perspective, this is a good move, but bad actors will also be able to exploit the same protections for their own purposes.
Which is the case in virtually every social media debate – most systems and processes have, in a general sense, a positive impact on interaction and engagement, but the minority of criminals and groups seeking to manipulate the system are generally also able to glean some benefit from the same updates.
The latter can be far more damaging, but the former caters to more people. Which is why there are no easy answers in such considerations.
In essence, the messaging encryption debate is a microcosm of many other arguments over algorithms and systematic processes – do you go with the change that will provide the most benefit to the most people, or do you seek to restrict it, even if doing so will lessen overall user satisfaction, and thus, retention and performance?