November 21, 2024

WhatsApp under fire from terrorism watchdog for lowering its UK age limit

The UK’s independent reviewer of terrorism legislation warns that the change will expose more young people to encrypted extremist content

Mark Zuckerberg’s Meta has drawn criticism from the UK’s independent reviewer of terrorism legislation after lowering the minimum age for WhatsApp users from 16 to 13. Jonathan Hall KC called the decision “extraordinary” and said it would give more children access to material that Meta cannot moderate, including sexually explicit and terrorist content.

Hall, the independent reviewer of terrorism legislation, told LBC radio that WhatsApp’s end-to-end encryption, under which a message can be read only by its sender and recipient, prevents Meta itself from seeing, and therefore removing, dangerous material.
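In practical terms, end-to-end encryption means the keys needed to read a message exist only on the users’ devices, so the service provider relays ciphertext it cannot decrypt and has nothing legible to scan or take down. The sketch below illustrates that principle using the open-source PyNaCl library; it is a simplified illustration of public-key messaging, not WhatsApp’s actual implementation, which is based on the more elaborate Signal protocol.

```python
# Illustrative sketch of end-to-end encryption using PyNaCl (pip install pynacl).
# This shows the principle only; WhatsApp uses the Signal protocol, which adds
# key exchange, forward secrecy, and other machinery on top of this idea.
from nacl.public import PrivateKey, Box

# Each user generates a key pair on their own device.
# The private keys never leave those devices.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts a message using her private key and Bob's public key.
sender_box = Box(alice_private, bob_private.public_key)
ciphertext = sender_box.encrypt(b"Meet at noon")

# The server in the middle sees only this ciphertext. Without a private
# key it cannot recover the plaintext, so it cannot moderate the content.
print(ciphertext.hex())

# Only Bob, holding his private key, can decrypt what Alice sent.
receiver_box = Box(bob_private, alice_private.public_key)
assert receiver_box.decrypt(ciphertext) == b"Meet at noon"
```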

“By lowering the user age from 16 to 13 for WhatsApp, they are essentially exposing three more year groups to content that they cannot regulate,” he said. “So, to me, that’s an extraordinary thing to do.”

Hall also pointed to the record number of children arrested for terrorism offences last year as evidence that young people are increasingly susceptible to terrorist content.

“In the past year, 42 minors have been arrested. That is a massive figure, the highest ever. It’s clear now that young people are especially vulnerable to terrorist content, particularly those who are unhappy,” he said. “They are searching for meaning in their lives, and they may find it in an extremist identity.”

WhatsApp announced the age change for UK and EU users in February, and it took effect on Wednesday. The platform said that protections were in place and that the change brought the UK and EU into line with the age limit in most other countries.

Child safety campaigners also criticised the decision. The campaign group Smartphone Free Childhood said it “goes against the growing national demand for big tech to do more to protect our children.”

Concerns about illegal content on WhatsApp and other messaging services have made end-to-end encryption a central issue in the Online Safety Act. The legislation gives the communications regulator, Ofcom, the power to order a messaging service to adopt “accredited technology” to identify and remove content that promotes child sexual abuse.

The government has sought to play down the significance of this provision, saying Ofcom would intervene only if content scanning were “technically feasible” and if the process met minimum standards of privacy and accuracy.

In December, Meta said it was rolling out end-to-end encryption by default on its Messenger app, with Instagram expected to follow.
