Facebook’s messaging app for under-13s, Messenger Kids, which launched two years ago pledging a “private” chat space where kids could talk only with contacts specifically approved by their parents, has run into an embarrassing safety issue.
The Verge obtained messages Facebook sent to an unknown number of parents of the app’s users, informing them the company had found what it couches as “a technical error”: a friend of a child could create a group chat in the app and invite one or more of their own parent-approved contacts, even though those secondary contacts had never been approved by the first child’s parent.
Facebook did not make a public disclosure of the safety issue. We’ve reached out to the company with questions.
It earlier confirmed the bug to The Verge, telling it: “We recently notified some parents of Messenger Kids account users about a technical error that we detected affecting a small number of group chats. We turned off the affected chats and provided parents with additional resources on Messenger Kids and online safety.”
The issue appears to have arisen from how Messenger Kids applies permissions in group chats: the multi-user chats apparently bypass the parental-approval system that governs whom kids can chat with one-on-one.
But given the app’s support for group messaging, it’s pretty incredible that Facebook engineers failed to enforce an additional layer of checks on friends of friends, to prevent unapproved users (who could include adults) from being able to connect and chat with children.
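To make the failure mode concrete, here is a minimal sketch of the kind of pairwise approval check that appears to have been missing. Everything in it — the `approved_contacts` map, the function name, the sample users — is hypothetical and invented for illustration; it is not Facebook’s actual code, just one way to enforce parent approval against every member of a group rather than only against the inviter.

```python
# Hypothetical illustration only: maps each child's user ID to the set
# of contact IDs their parent has approved for one-on-one chat.
approved_contacts: dict[str, set[str]] = {
    "alice": {"bob"},           # Alice's parent approved Bob only
    "bob": {"alice", "carol"},  # Bob's parent approved Alice and Carol
    "carol": {"bob"},           # Carol's parent approved Bob only
}

def can_add_to_group(new_member: str, current_members: list[str]) -> bool:
    """Allow the add only if the new member and every existing member
    are mutually parent-approved. A check scoped to the inviter alone
    (e.g. Bob) would let Carol into a chat that includes Alice, whose
    parent never approved her -- which matches the reported flaw."""
    for member in current_members:
        mutually_approved = (
            new_member in approved_contacts.get(member, set())
            and member in approved_contacts.get(new_member, set())
        )
        if not mutually_approved:
            return False
    return True

# Bob starts a group with Alice, then tries to invite Carol:
print(can_add_to_group("carol", ["bob", "alice"]))  # False: Alice never approved Carol
print(can_add_to_group("carol", ["bob"]))           # True: Bob and Carol are mutually approved
```

The point of the sketch is simply that group invitations need to be validated against every participant’s approval list, not just the inviter’s; a check that stops at the inviter produces exactly the friends-of-friends leak described above.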
The Verge reports that “thousands” of children were left in chats with unauthorized users as a result of the flaw.
Despite Facebook’s long history of playing fast and loose with user privacy, at the 2017 launch of Messenger Kids the company’s then head of Messenger, David Marcus, was quick to throw shade at other apps kids might use to chat, saying: “In other apps, they can contact anyone they want or be contacted by anyone.”
Turns out Facebook’s Messenger Kids has also allowed unapproved users into chatrooms the company billed as safe spaces for kids, in an app it also claimed had been developed in “lockstep” with the FTC.
We’ve reached out to the FTC to ask if it will be investigating the safety breach.
Friends’ data has been something of a recurring privacy black hole for Facebook. The expansive permissions the company wrapped around it enabled, for example, the misuse of millions of users’ personal information without their knowledge or consent, when the now defunct political data company Cambridge Analytica paid a developer to harvest Facebook data to build psychographic profiles of US voters.
The company is reportedly on the verge of being issued with a $5BN penalty by the FTC, related to an investigation into whether it breached earlier privacy commitments made to the regulator.
Various data protection laws govern apps that process children’s data, including the Children’s Online Privacy Protection Act (Coppa) in the US and the General Data Protection Regulation in Europe. And while the Messenger Kids flaw raises potential privacy issues, given that children’s data may have been shared with unauthorized third parties as a result of the “error”, the main concern for parents is likely the safety risk: their children being exposed to people they have not authorized, in an unsupervised video chat environment.
On that issue, current laws have less of a support framework to offer. In Europe, though, rising concern about the range of risks and harms kids can face online has led the UK government to move to regulate the area.
Its recently published white paper sets out a plan to regulate a broad range of online harms, including a proposed mandatory duty of care requiring platforms to take reasonable steps to protect users from harms such as child sexual exploitation.