Children – Data Games with their Future

There are three data points to think about:

  • Instagram is working on a version for children under 13
  • Proctoring applications are increasingly accepted by educational institutions
  • The Metaverse and its inroads into the world of children

Let’s start with these first, though.

Safety in the browser

  1. You may be aware of a new initiative called FLoC (Federated Learning of Cohorts). It is currently in preview/testing mode and will roll out soon enough. Google promotes it as a privacy-preserving alternative to third-party cookies. Still, it has received significant pushback; most alternative browsers are refusing to support the standard, and for now it is only available in Chrome. Will FLoC also cover students’ Google Workspace for Education ids, and what about Chromebooks used by children? Hopefully, the status is opt-out from the outset.

If you have concerns, Edge is the default browser on Windows machines, and you can also choose Firefox or Brave. The Mac’s default browser is Safari, but the other three are available there too. Meanwhile, Microsoft Edge’s new Kids Mode offers age-appropriate features and content for children aged 5-8 or 9-12. It limits the sites children can visit, and adds safe search and strict tracking prevention. The browser’s family safety feature is linked to a Microsoft account.
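On the FLoC point: a site that does not want its visitors included in cohort calculation can opt out by sending the documented `Permissions-Policy: interest-cohort=()` response header. A minimal sketch using only Python’s standard library (the local server and port here are purely illustrative):

```python
# Minimal sketch: serve pages with the FLoC opt-out header and verify it.
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class FlocOptOutHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        # The documented header that excludes this site from FLoC cohorts
        self.send_header("Permissions-Policy", "interest-cohort=()")
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<p>FLoC opt-out demo</p>")

    def log_message(self, *args):  # silence per-request logging
        pass

# Bind to an ephemeral port and serve in the background
server = HTTPServer(("127.0.0.1", 0), FlocOptOutHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# Fetch the page and inspect the header a browser would see
resp = urllib.request.urlopen(f"http://127.0.0.1:{port}/")
print(resp.headers.get("Permissions-Policy"))  # interest-cohort=()
server.shutdown()
```

In a real deployment the same header would be set in the web server or CDN configuration rather than in application code.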

Secure the gaming and entertainment experience

  1. Children have taken to Roblox in a big way, which is a sign of the times. There’s excitement about the opportunities this and similar platforms offer, but the safety issues also need urgent attention. Roblox is now moving towards a content rating system for its games, a need felt because adult role-play games and other inappropriate content have been served to kids under 13. Surprisingly, such a system was not already in place to protect children. Gamers wanting to enhance their performance take to downloading cheat codes. This is avoidable: Cisco Talos has seen several small tools that look like game patches, tweaks or modding tools but are backdoored with malware. Many of these originate in how-to videos on sites like YouTube and in the social media communities of various games. This malware can cause significant damage. Meanwhile, some games and applications are clearly meant for a specific age group. Children are known to enter incorrect age information to get access to these apps and games, and companies should do more as part of their safety and trust practices to ensure that access is limited. TikTok has been sued in Europe for collecting large amounts of data on children under 13. This includes phone numbers, pictures, videos, location and biometric data. As per the standard model, this data is monetised and transferred to third parties.
  2. Instagram’s decision to offer a version for children under thirteen is disturbing, though not very surprising. It is part of an aggressive company that has been known to ‘move fast’ and not fix things. Gaming and entertainment companies are already on the job, so why not the social network? There is a thoughtful post they have written on making the platform safe for younger members, and you can download a guide in multiple languages. Still, I can’t get over the fact that Facebook misused the phone numbers provided for 2FA security, and some say the CEO even boasted about it. The cascading consequences of that decision were evident in the latest data breach. Concern is increasing, and organisations are writing to the CEO, but it does not seem likely to change his mind. As this article explains, the company wants to reach the next generation at a very early age, utterly unmindful of the harmful effects on impressionable minds.

Prevent long term harm

  1. Proctoring is seeing rapid adoption in K-12 and other educational domains. The use case is the fair conduct of examinations; the remote app students must install on their devices is a skeuomorph of the real-world invigilator. Depending on the service the institution opts for, it could be automated or human-based proctoring. The automated or ‘AI’ version of these services is supposed to analyse
    – patterns
    – voice and other audio
    – face movements
    – eye detection
    – environment
    – mouth
    – device monitoring through tab switching behaviour and more
    I have not come across any significant public analysis of the efficacy of these algorithms. That alone needs further debate amongst stakeholders, because children are stakeholders too, and they are pushing back. As for the safety aspect of these applications: they want the student account to run with administrator privileges. These services ask for antivirus, firewall, VPN and security software to be deactivated! They want you to dial down your browser security and extensions, and to grant access to the camera, the device microphone and the file system. In effect, they seem to perform a deep scan of the device and monitor it in real time. Setting aside the invasion of privacy in accessing files without any significant disclosure by the institution or provider, it is disturbing that they bring the security of a child’s device down to nothing.

Further, providers’ privacy and data protection policies tend to be opaque, and that is putting it generously. There is little discussion or documentation on the efficacy of their algorithms, their security practices on client devices, their data storage and encryption practices, data access and sharing transparency, or how long this data is retained at the educational institution or provider level.

Wouldn’t it be more prudent to take this opportunity to re-examine how the educational system measures academic progress? It should not come at the cost of student security. That is a high price to pay, and students have not opted in to it.

Children in the metaverse

  1. Augmented and virtual reality are considered essential for creating new educational experiences, and indeed they have the potential to explain concepts in unique ways. Most of the large tech companies have made significant investments in hardware and software. Eye-tracking is key to delivering these experiences, which also means an application can capture a great deal of behavioural data whether users click or interact within the metaverse or not. Now connect this back to companies with advertising-based models: the dominant social platform is optimistic about extended reality.

What protections are built in for children, whether on the educational side or on the entertainment and social side of the metaverse?

Connecting the dots

If you triangulate the three issues of data collected at school, early inroads through social media (under 13), and behavioural data from the metaverse, it paints a troubling picture. Should children not have a say in their choices?

Safety and security practices should be at the core of our digital world. Why do companies and institutions feel it is okay to write applications or processes that ask the end-user to drop core safety settings? It just does not feel right; safety should be non-negotiable when designing good applications.

What does it mean for a company’s ESG disclosures that there is no concerted oversight of these activities?

Is it left to individual stakeholders or pressure groups to push back against a behemoth, or can brand purpose put a stake in the ground and define what cannot be done? Can the brand lead?




Browser safety
Microsoft Kids Mode for Edge
Instagram – the pushback
Instagram and reality
2FA and Facebook
Talos comments on backdoors and games
Discussing standards in XR
TikTok and Europe





