Experts On Apps For Children Must Offer Privacy By Default

By ISBuzz Team
Writer, Information Security Buzz | Sep 03, 2020 12:40 pm PST

Apps, social media platforms and online games that are specifically targeted at children will now have to put privacy at the heart of their design. A code of practice outlining how children’s data should be protected has come into force and firms have 12 months to comply with the new rules. If they do not, they could face huge fines imposed by the Information Commissioner’s Office. The ICO has the power to fine firms up to 4% of their global turnover if they breach data protection guidelines.

4 Expert Comments
Craig Young, Principal Security Researcher
September 3, 2020 8:47 pm

Although it may not always be obvious, the small bits of data people generate with online activities can paint a vivid picture about who they are and what they care about. We’ve seen large-scale examples proving how just portions of this information could be leveraged to manipulate adults. It is scary to think about how much more effectively our children can be manipulated if we do not figure out effective ways to regulate the industry.

The pandemic has pushed our kids to be more reliant on the Internet for their social and academic interactions. Technology and privacy policies related to online services in schools have been a growing problem for years, and the pandemic has been like gas on a flame. Parents must choose between accepting third-party privacy policies on their children's behalf and excluding their children from class participation. It is especially critical that technology used in the classroom or for distance learning have strict rules limiting what data can be collected and how it may be used.

Chris Hauk, Consumer Privacy Champion
September 3, 2020 8:45 pm

While new protections for the privacy of children are always a welcome sight, these types of regulations need to be actually enforced, including the rules that are already in place. Hopefully this is not just another toothless "feel good" set of regulations that will be forgotten after the creators shake hands and pat each other on the back for enacting such restrictions.

Paul Bischoff, Privacy Advocate
September 3, 2020 8:44 pm

I think everyone agrees that children's data needs to be protected. The biggest obstacle to doing so is age verification. How can a website or app reliably verify a user's age? Many apps and websites are used by both children and adults, the devices kids use might also be used by adults, and much of the web's content is not age-specific. Simply asking for the user's age before granting access makes it too easy to lie. Requiring a user to upload some form of photo ID or input verifiable personal information is a burden on both platforms and users and carries significant privacy concerns of its own. The inability to reliably verify users' ages makes policy-based efforts to protect children's data a minefield to navigate and problematic to implement.

Javvad Malik, Security Awareness Advocate
September 3, 2020 8:42 pm

All personal data needs to be handled carefully by organisations, but children's data deserves even more attention. This code is a welcome introduction, and it is encouraging to see. Hopefully it will force manufacturers and service providers to consider what data will be collected, for what purposes, and how to safeguard it appropriately. It's very easy these days to set up an online service and collect personal information, but securing that information requires a culture of security that considers the implications throughout the data's entire lifecycle. Otherwise, the result will be merely a patchwork of temporary fixes.
