Is children's online data protection and privacy a priority? 

No. 44: Bringing you the news that matters in video privacy and security

A note from our Editor

Hi all,

COVID-19 has accelerated the way that children use and interact with the online world, raising growing privacy concerns around how children use and upload content to social media platforms, and how they use Edtech tools to supplement in-person learning. From a data protection perspective, responsibility for safeguarding children online and managing their sensitive data has to fall to the businesses providing these services; lawmakers can provide guidance and policy, but can only go so far. How can companies identify children's data without breaching their privacy, and then have the secure tools in place to protect this sensitive information responsibly?

Instagram has recently been reminded of this, having been hit with Meta's largest GDPR fine to date - €405m - for how it handles children's data. As one of the Big Tech giants, Meta needs to make privacy a priority, particularly when it comes to protecting children's data. Big Tech's actions on privacy matter to children and parents, to other businesses, and, to an extent, as an influence on privacy law. In the specific case of children's data, this cannot be overlooked, and privacy considerations for children's use of social media have to be approached carefully and responsibly.

In the US, California has taken bold steps in passing legislation to help do just this - protect children online. Despite pushback from some social media giants, this is an exciting step in a more privacy-forward direction, and will hopefully aid in tackling how social media companies like Snapchat, Instagram, and TikTok affect young people’s mental health, safety, and privacy. 

Edtech became popular during the COVID-19 pandemic as a way to keep children learning, and these tools have since been widely adopted and become common practice for many. But there are questions about how robust their privacy settings are: digital rights campaigners in the UK have accused these companies of mishandling children's data. The concerns centre on the transparency of data-sharing practices, particularly third-party tracking of children online.

As always, please send any feedback or topics of interest you would like to be covered. 

Seena


News

Chinese database of over 800 million faces and vehicles exposed online

A Chinese database of millions of faces and vehicle registration plates has been exposed for months online. The database, belonging to tech company Xinai Electronics, is said to have held over 800 million records from workplaces, schools, construction sites, and car parks across China.

Tech Crunch: A huge Chinese database of faces and vehicle license plates spilled online

Tech Times: Chinese Database Leaks 800M Facial and Vehicle Plate Records of Its Citizens


A tech tool allows US Police mass surveillance capabilities “on a budget”

Tech tool, "Fog Reveal", has been used by a range of US police departments, allowing them to search billions of records from 250 million mobile devices. This tool has been used over recent years - at times without search warrants - and has sparked concerns over violations of the 4th Amendment (protection from unreasonable search and seizure).

AP News: Tech tool offers police 'mass surveillance on a budget'

CBS News: Thanks to tech, police practice "mass surveillance on a budget" — no warrant required


Privacy campaigners argue Edtech companies are breaking UK data laws over children’s data

The digital rights charity 5Rights has recently presented evidence to the ICO regarding the policies of Edtech products. Its research highlights that products like Google Classroom and ClassDojo leave children's data vulnerable to tracking by third parties, and that their "opaque" privacy terms contradict UK data protection law. 

Financial Times: Edtech companies breaking UK data laws, privacy campaigners claim

Engineering and Technology: Edtech firms failing to protect children’s data, say campaigners


Meta receives €405M fine due to Instagram’s handling of children’s data 

Meta has been fined €405m (£349m) by the Irish Data Protection Commission over Instagram's handling of account set-ups for teenage users. The watchdog found that Instagram allowed users aged 13-17 to hold business accounts, which displayed their phone numbers and email addresses, and set their accounts to “public” by default.

Tech Crunch: Instagram fined €405M in EU over children's privacy

The Guardian: Instagram owner Meta fined €405m over handling of teens’ data


The Greek “Watergate” scandal raises questions about privacy and democracy

The Greek Prime Minister, Kyriakos Mitsotakis, is currently under fire from EU legislators and facing calls to resign after state intelligence wiretapped the phone of opposition party leader Nikos Androulakis “for security reasons”. The case could involve violations of the EU GDPR and raises questions about the balance between protecting EU citizens’ fundamental rights and member states’ sovereignty on national security issues.

Politico: EU and Greece veer toward standoff over wiretapping scandal

Euractiv: Greek spyware and wiretapping scandal now on EU agenda


AI Snippet of the Week

Artists in uproar after an AI-generated piece wins an Art Contest

A game designer in Colorado has faced a barrage of criticism and accusations of "cheating" after winning first place in the “Digital Arts/Digitally-Manipulated Photography” category at the Colorado State Fair Fine Arts Competition with an AI-generated entry. The piece was made with Midjourney, an AI system that generates detailed images from written prompts. 

The New York Times: An A.I.-Generated Picture Won an Art Prize. Artists Aren’t Happy

Vice News: An AI-Generated Artwork Won First Place at a State Fair Fine Arts Competition, and Artists Are Pissed


Policy Updates

California passes landmark children’s online safety law

California has recently passed the California Age-Appropriate Design Code Act, the first law of its kind aimed at protecting children's and teenagers' privacy rights online. The law requires online platforms to prioritise the safety and privacy of children, vet whether new products may pose harm to them before rolling them out, and offer privacy guardrails to under-18s by default.

New York Times: Sweeping Children’s Online Safety Bill Is Passed in California

The Washington Post: California lawmakers pass landmark children’s online safety bill


To subscribe to our fortnightly newsletter, please click here

Thanks for reading! If you have any suggestions for topics or content that you want to see covered in future, please drop a note to: info@secureredact.co.uk
