SecureRedact

Video privacy, cloud computing, and cross-border data transfers: what the ICO needs you to know

The Information Commissioner’s Office (ICO) recently released updated guidance for those operating video surveillance systems: automatic number plate recognition (ANPR), CCTV, facial recognition systems, drones, camera doorbells, workplace monitoring, and more (1). With this guidance, it is clear the ICO is trying to respond to the rapid influx of different types of video surveillance system and to establish frameworks for how they should be operated.

The ICO says relevant authorities, such as the police and local authorities, that deploy overt surveillance should continue to follow the 2013 Surveillance Camera Code of Practice, issued under the Protection of Freedoms Act 2012, as a means of good practice.

But it has also released new guidance that consolidates existing legislation and legislative developments around video and data privacy, providing a succinct and comprehensive piece of guidance on how to handle this kind of data.



The UK GDPR and DPA are still a must-comply

The ICO reminds everyone that the UK GDPR and Data Protection Act 2018 regulate how personal data is handled and processed, and so compliance with the key data protection principles is essential:

  • Lawfulness, fairness, and transparency

  • Purpose limitation

  • Data minimisation

  • Accuracy

  • Storage limitation

  • Integrity and confidentiality (security)

  • Accountability

The guidance also emphasises the importance of taking a privacy-by-design approach. 



Data protection impact assessments (DPIAs) are a big part of taking a privacy-by-design approach.

They help identify and mitigate potential risks at an early stage. In the context of surveillance cameras, a DPIA must consider whether the means of surveillance is likely to pose a high risk to individuals: either a high probability of some harm, or a lower probability of serious harm. Operators whose DPIAs identify a high level of risk even after mitigation steps have been taken are legally required to consult the ICO, and cannot begin processing until they have done so. The UK GDPR also requires a DPIA where an organisation plans to use extensive profiling with significant effects, process special category data, or systematically monitor publicly accessible places on a large scale (2); the sketch below illustrates this screening logic.
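
To make those three UK GDPR triggers concrete, here is a minimal, illustrative screening sketch in Python. The class and field names are hypothetical, and real DPIA screening involves far more nuance and legal review; this only encodes the simple "any trigger applies" rule described above.

```python
from dataclasses import dataclass

@dataclass
class ProcessingActivity:
    """Hypothetical description of a planned surveillance deployment."""
    extensive_profiling_with_significant_effects: bool
    processes_special_category_data: bool
    large_scale_public_monitoring: bool

def dpia_required(activity: ProcessingActivity) -> bool:
    """True if any of the three UK GDPR triggers named in the guidance applies."""
    return (
        activity.extensive_profiling_with_significant_effects
        or activity.processes_special_category_data
        or activity.large_scale_public_monitoring
    )

# Example: a town-centre CCTV network systematically monitoring a public square.
cctv = ProcessingActivity(
    extensive_profiling_with_significant_effects=False,
    processes_special_category_data=False,
    large_scale_public_monitoring=True,
)
assert dpia_required(cctv)  # a DPIA must be completed before deployment
```

Note that passing this kind of check is only the starting point: a DPIA itself must then document the processing, assess necessity and proportionality, and set out the mitigations.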

DPIAs remain extremely important in a post-Schrems II world, where companies collecting and sending data across borders need to ensure all the necessary risks are evaluated and sufficient data privacy protections are in place. For transfers to countries without an adequacy decision in particular, there is an added onus on taking every necessary precaution and ensuring that the data in question is not likely to be compromised when it enters that jurisdiction. These assessments also go beyond compliance risks alone, taking in broader considerations of the rights and freedoms of individuals, and the potential for material and non-material harm to individuals or wider society.


Extra vigilance when it comes to more sophisticated and AI-driven surveillance

Surveillance systems have grown more sophisticated in recent years: higher-definition images, the ability to share images and video in real time, and the size and positioning of lenses all give greater capability. This means these systems can be particularly intrusive, and so extra care needs to be taken with how they are operated.

Facial recognition technology (FRT) and body-worn cameras are both dramatically increasing in usage, meaning that a large number of people can be monitored by automated systems at any one time, many of them unaware of the privacy consequences.

Talks continue in the EU to develop an AI regulation that would seek to ban FRT in certain circumstances; however, nothing is set in stone as of yet (3). Operators should therefore weigh up whether particular surveillance systems are justified, have a clear legal basis for their use, and put concrete, written organisational policies and business practices in place that consider these privacy implications in detail and put the individual first.

The ICO’s latest video surveillance guidance does aim to give some direction on these technologies by re-emphasising one of the main issues with FRT: bias. It points to training systems on diverse datasets to reduce the chances of bias and discrimination against certain demographics.

Whilst this is true, one thing the ICO guidance doesn’t delve into is the balancing act between creating these diverse datasets and ensuring data privacy is still properly protected.

When addressing the models and datasets machines are trained on, companies also need to consider how to source diverse datasets without compromising individual privacy: whether through gaining consent, creating proxy or synthetic data, or perhaps through completely new and untried methods. Where footage is reused for purposes that do not require identity, automated redaction is another option; a minimal sketch follows below.
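
As one hedged illustration of privacy-preserving preprocessing (not a substitute for consent, and clearly unsuitable where faces themselves are the training target), the sketch below blurs detected faces in footage before it is stored or reused. It assumes Python with the opencv-python package installed; the input file name is hypothetical.

```python
import cv2  # pip install opencv-python

# Load OpenCV's bundled frontal-face Haar cascade (ships with opencv-python).
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def redact_faces(frame):
    """Blur every detected face in a single video frame, in place."""
    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(grey, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        roi = frame[y:y + h, x:x + w]
        # A heavy Gaussian blur renders the face unrecognisable in the output.
        frame[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (51, 51), 0)
    return frame

# Apply frame by frame to a hypothetical clip before it is stored or shared.
capture = cv2.VideoCapture("raw_footage.mp4")
while True:
    ok, frame = capture.read()
    if not ok:
        break
    redacted = redact_faces(frame)
    # ... write `redacted` to an output container here ...
capture.release()
```

The Haar cascade is a deliberately simple choice for a sketch; production redaction systems use far more robust detectors, since every missed face is a privacy failure rather than a cosmetic one.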

In this guidance, the ICO stresses the importance of inspiring more public trust and confidence in the use of these new surveillance systems, as well as helping operators stay well within the required legal and ethical parameters.


Addressing current concerns: safe data storage and employee monitoring

The guidance also takes into consideration the increase in remote working and remote learning, and offers some direction on how to handle employee monitoring and the recording of meetings in a way that is fair and transparent. Individuals should know that they are being monitored, and that this is legally justified and carried out for a specific purpose.

The ICO also addresses how to store and view surveillance information safely: adopting measures like encryption to prevent unauthorised access, alongside other technical controls for data held in cloud computing systems. The sketch below shows encryption at rest in miniature.
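
For illustration only, here is a minimal sketch of encrypting a clip before it leaves the operator’s control, using the widely used Python cryptography library (an assumption: it is installed, and the file and variable names are hypothetical). Real deployments would keep keys in a key management service rather than in application code.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# In practice the key lives in a key management service (KMS/HSM),
# never alongside the footage it protects.
key = Fernet.generate_key()
cipher = Fernet(key)

# Stand-in for the raw bytes of a surveillance clip (hypothetical data).
plaintext = b"...raw H.264 video bytes..."

# Encrypt before the data leaves the operator's control; only the
# ciphertext is uploaded to cloud storage.
ciphertext = cipher.encrypt(plaintext)

with open("clip_0042.enc", "wb") as f:  # hypothetical object name
    f.write(ciphertext)

# An authorised reviewer holding the key can later recover the clip.
assert cipher.decrypt(ciphertext) == plaintext
```

Key management, access control, and audit logging matter at least as much as the cipher itself; encrypting the payload only moves the problem to protecting the key.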

This is particularly pertinent for international data transfers, an area that has come under the microscope since the 2020 Schrems II judgment, as well as the European Commission’s proposals to limit cloud providers from giving jurisdictions outside the EU access to the non-personal data of EU citizens (4). These developments have a significant effect on how data is collected and shared, especially for global companies: data stored on servers outside the EU, or sent to a non-EU country without an adequacy decision, must go through risk assessments to ensure compliance with the GDPR. Many organisations rely on the free flow of data between the EU and other jurisdictions, and increasing limitations on what can be shared beyond the EU will have a significant impact on many businesses.

However, the ICO does not properly discuss the issues that can arise with the companies hosting cloud services and similar platforms.

In January 2022, the Austrian data protection authority found that the use of Google Analytics cookies by an Austrian website did not comply with EU data protection law. The technical and organisational measures, as well as the standard contractual clauses (SCCs), that Google claimed allowed the data to be stored in the US were found to be ineffective given the access US surveillance agencies have to that data (5). This was only the first such decision, with other countries’ regulators reprimanding Google for the same breach (6).

As Google Analytics is the most widely used web analytics tool, and many businesses rely on it for website performance and marketing, this case has sparked a conversation about the legitimacy of cloud-hosted systems and whether they adhere to data protection law. Cases like this will put pressure on companies to ensure safe alternatives are in place, and if more data protection authorities follow suit, it could present both legal and operational challenges.



There is a clear emphasis on privacy in this guidance, especially when considering some of the newer technologies that can be far more invasive, like those that capture sensitive biometric data.

For example, the guidance references the more contemporary phenomenon of smart doorbells, which have started to come under legal scrutiny over how they can infringe privacy rights (7). The more these new forms of data capture emerge, the greater the need for clear guidance and regulation to ensure people’s data is protected and secure.

However, there are gaps in this guidance and room for deeper analysis.

Despite these efforts, legislation will unfortunately almost always lag behind technological developments, so sound judgement and, hopefully, ethical approaches will inevitably be relied upon to uphold privacy in the early stages of AI surveillance growth.

Nevertheless, this guidance certainly signals movement in the right direction and interest in pushing these conversations to the forefront. 

We can hopefully anticipate more guidance and discussion of these topics as more legislation relating to data privacy rights arrives from the EU, along with any domestic changes. Because of the pandemic and the unprecedented challenges it brought, legislators and data protection officers (DPOs) gave businesses a lot of leeway, both in terms of introducing new regulation and in the general enforcement of data protection. As we hopefully start to emerge from the worst of it, we can expect an emphasis from authorities like the ICO on ensuring these protections and data protection rules are properly enforced.


References: 

  1. https://ico.org.uk/media/for-organisations/guide-to-data-protection/key-dp-themes/guidance-on-video-surveillance-0-0.pdf 

  2. https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/accountability-and-governance/data-protection-impact-assessments/

  3. https://www.europarl.europa.eu/RegData/etudes/IDAN/2021/698021/EPRS_IDA(2021)698021_EN.pdf 

  4. https://www.pinsentmasons.com/out-law/news/eu-foreign-access-cloud-data 

  5. https://www.natlawreview.com/article/austrian-dpa-finds-data-transfers-resulting-analytics-cookie-use-to-be-violation 

  6. https://www.euractiv.com/section/data-protection/news/france-joins-austria-says-google-analytics-data-not-protected-in-us/ 

  7. https://www.independent.co.uk/life-style/gadgets-and-tech/amazon-ring-privacy-court-gdpr-b1938963.html