Is biometric age-assurance technology the future or a logistical nightmare?
A major priority for lawmakers today is ensuring that users, especially minors, only access age-appropriate content online. Age assurance technology and biometric verification are becoming increasingly important across sectors such as retail, security, and content moderation. Methods range from simple self-declaration (e.g. checking a box) to more sophisticated techniques like ID document verification, biometric analysis, and facial age estimation.
However, these technologies also raise significant privacy, ethical, and security concerns, requiring a careful balance between user protection, privacy, and business outcomes.
The global legislative push for biometric age-assurance
The United States
In the United States, there has been a significant push towards implementing age verification laws, with various states enacting specific regulations to protect minors online.
In Louisiana, the Age Verification Law mandates that platforms hosting a significant proportion of adult content verify the ages of their users through government-issued IDs or digital ID cards. California's Age-Appropriate Design Code Act goes further - it requires online platforms likely to be accessed by children to apply high privacy settings by default and to implement comprehensive age assurance measures.
Arkansas, Texas, Virginia, and Utah have all introduced age verification requirements, often specifying acceptable methods such as state-approved digital IDs or independent third-party verification services.
This push goes beyond online content and stretches across the retail sector, especially alcohol sales. Methods like facial scanning and palm-based biometric scanning, for example, are becoming popular ways to verify age for alcohol purchases across a range of venues and establishments.
Despite these efforts, there has also been notable pushback, along with practical challenges, in the implementation of age verification technologies.
Texas has seen legal challenges against its age verification laws, with organizations like the Electronic Frontier Foundation (EFF) arguing that these laws violate First Amendment protections and place undue burdens on both users and platforms. Following this, in July 2024, the Supreme Court agreed to hear a free speech challenge to the Texas law.
In Utah, the implementation of the Online Pornography Viewing Age Requirements Act led major adult content providers to block access to their sites for Utah residents entirely, citing an inability to comply with the stringent age verification requirements. This highlights the practical difficulties and resistance from industry stakeholders when faced with rigorous regulatory demands.
Canada
Canada has also made strides in age assurance with the Protecting Young Persons from Exposure to Pornography Act, legislation designed to shield minors from inappropriate content online.
However, the Act has faced criticism for its overly broad scope, which could extend to platforms that do not primarily deal in adult content, such as Netflix. Critics also argue that such regulations can inadvertently restrict access to legitimate content, leading to unintended consequences.
The EU and the UK
In the UK, the Online Safety Act requires platforms to implement robust age verification mechanisms to prevent children from accessing harmful content. Similarly, the European Union has introduced the Digital Services Act, which enforces stricter obligations on businesses to ensure the safety of children online - such as banning targeted ads for minors.
As in the US, biometric age assurance has also been considered for the retail sector; in 2024, the UK explored adopting digital identities to verify ages for online alcohol sales, as well as biometric checks in pubs and bars.
The privacy and ethical challenges of biometric age-assurance
Age assurance technology and biometric verification present a series of challenges that need careful consideration.
One major concern is the sensitivity of biometric data. The use of biometric information, such as facial and iris data, inherently raises privacy risks, so data security is paramount. Because biometric data is highly personal and unique to each individual, it requires stringent safeguards against misuse, unauthorized access, and breaches that could expose personal information to malicious entities. In collecting and processing this data, processors should minimize data retention and apply anonymization to protect privacy wherever possible.
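As a concrete illustration of data minimization, the sketch below shows one possible shape for an age check that retains only a pass/fail outcome and a timestamp, discarding the image and the numeric estimate once the decision is made. The function and field names are hypothetical, and the estimator is a stand-in for whatever model or service an operator actually uses - this is a sketch of the principle, not a reference implementation.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Callable

@dataclass(frozen=True)
class AgeCheckResult:
    # Only the minimal outcome is retained: no image, no face template.
    over_threshold: bool
    threshold: int
    checked_at: str

def verify_age(
    image_bytes: bytes,
    estimate_age_years: Callable[[bytes], float],  # hypothetical estimator
    threshold: int = 18,
) -> AgeCheckResult:
    """Estimate age from an image, then keep only the pass/fail outcome."""
    estimated = estimate_age_years(image_bytes)
    # Only the boolean decision, the threshold, and a timestamp leave this
    # function; the raw image and numeric estimate are not persisted.
    return AgeCheckResult(
        over_threshold=estimated >= threshold,
        threshold=threshold,
        checked_at=datetime.now(timezone.utc).isoformat(),
    )

if __name__ == "__main__":
    # Stand-in estimator for illustration; a real deployment would call a model.
    print(verify_age(b"fake-image-bytes", estimate_age_years=lambda img: 24.5))
```

The design choice here is that nothing derived from the face ever leaves the verification step, which keeps the stored footprint to the bare minimum a regulator or auditor would expect.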
Additionally, the invasiveness of verification methods must be minimized; effective age verification needs to be balanced with respect for user privacy. Overly intrusive methods, such as continuous biometric monitoring or frequent ID checks, can cause discomfort and resistance, potentially driving individuals away from compliant platforms or services. Operators must adopt proportionate measures, implementing age verification methods that are appropriate to the risk level of the content or service.
Accuracy is another concern. Age verification methods must be precise enough to avoid false positives and false negatives: inaccurate verification can let minors access inappropriate content or wrongly restrict adults, both of which undermine the system's credibility and effectiveness. For example, a GOV.UK study found that accuracy varied across different age-assurance solutions, partly because broad training data for younger users is harder to obtain. Accuracy also varied across ethnic groups and skin tones, with reduced accuracy for people with darker skin tones.
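To make the false positive/negative trade-off concrete, here is a minimal sketch (with made-up numbers) of how an operator might measure both error rates for an over-18 decision. The function name and sample data are illustrative only; run separately per demographic group, the same calculation can surface the kind of accuracy gaps the GOV.UK study describes.

```python
from typing import Iterable, Tuple

def over_threshold_error_rates(
    records: Iterable[Tuple[float, float]],  # (true_age, estimated_age) pairs
    threshold: int = 18,
) -> dict:
    """Compute false positive/negative rates for an over-threshold decision.

    A false positive is a minor classified as over the threshold; a false
    negative is an adult classified as under it.
    """
    fp = fn = minors = adults = 0
    for true_age, estimated_age in records:
        predicted_over = estimated_age >= threshold
        if true_age < threshold:
            minors += 1
            fp += predicted_over
        else:
            adults += 1
            fn += not predicted_over
    return {
        "false_positive_rate": fp / minors if minors else 0.0,
        "false_negative_rate": fn / adults if adults else 0.0,
    }

# Illustrative, made-up evaluation data; real audits would use labelled test sets.
sample = [(16, 19.2), (17, 16.8), (21, 20.5), (25, 17.9), (30, 31.1)]
print(over_threshold_error_rates(sample))
```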
Operators of these systems must commit to continuous improvement, regularly updating the systems and reviewing their accuracy. Transparency is also important: operators should communicate the age verification process to users, ensuring they understand how their data is used and protected.
Pimloc's perspective
At Pimloc, we recognize the importance of protecting sensitive video data. The risks and expectations associated with video surveillance have significantly shifted and are constantly growing. This is why we created Secure Redact - to allow companies to use and manage video safely whilst protecting people’s data.
In the context of age assurance and biometric verification, Secure Redact can play a critical role in ensuring that the biometric data used for age verification is handled with the utmost care and respect for privacy. By anonymizing sensitive information, we help companies comply with regulatory demands whilst safeguarding user privacy.
Incorporating privacy into age-verification technology is not just about ethical considerations - it is also a necessity. Regulatory compliance requires stringent privacy measures, and failing to meet these standards can result in legal repercussions and a loss of user trust. Beyond compliance, there are also notable productivity benefits to embedding privacy into these systems. Efficient systems that include robust privacy protections streamline operations by reducing the need for manual data redaction and minimizing the risk of data breaches. This can lead to significant cost savings and enhance the overall productivity of a business.
Whether you need to blur faces, redact license plates, or anonymize other personal data, Secure Redact offers the tools necessary for comprehensive data protection.