How to Protect Children's Privacy in an Era of AI
- Privacy Law In Canada
- Nov 3, 2023
- 2 min read
November 3rd, 2023 --- A LinkedIn Live session brought together panelists to discuss regulatory developments, compliance, and children's privacy in the EU, U.K., and U.S.
Participants included IAPP Research and Insights Director Joe Jones, Elizabeth Denham, Lothar Determann, and Jonathan Tam from Baker McKenzie. They addressed various questions:
Effectiveness of Fines: Panelists questioned the appropriateness and effectiveness of the substantial fines recently imposed on companies over children's data processing. They emphasized the need for clearer, more uniformly applied legal requirements, cooperation with industry, and guidance from authorities to help companies develop reasonable policies and measures before regulators resort to fines.
Age Assurance Responsibility: The question of who should be responsible for age assurance was discussed. The consensus was that telecoms, broadband providers, and device makers should not carry the primary burden. Online service providers were considered better positioned to administer age-assurance processes.
Two-Factor Age Assurance: Panelists confirmed that companies can use multiple factors for age assurance, such as self-reported age combined with other age-related clues. U.S. law and FTC guidance were cited as relevant in this context.
AI Solutions for Age Verification: Panelists considered it feasible to use AI to detect a user's likely age from text, vocabulary, or other indicators; such tools could flag accounts of concern for additional age-assurance measures (a brief illustrative sketch appears after these discussion points).
Balancing Children's Privacy and Online Safety: The discussion acknowledged the need for a balance between children's privacy and safety, emphasizing the importance of assessing trade-offs and risks on a case-by-case basis.
Age-Appropriate Content Accessibility: The U.K. Children's Code and California's code require companies to adapt content accessibility for different age groups of children, aiming for age-appropriate experiences. However, the enforcement of the California Age-Appropriate Design Code Act was temporarily halted by a federal court.
COPPA Preemption of State Laws: COPPA does not contain a broad preemption clause that displaces state privacy law across the board, but like other federal statutes it takes precedence over state law to the extent the two conflict.
Breaking Encryption for Scanning Content: Panelists expressed reservations about proposals that would require breaking encryption to scan for illegal content, citing the potential harm to privacy and security, and suggested alternative measures.
Appropriate Age Threshold for Consent: Age thresholds for parental consent vary worldwide and across different areas, such as data processing, alcohol consumption, and marriage. Consistency and clarity in age thresholds were considered necessary.
Responsibility for Child Privacy: The responsibility for child privacy was seen as a shared effort involving companies, parents, and schools. Various laws and policies globally emphasize different stakeholders' roles.
Vagueness of the Kids Online Safety Act (KOSA): Panelists recommended careful analysis of KOSA in light of the preliminary injunction against enforcement of the California Age-Appropriate Design Code Act. They acknowledged bipartisan support for federal bills such as KOSA and "COPPA 2.0" and highlighted their potential impacts on children's online privacy and safety.
Parental Consent for Children's Data: Panelists addressed whether parental consent is required for processing children's data under the GDPR, noting that other legal bases may apply and that the need for consent depends on factors such as the categories of data and the purposes of processing.
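To make the multi-factor age-assurance idea above concrete, here is a minimal, purely hypothetical sketch in Python. The AccountSignals record, the phrase list, and the thresholds are all invented for illustration; the sketch simply combines a self-reported age with coarse text clues to decide whether an account should be routed to additional age-assurance steps, and does not reflect the panelists' or any vendor's actual implementation.

```python
# Purely hypothetical sketch: combine a self-reported age with coarse text
# signals to decide whether an account should be routed to additional
# age-assurance steps. Names, phrases, and thresholds are invented for
# illustration; a real system would rely on vetted models, legal review,
# and strong privacy safeguards.

from dataclasses import dataclass, field
from typing import List, Optional

# Illustrative phrases that might correlate with a younger user.
CHILD_SIGNAL_PHRASES = {"my mom said", "after school", "my teacher", "homework"}


@dataclass
class AccountSignals:
    self_reported_age: Optional[int]  # age entered at sign-up, if any
    recent_messages: List[str] = field(default_factory=list)


def needs_further_age_assurance(signals: AccountSignals,
                                age_threshold: int = 13,
                                signal_limit: int = 2) -> bool:
    """Return True if the account should get additional age-assurance checks."""
    # Factor 1: a self-reported age below the threshold is an immediate flag.
    if signals.self_reported_age is not None and signals.self_reported_age < age_threshold:
        return True

    # Factor 2: count coarse text clues that may indicate a younger user and
    # flag the account once enough clues accumulate.
    hits = sum(
        1
        for message in signals.recent_messages
        for phrase in CHILD_SIGNAL_PHRASES
        if phrase in message.lower()
    )
    return hits >= signal_limit


if __name__ == "__main__":
    account = AccountSignals(
        self_reported_age=16,
        recent_messages=["My mom said I can play after school", "Need help with homework"],
    )
    # Prints True: the text clues alone are enough to warrant additional checks.
    print(needs_further_age_assurance(account))
```

In practice, the second factor would be a properly validated age-estimation model rather than a keyword list, and any such screening would itself need to be assessed against the privacy and proportionality concerns the panel raised.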
The discussion highlighted the complexities and ongoing developments in children's privacy and the need for balanced and effective regulatory approaches.