CHILD ONLINE SAFETY TOOLKIT
TEN POLICY ACTION AREAS
Personal data, identity and autonomy
States parties should take legislative, administrative and other measures to ensure that children’s privacy is respected and protected by all organizations and in all environments that process their data. Legislation should include strong safeguards, transparency, independent oversight and access to remedy. States parties should require the integration of privacy-by-design into digital products and services that affect children. They should regularly review privacy and data protection legislation and ensure that procedures and practices prevent deliberate infringements or accidental breaches of children’s privacy.

Where encryption is considered an appropriate means, States parties should consider appropriate measures enabling the detection and reporting of child sexual exploitation and abuse or child sexual abuse material. Such measures must be strictly limited according to the principles of legality, necessity and proportionality.

Source: General comment No. 25 (2021), para. 70 [118]
Objective: To recognise the benefits of and respond to the current and emerging threats to privacy, identity and the agency of children in the digital world posed by the use of data including personal data, biometrics and automated decision making.
Model policy text: To ensure a holistic approach to child online safety, each of the steps below is necessary.

3a. Establish data protection frameworks, or ensure existing frameworks are effective, in providing specific protection for children’s data

Children’s rights in the online environment are intimately connected with the way their data is collected, stored and used. Data protection law and regulation for children must be accessible, effective and capable of evolving to meet emerging risks [119]. This means not only establishing the legal and regulatory frameworks, but also ensuring that they work in practice and are implemented accordingly.

3b. Establish protocols for, and limitations on, the use of automated decision making that may affect children

Standards, laws and codes of practice should ensure that children benefit from automated systems and are not penalised through automated decision making [120]. It is particularly important to avoid the potential for discrimination through automated decision making. These protocols and limitations may apply in the context of criminal justice, social welfare, health and medicine, education and the private sector, among others.
[118] General comment No. 25 (2021) on children’s rights in relation to the digital environment, UNCRC, 2021.
[119] General Data Protection Regulation, European Union, 2018.
[120] “World stumbling zombie-like into a digital welfare dystopia, warns UN human rights expert”, Office of the United Nations High Commissioner for Human Rights, 2019.