
Bagus Enrico & Partners

Protection of Children in Online Spaces: Examining Various Mechanisms for Age Verification and Parental Consent


Rapid technological development in Indonesia over recent decades has led to a significant rise in internet and social media usage across diverse demographics, including children. Social media platforms, now indispensable tools for communication and information exchange, are widely popular among Indonesian youth: internet users in the country surpassed 221 million in 2024, a penetration rate of 79.5%, of whom 34.4% are aged 12 to 27 and 16% are under 13.[1] Despite its benefits, social media poses serious risks to children’s privacy and safety, including misuse of data, insufficient protection mechanisms, and exposure to harmful content such as cyberbullying, exploitation, and unsuitable material. Many children create fake accounts to bypass age restrictions, underscoring the need for stricter filtering mechanisms by platforms such as Facebook, Instagram, and TikTok, as well as proper standardization from the government. Furthermore, reports from TaskForce KBGO indicate that 20% of online violence complaints in 2023 involved minors aged 15–18.[2] Recent events, such as the 2024 breach of Indonesia’s National Data Center, underscore the urgent need for standardized online safety measures. Meanwhile, advances in Artificial Intelligence (AI) have enabled new tools for age verification and child protection.

Against this backdrop, this article explores suitable mechanisms for ensuring child protection online, focusing on legislative frameworks such as the ITE[3] and PDP[4] Laws, analyzing global practices in age verification and parental consent, and culminating in policy recommendations for the Indonesian Government. The countries discussed were chosen based on two considerations: regional diversity (the Americas, Europe, and the Asia-Pacific) and their different stages of online child protection regulation (soft versus hard regulatory approaches).

Regulatory Framework on Age Verification and Parental Consent in Indonesia

The Indonesian government has incorporated new regulations on child protection in the 2024 revision of the ITE Law, mandating Electronic System Providers (ESPs) to provide information on minimum age limits, user verification mechanisms for children, and reporting systems. Currently, the Ministry of Communication and Digital Affairs is drafting a Government Regulation on the Governance of Child Protection in Electronic System Implementation (Draft Children Protection in ES)[5] to implement Article 16A of the ITE Law, focusing on the technology and operational measures for protecting children’s personal data, although technical standards for age verification remain unspecified. Additionally, the PDP Law requires parental consent for processing children’s personal data under Article 25, while Article 51 of the Draft PDP Regulation mandates that such consent must be verified legitimately, explicitly, and in alignment with available technology.

In summary, the current regulatory framework for online child protection, particularly regarding age verification and parental consent, still requires further clarification on the technological and operational measures needed to implement these provisions. While both the ITE Law and the PDP Law require that data used for age verification be deleted immediately and oblige social media platforms to take available technologies into account, there are no clear, enforceable criteria for balancing these two considerations: children’s privacy and the technologies available.

Age Verification Mechanism: Lessons Learned

Indonesia currently lacks technical standards for age verification mechanisms to protect child users, making comparative studies of standards in other countries essential as a reference for formulating appropriate and effective policies.

United Kingdom: The Online Safety Act (OS Act) stipulates that user age declarations are not considered age verification or age estimation, as they are deemed ineffective. To implement this mandate, Ofcom released age assurance[6] guidelines in December 2023 for online platforms hosting pornographic content.[7] These guidelines detail effective age verification methods, including Open Banking, photo-ID matching, facial age estimation, mobile network operator age checks, credit card verification, and digital identity wallets.

Canada: The Government of Canada, through the Office of the Privacy Commissioner, is conducting an ‘exploratory consultation’[8] on privacy and age assurance for social media users. The three main methods under consideration are age declaration, age verification, and age estimation. Meanwhile, the Digital Governance Standards Institute is proposing a minimum standard for biometric-based age verification systems.

Australia: In March 2023, the eSafety Commissioner published a Roadmap for Implementing Age Verification, offering recommendations to the Australian government. It highlighted that age verification technologies, such as biometric estimation, voice estimation, and ID card-based features, are still underdeveloped. For example, a facial estimation test on a dataset of 10,139 images showed varying accuracy across ethnicities, with higher accuracy for Caucasian faces and lower accuracy for African faces.[9] The roadmap also recommended a ‘double-blind token’ approach, similar to the EU model. Australia is taking a phased approach to age verification and, for now, is not mandating advanced technologies such as biometrics and facial estimation.

European Union: On March 26, 2024, the European Union adopted the Digital Identity Regulation, introducing the EU Digital Identity Wallet, which requires all EU member states to provide digital wallets by 2026 and to recognize wallets issued by other member states. The wallet allows users to authenticate their identity and share key attributes, such as age, without disclosing other personal details, thereby preserving privacy. One pilot project, euCONSENT, is developing a device-based app that stores anonymized age verification tokens.[10] Once a user’s age is verified, the verification provider issues a token stored in the app, revealing only encrypted age qualification data without additional identity information (a simplified sketch of this token flow follows this table).

China: China applies the following age verification methods: (a) mandatory real-name registration, requiring all internet users to register with their real names and official identification to access internet services,[11] and (b) facial recognition technology, used for identity verification with the data owner’s consent.
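To make the ‘double-blind token’ idea referenced in the Australian roadmap and the euCONSENT pilot more concrete, the following minimal Python sketch shows how an age verification provider might issue a token carrying only an over-18 claim, which a platform can then check without learning the user’s identity. All class and function names here are hypothetical, and a real deployment would rely on asymmetric signatures and a standardized wallet format rather than the shared HMAC key used below for simplicity.

```python
# Illustrative sketch only: a "double-blind" age token flow, loosely modelled
# on the euCONSENT / EU Digital Identity Wallet concept described above.
# Names and the shared-key signing scheme are assumptions for the example.

import hashlib
import hmac
import json
import secrets
import time

SHARED_KEY = secrets.token_bytes(32)  # stands in for the provider's signing key


class AgeVerificationProvider:
    """Checks a user's age once (e.g., against an ID document) and issues a
    token that reveals only the age qualification, never the identity data."""

    def issue_token(self, over_18: bool) -> dict:
        claim = {
            "over_18": over_18,             # the only attribute disclosed
            "issued_at": int(time.time()),
            "nonce": secrets.token_hex(8),  # makes each token unlinkable
        }
        payload = json.dumps(claim, sort_keys=True).encode()
        signature = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
        return {"claim": claim, "signature": signature}


class Platform:
    """A platform that validates the token's signature without ever seeing
    the user's name, ID number, or date of birth."""

    def admit_user(self, token: dict) -> bool:
        payload = json.dumps(token["claim"], sort_keys=True).encode()
        expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, token["signature"]):
            return False  # forged or tampered token
        return token["claim"]["over_18"]


if __name__ == "__main__":
    token = AgeVerificationProvider().issue_token(over_18=True)
    print("Access granted:", Platform().admit_user(token))
```

The point of the design is separation of knowledge: the verification provider never learns which platform the token is presented to, and the platform never learns anything beyond the age qualification itself.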

Parental Consent Mechanism: Lessons Learned

The PDP Law requires social media platforms to obtain parental consent to process children’s personal data, but this mechanism remains weak as it generally relies on self-declarations, such as email confirmations, without verifying whether the person granting consent is truly the parent or guardian. This creates challenges in ensuring the validity of consent, highlighting the need to study parental consent practices implemented in other countries.

Ireland: The Irish Data Protection Commission acknowledges that, currently, there are few accurate and proportionate methods to effectively verify parental consent in practice.

United States: Child protection in the online space in the United States is governed by the Children’s Online Privacy Protection Act (COPPA) Rule. The COPPA Rule does not mandate specific methods for obtaining parental consent; however, the FTC has determined that certain methods of collecting parental consent meet the standards outlined in the COPPA Rule.

European Union: The General Data Protection Regulation (GDPR) does not specify practical methods for collecting parental consent or for determining who is authorized to give it. Therefore, the European Data Protection Board (EDPB) recommends adopting a proportional approach, in line with the principle of data minimization.

Moving Forward: What is the Appropriate Mechanism for Age Verification and Parental Consent?

Child protection in the online space has become a global priority, with a focus on effective age verification and parental consent mechanisms to safeguard children from harmful content and privacy risks. Indonesia has begun efforts to protect children online but lacks a national standard for technical age verification, and it is essential for the government to establish such standards to guide online platforms. When developing these standards, key considerations include protecting children’s privacy, accounting for platforms’ technical capabilities, and sandbox testing by the relevant agencies. Additionally, the proportionality of parental consent verification should be assessed based on data processing risks, considering factors such as the child’s vulnerability, the sensitivity of the data, the type of service, third-party access, and data sharing; one illustrative way to operationalize these factors is sketched below. These criteria could be integrated into, for instance, the Draft Children Protection in ES.
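As a purely illustrative exercise, the proportionality factors listed above could be translated into a tiered consent verification requirement along the following lines. The factor names, scoring, and thresholds in this Python sketch are assumptions made for the example and do not reflect any existing Indonesian regulation or draft.

```python
# Hypothetical illustration: mapping data-processing risk factors to a tiered
# parental-consent verification requirement. Thresholds are assumptions.

from dataclasses import dataclass


@dataclass
class ProcessingContext:
    child_is_vulnerable: bool       # e.g., very young child or special protection need
    data_is_sensitive: bool         # e.g., biometric, health, or location data
    service_targets_children: bool  # service type aimed primarily at minors
    third_party_access: bool        # processors or partners can access the data
    data_is_shared: bool            # data shared or disclosed beyond the platform


def required_consent_level(ctx: ProcessingContext) -> str:
    """Return a consent verification tier proportional to processing risk."""
    score = sum([
        ctx.child_is_vulnerable,
        ctx.data_is_sensitive,
        ctx.service_targets_children,
        ctx.third_party_access,
        ctx.data_is_shared,
    ])
    if score >= 3:
        return "strong verification (e.g., ID-based or payment-card check of the parent)"
    if score >= 1:
        return "confirmed consent (e.g., verified parental email or phone number)"
    return "simple declaration (low-risk processing)"


if __name__ == "__main__":
    ctx = ProcessingContext(True, True, True, False, False)
    print(required_consent_level(ctx))
```

The tiering reflects the proportional approach recommended by the EDPB: the stricter (and more data-intensive) the verification method, the stronger the justification in terms of processing risk should be.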

The full article can be accessed here: https://doi.org/10.31219/osf.io/d5k2s

Should you have any inquiries related to this regulation or wish to ascertain its impact on your business or personal interests, please feel free to contact us. 

©2025. BE Partners. All Rights Reserved. 


[1] ‘Asosiasi Penyelenggara Jasa Internet Indonesia’ <https://apjii.or.id/berita/d/apjii-jumlah-pengguna-internet-indonesia-tembus-221-juta-orang> accessed 26 November 2024.

[2] ‘Refleksi Tiga Tahun TaskForce KBGO: Kompleksitas Pelaku Anonim & Korban Anak’ (TaskForce KBGO 2023) <https://web.tresorit.com/l/0EI2Q#IYIo4gpYVDEGHyqqrs_FZw&viewer=fQpmbSnVflMZqe1QHVORpHSXmUlOGmj6> accessed 26 November 2024.

[3] Law No. 11 of 2008 on Electronic Information and Transactions, as last amended by Law No. 1 of 2024.

[4] Law No. 27 of 2022 on Personal Data Protection.

[5] ‘Rancangan Peraturan Pemerintah Tentang Tata Kelola Pelindungan Anak Dalam Penyelenggaraan Sistem Elektronik’ <https://web.komdigi.go.id/resource/ZHJ1cGFsL05hc2thaCBSUFBfVGF0YV9LZWxvbGFfUEFQU0UgS29taW5mby5wZGY=> accessed 26 November 2024.

[6] Age Assurance or age validation is an umbrella term used in the UK and various countries to describe a standardized system for determining a user’s age, which includes two main aspects: age verification and age estimation. Age verification involves the use of concrete evidence, while age estimation uses technology to estimate age without the need for identity documents.

[7] ‘Guidance on Age Assurance and Other Part 5 Duties for Service Providers Publishing Pornographic Content on Online Services’ <https://www.ofcom.org.uk/siteassets/resources/documents/consultations/category-1-10-weeks/272586-consultation-guidance-for-service-providers-publishing-pornographic-content/associated-documents/annex-2-guidance-for-service-providers-publishing-pornographic-content-online?v=368675>.

[8] Office of the Privacy Commissioner of Canada, ‘News Release: Launch of Exploratory Consultations on the Privacy Implications of Age-Assurance Systems’ (10 June 2024) <https://www.priv.gc.ca/en/opc-news/news-and-announcements/2024/nr-c_240610/> accessed 26 November 2024.

[9] ‘Age Assurance Trends and Challenges – Issues Paper’ (eSafety Commissioner 2024) <https://www.esafety.gov.au/industry/tech-trends-and-challenges/age-assurance> accessed 26 November 2024.

[10] ‘Age Assurance Trends and Challenges – Issues Paper’ (eSafety Commissioner 2024) <https://www.esafety.gov.au/industry/tech-trends-and-challenges/age-assurance> accessed 26 November 2024.

[11] ‘Translation: Cybersecurity Law of the People’s Republic of China (Effective June 1, 2017)’ (DigiChina) art 24 <https://digichina.stanford.edu/work/translation-cybersecurity-law-of-the-peoples-republic-of-china-effective-june-1-2017/> accessed 26 November 2024.
