Online child safety needs age assurance, not age policing
Bangladesh’s digital child protection policy still rests on a dangerously comforting illusion: that harmful online content can be managed by blocking websites. It is a politically convenient approach, because it allows the state to appear decisive without confronting the actual architecture of digital harm. It is also a technically weak one. Bangladesh has already relied on mass blocking, including the blocking of 1,279 pornographic websites in 2019, but a blocked URL does not protect anyone. Children do not experience the internet only through a list of prohibited websites. Their exposure comes through phones, feeds, games, livestreams, messaging apps, search results, advertising systems, influencer content, and, increasingly, AI interfaces. A policy designed for static websites is badly mismatched with a digital environment built around algorithmic exposure.
In Bangladesh, internet access is overwhelmingly mobile and deeply embedded in everyday life. BTRC data showed that as of January 2026, Bangladesh had some 12.90 crore internet subscribers, with mobile connections accounting for the vast majority. That scale makes the relevant question unavoidable: not whether children are online, but whether Bangladesh has any credible system for deciding which digital spaces they should be allowed to enter, at what age, and under what safeguards. At present, the answer is largely no.
The United Kingdom’s Online Safety Act requires strong age checks for services allowing pornography, and Ofcom has made clear that simply ticking a box to claim adulthood is no longer enough. Australia has gone further, requiring age-restricted social media platforms to take reasonable steps to prevent Australians under 16 from creating or keeping accounts, backed by penalties that can reach 49.5 million Australian dollars. Indonesia, in March 2026, moved to restrict under-16 users from high-risk platforms including TikTok, Facebook, Instagram, YouTube, and X. The European Union is also developing a privacy-preserving age verification approach, including a tool that allows users to prove they are over 18 without disclosing other personal information. These jurisdictions differ sharply in law and politics, but they are converging on one principle: age can no longer remain a fiction written into a sign-up form.
Bangladesh is not behind merely because it lacks a specific age assurance law. It is behind because its regulatory instinct remains reactive, moralistic, and enforcement-heavy. The state blocks after panic, prosecutes after harm, and announces crackdowns after public outrage. What it does not do is require platforms, app stores, payment systems, gaming environments, and AI services to design age-appropriate access into their systems before harm occurs. Criminal law can punish an offender, but it cannot by itself stop a 12-year-old from entering an adult content site, joining an unsafe stranger chat, being nudged into gambling, or receiving self-harm content through recommendation systems.
The Cyber Security Ordinance 2025, later ratified as the Cyber Security Act 2026, has changed the legal landscape by replacing the Cyber Security Act 2023. Public reporting on its approval noted provisions relating to online gambling, sexual harassment of women and children in cyberspace, and the recognition of internet access as a civic right. Those are important developments, but they do not amount to a coherent age assurance framework. They still treat child safety mainly as a matter of offences and punishment. A serious framework would instead define platform duties, minimum age thresholds, verification standards, independent audits, appeal mechanisms, data minimisation rules, and penalties for negligent design. Without those elements, Bangladesh is not regulating children’s digital access. It is merely reacting to the worst outcomes of unregulated access.
There is, however, a legitimate danger in rushing toward age verification without safeguards. Bangladesh’s history of digital regulation gives citizens every reason to fear that a child safety policy could become another surveillance instrument. Age assurance cannot mean forcing every user to surrender national identity data to every platform, nor can it mean building a central database of what citizens read, watch, play, or discuss. A child protection system that destroys privacy would be another policy failure dressed up as reform.
This is why Bangladesh needs age assurance, not crude age policing. Age assurance can include a spectrum of methods, from document-based verification to facial age estimation to trusted third-party tokens that confirm only whether a user is above or below a legal threshold. The best systems do not reveal identity when identity is unnecessary. The question should not be, “Who is this user?” It should be, “Is this user old enough for this service?” The OECD has warned that age assurance laws are spreading quickly while implementation remains complex, especially because online services operate across borders and because many services used by children still have serious gaps in their age-related practices. Bangladesh should learn from this complexity, not use it as an excuse for inaction.
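The token approach described above can be sketched in miniature. The sketch below is an illustrative simplification, not any jurisdiction's actual scheme: the function names (`issue_age_token`, `platform_accepts`) are hypothetical, and it uses a shared HMAC key for brevity where real systems would use public-key signatures or zero-knowledge proofs. The point it demonstrates is the data-minimisation principle: the token carries a yes/no claim and nothing that identifies the user.

```python
import hmac
import hashlib
import json
import secrets

# Key held by the trusted age-verification issuer. Simplification: a real
# deployment would use asymmetric signatures so platforms never hold a
# secret capable of forging tokens.
ISSUER_KEY = secrets.token_bytes(32)

def issue_age_token(is_over_threshold: bool) -> dict:
    """Issued after the user proves their age once, to the issuer only.

    The claim contains a single boolean and a random nonce -- no name,
    no national ID number, no birth date.
    """
    claim = {"over_threshold": is_over_threshold, "nonce": secrets.token_hex(16)}
    payload = json.dumps(claim, sort_keys=True).encode()
    sig = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "sig": sig}

def platform_accepts(token: dict) -> bool:
    """The platform checks authenticity only; it never learns who the user is."""
    payload = json.dumps(token["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(token["sig"], expected) and token["claim"]["over_threshold"]
```

In this design the platform answers only the question the article poses, "Is this user old enough for this service?", and a tampered or forged token fails verification because the signature no longer matches the claim.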
The mental health and safety risks are not speculative. The US Surgeon General has stated that current evidence cannot conclude social media is sufficiently safe for children and adolescents, and has called for stronger age-appropriate health and safety standards, better privacy protections, and policies that reduce exposure to harm. Bangladesh should not wait for a domestic tragedy to accept a global evidence base that is already strong enough to demand regulatory action.
A credible Bangladeshi approach would begin by abandoning the fantasy that blocking websites equals protecting children. It would place legal duties on high-risk services, require privacy-preserving age checks for adult content and gambling, demand stronger protections in social media and gaming environments, and prohibit platforms from using children’s data to optimise addictive engagement. It would also require transparency from platforms about underage users, content exposure, complaint handling, and algorithmic safeguards. Most importantly, it would subject both companies and regulators to independent oversight.
Bangladesh can either remain trapped in a censorship-based model that is easy to announce and easy to bypass, or it can build a rights-respecting age assurance regime that protects children without turning every citizen into a monitored subject. The first option is familiar, ineffective, and politically lazy. The second is difficult, technical, and institutionally demanding—but necessary. Child online safety will not come from blocking yesterday’s websites. It will come from governing today’s platforms with seriousness, restraint, and accountability.
Khan Khalid Adnan is an advocate at the Supreme Court of Bangladesh, a fellow of the Chartered Institute of Arbitrators, and head of the chamber at Khan Saifur Rahman and Associates in Dhaka.
Azfar Adib is a senior member of the Institute of Electrical and Electronics Engineers (IEEE) and a PhD student at Concordia University, Canada.
Views expressed in this article are the authors' own.