A growing concern

Indonesia has become the first country in Southeast Asia to implement restrictions on social media accounts for children under 16, joining a growing number of states that no longer treat these platforms as neutral tools. This shift reflects a growing concern among policymakers that minors cannot be left exposed, at an unprecedented scale, to explicit content, coercion and addictive design in systems built to maximise engagement.
The impulse to lock the monster out is understandable. Parents, schools, and governments have spent years confronting harms that technology companies were slow to acknowledge and even slower to address. In this sense, Indonesia’s move is not an outlier but part of a broader shift in public policy. Last year, Australia reached for the same instrument, legislating a nationwide ban and threatening penalties for non-compliance. Yet age-based restrictions are easily evaded through false credentials and informal workarounds.
That is why the more consequential shift may be taking place outside parliaments. In the US, for instance, litigation against companies such as Meta has pushed a far-reaching argument into the mainstream: that harm to young users is not merely the result of bad content or poor supervision, but of product design itself, with infinite scroll, autoplay and notification loops built in as deliberate mechanisms to keep users hooked for longer.
Perhaps this also explains why comparisons with Big Tobacco have gained traction. The analogy is imperfect: however harmful the product, social media also remains a central, near-unavoidable channel for communication and identity. Even so, the direction of accountability is changing, and that is welcome.
Pakistan’s own record on child online protection is thin where it matters. The National Commission on the Rights of the Child has repeatedly warned that children face cyberbullying, grooming, sexual exploitation and exposure to abusive material. Global monitoring bodies flagged more than two million instances of child sexual abuse material linked to users in Pakistan in 2022, while the Federal Investigation Agency registered roughly 187 cases, a gap that shows how weak detection, reporting and enforcement remain.

The temptation, then, would be to reach for an outright ban. Still, we know too well that in this part of the world, regulation rarely remains confined to its stated purpose. Pakistan’s history of digital regulation gives ample reason for caution, especially when broad powers are defended in the name of public order. The harder, less dramatic work lies elsewhere. Platforms can be compelled to adopt age-appropriate design, limit unsolicited contact from adults, and default to private settings. Law enforcement can also be directed toward grooming networks and extortion rackets that operate openly.
For years, light-touch regulation has allowed engagement-driven systems to become embedded in children’s daily lives while public institutions lag behind. Restricting access may satisfy the demand to act. It will not, by itself, solve the problem.