Malaysia Moves Toward Mandatory Age Verification for Social Media Under Online Safety Act 2025

Putrajaya plans to finalize age and identity verification rules for social media in the second quarter of 2026, placing legal responsibility on platforms to keep users under 16 off certain services and to curb harmful content.

Malaysia’s push to tighten online safeguards for minors is entering a decisive phase, with the government signalling that a formal age verification mechanism for social media users will be finalized through subsidiary legislation under the Online Safety Act 2025 (ONSA) in the second quarter of 2026.

Deputy Communications Minister Teo Nie Ching told the Dewan Negara that the mechanism forms part of broader efforts to protect children from harmful online content, in line with ONSA, which came into force on January 1. The law establishes a new regulatory framework aimed at reducing exposure to content such as cyberbullying, sexual exploitation, scams, and other high-risk material.

According to Teo, the Malaysian Communications and Multimedia Commission (MCMC) is currently assessing how age and identity verification can be implemented in a way that balances security, privacy, and compatibility with existing laws, including data protection requirements.

“MCMC is conducting a regulatory sandbox process with social media platform providers to evaluate suitable technological approaches,” she said in reply to a question by Senator Norhasmimi Abdul Ghani during Question Time.

The sandbox approach allows regulators and technology providers to test systems in a controlled environment before full implementation. In this case, it includes examining age verification tools, identity validation mechanisms, and the use of artificial intelligence to detect high-risk content.

The policy direction is clear. Once the assessment and sandbox process are complete, the obligation to ensure users under 16 do not operate social media accounts will rest squarely with platform providers under subsidiary legislation pursuant to ONSA. Non-compliance could result in financial penalties of up to RM10 million.

According to observers, that penalty ceiling is intended primarily to command platforms' attention.

A SHIFT IN RESPONSIBILITY

Malaysia is not alone in grappling with the question of how to shield minors online. Jurisdictions from the European Union to Australia and parts of the United States have introduced or proposed stricter digital age thresholds and enhanced child safety codes. The debate, however, remains complex for a number of reasons.

On one hand, policymakers argue that unregulated access to algorithm-driven platforms exposes young users to bullying, predatory behaviour, self-harm content, and other material that can have long-term negative psychological effects. On the other, civil liberties advocates, while not necessarily disputing those concerns, caution against intrusive identity checks and regulatory overreach.

Teo’s remarks suggest the government is aware of these tensions. She noted that the MCMC is considering account security, personal data protection, privacy safeguards, and alignment with the existing legal framework before finalizing the mechanism.

The emphasis on subsidiary legislation is also significant. Rather than embedding highly technical rules directly in the Act, the government appears to be creating flexibility to adjust implementation details as technology evolves.

The proposed framework would also aim to ensure that social media algorithms are age-appropriate. That is a critical element. Modern platforms do not simply host content; through complex and proprietary algorithms, they actively curate and amplify it. For regulators, managing algorithmic exposure is increasingly as important as moderating individual posts.

Teo said the move is intended to prevent exposure to negative content, including cyberbullying and sexual harassment, while strengthening complaint and enforcement pathways.

DATA POINTS AND DIGITAL REALITIES

In response to a supplementary question by Senator Tiew Way Keng, Teo revealed that between January 1, 2022 and February 15 this year, 1,578 requests were submitted to service providers for the removal of extremely offensive content involving children. Of these, 96 percent were successfully taken down.

The statistic cuts both ways. On one level, it demonstrates that removal mechanisms can work. On another, it underscores the scale and persistence of harmful material circulating online.

To further reinforce digital safety, the MCMC launched a public consultation on February 12 to develop a Risk Reduction Code and a Child Protection Code. These are expected to be finalized after March 13.

“Our intention is to make compliance with these codes mandatory for platform providers so their algorithms are safer and to establish faster and more effective complaint mechanisms,” Teo said.

If implemented as indicated, Malaysia’s approach would combine three elements: age-based access controls, algorithmic accountability, and formalized complaint and enforcement procedures.

The practical question remains how age verification will function in real-world terms. Globally, proposed methods range from document-based identity checks to third-party digital identity providers and biometric estimation tools. Each carries trade-offs in cost, accuracy, and privacy risk.

For platforms operating across multiple jurisdictions, compliance could require substantial system redesigns. For users, particularly younger ones, it may mean stricter onboarding processes and reduced anonymity.

At a broader level, the debate touches on the role of parents, schools, and community education alongside state regulation. Laws can restrict access and impose penalties, but digital literacy and supervision remain essential components of child safety.

Malaysia’s move reflects a growing recognition that voluntary self-regulation by technology companies is unlikely to satisfy public expectations indefinitely. The RM10 million penalty ceiling reinforces that point.

Whether the new framework achieves its objectives will depend on its technical execution and enforcement consistency. What does seem clear is that, as of 2026, digital childhood in Malaysia is poised to look rather different from even a few years ago.

The government has set the direction. The next quarter will determine how firmly the rules are drawn.
