Online safety demands early action, not just content removal

18 hours ago


KUALA LUMPUR (Dec 26): Online safety is no longer a question of whether it exists, but of how quickly platforms act once risks are detected. For child sexual abuse material (CSAM) and financial scams, delayed action allows harmful content to continue spreading, exposes victims repeatedly, and causes greater harm.

The Online Safety Act 2025 (ONSA) emphasises an early prevention approach, requiring platforms to block, restrict and act immediately before harm escalates.

Today, the public no longer judges protection based on the amount of content removed, but on whether platforms act early enough to prevent harm from occurring.

In short, harm that is stopped early protects users, especially children; harm that is allowed to persist will recur and worsen.

Cases involving manipulated CSAM images, videos and recordings, as well as sharing through closed platforms, show an upward trend in line with wider Internet access.

This is evidenced by a recent nationwide operation named ‘Integrated Op Pedo 2.0’ by the Royal Malaysia Police (PDRM) and the Malaysian Communications and Multimedia Commission (MCMC), which led to the recovery of more than 880,000 CSAM-related files and the arrest of 31 individuals at 37 locations.

Sharing these concerns, cybersecurity expert Associate Professor Ts Dr Nur Izura Udzir of the Faculty of Computer Science and Information Technology, Universiti Putra Malaysia (UPM), said that social media platforms should take early measures to filter and block harmful content, whether CSAM, scams or other cybercrime.

“It would be more effective if they carry out the content blocking themselves. It is their ‘home’ (platform) compared to the legal authorities in the country.

“They can actually carry out filtering and blocking without being instructed or receiving complaints first. If this is done, the regulation of harmful content will be more effective.

“Criminal misconduct exists everywhere, but with the help of technology it can be driven more aggressively, especially on social media platforms,” she said when contacted recently.

She added that delays in enforcement and blocking allow criminals to multiply distribution of the content across multiple platforms.

Dr Nur Izura said swift action would also help in collecting criminal evidence, which can then be used for prosecution in court.

“In the digital world, evidence on social media can be easily deleted if immediate action is not taken,” she said.

She further explained that social media platforms need to be more proactive in detecting cybercrime, including adopting artificial intelligence (AI)-based filtering and blocking mechanisms.

In addition, if content is detected or suspected to contain harmful or criminal elements, platforms can impose temporary restrictions on such content.

“Temporary restrictions can also limit the content from spreading further, thereby preventing greater risks and harm,” she said.

From a cybersecurity perspective, she noted that one regulatory approach that can be applied is requiring a 'safety by design' mechanism.

“Systems and protection mechanisms built in from the start can be provided by social media platforms, as they are capable of blocking harm and cybercrime at an early stage,” she said.

Meanwhile, Senior Lecturer at the Department of Human Development and Family Studies, Faculty of Human Ecology, Universiti Putra Malaysia (UPM), Dr Nellie Ismail, said that failure to promptly block CSAM could expose children to various risks.

She said that every time such content is reshared, children risk experiencing repeated trauma, prolonged anxiety and feelings of shame.

“These effects can persist in the long term and ultimately affect children’s emotional wellbeing and social relationships.

“It is widely known that children are easily victimised because they are not yet cognitively and emotionally mature enough to identify risks in the online environment.

“Moreover, they tend to trust others and may not know how, feel confused or be afraid to report harmful content, causing exposure to occur earlier and repeatedly,” she said.

She added that early action by platforms is crucial in stopping the spread of harmful content such as CSAM, thereby reducing the risk of children encountering or being re-associated with such material.

Separately, President of the Malaysia Cyber Consumer Association (MCCA), Siraj Jalil, said he acknowledged that child exploitation content and financial scams typically spread very rapidly on the Internet.

According to him, if detection is delayed, the content would have already been copied, shared and disseminated across many other platforms.

“When we manage to detect it early, we can stop the circulation, identify victims and perpetrators more quickly, and prevent the content from being stored on dark networks that are difficult to remove.

“Early detection actions can also help authorities understand the patterns and tactics of the syndicates involved,” he said.

He further explained that CSAM and financial scam issues in Malaysia are becoming increasingly serious, with data showing that more content is detected each year.

“Malaysia is also frequently mentioned in international reports as a location for content storage or transit. The perpetrators’ modus operandi is becoming more sophisticated with the use of VPNs, live-streaming platforms and closed sharing networks.

“This indicates that the problem has reached a critical level and requires firm action in terms of intelligence, security technology and public education,” he said.
