KUALA LUMPUR, Feb 4 (Bernama) -- The Online Safety Act 2025 (ONSA), which took effect on Jan 1, adds a new layer to Malaysia’s regulatory framework for digital platforms, with a strong emphasis on managing online harm and strengthening protections for children.
Passed by Parliament in December 2024 and gazetted in May 2025, the Act operates alongside existing legislation, including the Communications and Multimedia Act 1998. It introduces specific obligations for platforms that host or distribute user-generated content and shifts the regulatory focus towards systemic risk management rather than case-by-case content decisions, with child safety forming a central pillar of the new framework.
The Act is designed to address platforms’ inconsistent responses to harmful content, which have historically been governed by internal policies; the failure to promptly remove even a small number of flagged items can expose large user populations to online harm.
The introduction of ONSA follows sustained growth in reported online harm in Malaysia: police recorded RM2.7 billion in losses from online scams between January and November 2025.
On Oct 24, police announced they had crippled a criminal network linked to child sexual abuse material (CSAM), with the arrest of 31 individuals and the seizure of more than 880,000 digital files.
Since 2022, regulatory or enforcement action has resulted in the removal of 38,470 items related to cyberbullying and online harassment, yet harmful material continues to surface and remain accessible despite ongoing platform moderation.
Data from the Malaysian Communications and Multimedia Commission (MCMC) indicates that major platforms removed approximately 92 per cent of the 697,061 posts flagged as harmful between January 2024 and November 2025, leaving some 58,104 posts accessible online. The situation is further complicated by the growing use of automated tools and AI-generated content such as deepfakes, which hinder detection and enforcement efforts.
ONSA establishes a regulatory framework for application service providers and content application service providers, including both local and foreign platforms operating in or targeting the Malaysian market.
The Act covers a defined range of harmful content categories, including CSAM, online scams and financial fraud, obscene or pornographic material, harassment and abusive communications, content linked to violence or terrorism, material that encourages self-harm among children, content that promotes hostility or disrupts public order, and content associated with dangerous drugs.
Rather than addressing individual posts, the law focuses on platform-level risk management systems, including content distribution and recommendation mechanisms, as well as platform governance and operational practices; it does not impose criminal liability on individual users for lawful expression.
Under ONSA, platforms are required to identify and manage risks arising from harmful content on their services, including reducing users’ exposure to ‘priority harms’, ensuring certain categories of content are inaccessible, and providing reporting and user support mechanisms.
Platforms must also prepare and submit an online safety plan detailing how these risks are addressed, with enforcement measures available for non-compliance, subject to procedural requirements governing the issuance and review of regulatory directions.
The Act further mandates specific measures, including age-related safeguards and access restrictions, to limit children’s exposure to harmful content and interactions, a requirement expected to influence the design of default settings, content discovery tools, and interaction features for younger users.
However, ONSA does not extend to private one-to-one messaging, authorise general user monitoring, or create new offences for lawful speech or political expression. Its safeguards include notice requirements, opportunities for representation, public records of regulatory directions, and access to appeal mechanisms and judicial review.
While introducing a formal regulatory structure for online safety, ONSA does not replace existing enforcement, education, or prevention efforts, as issues such as digital literacy, parental involvement, and online behaviour norms remain outside its scope. Its practical impact will depend on platform adaptation and regulatory oversight over time.
-- BERNAMA
