Online Safety Bill 2024 Passed by Malaysian Parliament
17 December 2024
The Online Safety Bill 2024 (“the Bill”) was passed by the Dewan Rakyat (House of Representatives) and the Dewan Negara (Senate) on 11 and 16 December 2024 respectively. The Bill will now be presented for Royal Assent, will become law upon being gazetted, and will come into operation on a date to be appointed by the Minister of Communications.
The Bill aims to enhance and promote online safety in Malaysia by regulating harmful content and imposing duties and obligations on applications service providers (“ASPs”) and content applications service providers (“CASPs”) licensed under the Communications and Multimedia Act 1998 (“CMA”). The Bill will have extraterritorial effect and will apply to persons outside Malaysia who are licensed as ASPs, CASPs or network service providers (“NSPs”) under the CMA and who provide any applications service, content applications service, or network service within Malaysia.
The Bill also accords additional powers to the Malaysian Communications and Multimedia Commission (“MCMC”) and establishes an Online Safety Committee and an Online Safety Appeal Tribunal, whose functions are summarised below.
As the proposed new law will have significant implications for the content and media industry in Malaysia, we have summarised some of the salient provisions below:
Application
The Bill is stated to apply to: (1) applications services which utilise internet access services to enable communications between users; (2) content applications services which utilise internet access services to provide content; and (3) any network service. However, the Bill does not apply to the “private messaging features” of applications and content applications services, defined as “a feature that allows a user to communicate a content to a specific and limited number of recipients determined by the user and may contain any other characteristics as may be prescribed”.
Meaning of harmful content
The First Schedule of the Bill sets out the types of content considered “harmful content” and includes:
- content on child sexual abuse;
- content on financial fraud;
- obscene content;
- indecent content;
- content that may cause harassment, distress, fear, or alarm by way of threatening, abusive, or insulting words, communications, or acts;
- content that may incite violence or terrorism;
- content that may induce a child to self-harm;
- content that may promote feelings of ill-will or hostility amongst the public; and
- content that promotes use or sale of dangerous drugs.
The Bill also defines content on child sexual abuse and financial fraud as “priority harmful content”, which is subject to more stringent controls.
Duties of ASPs and CASPs
The Bill imposes various duties on ASPs and CASPs, such as duties to:
- implement measures to mitigate the risk of users being exposed to harmful content, as specified in the code issued by the MCMC, or any alternative measures that are proven to be more effective;
- issue guidelines to users on the measures implemented to mitigate risk of exposure to harmful content, and the terms of use of their services;
- make available tools and settings to enable users to manage their online safety, including preventing or limiting other users from identifying, locating or communicating with them;
- make available mechanisms for users to report harmful content, as well as responsive assistance for users to raise concerns regarding online safety and online safety measures, and to make inquiries;
- establish mechanisms to make priority harmful content inaccessible on their services;
- implement measures to ensure safe use of their services by children, as specified in the code issued by the MCMC, or any alternative measures that are proven to be more effective – these measures shall include measures to prevent access to harmful content, limit communication with adults, control personalised recommendation systems, limit features that extend use of the service by children, and protect personal information; and
- prepare and submit to the MCMC an Online Safety Plan on their compliance with their duties, and make the plan available on their services.
Reporting of harmful content
The Bill prescribes the actions to be taken by ASPs and CASPs upon receiving a user report of harmful content, including procedures for assessing the report and making the content inaccessible.
The Bill also sets out the actions to be taken by the MCMC upon receiving a user report of harmful content, including procedures for assessing the report and issuing written instructions to ASPs, CASPs, or relevant NSPs regarding the harmful content.
Powers of MCMC
The Bill grants the MCMC various powers, such as powers to:
- issue directions and written instructions to ASPs, CASPs and NSPs licensed under the CMA regarding their compliance with the provisions of the Bill;
- gather, retain, and require the production of information, particulars, documents, or evidence relevant to the performance of its functions and powers under the Bill;
- issue notices of non-compliance and impose financial penalties on ASPs and CASPs for any non-compliance with their duties under the Bill; and
- where harmful content is available online other than on the service of an ASP or CASP, issue written instructions to relevant NSPs to restrict the relevant parts of their network service so as to make the harmful content inaccessible.
Online Safety Committee
The Bill establishes the Online Safety Committee, whose functions are to advise and make recommendations to the MCMC on matters relating to online safety, including, among others, the determination of the types of harmful content and priority harmful content, and best practices to encourage accountability of ASPs, CASPs and NSPs.
Online Safety Appeal Tribunal
The Bill establishes an Online Safety Appeal Tribunal to review any written instruction, determination, direction, or decision issued or made by the MCMC under the Bill.
Commentary
Affected parties, particularly ASPs, CASPs and NSPs, are advised to take note of the potential duties imposed on them, review their current policies and practices on online safety and content moderation, and prepare for the potential impact of the new law on their operations and liabilities when it comes into effect.
For further information, please contact Charmayne Ong (Partner), Natalie Lim (Partner) and Jillian Chia (Partner) of the Technology, Media and Telecommunications Practice of Skrine.
This article/alert contains general information only. It does not constitute legal advice nor an expression of legal opinion and should not be relied upon as such. For further information, kindly contact skrine@skrine.com.