When passed, the Code of Practice for Online Safety and the Content Code for Social Media Services will require platforms to implement safety standards and content moderation processes to minimise users' risk of exposure to harmful online content like terrorist propaganda.
The platforms will also need to ensure additional safeguards for users who are under 18 years old, including tools to help them or their parents minimise their exposure to inappropriate content such as sexual or violent videos, and unwanted interactions like online stalking and harassment.
The new rules are aimed at codifying these standards in law and giving the authorities powers to take action against platforms that fail to meet the requirements. They are expected to be added to the Broadcasting Act.
The ministry has not released many details on the specific stipulations of the codes, as they are still being developed in collaboration with the tech industry. It has said, however, that the Infocomm Media Development Authority (IMDA) will be empowered to direct social media services to disable access to harmful online content for Singapore users.
Platforms will also be required to produce annual accountability reports to be published on the IMDA website.
Here are five types of online harm the Code of Practice for Online Safety and the Content Code for Social Media Services could cover.
1. Violence and terrorism
Social media platforms will need to proactively detect and remove violent videos or other content that promotes violence, such as terrorist propaganda.
For example, rioters who stormed Capitol Hill in the United States last year used social media to organise themselves and amplify their messages.
Similar material that incites violence, as well as content that encourages self-directed violence such as suicide, will also be covered under the new codes of practice.
2. Dangerous viral challenges
Social media "challenges" that encourage dangerous behaviour are another example of harmful online trends that the codes aim to reduce.
Some platforms have taken steps to remove such challenges.
Viral video platform TikTok blocked hashtags and videos related to the "milk crate challenge" last year over concerns that participants could be seriously injured. The trend involved users filming themselves stacking milk crates into a tower and then climbing over them. Many videos showed users falling to the ground while attempting the stunt.
3. Sexual exploitation, abuse and harassment
The codes will also aim to minimise users' risk of exposure to sexual content and abuse, including child pornography. Platforms will be required to detect and remove child sexual exploitation and abuse material, as well as content that promotes sexual violence.
Sexual harassment, online stalking and threats of "revenge porn" - or non-consensual sharing of sexual images - will also be tackled. Social media platforms will be required to ensure users can easily report such unwanted interactions, assess the reports and take appropriate action in a timely manner.
4. Threats to public health
Content that threatens public health could also fall afoul of the new rules.
During the Covid-19 pandemic, conspiracy theories and viral social media posts may have contributed to vaccine hesitancy and discouraged people from complying with measures implemented to control the spread of the virus.
Similar posts could constitute a threat to public health and may be covered under the rules to combat online harm.
5. Threats to racial and religious harmony
The new codes will take Singapore's unique context into account, including sensitive issues like race and religion. Offensive content or incidents that could stoke racial or religious tension will be covered.
One example is the case of a man who was charged with stoking racial tensions after posting racially offensive tweets under the persona of a Chinese woman using the pseudonym Sharon Liew.