
Social media and technology companies would have six months to implement new measures to prevent children from accessing adult content online, or face fines of up to $50 million, under proposed new codes developed by the industry. Source: The Guardian.
The draft codes, submitted to Australia's eSafety Commissioner last week for approval, would require social media platforms that allow pornography to prevent minors from accessing it and to implement age assurance measures for users.
Social media services that restrict pornography would be required to detect and remove adult content, including material depicting self-harm or high-impact violence.
The proposed codes apply at the various layers through which users interact with the internet: there are separate codes for social media platforms, gaming services, websites, search engines, internet service providers and equipment manufacturers.
Companies that make and sell equipment and operating systems that enable access to online services – including phones, tablets and computers – would be required to enable users to set up child accounts and profiles and to apply default safety restrictions to those accounts.
Search engine services would be required to apply default safety tools, such as “safe search” functions, at the highest safety setting by default to any account holder detected by age assurance systems as “likely to be an Australian child”.
The codes would require internet hosting services to take “appropriate and proportionate enforcement action” against customers who breach content laws and regulations.
Draft codes were released in October, but the final proposals were delayed by two months to allow the sector to address overlap with the Albanese government's announcement last year of a social media ban for under-16s.
The codes were developed by industry groups but will need to be assessed and registered by the eSafety Commissioner before coming into effect. They address pornography and content related to suicide, self-harm, eating disorders and violence.
If the codes are accepted by the commissioner, Julie Inman Grant, companies would have six months to implement the required measures before enforcement action could be taken under the federal Online Safety Act, with penalties of up to $50 million.
FULL STORY
Social media platforms could face $50m fines if Australian children access adult content on their sites (By Ben Smee, The Guardian)