Australia has introduced a groundbreaking law that prohibits children under 16 from accessing social media platforms. This legislation, which passed the country's Parliament recently, aims to address growing concerns over social media's impact on young people's mental health, including issues like cyberbullying, addiction, and exposure to harmful content.

What does the new law require from social media platforms?

The bill, which cleared the Senate on Thursday, requires social media companies to strengthen age verification processes to prevent children under 16 from using platforms such as Facebook, Instagram, TikTok, and Snapchat, according to a report by CNN. The law will take effect in early 2025, giving both tech companies and parents time to adjust.


Under the new regulations, social media companies will have one year to implement robust age verification systems. Failure to comply could result in substantial fines of up to AUD 50 million for repeated violations. These penalties are designed to ensure that platforms take the steps necessary to block access for users under 16.

Prime Minister Anthony Albanese has expressed strong support for the new law, calling it an important step in safeguarding children's emotional and mental well-being in the digital age. The legislation follows extensive research and recommendations from health experts, who have warned about the negative effects of social media on young users. Studies have shown links between excessive social media use and increased rates of depression, anxiety, and sleep disturbances in teenagers.


What are tech companies' concerns?

Despite the support, the law's swift passage has drawn criticism. The bill was fast-tracked through Parliament, with limited time for public consultation. A Senate committee inquiry was conducted in just 24 hours, with submissions from over 100 sources expressing concerns about the rushed process. Tech companies, including Meta, TikTok, and Snap Inc., have acknowledged the importance of protecting young users but have raised concerns about the law's speed and potential technical challenges.


To comply, platforms are exploring advanced age verification technologies, including facial recognition and digital ID systems. However, such methods have raised concerns over privacy and data security, putting pressure on tech companies to balance child safety with user privacy as they adapt to the law's requirements.

This historic move is expected to set a precedent for other countries grappling with similar issues around social media and child safety.
