Australia’s eSafety Commission issued a significant fine against X for noncompliance with an investigation into anti-child abuse practices.
- The fine amounts to A$610,500 for failing to answer key questions regarding the handling of child sexual exploitation.
- Elon Musk’s X has seen continuous revenue decline since his acquisition, amid criticism of platform safety and mass layoffs.
- Under Australian law, companies must respond to regulatory queries about online safety or face fines.
- Google also received a warning, but X’s noncompliance was deemed more serious.
Australia’s eSafety Commission has issued a substantial fine of A$610,500 against X for noncompliance with an investigation into the company’s anti-child abuse practices. The fine follows the company’s failure to respond to pivotal questions regarding its measures against child sexual exploitation.
The regulatory body stated that X left sections of its responses blank, including critical questions on response times, measures for detecting child sexual exploitation in livestreams, and the technologies used to identify such material. Elon Musk had previously declared that removing child exploitation was the platform’s top priority, a claim the commission subsequently labelled ‘empty talk’.
X, formerly known as Twitter, has been under scrutiny since Elon Musk’s US$44 billion acquisition, which has been accompanied by a continuous decline in revenue and significant workforce reductions. Insiders have voiced concerns about the platform’s ability to protect users from trolling following mass layoffs. Recent reports indicate that X has eliminated 80% of its workforce globally and no longer maintains public policy staff in Australia.
Australian legislation, enacted in 2021, empowers the regulator to compel internet companies to disclose information about their online safety practices or face financial penalties. Nonpayment of these fines can result in further legal action. Google was also warned for similar noncompliance but avoided a fine, as X’s noncompliance was deemed more severe.
Compounding these issues, X faced additional criticism from Australian researchers last month for disabling a feature that allowed users to report misinformation about elections. This action raised significant concerns as it coincided with a critical referendum aimed at enhancing the rights of Indigenous people in Australia.
X’s failure to comply with Australia’s eSafety Commission regulations underscores the ongoing challenges the platform faces in ensuring user safety and effective content management.