Australia may push Apple to block AI apps under new age verification rules

Summary

Australia is extending its age-verification regime to AI applications, building on last year's law restricting social media access for users under 16. The initiative aims to create safer online environments for younger users and reflects the country's broader push on digital safety.

Key Insights

What specific requirements do the new Australian age verification rules impose on AI services?
From March 9, 2026, AI services such as ChatGPT must prevent Australians under 18 from accessing explicit material, including pornography, extreme violence, and content promoting self-harm or eating disorders. Non-compliance carries fines of up to A$49.5 million (about US$35 million).[1][3][4]
Sources: [1], [2]
How might Apple be involved in enforcing these AI age verification rules?
Australia's eSafety Commissioner could require app stores, including Apple's App Store, to block access to non-compliant AI services, since the stores act as gatekeepers controlling key points of access.[1][3][4]
Sources: [1], [2], [3]