Among the latest trends? AI-powered "kissing apps" are being advertised on platforms like TikTok and Instagram.
These apps allow users to upload real photos of people and generate fake videos of them kissing, all without the subjects' consent. With thousands of these apps flooding social media feeds, many teens are downloading and experimenting with them.
It's just one example of a broader, disturbing trend: the rise of free, easy-to-use AI tools that are being misused to target others, particularly women and girls.
Female students and teachers are disproportionately targeted, and in most incidents male students are the ones creating and distributing the content.
Recent incidents have already made headlines.
Two Victorian schools were rocked by fake nude scandals, and just last month, male students at a private Sydney school were caught selling explicit deepfake images of female classmates in online group chats for less than $5 a photo.
A new report by Our Watch warns that this growing trend is fuelling an increase in sextortion, blackmail and stalking.
As AI tools become more advanced and more accessible, the question now is: how do we protect students from this new wave of AI-driven harassment and abuse in schools?