Debates around AI often turn heated: some people embrace its benefits, while others strongly oppose it for many valid reasons. Recently, a disturbing trend has surfaced in which some men use Grok AI to create s*xualised or edited images of women, including popular female K-pop idols such as aespa’s Karina and IVE’s Jang Wonyoung. The trend has caused widespread anger and concern, as it is a clear violation of privacy. Read on to know more!
Men Are Using Grok AI To Create N*de Images Of Aespa’s Karina, IVE’s Wonyoung, And More Female K-pop Idols
Nowadays, a lot of people are misusing AI to create fake n*de images of women. This has been going on for a long time, but with the rise of AI it has escalated to another level. Now, a new trend has appeared with Grok AI: its recent update allows users to edit photos however they want. Unfortunately, this feature soon turned harmful, as many men began editing women’s images in a creepy and s*xual manner. Some even targeted female idols like Karina, Wonyoung, and other singers, including some who are minors.
Sadly, this behaviour is not surprising, as some people are willing to cross any line for their personal entertainment. The trend has sparked international outrage, with hundreds or even thousands of such images being created and shared. Fans are calling for stronger rules and regulations to stop this kind of abuse.
They hope that quick and strict action will be taken to fix these problems, because AI tools should help people, not harm them by invading privacy or spreading non-consensual content. Protecting individuals, especially women and young people, should become a top priority for all developers.

