Model Accuses Clothing Brand Of Using AI-Generated Images Without Consent

A model who has previously worked with the Pakistani fashion brand Engine Pakistan has gone public with a claim that the company is using AI-generated images of her face to promote a new collection she was never photographed for and never agreed to appear in.
The model, identified as Tahir, said she first attempted to resolve the matter privately by reaching out to the brand, but received no response, prompting her to speak publicly.
“Using someone’s face through AI without their permission is not okay, in any situation,” she wrote. “Consent should always come first, whether the content is real or digitally created.”
Following the public attention generated by Tahir’s post, Engine removed the AI-generated images from its platforms. Tahir later confirmed the takedown in an update, though the brand has not issued a public statement addressing how the images were created or why they were used without her consent in the first place.
Cases like this are not isolated; they point to a growing tension in Pakistan’s fashion and advertising industry as AI image generation tools become more accessible. Brands can now produce photorealistic campaign visuals without booking a studio, hiring a photographer, or scheduling a model. But the technology also makes it trivially easy to use someone’s likeness without involving them at all, raising questions about consent, intellectual property, and the commercial exploitation of a person’s face.
A face is a professional asset. When a brand can generate campaign imagery using AI without booking or paying the person whose likeness appears in it, it undermines both the economic relationship between talent and brands and the basic principle that individuals control how their image is used commercially.
The legal landscape around AI-generated likenesses remains largely undefined in Pakistan. There is no specific legislation governing the use of a person’s face in AI-generated commercial content, and existing intellectual property and privacy frameworks were not written with generative AI in mind. In markets where these issues have been litigated, such as the United States, courts have increasingly recognized a “right of publicity” that protects individuals from unauthorized commercial use of their likeness, but enforcement remains inconsistent even there.
That Engine removed the images only after public pressure, and not in response to Tahir’s private outreach, underscores how little recourse individuals currently have when their likeness is used without permission in AI-generated content.
Tahir said she hopes the matter sets a precedent and emphasized that her concern extends beyond a single brand to the broader principle that consent should be non-negotiable regardless of whether content is photographed or digitally generated.



