Adobe's new AI feature lets artists train models on their own art, but is this a gift or a Trojan horse for harvesting unique styles?
Did you hear the one about the fox guarding the henhouse? Because Adobe just pulled off a similar trick, only this time, they’re after your precious art. They’ve announced a “game-changer” for their Firefly AI: now you can train it on your own art. On the surface, it sounds like a dream come true for artists, a digital genie granting endless creative wishes. But here’s what nobody is telling you about this supposed act of creative liberation – the fine print is where the devil dances.
Adobe, ever the master of branding, is positioning this new “Custom Models” feature as a beacon of “ethical and artist-centric AI.” They want you to believe they’re finally giving artists control, letting them scale their unique vision without compromising integrity. Sounds lovely, doesn’t it? Almost too lovely, like a suspiciously sweet deal you just know has a catch.
But let’s peel back that shiny veneer for a second. The internet, particularly the artist corners of Reddit and X, is already screaming “theft” and “data grab.” Why? Because many see this as Adobe’s clever way to harvest more unique datasets, disguised as empowerment. Are we really supposed to believe that a company whose foundational AI models were trained on vast and often questionably sourced datasets is suddenly going to become the patron saint of individual artistic style? It strains credulity, to say the least.
“Train on MY art? So they can flood Stock with my clones and paywall my essence?” – A common sentiment echoing across artist forums.
This isn’t about empowering you as much as it is about them getting more fuel for their AI engine. What happens when your “custom model” data subtly influences their broader AI capabilities? Does anyone actually believe Adobe won’t find a way to monetize or leverage this invaluable data beyond your immediate use? To think otherwise is, frankly, naive. This feels less like a gift and more like a carefully baited trap.
Adobe’s Chief Technology Officer, Ely Greenfield, probably thinks this is a “game-changer.” And sure, for enterprise clients who want to maintain rigid brand consistency across a thousand AI-generated ads, it absolutely is. Imagine a world where every single visual element from a major brand is perfectly on-message, generated at lightning speed. Great for corporations, maybe not so much for the visual diversity of our world. It’s a sterile, predictable future.
But for individual artists, this “control” comes with a chilling caveat. You’re feeding your unique artistic fingerprint directly into a system. While Adobe promises security and individual ownership, the history of AI and intellectual property is a minefield. What if your style, once uploaded, becomes part of a larger, more ambiguous pool of data that can be subtly replicated or even “learned” by others? The very thing that makes you unique could become just another data point for the algorithm, diluted and generalized. It’s a terrifying prospect for anyone who values their distinct voice.
Maria Rodriguez, President of the Artists’ Rights Coalition, has it right when she says, “While I appreciate the effort to give artists more control, the larger questions about AI’s impact on human creativity and copyright are far from settled. We need to remain vigilant.” This isn’t just about your personal workflow; it’s about the future of artistic identity itself. Are you willing to gamble your artistic soul on a company’s vague promises?
Let’s be blunt: Adobe isn’t doing this purely out of the goodness of its heart. This move feels less like innovation and more like strategic maneuvering. With AI companies facing a barrage of lawsuits over copyrighted training data, offering artists the ability to train on their own work is a brilliant PR move. It attempts to clean up their image, positioning Firefly as the “ethical” choice, while competitors like Midjourney are still battling it out in court. It’s a masterful deflection, a shiny object to distract from the deeper issues.
But here’s the kicker: even if you train Firefly on your own art, the underlying foundational models that power Firefly were still trained on something. And that “something” is where the original sin of AI art lies for many. This “custom model” feature is less a confession and more a distraction. It’s a shiny new toy designed to make you forget the messy origins of the entire AI art movement, like painting over a crack in the wall instead of fixing the foundation.
The creative software market is a multi-billion dollar behemoth, and Adobe wants to keep its chokehold. By integrating AI so deeply and “ethically” into their ecosystem, they’re making their tools even more indispensable. For artists already struggling to make a living, are they really expected to opt out of a tool that promises efficiency, even if it feels like selling a piece of their soul? The pressure to conform, to adopt the “latest and greatest,” is immense, and Adobe knows it.
So, before you excitedly upload your entire portfolio to Adobe’s new “Custom Models” feature, ask yourself: are you truly gaining control, or are you just handing over the keys to your creative kingdom? The choice is yours, but choose wisely, because once your art is in their system, there might be no getting it back.