The Lensa app has gone viral in recent weeks, with much excitement over its new AI-driven “magic avatars” feature.
Small problem: It’s not exactly magic, nor is it purely artificial intelligence. To create these avatars, the app is apparently scraping artists’ images without their consent. In many cases the appropriation is so blatant that the Lensa-generated images even include the original artist’s signature. (See example above.)
“I think they didn’t think artists would stand up for themselves because we don’t [have] industry labels the way the music industry does,” Lauryn tells me. She points out that Stability AI, the company behind the Stable Diffusion model that powers Lensa’s feature, is very careful about how its music platform samples and trains from recorded music.
“The fact that they do so with their music model shows they are well aware of copyright (I mean it’s a basic concept, anyone who isn’t a little kid is aware of copyright), and that it’s not something that was too complicated to implement.”
Lauryn’s findings were noted yesterday by NBC News, but so far, Lensa’s corporate owner has only offered an incomplete reply:
[Lensa creator] Prisma issued a lengthy Twitter thread on Tuesday morning, in which it addressed concerns of AI art replacing art by actual artists. The thread did not address accusations that many artists didn’t consent to the use of their work for AI training. [Emp. mine – WJA]
“Basically what I want to see is a model where they are using art that’s either in the public domain, and art that has been voluntarily provided,” Lauryn tells me. “It’s not that AI art programs shouldn’t exist, but they need to not steal the work they are using to train their programs. An opt in program where royalties are granted when used is an ethical way of doing it.
“Consent, compensation, and credit. These are the things that are currently lacking… licensing the artwork and paying for usage would be the correct way to go about it. That way people are compensated for their work, and they actually have a say as to whether or not it’s even used in the first place.”
I’m cropping these for privacy reasons/because I’m not trying to call out any one individual. These are all Lensa portraits where the mangled remains of an artist’s signature is still visible. That’s the remains of the signature of one of the multiple artists it stole from.
— Lauryn Ipsum (@LaurynIpsum) December 6, 2022
You can see examples of Lensa blatantly scraping artists’ signed images compiled by Lauryn embedded above and below. Surprisingly, Stability AI is not only hoovering up artworks by independent artists like her, but also art owned by major, sue-happy companies like Disney. So I’d expect all this to come to a head soon.
Stability AI is happy to follow copyright laws for their music model, because they know that music labels will hold them accountable. So this seems like a good time to point out to larger companies like @WaltDisneyCo that their copyrighted material is being stolen and used too pic.twitter.com/sIVza7hN95
— Lauryn Ipsum (@LaurynIpsum) December 7, 2022
More about this topic on NBC News. To me this is yet another example of a phenomenon that’s become so common, it’s almost an adage at this point:
Whenever a company says it’s using “artificial intelligence”, look for unfairly treated humans behind the scenes.