Big Questions About Artificial Intelligence: Thisfursonadoesnotexist

Much of the fun of internet drama comes from its frivolousness, but sometimes an online shitfest points to something bigger. Last week, the AI-powered furry art site thisfursonadoesnotexist did just that, igniting a fandom firestorm while also highlighting an important debate about digital art.
Arfa, the programmer behind thisfursonadoesnotexist, says he used the same GAN (generative adversarial network) architecture behind the site thispersondoesnotexist to generate around 186,000 furry portraits. When he posted the project on Twitter last Wednesday, dozens of commenters rushed to weigh in.
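Arfa hasn’t published his training code, but the adversarial setup he describes can be sketched in miniature. In the toy below (pure NumPy; every name, number, and hyperparameter is invented for illustration), a two-parameter “generator” learns to mimic a 1-D target distribution while a logistic-regression “discriminator” learns to tell real samples from fakes. It’s the same push and pull that, at vastly larger scale and with deep networks, produces fursona portraits.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    # Clip to avoid overflow warnings in np.exp for extreme arguments.
    return 1.0 / (1.0 + np.exp(-np.clip(x, -60, 60)))

# Generator: x_fake = a * z + b  (so fakes are distributed N(b, a^2))
a, b = 1.0, 0.0
# Discriminator: D(x) = sigmoid(w * x + c)
w, c = 0.1, 0.0
lr = 0.05

for step in range(2000):
    z = rng.standard_normal(64)        # noise fed to the generator
    real = rng.normal(4.0, 1.0, 64)    # "real" data: N(4, 1)
    fake = a * z + b

    # Discriminator ascends log D(real) + log(1 - D(fake)).
    d_real = sigmoid(w * real + c)
    d_fake = sigmoid(w * fake + c)
    w += lr * (np.mean((1 - d_real) * real) - np.mean(d_fake * fake))
    c += lr * (np.mean(1 - d_real) - np.mean(d_fake))

    # Generator ascends log D(fake) (the non-saturating GAN loss),
    # chaining the gradient through fake = a * z + b.
    d_fake = sigmoid(w * fake + c)
    a += lr * np.mean((1 - d_fake) * w * z)
    b += lr * np.mean((1 - d_fake) * w)

# b should drift toward the real data's mean (4.0) as training proceeds.
print(round(b, 2))
```

The generator never sees the real data directly; it only gets the discriminator’s gradient, which is why a trained GAN tends to produce variations on its training distribution rather than verbatim copies.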
While many were fascinated by the project, some in the furry community objected to Arfa’s unauthorized use of art from the furry forum e621.net as training data. At least one person tried (and failed) to find proof that the algorithm was copying images from e621.net outright. And within days, the entire site was slapped with a DMCA copyright infringement complaint. (According to Arfa, the company in whose name the DMCA notice was issued denied filing it and asked that it be withdrawn.)
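Hunting for outright copies among 186,000 generated images is itself a technical problem. One common approach (not necessarily what Arfa’s critics tried) is perceptual hashing: reduce each image to a coarse fingerprint, then compare fingerprints by Hamming distance. Here is a toy average-hash over tiny grayscale grids, with all pixel values invented:

```python
def average_hash(pixels):
    """Threshold each pixel against the image's mean brightness."""
    flat = [v for row in pixels for v in row]
    mean = sum(flat) / len(flat)
    return [1 if v >= mean else 0 for v in flat]

def hamming(h1, h2):
    """Count the positions where two hashes disagree."""
    return sum(x != y for x, y in zip(h1, h2))

original = [[200, 200, 10, 10],
            [200, 200, 10, 10],
            [10,  10,  10, 10],
            [10,  10,  10, 10]]
near_copy = [[v + 5 for v in row] for row in original]  # slight brightness shift
unrelated = [[10, 200, 10, 200] for _ in range(4)]      # different pattern

d_copy = hamming(average_hash(original), average_hash(near_copy))
d_other = hamming(average_hash(original), average_hash(unrelated))
print(d_copy, d_other)  # → 0 8
```

A distance of 0 flags the near copy despite the brightness shift; a real pipeline would hash full-resolution images and tune a distance threshold, which is part of why “proof of copying” is harder to produce than it sounds.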
Some degree of backlash is understandable. Furry fandom has long been a close-knit community of independent creators supported by individual commissions.
A project aimed at mass-producing fursonas—using original art as training material, no less—could be seen as a threat to creators’ livelihood. Some commenters accused Arfa of disrespect and asked for the option to opt out of the project. Others complained that their work had been uploaded to e621 without their permission in the first place.
The creator of thisfursonadoesnotexist thinks it would’ve been impossible to contact all the artists involved. Arfa told Gizmodo that he scraped 200,000 images that were then narrowed down to a 55,000-image training set representing approximately 10,000 different artists—creators who may go by different names now or have left the fandom entirely.
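Arfa hasn’t described his filtering criteria in detail, but narrowing 200,000 scraped images to a 55,000-image training set implies some automated triage. The sketch below guesses at the shape of such a filter; the e621-style metadata fields (“score,” “rating,” dimensions) and the thresholds are assumptions for illustration, not his actual pipeline.

```python
# Hypothetical scraped posts with e621-style metadata (all values invented).
scraped = [
    {"id": 1, "artist": "a1", "score": 120, "rating": "s", "width": 512, "height": 512},
    {"id": 2, "artist": "a1", "score": 3,   "rating": "s", "width": 512, "height": 512},
    {"id": 3, "artist": "a2", "score": 80,  "rating": "e", "width": 512, "height": 512},
    {"id": 4, "artist": "a3", "score": 45,  "rating": "s", "width": 100, "height": 400},
]

def keep(post, min_score=10, min_side=256):
    """Keep well-rated, safe-rated images big enough to train on."""
    return (post["score"] >= min_score
            and post["rating"] == "s"
            and min(post["width"], post["height"]) >= min_side)

training_set = [p for p in scraped if keep(p)]
artists = {p["artist"] for p in training_set}
print(len(training_set), len(artists))  # → 1 1
```

Even a crude filter like this discards most of a scrape, which is consistent with a 200,000-to-55,000 reduction, and it never needs to know who the roughly 10,000 artists are, only the metadata attached to each post.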
According to Arfa, he’s more than willing to take an image down from thisfursonadoesnotexist if it clearly copies an original character, but he says he has yet to see credible evidence of that.
In defense of the AI’s originality, the site has produced a collection of mushier fursonas whose delirious weirdness inspired a flurry of memes. “Some of these have designs that are so… specific? Holistic?” a commenter on Hacker News wrote, linking to a fursona with a tail sticking out of her head and an adorably half-formed feline mouth. Do these Cronenberg-esque misfit furries, with their wild-eyed gazes, scream “LOVE ME” or “SAVE ME”? The art world adores liminality—that’s value added right there.
Furry artists aren’t alone in facing the dilemma of digital manipulation. Just last month, Jay-Z filed DMCA takedown notices against a YouTuber who used speech synthesis software to make his voice read the Book of Genesis and cover Billy Joel’s “We Didn’t Start the Fire.”
While experts explained to Gizmodo that Jay-Z’s issue isn’t copyright, since copyright doesn’t cover speech patterns, both incidents suggest a future where machine learning art is widespread, even commonplace. In such a future, can an artist’s original work be used as training material? If so, to what end? (In Jay-Z’s case, YouTube ultimately allowed the videos to stand.)
One significant hurdle an AI art lawsuit would need to clear is evidence of actual copyright infringement, not just use. To be actionable, an infringing work must be “substantially similar” to the source work, a determination that would be up to the courts. But if an AI does infringe on a copyrighted work, who’s responsible?
Art world attorney Nicholas O’Donnell compared the question to those raised when PETA tried to get a copyright on behalf of a monkey that took a viral selfie. A U.S. court dismissed that case, finding that works produced by animals can’t be copyrighted. “One of the things that [decision] spotlighted was the necessity that an author is a person,” O’Donnell said. But is the AI the monkey or the camera?
It could be a bit of both—and neither. AI itself can’t be held liable for infringement, but unlike a monkey or a camera, you can train AI not to infringe. It’s ultimately up to the programmer to ensure that the training data is broad enough so that the program doesn’t predictably spit out a similar work.
“If the program is being used as a tool, then the identity of the alleged infringer is someone who’s using those tools to potentially infringe on a protected work,” O’Donnell said. “And certainly, the narrower the scope of the program’s target, the more one could see that being argued.
What if I take a program, and all I put in front of it is one protected work—and then I say, ‘the program [infringed], not me,’ that doesn’t really fly.” When asked whether a sample image generated by the site looks “substantially similar” to images found on e621, O’Donnell said, “I don’t know. I don’t know. It’s not jumping off the page.”
“The AI is learning, but it is also copying,” Jon Garon, who teaches intellectual property, cybersecurity, and technology law at Nova Southeastern University, told Gizmodo. To survive a legal challenge, an artistic AI would need to be trained in the same way that an art student learns not to copy a masterwork. If it duplicates a work in its database, then the developer could be held responsible.
In the case of thisfursonadoesnotexist, Garon said that one possible plaintiff is Disney. Given the amount of fanart that can be found on e621, it’s unsurprising that some of the images on the site look suspiciously like copyrighted corporate characters: Nick Wilde and Judy Hopps from Disney’s Zootopia, even an unholy amalgam of the two; Rouge the Bat from Sonic the Hedgehog; Nala from The Lion King.
Intermediary copiers don’t matter, Garon explained: if someone steals his song and puts it on YouTube, and the song eventually makes it onto a pop star’s album, he could still sue the pop star.
“If [a program] is being shown a million images of Mickey Mouse, the AI will have the potential to create a work that is very different from any of the particular images,” Garon said. “But since all of the images are variations on the same copyrighted Mickey Mouse, it’s going to necessarily copy Mickey Mouse. And then Disney would have a case directly against the programmer of the AI.”
The infringement issue flips if an AI randomly generates a nearly identical version of a work that wasn’t used to train its algorithm. If the AI had never seen a version of the work, it cannot infringe upon it.
“If their work was never copied at all, there’s no copyright claim,” Zahr Said, professor at the University of Washington School of Law, explained to Gizmodo. Said pointed to Judge Learned Hand’s famed 1936 statement in a copyright case against Metro-Goldwyn Pictures:
…if by some magic a man who had never known it were to compose anew Keats’s Ode on a Grecian Urn, he would be an “author,” and, if he copyrighted it, others might not copy that poem, though they might of course copy Keats’s.
Hand was referencing the idea of independent creation, when two authors happen to create the same work by chance. “It’s rarely found, but in a hypothetical in which the AI never had the work copied into the sample set, it couldn’t be infringing,” Said said.
But even if the AI did produce a substantially similar image to one in its sample set, Said continued, you’d have to take into account the commonality of elements (in copyright lingo, “scènes à faire”) that are essential to a genre and therefore not copyrightable—elements like large eyes and rounded snouts.
“If you’re creating work in a boarding school wizard genre, there will be teachers with witch hats and flying brooms,” Said added. The case gets trickier when you name a student “Harry Potter.”
AI is a broad field, and a copyright case would depend on how the technology is being used. When AI is used for scholarly purposes, courts have tended to rule that it needs, and can fairly use, training data. The case would be viewed differently if, say, you were selling an AI-generated painting for half a million dollars (which artists have done).
Said said that every case depends on the alleged infringer’s purpose, and lawyers often advise defendants to make that clear from the outset. (For his part, Arfa told Gizmodo, “I’m just a guy trying to show people some cool tech.”)
Said said that because thisfursonadoesnotexist calls itself a “project” and, through its URL, references a previous project created to show how AI works, a fair use finding is more likely. “If there were facts showing that this programmer was selling ads or downloads from the site that would allow people not to pay licensing fees that they were habitually paying to furry artists—that they’re supplanting demand—the worse it looks for the defendant.”
“You can do a lot of damage to a market even if you’re not charging money for your infringing copy,” Said added.

It’ll take more than one AI-generated database to supplant the highly valued artists of the furry fandom, however. Commercial ventures have tried and failed to monetize furries in the past—and will keep trying. Others have taken thisfursonadoesnotexist at face value: a depersonalized bucket of elements, to be recycled as inspiration for improvement and play.
Fundamentally, these questions about authorship and originality aren’t new, stretching back thousands of years to the very beginning of Western art. Like a photocopier, modern technology has simply taken art’s oldest predicaments and made them bigger and blurrier.
Unfortunately for us, a cigar-waving Orson Welles isn’t here to pace through the mist in a cape and pontificate on furry forgery. Instead, courts will have to wade through these issues—giving the rest of us plenty to argue about in the meantime.