Opinion: To protect human artistry from AI, new safeguards might be essential

By T Bone Burnett and Jonathan Taplin
March 13, 2023 at 6:45 a.m. EDT

T Bone Burnett is a Grammy and Academy Award-winning producer, guitarist and songwriter. Jonathan Taplin is director emeritus at the USC Annenberg Innovation Lab and the author of “Move Fast and Break Things.”

Ten years ago, we participated in a panel discussion called “Rethinking Copyright” at the MIT Futures of Entertainment Conference. The panel’s program description said “it’s becoming painfully clear” that the current “conception of copyright is ill-prepared for regulating and making sense of a world where media content is fluidly circulated by most of a society.”

We argued against a tech-centered “information wants to be free” worldview, well aware that online file-sharing had nearly halved U.S. sales of recorded music, but little realizing the digital threats to creative professionals that lay ahead.

The work of visual artists and photographers is now regularly appropriated online, without permission or payment. Music streaming has provided a lifeline to many musicians, but plenty of others, along with countless artists in other creative fields, are hanging on by a thread, especially in the wake of the pandemic.

And now the rapid rise of generative AI systems such as ChatGPT for text, DALL-E for images and MusicLM for music is poised to cut that thread completely.

Big Tech companies are “training” generative AI models by stealing music, books, photographs, paintings and videos off the internet under the guise of copyright “fair use.”

Last month, Getty Images sued Stability AI, the maker of the popular image generator Stable Diffusion, not only for “ingesting” 12 million of Getty’s photos but also for altering or removing copyright management information from the images. That seems clearly unlawful, but it doesn’t mean the courts will agree. Big Tech has successfully skated through legal matters for years, waving the banner of “permissionless innovation.”

Only the biggest of Big Tech players will dominate generative AI, because it requires massive amounts of computing power. But copyright appears to be anything but a top priority for Google, Meta and Microsoft. So while human creators rationally explore and debate this issue, tech corporations are using their work to train the generative machines that ultimately may make the artists obsolete.

There are already more than 200 paperbacks and e-books offered by Amazon that list ChatGPT as an author or co-author. One writer boasted recently that he produced a sci-fi novel in three hours using ChatGPT.

The problem is not just the immiseration of artists’ incomes. AI operates on the theory that all the possible original ideas in the universe are already contained in the data sets it has ingested from the internet. So the intelligence of AI merely requires recombining this data into something that differs enough from the ingested work to avoid a copyright suit.

Surely, the world has enough formulaic artistic content already. What’s needed: keeping the spigot of original ideas and new thoughts open — to build a culture that goes beyond what is already found on the internet and can be enriched by artists who might grow in ways no AI could predict.

This is not to ignore the promise that AI holds to vastly improve human lives in areas such as agricultural planning, cancer screening and traffic routing. Even the creative arts will benefit from smart tools: in music, that might mean creating new sounds, improving streaming playlists and matching concert tours to audiences in new and more effective ways. But if the core creative act is radically diminished or replaced, that will stop culture in its tracks.

We realize that Big Tech’s plunge into AI has raised concerns that may make the endangering of artists’ rights seem pale in comparison. These include the troubling ways that Microsoft’s Bing chatbot has steered conversations, the likelihood that AI will supercharge disinformation and the possibility that AI-powered military weapons could become ungovernable “killer robots.”

Yet a nation’s cultural life is not a minor matter, and preserving artists’ rights is essential to ensuring their continued contribution. Any reasonable interpretation of existing copyright law ought to protect against abuses, but that doesn’t always happen. A case now before the Supreme Court involving the artist Andy Warhol’s unauthorized use of a photographer’s image of the musician Prince could dictate the direction copyright law will take in the coming years.

But another solution may be needed: new laws and regulations governing AI and safeguarding the human core of creative artistry.

As the physicist Stephen Hawking wrote, “If a superior alien civilization sent us a message saying, ‘We’ll arrive in a few decades,’ would we just reply, ‘OK, call us when you get here – we’ll leave the lights on’? Probably not – but this is more or less what is happening with AI.” That was nine years ago. It’s still happening.