Fashion Forward Friday: DEAR CORPORATE AI OVERLORDS

Regulations today

A new Senate bill, the Transparency and Responsibility for Artificial Intelligence Networks (TRAIN) Act, would give creators a way to find out if their work was used to train AI. The act would enable copyright holders to subpoena the training records of generative AI models if the holder can declare a “good faith belief” that their work was used to train the model. AI developers would only need to reveal training material “sufficient to identify with certainty” whether the copyright holder’s works were used.

Sen. Peter Welch said, “We need to give America’s musicians, artists, and creators a tool to find out when A.I. companies are using their work to train models without artists’ permission.”


Napster AI

If you’re my age, you will remember Napster. Both Napster and Gen AI are disruptive technologies that allow widespread access to content. Napster facilitated the easy sharing of digital music files without copyright permissions, enabling widespread piracy. Gen AI can generate new content like text or images, raising concerns about copyright infringement.

Napster went away, and its concept in our culture evolved; the music industry had to adapt to digital music distribution, and today, the creative industries need a new legal framework to address copyright issues related to Gen AI content.

What could that look like?

Currently, the subscription-based Gen AI models are not paying the artists whose work was used to build the models. They are receiving heavy investments to continue, and they are being sued. (I would add a link to the lawsuits, but there are so many.)

Other services, like Adobe’s Firefly, are paying artists. The long-term outcomes of these compensation plans have yet to be seen.

If AI companies could purchase licensing for training, models could be trained for private employers (e.g., a Disney model, a Nike model), which would allow corporations to maintain brand integrity because, let’s be honest, at this point only the big guys can afford to buy in.


So what about us?

What does the future hold for the small screen printers exploring AI and learning how to integrate it into their business? More regulations for sure.

Also, looking back at Napster: it too saw rapid early adoption. A court ordered Napster to show that it could restrict access to infringing material and keep track of user activity on its network. Napster shut down in July 2001 when it could not comply.

When Napster was shut down, users were left without a way to access and share music freely, leading many to either stop sharing music altogether or switch to other peer-to-peer file-sharing services (like LimeWire) that emerged later. Eventually, legitimate music streaming services became widely available. The Recording Industry pursued legal action against some individual users who were sharing large amounts of copyrighted music.

In 2003, the Recording Industry sued 261 people for sharing songs on peer-to-peer (P2P) networks, offering defendants the option to settle by destroying their illegally acquired files and paying a small amount per song. It sued thousands more individuals in the following years.

Not regulated does not mean legal. Current attempts at regulation are ongoing and targeted at AI companies (as they should be). Ideally, a resolution will make the process usable and profitable for artists.


Open Letter to OpenAI

OpenAI gave a bunch of artists early access to explore and test Sora, the generative video AI we looked at way back in May. I don’t think OpenAI planned for this response from the artists: an open letter addressed to DEAR CORPORATE AI OVERLORDS.

In short, they say,

ARTISTS ARE NOT YOUR UNPAID R&D
☠️ we are not your: free bug testers, PR puppets, training data, validation tokens ☠️

Any artist can join and add their signature here. This is not an anti-AI campaign, but it is calling on Gen AI companies to value artists. Perhaps something more nuanced than AI is capable of?
