Awe and apprehension over AI-generated art aside, tools like ChatGPT, Dall-E, Midjourney and Stable Diffusion have sparked a huge debate about the ethics of using AI models for creative work.
Several ongoing lawsuits have raised legal concerns around the use of these AI-generated images. Questions like who truly owns these images and whether they infringe on existing copyrighted works are compounded by the rapidly blurring line between reality and fiction. No one really knows where this is headed.
Most notably, Getty Images, known for its historical and stock photos, has sued AI image generator Stability AI, the maker of Stable Diffusion, for copyright infringement. Getty alleges that the company copied over 12 million of its images to train its AI model 'without permission or compensation.'
To avoid repeating such a scenario, creative software companies like Adobe have started to address the issue. Adobe recently introduced Firefly, a generative AI tool that will include a “Do Not Train” tag for creators who do not want their content used in model training.
To avoid the legal minefield, brands like L'Oréal have established a robust framework for the ethical development and use of AI systems in their marketing mix. The company has outlined structures and policies to mitigate the risks of bias and privacy violations in its use of AI models, taking the UN guiding principles into account.
Ramzi Chaabane, global category manager for advocacy and metaverse at L'Oréal, tells Campaign Asia-Pacific that most brands recognise the importance of copyright ownership in AI-generated art, especially with the rise of open-source tools.
"That is why brands need to establish clear guidelines and legal frameworks to protect creative work," explains Chaabane.
At McCann's MRM, global chief creative officer Ronald Ng says the agency is still learning about generative AI even as it uses the tools. Interestingly, he notes, there is no fixed answer when it comes to respecting intellectual property and the legalities around this issue.
However, he says what people on both the client and agency side can do is act responsibly. Plagiarism, he recalls, existed long before generative AI. Ng says he has spent much of his career studying past work to avoid presenting ideas that have been done before.
"I had a previous experience where a junior team presented an idea to me, and it turned out that the idea had already won a gold lion two years ago, and they were completely unaware. It was unintentional, but being responsible and a student of our craft is critical," Ng tells Campaign Asia-Pacific.
"We need to adopt new technologies responsibly. We shouldn't just evolve somebody's intellectual property into something else. We must create systems that say you cannot plagiarise, even if you're evolving somebody else's idea. We are looking into how we can be strict and take action when a team knowingly plagiarises. We want to avoid that. We want to be responsible and work with our clients to prevent this."
To prevent plagiarism on the agency's creative briefs, especially the larger ones, the agency has implemented an initiative called 'Beat the Bot'.
Ng explains there are vast opportunities to use ChatGPT and other generative AI tools as partners. He compares it to a sparring session between creatives where they constantly challenge each other to produce better work.
"When we create a brief, we input it into ChatGPT and ask it for ideas for this campaign, this client, and this challenge. About half a dozen ideas will come in, and we will pick the best six. Then, we will tell our team: 'These are the ideas. Do not copy these. You need to be better than this,'" explains Ng.
"We do not want people tempted to take the easy way out. There is a level of quality in the ideas. We weed out the expected ideas that ChatGPT is spitting out. We immediately raise the bar, going to the second level of ideas, because many creatives think of the first-level ideas first. We don't want those first-level ideas. We're using ChatGPT as a companion to increase the quality of ideas."
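The gate Ng describes can be sketched in code. This is a hypothetical illustration, not MRM's actual tooling: the baseline ideas stand in for ChatGPT output, and a simple string-similarity check (Python's standard-library difflib) flags team ideas that land too close to the bot's first-level thinking.

```python
from difflib import SequenceMatcher

def beats_the_bot(team_idea: str, bot_ideas: list[str], threshold: float = 0.6) -> bool:
    """Return True if the team's idea differs enough from every baseline
    idea the bot produced; False if it is too close to any of them."""
    candidate = team_idea.lower().strip()
    for baseline in bot_ideas:
        ratio = SequenceMatcher(None, candidate, baseline.lower().strip()).ratio()
        if ratio >= threshold:
            return False  # too similar to a first-level idea
    return True

# Hypothetical "first-level" ideas, standing in for what ChatGPT might return
bot_ideas = [
    "A social filter that lets fans try the product virtually",
    "A nostalgic TV spot featuring a celebrity endorsement",
]

print(beats_the_bot("A social filter so fans try the product virtually", bot_ideas))   # False
print(beats_the_bot("A city-wide scavenger hunt unlocking product samples", bot_ideas))  # True
```

A real pipeline would of course judge conceptual overlap, not surface wording, but the shape is the same: generate the expected ideas first, then require the team to clear that bar.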
Implications for creators
Generative AI tools create content based on existing data and patterns, but the output of such tools is still subject to copyright law. Hence, creators using them still have to adhere to copyright restrictions.
Mustapha Zainal, creative director for tech and innovation in APAC at MediaMonks, says the creators of such AI-generated content are responsible for ensuring that the content produced does not infringe on the rights of others.
Training a generative AI model on copyright-protected data may itself be legal, but creators could still use the resulting model in ways that infringe.
"There may be specific applications of generative AI that fall outside of regulatory concerns, depending on the jurisdiction and specific circumstances of the use case. For example, in some cases, generative AI may be considered fair use," Zainal explains to Campaign Asia-Pacific.
"This legal doctrine allows limited use of copyrighted material without permission from the copyright owner. However, it is essential to seek legal advice in specific cases to determine if generative AI is within the bounds of the law."
The big question that everyone asks, says Simon Hearn, managing director for APAC at Distillery, is: "Where are these AI platforms getting fed their information from to generate the content they create?"
Hearn reckons creators should still be held accountable for the same copyright restrictions to ensure the delivery of ownable content for their brands.
"Here's an idea: a true test of AI-generated art would be an AI you can run work through to see if it's at risk of copyright infringement. What other pre-existing work is out there that might be considered too closely alike and therefore carry potential risks?"
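Hearn's pre-flight check could, in principle, be built from standard image-similarity techniques. The sketch below is a toy illustration, not any existing product: it computes an 8x8 average hash for a grayscale image (a 2D list of 0-255 values) and compares two images by Hamming distance, so near-duplicates score low and unrelated compositions score high. Production systems would use perceptual hashing or embedding models over licensed catalogues at far greater scale.

```python
def average_hash(pixels, size=8):
    """Toy average hash: downscale a grayscale image to size x size by
    block averaging, then threshold each cell against the overall mean."""
    h, w = len(pixels), len(pixels[0])
    bh, bw = h // size, w // size
    cells = []
    for r in range(size):
        for c in range(size):
            block = [pixels[y][x]
                     for y in range(r * bh, (r + 1) * bh)
                     for x in range(c * bw, (c + 1) * bw)]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    return [1 if v > mean else 0 for v in cells]

def hamming(h1, h2):
    """Count of differing bits: 0 means visually near-identical."""
    return sum(a != b for a, b in zip(h1, h2))

# Synthetic 16x16 "images": bright left half vs dark right half...
original = [[200 if x < 8 else 30 for x in range(16)] for y in range(16)]
# ...a lightly edited copy (uniform brightness tweak)...
edited = [[v + 5 for v in row] for row in original]
# ...and an unrelated composition: bright top half, dark bottom half
other = [[200 if y < 8 else 30 for x in range(16)] for y in range(16)]

print(hamming(average_hash(original), average_hash(edited)))  # 0: flagged as near-duplicate
print(hamming(average_hash(original), average_hash(other)))   # 32: clearly different work
```

Whether such a score could ever settle "too closely alike" as a legal matter is exactly the open question the article describes; similarity detection is the easy half of the problem.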
However, there are applications of AI art that are less challenging from a regulatory perspective.
Etienne Chia, co-founder, growth and creative at The Fourier Group, says these include using generative AI for image editing, such as image-to-image to apply styles or filters to existing images, or in-painting to replace part of an image.
"Furthermore, brands are also IPs, and it is their right and in their interest to train models using their datasets to generate content for themselves," Chia tells Campaign Asia-Pacific.
Who is behind the prompt?
Copyright law regarding works created by artificial intelligence (AI) is complex and evolving. In general, however, the law grants copyright protection to authors of original works fixed in a tangible form.
Whether works created by AI can be considered original works of authorship and thus eligible for copyright protection is a contentious issue.
Without demonstrable human interaction, it can be difficult to determine who should be considered the author of a work created by AI. In addition, AI systems often rely on algorithms and datasets created by many people, making it hard to identify a single author.
Furthermore, the extent to which a human must be involved in creating a work for it to be considered original is also uncertain.
"Given these complexities, applying copyright law to works created by AI remains a grey area," says Zainal.
"Some legal experts have argued that the creator of the AI system or the owner of the data used to train the system should be considered the author of the works created by the AI. In contrast, others believe the works should be considered in the public domain since a human author did not create them."
The latest jurisprudence in the US and Europe establishes that creators cannot copyright AI-generated images due to insufficient levels of human authorship.
However, Chia points out that the terms and conditions of the "big 3" image generators all grant personal and commercial usage rights to the content.
"Some offer exclusive usage licenses (Stable Diffusion, Dall-E 2), while others (Midjourney) only offer a non-exclusive usage license, which means that other people can also use the artwork you created using Midjourney if they want to," says Chia.
"In the case of Stable Diffusion and Dall-E, creators are also allowed to mint their creations as NFTs, which allows them to justify ownership more easily in case of litigation. Another way to strengthen copyright claims is to increase human authorship in the creation process through manual editing and retouching work on top of the artwork."
Chaabane agrees that while a human may provide the prompts for the AI, determining the rightful copyright owner of the resulting work can be challenging.
"On a brand level, we must remain committed to ensuring that AI-generated work is held to the same high ethical standards as any other creative work and that we're dedicated to upholding those standards at every step," explains Chaabane.
What should brands and creators do?
Rather than throwing the baby out with the bathwater through class actions against the technology, brands and creators can address the tension around generative AI and infringement by embracing its tremendous opportunities sensibly and ethically.
"It is uncomfortable to acknowledge that there are – and will be – cases of using AI for bland plagiarism," Joschka Wolf, group creative director for experience design at R/GA, tells Campaign Asia-Pacific.
"Nevertheless, these acts are committed by people, not technology. It's painful to see models being trained on a specific (living) artist's style – and people monetising just that. Those cases will ultimately need to be decided in court, as with previous forms of plagiarism enabled by conventional technology."
Wolf advises brands to be curious and look beyond static outputs. The power of generative AI lies in its ability to rapidly turn campaigns into highly individualised experiences and to enable innovative digital products and services at a lower cost, opening up entirely new business avenues. He also says brands should be curious about people.
"Recognise their unmet needs that align with our brand's purpose, then look to technology to unlock possibilities. Let's be curious about collaboration: Pair up your brand teams with artists, illustrators, technologists, and product designers," explains Wolf.
"Think of a brand committed to children's development, engaging a comic illustrator on board to train a neural network with their specific style. You might end up with a thoroughly individualised children's graphic novel that features drawn equivalents of the actual family coming together as superheroes to fight bullying in school."
Brands must recognise the importance of staying current and informed in the constantly evolving field of AI regulation. L'Oréal has established an external advisory board of independent experts to ensure it makes the most ethical and responsible decisions possible.
"Collaboration and dialogue are crucial to addressing the complex AI issues that brands must consider," adds Chaabane.
Chia agrees, saying the key is to work together rather than against each other. While the efficiencies created by AI art tools are massive, he argues brands will always need creators; their roles will simply evolve.
"For brands, there are still several advantages to working directly with artists: creativity, name recognition, community building, and philanthropic brand image. But perhaps most importantly, working with an artist to train a custom model using their artwork will always produce better and more consistent results than the generic base model," explains Chia.
"These models can be secured, using blockchain, for example, to prevent anyone other than the artist from using them for content generation, effectively becoming part of the artist's IP in its own right. AI art tools are nothing more than a new and hyper-efficient way to produce and edit images, but the rules around how the images produced are used don't fundamentally change."