How Creators and Marketers Can Protect Themselves While Using AI

The tech’s emergence underscores the importance of proper attribution and collaboration, according to Growth Stack Media’s Jordan Mitchell. 

The following is a guest piece by Jordan Mitchell, founder of integrated communications and content agency Growth Stack Media. Opinions are the author’s own.

Throughout my career, I have witnessed the challenges content creators face when striving to make a living pursuing their passion. While honing technical skills and storytelling abilities is crucial, it is equally important for creators to understand the business side of the creative marketing industry.

As emerging technologies like artificial intelligence (AI) become powerful enough to replicate human voices without consent, or to create stunning short films that would take skilled professionals far more time and resources to produce, proper attribution and collaboration between creators and the marketers who partner with them only become more important.

While legislation like the NO FAKES Act works its way through Congress, I don't believe it's enough to protect the integrity of branded content, be it generated by an individual creator, a marketer's internal team or a creative agency. To fill in the gaps, here are my tips on how creators and marketers can protect their rights, leverage AI tools effectively and position themselves for success in this new era.

Collaborate closely

Establish clear guidelines for content usage and attribution. Maintain open communication to ensure all parties understand and agree to the terms of use.

Clearly define the scope of work from the outset to establish a mutual understanding of the deliverables. If the client is only paying for specific content deliverables, creators are not obligated to provide source files or raw materials. It's important to align on how the content will be used, where it will be distributed and the licensing requirements.

Here's why: if a creator delivers the approved, final assets and the client starts making edits independently, that can constitute a breach of the agreement. The client should be paying the creator for those additional use cases, and going it alone is risky anyway, because the client may lack the technical skills to maintain the integrity of the content when making further adjustments.

It's also important to discuss distribution plans and to specifically request that the client tag you in social media posts when promoting the content externally. If they'd rather not, that's okay; instead, ask whether they're comfortable with you resharing the content on your own channels at a future date, and leverage the work you created for them as part of your portfolio.

If your deliverables will be used for internal purposes or specific offline use cases, ensure that the proper paperwork is in place, such as a non-disclosure agreement (NDA) to avoid confusion down the road. This applies to all forms of digital media.

Include clauses specifying that the creator must be credited, regardless of how their work is used. Negotiate for royalties or other forms of compensation when work is used in AI training or other derivative works.

Leverage technology

Contracts alone aren't enough; to protect your exported content, take advantage of the technical tools available. Make it a priority to embed metadata in all exported assets, so that your authorship and copyright information is carried within the media file itself.
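
To make this concrete, here's a minimal sketch of what that stamping step can look like, assuming the free ExifTool command-line utility is installed; the creator name, rights notice and export folder are hypothetical placeholders rather than a prescribed workflow.

```python
"""A minimal sketch: stamping authorship metadata into exported assets.

Assumes the ExifTool CLI is installed and on the system PATH. The
creator name, rights notice and folder below are hypothetical.
"""
import subprocess
from pathlib import Path

CREATOR = "Jane Creator"                                    # hypothetical creator
RIGHTS = "(c) 2024 Jane Creator. Licensed per signed SOW."  # hypothetical notice


def stamp_metadata(asset: Path) -> None:
    """Write standard EXIF and XMP fields into the file in place."""
    subprocess.run(
        [
            "exiftool",
            f"-Artist={CREATOR}",
            f"-Copyright={RIGHTS}",
            f"-XMP-dc:Creator={CREATOR}",
            f"-XMP-dc:Rights={RIGHTS}",
            "-overwrite_original",  # skip ExifTool's backup copy
            str(asset),
        ],
        check=True,
    )


if __name__ == "__main__":
    for asset in Path("exports").glob("*.jpg"):  # hypothetical export folder
        stamp_metadata(asset)
```

Keep in mind that many platforms strip embedded metadata on upload, which is why this works best alongside the contractual protections above, not in place of them.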

Content credential initiatives like TikTok's partnership with Adobe are a step in the right direction, but they can't prevent watermark removal or AI manipulation. Watermarks are easily removed with freely available editing tools, and AI can alter videos in ways that make tracing the original source extremely difficult.

While there's no single, perfect solution, blockchain technology shows a lot of promise. By using the blockchain to track the origin and ownership of content, creators can establish a transparent record of their work that's far harder to alter than traditional methods.

This is already happening with NFTs, which have helped creators protect their digital works by proving ownership and deterring unauthorized duplication. If similar blockchain-based solutions were applied to other types of content, they could help ensure that creators are credited whenever their work is used.
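
For the technically curious, here's a minimal sketch of the fingerprinting step such systems rest on: it computes a SHA-256 hash of a finished asset, which is what an on-chain registry would record instead of the file itself. The local JSON "ledger" and file name are hypothetical stand-ins, not any particular blockchain's API.

```python
"""A minimal sketch of content fingerprinting for provenance.

A real blockchain registry would write the fingerprint (never the
file) to a public chain; the JSON "ledger" here is a hypothetical
local stand-in so the idea runs end to end.
"""
import hashlib
import json
import time
from pathlib import Path

LEDGER = Path("provenance_ledger.json")  # hypothetical stand-in for a chain


def fingerprint(asset: Path) -> str:
    """Return the SHA-256 hex digest of the asset, read in chunks."""
    digest = hashlib.sha256()
    with asset.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def register(asset: Path, creator: str) -> dict:
    """Append an ownership record; like a chain, records are only added."""
    entry = {
        "sha256": fingerprint(asset),
        "creator": creator,
        "registered_at": int(time.time()),
    }
    records = json.loads(LEDGER.read_text()) if LEDGER.exists() else []
    records.append(entry)
    LEDGER.write_text(json.dumps(records, indent=2))
    return entry


if __name__ == "__main__":
    print(register(Path("final_cut.mp4"), "Jane Creator"))  # hypothetical file
```

Because even a one-pixel edit changes the hash, a match proves a registered file is byte-for-byte identical to the original, which is exactly the kind of tamper evidence a public chain makes hard to fake.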

Follow the legislation

The recently introduced NO FAKES Act aims to protect all individuals, including artists, musicians and actors, from having their likeness replicated by AI without permission. However, media attention on high-profile celebrity cases, such as Scarlett Johansson's concerns about ChatGPT seemingly replicating her voice, overshadows the vulnerability of everyday creators and the public at large.

Even creators who are comfortable with their voice and likeness being replicated would have a hard time receiving proper credit and compensation under the NO FAKES Act as it's currently structured. The government's track record on preventing piracy and enforcing copyright law is poor, particularly with new technologies consistently outpacing regulations.

The infamous file-sharing service Napster was one of the few copyright-infringement defendants of the past few decades to face real consequences, and that was back in 2001. If the government had difficulty cracking down on MP3 sharing, how can we expect it to keep pace with AI development without a stronger focus on public-private partnerships?

Stay ahead of the curve

Be aware that generative AI tools commonly train their large language models (LLMs) by scraping data, often using creators' original work without their knowledge. Before using generative AI, read the terms of service to understand how your data will be used. Be mindful of your prompts and the content you upload into these tools, as some terms automatically opt you in to letting the company reuse your content. If you are under an NDA, this could create problems for both you and the client.

AI-generated content is becoming increasingly sophisticated. For example, Luma AI recently released Dream Machine, an AI model that quickly creates high-quality, realistic videos from text and images. The technology has a remarkable grasp of real-world physics, producing videos nearly indistinguishable from live-action footage. As AI advances, distinguishing original content from AI-generated replicas will only get harder.

Stay informed about the latest AI technologies and their potential impact on the creative industry. Engage with professional organizations and advocate for creator rights by joining industry associations or unions, such as SAG-AFTRA, that represent creators' interests.

Look on the bright side

Despite the risks AI poses to creators, it's not all doom and gloom. AI tools can take on time-consuming tasks throughout the creative process, lowering barriers to entry and empowering more people to express themselves freely. Regardless of the content type, AI can help expedite each stage of production.

From generating ideas and outlines to creating rough drafts, enhancing visuals and selecting key excerpts from longer pieces, AI tools can be very helpful. However, the human element remains crucial: knowing how to use these tools effectively, bring everything together in a way that optimizes budget, and deliver unique work that makes a business impact by moving audiences to act or make purchasing decisions.

Ultimately, for AI to be a net positive for the creative community, we need to establish stronger protections for creators' rights. Ensuring proper credit and compensation is the foundation upon which a thriving, symbiotic relationship between human creativity and AI can be built.