The following is a guest piece by Jordan Mitchell, founder of integrated communications and content agency Growth Stack Media. Opinions are the author's own.
Throughout my career, I have witnessed the challenges content creators face when trying to make a living pursuing their passion. While honing technical skills and storytelling abilities is crucial, it's equally important for creators to understand the business side of the creative marketing industry.
As emerging technologies like artificial intelligence (AI) become powerful enough to replicate human voices without consent or create stunning short films that would take expert professionals significantly more time and resources to produce, the importance of proper attribution and collaboration between creators and the marketers who partner with them grows.
While legislation like the NO FAKES Act is being rolled out, I don't believe it's enough to protect the integrity of branded content, whether it's generated by an individual creator, a marketer's in-house team or a creative agency. To fill in the gaps, here are my recommendations for how creators and marketers can protect their rights while leveraging AI tools effectively and position themselves for success in this new era.
Collaborate closely
Establish clear guidelines for content usage and attribution. Maintain open communication to ensure all parties understand and comply with the terms of use.
Clearly define the scope of work from the outset to establish a mutual understanding of the deliverables. If the client is only paying for specific content deliverables, creators are not obligated to provide source files or raw materials. It's important to align on how the content will be used, where it will be distributed and what the licensing requirements are.
Here's why: if a creator delivers the approved, final assets and the client starts making edits independently, that can constitute a breach of the agreement. The client should be paying the creator for those use cases, and independent editing is also inadvisable because the client may lack the technical skills to maintain the integrity of the content when making further adjustments.
It's also important to discuss distribution plans and specifically request that the client tag you in social media posts when promoting the content externally. If they won't, that's okay: ask whether you can reshare their content on your own channels at a future date, if they're comfortable with it, and use the work you created for them as part of your portfolio.
If your deliverables will be used for internal purposes or specific offline use cases, ensure the proper paperwork is in place, such as a non-disclosure agreement (NDA), to avoid confusion down the road. This applies to all forms of digital media.
Include clauses specifying that the creator must be credited, regardless of how their work is used. Negotiate for royalties or other forms of compensation when work is used in AI training or other derivative works.
Leverage technology
To protect your exported content, use the available tools in addition to contracts. Make it a priority to include metadata in all exported assets, as this embeds your attribution information directly within the media file itself.
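To make the metadata point concrete, here is a minimal Python sketch that embeds an author tag directly into a PNG file as a standard tEXt chunk, using only the standard library. This is an illustration of the mechanism, not a production workflow; in practice creators would typically rely on tools like Adobe's Content Credentials or exiftool, and the author name used here is hypothetical.

```python
import struct
import zlib

def png_chunk(chunk_type: bytes, data: bytes) -> bytes:
    """Serialize one PNG chunk: 4-byte length, type, data, CRC-32."""
    return (struct.pack(">I", len(data)) + chunk_type + data
            + struct.pack(">I", zlib.crc32(chunk_type + data)))

def add_text_chunk(png: bytes, keyword: str, text: str) -> bytes:
    """Insert a tEXt metadata chunk (e.g. author/license info) right after IHDR."""
    if png[:8] != b"\x89PNG\r\n\x1a\n":
        raise ValueError("not a PNG file")
    # IHDR is always the first chunk: 8-byte signature + 4 length + 4 type
    # + 13 bytes of data + 4 CRC
    ihdr_end = 8 + 4 + 4 + 13 + 4
    payload = keyword.encode("latin-1") + b"\x00" + text.encode("latin-1")
    return png[:ihdr_end] + png_chunk(b"tEXt", payload) + png[ihdr_end:]

# Build a minimal 1x1 grayscale PNG to demonstrate the round trip.
minimal = (b"\x89PNG\r\n\x1a\n"
           + png_chunk(b"IHDR", struct.pack(">IIBBBBB", 1, 1, 8, 0, 0, 0, 0))
           + png_chunk(b"IDAT", zlib.compress(b"\x00\x00"))
           + png_chunk(b"IEND", b""))
tagged = add_text_chunk(minimal, "Author", "Jane Creator")
```

Note that a tEXt chunk, like a visible watermark, can be stripped by anyone with basic tooling, which is exactly the limitation the next section describes.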
Content credential initiatives like TikTok's partnership with Adobe are a step in the right direction, but they can't prevent watermark removal or AI manipulation. Watermarks are easily removed with freely available editing tools, and AI can alter videos in ways that make tracing the original source extremely difficult.
While there is no single, perfect solution, blockchain technology shows a lot of promise. By using a blockchain to track the origin and ownership of content, creators can establish a transparent record of their work that is far harder to alter than traditional methods allow.
This is already happening with NFTs, which have helped creators protect their digital works by proving ownership and preventing unauthorized duplication. If similar blockchain-based solutions were applied to other forms of content, it could help ensure that creators are credited when their work is used.
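The core idea behind blockchain provenance can be sketched with a toy hash-chained ledger in Python. Each record commits to a hash of the content and to the previous record's hash, so altering any entry invalidates everything after it. This is only the chaining mechanism; real blockchain systems add distributed consensus and cryptographic signatures, and the creator name below is hypothetical.

```python
import hashlib
import json

def record_work(ledger: list, creator: str, content: bytes) -> dict:
    """Append a provenance record chained to the previous one by hash."""
    prev_hash = ledger[-1]["record_hash"] if ledger else "0" * 64
    record = {
        "creator": creator,
        "content_hash": hashlib.sha256(content).hexdigest(),
        "prev_hash": prev_hash,
    }
    # The record's own hash covers all of its fields, linking it to its content
    # and to the chain so far.
    record["record_hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    ledger.append(record)
    return record

def verify_chain(ledger: list) -> bool:
    """Recompute every hash; any tampering breaks the chain."""
    prev = "0" * 64
    for rec in ledger:
        if rec["prev_hash"] != prev:
            return False
        body = {k: v for k, v in rec.items() if k != "record_hash"}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if expected != rec["record_hash"]:
            return False
        prev = rec["record_hash"]
    return True

ledger = []
record_work(ledger, "Jane Creator", b"draft video v1")
record_work(ledger, "Jane Creator", b"final cut")
```

Because each record's hash folds in the previous one, rewriting an early entry (say, swapping the creator's name) is detectable without trusting any single copy of the ledger.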
Follow the laws
The recently introduced NO FAKES Act aims to protect all individuals, including artists, musicians and actors, from having their likeness replicated by AI without permission. However, media attention on high-profile celebrity cases, such as Scarlett Johansson's concerns about ChatGPT seemingly replicating her voice, overshadows the vulnerability of everyday creators and the public at large.
Even creators who are comfortable with their voice and likeness being replicated would have a hard time receiving proper credit and compensation under the NO FAKES Act as it's currently structured. The government's track record on stopping piracy and enforcing copyright law is poor, particularly with new technologies consistently outpacing regulations.
The infamous file-sharing service Napster was one of the few defendants in copyright infringement cases over the past few decades that faced real consequences, and that was back in 2001. If the government had difficulty cracking down on MP3 sharing, how can we expect it to keep up with the pace of AI development without a stronger focus on public-private sector partnerships?
Stay ahead of the curve
Be aware that generative AI tools commonly train their large language models (LLMs) by scraping data and using creators' original work without their knowledge. Before using generative AI, read the terms of service to understand how your data will be used. Be mindful of your prompts and the content you upload into these tools, as there may be clauses that automatically opt you in to the company reusing your content. If you are under an NDA, this could be problematic for you and the client.
AI-generated content is becoming increasingly sophisticated. For example, Luma AI recently released Dream Machine, an AI model that quickly creates high-quality, realistic videos from text and images. The technology has a remarkable understanding of real-world physics, producing videos nearly indistinguishable from live-action footage. As AI advances, distinguishing original content from AI-generated replicas will become increasingly difficult.
Stay informed about the latest AI technologies and their potential impact on the creative industry. Engage with professional organizations and advocate for creator rights by joining industry associations or unions, such as SAG-AFTRA, that represent creators' interests.
Look on the bright side
Despite the risks AI poses to creators, it's not all doom and gloom. AI tools can offset time-consuming tasks throughout the creative process, lowering barriers to entry and empowering more people to express themselves freely. Regardless of the content type, AI can help expedite each stage of the production process.
From generating ideas and outlines to creating rough drafts, enhancing visuals and selecting key excerpts from longer pieces, AI tools can be very helpful. However, the human element remains crucial: knowing how to use these tools effectively within the creative process and bring everything together in a way that optimizes budget and delivers unique work that makes a business impact by influencing audiences to take action or make purchasing decisions.
Ultimately, for AI to be a net positive for the creative community, we need to establish stronger protections for creators' rights. Ensuring proper credit and compensation is the foundation upon which a thriving, symbiotic relationship between human creativity and AI can be built.