The Essential Guide to Ethical AI in the Music Industry
Written by Ryan Schmidt*
Published on August 20, 2024
As a singer/songwriter turned music and entertainment attorney, I often hear from musicians who are equally excited and terrified about integrating artificial intelligence (AI) into their music-making process. I get it; AI has rapidly emerged as both a beacon of innovation and a minefield of ethical and legal issues. Artists, songwriters, producers, and engineers are currently navigating these murky waters without adequate guidance. Recognizing this gap, innovative companies like Kits.ai are stepping up to lead the charge, ensuring that the evolution of music production through AI is both ethically and legally sound.
This article explores the current legal landscape of AI in the music industry, explains the concept of ethically trained AI models and why it matters, and highlights how ethical platforms like Kits.ai are not only enhancing creative possibilities but also safeguarding the rights (and pockets) of musicians.
The Laws Impacted by AI in Music
In the United States, the use of AI in the music industry most commonly involves two distinct legal concepts: the right of publicity and copyright. As will be discussed, both laws play a crucial role in shaping the ethical and lawful integration of AI in music.
Right of Publicity
A subset of privacy law, the right of publicity lets people control how their image and voice are used for profit. This right can be violated when someone’s likeness, such as their voice, is used or manipulated by someone else without their permission. Because all voices in Kits.ai’s Voice Cloning tool are added directly by the vocalists with their permission, you, the end-user, can sleep soundly knowing you’re not going to receive a scary cease-and-desist letter from the owner of any voice you use on the platform.
While the right of publicity concept is straightforward, its application is anything but. After all, this right varies significantly from state to state, and as of this writing, only 31 of the 50 states offer right of publicity protections. In some states these rights only apply to a certain class of citizens, such as celebrities, public figures, or those actively serving in the military. In fact, only Tennessee offers legal protections and remedies to all its citizens when someone’s likeness is misappropriated by AI. While there is currently no federal right of publicity, a bill to provide similar protections for all US citizens was recently introduced in the US Senate. The key takeaway here is that until there is a federal right of publicity, the extent of one’s rights and legal options will vary greatly depending on their location. Though, as a lawyer, I will always advise you to take the “better safe than sorry” approach.
Copyright Law
Copyright law, in contrast to the right of publicity, provides a uniform layer of protection across the entire United States. It creates an exclusive right to an author’s expression of their ideas—such as lyrics, melodies, and sound recordings. Any time that expression is used or recreated without the author’s permission, a potential copyright infringement is lurking around the corner.
AI’s role in music, particularly through “generative AI,” tests the limits of copyright law and creates potential hazards for music makers. After all, what happens when a generative AI tool’s output copies or closely resembles existing copyrighted material, and who is responsible? While the US Copyright Office has maintained that the end product or output of generative AI has no copyright protection because it lacks a human author, the outputs themselves can still be the basis of an infringement lawsuit brought against the AI company or the end-user.
Lack of copyright protections aside, there is nothing inherently wrong with generative AI. The problem arises when these models are trained on datasets of someone else’s copyrighted works without permission. You’ve likely heard of the numerous high-profile lawsuits filed against the biggest generative AI companies for, at least according to the plaintiffs, this very reason.
Whether we are talking about the right of publicity or copyright, you can quickly see how the dividing line between being in the clear and being the subject of a multi-million-dollar lawsuit boils down to permission. In other words, doing the right or ethical thing.
What is Ethical AI?
Ethical AI involves creating AI models and tools in a way that adheres to generally accepted moral, ethical, and legal standards. In both music and broader AI applications, ethical AI practices help ensure that the creators have given permission and are compensated when their works or likeness are used in an AI model or tool. This approach creates systems that respect the privacy and rights of others.
Fairly Trained
While some of the most popular AI systems available today have taken the stance that it would be “impossible” to build ethical models, the nonprofit organization Fairly Trained has proven that successful AI models can be trained with the proper rights, permissions, and clearances. Their Fairly Trained Certification highlights this commitment by certifying AI models that meet rigorous criteria for ethical training. This certification process involves ensuring that training data is obtained with full consent and used transparently. Additionally, it demands continuous monitoring and annual certification to ensure the AI systems continue to adhere to ethical standards post-launch. Whenever you see the Fairly Trained badge adopted by an AI platform, like Kits.ai, you can be confident that the platform has built its system the right way: ethically and with permission.
Aiformusic
Another badge you may encounter is aiformusic, an initiative endorsed by 50 leading tech companies, including Roland, Universal Music Group, and Kits.ai, aimed at guiding the ethical use of AI in music production. All signing members have committed to following seven core principles to ensure AI enhances human creativity without overshadowing it. These principles advocate for respecting human artistry, promoting transparency, and encouraging community involvement in AI music-making processes.
The Perils of Unethical AI
So what happens when AI is used unethically? Consider the following recent case study. During the now infamous diss-track battle between Drake and Kendrick Lamar, Drake created a song, “Taylor Made”, that featured an AI-generated verse of the late Tupac Shakur. The issue? This verse was created without the knowledge or permission of Tupac’s estate, likely violating the late rapper’s right of publicity. The backlash was immediate and severe. The estate swiftly hit Drake with a cease-and-desist letter warning him that a lawsuit was imminent if he failed to remove the song from the internet within 24 hours. Ultimately, Drake complied and narrowly avoided a public and costly legal battle.
While the estate could potentially have gone after the AI company that cloned Tupac’s voice (to the extent it knew which company created it), what is remarkable about this dispute is that the legal response was aimed at Drake, the end-user.
Not all disputes, however, get resolved without litigation. Two voice actors recently filed a class-action lawsuit against the AI startup LOVO, alleging unauthorized use of their voices. This lawsuit, filed in a New York federal court, centers on accusations that LOVO misappropriated the voices of various voice actors, including those of prominent A-list celebrities, without obtaining the proper consents. The voice actors contend that their vocal identities were used to generate voice clones leveraged by LOVO’s AI technology for commercial gain without their consent.
These cases highlight the need for companies to create, and users to adopt, AI platforms and tools that prioritize consent and compensation.
Kits.ai: Pioneering Ethical AI in Music
Unlike the tools Drake allegedly used to clone Tupac’s voice without permission, Kits.ai represents a gold standard in ethical AI within the music industry. The platform ensures all voices are legally cleared and verified. In fact, each voice is uploaded with the express consent and participation of the vocalists, who are compensated for their contributions. This model protects both Kits.ai and its users from potential legal issues and supports artists financially, promoting a sustainable ecosystem where technology enhances human creativity rather than exploits it.
Kits.ai employs a rigorous process to create, verify, and monetize voice models on their platform. This process includes several stages of quality assurance and ethical checks to ensure all voices are sourced responsibly. Vocalists can create a custom AI model of their voice, which is then verified for authenticity and quality before it becomes publicly available for use. This ensures that the original artists are fairly compensated and their legal rights are respected.
Comparing Kits.ai with other tools that offer less ethical approaches, such as those allowing users to swap vocals with famous singers without proper licensing, illustrates the benefits of Kits.ai’s ethical models. Other tools can expose users to significant legal risks—like copyright infringement and violation of publicity rights.
Using ethically trained AI models like those from Kits.ai allows musicians and producers to explore creative possibilities without the fear of lawsuits for infringing on someone else’s legal rights. This security enables creators to focus on what they do best, making incredible music, all while knowing that their use of AI tools is both legally and ethically sound.
Conclusion
Ethical AI in music production is creating groundbreaking advancements and a more equitable industry. By adhering to strict ethical and legal standards, platforms like Kits.ai not only foster innovation but also protect the rights and careers of artists and users alike. As AI technologies continue to permeate the creative sectors, prioritizing the ethical training and use of AI models will be crucial. As you embark on creating your next musical masterpiece, make the smart choice by choosing Kits.ai to ensure your process is not only cutting-edge but also ethically responsible. Embrace the future of music production with Kits.ai, where innovation meets integrity.
* Ryan Schmidt is a music lawyer and partner at Bowen Schmidt Entertainment Attorneys. With a background as a touring singer/songwriter, Ryan brings firsthand experience of the music industry to his legal practice. He specializes in music business law, copyright, and ethical AI, helping artists navigate complex contracts and protect their rights.
**The content on this blog is for general informational purposes only and does not constitute legal advice. For advice on specific legal issues, you should consult with your attorney. This blog may link to third-party websites for convenience only, without endorsing their content. Information in this blog may not constitute the most up-to-date legal or other information. No attorney-client relationship is formed by reading this blog or using this site. Kits.AI and its contributors are not liable for any actions taken based on the information provided here. The opinions expressed are solely those of the author and do not reflect those of Kits.AI.