Employer Alert: Resolution of AI-Focused Writers’ Strike Sets Precedent in Hollywood and Beyond

September 29, 2023

After 148 days, the Writers Guild of America (WGA) resolved its strike against several major Hollywood studios. As writers leave the picket lines, a new labor agreement will seek to limit the use and training of generative artificial intelligence tools in Hollywood, setting precedent for future labor agreements in a wide variety of industries.

On Sept. 27, 2023, the WGA, a labor union representing American screenwriters, and the Alliance of Motion Picture and Television Producers (AMPTP), which represents major television and film production companies, reached a tentative agreement to govern their relationship for the next three years. This pivotal agreement ends a strike that halted around $10 billion worth of media productions in 2023.

In July, in the wake of the Directors Guild of America (DGA) reaching a resolution with the AMPTP, McGuireWoods published an article discussing the WGA strike and other labor force friction involving artificial intelligence (AI), recognizing that “Employers can mitigate employee disenfranchisement risk by developing thoughtful AI policies and safeguards, while at the same time, promoting transparency and committing to develop and use AI responsibly.” While fears around AI contributed significantly to the WGA’s strike, safeguards around AI contributed significantly to its resolution.

The AI-Focused Strike

Alongside more traditional labor pain points, such as wages and benefits, the use of generative AI was integral to the WGA’s demands. Because generative AI can produce new text based on a universe of existing material, many writers feared that the technology would render their creativity and profession obsolete. The WGA firmly maintained in its demands that:

  • AI “can’t write or rewrite literary material.”
  • AI “can’t be used as source material.”
  • Agreement-covered material “can’t be used to train AI.”

The Negotiated Agreement

The WGA has been tentatively successful in obtaining its AI-related demands. The Memorandum of Agreement for the 2023 WGA Theatrical and Television Basic Agreement (MOA) includes a section regarding generative artificial intelligence, or GAI, which states that neither traditional AI nor GAI can be considered a “writer” and, “therefore, any written material produced by traditional AI or GAI shall not be considered literary material.” The MOA also clearly states that writers cannot be forced to use GAI to create what otherwise would be considered “literary material” if written by an individual. If a studio asks a writer to use GAI-produced material as the basis for writing or rewriting “literary material,” the studio must disclose to the writer that the material was produced by GAI.

GAI-produced material also will not be considered “source material,” meaning that working from GAI-produced material does not affect the writer’s compensation, writing credits or other rights. The MOA provides the following example:

Company furnishes Writer A with written material substantially in the form of a screenplay produced by GAI which has not been previously published or exploited and assigns no other materials. Company instructs Writer A to rewrite the GAI-produced written material. Company must pay Writer A no less than the minimum compensation for a screenplay under Article 13.A.1.a.(2), as well as no less than the amount specified in Article 13.A.1.a.(9), “Additional Compensation Screenplay — No Assigned Material.” The GAI-produced written material is not considered source material when determining writing credit to Writer A and will not disqualify Writer A from eligibility for separated rights.

The MOA is not without reciprocity, however. If writers choose to use GAI in their work, they must obtain consent from the company to do so and must abide by any company policies governing GAI programs. And while studios cannot require writers to use GAI programs that produce written material, the MOA outlines that writers can still be required to use other forms of AI in their work, such as programs that “detect potential copyright infringement or plagiarism.”

Most critically, both parties maintain leeway to assert their rights in the future, given the fast-changing landscape of the law and GAI technology. The MOA specifically notes that a writer is not barred “from asserting that the exploitation of their literary material to train, inform, or in any other way develop GAI software or systems, is within such rights and is not otherwise permitted under applicable law.” This is particularly relevant, as the limits of copyright and other intellectual property protections for AI-assisted inventions remain a hot issue.

More to Come…

The WGA is not alone in seeking to balance the benefits and risks of AI tools. A recent survey from McKinsey & Company reported that, while use of AI tools has grown rapidly during 2023, organizations also are carefully weighing the attendant risks, with 20% of responding companies affirming that they are implementing policies and procedures to monitor AI risk. Alongside concerns about hallucination, potential bias and intellectual property issues, as well as risks presented by state law developments and regulatory action, employers implementing AI tools should continue to assess those tools’ impact on their labor force and the benefits of a responsible use policy.

There is more to come from Hollywood and other creative professions. Although the WGA and DGA have left the picket lines, the Screen Actors Guild-American Federation of Television and Radio Artists (SAG-AFTRA) has yet to reach an agreement with the AMPTP, and the union has made its own GAI-related demands to limit the digital manipulation and replication of performers’ voices and likenesses. Like the AI-focused resolution for writers, an AI-focused resolution for actors and performers will continue to set precedent for employers’ labor agreements and potential safeguards for AI use. Stay tuned.