On May 2, 2023, roughly six months after the launch of ChatGPT, screenwriters anxious about the use of AI in scriptwriting and development kicked off what became the Writers Guild of America's second-longest strike, lasting 148 days. Since the emergence of generative AI and throughout the strike action, one question has been on everyone's mind: Is AI coming for our jobs?
This piece tackles that very question and explores what happens when AI is brought up at the bargaining table, the concerns unions have, what protections they are asking for and securing, and how management can work with unions as a strategic partner.
One thing is, however, crystal clear: as interest in and concerns about AI grow, workers around the world are not waiting for robust regulations to safeguard their interests; instead, they are actively raising the issue at the bargaining table.
What Are the Concerns About AI in the Workplace?
ChatGPT, and AI more generally, clearly have several beneficial uses, but workers are concerned about their impact. Job displacement is a primary concern, and the WGA strike, which touched on this, is a prominent example.
Another concern unions have raised is that employers are often not transparent with employees about their use of AI. There have been instances where workers only learn about these AI tools at the bargaining table after submitting requests for information. Therefore, workers must pay attention to changes in their workplace to identify how AI use affects them, whether positively or negatively.
Surveillance and monitoring have also emerged as concerns with the use of AI in algorithmic management. Some companies reportedly use AI to monitor employee communications and sentiment. Along with employees feeling the need to compete and keep pace with AI, this can lead to increased stress that affects workers' mental and physical health.
Also, gig workers often report that they don't know how decisions are made or why they are assigned fewer tasks. They are monitored through tracking and delivery times and penalized for rejecting jobs. The ILO's platform economy report highlights this concern and informs ongoing discussions on a standard that will offer protections for platform workers.
Algorithmic bias and errors also raise concerns. For example, if a company implements an AI-driven performance evaluator to assess its call center agents, but the data on which the evaluator is trained involves agents who are predominantly white males, it could negatively score agents from other demographics, such as women and visible minorities, impacting their ratings, bonuses, and shift assignments.
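To make that concern concrete, here is a minimal, hypothetical sketch in Python of the kind of audit a union or employer could run on an evaluator's output: it compares average AI-generated scores across demographic groups and flags any group whose average falls below 80% of the best-scoring group's, a rough analogue of the "four-fifths" rule of thumb. The group labels, scores, and threshold are illustrative assumptions, not data from any real evaluation system.

```python
# Hypothetical audit sketch: compare AI-generated performance scores by
# demographic group and flag large gaps. All values below are made up
# for illustration; they do not come from any real evaluator.

from collections import defaultdict

# Toy records: (agent_group, ai_score out of 100)
evaluations = [
    ("group_a", 88), ("group_a", 91), ("group_a", 84),
    ("group_b", 62), ("group_b", 59), ("group_b", 65),
]

# Collect scores per group
scores_by_group = defaultdict(list)
for group, score in evaluations:
    scores_by_group[group].append(score)

# Average score per group
averages = {g: sum(s) / len(s) for g, s in scores_by_group.items()}
print("Average AI score per group:", averages)

# Flag a potential disparity if any group's average falls below 80% of
# the highest group average (an assumed threshold for this sketch).
best = max(averages.values())
for group, avg in averages.items():
    if avg < 0.8 * best:
        print(f"Potential disparity: {group} averages {avg:.1f} vs. best {best:.1f}")
```

Simple checks of this sort are the kind of analysis an AI subcommittee or an information request could surface before skewed scores start affecting ratings, bonuses, or shift assignments.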
How Unions Are Stepping In to Fill Governance Gaps
Unions have observed that many workers feel intimidated by the technical nature of AI and are uneasy when discussing their concerns with their employers. Nonetheless, unions are taking matters into their own hands by ensuring that these concerns are addressed at the collective bargaining table.
These discussions address a myriad of issues. For example, the Culinary Workers Union in Las Vegas negotiated a severance package requiring employers to pay $2,000 per year of service if an employee is laid off as a result of AI.
Also, following multi-day discussions between the WGA and studio executives, a collective bargaining agreement was reached that, among other things, established guardrails for the use of generative AI, ensuring that writers retain control over their work and decisions regarding AI usage, and that AI supports human writers rather than replacing them.
Ziff Davis Creators Guild has also ratified a collective bargaining agreement stating that there will be no layoffs or reductions in base pay due to generative AI. The agreement also provides for the formation of an AI subcommittee to evaluate AI use, and requires reasonable notice to the subcommittee before implementing AI.
Most recently, in May 2025, the Communication Workers of America reached a tentative contract agreement for quality assurance testers at the video game studio ZeniMax Media (a Microsoft subsidiary). The agreement secures protections, with ZeniMax committing to use AI solely to support employees and enhance productivity in ways that do not cause harm, and gives workers the right to appeal AI decisions to humans.
"Video games have been the revenue titan of the entire entertainment industry for years, and the workers who develop these games are too often exploited for their passion and creativity," Jessee Leese, a QA tester at ZeniMax and member of the ZeniMax Workers United-CWA bargaining committee, said in a CWA press release. "Organizing unions, bargaining for a contract, and speaking with one collective voice has allowed workers to take back the autonomy we all deserve."
Overall, trade unions involved in bargaining believe that AI significantly impacts the workplace. For them, the aim is not to hinder the use of AI, but to provide a voice for their members who want a seat at the table and an opportunity to work hand-in-hand with employers to ensure that AI use supports rather than harms employees.
Research indicates that bargaining over AI is in its early stages but is continuously growing in relevance. UC Berkeley is in the process of creating a technology bargaining inventory, "a structured, searchable resource built to support organizers, negotiators, researchers, and other advocates," says Lisa Kresge, lead researcher at UC Berkeley's Center for Labor Research and Education.
The inventory will include over 500 collective bargaining agreements covering private and public-sector unions across different industries.
Speaking on lessons from this research project, Kresge points out one interesting finding: "Unions are negotiating around specific workplace technologies, rather than negotiating around technology in general."
She explains that historically, contracts included pre-adoption language in the event that an employer adopts technology or if it affects union rights, but "what we're seeing a lot more of now is really very specific provisions around how employers can use specific technologies."
How Forward-Looking Leaders Can Engage Labor Unions as a Strategic Partner
Given the increasing use of AI in the workplace and workers' and unions' interest in shaping how AI is used, management needs to consider AI as a collective bargaining issue.
Here are five actions management can take to be equipped for this process:
- Help employees feel safe bringing up AI issues at the bargaining table: Given AI's technical nature, employees may not feel confident raising AI-related concerns, so proactive leaders should create opportunities for employees to share their concerns through AI or technology subcommittees consisting of management and workers.
- Engage employees and unions before rolling out AI systems and throughout the technology lifecycle: Before implementing AI systems, it is beneficial to engage employees and unions to understand their perceptions of these systems. This engagement should also continue after implementation to ensure that management is aware of key concerns or issues that may arise as employees interact with these tools.
- Invest in training leaders and the workforce on AI impacts: While there is interest in upskilling workers with relevant AI skills, it is also important to be aware of the risks and impacts of AI systems. Companies should, therefore, invest in AI literacy for management and staff to better understand and identify the benefits and risks associated with the AI tools employed by the company.
- Proactive transparency: Given the concern regarding employersâ transparency about which AI tools are being used and how they are implemented, management should take a proactive approach by sharing detailed plans and impact analyses prior to implementing any AI tool.
- Align bargaining proposals with public statements: If management has taken a stance championing responsible AI, this position should align with its actions at the bargaining table. A disconnect between managementâs public posture and its bargaining stance can undermine trust and credibility.
As AI tools proliferate and become embedded in business functions, and unions grow more vocal on the topic, companies can no longer afford to implement these tools unilaterally; they must partner with their workforce.