Employers have been turning to media synthesis machines in some of the most sensitive domains, with dire consequences. In one particularly galling example, the National Eating Disorders Association (NEDA) attempted to replace its workforce, a set of volunteer and paid coordinators and hotline operators, with a chatbot. This happened after NEDA workers, exhausted by the uptick in work during the COVID-19 pandemic, had voted to unionize under the name Helpline Associates United. Both paid workers and volunteers at NEDA faced intense workloads and, despite working for an organization dedicated to mental health support, received very little support themselves. Two weeks after unionizing, they were summarily fired for organizing together, a violation of U.S. labor law.
Soon after, a poorly tested chatbot called "Tessa" was brought online. According to its creator, the chatbot was intended to provide a limited set of responses to a small number of questions about issues like body image. But "Tessa" quickly proved an impoverished replacement for workers, offering disordered-eating suggestions to the very people contacting the hotline for help. Eating disorder advocate and fat activist Sharon Maxwell documented how the chatbot offered "healthy eating tips," suggesting that she could safely lose one to two pounds a week by counting calories. Tips like these are hallmarks of enabling disordered eating. The chatbot was quickly decommissioned, and the NEDA hotline has since been taken completely offline, creating a major gap in mental health services for those struggling with disordered eating. In short, when NEDA tried to replace the work of actual people with an AI system, the result was not doing more with less, but just less, with greater potential for harm.