
Eating Disorder Group Pulls Chatbot Sharing Diet Advice


A US organisation that supports people with eating disorders has suspended use of a chatbot after reports it shared harmful advice.

The National Eating Disorders Association (Neda) recently closed its live helpline and directed people seeking help to other resources, including the chatbot.

The AI bot, named “Tessa,” has been taken down, the association said.

It added that it would investigate reports about the bot’s behaviour.

In recent weeks, some social media users posted screenshots of their experience with the chatbot online.

They said the bot continued to recommend behaviours like calorie restriction and dieting, even after it was told the user had an eating disorder.

For patients already struggling with stigma around their weight, further encouragement to shed pounds can lead to disordered eating behaviours like bingeing, restricting or purging, according to the American Academy of Family Physicians.

“Every single thing Tessa suggested were things that led to the development of my eating disorder,” Sharon Maxwell, a weight inclusive activist, wrote in a widely viewed post on Instagram detailing an interaction with the bot, which she said told her to monitor her weight daily and maintain a calorie deficit.

“If I had accessed this chatbot when I was in the throes of my eating disorder, I would not have gotten help.”

In a statement shared with US media outlets, Neda CEO Liz Thompson said the advice the chatbot shared “is against our policies and core beliefs as an eating disorder organisation”.

The association had planned to close its human-staffed helpline on 1 June 2023, dismissing the staff and volunteers who had run the information and treatment options service since its launch in 1999. Officials quoted by NPR cited growing legal liabilities among the reasons for the switch.

Nearly 10% of Americans will be diagnosed with an eating disorder in their lifetime. The disorders often thrive in secrecy, and treatment can be costly or unavailable in many parts of the country.

Knowing this, Ellen Fitzsimmons-Craft, a professor of psychiatry at Washington University’s medical school, and her team set out to create a cognitive-behavioural tool that could offer prevention strategies for people with eating disorders.

She told BBC News the chatbot she designed was based on interventions that have been shown to be effective in reducing eating disorders and related behaviours.

“It was never intended to be a replacement for the helpline,” she said. “It was an entirely different service.”

Ms Fitzsimmons-Craft and her team gave the programme to Neda and a technology firm to deploy to clients last year. Since then, she said, she believes a “bug” or flaw was introduced into her original design to make the algorithm function more like newer AI tools such as ChatGPT. (Neda has said the bot is not run by ChatGPT and does not have the same functions.)

“Our study absolutely never had that feature,” she said. “It is not the programme that we developed, tested and have shown to be effective.”

The BBC has reached out to Neda and the health technology firm Cass for comment.

Abbie Harper, a former helpline staffer, told BBC News that days after helpline staff officially unionised, workers were told they would no longer have jobs.

“In the middle of our regular Friday staff meeting, the CEO and chairman of the board popped in to let us know we were being replaced with a chatbot and our jobs were being eliminated,” she said.

“Our jaws hit the floor. We knew Tessa existed, mainly for folks that had body image issues, but it has these pre-programmed responses. It’s not a person who’s engaged in empathetic active listening to you.”

Ms Harper, who is also recovering from an eating disorder, said talking to someone who shared her experience of the illness was a key step in her recovery and in combating the stigma and shame she felt.

A bot, she said, cannot offer the same support.

Source: BBC
