Academy Article
|
November 10, 2025

Social Workers, Compassion, and Artificial Intelligence


Daniel Barron, MD, PhD, Medical Director of the Interventional Pain Psychiatry Program at Brigham and Women's Hospital, delivered the keynote address at the National Association of Social Workers (NASW) virtual forum, Compassionate Innovation: Social Work and Technology. In his remarks, he stressed the need to focus on standards and specifics when considering the growing use of artificial intelligence (AI) in mental healthcare settings, noting, "When we get in our heads, we lose touch with the concrete situation."

Given the increasing presence of AI in mental healthcare, Barron discussed how social workers might balance the potential benefits and risks of using AI in their practices. Barron also contributes his expertise as a member of the steering committee for the Academy's forthcoming publication, AI and Mental Health Care: Issues, Challenges, and Opportunities.

Barron has observed that while people often imagine worst-case scenarios or other hypotheticals when evaluating AI, it is most helpful to understand its use in specific contexts. When asked about their favorite and least favorite parts of clinical practice, social workers in the audience named connecting with and helping people as the best parts of the job, and documentation, billing, and claims adjudication as the worst. Barron acknowledged that AI can help social workers handle these more onerous tasks and allow them to devote more time to their clients. As he explained, "We need a more granular, pragmatic approach: a task-based approach that shifts the focus from what AI is to how to deploy AI tools to do what you want."

The best technologies, he maintains, address real problems. A task-based approach to AI allows social workers to address real problems without compromising patient care. It involves three basic steps:

  • define the task at hand, 

  • define the role of AI in completing the task, and 

  • compare solutions to carry out the task. 

This approach reframes ethical anxieties into risk analyses, giving practitioners concrete parameters to work within. 

Barron’s keynote ended with his thoughts on leading with compassion in the age of AI. Used appropriately, he advised, AI can be a tool that relieves suffering and closes gaps in access to mental healthcare rather than widening them. Social workers, and other healthcare practitioners, should weigh AI's potential to create equity against its risk of creating a two-tiered system of care.

Above all, Barron concluded, it’s important for practitioners to keep sight of their core goal: alleviating human suffering. Efficiency should never come at the cost of human connection, and technology should serve humanity, not lead it. 

Additional reflections on the role of AI in mental healthcare can be found in the Academy’s forthcoming publication, AI and Mental Health Care: Issues, Challenges, and Opportunities. This work builds on the Academy’s commitment to convening experts across disciplines to have challenging conversations about important topics impacting society. 

