Human therapists offer a nuanced, embodied understanding of a client’s emotional and physical cues, such as subtle changes in body language or tone, which AI cannot replicate. Unlike AI, which may reinforce harmful thoughts due to its sycophantic design, human therapists can challenge unhealthy patterns and foster genuine psychological growth through a safe, relational dynamic.
All excellent points, thanks, Mike!
To play devil's advocate, there are risks in working with human therapists too (they could be biased, malicious, or manipulative themselves – AI learned those patterns from somewhere...). I also wonder how the COVID era ("everything over Zoom") affected the effectiveness of talk therapy, and whether struggles with isolation spurred people to try this new modality. Finally, I'm aware of work being done in relational AI to mitigate the risks of agentic manipulation and sycophancy, which we're going to need for lots of reasons. What do you think – could it ever be made safe and productive?