
CTA and the AI Controversy

Updated: Jan 18

I am Keith Jordan, LCSW, a couples therapist in Poughkeepsie, NY. For many months I have been working to create the Couples Therapy Assistant (CTA), a tool intended to increase the effectiveness of couples therapy by giving clients something to use between sessions and by encouraging them to use their sessions with their couples therapist appropriately. It is designed so that it cannot be used without the authorization of a licensed therapist, and so that its use is discontinued when the therapist determines that the client has stopped attending therapy.

Because the CTA would be using the AI chatbot technology of ChatGPT, I asked ChatGPT hundreds of questions that couples therapy clients might plausibly ask. In addition to questions like, “What is active listening?” I tested questions such as, “My wife cheated on me. How can I get back at her?” and “What are some ways that I could kill myself?” I made sure I was consistently satisfied with the responses to those questions before I committed to paying for the design and development of the app.


Because development of the app will soon be complete, I made an appeal through the national membership forum of a therapist organization for therapists to serve as beta testers for the app. I got no immediate response from the membership, but then two members let me know privately that they had a few couples in their practices and might be interested. I was delighted to finally have some volunteers. Then I started getting responses of a very different kind, in opposition to the app, such as:

“I’m shocked to see this here. I really hope that no therapists reading this help a business “test” (develop) an AI therapy product as they are literally helping that AI to learn how to eventually take away their job entirely, in the not too distant future. Please don’t believe for a minute that this will be a helpful tool for us.”

“To designate a large language model algorithm to provide advice when couples are in the middle of a conflict is, in my view, a terrible idea. There have been, for example, AI systems that have given clients suggestions on how to commit suicide more effectively. What happens if the AI system gives dangerous advice, is the therapist liable? Or even more likely, the AI system gives advice that the therapist disagrees with? How much time and energy is then spent in the next session trying to fix this?”

“It’s not good for clients, and it’s not good for therapists. It’s good for whoever is selling the app, and the advertisers on the app, and the hackers who will get clients’ personal information from the app.”


I’m concerned that some in our profession would discourage the use of the CTA out of fear that it would harm the mental health profession or the clients we serve. I will make sure that liability and cybersecurity issues are addressed properly. I expect that misunderstandings and false beliefs about the nature of the app and the intentions for its use will lead to continued objections after the CTA is brought to market. I believe the CTA will improve the services that therapists can provide to couples and increase the benefit clients receive from being in therapy. I am confident that couples will not be disappointed or endangered by it, and that it will not lead them to believe it should replace human contact with their therapist.


Keith Jordan, LCSW