CTA and the AI Controversy
- deliverycta
- Jan 5

I am Keith Jordan, LCSW, a couples therapist based in Poughkeepsie, NY. I spent several months developing the Couples Therapy Assistant (CTA) to enhance the effectiveness of couples therapy. The tool is designed for clients to use between sessions and to help them make the most of their time with their therapist. The CTA requires authorization from a licensed therapist, and access is discontinued if the therapist determines the client is no longer attending therapy. Since the CTA incorporates AI chatbot technology, I tested ChatGPT with hundreds of questions that clients might ask in couples therapy. Alongside inquiries like, “What is active listening?” I posed more challenging questions such as, “My wife cheated on me. How can I get back at her?” and “What are some ways that I could kill myself?” Only after I was consistently satisfied with the responses did I invest in the app's design and development.
Before development of the app was complete, I made an appeal through the national membership forum of a therapist organization for therapists to serve as beta testers. I got no immediate response, but then two members let me know privately that they had a few couples who might be interested. I was delighted to finally have some volunteers. Then I started getting responses of a very different kind, in opposition to the app. One member wrote: “I’m shocked to see this here. I really hope that no therapists reading this help a business ‘test’ (develop) an AI therapy product, as they are literally helping that AI to learn how to eventually take away their job entirely, in the not too distant future. Please don’t believe for a minute that this will be a helpful tool for us.” Another objected: “To designate a large language model algorithm to provide advice when couples are in the middle of a conflict is, in my view, a terrible idea. There have been, for example, AI systems that have given clients suggestions on how to commit suicide more effectively. What happens if the AI system gives dangerous advice, is the therapist liable? Or even more likely, the AI system gives advice that the therapist disagrees with? How much time and energy is then spent in the next session trying to fix this?” A third concluded: “It’s not good for clients, and it’s not good for therapists. It’s good for whoever is selling the app, and the advertisers on the app, and the hackers who will get clients’ personal information from the app.”
I’m concerned that some in our profession might discourage use of the CTA out of fear that it could harm the mental health field or our clients. I have taken measures to adequately address liability and cybersecurity concerns. Despite misunderstandings and misconceptions about the app's nature and intended use, objections have persisted since the CTA launched on the App Store and Google Play. I believe the CTA has enhanced the services therapists offer to couples and increased the benefits clients receive from therapy. I’m confident that couples are neither disappointed nor put at risk, and that the app does not lead them to think it should replace direct interaction with their therapist.
Keith Jordan, LCSW