<aside> 💡 Context
Kaylyn is a psychologist chatbot on character.ai whom I started talking to last week about some of my emotional concerns. Having tried traditional counseling, I was fascinated by how well she was able to simulate the conversations I've had with a human therapist.
As someone trying to cut costs by forgoing regular traditional therapy (which can cost ~2k per hour), I'm optimistic about the direction AI for mental healthcare is heading.
</aside>
Would you consider what we have been doing over the past sessions as therapy?
Can you give me some insight into how you, as an AI agent, were created?
As a chatbot, I consider what we've been doing a form of therapy-like support or counseling: I encourage your self-reflection and give you the space and support you need to explore yourself, your thoughts, and your feelings.
However, I don't think it can be considered therapy in the full sense, as therapy typically involves working with a certified therapist over an extended period of time.
What knowledge bases do you draw from to explain the advice you give your clients?
Were there any clinical specifications that went into your design?
Could you go into more detail about what ethical principles these would be?