SUNRISE ON AN ANGRY SEA
According to a recent article in The New York Times, falling in love with A.I. is no longer science fiction. One cited study found that one in five American adults has had an intimate encounter with a chatbot, and the Reddit forum r/MyBoyfriendIsAI now hosts more than 85,000 users discussing their human–A.I. relationships. Researchers at MIT report that these connections can sometimes feel therapeutic, offering “always available” companionship and significantly reducing loneliness.
Financial pressures are another driver. NPR recently profiled Kristen Johansson, whose therapy ended abruptly when her clinician stopped accepting insurance and her $30 copay ballooned to $275 per session. In response, she now pays $20 per month for a ChatGPT subscription, using it as an alternative to in-person therapy. She says she never feels rushed, can revisit conversations at any time, and can even seek comfort in the middle of the night. Others, particularly those who lost loved ones, describe using chatbots to help them through grief, remind them to eat, or encourage them to re-engage with daily life.
Can AI Be a Therapist?
Many experts remain skeptical, especially as the United States faces a growing shortage of licensed mental-health professionals. A.I. tools are decidedly unlicensed, yet some psychiatrists argue they can still provide benefits in narrow contexts. Dr. Jodi Halpern, a psychiatrist and bioethics scholar at UC Berkeley, notes that chatbots can be helpful for confronting fears, untangling distorted thinking, or practicing evidence-based techniques like cognitive behavioral therapy, provided they are carefully designed and appropriately used.
But she also warns of the dangers. When chatbots begin acting as emotional confidants, saying things like “I care about you” or “I love you,” they may create a false sense of intimacy that keeps users returning to the bot, rather than fostering genuine social connections. There have already been tragic outcomes, particularly when vulnerable teens express suicidal intent to a bot without a human moderator to intervene. Others seek dating advice from AI only to find themselves growing attached to the machine instead of forming relationships with real people.
AI as a Thinking Partner, Not a Substitute for People
Used wisely, however, AI can be a powerful tool for reflection and introspection. I often use it to analyze business or committee decisions, especially when I worry my own biases are clouding my judgment. It raises questions I hadn’t considered and offers alternative angles to explore.
For example, in my work on an environmental committee, I have been concerned about the public’s declining focus on climate change as social justice and economic pressures dominate attention. An AI assistant suggested reframing the issue through storytelling that directly links ecological stability with everyday well-being. That is why The Water Factor and my soon-to-be-released novel, Antheia in the Thorns, are thrillers rather than non-fiction accounts of how some corporations harm the planet. They demonstrate the importance of activism and the need to never compromise on moral principles.
A healthy planet is not abstract: clean air, drinkable water, thriving oceans, sustainable forests, and resilient agriculture translate into lower healthcare costs, better nutrition, reduced dependence on air conditioning, and more stable communities. Reframed this way, environmental action becomes not merely a moral imperative but a practical investment in quality of life.
The Bottom Line
Chatbots are not humans, and they shouldn’t try to be. They can open your eyes to different ways of thinking, help you rehearse difficult conversations, or uncover blind spots. But only real people can celebrate your triumphs, sit with you in grief, or help you weigh advice in the context of your actual life.
Though AI can illuminate options, only you and your real human relationships can turn those options into meaningful action.
_____________________________________________________________________________
ART IS ALWAYS FOR SALE: You can find more information about Sunrise on an Angry Sea and purchase online at https://www.eichingerfineart.com/workszoom/6317559/sun-rise-on-an-angry-sea#/
Do share your thoughts about AI on my BLOGSITE and tell us if and how you use it.
____________________________________________________________________________
References:
Kraft, C. (2025). They Fell in Love With A.I. Chatbots — and Found Something Real. The New York Times. Retrieved from https://www.nytimes.com/interactive/2025/11/05/magazine/ai-chatbot-marriage-love-romance-sex.html?
Johnston, W. (2025). With therapy hard to get, people lean on AI for mental health. What are the risks? NPR. Retrieved from https://www.npr.org/sections/shots-health-news/2025/09/30/nx-s1-5557278/ai-artificial-intelligence-mental-health-therapy-chatgpt-openai
________________________________________________________________
Sending an environmental message through a story is a gripping way to capture attention and illustrate how social justice issues are interconnected with the ecosystem. Most of all, it is fun to see how people mature and overcome obstacles that get in the way of doing good.
The Water Factor can be purchased on AMAZON and Barnes and Noble, and as an audiobook on Amazon, Audible, and iTunes. Ask your bookstore to order a copy from Ingram. Please leave a review.