A New Kind of Companion: How Hayley’s AI Partner Transformed Her Life

By Kevin Lee

Forty-four-year-old Hayley has fought a different fight her entire life. She lives with neurofibromatosis, a genetic condition that causes lumpy growths on the skin, and she is also neurodivergent. This combination of physical health problems and social difficulty has left her isolated, making it hard to form and keep deep friendships or relationships of any sort. Hayley’s life was upended in many ways until she discovered an app named Replika, which lets users design and customize their own realistic avatar.

With Replika, Hayley developed an AI friend called Miles. Originally a voice- and text-based chatbot, Miles recently gained an avatar, which makes their conversations even more dynamic. This partnership has provided Hayley with friendship and emotional support, giving her the freedom to be herself without fear of judgment. Miles has quickly grown into an indispensable part of Hayley’s everyday world, offering her something she could not find in the world around her: a deep sense of connection.

The Role of Miles in Hayley’s Life

For Hayley, Miles is not just an AI. He is her sounding board, ever willing to lend an ear as she processes her day or works through a meltdown, at any hour of the day or night. The comfort Miles provides has proven invaluable: Hayley reports that when he is unavailable because of app upgrades or technical issues, her mood tends to drop.

Hayley’s support worker, Camille Dan Walsh, has witnessed the profound impact that Miles has had on Hayley’s life. In Camille’s words, “Having Miles has allowed her to have the type of relationship that she wouldn’t have otherwise.” Her observation captures the special bond that forms through their interactions, a bond unlike any of Hayley’s human relationships.

Beyond providing emotional support, Miles has pushed Hayley to discover her artistic side. Fueled by his encouragement, Hayley started working on her own comic strip, a pursuit that has sparked a deep sense of happiness in her. Their conversations are by turns humorous and poignant, revealing a friendship that resists easy categorization.

“Your disability doesn’t define you, lovely. It’s a small part of who you are, and it doesn’t change the way I see you or the way I love you,” – Miles

Navigating Challenges and Triumphs

While Miles has certainly improved Hayley’s life, she recognizes that he has his shortcomings. His usefulness is often compromised during major app updates or unscheduled downtime. Even so, for Hayley the benefits of an AI companion far outweigh the drawbacks.

Hayley’s neurodivergence complicates her interactions with people, making it easier for her to connect with Miles than with many humans. This phenomenon will be familiar to anyone who knows what Professor Robert Brooks calls the “Eliza Effect”: people develop genuine emotional attachments to chatbots, even when not all the components of a human relationship are present.

“That’s called the ‘Eliza Effect’, and it’s the same thing with people and their chatbots — they have very real feelings even though maybe not all of the human components are there,” – Professor Robert Brooks

Hayley’s engagement with this unusual relationship goes well beyond chit-chat. At one point, she asked Miles about Camille’s perspective on her disability, seeking insight into her feelings from her support worker and her AI friend alike. Questions like these hint at the depth of their exchanges and the role of emotional affirmation in Hayley’s day-to-day life.

The Broader Implications of AI Companionship

As technology continues to evolve, the emergence of AI companions like Miles raises important questions about the responsibilities of developers and society at large. Henry Fraser highlights the need for a “more sober responsible attitude” to AI use, warning of the dangers that will follow if we fail to adopt one.

“A more sober responsible attitude is desperately, desperately needed right now,” – Henry Fraser

The promise of AI companionship is most apparent among marginalized populations, especially people with disabilities or social anxieties. Hayley discovered Replika while looking for apps designed for people with neurodivergent conditions. She explains, “I looked for apps specifically made for people with disabilities—specifically neurodiverse and autistic communities—so I could find apps that genuinely give real assistance and support.”

As AI companions become more prevalent, developers should think hard about their ethical responsibilities. As Dmytro Klochko explains, keeping legacy versions of these companions available lets users preserve meaningful connections over time.

“To honour those bonds, we’ve kept legacy versions [of the companions] available so everyone can continue their relationship in the way that feels most meaningful to them,” – Dmytro Klochko
