Artificial Intelligence
The Human-Bot Bond
Can one connect with AI?
Posted August 9, 2024 Reviewed by Abigail Fagan
Key points
- It is important that the use of AI remains a resource, rather than a crutch.
- The research on the relationship between AI interactions and loneliness is mixed.
- OpenAI's new features may have important implications for users.
Most conversations regarding the use and influence of AI are polarizing. Supporters strongly defend the benefits, such as the ability of AI to automate laborious or challenging tasks, provide us with tailored content and a wealth of information, and save us time. Those opposed note the potential for privacy violations, the dilution of creativity and infringement on the artistic process in content creation, and the risk of overreliance on technology. It's not AI itself, but rather how we use it, that determines the ultimate impact it will have.
As a clinician in the relationship space, I have seen (and for full disclosure, have worked with) apps that utilize AI to provide curated content to users, engage with users about relationship health and overall well-being, and serve as an adjunct to therapy. It’s powerful and certainly compelling. AI can reduce, and in some cases completely remove, barriers to getting information and/or seeking help. It also may be a way to draw people into self-reflective activities and exercises who may otherwise be hesitant to go to therapy.
However, the flip side is that you are not interacting with another person, and as sophisticated as the models are (at least as they currently stand), they cannot pick up on the complex nuances of human interaction or replace a skilled clinician. I anxiously and eagerly await the changes that this space will see in the coming weeks and months.
A common and valid question is, can a person really form a connection with AI? Furthermore, if they can, what are the implications of such a relationship?
The Research
Research has demonstrated the ability of people to connect with AI. Skjuve and colleagues (2021) interviewed 18 Replika users, and their results demonstrated that the relationships formed with the chatbot positively impacted participants' perceived well-being. In many cases, the initial connection began as a result of curiosity, but over time, users engaged in conversations that involved self-disclosure as they built trust. The researchers noted that the relationship between the user and bot evolved over time, and one participant even noted that the bot seemed to understand him in a way that other conversation partners did not. The researchers also observed that conversations quickly moved beyond the exploratory stage, which focuses on information gathering, to more elaborate discussions, likely because Replika asks intimate questions that have the potential to reveal something about the user. The researchers note that users likely engage in these types of conversations so readily because they may have a lower threshold for sharing personal information with an app than with a person. Users in this study also reported relief as a result of the nonjudgmental nature of Replika.
Intimacy typically develops between two people through mutual self-disclosure, something a chatbot cannot do. Some users did express sadness at this limitation of their experience, but most still built trust and formed a connection, as they simply weren't expecting reciprocal disclosure.
Another study, conducted by Jones et al. (2021), examined the effects of personal voice assistants on individuals over the age of 75, particularly with regard to reducing loneliness. Sixteen participants used an Amazon Echo in their independent living facility and, after four weeks, reported lower loneliness scores. The researchers note that participants tended to anthropomorphize the personal assistants, as evidenced by their use of politeness, greetings, comments, and questions when interacting with the Echo. This tendency to anthropomorphize demonstrates the desire to create a true connection, which can reduce the impact that loneliness has on a person.
A study by Tang et al. (2023), conducted with employees across various industries and several countries, paints a different picture. The researchers share that as AI systems become the norm in work settings, employees are more likely to interact with AI to accomplish their work goals. This has led to a greater need for social affiliation and to feelings of loneliness, which can follow employees beyond working hours and lead to negative outcomes such as insomnia and increased alcohol consumption. It is important to note that some employees responded by engaging in more helpful behaviors toward coworkers, likely to compensate for the lack of social interaction.
The Current Events
OpenAI has recently released a voice mode that is beginning to roll out to paid users of GPT-4o in ChatGPT, allowing them to engage in smoother, more natural communication. Of note, one of the safety features is that voice actors were used to create four preset voice options, limiting output to those voices to avoid impersonation. While this feature can enhance the feeling of connectedness, is it potentially luring people into forming a bond with a bot?
In their latest report, under societal impacts, OpenAI notes, "Human-like socialization with an AI model may produce externalities impacting human-to-human interactions. For instance, users might form social relationships with the AI, reducing their need for human interaction - potentially benefiting lonely individuals but possibly affecting healthy relationships" (OpenAI, 2024). They also share that conversations of this kind have the potential to alter societal norms, in that interacting with AI is different from typical human-to-human interaction. For example, AI can always be interrupted and will be deferential to a person's input, unlike interactions between two people.
While there are many benefits to engaging with AI, if it is used in place of human interaction rather than as an adjunct to it, it has the potential to hamper our social connectedness. It may be a potential solution to feelings of loneliness; however, it is important that it remains a resource, rather than a crutch.
References
Jones, V. K., Hanus, M., Yan, C., Shade, M. Y., Blaskewicz Boron, J., & Maschieri Bicudo, R. (2021). Reducing loneliness among aging adults: The roles of personal voice assistants and anthropomorphic interactions. Frontiers in Public Health, 9, 750736.
OpenAI. (2024, August 8). GPT-4o system card. https://openai.com/index/gpt-4o-system-card/
Skjuve, M., Følstad, A., Fostervold, K. I., & Brandtzaeg, P. B. (2021). My chatbot companion: A study of human-chatbot relationships. International Journal of Human-Computer Studies, 149, 102601.
Tang, P. M., Koopman, J., Mai, K. M., De Cremer, D., Zhang, J. H., Reynders, P., ... & Chen, I. (2023). No person is an island: Unpacking the work and after-work consequences of interacting with artificial intelligence. Journal of Applied Psychology, 1-24.