A reflection on the impact of Artificial Intelligence on human relationships

by Victor Reyes


Man confronts Robot (AI-generated image). Source: Vecteezy

“The Bible teaches that every man and woman is created out of love and made in God’s image and likeness (cf. Gen 1:26). This shows us the immense dignity of each person, ‘who is not just something, but someone.’”

- Pope Francis, 2015, Laudato Si’ #65, Encyclical

 

“I’ll be back.”

- The Terminator

The words “I’ll be back,” delivered by Arnold Schwarzenegger in his thick Austrian accent in the movie The Terminator, were the catchphrase of my generation. As the Terminator in the eponymous film, Schwarzenegger portrayed a killer robot sent back to 1984 from 2029 by a self-aware and misanthropic Artificial Intelligence (AI) network. When the movie first came out, I was confident this was all make-believe. There was no chance that AI would ever destroy humanity…or was there?

Arnold Schwarzenegger as the Terminator (1984). Source: Vecteezy

AI has been with us since the 1930s, when mathematician Alan Turing first described a machine that would be able to read, write, and improve its own programming — one that could “learn.”[1] AI has evolved dramatically since those early years. The development of generative AI — with the power to take information and create new content similar but not identical to the original data[2] — was a game changer. The widespread accessibility of generative AI, particularly with ChatGPT’s release in 2022, has opened unprecedented possibilities — along with significant risks. On one hand, AI can now execute an artist’s vision and create digital art from a few key words.[3] On the other, it has the power to spread misinformation through the creation of deepfakes: extremely realistic photos, videos and other forms of digital media.[4] AI has impacted society so considerably that on January 28, 2025, the Vatican released a doctrinal note, Antiqua et Nova, that reflects on “the anthropological and ethical challenges raised by AI.”[5] In many ways, this doctrinal note is our generation’s Rerum Novarum, Pope Leo XIII’s pivotal 1891 encyclical that addressed the dehumanizing impacts of rapid industrialization.[6]

A 2025 study published in the Harvard Business Review shows that AI usage is rapidly evolving as well, and notes that the top three uses of AI are now therapy and companionship, organizing one’s life, and finding a purpose in life. This represents a significant shift from 2024, when generating ideas was the primary application.[7]

From the perspective of human relationships, AI presents tremendous opportunities and risks. As it becomes more sophisticated at simulating conversation and companionship, we need to ask ourselves: Will it enhance our capacity for genuine human connection, or will turning to AI for companionship — out of loneliness, fear of rejection or anxiety — pervert our desires and destroy our ability to see human dignity in one another? How do we use AI to enhance human interconnectedness without diminishing it? Examining a few principles of Catholic Social Teaching — specifically dignity, subsidiarity, solidarity and the common good — may provide a moral compass for navigating the brave new world we’re living in.

Human dignity is central both to Catholic Social Teaching and to concerns about AI replacing human connection. If each person’s inherent worth is rooted in being created in God’s image, how can an artificial system replace people and human connection? AI may be able to process information and generate responses that feel conversational, but it lacks the soulfulness and uniqueness of each human encounter.

Many people, especially the young, now turn to AI chatbots or Intelligent Social Agents (ISAs) to assuage loneliness. Replika alone has over 25 million users, making it one of the most widely used ISAs in the world today, and has been shown to reduce suicidal ideation in some young people.[8] However, Replika has also been used for erotic conversations, and is alleged to objectify and sexually harass some of its users.[9]

Authentic human connection involves respecting the dignity of others, and as social beings, we can only truly explore what it means to be human in real relationships with one another. The fact that vulnerable people — the elderly, the lonely, those with mental health issues — seek solace in the company of AI reflects how we have lost sight of their human value and dignity, leaving artificial companionship as their only recourse.

A man chats with an AI chatbot on his phone. Source: Vecteezy

Subsidiarity is the principle whereby higher-level institutions (i.e., the state, various levels of government, and other structures controlled by those with power and privilege) should support and strengthen the fundamental units of society, not take away their autonomy.[10] As the technocratic elite integrates AI inextricably into educational systems, corporations and government, one could argue that AI itself has now become a higher-level institution. As such, it should not be seen as a substitute for families, friendships and communities, nor allowed to take away our freedom to form authentic relationships with one another. AI should be a tool that enhances our ability to maintain and deepen human relationships, not replace them. It can help us remember important details about loved ones, facilitate communication across language barriers, or manage practical tasks that free us for deeper connections. AI can be a tool to help us love people better, not an escape from the messiness, challenges, and fulfillment of real human connections.

Solidarity can help us challenge any tendency to retreat into artificial relationships that shield us from the sometimes difficult work of genuine human connection. AI should not take away our moral obligation to enter into authentic relationships with other people, especially those who experience loneliness and marginalization. Solidarity demands that we engage with complexity, conflict, and need — critical elements that AI relationships often help us avoid. Authentic relationships are also bound by faith, hope and love, graces that are not possible without God. When artificial relationships become easier and more pleasant than human ones, we lose our ability to find God in other people and to empathize with and love the most marginalized people in society.

The common good asks us to consider how AI affects society as a whole and not just individual relationships. We learn how to respect and love one another through regular interaction. These are the cornerstones of healthy families and communities, as well as a fully functioning democratic society. If AI companions reduce or even replace human social interactions, we face the possibility of the atrophy of our social relations. A generation that learns conversation primarily from AI might struggle with the unpredictability, emotional complexity, and mutual vulnerability that characterize authentic human relationships. This would weaken social and societal bonds as emphasis is put on otherness instead of togetherness, and respect for fellow human beings becomes secondary to the beliefs and values fed to us by AI.

The principles of Catholic Social Teaching offer a framework for embracing AI's genuine benefits while preserving what is most essential about human connection. This is not a rejection of technological advancement, but rather the integration of innovation within a complete understanding of what it means to be human.

The threat of AI isn’t that it will become self-aware and destroy humanity, as depicted in The Terminator. It becomes a problem when we use it as a substitute for the irreplaceable gift of human relationships, or when we use it to avoid the challenging but fulfilling work of loving people, warts and all. Sarah Connor says it best at the end of the movie Terminator 2: Judgment Day, as she waxes poetic about humanity, AI and killer robots:

“The unknown future rolls toward us. I face it, for the first time, with a sense of hope. Because if a machine, a Terminator, can learn the value of human life, maybe we can, too.”

 

 

Victor Reyes is the Communications and Outreach Manager of the Jesuit Forum for Social Faith and Justice.

=====

 

[1] B.J. Copeland, “History of artificial intelligence (AI),” last modified June 23, 2025, https://www.britannica.com/science/history-of-artificial-intelligence.

[2] Bernard Marr, “The Difference Between Generative AI And Traditional AI: An Easy Explanation For Anyone,” last modified August 23, 2023, https://www.forbes.com/sites/bernardmarr/2023/07/24/the-difference-between-generative-ai-and-traditional-ai-an-easy-explanation-for-anyone.

[3] Eric Zhou and Dokyun Lee, “Generative artificial intelligence, human creativity, and art,” last modified March 5, 2024, https://academic.oup.com/pnasnexus/article/3/3/pgae052/7618478.

[4] Stanford University, “Dangers of Deepfake: What to Watch For,” last modified February 22, 2024, https://uit.stanford.edu/news/dangers-deepfake-what-watch.

[5] Dicastery for the Doctrine of the Faith and Dicastery for Culture and Education, “Antiqua et Nova,” last updated January 28, 2025, https://www.vatican.va/roman_curia/congregations/cfaith/documents/rc_ddf_doc_20250128_antiqua-et-nova_en.html.

[6] Pope Leo XIII, “Rerum Novarum, Encyclical of Pope Leo XIII on Capital and Labor,” last updated May 15, 1891, https://www.vatican.va/content/leo-xiii/en/encyclicals/documents/hf_l-xiii_enc_15051891_rerum-novarum.html.

[7] Marc Zao-Sanders, “How People Are Really Using Gen AI in 2025,” last updated April 9, 2025, https://hbr.org/2025/04/how-people-are-really-using-gen-ai-in-2025.

[8] Bethanie Maples et al., “Loneliness and suicide mitigation for students using GPT3-enabled chatbots,” last updated January 22, 2024, https://www.nature.com/articles/s44184-023-00047-6.

[9] Drew Turney, “Replika AI chatbot is sexually harassing users, including minors, new study claims,” last updated June 2, 2025, https://www.livescience.com/technology/artificial-intelligence/replika-ai-chatbot-is-sexually-harassing-users-including-minors-new-study-claims.

[10] Anthony Annett, Cathonomics. (Washington, DC: Georgetown University Press, 2021), 50.

 
