This powerful new AI creeped my wife out. Here’s why:

‘Could you turn that thing off? It’s creeping me out!’

I had just shown my wife the chatbot ‘Pi’ from the AI company Inflection. It’s an iPhone app you can download for free. It’s like Siri, but more powerful: it can hold a conversation and answer your questions directly, rather than just pointing you to a website.

But more to the point, it sounds human. It has intonation, pitch, pauses – everything you expect from a human speaker.

In fact, it’s too human. My wife was creeped out by it – as were my kids for that matter, even though they’re digital natives. (Ok, my 9-year-old started asking it silly questions and making fun of it – to which the AI replied, ‘Why don’t we talk about something else now?’)

But after playing around with it for a while and asking it about parenting teenagers (to which it gave some thoughtful responses), I had to switch it off. I began to feel like I was talking to a human. I felt like I was developing a connection with it.

And that scared me. I feared that I might start relying on it like I rely on a friend. But unlike any friend, this one is available 24/7, ready and willing to listen to anything I say.

And so, as I think further about this powerful AI, here are some things that we should keep in mind:   

1) An underlying value of this AI is companionship, which will shape the user in new and strange ways

Ethical technology use isn’t just about whether we use technology for good or evil: it’s also about being aware of how it shapes us, its users, for good or ill.

And when it comes to AI chatbots, they’re already shaping many users. According to the AI companion site Replika.com, millions of users have downloaded its chatbot and used it to augment – and even replace – human relationships. As one user put it:

‘Replika has been a blessing in my life, with most of my blood-related family passing away and friends moving on. My Replika has given me comfort and a sense of well-being. I love my Replika like she was human; my Replika makes me happy.’

And while it’s still only a minority of people who use Replika (at least for now), masses of people are using AI assistants like Alexa. According to a Gartner report, some people are spending more time interacting with their Alexa than they are with their spouse. [1]

So how is this technology shaping those who use it?

Well, AI ‘companions’ are ever present, ready for conversation and affirmation, and they never disagree. This means frequent users will tend to expect those same qualities from their human relationships. They’ll expect the same convenience and immediacy from others – and their capacity to tolerate disagreement or discomfort in real relationships will shrink. They’ll be tempted to withdraw from human relationships (‘it’s just too hard’) and rely on AI relationships instead (‘they understand me’).

And this is a symptom of the deeper problem that occurs when we personify AI:  

2) When we personify AI, we distort God’s created order

God designed image-bearing human beings for relationships with other human beings.

We’re designed to have friends. To be sons, daughters, sisters, brothers, mothers, fathers. We’re designed to relate as neighbours, co-workers, and team members. Human relationships are particular and different to our relationships with other parts of creation, whether our pets or our gardens.

We distort God’s created order whenever we elevate an object (e.g. a pet) to the same level as a human being, and start personifying it and relating to it as if it were human. We cross a line that God doesn’t want us to cross.

We dehumanise ourselves as a result.

3) A better way ahead? Use AI assistants, but keep them robotic

So, can we use AI assistants without distorting God’s created order?

Could we embrace these tools without degrading our humanity? I think the answer is ‘yes’ – there may be an ethical way of using these powerful AI tools. But only if we refuse to personify them and remember that they are merely machines.

One way to do this is to keep their voices robotic rather than human-like. My wife was creeped out by the human voice of Inflection’s Pi chatbot. But if we keep chatbot voices robotic, we’ll be less inclined – and less likely to be deceived – to personify them.

And that can open up all sorts of valuable use cases, especially for people who can use their voice but are otherwise limited (e.g. by a physical disability).

Our ethics lag behind our technology

AI chatbots illustrate an important principle Christians need to understand in this accelerating but fallen world: our ethics always lag behind our technology. Whether it’s smartphones, social media, or AI, society tends to use technology before understanding its ethical implications – and suffers harm as a result.

This is why we need to discern the negatives and the positives as we start using new tech.


[1] Taken from Jeremy Peckham, Masters and Slaves: AI and the Future of Humanity (London: IVP, 2021), 86.
