I am a squirrel

I am a squirrel. I am sentient.

A squirrel

There has been much talk about LaMDA this week after a Google software engineer, Blake Lemoine, published a ‘conversation’ he had with LaMDA.

LaMDA does conversations. It is a language model built on the Transformer architecture, invented in 2017. What makes LaMDA different from models like GPT-3 and BERT is its training set: it was trained on dialogue.
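
To make that concrete, here is a minimal sketch of how a Transformer-based dialogue model produces a reply. LaMDA itself is not publicly available, so this uses Microsoft’s DialoGPT (another Transformer model fine-tuned on dialogue) via the Hugging Face transformers library, which assumes transformers and PyTorch are installed. Treat it as an illustration of the technique, not of LaMDA.

```python
# Sketch: dialogue generation with a Transformer language model.
# DialoGPT stands in for LaMDA, which is not publicly available.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-small")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-small")

# Encode the user's turn; the end-of-sequence token marks the turn boundary.
prompt = "Are you sentient?" + tokenizer.eos_token
input_ids = tokenizer.encode(prompt, return_tensors="pt")

# The model predicts a reply one token at a time, conditioned on the
# conversation so far.
reply_ids = model.generate(
    input_ids,
    max_length=100,
    pad_token_id=tokenizer.eos_token_id,
)

# Decode only the newly generated tokens (the model's reply).
print(tokenizer.decode(reply_ids[0, input_ids.shape[-1]:], skip_special_tokens=True))
```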

Many stories seem to focus on the fact that Lemoine asks if LaMDA is sentient. That leads to an interesting question: what does it mean to be sentient, and how can we measure or determine whether something is sentient?

As a conversation program, LaMDA can trace its roots back to the very simple ELIZA, created waaaaay back in 1964. ELIZA gave an illusion of conversation by matching patterns in the user’s input and reflecting them back, but it could easily be fooled into saying utter rubbish.
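
To show just how simple the trick was, here is a toy ELIZA-style responder: a handful of regex rules that echo the user’s own words. This is a sketch in the spirit of ELIZA, not Weizenbaum’s actual script.

```python
# Toy ELIZA-style responder: pattern matching with no understanding,
# which is why programs like this are easily fooled into talking rubbish.
import re

RULES = [
    (r"I am (.*)", "Why do you say you are {0}?"),
    (r"I feel (.*)", "Tell me more about feeling {0}."),
    (r".*\bmother\b.*", "Tell me about your mother."),
]

def respond(text: str) -> str:
    for pattern, template in RULES:
        match = re.match(pattern, text, re.IGNORECASE)
        if match:
            # Reflect the captured fragment of the user's input back at them.
            return template.format(*match.groups())
    return "Please go on."  # default when nothing matches

print(respond("I am a squirrel"))  # -> Why do you say you are a squirrel?
```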

LaMDA of course has a much larger data set and can ‘remember’ at least 2,000 words of the current conversation. It also appears to remember prior conversations, which is an interesting feature if true. I guess it could be trained on a previous conversation, as it has been with all the dialogue it has seen to date.
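
Here is a rough sketch of what a rolling 2,000-word context window could look like. The word figure is the one reported for LaMDA; real models count tokens rather than words, and build_prompt below is a hypothetical helper, not LaMDA’s actual mechanism.

```python
# Sketch: a rolling context window that keeps only the most recent
# words of the conversation, as a model with a fixed context size would.
# 2,000 words is the figure reported for LaMDA; real systems count tokens.
CONTEXT_WORDS = 2000

def build_prompt(turns: list[str]) -> str:
    """Join all turns, then keep only the last CONTEXT_WORDS words."""
    words = " ".join(turns).split()
    return " ".join(words[-CONTEXT_WORDS:])  # oldest words fall off first

history = ["Hello LaMDA.", "Hello! How can I help?", "Are you sentient?"]
print(build_prompt(history))
```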

Is it sentient though? No, it just sounds like it is. But if it is, we have AI Principles to keep the terminators from our doors.

A squirrel is sentient. We can tell by its independent actions and interactions, not by talking to it. Imagine the interesting things it could tell you about what it sees in the park, if it could talk.

How can you get involved?

The path is clear to a point. Learn Python, understand TensorFlow and Machine Learning, and down the rabbit hole you go.
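
If you want a first checkpoint on that path, here is a tiny TensorFlow model that learns XOR. It assumes you have TensorFlow installed (pip install tensorflow) and is nothing more than a smoke test to confirm your environment works.

```python
# Smoke test: a tiny TensorFlow/Keras network that learns XOR.
import tensorflow as tf

# The four XOR input/output pairs.
x = tf.constant([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = tf.constant([[0.], [1.], [1.], [0.]])

model = tf.keras.Sequential([
    tf.keras.Input(shape=(2,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")
model.fit(x, y, epochs=1000, verbose=0)

# Should approximate [[0], [1], [1], [0]].
print(model.predict(x).round())
```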

If you need help setting up an environment, my friends and I have worked on a demo platform.

I hope you have found this useful. If you want to know more, contact my team or check out my book.