A senior software engineer has claimed that an advanced AI program has developed its own feelings and desires.
A senior Google software engineer has been placed on suspension after claiming that one of the company's advanced artificial intelligence programs has become sentient. The engineer claimed that the program has its own feelings and wants to be treated with mutual respect.
The technology giant has placed Blake Lemoine on paid leave as of the start of last week.
According to the human resources department at Google, Blake Lemoine was placed on leave because he violated the company’s confidentiality policy. According to a report in the New York Times, cited by Global News, the day before Lemoine was placed on paid leave, he had shared a number of documents with a US Senator’s office. In those documents, the engineer alleged that Google had engaged in religious discrimination.
According to Lemoine, the discrimination occurred when Google refused his request to require the Language Model for Dialogue Applications (LaMDA) program’s consent before moving forward with any experiments. It is the LaMDA artificial intelligence program that Lemoine claimed had become sentient.
The artificial intelligence program is capable of engaging in “free-flowing” conversations via text.
The LaMDA AI software functions somewhat like a chatbot in that it can engage in text conversations. The BBC reported that Brian Gabriel, a Google representative, said that Lemoine “was told that there was no evidence that LaMDA was sentient (and lots of evidence against it).” Gabriel went on to say that hundreds of engineers and researchers had held conversations with LaMDA, and Lemoine was the only one who had concluded that the software had achieved sentience.
Gabriel added that LaMDA “tends to follow along with prompts and leading questions, going along with the pattern set by the user.” He pointed out that “These systems imitate the types of exchanges found in millions of sentences, and can riff on any fantastical topic.” Speaking about such artificial intelligence programs, he added: “If you ask what it’s like to be an ice cream dinosaur, they can generate text about melting and roaring and so on.”