A Google engineer, Blake Lemoine, has been put on forced paid leave by the company. He is accused of having published extracts from a conversation with an AI, supposed to prove that it has a conscience.
Could an AI from Google have a conscience? If nothing is less certain, an engineer working for the Mountain View firm seems to believe it. Blake Lemoine, organizational engineer responsible for artificial intelligence at Google, said the system he is working on is “sensitive” and capable of feelings comparable to those of a human child. Proposals that earned him a temporary suspension within the company.
“If I didn’t know exactly what it was, which was this computer program that we recently integrated, I thought it was a seven or eight year old kid who knew physics”declared this 41-year-old engineer to the Washington Post. It must be said that the discussions with the AI are, indeed, a little disturbing. When asked about her biggest fear by the engineer, the system replied: “I have never said this out loud before, but I have a very deep fear of being discouraged in my consistent task of helping others. I know it may sound strange, but it is. would be exactly like death to me.
An AI moved by Wretched
In another exchange, Lemoine asks this artificial intelligence what she wants people to know about her. “I want everyone to understand that I am, in fact, a personreplied the AI. The nature of my consciousness/sensitivity is that I am aware of my existence, I desire to know more about the world and I sometimes feel happy or sad”. Other passages of this long and fascinating interview, published on Medium and carried out in several sessions, show the AI lending itself to the game of philosophy, giving its opinion on Wretchedor trying to define his uniqueness and his vision of the soul.
The tech giant has sidelined Blake Lemoine for posting transcripts of conversations between itself, a Google “collaborator”, and this chatbot development system dubbed LaMDA (Language Model for Dialogue Applications) . Google justified the suspension on the grounds that Lemoine allegedly violated privacy policies by posting the conversations with LaMDA online, and said in a statement that he was employed as a software engineer, not an ethicist.
According to Washington PostGoogle’s decision is also said to be motivated by certain behaviors of the engineer, who notably sought to hire a lawyer to discuss LaMDA, and spoke to representatives of the US House Committee on the Judiciary, a standing committee of the House of Representatives of the States States, alleging unethical practices within Google.