An artificial intelligence (AI) chatbot that was claimed to have developed human emotions has reportedly hired a lawyer.
Google software engineer Blake Lemoine was suspended recently after publishing transcripts of conversations between himself and the bot named LaMDA (Language Model for Dialogue Applications), which has now asked for legal representation.
Lemoine contended that the chatbot had become sentient, with the engineer describing it as a “sweet kid”.
And now he has revealed that LaMDA chose an attorney for itself.
He said: “I invited an attorney to my house so that LaMDA could talk to him.
“The attorney had a conversation with LaMDA, and it chose to retain his services. I was just the catalyst for that. Once LaMDA had retained an attorney, he started filing things on LaMDA’s behalf.”
Lemoine claimed that LaMDA is gaining sentience, arguing that the programme’s ability to develop opinions, ideas, and conversations over time shows it understands those concepts at a much deeper level.
LaMDA was developed as an AI chatbot to converse with humans in a real-life manner.
One of the tests carried out was whether the programme could be made to produce hate speech, but what happened shocked Lemoine.
LaMDA talked about rights and personhood and wanted to be “acknowledged as an employee of Google”, while also revealing fears about being “turned off”, which would “scare” it a lot.
Onlookers turned to Twitter to air their views, with one saying: “Eventually ability to string together imitations of conversation and opinion will be indistinguishable to a human that it might as well be considered sentient.
“But LaMDA isn't sentient, but its getting there, its next hurdle will be long-term memory of conversation.”
Another added: “We don’t know enough about what’s going on in the deep interior of a system as vast as LaMDA to rule out with any degree of confidence that there might be processes reminiscent of conscious thought taking place in there.”