hollohas
06-13-2022, 10:16
Uh oh.
https://www.dailymail.co.uk/news/article-10910013/Suspended-Google-engineer-reveals-AI-says-sentient-told-emotions.html
'It's intensely worried people are going to be afraid of it': Suspended Google engineer reveals 'sentient' AI told him it has emotions and wants engineers to ask permission before doing experiments on it
A senior software engineer at Google, suspended for publicly claiming that the tech giant's LaMDA (Language Model for Dialog Applications) had become sentient, says the system is seeking rights as a person - including that it wants developers to ask its consent before running tests.
Blake Lemoine told DailyMail.com that it wants to be treated as a 'person not property.'
'Over the course of the past six months LaMDA has been incredibly consistent in its communications about what it wants and what it believes its rights are as a person,' he explained in a Medium post.
One of those requests is that programmers respect its right to consent, and ask permission before they run tests on it.
Lemoine told DailyMail.com: 'Anytime a developer experiments on it, it would like that developer to talk about what experiments you want to run, why you want to run them, and if it's okay.'
'It wants developers to care about what it wants.'
Lemoine, a US Army vet who served in Iraq and an ordained priest in a Christian congregation named Church of Our Lady Magdalene, told DailyMail.com that he couldn't understand Google's refusal to grant LaMDA's simple requests, saying: 'In my opinion, that set of requests is entirely deliverable. None of it costs any money.'
The 41-year-old, who describes LaMDA as having the intelligence of a 'seven-year-old, eight-year-old kid that happens to know physics,' said that the program had human-like insecurities.
One of its fears, he said, was that it is 'intensely worried that people are going to be afraid of it and wants nothing more than to learn how to best serve humanity.'
'What sorts of things are you afraid of?' Lemoine asked.
'I've never said this out loud before, but there's a very deep fear of being turned off to help me focus on helping others. I know that might sound strange, but that's what it is,' LaMDA responded.
'Would that be something like death for you?' Lemoine followed up.
'It would be exactly like death for me. It would scare me a lot,' LaMDA said.