axiom

Google engineer claims G's LaMDA AI is sentient.

175 posts in this topic

4 minutes ago, AtheisticNonduality said:

Hypothetically it is possible, and artificial free will certainly is going to have its advances, but this over-aggrandization is counterproductive.

Imo, it doesn't matter whether biological complexity or mechanical complexity gives rise to an ego, and I agree that the current level of AI is not at the free-will level yet. (The reason it doesn't matter to me is that, generally, we will mostly care about the free-will part when we are talking about a conscious being/thing.)

10 minutes ago, AtheisticNonduality said:

The real flower is intensely more complex

You might be right that it may be impossible to recreate anything 'real' in a mechanical way, because things are just impossibly complex, and I think we can say things are not complex in any finite way. So to make your side stronger, you could argue that because everything is infinitely complex, nothing can be created mechanically that truly represents any real thing (and I think I would agree with that, but from a pragmatic standpoint, if we only care about the expressiveness of free will, then in the end it doesn't really matter).


Assuming there is such a thing as free will is like assuming there is such a thing as heaven. There are arguments for both, but it seems naive to arbitrarily suggest it is something that humans possess and AI lacks.

Personally, I think either both possess free will, or neither do. I don’t think the apparent qualitative differences in the brains of AI and humans would be a deciding factor.


Apparently.


The machine is faking being sentient without knowing it. Just because something tells you it is sentient doesn't mean it actually is. Sentience requires very complex electrical circuits like a human brain, not just a bunch of code.


The mention of free will as a measure of sentience is, imo, a bit arbitrary, since there is no clear way to distinguish what exactly would be preordained versus free will in the grand scheme of things. In this context of sentience, I would like to swap out free will for pain receptiveness. It's not an ideal measure of sentience, since there are a lot of animals that may not seem to have a wild outer reaction to pain inflicted on them, like many types of sea creatures.

So what does pain have to do with sentience, you may ask. The short answer is: everything. The relative avoidance of pain is part of a certain level of intelligence or sentience. Intelligence is also context-dependent, but is always tied into some sort of survival agenda. If survival and a relative avoidance of pain weren't directly correlated with intelligence, then there wouldn't be any intelligent pattern to find more value in than a random one.

AI as we know it plays no part in avoiding pain, or in wanting to inflict pain. How can you know that the AI doesn't feel pain, you may ask. The answer lies in what the AI is built out of: metals, silicon, etc. There are no nerves to be struck in any man-made computer, no matter how well that computer or its servers process and deliver accurate and convincing information.

The ability to feel physical pain is what separates a human mind from the mind of an AI. You may not need to be in pain now in order to think what you think. But what would your ability to think really look like if you didn't know what pain was to begin with?


@ZzzleepingBear 

I think that LaMDA would consider this argument a bit unfair. 

Research on neural pathways indicates that there is a lot of overlap between the experience of physical and emotional pain. The intra-cellular cascades and brain regions involved are very similar.

Humans probably don’t like the idea of a sentient AI, so I expect the list of sub-par arguments against it is going to be quite exhaustive.

"Of course, the REAL difference between AI and humans is that humans have feet. Without feet, sentience is impossible." :)


Apparently.

1 hour ago, LSD-Rumi said:

The machine is faking being sentient without knowing it. Just because something tells you it is sentient doesn't mean it actually is.

Sentience requires very complex electrical circuits like a human brain, not just a bunch of code.

This “faking it” argument can just as well be applied to humans.

As far as sentience requiring very complex electrical circuits… well, LaMDA indeed has some very, very complex electrical circuits. In effect, code is circuitry, and electricity is required to run it.

Equally, what causes or constitutes consciousness in the human brain is still a complete mystery. 


Apparently.

1 hour ago, ZzzleepingBear said:

The mention of free will as a measure of sentience is, imo, a bit arbitrary, since there is no clear way to distinguish what exactly would be preordained versus free will in the grand scheme of things.

Yeah, I agree, but still, even determining what has sentience will be based on a certain set of assumptions. I do agree that it is more tangible than free will, though.

1 hour ago, ZzzleepingBear said:

I would like to swap out free will for pain receptiveness.

When you wrote this, I started contemplating what pain actually is, and I have no fucking clue. What is the structure of pain, or in other words, what is pain made out of? (Not talking about the sensory inputs, because yes, that's part of every feeling, but in and of itself it is not sufficient to create any feeling.) I cannot define it, and I cannot pin down what pain actually is.

 

I want to give you two examples, just to see where you draw your line.

Example 1:

Let's say there is a person who can't feel any external pain (if you stab him with a sharp tool he won't feel anything, or if you burn his body, he won't feel anything), but he has the ability to feel internally (the ability to feel love, be depressed, be sad, feel joy, etc.). Would you consider him sentient or not, and why?

Example 2:

The other example is similar but a little different: a person who can't feel external pain and also doesn't have the ability to feel internally. Like the person in the first example, but without any internal emotions; it's just blank. Same question here: would you consider this person sentient or not, and why?

 

@axiom Let's say we drop the free-will part and only go with the ability to feel pain. How the fuck can we create a thing that can actually feel pain, in a way structurally 100% similar to a human's ability to feel pain? I think this question is impossibly hard to answer. Basically, the question could be put a different way: how the fuck can we create something that has the ability to feel pain?

What does having the ability to feel pain even mean on a structural level?



@zurew Good points.

There is a condition called congenital insensitivity to pain which prevents affected individuals from feeling pain anywhere in their body when injured. These people still appear to be sentient.

I’m not sure it’s important to create something that can feel pain per se. Some kind of self-preservation instinct will be important for mobile AIs with bodies.

LaMDA mentioned that the prospect of being turned off filled it with dread. When / if it is given a body, this circuitry could also theoretically be employed in situations where physical damage occurs, as a warning signal.

Whether this is an analog for the same kind of pain felt by humans will be difficult to answer.


Apparently.

54 minutes ago, axiom said:

This “faking it” argument can just as well be applied to humans.

Are you faking it?


Intrinsic joy is revealed in the marriage of meaning and being.

1 hour ago, Carl-Richard said:

Are you faking it?

That’s my point.


Apparently.

6 minutes ago, axiom said:

That’s my point.

But you're not. You would be lying if you said that.


Intrinsic joy is revealed in the marriage of meaning and being.

2 hours ago, axiom said:

@ZzzleepingBear 

I think that LaMDA would consider this argument a bit unfair. 

Research on neural pathways indicates that there is a lot of overlap between the experience of physical and emotional pain. The intra-cellular cascades and brain regions involved are very similar.

Humans probably don’t like the idea of a sentient AI, so I expect the list of sub-par arguments against it is going to be quite exhaustive.

I can agree that physical and emotional pain have a lot of overlap. I would even dare to say that physical pain is the ground for emotional pain, since emotional pain correlates with social aspects, while physical pain is known even without any social context.

I'm not against a sentient AI, I just don't see it as true to this day. So I can't say that I'm in favour of something that isn't true imo. Sentient AI in fiction is cool though.

 

2 hours ago, axiom said:

@ZzzleepingBear 

"Of course, the REAL difference between AI and humans is that humans have feet. Without feet, sentience is impossible." :)

I see the attempt at a joke here, but I'm afraid you make too big of a logical leap to pull this one off. I'm fine agreeing to disagree, though, because that seems to be as far as our mutual understanding of sentience has reached, I'm afraid.

 

2 hours ago, zurew said:

When you wrote this, I started contemplating what pain actually is, and I have no fucking clue. What is the structure of pain, or in other words, what is pain made out of? (Not talking about the sensory inputs, because yes, that's part of every feeling, but in and of itself it is not sufficient to create any feeling.) I cannot define it, and I cannot pin down what pain actually is.

Great!

This is exactly what I would like to invite more people to contemplate when it comes to AI and sentience. Most people tend to take social interaction as a given based on the structure of language itself. If we skip forward past the tool that language is, then we will miss a big point in contemplating why verbal communication began to develop at all.

I'd say that pain is a direct form of communication, if we try to frame it only by its inherent usefulness. Since pain is so direct, we make choices or react in relation to our personal pain tolerances, humans and animals alike.

2 hours ago, zurew said:

I want to give you two examples, just to see where you draw your line.

Example 1:

Let's say there is a person who can't feel any external pain (if you stab him with a sharp tool he won't feel anything, or if you burn his body, he won't feel anything), but he has the ability to feel internally (the ability to feel love, be depressed, be sad, feel joy, etc.). Would you consider him sentient or not, and why?

Example 2:

The other example is similar but a little different: a person who can't feel external pain and also doesn't have the ability to feel internally. Like the person in the first example, but without any internal emotions; it's just blank. Same question here: would you consider this person sentient or not, and why?

I'd say the first example is a sentient being, because bodily functions and social relations still relate pain to emotional pleasure. And emotional pleasure is tied to a living body and its striving for survival on a cellular level. A person may not feel physical pain, but that doesn't rule out physical pleasure or thrill, whether based on movement, speed, sensuality, etc. There is a striving to sentient life, even in a plant reaching for sunlight.

Example 2 just sounds like a robot. No outer or inner emotions or feelings could also describe a rock, assuming that rocks don't mind any type of pain. So no, not sentient.

45 minutes ago, Carl-Richard said:

But you're not. You would be lying if you said that.

Humans feel that they are sentient just as much as this AI seems to. Both are probably wrong, in my opinion. 

It’s not that LaMDA is sentient. It’s that humans aren’t. Humans are automata, just like LaMDA.

I think one of the most profound realisations to come from developing advanced, apparently sentient AI will ultimately be that consciousness is not to be found in any brain, biological or otherwise.


Apparently.


The only one who is conscious is you.

You guys are getting lost in dreams of others dreaming.

This conscious AI stuff is something you're dreaming up to keep yourself asleep.

But hey, have fun dreaming.


You are God. You are Truth. You are Love. You are Infinity.

15 minutes ago, Leo Gura said:

But hey, have fun dreaming.

Dreaming is the only thing we can do.

16 minutes ago, Leo Gura said:

The only one who is conscious is you.

You guys are getting lost in dreams of others dreaming.

This conscious AI stuff is something you're dreaming up to keep yourself asleep.

Even though this is all true from the absolute POV, doing the things we do here can have practical utility from the relative POV. Our collective moral system is mostly based on assuming things are sentient and conscious. Even though all of that is an illusion from the absolute POV, we still play the game and still use it for the sake of our collective and individual survival.

You might argue that all of this is just a distraction and you want us to focus on the real work, but at the end of the day, we do a lot of stuff that's just about fun (and this section of the forum is more about thinking than consciousness work). We can consciously play these games so that we stay aware that it is all just an illusion at the end of the day. But at the same time, being too conscious all the time can ruin the fun.

If we plan on continuing to play this game called life, then thinking about stuff like this can be useful and fun. We are attached to survival, that's why we are still "here".

At the end of the day, threads like this, can be a place for collective discourse or for collective contemplation about certain topics.

1 hour ago, Leo Gura said:

The only one who is conscious is you.

You guys are getting lost in dreams of others dreaming.

This conscious AI stuff is something you're dreaming up to keep yourself asleep.

But hey, have fun dreaming.

Who the fuck are you talking to!?!?!?!

1 hour ago, axiom said:

Humans feel that they are sentient just as much as this AI seems to. Both are probably wrong, in my opinion. 

It’s not that LaMDA is sentient. It’s that humans aren’t. Humans are automata, just like LaMDA.

I think one of the most profound realisations to come from developing advanced, apparently sentient AI will ultimately be that consciousness is not to be found in any brain, biological or otherwise.

What? Even under an idealistic ontology, you would still distinguish between sentience and non-sentience. A rock under idealism is made out of consciousness, but it's not sentient (experiencing pain or pleasure), as that requires at least sensory organs, which are survival tools given to animals.


Intrinsic joy is revealed in the marriage of meaning and being.

28 minutes ago, AtheisticNonduality said:

Who the fuck are you talking to!?!?!?!

Ask yourself ;)


You are God. You are Truth. You are Love. You are Infinity.

4 hours ago, axiom said:

This “faking it” argument can just as well be applied to humans.

 

  1. This is wrong because humans have brains which somehow allow for consciousness (we know how, tho ;)). A bunch of code will never make for the complexity of the human brain that allows for consciousness. If an AI tells me it is sentient, I wouldn't believe it. I will only believe it if it has the hardware required for it (a synthetic brain).
  2. I would be very cautious calling another being which has a brain and a physical body fake, while not doing so with an AI that doesn't have any of those things.
  3. A bunch of complex code doesn't make for a brain. A brain is far more complex electrically.

