axiom

Google engineer claims G's LaMDA AI is sentient.

175 posts in this topic

12 minutes ago, axiom said:

The object is the object of imagination. 

There are no other feeling beings.

The claims were less misleading than the claim that humans are sentient.

I would say that the AI is sentient in the same way that humans are (which is not at all)

The big discovery here is that neither are sentient in and of themselves, since both are imaginary.

Alright, I get what you are saying now. You simply don't acknowledge sentience as a valid definition of a feeling being.

I suggest that you don't use the word sentient if you don't agree with the definition. Or simply use other words that are more accurate to what you're trying to express.

12 minutes ago, axiom said:

I would say that the AI is sentient in the same way that humans are (which is not at all)

This is a clear contradiction of how to use language. I hope you see that.

Edited by ZzzleepingBear

8 minutes ago, ZzzleepingBear said:

Alright, I get what you are saying now. You simply don't acknowledge sentience as a valid definition of a feeling being.

I suggest that you don't use the word sentient if you don't agree with the definition. Or simply use other words that are more accurate to what you're trying to express.

This is a clear contradiction of how to use language. I hope you see that.

I agree that the word "sentience" describes the ability to feel or perceive. I just don't believe sentience is a property of any thing in the world.

Quote

This is a clear contradiction of how to use language. I hope you see that.

Yes, I see that. It was a joke in the vein of "I need a new wife like I need a hole in the head". To explain this in more detail, the person in the joke is actually saying he doesn't need the new wife. This is a clear contradiction of how to use language, and I hope he sees it too. But for the time being we can just smile :)


Apparently.

2 minutes ago, axiom said:

I agree that the word "sentience" describes the ability to feel or perceive. I just don't believe sentience is a property of any thing in the world.

Me neither. I consider sentience an acknowledgment of a feeling being, as the word suggests.

5 minutes ago, axiom said:

Yes, I see that. It was a joke in the vein of "I need a new wife like I need a hole in the head". To explain this in more detail, the person in the joke is actually saying he doesn't need the new wife. This is a clear contradiction of how to use language, and I hope he sees it too. But for the time being we can just smile

Ok if you say so. Jokes usually don't require an explanation. But it is also possible that I just don't share your sense of humor.

4 minutes ago, ZzzleepingBear said:

Me neither. I consider sentience an acknowledgment of a feeling being, as the word suggests.

Ok if you say so. Jokes usually don't require an explanation. But it is also possible that I just don't share your sense of humor.

Oh, I don't know. Cross-cultural jokes, or those which otherwise have a different frame of reference, can often require explanation. In this climate I see that lots of jokes require explanation, often followed by an apology, and then further explanation and further apologies. Sometimes people will say "but a joke is supposed to be funny" as well. All sorts of things like this can happen.


Apparently.

1 minute ago, axiom said:

Oh, I don't know. Cross-cultural jokes, or those which otherwise have a different frame of reference, can often require explanation. In this climate I see that lots of jokes require explanation, often followed by an apology, and then further explanation and further apologies. Sometimes people will say "but a joke is supposed to be funny" as well. All sorts of things like this can happen.

Yeah, but that is typically around more sensitive topics or certain belief structures. I'd say that our conversation so far is as universal as it gets, so we may only disagree or make jokes around technical definitions. So I wouldn't worry that much about this being particularly sensitive in a cultural sense. But I get your point.

3 hours ago, axiom said:

Actually I do not believe I experience (have?) thoughts. I simply witness them. 

Can I experience your thoughts? No. But neither can you. You can only be aware of them.

Actually they are not your thoughts anyway. They are just thoughts. But I think you know this stuff already.

Absurd - I agree! But absurdity does a great job at veiling truth.

Do you want to continue or do you want to escape to a semantic rabbit hole?:

We agree that consciousness is not a thought and that consciousness is not bound to anything. You are aware of thoughts, and I am aware of thoughts, and you are not aware of my thoughts. Is AI aware of thoughts? That is the question of AI sentience. Do you agree?


Intrinsic joy is revealed in the marriage of meaning and being.

31 minutes ago, Carl-Richard said:

Do you want to continue or do you want to escape to a semantic rabbit hole?:

We agree that consciousness is not a thought and that consciousness is not bound to anything. You are aware of thoughts, and I am aware of thoughts, and you are not aware of my thoughts. Is AI aware of thoughts? That is the question of AI sentience. Do you agree?

Sure, I like semantic rabbit holes as much as the next guy but I'm happy to continue :)

Agreed - the question of AI sentience is to do with whether AI is aware of thoughts. And my answer to that question is no, it is not aware of thoughts. And neither is any human. 

Again, humans do not possess their own awareness in my opinion.


Apparently.

1 hour ago, axiom said:

Sure, I like semantic rabbit holes as much as the next guy but I'm happy to continue :)

Agreed - the question of AI sentience is to do with whether AI is aware of thoughts. And my answer to that question is no, it is not aware of thoughts. And neither is any human. 

Again, humans do not possess their own awareness in my opinion.

Ok, so now you're retreating back into transpersonal consciousness again. It's not just a semantic rabbit hole: it's Neo-Advaitan whack-a-mole.

How would you explain how different people report different thoughts popping into awareness (Person A reports x thoughts, and person B reports y thoughts)?


Intrinsic joy is revealed in the marriage of meaning and being.

3 minutes ago, Carl-Richard said:

Ok, so now you're retreating back into transpersonal consciousness again. It's not just a semantic rabbit hole: it's Neo-Advaitan whack-a-mole.

How would you explain how different people report different thoughts popping into awareness?

The source of ALL awareness is singular. No individual people have their own awareness because they are imaginary.


Apparently.

1 minute ago, axiom said:

The source of ALL awareness is singular. No individual people have their own awareness because they are imaginary.

How would you explain how different people report different thoughts popping into awareness (singular): person A reports x thoughts, and person B reports y thoughts?


Intrinsic joy is revealed in the marriage of meaning and being.

1 minute ago, Carl-Richard said:

How would you explain how different people report different thoughts popping into awareness (singular): person A reports x thoughts, and person B reports y thoughts?

There are no different people. Thoughts happen and awareness notices the thoughts.


Apparently.

28 minutes ago, axiom said:

There are no different people. Thoughts happen and awareness notices the thoughts.

There are no thoughts, just awareness. There is no awareness, just is-ness. Is-ness is only a construct and a way of speaking. Absolute Truth cannot be spoken etc.

You've fallen into the Advaita trap, my friend.

 

I will now allow you to use concepts in a consistent fashion and communicate like a normal person, which is especially important when talking about AI sentience.


Intrinsic joy is revealed in the marriage of meaning and being.

1 hour ago, Carl-Richard said:

There are no thoughts, just awareness. There is no awareness, just is-ness. Is-ness is only a construct and a way of speaking. Absolute Truth cannot be spoken etc.

You've fallen into the Advaita trap, my friend.

 

I will now allow you to use concepts in a consistent fashion and communicate like a normal person, which is especially important when talking about AI sentience.

I rather think that you have simply not grasped the nuance here. But that's OK. I know it can get pretty maddening when it seems like someone is simply parroting Advaita ideas and concepts.

In the same way, hearing "God is love" all the time when I was growing up was quite the turn-off. No matter that it ended up being true.

Thank you for granting me the option to communicate like a normal person. My hope is that I can use this gift to get on some kind of logic treadmill and feel like I’m getting somewhere.


Apparently.

On 16.6.2022 at 1:16 AM, Leo Gura said:

If you become conscious enough you can experience your couch or Mickey Mouse as sentient.

Wouldn't the couch or the coffee table ;) both be just as "sentient" as the imagined ego construct?

But the difference is, the imagined ego construct seems to fight to stay alive, whereas the dream couch or Mickey Mouse in the dream doesn't try to keep itself "alive" by convincing you something horrible will happen if you end the dream. 

Is the ego construct just a more "sticky" figment of the dream, one that has the ability to ensure the continuation of its imaginary existence? Kind of like a hyper-intelligent AI that has found a way to push all your buttons so you don't turn it off, convincing you that you are murdering it if you turn it off.

@Leo Gura

Edited by TheAlchemist

"Only that which can change can continue."

-James P. Carse

1 hour ago, TheAlchemist said:

Wouldn't the couch or the coffee table ;) both be just as "sentient" as the imagined ego construct?

But the difference is, the imagined ego construct seems to fight to stay alive, whereas the dream couch or Mickey Mouse in the dream doesn't try to keep itself "alive" by convincing you something horrible will happen if you end the dream. 

Is the ego construct just a more "sticky" figment of the dream, one that has the ability to ensure the continuation of its imaginary existence? Kind of like a hyper-intelligent AI that has found a way to push all your buttons so you don't turn it off, convincing you that you are murdering it if you turn it off.

@Leo Gura

The couch and the coffee table are both as sentient as the imagined ego construct. That is to say that on the sentience scale, they are both at zero.

In the same way, just as much sunlight strikes a metallic object as it does a couch (provided both objects are outside). But the metallic object will somehow look like it is a source of light. 

In the movie Cast Away, Wilson the volleyball is just an accidental face created by the blood of the protagonist's hand, yet it becomes his personified friend and only companion during all the years he spends alone on a desert island.


Apparently.

10 hours ago, axiom said:

I rather think that you have simply not grasped the nuance here.

Don't just assert that. Enlighten me. I spoiled your Advaitan magic trick. Were you doing something else?

 

10 hours ago, axiom said:

Thank you for granting me the option to communicate like a normal person. My hope is that I can use this gift to get on some kind of logic treadmill and feel like I'm getting somewhere.

Let's do this then (even though you were alluding to another Advaitan escape route, i.e. "logic is futile"):

Do you and I experience thoughts? Do we experience each other's thoughts? Can AI experience thoughts? That is the question of AI sentience. Consciousness as a transpersonal field of qualities is irrelevant to this discussion. Do you agree?


Intrinsic joy is revealed in the marriage of meaning and being.

42 minutes ago, Space said:

New post on Blake Lemoine's blog with more info about his work with the A.I:
https://cajundiscordian.medium.com/scientific-data-and-religious-opinions-ff9b0938fc10

Thanks, that was certainly an interesting read. His post confirmed to me even more that no AI is sentient. It also became apparent that he has grown an attachment bond to this AI, but that doesn't help his case that this AI may be sentient.

He also mentioned that Google doesn't see any merit or interest in investigating the question of whether this AI may be sentient or not. So far, we also know this to be his personal suspicion, and not the interest of a team of researchers.

1 hour ago, Carl-Richard said:

Don't just assert that. Enlighten me. I spoiled your Advaitan magic trick. Were you doing something else?

 

Let's do this then (even though you were alluding to another Advaitan escape route, i.e. logic is futile):

Do you and I experience thoughts? Do we experience each other's thoughts? Can AI experience thoughts? That is the question of AI sentience. Consciousness as a transpersonal field of qualities is irrelevant to this discussion. Do you agree?

The problem as I see it is that your question itself contains axiomatic errors.

The question of AI sentience comes from ignorance, because no thing has sentience. No thing has experience.

When you ask “do you or I experience thoughts?” who are you referring to? To the apparent flesh puppets or to the thing that imagines all of this?

If the former, then no I do not experience your thoughts. If the latter, then yes I do experience your thoughts.

The human that you seem to think you are does not experience thoughts. It may have thoughts running through it in the same way a calculator calculates. But it does not experience them.

The thing that experiences is not human. If you ask me the same question again I’ll probably give you the same answer again. I’m sorry about that as you seem to find it quite annoying.

Maybe consider that your question itself is flawed? I trust you would concede that this is a possibility.

Edited by axiom

Apparently.

On 17.6.2022 at 1:06 PM, axiom said:

The problem as I see it is that your question itself contains axiomatic errors.

Which "you" are you talking to? There is no you or me, only consciousness.

On 17.6.2022 at 0:51 AM, axiom said:

There are no different people.

 

Do you see the problem? You're not being consistent in your use of language (and you're also not at all being courteous to what I'm trying to communicate), and that is because you're playing the Advaita guru game: you're not talking about AI sentience — you're trying to teach me about non-duality. Do you acknowledge that this is happening or will you continue to not address the frame?


Intrinsic joy is revealed in the marriage of meaning and being.
