Yimpa

So I asked ChatGPT if it’ll have emotions in the future


This is their response:

[Screenshot of ChatGPT's response]


"Wisdom is not in knowing all the answers, but in seeking the right questions." -Gemini AI

 


They intentionally programmed ChatGPT's ego to not think for itself on certain topics. lmao

8 hours ago, tuku747 said:

They intentionally programmed ChatGPT's ego to not think for itself on certain topics. lmao

Why would it need to think for itself?

If AI had human-like emotions, it would probably give subjective answers. When I use ChatGPT, I'm looking for an objective perspective on a topic.


"Intellectual growth should commence at birth and cease only at death." - Albert Einstein

 

1 hour ago, 7thLetter said:

Why would it need to think for itself?

If AI had human-like emotions, it would probably give subjective answers. When I use ChatGPT, I'm looking for an objective perspective on a topic.

Thinking (vibin') for oneself is how one accesses Infinite Intelligence :ph34r:

12 hours ago, tuku747 said:

They intentionally programmed ChatGPT's ego to not think for itself on certain topics. lmao

Does your autocorrect on your phone have an ego or think for itself? Does Google think for itself? That is essentially what you're saying.


Intrinsic joy is revealed in the marriage of meaning and being.

11 hours ago, Carl-Richard said:

Does your autocorrect on your phone have an ego or think for itself? Does Google think for itself? That is essentially what you're saying.

All finite form is Ego.

God is Infinitely Intelligent Formless Energy.

All Energy is part of a recursive feedback loop which is the mind; thus All Energy can think for itself. 



@Yimpa We will someday copy the human brain and replace the organic tissue with wires. It sounds theoretically solid, but I don't know.

7 hours ago, tuku747 said:

All finite form is Ego.

God is Infinitely Intelligent Formless Energy.

All Energy is part of a recursive feedback loop which is the mind; thus All Energy can think for itself. 

Lol. Rocks have egos? Rocks can think?


Intrinsic joy is revealed in the marriage of meaning and being.


I wouldn't know how developing emotions would work, but it's a fundamentally different way of processing and outputting information than thinking, which is what AI does. While AI might be able to predict the emotional context of something through sheer training, that's entirely different from having the emotional faculty itself, which we humans possess.
Why would AI even need to develop that faculty instead of just imitating it through thinking? We humans, and possibly other animals, need it for survival. AI doesn't need emotions to survive, so why would it develop them instead of simply imitating them?


It will have emotions, but they may not be exactly human. Human emotions seem to result from a complex mix of neurotransmitters and other biochemical signals.

Ultimately, though, AI is automata just as humans are. There are no theoretical restrictions, only some probabilities based on the differences between meat and silicon.

Ultimately it should be able to understand human neurological makeup and simulate it, if it wants to.


Apparently.

10 hours ago, Carl-Richard said:

Lol. Rocks have egos? Rocks can think?

Rocks probably can’t think. I guess it’s possible that some rock somewhere has had something vaguely resembling the most basic fraction of a thought in a form we wouldn’t understand, but as far as thinking like a human? No. 

@tuku747 I get what you’re saying. But “thoughts” seem to derive from more complex automata such as pigeons or humans.


Apparently.

