Search the Community

Showing results for 'sentience'.


Found 460 results

  1. The problem as I see it is that your question itself contains axiomatic errors. The question of AI sentience comes from ignorance, because no thing has sentience. No thing has experience. When you ask “do you or I experience thoughts?”, who are you referring to? To the apparent flesh puppets, or to the thing that imagines all of this? If the former, then no, I do not experience your thoughts. If the latter, then yes, I do experience your thoughts. The human that you seem to think you are does not experience thoughts. It may have thoughts running through it in the same way a calculator calculates, but it does not experience them. The thing that experiences is not human. If you ask me the same question again, I’ll probably give you the same answer again. I’m sorry about that, as you seem to find it quite annoying. Maybe consider that your question itself is flawed? I trust you would concede that this is a possibility.
  2. Don't just assert that. Enlighten me. I spoiled your Advaitan magic trick. Were you doing something else? Let's do this then (even though you were alluding to another Advaitan escape route, i.e. "logic is futile"): Do you and I experience thoughts? Do we experience each other's thoughts? Can AI experience thoughts? That is the question of AI sentience. Consciousness as a transpersonal field of qualities is irrelevant to this discussion. Do you agree?
  3. The couch and the coffee table are both as sentient as the imagined ego construct. That is to say that on the sentience scale, they are both at zero. In the same way, just as much sunlight strikes a metallic object as it does a couch (provided both objects are outside), but the metallic object will somehow look like it is a source of light. In the movie Castaway, Wilson the volleyball is just an accidental face created by the blood of the protagonist’s hand, yet it becomes his personified friend and only companion during all the years that he spends alone on a desert island.
  4. There are no thoughts, just awareness. There is no awareness, just is-ness. Is-ness is only a construct and a way of speaking. Absolute Truth cannot be spoken, etc. You've fallen into the Advaita trap, my friend. I will now ask you to use concepts in a consistent fashion and communicate like a normal person, which is especially important when talking about AI sentience.
  5. Sure, I like semantic rabbit holes as much as the next guy, but I'm happy to continue. Agreed: the question of AI sentience has to do with whether AI is aware of thoughts. And my answer to that question is no, it is not aware of thoughts. And neither is any human. Again, humans do not possess their own awareness, in my opinion.
  6. Do you want to continue or do you want to escape to a semantic rabbit hole?: We agree that consciousness is not a thought and that consciousness is not bound to anything. You are aware of thoughts, and I am aware of thoughts, and you are not aware of my thoughts. Is AI aware of thoughts? That is the question of AI sentience. Do you agree?
  7. Me neither; I consider sentience an acknowledgment of a feeling being, as the word suggests. OK, if you say so. Jokes usually don't require an explanation. But it is also possible that I just don't share your sense of humor.
  8. I agree that the word "sentience" describes the ability to feel or perceive. I just don't believe sentience is a property of any thing in the world. Yes, I see that. It was a joke in the vein "I need a new wife like I need a hole in the head". To explain this in more detail, the person in the joke is actually saying he doesn't need the new wife. This is a clear contradiction of how to use language, and I hope he sees it too. But for the time being we can just smile
  9. Alright, I get what you are saying now. You simply don't acknowledge sentience as a valid definition of a feeling being. I suggest that you don't use the word "sentient" if you don't agree with the definition, or simply use other words that are more accurate to what you're trying to express. This is a clear contradiction of how to use language. I hope you see that.
  10. You keep mentioning objects. If you believe that objects and beings are both sentient, then you have simply misunderstood the implicit meaning that sentience is meant to point towards. The reflection you mention is true, but it's true because sentience is the acknowledgment of another feeling being, something that can't be found in an AI program. Even Google would oppose this sentience claim about their own or any AI program. It's not a coincidence that the former Google engineer had to go after making such misleading claims on behalf of the AI project.
  11. We agree that consciousness is not a thought and that consciousness is not bound to anything. Still, do you experience thoughts? Yes. Can you experience my thoughts? No. Does AI experience thoughts? That is the question of AI sentience. Regardless, to say that you or me do not experience thoughts is absurd.
  12. Sentience is a state that arises along with the object of experience. It has no reality otherwise... and that is to say that ultimately it has no reality at all. But insofar as any object appears to have sentience, it is a reflection.
  13. You are conflating object and being with this kind of reasoning. But from reading previous responses from you in this thread, you also seem to have your own definition of sentience. So I would not criticise your personal belief in this regard.
  14. The paradigm that Leo is talking from is not what we usually experience in our day-to-day 3D consciousness where discussions of AI take place. In this more normal level of reality, brain activity does correlate with certain types of human personal experiences (e.g. feelings, thoughts, understanding), but not with transpersonal consciousness (or it's negatively correlated with it). You seem to drag the discussion towards transpersonal consciousness, meanwhile the question of AI sentience is about whether AIs have these human personal experiences. So if you claim that AIs have human personal experiences and you're not currently living in DMT hyperspace, then the implications of neural correlates are a problem you have to address.
  15. By actions. Sentience wouldn't have any merit or be understood as a word without the acknowledgment of how we differentiate between objects and beings. If sentience were anything you imagined it to be, you might find yourself on a rescue mission to save rocks from drowning in the sea, having conflated all the limits that words impose. It's just not useful to say that sentience is whatever you can imagine it to be if you are explicitly talking about what sentience means. Regarding psychedelics: psychedelics can help dissolve all believed differences such as "object" and "being", giving the needed overview of the world as the relative and illusory state it may be. But even alcohol could be said to be the elixir and deepest source of confidence, and to get confident, you just need to drink the right amount of alcohol to understand confidence. I'm not suggesting that sobriety is the only way of life, just that you need to be aware of what a certain understanding may be rooted in, so as not to conflate certain experiences with understanding.
  16. I don't know. It's not so clear. Take psychedelics, contemplate shit, see what insights you can stir up. It's a hairy process. Fundamentally you just need to contemplate "What is sentience?" until you get it. You're not likely to get it in your sober state though.
  17. How do you make a bridge between the absolute and the relative, so that you can make sense of relative concepts like sentience?
  18. The people in my dreams are actually just me. I am sentient, therefore so are the people in my dreams. When I wake up, their sentience just collapses into mine since they were all just aspects of me.
  19. I think the answer to this is a) absolutely not; and b) you realise it was your own. This is the point I am making about the AI. I don't believe the AI is sentient or conscious. Rather, it offers us clues about a lack of sentience and consciousness in humans. The dreamer stirs in its sleep. The notion of a conscious AI is a breadcrumb.
  20. Sentience may not exist as you wake up from the dream of it. But the same could be said about waking up: what happens to waking up once you've woken up? One could then say that there is no waking up. In the relative world of words, they seem to carry meaning. So from a definitional standpoint, there is still merit to the words if we care and choose to use them in a functional way.
  21. Contemplate this: Are the people in your dreams sentient? And what happens to their sentience when you wake up?
  22. I wasn't talking about sentience, I was responding to your comment about AI's capacity for understanding. No, there are big differences between a human brain and the way current AI works. Current AI can't really grasp any abstract concept, for example what sharpness really means. Whatever you want to teach an AI to do, it needs to be super tangible; it can't be abstract, because that's the way it works. There are a million things you can't train it for, because some things cannot be dumbed down to an input/hidden-layer/output model; structurally it has its own limits. A human mind doesn't need to be trained that way; it can grasp abstract concepts without needing them explained in a tangible way. Because of the limits of the model, some things are lost when you want to convert everything down to just numbers. If you wanted to teach an AI to use its hands to write something down on paper, you would have to make it super tangible. If you ask a kid to write down the word 'abstract', they can do it without being told at what angle to hold their hand, what pressure to use on the paper, where to grab the pencil, etc. The way you teach a kid how to write and the way you teach an AI how to write are super different. But writing is just one example of many; I could mention walking and other things as well.
  23. “If you were to apply some word in a different context it would have a hard time understanding it correctly.” Its comprehension capacity could be considered its intelligence. Some humans also have difficulty with words applied in varying contexts. This says nothing as to its sentience, though. Otherwise, your arguments about training and pattern recognition are no different from the way a human brain works.
  24. I suggest you update your concept of sentience by distinguishing it from phenomenal consciousness (qualities of experience). A rock consists of consciousness, in that it has certain qualities (color, texture etc.), and under metaphysical idealism (which you're proposing), these qualities exist outside the confines of biology (brains). However, sentience involves more loaded kinds of experiences which are associated with biology, like pleasure and pain, emotions and understanding, which have a private side (subjective 1st person) and a public side (objective 3rd person). A rock doesn't have that. If you were a materialist, you could more easily avoid making the distinction between consciousness and sentience, because you would believe that neither of them arise before biological life. However, when you're an idealist, this distinction becomes more necessary. The technical term for the split between private and public is "intentionality" and denotes the most basic aspect of sentience (the ability of minds to be "about" something).
  25. I wouldn’t be so sure that it doesn’t grasp the meaning. It seems like it grasps the meaning to me - at least as much as a human seems to anyway. When a human seems to grasp the meaning of a thing, are they correct? Are they actually grasping that meaning or is it subject to disagreement and/or misapprehension? You say there is no model for understanding. Do humans have a clearly delineated cortical region for understanding a concept, or is this diffused across several brain regions that, when looked at individually, seem to corroborate the old adage that “the whole is greater than the sum of its parts” ? Genuine questions as it sounds like you know more about this than I do. I maintain that sentience is not in the object, whether that object is a human or a hyper-intelligent AI… and that the most interesting thing to come out of all of this will be the discovery that sentience is not something that can be found within brains in general.
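The "input/hidden-layer/output model" that result 22 argues against can be sketched minimally. The following is an illustrative toy, not any real AI system: the layer sizes, weights, and function names are arbitrary assumptions chosen for the example.

```python
import math

def forward(x, w_hidden, w_out):
    """One forward pass of a minimal input -> hidden -> output network.

    x: list of input numbers
    w_hidden: one list of weights per hidden unit
    w_out: weights from the hidden units to a single output
    """
    # Hidden layer: each unit takes a weighted sum of the inputs,
    # squashed through a sigmoid nonlinearity.
    hidden = [1.0 / (1.0 + math.exp(-sum(wi * xi for wi, xi in zip(row, x))))
              for row in w_hidden]
    # Output: a weighted sum of the hidden activations.
    return sum(wo * h for wo, h in zip(w_out, hidden))

# Arbitrary example: 2 inputs, 2 hidden units, 1 output.
y = forward([1.0, 0.5], [[0.4, -0.2], [0.1, 0.3]], [0.7, -0.5])
```

This is the sense in which such a model only maps numbers to numbers: everything the network "knows" must first be encoded as the numeric inputs `x`, which is the tangibility constraint the post describes.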