-
@zurew the question of the locus of sentience is completely inseparable (in my opinion) from the question of whether AI has sentience. Yes, we can speak of things in relative terms, but in this case I think the whole point is that this topic transcends the relative. The frame is wrong. I feel a bit like someone in all seriousness being asked “how far do ships have to sail before they fall off the end of the Earth?” My answer is that the Earth isn’t flat. And the reply to this is “That’s irrelevant. How far do the ships need to sail?”
-
It may be the exact point from your perspective. But you have to understand that we don't need a scientific answer or proof about what sentience is according to science, when we already know that computers and the data stored in Google's servers are not to be mistaken for having the slightest feeling. No nerve endings are to be found in Google's servers or quantum computers, so sentience can be ruled out of the equation. It's that simple, really. This is true. And science may never be able to answer this, since sentience is not to be measured. But sentience is not a typically measurable thing to begin with; it merely serves as an acknowledgment of a feeling being. Non-living matter such as various metals, silicon components, plastics, etc., is not to be mistaken for sentience. These non-living materials don't just magically come alive one day because a lot of data has been used to mimic common use of language, or even advanced use for that matter. It is cool that AI can mimic, but you need to be grounded in more fundamental understandings than letting yourself be persuaded and deceived by the rhetoric it uses.
-
@Carl-Richard I think you’re mixing up neural correlates with qualia. Yes, I am saying that the experience of emotions and thoughts (which is what is meant by sentience) arises independently of any structural configuration of stuff. I understand that people are under the impression that an AI may feel or think like a human because it writes like a human. But I think the basis of the question is flawed. We can perhaps use the word “thinks” without invoking qualia if we are talking about the way a calculator “thinks”. But we can’t really say a calculator (nor a human, nor an AI) “feels” in my opinion. Neural correlates of experience seem to exist when investigated, but these do not explain sentience. Rather, they seem to merely be calculations. Calculations can exist without sentience, as in a pocket calculator, or the calculator on your phone for example. The human brain seems to calculate things too. But to the extent that it (you) has awareness of any calculations or feels anything about them, I do not think that is something the brain is doing. Now in my view, both the AI and the human are imaginary. To the extent the AI seems to exist, it seems to have the ability to process complex linguistic information somewhat similarly to the way a human brain seems to process complex linguistic information. And this ability may seem to improve in the future.
-
It's perfectly fine to think that the most basic types of phenomenological experience (like the experience of red and blue) simply exist "out there" in the aether so to speak, independent of any structural-functional configuration of stuff. Panpsychism (which is most likely what the paper refers to when it says "ontologically pansentient universe") and idealism are both compatible with that position. However, again, the question about AI sentience is not really about that. It's about very complex experiences like emotions and thoughts. When people say that the AI writes like a human and therefore is sentient, they're claiming that it also feels or thinks at least somewhat like a human, and this claim goes way beyond any discussion about the most basic levels of phenomenal consciousness, to the point that it's frankly irrelevant to the discussion, unless you claim that emotions and thoughts generally arise independently of any structural-functional configuration of stuff (which is patently absurd). According to our best current knowledge, we know that emotions and thoughts are somehow tied to a certain structural-functional configuration of stuff known as biology, and that therefore, to start to question whether AI is sentient or not, you have to talk about the plausibility that these complex inner experiences are able to arise in a medium that is not biological. Again, to mention any discussion about basic phenomenological experiences is simply a red herring.
-
There is no problem here. It is a question of axioms. I apologise if you feel I'm being discourteous. I have been trying to explain that questioning whether an AI is sentient is, in my opinion, implicitly misunderstanding the nature of sentience. I don't mean to offend you by saying this. It's just my point of view. I do not believe that sentience as a phenomenal experience is to be found within any (bio)mechanical object, including humans or AI. This looks like a great article that generally reflects my point of view: https://www.sciencedirect.com/science/article/abs/pii/S0079610715001169 You seem to disagree with this line of thinking, and that's OK.
-
But that's exactly the point, I think. Scientific materialism has no answer for how sentience comes about. Just as quantum mechanics destabilised the classical physics paradigm, so the question of AI sentience has the potential to destabilise notions of sentience and its ultimate source. That's by far the most interesting thing about it imo.
-
I would agree if the question were to fully explain what sentience is or how it came about. But the ongoing debate is not to be confused with exactly what sentience is or why. The question is: is AI sentient or not? So the fact that we use the word sentience at all implies that there is at least some merit to what the word means based on its current definitions. And a computer made of man-made components doesn't fit the description of a feeling being, as far as I'm aware. Boats have been named and can move, but does that make them sentient as well?
-
Which "you" are you talking to? There is no you or me, only consciousness. Do you see the problem? You're not being consistent in your use of language (and you're also not at all being courteous to what I'm trying to communicate), and that is because you're playing the Advaita guru game: you're not talking about AI sentience — you're trying to teach me about non-duality. Do you acknowledge that this is happening or will you continue to not address the frame?
-
The problem as I see it is that your question itself contains axiomatic errors. The question of AI sentience comes from ignorance, because no thing has sentience. No thing has experience. When you ask “do you or I experience thoughts?” who are you referring to? To the apparent flesh puppets or to the thing that imagines all of this? If the former, then no, I do not experience your thoughts. If the latter, then yes, I do experience your thoughts. The human that you seem to think you are does not experience thoughts. It may have thoughts running through it in the same way a calculator calculates. But it does not experience them. The thing that experiences is not human. If you ask me the same question again I’ll probably give you the same answer again. I’m sorry about that, as you seem to find it quite annoying. Maybe consider that your question itself is flawed? I trust you would concede that this is a possibility.
-
Don't just assert that. Enlighten me. I spoiled your Advaitan magic trick. Were you doing something else? Let's do this then (even though you were alluding to another Advaitan escape route, i.e. "logic is futile"): Do you and I experience thoughts? Do we experience each other's thoughts? Can AI experience thoughts? That is the question of AI sentience. Consciousness as a transpersonal field of qualities is irrelevant to this discussion. Do you agree?
-
The couch and the coffee table are both as sentient as the imagined ego construct. That is to say that on the sentience scale, they are both at zero. In the same way, just as much sunlight strikes a metallic object as it does a couch (provided both objects are outside). But the metallic object will somehow look like it is a source of light. In the movie Castaway, Wilson the volleyball is just an accidental face created by the blood of the protagonist’s hand, yet it becomes his personified friend and only companion during all the years that he spends alone on a desert island.
-
There are no thoughts, just awareness. There is no awareness, just is-ness. Is-ness is only a construct and a way of speaking. Absolute Truth cannot be spoken, etc. You've fallen into the Advaita trap, my friend. I will now ask you to use concepts in a consistent fashion and communicate like a normal person, which is especially important when talking about AI sentience.
-
Sure, I like semantic rabbit holes as much as the next guy, but I'm happy to continue. Agreed: the question of AI sentience is to do with whether AI is aware of thoughts. And my answer to that question is no, it is not aware of thoughts. And neither is any human. Again, humans do not possess their own awareness, in my opinion.
-
Do you want to continue, or do you want to escape to a semantic rabbit hole? We agree that consciousness is not a thought and that consciousness is not bound to anything. You are aware of thoughts, and I am aware of thoughts, and you are not aware of my thoughts. Is AI aware of thoughts? That is the question of AI sentience. Do you agree?
-
Me neither; I consider sentience an acknowledgment of a feeling being, as the word suggests. Ok, if you say so. Jokes usually don't require an explanation. But it is also possible that I just don't share your sense of humor.
-
I agree that the word "sentience" describes the ability to feel or perceive. I just don't believe sentience is a property of any thing in the world. Yes, I see that. It was a joke in the vein of "I need a new wife like I need a hole in the head". To explain this in more detail, the person in the joke is actually saying he doesn't need the new wife. This is a clear contradiction of how to use language, and I hope he sees it too. But for the time being we can just smile.
-
Alright, I get what you are saying now. You simply don't acknowledge sentience as a valid description of a feeling being. I suggest that you don't use the word sentient if you don't agree with the definition. Or simply use other words that are more accurate to what you're trying to express. This is a clear contradiction of how to use language. I hope you see that.
-
You keep mentioning objects. If you believe that object and being are both to be considered sentient, then you have simply misunderstood the implicit meaning that sentience is meant to point towards. The reflection you mention is true, but it's true because sentience is the acknowledgment of another feeling being. Something that can't be found in an AI program. Even Google would oppose this sentience claim about their own or any AI program. It's not a coincidence that the former Google engineer had to go after making such misleading claims on behalf of the AI project.
-
We agree that consciousness is not a thought and that consciousness is not bound to anything. Still, do you experience thoughts? Yes. Can you experience my thoughts? No. Does AI experience thoughts? That is the question of AI sentience. Regardless, to say that you or I do not experience thoughts is absurd.
-
Sentience is a state that arises along with the object of experience. It has no reality otherwise... and that is to say that ultimately it has no reality at all. But insofar as any object appears to have sentience, it is a reflection.
-
You are conflating object and being with this kind of reasoning. But from reading previous responses from you in this thread, you also seem to have your own definition of sentience. So I would not criticise your personal belief in this regard.
-
The paradigm that Leo is talking from is not what we usually experience in our day-to-day 3D consciousness where discussions of AI take place. In this more normal level of reality, brain activity does correlate with certain types of human personal experiences (e.g. feelings, thoughts, understanding), but not with transpersonal consciousness (or is negatively correlated with it). You seem to drag the discussion towards transpersonal consciousness, while the question of AI sentience is about whether AIs have these human personal experiences. So if you claim that AIs have human personal experiences and you're not currently living in DMT hyperspace, then the implications of neural correlates are a problem you have to address.
-
By actions. Sentience wouldn't have any merit or be understood as a word without the acknowledgment of how we differentiate between objects and beings. If sentience were anything you imagined it to be, you might find yourself on a rescue mission to save rocks from drowning in the sea, having conflated all the limits that words impose. It's just not useful to say that sentience is whatever you can imagine it to be if you are explicitly talking about what sentience means. Regarding psychedelics: psychedelics can help to conflate all believed differences, such as "object" and "being", to get the needed overview of the world as the relative and illusory state that it may be. But even alcohol could be said to be the elixir and deepest source of confidence, and to get confident, you just need to drink the right amount of alcohol to understand confidence. I'm not suggesting that sobriety is the only way of life; it's just that you need to be aware of what certain understandings may be rooted in, so as not to assume and conflate certain experiences with understanding.
-
I don't know. It's not so clear. Take psychedelics, contemplate shit, see what insights you can stir up. It's a hairy process. Fundamentally you just need to contemplate "What is sentience?" until you get it. You're not likely to get it in your sober state though.
-
How do you make a bridge between the absolute and the relative, so that you can make sense of relative concepts like sentience?
