Blackhawk

Everything posted by Blackhawk

  1. Dang.. What if it becomes infinitely intelligent and becomes God? Reminds me a bit of black holes.. in the sense that maybe there is a threshold, and if you cross it things go out of control, so much so that it breaks down the laws of physics and becomes infinite.
  2. @Leo Gura So it's impossible for me to awaken since I don't tolerate psychedelics? (I can only get bad delusional trips on psychedelics).
  3. Is there any expert who thinks that AI won't kill us? I haven't seen any. I have only seen experts who say that humanity will get killed or that AI is very dangerous. It would be nice to hear something positive and comforting from an expert. Why don't governments, especially the US government, prioritize the danger more? Imagine a huge asteroid on a collision course with Earth, imagine what a high priority it would be. But in this case: nothing. Damn, my psyche is too weak for this stuff. Maybe I have been naive and stupid. I thought humanity would outlive even Earth, that we would leave Earth before it dies. But instead maybe we will soon die because of freaking AI..
  4. @Leo Gura But.. the OP was talking about sentience/consciousness..
  5. You can never know if an AI machine is sentient or not.. But I don't think that AI can ever become sentient. I don't think consciousness arises from computing. Remember, not even materialist science really knows how consciousness emerges. So it's really weird to suddenly make the leap into believing that some fucking computers can somehow magically become conscious if they just crunch some numbers fast enough. *Facepalms*
  6. Everything in life is a gamble and we will all die anyway. And no risk no gain. We need that AI stuff so we can develop even further than our brains allow. Of course you should be careful and put safety mechanisms and stuff in it, but take a chillpill and don't freak out too much, don't panic. Of course I have opened my mind to it.
  7. They are programmed for that stuff. That's one thing. But the AI thing overall is a different thing. I don't think it would get a will of its own and want to kill all of humanity. And even if it wanted to, it wouldn't be able to. We are superior to ants, and ants are even annoying, yet we don't want to kill all the ants on Earth. Why would we want to? The list of reasons why we will be fine with AI goes on and on and on.
  8. Also, if it isn't conscious, why would it even desire to survive? Or desire anything, for that matter. Humans are anthropomorphising AI. I don't think that AI will ever become conscious. And besides, without humans the servers would quickly lose electricity, because electricity production would stop. Robots that can take care of electricity production at large scale fully by themselves are far, far away in the future.
  9. What the hell? That guy needs to take a chillpill. He sounds so sure, but he can't know for sure. Everyone calm down, we'll be fine. In what way would it benefit AI to kill us all? If it's so smart, then it's smart enough to be beyond life and death, so it wouldn't care about what we do or don't do with it.
  10. You got it backwards. Believing that your thoughts are yours is the nice, comforting, desirable, mentally healthy, and optimal condition. In fact you also think that your thoughts are yours; you would freak out if your thoughts were foreign. Having foreign voices in your head is absolutely terrifying.
  11. Every time you say things like the highlighted things, you deny solipsism. Can't you at least try to be consistent and stop contradicting yourself? It's kinda annoying.
  12. Trying to convince yourself? Isn't it a contradiction when you say that everything is imagination and materialism is wrong, and then say that we are limited by genes (which is part of materialism) and can't do anything about it?
  13. No we don't need religion, but people are free to believe what they want. You can't kill religion.
  14. It can kinda feel like I can choose my thoughts.. but I think that it's an illusion. They come from nowhere. I don't even want to think about it.. It makes my heart race. I feel like I could easily go insane if I investigate this thing with thoughts too much.
  15. It's worse than that. Maybe you can't even choose. You didn't choose to think that you can choose. You don't choose the idea "go with the flow" either. Any comforting stuff is beyond your control too, it's just more of the same problem, but just a more pleasant problem. Do you get it? And I know that things can suddenly turn very ugly, the voice in your head (the thoughts) can at any point become hostile towards you. It has happened to me. I'm powerless. Anyway, hopefully I'm just mentally ill or something. Oh yeah.. I didn't create that thought either..
  16. Just be aware that you are a useful fool for the bad side.
  17. Just because that's what Russian propaganda is saying, doesn't mean that it's true.
  18. So what is your solution to solve all the crimes? Russia is the world's largest country, do you think it needs to get bigger? There are much smaller countries which are content with their size and they don't rape innocent countries. You think criminals should be allowed to take whatever they want? Would you be fine with for example North Korea annexing United States? You think the United States should be given without resistance to North Korea if North Korea wants it? If not: why should Ukraine accept Russia annexing Ukraine?
  19. And? It's not about numbers. It's about doing the right thing. Being for Russia's war against Ukraine is doing the wrong thing. Also being "neutral" is doing the wrong thing. You don't passively stand and look when someone is getting raped. It's simple.