Dryas

Member

  • Content count: 820
  • Joined
  • Last visited

2 Followers

About Dryas

  • Rank: - - -
  • Birthday: 10/03/2004

Personal Information

  • Gender: Male

Recent Profile Visitors

2,770 profile views
  1. GPT doesn't merely know a bunch of facts - it's spitting out coherent, logical statements most of the time. It understands language and the world to a good degree somehow, and that counts as intelligence, I think.
  2. Who do you trust? Here’s a list of people with similar timelines: https://www.reddit.com/r/singularity/comments/18vawje/comment/kfpntso/?utm_source=share&utm_medium=mweb3x&utm_name=mweb3xcss&utm_term=1&utm_content=share_button But I mean, sure, you could just say they’re all bought into a hype cycle. I just think that’s a less plausible explanation.
  3. Getting an AGI would fundamentally change everything. According to Metaculus we get there by around 2030. That doesn’t seem like just another hype cycle at all.
  4. I don’t agree that it is immoral though. It seems like in the most basic sense morality is constructed out of whatever we care about. I can care about some things and not other things.
  5. Okay, and? When I'm alive, I decide what's important to me and what's not. Or maybe I don't consciously decide it and it just comes out of a combination of experiences and biases - either way, I basically pick whatever is important.
  6. >If you answer "yes" to any of these last questions, it most likely cannot be because they are in your presence, because you're most likely dead. It has to be because you care about their life, their well-being, not your own. And why should that apply to only your family members?
     Because they're my family members? What other reason do you need? There actually is something different about family members vs. non-family members. At a very basic level, we care about some humans more than others - why should that be irrational?
  7. That isn't solid enough imo (too subjective). This might be a little silly, but try doing some complicated physics/maths/cs/engineering questions and see if you can actually solve them quickly? Something that doesn't require too much prerequisite knowledge, of course.
  8. IQ is just a measure though - maybe it's being gamed somehow by doing these practices? Improving general reasoning, comprehension, and problem-solving abilities (among other things) is the real goal.
  9. Politics is necessary, but I really don't feel like you need to keep up with it on a daily basis, with every little bit of bs. Like, does one really need to keep up with the Trump trials and whatnot? I don't see the point; it poisons the mind too much in my view. I would rather just have a solid theoretical foundation (something like the Conscious Politics series), vote, keep a broad-strokes understanding of current events, and maybe do some sort of activism, I guess, if you're really into it(?).
  10. Honestly, shutting this place down might actually be the way to go. If I’m being a little selfish, I believe it’d be a net positive for me personally.
  11. No. Even a master title is a lot more difficult than one might think.
  12. But that's why it's so dangerous? It could get increasingly intelligent without having wisdom or the values we have, which would probably end badly for us.
  13. Maybe some intuition on why AGI could end badly:
  14. Apparently it could be quite the thing if real: