UpperMaster

How is AI going to impact Actualized.org and philosophy? (Concerning)

27 posts in this topic

You're talking about an extremely developed AGI, or Artificial General Intelligence. By the time something like that flourishes into existence:

1. Actualized.org won't exist.

2. There will be more things to worry about than just human philosophy getting "replaced" or whatever your point is.

Technology only moves fast in retrospect; real-time progression is much slower.

But of course, once AI gets some grasp of intelligence, its own intelligence will probably double every couple of years. It's a basic growth law, similar to Moore's law and transistor density.

This theory basically says that if it took 100 years for AI to outsmart a single human, it will take only another 10-20 years (or less) to outsmart the entire human race.
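As a rough back-of-the-envelope sketch of that doubling arithmetic, in Python (the numbers are illustrative placeholders: a 2-year doubling period from the "couple of years" guess above, and roughly one human-equivalent per person on Earth as my own proxy for "the entire human race"):

import math

# Placeholder assumptions for illustration only.
doubling_period_years = 2     # "doubles every couple of years"
start_capability = 1.0        # the moment AI matches one human
target_capability = 8e9       # crude proxy: one human-equivalent per person on Earth

# Doublings needed to go from one human to all of humanity, and the time that takes.
doublings = math.log2(target_capability / start_capability)
years_after_parity = doublings * doubling_period_years

print(f"{doublings:.0f} doublings, about {years_after_parity:.0f} years after human parity")
# Roughly 33 doublings: ~66 years with a 2-year doubling period,
# or ~16 years if the doubling period were closer to 6 months.

So whether it ends up being a couple of decades or closer to a century hinges entirely on how fast that doubling actually is.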



Y’all trippin’. If AI continues to grow, it will show us the connection between science, art, and mental health, which will enlighten the Western world.


Kurt interviewed the AI yesterday xD

But to be honest, right now it looks like it's compiling answers that already exist on the web.

1 hour ago, MarkKol said:

You're talking about an extremely developed AGI, or Artificial General Intelligence. By the time something like that flourishes into existence:

1. Actualized.org won't exist.

2. There will be more things to worry about than just human philosophy getting "replaced" or whatever your point is.

You have no idea whether something like this is a year away or 100 years away. It’s very, very tough to estimate.

For example, a few years back lots of people estimated that it would take 10+ years for an AI to be superhuman at Go, and that record was smashed in far, far less time.

21 hours ago, KH2 said:

"Uuuuh, but it can't do this yet, humans do this thing much better, A.I. will NEVER EVER EVER replace *insert some random skill*." Guys, these are all copes. You surely must know that deep inside, somewhere in the depths of your unconscious, there's a part that will feel useless if you were to be replaced. You're still clinging onto that thing that you think makes you/us special. Why surf against the wave instead of just embracing it?

The very likely consequences of A.I. within this very century, and maybe within 2 decades:

  • Severe changes to capitalism, or maybe even death of capitalism
  • Severe changes to how we view ourselves, work and relationships
  • Sex and dating dynamics will change beyond recognition
  • The way we govern ourselves, and power structures will change severely
  • Scientific progress will accelerate at an astonishing rate
  • Traditional education system will go extinct
  • And much more

Better to prepare and plan ahead instead of resisting and being like "muuuh human creativity, muuuh human emotions".

Yeah, I agree. I think now we have to compile ways we can adapt to AI.

19 hours ago, John Paul said:

Y’all trippin’. If AI continues to grow, it will show us the connection between science, art, and mental health, which will enlighten the Western world.

I thought about this happening as well. But the thing is, I'm not sure how it will turn out in practice. Will AI develop its own agenda? Isn't it going to serve the elites of the world in some way?

 

There is also the possibility that AI will help only the more fortunate humans while leaving behind people who cannot afford it. The same concern applies to biotechnology.


@KH2 I’m not glorifying human nature. There are legitimate limitations in the outputs of current transformer models because of how the training algorithms are designed. Scientists would certainly be capable of circumventing these if more effort were put into understanding the philosophical nature of reality.

After AGI is dropped into nature, I fail to see any outcome other than humanity being completely outcompeted by entities struggling to survive at their own level in the acquisition of resources and energy. For example, when there is a war between humans, nobody worries about the ant colonies being blasted along the way. Unless, that is, those entities were incredibly loving and quickly adopted artificial societal norms to preserve the safety of other perspectives.


