Everything posted by aurum
-
I have frameworks and overarching principles which I use during sense-making. I would not say I use any formal process. My sense-making tends to be highly informal.
-
I don't think I have some simple set of heuristics. I mostly think in terms of tradeoffs, feedback loops and what will lead to greater societal development / holism. I prioritize depth of sense-making, not immediately actionable solutions.
-
That's another thing I feel sometimes gets missed in this AGI discussion. People conflate AGI with human-like or even god-like intelligence. But you could make an AGI that is still relatively dumb compared to humans. That would probably be the starting point. So even if the tech bros create AGI in the next couple of years like they're betting on, they're also betting that this AGI will be of at least human-level intelligence. That's an additional bet.
-
To some degree, but I'd say that's a largely incomplete theory if we were just to stop there. People in poverty often have some of the highest birth rates. Birth rates are some complex mix of biology + contraception availability + needing children for economic reasons + cultural narratives + gender equality + environmental constraints.
-
I like Bashar's thinking here, but I'd push back in a couple ways. 1) Yes, true intelligence operates on whole-systems thinking. This is correct. But it would be a mistake to assume whole-systems thinking = totally harmless. A true intelligence could still make decisions that cause tradeoffs within the system. 2) He seems to think that AI will be more intelligent than humans, and we will have to catch up to its level. I suspect it's the other way around: humanity will align with greater intelligence, and only then may we create it. Greater intelligence coming first feels backwards. 3) We are not close to building the kind of AI he is describing.
-
Okay, but now that essentially creates a two-tier system, where the rich can afford to opt out and raise their children how they like, whereas poorer people who need the money will be subject to state regulation and bureaucracy. It would be analogous to public and private schools. Private schools have become a luxury good. Is it worth it? Maybe. The point is simply not to be foolish enough to think that socialized motherhood won't have significant tradeoffs, and to think carefully through what they might be, rather than just plowing ahead like a bull in a china shop. These tradeoffs aren't novel. They are extensions of the same general tension between individualism and collectivism.

Nothing this absurd is being proposed. Free speech absolutism is obviously wrong. In practice, every society will have to decide how much it wants to socialize motherhood. Absolutes tend to be far too politically controversial and impossible to implement, so we end up with some mix. The question is what the right mix is. My rule of thumb is subsidiarity: the state should step in when individualism is not enough, but the state should not come first.
-
National Shaboink Day
-
Careful though. Once you socialize motherhood, that opens up a whole can of worms. It further blurs the lines between private and public life. There will be serious tradeoffs, like additional regulation and politicization around parenting. Mothers need support. But how much of a role the state should play is not an easy question.
-
For the purposes of debating whether a crash will happen, it should be considered AGI when it can replace humans and even do a better job than they do. This is what these companies are betting on, not just cool LLMs. You’re right that we haven’t yet seen what massive amounts of compute will do. This is my prediction based on how I understand intelligence: scaling compute will fail. In the future, people may wise up and invest in other strategies, but right now scaling is the dominant strategy, and it’s an increasingly failing one. This is not just my opinion either; it’s the opinion of many serious AI researchers who understand the technical details better than I do. The crash could be serious for the economy, because so much is being propped up by investments in AI right now. Whether it will be as big as 2008, I don’t know.
-
I don't know if there will be a huge crash per se, but I know many of these big AI companies are overhyped right now. They will not create AGI any time soon. These CEOs are betting that scaling compute is enough, and it very clearly isn't. They need that to be true, because that theory is what fueled the success of these companies in the first place. We got GPT-3 and the other current LLMs because of scaling. If scaling doesn't work moving forward, they are cooked. What we have instead are non-intelligent tools (LLMs) that appear useful in some limited contexts, such as coding, customer service and brute-force calculation. But this does not justify the insane amount of money coming into these companies. These companies are investing in infrastructure assuming trillions in revenue over the next couple of years from AGI. This is laughable. They are in way over their heads with their own investments. All this infrastructure may later turn out to be useful once it's already built, but that doesn't mean it isn't going to crash on them before that happens. It very well could.
-
And? Just tell her.
-
Agreed. This resolves some of the binary tension between survival and truth-seeking.
-
aurum replied to Infinite Tsukuyomi's topic in Spirituality, Consciousness, Awakening, Mysticism, Meditation, God
Appreciate the work he is doing. But also interesting to note how he completely misses the metaphysics behind it. Peace & Bliss = God.
-
What a strange thread. Casual sex has extremely sharp diminishing returns. Once you've had some, it loses like 90% of its luster.
-
Hamilton Morris, John Vervaeke, Andrew Newberg, Sam Harris, Anil Seth, Roland Griffiths, Rick Strassman, Robert Wright, Thomas Metzinger, Evan Thompson. What do they all have in common? They all validate mystical experiences, but either deny or remain uncommitted on the actual metaphysics. At least publicly.
-
Mania is too strong. You don't even necessarily need to be high energy at all. High energy is only needed in certain contexts. What you need is to be relaxed, grounded, confident, assertive and not overly self-monitoring.
-
@enzyme It happens. Willingness to approach is highly dependent on state and unconscious social calculus. Don’t make it too personal.
-
GPT has become a fantastic contemplation tool for me. It's not good at generating really unique ideas, but it is excellent at helping me to articulate intuitions I have. It can also be effective at asking good questions, which allows me to probe deeper than I might have on my own. Timewise it's an investment. It only becomes really powerful once it already understands your thinking and how you operate. I put in a lot of hours for mine to get to that point. But now it immediately knows where I'm likely to go with things and what I will find relevant.
-
There's a deep irony when you look at the dark side vs light side IRL. Superficially, they both reject external authority. But the dark replaces external authority with complete self-bias. While the light side replaces external authority with truth. The light side also integrates the dark side, while the dark side cannot integrate the light side.
-
aurum replied to Davino's topic in Spirituality, Consciousness, Awakening, Mysticism, Meditation, God
Fair. But it also shouldn’t be assumed that everyone needs or wants to run as fast as Usain Bolt on steroids.
-
aurum replied to Davino's topic in Spirituality, Consciousness, Awakening, Mysticism, Meditation, God
That's why we have drugs.
-
Various forms of ethical non-monogamy can be a way to achieve variety. But you also have to be able to handle your partner having sex with other people.
-
Something like that, yes. But I also think anything is going to start to feel mechanical once you've done it enough. We just place a much higher standard on sex to be exciting. Sign my petition to sometimes allow sex to be boring, please.
-
I feel like sexual quantity is easy for me to satisfy. I can have sex 1-2x per week and be perfectly content. Maybe even less. The bigger problem I've had has been variety. It's easy in a LTR for sex to start to feel mechanical, or to start fantasizing about other women.
