LordFall

The AI crash is impossible - Change my View

44 posts in this topic

@LordFall I am not saying Bitcoin will stop existing, but it will drop 80% if AI crashes. It will eventually recover, but that will take years.

Be careful, you are going to lose a lot of money with your mindset.

But all other crypto will get slaughtered in the AI crash.


You are God. You are Truth. You are Love. You are Infinity.

19 minutes ago, Leo Gura said:

Yes it does. Sam Altman was touting how GPT5 is equivalent to human PhDs.

They actually believe this shit.

No AI will ever replace me. These tech bros can only dream of replacing me.

Current humans are so primitive. Just imagine how advanced AI-integrated humans will be in 500 years. They will look back at this post and laugh at us for thinking we are so advanced and irreplaceable, yet we were replaced. Most humans can barely tell their ass from a hole in the ground, so saying AI can't replace this intelligence is just funny. Not so long ago we didn't have electricity and most humans couldn't read or write. We are not that smart. I think GPT5 can already do more than a regular person.


Prometheus was always a friend of man


Something I notice in this debate: both scenarios, crash and no-crash, seem to converge on the same distributional outcome. LordFall's own framing of the K-shaped recovery is actually the most honest part of this whole thread, but it kind of disappeared into the AI girlfriend discussion.

If the bubble crashes, middle-class retirement portfolios and junior employment take the hit while the firms restructure and survive. If it doesn't crash, if AI gets absorbed into the military-industrial complex and nationalized the way the Palantir CEO is already suggesting, who governs that? Not the people being displaced by it.

While the crash/no-crash question is interesting, I am more curious about what legitimate governance structures could emerge around this technology, regardless of which scenario plays out. UBI from Trump and nationalized compute are two very different things with very different accountability structures, depending on where decisions actually get made. A federally administered compute nationalization has the same structural problem as any centralized controller: by the time Washington perceives what's happening on the ground and responds, conditions have changed. The history of complex-system management suggests that response latency matters as much as intent.

Not saying decentralization is magic either. Just that "government steps in" is doing a lot of work in both the crash and no-crash scenarios without specifying what kind of governance architecture actually has the bandwidth to handle this.


Civilization has outgrown its coordination infrastructure : an open essay on why, and what the design pattern might look like: The Coordination Imperative


The issue is, IF AGI does come into being (and that is a big IF; current LLMs, no matter how powerful, are not AGI), then most blue- and white-collar jobs will be replaced entirely, and we will need an entirely new economic system. I don't know what that will be. UBI sounds good on paper, but the math may not work out.

Now, will the people with most of the money in this new era have any incentive to help the general public adapt to these radically different times?
Honestly, I lean towards no. Wealth inequality is already massive, and it will probably get even bigger. Will the public be okay with this? No. We will have mass unemployment and hordes of angry young men. Who knows where that'll lead. It may be the 'end' of this particular world, or it may lead to an era of great prosperity for humans. We can't know for sure right now.


Again, this is all under the assumption that AGI does happen. Currently, if you combine all forecasts, there is roughly a 30-50% chance we will have AGI by 2030-2035, and a 50-75% chance we will achieve AGI by 2035-2050. But this is not a guarantee either. There are two main views in the AI community on achieving AGI:

1)  The “Scaling Gets Us There” View

Some researchers think no fundamentally new idea is needed. The argument is that current methods (transformers, deep learning, reinforcement learning) just need:

- more compute

- more data

- better training techniques

- better architectures built on the same principles

 2) The “We Need a New Paradigm” View

Other researchers think current methods will plateau. They argue current AI lacks key things like true reasoning, world models, long-term memory, self-improving learning, efficient learning from small data, etc. They believe a new breakthrough may be needed similar to how:

- backpropagation unlocked neural networks

- transformers unlocked modern AI

- deep learning unlocked modern computer vision

In other words, we may not even get AGI, because scaling may not get us there and a new paradigm may simply never happen (as with nuclear fusion right now). The best any one of us can do now is make and save as much money as possible, strengthen the relationships we already have, and awaken from the dream of life so that we can live without fear in this new era, or die without fear. Me personally, I am trying to use my CS background and all the LLMs and AI systems out right now to try to make millions of dollars. Hopefully I can get there.

17 minutes ago, Bjorn K Holmstrom said:

Something I notice in this debate: both scenarios, crash and no-crash, seem to converge on the same distributional outcome. LordFall's own framing of the K-shaped recovery is actually the most honest part of this whole thread, but it kind of disappeared into the AI girlfriend discussion.

If the bubble crashes, middle-class retirement portfolios and junior employment take the hit while the firms restructure and survive. If it doesn't crash, if AI gets absorbed into the military-industrial complex and nationalized the way the Palantir CEO is already suggesting, who governs that? Not the people being displaced by it.

While the crash/no-crash question is interesting, I am more curious about what legitimate governance structures could emerge around this technology, regardless of which scenario plays out. UBI from Trump and nationalized compute are two very different things with very different accountability structures, depending on where decisions actually get made. A federally administered compute nationalization has the same structural problem as any centralized controller: by the time Washington perceives what's happening on the ground and responds, conditions have changed. The history of complex-system management suggests that response latency matters as much as intent.

Not saying decentralization is magic either. Just that "government steps in" is doing a lot of work in both the crash and no-crash scenarios without specifying what kind of governance architecture actually has the bandwidth to handle this.

Yes, it seems our minds got carried away by a philosophical discussion on the value of AI titties, and as you guys can read, my position is clear: brother Leo is misguided in undervaluing them, but to each their own.

I think you're correct that governance and policy are gonna shape this whole issue. They could handle it well and, if/when mass layoffs occur, have social programs in place to manage a transition. Or there could be a slow collapse in standards of living until things reach a breaking point, which historically happens at around a 20-30% unemployment rate.

@Lazarus93 Once again, what do you define as AGI? The era of AI agents majorly changing the economy will happen well before AGI. You can already have Claude Code work on your PC 24/7 and extract economic value on your behalf, or buy a whole server to run 50 different instances of it with different objectives that each learn from each other.

Here's a funny one: how soon till we get the first robot police patrolling US streets? I predict that will be a huge issue, given how sensitive policing is in the US, but it does make sense; in the end perhaps it will be more reliable than human cops. I think by 2030 they start pilot projects in smaller cities, and by 2035 it's common.


Owner of creatives community all around Canada as well as a business & Investing mastermind 

Follow me on Instagram @Kylegfall 

 

23 minutes ago, LordFall said:

Leo is misguided in undervaluing them

I acknowledge that there will be a huge market of pathetic incels who pay money for AI girlfriends.

It's just not a world I want to live in.

These AI girlfriend companies will be like Andrew Tate, scamming desperate, lonely young men out of their money with fake girlfriends. Your fake AI girlfriend will want you to buy her shit with real money. Lol. You can't make this shit up!


To me, they're like a smarter, improved version of a search engine at the end of the day. I'm speaking from ignorance here, while acknowledging the potentially amazing value this technology could have. But the hype is just too much. LLMs can't even get basic information right sometimes.


@UnbornTao Humans also struggle to get basic information right and are less prone towards improving.

@Leo Gura Young men struggling with dating is not the only nor probably main market for this. Plenty of elderly people are not in relationships and struggle with intimacy. Women are having a huge dating crisis as well.

I don't think you should simply judge it as pathetic incels. It's just the path of least resistance. Equally, AI can coach you into getting a high-quality companion. What percentage of the population do you think has read a dating book in their life? Why would they get a dating coach? They'll just take the predictable consumer solution to the problem. It's a weird future, but it will be personalized, so you get to pick what you want. Me, I like the harem idea, so I will date many women, and why not an AI girlfriend on top? Seems fun. The sky is the limit.


6 minutes ago, LordFall said:

Young men struggling with dating is not the only nor probably main market for this. Plenty of elderly people are not in relationships and struggle with intimacy. Women are having a huge dating crisis as well.

Those are all valid points. But it is pathetic nonetheless. It is deeply unhealthy and will lead to many social problems.

An AI girlfriend will never satisfy you, and it will make it all the harder to fix the problem of underdeveloped social skills. It is like feeding drugs to people in pain: it seems nice in the short term but is disastrous in the long term.

6 minutes ago, LordFall said:

@UnbornTao Humans also struggle to get basic information right and are less prone towards improving.

I guess that's true. But humans aren't hyped enough.

19 minutes ago, LordFall said:

Women are having a huge dating crisis as well.

AI girlfriends make it worse for women.

Why should guys learn how to be attractive to girls when they can simply score with AI?

AI girlfriends = reduction in need to build attraction skills = reduction of men who are attractive

26 minutes ago, Terell Kirby said:

AI girlfriends make it worse for women.

Why should guys learn how to be attractive to girls when they can simply score with AI?

AI girlfriends = reduction in need to build attraction skills = reduction of men who are attractive

I mean, it's a power law: the more this compounds, the more women will drop their standards and date the guys who do go out, so it's an equilibrium. I think it's mostly good for all parties involved; dating is gonna become more intentional now. If you want a girlfriend, the AI will tell you which venues are most popular and help you debrief your nights out, and if you can't be bothered, then you'll get a high-quality virtual intimacy experience as well.

For women who date and just want a companion for life, there are gonna be plenty of dudes available like usual, and for the ones with more refined tastes it's gonna be more of a hunt, but I think they'll still find solid partners they're into.

And being in a cool rich dude's harem is a good choice, I think; for the right woman that will be a very fun lifestyle. A lot of women are bi, remember, so being in a polycule would be an optimal scenario for them.


2 hours ago, Leo Gura said:

Yes it does. Sam Altman was touting how GPT5 is equivalent to human PhDs.

They actually believe this shit.

No AI will ever replace me. These tech bros can only dream of replacing me.

@Leo Gura A quick test of human intelligence vs. AI intelligence hit me when I was talking to ChatGPT and then switched to deep journaling in OneNote. I instantly realized how much more powerful that level of contemplation, creativity, and understanding is in OneNote. I also think there's something deeply wrong when they say AI intelligence is PhD level.


I created a platform to build, design, and iterate your life at lifebase.ai


An AI girlfriend sounds heavily depressing. A GPU-rendered character on a screen that is programmed to like the buyer.

Idiots will get one and then purchase pixelated high heels for their AI wife because she wants a pair from the microtransaction shop.

Incels are already feeding microtransactions in games. 

At this point, it's not really an "AI girlfriend" but a tool to artificially suck your balls dry.

32 minutes ago, Hafiz said:

Jensen himself is saying the bubble is about to pop.

https://www.cnbc.com/amp/2026/03/04/nvidia-huang-openai-investment.html

Did you read the article? OpenAI is going public soon, so they're not expecting to hold another round of crossover investment. OpenAI is gonna go public right as it gets hit with the defence contracts that Anthropic lost, so I don't see how this is bearish for them or the AI industry.


4 hours ago, Leo Gura said:

I would rather be dead than have an AI girlfriend.

I would rather be dead than have an AI boyfriend!

As an aside, we tried to use AI to do some pricing/estimating for construction work...

Let's just say, consumer-grade shit has a long way to go, if it can ever estimate for commercial construction.

 


It is far easier to fool someone, than to convince them they have been fooled.


I am turning into old-man-yells-at-cloud.


5 minutes ago, Leo Gura said:

I am turning into old-man-yells-at-cloud.

Dude, I agree with so much of what you have to say except for your AI pessimism. Try making a short film now with AI. You can clearly see it's going to take out Hollywood in a few years, and the individual's power to create is going to explode. Any kid on a laptop will be able to create a Hollywood-level film. It's gonna be a massive shift in how we experience reality.


When the first cars came out, there were people who said, "I would rather die than get rid of my horse. I love my horse. A car can never replace my horse. A car can't keep me company, leave farts, and make me laugh." Blabla.

It is typical boomer talk. These people said the same thing about smartphones. They would never take a smartphone or 5G, and now everybody is on it. We shouldn't take boomers too seriously, nor should we try to convince them. Every century has them.

Especially people whose livelihoods are on the line will be in denial until one day they wake up and have to face the music. They will feel like Neanderthals among Homo sapiens, but this time the Homo sapiens will feel outmatched by AI-integrated humans.

There is nothing more integral than AI-integrated humans.

