if people understood how good local LLMs are getting, the stock market would crash tomorrow
Anthropic doesn’t want you to know, but you can self-host an LLM like Qwen and use it in Claude Code for free
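(A hedged sketch of the self-hosting half of that claim: the snippet below calls a locally served Qwen model through an OpenAI-compatible endpoint. Ollama's default port, the "qwen2.5-coder" model tag, and the placeholder API key are all assumptions; wiring the same local server into Claude Code typically needs an extra proxy that translates to the Anthropic Messages API, which isn't shown here.)

```python
# Minimal sketch: query a locally served Qwen model through an
# OpenAI-compatible endpoint. Assumes a local server such as Ollama
# is running on its default port (11434) and that a Qwen model
# (here the assumed tag "qwen2.5-coder") has already been pulled.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # local server, not a hosted API
    api_key="ollama",                      # placeholder; local servers ignore it
)

response = client.chat.completions.create(
    model="qwen2.5-coder",                 # assumed local model tag
    messages=[
        {"role": "system", "content": "You are a coding assistant."},
        {"role": "user", "content": "Write a Python function that reverses a string."},
    ],
)

print(response.choices[0].message.content)
```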
@DavidOndrej1 can you recommend something in particular? Assuming a local GPU with 8GB VRAM or a Mac with 16-32GB unified RAM.
Replying to @eiselems
MacBook with 128GB RAM

Nov 8, 2025 · 8:37 PM UTC

Replying to @DavidOndrej1
What else was I expecting? 😅 Yeah, I agree that LLMs are just power-hungry right now. But I guess that's why the stock market isn't collapsing yet. We're still in the "Uber era" of LLMs, where it's way cheaper and makes more financial sense to use an API than to run inference on your laptop/phone.
Yeah, because the average ChatGPT user can afford a 128GB RAM MacBook
I have a Windows machine with 16GB RAM running Llama without any issue
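(A hedged sketch of what that setup might look like: a 4-bit-quantized 7B-8B model in GGUF format fits comfortably in 16GB of RAM using llama-cpp-python. The model path and thread count below are hypothetical.)

```python
# Minimal sketch of running a quantized Llama model on a 16GB machine
# with llama-cpp-python. The model file is a hypothetical local path
# to a small 4-bit GGUF build.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/llama-3-8b-instruct.Q4_K_M.gguf",  # assumed local file
    n_ctx=4096,    # context window; raise if memory allows
    n_threads=8,   # CPU threads to use for inference
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Summarize what a GGUF file is."}],
    max_tokens=256,
)

print(out["choices"][0]["message"]["content"])
```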
Who the hell has 128GB of RAM?
Do you now understand the problem?
So that’s why the stock market isn’t crashing 😆
For running what? What's state of the art with tool calling that you can run locally with decent context?
So in Apple world, 128GB would be like a billion-USD upsell