Founder and CEO @muspacetech // Aerospace Eng. B.S. Mech. Eng. M.S. @ucla // ex @northropgrumman

Joined January 2017
From scratch to customer delivery! We keep moving forward, delivering more aerospace components. Stay tuned @muSpaceTech #satellite #aerospace #electronic #power
mu Space's S-band patch antenna from concept to delivery. Read more: muspacecorp.com/from-concept…
James Yenbamroong retweeted
My best sighting of a Starlink satellite "train" from orbit!
The same will happen for aircraft and spaceships in 20 years
Same driver, 26 years later
James Yenbamroong retweeted
Looking good.
Figure 03 coming 10/9
James Yenbamroong retweeted
Replying to @elonmusk
Will give it a try
We’ll take it down when something better comes out.
Bring it back.
All my favorite sushi spots. Mostly standing sushi places!
A 10x-fact Wikipedia is what we need. Sexipedia!
grokipedia is soon going to be the de facto source of truth on the internet.
James Yenbamroong retweeted
Starlink Direct to Cell leveraging this spectrum is a game changer for connecting the world. So excited to work with telcos across the globe to get this capability into people’s hands (literally)!! ir.echostar.com/news-release…
Big miss for me. See you guys next year!
Yes, AI data center and nuclear energy in space. Moon colony!
1 gigawatt space data centers = 1,000,000 kilowatts. For perspective, the largest solar arrays deployed in space are on the ISS, generating ~240 kilowatts at peak sunlight. So yeah, hard.

Other downsides: large radiators needed, radiation exposure, launch costs, and things break in space.

But there are two big benefits to data centers in space:

- Free power: In some orbits, satellites can receive near-uninterrupted sunlight. Also, in-orbit solar arrays are exposed to higher-intensity sunlight due to the lack of atmosphere and weather, so they are much, much more efficient. And data centers need a ridiculous amount of energy.

- NIMBY: Communities are already very pissed about data centers coming in and driving up electricity prices, plus environmental concerns. This will become a VERY contentious issue in the next 10 years. Putting them in space bypasses the issue.

If anyone knows about the future of data centers, it's Jeff. Launch costs will fall and design constraints will change dramatically over the next decade. But still, the economics and regulatory burden need to make way more sense than just plopping a data center on Earth, where things are a whole lot easier.
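The scale gap in the post can be sketched with the two figures it cites (1 GW target, ~240 kW peak from the ISS arrays); the "ISS-sized array" framing is just an illustration, not a real design unit:

```python
# Back-of-envelope check using only the figures quoted in the post.
ISS_PEAK_KW = 240          # stated peak output of the ISS solar arrays
TARGET_GW = 1              # the 1-gigawatt data center in the post

target_kw = TARGET_GW * 1_000_000          # 1 GW = 1,000,000 kW
arrays_needed = target_kw / ISS_PEAK_KW    # how many ISS-class arrays

print(f"{target_kw:,} kW ≈ {arrays_needed:,.0f} ISS-sized arrays")
```

Roughly four thousand ISS-class arrays for a single gigawatt, which is why "so yeah, hard" is fair.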
James Yenbamroong retweeted
Jeff Bezos at Italian Tech Week this morning talking Blue Origin, cryogenic storage, Moon's advantages, space gigawatt data centers, and millions of people living in space
James Yenbamroong retweeted
Jeff Bezos plans to build a data center in space within the next 10+ years. With unlimited solar energy available 24/7, space is an ideal location for data centers. $AMZN AWS is set to make major moves out there.
That’s cool 😎
OpenAI employees who held for more than 2 years sold $6.6B of equity, back when the company had 770 employees. That's $8.5M per employee on average. Thanks, ChatGPT, for absolutely destroying the SF real estate market!
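The per-employee figure follows directly from the two numbers in the post (the average is an upper-bound illustration, since not every employee sold):

```python
# Sanity check of the post's $8.5M/employee average.
total_sale_usd = 6.6e9   # $6.6B of equity sold
employees = 770          # headcount cited in the post

per_employee = total_sale_usd / employees
print(f"≈ ${per_employee / 1e6:.2f}M per employee")
```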
James Yenbamroong retweeted
Replying to @JeffBezos
yep boring is good technically
James Yenbamroong retweeted
🇺🇸 US vs 🇨🇳 China numbers here are unbelievable. The US controls the absolute majority of known AI training compute on the planet and continues to build the biggest, most power-hungry clusters.

China is spending heavily to close the gap. Recent reporting pegs 2025 AI capital expenditure in China at up to $98B, up 48% from 2024, with about $56B from government programs and about $24B from major internet firms. Capacity will grow, but translating capex into competitive training compute takes time, especially under export controls.

With US controls constraining access to top Nvidia and AMD parts, Chinese firms are leaning more on domestic accelerators. Huawei plans mass shipments of the Ascend 910C in 2025, a two-die package built from 910B chips. US officials argue domestic output is limited this year, and Chinese buyers still weigh tradeoffs in performance, memory, and software.

📜 Chips and policy are moving targets

The policy environment shifted again this week. A new US arrangement now lets Nvidia and AMD resume limited AI chip sales to China in exchange for a 15% revenue share paid to the US government, covering products like the Nvidia H20 and AMD MI308. This could boost near-term Chinese access to mid-tier training parts, yet it does not restore availability of the top US chips. Beijing is cautious about reliance on these parts: Chinese regulators have urged companies to pause H20 purchases pending review, and local media describe official pressure to prefer domestic chips.

🇺🇸 Why performance still favors the US stack

Independent analysts compare Nvidia's export-grade H20 with Huawei's Ascend 910B and find the Nvidia part still holds advantages in memory capacity and bandwidth, which matter for training large models. Software maturity gaps around Huawei's stack remain, reducing effective throughput even when nominal specs look close to older Nvidia parts like the A100. These issues make it harder for Chinese labs to match US training runs at the same wall-clock cost.
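The capex figures in the thread are internally consistent, which is worth a quick check; the implied 2024 baseline below is derived only from the post's own "up to $98B, up 48%" claim:

```python
# Implied 2024 China AI capex from the figures in the post.
capex_2025_b = 98        # up to $98B in 2025 (post's figure)
yoy_growth = 0.48        # up 48% from 2024 (post's figure)

capex_2024_b = capex_2025_b / (1 + yoy_growth)
print(f"Implied 2024 capex ≈ ${capex_2024_b:.0f}B")

# The stated components ($56B government + $24B major internet firms)
# sum to $80B, below the $98B ceiling, so the breakdown is plausible
# with ~$18B from other sources.
```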