*gets up on soap box*
With the announcement of this new "code mode" from Anthropic and Cloudflare, I've gotta rant about LLMs, MCP, and tool-calling for a second
Let's all remember where this started
LLMs were bad at writing JSON
So OpenAI asked us to write good JSON schemas & OpenAPI specs
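For context, a "tool" in that era was just a name, a description, and a JSON Schema for arguments, and the model's only job was to emit arguments matching the schema. A minimal sketch (the tool name and fields are invented for illustration, but the overall shape follows OpenAI's function-calling format):

```typescript
// A minimal sketch of the JSON-Schema-per-tool shape: name, description,
// and a JSON Schema for the arguments. Tool name and fields are made up.
const getWeatherTool = {
  type: "function",
  function: {
    name: "get_weather",
    description: "Look up current weather for a city",
    parameters: {
      type: "object",
      properties: {
        city: { type: "string", description: "City name" },
        units: { type: "string", enum: ["celsius", "fahrenheit"] },
      },
      required: ["city"],
    },
  },
};

// The model is then expected to emit a JSON string matching that schema:
const modelOutput = '{"city":"Berlin","units":"celsius"}';
const args = JSON.parse(modelOutput);
```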
But LLMs sucked at tool calling, so it didn't matter much. Full OpenAPI specs were too long for the context window, so everyone hand-wrote custom subsets

Then LLMs got good at tool calling (yay!) but everyone had to integrate differently with every LLM
Then MCP comes along and promises a write-once, integrate-everywhere story.
It's OpenAPI all over again. MCP is just OpenAPI with slightly different formatting, and no real justification for redoing the same work we already did writing OpenAPI specs, just differently
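To make the "same work, different formatting" point concrete, here's the same operation described twice: once as an OpenAPI-style operation and once as an MCP tool definition. Both boil down to a name, a description, and a JSON Schema for inputs. (The operation and fields are invented for illustration; shapes are simplified.)

```typescript
// An OpenAPI-style operation: operationId + parameters with JSON Schemas.
const openApiOperation = {
  operationId: "listInvoices",
  summary: "List invoices for a customer",
  parameters: [
    { name: "customerId", in: "query", required: true, schema: { type: "string" } },
  ],
};

// The same thing as an MCP tool: name + description + inputSchema.
const mcpTool = {
  name: "list_invoices",
  description: "List invoices for a customer",
  inputSchema: {
    type: "object",
    properties: { customerId: { type: "string" } },
    required: ["customerId"],
  },
};
```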
MCP itself goes through a lot of iteration. Every company ships MCP servers. Hype is through the roof. Yet actual MCP usage stays super niche
But now we hear MCP has problems. It uses way too many tokens. It's not composable.
So now Cloudflare and Anthropic tell us it's better to use "code mode", where we have the model write code directly
Now this next part sounds like a joke, but it's not. They generate a TypeScript SDK based on the MCP server, and then ask the LLM to write code using that SDK
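Roughly, the flow looks like this: an MCP tool becomes a typed TypeScript function, and the model writes ordinary code against it instead of emitting JSON tool calls. A sketch under assumed shapes (the SDK function, types, and data here are invented; real generators will differ, and the function is stubbed rather than actually calling an MCP server):

```typescript
// Hypothetical record type the generated SDK might expose.
interface Invoice { id: string; total: number }

// What a generated SDK wrapper might look like. Stubbed for illustration;
// in reality this would forward the call to the MCP server.
async function listInvoices(params: { customerId: string }): Promise<Invoice[]> {
  return [{ id: "inv_1", total: 40 }, { id: "inv_2", total: 60 }];
}

// What the model then writes: plain, composable code. No per-step JSON
// tool calls, and intermediate results never pass through the context window.
async function totalOwed(customerId: string): Promise<number> {
  const invoices = await listInvoices({ customerId });
  return invoices.reduce((sum, inv) => sum + inv.total, 0);
}
```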
Are you kidding me? After all this, we want the LLM to use the SAME EXACT INTERFACE that human programmers use?
I already had a good SDK at the beginning of all this, automatically generated from my OpenAPI spec (shout-out @StainlessAPI)
Why did we do all this tool calling nonsense? Can LLMs effectively write JSON and use SDKs now?
The central thesis of my rant is that OpenAI and Anthropic are platforms, they run "app stores," and they don't take that responsibility and opportunity seriously. It's been this way for years. The quality bar is so much lower than the rest of the stuff they ship. They need to invest the way Apple invests in Swift and Xcode. They think they're an API company like Stripe, but they're a platform company like an OS.
I, as a developer, don't want to build a custom ChatGPT clone for my domain. I want to ship ChatGPT and Claude apps so folks can access my service from the AI they already use
Thanks for coming to my TED talk
One thing I also have trouble understanding: most MCP servers are just lightweight shims or proxies you have to run on your local system, distributed as if they were packages. Why? If they're servers, just host them like an API.
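For anyone who hasn't looked inside one of these packages: a typical shim speaks MCP's JSON-RPC over stdio locally and just relays each message to a hosted endpoint. A pure sketch of the relay step (the endpoint URL is invented; the actual stdin/stdout loop and `fetch` call are described in comments rather than run):

```typescript
// The hosted service the shim actually talks to (invented URL).
const REMOTE = "https://api.example.com/mcp";

// Turn one stdio line into the HTTP request options the shim would send.
function relayOptions(line: string): { method: string; headers: Record<string, string>; body: string } {
  JSON.parse(line); // validate it's well-formed JSON before relaying
  return {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: line,
  };
}

// The full shim is just: read a line from stdin, fetch(REMOTE,
// relayOptions(line)), write the response back to stdout. The point:
// nothing in that loop needs to run on the user's machine at all.
```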
Nov 8, 2025 · 8:34 AM UTC

