The big advantage of MCP over OpenAPI is that it is very clear about auth. OpenAPI supports too many different auth mechanisms, and the schema doesn't necessarily have enough information for a robot to be able to complete the auth flow.
*gets up on soap box* With the announcement of this new "code mode" from Anthropic and Cloudflare, I've gotta rant about LLMs, MCP, and tool calling for a second.

Let's all remember where this started. LLMs were bad at writing JSON, so OpenAI asked us to write good JSON schemas and OpenAPI specs. But LLMs sucked at tool calling, so it didn't matter. OpenAPI specs were too long, so everyone wrote custom subsets.

Then LLMs got good at tool calling (yay!), but everyone had to integrate differently with every LLM. Then MCP came along and promised a write-once, integrate-everywhere story. It's OpenAPI all over again: MCP is just OpenAPI with slightly different formatting, and no real justification for redoing the same work we did to make OpenAPI specs, just differently. MCP itself goes through a lot of iteration. Every company ships MCP servers. Hype is through the roof. Yet actual use of MCP is super niche.

But now we hear MCP has problems. It uses way too many tokens. It's not composable. So now Cloudflare and Anthropic tell us it's better to use "code mode", where we have the model write code directly.

Now this next part sounds like a joke, but it's not. They generate a TypeScript SDK based on the MCP server, and then ask the LLM to write code using that SDK. Are you kidding me? After all this, we want the LLM to use the SAME EXACT INTERFACE that human programmers use? I already had a good SDK at the beginning of all this, automatically generated from my OpenAPI spec (shout-out @StainlessAPI). Why did we do all this tool-calling nonsense? Can LLMs effectively write JSON and use SDKs now?

The central thesis of my rant is that OpenAI and Anthropic are platforms and they run "app stores", but they don't take this responsibility and opportunity seriously. And it's been this way for years. The quality bar is so much lower than the rest of the stuff they ship. They need to invest like Apple does in Swift and Xcode.
They think they're an API company like Stripe, but they're a platform company like an OS. I, as a developer, don't want to build a custom ChatGPT clone for my domain. I want to ship ChatGPT and Claude apps so folks can access my service from the AI they already use. Thanks for coming to my TED talk.

Nov 8, 2025 · 5:21 PM UTC

Maybe an agent could read the docs and write code to auth. But we don't actually want that, because it implies the agent gets access to the API token! We want the agent's harness to handle that and never reveal the key to the agent.
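A minimal sketch of the idea, with all names invented for illustration: the harness holds the API key and hands the agent only a closure that injects the credential, so the token never appears in the agent's context window.

```typescript
// Hypothetical harness-side credential injection. The agent receives only
// the function returned by makeAgentTool; the key stays in the closure.

interface ToolRequest {
  url: string;
  headers: Record<string, string>;
  body?: string;
}

function buildAuthedRequest(
  apiKey: string,
  baseUrl: string,
  path: string,
  body?: unknown,
): ToolRequest {
  return {
    url: `${baseUrl}${path}`,
    headers: { Authorization: `Bearer ${apiKey}` }, // injected by the harness
    body: body === undefined ? undefined : JSON.stringify(body),
  };
}

// The harness exposes only this closure to the agent; the agent can make
// authenticated calls without ever being able to read the key itself.
function makeAgentTool(apiKey: string, baseUrl: string) {
  return (path: string, body?: unknown) =>
    buildAuthedRequest(apiKey, baseUrl, path, body);
}
```

The point of the design is that credential injection happens below the agent's level of abstraction, the same way a browser attaches cookies without exposing them to page scripts.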
But we can't have every harness implementing every auth mechanism ever. We'd at least have to say "must be OAuth2". But even then, client registration is a problem.
OAuth has always assumed that the client knows what API it's talking to, and so the client's developer can register the client with that API in advance to get a client_id/client_secret pair. Agents, though, don't know what MCPs they'll talk to in advance.
So MCP requires OAuth dynamic client registration (RFC 7591), which practically nobody actually implemented prior to MCP. DCR might as well have been introduced by MCP, and may actually be the most important unlock in the whole spec.
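To make the RFC 7591 flow concrete, here is a hedged sketch of the registration request a client builds. The endpoint URL is hypothetical (real servers advertise theirs as `registration_endpoint` in their OAuth server metadata), and the metadata fields shown are the common RFC 7591 ones for a public client.

```typescript
// Sketch of an RFC 7591 dynamic client registration request.
// The client POSTs its metadata and receives a client_id in response,
// with no advance coordination with the server's developer.

function buildRegistrationRequest(registrationEndpoint: string) {
  return {
    url: registrationEndpoint, // discovered from OAuth server metadata
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      client_name: "my-agent",                        // human-readable name
      redirect_uris: ["http://127.0.0.1:9876/callback"],
      grant_types: ["authorization_code"],
      response_types: ["code"],
      token_endpoint_auth_method: "none",             // public client, no secret
    }),
  };
}
```

This is exactly the step that classic OAuth deployments skipped: the client_id was provisioned manually, which doesn't work when an agent first learns of a server at runtime.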
(RFC 7591 has problems, incidentally, and is likely to be replaced by something better. But that doesn't change the fact that dynamic registration was mostly not supported at all prior to MCP.)
A second benefit of MCP -- which code mode perhaps negates somewhat -- is that it forced people to design simpler API surfaces that an agent could understand and work with. Many APIs are just ridiculously complicated, and agents are more easily overwhelmed than humans.
OpenAPI is designed to describe HTTP / RESTful APIs, which are particularly difficult to think about (even for humans). MCP presents a simple function call API instead: much easier.
My spicy opinion is that REST was a mistake all along (even before AI entered the picture). REST is all about forcing programming interfaces into the framework of HTTP, which wasn't ever really designed for that.
REST feels good because it fits into HTTP and we have all this infrastructure and understanding around HTTP on the net. But it's actually unnatural. Nobody writes in-process APIs this way. Why should network APIs be so different?
IMO the best future for APIs would be if we ditched REST and instead designed network APIs more similarly to in-process APIs, which are easier to reason about for AI -- and also for humans.
But I think MCP may have dumbed down interfaces a bit *too* much. With code mode we now realize that AI can handle more complexity if it's presented as code instead of "tool calls". But MCP only lets you express a flat set of procedures, not rich OOP or functional interfaces.
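An illustrative contrast, with every name invented: MCP can only express a flat list of named procedures, while a TypeScript interface can express objects whose methods return further objects, which is the kind of structure code mode lets a model exploit.

```typescript
// Flat, MCP-style: one namespace, every call fully qualified.
type FlatTools = {
  drive_list_files: (args: { folderId: string }) => string[];
  drive_read_file: (args: { folderId: string; fileId: string }) => string;
};

// Rich, code-mode-style: composable objects instead of a flat procedure list.
interface Drive {
  folder(id: string): Folder;
}
interface Folder {
  list(): string[];
  file(id: string): { read(): string };
}

// Tiny in-memory stand-in so the rich shape is runnable:
function makeDrive(data: Record<string, Record<string, string>>): Drive {
  return {
    folder: (id) => ({
      list: () => Object.keys(data[id] ?? {}),
      file: (fileId) => ({ read: () => data[id][fileId] }),
    }),
  };
}
```

With the flat shape, the model must juggle stringly-named procedures and repeat IDs in every call; with the rich shape, it can hold an object reference and chain calls, the way human code does.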
I'm hoping to solve this with Cap'n Web, our RPC protocol that supports full bidirectional calling, higher-order functions, object-capabilities, streaming, etc. Basically lets you expose a rich TypeScript interface over the network. blog.cloudflare.com/capnweb-…
So the direction I'm excited to explore is MCP's auth framework + Cap'n Web APIs specified as TypeScript. But MCPish auth + REST APIs specified with OpenAPI is another route. I'm guessing we'll see both and it'll be interesting.
(If we do end up with AI agents using OpenAPI tools, though, we gotta rename it. Every time I see OpenAPI I read it as OpenAI and get really confused.)
Replying to @KentonVarda
What do you mean? How is it very clear about auth? Please explain. OpenAPI does not dictate auth/authz, and rightly so. MCP doesn't do that either. You can do API keys (i.e., equivalent to Basic Auth) or OAuth2. Both can be done with OpenAPI specs as well.
MCP dictates OAuth 2 with DCR. That's like the whole point of the remote MCP spec...
Replying to @KentonVarda
imo, removing statefulness and making MCP just a POST per endpoint would simplify it and make it a lot better, and throw in types like @headinthebox has been preaching
Replying to @KentonVarda
OpenAPI is way too big and complex for this use case. And the resulting JSON files are extremely token-inefficient.
Replying to @KentonVarda
truly fantastic thread
Replying to @KentonVarda
Agree. Along with specific use cases: APIs and REST are for humans; MCP should be for agents. A human would use 3, 4, or more API calls to get some data; for an agent that should be a single tool. And most API endpoints you don't even want to expose to the agent.
Replying to @KentonVarda
Isn't this somewhat retrospective rationalization, since MCP didn't even have auth at launch? It's not like auth was _the_ problem with OpenAPI that MCP was designed to solve.
Replying to @KentonVarda
MCP is garbage; as you said, only the handshake mechanism is worth something. The rest should be deleted and stop giving us headaches.
Replying to @KentonVarda
Try offering your MCP in the leading channels and you'll find auth is the sticking point.
Replying to @KentonVarda
The big advantage is actually the AI not having to reinvent the wheel every single time you do a recurring task. The AI should be making decisions, not rewriting the same line of code over and over.