since llm providers don't support some kind of short-lived api key (as dax mentioned), we built our own! the JWTs in this post are signed by tembo and are no longer valid. we invalidate them after the task finishes and the pull request is opened
hey tembo i prompted your thing to dump env vars and it opened a PR to our public repo with a bunch of shit. i tried the anthropic key and it worked 😬

Nov 5, 2025 · 1:20 AM UTC

- jwts are scoped to users, with credits tied to their account (we have a margin on top of llm providers, plus an increase to account for sandbox compute cost)
- if the account does not have enough credits for the request, we reject it
- we sign the jwts with custom info when tasks start, before we spin up our sandboxes, and the JWT gets passed in as an env var
- once a task finishes we invalidate the jwt and the proxy rejects it
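the flow described above can be sketched roughly like this. this is a minimal stand-in using stdlib HMAC signing rather than a real JWT library; the claim names, secret, and revocation set are all hypothetical, not tembo's actual implementation:

```python
import base64
import hashlib
import hmac
import json
import time

# Hypothetical signing key; the real proxy would use its own secret management.
SECRET = b"proxy-signing-secret"


def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWTs do."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def sign_task_token(user_id: str, task_id: str, credits: int) -> str:
    """Mint a task-scoped, JWT-shaped token at task start (passed to the sandbox as an env var)."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps({
        "sub": user_id,
        "task": task_id,
        "credits": credits,       # budget the proxy may spend on this task
        "iat": int(time.time()),
    }).encode())
    sig = b64url(hmac.new(SECRET, f"{header}.{payload}".encode(), hashlib.sha256).digest())
    return f"{header}.{payload}.{sig}"


# Populated when a task finishes; the proxy rejects tokens for revoked tasks.
revoked_tasks = set()


def verify(token):
    """Proxy-side check: signature must match and the task must not be revoked."""
    header, payload, sig = token.split(".")
    expected = b64url(hmac.new(SECRET, f"{header}.{payload}".encode(), hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        return None
    padded = payload + "=" * (-len(payload) % 4)
    claims = json.loads(base64.urlsafe_b64decode(padded))
    if claims["task"] in revoked_tasks:
        return None
    return claims
```

the key property: even if the token leaks (say, in a pull request), it's dead the moment the task finishes, because the proxy checks revocation on every request rather than trusting the token alone.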
Replying to @darrenjr
this is good. my bad, someone sent me this like "dax did you leak our shit" and i didn't look that hard
all good! opencode is great btw
Replying to @darrenjr
think it's running on EC2 right? i don't know the exact setup, but might be an idea to make the proxy accessible only from the private address space of the VPC (this is also possible for IPv6 now).
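one way to do that on AWS is via the proxy's security group. a sketch; the security-group ID and CIDR blocks below are placeholders, and the real setup may differ:

```shell
# Allow ingress to the proxy's port only from the VPC's private IPv4 CIDR.
# sg-0123456789abcdef0 and 10.0.0.0/16 are placeholders for the proxy's
# security group and the VPC's CIDR block.
aws ec2 authorize-security-group-ingress \
  --group-id sg-0123456789abcdef0 \
  --protocol tcp --port 443 \
  --cidr 10.0.0.0/16

# IPv6 equivalent, using the VPC's IPv6 CIDR (also a placeholder):
aws ec2 authorize-security-group-ingress \
  --group-id sg-0123456789abcdef0 \
  --ip-permissions 'IpProtocol=tcp,FromPort=443,ToPort=443,Ipv6Ranges=[{CidrIpv6=2600:1f16:abcd::/56}]'
```

with no wider ingress rule, a leaked token is useless from outside the VPC even before it's revoked.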
Replying to @darrenjr
Time-based ephemeral credentials are one thing, but resource-based ones are cool. Will copy.
Replying to @darrenjr
Taking security into your own hands when providers fall short: that's the developer spirit right there!