Manage a fleet of cloud agents, directly from Cursor. Now with faster startup, improved reliability, and a new UI.
Replying to @cursor_ai
But can these agents prompt other agents instead of making the same request in parallel?

Oct 30, 2025 · 6:21 PM UTC

Could you clarify what you're looking for?
Sure, imagine using the “agents” tab as a chat with your team of junior devs, one to four of them, each with their own role, while the agents tab handles their team communication. You first talk to a “Project Manager” agent to get a good grasp of what’s being discussed. That agent then deploys 2-3 more agents, each with its own role, tasks, and prompts. Naturally, some output might not come out as intended, so agent 1 (the project manager) handles revisions: prompting the worker agents to correct course, unblocking them, and handling queueing. By queueing I mean waiting for one agent to deliver a feature before prompting the next one to take over. If the project manager hits an issue (needs an API key, clarification on UX, etc.), it chats back with the user. Pretty much what you’d have in a small-scale tech startup with a production team. Does this sound realistic to implement, or am I just hoping this can happen?
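
For what it's worth, a minimal sketch of that orchestration loop could look something like the following. Everything here is hypothetical: `run_agent`, `ask_user`, and `project_manager` are stand-in names, and no real Cursor API is implied. It just shows the shape of the idea: the PM splits the goal into role-scoped tasks, runs the workers one at a time (queueing), and escalates blockers back to the user.

```python
# Hypothetical sketch of a "project manager" agent orchestrating worker agents.
# run_agent() is a placeholder for whatever call actually dispatches a cloud agent.
from dataclasses import dataclass


@dataclass
class Task:
    role: str          # e.g. "backend", "frontend", "tests"
    prompt: str        # the instructions the PM writes for this worker
    result: str = ""   # filled in once the worker agent finishes


def run_agent(role: str, prompt: str) -> str:
    """Placeholder: dispatch a single worker agent and wait for its output."""
    return f"[{role}] completed: {prompt[:40]}..."


def ask_user(question: str) -> str:
    """PM escalates blockers (API keys, UX questions, ...) back to the human."""
    return input(f"PM needs input: {question}\n> ")


def project_manager(goal: str) -> list[Task]:
    # 1. PM breaks the goal into 2-3 role-scoped tasks (hard-coded split here).
    tasks = [
        Task(role="backend", prompt=f"Implement the API for: {goal}"),
        Task(role="frontend", prompt=f"Build the UI for: {goal}"),
        Task(role="tests", prompt=f"Write integration tests for: {goal}"),
    ]

    # 2. Queueing: each worker starts only after the previous one delivers,
    #    so later prompts can include earlier results as context.
    completed: list[Task] = []
    for task in tasks:
        context = "\n".join(t.result for t in completed)
        output = run_agent(task.role, f"{task.prompt}\nContext from prior agents:\n{context}")

        # 3. Unblocking: if the worker reports a blocker, the PM asks the user
        #    and re-prompts the worker with the answer.
        if "BLOCKED" in output:
            hint = ask_user(f"{task.role} agent is blocked. How should it proceed?")
            output = run_agent(task.role, f"{task.prompt}\nUnblock hint: {hint}")

        task.result = output
        completed.append(task)

    return completed


if __name__ == "__main__":
    for t in project_manager("add OAuth login to the dashboard"):
        print(t.role, "->", t.result)
```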