An LLM-driven agent orchestrator backend built with Node.js, TypeScript, and Fastify.
- `src/api/routes/`: API endpoint definitions (Fastify plugins).
- `src/core/`: Core logic, including the `Agent` (orchestrator) and `TaskEngine` (scheduler/executor).
- `src/plugins/`: Modular tool implementations (e.g., GitHub, Shell).
- `src/types/`: Shared TypeScript interfaces and types.
- `src/utils/`: Utility modules such as `llmService` (OpenAI integration).
- `src/index.ts`: Application entry point.
- `POST /task/create`: Create a task from a natural language prompt.
- `GET /task/list`: List all tasks and their status.
- `GET /task/status/:id`: Get detailed status and logs for a specific task.
- `POST /task/approve/:id`: Approve a task for execution.
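As an illustrative sketch of the request shape for `POST /task/create` (the field names and status values here are assumptions for illustration; the real definitions live in `src/types/`):

```typescript
// Hypothetical payload shapes for the task endpoints. Field names and
// status values are assumptions, not taken from src/types/.
type TaskStatus = "pending_approval" | "approved" | "running" | "completed" | "failed";

interface CreateTaskRequest {
  prompt: string;
}

interface TaskSummary {
  id: string;
  status: TaskStatus;
}

// Build and validate the JSON body for POST /task/create from a raw prompt.
function buildCreateTaskRequest(prompt: string): CreateTaskRequest {
  const trimmed = prompt.trim();
  if (trimmed.length === 0) {
    throw new Error("prompt must not be empty");
  }
  return { prompt: trimmed };
}
```

A client would send this body as JSON and poll `GET /task/status/:id` with the returned `id`.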
1. Prompt: The user sends a natural language prompt to `POST /task/create`.
2. Spec generation: The LLM (OpenAI) generates a structured `TaskSpec` (steps, tools, args).
3. Pending approval: The task is created with status `pending_approval`.
4. Approval: A human calls `POST /task/approve/:id`.
5. Execution: The `TaskEngine` executes the steps sequentially using registered plugins.
6. Logging: Real-time logs are appended to the task object.
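The lifecycle above can be sketched as a small state machine. Only `pending_approval` appears in the source; the other status names and the event names are illustrative assumptions:

```typescript
// Task lifecycle sketch. Only `pending_approval` is confirmed by the
// project docs; the remaining statuses and events are assumptions.
type TaskStatus = "pending_approval" | "approved" | "running" | "completed" | "failed";
type TaskEvent = "approve" | "start" | "finish" | "error";

// Allowed transitions: a task must be approved by a human before the
// TaskEngine may start executing its steps.
const transitions: Record<TaskStatus, Partial<Record<TaskEvent, TaskStatus>>> = {
  pending_approval: { approve: "approved" },
  approved: { start: "running" },
  running: { finish: "completed", error: "failed" },
  completed: {},
  failed: {},
};

function transition(status: TaskStatus, event: TaskEvent): TaskStatus {
  const next = transitions[status][event];
  if (!next) {
    throw new Error(`invalid transition: ${status} -> ${event}`);
  }
  return next;
}
```

Modeling the flow this way makes the human-in-the-loop gate explicit: there is no path from `pending_approval` to `running` that skips `approve`.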
Tools are defined by the `ITool` interface and registered with the `TaskEngine`.
Default tools:
- `shell`: Executes local commands via `child_process`.
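A minimal sketch of what `ITool` and the default `shell` tool might look like. The exact interface shape and method names are assumptions; see `src/types/` and `src/plugins/` for the real definitions:

```typescript
import { execSync } from "node:child_process";

// Hypothetical shape of the ITool interface; the real definition lives
// in src/types/ and may differ.
interface ITool {
  name: string;
  description: string;
  execute(args: Record<string, string>): Promise<string>;
}

// Default shell tool: runs a local command via child_process and returns
// its stdout. A real implementation should sandbox and time-limit this.
const shellTool: ITool = {
  name: "shell",
  description: "Executes local commands via child_process",
  async execute(args) {
    return execSync(args.command, { encoding: "utf8" });
  },
};
```

Registering it would then be a one-liner along the lines of `engine.registerTool(shellTool)` (the registration method name is also an assumption).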