LLM: "Call get_expenses(employee_id=1)" → Returns 100 expense items to context LLM: "Call get_expenses(employee_id=2)" → Returns 100 more items to context ... (20 employees later) → 2,000+ line items ...
Claude Codex extends the upstream openai/codex project. Refer to the upstream repository for the canonical feature list; this README focuses on the extras maintained in this fork.