Most current LLM tool use assumes compile-time bindings: every tool must be known in advance, described in the prompt, and hardcoded into the agent.

We built Invoke, a lightweight framework that lets agents discover and invoke APIs dynamically at runtime using a simple agents.json descriptor: no plugins, no schemas, no registries.
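To make the idea concrete, here is a sketch of what such a descriptor might look like. The field names (`tools`, `endpoint`, `params`, etc.) and the example URL are illustrative assumptions, not Invoke's actual schema:

```json
{
  "tools": [
    {
      "name": "get_weather",
      "description": "Current weather for a given city",
      "endpoint": "https://api.example.com/weather",
      "method": "GET",
      "params": { "city": "string" }
    }
  ]
}
```

An agent would fetch this file at runtime, read the list of tools, and decide which endpoint to call, rather than shipping with a fixed tool list.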

The LLM uses a single universal function and discovers available tools at runtime, much like a browser following links.
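A minimal sketch of that universal-function pattern, assuming a descriptor shaped like the example above (the function names, schema fields, and endpoint here are my assumptions, not Invoke's actual API; the network call itself is omitted):

```python
import json

# Hypothetical agents.json content (illustrative schema, not Invoke's).
AGENTS_JSON = """
{
  "tools": [
    {
      "name": "get_weather",
      "description": "Current weather for a given city",
      "endpoint": "https://api.example.com/weather",
      "method": "GET",
      "params": {"city": "string"}
    }
  ]
}
"""

def discover_tools(descriptor: str) -> dict:
    """Parse an agents.json descriptor into a name -> tool map at runtime."""
    return {tool["name"]: tool for tool in json.loads(descriptor)["tools"]}

def invoke(tools: dict, name: str, **kwargs) -> dict:
    """The single universal function: look up a tool by name and build the
    HTTP request an agent would send (actual network call omitted)."""
    tool = tools[name]
    return {"method": tool["method"], "url": tool["endpoint"], "params": kwargs}

tools = discover_tools(AGENTS_JSON)
request = invoke(tools, "get_weather", city="Paris")
```

The point of the pattern is that the LLM only ever needs one function signature; everything else is data discovered from the descriptor at runtime.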

Whitepaper

GitHub

1-minute demo

Would love feedback and ideas — especially if you’re working on LLM agents or LangChain-style tooling.


💬 Discussion on r/LocalLLaMA (1 point, 1 comment)