Function Calling / Tool Use
Letting a language model invoke functions, APIs, or tools and use the results in its answer.
Function calling is the mechanism by which an LLM goes beyond text generation. You describe a set of available functions (in JSON Schema form), the model decides whether to call one and which, the runtime executes the call, and the result feeds back into the model so it can use it in its answer.
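That loop can be sketched in a few lines. This is a minimal, provider-agnostic illustration: `fake_model` is a stub standing in for the real LLM API call (Anthropic, OpenAI, etc. return structured tool-call objects instead), and the tool names, schemas, and return values are invented for the example.

```python
import json

# 1. Describe the available functions in JSON Schema form.
TOOLS = [
    {
        "name": "get_weather",
        "description": "Return the current temperature for a city.",
        "input_schema": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    }
]

# The runtime's actual implementations, keyed by tool name.
IMPLEMENTATIONS = {
    "get_weather": lambda city: {"city": city, "temp_c": 18},
}

def fake_model(messages, tools):
    """Stub for the LLM API: decides whether to request a tool call."""
    last = messages[-1]
    if last["role"] == "user" and "weather" in last["content"].lower():
        # 2. The model chooses a tool and emits structured arguments.
        return {"type": "tool_call", "name": "get_weather",
                "arguments": {"city": "Paris"}}
    if last["role"] == "tool":
        # 4. With the result in context, the model writes its answer.
        result = json.loads(last["content"])
        return {"type": "text",
                "content": f"It is {result['temp_c']}°C in {result['city']}."}
    return {"type": "text", "content": "Hello!"}

def run(user_input):
    messages = [{"role": "user", "content": user_input}]
    while True:
        reply = fake_model(messages, TOOLS)
        if reply["type"] == "text":
            return reply["content"]
        # 3. The runtime (not the model) executes the call and
        #    feeds the result back as a new message.
        fn = IMPLEMENTATIONS[reply["name"]]
        result = fn(**reply["arguments"])
        messages.append({"role": "tool", "content": json.dumps(result)})

print(run("What's the weather in Paris?"))  # → It is 18°C in Paris.
```

The key point the sketch makes concrete: the model never runs code itself. It only emits a structured request; your runtime executes it and decides what goes back into the conversation.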
This is the foundation of every AI agent. Without tool use, a model is a chat companion; with tool use, it's a worker that can look things up, run calculations, edit files, or trigger external systems.
Most frontier APIs support tool use natively (Anthropic, OpenAI, Gemini). MCP (Model Context Protocol) standardizes how tools are described so the same tools work across clients. The Vercel AI SDK, LangChain, and the official provider SDKs make tool calling straightforward to wire up.