Techniques & Methods

Zero-Shot Learning

A model's ability to perform a task without any examples — just instructions or task descriptions.

Zero-shot learning describes a model performing a task it was never explicitly trained on, guided only by natural language instructions. Ask an LLM to translate, summarize, or classify text, and it can do so without seeing a single example.

Zero-shot capability emerged in large language models as a side effect of training on diverse internet-scale text. Models learned to generalize task intent from instructions alone.

Contrast with few-shot: zero-shot uses only a task description; few-shot adds a handful of examples in the prompt.
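The contrast can be sketched as prompt construction: the zero-shot prompt carries only the task description, while the few-shot prompt prepends labeled examples. The helper names and prompt wording below are illustrative assumptions, not any specific provider's API.

```python
# Sketch: zero-shot vs. few-shot prompts for a sentiment classification task.
# Both share the same task description; few-shot adds in-prompt demonstrations.

TASK = (
    "Classify the sentiment of the following review as positive or negative."
)

def zero_shot_prompt(text: str) -> str:
    """Task description only -- no examples."""
    return f"{TASK}\n\nReview: {text}\nSentiment:"

def few_shot_prompt(text: str, examples: list[tuple[str, str]]) -> str:
    """Same task description, plus a handful of labeled examples."""
    demos = "\n\n".join(
        f"Review: {review}\nSentiment: {label}" for review, label in examples
    )
    return f"{TASK}\n\n{demos}\n\nReview: {text}\nSentiment:"

print(zero_shot_prompt("Great battery life."))
print(few_shot_prompt(
    "Great battery life.",
    [("Loved every minute of it.", "positive"), ("A total waste.", "negative")],
))
```

Either string would then be sent to the model; the only difference is whether demonstrations appear before the query.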

Zero-shot is the default way most people interact with LLMs today. Its quality depends heavily on the clarity of the prompt and the capability of the underlying model.
