Training & Learning

Epoch

One complete pass through the entire training dataset during model training.

An epoch is one full cycle of training where the model sees every example in the training set exactly once. Training typically spans many epochs, allowing the model to refine its weights through repeated exposure to the data.
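The epoch structure can be sketched as a nested loop: an outer loop over epochs and an inner loop over training examples. This is a minimal pure-Python illustration (the dataset, learning rate, and epoch count are toy values chosen for the example, not recommendations):

```python
import random

# Toy dataset: learn y = 2x with a single weight via gradient descent.
data = [(x, 2.0 * x) for x in range(1, 11)]

w = 0.0           # model weight, starts untrained
lr = 0.001        # learning rate (toy value)
num_epochs = 20   # number of full passes over the dataset

for epoch in range(num_epochs):
    random.shuffle(data)           # common practice: reshuffle each epoch
    for x, y in data:              # one epoch = every example seen exactly once
        pred = w * x
        grad = 2 * (pred - y) * x  # gradient of squared error w.r.t. w
        w -= lr * grad

print(round(w, 2))
```

Each pass over `data` is one epoch; repeated passes let `w` converge toward the true value of 2.0.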

Too few epochs lead to underfitting: the model has not yet captured the patterns in the data. Too many lead to overfitting: the model memorizes the training examples and generalizes poorly to new ones.
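A standard way to choose the number of epochs is early stopping: monitor validation loss after each epoch and stop once it stops improving. A minimal sketch, using a synthetic loss curve in place of a real model (the `patience` value and the loss numbers are illustrative assumptions):

```python
# Synthetic per-epoch validation losses: improvement, then overfitting.
val_losses = [0.9, 0.7, 0.55, 0.5, 0.48, 0.49, 0.52, 0.56]

best_loss = float("inf")
patience = 2      # how many non-improving epochs to tolerate
bad_epochs = 0
stopped_at = None

for epoch, loss in enumerate(val_losses, start=1):
    if loss < best_loss:
        best_loss, bad_epochs = loss, 0  # new best: reset the counter
    else:
        bad_epochs += 1
        if bad_epochs >= patience:       # no improvement for `patience` epochs
            stopped_at = epoch
            break

print(stopped_at, best_loss)
```

Here validation loss bottoms out at epoch 5, so training halts at epoch 7 after two non-improving epochs, keeping the best loss of 0.48.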

Typical values: from a few epochs for fine-tuning to hundreds for training from scratch on small datasets.

In modern LLM pretraining, models often see data only once or twice — the dataset is so large that multiple epochs aren't necessary or practical. Fine-tuning typically uses 1-5 epochs.
