Epoch

When training a neural network we repeatedly pass over a training set. One epoch means one complete pass through that training set, and it usually consists of several iterations (one parameter update per batch).

The training set may be divided into batches; even so, one epoch still means going through the entire training set (i.e. through all of its batches).
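
To illustrate the relationship between epochs, batches, and iterations, here is a minimal sketch in Python; the dataset size, batch size, epoch count, and the `train_step` function are placeholders chosen for the example, not part of any particular framework.

```python
import numpy as np

num_samples = 1000   # size of the training set (assumed for illustration)
batch_size = 100     # samples per batch (assumed)
num_epochs = 5       # number of full passes over the training set (assumed)

X = np.random.rand(num_samples, 10)  # placeholder training data

for epoch in range(num_epochs):
    # One epoch: a single pass over the entire training set.
    indices = np.random.permutation(num_samples)  # shuffle each epoch
    for start in range(0, num_samples, batch_size):
        batch = X[indices[start:start + batch_size]]
        # One iteration: one parameter update computed on one batch.
        # train_step(batch)  # hypothetical update function

# Here each epoch contains num_samples / batch_size = 10 iterations.
```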