diff --git a/docs/freqai-parameter-table.md b/docs/freqai-parameter-table.md
index 5fe23e710..bce45133e 100644
--- a/docs/freqai-parameter-table.md
+++ b/docs/freqai-parameter-table.md
@@ -106,6 +106,7 @@ Mandatory parameters are marked as **Required** and have to be set in one of the
| `n_epochs` | The number of times the entire training dataset is used to update the model's parameters in the PyTorch training loop. An epoch represents one full pass through the training dataset. Overrides `n_steps`. Either `n_epochs` or `n_steps` must be set. <br> **Datatype:** int. optional. <br> Default: `10`.
| `n_steps` | An alternative way of setting `n_epochs` - the number of training iterations to run, where an iteration is one call to `optimizer.step()`. Ignored if `n_epochs` is set. A simplified version of the relationship: <br><br> `n_epochs = n_steps / (n_obs / batch_size)` <br><br> The motivation here is that `n_steps` is easier to optimize and keep stable across different values of `n_obs` - the number of data points. <br> **Datatype:** int. optional. <br> Default: `None`.
| `batch_size` | The size of the batches to use during training. <br> **Datatype:** int. <br> Default: `64`.
+| `early_stopping_patience` | Number of epochs with no improvement in validation loss before training is stopped early. This helps prevent overfitting by halting training when the model stops improving. Set to `0` to disable early stopping. Requires a test/validation split (`test_size > 0`). <br> **Datatype:** int. <br> Default: `0` (disabled).
### Additional parameters
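
The `n_steps`-to-`n_epochs` conversion from the table above can be sketched as a small helper. This is an illustration of the documented formula only; `epochs_from_steps` is a hypothetical function, not part of FreqAI's API.

```python
def epochs_from_steps(n_steps: int, n_obs: int, batch_size: int) -> int:
    """Convert a target number of optimizer steps into epochs.

    One epoch performs roughly n_obs / batch_size calls to
    optimizer.step(), so n_epochs = n_steps / (n_obs / batch_size).
    (Hypothetical helper illustrating the docs' formula.)
    """
    steps_per_epoch = n_obs / batch_size
    # Round to a whole number of epochs, training for at least one epoch.
    return max(1, round(n_steps / steps_per_epoch))


# 5000 optimizer steps on 6400 samples with batch size 64:
# 100 steps per epoch, so 50 epochs.
print(epochs_from_steps(5000, 6400, 64))  # -> 50
```

Because `n_steps` fixes the number of `optimizer.step()` calls rather than dataset passes, the resulting epoch count automatically shrinks as `n_obs` grows, which is the stability property the table describes.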