Dimensionality Reduction

When training a model (whether it is neural-network based or not), the most important factor is the amount and quality of the data used to train it.

The more good-quality data you have, the better you can train your model.

However, the more features (dimensions) each sample has, the longer training takes. If you can reduce the number of dimensions, you can speed up training.

Other advantages of Dimensionality Reduction include:

  • Easier interpretation of the model
  • Mitigation of the Curse of Dimensionality
  • Reduced variance (less risk of overfitting)

There are two ways of reducing the number of dimensions: feature selection and feature extraction; see Feature Selection versus Feature Extraction. A sketch contrasting the two follows.
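A minimal sketch of the difference, using scikit-learn (the library choice, the iris dataset, and the target of 2 dimensions are illustrative assumptions, not part of the original text):

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest, f_classif

X, y = load_iris(return_X_y=True)          # 150 samples, 4 features

# Feature selection: keep 2 of the original features, unchanged.
selected = SelectKBest(score_func=f_classif, k=2).fit_transform(X, y)

# Feature extraction: build 2 new features as combinations of all 4.
extracted = PCA(n_components=2).fit_transform(X)

print(selected.shape, extracted.shape)     # (150, 2) (150, 2)
```

Selection keeps a subset of the existing columns, so the result stays directly interpretable; extraction creates new columns from combinations of all of them, which often preserves more information at the cost of interpretability.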

Two common Dimensionality Reduction techniques are PCA (Principal Component Analysis) and LDA (Linear Discriminant Analysis).
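A minimal sketch of both techniques, again with scikit-learn on the iris data (dataset and the choice of 2 components are assumptions made for the example):

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_iris(return_X_y=True)

# PCA: unsupervised, keeps the directions of maximum variance.
pca = PCA(n_components=2)
X_pca = pca.fit_transform(X)
print("PCA explained variance ratio:", pca.explained_variance_ratio_)

# LDA: supervised, keeps the directions that best separate the classes.
lda = LinearDiscriminantAnalysis(n_components=2)
X_lda = lda.fit_transform(X, y)
print("LDA output shape:", X_lda.shape)    # (150, 2)
```

PCA ignores the labels and only looks at variance, while LDA uses the labels to find directions that discriminate between classes, which is why it is often preferred as a preprocessing step for classification.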