Sparsity
The so-called curse of dimensionality in machine learning is the observation that neural networks with many parameters can be impossibly difficult to train due to the vastness of their parameter space. Another issue that arises in practice is that much of a trained network does not do anything useful: a large fraction of its weights turn out to be redundant. This is because many (if not all) of the problems we’re interested in solving as engineers have some inherent sparsity. Steve Brunton has an excellent video explaining why this is so. ...
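To make the redundancy concrete, here is a minimal sketch of magnitude pruning, a simple way to expose this sparsity: zero out the weights with the smallest absolute value and keep only the rest. The layer shape, seed, and 90% sparsity level are made-up illustration values, not anything prescribed above; in practice, pruning a large share of the smallest weights is often reported to change accuracy only slightly.

```python
import numpy as np

# Hypothetical dense-layer weights, standing in for a trained network's weights.
rng = np.random.default_rng(0)
W = rng.normal(size=(256, 256))

# Magnitude pruning: drop the 90% of weights with the smallest |w|.
sparsity = 0.90
threshold = np.quantile(np.abs(W), sparsity)
mask = np.abs(W) >= threshold   # True where a weight survives
W_pruned = W * mask             # redundant weights set exactly to zero

print(f"fraction of weights kept: {mask.mean():.2%}")
```

A sparse weight matrix like `W_pruned` can then be stored and multiplied in a compressed format, which is where the practical savings come from.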