For the past year, we’ve compared nearly 8,800 open source Machine Learning projects to pick the Top 30 (a roughly 0.3% selection rate).
This playground can give you an intuition for Neural Networks and for what happens during the learning process.
Humans don’t start their thinking from scratch every second. As you read this essay, you understand each word based on your understanding of previous words.
This is inspired by “Minimal character-level language model with a Vanilla Recurrent Neural Network, in Python/numpy” by Andrej Karpathy.
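As a rough sketch of what such a character-level model does, the snippet below runs one forward pass of a vanilla RNN over a string and predicts a distribution over the next character. The corpus, hidden size, and variable names here are illustrative placeholders, not Karpathy’s exact code.

```python
import numpy as np

# Toy corpus and vocabulary (illustrative; any text works).
data = "hello world"
chars = sorted(set(data))
vocab_size = len(chars)
char_to_ix = {ch: i for i, ch in enumerate(chars)}
ix_to_char = {i: ch for i, ch in enumerate(chars)}

hidden_size = 16  # illustrative size, not from the original gist

# Model parameters, randomly initialized.
Wxh = np.random.randn(hidden_size, vocab_size) * 0.01   # input -> hidden
Whh = np.random.randn(hidden_size, hidden_size) * 0.01  # hidden -> hidden
Why = np.random.randn(vocab_size, hidden_size) * 0.01   # hidden -> output
bh = np.zeros((hidden_size, 1))
by = np.zeros((vocab_size, 1))

h = np.zeros((hidden_size, 1))  # initial hidden state
for ch in data:
    x = np.zeros((vocab_size, 1))
    x[char_to_ix[ch]] = 1  # one-hot encode the current character
    # Vanilla RNN recurrence: new state mixes current input and previous state.
    h = np.tanh(Wxh @ x + Whh @ h + bh)

# Unnormalized scores for the next character, then softmax.
y = Why @ h + by
p = np.exp(y) / np.sum(np.exp(y))
print("predicted next char:", ix_to_char[int(np.argmax(p))])
```

Training would add backpropagation through time over these same parameters; the point of the sketch is just the recurrence, where each hidden state carries forward a summary of everything read so far.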
Dropout is a technique for addressing this problem. The key idea is to randomly drop units (along with their connections) from the neural network during training. This prevents units from co-adapting too much.
Neural networks are so flexible that it is common to end up with an overfitted model. You can prevent this by training several independent neural network models and then combining them into an ensemble, which can become quite complex, or you can use dropout, which randomly removes some units of the network during training. You introduce a little chaos (or noise) into your system.
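As a minimal sketch of the idea, here is inverted dropout in plain numpy; the keep probability of 0.8 and the function name are illustrative assumptions, not from any specific library.

```python
import numpy as np

def dropout(activations, keep_prob=0.8, training=True):
    """Inverted dropout: randomly zero units and rescale the survivors.

    keep_prob is the probability that a unit is kept (0.8 is illustrative).
    At test time the layer is a no-op, because the rescaling already
    happened during training, so expected activations match.
    """
    if not training:
        return activations
    mask = np.random.rand(*activations.shape) < keep_prob  # which units survive
    return activations * mask / keep_prob  # rescale to preserve expectation

# Example: apply dropout to a hidden layer's activations during training.
h = np.random.randn(4, 5)
h_dropped = dropout(h, keep_prob=0.8, training=True)
print(h_dropped)  # roughly 20% of entries zeroed, the rest scaled by 1/0.8
```

Because a different random mask is drawn at every training step, the network effectively trains an ensemble of thinned sub-networks that share weights, which is why dropout discourages units from co-adapting.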