Congratulations Cole! Even though we intended to fund projects in Q1 2016, your proposal was so awesome that we just had to get you started right away.
Our first coworking session at Floyd's on Sunday was a big success: we came up with a better regularization approach that cut his error on the Kaggle MNIST problem by more than 30% (accuracy jumped from 92% to 95% with just a partial implementation of the new approach). Basically, he realized that 25% random dropout in combination with L2 weight regularization was driving all his weights (and performance) to zero. First he turned off random dropout (temporarily, until he gets regularization dialed in). Then he switched from the L2 norm to L1 (as a first step toward p-norm). I can't wait to see what p-norm does!
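To make the L1/L2/p-norm progression concrete, here's a minimal sketch of a generalized p-norm penalty (the function name and default lambda are my own illustration, not his actual code): p=2 recovers L2 weight decay, p=1 gives the sparsity-friendly L1 he just switched to, and fractional p < 1 is the next step he's eyeing.

```python
import numpy as np

def p_norm_penalty(weights, p=1.0, lam=1e-4):
    """Generalized p-norm regularization term: lam * sum(|w|^p).

    p=2 -> L2 weight decay (shrinks large weights hardest)
    p=1 -> L1 (penalizes all weights equally, encourages sparsity)
    p<1 -> pushes even more aggressively toward sparse weights
    """
    return lam * np.sum(np.abs(weights) ** p)

w = np.array([0.5, -0.25, 0.0, 1.0])
l2_term = p_norm_penalty(w, p=2)  # quadratic penalty
l1_term = p_norm_penalty(w, p=1)  # absolute-value penalty
```

The intuition behind his fix: the L2 term's gradient is proportional to the weight itself, so combined with heavy dropout it can keep shrinking everything toward zero, while L1's constant-magnitude gradient leaves surviving weights alone once they're small.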
And this week he's looking into d3 visualizations of the weights:
- heatmap matrices
- force-directed graphs
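For the d3 heatmap, the weights just need to get out of numpy and into a shape d3 can bind to. A minimal sketch (file name and record layout are assumptions, not his pipeline) that flattens a weight matrix into `{row, col, value}` records, the form most d3 heatmap examples expect:

```python
import json
import numpy as np

def weights_to_d3_json(weights, path="weights.json"):
    """Flatten a 2-D weight matrix into a list of
    {"row": i, "col": j, "value": w} records and dump to JSON,
    ready for d3's data join."""
    records = [
        {"row": i, "col": j, "value": float(weights[i, j])}
        for i in range(weights.shape[0])
        for j in range(weights.shape[1])
    ]
    with open(path, "w") as f:
        json.dump(records, f)
    return records

# e.g. weights_to_d3_json(model_layer_weights) then d3.json("weights.json", ...)
```

The same records work for the force-directed view too: rows and columns become nodes, and each nonzero value becomes a weighted link.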
I'm excited to see what you come up with next.
Thank you Thunder for helping him get started on this awesome project.