Collection of generative models in Tensorflow

Open Data for Deep Learning

Here you’ll find an organized list of interesting, high-quality datasets for machine learning research. We welcome your contributions for curating this list! You can find other lists of such datasets on Wikipedia, for example.

Recent Additions

- Open Source Biometric Recognition Data
- Google Audioset: An expanding ontology of 632 audio event classes and a collection of 2,084,320 human-labeled 10-second sound clips drawn from YouTube videos.
- Uber 2B trip data: Slow rollout of access to ride data for 2Bn trips.

Natural-Image Datasets

- MNIST: handwritten digits: The most commonly used sanity check. Dataset of 28x28, centered, B&W handwritten digits. It is an easy task; just because something works on MNIST doesn’t mean it works in general.
- CIFAR10 / CIFAR100: 32x32 color images with 10 / 100 categories. Not commonly used anymore, though once again, they can be an interesting sanity check.
- Caltech 101: Pictures of objects belonging to 101 categor…

Jaccard Similarity vs Cosine Similarity

For binary vectors, Jaccard similarity is given by $s_{ij} = \frac{p}{p + q + r}$, where $p$ is the number of attributes present in both objects, and $q$ and $r$ are the numbers of attributes present in only one of the two.
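To make the contrast concrete, here is a minimal stdlib-only sketch of both measures for binary vectors; the function names and the example vectors are illustrative, not from any particular library:

```python
import math

def jaccard_similarity(a, b):
    """Jaccard similarity s_ij = p / (p + q + r) for binary vectors:
    p = attributes set in both, q = set only in a, r = set only in b."""
    p = sum(1 for x, y in zip(a, b) if x and y)
    q = sum(1 for x, y in zip(a, b) if x and not y)
    r = sum(1 for x, y in zip(a, b) if not x and y)
    return p / (p + q + r)

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: dot(a, b) / (|a| * |b|)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

a = [1, 1, 0, 1, 0]
b = [1, 0, 0, 1, 1]
print(jaccard_similarity(a, b))  # p=2, q=1, r=1 -> 0.5
print(cosine_similarity(a, b))   # 2 / (sqrt(3) * sqrt(3)) -> 0.666...
```

Note that Jaccard ignores joint absences (0/0 positions) entirely, while cosine treats the vectors geometrically; on sparse binary data the two can rank pairs quite differently.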

The Black Magic of Deep Learning - Tips and Tricks for the practitioner


I've been using Deep Learning and Deep Belief Networks since 2013.
I was involved in a greenfield project, in charge of choosing the core Machine Learning algorithms for a computer vision platform.

Nothing worked well enough, and when something did, it wouldn't generalize: it required constant fiddling and failed to converge when introduced to similar datasets. I was lost. Then I caught wind from academia that the new hype, Deep Learning, had arrived and would solve everything.

I was skeptical, so I read the papers, the books, and the notes. I then went and put everything I learned to work. Surprisingly, it was no hype: Deep Learning works, and it works well. However, it is such a new concept (even though the foundations were laid in the 70's) that a lot of anecdotal tips and tricks started coming out on how to make the most of it (Alex Krizhevsky covered a lot of them, and in so…