
Daily Paper Review, Friday April 11: Le Cam Meets LeCun, Deficiency and Generic Feature Learning



Le Cam meets LeCun: Deficiency and Generic Feature Learning

Link: http://arxiv.org/pdf/1402.4884.pdf

Abstract:

“Deep Learning” methods attempt to learn generic features in an unsupervised fashion from a large unlabelled data set. These generic features should perform as well as the best hand crafted features for any learning problem that makes use of this data. We provide a definition of generic features, characterize when it is possible to learn them and provide algorithms closely related to the deep belief network and autoencoders of deep learning. In order to do so we use the notion of deficiency distance and illustrate its value in studying certain general learning problems.
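For readers who want something concrete: below is a minimal sketch of the kind of unsupervised feature learner the abstract is talking about, a one-hidden-layer autoencoder trained on unlabelled data whose hidden activations are then reused as generic features for downstream supervised tasks. This is a generic illustration in numpy, not the paper's construction; the synthetic data, layer sizes, and training details are all my own choices.

    import numpy as np

    rng = np.random.default_rng(0)

    # Stand-in for a large unlabelled data set: 500 samples, 20 dimensions.
    X = rng.normal(size=(500, 20))

    n_in, n_hidden = X.shape[1], 8
    W1 = rng.normal(scale=0.1, size=(n_in, n_hidden))   # encoder weights
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(scale=0.1, size=(n_hidden, n_in))   # decoder weights
    b2 = np.zeros(n_in)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    lr = 0.01
    for epoch in range(200):
        H = sigmoid(X @ W1 + b1)      # encode: hidden "generic" features
        X_hat = H @ W2 + b2           # decode: linear reconstruction
        err = X_hat - X               # reconstruction error
        # Backpropagate the mean squared reconstruction loss.
        dW2 = H.T @ err / len(X)
        db2 = err.mean(axis=0)
        dH = err @ W2.T * H * (1 - H) # chain rule through the sigmoid
        dW1 = X.T @ dH / len(X)
        db1 = dH.mean(axis=0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2

    # The trained encoder is reused as a feature map for any later task.
    features = sigmoid(X @ W1 + b1)

The point of the "generic features" claim is that the encoder is trained once, without labels, and the same feature map is then plugged into any later learning problem that uses this data.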

Review:

When I read through and saw 27 pages, I thought: holy, that's a lot! The first 8 pages are a reiteration and formalization of material most of you already know (supervised learning, active learning, etc.), explained through the quintuple they define for a learning problem. The last 15 pages are proofs of theorems. That being said, the most interesting part is something I can't fully follow, because it leans on statistics (deficiency and such) described in the referenced papers. The most interesting results are on pages 6 and 7 (Sections 3.2 and 4, respectively). The paper also formally characterizes when generic feature learning can be used and when supervised learning is better.

Questions:

Could any statistician explain what "factoring through" and "deficiency" are? My hypothesis is that deficiency is the lack of data required to close the gap between the learned hypothesis and the optimal function, i.e. the learned hypothesis is "data deficient".
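For reference (this is my reading of the statistics literature, not something taken from the paper, so a statistician should correct me): in Le Cam's theory, an experiment is a family E = (P_θ)_{θ∈Θ} of distributions on a sample space, and the deficiency of E relative to another experiment F = (Q_θ)_{θ∈Θ} is

    \[
      \delta(E, F) \;=\; \inf_{K} \, \sup_{\theta \in \Theta}
        \bigl\| K P_\theta - Q_\theta \bigr\|_{\mathrm{TV}},
    \]

where the infimum runs over Markov kernels (randomized transformations) K and the norm is total variation distance. F "factors through" E when δ(E, F) = 0, i.e. every observation from F can be simulated by post-processing an observation from E. On that reading, deficiency measures information lost by transforming the data rather than a shortage of data, so my guess above may be off.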

submitted by MLPaperReviews
