Hi, I'm new to ML (and English is not my first language, so sorry for any mistakes). I know and understand (more or less) Restricted Boltzmann Machines, and I would like to stack some layers to build a deep belief network. But my training samples are continuous values rather than binary, and (if I'm not totally lost) the original RBM formulation is for binary/discrete sample values.
I've seen some RBM algorithms that work with continuous values, but they're quite old (http://pdf.aminer.org/000/271/523/a_continuous_restricted_boltzmann_machine_with_a_hardware_amenable_learning.pdf). Is there any "standardized" and "modern" (and simple if possible, but that's probably asking too much) way to learn layer by layer with an unsupervised algorithm from continuous data?
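For context, here's my rough understanding of what a continuous-valued RBM would look like: the visible units become Gaussian (with the data standardized to unit variance) while the hidden units stay binary, and training still uses CD-1. A minimal NumPy sketch of that idea (all sizes and the learning rate are just placeholders I made up), so please correct me if this is wrong:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy continuous data, standardized to roughly zero mean and unit
# variance (the usual assumption for Gaussian-visible RBMs).
X = rng.normal(size=(100, 6))

n_visible, n_hidden = 6, 4          # placeholder sizes
W = 0.01 * rng.normal(size=(n_visible, n_hidden))
b_v = np.zeros(n_visible)           # visible (Gaussian) biases
b_h = np.zeros(n_hidden)            # hidden (binary) biases
lr = 0.01                           # placeholder learning rate

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

for epoch in range(50):
    # Positive phase: hidden probabilities given real-valued visibles.
    ph = sigmoid(X @ W + b_h)
    h = (rng.random(ph.shape) < ph).astype(float)

    # Negative phase (CD-1): with unit-variance Gaussian visibles,
    # the reconstruction mean is just the linear activation.
    v_recon = h @ W.T + b_v
    ph_recon = sigmoid(v_recon @ W + b_h)

    # Contrastive-divergence gradient step.
    W += lr * (X.T @ ph - v_recon.T @ ph_recon) / len(X)
    b_v += lr * (X - v_recon).mean(axis=0)
    b_h += lr * (ph - ph_recon).mean(axis=0)
```

Is this Gaussian-Bernoulli setup the "standard" modern approach, or is there something better for stacking into a DBN?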
Thank you.