unsupervised learning - Restricted Boltzmann Machine for real-valued data - Gaussian linear units
I want a Restricted Boltzmann Machine to learn a new representation of real-valued data (see: Hinton, 2010, "A Practical Guide to Training RBMs"). I'm struggling with the implementation of Gaussian linear units.
With Gaussian linear units in the visible layer, the energy changes to

E(v,h) = ∑_i (v_i − a_i)²/(2σ_i²) − ∑_j b_j h_j − ∑_{i,j} (v_i/σ_i) h_j w_ij.
Now I don't know how to change the contrastive divergence learning algorithm. The visible units are no longer sampled as binary states; they are linear. I use the expectation (mean-field activation) v_i = a_i + ∑_j h_j w_ij + N(0,1) as the state.
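For concreteness, the visible reconstruction step described above can be sketched as follows. This is a minimal NumPy sketch under the unit-variance assumption (σ = 1); the function and variable names are my own, not from Hinton's code:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_visible_gaussian(h, W, a, sigma=1.0):
    """Sample Gaussian visible units given binary hidden states.

    h: (batch, n_hid) hidden states
    W: (n_vis, n_hid) weights
    a: (n_vis,) visible biases

    The conditional is v_i ~ N(a_i + sigma * sum_j h_j w_ij, sigma^2).
    For CD, the mean itself is often used instead of a noisy sample.
    """
    mean = a + sigma * (h @ W.T)
    return mean + sigma * rng.standard_normal(mean.shape)
```

Setting `sigma=0.0` recovers the pure mean-field value, which is a common choice for the reconstruction inside contrastive divergence.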
The associations are left unchanged (pos: data * p(h=1|v)'; neg: p(v|h) * p(h=1|v)'). This leads to random noise when I try to reconstruct the data, and the error rate stops improving at around 50%.
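To make the update described above explicit, here is what one CD-1 step for a Gaussian-visible/binary-hidden RBM with σ = 1 might look like. This is only a sketch of the standard recipe; all names are illustrative, and I use the reconstruction mean (no sampling noise on the visibles), which is what typically avoids the pure-noise reconstructions mentioned above:

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v0, W, a, b, lr=0.01):
    """One CD-1 update for a Gaussian-Bernoulli RBM with sigma = 1.

    v0: (batch, n_vis) real-valued data
    W:  (n_vis, n_hid) weights; a: visible biases; b: hidden biases
    """
    # positive phase: binary hidden probabilities given the data
    ph0 = sigmoid(v0 @ W + b)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # reconstruction: Gaussian visibles, use the mean (no noise added)
    v1 = a + h0 @ W.T
    ph1 = sigmoid(v1 @ W + b)
    # associations: <v h'> on data minus <v h'> on reconstruction
    W += lr * (v0.T @ ph0 - v1.T @ ph1) / v0.shape[0]
    a += lr * (v0 - v1).mean(axis=0)
    b += lr * (ph0 - ph1).mean(axis=0)
    return W, a, b
```

Note that with real-valued data the inputs should be standardized (zero mean, unit variance per component) for the σ = 1 assumption to be reasonable, and the learning rate usually needs to be smaller than for binary-binary RBMs.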
Finally, I want to use Gaussian linear units in both layers. How do I get the states of the hidden units then? I would suggest using the mean-field activation h_j = b_j + ∑_i v_i w_ij + N(0,1), but I'm not sure.
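The hidden-side analogue of that suggestion can be sketched the same way as the visible case. This is only a sketch under a unit-variance assumption; whether such a Gaussian-Gaussian RBM trains stably is exactly what is in question here (in practice the variances need constraining, since a linear-linear model can otherwise diverge):

```python
import numpy as np

rng = np.random.default_rng(2)

def sample_hidden_gaussian(v, W, b, sigma=1.0):
    """Sample Gaussian hidden units given real-valued visible states.

    The conditional mirrors the visible case:
    h_j ~ N(b_j + sigma * sum_i v_i w_ij, sigma^2).
    """
    mean = b + sigma * (v @ W)
    return mean + sigma * rng.standard_normal(mean.shape)
```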
Take a look at the Gaussian RBM code that Hinton himself has provided. You can find it here: http://www.cs.toronto.edu/~hinton/code/rbmhidlinear.m