I've successfully implemented an RBM and trained it with CD-k, but I now have some issues doing the same with a Convolutional RBM (CRBM).

The gradients for the visible and hidden biases are easy to compute: v1 - v2 for the visible bias and h1_k - h2_k for the hidden bias of each group k. But I fail to see how to compute the gradients for the weights. In the CRBM the weights are NH x NH matrices. In the flat RBM it was easy, since the weight matrix had dimensions NH x NV, so the outer products h1 * v1 - h2 * v2 gave the gradient directly, but in a CRBM the dimensions do not match.
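
For reference, this is roughly what my flat-RBM updates look like (a NumPy sketch with made-up sizes, just to show the shapes I mean):

    import numpy as np

    NV, NH = 6, 4                                      # hypothetical layer sizes
    v1, v2 = np.random.rand(NV), np.random.rand(NV)    # visible states: data / after k Gibbs steps
    h1, h2 = np.random.rand(NH), np.random.rand(NH)    # hidden activations: data / reconstruction

    vbias_grad = v1 - v2                               # NV vector
    hbias_grad = h1 - h2                               # NH vector
    W_grad = np.outer(h1, v1) - np.outer(h2, v2)       # NH x NV, same shape as W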

What is the rule to compute these gradients? Or is there a different way to update the weights? It is probably a convolution, but I fail to see which one...

asked Jul 04 '14 at 10:32

Baptiste Wicht


One Answer:

The gradients can be computed from this formula:

W_pos = V_1 *v ~H_1  
W_neg = V_2 *v ~H_2
W_grad = W_pos - W_neg

Here *v denotes a "valid" convolution, and ~H denotes H flipped horizontally and vertically.
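
In NumPy/SciPy terms this could look roughly like the following (a minimal sketch for a single filter and a single CD step; the sizes and variable names are just illustrative):

    import numpy as np
    from scipy.signal import convolve2d

    # v1, v2: visible maps before / after the Gibbs step (NV x NV)
    # h1, h2: hidden activations of one group for v1 / v2 (NH x NH)
    NV, NH = 8, 5
    v1, v2 = np.random.rand(NV, NV), np.random.rand(NV, NV)
    h1, h2 = np.random.rand(NH, NH), np.random.rand(NH, NH)

    # "Valid" convolution with H flipped in both directions (equivalently,
    # a cross-correlation of V with H). The result has the shape of the
    # filter: (NV - NH + 1) x (NV - NH + 1).
    w_pos = convolve2d(v1, h1[::-1, ::-1], mode='valid')
    w_neg = convolve2d(v2, h2[::-1, ::-1], mode='valid')
    w_grad = w_pos - w_neg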

answered Jul 15 '14 at 02:06

Baptiste Wicht
