asked Feb 18 '13 at 12:11 by bizso09, edited Feb 18 '13 at 12:13
The first term of the gradient descent update looks wrong; it should be X^T(XW - Y). I haven't checked the rest.
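For context, a minimal sketch of where that term comes from, assuming the standard least-squares cost from the course notes (the 1/(2m) scaling may differ in your write-up):

    J(W) = \frac{1}{2m}\,(XW - Y)^\top (XW - Y)
    \qquad\Longrightarrow\qquad
    \nabla_W J(W) = \frac{1}{m}\, X^\top (XW - Y)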
You can use Mathematica or Wolfram Alpha to check your partial derivatives. It's probably easier than asking here.
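If you'd rather not switch to a CAS, a finite-difference check catches the same mistakes. Here is a minimal sketch in Python/NumPy, assuming the same least-squares cost as above (all function and variable names here are illustrative, not from the original post):

    # Compare the analytic gradient X^T(XW - Y)/m against a numerical estimate.
    import numpy as np

    def cost(X, Y, W):
        m = X.shape[0]
        r = X @ W - Y
        return (r @ r) / (2 * m)

    def analytic_grad(X, Y, W):
        m = X.shape[0]
        return X.T @ (X @ W - Y) / m   # the X^T(XW - Y) term discussed above

    def numeric_grad(X, Y, W, eps=1e-6):
        g = np.zeros_like(W)
        for j in range(W.size):
            e = np.zeros_like(W)
            e[j] = eps
            # central finite difference in coordinate j
            g[j] = (cost(X, Y, W + e) - cost(X, Y, W - e)) / (2 * eps)
        return g

    rng = np.random.default_rng(0)
    X = rng.normal(size=(20, 3))
    Y = rng.normal(size=20)
    W = rng.normal(size=3)
    # Should print something on the order of 1e-9 if the analytic gradient is right.
    print(np.max(np.abs(analytic_grad(X, Y, W) - numeric_grad(X, Y, W))))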
Tags: cost, derivation