Does anyone know of good resources on learning general (loopy) CRF parameters involving higher-order cliques? There is a CVPR 2011 paper using dual decomposition, and decision tree fields also allow for higher-order factors. Are there any other, more standard methods to do this? Thanks!
The hard part of learning higher-order CRFs is inference, which is required by most training algorithms (likelihood training requires marginal inference, while structural SVM or perceptron training requires MAP inference). Dual decomposition, when applied to these problems, is used as a subroutine for performing faster MAP inference in some model classes. Unfortunately, the inference issue is genuinely hard, and learning with approximate inference is known to be troublesome. Regardless, people have been successful training these models with loopy BP and/or variational inference. Another popular strategy is to decompose the model into simpler models with lower-order structure, train those separately, and then do joint inference at test time with something like dual decomposition. There are also pseudolikelihood estimators and similar techniques, which might be very helpful since they sidestep global inference during training; a sketch of the perceptron case is below.
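To make the MAP-inference dependence concrete, here is a minimal sketch (my own illustration, not from any of the papers above) of structured-perceptron training on a chain CRF, where exact MAP is available via Viterbi. The names `map_decode` and `perceptron_update` are illustrative; for loopy or higher-order models you would replace the Viterbi call with an approximate MAP solver such as dual decomposition, loopy BP, or graph cuts, and the surrounding update would stay the same.

```python
import numpy as np

def map_decode(x, W_unary, W_pair):
    """Exact MAP (Viterbi) decoding for a chain CRF.

    x: (T, D) observation features, one row per position.
    W_unary: (K, D) label-by-feature weights.
    W_pair: (K, K) label-transition weights.
    Returns the highest-scoring label sequence (length-T int array).
    """
    T = x.shape[0]
    K = W_unary.shape[0]
    unary = x @ W_unary.T                        # (T, K) unary scores
    delta = np.zeros((T, K))                     # best score ending in each label
    back = np.zeros((T, K), dtype=int)           # backpointers
    delta[0] = unary[0]
    for t in range(1, T):
        # scores[i, j] = best path ending in i at t-1, then transition i->j, plus unary j
        scores = delta[t - 1][:, None] + W_pair + unary[t][None, :]
        back[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0)
    y = np.zeros(T, dtype=int)
    y[-1] = delta[-1].argmax()
    for t in range(T - 2, -1, -1):
        y[t] = back[t + 1, y[t + 1]]
    return y

def perceptron_update(x, y_true, W_unary, W_pair, lr=1.0):
    """One structured-perceptron step: decode with the current weights,
    then move the weights toward the gold features and away from the
    predicted features wherever the two sequences disagree."""
    y_hat = map_decode(x, W_unary, W_pair)
    for t in range(len(y_true)):
        if y_hat[t] != y_true[t]:
            W_unary[y_true[t]] += lr * x[t]
            W_unary[y_hat[t]] -= lr * x[t]
        if t > 0:
            # transition updates cancel automatically when gold == predicted
            W_pair[y_true[t - 1], y_true[t]] += lr
            W_pair[y_hat[t - 1], y_hat[t]] -= lr
    return W_unary, W_pair

# Toy usage: 5 positions, 3 features, 4 labels, random data.
rng = np.random.default_rng(0)
x = rng.normal(size=(5, 3))
y = rng.integers(0, 4, size=5)
W_u, W_p = np.zeros((4, 3)), np.zeros((4, 4))
for _ in range(10):
    W_u, W_p = perceptron_update(x, y, W_u, W_p)
```

The point of the sketch is just that the learner only touches the model through the decoder, which is why the quality of (approximate) MAP or marginal inference ends up dominating how well these training procedures behave.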
Do you have examples of papers where these strategies have been used? By the way, according to Finley, learning with approximate algorithms isn't so bad, and many people use it for pairwise CRFs.
(May 26 '12 at 06:19)
Andreas Mueller