I'm planning to implement a system that uses an evolutionary algorithm on a large image dataset (I'm not sure which dataset yet; it should have a lot of variety and a large number of images, not all necessarily labeled). I'm most familiar with deep-learning systems like Convolutional Neural Nets and Deep Belief Nets, which learn a number of local image filters, take the dot product of these with patches of the image, possibly apply further operations like subsampling across multiple layers, and produce a result from that. This approach has the advantage of being simple and adaptable to a wide variety of machine learning techniques (beyond deep learning, even), but it also seems very crude. What alternatives are there to these sorts of filters, and how can they be parametrized/learned in a way that might fit into the evolutionary context I'm interested in?

EDIT: People have given some good answers, but I'm looking for broader departures from filters. For instance, is there a learnable/evolvable parameterization of SIFT, SURF, or other "feature-finding" variants? Techniques from computer vision I haven't even mentioned?
I think the cheapest and most effective alternative for you is to use deep-learning-style features without actually learning them. According to Rahimi and Recht's "Weighted Sums of Random Kitchen Sinks" paper, random features work nearly as well as many popular learned features, and they are much faster to compute for obvious reasons.
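To make the idea concrete, here is a minimal sketch of random convolutional features in the spirit of that approach. All details (filter count, size, the ReLU + max-pool readout) are my own illustrative choices, not anything prescribed by the paper; the point is just that the filters are drawn at random and never trained, and only a simple classifier on top would be learned.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_filters(n_filters, size):
    """Draw random filter patches, zero-mean and unit-norm (never trained)."""
    f = rng.standard_normal((n_filters, size, size))
    f -= f.mean(axis=(1, 2), keepdims=True)
    f /= np.linalg.norm(f, axis=(1, 2), keepdims=True) + 1e-8
    return f

def conv2d_valid(img, filt):
    """Naive 'valid' 2-D correlation, written out for clarity."""
    h, w = img.shape
    k = filt.shape[0]
    out = np.empty((h - k + 1, w - k + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + k, j:j + k] * filt)
    return out

def features(img, filters):
    """ReLU then global max-pool per filter: one feature vector per image."""
    return np.array([np.maximum(conv2d_valid(img, f), 0).max()
                     for f in filters])

filters = random_filters(n_filters=16, size=5)
img = rng.standard_normal((28, 28))   # stand-in for one dataset image
x = features(img, filters)            # 16-dimensional feature vector
```

Since nothing here is trained, an evolutionary algorithm could also act on the filter bank directly (mutating or recombining the random filters) rather than on network weights.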
Sum-Product Networks seem like they'd mesh really well with an evolutionary process ("Sum-Product Networks: A New Deep Architecture", Hoifung Poon and Pedro Domingos). You construct networks that satisfy certain validity rules (sum nodes combine children over identical variable scopes; product nodes combine children over disjoint scopes), then run learning algorithms on your data. Genetic programming could create these networks. You'd have to do some extra work to ensure that the generated networks meet the requirements of SPNs, but once that's in place it seems straightforward to get the evolutionary process learning higher-quality deep features.
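As a toy illustration of the validity rules an evolutionary operator would have to preserve, here is a hand-built (not learned) SPN over two binary variables. The assertions encode completeness (sum children share a scope) and decomposability (product children have disjoint scopes); any mutation or crossover that keeps these assertions passing yields a network that is still a proper distribution.

```python
class Leaf:
    """Bernoulli leaf over a single binary variable."""
    def __init__(self, var, p_true):
        self.var, self.p_true = var, p_true
        self.scope = {var}
    def value(self, x):
        return self.p_true if x[self.var] else 1.0 - self.p_true

class Sum:
    """Weighted mixture; children must share one scope (completeness)."""
    def __init__(self, children, weights):
        assert all(c.scope == children[0].scope for c in children)
        assert abs(sum(weights) - 1.0) < 1e-9
        self.children, self.weights = children, weights
        self.scope = children[0].scope
    def value(self, x):
        return sum(w * c.value(x) for w, c in zip(self.weights, self.children))

class Product:
    """Factorization; children need disjoint scopes (decomposability)."""
    def __init__(self, children):
        scopes = [c.scope for c in children]
        for i in range(len(scopes)):
            for j in range(i + 1, len(scopes)):
                assert scopes[i].isdisjoint(scopes[j])
        self.children = children
        self.scope = set().union(*scopes)
    def value(self, x):
        out = 1.0
        for c in self.children:
            out *= c.value(x)
        return out

# A mixture of two product distributions over binary variables 0 and 1.
spn = Sum(
    [Product([Leaf(0, 0.9), Leaf(1, 0.9)]),
     Product([Leaf(0, 0.1), Leaf(1, 0.1)])],
    [0.5, 0.5],
)

# Because the structure is valid, the values over all four
# assignments sum to 1, i.e. the network is a proper distribution.
total = sum(spn.value({0: a, 1: b}) for a in (0, 1) for b in (0, 1))
```

A genetic-programming fitness function could then just be the data likelihood under `spn.value`, with structure edits rejected whenever the validity assertions fail.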