Hi,

This question may not have a definitive answer, but if someone is able to illuminate the topic for me I would be very grateful.

The Mandelbrot set is the set obtained from the quadratic recurrence equation{1}

    z_{n+1} = z_n^2 + c,  with  z_0 = 0,

i.e. the set of complex values c for which the iterates z_n remain bounded. I'm sure most of you know what the graphical representation of the Mandelbrot set looks like, so I won't post a picture of it here.

Question

Have there been any attempts to derive the Mandelbrot set equation purely from its graphical representation? I would imagine that this would involve some sort of machine learning process which searches through program space, trying to find a correct program with the smallest Kolmogorov complexity{2}. What branch of mathematics works on solving this type of problem?

Thank you.

{1}: http://mathworld.wolfram.com/MandelbrotSet.html
{2}: http://en.wikipedia.org/wiki/Kolmogorov_complexity

asked May 30 '11 at 12:20 NCarlson
The problem is undecidable, but as always with theory, that doesn't mean much if you're willing to ignore it, hack away anyway, and get something that works well in practice. If you stop iterating after, say, 10 steps, you already get the correct answer for 99.something % of the entire plane, and that's enough to go on.

There is relatively little work on fitting programs to data, as far as I am aware (unfortunately). What's generally done is to enumerate random S-expressions or similar structures (trees with variables as leaves and binary mathematical operations as internal nodes) and check them all. The checking part is relatively easy in this example, with high likelihood. The problem here is that the set is defined by the convergence properties of that equation, not by the equation itself, so you need a for loop in the programs you're enumerating. Your best bet is to look through sites on genetic programming.

answered May 30 '11 at 14:19 karpathy
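The "enumerate random expression trees and check them all" idea above can be sketched roughly as follows. This is a toy illustration only (the function names, operator set, and depth limit are all my own choices): it fits a single-variable target by blind random search over small trees, not by proper genetic programming with crossover and mutation.

```python
import random

# Binary operations allowed at internal nodes of the expression tree.
OPS = {"+": lambda a, b: a + b,
       "-": lambda a, b: a - b,
       "*": lambda a, b: a * b}

def random_tree(depth, rng):
    """Build a random expression tree: leaves are the variable 'x'
    or a small integer constant; internal nodes are binary ops."""
    if depth == 0 or rng.random() < 0.3:
        return "x" if rng.random() < 0.5 else rng.randint(-2, 2)
    op = rng.choice(list(OPS))
    return (op, random_tree(depth - 1, rng), random_tree(depth - 1, rng))

def evaluate(tree, x):
    """Evaluate an expression tree at a given value of x."""
    if tree == "x":
        return x
    if isinstance(tree, int):
        return tree
    op, left, right = tree
    return OPS[op](evaluate(left, x), evaluate(right, x))

def fit(xs, ys, tries=5000, seed=0):
    """Random search: return the first tree matching all (x, y) pairs,
    or None if none is found within the budget."""
    rng = random.Random(seed)
    for _ in range(tries):
        t = random_tree(3, rng)
        if all(evaluate(t, x) == y for x, y in zip(xs, ys)):
            return t
    return None
```

As the answer notes, the catch for the Mandelbrot set is that no such static expression suffices: the enumerated programs would also need a bounded loop testing convergence.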
I think you should look at Percy Liang's recent papers, especially http://www.cs.berkeley.edu/~pliang/papers/programsicml2010.pdf
(May 30 '11 at 14:41)
Alexandre Passos ♦
I tried once before. I found it to be a bit hard to process. But it did look interesting, I wish I could be... smart enough for it :)
(May 30 '11 at 16:21)
karpathy
@karpathy it is unfortunately more opaque than it should be, I agree. The main idea, however, is great: when programming, there are all these tiny, repetitive bits of code with encoded meanings; think of guards, enumerating for loops, map, reduce, filter, etc. If we try to search for a program based on examples alone, the space is too unconstrained for the results to be meaningful. However, if we try to fit programs to examples while encouraging reuse of small pieces of solutions, then we can maybe find some structure in the data that leads us to many short programs that make sense, essentially using one problem to help write programs for other problems. Now take this idea of reuse, add Dirichlet processes, which let you automatically trade off reusing an earlier part of a model (say, a mean in a mixture model) against adding more complexity, and you have that paper. All the rest, as far as I remember, is inference details, optimizations, and tricks to make sure things work out correctly with trees.
(May 30 '11 at 16:25)
Alexandre Passos ♦
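The reuse-versus-new-complexity tradeoff that Dirichlet processes manage is easiest to see in the Chinese restaurant process view: each new item joins an existing cluster with probability proportional to that cluster's size, or opens a new cluster with probability proportional to a concentration parameter. A minimal sketch (the function name and interface are my own, not from the paper):

```python
import random

def crp_assignments(n, alpha, seed=0):
    """Sample cluster labels for n items from a Chinese restaurant
    process. Item i reuses cluster k with probability c_k / (i + alpha)
    and opens a new cluster with probability alpha / (i + alpha)."""
    rng = random.Random(seed)
    counts = []   # counts[k] = number of items already in cluster k
    labels = []
    for i in range(n):
        r = rng.random() * (i + alpha)
        acc = 0.0
        for k, ck in enumerate(counts):
            acc += ck
            if r < acc:          # reuse an existing cluster
                counts[k] += 1
                labels.append(k)
                break
        else:                    # open a new cluster
            counts.append(1)
            labels.append(len(counts) - 1)
    return labels
```

Larger alpha favors opening new clusters; in the program-fitting setting, a cluster would correspond to a reusable program fragment rather than a mixture component.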
Thank you for all the great comments. You might also want to take a look at Marcus Hutter and Ray Solomonoff's work.
(May 30 '11 at 17:30)
NCarlson

I think you're missing an important feature of the Mandelbrot set that makes this process seem easier than it actually is. The Mandelbrot set is not the set of points that satisfy that equation; rather, it is the set of points for which iterating that equation does not lead to infinity. Also, the Mandelbrot set is infinitely fine, and one can prove that, on a real-valued Turing machine, deciding whether or not a point is in the Mandelbrot set is undecidable. So it is more complicated than just fitting the parameters of this equation or something similar, which would usually be done with parametric machine learning.

answered May 30 '11 at 12:46 Alexandre Passos ♦
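The convergence-based definition described above is what the standard escape-time test approximates: iterate z ← z² + c from z = 0 and declare c outside the set as soon as |z| exceeds 2 (once that happens, the orbit is guaranteed to diverge). A minimal sketch (the function name and iteration cap are my own choices; any finite cap only approximates true membership, which is exactly why the exact problem is undecidable):

```python
def in_mandelbrot(c, max_iter=100):
    """Escape-time approximation of Mandelbrot membership:
    True if the orbit of z_{n+1} = z_n^2 + c stays within |z| <= 2
    for max_iter iterations, False once it escapes."""
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return False
    return True
```

For example, c = 0 and c = -1 stay bounded (the latter cycles 0, -1, 0, -1, ...), while c = 1 escapes after a few iterations (0, 1, 2, 5, ...).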
This gets out of the "practical AI" arena and into "general AI." Marcus Hutter's AIXI is the canonical example of an ML process which searches through program space trying to find a correct program with the smallest Kolmogorov complexity. Unfortunately, AIXI is not computable. Fortunately, the AIXItl variant is computable. Unfortunately, it takes far longer to get good results than ML algorithms which start with more assumptions. The probabilistic version is even more practical, but still no match for a domain-specific algorithm.

answered Jun 02 '11 at 08:58 Steve

Thanks for that last link. AIXI is incredibly exciting. I'm surprised that it's fairly unknown outside of AGI circles.
(Jun 02 '11 at 12:13)
NCarlson
