12-13-06 - 1

Some random updates from the ML field :

1. The FANN Neural Net library is actually pretty decent. It's missing a lot of functions you would want, but it's reasonably easy to write them yourself.

2. SVDLIBC is a nice little implementation of the old "Lanczos" SVD algorithm, which is fast and good. It comes out of the old FORTRAN math goodness. One amazing little routine I found in there is "Machar", which will deduce the properties of your FPU (radix, precision, epsilon) just by doing a few simple math operations!
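The Machar trick is fun enough to sketch: you do arithmetic whose rounding behavior differs depending on the FPU's radix and precision, and read the answers back off the results. Here's a toy version in Python (the real Machar routine recovers many more parameters; this just gets the radix and machine epsilon):

```python
# Toy Machar-style probe: deduce floating point properties purely
# from arithmetic results. Simplified sketch, not the real routine.

def probe_float():
    # Grow a until adding 1.0 to it is lost to rounding, i.e.
    # (a + 1) - a no longer gives back 1. At that point a sits right
    # at the edge of the significand's precision.
    a = 1.0
    while ((a + 1.0) - a) == 1.0:
        a *= 2.0
    # Now grow b until a + b actually changes a; the difference
    # (a + b) - a pops out as the radix (base) of the representation.
    b = 1.0
    while ((a + b) - a) == 0.0:
        b *= 2.0
    radix = int((a + b) - a)
    # Machine epsilon: smallest eps such that 1 + eps != 1.
    eps = 1.0
    while (1.0 + eps / 2.0) != 1.0:
        eps /= 2.0
    return radix, eps
```

On any IEEE double FPU this reports radix 2 and eps 2^-52.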

3. If you search Google for things like "SVD learning unknown missing" you will find about 100 papers; this is a major field of research, a lot of it driven by computer vision.

4. Simon's spilled the beans on his Netflix approach. His description of the GHA (Generalized Hebbian Algorithm) is way, way clearer than the paper. Hebbian learning is actually a type of neural network update rule, and I find this approach much easier to understand if you just think of it as doing gradient descent to optimize the SVD vectors. I also found that Gorrell has released source code for her own GHA Lab.
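To make the gradient-descent view concrete, here's a rough sketch in that spirit: treat each known rating as a training sample, predict it as the dot product of a user factor vector and an item factor vector, and nudge both vectors down the squared-error gradient. Missing entries are simply never visited, which is the whole appeal for Netflix-style data. (The function and parameter names here are mine, not Simon's; his actual code trains one feature at a time and adds clamping and baseline tweaks.)

```python
import random

def sgd_svd(ratings, n_users, n_items, k=2, lr=0.05, reg=0.02,
            epochs=1000, seed=0):
    # ratings: list of (user, item, value) triples; missing entries
    # are just absent, so they contribute nothing to the gradient.
    rng = random.Random(seed)
    U = [[rng.uniform(0.05, 0.15) for _ in range(k)] for _ in range(n_users)]
    V = [[rng.uniform(0.05, 0.15) for _ in range(k)] for _ in range(n_items)]
    for _ in range(epochs):
        for u, i, r in ratings:
            pred = sum(U[u][f] * V[i][f] for f in range(k))
            err = r - pred
            for f in range(k):
                uf, vf = U[u][f], V[i][f]
                # Squared-error gradient step, with a little
                # regularization to keep the factors small.
                U[u][f] += lr * (err * vf - reg * uf)
                V[i][f] += lr * (err * uf - reg * vf)
    return U, V
```

Run it on a tiny rank-1 ratings matrix and the factors reconstruct the known entries; the rows of U and V play the role of the (unnormalized) singular vectors.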

