Mean Shift and Kernel Density Estimation Update
Yesterday I pushed a rather large update to the mean shift module in my GitHub code repository. Over 8K lines changed when I applied the patch! (Development was in a different repository, so a patch was the easiest way to move it over.)
Technically, I had already shared all of the mean shift module's code relevant to my paper "My Text in Your Handwriting" some time ago, but the module had continued to be developed for another project, which then moved on and stopped using it. This is simply a good opportunity to share code that would otherwise remain unused.
Originally, the mean shift module just did mean shift, albeit for arbitrary data (I needed the traditional 2D segmentation scenario, but also 3D mode finding in the RGB colour cube). But then I needed kernel density estimates, and since mean shift is just a technique for finding the modes of a kernel density estimate, it made sense to support that - the exact same interface works. It also gained the subspace constrained variant of mean shift (Gaussian kernel only) as an experiment that went nowhere: it was too computationally intensive to be justified. Plus I had gone to town with a proper set of spatial indexing data structures and a fully featured interface. That was the state the code was in before yesterday's update.
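To make the relationship concrete, here is a minimal sketch of mean shift as hill climbing on a kernel density estimate, written with numpy and a Gaussian kernel. It illustrates the idea only - the function names are made up for this post, not the module's actual interface:

```python
import numpy as np

def mean_shift(data, start, bandwidth, iters=64, tol=1e-6):
    """Climb the KDE defined by data (an n x d array of points) with a
    Gaussian kernel of the given bandwidth, starting from start (d,)."""
    x = np.asarray(start, dtype=float)
    for _ in range(iters):
        # Weight every data point by the Gaussian kernel evaluated at x...
        w = np.exp(-0.5 * np.sum((data - x)**2, axis=1) / bandwidth**2)
        # ...then move x to the weighted mean - one mean shift step.
        nx = w @ data / w.sum()
        if np.linalg.norm(nx - x) < tol:
            break  # Converged on a mode of the KDE.
        x = nx
    return x

def kde(data, x, bandwidth):
    """The density estimate itself - the same kernel sum, normalised."""
    n, d = data.shape
    norm = (2.0 * np.pi * bandwidth**2) ** (d / 2.0)
    k = np.exp(-0.5 * np.sum((data - x)**2, axis=1) / bandwidth**2)
    return k.mean() / norm
```

The shared machinery is the point: both functions are driven by the same kernel sum over the same data, which is why one interface can serve both purposes.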
The improvements are fairly extensive. Firstly, a lot of bugs have been squished, and there are now loads of test files - 34, in fact. Some generate pretty pictures :-) Secondly, it has much better support for directional distributions and composite kernels - you can, for instance, now have a feature vector whose first three entries are position and whose second three are rotation (angle axis representation), and build a proper distribution over the location and orientation of an object, by setting the first three to be Gaussian and the second three to be a mirrored Fisher distribution over the angle axis converted to a quaternion (the conversion happens internally, so you can continue to use angle axis externally). This conversion system has to be configured, but is very flexible, within the constraints of the built-in converters.
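For reference, the angle axis to quaternion conversion mentioned above is the standard one. This little sketch (the name is mine, not the module's) shows it, and also hints at why the Fisher distribution has to be mirrored:

```python
import numpy as np

def angle_axis_to_quaternion(v):
    """v is a 3-vector: its direction is the rotation axis, its magnitude
    the rotation angle in radians. Returns the quaternion (w, x, y, z)."""
    angle = np.linalg.norm(v)
    if angle < 1e-12:
        return np.array([1.0, 0.0, 0.0, 0.0])  # Identity rotation.
    axis = v / angle
    # q and -q represent the same rotation, so any distribution over unit
    # quaternions must be antipodally symmetric - hence a *mirrored* Fisher.
    return np.concatenate(([np.cos(0.5 * angle)], np.sin(0.5 * angle) * axis))
```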
The biggest improvement, however, is support for multiplying the distributions represented by MeanShift objects, using the technique from the paper "Nonparametric Belief Propagation" by Sudderth et al. This works for all kernels, though it's only really optimised for Gaussian and Fisher (including composites of them and the mirrored Fisher); for everything else it falls back to Metropolis-Hastings for drawing samples, and Monte Carlo integration for estimating the probabilities fed to the Gibbs sampler - slow! This makes it trivial to implement the nonparametric belief propagation technique, as you can use MeanShift objects as messages/beliefs.
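For intuition, here is a heavily simplified sketch of the Gibbs sampler from the Sudderth et al. paper, for drawing one sample from a product of kernel density estimates. It assumes 1D data, equal weights and a shared Gaussian bandwidth, and the names are illustrative rather than the module's API - the paper (and the module) handle much more general cases:

```python
import numpy as np

rng = np.random.default_rng(0)

def product_sample(kdes, h, sweeps=20):
    """kdes is a list of 1D arrays, each holding the kernel centres of one
    KDE. Returns one draw from the (normalised) product of the KDEs."""
    labels = [rng.integers(len(k)) for k in kdes]  # One component per KDE.
    for _ in range(sweeps):
        for j, centres in enumerate(kdes):
            # The currently selected kernels of the *other* KDEs multiply
            # into a single Gaussian: precisions add, means average.
            others = [kdes[i][labels[i]] for i in range(len(kdes)) if i != j]
            var_o = h**2 / len(others)
            mean_o = np.mean(others)
            # Conditional weight of each candidate centre: the integral of
            # its kernel times that product Gaussian, which is a Gaussian
            # in the gap between the two means (convolution of Gaussians).
            w = np.exp(-0.5 * (centres - mean_o)**2 / (h**2 + var_o))
            w /= w.sum()
            labels[j] = rng.choice(len(centres), p=w)
    # All selected kernels multiplied together give the final Gaussian.
    sel = np.array([kdes[i][labels[i]] for i in range(len(kdes))])
    return rng.normal(np.mean(sel), np.sqrt(h**2 / len(kdes)))
```

The module's optimised Gaussian/Fisher path computes those conditional weights analytically, as above; the Metropolis-Hastings and Monte Carlo fallback mentioned earlier exists precisely because most other kernels have no such closed form.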
This is a little unusual for me. Normally, I only upload code when it's attached to a paper, so anyone who uses the code will (I hope!) cite the associated paper. Sure, this module is used, heavily, by "My Text in Your Handwriting", but the entirety of this update is not. My only other module like this is my Dirichlet process Gaussian mixture model (which is also my most used module - people occasionally cite my background subtraction algorithm when using it, but there is no shared code, just a shared idea. Just my luck that my most popular code only rarely leads to citations!). But this code has been sitting around for a while, and it would be a waste not to publish it. So here it is - if anyone wants to collaborate on a research paper that uses it, then please email me!