Last night at the HomeBrew Robotics Club
How they were able to put on the demonstration in the first place was itself impressive. They first searched the web for any code that would help them, and when that failed, they worked through the mathematical paper on the algorithm and then implemented it themselves. That was a very tedious task.
It was a great presentation. First, one of the fellows showed his computer (with webcam attached) train on objects and then recognize them, speaking aloud what they were: a doll, a dollar bill, etc. What was amazing was that these objects could be rotated, partially obscured, distorted, shown at varying distances, and so on, and the algorithm still recognized them. For example, the computer was trained on both a one dollar bill and a twenty dollar bill. The bills were then shown to the webcam rotated, crumpled, and with a finger covering part of the bill, at varying distances, and the computer named the object correctly each and every time. The kicker is that the computer can be trained on many, many objects and recognize any or all of them. The training data is stored in a database, so how quickly objects are recognized really comes down to how quickly that database can be searched.
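To make the database-search point concrete, here is a toy sketch of how SIFT-style recognition boils down to nearest-neighbor lookup over stored feature vectors. This is not the presenters' code; the 4-dimensional descriptors and labels below are made up for illustration (real SIFT descriptors are 128-dimensional), and the ratio test shown is Lowe's standard trick for rejecting ambiguous matches.

```python
import math

def euclidean(a, b):
    """Distance between two descriptor vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Toy "training database": labeled descriptors from training images.
# (Hypothetical values; real SIFT descriptors are 128-D histograms.)
database = [
    ("one dollar bill",    [0.9, 0.1, 0.3, 0.7]),
    ("one dollar bill",    [0.8, 0.2, 0.4, 0.6]),
    ("twenty dollar bill", [0.1, 0.9, 0.7, 0.2]),
    ("doll",               [0.5, 0.5, 0.9, 0.9]),
]

def match(query, ratio=0.8):
    """Return the label of the nearest stored descriptor, applying
    Lowe's ratio test: accept only if the best match is clearly
    closer than the best match from any *other* object."""
    ranked = sorted(database, key=lambda e: euclidean(query, e[1]))
    best_label, best_vec = ranked[0]
    best = euclidean(query, best_vec)
    others = [euclidean(query, v) for lbl, v in database if lbl != best_label]
    if others and best > ratio * min(others):
        return None  # ambiguous: too close to another object's features
    return best_label

print(match([0.85, 0.15, 0.35, 0.65]))  # near the one-dollar training vectors
```

A recognizer extracts many such descriptors per frame and votes across their matches, which is why search speed over a large database dominates the recognition time.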
Next, there was a gentle introduction to the algorithm. The presenter showed a few slides and then a program he had written in Visual Basic that walked you visually through the steps of the algorithm, explaining what was going on in each one.
Next, there was a slightly more technical explanation.
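For readers who missed the talk, the first step of Lowe's algorithm finds keypoints as extrema of a difference-of-Gaussians (DoG) response across scales. Here is a minimal sketch of that idea on a 1-D "image" (the scales 1.0 and 1.6 are illustrative choices, not the talk's exact parameters):

```python
import math

def gaussian_kernel(sigma):
    """Normalized 1-D Gaussian kernel, truncated at 3 sigma."""
    radius = int(3 * sigma)
    k = [math.exp(-(x * x) / (2 * sigma * sigma))
         for x in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]

def blur(signal, sigma):
    """Convolve the signal with a Gaussian, clamping at the edges."""
    k = gaussian_kernel(sigma)
    r = len(k) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, w in enumerate(k):
            idx = min(max(i + j - r, 0), len(signal) - 1)
            acc += w * signal[idx]
        out.append(acc)
    return out

# A 1-D "image" with a bright blob at index 10.
signal = [0.0] * 21
signal[10] = 1.0

# Difference of Gaussians: subtract blurs at two neighboring scales.
dog = [a - b for a, b in zip(blur(signal, 1.0), blur(signal, 1.6))]

# The blob shows up as a local extremum of the DoG response.
peak = max(range(len(dog)), key=lambda i: dog[i])
print(peak)  # 10
```

The real algorithm does this in 2-D across a whole pyramid of scales, then builds an orientation histogram around each extremum to produce the descriptors that get matched against the database.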
But what really blew the crowd away (not that the crowd of about 50 members wasn't blown away already) was this: the next presenter had taken a CMUcam and attached it to a cheap FPGA on which he had implemented part of the algorithm. It was not yet recognizing objects (he is taking development in steps, and he has a real job with Xilinx), but he pointed the CMUcam at us, and there, at 60 FPS, was the crowd in outlined form. A guy in the back of the crowd started throwing and spinning a hat in the air. No problem. The crowd started moving around to see how robust this was. No problem. Truly amazing.
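The "outlined form" the FPGA produced is the kind of result a gradient-based edge filter gives, which is exactly the sort of pixel-parallel operation FPGAs excel at. The talk didn't say which filter was used, so this is a hypothetical sketch using the classic Sobel kernels on a tiny grayscale frame:

```python
# Sobel-style edge map on a tiny grayscale "frame".
# (Illustrative only; the actual FPGA pipeline was not described.)
frame = [
    [0, 0, 0, 0, 0, 0],
    [0, 9, 9, 9, 9, 0],
    [0, 9, 9, 9, 9, 0],
    [0, 9, 9, 9, 9, 0],
    [0, 0, 0, 0, 0, 0],
]

SX = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal-gradient kernel
SY = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical-gradient kernel

def edges(img, threshold=10):
    """Mark pixels where the gradient magnitude exceeds the threshold."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(SX[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(SY[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = 1 if abs(gx) + abs(gy) >= threshold else 0
    return out

for row in edges(frame):
    print("".join("#" if v else "." for v in row))
```

Each output pixel depends only on a 3x3 neighborhood, so an FPGA can compute every pixel of the edge map in parallel as the frame streams in, which is how 60 FPS is easy for hardware that would be expensive to match in software on a small robot.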
There are now plans to finish implementing Lowe's SIFT algorithm on an FPGA, attach a camera lens to it, and sell the boards to HBRC members to play with (read: debug). The expected cost was somewhere under $50. That's right. I'll type it again. $50. But I would think it worth it at twice that much or more.