I wanted to start a post to see if anyone else is working on tracking the motion vectors of objects detected by Pixy. In my application, I need to determine when an object abruptly changes direction, and it seems like computing a velocity vector from frame-to-frame position deltas would be the most logical approach.
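Here is a minimal sketch of the idea, independent of the Pixy API itself: take three successive (x, y) positions of one tracked block, form the two delta vectors, and flag a direction change when the angle between them exceeds a threshold. The function name, the threshold, and the sample track are all my own assumptions, not anything from Pixy's libraries.

```python
import math

def direction_change(p0, p1, p2, angle_thresh_deg=60.0):
    """Return True if the heading from (p0->p1) to (p1->p2)
    turns by more than angle_thresh_deg degrees."""
    v1 = (p1[0] - p0[0], p1[1] - p0[1])  # delta over first frame pair
    v2 = (p2[0] - p1[0], p2[1] - p1[1])  # delta over second frame pair
    m1 = math.hypot(*v1)
    m2 = math.hypot(*v2)
    if m1 == 0 or m2 == 0:
        return False  # no motion in one interval; nothing to compare
    cos_a = (v1[0] * v2[0] + v1[1] * v2[1]) / (m1 * m2)
    cos_a = max(-1.0, min(1.0, cos_a))  # clamp against float rounding
    return math.degrees(math.acos(cos_a)) > angle_thresh_deg

# hypothetical positions of one block over four frames: moves right, then turns
track = [(10, 50), (20, 50), (30, 50), (30, 40)]
changes = [direction_change(*track[i:i + 3]) for i in range(len(track) - 2)]
# the 90-degree turn at the last frame is flagged: [False, True]
```

In practice you would feed this the x/y of a block with a stable tracking index each time Pixy reports new data, and you may want to smooth positions over a few frames first, since per-frame jitter can look like a sharp turn.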
This would be helpful for gesture recognition, but I have a much simpler and more fun idea in mind. I would like to detect when a colored juggling ball leaves your hand, and use that event to choreograph a song. The song might be something recognizable, though depending on your juggling pattern it might not sound quite right, but it would still be fun. It would also be cool to just build a simple synthesizer that responds to different colors in motion and to direction changes.
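A rough sketch of the song idea, with every detail here assumed: each detected throw (by Pixy color signature) simply advances through a fixed melody, so the balls play the song in whatever order they get thrown. The melody, the signature numbers, and the `on_throw` hook are all placeholders for illustration.

```python
from itertools import cycle

# assumed melody as MIDI note numbers: C D E F G, repeating
MELODY = cycle([60, 62, 64, 65, 67])

def on_throw(signature):
    """Call this when a ball (identified by Pixy color signature)
    is detected leaving the hand; returns (which ball, next note)."""
    return signature, next(MELODY)

# simulate four throws of balls with signatures 1, 2, 1, 3
events = [on_throw(s) for s in (1, 2, 1, 3)]
# every throw plays the next note of the song no matter which ball triggered it
```

For the synthesizer variant you could instead map each signature to its own fixed pitch, and use the direction-change detection to bend or retrigger the note.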
If anyone else is working on something similar I would love to collaborate here with code snippets. When I get something working I’ll post a reply.
If anyone has advice, that would be helpful as well. Thanks!