Some Setbacks with Kinect
Today was a testing day filled with setbacks. With a few members of my team, I have been trying to make the motion sensors, touch sensors, and synthetic music work together, but every component keeps running into its own issue.
With the Kinects, we've tried multiple configurations to get motion data that is as accurate and smooth as possible. Unfortunately, none of them has turned out to be the right choice for our project yet.
At first, we placed the Kinects behind the performer, each at a 45-degree angle from the origin. This configuration ensures that the audience can't get between the performer and the cameras. However, the biggest setback is that both cameras only see the performer's back, so when she turns or spins, they cannot discern which direction she is facing. We lose accurate tracking, and even when the performer returns to her original position, the tracking remains confused.
The second configuration we tried was at 90 degrees from the origin. We placed the Kinects perpendicular to each other to check whether having a side view would help the tracking when the performer turns or spins. Unfortunately, this wasn't the case, and the tracking still broke down whenever the performer faced a different direction. That said, we noted that the overall jitter of the tracked performer and the popping of arms and legs improved slightly in this configuration compared to the 45-degree one.
The third and last configuration we tried placed the cameras at the front and back, 180 degrees apart. Following the iPi documentation, we offset the cameras slightly in opposite directions so they weren't directly facing each other. For tracking spins and turns, this configuration was easily the best. Unfortunately, our main concern here is the physical placement of the cameras in our setup: the front camera can easily get blocked by the audience, while the back camera sits right in the middle of our screen and blocks the projection. This setup could serve as our last resort, but it's still not preferable.
Along with our Kinect configuration issues, today we also tried to figure out the best way to calibrate our Kinects. We had to get very creative with our solutions, but that'll be a whole other blog post of its own :3
Though we weren't able to make motion, touch, and music work together today as we had hoped, I still managed to get some triggers and effects working in Unreal based on the tracking! The first video showcases two interactions I've set up in Unreal: a sphere following basic physics and a light trigger based on collision; the latter half of the video also shows the spinning issues we had with the first two configurations. The second video uses our final front-back configuration, where the spin tracked more accurately.
I hadn't worked with collisions and physics in Unreal before, but setting up the sphere and light was much easier than I expected! By next week, I hope we can add more effects to our scene to showcase in class.