Finalizing Motion Sensors and Configuration
Another Friday in the studio. We started off by testing different configurations with three Kinects this time, only to realize, after spending quite a bit of time, that the live-broadcasting mode of the software we use does not support more than two Kinects. So we had to step back to our front-back configuration of two Kinects.
Next, we tested a few more calibration objects. Yash tried one more time to get a PlayStation Move controller working, but it was mostly in vain; I think the software is too old and undocumented to use with an independent computer. We had to get creative again and combined our phone-light-plus-bottle-cap idea with the silicone ball on top of the controller.
(Our final makeshift calibration object)
(The best calibration we got with 2 Kinect setup and the final calibration object)
In the last few minutes of our testing day, we realized that if the Kinects were positioned on the two sides of the actor, the live tracking was as good as with them at the front and back! The side-to-side setup could track a full turn of the performer, and it would limit the audience's interference with the cameras' views. Ignoring the snapping joints, which could be refined further, the video below shows that the Kinects track pretty well when they can see the performer's sides.
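I don't know exactly how the tracking software merges the two sensors' views internally, but the basic idea behind any two-Kinect setup is the same: transform each sensor's joint estimates into a shared world frame using the calibration, then blend them, preferring whichever sensor sees a joint better. A toy sketch, with made-up extrinsics and confidence values:

```python
import numpy as np

# Hypothetical sketch (not the actual software's algorithm): fusing one
# joint's position as seen by two calibrated Kinects. Assumes calibration
# gives each sensor's rotation R and translation t into a world frame,
# and that each joint carries a per-sensor confidence in [0, 1].

def to_world(p_cam, R, t):
    """Map a joint from a sensor's camera frame to the world frame."""
    return R @ p_cam + t

def fuse_joint(p_a, conf_a, p_b, conf_b):
    """Confidence-weighted average of one joint seen by two sensors.
    Returns None if both sensors lost the joint entirely."""
    total = conf_a + conf_b
    if total == 0:
        return None
    return (conf_a * p_a + conf_b * p_b) / total

# Illustration: sensor A at the origin, sensor B offset 2 m to the side.
R_a, t_a = np.eye(3), np.zeros(3)
R_b, t_b = np.eye(3), np.array([0.0, 2.0, 0.0])

# The same elbow, expressed in each sensor's own camera frame.
elbow_a = to_world(np.array([0.3, 0.1, 1.2]), R_a, t_a)
elbow_b = to_world(np.array([0.3, -1.9, 1.2]), R_b, t_b)

fused = fuse_joint(elbow_a, 0.9, elbow_b, 0.6)
print(fused)  # both observations agree, so the fusion returns that point
```

In practice the snapping we saw probably comes from exactly this step: when one sensor briefly mislabels a joint, a naive weighted average jumps, which is why real systems add temporal smoothing on top.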
Another thing I wanted to get going this week was the projection screen and Unreal's nDisplay. I've had previous experience working with the two rear projectors in the studio, so getting them calibrated with the Scalable Desktop software went pretty smoothly.
(Rear projector setup in the studio and the calibration process)
I didn't have much time left in the studio to keep troubleshooting for the day, so I'm planning to go home and test an nDisplay setup from scratch on my own laptop. If that works, I should have a pretty good idea of how to do it again in the studio on Monday.
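For the laptop test, the plan is a minimal single-node setup. The sketch below follows the general shape of an UE4-era nDisplay text config (cluster node, window, viewport, projection policy, screen); all ids, addresses, and dimensions are placeholders I made up, not our actual studio values:

```ini
; Hypothetical minimal single-machine nDisplay config (values are placeholders)
[cluster_node] id=node_main addr=127.0.0.1 window=wnd_main master=true
[window] id=wnd_main viewports=vp_main fullscreen=false WinX=0 WinY=0 ResX=1920 ResY=1080
[viewport] id=vp_main x=0 y=0 width=1920 height=1080 projection=proj_main
[projection] id=proj_main type=simple screen=scr_main
; screen size is in meters, roughly a 16:9 surface
[screen] id=scr_main loc="X=1.5,Y=0,Z=0" rot="P=0,Y=0,R=0" size="X=1.77,Y=1"
[camera] id=camera_static loc="X=0,Y=0,Z=0"
```

Once a single viewport renders on the laptop, extending it to the two studio projectors should mostly be a matter of adding a second window/viewport pair with the screen transforms measured from the real setup.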