Posts

Showing posts from February, 2024

nDisplay on Projection

Based on my nDisplay testing last night, I recreated the configuration on the studio computer. After calibrating the projectors, I launched nDisplay and everything worked great! The ticker on the screen was live and accurate. For a single nDisplay setup, everything should be good to go for tomorrow's class!

UPDATE: Everything was not "good to go for tomorrow's class" :') During the demo, when nDisplay was launched, the "puppet" on screen would not update live and move according to the Kinect skeleton. It just became a flat pancake on the ground... Because we didn't have time to test the nDisplay setup together with the Kinects the day before, I didn't expect to run into an issue like this. My guess is that when nDisplay launches, it doesn't know to update the puppet's skeleton from the Live Link data by default. I wasn't sure how to fix this yet, but after a quick chat with Nick, he said that my guess was par...

Testing nDisplay on Laptop

Since nDisplay in the studio didn't work a few days ago, I wanted to test a simple setup on my laptop to understand the basics.

1. Create a new nDisplay configuration. This can easily be done from the right-click menu, in a new folder in the project's Content Browser.
2. Create a new nDisplay screen in the configuration. Delete the pre-existing screen and create a new one; set the dimensions according to the physical screen in the studio. For curved LED walls, you can also bring in an FBX model of the screen and use that in the configuration. A flat screen that already exists in the Unreal project was good enough for my needs. In a full virtual production setup, the location and rotation of the screen would need to be accurate down to a millimeter. However, because we aren't using a camera or tracking in our project, the location of the screen doesn't matter.
3. Create a new cluster node. Most of the default settings are good. If the IP address section is not already po...

Finalizing Motion Sensors and Configuration

Another Friday in the studio, and we started off by testing different configurations with 3 Kinects this time, only to realize after spending quite a bit of time that the live broadcasting feature of the software we use does not support more than 2 Kinects! So we had to fall back to our front-back configuration of 2 Kinects. Next, we tested a few more calibration objects. Yash tried one more time to get a Move controller working, but it was mostly in vain; I think the software is too old and undocumented to use with an independent computer. We had to get creative again and combined our phone-light-plus-bottle-cap idea with the silicone ball on top of the controller. (Our final makeshift calibration object) This method gave us the best results of all the options we tried, so we decided to finalize it as our calibration object. Despite multiple tests, we still weren't able to get a "perfect" calibration in iPi, but we got as close as we could. The only data that's preventin...

Setting Up Capacitive Touch Sensors

In the past week, we had an issue setting up the capacitive sensors in the studio. To double-check whether the issue was due to the studio's network or not, I took the sensor home and tried the same setup there. After a few tries, I was able to make it work. Here are the detailed steps I followed:

1. Make sure to download the latest version of Thonny. I originally had an older version, and until I updated the software, I wasn't able to proceed to step 2.
2. Set the Thonny interpreter to MicroPython. Go to Tools > Options > Interpreter and set it to MicroPython (Raspberry Pi Pico). I left the remaining settings as is.
3. Restart Thonny to make sure everything is set.
4. Connect the capacitive sensors to the computer. The files and settings on the Adafruit board should appear at the bottom of the Files window in Thonny. (Open the Files window from View > Files.)
5. Open the "settings.toml" file from the left window. Set the SSID and password to the same network that the computer is con...
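For reference, here's a minimal sketch of what ends up on the board once the Wi-Fi details are in settings.toml. Everything here is hedged: it assumes the board is a Pico W running Adafruit's CircuitPython (8.x or later), and the GP15 touch pin is a placeholder I picked for illustration rather than our actual wiring.

```python
# code.py -- minimal sketch for a Pico W running CircuitPython (8.x or later).
# Assumes settings.toml on the board contains (values are placeholders):
#   CIRCUITPY_WIFI_SSID = "your-network-name"
#   CIRCUITPY_WIFI_PASSWORD = "your-network-password"
import os
import time

import board
import touchio
import wifi

# Join the same network as the computer, using the credentials from settings.toml.
wifi.radio.connect(
    os.getenv("CIRCUITPY_WIFI_SSID"),
    os.getenv("CIRCUITPY_WIFI_PASSWORD"),
)
print("Connected, IP address:", wifi.radio.ipv4_address)

# A capacitive touch pad wired to GP15 (placeholder pin -- use whatever pin the sensor is on).
touch_pad = touchio.TouchIn(board.GP15)

while True:
    # .value is True while the pad is being touched; .raw_value is the underlying reading.
    print(touch_pad.value, touch_pad.raw_value)
    time.sleep(0.1)
```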

Getting Creative with the Calibration Object

The original documentation on iPi's website suggests calibrating the Kinect setup with a PlayStation Move controller. Unfortunately for us, no one on the team owns one, and even when we borrowed one from the lab, we couldn't get it to turn on, much less work! So we had to get really creative with our calibration objects. At first, we tried to use the old calibration wand from the motion capture studio: basically a T-shaped stick with reflective markers attached to it. However, the Kinect sensors didn't pick up the markers at all. Then we tried the active wand: same idea, but with red and infrared lights built into it for calibration. The tracking still wasn't great, probably due to the multiple light sources on the wand. So for the next iteration, we used black tape to block all but one of the lights, and added a bottle cap over the remaining light to diffuse it. This method actually worked pretty well! Unfortunately, the wand is built wit...

Some Setbacks with Kinect

Today was a testing day filled with setbacks. A few members of my team and I have been trying to make the motion sensors, touch sensors, and synthetic music work together, but every part keeps running into an issue. With the Kinects, we've tried multiple configurations to get motion data that is as accurate and smooth as possible. Unfortunately, none of them seems to be the perfect choice for our project yet. At first, we placed the Kinects behind the performer at a 45-degree angle from the origin. This configuration ensures that the audience can't get between the performer and the cameras. However, the biggest setback is that both cameras only see the performer's back, so when she turns or spins, they cannot tell which direction she is facing. We lose accurate tracking, and even when the performer turns back to her original position, the tracking remains confused. (45-degree configuration) The second configuration we...

Testing Kinect and Digital Puppet

Yesterday we met in the mocap lab to test our Kinect setup with TouchDesigner and Unreal Engine. We spent quite some time trying to get the motion and pose data from the iPi software into TouchDesigner. Unfortunately, this didn't work out because neither iPi nor Touch has a native plug-in for talking to the other. The only solution seemed to be having a member of the team write a script to bridge the two. But since that might complicate things in our project, we decided to fall back on our backup plan and use Unreal Engine. With this method, I'd say we were somewhat successful. First, we set up the Kinect and iPi software as usual and made sure it was tracking the person in view. With tracking running, we turned on data streaming and chose the UE5 mannequin "Manny" as our test rig. (Default rig and stream setup) On the Unreal Engine side, I made sure that the Live Link plug-ins from iPi were added to the project and activated. Th...
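We never wrote that bridge script, but if someone on the team did, I imagine it would start out as something like the sketch below: a small relay that listens for pose data on a UDP socket and parses out joint values. Everything here is hypothetical; the port and the one-joint-per-line "name x y z" format are assumptions for illustration, and I haven't checked what iPi can actually be made to emit.

```python
# Hypothetical bridge sketch: listen for pose packets on UDP and parse them.
# The packet format ("joint_name x y z" per line) is an assumption for illustration;
# a real bridge would have to match whatever the sending side can actually stream.
import socket

HOST, PORT = "0.0.0.0", 9000  # placeholder port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind((HOST, PORT))
print(f"Listening for pose data on udp://{HOST}:{PORT}")

while True:
    data, addr = sock.recvfrom(4096)
    for line in data.decode("utf-8", errors="ignore").splitlines():
        parts = line.split()
        if len(parts) != 4:
            continue  # skip anything that isn't "joint x y z"
        joint, coords = parts[0], parts[1:]
        try:
            x, y, z = (float(c) for c in coords)
        except ValueError:
            continue
        # In a real setup this is where the values would be handed to TouchDesigner,
        # e.g. by writing them into a table or re-sending them as OSC.
        print(joint, x, y, z)
```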

2D Puppet Updates in TouchDesigner

I've spent the past few days playing around with creating a puppet that can be controlled by different inputs in TouchDesigner. After some trial and error, I ended up making a temporary 2D puppet that can be controlled with my mouse on screen. My first step was to mock up a puppet on my iPad. In the end, the best way I found was to draw the different parts of the puppet as separate image files and import them into TouchDesigner individually. For testing purposes, I only focused on the puppet's right arm: one image for the arm, one for the shoulder, and one for the rest of the body. (Parts of the puppet drawn separately and uploaded to Touch) After bringing the puppet into TouchDesigner, I tried connecting its motions to some input data. For now, I'm using my mouse's x and y position to move the arm. Within the next week, I hope to connect the puppet to data coming from Kinect instead. I will have to check what kind of data I can bring in from Kinect to Touch though... A...
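For anyone curious, the mouse hookup basically boils down to mapping the Mouse In CHOP's tx/ty channels onto the arm's transform. Here's a rough sketch of the kind of callback that does it; the operator names ('mousein1', 'arm'), the Geometry COMP setup, and the value ranges are all placeholders to tune by eye, not necessarily the exact wiring I'll end up keeping.

```python
# CHOP Execute DAT attached to a Mouse In CHOP ('mousein1').
# 'arm' is a Geometry COMP holding the arm image (names are just what I used here).
def onValueChange(channel, sampleIndex, val, prev):
    if channel.name == 'tx':
        # Map horizontal mouse position (roughly -1..1) to an arm rotation in degrees.
        op('arm').par.rz = tdu.remap(val, -1, 1, -45, 45)
    elif channel.name == 'ty':
        # Map vertical mouse position to a small up/down translation of the arm.
        op('arm').par.ty = tdu.remap(val, -1, 1, -0.2, 0.2)
    return
```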

Potential New Equipment: Mo-Sys??

One of my old classmates told me that somewhere in the pile of equipment Nick likes to collect, there's some Mo-Sys gear. I've never used this brand for tracking or mocap before, but it might be worth checking out! I should talk to Nick next week and learn more. From what I already know and some quick research, this system is usually used to track camera position, so it's mostly found in virtual production. The company also makes a lot of other equipment and robots. It might be useful to us for tracking the location of a person, but I'm not sure it's going to be the best option for our project. (Mo-Sys camera tracking equipment)

----- Update: As I thought, Kinect was much more reliable and better suited to our final idea. I guess I'll play around with setting up Mo-Sys another time haha