Testing Kinect and Digital Puppet

Yesterday we met in the mocap lab to test our Kinect setup with TouchDesigner and Unreal Engine. We spent quite some time trying to get the motion and pose data from the iPi software into TouchDesigner. Unfortunately, this didn't work out, because neither iPi nor TouchDesigner has a native plug-in for communicating with the other. The only solution seemed to be having a member of the team write a script to bridge the two (a rough sketch of what the TouchDesigner end of that could look like follows below). But since that might complicate things in our project, we decided to fall back on our backup plan and use Unreal Engine.
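
If we ever revisit that route, the receiving side in TouchDesigner could be a Python callback on a UDP In DAT that parses joint data into a table. To be clear, this is purely hypothetical: we don't know of any plain-text protocol that iPi exposes, so a bridge script would also have to re-broadcast the data in some agreed format, and the "joint_table" operator name is made up.

```python
# TouchDesigner: callbacks DAT attached to a UDP In DAT.
# Hypothetical format: each packet is one line, "<joint> <x> <y> <z>".

def onReceive(dat, rowIndex, message, bytes, peer):
    parts = message.split()
    if len(parts) != 4:
        return  # ignore malformed packets
    name, x, y, z = parts

    table = op('joint_table')  # a Table DAT used as the joint buffer
    row = table.row(name)      # find the row whose first cell matches
    if row is None:
        table.appendRow([name, x, y, z])
    else:
        row[1].val, row[2].val, row[3].val = x, y, z
    return
```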

For the Unreal method, I'd say we were somewhat successful. First, we set up the Kinect and the iPi software as usual and made sure it was tracking the person in view. With tracking running, we turned on data streaming and chose the UE5 mannequin "Manny" as our test rig.

(Default rig and stream setup)

On the Unreal Engine side, I made sure that the Live Link plug-ins for iPi were added to the project and activated. Then I added the data stream as a Live Link source and set up a blueprint that drives the puppet from the real-time data received from iPi. As soon as the puppet was connected, I realized that the data flow from iPi to UE was extremely slow: "Manny" in Unreal would lag almost a whole minute behind the actual person in front of the Kinect. We figured this was probably due to hardware limitations. Even with an RTX 3080 in the computer we were testing on, the lag made our real-time data stream basically useless.

(Live link setup and pose blueprint in UE)
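
One thing we could try next time, to tell network backlog apart from render-side slowness, is to timestamp the incoming packets outside of Unreal. Here is a minimal sketch of that kind of diagnostic; the port number is a placeholder, and it assumes the stream can be mirrored to a plain UDP socket, which we haven't verified against whatever transport iPi's Live Link actually uses.

```python
import socket
import time

# Hypothetical diagnostic: log the gap between consecutive packets.
# Steadily growing gaps point at the sender or the network; steady
# gaps with a laggy puppet point at Unreal's side.
PORT = 9000  # placeholder, not iPi's real port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", PORT))

last = None
while True:
    data, addr = sock.recvfrom(65535)
    now = time.monotonic()
    if last is not None:
        gap_ms = (now - last) * 1000.0
        print(f"{len(data):6d} bytes from {addr[0]}  gap {gap_ms:8.2f} ms")
    last = now
```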

To optimize the stream, I ended up excluding some of the data transferred from iPi to UE (all finger joints and the detailed spine joints), reducing the graphics settings in the Unreal project, and closing everything else running on the laptop we were using. In the end, we managed to get much smoother animation in Unreal.

(Streaming from iPi to Unreal Engine in real-time)
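
For reference, the graphics reductions can also be applied as console commands instead of clicking through the editor UI. A minimal sketch, run from the editor's Python console (this assumes the Python Editor Script Plugin is enabled; the exact values are illustrative, not the settings we used):

```python
import unreal

# Drop scalability settings and cap the frame rate so the editor
# spends less time rendering and more time consuming the stream.
commands = [
    "sg.ViewDistanceQuality 0",
    "sg.AntiAliasingQuality 0",
    "sg.ShadowQuality 0",
    "sg.PostProcessQuality 0",
    "sg.TextureQuality 0",
    "sg.EffectsQuality 0",
    "r.ScreenPercentage 50",  # render at half resolution
    "t.MaxFPS 60",            # cap the frame rate
]

for cmd in commands:
    unreal.SystemLibrary.execute_console_command(None, cmd)
```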

As you might have noticed in the video, there are unfortunately still some jitter issues in the data. I think our only other option is to switch to hardware with a better GPU, so I suggested that next time we test with the render box in the mocap room; it has an A6000 card. If we can get that machine set up, we should be able to tell whether the jitter issue can be solved at all. If it can't, we'll have to look into different methods ASAP.

Right now, the backup options we have talked about are:
  • Removing jitter within Unreal Engine (a sketch of the kind of smoothing filter this would take follows this list)
  • Running one Kinect directly into UE (no iPi streaming)
  • Running one Kinect directly into TouchDesigner
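
For the first option, the standard approach to mocap jitter is a low-pass filter on each joint channel. Below is a sketch of the One Euro filter, a common choice for tracking data because it smooths hard when the signal is slow and backs off when it moves fast. This is not something built into Unreal or iPi; it's a reference implementation we would have to port into the puppet's blueprint, and the parameter values are illustrative.

```python
import math

class OneEuroFilter:
    """Adaptive low-pass filter for noisy real-time signals.

    min_cutoff sets smoothing at rest (lower = smoother but laggier);
    beta raises the cutoff with speed so fast motion stays responsive.
    """

    def __init__(self, min_cutoff=1.0, beta=0.05, d_cutoff=1.0):
        self.min_cutoff = min_cutoff
        self.beta = beta
        self.d_cutoff = d_cutoff
        self.x_prev = None
        self.dx_prev = 0.0
        self.t_prev = None

    @staticmethod
    def _alpha(cutoff, dt):
        # Smoothing factor of an exponential low-pass at this cutoff.
        tau = 1.0 / (2.0 * math.pi * cutoff)
        return 1.0 / (1.0 + tau / dt)

    def __call__(self, x, t):
        if self.x_prev is None:
            self.x_prev, self.t_prev = x, t
            return x
        dt = t - self.t_prev
        if dt <= 0.0:
            return self.x_prev  # duplicate or out-of-order timestamp
        # Estimate speed, then adapt the cutoff to it.
        dx = (x - self.x_prev) / dt
        a_d = self._alpha(self.d_cutoff, dt)
        dx_hat = a_d * dx + (1.0 - a_d) * self.dx_prev
        cutoff = self.min_cutoff + self.beta * abs(dx_hat)
        a = self._alpha(cutoff, dt)
        x_hat = a * x + (1.0 - a) * self.x_prev
        self.x_prev, self.dx_prev, self.t_prev = x_hat, dx_hat, t
        return x_hat
```

In practice this would mean one filter instance per channel (each position and rotation component of each joint), fed with the stream's timestamps.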
