Even More Testing and First Demo in Studio

Today I started my work in the studio by continuing my tests on nDisplay configurations. As a proof of concept, I decided to use my laptop and a TV monitor to test a multi-screen setup. I connected my laptop to the TV with a display cable and made sure the screens were lined up and numbered correctly in the display settings.

(Display settings showing my laptop (HD) as number 1 and the TV (4K) as number 2)

After setting up the hardware, I moved on to the nDisplay configuration in Unreal to position the viewports. With this configuration, the first nDisplay viewport should appear in the top-left corner of my laptop screen and the second should fill the whole TV.
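To make the intended layout concrete, here is a rough sketch of the viewport geometry I was aiming for, written as plain Python data rather than an actual nDisplay config file. The exact pixel sizes are my own assumptions based on the HD laptop and the 4K TV.

```python
# Hypothetical sketch of the two-viewport layout for this test (not a real
# nDisplay config file). The laptop (1920x1080) hosts a quarter-size viewport
# in its top-left corner, while the TV (3840x2160) is filled edge to edge.
viewports = {
    "vp_laptop": {"display": 1, "x": 0, "y": 0, "width": 960,  "height": 540},
    "vp_tv":     {"display": 2, "x": 0, "y": 0, "width": 3840, "height": 2160},
}

for name, vp in viewports.items():
    print(f"{name}: display {vp['display']}, "
          f"{vp['width']}x{vp['height']} at ({vp['x']}, {vp['y']})")
```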

To my surprise, the setup worked perfectly on the first try! This test shows that my setup from yesterday was also correct, and that the issues I ran into came from having two projectors in our setup. Since we can't get rid of the projectors or turn one of them off (that would lower the brightness and contrast), I will need to run some tests with multi-user sessions in Unreal.

(nDisplay setup working as intended on two different screens)


Next, I helped my team set up our final environment for the puppet. Instead of creating a stage from scratch, we decided to use an existing free environment asset from the Epic store and edit it the way we saw fit. Once that was done, we added the curtains from yesterday to the stage.

(Final environment for the demo of the day)

After putting the final touches on the environment, I dragged the nDisplay blueprint into the scene and adjusted it to the position we wanted. As you can see in the following image, it is hard to tell what is being displayed on the screen when the viewport is busy with other assets, and I couldn't find a way around this. I ended up launching nDisplay, checking the camera's position on screen, stopping nDisplay to edit the camera, and repeating this cycle over and over until we got the view we wanted. It was a tedious process, so I might look into ways to make editing the nDisplay camera easier in the future (a rough idea is sketched after the image below).

(Final placement of nDisplay for the demo of the day)
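One rough idea for making those adjustments less painful: the editor's Python console can move actors while the level is open, so a small script could nudge the nDisplay root actor in fixed steps instead of dragging it blindly between launches. This is only a sketch using the stock editor scripting API; the 50 cm step and the assumption that the nDisplay root actor is the currently selected actor are mine, and I haven't tried it in our project yet.

```python
# Sketch: nudge whatever actors are currently selected in the editor
# (e.g. the nDisplay root actor) by a fixed offset from the Python console.
import unreal

STEP = unreal.Vector(0.0, 50.0, 0.0)  # 50 cm along Y per call; adjust as needed

for actor in unreal.EditorLevelLibrary.get_selected_level_actors():
    actor.add_actor_world_offset(STEP, sweep=False, teleport=True)
    unreal.log(f"{actor.get_name()} is now at {actor.get_actor_location()}")
```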

While adjusting the camera settings, I came across a new issue. If we didn't use a frustum, the lens settings of the camera wouldn't apply to nDisplay. However, when we did use a frustum, it appeared as a smaller scene within the scene. This is because the frustum normally follows a tracked physical camera on set, but we aren't using one and need the whole screen to act as the frustum.

(nDisplay with frustum in the middle that shows what the Unreal camera sees)

When I searched online, I wasn't able to find any information on how to make the frustum cover the full screen rather than follow a camera view. But while looking through the ICVFX camera settings in Unreal, I found a setting called "Overscan". When raised above the default of 1, it enlarged the frustum on nDisplay. If I got too greedy with it, however, the image inside the frustum turned into a blurry mess.

(nDisplay when frustum overscan is set above 3)

I ended up having to strike a balance between the overscan setting and the camera's focal length. This meant that these settings would be locked for the stage and we couldn't drive focal-length changes from triggers... but it is what it is until I find a different way to fix the frustum issue!

(Final look with overscan set at 2 and focal length at 30)
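My working assumption for why high overscan turns blurry is that the inner frustum keeps roughly the same pixel budget while covering a much larger area of the output, so each rendered pixel gets stretched further. The back-of-the-envelope sketch below is only meant to illustrate that trade-off; both resolution numbers are made up, not measured from our setup.

```python
# Illustrative only (assumed numbers, not measurements): if the inner frustum's
# render width stays fixed while overscan enlarges the area it covers, the
# ratio of render pixels to output pixels drops quickly.
FRUSTUM_RENDER_WIDTH = 1920   # assumed horizontal resolution of the inner frustum
BASE_COVERAGE_WIDTH = 1280    # assumed output pixels covered at overscan = 1

for overscan in (1.0, 2.0, 3.0):
    coverage = BASE_COVERAGE_WIDTH * overscan
    density = FRUSTUM_RENDER_WIDTH / coverage
    print(f"overscan {overscan:.0f}: ~{density:.2f} render px per output px")
```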


Lastly, I imported the puppet we want to use for the final exhibition and placed it in the scene (over the bed, in the middle of the stage). Initially, I thought I would have to do some extra work to retarget the iPi skeleton onto the puppet. I did a Live Link test run in which iPi retargeted onto the default Unreal mannequin, and I applied the live data directly to the puppet. To my surprise, the puppet worked perfectly and the animations were pretty accurate without any extra retargeting. Given the time crunch, I decided to leave proper retargeting for later and only revisit it if an issue with the current setup pops up.

(Live-tracked performer with the puppet for today's demo)

As seen in the video above, we also realized that with the current Kinect and puppet setup, a quick T-pose fixes any issues in the arms and legs when the live tracking breaks down. This shows that unless the tracking becomes completely unrecognizable, we don't have to stop and restart streaming through the iPi software like we had to last week during the class demo.

Overall, I'd say we've done a lot of great work as a team in the past few days for today's demo!
