Dress Rehearsal
Today we did a full dress rehearsal to get a sense of how the exhibit would work and to collect feedback from the attending audience. Before the rehearsal started, I worked on a few things in the Unreal project to improve the experience.
Since I had arrived at the studio early, I wanted to give nDisplay one more chance on a different machine. I rolled another computer on a cart next to the one we've been using and tried connecting it to the multi-user session. However, I quickly ran into hardware issues with the new computer. It had a pretty old GTX graphics card that wasn't strong enough to handle multi-user and rendering at the same time. Even just logging into the computer took 15 minutes! I had to abandon my two-screen display ideas once again...
Next, I made some changes to our Switchboard settings so I could edit the project more intuitively. The last time I tried to position the curtains properly, I had to make changes, launch nDisplay, check the position on the screen, and then close nDisplay to try again. This was a time-consuming and very tedious process.
To fix this issue, instead of opening the Unreal project through the Epic launcher, I added it to Switchboard to be launched on command. After making sure both Switchboard and the newly opened project were connected properly to the multi-user server, I was able to edit the project and see changes in real time on the nDisplay screen.
(The project added to Switchboard as an Unreal device.)
(Editing project with changes reflected on nDisplay in real-time.)
While setting up the project on Switchboard, I came across a weird issue where nDisplay would get stuck on a black screen and then crash randomly. The main settings in nDisplay and Switchboard hadn't changed since the last time I worked with them, so I wasn't sure what was causing the issue. After looking through different settings, I eventually noticed that the computer wasn't connected to the URBN-246 wifi. We mainly use the wifi to transfer data from our touch sensors, not for anything nDisplay-related. But since it wouldn't hurt to try, I reconnected to the wifi, and voila! nDisplay started working as usual again...
Something else I worked on, but didn't have much time to implement before the dress rehearsal, is post-process effects on the camera. For the past few days, I've been looking into different camera effects we can use on nDisplay to further improve our visuals and add to the progression of discomfort on screen. Some settings I've found that can be useful for us are saturation, contrast, gain, grain, etc. I wanted to add some lens distortion as well, but unfortunately, the use of the frustum in nDisplay limits us to a specific focal length and lens setting.
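As a rough standalone sketch (plain Python, outside Unreal), the idea of ramping these post-process parameters with a discomfort level could look something like this. The function name and all the parameter ranges here are made-up placeholders, not values from our project:

```python
def post_process_for_discomfort(level: float) -> dict:
    """Map a normalized discomfort level (0.0 = calm, 1.0 = max)
    to hypothetical post-process parameter values."""
    level = max(0.0, min(1.0, level))  # clamp to [0, 1]
    return {
        "saturation": 1.0 - 0.8 * level,  # desaturate as discomfort rises
        "contrast": 1.0 + 0.5 * level,    # push contrast up
        "grain": 0.6 * level,             # layer in film grain
    }
```

In the actual project, the returned values would be written into the camera's post-process settings each frame (or on each trigger) instead of returned as a dictionary.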
I tried a few different ways to integrate the camera effects into our master blueprint, but not all of them worked as I had hoped. There wasn't an easy way to bring the camera that already exists inside the nDisplay configuration into another blueprint, so I tried creating a separate camera and turning it into a child blueprint under the master BP. However, this caused a lot of errors whenever we tried to run the scene, because the camera ended up as a child BP under a second master blueprint, and having duplicates of the same blueprint caused Unreal to crash with errors.
(Adding ICVFX camera as a child actor under the master (UDP) blueprint.)
Next, with Nick's help, I tried adding the ICVFX camera directly into the master blueprint as a variable. This worked much better, and I was able to get and set data on the camera as needed. Setting the variable type to "Cine Camera Actor" worked best, and from there, I could connect the camera variable to an "Add Post Process Component" node.
(ICVFX camera variable in the master blueprint.)
For testing purposes, I was able to lower the saturation of the scene with a keyboard trigger, so the setup is working correctly and can be extended to fit the project. After the dress rehearsal, we will need to improve the setup and drive the triggers based on touch or discomfort level instead.
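The trigger logic itself is simple enough to sketch outside Unreal. This is an illustrative stand-in for the Blueprint setup, where each trigger event (a key press now, a touch event later) steps the saturation down toward a floor; the step size and floor value are made-up numbers:

```python
class SaturationTrigger:
    """Steps a camera's saturation down on each trigger event
    and floors it at a minimum, mirroring the Blueprint logic."""

    def __init__(self, step: float = 0.15, floor: float = 0.2):
        self.saturation = 1.0  # full saturation at the start of the scene
        self.step = step
        self.floor = floor

    def on_trigger(self) -> float:
        # Fired by a keyboard press for now, touch sensors later.
        self.saturation = max(self.floor, self.saturation - self.step)
        return self.saturation

    def reset(self) -> None:
        # Restore the scene to its default look on "reset".
        self.saturation = 1.0
```

In Unreal, the `on_trigger` step would write the new value into the post-process component connected to the camera variable.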
My role on the team was "minion". I was responsible for staying near the main computer and making sure all parts of the tech were working correctly throughout the rehearsal: the Kinects, the mocap software, nDisplay and projection, and the touch sensors. As part of the stage crew, I also took part in "trying to fix the puppet" at the climax of the exhibit, after she shuts down. To the audience, it looked like I was trying to fix the puppet through the computer, while in reality, I was counting down until it was time to "reset" the puppet (and the whole scene).
After the rehearsal ended, we gathered as a group to briefly go over our notes and the feedback. I think we will get to discuss the rehearsal further during class tomorrow. For now, these are my personal notes from today:
- Nailing down how and when the puppet is "fixed" after shutting down.
- Resetting nDisplay takes too long and causes the default Windows background to pop up. -> I will look into adding a reset button to the project.
- Collisions are not working as planned.
- Adding more visual triggers based on immediate touch. It will help with the build-up of the exhibit climax and also make the screen more interesting to the audience. -> I will look into adding small vfx particles based on the body part that is being touched.
- Keeping the time between the puppet's "shut-down" and "reset" shorter. -> I can lower the countdown and restart the scene earlier.
Overall, I would say the rehearsal went well, and we did a lot of work in a very short amount of time to put everything together! We definitely have areas we can improve before the final, and I'm hopeful that the end result will be a great exhibit! :D