Concept
Co-creator Laura Rodriguez and I set out to create a room for a virtual Spielberg Museum modeled on Carol Anne's bedroom in the movie Poltergeist. The room's design included interactive elements such as lights that change with proximity, a TV that turns on and plays a clip from the movie as you approach, objects that circle in a cyclone, objects that can be picked up and released, and a window that shatters on touch.
Methodology
Our five-week plan divided the various interactive and animated elements between us. Laura explored several iterations of techniques for animating and shattering the glass of our window in both Maya and Unity, while I sourced scripts to move the objects in a cyclone in Unity and created and textured the walls in Maya. We both sourced 3D models for the static room objects (bed, wall decals, shelves, door, exit sign, window curtains, tree, and a little bunny model) through TurboSquid and other free 3D model sources. We sourced other models the same way and modified them in Maya and Unity, including the lamp, records, horse, and television.

When you get close to the television, the lights turn off and the television plays a video clip from Poltergeist in which the objects circle in Carol Anne's bedroom; this clip gives context if the user has never seen the movie. With guidance, we created a script that triggers the lights when you get close to the TV and also activates the TV and the circling of the objects. We created hands that could grab the objects and ran many tests to determine how large the collision boxes on the objects needed to be before they could be grabbed as they passed the user. The speed of the cyclone also needed adjusting (and perhaps still does) so the user can catch the objects. To simulate a museum experience, once an object is grabbed the user hears descriptive information about the movie or disturbing information about the actors who played the characters in Poltergeist.
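For readers curious about the mechanics, here is a rough sketch of one way the cyclone movement can work in Unity. We sourced an existing script for this, so the component name, fields, and values below (CycloneMover, orbitSpeed, and so on) are illustrative assumptions rather than our actual code.

```csharp
using UnityEngine;

// Illustrative sketch only: one simple way to make objects circle the room.
public class CycloneMover : MonoBehaviour
{
    public Transform center;        // point in the middle of the room to orbit
    public float orbitSpeed = 40f;  // degrees per second; slow enough to grab
    public float bobHeight = 0.2f;  // small vertical drift for a floaty feel
    public float bobSpeed = 1.5f;

    float baseY;

    void Start()
    {
        baseY = transform.position.y;
    }

    void Update()
    {
        // Circle around the room's center on the vertical axis.
        transform.RotateAround(center.position, Vector3.up, orbitSpeed * Time.deltaTime);

        // Gentle up-and-down motion so the movement reads as a poltergeist cyclone.
        Vector3 p = transform.position;
        p.y = baseY + Mathf.Sin(Time.time * bobSpeed) * bobHeight;
        transform.position = p;
    }
}
```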
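The proximity behavior around the TV can be sketched as a trigger volume: entering it switches the lights off, starts the clip, and activates the circling objects. The class and field names, the "Player" tag, and the idea of toggling a cycloneRoot object are assumptions for illustration, not the script we actually used.

```csharp
using UnityEngine;
using UnityEngine.Video;

// Illustrative sketch only: attach to an invisible trigger collider near the TV.
public class TvProximityTrigger : MonoBehaviour
{
    public Light[] roomLights;      // room lights to switch off
    public VideoPlayer tvClip;      // VideoPlayer showing the Poltergeist clip
    public GameObject cycloneRoot;  // parent of the circling, grabbable objects

    void OnTriggerEnter(Collider other)
    {
        if (!other.CompareTag("Player")) return;  // ignore anything but the user

        foreach (Light l in roomLights)
            l.enabled = false;                    // lights off

        tvClip.Play();                            // TV turns on and plays the clip
        cycloneRoot.SetActive(true);              // objects begin to circle
    }
}
```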
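The museum-style narration on grab can likewise be sketched as a small component on each grabbable object. Exactly how the grab is detected depends on the VR interaction toolkit, so the PlayDescription() entry point here is a hypothetical hook rather than our implementation.

```csharp
using UnityEngine;

// Illustrative sketch only: each grabbable object carries a narration clip
// that plays when the grab script calls PlayDescription().
[RequireComponent(typeof(AudioSource))]
public class GrabNarration : MonoBehaviour
{
    public AudioClip description;   // narration about the film or its actors

    public void PlayDescription()
    {
        AudioSource source = GetComponent<AudioSource>();
        if (!source.isPlaying)                    // don't restart mid-narration
            source.PlayOneShot(description);
    }
}
```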
Discoveries and Looking Ahead
The biggest challenges we faced during this process were the learning curve of Maya and writing the scripts for the interactions. Some intricacies of Maya's software made it difficult to transfer objects from Maya to Unity, and it was easy to miss one little click that would be the determining factor in whether something worked. Similarly, creating the scripts required a problem-solving mindset, and although I have some understanding of coding, we needed more in-depth help to really get some of these interactions working. The most difficult part was making multiple things happen at the same time, such as proximity to the TV switching the lights off and starting the objects circling (which in turn makes them grabbable). The more if/thens we added, the more complicated the logic became and the more one behavior began to affect the next. I think this factored into the window shatter and the video clip playing on the TV. At one point the window was shattering into pieces on the floor, but as you see in this video, the window is one big piece of glass, which I think is a result of some of the code mentioned above. Similarly, the clip on the TV is distorted in this final video, even though it was working before that script was implemented.
Even with all these challenges, our previous experience with choreography, movement, and creating performative spaces made it second nature for us to imagine what things could do in virtual space. Turning off lights, triggering sound, and making a TV flicker combine to create an immersive experience in the virtual world. Just as in live performance, the audience can suspend their disbelief further when you nuance light, sound, and interaction. The magic is in the details. Having a more in-depth knowledge of how these worlds are created gives me insight into the possibilities for performance and into how live and virtual performances can share the same spaces.
Advisor: Shadrick Addy, ACCAD