Center For Immersive Media (CIM)

Work-In-Progress

CIM Blog

Alright, so: this section is adapted from my Google blog site for the Center (which can be found here, for now…)

Long story short: in my junior and senior years at UArts, I worked as a work-study student for the Center for Immersive Media (hereafter referred to as the CIM).

To keep it simple: the CIM is essentially a research center, independent of any one discipline or school of teaching, with the goal of exploring emerging and immersive technologies, i.e. Virtual Reality, Augmented Reality, Projection Mapping, Motion Capture, Virtual Production, Immersive Audio, Haptic Feedback, and so on.

The point being: how can we use these new tools, software, hardware, and technologies, and how can other fields utilize them to better explore their own areas of study and practice? What kinds of opportunities can be explored? How can less tech-oriented people use this tech to the benefit of themselves and their practice?

Well, it’s a work in progress, but hey, check below to see some of the various projects we worked on toward this goal!

UPDATED NOTE: The University of the Arts, and by extension the Center for Immersive Media, has suddenly closed its doors. The links above will break sometime in the future, but hey, at least I have a bunch of work and documentation archived here!

A picture of me, looking really cool and professional

By the by, most of these projects were created with Unity and utilize Meta's Oculus headsets, our motion capture system, and/or other various technical goodies we have lying around. These projects are either speculative templates that could serve as a starting base for future projects, or applied projects that we actively worked on with a deadline.

I would also help out around the space, whether that meant moving or setting up equipment; acting as a sort of guide for students coming in with a class or workshop to experience the space and test the equipment; serving as an instructor to help onboard other work-study students starting here; or assisting on other projects that I wasn't actively working on.

Proficiencies include: 

  • Game Engines (Unity, Unreal Engine, and Godot)

  • 3D Modeling, Texturing, Lighting, Rendering, Simulation & Fluids (mainly Blender & Maya, with some brief experience in 3ds Max, Modo, Cinema 4D, and even LightWave)

    • Specially trained in game asset / real-time asset optimization

  • The Adobe Suite (Photoshop, Illustrator, After Effects, Premiere Pro)

  • 3D Texturing (Photoshop, Blender, Substance Painter, Quixel Mixer)

  • Some basic programming experience across a couple of different languages, primarily C# for use in the Unity game engine

Projects

Qualisys Track Management (QTM)

One of my major responsibilities was managing and running our 12-camera infrared motion capture system, made by Qualisys.

Here we would show off our mocap capabilities to interested faculty, external partners, and students. We would do recordings as well as live demonstrations, since QTM can stream skeletal data in real time over our local network, which we could then feed into Unity or Unreal Engine.
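For a taste of what that streaming looks like on the Unity side, here's a minimal sketch of applying one frame of streamed joint poses to an avatar. The actual Qualisys Unity SDK handles the networking and skeleton mapping for you; the joint names, the BoneBinding struct, and the ApplyFrame entry point here are all illustrative assumptions.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch: applies one frame of streamed skeletal data to a
// rigged avatar. The real Qualisys plugin does this for you; this just
// illustrates the idea of driving transforms from streamed joint poses.
public class StreamedSkeletonApplier : MonoBehaviour
{
    // Map streamed joint names (e.g. "Hips", "LeftForeArm") to avatar bones.
    [System.Serializable]
    public struct BoneBinding
    {
        public string jointName;
        public Transform bone;
    }

    public BoneBinding[] bindings;

    // Called once per received mocap frame by whatever network client you use.
    public void ApplyFrame(Dictionary<string, Pose> jointPoses)
    {
        foreach (var binding in bindings)
        {
            if (jointPoses.TryGetValue(binding.jointName, out Pose pose))
            {
                binding.bone.SetPositionAndRotation(pose.position, pose.rotation);
            }
        }
    }
}
```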

Room-Scale Multiplayer VR (2022-2024)

Another major (and likely long-running) project is our own custom solution for standalone networked VR projects. This project serves as a template and starting point for future projects and games.

The basic setup is a 3D scan of our studio space, combined with Unity Netcode players and the Qualisys mocap plugin. With this, we can build standalone projects for the Oculus Quest and have visitors explore a virtual version of our space that aligns with the real one. From there, we have countless ideas for future projects that can expand on this base.
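As a rough illustration of the Netcode side, here's a minimal sketch of a networked VR player component. It assumes a Unity Netcode for GameObjects setup with a NetworkTransform handling synchronization; the hmdTransform wiring is a placeholder for however your camera rig is set up.

```csharp
using Unity.Netcode;
using UnityEngine;

// Minimal sketch of a networked VR player head. Assumes a NetworkTransform
// on the same object syncs the pose; here we just make sure only the owning
// client drives its own head from the local HMD.
public class NetworkedVrHead : NetworkBehaviour
{
    public Transform hmdTransform; // local headset transform (owner only)

    public override void OnNetworkSpawn()
    {
        // Remote copies of this player are driven by the network, not an HMD.
        if (!IsOwner)
        {
            hmdTransform = null;
        }
    }

    void Update()
    {
        if (IsOwner && hmdTransform != null)
        {
            transform.SetPositionAndRotation(hmdTransform.position,
                                             hmdTransform.rotation);
        }
    }
}
```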

Live Performer Avatar Recorder (2022-2024)

This project started off as a kind of offshoot of, and experiment on, the Room-Scale Multiplayer VR project.

Essentially, we gained a work-study student named Aspen, who's in the School of Dance. We dug up an old script that would take a Unity avatar and record the positions and rotations of its joints, which could then be played back later.

With this, we were able to create an experimental tool specifically designed for dancers and performers: they can move around in VR while being mocapped, record their movements, and watch the playback in VR from any angle.
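The core of that old script is simple enough to sketch from memory: sample the joint transforms each frame while recording, then push the samples onto a second "ghost" avatar on playback. A real version would also timestamp samples and interpolate; this is just the skeleton of the idea, and assumes the two joint arrays line up one-to-one.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch of the recorder: capture joint poses every frame, then replay them
// onto a matching set of "ghost" joints. Arrays are assumed to correspond.
public class JointRecorder : MonoBehaviour
{
    public Transform[] joints;         // bones of the mocap-driven avatar
    public Transform[] playbackJoints; // bones of the playback "ghost" avatar

    private readonly List<Pose[]> frames = new List<Pose[]>();
    public bool recording;
    private int playhead = -1;

    void LateUpdate()
    {
        if (recording)
        {
            var frame = new Pose[joints.Length];
            for (int i = 0; i < joints.Length; i++)
                frame[i] = new Pose(joints[i].position, joints[i].rotation);
            frames.Add(frame);
        }
        else if (playhead >= 0 && frames.Count > 0)
        {
            var frame = frames[playhead % frames.Count]; // loop the take
            for (int i = 0; i < playbackJoints.Length; i++)
                playbackJoints[i].SetPositionAndRotation(frame[i].position,
                                                         frame[i].rotation);
            playhead++;
        }
    }

    public void StartPlayback() { recording = false; playhead = 0; }
}
```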

Haptic Feedback and VR (2022)

This is our newest major project underway, undertaken with the intention of working with Access Now Inc., an ADA advocacy group that works to support disabled Americans around the country.

The primary idea is a unique, full-body experience that relies on senses other than sight. With this in mind, we have a 16-channel surround sound setup, along with a haptic feedback vest and an Oculus Quest headset.
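Since the exact calls depend on whichever vest SDK you're using, here's a hedged sketch that hides the vest behind a simple interface and maps a spatial game event to a direction and intensity on the torso. Everything here (the IHapticVest interface, the falloff math) is an illustrative assumption, not any vendor's actual API.

```csharp
using UnityEngine;

// Hedged sketch: the vest is abstracted behind an interface because the real
// calls depend on the vendor SDK. The point is mapping a world-space event
// to a direction around the torso plus an intensity.
public interface IHapticVest
{
    // angleDegrees: direction of the event around the torso, 0 = front.
    void Pulse(float angleDegrees, float intensity01, float durationSeconds);
}

public class HapticEventSource : MonoBehaviour
{
    public Transform torso;             // the player's tracked torso
    public float maxDistance = 5f;      // events farther than this are silent
    public MonoBehaviour vestBehaviour; // must implement IHapticVest

    public void TriggerAt(Vector3 worldPosition)
    {
        var vest = vestBehaviour as IHapticVest;
        if (vest == null) return;

        Vector3 toEvent = worldPosition - torso.position;
        // Intensity falls off with distance; direction comes from torso yaw.
        float intensity = Mathf.Clamp01(1f - toEvent.magnitude / maxDistance);
        float angle = Vector3.SignedAngle(torso.forward, toEvent, Vector3.up);
        vest.Pulse(angle, intensity, 0.2f);
    }
}
```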

VR Multi-user Art Making (2022)

N/A - This project was a while ago, and unfortunately I no longer have access to the original files, so all I can offer is my memory of the project.

This was, essentially, an experiment and a learning experience for me. At the CIM, we often hosted Graphic Design classes like "New Media Illustration", teaching designers about VR: how to bring their knowledge of design and user experience into a fully 3D environment, and how design might expand to fit such experiences.

That was the idea of the class; my duty, ultimately, was to try to re-create our own D.I.Y. version of these art-making apps and see if we could customize it to better fit our students and their goals.
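The heart of such an art-making app boils down to something like the following sketch: while the trigger is held, drop points from the controller tip into a LineRenderer to form a stroke. The brushTip reference and the Begin/Update/End entry points are assumptions to be wired up to your input system.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch of a DIY VR paint brush: each stroke is a LineRenderer fed with
// points sampled from the controller tip while the trigger is held.
public class VrBrush : MonoBehaviour
{
    public Transform brushTip;             // controller tip (assumption)
    public Material strokeMaterial;
    public float minPointSpacing = 0.005f; // meters between samples

    private LineRenderer currentStroke;
    private readonly List<Vector3> points = new List<Vector3>();

    public void BeginStroke()  // call on trigger press
    {
        var go = new GameObject("Stroke");
        currentStroke = go.AddComponent<LineRenderer>();
        currentStroke.material = strokeMaterial;
        currentStroke.widthMultiplier = 0.01f;
        points.Clear();
    }

    public void UpdateStroke() // call every frame while trigger is held
    {
        if (currentStroke == null) return;
        Vector3 p = brushTip.position;
        if (points.Count == 0 ||
            Vector3.Distance(points[points.Count - 1], p) > minPointSpacing)
        {
            points.Add(p);
            currentStroke.positionCount = points.Count;
            currentStroke.SetPositions(points.ToArray());
        }
    }

    public void EndStroke() { currentStroke = null; } // call on release
}
```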

Jesse Zaritt Performance Animation (2022)

This quick project saw us creating a simple animation to be projected on a scrim behind a dance performance. We worked closely with Jesse Zaritt, a dance instructor for the School of Dance here at UArts.

Jesse was inspired by the motion trails created by a performer, which we visualized for him in our QTM software, and came up with an idea based around that aspect: exploring the space and architecture created by our bodies as we move around.

The 3D aspect of the final animation I worked on used the motion trails of various performers to create spindly lines that grow and evolve over the course of the video.
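The growing-line effect itself can be sketched roughly like this: given a pre-recorded path of joint positions, reveal it gradually so the line appears to grow as the performer moves. The recordedPath array stands in for whatever trail data you export from the mocap.

```csharp
using UnityEngine;

// Sketch of the "growing trail" effect: reveal a pre-recorded path of joint
// positions over time so the line appears to grow with the performance.
public class GrowingTrail : MonoBehaviour
{
    public Vector3[] recordedPath;    // sampled joint positions over time
    public float pointsPerSecond = 60f;

    private LineRenderer line;
    private float revealed;

    void Start()
    {
        line = gameObject.AddComponent<LineRenderer>();
        line.widthMultiplier = 0.005f;
        line.positionCount = 0;
    }

    void Update()
    {
        revealed = Mathf.Min(revealed + pointsPerSecond * Time.deltaTime,
                             recordedPath.Length);
        int count = (int)revealed;
        int previous = line.positionCount;
        line.positionCount = count;
        // Only the newly revealed points need to be written each frame.
        for (int i = previous; i < count; i++)
            line.SetPosition(i, recordedPath[i]);
    }
}
```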

Movement SDK (2023-2024)

The Movement SDK project is primarily a test project using Oculus's Movement SDK template, combined with the new hardware and software in the recently released Meta Quest Pro, which allows full facial tracking (including the eyes) and limited body IK tracking.

The idea is that if we combine this with our work on the Avatar Recorder project and similar projects, we will be able to fully track and digitize someone in a virtual reality space: fully accurate body and positional tracking, plus facial expression tracking and even a limited form of lip sync.
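As a small taste of what that looks like in code, here's a hedged sketch that reads one face-tracking weight from the SDK's OVRFaceExpressions component and drives a blendshape on an avatar. The specific enum and property names can shift between SDK versions, so treat the identifiers here as assumptions to check against the current docs.

```csharp
using UnityEngine;

// Hedged sketch: reads the jaw-drop weight from Meta's OVRFaceExpressions
// component and drives a blendshape on our own avatar. Enum names are
// assumptions; verify against your installed SDK version.
public class JawSyncExample : MonoBehaviour
{
    public OVRFaceExpressions faceExpressions;
    public SkinnedMeshRenderer avatarFace;
    public int jawOpenBlendshapeIndex;

    void Update()
    {
        if (faceExpressions != null &&
            faceExpressions.TryGetFaceExpressionWeight(
                OVRFaceExpressions.FaceExpression.JawDrop, out float weight))
        {
            // Blendshape weights in Unity run 0-100, tracking weights 0-1.
            avatarFace.SetBlendShapeWeight(jawOpenBlendshapeIndex, weight * 100f);
        }
    }
}
```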

NLM Echo Chamber (2023)

The "Echo Chamber" is a collaborative project/installation we created with the help of Philadelphia's National Liberty Museum, for an upcoming exhibition they were hosting called "Data Nation: Democracy in the Age of A.I.".

The point was to show off "interactive installations, topical interactives, and provocative artwork that aims to make visitors ponder how rapid advancements in technology impact democratic norms".

At the CIM, our purpose was the creation of this "Echo Chamber" concept: a geodesic dome presenting a volume for visitors to enter.

At the center of the chamber is a touchscreen kiosk that allows visitors to type in any message, phrase, or "belief", which is then played back in multitudes through speakers placed around the interior of the volume.
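The playback logic behind that "multitudes" effect can be sketched like so: each submitted phrase (already synthesized to an AudioClip by whatever text-to-speech pipeline you use; that part is assumed here) gets echoed from several randomly chosen speakers with staggered delays, so the voices overlap.

```csharp
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

// Sketch of the chamber playback: each phrase is echoed from several random
// speakers around the dome with staggered delays so the voices overlap.
public class EchoChamberPlayback : MonoBehaviour
{
    public AudioSource[] speakers;  // one per physical speaker position
    public int echoesPerPhrase = 4;
    public Vector2 delayRange = new Vector2(0.2f, 1.5f);

    private readonly Queue<AudioClip> pending = new Queue<AudioClip>();

    // Called by the kiosk once a typed phrase has been turned into audio.
    public void Submit(AudioClip phrase) => pending.Enqueue(phrase);

    void Update()
    {
        while (pending.Count > 0)
            StartCoroutine(PlayEchoes(pending.Dequeue()));
    }

    private IEnumerator PlayEchoes(AudioClip clip)
    {
        for (int i = 0; i < echoesPerPhrase; i++)
        {
            yield return new WaitForSeconds(
                Random.Range(delayRange.x, delayRange.y));
            var speaker = speakers[Random.Range(0, speakers.Length)];
            speaker.PlayOneShot(clip);
        }
    }
}
```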

LED Wall (2023-2024)

We recently acquired a reconfigurable LED Wall. Besides essentially being a very large screen with better detail and color than a projector, it opens up a variety of neat project ideas that wouldn't have been possible before, the biggest example being Virtual Production, similar to the direction Hollywood is moving its TV and film production workflows. Additionally, since its interlocking panels can be rearranged, we can also create a wide variety of esoteric and abstract setups far more unique than a typical large widescreen TV.

Further tech specifications can be found inside the project documentation page below.

Projection Mapping Workshop (2023-2024)

We had some upcoming classes and workshops to teach the basics of Projection Mapping, something Alan has a lot of previous experience in. In order to get more content, show off a greater variety of work, and showcase the relative ease of getting started, he taught me the basic workflows he was used to and let me experiment. We then brought on the other work-study students, taught them, and encouraged them to experiment and create their own little pieces.

We started with a simple setup using a small Optoma projector, along with some basic white geometric boxes for shape, and white projection cloth.

I first started in After Effects to get the basics down, then moved to Blender to create more advanced scenes with 3D lighting (which were ultimately composited in After Effects anyway), before finally building out a new solution in Unity for the greatest ease of use and flexibility: a full 3D workflow, direct output to the projector, real-time adjustment of how the images are mapped, and interactive input.
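The projector-output piece of that Unity solution is the simplest part to show; here's a minimal sketch that activates the projector as a second display and points a dedicated camera at it. Note that Unity only activates extra displays in built players, not in the editor.

```csharp
using UnityEngine;

// Sketch of the Unity-to-projector output: activate the projector as a
// second display and route a dedicated camera to it, leaving the main
// screen free for editing and control.
public class ProjectorOutput : MonoBehaviour
{
    public Camera projectorCamera;

    void Start()
    {
        // Display.displays[0] is the main screen; extra displays must be
        // activated explicitly (works in built players, not in the editor).
        if (Display.displays.Length > 1)
        {
            Display.displays[1].Activate();
            projectorCamera.targetDisplay = 1;
        }
        else
        {
            Debug.LogWarning("Projector not detected as a second display.");
        }
    }
}
```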

Virtual Production Test (2023-2024)

With the arrival of our newest piece of tech, a 60-panel interlocking and reconfigurable LED Wall, and with some smaller past experiments using Unity and projectors under our belt, we decided to learn Virtual Production and create a template for it.

Inspired by the many new companies and the general Hollywood shift toward Virtual Production workflows and LED/motion-tracked volumes, we figured we had all the right technology and tools to accomplish our own version, and that we could help inspire and teach other faculty and students in the School of Film, which seems prudent given the direction film and TV are heading.

We experienced quite a few hiccups and roadblocks, but we now have a decent handle on the entire workflow, and we created a small example short as well as a basic template project for more ambitious projects in the future.
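One piece worth sketching is the camera tracking: the virtual camera follows a mocap-tracked rigid body attached to the physical camera, with a calibration offset since the tracking markers never sit exactly at the lens. The trackedRigidBody transform is assumed to be driven by the mocap plugin.

```csharp
using UnityEngine;

// Sketch of the camera-tracking piece of the virtual production setup: the
// in-engine camera follows a mocap-tracked rigid body mounted on the real
// camera, offset to account for where the lens sits relative to the markers.
public class TrackedVirtualCamera : MonoBehaviour
{
    public Transform trackedRigidBody; // pose streamed from the mocap system
    public Vector3 positionOffset;     // marker cluster -> lens offset
    public Vector3 rotationOffsetEuler;

    void LateUpdate()
    {
        Quaternion rotation = trackedRigidBody.rotation *
                              Quaternion.Euler(rotationOffsetEuler);
        Vector3 position = trackedRigidBody.position +
                           trackedRigidBody.rotation * positionOffset;
        transform.SetPositionAndRotation(position, rotation);
    }
}
```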