Project Synopsis
"Operation Luna" aimed to transport a select group of individuals from the community into the futuristic realm of 2073, providing them with an immersive experience of being civilian astronauts.
The journey encompassed a liftoff from Earth, a spaceflight to the Moon, an encounter with the transformative Overview Effect, and a glimpse of a semi-constructed lunar site.
Beyond the astronaut simulation, the project aimed to provoke contemplation on the future of human civilization, fostering a deepened connection and appreciation for Earth.
Key Features
The project unfolded in three distinct stages: Launch, Space Flight, and Landing.
Participants gathered in a Briefing room, surrounded by posters setting the tone for the futuristic world. Boarding passes were handed out, and the newly minted "civilian astronauts" were escorted to the Enhanced Immersion Studio.
The experience began with announcements in multiple languages from countries with robust space programs. Once the participants were comfortably settled into the space, the immersive experience was projected onto the screens surrounding them. Two crew members were also in the room, guiding the participants and narrating the dialog woven into the experience. Afterwards, participants pondered the future of human civilization before being guided to a Debriefing room.
In the Debriefing room, participants scanned a QR code leading to a survey, capturing their thoughts and feelings about the experience.
Technological Landscape
All environments for each stage were meticulously modeled in Unreal Engine 5.3. The spaceship animations in each environment were rendered into 8K equirectangular video sequences via a custom rendering workflow within Unreal Engine, then converted into the HAP format using Adobe Media Encoder.
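The project used Adobe Media Encoder for the HAP conversion; for readers without that toolchain, the same step can be sketched with ffmpeg's built-in HAP encoder. This is an illustrative alternative, not the workflow used in the project; it assumes an ffmpeg build that includes the HAP encoder, and the file names and chunk count are hypothetical:

```python
import subprocess
from pathlib import Path

def encode_hap(src: Path, dst: Path) -> None:
    """Convert a rendered video to HAP (Q variant) in a .mov container.
    HAP keeps decoding cheap on the GPU, which is why media servers such
    as Disguise favor it for high-resolution playback."""
    subprocess.run(
        [
            "ffmpeg",
            "-y",                 # overwrite the output if it exists
            "-i", str(src),       # e.g. an 8K equirectangular render
            "-c:v", "hap",        # ffmpeg's HAP encoder
            "-format", "hap_q",   # higher-quality HAP variant
            "-chunks", "8",       # split frames for multi-threaded decode
            str(dst),
        ],
        check=True,
    )

# Hypothetical file names, for illustration only.
encode_hap(Path("launch_site_8k.mp4"), Path("launch_site_8k_hapq.mov"))
```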
Disguise was employed for seamless playback and transitions of both audio and visual content on a 300-degree drum configuration of screens.
The Screens
The Disguise Server, the magic black box that makes things possible
Screenshots from the Disguise show file
Roles and Responsibilities
As project lead, my responsibilities were diverse:
- Designed and modeled the Launch Site, Flight in Space, and Lunar Landing in Unreal Engine.
- Created custom materials for the Moon and Earth, as well as each of the animation sequences.
- Devised a custom workflow for rendering glitch-free 8K equirectangular animation sequences (the projection these renders use is sketched after this list).
- Programmed the Disguise file along with a well-defined Cue list for flawless execution during the showcase.
- Acted as the briefing agent during the experience and managed the run of show.
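The equirectangular format mentioned above encodes every direction around the camera as a unique pixel in a 2:1 frame, which is what lets footage be wrapped around a 300-degree drum of screens. A minimal sketch of the projection math, where the axis convention and the 7680x3840 resolution are illustrative assumptions:

```python
import math

def direction_to_equirect(dx, dy, dz, width=7680, height=3840):
    """Map a unit direction vector (x right, y up, z forward) to pixel
    coordinates in an equirectangular frame. 7680x3840 is the 2:1 '8K'
    resolution assumed here."""
    lon = math.atan2(dx, dz)                   # longitude: -pi..pi around the up axis
    lat = math.asin(max(-1.0, min(1.0, dy)))   # latitude: -pi/2..pi/2
    u = (lon / (2 * math.pi) + 0.5) * width
    v = (0.5 - lat / math.pi) * height
    return u, v

# Looking straight ahead lands at the center of the frame.
print(direction_to_equirect(0.0, 0.0, 1.0))  # -> (3840.0, 1920.0)
```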
Team Collaboration:
I collaborated with KunGin, Gbemileke, and Veronica to execute the project seamlessly:
- Gbemileke Anthony refined the narrative.
- Tsai KunGin designed the Spaceship, created graphic elements, and produced physical items for the audience.
- Veronica Mangu handled stage lighting and assisted with audience management.
- Gbemileke and KunGin were also present in the space with the participants, narrating the dialog that accompanied the show and standing ready to assist should anything arise.
Challenges and Solutions
Creating 8K Equirectangular videos presented computational challenges, including system strain and crashes.
At first, I created the animation sequences by attaching the virtual camera to the spaceship, placing the spaceship in each environment, and defining a path for the spaceship to follow through the environment. The intention was for the attached camera to render what it saw as the spaceship moved. This did not work: although attached, the camera did not rotate along with the spaceship.
Next, I attached the spaceship to the camera and animated the camera's position, but this failed as well: the camera moved and rotated independently of the spaceship, so the spaceship's interior never appeared in the render.
I then reverted to attaching the camera to the spaceship, this time animating the camera instead of the spaceship. The camera now moved along with the spaceship, but rotated in the opposite direction whenever a yaw or roll rotation was involved.
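One plausible mechanism for this behavior, sketched below, is transform composition: an attached camera's world rotation is the ship's rotation composed with the camera's local rotation, so camera keys authored in world space but interpreted in the ship's local frame stack on top of the ship's motion instead of replacing it, and the interior then appears to rotate the wrong way relative to the camera. The single-axis yaw and the angles are illustrative only:

```python
import numpy as np

def yaw(deg: float) -> np.ndarray:
    """Rotation about the vertical (up) axis, as a 3x3 matrix."""
    r = np.radians(deg)
    c, s = np.cos(r), np.sin(r)
    return np.array([[c, 0.0, s],
                     [0.0, 1.0, 0.0],
                     [-s, 0.0, c]])

ship_world = yaw(30.0)     # the ship yaws 30 degrees
camera_local = yaw(30.0)   # camera keys meant as a world-space 30-degree yaw

# An attached child's world rotation composes with its parent's, so the
# camera lands at 60 degrees: 30 degrees off from the ship, which on screen
# reads as the interior counter-rotating.
camera_world = ship_world @ camera_local
print(np.allclose(camera_world, yaw(60.0)))  # True
```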
Finally, after discussing with my group mates, we removed the spaceship from the environments altogether and animated the camera alone, flying it through each environment along the spaceship's intended path and rendering those videos without the ship. The spaceship's interior was then masked on top of these renders to create the sense of flying inside the ship.
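The masking step is effectively standard "over" compositing: the interior plate, with an alpha matte, is laid over the empty-environment render frame by frame. A minimal numpy sketch under that assumption, with tiny synthetic arrays standing in for the real 8K frames:

```python
import numpy as np

def composite_interior(background: np.ndarray,
                       interior: np.ndarray,
                       alpha: np.ndarray) -> np.ndarray:
    """Standard 'over' compositing: lay the spaceship interior (with its
    alpha matte) over the empty-environment render. Arrays are float
    HxWx3 (alpha HxWx1) in [0, 1]."""
    return alpha * interior + (1.0 - alpha) * background

h, w = 4, 8
background = np.full((h, w, 3), 0.2)   # environment fly-through render
interior = np.full((h, w, 3), 0.8)     # spaceship interior plate
alpha = np.zeros((h, w, 1))
alpha[2:, :] = 1.0                     # interior covers the lower half

frame = composite_interior(background, interior, alpha)
print(frame.shape)  # (4, 8, 3)
```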
Testing these approaches and rendering the results took several hours per video, and the system was left rendering overnight on multiple nights.
Despite these hurdles, the videos were seamlessly stitched together, ensuring a glitch-free showcase.
Project Outcomes
"Operation Luna" left the community audience astonished and delighted. A first-of-its-kind experience in Mesa, it not only provided enjoyment but also sparked curiosity and contemplation about the future of human civilization. Participants left with lingering questions, concerns, and a newfound sense of curiosity.
Final Environments - Launch Site, Flight in Space, and Landing Site
Equirectangular 8K Renders