Students leverage Unreal Engine and VR to rapidly develop complex planetarium show

Hyunwoo Lee | January 20, 2021
Currently majoring in Digital Art at the Seoul Institute of the Arts, Hyunwoo Lee has been actively creating interactive exhibitions and show content, linking Unreal Engine to various sensors. Hyunwoo generally aims to contribute to both the artistic and technical aspects of the projects he develops and designs. He formed a team of five with fellow students Kyuri Kim, Yesong Lee, Joo-yeol Kim, and Jiwon Kim for the ONENESS project, and led the team as director and technical artist.
Hello, we are a team of Seoul Institute of the Arts students from Korea, consisting of four Digital Art majors and one Practical Music major. We would like to walk you through how we created our project ONENESS, which we showcased at the Gwacheon National Science Museum’s planetarium as well as in the Student Showcase Fall 2020 reel.

Unreal Engine education offered by Seoul Institute of the Arts

Seoul Institute of the Arts offers excellent Unreal Engine education, which is the main reason we were able to finish such a big project as students. Recognizing Unreal Engine’s potential for creating media art, the school has been teaching Unreal Engine to students majoring in Digital Art since 2011. Students learn the fundamentals of the engine and work on archviz, cinematics, and game development assignments to explore its use for different purposes.

Through this process, many students gain experience they can apply to professional fields such as VR, AR, interactive shows, and exhibitions. Our team has benefited greatly from this education, and for the past two years we have been using Unreal Engine to create various projects, including cinematics, interactive art, and media performances.

Looking back from the start of ONENESS to its final showing

The ONENESS project started off as a VR video experience for viewing a believable depiction of space. During development, we heard about a film festival hosted by the Gwacheon National Science Museum, so we adapted our project to fit the event.

This presented a serious scheduling problem, however. Creating content for large-scale planetarium domes typically requires a great deal of time and money, and we were faced with completing the 15-minute film in just three months. Still, we saw this as an incredible opportunity to show our work in a planetarium with a 25-meter-diameter dome, and most importantly, we trusted that Unreal Engine’s powerful real-time rendering would make it possible.
The exterior and interior of Gwacheon National Science Museum’s planetarium
A scene from the planetarium exhibition

Design

The theme of the ONENESS project is humanity and the universe. We wanted to deliver the message that the history of the universe is another form of humanity’s memory. So the long history from the Big Bang to the dawn of human civilization is condensed into 15 minutes, delivered as one long take, as if viewing a single walk through life. We also wanted to visualize scenes that transcend time and space while grounding the overall development and key scenes in science. To this end, our team studied many theories of space science to plan out the scenes and consulted astrophysics professors and researchers. Without Unreal Engine’s Blueprint system and Niagara, it would have been difficult to make this design a reality.
This scene shows the creation of the last universal common ancestor (LUCA)

Niagara particles

The majority of scenes in ONENESS are set in cosmic space, which made the visualization of abstract molecules, rather than actual objects, key to the film’s aesthetics. For this reason, particles played an important part in our project. Naturally, we decided to actively utilize Niagara, Unreal Engine’s new particle system.

Niagara’s modular stack approach is highly intuitive, allowing artists using the system for the first time to easily create the particles they want. However, once the particles created in Niagara were used in Sequencer, we discovered the limitations of general effects when it came to producing fine details.

Niagara particles had to match Sequencer’s keyframe animation as well as the timing of various other elements, and with our initial Niagara setup it was difficult to get this complex timing right. Since all of the visualization was completed within Sequencer, we needed a real-time way to preview and control Niagara inside Sequencer.

Fortunately, Niagara’s user-defined parameters offered exactly the features we were looking for. In Sequencer, we could call up the user-defined parameters from Niagara and control them with keyframes.
Our Niagara and Sequencer workspace
All the modular variables that had to be dynamically controlled according to the film’s direction, such as opacity, size, and speed, were replaced with user-defined parameters and controlled from Sequencer. Like other actors, Niagara particles could be viewed in real time in Sequencer, which greatly improved the cinematic production process by letting us immediately see the final scene. This made a more complex production possible and, in turn, let us quickly try out different visuals.
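For readers curious what this looks like outside of Sequencer, here is a minimal C++ sketch of the same idea: driving a user-exposed Niagara parameter by name at runtime. The parameter names are placeholders; in our project, these values were keyframed on Sequencer tracks rather than set in code.

```cpp
// Minimal sketch (not project code): setting user-exposed Niagara
// parameters by name from C++. Sequencer drives the same parameters
// through keyframed tracks. "Opacity" and "Size" are placeholder names.
#include "NiagaraComponent.h"

void UpdateParticleLook(UNiagaraComponent* Particles, float Opacity, float Size)
{
    if (!Particles)
    {
        return;
    }

    // User parameters created in the Niagara editor are addressed by FName.
    Particles->SetVariableFloat(TEXT("Opacity"), Opacity);
    Particles->SetVariableFloat(TEXT("Size"), Size);
}
```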
Using Niagara for a scene transition
Having flexible control over Niagara in Sequencer was especially crucial for a one-take film. The ability to view and control various Niagara systems at the same time helped us smoothly connect the main scene transitions without editing cuts or using tricks. 

Creating a simulation using Blueprints

In ONENESS, most of the main scenes recreate pivotal events and physical phenomena from the history of the cosmos. Like most cinematic productions, ONENESS used Sequencer’s keyframe animation to create these scenes. Some scenes, however, would have been impossible to recreate with keyframe animation alone, such as the one portraying the period from a few microseconds (10^-6 seconds) after the Big Bang until 380,000 years later. In this scene, protons and neutrons combine, while being pushed apart by the electromagnetic force, to form four types of atomic nuclei. We had to visualize the atom-formation process in which neutrons attach to the protons needed for an atomic nucleus.
This is the structure of protons and neutrons forming a nucleus
This complex interaction had to occur across the entire cosmic environment, and applying the logic to hundreds of objects made it impossible to keyframe everything by hand. Our team needed something besides Sequencer to deliver this scene efficiently.

Blueprints were our answer. Our idea was to define the process by which protons and neutrons form helium and use Blueprints to make the particles move and react automatically, so that the simulation alone could complete the scene without Sequencer’s keyframe animation. We started by defining the proton and neutron classes.
Neutron Blueprint and definition map
The map above shows how we defined the proton and neutron classes. All particles float around in space until an overlap reaction occurs, and they behave according to their status. For example, when a proton meets a heavy hydrogen (deuterium) nucleus, they form a helium-3 nucleus; when a helium-3 nucleus meets another helium-3 nucleus, they form a helium-4 nucleus and release two protons in the process. The Blueprint Timeline feature was used instead of Sequencer to animate particle movement and materials at the moment of each reaction.
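Although we built this logic entirely in Blueprints, a rough C++ equivalent may make the structure clearer. The sketch below is hypothetical (the class, enum, and function names are ours for illustration) and abridges the reaction table to the two helium reactions described above.

```cpp
// Hypothetical C++ condensation of the Blueprint logic described above.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "NucleonParticle.generated.h"

UENUM()
enum class ENucleusState : uint8
{
    Proton,
    Neutron,
    Deuterium,   // proton + neutron (heavy hydrogen)
    Helium3,     // proton + deuterium
    Helium4      // helium-3 + helium-3 (releases two protons)
};

UCLASS()
class ANucleonParticle : public AActor
{
    GENERATED_BODY()

public:
    UPROPERTY(EditAnywhere)
    ENucleusState State = ENucleusState::Proton;

    virtual void NotifyActorBeginOverlap(AActor* OtherActor) override
    {
        Super::NotifyActorBeginOverlap(OtherActor);

        ANucleonParticle* Other = Cast<ANucleonParticle>(OtherActor);
        if (!Other)
        {
            return;
        }

        // Abridged reaction table from the definition map; a real
        // implementation would also guard against the symmetric overlap
        // event firing on both actors at once.
        if (State == ENucleusState::Proton &&
            Other->State == ENucleusState::Deuterium)
        {
            // Proton + deuterium -> helium-3: this actor becomes the
            // heavier nucleus and the partner is consumed.
            State = ENucleusState::Helium3;
            Other->Destroy();
        }
        else if (State == ENucleusState::Helium3 &&
                 Other->State == ENucleusState::Helium3)
        {
            // Helium-3 + helium-3 -> helium-4 plus two free protons:
            // the partner becomes one proton, and we spawn the other.
            State = ENucleusState::Helium4;
            Other->State = ENucleusState::Proton;
            SpawnFreeProton();
        }
        // A Blueprint Timeline drove the attach/repel animation at the
        // moment of each reaction; omitted here for brevity.
    }

private:
    void SpawnFreeProton()
    {
        // Spawn the second released proton near the reaction site.
        GetWorld()->SpawnActor<ANucleonParticle>(
            GetClass(), GetActorLocation(), FRotator::ZeroRotator);
    }
};
```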

Having created the interactions and animations of the particles in Blueprints, we were able to complete the following scenes without complicated Sequencer animation. We simply added a few event keyframes in Sequencer to trigger the relevant class events when desired (see the sketch after the clips below).
Proton formation simulation
Primordial solar system’s planet formation simulation
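As a hypothetical illustration of those event keyframes, a Sequencer event track can call a BlueprintCallable function on a manager actor to kick off a simulation phase at the right moment in the one-take timeline. The class and function names below are placeholders, not our actual Blueprint names.

```cpp
// Hypothetical manager actor: a Sequencer event keyframe can call
// BeginSimulationPhase() to release the particles at the desired frame.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "SimulationDirector.generated.h"

UCLASS()
class ASimulationDirector : public AActor
{
    GENERATED_BODY()

public:
    // Simulation particles placed in the level, assigned in the editor.
    UPROPERTY(EditAnywhere)
    TArray<AActor*> Particles;

    // Bound to an event keyframe on a Sequencer event track.
    UFUNCTION(BlueprintCallable)
    void BeginSimulationPhase()
    {
        for (AActor* Particle : Particles)
        {
            if (Particle)
            {
                // Let the particle tick, move, and react from this frame on.
                Particle->SetActorTickEnabled(true);
                Particle->SetActorEnableCollision(true);
            }
        }
    }
};
```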
These scientific phenomena would have been extremely difficult to recreate in Sequencer. Thanks to Blueprints handling the simulation automatically, complex scenes were delivered efficiently, allowing us to spend more time on other scenes.

Pre-visualization in a VR planetarium

The biggest challenge of this project was producing video content for a semi-spherical planetarium dome. When we first viewed our work in progress in the planetarium, it looked completely different from what we had imagined, even though production had been tailored to the dome environment. This made us realize how difficult the project would truly be.

The first issue was that the screen’s roughly 25-meter diameter exceeded our expectations in structure and scale; we needed a new composition and layout adjusted for the massive planetarium screen. The second issue was distortion from the semi-spherical screen: when a flat image is projected onto a hemispherical screen, its edges are significantly distorted. These problems made the content unwatchable and called for frequent troubleshooting, going back and forth between our workspace and the planetarium.
Comparison of the rendered image and the actual result
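To make the distortion concrete: dome masters are typically authored in an equidistant fisheye (“domemaster”) projection, where a point’s distance from the image center is proportional to its angle from the dome’s zenith. A flat perspective render does not satisfy this mapping, which is why its edges stretch on the dome. The function below is illustrative math only, not code from our project.

```cpp
// Illustrative only: the equidistant fisheye ("domemaster") mapping.
// Dir is a unit view direction with +Z pointing at the dome zenith;
// the result is a UV in [0,1]^2 for a 180-degree dome master.
#include "Math/UnrealMathUtility.h"
#include "Math/Vector.h"
#include "Math/Vector2D.h"

FVector2D DirectionToDomeUV(const FVector& Dir)
{
    const float Theta = FMath::Acos(Dir.Z);          // angle from the zenith
    const float Phi   = FMath::Atan2(Dir.Y, Dir.X);  // azimuth around the axis
    const float R     = Theta / (PI / 2.0f);         // 0 at zenith, 1 at the rim

    // Radius on the image plane is proportional to the zenith angle,
    // which a flat perspective projection does not reproduce.
    return FVector2D(0.5f + 0.5f * R * FMath::Cos(Phi),
                     0.5f + 0.5f * R * FMath::Sin(Phi));
}
```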
The real issue, however, was that this troubleshooting process was inefficient. We couldn’t foresee the problems that would appear when our video was shown in the planetarium, and the only way to test was to visit the planetarium itself, which was available for test screenings only twice a month. It didn’t help that a proper test took about two days to prepare.

With only two months left until the film festival, we quickly realized that four testing sessions wouldn’t be enough to solve these problems. We needed a new approach to streamline the troubleshooting process, and after brainstorming ways to test the film without visiting the planetarium, we realized that VR was our solution: we would build a virtual planetarium and test in VR.
Our virtual planetarium
The goal of the VR planetarium was to simulate the actual venue and viewing experience. We recreated the real environment virtually, including the 25-meter hemispherical dome, its 10-degree tilt, and the seating layout at the proper height, then applied materials to the dome mesh to map our image and video sources onto it. With the planetarium faithfully recreated in Unreal Engine, we could experience it in virtual reality.
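As a rough sketch of that setup, assuming a 1-meter-radius hemisphere source mesh and a dome material with a texture parameter (both placeholder names), the dome actor could look like this in C++. In practice we assembled ours in the editor.

```cpp
// Hypothetical sketch of the virtual dome: a hemisphere mesh scaled to
// the venue's 25 m diameter, tilted 10 degrees, with a dynamic material
// mapping the show's video or image source. "ShowTexture" is a
// placeholder parameter name.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/StaticMeshComponent.h"
#include "Materials/MaterialInstanceDynamic.h"
#include "Engine/Texture.h"
#include "DomeTheater.generated.h"

UCLASS()
class ADomeTheater : public AActor
{
    GENERATED_BODY()

public:
    UPROPERTY(VisibleAnywhere)
    UStaticMeshComponent* DomeMesh;

    // The rendered show content, assigned in the editor.
    UPROPERTY(EditAnywhere)
    UTexture* ShowTexture = nullptr;

    ADomeTheater()
    {
        DomeMesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("DomeMesh"));
        RootComponent = DomeMesh;

        // Match the real venue: 25 m diameter, tilted 10 degrees.
        // (Unreal units are centimeters; assumes a 1 m radius source mesh.)
        DomeMesh->SetWorldScale3D(FVector(12.5f));
        DomeMesh->SetRelativeRotation(FRotator(10.0f, 0.0f, 0.0f));
    }

    virtual void BeginPlay() override
    {
        Super::BeginPlay();

        // Feed the show's image or video source into the dome material.
        UMaterialInstanceDynamic* DomeMaterial =
            DomeMesh->CreateAndSetMaterialInstanceDynamic(0);
        if (DomeMaterial && ShowTexture)
        {
            DomeMaterial->SetTextureParameterValue(TEXT("ShowTexture"), ShowTexture);
        }
    }
};
```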
Using VR to represent the virtual planetarium
The result was a success: the experience in the VR planetarium was almost identical to that of the actual planetarium. We no longer had to spend days preparing for a screening, and we got immediate feedback in VR whenever we needed to test. Our planetarium production process now had the immediate feedback that a real-time engine offers, and our new workflow of adjusting placement and composition to troubleshoot the distortion was faster than ever.

Using this method, our team was able to keep refining the planetarium content right up to the film festival submission date, and we successfully wrapped up production of ONENESS under a strict deadline.

Conclusion

This sums up our three-month production of ONENESS. Needing to achieve high-quality results on a tight schedule, we could not have completed the project without Unreal Engine’s real-time rendering, powerful cinematic features, and intuitive visual scripting. Unreal Engine is an extremely powerful tool that makes our imagination a reality, and our team will continue our journey with Unreal to create diverse content.
