One Moment, One World is a video installation I made in collaboration with J.H. Moon. The piece was first exhibited at the 2016 NASA Space Apps Challenge conference.

2016
Cinder, Final Cut Pro, Photoshop, Blender

In this work we explored a new way of rendering the Sun. As a star full of energy, the Sun differs from the planets in having its characteristic solar prominences, which artistic representations often emphasize. That makes it nearly impossible to model and render convincingly as something as simple as a solid sphere with a finite radius. Furthermore, the Sun usually looks more convincing with a really bright, huge lens flare, one far too large for any full-screen post-processing effect to produce. Our technique uses multiple billboard layers, which can easily be replaced with procedural fragment shaders on a quad to avoid pixel distortion. J.H. Moon edited a seamless video in Final Cut Pro from existing assets to serve as the central matter of the Sun, and the video billboard is then composited with a mask image and a lens flare texture.
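
The layering roughly looks like the sketch below. This is a simplified version rather than the production code: the texture and radius parameters are placeholders, and the mask is assumed here to be baked into the video layer's alpha.

```cpp
#include "cinder/gl/gl.h"
#include "cinder/Camera.h"

// Simplified sketch of the layered Sun billboard. Texture and radius parameters
// are placeholders; the mask is assumed to be baked into the video layer's alpha.
void drawSunBillboards( const ci::CameraPersp &cam, const ci::vec3 &sunPos, float sunRadius,
                        const ci::gl::Texture2dRef &videoTex, const ci::gl::Texture2dRef &glowTex )
{
    using namespace ci;

    vec3 right, up;
    cam.getBillboardVectors( &right, &up );        // keep the quads facing the camera

    gl::ScopedGlslProg shader( gl::getStockShader( gl::ShaderDef().texture() ) );
    gl::ScopedDepth    depth( false );             // transparent layers: no depth test/write

    // Layer 1: the seamless Sun video, alpha-blended.
    {
        gl::ScopedBlendAlpha  blend;
        gl::ScopedTextureBind video( videoTex, 0 );
        gl::drawBillboard( sunPos, vec2( sunRadius * 2.0f ), 0.0f, right, up );
    }

    // Layer 2: an additive glow / prominence layer drawn on top of the video.
    {
        gl::ScopedBlendAdditive add;
        gl::ScopedTextureBind   glow( glowTex, 0 );
        gl::drawBillboard( sunPos, vec2( sunRadius * 2.5f ), 0.0f, right, up );
    }
}
```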

Sun video billboard without additive blending

Since both the Sun and the planets have transparent billboards, we had to disable depth testing when rendering them. To ensure correct blending we rendered them with the painter's algorithm: the Sun's video billboard is drawn in the same pass as all the planets, after the planets that are farther from the camera and before those that are nearer. Debugging depth and blending was certainly a major pain throughout the development of this project.
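
In Cinder terms, the pass boils down to something like the following sketch (the Drawable struct and function names are illustrative, not the project's actual classes):

```cpp
#include "cinder/gl/gl.h"
#include "cinder/Camera.h"
#include <algorithm>
#include <functional>
#include <vector>

// Illustrative painter's-algorithm pass: everything sharing the transparent pass
// is sorted back to front from the camera and drawn with depth testing off.
struct Drawable {
    ci::vec3              position;
    std::function<void()> draw;
};

void drawTransparentPass( const ci::CameraPersp &cam, std::vector<Drawable> &drawables )
{
    const ci::vec3 eye = cam.getEyePoint();

    // Farthest first, nearest last, so nearer layers blend over farther ones.
    std::sort( drawables.begin(), drawables.end(),
        [&eye]( const Drawable &a, const Drawable &b ) {
            return glm::length( a.position - eye ) > glm::length( b.position - eye );
        } );

    ci::gl::ScopedDepth depth( false );   // no depth test/write for the transparent layers
    for( const auto &d : drawables )
        d.draw();
}
```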

Depth issue

For the lens flare, J.H. Moon made a really huge image and we rendered it as a big billboard into a separate FBO, which is later blended additively, on top of everything, into the final frame. However, the center of the glow was brightening the Sun video and killing all the detail in the video texture. Our fix was to draw a blurred black circle, roughly the same size as the Sun, on top of the flare texture; the FBO then contributes nothing at its center, while the flare billboard itself can still be scaled freely.
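
The compositing step is essentially the sketch below, assuming the flare billboard and the black masking circle have already been rendered into the FBO:

```cpp
#include "cinder/app/App.h"
#include "cinder/gl/gl.h"

// Sketch of the flare compositing step; flareFbo is assumed to already contain
// the flare billboard with the blurred black mask drawn over its center.
void compositeFlare( const ci::gl::FboRef &flareFbo )
{
    using namespace ci;

    gl::ScopedBlendAdditive add;                   // add the flare on top of everything
    gl::ScopedDepth         depth( false );
    gl::setMatricesWindow( app::getWindowSize() );
    gl::draw( flareFbo->getColorTexture(), Rectf( app::getWindowBounds() ) );
}
```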

Sun flare with black mask
Sun flare affecting a planet

The next challenge was to render our stylized planets. We wanted the planets to have two different appearances: a regular planet texture when viewed from far away, and a semi-transparent 360-photo texture when viewed up close or from the inside. I used the Blinn-Phong shading model for the first style, and created a semi-transparent, crisp, glowy-looking shader, which leans heavily on the Fresnel term, for the second; then I simply mix the two according to the camera's viewing distance. J.H. Moon created a series of color ramps in Photoshop and we used them to remap the colors of the 360 photos.
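
The distance-based mix is conceptually as simple as the sketch below; the thresholds are made-up placeholders and the actual shaders do a lot more on top of this.

```cpp
#include "cinder/gl/gl.h"

// Illustrative only: how the two planet styles could be mixed by camera distance.
float planetStyleMix( const ci::vec3 &camPos, const ci::vec3 &planetPos,
                      float nearDist, float farDist )
{
    // 0 = close-up, semi-transparent 360-photo style; 1 = distant, Blinn-Phong style
    float d = glm::length( camPos - planetPos );
    return glm::clamp( ( d - nearDist ) / ( farDist - nearDist ), 0.0f, 1.0f );
}

// The close-up style leans on a Fresnel-like rim term, roughly
//   rim = pow( 1.0 - max( dot( normal, viewDir ), 0.0 ), rimPower ),
// which brightens the silhouette while leaving the interior of the sphere translucent.
```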

360 photo texture, color remapped, without shading
360 photo texture with my custom shading

Since the planet in its transparent style is no longer realistic, there is really no right or wrong in how to approach it. For each planet I rendered its back faces first, with a low alpha value and additive blending, then rendered the front faces with a higher alpha value and regular alpha blending. Sorting the planets and using the painter's algorithm solved all the depth headaches. We also added a billboard for each planet that not only provides the glow but also adds a starry overlay that is visible from inside the planet.
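
The two passes look roughly like this sketch (the batch, shader and alpha values are placeholders):

```cpp
#include "cinder/gl/gl.h"

// Sketch of the two-pass transparent planet draw. planetBatch is assumed to be a
// gl::Batch built from a sphere and the close-up shader passed in alongside it.
void drawTransparentPlanet( const ci::gl::BatchRef &planetBatch,
                            const ci::gl::GlslProgRef &shader )
{
    using namespace ci;

    // Pass 1: back faces only, low alpha, additive blending.
    {
        gl::ScopedFaceCulling   cull( true, GL_FRONT );  // cull front faces, keep the back
        gl::ScopedBlendAdditive add;
        shader->uniform( "uAlpha", 0.15f );              // placeholder value
        planetBatch->draw();
    }

    // Pass 2: front faces, higher alpha, regular alpha blending.
    {
        gl::ScopedFaceCulling cull( true, GL_BACK );
        gl::ScopedBlendAlpha  blend;
        shader->uniform( "uAlpha", 0.6f );               // placeholder value
        planetBatch->draw();
    }
}
```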

Inside of the planet

To further enrich the visuals, we decided to add some asteroids/debris/dust, whatever you think they are, throughout the space. I pulled in a low-poly mesh I had built earlier in Blender and used instancing to render up to 500,000 instances of it. Visually, however, we decided to go with 4,000 instances. J.H. Moon programmed four types of movement driven by simplex noise and assigned them randomly to the particles, and we ended up with a mixture of floating and flocking asteroids/debris/dust.
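
The instancing setup follows the usual Cinder pattern, roughly as sketched below; the file name, attribute name and counts are placeholders, and the real project also updates the per-instance data every frame for the noise-driven motion.

```cpp
#include "cinder/app/App.h"
#include "cinder/gl/gl.h"
#include "cinder/ObjLoader.h"
#include <vector>

// Rough sketch of the instanced asteroid setup, patterned after Cinder's instancing samples.
ci::gl::BatchRef makeAsteroidBatch( const std::vector<ci::vec3> &positions,
                                    const ci::gl::GlslProgRef &glsl )
{
    using namespace ci;

    // Low-poly mesh exported from Blender (placeholder file name).
    gl::VboMeshRef mesh = gl::VboMesh::create( ObjLoader( app::loadAsset( "asteroid.obj" ) ) );

    // One vec3 per instance, advanced once per instance (divisor = 1).
    gl::VboRef instanceVbo = gl::Vbo::create( GL_ARRAY_BUFFER,
        positions.size() * sizeof( vec3 ), positions.data(), GL_DYNAMIC_DRAW );
    geom::BufferLayout layout;
    layout.append( geom::Attrib::CUSTOM_0, 3, sizeof( vec3 ), 0, 1 /* per instance */ );
    mesh->appendVbo( layout, instanceVbo );

    // Map the custom attribute to the name the vertex shader expects.
    return gl::Batch::create( mesh, glsl, { { geom::Attrib::CUSTOM_0, "aInstancePosition" } } );
}

// Drawing then boils down to:
//     asteroidBatch->drawInstanced( numInstances );   // 4,000 in the installation
```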

Particles

With the graphics side mostly ready, we worked hard on the interaction side of the project. While figuring out how to control the camera, I was inspired by Robert Hodgin's work Look Up: 100 Year Starship, in which the camera automatically aims at different stars, flying and turning smoothly on its own. Although we acquired a 3DConnexion SpaceNavigator controller from our client, we decided to let the user control only the camera's "look at" direction with it and let the camera fly by itself. This way the user's attention is guided by the camera's automatic movement, so they never get lost in empty space but can still look around.
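
Conceptually, the update looks something like the sketch below, where only the look-at target is nudged by the device input while the eye position follows its own path (all names and scales are placeholders):

```cpp
#include "cinder/gl/gl.h"
#include "cinder/Camera.h"

// Illustrative only: the controller's input nudges a smoothed look offset,
// while the camera position keeps following the autonomous flight path.
void updateCamera( ci::CameraPersp &cam, const ci::vec3 &autoEye, const ci::vec3 &autoTarget,
                   const ci::vec2 &deviceTilt, ci::vec2 &lookOffset )
{
    using namespace ci;
    lookOffset = glm::mix( lookOffset, deviceTilt, 0.1f );          // smooth the raw input

    // Build a look-at target around the autonomous one, offset by the user input.
    vec3 forward = glm::normalize( autoTarget - autoEye );
    vec3 right   = glm::normalize( glm::cross( forward, vec3( 0, 1, 0 ) ) );
    vec3 up      = glm::cross( right, forward );
    vec3 target  = autoTarget + right * lookOffset.x + up * lookOffset.y;

    cam.lookAt( autoEye, target );
}
```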

3DConnexion SpaceNavigator at the exhibition

What makes our project different from Robert's is that our planetary system is dynamic: planets revolve around the Sun at a visually significant rate. Although we could have retrieved the target planet's current position and used it to lerp the camera position, this time we wanted the camera movement to be physically based, since that can bring unpredictable results. I wrote a finite state machine for the camera with several states: switching between targets, flying toward the target, locked onto the target, and inside a planet. For each state I used either lerping or physical attraction and deceleration to control properties of the camera such as its position and view direction. The downside of being physically based is that we had a hard time tuning all the physics parameters, which have anywhere from subtle to dramatic effects on the camera's behavior. One fun discovery: modulating the camera's y position with a sine wave makes it approach its target in a rather interesting way.
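
Stripped down to its core, the idea looks something like this sketch; it is not the project's actual state machine, and all the constants are placeholders:

```cpp
#include "cinder/Vector.h"
#include <cmath>

// Minimal sketch of the camera state machine: an attraction force pulls the camera
// toward the current target, the velocity is damped for deceleration, and the
// y position is modulated with a sine wave.
enum class CamState { SwitchingTarget, FlyingToTarget, LockedToTarget, InsidePlanet };

struct FlyCam {
    CamState state    = CamState::FlyingToTarget;
    ci::vec3 position = ci::vec3( 0.0f );
    ci::vec3 velocity = ci::vec3( 0.0f );

    void update( const ci::vec3 &targetPos, float time, float dt )
    {
        if( state == CamState::FlyingToTarget ) {
            ci::vec3 toTarget = targetPos - position;
            velocity += toTarget * 0.4f * dt;            // attraction (placeholder strength)
            velocity *= 0.98f;                           // deceleration / damping
            position += velocity * dt;
            position.y += sinf( time * 0.5f ) * 0.05f;   // sine-wave modulation on y

            if( glm::length( toTarget ) < 1.0f )         // close enough: lock onto the target
                state = CamState::LockedToTarget;
        }
        // ... the other states use simple lerps instead of forces.
    }
};
```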

Audience in the exhibition

One important feature we added for the exhibition, which was also part of J.H. Moon's initial idea, is the ability to add new 360 photos to the application while it runs non-stop. We used watchdog to monitor a Dropbox folder; every now and then we took pictures at the conference and uploaded them to that folder, and the application would create new planets from the new photos.
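
Purely as an illustration of the idea (the installation itself relied on watchdog rather than a loop like this), a naive polling version could look like:

```cpp
#include <filesystem>
#include <set>
#include <string>
#include <vector>

namespace fs = std::filesystem;

// Illustration only: returns the photo paths that appeared in `folder` since the
// previous call. The ".jpg" extension is an assumption about the incoming photos.
std::vector<fs::path> pollNewPhotos( const fs::path &folder, std::set<std::string> &seen )
{
    std::vector<fs::path> fresh;
    for( const auto &entry : fs::directory_iterator( folder ) ) {
        if( entry.path().extension() != ".jpg" )
            continue;
        if( seen.insert( entry.path().string() ).second )
            fresh.push_back( entry.path() );   // first time we've seen this file
    }
    return fresh;
}

// Each fresh path is then handed to the app, which builds a new planet from the photo.
```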