Images and Animations
Click on the images for a larger (1920x1080) version.
Large movie (1080p 136MB)
Medium movie (720p 54MB)
Small movie (480p 27MB)
I recommend right-clicking on the movie links, opening them with QuickTime, and looping the movie.
The software used to render each frame of the animation was written by me. No post-processing was done. I was much more interested in playing with my own software than using Photoshop to batch edit each of these files to get the look that I wanted. Not that using Photoshop or any other software is a bad idea - it just wasn't as interesting to me as working on the raytracer. The code is available here.
ffmpeg was used to join the frames together into an animation.
The final project used a very simple scene:
- 10 glass spheres with an index of refraction of 1.5.
- Sky Sphere: textured using a turbulent sky blue + extra white texture to simulate a cloudy sky.
- Ground plane: textured using a turbulent texture and bump mapped to simulate flowing sand.
High Dynamic Range Color
From the start of writing this software, double precision floating point high dynamic range color was used. Colors are tone mapped into the 0.0-1.0 range using (color / (color + 1.0)). The image was stored in an OpenGL floating point texture and pushed out to the video card in this format. It was left to the video card to convert from 0.0-1.0 to 0-255 if needed. If the image was to be saved, the raytracer performed the conversion and saved the file in PPM format.
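The tone mapping step can be sketched as follows; this is the standard Reinhard-style c/(c+1) form, and the function names here are illustrative, not taken from the original code:

```cpp
// Reinhard-style tone mapping: maps an HDR channel value in [0, inf)
// into [0, 1). Double precision, matching the renderer's color format.
double tonemap(double c) {
    if (c < 0.0) c = 0.0;   // clamp negative light, just in case
    return c / (c + 1.0);
}

// Conversion to an 8-bit channel, as done when saving a PPM file.
int to_byte(double c) {
    return static_cast<int>(tonemap(c) * 255.0 + 0.5);
}
```

Note that no input value ever reaches 1.0 exactly, so very bright HDR colors (like the 2.0 cloud white described later) compress gracefully instead of clipping.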
The specular highlight exponent on each sphere is varied between 0 and 30 using Perlin noise. This gives the highlight a non-uniform look so that when the sphere rotates, it appears to be moving.
A yellow specular highlight was used. Red-orange was originally the color that looked the best in still shots, but once it was animated, the yellow specular highlight looked more natural.
The sky sphere is implemented as its own class. It is derived from the Sphere class and uses its intersect() method, but reverses the direction of the normal since the sky sphere is intended to encapsulate the entire scene. If a plain Sphere was used, the normal would be pointing the wrong way and the sky would not be shaded properly.
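A minimal sketch of that inheritance arrangement, with hypothetical class members (the real Sphere class also carries intersect(), materials, and so on):

```cpp
struct Vec3 {
    double x, y, z;
    Vec3 operator-() const { return {-x, -y, -z}; }
};

// Hypothetical minimal Sphere exposing only the normal computation.
struct Sphere {
    Vec3 center;
    double radius;
    virtual Vec3 normal_at(const Vec3& p) const {
        return {(p.x - center.x) / radius,
                (p.y - center.y) / radius,
                (p.z - center.z) / radius};
    }
    virtual ~Sphere() = default;
};

// SkySphere reuses Sphere's geometry but flips the normal so it points
// inward, toward rays that originate inside the scene it encapsulates.
struct SkySphere : Sphere {
    Vec3 normal_at(const Vec3& p) const override {
        return -Sphere::normal_at(p);
    }
};
```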
As the texture-mapped infinite plane disappears into the distance, some serious aliasing begins to occur and it is very noticeable in the animation. Fog to the rescue! The fog color is the same as the sky blue color used in the sky sphere and helps blend the far away points into the distance. This keeps things from looking like they live in a vacuum and adds a feeling of air and atmosphere to the scene.
Fog was implemented as described in Ed Angel's book, Interactive Computer Graphics: A Top-Down Approach.
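The blend can be sketched as an exponential falloff in the style Angel describes; the density constant and color layout below are assumptions for illustration:

```cpp
#include <cmath>

struct Color { double r, g, b; };

// Fog factor f falls off with distance from the eye; the shaded surface
// color is blended toward the fog color as f goes to zero.
Color apply_fog(const Color& surface, const Color& fog,
                double distance, double density) {
    double f = std::exp(-density * distance);   // f -> 1 near, -> 0 far
    return {f * surface.r + (1.0 - f) * fog.r,
            f * surface.g + (1.0 - f) * fog.g,
            f * surface.b + (1.0 - f) * fog.b};
}
```

Using the sky blue as the fog color is what makes the distant, aliased parts of the plane fade smoothly into the horizon.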
Perlin noise was used throughout the scene. A simple two-color turbulent texture was used on the ground plane to create a flowing watery sand effect and another two-color turbulent texture was used to create a cloudy sky.
As noted earlier, Perlin noise was used to vary the specular highlights. It was also used to bump map the ground plane. The combined effect of a Perlin noise turbulent texture and bump mapping was really cool looking, especially in the animation. It's hard to believe that a single plane could look so interesting!
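The turbulence idea can be sketched as a sum of absolute noise values over several octaves. The stand-in below uses a simple 1-D hash-based value noise for brevity (the real renderer used 3-D Perlin noise), and the constants are illustrative:

```cpp
#include <cmath>
#include <algorithm>

// Stand-in 1-D value noise in (-1, 1]; a placeholder for Perlin noise.
double noise(double x) {
    int i = static_cast<int>(std::floor(x));
    double t = x - i;
    auto hash = [](int k) {
        unsigned n = static_cast<unsigned>(k);
        n = (n << 13) ^ n;
        n = n * (n * n * 15731u + 789221u) + 1376312589u;
        return 1.0 - (n & 0x7fffffffu) / 1073741824.0;   // in (-1, 1]
    };
    double s = t * t * (3.0 - 2.0 * t);                  // smoothstep blend
    return hash(i) * (1.0 - s) + hash(i + 1) * s;
}

// Turbulence: sum of |noise| over octaves of doubling frequency.
double turbulence(double x, int octaves) {
    double sum = 0.0, freq = 1.0, amp = 1.0;
    for (int o = 0; o < octaves; ++o) {
        sum += std::fabs(noise(x * freq)) * amp;
        freq *= 2.0;
        amp *= 0.5;
    }
    return sum;
}

// Two-color texture: blend weight between the two colors, kept in [0, 1].
double mix_weight(double x) {
    return std::min(turbulence(x, 4), 1.0);
}

// The same trick drives the specular exponent variation between 0 and 30.
double spec_exponent(double x) {
    return 30.0 * mix_weight(x);
}
```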
The clouds in the sky weren't noticeable enough, so the white color was increased from RGB (1.0, 1.0, 1.0) to (2.0, 2.0, 2.0). This wouldn't have worked if high dynamic range color wasn't supported by the raytracer.
Horses and Cows
I ride the train every day and I saw a bunch of these animals while writing this raytracer and thought they deserved a mention. Cows aren't very smart. They run away after the train passes them. Horses run away before the train gets near them. Maybe that's why we eat cows and not horses, since the cows don't run away from danger until it's too late. Oh, and horses are useful beyond being edible. I suspect cows aren't useful for much beyond being food for people. Just a thought.
Dielectrics were implemented using Snell's law to determine the transmitted and reflected rays and a Fresnel term was used to determine how much of the light was divided up between those two rays.
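A scalar sketch of those two pieces, with illustrative names (the real code works on full ray and normal vectors):

```cpp
#include <cmath>

// Snell's law: given the two indices of refraction and the cosine of the
// incident angle, compute the cosine of the transmitted angle.
// Returns false on total internal reflection.
bool snell(double n1, double n2, double cos_i, double& cos_t) {
    double sin_t2 = (n1 / n2) * (n1 / n2) * (1.0 - cos_i * cos_i);
    if (sin_t2 > 1.0) return false;   // total internal reflection
    cos_t = std::sqrt(1.0 - sin_t2);
    return true;
}

// Unpolarized Fresnel reflectance: the fraction of light carried by the
// reflected ray; the remaining (1 - R) goes into the transmitted ray.
double fresnel(double n1, double n2, double cos_i) {
    double cos_t;
    if (!snell(n1, n2, cos_i, cos_t)) return 1.0;   // all reflected
    double rs = (n1 * cos_i - n2 * cos_t) / (n1 * cos_i + n2 * cos_t);
    double rp = (n1 * cos_t - n2 * cos_i) / (n1 * cos_t + n2 * cos_i);
    return 0.5 * (rs * rs + rp * rp);
}
```

At normal incidence on glass (n = 1.5) this gives the familiar 4% reflectance, climbing toward 100% at grazing angles.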
Reflected rays that bounced around inside of a dielectric were ignored. These rays would quickly reach the maximum depth and result in the default color, which would make the image look grainy. The grain actually looked pretty cool, but it could darken the image. There are ways to approximate this effect and it would be fun to add them to the raytracer.
Instancing was used to create the ten different spheres. Only one actual sphere was created; ten transform matrices were built, and then ten instances, each pairing the one sphere with its associated transform matrix, were added to the scene.
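A sketch of the arrangement, with stand-in types (the real Instance class also inverse-transforms incoming rays in its intersect() method, and the transforms are full matrices rather than the translation-only stand-in here):

```cpp
#include <memory>
#include <vector>

struct Sphere { double radius = 1.0; };
struct Matrix { double tx = 0, ty = 0, tz = 0; };   // stand-in: translation only

struct Instance {
    std::shared_ptr<Sphere> shape;   // geometry shared by every instance
    Matrix transform;                // per-instance placement
};

std::vector<Instance> make_spheres(int count) {
    auto sphere = std::make_shared<Sphere>();   // created exactly once
    std::vector<Instance> scene;
    for (int i = 0; i < count; ++i)
        scene.push_back({sphere, Matrix{2.5 * i, 0, 0}});
    return scene;
}
```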
The scene is animated by passing a time variable on the command line. The raytracer uses this parameter to decide on the location of the objects in the scene. Spheres are rotated according to time and translated using a parabolic function and a preset time shift variable to make them bounce at different times.
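The bounce translation can be sketched like this; the period and peak-height constants are illustrative, not the original values:

```cpp
#include <cmath>

// Bounce height as a function of time: a parabola repeated every period,
// with a per-sphere phase shift so the spheres bounce out of step.
double bounce_height(double t, double shift) {
    const double period = 1.0, peak = 2.0;
    double u = std::fmod(t + shift, period) / period;   // u in [0, 1)
    return 4.0 * peak * u * (1.0 - u);   // 0 at u=0 and u=1, peak at u=0.5
}
```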
Each frame of the final movie took 10 minutes to render. There are 600 frames. That means 100 hours of render time. This problem was solved using shell scripts. A piece of C++ code was written that generated 16 shell scripts that could be run simultaneously on an 8 core Mac Pro. In a perfect world, this would have cut the render time down to about 6.5 hours, but in real life, it took 12 hours. I probably should have kept the number of parallel processes down to 8 or 12 since 2 processes per core didn't seem to have the desired effect.
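The generator can be sketched as dealing the 600 frames round-robin into N scripts. The raytracer binary name and its -t flag below are assumptions for this sketch, not the original command line:

```cpp
#include <string>
#include <vector>

// Build the text of nscripts shell scripts, each rendering every
// nscripts-th frame, so they can all be launched in parallel.
std::vector<std::string> make_scripts(int frames, int nscripts) {
    std::vector<std::string> scripts(nscripts, "#!/bin/sh\n");
    for (int f = 0; f < frames; ++f)
        scripts[f % nscripts] += "./raytracer -t " + std::to_string(f) + "\n";
    return scripts;
}
```

Each string would then be written out as renderN.sh and run in its own background process.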
One problem with these long renders was how to keep track of what had been rendered, so that in the event that somebody rebooted the Mac Pro during the render, the process would not have to start over from the beginning.
A simple form of checkpointing was implemented. Prior to rendering a scene, the existence of the output file was checked. If it existed, it was assumed that the work was already completed and could be skipped.
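The check itself is tiny; a sketch, with an illustrative function name:

```cpp
#include <fstream>
#include <string>

// Checkpointing: a frame is skipped if its output file already exists,
// so a restarted render resumes where the last one left off.
bool already_rendered(const std::string& filename) {
    return std::ifstream(filename).good();
}
```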
Movie Creation and Addition of Audio
ffmpeg was used to create the animation from 600 frames of PPM formatted images. Here is the command line used:
ffmpeg -y -f image2 -i %04d-ani.ppm -b 100000k -r 30 -s 1920x1080 ani.mov
Using that very high bit rate was key to producing a nice looking movie. Originally, the bit rates described in the man page were used and the quality was very low. Good quality was only achieved after bumping the rate up and finally trying some very high numbers.
Music can also be added using ffmpeg, although it was not added to the final project because it became very, very annoying when the movie was looped. First, a simple piece was created using GarageBand and the built-in loops. A little bit of quiet was added to the beginning and end of the music so that the movie could be looped without needing the music to match up exactly at the beginning and the end.
The song was exported into a .wav file and then converted to ac3 audio with the following command:
ffmpeg -i music.wav -acodec ac3 music.ac3
Once the audio is in ac3 format, it can be added to the movie:
ffmpeg -i music.ac3 -acodec copy -b 100000k -i ani.mov final.mov
References
Ray Tracing from the Ground Up by Kevin Suffern
Interactive Computer Graphics: A Top-Down Approach by Ed Angel
Improved Perlin Noise by Ken Perlin