- Use OpenGL and GLUT or Java and Swing to produce a simple image viewer.
- Viewer must accept command-line parameters for size, camera, lights, etc.
- Viewer must report the pixel values under the mouse when the image is clicked.
- Viewer must report its current state (size, camera...) in command-line form when the 'C' key is pressed.
- Viewer must display an image either read from a file or generated by the program.
- Extra credit 1: measure the speed of image updates (how fast can images be loaded?).
- Extra credit 2: identify the fastest method for image updates (use GL extensions).
Usage
Usage: ./image_viewer [options]
options:
  --filename=[filename.png]  File to load
  --float                    Use floating point color
  --width                    Window width. Defaults to image width
  --height                   Window height. Defaults to image height
  --drawpixels               Use glDrawPixels to draw the image. Default is to draw a textured quad
  --eye (x,y,z)              Eye location
  --center (x,y,z)           Where the eye is looking
  --up (x,y,z)               Vector normal to the eye's plane
  --light0                   Define light 0
  --light1                   Define light 1
  --light2                   Define light 2
  --light3                   Define light 3
  --light4                   Define light 4
  --light5                   Define light 5
  --light6                   Define light 6
  --light7                   Define light 7
  --light8                   Define light 8
  --light9                   Define light 9
To exit, press Apple-Q, Escape, or just q.
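For what it's worth, here is a minimal sketch of how an option list like this can be parsed with getopt_long. The option table matches the usage above, but the variable names and handler bodies are illustrative, not my actual code:

```cpp
#include <getopt.h>
#include <cstdio>
#include <cstdlib>
#include <string>

int main(int argc, char **argv) {
    std::string filename;
    int width = -1, height = -1;              // -1 = default to the image's size
    bool useFloat = false, useDrawPixels = false;

    static const struct option longOpts[] = {
        { "filename",   required_argument, 0, 'f' },
        { "float",      no_argument,       0, 'F' },
        { "width",      required_argument, 0, 'w' },
        { "height",     required_argument, 0, 'h' },
        { "drawpixels", no_argument,       0, 'd' },
        { 0, 0, 0, 0 }
    };

    int c;
    while ((c = getopt_long(argc, argv, "", longOpts, 0)) != -1) {
        switch (c) {
        case 'f': filename      = optarg;        break;
        case 'F': useFloat      = true;          break;
        case 'w': width         = atoi(optarg);  break;
        case 'h': height        = atoi(optarg);  break;
        case 'd': useDrawPixels = true;          break;
        default:  std::fprintf(stderr, "unrecognized option\n"); return 1;
        }
    }
    // --eye, --center, --up, and --light0 through --light9 parse the same
    // way, e.g. sscanf(optarg, "(%f,%f,%f)", &x, &y, &z) for the vectors.
    std::printf("%s, %dx%d, float=%d, drawpixels=%d\n",
                filename.c_str(), width, height, useFloat, useDrawPixels);
    return 0;
}
```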
Different Implementations and Effects on Speed
I implemented this project two different ways (both sketched below):
- Read the image into a buffer and use glDrawPixels to put the image onto the display.
- Read the image into a buffer and draw a textured quad.
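The two implementations differ only in the display callback. A sketch of each path follows; imageWidth, imageHeight, pixels, and textureId are stand-ins for however the loaded image is stored:

```cpp
#include <GLUT/glut.h>   // <GL/glut.h> on Linux

// Stand-ins for the loaded image; assumed to be set up elsewhere.
extern int imageWidth, imageHeight;
extern unsigned char *pixels;   // tightly packed RGB data
extern GLuint textureId;        // uploaded once with glTexImage2D

// Path 1: glDrawPixels pushes the client-side buffer through the
// pipeline every single frame.
void displayDrawPixels() {
    glClear(GL_COLOR_BUFFER_BIT);
    glRasterPos2i(-1, -1);      // lower-left corner in the default clip space
    glDrawPixels(imageWidth, imageHeight, GL_RGB, GL_UNSIGNED_BYTE, pixels);
    glutSwapBuffers();
}

// Path 2: the image already lives on the GPU as a texture; each frame
// just draws one quad.
void displayTexturedQuad() {
    glClear(GL_COLOR_BUFFER_BIT);
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, textureId);
    glBegin(GL_QUADS);
        glTexCoord2f(0, 0); glVertex2f(-1, -1);
        glTexCoord2f(1, 0); glVertex2f( 1, -1);
        glTexCoord2f(1, 1); glVertex2f( 1,  1);
        glTexCoord2f(0, 1); glVertex2f(-1,  1);
    glEnd();
    glutSwapBuffers();
}
```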
Timing the glDrawPixels implementation:
0.019059 seconds for first display()
0.005490 seconds average for display()
Timing the textured quad implementation:
0.009020 seconds for first display()
0.000642 seconds average for display()
Using a textured quad is far faster than using glDrawPixels: on these numbers, the average display() call is roughly 8.5x faster. Since we'd like to display frames as fast as possible, this is important to know.
Interestingly, the first time through the display function always took much longer. This is probably lazy initialization inside OpenGL and the driver on first use.
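For the curious, the measurement was nothing fancy. A sketch of the kind of wall-clock timing involved, where drawImage() is a stand-in for whichever display path is active:

```cpp
#include <GLUT/glut.h>   // <GL/glut.h> on Linux
#include <sys/time.h>
#include <cstdio>

extern void drawImage();    // stand-in: whichever display path is active

static double now() {       // wall-clock time in seconds
    timeval tv;
    gettimeofday(&tv, 0);
    return tv.tv_sec + tv.tv_usec * 1.0e-6;
}

void display() {
    static int frames = 0;
    static double total = 0.0;

    double start = now();
    drawImage();
    glFinish();             // wait for GL to actually finish before reading the clock
    double elapsed = now() - start;

    ++frames;
    total += elapsed;
    std::printf("%f seconds for display(), %f average\n",
                elapsed, total / frames);
    glutSwapBuffers();
}
```

The glFinish() call matters: without it you time how long it takes to queue the GL commands, not how long they take to execute.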
One thing I would like to look into is the ARB_sync extension (or some other asynchronous mechanism) so that I can write to a buffer while it is being rendered. This may prove useful for ray tracing.
Informal Discussion of Development Process
This program uses OpenGL and GLUT to display the image, which is rendered as a textured quad. The camera uses a simple orthographic projection.
I hadn't written any OpenGL or GLUT code in at least ten years, so this assignment required me to dust off some rusty skills.
My plan was to load the image, bind it to a texture, and display a textured quad. First I just wanted to get a white quad up on a black background, but I got distracted trying to link to libpng.
I messed around with the Xcode IDE a bit and couldn't figure out how to get it to link against the libpng that ships with Xcode. There were plenty of examples of how to link against a PNG library that you compiled and installed in /usr/local yourself, but I didn't think my instructor would appreciate being required to do the same. Those examples didn't work when I pointed them at the libpng installed with Xcode. I was probably just doing something stupid, but I was getting tired and frustrated. I just wanted to code!
So I busted out an old Makefile and pointed the linker at /Developer/SDKs/MacOSX10.5.sdk/usr/X11/lib, where libpng lives. It was that simple; now I was able to link to libpng.
Next I built a simple program using GLUT and OpenGL that displayed a white quad on a black background.
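That milestone is small enough to show in full. A minimal version looks something like this:

```cpp
#include <GLUT/glut.h>   // <GL/glut.h> on Linux

void display() {
    glClear(GL_COLOR_BUFFER_BIT);
    glColor3f(1.0f, 1.0f, 1.0f);           // white quad...
    glBegin(GL_QUADS);
        glVertex2f(-0.5f, -0.5f);
        glVertex2f( 0.5f, -0.5f);
        glVertex2f( 0.5f,  0.5f);
        glVertex2f(-0.5f,  0.5f);
    glEnd();
    glutSwapBuffers();
}

int main(int argc, char **argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutInitWindowSize(512, 512);
    glutCreateWindow("image_viewer");
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);  // ...on a black background
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
```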
Now I needed to load a PNG. This took a while. First I started reading the libpng documentation, but it wasn't really clear how to load a PNG: the documentation tends to, er, wander, and it expects you to know a lot about the PNG format. I just wanted to load an image, dammit! How hard would it be for them to provide a function that takes a file handle and loads the image?
In their defense, PNGs need to be loaded by some pretty complex programs, which need a complex API. I'm writing a simple program, and I bet I am not their target consumer. If it really pisses me off that much, maybe I should contribute some code? Hmmm...
So I looked online for some examples. Most sucked. I finally found some code written by David Henry. I compiled it and it worked! Well, mostly. The image had some stair-stepping issues.
I extracted his PNG-reading code, refactored it into C++, moved the global variables into my new PNGTexture object, and added exception handling. Then I textured my quad, and it looked great, if you didn't mind a little stair-stepping.
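Roughly, the interface ended up along these lines (reconstructed from memory, so the member names are approximate):

```cpp
#include <string>
#include <stdexcept>

// Approximate shape of the loader class; only the PNGTexture name is exact.
class PNGTexture {
public:
    // Throws std::runtime_error if the file can't be read or isn't a PNG.
    explicit PNGTexture(const std::string &filename);
    ~PNGTexture();

    int width() const;
    int height() const;
    const unsigned char *pixels() const;   // packed RGB(A) rows

private:
    int width_, height_;
    unsigned char *pixels_;
};
```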
At first I thought the problem was in the PNG-loading code. I spent a lot of time reading through it and decided that I really didn't care about the mechanics of loading a PNG file; I'm much more interested in ray tracing, which is what this class is about. So I built a checkerboard texture in my code and used that instead.
The checkerboard rendered fine, so the texture coordinates were correct. I should have done this in the first place.
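The checkerboard is just a couple of loops; this is essentially the Red Book's test texture:

```cpp
const int kSize = 64;
static unsigned char checker[kSize][kSize][3];

// Fill the buffer with 8x8-pixel black and white squares.
void makeCheckerboard() {
    for (int y = 0; y < kSize; ++y) {
        for (int x = 0; x < kSize; ++x) {
            unsigned char c = (((x / 8) + (y / 8)) % 2) ? 255 : 0;
            checker[y][x][0] = c;
            checker[y][x][1] = c;
            checker[y][x][2] = c;
        }
    }
}
```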
Since I didn't want to dig into the PNG loading itself, I looked at the code that loaded the texture into the OpenGL state machine. Comparing it with an example from the Red Book, I found that the PNG-loading code was missing a call to glPixelStorei(GL_UNPACK_ALIGNMENT, 1). I added that call and the stair-stepping problem disappeared. I was loading PNGs and not lame PPMs!
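The reason this matters: GL_UNPACK_ALIGNMENT defaults to 4, so OpenGL assumes every row of the source image starts on a 4-byte boundary. Tightly packed RGB rows whose byte width isn't a multiple of 4 then get read with the wrong stride, shearing the image into exactly the stair-stepping I was seeing. The fix is one line before the upload (imageWidth, imageHeight, and pixels again stand in for the loaded image):

```cpp
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);   // rows are tightly packed; no padding
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB,
             imageWidth, imageHeight, 0,
             GL_RGB, GL_UNSIGNED_BYTE, pixels);
// Without mipmaps, the min filter must be changed or the texture is incomplete.
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
```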
Now I just needed to get the color of the pixel underneath the cursor, which is simple with glReadPixels. Then I added the state printing and started timing the display function.
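Something like this in the GLUT mouse callback does the job; the one gotcha is that GLUT's y axis points down from the top-left while glReadPixels counts rows up from the bottom:

```cpp
#include <GLUT/glut.h>   // <GL/glut.h> on Linux
#include <cstdio>

void mouse(int button, int state, int x, int y) {
    if (button == GLUT_LEFT_BUTTON && state == GLUT_DOWN) {
        // GLUT's origin is the top-left corner; OpenGL's is the bottom-left.
        int glY = glutGet(GLUT_WINDOW_HEIGHT) - 1 - y;

        unsigned char rgb[3];
        glReadPixels(x, glY, 1, 1, GL_RGB, GL_UNSIGNED_BYTE, rgb);
        std::printf("(%d, %d): r=%d g=%d b=%d\n", x, y, rgb[0], rgb[1], rgb[2]);
    }
}
// Registered in main() with glutMouseFunc(mouse).
```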