Homework 6
Justin Huang (cs184-en)
Gideon Chia (cs184-an)
We enjoyed homework 5 (the raytracer) so much that we decided to extend it.
On the raytracer side there were a couple of things we decided to extend (it gets better towards the end):
- Refraction
- Depth of Field
- Animation
- Extra Credit: Extreme Parallelization
- Extra Credit: 8 Filters
Refraction
We generated refraction rays by adding the command:
- refraction index
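The refracted direction follows from Snell's law. A minimal sketch of how the refraction ray can be computed (the function name and tuple-based vectors are our illustration, not the actual raytracer code):

```python
import math

def refract(d, n, eta):
    """Refract unit direction d through a surface with unit normal n.
    eta = n1 / n2, the ratio of refraction indices across the boundary.
    Returns the refracted direction, or None on total internal reflection."""
    cos_i = -sum(di * ni for di, ni in zip(d, n))
    sin2_t = eta * eta * (1.0 - cos_i * cos_i)
    if sin2_t > 1.0:
        return None  # total internal reflection: no refraction ray
    cos_t = math.sqrt(1.0 - sin2_t)
    return tuple(eta * di + (eta * cos_i - cos_t) * ni for di, ni in zip(d, n))
```

At normal incidence the ray passes straight through; at grazing angles with eta > 1 the function reports total internal reflection instead of generating a ray.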
Depth of Field
In order to accomplish depth of field, we introduce a few more commands:
- focal x y z aperture
- aperture aperture
- focal_depth depth
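The usual way to get depth of field is to jitter each ray's origin across the aperture while keeping it aimed at the same point on the focal plane, so only objects at the focal depth stay sharp. A sketch under that assumption (a square aperture and these names are our simplification, not the actual commands' implementation):

```python
import random

def dof_ray(eye, pixel_dir, focal_depth, aperture):
    """Return a (origin, direction) pair for one depth-of-field sample.
    The origin is jittered across the aperture; the direction is re-aimed
    at the point on the focal plane, so focal_depth stays in focus."""
    # point on the focal plane along the original pixel ray
    focal_point = tuple(e + focal_depth * d for e, d in zip(eye, pixel_dir))
    # random lens offset (square aperture in the camera x/y plane, for brevity)
    dx = (random.random() - 0.5) * aperture
    dy = (random.random() - 0.5) * aperture
    origin = (eye[0] + dx, eye[1] + dy, eye[2])
    direction = tuple(f - o for f, o in zip(focal_point, origin))
    norm = sum(c * c for c in direction) ** 0.5
    return origin, tuple(c / norm for c in direction)
```

With an aperture of 0 this degenerates to an ordinary pinhole ray; averaging many such samples per pixel produces the blur away from the focal plane.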
Animation
For animation, I didn't animate any objects; instead I change the camera position and up direction. I define a path using a set of points and draw a Bezier curve through them. The camera follows this curve, as you can see in the video below. To create the video, I generate about 400 frames and stitch them together using ffmpeg at roughly 30 frames per second.
I also introduced autofocus: it takes the middle pixel and makes that the focal point. However, the effect was too abrupt, so I introduced a gradual autofocus that averages the focal points of the previous few frames.
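The two pieces above can be sketched briefly: a Bezier curve evaluated by De Casteljau's algorithm for the camera path, and a running average for the gradual autofocus (both functions are our illustration; the window size is an assumed parameter):

```python
def bezier(points, t):
    """Evaluate a Bezier curve through the given control points at
    t in [0, 1] using De Casteljau's repeated-interpolation algorithm."""
    pts = [tuple(p) for p in points]
    while len(pts) > 1:
        pts = [tuple((1 - t) * a + t * b for a, b in zip(p, q))
               for p, q in zip(pts, pts[1:])]
    return pts[0]

def smoothed_focus(history, new_depth, window=5):
    """Gradual autofocus: average the focal depths of the last few frames
    so the focal point drifts smoothly instead of jumping."""
    history.append(new_depth)
    recent = history[-window:]
    return sum(recent) / len(recent)
```

Sampling `bezier` at 400 evenly spaced values of t gives the camera position for each frame.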
[PLEASE WATCH IN HIGH DEF]
Extra Credit: Extreme Parallelization
After introducing a lot more effects, raytracing became painfully slow, and making videos became even slower. In order to still generate high-quality videos but render them in less than a day, I decided to utilize the power of the hives. I wrote a small HTTP server in Python using the Bottle microframework. The server had a few simple GET and POST endpoints:
- [GET] url:port/next — gives the client the next job that needs rendering
- [POST] url:port/pic — the client posts its image data and the server saves it to a file
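The real server used Bottle, but the idea fits in a short self-contained sketch with just the standard library (the job numbering, paths, and filename here are illustrative assumptions):

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Frames still to be rendered; each GET /next hands out the next one.
jobs = list(range(400))
lock = threading.Lock()

class FarmHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/next":
            with lock:
                frame = jobs.pop(0) if jobs else None
            body = json.dumps({"frame": frame}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)

    def do_POST(self):
        if self.path.startswith("/pic"):
            length = int(self.headers.get("Content-Length", 0))
            data = self.rfile.read(length)
            # save the rendered frame to disk (filename scheme is illustrative)
            with open("frame.png", "wb") as f:
                f.write(data)
            self.send_response(200)
            self.end_headers()

    def log_message(self, *args):
        pass  # keep the console quiet

def serve(port=8080):
    HTTPServer(("127.0.0.1", port), FarmHandler).serve_forever()
```

Each hive machine just loops: GET /next, render the frame, POST the image back, repeat until the server runs out of jobs.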
Three area lights with a 50,000-triangle dragon, soft shadows, and depth of field. This took about 3 hours to render on 26 hive computers totaling 998.6 GHz; it would have taken 12.5 days to render on my computer alone, with only 9.8 GHz.
[PLEASE WATCH IN HIGH DEF]
A 50,000-triangle dragon with refraction and depth of field. This took about 2 hours to render and would have taken a week on my computer alone.
[PLEASE WATCH IN HIGH DEF]
Extra Credit: 8 Filters
Justin wrote 8 filters that are demonstrated in the video below. Rendering the full video took too much disk space, so I could only render half of it; the half shown still demonstrates all 8 filters.
The panels are arranged in this pattern (panel 1 is unfiltered; panels 2–9 are the filters):
1 2 3
4 5 6
7 8 9
- normal without filter
- pincushion distortion
- Image magnification increases with distance from the optical axis. The visible effect is that lines that do not pass through the centre of the image are bowed inwards, towards the centre. Remapped each x,y point to a new x',y' point based on how strong a "bulge" effect I wanted.
- barrel distortion
- Image magnification decreases with distance from the optical axis. The apparent effect is that of an image mapped around a sphere (or barrel), like a fisheye lens. Remapped each x,y point to a new x',y' point based on how strong a "bulge" effect I wanted; the opposite of the pincushion distortion algorithm.
- sepia filter
- Applied a sepia image filter over the image by adjusting and weighting RGB values.
- gaussian blur
- Applied a 5x5 averaging convolution kernel over the image to approximate a Gaussian blur.
- grayscale
- Took the intensity of the color at each x,y point to convert the image to grayscale.
- color inversion filter
- Inverted the RGB color values wherever an object exists.
- horizontal blur
- Used a horizontal 20x1 averaging convolution matrix to create a larger blur effect than the 5x5 gaussian blur.
- edge detection algorithm
- Used the Sobel operator with vertical and horizontal derivative approximations to calculate gradient magnitudes and detect edges in the picture.
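Several of these filters are simple per-pixel or per-coordinate maps. A sketch of a few of them, with the caveat that the exact weights and distortion strength below are common textbook values, not necessarily the ones Justin used:

```python
def sepia(r, g, b):
    """Sepia filter: weighted recombination of the RGB channels,
    clamped to the 0-255 range."""
    return (min(255, int(0.393 * r + 0.769 * g + 0.189 * b)),
            min(255, int(0.349 * r + 0.686 * g + 0.168 * b)),
            min(255, int(0.272 * r + 0.534 * g + 0.131 * b)))

def grayscale(r, g, b):
    """Grayscale: luminance-weighted intensity replicated across channels."""
    y = int(0.299 * r + 0.587 * g + 0.114 * b)
    return (y, y, y)

def invert(r, g, b):
    """Color inversion: flip each channel within the 0-255 range."""
    return (255 - r, 255 - g, 255 - b)

def barrel_map(x, y, w, h, strength=0.3):
    """Map an output pixel (x, y) back to a source coordinate for barrel
    distortion: magnification falls off with distance from the optical
    axis. A negative strength gives pincushion distortion instead."""
    cx, cy = w / 2.0, h / 2.0
    nx, ny = (x - cx) / cx, (y - cy) / cy   # normalize to [-1, 1]
    r2 = nx * nx + ny * ny
    f = 1.0 + strength * r2
    return (cx + nx * f * cx, cy + ny * f * cy)
```

The distortion filters work by resampling: for every output pixel, `barrel_map` gives the source coordinate to read from, so pixels near the centre move little while pixels near the edges move a lot.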
[PLEASE WATCH IN HIGH DEF]