Friday, October 21, 2011

Assignment #6 - Refraction

This week we're working on refraction. After we derived the refraction angle from Snell's Law, I started a new spike to add refraction to my raytracer. My very first attempt is a black hole.
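Before the screenshots, here's roughly what the refraction direction we derived from Snell's Law looks like in code. This is only a minimal sketch: the Vec3 type, the helper operators, and the refract name are stand-ins for whatever the raytracer actually uses, and the normal is assumed to be unit length and to point against the incoming ray.

#include <cmath>
#include <optional>

struct Vec3 { double x, y, z; };

Vec3 operator*(double s, const Vec3& v) { return {s * v.x, s * v.y, s * v.z}; }
Vec3 operator+(const Vec3& a, const Vec3& b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec3 operator-(const Vec3& a, const Vec3& b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
double dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Refract a unit incident direction d about a unit normal n, going from a
// medium with index n1 into one with index n2.  Returns nothing when the
// angle is past the critical angle (total internal reflection).
std::optional<Vec3> refract(const Vec3& d, const Vec3& n, double n1, double n2) {
    double eta = n1 / n2;
    double cosI = -dot(d, n);                        // n is assumed to oppose d
    double sin2T = eta * eta * (1.0 - cosI * cosI);  // Snell: sin(t) = eta * sin(i)
    if (sin2T > 1.0) return std::nullopt;            // total internal reflection
    double cosT = std::sqrt(1.0 - sin2T);
    return eta * d + (eta * cosI - cosT) * n;        // transmitted direction
}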


The first step I took toward getting to the bottom of this was a bunch of print statements. The second was to visualize the refraction directions, depicted in the image below.


That dimple is weird; it reminds me of a singular point. It's right where the incoming view direction is the same as the surface normal.

The next thing I did was double-check my intersection code, where I decide whether a ray is inside or outside an object when it hits the surface. I found a copy-and-paste error (when will I stop doing that?), and when I fixed it, I discovered that I could at least bend some light through my transparent sphere.
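For what it's worth, the inside/outside decision I'm talking about amounts to checking which way the outward normal points relative to the ray. This is just an illustrative sketch reusing the little Vec3 helpers above, not my actual classes: if the ray direction and the outward surface normal point the same way, the ray is leaving the object, so the two indices swap and the normal gets flipped before computing the refraction direction.

struct Hit { Vec3 point; Vec3 normal; };  // normal stored facing outward

void setupRefraction(const Vec3& rayDir, Hit& hit,
                     double objectIndex, double airIndex,
                     double& n1, double& n2) {
    bool exiting = dot(rayDir, hit.normal) > 0.0;
    if (exiting) {
        n1 = objectIndex;                   // coming from inside the glass
        n2 = airIndex;                      // heading back out into the air
        hit.normal = -1.0 * hit.normal;     // flip so the normal opposes the ray
    } else {
        n1 = airIndex;
        n2 = objectIndex;
    }
}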


That's an unsettling black mass in my transparent orb. I wonder if rays are failing to hit the plane near the very bottom of the sphere because of my small epsilon guard, which keeps a ray from hitting an object before it has left that object's surface on its way to some other destination. That might explain the black circle at the bottom of the sphere, which in turn might help account for the large black area in the middle of the sphere, if the refraction angles there approach pointing down into it.
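The epsilon guard in question is basically a one-line test when collecting hits. The constant below is just an illustrative value, not my actual one; the point is that if it's too large, legitimate hits very close to a surface (like the plane right under the sphere) get rejected along with the self-hits it's meant to filter out.

const double kSelfHitEpsilon = 1e-4;   // illustrative value

// Accept a candidate intersection only if it's a little way along the ray and
// closer than anything found so far, so a ray leaving a surface can't
// immediately re-hit that same surface at t ~= 0.
inline bool acceptHit(double t, double closestSoFar) {
    return t > kSelfHitEpsilon && t < closestSoFar;
}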

I tried some smaller refractive indices for the solid sphere, and though that created much less black space in the center, there's obviously still a problem with the refraction angles. Here's a screenshot from when I set the refractive index of the sphere equal to that of the air, i.e. both equal to 1.


It seems like when the refractive indices of the air and the sphere are equal to each other, the sphere should become invisible: Snell's Law then gives a transmitted angle equal to the incident angle, so the rays should pass straight through undisturbed.

I think the dimple in my refraction angles is what's wrong here: in the image above there appears to be a singular point in the same relative position as in the refraction-angle visualization.

Ok, so I just found my bug, and now I'm getting something that looks like refraction.  I put up another checkered plane behind my refractive sphere, and the resulting image appears similar to other refractive spheres I've seen placed in front of other checkered planes around the internet.


Some unsightly speckles have been added, but the distortion of the checkered plane through the sphere seems to look right to me.  In order to clear up the speckles, I went from one ray per pixel to 25, and that seemed to make a difference, as you can see below.


So now one of the problems with this image is that I've avoided allowing reflections for transparent objects, just to see if I could quickly get the light rays bent in the right directions. I think the next step is to go back and split rays that hit transparent objects: one ray should trace the reflection, and the other the refraction.
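Roughly what I have in mind, building on the sketches above. Here trace is a stand-in for the raytracer's recursive trace call, the Ray and Color types are illustrative, and the even 50/50 blend is just a placeholder where a Fresnel term really belongs.

struct Ray   { Vec3 origin, dir; };
struct Color { double r, g, b; };
Color operator*(double s, const Color& c) { return {s * c.r, s * c.g, s * c.b}; }
Color operator+(const Color& a, const Color& b) { return {a.r + b.r, a.g + b.g, a.b + b.b}; }

Color trace(const Ray& ray, int depth);   // stand-in for the recursive trace call

// A ray that hits a transparent object spawns two children: one follows the
// reflection, the other follows the refraction (unless there's total internal
// reflection), and the results are blended.
Color shadeTransparent(const Ray& ray, Hit hit, double objectIndex, int depth) {
    if (depth <= 0) return {0.0, 0.0, 0.0};

    Vec3 reflDir = ray.dir - 2.0 * dot(ray.dir, hit.normal) * hit.normal;
    Color result = 0.5 * trace({hit.point, reflDir}, depth - 1);

    double n1, n2;
    setupRefraction(ray.dir, hit, objectIndex, /*airIndex=*/1.0, n1, n2);
    if (auto refrDir = refract(ray.dir, hit.normal, n1, n2)) {
        result = result + 0.5 * trace({hit.point, *refrDir}, depth - 1);
    }
    return result;
}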

I also still need to allow for different colors of glass, as right now the only color for transparent objects is perfectly clear.

By the way, here is what the visualized refraction vectors look like, now that they're correct. The eye is sitting on the negative x-axis looking up the positive x-axis, and the transparent (here, salmon-colored) sphere is right on top of the origin. I think the main point is that you see nearly a solid color, which reflects the fact that the refracted rays leaving the inside of the sphere's surface closest to the eye are parallel to each other as they travel through the sphere to the opposite side.


And just as a sanity test, I rendered the scene with the transparent sphere having an index of refraction equal to that of the air.  It's nice to see no distortion of the plane in back of the sphere.


Now to handle reflections and refractions together.  And maybe figure out the source of those speckles.

... in progress ...

Tuesday, October 18, 2011

Assignment #4 - Texture and Anti-Aliasing

This assignment included texture, anti-aliasing, viewing the scene from different angles, and making a movie out of raytraced frames.

The first step for me was getting the plane textured with the checkerboard pattern.


I didn't quite get it correct at first, as you can see above. My rectangles are defined by their center point, a height and width, and two vectors which allow for orientation and texturing. In the image above, the origin of the rectangle (plane) is right below the red ball, and you can see that the pattern is OK in the upper right and lower left quadrants, but not in the other two. I fixed it, as you can see below.
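As an aside, here's roughly how a checker pattern can fall out of those two vectors (reusing the little Vec3 helpers from the refraction sketches above): project the vector from the rectangle's center to the hit point onto each axis, divide by the square size, and use the parity of the summed indices. This is a sketch with illustrative names rather than my actual code, but one common way to get exactly the two-bad-quadrants symptom above is truncating toward zero instead of taking the floor when computing the square indices.

#include <cmath>

bool isWhiteSquare(const Vec3& hitPoint, const Vec3& center,
                   const Vec3& u, const Vec3& v, double squareSize) {
    Vec3 local = hitPoint - center;
    int iu = (int)std::floor(dot(local, u) / squareSize);   // floor, not truncate
    int iv = (int)std::floor(dot(local, v) / squareSize);
    return ((iu + iv) % 2 + 2) % 2 == 0;   // keep the parity positive for negative indices
}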


After getting the checkerboard texture working, I worked on adding an OrientedSphere class, which I would use to both texture the sphere and rotate it in the animated scene.  The first image with the earth texture on the sphere looks all wrong.


Here I'm trying to visualize the spherical coordinates I generated in the process of texturing the sphere.


Here I'm visualizing the projections of the vector pointing from the center of the sphere to the hit point onto the oriented sphere's "forward", "up", and "right" vectors.
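Those projections eventually become texture coordinates along these lines. This is a sketch with illustrative names (again reusing the Vec3 helpers from the sketches above), not my actual OrientedSphere, and it assumes the forward/up/right vectors are unit length and mutually orthogonal. One detail worth noting: using atan2 on two of the projections for the azimuth, rather than acos on a single one, is one way to avoid the mirrored-longitude symptom that shows up further down.

#include <cmath>

const double kPi = 3.14159265358979323846;

void sphereUV(const Vec3& hitPoint, const Vec3& center,
              const Vec3& forward, const Vec3& up, const Vec3& right,
              double radius, double& u, double& v) {
    Vec3 p = hitPoint - center;
    double x = dot(p, forward) / radius;
    double y = dot(p, right)   / radius;
    double z = dot(p, up)      / radius;

    double phi   = std::atan2(y, x);   // azimuth around the up axis, -pi..pi
    double theta = std::acos(z);       // polar angle from the up axis, 0..pi

    u = (phi + kPi) / (2.0 * kPi);     // 0..1 around the equator
    v = theta / kPi;                   // 0..1 from pole to pole
}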


In the image below, I thought I was on the trail of something, but I still wasn't getting near the cause of the problem.


Aha!  I was treating the earth image RGB data as RGBA data, and once I fixed it, I got the map textured onto the sphere.
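The bug in a nutshell (an illustrative sketch; my actual image class is different): RGB data has a stride of 3 bytes per texel, and indexing it as if it were 4-byte RGBA both lands on the wrong texel and reads the wrong channels.

struct Texel { unsigned char r, g, b; };

Texel texelAt(const unsigned char* rgbData, int width, int row, int col) {
    const int stride = 3;                    // RGB, not RGBA (which would be 4)
    int i = (row * width + col) * stride;
    return { rgbData[i], rgbData[i + 1], rgbData[i + 2] };
}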


Since I'm eventually going to be viewing from different angles, I'd like to not have a "dark side of the moon" in my scene.  And since I had designed my raytracer for this eventuality, supporting multiple colored point lights wasn't very difficult.  Below, I've added a blue-ish green light.
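Supporting several lights really is just a sum over them. Here's a rough sketch of the diffuse part, reusing the small Vec3/Color helpers from the sketches above; PointLight and shadeDiffuse are illustrative names, and shadow rays and the specular term are left out to keep it short.

#include <algorithm>
#include <cmath>
#include <vector>

struct PointLight { Vec3 position; Color color; };

Color shadeDiffuse(const Vec3& hitPoint, const Vec3& normal,
                   const Color& surfaceColor,
                   const std::vector<PointLight>& lights) {
    Color result{0.0, 0.0, 0.0};
    for (const auto& light : lights) {
        Vec3 toLight = light.position - hitPoint;
        double len = std::sqrt(dot(toLight, toLight));
        Vec3 l = (1.0 / len) * toLight;                     // unit vector toward the light
        double lambert = std::max(0.0, dot(normal, l));     // facing ratio
        result = result + lambert * Color{surfaceColor.r * light.color.r,
                                          surfaceColor.g * light.color.g,
                                          surfaceColor.b * light.color.b};
    }
    return result;
}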


Now a third light, a pink-ish one.


In the following two images, I've highlighted a problem I'm having with the earth texture on the sphere. I rotated the offending sphere so you can see the longitudinal line across which the texture image seems to be mirrored.


In the image below, I rotated the sphere 180 degrees so you can see the other line of longitudinal reflection.  I still have a bug in the way I generate the texture coordinates for the sphere, I guess.  For now, I'm leaving it behind to move on to the rest of the assignment.


I implemented a simple anti-aliasing approach which just divides each pixel up into an n x n grid and puts a ray through the center of each sub-pixel grid cell. This approach could be extended pretty easily to do jittered samples, but as a baseline, I thought I'd see what the simplest method got me. Below is an image rendered with 16 rays per pixel. The places where the image is obviously much better include the reflected checkerboard pattern on the green sphere and the texture on the other sphere.
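The sub-pixel grid itself is just two loops: 16 rays per pixel is a 4 x 4 grid, and the 121-ray image further down is 11 x 11. A sketch is below; tracePixelRay is a stand-in for however the camera turns a sub-pixel coordinate into a ray and traces it, and the Color helpers come from the sketches above.

Color tracePixelRay(double px, double py);   // stand-in: sub-pixel coordinate -> traced color

Color antialiasedPixel(int x, int y, int n) {
    Color sum{0.0, 0.0, 0.0};
    for (int j = 0; j < n; ++j) {
        for (int i = 0; i < n; ++i) {
            double px = x + (i + 0.5) / n;   // center of sub-cell column i
            double py = y + (j + 0.5) / n;   // center of sub-cell row j
            sum = sum + tracePixelRay(px, py);
        }
    }
    return (1.0 / (n * n)) * sum;            // average the n*n samples
}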


Below is an image rendered with 121 rays per pixel.  In my opinion, the 16 rays per pixel version is almost as good, and it rendered much more quickly.


Here's a close-up with 1 ray per pixel.


Below is the same close-up with 25 rays per pixel.



The final step this week was to allow viewing from different angles.  I decided to allow moving and rotating the raytracing camera using the keyboard to make capturing a short clip a little easier.  Below is a short video clip, assembled by using the keys to move through the scene and rotate the globe, while capturing the image at each step.  I then converted the ppm images to png and made an mpg video out of them using ffmpeg.




You can check out this assignment by doing an "svn update" in my repository.  Check out the README.txt for the available command-line options when running.

Saturday, October 8, 2011

Assignment #3 - Part 3 - Accelerated Sphere Flake

The assignment this time is to finish up the acceleration data structure by making it handle our arbitrary geometric objects instead of just a set of boxes.  So I added code to take a list of geometric objects and put them into a BVH.  Then I tested it out on my sphere flake.  The results are quite noticeable.
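The build step is roughly: compute the bounds of the whole set, split the list at the median along the longest axis, and recurse. Here's a sketch with placeholder types; Bounds, Object, and the centroid field stand in for whatever my GeometricObject interface actually exposes (and reuse the Vec3 helpers from the sketches above), so the real code differs in the details.

#include <algorithm>
#include <memory>
#include <vector>

struct Bounds { Vec3 min, max; };
struct Object { Bounds bounds; Vec3 centroid; };
using ObjectPtr = std::shared_ptr<Object>;

struct BVHNode {
    Bounds bounds;
    std::unique_ptr<BVHNode> left, right;
    std::vector<ObjectPtr> objects;          // only filled in at the leaves
};

Bounds merge(const Bounds& a, const Bounds& b) {
    return { { std::min(a.min.x, b.min.x), std::min(a.min.y, b.min.y), std::min(a.min.z, b.min.z) },
             { std::max(a.max.x, b.max.x), std::max(a.max.y, b.max.y), std::max(a.max.z, b.max.z) } };
}

std::unique_ptr<BVHNode> build(std::vector<ObjectPtr> objs) {
    auto node = std::make_unique<BVHNode>();
    node->bounds = objs.front()->bounds;                    // assumes objs is non-empty
    for (const auto& o : objs) node->bounds = merge(node->bounds, o->bounds);

    if (objs.size() <= 2) {                                 // small enough: make a leaf
        node->objects = std::move(objs);
        return node;
    }

    // Split the object list at the median along the longest axis of the bounds.
    Vec3 extent = node->bounds.max - node->bounds.min;
    int axis = (extent.x > extent.y && extent.x > extent.z) ? 0 : (extent.y > extent.z ? 1 : 2);
    auto key = [axis](const ObjectPtr& o) {
        return axis == 0 ? o->centroid.x : (axis == 1 ? o->centroid.y : o->centroid.z);
    };
    std::sort(objs.begin(), objs.end(),
              [&](const ObjectPtr& a, const ObjectPtr& b) { return key(a) < key(b); });

    auto mid = objs.size() / 2;
    std::vector<ObjectPtr> leftObjs(objs.begin(), objs.begin() + mid);
    std::vector<ObjectPtr> rightObjs(objs.begin() + mid, objs.end());
    node->left  = build(std::move(leftObjs));
    node->right = build(std::move(rightObjs));
    return node;
}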

Here's just 1 level of recursion, resulting in 7 spheres:


Without BVH:

Raytraced image in 250885 microseconds

With BVH:

Raytraced image in 172963 microseconds

So the BVH approach is a little faster, but not very impressive.  Below are the image and results with 3 levels of recursion, which result in 37 spheres:


Without BVH:

Raytraced image in 1032742 microseconds

With BVH:

Raytraced image in 388892 microseconds


That's a bit better; now the BVH approach looks more than two and a half times as fast. Here's the next step: 4 levels of recursion, resulting in 187 spheres.


Without BVH:

Raytraced image in 5098718 microseconds

With BVH:

Raytraced image in 805257 microseconds


So now it's better. We're tracing 187 spheres in about 0.8 seconds with the BVH, and it takes over 5 seconds to do the same number without it.

Let's keep going.  Next is 5 levels of recursion for 937 spheres:


Without BVH:

Raytraced image in 26434916 microseconds

With BVH:

Raytraced image in 1416842 microseconds


The BVH is really starting to pay off now. I can do 937 spheres in under 1.5 seconds with the BVH, where it took 26 seconds to do that many without it. At 937 objects, the BVH is almost 19 times faster.

How about 4687 spheres?


Without BVH:

Raytraced image in 135769963 microseconds

With BVH:

Raytraced image in 2239579 microseconds


Over 2 minutes without the BVH, just over 2 seconds with it.  It just gets better as we add more and more spheres.  I don't have time to wait around for the old one to do this many spheres:


With BVH:

Raytraced image in 5171065 microseconds


That's 117,187 spheres in about 5 seconds, which is about how long it took to do just 187 spheres without the BVH. In other words, with the BVH I'm tracing roughly 600 times as many spheres in the same amount of time.


When you check out this project by doing an "svn update", you can try different levels of recursion on the sphere flake, as well as choose whether or not to use the BVH, both from the command line.  See the README.txt for details.  The project is in the "cs513/assignment03_spike03/" folder.

Saturday, October 1, 2011

Assignment #3 - BVH

This assignment is also going to have multiple parts.  In the first part, we were supposed to get a sphere flake working in order to motivate an acceleration data structure.  In this second portion, we had to build a hierarchy of axis-aligned bounding boxes, and to traverse the tree, tracing the bounding boxes themselves.

I figured the first part of this was creating a finite plane abstraction, which I called Rectangle. Here is my first bounded rectangle:


Then I created a Box abstraction that is just composed of 6 Rectangles; here it is:


Then I added code to build a hierarchy of axis-aligned bounding boxes recursively. When I reach the leaves of the recursion, either because I've hit the requested recursion depth or because the boxes have gotten smaller than a small fixed tolerance along the axis of division, I create an actual Box to fill the bounding box. Actually, in order to see the boxes at all, I had to shrink the bounding boxes a little when making the visible boxes at the leaf level.

I also added code to traverse the tree, only attempting to hit actual GeometricObjects when I reach the leaves of the tree, or otherwise don't get any further hits from the recursion.
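For reference, the traversal side is roughly the usual slab test against each node's bounding box, descending only when the box is hit and testing real geometry only at the leaves. This sketch reuses the BVHNode and Bounds placeholders from the sphere flake post above; invDir holds the componentwise reciprocals of the ray direction, and what actually happens at a leaf is left as a comment.

#include <algorithm>

// Intersect the ray with the box by clipping the [tMin, tMax] interval
// against each pair of parallel planes (the "slabs"); the box is hit if the
// interval survives all three axes.
bool hitsBox(const Bounds& b, const Vec3& origin, const Vec3& invDir,
             double tMin, double tMax) {
    double t0 = (b.min.x - origin.x) * invDir.x;
    double t1 = (b.max.x - origin.x) * invDir.x;
    tMin = std::max(tMin, std::min(t0, t1));
    tMax = std::min(tMax, std::max(t0, t1));

    t0 = (b.min.y - origin.y) * invDir.y;
    t1 = (b.max.y - origin.y) * invDir.y;
    tMin = std::max(tMin, std::min(t0, t1));
    tMax = std::min(tMax, std::max(t0, t1));

    t0 = (b.min.z - origin.z) * invDir.z;
    t1 = (b.max.z - origin.z) * invDir.z;
    tMin = std::max(tMin, std::min(t0, t1));
    tMax = std::min(tMax, std::max(t0, t1));

    return tMin <= tMax;
}

void traverse(const BVHNode* node, const Vec3& origin, const Vec3& invDir) {
    if (!node || !hitsBox(node->bounds, origin, invDir, 0.0, 1e30)) return;
    if (!node->objects.empty()) {
        // Leaf: test the actual GeometricObjects (or visible Boxes) stored here.
        return;
    }
    traverse(node->left.get(), origin, invDir);
    traverse(node->right.get(), origin, invDir);
}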

Below is a screenshot with 1 level of recursion:


Number of tree nodes: 3, number of geometric objects: 2
Raytraced image in 322019 microseconds



2 levels of recursion:


Number of tree nodes: 7, number of geometric objects: 4
Raytraced image in 394739 microseconds



3 levels:


Number of tree nodes: 15, number of geometric objects: 8
Raytraced image in 465404 microseconds



4 levels:


Number of tree nodes: 31, number of geometric objects: 16
Raytraced image in 552563 microseconds



5 levels:


Number of tree nodes: 63, number of geometric objects: 32
Raytraced image in 677675 microseconds



Now we're up to 10 levels of recursion:


Number of tree nodes: 2047, number of geometric objects: 1024
Raytraced image in 2180543 microseconds



Up to 12 levels of recursion:


Number of tree nodes: 8191, number of geometric objects: 4096
Raytraced image in 3421047 microseconds



13 levels:


Number of tree nodes: 16383, number of geometric objects: 8192
Raytraced image in 5524829 microseconds



So this is already looking pretty good. In my sphere flake blog post, it took over 33 seconds to trace just 937 spheres without an acceleration data structure. Boxes are much more work to hit than spheres, since there are 6 bounded planes to check, and now I'm tracing over 8,000 Boxes in under 6 seconds!

To check out this assignment, just do an "svn update" if you've checked out my repository before.

This program is in the cs513/assignment03_spike_02/ directory, and to get different levels of recursion, run it like this:

./assignment03 -bvhLevel 7