Sunday, 11 August 2013

raytracing: why is my sphere rendered as an oval?


I am trying to write a raytracer, but I have already hit my first big
problem. For whatever reason, my sphere (which - since I am only
beginning - I simply color white when a ray hits it) is rendered as an
oval. Furthermore, the distortion gets worse the farther I move the
sphere's center away from x = 0 and y = 0.
Here's the intersection and main-loop code:
double const Sphere::getIntersection(Ray const& ray) const
{
    double t;
    double A = 1;
    double B = 2*( ray.dir[0]*(ray.origin[0]-center_[0]) +
                   ray.dir[1]*(ray.origin[1]-center_[1]) +
                   ray.dir[2]*(ray.origin[2]-center_[2]));
    double C = pow(ray.origin[0]-center_[0],2) +
               pow(ray.origin[1]-center_[1],2) +
               pow(ray.origin[2]-center_[2],2) -
               radius_pow2_;
    double discr = B*B - 4*C;
    if (discr > 0)
    {
        t = (-B - sqrt(discr))/2;
        if (t <= 0)
        {
            t = (-B + sqrt(discr))/2;
        }
    }
    else t = 0;
    return t;
}
Sphere blub = Sphere(math3d::point(300., 300., -500.), 200.);
Ray mu = Ray();

// for all pixels of window
for (std::size_t y = 0; y < window.height(); ++y) {
    for (std::size_t x = 0; x < window.width(); ++x) {
        Pixel p(x, y);
        mu = Ray(math3d::point(0., 0., 0.),
                 math3d::vector(float(x), float(y), -300.));
        if (blub.getIntersection(mu) == 0.) {
            p.color = Color(0.0, 0.0, 0.0);
        } else {
            p.color = Color(1., 1., 1.);
        }
    }
}

What I also do not understand is why my "oval" isn't centered in the
image. I have a window of 600 x 600 pixels, so putting the sphere's
center at (300, 300) should, as far as I know, put the sphere in the
center of the window as well.
