Embree: Photo-Realistic Ray Tracing Kernels

Backface culling determination

I'm wondering how Embree determines which faces are 'backfacing' when this option is enabled at compile time or set for a geometry.

Do all triangles in the surface need to be backfacing for the surface to be ignored by the ray? Or only the ones which would typically be hit by the ray?

A simple example: a ray passing through the barrel of a cylinder, with some triangle normals facing the ray and others not.



Closest to location

For our purposes of using Embree as a ray-tracer in our geometry-accelerated Monte Carlo code, it is sometimes necessary to find the closest intersection to a given location. Would you have a recommendation on how best to do this using Embree?
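One possibility, depending on what "closest to a location" means here: newer Embree versions provide rtcPointQuery, which traverses the BVH around a query point and invokes a user callback per primitive; the callback itself has to compute the point-to-primitive distance. A self-contained sketch of that per-triangle distance step (plain C++, no Embree calls; the function name is illustrative):

```cpp
#include <cassert>
#include <cmath>

// Sketch of the per-triangle work an rtcPointQuery callback would do:
// closest point on triangle (a,b,c) to point p, via the standard
// Voronoi-region case analysis. Not Embree code.
struct Vec3 { double x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b) { return {a.x-b.x, a.y-b.y, a.z-b.z}; }
static Vec3 add(Vec3 a, Vec3 b) { return {a.x+b.x, a.y+b.y, a.z+b.z}; }
static Vec3 mul(Vec3 a, double s) { return {a.x*s, a.y*s, a.z*s}; }
static double dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

Vec3 closestPointOnTriangle(Vec3 p, Vec3 a, Vec3 b, Vec3 c) {
  Vec3 ab = sub(b,a), ac = sub(c,a), ap = sub(p,a);
  double d1 = dot(ab,ap), d2 = dot(ac,ap);
  if (d1 <= 0 && d2 <= 0) return a;                      // vertex region A
  Vec3 bp = sub(p,b);
  double d3 = dot(ab,bp), d4 = dot(ac,bp);
  if (d3 >= 0 && d4 <= d3) return b;                     // vertex region B
  double vc = d1*d4 - d3*d2;
  if (vc <= 0 && d1 >= 0 && d3 <= 0)                     // edge AB
    return add(a, mul(ab, d1/(d1-d3)));
  Vec3 cp = sub(p,c);
  double d5 = dot(ab,cp), d6 = dot(ac,cp);
  if (d6 >= 0 && d5 <= d6) return c;                     // vertex region C
  double vb = d5*d2 - d1*d6;
  if (vb <= 0 && d2 >= 0 && d6 <= 0)                     // edge AC
    return add(a, mul(ac, d2/(d2-d6)));
  double va = d3*d6 - d5*d4;
  if (va <= 0 && d4-d3 >= 0 && d5-d6 >= 0)               // edge BC
    return add(b, mul(sub(c,b), (d4-d3)/((d4-d3)+(d5-d6))));
  double denom = 1.0/(va+vb+vc);                         // face interior
  return add(a, add(mul(ab, vb*denom), mul(ac, vc*denom)));
}
```

The callback would keep the minimum distance seen so far and shrink the query radius accordingly, letting Embree's BVH traversal prune subtrees that cannot contain anything closer.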

Thank you for your time.

Prerequisite for Embree?


This is my first post.

I'm new to Embree and really interested in learning about it. I'd like to know which courses are good prerequisites before diving deep into Embree. My goal is to make Embree as fast as possible by using advanced acceleration data structures, acceleration algorithms, and hardware optimizations.

Here are a few courses that I think might be relevant:


sunlight and glass material

Hello everybody,
I'm trying the latest Embree examples with the sphere_glass.xml test scene. I just replaced the HDRILight with a DistantLight as follows:
<DistantLight>
  <AffineSpace>
    0 0 -0.000000 0
    0 0 -0.777146 0
    0 0 -0.629320 0
  </AffineSpace>
  <L>350.000000 350.000000 350.000000</L>
</DistantLight>

Glass material appears black


I am using Embree (for a research project) to render an architecture scene in which I have two glasses on a table. It was originally a Blender project, which I exported to an OBJ file. I substituted the glasses' OBJ material with an Embree Glass material, but when I render the scene the glasses come out completely black. The code for the glasses is below, as well as an output of the renderer after 50 frames.

Does anyone know what I am missing or doing wrong?

rtcIntersect with tfar and an unnormalized direction vector?


I have a very basic question about using RTCRay with rtcIntersect: am I required to provide a normalized direction vector?

And if I don't, how is the value of tfar interpreted?
Is the endpoint ray.org + ray.tfar * ray.dir?

I'm trying to use a direction vector spanning the full distance between two points p1 and p2:
ray.dir = p2 - p1;
Can I then set ray.tfar = 1, assuming the ray will travel the whole distance from p1 to p2?

Thanks for clarifying this,


Floating point problem


As a test, I have a rectangle of size 20 centered at the origin (0,0,0) with its face normal along the x axis, and I shoot a ray from org=(254894535,0,0) with dir=(-1,0,0).

I would expect tfar to come back as 254894535, but it returns 254894528 instead. A difference of 7 is a lot when calculating the hit point; any idea what could cause this? Thanks.


Wrong normal returned

Hello again,

I'm having problems with Embree returning the wrong normal. Here's what I did to test this: create a sphere where the front faces all point outward, then shoot rays from outside towards the sphere. Often dot(raydir, Ng) comes back positive as opposed to negative.

I have no clue what's going on, and yes, I do have ROBUST set. Can you help? Thank you.
