Embree: Photo-Realistic Ray Tracing Kernels

Glass material appears as black


I am using Embree (for a research project) to render an architecture scene in which I have two glasses on a table. It was originally a Blender project, which I exported to an OBJ file. I substituted the glasses' OBJ material with an Embree Glass material, but when I render the scene the glasses come out completely black. The code for the glasses is below, along with the renderer's output after 50 frames.

Does anyone know what I am missing or doing wrong?

rtcIntersect with tfar and an unnormalized direction vector?


I have a very basic question about using RTCRay with rtcIntersect: am I required to provide a normalized direction vector?

And if I don't, how is the value of "tfar" interpreted?
Is the endpoint then ray.org + ray.tfar * ray.dir?

I'm trying to use a direction vector with the full length between two points p1 and p2:
ray.dir = p2 - p1;
Can I then set ray.tfar = 1, assuming the ray will travel the whole distance from p1 to p2?

Thanks for clarifying this,


Floating point problem


As a test, I have a rectangle of size 20 centered at the origin (0,0,0), with its face normal along the x axis, and I shoot a ray from org=(254894535,0,0) with dir=(-1,0,0).

I would expect tfar to come back as 254894535, but it returns 254894528 instead. An error of 7 is a lot when computing the hit point; any idea what could cause this? Thanks.


Wrong normal returned

Hello again,

I'm having problems with Embree returning the wrong normal. Here's how I tested it: I created a sphere whose front faces all point outward, shot rays from outside toward the sphere, and dot(raydir, Ng) often comes back positive instead of negative.

I have no clue what's going on, and yes, I do have the ROBUST flag set. Can you help? Thank you.

Performance difference many Geometries vs. many Primitives


In my application I often have to insert single polygons into the scene from time to time.

Would it be better to insert each polygon upon arrival as an individual geometry (using rtcNewTriangleMesh), or should I maintain my own polygon buffer? I would then collect a larger number of polygons so that I can create one single geometry with many primitives when the user decides to commit the scene.

Is there a performance drawback to inserting single polygons as individual geometries versus inserting one geometry with many primitives at once?

Thank you!

Intersection Filters with Instances


I would like to use intersection filters in scenes with many instances.

But how do I set them? I tried setting them via geomID on the base geometry, as I would without instancing, but it doesn't work.

I also tried setting them in the top-level scene with the instance ID, but again no luck...

Any hints on how to set them up correctly?

many thanks,

