NVIDIA VCA

Hello, I think that with the announcement of the NVIDIA Iray VCA it no longer makes sense to consider MIC/Embree a valid choice when it comes to ray tracing or path tracing.

steve


Ok, thank you very much for the info. I'll go tell my users I won't bother with my renderer development anymore; they will be happy to hear that.

We've got a fairly large studio which now uses an Embree-based renderer (Corona) with a great quality/performance ratio. And the performance of pretty much any GPU-based solution, including the state-of-the-art Octane renderer, is pathetic compared to it. Let alone the flexibility issues and memory limitations.

I would encourage the Embree team to keep up the amazing work. It's real fun to see how quickly the "100x faster on GPU" myth is getting its butt kicked. :)
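For readers who haven't used it: Embree is Intel's CPU ray-tracing kernel library, and a renderer like the one mentioned above drives it through a small C API. Below is a minimal sketch written against the current Embree 3 API (the thread predates that version, but the idea is the same); the geometry and ray values are purely illustrative, and error handling is omitted.

// Minimal Embree 3 sketch: build a one-triangle scene and trace one ray.
// Link with -lembree3; all values here are illustrative.
#include <embree3/rtcore.h>
#include <math.h>
#include <stdio.h>

int main(void) {
    RTCDevice device = rtcNewDevice(NULL);
    RTCScene  scene  = rtcNewScene(device);

    // One triangle in the z = 1 plane.
    RTCGeometry geom = rtcNewGeometry(device, RTC_GEOMETRY_TYPE_TRIANGLE);
    float* v = (float*)rtcSetNewGeometryBuffer(geom, RTC_BUFFER_TYPE_VERTEX, 0,
                                               RTC_FORMAT_FLOAT3, 3 * sizeof(float), 3);
    v[0] = -1.f; v[1] = -1.f; v[2] = 1.f;
    v[3] =  1.f; v[4] = -1.f; v[5] = 1.f;
    v[6] =  0.f; v[7] =  1.f; v[8] = 1.f;
    unsigned* idx = (unsigned*)rtcSetNewGeometryBuffer(geom, RTC_BUFFER_TYPE_INDEX, 0,
                                                       RTC_FORMAT_UINT3, 3 * sizeof(unsigned), 1);
    idx[0] = 0; idx[1] = 1; idx[2] = 2;
    rtcCommitGeometry(geom);
    rtcAttachGeometry(scene, geom);
    rtcReleaseGeometry(geom);
    rtcCommitScene(scene);   // builds the BVH on the CPU

    // Trace one primary ray from the origin straight down +z.
    struct RTCRayHit rh;
    rh.ray.org_x = 0.f; rh.ray.org_y = 0.f; rh.ray.org_z = 0.f;
    rh.ray.dir_x = 0.f; rh.ray.dir_y = 0.f; rh.ray.dir_z = 1.f;
    rh.ray.tnear = 0.f; rh.ray.tfar  = INFINITY;
    rh.ray.time  = 0.f; rh.ray.mask  = -1; rh.ray.flags = 0;
    rh.hit.geomID = RTC_INVALID_GEOMETRY_ID;

    struct RTCIntersectContext ctx;
    rtcInitIntersectContext(&ctx);
    rtcIntersect1(scene, &ctx, &rh);

    printf("hit: %s (t = %f)\n",
           rh.hit.geomID != RTC_INVALID_GEOMETRY_ID ? "yes" : "no", rh.ray.tfar);

    rtcReleaseScene(scene);
    rtcReleaseDevice(device);
    return 0;
}

Everything here runs on the CPU: rtcCommitScene builds the acceleration structure and rtcIntersect1 traverses it for a single ray; there are also packet variants (rtcIntersect4/8/16) that map onto the SSE/AVX vector units, which is where Embree gets its speed.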

I have no clue what Corona is (it doesn't seem to be widely used), but believe me:

everything that is noisy, offline rendering for you today will be realtime with the Iray VCA.


Quote:

steve g. wrote:

I have no clue what Corona is (it doesn't seem to be widely used), but believe me:

I believe you; you clearly have no clue about lots of things ;).

Oh shit, amateurs everywhere...

Quote:

rawalanche wrote:

We've got a fairly large studio which now uses an Embree-based renderer (Corona) with a great quality/performance ratio. And the performance of pretty much any GPU-based solution, including the state-of-the-art Octane renderer, is pathetic compared to it. Let alone the flexibility issues and memory limitations.

I would encourage the Embree team to keep up the amazing work. It's real fun to see how quickly the "100x faster on GPU" myth is getting its butt kicked. :)


It's no myth that GPUs are 100 times faster than CPUs at rendering; just look at today's PC games ;)

GPUs are dedicated to rendering pictures; look at how much faster GPU solutions produce renders. But there are limitations: as you said, current GPU solutions have flexibility issues and don't match the quality of CPU-generated pictures. That is mostly a software problem, which is why high-quality GPU renders are mostly made with custom software. NVIDIA only provides solutions that increase fidelity while holding a certain render speed, because more processing power is available. Meaning that you can easily add precision levels and increase the overall quality, but at the cost of speed. Which pretty much ends in a tie between CPU and GPU rendering :D

But a GPU can almost instantly create a picture that is, for most applications, good enough, while a CPU needs its time to create a good-enough image.
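For what it's worth, the noise both sides keep mentioning has a simple source: path tracers are Monte Carlo estimators, so the error falls roughly as 1/sqrt(N) in the number of samples, and halving the noise costs about 4x the work no matter which processor takes the samples. A tiny self-contained sketch (plain C, estimating a known integral instead of shading a real pixel) shows the convergence rate:

// Monte Carlo convergence sketch: error falls roughly as 1/sqrt(N),
// so each 16x increase in samples cuts the error by only ~4x.
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

int main(void) {
    srand(1234);
    // Estimate the integral of x^2 over [0,1]; the exact value is 1/3.
    const double exact = 1.0 / 3.0;
    double sum = 0.0;
    long n = 0;
    for (long target = 16; target <= 1 << 20; target *= 16) {
        for (; n < target; n++) {
            double x = (double)rand() / RAND_MAX;
            sum += x * x;
        }
        double estimate = sum / (double)n;
        printf("samples = %8ld  error = %.6f\n", target, fabs(estimate - exact));
    }
    return 0;
}

That sublinear convergence is why "good enough almost instantly" and "clean enough for final frames" are so far apart in compute cost, on CPU and GPU alike.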

Ok, but what's the point of posting this on a forum dedicated to CPU rendering, and why do you need to create so many different fake accounts, one for each of your arguments?
