Fucking bug. What does /3/ think of the Nvidia Iray renderer?
It's still young, which makes it still useless too.
>Sam Hyde, leading Linux developer and real life shitposter
>It's still young
>translation: Because I haven't heard much about it or used it, it's young and bad
Bruh, it's been around since like, 2009 and has become one of the best ray-tracers around. It is much more feature-rich than Arnold and even LuxRender. It can utilize all the GPUs and CPUs in your system, and on your networked computers. It even naturally does light dispersion (splitting light into its colors). And many 3D packages have begun adopting it.
How is it more feature-rich than Lux?
I think it's great to have many options.
If it gives me the ability to create pretty pictures even faster, I'll like it.
I still don't understand the model behind this product.
Is it going to be free for everyone to use?
Is it going to be a paid-for plugin?
Is it something that companies have to pay for themselves to get it in their packages?
Looks interesting, though. I am looking forward to a GPU-accelerated Arnold-like.
The model is: look at this new shit, get hyped.
It's already integrated into 3ds Max and Substance Designer; you don't pay extra for it in either of them. It all comes down to the software company, whether they choose to license and integrate it themselves. The plugins Nvidia is providing will likely cost a bit of money, considering that the beta tests have a license expiry date. But that doesn't mean for sure it will cost money, and if it does, it will at least be decently priced.
> It can utilize all the GPUs and CPUs in your system, and on your networked computers.
Which renderer *can't* do that? That'd be pretty shit-tier. No problem to do this with e.g. Cycles.
> It even naturally does light dispersion
Many other renderers can do that as well. Cycles is the only one I can think of that can't, but e.g. LuxRender, V-Ray, Octane, Mitsuba, ... can.
Not that I think anybody gives a shit, really. If light dispersion is somehow essential to telling your story, you can just map it to the wall in five minutes with a colored light(-map). Not only does that work anywhere, it's also gonna be easier to control and way faster to render.
>Which renderer *can't* do that
No other production renderer currently does distributed GPU rendering across GPUs on a network. They'll distribute across CPUs, yeah, but that's nothing special.
>Many other renderers can do that as well
Only a few do it, basically the ones you listed, and maybe Maxwell. LuxRender's implementation is complicated and doesn't work with all of the other features. Iray's is a single option you turn on and it just works with all your materials/lights.
We're talking about photorealism here breh, any ray-tracer is good enough if you're just trying to tell a story.
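For anyone wondering what dispersion actually means under the hood: the index of refraction varies with wavelength, so blue refracts more than red and white highlights split into rainbow fringes. A minimal sketch below uses Cauchy's two-term equation; the A/B coefficients are just illustrative values in the ballpark of common glass, not data for any specific material or renderer.

def cauchy_ior(wavelength_nm, A=1.5046, B=4200.0):
    # Cauchy's equation: n(lambda) = A + B / lambda^2, wavelength in nm
    return A + B / (wavelength_nm ** 2)

for wl in (450, 550, 650):  # blue, green, red
    print(wl, round(cauchy_ior(wl), 4))
# ~1.525 for blue vs ~1.514 for red: blue bends more, which is the
# whole visible effect a spectral renderer is simulating.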
Any renderer that can render on GPU and CPU can be used for distributed rendering across a network on GPUs and CPUs. Worst case, you have to do a few minutes of automation work to move over the scene data, set up a seed and kick off the jobs. Ain't no thing to set up e.g. Cycles to render across a bunch of different machines, using the GPU on some, using the CPU on some, using both on some, whatever you feel like.
It doesn't matter how many samples each device in your network gets done; in the end you can always weight them all into one image, and that image will be as if you rendered on a single machine with samples equal to the sum of everything your machines rendered.
Is it going to be as nice as using Tractor? No, but it gets the job done for sure, and it works with any renderer (basically all renderers can be started/controlled by scripts).
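To make the "weight them all into one image" part concrete, here's a rough sketch assuming numpy and that each machine's float framebuffer is already loaded; the variable names and sample counts are made-up placeholders, not anything a specific renderer spits out.

import numpy as np

def merge_renders(images, samples):
    # images:  list of float arrays, all the same (H, W, 3) shape
    # samples: how many samples each machine rendered of the same frame
    total = sum(samples)
    out = np.zeros_like(images[0], dtype=np.float64)
    for img, n in zip(images, samples):
        out += img.astype(np.float64) * (n / total)
    return out

# e.g. three boxes rendered 128, 256 and 512 samples:
# merged = merge_renders([img_a, img_b, img_c], [128, 256, 512])
# Statistically that's one 896-sample render, as long as every machine
# used a different seed so the noise is independent.

For the kickoff side, something like blender -b scene.blend --python-expr "import bpy; bpy.context.scene.cycles.seed = 7; bpy.context.scene.cycles.device = 'GPU'" -f 1 per machine (different seed on each) is one way to do it with Cycles; the filename and seed value here are obviously placeholders.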
> Only a few do it, basically the ones you listed, and maybe Maxwell.
> We're talking about photorealism here breh
So, what you're saying is... basically every renderer there is (sans Cycles) with a focus on photorealism can do it? I mean, how many more are there? Indigo can also do it, Arion can do it, mental ray can do it, KeyShot can do it, and now that's starting to get pretty obscure...
Yeah, RenderMan can't do it, but that one doesn't focus on realism.
Pretty good. Scales well across my servers with low-cost "gaming"-grade GPUs.
Still lacks a working proxy system, but it can handle 6k renders pretty well if you know what you are doing.
6k as in resolution? Is that a special thing? I've done 20k by 20k renders with Cycles before.
how much do you ask for renders like this? they're noisy as fuck. my clients would never be satisfied with that
Your client loves you very much, but wishes you would get a job, find a nice girl, and move out of the basement.
Source: I fucked your client last night.
no but seriously tho, why are your renders so noisy
It only looks noisy at full scale. This is clearly his "before reduction" render. Once scaled down, it looks perfectly fine. Rendering at a higher resolution is always better than just turning up your AA, in terms of image quality.
In fact, once scaled down, the noise adds color variation, like you get when taking a photo, which adds to the realism. So you don't need to do as much work with surface detailing.
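The downscale step being described is just a resize with a decent filter. A rough sketch with Pillow; the 2x factor and the filenames are placeholders, not anyone's actual pipeline.

from PIL import Image

SCALE = 2  # rendered at twice the delivery resolution
big = Image.open("render_12k.png")   # hypothetical oversized render
small = big.resize((big.width // SCALE, big.height // SCALE),
                   resample=Image.LANCZOS)  # a proper filter, not nearest neighbor
small.save("render_6k.png")
# Each output pixel now averages a block of rendered samples, so the
# per-pixel noise drops, much like adding AA samples.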
That's a test render.
Yep, that's pretty much my workflow on Iray.
Oh yeah, looks so much better now. It looks like shit.
It isn't noise.
It is, uh, STYLIZED BUMP MAPPING!
Yeah that's it.
>Not knowing how to downscale.
Lmao, nice try, troll. What is that, fucking nearest-neighbor interpolation?
This was likely only a minute in Iray.
>only a minute in Iray
what does that even mean? fast? slow? average?
you angsty retard
what? 6k? 20k?
what are these resolutions for, you retards? do you even know what you're doing?
6k is IMAX size. The guy talking about 20k is just a retard.
>Yeah, RenderMan can't do it, but that one doesn't focus on realism.
Nigga you wut?
Not that anon, but thank you /3/. Thank you for being a place where people slam your interpolation algorithm like normal people make fun of haircuts. Fuck I love you guys.
>Just clamp my shit up
There was no clamping involved there, anon... The fireflies seen in the full-scale render are only 1 pixel big, so when you scale down, they get blended with the other colors and simply become more natural-looking noise.
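Tiny numeric illustration of that (values made up): a single blown-out pixel loses almost all of its punch under an averaging downscale, but survives verbatim under nearest neighbor.

import numpy as np

block = np.full((4, 4), 0.2)  # a 4x4 patch of ordinary pixels
block[1, 2] = 60.0            # one 1-pixel firefly

print(block.mean())   # box/area downscale of the patch to one pixel: ~3.94
print(block[1, 2])    # nearest neighbor could hand you the 60.0 verbatim
# Averaged in with its neighbors, the firefly reads as a slight
# brightness/color wobble instead of a hard white dot.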