I don't know what these mean.
Why should anyone be looking forward to these?
These seem like features geared more to the high poly concept art crowd (like Keyshot)?
Are these just marketing buzzwords?
path tracing is a lighting algorithm for GI; it's a variant of ray tracing
>geared more to the high poly concept art
if you're not gonna use substance designer as your main renderer it won't do much for you
unlimited resolution is pretty buzzwordy
>if you're not gonna use substance designer as your main renderer it won't do much for you
Understood, but this type of rendering is geared more toward highpoly, with advanced reflections and stuff too intense for (or just not applicable to) typical game art/real time presentation, right?
I THINK that rendering approach is used for more of a "bells and whistles, whole kit and caboodle plus the kitchen sink" type of render.
>materials with nothing but basic colors and values
the absolute madman
the point of it is for people who want to use Substance for film/CG. up until now it has mostly been seen as something for game assets, but this helps change that: you can lookdev within the app and then export your desired maps, properly calibrated for ray tracing.
>Why should anyone be looking forward to these?
its a marketing thing. "They" "have" to have something "new" even though this is very old tech, and "we" - the pros - don't actually use Iray or yibis shaders in production, not even for preview. Look at SIGGRAPH studio papers, by say Ready At Dawn, for how controlled it is in a typical film/game/commercial shop.
Substance just continues to "exist", not really creating anything new, just following trends.
It's currently not possible to do good quality ray tracing in real time. It's for offline rendering.
"we" are starting to use Iray in the industry, for production, because of how powerful it is compared to other path tracers, and how well it scales across multiple GPUs while also utilizing CPUs. It's the fastest path tracer out now, and one of the most fully featured, if not the most.
With a high end GPU, Iray will converge to a good-enough result for judgment purposes in less than 6 seconds. The progressive rendering is the same quality as an "offline" render; it just gradually clears up the graininess.
If you guys can't understand why this is useful to have in Substance, then you're sorely misguided or just plain stupid.
That jibe caused me to chortle heartily, I commend you sir.
>ominous, increasingly loud music
>WUB WUB WUB
if you're using Substance Painter then you should consider getting Designer too.
besides that, it's the most uncomplicated tool for game texturing i've ever used.
i like Substance Designer for its easy UI and really good engine for making game textures. i can also get incredible generated detail without having to tweak it all in Photoshop at heavy resolutions. my one minus point is that Designer still needs some more tools.
path tracing is way better than ray tracing, so yeah, it's a big deal
didn't know the DM doesn't show, here's a youtube link
Path tracing is a ray tracing technique.
Usually "path tracing" is used to describe a brute-force method whereby you just shoot rays out from the camera, while "ray tracers" shoot rays from lights and use a bunch of interpolation tricks and maps to approximate things: techniques like Final Gathering, Irradiance Particles, and Importons.
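To make the "just shoot rays out from the camera and brute-force it" idea concrete, here's a toy Monte Carlo sketch in Python. This is not real renderer code: `trace_path` is a hypothetical stand-in for a full camera-ray bounce, returning a noisy radiance sample around a made-up true value of 0.5.

```python
import random

def trace_path(rng):
    # stand-in for "shoot a ray from the camera, bounce it around the
    # scene, and return the radiance carried back along that path";
    # here it's just a noisy measurement of a true value of 0.5
    return 0.5 + rng.uniform(-0.5, 0.5)

def render_pixel(num_samples, seed=0):
    # brute-force Monte Carlo: average many independent path samples
    rng = random.Random(seed)
    total = 0.0
    for _ in range(num_samples):
        total += trace_path(rng)
    return total / num_samples  # estimate of the pixel's radiance

print(render_pixel(10))     # few samples: noisy/grainy
print(render_pixel(10000))  # many samples: converges toward 0.5
```

The point is there's no precomputed map or interpolation step anywhere; accuracy comes purely from throwing more samples at each pixel.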
Iray blends both methods depending on which samplers you have enabled, and has supported light dispersion for three years now.
And despite being a GPU-accelerated path tracer (it uses all your GPUs and CPUs, even networked ones), it supports a lot of non-physical effects and control, including passes.
And it has a pretty nice feature to help work around the limited amount of memory GPUs have.
They've really developed quite a decent production ready renderer that is accelerated immensely by consumer grade GPUs.
They said "unlimited resolution" there because the person mentioned Keyshot, which has an artificial limit on the resolution you're allowed to render.
>Usually "path tracing" is used to describe a brute-force method whereby you just shoot rays out from the camera, while "ray tracers" shoot rays from lights and use a bunch of interpolation tricks and maps to approximate things: techniques like Final Gathering, Irradiance Particles, and Importons.
it's the same fucking thing mate, don't get bogged down in semantics pls
iray is a meme btw. Hi nvidia shills
It's the same thing and it's not. Yes, a ray is being traced, but "path tracer" implies a different type of implementation: it usually implies "unbiased", while a biased renderer will be called a "ray tracer". Just stating a fact of how the terms are usually used.
every engineer uses these terms differently. The fact is that you basically shoot rays and wait for convergence in "both" cases, kid
Path tracers progressively converge, using the first set of samples to then sub-sample further, on and on for as long as you like. Ray tracers don't converge: they shoot a preset number of rays and interpolate based on that strict amount, kid.
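The "converge for as long as you like" part boils down to a running average you can stop at any time. A minimal sketch, where `sample_radiance` is a made-up stand-in for one path sample (not any real renderer's API):

```python
import random

def sample_radiance(rng):
    # hypothetical noisy path sample around a true value of 1.0
    return rng.gauss(1.0, 0.3)

def progressive_render(max_samples, seed=1):
    rng = random.Random(seed)
    mean = 0.0
    for n in range(1, max_samples + 1):
        x = sample_radiance(rng)
        mean += (x - mean) / n  # incremental running average
        # a progressive renderer would refresh the displayed image here,
        # so you can stop (or just judge the result) at any point
    return mean

print(progressive_render(16))    # early, grainy estimate
print(progressive_render(4096))  # later, much cleaner estimate
```

A fixed-budget ray tracer, by contrast, decides the sample/point counts up front and gives you one final image rather than an ever-refining one.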
you shoot out rays in both cases and wait for the noise to clear up. Path tracing is ray tracing extended to paths. It's a stochastic (random) method. You have terms like MCPT and MCRT. Read a book or two, idiot, and not some blog.nvidiashill.kiddie.dum
You don't wait for the noise to clear up with a proper ray tracer, you dunce, just like you don't wait for noise to clear up in your fucking games. With a renderer like mental ray, V-Ray (classic, not RT) or RenderMan, you have a predefined point density that hits the scene and is stored in maps and memory; those points are then interpolated and used to calculate the shading via a rasterization method. Whereas what people know as "path tracers" typically use a stochastic method (usually Monte Carlo based) that just continues to sub-sample further until you get the result you want: whether the noise clears or not is all about how long you're willing to wait, instead of the properties you set beforehand.
Iray employs its own version of Metropolis light transport, which involves a precompute operation, so its operation is not purely stochastic. And the fact that it does light dispersion, render passes and LPEs is further evidence of that.
you're just another blind shill.
Come back at me when shitty-ass Substance has any new, original features worth mentioning, not just 1997 Eric Veach papers
-over and out
You suck at trolling m8, find a better day job.
What happened itt guize. Cliffs on this thread pls.
did someone say path tracing ?
From what I can tell, SD got a new alternate renderer.
It's not real time/indicative of a game engine render, nor does it seem aimed at that crowd.
Instead, it's a fancy-schmancy renderer with a lot of bells and whistles that are not available in (or too heavy for) real-time renderers.
That said, it seems to be catered more toward the traditional 2D/CG crowd.
High poly stuff, rendering "concept" models, which has become popular a la Vitaly Bulgarov (who uses Keyshot); it also seems to be looking to entice that crowd.
Added to SD for free. Program keeps getting better.
Great bang for your buck IMO.
>He paid for a program
Imagine the dough they could rake in if they made this a regular renderer instead of a game engine one.
I've got no earthly idea what's stopping them. People are fed up with "upgraded" mental ray or V-Ray taking four minutes to render a fucking cube while game engines surpass them at light speed.
I understand that the classic methods ray tracers use are different and take much longer, but why has nobody come up with a new renderer that can do this with some smart algorithm like path tracing, so we only wait seconds instead of hours?
I remember something called "Mitsuba" which uses some local point data technique to create GI within seconds. It's not as accurate as ray tracers of course, but it's okay considering it only takes seconds and can run in real time.
It's not that they can't render a nearly-perfect image really fast; most path tracers can.
The small noise is what takes so long to cancel out. I can move around in the Blender viewport with Cycles too, and wait a second until the image clears up a bit. Dimly lit areas still take thousands of samples and a ridiculous number of light bounces until the image is clear, and that's a step they simply left out.
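The reason that last bit of grain takes forever: Monte Carlo noise only falls off as 1/sqrt(N), so halving the noise costs roughly 4x the samples. A toy demonstration with plain random numbers (an illustration of the statistics, not an actual renderer):

```python
import math
import random

def noise_of_mean(num_samples, trials=2000, seed=0):
    # measure the RMS error of an N-sample Monte Carlo average,
    # repeated over many trials; the true mean of uniform(0,1) is 0.5
    rng = random.Random(seed)
    sq_errs = []
    for _ in range(trials):
        mean = sum(rng.uniform(0.0, 1.0) for _ in range(num_samples)) / num_samples
        sq_errs.append((mean - 0.5) ** 2)
    return math.sqrt(sum(sq_errs) / trials)

for n in (16, 64, 256):
    print(n, noise_of_mean(n))
# each 4x increase in samples roughly halves the noise
```

Dim areas are hit even harder because the signal is small to begin with, so the relative noise stays visible far longer than in bright regions.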
You mean their Virtual Point Light renderer? That's only meant to be their viewport renderer. Anyway, Mitsuba is meant as a research renderer, used to test out new algorithms, and never recommended for production use. That's why you'll find things in there like the Adjoint Particle Tracer, which is literally the opposite of traditional rendering. And despite how inefficient APT rendering is, it's still kind of fast in Mitsuba.
>Dimly lit areas still take thousands of samples and a ridiculous number of light bounces until the image is clear
Isn't that a problem that Metropolis was supposed to solve along with caustics?
MLT doesn't "solve" that. The render times for animations are still long.
I didn't mean it like that. I mean like doesn't it help reduce the noise in those dim areas a little faster so you don't have noisier areas than others?
Maybe, depends on a lot of factors.
Like what? Scene complexity?
mate... most plugin authors aren't good at predicting mutations well enough, and in the end MLT's usefulness is limited to caustics and a slightly open door with light coming through into a dark room - two cases that don't come up very often in production. Most scenes are well lit during the day, or have many lights at night (think NYC Times Square after dark).
>People are fed up with "upgraded" mental ray or V-Ray taking four minutes to render a fucking cube while game engines surpass them at light speed.
I don't fucking need an autistically accurate rendering technique for most of my stuff and would be completely satisfied with the rendering techniques today's games use.
also, I wish I could switch between renderers without having to set up all my materials again just to use a different renderer.
unlimited resolution as in image dimensions, not muh Euclideon bullcrap.
I dunno, he had that guy going for a while. I'd have told him to fuck off after the second reply.