/3/ - 3DCG






File: maxresdefault(6).jpg (104 KB, 1280x720)
> /3/ will defend this
>>
you can tell which is raytraced and which is not.

your point?
>>
File: 246829.jpg (936 KB, 3200x2200)
>>
>>616564
This is the kind of content /3/ needs more of
>>
>>616565
You just need eyes m8
>>
Defend what? What am I looking at?
>>
one is for a game
another is solely for animation
>>
>>616564
2017 video game frame render time: 33 milliseconds
Shitty Pixar frame render time: 16 hours

Blender's EEVEE is going to fuck your shitty ray tracers, boyos.
>>
>>616572
EEVEE is not a raytracer, correct me if I'm wrong
>>
>>616561
Toy Story didn't use ray tracing.
>>
>>616558
>Video game which has to deal with logic, a free camera, shitty hardware and the possibility of running slow due to an unexpected number of objects looks worse than pre-rendered frames
Duh?
>>
>>616572
Anon, EEVEE is not a raytracer like Cycles. It's a real-time rendering/visualization engine, which is useful, but not for rendering realism the way Cycles does.
>>
>>616576
Path-tracing wasn't used for production rendering until 1998, and the first animated film rendered entirely that way came out in 2005.
Everything that came before was certainly ray-traced, and you can tell by the cleanliness of the image. Just look at the shadows here: >>616564 they're sharp at all distances with no falloff or variance of any kind; it's obviously ray-traced.

As for the geometry difference, Toy Story, along with many early 3D productions, was modeled with NURBS, since Maya didn't even have polygons at that time and RenderMan only accepted NURBS surfaces anyway, so it goes without saying that all surfaces would have flawless curvature.

>>616581
Cycles isn't a ray-tracer either, it's a path-tracer. A ray-tracer will never produce noise in an image regardless of quality settings unless you pair it with brute-force GI, while noise is intrinsic to path-tracing, and higher-quality settings just sample more rays per pixel to reduce it.
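
The noise/sample relationship is easy to see for yourself: a path-traced pixel is a Monte Carlo average, so its noise falls off roughly as 1/sqrt(samples). Quick toy sketch in Python (the fake "radiance" distribution is made up purely for illustration, it's not from any real renderer):

import random
import statistics

def noisy_radiance_sample():
    # Stand-in for tracing one random light path through a pixel:
    # most paths return little light, a few return a lot (high variance).
    return 5.0 if random.random() < 0.1 else 0.1

def pixel_estimate(samples_per_pixel):
    # A path tracer averages many random path samples per pixel.
    return sum(noisy_radiance_sample() for _ in range(samples_per_pixel)) / samples_per_pixel

for spp in (1, 16, 256, 4096):
    # Render the "same pixel" many times and measure how much it flickers.
    estimates = [pixel_estimate(spp) for _ in range(200)]
    print(f"{spp:5d} spp -> noise (std dev) ~ {statistics.stdev(estimates):.4f}")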
>>
>>616582
How and where do I learn this stuff
>>
>>616582
Wasn't Toy Story rendered with a ray-trace alternative?
For some reason I want to say they claimed they were using in-house render methods.
>>
>>616558
I did some math a while back comparing the per-frame compute of the render farm they used for Toy Story against what a top GTX 1080 can output, and while I can't remember the exact numbers, it came out over 100x too slow.
Considering rendering optimizations I expect a few next-gen Volta cards could do it, but current-gen consoles? No fucking chance.
>>
>>616582
Toy Story did not use any ray tracing whatsoever, not even for the shadows. trace() returned black until you set up a ray server, which wasn't possible until RenderMan 3.8 through DSO shadeops, and that was long after Toy Story. The first Pixar feature to use ray tracing was A Bug's Life, which used BMRT as a ray server.

Toy Story had 0 ray tracing.
>>
>>616587
Hmm... looking it up, it seems it used a system called REYES. Honestly I'm not familiar enough with archaic rendering systems to tell what disadvantages it had compared to raytracing (it's late and I can't be bothered to search for answers), but I suspect it's something of a predecessor to it.

Since you seem to be technically knowledgeable about how these things actually worked, it'd be nice if you could give a rundown. Like, if the shadows weren't even traced, then what are they? A fucking stencil or something?
>>
>>616587
so if it had no raytracer i assume it used something like this?
>>
>>616558
>>616564
Hello /v/, now get out.
>>
>>616595
Depth maps, but you had far greater control over them than other renderers commonly offered. For example, shadow() has a blur argument you could use to give a depth map softer edges as the point being shaded got further from the light source and the contact point (artist-specified) of the object casting the shadow. It looked close enough to a ray-traced soft shadow, but several orders of magnitude faster. REYES (Renders Everything You Ever Saw) was the badass scanline renderer of its time, especially for motion blur, DOF (thanks to software patents), shading and displacement, and just overall level of control. But it is eclipsed now by ray tracing.
I don't blame you for not wanting to look this shit up, it's old and outdated. The only reason I know about it is because I am old, but not yet outdated. However, with all the excitement surrounding Substance Designer it's worth looking into this stuff a bit; procedural texturing is another strength of PRMan.
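
For anyone curious what a depth-map shadow with distance-driven blur even means, here's a rough Python sketch of the general idea (this is not RSL and not how PRMan actually implements shadow(); the depth map and all the numbers are placeholders):

import random

# Fake 64x64 depth map as seen from the light: an occluder sits at depth 5.0
# over the middle of the map, everything else is "infinitely" far away.
SIZE = 64
depth_map = [[5.0 if 20 <= x < 44 and 20 <= y < 44 else 1e9
              for x in range(SIZE)] for y in range(SIZE)]

def shadow(u, v, point_depth, blur=0.05, samples=16):
    # Fraction of the light blocked at light-space coords (u, v).
    # 'blur' plays the role of the blur argument described above: the filter
    # radius grows with how far the shaded point sits behind the occluder,
    # faking a soft penumbra without tracing a single ray.
    occluder_depth = depth_map[int(v * SIZE) % SIZE][int(u * SIZE) % SIZE]
    radius = blur * max(point_depth - occluder_depth, 0.0)
    blocked = 0
    for _ in range(samples):
        du = (random.random() - 0.5) * 2.0 * radius
        dv = (random.random() - 0.5) * 2.0 * radius
        x = min(max(int((u + du) * SIZE), 0), SIZE - 1)
        y = min(max(int((v + dv) * SIZE), 0), SIZE - 1)
        if depth_map[y][x] < point_depth:  # something sits between light and point
            blocked += 1
    return blocked / samples

# A point just behind the occluder gets a hard shadow,
# a point far behind it gets a soft, partially lit one.
print(shadow(0.5, 0.5, point_depth=5.5))
print(shadow(0.5, 0.5, point_depth=50.0))

Real shadow() lookups filter the map far more carefully than this, but the principle is the same: all the softness comes from filtering a 2D depth image, not from extra rays.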
>>
>>616596
REYES is basically a scan line renderer. Pixar has patents protecting the stuff that made scan line rendering amazing, which of course held the industry back and granted Pixar a near monopoly on rendering software for feature films.
>>
>>616601
Thanks for the explanation. I'm guessing it was rather ahead of its time, but also took a lot of hacks and manual labor to get things done.

>>616603
I'm impressed that it's scanline, but mostly because as you said, every other such renderer was basically just shit in comparison.
>>
>>616564
>>616558
All Pixar movies are rendered with RenderMan, and RenderMan is not a ray-tracer; it's REYES rendering, which basically cheats lighting with no raytracing. As a matter of fact, the first movie where Pixar used raytraced lighting was Monsters U.
>>
>>616558

shitty troll post, in the trash it goes
>>
>>616605
One of the chief ideas behind REYES is that everything in CG is a hack, or an illusion, anyway. So why not give artists a deep level of access to the technology to make the kind of image they want as quickly as possible, with as few restrictions as possible? When RSL first clicked for me it was like having my mind expanded: I actually saw the computer as an artistic tool on its own merits, not something trying to emulate other media. I felt completely in control.

>>616608
RenderMan has had raytracing natively for a while now. They even have a fancy machine-learning denoiser so you can get your frames even faster!
>>
>>616582
Maya came out in early '98. It has nothing to do with Toy Story.
>>
File: 1519863436670.jpg (28 KB, 419x420)
>>616572
>Shitty Pixar frame render time: 16 hours

keeeeek
you're actually bitching about the first full-length 3D animated movie ever made

Pixar practically invented both the hardware and the software required to create this

if you don't think it's an incredible achievement, better kys blenderfaggot, because everything CGI-related owes something to the pioneers who came before and pushed the limits of the technology
>>
>>616618
The Blender community seems to be full of underage faggots that have no sense of history or reverence for the things that came before. Blender would not exist if not for the pioneering research that Pixar has done for 3D animation, research that they give away for free, tech that is probably stolen and built into Blender just so that faggots like this >>616572
can come up and brag about how Blender is gonna burn some shit down.
>>
>>616618
>>616621
precisely
thank you anons
>>
>>616582
path tracing is a form of ray tracing, you Claude.

>>616583
You stop believing everything everyone tells you on /3/ to start with.
>>
>>616572
>Pixar: Played a hand in basically everything related to 3D today
>Anon: Still living with his mom
>>
>>616564

That's UE4 for you. Overrated as fuck by its fanboys. Sure, it's an amazing engine for making games easily, but the graphics side sucks big time.
>>
>>616639
>path tracing is a form of ray tracing, you Claude.
Not him, but the blurry line between ray tracing and path tracing is mostly a PR issue. Ray tracing has amazing publicity. In the eyes of the public, ray tracing gives you photorealistic images out of the box. Path tracing is virtually unknown and not a sexy term at all.

While ray tracing and path tracing are two different techniques, path tracers are often marketed as "physically unbiased ray tracers", which blurs the line between the two. Even competent people can have trouble knowing which is which.

What compounds the issue is that you can, relatively simply, turn a ray tracer into a path tracer, and vice versa. The algorithms are almost the same.

For everyone's information: a ray tracer launches a single ray per pixel, computes the point of intersection, and then launches secondary rays toward the lights of the scene. There is also usually a ton of trickery at this stage to get a relatively realistic image. When a pixel has been computed, it is done.

A path tracer casts 3000+ rays per pixel, computes the point of intersection for each of them, and then bounces them randomly around the scene until they reach a light or escape the scene entirely. The more rays you launch, the better (i.e. more realistic) the pixel. Images stay noisy until enough rays have been computed.
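
If it helps, here's a toy Python sketch of the difference in loop structure. The scene is just one grey diffuse sphere lit by a directional sun and a flat sky; it's a sketch of the two algorithms, not code from any actual renderer:

import math
import random

# Toy scene: one grey diffuse sphere, a directional "sun", and a dim flat sky.
SPHERE_C, SPHERE_R = (0.0, 0.0, -3.0), 1.0
SUN_DIR = (0.577, 0.577, 0.577)        # roughly normalized
SUN_COL, SKY_COL, ALBEDO = 3.0, 0.2, 0.7

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def norm(a):
    l = math.sqrt(dot(a, a))
    return tuple(x / l for x in a)

def hit_sphere(orig, d):
    # Nearest intersection with the sphere: (normal, point) or None.
    oc = sub(orig, SPHERE_C)
    b = dot(oc, d)
    disc = b * b - (dot(oc, oc) - SPHERE_R ** 2)
    if disc < 0:
        return None
    t = -b - math.sqrt(disc)
    if t < 1e-4:
        return None
    p = tuple(o + t * x for o, x in zip(orig, d))
    return norm(sub(p, SPHERE_C)), p

def raytrace_pixel(orig, d):
    # "Classic" ray tracer: one primary ray, one shadow ray to the sun, done.
    hit = hit_sphere(orig, d)
    if hit is None:
        return SKY_COL
    n, p = hit
    if hit_sphere(p, SUN_DIR):                       # shadow ray
        return 0.0
    return ALBEDO * max(dot(n, SUN_DIR), 0.0) * SUN_COL

def cosine_bounce(n):
    # Cosine-weighted hemisphere direction: normal + random unit vector.
    while True:
        v = tuple(random.uniform(-1, 1) for _ in range(3))
        if 1e-6 < dot(v, v) <= 1.0:
            break
    return norm(tuple(a + b for a, b in zip(n, norm(v))))

def pathtrace_pixel(orig, d, spp=512, max_bounces=4):
    # Path tracer: many random paths per pixel, bounced until they escape
    # to the sky (the sun, being a delta light, is never hit by random bounces).
    total = 0.0
    for _ in range(spp):
        o, dd, throughput = orig, d, 1.0
        for _ in range(max_bounces):
            hit = hit_sphere(o, dd)
            if hit is None:
                total += throughput * SKY_COL
                break
            n, p = hit
            throughput *= ALBEDO                     # diffuse bounce
            o, dd = p, cosine_bounce(n)
    return total / spp

cam_dir = norm((0.0, 0.0, -1.0))
print("ray traced :", raytrace_pixel((0.0, 0.0, 0.0), cam_dir))
print("path traced:", pathtrace_pixel((0.0, 0.0, 0.0), cam_dir))

The two pixel values won't even match here, because the directional sun is a delta light that random bounces can never hit; real path tracers get around that by giving lights actual area or by sampling them explicitly.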
>>
>>616662
But if a 3D artist wants to render his shit in a game engine, what else can he use? It's not like there are many options available out there, only CryEngine and Unity. No wonder UE4 got so popular and loved.
>>
>>616673
companies could write a new engine if they weren't busy spending $100 million on voice acting
>>
File: 1523346249383.jpg (116 KB, 1280x720)
>>
>>616678
>Already explained why it looks different
>Continues to beat the dead horse
Stop embarrassing yourself.
>>
>>616682
>you can't complain!
>>
>>616683
If your complaint is "Why don't real time graphics look as good as pre-rendered!", then no, you can't.
>>
>>616568
this...
>>
>>616673
>But if a 3D artist wants to render his shit in a game engine, what else he can use?
Octane 4 with Brigade integration, coming Soon(TM). (Octane 3.07 is already available in Unity, though not very useful yet for a standard game production.)
>>
>>616669
Does that mean it's impossible to use a true point light source in path tracing, because the rays would never intersect it?
>>
>>616662
Everything mentioned in the right part of the pic can be achieved in UE4 (realistic cloth material, smooth meshes, floor reflections, specular reflection on the eyes). KHIII just skipped those features, probably to run on shitty consoles.
>>
>>616784

Unreal still looks like shit compared to most engines out there. I'm not saying it's a shit engine (I would not use anything else to make a game on my own), but the graphics are not nearly as impressive as the unreal drones want to believe. The outdoors are especially disgusting.
>>
>>616786
Eh, it's probably not quite as optimized as the likes of Frostbite or AnvilNext, but considering it has a wider use-case I think it's quite powerful. A game is only as beautiful as its devs make it, and I reckon a lot of UE4 devs skimp on their games' graphics. Understandably, since they're often indies nowadays.
Are there any features missing from UE4 that you're thinking of? Or are you just talking about the worse performance?
>>
>>616790
>A game is only as beautiful as its devs make it

Not entirely true. Engines have different lighting, AA, shaders and GI. You could have the best models and textures, but if the shaders and lighting look bad they will drag your models down.

What's really missing from UE4 is real-time GI (something like SVOGI), which makes games look almost twice as good (look at the VXGI integration videos for Unreal, it makes the engine look way better). CryEngine looks much better than UE4 and it performs better with large scenes too. And for some reason performance drops if you add trees with the foliage tool, but if you add the same amount of trees manually it doesn't affect performance at all. That's retarded.

I also think Unreal's post-processing effects look cheap. I'd like to have custom lens flares and a better-looking grain filter (for horror games). The anti-aliasing looks blurry, which makes foliage look cartoony and pasty. Unreal has always struggled with AA for some reason.

Still a very solid engine for making games, especially if you're an artist who hates code.
>>
>>616782
Exactly!

Path tracing needs realistic light sources, i.e. light sources with volume. Point light sources are never reached by the scene's rays. That is why all lights in path tracing have volume, or are sometimes simply a surface's shader (an emissive material).

Some path-tracing methods can accommodate point light sources, most often when rays are cast from both the camera and the lights and joined in the middle, but it is a good idea to always give your light sources some volume.
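
You can convince yourself of the point-light problem with a few lines of Python: fire uniform random rays from a shading point at a spherical light, count how many hit it, then shrink the light. (Toy numbers only; real path tracers also deal with this by sampling the lights directly rather than hoping a bounce finds them.)

import math
import random

def random_direction():
    # Uniform random direction on the unit sphere (rejection sampling).
    while True:
        v = [random.uniform(-1, 1) for _ in range(3)]
        d2 = sum(x * x for x in v)
        if 1e-9 < d2 <= 1.0:
            d = math.sqrt(d2)
            return [x / d for x in v]

def ray_hits_sphere(origin, direction, center, radius):
    # True if the ray (origin, direction) intersects the sphere in front of it.
    oc = [o - c for o, c in zip(origin, center)]
    b = sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    return b * b - c >= 0.0 and -b > 0.0

shading_point = [0.0, 0.0, 0.0]
light_center = [0.0, 5.0, 0.0]     # a spherical light 5 units above the point

for radius in (1.0, 0.1, 0.0):     # radius 0.0 = an ideal point light
    hits = sum(ray_hits_sphere(shading_point, random_direction(),
                               light_center, radius)
               for _ in range(200_000))
    print(f"light radius {radius}: {hits} / 200000 random rays reach it")

With radius 1.0 roughly 1% of the rays find the light, with 0.1 about 0.01%, and with an ideal point light effectively none, ever.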
>>
>>616793
I agree that GI would be amazing to have out of the box in UE4. I think Godot got something like that not too long ago, so hopefully that'll put some pressure on Epic in that regard.
The foliage tool and most of the post-processing are pretty crap, for sure. I feel like those are more like cheap tools to let noobs implement effects quickly without too much effort. I have a background in shaders, so I usually get better results doing stuff like filters and blur with custom solutions. But having better out-of-the-box solutions would be great, yeah.
>>
>>616796
Huh, that's really interesting. I just finished writing a raytracer in OpenGL for a class; maybe I'll try adding area lights and implementing this path tracing.
But as Anon said, it shoots out tons of rays per pixel instead of a handful, right? I probably wouldn't be able to run that using only my CPU.
>>
>>616800
A typical render time for a path tracer on CPU is several hours, sometimes several days. It is definitely not real-time, but it isn't something that can't be done on a CPU.

A path tracer has the particularity that the image is never finished: you can always cast more rays to get better quality. In my experience, the image becomes recognizable after 10-15 rays per pixel (what is called 15 iterations), and becomes decent-ish after ~300. Noise disappears after ~3000.

Pic related, the number of iterations doubles for each square.

Another issue to keep in mind is that the easier it is for rays to reach the light, the easier the computation is, because the rays bounce less. Pathological scenes (a single very small, very bright light in a recess) can take weeks to compute.
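
That "never finished" behaviour is literally just a running average of iterations. A hypothetical 4-pixel sketch in Python (the target values and noise level are made up):

import random

TRUE_PIXELS = [0.2, 0.5, 0.8, 1.0]   # what an "infinite" render would converge to

def render_one_iteration():
    # Stand-in for tracing one more ray per pixel over the whole image:
    # each pixel gets back a single noisy radiance sample.
    return [v + random.gauss(0.0, 0.3) for v in TRUE_PIXELS]

accum = [0.0] * len(TRUE_PIXELS)
iterations = 0
for target in (15, 300, 3000):
    while iterations < target:
        accum = [a + s for a, s in zip(accum, render_one_iteration())]
        iterations += 1
    # The displayed image is just the running average so far; you can always
    # keep adding iterations to push the remaining noise down further.
    image = [round(a / iterations, 3) for a in accum]
    print(f"after {iterations:4d} iterations: {image}")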
>>
File: berserk.jpg (18 KB, 1067x600)
>>616806
That pic illustrates what you're saying very well. It looks a lot like the technique used by the renderers in CAD software like SolidWorks or Inventor; TIL that's called path tracing.
I remember the render times of scenes I made in SolidWorks being on the order of minutes. It's always mind-boggling how good GPUs are at their job.
Pic related is an awfully low-res and .jpg-ed-to-fuck example.
>>
>>616793
Wow, I took a look at VXGI and you weren't kidding, it's actually very well done; I thought it was some fan-made hack. Gonna have to try implementing that in my next prototype.
>>
>>616810

You can't use it for a real game, it's way too resource-intensive. They need to come up with their own in-house stuff instead of integrating Nvidia's tech.
>>
>>616820
That won't stop me, it's not like anyone plays any of them anyway heh. But they sure do need to come up with something, we don't want all the cool features to be locked behind nvidia cards.
>>
>>616572
>Blender's EEVEE is going to fuck your shitty ray tracers, boyos.
fuck you man, even Blender owes its existence to Toy Story.

Also, EEVEE's not a ray tracer.
>>
File: 1495075630989.jpg (38 KB, 349x352)
mfw this thread is the perfect mix of interesting information and absolute shitposting
>>
>>616558
>Comparing realtime to rendered
What's your point?
>>
>>616809
This is unsurprising. Path tracing is the industry standard for photorealistic rendering.

Unlike ray tracing, path tracing handles indirect lighting and soft shadows perfectly. It also handles caustics okay, and if you upgrade to bidirectional path tracing (rays cast from both the camera toward the lights and from the lights toward the camera), it can handle caustics really well.

This is because path tracing is basically what happens in real life. For each pixel, you cast hundreds of "photons" into the scene, make them bounce, lose and gain colour, and then average those photons for the final pixel. You can't get more realistic than the physical processes behind light.

Technically, only casting from the camera toward the lights is a form of bias, which is why bidirectional path tracing exists. The ultimate realistic renderer would be an inverse path tracer (from the lights to the camera exclusively), but that would mean casting literally billions of rays and accepting that you lose 99% of them.
>>
>>616612
Maya existed before '98; it was known as PowerAnimator and was launched in 1990.
>>
>>617269
PowerAnimator was NOT Maya. Maya is PA's spiritual successor. Development of Maya began in 1995, and it shipped in early 1998. I (read: the company I worked for) was an early adopter.

- Anon who actually used PA, Alias and Maya on a big bulky SGI space heater
>>
A multi-million-dollar movie made on top-of-the-line equipment (at the time) with literally months of available render time

versus

A real-time animation made on a piece of equipment that costs a couple hundred bucks

There's your difference.
>>
>>617295
That has nothing to do with the lighting. Also show a gic.
>>
>>616558
If I had access to the dev kit I could make it look more like the picture on the right with some material and lighting tweaks. I'm more than sure they could have too, if they were aiming for a real-time Toy Story film instead of a video game.
>>
>>616806
Why couldn't you just render it to the 7th image and then use a filter to remove the white dots?
>>
>>617295
Well, no, the original Toy Story can be rendered real-time, albeit requiring stronger GPU power than a console. But it's purportedly nothing current high/enthusiast-grade hardware can't handle.
>>
>>618391
You can.

Filters, by definition, do not bring any new data into the rendered picture. In contrast, every new iteration brings new data. Specific filters have been created to remove the noise from a render, but at best you get an approximation of what the image would look like with more rays. The shadows will be subtly wrong, the indirect lighting won't be as good, etc.

They exist and they are used in production, at least to get a decent idea of the finished picture. Most of the time the final render won't use them.

NVidia has developed a denoising filter using a convoluted neural network (the kind of neural network that works really well with images), specifically trained to denoise images after roughly the 15th iteration. It works, it is impressive, and some people call it the future of path tracing.
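
The trade-off is easy to demo with a dumb filter. Obviously nothing like NVidia's network, just a box blur over a fake noisy scanline with a hard shadow edge (all numbers invented):

import random

# A fake 40-pixel scanline with a hard shadow edge at index 20,
# plus per-pixel noise standing in for a low-sample render.
truth = [0.1] * 20 + [0.9] * 20
noisy = [t + random.gauss(0.0, 0.2) for t in truth]

def box_filter(img, radius=3):
    # The dumbest possible denoiser: average each pixel with its neighbours.
    out = []
    for i in range(len(img)):
        lo, hi = max(0, i - radius), min(len(img), i + radius + 1)
        out.append(sum(img[lo:hi]) / (hi - lo))
    return out

def avg_error(img):
    return sum(abs(a - b) for a, b in zip(img, truth)) / len(truth)

denoised = box_filter(noisy)
print("mean error, noisy   :", round(avg_error(noisy), 3))
print("mean error, denoised:", round(avg_error(denoised), 3))
# The overall error drops, but the hard shadow edge is now a smeared ramp:
print("around the edge:", [round(v, 2) for v in denoised[17:24]])

A trained denoiser does far better than a box blur, but the point stands: it's guessing at the missing rays, not computing them.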
>>
it's like people in this thread can't even fathom how shitty computers were in 1993

rendering a movie of this quality back then was literally a tour de force
>>
>>618411
There is one thing about neural networks that makes the "no new information" claim a little arguable. A neural network for image denoising is produced by training it to take noisy images like 7 or 8 as input and output clean images like 16. They train the network on many thousands of renders like that. So in a sense there is additional information packed into the neural network, extracted from many renders, about what noise looks like in general and how to fix it.

The information it brings to the table is not information about your specific scene, so you could say it's just semantics; however, I would personally say it does bring new information, in the same way manual cleanup in Photoshop brings new information.
>>
>>618411
Also "convolutional" neural networks
>>
File: Muun.jpg (86 KB, 795x768)
>>616564
Left is way more aesthetically pleasing.
>>
File: 1375933710293.jpg (52 KB, 184x181)
>>618449
It's just softer, brighter, and more vivid, partly because of the GI simulation (which was virtually impossible to achieve back then)

The first two images look better because of that, but in the third row, the right Woody looks way more "cinematic" than the left one.

Different times, technology, and goals
>>
>>618449
>>618458
I agree the game shots are more aesthetic, but it's basically down to the color grading and bloom imo. Also, the first two Toy Story images are not particularly great compositions; they seem chosen to contrast directly against the 2017 shots to highlight the flaws. The third Toy Story shot is far better than the third KH shot.




All trademarks and copyrights on this page are owned by their respective parties. Images uploaded are the responsibility of the Poster. Comments are owned by the Poster.