/3/ - 3DCG







File: result_25k.png (974 KB, 1024x768)
DEFERRED SHADING.
sup, im trying to code a deferred shader (including global illumination) in go (golang) for my seminar at university.
all im looking for is a paper or a tutorial of some kind on how to do deferred shading WITHOUT FUCKING OPENGL. is that too much to ask jesus christ, im sick and tired of reading about opengl when i google this shit.
what im interested in is the basic principle of deferred shading, not some garbage ass opengl c code.
also:
- dont ask why i use go, i just do
- i dont know if this is the right board to post this on. i checked the catalog and most of the people here seem to be blender retards, but im frustrated enough to try my luck here
>>
So you want to code a shader that runs on the GPU, without a framework (OpenGL) that lets you use the GPU?
Or are you trying to do it all the hard way, through your own software renderer (CPU only)?
>>
>>658072
Go to /g//dpt/
You only find tutorials with opengl or directx because deferred makes sense if you are doing your calculations on a gpu. If you are doing software rendering then there is pretty much zero reason for deferred shading.
>>
>>658073
spot on, i want to do it cpu only
>>
>>658074
achieving real time rendering is not my goal anyway, all i want to do is code a deferred shader to understand the principle of it. im having trouble understanding deferred shading as most of the papers and tutorials only give me some opengl shit.
>>
>>658077
The concept is simple, render information to a bunch of textures (minimum is surface normals and depth), then do all your lighting on those textures (directional light is just normal dot product with light direction, for a point light you'll have to recover world position from your depth and pixel position). That's really all there is to a deferred renderer.
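If it helps, here is a minimal CPU-side sketch in Go of that lighting pass for a single directional light (buffer layout and names are made up for illustration, nothing from a real library):

package deferred

import "math"

// Vec3 is a bare-bones vector type just for this sketch.
type Vec3 struct{ X, Y, Z float64 }

func dot(a, b Vec3) float64        { return a.X*b.X + a.Y*b.Y + a.Z*b.Z }
func scale(v Vec3, s float64) Vec3 { return Vec3{v.X * s, v.Y * s, v.Z * s} }

// DirectionalLightPass shades every pixel using only what is already in the
// G-buffer: the diffuse term is max(0, N dot L) applied to the stored albedo.
// normals, albedo and out are flat per-pixel arrays of the same length.
func DirectionalLightPass(normals, albedo []Vec3, lightDir Vec3, out []Vec3) {
	for i, n := range normals {
		ndotl := math.Max(0, dot(n, lightDir))
		out[i] = scale(albedo[i], ndotl)
	}
}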
>>
>>658072
what horse cock?
>>
>>658072
Deferred shading doesn't make sense without reference to a GPU shading / compute pipeline. It refers to which stage of the pipeline the lighting is computed in. If you're writing a CPU render engine, there's no Rendering Pipeline you have to stick to, you just write your instructions in whatever order you feel like.


https://www.khronos.org/opengl/wiki/Rendering_Pipeline_Overview


If you want to write a Deferred Shader in Go, you would have to first constrain yourself to work with memory and parallelization in ways that match the way that pipeline above works. Then you could reorder what tasks are done in which step. Do lighting first, or do it later. That's deferred shading.
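A rough Go sketch of that split (all types and names invented): the geometry pass only depth-tests and stores surface attributes, and lighting happens in a separate pass over those buffers afterwards.

package deferred

import "math"

// Minimal vector type for this sketch.
type Vec3 struct{ X, Y, Z float64 }

// Pixel holds everything the lighting pass will need later.
type Pixel struct {
	Depth  float64
	Normal Vec3
	Albedo Vec3
}

// GBuffer is just a flat per-pixel array on the CPU.
type GBuffer struct {
	W, H   int
	Pixels []Pixel
}

func NewGBuffer(w, h int) *GBuffer {
	gb := &GBuffer{W: w, H: h, Pixels: make([]Pixel, w*h)}
	for i := range gb.Pixels {
		gb.Pixels[i].Depth = math.Inf(1) // nothing drawn here yet
	}
	return gb
}

// Store is what the rasterizer calls for every covered pixel: depth-test and
// keep the surface attributes, but do NO lighting here. Lighting runs later,
// once, over whatever survived the depth test. That is the deferred part.
func (gb *GBuffer) Store(x, y int, depth float64, normal, albedo Vec3) {
	i := y*gb.W + x
	if depth < gb.Pixels[i].Depth {
		gb.Pixels[i] = Pixel{Depth: depth, Normal: normal, Albedo: albedo}
	}
}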
>>
>>658089
>>658080
i dont see how any of what you describe would help reduce the time-complexity of computing the shading.
>>
>>658072
Hey guys, I want to put a nail into this piece of wood with a screwdriver. Why are there no guides for how to use a screwdriver to drive nails?
>>
>>658072
There's so much wrong with your thinking, it's sort of insane. You want a paper or a tutorial for doing low level graphics work in a relatively new programming language? You also want information on the subject in a way that runs counter to industry and academic standards for doing 3d rendering computationally.

Options:
- Use opengl bindings in golang.
- create your own rendering pipeline
- Search harder for publications on the subject (they will be relatively old, and you will want to go through archives, so use your library account to search journals)

You should know that option 2 will take you longer than your semester to pull off. What you are thinking of doing is more akin to a master's thesis. You really do not know enough about the subject to take on the task you are interested in, desu.
>>
>>658105
faggot
>>
>>658106
what word is being seded to desu?
>>
>>658107
I don't know, desu.
>>
File: witch_clustered.jpg (108 KB, 1280x720)
>>658095
It does in the context of a traditional GPU shading pipeline. What has been used, and is still used in some cases, is a so-called forward renderer. With a forward renderer you calculate the light directly, without the intermediate step of rendering normals etc. to a texture, but the problem is you have to do that whole thing, including the vertex transformations, for each light. So a scene with 20 lights will be rendered 20 times, which sucks for performance. (There are options to group lights up so that multiple lights are rendered in a single pass, but then we are probably moving into stuff like clustered shading, which is even more complex.)
A few more things: deferred can't really do transparency (without expensive tricks at least), which often leads to an additional forward pipeline for transparent objects. And even if you do forward shading you often still want to render out a few things like depth and normals for post-process effects like ambient occlusion or screenspace reflections.
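The difference is really just where the loops sit. Rough Go-shaped sketch (all helper names invented, only the loop structure matters):

package deferred

// Placeholders so the loop structure compiles on its own; only the shape of
// the loops matters here, the helpers do nothing.
type Triangle struct{}
type Light struct{}
type Surface struct{}

func rasterizeAndShade(t Triangle, l Light, fb []float64) {}
func rasterizeAndStore(t Triangle, gbuf []Surface)        {}
func shadePixel(s Surface, l Light) float64               { return 0 }

// Naive multipass forward: the whole scene is transformed and rasterized once
// per light, so 20 lights means 20 full geometry passes.
func forwardRender(tris []Triangle, lights []Light, fb []float64) {
	for _, l := range lights {
		for _, t := range tris {
			rasterizeAndShade(t, l, fb)
		}
	}
}

// Deferred: geometry is rasterized exactly once into the G-buffer, and the
// per-light work is a flat loop over pixels, independent of triangle count.
func deferredRender(tris []Triangle, lights []Light, gbuf []Surface, fb []float64) {
	for _, t := range tris {
		rasterizeAndStore(t, gbuf)
	}
	for i, s := range gbuf {
		for _, l := range lights {
			fb[i] += shadePixel(s, l)
		}
	}
}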
Pic related is my clustered deferred & forward renderer from back when I was still doing stuff like that.
>>658105
>You really do not know enough about the subject to take on the task you are interested in
Probably true, implementing a simple baseline deferred renderer (in openGL or DirectX) isn't that difficult though.
>>
>>658072
how much ram do you have?
>>
Noob lol
>>658072
>>
>>658072
The exact purpose of deferred shading is to delay running a full shader until all relevant texture, normal, depth and other useful information is collected.

The main reason is that if a triangle you are drawing is out of order, that is, some part of it will be drawn over by another triangle later, storing its relevant info is less costly than running the shader that calculates its lighting. This is very important if you have a ton of lights or probes that you are working with.

So it doesn't really make sense to use deferred shading on a ray tracer that always calculates the closest intersection. I guess you could slowly dim the lights this way, but it doesn't seem useful.
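Back-of-the-envelope version of that argument (a runnable sketch with arbitrary numbers, just to show where the savings come from):

package main

import "fmt"

func main() {
	// Arbitrary example numbers, only the ratios matter.
	pixels := 1920 * 1080 // visible pixels
	overdraw := 3         // average fragments rasterized per pixel
	lights := 20

	// Forward-ish worst case: every rasterized fragment runs the full lighting
	// shader, including the ones later covered by a closer triangle.
	forwardShades := pixels * overdraw * lights

	// Deferred: overdrawn fragments only pay a cheap G-buffer write, and the
	// expensive lighting runs once per visible pixel per light.
	gbufferWrites := pixels * overdraw
	deferredShades := pixels * lights

	fmt.Println("forward lighting evaluations: ", forwardShades)
	fmt.Println("deferred G-buffer writes:     ", gbufferWrites)
	fmt.Println("deferred lighting evaluations:", deferredShades)
}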
>>
>>658095
Are you OP? If this is the OP's response, you need to start over with a new project idea; you have misunderstood what deferred shading means.

Either way please reread comment >>658089 so that you understand there is no a priori computational benefit without thinking in terms of a standardized shader pipeline like OpenGL or DirectX. Unless you're taking into account things like transferring data into and out of VRAM, your cost/benefit analysis for CPU algorithms will not accurately estimate how it works on a GPU.
>>
File: RenderingPipeline.png (42 KB, 271x602)
>>658105
im not sure i understand what is meant by "rendering pipeline". say i coded a rasterizer from scratch in go. would that mean i coded a rendering pipeline? that's all that i understand from seeing this graph. if thats the case, then i already have a rendering pipeline. its capable of collecting basic gbuffers like normals, depth, color (all work done by cpu). im now thinking of how to apply shading using only those gbuffers, since that's all i thought deferred shading was to begin with.
>>
>>658248
Deferring is meant for the shader program, not the driver. It sounds like you are caching data between the vertex and pixel shaders.
>>
sounds like you're going to a glorified art school, op. Enjoy working in the mall food court in a couple months to pay for your "deferred renderer" degree
>>
>>658270
i study cs and this is just a minor project. i dont know why you're talking shit, hang yourself.
>>
>>658273
your "minor project" is something you can implement in literally under a day. YOU neck yourself
>>
>>658275
no u
>>
>>658277
you're not cut out for this
>>
>>658278
ok i feel like you have some sort of inferiority complex, because why else would someone waste his time telling someone else "duh you're not cut out for this" instead of actually posting a reasonable response regarding op.
fucking faggot i swear.
>>
>>658279
you're "studying cs" very hard by posting on the 4chan i see lmfao
>>
>>658076
Why would you do software deferred rendering? Deferred rendering is a series of steps that only make sense in the context of GPU rendering. If you're rendering on a CPU, you can get all the same results in a much more straightforward implementation (albeit far slower)...
>>
>>658313
not OP but probably to help wrap one's head around how to go about it, as a learning experience for later endeavours.
Unis often restrict you to some BS language not of your choosing.

>>658072
If you understand the idea of what deferred shading is, you should be able to translate the core ideas of any tutorial into whatever language you're using.
Hoping to find good step by step information on how to go about it in some obscure language seems a tall order.
OpenGL or DirectX is gonna be what anyone interested in using this stuff in a real world application is gonna use.

I know next to nothing about how to write a program that can rasterize a mesh, but I imagine that is covered in many places if you actually wanna do that from scratch.
Once you have all the information collected as images in your G-buffer, the operations you need to perform on each pixel are very straightforward.
Open any basic HLSL shader of your liking and it'll show the fragment/pixel calculations you will need to perform in plain text.
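For example, a single point light on the CPU comes out roughly like this in Go (just a sketch; it assumes the depth buffer holds linear view-space depth and a symmetric perspective projection, and every name here is made up):

package deferred

import "math"

// Minimal vector helpers for this sketch.
type Vec3 struct{ X, Y, Z float64 }

func sub(a, b Vec3) Vec3           { return Vec3{a.X - b.X, a.Y - b.Y, a.Z - b.Z} }
func dot(a, b Vec3) float64        { return a.X*b.X + a.Y*b.Y + a.Z*b.Z }
func scale(v Vec3, s float64) Vec3 { return Vec3{v.X * s, v.Y * s, v.Z * s} }
func length(v Vec3) float64        { return math.Sqrt(dot(v, v)) }

// reconstructViewPos turns a pixel coordinate plus its stored depth back into
// a view-space position. Assumes depth is linear view-space Z (positive into
// the screen) and fovY is the vertical field of view in radians.
func reconstructViewPos(px, py, w, h int, depth, fovY float64) Vec3 {
	aspect := float64(w) / float64(h)
	tanHalf := math.Tan(fovY / 2)
	ndcX := 2*(float64(px)+0.5)/float64(w) - 1 // [-1, 1] across the screen
	ndcY := 1 - 2*(float64(py)+0.5)/float64(h) // flipped so +y is up
	return Vec3{ndcX * tanHalf * aspect * depth, ndcY * tanHalf * depth, depth}
}

// shadePointLight is the per-pixel "fragment" calculation: diffuse N dot L
// with inverse-square falloff. pos, normal and lightPos are all in view space.
func shadePointLight(pos, normal, albedo, lightPos Vec3, intensity float64) Vec3 {
	toLight := sub(lightPos, pos)
	dist := length(toLight)
	l := scale(toLight, 1/dist) // normalize
	ndotl := math.Max(0, dot(normal, l))
	return scale(albedo, ndotl*intensity/(dist*dist))
}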
>>
>>658280
>Implying that useful discussions can't happen on 4chan
Ok.
>>
>>658072
dont know much about 3d rendering, but wikipedia has good pseudocode for most algorithms.



All trademarks and copyrights on this page are owned by their respective parties. Images uploaded are the responsibility of the Poster. Comments are owned by the Poster.