DEFERRED SHADING
sup, im trying to code a deferred shader (including global illumination) in go (golang) for my seminar at university. all im looking for is a paper or a tutorial of some kind on how to do deferred shading WITHOUT FUCKING OPENGL. is that too much to ask jesus christ, im sick and tired of reading about opengl when i google this shit. what im interested in is the basic principle of deferred shading, not some garbage ass opengl c code.
also:
- dont ask why i use go, i just do
- i dont know if this is the right board to post this on. i checked the catalog and most of the people here seem to be blender retards, but im frustrated enough to try my luck here
So you want to code a shader that runs on the GPU, without a framework (OpenGL) that lets you use the GPU?
Or are you trying to do it all the hard way, through your own software renderer (CPU only)?
>>658072
Go to /g/'s /dpt/. You only find tutorials with opengl or directx because deferred makes sense if you are doing your calculations on a gpu; if you are doing software rendering then there is pretty much zero reason for deferred shading.
>>658073
spot on, i want to do it cpu only
>>658074
achieving real time rendering is not my goal anyway, all i want to do is code a deferred shader to understand the principle of it. im having trouble understanding deferred shading as most of the papers and tutorials only give me some opengl shit.
>>658077
The concept is simple: render information to a bunch of textures (minimum is surface normals and depth), then do all your lighting on those textures (a directional light is just the normal dot product with the light direction; for a point light you'll have to recover world position from your depth and pixel position). That's really all there is to a deferred renderer.
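That "lighting on textures" step can be sketched in Go. A minimal sketch under assumptions: all types and names here (`Vec3`, `shadeDirectional`) are hypothetical, and the G-buffer is reduced to a bare two-texel normal "texture":

```go
package main

import "fmt"

// Vec3 is a minimal 3-component vector.
type Vec3 struct{ X, Y, Z float64 }

func dot(a, b Vec3) float64 { return a.X*b.X + a.Y*b.Y + a.Z*b.Z }

// shadeDirectional computes the Lambert term for one G-buffer texel:
// the dot product of the stored surface normal with the (unit) light
// direction, clamped to zero for back-facing surfaces.
func shadeDirectional(normal, lightDir Vec3) float64 {
	d := dot(normal, lightDir)
	if d < 0 {
		return 0
	}
	return d
}

func main() {
	// Geometry pass output (already rendered): a tiny normal "texture".
	normals := []Vec3{{0, 0, 1}, {0, 1, 0}}
	light := Vec3{0, 0, 1} // unit light direction along +Z

	// Lighting pass: iterate over the G-buffer, not over the scene geometry.
	for i, n := range normals {
		fmt.Printf("texel %d: intensity %.2f\n", i, shadeDirectional(n, light))
	}
}
```

The point is that the loop in `main` never touches a triangle, only stored per-pixel data.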
>>658072
what horse cock?
>>658072
Deferred shading doesn't make sense without reference to a GPU shading / compute pipeline. It refers to which stage of the pipeline the lighting is computed in. If you're writing a CPU render engine, there's no rendering pipeline you have to stick to; you just write your instructions in whatever order you feel like. https://www.khronos.org/opengl/wiki/Rendering_Pipeline_Overview
If you want to write a deferred shader in Go, you would have to first constrain yourself to work with memory and parallelization in ways that match the way that pipeline above works. Then you could reorder what tasks are done in which step. Do lighting first, or do it later. That's deferred shading.
>>658089
>>658080
i dont see how any of what you describe would help reduce the time-complexity of computing the shading.
>>658072
Hey guys, I want to put a nail into this piece of wood with a screwdriver. Why are there no guides for how to use a screwdriver to drive nails?
>>658072
There's so much wrong with your thinking, it's sort of insane. You want a paper or a tutorial for doing low level graphics work in a relatively new programming language? You also want information on the subject in a way that runs counter to industry and academic standards for doing 3d rendering computationally.
Options:
- use opengl bindings in golang
- create your own rendering pipeline
- search harder for publications on the subject (they will be relatively old, and you will want to go through archives, so use your library account to search journals)
You should know that option 2 will take you longer than your semester to pull off. What you are thinking of doing is more akin to a master's thesis. You really do not know enough about the subject to take on the task you are interested in, desu.
>>658106
what word is being seded to desu?
>>658107
I don't know, desu.
>>658095
It does in the context of a traditional GPU shading pipeline. What has been used, and is still used in some cases, is a so-called forward renderer. With a forward renderer you calculate the light directly, without the intermediate step of rendering normals etc. to a texture, but the problem is you have to do that whole thing, including the vertex transformations, for each light. So a scene with 20 lights will be rendered 20 times, which sucks for performance. (There are options to group lights up so that multiple lights are rendered in a single pass, but then we are probably moving into stuff like clustered shading, which is even more complex.)
A few more things: deferred can't really do transparency (without expensive tricks at least), which often leads to an additional forward pipeline for transparent objects. And even if you do forward shading you often still want to render out a few things like depth and normals for post process effects like ambient occlusion or screenspace reflections.
Pic related, my clustered deferred & forward renderer from back when I was still doing stuff like that.
>>658105
>You really do not know enough about the subject to take on the task you are interested in
Probably true, though implementing a simple baseline deferred renderer (in OpenGL or DirectX) isn't that difficult.
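The forward-vs-deferred cost structure described above can be sketched with toy work counters in Go (a sketch only; the counts and functions are hypothetical, not a real renderer):

```go
package main

import "fmt"

// forwardWork: a naive forward renderer re-runs the whole geometry pass
// (vertex transforms + rasterization) once per light, so total work scales
// as triangles * lights.
func forwardWork(triangles, lights int) int {
	work := 0
	for l := 0; l < lights; l++ {
		work += triangles // full scene pass per light
	}
	return work
}

// deferredWork: geometry is rasterized once into the G-buffer, then each
// light only touches screen pixels, so work scales as triangles + pixels*lights.
func deferredWork(triangles, pixels, lights int) int {
	work := triangles // single geometry pass
	for l := 0; l < lights; l++ {
		work += pixels // lighting pass reads the G-buffer, not the meshes
	}
	return work
}

func main() {
	// The 20-light scene from the post: deferred pays the geometry cost once.
	fmt.Println("forward: ", forwardWork(1_000_000, 20))
	fmt.Println("deferred:", deferredWork(1_000_000, 100_000, 20))
}
```

This is why the post says a 20-light forward scene "will be rendered 20 times" while deferred only repeats the cheap per-pixel part.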
>>658072
how much ram do you have?
>>658072
The exact purpose of deferred shading is to delay running a full shader until all relevant texture, normal, depth and other useful information is collected.
The main reason is that if a triangle you are drawing is out of order, that is, some part of it will be drawn over by another triangle in the future, storing its relevant info is less costly than running the shader that calculates its lighting. This is very important if you have a ton of lights or probes that you are working with.
So it doesn't really make sense to use deferred shading in a ray tracer that always calculates the closest intersection. I guess you could slowly dim the lights that way, but it doesn't seem useful.
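The "store cheap info now, shade once later" idea can be shown with a depth-tested G-buffer write in Go (hypothetical names and a single-texel buffer, just to illustrate the overdraw argument):

```go
package main

import "fmt"

// GBufferTexel stores the cheap per-pixel info collected in the geometry
// pass; the expensive lighting shader runs later, once, on whatever
// fragment survived the depth test.
type GBufferTexel struct {
	Depth   float64
	NormalZ float64 // stand-in for a full normal/albedo/material record
}

// writeFragment is the geometry pass for one fragment: only a depth
// compare and a copy, no lighting. Fragments that get overdrawn later
// cost almost nothing.
func writeFragment(buf []GBufferTexel, idx int, depth, normalZ float64) {
	if depth < buf[idx].Depth { // closer than what's stored so far
		buf[idx] = GBufferTexel{Depth: depth, NormalZ: normalZ}
	}
}

func main() {
	buf := make([]GBufferTexel, 1)
	buf[0].Depth = 1e9 // "infinitely far" clear value

	writeFragment(buf, 0, 5.0, 0.2) // far triangle drawn first
	writeFragment(buf, 0, 2.0, 0.9) // nearer triangle overdraws it

	// Lighting pass: runs once per pixel, only on the surviving fragment.
	fmt.Printf("shade once, NormalZ = %.1f\n", buf[0].NormalZ)
}
```

With many lights, skipping the lighting work for the overdrawn first fragment is exactly the saving the post describes.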
>>658095
Are you OP? If this is OP's response, you need to start over with a new project idea; you have misunderstood what deferred shading means.
Either way, please reread comment >>658089 so that you understand there is no a priori computational benefit without thinking in terms of a standardized shader pipeline like OpenGL or DirectX. Unless you're taking into account things like transferring data into and out of VRAM, your cost/benefit analysis for CPU algorithms will not accurately estimate how it works on a GPU.
>>658105
im not sure i understand what is meant by "rendering pipeline". say i coded a rasterizer from scratch in go. would that mean i coded a rendering pipeline? thats all that i understand from seeing this graph. if thats the case, then i already have a rendering pipeline. its capable of collecting basic gbuffers like normals, depth, color (all work done by cpu). im now thinking of how to apply shading using only those gbuffers, since thats all i thought deferred shading was to begin with.
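Given G-buffers like the ones described above (normals, depth, color), the remaining step for a point light is roughly this, sketched in Go. Everything here is a hypothetical illustration; it assumes the world position of the texel has already been recovered from depth and pixel coordinates in the rasterizer:

```go
package main

import (
	"fmt"
	"math"
)

// Vec3 is a minimal 3-component vector.
type Vec3 struct{ X, Y, Z float64 }

func sub(a, b Vec3) Vec3           { return Vec3{a.X - b.X, a.Y - b.Y, a.Z - b.Z} }
func dot(a, b Vec3) float64        { return a.X*b.X + a.Y*b.Y + a.Z*b.Z }
func length(a Vec3) float64        { return math.Sqrt(dot(a, a)) }
func scale(a Vec3, s float64) Vec3 { return Vec3{a.X * s, a.Y * s, a.Z * s} }

// shadePointLight lights one G-buffer texel given its reconstructed world
// position: the Lambert term times simple inverse-square attenuation.
func shadePointLight(worldPos, normal, lightPos Vec3) float64 {
	toLight := sub(lightPos, worldPos)
	dist := length(toLight)
	l := scale(toLight, 1/dist) // normalized direction to the light
	lambert := math.Max(0, dot(normal, l))
	return lambert / (dist * dist)
}

func main() {
	// One texel's data read back from the G-buffers.
	pos := Vec3{0, 0, 0}   // world position recovered from depth + pixel coords
	n := Vec3{0, 0, 1}     // from the normal buffer
	light := Vec3{0, 0, 2} // point light position

	fmt.Printf("intensity %.3f\n", shadePointLight(pos, n, light))
}
```

Running that per pixel over the existing g-buffers, once per light, is the whole deferred lighting pass; multiply the result by the color-buffer albedo to get the final shaded pixel.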
>>658248
Deferring is meant for the shader program, not the driver. It sounds like you are caching data between the vertex and pixel shaders.
sounds like you're going to a glorified art school, op. Enjoy working in the mall food court in a couple months to pay for your "deferred renderer" degree
>>658270
i study cs and this is just a minor project. i dont know why you're talking shit, hang yourself.
>>658273
you study for a "minor project" something you can implement in literally under a day. YOU neck yourself
>>658277
you're not cut out for this
>>658278
ok i feel like you have some sort of inferiority complex, because why else would someone waste his time telling someone else "duh you're not cut out for this" instead of actually posting a reasonable response regarding op.
>>658279
you're "studying cs" very hard by posting on 4chan i see lmfao
>>658076
Why would you do software deferred rendering? Deferred rendering is a series of steps that only make sense in the context of GPU rendering. If you're rendering on a CPU, you can get all the same results in a much more straightforward implementation (albeit far slower)...
>>658313
not OP, but probably to help wrap one's head around how to go about it, as a learning experience for later endeavours. unis often restrict you to some BS language not of your choosing.
>>658072
If you understand the idea of what deferred shading is, you should be able to translate the core ideas of any tutorial into whatever language you're using. Hoping to find good step-by-step information on how to go about it in some obscure language seems a tall order. OpenGL or DirectX is gonna be what anyone interested in using this stuff in a real world application is gonna use.
I know next to nothing about how to write a program that can rasterize a mesh, but I imagine that is covered in many places if you actually wanna do that from scratch.
Once you have all the information collected as images in your G-buffer, the operations you need to perform on each pixel are very straightforward. Open any basic HLSL shader of your liking and it'll show the fragment/pixel calculations you will need to perform as clear text.
>>658280
>Implying that useful discussions can't happen on 4chan
Ok.
>>658072
dont know much about 3d rendering, but wikipedia has good pseudocode for most algorithms.