/3/ - 3DCG



Thread archived.
You cannot reply anymore.




File: rtx.png (2.79 MB, 2160x3645)
As we all know, Nvidia's finally brought their RTX cards out of the dark and we're starting to see people publish performance metrics for them. The 2080's basically a 1080 Ti, and the 2080 Ti's almost hitting that fabled 4K60 at a bank-breaking price tag.

So, let's restart the conversation.
Of course, since this is /3/, it'll be 3DCG-related:

Do you see the current implementation of RTX as a gimmick or could it be the start of a new era? Do you think we'll ever get to the day of genuine real-time path-tracing?

When Unreal (or whatever real-time engine of your choice) gets RTX support, will it even be worth rendering in a dedicated path-tracing engine anymore? Unless your scene has to handle translucent materials, glass, and/or realistic volumetrics, there's really not going to be much point, is there?

How do you feel about the mass of gamers not understanding the significance of RTRT tech and dismissing it as Nvidia attempting to scam consumers? Are they right, or are they fools?
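For anyone unclear on what the RT cores actually accelerate: it's the ray-primitive intersection test, done millions of times per frame. A toy sketch in plain Python (a sphere test for simplicity; real hardware runs ray-triangle tests against a BVH):

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Return the distance t along the ray to the nearest hit, or None.

    Solves |origin + t*direction - center|^2 = radius^2, a quadratic
    in t. 'direction' is assumed normalized, so the quadratic's leading
    coefficient is 1.
    """
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None  # nearest hit in front of the origin

# A camera ray looking down -z at a unit sphere 5 units away:
t = intersect_sphere((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0)
print(t)  # 4.0: the hit is one radius in front of the sphere's center
```

Dedicated silicon exists precisely because a GPU otherwise burns general-purpose CUDA cycles doing this test (plus the BVH traversal around it) billions of times per second.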
>>
>>642695
bump someone do testing with redshift plz
>>
File: 1535309109163.jpg (59 KB, 662x900)
>>642695
I'm quite sure these aren't even using the RT cores either; as far as I know, only RenderMan has officially announced support for them.

>How do you feel about the mass of gamers not understanding the significance of RTRT tech and dismissing it as Nvidia attempting to scam consumers? Are they right, or are they fools?
Absolute brainlets, as usual. I tried explaining what RT is and why RTX is a good idea, and got told ray tracing is a meme like 3D TVs. Not exactly surprising, since people who play video games aren't known to be the sharpest tools in the shed, but still. They don't seem to understand how development works at all: having real-time GI and ray-traced reflections will allow for much faster development and, above all, much more dynamic lighting (and therefore levels).
>>
>>642695
>will it even be worth it rendering in a dedicated path-tracing engine anymore?
Yes. Whatever speed the realtime engines get from this, the offline renderers should also get but without the additional overhead of having to run an entire game alongside.
>>
>>642695
It depends on the performance of RT-cores.

I have read absolutely absurd bullcrap specs, something like RT cores being able to cast a bazillion rays per second, which would be big. If. True.

If.

True.

It's Nvidia we're talking about. I don't doubt that they're willing to add lots of zeros to the bazillion, or even fake a gorillion altogether. Remember muh six gorillion rays/s!

If RT cores aren't that revolutionary, then it'll flop just like HairWorks. We'll get RTRT eventually, one or two graphics-card generations later.
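For scale: Nvidia's own marketing figure for the 2080 Ti is on the order of 10 gigarays/s. Taking that number at face value, quick arithmetic shows why the first-generation implementation is hybrid rendering plus denoising rather than full path tracing:

```python
pixels = 1920 * 1080          # one 1080p frame
fps = 60
primary = pixels * fps        # rays per second at just 1 ray/pixel
claimed = 10e9                # Nvidia's ~10 gigarays/s figure for the 2080 Ti

budget = claimed / primary    # rays available per pixel per frame
print(f"{primary:,.0f} primary rays/s -> {budget:.0f} rays/pixel/frame")
# Roughly 80 rays per pixel per frame -- nowhere near the hundreds to
# thousands of samples per pixel an offline path tracer uses, which is
# why the denoiser is doing so much of the heavy lifting.
```

So even granting the marketing number, full path tracing at 1080p60 is out of reach; a few rays per pixel for reflections and shadows is about what the budget buys.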
>>
>>642711
It has nothing to do with stupidity, only preference; it's only in recent years that we've gotten 1080p/144 and 4K/~60. Many PC gamers prefer fluidity over water puddles and mirror reflections. Yeah, ray tracing is more than that, but it will take time to catch on and for people to actually use it. If AMD also implemented hardware-accelerated ray tracing, I think it would have a much better chance of wide adoption, but NVIDIA would likely push for exclusivity anyway, like PhysX.
>>
>>642695
So what does RTX stand for? I've got no idea.

Ray Tracing Xtender?
Roy Taskforce 10
Radio Tesselation X

I think the X is just to look edgy, right?
>>
>>642722
Ray Tracing X. I mean it succeeds the GTX cards, so that's probably why they kept the X at the end.
>>
Ray tracing in most games will be mostly a useless gimmick. Nobody in a fast FPS has time to glare at some pseudo-reflections. Useless.
>>
>>642734
I would argue that most games are not fast fps shooters though.
>>
>>642734
Just lol at this comment.
>>
File: 4563548343253.png (34 KB, 653x726)
>>642734
Do you not see the implications of real time global illumination, which extend well into gameplay too, or are you just willingly ignoring them?
>>
>>642724
Is this the end for the Game Tracing X cards?
>>
>>642753
We'll likely see GTX cards being the 2060 and 2050, which probably will not have RT and Tensor cores.
>>
>>642701
Right now Redshift gets about 25% speed improvement, same as every other CUDA app. Redshift 3.0 should implement RTX features and see a bigger boost.
>>
>>642754
>>642837
Imagine if those hypothetical GTX cards get released and bring the same ~25% improvement in renderers. Then, once RTX support finally lands in renderers and brings the expected additional gains, NVIDIA releases a new generation of cards with even greater performance.
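The compounding in that scenario is easy to put numbers on. Using the ~25% CUDA-level speedup quoted above, plus a purely illustrative extra 2x once renderers actually use the RT cores:

```python
base = 100.0        # render time on the old card, in seconds
cuda_gain = 1.25    # the ~25% raw speedup quoted for current CUDA builds
rtx_gain = 2.0      # hypothetical further factor from RT-core support
                    # (illustrative only -- nobody has published this yet)

after_cuda = base / cuda_gain   # what you get today
after_rtx = after_cuda / rtx_gain  # what RTX-aware renderers might add
print(after_cuda, after_rtx)    # 80.0 40.0
```

Point being: the speedups multiply, so even a modest per-step gain stacks up across a card generation plus a renderer update.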
>>
>>642749
you can do fake GI with image-based lighting
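A minimal sketch of that trick (pure Python; the gradient `sky` function here is just a stand-in for a real HDR environment map):

```python
import math
import random

def sky(d):
    """Stand-in environment map: brighter toward 'up' (+z), dim below."""
    up = max(d[2], 0.0)
    return (0.2 + 0.4 * up, 0.3 + 0.4 * up, 0.4 + 0.6 * up)

def sample_hemisphere(rng):
    """Uniform random direction on the upper (z >= 0) hemisphere."""
    while True:
        d = (rng.uniform(-1, 1), rng.uniform(-1, 1), rng.uniform(-1, 1))
        n = math.sqrt(sum(c * c for c in d))
        if 0 < n <= 1.0:  # rejection-sample the unit ball, then normalize
            d = tuple(c / n for c in d)
            return d if d[2] > 0 else (d[0], d[1], -d[2])

def ibl_diffuse(samples=5000, seed=1):
    """Cosine-weighted average of the environment: fake diffuse 'GI' for
    an upward-facing surface, with no scene geometry involved at all."""
    rng = random.Random(seed)
    acc = [0.0, 0.0, 0.0]
    for _ in range(samples):
        d = sample_hemisphere(rng)
        w = d[2]  # cos(theta) against the surface normal (0, 0, 1)
        for i, c in enumerate(sky(d)):
            acc[i] += c * w
    # Normalize by the average cosine (1/2) so a constant sky of color C
    # comes out as exactly C.
    norm = samples * 0.5
    return tuple(a / norm for a in acc)

print(ibl_diffuse())  # a blue-tinted color: the sky lighting the surface
```

That's the whole point of the argument above: it looks like GI for static environments, but it falls apart the moment the lighting or the level needs to change dynamically, which is what real-time ray tracing is for.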
>>
One thing I haven’t seen mentioned is that the consumer RTX cards will use NVLink, and depending on how it’s handled by the drivers and such, we may very well see RAM pooling come to consumer hardware. Having 22GB of VRAM to work with would be incredible as that covers just about any reasonable scene size the average artist could need.
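To put the 22 GB figure in context (two 11 GB cards pooled over NVLink), here's a back-of-envelope VRAM budget for a heavy scene. Every number below is illustrative, not a measured requirement:

```python
GB = 1024 ** 3

def tex_bytes(res, channels=4, mips=True):
    """Uncompressed square texture size; a full mip chain adds ~1/3."""
    base = res * res * channels
    return int(base * 4 / 3) if mips else base

budget = 22 * GB                    # two 11 GB cards pooled over NVLink
tris = 200_000_000                  # a heavy production scene
geometry = tris * 3 * 3 * 4         # 3 verts/tri, xyz, float32 (unindexed worst case)
textures = 100 * tex_bytes(4096)    # a hundred uncompressed 4K RGBA textures

used = geometry + textures
print(f"{used / GB:.1f} GB of {budget / GB:.0f} GB")  # 15.0 GB of 22 GB
```

Even a deliberately bloated scene like that fits with room to spare, which is the appeal: an 11 GB ceiling forces out-of-core tricks, a 22 GB pool mostly doesn't.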
>>
>>643047
>Having 22GB of VRAM to work with would be incredible as that covers just about any reasonable scene size the average artist could need.
Yes, that may be true, but do you want to be an average artist, or an exceptional artist? The best artists use Quadro cards -- if you look up to them when it comes to art, you shouldn't do otherwise when considering which tool to invest in now.
>>
>>643057
>The best artists use Quadro cards
That is a stupid statement, and it sounds like advertising, including the implied reverse association that using this product makes you a better artist.
Fuck yourself, you stupid shill.
>>
>>643057
That's dumb and misinformed. Everyone knows 4x 1070s have been the best bang for the buck for years. Even Redshift's official docs still recommend that card. Until RT features are implemented, the 1070 is still champion.
>>
>>643047
For that reason alone, I have a feeling they may lock that feature out so it doesn't eat into Quadro RTX 5000 sales. I could be really wrong, though; maybe those workstation cards have more up their sleeve than anything I know about.
>>
>>642701
>redshift
Please, the ultimate test is using old LuxRender
>>
>>642695

This isn't even using the RT cores. Nothing has them implemented yet; you need to adopt the new OptiX API to make use of them. Redshift will have it in version 3.0, Octane is "coming soon".

Arnold apparently has it in their internal tests, but that thing is never going to ship, so who cares.
>>
File: result.jpg (212 KB, 3224x1197)
Why is the 2X Titan's performance all over the place?
>>
>>642695
Nay and gay. You can do the same thing with CUDA already. Bullshit gimmick.
>>
>>645045
It's funny how know-nothing 'explainers' like Linus now pretend they know shit about ray tracing because they skimmed a Wikipedia article. Once they had a live show on and everyone who came was a literal 12-year-old boy - you can find it on YouTube. Ridiculous.

Anyone who got a card to review is a PAID SHILL. Why? Because Ncuntia doesn't give out cards early unless they own the reviewer. Notice how there is never a mention of this before any review.

The fucking corporations own and run everything.

Don't buy Nvidia. They intentionally manipulate customers. For example, the GTX 970 has less RAM than the specs claimed, and there was even a class-action suit about it.
>>
>>645051
>Don't buy Nvidia.
It's sad because, while I agree with you, there's literally no other choice on the market. AMD doesn't even come close to the high-tier 10-series cards, let alone the 20-series. Even if value weren't an issue, a Vega 64 LC still can't touch the performance of a 1080 Ti. The closest we'll ever get to another competitor in the GPU sphere is Intel (if they follow through on their GPU rumours), and you should already know Intel's not that great of a company either.




