/3/ - 3DCG



Thread archived.
You cannot reply anymore.



File: pep.png (51 KB, 657x527)
nvidia or amd for 3d rendering
>>
>>548376
mobile graphics
>>
Nvidia for CUDA in Blender, don't know what the other programs work best with.
>>
>>548376
Nvidia, just so you don't get locked out of being able to use CUDA-only renderers if need be. Both brands can do OpenCL.
Main thing that matters is number of cores, so in some cases two slower cards may be better than one fast card, for example two 1070's vs one 1080.
Rendering doesn't need SLI, so anything goes as far as # of GPUs. Rendering scaling is fairly linear.
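To put numbers on the two-1070s-vs-one-1080 example: a minimal sketch in Python, assuming the naive cores-times-clock metric and the roughly linear multi-GPU scaling described above (real scaling is close to, but not exactly, linear). The core counts and boost clocks are Nvidia's published specs for those cards.

```python
# Naive throughput proxy: CUDA cores x boost clock, scaled linearly
# across cards. Ignores memory bandwidth, kernel occupancy, etc.
def relative_throughput(cuda_cores, boost_mhz, n_cards=1):
    return cuda_cores * boost_mhz * n_cards

two_1070s = relative_throughput(1920, 1683, n_cards=2)  # GTX 1070 specs
one_1080 = relative_throughput(2560, 1733, n_cards=1)   # GTX 1080 specs

print(two_1070s > one_1080)  # True: two slower cards beat one fast card
```

On this metric the pair of 1070s comes out roughly 45% ahead, which is why core count across all cards matters more than single-card speed for rendering.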
>>
>>548376
nvidia. blender doesn't support amd as a compute device.

i'd go Nvidia over an AMD graphics card any day, even for gaming.
>>
>>548405
>blender doesn't support amd as a compute device.
That's complete nonsense
>>
>>548405
With OpenCL it does support AMD, but they also say that CUDA is faster than OpenCL.

>Currently Nvidia with CUDA is rendering faster. There is no fundamental reason why this should be so, because we do not use any CUDA specific features, but the compiler appears to be more mature, and can better support big kernels. OpenCL support is still in an early stage and has not been optimized as much.

https://www.blender.org/manual/render/cycles/gpu_rendering.html
>>
>>548417
anon here.

ok, did not know. thanks for the enlightenment!
>>
>>548376
Nvidia also released MDL recently. Not sure if AMD can display them properly.

Nvidia seems to have a tentacle in every aspect of rendering, much like Autodesk. I don't know what the fuck AMD is doing apart from Vulkan.
>>
>>548405
>nVidia over AMD for graphics
Yeah, if you want your card gimped in a year, are ready to upgrade for every new feature, and want to drag the industry backwards
>>
File: Nvidia-AMD.png (289 KB, 2236x527)
>>548517
How exactly is AMD moving the industry forward? They're very slowly sliding off the market. They need to get their shit together - while I do use Nvidia, I don't want them to have a monopoly.
>>
>>548517
You can whine all you want about it, but the bitter reality is that CUDA has a high market share, OpenCL does not and is not as fast or stable, and Nvidia cards even outperform AMD cards on OpenCL.
Efficiency is all that counts, not ideology or brand loyalty.
I don't like it and it might change soon, but right now that's how it is.
>>
>>548521
>Q2 2014

what the hell happened
>>
>>548521
They've gotten a lot better since 2015.
This graph conveniently ends at 2014
>>
>>548526
I wouldn't say "a lot better" since they're still below that peak, but maybe they'll make a comeback.
>>
>>548523
Talking about gaming here

>>548521
>no async compute
>loses frames in Vulkan
Nvidia doesn't let a cent near R&D and throws all their money at die shrinks
>>
>>548521
>How exactly is AMD moving the industry forward?
HBM, which nvidia still doesn't have 2 years later, async compute and also Vulkan.
>>
nvidia because quadro
>>
File: p100.jpg (450 KB, 1920x905)
>>548579
>HBM, which nvidia still doesn't have 2 years later

Not that it matters because of the crazy price, but the Tesla P100 GPUs which are now shipping have 16GB of HBM2.
>>
I'm building a PC, and right now I'm working off a laptop. For a good ballpark for rendering speed, if I try to render a decently sized scene at 1920x1080 at 500 samples, it can take anywhere from 2 to 4 hours a frame. I mean, it sounds like the laptop is terrible, but it's not that bad. I can run games from 2010 well enough (although that's not much of a benchmark either).

In either case, Nvidia or AMD, I'll still notice a good improvement in rendering speed right? Like I'll be able to wait a few minutes instead of all night to render a scene?
Is crossfire or whatever going to improve rendering speeds if I hook up 2 cards in parallel? In case I decide to get a second one down the line.

I'm only really looking to spend like $700 at most, which I'm still trying to save up.
Right now I'm looking at a Radeon R9 380 for the card. Honestly I don't know much about graphics cards as far as how they compare to each other, but it seems decent enough to run modern games, especially at the price point.

Sorry for all the questions. I've built PCs before, but I was never the one to buy the parts.
>>
>>548626
I forgot to mention that I use blender. Don't really know if that'll make a difference.
>>
>>548578
>Talking about gaming here
This is /3/ not /v/
>>
>>548376
the cpu
>>
>>548626
>I mean, it sounds like the laptop is terrible, but it's not that bad. I can run games from 2010 well enough
You can run games from 7 years ago "well enough"
Wow, that is actually quite terrible.

>Like I'll be able to wait a few minutes instead of all night to render a scene?
That depends on what you're rendering. How could I answer your question?
In any case if your hardware is that bad of course you're going to notice a big improvement.
It took me 17h to render 300 frames of a walking animation.

I have an R9 fury and it seems fast for still renders.
Animations take forever though but that's just how it is. If you want fast render just pay a render farm instead.
Even if you get 4 titans it's still gonna take a long time depending on what you want to render.
I mean it's not like I can give you numbers since it's completely meaningless unless I try rendering the same thing you do.
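For a concrete ballpark, the 17h / 300-frame figure above is just per-frame time times frame count; a quick sketch, assuming every frame takes about the same time and the near-linear multi-GPU scaling discussed earlier (the per-frame times are from this thread, everything else is arithmetic):

```python
# Total wall-clock time for an animation, assuming roughly constant
# per-frame time and render work split evenly across GPUs.
def total_render_hours(seconds_per_frame, n_frames, n_gpus=1):
    return seconds_per_frame * n_frames / n_gpus / 3600

# ~204 s/frame x 300 frames ~= the 17h quoted above
print(total_render_hours(204, 300))        # 17.0
# OP's 2-4 h/frame laptop over the same 300-frame shot is hopeless:
print(total_render_hours(3 * 3600, 300))   # 900.0 hours
```

Which is the point: on bad hardware even a modest animation turns into weeks of rendering, so any modern GPU is a big improvement.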

>>548627
>I forgot to mention that I use blender. Don't really know if that'll make a difference.
as has been stated in this thread blender is faster with Nvidia. AMD is still trying to fix that.
>>
>>548665
I have a relatively recent laptop (2014) and it lags pretty bad running games from the 2005-2008 era
>>
>>548667
disable AA kid
>>
>>548668
I can play oblivion fine, but only with lower settings and on 800 x 600 resolution. If I try to play anything newer it'll lag to hell, I'll get 5-10 fps at most with frequent crashes.

It doesn't help that the laptop itself seems to be built like shit. It's entirely plastic and fucking bends and wobbles. You can bend the whole thing and give it a solid 10-15 degree curvature no problem and it'll come back to its original shape.

One of my classmates threw it on the floor once and it didn't even get damaged, only made a really loud "DONK" sound and kinda bounced off. It worked fine after that.
>>
>>548626
Most renderers are CPU based! That means the graphics card will do jack shit. If you want to use the graphics card for rendering you need to look into GPU rendering.

It sounds like Blender supports some form of that. If it's CUDA only you'll want to get an nvidia card. You'll want to get an nvidia card anyhow for 3d, they've always been more solid for this industry compared to ATI/AMD.

>Animations take forever though but that's just how it is. If you want fast render just pay a render farm instead.
Even if you get 4 titans it's still gonna take a long time depending on what you want to render.

Things have changed. With something like Redshift and a good GPU you're rendering 5-20 times faster than a CPU. A machine with 4 Titans would blow away the shitty 10-node renderfarm we have at work (which we still manage to render projects with).

This dude rendered his short on a single workstation:
https://www.fxguide.com/featured/how-one-vfx-artist-made-these-3-minutes-of-madness/

>This way, I was able to do the whole project on a single workstation with render times ranging from 2min to 15min in 1440p with full brute force GI and Motionblur / DOF. Deadline was also used to stack up jobs so my workstation would be busy around the clock.”

>Lovvold created the piece on a single workstation from his home in Norway. “Nothing too special,” he admits, “except four GTX780TIs for the GPU rendering. You could say the last generation top-end cards - I would go with GTX980TIs or TitanXs if i where to build a new system now a days. My workstation also doubled as an oven in my tiny cellar, with each GPU card running at 83C!”
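The claimed 5-20x GPU speedup is easy to sanity-check against the render times in the quote; a rough sketch (the 3 h/frame CPU baseline is a hypothetical figure, the speedup range is from the post above):

```python
# Convert a CPU per-frame render time into the range implied by a
# claimed 5-20x GPU speedup.
def gpu_minutes_per_frame(cpu_hours_per_frame, speedup):
    return cpu_hours_per_frame * 60 / speedup

best = gpu_minutes_per_frame(3.0, 20)   # 9.0 min at 20x
worst = gpu_minutes_per_frame(3.0, 5)   # 36.0 min at 5x
print(f"{best:.0f}-{worst:.0f} min/frame")
```

That 9-36 min/frame range loosely brackets the 2-15 min times quoted for the four-GPU workstation, so the claim is at least internally consistent.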
>>
>>548376
Nvidia GeForce is for gamers: you don't get the workstation features unless you buy essentially the same card re-labeled as "Quadro". So for serious work you don't want Nvidia unless you have truckloads of money



