/3/ - 3DCG

Thread archived.
File: maxresdefault.jpg (146 KB, 1280x720)
I've been wondering lately what a decent hardware rig looks like for CGI work in real-life scenarios.

Do we honestly need the most powerful gear out there? I know, "it depends on what you're working on", workflow and whatnot, but consider the following:

Maya doesn't even take advantage of Quadro drivers, and now it has Arnold as its native render engine. Arnold is CPU based, not GPU based, so the only use the more expensive card would end up getting is the final composite render from AE or Nuke, no?

Sure, I get the "more powerful is ALWAYS better" thing... I don't doubt that, but I do wonder how much of a difference it can actually make performance-wise.

Certain projects may actually need the extra power, I'm sure, but isn't that what render farms are for?
>>
>CGI work in real-life scenarios.
What would you consider a real-life scenario?
>Do we honestly need the most powerful gear out there?
no
>Arnold is CPU based, not GPU based, so the only use the more expensive card would end up getting is the final composite render from AE or Nuke, no?
If your bread and butter is rendering (even if it's not production work), then rendering on CPU would be a mistake. You need speed to prototype quickly and render scenes. You don't want to waste half a day rendering a still; it will just hinder you later on.
>>
>>563460

Real life would be working in production or taking freelance projects. When practicing or making a reel you can take your time and try out several renders, but in production or freelance work every failed one is less time you can spend on a different project, and less money (if you are not using a farm and charging that to the client, that is).

As for CPU based vs GPU based, it's not a question of preference... well, it is, but if you use Arnold as I do there is no choice, for now.

I am not a render guy, I am mostly a modeler and animator, and I really value how Arnold simplifies rendering for me. As long as you know your photography stuff you are ready to go with that one... but it is CPU based.

That's what I meant when I was talking about workflow. If you don't use Arnold and use, say... V-Ray, then you would get a 1080 for sure, because so much of your rendering workflow would be GPU based.

But in that scenario do you really need the 7700k then?

My question is more about what you can get away with hardware-wise when working.
>>
>>563476
>But in that scenario do you really need the 7700k then?

High-level sculpting, I guess.
I can't give you a precise list of what you can get away with because I'm not a graphics engineer.

But I will say there are a lot of people in the 3D community who buy dual GPUs and monster computers just for living room renders, so it's mostly panic buying and flashing your hardware online.
The bottom line is: make sure you have a decent computer.
>>
>>563416
Don't be stupid, get Ryzen. Those cores are a godsend.
>>
>>563480

>Ryzen not the stupid choice.

I understand amd/intel - like amd/nvidia - is a thing, 'cause pre-school playground fighting never stops in life.

But. Hands down intel/nvidia for everything related to gaming/3D modeling and rendering. Yes, you pay more. You also avoid:

>broken drivers
>hotter than the sun
>unstable
>unsupported

AMD's pluses:
You pay less
You have loyalists who scream "It will, it must, please don't leave us amd"

GL
>>
>>563497
You are right on the money when it comes to AMD/Nvidia but the CPU market is different.
I have been on Intel since the dual-cores came out. But when AMD delivers the same power as Intel for half the price, it absolutely makes sense to think about buying one. Especially for dumb headless render slaves.
>>
>>563477
>it's mostly panic buying and flashing your hardware online

Well, that's exactly the point: not to do that. If one is on a budget it would be seriously fucked up to go for a 1080 SLI build just to discover the rendering engine is CPU based and not GPU based.

>>563477

So Zbrush would be CPU based as well? See... that's exactly what one should consider when building a rig.

If on a budget and knowing most of your workflow would be CPU based then why get a 1080 in the first place if you could get away with a cheaper card?

I feel most benchmarks and tests are done around the gaming community (though I have the feeling even they don't really know why they are buying 8GB 4K video cards... I've seen people get the cards but play on single 1080p monitors... what's the fucking point then? The 60 FPS on ultra settings?). Sorry, that's not the point... the point was that even with all the information out there, sometimes it's still unclear what one would actually benefit from buying.
>>
>>563516
>If on a budget and knowing most of your workflow would be CPU based then why get a 1080 in the first place if you could get away with a cheaper card?

Can't beat the common sense check. If you're doing a lot of stuff that's CPU heavy, invest in a good CPU.
Upgrading your GPU will generally speed up things like Maya/Blender/whatever's viewport, as well as GPU rendering if you want to fuck around with that.
If you have heavy scene files you're working with in those programs, you may be able to justify the cost of upgrading your GPU.

I'm assuming CPU rendering is still king right now though, just based on the lack of support most engines have for GPU rendering.

>>563497
Ryzen CPUs are just straight up outperforming Intel in terms of cost vs performance. Not optimal for gaming, but for 3D/VFX/video editing it's very nice if you can get one.
>>
File: IMG_1254_resize.jpg (376 KB, 1280x960)
Ryzen seems pretty good if you don't need more than 64GB of RAM or the extra PCIe lanes of an HEDT chip. At work we just put together an X99-based machine with 64GB of RAM (a budget concession instead of the max allowable 128GB) and we have already run up against that limit doing smoke solves.

For most users it's probably a non-issue, but OP did ask about real-world use, as in production/freelance. For that, yeah... you do want the beefiest hardware you can budget for.
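
To put numbers on why a smoke solve eats RAM like that, here's a rough Python sketch. The grid layout is my own assumption for illustration (a dense float32 grid with density, temperature, and velocity channels), not anything stated in the post:

# Back-of-the-envelope RAM estimate for a dense fluid/smoke grid.
BYTES_PER_FLOAT = 4
CHANNELS = 1 + 1 + 3  # density + temperature + velocity (x, y, z); assumed

def grid_gib(resolution):
    """GiB for one dense cubic float32 grid at the given resolution."""
    return resolution ** 3 * CHANNELS * BYTES_PER_FLOAT / 2**30

for res in (256, 512, 1024):
    print(f"{res}^3 grid: ~{grid_gib(res):.1f} GiB per cached frame")
# 512^3 comes to ~2.5 GiB and 1024^3 to ~20 GiB per frame, and solvers
# typically hold several scratch copies, so a 64GB box fills up fast.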
>>
>>563416
You should get a gaming-quality rig instead of a Ryzen/Xeon meme. Viewport speed and interactive performance are the single most important factor in your efficiency, unless you're a lighting & lookdev artist and need to do IPR all the time. A variation of 50% in render speed is not going to make that much of a difference at the end of the day. 60m render vs 90m, big deal. But if you can make sure your viewport is always responsive, you have made a big win.
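
A toy version of that tradeoff in Python. The workday split and the 10% viewport drag are made-up assumptions, just to show the shape of the argument:

# Renders can run unattended; viewport lag is paid during attended work.
hours_in_viewport = 6.0      # assumed attended hours per day
renders_per_day = 2
fast_min, slow_min = 60, 90  # the 60m vs 90m example above

unattended_cost = renders_per_day * (slow_min - fast_min) / 60.0
attended_cost = hours_in_viewport * 0.10  # assumed 10% viewport drag

print(f"Slower renders: ~{unattended_cost:.1f} h/day, mostly unattended")
print(f"Sluggish viewport: ~{attended_cost:.1f} h/day of attended work time")
# ~1.0 h of render time you can walk away from vs ~0.6 h actively lost
# while working -- which hurts more depends on your role.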
>>
Scenario 1 - Get a bleeding edge 1080 Ti:
>53 seconds render time (based on blenchmark results)
>$700
>if I don't have a 4K monitor, I can't really get a return on investment from gaming
>3D programs (especially ZBrush) favor strong CPUs with lots of cores and threads, so simulations and high poly counts might hinder the workflow
>high-end GPUs need way stronger power supplies (600W+), whereas CPUs consume about 65-100W
>GPUs are limited by memory. Very heavy scenes where the geometry + textures exceed the GPU's memory size won't render
>the render speed actually depends on both the CPU and GPU in about a 2:8 ratio
Scenario 2 - Get a bleeding edge CPU:
I'd personally recommend a Ryzen 1700. It overclocks to the same clocks as the 1800X, costs less than half of the Intel equivalent (i7-5960X), and has those sweet 8 cores/16 threads.
>costs about 330 euros, but an aftermarket cooler is suggested for overclocking
>can render scenes basically as big as your hard drive
>render time is 78 seconds, down to a little over 66 when OC'd (again from blenchmark)

So overall, I'd still go for the CPU-based system.
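
For what it's worth, here's the same comparison as arithmetic in Python, using only the numbers quoted above (euros and dollars treated as roughly equal, which is an assumption for the sake of the math):

options = {
    "GTX 1080 Ti":        {"price": 700, "render_s": 53},
    "Ryzen 1700 (stock)": {"price": 330, "render_s": 78},
    "Ryzen 1700 (OC)":    {"price": 330, "render_s": 66},
}

baseline = options["Ryzen 1700 (stock)"]
for name, o in options.items():
    saved = baseline["render_s"] - o["render_s"]
    extra = o["price"] - baseline["price"]
    print(f"{name}: {o['render_s']}s per frame, {extra:+d} vs stock, {saved}s faster")
# The 1080 Ti buys ~25s per frame over the stock 1700 for ~370 more,
# so how render-bound your day actually is decides which one wins.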
>>
>>563416
Jesus, this isn't 2007.
You will be fine with a GTX 970 and an i7/i5 6700 or 6500.
>>
>>563563
Tell me why I should not have at least 8 cores, because it is, in fact, 2017.
>>
>>563762
are you some kind of sim faggot making some gay ass fake looking sim?
>>
>>563497
You are a fucking idiot.
It takes two seconds looking at the spec sheet to know why.
Do whatever, it's your money after all.
>>
>>563765
Nah senpai, I render and composite 3D assets into 4K RAW and grade all day. My workstation is a dual E5-2687W v2 box with 64GB RAM, which is honestly getting a bit long in the tooth.
>>
>>563497
>lol look at this idiot

I just got a Ryzen 1700 and it's great. 16 threads, no temperature issues even with an OC at 3.5 GHz, 10 frames fewer than Intel at most when we're talking 120 FPS with everything maxed on the newest games (the bottleneck still being the video card).

Amazing for anything that uses multithreading.
>>
>>563557
Where are you getting all those facts from?

From the way you talk, it seems like it's just from what you've heard, not what you've worked with.
>>
>>563912
He's right as far as his first point goes; don't underestimate the importance of interactivity. More cores are great for rendering, but that comes at the tail end of a project. For actually *creating* content, a snappy, responsive system is more important, and that's where good single-threaded performance, fast memory and fast IO come in.

At work we all prefer the i7 machines over the dual Xeons that have a bunch of cores running at 2.5GHz or whatever. **For most tasks** anyway.
>>
Just get two of these and put them on a dual-socket mobo. It will give you a CINEBENCH R15 score of just below 2100. Cannot be beaten by any other affordable combo. Also, no single CPU other than $3k+ Xeons can top this.

So good for rendering and stuff. Good enough for gaming as well.

http://www.ebay.com/itm/INTEL-XEON-SR0H8-E5-2670-2-6GHZ-20MB-CACHE-CPU-FOR-DELL-M620-/182177906143?hash=item2a6aa63ddf:g:qKAAAOSwGIRXaAiA
>>
>>563963
yeah these things are an insane deal... I have 5 2U dual-socket servers I built with these for just under $600 each last summer, when they were like $70 a pair. Any CPU-intensive job we have, we throw at them and they just eat it up.
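
Putting these two posts together as a quick Python check (R15 ~2100 for the pair from the post above, ~$70/pair at last summer's price):

cinebench_r15_pair = 2100   # score quoted for two E5-2670s
price_pair = 70             # what this anon paid per pair last summer

print(f"~{cinebench_r15_pair / price_pair:.0f} R15 points per dollar")  # ~30
# Even at today's higher listing prices, used dual E5-2670s stay hard
# to beat on multi-core throughput per dollar.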
>>
>>563480
>>563558
>>563911
Can vouch for these anons. With a $25 cooler, I managed to get my 1700 to 3.9 GHz at 1.375V; it doesn't go a lick over 75C. The dual-CPU solutions the other anons are suggesting are nice, but I do a lot more work with programs that have different needs. I haven't done too much work with 3D since I got it, but with what I've seen so far I've been impressed. Best $300 I spent in a long while.
>>
Any recommendations for what I should bump up in my specs? I mainly do environment modeling/rendering in Maya with Arnold, and now I'm moving towards UE4. I also wanted to check out Redshift since it's quick with its GPU-based rendering. I also do some work in AE and Photoshop.

CPU: i7-4770K @ 3.50GHz
RAM: 16GB
GPU: GTX EVGA 780ti
>>
>>563836

lmao, I'm the idiot? Stay salty, you dumb fucking amd zombie.

You can build an intel/nvidia setup just as cheap, with better results, using gaming hardware versus a ryzen. lmao.

Saving money doesn't equal a smart choice. Go back to playing your games, you offer nothing in this thread.
>>
>>563911

>Greentexting back and forth.

Yeah nah. The only idiots are the fags trying so hard to promote their failing 'muh amd, we spend less money' lol bbq retards.

In a few months we will be reading all you loyal worms crying about why nothing supports amd - from games to 3D. And you'll just go on blaming intel/nvidia over and over and over until amd launches another failure of a cpu/gpu, and rinse and repeat.

If amd isn't dead by then.

I don't care to help, but I also know when to post common sense. intel/nvidia all the way for gaming/3D/work.

amd for shit memes, linux pushers and being useless. But my low cost!
>>
>>564397
>Be salty you dumb fucking amd zombie.

Nigger I am using Intel Haswell and Nvidia right now.
Guess what, I am selling this pile of garbage and getting a proper processor that can take on a load as it fucking should.
I had a GTX 480 that I paid out the ass for, just to have it die on me because Nvidia drivers decided fan control and voltages were irrelevant.
I need to make a profit, not make Intel and Nvidia rich.
>>
>>563911
What's your GPU and monitor resolution?

I'm getting the 1700 with the Asus 1070 and a 1440p monitor. I'm hoping to get at least 60 FPS in every game on ultra, considering that I'm getting the OC'd Strix.
>>
>>564199
You should overclock your CPU. If you want to get into GPU rendering or UE4, definitely get a new graphics card, and preferably some more RAM.
>>
>>563416
Go GPU rendering; once you've tried it, CPU is unbearable.
>>
>>564398
YEE YEE!
>>
>>564397
>>564398
In what world would you spend more for the same performance? In a few months we're going to see up to 64 physical cores in 2 sockets with 8 channels of memory from AMD on the server side, and 16 cores with quad-channel memory on the HEDT side. Intel is prepping a 12-core part to compete with that. I'm due a new workstation and I'm going for whatever gets me the best bang for the buck, including power efficiency, because my shit is up and running literally 24/7.

I couldn't care less about how many FPS the shit gets in Overwatch, I care about that Cinebench score. I for one am glad there are actually options on the market now. I don't wanna pay the Intel tax any longer, thanks, and I'll be glad to see more IPC gains than the meager 5% they have stuck with over the last 5+ years.
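
To show why power efficiency matters for a box running 24/7, here's a quick Python sketch. The wattages and the electricity price are illustrative assumptions, not figures from the thread:

# Annual electricity cost for a machine drawing a constant average load.
PRICE_PER_KWH = 0.15          # assumed electricity price in $/kWh
HOURS_PER_YEAR = 24 * 365

def yearly_cost(avg_watts):
    """Dollars per year for a machine running nonstop at avg_watts."""
    return avg_watts / 1000 * HOURS_PER_YEAR * PRICE_PER_KWH

for watts in (150, 250, 400):
    print(f"{watts} W average draw: ~${yearly_cost(watts):.0f}/year")
# 150 W: ~$197/yr, 400 W: ~$526/yr -- over a year or two the power bill
# difference can rival the price gap between competing CPUs.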




All trademarks and copyrights on this page are owned by their respective parties. Images uploaded are the responsibility of the Poster. Comments are owned by the Poster.