/3/ - 3DCG

Thread archived.
You cannot reply anymore.

File: 1521329022226.png (277 KB, 1035x877)
What's a good laptop for gamedev and /3/? Looking to spend under $1k
Anything by Alienware

Really? I always thought Alienware was overpriced as fuck because it targets gamers specifically.
They are overpriced, but laptops are overpriced in general

Build a fucking desktop WORKSTATION, you ginger-bearded hipster.
lmao it's easy to say that without knowing how the cpu/gpu/ram market got fucked by faggots believing in digital coins
How are CPU and RAM connected to digital currency? Not even the most retarded believers would use them to mine coins. Also the bubble is already exploding, many "farmers" started to sell their 50 Titans for half price.
Don't be stupid - google it, analyze, make a choice.
>Also the bubble is already exploding, many "farmers" started to sell their 50 Titans for half price.
really? don't get my hopes up m8
Card prices are already falling. The GTX 1070 I bought before this craze started is now at “just” 100 € over the normal price.
Used Dell Precision, ThinkPad P series, or HP ZBook.

Just search for: Quadro, laptop, 17-inch screen.
- Both the DDR sticks used by the system and the GDDR memory used in graphics cards come out of the same fabs, and their manufacturing requirements and processes are similar enough that the price of DDR can be affected by increased demand for GDDR and vice versa. Miners buying cards wholesale and driving up production demand does affect memory prices in that regard.

- Some cryptocurrencies (anything that uses scrypt) were designed not to take advantage of GPU computational power per se, but of memory and memory bandwidth instead. In that case it pays to have both fast RAM on your system and a card with as much VRAM as you can afford.

- After the whole India crackdown on BTC and a few more cryptoscams running away with the money, the bitcoin price is recovering at a predictable pace. Some retarded kids who took out two mortgages plus their family's savings, bought all the expensive cards, and panic-sold at the first sign of trouble may have been dicked, but the bubble hasn't really burst yet. It's more likely that scammers are taking advantage and fleecing desperate PC builders, so I'd be careful around used cards.

Before you ask, I don't mine anything. I just happened to absorb all of this from all the faggotry going around cryptocurrencies.
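The scrypt point above can be sketched with Python's built-in hashlib.scrypt. The parameters here are illustrative, not a real mining config: scrypt's working set is roughly 128 * r * N bytes, which is why memory size and bandwidth, rather than raw GPU compute, dominate its cost.

```python
import hashlib

# scrypt's memory cost scales as roughly 128 * r * N bytes, so raising N
# (the CPU/memory cost factor) makes RAM, not raw compute, the bottleneck.
n, r, p = 2**14, 8, 1          # illustrative parameters only
approx_mem = 128 * r * n       # ~16 MiB working set for these values
digest = hashlib.scrypt(b"password", salt=b"salt", n=n, r=r, p=p, dklen=32)
print(approx_mem)              # 16777216
```

Doubling n doubles the memory the function has to touch, which is exactly the property that rewards fast RAM and high-VRAM cards.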

The RAM market is mainly getting fucked by phone makers cramming more and more RAM into devices that clearly don't need it (why a smartphone has to have 8 gigs of RAM is beyond me).
You're always going to get a substandard build for a higher price compared to a desktop workstation. But if a laptop is a must, Dell makes some good workstation laptops. Alienware is overpriced, but "gamer" laptops actually tend to be quite good for CG work thanks to their decent CPU/GPU/SSD specs. You just have to find one that isn't all riced out with LEDs and angles everywhere.

Generally any high-spec CPU from the last generation or two will work fine. If you go for a high thread count, such as a Xeon or Threadripper, you'll be able to render on many threads at once, which generally reduces CPU render times massively. However, single-core performance also matters for general responsiveness and for applications that aren't multi-threaded. I have an i7-4790K in my desktop and I can render on 8 threads about as fast as the 12-thread Xeons at work because of the better per-core speed, but this tradeoff will vary based on other factors.
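The thread-count vs per-core-speed tradeoff boils down to simple arithmetic for an embarrassingly parallel render. The clock figures below are illustrative placeholders, not benchmarks of the actual chips mentioned:

```python
# Rough model: for a fully parallel render, throughput ~ threads * per-core speed.
# Numbers are illustrative, not real benchmarks.
desktop = {"threads": 8, "per_core": 4.0}   # e.g. a fast consumer i7
xeon = {"threads": 12, "per_core": 2.6}     # e.g. a lower-clocked workstation Xeon

def throughput(cpu):
    """Relative render throughput under the fully-parallel assumption."""
    return cpu["threads"] * cpu["per_core"]

print(throughput(desktop), throughput(xeon))  # 32.0 31.2 -- roughly even
```

That's why a high-clocked 8-thread desktop chip can keep pace with a lower-clocked 12-thread Xeon on renders, while still winning on anything single-threaded.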

Get as much RAM as you can afford; 3D has a tendency to use huge amounts in certain applications, especially if you're working with large textures and complex models. 16GB is the minimum; aim for at least 32GB.

An SSD is also a must; you'll see massive reductions in model load times compared to working from an HDD. But since SSDs are still expensive for mass storage, I'd recommend a 1TB SSD for your OS and programs, plus a big storage HDD (3, 6 or even 12TB).

GPU choice will depend on your workflow and budget. If you're doing precision modelling in a lot of CAD programs, a Quadro card offers double-precision floating point, but it's probably too expensive, and Quadros tend to suck in more realtime, game-engine type workflows. A GTX 10-series card would be great for pretty much any 3D program, but GPUs are expensive as fuck right now. It may be worth waiting for the next generation to see if prices come down a bit; in the meantime, try to find a used one that hasn't been mined on.

Have you ever considered that if you need such high specs, maybe it's because you're doing something wrong?
Not that poster, but those specs are pretty standard for a CG workstation.
>those specs are pretty standard for a CG workstation

Not that poster either. But maybe don't push server-grade hardware on a person who probably doesn't know the first thing about PCs or gamedev? OP is likely an amateur who wants to play around in Maya and Unity, making and shading PS1-level models judging from the image he posted - the only real info we have to go on.


For that kind of baby workload, just buy one of these and be happy that you weren't suckered into spending twice the price on something you won't use and that won't be great for gaming either. If you move up to UE or something and need the extra kick, you can buy a 1060 and still be good.

