/3/ - 3DCG

Thread archived.
You cannot reply anymore.

What did people use before Zbrush?
File: OH MAX.jpg (59 KB, 620x1079)
3ds Max, Maya, Rhino.
Retopology was nonexistent, since we either used NURBS/surface modeling or box-modeled the characters. Not much difference in the final results, but the process took about two weeks per character; with ZBrush's help it's three days.
1. Modeling in NURBS, converting to a dense poly mesh, then pulling and shaping vertices by hand
2. MetaBall/MetaClay/MetaShape tools, approximating your object with blobby shapes, then converting into a continuous poly mesh - MetaReyes was particularly good at this
3. Some packages had primitive sculpting and painting tools, mostly via plugins. Again, this was mostly just to push and pull groups of vertices, and rather slow - not like the fancy voxel/volume-based approaches
4. Some studios like ILM had proprietary in-house tools for sculpting and painting before ZBrush
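The metaball approach in item 2 can be sketched in a few lines: each ball contributes a smooth falloff field, and the "surface" is wherever the summed field crosses a threshold; tools like MetaReyes then polygonized that isosurface into a continuous mesh. A minimal sketch, with made-up ball positions and an inverse-square falloff chosen for illustration:

```python
import numpy as np

# Two overlapping metaballs: (center, strength). Example values only.
balls = [
    (np.array([0.0, 0.0, 0.0]), 1.0),
    (np.array([1.2, 0.0, 0.0]), 1.0),
]

def field(p, balls):
    """Summed inverse-square falloff of all balls at point p."""
    total = 0.0
    for center, strength in balls:
        r2 = np.dot(p - center, p - center)
        total += strength / (r2 + 1e-9)  # epsilon avoids div-by-zero at a center
    return total

THRESHOLD = 1.0  # points where field >= threshold count as "inside"

# The midpoint between the balls is inside because both fields blend there;
# a distant point is outside. Polygonizing this boundary (e.g. marching
# cubes) yields the smooth merged blob these tools were known for.
mid = np.array([0.6, 0.0, 0.0])
far = np.array([5.0, 0.0, 0.0])
print(field(mid, balls) >= THRESHOLD)  # prints True
print(field(far, balls) >= THRESHOLD)  # prints False
```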
They used Mudbox for the LOTR movies.
If I remember correctly, they used a haptic device for 3D modeling: they sculpted the model in clay, drew the polys BY HAND on it, and then used the haptic stylus to create the 3D model in Alias, running on a Unix system on a Silicon Graphics workstation.
Haptics are still on the market today... so expensive that you can't even look up the price on their websites. Their market is billionaires' engineering laboratories, medical research labs, and old Hollywood studios who still make clay sculpts.
I worked with some 3D digitizers during the 90s. You had to draw control points on your object, then scan them one by one in the right order, confirming each one by pressing a foot pedal switch. You could then convert the result into either NURBS or polygons. Worked extremely well for organic objects like faces or body parts.

this is the most /3/ boomer thing i've ever seen.
the haptics do look kinda cool tho. someone's selling a used unit for 5k which is apparently 1/5th of the original price.
SGI hardware and Power Animator or SOFTIMAGE|3D
Found one product that is very similar, but already a bit more advanced than what I used.
Price was around 15k for the smallest model if I remember correctly.
Didn't Alias come first?
Displacement maps and bump maps. We stepped up the subdivision to ridiculous levels and displaced it at render time to achieve things like realistic dinosaur skin. This was also way before normal maps were even invented, so fine sub-millimeter surface texture was achieved with bump maps (legacy ones, since that term is often used for normals today).
To elaborate: with stacked displacement maps it's possible to do something very much akin to what you can do with 'vector displacement maps',
and it was therefore possible to create surfaces just as detailed as you can with sculpting today.
Only it took forever compared to using contemporary sculpt tools, since the artist could not see in real time exactly what they were doing.

The process kinda looks like this: say you have a good smooth subdivision model of a T-Rex.

You step up the subdivision so the geometry is really dense, just like on sculpts today. Then you apply one displacement map to get all the major bulges in place:
surface muscles, major skin folds, things like that, using a black & white height map you paint by hand on top of a template of the UVs in Photoshop.
You now have a smooth-skinned T-Rex, but with all the major muscle groups and major curvature details in place.

On top of this you then displace with another displacement map holding finer geometry that sits on top of that geometry, like the individual scales and fine bumps of
a size that still has noticeable geometry to it.
You now have a very realistic-looking geometric representation of something that may look like it could've been a scan of a living creature (provided you did it right).

On top of this you now use bump maps to give texture to the individual scales. You are now down at a per-pixel level, so any detail you create won't need
any actual geometric displacement to look real, unless you go for a macro-photography-type shot.
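The stacked-displacement process above boils down to repeatedly pushing vertices along their normals, once per map, coarse to fine. A toy sketch with invented data, not any studio's actual pipeline (a real renderer would displace micropolygons at render time, filter the map lookups, and recompute normals between passes):

```python
import numpy as np

def sample(height_map, uvs):
    """Nearest-neighbour lookup of a grayscale height map at given UVs."""
    h, w = height_map.shape
    xs = np.clip((uvs[:, 0] * (w - 1)).astype(int), 0, w - 1)
    ys = np.clip((uvs[:, 1] * (h - 1)).astype(int), 0, h - 1)
    return height_map[ys, xs]

def displace(verts, normals, uvs, height_map, amplitude):
    """Push each vertex along its unit normal by the sampled height."""
    offsets = sample(height_map, uvs) * amplitude
    return verts + normals * offsets[:, None]

# Toy data: three vertices on a flat patch, all normals pointing up.
verts = np.zeros((3, 3))
normals = np.tile([0.0, 0.0, 1.0], (3, 1))
uvs = np.array([[0.0, 0.0], [0.5, 0.5], [1.0, 1.0]])

muscles = np.array([[0.0, 0.5], [0.5, 1.0]])  # coarse map: major bulges
scales  = np.array([[0.1, 0.0], [0.0, 0.1]])  # fine map: scale bumps

# Stage 1: major muscle groups at full amplitude;
# stage 2: finer scale detail layered on top at a tenth of it.
verts = displace(verts, normals, uvs, muscles, amplitude=1.0)
verts = displace(verts, normals, uvs, scales, amplitude=0.1)
```

Bump maps would be the next, cheaper layer: they only perturb the shading normal per pixel and never move geometry at all.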
Pretty much this. I worked with Renderman which handled the subdivision and displacement at render time beautifully and fast.
Painting clean, gigantic displacement maps wasn't always easy, and I often ended up with seams or artifacts, which I conveniently hid on the dorsal/ventral side of characters. Of course, this was before UV mapping was widespread. Surfacing a big model with many planar projections was a tedious, sometimes week-long process.

It came first, by 2 or 3 years.

