It's been about 2 years now
What's your final verdict on 10-bit video?
It didn't decrease the filesizes. At all.
1. Still see banding on many shows.
2. Supposed space savings are negligible.
3. Daiz is an elitist who wants to be seen as a special flower.
Hey Daiz how about translating some show instead of attempting to cash in on piracy?
It's shit. 12-bit when?
Deprecated. We Hi444PP now.
Useless gimmick. Sure, it may have a minuscule advantage (that most encoders can't utilize anyway), but it's not worth all the incompatibility problems.
not a fan, reduced compatibility is never good.
No, it just made them better, ask Lord Daiz-sama. :^)
There is a definite benefit over 8-bit, but it's a small one that's heavily outweighed by the issues brought on by forcing its early adoption.
Never. We already have the superior Daala codec in development.
stop using windows media player.
Even VLC can do 10b now.
Is this a Daiz summoning ritual thread?
>waa~ my playstation won't play 10bit
>waa~ my dated smart phone won't play 10bit
>waa~ my dated pc doesn't support hdmi
sucks to be you, pleb
You're right, we should still be using MPEG2.
And most video editing software can't, so making gifs became harder.
Most pointless shit ever.
Barely any noticeable improvements (if any), and zero effect on file size.
Definitely not worth breaking compatibility with most players and hardware acceleration support.
Script kiddies need to learn about cost and benefit, and that new & niche shit does not always equal better.
Reminder that mkv started getting support in hardware players and shit years after it became standard for fansubs. Same with h264
Even my Moto G can play 10bit anime.
Sucks my phone battery like Saten.
Sucks because now I can't watch it on my tablet.
Dead on Arrival, especially when everybody knew HEVC was just around the corner.
>people still not seeing the difference.
Is it your eyes or your brain that is damaged?
it's FLAC for videophiles and Daiz should be reported to the Weekly World News as Batboy's long-lost brother.
FLAC is lossless, 10bit video is not
>playing 10-bit video on consumer level 8-bit monitor
sounds like listening to 24bit FLAC on $5 earbuds
>playing 10-bit video on consumer level 8-bit monitor
What is dithering?
Who are you quoting?
I'm sorry. I know there shouldn't be an audible difference, but ever since I first used it, my brain just tells me it sounds like shit the moment I know it isn't lossless.
I can't help it.
>what is pointless
Yeah it will look a little bit better but that's about it, because most people don't have a monitor capable of 10-bit.
>monitor capable of 10-bit.
And that basically doesn't matter here.
>Implying that's relevant when the source is 8 bit
It's just to save on filesize, not magically make the source higher quality
>most people don't have monitor capable of 10-bit
After almost three years of Hi10p, there's still people thinking this means anything?
>just to save on filesize
That's about it? How much does it save?
I'm being serious here.
>It's been about 2 years now
No. Because it's fucking stupid anyway since sources are 8-bit.
Two and a half.
>You're correct in that converting 8-bit sources to 10-bit does not magically reduce banding by itself. However, beyond that, you're pretty much completely missing the point. Let's talk about the medium we're working with a bit first.
>Banding is the most common issue with anime. Smooth color surfaces are aplenty, and consumer products (DVDs/BDs) made by "professionals" have a long history of terrible mastering (and let's not even get to the subject of what QTEC does to video quality). As such, the fansubbing scene has a long history with video processing in an effort to increase the perceived quality by fixing the various source issues.
>This naturally includes debanding. However, due to the large smooth color surfaces, you pretty much always need to use dithering in order to have truly smooth-looking gradients in 8-bit. And since dithering is essentially noise to the encoder, preserving fine dither and not having the H.264 encoder introduce additional banding at the encoding stage meant that you'd have to throw a lot of extra bitrate at it. And remember that we're talking about digital download end products here, with bitrates usually varying between 1-4 Mbps for TV 720p stuff and 2-12 Mbps for BD 720p/1080p stuff, not encodes for Blu-ray discs where the video bitrate is around 30-40 Mbps.
>Because of the whole "digital download end products" thing, banding was still the most common issue with anime encodes, and people did a whole bunch of tricks to try to minimize it, like overlaying masked static grain on top of the video (which I used to do, and incidentally is something I've later seen used in professional BDs as well - though they seem to have forgotten to properly deband it first). These tricks worked to a degree, but usually came with a cost in picture quality (not everyone liked the look of the overlaid static grain, for example). Alternatively, the videos just had banding, and that was it.
>Over the years, our video processing tools have gotten increasingly sophisticated. Nowadays the most used debanding solutions all work in 16-bit, and you can do a whole bunch of other filtering in 16-bit too. Which is nice and all, but ultimately, you'll have to dither it down to 8-bit and encode it, at which point you'll run into the issue of gradient preservation once again.
>Enter 10-bit encoding: With the extra two bits per channel, encoding smooth gradients suddenly gets a lot easier. You can pass the 16-bit debanded video to the encoder and get nice and smooth gradients at much lower bitrates than what you'd need to have smooth dithered gradients with 8-bit. With the increased precision, truncation errors are also reduced and compression efficiency is increased (despite the extra two bits), so ultimately, if you're encoding at the same bitrate and settings using 8-bit and 10-bit, the latter will give you smoother gradients and more detail, and you don't really need to do any kind of processing tricks to preserve gradients anymore. Which is pretty great!
>Now, obviously most people don't have 10-bit screens or so, so dithering the video down to 8-bit is still required at some point. However, with 10-bit, this job is moved from the encoder to the end-user, which is a much nicer scenario, since you don't need to throw a ton of bitrate at preserving the dither in the video anymore. The end result is that the video looks like a well-dithered high-bitrate 8-bit encode on an 8-bit (or lower) screen, but without the whole "ton of bitrate" actually being required.
>So the bottom line is that even with 8-bit sources and 8-bit (or lower) consumer displays, 10-bit encoding provides notable benefits, especially for anime. And since anime encoders generally don't give a toss about hardware decoder compatibility (because hardware players are generally terrible with the advanced subtitles that fansubbers have used for a long time), there really was no reason not to switch.
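If the quantization-step part of that isn't clicking, here's a minimal numpy sketch of it (toy gradient, made-up luma range, nothing to do with any actual encoder):
```python
import numpy as np

# A dark, slow gradient spanning only a few luma codes across 1920 pixels,
# like a night-sky background in anime (the 16-20 range is a made-up example).
gradient = np.linspace(16 / 255, 20 / 255, 1920)

levels_8bit = np.unique(np.round(gradient * 255)).size    # distinct 8-bit codes
levels_10bit = np.unique(np.round(gradient * 1023)).size  # distinct 10-bit codes

print(f"8-bit:  {levels_8bit} distinct values across the gradient -> visible bands")
print(f"10-bit: {levels_10bit} distinct values -> ~4x finer steps, far less banding")
```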
Hard to say for sure. If you encode in 8-bit, you'll have to dither the source before encoding to prevent banding. Dithering is more or less just noise, which hurts compressibility a lot, i.e. increases the necessary bitrate, giving higher file sizes. Encoding in 10-bit, you can postpone the dithering until playback, which is a much better solution.
Also see http://x264.nl/x264/10bit_02-ateme-why_does_10bit_save_bandwidth.pdf
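And if anyone wants to see the "dithering is basically noise" thing concretely, here's a rough sketch. zlib obviously isn't H.264 and uniform random noise isn't a proper error-diffusion dither, but the compressed-size gap shows where the bitrate goes (all numbers here are toy values):
```python
import numpy as np
import zlib

rng = np.random.default_rng(0)
# One frame of a smooth dark gradient, tiled to 1080 identical rows (toy values).
gradient = np.tile(np.linspace(16 / 255, 20 / 255, 1920), (1080, 1))

# 8-bit with dither: add sub-LSB noise before rounding so the banding averages out.
# That noise is exactly what the encoder then has to spend bitrate preserving.
dither = rng.uniform(-0.5, 0.5, gradient.shape) / 255
frame_8bit = np.round((gradient + dither) * 255).astype(np.uint8)

# 10-bit, no dither needed: the 4x finer steps already keep the gradient smooth.
frame_10bit = np.round(gradient * 1023).astype(np.uint16)

print(len(zlib.compress(frame_8bit.tobytes(), 9)), "bytes compressed, dithered 8-bit frame")
print(len(zlib.compress(frame_10bit.tobytes(), 9)), "bytes compressed, clean 10-bit frame")
```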
>it's fucking stupid anyway since sources are 8-bit
Okay, now you're just trolling.