8-bit vs. 10-bit Video | What's the Difference?

What is 10-bit video? Should you shoot it over 8-bit video? In this video, Doug explores what's at the heart of this debate as he explains what bit depth is and why it matters, both in production and in post.
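
As a quick back-of-the-envelope illustration of what those bit depths mean (a minimal sketch, not taken from the video itself): each extra bit doubles the number of tonal levels per color channel.

```python
# Levels per channel and total RGB colors for each common bit depth.
for bits in (8, 10, 12):
    levels = 2 ** bits             # 8-bit -> 256 levels, 10-bit -> 1,024 levels
    total_colors = levels ** 3     # three channels: R, G, B
    print(f"{bits}-bit: {levels:>5} levels per channel, {total_colors:,} total colors")
```

At 8-bit that works out to roughly 16.7 million colors; at 10-bit, roughly 1.07 billion, which is where the smoother gradients and extra grading headroom come from.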

Shop B&H:

Subscribe to the B&H Photo YouTube Channel

Follow us on Social Media
Comments

2025: I finally got 10bit everything!


2026: You must have 12bit.

ChunkyChest

B&H is our go-to source for gear and equipment. Been using you guys for over 30 years. Appreciate your honest and informative videos.

designmediaconsultants

This was really informative and really easy to understand. Thank you Doug and B&H

rinusworldzm

Well, I'll put it more simply: 10-bit vs. 8-bit is like 4K vs. Full HD. When recording in 4K, you'll have more information (more pixels) to edit later, and the final result, even if you export in 1080p, will be better than if you had shot in 1080p from the very beginning.

kimsonpro

I never got why 10-bit was important, because all the examples talk about banding in the sky, and I was like: so? I hardly shoot sky!
But while doing a side-by-side between the Z6 + Ninja V and the A7III, I realized how much better the "color separation" was between objects of similar colors/shades. Even more so in the skin tones/textures - 10-bit made the skin look so much more lifelike.

hb

I just went from an 8-bit monitor to a 10-bit one, and I'm seeing shades of purple and blue that I had never seen on the 8-bit one. I tested this by comparing the exact same scene, and it really does make a difference.

JohnJohn-pmwq

Nicely done. Great speaking voice for YouTube productions and you're a great "explainer." Keep producing! Oh, and the content was great, too! (smiley thing)

johnolson

I have the GH5 and normally shoot 8-bit. I shot a short movie over the weekend with some night scenes, and let me tell you, 10-bit made a HUGE difference in post - I was blown away by the flexibility of the grading.

astrokeneda

Found out why I got color banding everywhere when I graded footage shot in 8-bit F-Log. This helped so much; now, shooting in 10-bit, I can do really great color grades!

BarnabyFWNightingale

One question has bothered me for a long time: the OLED iPhone has an 8-bit display, but it supports both HDR10 and Dolby Vision. Why?

aih

I just bought a Lumix S5 camera from you. I didn't understand what 10-bit was, but every video hyping the camera talked about how it has 10-bit, making it clear that anyone who didn't know what that is ain't right for this camera. But nobody ever explained what it is, until this helpful video you made here.

jasonrossrealty

Nicely explained! As a one-man video production crew, I'll stick to 8-bit, as the quality is acceptable for the average consumer and it doesn't come with the extra storage and processing requirements of 10-bit. I don't see the budget justification for using 10-bit.

myxsys

I can't really see any banding in the sky shot at 3:42, even before the noise was added. Not sure if it's because my high-end laptop's monitor might be very good, or because YouTube's compression algorithm actually hid the banding, or what?

bennemann

That was a clear and peaceful explanation

mistermagnifico

Love every video that explains the advantages of one technology but doesn't necessarily recommend it to everyone.

jianxiang

Now I know. Thanks for the explanation. It's simple and easy to understand, even without turning the speaker on. Thanks to YouTube's "CC" feature, I can watch this video at the office.

DeddyHermansyah

I know Wikipedia says dithering is noise, but it really is not. Check the algorithms. It compensates for the error in one pixel by changing the next pixel towards the previous colour, continually accumulating errors and correcting them. It looks like noise, but there is no randomness involved. Doing it linearly is not the only option, hence the different dithering algorithms. I have also built a few new dithering algorithms, but nothing better than the existing ones.

AnitaSV
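
The error-diffusion behavior described in the comment above can be sketched in a few lines. This is a minimal one-dimensional illustration, not any particular published algorithm: each sample's quantization error is carried forward to the next sample, so the output looks noisy yet contains no randomness.

```python
# Minimal 1-D error-diffusion dither: quantize 10-bit samples (0-1023) down to
# 8-bit (0-255), carrying each sample's rounding error forward to the next one.
def error_diffuse_10_to_8(samples_10bit):
    out, error = [], 0.0
    for s in samples_10bit:
        ideal = s / 1023 * 255 + error        # target value plus leftover error
        q = min(255, max(0, round(ideal)))    # nearest representable 8-bit level
        error = ideal - q                     # push the residual onto the next sample
        out.append(q)
    return out

# A slow 10-bit ramp: plain rounding would collapse it into flat "steps" (banding);
# error diffusion breaks the steps into a fine, deterministic pattern instead.
ramp = [500 + i // 8 for i in range(64)]
print(error_diffuse_10_to_8(ramp))
```

Floyd-Steinberg and similar algorithms do the same thing in two dimensions, splitting each pixel's error among several neighboring pixels with fixed weights.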

Hey guys (I know there are many expert viewers here, and B&H), a computer-related technical question I'm curious about. Let's say a monitor with a DP 1.2 input (21.6 Gbps bandwidth) is connected to a computer that has a DP 1.4 output (32.4 Gbps bandwidth), using a DP 1.4 cable.

Basically:
Output: DP 1.4
Cable: DP 1.4
Input: DP 1.2 (that's what's written on the monitor)

The computer shows the screen running at its native 2560 x 1440 resolution at 165 Hz, YCbCr 4:2:2, 10-bit color depth, which I think is more than the DP 1.2 bandwidth on the receiving side. By rights, Windows would not even offer that mode if the display weren't capable of it. Unless the display is secretly DP 1.4, hahaha... My curiosity is killing me.

Help...

RayMak
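
A rough back-of-the-envelope check of the question above. The DisplayPort link rates are the standard figures (DP 1.2 HBR2 is 21.6 Gbps raw, about 17.28 Gbps of payload after 8b/10b encoding); the ~10% blanking overhead is an assumption made only for this illustration.

```python
# Does 2560 x 1440 @ 165 Hz fit a DP 1.2 link at different pixel formats?
ACTIVE_PIXELS_PER_SEC = 2560 * 1440 * 165   # active video only
BLANKING_OVERHEAD = 1.10                    # assumed ~10% for reduced blanking
DP12_PAYLOAD_GBPS = 21.6 * 8 / 10           # 8b/10b encoding leaves ~17.28 Gbps

def link_rate_gbps(bits_per_pixel):
    return ACTIVE_PIXELS_PER_SEC * BLANKING_OVERHEAD * bits_per_pixel / 1e9

for name, bpp in [("YCbCr 4:2:2, 10-bit", 20),    # 10 (Y) + 10 shared between Cb/Cr
                  ("RGB / 4:4:4, 10-bit", 30)]:
    rate = link_rate_gbps(bpp)
    verdict = "fits" if rate <= DP12_PAYLOAD_GBPS else "exceeds"
    print(f"{name}: ~{rate:.1f} Gbps -> {verdict} DP 1.2's ~{DP12_PAYLOAD_GBPS:.2f} Gbps payload")
```

On those assumptions, 10-bit at 165 Hz plausibly fits a DP 1.2 input precisely because it is subsampled to 4:2:2; full RGB/4:4:4 10-bit at that refresh rate would not, so the mode Windows reports is consistent with a genuine DP 1.2 connection.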

OK, got it. Now, can you tell me how much larger a one-minute file shot in 10-bit will be versus 8-bit? Thank you.

elmono
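
There is no single answer to the question above: with compressed codecs the file size is set by the recording bitrate, so a 150 Mb/s file is the same size whether it is 8-bit or 10-bit. For uncompressed video the difference is just the ratio of the bit depths, 10/8 = 1.25x. A rough sketch, with clip parameters chosen arbitrarily for the example:

```python
# Uncompressed size of a one-minute UHD 4:2:2 clip at 8-bit vs. 10-bit.
WIDTH, HEIGHT, FPS, SECONDS = 3840, 2160, 25, 60   # assumed clip parameters

def uncompressed_gb(bit_depth, chroma="4:2:2"):
    # average samples per pixel: Y plus half-resolution Cb and Cr for 4:2:2
    samples_per_pixel = {"4:4:4": 3, "4:2:2": 2, "4:2:0": 1.5}[chroma]
    bits = WIDTH * HEIGHT * FPS * SECONDS * samples_per_pixel * bit_depth
    return bits / 8 / 1e9

for depth in (8, 10):
    print(f"{depth}-bit 4:2:2, 1 min UHD @ {FPS} fps: ~{uncompressed_gb(depth):.1f} GB")
```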

My computer does not render fast with the video card when I record in 4:2:2 10-bit; it only recognizes 4:2:0.

TVPALOTINA
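
That matches a common situation: many GPUs hardware-decode 4:2:0 H.264/H.265 but not 4:2:2, so 4:2:2 10-bit clips fall back to slower software decoding on the CPU. One way to confirm what a clip actually contains is to read its pixel format with ffprobe (part of FFmpeg). A small sketch, assuming ffprobe is installed and using "clip.mp4" as a placeholder filename:

```python
# Print the video stream's codec and pixel format (e.g. yuv420p vs. yuv422p10le).
import subprocess

result = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "stream=codec_name,pix_fmt",
     "-of", "default=noprint_wrappers=1", "clip.mp4"],
    capture_output=True, text=True, check=True)
print(result.stdout)   # e.g. codec_name=hevc, pix_fmt=yuv422p10le
```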