Ugly Hedgehog - Photography Forum
10 bit monitor on 8 bit video card/chip computer
Jan 7, 2020 15:50:52
wrangler5 Loc: Missouri
 
I saw a post after Christmas pointing out that B&H had a sale on the BenQ SW240 24.1" 16:10 PhotoVue IPS Monitor at a good discount. (I'd swear I saw it on UHH, but a search back through the digests, which is what I read, didn't turn up the post. Anyway . . . ) In a fit of get-it-while-I-can I ordered one at the last minute on 12/31 and it arrived yesterday.

In the meantime I've done some research and realized that this monitor has (or can handle) 10-bit color depth. But my 2012 quad-core i7 Mac Mini can only put out (if that's the correct term) 8-bit color information. I do all of my image work on the Mac, but I do have a spring 2019 Windows 10 box available. It's a Micro Center house brand with an 8-core i7 processor and integrated Intel UHD Graphics 630. (I didn't buy it to do photo work, just to run a couple of programs that are Windows-only products.) I gather that this provides 8-bit color depth natively, but it seems there may be a driver setting that can enable 10-bit output. Or I could add a new GPU to the Windows box that does 10-bit color natively.
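As I understand the arithmetic, 8-bit means 2^8 = 256 gray levels and 10-bit means 2^10 = 1024, which matters most for B&W since all of the tonal information sits in a single gray scale. A quick Python sketch (just illustrative, and it assumes NumPy is installed) that quantizes a smooth ramp both ways:

```python
import numpy as np

# A perfectly smooth black-to-white ramp, stored as floats 0.0-1.0
ramp = np.linspace(0.0, 1.0, 4096)

# Quantize to each bit depth and count the distinct grays that survive
levels_8bit = np.unique(np.round(ramp * 255)).size    # -> 256
levels_10bit = np.unique(np.round(ramp * 1023)).size  # -> 1024

print(f"8-bit:  {levels_8bit} gray levels")
print(f"10-bit: {levels_10bit} gray levels")
```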

ALL of my printing is B&W. I do all of my post-processing in Lightroom 6.14, and print through LR to a pair of Canon Pro-10 printers (they were on sale). I gather that LR has no facility for 10-bit display output, regardless of the bit depth of the file it starts with or the type of computer it is running on.
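One thing I did confirm is that the files themselves can carry more than the display path shows. If anyone wants to check what a given file actually holds, Pillow will report it (this assumes Pillow is installed, and "scan.tif" is just a placeholder name):

```python
from PIL import Image

img = Image.open("scan.tif")  # placeholder filename
print(img.mode)
# "L"             -> 8-bit grayscale
# "I;16" / "I;16B" -> 16-bit grayscale
# "RGB"           -> 8 bits per channel color
```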

My immediate dilemma is whether to unpack the new monitor and set it up, or just send it back unopened. If it's a "10-bit monitor," will it function effectively if it's only getting 8-bit information from Lightroom?

I am currently using an inexpensive Asus monitor (on both the Mac Mini and the Windows box; I switch inputs when I switch machines), which replaced a probably just-as-inexpensive Acer monitor that died in 2019. It is not calibrated, and while it looks good for general computer use, the B&W images I develop in LR need brightness (-30) and contrast (+50) corrections on the Print panel to look good out of the printers. I thought a "better" monitor might be the solution, but now I'm starting to think I should just try to adjust the Asus so that what I see in LR will print with little or no additional compensation.
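If I go the adjust-the-Asus route, I gather the usual approach is to print a step wedge and compare it against the screen. A rough sketch of generating one in Python (assumes NumPy and Pillow; the 21 steps, the pixel dimensions, and the file name are arbitrary choices on my part):

```python
import numpy as np
from PIL import Image

steps = 21
width, height = 2100, 300

# 21 equal gray patches from pure black (0) to pure white (255)
wedge = np.repeat(np.linspace(0, 255, steps), width // steps).astype(np.uint8)
wedge = np.tile(wedge, (height, 1))

Image.fromarray(wedge).save("step_wedge.tif")  # 8-bit grayscale TIFF
```

The idea would be to print that through LR with no corrections, then compare it to the same file on screen while adjusting the monitor.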

Thoughts will be welcome.

Jan 7, 2020 16:16:50
johngault007 Loc: Florida Panhandle
 
I had to do a little reading on 10-bit for Windows and Mac, and it seems the limitation sits at the graphics card and software level rather than at the monitor. If your workflow only provides 8-bit output, the monitor will handle it just fine. If you can afford to keep it, or you do other things like gaming or high-def video, I would keep it and let the rest of the technology catch up.
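One practical way to test the whole chain is a smooth gray ramp viewed full screen: on a true 10-bit path it looks continuous, while on an 8-bit path you can usually pick out faint vertical bands. Something like this would generate a test file (assumes NumPy and Pillow, and you'd need a viewer that can actually display 10-bit, which Lightroom can't):

```python
import numpy as np
from PIL import Image

width, height = 4096, 600

# One gray step per column, using the full 16-bit range
ramp = np.tile(np.linspace(0, 65535, width).astype(np.uint16), (height, 1))

# "I;16" = 16-bit little-endian grayscale, matching x86/ARM byte order
img = Image.frombytes("I;16", (width, height), ramp.tobytes())
img.save("gray_ramp.tif")
```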
