500 nits vs. Spyder5Pro (and probably other calibration tools)
a6k
Loc: Detroit & Sanibel
BebuLamar wrote:
I do like high-res screens like 4K, but I need a monitor that supports hardware calibration, and those that do that and are 4K are still quite expensive.
The BenQ SW271 does all that and is around 1100 USD at B&H and others.
tcthome wrote:
I would look for a monitor that's geared for photography and be more concerned with color output (100% Adobe RGB). BenQ, Eizo, and NEC all make these monitors, but they all cost more. I also think they make the monitors you're talking about that bright for movies and gamers. Also, some monitors have different modes, so you can calibrate in, say, photo mode or one of the custom modes and switch back to a reading or gaming mode if need be. Good luck & happy shopping.
Dell's P2715Q was the first 27" monitor to deliver 95% Adobe RGB. It has been on the market for years and has been succeeded by more than one model offering 100% for prices well below the brands named above.
OK, maybe I'm dumb, but what is a nit?
a6k
Loc: Detroit & Sanibel
RCJets wrote:
OK, maybe I'm dumb, but what is a nit?
The technical answer was given by thaupt above: cd/m². It is a measure of brightness, the strength of the light per unit of area, and it is thus linear rather than expressed in stops.
a6k wrote:
You all have offered interesting information. But the question I'm trying to resolve is why it would be important to have a brighter monitor if good calibration and my own comfort tell me it would not be used. Is there something more to this that I am missing?
I do understand that more pixels on a particular size of screen necessarily means that absent any scaling, text will be smaller. I also understand that monitors may or may not perform at their advertised specifications. Those are not questions I'm asking however important it may seem to others.
A related question would be this: if black is black, then let's call it zero. A brighter monitor puts out more light and, in logarithmic terms, more "stops". Mine spans around 6 or 6.5 stops as measured by a light meter. That's nowhere near the rated contrast ratio. But how could it be, if I have the brightness turned down because of calibration? As an example of the math, 500 nits is one full stop brighter than 250 nits and around two full stops brighter than 120 nits. Right?
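The stop arithmetic above can be checked with a few lines of Python. Since nits are linear, the difference in stops between two luminance levels is just the base-2 log of their ratio (the function name here is mine, for illustration):

```python
import math

def stops_between(nits_a: float, nits_b: float) -> float:
    """Brightness difference in photographic stops between two
    luminance levels given in nits (cd/m^2).

    Nits are linear, so the stop difference is log2 of the ratio.
    """
    return math.log2(nits_a / nits_b)

print(stops_between(500, 250))   # 1.0  -> exactly one stop
print(stops_between(500, 120))   # ~2.06 -> roughly two stops
print(stops_between(1000, 1))    # ~9.97 -> a 1000:1 contrast ratio is about 10 stops
```

The last line also shows why a measured 6-6.5 stops falls well short of a typical rated contrast ratio: 1000:1 corresponds to nearly 10 stops.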
a6k wrote:
You all have offered interesting information. But ...
Excessive brightness will result in the color intensity on the screen not matching what you print.
a6k
Loc: Detroit & Sanibel
rgrenaderphoto wrote:
Excessive brightness will result in the color intensity on the screen not matching what you print.
That makes sense and makes my question relevant. Why use 350 if it's too bright? The explanation about viewing environment is the only answer that makes much sense.
RCJets wrote:
OK, maybe I'm dumb, but what is a nit?
cd/m² would be spoken as "candela per square meter" (or "per meter squared", if you prefer).
The candela is the base unit of luminous intensity in the International System of Units (SI); that is, luminous power per unit solid angle emitted by a point light source in a particular direction. Think of a candle - that's it!
But why say some complicated SI term when "nit" is more fun! It's the normal shorthand term.
Having a bright display looks great - the images pop so well.
But yes, your printed output is better represented at a lower display intensity.