
Under $400 1080p monitor for color correction?


mood_lover


Hi all, wasn't sure where to post this so mods please move if necessary.

 

I am looking for a relatively affordable (< $400) color correction monitor. I understand these things are super hard to come by at this price, but it's all I can afford at the moment. I can probably get my hands on an X-Rite i1 calibrator if necessary.

 

It has to be 1920x1080 minimum (but I have no need for more resolution) and great for color correction. Honestly, I have no clue what to look for in a monitor that marks it as meant for this work, but I'm pretty technically savvy, so I can pick it up quickly. Thanks for any suggestions!


Years back I paid $330 for a 27-inch 1920x1080 LED LG 27EA63 at Best Buy. It has an sRGB color gamut, and I've color corrected nearly 1000 raw images on it so far. I also bought a ColorMunki Display to calibrate it, just to be sure that the $10k Minolta color analyzer LG used at the factory to calibrate it out of the box lived up to its color accuracy claims. I can assure you it does.

 

Below are images I shot of this display to show how it renders smooth gradients and accurate color, along with LG's calibration spec sheet and the results from the ColorMunki Display, showing Apple's ColorSync color gamut map compared against sRGB.

 

That model and newer LG LED display models are still under $400 and I highly recommend it for color editing.

 

[attached: photos of the display, LG's calibration spec sheet, and the ColorMunki results]


Hey Tim, thanks for the response. Some better monitors I'm looking at boast that they cover 99% of AdobeRGB, but some of the lower-end ones don't; they claim 100% sRGB coverage and don't even mention AdobeRGB. I know AdobeRGB is a wider color space; is that something I need? I rarely print; most of my final output is for web. Will look into this LG, thanks!

Couldn't tell you if you "need" an AdobeRGB gamut display, since I've never seen a calibrated wide gamut display. You may need it if you're going to use it for graphic design with color-critical Pantone ink matching for product and fashion work, or for motion picture production output to wide-gamut Sony 4K projectors and IMAX.

 

I've seen AdobeRGB Samsung Galaxy pads and they look way too saturated. Can't say I've seen "more" colors though.

 

With the sRGB LG I get print matches even from my local Walmart's Fuji Frontier Dry Lab inkjet printer, using my images converted to sRGB with the Fuji enhancements turned off.

Edited by Tim_Lookingbill

Yeah, I might do that. I'm looking at two different BenQ monitors. One is the new SW2700PT at $600 with 99% AdobeRGB, and the other is the older PD2700Q at $329 with 75% AdobeRGB. Both have almost identical specs, the differences being AdobeRGB coverage and that the more expensive one sports a 14-bit 3D LUT. Wondering how significant these two differences are; maybe some monitor pros can pitch in.

 

Comparison between: 27" BenQ PD2700Q, 27" BenQ SW2700PT, 27" Dell InfinityEdge U2717D, 27" LG 27UD68

 

I think I may buy the cheaper one, since it's 100% sRGB, if it's going to do the job.

Edited by mood_lover

Why would you need a monitor pro to tell you about something you don't need, since you want a budget monitor?

 

A search on the web will give you plenty of info on high bit internal hardware display LUTs. That subject has been discussed to exhaustion for over ten years.

 

It appears you already had the info you needed to make a decision to begin with. I'm done here.


Whoa, it was just a random comment, no need to get angry... thanks anyway, have a good one.

Edited by mood_lover

Where did I say I was angry? I'm just puzzled why you'd ask for help on this and then indicate you already had display models in mind.

 

You came across as someone completely new to this, going by your questions on wide gamut and hardware LUTs, when it's clear you've already done research, which makes you more informed than you let on.

 

Are you a beginner or not? I'm puzzled by this. This thread was posted in the Beginner Questions forum.


Where did I say I was angry? I'm just puzzled why you'd ask for help on this and then indicate you already had display models in mind.

 

I think it's reasonable to say "I've got this far, and am looking at these; is this the right kind of thing, and if so, which option is better?" I'd rather have that than someone not sharing what they've established so far.

 

Are you a beginner or not? I'm puzzled by this. This thread was posted in the Beginner Questions forum.

 

Er...

 

Hi all, wasn't sure where to post this so mods please move if necessary.

 

You came across as someone completely new to this, going by your questions on wide gamut and hardware LUTs, when it's clear you've already done research, which makes you more informed than you let on.

 

Yes, well...

 

Let's put it down to a misunderstanding.

 

If you're targeting web use, sRGB is currently what, if anything, you may expect your audience to see. Anything claiming 100% of sRGB really ought to cope, and budgeting for a colorimeter is probably more important than spending your money on a high-gamut model. Gamuts are increasing (wide gamut is starting to appear on web sites, wide-gamut and HDR monitors, TVs and mobile devices are appearing) but if you're aiming for a wide audience then sRGB is all you can really rely on. And actually you can't rely on that, since a lot of people are using fairly cheap and uncalibrated displays (including the one I'm typing this on).

 

If you want to cover more of a printer gamut, or if you want to target wide-gamut devices that are actually calibrated, then yes, there's something to be said for a SpectraView or ColorEdge - but it really sounds as though you don't need one. So long as you watch out for uneven backlighting and direction-dependent colour shifts, even cheap monitors, used properly, are a lot less awful than they used to be. Inappropriate ambient lighting or a coloured background (both on the computer and in your room) will probably throw your editing off more than most mid-range displays will.

 

Executive summary: don't sweat it, unless your current display is really terrible.

 

Disclaimer: I'm not a graphic artist, but I know a moderate amount about colour science and display technology.


Where did I say I was angry? I'm just puzzled why you'd ask for help on this and then indicate you already had display models in mind.

 

You came across as someone completely new to this, going by your questions on wide gamut and hardware LUTs, when it's clear you've already done research, which makes you more informed than you let on.

 

Are you a beginner or not? I'm puzzled by this. This thread was posted in the Beginner Questions forum.

I was doing research in the first two hours of this thread. I am technically inclined, as I stated in my first post, and pick this stuff up quickly. I didn't have any models in mind; I just came across the BenQs that have recently been marketed toward photographers, and they had stunning reviews, so I wanted to bring them up and ask which would be a good option to go with.

 

I looked for your recommended LG; it was discontinued. So I looked into what makes a photographer's monitor and learned about IPS, color gamut and fidelity, color reproduction, watched all the YouTube reviews, etc. You don't have to research for weeks to get an idea of what to look for, can you understand that? I'm shopping, not becoming an engineer.

 

What I didn't understand is why the expensive monitor has a 14-bit 3D LUT and how much of a difference it makes for color correctors like myself using a monitor that may not have that. Hence me asking if someone who knows can pitch in (whether that's you or anyone else). I really don't understand your confusion; not everyone is a slow learner.

Edited by mood_lover

What I didn't understand is why the expensive monitor has a 14-bit 3D LUT and how much of a difference it makes for color correctors like myself using a monitor that may not have that.

 

I'm not 100% sure of myself on this answer even though I used to work with desktop graphics cards and still work in the computer graphics industry, but I'll bring up an argument I had with Gretag Macbeth one day, since it may help:

 

Most (recent) computers store the display in memory with 8 bits each for red, green and blue - 8 bits gives you 256 possible values (2 to the 8), and with three colour channels that gives you 256 x 256 x 256 = over 16 million colours. Each of the 256 values represents a brightness, and because displays (usually) use additive colours, you can represent any colour on the display by combining red, green and blue brightnesses. Basic DVI, HDMI and (although it's flexible) DisplayPort send these 8-bit values for each channel to the monitor, which then converts those values to a display colour. It's possible to store more different colours than this, but uncommon, and the vast majority of applications expect to work with a display configured like this.
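
If it helps to see that arithmetic, a couple of lines of Python (purely illustrative, nothing from any real driver):

```python
# 8 bits per channel: 256 levels each for red, green and blue.
bits_per_channel = 8
levels = 2 ** bits_per_channel       # 2 to the 8 = 256
print(levels ** 3)                   # 16777216 - "over 16 million colours"

# A pixel is commonly packed as 0xRRGGBB; mid-grey, for example:
pixel = (128 << 16) | (128 << 8) | 128
print(hex(pixel))                    # 0x808080
```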

 

Note that the maximum brightness and black level of the display ("contrast" and "brightness") are fairly separate from calibration and controlled directly by the monitor - although you're typically asked to set them to a nominal level when you calibrate the display. Everything else we do is relative to the brightness and contrast settings of the monitor.

 

Now let's calibrate the display. There are two parts to this:

  • a custom display profile, which tells the colour management system (CMS) how to map defined colours to the display colours (this has a performance cost and is only used by some bits of software, like Photoshop, which actually care about colour)
  • a calibration curve for the colour channels, which maps from a nominal brightness in each colour channel to a different value that should be "correct" when the display shows it

Note that the profile can do relatively complex conversions (such as changing the definition of "red" to have a bit of green in it), whereas the calibration curve only works on each colour channel independently. Historically, the reason for this is that the calibration curve can be built into the graphics card: having a look-up table for 256 values (times 3) to map one brightness value to another is a relatively small piece of silicon, but a table with 16 million values in it that can map any input colour to any arbitrary output colour is very costly, especially if it has to work as fast as pixels get transferred from the graphics card to the display. Because the calibration curve is built into the hardware, any software should pick up on it without having to do anything, unlike a profile which requires explicit application support.
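
Just to make the per-channel calibration curve concrete, here's a minimal Python sketch - the function name and the gamma target are mine, for illustration only:

```python
# One 256-entry table per colour channel, applied by the graphics card to
# every value on its way to the display. A gamma-style curve stands in for
# whatever the calibration software actually measured.
def build_calibration_lut(gamma=0.87, levels=256):
    return [round(((v / (levels - 1)) ** gamma) * (levels - 1))
            for v in range(levels)]

lut = build_calibration_lut()
r, g, b = 128, 64, 200            # what the application writes into memory
print(lut[r], lut[g], lut[b])     # what the monitor actually receives
```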

 

In the good old days of VGA connectors and CRT monitors, the look-up table often had 8-bit inputs (you can store 256 tones in the desktop memory, or "frame buffer" to use a technical term), but could often describe the output colour in more detail (say 10-bit - we'll come back to this). Because a continuous analogue voltage got fed over a VGA connector, calibrating the look-up table on the graphics card gave you more accurate colours. (The down side of putting out a continuous voltage was that any interference on the cable affected the display.)

 

Now let's look at DVI connectors, which were common for most flat panel displays until fairly recently. With a basic DVI system, the graphics card in the computer mapped 256 brightness levels to another set of 256 brightness levels, because that's all the graphics card could send over the connector.

 

The good news is that if, say, value "128" is too dark and should be sent to the monitor as value "140" to achieve the correct brightness, you can put this in the graphics card's look-up table; the application writes "128" into memory, blissfully unaware of what the monitor is doing, and the graphics card translates that to "140" by the time the monitor sees it, so it comes out at the right brightness.

 

The bad news is that you only have 256 values to play with on the output, so if you've mapped "128" to "140" and you have another 127 values (the ones from 129..255) to fit into the remaining 141..255 range, there isn't space. What is for the computer a continuous range of brightness values will get banding on the display, because multiple colour values map to the same thing. To make things worse, you've "stretched" the values from 0..128 on the computer to 0..140 on the monitor (I'm assuming you don't have values that make the display get darker as the computer thinks they're getting brighter). Since there are more tones the monitor can display in that range than you're feeding it, there are gaps - again, giving banding. Part of the problem is that you probably didn't really want exactly "140" as the brightness - you actually wanted a fractional value, and the rounding makes things uneven.
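
You can count the damage with a short Python sketch (mine, again; the 0.87 gamma matches the 128-to-140 example above):

```python
# With 8 bits in and 8 bits out, a remapping curve must reuse some output
# codes (flat spots = banding) and never produce others (gaps).
gamma = 0.87
out = [round(((v / 255) ** gamma) * 255) for v in range(256)]
print(256 - len(set(out)))                 # inputs that collide with a neighbour
print(len(set(range(256)) - set(out)))     # output codes that are never sent
```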

 

Let me give an example by pretending the display could only use 16 values (0..15) rather than 256 (because otherwise this table would be huge), and that we can send the same to the display. If you're old enough, you may have used a 12-bit display that was this limited - that representation is still used in computer graphics to save space. For the sake of argument, let's have a nice smooth gamma curve (as typically applied to a display), and I'll pick a gamma value of 0.87, because that happens to give roughly the mapping from 128/255 to 140/255 that I arbitrarily picked above. Yes, I'm dull enough to have done the maths.

 

Okay, bbcode tables don't work in photonet. Now I know. I'll try writing it long-hand. I miss the old HTML input scheme, again...

 

Column A: Input screen (framebuffer) value.

Column B: Ideal output value (screen value divided by 15, to the power of 0.87, times 15)

Column C: Rounded value sent to monitor

 

 A      B    C
 0   0.00    0
 1   1.42    1
 2   2.60    3
 3   3.70    4
 4   4.75    5
 5   5.77    6
 6   6.76    7
 7   7.73    8
 8   8.68    9
 9   9.62   10
10  10.54   11
11  11.45   11
12  12.35   12
13  13.24   13
14  14.13   14
15  15.00   15

 

Um. I'm going to stop there because I'm not sure whether the table renders correctly in the forum (it didn't). If it does, I'll continue in a second post. If not, I'll fix it first (I did).
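
If you'd rather not trust my arithmetic, this little Python fragment regenerates the table from the formula in the column B description:

```python
# Column B: (A / 15) ** 0.87 * 15; column C: B rounded to a whole level.
gamma, top = 0.87, 15
print(" A      B    C")
for a in range(top + 1):
    b = ((a / top) ** gamma) * top
    print(f"{a:2d}  {b:5.2f}  {round(b):3d}")
```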

Edited by Andrew Garrard
Link to comment
Share on other sites

Also, curse the inability to select anything other than a proportional font.

 

Right, what did this (not very clearly) show? There's a gap in output values between 1 and 2, where what should be a brightness gap of "1.18" levels is actually a gap of 2. Desktop values 10 and 11, which should be "0.91" apart, actually have the same value. We have banding. Even with calibration, both the 10 and 11 values are quite a long way off what they should be; they're just the best we can do.

 

The solution is to have more output levels. If, instead of just whole numbers, we could send quarter values to the display, we'd be much better off:

 

Column A: Input screen (framebuffer) value.

Column B: Ideal output value (screen value divided by 15, to the power of 0.87, times 15)

Column C: Rounded value sent to monitor, now rounded to the nearest 1/4.

 

 A      B       C
 0   0.00    0.00
 1   1.42    1.50
 2   2.60    2.50
 3   3.70    3.75
 4   4.75    4.75
 5   5.77    5.75
 6   6.76    6.75
 7   7.73    7.75
 8   8.68    8.75
 9   9.62    9.50
10  10.54   10.50
11  11.45   11.50
12  12.35   12.25
13  13.24   13.25
14  14.13   14.25
15  15.00   15.00

 

It's still not perfect, but we have much more even gaps, and we don't have duplicate values (although we would if the "gamma" value were much farther from 1). The ability to store "quarters" is what we get if we go from four bits (0..15) to six bits (0..63). In a desktop, it's also what we get going from 8 bits per channel to 10 bits per channel. On a CRT, the best graphics cards had 10-bit outputs from their look-up tables, meaning they could manage this much accuracy - and on a CRT display with limited gamut and brightness, it's hard to see any errors. The problem was the display connection, which stopped us applying this trick.
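
The same comparison in Python, measuring the worst-case rounding error of each path (my sketch, not an official spec):

```python
gamma, top = 0.87, 15
ideal = [((a / top) ** gamma) * top for a in range(top + 1)]
whole = [round(b) for b in ideal]              # integer output (4-bit path)
quarter = [round(b * 4) / 4 for b in ideal]    # quarter steps (two extra bits)
print(max(abs(i - w) for i, w in zip(ideal, whole)))     # approaches 0.5
print(max(abs(i - q) for i, q in zip(ideal, quarter)))   # at most 0.125
```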

 

Some modern graphics cards do have ways of working around this, such as dithering the colours (like a newspaper halftone) to approximate the intermediate values - it's harder to see colour errors in very small areas, and for large areas of the display this isn't a bad approach. It can go wrong if you feed the wrong pattern to the display, though. If you can drive the display with 10 bits of accuracy per channel, you get back to the situation with an old CRT - with the advantage that you might be wanting to represent intermediate values in your source content as well. Exposing this functionality can be a little messy for compatibility, though. A better solution, where possible, is to send everything to the display unmodified, and let it sort everything out for you. If the display can handle more bits of accuracy internally, it can apply a better smoothing curve.
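
As a toy sketch of the dithering idea - real hardware would use a carefully chosen fixed pattern rather than random noise, so treat this as illustrative only:

```python
import random

# Approximate a fractional brightness on an integer-only link by mixing the
# two neighbouring codes across many pixels; the eye averages them out.
def dithered_samples(target, n=100_000):
    base = int(target)
    frac = target - base
    return [base + (random.random() < frac) for _ in range(n)]

samples = dithered_samples(140.3)          # a value "between" 140 and 141
print(sum(samples) / len(samples))         # close to 140.3 on average
```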

 

All of this is a "1D LUT" - the look-up table for each colour channel affects only that colour channel. A true "3D LUT" also allows mappings between the channels, meaning that, as I mentioned under the profile, you can make red "a bit greener" or blue "a bit more violet", which you can't by adjusting the channels individually. Different manufacturers use different filters (or phosphors, or OLED chemistry), so the exact native colours of monitors vary; a 3D LUT lets you compensate for this. To show this isn't so obscure: various TV standards (NTSC standard def, everyone else's standard def, HDTV, UHDTV) all have different ideas of what is meant by "red", "green" and "blue". In a lot of professional photo applications, the colour management system does this correction in software; if your monitor can do it in hardware, then everything is corrected - including, for example, simplistic video playback.

 

Nobody will use a full-size LUT mapping all the representable input colours to all the representable output colours (which would actually avoid the need for 1D LUTs); even for 10-bit values, this would need about 4GB of very fast memory just to store the LUT. It's much more common to store the "3D" portion at lower accuracy and interpolate between values, combining it with higher-precision 1D LUTs to get a smooth tone curve (with a separate curve the 3D part of the LUT is typically close enough to linear that interpolation doesn't lose much accuracy). The precision with which values are stored in the LUTs and the precision of calculations performed on them affect the output - that's where the "14-bit" marketing term comes from. The computer can do this as well in its colour management system, but it's limited in the precision of colours it can send to the display.
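
To see why the small-grid-plus-interpolation scheme works, here's a Python/NumPy sketch; the 17x17x17 grid size and the identity mapping are placeholders I've picked, not anything a particular monitor uses:

```python
import numpy as np

N = 17                                   # a modest grid, not 2**30 entries
grid = np.linspace(0.0, 1.0, N)
# lut[r, g, b] holds the corrected (r, g, b); identity used as a stand-in.
lut = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)

def apply_3d_lut(rgb, lut):
    """Trilinear interpolation between the 8 grid points around a colour."""
    n = lut.shape[0] - 1
    pos = np.clip(np.asarray(rgb, dtype=float), 0.0, 1.0) * n
    i0 = np.minimum(pos.astype(int), n - 1)    # lower corner of the cell
    f = pos - i0                               # fractional position inside it
    out = np.zeros(3)
    for dr in (0, 1):                          # blend the 8 surrounding
        for dg in (0, 1):                      # grid points
            for db in (0, 1):
                w = ((f[0] if dr else 1 - f[0]) *
                     (f[1] if dg else 1 - f[1]) *
                     (f[2] if db else 1 - f[2]))
                out += w * lut[i0[0] + dr, i0[1] + dg, i0[2] + db]
    return out

print(apply_3d_lut((0.5, 0.25, 0.8), lut))     # identity LUT returns the input

# And the full-size-table cost mentioned above: 2**30 entries at ~4 bytes.
print(2 ** 30 * 4 / 2 ** 30, "GB")             # 4.0 GB
```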

 

The argument I mentioned with Gretag Macbeth? I was using a monitor with 8-bit values and an 8-bit connector. I wanted their software to produce a profile (so colour-aware software could produce correct colours on my screen) but leave the graphics card LUTs alone (so I still had all 256 shades of each channel available for the computer to use). I believe this was technically quite possible by the rules of how colour profiles are defined, but I was dismissed as wanting an "advanced use case". I've not gone back to this in several years, so I hope things have moved on. (I was also told by Adobe at the time that Photoshop dithers its screen output and that there was no point in my enabling dithering on the monitor - which, I later found, it didn't, at least at the time. But I'm not bitter. Don't get me started on why it was so hard to use absolute colorimetric rendering intents with a calibrated light. I got as far as people telling me that reflected photons were "different" from emitted ones, established that the official definition of the perceptual rendering intent is that there is no definition of the perceptual rendering intent, and washed my hands of the field. Well, kind of.)

 

So. Do you need a monitor with a high bit depth connection and an internal LUT? If you're producing wide-gamut printed content, maybe. If you're targeting wide-gamut and, especially, HDR displays, probably - you'll want the gamut, and the range of values is such that you're more likely to see banding between them.

 

For web content where the majority of the audience are using an uncalibrated sRGB display? I'd say probably not, unless you're really sensitive to how close to perfect the display is and you're worried about banding. I'd just get a display known for a good approximation to the values it's supposed to be producing (a lot of monitor reviews tend to test for this) and live with it, and calibrate if you're paranoid.

 

Viewsonic and Eizo have pretty slide decks explaining all this with more pictures, if it helps; there's another overview here.

 

That's a bit overkill for the beginners' forum, but I hope it helps! Apologies for any inaccuracies as technology has moved on since I touched it, but this is my best understanding at the time of writing.


I was doing research in the first two hours of this thread. I am technically inclined, as I stated in my first post, and pick this stuff up quickly. I didn't have any models in mind; I just came across the BenQs that have recently been marketed toward photographers, and they had stunning reviews, so I wanted to bring them up and ask which would be a good option to go with.

I looked for your recommended LG; it was discontinued. So I looked into what makes a photographer's monitor and learned about IPS, color gamut and fidelity, color reproduction, watched all the YouTube reviews, etc. You don't have to research for weeks to get an idea of what to look for, can you understand that? I'm shopping, not becoming an engineer.

What I didn't understand is why the expensive monitor has a 14-bit 3D LUT and how much of a difference it makes for color correctors like myself using a monitor that may not have that. Hence me asking if someone who knows can pitch in (whether that's you or anyone else). I really don't understand your confusion; not everyone is a slow learner.

At least you now know the quality that can be attained in a display priced under $400, as you asked in your OP. Since my model has been discontinued, any current display model will be the same or better quality for posting content to the web. You know what to look for in a quality display, even if it doesn't have a 3D hardware LUT and an AdobeRGB color gamut.


Where did I say I was angry? I'm just puzzled why you'd ask for help on this and then indicate you already had display models in mind.

 

You came across as someone completely new to this, going by your questions on wide gamut and hardware LUTs, when it's clear you've already done research, which makes you more informed than you let on.

 

Are you a beginner or not? I'm puzzled by this. This thread was posted in the Beginner Questions forum.

 

It seems that he didn't have one in mind when he posted, but even if he did ...

 

There is a certain feeling, when buying something somewhat expensive, that you want some confirmation that it is the right one, even if you are already almost sure.

Better to ask in advance than have buyer's remorse later.

-- glen


That's understandable, but since you mentioned being careful about expensive purchases, I feel I need to mention this: after buying about five displays online over 20 years, with screen quality issues you have to see in person - issues that have nothing to do with the smooth gradients, accurate color, and other concerns the OP expresses - the best decision I made was to buy from a local brick-and-mortar store such as Best Buy, where I could return the display if it had electronic or screen non-uniformity issues.

 

Buying displays online can be a crap shoot no matter the price or brand.

Edited by Tim_Lookingbill

I know AdobeRGB is a wider color space; is that something I need? I rarely print; most of my final output is for web.

 

If you are posting to the web, you will be displaying in sRGB, and for that purpose a wider-gamut monitor would be a complete waste of money. Worse, actually, because if you edit to your satisfaction in a gamut wider than sRGB, people viewing the image will not see what you see, as most of them will be seeing it in sRGB.

 

That would rule out the BenQ monitors, I think. Just get an sRGB monitor that is IPS, not TN.

 

I have used a number of different Dell sRGB monitors for editing. I haven't looked, but I suspect you can get one at a reasonable size for under $400. After color correction, they have been fine. The only issues arise when printing colors that are outside the sRGB gamut. I print a fair amount, but you said you don't.

