Beginner Trouble Calibrating Monitors

Hi

I wonder if you can help me with an issue with monitor calibration that is puzzling me. First off I am something of a noob with all this stuff, so please forgive me if I come across as completely clueless.

I have two identical monitors (Dell U2713H), both attached to the same computer and graphics card in a dual-monitor set-up. Both monitors display the same wallpaper image. In the colour settings, the preset mode is set to Colour Space: CAL 1 for both monitors.

I ran colour calibration using the Dell UltraSharp Calibration software and the X-Rite i1 Display Pro colorimeter, which is required by the software. The settings I used for both monitors were White Point: D65, Luminance: 120 cd/m², Tone Response Curve: Standard (default) and Gamma: 2.20. For the measurement I set it to automatic display control.

I calibrated both monitors one after the other with the same lighting for both. Yet the results I got were markedly different. One of the displays is now a lot darker and duller than the other one.

As I understand it, the calibration creates a profile, like a look-up table, that the graphics card reads to render colour and luminosity correctly. If this is so, then surely two identical monitors, both with exactly the same settings, should render the display the same after calibration?
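For anyone trying to picture what that look-up does, here is a minimal, purely illustrative sketch of the kind of 1D correction table calibration software can load into the graphics card. It is not the Dell software's actual method, and the measured gamma figures are invented for the example (only Python's standard library is used).

```python
# Purely illustrative: a 1D look-up table (LUT) of the sort calibration
# software loads into the graphics card for each output.

def build_correction_lut(measured_gamma, target_gamma=2.2, size=256):
    """Build a table that bends a display's measured response towards the target gamma."""
    lut = []
    for i in range(size):
        v = i / (size - 1)                            # input level, 0..1
        corrected = v ** (target_gamma / measured_gamma)
        lut.append(round(corrected * (size - 1)))     # back to 0..255
    return lut

# Two "identical" panels rarely measure identically, so each needs its own table.
lut_monitor_1 = build_correction_lut(measured_gamma=2.35)   # invented measurement
lut_monitor_2 = build_correction_lut(measured_gamma=2.05)   # invented measurement

print(lut_monitor_1[128], lut_monitor_2[128])               # same input, different corrections
```

The real software builds a much finer-grained correction per output, but the principle is the same: each physical panel ends up with its own table, even when the target settings are identical.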

Is my understanding of this all wrong, or am I missing something here? I tried again, just in case I had messed up, with much the same results. I even reset both monitors to their factory settings and had another go. They still look totally different. If you aren't getting consistent results, doesn't that kind of defeat the entire object of calibration and render the whole exercise rather pointless?

Any help or advice here would be greatly appreciated.
 
I remember that at one time the ColorMunki (I think it was) used at my camera club was unable to calibrate a two-monitor set-up - as in, only one of the monitors was ever calibrated and the other would default to something else. Not sure if it was specific to the calibration tool they were using. Sorry I can't remember more details; it was quite a few years ago.
 
I think it comes down to manufacturing variables and screens not being made to exacting standards.
Some measuring instruments are calibrated; others are for indication purposes only, and those have to be labelled as such.
I think if that applied to screens they would fall into the latter category.
 
I agree with @Bebop; my understanding is as follows...

Unless you have a graphics card that is specified as being a two-LUT card (e.g. the older Matrox dual-output cards, which are normally high-end and high-cost cards), it will only 'hold' one LUT profile generated by a calibrator and its software.

Therefore, as no two monitors, even of the same make and model, are 100% identical, you cannot (see above) calibrate both monitors [there may be ways round this where you could manually switch between two profiles?].

So choose one to be your #1 and calibrate that, accepting that the other one will be your general-use one.
 
Make sure that both monitors' hardware settings are the same.
There are actually two things that can occur that we typically combine under the generic term of "calibration." There is hardware calibration, which is modification of the settings (e.g. brightness) and LUTs (of the card/monitor) if that is supported (this typically involves a proprietary program/interface). And the second part is software profiling: the monitor ICC profile that is created.

Because they are separate devices, they should be using their own individual profiles, not the same profile. And you need a system that is able to apply separate ICC profiles to separate monitors. Windows 10 and Macs can; I believe Windows XP needed a separate program, and Vista had no support. How to set up multiple monitors on Windows 10
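If you want a quick sanity check that each monitor has been given its own profile, one way (assuming Pillow is installed; the file names below are placeholders rather than your actual profiles) is to print the description string stored inside each profile file:

```python
# Print the internal description of each monitor's ICC profile.
# Requires Pillow (pip install Pillow). The file names are placeholders;
# on Windows, installed profiles normally live in
# C:\Windows\System32\spool\drivers\color.
from PIL import ImageCms

profile_paths = [
    r"C:\Windows\System32\spool\drivers\color\U2713H_monitor1.icm",  # hypothetical
    r"C:\Windows\System32\spool\drivers\color\U2713H_monitor2.icm",  # hypothetical
]

for path in profile_paths:
    profile = ImageCms.getOpenProfile(path)
    print(path)
    print("  ->", ImageCms.getProfileDescription(profile).strip())
```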
 
I am running Windows 10 and I did use separate ICC profiles for each monitor when I did the calibration - assuming my graphics card allows this. It's an Nvidia GeForce RTX 2060 Super. If anybody knows anything about this, I am all ears. I've tried googling to see if there is any info on the card, but all I get is a lot of people going on about gaming, and most of it makes no sense to me whatsoever. I think it is safe to say I prefer cameras to computers!
 

I cannot find any specific reference to the number of LUTs... by inference that suggests it only has a single one, in which case AFAIK you can't software-calibrate more than one monitor.

The description of the Dell software says it allows you to create a 14-bit calibration, but does not quite explain the what and how of the way it works. With my BenQ monitor I use the BenQ Palette Master software, and that writes the profile to the monitor itself, i.e. the monitor is hardware profiled, but the monitor spec states it is/can be hardware calibrated. Using the plain i1 Display software (Palette Master is a special version of this), it writes the profile to the GPU.

Perhaps talk to Dell and find out what they have to say on the calibration process???
 
I am running Win 10 with onboard graphics and I cannot use separate ICC profiles, so I can only have one monitor calibrated.
 
This (respected) review site does mention hardware calibration support....so the plot thickens!


If that is indeed accurate then in principle AFAIK you should be able to hardware calibrate them individually so that for all practical purposes they will appear the same.

Perhaps, then, it comes down to the settings/process of calibration that the Dell software tells you to follow?
 
Are you using extended desktop? If you use mirrored or duplicate mode it can cause issues - and no splitters/switches.
 
Box Brownie, the process mentioned in the review was the process I followed. That was why I was surprised the results were so different for both monitors. sk66, yes I am using extended desktop.
 
I'm on Mac, so I'm running out of options/ideas for you. I would verify that your card supports dual LUTs. But understand that a 1D LUT is only a gamma curve offset for the video card. The rest of the calibration (color shifts) is in the ICC profile, which is only applied by a fully color-managed application (like Photoshop)... are you viewing them both in a fully color-managed environment?
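To make the color-managed part concrete, here is a rough sketch (using Pillow, with a placeholder image file and profile path) of what a managed application does with the monitor profile before the image ever reaches the video card LUT; it illustrates the principle rather than how Photoshop is actually implemented.

```python
# Sketch of colour-managed display: convert the image from its working space
# (sRGB here) into the monitor's profile before showing it. Non-managed
# programs (the desktop wallpaper, most games) skip this step and only get
# whatever correction sits in the video card LUT.
# Requires Pillow; the image name and profile path are placeholders.
from PIL import Image, ImageCms

image = Image.open("test_image.jpg")                  # assume an sRGB image
srgb = ImageCms.createProfile("sRGB")                 # source/working space
monitor = ImageCms.getOpenProfile(
    r"C:\Windows\System32\spool\drivers\color\U2713H_monitor1.icm"  # hypothetical
)

transform = ImageCms.buildTransform(srgb, monitor, "RGB", "RGB")
managed_view = ImageCms.applyTransform(image, transform)
managed_view.show()                                   # what a managed app would display
```

Comparing the wallpaper is therefore a non-managed comparison; viewing the same image in a fully managed application on both screens is the fairer test.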

I would try uninstalling the monitor that calibrated well and redoing the other one as a standalone... that will help verify that there isn't some kind of hardware issue.
 
The implication is that if you have one graphics card you can only have one LUT active at a time. The card itself is 'dumb', i.e. it doesn't know which monitor is attached to it.
The only way to run two calibrated monitors is to run two graphics cards, each calibrated to suit the monitor that is attached.
 

I used to run dual monitors using the onboard graphics of the PC motherboard. I could calibrate both monitors using the Spyder 5 and associated software. When I installed a discrete graphics card the same was true.
What may be pertinent to the OP's problem is how the OP is comparing the two screens, and with what application, as suggested by sk66 above.
 
I think most points have been discussed here, although many monitors with built-in cameras use automatic ambient light adjustment.
I have no idea if this is applied to individual monitors or whether it's just a global dimming/brightening.

That said, iMacs, (I think) apply an ambient light colour correction curve as well as brightness.

I could be talking total twaddle though!
 
I had a similar problem with my Spyder 3 calibrating a dual-monitor setup; I switched to using DisplayCAL and was able to get them looking the same (or close enough that they look the same to me!)
 
Sorry it's taken me so long to reply. Work and other stuff has intruded to the point that I only had time today to test out sk66's suggestion to uninstall the monitor that calibrated well and redo the other one as a standalone. All seemed fine, so I don't think the issue is with the monitor.

I tried something else, whereby I set the colour space setting on monitor 1 to use CAL 1 in the monitor menu and monitor 2 to use CAL 2. I then calibrated monitor 1, set to CAL 1, in the Dell Colour Management software (DCMS), and created an ICC profile for use solely by monitor 1. I then switched to CAL 2 in the DCMS, calibrated monitor 2 and created a separate ICC profile for monitor 2. I then checked the Display Settings in Windows 10 to ensure that both monitors were set to the individual ICC profiles that I had created for them. (I hope that all makes sense.)

It did seem to work, because both monitors now look pretty alike with the same wallpaper showing, unlike before, when one was considerably darker and duller than the other. If this has worked and my graphics card is capable of holding two LUTs, then hurrah and much rejoicing. I am still none the wiser. On the other hand, if it's only capable of one LUT at a time and I've just managed to get both monitors looking alike by some lucky fluke, then I guess I will find out when it's time to re-calibrate in a couple of weeks' time. If that is the case then oh well, never mind. It's by no means a show-stopper.
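Out of interest, if you ever want to go beyond eyeballing the wallpaper, one illustrative way to put a number on how closely the two screens match is to measure the same patch on each (if your calibration software reports L*a*b* readings for test patches) and work out a simple colour difference. The Lab values in this sketch are invented placeholders, not real measurements.

```python
# Illustrative only: CIE76 Delta E between the same patch measured on each
# monitor. The L*a*b* values below are made-up placeholders.
import math

def delta_e_76(lab1, lab2):
    """Simple CIE76 colour difference between two L*a*b* readings."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

patch_monitor_1 = (53.4, 0.2, -0.5)   # mid-grey patch on monitor 1 (placeholder)
patch_monitor_2 = (54.1, -0.1, 0.3)   # same patch on monitor 2 (placeholder)

print(f"Delta E between monitors: {delta_e_76(patch_monitor_1, patch_monitor_2):.2f}")
# Differences below roughly 2 are generally hard to spot side by side.
```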

Two graphics cards would be complete overkill in my opinion. I will just follow Box Brownie's advice to 'choose one to be my #1 and calibrate that, accepting that the other one will be my general-use one'. Whilst having both monitors bang on would be very nice, that is a perfectly acceptable solution. One has to be pragmatic about these kinds of things and I am sure I will cope!

Thank you to everybody on this thread who has taken the trouble to reply to me. All of your help and advice is very much appreciated.
 
If you sell your 2060 Super (which is a gaming card and complete overkill for photography) you'd probably have enough cash to buy two photo-centric graphics cards.

Just a thought.
 
I then calibrated monitor 1, set to CAL 1, in the Dell Colour Management software (DCMS), and created an ICC profile for use solely by monitor 1. I then switched to CAL 2 in the DCMS, calibrated monitor 2 and created a separate ICC profile for monitor 2. I then checked the Display Settings in Windows 10 to ensure that both monitors were set to the individual ICC profiles that I had created for them. (I hope that all makes sense.)
That sounds about right... they should both be using their own specific profile.

It did seem to work, because both monitors now look pretty alike with the same wallpaper showing, unlike before, when one was considerably darker and duller than the other. If this has worked and my graphics card is capable of holding two LUTs, then hurrah and much rejoicing. I am still none the wiser.
I checked the specs for your video card... it supports up to 4 monitors.

[Screenshot of the RTX 2060 Super specifications showing support for up to four monitors]
 

If he has hardware profiled each monitor, AFAIK it should be academic/unnecessary to 'load' the profiles into W10... as whatever profile is listed of those available is overridden by the hardware profile?

Plus, AFAIK, unless specified, the number of ports and supported monitors does not indicate how many LUTs the card supports?
 
If it was hardware profiled, I don't think it would create an ICC profile; it would create a 3D LUT that is loaded into the monitor itself. And I don't think it would be written to the disk drive either; it is stored in the monitor's RAM - it is a direct rewrite of the monitor's LUT (unless as a backup?).
I don't know that multiple ports necessarily means a card supports multiple LUTs, but I think it is probably unusual for it not to support multiples these days. Nvidia's calibration instructions specify assigning separate profiles for multiple monitors when using their cards, and I couldn't find anything to the contrary saying they don't support multiple calibrations.
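For anyone following the 1D versus 3D LUT distinction here, this is a tiny, made-up sketch of the difference: the video card's 1D tables re-map each channel on its own, while a monitor-resident 3D LUT re-maps whole RGB triplets. Real hardware interpolates between grid points rather than snapping to the nearest one, and the grid size and values below are invented for illustration.

```python
# Made-up illustration of a 1D LUT (per-channel, video card style) versus a
# 3D LUT (whole-triplet, as stored inside a hardware-calibrated monitor).
GRID = 17  # a common 3D LUT grid size (17 x 17 x 17 entries)

def identity_3d_lut(size=GRID):
    """Identity 3D LUT: every (r, g, b) grid point maps back to itself."""
    step = 255 / (size - 1)
    return [[[(round(r * step), round(g * step), round(b * step))
              for b in range(size)]
             for g in range(size)]
            for r in range(size)]

def apply_3d_lut(rgb, lut, size=GRID):
    """Nearest-neighbour lookup of an RGB triplet in the 3D LUT."""
    r, g, b = (round(c / 255 * (size - 1)) for c in rgb)
    return lut[r][g][b]

def apply_1d_luts(rgb, r_table, g_table, b_table):
    """Video-card style correction: each channel looked up independently."""
    r, g, b = rgb
    return (r_table[r], g_table[g], b_table[b])

identity_1d = list(range(256))
lut3d = identity_3d_lut()

print(apply_1d_luts((200, 120, 40), identity_1d, identity_1d, identity_1d))
print(apply_3d_lut((200, 120, 40), lut3d))  # quantised to the nearest grid point
```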
 

I stand corrected............... :thinking:

I have found some references on other fora where the same question has cropped up more than once. It does make me wonder why Nvidia don't make clear statements about this on a per-card basis. But the positive responses are all(?) based on empirical "it worked for me" replies, so perhaps there might be model-to-model variations?

PS The only other thing I recall about multi-ported GPUs is that not all the ports are 'created equal' in regard to maximum supported resolution, e.g. you need to check as appropriate if using, say, the DVI and DP, or perhaps the DP and the HDMI.

PPS I am sorry if I inadvertently muddied the waters :coat:
 