
Glen Pitt-Pladdy :: Blog

Colour Managed Dual monitors with nVidia, Hardy, Argyll

Getting full colour management working with dual displays can be a bit tricky. Certainly, it isn't what will happen by default, and it may require a bit of work to get it all working.

That said, it is still easier than my experience of colour management under Vista, which insists on resetting the LUTs at every opportunity and doesn't have the hooks to easily script a work-around.

One important thing here is that the monitor is connected by a digital DVI interface, otherwise further errors may be introduced by going via analogue and back to digital inside an LCD monitor.

Although this is aimed at Ubuntu Hardy, it will probably work for almost all current distributions out there which suffer from similar problems.


As I spend much of my time needing accurate colour, simply leaving things to chance isn't acceptable. Monitors vary massively. Even the same model of monitor with the same settings can be a completely different animal.

Colour management allows all colour input and output devices to be matched as accurately as practical. This means that when I take a photo, the colours on the screen will be as accurate as practical to the original, and when I print it, the print will again be as accurate as practical to what I was expecting.

There are a few caveats with colour management:

  • Not all devices can accurately capture or display all colours. When this happens the colour engine will simply substitute its best effort (based on rendering intent) for the colours that it can't produce / match.
  • Not all humans see colour the same way. The models for colour are based on a standardised model of how humans perceive colour. Variations between people's eyes may mean that one will perceive an accurate colour match, where another will perceive a colour mismatch.
  • Viewing conditions can produce large variations in perceived colours. This can happen in a number of ways. When matching prints with the monitor, monitors are generally set up for a colour temperature of 6500K, whereas standardised viewing conditions for prints are D50 (5000K). Also, if light sources have an uneven spectrum (the norm with fluorescent lighting, which is now common everywhere) then prints may look very different from how they would under standardised viewing conditions (eg. see Observer metameric failure).

How it works (for monitors)

The basics of how colour profiles work: source colours are converted to a scientific representation (the Profile Connection Space), and from that they are converted to the destination colours. This means that two profiles are used in the typical scenario.

Colour profiles generally have two parts:

Device Linearisation

This basically takes the native device colour and tries to remove any imbalances and misbehaviour. This is rather like adjusting Brightness, Contrast and colour components on the monitor controls, but far more accurate and versatile.

With monitors, we normally have the first set of correction curves loaded into the LUTs (Look Up Tables) on the graphics card. These are sometimes referred to as "Gamma Tables", which is rather a simplistic view of what they can do. They are normally 8-bit (256 levels) and as such are very vulnerable to rounding errors, which leads to visible banding.

Additionally, within most LCD monitors are additional sets of LUTs for applying Contrast (effectively Gamma on LCDs) and colour balancing.

These can be extremely destructive to colour accuracy and result in visible bands or steps in the colour:

Gradient showing rounding errors due to 8-bit corrections

If your monitor is not set up to show colours accurately then the banding will be less obvious. The rounding errors are most visible as vertical lines where the colours jump.

Each time more LUTs are used, more errors are introduced. The best approach is generally to have only one LUT, and the graphics card LUT is the best choice as it will be accurately set by instrument when we calibrate and profile the monitor.
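To illustrate why 8-bit corrections bite, here's a quick calculation (my own illustration; the gamma tweak of 1.2 is an arbitrary example, not a measured correction): even a mild curve stored in an 8-bit LUT collapses some adjacent input levels to the same output, and those lost levels are exactly the bands you see in a gradient.

```shell
# Count how many of the 256 input levels survive as distinct outputs after
# a mild gamma correction (exponent 1/1.2) is rounded into an 8-bit LUT.
awk 'BEGIN {
    for (i = 0; i < 256; i++) {
        out = int(255 * (i / 255) ^ (1 / 1.2) + 0.5)  # rounded 8-bit LUT entry
        seen[out] = 1
    }
    n = 0
    for (o in seen) n++
    print n " distinct output levels out of 256"
}'
```

Every level lost here is a visible step in a smooth gradient, and every extra LUT in the chain loses a few more.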

Few monitors are direct (no LUTs; laptops and Apple monitors are often a good bet here). Failing that, the next best thing is to find the settings where the monitor LUTs do not change the input, or at least have minimum impact. The annoying thing is that it would be a very simple task to add a mode to monitors for bypassing the internal LUTs for those who need accurate colour, but none I have seen has this. Finding those settings is likely to be a case of trial and error:

  • Turn off all colour management and correction (use Argyll utility "dispwin -c" to clear the graphics card LUT)
  • Display a suitable greyscale gradient and start experimenting
  • Generally setting all the colour channels at Max in the monitor set up is a good place to start.
  • Adjust Contrast so that the bands on the gradient space out as much as possible, and preferably disappear.
  • Adjust Green colour channel so that bands on the gradient space out as much as possible, and preferably disappear. You may need to repeat adjusting the Contrast and Green channel a few times. Green is best to adjust as the human eye is most sensitive to Green light.
  • Adjust Red and Blue colour channels so that bands on the gradient space out as much as possible, and preferably disappear.

The Brightness control on LCD displays generally only adjusts the backlight, so simply setting it to a comfortable working level is all that is needed. If you work to a specific brightness (eg. 90cd/m2 is commonly used as a target monitor brightness for graphics work) then set it up with an instrument to that brightness.

Colour conversion

This converts between the scientific colour representation and the corrected (linearised) device colour, and only happens in colour managed applications that perform the conversion.

For accurate results, this is normally done with another LUT, but this time a 3-dimensional one, so that arbitrary colours can be mapped rather than just per-channel curves.

Dual Monitors (Xinerama)

Most dual monitor X configurations use TwinView, which has some benefits and works around all sorts of quirks in different graphics drivers. However, it introduces a problem for accurate monitor calibration: it uses one LUT for both monitors, so we can't have custom corrections for each monitor (BAD!).

I use Xinerama which does allow each monitor to be individually corrected, and apart from breaking "Visual Effects" (Compiz Fusion) seems to cause no problems.

There are instructions on how to set up Xinerama in the Ubuntu Forums. Also, you can see the Xinerama xorg.conf that I use.

Once you are running with a good Xinerama config (you should see nVidia logos on both monitors when X starts) then you should be ready to start calibration.


Calibration

Calibration produces the linearisation curves that are loaded into the graphics card LUTs, and all applications benefit from these irrespective of whether they are colour managed. This provides the basic level of matching between both monitors.

$ dispcal -v -d 1 -t 6500 -p 0.5,0.5,1.5 -y l monitor


$ dispcal -v -d 2 -t 6500 -p 0.5,0.5,1.5 -y l monitor_b

These will run a calibration cycle on the first monitor, and then the second monitor. The colour window should appear on the respective monitor and you will be given instructions on what to do. In my case I go for "7) Continue on to calibration" and don't generally bother with the rest: as discussed above, I want the calibration to do the job with no rounding errors from LUTs in the monitor.

The "-p 0.5,0.5,1.5" provides a larger colour window since the default one is a bit small on my monitors.

This should leave us with "monitor.cal" and "monitor_b.cal" files, but you can name them as you want. I start off with "hplp2065[ab]-<date>" so each time I know when I did the profile, and I keep the last few profiles.
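A date-stamped name in that style can be generated inline and passed to dispcal as the final argument (YYYYMMDD here is just an example format):

```shell
# Build a date-stamped calibration name in the "hplp2065a-<date>" style;
# pass it to dispcal in place of plain "monitor".
NAME="hplp2065a-$(date +%Y%m%d)"
echo "$NAME"
```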


Profiling

Next we need to generate the colour patches for profiling. Argyll has a neat feature: it can optimise the colour patches for the device if you have a previous profile for it:

$ targen -v -d 3 -f 600 -A 0.8 -c old_monitor.icc monitor

$ targen -v -d 3 -f 600 -A 0.8 -c old_monitor_b.icc monitor_b

If you don't have a previous profile for the monitors then leave out the "-c old_monitor.icc" option.

That should have created files named "monitor.ti1" and "monitor_b.ti1" containing the colour patches for each monitor. Now we just need to read them in from the display:

$ dispread -v -d 1 -y l -p 0.5,0.5,1.5 -k monitor.cal monitor

$ dispread -v -d 2 -y l -p 0.5,0.5,1.5 -k monitor_b.cal monitor_b

These will run a display reading on the first monitor, and then the second monitor. The colour window should appear on the respective monitor and you will be given instructions on what to do (placing the instrument on the monitor).

That should leave files named "monitor.ti3" and "monitor_b.ti3" containing the values read in.

Now we just need to generate the profiles:

$ colprof -v -A"MANUFACTURER" -M"MODEL" -D"DESCRIPTION/DATE" -qm -al -S PATHTO/sRGB.icc -S PATHTO/AdobeRGB1998.icc monitor

$ colprof -v -A"MANUFACTURER" -M"MODEL" -D"DESCRIPTION/DATE" -qm -al -S PATHTO/sRGB.icc -S PATHTO/AdobeRGB1998.icc monitor_b

The "-S PATHTO/PROFILE.icc" options generate gamut mapping for that profile, so it is worth putting in any working space profiles you regularly use.

That should leave files "monitor.icc" and "monitor_b.icc", the profiles for each monitor.

Storing the profiles

There doesn't yet seem to be complete consensus on where profiles should be kept in Linux. The current favourites seem to be "/usr/share/color/" for system wide profiles, and "~/.color/icc/" for user profiles. Personally, I use so much colour management that a hidden directory is a bit of a pain, so I store my profiles under "~/icc_profiles/" and symlink all the other user areas to that.

I also rsync this directory to other machines (eg. my laptop) to take with me, which creates a particular problem: the monitor profiles need to be different for each host, but the config scripts I run should ideally be universal (rather than specific to each host).

I have solved this by naming the profiles according to the monitor / host they are for, and then on each host having a "~/icc_profile_links/" directory which links to the profile(s) for that host within "~/icc_profiles/". In turn, "~/icc_profiles/monitor.icc" links to "~/icc_profile_links/monitor.icc", which means that it will always be the correct profile for the first monitor on each host.

Likewise I have done the same with "monitor_b.icc" for the second monitor (where available).
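The link chain can be sketched as follows (the dated profile name for this host is just a placeholder):

```shell
# Host-specific profile selection: scripts always use ~/icc_profiles/monitor.icc,
# which resolves through a per-host links directory to the right profile.
mkdir -p "$HOME/icc_profiles" "$HOME/icc_profile_links"

# The real profile for this host (placeholder dated name)
touch "$HOME/icc_profiles/hplp2065a-20081005.icc"

# Per-host selection: this link is set differently on each host
ln -sf "$HOME/icc_profiles/hplp2065a-20081005.icc" \
       "$HOME/icc_profile_links/monitor.icc"

# Universal name used everywhere, identical on every host
ln -sf "$HOME/icc_profile_links/monitor.icc" \
       "$HOME/icc_profiles/monitor.icc"
```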

I have created two small scripts to make it easy to load the profiles:

You will need to set these executable (chmod +x monitor-*) and I put them in my "~/icc_profiles/" directory.

These simply load and clear the monitor calibration / profile. If a profile for the second monitor exists, then it does the same with that too.
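A minimal sketch of what these two scripts could look like, assuming Argyll's dispwin and the profile locations above (the script bodies here are my reconstruction from the descriptions, not the originals):

```shell
# Write sketch versions of the two helper scripts into ~/icc_profiles
mkdir -p "$HOME/icc_profiles"

# monitor-setcal: load each profile's calibration into the graphics card LUTs
cat > "$HOME/icc_profiles/monitor-setcal" <<'EOF'
#!/bin/sh
DIR="$(dirname "$0")"
dispwin -d 1 "$DIR/monitor.icc"
if [ -f "$DIR/monitor_b.icc" ]; then
    dispwin -d 2 "$DIR/monitor_b.icc"
fi
EOF

# monitor-clearcal: reset the LUTs back to linear
cat > "$HOME/icc_profiles/monitor-clearcal" <<'EOF'
#!/bin/sh
DIR="$(dirname "$0")"
dispwin -d 1 -c
if [ -f "$DIR/monitor_b.icc" ]; then
    dispwin -d 2 -c
fi
EOF

chmod +x "$HOME/icc_profiles/monitor-setcal" "$HOME/icc_profiles/monitor-clearcal"
```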

I have created launchers on the panel to run each of these to switch calibration / profiling on and off.

Note that if you decide to store your profiles somewhere else then you will need to modify these scripts accordingly.

Avoiding LUT Resets (loading LUTs at login)

An annoyance with colour management on Hardy is that the Gnome Screensaver resets the graphics card LUTs to their state when the screensaver started, which is at login. This means that as soon as the screensaver kicks in, any calibration loaded since is lost.

My simple workaround to this is to ensure that gdm loads the profiles before Gnome loads. For this I have created the Xsession file (script) "/etc/X11/Xsession.d/10-local-icc_profiles" which gdm will run first when a user logs in.

This file / script looks for "/home/$USER/icc_profiles/monitor-setcal" (modify it if you put your profiles elsewhere) and runs it to load the calibration and profiles before doing anything else.
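The Xsession script itself can be as simple as this sketch (my reconstruction from the description above):

```shell
# Sketch of /etc/X11/Xsession.d/10-local-icc_profiles: load the user's
# calibration before the rest of the session starts, if the script exists.
SETCAL="/home/$USER/icc_profiles/monitor-setcal"
if [ -x "$SETCAL" ]; then
    "$SETCAL"
fi
```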

That way the calibration is already in place when Gnome Screensaver loads, and so it remains. Be aware that if you change your profiles during a session, Gnome Screensaver will keep resetting the calibration to the one loaded at login.


Before and After Dual Monitor Calibration and Profiling

This was done using an X-Rite DTP94 Colourimeter. I suspect I could have got even better results using my GretagMacbeth Spectrolino Spectrophotometer, but it's rather a lot more hassle to set up.

The one question I have here about these monitors is how HP managed to achieve such bad colour matching. The left one was the first one I bought, and the right one was added later.

The left one seems to reach unity on all the internal LUTs with a Contrast of 88 and all the colour channels on 100. The right one (bought later), with the Green channel on 100, appears not to reach unity and still shows some Green banding on the gradient test (I suspect unity on Green is around 110, but you can't adjust it that far!). It appears to me that HP may have deliberately reduced Green, giving later production a magenta cast, by changing the LUTs in such a way that it is not possible to achieve unity (and avoid rounding errors). So much for accurate colour with "Professional" monitors.....


Annoyingly, using Xinerama has badly broken my graphics tablet. I have tried many different config options, plus tweaks with xsetwacom, and can only achieve two outcomes: either the mouse cursor jumps from the middle of the left screen to the middle of the right screen, or it stays confined to one screen.

On the other hand, Gimp's brush moves smoothly across both screens no matter what the configuration. That means the cursor and the brush aren't in sync, which makes it extremely difficult to use.

Information on this seems very difficult to find: some say it just works, while others are stuck with the same problem I have. I am beginning to suspect a bug in the tablet driver and/or xorg in Hardy, but can't be sure at this stage. At some point I will have another go in case it is my config.

Anyone got any suggestions?