A reliable colour appearance model is desired by industry to achieve high colour
fidelity between images produced using a range of different imaging devices. The aim
of this study was to derive a reliable colour appearance model capable of predicting
the change of perceived attributes of colour appearance under a wide range of
media/viewing conditions. The research was divided into three parts: characterising
imaging devices, conducting a psychophysical experiment, and developing a colour
appearance model.
Various imaging devices were characterised, including a graphic art scanner, a
Cromalin proofing system, an IRIS ink jet printer, and a Barco Calibrator. For the
former three devices, each colour is described by four primaries: cyan (C), magenta
(M), yellow (Y), and black (K). Three sets of characterisation samples (120 and 31
black printer, and cube data sets) were produced and measured for deriving and
testing the printing characterisation models. Four black printer algorithms (BPAs)
were derived, each including both forward and reverse processes. The second BPA
printing model, which takes into account additivity failure and grey component
replacement (GCR), gave more accurate predictions of the characterisation data set
than the other BPA models. The PLCC (Piecewise Linear interpolation assuming Constant
Chromaticity coordinates) monitor model was also implemented to characterise the
Barco monitor.
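As a rough illustration of the PLCC idea only, the Python sketch below interpolates each channel's measured luminance ramp piecewise-linearly and converts the result to tristimulus values assuming a constant chromaticity per channel; the ramp values and chromaticities shown are hypothetical placeholders, not the Barco measurements used in this work.

    import numpy as np

    # Hypothetical single-channel ramp measurements (digital counts vs. luminance)
    # and assumed-constant chromaticity coordinates (x, y) for each channel.
    ramp_dac = np.array([0, 64, 128, 192, 255])
    ramp_Y = {
        "R": np.array([0.0, 1.2, 5.8, 13.9, 24.5]),
        "G": np.array([0.0, 3.5, 16.4, 40.2, 71.8]),
        "B": np.array([0.0, 0.5, 2.3, 5.6, 10.1]),
    }
    chromaticity = {"R": (0.625, 0.340), "G": (0.280, 0.605), "B": (0.150, 0.065)}

    def plcc_rgb_to_xyz(dac_r, dac_g, dac_b):
        """Piecewise linear interpolation of each channel's luminance, then
        conversion to XYZ using the constant chromaticity of that channel;
        the three channel contributions are summed (additivity assumed)."""
        xyz = np.zeros(3)
        for name, dac in zip("RGB", (dac_r, dac_g, dac_b)):
            Y = np.interp(dac, ramp_dac, ramp_Y[name])      # piecewise linear LUT
            x, y = chromaticity[name]
            xyz += (Y / y) * np.array([x, y, 1.0 - x - y])  # constant (x, y) -> XYZ
        return xyz

    print(plcc_rgb_to_xyz(128, 200, 60))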
The psychophysical experiment was conducted to compare Cromalin hardcopy
images viewed in a viewing cabinet and softcopy images presented on a monitor
under a wide range of illuminants (white points), including D93, D65, D50 and A.
Two scaling methods, category judgement and paired comparison, were employed, with
observers viewing pairs of images. Three classes of colour models were evaluated: uniform
colour spaces, colour appearance models and chromatic adaptation transforms. Six
images were selected and processed via each colour model. The results indicated that
the BFD chromatic transform gave the most accurate predictions of the visual results.
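For orientation, the Python sketch below shows a simplified, linear form of a BFD-style chromatic adaptation transform; the full BFD transform also applies a non-linear correction to the short-wavelength channel, which is omitted here, and the matrix coefficients are the commonly published BFD (Bradford) values rather than anything fitted in this experiment.

    import numpy as np

    # Commonly published BFD (Bradford) matrix mapping XYZ to sharpened RGB responses.
    M_BFD = np.array([[ 0.8951,  0.2664, -0.1614],
                      [-0.7502,  1.7135,  0.0367],
                      [ 0.0389, -0.0685,  1.0296]])

    def bfd_adapt_linear(xyz, xyz_white_src, xyz_white_dst):
        """Map tristimulus values seen under the source white to corresponding
        values under the destination white (linear von Kries scaling in BFD space)."""
        rgb_ws = M_BFD @ np.asarray(xyz_white_src, dtype=float)
        rgb_wd = M_BFD @ np.asarray(xyz_white_dst, dtype=float)
        cat = np.linalg.inv(M_BFD) @ np.diag(rgb_wd / rgb_ws) @ M_BFD
        return cat @ np.asarray(xyz, dtype=float)

    # Example: a colour measured under illuminant A mapped to its D65 correspondent.
    white_A   = [109.85, 100.0, 35.58]
    white_D65 = [95.05, 100.0, 108.88]
    print(bfd_adapt_linear([45.0, 40.0, 10.0], white_A, white_D65))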
Finally, a colour appearance model, LLAB, was developed. It combines the BFD
chromatic transform with a modified version of the CIELAB uniform colour space,
fitted to the previously accumulated LUTCRI Colour Appearance Data. The LLAB
model is much simpler in form than the other models, and it fits the experimental
data more precisely.
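Purely as a structural sketch, and not the published LLAB equations (which use surround-dependent exponents and modified lightness, chroma and hue functions), the Python fragment below shows the two-stage shape of such a model: a simplified BFD-style adaptation of the sample and its white to an assumed D65 reference white, followed by CIELAB-like opponent coordinates computed against the adapted white.

    import numpy as np

    # BFD (Bradford) matrix, as in the previous sketch.
    M_BFD = np.array([[ 0.8951,  0.2664, -0.1614],
                      [-0.7502,  1.7135,  0.0367],
                      [ 0.0389, -0.0685,  1.0296]])

    def bfd_adapt_linear(xyz, xyz_white_src, xyz_white_dst):
        """Simplified (linear) BFD-style adaptation, as in the previous sketch."""
        scale = (M_BFD @ np.asarray(xyz_white_dst, float)) / (M_BFD @ np.asarray(xyz_white_src, float))
        return np.linalg.inv(M_BFD) @ (scale * (M_BFD @ np.asarray(xyz, float)))

    def cielab_f(t):
        """Standard CIELAB compression function."""
        t = np.asarray(t, float)
        return np.where(t > (6 / 29) ** 3, np.cbrt(t), t * (29 / 6) ** 2 / 3 + 4 / 29)

    def llab_like_sketch(xyz, xyz_white, xyz_ref_white=(95.05, 100.0, 108.88)):
        """Two-stage structure only: adapt sample and white to a reference white
        (D65 assumed here), then form CIELAB-like opponent coordinates."""
        xyz_a  = bfd_adapt_linear(xyz, xyz_white, xyz_ref_white)        # adapted sample
        xyz_aw = bfd_adapt_linear(xyz_white, xyz_white, xyz_ref_white)  # adapted white
        fx, fy, fz = cielab_f(np.asarray(xyz_a, float) / np.asarray(xyz_aw, float))
        return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

    # Example: a colour and its adopted white under illuminant A.
    print(llab_like_sketch([45.0, 40.0, 10.0], [109.85, 100.0, 35.58]))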