Color Mixing Property of a Projector-Camera System
Xinli Chen∗
Xubo Yang†
Shuangjiu Xiao‡
Meng Li§
School of Software
Shanghai Jiao Tong University, China
Abstract
In this paper, we investigate how the color channels of a projector-camera system interact with each other, which is also called the color mixing of the system. We propose a method to describe this property with a single color mixing matrix for the system, rather than a different matrix for every surface point. The matrix is independent of the projection surface and ambient light, and it can be measured just like the response functions of the projector-camera system. The matrix is helpful for color-sensitive applications such as radiometric compensation and scene reconstruction: by decoupling the color channels with the system matrix, color images can be processed on each channel separately, just like gray images. As most projector-camera systems have broadly overlapping color channels, it is improper to neglect their interactions. We show the validity of describing the color mixing of the system with a single matrix, and we design experiments that verify our method with convincing results.
CR Categories: H.5.1 [Information Interfaces and Presentation]: Multimedia Information Systems—Artificial, Augmented, and Virtual Realities; I.3.3 [Computer Graphics]: Picture/Image Generation—Display Algorithms

Keywords: Color Mixing, Interaction of Color Channels, System Calibration, Radiometric Compensation

1 Introduction

Projector-camera systems have been used in a wide variety of applications, such as radiometric compensation [Nayar et al. 2003; Grossberg et al. 2004; Fujii et al. 2005; Bimber et al. 2005], tiled displays [Majumder and Stevens 2002; Majumder 2003; Raij et al. 2003], shadow elimination [Sukthankar et al. 2001; Cham et al. 2003], and so on. With their extensive usage, much work has been done to study the properties of such systems. The space of camera response functions was discussed and modeled by [Grossberg and Nayar 2003; Grossberg and Nayar 2004]. High-dynamic-range imaging methods [Debevec and Malik 1997; Mitsunaga and Nayar 1999] were designed to measure the camera response function from a series of images captured with different but known exposures. For the projector, Majumder studied its properties in [Majumder 2002]. The projector response function can be measured directly using a photometer [Yang et al. 2001; Majumder and Stevens 2002]; however, a photometer is prohibitively expensive as well as hard to use. A method adapting high-dynamic-range imaging was therefore proposed by [Raij et al. 2003] to measure the response function using a black-and-white camera. These methods measure the response functions of the projector and camera separately. [Song and Cham 2005] presented a theory for self-calibration of overlapping projectors and cameras; it requires no expensive equipment and can be carried out with a low-dynamic-range camera. Another calibration method for a projector-camera system was proposed by [Juang and Majumder 2007], which measures the response functions as well as the vignetting effect.
∗ e-mail:[email protected]
† e-mail:[email protected]
‡ e-mail:[email protected]
§ e-mail:[email protected]
Copyright © 2008 by the Association for Computing Machinery, Inc.
Permission to make digital or hard copies of part or all of this work for personal or
classroom use is granted without fee provided that copies are not made or distributed
for commercial advantage and that copies bear this notice and the full citation on the
first page. Copyrights for components of this work owned by others than ACM must be
honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on
servers, or to redistribute to lists, requires prior specific permission and/or a fee.
Request permissions from Permissions Dept, ACM Inc., fax +1 (212) 869-0481 or e-mail
[email protected]
PROCAMS 2008, Marina del Rey, California, August 10, 2008.
© 2008 ACM 978-1-60558-272-6/08/0008 $5.00
Figure 1: Interaction of color channels. Green images with intensity from 0 to 255 (red and blue always 0) are projected; the red and blue channels also increase as a result of color mixing.
The above efforts have made using a projector-camera system more convenient, but they deal only with gray images, or treat a color image simply as a gray image per channel. This assumption does not hold in most cases, as most projector-camera systems have widely overlapping color channels and the interaction between them is significant, as shown in Figure 1. To our knowledge, the color mixing of a projector-camera system was first addressed by Nayar et al. [Nayar et al. 2003], where a matrix captures the interactions of color channels together with surface reflectance. As a result, their color mixing matrix varies from point to point; it is surface dependent and needs to be re-evaluated for every projection surface. [Grossberg et al. 2004] used the color mixing matrix in a similar way. [Fujii et al. 2005] decoupled color mixing from reflectance in their photometric model; however, they did not measure it separately, and different mixing matrices were still used for different points in their method.
In this paper, we propose a method to determine a single color mixing matrix for a projector-camera system, rather than a different matrix for every point. The matrix can be regarded as a property of the projector-camera system: it is independent of the projection surface and ambient light, and it does not change dynamically unless system settings (white balance, contrast, saturation) change. It is most useful for color-sensitive applications. Take radiometric compensation for example: with the pre-determined color mixing matrix, only the surface reflectance and ambient light need to be measured online, and compensating color images becomes as simple as compensating gray images.
2 Photometric Model
As our focus is on the color mixing of the system, we assume linear response functions for both projector and camera¹, each with three channels (R, G, B). We use a photometric model similar to [Nayar et al. 2003; Grossberg et al. 2004; Fujii et al. 2005], with slight modification.
The symbols and functions used in this model are listed below; for clarity, we keep them consistent with the original works:

• P_K: projector input brightness for channel K.
• C_L: camera-captured brightness for channel L.
• λ: wavelength over the visible spectrum.
• s(λ): spectral reflectance function of a surface point in the viewer's direction. We assume the reflectance is constant within each color channel; a similar assumption was used by [Nayar et al. 2003; Song and Cham 2005; Fujii et al. 2005].
• q_L(λ): spectral quantum efficiency function for channel L of the camera.
• e_K(λ, P_K): spectral distribution function for channel K of the projector with input intensity P_K.

We start with the R channel of the projector as an example. The irradiance at a surface point is the sum of the projector illumination and the ambient light, the latter denoted a(λ). With projector input intensity P_R, the radiance of the surface point in the viewer's direction is:

l_R(λ) = s(λ)(e_R(λ, P_R) + a(λ)).   (1)

The radiance of this point is measured with an RGB camera whose spectral quantum efficiencies for the three channels are q_R(λ), q_G(λ), and q_B(λ). As we assume the reflectance for each channel is constant, denoted (A_R, A_G, A_B), the irradiance measured by the camera for the RGB channels is:

C_R = A_R ∫ q_R(λ)(e_R(λ, P_R) + a(λ)) dλ
C_G = A_G ∫ q_G(λ)(e_R(λ, P_R) + a(λ)) dλ
C_B = A_B ∫ q_B(λ)(e_R(λ, P_R) + a(λ)) dλ.   (2)

We assume that the spectral distribution function of the projector is stable for the R channel, meaning:

e_R(λ, P_R) = P_R w_R(λ),   (3)

where w_R(λ) is the spectral response function for the R channel of the projector. We verify this property in Section 3.1. With this assumption, Equation 2 can be written as:

C_R = A_R ∫ q_R(λ)(P_R w_R(λ) + a(λ)) dλ
C_G = A_G ∫ q_G(λ)(P_R w_R(λ) + a(λ)) dλ
C_B = A_B ∫ q_B(λ)(P_R w_R(λ) + a(λ)) dλ.   (4)

The same derivation applies to the G and B channels of the projector. Finally, the photometric model of a projector-camera system at each pixel can be written as:

C = A(VP + F),   (5)

where

C = (C_R, C_G, C_B)^T,  P = (P_R, P_G, P_B)^T,  F = (F_R, F_G, F_B)^T,  A = diag(A_R, A_G, A_B),

V = | V_RR  V_GR  V_BR |
    | V_RG  V_GG  V_BG |
    | V_RB  V_GB  V_BB |,

with

V_KL = ∫ q_L(λ) w_K(λ) dλ,  F_L = ∫ q_L(λ) a(λ) dλ.

The matrix A is the reflectance of the surface, and the vector F captures the contribution of ambient light; for convenience, the black offset of the projector is also absorbed into F. The matrix V is called the color mixing matrix, which describes the interaction of the color channels of a projector-camera system. We show how to determine it in the next section.

¹ We measure the response functions of both projector and camera for each channel separately, using the method proposed by [Song and Cham 2005]. Assuming identical response functions for different channels is not appropriate; in particular, the response functions of some projectors vary considerably across channels.
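As a concrete illustration, the per-pixel model of Equation 5 is a small linear-algebra evaluation. The sketch below uses invented values for V, the reflectances, and the ambient term; they stand in for measured data:

```python
import numpy as np

# Sketch of the per-pixel photometric model C = A(VP + F) from Equation 5.
# All numbers below are made-up illustrative values, not measured data.

V = np.array([[1.00, 0.08, 0.03],   # V_KL: column K = projector channel,
              [0.10, 1.00, 0.12],   # row L = camera channel; small
              [0.02, 0.15, 1.00]])  # off-diagonals model channel crosstalk
A = np.diag([0.9, 0.8, 0.7])        # per-channel surface reflectance
F = np.array([2.0, 3.0, 1.5])       # ambient light + projector black offset

def camera_output(P):
    """Predict the linearized camera output for projector input P = (R, G, B)."""
    return A @ (V @ P + F)

C = camera_output(np.array([100.0, 0.0, 0.0]))  # pure red projector input
```

Note how a pure red input also raises the green and blue camera outputs through the off-diagonal entries of V, which is exactly the crosstalk effect shown in Figure 1.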
3 Color Mixing of a Projector-Camera System

In this section, we first investigate the stability of the spectral distribution function of the projector; this property is fundamental for the model shown in Equation 5. After validating the model, we propose a method to determine a single color mixing matrix for the system. With the determined matrix, the reflectance map of a surface can be recovered with two images.
3.1 Stability of Spectral Distribution Function

We first describe the method used to verify the stability of the green channel; the verifications for red and blue are similar. Two uniform images with different colors are projected:

P^(0) = (0, 0, 0)^T,  P^(1) = (0, T, 0)^T.

Applying Equation 2, we have:

ΔC_R = C_R^(1) − C_R^(0) = A_R ∫ q_R(λ) e_G(λ, T) dλ
ΔC_G = C_G^(1) − C_G^(0) = A_G ∫ q_G(λ) e_G(λ, T) dλ
ΔC_B = C_B^(1) − C_B^(0) = A_B ∫ q_B(λ) e_G(λ, T) dλ.   (6)

If the spectral distribution function is stable as in Equation 3, we obtain the following ratios:

ΔC_R / ΔC_G = (A_R ∫ q_R(λ) w_G(λ) dλ) / (A_G ∫ q_G(λ) w_G(λ) dλ)
ΔC_B / ΔC_G = (A_B ∫ q_B(λ) w_G(λ) dλ) / (A_G ∫ q_G(λ) w_G(λ) dλ).   (7)
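The cancellation of T in these ratios can be checked numerically. The sketch below uses invented Gaussian spectra and reflectances; only the structure of Equations 6 and 7 is taken from the paper:

```python
import numpy as np

# Numerical sketch of the stability test in Equations 6-7, on synthetic data.
# The spectra and reflectances below are invented for illustration only.

lam = np.linspace(400.0, 700.0, 301)                # wavelength samples (nm)
w_G = np.exp(-(((lam - 540.0) / 40.0) ** 2))        # stable green spectrum w_G(λ)
q = {'R': np.exp(-(((lam - 610.0) / 50.0) ** 2)),   # camera quantum efficiencies
     'G': np.exp(-(((lam - 540.0) / 50.0) ** 2)),
     'B': np.exp(-(((lam - 460.0) / 50.0) ** 2))}
A = {'R': 0.9, 'G': 0.8, 'B': 0.7}

def delta_C(L, T):
    """ΔC_L for green input T, assuming e_G(λ, T) = T·w_G(λ) (Equation 3)."""
    return A[L] * T * float(np.dot(q[L], w_G))      # discrete ∫ q_L(λ) w_G(λ) dλ

# The red/green ratio of Equation 7 is the same for every input intensity T:
ratios = [delta_C('R', T) / delta_C('G', T) for T in (50.0, 128.0, 255.0)]
```

The input intensity T scales numerator and denominator equally, so the ratio stays constant; the same holds for the blue/green ratio.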
As the above equations show, the ratios are independent of the projector input intensities if the spectral distribution function for the green channel of the projector is stable. To verify this property, a series of 256 images with green level increasing from 0 to 255 (red and blue always 0) is projected and captured. A reference point is chosen randomly. After excluding ambient light, the captured data for this reference point is shown in Figure 2(a). We compute the ratios at this reference point for every projector input intensity; the result is shown in Figure 2(b). The ratios remain invariant across projector intensities², which supports the assumption of a stable spectral distribution function for the green channel.

Figure 2: Stability of the spectral distribution function for the green channel. (a) Plot of captured data from a reference point with the green channel of the projector increasing from 0 to 255; (b) plot of the computed ratios, blue line for red/green and green line for blue/green.

The verification for the red and blue channels of the projector is similar; the result is shown in Figure 3. With these results, we can say that the spectral distribution functions of this projector are stable for all channels. We have verified this stability for both LCD and DLP projectors, and the property is always satisfied.

Figure 3: The left shows the stable distribution function for the red channel, and the right for the blue channel.

3.2 Evaluation of Color Mixing Matrix

From Equation 5, as the reflectance of a surface point cannot be determined in advance, there is no way to measure the exact color mixing matrix of the system. Instead, we choose a reference point, measure the combined result AV for this point, and take the combined matrix as the color mixing matrix of the system. This approximation is reasonable because the unknown scales on each row of V, caused by the reference point's reflectance A, are absorbed by the relative reflectance of the other points. Suppose a reference point P_0 is chosen with combined result A_0 V. For the other points of the surface, we have:

A_i V = A_i′ (A_0 V),  i = 1, 2, . . .   (8)

Then A_0 V is used as the color mixing matrix of the system, and the surface reflectance is described by A_i′, the reflectance relative to the reference point.

To measure the combined result AV, four images are projected and captured: first a black image, then a red, a green, and a blue one in sequence. Take the red channel for example. According to Equation 5, the captured data for the black and red images are:

C^(0) = A(V (0, 0, 0)^T + F) = AF,  C^(1) = A(V (P_R, 0, 0)^T + F).   (9)

Then the first column of AV can be computed as:

A_R V_RR = ΔC_R / P_R,  A_G V_RG = ΔC_G / P_R,  A_B V_RB = ΔC_B / P_R.   (10)

Similarly, the other two columns of AV are computed with the green and blue images. This method is also used by [Nayar et al. 2003; Grossberg et al. 2004; Fujii et al. 2005].

² The noise is mainly caused by temporally varying projector and camera noise. The plot starts from a projector input of 50: as the camera we use has low dynamic range, captured data for low irradiance is not accurate.
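The four-image measurement of Equations 9 and 10 can be sketched per pixel as follows. The matrices and ambient term below are synthetic stand-ins for a real capture:

```python
import numpy as np

# Sketch of Section 3.2: recover the combined matrix AV at one pixel from four
# captured images (black, red, green, blue). All values below are synthetic.

A = np.diag([0.9, 0.8, 0.7])                 # unknown surface reflectance
V = np.array([[1.00, 0.08, 0.03],
              [0.10, 1.00, 0.12],
              [0.02, 0.15, 1.00]])           # unknown color mixing matrix
F = np.array([2.0, 3.0, 1.5])                # ambient light + black offset

def capture(P):
    """Simulated camera response, Equation 5."""
    return A @ (V @ np.asarray(P, dtype=float) + F)

P_max = 255.0
C0 = capture([0, 0, 0])                      # black image gives C(0) = AF
AV = np.column_stack([(capture(P) - C0) / P_max   # Equations 9-10:
                      for P in ([P_max, 0, 0],    # one primary image
                                [0, P_max, 0],    # per column of AV
                                [0, 0, P_max])])
```

At a white reference point, the recovered AV (up to the per-row scale contributed by A) serves as the system's color mixing matrix; Equation 8 then yields relative reflectances for all other points.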
3.3 Retrieval of Reflectance Map

With the pre-determined color mixing matrix of a projector-camera system, the reflectance map of a surface can be retrieved with two images: one black (0, 0, 0) and one with intensity (P_R, P_G, P_B). By excluding the ambient light from the captured gray image, we have:

(ΔC_R, ΔC_G, ΔC_B)^T = diag(A_R, A_G, A_B) · V · (P_R, P_G, P_B)^T.   (11)

As V is known, we can measure the reflectance for each channel as:

A_R = ΔC_R / (V_RR P_R + V_GR P_G + V_BR P_B)
A_G = ΔC_G / (V_RG P_R + V_GG P_G + V_BG P_B)
A_B = ΔC_B / (V_RB P_R + V_GB P_G + V_BB P_B).   (12)
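With V known, the retrieval of Equations 11 and 12 reduces to three scalar divisions per pixel. A minimal sketch, again with invented values standing in for measurements:

```python
import numpy as np

# Sketch of Section 3.3: recover per-channel reflectance A from two images,
# given the pre-determined color mixing matrix V. All values are synthetic.

V = np.array([[1.00, 0.08, 0.03],
              [0.10, 1.00, 0.12],
              [0.02, 0.15, 1.00]])
A_true = np.array([0.9, 0.8, 0.7])       # diagonal of A, unknown in practice
F = np.array([2.0, 3.0, 1.5])            # ambient light + black offset
P = np.array([200.0, 200.0, 200.0])      # gray calibration image

C_black = np.diag(A_true) @ F            # captured black image (0, 0, 0)
C_gray = np.diag(A_true) @ (V @ P + F)   # captured gray image (P_R, P_G, P_B)

delta_C = C_gray - C_black               # ambient light cancels (Equation 11)
A_recovered = delta_C / (V @ P)          # Equation 12, channel by channel
```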
4 Experiments

We use a Sony VPL-CX71 projector and a Flea2-14S3C camera. In our experiments, every scene is captured ten times and the average is used, in order to minimize temporally varying projector and camera noise.
4.1 Estimating Color Mixing Matrix

We estimate the color mixing matrix of our system on a colorful surface, shown in Figure 4. The combined result AV is retrieved for every point of the surface using the method described in Section 3.2. We choose a white point in the center and use its combined result as the color mixing matrix of our system³.

Figure 4: Projection surface for evaluation of the color mixing matrix.

Figure 5: Reflectance map for the surface shown in Figure 4. (a) Red channel; (b) green channel; (c) blue channel; (d) combined RGB.

With the color mixing matrix determined, the relative reflectance of every point of this surface is computed according to Equation 8. There are 70×70 points in total. We expect the reflectance matrix to be diagonal for every point. We take the absolute values, sum them for every element of A separately, and obtain a total reflectance matrix for the surface. The total matrix is then normalized by scaling every column by the corresponding diagonal element. The result is:

A_average = | 1.0000  0.0063  0.0012 |
            | 0.0119  1.0000  0.0127 |
            | 0.0046  0.0178  1.0000 |.

The mean error is quite small (in this case, 1.78 percent at most), as we expect. The computed reflectance map of this surface is shown in Figure 5 for the RGB channels separately.
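The diagonality check described above can be sketched on synthetic data, with random near-diagonal matrices standing in for the 70×70 measured per-point reflectance matrices:

```python
import numpy as np

# Sketch of the diagonality check in Section 4.1: sum per-point reflectance
# matrices and normalize each column by its diagonal entry. Synthetic data.

rng = np.random.default_rng(0)
n_points = 70 * 70
# Near-diagonal per-point matrices with small off-diagonal noise:
mats = [np.diag(rng.uniform(0.3, 1.0, 3)) + rng.normal(0.0, 1e-3, (3, 3))
        for _ in range(n_points)]

total = np.sum(np.abs(mats), axis=0)      # element-wise |A|, summed over points
A_average = total / np.diag(total)        # scale column j by its diagonal entry
off_diag_max = np.max(A_average - np.eye(3) * np.diag(A_average))
```

If the per-point matrices are close to diagonal, the normalized off-diagonal entries stay small, which is the pattern observed in the measured A_average above.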
4.2 Prediction of Camera Output for a Single Point

To predict the output for a point, its reflectance is first measured using the method of Section 3.3. Then, for a given projector input, the output for this point can be computed from Equation 5. We compute the camera output of this point for projector intensities from 0 to 255 (gray images). For comparison, these images are also projected and captured with our projector-camera system. We choose a point from the bottom left of the projection surface shown in Figure 4. The comparison between our prediction and the real captured result is shown in Figure 6 for each channel separately. As the result shows, the prediction matches the real result very well: the average difference between prediction and capture is 0.8518 intensity levels for the red channel, 0.4128 for the green channel, and 0.4536 for the blue channel.
4.3 Prediction of Output Image Based on Known Input

In this section, we predict the camera output from a known projector input. The prediction is done on both a complex colored surface and a white wall, as shown in Figure 7. The reflectance maps of the two surfaces are retrieved as before. The reflectance map of the colored surface is similar to the map shown in Figure 5. The reflectance map of the white wall is shown in Figure 8 for the green channel; the red and blue channels are similar. Note that although the input to the projector is a uniform gray image and the wall is white with equal reflectance at every point, the relative reflectance varies from point to point, peaking in the center and attenuating along the radial direction. This is because the reflectance map absorbs the vignetting effect of the projector-camera system.

Figure 6: Prediction of camera output for a single point for the R, G, and B channels, with projector intensity increasing from 0 to 255.

Figure 7: Prediction of camera output on (a) a colored surface and (b) a white wall. The two images are taken with projector intensity (220, 220, 220).

Figure 8: Reflectance map of the green channel for the white wall shown in Figure 7(b).

With projector input intensity (100, 200, 100), the real captured result and our prediction on both surfaces are shown in Figure 9, and the difference maps for each channel are shown in Figure 10. The mean errors of the RGB channels between prediction and real captured result are (0.4133, 0.8466, 0.3838) intensity levels on the colored surface and (0.3228, 0.9495, 0.5008) on the white wall. Given the temporally varying projector and camera noise, these small errors can reasonably be neglected. We have verified the predictions for many other projector inputs; the results are quite alike, so we do not show them repeatedly here.

Figure 9: Prediction for projector input intensity (100, 200, 100). The left shows the real captured data, and the right shows our prediction.

Figure 10: Difference maps between prediction and real captured data, multiplied by 120, for both projection surfaces with input intensity (100, 200, 100). (a) Colored surface, RGB channels; (b) white wall, RGB channels.

³ Generally, a white point is chosen as the reference point, as a white point is commonly considered to have the same reflectance for each channel. However, this choice is not restrictive.

4.4 Radiometric Compensation

In this section, we illustrate the use of the color mixing matrix in radiometric compensation. For radiometric compensation, the desired camera output is given; the projector intensities for each point are then evaluated from the desired output and the surface reflectance. The calibration is the same as before, by projecting two images. With the recovered reflectance and ambient light, the projector input that produces the desired output C can be computed as:

P = V⁻¹(A⁻¹C − F).   (13)

First, we create uniform gray outputs (80, 80, 80) and (160, 160, 160) on a white wall. The result is shown in Figure 11. Note that the compensation image overflows in the corners for output intensity 160 because of the strong vignetting effect. This problem has been addressed by [Ashdown et al. 2006; Grundhofer and Bimber 2008], and we do not cover it here. We also test on a colored surface; the result is shown in Figure 12.

No accurate difference maps are provided for comparison, as radiometric compensation requires an accurate geometric mapping between projector points and camera points. In our experiments, we use a simple projective transformation for the mapping; even on a planar surface, noticeable pixel displacement can be detected as a result of camera distortion.
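The compensation step of Equation 13 is a per-pixel linear solve. A minimal sketch, once more with invented values for V, A, and F:

```python
import numpy as np

# Sketch of Equation 13: solve for the projector input that produces a desired
# camera output, given V, the per-point reflectance A, and the ambient term F.
# All values below are synthetic.

V = np.array([[1.00, 0.08, 0.03],
              [0.10, 1.00, 0.12],
              [0.02, 0.15, 1.00]])
A = np.diag([0.9, 0.8, 0.7])
F = np.array([2.0, 3.0, 1.5])

def compensate(C_desired):
    """Equation 13: P = V^{-1}(A^{-1} C - F)."""
    P = np.linalg.solve(V, np.linalg.solve(A, C_desired) - F)
    # In practice P must lie in the projector's range; values outside [0, 255]
    # indicate overflow, as in the vignetted corners of Figure 11.
    return P

C_target = np.array([80.0, 80.0, 80.0])
P = compensate(C_target)
C_check = A @ (V @ P + F)      # forward model reproduces the target output
```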
5 Conclusion and Future Work

In this paper, we have investigated the color mixing property of a projector-camera system and presented a method to describe this property with a single matrix. As our experiments show, the matrix is independent of surface reflectance, is identical for every surface point, and does not change dynamically unless system settings change. It simplifies the projector-camera system and makes such systems more convenient to use, since color images, once decoupled with this matrix, can be processed channel by channel just like gray images.

A future extension of this work is to systems with multiple projectors and cameras, by measuring the color mixing matrix for every projector-camera pair. A foreseeable use is multi-projector display: as luminance uniformity for such applications has been studied extensively, chrominance can also be considered by integrating the color mixing of the system, and a better visual result can be expected.
Figure 11: Creating a uniform gray image on a white wall. In each row, the left shows projection without compensation, the middle shows the compensation image, and the right shows projection with the compensation image. (a) Desired camera output (80, 80, 80); (b) desired camera output (160, 160, 160).

Figure 12: Compensation of a color image on a colored surface. (a) Projection surface; (b) original image; (c) uncompensated result; (d) compensated result.

Acknowledgements

This work was conducted at the Digital Art Laboratory of Shanghai Jiao Tong University. It was sponsored by the 863 National High Technology R&D Program of China (No. 2006AA01Z307).

References

ASHDOWN, M., OKABE, T., SATO, I., AND SATO, Y. 2006. Robust content-dependent photometric projector compensation. In IEEE International Workshop on Projector-Camera Systems, 6.

BIMBER, O., EMMERLING, A., AND KLEMMER, T. 2005. Embedded entertainment with smart projectors. IEEE Computer 38, 1, 48–55.

CHAM, T.-J., REHG, J. M., SUKTHANKAR, R., AND SUKTHANKAR, G. 2003. Shadow elimination and occluder light suppression for multi-projector displays. In Proc. of CVPR, vol. 2, 513–520.

DEBEVEC, P. E., AND MALIK, J. 1997. Recovering high dynamic range radiance maps from photographs. In Proc. of ACM SIGGRAPH 1997, 369–378.

FUJII, K., GROSSBERG, M. D., AND NAYAR, S. K. 2005. A projector-camera system with real-time photometric adaptation for dynamic environments. In Proc. of CVPR, vol. 1, 814–821.

GROSSBERG, M. D., AND NAYAR, S. K. 2003. What is the space of camera response functions? In Proc. of CVPR, vol. 2, 602–612.

GROSSBERG, M. D., AND NAYAR, S. K. 2004. Modeling the space of camera response functions. IEEE Trans. Pattern Anal. Mach. Intell. 26, 10, 1272–1282.

GROSSBERG, M. D., PERI, H., NAYAR, S. K., AND BELHUMEUR, P. N. 2004. Making one object look like another: Controlling appearance using a projector-camera system. In Proc. of CVPR, vol. 1, 452–459.

GRUNDHOFER, A., AND BIMBER, O. 2008. Real-time adaptive radiometric compensation. IEEE Trans. Vis. Comput. Graph. 14, 1, 97–108.

JUANG, R., AND MAJUMDER, A. 2007. Photometric self-calibration of a projector-camera system. In IEEE International Workshop on Projector-Camera Systems.

MAJUMDER, A. 2002. Properties of color variation across multi-projector displays. In Proc. of SID Eurodisplay.

MAJUMDER, A. 2003. A Practical Framework to Achieve Perceptually Seamless Multi-projector Displays. PhD thesis, The University of North Carolina at Chapel Hill.

MAJUMDER, A., AND STEVENS, R. 2002. LAM: Luminance attenuation map for photometric uniformity in projection based displays. In Proc. of VRST, ACM, 147–154.

MAJUMDER, A., JONES, D., MCCRORY, M., PAPKA, M. E., AND STEVENS, R. 2003. Using a camera to capture and correct spatial photometric variation in multi-projector displays. In IEEE International Workshop on Projector-Camera Systems.

MITSUNAGA, T., AND NAYAR, S. K. 1999. Radiometric self calibration. In Proc. of CVPR, 1374–1380.

NAYAR, S. K., PERI, H., GROSSBERG, M. D., AND BELHUMEUR, P. N. 2003. A projection system with radiometric compensation for screen imperfections. In IEEE International Workshop on Projector-Camera Systems.

RAIJ, A., GILL, G., MAJUMDER, A., TOWLES, H., AND FUCHS, H. 2003. PixelFlex2: A comprehensive, automatic, casually-aligned multi-projector display. In IEEE International Workshop on Projector-Camera Systems.

RASKAR, R., WELCH, G., CUTTS, M., LAKE, A., STESIN, L., AND FUCHS, H. 1998. The office of the future: A unified approach to image-based modeling and spatially immersive displays. In Proc. of ACM SIGGRAPH 1998, 179–188.

SONG, P., AND CHAM, T.-J. 2005. A theory for photometric self-calibration of multiple overlapping projectors and cameras. In IEEE International Workshop on Projector-Camera Systems, 97.

SUKTHANKAR, R., CHAM, T.-J., AND SUKTHANKAR, G. 2001. Dynamic shadow elimination for multi-projector displays. In Proc. of CVPR, vol. 2, 151–157.

YANG, R., GOTZ, D., HENSLEY, J., TOWLES, H., AND BROWN, M. S. 2001. PixelFlex: A reconfigurable multi-projector display system. In Proc. of IEEE Visualization, 167–174.