Digital Camera System for Recording, Editing and Visualizing Images

Digital Camera System for Recording, Editing and Visualizing Images
US 20100111489A1
(19)
United States
(12) Patent Application Publication (10) Pub. No.: US 2010/0111489 A1
Presler
(54)
Related US. Application Data
DIGITAL CAMERA SYSTEM FOR
RECORDING, EDITING AND VISUALIZING
IMAGES
(76)
May 6, 2010
(43) Pub. Date:
Inventor:
(60)
Provisional application No. 60/923,339, ?led on Apr.
13, 2007.
Ari M. Presler, Niskayuna, NY
(
US
Publication Classi?cation
)
(51) Int. Cl.
Correspondence Address:
H04N 5/93
STEFAN KIRCHANSKI
VENABLE LLP 2049 CENTURY PARK EAST,
215T FLOOR
LOS ANGELES, CA 90067 (Us)
(200601)
H04N 5/225
(2006.01)
H04N 5/228
(200601)
H04N 13/02
(2006.01)
(52) US. Cl. ........ .. 386/52; 348/340; 348/222.1; 348/46;
348/E05.024; 348/E05.031; 348/E13.074;
(21) Appl. No.:
12/595,811
3864305003
_
(57)
(22) PCT F1led:
(86)
ABSTRACT
Apr. 14, 2008
PCT NO;
pCT/Us08/60272
A digital camera system (20), as illustrated in FIG. 1, includes
an optical assembly (22) to gather light (24) from a desired
Oct. 13, 2009
the optical assembly (22), and an image processing, recording
and display subsystem (34).
scene (26), a modular imaging subsystem (28) aligned With
§ 371 (0X1),
(2), (4) Date;
L v
‘11
'-
/u
it?”
\ /
7/5
\L
3"
a
w
3,,
I
l
3‘
31
SOFTWARE
1
|
3”’
‘Z
—
.
‘f0
IMAGE
FRAME
SENSOR
GRABBER -— PROCESSOR —
SCENE
/
/
|
Li L
/
|
USER
INPUT
DISPLAY
\
3;
STORAGE
/V
V
Patent Application Publication
May 6, 2010 Sheet 1 0f 8
US 2010/0111489 A1
Lu
'
L6
/
'
'LV
‘11
ft?“
\
20
v3
x
1
%
f8
32
SOFTWARE I!
I
I
_
-—PROCESSOR- DISPLAY
SCENE
l
/
31/
I
\
A / F|G.1
RR
16
/
34
/
N
‘TL
FRAME
,_
LU
%u
oE
SENSOR
E ‘”
UNIT
gE
SCENE
.
7
GRABBER
8‘
-
CONTROLLER
0
5Z
RAM
L
-
RECORD
'
/
/ MOTORS &
SENSORS
/
Q5
5;
317
7,
|_
‘I1.1;.
" '
90
FRAME
0
-
5 52 u
SENSOR
I
g I! N
UNIT
1
E Z
I
6D
|
DATA LINK,
A
.
L--
/ / 87'
CONTROL
SYNC &
D'SPLAY
SENSORS
/
DOCKING
RECORDER
MOTORS &
;3
‘
a
cghggg'g'ggggm
_
ii‘:
I’
GRABBER
a
T
_
I
, 7 1
6€SYNC
WIRING
éL-"éIG. 2 \ s I;
7,2
.
START/STOP
/
_/7
a2
z LU
7
9/
'
/
AUDIO SOURCES
MOTION CONTROL
TIME CODE GENERATOR
(METADATA)
FIG. 3
,
-- 89’
Patent Application Publication
May 6, 2010 Sheet 2 0f 8
US 2010/0111489 A1
REMOTE sENsOR AND FRAME GRABBER CAM ERA MODULE
r
r
l I H l IH
IOT.
952.
f‘
\D‘('
HD-SDI
FRAME
DAT
SENSOR - LINKA:
UNIT
W4
DISPLAY
GRAEBER
SYNC&
COMMUNICATION
PROCESSOR
CONTROL
f__\
RAW 40?
OUTPUT
' MOTORS &
SYNC &
CONTROL
SENSORS
{no}I
FIG- 4
SOFTWARE
‘5'1,
REMOTE S-SENSOR AND FRAME GRABBER CAMERA MODULE
‘L
g 2’
E
E
m I:
E
E
g
[I w
8 3"
5
g
_2 g
5
5
5
5
w I.
w z
:
=
m 0
g
é
5
/
MOTORS &
l /
SENSORS
_I
m
/ (LT-i
I-ID-sDI
FRAME
DATA
GRABBER
LINK,
&
SYNC 8'
CONTROL
COMMUNICATION
PROCESSOR
-
[04
DISPLAY
O
svNca
CONTROL
l
/ I16
QQ
31
‘Z8
Iw
FIG. 5
22,
11%
3°
I
I
'L'L
\
~
If
1;’
\
/I
52.
/
/
. M
“gag/g“
FRAME
GRABBERI;
' 'T
o
IMAG‘NG
CONTROLLER
U
MODULE
RAM
I
I3‘?
/
Y0
I
DISPLAY
RAwIMAGE
62.
PROCESSING &
ICOMMUNICATION
USER
MODULAR IMAGING SUBSYSTEM
/
/
|
it!
I30
38
IMAGING
SOHWARE
/
p
I
az
FIG. 6
INPUT
DEVICE
L I
ap 0p compIner
REMO AB
DIGITAL
STORAGE
a
I
I
I
“1
‘W
Patent Application Publication
May 6, 2010 Sheet 3 0f 8
US 2010/0111489 A1
'LL
5-8
.
DC BATI'ERY POWER
__
,j‘z
:
/
5 a m HD/MK
65,5551;
8E 0
CONTROLLER
5 2 m
w LU
U
N
[?ags
MODULE "'
|
IMAGING
SOFTWARE
,
I
$5
4Z2
RAW IMAGE
I
pRocEssmsa
RAM 3*" £2
3%
COMMUNICATION
Z I
; LLI
Z T
_|
MODULAR IMAGING
SUBSVSTEM
USER
‘NPUT
36
REMOVABLE
DIGITAL STORAGE
?/y
WIRED OR WIRELESS
NETWORK
V
‘Mre'ess c‘"“Puier
“mg
NETWORK RECORDING, EDITING
& VISUALIZATION
’
Patent Application Publication
May 6, 2010 Sheet 4 0f 8
US 2010/0111489 A1
MOBILE DOCKING CAMERA & PROCESSING SYSTEM
TOUCHSCREEN
DISPLAY
\61
4/1.
5G
Q
I:
‘Cg
5%
.
‘ (‘I
/
/
POSITION
._ w
M
ég 0
‘
INPUT
CONTROL —
T
POWER
‘E FRAME
U
N
HDI2KI4K
GRABBER &
SENSOR
CONTROLLER
5N“ _ E
5z
T
'5,
Q
s
c
'
E
TIMECODE
DISPLAY
V
READER
PROCESSOR
E
/
Lgq
R
5'4 DOCKING CAMERA MODULE
H 5.3 D]
Q.
4
DC BATTERY
POWER
EE
%E
, H8
/13%
MODULAR PRO SSING SUBSYSTEM
IMAGING
I SOFTWARE
USER INTERFACE
32
8g
|
I
RAW lMAGEor HD-SDI MULTI-CHANNEL
I3 1/
PROCESSING, SYNC, BROADCAST& _
l
:
LN
usER INPUT
REMOVABLE
DEVICE
DIGITAL STORAGE
‘1' I
I80
WIRED 0R
WIRELESS
NEI'WORK
".—1_-—~1__, '
NETWORK RECORDING, EDITlNG
l‘fg
a. VISUALIZATION
.
METADATA MANAGER
HD-SDI
DISPLAY
BRO
& COLOR GRADING
ADCAST
INFRASTRUCTURE
FIG. 8
'
COMMUNICATION
DQSPLAY
'
1
Patent Application Publication
May 6, 2010 Sheet 5 0f 8
US 2010/0111489 A1
STEREO 3D NETWORK RECORDING & VISUALIZATION
/
51/
I
g
AM
SPLI
[8g
If
/ 57’
/
/ 74'
ER
_
'12,
an STEREO
VISUALIZATION
W7,»
‘<\
&RECORDING
'
/
v
TILT!
ROTATE
I
TILT!
ROTATE /
STEREO CAMERA POSITION
STAGE CONTROL
3D STEREO
VISUALIZATION
&RECORDING
FIG. 9
Lg
Patent Application Publication
May 6, 2010 Sheet 6 0f 8
US 2010/0111489 A1
MOBILE STEREO CAMERA &
RECORDING SYSTEM
STEREO OR
DUAL
DISPLAY
Stereo
D
"5'
FRAME
HQ
Sensor
GRABBER
Unit
& PROCESSOR
38
IMAGING
SOFTWARE
‘3
8
USER
INTERFACE
"91.
STEREO
6g
FRAME
RAW IMAGEor HD-SDI
PROCESSING &
/
FRAME GRABBER
K
COMMUNICATION
CONTROLLER
‘3},
INPUT
DEVICE
>>>
REMOVABLE
DIGITAL STORAGE
vInreless
Network
(0c,
'72’
\q?' KI’E
I
L
Stereo Rig
1
WIRED OR
WIRELE
VVIIISEEEISQ
NETWOIEE
NETWORK
STEREO 3D VIRTUAL SET
CONTROLLER
Optical
HD-SDI
DISPLAY
Unit
STEREO 3D
Remote
Power, Sync
& Control
Interface
OR DUAL
DISPLAY
>>>
FIG. 10
METADATA MANAGER
& COLOR GRADING
[Y8
Patent Application Publication
May 6, 2010 Sheet 7 0f 8
US 2010/0111489 A1
MULTl-CAMERA & STEREO-3D SYSTEM WITH NETWORK AND
BROADCAST INFRASTRUCTURE
(,0
'
u g;
\
/
\
lé o
I32
HD-SDI
.
g
30
=
/
g8
M
Em
U
U) 1-“
N
z ‘2
Lil-1 _
FRAME
~
DISPLAY
"‘
GRAZBER 5}
{ DOCKING
' COMMUNICATIO
UN'T
83
" RECORDER
PROCESSOR
I
T
—-=_—-
DATA LINK,
FRAME
=
c?mi?
GRABBER
MOTORS &
a
COMMUNICATION
Z
SKINSORS
a
PROCESSOR
_ L?>
TCP/IP
,b-b
HD BROADCAST
NETWORK
INFRASTRUCTURE
\ Stereo Rig
4K
SENSOR
UNIT
MOBILE STEREO
4'?)
,m
CAT-5E
RECORDER
4K
SENSOR
MEI'ADATA MANAGER
& COLOR GRADING
UNIT
Stereo Rig
/l @‘T
7,1.’
.
optlcal
'32
Imaging
Un't
Module
—/
‘
n
‘
"
NETWORK RECORDING, EDITING
PLAYBACK a. VISUALIZATION
"
MOTORS 1,
1’-
49
m
g
; §
Z9 8- :g
E 5
(D q;
HD-SDI
'- cmuIe
*6 ._
Power,
E
Sync &
l
(g
82.
DISPLAY
‘
\
Patent Application Publication
May 6, 2010 Sheet 8 0f 8
US 2010/0111489 A1
SILICONDVR SOFTWARE DIAGRAM
U( 0 o
IDENTIFY
CAM ERA (S)
/
,U ‘,1
/'~I0‘€>
ADJUST
SETTINGS &
INPUT
REC'EVE
CALIBRATION
1,
’ Lt 17'
i
LOAD
CALIBRATIONS “(06
PHI)
STORAGE
/
’
1
‘
CAPTURE
WAGE &
/W°
METADATA
'
READ FROM
NETWORK
1% 4
l
RECEIVE
/ V3 \(
'
I
IMAGE
.1 ‘499
STORAGE
CORRECTION & / YQ
STREAM &
PACKING
CONTROL
RAM BUFFER
H26 J
,-
‘08
RECORD I
LIVE
STREAMING
PREVIEW
LBL I’
I
PLAYBACK
DECODE
APPLY
METADATA
(eg. CINEFORM CODEC)
(eg. IRIDAS SPEEDGRADE)
I
.
N
ENCODING, PACKING
& BUFFERING
l
WRITE FILE &
‘(30 I
/ ‘17v
g
I
I
2.6
CONT-AMER
(AVI, Mov, SIV)
STREAMING
STORAGE
NETWORK
/ wg
DISPLAY
/ y to
VISUALIZATION
IMAGE PROCESSING
/_. f
(If/C
OUTPUT 1
I
FIG. 12
L” l-
OUTPUT 2
l
q 17
May 6, 2010
US 2010/0111489 A1
DIGITAL CAMERA SYSTEM FOR
RECORDING, EDITING AND VISUALIZING
IMAGES
and have complex Work?oWs With respect to stereo 3D and
multi-camera content acquisition, broadcast and netWork
CROSS-REFERENCE TO RELATED
transmission either live or in a post production process. These
digital cinema cameras are limited to 1920x1080 image sen
sor pixel arrays that require the use of a multiple sensor prism
APPLICATION(S)
block Which, in turn, requires use of complex and expensive
[0001] This application claims priority to and the bene?t of
US. Provisional Application Ser. No. 60/923,339, the entire
content of Which is incorporated herein by reference.
optics. These digital cinema cameras utiliZe dedicated hard
Ware functions With no or limited image processing ?exibility
FIELD OF THE INVENTION
perform non-reversible color space transformations or sub
sampling to formats, such as YUV 4:2:2 and 4:4:4, as stan
[0002] The present invention is related, in general, to a
digital cinema camera system. More particularly, the present
dard broadcast signals. These digital cinema cameras imple
ment a variety of proprietary compression and coding
schemes that introduce visible image artifacts, especially
invention is related to a digital cinema camera system for
recording, editing and visualizing images.
or upgrade capability. Dedicated hardWare functions utiliZed
by these digital cinema cameras include video processing to
When projected on large screens. While a number of these
digital cinema cameras can generate previeW imagery for
BACKGROUND OF THE INVENTION
[0003]
For many years, ?lm cameras Were the only option
for capturing cinema quality motion pictures. The time
requirements and costs related to shooting and processing
motion picture images on ?lm stock and then transferring
those images into a digital form have created a need for
motion picture cameras that capture high de?nition or cinema
resolution imagery directly in a digital form. The advent of
Digital Cinemas, cost effective Stereo 3D Digital cinema
projection systems and establishment of Digital Cinema Ini
tiative SMPTE Standards has fueled the need for more con
tent creation for delivery at 2K, 4K and Stereo formats.
[0004] Accordingly, there is a need for a digital camera
system that meets the needs described above that reduces
display on an electronic vieW?nder, these digital cinema cam
eras can only do so With limited resolution or visualiZation
tools. High-resolution outputs from these digital cinema cam
eras are restricted to transmission in SMPTE standard reso
lution and formats. These digital cinema cameras often output
imagery to be record on restrictive, proprietary or large exter
nal storage devices. These storage devices include a tape
storage system having only linear data access, Non-Volatile
Flash or RAM drives With limited storage, and multiple hard
disk drive RAID storage systems Which are often non-por
table and Whose media cannot be easily removed or trans
ported for direct use in other systems. Also, the ?les stored on
these storage devices have limited color correction, image
processing or post production metadata integration.
costs. There is also a need for a digital camera system that
[0006]
leverages digital processing and visualiZation tools. There is
mode video and still camcorders have also been developed
Which use single image sensors With color ?lter arrays. These
digital still cameras and camcorder devices do use higher
a further need for a digital camera system that provides user
feedback and metadata collection When shooting special
effects, compositions and stereo or multi-camera content.
There is an additional need for a digital camera system that
In recent years, many digital still cameras or dual
resolution sensors (e.g., HD (1920x1080) camcorders, digital
?lm-like reproduction qualities and camera operation. There
single-lens re?ex camera (DSLR) are now 10 MP and higher).
HoWever, these digital still cameras and camcorders have
sloW readout architectures (e.g., a DSLR may only shoot four
(4) frames per second) and can only achieve video rate pre
vieW at loW resolution (e.g., 640x480) or standard de?nition
is also a need for a digital camera system that not only pro
vides the capabilities described above but can also utiliZe
(e.g., VGA 640x480 at thirty (30) frames per second) using
sub-sampling or WindoWing techniques. These digital still
improves ?exibility in netWorked collaboration, enables
separated imaging block and recording, has a simple Work
?oW With metadata management and, importantly, maintains
existing cabling infrastructure, broadcast signal monitoring
cameras and camcorders use dedicated hardWare functions or
and transmission systems. There is a need for a digital camera
system that mixes broadcast standard digital sources into a
recording and visualiZation system, as Well as generate broad
targeted function digital signal processors (DSP) to perform
image processing to interpolate and coloriZe the raW image
data from the image sensor. These digital still cameras and
cast standard and netWork streaming outputs for 2D and 3D
camcorders compress the coloriZed images for storage; but
content.
the compressing process performed by these devices prevents
[0005]
In the past feW years, While several digital cinema
cameras have emerged on the market these digital cinema
cameras are complex designs With limited connectivity that
are only able to address a limited set of the needs described
above. For example, these digital cinema cameras are incom
patible With existing cable infrastructure. Also, these digital
cinema cameras either completely lack netWork management
or are capable of only minimal netWork management (i.e.,
access to the original full raW image pixel data for later
processing, analysis or coloriZation. In addition, the interpo
lation and color processing applied to the source raW data in
those devices initially generates data sets that are larger than
the source raW data Which, in turn, requires the application of
higher compression to ?t the data sets into a target storage
capacity. This typically results in a reduction in image quality
the ability to capture or record multi-sensor 2K or 4K image
compared to the original image or a coded version of the raW
data.
[0007] A feW single sensor cameras have been developed
for use in 2K and 4K acquisitions in raW format. HoWever,
data using a single control application. Additionally, these
these cameras use dedicated hardWare or targeted function
digital cinema cameras lack visualiZation tools or metadata
DSPs to perform image processing to interpolate, coloriZe
and display previeW quality output and simultaneously com
only simple controls that lack full image streaming or meta
data management). Further, these digital cinema cameras lack
integration. These digital cinema cameras do not utiliZe exist
ing broadcast infrastructure to transmit multi-resolution data
press the raW sensor image data for later digital editing and
May 6, 2010
US 2010/0111489 A1
grading. Also, the compression method and metadata
an encoder, Which can generate multiple streams one for the
employed by these cameras foreclose the dynamic retrieval of
purpose of recording at highest quality and optionally addi
alternative resolution or representations at different bit rates
tional streams at loWer data rates for remote transmission. It
during recording for netWork streaming, remote grading or
adaptive editing. Due to their architectures, these single sen
a common removable storage device or across multiple
sor cameras must apply high compression to ?t data into
devices for increased throughput. The recording can make use
enables the recording of one or multiple image streams using
target internal storage capacity devices. Also, due to their
of a single ?le containing the streams from multiple imaging
architectures, these single sensor cameras lack the ability to
modules With metadata to enable the selective playback of
transmit the imager raW content over existing broadcast or
one or more streams. The output processing can include mix
netWork infrastructure cabling for remote monitoring, net
ing the imagery from the multiple streams for display on
Working, recording or visualiZation. These single sensor cam
eras cannot process captured signals With prerecorded con
tent or broadcast format signals for live composition,
cessed for output on specialiZed stereographic displays that
standard computer or broadcast monitoring devices or pro
require formatting and synchronization from dual image
sWitching, grading, mixing into virtual sets or adding graphic
streams. UtiliZing metadata encoded in the recorded stream or
overlays based on extracted metadata or analytics. These
single sensor cameras also lack the ability to manage, control
or record multi-sensor imagers, Which may be remotely con
nected to a recording system.
[0008] In recent years, there has been an interest in produc
generated thru user input during playback the relative posi
ing digital cinema quality 3D stereographic, Wide-dynamic
and immersive content using multiple imagers. This has cre
tion, color transformation and format of the dual streams,
representing the left and right eye content, can be adjusted to
change the stereo graphic effect and depth perception on these
displays.
[0010] This invention enables reduced complexity for cap
turing imagery from one or more image modules, enables
ated a need for more e?icient modular and scalable cameras
remote image sensing and frame grabbing With transmission
and Work?oW solutions. There is a further need for a digital
using existing industry standard broadcast and netWorking
infrastructure, improves storage and processing e?iciency,
camera system having a precise synchronization mechanism
to enable images to be mixed or stitched Without motion
artifacts. While digital camera systems have been used to
produce this type of content, these camera systems suffer
from the same system limitations as the cameras described
above. These camera systems are mostly comprised of stand
alone cameras, each With individual controls, vieWing and
recording systems, With no integration mechanism other than
provides increased ?exibility and tools visualiZation, net
Working, analysis and mixing of prerecorded or computer
generated data, and delivers unique display modes 2D and 3D
representation of the multiple streams during live, recording,
playback or post processing. The disclosed digital camera
system may include optics, one or more imaging modules, a
era systems cannot be placed very close together physically,
frame grabber, a processor, softWare, user input mechanism,
a display, synchroniZation mechanism, netWorking means
and storage means. In addition, a con?guration is disclosed
for a portable digital camera and recording system capable of
HD, 2K and 4K stereo-3D or Wide-dynamic multi-image
as is required for short inter-ocular distances in 3D stereo
acquisition using tWo image sensing modules and separated
graphic or for creating hemispherical vieWs Where cameras
image processing, recording and display subsystem.
a common sync signal (i.e., there is no communication
betWeen camera controls or vieWing and recording settings).
These camera systems are large and bulky such that the cam
need to be placed as close together as possible from a com
[0011]
mon center point. When shooting thru mirrors and beam
tion Will become apparent from the folloWing more detailed
splitters, rigs (i.e., a combination of digital cameras, optics
description, taken in conjunction With the accompanying
draWings, Which illustrate, by Way of example, the principles
and mounting platform) become more cumbersome and dif
?cult to use in handheld shooting environments. Finally, these
camera systems lack a comprehensive set of image process
of the invention.
ing, visualiZation, positioning control, recording, playback,
communications and display tools for use in such high-de?
nition multi-camera systems.
SUMMARY OF THE INVENTION
Other features and advantages of the present inven
BRIEF DESCRIPTION OF THE DRAWINGS
[0012] The accompanying draWings illustrate the inven
tion. Throughout the draWings like reference numbers indi
cate like exemplary elements, components, or steps. In such
draWings:
[0013]
FIG. 1 is a block diagram ofa digital camera system
[0009] The present invention as described herein discloses
a digital camera system that captures scalable resolution,
bit-depth and frame rate raW or color processed images from
embodying the present invention;
one or more modular imaging modules at precise ?lm or
camera system;
video rates, can utiliZe industry standard cabling infrastruc
raW on the same or different links, provides a mechanism for
[0015] FIG. 3 is a diagram of an alternate embodiment of
the digital camera system;
[0016] FIG. 4 is a diagram of an additional embodiment of
timing synchroniZation of exposure and readout cycles from
the digital camera system covering a remote sensor and frame
multiple imaging modules, uses a uni?ed softWare or operator
interface to control the capture, processing and non-destruc
grabber camera module;
ture for transmitting either the raW sensor data or processed
tive visualiZation from one or more imaging modules, can
optionally combine the live imagery With previously stored
imagery or computer generated virtual sets and simulta
neously record the raW, broadcast format, or visualiZation
processed imagery in its original or near original representa
tion. The processor can be used to compress the imagery With
[0014]
FIG. 2 is a diagram of an embodiment of the digital
[0017] FIG. 5 is a diagram of another embodiment of the
digital camera system covering a remote three-sensor and
frame grabber camera module;
[0018] FIG. 6 is a diagram of yet another embodiment of
the digital camera system;
[0019] FIG. 7 is a diagram of an alternate embodiment of
the digital camera system;
May 6, 2010
US 2010/0111489 A1
[0020] FIG. 8 is a diagram of another embodiment of the
digital camera system a mobile docking camera and process
playback or post processing. The present invention discloses
ing system;
imaging modules, a frame grabber, a processor, softWare, user
a digital camera system that includes optics, one or more
FIG. 9 is a diagram of another embodiment of the
input mechanism, a display, synchronization mechanism,
digital camera system covering stereo 3D netWork recording
and visualization;
netWorking means and storage means. In addition, a con?gu
ration is disclosed for a portable digital camera and recording
[0021]
[0022]
FIG. 10 is a diagram of another embodiment of the
digital camera system covering a mobile stereo camera and
recording system;
[0023] FIG. 11 is a diagram of another embodiment of the
digital camera system covering a multi-camera and stereo-3D
system With netWork and broadcast infrastructure; and
[0024] FIG. 12 is a How chart for SiliconDVR softWare
associated With the present invention.
DETAILED DESCRIPTION OF THE PREFERRED
EMBODIMENTS
[0025] As shoWn in FIGS. 1-12 for purposes of illustration,
the present invention resides in a digital camera system that
captures scalable resolution, bit-depth and frame rate raW or
color processed images from one or more modular imaging
modules at precise ?lm or video rates. The present invention
utilizes industry standard cabling infrastructure for transmit
ting either the raW sensor data or processed raW on the same
or different links. The present invention provides a mecha
nism for timing synchronization of exposure and readout
cycles from multiple imaging modules. The present invention
also provides a uni?ed softWare or operator interface to con
trol the capture, processing and non-destructive visualization
from one or more imaging modules. The present invention
can optionally combine live imagery With previously stored
imagery or computer generated virtual sets, While simulta
neously recording the raW, broadcast format, or visualization
processed imagery in its original or near original representa
tion. The present invention discloses a processor that can be
used to compress the imagery With an encoder, Which can
system capable of HD, 2K and 4K stereo-3D or Wide-dy
namic multi-image acquisition using tWo image sensing mod
ules and separated image processing, recording and display
subsystem.
[0027]
An embodiment of the present invention in the form
of a digital camera system 20 is illustrated in FIG. 1 and
described beloW in order to provide an overvieW of the cam
era system 20 as Well as various components of the system 20
and their respective functions. The camera system 20 includes
an optical assembly 22 to gather light 24 from a desired scene
26. The system 20 also includes a modular imaging sub
system 28 aligned With the optical assembly 22 to receive
light 24 gathered and/or modi?ed by the optical assembly 22.
The modular imaging subsystem 28 comprises one or more
imagers 30 and at least one frame grabber 32. The imager 30
captures high de?nition raW images at ?lm or video rates for
HD, 2K and 4K, cinema quality production. The imager 30
can come in various forms including, Without limitation, at
least one pixilated image sensor unit having one or more
arrays of pixels, a pickup tube, a semiconductor detector or
the like. The pixilated image sensor unit can come in various
forms including, Without limitation, a complimentary metal
oxide semiconductor (CMOS) active-pixel image sensor, a
metal-oxide semiconductor (MOS) active-pixel image sen
sor, a charge-coupled device (CCD), a contact image sensor
(CIS) or other pixilated detection devices. A single image
sensor 30 may include color ?lters that are used to capture a
representation of the full color images.
[0028] The optical assembly 22 includes optics (e.g.,
generate multiple streams one for the purpose of recording at
lenses). The system 20 includes a lens mount (not shoWn) that
interconnects the optical assembly 22 and the modular imag
highest quality and optionally additional streams at loWer
ing subsystem 28. The lens mount can come in various forms
data rates for remote transmission. The present invention
enables the recording of one or multiple image streams using
including, Without limitation, a ?xed optical interface mount,
an interchangeable lens optical interface mounting system or
a common removable storage device or across multiple
the like. Thus, the lens mount can provide for ?lm or video
devices for increased throughput. The recording can make use
of a single ?le containing the streams from multiple imaging
lenses to be removably connected to the modular imaging
subsystem 28. The interchangeable lens mount is a precise
modules With metadata to enable the selective playback of
mounting surface and locking mechanism, Which enables
one or more streams. The output processing can include mix
?eld exchange of the lens mount to support the use of a variety
ing the imagery from the multiple streams for display on
of industry standard lenses such as, PL, Nikon-F, Panavision,
standard computer or broadcast monitoring devices or pro
Leica, C and Canon. An interchange lens mount With an
cessed for output on specialized stereographic displays that
integrated optic enables the use of B4 optics (originally
require formatting and synchronization from dual image
designed for use With three-sensor prism cameras) on a
streams. Utilizing metadata encoded in the recorded stream or
single-sensor based camera unit. In the alternative, the image
generated thru user input during playback the relative posi
sensor unit may have an integrated lens.
tion, color transformation and format of the dual streams,
representing the left and right eye content, can be adjusted to
change the stereo graphic effect and depth perception on these
adjustment mechanisms (not shoWn) to adjust the position of
the image sensor unit relative to the optical center of lens
displays.
projection and/or to adjust the co-planarity of a sensing plate
[0026]
[0029]
The image sensor unit 30 includes a plurality of
The present invention enables reduced complexity
(i.e., the surface Which holds the sensor circuit board of the
for capturing imagery from one or more image modules,
image sensor) relative to the optical interface mount. The
enables remote image sensing and frame grabbing With trans
mission using existing industry standard broadcast and net
image sensor unit 30 also includes a mechanism for back
Working infrastructure, improves storage and processing e?i
ciency, provides increased ?exibility and tools visualization,
netWorking, analysis and mixing of prerecorded or computer
focus adjustment. Any of the adjustment mechanisms can
include an electronic positioning device for remote operation
for moving the sensor(s), optics, camera(s) or rig(s). In the
alternative, the image sensor unit 30 may be integrated With
generated data, and delivers unique display modes 2D and 3D
representation of the multiple streams during live, recording,
an optical beam splitter or rotating shutter mechanism to
enable the use of an optical vieW?nder While continuing to
May 6, 2010
US 2010/0111489 A1
acquire imagery. In another alternative, an electronic display
ules). In this manner, the image sensor unit 30 can be pro
unit can be mounted into the optical beam splitter mechanism
grammed as a master to output the sync signals or as slave to
to enable selectable operation as an optical or electronic vieW
?nder or as a combination optical vieW?nder With virtual
receive a synch signal from external sources. An input (i.e.,
electronic image display.
pad), or from the frame grabber 32 thru softWare initiation
[0030]
from a recorder) to the image sensor units 30 can be used to set
the operation either as a master or a Slave camera. An input
An optical device (not shoWn), such as an RGB
prism, may be positioned in front of the optical assembly 22
so that a plurality of pixilated sensor units 30 in the imaging
subsystem 28 capture color-separated channels. Alterna
tively, a plurality of pixilated image sensor units 30 and beam
splitting optics may also be used to capture a Wide dynamic
range representation of light 24 from the scene 26, Where each
command via hardWare (from external signal device (key
can also be used to select a skip or non-skip synchronization
mode. In a skip mode, the master camera (i.e., the master
sensor unit) Will only output a synch pulse at the top of frame
of each frame that should be grabbed (i.e., captured by the
frame grabber 32). The slave camera (i.e., the slave sensor
pixilated image sensor unit 30 captures a range of scene
intensity (i.e., each sensor Will have a bounded range of
unit) Which is programmed for skip mode, Will receive a pulse
intensity that the sensor can accurately detect or measure
chronized readout of a frame pair. The double speed readout
skip capture mode alloWs reduced motion blur With a faster
based on the capacity of sensitivity and setting of the camera
and its sensor (e.g., one cannot typically see details of both the
Moon and the Sun in the same scene). Signals from each of
the plurality of image sensor units 30 can be processed and
combined into a single image representing a Wider range of
scene intensity than can be accomplished With a single image
sensor unit 30. Each image sensor unit 30 is capable of out
putting multiple readouts from the at least one array of pixels
every other frame scanned from the master to begin a syn
readout and also enables the use of a mechanical shutter, such
as a rotating mirror shutter of the type used on traditional ?lm
cameras, With a rolling shutter readout sensor, such as are
commonly found on CMOS image sensors, Wherein the
image exposed during the open period of the mechanical
With varying integration times for each pixel of the array
shutter can be readout during the closed period.
[0033] The frame grabber 32 includes an on-board memory
(not shoWn) to capture full frame images (from a single or
during a single frame time, to be later combined to achieve a
multi-exposure readout image data) or buffer the image data
Wide dynamic representation of light 24 from the scene 26.
The multiple exposure readout per frame can be applied to
single or multiple image sensor unit con?gurations.
[0031] The image sensor unit 30 contains a time base and
controller With precision to enable audio synchronization.
The pixel data is readout of the image sensor unit 30 at
variable clock speeds or resolutions and is captured via the
frame grabber 32. The images output from the image sensor
unit 30 may be captured by the frame grabber 32 either
for transfer to a processing subsystem 34 having an associated
memory (i.e., a RAM part of the frame grabber 32). The frame
grabber 32 can be used for image pre-processing that
includes, Without limitation, noise and pixel defect correc
continuously or on an intermittent basis. The image sensor
34 to initiate speci?c actions Which may include triggers for
timing synchronization or event initiation (e.g., timing syn
unit 30 can operate stand alone to generate and output image
data over a high speed data link to the remote frame grabber
32. The image sensor unit 30 includes local inputs (e.g.,
sensor input, motor encoders, timecode, triggers, etc.) and
can generate external sync and control signals (i.e., signal
output Which can be used to synchronize other cameras to the
tion, binning, sub-sampling, black frame offset correction,
multi-exposure data mixing, data packing or data coding
Which may be lossy or lossless.
[0034] The frame grabber 32 is able to receive input from
external components and/or from the processing subsystem
chronization of readout and event initiation such as “Start
recording” or “move lens”). The external inputs can be de
bounced and re-timed relative to the image readout cycle to
enable precise record start event synchronization.
[0035] The frame grabber 32 includes a time code reader
?rst camera’s timebase; control canbe for lens, motors, lights,
(not shoWn) and a timing unit (not shoWn) for matching time
etc.). The image sensor unit 30 can also accept synchroniza
tion and clock source reference from the frame grabber 32
(i.e., the frame grabber 32 gets the external sync information
reference With other system devices, such as an audio
and sends the external sync information to the sensor or
recorder (not shoWn). The time code reader obtains a time
reference from a master source and the timing unit keeps the
[0032] The image sensor unit 30 can operate in either a
continuous mode or a skip frame output mode. The image
time like a clock. The timing unit can set its reference time
from an external time code source. The time code generator is
a clock Which has an output connection to alloW other devices
to receive the current time and set its clock to match it. The
sensor unit 30 can also operate in a readout mode at a rate
timing unit (i.e., the clock) may contain a charge-storing
greater than the desired capture frame rate (i.e., the rate at
device, such as a large capacitor, Which can enable continued
operation and time keeping When an external poWer source
operates the sensor timing).
Which data is captured by the frame grabber 32), generally
instruct the frame grabber 32 to capture the intermittent or
(not shoWn) is disconnected from the system 20. The system
20 may also be poWered by a battery (not shoWn).
[0036] The frame grabber 32 may also generate outputs for
controlling various devices including, Without limitation,
alternating frames, With speci?c timing relative to the top of
motors such as pan and tilts to position the camera (i.e., the
tWice the desired capture frame rate, and can produce an
associated synchronization signal (e.g., once for every tWo
images). This synchronization (or sync) signal can be used to
frame readout. The sync signal can be output for external
synchronization of additional cameras (e.g., sensor units,
camera modules, etc.). The sync signal can also be received
by the image sensor unit 30 from external sources (e.g.,
image sensor unit 30), rotating shutters (i.e., spinning mirrors
imager and frame grabber), a master sync generator Which
like those used on traditional ?lm cameras), slide motion
stages (i.e., that slide along a rail from the left to right side of
a stage along a beam) or lens motors for zoom, focus or iris,
Which may be used to control a stereo 3D rig or lighting such
as strobes. The frame grabber 32 can also generate synchro
generates sync signals for all sensor units or camera mod
nization signals for multi-camera operation. The frame grab
another image sensor unit 30, modular imaging systems (i.e.,
May 6, 2010
US 2010/0111489 A1
ber 32 can also receive external information from sensors
(e.g., positioning encoders, laser distance sensors, micro
phones), rotating shutter position detectors, time code gen
erators and positioning devices. The external information
received by the frame grabber 32 can be used for system
control and transmitted along With the imagery (i.e., the data
from image sensors Which can be raW or processed, codec or
uncoded) for recording as metadata or user interface feed
the local memory of the frame grabber 32 can be transmitted
to obtain a live previeW display. The modular camera unit 28
can generate multiple data streams With varying bit rates that
are simultaneously transmitted over a single or multiple data
links. The data transmission link from the image sensor unit
30 to the frame grabber 32 or from frame grabber 32 to the
processor 36 may use existing infrastructure broadcast trans
mission media, such as 1.5 Gigabit/sec or 3.0 Gigabit/sec
back. The frame grabber 32 can also accept audio signal
input, Which can be processed and mixed into the image data
capable links including Coax and triax, Wireless links, ?ber
stream for transmission. The audio source is another data
the raW 2K or 4K image data or SMPTE format RGB and
source. It is desirable to transmit the audio together to keep lip
YUV serial digital data. The raW or color processed data,
sync With the images (e.g., sounds heard correspond to
images of a mouth making those sounds).
[0037] The frame grabber 32 can be used to generate color
processed RGB or transformedYUV imagery at the time base
of the readout from the image sensor unit 30 or scan-con
verted for output at a different rates. An external sync refer
ence signal can be used to establish the rate and phase for the
scan-converted output. The frame grabber 32 can output the
imagery (i.e., a sequence of motion pictures) as standard
de?nition, high de?nition (digital or analog signaling) or
computer output formats.
[0038]
The frame grabber 32 can be housed With the image
sensor unit 30 in a shared housing or enclosure (not shoWn).
optics or netWork cables, such as CAT-5e, to transmit either
either coded or uncoded, can be transmitted on the same data
transmission link. The data transmission link can also be used
to simultaneously transmit the raW and processed data in
either coded or uncoded formats. This enables a very high
quality data set to be used for recording, While another data set
is used for remote transmission or display. The data transmis
sion links using coax or triax can incorporate a reverse control
channel modulated onto the cable, from a remote processing
sub-system 34. Additionally, poWer can be received on the
triax connection. In this Way, a single cable can be used for
image and metadata transmission as Well as control and
poWer.
[0040] A modular camera unit 28 that uses a 2K image
In the alternative, the frame grabber 32 can be remotely
sensor unit 30 can use a at least one SMPTE 1.5 Gigabit/sec
connected to the image sensor unit 30 in order to alloW a
smaller form factor sensor unit (i.e., a camera head siZed to ?t
HD link to transmit 2K raW image data at ?fty (50) images per
second With 10-bit per pixel data. A modular camera unit 28
into a hand, put on a goal post for sports, etc.). Combined in
that uses a 4K image sensor unit 30 can use tWo or more
the same housing, the image sensor unit 30 and the frame
3-Gigabit per second links to transmit 4K raW images at
minimum of 23.97 images per second With at least 10-bit
precision. The 4K image sensor unit 30 can transmit image
data at a minimum of 5.1 Gigabit/second and the modular
grabber 32 (i.e., modular imaging system 28) comprise a
modular camera unit 28 Which can operate either standalone
With an output for display or can be connected to a processing
sub-system 34, capable of operating one or more modular
camera units 28. The modular camera unit 28 can be remov
ably docked (i.e., electro-mechanically connected) to the pro
cessing sub-system 34, or can be detached from the process
ing sub-system 34 for remote camera unit operation (i.e.,
remote operation of the modular imaging system 28) via
Wired or Wireless connections. A docking mechanism (not
shoWn) provides electro -mechanical connection of the modu
lar camera unit 28 and the processing sub-system 34 that
camera unit 28 using the 4K image sensor unit 30 can transmit
data at a minimum of 2.55 Gigabit/ second. A sensor unit 30 or
camera module 28 may use a four pair netWork cable With up
to three pairs carrying data to the frame grabber 32 and the
fourth pair for reverse control and optional feedback (Which
may include a digitally encoded display stream for output).
An alternate con?guration can use the netWork cable as a
single Gigabit Ethernet connection from the frame grabber 32
enables ?eld insertion or removal of the modular camera unit
that can be used to transmit 12-bit raW uncompressed data at
over 100 MB/ sec to enable capturing of 2048 by 1152 reso
28 from the processing sub-system 34. In the dockedposition,
lution imagery (i.e., RAW data Which may or may not be
the modular camera unit 28 can receive poWer and controls
coded by the frame grabber 32) at up to tWenty ?ve (25)
from a processor 36 comprising a portion of the processing
images per second, even if the sensor unit has 4K resolution.
In a sensor WindoWing mode, the frame grabber 32 can trans
mit 1280 by 720 resolutions at rates faster than standard 720P
sub-system 34 and can also transmit images (i.e., image data)
and metadata to the processor 36 using the docking connec
tions.
video and ?lm rates, up to eighty ?ve (85) images per second.
processed, codec or uncoded) using a data link and may
generate an output display of the motion pictures. The data
link or data channel betWeen the frame grabber 32 and the
At a resolution of 960 by 540, a rate of one hundred ?fty (150)
images per second can be achieved. Similarly, other resolu
tions and over-cranking frames rates can be achieved Within
the channel limits of the link rate. Using coding methods on
the data, higher resolutions and frame rates can be transmit
processor 36 can transfer the raW or processed image data at
ted, including but not limited to cinema 4K. The Gigabit
variable speeds, Which may be beloW, at or above the readout
rate of the image sensor unit 30. In the case of the capacity of
the data channel being less than the rate of readout from the
image sensor unit 30, multiple images can be buffered in the
Ethernet link may use poWer-over-ethernet or additional dedi
[0039] The modular camera unit 28 can communicate
imagery (i.e., data from image sensors Which can be raW or
local memory of the frame grabber 32 and images transmitted
cated Wire pairs alongside the Ethernet, to achieve a single
connection for poWer, data and control.
[0041] Data received by the processing sub-system 34
includes, Without limitation, image data, metadata, control
signals or the like. The metadata may include information
using a ?rst in, ?rst out (FIFO) basis or in a skipped frame
fashion. All imagery may be buffered until the memory of the
frame grabber 32 is ?lled and then emptied at the available
pertaining to the frame grabber 32, the image sensor unit 30,
data channel rate. A subset of the imagery being captured into
audio, global positioning system (GPS), Timecode or motor
camera or lens settings, scene data, or external inputs (e.g.,
May 6, 2010
US 2010/0111489 A1
position feedback. The control signals may include functions
outputs and external controls such as lighting controls, cool
for synchronization and event initiation such as starting and
ing system control, poWer management, motor positioning,
lens controls, time code, device synchronization, multi-cam
stopping of recordings.
[0042] The processor 36 of the processing sub-system
executes reprogrammable softWare 38 that performs image
processing for visualization, analysis, or storage. The proces
sor 36 may be either dedicated hardWare or general purpose
era synchronization, calibration stimulus, tactile feedback,
status indicators and audio.
[0048] The processor 36 monitors the temperature and
recording status of the processing sub-system 34 and can
central processing unit (CPU), graphics processing unit
automatically adjust a cooling system (not shoWn) that cools
(GPU) or DSP or a combination thereof.
the processing sub-system 34. The cooling system can
[0043]
include a fan that dissipates heat. The processor 36 adjusts the
The reprogrammable softWare 38 can use a touch
screen oriented user interface, run on an industry-standard
cooling system in various Ways including, Without limitation,
notebook computer or Workstation x86 PC platforms, and use
reducing fan speed, to loWer the ambient noise levels gener
ated by the camera system 20.
built-in communication and display ports or additional mod
ules can be added into the notebook computer or Workstation
[0049]
for additional frame grabbing, processing, storage or display
output functions.
[0044] The reprogrammable softWare 38 can perform vari
ous image processing functions including, Without limitation,
image correction, interpolation, White balance, color correc
user interface) or the frame grabber 32 to perform speci?c
tasks. Various input mechanisms 42 may include, Without
The processor 36 can accept input from a user (via a
limitation, a computer mouse, a pointing device, a touch
screen input, direct digital signaling or netWork commands.
The speci?c tasks may include, Without limitation, start or
tion, color transformation including the use of three dimen
stop recording, initiate playback, adjust the settings of the
sional look-up tables, motion detection, object detection and
frame grabber 32 and/or the image sensor 30, or select image
processing or display modes.
[0050] The system 20 also includes a storage device 44. The
classi?cation, tracking, triangulation, calibration, color key
ing, image mixing, stereographic processing, anaglyph gen
eration, indicator overlay, focus detection, exposure meter
storage device 44 can be either internal or external and either
ing, zooming and scaling, ?ipping, data packing, pattern
?xed or removable. Examples of various storage devices that
may be used include, Without limitation, non-volatile
memory, ?ash memory, a magnetic hard drive, an optical disk
and tape. For the purposes of external storage, display or
processing, the processed or raW data may be externally trans
mitted. The transmission methods may include, Without limi
matching and recognition, face recognition, data rate analy
sis, enhancement, stabilization and compression. The image
processing functions may be softWare selectable and may be
combined for multiple image processing functions.
[0045] The image data can be stored, displayed, or trans
mitted. The processor 36 generates a ?le management system
for subsequent data storage. The system 20 also includes a
display 40 connected to the processor 36. The display 40 can
come in the form of various devices including, Without limi
tation, USB, FireWire/IEEEl394, SATA, Ethernet, PCI, PCI
Express, Camera link, HDMI, DVI, HD-SDI, Displaylink,
In?niband, Wireless or Fiber optic link.
[0051] A particular con?guration of a digital camera sys
tation, electronic vieW?nders, cathode ray tube (CRT) moni
tem 20 that uses multiple image sensor units 30 (via one or
tors, liquid crystal displays (LCD), organic light emitting
more modular camera units 28) input into a single processing
diodes (OLED), LCOS displays or projectors, or stereo
sub-system 34 may be used for capturing multiple image
graphic displays, such as virtual reality (VR) goggles. The
simultaneously With the ability to synchronize the sources
display 40 may be on-board the camera or remotely con
(i.e., sensor imaging units or camera modules), coordinate
nected. The processor 36 generates a loW latency representa
control and combine image processing, recording, display,
tion of the scene 26 to the display 40 for user feedback and
interactive control or positioning. In the event the processing
storage and communication. This multiple camera con?gu
ration can be used for processing 3D stereographic and
sub-system 34 is not capable of displaying full resolution, full
immersive scenes. The imagery (i.e., the RAW image data)
frame representations, the processor 36 can send a reduced
resolution or reduced frame rate to the display 40 in order to
this multiple camera con?guration can be recorded on a
maintain the loW latency.
[0046]
The processor 36 generates one or more outputs for
display With image data, status and setting information or
on-screen menus that are usable for a touch-screen user inter
face. The output can also be set to display image data With or
Without image processing in a full screen display mode With
out the status or operator control information, for projection
or broadcasting. In this mode, it may still be possible to mix
additional imagery, graphics and overlays for use in the trans
mission or recording. These features may include sports
image annotation, advertisement insertion, animations, key
ing, virtual set integration or multi-channel streaming content
mixing. In multiple display output con?guration, one monitor
and metadata (i.e., audio, positioning, timecode, etc.) from
single removable storage medium or to independent storage
devices in a synchronized fashion to enable simpler post
processing and display. The combined imagery can be out
putted to specialized displays such as stereographic LCD
monitors, 3D or spherical projection systems and VR
goggles.
[0052] The system 20 includes softWare 38 stored on a
memory and running on the processor 36. The softWare 38
provides the user With the ability to obtain stereo imaging. An
imaging softWare program 38 provides control of single and
stereo image sources (i.e., imagery) With synchronized image
capture, frame grabbing, processing, metadata capture, dis
play, coding, recording and playback, using a single user
or display 40 may be used for user interface and additional
interface. The image sources (i.e., imagery) can be from
outputs used for full screen display With different image
processing functions applied to the user images.
[0047] The processor 36 can output raW image data, image
processed data or metadata in any combination for storage,
image sensor units 30 or camera modules 28 capable of cap
turing high de?nition raW images at ?lm or video rates for
transmission or display. The processor 36 can also generate
CMOS, CCD or other pixilated detection device that contains
HD, 2K and 4K cinema quality production. As discussed
above, the image sensor unit 30 may be based on at least one
May 6, 2010
US 2010/0111489 A1
a time base and controller With precision to enable audio
synchronization. The system 20 can record the sound or the
timecode for the audio can be synchronized With the timecode
for the images. A user can record the audio With the imagery
or the timecode Which is associated With the audio in another
audio recording device Which also records timecode and then
tied back together during editing or post production process.
[0053] The softWare 38 running on the processor 36 may
automatically detect the presence of an image sensing unit 30
alloWs compression rates to rise dynamically for more com
plex scenes, and alloWs compression rates to dynamically fall
for less-complex scenes. The codec can support RaW pixel
data, RGB orYUV data formats. The codec can combine data
from image, audio and metadata and streaming metadata in
headers of ?les and Within groups of pictures (GOP) foruse in
decoding and editing. The coded data can be encapsulated
into industry standard format ?le containers such Audio
video lnterleaves (AVI), Quicktime (MOV). In playback, the
frame grabber 32 to determine the camera module 28 or
codec, Which may be on the same softWare platform or part of
a post-production softWare program, can adaptively select to
sensor unit identi?cation and its image capture or processing
capability. Upon identi?cation, the softWare 38 can load
transform, to enable real-time, multi-stream editing perfor
image calibration data from a storage device 44 or can initiate
mance in softWare on standard PCs, Without the need for
a calibration process, Which can extract data from the con
specialiZed hardWare. The compression or coding method
may be softWare selectable for each recording and streaming
or a camera module connected to a Network or to a hardWare
nected camera module(s) 28 including, Without limitation,
decode hierarchical resolution data, inherent in the Wavelet
settings of the camera module 28 and/ or image sensor 30. The
function. The softWare 38 can also capture and record imag
ery as uncompressed data at various bit depths.
[0058] The softWare 38 running on the processor 36 can
softWare 38 adjusts various settings including, Without limi
tation, resolution, frame rate, exposure, gains and stereo sync
pre-combine the imagery into a single larger image or inter
source master or slave. The software 38 can program a camera
module 28 or an image sensor unit 30 as a master that uses an
leaved sequence for coding as a single stream. The metadata
contained in the stream can be used to indicate the left and
internal sync and outputs the sync signals or as slave that
receives a sync signal from external sources (e.g., another
right image source and alloW playback and editing of the
pixel-by-pixel black level, gains, shading, and defect pixels.
[0054]
The softWare 38 running on the processor 36 adjusts
code stereo streams as independent full-frame streams or can
stereo ?le as a single video source, yet displaying either
camera module 28 or image sensor unit 30 acting as a master).
source individually or as a mixed representation for stereo
The softWare 38 can be used to set operation of the image
sensor unit 30 in a continuous or skip frame output mode and
to instruct the frame grabber 32 to capture the intermittent or
graphic display.
alternating frames, With speci?c timing relative to the top of
frame readout.
[0055] The softWare 38 running on the processor 36 can be
used for controlling camera and optical positioning devices
[0059] The softWare 38, either in a mobile stereo recorder
of the type described beloW or on a separate playback system,
can retrieve and render the recorded imagery into a sequence
of raW images along With metadata as industry standard Digi
tal Negative (DNG) ?les or as fully processed RGB orYUV
images based on metadata stored Within the image and data
on a 3D stereo rig for stereo effect adjustment such as focus,
stream or in an associated container ?le, in formats such as
iris, inter-ocular distance and convergence. The softWare 38
running on the processor 36 may also control a positioning
system on Which the 3D rig is mounted. The softWare 38
running on the processor 36 can capture metadata such as rig
DPX and TIF. The softWare 38 may alloW modi?cation to the
associated metadata streams effects on the retrieved images.
The method for debayer algorithm can be selectable or
accomplished thru replacement of softWare modules.
motor position data, timecode, lens and optics settings and
[0060]
camera settings.
[0056] The softWare 38 running on the processor 36 can
generate processed display imagery from live or playback
perform image and stereo processing functions in any com
bination of general purpose processor or dedicated hardWare,
cessed image data and metadata can be sent to a host (i.e., a
RISC arrays or DSP’s. The image and stereo functions may
include, Without limitation, image correction, interpolation,
White balance, color correction, color transformation includ
ing the use of three dimensional look-up tables, motion detec
tion, object detection and classi?cation, tracking, triangula
tion, calibration, color keying, image mixing, stereographic
processing, indicator overlay, focus detection, exposure
metering, Zooming, scaling, Warping, ?ipping, data packing,
pattern matching and recognition, face recognition, data rate
analysis, enhancement, stabiliZation and compression. The
image processing functions may be softWare selectable and
may be combined for multiple image processing functions.
[0057]
The softWare 38 running on the processor 36 can
perform compression and can employ a full-frame temporal
The softWare 38 running on the processor 36 can
sources on single or dual outputs. The coloriZed raW or pro
separate computer Which is not the mobile stereo recorder)
for additional display, processing, recording or transmission.
The softWare 38 can ?ip and mirror display imagery to enable
a vieWing system With tWo displays on a 3D beam splitter
vieWer. The display image data can be formatted and scaled
for standard de?nition or high de?nition displays 44. For
bayer image sources, the softWare 38 can be used to select the
demosaic method, based on the available processing capabil
ity. The processed imagery may include generating stereo
graphic displays including dual-streams image mixing, ana
glyph, over-under, side-by-side, sequential sWitching and
other modes, Which may assist in perceiving the potential 3D
stereo.
[0061]
The overlap or mix from the dual image streaming
sources may be adjusted relative to each other to change the
Wavelet transform codec to eliminate “block artifacts” that
stereo effect. The repositioning and adjustment may include
are often present When using DCT compression. The softWare
translation, rotation and Warping. The resulting adjustment
38 may have scalable precision to operate on data from 10-bit
done thru the user interface becomes another metadata
and higher, With optimiZed arithmetic precision based on
source, While alloWing the full-siZe non-adjusted original
source resolution, and scalable resolution to support a variety
of formats including HD, 2K and 4K. The softWare 38 canuse
data to be transmitted or recorded.
constant-quality, variable bitrate (VBR) compression that
[0062] The softWare 38 running on the processor 36 can
operate in a client-server con?guration With remote control
May 6, 2010
US 2010/0111489 A1
over a wired or wireless network. The software 38 can accept
acts as a mobile stereo recorder 46. The mobile stereo
a trigger to initiate synchronized start and stop recording.
recorder 46 executes the software 38 of the camera system 20
Client software can request and receive data from a server on
and can capture, process and record synchronized imagery
the network where the data comes in various forms including,
from at least two image sensor units 30 or camera module
without limitation, single images, stereo image, streaming
source playback using a single user interface.
images, audio, time code, camera settings, server settings,
project settings, color look up tables and other metadata. The
[0069] The recorder 46 includes a battery voltage input
power supply with gigabit Ethernet for image and data com
munication. The streaming data can then be processed by a
client software can send the same or modi?ed versions of the
data back into a camera system which also has the processing,
such as the mobile stereo recorder, and effect changes on the
live or recorded data.
host computer, such as a single or multi-core x86 cpu with a
graphics processing unit (GPU) for display and have inter
faces for removable storage which may include IDE, USB,
[0063] The software 38 running on the processor 36 can
execute a timed motion and recording event sequence, which
Network and SATA.
may include, without limitation, start recording, continuous
adjustment of stereo rig positioning and lens settings such as
programmed slew rates, target positions and pauses, record
speed changes and event and timer triggers.
tional multi-input frame grabber processing system. The
[0064] The software 38 running on the processor 36 may
have a mode for calibration of a display device 44. In the
calibration mode, the software 38 can generate test pattern
outputs for stimulus and the response values can be measured
using a connected optical sensor. A sequence of stimulus and
response measurement values can then be used to create a
modi?cation to the imagery sent to the display 44, such as
using 3D Look-up-tables applied to the raw data or used to
modify the settings on the hardware used to generate the
output.
[0065] The software 38 running on the processor 36 can
combine the stereo imagery in a virtual studio which takes a
video image of live scene shot against a keying color back
ground or stage and composite them against a computer
generated 3D environment to create the illusion that the live
actors are actually inside and interacting within a virtual
world. The software 38 can switch between multiple stereo
sources, have selectable image and audio stream mixing or
delaying, chromakeying, and renderer 3D Graphic real-time.
[0070]
The mobile stereo recorder 46 may also use an addi
frame grabber 32 may use FPGA devices and scalable mas
sively parallel RISC Processor Arrays. The RISC array pro
cessor may use architecture of brics, which contain multiple
compute units, such as Streaming RISC units and streaming
RISC units with DSP extensions, and memory RAM units.
The RAM Units can stream addresses and data over channels.
These channels can be word-wide and run at 10 Gigabits per
second or higher. These processors can execute an operation,
do a loop iteration, input from channels, and output to a
channel every cycle. These brics can connect by abutment
through channels that cross bric-to-bric. The compute units
and ram unit can be arranged so that, in the array of brics,
there are contiguous compute units and contiguous ram units.
The array processor can have a con?gurable interconnect in
hierarchical fashion with several levels of hierarchy.
[0071] The frame grabber processing system may be
capable of capturing stereo image data from multiple high
speed serial digital transmission links, such as 1.5 Gigabit and
3.0 Gigabit HD-SDI, HSDL, Easylink and Cameralink. The
image data may be from image sensor units 30 or camera
modules 28 in raw pixel data, color processed RGB, color
processedYUV in coded or uncoded formats. The image data
The software 38 can provide trackless camera control where
the camera’s motion and switching between shots are accom
may also come from broadcast video or computer sources.
plished by manipulating the 3D virtual set itself rather than by
manipulating the real, physical cameras. Within the V1rtual
4K image data at a minimum of 4096x2180 from an image
Studio, movements like complicated pans, swoops, and tilts
are then possible because the camera is not actually physi
cally movingithe 3D set is. The software 38 can have a
[0072]
The frame grabber 32 can be capable of capturing
sensor unit 30 at a minimum 5 Gigabit/sec or from a 4K
camera module 28 at a minimum of 2.55 Gbit/ sec of raw data.
pre-de?ned track assigned by 3D design or manipulated
[0073] The frame grabber processing system may be
capable of stereo display outputs. Each output may be capable
through an external input device such as a joystick. The soft
ware 38 can make positional adjustments of the stereo rig
or camera modules. The stereographic outputs may include
(i.e., a physical package with two sensor units or camera
modules) synchronized with changes in the virtual set further
enhancing the stereo perception. The combined output of the
of displaying live or processed imagery from the sensor units
stereo visualization video processing and signaling to drive
dual displays or displays requiring mixed streams, included
synchronization data for shutter glasses controls.
stereo sources and virtual set can be displayed for live pro
[0074]
duction and can be encoded for streaming and recording.
programmable code 28 for image and stereo processing func
[0066]
tions. The processing functions can be done by the frame
The software 38 running on the processor 36 can
operate on independent hardware platforms for increase pro
cessing power, where capture, processing, switching and
effects, such as virtual sets, streaming and recording can be
distributed and controlled via network, yet can maintain syn
chronized recording events.
[0067] The software 38 running on the processor 36 records
on a single removable storage medium or to independent
storage devices in a synchronized fashion with common ?le
The mobile stereo recorder system 46 can execute
grabber processor alone or in combination with the host com
puter and graphics processor unit. Playback can be done on
the host or in combination with the frame grabber processing
system, where the imagery can also be output for display. The
processing and control functions of the mobile stereo
recorder 46 may be remotely controlled from another system
via wired or wireless network or other input device, such as
touchscreen keypad with serial communication.
naming conventions, to enable simpler stereo post process
[0075]
ing, editing and playback.
[0068] Various components of the processing sub-system
storage magazine 44, which may contain at least one storage
media unit. The removable storage magazine may use at least
34 of the system 20 can be embodied in a single housing that
one SATA interface. The storage device 44 may include a
The mobile recorder 46 can use a single removable
May 6, 2010
US 2010/0111489 A1
Raid controller, With drive carrier selecting the RAID storage
methods, such as RAID-0 or RAID-1.
[0076]
The mobile recorder 46 can be contained in an ergo
nomic package Which enables it to mount on a stereo camera
stabilizing platform, such as a steadicam and MK-V Revolu
tion System, along With stereo image sensor units 30 or cam
era modules 28, Which mechanically isolates the movement
of the camera rig (i.e., the platform Which has the sensor
unit(s) or camera modules or camera With recording system)
from that of the camera operator, providing a very smooth
shot even When the operator is moving quickly over an uneven
surface.
[0077] In another embodiment of the present invention, as
seen in FIG. 2, a digital camera system 50 includes an optical
assembly 22 to gather light 24 from a desired scene 26. The
approximately 23.976 frames per second (i.e., 24 frames per
second) and video rate is 25-30 frames per second.
[0080] Channel bandWidth betWeen image sensing unit and
frame grabber and betWeen frame grabber and processor is
suf?cient for transmission to enable full resolution motion
picture raW image or data processing. This means the pipe to
move the data from the sensor to the frame grabber must be
fast or Wide enough to carry all the raW pixel data as it is
reading out of the sensor at the ?lm and video rates. For
example, to move 2K image, Which has 3.3 MB per frame at
48 frames per second Would require almost 200 MB/sec
throughput from the sensor to the frame grabber. Often, read
out from the sensor is at 2x the speed than needed to capture
in the frame grabber 32. The frame grabber 32 then only needs
to move 24FPS or 100 MB/sec to the PC for processing. It is
reasonable to have the frame grabber 32 do lossless coding of
aligned With the optical assembly 22 to receive light 24 gath
the data to achieve a 2:1 data reduction, Which Would get the
frame grabber to host link bandWidth at 50 MB (hence a
ered and/ or modi?ed by the optical assembly 22. The modular
minimum of 48 MHZ as Intel 4.
system 50 also includes a modular imaging subsystem 28
imaging subsystem 28 comprises at least one imager sensing
[0081]
unit or imager 30 and at least one frame grabber 32. The
(e.g., a momentary mechanical sWitch) electro-mechanically
modular imaging subsystem and the frame grabber are in a
shared housing 52 and comprise, in part, a camera module 54.
connected to the camera module 54 along With poWer and
The system 50 includes a record start/ stop button 64
The optical assembly 22 includes optics 56 (e.g., a
sync input and output Wiring 66 using a connector (e.g., an
8-Pin Lemo FGG.1B.308.CLAD52 connector). An output
Carl Zeiss Super 16 Ultra Prime lens) connected to the camera
module 54 at an appropriate back focal distance using an
mechanically connected to the iPort GPIO that illuminates
interchangeable lens optical interface mounting assembly 58
When recording is active and un-illuminated When recording
that includes an optical loW pass ?lter (OLPF) 60 (e.g., a P+S
is non-active.
[0078]
signal is a light emitting diode (LED) (not shoWn) electro
Technik PL lens Interchange mount With Sunex Optical LoW
[0082]
pass ?lter). Alternatively, the mount may be a P+S IMS Inter
change Mount With calibrated back focus and sensor co
and sensors 68 (e.g., a C-motion lens control system, a Pre
planarity adjustment mechanism. The interchangeable lens
optical interface mounting assembly 58 is a precise mounting
The system 50 further includes a plurality of motors
ston Motor and control system, etc.) that act as adjustment
mechanisms to adjust the position of the image sensor unit
relative to the optical center of the lens projected image circle
surface and locking mechanism, Which enables ?eld
and/or to adjust the co-planarity of the sensing plate upon
exchange of the lens mount to support the use of a variety of
Which the image sensor pixel circuit board rests relative to the
industry standard lenses such as, PL, Nikon-F, Panavision,
Leica, C and Canon.
optical interface mounting assembly 58. The sensor is
mounted behind the lens and can be adjusted for ?atness,
[0079]
centering and rotation. Any of the adjustment mechanisms
The modular imaging sub-system 28 comprises an
HD/2K or 4K CMOS image sensor unit 30 and a frame
can include an electronic positioning device for remote opera
grabber and controller 32. One example of an image sensor
unit 30 is an Altasens 4562 2K and HD CMOS system-on
tion.
[0083] The system 50 further includes a laptop notebook
computer 70 connected to the camera module 54 by a cable
chip, Xilinx FPGA, Microchip PIC micro controller, Linear
LT series Linear regulators, IDT ICS-307 programmable
clock and Fox 924 temperature controlled crystal oscillator.
Another example of an image sensor unit 30 is an Altasens
8472 Quad HD and 4K format capable sensor. An example of
a frame grabber and controller 32 is a Pleora iPort With FPGA,
RAM Buffer and Gigabit Ethernet connectivity With serial
ports and GPIO for programmable device control and com
munication. An example of a camera module 54 comprising
an integrated imaging sub-system 28 in a housing 52 is an
SI-2K MINI digital camera from Silicon Imaging, Inc. The
SI-2K MINI includes an Altasens 4562 CMOS imaging sen
sor unit and a Pleora frame grabber. The SI-2K MINI is
(not shoWn) (e.g., a CAT-5e Ethernet cable) through a Net
Work 72. The cable is connected to the camera module 54 by
a connector (e.g., a 12-pin LEMO FGG.2B.312.CLAD52
Connector). One example of the notebook computer 70 is a
Dell M90 With a MarvellYukon Gigabit Ethernet Expresscard
for camera connectivity. On-board Wired and Wireless Ether
net ports of the notebook computer provide remote connec
tions streaming, internet connectivity and control.
[0084] A further embodiment of the present invention is
illustrated FIG. 3, Where a digital camera system 80, similar
to the digital camera system 50 described above, includes a
camera module 54, as described above in relation to FIG. 2,
capable of capturing 2K (2048x1152), 1080P HD (l920><
1080), 720P (1280x720) AND 540p (960x540) resolution
that has an additional capability for the frame grabber 32 to
motion pictures. The SI-2K MINI can operate at various ?lm
frame grabber processing section comprises a data link, sync
and video rates including 23.97, 24, 25, 29.97, 30, 50, 59.9
and 60 frames per second. The SI-2K MINI has local RAM
buffer 62 to capture images at higher rate than the channel
capacity and can buffer frames and transmit on a skip frame
basis. Minimum Resolution for 2K is 2048x1080 and mini
and control unit 82 (e.g., an Altera FPGA With Lux Media
Plan HD-1100 raW color processing core, dedicated clock
reference for 74.25 and 74.1758 MHZ, Gennum GS Series
SerialiZers and cable drivers for 1.5 or 3 Gbit/ sec HD-SDI,
IDT Dual Port FIFO for external time base synchronization or
mum ?lm rates Would be 23 .976, except for special time lapse
retiming, SSRAM, Flash andAnalog Devices RAMDACs for
shoot. For 4K shooting, a minimum 4096x1714. A ?lm rate is
SVGA Output) connected to the sensor imaging unit 30 and
process raW video into a live video output. The additional
May 6, 2010
US 2010/0111489 Al
the frame grabber 32. The processing core of the frame grab
c-mount (e. g., a P+S IMS Interchange Mount With calibrated
ber 32 can convert raW pixel data into RGB data using demo
back focus and sensor co-planarity adjustment mechanism)
saic, image correction and color conversion. The processing
and an optic (e.g., a Linos MeVis c-mount lens, a P+S inter
core can also convert betWeen RGB and YUV spaces and
output either 4:4:4 or 412:2 data. The same data link Which
change B4 optical mount and Zeiss Digiprime Lens or the
carries the color processed data from the frame grabber 32 to
the vieWing and recording system can also be used to transmit
[0090] This remote camera head module 102 is commer
cially available as either a SI-2K MICRO-CL or SI-1920HD
the raW data.
CL With 4562 sensor from Silicon Imaging, Inc. An alternate
link con?guration uses an National EasyLink DS32ELX0421
for transmission over CAT5e and Coax and triax. Another
link con?guration uses FPGA With built-in serialiZer logic
and either National or Gennum cable drivers. A further link
con?guration uses a triax cable to transmit image and asso
[0085]
The system 80 further includes an input unit 84
electro-mechanically connected to the camera module 54 and
a plurality of motors and sensors 68 that are also electro
mechanically connected to the camera module 54. The input
unit 84 includes audio sources, motion control and a time
code generator. An Audio codec (not shoWn) With preamps
can receive line or microphone level audio input to mix into
the data stream. The time code generator can be an Ambient
ALL-601. The plurality of motors and sensors 68 are used, in
part, to control the lens (e. g., a C-motion lens control system)
act as adjustment mechanisms to adjust the position of the
image sensor unit 30 relative to the optical center of the lens
mount and/or to adjust the co-planarity of the sensing plate
(i.e., the surface or circuit board Which holds the image sensor
in the correct position) relative to the optical interface mount
ing assembly 58.
[0086] The system 80 also includes a display 86 (e.g., a
Cine-Tal CineMage LCD Monitor, a Lite-Eye LE-450 OLED
vieW?nder or the like) for vieWing image data output from the
camera module 54. Image, metadata and controls touchscreen
like).
ciated data, receive control data via demodulation, and
receive poWer. If poWer is applied locally to the image sensor
unit 30, a coax cable can then operate on coax.
[0091] An Altera FPGA SerialiZes the sensor data output
(i.e., the data from the sensor or from the analog digital
converter Which samples the pixel photosite) and uses at least
one Easylink or cable driver for the data link. One con?gu
ration uses a Cat-5 With up to 3 pairs for transmitting the
serialiZed data to and from the sensor unit and one pair for
poWer. The frame grabber 32 and processing unit (i.e., the
processing unit Which can perform video processing for dis
play) are described above With respect to the SI-2K MINI
With Video processing Where the image sensor unit 30 is
remotely connected. One connection output from the frame
grabber unit 104 drives an HD-SDI display 106 While another
user interface can also be shoWn on this display.
connection output from the frame grabber unit 104 outputs
[0087]
raW pixel data 108, either coded or uncoded.
[0092] The camera head module 102 is connected via a
netWork 72 for remote control and setup. The camera head
The system 80 additionally includes a docking
recorder 88. One commercial available example of a docking
recorder is a SI-2K available from Silicon Imaging, Inc.,
running SiliconDVR softWare, With remote camera module
54 (e.g., an SI-2K MINI) With Live Video output processing.
The docking recorder 88 includes USB ports to connect the
docking recorder 88 With a number of photometric measuring
devices (e.g., a colorimeter Which measures intensity at dif
ferent light Wavelengths) Which can is placed on a display 90
of the docking recorder 88 to create calibration pro?les. The
calibration pro?les serve to get accurate settings independent
of monitor adjustments (e.g., the user may have turned a color
hue knob on the monitor) such that orange Will be orange and
not orange-red. The camera module 54 can be electro-me
chanically docked With the docking recorder 88.
[0088] An additional embodiment of the present invention
is illustrated FIG. 4, Where a digital camera system 100,
similar to the digital camera systems 50 and 80 described
above, includes a camera head module 102 comprising an
optical assembly 22, an image sensor unit 30 and a data link,
sync and control unit 82 that is located remotely from a frame
grabber unit 104 comprising a frame grabber unit 32 and a
sync and control unit 82.
[0089]
The remote camera head module 102 uses an
HD/ 2K or 4K CMOS image sensor unit 30 (e.g., anAltasens
4562 2K and HD CMOS system-on-chip, an Altasens 8472
module 102 can also be connected to a MacbookPro notebook
computer 110 running bootcamp and WindoWsXP. Silicon
Imaging SiliconDVR softWare is used for camera control,
processing, display and recording. Alternatively, the camera
head module 102 can be connected to an SI-2K docking
recorder 88 running SiliconDVR softWare.
[0093] The system 100 further includes a plurality of
motors and sensors 68 that are also electro-mechanically
connected to the camera head module 102. The plurality of
motors and sensors 68 are used, in part, to control the lens
(e.g., a C-motion lens control system) act as adjustment
mechanisms to adjust the position of the image sensor unit 30
relative to the optical center and/or to adjust the co-planarity
of the sensing plate relative to the optical assembly 22.
[0094] An embodiment of the present invention is illus
trated FIG. 5, Where a digital camera system 120, similar to
the digital camera systems 50, 80 and 100 described above,
includes a camera head module 122, similar the camera mod
ules 54 and 102 described above, comprising an optical
assembly 22, at least one pixilated multi-sensor image sensor
unit 124 and a data link, sync and control unit 82 that is
located remotely from a frame grabber unit 104 comprising a
frame grabber unit 32 and a sync and control unit 82.
Quad HD and 4K format capable sensor), a Xilinx CPLD for
[0095]
sync and sensor timing control, a Microchip PIC micro con
troller, Linear LT series Linear regulators, an IDT ICS-307
programmable clock and a Fox 924 temperature controlled
crystal oscillator. The camera head module 102 also includes
a National Channelink LVDS 28:4 DS90 serialization and
National LVDS receiver and driver for serial communication
prises an optic 126 (e.g., a beam splitter) to split light 24 in
multiple directions. For example, one particular con?gura
The remote camera head module 122 further com
tion of the camera head module 122 comprises tWo sensors
128 (each sensor 128 having an array of pixels) and a beam
splitter 126 for stereo 3D or Wide dynamic image capture.
and trigger input and output. The optical assembly 22
Each sensor 128 can comprise either a monochromatic pixel
array or a color-?ltered pixel array. Each of the sensors 128
includes a lens mount comprising a back focus adjustable
can be controlled and moved mechanically or, alternatively,
May 6, 2010
US 2010/0111489 A1
each of the sensors 128 can be electronically WindoWed to
move the readout region of the particular sensor 128. One of
the tWo sensors 128 can be replaced With an optical vieW
docked With a processing, storage and display system 134.
?nder port, allowing simultaneous thru-the-lens focusing and
processing, storage and display system 134 includes a raW
image processing and communication unit 36 (e.g., a com
digital capture. In an alternate con?guration, the camera head
module 122 comprises an RGB prism 126 and three mono
chromatic sensors 128 (as seen in FIG. 5), one for each color
channel. Additional prism ports are added to achieve addi
tional targeted Wavelength or full spectrum illumination
range capture. One connection output from the frame grabber
unit 104 drives an HD-SDI display 106.
[0096] The camera head module 122 is connected for
remote control and setup. The camera head module 122 can
also be connected to a MacbookPro notebook computer 100
running bootcamp and WindoWsXP. Silicon Imaging Sili
conDVR softWare is used for camera control, processing,
display and recording. Alternatively, the camera head module
122 can be connected to an SI-2K docking recorder 88 run
ning SiliconDVR softWare.
[0097]
The system 120 further includes a plurality of
The camera head module 132 is commercially available as the
SI-2K MINI, as described above in other embodiments. The
puter based on an Intel Core 2 Duo T7600 2.33 GHZ multi
core processor With dual channel DDR2 667 MHZ RAM,
Intel PRO/1000 Gigabit Ethernet and Intel GMA950 Inte
grated GPU With dual HD video outputs, such as the MEN
F17 Modular Compact PCI system). The processing, storage
and display system 134 also includes a user input device 42
(e.g., an Interlink Micromodule mouse pointing module) con
nected via USB to the processor 36. The processing, storage
and display system 134 also includes another user input
device 42 in the form of a Soundevice USBPre for audio
input. The processing, storage and display system 134 further
includes a battery 138 (e.g., an Anton Bauer Hytron140
Lithium Ion connected via the Anton Bauer Gold Mount plate
or external 4-pin Neutrik XLR) and a DC battery input poWer
supply 118 (e.g., an Opus Solutions DCA7.150 150W DC
motors and sensors 68 that are also electro-mechanically
DC converter) Which delivers regulated 12VDC poWer for
connected to the camera head module 102. The plurality of
motors and sensors 68 are used, in part, to control the lens
(e.g., a C-motion lens control system) act as adjustment
peripherals and 5VDC to poWer the processing and commu
mechanisms to adjust the position of the image sensor unit
further includes a display device 40 (e.g., a Xenarc 700TSV
LCD Monitor With touchscreen (Where the touch screen also
provides an additional user input via a USB interface) or
124 relative to the optical center and/or to adjust the co
planarity of the sensing plate relative to the optical assembly
nication subsystem 134.
[0100] The processing and communication subsystem 134
described above, connected to the optical assembly 22. The
Kopin Cyberdisplay With DVI interface are display devices.
[0101] The imaging softWare 38 used in the processing and
communication subsystem 134 is the Silicon Imaging Sili
conDVR program running under Microsoft WindoWs XP.
Other softWare tools used by the SiliconDVR for image pro
cessing include Microsoft Direct-X graphics, CineForm
RAW Codec and Iridas Speedgrade 3D look-up-table soft
camera head module 132 is remotely located from a process
Ware.
ing, storage and display system 134. The camera head module
[0102] The processing and communication subsystem 134
additionally includes removable digital storage 44 (e.g., a
CRU DP-25 SATA RAID Frame, KingWin KF-25 mobile
22.
[0098] FIG. 6 illustrates an additional embodiment of the
present invention, Where a digital camera system 130, similar
to the digital camera systems 50, 80, 100 and 120 described
above, includes an optical assembly 22, a camera head mod
ule 132, similar the camera modules 54, 102 and 122
132 can be in the form of an SI-2K MINI, as described in
detail above and shoWn in FIG. 2, Where the image sensing
unit 30 comprises a Kodak 2 k><2 k CCD and the processing,
storage and display system 134 comprises a Dell Precision
Workstation With Intel Core 2 Duo processor, Intel Pro/1000
Gigabit Multiple Port NetWork Interface card, Nvidia Quadro
Graphics GPU, CRU Removable DP-25 SATA Cartridge sys
tem and LCD display With 1920x1200 native resolution. The
camera head module 132 (i.e., SI-2K MINI) is poWered by a
rack With Seagate 2.5" Hard drive, Crucial Flash device, or 32
GB Intel SLC Solid State Drive SSD Device With 150 MB/ sec
Write speed). The system 140 includes a laptop computer 142
(e.g., a Dell Precision M90) and a NetWork recording, editing
and visualization system 144 (e. g., a Dell Precision 390 Work
station) and a Wireless computer 146 (e.g., a motion comput
ing LE1700). The laptop computer 142 and the NetWork
12 VDC supply or Li-ion battery, such as an Anton Bauer
recording, editing and visualiZation system 144 are each con
Dionic 90 and is connected to the processing, storage and
display system 134 using Gigabit Ethernet over CAT5e or
thru Cisco ?ber optic links and routers. The 12-bit raW imag
nected to the removable digital storage 44 of the processing
ery and metadata is transmitted betWeen the camera head
module 132 (i.e., the SI-2K MINI) and the processing, stor
age and display system 134 (i.e., the Dell Precision Worksta
tion) at minimum sustained data rates of 80 MB/ sec for 2K
DCI 2048x1080 capture, control, visualization, recording
and communication. A laptop notebook computer 136 (e.g., a
Dell M90) is connected to the processing, storage and display
system 134 (i.e., the Dell Precision Workstation) via Wired or
Wireless netWork for remote control, vieWing, playback and
metadata management.
[0099] As seen in FIG. 7, another embodiment of the
present invention illustrates a digital camera system 140,
similar to the digital camera systems 50, 80, 100, 120 and 130
described above, shoWing a modular imaging sub-system 28
that comprises, in part, a camera head module 132 that is
and communication subsystem 134 via a Wired or Wireless
netWork 148. The Wireless computer 146 operates over a
802.1 1 Wireless netWork Which is using D-link Wired or Wire
less routers or USB Wireless netWorking devices to connect to
the other components of the system 140.
[0103] An additional embodiment of the present invention
is illustrated FIG. 8, Where a digital camera system 150,
similar to the digital camera systems 50, 80, 100, 120, 130 and
140 described above, comprises an image sensor and frame
grabber camera module 152, similar the camera modules 54,
102, 122 and 132 described above, including a display pro
cessor 154, embedded time-code reader 156 and motion con
trol 158, Where the camera module 152 is docked With the
modular processing, storage and display system 134. The
system 150 includes an optical assembly 22. The camera
modules includes a CMOS image sensor unit 30 (e.g., an
Altasens 4562 HD/2K sensor or 8472 Quad HD and 4K
Was this manual useful for you? yes no
Thank you for your participation!

* Your assessment is very important for improving the work of artificial intelligence, which forms the content of this project

Download PDF

advertisement