Image editing apparatus, image editing method, image editing program, and computer-readable recording medium
US 20090027399A1

(19) United States
(12) Patent Application Publication    (10) Pub. No.: US 2009/0027399 A1
     Sato et al.                        (43) Pub. Date: Jan. 29, 2009

(54) IMAGE EDITING APPARATUS, IMAGE EDITING METHOD, IMAGE EDITING PROGRAM, AND COMPUTER-READABLE RECORDING MEDIUM

(75) Inventors: Takeshi Sato, Tokyo (JP); Kenichiro Yano, Tokyo (JP); Koji Koga, Tokyo (JP); Goro Kobayashi, Tokyo (JP)

     Correspondence Address: YOUNG & THOMPSON, 209 Madison Street, Suite 500, ALEXANDRIA, VA 22314 (US)

(73) Assignee: PIONEER CORPORATION, Tokyo (JP)

(21) Appl. No.:

(22) PCT Filed: Feb. 2, 2006

(86) PCT No.: PCT/JP2006/301757
     § 371 (c)(1), (2), (4) Date: Oct. 17, 2007

(30) Foreign Application Priority Data
     Feb. 3, 2005 (JP) ............................... 2005-028277

Publication Classification

(51) Int. Cl.
     G06T 11/00 (2006.01)
     G06T 1/00 (2006.01)

(52) U.S. Cl. ........................................

(57) ABSTRACT

An image editing device is provided with an input section (103) for receiving input of image data including information relating to date and time; an acquiring section (104) for acquiring information relating to a route and a time by which a mobile object moved; and an associating section (105) for associating the image data with map information based on information relating to the date and the time of the image data received by the input section (103) and the information relating to the route and the time acquired by the acquiring section (104). The image editing device automatically edits the image data in time series or in route order.
[Representative drawing: FIG. 4 hardware block diagram of the image editing apparatus, showing a controller 400, user operation unit (remote controller, touch panel) 401, display unit (monitor) 402, position acquisition unit (GPS, sensor) 403, recording medium (HD, DVD, and others) 404, recording medium decoder (drive) 405, guidance-sound output unit 406, communication unit 407, route searcher 408, route guide unit 409, guidance sound generator 410, speaker 411, image editor 412, image input/output I/F 413, sound reproducer 414, sound output unit 415, capturer (camera) 416, sound collector (microphone) 417, and sound input I/F 418.]
Patent Application Publication — Jan. 29, 2009, Sheet 1 of 10 — US 2009/0027399 A1

[FIG. 1: functional block diagram of the image editing apparatus; the text on this drawing sheet is illegible in the scan.]
Patent Application Publication — Jan. 29, 2009, Sheet 2 of 10 — US 2009/0027399 A1

FIG. 2

START
S201: INPUT IMAGE DATA
S202: ACQUIRE INFORMATION CONCERNING TRAVELING ROUTE AND CLOCK TIME
S203: ASSOCIATE IMAGE DATA WITH MAP INFORMATION
S204: DISPLAY IMAGE DATA
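Per the abstract, the result of this flow is image data edited automatically "in time series or in route order." A minimal sketch of that final arrangement step, assuming an illustrative entry layout not specified in the publication, might look like:

```python
# Sketch of the album-arrangement step: order the associated image data
# either by time stamp or by position along the traveling route. The
# entry fields ("taken_at", "route_index") are illustrative assumptions.
def arrange_album(entries, order="time"):
    if order == "time":
        return sorted(entries, key=lambda e: e["taken_at"])
    # Route order: index of the passage point along the traveling route.
    return sorted(entries, key=lambda e: e["route_index"])

entries = [
    {"file": "b.jpg", "taken_at": "2005-02-03T10:30", "route_index": 1},
    {"file": "a.jpg", "taken_at": "2005-02-03T10:05", "route_index": 2},
]
print([e["file"] for e in arrange_album(entries, "time")])   # ['a.jpg', 'b.jpg']
print([e["file"] for e in arrange_album(entries, "route")])  # ['b.jpg', 'a.jpg']
```

Note that the same photographs can come out in different orders under the two criteria, which is exactly the distinction the abstract draws.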
Patent Application Publication — Jan. 29, 2009, Sheet 3 of 10 — US 2009/0027399 A1

[FIG. 3: explanatory drawing of the vehicle interior; only the reference numerals 302b, 305, 306, 310b, 311, 312, and 314 are legible in the scan.]
Patent Application Publication — Jan. 29, 2009, Sheet 6 of 10 — US 2009/0027399 A1

[Drawing sheet; the text (apparently a sound-related block diagram) is mirrored and illegible in the scan.]
Patent Application Publication — Jan. 29, 2009, Sheet 7 of 10 — US 2009/0027399 A1

FIG. 7

START
S701: CAPTURE IMAGE IN CAR
S702: COLLECT SOUND IN CAR
S703: DETECT EACH CHARACTERISTIC AMOUNT
S704: ATMOSPHERE IN CAR CHANGED? (if NO, continue monitoring; if YES, proceed)
S705: ACQUIRE IMAGE DATA
S706: ACQUIRE CURRENT POSITION INFORMATION OF VEHICLE
S707: ACQUIRE MAP INFORMATION
S708: ACQUIRE INFORMATION CONCERNING TRAVELING ROUTE AND CLOCK TIME
S709: ASSOCIATE IMAGE DATA WITH MAP INFORMATION
S710: CREATE ALBUM DATA
S711: ACQUIRE BEHAVIOR INFORMATION
S712: ASSOCIATE SOUND DATA
S713: IS ALBUM DATA COMPLETED? (if NO, continue processing; if YES, END)
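The capture trigger in steps S701 through S704 decides whether the "atmosphere in the car" has changed by comparing characteristic amounts of the captured picture and collected sound. A rough sketch of such a decision follows; the particular features (sound volume, frequency, face count), their normalization, and the threshold are illustrative assumptions, not details given in the publication:

```python
# Sketch of the FIG. 7 trigger (S701-S704): proceed to image acquisition
# when successive characteristic-amount vectors differ beyond a threshold.
# Feature names and the threshold value are illustrative assumptions.
def atmosphere_changed(prev, curr, threshold=0.3):
    """Compare successive characteristic-amount vectors (e.g., sound
    volume, dominant frequency, face count), each normalized to [0, 1]."""
    keys = prev.keys() & curr.keys()
    if not keys:
        return False
    # Mean absolute change across the shared characteristic amounts.
    delta = sum(abs(curr[k] - prev[k]) for k in keys) / len(keys)
    return delta >= threshold

quiet = {"volume": 0.2, "frequency": 0.3, "faces": 0.5}
lively = {"volume": 0.8, "frequency": 0.6, "faces": 0.5}
print(atmosphere_changed(quiet, quiet))   # no change -> False
print(atmosphere_changed(quiet, lively))  # large change -> True
```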
Patent Application Publication — Jan. 29, 2009, Sheet 8 of 10 — US 2009/0027399 A1

FIG. 8

START
S801: ACQUIRE INFORMATION CONCERNING REPRODUCTION HISTORY OF SONG
S802: MAKE REFERENCE TO TIME STAMP DATA OF IMAGE DATA IN ALBUM DATA
S803: SELECT SOUND DATA OF SONG HAVING INFORMATION CONCERNING REPRODUCTION HISTORY REPRODUCED AT CLOCK TIME CLOSEST TO TIME STAMP DATA
S804: ASSOCIATE SELECTED SOUND DATA WITH ALBUM DATA
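The selection rule of step S803 — pick the song whose reproduction clock time lies closest to the image's time stamp — can be sketched as follows; the record layout (`title`, `played_at`) is an illustrative assumption:

```python
from datetime import datetime

# Sketch of FIG. 8 (S801-S804): choose the song from the reproduction
# history whose playback time is closest to an image's time stamp data.
# The history record layout is an illustrative assumption.
def select_song(history, image_timestamp):
    return min(history, key=lambda rec: abs(rec["played_at"] - image_timestamp))

history = [
    {"title": "song A", "played_at": datetime(2005, 2, 3, 10, 5)},
    {"title": "song B", "played_at": datetime(2005, 2, 3, 11, 40)},
]
photo_time = datetime(2005, 2, 3, 11, 30)  # time stamp data of the image
print(select_song(history, photo_time)["title"])  # song B (closest in time)
```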
Patent Application Publication — Jan. 29, 2009, Sheet 9 of 10 — US 2009/0027399 A1

FIG. 9

START
S901: MAKE REFERENCE TO CHARACTERISTIC AMOUNT OF COLOR TONE DATA OF PICTURE IMAGE
S902: SELECT SOUND DATA CORRESPONDING TO CHARACTERISTIC AMOUNT OF COLOR TONE DATA
S903: ASSOCIATE SELECTED SOUND DATA WITH ALBUM DATA

[Map drawing: a traveling route from a start point ST to an end point E, with photograph acquisition points A, B, C, and D marked along the route.]
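The FIG. 9 flow keys sound selection to a characteristic amount of the picture's color tone rather than to playback history. One possible reading, with an assumed color-tone feature (average brightness and warmth) and an assumed mood mapping, neither of which is specified in the publication:

```python
# Sketch of FIG. 9 (S901-S903): derive a color-tone characteristic amount
# from the picture image, then pick sound data that corresponds to it.
# The brightness/warmth feature and mood mapping are illustrative assumptions.
def color_tone_feature(pixels):
    """Average RGB over the picture image (pixels: list of (r, g, b))."""
    n = len(pixels)
    r = sum(p[0] for p in pixels) / n
    g = sum(p[1] for p in pixels) / n
    b = sum(p[2] for p in pixels) / n
    return {"brightness": (r + g + b) / 3, "warmth": r - b}

def select_sound(feature):
    if feature["brightness"] > 128 and feature["warmth"] > 0:
        return "upbeat track"
    if feature["brightness"] > 128:
        return "calm track"
    return "mellow track"

sunset = [(240, 160, 80), (220, 140, 60)]        # warm, bright picture
print(select_sound(color_tone_feature(sunset)))  # upbeat track
```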
IMAGE EDITING APPARATUS, IMAGE EDITING METHOD, IMAGE EDITING PROGRAM, AND COMPUTER-READABLE RECORDING MEDIUM

TECHNICAL FIELD

[0001] The present invention relates to an image editing apparatus, an image editing method, an image editing program, and a computer-readable recording medium that edit image data such as a photograph. However, use of the present invention is not restricted to the image editing apparatus, the image editing method, the image editing program, and the computer-readable recording medium.

BACKGROUND ART

[0002] In recent years, with the spread of the digital still camera (DSC), the digital video camera (DVC), and others, an electronic-album creating apparatus has been provided that creates a so-called electronic album using image data, e.g., a captured still image or moving image, so that the created album can be readily released on a web page and elsewhere. Such an electronic-album creating apparatus creates an electronic album as follows.

[0003] Specifically, program software that edits digital image data to create an electronic album is provided in a server connected to, e.g., the Internet. The server can receive image data captured by a digital camera, capturing time data of the image, position data acquired by a mobile terminal, and time data when the position data is acquired, and these pieces of received data are associated with each other to create an electronic album by using the program software (refer to, for example, Patent Document 1).

[0004] Patent Document 1: Japanese Patent Application Laid-open Publication No. 2002-183742

DISCLOSURE OF INVENTION

Problem to be Solved by the Invention

[0005] However, in the electronic-album creating apparatus disclosed in Patent Document 1, since the program software in the server associates, for example, a capturing time and a capturing place of image data with each other to create the electronic album, a connection environment with respect to the server must be established and the entire apparatus structure becomes complicated, as an example of a problem.

Means for Solving Problem

[0006] An image editing apparatus according to the invention of claim 1 includes an input unit that receives an input of image data including information on a date and a time; an acquiring unit that acquires information on a route and a time at which a mobile object has passed a point on the route; and an associating unit that associates the image data with map information based on the information on the date and the time in the image data received by the input unit and the information on the route and the time acquired by the acquiring unit.

[0007] Moreover, an image editing method according to the invention of claim 10 includes an input step of receiving an input of image data including information on a date and a time; an acquiring step of acquiring information on a route and a time at which a mobile object has passed a point on the route; and an associating step of associating the image data with map information based on the information on the date and the time in the image data received by the input unit and the information on the route and the time acquired by the acquiring unit.

[0008] Moreover, an image editing program according to the invention of claim 11 causes a computer to execute the image editing method according to claim 10.

[0009] Moreover, a computer-readable recording medium according to the invention of claim 12 stores therein the image editing program according to claim 11.

BRIEF DESCRIPTION OF DRAWINGS

[0010] FIG. 1 is a block diagram of an example of a functional structure of an image editing apparatus according to an embodiment;

[0011] FIG. 2 is a flowchart of an example of an image editing processing performed by the image editing apparatus according to the embodiment;

[0012] FIG. 3 is an explanatory drawing of an example of the inside of a vehicle having the image editing apparatus according to an example mounted therein;

[0013] FIG. 4 is a block diagram of an example of a hardware structure of the image editing apparatus according to the example;

[0014] FIG. 5 is a block diagram of an example of an internal structure of an image editor in the image editing apparatus according to the example;

[0015] FIG. 6 is a block diagram of an example of an internal structure of a sound reproducer in the image editing apparatus according to the example;

[0016] FIG. 7 is a flowchart of an example of an image editing processing performed by the image editing apparatus according to the example;

[0017] FIG. 8 is a flowchart of an example of still another association processing for audio data in the image editing processing by the image editing apparatus according to the example;

[0018] FIG. 9 is a flowchart of an example of still another association processing for audio data in the image editing processing by the image editing apparatus according to the example;

[0019] FIG. 10 is an explanatory drawing of an example of a distribution processing for image data in the image editing processing by the image editing apparatus according to the example; and

[0020] FIG. 11 is an explanatory view of a specific processing example of the image editing processing by the image editing apparatus according to the example.

EXPLANATIONS OF LETTERS OR NUMERALS

[0021] 101 capturer
[0022] 102 sound collector
[0023] 103 input unit
[0024] 104 acquisition unit
[0025] 105 association unit
[0026] 106 display unit
[0027] 107 detector
[0028] 108 controller
[0029] 109, 414 sound reproducer
[0030] 310 image editing apparatus
[0031] 412 image editor
[0032] 510 image editing processor
[0033] 610 sound reproduction processor
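The associating unit of claim 1 matches the date-and-time information carried in the image data against the acquired route-and-time information so each image can be placed on the map. A minimal sketch of that matching, assuming an illustrative log format of (time, position) route points that the publication does not prescribe:

```python
from datetime import datetime

# Sketch of the claim-1 associating unit: tie each image's time stamp to
# the route point whose passage time is closest, so the image can be laid
# out against the map information. The log format is an assumption.
def associate(images, route_log):
    """route_log: list of (datetime, (lat, lon)) tuples sorted by time."""
    album = []
    for img in images:
        t, pos = min(route_log, key=lambda entry: abs(entry[0] - img["taken_at"]))
        album.append({"file": img["file"], "position": pos, "passed_at": t})
    return album

route_log = [
    (datetime(2005, 2, 3, 9, 0), (35.6586, 139.7454)),
    (datetime(2005, 2, 3, 9, 30), (35.7101, 139.8107)),
]
images = [{"file": "IMG_0001.jpg", "taken_at": datetime(2005, 2, 3, 9, 28)}]
print(associate(images, route_log)[0]["position"])  # (35.7101, 139.8107)
```

Because the matching needs only the local route log and time stamps, it runs entirely on the apparatus, which is the point of contrast with the server-based Patent Document 1 arrangement.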
BEST MODE(S) FOR CARRYING OUT THE INVENTION

[0034] Exemplary embodiments of an image editing apparatus, an image editing method, an image editing program, and a computer-readable recording medium storing therein the program according to the present invention will be explained in detail hereinafter with reference to the accompanying drawings.

Embodiment

(Functional Structure of Image Editing Apparatus)

[0035] Contents of an image editing apparatus according to an embodiment of the present invention will be first explained. FIG. 1 is a block diagram of an example of a functional structure of an image editing apparatus according to an embodiment of the present invention. As shown in FIG. 1, the image editing apparatus is mounted in a mobile object, e.g., a vehicle (including a four-wheel vehicle and a two-wheel vehicle), and includes a capturer 101, a sound collector 102, an input unit 103, an acquisition unit 104, an association unit 105, a display unit 106, a detector 107, a controller 108, and a sound reproducer 109.

[0036] The capturer 101 captures an image. The image captured by the capturer 101 includes an image obtained by capturing the inside or the outside of a vehicle. The capturer 101 is integrally or detachably attached to the image editing apparatus. The sound collector 102 collects, for example, a sound inside of the vehicle. The sound collected by the sound collector 102 includes a sound collected from a sound field in the vehicle.

[0037] The input unit 103 accepts input of image data including information concerning a date and a time (e.g., time stamp data). The input unit 103 also accepts input of image data of an image captured by the capturer 101 and audio data of a sound collected by the sound collector 102. The acquisition unit 104 acquires information concerning a route and a clock time of traveling of the vehicle. The acquisition unit 104 also acquires behavior information concerning behaviors of the vehicle. The behavior information is specifically information indicative of a movement or a stopped state of the vehicle, and includes, as the behavior information, e.g., at least one of information concerning a vehicle speed (speed information, acceleration information, angular speed information, and others), tilt angle information, lateral gravity (G) information, and current position information.

[0038] The association unit 105 associates the image data with map information based on the information concerning a date and a time of the image data accepted by the input unit 103, and the information concerning a route and a clock time of the vehicle and the behavior information acquired by the acquisition unit 104. Association carried out by the association unit 105 determines when and where the image data is captured by the capturer 101.

[0039] The display unit 106 displays the image data associated by the association unit 105. The display unit 106 may display the image data arranged in, e.g., a time-series order of capturing the image data or a route order of traveling of the vehicle. The detector 107 detects a characteristic amount of a picture image included in the image data of an image captured by the capturer 101 and a characteristic amount of a sound parameter included in audio data of a sound collected by the sound collector 102.

[0040] Specifically, the characteristic amount of the picture image includes, e.g., a characteristic amount of a facial picture image of a person included in a picture image of the image data. As the characteristic amount of the sound parameter, specifically, there are, e.g., characteristic amounts of a sound volume component (magnitude of a sound volume), a time component (sound production duration time), and a frequency component (magnitude of a frequency).

[0041] The controller 108 controls the capturer 101 based on the characteristic amount of the picture image and the characteristic amount of the sound parameter detected by the detector 107. The controller 108 also controls the capturer 101 to capture an image when the characteristic amount detected by the detector 107 is changed.

[0042] The sound reproducer 109 reproduces audio data. When displaying the image data in the display unit 106, the sound reproducer 109 selects audio data to be reproduced based on, e.g., the characteristic amount detected by the detector 107 and the behavior information acquired by the acquisition unit 104. The sound reproduced by the sound reproducer 109 includes, e.g., musical pieces, sound effects, and others.

(Image-Edition Processing Procedure of Image Editing Apparatus)

[0043] An image-edition processing procedure of the image editing apparatus according to the embodiment of the present invention will be explained. FIG. 2 is a flowchart of an example of an image-edition processing procedure of the image editing apparatus according to the embodiment of the present invention.

[0044] As shown in the flowchart of FIG. 2, first, the input unit 103 (see FIG. 1 hereafter) inputs image data including a picture image of, e.g., a person or a landscape and information concerning a date and a time from one or more capturers 101 (see FIG. 1 hereafter) (step S201). Then, the acquisition unit 104 (see FIG. 1 hereafter) acquires information concerning a route and a time of traveling of a vehicle (step S202).

[0045] The association unit 105 (see FIG. 1 hereafter) associates the image data with map information based on the information concerning the date and the time of the image data input at the step S201 and the information concerning the route and the time acquired at the step S202 (step S203). After associating the image data with the map information in this manner, the display unit 106 (see FIG. 1 hereafter) displays the image data (step S204). With these operations, the image editing processing based on the flowchart ends.

[0046] Although not shown, in the display processing of the image data by the display unit 106 at the step S204, the sound reproducer 109 (see FIG. 1 hereafter) may select audio data to be reproduced based on the characteristic amount of the picture image and the characteristic amount of the sound parameter detected by the detector 107 (see FIG. 1 hereafter) and the behavior information acquired by the acquisition unit 104, thereby reproducing the selected audio data. When the characteristic amount detected by the detector 107 is changed, the controller 108 (see FIG. 1 hereafter) may control the capturer 101 to capture an image.

[0047] As explained above, according to the image editing apparatus based on the embodiment of the present invention, the input image data can be associated with the map information based on the information concerning the date and the time of the image data and the acquired information concerning the route and the clock time without using, e.g., a server. Therefore, the image data obtained during driving of a vehicle can be automatically edited in the time-series order or the traveling route order in association with a passage point or a passage time of the vehicle without complicating the structure of the apparatus, thereby reducing a complicated operation in image editing and a cost.

[0048] An example of the embodiment according to the present invention will be explained in detail. An example where the image editing apparatus according to the embodiment is applied to an in-vehicle navigation apparatus will be explained.

EXAMPLE

(Explanation of Inside of Vehicle Having Image Editing Apparatus Mounted Thereon)

[0049] The inside of a vehicle having the image editing apparatus according to the example of the present invention mounted therein will be first explained. FIG. 3 is an explanatory drawing of an example of the inside of a vehicle having the image editing apparatus according to the example of the present invention mounted therein. As shown in FIG. 3, a monitor 302a as the display unit 106 shown in FIG. 1 and speakers 304 as sound output devices that are the sound reproducer 109 are disposed around, e.g., a driver's seat 311 and a passenger's seat 312. Cameras 305 as the capturer 101 in FIG. 1 and microphones 306 as the sound collector 102 are disposed in a ceiling portion 314 of the vehicle.

[0050] A monitor 302b as the display unit 106 is disposed to the passenger's seat 312 for passengers in a rear seat 313. An image editing apparatus 310 (310a and 310b) includes the monitor 302 (302a and 302b), the speakers 304, the cameras 305, and the microphones 306. It is to be noted that the cameras 305 and the microphones 306 may be individually mounted in the image editing apparatus 310 (310a and 310b). The image editing apparatus 310 (310a and 310b) may have a structure that can be attached to/detached from the vehicle.

(Hardware Structure of Image Editing Apparatus)

[0051] A hardware structure of the image editing apparatus according to the example of the present invention will be explained. FIG. 4 is a block diagram of an example of a hardware structure of the image editing apparatus according to the example of the present invention.

[0052] As shown in FIG. 4, the image editing apparatus 310 is detachably mounted in a vehicle as explained above, and configured to include a controller 400, a user operation unit (remote controller, touch panel) 401, a display unit (monitor) 402, a position acquisition unit (GPS, sensor) 403, a recording medium 404, a recording medium decoder 405, a guidance-sound output unit 406, a communication unit 407, a route searcher 408, a route guide unit 409, a guidance sound generator 410, a speaker 411, an image editor 412, an image input/output I/F 413, a sound reproducer 414, a sound output unit 415, a capturer 416, a sound collector 417, and a sound input I/F 418.

[0053] The controller 400 controls, e.g., the entire image editing apparatus 310, and executes various kinds of arithmetic operations according to a control program to entirely control respective units included in the image editing apparatus 310. The controller 400 can be realized by, e.g., a microcomputer formed of a central processing unit (CPU) that executes predetermined arithmetic processing, a read only memory (ROM) that stores various kinds of control programs, a random access memory (RAM) that functions as a work area for the CPU, and others.

[0054] In a route guidance for a vehicle, the controller 400 calculates where in a map the vehicle is currently traveling based on information concerning a current position of the vehicle acquired by the position acquisition unit 403 (current position information) and map information obtained from the recording medium 404 through the recording medium decoder 405, and outputs a calculation result to the display unit 402. The controller 400 inputs/outputs information concerning the route guidance to/from the route searcher 408, the route guide unit 409, and the guidance sound generator 410 in the route guidance, and outputs resultant information to the display unit 402 and the guidance-sound output unit 406.

[0055] The user operation unit 401 outputs information input through an operation by a user, e.g., characters, numeric values, or various kinds of instructions to the controller 400. As a structure of the user operation unit 401, various kinds of known conformations, e.g., a push-button type switch that detects a physical pushed/non-pushed state, a touch panel, a keyboard, a joystick, and others can be adopted. The user operation unit 401 may utilize, e.g., a microphone that inputs a sound from the outside like a later-explained sound collector 417 to perform an input operation using the sound.

[0056] The user operation unit 401 may be integrally provided to the image editing apparatus 310, or may be operable from a position separated from the image editing apparatus 310 like a remote controller. The user operation unit 401 may be formed as one or more of these various kinds of conformations. A user appropriately performs an input operation according to a conformation of the user operation unit 401 to input information.

[0057] Information input through an input operation of the user operation unit 401 includes, e.g., destination information concerning navigation. Specifically, when the image editing apparatus 310 is provided in, e.g., a vehicle, a position aimed at by a person who is in the vehicle is set. Information input to the user operation unit 401 also includes, e.g., information of a display format of image data in an electronic album input from the later-explained image input/output I/F 413 to the image editor 412 in relation to image editing. Specifically, a display format of an electronic album desired by a person who is in the vehicle is set.

[0058] When adopting, e.g., a touch panel as a conformation of the user operation unit 401, the touch panel is laminated on a display screen side of the display unit 402 and used in the laminated state. In this case, managing a display timing in the display unit 402, an operation timing with respect to the touch panel (user operation unit 401), and a position coordinate enables recognizing input information obtained based on an input operation. When the touch panel laminated on the display unit 402 is adopted as a conformation of the user operation unit 401, many pieces of information can be input without increasing a size of the conformation of the user operation unit 401. As the touch panel, various kinds of known touch panels, e.g., a resistance film type and a pressure-sensitive type, can be adopted.

[0059] The display unit 402 includes, e.g., a cathode ray tube (CRT), a TFT liquid crystal display, an organic EL display, a plasma display, and others. Specifically, the display unit 402 can be formed of, e.g., a picture I/F or a display
device for picture display connected to the picture I/F (not shown). The picture I/F is specifically formed of, e.g., a graphic controller that controls the entire display device, a buffer memory, e.g., a video RAM (VRAM), that temporarily stores image information that can be immediately displayed, a control IC or a graphics processing unit (GPU) that performs display control over the display device based on image information output from the graphic controller, and others. The display unit 402 displays an icon, a cursor, a menu, a window, or various kinds of information such as characters or images. The display unit 402 also displays image data edited by the later-explained image editor 412.

[0060] The position acquisition unit 403 receives electric waves from, e.g., an artificial satellite to acquire current position information (longitude and latitude information) of a vehicle having the image editing apparatus 310 mounted therein. Here, the current position information is information acquired by receiving electric waves from the artificial satellite to obtain geometric information with respect to the artificial satellite, and it can be measured anywhere on the earth. It is to be noted that the position acquisition unit 403 includes a GPS antenna (not shown). Here, the global positioning system (GPS) is a system that receives electric waves from four or more artificial satellites to accurately obtain a position on the earth. The explanation about the GPS will be omitted since it is a known technology. The position acquisition unit 403 can be formed of, e.g., a tuner that demodulates electric waves received from an artificial satellite or an arithmetic circuit that calculates a current position based on the demodulated information.

[0061] It is to be noted that, as the electric wave from an artificial satellite, an L1 electric wave that is a carrier wave of 1.57542 GHz and has a coarse and acquisition (C/A) code and a navigation message thereon is used, for example. As a result, a current position (latitude and longitude) of the vehicle having the image editing apparatus 310 mounted therein is detected. It is to be noted that, when detecting a current position of the vehicle, information collected by various kinds of sensors, e.g., a vehicle speed sensor or a gyro sensor, may be added. The vehicle speed sensor detects a vehicle speed from an output-side shaft of a transmission in the vehicle having the image editing apparatus 310 mounted therein.

[0062] Besides, when detecting a current position of the vehicle, information collected by various kinds of sensors, e.g., an angular speed sensor, a traveling distance sensor, a tilt angle sensor, or a lateral gravity (G) sensor, may be added. The angular speed sensor detects an angular speed when the vehicle rotates, and outputs angular speed information and relative direction information. The traveling distance sensor counts the number of pulses in a pulse signal having a predetermined cycle that is output with rotations of wheels to calculate the number of pulses per rotation of the wheels, and outputs traveling distance information based on the number of pulses per rotation. The tilt angle sensor detects a tilt angle of a road surface, and outputs tilt angle information. The lateral G sensor detects a lateral G that is an outward force that occurs due to a centrifugal force at the time of cornering of the vehicle, and outputs lateral G information. It is to be noted that the current position information of the vehicle acquired by the position acquisition unit 403 or information detected by the vehicle speed sensor, the gyro sensor, the angular speed sensor, the traveling distance sensor, the tilt angle sensor, and the lateral G sensor is output to the controller 400 as behavior information concerning behaviors of the vehicle.

[0063] The recording medium 404 records various kinds of control programs or various kinds of information in a computer-readable state. The recording medium 404 accepts writing of information by the recording medium decoder 405, and records the written information in a non-volatile state. The recording medium 404 can be realized by, e.g., a hard disk (HD). The recording medium 404 is not restricted to the HD, and a medium that can be attached to/detached from the recording medium decoder 405 and has portability, e.g., a digital versatile disk (DVD) or a compact disk (CD), may be used as the recording medium 404 in place of the HD or in addition to the HD. The recording medium 404 is not restricted to the DVD and the CD, and a medium that can be attached to/detached from the recording medium decoder 405 and has portability, e.g., a CD-ROM (CD-R, CD-RW), a magneto-optical disk (MO), or a memory card, can also be utilized.

[0064] It is to be noted that the recording medium 404 stores an image editing program that realizes the present invention, a navigation program, image data, and map information recorded therein. Here, the image data means a value in a two-dimensional array representing a picture image concerning, e.g., a person or a landscape. The map information includes background information representing a feature, e.g., a building, a river, or a ground level, and road shape information representing a shape of a road, and is two-dimensionally or three-dimensionally drawn in a display screen of the display unit 402.

[0065] The background information includes background shape information representing a shape of a background and background type information representing a type of the background. The background shape information includes information representing, e.g., a typical point of a feature, a polyline, a polygon, or a coordinate of the feature. The background type information includes text information indicating, e.g., a name, an address, or a telephone number of a feature, type information representing a type of the feature, e.g., a building or a river, and others.

[0066] The road shape information is information concerning a road network having a plurality of nodes and links. The node is information indicative of an intersection where plural roads cross, e.g., a junction of three streets, a crossroad, or a junction of five streets. The link is information indicative of a road coupling the nodes. Some of the links include a shape complementary point that enables representing a curved road. The road shape information includes traffic condition information. The traffic condition information is information indicative of characteristics of an intersection, a length of each link (distance), a car width, a traveling direction, passage prohibition, a road type, and others.

[0067] The characteristics of the intersection include, e.g., a complicated intersection such as a junction of three streets or a junction of five streets, an intersection where a road bisects at a shallow angle, an intersection near a destination, an entrance/exit or a junction of an expressway, an intersection having a high route deviation ratio, and others. The route deviation ratio can be calculated from a past traveling history. The road types include an expressway, a toll road, a general road, and others.

[0068] It is to be noted that the image data or the map information is recorded in the recording medium 404 in the example, but the present invention is not restricted thereto.
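The node-and-link road shape information described in paragraph [0066] above is essentially a graph. A minimal sketch of such a structure follows; the field names are illustrative assumptions, not the publication's data format:

```python
# Sketch of the node/link road network of paragraph [0066]: nodes are
# intersections, links are roads coupling them, and a link may carry shape
# complementary points (for curves) and traffic condition information.
nodes = {
    "N1": {"kind": "crossroad", "pos": (35.0, 139.0)},
    "N2": {"kind": "junction of three streets", "pos": (35.1, 139.1)},
}
links = [
    {
        "from": "N1", "to": "N2",
        "shape_points": [(35.05, 139.06)],  # shape complementary point
        "traffic": {"length_m": 1200, "road_type": "general road",
                    "passage_prohibition": False},
    },
]

def neighbors(node_id):
    """Nodes reachable over one link (direction ignored for simplicity)."""
    out = set()
    for ln in links:
        if ln["from"] == node_id:
            out.add(ln["to"])
        elif ln["to"] == node_id:
            out.add(ln["from"])
    return out

print(neighbors("N1"))  # {'N2'}
```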
US 2009/0027399 A1
The image data or the map information is not recorded in a
medium provided integrally with the hardware of the image
editing apparatus 310 alone, and the medium may be pro
vided outside the image editing apparatus 310. In this case,
the image editing apparatus 310 acquires the image data
through, e.g., the communication unit 407 via a network. The
image editing apparatus 310 also acquires the map informa
tion through, e.g., the communication unit 407 via the net
work. The image data or map information acquired in this
way may be recorded in, e.g., a RAM in the controller 400.
[0069] The recording medium decoder 405 controls reading/writing information from/to the recording medium 404. For example, when an HD is used as the recording medium 404, the recording medium decoder 405 serves as a hard disk drive (HDD). Likewise, when a DVD or a CD (including a CD-R or a CD-RW) is used as the recording medium 404, the recording medium decoder 405 serves as a DVD drive or a CD drive. When utilizing a CD-ROM (CD-R, CD-RW), an MO, or a memory card as the writable and detachable recording medium 404, a dedicated drive device that can write information into various kinds of recording mediums or read information stored in various kinds of recording mediums may be appropriately used as the recording medium decoder 405.

[0070] The guidance-sound output unit 406 controls output to the connected speaker 411 to reproduce a guidance sound for navigation. One or more speakers 411 may be provided. Specifically, the guidance-sound output unit 406 can be realized by a sound I/F (not shown) connected to the sound output speaker 411. More specifically, the sound I/F can be formed of, e.g., a D/A converter that performs D/A conversion of digital audio data, an amplifier that amplifies an analog sound signal output from the D/A converter, and an A/D converter that performs A/D conversion of an analog sound signal.

[0071] The communication unit 407 carries out communication with another image editing apparatus. The communication unit 407 in the example may be a communication module that performs communication with a communication server (not shown) through a base station (not shown) like a mobile phone, or may be a communication module that directly carries out wireless communication with another image editing apparatus. Here, wireless communication means communication that is performed by using electric waves or infrared rays/ultrasonic waves without utilizing a wire line serving as a communication medium. As standards that enable wireless communication, there are various kinds of technologies, e.g., wireless LAN, infrared data association (IrDA), home radio frequency (HomeRF), Bluetooth, and others, but various kinds of known wireless communication technologies can be utilized in the example. It is to be noted that the wireless LAN can be utilized as a preferable example from the aspect of an information transfer rate and others.

[0072] Here, the communication unit 407 may periodically (or occasionally) receive road traffic information of, e.g., a traffic jam or a traffic regulation. The communication unit 407 may receive the road traffic information at timing of distribution of the road traffic information from a vehicle information and communication system (VICS) center or may receive it by periodically requesting the VICS center for the road traffic information. The communication unit 407 can be realized as, e.g., an AM/FM tuner, a TV tuner, a VICS/beacon receiver, or any other communication device.

[0073] It is to be noted that the "VICS" means an information communication system that transmits the road traffic information of, e.g., a traffic jam or a traffic regulation edited and processed in the VICS center in real time and displays the information in the form of characters/figures in an in-vehicle device, e.g., a car navigation apparatus, although its detailed explanation will be omitted since it is a known technology. As a method of transmitting the road traffic information (VICS information) edited and processed in the VICS center to the navigation device, there is a method of utilizing a "beacon" and "FM multiple broadcasting" installed in each road. The beacon includes an "electric wave beacon" mainly used in expressways and an "optical beacon" used in primary general roads. When the "FM multiple broadcasting" is utilized, road traffic information in a wide area can be received. When the "beacon" is utilized, the road traffic information required at a position where a driver's own car is placed, e.g., detailed information of an immediately adjacent road based on a position of the driver's own car (vehicle), can be received. When a communication method with respect to another image editing apparatus is different from a communication method of receiving image data or road traffic information, the communication unit 407 may include plural communicating units associated with the respective communication methods.

[0074] The route searcher 408 calculates an optimum route from a current position to a destination based on current position information of the vehicle acquired by the position acquisition unit 403 and information of the destination input by a user. The route guide unit 409 generates real-time route guide information based on information concerning a guide route searched by the route searcher 408, route information received by the communication unit 407, the current position information acquired by the position acquisition unit 403, and the map information obtained from the recording medium 404 through the recording medium decoder 405. The route guide information generated by the route guide unit 409 is output to the display unit 402 via the controller 400.

[0075] The guidance sound generator 410 generates information of a tone and a sound corresponding to a pattern. In other words, the guidance sound generator 410 sets a virtual sound source corresponding to a guide point and generates sound guidance information based on the route guide information generated by the route guide unit 409, and outputs them to the guidance-sound output unit 406 via the controller 400.

[0076] The speaker 411 reproduces (outputs) a guidance sound for navigation output from the guidance-sound output unit 406 or a sound output from the later-explained sound output unit 415. It is to be noted that, for example, a headphone may be provided to the speaker 411 to appropriately change an output conformation of a guidance sound or a sound in such a manner that the whole inside of the vehicle does not serve as a sound field of the guidance sound or the sound.
[0077] The image editor 412 performs image editing processing of image data acquired from the later-explained capturer 416 and the communication unit 407 via the image input/output I/F 413 and image data recorded in the recording medium 404. Specifically, the image editor 412 includes, e.g., a GPU. The image editor 412 creates electronic album (hereinafter, "album") data using image data in response to a control command from the controller 400. Here, the album data means digital data that enables, e.g., image data captured by the capturer 416 formed of a shooting device such as a digital still camera (DSC) or a digital video camera (DVC) to be viewed in a display screen of the display unit 402 like a picture diary or a photographic album or to be browsed/edited by a personal computer and others.
[0078] The image input/output I/F 413 inputs/outputs image data that is input/output to the image editor 412 from the outside. The image input/output I/F 413 outputs, e.g., image data from the recording medium 404 that stores image data captured by the DSC or the DVC, or image data that is stored in the DSC or the DVC and input from the communication unit 407 through communication based on, e.g., universal serial bus (USB), institute of electrical and electronic engineers 1394 (IEEE1394), infrared radiation, and others, to the image editor 412, and outputs image data output from the image editor 412 to the recording medium 404 or the communication unit 407. When inputting/outputting image data with respect to the recording medium 404, the image input/output I/F 413 may have a function of a controller that controls reading/writing of the recording medium 404. When inputting/outputting image data with respect to the communication unit 407, the image input/output I/F 413 may have a function of a communication controller that controls communication in the communication unit 407.

[0079] The sound reproducer 414 selects, e.g., audio data obtained from the recording medium 404 via the recording medium decoder 405, audio data obtained from the communication unit 407 through the controller 400, and others, and reproduces the selected audio data. The sound reproducer 414 reproduces audio data stored in a storage device such as a later-explained sound database (hereinafter, "sound DB") 611 (see FIG. 6). The audio data to be reproduced includes audio data, e.g., musical songs or sound effects. When the image editing apparatus 310 includes an AM/FM tuner or a TV tuner, the sound reproducer 414 may be configured to reproduce a sound from a radio receiver or a television set.

[0080] The sound output unit 415 controls output of a sound that is output from the speaker 411 based on the audio data selected and reproduced by the sound reproducer 414. Specifically, for example, the sound output unit 415 adjusts or equalizes a volume of a sound, and controls an output state of the sound. The sound output unit 415 controls output of a sound based on, e.g., an input operation from the user operation unit 401 or control by the controller 400.

[0081] The capturer 416 includes the camera 305 mounted in the vehicle shown in FIG. 3 or an external capturing device, e.g., the DSC, the DVC, and others, has a photoelectric transducer, e.g., a C-MOS or a CCD, and captures an image inside and outside the vehicle. The capturer 416 is connected to the image editing apparatus 310 with or without a cable, and captures, e.g., an image of a person who is in the vehicle in response to a capturing command from the controller 400. Image data of the image captured by the capturer 416 is output to the image editor 412 via the image input/output I/F 413.

[0082] The sound collector 417 includes, e.g., the in-vehicle microphone 306 shown in FIG. 3, and collects a sound, e.g., a vocalized sound of a person who is in the vehicle, from a sound field inside the vehicle. The sound input I/F 418 converts the sound collected by the sound collector 417 into digital audio data, and outputs it to the controller 400. Specifically, the sound input I/F 418 may include, e.g., an A/D converter that converts input analog audio data into digital audio data. Besides, the sound input I/F 418 may include a filter circuit that performs filter processing with respect to the digital audio data, an amplifying circuit that amplifies the analog audio data, and others.

[0083] Here, the controller 400 judges an atmosphere in the vehicle based on image data that is captured by the capturer 416 and output from the image editor 412 or audio data that is collected by the sound collector 417 and output from the sound input I/F 418. Specifically, the atmosphere in the vehicle is judged by, e.g., detecting a change in a characteristic amount of a facial expression or a voice of a person who is in the vehicle. Therefore, the controller 400 may be configured to have a function of, e.g., a digital signal processor (DSP).
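As a minimal illustrative sketch (the patent does not disclose source code, and all names below are hypothetical), the atmosphere judgment just described amounts to comparing the latest characteristic amounts against the previous ones and reporting a change when either the facial-expression parameter or the dominant voice class differs:

```python
def atmosphere_changed(prev, curr):
    """Judge whether the in-vehicle atmosphere changed between two observations.

    prev/curr: dicts holding characteristic amounts, e.g.
    {"emotion": "smiling", "sound": "laughter"}. A change in either the
    emotion parameter (e.g. "smiling" -> "tearful") or the dominant sound
    class (e.g. "laughter" -> "angry_shout") counts as a changed atmosphere.
    """
    return prev["emotion"] != curr["emotion"] or prev["sound"] != curr["sound"]

# A change in either characteristic amount triggers image capture in the
# flowchart described later (step S704: YES).
changed = atmosphere_changed(
    {"emotion": "smiling", "sound": "laughter"},
    {"emotion": "tearful", "sound": "laughter"},
)
```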
[0084] It is to be noted that the capturer 101 shown in FIG. 1 specifically realizes its function by, e.g., the capturer 416, and the sound collector 102 realizes its function by, e.g., the sound collector 417. Specifically, the input unit 103 shown in FIG. 1 realizes its function by, e.g., the image input/output I/F 413 and the sound input I/F 418, and the acquisition unit 104 realizes its function by, e.g., the position acquisition unit 403.

[0085] Specifically, the association unit 105, the detector 107, and the controller 108 shown in FIG. 1 realize their functions by, e.g., the controller 400 and the image editor 412. Specifically, the display unit 106 shown in FIG. 1 realizes its function by, e.g., the display unit 402, and the sound reproducer 109 realizes its function by, e.g., the sound reproducer 414, the sound output unit 415, and the speaker 411.
[0086] Internal structures of the image editor 412 and the sound reproducer 414 will be explained. FIG. 5 is a block diagram of an example of the internal structure of the image editor in the image editing apparatus according to the example of the present invention. FIG. 6 is a block diagram of an example of the internal structure of the sound reproducer in the image editing apparatus according to the example of the present invention.

[0087] As shown in FIG. 5, the image editor 412 includes an image editing processor 510, a display controller 511, an image recognizer 512, an image storage unit 513, a person recognizer 514, and a person database (hereinafter, "person DB") 515. The image editing processor 510 performs image editing processing with respect to image data that is input to the image editor 412 from the capturer 416 (see FIG. 4 hereafter) or the outside through the image input/output I/F 413, or image data that is input to the image editor 412 from the recording medium 404 (see FIG. 4 hereafter) through the recording medium decoder 405 (see FIG. 4 hereafter) and the controller 400 (see FIG. 4 hereafter). The image editing processor 510 reads image data stored in the later-explained image storage unit 513 to carry out image editing processing. Contents of the image editing processing include, e.g., editing image data into album data.

[0088] The display controller 511 executes control for displaying image data output from the image editing processor 510 in the form of an album in a display screen of the display unit 402. The image recognizer 512 recognizes a type of a picture image included in image data input to the image editing processor 510 based on the image data. The image storage unit 513 stores image data input to the image editing processor 510.
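One hypothetical data layout for a single album entry, combining the pieces the image editor 412 works with (captured image, its time stamp, and the map point and audio track associated later), could look as follows. This is an illustrative sketch only; the field names are invented and not part of the disclosed apparatus.

```python
from dataclasses import dataclass

@dataclass
class AlbumEntry:
    """One page element of the album data created by the image editor."""
    image_path: str        # captured picture (from the capturer)
    timestamp: str         # time stamp data embedded in the image data
    place_name: str = ""   # filled in once associated with map information
    audio_track: str = ""  # filled in once audio data is associated

# The editor first stores the raw capture, then fills in associations.
album = [AlbumEntry("photo_a.jpg", "08:14")]
album[0].place_name = "near Yorii station"
```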
[0089] When a picture image in the image data input to the image editing processor 510 includes a picture image concerning a person, the person recognizer 514 reads a picture image concerning a person that is previously stored in the person DB 515 and recognizes a person represented by the picture image. Specifically, the recognition processing is carried out by, e.g., facial authentication based on a facial picture image of a person. Since the facial authentication is a known technology, an explanation thereof will be omitted here. The person DB 515 stores image data including picture images of persons who are in the vehicle, individual identification data, e.g., ages or genders of these persons, and others.

[0090] It is to be noted that the image editor 412 detects a characteristic amount of a picture image in the image data recognized by the image recognizer 512 or a picture image concerning the person recognized by the person recognizer 514, and outputs the detected amount to the controller 400. The characteristic amounts of these picture images are detected from, e.g., color tone data of a picture image or an emotion parameter of a facial picture image of a person. Specifically, the color tone data indicates a hue such as red, blue, or green that is closest to the entire picture image, and the emotion parameter indicates a facial expression such as delight, anger, sorrow, or pleasure that is closest to the facial image of the person.

[0091] On the other hand, as shown in FIG. 6, the sound reproducer 414 includes a sound reproduction processor 610, a sound database (hereinafter, "sound DB") 611, and a music selection history database (hereinafter, "music-selection history DB") 612. The sound reproduction processor 610 selects/reproduces audio data input to the sound reproducer 414 or audio data stored in the sound DB 611. The sound reproduction processor 610 selects/reproduces audio data in association with, e.g., image data in album data created by the image editor 412 (see FIG. 4 hereafter). Association of the audio data in the example may be carried out based on a characteristic amount of time stamp data included in the image data in the album data, color tone data of a picture image, a facial picture image of a person, and others.

[0092] The sound DB 611 stores audio data reproduced by the sound reproducer 414. The audio data stored in the sound DB 611 may be audio data that is input to the sound reproducer 414 from the recording medium 404 (see FIG. 4 hereafter) or the communication unit 407 (see FIG. 4 hereafter), or audio data previously provided in the image editing apparatus 310. When audio data reproduced by the sound reproducer 414 is song data, the music-selection history DB 612 stores information concerning a reproduction history or a music selection history of the song. For example, when the image editing apparatus 310 is mounted on the vehicle, the music-selection history DB 612 stores information concerning a reproduction history or a music selection history of a song reproduced during driving.

(Image Editing Processing Procedure of Image Editing Apparatus)

[0093] An image-editing processing procedure of the image editing apparatus according to the example of the present invention will be explained. FIG. 7 is a flowchart of an example of the image-editing processing procedure of the image editing apparatus according to the example of the present invention. As shown in FIG. 7, first, the capturer 416 (see FIG. 4 hereafter) provided in the car captures an image of the inside of the car (step S701), and the sound collector 417 (see FIG. 4 hereafter) provided in the car collects a sound in the car generated from a person who is in the car (hereinafter, "passenger") (step S702).

[0094] Image data of the image captured by the capturer 416 is input to the image editor 412 (see FIG. 4 hereafter), audio data of the sound collected by the sound collector 417 is input to the controller 400 (see FIG. 4 hereafter) through the sound input I/F 418 (see FIG. 4 hereafter), and the image editor 412 and the controller 400 detect a characteristic amount of a picture image in the image data and a characteristic amount of a sound parameter in the audio data, respectively (step S703). At the step S703, information concerning the characteristic amount of the picture image detected by the image editor 412 is output to the controller 400.

[0095] After detecting the characteristic amount of the picture image and the characteristic amount of the sound parameter, the controller 400 judges whether the atmosphere in the car is changed based on the detected characteristic amounts (step S704). The judgment on whether the atmosphere in the car is changed is carried out by judging, e.g., a change in the detected characteristic amount of the picture image from an emotion parameter indicating a "smiling face" to an emotion parameter indicating a "tearful face" or a change in the characteristic amount of the sound parameter from a frequency component indicating a "laughter" to a frequency component indicating an "angry shout".

[0096] When the controller 400 determines that the atmosphere in the car is not changed at the step S704 (step S704: NO), the control returns to the step S701 to repeat the processing from the step S701 to the step S704. When it is determined that the atmosphere in the car is changed at the step S704 (step S704: YES), the image editor 412 acquires image data including time stamp data captured by the capturer 416 through the image input/output I/F 413 (step S705).

[0097] Besides obtaining the image data at the step S705, the controller 400 acquires current position information of the vehicle from the position acquisition unit 403 (see FIG. 4 hereafter) (step S706), obtains map information from the recording medium 404 (see FIG. 4 hereafter) via the recording medium decoder 405 (see FIG. 4 hereafter) (step S707), and further acquires information concerning a route and a clock time of traveling of the vehicle (step S708).

[0098] After acquiring the information concerning the traveling route and the clock time at the step S708, the controller 400 collates the time stamp data in the image data acquired by the image editor 412 with the information of the traveling route and the clock time to detect a point in the map where the vehicle has passed at the clock time indicated in the time stamp data of the image, thereby associating the image data with the map information (step S709).

[0099] After associating the image data with the map data, the image editor 412 uses the image data to create album data (step S710). After creating the album data in this manner, the position acquisition unit 403 and others acquire behavior information on behaviors of the vehicle, e.g., information concerning a speed of the vehicle or tilt angle information (step S711).

[0100] The behavior information acquired in this manner is output to the sound reproducer 414 (see FIG. 6 hereafter) through the controller 400, the sound reproducer 414 acquires the album data from the image editor 412, and the sound reproduction processor 610 (see FIG. 6 hereafter) makes reference to audio data from the sound DB 611 (see FIG. 6 hereafter) or information concerning a music selection history from the music-selection history DB 612 (see FIG. 6 hereafter), thereby associating the audio data with the album data (step S712).

[0101] Here, in regard to association of the audio data, a land form or a road type at the time of capturing the image is judged based on, e.g., the map information or the behavior information associated with the album data, and audio data, e.g., a song matching with the judged land form or road type, is read out from the sound DB 611 to be associated. Besides, reference may be made to a characteristic amount of the picture image and a characteristic amount of the sound parameter to associate audio data matching with these characteristic amounts.
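The collation of step S709, matching a photograph's time stamp against the recorded traveling route to find the point on the map where the vehicle passed, can be sketched as below. This is an illustrative fragment only; the function name, the seconds-since-midnight time encoding, and the sample coordinates are all hypothetical.

```python
def associate_with_map(photo_time, route_samples):
    """Return the route position recorded closest in time to the photo's
    time stamp.

    photo_time: clock time of the photograph (seconds since midnight here,
    for simplicity).
    route_samples: list of (clock_time, (lat, lon)) pairs recorded while
    the vehicle traveled.
    """
    return min(route_samples, key=lambda s: abs(s[0] - photo_time))[1]

# Hypothetical route samples at 08:14 and 08:37.
route = [(29640, (36.11, 139.19)), (31020, (36.10, 139.11))]
point = associate_with_map(29700, route)  # photo taken just after 08:14
```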
[0102] After associating the audio data with the album data at the step S712, the image editor 412 and the controller 400 judge whether the album data is completed (step S713). When it is determined that the album data is yet to be completed (step S713: NO), the control returns to the step S701 to repeat the processing from the step S701 to the step S713. When it is determined that the album data is completed (step S713: YES), the series of image editing processing based on the flowchart ends.

[0103] Another association processing of the audio data with the album data at the step S712 will be briefly explained. FIGS. 8 and 9 are flowcharts of an example of another association processing procedure of the audio data in the image editing processing by the image editing apparatus according to the example of the present invention. It is to be noted that FIG. 8 depicts association processing based on time stamp data in image data and FIG. 9 depicts association processing based on color tone data in a picture image in the image data.

[0104] As shown in FIG. 8, first, the sound reproduction processor 610 (see FIG. 6 hereafter) in the sound reproducer 414 (see FIG. 6 hereafter) acquires, e.g., information concerning a reproduction history of songs reproduced in the image editing apparatus 310 (see FIG. 4 hereafter) from the music-selection history DB 612 (see FIG. 6 hereafter) (step S801). After acquiring the information concerning the reproduction history, the sound reproduction processor 610 makes reference to time stamp data of image data in album data (step S802).

[0105] After making reference to the time stamp data at the step S802, the sound reproduction processor 610 selects audio data of a song having information concerning a reproduction history reproduced at a clock time closest to the referred time stamp data (step S803). After selecting the audio data in this manner, the sound reproduction processor 610 associates the selected audio data with the album data (step S804). At the step S804, the audio data can be associated with the album data to respond to a main part (highlight part) in the selected audio data.

[0106] On the other hand, as shown in FIG. 9, the sound reproduction processor 610 (see FIG. 6 hereafter) makes reference to, e.g., the album data from the image editing processor 510 (see FIG. 5 hereafter), and makes reference to a characteristic amount of color tone data as a characteristic amount of a picture image in entire image data in the album data (step S901). The sound reproduction processor 610 selects audio data corresponding to the referred characteristic amount of the color tone data from the sound DB 611 (see FIG. 6 hereafter) (step S902).

[0107] Selection of the audio data at the step S902 is carried out in such a manner that audio data of a melody with a sad mood is selected when a color tone of a picture image in the entire image data is blue, audio data of a melody with a healing mood is selected when the color tone is green, and audio data of an up-tempo melody is selected when the color tone is red, for example. After selecting the audio data in this manner, the sound reproduction processor 610 associates the selected audio data with the album data (step S903). At the step S903, the audio data may be associated with the album data to respond to a main part (highlight part) in the selected audio data, for example.

[0108] It is to be noted that selection of the audio data at the step S902 may be carried out based on, e.g., an emotion parameter represented by a facial picture image in the image data. In this case, for example, audio data of a melody with an upbeat mood is selected when the facial picture image represents joy, audio data of a melody with a fiery mood is selected when the image represents anger, and audio data of an up-tempo melody is selected when the image represents pleasure.

[0109] A specific example of the image editing processing by the image editing apparatus according to the example of the present invention will be explained. FIGS. 10 and 11 are explanatory drawings of a specific processing example of the image editing processing by the image editing apparatus according to the example of the present invention. As shown in FIG. 10, the image editor 412 (see FIG. 5 hereafter) in the image editing apparatus 310 (see FIG. 4 hereafter) makes reference to current position information of the vehicle that is acquired by the position acquisition unit 403 (see FIG. 4 hereafter) and input through the controller 400 (see FIG. 4 hereafter), and information concerning a route and a clock time and map information input from the recording medium 404 (see FIG. 4 hereafter) through the controller 400, and acquires image data when it determines that an atmosphere in the car is changed in a traveling route from a start point S to an end point E of the vehicle based on respective characteristic amounts of a picture image and a sound parameter in image data and audio data from the capturer 416 (see FIG. 4 hereafter) and the sound collector 417 (see FIG. 4 hereafter).

[0110] In the example depicted in FIG. 10, photograph acquisition points A to D indicate that the atmosphere in the car is determined to be changed and image data is acquired. The image editor 412 associates the image data acquired at the photograph acquisition points A to D with map information based on, e.g., time stamp data of the image data acquired at the photograph acquisition points A to D and current position information of the car acquired at the photograph acquisition points A to D or information concerning a route and a clock time of the vehicle, thereby creating album data. Since audio data such as songs is associated with the album data, music and others can be appropriately automatically reproduced.

[0111] The album data created by the image editor 412 in this manner can be displayed in, e.g., a display screen of the display unit 402 (see FIG. 4 hereafter) like a double-page spread album as depicted in FIG. 11. For example, the respective pieces of image data 1120, 1130, 1140, and 1150 acquired at the photograph acquisition points A to D (see FIG. 10 hereafter) can be displayed in the displayed album data 1100 in the time-series order. In the respective pieces of displayed image data 1120, 1130, 1140, and 1150, picture images 1121, 1131, 1141, and 1151 of in-car photographs A to D at the photograph acquisition points A to D or picture images 1122, 1132, 1142, and 1152 of landscapes A to D outside the vehicle may be displayed.
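The mood-based selection rules described for steps S901 to S903, and the emotion-based variant, can be sketched as the following lookup. This is an illustrative fragment only; the mappings mirror the examples given in the text (blue/sad, green/healing, red/up-tempo; joy/upbeat, anger/fiery, pleasure/up-tempo), while the function and database names are hypothetical.

```python
# Mappings taken from the examples in the text.
COLOR_TO_MOOD = {"blue": "sad", "green": "healing", "red": "up-tempo"}
EMOTION_TO_MOOD = {"joy": "upbeat", "anger": "fiery", "pleasure": "up-tempo"}

def select_track(sound_db, color_tone=None, emotion=None):
    """Pick a track whose mood matches the color tone (preferred) or the
    emotion parameter of the picture image.

    sound_db: hypothetical {mood: [track names]} store standing in for the
    sound DB. Returns None when no track of the matching mood exists.
    """
    mood = COLOR_TO_MOOD.get(color_tone) or EMOTION_TO_MOOD.get(emotion)
    tracks = sound_db.get(mood, [])
    return tracks[0] if tracks else None

db = {"sad": ["rainy_song"], "up-tempo": ["drive_song"]}
track = select_track(db, color_tone="blue")  # a blue-toned image -> sad mood
```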
[0112] In the respective pieces of displayed image data
1120, 1130, 1140 and 1150, text information shoWing a geo
graphic name obtained from, e.g., the associated map infor
mation or a clock time When the image has been captured or
When the vehicle has passed may be displayed. As shoWn in
FIG. 11, for example, the image data 1120 has been captured
US 2009/0027399 A1
at a clock time “AM 8: 14”, and represents that the photograph
acquisition point A Where the image has been captured is
“near Yorii station”. Likewise, the image data 1130 has been
captured at a clock time “AM 8:37” and represents that the
photograph acquisition point B Where the image has been
captured is “near Nagatoro”. The image data 1140 has been
Jan. 29, 2009
an associating unit that associates the image data With map
information based on the ?rst information and the sec
ond information; and
a display that displays the associated image data.
14. The image editing apparatus according to claim 13,
further comprising:
photograph acquisition point C Where the image has been
a capturing unit that captures an image;
a sound collecting unit that collects a sound;
captured is “near Chichibu station”. The image data 1150 has
a detecting unit that detects a characteristic amount of the
captured at a clock time “PM 1:20” and represents that the
been captured at a clock time “PM 2:50” and represents that
the photograph acquisition point D Where the image has been
captured is “near Shoumaru Touge”. It is to be noted that the
respective pieces ofimage data 1120, 1130, 1140, and 1150
image and the sound; and
a control unit that controls the capturing unit based on the
characteristic amount.
15. The image detecting apparatus according to claim 14,
are displayed in the time-series order, but they may be dis
played in the vehicle traveling route order, for example.

[0113] As explained above, the image editing apparatus according to the example can associate image data with map information, based on time stamp data of image data captured when an atmosphere in a car is changed and on information concerning a route and a clock time of traveling of the vehicle, without using a server and others, thereby creating album data. Therefore, the image data can be automatically edited in the time-series order or the route order to create album data, thereby reducing a trouble for image editing.

[0114] The image editing apparatus according to the example can also associate image data in album data with audio data without using a server and others, thus improving entertainment properties and reducing a trouble or a cost for the image editing processing.

[0115] As explained above, according to the image editing apparatus, the image editing method, the image editing program, and the computer-readable recording medium of the present invention, an effect of appropriately automatically capturing an image and appropriately automatically associating the acquired image data with map information or audio data to create an electronic album can be demonstrated.

[0116] It is to be noted that the image editing method explained in the embodiment can be realized by executing a prepared program by using a computer, e.g., a personal computer or a workstation. The program is recorded in a computer-readable recording medium, e.g., a hard disk, a flexible disk, a CD-ROM, an MO, or a DVD, and executed when read out from the recording medium by a computer. The program may be a transmission medium that can be distributed through a network, e.g., the Internet.

1-12. (canceled)

13. An image editing apparatus comprising:
a receiving unit that receives image data including first information on a time at which an image of the image data is obtained;
an acquiring unit that acquires second information on a route and a time at which a mobile object passes a point on the route;
an associating unit that associates the image data with map information based on the first information and the second information; and
a display unit that displays the associated image data.

15. The image editing apparatus according to claim 14, wherein the detecting unit detects the characteristic amount based on a facial image of a person in the image.

16. The image editing apparatus according to claim 14, further comprising a reproducing unit that reproduces audio data, wherein
the acquiring unit acquires behavior information on behaviors of the mobile object, and
the reproducing unit selects audio data to be reproduced based on at least one of the characteristic amount and the behavior information when the display unit displays the associated image data.

17. The image editing apparatus according to claim 13, wherein the display displays text information included in the map information along with the associated image data.

18. The image editing apparatus according to claim 15, wherein the detecting unit detects the characteristic amount based on at least one of an emotion parameter of the facial image, a sound volume component of the sound, a time component of the sound, and a frequency component of the sound.

19. The image editing apparatus according to claim 14, wherein the control unit controls the capturing unit to capture an image when the characteristic amount varies.

20. The image editing apparatus according to claim 16, wherein the behavior information includes at least one of traveling speed information of the mobile object, tilt angle information, lateral gravity information, and current position information.

21. An image editing method comprising:
receiving image data including first information on a time at which an image of the image data is obtained;
acquiring second information on a route and a time at which a mobile object passes a point on the route;
associating the image data with map information based on the first information and the second information; and
displaying the associated image data.

22. A computer-readable recording medium storing therein an image editing program that causes a computer to execute the image editing method according to claim 21.
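The associating step recited in claim 21 amounts to matching each image's time stamp against the times at which the mobile object passed points on its route. The following is a minimal, explanatory sketch only, not part of the claims or the disclosed embodiment; all names and data structures (`route`, `images`, `associate`) are hypothetical illustrations of one way such time-based matching could be done.

```python
from bisect import bisect_left

def associate(images, route):
    """Attach to each image the route point whose clock time is nearest
    to the image's time stamp, then order the result in time series.

    images: list of (timestamp, image_payload) tuples (first information)
    route:  list of (timestamp, latitude, longitude) tuples, sorted by
            timestamp (second information on route and time)
    """
    times = [t for t, _, _ in route]
    album = []
    for ts, payload in images:
        i = bisect_left(times, ts)
        # candidate neighbours on either side of the insertion point
        candidates = [j for j in (i - 1, i) if 0 <= j < len(route)]
        j = min(candidates, key=lambda k: abs(times[k] - ts))
        _, lat, lon = route[j]
        album.append({"time": ts, "lat": lat, "lon": lon, "image": payload})
    # album data can then be edited automatically in time-series order
    album.sort(key=lambda e: e["time"])
    return album
```

Ordering the result by route position instead of time would give the route-order editing mentioned in paragraph [0113].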
* * * * *