Technical Report - Center for Robotics and Autonomous Systems
Gerstner Laboratory

Project COLOS - Technical Report 2.1

Camera Module for Onboard Relative Localization in a Swarm of Robots
User Manual

Jan Faigl, Tomáš Krajník, Jan Chudoba, Martin Saska, Libor Přeučil
Version 1.0

The Gerstner Laboratory for Intelligent Decision Making and Control
Department of Cybernetics (K13133)
Faculty of Electrical Engineering, Czech Technical University
166 27 Prague 6, Technická 2, Czech Republic
[email protected]
Contents

1 Hardware Description
  1.1 Overo minicom board
  1.2 Voltage regulator board
  1.3 Boards Connections
  1.4 LED Indicators
  1.5 Disabling Undervoltage Switch
2 Blob Detection
  2.1 Principle of Detection
  2.2 Performance Indicators
    2.2.1 Real Computational Requirements
    2.2.2 Precision of the Localization
    2.2.3 Detectability of the Pattern
3 System Configuration
  3.1 WiFi Configuration
  3.2 Image Capturing
  3.3 Initialization of the Tracker Server
4 Camera Settings
  4.1 Practical Tips for Using the Overo Caspa Camera
    4.1.1 Camera Control Settings
    4.1.2 Auto exposure
5 Applications Settings
  5.1 Tracker Server
  5.2 Tracker Client
  5.3 tcapture - Capturing mode
  5.4 Applications for Remote Access to the Camera
  5.5 Creating Camera Control Settings File
  5.6 Taking Images for the Camera Calibration
6 Future Work
A Specifications
B Camera Module Hardware
C Applications
List of Figures

1  Hardware boards of the camera module.
2  LED indicators of the hardware boards.
3  Additional pins for the power source of the regulator without undervoltage switch.
4  B/W pattern for the circle detector.
5  Measured distances for the image resolutions tested.
6  Examples of captured images.
7  Detectability and localization precision in the x-axis for the pattern with outer diameter 3.5 cm and two sizes of the squared bounding box.
8  Detectability and localization precision in the x-axis for the patterns with outer diameters 7.0 cm and 6.0 cm.
9  Examples of tcam usage.
10 Connectors of the Overo minicom board.
11 Schematic of the voltage regulator and battery undervoltage switch.
12 Circuits of the voltage regulator and battery undervoltage switch.
List of Tables

1  Maximal FPS achieved
2  Minimal FPS achieved
3  Measured distances for the image resolutions tested
4  Measured distances in the y-axis
5  Measured distances in the z-axis
6  Hardware parameters
7  Access parameters
8  Tracker parameters
Listings

1  An example of the ~/.ssh/config file.
2  An example of the /etc/ntp.conf file.
3  An example of the /etc/hosts file with a local database of the host names.
4  An example of the local startup configuration file /etc/local.start.
5  The /opt/wifid script for periodical checking of the WiFi connection.
6  The /opt/check_wifi.sh script for testing availability of the WiFi connection.
7  The /opt/wlan_colos script for configuring the WiFi connection to the colos network.
8  Camera driver initialization /etc/init.d/camera_init.
9  Initialization of user input handling in the /etc/init.d/camera_event script.
10 Initialization of the capturing service in the /etc/init.d/camera_capture script.
11 Activation of the capturing service scripts.
12 Handling of user inputs coming from GPIO pins.
13 Starting of the capturing.
14 Tracker service initialization.
15 Tracker program execution.
16 Directory layout of the tracker program.
17 Printed version of the tracker program.
18 Gumstix Caspa camera control settings accessible by the V4L2 interface provided by the mt9v032 driver.
19 Gumstix Caspa camera control settings accessible by the V4L2 interface provided by the mt9v032 driver.
20 An example of the tracker server configuration file (tracker-480x360.cfg).
21 An example of the camera parameters found by the Matlab toolbox [1].
22 A structure sent by the tracker server (tracked_object.h).
23 A simple client of the tracker server.
24 A configuration for the capturing program.
25 A configuration file for the tcapture program in the server mode.
26 Selected program options of the tcam application.
27 All program options of the tcam application.
1 Hardware Description
The camera module consists of four main electronic boards. The core of the module is the Gumstix Overo board with the OMAP 3503 application processor running at 600 MHz, accompanied by 128 MB RAM and 802.11b/g wireless communication. The second board is the Caspa camera board with the Aptina MT9V032 CMOS sensor. This board is also an off-the-shelf product provided by Gumstix [2]. Two additional boards are custom boards providing power and interfaces to the Gumstix Overo. The boards are called the Overo minicom and voltage regulator boards. The boards are shown in Fig. 1. A summary of the hardware parameters is given in Table 6.
1.1 Overo minicom board
The Gumstix Overo processor board is attached to the developed Overo minicom board, which is an interface board for Overo processor boards. Onboard circuits provide a 3.3 V power supply for the Overo and 1.8 V for a logic-level conversion between 1.8 V (Overo levels) and 5 V (external levels). The Overo minicom board itself is powered by a 5 V supply voltage through the J1 connector. The J19 connector makes the first serial port of the Overo COM (i.e., the /dev/ttyS0 device under Linux based operating systems) accessible at 5 V TTL levels. The J15 connector provides the Overo console serial port, also at 5 V levels. The locations of the connectors are depicted in Fig. 10.
1.2 Voltage regulator board
The voltage regulator board can be divided into two parts, which are also physically separated on the board. The first part is a battery undervoltage switch, which prevents battery damage due to excessive discharge. The second part is a voltage regulator providing 5 V. A scheme of the board is depicted in Fig. 11 and the circuit placement in Fig. 12.

The undervoltage switch disconnects the battery when the battery voltage drops below 8.7 V, which is a safe value for a 3-cell Li-Pol battery with the nominal voltage 11.1 V. At 9.3 V, a warning LED (D3) is lit. The voltage threshold may be adjusted by the resistor divider consisting of resistors R6, R7, and R8. The divider voltage

  U_div = U_bat * (R7 + R8) / (R6 + R7 + R8)

is compared to the 1.2 V reference voltage. The undervoltage protection can be disabled by removing resistor R8.
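As a sanity check of the threshold setting (assuming, as in the reconstruction above, that the divider taps the battery voltage directly and the comparator trips at the 1.2 V reference), the divider ratio implied by the 8.7 V switching threshold is

  (R7 + R8) / (R6 + R7 + R8) = 1.2 V / 8.7 V ≈ 0.138,

i.e., lowering this ratio raises the battery voltage at which the switch disconnects, and raising the ratio lowers that voltage.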
The voltage regulator is based on the LT1766 adjustable switching regulator. The output voltage is set to 5 V by a voltage divider consisting of resistors R1-R4. The maximal load current is 1.5 A. According to the LT1766 datasheet, the expected efficiency is about 90 %.
1.3 Boards Connections

The boards are connected together as follows. The Gumstix Caspa camera board is connected to the Gumstix Overo board using a supplied flex cable. The Gumstix Overo board itself is attached to the Overo minicom board by two 60-pin Hirose connectors. The 5 V voltage provided by the voltage regulator board is connected by two wires. Finally, the battery is connected to the voltage regulator board using a Deans-T connector. All boards connected together are shown in Fig. 1i.
Figure 1: Hardware boards of the camera module. (a) Gumstix Overo - top; (b) Gumstix Overo - bottom; (c) voltage regulator board - top; (d) voltage regulator board - bottom; (e) minicom board - top; (f) minicom board - bottom; (g) minicom connected to the power board; (h) Gumstix Caspa camera; (i) overview of the connected boards; (j) attached console USB converter.
A USB to RS232 converter can be attached to the Overo minicom board by three pins. The pins are labeled on the board. However, if the module is used with the supplied converter, it is necessary to cross the TX and RX pins, see Fig. 1j.
1.4 LED Indicators

The boards are equipped with LEDs indicating their operational modes. The LEDs are shown in Fig. 2.
Figure 2: LED indicators of the hardware boards. (a) voltage regulator board power indicator (switched on); (b) minicom power indicator (switched on); (c) camera usage indicator (capturing/tracking is running); (d) Overo power indicator (switched on); (e) Overo WiFi indicator (connection to a network).
1.5 Disabling Undervoltage Switch
If it is necessary to disable the undervoltage switch protecting the battery, it can be done in several ways. First, it can be done by changing the values of the resistor divider R6, R7, and R8, see Fig. 11 and the resistor locations in Fig. 12.

The second way is bypassing the MOSFET transistor IRF7416 (part Q1 in Fig. 11 and Fig. 12), which is located next to the main switch S1, see Fig. 3.

Finally, the voltage regulator can be powered without the undervoltage switch if the pins of J3 are used for powering instead of J1. The location of the pins is shown in Fig. 3.
Figure 3: Additional pins for the power source of the regulator without undervoltage switch.
2 Blob Detection

2.1 Principle of Detection
The principle of the blob detection is based on image segmentation and finding two discs forming a black and white ring. The idea is based on a comparison of the blob size (in the number of pixels) with the expected size of the used pattern, i.e., disc and ring. An example of the pattern is shown in Fig. 4.

In particular, the algorithm works as follows. First, an image is thresholded to separate black and white pixels. Then, the image is searched for segments. A segment is labeled as a candidate segment if it has a minimal expected size (in number of pixels), its bounding box fits the expected dimensions, and the ratio of its size to the expected size corresponds to the selected values. As two segments form the B/W ring pattern,
Figure 4: B/W pattern for the circle detector (outer diameter do, inner diameter di, bounding box side d).
two segments with appropriate outer and inner parameters have to be found consecutively. Regarding the pattern, these two segments should have the same center point (within a small tolerance). Besides, the ratio of the segments' sizes has to meet a selected tolerance (according to the width of the ring) in order to denote the segments as a found blob.
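The following C++ fragment is a minimal sketch of the consistency checks described above; the Segment structure, the tolerances, and the expected ratio are illustrative assumptions only, not the actual implementation. For the pattern with do = 14 cm and di = 8.4 cm, the expected area ratio of the ring to the inner disc is (do^2 - di^2)/di^2 ≈ 1.78.

#include <cmath>

struct Segment {
    double cx, cy;   // center of the segment (pixel coordinates)
    int w, h;        // bounding box dimensions
    int size;        // number of pixels
};

// Decide whether an outer (black) and inner (white) segment form the ring pattern.
bool formsRing(const Segment &outer, const Segment &inner,
               double centerTol = 1.5,       // maximal center distance [px]
               double ratioTol = 0.25,       // relative tolerance of the size ratio
               double expectedRatio = 1.78)  // (do^2 - di^2) / di^2 for the pattern used
{
    // The two segments should share (almost) the same center point.
    const double dx = outer.cx - inner.cx;
    const double dy = outer.cy - inner.cy;
    if (std::sqrt(dx * dx + dy * dy) > centerTol)
        return false;
    // The inner bounding box must fit inside the outer one.
    if (inner.w >= outer.w || inner.h >= outer.h)
        return false;
    // The ratio of the segments' sizes must match the ring geometry within a tolerance.
    const double ratio = double(outer.size) / double(inner.size);
    return std::fabs(ratio - expectedRatio) <= ratioTol * expectedRatio;
}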
The above finding procedure can be repeated until the whole image is processed, and several blobs can be found. If a single blob is assumed in the scene, the searching process can be terminated once the first blob is detected. Moreover, the searching procedure can be sped up using the previous position (center) of the blob, which represents a simple tracking mechanism. However, if the pattern's position changes significantly between two consecutive images, the whole image is processed.
Although the B/W pattern should be robust to changes of the lighting conditions, the threshold separating the dark and light shades has to be selected according to the current lighting conditions. The threshold is determined as follows. If the blob is found, the threshold is set as the average of the mean intensity values of the two discs forming the ring. The intensity is a single value in the case of a grayscale image, or the sum of the intensity values of the color channels, e.g., in the case of an RGB image. If the current threshold does not lead to a detected blob, its value is increased (if it is less than the last value for which a blob has been found) or decreased by a selected bias value. The threshold value is bounded by zero and the maximal value (e.g., 768 = 3×256 for RGB images). If the threshold value reaches one of the bounds, the bias is set to zero, the threshold value is restricted to the particular bound, and the last threshold is set to half of the threshold value interval. Note that the threshold bias is increased by a selected step (e.g., 60) after each increase of the threshold value, so that the threshold is adjusted progressively.
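The following C++ fragment is a minimal sketch of one plausible reading of this adaptive threshold update; the names, the step value of 60, and the exact order of the updates are illustrative assumptions, not the actual tracker code.

#include <algorithm>

struct ThresholdState {
    int value = 384;      // current B/W separating threshold
    int lastGood = 384;   // last threshold for which a blob was found
    int bias = 60;        // search step used when no blob is detected
    int maxValue = 768;   // 3 x 256 for an RGB image (256 for grayscale)
};

// Called after each processed image.
void updateThreshold(ThresholdState &s, bool blobFound, int meanInner, int meanOuter)
{
    if (blobFound) {
        // Use the average of the two segments' mean intensities.
        s.value = (meanInner + meanOuter) / 2;
        s.lastGood = s.value;
        s.bias = 60;
        return;
    }
    // No blob found: probe upwards while below the last successful value, otherwise downwards.
    if (s.value < s.lastGood) {
        s.value += s.bias;
        s.bias += 60;   // widen the search step after each increase
    } else {
        s.value -= s.bias;
    }
    if (s.value <= 0 || s.value >= s.maxValue) {
        // Bound reached: clamp the value, reset the step, and restart from the middle of the range.
        s.value = std::min(std::max(s.value, 0), s.maxValue);
        s.bias = 0;
        s.lastGood = s.maxValue / 2;
    }
}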
The threshold changing procedure can be applied multiple times to a single image until a blob is found. If the changes of the lighting conditions are not fast and significant, it is usually sufficient to process several consecutive images to determine the threshold.
In addition, it is also necessary to have a blank space around the outer disc of the pattern to separate the ring from the background, e.g., see Fig. 4, where the square bounding box forms a 2 cm wide strip around the disc.
The pattern is defined by three parameters: the diameter of the outer disc do, the diameter of the inner disc di, and the side d of the square bounding box. The discs are located at the center of the box.
2.2 Performance Indicators
The above described blob detection has been implemented in C++ and its performance has been experimentally verified in a series of tests analyzing its computational requirements and the achievable precision of the relative localization.

Regarding the computational requirements, the implemented algorithm has been compiled by the G++ ver. 4.3.3 cross-compiler for the arm-angstrom-linux distribution used on the Gumstix Overo board. As the blob detection considers only the B/W pattern, the computational burden can be decreased using a grayscale image, which also reduces the computational time of the image conversion from the YUV format used for reading images from the Caspa device.

The precision of the localization depends on the estimated parameters of the camera [3]. In particular, the camera has been calibrated using the Matlab toolbox [1].

Results presented in Section 2.2.1 and Section 2.2.2 have been measured using a pattern with do = 14 cm, di = 8.4 cm, and d = 18 cm.
2.2.1 Real Computational Requirements
The computational requirements have been measured as the maximal number of processed images per second (herein denoted as FPS following standard conventions) in the full processing loop, i.e., including the capture time, YUV to grayscale conversion, blob detection, transformation of the coordinates using the camera parameters, and transfer of the coordinates from the camera module to a client computer using the UDP protocol over WiFi. Thus, the presented results demonstrate the FPS achievable in a real application.
The image processing time depends on several factors. Although the image conversion and the coordinate transformation depend only on the image resolution, the blob detection time can vary. First, using the previously known position of the blob in the image can significantly reduce the time required to find the segments. Besides, the processing time also depends on the number of pixels forming the segments; thus, a smaller pattern or a pattern at a longer distance can be detected faster than a larger pattern or a pattern close to the camera. The worst case scenario is a situation when a blob is not detected, because it requires searching the whole image. In such a situation, the processing time depends only on the image resolution. Regarding these aspects, the achievable FPS has been measured for four selected resolutions (320×240, 480×360, 640×480, and 752×480) and for a pattern placed at various distances from the camera. The detailed results are presented in Table 1 and an overview of the FPS in Table 2.
Table 1 presents the maximal FPS achieved using the tracking (i.e., the previous position of the found blob). For the lowest considered resolution, the processing is limited by the Caspa camera, which can provide images at 60 Hz. On the other hand, a lower resolution limits the maximal measured distance. In the case of the 320×240 resolution, the blob is sporadically detected also at the distance 3.5 m, but only in about 4 % of the cases. For the 480×360 resolution, the maximal usable distance is 3.5 m, and at the distance 3.8 m the blob is detected in about 15 % of the cases. For longer distances the blob is not detected reliably.

The guaranteed numbers of processed images per second are given in Table 2, where FPSworse corresponds to situations where the blob is not detected at all, and therefore the whole image must be searched for a segment. The column FPSmin denotes the minimal image processing frequency when the blob is reliably found using its previous position and its distance is not closer than 0.5 m.
Table 1: Maximal FPS achieved

  L [m]   320×240   480×360   640×480   752×480
          FPSmax    FPSmax    FPSmax    FPSmax
  0.5       60.0      35.0      23.0      20.0
  1.0       60.0      46.0      30.0      27.0
  1.5       60.0      50.0      33.0      30.0
  2.0       60.0      51.0      34.0      30.0
  2.5       60.0      51.0      35.0      30.0
  3.0       60.0      51.0      35.0      31.0
  3.2       60.0      52.0      35.0      31.0
  3.5          -      52.0      34.0      31.0
  4.0          -         -      34.0      30.0
  4.5          -         -      35.0      31.0
  5.0          -         -      35.0      30.0
  5.5          -         -      35.0      31.0
Table 2: Minimal FPS achieved

  Resolution   FPSworse   FPSmin
  320x240          33        60
  480x360          18        35
  640x480           9        23
  752x480           7        20
It is worth mentioning that the image is captured from the camera in the YUV format, which needs a conversion to RGB (or grayscale); alternatively, the segmentation can be done directly in YUV. In the second revision of the camera module, a new optimized implementation of the conversion is used. The real computational times of the conversion (for the 480×360 resolution) are about 11 ms and 7 ms for the RGB and grayscale images, respectively. The achieved FPS presented above is high enough for the COLOS project needs; therefore, the expected benefit of a direct segmentation in YUV is not evident considering its more complex implementation. Thus, the direct segmentation in YUV is not implemented in the second revision of the camera module. Besides, it is expected that the mt9v032 driver will allow direct usage of the Bayer10 format in future revisions.
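For illustration, extracting the grayscale (luma) channel from a packed UYVY buffer amounts to copying every second byte, which is why the grayscale path is cheaper than a full RGB conversion. The following C++ sketch assumes the standard UYVY 4:2:2 packing (U0 Y0 V0 Y1 ...) and is not the optimized implementation mentioned above.

#include <cstdint>
#include <vector>

// Copy the Y samples of a packed UYVY frame into a grayscale buffer.
std::vector<uint8_t> uyvyToGray(const uint8_t *uyvy, int width, int height)
{
    std::vector<uint8_t> gray(static_cast<size_t>(width) * height);
    const size_t pixels = gray.size();
    for (size_t i = 0; i < pixels; ++i)
        gray[i] = uyvy[2 * i + 1];   // Y samples sit at odd byte offsets in UYVY
    return gray;
}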
2.2.2 Precision of the Localization
Similarly to the presented computational requirements, the precision of the localization has been measured using the camera module and a client computer receiving information about the detected blobs. The precision is measured as the difference between the position provided by the camera module and the real distance measured by a tape measure. The provided relative position of the detected blob consists of the 3D coordinates x, y, z, the estimated rotations roll, pitch, yaw, and two pixel ratios of the blob (the ratio of dark
and light pixels, and the ratio of the current and expected number of pixels). Although the rotations are estimated, only the position coordinates are considered in the experimental evaluation of the camera module presented here.
The information about the tracked object provided by the UDP server of the camera module also contains a bit flag denoting whether the estimated values are valid or not. The validity is determined according to the detected blob; thus, the valid flag is true if the blob has been detected, and false otherwise. The current version of the tracker does not use sophisticated methods of sensor fusion, and therefore, it may happen that another object is recognized as the ring pattern. However, this situation did not happen during the experimental evaluation, as the scene did not contain objects that could be interpreted as the pattern.
The relative position of the detected blobs can vary due to small changes in the captured images, and therefore, average values of the measurements of the tracked object are considered. The averages are computed only from valid information about the tracked object, i.e., measurements are collected over a period providing more than 20 valid samples. In fact, the pattern has been successfully detected for several such periods without drop-outs during the experimental evaluation. The pattern has been detected only sporadically in a few cases when the maximal usable distance for object tracking has been measured. These cases are described in Section 2.2.1.
First, the measured distances in the x-axis are presented in Table 3. In this test, the distance of the camera module from the pattern has been changed while its orientation has been kept fixed; however, only roughly, as the module has been positioned by hand. The average measured distance is subtracted from the real distance L; the average error is denoted as Le and the standard deviation as se. The standard deviation in percent of the measured value is denoted as %se. The results indicate that a higher resolution provides more precise estimations at longer distances, but principally it provides the same performance, which can be easily seen from Fig. 5, where the average errors in percent of the real distances are presented. Regarding the standard deviations, the repeatability of the measurements is within units of millimeters, i.e., tenths of a percent of the measured distance. The results correlate with the expected precision reported in [4]. Also note that the camera has been placed and pointed to the pattern by hand, with an expected placement precision of about one centimeter.
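For clarity, one plausible formalization of the reported quantities, assuming N valid measurements L_1, ..., L_N of a true distance L with average Lavg = (1/N) * sum_i L_i, is

  Le  = | L - Lavg |,
  se  = sqrt( (1/(N-1)) * sum_i (L_i - Lavg)^2 ),
  %se = 100 * se / Lavg,

i.e., Le is the absolute difference between the real and the average measured distance, se is the sample standard deviation of the measurements, and %se expresses se relative to the average measured distance.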
The precision in the other axes has been tested only for the 480×360 resolution, because it provides a good trade-off between the achieved FPS and the maximal measured distance. In this experimental setup, the camera has been fixed at a position 3 m from a wall and the pattern has been placed into various locations providing measurements in the y and z axes. The orientation of the axes depends on the placement of the camera, i.e., on the orientation of the CMOS sensor with respect to the real-world coordinates. The results presented in this report correspond to the placement in which the x-axis represents depth, the y-axis is aligned with the vertical direction, and the z-axis is aligned with the horizontal direction. In this setup, the pattern has been placed on the wall by hand, but the estimated precision of the placement is subjectively a bit better than in the previous setup.
The results for changes in the y-axis are presented in Table 4 and for changes in the z-axis in Table 5, where Le is the average error of the measured distances in the particular axis, %Le is the same error in percent of the real distance, se is the standard deviation, and %se is the standard deviation in percent of the measured distance. All the presented results are computed from more than 20 measurements, and in all cases the detection of the blob is valid and the validity is stable over hundreds of images. Note that the repeatability of the measurements is very high, as the standard deviations are very low.
The results presented in Table 5 also indicate how the radial distortion affects the maximal usable field of view. If a blob is placed at the height y = -1 m, it is detectable up to the horizontal position z = 1.1 m.
Table 3: Measured distances for the image resolutions tested

            320×240              480×360              640×480              752×480
  L [m]   Le    se   %se       Le    se   %se       Le    se   %se       Le    se   %se
         [cm]  [cm]  [%]      [cm]  [cm]  [%]      [cm]  [cm]  [%]      [cm]  [cm]  [%]
  0.5     3.6   0.1   0.2      3.5   0.9   2.0      3.5   0.3   0.6      3.3   0.6   1.3
  1.0     4.5   0.1   0.1      4.2   0.1   0.1      3.6   0.3   0.3      3.5   0.6   0.6
  1.5     6.1   0.1   0.1      4.4   0.2   0.1      4.4   0.1   0.1      3.8   0.8   0.5
  2.0     8.7   1.0   0.5      5.6   0.2   0.1      5.4   0.3   0.1      4.8   0.2   0.1
  2.5     9.0   0.6   0.2      8.0   0.4   0.2      8.0   0.4   0.2      6.0   0.4   0.2
  3.0    12.0   0.7   0.2     10.0   1.0   0.3     11.0   0.6   0.2      6.0   0.5   0.2
  3.2    17.0   0.8   0.3      8.1   2.2   0.7      8.4   0.6   0.2      8.6   0.8   0.3
  3.5       -     -     -     12.5   3.2   0.9     12.4   0.9   0.3     10.0   0.6   0.2
  4.0       -     -     -        -     -     -     10.0   1.5   0.4     11.1  15.7   4.0
  4.5       -     -     -        -     -     -     14.9   1.2   0.3     14.1   1.1   0.2
  5.0       -     -     -        -     -     -     22.2   3.0   0.6     22.2   3.0   0.6
  5.5       -     -     -        -     -     -     28.7   2.6   0.5     27.5   3.7   0.7
Figure 5: Measured distances for the image resolutions tested. The plots show the average error [%] of the distance estimation as a function of the distance [m] for the four tested resolutions: (a) distances up to 3.5 m, (b) distances up to 5.5 m.
However, this is not the case for its position at y = -0.5 m, which allows the horizontal position z = 1.0 m, or the height y = 0 m, where the maximal horizontal position is z = 0.9 m.
Table 4: Measured distances in the y-axis

  Pattern Position (x, y, z) [m]    Le [cm]   %Le [%]   se [cm]   %se [%]
  3.0    0.0   0.0                    0.9       n/a      0.001     0.11
  3.0   -0.4   0.0                    0.2       0.5      0.070     0.18
  3.0   -0.8   0.0                    0.1       0.1      0.280     0.35
  3.0   -1.2   0.0                    2.1       1.7      0.470     0.40
  3.0   -1.6   0.0                    2.7       1.7      0.700     0.43
  3.0   -2.0   0.0                    3.7       1.8      1.700     0.87
Table 5: Measured distances in the z-axis

(a) y = 0.0 m

  Pattern Position (x, y, z) [m]    Le [cm]   %Le [%]   se [cm]   %se [%]
  3.0   0.0  -0.9                     0.8       0.9      0.490     0.55
  3.0   0.0  -0.6                     0.2       0.3      0.200     0.33
  3.0   0.0  -0.3                     0.6       2.0      0.070     0.23
  3.0   0.0   0.0                     0.7       n/a      0.001     0.14
  3.0   0.0   0.3                     1.1       3.7      0.300     1.04
  3.0   0.0   0.6                     3.1       5.2      0.090     0.16
  3.0   0.0   0.9                     2.9       3.2      0.150     0.17

(b) y = -0.5 m

  Pattern Position (x, y, z) [m]    Le [cm]   %Le [%]   se [cm]   %se [%]
  3.0  -0.5  -1.0                     7.4       7.4      0.100     0.11
  3.0  -0.5  -0.9                     2.2       2.4      0.170     0.19
  3.0  -0.5  -0.6                     2.2       3.7      0.100     0.17
  3.0  -0.5  -0.3                     2.1       7.0      0.060     0.22
  3.0  -0.5   0.0                     1.7       n/a      0.004     0.24
  3.0  -0.5   0.3                     1.2       4.0      0.110     0.35
  3.0  -0.5   0.6                     0.3       0.5      0.060     0.10
  3.0  -0.5   0.9                     0.7       0.8      0.190     0.21
  3.0  -0.5   1.0                     0.3       0.3      0.120     0.12

(c) y = -1.0 m

  Pattern Position (x, y, z) [m]    Le [cm]   %Le [%]   se [cm]   %se [%]
  3.0  -1.0  -1.1                    10.8       9.8      0.650     0.66
  3.0  -1.0  -0.9                     5.9       6.6      0.250     0.30
  3.0  -1.0  -0.6                     5.5       9.2      0.200     0.37
  3.0  -1.0  -0.3                     4.8      16.0      0.070     0.28
  3.0  -1.0   0.0                     4.5       n/a      0.001     0.02
  3.0  -1.0   0.3                     4.1      13.7      0.800     2.35
  3.0  -1.0   0.6                     3.6       6.0      0.140     0.22
  3.0  -1.0   0.9                     3.1       3.4      0.480     0.52
  3.0  -1.0   1.1                     1.3       1.2      0.020     0.02
Figure 6: Examples of captured images ((a) and (b)).
2.2.3 Detectability of the Pattern
The blob detection is based on the separation of two discs forming a B/W ring pattern, see Fig. 4. It is clear that the size of the pattern affects the practical deployment of the presented blob detection algorithm. Small patterns can be more practical; however, the size of the pattern affects the maximal measurable distance, as a too small pattern can hardly be detected from long distances. Regarding the computational requirements, small patterns contain fewer pixels; thus, the processing can be faster. Therefore, a higher resolution can be used for better detection of small patterns, with lower computational requirements than for larger patterns. However, in the worst case (i.e., the pattern is not detected), the computational requirements are the same as presented in Table 2.

The detectability of patterns of various sizes has been studied in an experimental setup similar to that of Section 2.2.2, i.e., the position of the pattern has been fixed and the camera module has been pointed to the center of the pattern and positioned at different distances (manually, by hand); see examples of images captured by the camera module shown in Fig. 6.
The detectability of the pattern is measured as the number of detected blobs in 100 captured images; thus, the number of successfully detected blobs is an estimate of the detectability of the pattern at the given distance in percent. In addition, an estimate of the localization precision in the x-axis is measured as the average error in percent of the real distance, as in the previous sections. The camera module reference point has been established at the distance 1 m from the blob. It means that the module position has been adjusted until the provided distance is 1 m with a precision in units of millimeters. Therefore, the error at the distance 1 m is typically smaller than for other distances. It is worth mentioning that the error could be decreased using additional corrections addressing the systematic error of the distance measurement.
First, a small pattern with the outer diameter do = 3.5 cm has been considered with two dimensions of the bounding square: d = 7.5 cm and d = 5.5 cm. The results are presented in Fig. 7. The results show that such a small pattern can be used reliably within a range of about one meter. Using the highest available resolution (752×480), the blob detection algorithm can be used up to 1.5 m. The results also indicate that a wider blank space around the disc increases the detectability a bit (in particular using the 480×360 resolution).

A larger disc can be detected from longer distances, as can be seen in Fig. 8. In this experimental evaluation, patterns with the discs' outer diameters 7.0 cm and 6 cm have been considered. For both patterns, the outer blank space is formed by a 1 cm wide strip.
Figure 7: Detectability and localization precision in the x-axis for the pattern with outer diameter 3.5 cm and two sizes of the squared bounding box. Panels (a), (c): do = 3.5 cm, di = 2.1 cm, d = 7.5 cm; panels (b), (d): do = 3.5 cm, di = 2.1 cm, d = 5.5 cm. The plots show the reliability of the blob detection [%] and the average error of the distance estimation [%] as functions of the distance [m] for the four tested resolutions.
Figure 8: Detectability and localization precision in the x-axis for the patterns with outer diameters 7.0 cm and 6.0 cm. Panels (a), (c): do = 7.0 cm, di = 4.2 cm, d = 9.0 cm; panels (b), (d): do = 6.0 cm, di = 3.6 cm, d = 8.0 cm. The plots show the reliability of the blob detection [%] and the average error of the distance estimation [%] as functions of the distance [m] for the four tested resolutions.
In both cases, the usable distance of the camera module from the blob is about 2.5 m using the higher resolutions (640×480 or 752×480). However, using lower resolutions, the detection of the smaller pattern (8×8 cm) is more difficult. In the case of the 480×360 resolution, the reliable distance is about 2 m for the pattern with do = 7 cm, and about 1.5 m for do = 6 cm.

Regarding the presented results, a smaller pattern can be used, however with a limited maximal distance between the pattern and the camera module.
Lessons Learned

Based on the experimental evaluation, the detectability of the blobs depends on several factors. Addressing these factors can provide a better detectability of small patterns at longer distances than in the results presented above. Comments concerning practical tuning of the camera module, the selected pattern, and the expected operational distances are given below.
• The border between the dark and white parts of the pattern should be sharp enough. For example, the black ring on a white background needs white surroundings. The B/W separating threshold is computed from the mean intensity values of the detected segments, so blurred borders can lead to a different number of detected and expected pixels, resulting in an undetected blob.

• The colors of the pattern can be swapped, e.g., a white ring on a black background.

• It may happen that a reflection of light from the surface of the black ring leads to a white color in the image. In such a situation, using a different material can help.

• Although the ratio of the black and white discs of the used pattern seems to be a good invariant, it is not guaranteed that such a ratio cannot also be detected for other objects. So, it may happen that a false positive blob is reported as valid. This issue can be addressed by considering different diameters of the discs, a different ratio, or a different size of the pattern. In addition, it can also be addressed by more sophisticated sensor fusion techniques, e.g., considering the previous position of the blob and the expected maximal movement, or constraining the shape of the segment, as the current version of the algorithm considers only a bounding box of each detected segment.

• Having a strict range of distances where the pattern will be placed, the lens can be adjusted to provide a sharper image. However, once the focal length is changed, it is necessary to find new parameters of the camera.

• It is worth mentioning that the used lens does not provide an excellent image; with all respect, a cheap web camera provides a (subjectively) sharper image at longer distances.
3 System Configuration
The developed camera module is based on the Gumstix Overo board with the OMAP 3503 application processor. The operating system is a Linux based distribution created using the OpenEmbedded build framework [5]. The used Linux kernel is version 2.6.34, tweaked for the Gumstix Overo board. The basic system configuration follows a common installation of a Linux based operating system, including system tools and initialization. The boot loader as well as the operating system itself support the console port of the Overo board; thus, any system administration can be performed using a standard serial connection at 115200 baud, i.e., 8 data bits, no parity, and one stop bit. For example, one can connect to the camera module using the provided USB adapter and the application jerm [6] or minicom [7]. The default account is root with an empty password.
In addition to the console, the sshd daemon is started at boot time and it listens on all available interfaces. So, if a WiFi connection is available, one can connect to the camera module using a remote shell account. In the default settings, the user root can also connect via ssh without a password. A typical user does not work under the root account; therefore, the local ssh client can be configured to use root for accessing the camera module as follows. Let the camera module IP address be recorded in the /etc/hosts file under the cam1 host name. Then, to connect to the module simply by the 'ssh cam1' command, the local configuration file of the ssh client can be adjusted as shown in Listing 1.
Host cam1
User root
Listing 1: An example of ~/.ssh/config file.
The local time at the camera module is not preserved due to the missing backup battery. Therefore, to synchronize the system clock, an ntpd daemon is started at boot time. An example of the daemon configuration file is depicted in Listing 2, where the synchronization server is timeserver, whose IP address is specified in /etc/hosts, see Listing 3 for an example.
# The drift file must remain in a place specific to this
# machine - it records the machine specific clock error
driftfile /etc/ntp.drift
server timeserver
# Using local hardware clock as fallback
# server 127.127.1.0
fudge 127.127.1.0 stratum 14
# Defining a default security setting
restrict default nomodify nopeer

Listing 2: An example of the /etc/ntp.conf file.
127.0.0.1    localhost.localdomain    localhost
10.10.40.1   timeserver

Listing 3: An example of the /etc/hosts file with a local database of the host names.
Additional important configurations are related to the WiFi connection and the initialization of the programs for capturing images and tracking. These are described in the following sections.
3.1 WiFi Configuration
The availability of a WiFi connection to the camera module is one of the most important system configurations, because it is the only way of remote access to the module. However, a WiFi connection depends on the signal strength, and once the signal is lost, the configuration can be lost as well. Therefore, to avoid such a misconfigured situation, a simple script wifid is started at boot time to periodically check the WiFi connection availability. The script is started using the local startup configuration specified in the /etc/local.start file that is depicted in Listing 4. The wifid script itself is located in the /opt directory and its content is shown in Listing 5. The script calls two additional scripts. The first one is the /opt/check_wifi.sh script, which checks whether the ESSID is filled in the wlan0 interface; it is empty in the case of a lost connection. The second script is /opt/wlan_config, which is responsible
for the actual configuration of the wlan0 interface. In fact, the script /opt/wlan_config is a symbolic link to provide flexibility in changing the WiFi configuration without the necessity to rewrite the scripts and restart the wifid service. So, it is sufficient to create a new file and then a new symbolic link; however, it is recommended to have console access or to consider the checking script check_wifi.sh. It may happen that the check_wifi.sh script reconfigures the network using a wrong configuration file (e.g., due to a missing wlan_config) and the connection to the camera module is lost, which makes it impossible to change the configuration using the WiFi access. An example of the WiFi configuration using the colos network without any protections is depicted in Listing 7. This file is located at /opt/wlan_colos and it is the target of the /opt/wlan_config symbolic link.
#!/bin/bash
echo "Starting local ..."
echo "Initializing GPIO ..."
/opt/sbinit
echo "Starting wifi ..."
/opt/wlan_config
ntpdate timeserver
nohup /opt/wifid > /dev/null &

Listing 4: An example of the local startup configuration file /etc/local.start.
PID=`pidof 'wifid'`
if [ -n "$PID" ]; then
    for pid in $PID; do
        if [ "$pid" != "$$" ]; then
            echo "wifid already running (pid $pid)" >&2
            exit 0
        fi
    done
fi

D=`date +'%Y-%m-%d %H:%M:%S'`
echo "$D Starting wifid ..." >> /var/log/wifi
while true; do
    sleep 10
    /opt/check_wifi.sh 2>&1 >> /var/log/wifi
done

Listing 5: The /opt/wifid script for periodical checking of the WiFi connection.
LOGFILE=/var/log/wifi
ESSID=`iwconfig 2>/dev/null | \
    awk '/wlan0/ { print $4 }' | \
    sed -e 's/.*ESSID:"\([^"]*\)"/\1/'`
D=`date +'%Y-%m-%d %H:%M:%S'`
if [ -n "$ESSID" ]; then
    exit 0
fi
echo "$D Restarting wifi ..." >> $LOGFILE
/opt/wlan_config &>/dev/null

Listing 6: The /opt/check_wifi.sh script for testing availability of the WiFi connection.
ESSID=colos
NET=10.10.40.
ADDRESS=174

ifconfig wlan0 up
iwconfig wlan0 essid x enc off mode ad-hoc
sleep 2
iwconfig wlan0 essid $ESSID
ifconfig wlan0 ${NET}${ADDRESS} netmask 255.255.255.0
route add default gw ${NET}1

Listing 7: The /opt/wlan_colos script for configuring the WiFi connection to the colos network.
A summary of the access parameters is given in Table 7.
3.2 Image Capturing
The image capturing functionality has been introduced in the first version of the camera module and it is briefly described in [4]. In this section, a more detailed description together with the particular scripts is presented. The image capturing is a service started at the boot time of the system. The initialization of the capturing consists of three system scripts located in the /etc/init.d directory. First, the camera driver is initialized by the camera_init service. This allows a custom specification of the driver (kernel module). The used initialization is depicted in Listing 8.
#!/bin/sh

flash() {
    for i in `seq 1 15`
    do
        echo 1 > /sys/class/gpio/gpio145/value
        sleep 0.2
        echo 0 > /sys/class/gpio/gpio145/value
        sleep 0.2
    done
}

case $1 in
    start)
        rmmod mt9v032
        insmod /lib/modules/2.6.34/kernel/drivers/media/video/mt9v032.ko auto_exp=0 auto_gain=0 low_light=0 hdr=0
        echo "mt9v032 loaded"
        flash
        ;;
    stop)
        ;;
    restart)
        ;;
    status)
        flash
        ;;
    *)
esac

Listing 8: Camera driver initialization /etc/init.d/camera_init.
After that, the camera_event service is started, which handles the user's interaction with the module using selected GPIO pins. In this service, a named pipe /tmp/caspa is created and the /opt/bin/capture_event.sh script is executed, see Listing 9.
#!/bin/sh

pipe=/tmp/caspa

case $1 in
    start)
        rm -rf $pipe
        mkfifo $pipe
        /opt/bin/capture_event.sh 2>&1 1>/var/log/capture_event.log &
        ;;
    stop)
        ;;
    restart)
        ;;
    status)
        ;;
    *)
esac

Listing 9: Initialization of user input handling in the /etc/init.d/camera_event script.
Finally, the capturing service itself is started by the camera_capture script, which is shown in Listing 10. The script contains a simple user notification using GPIO pin 144, so the red LED flashes for a while after starting the service. (The second version of the camera module does not include LEDs; therefore, the description presented here is for the first version of the module.)
#!/bin/sh

flash() {
    for i in `seq 1 15`
    do
        echo 0 > /sys/class/gpio/gpio144/value
        sleep 0.2
        echo 1 > /sys/class/gpio/gpio144/value
        sleep 0.2
    done
}

case $1 in
    start)
        pid=`ps aux | grep "/bin/bash /opt/bin/capture.sh" | grep -v grep | awk '{ print $2 }'`
        if [ -n "$pid" ]
        then
            echo "camera capture already running"
        else
            /opt/bin/capture.sh 2>&1 1>/var/log/capture.log &
            echo "capture started"
        fi
        flash
        ;;
    stop)
        pid=`ps aux | grep "/bin/bash /opt/bin/capture.sh" | grep -v grep | awk '{ print $2 }'`
        if [ -n "$pid" ]; then
            kill -9 $pid
            echo "capture.sh killed"
        else
            echo "capture not running"
        fi
        ;;
    restart)
        pid=`ps aux | grep "/bin/bash /opt/bin/capture.sh" | grep -v grep | awk '{ print $2 }'`
        if [ -n "$pid" ]; then
            kill -9 $pid
            echo "capture.sh killed"
        else
            echo "capture not running"
        fi
        /opt/bin/capture.sh 2>&1 1>/var/log/capture.log &
        echo "capture started"
        flash
        ;;
    status)
        flash
        ;;
    *)
esac

Listing 10: Initialization of the capturing service in the /etc/init.d/camera_capture script.
The system scripts are started at run level 5, which is configured using the update-rc.d tool, see Listing 11 for an example.
update-rc.d camera_init start 90 5 .
update-rc.d camera_event start 95 5 .
update-rc.d camera_capture start 99 5 .

Listing 11: Activation of the capturing service scripts.
Besides the system service scripts, the image capturing consists of two additional scripts. Although it may look a bit complex, these scripts provide flexibility in how to initiate the capturing process, as they provide an independent way of handling user events from the GPIO pins, where buttons can be connected. The scripts are placed in the /opt/bin/capture_event.sh and /opt/bin/capture.sh files (in fact, these files are symbolic links to capture_event.caspa.sh and capture.caspa.sh) and are depicted in Listings 12 and 13. The first script handles user inputs coming from GPIO signals using the gpio-event tool. Even though gpio-event supports a kind of filtering of the input signal, it does not work reliably, and therefore, additional conditions on the input events are considered in the script. The communication between the scripts is realized using a named pipe, which should be created at the system boot time by the camera_event service. The named pipe is located at /tmp/caspa. The capture.sh script is an infinite loop in which a finite state machine with two states is implemented. In the first state, the pipe is read for an incoming command. If the command is "start", a program for the capturing is started in the background and the script waits for the termination of the program. A detailed description of the capturing program is presented in Section 5.
#!/bin/bash

capture_pin=75
capture_edge=R
pipe=/tmp/caspa
capture_pid=/var/run/capture.pid
cmd_pattern=tcapture

led_on() {
    echo 0 > /sys/class/gpio/gpio145/value
}

led_off() {
    echo 1 > /sys/class/gpio/gpio145/value
}

events=/dev/gpio-event
last=0
gpio-event $capture_pin:b:30

while true; do
    echo "Watch for event"
    if [ ! -p $pipe ]
    then
        echo "Create pipe $pipe"
        rm -rf $pipe; mkfifo $pipe
    fi
    gpio-event $capture_pin:b:30
    led_on
    read < $events b e t
    if [ $e = "R" ]
    then
        dt=`echo "$t - $last" | bc`
        dt=`printf "%.0f" $dt`
        if [[ $b -eq $capture_pin && $dt -ge 1 ]]
        then
            echo "`date` INFO: Toggle capture"
            start=1
            if [ -f $capture_pid ]
            then
                pid=`cat $capture_pid`
                if [[ -n $pid && -n `ps -p $pid | grep $pid | grep $cmd_pattern` ]]
                then
                    start=0
                fi
            fi
            led_off
            if [ $start -eq 0 ]
            then
                echo "`date` INFO: Stop capturing"
                kill `cat $capture_pid` 2>/dev/null
                killall -9 tcapture 2>/dev/null  # for sure
                rm -rf $capture_pid             # for sure
            else
                echo "`date` INFO: Start capturing"
                exec 5<> $pipe
                read -t 0 -u 5   # read eventual content of the pipe
                echo "start" > $pipe
            fi
        fi
        last=$t
        echo "Last $last; read events"
        exec 5<> $events
        read -t 0 -u 5   # read eventual events
        echo "DONE"
    fi
done

Listing 12: Handling of user inputs coming from GPIO pins.
#!/bin/bash

pipe=/tmp/caspa
capture_pid=/var/run/capture.pid
out_prefix=/media/mmcblk0p1
led_capture="/sys/class/gpio/gpio144/value"
on=0
off=1

while true; do
    if [ ! -p $pipe ]
    then
        echo "Create pipe $pipe"
        rm -rf $pipe; mkfifo $pipe
    fi
    echo $off > $led_capture
    echo "`date` INFO: led capture off, wait for cmd"
    read <$pipe cmd
    if [[ -n $cmd && $cmd = "start" ]]
    then
        d=`date +"%Y-%m-%d-%H.%M.%S"`
        echo "$cmd - $d"
        out=$out_prefix/$d
        mkdir $out
        if [ $? -eq 0 ]
        then
            echo "`date` INFO: output directory set to $out"
        else
            echo "`date` ERROR: cannot create output directory $out"
        fi
        /opt/bin/tcapture -c /opt/tcapture.cfg -l /opt/logger.caspa_capture.cfg --image-directory $out &
        pid=$!
        echo $pid > $capture_pid
        echo $on > $led_capture
        echo "`date` INFO: led capture on"
        wait $pid
        rm -rf $capture_pid
        echo "INFO: Capturing end with return value $?"
    fi
done

Listing 13: Starting of the capturing.
The main advantage of the independent handling of events and capturing is that the capturing can be managed remotely from a terminal by the following commands:

• echo start > /tmp/caspa - starts the capturing;

• killall -9 tcapture - terminates the capturing program.

This is very handy if the buttons are not within reach, e.g., when the camera module is placed on a UAV at a high altitude, or if the module does not have buttons at all, as in the second version of the camera module.
3.3 Initialization of the Tracker Server
The initialization of the tracker server is very similar to that of the capturing program. It is done using the system service /etc/init.d/tracker, which is shown in Listing 14. The service is added to run level 5 by the command update-rc.d tracker start 99 5 . The script starts (or eventually stops) the infinite loop in which the tracker program is repeatedly started, see Listing 15. The loop allows restarting the tracker with a different configuration, which is provided by the files tracker.cfg, cam.cfg, and cam.m. Besides, the loop also resolves situations in which the tracker program is unexpectedly terminated, e.g., due to some unknown bug.
#!/bin/sh

case $1 in
    start)
        pid=`ps aux | grep "/bin/sh /opt/tracker/tracker.sh" | grep -v grep | awk '{ print $2 }'`
        if [ -n "$pid" ]
        then
            echo "tracker is already running"
        else
            /opt/tracker/tracker.sh 2>&1 1>/var/log/tracker.log &
            echo "tracker started"
        fi
        ;;
    stop)
        pid=`ps aux | grep "/bin/sh /opt/tracker/tracker.sh" | grep -v grep | awk '{ print $2 }'`
        if [ -n "$pid" ]; then
            kill -9 $pid
            killall -9 tracker.sh
            kill -9 `pidof /opt/tracker/tracker`
            echo "tracker.sh killed"
        else
            echo "tracker is not running"
        fi
        ;;
    restart)
        pid=`ps aux | grep "/bin/sh /opt/tracker/tracker.sh" | grep -v grep | awk '{ print $2 }'`
        if [ -n "$pid" ]; then
            kill -9 $pid
            echo "tracker.sh killed"
            killall -9 tracker.sh
            kill -9 `pidof /opt/tracker/tracker`
        else
            echo "tracker is not running"
        fi
        /opt/tracker/tracker.sh 2>&1 1>/var/log/tracker.log &
        echo "tracker started"
        ;;
    status)
        ;;
    *)
esac

Listing 14: Tracker service initialization.
#!/bin/sh

prefix=/opt/tracker

while true; do
    ls -l $prefix/tracker
    ls -l $prefix/tracker.cfg
    ls -l $prefix/cam.cfg
    ls -l $prefix/cam.m
    echo "`date` INFO: Start tracker"
    $prefix/tracker --config $prefix/tracker.cfg --camera-control-settings $prefix/cam.cfg --camera-parameters $prefix/cam.m --logger-config $prefix/logger.cfg
    echo "`date` INFO: Tracker has been terminated"
done

Listing 15: Tracker program execution.
/opt/tracker:
cam.cfg
cam.m -> etc/cam4-480x360.m
etc
logger.cfg
tcapture
tracker -> tcapture
tracker.cfg -> etc/tracker-480x360.cfg
tracker.sh

/opt/tracker/etc:
cam4-320x240.m
cam4-480x360.m
cam4-640x480.m
cam4-752x480.m
tracker-320x240.cfg
tracker-480x360.cfg
tracker-640x480.cfg
tracker-752x480.cfg

Listing 16: Directory layout of the tracker program.
The tracker program is the same binary file as the capturing program tcapture; just a different configuration is used. The behaviour of the program also changes according to the name of the binary file, i.e., if the name is tracker or ttracker, the program is forced to act as a tracker providing information about the tracked object. In fact, the tracker file is a symbolic link with the tcapture binary as its target, see the content of the /opt/tracker directory shown in Listing 16. The version of the program can be printed by the -v command line argument, as can be seen in Listing 17; a small illustrative sketch of the name-based dispatch follows the listing.
[email protected]:/opt/tracker# ./tracker -v
Caspa capture application 0.1
Id: $Id: tcapture.cc 125 2012-01-10 20:12:27Z honza $
[email protected]:/opt/tracker#

Listing 17: Printed version of the tracker program.
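The following C++ fragment sketches how such a name-based behaviour switch can be realized; the function and names are illustrative only and do not reproduce the actual tcapture sources.

#include <libgen.h>   // basename()
#include <cstring>
#include <string>

enum class Mode { Capture, Tracker };

// Select the program behaviour from the executable name (argv[0]).
Mode selectMode(const char *argv0)
{
    char buf[256];
    std::strncpy(buf, argv0, sizeof(buf) - 1);
    buf[sizeof(buf) - 1] = '\0';              // basename() may modify its argument, so use a copy
    const std::string name = basename(buf);   // e.g. "tracker" when run via the symbolic link
    if (name == "tracker" || name == "ttracker")
        return Mode::Tracker;                 // force the tracking behaviour
    return Mode::Capture;                     // default: plain image capturing
}

int main(int argc, char **argv)
{
    (void)argc;
    const Mode mode = selectMode(argv[0]);
    // ... dispatch to the capturing or tracking code path according to mode ...
    return 0;
}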
The configuration of the tracker program (and the tcapture program as well) consists of the following four configuration files.

• tracker.cfg - the main configuration file of the program in a format for the Boost.Program_options library [8];

• logger.cfg - a specific configuration for the log4cxx logger [9];

• cam.cfg - values of the camera control settings in a v4l2 "format" [10], see Section 5 for further details;

• cam.m - identified parameters of the camera found as a result of the calibration [1].
All the program options can be specified in the main configuration file; however, they can also be given as command line arguments, which have higher priority. A detailed description of the most important parameters is presented in Section 5. The main tracker directory /opt/tracker contains several configuration files and files with the camera parameters for several resolutions. The selected default resolution is 480×360; thus, the corresponding configuration files are the targets of the symbolic links cam.m and tracker.cfg.
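For illustration, an option from the configuration file can be overridden directly on the command line, assuming the same option names are accepted as command line arguments, as stated above (the particular resolution values below are only illustrative):

cd /opt/tracker
./tracker --config etc/tracker-640x480.cfg --width 640 --height 480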
The tracker can be used simultaneously with the capturing program; however, the camera device (i.e., /dev/video0) can be opened only by a single application. Therefore, the tracker server monitors "connected" clients and closes the device once no client is connected (the server uses UDP; thus, the connection is monitored using an explicit unsubscription by the connected clients). An opened camera device is indicated by the red LED on the Overo Caspa board.
A summary of the tracker parameters is given in Table 8.
4 Camera Settings
The Gumstix Caspa camera is accessible through the V4L2 programming interface [10] provided by the mt9v032 driver, and therefore it can be used like any other v4l device. However, only a limited set of control settings is implemented by the driver. A list of the available settings is shown in Listing 18. It is worth mentioning that version r100 of the mt9v032 driver (the current version at the time of developing the camera module) contains a small bug that prevents the control settings from being used. The bug has been fixed during the development of the camera module, so all the listed settings can be used.
=== VIDIOC_QUERYCAP ===
driver:  omap3
card:    omap3/mt9v032//
version: 0
capabilites:
        V4L2_CAP_VIDEO_CAPTURE
        V4L2_CAP_STREAMING
=== VIDIOC_ENUMINPUT ===
Input 1
        name:   camera
        type:   V4L2_INPUT_TYPE_CAMERA
        status: (not all checked)
=== VIDIOC_ENUM_FMT (type=V4L2_BUF_TYPE_VIDEO_CAPTURE) ===
Format 1:
        description: UYVY, packed
        pixelformat: 0x59565955 (UYVY)
Format 2:
        description: YUYV (YUV 4:2:2), packed
        pixelformat: 0x56595559 (YUYV)
Format 3:
        description: Bayer10 (GrR/BGb)
        pixelformat: 0x30314142 (BA10)
=== VIDIOC_QUERYCTRL ===
Control: 0
        id:      V4L2_CID_BASE + 0
        name:    Brightness
        type:    integer
        minimum: 0
        maximum: 255
        step:    1
        default: 1
Control: 1
        id:      V4L2_CID_BASE + 1
        name:    Contrast
        type:    integer
        minimum: 0
        maximum: 255
        step:    1
        default: 16
Control: 2
        id:      V4L2_CID_BASE + 17
        name:    Exposure
        type:    integer
        minimum: 2
        maximum: 480
        step:    1
        default: 480
        flags:   0x00000020
Control: 3
        id:      V4L2_CID_BASE + 18
        name:    Automatic Gain
        type:    boolean
        default: 1
Control: 4
        id:      V4L2_CID_BASE + 19
        name:    Analog Gain
        type:    integer
        minimum: 16
        maximum: 64
        step:    1
        default: 16
        flags:   0x00000020
Control: 5
        id:      V4L2_CID_BASE + 20
        name:    Flip Horizontally
        type:    boolean
        default: 0
Control: 6
        id:      V4L2_CID_BASE + 21
        name:    Flip Vertically
        type:    boolean
        default: 0
Control: 7
        id:      V4L2_CID_BASE + 31
        name:    Color Effects
        type:    other: 3
        minimum: 0
        maximum: 2
        step:    1
        default: 0
Control: 8
        id:      V4L2_CID_CAMERA_CLASS_BASE + 1
        name:    Automatic Exposure
        type:    boolean
        default: 1
Listing 18: Gumstix Caspa camera control settings accessible by the V4L2 interface provided by the mt9v032
driver.
In addition to this issue, the current version of the mt9v032 driver does not provide sufficient stability for capturing images in the BA10 format, which is why the V4L2_PIX_FMT_YUYV (YUYV) format is used.
The two above-mentioned applications (tcapture and tracker) use a simple text file to store the desired camera control settings. The format follows the simple interface for controlling a device using the ioctl system call and the V4L2 definitions. Each control setting occupies one line with three columns separated by a space (or any whitespace) character. The first column is a string denoting the name of the V4L2 request (in a human readable form), the second column is the integer representation of the request, and the third column is the requested value. An example of the file is shown in Listing 19; a minimal sketch of how such a file can be applied to the camera device is given after the listing.
V4L2_CID_BRIGHTNESS    9963776  2
V4L2_CID_CONTRAST      9963777  16
V4L2_CID_GAIN          9963795  16
V4L2_CID_EXPOSURE_AUTO 10094849 0
V4L2_CID_EXPOSURE      9963793  375
V4L2_CID_AUTOGAIN      9963794  1
V4L2_CID_COLORFX       9963807  1
Listing 19: An example of the camera control settings file (cam.cfg).
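The following minimal sketch (not part of the tcapture/tracker sources) illustrates how a file in this format can be applied to an already opened V4L2 device using the standard VIDIOC_S_CTRL ioctl request; the function name apply_control_settings and the simplified error handling are only illustrative.

#include <cstdio>
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

// Apply camera control settings from the simple three-column text file
// (name, numeric control id, value) to an already opened V4L2 device.
int apply_control_settings(int fd, const char *filename) {
   FILE *f = std::fopen(filename, "r");
   if (!f)
      return -1;
   char name[64];
   struct v4l2_control ctrl;
   while (std::fscanf(f, "%63s %u %d", name, &ctrl.id, &ctrl.value) == 3) {
      if (ioctl(fd, VIDIOC_S_CTRL, &ctrl) == -1)
         std::fprintf(stderr, "Cannot set %s\n", name); // the name is informative only
   }
   std::fclose(f);
   return 0;
}

int main(void) {
   int fd = open("/dev/video0", O_RDWR); // the camera device used by the applications
   if (fd != -1) {
      apply_control_settings(fd, "cam.cfg");
      close(fd);
   }
   return 0;
}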
4.1 Practical Tips for Using the Overo Caspa Camera
The usage of the camera is mostly straightforward; however, one may be surprised by the device behaviour due to implementation details. Even though these facts can be found in the documentation, they are summarized here as a reminder.
4.1.1 Camera Control Settings
Although the camera control settings file defined above (see Listing 19) is straightforward, the behaviour of the camera driver needs special consideration regarding the auto exposure and auto gain controls. Manual settings of the exposure or gain take effect only if the auto exposure or auto gain, respectively, is disabled. This means that the automatic settings must be disabled prior to setting the desired values. Regarding the applications described above and the configuration file, this can be achieved by the order of the control settings in the file; moreover, the developed applications already account for this behaviour. A fragment illustrating such an ordering is shown below.
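For instance, a fragment of the settings file that sets a manual exposure value has to disable the automatic exposure first (the numeric identifiers and values are taken from Listing 19):

V4L2_CID_EXPOSURE_AUTO 10094849 0
V4L2_CID_EXPOSURE      9963793  375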
4.1.2 Auto exposure
The auto exposure might reduce the frame rate to as low as 15 frames per second, as noted on the Wiki page of the Caspa board [11]. In such cases, the time required for processing an image, together with the time for reporting the local position of the detected blob, becomes similar to the time needed to capture the image. As a consequence, this significantly reduces the number of new images processed per second. Therefore, it is recommended to use manual exposure settings, because they provide a higher FPS, as presented in Section 2.2.1.
5 Applications Settings
This section presents a more detailed description of the main applications for the camera module. The description is not exhaustive because the applications contain many settings and options, which are mostly for development and debugging purposes. Therefore, the description is focused on the most important parts from a user point of view. Additional functions can be found in the source code of the applications.
5.1 Tracker Server
The tracker server acts as a daemon providing information about the tracked object. The application itself is the same binary as the capturing program, and therefore the particular program options have to be set. The most important options defining the role of the program are also set if the program name is tracker or ttracker. The default configuration of the tracker server startup is described in Section 3, and a more detailed description of all the program options is provided by the program itself using the -h or --help option (the Boost.Program options [8] library is used for parsing the options).
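For instance, the complete list of options can be printed directly on the camera module by:

/opt/tracker/tracker --help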
camera-control-settings=cam.cfg
logger-config=logger.cfg
use-circle-detector=1
circle-detector-tracking=1
grayscale=1
tracked-object-diameter=0.14
width=480
height=360
camera-parameters=cam4-480x360.m
camera-device=/dev/video0
fps=-1
server=1
tracker-server=1
tracker-server-port=9001
tracker-server-max-msg-size=1024
tracker-server-max-clients=2
tracker-server-close-camera-no-clients=1
Listing 20: An example of the tracker server configuration file (tracker-480x360.cfg)
An example of the tracker server configuration file is shown in Listing 20 and a description of the particular
options is as follows.
• System options:
– logger-config=logger.cfg - the configuration file for the log4cxx logger [9]; it allows controlling which types of messages are printed and how (e.g., to the standard output or into a file).
• Camera options:
– camera-device=/dev/video0 - a file name representing the camera device.
– width=480 - requested width of captured images.
– height=360 - requested height of captured images.
– camera-control-settings=cam.cfg - a file with particular camera control settings loaded
and set before capturing and processing images.
– camera-parameters=cam4-480x360.m - camera parameters found by the toolbox [1], the file
is directly the Matlab file produced by the toolbox. An example of the file is shown in Listing 21.
– grayscale=1 - work with grayscale images. The option causes the captured images to be in grayscale, which reduces the time for the image conversion and also slightly decreases the image processing time.
• Tracker / Detector settings
– use-circle-detector=1 - use the blob detector based on the B/W ring pattern.
– circle-detector-tracking=1 - if set, the previous position of the detected blob is used to speed up the segment searching process. If the blob is detected continuously, the tracking can significantly reduce the required processing time.
– fps=-1 - the desired number of processed images per second; for values less than zero, images are processed as fast as possible.
– tracked-object-diameter=0.14 - the outer diameter of the B/W ring pattern.
• Tracker server options
– server=1 - set the application to act as a server.
– tracker-server=1 - set the application to act as the tracker server.
– tracker-server-port=9001 - the port on which the server listens for clients (using UDP).
– tracker-server-close-camera-no-clients=1 - close the camera device if no client is connected (i.e., all clients have closed the connection to the server); this allows the tracker server to be used simultaneously with other applications accessing the camera device. If the option is set to false, the camera device stays open the whole time the tracker server is running. Notice that if the device is open, the red LED on the camera board is switched on.
– tracker-server-max-msg-size=1024 - the maximum message size considered for the communication with clients. (The default value should be sufficient in most cases.)
– tracker-server-max-clients=2 - the number of clients that can connect to the server and to which the information about the tracked object is sent. (The default value should be sufficient in most cases.)
% Intrinsic and Extrinsic Camera Parameters
%
% This script file can be directly executed under Matlab to recover the camera intrinsic and extrinsic parameters.
% IMPORTANT: This file contains neither the structure of the calibration objects nor the image coordinates of the calibration points.
%            All those complementary variables are saved in the complete matlab data file Calib_Results.mat.
% For more information regarding the calibration model visit http://www.vision.caltech.edu/bouguetj/calib_doc/

%-- Focal length:
fc = [ 401.944133237755523 ; 471.921886874863844 ];
%-- Principal point:
cc = [ 234.749987607683408 ; 203.779018171760271 ];
%-- Skew coefficient:
alpha_c = 0.000000000000000;
%-- Distortion coefficients:
kc = [ -0.414761516174968 ; 0.191135963054818 ; 0.000917014079415 ; 0.000757979352098 ; 0.000000000000000 ];
%-- Focal length uncertainty:
fc_error = [ 0.380510813779749 ; 0.471484705407820 ];
%-- Principal point uncertainty:
cc_error = [ 0.780420962614166 ; 0.760419558228052 ];
%-- Skew coefficient uncertainty:
alpha_c_error = 0.000000000000000;
%-- Distortion coefficients uncertainty:
kc_error = [ 0.002961741054041 ; 0.008264122674892 ; 0.000349479648818 ; 0.000261311125835 ; 0.000000000000000 ];
%-- Image size:
nx = 480;
ny = 360;
%-- Various other variables (may be ignored if you do not use the Matlab Calibration Toolbox):
%-- Those variables are used to control which intrinsic parameters should be optimized
n_ima = 22;                       % Number of calibration images
est_fc = [ 1 ; 1 ];               % Estimation indicator of the two focal variables
est_aspect_ratio = 1;             % Estimation indicator of the aspect ratio fc(2)/fc(1)
center_optim = 1;                 % Estimation indicator of the principal point
est_alpha = 0;                    % Estimation indicator of the skew coefficient
est_dist = [ 1 ; 1 ; 1 ; 1 ; 0 ]; % Estimation indicator of the distortion coefficients
Listing 21: An example of the camera parameters found by the Matlab toolbox [1].
5.2 Tracker Client
The tracker server uses a simple UDP protocol for transmitting the position of the tracked object. The position is encapsulated in the imr::STrackedObject structure (see Listing 22) and is serialized using XDR [12]. A client has to connect to the server, which then starts sending positions according to the configuration, e.g., the desired fps. Although the client address can be retrieved from the incoming request at the server side, such an address may be unreachable from the server because of a poorly configured network environment or a too complex system setup (e.g., a client can have multiple network interfaces, and an address resolving service can be unreachable from the camera module). Therefore, to guarantee a correct delivery address for the messages from the server, the client has to send its particular address and port in its initial message to the server. The server then uses this address for transmitting the messages; thus, the client has to read from a socket bound to the provided address and port.
/*
 * File name: tracked_object.h
 * Date:      2012/01/07 14:04
 * Author:    Jan Faigl
 */

#ifndef TRACKED_OBJECT_H
#define TRACKED_OBJECT_H

namespace imr {

   struct STrackedObject {
      bool valid;        // true if the blob info is valid
      float x;
      float y;
      float z;
      float pitch;
      float roll;
      float yaw;
      float pixel_ratio; // ratio of the expected and detected pixels of the
      float bw_ratio;    // ratio of the detected color (grays)
   };

} // end namespace imr

#endif

/* end of tracked_object.h */
Listing 22: A structure sent by the tracker server (tracked_object.h)
The address, including the port, also serves as a client identification. The identification is used when a connection between the server and a client is suddenly lost, e.g., due to the limited range of WiFi, and the client reconnects to the server. In such a case, the server does not know the client is unavailable, as the UDP transport protocol is used. So, once the client from the previously registered address reconnects to the server, its subscription request is accepted. The UDP protocol is also the reason why an explicit disconnect message has to be sent to the server, which can then eventually close (and release) the camera device.
All the communication primitives and marshaling / unmarshaling functions are part of the provided library. An example of the library usage is shown in Listing 23, where the source code of a simple application called tclient is presented.
/*
 * File name: tclient.cc
 * Date:      2012/01/06 07:27
 * Author:    Jan Faigl
 */

#include <iostream>
#include <cstring>
#include <cstdlib>

#include <sys/socket.h>
#include <netdb.h>
#include <netinet/in.h>
#include <unistd.h>
#include <signal.h>

#include "tracked_object.h"
#include "serialization.h"
#include "tracker_client.h"

/// ----------------------------------------------------------------------------
imr::CTrackerClient* client = 0;
bool quit = false;

/// ----------------------------------------------------------------------------
void terminate_handler(int sig) {
   if (client) {
      std::cerr << "Terminate client!" << std::endl;
      quit = true;
   }
}

/// ----------------------------------------------------------------------------
void print_object(unsigned long id, unsigned time, const imr::STrackedObject& obj) {
   std::cout << "Object id: " << id << " time: " << time
      << " valid: " << obj.valid
      << " coords: " << obj.x << " " << obj.y << " " << obj.z
      << " rotations: " << obj.pitch << " " << obj.roll << " " << obj.yaw
      << " pixel_ratio: " << obj.pixel_ratio << " bw_ratio: " << obj.bw_ratio
      << std::endl;
}

/// - main function -------------------------------------------------------------
int main(int argc, char* argv[]) {
   int ret = -1;
   const std::string host = argc > 1 ? std::string(argv[1]) : "localhost";
   const int server_port  = argc > 2 ? atoi(argv[2]) : 9001;
   const std::string myip = argc > 3 ? std::string(argv[3]) : "127.0.0.1";
   const int client_port  = argc > 4 ? atoi(argv[4]) : 9005;
   const bool BLOCKING_READ = argc > 5 ? atoi(argv[5]) : false;

   client = new imr::CTrackerClient(host, server_port, myip, client_port);
   signal(SIGINT, terminate_handler); /// register signal handler

   std::cerr << "INFO: "
      << "connect to the tracker " << host << ":" << server_port
      << " using address " << myip << ":" << client_port
      << std::endl;

   // allocate buffer for the message
   const int SIZE = 2 * (sizeof(imr::STrackedObject) + 2 * sizeof(unsigned long));
   char buf[SIZE];
   unsigned long id;
   unsigned long time;
   imr::STrackedObject obj;

   if (BLOCKING_READ) {
      //An example of blocked receiving
      if (client->connect()) {
         std::cerr << "INFO: Connected to the tracker server" << std::endl;
      } else {
         std::cerr << "WARN: Connection to the tracker server fail!" << std::endl;
      }
      quit = !client->isConnected();
      while (!quit) {
         std::cerr << "quit: " << quit << std::endl;
         int l = client->receive(&buf[0], SIZE, false);
         if (l > 0 and imr::unpack(&buf[0], SIZE, id, time, obj) > 0) {
            print_object(id, time, obj);
         } else if (l < 0) {
            std::cerr << "WARN: Error in message receive" << std::endl;
            usleep(100*1000);
         }
      } // end receiving loop
   } else {
      // An example of non-blocked receive with reconnection
      const int MSG_WAIT = 10; // in ms
      const int MAX_CONNECT_FAILS = 10;
      const int MAX_RECEIVE_FAILS = 10;
      int state = 0;
      int numConnectRetry = 0;
      int numReceiveRetry = 0;
      while (!quit) {
         const int prev_state = state;
         switch (state) {
            case 0: // connecting to the server
               if (client->connect()) {
                  numConnectRetry = 0;
                  numReceiveRetry = 0;
                  state = 1; // switch to receive the message
               } else {
                  numConnectRetry++;
                  std::cerr << "Connection to server fails " << numConnectRetry
                     << " times!" << std::endl;
               }
               if (numConnectRetry >= MAX_CONNECT_FAILS) { // decide what to do
                  std::cerr << "Connection to server fails too many times!" << std::endl;
                  quit = true; // exit the program
               }
               break;
            case 1: // try to receive message
               while (client->receive(&buf[0], SIZE, true) > 0) { // read all messages
                  if (imr::unpack(&buf[0], SIZE, id, time, obj) > 0) {
                     numReceiveRetry = 0;
                     print_object(id, time, obj);
                  }
               }
               if (numReceiveRetry < MAX_RECEIVE_FAILS) { // wait for a while
                  numReceiveRetry++;
                  usleep(MSG_WAIT * 1000);
               } else {
                  std::cerr << "Receive fails " << numReceiveRetry
                     << " times, try to reconnect" << std::endl;
                  client->disconnect();
                  state = 0;
               }
               break;
         } // end switch
         if (prev_state != state) {
            std::cerr << "Change state " << prev_state << "->" << state << std::endl;
         }
      }
      client->disconnect();
   }
   return ret;
}
/* end of tclient.cc */
Listing 23: A simple client of the tracker server.
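For example, assuming the default camera module address and tracker server port from Table 7 and Table 8, the client sketched above could be started as follows (the client address 10.10.40.1 and port 9005 are only illustrative):

./tclient 10.10.40.174 9001 10.10.40.1 9005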
5.3 tcapture - Capturing mode
The capturing program is called tcapture, and it provides many functions and modes, e.g., the tracker server described above. In this section, the particular program options related to the simple capturing of images are described. A basic configuration file is shown in Listing 24, and a detailed description of the options follows.
• Program options:
– logger-config=logger.cfg - the configuration file for the log4cxx logger [9].
– camera-device=/dev/video0 - a file name representing the camera device.
– camera-control-settings=cam.cfg - a file with particular camera control settings loaded
and set before capturing and processing images.
– fps=10 - the desired FPS; it should be set to a reasonable value, because the images are stored in the jpeg format, which reduces the required space but is time consuming. The recommended values are around 10, or less than 15 fps, for the resolution 480×360. The achievable number of captured images per second also depends on the jpeg quality, resolution, and the auto exposure settings, see Section 4.1.2.
• Image options:
– width=480 - requested width of captured images.
– height=360 - requested height of captured images.
– grayscale=1 - work with grayscale images. The option causes the captured images to be in grayscale, which reduces the time for the image conversion.
– jpeg-quality=75 - a compression parameter of the jpeg images; lower values are less computationally intensive.
• Image saving options:
– image-directory=images/ - the output directory into which images are saved.
– image-pattern=%09u.jpg - defines the name of the image files (following printf(3), see man 3 printf); the input variable is the number of the captured image or the time in milliseconds from the start of the capturing (if enable-capture-timestamp is 1). An example of the resulting file names is given after Listing 24.
– enable-capture-timestamp=1 - determines whether the image names are derived from the number of captured images or from the time.
logger-config=logger.cfg
camera-device=/dev/video0
camera-control-settings=cam.cfg
server=false
fps=10
width=480
height=360
jpeg-quality=75
grayscale=1
image-pattern=%09u.jpg
image-directory=images/
#using timestamping in milliseconds
enable-capture-timestamp=1
Listing 24: A configuration for the capturing program.
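For illustration, with the configuration above (enable-capture-timestamp=1 and the %09u pattern), the captured images are stored under names derived from the capture time in milliseconds, e.g. (the particular timestamps below are only illustrative):

images/000000104.jpg
images/000000204.jpg
images/000000305.jpg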
In addition to the presented simple capturing mode, the tcapture program supports another mode suitable for evaluating the performance of the detectors being developed. In this alternative mode, the images can be stored in a raw format, and information about the detected segments and blobs can be stored in additional files. The mode is mainly for development and is out of the scope of this report; therefore, it is not described here. A reader interested in this mode is referred to the source code.
5.4 Applications for Remote Access to the Camera
In this section, a brief description of the applications allowing remote access is presented. These applications are mainly for testing and development purposes, and therefore only their basic configurations are presented. The main motivation for these applications is to provide an easy way of developing new algorithms and testing them with real images captured by the Overo Caspa camera. Moreover, they also allow tuning the camera control settings and creating the configuration file described in Section 4.
server=true
port=9009
poll-timeout=200
default-buffer-size=8192
camera-control-settings=cam.cfg
use-circle-detector=1
number-blobs=2
rgb-cube-size=64
The applications follow a client/server architecture, where the server is the tcapture program described above running in the server mode. An example of the configuration file is presented in Listing 25. The purpose of the already described options is the same as above, as it is the same application; therefore, only the newly introduced options are described in the following list.
• System options:
– server=true - start a server.
– port=9009 - port at which the server is listening (the transport protocol is TCP).
– poll-timeout=200 - a period used for checking new incoming messages from a client.
– default-buffer-size=8192 - an initial buffer size used for transmitted messages. (The default
value should be sufficient in most cases.)
• Detectors’ options (selected):
– use-circle-detector=1 - use the blob detector based on the B/W ring pattern (otherwise the color blob finder is used).
– number-blobs=2 - the number of considered blobs for the color segmentation.
– rgb-cube-size=64 - a dimension of the RGB cube used in the color segmentation.
The client application is called tcam and has many options, see Listing 27. A part of the options is related to the blob finders; in particular, the current version of tcam contains two blob finders: the color based and the B/W ring pattern based detectors. The main features provided by the tcam application can be summarized as follows.
• Testing blob finder algorithms using previously saved images.
• Testing performance of blob finders.
• On-line capturing of images using the camera module.
• On-line adjustment of the camera control settings and creation of the settings file (cam.cfg).
• On-line testing of the blob finder algorithms running at the camera module.
cam-server=cam1
cam-port=9009
process-on-refresh=1
width=640
height=480
jpeg-quality=85
image-directory=images
image-pattern=%04d.jpg
use-circle-detector=1

Listing 26: Selected program options of the tcam application.
The program options are pretty much similar to those of the tcapture application (as most of them are related to the particular blob finder algorithms); therefore, only selected ones are shown in the example configuration file in Listing 26.
• cam-server=cam1 - a host name of the tcapture application running in the server mode.
• cam-port=9009 - a port of the tcapture server.
• process-on-refresh=1 - if enabled, the selected blob finder is called for each new image (read from the directory or received from the camera module).
• image-directory=images/ - the input/output directory where images are stored. A list of images present in the directory is created at the tcam startup; these images can then be used for testing the image processing algorithms. Images newly received from the tcapture server are stored into this directory in an incremental way (using a counter of the captured images).
• image-pattern=%04d.jpg - defines the name of the image files (following printf(3), see man 3 printf) that the application looks for at startup. The same pattern is used to store the images.
Although the tcam application can be used for benchmarking the algorithms being developed, its main feature is probably the interactive mode. In this mode, a window containing an image (loaded from the given image directory or captured from the camera module) is shown. From a user point of view, the most important part is the interactive control of the application based on keyboard commands. A summary of the keys / controls follows; each key is represented by its usual character.
• Main controls:
– ’q’ - quit the application.
– ’b’, ’n’ - load previous / next image from the image directory.
– ’p’ - process the current image using the selected blob finder.
– ’l’ - perform blob finding step on the current image at the camera module, only the resulting
found segments are shown in the window (the image is not updated - received from the camera
module).
• Color based blob finder related controls:
– ’r’ - reset RGB cube of the color blob finder.
– ’s’ - save the current RGB cube to a file specified in the cube-rgb program option.
– ’k’ - download the current RGB cube to the camera module.
– mouse click - add the color at the cursor position to the RGB cube.
• Camera server related controls:
– ’c’ - connect to the tcapture server.
– ’o’ - open the camera at the server (the connection has to be already established).
– ’i’ - request information from the camera server.
– ’f’ - request an image from the camera in the jpeg format, the received file is stored in the
image directory.
– ’g’ - request a raw image from the camera, the received file is stored in the image directory in
the bmp format.
– ’1’, ’2’ - increase / decrease the camera contrast by 1 (and by 10 if the shift key is pressed).
– ’3’, ’4’ - increase / decrease the camera brightness by 1 (and by 10 if the shift key is pressed).
– ’5’, ’6’ - increase / decrease the camera exposure by 1 (and by 10 if the shift key is pressed).
– ’7’, ’8’ - increase / decrease the camera gain by 1 (and by 10 if the shift key is pressed).
– ’9’ - enable / disable the camera auto exposure.
– ’0’ - enable / disable the camera auto gain.
– ’e’ - change the color mode of the camera (three values are supported V4L2 COLORFX NONE,
V4L2 COLORFX BW, and V4L2 COLORFX SEPIA).
– ’d’ - request the current control settings of the camera.
– ’z’ / ’x’ - set / store the camera control settings from / to the file specified in
camera-control-settings program options (at the client side).
– ’-’ / ’=’ save / load the camera control settings at the server side using the file specified in
the camera-control-settings program options of the tcapture program.
The full list of the control keys can be found in the cam_app.cc file, in the definition of the CCamApplication::start(void) method of the tcam application sources.
Examples of the usage of the tcam application are shown in Fig. 9. In each sub-figure, the top right text window is a remote session at the camera module (the output of the tcapture program in the server mode). The bottom text window is the output of the tcam application, and the image on the left is an actual image with the visualized results of the found segments forming the detected blob.
Figure 9: Examples of tcam usage: (a) a received image from the camera; (b) locally detected ring pattern from the newly received image from the camera; (c) detected blob after changing brightness and contrast of the camera; (d) detected blob received from the camera; (e) selected pixels forming the RGB cube for the color based blobfinder; (f) detected color blob received from the camera.
5.5 Creating Camera Control Settings File
The interactive mode of the tcam application can be used to create a file with the camera control settings for the capturing or tracking usage of the camera module. The adjustment of the control settings has to be done according to the current light conditions; therefore, the on-line visualization of the captured images together with a detected blob is handy.
First, the tcapture server has to be started. This can be done by the following sequence of commands,
as the needed files are already prepared on the camera module.
1. ssh cam1
2. ./tcapture
Similarly the tcam application can be started with the provided configuration file. Then, the following
sequence of control commands (keys) can be performed to create a new cam.cfg file.
1. ’c’ - connect to the server.
2. ’o’ - open the camera.
3. ’f’ - receive images from the camera to check if the camera is working properly and to see what
images are provided by the current settings.
4. adjust the camera control settings using the keys described above.
5. Press ’-’ to save the current camera settings to the file, i.e., into cam.cfg file at the server (using
the default configuration).
6. Quit the server.
7. Place the created file into appropriate directory at the camera module within the remote shell session,
e.g., cp cam.cfg /opt/cam.cfg.
5.6 Taking Images for the Camera Calibration
The tcam application can also be used for creating a set of images needed to obtain intrinsic and extrinsic
camera parameters by the following procedure.
1. Start the tcapture server at the camera using a remote ssh session.
2. Start the tcam application and connect to the camera by pressing ’c’ and ’o’ keys.
3. Adjust the camera control settings if necessary.
4. Place the testing pattern at the appropriate position and check the position using the jpeg image (by pressing the ’f’ key).
5. Save the image in the bmp format by pressing the ’g’ key (this is usually slower due to a larger amount of data than for the jpeg images transmitted in the previous step).
6. If all desired calibration images are retrieved, exit the application and close the remote session.
Otherwise repeat the steps 4 and 5 as needed.
6 Future Work
Although the camera module for the COLOS project has been significantly improved in its second revision,
there is still potential for additional improvements. They are summarized in the following list.
• Stability of the mt9v032 driver in the BA10 mode should be improved, which would allow considering the 10-bit values of the captured image.
• The conversion from YUV to RGB or grayscale (alternatively from BA10 to RGB or grayscale) can be done using the image signal processor, which can slightly reduce the required computational time.
• Automatic exposure settings according to local neighborhood of the detected blob.
• Simultaneous tracking of more B/W patterns.
• Fine tuned model of the radial distortion.
• Distinguishing of more ring patterns using color.
• A client to the tracker server as a ROS node, which would allow a simple distribution of information about a tracked object to other parts of a complex robotic system.
• The tracker server itself can be implemented as a ROS node; however, it requires ROS libraries for
the Overo OMAP3503 processor.
A Specifications
Table 6: Hardware parameters

Hardware Specifications
CPU:                       OMAP3503 @ 600 MHz running armv7l GNU/Linux 2.6.34
CMOS sensor:               Aptina MT9V032, 752×480 @ 60 Hz
Lens:                      F=2.8, FoVmin 42◦, IR cut filter
Overo Board:               58 mm × 17 mm × 5 mm, weight 6 g (including µSD card)
Camera Board:              39 mm × 26 mm × 25 mm, weight 22.9 g (with lens)
Voltage Regulator Board:   68 mm × 23 mm × 18 mm, weight 18 g (with cables)
Minicom Board:             60 mm × 19 mm × 7 mm, weight 6 g
Voltage Regulator Effic.:  89 % (cpu idle), 84 % (cpu full load)
Power Consumption:         ≤ 2.6 W (2.39 W cpu idle, 2.55 W tracking at full speed)

Table 7: Access parameters

Access Specifications
Login:            root (password-less)
Serial Console:   115200 bps, 8 data bits, 1 stop bit, none parity
WiFi:             802.11b/g
WiFi SSID:        COLOS
WiFi mode:        adhoc, open without encryption
WiFi IP address:  10.10.40.174
Remote shell:     ssh

Table 8: Tracker parameters

Tracker Specifications
B/W Ring Pattern:          Outer diameter 14 cm, inner diameter 8.4 cm (width 2.8 cm)
Image Resolution:          480×360 (grayscale)
Field of View:             33◦ × 67◦ (considering blob dimensions and radial distortion)
Images per Second:         Max 52 FPS, Min 18 FPS
Pattern Distance Range:    0.2 - 3.5 m
Expected Distance Error:   units of cm
Average Distance Error:    ≤ 4.1 %
Average Repeatability:     ≤ 0.5 cm
Tracker Server Address:    10.10.40.174:9001 (UDP/IP)
Tracker FPS restriction:   none (process as fast as possible)
B Camera Module Hardware
Figure 10: Connectors of the Overo minicom board.
Figure 11: Schematic of the voltage regulator and battery undervoltage switch.
Figure 12: Circuits of the voltage regulator and battery undervoltage switch: (a) top, (b) bottom.
C Applications
tcam testing application 0.5
General options:
  -h [ --help ]                         produce help message
  -c [ --config ] arg (=./tcam.cfg)     configuration file
  -l [ --logger-config ] arg            logger configuration file
CamApp options:
  --width arg (=480)                    default width
  --height arg (=360)                   default height
  --cam-server arg (=localhost)         hostname of the cam server
  --cam-port arg (=9009)                port of the cam server
  --image arg                           default image loaded
  --use-image-size arg (=0)             adjust windows size from the loaded image
  --image-directory arg (=jpg)          image directory to store files captured from the cam
  --image-pattern arg (=%04d.jpg)       image pattern for captured images
  --out-image-directory arg             output image directory to store processed files
  --out-jpeg arg (=1)                   Enable/disable jpeg output format (bmp is used otherwise)
  --cube-rgb arg                        file to load/store the rgb cube
  --points-rgb arg                      list of rgb values (r g b - space separated) used for
                                        adding to the cube using the hsv ranges
  --benchmark arg (=0)                  Enable/disable benchmark mode
  --benchmark-draw-image arg (=0)       Enable/disable drawing result image in benchmark mode
  --benchmark-result-log arg            Filename to store benchmark results log
  --benchmark-dataset arg               Name of the dataset used in the benchmark results log
  --benchmark-name arg                  An additional name used in the benchmark results log
  --benchmark-max-images arg (=-1)      If >= 0 only given number of images is processed
  --camera-control-settings arg         File name to store/save camera settings
  --overo-bench-trials arg (=100)       Number of processed images on overo for estimation of the fps
  --overo-process-trials arg (=100)     Number of processed images on overo for estimation of the fps
  --jpeg-quality arg (=85)              JPEG compress quality
  --grayscale arg (=0)                  Enable/disable gray scale
  --use-circle-detector arg (=1)        Enable/disable circle detector, if not set, simple color
                                        blobfinder is used
  --process-on-refresh arg (=1)         Initial value of the process on refresh variable
  --tracked-object-diameter arg (=0.140000001)
                                        Diameter of the tracked object used in transformation
  --full-unbarrel arg (=0)              Enable/disable full unbarrel in the transformation of the coords
  --camera-parameters arg               File with the camera parameters in format required by the
                                        CCameraTransformation
  --max-segments arg (=10000)           Maximal number of the recognized segments (blobs)
  --number-blobs arg (=1)               No. of detected blobs
  --rgb-cube-size arg (=8)              default size of the rgb cube!
  --hsv-delta-h arg (=5)                range of h value of the hsv model for adding color ranges
  --hsv-delta-s arg (=3)                range of s value of the hsv model for adding color ranges
  --hsv-delta-v arg (=3)                range of v value of the hsv model for adding color ranges
  --hsv-neigh-delta-h arg (=5)          range of h value of the hsv model for adding color ranges
  --hsv-neigh-delta-s arg (=3)          range of s value of the hsv model for adding color ranges
  --hsv-neigh-delta-v arg (=3)          range of v value of the hsv model for adding color ranges
  --circle-detector-tracking arg (=1)   Enable/disable tracking, i.e., usage of the previous position
  --detector-implementation arg         For debugging purposes
  --circle-detector-max-segments arg (=1000)
                                        Maximal number of segments
  --circle-detector-threshold-init arg (=203)
                                        Initial value of the threshold
  --circle-detector-threshold-max arg (=768)
                                        Max value of the threshold
  --circle-detector-threshold-step arg (=60)
                                        Step value of the threshold
  --circle-detector-max-failed arg (=10)
                                        Max number of detection fails before threshold update
  --circle-detector-min-size arg (=20)  Min size of the blob
  --circle-detector-center-distance-tolerance arg (=15)
                                        Value of the center distance tolerance
  --circle-detector-circular-tolerance arg (=0.300000012)
                                        Value of the circular tolerance
  --circle-detector-ratio-tolerance arg (=0.5)
                                        Value of the ratio tolerance
  --circle-detector-areas-ratio arg (=0.5625)
                                        Discs area ratios
  --circle-detector-outer-area-ratio arg (=0.50265485)
                                        Outer disc area ratio
  --circle-detector-inner-area-ratio arg (=0.785398185)
                                        Inner disc area ratio
Listing 27: All program options of the tcam application.
References
[1] Camera calibration toolbox for matlab. http://robots.stanford.edu/cs223b04/JeanYvesCalib/index.html, [Cit.: 2012-01-17].
[2] Gumstix. http://www.gumstix.com, [Cit.: 2012-01-16].
[3] J. Heikkila and O. Silven. A four-step camera calibration procedure with implicit image correction. In IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR), pages 1106-1112, June 1997.
[4] Jan Faigl, Martin Saska, Matt Turpin, Libor Přeučil, and Vijay Kumar. Project COLOS - Technical Report 1.1 - Camera Module for Onboard Relative Localization in UAV Swarms. Technical report, Czech Technical University in Prague, University of Pennsylvania, October 2011. Tech notes from work performed at UPENN.
[5] OpenEmbedded. http://www.openembedded.org/wiki/Main_Page, [Cit.: 2012-01-17].
[6] jerm. http://www.bsddiary.net/jerm, [Cit.: 2012-01-17].
[7] Minicom. http://alioth.debian.org/projects/minicom, [Cit.: 2012-01-17].
[8] Boost.Program options. http://www.boost.org/doc/libs/1_48_0/doc/html/program_options.html, [Cit.: 2012-01-18].
[9] log4cxx. http://logging.apache.org/log4cxx, [Cit.: 2012-01-18].
[10] V4L2. http://linuxtv.org/wiki/index.php/Main_Page, [Cit.: 2012-01-18].
[11] Caspa camera boards. http://wiki.gumstix.org/index.php?title=Caspa_camera_boards, [Cit.: 2012-01-19].
[12] M. Eisler. XDR: External Data Representation Standard. RFC 4506 (Standard), May 2006.