Integrated force and distance sensing using
elastomer-embedded commodity proximity sensors
Radhen Patel and Nikolaus Correll
Department of Computer Science, University of Colorado, Boulder, CO 80309, USA
Abstract—We describe a combined force and distance
sensor using a commodity infrared distance sensor
embedded in a transparent elastomer with applications
in robotic manipulation. Prior to contact, the sensor
works as a distance sensor (0–10 cm), whereas after
contact the material doubles as a spring, with force
proportional to the compression of the elastomer (0–
5 N). We describe its principle of operation and design parameters, including polymer thickness, mixing
ratio, and emitter current, and show that the sensor
response has an inflection point at contact that is
independent of an object’s surface properties. We then
demonstrate how two arrays of eight sensors, each
mounted on a standard Baxter gripper, can be used
to (1) improve gripper alignment during grasping, (2)
determine contact points with objects, and (3) obtain
crude 3D models that can serve to determine possible
grasp locations.
I. Introduction
Grasping and manipulation remain hard challenges in
robotics. After identifying an object’s pose, the robot’s
end-effector needs to be controlled to create a sufficient
number of constraints for successful pick-up while maintaining the object’s pose until all desired contact points are
reached, thereby preventing the object from moving out of
the end-effector's reach. The ability to use in-hand sensing
to better align with an object and to stop exerting forces once
contact is made could provide these constraints
without affecting the object's pose. In [15], touch sensors
are used to determine whether a grasp is successful, but
cannot be used to improve the grasp prior to contact.
Exploiting environmental constraints such as walls [11] or
a bowl [14] has been shown to increase grasping success
in specific scenarios, but planning such a motion requires
precise knowledge of the environment’s geometry and
would benefit from active sensing to determine whether an
object has reached a desired pose. Using 3D sensing suffers
from uncertainty, making reliable grasps very difficult,
regardless of the type of end-effector used.
Yet commercially successful, that is, widely deployed,
in-hand sensors are virtually non-existent, as
they are difficult to manufacture and expensive. At the
same time, the algorithmic foundations for reactive grasp
planning are only sparsely developed, with most of the
focus on the sense-plan-act model that requires precise
sensing and actuation. This has been nicely illustrated at
the 2015 Amazon Picking Challenge, where only one of 25
teams used tactile feedback for contact sensing [10].
Fig. 1. A Baxter parallel gripper enhanced by two arrays of combined
force and proximity sensors. Seven sensors are spaced along each
finger’s contact area, and one sensor is mounted at each tip, enabling
object detection and force control.
Motivated by this apparent gap both in available hardware and applications for in-hand sensing, we present a
simple and low-cost tactile sensor that combines force
and distance measurements. The proposed sensor is simple to manufacture and easy to integrate with existing
hardware. The sensor consists of a commodity digital
infrared distance sensor that is embedded in a soft polymer, which doubles as a spring for force measurements
based on Hooke’s law. We also show how the strong
dependence of infrared-based sensors on surface properties
can be overcome by exploiting the discontinuity that the
elastomer coating introduces into the sensor response.
After describing other sensors that are closest to the
work presented here in Sec. I-A, we explain the sensor’s
manufacturing and principle of operation in Sec. II. The
impact of various design parameters is shown in Sec. III,
and calibration results are shown in Sec. IV. We then
show how the proposed sensor can be arranged into
an array and installed on a parallel gripper of Rethink
Robotics' Baxter to improve grasping performance during
selective tasks in Sec. V.
A. Related work
Tactile sensing is widely considered an essential capability for efficient grasping and manipulation [12]. By touching an object, it is possible to measure contact properties
such as contact forces, contact position, and possibly even
gather information about an object’s surface properties.
At the same time, sensors need to be rugged enough
to withstand chemical and mechanical abrasion. Various
types of tactile sensors have been investigated based on
strain gauges, piezoelectric, capacitive, impedance, mechanical, and magnetic effects [12]. For example, a silicon
piezoresistive sensor measures both compressive and shear
forces at the skin-object interface in [36]. A capacitive
tactile sensor reached a resolution of 0.05 N over the
range of 10 N for forces in both normal and tangential
directions in [38]. A soft tactile sensor, comprising multiple
capacitors embedded in polydimethylsiloxane (PDMS),
was reported with a minimum detectable force of 10 mN
in [27]. Fiber Bragg grating-based sensors enable small
size and high sensitivity. A soft fiber optic sensor based
on polymer fiber Bragg gratings is embedded into a soft
PDMS film in [37], for simultaneous measurement of shear
and normal stresses. A fiber optical sensor for a multifingered robotic hand was designed in [23] for contact
force and contact location sensing. Galinstan-based strain
sensors have been embedded into soft robotic fingers to
measure curvature and — in conjunction with pressure —
detect the presence of objects [16] and adjust a grasp if
necessary [17]. Finally, simple microphones can be used
to measure vibrations that allow identifying textures [22].
How to use all of this tactile information during control is
a topic of ongoing research [13, 4].
Part of the manufacturing and signal-processing challenges
that all of the above sensors share to some
extent can be countered by using commodity digital
sensors. Barometric pressure sensors covered with flexible
polymers can provide contact and pressure sensing [35].
Similarly, optical proximity sensors have been integrated
into deformable rubber domes [24] and urethane foam
[20, 34, 18], which both alter light reflection upon compression and shear, a phenomenon known as frustrated
reflection. Finally, [26] combines off-the-shelf proximity
sensors embedded in elastomers with capacitive sensing
to perform directional force sensing.
In addition to contact/force, equipping robotic hands
[2, 21] or skin [32] with distance sensors is attractive
to improve a grasp during the approach phase. Optical
proximity sensors were integrated inside the fingertips of a
Barrett Hand in order to perform reactive online grasping
[21]. In [29], the fingertips of the robot manipulator TUM-Rosie were equipped with proximity sensors to measure
the distance to objects.
There also exist approaches that use both distance and
force simultaneously. In [19], capacitive sensors are used
to measure both proximity of (conductive) objects as well
as pressure after contact. In [31], optical proximity and
capacitive pressure sensing are combined to realize a multimodal tactile sensing skin. In this paper, we demonstrate
how both distance and force sensing can be achieved
using a single off-the-shelf sensor, which allows the
device to be small enough for fingertip operation, simple to
manufacture, and low-cost, while providing performance
that is comparable to the above approaches.
Fig. 2. Schematic sensor design illustrating key quantities. Infrared
lobes are reflected at the interface of PDMS/air due to Fresnel
reflection, as well as from close-by objects. Forces lead to deformation
of the PDMS that reduces its width d by ∆x.
II. Sensor design
Infrared sensors are strongly non-linear and their response depends on the surface properties of the sensed
objects and the angle of incidence. They are sensitive
to cross-talk from other sensors or infrared light in the
environment. Increasing use in consumer electronics such
as smartphones has led to a new generation of devices
that mitigate cross-sensitivity by integrating sensor and
emitter with digital signal processing.
We have chosen the integrated proximity and ambient
light sensor VCNL4010 (Vishay Semiconductors). This
device has a miniature 3.95 × 3.95 × 0.75 mm³ package
which combines an infrared emitter and a PIN photodiode
for proximity measurement, an ambient light sensor, a signal-processing IC, a 16-bit ADC, and an inter-integrated circuit
(I²C) communication interface, while requiring only a
few external components (four filter capacitors). The chip
allows setting a large variety of parameters, the most
important being the emitter current (20mA to 200mA in
increments of 10mA), and the carrier frequency in the
range from 390.625 kHz to 3.125 MHz in four steps.
The ability to select the frequency of each sensor enables arranging sensors in opposite pairs, as required
on a robotic gripper, without interference. The emitter
current should not be confused with the actual power
consumption, which is less than 4 mA when performing
250 measurements per second at full (200 mA) power, and
on the order of µA when taking 10 or fewer measurements per
second.
Sensors can be arranged in groups of eight using an I²C
multiplexer (TCA9548A, Texas Instruments). This chip has
a programmable 3-bit address, allowing us to create arrays
of up to 8 × 8 sensors. At 100 kHz I²C bus frequency, a
single measurement requires 1470 µs including communication, allowing an 8 × 8 array to be read at 10 Hz and a strip
of eight at 85 Hz.
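For illustration, the following sketch shows how such an array could be polled from a host computer over I²C. It uses Python with the smbus2 library and register addresses taken from the VCNL4010 datasheet; the system described in Sec. V polls the sensors from an Arduino Uno instead, so this listing is an illustrative assumption rather than the firmware used here:

# Poll eight VCNL4010 sensors behind a TCA9548A I2C multiplexer (illustrative only).
from smbus2 import SMBus

MUX_ADDR = 0x70          # TCA9548A with all address pins tied low
VCNL_ADDR = 0x13         # fixed VCNL4010 I2C address
REG_COMMAND = 0x80       # command/status register
REG_LED_CURRENT = 0x83   # IR emitter current, in units of 10 mA
REG_PROX_HI = 0x87       # proximity result, high byte (low byte at 0x88)

def select_channel(bus, channel):
    # Enable exactly one of the eight multiplexer channels.
    bus.write_byte(MUX_ADDR, 1 << channel)

def read_proximity(bus):
    # Trigger a single on-demand measurement and return the 16-bit count.
    bus.write_byte_data(VCNL_ADDR, REG_COMMAND, 0x08)               # prox_od bit
    while not (bus.read_byte_data(VCNL_ADDR, REG_COMMAND) & 0x20):  # wait for prox_data_rdy
        pass
    hi = bus.read_byte_data(VCNL_ADDR, REG_PROX_HI)
    lo = bus.read_byte_data(VCNL_ADDR, REG_PROX_HI + 1)
    return (hi << 8) | lo

with SMBus(1) as bus:
    for ch in range(8):
        select_channel(bus, ch)
        bus.write_byte_data(VCNL_ADDR, REG_LED_CURRENT, 4)          # 40 mA emitter current
        print(ch, read_proximity(bus))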
To enable force measurements, sensors are embedded in
a thin layer of PDMS (Dow Corning Sylgard 184), see also
Figure 2. PDMS is a widely used silicone elastomer, whose
mechanical and optical properties have been widely studied [6, 7, 9]. It is simple to manufacture and cheap, while
providing good transparency and mechanical properties
such as resistance to chemical and mechanical abrasion.
Figure 1 shows two sensor arrays mounted to the parallel
gripper of a “Baxter” robot.
A. Principle of operation
The integrated infrared emitter of the VCNL4010 has
a peak wavelength of 890 nm. The light from the emitter
passes through a thin layer of PDMS. It is then reflected
by nearby objects and received by the photo-receiver. The
amplitude and phase of the light vary as a function of the
distance to the surface, its orientation, color and texture.
Due to the quadratic decay of light amplitude with
distance, the sensor has its highest resolution right after
its minimum range of 0.5mm. It is therefore possible to
measure small variations on the order of hundredths of a
millimeter. We exploit this effect by measuring the elastic
deformation that occurs when an object is pressed against
the sensor. As the elastomer can be approximated by
a spring with a constant Young’s modulus E (for light
pressure), the force is given by
F ≈ (EA/d) ∆x    (1)
with A the contact area over the sensor, d the width of
the PDMS layer and ∆x the measured deformation. Note
that the sensor area is constant and smaller than the
actual contact area of typical objects. Yet, the value of F
is approximate as PDMS cannot be infinitely compressed
and eventually changes its density and thereby absorption
properties.
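As a worked example of (1), the following Python snippet converts a measured deformation into a force estimate; the values of E, A, and d are illustrative assumptions, not calibrated constants from this paper:

# Worked example of (1): force from the measured compression of the PDMS layer.
E = 1.8e6    # Young's modulus of PDMS in Pa (assumed; depends on mixing ratio and curing)
A = 25e-6    # assumed contact area over the sensor in m^2 (a 5 mm x 5 mm patch)
d = 6e-3     # PDMS thickness in m

def force_from_deflection(delta_x_m):
    # F = (E*A/d) * delta_x, valid while the PDMS behaves as a linear spring.
    return E * A / d * delta_x_m

print(force_from_deflection(100e-6))  # a 100 um compression gives roughly 0.75 N here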
Let the emitted light intensity be I0 and the measured
reflected intensity from an object be I. Let the thickness
of the rubber be d and the distance to the object x, see
also Figure 2. Depending on the index of refraction of the
rubber material, a fraction R of the light will be reflected
from the interface between rubber and air, a fraction
κ will be scattered, and a fraction α will be reflected
at the target surface. Assuming that the light intensity
decays quadratically with distance, we can approximate
the amount of returned infrared as
Ix>0 ≈ I0 (1 − R) α/(d + x)^2 + I0 R/d^2 − κI0.    (2)
The reflection at the PDMS/air interface can be calculated using the Fresnel equation [5], which reduces to
R = ((n1 − n2)/(n1 + n2))^2    (3)

for normal incidence. With the refractive index of PDMS
n1 ≈ 1.41 and that of air n2 ≈ 1, around 2.9% of the light
gets reflected from the internal surface of the PDMS as
well as on the outside on the return path.
This formalism helps us to better understand certain
edge cases. First, when d ≪ x, the light intensity at the
receiver is dominated by I0 R/d^2, which leads to saturation of
the sensor if I0 is too large or d is too small. The width d
of the PDMS therefore governs the maximum current at
which we can operate the sensor and thereby the maximum
attainable range. At the same time, the width governs the
maximum allowable ∆x and thereby the maximum force
and its resolution that the sensor can measure.
Fig. 3. (a) Sensors are mounted on the parallel gripper, (b) placed
in an acrylic mold and (c) cured.
Once the object touches the sensor surface, i.e. x = 0,
(2) reduces to a constant, which is only a function of material properties. After touching, the PDMS gets compressed
by ∆x ≈ dF/(EA), leading to

Ix<0 ≈ I0 α/(d − dF/(EA))^2 − κI0.    (4)
Note that (4) still depends on the surface reflectance α,
which therefore needs to be known for accurate force
measurements. We observe, however, that x = F = 0 is
an inflection point of the signal that can potentially be
detected in the sensor's response, therefore allowing contact
to be detected independently of surface properties. Indeed,
equations (2) and (4) yield the same values for x = 0
and F = 0, respectively, but have different slopes. We
experimentally show further below that this is the case
and can be detected in the sensor readings.
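The following Python sketch evaluates the model of (2)-(4) for assumed values of I0, α, and κ (none of which are reported here) and illustrates the change of slope at x = 0 that we use to detect contact:

# Evaluate the reflection model of (2)-(4) around the contact point x = 0.
import numpy as np

n1, n2 = 1.41, 1.0
R = ((n1 - n2) / (n1 + n2)) ** 2      # Fresnel reflection at the PDMS/air interface, ~2.9%
I0, alpha, kappa = 1.0, 0.8, 0.05     # assumed emitter intensity, target reflectance, scattering
d = 6e-3                              # PDMS thickness in m

def returned_intensity(x):
    # x > 0: distance to the object (Eq. 2); x < 0: compression of the PDMS by -x (Eq. 4).
    if x >= 0:
        return I0 * (1 - R) * alpha / (d + x) ** 2 + I0 * R / d ** 2 - kappa * I0
    return I0 * alpha / (d + x) ** 2 - kappa * I0

# The slope of the response changes at x = 0, which is the feature used to detect contact
# independently of the (unknown) reflectance alpha.
for x in np.linspace(-1e-3, 1e-3, 5):
    print(x, returned_intensity(x))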
B. Manufacturing
The infrared sensor requires few external components
(three capacitors). Encapsulation of the sensor in PDMS can
be readily accomplished by mounting the sensors on a
printed circuit board (PCB), placing it in a mold, and
pouring the two-component liquid polymer in it (Figure
3). Painter's tape has been used to ensure that the PDMS
does not leak.
In order to avoid air being trapped at the interface
between PDMS and the sensor, we degas the assembly in
a vacuum chamber. The PDMS is then cured in an oven at
70°C for 20 minutes. This forms a robust and compliant
rubber contact surface for grasping and manipulation. To
improve repeatability of the optical properties of amorphous PDMS, it is advised to purify the raw materials
before the mixing process to avoid extrinsic losses, e.g.
by particle scattering. The base material and coupling
agents should thus be filtered using a mixed cellulose ester
membrane filter with a pore size of 0.2 µm. The entire process
of preparing a pair of these sensors takes around 5 hours.
III. Design parameters
To experimentally characterize the performance of the
proposed tactile sensor, we first characterized the response
of an individual sensor and then the sensing capabilities
of a complete array by installing it on a parallel gripper.
Figure 4 shows the experimental setup to test and characterize the performance of these sensors. The setup is
designed to allow testing of both an
individual sensor and complete arrays in their proximity
and force regimes. The setup consists of a 0.15 × 0.13 m²
screen that is mounted vertically on a sliding rod with a
precise linear control. A digital force gauge (Shimpo FGV10XY) is mounted horizontally on the opposite side of the
screen to measure the force exerted on the sensor.
Fig. 4. Experimental setup to conduct proximity and force measurements.
Considering the air-PDMS surface as the screen’s zero
position, the screen is moved in discrete steps towards
the sensor while measuring distance. After the screen
touches the surface of the PDMS, the screen is rigidly
fixed at a position where the force gauge reads 1 N. A total of
five readings are then taken at intervals of 1 N by further
pressing the screen onto the sensor. Note that this
approach allows us to seamlessly measure both distance
and force as well as experimentally detect contact, which
is defined by the force sensor changing from zero Newton
to a positive value.
We used this setup to study the effects of
various fabrication and operating parameters on the sensor response, in particular the thickness
of the PDMS film, the mixing ratio of base to curing agent of
the pre-polymer, and the emitter current.
A. Current
We recorded single-point measurements at distances
from 0 to 6cm in increments of 1cm, as well as force
from 0N (0cm) to 5N in increments of 1N for current
values from 40mA to 200mA in increments of 40mA
(Figure 5). PDMS with mixing ratio 8:1 has been applied
to a thickness of 6mm above the sensor. Results show
saturation of the sensor for distances below 1cm at current
values exceeding 80mA due to Fresnel reflection inside the
PDMS. At 80mA, the sensor saturates at less than 2N of
force, whereas a 40mA setting allows measuring across
the full range from 0 to 5N. Changing the current therefore
allows trading range for resolution. At 80mA, distances
from 0 to 6 cm result in ADC values from 5086 to 45789, or
40703 distinct values. At 40mA, the same range results in
only 20245 values, corresponding to resolutions of 1.47µm
and 2.96µm, respectively. While not relevant given the accuracy of
the sensor being four orders of magnitude worse than its
resolution, in particular at longer distances, reducing the
current allows extending the sensor’s range from around
2N to 5N at an ADC resolution of 80µN and 285µN,
respectively.
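The quoted distance resolutions follow directly from dividing the 0-6 cm range by the number of ADC counts it spans, as the following short computation illustrates:

# Distance resolution in Sec. III-A: physical range divided by the ADC counts spanning it.
counts_80mA = 45789 - 5086    # 40703 counts over 0-6 cm at 80 mA
counts_40mA = 20245           # counts over the same range at 40 mA
print(60_000 / counts_80mA)   # ~1.47 um per count
print(60_000 / counts_40mA)   # ~2.96 um per count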
B. Thickness
The thickness of PDMS has a considerable effect on
the amount of light absorbed and scattered within the
material. (The amount of light reflected back from the
air-PDMS surface does not change, as it depends only on
the refractive indices of the materials.) Figure 5, middle,
shows the response of two sensors cast in PDMS with the
base to curing agent in 8:1 ratio and thickness of 6mm
and 12mm. Considering distance measurements from 0
to 6cm, absorption within the material compresses the
range of raw ADC values from [5084; 48201] for 6mm
PDMS coating to [3956; 25612] for 12mm PDMS coating,
thereby reducing the resolution of the sensor for the 0–
6cm range from 1.39µm to 2.77µm. This effect is more
accentuated in the force domain where the ADC values
for measurements from 0 to 1N range from 48454 to
64824 (16370 individual values or approximately 60µN
resolution) for 6mm PDMS coating, but span only a range
of approximately 1000 values for the entire range of 0 to
5N for 12mm PDMS coating. For thicker coatings, we also
observe a systematic non-monotonicity of the signal for small
forces, which is consistent across different current values and
surface reflectances, and whose origin we cannot currently
explain and wish to study in future work.
C. Mixing Ratio
The mid-infrared transmission of thin PDMS film is
characterized in [9] using Fourier Transform Infrared Spectrometry. The transmittance of infrared light is found to
depend strongly on the mixing ratio of base and curing
agents, which changes the composition of the PDMS; lower
mixing ratios result in higher transmittance. Maximum
transmittance of about 95% is found between wavenumbers 2490-2231 cm−1 with mixing ratios of 8:1. To compare
the results at wavenumbers 12500-10526 cm−1 (800-950
nm), we prepared three mixtures of PDMS with different
mixing ratio of the base and curing agent (5:1, 10:1, and
12:1). Figure 5 shows the sensor proximity and force values
for different mixing ratios.
Fig. 5. Design parameters of the sensor. Left: Sensor response for different current settings. Middle: Sensor readings as a function of PDMS thickness at 80mA. Right: Sensor readings as a function of the mixing ratio of the two PDMS components.
Although the Young's modulus of PDMS changes by about 35-40% while the density
changes by only 1% over the range of mixing ratios from 8:1
to 12:1 [1], there is only little difference in the force regime
among these values, whereas distance measurements are
more distinct, in particular for the 8:1 mixing ratio. As the
cross-over from distance to force is at approximately the
same sensor reading, we deduce that 8:1 mixing ratios
provide the widest dynamic range in the force regime, but
the smallest dynamic range in the distance regime.
IV. Calibration
For calibrating the relationship between sensor reading
and actual distance, we first characterize the sensitivity
of the sensor to surface reflectance, and then record data
for different distances across a variety of sensors for white
paper. For the remainder of this paper, we have chosen a
thickness of 6mm at a mixing ratio of 8:1 due to the
higher dynamic range in the distance and force regimes,
respectively.
A. Color
The intensity of light reflected from objects depends greatly
on the color, pose, and surface properties of
the object. We chose five differently colored target objects
(red, yellow, white, gray and black, Canson, 150 gsm)
as described in [3]. The colored cardboard papers were
mounted on the screen shown in Figure 4 and served as
target objects for a distance sensor coated with 6mm PDMS
at an 8:1 mixing ratio.
Figure 6 shows the response of the sensor to different
colors. The proximity measurements are less
influenced by the reflective properties of the target
surface than the force measurements. Although brightly colored
materials give stronger readings than darker ones, there is
no significant difference in the sensor response to
different colors, except for the black paper.
These findings are in line with [3], who calculate the
reflectance for a variety of colors to be in the range of
0.9 (gray) to 1.0 (white), whereas black cardboard has a
reflectance of 0.12. Cardboard of all colors is much more
reflective than wood (0.77), brick (0.61) or concrete (0.53),
but much less than highly reflective surfaces such as polished
plastic or china.
B. Distance
In order to obtain a relationship between sensor readings
and actual distance, we recorded data from 14 different
sensors and white paper. We soldered seven sensors in
a line at 10mm spacing to a rigid PCB (Figure 1). We
recorded the response of two such arrays (14 sensors) at 24
distances ranging from 0.5 to 19cm, with 50 measurements
each, at 120mA. Although 120mA leads to saturation in the
force regime (when using white paper), this value allows
us to obtain better ranging and works fine with objects
that are less reflective. The data is shown in Figure 6.
We fitted this data with a function of the form y =
ax^b + c using MATLAB's curve fitting toolbox's trust-region method and bisquare weighting of outliers. The
candidate function corresponds to physical intuition (with
b = −2) and can be inverted to

x = 1/((y − c)/a)^(1/b).    (5)
Notice that the denominator of the above equation
includes the b-th root, which yields complex values for
y < c. This happens whenever a sensor reading falls below
the asymptote of the fitted curve, which is often the case
for farther-away measurements. We therefore convert all
measurements into a decibel scale using log10(I/I∞), where I∞
is the measurement obtained in free air. With b ≈ −1
after fitting on the log scale, all distance measurements
remain real. The fit as well as absolute error for both
the raw and PDMS-coated sensors are shown in Figure 6,
middle. As expected, we observe a slightly higher absolute
error for all measurements with PDMS, which initially
makes objects appear closer (up to about 7cm) and then
farther apart than the raw sensor. The data follow a similar
trend for distances from 10 cm to 19 cm, but are not shown,
as the high error in this range makes those measurements
impractical to use.
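The following Python sketch reproduces this calibration procedure on synthetic data, since the recorded measurements are not published with this paper; scipy's curve_fit stands in for MATLAB's trust-region fit, and the fitted model is inverted algebraically to recover distance from a raw reading:

# Fit y = a*x**b + c to (distance, raw reading) pairs and invert it, on synthetic data.
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b, c):
    return a * np.power(x, b) + c

rng = np.random.default_rng(0)
x_data = np.linspace(0.5, 10, 20)                                 # distance in cm
y_data = model(x_data, 4.0e4, -1.0, 3000) + rng.normal(0, 200, x_data.size)

(a, b, c), _ = curve_fit(model, x_data, y_data, p0=(1e4, -1.0, 0.0))

def distance_from_reading(y):
    # Algebraic inverse of the fitted model; readings below the asymptote c are clipped.
    return np.power(np.maximum(y - c, 1e-9) / a, 1.0 / b)

print(a, b, c)
print(distance_from_reading(model(2.0, a, b, c)))                 # recovers ~2 cm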
C. Force
As force measurements are much more susceptible to
surface reflectance, we have performed fits for a variety of
colored papers using the data from Figure 6, left. Results for
a subset (white, red, black) are shown in Figure 6, right.
Using an equation of the form y = ax^b has provided good
results, with R-squared values ranging from 0.9898 (black)
to 0.9953 (white).
Fig. 6. Left: Sensor response (PDMS 6mm, 8:1 mixing ratio) to different color targets. Middle two: Sensor response to different distances at 120mA, 6mm PDMS with 8:1 mixing ratio, averaged over 14 different sensors; average error after applying calibration to raw data. Right: Raw sensor readings vs. actual force for different colors and fit of the form y = ax^b.
V. Applications
We mount our finger sensors on the parallel gripper
of the Baxter robot from Rethink Robotics, which is
equipped with two 7-DOF arms. The size of each finger
sensor is 80 × 2 × 1 mm (Figure 1), small enough to
install on the stock electric parallel gripper of the robot.
We manufactured two separate pairs of finger sets with
grasp ranges of 0-68 mm and 68-144 mm, respectively. Two
fingers are interfaced via an Arduino Uno microcontroller
that polls all 16 sensors in a round-robin fashion. The
microcontroller connects to a control computer and ROS
via a USB port, which also provides the supply voltage for
the sensors. Unless otherwise noted, all objects are chosen
from the Yale-CMU-Berkeley (YCB) Object and Model
set [8].
A. Gripper centering
We first investigate using proximity sensing to center a
gripper around an object. This is important when successful grasping requires both fingers to simultaneously
make contact. For example, grasping a cup at its handle
induces a turning motion that needs to be counteracted by
the opposite finger before the cup has turned out of the
robot's grasp. Similarly, removing a block from a Jenga
tower [25] requires creating force closure with the block
while inducing a minimum amount of motion of the block
itself.
Figure 7 depicts a similar situation, in which imprecise
alignment will collapse a tower of wooden blocks. The
gripper was closed in discrete steps and the response
from the sensors was recorded. The response from the
sensors on the right finger is shown in blue and the
response from the left finger is shown in red.
Assuming the surface properties (reflectance) are the
same on both sides of the object, the data shown in Figure 7
can be used to servo the end-effector to a position in which
both distances are roughly equal, using simple feedback
control and inverse kinematics (Baxter SDK PyKDL), see
Figure 8.
Fig. 7. Deconstructing a tower of 1-inch cubes requires exact centering of the gripper in order to prevent the cubes from falling. The plot to the right shows raw sensing values for the left (blue) and right (red) finger. The left gripper approaches the cube in phase 1–2. It makes contact at 2 while the right finger continues its approach during 2–3. As the left gripper is still pushing the cube, it will eventually fall off the green cube and disappear from both grippers' field of view (3–4).
Fig. 8. Using the average of differences in sensor readings measured from the left and right finger (Figure 7) to center an object.
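A minimal sketch of this centering behavior is given below; read_left_cm, read_right_cm, and move_gripper_lateral are hypothetical placeholders for the calibrated sensor readout and the Baxter SDK/PyKDL motion command, and the gain and tolerance are illustrative assumptions:

# Servo the gripper until the left and right finger arrays report roughly equal distances.
def center_gripper(read_left_cm, read_right_cm, move_gripper_lateral,
                   gain=0.2, tol_cm=0.05, max_steps=100):
    # Proportional control on the difference of the averaged left/right distances.
    for _ in range(max_steps):
        error = read_left_cm() - read_right_cm()
        if abs(error) < tol_cm:
            return True                       # centered within tolerance
        move_gripper_lateral(gain * error)    # sign convention depends on the chosen frame
    return False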
B. Contact point evaluation
Force sensing can be used to determine the location
of incidence of an object on the gripper. Figure 9, left,
shows the raw measurements of all sensors when grasping
the handle of the YCB pan (Figure 10, left). The data
clearly shows the fingers approaching the handle and
which sensors made the most contact, letting us infer
the approximate size of the object.
Fig. 9. Left: Raw sensor values on the left and right finger when grasping the YCB pan at its handle during roughly 6s. The contact points on each finger are clearly discernible. Middle: Close-up on sensor 5 of the right finger and its numerical derivative. Horizontal and vertical lines indicate the raw value that corresponds to the derivative's peak. Right: Calibration data for black cardboard. Horizontal and vertical lines indicate the raw value that corresponds to crossing from distance to force.
Closer inspection of the contact data, here the 5th sensor of the right finger, reveals
that gentle pressure drives the sensor to roughly 2 × 10^4
(Figure 9, middle), which is similar to values generated
by contact with black cardboard (Figure 9, right). Furthermore, fitting a spline to the raw data and calculating
its derivative (MATLAB spline and fnder) reveals that
the sensor response has an inflection point close to where
the black cardboard crosses from the distance to the force
regime. Performing the same operation on data from the
left finger suggests a material of slightly higher reflectance
(the sensor maxes out at 2.4 × 10^4) with a cross-over
point at a raw value of 17529. Although not sufficient to
determine the actual object properties, we wish to explore
the suitability of using extrema of the sensor response
(minimum, maximum, and cross-over point) to identify
specific materials in the future. Indeed, we are able to
correctly detect the cross-over point for all experiments
shown in Figure 6, left (five different colors), for currents
ranging from 40 mA to 200 mA in increments of 40 mA
(25 experiments).
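The following Python sketch mirrors this contact-point detection using scipy instead of MATLAB's spline and fnder; the smoothing factor is an assumption and would need to be tuned to the actual data:

# Detect the cross-over into contact as the peak of the derivative of a smoothing spline.
import numpy as np
from scipy.interpolate import UnivariateSpline

def contact_crossover(timestamps, readings, smoothing=1e4):
    # Returns the time and raw value at which the fitted response changes fastest.
    spline = UnivariateSpline(timestamps, readings, s=smoothing)
    t_dense = np.linspace(timestamps[0], timestamps[-1], 2000)
    slope = spline.derivative()(t_dense)
    t_contact = t_dense[np.argmax(np.abs(slope))]
    return t_contact, float(spline(t_contact))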
C. 3D Reconstruction
Given the material parameters, in-hand proximity sensing can be used to augment, and possibly register against,
conventional 3D sensing. The robot arm is programmed
to reach a specified scanning position on the table as
shown in Figure 10. We assume such a position can be
reached using coarse visual or RGB-D data, as well as
the proximity sensors themselves. The robot wrist joint
is rotated around the object in increments of 0.17 rad in
the interval [−π, π]. Using the actual encoder value at
each step and converting sensor readings into centimeters
using (5) yields polar coordinates of each point where the
infrared light hits the object. The resulting data is shown
in Figure 11. Although noisy due to non-orthogonal incidence
angles at the handle and the bottom of the cup, the
fidelity of the model is sufficient to highlight the presence
of the cup's handle and is, after removing outliers, close
(±0.5cm) to the cup's true diameter.
Fig. 10. Left: Grasping the YCB pan. Right: Scanning the YCB cup by performing a 360 degree swivel.
Fig. 11. Left: 3D point cloud model of a cup (8cm diameter, 8cm height, 3cm handle). The red colored points are from the left finger sensor and the green colored points from the right finger sensor. Right: Horizontal sweep across an airplane wing; only data from the right finger is shown.
We also selected a toy airplane from the YCB object
set, which has a highly reflective surface. Figure 12 shows
the direction in which the robot wrist is swept across the
wing to detect a possible grasp location, this time using a
horizontal motion.
Fig. 12. Scanning a toy airplane wing.
Figure 11, right, shows the response of the sensors.
Although the distance measurements are underestimated due
to the reflectance of the opposite gripper (actually at
a distance of around 4 cm) and the airplane wing, the
presence of the airplane wing is clearly discernible. As the
wing starts appearing in the field of view of the sensors
we see a steep decrease in distance, which then reaches
a minimum at the wing’s center, and then gradually
increases as the robot arm moves away from the wing.
This is due to the fact that the infrared emitter is better
approximated by a lobe than by a ray. The symmetry of
the reflected plane at the center of the wing causes the
photo receiver to receive the maximum possible reflected
intensity available from the airplane wing, illustrating
the limitations in lateral resolution, which would need to
be compensated by an orthogonal sweep, should a more
accurate 3D reconstruction be desired.
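A minimal sketch of the reconstruction step is given below; wrist_angles and distances_cm stand for the recorded sweep data, and finger_radius_cm, the assumed radial offset of the sensor surface from the wrist axis, is an illustrative value:

# Convert a wrist-joint sweep plus calibrated distances into a point cloud in the wrist frame.
import numpy as np

def sweep_to_points(wrist_angles, distances_cm, finger_radius_cm=3.0, z_cm=0.0):
    # Each hit point lies at radius (sensor offset - measured distance) along the beam.
    r = finger_radius_cm - np.asarray(distances_cm, dtype=float)
    theta = np.asarray(wrist_angles, dtype=float)
    return np.column_stack((r * np.cos(theta), r * np.sin(theta), np.full_like(r, z_cm)))

points = sweep_to_points(np.linspace(-np.pi, np.pi, 37), np.full(37, 1.5))
print(points.shape)   # (37, 3): one point per 0.17 rad step of the sweep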
VI. Discussion
The proposed sensor has a series of design parameters,
ranging from the choice of the material itself, its mixing
ratio, and its thickness to the current at which the emitter
operates. Each of these parameters affects the sensors’
range, dynamic range, and thereby resolution and accuracy. While far from exhaustive, systematic experiments
presented here highlight important trends, allow a good
trade-off between ranging and force sensing capabilities
to be found, and provide guidelines on how to
adapt the sensor to one’s own specifications.
Although the response roughly follows the form y = ax^b + c, our
approximation introduces non-negligible systematic error,
an effect that gets amplified by adding a PDMS layer,
which introduces another constant into the denominator
of (2). While better non-linear approximations could be
found, e.g., using support vector machines or training a
neural network, the sensor is very sensitive to surface
properties. For example, black paper is five times less reflective than white paper, whereas shiny objects are more
reflective than objects with rougher surfaces. We observe,
however, that most practical applications of the proposed
sensor might not require calibration at all. Indeed, centering around an object only requires equalizing raw sensor
readings, which are both monotonically increasing and
continuous from infinite distance to at least 5N force.
We also observe that the shape of the function that relates distance/force measurements to raw sensor readings
is qualitatively similar independent of the surface properties,
thickness, mixing ratio, and current, with an inflection
point at the contact point. Performing a firm grasp on an
unknown object such as the pan handle in Figure 9 allows
recording such a curve in its entirety and might allow
inferring its material properties, given that all other parameters of
the sensor are known. For example, when squeezing the
handle, the sensor reading maxes out at around 2.1 × 10^4,
which is slightly above the value of black paper at 120mA
(1.8 × 10^4) for 6mm PDMS (8:1). Together with actual
distance information obtained from the gripper itself, it
might be possible to calibrate the sensor online by performing a simple grasp, and then use this data to perform
an accurate 3D reconstruction. Squeezing an object might
also provide insight for tuning the sensing current. For
example, the sensor current could be reduced until the
sensor saturates at a value below the maximum reading,
and calibration data could be obtained during a second
squeeze. Finally, tactile and geometry information could
complement existing approaches in which robots use active
manipulation to understand objects in their environment
[33, 28].
Another limitation of optical proximity sensors is their
dependence on the angle of incidence. While this is not
noticed with rotationally symmetric objects such as those
used here, scanning a rectangular object using a circular
swivel motion, e.g., would make the object appear elliptical.
As the resulting error is well quantified [3], we wish to
investigate how contact information can be exploited to
estimate the angle of a surface, and investigate sensor-based motion planning techniques that can help the robot
to differentiate between accurate and erroneous signals
by automatically testing different hypotheses of the true
object's pose and surface properties.
VII. Conclusion and further work
We present an integrated force and distance sensor that
is simple to manufacture and low-cost, yet provides a
series of benefits that usually require much more complex
sensors. We experimentally evaluate the design space and
demonstrate the impact of different material properties
and control parameters on the sensor response. As expected with infrared-based sensors, the sensor is strongly
non-linear, highly sensitive to surface properties and has
poor lateral resolution when compared with ray-based or
RGB-D sensors. Nevertheless, we demonstrate that the
proposed sensor has a wide range of use cases that facilitate grasping and manipulation, ranging from contact point
detection and grasp point determination to object registration,
and can possibly be improved by better sensor models
and sensor-based motion planning strategies, which we
wish to investigate in the future. We are also interested
in exploring real-time object recognition by squeezing and
scanning an object. In the long run, the necessary processing could be co-located with the sensor [30], allowing it to
autonomously identify surface properties of an object and
adapt accordingly.
Acknowledgments
The authors would like to thank Michael Bernabei for
initial testing, Dana Hughes and John Klingner for PCB
design, Jorge Canardo, Nick Farrow, Chris Heckmann, and
Andy McEvoy for helpful discussions, and Keyon Janani
and Carrie Weidner for help with sensor characterization.
This work has been supported by AFOSR grant FA9550-15-1-0238. We are grateful for this support.
References
[1] Deniz Armani, Chang Liu, and Narayan Aluru. Reconfigurable fluid circuits by PDMS elastomer micromachining. In Micro Electro Mechanical Systems,
1999. MEMS’99. Twelfth IEEE International Conference on, pages 222–227. IEEE, 1999.
[2] DJ Balek and RB Kelley. Using gripper mounted
infrared proximity sensors for robot feedback control.
In Robotics and Automation. Proceedings. 1985 IEEE
International Conference on, volume 2, pages 282–
287. IEEE, 1985.
[3] Gines Benet, Francisco Blanes, José E Simó, and
Pascual Pérez. Using infrared sensors for distance
measurement in mobile robots. Robotics and Autonomous systems, 40(4):255–266, 2002.
[4] Jeannette Bohg, Aythami Morales, Tamim Asfour,
and Danica Kragic. Data-driven grasp synthesis—a
survey. Robotics, IEEE Transactions on, 30(2):289–
309, 2014.
[5] Max Born and Emil Wolf. Principles of optics: electromagnetic theory of propagation, interference and
diffraction of light. Cambridge University press, 1999.
[6] Dengke Cai, Andreas Neyer, Rüdiger Kuckuk, and
H Michael Heise. Raman, mid-infrared, near-infrared
and ultraviolet–visible spectroscopy of pdms silicone
rubber for characterization of polymer optical waveguide materials. Journal of Molecular Structure, 976
(1):274–281, 2010.
[7] Ziliang Cai, Weiping Qiu, Guocheng Shao, and Wanjun Wang. A new fabrication method for all-PDMS
waveguides. Sensors and Actuators A: Physical, 204:
44–47, 2013.
[8] Berk Calli, Aaron Walsman, Arjun Singh, Siddhartha
Srinivasa, Pieter Abbeel, and Aaron M Dollar. Benchmarking in manipulation research: The YCB object
and model set and benchmarking protocols. arXiv
preprint arXiv:1502.03143, 2015.
[9] KC Chen, AM Wo, and YF Chen. Transmission
spectrum of PDMS in 4-7µm mid-IR range for characterization of protein structure. In NSTI-Nanotech,
volume 2, pages 732–735, 2006.
[10] Nikolaus Correll, Kostas E. Bekris, Dmitry Berenson, Oliver Brock, Albert Causo, Kris Hauser, Kei
Okada, Alberto Rodriguez, Joseph M. Romano, and
Peter R. Wurman. Lessons from the Amazon picking
challenge. arXiv:1601.05484, 2016.
[11] Nikhil Chavan Dafle, Alex Rodriguez, Robert Paolini,
Bowei Tang, Siddhartha S Srinivasa, Michael Erdmann, Matthew T Mason, Ivan Lundberg, Harald
Staab, and Thomas Fuhlbrigge. Extrinsic dexterity: In-hand manipulation with external forces. In
Robotics and Automation (ICRA), 2014 IEEE International Conference on, pages 1578–1585. IEEE,
2014.
[12] Ravinder S Dahiya, Giorgio Metta, Maurizio Valle, and Giulio Sandini. Tactile sensing: from humans to humanoids. Robotics, IEEE Transactions on, 26(1):1–20, 2010.
[13] Ravinder S Dahiya, Philipp Mittendorfer, Maurizio Valle, Gordon Cheng, and Vladimir J Lumelsky. Directions toward effective utilization of tactile skin: A review. Sensors Journal, IEEE, 13(11):4121–4138, 2013.
[14] Raphael Deimel and Oliver Brock. A novel type of compliant, underactuated robotic hand for dexterous grasping. Robotics: Science and Systems, Berkeley, CA, pages 1687–1692, 2014.
[15] Aaron M Dollar, Leif P Jentoft, Jason H Gao, and Robert D Howe. Contact sensing and grasping performance of compliant hands. Autonomous Robots, 28(1):65–75, 2010.
[16] Nicholas Farrow and Nikolaus Correll. A soft pneumatic actuator that can sense grasp and touch. In Intelligent Robots and Systems (IROS), 2015 IEEE/RSJ International Conference on, pages 2317–2323. IEEE, 2015.
[17] Nicholas Farrow, Yang Li, and Nikolaus Correll. Morphological and embedded computation in a self-contained soft robotic hand. arXiv:1605.00354, 2016.
[18] Yuki Fujimori, Yoshiyuki Ohmura, Tatsuya Harada, and Yasuo Kuniyoshi. Wearable motion capture suit with full-body tactile sensors. In Robotics and Automation, 2009. ICRA'09. IEEE International Conference on, pages 3186–3193. IEEE, 2009.
[19] D Goger, Hosam Alagi, and Heinz Wörn. Tactile proximity sensors for robotic applications. In Industrial Technology (ICIT), 2013 IEEE International Conference on, pages 978–983. IEEE, 2013.
[20] Greg Hellard and R Andrew Russell. A robust, sensitive and economical tactile sensor for a robotic manipulator. In Australian Conference on Robotics and Automation, pages 100–104. Citeseer, 2002.
[21] Kaijen Hsiao, Paul Nangeroni, Manfred Huber, Ashutosh Saxena, and Andrew Y Ng. Reactive grasping using optical proximity sensors. In Robotics and Automation, 2009. ICRA'09. IEEE International Conference on, pages 2098–2105. IEEE, 2009.
[22] Dana Hughes and Nikolaus Correll. Texture recognition and localization in amorphous robotic skin. Bioinspiration & Biomimetics, 10(5):055002, 2015.
[23] Leo Jiang, Kevin Low, Joey Costa, Richard J Black, and Yong-Lae Park. Fiber optically sensorized multi-fingered robotic hand. In Intelligent Robots and Systems (IROS), 2015 IEEE/RSJ International Conference on, pages 1763–1768. IEEE, 2015.
[24] Jelizaveta Konstantinova, Agostino Stilli, and Kaspar Althoefer. Force and proximity fingertip sensor to enhance grasping perception. In Intelligent Robots and Systems (IROS), 2015 IEEE/RSJ International Conference on, pages 2118–2123. IEEE, 2015.
[25] Torsten Kröger, Bernd Finkemeyer, Simon Winkelbach, Lars-Oliver Eble, Sven Molkenstruck, and Friedrich M Wahl. A manipulator plays Jenga. Robotics & Automation Magazine, IEEE, 15(3):79–84, 2008.
[26] S.S. Lancaster. A fuzzy logic controller for the application of skin pressure. In Fuzzy Information Processing (NAFIPS '04), IEEE Annual Meeting of the, volume 2, pages 686–689, June 2004. doi: 10.1109/NAFIPS.2004.1337384.
[27] Hyung-Kew Lee, Jaehoon Chung, Sun-Il Chang, and Euisik Yoon. Normal and shear force measurement using a flexible polymer tactile sensor with embedded multiple capacitors. Microelectromechanical Systems, Journal of, 17(4):934–942, 2008.
[28] Lu Ma, Mahsa Ghafarianzadeh, Dave Coleman, Nikolaus Correll, and Gabe Sibley. Simultaneous localization, mapping, and manipulation for unsupervised object discovery. In IEEE International Conference on Robotics and Automation, pages 1344–1351, 2015.
[29] Alexis Maldonado, Humberto Alvarez, and Michael Beetz. Improving robot manipulation through fingertip perception. In Intelligent Robots and Systems (IROS), 2012 IEEE/RSJ International Conference on, pages 2947–2954. IEEE, 2012.
[30] MA McEvoy and N Correll. Materials that couple sensing, actuation, computation, and communication. Science, 347(6228):1261689, 2015.
[31] P Mittendorfer, E Yoshida, and G Cheng. Realizing whole-body tactile interactions with a self-organizing, multi-modal artificial skin on a humanoid robot. Advanced Robotics, 29(1):51–67, 2015.
[32] Philipp Mittendorfer and Gordon Cheng. Humanoid multimodal tactile-sensing modules. Robotics, IEEE Transactions on, 27(3):401–410, 2011.
[33] John Oberlin and Stefanie Tellex. Learning to pick up objects through active exploration. In Development and Learning and Epigenetic Robotics (ICDL-EpiRob), 2015 Joint IEEE International Conference on, pages 252–253. IEEE, 2015.
[34] Jonathan Rossiter and Toshiharu Mukai. An LED-based tactile sensor for multi-sensing over large areas. In Sensors, 2006. 5th IEEE Conference on, pages 835–838. IEEE, 2006.
[35] Yaroslav Tenzer, Leif P Jentoft, and Robert D Howe. The feel of MEMS barometers: inexpensive and easily customized tactile array sensors. Robotics & Automation Magazine, IEEE, 21(3):89–95, 2014.
[36] Lin Wang and David J Beebe. Characterization of a silicon-based shear-force sensor on human subjects. Biomedical Engineering, IEEE Transactions on, 49(11):1340–1347, 2002.
[37] Zhi Feng Zhang, Xiao Ming Tao, Hua Peng Zhang, and Benpeng Zhu. Soft fiber optic sensors for precision measurement of shear stress and pressure. Sensors Journal, IEEE, 13(5):1478–1482, 2013.
[38] F Zhu and JW Spronck. A capacitive tactile sensor for shear and normal force measurements. Sensors and Actuators A: Physical, 31(1):115–120, 1992.