Optical Sensing: 1D to 3D using Time-of-Flight Technology
Shaping the Future of MEMS & Sensors
September 10, 2013
Agenda
Presenter: Marc Drader
• Optical Sensors Intro
• Time-of-Flight Technology
• ToF for 1D ranging
• ToF for 3D gestures
• Next Steps
STMicroelectronics Imaging Division
• Camera Modules: fixed-focus camera, wafer-level reflowable camera, EDoF camera, auto-focus camera; innovative optics, assembly & test technologies
• Image Sensors: production from 1.4µm to 5.6µm pixel, 1.1µm in development; from VGA to 24Mpix
• Imaging Processors: stand-alone ISP, full ST video pipe IP, integration of third-party IP on demand
• Photonic Sensors: user detection, proximity, ALS, optical navigation, man-machine interface, automotive, medical
Brief Overview: Optical Sensors
• May seem obvious, but…
• Optical path considerations
  • Transmission spectrum
  • Transmission path efficiency
  • Field of view
  • Target object characteristics
• System considerations
  • Optical crosstalk!
  • Ambient or background illumination (noise)
Proximity Detection
• Conventional IR sensor
• Attempts to detect whether an object/user is near or far, based on reflected signal amplitude
• 2 unknowns (target distance, target reflectance), but only 1 output (amplitude: signal & noise) → impossible to know object DISTANCE
[Figure: reflected amplitude vs distance for 90%, 17%, and 3% reflectance targets, plus noise floor. A single threshold setting will detect a user anywhere from 0.5 to 6 cm.]
What is “Time-of-Flight” Sensing?
Active illumination system:
1. Emit light (photons) towards a target
2. Light (partially) reflects from the target
3. Sensor determines "when" the light (photons) arrives
Photon travel time multiplied by the speed of light = distance
• 1 cm = 66 ps round-trip travel time at the speed of light
[Figure: emitter, photon(s), sensor, and target object; single-photon travel time gives distance. Photon travel time is NOT affected by target reflectance.]
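The distance relation above can be sketched numerically (a minimal example of my own, assuming an ideal system with no electronics delay; the function names are illustrative):

```python
# Minimal sketch of the ToF distance relation (ideal system, no electronics
# delay assumed): a photon travels to the target and back, so the round-trip
# time is t = 2*d/c and the distance is d = c*t/2.
C = 299_792_458.0  # speed of light in vacuum, m/s

def round_trip_time_ps(distance_m: float) -> float:
    """Round-trip photon travel time (picoseconds) for a target at distance_m."""
    return 2.0 * distance_m / C * 1e12

def distance_mm(round_trip_ps: float) -> float:
    """Distance (millimetres) recovered from a round-trip time in picoseconds."""
    return C * round_trip_ps * 1e-12 / 2.0 * 1e3

# A target 1 cm away gives a ~66.7 ps round trip, matching the slide's figure.
```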
ToF Illustration (video)
• Available on YouTube
Motivation for ToF
• Real-world application/need
• Best example: a smartphone proximity sensor detects the user's head during a phone call and shuts off the touchscreen & display
• But… it doesn't work 100% of the time
• Search "face hang-up" plus any smartphone brand to find frustrated users whose touchscreen did not shut off before their cheek pressed a button
• Time-of-Flight technology adds value by providing true, accurate distance measurements
• Independent of target object reflectance
• Immune to ambient illumination & optical path variations (glass, plastic cover)
Motivation for ToF
[Figure: proximity traces for dark hair and skin targets with small displacements. The ST ToF distance measurement stays cleanly on the correct side of the near/far threshold (screen off/on), while the reflected power of a conventional PS (*) bounces across its threshold.]
• Conventional IR sensor bouncing between far and near states vs the robust ToF solution
• Photon travel time NOT affected by the object reflectance
(*) Reflected power information is also available on the ST ToF Proximity Module
Time-of-Flight Physics
• Emitted and received photons:
  • …follow a Poisson distribution
  • …may be correlated to the emitter, or uncorrelated (ambient, dark current)
• Finally → photon arrival rate does depend on object reflectance & distance
[Diagram: emitted pulse vs received pulse (delay = distance) — the delay we want to measure; returned photons are correlated, ambient photons are uncorrelated.]
• Many repeated pulses are required for correlation
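The repeated-pulse idea can be illustrated with a toy simulation (my own sketch, not ST firmware): ambient photons arrive Poisson-distributed and flat in time, while returned photons pile up in the time bin matching the target delay, so a histogram accumulated over many pulses recovers the delay.

```python
import numpy as np

# Toy simulation: 32 time bins per pulse, 2000 repeated pulses.
# Ambient (uncorrelated) photons are Poisson with a flat rate across bins;
# returned (correlated) photons land in the bin matching the target delay.
rng = np.random.default_rng(0)
N_BINS, N_PULSES, TRUE_DELAY_BIN = 32, 2000, 12

ambient = rng.poisson(0.5, size=(N_PULSES, N_BINS))            # uncorrelated
returned = np.zeros((N_PULSES, N_BINS), dtype=np.int64)
returned[:, TRUE_DELAY_BIN] = rng.poisson(0.8, size=N_PULSES)  # correlated

# Accumulate a histogram over all pulses; the correlated peak dominates
# the flat ambient background, and its bin index encodes the distance.
histogram = (ambient + returned).sum(axis=0)
estimated_delay_bin = int(np.argmax(histogram))
```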
Single Photon Avalanche Diode
• SPAD digital output used to:
• Count arrival of single photons and/or
• Time arrival of single photons
• Unique property → each photon provides valuable time/distance info
• Fully integrated in CMOS
More Time-of-Flight Challenges
• Optical constraints
• Coverglass contributes optical crosstalk (shortcut from emitter to sensor)
• Ambient light is main contributor of uncorrelated photons
• Co-existence of visible & NIR systems for ALS & ranging
• Conflicting wavelength and field-of-view requirements
[Figure: phone window with airgap; ambient light enters the sensor field of view alongside the IR emission.]
Optical Crosstalk
• A Time-of-Flight sensor always "sees" two targets:
• Product-level cover (glass/plastic)
  • fixed distance, and (relatively) fixed optical characteristics
  • distorts the reflected signal in both the time & amplitude domains
• Target object
  • varying distance and optical characteristics
FlightSense technology compensates for optical crosstalk automatically → opens up use cases in very challenging optical environments
[Figure: emitter, photon, sensor, cover, and target object.]
Crosstalk Compensation
• Compensation algorithm
• Firmware uses known crosstalk characteristics to correct the time-domain measurement
• Simple register write (absolute value of photons coupled from emitter to sensor through the phone housing)
[Figure: raw range results (no compensation applied) vs results with crosstalk compensation applied (register setting).]
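As a hedged illustration of why a single calibrated crosstalk count is enough (a toy model of my own, not ST's firmware): crosstalk photons arrive from the cover at effectively zero range, pulling a count-weighted range estimate toward zero; knowing the crosstalk count lets firmware undo that pull.

```python
# Toy model: the raw reported range is the count-weighted mean of target
# photons (at the true range) and crosstalk photons (at ~0 mm). Given the
# factory-characterised crosstalk count, the target-only range is recovered.

def compensate_range(raw_mm: float, total_count: float,
                     xtalk_count: float) -> float:
    """Undo the pull toward 0 mm caused by xtalk_count photons at ~0 mm."""
    target_count = total_count - xtalk_count
    if target_count <= 0:
        raise ValueError("signal indistinguishable from crosstalk")
    # raw_mm = (target_count * true_mm + xtalk_count * 0) / total_count
    return raw_mm * total_count / target_count

# Example: target at 100 mm, 900 target photons + 100 crosstalk photons
# -> raw weighted mean is 90 mm; compensation recovers 100 mm.
```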
Ambient Immunity
• System performance
• Keep ambient photons out
• Optical filtering (notch around 850nm)
• Reject remaining ambient photons
• Time-domain rejection
• System-level noise management
• SNR limit
FlightSense technology will NOT report a false distance in high ambient light conditions
Ranging Conditions
• Ranging specifications
• 0 to 100 mm, 3% to 90% reflectance
• 0 to 250 mm for a 45% reflectance target (e.g. a human hand)
• Eye-safe, low-power IR (850 nm) emitter
• Accuracy: σ = 3 mm (resolution = 1 mm steps)
• FlightSense architecture allows a "zero" mm measurement*
  • *0 mm is defined at the product/system level
• There must be an available emitter → sensor optical path!
Simple Optical Module
• Simple (reflowable) package
• Small size (2.8 x 4.8 x 1.0mm)
• Integrated emitter/sensor & optics/filters
• Opposing requirements
• Proximity: near infra-red wavelengths, narrow field of view
• Ambient Light Sensor: visible wavelengths, wide field of view
• Device delivered fully calibrated
• Simple electrical integration
• Single power supply (2.8V)
• I2C & GPIO (1.8V or 2.8V)
• Programmable I²C address
• Flexible window & threshold interrupts
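A sketch of the I²C framing behind that integration (register names and addresses are taken from ST's public VL6180X datasheet; treat them as assumptions and verify before use — this only builds the write payload, with no hardware access):

```python
# The VL6180X sits at 7-bit I2C address 0x29 by default and uses 16-bit
# register indexes, sent MSB-first, followed by the data byte.
# Register addresses below are from the public datasheet; illustrative only.
VL6180X_I2C_ADDR = 0x29
SYSRANGE__START = 0x018    # write 0x01 to trigger a single-shot range
RESULT__RANGE_VAL = 0x062  # read back the measured range in mm

def write_payload(reg: int, value: int) -> bytes:
    """I2C write payload for one register: index MSB, index LSB, data byte."""
    return bytes([(reg >> 8) & 0xFF, reg & 0xFF, value & 0xFF])

# Starting a single-shot measurement would send, over the host's I2C bus:
# write_payload(SYSRANGE__START, 0x01)  ->  b'\x00\x18\x01'
```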
VL6180X Ranging Performance
Test conditions: 10 measurements per chart, 10 mm steps, in the dark, 0.2 mm air-gap, no gasket, oval artwork (75% > 800 nm)
• Ranging performance is independent of target reflectance/color
• Distance standard deviation < 3 mm
[Chart: measured range vs target distance for reflective charts (in %).]
VL6180X Convergence Time
[Chart: convergence time (µs, 0 to 10,000) vs target object distance (mm, 0 to 100) for target reflectance from 3% to 88%; reflective charts (in %): 3%, 5%, 17%, 88%.]
Low Power Consumption
• Real-world current consumption
• Varies with object distance & reflectance
• Max consumption set by user
[Diagram: peak current consumption during each convergence-time burst.]
• Examples (2.8 V supply)
• 10 Hz ranging, object held @ 5 cm:
  • 88% (white): 40 µA
  • 18% (grey): 200 µA
  • 5% (black): 550 µA
  • 3% (deep black): 760 µA
• 1 Hz ALS, 100 ms integration: 32 µA average
• Low standby current
  • HW standby < 1 µA
  • SW standby < 7 µA
[Chart: average current (mA, 0 to 2.00) vs target object distance (mm, 0 to 100) at a 10 Hz repetition rate, for 88%, 17%, 5%, and 3% reflectance targets.]
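The average-current examples follow from a simple duty-cycle model (a sketch under assumed numbers: the peak current and convergence times below are illustrative values chosen to reproduce the slide's figures, not ST specifications):

```python
def average_current_ua(peak_ma: float, conv_time_us: float,
                       rate_hz: float, idle_ua: float = 1.0) -> float:
    """Duty-cycle average current: peak during convergence, idle otherwise."""
    duty = conv_time_us * 1e-6 * rate_hz   # fraction of time spent ranging
    return peak_ma * 1000.0 * duty + idle_ua * (1.0 - duty)

# Illustrative: a very dark (3%) target needing ~10 ms convergence at 10 Hz
# with an assumed 7.6 mA peak averages to roughly 760 uA, in line with the
# slide; a white (88%) target converging in ~0.5 ms averages to ~40 uA.
```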
New FlightSense™ VL6180X Sensor
Disruptive Time-of-Flight Technology
• 6 years of R&D
• Key patents for an innovative sensor/system architecture
• Differentiating, unique technology
• Manufactured in ST's custom process
Fast, Accurate Distance Ranging
• Independent of object reflectance (color)
• Ambient rejection (sunlight, etc.)
• Phone window "crosstalk" compensation
• Enables creative use cases (1D gesture application)
• High-sensitivity ALS
  • Ultra-wide dynamic range
  • Calibrated output value in lux
Simplified Integration & Manufacturing
• "Invisible" for industrial design
• Small reflowable module with embedded light emitter
• No additional optics or gasket
• No phone-to-phone calibration required
• Robust to phone glass manufacturing dispersion
• Robust to phone drops / minimizes field returns
• Robust supply chain with dual-sourcing strategy
• Production in H1 2014
Product Readiness
• Mass Production in H1/2014
• CP code: VL6180XV0NR/1
• Not just for mobile phone applications
• Robust proximity detection
• Consumer robotics
• Gaming
• And much more!
• SPAD/ToF potential applications are endless
Check out our page on ST.com
ToF for Gestures: Motivation
• Multiple outputs eliminate ambiguity for gesture detection
[Figure: up/down and swipe gestures, each characterized by distance and amplitude traces.]
Multiple ToF Sensors
Reduce ambiguity with more info:
• Order
• Position
[Figure: gesture capabilities with 1 device vs 2 devices.]
3D Gesture Detection
• High potential for differentiation
• Setting Expectations
• Unlike a touchscreen
• No “touch” or “release” – Detecting user intention more difficult
• No physical boundary
• “Live” interaction vs post-processed result
Optical 3D gestures can complement existing systems
• Off screen/over-screen sensing volume
• New use cases
• Wakeup or UI response as user approaches
• Hands-free interaction (many ideas)
• Gaming controller
1D Gesture Detection
• Time-of-Flight IR → 2 outputs
  • Time domain → distance
  • Amplitude → signal & noise
• Object properties → 2 unknowns
  • Distance
  • Reflectance (surface, % fill factor, multiple objects)
• Distance measurement alone
  • Tap / double-tap
  • Up/down level control
• IF we assume fixed object reflectance
  • Lateral motion can be estimated from a single pixel!
Lateral Motion Estimation
• Assume
• Surface properties are stable (same object) within a given time period
  • → the only change must be due to the % of the FOV filled (represents x-y motion)
• We can therefore calculate the % of the field of view filled by the object
  • Independent of object distance!
[Figure: object filling 20%, 80%, 100%, 80%, then 20% of the FOV; amplitude can be normalized using distance info.]
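The normalization step can be sketched as follows (a toy model of my own, not ST's algorithm, assuming amplitude scales as reflectance × fill / distance²):

```python
# Toy model: received amplitude ~ reflectance * fill_fraction / distance**2.
# Since ToF reports distance directly, amplitude can be normalized by
# distance**2; with stable reflectance (same object), the normalized value
# tracks the fraction of the FOV the object fills, i.e. lateral x-y motion.

def fov_fill(amplitude: float, distance_mm: float,
             ref_amplitude: float, ref_distance_mm: float) -> float:
    """Fill fraction relative to a reference measurement at 100% fill."""
    return (amplitude * distance_mm ** 2) / (ref_amplitude * ref_distance_mm ** 2)

# Reference: full FOV at 50 mm reads amplitude 400. Later the object reads
# amplitude 80 at 100 mm -> fill = (80 * 100**2) / (400 * 50**2) = 0.8,
# i.e. 80% of the FOV, independent of the distance change.
```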
% Field-of-View Coverage
• Object reflectance modeled at all distances
• Model needs to include non-linearities
[Figure: non-linearities at the range extremes — sensor saturation and a blocked emitter.]
3D Gesture Detection: 2 ToF Pixels
• Spatially or angularly separated ToF detectors
• Linear continuous slider
• Smooth triangulation of position/speed in X (horizontal) and Z (vertical)
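A minimal sketch of the two-detector slider idea (a toy amplitude-centroid model of my own, not ST's triangulation): the amplitude balance between the two detectors gives a continuous lateral position, while the ToF range supplies Z.

```python
def slider_position(amp_left: float, amp_right: float) -> float:
    """Continuous 0..1 lateral position from two detector amplitudes.

    0.0 = fully over the left detector, 1.0 = fully over the right.
    Toy centroid model; a real system would also fuse the ToF ranges.
    """
    total = amp_left + amp_right
    if total <= 0:
        raise ValueError("no signal on either detector")
    return amp_right / total

# An object centred between the detectors reads 0.5; sliding it right
# pushes the value toward 1.0, giving a smooth, continuous control.
```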
Gesture Definitions: 4 ToF Pixels
• Continuous control → hover & tilt
• Great gameplay
• Can act as a mouse/track pad
• Post-processed movement examples
[Figure: swipe, press, and wave gestures.]
More Gesture Definitions
• Movement properties that can be detected
  • Lateral/angular motion speed
  • Object width
  • Closed vs spread fingers
  • Hand-tilt detection
• Goal is ROBUST detection of gestures/motion
[Figure: flat hand swipe, 4-finger swipe, tilted hand swipe.]
What’s Next?
• 3D gestures are a wide field → more to come!
Q&A