R-30iA CONTROLLER
iRVision 2D Vision
START-UP GUIDANCE
B-82774EN-3/03
Original Instructions
Before using the Robot, be sure to read the "FANUC Robot Safety Manual (B-80687EN)" and
understand the content.
• No part of this manual may be reproduced in any form.
• All specifications and designs are subject to change without notice.
The products in this manual are controlled based on Japan’s “Foreign Exchange and
Foreign Trade Law”. The export from Japan may be subject to an export license by the
government of Japan.
Further, re-export to another country may be subject to the license of the government of
the country from where the product is re-exported. Furthermore, the product may also be
controlled by re-export regulations of the United States government.
Should you wish to export or re-export these products, please contact FANUC for advice.
In this manual we have tried to describe as many matters as possible. However, we cannot describe every matter that must not be done, or that cannot be done, because the possibilities are too numerous.
Therefore, any matter that is not explicitly described as possible in this manual should be regarded as "impossible".
SAFETY

1   SAFETY PRECAUTIONS
For the safety of the operator and the system, follow all safety precautions when operating a robot and its
peripheral devices installed in a work cell.
In addition, refer to the “FANUC Robot SAFETY HANDBOOK (B-80687EN)”.
1.1   WORKING PERSON
The personnel can be classified as follows.
Operator:
• Turns robot controller power ON/OFF
• Starts robot program from operator’s panel
Programmer or teaching operator:
• Operates the robot
• Teaches robot inside the safety fence
Maintenance engineer:
• Operates the robot
• Teaches robot inside the safety fence
• Maintenance (adjustment, replacement)
- An operator is not allowed to work inside the safety fence.
- A programmer, teaching operator, and maintenance engineer can work inside the safety fence. Work inside the safety fence includes lifting, setting, teaching, adjusting, maintenance, etc. To work inside the fence, the person must be trained in proper robot operation.
- During the operation, programming, and maintenance of the robot system, the programmer, teaching operator, and maintenance engineer should take additional care of their own safety by observing the following precautions:
  • Wear adequate clothing or uniforms during system operation.
  • Wear safety shoes.
  • Wear a helmet.
1.2   WORKING PERSON SAFETY
Working person safety is the primary safety consideration. Because it is very dangerous to enter the
operating space of the robot during automatic operation, adequate safety precautions must be observed.
The following lists the general safety precautions. Careful consideration must be made to ensure working
person safety.
(1) Have the robot system working persons attend the training courses held by FANUC.
FANUC provides various training courses. Contact our sales office for details.
(2) Even when the robot is stationary, it is possible that the robot is still in a ready to move state, and is
waiting for a signal. In this state, the robot is regarded as still in motion. To ensure working person
safety, provide the system with an alarm to indicate visually or aurally that the robot is in motion.
(3) Install a safety fence with a gate so that no working person can enter the work area without passing
through the gate. Install an interlocking device, a safety plug, and so forth in the safety gate so that the
robot is stopped as the safety gate is opened.
The controller is designed to receive this interlocking signal of the door switch. When the gate is opened and this signal is received, the controller stops the robot (refer to "STOP TYPE OF ROBOT" in SAFETY for details of the stop type). For connection, see Fig. 1.2 (a) and Fig. 1.2 (b).
(4) Provide the peripheral devices with appropriate grounding (Class A, Class B, Class C, and Class D).
(5) Try to install the peripheral devices outside the work area.
(6) Draw an outline on the floor, clearly indicating the range of the robot motion, including the tools such
as a hand.
(7) Install a mat switch or photoelectric switch on the floor with an interlock to a visual or aural alarm that
stops the robot when a working person enters the work area.
(8) If necessary, install a safety lock so that no one except the working person in charge can turn on the
power of the robot.
The circuit breaker installed in the controller is designed to disable anyone from turning it on
when it is locked with a padlock.
(9) When adjusting each peripheral device independently, be sure to turn off the power of the robot.
(Figure: the robot connects to the controller via RP1 (Pulsecoder), RI/RO, XHBK, XROT, RM1 (motor power/brake), and EARTH. The safety fence has an interlocking device and safety plug that are activated if the gate is opened.)
Fig. 1.2 (a) Safety Fence and Safety Gate
(Diagram: for a dual chain, the safety fence contacts are wired to panel board terminals EAS1-EAS11 and EAS2-EAS21; for a single chain, to panel board terminals FENCE1-FENCE2.)
NOTE
In case of R-30iA:
Terminals EAS1, EAS11, EAS2, EAS21 or FENCE1, FENCE2 are provided on the operation box or on the terminal block of the printed circuit board.
In case of R-30iA Mate:
Terminals EAS1, EAS11, EAS2, EAS21 are provided on the emergency stop board or connector panel (in case of Open air type).
Terminals FENCE1, FENCE2 are provided on the emergency stop board.
Refer to the controller maintenance manual for details.
Fig. 1.2 (b) Connection Diagram for Safety Fence
1.2.1   Operator Safety
The operator is a person who operates the robot system. In this sense, a worker who operates the teach
pendant is also an operator. However, this section does not apply to teach pendant operators.
(1) If you do not have to operate the robot, turn off the power of the robot controller or press the
EMERGENCY STOP button, and then proceed with necessary work.
(2) Operate the robot system at a location outside of the safety fence.
(3) Install a safety fence with a safety gate to prevent any worker other than the operator from entering the
work area unexpectedly and to prevent the worker from entering a dangerous area.
(4) Install an EMERGENCY STOP button within the operator’s reach.
The robot controller is designed to be connected to an external EMERGENCY STOP button.
With this connection, the controller stops the robot operation when the external EMERGENCY STOP button is pressed (refer to "STOP TYPE OF ROBOT" in SAFETY for details of the stop type). See the diagram below for connection.
(Diagram: for a dual chain, the external stop button is wired to panel board terminals EES1-EES11 and EES2-EES21; for a single chain, to panel board terminals EMGIN1-EMGIN2.)
NOTE
Connect EES1 and EES11, EES2 and EES21, or EMGIN1 and EMGIN2.
In case of R-30iA:
EES1, EES11, EES2, EES21 or EMGIN1, EMGIN2 are on the panel board.
In case of R-30iA Mate:
EES1, EES11, EES2, EES21 are on the emergency stop board or connector panel (in case of Open air type).
EMGIN1, EMGIN2 are on the emergency stop board.
Refer to the maintenance manual of the controller for details.
Fig. 1.2.1 Connection Diagram for External Emergency Stop Button
1.2.2   Safety of the Teach Pendant Operator

While teaching the robot, the operator must enter the work area of the robot. Special care must therefore be taken to ensure the safety of the teach pendant operator.
(1) Unless it is specifically necessary to enter the robot work area, carry out all tasks outside the area.
(2) Before teaching the robot, check that the robot and its peripheral devices are all in the normal operating
condition.
(3) If it is inevitable to enter the robot work area to teach the robot, check the locations, settings, and other
conditions of the safety devices (such as the EMERGENCY STOP button, the DEADMAN switch on
the teach pendant) before entering the area.
(4) The programmer must be extremely careful not to let anyone else enter the robot work area.
Our operator panel is provided with an emergency stop button and a key switch (mode switch) for selecting the automatic operation mode (AUTO) and the teach modes (T1 and T2). Before entering the inside of the safety fence for the purpose of teaching, set the switch to a teach mode, remove the key from the mode switch to prevent other people from changing the operation mode carelessly, then open the safety gate. If the safety gate is opened with the automatic operation mode set, the robot stops (refer to "STOP TYPE OF ROBOT" in SAFETY for details of the stop type). After the switch is set to a teach mode, the safety gate is disabled. The programmer should understand that the safety gate is disabled and is responsible for keeping other people from entering the inside of the safety fence. (In case of the R-30iA Mate Controller standard specification, there is no mode switch. The automatic operation mode and the teach mode are selected by the teach pendant enable switch.)
Our teach pendant is provided with a DEADMAN switch as well as an emergency stop button. These function as follows:
(1) Emergency stop button: Causes an emergency stop (refer to "STOP TYPE OF ROBOT" in SAFETY for details of the stop type) when pressed.
(2) DEADMAN switch: Functions differently depending on the mode switch setting status.
    (a) Automatic operation mode: The DEADMAN switch is disabled.
    (b) Teach mode: Servo power is turned off when the operator releases the DEADMAN switch or when the operator presses the switch strongly.
Note) The DEADMAN switch is provided to stop the robot when the operator releases the teach pendant or presses the pendant strongly in case of emergency. The R-30iA/R-30iA Mate employs a 3-position DEADMAN switch, which allows the robot to operate when the 3-position DEADMAN switch is pressed to its intermediate point. When the operator releases the DEADMAN switch or presses the switch strongly, the robot stops immediately.
The operator's intention of starting teaching is determined by the controller through the dual operation of setting the teach pendant enable/disable switch to the enable position and pressing the DEADMAN switch. The operator should make sure that the robot can operate under such conditions and is responsible for carrying out tasks safely.
The teach pendant, operator panel, and peripheral device interface each send a robot start signal. However, the validity of each signal changes as follows depending on the mode switch and the DEADMAN switch of the operator panel, the teach pendant enable switch, and the remote condition on the software.
In case of R-30iA Controller or CE or RIA specification of R-30iA Mate Controller

Mode   Teach pendant    Software remote   Teach pendant     Operator panel     Peripheral device
       enable switch    condition
AUTO   On               Local             Not allowed       Not allowed        Not allowed
AUTO   On               Remote            Not allowed       Not allowed        Not allowed
AUTO   Off              Local             Not allowed       Allowed to start   Not allowed
AUTO   Off              Remote            Not allowed       Not allowed        Allowed to start
Mode     Teach pendant    Software remote   Teach pendant      Operator panel   Peripheral device
         enable switch    condition
T1, T2   On               Local             Allowed to start   Not allowed      Not allowed
T1, T2   On               Remote            Allowed to start   Not allowed      Not allowed
T1, T2   Off              Local             Not allowed        Not allowed      Not allowed
T1, T2   Off              Remote            Not allowed        Not allowed      Not allowed
In case of standard specification of R-30iA Mate Controller

Teach pendant enable switch   Software remote condition   Teach pendant      Peripheral device
On                            Ignored                     Allowed to start   Not allowed
Off                           Local                       Not allowed        Not allowed
Off                           Remote                      Not allowed        Not allowed
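As an illustration only, the start-permission rules above for the R-30iA (and CE/RIA R-30iA Mate) can be modeled as a small lookup. The dictionary and function below are a hypothetical sketch, not FANUC controller software; the state encoding is our own.

```python
# Hypothetical model of the R-30iA start-signal permission tables above.
# Key: (mode, teach-pendant enable switch, software remote condition).
# Value: the single source allowed to start the robot, or None.
START_PERMISSION = {
    ("AUTO",   "on",  "local"):  None,
    ("AUTO",   "on",  "remote"): None,
    ("AUTO",   "off", "local"):  "operator panel",
    ("AUTO",   "off", "remote"): "peripheral device",
    ("T1/T2",  "on",  "local"):  "teach pendant",
    ("T1/T2",  "on",  "remote"): "teach pendant",
    ("T1/T2",  "off", "local"):  None,
    ("T1/T2",  "off", "remote"): None,
}

def allowed_to_start(source, mode, tp_enable, remote):
    """Return True if the given source may start the robot in this state."""
    return START_PERMISSION[(mode, tp_enable, remote)] == source

# In AUTO mode with the teach pendant disabled and remote set,
# only the peripheral device interface may start the robot.
print(allowed_to_start("peripheral device", "AUTO", "off", "remote"))  # True
print(allowed_to_start("teach pendant", "AUTO", "off", "remote"))      # False
```

Note that at most one source is valid in any state, which matches the tables: enabling the teach pendant invalidates both panel and peripheral starts.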
(5) (Only when the R-30iA Controller or the CE or RIA specification of the R-30iA Mate Controller is selected.) To start the system using the operator's panel, make certain that nobody is in the robot work area and that there are no abnormal conditions in the robot work area.
(6) When a program is completed, be sure to carry out a test run according to the procedure below.
(a) Run the program for at least one operation cycle in the single step mode at low speed.
(b) Run the program for at least one operation cycle in the continuous operation mode at low speed.
(c) Run the program for one operation cycle in the continuous operation mode at the intermediate
speed and check that no abnormalities occur due to a delay in timing.
(d) Run the program for one operation cycle in the continuous operation mode at the normal
operating speed and check that the system operates automatically without trouble.
(e) After checking the completeness of the program through the test run above, execute it in the
automatic operation mode.
(7) While operating the system in the automatic operation mode, the teach pendant operator should leave
the robot work area.
1.2.3   Safety of the Maintenance Engineer
For the safety of maintenance personnel, pay utmost attention to the following.
(1) During operation, never enter the robot work area.
(2) Except when specifically necessary, turn off the power of the controller while carrying out
maintenance. Lock the power switch, if necessary, so that no other person can turn it on.
(3) If it becomes necessary to enter the robot operation range while the power is on, press the emergency
stop button on the operator panel, or the teach pendant before entering the range. The maintenance
personnel must indicate that maintenance work is in progress and be careful not to allow other people
to operate the robot carelessly.
(4) When disconnecting the pneumatic system, be sure to reduce the supply pressure.
(5) Before the start of teaching, check that the robot and its peripheral devices are all in the normal
operating condition.
(6) Do not operate the robot in the automatic mode while anybody is in the robot work area.
(7) When you maintain the robot alongside a wall or instrument, or when multiple workers are working
nearby, make certain that their escape path is not obstructed.
(8) When a tool is mounted on the robot, or when any moving device other than the robot is installed, such
as belt conveyor, pay careful attention to its motion.
(9) If necessary, have a worker who is familiar with the robot system stand beside the operator panel and
observe the work being performed. If any danger arises, the worker should be ready to press the
EMERGENCY STOP button at any time.
(10) When replacing or reinstalling components, take care to prevent foreign matter from entering the
system.
(11) When handling a unit or printed circuit board in the controller during inspection, turn off the circuit breaker to protect against electric shock.
If there are two cabinets, turn off both circuit breakers.
(12) When replacing parts, be sure to use those specified by FANUC.
In particular, never use fuses or other parts of non-specified ratings. They may cause a fire or result in
damage to the components in the controller.
(13) When restarting the robot system after completing maintenance work, make sure in advance that there
is no person in the work area and that the robot and the peripheral devices are not abnormal.
1.3   SAFETY OF THE TOOLS AND PERIPHERAL DEVICES

1.3.1   Precautions in Programming
(1) Use a limit switch or other sensor to detect a dangerous condition and, if necessary, design the program
to stop the robot when the sensor signal is received.
(2) Design the program to stop the robot when an abnormal condition occurs in any other robots or
peripheral devices, even though the robot itself is normal.
(3) For a system in which the robot and its peripheral devices are in synchronous motion, particular care
must be taken in programming so that they do not interfere with each other.
(4) Provide a suitable interface between the robot and its peripheral devices so that the robot can detect the
states of all devices in the system and can be stopped according to the states.
1.3.2   Precautions for Mechanism
(1) Keep the component cells of the robot system clean, and operate the robot in an environment free of
grease, water, and dust.
(2) Employ a limit switch or mechanical stopper to limit the robot motion so that the robot or cable does
not strike against its peripheral devices or tools.
(3) Observe the following precautions about the mechanical unit cables. Failure to observe these precautions may cause unexpected trouble.
    • Use mechanical unit cables that have the required user interface.
    • Do not add user cables or hoses inside the mechanical unit.
    • Do not obstruct the movement of the mechanical unit cable when cables are added outside the mechanical unit.
    • In the case of a model whose cable is exposed, do not perform any remodeling (such as adding a protective cover or further fixing an outside cable) that obstructs the behavior of the exposed part of the cable.
    • Do not interfere with the other parts of the mechanical unit when installing equipment on the robot.
(4) Frequent power-off stops during operation cause robot trouble. Avoid system designs in which a power-off stop is issued routinely (refer to the bad examples below). When the stop is not urgent, execute the power-off stop only after reducing the speed of the robot and stopping it with a hold stop or cycle stop. (Refer to "STOP TYPE OF ROBOT" in SAFETY for details of the stop types.)
(Bad examples)
    • Whenever a defective product is produced, the line is stopped by an emergency stop.
    • When a modification is necessary, the safety fence is opened while the robot is operating, so the safety switch operates and a power-off stop is executed.
    • An operator presses the emergency stop button frequently, and the line stops.
    • An area sensor or a mat switch connected to a safety signal operates routinely, and a power-off stop is executed for the robot.
(5) The robot stops urgently when an alarm such as the collision detection alarm (SV050) occurs. Frequent urgent stops caused by alarms also lead to robot trouble, so remove the causes of the alarm.
1.4   SAFETY OF THE ROBOT MECHANISM

1.4.1   Precautions in Operation
(1) When operating the robot in the jog mode, set it at an appropriate speed so that the operator can
manage the robot in any eventuality.
(2) Before pressing the jog key, be sure you know in advance what motion the robot will perform in the
jog mode.
1.4.2   Precautions in Programming
(1) When the work areas of robots overlap, make certain that the motions of the robots do not interfere
with each other.
(2) Be sure to specify the predetermined work origin in a motion program for the robot and program the
motion so that it starts from the origin and terminates at the origin.
Make it possible for the operator to easily distinguish at a glance that the robot motion has terminated.
1.4.3   Precautions for Mechanisms
(1) Keep the work areas of the robot clean, and operate the robot in an environment free of grease, water,
and dust.
1.4.4   Procedure to Move Arm Without Drive Power in Emergency or Abnormal Situations

In emergency or abnormal situations (e.g. persons trapped in or by the robot), the brake release unit can be used to move the robot axes without drive power.
Refer to the controller maintenance manual and the mechanical unit operator's manual for how to use the brake release unit and how to support the robot.
1.5   SAFETY OF THE END EFFECTOR

1.5.1   Precautions in Programming
(1) To control the pneumatic, hydraulic and electric actuators, carefully consider the necessary time delay
after issuing each control command up to actual motion and ensure safe control.
(2) Provide the end effector with a limit switch, and control the robot system by monitoring the state of the
end effector.
1.6   STOP TYPE OF ROBOT
The following three robot stop types exist:
Power-Off Stop (Category 0 following IEC 60204-1)
Servo power is turned off and the robot stops immediately. Servo power is turned off when the robot is
moving, and the motion path of the deceleration is uncontrolled.
The following processing is performed at Power-Off stop.
- An alarm is generated and servo power is turned off.
- The robot operation is stopped immediately. Execution of the program is paused.
Controlled stop (Category 1 following IEC 60204-1)
The robot is decelerated until it stops, and servo power is turned off.
The following processing is performed at Controlled stop.
- The alarm "SRVO-199 Controlled stop" occurs along with a decelerated stop. Execution of the program is paused.
- An alarm is generated and servo power is turned off.
Hold (Category 2 following IEC 60204-1)
The robot is decelerated until it stops, and servo power remains on.
The following processing is performed at Hold.
- The robot operation is decelerated until it stops. Execution of the program is paused.
WARNING
The stopping distance and stopping time of Controlled stop are longer than the
stopping distance and stopping time of Power-Off stop. A risk assessment for the
whole robot system, which takes into consideration the increased stopping
distance and stopping time, is necessary when Controlled stop is used.
When the E-Stop button is pressed or the FENCE is open, the stop type of robot is Power-Off stop or
Controlled stop. The configuration of stop type for each situation is called stop pattern. The stop pattern is
different according to the controller type or option configuration.
There are the following 3 stop patterns.

Stop pattern   Mode   E-Stop    External   FENCE    SVOFF    Servo
                      button    E-Stop     open     input    disconnect
A              AUTO   P-Stop    P-Stop     C-Stop   C-Stop   P-Stop
A              T1     P-Stop    P-Stop     -        C-Stop   P-Stop
A              T2     P-Stop    P-Stop     -        C-Stop   P-Stop
B              AUTO   P-Stop    P-Stop     P-Stop   P-Stop   P-Stop
B              T1     P-Stop    P-Stop     -        P-Stop   P-Stop
B              T2     P-Stop    P-Stop     -        P-Stop   P-Stop
C              AUTO   C-Stop    C-Stop     C-Stop   C-Stop   C-Stop
C              T1     P-Stop    P-Stop     -        C-Stop   P-Stop
C              T2     P-Stop    P-Stop     -        C-Stop   P-Stop

P-Stop: Power-Off stop
C-Stop: Controlled stop
-:      Not stop
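For illustration only, the stop-pattern rules above can be sketched as a nested lookup. This is a hypothetical encoding we introduce here, not FANUC controller software; "P" stands for Power-Off stop, "C" for Controlled stop, and None for no stop.

```python
# Hypothetical encoding of the stop-pattern table above.
# STOP_PATTERN[pattern][event][mode] gives the stop type.
STOP_PATTERN = {
    "A": {"E-Stop button":    {"AUTO": "P", "T1": "P",  "T2": "P"},
          "External E-Stop":  {"AUTO": "P", "T1": "P",  "T2": "P"},
          "FENCE open":       {"AUTO": "C", "T1": None, "T2": None},
          "SVOFF input":      {"AUTO": "C", "T1": "C",  "T2": "C"},
          "Servo disconnect": {"AUTO": "P", "T1": "P",  "T2": "P"}},
    "B": {"E-Stop button":    {"AUTO": "P", "T1": "P",  "T2": "P"},
          "External E-Stop":  {"AUTO": "P", "T1": "P",  "T2": "P"},
          "FENCE open":       {"AUTO": "P", "T1": None, "T2": None},
          "SVOFF input":      {"AUTO": "P", "T1": "P",  "T2": "P"},
          "Servo disconnect": {"AUTO": "P", "T1": "P",  "T2": "P"}},
    "C": {"E-Stop button":    {"AUTO": "C", "T1": "P",  "T2": "P"},
          "External E-Stop":  {"AUTO": "C", "T1": "P",  "T2": "P"},
          "FENCE open":       {"AUTO": "C", "T1": None, "T2": None},
          "SVOFF input":      {"AUTO": "C", "T1": "C",  "T2": "C"},
          "Servo disconnect": {"AUTO": "C", "T1": "P",  "T2": "P"}},
}

# With stop pattern C, an E-Stop in AUTO is a Controlled stop,
# but the same E-Stop in T1 is a Power-Off stop.
print(STOP_PATTERN["C"]["E-Stop button"]["AUTO"])  # prints C
print(STOP_PATTERN["C"]["E-Stop button"]["T1"])    # prints P
```

The sketch makes one property of the table easy to check: pattern C only ever softens a stop to Controlled in AUTO mode, never in T1/T2.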
The following table indicates the stop pattern according to the controller type and option configuration.

                                    R-30iA                                  R-30iA Mate
Option                              Standard   Standard   RIA    CE         Standard   RIA    CE
                                    (Single)   (Dual)     type   type                  type   type
Standard                            B (*)      A          A      A          A (**)     A      A
Stop type set (Stop pattern C)
(A05B-2500-J570)                    N/A        N/A        C      C          N/A        C      C

(*) R-30iA standard (single) does not have servo disconnect.
(**) R-30iA Mate Standard does not have servo disconnect, and the stop type of SVOFF input is Power-Off stop.
The stop pattern of the controller is displayed in the "Stop pattern" line of the software version screen. Refer to "Software version" in the operator's manual of the controller for details of the software version screen.
"Stop type set (Stop pattern C)" option
"Stop type set (Stop pattern C)" (A05B-2500-J570) is an optional function. When this option is loaded, the stop type of the following alarms becomes Controlled stop, but only in AUTO mode. In T1 or T2 mode, the stop type is Power-Off stop, which is the normal operation of the system.
Alarm                                  Condition
SRVO-001 Operator panel E-stop         Operator panel E-stop is pressed.
SRVO-002 Teach pendant E-stop          Teach pendant E-stop is pressed.
SRVO-007 External emergency stops      External emergency stop input (EES1-EES11, EES2-EES21) is open. (R-30iA controller)
SRVO-194 Servo disconnect              Servo disconnect input (SD4-SD41, SD5-SD51) is open. (R-30iA controller)
SRVO-218 Ext.E-stop/Servo Disconnect   External emergency stop input (EES1-EES11, EES2-EES21) is open. (R-30iA Mate controller)
SRVO-408 DCS SSO Ext Emergency Stop    In DCS Safe I/O connect function, SSO[3] is OFF.
SRVO-409 DCS SSO Servo Disconnect      In DCS Safe I/O connect function, SSO[4] is OFF.
Controlled stop differs from Power-Off stop as follows:
- In Controlled stop, the robot is stopped on the program path. This function is effective for a system where the robot could interfere with other devices if it deviated from the program path.
- In Controlled stop, the physical impact is less than with Power-Off stop. This function is effective for systems where the physical impact to the mechanical unit or EOAT (End Of Arm Tool) should be minimized.
- The stopping distance and stopping time of Controlled stop are longer than those of Power-Off stop, depending on the robot model and axis. Refer to the operator's manual of the particular robot model for stopping distance and stopping time data.
- This function is available only in CE or RIA type hardware.
- When this option is loaded, this function cannot be disabled.
- The stop type of the DCS Position and Speed Check functions is not affected by loading this option.
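The option's effect described above can be sketched as a small rule, for illustration only. The function name, the use of short alarm codes, and the boolean flags are our own hypothetical encoding, not FANUC software.

```python
# Hypothetical sketch of the "Stop type set (Stop pattern C)" option behavior:
# the listed alarms cause a Controlled stop in AUTO mode only; in T1/T2 the
# stop type remains Power-Off stop, as does everything when the option is absent.
OPTION_ALARMS = {
    "SRVO-001", "SRVO-002", "SRVO-007", "SRVO-194",
    "SRVO-218", "SRVO-408", "SRVO-409",
}

def stop_type(alarm, mode, option_loaded):
    """Return the stop type for one of the listed alarms."""
    if option_loaded and mode == "AUTO" and alarm in OPTION_ALARMS:
        return "Controlled stop"
    return "Power-Off stop"

print(stop_type("SRVO-001", "AUTO", option_loaded=True))  # prints Controlled stop
print(stop_type("SRVO-001", "T1", option_loaded=True))    # prints Power-Off stop
```

The mode check mirrors the manual's restriction: loading the option never changes the stop type in the teach modes.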
WARNING
The stopping distance and stopping time of Controlled stop are longer than the
stopping distance and stopping time of Power-Off stop. A risk assessment for the
whole robot system, which takes into consideration the increased stopping
distance and stopping time, is necessary when this option is loaded.
TABLE OF CONTENTS
SAFETY ......................................................................s-1
1   PREFACE ................................................................... 1
    1.1   OVERVIEW OF THE MANUAL .............................................. 1
2   OVERVIEW OF 2D VISION ..................................................... 3
    2.1   OVERVIEW OF 2D SINGLE VIEW VISION PROCESS ........................... 3
        2.1.1   FEATURES AND NOTES ............................................ 4
        2.1.2   LAYOUT ........................................................ 4
    2.2   OVERVIEW OF 2D MULTI VIEW VISION PROCESS ............................ 5
        2.2.1   FEATURES AND NOTES ............................................ 6
        2.2.2   LAYOUT ........................................................ 6
    2.3   OVERVIEW OF FLOATING FRAME VISION PROCESS ........................... 7
        2.3.1   FEATURES AND NOTES ............................................ 7
        2.3.2   LAYOUT ........................................................ 8
    2.4   OVERVIEW OF DEPALLETIZING VISION PROCESS ............................ 9
        2.4.1   FEATURES AND NOTES ............................................ 9
        2.4.2   LAYOUT ....................................................... 10
    2.5   OVERVIEW OF BIN-PICK SEARCH VISION PROCESS ......................... 10
        2.5.1   FEATURES AND NOTES ........................................... 11
        2.5.2   LAYOUT ....................................................... 12
    2.6   OVERVIEW OF 3D TRI-VIEW VISION PROCESS ............................. 13
        2.6.1   FEATURES AND NOTES ........................................... 13
        2.6.2   LAYOUT ....................................................... 14
3   2D SINGLE VIEW VISION PROCESS ............................................ 15
    3.1   PREPARATION BEFORE SETUP ........................................... 16
    3.2   SETUP FOR "FIXED CAMERA + FIXED FRAME OFFSET" ...................... 17
        3.2.1   Camera Setting Data Creation and Teaching ..................... 18
        3.2.2   Robot TCP Setup ............................................... 18
        3.2.3   Calibration Grid Frame Setup .................................. 19
        3.2.4   Application User Frame Setup .................................. 20
        3.2.5   Camera Calibration Data Creation and Teaching ................. 21
        3.2.6   Vision Program Creation and Teaching .......................... 22
        3.2.7   Robot Program Creation and Teaching ........................... 24
        3.2.8   Robot Compensation Operation Check ............................ 24
    3.3   SETUP FOR "ROBOT MOUNTED CAMERA + FIXED FRAME OFFSET" .............. 25
        3.3.1   Camera Setting Data Creation and Teaching ..................... 26
        3.3.2   Robot TCP Setup ............................................... 26
        3.3.3   Calibration Grid Setup ........................................ 26
        3.3.4   Teaching Application User Frame ............................... 27
        3.3.5   Camera Calibration Data Creation and Setup .................... 27
        3.3.6   Vision Program Creation and Teaching .......................... 29
        3.3.7   Robot Program Creation and Teaching ........................... 31
        3.3.8   Robot Compensation Operation Check ............................ 31
    3.4   SETUP FOR "FIXED CAMERA + TOOL OFFSET" ............................. 31
        3.4.1   Camera Setting Data Creation and Teaching ..................... 34
        3.4.2   Calibration Grid Setup ........................................ 34
        3.4.3   User Frame for Compensation Setup ............................. 35
        3.4.4   Camera Calibration Data Creation and Teaching ................. 35
        3.4.5   Vision Program Creation and Teaching .......................... 37
        3.4.6   Robot Program Creation and Teaching ........................... 38
        3.4.7   Robot Compensation Operation Check ............................ 39
4   2D MULTI VIEW VISION PROCESS ............................................. 40
    4.1   PREPARATION BEFORE SETUP ........................................... 41
    4.2   SETUP FOR "FIXED CAMERA + FIXED FRAME OFFSET" ...................... 41
        4.2.1   Camera Setting Data Creation and Teaching ..................... 43
        4.2.2   Robot TCP Setup ............................................... 43
        4.2.3   Calibration Grid Frame Setup .................................. 43
        4.2.4   Application User Frame Setup .................................. 44
        4.2.5   Camera Calibration Data Creation and Teaching ................. 45
        4.2.6   Vision Program Creation and Teaching .......................... 47
        4.2.7   Robot Program Creation and Teaching ........................... 50
        4.2.8   Robot Compensation Operation Check ............................ 51
    4.3   SETUP FOR "ROBOT-MOUNTED CAMERA + FIXED FRAME OFFSET" .............. 51
        4.3.1   Camera Setting Data Creation and Teaching ..................... 52
        4.3.2   Robot TCP Setup ............................................... 53
        4.3.3   Calibration Grid Setup ........................................ 53
        4.3.4   Teaching Application User Frame ............................... 53
        4.3.5   Camera Calibration Data Creation and Setup .................... 54
        4.3.6   Vision Program Creation and Teaching .......................... 55
        4.3.7   Robot Program Creation and Teaching ........................... 60
        4.3.8   Robot Compensation Operation Check ............................ 60
    4.4   SETUP FOR "FIXED CAMERA + TOOL OFFSET" ............................. 60
        4.4.1   Camera Setting Data Creation and Teaching ..................... 62
        4.4.2   Calibration Grid Plate Setup .................................. 62
        4.4.3   User Frame for Compensation Setup ............................. 63
        4.4.4   Camera Calibration Data Creation and Teaching ................. 64
        4.4.5   Vision Program Creation and Teaching .......................... 65
        4.4.6   Robot Program Creation and Teaching ........................... 69
    4.5   Robot Compensation Operation Check .................................. 69
5   SETUP OF FLOATING FRAME VISION PROCESS ................................... 70
    5.1   PREPARATION BEFORE SETUP ........................................... 71
    5.2   SETUP FOR "ROBOT-MOUNTED CAMERA + FIXED FRAME OFFSET" .............. 72
        5.2.1   Camera Setting Data Creation and Teaching ..................... 73
        5.2.2   Robot TCP Setting ............................................. 73
        5.2.3   Calibration Grid Setup ........................................ 74
        5.2.4   Teaching Application User Frame ............................... 74
        5.2.5   Camera Calibration Data Creation and Teaching ................. 75
        5.2.6   Vision Program Creation and Teaching .......................... 77
        5.2.7
        5.2.8
6
Robot Program Creation and Teaching ..................................................................79
Robot Compensation Operation Check ..................................................................79
SETUP OF DEPALLETIZING VISION PROCESS................................ 80
6.1
6.2
PREPARATION BEFORE SETUP .............................................................. 80
SETUP FOR "ROBOT-MOUNTED CAMERA + FIXED FRAME OFFSET" . 81
6.2.1
6.2.2
6.2.3
6.2.4
Camera Setting Data Creation and Teaching .........................................................82
Robot TCP Setting..................................................................................................82
Calibration Grid Setup............................................................................................82
Teaching Application User Frame..........................................................................83
c-2
TABLE OF CONTENTS
B-82774EN-3/03
6.2.5
6.2.6
6.2.7
6.2.8
7
SETUP OF BIN-PICK SEARCH VISION PROCESS ............................ 89
7.1
7.2
PREPARATION BEFORE SETUP .............................................................. 90
SETUP FOR "FIXED CAMERA + FIXED FRAME OFFSET"....................... 91
7.2.1
7.2.2
7.2.3
7.2.4
7.2.5
7.2.6
7.2.7
7.2.8
7.3
Camera Setting Data Creation and Teaching .......................................................101
Robot TCP Setting................................................................................................101
Calibration Grid Setup..........................................................................................101
Teaching Application User Frame........................................................................102
Camera Calibration Data Creation and Teaching.................................................102
Vision Program Creation and Teaching ...............................................................104
Robot Program Creation and Teaching ................................................................107
Robot Compensation Operation Check ................................................................107
SETUP OF 3D TRI-VIEW VISION PROCESS .................................... 108
8.1
8.2
8.3
PREPARATION BEFORE SETUP ............................................................ 108
Placement of Cameras .............................................................................. 109
SETUP FOR "FIXED CAMERA + FIXED FRAME OFFSET"..................... 110
8.3.1
8.3.2
8.3.3
8.3.4
8.3.5
8.3.6
8.3.7
8.4
Camera Setting Data Creation and Teaching .......................................................111
Calibration Grid Frame Setup ..............................................................................112
Application Frame Setup......................................................................................112
Camera Calibration Data Creation and Teaching.................................................113
Vision Program Creation and Teaching ...............................................................115
Robot Program Creation and Teaching ................................................................117
Robot Compensation Operation Check ................................................................118
SETUP FOR "ROBOT-MOUNTED CAMERA + FIXED FRAME OFFSET"118
8.4.1
8.4.2
8.4.3
8.4.4
8.4.5
8.4.6
8.4.7
9
Camera Setting Data Creation and Teaching .........................................................92
Robot TCP Setting..................................................................................................92
Calibration grid frame Setup ..................................................................................93
Application User Frame Setup ...............................................................................93
Camera Calibration Data Creation and Teaching...................................................94
Vision Program Creation and Teaching .................................................................97
Robot Program Creation and Teaching ..................................................................99
Robot Compensation Operation Check ..................................................................99
SETUP FOR "ROBOT MOUNTED CAMERA + fixed frame offset" ........... 100
7.3.1
7.3.2
7.3.3
7.3.4
7.3.5
7.3.6
7.3.7
7.3.8
8
Camera Calibration Data Creation and Teaching...................................................83
Vision Program Creation and Teaching .................................................................85
Robot Program Creation and Teaching ..................................................................88
Robot Compensation Operation Check ..................................................................88
Camera Setting Data Creation and Teaching .......................................................119
Calibration Grid Setup..........................................................................................120
Teaching Application User Frame........................................................................120
Camera Calibration Data Creation and Teaching.................................................121
Vision Program Creation and Teaching ...............................................................124
Robot Program Creation and Teaching ................................................................127
Robot Compensation Operation Check ................................................................127
SET UP OF SNAP IN MOTION ........................................................... 128
9.1
OVERVIEW OF SNAP IN MOTION........................................................... 128
9.1.1
9.1.2
9.1.3
9.1.4
9.1.5
FEATURES..........................................................................................................128
USING SNAP IN MOTION ................................................................................129
CHECKING POSITION AND SPEED AT SNAP ..............................................129
ROBOT PROGRAM FOR SNAP IN MOTION..................................................129
NOTES .................................................................................................................130
c-3
TABLE OF CONTENTS
9.2
STUDY FOR APPLICATION ..................................................................... 130
9.2.1
9.2.2
9.2.3
9.3
B-82774EN-3/03
LIGHT AND EXPOSURE TIME ........................................................................130
IMAGE PROCESSING TIME AND MOTION TIME........................................130
SHIFT OF SNAP POSITION ..............................................................................131
SAMPLE APPLICATIONS ......................................................................... 131
9.3.1
TOOL OFFSET WITH A FIXED CAMERA (2D SINGLE-VIEW VISION
PROCESS) ...........................................................................................................132
9.3.2
TOOL OFFSET WITH A FIXED CAMERA (2D MULTI-VIEW VISION
PROCESS) ...........................................................................................................133
9.3.3
FIXED FRAME OFFSET WITH A ROBOT MOUNTED CAMERA (3D
TRI-VIEW VISION PROCESS)..........................................................................135
9.3.1.1
9.3.2.1
9.3.3.1
ROBOT PROGRAM CREATION AND TEACHING................................... 132
ROBOT PROGRAM CREATION AND TEACHING................................... 134
ROBOT PROGRAM CREATION AND TEACHING................................... 135
10 TROUBLESHOOTING ........................................................................ 137
APPENDIX
A
CALIBRATION GRID SETUP AND TRAINING .................................. 141
c-4
1 PREFACE
This chapter describes an overview of this manual which should be noted before operating the iRVision
function.
1.1 OVERVIEW OF THE MANUAL
Overview
This manual describes how to operate iRVision controlled by the R-30iA controller.
In this manual, only the operation and the technique of programming for the dedicated sensor functions
are explained, assuming that the installation and the setup of the robot are completed. Refer to the
"HANDLING TOOL Operations Manual" about other operations of FANUC Robots.
CAUTION
This manual is based on version 7.70P/03 of the R-30iA system software. Note
that functions and settings not described in this manual may be available, and
that some notation may differ, depending on the software version.
Contents of this manual
Chapter 1    How to use this manual
Chapter 2    Overview of 2D Vision
Chapter 3    Setup of 2D Single-view Vision Process
Chapter 4    Setup of 2D Multi-view Vision Process
Chapter 5    Setup of Floating Frame Vision Process
Chapter 6    Setup of Depalletizing Vision Process
Chapter 7    Setup of Bin-Pick Search Vision Process
Chapter 8    Setup of 3D Tri-View Vision Process
Chapter 9    Setup of Snap in Motion
Chapter 10   Troubleshooting
Appendix     Calibration grid setup and training
Related manuals

R-30iA Operations Manual HANDLING TOOL (B-83124EN-2)
R-30iA Mate Operations Manual LR HANDLING TOOL (B-82724EN-1)
    Topics: Robot functions, operations, programming, interfaces, alarms
    Use: Applicable design, robot installation, teaching, adjustment

R-30iA Maintenance Manual (B-82595EN; B-82595EN-1 for Europe; B-82595EN-2 for RIA)
R-30iA Mate Maintenance Manual (B-82725EN; B-82725EN-1 for Europe; B-82725EN-2 for RIA)
R-30iA Mate Open Air Maintenance Manual (B-82965EN-1)
    Topics: Installation and set-up, connection to peripheral equipment, maintenance of the system
    Use: Installation, start-up, connection, maintenance

Force Sensor / 3D Laser Vision Sensor / iRVision / V-500iA Maintenance Manual (B-82775EN)
    Topics: Connection of the sensors, robot, and control devices; maintenance of the sensors
    Use: Connection and maintenance of the sensors

R-30iA / R-30iA Mate Alarm Code List (B-83124EN-6)
    Topics: Error code listings, causes, and remedies
    Use: Installing and activating the system, connecting the mechanical unit to the peripheral device, and maintenance of the robot

iRVision Operator's Manual (B-82774EN)
    Topics: Outline of iRVision, basic operation, sensor calibration, description of each function, programming, alarms
    Use: Teaching, adjustment
2 OVERVIEW OF 2D VISION
This manual explains the setup procedures for the two-dimensional compensation function of
iRVision. The two-dimensional compensation function can be used in several applications. At
present, it can be used for the following six applications.
<1> 2D Single-view Vision Process
<2> 2D Multi-view Vision Process
<3> Floating Frame Vision Process
<4> Depalletizing Vision Process
<5> Bin-Pick Search Vision Process
<6> 3D Tri-View Vision Process
The two-dimensional compensation function compensates workpiece positions in the translational (X,
Y) directions and the rotational (R) direction. The Depalletizing Vision Process in <4> and the
Bin-Pick Search Vision Process in <5> above can compensate not only in the (X, Y, R) directions but
also in the height (Z) direction of the workpiece. The 3D Tri-View Vision Process in <6> can
compensate in all (X, Y, Z, W, P, R) directions.
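To make the (X, Y, R) compensation arithmetic concrete, here is a minimal planar sketch in plain Python. The helper names are ours, not iRVision's; iRVision performs the equivalent computation internally. The offset transform composes the found pose with the inverse of the reference pose, and is then applied to a taught position.

```python
import math

def pose_to_mat(x, y, r_deg):
    """2D pose (X, Y, R) as a 3x3 homogeneous transform."""
    c, s = math.cos(math.radians(r_deg)), math.sin(math.radians(r_deg))
    return [[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]]

def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def mat_inv_rigid(m):
    """Inverse of a rigid 2D transform: transpose the rotation,
    negate the rotated translation."""
    c, s = m[0][0], m[1][0]
    x, y = m[0][2], m[1][2]
    return [[c, s, -(c * x + s * y)],
            [-s, c, s * x - c * y],
            [0.0, 0.0, 1.0]]

def fixed_frame_offset(found, ref):
    """Offset transform that maps the reference situation onto the found one."""
    return mat_mul(pose_to_mat(*found), mat_inv_rigid(pose_to_mat(*ref)))

def apply_offset(offset, point):
    px, py = point
    return (offset[0][0] * px + offset[0][1] * py + offset[0][2],
            offset[1][0] * px + offset[1][1] * py + offset[1][2])

# Workpiece found shifted +10 mm in X, +5 mm in Y and rotated 90 degrees:
off = fixed_frame_offset(found=(10.0, 5.0, 90.0), ref=(0.0, 0.0, 0.0))
print(apply_offset(off, (1.0, 0.0)))  # → (10.0, 6.0)
```

A taught point at (1, 0) relative to the reference workpiece is carried to (10, 6) by the same rigid motion the workpiece underwent, which is exactly what a fixed frame offset achieves.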
This chapter outlines the above six applications, and Chapters 3 to 8 explain the setup procedures in
detail. Chapter 9 explains the setup procedures for snap in motion. Chapter 10 gives examples of
troubleshooting. The appendix explains the calibration grid and describes how to set grid frame
information.
2.1 OVERVIEW OF 2D SINGLE VIEW VISION PROCESS
This manual describes the procedure to set up an application using 2D Single-view Vision Process.
With 2D Single-view Vision Process, the following four configurations can be used:
<1> Fixed camera + fixed frame offset
<2> Robot mounted camera + fixed frame offset
<3> Fixed camera + tool offset
<4> Robot mounted camera + tool offset
In the configuration "<4> Robot mounted camera + tool offset", robot A holds a camera and robot B
holds a workpiece, and the grip error of the workpiece held by robot B is measured. Because one
robot must acquire the current position of the other, the inter-robot communication function is used
(the robot ring is set up from the main iRVision setup screen). If the camera held by robot A is
handled as a fixed camera, the setup is the same as for "<3> Fixed camera + tool offset".
In any configuration, grid pattern calibration should be used as the method of camera calibration. As
two-dimensional compensation functions, a "2D Multi-view Vision Process" and a "Depalletizing Vision
Process" are available in addition to the four configurations above. Grid pattern calibration is a
method widely used with all of these two-dimensional compensation applications. Moreover, grid
pattern calibration performs camera calibration precisely, which improves the precision with which a
robot picks up a workpiece.
The grid calibration method requires a dedicated calibration grid. A FANUC standard calibration grid
is available in several sizes. It is strongly recommended that you order a calibration grid as well as a
camera and lens. For details, see Appendix A. The 2D Single-view Vision Process is available with
7DA1 series/edition 19 or later.
NOTE
When a fixed camera is used, the robot-generated grid calibration can be used
instead. When there is not an appropriate size of calibration grid for the camera
FOV, the robot-generated grid calibration is useful. For details, see Subsection
11.3, “ROBOT-GENERATED GRID CALIBRATION” in the iRVision operator’s
manual.
This manual is directed to users who are reasonably familiar with FANUC two-dimensional vision.
For details of each setup item, refer to the online help information or the iRVision Operator’s Manual.
2.1.1 FEATURES AND NOTES
This section describes major features of 2D Single-view Vision Process, and provides notes on setup.
Features
− Most general vision applications using two-dimensional compensation can be implemented.
− Fixed frame offset and tool offset can be performed.
− Both a fixed camera and a robot mounted camera can be used.
− When a robot mounted camera is used, the position of a workpiece can be measured by moving the
  robot in the X and Y directions. This capability is provided because the current position of the robot
  is considered in iRVision calculation processing when the workpiece position is calculated.
− Tool offset measures, with a camera, how much a workpiece gripped by a robot deviates from the
  correct grip position. Compensation is then performed so that the robot places the gripped
  workpiece at the predetermined position correctly.
Notes
− When using this function, use 7DA1 series/edition 19 or later.
− It is recommended to use grid calibration. Select a grid size based on the size of the field of view.
  The dots on the calibration grid should cover the field of view.
− This function provides two-dimensional compensation, so compensation is applied in the X, Y,
  and R directions. Accordingly, it is assumed that the height of the workpiece remains unchanged
  and that the workpiece is not tilted.
2.1.2 LAYOUT
2D Single-view Vision Process uses a fixed camera or robot mounted camera.
[Figure: a camera above workpieces on a pallet; the optical axis of the camera is aligned with the Z axis of the user coordinate system for compensation, normal to its XY plane]
− When either a fixed camera or a robot mounted camera is used, ensure that the optical axis of the
  camera is normal to the XY plane of the application user frame, in which the robot performs the
  compensation operation.
2.2 OVERVIEW OF 2D MULTI VIEW VISION PROCESS
The 2D Multi-view Vision Process measures multiple points of a workpiece to perform two-dimensional
compensation. This function is used to measure multiple points of a large workpiece that cannot be
contained in the field of view of a single camera.
Unlike the 2D Single-view Vision Process, the 2D Multi-view Vision Process contains "camera views".
Each measurement point corresponds to a camera view, named camera view 1, camera view 2, and so on.
When a new 2D Multi-view Vision Process is created, two camera views, camera view 1 and camera
view 2, are prepared. With two measurement points, namely two camera views, the 2D Multi-view
Vision Process can measure the position of a large workpiece. With iRVision, up to four camera
views can be specified.
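The idea of recovering one workpiece pose from two camera views can be sketched as a simple planar computation (our own helper names and a simplified model; iRVision performs the equivalent internally): the rotation comes from the change in direction between the two measured points, and the translation from how one point moved after that rotation.

```python
import math

def pose_from_two_points(p1_ref, p2_ref, p1_found, p2_found):
    """Recover the planar movement (X, Y, R) of a large workpiece from two
    measured feature points (camera view 1 and camera view 2)."""
    ang = lambda a, b: math.atan2(b[1] - a[1], b[0] - a[0])
    r = ang(p1_found, p2_found) - ang(p1_ref, p2_ref)   # part rotation
    c, s = math.cos(r), math.sin(r)
    # translation: found point 1 minus the rotated reference point 1
    tx = p1_found[0] - (c * p1_ref[0] - s * p1_ref[1])
    ty = p1_found[1] - (s * p1_ref[0] + c * p1_ref[1])
    return tx, ty, math.degrees(r)

# Part shifted by (10, 20) and rotated 90 degrees:
print(pose_from_two_points((0, 0), (100, 0), (10, 20), (10, 120)))
```

With more than two camera views, a least-squares fit over all measured points would be used instead; the two-point case above is the minimal one.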
Either a fixed camera or a robot-mounted camera can be used. If a robot-mounted camera is used and
the robot holding the robot-mounted camera is moved in the X and Y directions, the position of a
workpiece is measured considering the amount of movement of the robot. As with ordinary
two-dimensional compensation, however, a tilted workpiece cannot be used.
If 2D Multi-view Vision Process is used, "fixed frame offset" or "tool offset" can be selected.
As with 2D Single-view Vision Process, grid pattern calibration should be used. The grid pattern
calibration method requires a dedicated calibration grid. A FANUC standard calibration grid is
available in several sizes. It is strongly recommended to order a calibration grid plate as well as a
camera and lens. For details, see Appendix A.
NOTE
When a fixed camera is used, the robot-generated grid calibration can be used
instead. When there is not an appropriate size of calibration grid for the camera
FOV, the robot-generated grid calibration is useful. For details, see Subsection
11.3, “ROBOT-GENERATED GRID CALIBRATION” in the iRVision operator’s
manual.
This manual is directed to users who are reasonably familiar with FANUC two-dimensional vision.
For details of each setting, refer to the online help or the iRVision Operator's Manual.
2.2.1 FEATURES AND NOTES
This section describes major features of the 2D Multi-view Vision Process function, and provides notes
on setup.
Features
− This function performs two-dimensional compensation by measuring multiple points of a large
  workpiece that cannot be contained in the field of view if only one point is measured.
− Fixed frame offset and tool offset can be performed.
− Both a fixed camera and robot-mounted camera can be used.
− When a robot-mounted camera is used, the position of a workpiece can be measured by moving the
  robot in the X and Y directions. This capability is provided because the current position of a robot
  is considered in iRVision calculation processing when a workpiece position calculation is made.
Notes
− When using this function, use 7.20P/19 or later.
− As with 2D Single-view Vision Process, it is recommended to use grid pattern calibration. Before
  setup, prepare a calibration grid of a size suitable for the field of view.
− Fixed frame offset is applied in the X, Y, and R directions. Accordingly, it is assumed that
  the large workpiece is in the same plane and is not tilted. If the height of a measurement point varies
  from one camera view to another, set the proper workpiece height in the Z direction (the application Z)
  for each camera view. The application Z value is the Z distance of the part from the application user
  frame.
− Up to four measurement points (camera views) can be set.
2.2.2 LAYOUT
The layout of a system with the 2D Multi-view Vision Process is the same as for 2D Single-view Vision
Process. Either a fixed camera or a robot-mounted camera is used.
[Figure: a camera above a large workpiece on a pallet; the optical axis of the camera is normal to the XY plane of the application user frame]
− When either a fixed camera or a robot-mounted camera is used, ensure that the optical axis of the
  camera is normal to the XY plane of the application user frame, in which the robot performs the
  compensation operation.
2.3 OVERVIEW OF FLOATING FRAME VISION PROCESS
The floating frame vision process function can perform two-dimensional measurement at various robot
positions by using one camera calibration data item. In addition, two-dimensional measurement can be
performed in different robot postures from those for camera calibration.
Only the following configuration can be used with the floating frame vision process function:
− Robot mounted camera + fixed frame offset
In the floating frame vision process function, be sure to use grid pattern calibration as the camera
calibration method.
The grid pattern calibration method requires a dedicated calibration grid. The FANUC standard
calibration grid is available in several sizes. It is strongly recommended that you order a calibration grid
as well as a camera and lens. For details of a calibration grid, see Appendix A.
NOTE
When a fixed camera is used, the robot-generated grid calibration can be used
instead. When there is not an appropriate size of calibration grid for the camera
FOV, the robot-generated grid calibration is useful. For details, see Subsection
11.3, “ROBOT-GENERATED GRID CALIBRATION” in the iRVision operator’s
manual.
The floating frame vision process function is available with 7DA3 series/edition 12 or later.
This manual is directed to users who are reasonably familiar with the FANUC two-dimensional vision.
For details of each setting item, refer to the online help information or the iRVision Operator’s Manual.
2.3.1 FEATURES AND NOTES
This section describes major features of floating frame vision process and provides notes on setup.
Features
− Two-dimensional measurement can be performed at various robot positions using the robot mounted
  camera from one calibration data item.
− Two-dimensional measurement can be performed in a posture different from that when the camera
  was calibrated.
− This function is used when a workpiece moves relative to the camera.
− The amount of movement of the robot position is automatically considered when the workpiece
  position is calculated within a vision process.
Notes
− When using this function, use 7DA3 series/edition 12 or later.
− Be sure to use the grid pattern calibration method. Select a calibration grid size appropriate for the
  size of the field of view before setup.
− Compensation data is represented in the user coordinate system, so compensation data items Z, W,
  and P may not be 0 depending on the robot posture during measurement.
− When two-dimensional measurement is performed with the robot posture changed, it may be difficult
  to achieve high precision compared with standard two-dimensional compensation, because it is
  difficult to keep the standoff distance between the camera and the workpiece at a prescribed value,
  and the absolute positioning precision of the robot is likely to affect the measurement precision.
  If higher precision is required, consider using standard two-dimensional compensation instead of
  the floating frame vision process.
2.3.2 LAYOUT
The floating frame vision process uses a robot mounted camera.
[Figure: robot mounted camera at standoff distance D above a workpiece on a pallet; the optical axis of the camera is normal to the XY plane of the application user frame]
− To perform camera calibration, ensure that the camera is orthogonal to the XY plane of the user
  coordinate system used for compensation.
− Consider the layout so that the standoff distance D between the camera and the workpiece is constant
  during measurement.
− In the floating frame vision process function, measurement is enabled when the relative location and
  posture of the workpiece and camera during reference position setting are the same as those during
  actual measurement, as shown in the figure below. If the location and posture of the camera during
  actual measurement differ from those during reference position setting, the user coordinate system
  used for compensation is considered to virtually follow the camera. Even if the location and posture
  of the robot are changed, compensation is performed within the XY plane of this virtual user
  coordinate system.
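The "user frame follows the camera" behavior can be sketched as composing the camera's motion since calibration with the original compensation frame. This is a minimal planar model with our own helper names (not iRVision's API); the same idea extends to full 3D poses.

```python
import math

def pose_mat(x, y, r_deg):
    """2D pose as a 3x3 homogeneous transform."""
    c, s = math.cos(math.radians(r_deg)), math.sin(math.radians(r_deg))
    return [[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]]

def mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def inv(m):
    """Inverse of a rigid 2D transform."""
    c, s, x, y = m[0][0], m[1][0], m[0][2], m[1][2]
    return [[c, s, -(c * x + s * y)], [-s, c, s * x - c * y], [0.0, 0.0, 1.0]]

def virtual_user_frame(cam_now, cam_ref, frame_ref):
    """The compensation frame is carried along by the camera's motion:
    frame_now = cam_now * inv(cam_ref) * frame_ref."""
    return mul(mul(pose_mat(*cam_now), inv(pose_mat(*cam_ref))),
               pose_mat(*frame_ref))

# Camera moved +5 mm in X since calibration: the frame follows it.
f = virtual_user_frame(cam_now=(5.0, 0.0, 0.0), cam_ref=(0.0, 0.0, 0.0),
                       frame_ref=(1.0, 2.0, 0.0))
print(f[0][2], f[1][2])  # → 6.0 2.0
```

The offset found in the image is then expressed in this moved frame, which is why compensation remains valid even though the robot no longer stands where it did during calibration.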
[Figure: camera location and posture during reference position setting vs. during actual measurement; compensation is performed in a virtual user coordinate system that follows the camera at standoff distance D]
2.4 OVERVIEW OF DEPALLETIZING VISION PROCESS
Depalletizing Vision Process is a vision program that performs compensation in the vertical direction in
addition to standard two-dimensional compensation. This function measures the height of the
workpieces based on the size of the workpiece image viewed by the camera.
Two stacks of workpieces are placed, each at a different height. Next, the workpiece height (reference Z
height) in the application user frame and the size (reference size) detected by the camera at each height are
set. This processing differs from ordinary two-dimensional compensation.
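The height-from-size relationship can be illustrated with a simple pinhole model: the apparent size of a workpiece is inversely proportional to the camera-to-workpiece distance. The helper below is a hypothetical sketch calibrated from the two reference stacks, not iRVision's actual computation.

```python
def height_from_size(size, ref1, ref2):
    """Estimate workpiece Z from its apparent size, given two reference
    measurements (z, size) taken at different stack heights.
    Pinhole model: size = k / (z_cam - z), so two references determine
    both the camera height z_cam and the constant k."""
    (z1, s1), (z2, s2) = ref1, ref2
    z_cam = (s1 * z1 - s2 * z2) / (s1 - s2)   # solved from the two references
    k = s1 * (z_cam - z1)
    return z_cam - k / size

# References: 100 pixels at Z = 0 mm and 125 pixels at Z = 200 mm
print(height_from_size(200.0, (0.0, 100.0), (200.0, 125.0)))  # → 500.0
```

A workpiece that appears twice as large as the bottom reference must be halfway up to the camera, which the model places at Z = 500 mm for a camera at 1000 mm.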
Either a fixed camera or a robot-mounted camera can be used. If a robot-mounted camera is used and
the robot holding the robot-mounted camera is moved in the X, Y, and Z directions, the position (X,Y,R)
and height (Z) of the workpiece are measured taking the robot movement into account. As with ordinary
two-dimensional compensation, however, a tilted workpiece cannot be used.
Depalletizing Vision Process performs compensation in the vertical direction as well, so that grid pattern
calibration must be used at all times. The grid pattern calibration method requires a dedicated
calibration grid. A FANUC standard calibration grid is available in different sizes. It is strongly
recommended to order a calibration grid as well as a camera and lens. For details of a calibration grid,
see Appendix A.
NOTE
When a fixed camera is used, the robot-generated grid calibration can be used
instead. When there is not an appropriate size of calibration grid for the camera
FOV, the robot-generated grid calibration is useful. For details, see Subsection
11.3, “ROBOT-GENERATED GRID CALIBRATION” in the iRVision operator’s
manual.
Depalletizing Vision Process is available with 2D iRVision 7.20P/19 or later.
This manual is directed to users who are reasonably familiar with the FANUC two-dimensional vision.
For details of each setting item, refer to the online help information or the iRVision Operator’s Manual.
2.4.1 FEATURES AND NOTES
This section describes major features of Depalletizing Vision Process and provides notes on setup.
Features
− This function performs compensation in the vertical (Z) direction in addition to ordinary
  two-dimensional compensation.
− After an operation for picking up a workpiece from one location of the pallet is taught, a workpiece
  can be picked up from an arbitrary row/column/stage.
− When a robot-mounted camera is used, the position of a workpiece can be measured by moving the
  robot in the X, Y, and Z directions. This capability is provided because the current position of a
  robot is considered in iRVision calculation processing when a workpiece position calculation is made.
Notes
− When using this function, use 2D iRVision 7.20P/19 or later.
− Be sure to use grid calibration. Simple 2-D calibration is not applicable.
− As with ordinary two-dimensional compensation, it is assumed that the workpiece is not tilted.
− The height (reference Z coordinate) of the workpieces is measured based on the size of the workpiece
  image viewed by the camera. For precise size measurement, a stable lighting environment is
  important. When a robot-mounted camera is used, it is recommended to install a ring light around
  the camera.
− The height of the workpieces is measured based on the size of the workpiece image viewed by the
  camera. As a guideline, two workpieces one layer apart should differ in size by at least 5%.
− As the distance between the camera and the workpieces increases, the precision of height measurement
  degrades. If a workpiece to be measured is located far away and the camera is robot-mounted, the
  workpiece can be measured again after approaching it. In this case, the same vision program can be
  used, because the current position of the robot is considered when the workpiece position is calculated.
− As mentioned above, a height change due to a change in the number of layers is calculated from the
  detected workpiece size. The viewed image of the workpieces can vary slightly, so the measured
  height may differ from the actual height of the current layer. To protect against a system failure, it is
  recommended to provide error compensation in the gripper and a sensor on the gripper for detecting
  contact with a workpiece. For example, when the gripper contacts the workpiece, an input can be
  used along with a high-speed skip function.
2.4.2 LAYOUT
This section describes a typical system layout for Depalletizing Vision Process and provides notes on
setup.
[Figure: robot-mounted camera above workpieces stacked on a pallet; the optical axis of the camera is normal to the pallet plane]
− The camera position for measurement is determined from the thickness of a workpiece and the
  number of layers.
− Ensure that the optical axis of the camera is normal to the pallet plane.
− As the distance between the camera and workpiece increases, the precision of height measurement
  degrades, so minimize this distance whenever possible.
− Ensure that the camera focuses on both the workpiece at the top and the workpiece at the bottom.
− Ensure that lighting is provided evenly to the workpiece at the top and the workpiece at the bottom
  whenever possible. This is a key to stable workpiece detection and precise size measurement.
− When a robot-mounted camera is used, it is recommended to install a ring light around the camera.
2.5
OVERVIEW OF BIN-PICK SEARCH VISION PROCESS
The Bin-Pick Search Vision Process function is a vision program that performs compensation in the
height direction in addition to standard two-dimensional compensation. This function measures the
height of a workpiece based on the size of the workpiece image taken by a camera. Basically, this
function works in the same way as Depalletizing Vision Process, but additionally outputs information
about a view line from the camera to the workpiece.
Specifically, two stacks of workpieces are placed, each at a different height, and the workpiece height
(reference Z height) in the user coordinate system and the size (reference size) detected by the camera at
each height are set.
Either a fixed camera or a robot mounted camera can be used. If a robot mounted camera is used and the
robot holding the camera is moved in the X, Y, and Z directions, the position (X,Y,R) and height (Z) of
the workpiece are measured taking the robot movement into account. Since information about a view
line from the camera to the workpiece is output, interference between the pallet and the robot can be
minimized if the robot approaches the workpiece along the view line.
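The approach-along-the-view-line idea can be sketched as a small geometric calculation. This is an illustration only, not iRVision output; the positions and the function name are assumed values.

```python
import math

def point_on_view_line(camera, workpiece, standoff):
    """Return a point `standoff` mm from the workpiece, back along the
    line from the workpiece toward the camera.

    Approaching the workpiece through such points keeps the gripper on
    the camera's view line, which is what minimizes interference
    between the robot and the pallet walls."""
    v = [c - w for c, w in zip(camera, workpiece)]
    n = math.sqrt(sum(x * x for x in v))
    u = [x / n for x in v]
    return [w + standoff * ui for w, ui in zip(workpiece, u)]

# Assumed positions in a user frame (mm): camera above and to the side
# of a workpiece sitting low inside the pallet.
camera = (100.0, 0.0, 1500.0)
workpiece = (250.0, 400.0, 120.0)
approach = point_on_view_line(camera, workpiece, 100.0)  # 100 mm standoff
```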
The Bin-Pick Search Vision Process function performs compensation in the vertical direction as well,
so the grid pattern calibration method must always be used. The grid pattern calibration method
requires a dedicated calibration grid. A FANUC standard calibration grid is available in several sizes.
It is strongly recommended to order a calibration grid together with a camera and lens. For details of
the calibration grid, see Appendix A.
NOTE
When a fixed camera is used, the robot-generated grid calibration can be used
instead. When there is not an appropriate size of calibration grid for the camera
FOV, the robot-generated grid calibration is useful. For details, see Subsection
11.3, “ROBOT-GENERATED GRID CALIBRATION” in the iRVision operator’s
manual.
The Bin-Pick Search Vision Process function is available with 2D iRVision 7DA3 series/edition 12 or
later.
This manual is directed to users who are reasonably familiar with the FANUC two-dimensional vision.
For details on each setup item, refer to the online help information or the iRVision Operator’s Manual.
2.5.1
FEATURES AND NOTES
This section describes major features of the Bin-Pick Search Vision Process function and notes on setup.
Features
− This function outputs information about a view line from the camera to the workpiece in addition to
standard 2.5-dimensional compensation.
− When the robot approaches the workpiece along a view line, proper contact with the workpiece is
achieved and interference between the pallet and the robot can be minimized.
− If the workpiece picking motion is taught at one position in the pallet, the workpiece can be picked
at any position.
− For a robot mounted camera, the workpiece position can be measured with the robot moved in the X,
Y, and Z directions. The amount of movement of the robot is automatically taken into account when
the workpiece position is calculated in the vision process.
Notes
− When using this function, use 7DA3 series/edition 12 or later.
− Be sure to use the grid pattern calibration method.
− Compensation of the workpiece picking position relates to all components of X, Y, Z, W, P, and R;
W and P indicate the view line from the camera to a workpiece, not the inclination of the workpiece.
That is, it is impossible to align the orientation of the gripper with that of the workpiece during
picking. Therefore, the gripper needs to be designed so that the workpiece can be picked up even if
the orientation of the workpiece does not align with that of the gripper. The position component R
is reflected in the robot motion.
− The height (Z coordinate) of a workpiece is measured based on the size of the workpiece image
taken by a camera. A stable lighting environment is important for measuring the size accurately.
− This function assumes picking in a state where workpieces are stacked randomly. In such a state,
the inclination of each workpiece is not constant, so the size cannot be measured accurately.
It is important to consider the gripper design and robot motion in advance so that a workpiece can be
picked even if the height or orientation of the workpiece includes error. It is recommended that a
floating mechanism and a sensor for detecting contact between the floating mechanism and the
workpiece in the approach direction be added to the gripper to prevent a system halt. For example,
if the reed switch of the cylinder and the high-speed skip function are used together, the gripper can
stop its approach as soon as it makes contact with the workpiece.
2.5.2
LAYOUT
This section describes a typical system layout for the Bin-Pick Search Vision Process function and
provides notes on setup.
(Figure: fixed camera above a pallet; the optical axis points down past a magnetic gripper at the workpieces in the pallet.)
− The camera position for measurement is determined from the size and height of the pallet.
− Ensure that the optical axis of the camera is normal to the pallet plane.
− As the distance between the camera and the workpiece increases, the height compensation error
caused by size detection error increases, and the precision in height measurement is degraded.
So, minimize this distance whenever possible.
− Ensure that the camera focuses on both the workpiece near the pallet top and the workpiece near the
pallet bottom.
− Ensure that the lighting unit illuminates the workpiece near the pallet top and the workpiece near the
pallet bottom evenly whenever possible. This is a key to stable workpiece detection and precise
size measurement. When a robot mounted camera is used, it is recommended to install an LED ring
light on the gripper.
− It is impossible to align the orientation of the gripper with that of the workpiece during picking.
Therefore, the gripper needs to be designed so that the workpiece can be picked up even if the
orientation of the workpiece does not align with that of the gripper.
2.6
OVERVIEW OF 3D TRI-VIEW VISION PROCESS
The 3D Tri-View Vision Process is the function for making three-dimensional compensation by
measuring three detection targets of a large workpiece such as a car body. Compensation is applied to
all of six degrees of freedom for parallel displacement (X, Y, Z) and rotation (W, P, R) of the workpiece.
The 3D Tri-View Vision Process has camera views in a program, as in two-dimensional compensation
based on multiple cameras. There are three camera views for measuring a total of three detection
targets. During detection, a total of three gaze lines (one for each camera view) are measured. A
triangle that takes the three detection targets as vertices and has a known shape is fitted to the three
gaze lines to identify the position of each detection target on the corresponding gaze line and obtain the
three-dimensional position and posture of the workpiece. Both a fixed camera and a robot mounted
camera can be used. For a robot mounted camera, even when the robot that has the camera is moved,
the workpiece position is measured with the amount of movement of the robot taken into account.
The 3D Tri-View Vision Process can use only "fixed frame offset" as the compensation method.
In addition, only the grid pattern calibration method is available as calibration method.
The grid pattern calibration method requires a dedicated calibration grid. A FANUC standard calibration
grid is available in several sizes. It is strongly recommended that you order a calibration grid as well as
a camera and lens. For details, see Appendix A.
NOTE
When a fixed camera is used, the robot-generated grid calibration can be used
instead. When there is not an appropriate size of calibration grid for the camera
FOV, the robot-generated grid calibration is useful. For details, see Subsection
11.3, “ROBOT-GENERATED GRID CALIBRATION” in the iRVision operator’s
manual.
The 3D Tri-View Vision Process is available with 7DA4 series/edition 04 or later.
This manual is directed to users who have taken the FANUC two-dimensional vision course. For details
of each setup item, refer to the online help information or the iRVision Operator’s Manual.
2.6.1
FEATURES AND NOTES
This section describes major features of 3D Tri-View Vision Process, and provides notes on setup.
Features
− Three-dimensional compensation is made by measuring three points on a large workpiece having
fluctuations in its position and posture.
− Only "fixed frame offset" can be performed.
− Both a fixed camera and a robot mounted camera can be used.
− A robot mounted camera can measure a detection target while the position of the robot is moved
arbitrarily. This capability is provided because the current position of the robot is considered in
iRVision calculation processing when a target position calculation is made.
Notes
− When using this function, use 7DA4 series/edition 04 or later.
− Only the grid pattern calibration method is available as the calibration method. Select a grid size
based on the size of the field of view. The dots on the calibration grid should cover the field of view.
− Compensation is applied in the X, Y, Z and W, P, R directions.
− The number of measurement points (number of camera views) is three and cannot be changed.
− The following conditions must be met when determining detection targets (for a car body, the
reference holes are suitable):
  - The exact relative positions of the three detection targets can be obtained from a drawing or
    the like.
  - The relative relation between the positions of the three detection targets and the work positions
    does not change individually.
  - The three detection targets can be set so that the whole workpiece is covered.
  - The triangle having the three detection targets as its vertices is not too slim.
  - The shapes of the detection targets do not differ visually.
  - There is no portion having a similar shape near the detection targets.
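The "not too slim" condition above can be checked numerically before choosing targets. The sketch below is an illustration with assumed coordinates and a hypothetical function name, not a FANUC tool; it computes the smallest interior angle of the target triangle, which indicates how well-conditioned the 3D fit will be.

```python
import math

def min_triangle_angle(p1, p2, p3):
    """Smallest interior angle (degrees) of the triangle whose vertices
    are the three detection targets. A very small angle means the
    triangle is slim and the triangle fit becomes ill-conditioned."""
    def angle_at(a, b, c):
        # interior angle at vertex a, between edges a->b and a->c
        u = [b[i] - a[i] for i in range(3)]
        v = [c[i] - a[i] for i in range(3)]
        dot = sum(ui * vi for ui, vi in zip(u, v))
        nu = math.sqrt(sum(x * x for x in u))
        nv = math.sqrt(sum(x * x for x in v))
        return math.degrees(math.acos(dot / (nu * nv)))
    return min(angle_at(p1, p2, p3), angle_at(p2, p3, p1), angle_at(p3, p1, p2))

# Assumed target positions (mm), e.g. reference holes on a car body:
targets = [(0.0, 0.0, 0.0), (1200.0, 0.0, 100.0), (600.0, 900.0, 50.0)]
worst = min_triangle_angle(*targets)  # comfortably above a slim threshold
```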
2.6.2
LAYOUT
3D Tri-View Vision Process uses fixed cameras or robot mounted cameras.
(Figure: three cameras, one per detection target; Target 1, Target 2, and Target 3 on the object form the triangle made of the targets.)
− Unlike two-dimensional compensation, the optical axis of each camera is not orthogonal to the XY
plane of the user frame used by the robot for compensation. Because three gaze lines are measured
during measurement of the detection targets, place the cameras so that no pair of gaze lines is
parallel and the angle formed by any two gaze lines is reasonably large (if possible, 60 degrees or
more). If an angle formed by gaze lines is small, the required compensation precision may not be
obtained. The orientations of the cameras therefore need to differ from one another to some extent.
The user frame used for compensation may be placed anywhere; if there is no frame that must
specifically be used, use the world coordinate system of the robot.
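Whether two camera placements satisfy the angle guideline above can be verified with a dot-product calculation. This is an illustrative sketch; the direction vectors and the function name are assumptions for the example.

```python
import math

def gaze_angle_deg(u, v):
    """Angle (degrees) between two gaze-line direction vectors.
    Per the guideline above, the angle for any pair of gaze lines
    should be large -- if possible, 60 degrees or more."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return math.degrees(math.acos(dot / (nu * nv)))

# Assumed orientations: two cameras looking down at +/-35 degrees from
# vertical in the XZ plane, giving 70 degrees between their gaze lines.
u = (math.sin(math.radians(35)), 0.0, -math.cos(math.radians(35)))
v = (-math.sin(math.radians(35)), 0.0, -math.cos(math.radians(35)))
assert gaze_angle_deg(u, v) >= 60.0  # guideline satisfied
```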
3
2D SINGLE VIEW VISION PROCESS
This chapter describes the setup procedure for 2D Single-view Vision Process by using the following
three application examples:
<1> Fixed camera + fixed frame offset
<2> Robot mounted camera + fixed frame offset
<3> Fixed camera + tool offset
In all of "<1> Fixed camera + fixed frame offset", "<2> Robot mounted camera + fixed frame offset",
and "<3> Fixed camera + tool offset", a user frame is set in parallel with the plane on which the position
of the workpiece deviates. For the fixed frame offset of <1> and <2>, the "Offset" instruction is added
as an operation statement to the teach pendant program. For the tool offset of <3>, the "Tool_Offset"
instruction is added as an operation statement in the teach pendant program.
In tool offset as well, note that a user frame needs to be set in parallel with the plane on which the
position of the workpiece gripped by the robot deviates.
This chapter assumes that a personal computer for teaching is already set up. No description of personal
computer setup is provided here.
Fixed camera + fixed frame offset
An example of layout for "fixed camera + fixed frame offset" is given below.
(Figure: fixed camera above the pallet; the optical axis of the camera is along Z, and the height of the workpiece in the Z direction is the Z coordinate of the measurement plane viewed from the XY plane of the application user frame.)
Robot mounted camera + fixed frame offset
An example of layout for "robot mounted camera + fixed frame offset" is given below.
(Figure: robot mounted camera above the pallet; the height of the workpiece in the Z direction is the Z coordinate of the measurement plane viewed from the XY plane of the application user frame.)
Fixed camera + tool offset
An example of layout for "fixed camera + tool offset " is given below.
(Figure: fixed camera viewing the workpiece gripped by the robot; the workpiece deviation directions lie in the XY plane of the application user frame. If the measurement plane matches the XY plane of the application user frame, "workpiece height in Z direction = 0" results.)
3.1
PREPARATION BEFORE SETUP
Sections 3.2 through 3.4 describe the setup procedures for "fixed camera + fixed frame offset",
"robot mounted camera + fixed frame offset", and "fixed camera + tool offset". First, this section
describes the calibration grid and memory card that are needed for each of these setup procedures.
Calibration grid preparation
With the 2D Single-view Vision Process function, it is recommended to use a calibration grid. Be sure
to prepare a calibration grid beforehand. A standard calibration grid is available from FANUC in
several sizes. It is strongly recommended that you order a calibration grid together with a camera and lens.
Memory card preparation
iRVision can save undetected images to a memory card inserted into the main board of the robot
controller. It is recommended that at the time of system start-up and integration, a memory card be
inserted to save undetected images to the memory card. By doing so, a GPM Locator tool parameter
adjustment can be made using undetected images. Moreover, when the system is reinstalled after being
moved, for example, camera images before reinstallation, if saved, can be checked against camera images
after reinstallation to see if there is any major difference.
Note that even if "Log Failed Images" is set in the vision program, undetected images cannot be saved
when no memory card is inserted.
A memory card, when inserted, can be used to back up all data in the robot controller. If all data in the
robot controller is backed up, the vision data can be backed up at the same time. Be sure to back up all
data in the robot controller upon completion of startup or integration.
3.2
SETUP FOR "FIXED CAMERA + FIXED FRAME OFFSET"
Use the following setup procedure for "fixed camera + fixed frame offset":
1. Camera setting data creation and teaching
2. Robot TCP setting (teach the TCP at the tip of the pointer used for frame teaching; the pointer TCP
   is used to teach the application user frame and the calibration grid frame)
3. Teaching of the calibration grid frame (teach the calibration grid user frame or TCP frame)
4. Teaching of the application user frame (set the application user frame in parallel with the workpiece
   deviation plane)
5. Camera calibration data creation and teaching
6. Vision program creation and teaching
7. Robot program creation and teaching
8. Robot compensation operation check
Setup for "fixed camera + fixed frame offset" needs to set the "calibration grid information" in a user
frame or tool frame with an arbitrary number, and to set the "user frame for compensation" in parallel
with the workpiece deviation plane as a user frame with an arbitrary number.
(Figure: the calibration grid user frame or TCP on the grid, and the application user frame whose XY plane is parallel with the workpiece deviation plane.)
3.2.1
Camera Setting Data Creation and Teaching
With iRVision, items such as a camera type and camera installation method are set in camera setup tool.
Whether to install a camera on the robot or on a fixed stand is set in the camera setup tool. When using
a fixed camera, do not check "Robot-Mounted Camera".
3.2.2
Robot TCP Setup
A pointer tool with a taught TCP is required to teach the application user frame and the calibration user
frame. In general, set the TCP accurately on the pointer installed on the robot gripper. If the accuracy of
this TCP setting is low, the precision in handling of a workpiece by the robot is also degraded, especially
when the workpiece is rotated. Set a robot TCP in an arbitrary tool coordinate system.
To reuse the pointer TCP for recalibration, the reproducibility of pointer installation is required. If the
reproducibility of pointer installation is not assured, a TCP needs to be set each time a pointer is installed.
When a robot mounted camera or calibration grid plate is used, the calibration grid user frame can be set
with the grid frame setting function. In this case, the calibration grid user frame can be set without
setting a TCP for a robot. For details, see Subsection 11.2, ”GRID FRAME SETTING” in the iRVision
operator's manual.
3.2.3
Calibration Grid Frame Setup
Teach the calibration grid location in a user frame or user tool.
To perform calibration by setting the calibration grid that is fixed, teach a user frame to the grid. To
perform calibration by setting a calibration grid on the robot gripper, teach a user tool to the grid.
Securely fix the calibration grid, then teach the frame precisely by touching up the grid with the pointer.
The teaching precision here affects compensation precision, so perform this teaching process precisely.
Note that the frame used for the calibration grid setup may differ from the application user frame used
for compensation described in Subsection 3.2.4.
When a grid is fixed
When the calibration grid is fixed, teach it in a user specified user frame.
Teach the calibration grid precisely with the pointer tool attached to the robot and the proper TCP, then
set the user frame as shown below.
(Figure: user frame (X, Y, Z) taught on the fixed calibration grid.)
For the method of setting a user frame, see Appendix A, "CALIBRATION GRID SETUP".
The user frame 4-point setting method is used.
The number of the user frame set here needs to be different from the number of the user frame set in
Subsection 3.2.4, "Application User Frame Setup".
When the calibration grid is mounted on the robot, teach the calibration grid in a user specified tool frame.
Precisely touch up the calibration grid against the pointer installed on a fixed stand, then set the tool
frame as shown below.
(Figure: user tool frame (X, Y, Z) taught on the robot mounted calibration grid.)
For the method of setting a tool coordinate system, see Appendix A, "CALIBRATION GRID SETUP".
The tool coordinate system 6-point setting method is used.
3.2.4
Application User Frame Setup
Set the user frame to be used by the robot for compensation operation.
If 0 is selected as the coordinate system number, the world coordinate system of the robot is used.
Set a user frame for compensation so that the XY plane of the user frame is parallel with the pallet plane
on which a workpiece moves.
(Figure: fixed camera above the pallet; the optical axis of the camera is along Z, and the height of the workpiece in the Z direction is the Z coordinate of the measurement plane viewed from the XY plane of the application user frame.)
The figure below shows a case where a workpiece moves on a non-horizontal plane.
Ensure that the optical axis of the camera is normal to the plane where the workpiece moves, and that the
plane where the workpiece moves is parallel with the XY plane of the user frame.
(Figure: inclined workpiece plane; the optical axis of the camera is normal to the plane, and the XY plane of the application user frame is parallel with it.)
3.2.5
Camera Calibration Data Creation and Teaching
For camera calibration, the grid pattern calibration tool is recommended.
On the calibration data setup screen, correctly select the frame number set in Subsection 3.2.3,
"Calibration Grid Frame Setup", and the user frame number set in Subsection 3.2.4, "Application User
Frame Setup".
When the calibration grid is fixed
When the calibration grid is fixed, perform one-plane calibration. When one-plane calibration is
performed, the focal length of the lens cannot be calculated precisely. So, set the focal length of the lens
manually after calibration grid detection. If the focal length of the lens used is 12 mm, enter 12.0.
When the calibration grid is robot mounted
When the calibration grid is robot mounted, perform two-plane calibration by moving the robot up and
down as shown in the figure below. The up/down distance for two-plane calibration should be 100 to
150 mm.
Find the grid pattern at two different heights. When moving the calibration grid up and down, jog the
robot without changing the robot posture.
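Why two planes are needed can be seen from the pinhole relation: a single detection gives only the ratio of focal length to distance, while two detections separated by a known travel determine both. The sketch below is a hypothetical numeric illustration with assumed values, not the iRVision calibration algorithm.

```python
def solve_two_plane(s1, s2, dz, grid_mm):
    """Recover camera distance and focal length from two grid detections.

    s1, s2: apparent grid-point spacing (pixels) at the 1st and 2nd
            planes; dz: known up/down travel (mm) between the planes
            (the 2nd plane is farther from the camera).
    Pinhole model: s = f * grid_mm / d, so
        s1 * d1 = s2 * (d1 + dz)  ->  d1 = s2 * dz / (s1 - s2)
    """
    d1 = s2 * dz / (s1 - s2)          # distance to the 1st plane (mm)
    f_pixels = s1 * d1 / grid_mm      # focal length in pixel units
    return d1, f_pixels

# Assumed: 15 mm grid spacing, 150 mm travel, spacings measured in pixels.
d1, f = solve_two_plane(36.0, 27.692307692307693, 150.0, 15.0)
```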
(Figure: camera above the robot mounted calibration grid; the grid is detected at the 1st position and again 100 to 150 mm lower at the 2nd position.)
The calibration grid setup page is shown below. The calibration grid image is displayed.
• In "Application Frame", select the user frame number set in Subsection 3.2.4.
• Enter the interval between grid points.
• For a robot mounted calibration grid, set the number of calibration planes to 2. For a fixed
calibration grid, set the number to 1.
• According to the setting of Subsection 3.2.3, select the user frame number that was taught to the
calibration grid. Next, press the "Set Frame" button.
• For single-plane calibration, enter the focal length manually.
• Move the calibration grid the proper distance from the camera, then press the snap button. Next,
press the "Find" button for the first plane.
• Change the distance between the calibration grid and camera, then press the snap button. Next,
press the "Find" button for the second plane.
3.2.6
Vision Program Creation and Teaching
Create a program for "2D Single-view Vision Process".
If the measurement plane of a workpiece is apart from the XY plane of the application user frame set in
Subsection 3.2.4, "Application User Frame Setup", set the proper value in "App. Z Coordinate" of the
vision program.
(Figure: workpiece on the pallet; the height of the workpiece in the Z direction is the Z coordinate of the measurement plane viewed from the XY plane of the application user frame.)
A description of the GPM Locator tool is omitted here. For setting of the GPM Locator tool, refer to
Subsection 8.1, “GPM LOCATOR TOOL” in the iRVision Operator’s Manual.
• Select the camera calibration data.
• Select a shutter speed.
• Select "Fixed Frame Offset" in Offset Mode.
• Set the distance between the XY plane of the user frame and the measurement plane of the
workpiece if the planes are apart from each other.
• Press the "Snap and Find" button to find the workpiece. Next, press the "Set Ref. Pos." button to
set a reference position.
3.2.7
Robot Program Creation and Teaching
In the sample program below, a vision program named "A" is used. For fixed frame offset, add the
"VOFFSET" instruction as an operation statement. Moreover, be sure to specify the number of the
application user frame to be used for compensation.
  UFRAME_NUM=1                              Sets the same value as the number of the
  UTOOL_NUM=1                               application user frame.
  L P[1] 500mm/sec FINE
  VISION RUN_FIND 'A'                       Executes program A with the vision
                                            detection instruction.
  L P[2] 500mm/sec FINE
  VISION GET_OFFSET 'A' VR[1] JMP LBL[99]   Obtains the measurement result of vision
                                            program A.
  !Handling
  L P[3] 500mm/sec FINE VOFFSET,VR[1]       Adds the "VOFFSET" instruction as an
  L P[4] 100mm/sec FINE VOFFSET,VR[1]       operation statement, for position
  L P[5] 100mm/sec FINE VOFFSET,VR[1]       compensation.
  !Handling
  END
  LBL[99]
  UALM[1]
3.2.8
Robot Compensation Operation Check
Check that a workpiece placed on the pallet can be detected and handled precisely. At first, decrease the
override of the robot to check that the logic of the program is correct. Next, increase the override to
check that the robot can operate continuously.
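To build intuition for what the measured offset does to the taught positions, the 2D part of a fixed frame offset can be sketched as a rigid displacement in the XY plane of the user frame. This is an illustrative calculation with assumed values and a hypothetical function name, not the controller's internal implementation.

```python
import math

def apply_offset(point, offset):
    """Apply a vision offset (dx, dy, dr_degrees) to a taught XY point:
    rotate about the user-frame origin, then translate -- the same
    order in which a rigid 2D displacement of the workpiece from its
    reference position can be described."""
    x, y = point
    dx, dy, dr = offset
    r = math.radians(dr)
    return (math.cos(r) * x - math.sin(r) * y + dx,
            math.sin(r) * x + math.cos(r) * y + dy)

# Assumed: the workpiece is found 10 mm in X, 5 mm in Y, and rotated
# 90 degrees from the reference position; taught pick point at (100, 0).
shifted = apply_offset((100.0, 0.0), (10.0, 5.0, 90.0))
# -> approximately (10.0, 105.0)
```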
3.3
SETUP FOR "ROBOT MOUNTED CAMERA + FIXED FRAME OFFSET"
Use the following setup procedure for "Robot Mounted camera + fixed frame offset":
1. Camera setting data creation and teaching
2. Robot TCP setting (teach the TCP at the tip of the pointer used for frame teaching; the pointer TCP
   is used to teach the application user frame and the calibration grid frame)
3. Teaching of the calibration grid frame (teach the calibration grid user frame or TCP frame)
4. Teaching of the application user frame (teach the application user frame in parallel with the
   workpiece deviation plane)
5. Camera calibration data creation and teaching
6. Vision program creation and teaching
7. Robot program creation and teaching
8. Robot compensation operation check
Setup for "robot mounted camera + fixed frame offset" needs to set the "calibration grid information" in
a user specified user frame and to set the application user frame in parallel with the workpiece deviation
plane in a user specified user frame.
(Figure: the calibration grid user frame and the application user frame whose XY plane is parallel with the workpiece deviation plane.)
3.3.1
Camera Setting Data Creation and Teaching
With iRVision, items such as the camera type and camera installation method are set in the camera
setting data. Whether the camera is installed on the robot or on a fixed stand is set in the camera
setting data. When using a robot mounted camera, be sure to check "Robot-Mounted Camera".
3.3.2
Robot TCP Setup
A pointer tool with a taught TCP is required to teach the application user frame and the calibration user
frame. In general, set the TCP accurately on the pointer installed on the robot gripper. If the accuracy
of this TCP setting is low, the precision in handling of a workpiece by the robot is also degraded,
especially when the workpiece is rotated. Set a robot TCP in an arbitrary tool coordinate system.
To reuse the pointer TCP for recalibration, the reproducibility of pointer installation is required. If the
reproducibility of pointer installation is not assured, a TCP needs to be set each time a pointer is installed.
When a robot mounted camera or calibration grid plate is used, the calibration grid user frame can be set
with the grid frame setting function. In this case, the calibration grid user frame can be set without
setting a TCP for a robot. For details, see Subsection 11.2, ”GRID FRAME SETTING” in the iRVision
operator's manual.
3.3.3
Calibration Grid Setup
Teach the calibration grid location in a user frame.
To perform calibration by setting the calibration grid in a fixed location on a table, set it up in a user
frame.
When the calibration grid is fixed, secure the calibration grid then touch up the grid with the pointer
precisely to set the user frame. The touch-up precision here affects compensation precision. So,
perform this touch-up processing precisely. Note that the user frame used for the calibration grid
differs from the application user frame described in Subsection 3.3.4.
When the calibration grid is fixed, teach a user specified user frame to it.
Teach the grid precisely with the pointer attached to the robot gripper. Teach the user frame as shown
below.
(Figure: user frame (X, Y, Z) taught on the fixed calibration grid.)
For instructions on teaching a user frame, see Appendix A, "CALIBRATION GRID SETUP". The user
frame 4-point setting method is used.
3.3.4
Teaching the Application User Frame
Set a user frame to be used by the robot for compensation operation.
If 0 is selected as the coordinate system number, the world coordinate system of the robot is used.
Set a user frame for compensation so that the XY plane of the user frame is parallel with the pallet plane
on which a workpiece moves.
(Figure: robot mounted camera above the pallet; the XY plane of the application user frame is parallel with the pallet plane.)
3.3.5
Camera Calibration Data Creation and Setup
For camera calibration, the use of a calibration grid is recommended.
On the camera calibration tool setup screen, correctly select the frame number created in Subsection
3.3.3, "Calibration Grid Setup", and the user frame number created in Subsection 3.3.4, "Teaching the
Application User Frame".
When a robot mounted camera is used, perform two-plane calibration by moving the robot up and down
as shown in the figure below. The up/down distance for two-plane calibration should be 100 to 150 mm:
after the first detection, move the camera about 100 to 150 mm in the Z direction and find the calibration
grid again, so the grid is detected at two different heights. In this operation, ensure that the posture of
the robot for detecting the calibration grid is the same as the posture for detecting a workpiece.
(Figure: robot mounted camera over the fixed calibration grid plate; the grid is detected at the 1st position and again 100 to 150 mm away at the 2nd position.)
The grid pattern calibration setup screen is shown below. A calibration grid image is displayed.
• In "Application Frame", select the user frame number set in Subsection 3.3.4.
• Enter the interval between grid dots.
• Set the number of calibration planes to 2.
• According to the setting of Subsection 3.3.3, select the frame number in which the calibration grid
position data is set. Next, press the "Set Frame" button.
• Move the camera a proper distance from the calibration grid, then press the snap button. Next,
press the "Find" button for the first plane.
• Change the distance between the camera and the calibration grid plate, then press the snap button.
Next, press the "Find" button for the second plane.
3.3.6
Vision Program Creation and Teaching
Create a program for "2D Single-view Vision Process".
If the measurement plane of a workpiece is above or below the XY plane of the application user frame set
in Subsection 3.3.4, " Application User frame Setup", set the proper value in "App. Z Coordinate" of the
vision program.
(Figure: workpiece on the pallet; the height of the workpiece in the Z direction is the Z coordinate of the measurement plane viewed from the XY plane of the application user frame.)
A description of the GPM Locator tool is omitted here. For setting of the GPM Locator tool, refer to
Subsection 8.1, “GPM LOCATOR TOOL” in the iRVision Operator’s Manual.
• Select the camera calibration data.
• Select a shutter speed.
• Select "Fixed Frame Offset" in Offset Mode.
• Set the distance between the XY plane of the user frame and the measurement plane of the
workpiece if the planes are apart from each other.
• Press the "Snap and Find" button to detect a workpiece. Next, press the "Set Ref. Pos." button to
set a reference position.
3.3.7
Robot Program Creation and Teaching
In the sample program below, a vision program named "A" is used. For fixed frame offset, add the
"VOFFSET" instruction as an operation statement. Moreover, be sure to specify the number of a user
frame to be used for compensation.
  UFRAME_NUM=1                              Sets the same value as the number of the
  UTOOL_NUM=1                               user frame for compensation.
  L P[1] 500mm/sec FINE                     Moves the robot to the workpiece image
                                            snap position.
  VISION RUN_FIND 'A'                       Executes program A with the vision
                                            detection instruction.
  L P[2] 500mm/sec FINE
  VISION GET_OFFSET 'A' VR[1] JMP LBL[99]   Obtains the measurement result of vision
                                            program A.
  !Handling
  L P[3] 500mm/sec FINE VOFFSET,VR[1]       Adds the "VOFFSET" instruction as an
  L P[4] 100mm/sec FINE VOFFSET,VR[1]       operation statement, for fixed frame
  L P[5] 100mm/sec FINE VOFFSET,VR[1]       offset.
  !Handling
  END
  LBL[99]
  UALM[1]
3.3.8
Robot Compensation Operation Check
Check that a workpiece placed on the pallet can be detected and handled precisely. At first, decrease the
override of the robot to check that the logic of the program is correct. Next, increase the override to
check that the robot can operate continuously.
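Conceptually, the offset that "VISION GET_OFFSET" stores in VR[1] can be pictured as the planar rigid transform carrying the reference position to the found position, expressed in the application user frame; "VOFFSET" then applies that same transform to every taught point. A minimal Python sketch of this picture follows; the poses are hypothetical, only 2D is shown, and this is not FANUC code.

```python
import math

def offset_from_ref(ref, found):
    """ref, found: workpiece poses (x, y, r_deg) in the application user
    frame. Returns the offset (x, y, r_deg) that, applied to the
    reference pose, reproduces the found pose."""
    dr = found[2] - ref[2]
    c, s = math.cos(math.radians(dr)), math.sin(math.radians(dr))
    # translation = found position minus the rotated reference position
    dx = found[0] - (c * ref[0] - s * ref[1])
    dy = found[1] - (s * ref[0] + c * ref[1])
    return (dx, dy, dr)

def apply_offset(off, p):
    """Apply the offset to any taught position p = (x, y, r_deg)."""
    c, s = math.cos(math.radians(off[2])), math.sin(math.radians(off[2]))
    return (off[0] + c * p[0] - s * p[1],
            off[1] + s * p[0] + c * p[1],
            p[2] + off[2])

ref, found = (100.0, 50.0, 0.0), (110.0, 45.0, 10.0)
off = offset_from_ref(ref, found)
# Applying the offset to the reference pose returns the found pose;
# the same transform corrects every taught point of the motion.
print(apply_offset(off, ref))
```

The same offset is applied to P[3], P[4], and P[5] alike, which is why the whole approach and grip motion shifts together with the workpiece.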
3.4  SETUP FOR "FIXED CAMERA + TOOL OFFSET"
Tool offset measures, with a camera, how much a workpiece gripped by the robot deviates from the
correct grip position. The feature then compensates so that the robot places the gripped workpiece
at the predetermined position correctly.
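One way to picture tool offset is as pose algebra: if the camera finds that the part sits in the gripper at a different pose than at reference teaching, the place position of the tool must be corrected by that grip error so the part itself still lands where it was taught. The 2D Python sketch below illustrates this idea with hypothetical numbers; it is not FANUC code.

```python
import math

def compose(a, b):
    """Compose planar poses a∘b, each given as (x, y, r_deg)."""
    c, s = math.cos(math.radians(a[2])), math.sin(math.radians(a[2]))
    return (a[0] + c * b[0] - s * b[1],
            a[1] + s * b[0] + c * b[1],
            a[2] + b[2])

def invert(a):
    """Inverse of a planar pose."""
    c, s = math.cos(math.radians(a[2])), math.sin(math.radians(a[2]))
    return (-(c * a[0] + s * a[1]), s * a[0] - c * a[1], -a[2])

# Part pose in the gripper at reference teaching vs. at run time:
grip_ref   = (0.0, 0.0, 0.0)
grip_found = (2.0, -1.0, 5.0)    # part shifted 2 mm, -1 mm, rotated 5 deg

place_tool_taught = (300.0, 200.0, 90.0)  # taught tool pose at the place point
# Where the part was intended to land:
part_target = compose(place_tool_taught, grip_ref)
# Corrected tool pose that puts the mis-gripped part on the same target:
place_tool_corr = compose(part_target, invert(grip_found))
# The part lands on the original target despite the grip error:
print(compose(place_tool_corr, grip_found))
```

The correction is expressed relative to the tool, which is why this mode is called tool offset and why the user tool frame number matters in the later steps.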
Use the following setup procedure for "fixed camera + tool offset ":
1. Camera setting data creation and teaching
2. Teaching of the calibration grid user tool frame (teach the calibration grid as a user tool frame)
3. Teaching of the application user frame (set it parallel to the workpiece deviation plane)
4. Camera calibration data creation and teaching
5. Vision program creation and teaching
6. Robot program creation and teaching
7. Robot compensation operation check

Setup for "fixed camera + tool offset" requires the calibration grid to be set up in a user-defined tool
frame, and the application user frame to be set, as a user-defined frame, parallel to the workpiece
deviation plane.
[Figure: the calibration grid user tool on the robot, and the application user frame whose XY plane must be parallel to the workpiece deviation plane.]
An example layout for "fixed camera + tool offset " is given below.
The workpiece gripped by the robot is viewed by a camera to measure the grip error.
[Figure: example layout for "fixed camera + tool offset". A fixed camera views the workpiece gripped by the robot; the workpiece deviation direction lies in the XY plane of the application user frame. If the measurement plane matches the XY plane of the user coordinate system, the workpiece height in the Z direction is 0.]
It is recommended to set the calibration grid on a dummy workpiece for calibration. The figure below
shows an example of setting the calibration grid at a workpiece measurement position. Prepare a
dummy workpiece that resembles an actual workpiece and can be gripped. Setup work can be simplified
by setting a calibration grid on a dummy workpiece.
[Figure: a calibration grid plate mounted on a dummy workpiece.]
By setting a calibration grid on a dummy workpiece for calibration, the application user frame can be
trained more easily. For details, see Subsection 3.4.3.
3.4.1  Camera Setting Data Creation and Teaching
With iRVision, items such as a camera type and camera installation method are set in camera setting data.
Whether to install a camera on the robot gripper or on a fixed stand is set in the camera setting data.
When using a fixed camera, do not check "Robot-Mounted Camera".
3.4.2  Calibration Grid Setup
Teach the calibration grid location as a user tool frame.
To perform calibration with the calibration grid mounted on the robot, teach the grid as a tool frame.
Teach the calibration grid user tool frame carefully: its precision directly affects compensation
precision, so perform the touch-up precisely. Note that the frame used for this setup differs from the
application user frame described in Subsection 3.4.3.
When the calibration grid is installed on the robot, teach the calibration grid information in a
user-specified user tool frame. Touch up the calibration grid precisely against a pointer installed on a
fixed stand, then set the tool frame as shown below.
[Figure: the tool coordinate system (X, Y, Z) taught on the calibration grid.]
For instructions on setting a user tool frame, see Appendix A, "CALIBRATION GRID SETUP". The
tool coordinate system 6-point setting method is used.
It is a common mistake to teach the X, Y, or Z directions incorrectly. It is recommended that you verify
the user tool frame by jogging the robot in tool coordinates and confirming that the +X, +Y, and +Z
directions match the grid.
3.4.3  Application User Frame Setup
Teach the application user frame.
If 0 is selected as the user frame number, the world coordinate system of the robot is used.
In tool offset, set the application user frame so that the XY plane of the user frame is parallel with the
plane on which a workpiece moves, as with fixed frame offset.
This subsection describes how to set the application user frame by using the tool coordinate system
already set in Subsection 3.4.2.
1. With the robot gripper, pick up a dummy workpiece, then set a calibration grid on the dummy
   workpiece.
2. Prepare a teach pendant program for calibration. In it, set 0 (the world coordinate system) as the
   user frame number, and set the tool coordinate system number already set in Subsection 3.4.2.
3. Move the robot to the measurement position, then teach the position as a point in the teach pendant
   program.
4. Run the teach pendant program to move the robot to the taught point, record the current position
   after the movement, then set the position in an arbitrary user frame. For this purpose, include the
   following instructions in the teach pendant program:
       PR[1]=LPOS
       UFRAME[1]=PR[1]
By this operation, the application user frame becomes identical to the calibration grid user tool frame
set in Subsection 3.4.2. This frame is used because it is parallel with the workpiece grip error plane.
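What steps 2 to 4 accomplish can be sketched as pose composition: with user frame 0 selected, LPOS returns the TCP pose, i.e. the flange pose chained with the calibration grid tool frame, and UFRAME[1]=PR[1] turns that combined pose into the application user frame. A 2D Python sketch with hypothetical values (not FANUC code):

```python
import math

def compose(a, b):
    """Compose planar poses a∘b, each given as (x, y, r_deg)."""
    c, s = math.cos(math.radians(a[2])), math.sin(math.radians(a[2]))
    return (a[0] + c * b[0] - s * b[1],
            a[1] + s * b[0] + c * b[1],
            a[2] + b[2])

flange = (350.0, 120.0, 30.0)   # current flange pose in world coordinates
grid_tool = (0.0, 80.0, 0.0)    # calibration grid tool frame (UTOOL of 3.4.2)

# LPOS with user frame 0 returns flange ∘ tool: the grid pose in world.
tcp_in_world = compose(flange, grid_tool)
print(tcp_in_world)  # this pose is what PR[1] records and UFRAME[1] receives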
[Figure: the application user frame coincides with the tool coordinate system at the recorded current position.]

3.4.4  Camera Calibration Data Creation and Teaching
For camera calibration, the use of grid pattern calibration is recommended.
On the calibration data setting screen, correctly select the coordinate system number set in Subsection
3.4.2, "Calibration Grid Setup", and the coordinate system number set in Subsection 3.4.3,
"Application User Frame Setup".
Perform two-plane calibration by moving the calibration grid installed on the robot hand up and down as
shown in the figure below. The up/down distance for two-plane calibration should be about 100 to 150
mm in the Z direction. Find the grid pattern at the two different heights. In this operation, ensure
that the posture of the robot for detecting the calibration grid is not changed.
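Why two planes recover the focal length can be sketched with a pinhole model: moving the grid a known distance changes its apparent size in a way a single view cannot explain, and the two views together pin down the focal length. The Python sketch below uses hypothetical numbers and a simplified model, not the actual iRVision calculation.

```python
# Pinhole model: apparent grid spacing on the sensor is s = f * G / z,
# where G is the true grid spacing and z the distance from the lens.
def focal_from_two_planes(grid_mm, s1_mm, s2_mm, dz_mm):
    """s1_mm, s2_mm: measured spacing on the sensor at the near and far
    plane; dz_mm: known distance moved between the two detections."""
    # z1 = f*G/s1, z2 = f*G/s2, z2 - z1 = dz  =>  f = dz / (G*(1/s2 - 1/s1))
    return dz_mm / (grid_mm * (1.0 / s2_mm - 1.0 / s1_mm))

# Simulate a 12 mm lens, 15 mm grid spacing, planes at 400 mm and 520 mm:
f, G, z1, z2 = 12.0, 15.0, 400.0, 520.0
s1, s2 = f * G / z1, f * G / z2
print(focal_from_two_planes(G, s1, s2, z2 - z1))  # recovers the 12 mm lens
```

A larger up/down distance makes the size change easier to measure, which is why roughly 100 to 150 mm is recommended rather than a few millimeters.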
[Figure: the camera views the calibration grid plate at the 1st detection height and, 100 to 150 mm away, at the 2nd detection height.]
The calibration grid setup page is shown below. A calibration grid image is displayed.
• In "Application Frame", select the user coordinate system number set in Subsection 3.4.3.
• Enter the interval between grid points.
• For installation of a calibration grid plate on the gripper, set the number of calibration planes to 2.
• According to the setting of Subsection 3.4.2, select the number of the tool coordinate system where the calibration grid position data is set. Next, press the "Set Frame" button.
• Move the calibration grid plate a proper distance from the camera, then press the snap button. Next, press the "Find" button.
• Change the distance between the calibration grid plate and the camera, then press the snap button. Next, press the "Find" button.

3.4.5  Vision Program Creation and Teaching
Create a program for "2D Single-view Vision Process".
For tool offset only, a user tool frame number needs to be set.
The user tool frame in the vision program differs from the user tool frame used for calibration grid
setup. Select a proper user tool frame, typically the user tool frame of the robot gripper. User tool
frame number 0 can be used if no gripper user tool is defined. Note, however, that the user tool frame
number set here must also be specified in the teach pendant program.
If the measurement plane of a workpiece is apart from the XY plane of the user frame set in Subsection
3.4.3, "Application User Frame Setup", set the proper value in "App. Z Coordinate" of the vision
program. Even for tool offset, this setting is required.
[Figure: a workpiece on a pallet; the height of the workpiece in the Z direction is the Z coordinate of the measurement plane viewed from the XY plane of the application user frame.]
A description of the GPM Locator tool is omitted here. For setting of the GPM Locator tool, refer to
Subsection 8.1, “GPM LOCATOR TOOL” in the iRVision Operator’s Manual.
• Select camera calibration data.
• Select a shutter speed.
• Select "Tool Offset" in Offset Mode.
• A user tool frame different from the pointer tool used for teaching the calibration grid can be specified here. An arbitrary user tool frame, such as the user tool frame of the robot gripper or user tool frame number 0, can be used.
• Set the distance between the XY plane of the user coordinate system and the measurement plane of a workpiece if the planes are apart from each other.
• Press the "Snap and Find" button to detect a workpiece. Next, press the "Set Ref. Pos." button to set a reference position.

3.4.6  Robot Program Creation and Teaching
In the sample program below, a vision program named "A" is used. For tool offset, add the "Tool_Offset"
instruction as an operation statement. Moreover, be sure to specify the number of the application user
frame and the number of the user tool frame set in the vision program.
  UFRAME_NUM=1                              (Set the application user frame number.)
  UTOOL_NUM=1                               (Set the user tool frame number set in the vision program.)
  L P[1] 500mm/sec FINE                     (Move the robot to the measurement point. Upon completion of the movement, the workpiece position is measured with the vision detection instruction.)
  VISION RUN_FIND 'A'
  L P[2] 500mm/sec FINE
  VISION GET_OFFSET 'A' VR[1] JMP LBL[99]
  !Handling
  L P[3] 500mm/sec FINE Tool_Offset,VR[1]   (Use the "Tool_Offset" instruction instead of "VOFFSET" as the operation statement for tool offset.)
  L P[4] 100mm/sec FINE Tool_Offset,VR[1]
  L P[5] 100mm/sec FINE Tool_Offset,VR[1]
  !Handling
  END
  LBL[99]
  UALM[1]

3.4.7  Robot Compensation Operation Check
Check that a workpiece gripped by the robot can be detected and positioned precisely at a desired location.
At first, decrease the override of the robot to check that the logic of the program is correct. Next,
increase the override to check that the robot can operate continuously.
4  2D MULTI VIEW VISION PROCESS
This chapter describes the setup procedure for 2D Multi-view Vision Process by using the following three
application examples:
<1> Fixed camera + fixed frame offset
<2> Robot-mounted camera + fixed frame offset
<3> Fixed camera + tool offset
This chapter assumes that a personal computer for teaching is already set up. No description of personal
computer setup is provided.
Fixed camera + fixed frame offset
An example of layout for "fixed camera + fixed frame offset" is given below.
[Figure: a fixed camera above a workpiece on a pallet; the application user frame lies on the pallet.]
Robot-mounted camera + fixed frame offset
An example of layout for "robot-mounted camera + fixed frame offset" is given below.
[Figure: a robot-mounted camera above a workpiece on a pallet; the application user frame lies on the pallet.]
Fixed camera + tool offset
An example of layout for "fixed camera + tool offset" is given below.
[Figure: a fixed camera views the workpiece gripped by the robot; the application user frame is defined at the measurement position.]
4.1  PREPARATION BEFORE SETUP
Section 4.2 describes the setup procedure for "fixed camera + fixed frame offset", and Section 4.3
describes the setup procedure for "robot-mounted camera + fixed frame offset". This section describes a
calibration grid and memory card that are needed for each of these setup procedures.
Calibration grid preparation
With the 2D Multi-View Vision Process, it is recommended to use a calibration grid; be sure to prepare
one beforehand. A standard calibration grid is available from FANUC in several sizes. It is strongly
recommended that you order a calibration grid together with the camera and lens.
Memory card preparation
iRVision can save undetected images to a memory card inserted into the main board of the robot
controller. It is recommended that at the time of system start-up and adjustment, a memory card be
inserted to save undetected images to the memory card. By doing so, a GPM Locator tool parameter
adjustment can be made using undetected images. Moreover, when the system is reinstalled after being
moved, for example, camera images before reinstallation, if saved, can be checked against camera images
after reinstallation to see if there is any major difference.
Note that even if "Log Failed Images" is set in the vision program, no undetected images can be saved if a
memory card is not inserted.
A memory card, when inserted, can be used to back up all data in the robot controller. If all data in the
robot controller is backed up, the vision data can be backed up at the same time. Be sure to back up all
data in the robot controller upon completion of start-up or integration.
4.2  SETUP FOR "FIXED CAMERA + FIXED FRAME OFFSET"
Setup for "fixed camera + fixed frame offset" requires the calibration grid to be set up in a user frame
or tool frame with an arbitrary number, and the application user frame to be set, parallel to the
workpiece deviation plane, in a user frame with an arbitrary number.
[Figure: the calibration grid user frame (or TCP) and the application user frame, whose XY plane lies in the workpiece deviation plane.]
Use the following setup procedure for "fixed camera + fixed frame offset":
1. Camera setting data creation and teaching
2. Robot TCP setting (teach the TCP at the tip of the pointer used for frame teaching)
3. Teaching of the calibration grid frame (teach the calibration grid user frame or TCP frame)
4. Teaching of the application user frame (set it parallel to the workpiece deviation plane)
5. Camera calibration data creation and teaching (create and teach as many data items as there are cameras)
6. Vision program creation and teaching
7. Robot program creation and teaching
8. Robot compensation operation check
4.2.1  Camera Setting Data Creation and Teaching
With iRVision, items such as the camera type and camera installation method are set in the camera
setting data. Whether the camera is mounted on the robot or on a fixed stand is set there. When
using a fixed camera, do not check "Robot-Mounted Camera".
4.2.2  Robot TCP Setup
A pointer tool with a taught TCP is required to teach the application user frame and the calibration user
frame. In general, set the TCP accurately on the pointer installed on the robot gripper. If the accuracy of
this TCP setting is low, the precision in handling of a workpiece by the robot is also degraded, especially
when the workpiece is rotated. Set a robot TCP in an arbitrary tool coordinate system.
To reuse the pointer TCP for recalibration, the reproducibility of pointer installation is required. If the
reproducibility of pointer installation is not assured, a TCP needs to be set each time a pointer is installed.
When a robot mounted camera or calibration grid plate is used, the calibration grid user frame can be set
with the grid frame setting function. In this case, the calibration grid user frame can be set without
setting a TCP for a robot. For details, see Subsection 11.2, ”GRID FRAME SETTING” in the iRVision
operator's manual.
4.2.3  Calibration Grid Frame Setup
Teach the calibration grid location in a user frame or user tool.
To perform calibration by setting the calibration grid that is fixed, teach a user frame to the grid. To
perform calibration by setting a calibration grid on the robot gripper, teach a user tool to the grid.
Properly secure the calibration grid, then touch it up precisely with the pointer. The teaching
precision here affects compensation precision, so perform this teaching process precisely.
Note that the frame used for the calibration grid setup differs from the application user frame used
for compensation described in Subsection 4.2.4.
When the grid is fixed
When the calibration grid is fixed, teach it in a user specified user frame.
Teach the calibration grid precisely with the pointer tool attached to the robot and the proper TCP, then
set the user frame as shown below.
[Figure: the user coordinate system (X, Y, Z) taught on the fixed calibration grid.]
For the method of setting a user frame, see Appendix A, "CALIBRATION GRID SETUP". The user
frame 4-point setting method is used.
When the calibration grid is mounted on the robot
When a calibration grid is mounted on the robot, teach the calibration grid in a user-specified tool
frame. Touch up the calibration grid precisely against a pointer installed on a fixed stand, then set
the tool coordinate system as shown below.
[Figure: the tool coordinate system (X, Y, Z) taught on the robot-mounted calibration grid.]
To set a tool coordinate system, see Appendix A, "CALIBRATION GRID SETUP". The tool
coordinate system 6-point setting method is used.

4.2.4  Application User Frame Setup
Set the user frame to be used by the robot for compensation operation.
If 0 is selected as the coordinate system number, the world coordinate system of the robot is used.
Set a user frame for compensation so that the XY plane of the user frame is parallel with the pallet plane
on which a workpiece moves.
[Figure: cameras at measurement points #1 and #2 above a large workpiece on a pallet; the height of the workpiece in the Z direction is the Z coordinate of the measurement plane viewed from the XY plane of the application user frame.]

4.2.5  Camera Calibration Data Creation and Teaching
For camera calibration, the grid pattern calibration tool is recommended. In an application that
measures multiple points on a large workpiece, the heights of the measurement points often vary from
each other, and grid pattern calibration is useful when the compensation calculation must take the
different workpiece heights into account.
On the calibration data setting screen, correctly select the frame number set in Subsection 4.2.3,
"Calibration Grid Frame Setup", and the user frame number set in Subsection 4.2.4, "Application User
Frame Setup".
When the calibration grid is fixed
When the calibration grid is fixed, perform one-plane calibration. When one-plane calibration is
performed, the focal length of the lens cannot be calculated precisely. So, set the focal length of the lens
manually after calibration grid detection. If the focal length of the lens used is 12 mm, enter 12.0.
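The reason the focal length must be entered manually after one-plane calibration is a scale ambiguity: scaling the focal length and the stand-off distance by the same factor produces an identical image, so a single view cannot separate the two. This can be sketched with the same pinhole relation as before (hypothetical numbers, simplified model):

```python
def apparent_spacing(f_mm, grid_mm, z_mm):
    # Apparent grid spacing on the sensor under a pinhole model.
    return f_mm * grid_mm / z_mm

# A 12 mm lens at 400 mm and a 24 mm lens at 800 mm give the same image,
# so one detection cannot tell them apart; the focal length must be known.
print(apparent_spacing(12.0, 15.0, 400.0))
print(apparent_spacing(24.0, 15.0, 800.0))
```

Supplying the known lens focal length removes this ambiguity and lets the single detected plane fix the remaining parameters.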
When the calibration grid is robot-mounted
When the calibration grid is robot mounted, perform two-plane calibration by moving the robot up and
down as shown in the figure below. The up/down distance for two-plane calibration should be 100 to
150 mm.
Find the grid pattern at two different heights. When moving the calibration grid up and down, jog the
robot without changing the robot posture.
[Figure: a fixed camera views the robot-mounted calibration grid at the 1st detection height and, 100 to 150 mm away, at the 2nd detection height.]
The calibration grid setup page is shown below. The calibration grid image is displayed.
• In "Application Frame", select the user coordinate system number set in Subsection 4.2.4.
• Enter the interval between grid points.
• For a robot-mounted calibration grid, set the number of calibration planes to 2. For a fixed calibration grid plate, set the number to 1.
• According to the setting of Subsection 4.2.3, select the user frame number that was taught to the calibration grid. Next, press the "Set Frame" button.
• For single-plane calibration, enter the focal length manually.
• Move the calibration grid a proper distance from the camera, then press the snap button. Next, press the "Find" button for the first plane.
• Change the distance between the calibration grid and the camera, then press the snap button. Next, press the "Find" button for the second plane.
4.2.6  Vision Program Creation and Teaching
Create a program for "2D Multi-view Vision Process". The basic program setting method is the same as
for two-dimensional compensation based on a single camera. Unlike 2D Single-view Vision Process,
2D Multi-view Vision Process adds "camera views". There can be as many camera views as the number
of measurement points.
For each measurement point, a camera view is named camera view 1 or camera view 2. "GPM Locator
Tool" comes under each camera view.
[Figure: cameras at the 1st and 2nd view positions above a large workpiece; workpiece heights Z1 and Z2 relative to the application user frame.]
Use the following procedure to teach a 2D Multi-view Vision Process:
1. Teach pattern match for camera view 1.
2. Select camera view 1, then press the "Snap and Find" button to detect the taught model.
3. Teach pattern match for camera view 2.
4. Select camera view 2, then press the "Snap and Find" button to detect the taught model.
5. Select "2D Multi-View Vision Process", then set a reference position.
Set a reference position immediately after detecting a model with two camera views. If the teach screen
is closed, the procedure needs to be performed all over again starting with model detection using each of
the two camera views.
Camera view teaching
With each camera view, teach GPM Locator tool. A detailed description of the GPM Locator tool is
omitted here. For setting of the GPM Locator tool, refer to Subsection 8.1, “GPM LOCATOR TOOL”
in the iRVision Operator’s Manual.
• Select camera calibration data.
• Select a shutter speed.
• Teach pattern match.
• Enter the height of the workpiece in the Z direction at the 1st measurement point.
Perform the same processing for camera view 2. In "App. Z Coordinate", set a proper value for each camera view.
Vision process teaching
Upon normal completion of detection with the two camera views, set a reference position.
• Select "Fixed Frame Offset" in Offset Mode.
• In "Combine Error Limit", set a proper value as needed.
• Immediately after detecting a workpiece by pressing the "Snap and Find" button with each camera view, press the "Set Ref. Pos." button. This sets a reference position for all camera views. Only when a fixed camera is used, a reference position can be set by pressing the "Snap and Find" button once on the vision process screen.
4.2.7  Robot Program Creation and Teaching
In the sample program below, a vision program named "A" is used. Two characteristic positions of a
large workpiece are detected. If multiple fixed cameras are used, no camera view number needs to be
specified in the "VISION RUN_FIND" instruction.
  UFRAME_NUM=1                              (Set the same number as the application user frame.)
  UTOOL_NUM=1
  L P[1] 500mm/sec FINE
  VISION RUN_FIND 'A'                       (This instruction completes when the images of camera view 1 and camera view 2 have been snapped.)
  VISION GET_OFFSET 'A' VR[1] JMP LBL[99]   (Obtain the offset result of vision program A.)
  !Handling
  L P[3] 500mm/sec FINE VOFFSET,VR[1]       (Approach the workpiece.)
  L P[4] 100mm/sec FINE VOFFSET,VR[1]
  L P[5] 100mm/sec FINE VOFFSET,VR[1]
  !Handling
  END
  LBL[99]
  UALM[1]
When a fixed camera is used, one "VISION RUN_FIND" instruction measures all camera views prepared
beforehand. When the images of all camera views have been snapped, the line after the "VISION
RUN_FIND” instruction is executed.
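How the multi-view process turns two point measurements into one offset can be pictured as fitting a planar rigid transform: the rotation comes from the change in direction of the line joining the two measured points, and the translation from the displacement of their midpoint. The Python sketch below illustrates this picture with hypothetical numbers; it is not the actual iRVision combine algorithm.

```python
import math

def rigid_from_two_points(ref1, ref2, fnd1, fnd2):
    """Fit the offset (x, y, r_deg) mapping the two reference points onto
    the two found points, in the application user frame."""
    a_ref = math.atan2(ref2[1] - ref1[1], ref2[0] - ref1[0])
    a_fnd = math.atan2(fnd2[1] - fnd1[1], fnd2[0] - fnd1[0])
    r = a_fnd - a_ref
    c, s = math.cos(r), math.sin(r)
    mref = ((ref1[0] + ref2[0]) / 2, (ref1[1] + ref2[1]) / 2)
    mfnd = ((fnd1[0] + fnd2[0]) / 2, (fnd1[1] + fnd2[1]) / 2)
    tx = mfnd[0] - (c * mref[0] - s * mref[1])
    ty = mfnd[1] - (s * mref[0] + c * mref[1])
    return (tx, ty, math.degrees(r))

# A large workpiece shifted by (5, -3) mm without rotation:
print(rigid_from_two_points((0, 0), (800, 0), (5, -3), (805, -3)))
```

Because the two points are far apart on a large workpiece, even a small rotation produces a clearly measurable difference between the views, which is what makes the multi-view result more accurate than a single view.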
4.2.8  Robot Compensation Operation Check
Check that multiple points of a workpiece can be detected and that the workpiece can be handled
correctly. At first, decrease the override of the robot to check that the logic of the program is correct.
Next, increase the override to check that the robot can operate continuously.
4.3  SETUP FOR "ROBOT-MOUNTED CAMERA + FIXED FRAME OFFSET"
Setup for "robot-mounted camera + fixed frame offset" requires the calibration grid to be set up in a
user frame with an arbitrary number, and the application user frame to be set, parallel to the workpiece
deviation plane, in a user frame with an arbitrary number.
[Figure: the calibration grid user frame and the application user frame, whose XY plane lies in the workpiece deviation plane.]
Use the following setup procedure for "robot-mounted camera + fixed frame offset":
1. Camera setting data creation and teaching
2. Robot TCP setting (teach the TCP at the tip of the pointer used for frame teaching; the pointer TCP is used to teach the application user frame and the calibration user frame)
3. Teaching of the calibration grid frame (teach the calibration grid user frame or TCP frame)
4. Teaching of the application user frame (teach it parallel to the workpiece deviation plane)
5. Camera calibration data creation and teaching (with a robot-mounted camera, this step is completed by creating and teaching one calibration setup file)
6. Vision program creation and teaching
7. Robot program creation and teaching
8. Robot compensation operation check
4.3.1  Camera Setting Data Creation and Teaching
With iRVision, items such as the camera type and camera installation method are set in the camera
setting data. Whether the camera is mounted on the robot or on a fixed stand is set there. When
using a robot-mounted camera, be sure to check "Robot-Mounted Camera".
4.3.2  Robot TCP Setup
A pointer tool with a taught TCP is required to teach the application user frame and the calibration user
frame. In general, set the TCP accurately on the pointer installed on the robot gripper. If the accuracy
of this TCP setting is low, the precision in handling of a workpiece by the robot is also degraded,
especially when the workpiece is rotated. Set a robot TCP in an arbitrary tool coordinate system.
To reuse the pointer TCP for recalibration, the reproducibility of pointer installation is required. If the
reproducibility of pointer installation is not assured, a TCP needs to be set each time a pointer is installed.
When a robot mounted camera or calibration grid plate is used, the calibration grid user frame can be set
with the grid frame setting function. In this case, the calibration grid user frame can be set without
setting a TCP for a robot. For details, see Subsection 11.2, ”GRID FRAME SETTING” in the iRVision
operator's manual.
4.3.3  Calibration Grid Setup
Teach the calibration grid location in a user frame.
To perform calibration with the calibration grid set in a fixed location, for example on a table, set it
up in a user frame. Secure the calibration grid, then touch up the grid precisely with the pointer to
set the user frame. The touch-up precision here affects compensation precision, so perform this
touch-up processing precisely. Note that the user frame used for the calibration grid differs from the
application user frame described in Subsection 4.3.4.
When the calibration grid is fixed, teach it in a user-specified user frame. Teach the grid precisely
with the pointer attached to the robot gripper, and set the user frame as shown below.
[Figure: the user coordinate system (X, Y, Z) taught on the fixed calibration grid.]
For instructions on teaching a user frame, see Appendix A, "CALIBRATION GRID SETUP". The user
frame 4-point setting method is used.
4.3.4  Application User Frame Setup
Set a user frame to be used by the robot for compensation operation.
If 0 is selected as the coordinate system number, the world coordinate system of the robot is used.
Set a user frame for compensation so that the XY plane of the user frame is parallel with the pallet plane
on which a workpiece moves.
[Figure: a robot-mounted camera above a pallet; the application user frame lies on the pallet.]
4.3.5  Camera Calibration Data Creation and Teaching
For camera calibration, the use of grid pattern calibration is recommended.
On the camera calibration setup screen, correctly select the frame number created in Subsection 4.3.3,
"Calibration Grid Setup", and the frame number created in Subsection 4.3.4, "Application User Frame
Setup".
When a robot-mounted camera is used, perform two-plane calibration by moving the robot up and down as
shown in the figure below. The up/down distance for two-plane calibration should be 100 to 150 mm:
move the camera about 100 to 150 mm in the Z direction and find the calibration grid at the two
different heights. In this operation, ensure that the posture of the robot for detecting the
calibration grid is the same as the posture for detecting a workpiece.
[Figure: the robot-mounted camera detects the fixed calibration grid at the 1st detection height and, 100 to 150 mm away, at the 2nd detection height.]
When grid pattern calibration is performed with a robot-mounted camera, creating one grid pattern
calibration data item is sufficient; no further calibration data needs to be created even if the camera
measurement position is changed. This is because the current position of the robot is taken into
account in the iRVision calculation when the workpiece position is computed. At any
measurement point, however, ensure that the direction of the camera’s field of view is normal to the XY
plane of a user coordinate system used for the compensation operation.
The grid pattern calibration setup screen is shown below. A calibration grid image is displayed.
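The reason one calibration remains valid when the robot moves the camera can be pictured as pose composition: the measurement in camera coordinates is chained with the camera pose derived from the robot's current position, so the result always lands in the application user frame. A 2D Python sketch of this idea follows; all values are hypothetical and this is not the actual iRVision computation.

```python
import math

def compose(a, b):
    """Compose planar poses a∘b, each given as (x, y, r_deg)."""
    c, s = math.cos(math.radians(a[2])), math.sin(math.radians(a[2]))
    return (a[0] + c * b[0] - s * b[1],
            a[1] + s * b[0] + c * b[1],
            a[2] + b[2])

def invert(a):
    """Inverse of a planar pose."""
    c, s = math.cos(math.radians(a[2])), math.sin(math.radians(a[2]))
    return (-(c * a[0] + s * a[1]), s * a[0] - c * a[1], -a[2])

cam_on_flange = (0.0, 100.0, 0.0)   # camera mount pose, from calibration
part_true = (510.0, 105.0, 0.0)     # fixed part pose in the user frame

results = []
for flange in [(500.0, 0.0, 0.0), (400.0, -80.0, 10.0)]:
    cam_pose = compose(flange, cam_on_flange)    # camera pose right now
    seen = compose(invert(cam_pose), part_true)  # what the camera measures
    results.append(compose(cam_pose, seen))      # robot pose folded back in

print(results)  # the same part pose recovered from both robot positions
```

Because the current flange pose is folded back in, the camera can be carried to any measurement point without redoing the calibration, subject to the viewing-direction condition stated above.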
• In "Application Frame", select the user frame number set in Subsection 4.3.4.
• Enter the interval between grid points.
• Set the number of calibration planes to 2.
• According to the setting of Subsection 4.3.3, select the frame number in which the calibration grid position data is set. Next, press the "Set Frame" button.
• Move the camera a proper distance from the calibration grid, then press the snap button. Next, press the "Find" button for the first plane.
• Change the distance between the camera and the calibration grid plate, then press the snap button. Next, press the "Find" button for the second plane.

4.3.6  Vision Program Creation and Teaching
Create a program for "2D Multi-view Vision Process". The basic program setting method is the same as
for 2-D Single-View Vision Process. Unlike 2D Single-view Vision Process, 2D Multi-view Vision
Process adds "camera views". There can be as many camera views as the number of measurement points.
For each measurement point, a camera view is named camera view 1 or camera view 2. "GPM Locator
Tool" comes under each camera view.
[Figure: the robot-mounted camera at the 1st view position (camera view 1) and the 2nd view position (camera view 2) above a large workpiece; workpiece heights Z1 and Z2 relative to the application user frame.]
Use the following procedure to teach a 2D Multi-view Vision Process:
1. Move the robot to the first measurement position, then teach GPM Locator tool for camera view 1.
2. Select camera view 1, then press the "Snap and Find" button to detect the taught model.
3. Move the robot to the second measurement position, then teach GPM Locator tool for camera view 2.
4. Select camera view 2, then press the "Snap and Find" button to detect the taught model.
5. Select "2D Multi-view Vision Process", then set a reference position.
Set a reference position immediately after detecting the model with the two camera views. If the teach
screen is closed, the procedure needs to be performed all over again, starting with model detection
using each of the two camera views.
Camera view teaching
For each camera view, teach the GPM Locator tool. A detailed description of the GPM Locator tool is omitted here. For setting of the GPM Locator tool, refer to Subsection 8.1, “GPM LOCATOR TOOL”, in the iRVision Operator’s Manual.
• Select the camera calibration data.
• Set the proper shutter speed.
• Train the GPM Locator tool.
• Enter the height of the workpiece relative to the application user frame in the Z direction at the 1st camera view point.
Perform the same processing for camera view 2.
In "App. Z Coordinate", set a proper value for each camera view.
Vision process teaching
After finding the workpiece with the two camera views, set the reference position.
• Select "Fixed Frame" in Offset Mode.
• In “Combine Error Limit”, set a proper value as needed.
• Immediately after finding the workpiece by pressing the “Snap and Find” button on each camera view individually, press the “Set Ref. Pos.” button. This sets a reference position for all the camera views.
4.3.7 Robot Program Creation and Teaching
In the sample program below, a vision program named "A" is used. Two characteristic positions of a large workpiece are found by moving the robot-mounted camera. Program A has two camera views, so the camera view number is specified in each "VISION RUN_FIND" instruction.
 1:  UFRAME_NUM=1                              Sets the application user frame here.
 2:  UTOOL_NUM=1
 3:  L P[1] 500mm/sec FINE                     Moves to the first view position and runs
 4:  VISION RUN_FIND 'A' CAMERA_VIEW[1]        find for camera view 1.
 5:  L P[2] 500mm/sec FINE                     Moves to the second view position and runs
 6:  VISION RUN_FIND 'A' CAMERA_VIEW[2]        find for camera view 2.
 7:  VISION GET_OFFSET 'A' VR[1] JMP LBL[99]
 8:  !Handling
 9:  L P[3] 500mm/sec FINE VOFFSET,VR[1]       Approaches the workpiece.
10:  L P[4] 100mm/sec FINE VOFFSET,VR[1]
11:  L P[5] 100mm/sec FINE VOFFSET,VR[1]
12:  !Handling
13:  END
14:  LBL[99]
15:  UALM[1]
As soon as a camera image has been snapped, the line after the "VISION RUN_FIND" instruction is executed.
4.3.8 Robot Compensation Operation Check
Check that multiple points of a workpiece can be detected and that the workpiece can be handled
correctly. At first, decrease the override of the robot to check that the logic of the program is correct.
Next, increase the override to check that the robot can operate continuously.
4.4 SETUP FOR "FIXED CAMERA + TOOL OFFSET"
Tool offset measures, with a camera, how far a workpiece gripped by the robot deviates from the correct grip position. This feature performs compensation so that the robot places the gripped workpiece correctly at the predetermined position.
Setup for "fixed camera + tool offset" requires setting the calibration grid information in a user-defined tool frame and setting the application user frame, a user-defined user frame, parallel with the workpiece deviation plane.
(Figure: the application user frame with its XY plane parallel to the workpiece deviation plane, and the calibration grid user tool frame)
Use the following setup procedure for "fixed camera + tool offset":
1. Camera setting data creation and teaching
2. Teaching of the calibration grid user frame (teach the calibration grid user frame or TCP frame)
3. Teaching of the application user frame (set the application user frame in parallel with the workpiece deviation plane)
4. Camera calibration data creation and teaching (create calibration data for each camera)
5. Vision program creation and teaching
6. Robot program creation and teaching
7. Robot compensation operation check
An example of layout for "fixed camera + tool offset" is given below.
A workpiece gripped by the robot is viewed by two to four cameras to measure the grip error.
(Figure: two cameras viewing the workpiece gripped by the robot; the XY plane of the application user frame is parallel to the workpiece deviation direction)
* If the measurement plane matches the XY plane of the user coordinate system, the workpiece height in the Z direction is 0.
* If the height of the workpiece varies from one camera view to another, set a proper workpiece height in the Z direction for each camera view.
It is recommended to set the calibration grid on a dummy workpiece for calibration. The figure below
shows an example of setting the calibration grid at a workpiece measurement position. Prepare a
dummy workpiece that resembles an actual workpiece and can be gripped. Setup work can be simplified
by setting a calibration grid on a dummy workpiece.
(Figure: a calibration grid mounted on a dummy workpiece)
By setting a calibration grid on a dummy workpiece for calibration, the application user frame can be trained more easily. For details, see Subsection 4.4.3.
4.4.1 Camera Setting Data Creation and Teaching
With iRVision, items such as a camera type and camera installation method are set in camera setting data.
Whether to install a camera on the robot gripper or on a fixed stand is set in the camera setting data.
When using a fixed camera, do not check "Robot-Mounted Camera".
4.4.2 Calibration Grid Plate Setup
Teach the calibration grid location as a user tool frame.
Perform calibration by mounting the calibration grid on the robot and teaching the grid as a tool frame.
Train the calibration grid user tool frame carefully, because the touch-up precision here affects compensation precision; perform this touch-up processing precisely. Note that the coordinate system used for this setup differs from the application user frame described in Subsection 4.4.3.
When the calibration grid is installed on the robot, teach the calibration grid information in a user-specified user tool frame.
Touch up the calibration grid precisely with a pointer installed on a fixed stand. Train the tool frame as shown below.
(Figure: the tool coordinate system taught on the calibration grid, with X, Y, and Z axes)
For instructions on setting a user tool frame, see Appendix A, "CALIBRATION GRID SETUP". The
tool coordinate system 6-point setting method is used.
4.4.3 User Frame for Compensation Setup
Teach the application user frame.
If 0 is selected as the user frame number, the world coordinate system of the robot is used.
In tool offset, set the application user frame so that the XY plane of the user frame is parallel with the
plane on which a workpiece moves, as with fixed frame offset.
This subsection describes how to set the application user frame by using the tool coordinate system
already set in Subsection 4.4.2.
1. With the robot gripper, pick up a dummy workpiece, then set a calibration grid on the dummy workpiece.
2. Prepare a teach pendant program for calibration; in it, set 0 (the world coordinate system) as the user frame number and set the tool coordinate system number already set in Subsection 4.4.2.
3. Move the robot to the measurement position, then teach the position as a point in the teach pendant program.
4. Run the teach pendant program to move the robot to the taught point and record the current position after the movement, then set that position in an arbitrary user frame. For this purpose, include the following instructions in the teach pendant program:
   PR[1]=LPOS
   UFRAME[1]=PR[1]
By this operation, the application user frame becomes the same as the calibration user tool frame set in Subsection 4.4.2. This user frame is used because it is parallel with the workpiece grip error plane.
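The steps above can be consolidated into one short teach pendant program. The following is a sketch only; the tool frame number (2), position number, and speed are illustrative values, not requirements stated in this manual.

```
1: UFRAME_NUM=0           (world coordinate system)
2: UTOOL_NUM=2            (tool frame set in Subsection 4.4.2)
3: L P[1] 250mm/sec FINE  (taught measurement position)
4: PR[1]=LPOS
5: UFRAME[1]=PR[1]
```

After this program runs, user frame 1 coincides with the calibration grid user tool frame at the measurement position, so it can be used as the application user frame.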
(Figure: at the recorded current position, the application user frame coincides with the calibration grid tool coordinate system)
If the workpiece measurement plane is apart from the XY plane of the user coordinate system for
compensation set here, the Z coordinate of the measurement plane viewed from the XY plane of the user
coordinate system for compensation needs to be set in the vision program.
4.4.4 Camera Calibration Data Creation and Teaching
Use grid pattern calibration for camera calibration.
On the calibration data setting screen, correctly select the tool frame number set in Subsection 4.4.2, "Calibration Grid Plate Setup", and the user frame number set in Subsection 4.4.3, "User Frame for Compensation Setup".
Perform two-plane calibration by moving a calibration grid installed on the robot hand up and down as
shown in the figure below. The up/down distance for two-plane calibration should be about 100 to 150
mm.
Move the calibration grid about 100 to 150 mm in the Z direction and find the grid pattern at the two different heights. In this operation, ensure that the posture of the robot for detecting the calibration grid does not change.
(Figure: the robot moves the calibration grid toward and away from the fixed camera by 100 to 150 mm between the 1st and 2nd detections)
The calibration grid setup page is shown below. A calibration grid image is displayed.
• In "Application Frame", select the user coordinate system number set in Subsection 4.4.3.
• Enter the interval between grid points.
• For installation of a calibration grid plate on the gripper, set the number of calibration planes to 2.
• According to the setting of Subsection 4.4.2, select the number of the tool coordinate system in which the calibration grid position data is set. Next, press the “Set Frame” button.
• Move the calibration grid plate to a proper distance from the camera, then press the snap button. Next, press the “Find” button.
• Change the distance between the calibration grid plate and the camera, then press the snap button. Next, press the “Find” button.
4.4.5 Vision Program Creation and Teaching
Create a program for "2D Multi-view Vision Process". The basic program setting method is the same as for 2D Single-view Vision Process. Unlike 2D Single-view Vision Process, 2D Multi-view Vision Process adds as many "camera views" as the number of measurement points.
For each view, a camera view is named camera view 1 or camera view 2. The "GPM Locator Tool" comes under each camera view.
(Figure: camera views 1 and 2 at the 1st and 2nd measurement positions; workpiece heights Z1 and Z2 are measured relative to the application user frame)
Use the following procedure to teach a 2D Multi-view Vision Process:
1. Train the pattern match for camera view 1.
2. Select camera view 1, then press the “Snap and Find” button to find the taught model.
3. Train the pattern match for camera view 2.
4. Select camera view 2, then press the “Snap and Find” button to find the taught model.
5. Select "2D Multi-view Vision Process", then set a reference position.
Set a reference position immediately after finding the model with two camera views. If the teach screen is closed, the procedure needs to be performed all over again, starting with model detection using each of the two camera views.
Camera view teaching
For each camera view, teach the GPM Locator tool. A detailed description of the GPM Locator tool is omitted here. For setting of the GPM Locator tool, refer to Subsection 8.1, “GPM LOCATOR TOOL”, in the iRVision Operator’s Manual.
• Select the camera calibration data.
• Set the proper shutter speed.
• Train the GPM Locator tool.
• Enter the height of the workpiece relative to the application user frame in the Z direction at the 1st camera view point.
Perform the same processing for camera view 2.
In "App. Z Coordinate", set a proper value for each camera view.
Vision process teaching
After finding the workpiece with the two camera views, set the reference position.
• Select "Tool Offset" in Offset Mode.
• A user tool frame different from the pointer tool used for teaching the calibration grid can be specified here. An arbitrary user tool frame, such as a user tool frame for the robot gripper or user tool frame number 0, can be used.
• In “Combine Error Limit”, set a proper value as needed.
• Immediately after finding the workpiece by pressing the “Snap and Find” button on each camera view individually, press the “Set Ref. Pos.” button. This sets a reference position for all the camera views.
4.4.6 Robot Program Creation and Teaching
In the sample program below, a vision program named "A" is used. For tool offset, add the "VOFFSET"
instruction as an operation statement. Moreover, be sure to specify the number of the application user
frame and the number of the user tool frame set in the vision program.
 1:  UFRAME_NUM=1                              Sets the application user frame here.
 2:  UTOOL_NUM=1                               Sets the number of the user tool.
 3:  L P[1] 500mm/sec FINE                     Moves the robot to the measurement point.
 4:  VISION RUN_FIND 'A'                       Upon completion of the movement, the position of the
                                               workpiece is measured using the vision detection instruction.
 5:  L P[2] 500mm/sec FINE
 6:  VISION GET_OFFSET 'A' VR[1] JMP LBL[99]
 7:  !Handling
 8:  L P[3] 500mm/sec FINE VOFFSET,VR[1]       Approaches the workpiece.
 9:  L P[4] 100mm/sec FINE VOFFSET,VR[1]
10:  L P[5] 100mm/sec FINE VOFFSET,VR[1]
11:  !Handling
12:  END
13:  LBL[99]
14:  UALM[1]
4.5 Robot Compensation Operation Check
Check that a workpiece gripped by the robot can be detected and positioned precisely at a desired location.
At first, decrease the override of the robot to check that the logic of the program is correct. Next,
increase the override to check that the robot can operate continuously.
5 SETUP OF FLOATING FRAME VISION PROCESS
This chapter describes the setup procedure for the Floating Frame Vision Process.
The Floating Frame Vision Process only applies to robot mounted camera applications.
The basic setup procedure for the floating frame vision process is the same as that for 2D Single-view
Vision Process.
This chapter assumes that a PC for teaching is already set up. No description of PC setup is provided.
Robot-Mounted Camera + fixed frame offset
An example of layout for the Floating Frame Vision Process is shown below.
(Figure: a robot-mounted camera looking along its optical axis at a workpiece on a pallet from a constant standoff distance D; the XY plane of the application user frame is on the top of the workpiece, so the part Z height is set to 0.)
When you want to change the standoff distance from the initial standoff D to D' as in the following figure, you need to change the part Z height.
If the workpiece is below the application user frame, the part Z height is set to a negative number; if it is above the application user frame, the part Z height is a positive number. In the example below, the part Z height would be (-L).
(Figure: the robot-mounted camera at standoff D' above the workpiece; the workpiece top is a distance L below the application user frame, so the part Z height should be changed to (-L).)
Note that the initial standoff distance D used at reference position setting must be maintained during actual measurement. Also, if you change the part Z height, you must set the reference position again.
In the Floating Frame Vision Process, the relative position from the camera to the workpiece at reference position setting must be maintained during actual detection, even if the location and posture of the robot change.
If the location and posture at actual detection differ from those at reference position setting, the virtual application user frame follows the camera. So, the expected compensation can be provided even if the location and posture of the robot vary at run time.
(Figure: the location and posture at actual detection differ from those at reference position setting, while the standoff distance D from the camera to the workpiece is maintained)
Compensation in a virtual application user frame that follows the camera can be realized. An operator does not need to train the virtual application user frame.
5.1 PREPARATION BEFORE SETUP
This section describes a calibration grid and memory card that need to be prepared before setup.
Calibration Grid Preparation
Grid calibration is recommended for the Floating Frame Vision Process. Be sure to prepare a calibration grid beforehand. A standard calibration grid is available from FANUC in several sizes. It is strongly recommended to order the calibration grid together with the camera and lens.
Memory card Preparation
iRVision can save failed images to a memory card inserted into the main board of the robot controller. It
is recommended that at the time of system start-up and adjustment, a memory card be inserted to save
undetected images to the memory card. By doing so, a GPM Locator tool parameter adjustment can be
made using undetected images. Moreover, when the system is reinstalled after being moved for example,
camera images before reinstallation, if saved, can be checked against camera images after reinstallation to
see if there is any major difference.
Note that even if "Log Failed Images" is set in the vision program, no undetected images can be saved if a
memory card is not inserted.
A memory card, when inserted, can be used to back up all data in the robot controller. If all data in the
robot controller is backed up, the vision data can be backed up at the same time. Be sure to back up all
data in the robot controller upon completion of start-up or adjustment.
5.2 SETUP FOR "ROBOT-MOUNTED CAMERA + FIXED FRAME OFFSET"
Use the following setup procedure:
1. Camera setting data creation and teaching
2. Robot TCP setting (teach the TCP at the tip of the pointer used for user frame teaching; the pointer TCP is used to teach the application user frame and the calibration user frame)
3. Teaching of the calibration grid frame (teach the calibration grid user frame or TCP frame number)
4. Teaching of the application user frame (set the user coordinate system in parallel with the workpiece deviation plane)
5. Camera calibration data creation and teaching
6. Vision program creation and teaching
7. Robot program creation and teaching
8. Robot compensation operation check
The Floating Frame Vision Process requires setting the calibration grid information in a user-specified user frame. The calibration grid can be placed anywhere in any orientation within the robot work envelope, as long as a user frame can be taught to it and the camera can view it.
When setting the reference position, the workpiece must be parallel to the application user frame.
(Figure: the application user frame with its XY plane parallel to the workpiece deviation plane, and the user coordinate system holding the calibration grid setting information)
5.2.1 Camera Setting Data Creation and Teaching
With iRVision, items such as a camera type and camera installation method are set in camera setting data.
The Floating Frame Vision Process must have a robot mounted camera. Make sure to check the robot
mounted camera box in the Camera Setup screen.
When a robot-mounted camera is used, iRVision automatically reads the current position of the robot to
make a compensation data calculation.
For a teach pendant program example, see 5.2.7 Robot Program Creation and Teaching.
5.2.2 Robot TCP Setting
A pointer tool with a taught TCP is required to teach the application user frame and the calibration user
frame. In general, set the TCP accurately on the pointer installed on the robot gripper. If the accuracy
of this TCP setting is low, the precision in handling of a workpiece by the robot is also degraded,
especially when the workpiece is rotated. Set a robot TCP in an arbitrary tool coordinate system.
To reuse the pointer TCP for recalibration, the reproducibility of pointer installation is required. If the
reproducibility of pointer installation is not assured, a TCP needs to be set each time a pointer is installed.
When a robot mounted camera or calibration grid plate is used, the calibration grid user frame can be set
with the grid frame setting function. In this case, the calibration grid user frame can be set without
setting a TCP for a robot. For details, see Subsection 11.2, ”GRID FRAME SETTING” in the iRVision
operator's manual.
5.2.3 Calibration Grid Setup
Teach the calibration grid location in a user frame.
To perform calibration by setting the calibration grid in a fixed location on a table, set it up in a user
frame.
When the calibration grid is fixed, secure the calibration grid then touch up the grid with the pointer
precisely to set the user frame. The touch-up precision here affects compensation precision. Therefore,
perform this touch-up processing precisely. Note that the user frame used for the calibration grid differs
from the application user frame described in 5.2.4 Teaching Application User Frame.
When the calibration grid is fixed, teach a user specified user frame to it.
Teach the grid precisely with the pointer attached to the robot gripper. Teach the user frame as shown
below.
(Figure: the user coordinate system taught on the calibration grid, with X, Y, and Z axes)
For the method of setting a user coordinate system, see Appendix A. The user coordinate system 4-point
setting method is recommended.
5.2.4 Teaching Application User Frame
Set the user frame to be used by the robot for compensation operation.
If 0 is selected as the coordinate system number, the world coordinate system of the robot is used.
When setting the reference position, the workpiece must be in the same plane as the XY plane of the application user frame. The "Part Z Height", that is, the height of the part above or below the application frame, must also be known when setting the reference position.
(Figure: the robot-mounted camera at standoff distance D above the workpiece on the pallet; the XY plane of the application user frame is on the top of the workpiece)
When you set the reference position as in the above situation, you can run the vision process even if the camera pose differs from that at the reference position setting, as in the situation below. In this case, you do not need to train the application user frame again.
(Figure: the camera pose changes while the standoff distance D to the workpiece is maintained, so the vision process runs without re-teaching the application user frame)
5.2.5 Camera Calibration Data Creation and Teaching
For camera calibration, the use of a calibration grid is recommended.
On the Camera Calibration Tools Setup screen, correctly select the frame number created in Subsection 5.2.3, "Calibration Grid Setup", and the frame number created in Subsection 5.2.4, "Teaching Application User Frame".
Perform two-plane calibration by moving the robot up and down as shown in the figure below. The
up/down distance for two-plane calibration should be 100 to 150 mm.
Next, move the camera about 100 to 150 mm in the Z direction. Find the calibration grid at two
different heights.
(Figure: the robot-mounted camera performs the 1st and 2nd detections of the calibration grid at heights 100 to 150 mm apart)
The Grid Pattern Calibration Setup screen is shown below. A calibration grid image is displayed.
• In "Application Frame", select the user frame number set in Subsection 5.2.4.
• Enter the interval between grid dots.
• Set the number of calibration planes to 2.
• According to the setting of Subsection 5.2.3, select the number of the frame in which the calibration grid position data is set. Next, press the “Set Frame” button.
• Move the camera to a proper distance from the calibration grid, then press the snap button. Next, press the “Find” button for the first plane.
• Change the distance between the camera and the calibration grid plate, then press the snap button. Next, press the “Find” button for the second plane.
5.2.6 Vision Program Creation and Teaching
Create a Floating Frame Vision Process. The basic program setting method is the same as for 2D Single-view Vision Process. However, unlike 2D Single-view Vision Process, the distance between the camera and the workpiece must remain constant.
When setting the reference position, the part must be in the same plane as the XY plane of the application frame, and the Z axis of the application frame must be parallel to the optical axis of the camera.
The standoff distance from the camera to the top of the workpiece must remain constant. The standoff distance used when setting the reference position must be used for all view positions during runtime.
Measure the standoff distance from the part to the camera when setting the reference position. During runtime, make sure that the robot positions the camera at the same standoff distance as was measured during the reference position setting.
A detailed description of the GPM Locator tool is omitted here. For setting of the GPM Locator tool,
refer to Subsection 8.1, “GPM LOCATOR TOOL” in the iRVision Operator’s Manual.
• Select the camera calibration data.
• Select a shutter speed.
• Set the distance between the XY plane of the user coordinate system and the measurement plane of the workpiece if the planes are apart from each other. If the XY plane of the application user frame is on the top of the workpiece, set 0 in this item.
• Press the “Snap and Find” button to detect a workpiece. Next, press the “Set Ref. Pos.” button to set a reference position.
5.2.7 Robot Program Creation and Teaching
In the sample program below, a vision program named "A" is used. An actual application program would most likely have multiple view positions, not only the single view position P[1].
 1:  UFRAME_NUM=1                              Sets the same value as the number of the user
                                               coordinate system for compensation.
 2:  UTOOL_NUM=1
 3:  L P[1] 500mm/sec FINE                     Moves the robot to the workpiece image snap
 4:  VISION RUN_FIND 'A'                       position. This will change during runtime.
 5:  L P[2] 500mm/sec FINE
 6:  VISION GET_OFFSET 'A' VR[1] JMP LBL[99]   Obtains the measurement result of vision program A.
 7:  !Handling
 8:  L P[3] 500mm/sec FINE VOFFSET,VR[1]       Adds the “VOFFSET” instruction as an operation
 9:  L P[4] 100mm/sec FINE VOFFSET,VR[1]       statement, for position compensation.
10:  L P[5] 100mm/sec FINE VOFFSET,VR[1]
11:  !Handling
12:  END
13:  LBL[99]
14:  UALM[1]
5.2.8 Robot Compensation Operation Check
Check that the robot picks up a workpiece from the pallet correctly. At first, decrease the override of the robot motion to check that the logic of the program is correct. Next, increase the override to check that the robot can operate continuously.
6 SETUP OF DEPALLETIZING VISION PROCESS
This chapter describes the setup procedure for Depalletizing Vision Process in the following
configuration:
− "Robot-mounted camera + fixed frame offset"
This chapter assumes that a PC for teaching is already set up. No description of PC setup is provided.
Robot-mounted camera + fixed frame offset
An example of layout for "robot-mounted camera + fixed frame offset" is given below.
(Figure: a robot-mounted camera looking down its optical axis at workpieces stacked on a pallet)
6.1 PREPARATION BEFORE SETUP
This section describes a calibration grid and memory card that need to be prepared before setup.
Calibration Grid preparation
Grid calibration is always used for the Depalletizing Vision Process. Be sure to prepare a calibration grid beforehand. A standard calibration grid is available from FANUC in several sizes. It is strongly recommended to order the calibration grid together with the camera and lens.
Memory card preparation
iRVision can save undetected images to a memory card inserted into the main board of the robot
controller. It is recommended that at the time of system start-up and adjustment, a memory card be
inserted to save undetected images to the memory card. By doing so, a GPM Locator tool parameter
adjustment can be made using undetected images. Moreover, when the system is reinstalled after being
moved, for example, camera images before reinstallation, if saved, can be checked against camera images
after reinstallation to see if there is any major difference.
Note that even if "Log Failed Images" is set in the vision program, no undetected images can be saved
when no memory card is inserted.
A memory card, when inserted, can be used to back up all data in the robot controller. If all data in the
robot controller is backed up, the vision data can be backed up at the same time. Be sure to back up all
data in the robot controller upon completion of start-up or adjustment.
6.2 SETUP FOR "ROBOT-MOUNTED CAMERA + FIXED FRAME OFFSET"
Use the following setup procedure:
1. Camera setting data creation and teaching
2. Robot TCP setting (teach the TCP at the tip of the pointer used for user frame teaching; the pointer TCP is used to teach the application user frame and the calibration user frame)
3. Teaching of the calibration grid frame (teach the calibration grid user frame or TCP frame number)
4. Teaching of the application user frame (set the user coordinate system in parallel with the workpiece deviation plane)
5. Camera calibration data creation and teaching
6. Vision program creation and teaching
7. Robot program creation and teaching
8. Robot compensation operation check
Setup for "robot-mounted camera + fixed frame offset" requires setting the calibration grid information in a user-specified user frame and setting the application frame, a user-specified user frame, parallel with the workpiece deviation plane.
(Figure: the application user frame with its XY plane parallel to the workpiece deviation plane, and the user coordinate system holding the grid setting information)
6.2.1 Camera Setting Data Creation and Teaching
With iRVision, items such as a camera type and camera installation method are set in camera setting data.
Whether to install a camera on the robot or on a fixed stand is set in the camera setting data. When
using a robot-mounted camera, be sure to check this item.
When a robot-mounted camera is used, iRVision automatically reads the current position of the robot to
make a compensation data calculation. So, if a workpiece is detected by moving the camera over the
pallet, the logic of the teach pendant program need not be modified at each camera position.
For a TP program, see Subsection 6.2.7, "Robot Program Creation and Teaching".
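Because the current robot position is read automatically, the same vision program can be run at several snap positions without changing the find logic. The following is a sketch under assumptions (the position numbers, vision register numbers, and the vision program name 'A' are illustrative only); see Subsection 6.2.7 for the actual sample program.

```
1: UFRAME_NUM=1
2: UTOOL_NUM=1
3: L P[1] 500mm/sec FINE    (1st snap position over the pallet)
4: VISION RUN_FIND 'A'
5: VISION GET_OFFSET 'A' VR[1] JMP LBL[99]
6: L P[2] 500mm/sec FINE    (2nd snap position: identical find logic)
7: VISION RUN_FIND 'A'
8: VISION GET_OFFSET 'A' VR[2] JMP LBL[99]
```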
6.2.2 Robot TCP Setting
A pointer tool with a taught TCP is required to teach the application user frame and the calibration user
frame. In general, set the TCP accurately on the pointer installed on the robot gripper. If the accuracy
of this TCP setting is low, the precision in handling of a workpiece by the robot is also degraded,
especially when the workpiece is rotated. Set a robot TCP in an arbitrary tool coordinate system.
To reuse the pointer TCP for recalibration, the reproducibility of pointer installation is required. If the
reproducibility of pointer installation is not assured, a TCP needs to be set each time a pointer is installed.
When a robot mounted camera or calibration grid plate is used, the calibration grid user frame can be set
with the grid frame setting function. In this case, the calibration grid user frame can be set without
setting a TCP for a robot. For details, see Subsection 11.2, ”GRID FRAME SETTING” in the iRVision
operator's manual.
6.2.3 Calibration Grid Setup
Teach the calibration grid location in a user frame.
To perform calibration by setting the calibration grid in a fixed location on a table, set it up in a user
frame.
When the calibration grid is fixed, secure the calibration grid then touch up the grid with the pointer
precisely to set the user frame. The touch-up precision here affects compensation precision. So,
perform this touch-up processing precisely. Note that the user frame used for the calibration grid differs
from the application user frame described in Subsection 6.2.4.
When the calibration grid is fixed, teach a user specified user frame to it.
Teach the grid precisely with the pointer attached to the robot gripper. Teach the user frame as shown
below.
(Figure: the user coordinate system taught on the calibration grid, with X, Y, and Z axes)
For the method of setting a user coordinate system, see Appendix A, "CALIBRATION GRID SETUP".
The user coordinate system 4-point setting method is used.
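As an illustration of the geometry behind the 4-point method, the sketch below builds a frame from three touch-up points: the origin, a point on the +X axis, and a point in the XY plane on the +Y side. This is a hypothetical numeric sketch, not the controller's actual procedure; the function name and sample points are assumptions.

```python
def frame_from_touch_points(origin, x_point, y_point):
    """Build a user frame from three touch-up points: the frame origin,
    a point on the +X axis, and a point in the XY plane on the +Y side
    (the information the 4-point method collects).  Returns the origin
    and the three unit axis vectors."""
    def sub(a, b):
        return [a[i] - b[i] for i in range(3)]

    def cross(a, b):
        return [a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0]]

    def unit(v):
        n = sum(c * c for c in v) ** 0.5
        return [c / n for c in v]

    x_axis = unit(sub(x_point, origin))
    z_axis = unit(cross(x_axis, sub(y_point, origin)))  # normal to the grid plane
    y_axis = cross(z_axis, x_axis)                      # completes a right-handed frame
    return origin, x_axis, y_axis, z_axis

# Touch-up points of a grid lying flat in the world XY plane:
org, x, y, z = frame_from_touch_points([100, 50, 0], [200, 50, 0], [100, 150, 0])
print(x, y, z)  # axes align with the world axes
```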
6.2.4 Teaching Application User Frame
Set a user frame to be used by the robot for compensation operation.
If 0 is selected as the coordinate system number, the world coordinate system of the robot is used.
Set a user frame for compensation so that the XY plane of the user frame is parallel with the pallet plane
on which a workpiece moves.
(Figure: camera above the pallet; the application user frame's XY plane lies on the pallet)

6.2.5 Camera Calibration Data Creation and Teaching
For camera calibration, the use of a calibration grid is recommended.
On the camera calibration tools setup screen, select the frame number created in Subsection 6.2.3, "Calibration Grid Setup", and the frame number created in Subsection 6.2.4, "Teaching Application User Frame", correctly.
When a robot mounted camera is used, perform two-plane calibration by moving the robot up and down as shown in the figure below; set the up/down (Z-direction) distance for two-plane calibration to 100 to 150 mm. Find the calibration grid at the two different heights. In this operation, ensure that the robot posture used to detect the calibration grid is the same as the posture used to detect a workpiece.
(Figure: 1st detection, then 2nd detection with the camera moved 100 to 150 mm, of the calibration grid)
The grid pattern calibration setup screen is shown below. A calibration grid image is displayed.
• In "Application Frame", select the user frame number set in Subsection 6.2.4.
• Enter the interval between grid dots.
• Set the number of calibration planes to 2.
• According to the setting of Subsection 6.2.3, select the frame number in which the calibration grid position data is set. Next, press the "Set Frame" button.
• Move the camera a proper distance from the calibration grid, then press the snap button. Next, press the "Find" button for the first plane.
• Change the distance between the camera and the calibration grid plate, then press the snap button. Next, press the "Find" button for the second plane.

6.2.6 Vision Program Creation and Teaching
Create a program for Depalletizing Vision Process. The basic program setting method is the same as for
2D Single-view Vision Process. Unlike 2D Single-view Vision Process, Depalletizing Vision Process
sets the relationship between the height of workpieces and the size of the workpiece viewed by the
camera.
Specifically, place one workpiece on the pallet then touch the top surface of the workpiece with the
pointer attached to the robot to measure the height of the workpiece in the application user frame. At the
same time, find the workpiece and set the size of the viewed workpiece image. Next, stack n workpieces
and touch the top surface of the top workpiece with the pointer attached to the robot to measure the height
of the stacked workpieces in the application user frame.
The 2.5-dimensional compensation program finds the height (Z value) of workpieces from the found workpiece size. The detected size of workpieces lying in the same plane should therefore be as constant as possible; stable detection may require lighting mounted on the robot.
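The scale-to-height relationship the process relies on can be illustrated with a simple pinhole model, in which apparent scale is inversely proportional to the camera-to-surface distance. This is a sketch under that assumed model, not iRVision's internal algorithm; the function and the numbers are hypothetical.

```python
def height_from_scale(scale, h1, s1, h2, s2):
    """Estimate a workpiece's top-surface height from its apparent size,
    assuming a pinhole model: scale s = k / (Zc - h), where Zc is the
    camera height above the application user frame.  (h1, s1) and
    (h2, s2) are the two taught reference heights and scales."""
    # Solve s1 * (Zc - h1) = s2 * (Zc - h2) for the camera height Zc.
    zc = (s1 * h1 - s2 * h2) / (s1 - s2)
    k = s1 * (zc - h1)          # model constant of the pinhole relation
    return zc - k / scale

# Hypothetical references: scale 1.0 at height 0 mm and scale 2.0 at
# 150 mm imply a camera 300 mm above the frame; a detected scale of
# 1.5 then corresponds to a stack 100 mm high.
print(height_from_scale(1.5, 0.0, 1.0, 150.0, 2.0))
```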
(Figure: apparent workpiece scale, 52% and 100%, at camera distances of 300 mm and 100 mm)
A description of the GPM Locator tool is omitted here. For setting of the GPM Locator tool, refer to
Subsection 8.1, “GPM LOCATOR TOOL” in the iRVision Operator’s Manual.
• Select camera calibration data.
• Select the shutter speed.
• In "Sort by", select a sorting method for the result of detection. The default is "Score". Select "Score" or "Z". This selection is available with 7.20P/21 or later.
• Place a workpiece, then touch the top surface of the workpiece with the pointer to determine the workpiece's height in the application user frame. Enter the height information in "Reference Height 1".
• Press the "Snap and Find" button to find the workpiece. Next, press the "Set Scale" button to set a reference size.
• Stack n workpieces, then touch the top surface of the top workpiece to determine its height in the application user frame. Enter the height information in "Reference Height 2".
• Press the "Snap and Find" button to find the workpiece. Next, press the "Set Scale" button to set a reference size.
• Press the "Snap and Find" button to find the workpiece with which the robot reference position is taught. Next, press the "Set Ref. Pos." button to set the reference position.
6.2.7 Robot Program Creation and Teaching
In the sample program below, a vision program named "A" is used. A robot-mounted camera is used,
and the "OFFSET" instruction is used on the fourth line to shift the camera image snap position on the
pallet. Shifting the image snap robot position by adding a constant value to the value of PR[1] simplifies
programming.
 1:  UFRAME_NUM=1
 2:  UTOOL_NUM=1
 3:
 4:  L P[1] 500mm/sec FINE Offset,PR[1]
 5:  VISION RUN_FIND 'A'
 6:  VISION GET_OFFSET 'A' VR[1] JMP LBL[99]
 7:
 8:  ! Handling
 9:  L P[2] 500mm/sec FINE VOFFSET,VR[1]
10:  L P[3] 100mm/sec FINE VOFFSET,VR[1]
11:  L P[4] 100mm/sec FINE VOFFSET,VR[1]
12:  ! Handling
13:
14:  END
15:
16:  LBL[99]
17:  UALM[1]

- Lines 1 and 2: set the same value as the number of the user coordinate system for compensation.
- Line 4: moves to the image snap position; the fixed frame offset instruction is used to make the shift.
- Lines 9 to 11: approach the workpiece; each motion is compensated with VOFFSET,VR[1].
When a fixed camera is used, the operation statement on the fourth line is unnecessary.
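As a hypothetical illustration of the PR[1] idea above, the sketch below computes the constant offsets that would be written into PR[1] for each stack position on the pallet; the pitch values and the function are assumptions, and the actual write into PR[1] would be done in the TP program or via register instructions.

```python
# Hypothetical pallet layout: stacks sit on a regular grid, so the snap
# position for stack (col, row) is the first snap position plus a
# constant offset.  Only PR[1] changes between cycles, not the TP logic.
STACK_PITCH_X = 400.0   # mm between stacks along X (assumed value)
STACK_PITCH_Y = 300.0   # mm between stacks along Y (assumed value)

def snap_offset(col, row):
    """Offset (x, y, z, w, p, r) to write into PR[1] for stack (col, row)."""
    return (col * STACK_PITCH_X, row * STACK_PITCH_Y, 0.0, 0.0, 0.0, 0.0)

print(snap_offset(2, 1))
```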
6.2.8 Robot Compensation Operation Check
Stack several workpieces, then move the robot to the top workpiece and check that the robot moves to it correctly. Remove that workpiece and repeat with the next one. At first, decrease the robot override to check that the program logic is correct. Next, increase the override to check that the robot can operate continuously.
7 SETUP OF BIN-PICK SEARCH VISION PROCESS
This chapter describes the setup procedure for the Bin-Pick Search Vision Process function in the
following configurations.
− Fixed camera + fixed frame offset
− Robot mounted camera + fixed frame offset
The basic setup procedure for the Bin-Pick Search Vision Process function is the same as that for
Depalletizing Vision Process.
This chapter assumes that a personal computer for teaching is already set up. Description of the personal computer setup is not provided here.
Fixed camera + fixed frame offset
An example of layout for "fixed camera + fixed frame offset" is given below.
(Figure: layout with a fixed camera, its optical axis, a magnetic gripper, the pallet, and a workpiece)
Robot mounted camera + fixed frame offset
An example of layout for "robot mounted camera + fixed frame offset" is given below.
(Figure: layout with a robot mounted camera, its optical axis, a magnetic gripper, the pallet, and a workpiece)
Note that the robot moves in the direction of the view line from the camera to workpiece as shown below.
(Figure: the view line from the fixed camera to the workpiece)
7.1 PREPARATION BEFORE SETUP
This section first describes the calibration grid and memory card that need to be prepared before setup.
Calibration Grid Preparation
With the Bin-Pick Search Vision Process function, the grid pattern calibration method must be used. Be sure to prepare the calibration grid beforehand. A standard calibration grid is available from FANUC in several sizes. It is strongly recommended that you order a calibration grid together with the camera and lens.
Memory Card Preparation
iRVision can save undetected images to a memory card inserted into the main board of the robot
controller. It is recommended that, at the time of system start-up and integration, a memory card be
inserted to save undetected images to the memory card. By doing so, a GPM Locator tool parameter
adjustment can be made using undetected images. Moreover, when, for example, the system is
reinstalled after being moved, camera images before reinstallation, if saved, can be checked against
camera images after reinstallation to see if there is any major difference.
Note that even if "Log Failed Images" is set in the vision process, no undetected images can be saved
when no memory card is inserted.
A memory card can be used to back up all data in the robot controller. If all data in the robot controller
is backed up, the vision data can be backed up at the same time. Be sure to back up all data in the robot
controller upon completion of startup or integration.
7.2 SETUP FOR "FIXED CAMERA + FIXED FRAME OFFSET"
Follow the setup procedure below for "fixed camera + fixed frame offset".
1. Camera setting data creation and teaching
2. Robot TCP setting (teach the TCP at the tip of the pointer used for user frame teaching; the pointer TCP is used to teach the application user frame and the calibration user frame)
3. Teaching of the calibration grid frame (teach the calibration grid user frame or TCP frame)
4. Teaching of an application user frame (set the application user frame parallel with the pallet plane)
5. Camera calibration data creation and teaching
6. Vision process creation and teaching
7. Robot program creation and teaching
8. Robot compensation operation check
Setup for "fixed camera + fixed frame offset" requires setting the calibration grid information in a user or tool coordinate system with an arbitrary number, and setting a user coordinate system for compensation, with its XY plane parallel with the pallet plane, as a user coordinate system with another arbitrary number.
(Figure: the user or tool coordinate system holding the calibration grid information, and the user coordinate system for compensation with its XY plane on the pallet)

7.2.1 Camera Setting Data Creation and Teaching
With iRVision, items such as a camera type and camera installation method are set in camera setting data.
Whether to install a camera on the robot hand or on a fixed stand is set in the camera setting data. When
using a fixed camera, do not check this item.
7.2.2 Robot TCP Setting
A pointer tool with a taught TCP is required to teach the application user frame and the calibration user
frame. In general, set the TCP accurately on the pointer installed on the robot gripper. If the accuracy of
this TCP setting is low, the precision in handling of a workpiece by the robot is also degraded, especially
when the workpiece is rotated. Set a robot TCP in an arbitrary tool coordinate system.
To reuse the pointer TCP for recalibration, the reproducibility of pointer installation is required. If the
reproducibility of pointer installation is not assured, a TCP needs to be set each time a pointer is installed.
When a robot mounted camera or calibration grid plate is used, the calibration grid user frame can be set
with the grid frame setting function. In this case, the calibration grid user frame can be set without
setting a TCP for a robot. For details, see Subsection 11.2, ”GRID FRAME SETTING” in the iRVision
operator's manual.
7.2.3 Calibration Grid Frame Setup
Teach the calibration grid location in a user frame or user tool. When the calibration grid is fixed, teach a user frame to the grid; when the calibration grid is mounted on the robot gripper, teach a user tool to the grid.
Properly secure the calibration grid, then teach the frame precisely with the pointer. The teaching precision here affects compensation precision, so perform this teaching process carefully. Note that the frame used for the calibration grid setup might differ from the application user frame used for compensation described in Subsection 7.2.4.
When the grid is fixed
When the calibration grid is fixed, teach it in a user-specified user frame. Teach the calibration grid precisely with the pointer tool attached to the robot and the proper TCP, then set the user frame as shown below.
(Figure: user coordinate system with X, Y, and Z axes taught on the fixed calibration grid)
For the method of setting a user coordinate system, see Appendix A, "CALIBRATION GRID SETUP". The user coordinate system 4-point setting method is used.
The number of a user coordinate system set here needs to be different from the number of a user
coordinate system set in Subsection 7.2.4, "Application User Frame Setup ".
When the calibration grid is mounted on the robot
When the calibration grid is mounted on the robot, teach it in a user-specified tool frame. Touch the pointer installed on a fixed stand with the calibration grid precisely, then set the tool coordinate system as shown below.
(Figure: tool coordinate system with X, Y, and Z axes taught on the robot mounted calibration grid)
For the method of setting a tool coordinate system, see Appendix A, "CALIBRATION GRID SETUP".
The tool coordinate system 6-point setting method is used.
7.2.4 Application User Frame Setup
Set the application user frame to be used by the robot for compensation operation.
If 0 is selected as the coordinate system number, the world coordinate system of the robot is used.
Set the application user frame so that the XY plane of the user coordinate system is parallel with the pallet
plane on which a workpiece moves.
(Figure: fixed camera above the pallet; the application user frame's XY plane is parallel with the pallet plane)
7.2.5 Camera Calibration Data Creation and Teaching
For camera calibration, be sure to use grid pattern calibration.
On the calibration data setting screen, select the coordinate system number set in Subsection 7.2.3, "Calibration Grid Frame Setup", and the coordinate system number set in Subsection 7.2.4, "Application User Frame Setup", correctly.
When the calibration grid is fixed
When the calibration grid is fixed, perform one-plane calibration. The calibration grid needs to be
placed in a slanting position between the upper surface and the lower surface of the pallet so that the
distance can be measured with grid points, as shown in the figure below. In one-plane calibration, the
focal length of the lens cannot be calculated correctly. So, set the focal length of the lens manually after
detection of the calibration grid. If the focal length of the lens used is 12 mm, enter 12.0.
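The manually entered focal length can be sanity-checked with the basic pinhole relation: image size equals focal length times object size divided by distance. The sketch below is illustrative only; the pixel size, grid spacing, and working distance are assumed values.

```python
def apparent_spacing_px(focal_mm, standoff_mm, grid_mm, pixel_mm):
    """Expected spacing of grid points in the image, in pixels, from the
    pinhole relation: image size = focal * object size / distance."""
    return focal_mm * grid_mm / standoff_mm / pixel_mm

# A 12 mm lens viewing a 15 mm grid pitch from 600 mm, with an assumed
# 0.005 mm pixel pitch, should show dots about 60 pixels apart.
print(apparent_spacing_px(12.0, 600.0, 15.0, 0.005))
```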
(Figure: fixed camera viewing the calibration grid placed at a slant between the pallet's upper and lower surfaces)
When the calibration grid is robot mounted
When the calibration grid is robot mounted, perform two-plane calibration by moving the robot up and
down as shown in the figure below. It is recommended that the up/down distance for two-plane
calibration be set to a value that covers the upper and lower ends of the workpiece in the pallet.
Detect the calibration grid at two different heights. When moving the calibration grid up and down, jog the robot without changing its posture.
(Figure: 1st and 2nd detections of the robot mounted calibration grid, separated by a distance that covers the upper and lower ends of the workpieces in the pallet)
The calibration grid setup page is shown below. The calibration grid image is displayed.
• In "Application Frame", select the user coordinate system number set in Subsection 7.2.4.
• Enter the interval between grid points.
• For a robot mounted calibration grid, set the number of calibration planes to 2. For a fixed calibration grid plate, set the number to 1.
• According to the setting of Subsection 7.2.3, select the user frame number that was taught to the calibration grid. Next, press the "Set Frame" button.
• For single-plane calibration, enter the focal length manually.
• Move the calibration grid the proper distance from the camera, then press the snap button. Next, press the "Find" button for the first plane.
• Change the distance between the calibration grid and camera, then press the snap button. Next, press the "Find" button for the second plane.
7.2.6 Vision Program Creation and Teaching
Create a new Bin-Pick Search Vision Process.
The basic method of creating a program is the same as that for 2.5-dimensional compensation based on a single camera. Specifically, place a workpiece near the lower surface of the pallet, touch up the workpiece with the touch-up pin attached to the robot hand, and measure the workpiece height in the application user frame (user coordinate system for compensation). Then, find the workpiece and set Reference Scale 1. Next, place the workpiece near the upper surface of the pallet, touch up the workpiece with the touch-up pin attached to the robot hand, and measure the workpiece height in the application user frame (user coordinate system for compensation). Then, find the workpiece and set Reference Scale 2.
The Search vision process calculates workpiece height Z based on the detected workpiece scale.
Therefore, it is required that the workpieces placed on the same surface have the same size.
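Because the robot moves along the view line, the XY position of the workpiece also shifts with its height. The sketch below illustrates this parallax effect under a simple pinhole assumption; the function and the numbers are hypothetical, not iRVision's internal computation.

```python
def position_at_height(camera, point_on_plane, h):
    """The robot approaches along the view line from the camera to the
    workpiece.  Given the camera position and the detected point
    projected onto the reference plane Z = 0 (application-frame
    coordinates), return the point on that view line at height h."""
    cx, cy, cz = camera
    px, py, _ = point_on_plane
    t = (cz - h) / cz              # fraction of the way toward the plane
    return (cx + (px - cx) * t, cy + (py - cy) * t, h)

# Camera 300 mm above the frame: a point seen at (100, 50) on the
# reference plane lies at (50, 25, 150) if the workpiece top is 150 mm up.
print(position_at_height((0.0, 0.0, 300.0), (100.0, 50.0, 0.0), 150.0))
```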
(Figure: apparent workpiece scale, 100% and 52%, at camera distances of 60 mm and 300 mm)
A description of the GPM Locator tool is omitted here. For setting of the GPM Locator tool, refer to
Subsection 8.1, “GPM LOCATOR TOOL” in the iRVision Operator’s Manual.
• Select camera calibration data.
• Select a shutter speed.
• Set the items to be prioritized when multiple workpieces are detected. Multiple items can be set.
Place the workpiece near the pallet lower surface and:
• Touch up the workpiece surface with the TCP, measure the workpiece height, and set the value to reference height 1.
• Detect the workpiece with the "Snap and Find" button and press the "Set Scale" button to set the reference size.
Place the workpiece near the pallet upper surface and:
• Touch up the workpiece surface with the TCP, measure the workpiece height, and set the value to reference height 2.
• Detect the workpiece with the "Snap and Find" button and press the "Set Scale" button to set the reference size.
For the workpiece for which the robot was taught:
• Press the "Snap and Find" button to detect the workpiece. Then, press the "Set Ref. Pos." button to set the reference position.
7.2.7 Robot Program Creation and Teaching
In the sample program below, a vision program named "A" is used. For fixed frame offset, add the "VOFFSET" instruction as a motion option. The sample program includes an operation for adjusting the amount of approach (the tenth and twelfth lines) before the workpiece grasp position (the eleventh line).
 1:  UFRAME_NUM=1
 2:  UTOOL_NUM=1
 3:
 4:  L P[1] 500mm/sec FINE
 5:  VISION RUN_FIND 'A'
 6:  L P[2] 500mm/sec FINE
 7:  VISION GET_OFFSET 'A' VR[1] JMP LBL[99]
 8:  ! Handling
 9:  SKIP CONDITION xxx=yyy
10:  L P[2] 500mm/sec FINE Tool_Offset,PR[2] VOFFSET,VR[1]
11:  L P[3] 100mm/sec FINE VOFFSET,VR[1] Skip,LBL[1] PR[3]=LPOS
12:  L P[4] 100mm/sec FINE Tool_Offset,PR[2] VOFFSET,VR[1]
13:  ! Handling
14:
15:  END
16:  LBL[1]
17:  END
18:
19:  LBL[99]
20:  UALM[1]
- UFRAME_NUM and UTOOL_NUM: set the same value as the number of the user coordinate system for compensation.
- VISION RUN_FIND: executes vision process A with the vision detection instruction.
- VISION GET_OFFSET: obtains the measurement result of vision process A.
- SKIP CONDITION: uses the skip condition to sense contact with the workpiece; the condition needs to be considered for each system.
- VOFFSET and Tool_Offset: the "VOFFSET" motion option applies position compensation, and the "Tool_Offset" instruction adjusts the amount of approach. Setting the Z element of the "PR" added to "Tool_Offset" to a suitable value adjusts the approach distance.

7.2.8 Robot Compensation Operation Check
Check that a workpiece placed on the pallet can be detected and handled precisely. At first, decrease the
override of the robot motion to check that the logic of the program is correct. Next, increase the
override to check that the robot can operate continuously.
7.3 SETUP FOR "ROBOT MOUNTED CAMERA + FIXED FRAME OFFSET"
Follow the procedure below to perform setup.
1. Camera setting data creation and teaching
2. Robot TCP setting (teach the TCP at the tip of the pointer used for user frame teaching; the pointer TCP is used to teach the application user frame and the calibration user frame)
3. Teaching of the calibration grid frame (teach the calibration grid user frame or TCP frame number)
4. Teaching of the application user frame (set the user coordinate system parallel with the pallet plane)
5. Camera calibration data creation and teaching
6. Vision program creation and teaching
7. Robot program creation and teaching
8. Robot compensation operation check
Setup for "robot mounted camera + fixed frame offset" requires setting the calibration grid information in a user coordinate system with an arbitrary number, and setting an application user frame, with its XY plane parallel with the pallet plane, as a user coordinate system with another arbitrary number.
(Figure: the user coordinate system holding the calibration grid installation information, and the user coordinate system for compensation with its XY plane on the pallet)
7.3.1 Camera Setting Data Creation and Teaching
With iRVision, items such as a camera type and camera installation method are set in camera setting data.
Whether to install a camera on the robot hand or on a fixed stand is set in the camera setting data. When
using a hand camera, be sure to check this item.
When using a robot mounted camera, iRVision reads the current position of the robot automatically and
calculates compensation data. Therefore, it is not necessary to change the logic of a TP program
depending on the position of each camera when detecting a workpiece while moving the camera over the
pallet.
For details on a TP program, see Subsection 7.3.7, "Robot Program Creation and Teaching".
7.3.2 Robot TCP Setting
A pointer tool with a taught TCP is required to teach the application user frame and the calibration user
frame. In general, set the TCP accurately on the pointer installed on the robot gripper. If the accuracy
of this TCP setting is low, the precision in handling of a workpiece by the robot is also degraded,
especially when the workpiece is rotated. Set a robot TCP in an arbitrary tool coordinate system.
To reuse the pointer TCP for recalibration, the reproducibility of pointer installation is required. If the
reproducibility of pointer installation is not assured, a TCP needs to be set each time a pointer is installed.
When a robot mounted camera or calibration grid plate is used, the calibration grid user frame can be set
with the grid frame setting function. In this case, the calibration grid user frame can be set without
setting a TCP for a robot. For details, see Subsection 11.2, ”GRID FRAME SETTING” in the iRVision
operator's manual.
7.3.3 Calibration Grid Setup
Teach the calibration grid location in a user frame.
To perform calibration by setting the calibration grid in a fixed location on a table, set it up in a user
frame.
When the calibration grid is fixed, secure the grid, then touch it up with the pointer to set the user frame. The touch-up precision here directly affects compensation precision, so perform the touch-up carefully. Note that the user frame used for the calibration grid differs from the application user frame described in Subsection 7.3.4.
When the calibration grid is fixed, teach it in a user-specified user frame. Teach the grid precisely with the pointer attached to the robot gripper, and set the user frame as shown below.
(Figure: user coordinate system with X, Y, and Z axes taught on the calibration grid)
For the method of setting a user coordinate system, see Appendix A, "CALIBRATION GRID SETUP".
The user coordinate system 4-point setting method is used.
7.3.4 Teaching Application User Frame
Set an application user frame to be used by the robot for compensation operation. The vision results are
converted into values in the application user frame.
If 0 is selected as the application user frame, the world coordinate system of the robot is used.
Set the application user frame for compensation so that the XY plane of the user coordinate system is
parallel with the pallet plane on which a workpiece moves.
(Figure: robot mounted camera above the pallet; the application user frame for compensation has its XY plane parallel with the pallet plane)
7.3.5 Camera Calibration Data Creation and Teaching
For camera calibration, the grid pattern calibration method is used. For a robot mounted camera, two-plane calibration is performed by moving the robot up and down as shown in the figure below. Set the vertical movement distance for two-plane calibration to a value that covers the upper and lower ends of the workpieces in the pallet.
The calibration grid can be installed in any place. Secure the calibration grid, touch up the origin, X-axis direction point, and Y-axis direction point with the robot, and set them as the calibration grid user frame. The touch-up precision here affects compensation precision, so perform the touch-up carefully. Note that the user frame used for the grid differs from the application user frame described in Subsection 7.3.4.
Next, move the camera in the Z direction by a distance that covers the upper and lower ends of the
workpiece in the pallet and detect the grid pattern at two different heights. If the robot posture in which
a calibration grid is detected is the same as the posture in which a workpiece is detected, the precision is
improved.
(Figure: 1st and 2nd detections of the calibration grid, separated by a distance that covers the upper and lower ends of the workpieces)
The grid pattern calibration screen is shown below. A calibration grid image is displayed.
• In "Application Frame", select the user frame number set in Subsection 7.3.4.
• Enter the interval between grid dots.
• Set the number of calibration planes to 2.
• According to the setting of Subsection 7.3.3, select the frame number in which the calibration grid position data is set. Next, press the "Set Frame" button.
• Move the camera a proper distance from the calibration grid, then press the snap button. Next, press the "Find" button for the first plane.
• Change the distance between the camera and the calibration grid plate, then press the snap button. Next, press the "Find" button for the second plane.

7.3.6 Vision Program Creation and Teaching
Create a new Bin-Pick Search Vision Process.
The basic method of creating a program is the same as that for 2.5-dimensional compensation based on a single camera. Specifically, place a workpiece near the lower surface of the pallet, touch up the workpiece with the touch-up pin attached to the robot hand, and measure the workpiece height in the application user frame (user coordinate system for compensation). Then, find the workpiece and set Reference Scale 1. Next, place the workpiece near the upper surface of the pallet, touch up the workpiece with the touch-up pin attached to the robot hand, and measure the workpiece height in the application user frame (user coordinate system for compensation). Then, find the workpiece and set Reference Scale 2.
The Search vision process calculates workpiece height Z based on the detected workpiece scale. Therefore, it is required that the workpieces placed on the same surface have the same size.
(Figure: apparent workpiece scale, 100% and 52%, at camera distances of 60 mm and 300 mm)
A description of the GPM Locator tool is omitted here. For setting of the GPM Locator tool, refer to
Subsection 8.1, “GPM LOCATOR TOOL” in the iRVision Operator’s Manual.
• Select camera calibration data.
• Select a shutter speed.
• Set the items to be prioritized when multiple workpieces are detected. Multiple items can be set.
Place the workpiece near the pallet lower surface and:
• Touch up the workpiece surface with the TCP, measure the workpiece height, and set the value to reference height 1.
• Detect the workpiece with the "Snap and Find" button and press the "Set Scale" button to set the reference size.
Place the workpiece near the pallet upper surface and:
• Touch up the workpiece surface with the TCP, measure the workpiece height, and set the value to reference height 2.
• Detect the workpiece with the "Snap and Find" button and press the "Set Scale" button to set the reference size.
For the workpiece for which the robot was taught:
• Press the "Snap and Find" button to detect the workpiece. Then, press the "Set Ref. Pos." button to set the reference position.
7.3.7 Robot Program Creation and Teaching
In the sample program below, a vision program named "A" is used. For fixed frame offset, add the "VOFFSET" instruction as a motion option. The sample program includes an operation for adjusting the amount of approach before the workpiece grasp position.
 1:  UFRAME_NUM=1
 2:  UTOOL_NUM=1
 3:
 4:  L P[1] 500mm/sec FINE
 5:  VISION RUN_FIND 'A'
 6:  L P[2] 500mm/sec FINE
 7:  VISION GET_OFFSET 'A' VR[1] JMP LBL[99]
 8:  ! Handling
 9:  SKIP CONDITION xxx=yyy
10:  L P[2] 500mm/sec FINE Tool_Offset,PR[2] VOFFSET,VR[1]
11:  L P[3] 100mm/sec FINE VOFFSET,VR[1] Skip,LBL[1] PR[3]=LPOS
12:  L P[4] 100mm/sec FINE Tool_Offset,PR[2] VOFFSET,VR[1]
13:  ! Handling
14:
15:  END
16:  LBL[1]
17:  END
18:
19:  LBL[99]
20:  UALM[1]
- UFRAME_NUM and UTOOL_NUM: set the same value as the number of the user coordinate system for compensation.
- VISION RUN_FIND: executes vision process A with the vision detection instruction.
- VISION GET_OFFSET: obtains the measurement result of vision process A.
- SKIP CONDITION: uses the skip condition to sense contact with the workpiece; the condition needs to be considered for each system.
- VOFFSET and Tool_Offset: the "VOFFSET" motion option applies position compensation, and the "Tool_Offset" instruction adjusts the amount of approach. Setting the Z element of the "PR" added to "Tool_Offset" to a suitable value adjusts the approach distance.

7.3.8 Robot Compensation Operation Check
Check that workpieces placed on the pallet can be handled in sequence precisely. At first, decrease the
override of the robot motion to check that the logic of the program is correct. Next, increase the
override to check that the robot can operate continuously.
8 SETUP OF 3D TRI-VIEW VISION PROCESS
This chapter describes the setup procedure for 3D Tri-View Vision Process by using the following two
application examples:
<1> Fixed camera + fixed frame offset
<2> Robot mounted camera + fixed frame offset
This chapter assumes that a personal computer for teaching is already set up. No description of personal
computer setup is provided here.
Fixed camera + fixed frame offset
An example of layout for "fixed camera + fixed frame offset" is given below.
Robot mounted camera + fixed frame offset
An example of layout for "robot mounted camera + fixed frame offset" is given below. Three points of a
workpiece are measured by moving one camera.
8.1 PREPARATION BEFORE SETUP
The following sections describe the setup procedures for "fixed camera + fixed frame offset" and "robot mounted camera + fixed frame offset". First, this section describes the preparations common to both setup procedures.
Calibration grid preparation
With the 3D Tri-View Vision Process, a calibration grid is used. Be sure to prepare a calibration grid beforehand. A standard calibration grid is available from FANUC in several sizes. It is strongly recommended that you order a calibration grid together with the camera and lens.
When two-plane calibration is performed with a fixed camera, the calibration grid needs to be attached to the robot. Therefore, it is recommended that the robot be designed so that the grid can be mounted on it.
Memory card preparation
iRVision can save undetected images to a memory card inserted into the main board of the robot
controller. It is recommended that at the time of system start-up and integration, a memory card be
inserted to save undetected images to the memory card. By doing so, a GPM Locator Tool parameter
adjustment can be made using undetected images. Moreover, when the system is reinstalled after being
moved, for example, camera images before reinstallation, if saved, can be checked against camera images
after reinstallation to see if there is any major difference. Note that even if "Log Failed Images" is set in
the vision program, no undetected images can be saved when no memory card is inserted.
A memory card, when inserted, can be used to back up all data in the robot controller. If all data in the
robot controller is backed up, the vision data can be backed up at the same time. Be sure to back up all
data in the robot controller upon completion of startup or integration.
Preparation for drawings etc. about a workpiece
The 3D Tri-View Vision Process uses the distance between detection targets for calculation, so it is
necessary to input the coordinates of detection targets of the workpiece in an arbitrary coordinate system.
Make preparations so that the coordinates of the detection targets can be entered from drawings etc. of the workpiece during setup.
8.2 PLACEMENT OF CAMERAS
The 3D Tri-View Vision Process uses fixed cameras or a robot mounted camera.
Install the cameras considering the following points.
Determining a camera view
Determine the camera view so that the detection targets do not fall outside the camera view even when
they deviate at the maximum. If the camera view is too wide, the required compensation accuracy may
not be obtained.
Determining the placement of cameras
When the detection targets are detected, three gaze lines are measured. Place the cameras so that no pair of gaze lines is parallel and the angle formed by any two gaze lines is reasonably large (if possible, 60 degrees or more). If the angle formed by the gaze lines is too small, the required precision may not be obtained.
(Figure: an example of bad placement. Two gaze lines are substantially parallel, so the angle they form is small. Do not select this placement.)
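As a rough placement check, the angle between any two gaze lines can be computed from their direction vectors and compared against the 60-degree guideline above. This is a minimal sketch; the function names and threshold handling are illustrative, not part of iRVision:

```python
import math
from itertools import combinations

def gaze_angle_deg(v1, v2):
    """Angle in degrees between two gaze-line direction vectors."""
    dot = sum(a * b for a, b in zip(v1, v2))
    norm1 = math.sqrt(sum(a * a for a in v1))
    norm2 = math.sqrt(sum(b * b for b in v2))
    # Clamp to [-1, 1] to guard against floating-point rounding.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (norm1 * norm2)))))

def placement_ok(gaze_lines, min_angle_deg=60.0):
    """True if every pair of gaze lines meets the recommended minimum angle."""
    return all(gaze_angle_deg(a, b) >= min_angle_deg
               for a, b in combinations(gaze_lines, 2))
```

For example, three mutually perpendicular gaze lines pass the check, while two nearly parallel gaze lines fail it.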
Determining the focal length of a lens
Determine the focal length of the lens from the camera view and the distance between the camera and the target. When the camera is an XC-56, the focal length f (mm) is calculated by the following expression:

    f = 3.55 × L ÷ W

    L: distance between the camera and the target (mm)
    W: width of the camera field of view (mm)

(Figure: the camera at distance L from the target, viewing a field of view of width W.)
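The expression above can be sketched in a few lines of Python. The constant 3.55 is taken directly from the formula as given for the XC-56; treat the function name and the validation as illustrative:

```python
def focal_length_mm(distance_mm, field_of_view_mm):
    """Approximate lens focal length for the XC-56, from f = 3.55 * L / W."""
    if distance_mm <= 0 or field_of_view_mm <= 0:
        raise ValueError("distance and field of view must be positive")
    return 3.55 * distance_mm / field_of_view_mm
```

For example, a 300 mm field of view at a distance of 1000 mm gives f ≈ 11.8 mm, so a standard lens with a nearby focal length (such as 12 mm) would be a natural choice.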
8.3 SETUP FOR "FIXED CAMERA + FIXED FRAME OFFSET"
Setup for "fixed camera + fixed frame offset" needs to set "calibration grid frame" in a tool coordinate
system with an arbitrary number, and to set "application frame" in a user frame system with an arbitrary
number.
An example layout for "fixed camera + fixed frame offset" is given below.
Three fixed cameras measure 3 points of the workpiece.
Use the following setup procedure for "fixed camera + fixed frame offset":
1. Camera setting data creation and teaching
2. Set the calibration grid frame attached to the robot hand in a user-specified tool frame. (Use the vision grid pattern setting for this setting.)
3. Set a user frame as needed. (The user frame is used for measurement. The world frame may also be used; in that case, select 0.)
4. Camera calibration data creation and teaching. (Create calibration data for each camera.)
5. Vision program creation and teaching
6. Robot program creation and teaching
7. Robot compensation operation check
8.3.1 Camera Setting Data Creation and Teaching
With iRVision, items such as camera types and camera installation methods for all cameras used are set in
camera setting data. Whether to install a camera on the robot gripper or on a fixed stand is set in the
camera setting data. When using a fixed camera, do not check "Robot-Mounted Camera".
8.3.2 Calibration Grid Frame Setup
Perform calibration by setting a calibration grid on the robot gripper.
Fix the calibration grid installed on the hand securely and set the tool frame using the grid frame setting.
For setting of the grid frame setting, refer to Subsection 12.2.2, "When the grid is mounted on the robot
end of arm tooling", "Setting Based on Measurement with a Camera" in the iRVision Operator's Manual.
It is also possible to set the tool frame by precisely touching up the calibration grid with a pointer tool fixed within the operation range of the robot, instead of using the grid frame setting. (Refer to Subsection 12.2.1, "When the grid is mounted on the robot end of arm tooling", "Setting Based on Touch-Up Operation" in the iRVision Operator's Manual.) In this case, perform the touch-up operation carefully so as not to reduce the touch-up precision.
8.3.3 Application Frame Setup
Set the user frame to be used by the robot for compensation operation. The measurement result is output
as values in the user frame set here.
If 0 is selected as the coordinate system number, the world coordinate system of the robot is used.
Unlike two-dimensional compensation, it is unnecessary to set the optical axis of the camera normal to
the XY plane of the application frame, where the robot performs a compensation operation. The user
frame used for compensation may be placed in any place. If there is no coordinate system to be
specifically specified, use the world coordinate system of the robot.
Application frame(=World coordinate system)
8.3.4 Camera Calibration Data Creation and Teaching
For all cameras, the use of a calibration grid is recommended.
On the calibration data setting screen, correctly select the coordinate system number set in Subsection 8.3.2, "Calibration Grid Frame Setup", and the coordinate system number set in Subsection 8.3.3, "Application Frame Setup". Note that the coordinate system number of the "application frame" must be the same in all calibration data used.
When the calibration grid is robot mounted, perform two-plane calibration by moving the robot up and
down as shown in the figure below. Perform detection by bringing calibration surface 1 as close as
possible to the detection target. The up/down distance for two-plane calibration should be 100 to 150 mm.
Detect the calibration grid at two different heights. When moving the calibration grid up and down,
perform robot jog without changing the robot posture.
(Figure: 1st detection with the grid positioned as close as possible to the detection target; 2nd detection after moving the grid.)
The calibration grid setup page is shown below. The calibration grid image is displayed.
SUPPLEMENT
Camera calibration clarifies the positional
relationship between the camera and the user
frame used for calibration.
- Set the user frame for compensation set
in Subsection 8.3.3. Specify the same
number for all calibration data used. If
there is no coordinate system to be
specifically specified, select 0 which is
the number for world coordinate
system.
- Enter the interval between grid points.
- Set the number of calibration planes to
2.
- According to the setting of Subsection
8.3.2, select the user frame number that
was taught to the calibration grid.
- Install the calibration grid in a place
close to the detection target of the
workpiece, orthogonally to the optical
axis of the camera, press the snap
button, and press the detection button.
- Change the distance between the calibration grid and the camera, then press the snap button. Next, press the [Find] button for the second plane.
8.3.5 Vision Program Creation and Teaching
Create a program for the 3D Tri-View Vision Process. Each measurement point has its own camera view, named camera view 1, camera view 2, and camera view 3. A "GPM Locator Tool" comes under each camera view.
Use the following procedure to teach “3D Tri-View Vision Process”.
For correct teaching, after performing "Snap and Find" of a camera view, do not change the workpiece position until Subsection 8.3.6, "Robot Program Creation and Teaching" is completed.
1. Select calibration data for the camera view, and teach the fundamental data.
2. Teach the GPM Locator Tool for the target of the camera view.
3. Select the camera view, then press the [Snap and Find] button to detect the taught model.
4. Perform steps 1, 2, and 3 above for all camera views, select "3D Tri-View Vision Process" in the tree view, and press the [Snap] button to perform detection.
5. Select the [Set Ref. Pos.] button to set a reference position.
In "3D Tri-View Vision Process", be sure to perform detection on the "3D Tri-View Vision Process"
setting screen before setting the reference position.
- 115 -
8.SETUP OF 3D TRI-VIEW VISION PROCESS
B-82774EN-3/03
Camera View Teaching
For each camera view, teach the GPM Locator Tool. A detailed description of the GPM Locator Tool is omitted here. For setting of the GPM Locator Tool, refer to Subsection 8.1, “GPM LOCATOR TOOL” in the iRVision Operator’s Manual.
- Select the camera calibration data.
- Set the proper shutter speed.
- Train the GPM Locator Tool.
- Set the fundamental data. Enter the
coordinates of the detection target in
any coordinate system. (For example,
the coordinate in a drawing)
- Select [Snap and Find] button to detect
the workpiece.
Perform the same processing for camera
view 2 and camera view 3.
- Example of entering fundamental data:
This is an example of entering CAD
data of the workpiece. Enter the
coordinates of the detection target
displayed in CAD as fundamental data.
(Figure: a CAD coordinate system, placed arbitrarily, with three detection-target circles whose centers are (x1, y1, z1), (x2, y2, z2), and (x3, y3, z3).)
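Because the process uses the distances between detection targets, the coordinate system in which the fundamental data is entered is arbitrary; only the relative geometry matters. A small sketch illustrating this invariance (the function names are illustrative, not part of iRVision):

```python
import math

def distance(p, q):
    """Euclidean distance between two 3D points (mm)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def pairwise_distances(p1, p2, p3):
    """Distances between the three detection targets, in a fixed order."""
    return (distance(p1, p2), distance(p2, p3), distance(p3, p1))
```

Shifting all three entered target coordinates by the same offset (i.e., choosing a different CAD origin) leaves the pairwise distances unchanged, which is why any coordinate system in the drawing may be used.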
Vision Program Teaching
Upon normal completion of detection with three camera views, set a reference position.
- In “Combine Error Limit”, set a
proper value as needed.
- Press the [Find] button to find the
workpiece.
- Check that the workpiece is detected, then press the [Set Ref. Pos.] button. This sets a reference position for all camera views.
8.3.6 Robot Program Creation and Teaching
In the sample program below, a vision program named "A" is used. Three targets of a workpiece are
found. If multiple fixed cameras are used, no camera view number needs to be specified in the "VISION
RUN_FIND" instruction.
 1:  UFRAME_NUM=1
 2:  UTOOL_NUM=1
 3:
 4:  VISION RUN_FIND 'A'
 5:  VISION GET_OFFSET 'A' VR[1] JMP LBL[99]
 6:
 7:  !APPROACH
 8:  L P[2] 500mm/sec FINE VOFFSET,VR[1]
 9:
10:  END
11:  LBL[99]
12:  UALM[1]

- Line 1: set the same value as the number of the application frame.
- Line 4: this instruction completes when the images of camera views 1, 2, and 3 have all been snapped.
- Line 5: obtains the measurement result of vision program A.
- Line 8: approaches the workpiece.
- Lines 11 to 12: exception processing for when compensation data fails to be obtained.
When a fixed camera is used, one "VISION RUN_FIND" instruction measures all camera views prepared
beforehand. When the images of all camera views have been snapped, the line after the "VISION
RUN_FIND” instruction is executed.
8.3.7 Robot Compensation Operation Check
Check that three points of a workpiece can be detected and that the compensation can be performed
correctly. At first, decrease the override of the robot to check that the logic of the program is correct. Next,
increase the override to check that the robot can operate continuously.
8.4 SETUP FOR "ROBOT-MOUNTED CAMERA + FIXED FRAME OFFSET"
Setup for "robot-mounted camera + fixed frame offset" needs to set “calibration grid frame”and
"application frame" in a user coordinate system with an arbitrary number.
An example of layout for "robot-mounted camera + fixed frame offset" is shown below. Three targets of a
workpiece are measured by moving one camera.
Use the following setup procedure:
1. Camera setting data creation and teaching
2. Set the calibration grid frame in a user-specified user frame. (Use the vision grid pattern setting for this setting.)
3. Set a user frame as needed. (The user frame is used for measurement. The world frame may also be used; in that case, select 0.)
4. Camera calibration data creation and teaching. (Create the calibration data for the camera.)
5. Vision program creation and teaching
6. Robot program creation and teaching
7. Robot compensation operation check
8.4.1 Camera Setting Data Creation and Teaching
With iRVision, items such as a camera type and camera installation method are set in camera setting data.
Whether to install a camera on the robot or on a fixed stand is set in the camera setting data. When using a
robot mounted camera, be sure to check this item.
8.4.2 Calibration Grid Setup
Teach the calibration grid location in a user frame.
With the calibration grid fixed in place, set the "calibration grid frame" in a user-specified user frame. The calibration grid can be placed anywhere. Use the grid frame setting for this purpose, fixing the calibration grid securely before setting it. For information on the grid frame setting, refer to "When the grid is installed on a fixed place", "Setting Based on Measurement with a Camera" in Subsection 12.2.2 of the iRVision Operator's Manual. Note that the user frame used for the calibration grid differs from the “application frame”.
It is also possible to set the user frame by precisely touching up the calibration grid with a pointer tool fixed within the operation range of the robot, instead of using the grid frame setting. (Refer to Subsection 12.2.1, "When the grid is installed on a fixed place", "Setting Based on Touch-Up Operation" in the iRVision Operator's Manual.) In this case, perform the touch-up operation carefully so as not to reduce the touch-up accuracy.
Calibration grid plate
8.4.3 Teaching Application User Frame
Set the user frame to be used as the reference of robot compensation operation. If 0 is selected as the
coordinate system number, the world coordinate system of the robot is used.
Unlike two-dimensional compensation, it is unnecessary to set the optical axis of the camera normal to
the XY plane of the application user frame, where the robot performs a compensation operation. The user
frame used for compensation may be placed in any place. If there is no coordinate system to be
specifically specified, use the world coordinate system of the robot.
Application frame(=World coordinate system)
8.4.4 Camera Calibration Data Creation and Teaching
Perform calibration using the grid calibration method.
On the calibration data setting screen, select the coordinate system number set in Subsection 8.4.2,
"Calibration Grid Frame Setup", and the coordinate system number set in Subsection 8.4.3, "Teaching
Application User Frame" correctly. Note that the coordinate system number of "application frame" must
be the same in all calibration data used.
Perform two-plane calibration by moving the robot up and down as shown in the figure below. When detecting, set the optical axis of the camera normal to the XY plane of the application user frame, where the robot performs a compensation operation. The up/down distance for two-plane calibration should be 100 to 150 mm.
Detect the calibration grid at two different heights. When moving the calibration grid up and down,
perform robot jog without changing the robot posture.
(Figure: 1st detection; then move the camera 100 mm to 150 mm and perform the 2nd detection of the calibration grid plate.)
The calibration grid setup page is shown below. The calibration grid image is displayed.
SUPPLEMENT
Camera calibration clarifies the positional
relationship between the camera and the user
frame used for calibration.
- Set the user frame for compensation.
Specify the same number for all
calibration data used. If there is no
coordinate system to be specifically
specified, select 0 which is the number
for world coordinate system.
- Enter the interval between grid points.
- Set the number of calibration planes to
2.
- Select the number of a user coordinate
system where calibration grid position
data is set. Next, press the [Set Frame]
button.
- Move the camera to a proper distance from the calibration grid, then press the snap button. Next, press the [Find] button for the first plane.
- Change the distance between the
camera and calibration grid plate then
press the snap button. Next, press the
[Find] button for the second plane.
8.4.5 Vision Program Creation and Teaching
Create a program for the 3D Tri-View Vision Process. Each measurement point has its own camera view, named camera view 1, camera view 2, and camera view 3. A "GPM Locator Tool" comes under each camera view.
Use the following procedure to teach the program for "3D Tri-View Vision Process".
For correct teaching, after performing "Snap and Find" of a camera view, do not change the workpiece position until Subsection 8.4.6, "Robot Program Creation and Teaching" is completed.
1. Select calibration data for the camera view, and teach the distance between the camera and the target and the fundamental data.
2. Move the camera to a proper position and teach the pattern match of the detection target of the camera view. Record the current camera position in the robot program (see Subsection 2.4.2).
3. Select the camera view, then press the [Snap and Find] button to detect the taught model.
4. Perform steps 1, 2, and 3 above for all camera views, select "3D Tri-View Vision Process" in the tree view, and press the [Snap] button to perform detection.
5. Select the [Set Ref. Pos.] button to set a reference position.
In "3D Tri-View Vision Process", be sure to perform detection on the "3D Tri-View Vision Process"
setting screen before setting the reference position.
Camera View Teaching
With each camera view, teach GPM Locator Tool. A description of the GPM Locator Tool is omitted here.
For setting of the GPM Locator Tool, refer to Subsection 8.1, “GPM LOCATOR TOOL” in the iRVision
Operator’s Manual.
- Select the camera calibration data
- Set the proper shutter speed.
- Train the GPM locator tool.
- Change this value to the rough distance
between the camera and the detection
target during imaging. (See the figure
below.)
- Set the fundamental data. Enter the
coordinates of the detection target in
any coordinate system. (For example,
the coordinate in a drawing)
- Select [Snap and Find] button to detect
the workpiece.
Perform the same processing for camera
view 2 and camera view 3.
Enter the rough distance between the camera and the detection
target during imaging in "Dist. between Camera and Target".
An error of 20 to 30 mm between the actual distance and the
entered value is generally acceptable.
- Example of entering fundamental data:
This is an example of entering CAD data
of the workpiece. Enter the coordinates
of the detection target displayed in CAD
as fundamental data.
(Figure: a CAD coordinate system, placed arbitrarily, with three detection-target circles whose centers are (x1, y1, z1), (x2, y2, z2), and (x3, y3, z3).)
Vision Program Teaching
Upon normal completion of detection with three camera views, set a reference position.
- In “Combine Error Limit”, set a
proper value as needed.
- Press the [Find] button to find the
workpiece.
- Check that a workpiece is
detected, then press the [Set Ref.
Pos.] button. This sets a reference
position for all camera views.
8.4.6 Robot Program Creation and Teaching
In the sample program below, a vision program named "A" is used. Three targets of a workpiece are found by moving the robot-mounted camera. Program A has three camera views, each with a different view position of the robot-mounted camera, so the camera view number is specified in each "VISION RUN_FIND" instruction.
 1:  UFRAME_NUM=1
 2:  UTOOL_NUM=1
 3:  L P[1] 500mm/sec FINE
 4:  VISION RUN_FIND 'A' CAMERA_VIEW[1]
 5:  L P[2] 500mm/sec FINE
 6:  VISION RUN_FIND 'A' CAMERA_VIEW[2]
 7:  L P[3] 500mm/sec FINE
 8:  VISION RUN_FIND 'A' CAMERA_VIEW[3]
 9:  VISION GET_OFFSET 'A' VR[1] JMP LBL[99]
10:  !APPROACH
11:  L P[4] 500mm/sec FINE VOFFSET,VR[1]
12:  END
13:  LBL[99]
14:  UALM[1]

- Lines 1 and 2: set the application frame (and tool frame) here.
- Lines 3 and 4: move to the first view position and run find for camera view 1.
- Lines 5 and 6: move to the second view position and run find for camera view 2.
- Lines 7 and 8: move to the third view position and run find for camera view 3.
- Line 11: approaches the workpiece.
- Lines 13 and 14: exception processing for when compensation data fails to be obtained.
When a camera image has been snapped, the line after the "VISION RUN_FIND" instruction is executed.
8.4.7 Robot Compensation Operation Check
Check that three points of a workpiece can be detected and that the compensation can be performed
correctly. At first, decrease the override of the robot to check that the logic of the program is correct. Next,
increase the override to check that the robot can operate continuously.
9 SET UP OF SNAP IN MOTION
This function enables iRVision to snap an image without stopping robot motion. It is effective when measuring a fixed frame offset with a robot-mounted camera, or when measuring a tool offset of a part held by a robot with a fixed camera or a camera mounted on another robot. Using this function reduces robot cycle time compared with the conventional method.
9.1 OVERVIEW OF SNAP IN MOTION
This section gives an overview of the snap-in-motion function.
9.1.1 FEATURES
This function enables iRVision to snap an image without stopping robot motion, and is effective when
measuring a fixed frame offset with a robot-mounted camera and/or measuring a tool offset of a part held
by a robot with a fixed camera or a camera mounted on another robot. The following vision processes
support this function.
- 2D Single-view Vision Process
- 2D Multi-view Vision Process
- Depalletizing Vision Process
- Bin-pick Search Vision Process
- 3D Tri-View Vision Process
You can calibrate cameras and teach vision processes in the conventional way.
Obtaining an accurate robot position at a snapping moment is a key technology to measure a fixed frame
offset of part with a robot-mounted camera or a tool offset of a part held by a robot with a fixed camera or
a camera mounted on another robot. Conventionally an accurate robot position can be obtained only
while a robot remains stationary. The function snap-in-motion enables measuring a fixed frame offset
and a tool offset by getting an accurate robot position at a snapping moment.
This function has the following restrictions.
- The robot that holds the camera or the part must be controlled by the controller on which iRVision resides.
- Only one robot can move at the snapping moment.

When measuring a tool offset with a robot-mounted camera, two robots are involved. In this case, only one of the two robots can move at snapping; the other robot must be stationary while the image is snapped.
iRVision determines which robot may move at the snapping moment as follows:
- If [This Controller] is selected for [Robot Holding the Part], the robot that holds the part can move.
- If a robot other than [This Controller] is selected for [Robot Holding the Part] and [This Controller] is selected for [Robot Holding the Camera], the robot that holds the camera can move.

The above conditions apply both when the two robots are controlled by one controller and when they are controlled by two controllers.
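The decision rules above can be restated as a small helper function. This is only a restatement of the conditions in this section for clarity, with illustrative names; iRVision itself does not expose such a function:

```python
def robot_that_may_move(robot_holding_part, robot_holding_camera):
    """Which robot iRVision assumes may move at the snapping moment.

    Arguments are the values selected for [Robot Holding the Part] and
    [Robot Holding the Camera] on the vision process setup page.
    """
    if robot_holding_part == "This Controller":
        # The robot that holds the part can move.
        return "robot holding the part"
    if robot_holding_camera == "This Controller":
        # The robot that holds the camera can move.
        return "robot holding the camera"
    return None  # neither robot may move at the snapping moment
```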
9.1.2 USING SNAP IN MOTION
By default, the function snap-in-motion is disabled. To use this function, change the system variable
$VSMO_CFG.$ENABLE to TRUE.
Setting this system variable to TRUE enables the function snap-in-motion itself. However, to actually snap an image without stopping robot motion, you also need to modify your robot program. Refer to Subsection 9.1.4, "ROBOT PROGRAM FOR SNAP IN MOTION" for how to modify robot programs.
By enabling the function snap-in-motion, the method to obtain a robot position at the snapping moment is
changed internally. Basically, you can continue to use camera calibrations, vision processes, and robot programs that you taught previously even after this function is enabled. However, some error may be observed if you continue to use camera calibrations and vision processes that were taught before the function was enabled. In such a case, calibrate the cameras and teach the reference positions again.
9.1.3 CHECKING POSITION AND SPEED AT SNAP
When the function snap-in-motion is enabled, the actual position and speed of the robot at the last snapping moment are recorded in the following system variables. Refer to these variables when determining the exposure time and related parameters in the application study described later.
$VSMO_VAL.$POSITION
It is the actual robot position at the snapping moment. It is in the Cartesian format. X, Y and Z are in
millimeters, and W, P and R are in degrees. For a robot-mounted camera, the position of the mechanical
interface relative to the application user frame is recorded. For tool offset, the position of the user tool
that you selected in the vision process setup page is recorded.
$VSMO_VAL.$SPEED
It is the actual robot speed at the snapping moment. X, Y and Z are in mm/sec, and W, P and R are in
degrees/sec.
9.1.4 ROBOT PROGRAM FOR SNAP IN MOTION
For snap-in-motion, you need to execute the VISION RUN_FIND instruction in a different manner from
usual.
Snap after stopping
This is a sample program that snaps an image while the robot is stopped, as is conventionally done. FINE is specified for P[2] in the second line. While the robot is stopped at P[2], the VISION RUN_FIND instruction snaps an image in the third line. After that, the robot resumes its motion to P[3] in the fourth line.
1: L P[1:start] 500mm/sec FINE
2: L P[2:snap] 500mm/sec FINE
3:   VISION RUN_FIND 'A'
4: L P[3:stop] 500mm/sec FINE
Snap in motion
To snap an image without stopping the robot, the VISION RUN_FIND instruction should be executed by
using the TIME BEFORE instruction. In the robot program below, CNT100 is specified for P[2] in the
second line, and the subprogram FIND.TP is called at the moment that the robot passes through P[2] by
using the TIME BEFORE instruction.
1: L P[1:start] 500mm/sec FINE
2: L P[2:snap] 500mm/sec CNT100 TB 0.00sec,CALL 'FIND'
3: L P[3:stop] 500mm/sec FINE
FIND.TP, which is called by the TIME BEFORE instruction, is shown below. The VISION RUN_FIND instruction is executed in this subprogram.

1:   VISION RUN_FIND 'A'
You can tweak the start time of the TIME BEFORE instruction, or you can use the TIME AFTER
instruction or the DISTANCE BEFORE instruction instead of the TIME BEFORE instruction. For these
instructions, refer to “R-30iA HANDLING TOOL OPERATOR’S MANUAL” or “R-30iA Mate LR
HANDLING TOOL OPERATOR’S MANUAL”.
If positions P[1] to P[3] in the above program are not on a straight line, the robot cuts inside the original path and does not pass through P[2] because of CNT100. In such a case, modify P[2] so that the robot actually passes through the expected snap position.
9.1.5 NOTES
To use this function, controller software version 7DA5/09 or later is required.
Vibration of the robot arm while the robot is moving is not compensated for. The bigger the vibration, the larger the error.
9.2 STUDY FOR APPLICATION
This section describes issues to consider when the function snap-in-motion is used.
9.2.1 LIGHT AND EXPOSURE TIME
When snapping an image in motion, the robot is moving even during the exposure, which causes image blur. Because this blur can cause detection errors, the exposure time should be set to a smaller value than usual to mitigate the blur.
Assuming that the robot moves in the direction at a right angle to the optical axis of the camera, the
amount of blur is calculated by the expression V × T × N ÷ S in pixels, where V is the velocity at the
snapping moment in mm/sec, T is the exposure time in sec, S is the size of the camera field of view in
millimeters and N is the number of effective pixels of the camera. Determine the exposure time so that
this amount of blur is smaller than 1 pixel and prepare a good light source to get a fully bright image with
the selected exposure time.
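The blur expression can be turned into a quick sizing check. This sketch uses the same right-angle-motion assumption as the text; the function names and the example numbers in the usage note are illustrative:

```python
def blur_px(v_mm_s, t_exp_s, fov_mm, n_pixels):
    """Image blur in pixels: V * T * N / S (motion at a right angle to the optical axis)."""
    return v_mm_s * t_exp_s * n_pixels / fov_mm

def max_exposure_s(v_mm_s, fov_mm, n_pixels, max_blur_px=1.0):
    """Longest exposure that keeps the blur below max_blur_px pixels."""
    return max_blur_px * fov_mm / (v_mm_s * n_pixels)
```

For example, at 500 mm/sec over a 300 mm field of view with 640 effective pixels, a 1 ms exposure already produces about 1.07 pixels of blur, so the exposure would need to be below roughly 0.94 ms.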
9.2.2 IMAGE PROCESSING TIME AND MOTION TIME
In the case of snapping images in motion, you should consider the relationship between the image
processing time and the robot motion time.
GET_OFFSET
The following program calls the subprogram FIND.TP, which includes the VISION RUN_FIND instruction, at the moment that the robot passes through P[2] in the second line, and tries to get the resulting vision offset in the fourth line. If P[2] and P[3] are too close and the robot reaches P[3] in a short time, the robot waits in the fourth line for the vision detection to complete, and therefore stops. To avoid this, lay out the camera so that the image processing finishes while the robot is moving from P[2] to P[3].
1: L P[1:start] 500mm/sec CNT100
2: L P[2:snap] 500mm/sec CNT100 TB 0.00sec,CALL FIND
3: L P[3:stop] 500mm/sec CNT100
4:   VISION GET_OFFSET 'A' VR[1] JMP,LBL[1]
5: L P[4:approach] 500mm/sec FINE VOFFSET,VR[1]
Continuous RUN_FINDs
In iRVision, another vision process cannot be started until the previous vision process completes. So, if the VISION RUN_FIND instruction were called twice at too short an interval, the latter vision process would be kept waiting, and its image snapping would not happen at the expected position.
For example, in the following program, if the time to move from P[2] to P[3] is shorter than the time for
the image processing for FIND1.TP, snapping an image for FIND2.TP will be delayed.
1: L P[1:start] 500mm/sec CNT100
2: L P[2:snap1] 500mm/sec CNT100 TB 0.00sec,CALL FIND1
3: L P[3:snap2] 500mm/sec CNT100 TB 0.00sec,CALL FIND2
4: L P[4:stop] 500mm/sec CNT100
The time a vision process takes depends on the shape of the trained model pattern, the detection parameters, the condition of the snapped image, and the load of the controller that performs the vision process. Check your vision process time by actually executing the vision process.
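A rough feasibility check for the layout can be sketched as below, assuming straight-line motion at the programmed speed. Acceleration, deceleration, and CNT blending are ignored, so measure the real times on the controller; the function names are illustrative:

```python
def motion_time_s(distance_mm, speed_mm_s):
    """Approximate time for the robot to move between two snap positions."""
    return distance_mm / speed_mm_s

def next_snap_delayed(distance_mm, speed_mm_s, processing_time_s):
    """True if the previous vision process would still be busy at the next snap."""
    return motion_time_s(distance_mm, speed_mm_s) < processing_time_s
```

For example, at 500 mm/sec with a 0.5 s vision process, the two snap positions would need to be more than 250 mm apart under this approximation to avoid delaying the second snap.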
In addition, if the previously executed vision process is configured to log images, the next vision process can be kept waiting until the previous one completes image logging, delaying the snap of the next vision process. In such a case, configure the previous vision process not to log images. The delay can also be shortened slightly by checking 'Disable Logging' in the iRVision configuration setup page.
9.2.3 SHIFT OF SNAP POSITION
Depending on the condition of the controller, the snap timing may fluctuate slightly. This shift of the snap timing is basically not a problem, because iRVision obtains an accurate robot position at snapping. However, if the camera field of view is sized so that the image is just filled by the part, the shift of the snap timing can put the part outside the camera field of view and cause a detection failure. Size the camera field of view with enough margin to tolerate some shift of the snap timing. The required margin depends on the speed of the robot at snapping: the faster the robot is moving, the bigger the shift of the snapping position.
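The margin can be estimated from the robot speed and a worst-case timing jitter. The jitter value in the usage note below is purely an illustrative assumption (the manual does not state one); measure the actual shift on your controller:

```python
def fov_margin_mm(speed_mm_s, timing_jitter_s):
    """Field-of-view margin needed to tolerate snap-timing jitter."""
    return speed_mm_s * timing_jitter_s
```

For example, at 500 mm/sec, an assumed 20 ms of jitter shifts the snap position by 10 mm, so the field of view should extend at least that far beyond the part.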
9.3 SAMPLE APPLICATIONS
This section demonstrates three sample applications using the function snap-in-motion.
- Tool offset with a fixed-mounted camera (2D Single-view Vision Process)
- Tool offset with a fixed-mounted camera (2D Multi-view Vision Process)
- Fixed frame offset with a robot-mounted camera (3D Tri-view Vision Process)

In each application, the setup procedure and the items to consider are generally the same as those for the same application without the function snap-in-motion. Descriptions that overlap those given for each vision process are omitted.
9.SET UP OF SNAP IN MOTION
9.3.1 TOOL OFFSET WITH A FIXED CAMERA (2D SINGLE-VIEW VISION PROCESS)
In a tool offset application using the 2D Single-view Vision Process with a fixed-mounted camera, a part held by the robot is presented to the camera and the tool offset is measured. Using the measured tool offset, the robot positions are compensated so that the part is placed at the correct position. The tool offset is usually measured while the robot is stopped; with the snap-in-motion function, the vision detection can be executed while the robot is transporting the part, without stopping robot motion.
Of the setup procedure described in the section “3.4 SETUP FOR "FIXED CAMERA + GRIP ERROR COMPENSATION"”, only the subsection “3.4.6 ROBOT PROGRAM CREATION AND TEACHING” differs when the snap-in-motion function is used. “ROBOT PROGRAM CREATION AND TEACHING” is explained below.
The figure below is an example layout for tool offset using the 2D Single-view Vision Process with a fixed-mounted camera.
9.3.1.1 ROBOT PROGRAM CREATION AND TEACHING
This subsection demonstrates robot programs for tool offset in motion. Based on the sample robot programs below, create appropriate robot programs for your application.
The following two programs are created.
• Main robot program (MAIN.TP)
• Robot program to detect a target (FIND.TP)
MAIN.TP
This is the main program. In this program, the robot moves from the start position P[1] to the stop position P[3] through the snap position P[2], and calls the subprogram FIND.TP at the moment the robot passes through the snap position. After that, the VISION GET_OFFSET instruction is called to
get the resulting vision offset. The robot positions P[4] and P[5] are compensated with the vision offset
so that the part is put at the correct location.
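Conceptually, the VOFFSET compensation applied at the place positions works like transforming a taught point by the measured offset in the user frame. The following is an analogy in plain Python (2D, with the offset given as x, y, and a rotation r), not FANUC code, and the function name is a hypothetical one.

```python
# Conceptual sketch of applying a 2D vision offset (x, y, r) to a
# taught position: rotate about the frame origin, then translate.
import math

def apply_offset(px, py, off_x, off_y, off_r_deg):
    """Return the compensated (x, y) of a taught point (px, py)."""
    r = math.radians(off_r_deg)
    qx = px * math.cos(r) - py * math.sin(r) + off_x
    qy = px * math.sin(r) + py * math.cos(r) + off_y
    return qx, qy

# A pure translation of (5, -2) just shifts the taught point.
print(apply_offset(100.0, 50.0, 5.0, -2.0, 0.0))
```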
 1: UFRAME_NUM=1
 2: UTOOL_NUM=1
 3: L P[1:start] 500mm/sec CNT100
 4: L P[2:snap] 500mm/sec CNT100 TB 0.00sec,CALL FIND
 5: L P[3:stop] 500mm/sec CNT100
 6: VISION GET_OFFSET 'A' VR[1] JMP LBL[99]
 7: L P[4:approach] 500mm/sec FINE VOFFSET,VR[1]
 8: L P[5:set] 100mm/sec FINE VOFFSET,VR[1]
 9: L P[4:approach] 100mm/sec FINE VOFFSET,VR[1]
10: END
11: LBL[99]
12: UALM[1]
FIND.TP
This program is called by the TIME BEFORE instruction. It executes the vision program 'A' using the VISION RUN_FIND instruction.
1: VISION RUN_FIND 'A'
CAUTION
If you encounter the alarm CVIS-174 "Robot Pos(Z) is different from Reference Pos." in a tool offset application using the snap-in-motion function, increase the system variable $VISTOL_2DZ so that the alarm does not occur. Note that increasing this system variable too much reduces accuracy.
9.3.2 TOOL OFFSET WITH A FIXED CAMERA (2D MULTI-VIEW VISION PROCESS)
In a tool offset application using the 2D Multi-view Vision Process with a fixed-mounted camera, features on a part held by the robot are presented to the camera and the tool offset is measured. Using the measured tool offset, robot positions are compensated so that the part is placed at the correct position. The tool offset is usually measured while the robot is stopped; with the snap-in-motion function, the vision detection can be executed while the robot is transporting the part, without stopping robot motion.
Of the setup procedure described in the section “4.4 SETUP FOR "FIXED CAMERA + GRIP ERROR COMPENSATION"”, only the subsection “4.4.6 ROBOT PROGRAM CREATION AND TEACHING” differs when the snap-in-motion function is used. “ROBOT PROGRAM CREATION AND TEACHING” is explained below.
The figure below is an example layout for tool offset using the 2D Multi-view Vision Process with a fixed-mounted camera.
9.3.2.1 ROBOT PROGRAM CREATION AND TEACHING
This subsection demonstrates robot programs for tool offset in motion. Based on the sample robot programs below, create appropriate robot programs for your application.
The following two programs are created.
• Main robot program (MAIN.TP)
• Robot program to detect a target (FIND.TP)
MAIN.TP
This is the main program. In this program, the robot moves from the start position P[1] to the stop position P[4] through the two snap positions P[2] and P[3], and calls the subprogram FIND.TP with a camera view number as an argument at the moment the robot passes through each snap position. To confirm that the vision processes have completed for every camera view, it waits until R[1] becomes 2. After that, the VISION GET_OFFSET instruction is called to get the resulting vision offset. The robot positions P[5] and P[6] are compensated with the vision offset so that the part is placed at the correct location.
 1: UFRAME_NUM=1
 2: UTOOL_NUM=1
 3: R[1]=0
 4: L P[1:start] 500mm/sec CNT100
 5: L P[2:snap1] 500mm/sec CNT100 TB 0.00sec,CALL FIND(1)
 6: L P[3:snap2] 500mm/sec CNT100 TB 0.00sec,CALL FIND(2)
 7: L P[4:stop] 500mm/sec CNT100
 8: WAIT R[1]>=2
 9: VISION GET_OFFSET 'A' VR[1] JMP LBL[99]
10: L P[5:approach] 500mm/sec FINE VOFFSET,VR[1]
11: L P[6:set] 100mm/sec FINE VOFFSET,VR[1]
12: L P[5:approach] 500mm/sec FINE VOFFSET,VR[1]
13: END
14: LBL[99]
15: UALM[1]
FIND.TP
This program is called by the TIME BEFORE instruction. It executes detection in the camera view specified by the argument using the VISION RUN_FIND instruction, and increments R[1] after image acquisition completes.
1: VISION RUN_FIND 'A' CAMERA_VIEW[AR[1]]
2: R[1]=R[1]+1
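The counter handshake between the main program and the asynchronous find calls can be illustrated outside the robot. In the sketch below, Python threads stand in for the TIME BEFORE calls; this is an analogy only, not robot code.

```python
# Sketch of the counter handshake: each "find" increments a shared
# counter when its acquisition is done, and the main flow waits until
# both camera views have reported (the WAIT R[1]>=2 step).
import threading

counter = 0
done = threading.Condition()

def find(view):                       # plays the role of FIND.TP
    global counter
    with done:
        counter += 1                  # R[1]=R[1]+1
        done.notify_all()

threads = [threading.Thread(target=find, args=(v,)) for v in (1, 2)]
for t in threads:
    t.start()
with done:                            # WAIT R[1]>=2
    done.wait_for(lambda: counter >= 2)
for t in threads:
    t.join()
print(counter)                        # both camera views reported
```

Without the wait, the offset would be requested before both views had been processed, which is exactly the failure the WAIT instruction in MAIN.TP prevents.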
CAUTION
If you encounter the alarm CVIS-174 "Robot Pos(Z) is different from Reference Pos." in a tool offset application using the snap-in-motion function, increase the system variable $VISTOL_2DZ so that the alarm does not occur. Note that increasing this system variable too much reduces accuracy.
9.3.3 FIXED FRAME OFFSET WITH A ROBOT-MOUNTED CAMERA (3D TRI-VIEW VISION PROCESS)
In a fixed frame offset application using the 3D Tri-view Vision Process with a robot-mounted camera, three features on a part are detected by the robot-mounted camera and the 3D position of the part is measured. Using the measured position, robot positions are compensated so that the tool of the robot can reach the part. The fixed frame offset is usually measured while the robot is stopped; with the snap-in-motion function, the measurements can be executed continuously without stopping the camera-carrying robot at each snap position.
Of the setup procedure described in the section “8.4 SETUP FOR "ROBOT-MOUNTED CAMERA + GRIP ERROR COMPENSATION"”, only the subsection “8.4.6 ROBOT PROGRAM CREATION AND TEACHING” differs when the snap-in-motion function is used. “ROBOT PROGRAM CREATION AND TEACHING” is explained below.
The figure below is an example layout for fixed frame offset using the 3D Tri-view Vision Process with a robot-mounted camera.
9.3.3.1 ROBOT PROGRAM CREATION AND TEACHING
This subsection demonstrates robot programs for fixed frame offset with a robot-mounted camera in motion. Based on the sample robot programs below, create appropriate robot programs for your application.
The following two programs are created.
• Main robot program (MAIN.TP)
• Robot program to detect a target (FIND.TP)
MAIN.TP
This is the main program. In this program, the robot moves from the start position P[1] to the stop position P[5] through the three snap positions P[2] to P[4], and calls the subprogram FIND.TP with a camera view number as an argument at the moment the robot passes through each snap position. To confirm that the vision processes have completed for every camera view, it waits until R[1] becomes 3. After that, the VISION GET_OFFSET instruction is called to get the vision offset. The robot positions P[6] and P[7] are compensated with the vision offset so that the robot holds the part.
 1: UFRAME_NUM=1
 2: UTOOL_NUM=1
 3: R[1]=0
 4: L P[1:start] 500mm/sec CNT100
 5: L P[2:snap1] 500mm/sec CNT100 TB 0.00sec,CALL FIND(1)
 6: L P[3:snap2] 500mm/sec CNT100 TB 0.00sec,CALL FIND(2)
 7: L P[4:snap3] 500mm/sec CNT100 TB 0.00sec,CALL FIND(3)
 8: L P[5:stop] 500mm/sec CNT100
 9: WAIT R[1]>=3
10: VISION GET_OFFSET 'A' VR[1] JMP LBL[99]
11: L P[6:approach] 500mm/sec FINE VOFFSET,VR[1]
12: L P[7:hold] 100mm/sec FINE VOFFSET,VR[1]
13: L P[6:approach] 500mm/sec FINE VOFFSET,VR[1]
14: END
15: LBL[99]
16: UALM[1]
FIND.TP
This program is called by the TIME BEFORE instruction. It executes detection in the camera view specified by the argument using the VISION RUN_FIND instruction, and increments R[1] after image acquisition completes.
1: VISION RUN_FIND 'A' CAMERA_VIEW[AR[1]]
2: R[1]=R[1]+1
10 TROUBLESHOOTING
iRVision has an online help function. Open the "Vision Setup" screen, then select "Help" from the screen's menu to view the online help.
Adjustment method after camera replacement
If the camera fails, replace it with a new one and adjust the new camera according to the procedure below.
Before removing the faulty camera, check that the lens aperture and focus rings are firmly secured. The lens is reusable. By transferring the lens without moving the aperture and focus rings, the need for pattern match readjustment can be eliminated, for an easier adjustment operation.
Use the following replacement and adjustment procedure:
1 Turn off the power to the robot controller. (This also turns off the power to the camera.)
2 Remove the camera, taking care not to apply force to the lens aperture and focus rings.
3 Remove the lens from the camera.
4 Change the settings of the DIP switch on the rear panel of the new camera according to the manual.
5 Attach the lens removed in step 3 to the new camera.
6 Install and secure the new camera.
7 Perform camera calibration.
This completes the replacement and adjustment operation.
Method of restoring vision data
Prepare the memory card for data back-up described in Section 2.1, "PREPARATION BEFORE
SETUP".
The extension of a vision data file is "VD". Vision data can be completely restored by loading all *.VD
files.
If the copy destination robot controller differs from the copy source robot controller, pay attention to the
software version. The software version of the copy destination robot controller must be the same as or
later than the software version of the copy source robot controller.
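The version rule above ("same as or later than") amounts to a numeric comparison of version components. The sketch below is illustrative only; the dotted version format and function name are assumptions, not the controller's actual version scheme.

```python
# Simple sketch of the restore rule: the destination software version
# must be the same as or later than the source version.

def can_restore(source_version, dest_version):
    """Compare dotted version strings numerically, e.g. '7.50' vs '7.70'."""
    to_tuple = lambda v: tuple(int(p) for p in v.split('.'))
    return to_tuple(dest_version) >= to_tuple(source_version)

print(can_restore('7.50', '7.70'))  # later destination: allowed
print(can_restore('7.70', '7.50'))  # earlier destination: not allowed
```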
APPENDIX

A CALIBRATION GRID SETUP AND TRAINING
Calibration grid plate
In grid calibration, a calibration grid with a predetermined grid pattern is used to perform camera calibration. By having the camera view the grid points as shown below, iRVision automatically recognizes the positional relationship between the calibration grid and the camera, the lens distortion, the focal length, and so forth during the grid calibration setup.
On the calibration grid, black solid circles are arranged in a grid pattern. Each of the circles at the center and the four corners contains a small white circle. These small white circles are touched with the pointer tool attached to the robot to teach the calibration user frame.
[Figure: calibration grid, with the X axis, Y axis, and origin indicated]
Calibration grid plate Setup
Both the calibration grid setup in a user frame (when the calibration grid is fixed) and in a tool frame (when the calibration grid is held by the robot) are described below.
When the plate is fixed
When the calibration grid is fixed, the robot touches the center of the large circle at the center of the grid and the centers of the small circles at three corners of the grid with the pointer tool to create a user frame. Select "User Frame Setup / Four Point" from the teach pendant.
When the 4-point teaching method is used on the teach screen, the pointer tool attached to the robot gripper touches up each of the small white circles on the calibration grid, as shown below, to set the calibration user frame.
[Figure: touch-up points on the calibration grid — origin of the coordinate system, X-axis direction, X-axis start point, and Y-axis direction]
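Conceptually, the touch-up points define a frame: an origin plus a point along the X axis fix the frame's position and orientation. The sketch below shows the idea in 2D plain Python; it is a simplified analogy (the controller's 4-point math is internal and also uses the Y-direction and X-start points), and the function name is hypothetical.

```python
# Simplified 2D sketch: an origin and an X-direction point determine
# a frame's position and rotation angle.
import math

def frame_from_points(origin, x_point):
    """Return (origin, angle_deg) of a frame whose X axis points at x_point."""
    ox, oy = origin
    xx, xy = x_point
    angle = math.degrees(math.atan2(xy - oy, xx - ox))
    return (ox, oy), angle

# Grid origin at (100, 200), X-direction point straight along world X:
print(frame_from_points((100.0, 200.0), (150.0, 200.0)))
```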
When the calibration grid is held by the robot
When the calibration grid is held by the robot, a pointer is secured near the robot, not on it. The robot touches the robot-mounted calibration grid to the tip of the fixed-mounted pointer to teach a tool frame for the calibration grid. Select "Tool Frame Setup / Six Point" from the teach pendant.
When the 6-point teaching method is used on the teach screen, each of the small white circles on the calibration grid touches up the tip of the fixed-mounted pointer, as shown below, to teach the tool frame.
[Figure: touch-up points on the robot-mounted calibration grid — origin of the coordinate system, X-axis direction, Z-axis direction, and approach points 1 to 3. Jog the robot-mounted calibration grid so that each point touches the fixed-mounted pointer without changing the robot posture; for the Z-axis direction point, manually enter W = W + 90.]
INDEX

<Number>
2D MULTI VIEW VISION PROCESS .... 40
2D SINGLE VIEW VISION PROCESS .... 15

<A>
Application Frame Setup .... 112
Application User Frame Setup .... 20, 44, 93

<C>
Calibration grid frame Setup .... 19, 43, 93, 112
Calibration grid plate Setup .... 62
Calibration grid Setup .... 26, 34, 53, 74, 82, 101, 120
CALIBRATION GRID SETUP AND TRAINING .... 141
Camera Calibration Data Creation and Setup .... 27, 54
Camera Calibration Data Creation and Teaching .... 21, 35, 45, 64, 75, 83, 94, 102, 113, 121
Camera Setting Data Creation and Teaching .... 18, 26, 34, 43, 52, 62, 73, 82, 92, 101, 111, 119
CHECKING POSITION AND SPEED AT SNAP .... 129

<F>
FEATURES .... 128
FEATURES AND NOTES .... 4, 6, 7, 9, 11, 13
FIXED FRAME OFFSET WITH A ROBOT MOUNTED CAMERA (3D TRI-VIEW VISION PROCESS) .... 135

<I>
IMAGE PROCESSING TIME AND MOTION TIME .... 130

<L>
LAYOUT .... 4, 6, 8, 10, 12, 14
LIGHT AND EXPOSURE TIME .... 130

<N>
NOTES .... 130

<O>
OVERVIEW OF 2D MULTI VIEW VISION PROCESS .... 5
OVERVIEW OF 2D SINGLE VIEW VISION PROCESS .... 3
OVERVIEW OF 2D VISION .... 3
OVERVIEW OF 3D TRI-VIEW VISION PROCESS .... 13
OVERVIEW OF BIN-PICK SEARCH VISION PROCESS .... 10
OVERVIEW OF DEPALLETIZING VISION PROCESS .... 9
OVERVIEW OF FLOATING FRAME VISION PROCESS .... 7
OVERVIEW OF SNAP IN MOTION .... 128
OVERVIEW OF THE MANUAL .... 1

<P>
Placement of Cameras .... 109
PREFACE .... 1
PREPARATION BEFORE SETUP .... 16, 41, 71, 80, 90, 108

<R>
Robot Compensation Operation Check .... 24, 31, 39, 51, 60, 69, 79, 88, 99, 107, 118, 127
Robot Program Creation and Teaching .... 24, 31, 38, 50, 60, 69, 79, 88, 99, 107, 117, 127, 132, 134, 135
ROBOT PROGRAM FOR SNAP IN MOTION .... 129
Robot TCP Setting .... 73, 82, 92, 101
Robot TCP Setup .... 18, 26, 43, 53

<S>
SAFETY .... s-1
SAMPLE APPLICATIONS .... 131
SET UP OF SNAP IN MOTION .... 128
SETUP FOR "FIXED CAMERA + FIXED FRAME OFFSET" .... 17, 41, 91, 110
SETUP FOR "FIXED CAMERA + TOOL OFFSET" .... 31, 60
SETUP FOR "ROBOT MOUNTED CAMERA + FIXED FRAME OFFSET" .... 25, 100
SETUP FOR "ROBOT-MOUNTED CAMERA + FIXED FRAME OFFSET" .... 51, 72, 81, 118
SETUP OF 3D TRI-VIEW VISION PROCESS .... 108
SETUP OF BIN-PICK SEARCH VISION PROCESS .... 89
SETUP OF DEPALLETIZING VISION PROCESS .... 80
SETUP OF FLOATING FRAME VISION PROCESS .... 70
SHIFT OF SNAP POSITION .... 131
STUDY FOR APPLICATION .... 130

<T>
Teaching Application User frame .... 27, 53, 74, 83, 102, 120
TOOL OFFSET WITH A FIXED CAMERA (2D MULTI-VIEW VISION PROCESS) .... 133
TOOL OFFSET WITH A FIXED CAMERA (2D SINGLE-VIEW VISION PROCESS) .... 132
TROUBLESHOOTING .... 137

<U>
User frame for Compensation Setup .... 35, 63
USING SNAP IN MOTION .... 129

<V>
Vision Program Creation and Teaching .... 22, 29, 37, 47, 55, 65, 77, 85, 97, 104, 115, 124
Revision Record
FANUC Robot series R-30iA / R-30iA Mate CONTROLLER
iRVision 2D Vision SET-UP GUIDANCE (B-82774EN-3)

Edition  Date        Contents
03       Sep., 2010  Add the setup for the 3D Tri-view Vision Process and the Snap-In-Motion.
02       Apr., 2008  Add the setup for the floating frame vision process and the search vision process.
01       Aug., 2007