
ARCHIVES OF MECHANICAL TECHNOLOGY AND AUTOMATION

Vol. 34 no. 3 2014

MIROSŁAW PAJOR*, KAROL MIĄDLICKI**, MATEUSZ SAKÓW**

KINECT SENSOR IMPLEMENTATION IN FANUC ROBOT MANIPULATION

The paper presents a control system for the FANUC S-420F, a six-axis robot. The control system is based on gesture monitoring and recognition using the Microsoft Kinect motion controller. The motion of the operator's hand, used as the controller, naturally and intuitively steers the robot as it performs day-to-day pick-and-place operations. The system was implemented in the Matlab/Simulink environment with the Kinect for Windows Runtime and the Kinect SDK & Developer Toolkit library.

To investigate the system, a set of gestures used to control the robot was developed. The influence of light, distance from the robot and the speed of the operator's gestures on the system was tested. The paper presents the test results and elaborates on the advantages and potential problems of the proposed control system.

Key words: gesture control, Microsoft Kinect, FANUC S-420F, FANUC KAREL, machine vision

1. INTRODUCTION

1.1. Current system of robot control

Industrial robots play a significant role in many fields of manufacturing, including the automotive, machine, food, electronic, chemical and pharmaceutical industries. They are also used by rescue teams and for military purposes (e.g. mine detection) [3, 10]. The technology and functionality of industrial robots have been constantly developed to simplify their operation and off-line programming [14]. Their accuracy, working area and handling capacity have been significantly increased. Improved workload ergonomics and robot motion optimization extend the robot's tool life [3].

* Dr hab. inż., ** Mgr inż., *** Inż. Institute of Mechanical Technology, West Pomeranian University of Technology.


Robots can be fitted with additional sensors and systems (e.g. vision) to ensure more precise operation [21]. So far, little attention has been paid to user interface issues, on-line control and the effects they can have on the operator's performance. In most applications, the robot is programmed only once and then cyclically performs the pre-programmed moves (pallet loading, welding). However, for medical, military and scientific applications involving health-hazardous experiments, this is hardly enough. Under dynamically changing conditions, remote robot control in real time is required [16]. Currently available commercial solutions enable robot control and programming using a computer, an operation control panel and, less often, joysticks or pads [15]. These interfaces are not adequate tools for controlling a robot in real time. Because they are not intuitive enough, the operator cannot effectively control all the degrees of freedom of the robotic arm. Traditional control interfaces require the operator to remember massive amounts of data (diode markings, switches, responses to joystick operation) and do not provide direct feedback on the robotic arm movements.

1.2. New concepts of industrial robot control

With the advent of readily available computers with huge computing capacity, the use of virtual reality and 3D environments and the application of gestures and human senses (haptic feedback) to remotely control various machines seem inevitable. Attempts have been made to modify existing control methods and to develop virtual environments with a view to executing remote control of machinery in modern factories [2]. Such systems should enable innovative, simple, remote ways of operator-machine interaction. Programming and robot and machine tool positioning should be performed remotely by an operator who is able to control several machines from a control center. A remote control system would help to fit more machines into existing shop floors. Remote control would also make safety zones redundant, since all machine control operations would be performed remotely [24].

Although remote control undoubtedly provides some advantages, it also requires the use of multiple sensors, cameras and computers with massive computational power for real-time data processing. Depth data is of particular importance for a comprehensive analysis of the environment surrounding a machine, for gesture recognition and for positioning a workpiece which is being worked on or picked and placed. Optical methods for depth sensing have their drawbacks: they require substantial computational time (stereoscopy), do not provide adequate speed of data processing (laser methods), are expensive (Time of Flight) or are sensitive to external stimuli (structured light) [11]. Therefore, the system proposed in the present paper uses the Microsoft Kinect sensor, which, despite its modest price, can perform depth measurements and features gesture recognition.

1.3. Concept of the proposed control system

The paper presents the control system of the FANUC S-420F, a six-axis robot, based on gesture monitoring and recognition. The concept rests on tracking the operator's gestures using a Microsoft Kinect motion controller. Track point position data acquired by the system is processed by the control and gesture recognition algorithms. The operator's hand, used as the controller, provides a natural and intuitive control interface which imitates the day-to-day operations of picking and placing objects. The system was implemented in the Matlab/Simulink environment, using the Kinect for Windows Runtime and the Kinect SDK & Developer Toolkit library.

2. MICROSOFT KINECT

2.1. Use in applications

The Microsoft Kinect sensor (Figure 1) was unveiled as Project Natal by Microsoft on 1 June 2009 at the E3 fair. This novel solution enabled users to interact with their console without having to use pads or other control systems commonly used at that time. The system implemented a new interface using hand gestures, body movements and spoken commands. Although Kinect was designed as a peripheral communicating with a console, it quickly aroused the interest of the academic community.

Figure 1. Microsoft Kinect sensor [9]

The Microsoft Kinect sensor is used in robotics, medicine and IT, as it can measure depth indoors more quickly and precisely (for distances up to 2–3 meters) than other methods, e.g. stereoscopy [1]. The sensor is used for remote control, environment scanning and collision avoidance in mobile [4], humanoid [22] and flying robots [18], as well as in industrial manipulators [1, 8, 20]. Attempts have been made to use the sensor to control surgical robots [12] and to control data displayed on screen without touching the keyboard or mouse [7], thus maintaining aseptic conditions in the operating theatre. The controller can also be used as part of a physical rehabilitation system [7]. The ability of Microsoft Kinect to measure depth and track the positions of track points on a human body is used in image processing, including 3D scanners [17], human detection [19] and emotion recognition [14].

2.2. Design and operation

The Kinect sensor consists of six modules: a multi-array microphone, an infrared emitter, a depth camera, a tilt (angularity) controller, a connection lead and an RGB camera. The Kinect sensor has a range of 1.2–3.5 metres in the normal mode and 0.8–2.5 metres in the near mode. The lens has an angular field of view of 43.5° vertically, with tilt adjustment of ±27°, and 57° horizontally. The RGB camera features a CMOS sensor and enables facial recognition and image processing. It operates in two modes: 640 × 480 at 30 fps and 1280 × 960 at 12 fps.

For depth data input, the sensor interprets the data stream from the depth camera. The infrared emitter projects a grid of pseudorandom points over a large area. The depth camera (IR CMOS) can detect positions at resolutions of 80 × 60, 320 × 240 and 640 × 480, at 30 fps. A triangulation-based range sensor then measures the distance to a track point. Although traditional triangulation requires two cameras, Kinect uses only one depth camera; the infrared emitter functions as the other camera. Since the device remembers the structure of the emitted points, that structure can be compared with the image provided by the IR CMOS. As the device knows the positions of certain points, a frame of reference points can be inferred. The depth camera then records the translated image and uses it to perform a disparity check of the points. This is how the distance between an object and the sensor is calculated. If the images overlap, the depth calculation is easy and can be quickly performed by the processor.
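The triangulation described above can be summarised by the standard disparity-to-depth relation for structured-light sensors of this type (the formula is not given in the paper and is added here as a worked illustration; Z_0 is the depth of the stored reference pattern, f the focal length of the IR camera, b the emitter-camera baseline, and d the measured disparity, up to sign convention):

```latex
\frac{1}{Z} = \frac{1}{Z_0} + \frac{d}{f\,b}
\quad\Longrightarrow\quad
Z = \frac{Z_0}{1 + \dfrac{Z_0\,d}{f\,b}}
```

For a plain two-camera stereo pair this reduces to Z = f·b/d, which also shows why a larger baseline or focal length improves depth resolution.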

Kinect enables skeletal tracking of the user using the depth camera and specific data on human kinematics. The points spread over the body (rigging) are processed, and the images of rigs are used to teach the algorithm of human motion generation. To enable body recognition so that the body can be used as the controller, the data fed by the sensors is processed by the software in several steps. The moment a user appears in front of the Kinect controller, a 3D point cloud representing the user is generated using depth data. The model is later fitted to a given topology of the initial skeleton. Using human body kinematics data, Kinect learns to recognise different parts of the user's body and assigns a weight to each of them, depending on how certain the algorithm is of having properly recognised the part. Once the process is finished, Kinect finds the most probable skeleton position against the body and connects the body parts according to their weights [21].
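As a minimal sketch of how such skeletal data can be accessed from MATLAB (the paper does not list its code; this assumes the Image Acquisition Toolbox with the Kinect for Windows support package, where skeletal joints are exposed as frame metadata):

```matlab
% Open the Kinect depth stream (device ID 2 is the depth sensor).
depthVid = videoinput('kinect', 2, 'Depth_640x480');
src = getselectedsource(depthVid);
src.TrackingMode = 'Skeleton';        % enable skeletal tracking

depthVid.FramesPerTrigger = 1;
start(depthVid);
[~, ~, metaData] = getdata(depthVid); % metadata carries the skeletons

% JointWorldCoordinates is 20 joints x 3 coordinates x 6 skeletons, in metres.
tracked = find(metaData.IsSkeletonTracked, 1);
if ~isempty(tracked)
    joints = metaData.JointWorldCoordinates(:, :, tracked);
    rightHand = joints(12, :);        % index 12 = HandRight in the SDK order
end
stop(depthVid);
```

The right-hand joint index (12) follows the Kinect SDK joint ordering and should be verified against the toolbox documentation.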


3. TEST STAND

3.1. FANUC S-420F robot

A FANUC S-420F robot with an RH controller was used to implement and test the proposed control system. The robot is driven by 6 servo motors responsible for motion in the respective axes (Figure 2).

Figure 2. Design of the FANUC S-420F [6]

The RH controller is a modular system dedicated to controlling FANUC industrial manipulators. It consists of the main CPU, a path tracking processor, divided RAM, bubble memory and servo motor control systems. Six I/O cards are available, with a total of 48 digital inputs, 48 digital outputs and two RS-232 communication ports. The device can communicate with external systems, e.g. to feed the status of a production line. The controller also contains six servo amplifiers. A comprehensive description of the robot and the controller can be found in their technical specifications [5]. The RH controller can be programmed in the FANUC Karel language. A comprehensive description of the syntax of the programming language, the user manual for the CRT/KB operation control panel, KCL command prompts, system variables and error codes can be found in the technical specification [6].

3.2. Communication

The Kinect controller communicates with the robot according to the scheme presented in Figure 3.

Figure 3. A scheme of communication in the control system

Controlling the FANUC S-420F through the RS-232 port of the RH controller required writing a dedicated programme in the FANUC Karel programming language. It enables reading the robot's current position and moving the robot to a new one. Position determination can be done in two ways:

– Absolute – the robot arm is moved directly to a position defined by (x, y, z) coordinates.

– Incremental – the robot arm is moved from the current position by a pre-defined vector (Δx, Δy, Δz).

The trajectory is automatically generated by the RH controller, based on the received points or translations, with a time interval of approximately 200 ms between successive point data packages. For safety purposes, the programme accounts for the upper and lower motion ranges of all the axes.
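As an illustration of the host side of this link, the sketch below clamps an incremental move to per-axis limits and sends it over RS-232. It is a hypothetical example: the paper does not specify the frame format or port settings, so the comma-separated "INC,dx,dy,dz" frame, the COM port, the baud rate and the axis limits are all assumptions.

```matlab
% Hypothetical sender for incremental moves to the RH controller.
port = serialport("COM1", 9600);      % assumed port and baud rate
configureTerminator(port, "CR/LF");

delta   = [12.0, -5.0, 3.0];          % requested move (dx, dy, dz) in mm
maxStep = [20.0, 20.0, 20.0];         % assumed per-axis safety limit in mm

% Clamp each component so a single package cannot exceed the safe range.
delta = max(min(delta, maxStep), -maxStep);

% Assumed frame format; the Karel program on the RH side must parse it.
writeline(port, sprintf("INC,%.2f,%.2f,%.2f", delta));

reply = readline(port);               % e.g. the robot's reported position
```

In the real system a new package would be sent roughly every 200 ms, matching the trajectory-generation interval mentioned above.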

The second programme, responsible for the operation of the Microsoft Kinect sensor, gesture recognition and feeding coordinates to the RH controller, was written in the MATLAB environment using the Kinect for Windows Runtime and the Kinect SDK & Developer Toolkit library. The programme consists of 5 modules responsible for:

– Communication with Kinect – receiving data about the positions of the tracked points.

– Filtering the received coordinates – smoothing the tracked points using a simple moving-average filter with a window of 5 samples (see the sketch after this list). The filter turned out to be the best of all 19 filters presented in [9].

– Gesture recognition – the following gestures were pre-programmed: both hands up/down, a hand moving up/down, a fist/an open hand. While algorithms implemented in the Kinect SDK were used to detect the first two gestures, the KHand library, described in [23], was used for the third.

– Position fitting and scaling – this part of the programme is responsible for aligning the robotic arm with the operator's hand. It is critical for safety: without this function, the robot would initially make an unpredictable move to assume the position signalled by the operator's hand.

– Communication with the RH controller – sending, receiving and scaling data about the robot's position.
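A minimal sketch of the filtering and scaling modules follows (the paper gives only their descriptions; the stand-in data, the workspace bounds and the causal form of the 5-sample average are assumptions):

```matlab
% handRaw: N x 3 matrix of raw tracked hand positions in metres.
handRaw = rand(50, 3);                % stand-in data for the sketch

% Moving-average filter: each output is the mean of the last 5 samples.
win = 5;
handSmooth = filter(ones(1, win)/win, 1, handRaw);

% Position fitting and scaling: map the operator's hand displacement onto
% the robot's workspace, anchored at the poses captured when control was
% activated (hand0, robot0), so the first command is a zero move.
hand0      = handSmooth(1, :);        % operator pose at activation
robot0     = [1000, 0, 800];          % assumed robot pose at activation [mm]
handRange  = [0.8, 0.8, 0.8];         % assumed usable hand range [m]
robotRange = [1200, 1200, 1200];      % assumed robot workspace span [mm]

scale  = robotRange ./ handRange;     % mm of robot travel per metre of hand
target = robot0 + (handSmooth(end, :) - hand0) .* scale;
```

Anchoring the mapping at the activation poses is what prevents the unpredictable first move described in the position fitting module above.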


3.3. Control methods

In the proposed solution, the operator teleoperates the FANUC S-420F robot by moving his or her hand in space. The gestures are then mapped and repeated in 3D space by the robotic arm (Figure 4). Its speed is proportional to the speed of the operator's hand movements. Tracking the operator's hand begins when the right hand is clenched; from this point on, each hand movement is followed by the robot. At the current stage of system development, effector orientation changes have not yet been included. For safety purposes, special gestures were introduced to activate motion control. The following gestures were implemented in the proposed application (a detection sketch follows the list):

Figure 4. Concept of gesture control

– Both hands up, the right hand clenched into a fist – gesture control activation; the robot follows the right hand movements.

– Both hands up, the left hand clenched into a fist – command recognition activation: the left hand up – "work" mode, the left hand down – "standby" mode.

– Any other hand configuration does not trigger a robot response.
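A compact sketch of this activation logic (hypothetical: the joint positions would come from the skeleton stream, the "hands up" test against head height is an assumed criterion, and isFistL/isFistR would be supplied by an open/closed-hand classifier such as the KHand library used in the paper):

```matlab
% One step of the gesture state machine. Inputs per frame:
%   head, leftHand, rightHand - [x y z] skeleton joints (y = up),
%   isFistL, isFistR          - open/closed-hand flags from a classifier.
% Call with state = "idle" initially and feed the result back each frame.
function state = gestureStep(state, head, leftHand, rightHand, isFistL, isFistR)
    bothUp = leftHand(2) > head(2) && rightHand(2) > head(2);
    if bothUp && isFistR
        state = "follow";         % activation: robot follows the right hand
    elseif bothUp && isFistL
        state = "command";        % activation: listen for left-hand commands
    elseif state == "command"
        if leftHand(2) > head(2)
            state = "work";       % left hand up -> "work" mode
        else
            state = "standby";    % left hand down -> "standby" mode
        end
    end                           % any other configuration: state unchanged
end
```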

4. RESEARCH RESULTS

The proposed system was tested on a test stand in the Technology Hall of the Institute of Materials Science and Engineering, West Pomeranian University of Technology, Szczecin. The experiments revealed that the proposed robot control system was much more intuitive than the operation control panel (teach pendant), the command line or the programming language (KAREL) provided by the manufacturer. In the tested system, the operator does not have to know how to use the operation control panel or the command line, or know specific commands. All the operator needs to do is learn the basic set of gestures and take a position in front of the Microsoft Kinect sensor; the operator is then ready to control the FANUC S-420F manipulator. However, the proposed system has drawbacks, mainly due to hardware imperfections.


The first problem was the sensitivity of the system to lighting conditions. Since the sensor uses a structured-light method for depth determination, the moment a strong source of light is directed towards it, the sensor cannot operate properly: the readings of the hands and other points take either random or maximum values, rendering proper control of the system impossible. There were no problems with the operation of the system in low light conditions. The second problem is the delay in feeding points to the RH controller, which is necessary for trajectory generation. A delay of approximately 200 ms is both evident and perceptible when the operator performs quick hand motions, and the robot does not always precisely follow the operator's gestures. With precise control, i.e. slow hand motion, the robot's moves are smooth and the latency is not clearly noticeable. This delay is related to the use of the RS-232 communication interface and the low computing power of the RH controller, produced in 1991. At the current stage of research, the precision and repeatability of the robot's moves were not measured. However, it was observed that they depend mainly on the speed of the operator's movements and on lighting conditions. The background does not affect the operation of the system: the system made it possible to control the robot against a static white background as well as in a normal work environment (people and objects moving in the background). The influence of selected factors on precision and control comfort is presented in Table 1.

Table 1
The influence of selected factors on the precision and control comfort

Factor                          | High | Medium | Low
High light intensity            |  ×   |        |
Low light intensity             |      |        |  ×
Dynamic background              |      |        |  ×
Delay between 15 ms and 200 ms  |      |   ×    |
Delay over 200 ms               |  ×   |        |

5. SUMMARY

Despite the technological development of industrial manipulators and control algorithms over recent years, little attention has been paid to user interface issues and the effects they can have on the operator's performance. The methods of robot control have not changed for many years. Producers still choose to rely on operation control panels, scripting languages and joysticks. In these traditional systems, the interface receives the operator's commands (e.g. pressing a button) and then sends a signal to the computer, where it is processed using an appropriate algorithm. As a result, a setting signal is generated which, once fed to the motors, triggers the robot's operation (e.g. rotation of one of the manipulator's axes).


The proposed system sets out a new approach to the control of industrial manipulators, using vision and gesture recognition technologies. Although it has some drawbacks, it is very intuitive and does not require industrial manipulator operators and programmers to undergo difficult, long-lasting training. The system presented in the paper can be further extended with voice control, augmented reality and novel ToF cameras for depth sensing. With the advent of readily available computers with huge computing capacity, the use of virtual reality and 3D environments and the application of gestures and human senses (haptic feedback) to remotely control various machines seem inevitable. Virtual technologies no longer belong only to the realm of entertainment; they are beginning to play an increasingly important role in industrial applications.

REFERENCES

[1] Arango C., Martínez J., Pérez V., Master-slave system using Kinect and an industrial robot for teleoperations, in: Proceedings of the Pan American Health Care Exchanges (PAHCE) 2013, p. 1–6.

[2] Bejczy A.K., Virtual reality in robotics, in: Proceedings of the IEEE Symposium on Emerging Technologies & Factory Automation, ETFA 1996, p. 7–15.

[3] Brogårdh T., Present and future robot control development – An industrial perspective, Annual Reviews in Control, 2007, 31(1), p. 69–79.

[4] Du G., Zhang P., Markerless human–robot interface for dual robot manipulators using Kinect sensor, Robotics and Computer-Integrated Manufacturing, 2014, 30(2), p. 150–159.

[5] Fanuc Robotics, Maintenance manual [MARMKS42H1174EF][B-67205EG01].pdf.

[6] Fanuc Robotics, MAROKENHA0885EF – Enhanced KAREL Operations Manual v. 2.22 R.pdf.

[7] Gallo L., Placitelli A.P., Ciampi M., Controller-free exploration of medical image data: Experiencing the Kinect, in: Proceedings of the 2011 24th International Symposium on Computer-Based Medical Systems (CBMS), Bristol, IEEE 2011, p. 1–6.

[8] Gośliński J., Owczarek P., Rybarczyk D., The use of Kinect sensor to control manipulator with electrohydraulic servodrives, Pomiary, Automatyka, Robotyka, 2013, 17, p. 481–486.

[9] http://msdn.microsoft.com/en-us/library/jj131429.aspx, accessed 30.06.2014.

[10] Karabegović I., Karabegović E., Husak E., Industrial robot applications in manufacturing process in Asia and Australia, International Journal of Engineering & Technology, 2013, 11(01), p. 200–204.

[11] Langmann B., Hartmann K., Loffeld O., Depth camera technology comparison and performance evaluation, in: Proceedings of the 1st International Conference on Pattern Recognition Applications and Methods, Portugal, SciTePress 2012, p. 438–444.

[12] Liu W., Ren H., Zhang W., Song S., Cognitive tracking of surgical instruments based on stereo vision and depth sensing, in: Proceedings of the 2013 IEEE International Conference on Robotics and Biomimetics (ROBIO), Shenzhen, IEEE 2013, p. 316–321.

[13] Abderrahim M., Khamis A., Garrido S., Accuracy and calibration issues of industrial manipulators, in: Industrial Robotics: Programming, Simulation and Applications, ed. L.K. Huat, Germany, Pro Literatur Verlag 2006, p. 131–145.

[14] Oszust M., Wysocki M., Recognition of signed expressions observed by Kinect sensor, in: Proceedings of the 2013 10th IEEE International Conference on Advanced Video and Signal Based Surveillance (AVSS), Krakow, IEEE 2013, p. 220–225.

[15] Pan Z. et al., Recent progress on programming methods for industrial robots, in: Proceedings of the 41st International Symposium on Robotics (ISR), VDE, Germany 2010, p. 1–8.

[16] Pretorius J., Van Der Merwe A.F., Development and implementation of a telerobotic system with visual and haptic feedback: current progress, in: Proceedings of the Industrial Systems and Engineering Management, South Africa Industrial Engineering, Stellenbosch University 2011.

[17] Rakprayoon P., Ruchanurucks M., Coundoul A., Kinect-based obstacle detection for manipulator, in: Proceedings of the 2011 IEEE/SICE International Symposium on System Integration (SII), Kyoto, IEEE 2011, p. 68–73.

[18] Sanna A., Lamberti F., Paravati G., Manuri F., A Kinect-based natural interface for quadrotor control, Entertainment Computing, 2013, 4(3), p. 179–186.

[19] Shen Y. et al., A novel human detection approach based on depth map via Kinect, in: Proceedings of the 2013 IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), Portland, OR, IEEE 2013, p. 535–541.

[20] Shirwalkar S. et al., Telemanipulation of an industrial robotic arm using gesture recognition with Kinect, in: Proceedings of the 2013 International Conference on Control, Automation, Robotics and Embedded Systems (CARE), Jabalpur, IEEE 2013, p. 1–6.

[21] Slot K., Owczarek A., Janczyk M., Vision based human-machine interfaces: visem recognition, Computer Vision, Robotics and Industrial Applications, 2013, vol. 3, p. 173–194.

[22] Song W. et al., Teleoperation humanoid robot control system based on Kinect sensor, in: Proceedings of the 2012 4th International Conference on Intelligent Human-Machine Systems and Cybernetics (IHMSC), Jiangxi, IEEE 2012, p. 264–267.

[23] Teixeira J.M. et al., Open/closed hand classification using Kinect data, in: Proceedings of the 2012 14th Symposium on Virtual and Augmented Reality (SVR), Rio de Janeiro, IEEE 2012, p. 18–25.

[24] Xiang Y. et al., A human-centered virtual factory, in: Proceedings of the 2011 International Conference on Management Science and Industrial Engineering (MSIE), Harbin, IEEE 2011, p. 1138–1142.

KINECT SENSOR CONTROL IMPLEMENTATION FOR THE FANUC S-420F ROBOT

S u m m a r y

The article presents a control system for the FANUC S-420F industrial robot with six degrees of freedom. It is based on tracking the operator's gestures and recognising them with the Microsoft Kinect motion controller. Using the operator's hand as the controller yields a natural, intuitive way of controlling the robot which imitates the everyday activity of moving objects. The system was implemented in the Matlab/Simulink environment with the Kinect for Windows Runtime package and the Kinect SDK & Developer Toolkit libraries.

For the purposes of testing the proposed system, a set of basic gestures for controlling the robot was developed. The tests covered the influence on the system of parameters such as lighting and the speed with which the operator performs gestures. The article presents the obtained results and discusses the advantages of the proposed control system and its potential problems.

Key words: Microsoft Kinect, gesture control, machine vision, FANUC S-420F, FANUC KAREL