ARTICLE IN PRESS
Robotics and Computer-Integrated Manufacturing 26 (2010) 156–161
0736-5845/$ - see front matter © 2009 Elsevier Ltd. All rights reserved.
doi:10.1016/j.rcim.2009.07.006
* Corresponding author.
E-mail address: [email protected] (S. Hassan).
Open architecture dynamic manipulator design philosophy (DMD)
Syed Hassan a,b,*, Naveed Anwer b, Zafar Khattak b, Jungwon Yoon a
a School of Mechanical and Aerospace Engineering and ReCAPT, Gyeongsang National University, Jinju, Korea
b Computer Science and IT Department, University of Gujrat, Gujrat, Pakistan
ARTICLE INFO
Article history:
Received 15 September 2008
Received in revised form 5 July 2009
Accepted 8 July 2009
Keywords:
Manipulator
Open architecture
Robotics
Dynamic controller
Design philosophy
ABSTRACT
This paper presents a generic and universal architecture design for robotic manipulators. A flexible approach is taken to develop the design philosophy throughout, resulting in a hardware architecture that is portable, can be integrated easily and enables the implementation of advanced control methods. A software kernel for management, supervision and control was developed in order to give the robotics researcher an easy user interface. The application of many such controls has traditionally been severely restricted in practical commercial robotic systems because of limitations associated with their controllers, rather than the arms themselves.
& 2009 Elsevier Ltd. All rights reserved.
1. Introduction
Industrial robots are currently employed in a large number of applications, and are available with a wide range of configurations, drive systems, physical sizes and payloads. However, despite the perceived widespread deployment of robots, recent surveys indicate that the number in service throughout the world is much smaller than was predicted twenty, or even ten, years ago.
In contrast, much academic research has been undertaken in recent years aimed at improving the performance of robots using a number of advanced techniques. These have included model-based techniques for predictive and adaptive control, force and hybrid force/position control schemes, and attempts to introduce intelligent control methods, especially using ANNs and fuzzy-logic based control Linkens and Nyoungesa [2].
Whilst varying degrees of success have been demonstrated, the application of many advanced methods has often been severely restricted in practical commercial robotic systems because of limitations associated with their controllers, rather than the arms themselves Kozlowski [3].
In order for a robotic device to perform a given task it is necessary to control it; that is, basically, to have the capability, first, of sending specific orders to the manipulator, in terms of positions or velocities of its end effector, and second, of sensing the actual position or velocity obtained. Other sensing capabilities, such as force feedback, may be important depending on the application. So, in order to implement a given control algorithm for the manipulator to solve a task, it is necessary to control its basic functionalities (moving and sensing), which implies the development of a software platform.
2. Existing open architecture controllers
The impact that 'open' systems have had on computer science culture has been remarkable.

From a general computing point of view, the term 'open architecture' has been attributed the following definition:

An architecture whose specifications are public; this includes officially approved standards as well as privately designed architectures whose specifications are made public by the designers. The opposite of open is closed or proprietary.

This definition is applicable to the general computer science community as a whole, but the phrase 'open architecture controller', which has been partly adapted from this definition since the late 1980s Zeng and Hemami [4], requires a slightly different definition.
The need for open architecture controllers has arisen because of a pressing need for more advanced flexible manufacturing systems (FMS) in factory environments Proctor and Albus [21]. The NGC–SOSAS establishes a virtual controller infrastructure that integrates software-based 'services' under the aegis of a master controller, and is designed to provide interoperability, portability, scalability and interchangeability to a controller Anderson [1]. In Europe, the OSACA (open system architecture for control within automation) project aims to define a hardware-independent reference architecture for a range of industrial controllers Pritschow [15]. The OSEC project in Japan, following similar open-philosophy goals, developed the seven-tiered reference model Kasashime et al. [14]. Naturally, each of these specifications and standards defines an 'open architecture controller' with a slightly different emphasis, but there are common themes throughout.
The demand for faster manufacturing processes with tighter quality control standards is pushing the limits of older generation robots Proctor and Albus [21]. Whilst many robot controllers feature common elements in their hardware (CPUs, for example the Intel 8088), adding performance upgrades is limited or even impossible. Even the best adapted proprietary controllers are falling behind when compared with other devices in an FMS. To add new functionality to the controller without resorting to reverse engineering, the only option is using a communication port that will allow real-time path modification of the robot's trajectory. The slow communication rates manufacturers usually provide with these interfaces render it impossible to effectively make compensations in a sensor-based control loop with typical sampling requirements of up to 1 kHz Proctor and Albus [21]. This limitation has negatively affected several attempts to implement sensor-based control with a proprietary controller Bicker et al. [16].
The ESA's CDM utilizes a functional reference model (FRM) for the activities of robot control systems. The top hierarchy, the MISSION layer, attempts to describe the activities that the robot is responsible for in very abstract terms, for example SERVICE a satellite, or REPAIR a platform. The TASK layer decomposes the high level activities into tasks, which are defined as the highest level of activity that can be performed on a single subject/object (OPEN a door, WELD a seam).
The NEXUS open software system has a similar methodology, whereby modules are defined by the services they provide Fernández-Madrigal and González [17]. Data and event flows between different modules are handled by an Internal Communications Manager (ICM) and an Architecture Information Manager (AIM). Zhou and de Silva have retrofitted a PUMA 560 with an open-structure controller Zhou and de Silva [20]. The host computer is an Intel 80486 processor with an AT system bus. The choice of OS is not specified, but the real-time support provides a GUI. Barambones and Etxebarria have reverse engineered a controller for the Tecquipment Ltd. MA2000 laboratory manipulator Barambones and Etxebarria [18]. This is a six-DOF manipulator actuated by DC motors with potentiometer position feedback. The existing controllers surveyed share a common drawback: when operating in free motion, the system is controlled in open loop. Precompiled code cannot be modified, and linking additional libraries requires updating the software, which pushes these controllers back towards a closed design even though they are nominally open architectures.
In general, robot controllers can be broadly classified into three different types Fu et al. [6]:

(a) Proprietary: the controller structure is effectively closed. Integration of external or new hardware (including sensors) is either very difficult or impossible.
(b) Hybrid: the majority of the system is closed (control laws, etc.) but some aspects of the system remain open. It is possible to add new devices such as sensors.
(c) Open: the controller design is completely available to be changed or modified by a user. The hardware and software structures can be changed such that all elements (servo laws, sensors, GUIs) can be modified without difficulty.

3. Enabling technologies
When considering the implementation of an open architecture controller based on the architectural specifications and models discussed in Section 2, a particular hardware architecture must be committed to follow open standards and open source code. The high level, somewhat abstract architectural specification documents have led developers of prototype systems to choose the following enabling technologies Fu et al. [6]; James and Graham [7]:
A. Standard operating systems (OS) such as DOS or Windows.
B. Non-proprietary hardware such as PCs or SUN workstations.
C. Standard bus systems such as PCI or VME.
D. Standard control languages such as C, C++ or Java.

4. OS for open controllers
The operating system provides a software interface to enable the user to run application programs and performs tasks such as port I/O, updating the screen display and communicating with peripheral devices. In general, the tasks that an open architecture controller has to manage can be split into two different categories:
A. Direct machine control: this encompasses drive interfacing, signal conditioning, trajectory generation, servo-control (or other joint control algorithms), sensor/transducer interfacing and coordinating asynchronous events.
B. Non-machine control: this encompasses tasks such as interpreting instruction sequences (CNC codes/robot program files), higher level communications to other systems and providing user interfaces.

We can also classify these two sets of tasks into real-time and non-real-time. The definition of real-time, as it relates to computing control systems, is given accurately by Microsoft Corporation [8]:
A real-time system is one in which the correctness of the computation not only depends upon the logical correctness of the computation, but also upon the time at which the result is produced. If the timing constraints of the system are not met, system failure is said to have occurred.
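This timing criterion can be made concrete. The sketch below is illustrative only (the struct and names are not from the paper); it encodes the rule that a logically correct result delivered late still counts as a failure, using the 1 kHz sampling requirement mentioned above as the deadline.

```cpp
#include <chrono>

// A result is only "correct" in a hard real-time system if it also
// arrives on time. This sketch models that rule for one servo cycle.
struct CycleResult {
    bool logically_correct;            // computation produced the right value
    std::chrono::microseconds elapsed; // time the computation actually took
};

// Deadline for one servo cycle at a 1 kHz sampling rate.
constexpr std::chrono::microseconds kDeadline{1000};

// Hard real-time acceptance: correct value AND produced within the deadline.
inline bool cycle_ok(const CycleResult& r) {
    return r.logically_correct && r.elapsed <= kDeadline;
}
```

A soft real-time task (such as a GUI update) would instead tolerate bounded deadline overruns rather than declaring failure outright.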
5. Motion controllers
From the wide variety of industrial motor control equipment available, a servo controller of this nature requires, as a minimum for each axis under control, a 0–10 VDC analogue output channel and an encoder input channel Perry and Wolf [9]. A wide variety of products exist, each with differing features and options.
The PMAC (programmable multi-axis controller) card manufactured by Delta Tau Systems is a PC expansion card for most common local bus systems (PCI, VME, etc.) and is equipped with onboard A/D and D/A converters, digital I/O, encoder inputs and PLC emulation. Many accessories are available to provide further expansion. It has a Motorola DSP 56001 digital signal processor running at a clock speed of 20 MHz (standard), and up to 60 MHz (enhanced). It is a highly flexible system that allows many advanced custom features to be incorporated Lee and Lee [13]. A diverse array of options and control features is available as standard, making the card highly suited to research, robotics and machine tool applications.
Communications between the host processor and the PMAC servo controller take place via the common system bus, using ASCII character format. This is generally the case for these types of controller. The interpretation of the meaning of the information contained in each ASCII word is dependent on the manufacturer who designed the product, and it is for this reason that the OSEC committee plan to link activities with the European and American organizations responsible for OSACA and SOSAS in an attempt to globally standardize a set of software services that open controllers should be able to provide, including low-level motion control. An initial proposal was drafted and submitted to the International Standards Organization (ISO) in 1995 Albus et al. [10].
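As a hedged illustration of this ASCII traffic, the fragment below builds and parses command words in the general style of such controllers; the mnemonics and framing are assumptions made for the example, not the PMAC's actual command set.

```cpp
#include <sstream>
#include <string>

// Host <-> servo-card traffic is plain ASCII words over the system bus.
// The command style below ("#n", "J=") is illustrative, not a vendor
// reference: each manufacturer defines its own vocabulary.
std::string make_jog_command(int axis, double position) {
    std::ostringstream cmd;
    cmd << '#' << axis << "J=" << position; // e.g. "#1J=1000"
    return cmd.str();
}

// Parse a numeric reply string (controllers of this kind typically answer
// position queries with a bare ASCII number plus a terminator).
double parse_reply(const std::string& reply) {
    return std::stod(reply);
}
```

It is exactly this per-manufacturer vocabulary that the standardization effort described above aims to replace with a common set of services.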
6. Sensor interfacing
One of the main motivating forces behind the push from proprietary to open architectures was the need to add extra sensing devices to the controlled machinery, in order to improve quality and efficiency. According to the control design methodology (CDM), sensory processing in a robotic system can be sub-divided into two categories:
Fig. 1. High level reference architecture.

A. Internal sensory feedback: this concerns data that are an integral part of the motion control aspects of the system, such as feedback from axis encoders and resolvers; a motion controller will provide appropriate interfaces to this equipment.
B. Environmental sensory feedback: this concerns data coming from additional sensors not directly utilized for axis control, for example a force sensor.

7. Proposed architecture
The architecture presented has three layers, each of which can be decomposed into further functional sub-modules. Fig. 1 shows graphically the high level reference model for controlling intelligent sensor-based manipulators.
Before decomposing the levels into their more detailed sub-modules, an overview of the high level model will be given. Each layer of the architecture has a general descriptive name: the top layer is called the application layer and acts to provide a high level service interface to the user. The control layer is directly beneath the application layer and receives its requests for task services. In a similar manner to the CDM, each task has an associated mode of control that is to be performed to complete the task.
7.1. Application layer
Two of the main features of the application layer functionality, shown in Fig. 2, are handling a graphical interface to the user and providing a set of services that enable the application to communicate with third-party software and hardware, and allow third-party software access to, and control of, its own features.
The user interface must be intuitive and allow the user to access all of the controller's functionality and parameters. It must provide features that enable the creation and testing of mission programs. In addition, it must allow access to design-time tools for software module creation.
7.2. Control layer
When a task service request and its associated parameters arrive, via the OSI interface, the control layer (Fig. 3) must first decide on the required control strategy for each task. We can define three possible runtime states: initialization, execution and termination. Initialization involves the calibration of external sensors, loading of required modules, calculation of drive transformations, and anything else that is required for the next state, execution. This state involves the actual execution of the required control method in real-time until a terminating condition is reached. Upon entering the termination state, the required clean-up routines can be executed to unload control modules, or to 'look ahead' to the next task to see if any can remain loaded. The control layer provides the application with any desired information as the task executes, and requests the services of the device layer as required.
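The task life cycle described above can be sketched as a small state machine; the type and function names are illustrative, not part of the proposed architecture's specification.

```cpp
#include <stdexcept>

// The three runtime states of a task in the control layer.
enum class TaskState { Initialization, Execution, Termination };

// Advance a task to its next state: initialization must complete (sensors
// calibrated, modules loaded, transforms computed) before execution, and
// execution repeats until a terminating condition holds.
TaskState next_state(TaskState s, bool terminating_condition) {
    switch (s) {
        case TaskState::Initialization:
            return TaskState::Execution;
        case TaskState::Execution:
            return terminating_condition ? TaskState::Termination
                                         : TaskState::Execution;
        case TaskState::Termination:
            throw std::logic_error("task already terminated");
    }
    throw std::logic_error("unreachable");
}
```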
To perform these functions, the primary component of the control layer is the dynamically reconfigurable control system (DRCS). During initialization, required components from the controller library are configured inside the DRCS to perform the required control. During execution, its inputs are internal and environmental sensor feedback and task parameters. Its outputs are the internal states of its control modules, which are passed to the scheduler, and motion control signals to be passed to the device layer. The scheduler is responsible for ensuring that each of the runtime states is executed correctly, and for overseeing the internal communications of the layer. It is also the scheduler that decides, for the given task, which modules are to be loaded into the DRCS from the control library and what information to pass to the sensor library, in the form of requests for sensor data or the setting of output values. The sensor library is responsible for interpreting this data, and for assembling the correct request to pass to the field-bus protocol to carry it out. It can also be configured to provide the DRCS with the required environmental sensor information at each sampling interval when executing a control mode.
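A minimal sketch of the DRCS idea follows, assuming a simple name-keyed module library and single-input control laws (both simplifications made for illustration).

```cpp
#include <functional>
#include <map>
#include <string>

// Dynamically reconfigurable control: the scheduler selects which control
// modules from the library are loaded for the current task. Module names
// and the one-argument control-law signature are illustrative.
class DRCS {
public:
    using ControlLaw = std::function<double(double /*error*/)>;

    // Populate the library of available control modules (would be done
    // from the controller library at initialization time).
    void register_module(const std::string& name, ControlLaw law) {
        library_[name] = std::move(law);
    }

    // Scheduler decides which module to load for the given task.
    bool load(const std::string& name) {
        auto it = library_.find(name);
        if (it == library_.end()) return false;
        active_ = it->second;
        return true;
    }

    // One control cycle: produce a motion signal from sensor feedback.
    double step(double error) const { return active_(error); }

private:
    std::map<std::string, ControlLaw> library_;
    ControlLaw active_;
};
```

Because modules are looked up and bound at run time rather than compiled in, new control laws can be added without rebuilding the controller, which is the point of the DRCS.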
Again assuming that the interface between the control and device layers is implemented via the OSI model, the requests to, and responses from, the device layer will be in the form of ASCII code strings passed to and received from the OSI application level interface.
Fig. 2. The application layer.

Fig. 3. The control layer.

7.3. Device layer

The interface between the device layer and the control layer is of vital importance. The interpretation of the information flowing between the application layer and the control layer is necessarily abstract, and its actual meaning depends upon the syntax of the mission program, how it is broken down into the required tasks, and which control methods can be employed to perform them. This can be configured in the controller's design state. The interface to the actual devices that control the robot and provide sensory information is not abstract and must adhere to a recognized standard, and it seems this is solely dependent upon the implementation equipment.
If there were a universal protocol and set of services for field-bus devices to adhere to, and a similar common protocol to dictate the operation of motion control equipment, these could be attached to the reference architecture. This would ensure that whatever implementation equipment was chosen, the functionality would remain identical. From the earlier discussion, it was seen that although there are plans for such standard protocols in the near future, none exist in practice at the time of writing. At this stage, it will be assumed that such protocols exist, and the interpretation of the device level service requests and responses will be designed accordingly. When suggesting the implementation model, suitable technologies can be chosen from those available.
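Under that assumption, the field-bus side of the device layer could be programmed against an abstract interface such as the following sketch; the method names and the loopback stand-in are hypothetical, not drawn from any existing field-bus standard.

```cpp
#include <cstdint>
#include <string>

// Hypothetical universal field-bus interface: the device layer programs
// against this, so the implementation equipment becomes interchangeable.
class FieldBusProtocol {
public:
    virtual ~FieldBusProtocol() = default;
    virtual double read_sensor(std::uint8_t node) = 0;
    virtual void write_output(std::uint8_t node, double value) = 0;
    virtual bool configure_node(std::uint8_t node, const std::string& cfg) = 0;
};

// A stand-in implementation (e.g. for testing the layers above without
// hardware); a real CAN- or OSACA-based driver would slot in identically.
class LoopbackBus : public FieldBusProtocol {
public:
    double read_sensor(std::uint8_t node) override { return last_[node]; }
    void write_output(std::uint8_t node, double value) override {
        last_[node] = value;
    }
    bool configure_node(std::uint8_t, const std::string&) override {
        return true;
    }
private:
    double last_[256] = {0.0};
};
```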
Essentially, the device layer, which is shown in Fig. 4, receives two different types of request from the control layer interface. One is a network request, either to set or read a sensor value, or to configure a new node via the field-bus. Using the stored routines in the protocol library, these requests are carried out physically and the resulting data/status information is passed back to the control layer. The other is a motion service request, which can take the form of data requests for the current joint angles/torques, for example, or requests to servo each joint to a given position. The motion control elements of the device layer perform these functions and pass measured data/status back to the control layer.
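The two request types could be separated by a dispatcher like the sketch below; the "NET"/"MOT" prefix convention is an assumption made for the example, not a protocol from the paper.

```cpp
#include <string>

// The device layer receives exactly two kinds of request from the control
// layer: network (field-bus/sensor) and motion. This classifier sketches
// that split for ASCII request strings with a hypothetical prefix scheme.
enum class RequestKind { Network, Motion, Unknown };

RequestKind classify(const std::string& request) {
    if (request.rfind("NET ", 0) == 0) return RequestKind::Network;
    if (request.rfind("MOT ", 0) == 0) return RequestKind::Motion;
    return RequestKind::Unknown;
}
```

Network requests would then be routed to the protocol library and motion requests to the motion control elements, each returning data/status to the control layer.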
8. Hardware implementation
From the reference architecture described above, a suitable implementation architecture must be chosen from the multitude of enabling technologies before a prototype design can be created and tested. A suitable hardware implementation architecture is shown in Fig. 5.
This architecture is based on an array of AMD processors running Win95 with a real-time extension, designed as a multi-microprocessor OS so that inter-process communication can be done easily. The application layer is coded to be executed on the processor array, with the real-time extension to the controller and field-bus cards providing dedicated embedded handling of time-critical data.
The PMAC was used because of its high performance control capability and because it possesses an 'open' software slot, with the ability to externally compile and download a user-written algorithm into it, thereby not only gaining the advantage of off-the-shelf motion control technology but also the ability to implement the device library and configuration tools on the main processor array, and download the required control module. Win95 was chosen as the OS for its real-time extension and its support for open architectures.
Fig. 5. Controller's hardware implementation. (PCI bus (PC) backbone with bridge to the operating system; CANopen card connecting to sensors and peripheral devices; PMAC card connecting to the robot.)
Fig. 4. The device layer. (Field-bus protocol and protocol library handle network service requests/responses to the field-bus and sensors; motion control and the motion control library handle motion service requests/responses to joint actuation and sensors in the environment.)
To create the executable code, C++ was chosen as the programming language, using the GUI version of VC++. The ActiveX set of rules was used to define how applications should share information; these features of ActiveX make the application highly suited to using third-party interfaces and modules. The application has one more advantage: any programming language can be used to develop a module or interface, provided it is enabled with the information transfer protocol and ActiveX control.
9. Discussion
The reference model is arranged logically, with each layer having a specific group of similar functions to perform. This enables the real-time requirements of each layer's functions to be the same, with the tightest requirement at the device level, an intermediate requirement at the control level and almost asynchronous operation at the application level. The intermediate requirement is still classified as hard real-time, but the sampling rate can be reduced by a factor of approximately ten from that of the device layer Khosla [12]; Chen and Parker [11]. At the application level, the information exchange is still classified as real-time, but can be considered soft real-time compared to the lower levels [8]. For example, the GUI must be updated with new information at around 20 Hz, but a small deviation in the update interval (so long as it does not exceed a pre-specified bound such as 400%) will not cause system failure.
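These rate ratios can be sketched as a single multi-rate tick scheduler; the divisors (10 and 50 on a 1 kHz base) follow the ratios described above but are otherwise illustrative.

```cpp
// The layers run at sampling rates roughly a decade apart: device layer
// at 1 kHz, control layer at ~100 Hz, GUI updates at ~20 Hz. This sketch
// derives, for a given base tick, which layers are due to run.
struct DueLayers {
    bool device;
    bool control;
    bool gui;
};

// tick counts 1 kHz base periods. The control layer runs every 10th tick
// and the GUI every 50th; exact divisors are assumptions for the example.
DueLayers layers_due(unsigned long tick) {
    return DueLayers{
        true,            // device layer: every 1 ms tick (hard real-time)
        tick % 10 == 0,  // control layer: 100 Hz (hard real-time)
        tick % 50 == 0,  // GUI: 20 Hz (soft real-time)
    };
}
```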
The architecture is designed so that some of the principles of alternative reference architectures and methodologies, especially the Control Design Methodology (CDM) Putz and Elfving [22], can still be utilized to some extent by the GCA. In particular, the activity analysis/task definition stages of the CDM can be utilized, along with the task definitions from the CDM catalogue. The extended hybrid architecture of the GCA removes the need for the remaining steps of the CDM to be followed, as the implementation architecture exists and can be modified at design time to suit. The DRCS incorporates some of the concepts from the Chimera methodology Stewart and Khosla [19], but remains abstract in implementation; the Chimera methodology could be incorporated as a DRCS implementation for the GCA, so long as a suitable interface could be designed to allow the creation and management of port objects by the application layer, and suitable runtime functionality could be achieved.
As noted earlier, the existing controllers surveyed share a drawback: when operating in free motion, the system is controlled in open loop, precompiled code cannot be modified, and linking additional libraries requires updating the software, which pushes them back towards a closed design even though the controller is nominally an open architecture. This architecture eliminates the drawbacks mentioned above and enables researchers to add their own modules, such as inverse or forward kinematics algorithms, during run time using the ActiveX format. Thus, a software kernel for management, supervision and control was developed in order to give the robotics researcher an easy user interface, with the ability to externally compile and download a user-written algorithm, thereby not only gaining the advantage of off-the-shelf motion control technology but also the ability to implement the device library and configuration tools on the main processor array, and to download the required control module.
10. Conclusion
This paper has presented and discussed an architecture model for the control of next generation sensor-based robots. A suitable implementation architecture that can satisfactorily realize the open controller functionality has been chosen and described.
Each layer of the architecture has a general descriptive name: the top layer is called the application layer and acts to provide a high level service interface to the user. This level provides the GUI and an interface to third-party software. Since the model possesses an extended hybrid architecture, in its un-configured state there are many 'open' slots, or modules, in the lower layers that need defining based on the expected scope of actions the robot will perform. Hence the application layer also directly provides facilities for the design and management of the functional modules that the lower layers utilize.
Acknowledgements
This work was supported by the Korea Research Foundation Grant funded by the Korean Government (MOEHRD) (KRF-2008-005-J01002). The authors are thankful for the support of UOG, Pakistan.
References
[1] Anderson RJ. SMART: a modular architecture for robotics and teleoperation. In: Proceedings of the 1993 IEEE International Conference on Robotics and Automation 1993;2:416–21.
[2] Linkens D, Nyoungesa H. Learning systems in intelligent control: an appraisal of fuzzy, neural and genetic algorithm control applications. IEE Proceedings: Control Theory and Applications 1996;143(4):367–86.
[3] Kozlowski K. Modeling and identification in robotics. Berlin: Springer; ISBN 354076240X.
[4] Zeng G, Hemami A. An overview of robot force control. Robotica 1997;15(5):473–82, doi:10.1017/S026357479700057X.
[6] Fu KS, Gonzalez RC, Lee CSG. Robotics: control, sensing, vision and intelligence. McGraw-Hill International Edition; 1987. ISBN 0070226256.
[7] James H, Graham A. Special computer architecture for robotics: tutorial and survey. IEEE Transactions on Robotics and Automation 1989;5.
[8] Microsoft Corp. Real-time systems and Microsoft Windows NT. Microsoft MSDN Library 1995.
[9] Perry DE, Wolf AL. Foundation for the study of software architecture. ACM SIGSOFT Software Engineering Notes 1992;17(4):40–52.
[10] Albus JE, Barbera AJ, Nagel RN. Theory and practice of hierarchical control. In: Proceedings of the 23rd IEEE Computer Society International Conference, Washington, USA, 1981. p. 18–39.
[11] Chen N, Parker GA. Design of a robot control system architecture. Microprocessors and Microsystems 1994;18(6).
[12] Khosla P. Choosing sampling rates for robot control. In: Proceedings of the IEEE International Conference on Robotics and Automation, 1992. p. 169–74.
[13] Lee CSG, Lee BH. Resolved motion adaptive control for mechanical manipulators. Transactions of the ASME, Journal of Dynamic Systems, Measurement, and Control 1984;106(2):134–42.
[14] Kasashime N, Mori K, Yamane T. An intelligent numerical controller for machining systems. In: Zurawski R, Dillon T, editors. Modern tools for manufacturing systems; 1993. p. 77–85.
[15] Pritschow G. Automation technology—on the way to an open system architecture. Robotics and Computer-Integrated Manufacturing 1990;7(1/2):103–11.
[16] Bicker R, Burn K, Glennie D, Ow SM. Application of force control in telerobotics. In: Proceedings of the European Robotics and Intelligent Systems Conference (EURISCON '94), Malaga, Spain, 1994. p. 1509–17.
[17] Fernández-Madrigal JA, González J. The NEXUS open system for integrating robotic software. Robotics and Computer-Integrated Manufacturing 1999;15(6):431–40.
[18] Barambones O, Etxebarria V. Robust neural control of robot manipulators. Automatica 2002;38(2):235–42.
[19] Stewart DB, Khosla PK. The Chimera methodology: designing dynamically reconfigurable and reusable real-time software using port-based objects. International Journal of Software Engineering and Knowledge Engineering 1996;6(2):249–77.
[20] Zhou Y, de Silva CW. Real-time control experiments using an industrial robot retrofitted with an open-structure controller. In: Proceedings of the International Conference on Systems, Man and Cybernetics: Systems Engineering in the Service of Humans, 1993.
[21] Proctor MF, Albus JS. Open architecture controller. IEEE Spectrum 1997;34(6):60–4.
[22] Putz P, Elfving A. ESA's control development methodology for space A&R systems. In: Jamshidi M, et al., editors. Robotics and manufacturing: recent trends in research, education, and applications, vol. 4. ASME Press; 1992. p. 487–92.