NEWS FOCUS

Process Control on Way To Becoming Process Management

Control concept broadens to include supervisory

and management levels, as the control system designer

gives way to the "systems architect"

Joseph Haggin, C&EN Chicago

Process control technology is undergoing change. In this article, the second in a three-part series to conclude next month, C&EN examines how process control in practice is changing conceptually to the idea of process management. An earlier article (April 2, page 7) looked at how process control is moving toward a deeper integration with process simulation and design functions.

Distributed digital control (DDC) currently dominates industrial control technology. And it is generating a great deal of enthusiasm among its practitioners as it leaves the novelty phase and enters an era of major expansion.

As befits a revolution in progress, its effects are turning up in unusual places and in unusual forms. And some of these are rather discomforting for the people affected.

DDC's importance stems from the abrupt enlargement of scope it affords control technology through integration of functions beyond the primary control loop. Process control is no longer restricted to physical control of processing units. It now includes the various levels of supervisory control in a plant or group of plants, upper management fiscal and policy control, production scheduling to match present or anticipated sales, and optimization of operations at all levels. Sometimes the integration extends to R&D, which eventually will result in changes in a company's operations. DDC is not geographically restricted and is not limited to any one type of industry.

Indeed, to recognize more formally the changes taking place in control technology, the function is even being referred to frequently as "process management." The distinction isn't trivial.

When in the 1960s direct digital control began to replace the then-conventional pneumatic and electronic controllers, it required a central, usually rather large, digital computer. Unit control was accomplished through time sharing. Unfortunately, the reliability of such a system was not always the best. And any problem with the central computer meant trouble throughout the system, since few systems were large enough to justify an additional large digital computer for backup.

Computer terminals are now common to control systems, as in this Leeds & Northrup Systems' operator station for the firm's LN5800 distributed control system

With the introduction of mini- and microcomputers in the 1970s, this dilemma eventually was resolved. The minis and micros provided sufficient computer power at a low enough price to permit a high degree of redundancy for reliability. Digital control also could be distributed throughout the control system to further improve reliability. The rapid reduction in computer costs has accelerated this trend toward distributed control.

May 21, 1984 C&EN 7


Implicit in the idea of distributed control is the communications network for the control system. It is the communications network that has opened up the scope of DDC technology.

With the availability of a dedicated communications network, containing a lot of computing power, organization of the network into precedence levels or functional hierarchies becomes necessary. Much of the work of adapting DDC to particular processes is concerned with tailoring such a hierarchy to specific company needs.

This activity has broadened the scope of control companies. It isn't unusual, for example, for a control company to be a de facto management consultant to many of its clients. It also frequently supplies simulation and modeling support, particularly with respect to operator training.

One result of this activity is that it has produced the "systems architect," who has nearly replaced the control systems designer. Malcolm Beaverstock, senior automation architect for Foxboro Co., Foxboro, Mass., observes that the new craft of control systems architect has been prompted by the very rapid expansion of the control business today to areas that would have been unthinkable only a few years ago.

The control systems architect functions much like an architect at a building site. He or she is responsible for seeing that all the parts go together as called for by the client. The former restriction of activities to controllers and control loops has given way to process simulation and integration of management control systems. To Beaverstock, this latter event is probably the biggest revolution of all.

The principal stimulus for the rapid growth of DDC is economic. A substantial improvement in overall productivity often results from extending DDC to the middle and upper management of companies—for a long time more or less sacred ground not open to intrusions by machines.

One survey indicates that 1980 sales of DDC systems were about $200 million, with a projected annual growth to 1990 of 33%. The projected market then would be about $3.5 billion. Not everyone is that optimistic, but there is little doubt that DDC systems will dominate the control market for a long time.

The extent of conversion to DDC systems depends a lot on how much of industry is included in the perspective. One estimate comes from Theodore Stoltenberg, control technology consultant at Fisher Controls International Inc., Clayton, Mo. He believes that about one third of all operating chemical plants are controlled digitally, either by centralized direct digital control or by DDC. He further estimates that more than 80% of new plants are so equipped, and more than 70% of retrofits are being converted to DDC.

Stoltenberg also notes that there aren't very many new plants being built these days and that most work is in retrofitting. Most of Fisher's work is domestic, but there appears to be a growing amount of work overseas.

Foxboro's experience with retrofits is similar. Most of the company's business is with petroleum refining and petrochemicals—about 27% of sales in refinery operations and 23% in petrochemicals—with much of the remainder in pulp and paper. Beaverstock notes also that most of Foxboro's business today is in retrofitting existing plants; about 75% is his unofficial estimate. There is also a new kind of business in application of DDC systems where no systems at all were previously used.

Retrofitting a plant with DDC is obviously beneficial, Stoltenberg says. He cites as typical a 10% saving in energy consumption and a 5 to 7% saving in feedstock cost because of close control. There are also other benefits such as improved safety and more uniform product quality. The payout time for these control systems is typically a matter of months, even when the existing controls have been in place since 1960, a situation frequently encountered.
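Stoltenberg's percentages imply payout times of this order. A rough back-of-the-envelope check, using hypothetical plant costs (the dollar figures below are illustrative assumptions, not from the article):

```python
# Rough payback sketch using Stoltenberg's quoted savings percentages.
# All dollar figures are hypothetical illustrations, not from the article.
annual_energy_cost = 10_000_000      # $/yr spent on energy, assumed
annual_feedstock_cost = 50_000_000   # $/yr spent on feedstock, assumed
system_cost = 1_000_000              # installed DDC system cost, assumed

energy_saving = 0.10 * annual_energy_cost        # "typical 10% saving"
feedstock_saving = 0.05 * annual_feedstock_cost  # low end of the 5-7% range

annual_saving = energy_saving + feedstock_saving
payback_months = 12 * system_cost / annual_saving

print(f"Annual saving: ${annual_saving:,.0f}")
print(f"Payback: {payback_months:.1f} months")
```

Even with the conservative 5% feedstock figure, the assumed $1 million system pays for itself in a few months, consistent with the "matter of months" claim.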

Fisher provides a good bit of simulation assistance as part of its customer service packages. Being a wholly owned subsidiary of Monsanto, Fisher utilizes that company's FLOWTRAN simulation system both for designing control packages and for operator training. Like most control companies, Fisher adapts its wares to many types of companies other than chemical process operators. Business is good, and it's expected to remain good for some time to come. One very-long-term trend may be the distribution of the manufacturing base itself. The advantages of very large single plants have been largely offset by rapidly rising transportation costs in many industries, and the availability of modern DDC encourages the trend toward a distributed manufacturing base. Since it probably would add to future business, Fisher certainly doesn't object.

As mini- and microcomputers have gained widespread use in process control systems, control rooms have undergone a metamorphosis, with process control consoles replacing the panelboards that had been a common sight. Typical of current-generation controls are (clockwise from bottom left) Leeds & Northrup's Max 1 distributed control system, Foxboro's Spectrum control system work station, Honeywell's TDC3000 universal operator station, and Taylor Instrument's MOD 30 system

But one problem that does bother Fisher, and most other control makers as well, is the shortage of trained manpower. There simply aren't enough chemical engineers, control engineers, technicians, and others to do all the work available at all times. Stoltenberg suggests that the ideal background for a modern control engineer, or systems architect, would be in process engineering with strong mathematical skills and strong interest in simulation, coupled with as much computer experience as possible. That is admittedly a tall order, especially for a new graduate. There are few such people being trained in technical colleges, and industry has invested a lot in in-house training.

One effect of DDC on industrial operations is that as control systems become more distributed, there is a corresponding tendency to decentralize management as well. Foxboro's Beaverstock says that more and more decisions are being pushed farther down in the organization. The trend is definitely toward each production unit's becoming a department on its own.

This decentralization requires a lot more work by the local staff. And it presents a problem for middle managers: There is progressively less for middle managers to do. The middle manager may not yet be an endangered species, but he or she is certainly in surplus in many companies that have opted for extensive DDC. Beaverstock suggests that middle managers probably won't be kicked upstairs but more likely will be retrained for jobs closer to the process.

Another trend that appears evident in many places is an abruptly increased flow of data in the DDC networks. Local microprocessors are generating data faster than the data often can be accommodated. This, in turn, has created renewed interest in reintroduction of a central computer strictly for data processing (number crunching), as opposed to control functions.

However, Beaverstock suggests an alternative solution to the data handling problem. He believes that local managers will be formatting a lot of their information locally to reduce the flow of raw data. The net result would be local generation of reports rather than an endless stream of raw data along the network. This aspect of local data management may become another of the computer-based revolutions, Beaverstock says. Yet another may be highly expanded modeling of plant operations.
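The local-formatting idea Beaverstock describes can be sketched in a few lines: a unit-level node condenses a stream of raw readings into a short summary report, and only the report travels up the network. The function and tag names below are illustrative, not from any actual control product:

```python
# Sketch of "local formatting": a unit-level node summarizes raw
# sensor readings so only a compact report flows up the network.
# Tag names and report fields are illustrative assumptions.
from statistics import mean

def summarize(tag: str, readings: list[float]) -> dict:
    """Condense raw readings for one measurement point into a report."""
    return {
        "tag": tag,
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "min": min(readings),
        "max": max(readings),
    }

# 1000 raw temperature readings reduce to one five-field report.
raw = [350.0 + 0.01 * i for i in range(1000)]
report = summarize("reactor_T", raw)
print(report)
```

The design choice is the one Beaverstock anticipates: the network carries a five-field report instead of a thousand raw values, trading detail at the top of the hierarchy for a far smaller data flow.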

All in all, the activities relative to DDC are adding up to the demise of process control in the traditional sense. The name of the game today is integration of functions through the control network.

A thoughtful view of process control that expands on these themes comes from Theodore J. Williams, director of Purdue University's Laboratory for Applied Industrial Control. At bottom, as he sees it, the objective of process control is automation of an industrial plant. Automation requires the monitoring of many variables of many kinds. The relationships that translate variables into control action are complex in the extreme and require extensive programing and definition. Thus, automation becomes, essentially, the management of a plant's information system to ensure that the correct information is obtained and properly used and that it generates appropriate action.

But industry also is faced with the continuing problem of adjusting production to sales volume, while maintaining high productivity and minimum production costs. This leads to the definite trend today to integrate all of these functions in a single system of control.

Williams notes that one of the main benefits of digital control has been in the role of "control systems enforcer," meaning that it is necessary to ensure that the control equipment actually is being used as intended. Often, tasks now carried out by these digital devices have been done by skilled and attentive operators. However, the dedicated digital devices increase the degree of attentiveness that can be delivered in the long run.

To Williams, the ideal control system must have some minimum characteristics. It must permit tight control of the operating units from which profits are ultimately realized. The system also must permit supervision and coordination of all production units at some optimum level, in concert with a company's inventory requirements. The production control system must schedule plant operations for optimum economics. And there must be adequate provision for reliability and availability of the control system.

The first three of these characteristics effectively reflect three levels of a hierarchical control structure. Because of the great amount of information constantly in transit, a distributed computational capability organized in hierarchical fashion must be the logical control structure for the system. Mini- and microcomputers can handle the first two tasks, as well as ensure reliability and availability of the system. However, accomplishing production scheduling for optimum economics is the biggest problem of all and probably will require larger computers in the future.

For control system purposes, computing system hardware costs are negligible, or soon will become so, Williams says. If the algorithms and programs for the control system are available, an almost unlimited amount of computing power can be dedicated to the solution of any automation task. The major economic benefits should accrue from planning of production schedules—through coordination of operations—at the upper levels of the hierarchy. Williams cites numerous examples of payout times from major installations in Japanese chemical, petroleum, and paper plants as being on the order of three or four months.

Human activities in such a control system, Williams says, should be confined to true emergencies, for which the control system can't be preprogramed. People thus become supervisors of and not participants in the operation of a plant. If this appears to be uncomfortably "inhuman," there is the human benefit that dangers, both physical and psychological, are correspondingly reduced.

Nevertheless, Williams is concerned about a potential lingering social problem with such highly automated plants: the suspicion people may have that no one is in charge. There is an understandable reluctance by many to accept something that runs itself, even though the plant was designed to be operated by people. Williams is concerned that engineers and scientists often don't have a clear perception of the ultimate impact their creations may have. To the nonscientific and nontechnical public, this is frequently manifested as a fear of technology.

According to Williams, adequate technology now exists to handle any essential task in plant automation. Future developments will improve things, but breakthroughs aren't now required to implement an automated industrial base. One of the developmental areas frequently suggested to hold "breakthroughs" is the field of artificial intelligence. Williams foresees no breakthroughs in this field that would have a major impact on industrial automation. Similarly, the field of robotics may become important in the development of specialized tools and materials-handling devices but not as general-purpose automation devices.

In modern control systems, many monitoring and other activities are carried out through control-console displays. Examples include two process schematics (above), a trend overview display (top right), an annunciator display (center right), and a boiler load optimization display (lower right)

The basic control system that emerges from these considerations is essentially what is now available—namely, a widely distributed set of isolated functions at the lower level of a hierarchy. More and more, fault detection will be built in, and there will be considerable redundancy. Work done by the system will be coordinated by a set of successively higher-level, and probably larger, computers connected together and to the distributed remote control computers by an extensive communications system.

Williams likens the true hierarchical and distributed control system to a military organization. Power resides in the upper levels of the hierarchy and is delegated downward. Also like the military organization, there is a demand that information flow through channels so that all intermediate levels have access to all information that is sent to neighboring organizations and also is shared with higher levels.

The picture that Williams has provided is general and, perhaps, somewhat idealistic. However, he also notes some severe constraints on implementing automation in manufacturing. A very great problem is a lack of standardization. As is often the case with new developments, impetus comes from many places simultaneously, providing confusion along with the enthusiasm.

A case in point is the need for a universally applicable programing language for control computers. Until that happens, there will be little interchangeability in hardware. A closely related problem is the bewildering variety of database formats now in use. This has created the need for a standardized database management system.

One approach being taken is that of chemical engineering professor Rodolphe L. Motard of Washington University, St. Louis, who has been addressing the problem of database management systems and is developing a prototype system that may find wide use among the engineering design fraternity.

Also, detailed mathematical modeling is generally lacking for the process control architect. Williams notes that good models aren't yet available for all of the chemical and physical processes encountered in a plant. Ideally these models should be developed from first principles to make them most widely adaptable. Existing ones are also often too language-specific for easy use in many potential applications.

Some theorists argue that a major flaw in the preoccupation with implementing present-generation DDC is that it doesn't provide for future inclusion of more-advanced control techniques. Adaptive control, feedforward control, optimal control, and the like all have their advocates, and there are examples of successful use of these techniques. However, they also suffer from a requirement for considerably more computing power than is usually available. Williams would like to see such approaches developed, but in the interim, he feels, maximum use should be made of what is available.

Taking all things into consideration, Williams suggests that, depending on industry enthusiasm and the state of the economy, the widespread application of DDC to U.S. industry could take from five to 15 years. DDC could be encountered frequently by 1990 but probably will require until the end of the century for broad-scale realization. Most of the constraints to this automation effort are institutional and not technical. Much depends on acceptance by management, availability of trained people, and sustaining of the present enthusiasm.

The extension of plant control to the management hierarchy has become of prime concern to control companies and has prompted some new enterprises in the process. The way automation is considered by one major control company was outlined by Philip W. Bur and James Brosvic, staff engineers at Honeywell's process management systems division, Fort Washington, Pa., at last November's meeting of the Instrument Society of America. To begin with, they make a distinction between hard automation and soft automation—hard automation referring to the electrical and mechanical equipment for materials and energy handling; soft automation referring to application of control technology to information flow. Soft automation, they say, is the key to better productivity.

[Diagram: Control scheme uses on-line simulation. Process variables from the complex process pass through a process input/output interface to an on-line simulation model; a model parameter adaptation algorithm and process control "laws" generate proposed control changes, which reach the operator's console and its "okay to proceed" button]

There have been sharply increased demands on the information handling system since the middle 1970s, Bur and Brosvic point out. As Williams has suggested, they note that the organization of the information handling system roughly approximates the management structure of a company. Organizational direction comes down from the top. After execution, directives produce results—good or bad—that are passed back up through channels to top management. The mere passing of information is insufficient; it must reflect directives and results.

The hierarchy appropriate to this information system often is composed of six to eight levels. Corporate, division, plant, area, unit, loop, and point would be an example. But usually there are three levels of responsibility: the corporate level, plant level, and unit level.

Not all companies have automated to the same degree, and there appear to be some generic characteristics associated with the types of companies involved. Bur and Brosvic make a distinction, for example, between processing and manufacturing. In Honeywell's terminology, processing consists of the extraction and preparation of raw materials—production of crude oil and its subsequent refining, for example. Manufacturing involves production of consumer materials and products from the raw materials. A roll of vinyl sheet made from petrochemicals would be an example.

The distinction is important because processing plants have been automating regularly with hard automation for about 50 years, whereas manufacturing plants are relative newcomers. Consequently, there is a better-developed base on which to append new technology in the processing industries.

Whatever the differences in the hard automation characteristics between processing and manufacturing companies, the information systems are much the same for both, although with variation in form and complexity in passing through the organizational hierarchy. Top management generally has little interest in the details of a particular processing unit. Hence a lot of information must be summarized in a form usable by top management.

Bur and Brosvic note that there is a tremendous range in the frequency of information collection in the normal plant. At top levels, years may go by before information of a particular type is requested from the system. At the unit level, information may be requested at millisecond intervals. That is a range of some 10 orders of magnitude in frequency and illustrates the demands that can be made on an information system.
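The 10-order figure is easy to verify: a once-a-year request recurs roughly every 3 × 10⁷ seconds, a millisecond-rate request every 10⁻³ seconds, and the ratio of the two intervals spans about ten powers of ten:

```python
import math

# Check of the "10-order range" in information-collection frequency:
# from one request per year down to one request per millisecond.
seconds_per_year = 365 * 24 * 3600  # about 3.15e7 s
millisecond = 1e-3                  # unit-level sampling interval, s

orders = math.log10(seconds_per_year / millisecond)
print(f"Range spans about {orders:.1f} orders of magnitude")
```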

As control integration moves into the era of major expansion, some trends are evident to Honeywell. One is that integration is occurring at all levels of the managerial hierarchy. There is now under way the connection of individual desktop computers in a network that will allow access to much bigger databases. Also there are the corresponding problems with the management of the databases themselves because of nonstandardization. At the plant level, operational planning and scheduling are being merged into fully automated systems that include purchasing, order entry, and shipping databases. At the unit level, programable controllers and distributed signal controllers now communicate between themselves directly and with process control computers as well. This kind of integration is expected soon to extend into the process itself in the form of intelligent digital sensors/actuators and analytical equipment.

The progress made so far with integration has raised old problems and created some new ones. A major problem, say Bur and Brosvic, is ensuring that database management systems are sufficiently comprehensive. The present state of theory and design leaves much to be desired, and some even doubt that this problem can be overcome.

An all-pervasive problem is that the way decisions are made at the different levels really hasn't been addressed. At lower levels of the hierarchy, decisions are made with the aid of highly structured models and algorithms. Higher up in the organization, decision making is less structured and sometimes can even be intuitive. It is doubtful that models can be concocted for such unstructured situations. Even so, there must be some methodology for decision making that applies, more or less, at all levels.

The weakest link is control system security. Self-diagnosis and error detection techniques are improving matters, but limiting who has access to the system is still the surest way of making things secure. One new development is the "firmware" that is now starting to appear. As the name suggests, firmware is software resident on a hardware chip. It cannot be changed without destroying the chip. This suggests that customer manipulation of software could be severely restricted in the future.

Will people accept all the changes being wrought by DDC? Hard automation has been displacing blue-collar workers, and the prospect is that soft automation will displace white-collar workers. Bur and Brosvic believe that some organizational changes are inevitable. There will have to be a lot of filtering of information as it flows through the system. The boss won't be so insulated from bad information as he or she may formerly have been.

While the integration of unit and plant control systems is proceeding, there is a constant flow of developments in hardware and software and in operational techniques for using them. A recent example is the use of a dynamic model on-line in supervisory control systems.

Purdue's Williams has proposed such a system specifically for a blast furnace and a Kamyr digester for a paper mill, although it could be used wherever there is a process that has no precisely defined internal mechanism. Such uses for dynamic models have been proposed before, but only in recent years has there been enough computing power available to make brute-force methods practical.

In Williams' proposal, the computer, through on-line simulation of the process, supplies the plant operation and control system with the necessary process information not available by direct measurement. In the case of the blast furnace, no two charges react in the same way, and the complexities of the interior of the furnace defy description. The simulation would be run at a time scale exactly matching the process itself. This might be called "tracking simulation." In addition, fast simulations could be run in parallel with real-time simulation to answer the "what if" questions. This also would permit a "look ahead" simulation of the system.

In addition to the on-line model, there would be a system for reading operating variables in the process with currently available equipment. The control system would be connected with the present actuators on the unit. There also would be a set of "furnace control laws," derived from off-line simulation, that would prescribe probable control responses for predictable problems. Further, there would be a set of process identification algorithms that compare real operating data with corresponding outputs from the on-line model. The algorithms would continually adjust the operating parameters of the model to keep the simulation tracking the actual operation of the process. The operator's console would register the operational status of the process in the usual manner. There would be an "okay to proceed" button for appropriate action as the need requires.

The key to success with this system, Williams says, is the capability of the system to modify the simulation parameters such that the simulation continually tracks the internal operation of the process. This is the task of the adaptation algorithms.
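The tracking-simulation loop can be sketched in miniature: read the plant, run the on-line model, compare, and let an adaptation step nudge a model parameter toward agreement. The one-parameter gain model and gradient-style update below are illustrative assumptions, not Williams' actual algorithms:

```python
# Minimal sketch of a "tracking simulation": an adaptation algorithm
# adjusts a model parameter each step so the on-line model keeps
# tracking the measured process. One-parameter model, illustrative only.

def plant(u: float) -> float:
    """Stand-in for the real process; its true gain (2.5) is unknown."""
    return 2.5 * u

def model(u: float, k: float) -> float:
    """On-line simulation model with adjustable gain k."""
    return k * u

def adapt(k: float, u: float, y_meas: float, rate: float = 0.1) -> float:
    """Gradient-style parameter update: move k to shrink the error."""
    error = y_meas - model(u, k)
    return k + rate * error * u

k = 1.0                      # initial model gain, wrong on purpose
for step in range(50):
    u = 1.0                  # control input this step
    y = plant(u)             # direct measurement from the process
    k = adapt(k, u, y)       # keep the simulation tracking the plant

print(f"Adapted gain: {k:.3f}")   # approaches the true gain, 2.5
```

After 50 steps the model gain has converged close to the plant's true gain, which is the whole point of the adaptation algorithms: the simulation stays a faithful stand-in for the unmeasurable interior of the process.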

All major control manufacturers offer a wide range of packages in DDC systems. There are about 40 such manufacturers worldwide. In addition, there are many smaller hardware firms that make specialty equipment, mostly at the sensing end of the control loop.

The very rapid changes in control technology have left some academic institutions high and dry. To accommodate the changes, staff and equipment must either be acquired or adapted at rates that are seldom in keeping with the normal administrative pace of universities. It is very expensive to organize, maintain, and operate courses of instruction in modern control technology. The technical integration of model-building, mathematical simulation, computerization, programing, and the like is difficult enough to handle by itself. The managerial and commercial integration superimposed on the technical integration makes the problem almost impossible to solve.

However, there appear to be some new cooperative ventures between universities and the various companies that supply the control markets. One example is the nearly year-old cooperative venture among Purdue University's school of chemical engineering, the University of Waterloo (Canada), IBM, and Honeywell. Students are participating in an advanced control course using the latest computational and display equipment furnished by IBM, along with the Honeywell TDC2000 DDC system. IBM also furnishes its Automatic Control System package, which has been in use around the world for some time. Evidence from the classroom indicates that the courses, although demanding, are also extremely helpful for the students and of more than passing interest to potential employers. The courses taught at Purdue shortly will be made available to the University of Wisconsin, Stevens Point, and to Louisiana State University via cable link.

The only really unfortunate aspect of courses such as those offered by Purdue and Waterloo is that they can accommodate only a small number of students. Industry demands must, therefore, be met by industry either on the job or through in-plant courses of instruction. Both are being used.

One of the by-products of the new era of computerization in general and the expansion of the scope of process control is the passing of the unit operations laboratory and what it once represented. Most likely a need to have operating equipment available for instruction will persist. However, the educational power that a console simulator system offers students is far greater than anything ever devised in a hands-on unit operations laboratory. One observer sums it up by noting that the unit ops lab is now really a kind of museum, where it exists at all. Unit operations remain the backbone of chemical engineering, but they have been raised to a new level of abstraction and have been much more closely connected to the fundamental sciences that underlie them. •
