
The design and implementation of compilers for programming languages is an essential part of systems software. In the last decade many new general-purpose programming languages, such as Ada, Bliss, C, Fortran 77, and Pascal, have emerged along with a much larger number of more specialized languages. Although the construction of compilers is becoming easier, thanks in part to new theoretical insights and to new language development tools, the implementation of a compiler for a major language is still a nontrivial task.

Since the early 1960's there has been considerable interest in tools that reduce the effort needed to construct a good compiler. These tools are often called compiler-compilers or translator writing systems. The purpose of this special issue of Computer is to look at some of the newly created language development tools that are now in use and to look at some of the current research in experimental compiler-compilers. Toward that end, it contains three articles on compiler-compilers by active researchers in the field.

In the first article, Stephen Johnson explains some of the language development tools that he and others have developed for the Unix* operating system. Of particular interest are a lexical analyzer generator, an LALR parser generator, the systems-oriented language C for encoding semantic routines, and the Unix command interpreter. These tools have been used to implement several dozen languages, including a portable compiler for Fortran 77,1 a preprocessor for a language for typesetting mathematics,2 a portable compiler for the programming language C,3 and an interpreter for a pattern-action language for processing files.4

Table-driven code generation is the topic of the second article, in which Susan Graham presents the research on automatic code generation that she and her colleagues are conducting at the University of California at Berkeley. The thrust of this work is on simplicity and efficiency in the code generation process. The author compares the Berkeley approach to several other recently devised techniques for the design of automatic code-generator-generators.

*Unix is a trademark of Bell Laboratories.


In the third article, Bruce Leverett and his colleagues describe an ambitious compiler-compiler project they have undertaken at Carnegie-Mellon University. The goal of this project is to develop a compiler-compiler that is capable of producing high-quality optimizing compilers from formal machine descriptions. This article describes some of the decisions, strategies, and assumptions that have gone into the design of this compiler-compiler.

It is impossible to cover all of the ongoing research into compiler-compilers in the space of a few articles. The ones in this issue, in conjunction with the background material in the remainder of this introduction, are intended to indicate some of the major trends in current compiler-compiler research, particularly in the area of automatic code generation. There are several other major compiler-compiler projects currently underway, such as the HLP project at the University of Helsinki,5 the MUG2 project at the University of Munich,6 and the ECS project at IBM Yorktown Heights.7 Another active area of research which can potentially have a significant impact on the automation of compiler construction is the formal specification of the semantics of programming languages.8 More information on the historical development of compilers and compiler-compilers can be found in the excellent survey articles by Bauer,9 Knuth,10 and Feldman and Gries,11 and in many of the modern textbooks on compilers.12-15

A simple compiler model

A compiler takes as input a program written in a source language and produces as output an equivalent program in a target language. It has proven useful to subdivide the process of translating a source program to a target program into a sequence of simpler subprocesses called phases. The exact function defined by a phase is somewhat arbitrary, but it is convenient to think of a phase as a logically coherent operation that takes as input one representation of the source program and produces as output another representation. Here we shall consider a simple compiler model consisting of the following five phases:

* lexical analysis (or scanning),
* syntax analysis (or parsing),
* semantic analysis,
* code optimization, and
* code generation.

Early work on compiler-compilers tended to treat the compilation process as a single phase, specified as a syntax-directed translation. A typical compiler-compiler of the 1960's would provide an automatic parsing technique or a language for specifying parsers. Associated with each parsing action would be a semantic routine written in some procedural language, which would be invoked to perform some semantic action immediately after that parsing action was executed.1

In the 1970's the phases of compilation tended to be studied as separate processes. Tools emerged for implementing some of these phases automatically from higher-level specifications of the input-output mapping defined by the phase. These tools often exploited special-purpose knowledge derived from extensive theoretical analysis. Some of these tools could be used independently of compiler generation. Therefore, to appreciate recent advances in compiler-compilers it is necessary to understand what happens during each of the phases of compilation.

Lexical analysis. The first phase of a compiler, the lexical analyzer, sees the source program as a sequence of characters. The lexical analyzer groups these characters into tokens, which are substrings of the input stream that logically belong together, such as identifiers and constants. The lexical analyzer is also usually responsible for identifying comments, blanks, and quoted strings in the source program. Information collected about the tokens appearing in the source program is entered into a symbol table by the lexical analyzer. The output of the lexical analysis phase is a stream of tokens.

The notation of regular expressions is useful for describing the structure of tokens in a programming language.12,16 Efficient algorithms have been developed to construct finite automaton recognizers for tokens. Special-purpose languages for specifying lexical analyzers from regular expression descriptions began to appear in the late 60's.17,18 These languages were often used for constructing pattern-matching programs for other than compiling applications.
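The idea of describing tokens by regular expressions can be illustrated with a small sketch. The token names and patterns below are invented for illustration; this is not the notation of any particular generator, merely a hint of how a specification-driven lexical analyzer groups characters into tokens:

```python
import re

# Each token class is described by a regular expression, much as in a
# lexical-analyzer-generator specification; blanks are recognized but
# not passed on to the parser.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("IDENT",  r"[A-Za-z_]\w*"),
    ("OP",     r"[+\-*/=()]"),
    ("SKIP",   r"\s+"),
]
MASTER = re.compile("|".join("(?P<%s>%s)" % p for p in TOKEN_SPEC))

def tokenize(source):
    """Group the input characters into a stream of (kind, text) tokens."""
    tokens = []
    for m in MASTER.finditer(source):
        if m.lastgroup != "SKIP":
            tokens.append((m.lastgroup, m.group()))
    return tokens
```

A call such as `tokenize("rate = rate + 60")` yields the token stream a parser would consume; in a real compiler each token would also carry symbol-table information.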

Syntactic analysis. The second phase of the compiler, the syntax analyzer or parser as it is often called, groups the tokens emitted by the lexical analyzer into syntactic structures such as expressions and statements. It is the syntax analysis phase that determines whether the source program satisfies the formation rules of the programming language. If the source program is not syntactically correct, the syntax analyzer must determine where errors occur, give appropriate error diagnostics, and recover from each error to catch any subsequent errors that might be present.

The syntax analyzer usually invokes some semantic action as each syntactic construct is recognized. The semantic action may be to enter some information into a table, emit some error diagnostics, or generate some output. The output of the syntax analyzer is some intermediate language representation of the source program such as postfix Polish notation, a tree structure, or quadruples.

Much of the published compiler research in the 1960's and early 1970's dealt with the development of parsing techniques. The parsing techniques used in the early compiler-compilers were of two types. One type handled very general classes of grammars, but often in a very time-consuming way; the other type would handle smaller classes of grammars, but much more efficiently. The latter techniques were further classified into "top-down" or "bottom-up" methods. Aho and Ullman describe these early parsing methods in detail.19

By the early 1970's parsing methods had become well understood from a theoretical point of view. The early work on recursive descent top-down parsing was formalized into the theory of LL grammars.20,21 Knuth22 generalized the work on bottom-up parsing into the


theory of LR grammars, the largest natural class of context-free grammars that could be parsed with a deterministic pushdown automaton. Subsequent work by Korenjak,23 DeRemer,24 Aho, Johnson, and Ullman,25 and many others succeeded in making LR parsing practical for use in compiler-compilers. Because of its generality and its good error detection capabilities, LR parsing was a natural method to incorporate in automatic parser generators.

The successful automation of syntax analysis was perhaps the most striking advance in compiler-compilers in the 1970's. With a parser generator such as Yacc,26 a programmer can automatically construct a reliable parser for any programming language directly from a grammatical description. A decade before, the construction of the parser had been considered a significant task in the implementation of a compiler.

Work on methods of automatic error recovery proceeded with the development of formal parsing techniques. Although there have been considerable improvements,27,28 automatic generation of good diagnostics and error recovery methods is still wanting.
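The syntax-directed translation described above, in which a semantic action fires as each construct is recognized, can be suggested with a toy sketch: a hand-written recursive descent parser for a two-operator expression grammar whose semantic actions emit postfix Polish notation. The grammar and token representation are invented for illustration:

```python
def parse_expr(tokens):
    """Parse  expr -> term { '+' term },  term -> factor { '*' factor }
    over a list of tokens, emitting postfix notation as the semantic
    action attached to each recognized construct."""
    pos = 0
    output = []

    def factor():
        nonlocal pos
        output.append(tokens[pos])   # identifiers/numbers pass through
        pos += 1

    def term():
        nonlocal pos
        factor()
        while pos < len(tokens) and tokens[pos] == "*":
            pos += 1
            factor()
            output.append("*")       # semantic action for term -> term * factor

    def expr():
        nonlocal pos
        term()
        while pos < len(tokens) and tokens[pos] == "+":
            pos += 1
            term()
            output.append("+")       # semantic action for expr -> expr + term

    expr()
    if pos != len(tokens):
        raise SyntaxError("unexpected token at position %d" % pos)
    return " ".join(output)
```

Here `parse_expr(["a", "+", "b", "*", "c"])` produces the postfix string `a b c * +`; a parser generator would construct the equivalent machinery automatically from the grammar.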

Semantic analysis. Semantic analysis usually refers to the verification of various semantic integrity constraints that must take place sometime during the compilation process. Type checking, determining that functions are called with the appropriate number of arguments, and verifying that identifiers have been declared are typical of what takes place during semantic analysis.


Attribute grammars29 and affix grammars30 are two formalisms that have been proposed for describing the semantics of programming languages. A few compiler-compilers have used these formalisms for automating semantic analysis. However, the automatic implementation of attribute and affix grammars can be slow and may require the entire parse tree to be stored in memory.

Nevertheless, there is considerable research interest in attribute grammars at present. (A recent bibliography cites over 150 references on attribute grammars.31) The extent to which attribute grammars can be used in practical compiler-compilers is still to be determined.
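The flavor of an attribute grammar can be hinted at with a minimal sketch in which each production is paired with a semantic rule computing a synthesized `val` attribute from the attributes of its children. This toy example (the node encoding and rule table are invented) handles only synthesized attributes; real attribute grammar systems also support inherited attributes flowing down the tree:

```python
# Parse-tree nodes: ("num", n) or ("plus"/"times", left, right).
# Each semantic rule computes the parent's val attribute from the
# val attributes of its children, exactly one rule per production.
RULES = {
    "num":   lambda node, vals: node[1],
    "plus":  lambda node, vals: vals[0] + vals[1],
    "times": lambda node, vals: vals[0] * vals[1],
}

def evaluate(node):
    """Compute synthesized attributes by a bottom-up walk of the tree."""
    kind = node[0]
    child_vals = [] if kind == "num" else [evaluate(c) for c in node[1:]]
    return RULES[kind](node, child_vals)
```

Note that this simple evaluator needs the whole parse tree in memory, which is precisely the cost mentioned above for automatic implementations of attribute grammars.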

Code optimization. An "optimizing" compiler has a code optimization phase that attempts to transform an intermediate language representation of the source program into one from which a faster or smaller (but not necessarily optimal) target program can be generated. The first Fortran compiler set remarkably high standards for code optimization that most subsequent compilers have not met.32

Code optimization was an active research area during the 1970's. Optimizing transformations were codified, and efficient algorithms to implement various optimizations were found.33,34 It was discovered that "structured" programs were easier to optimize than unstructured ones.35,36

A number of optimizations can be safely performed only by knowing information gathered from the entire program. Global data-flow analysis techniques were


studied in great depth by Kennedy,37 Kildall,38 and others.35 Some of the data-flow analysis techniques were also used in detecting programming errors39 and in restructuring programs.40 Ullman gives a general survey of data-flow analysis techniques.41
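The iterative style of global data-flow analysis studied in this period can be sketched for one classical problem, live-variable analysis. The flow-graph representation below is invented for illustration; the point is the fixed-point iteration of the data-flow equations:

```python
# Each basic block B is described by (use[B], def[B], successors):
# use[B] = variables read before any write in B, def[B] = variables
# written in B. The analysis iterates the backward equations
#   in[B]  = use[B] union (out[B] - def[B])
#   out[B] = union of in[S] over all successors S
# until no set changes.
def live_variables(blocks):
    live_in = {b: set() for b in blocks}
    live_out = {b: set() for b in blocks}
    changed = True
    while changed:
        changed = False
        for b, (use, defs, succs) in blocks.items():
            new_out = set().union(*[live_in[s] for s in succs])
            new_in = use | (new_out - defs)
            if new_in != live_in[b] or new_out != live_out[b]:
                live_in[b], live_out[b] = new_in, new_out
                changed = True
    return live_in, live_out
```

On a two-block graph where B1 reads x, writes y, and falls through to B2, which reads y, the iteration converges to in[B1] = {x} and out[B1] = {y}, the kind of whole-program fact a register allocator or dead-code eliminator needs.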

However, these developments are only beginning to be harnessed into effective code optimizer generators. Interesting experimentation with automatic code optimization is underway at IBM Yorktown Heights42 and at Carnegie-Mellon University.43 The paper by Leverett et al. in this issue describes some of the code optimization strategies incorporated into the Production-Quality Compiler-Compiler project at Carnegie-Mellon.

Code generation. Code generation, the final phase of our compiler model, translates the intermediate language program into the target language program. General code generation algorithms have been studied since the late 1950's.44 In 1970, Sethi and Ullman wrote an influential paper describing how to generate optimal code for expressions on two-address machines.45 In 1976 Aho and Johnson published a template-based code generation algorithm that produces optimal code for expressions on much more general machines.46

However, it was discovered that optimal code generation in the presence of common subexpressions is inherently hard, no matter how simple the machine.47,48 Also, optimal code sequences for many existing machine architectures can be very unintuitive.49,50 Therefore, in most cases of practical interest, a compiler must rely on effective heuristics to generate good code. The thrust of much of the recent work in code generation stresses portability and/or retargetability of the resulting compiler.

There are two popular approaches to code generation. One is the use of systems-oriented procedural languages to facilitate implementation of code generators.51,52 The other is the design of general code generation algorithms that operate using tables to represent the specific target machine instructions.53-58

The latter style of code generation is often implemented as a template-matching process. The intermediate language program is a tree. Code generation consists of covering the tree with templates representing the target machine instructions. This tree template matching approach allows many machine dependencies to be isolated into the tables containing the templates. This approach appears attractive for use in compiler-compilers in that it may be possible to derive the template tables automatically from a formal specification of the target machine and in that the same compiler can be used to generate code for a new machine by changing only the contents of the template tables. The papers by Graham and Leverett et al. in this issue describe this approach to code generation in more detail.
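A drastically simplified sketch of the template-matching idea: the table below pairs tree patterns (here just node kinds) with instruction templates for an imaginary machine, and the code generator covers the tree, emitting one instruction per matched template. All mnemonics are invented; retargeting would amount to replacing the table:

```python
# Instruction templates for an imaginary target machine; the machine
# dependencies live entirely in this table.
TEMPLATES = {
    "const": "LOADI {dst},{v}",
    "var":   "LOAD {dst},{v}",
    "+":     "ADD {dst},{a},{b}",
    "*":     "MUL {dst},{a},{b}",
}

def generate(tree):
    """Cover the expression tree with templates, emitting one
    instruction per match; registers are allocated naively."""
    code = []
    counter = [0]

    def gen(node):
        kind = node[0]
        dst = "r%d" % counter[0]
        counter[0] += 1
        if kind in ("const", "var"):
            code.append(TEMPLATES[kind].format(dst=dst, v=node[1]))
        else:
            a, b = gen(node[1]), gen(node[2])
            code.append(TEMPLATES[kind].format(dst=dst, a=a, b=b))
        return dst

    gen(tree)
    return code
```

Running `generate(("+", ("var", "x"), ("const", 1)))` emits a load, a load-immediate, and an add. A realistic table-driven generator would match multi-node patterns (so that, say, a variable plus a constant could be covered by a single add-immediate instruction) and choose among competing covers by cost.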

Formal specification of languages

The formal specification of a programming language consists of two parts: a syntactic specification that defines the set of well-formed programs and a semantic specification that gives a meaning for each well-formed program. Some researchers have set the goal of constructing compiler-compilers that will automatically produce production compilers from formal specifications of the source and target languages. Unless the source and target languages are suitably restricted, this goal is unreasonably optimistic at present.

Context-free grammars (BNF) have been the de rigueur syntactic specification of programming languages since the publication of the Algol 60 report,59 and the associated theory of context-free languages has become well understood.16 As we have noted, an added advantage in using context-free grammars to specify the syntax of languages is that we can automatically construct an efficient syntax analyzer directly from a defining grammar.

The semantic specification of a language is more difficult. Various methods of specifying the semantics of a programming language have been proposed,60 and considerable progress has been made in developing a sound mathematical theory of programming language semantics.61 Nevertheless, at present there is no formal semantic specification from which production quality compilers can be generated automatically.

Recently, denotational semantic specifications of programming languages have become popular,62-64 and it is possible to automatically generate an interpreter from a denotational semantic specification of a language.65 More research is necessary in order to translate the denotational semantic specification into an efficient code generator for a given target machine.
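The denotational style can be hinted at in a few lines: a valuation function maps each syntactic construct, compositionally, to its meaning as a function from environments to values. The sketch below is only a hint of the idea for a toy expression language, nothing like the full Scott-Strachey machinery; the node encoding is invented:

```python
# The valuation function E maps syntax (a tree) to a denotation:
# a function from environments (variable -> value) to values.
# Each case is defined purely in terms of the denotations of the
# subterms, which is what makes the definition compositional.
def E(expr):
    kind = expr[0]
    if kind == "num":
        n = expr[1]
        return lambda env: n
    if kind == "var":
        name = expr[1]
        return lambda env: env[name]
    if kind == "plus":
        m1, m2 = E(expr[1]), E(expr[2])
        return lambda env: m1(env) + m2(env)
    raise ValueError("unknown construct: %r" % (kind,))
```

Applying the meaning of `x + 1` to the environment {x: 41} yields 42. Evaluating denotations directly like this gives an interpreter; the open research problem noted above is turning such specifications into efficient generated code.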

Our understanding of compilation has increased substantially during the 1970's. We are now able to automatically construct reasonably good lexical and syntactic analyzers for most programming languages, although some problems still remain with the automatic design of good error-handling routines. The remaining phases of compilation are not yet as thoroughly automated as lexical or syntactic analysis, but template-driven code generation techniques appear to be attractive for use in automatic code-generator-generators.

In the last decade, the mathematical foundations of programming language semantics have become much better understood, but not yet as well as those of syntax. The automatic translation of programming language semantics into machine language semantics is still an active research area.

On the whole, the programming language area is becoming more complex. Programming languages are becoming more diverse. With the arrival of large-scale integration, new machine architectures are potentially easier to fabricate. What machine should we build to implement a given language is a question that will be asked more and more often. Compiler-compilers can help answer this question by enabling us to construct a compiler for a hypothetical new machine quickly and cheaply, and thereby allowing us to measure the quality of code that would be produced for that new machine. The 1980's promise to be a lively period in the history of programming languages and their compilers.


Acknowledgments

I would like to express my appreciation to all of the authors for their efforts, and to Ted Lewis for his help in putting together this special issue.

I would also like to thank Steve Johnson, Ravi Sethi, and Tom Szymanski for their helpful comments on this overview.

References

1. S. I. Feldman, "Implementation of a Portable Fortran 77 Compiler Using Modern Tools," Proc. ACM SIGPLAN Symp. Compiler Construction, Aug. 1979, pp. 98-106.

2. B. W. Kernighan and L. L. Cherry, "A System for Typesetting Mathematics," Comm. ACM, Vol. 18, No. 3, Mar. 1975, pp. 151-156.

3. S. C. Johnson, "A Portable Compiler: Theory and Practice," Proc. 5th ACM Symp. Principles of Programming Languages, Jan. 1978, pp. 97-104.

4. A. V. Aho, B. W. Kernighan, and P. J. Weinberger, "AWK-A Pattern Scanning and Processing Language," Software-Practice and Experience, Vol. 9, Apr. 1979, pp. 267-279.

5. K.-J. Raiha, M. Saarinen, E. Soisalon-Soininen, and M. Tienari, "The Compiler Writing System HLP," Report A-1978-2, Department of Computer Science, University of Helsinki, Finland, 1978.

6. H. Ganzinger, K. Ripken, and R. Wilhelm, "Automatic Generation of Optimizing Multipass Compilers," Information Processing 77, North Holland, Amsterdam, 1977, pp. 535-540.

7. F. E. Allen et al., "The Experimental Compiling Systems Project," Report RC 6718, IBM T. J. Watson Research Center, Yorktown Heights, N.Y., Sept. 1977.

8. D. Bjorner, "Programming Languages: Formal Development of Interpreters and Compilers," in International Computing Symp. 1977, E. Morlet and D. Ribbens, eds., North Holland, Amsterdam, 1977, pp. 1-21.

9. F. L. Bauer, "Historical Remarks on Compiler Construction," in Compiler Construction: An Advanced Course, F. L. Bauer and J. Eickel, eds., Springer-Verlag, Berlin, 1974, pp. 603-621.

10. D. E. Knuth, "A History of Writing Compilers," Computers and Automation, Vol. 11, No. 12, Dec. 1962, pp. 8-14.

11. J. A. Feldman and D. Gries, "Translator Writing Systems," Comm. ACM, Vol. 11, No. 2, Feb. 1968, pp. 77-113.

12. A. V. Aho and J. D. Ullman, Principles of Compiler Design, Addison-Wesley, Reading, Mass., 1977.

13. F. L. Bauer and J. Eickel, eds., Compiler Construction: An Advanced Course, Springer-Verlag, Berlin, 1974.

14. D. Gries, Compiler Construction for Digital Computers, Wiley, New York, 1971.

15. P. M. Lewis, II, D. J. Rosenkrantz, and R. E. Stearns, Compiler Design Theory, Addison-Wesley, Reading, Mass., 1976.

16. J. E. Hopcroft and J. D. Ullman, Introduction to Automata Theory, Languages, and Computation, Addison-Wesley, Reading, Mass., 1979.

17. W. L. Johnson et al., "Automatic Generation of Efficient Lexical Processors Using Finite State Techniques," Comm. ACM, Vol. 11, No. 12, Dec. 1968, pp. 805-813.

18. M. E. Lesk, "LEX-a Lexical Analyzer Generator," Computing Science Technical Report 39, Bell Laboratories, Murray Hill, N.J., 1975.


19. A. V. Aho and J. D. Ullman, The Theory of Parsing, Translation, and Compiling, Prentice-Hall, Englewood Cliffs, N.J., Vol. I, 1972; Vol. II, 1973.

20. D. E. Knuth, "Top-Down Syntax Analysis," Acta Informatica, Vol. 1, No. 2, 1971, pp. 79-110.

21. D. J. Rosenkrantz and R. E. Stearns, "Properties of Deterministic Top-Down Parsing," Information and Control, Vol. 17, No. 3, Oct. 1970, pp. 226-256.

22. D. E. Knuth, "On the Translation of Languages from Left to Right," Information and Control, Vol. 8, No. 6, Dec. 1965, pp. 607-639.

23. A. J. Korenjak, "A Practical Method for Constructing LR(k) Processors," Comm. ACM, Vol. 12, No. 11, Nov. 1969, pp. 613-623.

24. F. L. DeRemer, "Simple LR(k) Grammars," Comm. ACM, Vol. 14, No. 7, July 1971, pp. 453-460.

25. A. V. Aho, S. C. Johnson, and J. D. Ullman, "Deterministic Parsing of Ambiguous Grammars," Comm. ACM, Vol. 18, No. 8, Aug. 1975, pp. 441-452.

26. S. C. Johnson, "YACC-Yet Another Compiler-Compiler," Computing Science Technical Report 32, Bell Laboratories, Murray Hill, N.J., 1975.

27. S. L. Graham, C. B. Haley, and W. N. Joy, "Practical LR Error Recovery," Proc. ACM SIGPLAN Symp. Compiler Construction, Aug. 1979, pp. 168-175.

28. S. L. Graham and S. P. Rhodes, "Practical Syntactic Error Recovery in Compilers," Comm. ACM, Vol. 18, No. 11, Nov. 1975, pp. 639-650.

29. D. E. Knuth, "Semantics of Context-free Languages," Math. Systems Theory, Vol. 2, No. 2, June 1968, pp. 127-145.


August 1980

30. C. H. A. Koster, "Using the CDL Compiler-Compiler," in Compiler Construction: An Advanced Course, F. L. Bauer and J. Eickel, eds., Springer-Verlag, Berlin, 1974, pp. 366-426.

31. K.-J. Raiha, "Bibliography on Attribute Grammars," ACM SIGPLAN Notices, Vol. 15, No. 3, Mar. 1980, pp. 35-44.

32. J. W. Backus et al., "The FORTRAN Automatic Coding System," Proc. Western Joint Computer Conf., Vol. 11, 1957, pp. 188-198.

33. F. E. Allen and J. Cocke, "A Catalogue of Optimizing Transformations," in Design and Optimization of Compilers, R. Rustin, ed., Prentice-Hall, Englewood Cliffs, N.J., 1972, pp. 1-30.

34. D. E. Knuth, "An Empirical Study of FORTRAN Programs," Software-Practice and Experience, Vol. 1, No. 2, 1971, pp. 105-133.

35. M. S. Hecht, Data Flow Analysis of Computer Programs, American Elsevier, New York, 1977.

36. M. V. Zelkowitz and W. G. Bail, "Optimization of Structured Programs," Software-Practice and Experience, Vol. 4, No. 1, 1974, pp. 51-57.

37. K. Kennedy, "A Global Flow Analysis Algorithm," Intl. J. Computer Math., Vol. 3, 1971, pp. 5-15.

38. G. A. Kildall, "A Unified Approach to Global Program Optimization," Proc. ACM Symp. Principles of Programming Languages, 1973, pp. 194-206.

39. L. D. Fosdick and L. J. Osterweil, "Data Flow Analysis in Software Reliability," Computing Surveys, Vol. 8, No. 3, 1976, pp. 305-330.

40. B. S. Baker, "An Algorithm for Structuring Programs," J. ACM, Vol. 24, No. 1, Jan. 1977, pp. 98-120.

41. J. D. Ullman, "Data Flow Analysis," Proc. 2nd USA-Japan Computer Conf., AFIPS Press, Montvale, N.J., 1975, pp. 335-342.

42. W. Harrison, "A New Strategy for Code Generation-the General Purpose Optimizing Compiler," Proc. Fourth ACM Symp. Principles of Programming Languages, Jan. 1977, pp. 29-37.

43. W. A. Wulf et al., The Design of an Optimizing Compiler, American Elsevier, New York, 1975.

44. A. P. Ershov, "On Programming of Arithmetic Operations," Dokl. A. N. USSR, Vol. 118, No. 3, 1958, pp. 427-430; also in Comm. ACM, Vol. 1, No. 8, 1958, pp. 3-9.

45. R. Sethi and J. D. Ullman, "The Generation of Optimal Code for Arithmetic Expressions," J. ACM, Vol. 17, No. 4, 1970, pp. 715-728.

46. A. V. Aho and S. C. Johnson, "Optimal Code Generation for Expression Trees," J. ACM, Vol. 23, No. 3, 1976, pp. 488-501.

47. J. L. Bruno and R. Sethi, "Code Generation for a One Register Machine," J. ACM, Vol. 23, No. 3, 1976, pp. 502-510.

48. A. V. Aho, S. C. Johnson, and J. D. Ullman, "Code Generation for Expressions with Common Subexpressions," J. ACM, Vol. 24, No. 1, Jan. 1977, pp. 146-160.

49. A. V. Aho, S. C. Johnson, and J. D. Ullman, "Code Generation for Machines with Multiregister Operations," Proc. Fourth ACM Symp. Principles of Programming Languages, 1977, pp. 21-28.

50. T. G. Szymanski, "Assembling Code for Machines with Span-Dependent Instructions," Comm. ACM, Vol. 21, No. 4, Apr. 1978, pp. 300-308.

51. W. M. McKeeman, J. J. Horning, and D. B. Wortman, A Compiler Generator, Prentice-Hall, Englewood Cliffs, N.J., 1970.

52. M. Elson and S. T. Rake, "Code-Generation Technique for Large-Language Compilers," IBM Systems J., Vol. 9, No. 3, 1970, pp. 166-188.

53. R. G. G. Cattell, "Automatic Derivation of Code Generators from Machine Descriptions," ACM Trans. Programming Languages and Systems, Vol. 2, No. 2, Apr. 1980, pp. 173-190.

54. C. W. Fraser, "Automatic Generation of Code Generators," PhD dissertation, Computer Science Dept., Yale University, New Haven, Conn., 1977.

55. K. Ripken, "Formale Beschreibung von Maschinen, Implementierungen und optimierender Maschinencodeerzeugung aus attributierten Programmgraphen," Technische Univ. Munchen, Munich, Germany, 1977.

56. P. L. Miller, "Automatic Creation of a Code Generator from a Machine Description," Tech. Rep. TR 85, Project MAC, MIT, Cambridge, Mass., 1971.

57. J. M. Newcomer, "Machine Independent Generation of Optimal Local Code," PhD dissertation, Computer Science Dept., Carnegie-Mellon University, Pittsburgh, Pa., 1975.

58. S. G. Wasilew, "A Compiler Writing System with Optimization Capabilities for Complex Order Structures," PhD dissertation, Northwestern University, Evanston, Ill., 1971.

59. P. Naur et al., "Report on the Algorithmic Language ALGOL 60," Comm. ACM, Vol. 3, No. 5, May 1960, pp. 299-314.

60. D. Bjorner, "Programming Languages: Linguistics and Semantics," International Computing Symp. 1977, E. Morlet and D. Ribbens, eds., North Holland, Amsterdam, 1977, pp. 511-536.

61. J. E. Stoy, Denotational Semantics: The Scott-Strachey Approach to Programming Language Theory, MIT Press, Cambridge, Mass., 1977.

62. M. J. C. Gordon, The Denotational Description of Programming Languages, Springer-Verlag, New York, 1979.

63. R. Sethi, "A Case Study in Specifying the Semantics of a Programming Language," Proc. Seventh Annual ACM Symp. Principles of Programming Languages, Jan. 1980, pp. 117-130.

64. R. D. Tennent, "The Denotational Semantics of Programming Languages," Comm. ACM, Vol. 19, No. 8, Aug. 1976, pp. 437-453.

65. P. D. Mosses, "SIS: A Compiler-Generator System Using Denotational Semantics," Report 78-4-3, Dept. of Computer Science, University of Aarhus, Denmark, June 1978.

Alfred V. Aho is head of the Computing Principles Research Department at Bell Laboratories, Murray Hill, New Jersey. His current research interests include algorithms, compilers, data bases, and theoretical computer science. He is an author of numerous papers and books in the computer science field. He is an affiliate professor at Stevens Institute of Technology and is past chairman of the ACM Special Interest Group on Automata and Computability Theory.

Aho received the BASc degree in engineering physics from the University of Toronto in 1963 and the MA and PhD degrees in EECS from Princeton University in 1965 and 1967.
