Crossroads Vol. 16, Issue 1

www.acm.org/crossroads • Fall 2009 • Issue 16.1


Page 1: CrossRoads vol 16 issue1

www.acm.org/crossroads • Fall 2009 • Issue 16.1

Page 3: CrossRoads vol 16 issue1

CROSSROADS STAFF

MANAGING EDITOR: Justin Solomon, Stanford University

COPY EDITOR: Joe Nyiri, Niagara University

ASSOCIATE EDITORS: Malay Bhattacharyya, Indian Statistical Institute; Andrew David, University of Minnesota Twin Cities; Aris Gkoulalas-Divanis, Vanderbilt University; Dan Goldberg, University of Southern California; Ryan K. L. Ko, Nanyang Technological University; Sumit Narayan, University of Connecticut

CONTRIBUTORS: Anna Ritchie, University of Cambridge; Tal Rusak, Stanford University

INSTITUTIONAL REVIEWERS: Ernest Ackermann, Mary Washington College; Peter Chalk, London Metropolitan University; Nitesh Chawla, University of Notre Dame; José Creissac Campos, University of Minho; Ashoke Deb, Memorial University of Newfoundland; Steve Engels, University of Toronto; João Fernandes, University of Minho; Chris Hinde, Loughborough University; Michal Krupka, Palacky University; Piero Maestrini, ISTI-CNR, Pisa; José Carlos Ramalho, University of Minho; Suzanne Shontz, Pennsylvania State University; Roy Turner, University of Maine; Ping-Sing Tsai, University of Texas—Pan American; Andy Twigg, University of Cambridge; Joost Visser, Software Improvement Group; Tingkai Wang, London Metropolitan University; Charles Won, California State University, Fresno

GRAPHICS EDITOR: Salik Syed, Stanford University

ONLINE EDITORS: Gabriel Saldaña, Instituto de Estudios Superiores de Tamaulipas (IEST), Mexico; Srinwantu Dey, University of Florida

ASSOCIATE COPY EDITORS: David Chiu, Ohio State University; Scott DuVall, University of Utah; Leslie Sandoval, University of New Mexico

SPANISH EDITOR: Pablo Cayuela, Universidad Tecnológica Nacional, Facultad Regional Córdoba, Argentina

OFFERING #XRDS0161

ISSN#: 1528-4981 (PRINT), 1528-4982 (ELECTRONIC)

Front cover photograph by Josep Rosell

Fall 2009—Issue 16.1

COLUMNS & REVIEWS

INTRODUCTION . . . 2
by Justin Solomon, Managing Editor

CS EDUCATION IN THE U.S.: HEADING IN THE WRONG DIRECTION? . . . 17
by Robert Dewar and Owen Astrachan

What are the most effective methods for teaching students the fundamental principles of software engineering?

FEATURES

DON’T CHANGE A THING: HOW ADOPTING A SERVICE PROVIDER ATTITUDE CAN BOOST YOUR CAREER
by Michael DiBernardo

In software development, treating recently graduated new employees as “apprentices only” is a waste of talent. Michael DiBernardo advises CS students, as those future new hires, to adopt a service-provider mentality to leverage their experience, improve the organization, and more quickly establish trust and credibility there.

DYNAMIC DISPLAYS: TOUCHSCREENS, MEET BUTTONS
by Chris Harrison and Scott Hudson

What happens when you combine the graphical flexibility of touchscreens with the beneficial tactile properties of physical buttons? Two bright minds at Carnegie Mellon University’s Human-Computer Interaction Institute built a device that does just that.

SERVER VIRTUALIZATION ARCHITECTURE AND IMPLEMENTATION
by Jeff Daniels

PhD student Jeff Daniels explores the history of the virtual machine, since its birth in the 1960s, and outlines the key concepts in virtualization technology today.

GETTING FROM HERE TO THERE: AN INTERVIEW WITH TRONSTER HARTLEY, SENIOR PROGRAMMER AT FIRAXIS GAMES
by Crossroads Staff

Tronster Hartley, a senior programmer at Firaxis Games, explains why he started his own game company as an educational side project to his day job, and considers how CS education helped shape his career path.

Contact ACM and Order Today!
Phone: 1.800.342.6626 (USA/Canada); +1.212.626.0500 (outside USA/Canada)
Fax: +1.212.944.1318
Postal Address: ACM Member Services, P.O. Box 11405, New York, NY 10286-1405 USA
Internet: http://store.acm.org/acmstore
Please note the offering numbers for fulfilling claims or single order purchase below.

Copyright 2009 by the Association for Computing Machinery, Inc. Permission to make digital or hard copies of part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page or initial screen of the document. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers, or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from Publications Dept., ACM Inc., fax +1 (212) 869-0481, or [email protected].

Crossroads is distributed free of charge over the internet. It is available from: http://www.acm.org/crossroads/.

Articles, letters, and suggestions are welcomed. For submission information, please contact [email protected].


Page 4: CrossRoads vol 16 issue1

Introduction: Maxed Out?

By Justin Solomon, Managing Editor

During the recent ACM SIGGRAPH conference for computer graphics (August 3-7 in New Orleans), Pixar vice president Rob Cook gave a particularly thought-provoking speech on the future of graphics as an area of research. Cook, the most recent winner of the Steven Anson Coons Award for Outstanding Contributions to Computer Graphics, highlighted the evolution of graphics from a fast-moving, speculative area of research to a massive enterprise with millions of dollars pouring in from entertainment, electronics, and other industries.

During this transformation, graphics research progressively became narrower. While older SIGGRAPHs may have been exciting venues to see technology unimaginable just a year earlier, more recent conferences have shown a convergence in the topics of interest to graphics researchers, who are presenting refinements and polishing techniques for existing approaches to graphics problems rather than sharing totally new ideas.

As a result, the papers at the conference have become narrower and less interesting to a general audience. In 2009, they focused on topics like rendering insects encased in amber and refining geometric structures already explored in previous papers. Several years ago, on the other hand, we saw the introduction of cloth simulation, a technique that formed the basis for several other innovative animation processes.

Although some of Cook’s observations may have been particular to computer graphics, his impressions of the changing profile of graphics research reflect a similar story in other areas of interest to computer scientists.

The ubiquity of personal computing has moved the PC from a novelty or even curiosity to a less exciting computational tool. This shift in perspective necessarily changed researchers’ outlooks and enthusiasm for computing as its own scientific endeavor.

Such a development brings with it several pertinent questions. On the whole, the history of computer science is minuscule when compared to that of more “established” fields, like physics or biology. If researchers are already afraid that their work will fall into obscurity in decades rather than centuries, is computer science likely to stagnate in the near future? Have all the relevant computer-related discoveries already been made?

As the next generation of computer scientists and professionals, we students are the only ones who can determine the future of our industry. I am optimistic that we have but scratched the surface of a rich and challenging field of research and engineering likely to offer up interesting problems for centuries to come.

It’s true that the foundations of many subfields of computing have been laid out. Nobody would dare discount the importance of data structures, databases, and depth buffers. With these basics out of our way, however, we have the unique and exciting opportunity to use the tools in place to generate developments not just within our own areas of interest, but across computing and other disciplines.

Just as Rob Cook advocated expanding our conception of graphics to include “interactive techniques” and other settings not directly related to rendering increasingly complex images, we should look for the next generation of interesting problems not in laying groundwork but in solving real-world problems—theoretical or practical—in the larger scientific, mathematical, and engineering communities.

As the student members of the ACM, Crossroads readers already make up a self-identified group of the leaders of future developments in computing. Having observed the breadth and depth of our readers’ work by reading their submissions to Crossroads and other venues, I have no doubt that the future of computing is in good hands. After all, in this issue alone we explore topics as diverse as virtualization, careers in video game development, and entering the workforce as a software engineer. Moreover, our readers’ active participation in ACM chapters nationwide demonstrates their commitment to building the social networks and lines of communication necessary to make the next big discoveries in computer science.

As we continue to share the latest accomplishments of your peers in computing as well as news and career advice, we welcome your comments and suggestions via email, at [email protected].

I wish you the best of luck in the 2009-2010 academic year. And please keep Crossroads posted as you continue to impress programmers and non-programmers alike with your talents and creativity solving tough computer problems.

Biography
Justin Solomon is an undergraduate at Stanford University double majoring in Computer Science and Mathematics. Along with his work as the managing editor for ACM Crossroads, he participates in computer graphics research in collaboration with the Stanford Department of Computer Science and Pixar Animation Studios, competes in programming contests, and plays cello and piano.

Visit the NEW site at www.acm.org/crossroads

Page 5: CrossRoads vol 16 issue1


Don’t Change a Thing

How Adopting a Service Provider Attitude Can Boost Your Career

By Michael DiBernardo

As a student of computer science, there’s a significant chance you will end up working in software development after graduation. Whether your career path takes you into industry or academia, you’re likely to have some kind of interaction with software development companies or organizations, if only in trying to get the most out of a project or collaboration.

Students and experienced software developers alike often assume that a new graduate starting his or her first job will work within an existing development process, without necessarily contributing to it or changing it. The new hire, it’s assumed, has enough academic training to contribute to the codebase and intellectual framework around the product, but not to the methodology that’s actually used to guide that development. This process can include status quos for project planning, work estimation, revision control, check-in policies, product architecture, development techniques (such as design by contract, test-driven development, or Scrum), testing methodologies, build management, and deployment.

But are software companies wasting their recent-graduate employees by keeping them out of the process?

The idea of the “new graduate as apprentice only” reduces both the new hire’s experience as an employee and the organization’s return on investment in hiring. Many CS graduates of today’s universities have significant experience with companies and organizations that have a track record of engineering excellence. Typically, this experience comes from co-op or internship placements, contributions to open-source projects, community or volunteer projects, and research assistantships or “honors projects” that are supervised by professors or other highly esteemed mentors.

In the course of their internship-level work, new graduates gain experience working with processes that are often more efficient than the ones that are in place at their new employer. Failing to leverage these prior experiences to improve the organization’s performance is a major disservice to both the new graduates and the software companies that hire them. On the other hand, there are many things that can go wrong when trying to change processes that are already in place, especially if the person initiating the change is perceived as less experienced. However, by presenting themselves as “service providers,” CS graduates can overcome much of the resistance that they would otherwise face. Taking on this role can speed up the process of establishing trust and credibility in a new organization.

The New Employee Trap
Before I discuss the service-provider approach to encouraging change, I want to first consider how a new hire, who is eager to improve the status quo of software development, might approach the problem.

Let’s say we have a new graduate named Mark, who has just joined a medium-sized organization called Informila on a team of about 50 developers. Within a few weeks of working at Informila, Mark identifies some existing processes that he would like to see changed or replaced. Specifically, he finds that using Concurrent Versions System (CVS) as a revision-control system is causing problems that could be obviated by switching to a new system. Mark has already lost some work due to failed commits that submitted partial changes. Having previously implemented a conversion from CVS to Subversion during a co-op term, he believes that he could implement the change in his new organization with very little negative impact.

In a sentence, here’s how Mark sees the problem from his perspective:

As a new but experienced team member, I want to make things better by changing the way Informila has always done things.

In other words, Mark views himself as part of the team already, albeit a bit green. However, the other developers at Informila may not feel quite this way just yet. It takes time to become an integrated member of a team, and Mark just hasn’t been around long enough.

Another premise we can draw from Mark’s statement is that he has a genuine desire to make things better, and to do so, he believes he needs to change existing processes. Change always comes at a cost. In this scenario, there are at least two currencies: time and money (which we treat interchangeably) and trust or reputation. The latter currency is much harder to gauge and is often forgotten—but it is a real cost. If Mark has not built up the trust or reputation required within his team to lead change, the other developers will resist it.

Back to the scenario: Mark suggests to his teammates that they migrate their current revision control system to Subversion. He is unable to convince them and their development manager that this is a good idea. Even though Mark has written some sample scripts to demonstrate that he can make the conversion without losing the existing revision history, and he has previous experience and thus some credibility, the team worries that this would be a risky change to their process and isn’t worth the potential benefit. Mark decides that he needs to spend more time doing his best to demonstrate his development prowess by writing quality code before he can build the trust needed to make such changes.

The Service-Provider Approach
Thwarted from making any inroads, Mark has decided to operate within his narrow job responsibilities for some time while he slowly builds trust with the team. The problem with this approach is that it

Page 6: CrossRoads vol 16 issue1

will take much longer than Mark expects to build his reputation. Even if Mark is an exceptionally good programmer, his team is already quite competent, and the decision-makers within the organization are more concerned with the performance of the team as a whole than they are with any individual. Mark may not realize it yet, but it will be difficult for him to differentiate himself technically from his teammates.

In the interim, Mark will struggle with the development processes that he strongly feels are inefficient, and as a result, he’ll enjoy his work less. This will in turn affect his own motivation to differentiate himself from his peers. There is a better way. In his article “At Your Service: Creating a Service-Oriented Software Engineering Team within your Organization” (STQE Magazine, May/June 2001), Robert Sabourin describes how reframing a software engineering team as a service provider to other teams in the organization can greatly improve the ease with which that team is able to enact measurably positive change on the organization as a whole. And there’s no reason this idea should be limited to teams of testers or developers.

Sabourin says that when a service provider is first seeking out ways to earn trust from its customers, it’s much simpler to build trust by adding services than by trying to convince the customer to abandon a service provided by some other organization in exchange for one you will provide.

Almost every student of computer science is taught in introductory algorithms courses that a change can be viewed as some combination of an addition and a deletion. We can view the changing of an existing process in the same way. There are costs—time, money, and trust—associated with removing or “deleting” the old process, and then similar costs for adding the new service. However, it typically requires much less of an investment of trust to add a service, because doing so doesn’t disrupt an existing structure. If you can manage to add a service that’s useful to the organization and meets a need that was not being met before, you will build trust that can then be “spent” to change existing processes. In fact, once you have successfully added a few services, you may even be asked to change things that you previously petitioned to improve in the first place.

A Matter of Mindset
Let’s say Mark decides to take this approach. How does it differ from what he was already doing?

There are two steps to adopting a service-provisioning-based outlook. The first is to reflect on the services that you are capable of providing, which may or may not already exist in the company.

For example, Mark is working in a Windows-centric environment, but he’s also quite accomplished in Linux development. He has done a great deal of web application development using open source technologies. He is willing to spend some extra hours outside work on side projects related to his job, which is a luxury that many of his older co-workers cannot afford due to other responsibilities. Finally, he has a fair amount of experience in operating and integrating bug-tracking and revision-control systems. Mark decides that within his organization, he is capable of adding value by providing services related to 1) Linux development, 2) web application development, and 3) revision control and bug tracking.

The second step involves finding opportunities to “sell” these services in situations where people want to add to the existing process. This step is fairly simple because most people who are in pain are not shy about expressing their anguish.

There’s one sticking point in the second step for many people, though. Most developers will hesitate to add components to a process that they feel aren’t really needed, no matter how loudly others are clamoring for them. This desire to be efficient in the short term can actually hamper one’s ability to improve the overall efficiency of the process in the long term, since adding services can be a great way to build up the trust needed to make broader changes.

As an example, Mark has often heard Andy, the director of test engineering, complaining that he has no good way of visualizing the statistics maintained by their current bug-tracking system. Andy would really like an application that could generate charts showing how the number of bugs opened, re-opened, resolved, and so forth has changed over the course of a project, especially as each project approaches its deadline and different branches must be maintained.
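The core of a tool like the one Andy wants is just an aggregation step: turn a log of bug events into per-week counts by status, which any charting library can then plot. The sketch below assumes a hypothetical export format of (date, status) pairs; the status names and data are invented for illustration, not taken from any real bug tracker.

```python
from collections import Counter
from datetime import date

def weekly_trend(events):
    """Tally bug events per ISO week and status, ready to feed a chart."""
    counts = Counter()
    for day, status in events:
        year, week, _ = day.isocalendar()  # ISO (year, week, weekday)
        counts[(year, week, status)] += 1
    return counts

# Hypothetical export from the bug tracker: (event date, status) pairs.
events = [
    (date(2009, 6, 1), "opened"),
    (date(2009, 6, 3), "opened"),
    (date(2009, 6, 5), "resolved"),
    (date(2009, 6, 9), "re-opened"),
    (date(2009, 6, 10), "resolved"),
]

trend = weekly_trend(events)
print(trend[(2009, 23, "opened")])  # two bugs opened in ISO week 23
```

Grouping by ISO week keeps the buckets stable across year boundaries; a per-day or per-milestone grouping would work the same way.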

Mark’s first thought when he hears this request is that some other open-source bug-tracking tools already have this capability, and that this functionality could be acquired “for free” by simply switching over. It’s a procedure that he has implemented before, and he knows it can be done relatively painlessly, save for the fact that the test team and product owners would need to learn how to use the new system (a time and money cost).

The other approach—writing a program to solve a problem that has already been solved—is something Mark finds highly distasteful. In many respects, writing new software is wasteful compared to simply switching to the other bug-tracking system.

However, Mark reasons that if he were a consultant offering his services to this organization, this perceived need would provide the perfect opportunity to build trust with the client, because it doesn’t involve changing any existing processes at all, and it’s something the client feels is desperately needed. By meeting the perceived need, Mark can start to acquire the trust currency he needs to make things more efficient in the long run, especially since the director of test engineering is such an influential person in the organization.

Perceive and Be Perceived
It may seem as if adopting a service-oriented “method” isn’t really much of a change at all. Mark’s intent has not changed. He still wants to make things better within the organization. However, his perspective of what he’s fundamentally doing has shifted subtly, and in a way that will improve how others perceive Mark and his actions.

Mark’s new statement of action would be something like:

As a provider of Linux development, web application development, and revision control and bug tracking services at Informila, I would like to make things better by adding components or processes to fulfill needs that have been expressed by the organization.

The outlook is totally different. Mark has gone from focusing on himself as an individual to identifying himself as a provider of services. While the shift is strictly one of notation, it removes some of the ego from his perceived role, and allows him to occasionally see the opportunity in courses of action that he might otherwise dismiss as silly or ineffectual.

In my experience, this seemingly superficial change has a very real effect on the kinds of opportunities that one will pursue, and the ways in which one will react when confronted with adversity in these pursuits.

Another shift Mark has made is that he’s focused on adding processes to build trust before trying to introduce changes to existing processes. Lastly, he’s focusing first on needs that have already been recognized by the organization, before trying to address things he has only noticed himself.



Page 7: CrossRoads vol 16 issue1


Mark ends up working on the bug charting system for Andy, and has a first version ready within a month. Andy uses the system to demonstrate some of the current statistics to management, and everyone remarks on how much easier the data is to interpret now with the new charts. Despite the fact that Mark has just essentially replicated functionality that was available in another system, by adding this component to Informila’s existing workflow, he has made more headway toward garnering trust than he would have been able to accomplish through months and months of focusing solely on fulfilling his declared job duties.

In Perspective
I have met many recruiters, managers, and company owners who have commented that people in technical roles are expected to have excellent technical skills as a prerequisite. However, they are differentiated based on their ability to communicate and how well they get along with others. Seeing yourself—and getting others to see you—as a service provider is a highly valued soft skill. And you might just make your work environment a happier place for yourself while you’re at it.

Biography
Michael DiBernardo ([email protected]) is a software craftsman who writes, teaches, and practices the art of software development. He has worked for the government of Canada, the universities of Waterloo and Toronto, Google Inc., and Novell Canada Inc. He holds a B.Math in Computer Science and Bioinformatics from the University of Waterloo, and an M.Sc in Computer Science from the University of British Columbia.


Page 8: CrossRoads vol 16 issue1

Dynamic Displays

Touchscreens, Meet Buttons
By Chris Harrison and Scott Hudson


While touchscreens allow extensive programmability and have become ubiquitous in today’s gadgetry, such configurations lack the tactile sensations and feedback that physical buttons provide. As a result, these devices require more attention to use than their button-enabled counterparts. Still, the displays provide the ultimate interface flexibility and thus afford a much larger design space to application developers.

But from the user’s point of view, touchscreens require direct visual attention. This makes them dangerous in contexts like driving, and potentially disruptive in casual social situations. Furthermore, even with visual attention, touchscreen interfaces tend to be slower and more error prone than gadgets with keyboards, buttons, knobs, and the like.

The tactile sensations produced by physical buttons often make them easier to find and use; thus, they require less attention from the user. Pumping the music in a car while driving, for instance, doesn’t require much more than a quick reach and turn of a knob, or a few taps of a button. Nevertheless, most buttons are static, both in appearance and tactile expression, meaning a single button must be used for many different functions.

Our goal is to devise a display technique that occupies the space between these two extremes, offering some graphical flexibility while retaining some of the beneficial tactile properties. To achieve this, we require that graphics be displayed without interference from hands or the tactile control and actuation elements. The screen has to sense user input without preventing tactile deformation or hiding graphics. Finally, the screen has to provide support for tactile expression beyond simple on/off state changes.

What We Built
Our design consists of one or more air chambers that are created by layering several specially cut pieces of clear acrylic. On top of this, we drape a thin sheet of translucent latex, held in place with a specifically structured pattern of adhesive. Through pneumatic actuation, we can create dynamic physical features and allow a small set of distinct interface elements to occupy the same physical space at different times.

The fabrication is straightforward. We are able to assemble working prototypes with complex features in under an hour using a laser cutter. The displays rely on inexpensive materials: acrylic, glue, and latex. Air chambers can be negatively or positively pressurized with a small and inexpensive pump, allowing for easy actuation.

Applying clear acrylic to such a display allows the image to be rear projected. Our design doesn’t suffer occlusion from user input. This novel approach enables us to employ diffused infrared illumination and an infrared camera for multi-touch sensing.

Biographies
Chris Harrison ([email protected]) is a PhD student in the Human-Computer Interaction Institute at Carnegie Mellon University. His research interests primarily focus on novel input methods and interaction technologies, especially those that leverage existing hardware in new and compelling ways. Over the past four years, he has worked on several projects in the area of social computing and input methods at IBM Research, AT&T Labs, and most recently, Microsoft Research.

Scott Hudson ([email protected]) is a professor in the Human-Computer Interaction Institute within the School of Computer Science at Carnegie Mellon University, where he directs the HCII PhD program. His research interests have covered a wide range of topics within the area of user interface software and technology, though his work has always revolved around the invention and building of things that lead to a better user experience, often indirectly through tools for the UI developer.

Page 9: CrossRoads vol 16 issue1


[Figure: display cross-section. The acrylic elements are shown in various shades of grey; areas where adhesive is applied are shown with a textured blue; the thin latex layer is shown in translucent green.]

Page 10: CrossRoads vol 16 issue1

Server Virtualization Architecture and Implementation

By Jeff Daniels


Introduction
Virtualization in the enterprise is catching on across the country. Hardware vendors are packaging systems tuned to support virtual machines, and software vendors are developing virtual server management tools for migrations, performance, and high availability. Customer IT organizations have defined a virtualization strategy and have begun deploying virtualized data centers.

The virtual machine concept has been around for years. The recent revolution in virtualization technology, hypervisors, and paravirtualization has allowed servers using the popular x86 architecture to operate efficiently and effectively with virtual machines.

Virtual machine technology is an enabler for service-oriented architectures, isolated secure systems, and flexible deployment.

This paper describes the virtual machine from its inception in the 1960s to present-day virtual machines. Various types of virtualization will be discussed, as well as the associated costs and benefits of using virtual machines. Information from this paper should outline the basics of virtualization and offer key concepts when implementing virtualization technology.

What is a Virtual Machine?
A virtual machine (VM) is an abstraction layer or environment between hardware components and the end-user. Virtual machines run operating systems and are sometimes referred to as virtual servers. A host operating system can run many virtual machines and shares system hardware components such as CPUs, controllers, disk, memory, and I/O among virtual servers [8].

A “real machine” is the host operating system and hardware components, sometimes described as “bare metal,” such as memory, CPU, motherboard, and network interface. The real machine is essentially a host system with no virtual machines. The real machine operating system accesses hardware components by making calls through a low-level program called the BIOS (basic input/output system).

Virtual machines are built on top of the real machine core components. Goldberg describes virtual machines as “facsimiles” or a “hardware-software duplicate of a real existing machine” [4, 5]. Abstraction layers called hypervisors or VMMs (virtual machine monitors) make calls from the virtual machine to the real machine. Current hypervisors use the real machine hardware components, but allow for different virtual machine operating systems and configurations. For example, a host system might run on SuSE Linux, and guest virtual machines might run Windows 2003 and Solaris 10.

Virtual machine monitors and hypervisors are similar to “emulators.” Emulation is a “process whereby one computer is set up to permit the execution of programs written for another computer” [9]. Hypervisors offer a level of efficiency that emulators lack, in that emulators must translate every instruction or system call to the CPU, memory, and disk.

Hypervisors have specialized management functions that allow multiple VMs to co-exist peacefully while sharing real machine resources. Mallach concludes the differences are largely semantic because both hypervisors and emulators require I/O requests, memory mapping, and logical memory schemes [10].
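To make the efficiency difference concrete, here is a toy fetch-decode-execute loop in the style of an emulator. The three-opcode instruction set is invented purely for illustration; it corresponds to no real architecture or VMM. The point is that every instruction, privileged or not, passes through the software dispatch loop, which is exactly the per-instruction overhead a hypervisor avoids by running unprivileged guest instructions natively and trapping only the privileged ones.

```python
def emulate(program):
    """Emulator-style dispatch: every instruction is decoded in software."""
    regs = {"r0": 0, "r1": 0}  # toy register file
    io_log = []                # records privileged I/O operations
    for op, *args in program:
        if op == "mov":        # mov reg, immediate
            regs[args[0]] = args[1]
        elif op == "add":      # add dst, src  (dst += src)
            regs[args[0]] += regs[args[1]]
        elif op == "out":      # privileged I/O; a hypervisor would trap here
            io_log.append(regs[args[0]])
        else:
            raise ValueError("unknown opcode: " + op)
    return regs, io_log

regs, io_log = emulate([
    ("mov", "r0", 2),
    ("mov", "r1", 3),
    ("add", "r0", "r1"),
    ("out", "r0"),
])
print(io_log)  # [5]
```

In a hypervisor, only the `out` step would enter code like this; the `mov` and `add` steps would execute directly on the real CPU.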

Virtual Machine History

Virtual machines have been in the computing community for more than 40 years. Early in the 1960s, systems engineers and programmers at MIT recognized the need for virtual machines. In her authoritative discourse, “VM and the VM Community: Past, Present, and Future,” Melinda Varian [17] introduces virtual machine technology, starting with the Compatible Time-Sharing System (CTSS). IBM engineers had worked with MIT programmers to develop a time-sharing system to allow project teams to use part of the mainframe computers. Varian goes on to describe the creation, development, and use of virtual machines from the IBM OS/360 Model 67 to the VM/370 and the OS/390 [17]. Varian’s paper covers virtual machine history, emerging virtual machine designs, important milestones and meetings, and influential engineers in the virtual computing community.

In 1973, Srodawa and Bates [15] demonstrated how to create virtual machines on IBM OS/360s. In “An Efficient Virtual Machine Implementation,” they describe the use of IBM’s Virtual Machine Monitor, a hypervisor, to build virtual machines and allocate memory, storage, and I/O effectively. Srodawa and Bates touch on virtual machine topics still debated today: performance degradation, capacity, CPU allocation, and storage security.

Abstract

Virtual machine technology, or virtualization, is gaining momentum in the information technology community. While virtual machines are not a new concept, recent advances in hardware and software technology have brought virtualization to the forefront of IT management. Stability, cost savings, and manageability are among the reasons for the recent rise of virtualization. Virtual machine solutions can be classified by hardware, software, and operating system/containers. From its inception on the mainframe to distributed servers on x86, the virtual machine has matured and will play an increasing role in systems management.


Server Virtualization Architecture and Implementation

Crossroads www.acm.org/crossroads Fall 2009 / Vol. 16, No. 1

Goldberg concludes “the majority of today’s computer systems do not and cannot support virtual machines. The few virtual machine systems currently operational, e.g., CP-67, utilize awkward and inadequate techniques because of unsuitable architectures” [6].

Goldberg proposes the “Hardware Virtualizer,” in which a virtual machine would communicate directly with hardware instead of going through the host software. Nearly 30 years later, industry analysts are excited about the announcement of hardware architectures capable of supporting virtual machines efficiently. AMD and Intel have revealed specifications for Pacifica and Vanderpool chip technologies with special virtualization support features.

The 1980s and early 1990s brought distributed computing to data centers. Interest in centralized computing and virtual machines was replaced by standalone servers with dedicated functions: email, database, Web, applications. After significant investments in distributed architectures, focus has returned to virtual machines as a complementary solution for server consolidation projects and data center management initiatives [14].

Recent developments in virtual machines on the Windows x86 platform merit a new chapter in virtual machine history. Virtual machine software from Virtuozzo, Microsoft, Xen, and EMC (VMWare) has spurred creative virtual machine solutions. Grid computing, computing on demand, and utility computing technologies seek to maximize computing power in an efficient, manageable way.

The virtual machine was created on the mainframe and has only recently been introduced on the mid-range, distributed x86 platform. Technological advancements in hardware and software make virtual machines stable and affordable, and they offer tremendous value, given the right implementation.

Types of Virtualization

Virtual machines are implemented in various forms. Mainframe, open source, paravirtualization, and custom approaches to virtual machines have been designed over the years. Complexity in chip technology and approaches to solving the x86 limitations of virtualization have led to three different variants of virtual machines:

1. software virtual machines (see Figure 1), which manage interactions between the host operating system and guest operating system (e.g., Microsoft Virtual Server 2005);

2. hardware virtual machines (see Figure 2), in which virtualization technology sits directly on host hardware (bare metal) using hypervisors, modified code, or APIs to facilitate faster transactions with hardware devices (e.g., VMWare ESX); and

3. virtual OS/containers (see Figure 3), in which the host operating system is partitioned into containers or zones (e.g., Solaris Zones, BSD Jail).

A simple UNIX implementation called chroot allows an alternate directory path for the root file system. This creates a “jail,” or sandbox, for new or unknown applications. Isolated processes in chroot are best suited for testing and application prototyping. Unlike processes under emulators, they have direct access to physical devices.
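What chroot does to a jailed process can be sketched conceptually: every “absolute” path the process names is resolved beneath the jail root, so nothing outside the sandbox can be reached. The real mechanism is the chroot(2) system call (exposed in Python as `os.chroot`, which requires root privileges); the function below only models the resulting path resolution, and the jail path is a made-up example.

```python
import os.path

# Conceptual sketch of chroot-style path containment. A jailed process
# asking for "/etc/passwd" actually gets "<jail>/etc/passwd", and "../"
# sequences cannot climb above the jail root. This models the idea only;
# it is not a substitute for the chroot(2) system call.

def resolve_in_jail(jail_root, path):
    """Resolve `path` the way a process jailed at `jail_root` would see it."""
    # Treat the request as absolute inside the jail, collapsing any
    # "../" escape attempts against the jail's own root.
    inside = os.path.normpath("/" + path.lstrip("/"))
    return os.path.join(jail_root, inside.lstrip("/"))

print(resolve_in_jail("/srv/jail", "/etc/passwd"))      # /srv/jail/etc/passwd
print(resolve_in_jail("/srv/jail", "../../etc/passwd")) # still inside the jail
```

This is why chroot environments make useful sandboxes for unknown applications: the escape paths simply do not resolve to anything outside the jail.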

Figure 1: Software virtual machines.

Figure 2: Hardware virtual machines.

Figure 3: Virtual OS/containers virtual machines.

Sun Microsystems’ “Solaris Zones” technology is an implementation of chroot, similar to the FreeBSD jail design, with additional features. Zones allow multiple applications to run in isolated partitions on a single operating system [16]. Each zone has its own unique process table and management tools that allow each partition to be patched, rebooted, upgraded, and configured separately. Distinct root privileges and file systems are assigned to each zone.

Microsoft Corporation’s Virtual Server 2005 is a new virtual machine manager in the market. After acquiring virtual machine technology from software vendor Connectix in 2003, Microsoft introduced the Virtual Server 2005 product, which runs on a Windows 2003 host and, predictably, supports Windows guest operating systems only. At the time of publishing this paper, Virtual Server is limited to running on single-processor hosts and cannot support symmetric multiprocessing (SMP).

Jeff Daniels

SMP was introduced on RISC platforms, such as Sun SPARC and DEC Alpha chipsets, before being adopted on the x86 Intel Xeon and AMD Athlon processors. SMP allows multiple, identical chipsets to share one memory bank.

Instructions can be shared among the processors or isolated to a dedicated processor on the system. The system can share a workload with increased efficiency. A variation of SMP is AMD’s Opteron technology, which allows dual-processor chips. The Opteron uses DDR SDRAM memory dedicated to each processor, as opposed to a single shared memory bank. The multiprocessing nature of numerous virtual machine guest servers on one host makes dual-core Opteron chips an attractive platform.

Paravirtualization is a variant of full operating system virtualization. Paravirtualization avoids “drawbacks of full virtualization by presenting a virtual machine abstraction that is similar but not identical to the underlying hardware” [18]. This technique allows a guest operating system to be “ported” to run through a special API (application programming interface). The Xen paravirtualization research project, at the University of Cambridge, is a virtual machine monitor (hypervisor) that allows commodity operating systems to be consolidated and effectively mobilizes guests across physical devices. Xen currently supports only open source guest systems, though a Windows XP port is being developed. Denali is another paravirtualization implementation, but it requires significant modification to host system binaries and focuses on high-performance virtual machines.

EMC’s VMWare technology is the market leader in x86 virtualization technology. VMWare ESX Server uses a special hypervisor to “dynamically rewrite portions of the hosted machine code to insert traps wherever VMM intervention might be required” [1]. The VMWare solution is more costly, but it provides a robust management console and full-virtualization support for an array of guest operating systems including Solaris, Linux, Windows, and DOS.

Why Virtualization?

A recent Gartner survey revealed that “less than one-quarter of enterprises use virtual machines. However, more than 70 percent say they plan to test them in the near future” [12]. Data center floor space and rack space are prime real estate in computing environments. Cooling and electricity costs have risen in recent years. Infrastructure managers are looking to maximize the investment in existing computing power while keeping server sprawl and overhead costs in check.

Virtual servers generate hardware cost savings by allowing devices to be used to their full potential. Most distributed computing environments underutilize server capacity. Estimates for distributed, Windows-based servers indicate average capacity utilization of 8 to 12 percent; UNIX servers use 25 to 30 percent of their capacity on average [3]. Virtual server technology unlocks unused capacity and allows the CPU, memory, disk, and controllers to be maximized for each physical device. Based on performance measurements, testing, estimates, and trial and error, any number of virtual servers can be added to a physical device, thereby increasing server utilization to sustainable levels. Instead of purchasing expensive servers with unused or excess capacity, a new virtual machine can be created for an application. Maintenance costs are avoided on the idle servers, and floor space is freed for virtual server hosts. A manageable growth plan can be created to add virtual servers, host servers, and related services.
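The consolidation arithmetic behind these utilization figures is worth making explicit. A rough sketch, using the 8–12 percent and 25–30 percent averages cited above: the 70 percent host-utilization ceiling is an assumed planning target, not a figure from the article.

```python
# Back-of-envelope consolidation math. Guest utilization figures follow
# the estimates quoted in the text; the 70% host ceiling is an assumed
# planning target chosen to leave headroom for load spikes.

def guests_per_host(guest_util_pct, host_target_pct=70):
    """How many similarly sized guests fit on one host of equal capacity."""
    return host_target_pct // guest_util_pct

print(guests_per_host(10))  # Windows servers at ~10% average utilization
print(guests_per_host(27))  # UNIX servers at ~27% average utilization
```

Under these assumptions, roughly seven lightly loaded Windows servers, or two busier UNIX servers, could be consolidated onto a single equivalent host, which is where the hardware, floor-space, and maintenance savings come from.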

The cost to implement virtual machines has decreased significantly. Recent virtual machine monitors, hypervisors, and paravirtualization tools make it easy to create virtual machine instances, instead of developing virtual machine code. The 1980 paper “A Virtual Operating System” identifies two costs to implement virtual machines: the cost to write virtual machine software and implementation costs. The estimated labor to develop the initial virtual machine monitor was eight to ten person-months, with an estimated four person-months to port the entire system [7]. With current virtual machine monitors, an engineer can have an Oracle 10g cluster hosted on Red Hat Enterprise Linux running within minutes, basically the amount of time it takes to download the binaries.

While the development and implementation costs of virtual machines are significantly lower today than in 1980, “A Virtual Operating System” touches on another benefit of virtual machines: migration costs. Traditional systems are tied to server or desktop hardware. The life expectancy of servers is typically three to five years. When server technology becomes obsolete, the data must be migrated to a new platform, and applications must be reconfigured in the new environment. Worse, if the equipment is leased or acquired under a capacity services agreement, large-scale system migrations must occur at the end of the term in order to avoid contract penalties. Virtual machines make those transitions easier. VMWare offers a migration tool called P2V (physical to virtual machine), which helps engineers move from legacy environments to virtual servers. Platespin Ltd. offers a flexible suite of tools to automatically migrate between virtual and physical machines (and back again) and dynamically reallocate disk space, network configuration, unique identifiers, and other configuration settings. In contrast to traditional standalone systems, migrating virtual machines from one host platform to another is relatively simple in terms of configuration, man-hours, and resources required.

Licensing

Virtual servers can provide opportunities for software consolidation and reduced licensing costs. A Forrester study concludes Windows licenses and maintenance costs total $5,800 per year. Adapting to new virtual machine technology, many vendors have changed their licensing models to a “cost per instance” model instead of the “cost per processor” model.

Saving licensing fees when migrating from physical to virtual servers may not be effective under the cost per instance model. For example, Microsoft recently announced its new licensing model, noting that “per-processor licensed software will be licensed by virtual processor when used in a virtual OS environment and by physical processor when run in physical OS environments” [12]. However, virtual servers offer the ability to consolidate similar systems and software packages on common platforms to realize license cost savings.
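The two licensing models can be contrasted with a small hypothetical calculation. The prices below are invented for illustration; only the structure (licensing by physical processor versus by virtual instance) follows the text.

```python
# Hypothetical licensing arithmetic. Prices are made up for illustration;
# the point is that consolidating many guests onto few processors saves
# money under per-processor licensing but not under per-instance licensing.

def per_processor_cost(physical_cpus, price_per_cpu):
    """Total cost when software is licensed by physical processor."""
    return physical_cpus * price_per_cpu

def per_instance_cost(vm_instances, price_per_instance):
    """Total cost when software is licensed per virtual OS instance."""
    return vm_instances * price_per_instance

# Ten VMs consolidated onto one 4-CPU host, at a notional $3,000 per license:
print(per_processor_cost(4, 3000))   # per-processor: pay for 4 licenses
print(per_instance_cost(10, 3000))   # per-instance: pay for all 10 guests
```

Under these assumed numbers, the same consolidated host costs over twice as much under per-instance licensing, which is why the shift in vendor licensing models can erode the expected savings.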

Consolidation is a key driver for many organizations implementing virtual machine technology. Nearly 60 percent of IT managers are considering consolidation projects [11]. Consolidation efforts represent an attempt by IT management to capture cost savings by retiring or decommissioning legacy devices and standardizing support processes. Consolidation projects present the opportunity to minimize the number of physical devices as well as software licenses, various packages, and management tools.




Once legacy systems are migrated to a consolidated, virtual environment, standardized images can be built and cloned to ensure integrity. High availability systems and clustered environments can be quickly configured with virtual machines.

Strategic initiatives can start with standardized images as a launching pad for new application builds. When physical hosts need to be retired or phased out, virtual machines can easily be moved to the new platform with little interruption. Products such as Virtual Motion and Xen can move virtual machines on the fly, with little or no user downtime.

Virtualization in the IT Strategic Plan

Virtual servers should be a component in any Information Technology Strategic Plan (ITSP). As organizations plan for technologies, roadmaps are developed for one, three, five, seven, and more years out. For example, an ITSP might have biometric readers on a three-year plan, while an enterprise resource planning (ERP) upgrade is on a five-year outlook. Virtualization technologies will fall in the one-to-three-year planning cycle for most organizations.

The goal of IT projects and initiatives is to create business opportunities or generate cost savings. Virtualization is a key component in several planning areas:

• expanding business lines, such as shared and dedicated hosting;

• faster deployment, time to market;

• increased standardization, leading to lower total cost of ownership;

• consolidation efforts; and

• increased utilization of computing capital.

There are various other possibilities where virtual server technologies could create opportunities or cost savings. As business goals are defined and objectives determined by the business, virtualization technologies should be considered as one of the ways IT can help meet those goals.

Enterprise architecture is “the organizing logic for business process and IT infrastructure capabilities reflecting the integration and standardization requirements of the firm’s operating model” [13]. Enterprise architecture seeks to align business goals and organizational needs with technology. The idea is to plan, deploy, and manage technologies to meet business objectives. Similar to the IT strategic plan, virtualization technologies have their place in the enterprise architecture model.

Ross mentions two important concepts in her definition of enterprise architecture: integration and standardization. Virtual servers offer increasingly flexible methods of systems integration. Hot failovers, highly available systems, real-time relocation of virtual systems, dynamic reallocation of system resources, and even wide-area network disaster recovery (backup) are integrated with virtual servers. The “data center in a box” concept is a physical device with many integrated virtual servers that performs data center functions such as routing, messaging, and directory services.

Virtual servers go a long way towards standardization for infrastructure operations. Servers can be commoditized using the “gold image” model, where a virtual machine with the latest compliant system configuration is used to build new servers, ensuring standardization and change control. This also reduces the risk of misconfiguration or non-configuration of features that may occur due to human error when building and rebuilding physical systems. Common platforms serve as an enabler for business objectives and other enterprise architecture components. Initiatives such as ERP implementations and service-oriented architecture applications rely on infrastructure being available, standardized, and usable. Virtual server technologies can be used as a building block for standardization and integration in enterprise architectures.
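The “gold image” model can be sketched in miniature: every new server is stamped from one compliant template, so builds are identical by construction and only the per-host fields are ever typed by hand. All names and settings below are hypothetical; real gold images are disk images managed by virtualization tooling, not Python dictionaries.

```python
import copy

# Conceptual sketch of the "gold image" build model. The template holds
# the latest compliant configuration; new servers are clones of it plus
# per-host settings only. Every name and value here is invented.

GOLD_IMAGE = {
    "os": "SuSE Linux",
    "patch_level": "2009-09",
    "hardened": True,
    "services": ["ssh", "ntp", "monitoring-agent"],
}

def build_server(hostname, gold=GOLD_IMAGE):
    """Clone the gold image, then apply only the per-host settings."""
    server = copy.deepcopy(gold)   # never mutate the template itself
    server["hostname"] = hostname
    return server

web1 = build_server("web1")
web2 = build_server("web2")
# web1 and web2 are identical except for the hostname, so configuration
# drift between builds cannot creep in through manual rebuilds.
```

Change control then reduces to patching the one template and re-cloning, rather than auditing each hand-built system.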

Virtual Server Implementation

Implementation plans differ at every organization. What is applicable for one industry or business may not work for others. However, there are some common implementation techniques that transcend business lines.

VMWare, a leading vendor of virtualization products, uses the VMWare Infrastructure Methodology (VIM): assess, plan, build, manage. The process takes inventory of existing systems, creates a plan to “virtualize” the systems, installs and configures the hosts, and manages the new virtual server infrastructure. Many organizations will follow these steps even outside of the VIM methodology, though the figures, processes, and systems will differ.

Organizations tend to start using virtual servers in development systems, instead of production, to prove the new technology. Generally, the lower service levels and lower criticality of development systems make them an ideal choice for implementing and evaluating the impact to the environment before going to production.

Teranet, an e-commerce and government systems integrator, offers a modified approach: perform the assessment, build a business case for all servers, perform a proof of concept, build a business case for all development and test servers, and, finally, deploy in phases. Using this implementation methodology, Teranet successfully deployed more than 270 virtual servers at a cost savings of over $3 million.

The phased approach was also used by Moen, the faucet manufacturer. Moen went through four phases of implementation, each integrating more virtualization technologies in the data center. In Moen’s case, each phase had specific success criteria, such as cost avoidance, performance, cost reduction, and operating efficiencies [2]. The Moen team carefully evaluated success factors, process changes, and implementation goals following each phase. Moen also captured tangible and intangible costs and benefits throughout the implementation. Figures 4 and 5 show some of the types of costs and benefits identified by Moen.

Similar to the proof-of-concept approach, a pilot implementation is another way to “kick the tires,” so to speak. Pilots offer a quick win in many ways. Virtual server technology is proven during the pilot.

Figure 4: Moen tangible and intangible costs during implementation of virtual servers [2].




Figure 5: Moen tangible and intangible benefits during implementation of virtual servers [2].

The pilot team will test-drive the systems and test functionality in an operational environment or a small subset of systems. Pilots can promote virtualization success by sharing early wins with project management. Successful pilots allow users and project teams to gain valuable experience that will come in handy during full-scale production roll-outs.

Summary

Virtual machines have enjoyed a special niche within the information technology community over the years. Systems engineers and developers have continued to support virtual machines and push innovation in new ways. Virtual machines are gaining wider acceptance due to new efficiencies, ease of use, and users’ demands for flexibility. Hardware, software, and operating system (container) virtual server technologies are among the various virtual machine implementations.

There is no “one size fits all” virtual machine solution. Rather, many options are designed around specialized approaches. Hardware advances such as the AMD Opteron dual-core processors are making it possible to build powerful servers to host guest operating systems. Intel’s Vanderpool and AMD’s Pacifica next-generation architectures will allow more flexible virtual systems at the hardware level.

Data centers and IT management are implementing virtual server technology, often as part of a consolidation strategy. Cost savings in the areas of software license management, systems management, data center, and overhead costs, such as electricity, generators, and floor space, are key benefits for consolidated virtual server environments. IT managers trying to contain server sprawl, standardize and control systems, and build strategic platforms see virtual machine technology as an enabler.

Virtual storage area networks and grid computing are taking virtual machines to new levels. Advanced technologies such as high-performance computing, grid computing, and service-oriented architectures with dynamic allocation of resources are complementary solutions to virtual machines. From its inception on the mainframe to distributed servers on x86, the virtual machine has matured and will play an increasing role in systems management.

References

1. Barham, P., Dragovic, B., Fraser, K. et al. 2003. Xen and the art of virtualization. In Proceedings of the 19th ACM Symposium on Operating Systems Principles (SOSP ’03). 164-177.

2. Buchwald, R. 2005. Many happy returns: Techniques on how to identify VMware return on investment and how to use it to justify VMware expansion. VMWorld. Presentation SLN693 (Oct. 20).

3. Day, B. 2005. Identifying server consolidation cost savings. Forrester Research, Cambridge, MA.

4. Goldberg, R. P. 1971. Virtual machines—Semantics and examples. IEEE Computer Society Conference. 141-142.

5. Goldberg, R. P. 1971. Hardware requirements for virtual machine systems. In Proceedings of the Hawaii International Conference on System Sciences.

6. Goldberg, R. P. 1973. Architecture of virtual machines. Honeywell Information Systems, Inc., Billerica, MA.

7. Hall, D. E., Scherrer, D. K., and Sventek, J. S. 1980. A virtual operating system. Comm. ACM 23, 9.

8. Kreuter, D. 2004. Where server virtualization was born. Virtual Strategy Magazine (July 21).

9. Lichstein, H. A. 1969. When should you emulate? Datamation 15, 11. 205-210.

10. Mallach, E. G. 1972. Emulation: A survey. Honeywell Comput. J. 6, 4. 287-297.

11. ONStor, Inc. 2005. Top 10 requirements for effective server consolidation. www.onstor.com.

12. Park, A. R. and Gammage, B. 2005. Microsoft updates server licensing to enable virtualization. ID Number G00132810. Gartner Group, Stamford, CT.

13. Ross, J. W. 2007. Enterprise architecture as a strategy. Center for Information Systems Research, MIT Sloan-CISR.

14. Serjeant, A. 2005. Building a case for server consolidation. VMWorld. Presentation (Oct. 20).

15. Srodawa, R. J. and Bates, L. E. 1973. An efficient virtual machine implementation. In Proceedings of the ACM SIGARCH-SIGOPS Workshop on Virtual Computer Systems.

16. Tucker, A. and Comay, D. 2004. Solaris Zones: Operating system support for server consolidation. Sun Microsystems, Inc.

17. Varian, M. 1997. VM and the VM community: Past, present, and future. Office of Computing and Information Technology, Princeton University, Princeton, NJ.

18. Whitaker, A., Shaw, M., and Gribble, S. D. 2002. Denali: Lightweight virtual machines for distributed and networked applications. Tech. rep. 02-02-01. University of Washington.

Biography

Jeff Daniels ([email protected]) is a doctoral candidate in digital communications at Indiana State University in Terre Haute. He has authored papers and given numerous presentations at international conferences on topics including virtualization, security, and systems architecture. He is the recipient of several awards, including the Lockheed Martin Pinnacle Award for Excellence and the Lockheed Martin President’s Award, and holds a Master’s degree from Rensselaer Polytechnic Institute and a BS degree from the University of Central Florida.



Getting from Here to There: An Interview with Tronster Hartley, Senior Programmer at Firaxis Games


Crossroads: Can you explain what the difference is between your two jobs, Firaxis and Geek House Games?

Tronster Hartley: Sure. Firaxis Games is my day job. I work it Monday through Friday, roughly 40 hours a week, but during crunch time [when extreme overtime occurs in order for the company to meet its deadline] a little bit more. Geek House Games is more of a personal passion. On nights and weekends, myself along with other professionals, students, and indies [independent game developers] come together to work on a game that will go into realms that we might not have a chance to explore in our day-to-day activities.

Crossroads: Can you talk a little bit about why having something like Geek House Games is important, not just to you personally, but in terms of your career development?

TH: Before I even started working full-time in the game industry, I was always fascinated by games. I loved playing them. I would tinker around and make them in college during spring break.

After working a computer programming-based job in a few business sectors, I realized my passion for games had not diminished. I really missed doing game projects that were structured and organized, and I wanted something that would hold me accountable for finishing a game.

My hard drive had half a dozen or a dozen projects that were started but never completed. I realized that in order to hold myself accountable, it was important to establish a business entity and strive to make some sort of goal with milestones and deadlines that would force me to finish a game.

Once every year, I intend to submit a game to the IGF [Independent Games Festival, an annual competition that is well-recognized in the video game development industry for introducing experimental and innovative concepts]. Making games in my spare time, even before working full time for a AAA studio, actually helped get me a leg up when I started interviewing at some of the game studios.

One thing I tell students whenever I talk to them about the game industry is that even if they don’t have a job lined up or an internship lined up, the best thing they can do for their careers is to start making games right now. If the best prospect for them is to create a business entity, do that. If they are disciplined enough to make games on their own time and see it through from start to finish, then I recommend doing that—whatever works best for them.

Now that I’m full-time in the game industry, I’ve found that I am most valuable if I specialize in a particular area of programming. For me it’s been user interfaces. And while this is my focus, I still have a passion to do a bit more with computer graphics, game design, and pixel shaders. Every now and again, I even get the urge just to make a well-written system for playing sounds.

Since I’ve become very specialized in user interface, I put most of my energy into it during my day job. Occasionally I will get opportunities to do more in art or design; I welcome those opportunities. But when there are no opportunities outside of UI during my day job, I can always satisfy my other interests in what I do at Geek House Games.

Crossroads: I think something a lot of people don’t realize, especially when they are new to the workforce, is that they don’t have to be beholden to the thing that pays their bills 100 percent completely. That really is a difficult thing for many people.

TH: Very much so. Before breaking into the game industry—my first full-time job was working on a AAA title at BreakAway Games—I had a job at a start-up that was creating and supporting backup software.

Early on in my life I set some financial goals. I wanted to be making six figures by the time I was 30, and I was making well beyond that on contractor rates, but it didn’t make me happy.

When the opportunity arose at BreakAway, even though the salary was a third of what I was making, my quality of life was going to increase. At BreakAway, when I was coming out of meetings, we were talking about where we’d be placing Tiberium on a map [in the game Command & Conquer 3: Tiberium Wars] rather than what files had to be restored on someone’s hard drive.

Even crunching is different. I welcome crunching at a game company. Crunching happens at every other type of job I’ve been at, but I don’t think there’s been a single case, outside of the game industry, where I can say I’ve had a good experience from doing a crunch.

Crossroads: Just to back up, BreakAway Games is a studio that does serious games, or games with objectives other than entertainment, as well as AAA titles, right?

TH: Right. BreakAway is diversified. At least when I was there, there was an entertainment section and a serious games section. They have some of the most cutting-edge serious games technology. The people who built that technology were able to transfer their skill sets very well into the AAA space.

Tronster Hartley is a senior programmer at Firaxis Games, the video game development company best known for its Civilization series, as well as president of Geek House Games. In this interview with Crossroads, he explains how his career path was influenced by not only his computer science education, but also his willingness to experiment with game-making and interact with new people on his own time.


In the past, BreakAway had done a few contracts with EA to put out an expansion to Battle for Middle Earth, and when I came on board, I was hired to do UI work on Command & Conquer 3: Kane’s Wrath.

Crossroads: What is your educational background? I know you went to Ohio Wesleyan University.

TH: Yes. Before I talk about my degree, I’m going to take you way back, because it explains how and when I got interested in programming.

I’ve always had long-term goals on my mind. From kindergarten, I was going to be a chef, until third grade when I learned how to do AppleSoft BASIC, and from third grade on, I realized I wanted to make games. This was way back in the early 1980s, and I realized the only way I was going to make games was to get proficient with a computer, so I knew I would need to go for a CS degree. All through middle school and high school, that was the target.

OWU had a very good program. I think there were about seven people per class. The entire school has about 2,000 people on campus, so it was bigger than my high school, but still small enough to get the personal attention that I was looking for.

Crossroads: Did you find you were naturally adept at learning computing and programming before you started at Wesleyan?

TH: I did, but I was a bit of an ass in high school. I’m a big geek; I think one of the problems with geeks in general, myself included, is that we become very specialized in technology early on and it can breed a bit of arrogance. My arrogance was the biggest issue for me, especially amongst my friends. I would be taking an advanced placement computer science course in Pascal and I would be working with various libraries the teachers were not familiar with. I actually had an ASCII 3D rotating cube in a program for a help screen, when all my teacher wanted was a line of text along the lines of “This will count cards and score them for a hand of bridge.”

I was thirsting to do more with what I knew, and it did cause a lot of conflicts in high school. But once college started, I was quickly put in my place. A lot of it had to do with the curriculum and programming competitions. There was an ACM programming competition where OWU was represented by two teams. It was a fantastic experience, as the problems were challenging, showing me how much I still needed to learn. One of our teams placed; my team did not.

Crossroads: You were saying you started at Ohio Wesleyan and it had a small class size and you got a lot of personal attention. Talk a little bit more about what you studied there.

TH: Because of my AP scores in high school, I started immediately in the Assembly class. It involved a lot of low-level “register” work, which required an existing knowledge of programming. So my class was filled with sophomores and one other student who also jumped ahead via AP scores. Ironically, that student was the one who had hired me for the backup software job.

The course was hard, due to so many factors. It had a long time slot, started at 8 a.m., and involved looking at low-level code on a black-and-white LCD that was projected via an overhead projector in a dark room. I have this one memory of Dr. Zaring, our professor, showing a difficult concept via the computer, and then flipping on the lights. The six other people in the class were all asleep, heads on desks, except me. I was lucky enough to have had a Mountain Dew in one hand and a Surge in the other, double-fisting caffeine the whole time. It was hard picking up the concepts in this environment, but the small class size and availability of Dr. Zaring made it possible.

In my senior year I had finished a lot of the courses that were required to graduate, and instead, I was taking a lot more interesting courses, such as compilers and a computer graphics study. I was having a lot more fun learning concepts that I knew would be more immediately applicable to the projects I wanted to do when I graduated.

I also became the president of the student chapter of the ACM at Ohio Wesleyan. At that time, not many people on campus knew about the ACM. We continued that year to do all the stuff that had been done in the previous years, mostly computer competitions. But additionally, I wanted to do things that would make the ACM more visible.

Once a month, we would have an open, campus-wide “computer” movie night. One night we'd show Tron and another night we would show Lawnmower Man or War Games in the student areas. I see the ACM as being about computers and culture and the fact that we cannot live without computers today, and those types of movies helped bridge the gap between those of us who loved computers and those who felt they were a necessary evil. Now this is in the late 1990s, and today our society is even more reliant on them, but at that time, I felt it was key for the rest of the campus to understand how important computers were becoming.

Crossroads: So, you used movies as a hook into showing other people how this field could actually be applicable in their everyday entertainment lives, as well as the deep backchannel stuff that goes on.

TH: Right. It was one of two hooks that we tried, but it was the only successful one. The other hook was shut down by the administration. That year, for Valentine's Day we decided to have a match-making service. All the students submitted ballots and then we had a computer algorithm—one of the guys had figured out a matching system—and paired up students on campus. If people wanted to act on it they could, but they didn't have to. But some of the questions were a little cheeky, like, “How far is too far on the first date?” Even though the language wasn't crude, the dean pulled me aside and told me he was pulling the plug on our project.

There are a lot of misconceptions about what computers can and cannot do, and the people behind them.

Just today, I heard on the radio about a local college, which is offering some degree in computer security, and there was a voiceover of this woman who said, “When I nab my first hacker, I'm going to dedicate that to my sister!”

I grew up in a hacker culture, and am offended by that commercial. “Hacker” should not be synonymous with “evil person trying to commit crimes.” Having misperceptions of people who use computers, what they do with computers, and being able to assess who is doing good and who is doing bad, what it means to be doing good and what it means to be doing bad—those kinds of things were important to me when I was in college leading the ACM chapter, and continue to be important to me today.

Crossroads: I can imagine that that has a lot of crossover with working in the game industry, too, this whole notion of doing good. I would think it might be complicated at a company like BreakAway, where you were working on a violent title like Command & Conquer 3, but on the other hand, BreakAway does all this altruistic stuff in its serious games.

TH: Right. As games are becoming more realistic, as they're becoming better at what they're supposed to represent, the lines are becoming blurred, and I recognize that.

I loved playing the original Doom and the original Quake. But I can't play Doom 3 because I dislike horror movies. After the first level, it freaked me out so much that I had to put it down.

Some games, such as what the U.S. Army has put together, are amazing in terms of technology, but at the same time, are a little disturbing in what they are portraying.

But I don’t have a hard stance on violent video games; differentgames for different people. And while I don’t play violent games Ithink for the most part those types of games are used as a scapegoat,particularly when people act out and blame their actions on a game. Ido recognize that games, like movies, have the ability to evoke emo-tion. But neither games nor movies make a person behave outsidetheir norm. As I understand it, studies have been performed that showwhile aggressive people may play “aggressive” games, aggressive gamesdo not convert docile people into being more combative. At the most,playing violent games is a cathartic activity.

Crossroads: It’s really interesting to consider the fact that if you go into study computer science or programming, nobody would ever onthat basis accuse you of having bad intentions or doing some sort of illto the world, until it becomes clear that you intend to program videogames, or that you know about hacker culture. Those things are sointertwined in one sense, in the popular culture sense, but then in theacademic sense, we think of people who study computer science asbeing very very different, almost harmless or geeky. It’s kind of funny.

TH: It is.

Crossroads: I wanted to also ask about your involvement with the International Game Developers Association (IGDA). You're the president of your local chapter, is that right?

TH: I could be considered the president, but we call the position chapter chair. It's the person with the responsibility of coordinating the rest of the board.

In 2006, when I was creating Geek House Games, I wanted to get a local IGDA chapter started in Baltimore because although there was one in Washington D.C., that's a far commute from Baltimore, particularly after a full day's work.

I knew about the IGDA from attending a few Game Developers Conferences, but I had held off because I heard through the local grapevine that there was someone else at BreakAway who was already trying to start a local chapter.

And so while I waited for it to start, I began having meetings in my house with some friends. Once a month, everyone would come over and show what games they had, whether they were board games or computer games or computer systems. Once Geek House Games started to get rolling, I realized I should just try to create an IGDA chapter because I didn't know if the grapevine was correct or if that other person had time to follow through.

Once I contacted the IGDA headquarters, they put me in contact with Jonathan Hamel, who was the game designer at BreakAway looking to start a local chapter. He and I came together with Soren Johnson, who was working at Firaxis at the time, and Paul Stephanouk, who was a lead game designer at Big Huge Games.

The four of us talked over coffee about what it would be like for us to create a chapter, and then in 2006 we kicked off the first meeting at The Tree House Lounge. It was only later that I discovered this was a place where Microprose developers used to hang out in the 1980s. Jonathan was elected chair for the first two years, and I have been elected chair for the past two years.

Our chapter has been strong, with attendance split 50/50: half developers from the local AAA studios, and the other half indies and educational institutions with game-related programs.

Crossroads: What are some of those in your area?

TH: Studio-wise, we have Firaxis Games, ZeniMax, Day 1 Studios, Digital Steamworks, Big Huge Games, and BreakAway Games, and most recently Zynga and Kalypso have set up shop.

Crossroads: What are some of the universities or institutes?

TH: We have had tremendous support from UMBC [University of Maryland, Baltimore County], UB [University of Baltimore], and MICA [Maryland Institute College of Art]. They've been refining their programs for a few years, and they've had excellent curricula for getting students ready for game development. Recently Johns Hopkins University has started a game-related program, as well as a few other universities outside of the immediate Baltimore area.

Crossroads: What do you think those students get out of attending an IGDA meeting? Can you describe what goes on at the meetings your chapter has?

TH: We have a broad spectrum of meetings, but we always try to make the topic accessible to anyone regardless of what discipline they're in—programming, animation, game design, and so forth. We want to make it interesting enough so that after spending all day working on music or art or code, developers will still want to come out. If it's interesting to them, it should be interesting to the students as well.

We’ve done some postmortems of games. We’ve had Day 1 Studiostalk about how they built the engine they use inside of Fracture. We’vehad a few other companies outside Baltimore, such as Fallen Earth,come out to promote their upcoming MMO. Scaleform, who arelocated in D.C., came up and promoted their GFx 3.0 user interfacesolution about a month before they premiered it at GDC’09.

We occasionally have topics that are not tied directly to a game, but broader topics, such as what makes a game fun. This month we talked about all the new input devices that are out, from the additions on the Wiimote to Microsoft Natal, and hosted a roundtable discussion on how they're shaping the industry.

Once every year, we hold our biggest meeting, with slightly over a hundred attendees, where we have an indie and student showcase of games. In the last two years, BreakAway has been kind enough to host it. We set up game stations all around for students and indie game developers to set up their games. The first year we tried this format, Firaxis hosted it. I hope we can have each Hunt Valley studio host one of these meetings as time rolls on.

This year we were lucky to have Sid Meier and Firaxis's executive producer Barry Caudil make time to see the games. That was a treat for both the students, who got to see these established individuals try out their games, and employees at Firaxis, who got to hear Sid's thoughts on the new games and their mechanics later at his weekly design meeting.

Crossroads: To tie all these things together—being involved in the ACM during college, joining the IGDA, participating in events where you get to meet people like Sid Meier—it sounds like you're really talking about networking.

TH: The industry is very small. There are thousands of game developers out there, but it seems that everyone is just a few degrees off from knowing someone else inside the industry. I know at least here, it's a very tight-knit group inside of Hunt Valley, Maryland. Even when studios need to bring new people in, they'll usually pull from people whom they've worked with before, sometimes even from other cities.

During some recent layoffs in my local area, I knew of one person who didn't get hired into a new job because he had a reputation for not being very positive and not being an easy person to work with.

That's why networking is key: to be out there, to show that you do have a personality that makes people want to work with you. “Is this someone that I could go to if I needed help, someone I wouldn't mind sitting in an office with for two or three hours going over low-level code?” Trying to debug code is bad enough. Imagine having to do it with someone whom you don't even want to be in the same space with.

Crossroads: What kind of advice do you have for people looking to finish at university and go into the job market in the next year?

TH: Never stop learning. The most important thing college should have taught you or is teaching you is how to learn. The concept of learning is more important than knowing a particular skill or language. I've found that people who tend to keep learning have the opportunities to move up to the senior-level programmer, architect, and management positions.

Crossroads: Is that what you do with Geek House Games? You were saying before that it's a creative outlet, but it seems like it would also be a place where you could self-learn some things that you may not have time to do in a regular full-time job.

TH: Yes, that is spot on. For example, I wanted to learn more about graphics programming and pixel shaders. The engine that we're using at Geek House Games right now supports Pixel Bender, Adobe's version of pixel shaders, and while the language is a little different from what Xbox 360 or PCs are using right now, the fundamentals translate from one to the next.
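The idea that shader fundamentals carry across platforms can be illustrated with a minimal sketch. A pixel shader is essentially a pure function applied independently to every pixel; the sketch below expresses that model in Python purely for illustration (the function names and the tiny test image are invented here), whereas Pixel Bender, HLSL, or GLSL compile the same idea to run on a GPU.

```python
# Illustrative sketch only: a "pixel shader" is a pure function run once
# per pixel. Real shader languages (Pixel Bender, HLSL, GLSL) compile
# this idea to the GPU, but the underlying model is the same.

def grayscale_shader(pixel):
    """Map an (r, g, b) pixel to a luminance-weighted gray value."""
    r, g, b = pixel
    gray = int(0.299 * r + 0.587 * g + 0.114 * b)  # standard luma weights
    return (gray, gray, gray)

def apply_shader(image, shader):
    """Apply a per-pixel shader to a 2D image (a list of rows of pixels)."""
    return [[shader(p) for p in row] for row in image]

# A 2x2 test image: red, green / blue, white.
image = [[(255, 0, 0), (0, 255, 0)],
         [(0, 0, 255), (255, 255, 255)]]
result = apply_shader(image, grayscale_shader)
```

Because each pixel is computed independently, the same kernel can run serially on a CPU or massively in parallel on a GPU, which is why the fundamentals transfer between platforms.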

Likewise, we have an artist, Toby Franklin, who worked with me at Geek House Games on a game called Collide Reactor. He had limited experience with Flash, but working on that project, not only was he able to add to his 3D modeling portfolio, but he also increased how well he knew Flash and created games using a Flash-based pipeline.

Toby has since been picked up by Firaxis, after Collide Reactor came out. He's a good example of someone who benefited directly from networking. When he was interviewing here, I could say, “I worked with this guy on a project, and I can speak with a good deal of confidence to his work ethic, temperament, and ability.” And that's why I love doing Geek House Games. Besides helping me gain skills, it's also going to lend opportunities to students or others who haven't gotten a break yet.

—Crossroads Staff




CS Education in the U.S.: Heading in the Wrong Direction?

By Robert Dewar and Owen Astrachan

Last year, Edmond Schonberg and I published an article in CrossTalk (a U.S. Department of Defense software engineering journal) titled “Computer Science Education: Where Are the Software Engineers of Tomorrow?” in which we criticized the state of computer science education in U.S. universities [9]. The article caused quite a mini-storm of discussion and was picked up by Slashdot and also by Datamation in an article titled “Who Killed the Software Engineer? (Hint: It Happened in College)” [6].

In our CrossTalk article, we expressed the general concern that the computer science curriculum was being “dumbed down” at many universities, partly in an effort to bolster declining enrollments. The enrollment decline at many universities has been dramatic, and still has not shown much sign of recovery. The twin effects of the dot-com crash and concerns about the outsourcing of IT jobs seem to have convinced many parents and students that IT is not a field with a future, despite studies that project a shortage of software engineers in the near future [5]. Perhaps the global economic meltdown will slow this cycle a bit, but I tend to agree that we will be facing a real shortage of well-trained software engineers in the future.

So obviously the question is what do I mean by a well-trained software engineer? To me, the critical need is the knowledge required to build large, complex, reliable systems. It is undeniable that our society depends in a critical manner on complex software. This is not only in the familiar areas of safety-critical software like avionics systems, but also in everyday financial systems. For example, consider the report from Moody's stating that a bug in the Moody's computer system caused an incorrect AAA rating to be assigned to $1 billion worth of “constant proportion debt obligations” [8]. Now I do not know exactly what this means, but it is surely one of the variety of peculiar economic instruments that have been factors in the current financial crisis: the credit ratings provided by agencies such as Moody's are a critical element.

I frequently give talks on safety- and security-critical software, and whenever I give such a talk, I peruse the news the week before for stories on computer security failures. Prior to a talk last year, the high-profile stories receiving the most media attention included the break-in to vice presidential candidate Sarah Palin's email account and the successful hacking of the Large Hadron Collider Web site. Recently, one of my credit card companies reissued a card to me because a third-party database had been hacked (the credit card company would not identify the database).

I often encounter CS faculty members who take it for granted that all large computer systems are full of bugs and unreliable, and of course our experience with popular software such as Microsoft Windows reinforces this notion. The very use of the word “virus” is annoyingly misleading because it implies that such infections are expected and impossible to eliminate, when in fact it is perfectly possible to design reliable operating systems that are immune to casual attacks. Early in the history of eBay, its auction software failed for nearly a week, and the company lost billions of dollars in capitalization. At the time I wrote to the founders of eBay that they had a company with a huge value depending on one relatively simple software application, and that there was no excuse for this application being other than entirely reliable. I commented that if their software people were telling them that such failures were inevitable, they should all be fired and replaced; I never received a reply.

(This article originally appeared in the “Viewpoints” section of Communications of the ACM 52, 7, pp. 41-45. It is reprinted here with the authors' permission.)

So just what do we need to teach our students if they are to have the right viewpoint and skills to construct the complex reliable software systems of tomorrow, and to maintain, extend, and fix the systems in use today? In my experience, undergraduate computer science curricula simply do not regard complex software construction as a central skill to be taught. Introductory courses are dumbed down in an effort to make them fun and attractive, and have sacrificed rigor in designing and testing complex algorithms in favor of fiddling around with fun stuff such as fancy graphics. Most of these courses at this stage are using Java as a first language, and all too often Java is the only language that computer science graduates know well.

The original CrossTalk article was widely regarded as an anti-Java rant (one follow-up article was titled “Boffins Deride Java”) [4]. It is indeed the case that the use of Java complicates basic education of programmers. It's not impossible to teach the fundamental principles using Java, but it's a difficult task. The trouble with Java is twofold. First, it hides far too much, and there is far too much magic. Students using fancy visual integrated development environments working with Java end up with no idea of the fundamental structures that underlie what they are doing. Second, the gigantic libraries of Java are a seductive distraction at this level. You can indeed put together impressive fun programs just by stringing together library calls, but this is an exercise with dubious educational value. It has even been argued that it is useless to teach algorithms these days. It's as though we decided that since no one needs to know anything about how cars work, there is no point in teaching anyone the underlying engineering principles. It is vitally important that students end up knowing a variety of programming languages well; knowledge of Java libraries is not in itself sufficient.
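The distinction drawn here, between stringing together library calls and understanding what lies beneath them, is language-neutral; a sketch (in Python purely for illustration, with an invented function name) makes it concrete by contrasting one line of library “magic” with the explicit algorithm it hides.

```python
# Illustrative sketch: the one-line library call versus the underlying
# algorithm it conceals. The point is pedagogical, not that libraries
# should be avoided in production code.

def insertion_sort(items):
    """Sort a list by inserting each element into place, making the
    comparisons and shifts explicit rather than hidden in a library."""
    result = list(items)
    for i in range(1, len(result)):
        key = result[i]
        j = i - 1
        while j >= 0 and result[j] > key:
            result[j + 1] = result[j]  # shift larger elements right
            j -= 1
        result[j + 1] = key
    return result

data = [5, 2, 9, 1, 5, 6]
assert insertion_sort(data) == sorted(data)  # same answer, visible mechanics
```

A student who has written the loop above at least once knows roughly what the library call costs and why; a student who has only ever called `sorted` does not.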

Although the article was regarded as anti-Java, that misses the main point, which is that the curriculum lacks fundamental components that are essential in the construction of large systems. The notions of formal specification, requirements engineering, systematic testing, formal proofs of correctness, structural modeling, and so forth are typically barely present in most curricula, and indeed most faculty members are not familiar with these topics, which are not seen as mainstream. For an interesting take on the importance of a practical view, see Jeff Atwood's column discussing the need to teach deployment and related practical subjects [1].

Another area of concern is that the mathematics requirements for many CS degrees have been reduced to a bare minimum. An interesting data point can be found in the construction of the iFacts system [7], a ground-based air-traffic control system for the U.K. that is being programmed from scratch using SPARK-Ada and formal specification and proof of correctness techniques [2]. It has not been easy to find programmers with the kind of mathematical skills needed to deal with formal reasoning. And yet, such formal reasoning will become an increasingly important part of software construction. As an example, consider that of the seven EAL levels of the Common Criteria for security-critical software, the top three require some level of formal reasoning to be employed [3].
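The flavor of this kind of formal reasoning can be hinted at with a contract-style sketch. The example below is a rough illustration in Python with runtime assertions (the function and its bounds are invented here); in SPARK, equivalent pre- and postconditions are written as annotations and discharged by a prover before the program ever runs, rather than checked at runtime.

```python
# Illustrative sketch of design-by-contract: preconditions and
# postconditions stated explicitly alongside the code. In SPARK such
# annotations are proved statically; here they are runtime checks.

def clamp(value, low, high):
    """Clamp value into the closed interval [low, high]."""
    assert low <= high, "precondition: interval must be non-empty"
    result = max(low, min(value, high))
    assert low <= result <= high, "postcondition: result lies in bounds"
    assert (result == value) or (value < low) or (value > high), \
        "postcondition: value is unchanged unless it was out of bounds"
    return result
```

Writing the specification down at all is the skill in short supply; proving that the body always satisfies it, for every possible input, is where the mathematical training comes in.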

It is true that a lot of software development is done under conditions where reliability is not seen as critical, and the software is relatively simple and not considered safety- or security-critical. However, if this is all we train students for, then we won't have the people we need to build large complex critical systems, and furthermore this kind of simple programming is exactly the kind of job that can be successfully transferred to countries with less expensive labor costs. We are falling into a trap of training our students for outsourceable jobs.

The original article in CrossTalk was based on our observations as faculty members and as software company entrepreneurs, rather than on a carefully researched study. When several people asked us for data to back up our claims, we had none to offer. Since then, however, it has been very interesting to read the flood of email we received in response to this article. In hundreds of messages, we did not get anyone saying “What are you talking about? We have no trouble hiring knowledgeable students!” On the contrary, we got hundreds of messages that said “Thank you for pointing out this problem; we find it impossible to hire competent students.” One person related an experience where he had a dump from a customer for a program that had blown up and was sifting through it trying to determine what was causing the problem. A newly hired student asked him what he was doing, and he said that he was disassembling the hex into assembly language to figure out the problem. The student, who had always considered himself superior because of his computer science degree, replied “Oh yes, assembly language, I've heard of that,” and was amazed that the senior programmer (whose degree was in music) could in fact figure out the problem this way.

Another company noted that it had found it a complete waste of time to even interview graduates from U.S. universities, so they added at the end of the job description the sentence “This work will not involve Web applications or the use of Java,” and that had served to almost completely eliminate U.S. applicants. Here was a case of domestic outsourcing where they were looking for people in the U.S. who had been trained in Europe and elsewhere and were better educated in the fundamentals of software engineering. These are just two examples of many similar responses, so it is clear that we have hit on a problem here that is perceived by many to be a serious one.

Biography
Robert Dewar ([email protected]) is a professor emeritus of computer science at the Courant Institute of New York University and is co-founder, president, and CEO of AdaCore.

References
1. Atwood, J. 2008. How should we teach computer science? Coding Horror. http://www.codinghorror.com/blog/archives/001035.html.

2. Barnes, J. 2003. High Integrity Software—The SPARK Approach to Safety and Security. Addison-Wesley.

3. Common Criteria. 2006. Common criteria for information technology security evaluation, Version 3.1. http://www.commoncriteriaportal.org.





4. Farrell, N. 2008. Boffins deride Java. The Inquirer. http://www.theinquirer.net/gb/inquirer/news/2008/01/08/boffins-deride-java.

5. Maloney, P. and Leon, M. 2007. The state of the national security space workforce. http://www.aero.org/publications/crosslink/spring2007/01.html.

6. McGuire, J. 2008. Who killed the software engineer? (Hint: It happened in college.) Datamation. http://itmanagement.earthweb.com/career/article.php/3722876.

7. National Air Traffic Services. NATS pioneers biggest ATC advance since radar. http://www.nats.co.uk/article/218/62/nats_pioneers_biggest_atc_advance_since_radar.html.

8. Oates, J. 2008. Moody's to fix sub-prime computer error. The Register. http://www.theregister.co.uk/2008/07/03/moodys_computer_bug.

9. USAF Software Technology Support Center (STSC). 2008. Computer science education: Where are the software engineers of tomorrow? CrossTalk. http://www.stsc.hill.af.mil/CrossTalk/2008/01/0801DewarSchonberg.html.

Counterpoint: Owen Astrachan
Robert Dewar has graciously shouldered the task of castigating the language commonly used in introductory programming courses. Dewar, like Edsger Dijkstra [13] and others before him, holds the language at least partially responsible for, and decries the state of, computer science curricula; he then attempts to use the programming language as a lever to move curricula in a particular direction. However, the lever of the introductory programming language is neither long enough nor strong enough to move or be responsible for our curricula. Attempts to use it as such can generate discussion, but often more heat than light. The discussion is often embroiled in fear, uncertainty, and doubt (aka FUD) rather than focused on more profound issues.

There are definite elements of FUD in the arguments offered by Dewar, just as there have been by his predecessors in making similar arguments. Whereas Dijkstra lamented “the college pretending that learning BASIC suffices or at least helps, whereas the teaching of BASIC should be rated as a criminal offense: it mutilates the mind beyond recovery,” we see Dewar noting that “It's not impossible to teach the fundamental principles using Java, but it's a difficult task.” Dewar and Dijkstra perhaps would like us to return to the glorious days of text editors and punch cards rather than “fancy visual IDEs.” However, the slippery slope of assumption that the new generation just doesn't get it leads to the Sisyphean task of pushing the pebble of language, be it BASIC or Java, uphill against the landslide of boulders that represents the reality of computer science. This is the case regardless of whether we're in Dijkstra's world of 25 years ago, the world of 2009, or the Skynet world of tomorrow—which is probably closer than we think.

I don’t mean to suggest that Dewar and Dijkstra are arguing for thesame thing. Dewar would like computer science programs to producewell-trained software engineers who can build large complex reliablesystems. Dijkstra excoriated software engineering at every opportunityfixing as its charter the phrase “how to program if you cannot.” Bothmiss part of the bigger picture in the same way that Stephen Andriolemissed it in the July 2008 Communications Point/Counterpoint “Tech -

nol ogy Curriculum for the Early 21st Century” [10]. In his Counter point,Eric Roberts points out the flaw of “generalizing observations derivedfrom one part of the field to the entire discipline.” Computer science pro-grams must embrace a far wider audience than software engineersbuilding secure systems. Many top programs are housed in schools ofArts and Sciences rather than in Engineering, many have chosen not tobe accredited by CSAB/ABET Students may choose computer scienceas a stepping-stone to law, medicine, philosophy, or teaching rather thanas a foundation for becoming a programmer or software engineer.Schools like Georgia Tech are developing innovative programs to ad-dress the different needs of diverse audiences: students looking tocomputer science as the basis for visual studies or biology rather thanpreparing them for a software-oriented career. There is no one-size-fits-all solution to addressing the skills and knowledge needed to succeedin these areas. Should we expect Craig Venter or Gene Myers to askcomputer science programs to include more computational biology be-cause the demand for bioinformaticians exceeds supply? Will we be sur-prised if Ken Perlin asks for programs to embrace games and graphicsmore than they do to ensure a steady supply of people interested in an-imation or computer-generated imagery? We are discussing the re-quirements and curricula of an undergraduate degree! Our programscan certainly build a superb foundation on which students can continueto gain knowledge and skills as they work and study in different areas,but we should no more expect students to be expert or even journey-men than we expect our premed students to be able to remove an ap-pendix after four years of undergraduate study.

As Fred Brooks reminded us more than 20 years ago, there is no silver bullet that will solve the problems endemic to software development, nor is there a panacea to cure the ills that may plague computer science curricula and programs [11]. Studying more mathematics will not make software bugs disappear, although both Dijkstra and Dewar seem to think so. Dewar points out the need for “formal specification and proof of correctness techniques” as foundational for software development using Ada. Dijkstra tells us “where to locate computing science on the world map of intellectual disciplines: in the direction of formal mathematics and applied logic,” but pines for Algol rather than Ada. Both miss Brooks' point about the essential complexity of building software, the essence in the nature of software. In a wonderful treatise that has more than stood the passage of 20 years, and in which he presciently anticipated the tenets of Agile software methodologies, Brooks claims that “building software will always be hard,” and that this essence will not yield dramatic improvements to new languages, methodologies, or techniques.

Brooks has hopes that the essential aspects and difficulties of software may be improved by growing software rather than building it, by buying software rather than constructing it, and by identifying and developing great designers. He differentiates between essential and accidental aspects of software, where accidental is akin to incidental rather than happenstance. Changing programming languages, using MapReduce or multicore chips, and employing a visual IDE in introductory courses address these accidental or incidental parts of software development, but these don't mitigate the essential problems in developing software nor in educating our students. As Brooks notes, addressing these accidental aspects is important—high-level languages offer dramatic improvements over assembly-language programming both for software design and for introductory programming courses.



Brooks’ view, which I share, calls for “Hitching our research to someone else’s driving problems, and solving those problems on the owners’ terms, [which] leads us to richer computer science research” [12]. I will return to problem-driven approaches later.

It would seem from the juxtaposition of amusing anecdotes regarding flawed software systems that Dewar would like to make the academic community and the larger computer science and software communities aware that a simple change in attitude and programming language in our colleges and curricula will help make the world more secure and safe with respect to the reliable systems on which it depends. Although software runs on computers, it produces outputs and effects that transcend computers. It was not a simple bug in Moody’s computer system that caused constant proportion debt obligations to be incorrectly assigned the AAA rating. The model that Moody’s used was likely incorrectly parameterized. Even if the flaw was related to code rather than to a model, Moody’s correction of the model did not lead to a change in the AAA rating as it should have, because of larger and more deeply entrenched financial and political concerns. Standard & Poor’s model also assigned the AAA rating to the same constant proportion debt obligations. Both services eventually lowered their ratings, but arguably these actions were insufficient.

Blaming the current economic crisis even in part on software errors is more than a stretch. Similarly, Dewar notes that U.S. vice presidential nominee Sarah Palin’s email account was compromised and that a Web site was hacked, implying these are security failures that might be fixed if only we didn’t use Java in our introductory courses. Because Governor Palin used Yahoo mail for what appears to be at least semiofficial business, her password recovery mechanisms were based on publicly available information such as her birthday, and her hacked email was posted on 4chan and Wikileaks: this is a case study in social engineering rather than one in secure systems.

Dewar’s claim that Java is part of a “dumbing down” of our curricula has been echoed in other venues, notably by Joel Spolsky [15] and Bjarne Stroustrup [14]. However, Stroustrup notes that it isn’t the language that’s the problem—it is attitude. He says, and I agree, that: “Education should prepare people to face new challenges; that’s what makes education different from training. In computing, that means knowing your basic algorithms, data structures, system issues, etc., and the languages needed to apply that knowledge. It also means having the high-level skills to analyze a system and to experiment with alternative solutions to problems. Going beyond the simple library-user level of programming is especially important when we consider the need to build new industries, rather than just improving older ones.”

These articles, like Dewar’s, associate Java with a “dumbing down” of curricula. Spolsky specifically mentions the school at which I teach as one of the new JavaSchools. He laments that our students are lucky in that: “The lucky kids of JavaSchools are never going to get weird segfaults trying to implement pointer-based hash tables. They’re never going to go stark, raving mad trying to pack things into bits.”

We didn’t become a JavaSchool because we wanted to avoid segfaults, pointers, and bits. We use the same assignments and have the same attitude we did when we used C++. We switched from C++ for well-founded pedagogical reasons: Java is a better teaching language for the approach we were using than C++. Note that I’m not claiming Java is the best language for every program, but we spend much more time in our courses dealing with the Brooksian essence of programming, algorithms, and software using Java rather than with the accidental aspects symbolized by the kind of cryptic error messages that result from misusing the STL in C++. Our switch to Java was grounded neither in perceived demands from industry nor in an attempt to attract majors to our program, but in working to ensure that our beginning courses were grounded in the essence of software and algorithms.

We must work to ensure we attract motivated and capable students, not because it is incumbent on us as faculty to train the next generation of software engineers, but because it is our responsibility as educators and faculty to encourage passion and to nurture and increase the amazing opportunities that computing is bringing to our world. It is highly likely that some programming languages are better for teaching, others are better for Ajax applications, and the right flavor of Linux makes a difference. But we shortchange our students and ourselves if we live at the level of what brand of brace and bit or drill is best for a carpenter. Instead, we should look for problems that motivate the study of computing, problems that require computation in their solution.

Just as we cannot escape the essential complexity and difficulty of developing software, we cannot escape the essence of undergraduate education. We each bear the burden of our past experiences in constructing models for education. In my case this is the grounding of computer science as a liberal art, since my education began in that realm. For others, computer science is clearly an engineering discipline, and to others still it is a science akin to biology or physics. We don’t need to look for which of these is the correct view; they are all part of our discipline. The sooner we accept differing views as part of the whole, rather than insisting that our personally grounded view is the way to look at the world, the sooner we will make progress in crafting our curricula to meet the demands and dreams of our students.

Biography
Owen Astrachan ([email protected]) is professor of the practice of computer science at Duke University and the department’s director of undergraduate studies for teaching and learning.

References
10. Andriole, S. J. and Roberts, E. 2008. Technology curriculum for the early 21st century. Comm. ACM 51, 7. 27-32.

11. Brooks, F. 1987. No silver bullet: Essence and accidents of software engineering. IEEE Computer 20, 4. 10-19. (Reprinted in The Mythical Man-Month: Essays on Software Engineering, Anniversary Edition. Addison-Wesley, 1995.)

12. Brooks, F. 1996. The computer scientist as toolsmith II. Comm. ACM 39, 3. 61-68.

13. Dijkstra, E. 1984. Keynote address at ACM South Central Regional Conference. http://www.cs.utexas.edu/users/EWD/transcriptions/EWD08xx/EWD898.html.

14. Maguire, J. 2008. Bjarne Stroustrup on educating software developers. Datamation. http://itmanagement.earthweb.com/features/article.php/3789981/.

15. Spolsky, J. 2005. The perils of JavaSchools. Joel on Software. http://www.joelonsoftware.com/articles/ThePerilsofJavaSchools.html.

Robert Dewar & Owen Astrachan

