Multi-Touch for Software Exploration

Sandro Boccuzzo
Department of Informatics
University of Zurich
Zurich, Switzerland
boccuzzo@ifi.uzh.ch

Harald C. Gall
Department of Informatics
University of Zurich
Zurich, Switzerland
gall@ifi.uzh.ch

Abstract—The design of software systems is often so intricate that no individual can grasp the whole picture. Multi-Touch screen technology combined with 3D software visualization offers a way for software engineers to interact with a software system in an intuitive way. In this paper we present first results on how such emerging technologies can be used to explore software systems.

Keywords-Human-computer interaction, Maintenance and evolution, Software visualization

I. INTRODUCTION

Work in software visualization has come a long way in presenting complex relationships in an intuitive, self-explaining manner. Semi-automated processes support engineers in finding points of interest while exploring software projects. Our goal is to combine emerging technologies, such as multi-touch tables, with software visualization and exploration, to help software engineers investigate such points of interest and consult colleagues involved in the project.

The main contribution of our work is a first adoption of emerging multi-touch tables in the context of software visualization and software exploration. Concepts and multi-touch features have been integrated into our CocoViz tool.¹

II. TOUCHING SOFTWARE

Software visualization aims to reveal relevant aspects of a complex system as fast as possible. Because relationships between software entities have a complex nature, discovering such aspects in a static visualization can be hard. Explorative software visualizations such as polymetric views [5] allow an interactive approach by offering filter and customization facilities to limit the number of software entities in a view.

Our focus is to simplify the workflow for software exploration with a seamless integration of exploration concepts.

With the recently emerging multi-touch technologies, we investigated new input devices suitable for software exploration. Multi-touch technologies are based on a touch-sensitive surface that recognizes multiple touch points at the same time. Interaction typically occurs with a stylus (digital pen), tagged objects, or simply a finger's touch. One advantage over the mouse is that, using the fingers, one can interact with a virtual representation in a way that is closer to how anyone handles natural objects [1].

¹ This work was partially supported by the Hasler Foundation Switzerland within the project “EvoSpaces II”.

In our context, we can especially benefit from this natural and intuitive interaction, because our cognitive software visualization approach deals with graphical elements representing a metaphor known from daily life, such as houses.

However, when using software visualization tools on a multi-touch device, usability can suffer compared to a mouse- and keyboard-based approach. This is mainly because existing software visualizations often expect the user to enter text, or come with pre-configuration steps that rely heavily on mouse usage.

In order to overcome these limitations, we investigated how to improve the efficiency of these workflows on a multi-touch device and found a solution based on our work on automated comprehension tasks [3]. Such tasks allow one to automate the configuration of a visualization, automate analyses of a system for the most common software comprehension tasks [9], and reduce the workload. Adapting this concept to a multi-touch device allowed us to simplify access to further analyses during software exploration and, in particular, to benefit from a reduced workload.
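As an illustration of this idea, the following sketch shows one possible way to bind predefined comprehension tasks to gesture names, so that a single recognized gesture replaces text entry and configuration dialogs. This is a minimal sketch under our own assumptions, not the CocoViz implementation; the names ComprehensionTaskRegistry, configureView, and runAnalysis are hypothetical.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: bind ready-made comprehension tasks to gesture names,
// so a recognized gesture can trigger a pre-configured analysis directly.
public class ComprehensionTaskRegistry {

    // A task bundles the view configuration and the analysis it automates.
    public interface ComprehensionTask {
        void configureView();  // e.g., select metrics, filters, and layout
        void runAnalysis();    // e.g., answer a common comprehension question
    }

    private final Map<String, ComprehensionTask> tasksByGesture = new HashMap<>();

    public void bind(String gestureName, ComprehensionTask task) {
        tasksByGesture.put(gestureName, task);
    }

    // Called by the gesture handler once a gesture has been recognized.
    public void onGesture(String gestureName) {
        ComprehensionTask task = tasksByGesture.get(gestureName);
        if (task != null) {
            task.configureView();
            task.runAnalysis();
        }
    }
}
```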

III. MULTITOUCH ARCHITECTURE

Nowadays there is a diversity of multi-touch technologies at different sizes, from multi-touch tables² down to mobile phones.³ The common idea behind them is to track the movement of the fingers on a display and map those movements to events in the system. The interaction becomes as intuitive and natural as grabbing and moving objects in our everyday life.
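To make this mapping concrete, the sketch below models the per-frame data a touch-sensitive surface typically delivers to the application layer. It is an illustrative assumption, not tied to any particular multi-touch SDK; the names TouchPoint and TouchTracker are hypothetical.

```java
import java.util.List;

// Hypothetical model of the raw data a multi-touch surface reports:
// every active finger is tracked as a touch point with a stable id.
public final class TouchPoint {
    public enum Phase { DOWN, MOVE, UP }

    public final int id;         // stays constant while the finger remains on the surface
    public final float x, y;     // position in screen coordinates
    public final long timestamp; // milliseconds, useful for path and velocity analysis
    public final Phase phase;

    public TouchPoint(int id, float x, float y, long timestamp, Phase phase) {
        this.id = id;
        this.x = x;
        this.y = y;
        this.timestamp = timestamp;
        this.phase = phase;
    }
}

// The tracker pushes one callback per hardware frame with all active touches;
// the application maps these movements to its own events.
interface TouchTracker {
    void onFrame(List<TouchPoint> activeTouches);
}
```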

To adapt multi-touch to the software engineering context, we built our own gesture handling framework in CocoViz.

We implemented the multi-touch functionality in additional dedicated controllers that interact with the main controllers of our Model-View-Controller architecture [8].

² www.microsoft.com/surface/, last checked 18.2.2010
³ www.apple.com/iphone/, last checked 18.2.2010


Figure 1. a) A three-touch gesture in CocoViz; b) optimized configuration

The gesture controller has a set of registered gestures that are loaded at runtime and enabled or disabled according to the use case. Whenever a touch event occurs, the gesture controller evaluates how many touches are still active and passes the current event, together with the other still-active touch events, to the enabled and applicable gesture methods. If a gesture applies, a gesture event with the relevant information is sent to the application's event handler.
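The sketch below illustrates this dispatch logic as described above: gestures are registered and toggled per use case, every raw touch event updates the set of active touches, and the enabled gestures are asked whether they apply. It reuses the hypothetical TouchPoint type from the earlier sketch; GestureController, Gesture, and GestureEvent are likewise illustrative names, not the actual CocoViz classes.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative gesture controller sitting next to the main MVC controllers.
public class GestureController {

    // A recognizer inspects the currently active touches and either matches or not.
    public interface Gesture {
        String name();
        GestureEvent recognize(List<TouchPoint> activeTouches); // null if it does not apply
    }

    public static final class GestureEvent {
        public final String name;
        public final List<TouchPoint> touches;
        public GestureEvent(String name, List<TouchPoint> touches) {
            this.name = name;
            this.touches = touches;
        }
    }

    public interface EventHandler {
        void onGesture(GestureEvent event);
    }

    private final Map<String, Gesture> registered = new HashMap<>();      // loaded at runtime
    private final Map<String, Boolean> enabled = new HashMap<>();         // toggled per use case
    private final Map<Integer, TouchPoint> activeTouches = new HashMap<>();
    private final EventHandler handler;

    public GestureController(EventHandler handler) {
        this.handler = handler;
    }

    public void register(Gesture gesture) {
        registered.put(gesture.name(), gesture);
        enabled.put(gesture.name(), true);
    }

    public void setEnabled(String gestureName, boolean on) {
        enabled.put(gestureName, on);
    }

    // Called for every raw touch event coming from the surface.
    public void onTouch(TouchPoint touch) {
        // Keep track of which touches are still active.
        if (touch.phase == TouchPoint.Phase.UP) {
            activeTouches.remove(touch.id);
        } else {
            activeTouches.put(touch.id, touch);
        }
        // Offer the current event, together with the other active touches,
        // to every enabled gesture; the first match is sent to the handler.
        List<TouchPoint> snapshot = new ArrayList<>(activeTouches.values());
        for (Gesture gesture : registered.values()) {
            if (!enabled.getOrDefault(gesture.name(), false)) {
                continue;
            }
            GestureEvent event = gesture.recognize(snapshot);
            if (event != null) {
                handler.onGesture(event);
                break;
            }
        }
    }
}
```

In such a design, a concrete gesture only has to implement recognize(); the controller handles the bookkeeping and forwards the resulting event to the application.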

With our gesture handler, we were able to go beyond the limited set of common gestures (pinch, rotate, swipe) and now offer gestures adequate for our software exploration context. Besides the common gestures, CocoViz currently offers an extended set of single-, dual-, and multi-touch gestures:

A circle gesture is triggered when the touch-move path resembles a segment of a circle; it asks for details about a selected entity group (a sketch of such a path-based check follows the gesture list).

A wave gesture is triggered when the touch-move path resembles a sine wave; it presents a historical view of the selected entity.

A control-drag gesture is triggered whenever one finger is kept still and a second finger touches the screen to its right; it shows a context menu.

A three-touch gesture is used to move a software entity independently of its layout and to preserve it, even when a visualization changes its filters or visible entities (Fig. 1a).

A four-touch gesture is used to access an entity's audio source as described in previous work [2].
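The path-based check referenced above for the circle gesture can be approximated with a simple geometric heuristic on the recorded touch-move path: estimate a centroid, require the samples to stay close to a mean radius, and require a minimum swept angle. The sketch below illustrates such a heuristic with arbitrary example thresholds; it is not the recognition code used in CocoViz.

```java
import java.util.List;

// Hypothetical heuristic: does a single-touch move path resemble a circle segment?
public final class CircleGestureCheck {

    // A recorded sample of the touch-move path.
    public static final class Pt {
        public final double x, y;
        public Pt(double x, double y) { this.x = x; this.y = y; }
    }

    public static boolean resemblesCircleSegment(List<Pt> path,
                                                 double maxRadiusDeviation, // e.g. 0.25 = 25 %
                                                 double minSweptAngleRad) { // e.g. Math.PI = half circle
        if (path.size() < 8) {
            return false; // too few samples to judge
        }

        // Centroid of the recorded samples approximates the circle center.
        double cx = 0, cy = 0;
        for (Pt p : path) { cx += p.x; cy += p.y; }
        cx /= path.size();
        cy /= path.size();

        // All samples must stay within a tolerance band around the mean radius.
        double meanRadius = 0;
        for (Pt p : path) { meanRadius += Math.hypot(p.x - cx, p.y - cy); }
        meanRadius /= path.size();
        if (meanRadius < 1e-6) {
            return false; // degenerate path (all points coincide)
        }
        for (Pt p : path) {
            double r = Math.hypot(p.x - cx, p.y - cy);
            if (Math.abs(r - meanRadius) / meanRadius > maxRadiusDeviation) {
                return false;
            }
        }

        // Accumulate the signed angle swept around the centroid.
        double swept = 0;
        for (int i = 1; i < path.size(); i++) {
            double a0 = Math.atan2(path.get(i - 1).y - cy, path.get(i - 1).x - cx);
            double a1 = Math.atan2(path.get(i).y - cy, path.get(i).x - cx);
            double d = a1 - a0;
            while (d > Math.PI)   { d -= 2 * Math.PI; } // unwrap into (-pi, pi]
            while (d <= -Math.PI) { d += 2 * Math.PI; }
            swept += d;
        }
        return Math.abs(swept) >= minSweptAngleRad;
    }
}
```

A wave gesture could be detected analogously, for instance by counting sign changes of the vertical direction along the path.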

IV. RELATED WORK

One of the first multi-touch systems, the 'Flexible Machine Interface', was described by Nimish Mehta in 1982 [7]. It consisted of a glass panel showing a black spot whose size depended on finger pressure, allowing multi-touch input for picture drawing with simple image processing.

Lee et al. [6] presented a prototype of a touch-sensitive tablet capable of sensing more than one point of contact at a time. In their work they also discuss how multi-touch sensing, interpolation, and degree-of-contact sensing can be combined to expand the vocabulary of human-computer interaction.

With DiamondTouch [4], Dietz and Leigh describe a technique for creating a touch-sensitive input device that allows multiple, simultaneous users to interact in an intuitive fashion.

Our CocoViz approach builds on these ideas of multi-touch technologies to sense multiple points of contact with the touchscreen and to detect intuitive gestures suitable for software exploration.

V. CONCLUSIONS & FUTURE WORK

In this paper we discussed how emerging technologies such as multi-touch tables are used in software visualization and exploration. A proof of concept has been implemented in our CocoViz tool.

We are currently evaluating the use of multi-touch tables in combination with 3D software visualization in a collaborative environment. Our goal is to use the right technology in the right situation.

REFERENCES

[1] S. Boccuzzo and H. C. Gall. CocoViz: Towards cognitive software visualization. In Proc. IEEE Int'l Workshop on Visualizing Softw. for Understanding and Analysis, 2007.

[2] S. Boccuzzo and H. C. Gall. Software visualization with audio supported cognitive glyphs. In Proc. Int'l Conf. on Softw. Maintenance, 2008.

[3] S. Boccuzzo and H. C. Gall. Automated comprehension tasks in software exploration. In Proc. Int'l Conf. on Automated Softw. Eng. (ASE), 2009.

[4] P. Dietz and D. Leigh. DiamondTouch: A multi-user touch technology. In Proc. UIST, pages 219–226, November 2001.

[5] M. Lanza and S. Ducasse. Polymetric views — a lightweight visual approach to reverse engineering. IEEE Trans. on Softw. Eng., 29(9):782–795, 2003.

[6] S. Lee, W. Buxton, and K. C. Smith. A multi-touch three dimensional touch-sensitive tablet. In Proc. of the ACM Conf. on Human Factors in Computing Systems (CHI '85), pages 21–25, April 1985.

[7] N. Mehta. Flexible Machine Interface. M.A.Sc. Thesis, Department of Electrical Engineering, University of Toronto, 1982.

[8] T. Reenskaug. Models-Views-Controllers. Technical report, Xerox PARC, 1979.

[9] J. Sillito, G. C. Murphy, and K. De Volder. Questions programmers ask during software evolution tasks. In Proc. SIGSOFT Foundations of Softw. Eng. Conf. (FSE), 2006.
