TRANSCRIPT
Augmenting Intellect, Part 2
Professor Michael Terry
February 8, 2007
CS497 / 2
Talk Overview
• Overview of HCI
  – Types of problems investigated
  – Goals of HCI
• Augmenting intellect
  – Historical roots
  – Challenges
  – Ill-defined problems
  – Interface-level support
  – Open problems
CS497 / 3
Challenges to Augmenting Intellect
• Evaluation
  – What is intellect and can we actually augment it?
  – How can we measure success/failure of tools intended to augment intellect?
• Design
  – How can we reliably (consistently) design computational devices to augment intellect?
CS497 / 4
Evaluation: Challenges
• Human intellect is a big, nebulous quality
  – No single agreed-upon method for measuring it
• Makes it difficult to directly assess whether a computational device is “augmenting” intellect, and if so, how
• Examples…
CS497 / 5
Evaluation: Challenges
• Does Google augment our intellect?
• Does MS Word?
• Does Mathematica?
• If they do, how do they augment our intellect?
• And how can we compare which is “better” when there are alternatives?
CS497 / 6
Evaluation: Challenges
• How can we determine the long-term effect of a tool on a person?
• If the tool does more work, is faster…
  – Could it make the person less capable in the long term?
  – Could it make them less creative and more dependent on the tool?
• Example: Calculators
  – Do they impede the learning of math skills?
• Example: Coding
  – Who has ever tweaked some buggy lines, recompiled, and re-run the code to see if the changes worked…
  – …rather than understood what the real problem was?
CS497 / 7
Evaluation: Summary
• Must indirectly measure effects of tools on very specific tasks
  – Is user faster?
  – Does user expend less (perceived) effort?
  – Can user solve harder problems?
  – Is user more creative?
• Avoid same pitfalls as calling a system “easy-to-use” or “usable”
CS497 / 8
Design: Challenges
• What forms should augmentation take?
  – Should tool be “smarter” and proactively solve problems?
  – Should it provide streamlined access to information?
  – Should it offer communication/collaboration facilities to connect people?
  – Should it automate mundane tasks so we can focus on higher-level issues?
  – Should it create new symbolic languages?
• Wide range of assistance possible
CS497 / 9
Design: Challenges
• Computational augmentation can exist at many scales
• Interface mechanism-level augmentation
  – Undo, previews
• Application-level
  – Mathematica, Matlab
• Societal-level
  – Collaborative filtering (e.g., Amazon’s rating systems)
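As a concrete illustration of mechanism-level augmentation, undo can be sketched with snapshot stacks. This is a minimal, hypothetical sketch (the class and method names are invented for illustration); real applications usually store commands rather than full document copies.

```python
# A toy text editor with snapshot-based undo/redo.

class Editor:
    def __init__(self):
        self.text = ""
        self._undo_stack = []   # prior states, returned to via undo
        self._redo_stack = []   # undone states, restored via redo

    def insert(self, s):
        """Append text, snapshotting the prior state for undo."""
        self._undo_stack.append(self.text)
        self._redo_stack.clear()            # a fresh edit invalidates redo
        self.text += s

    def undo(self):
        if self._undo_stack:
            self._redo_stack.append(self.text)
            self.text = self._undo_stack.pop()

    def redo(self):
        if self._redo_stack:
            self._undo_stack.append(self.text)
            self.text = self._redo_stack.pop()
```

Note that a fresh edit after an undo clears the redo stack: the history is strictly linear, which is exactly the limitation that branching-history tools later in the talk address.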
CS497 / 10
Design: Summary
• Would like to develop generalizable principles to apply to the design of new tools intended to augment intellect
  – It’s not enough to have one-off solutions
  – Example: Every application should provide “undo” facilities to support experimentation vs. Mathematica
• Need to be able to reliably repeat our successes
  – Provide prescriptive guidelines for the design of new applications
CS497 / 11
Summarizing the Issues
• What do we build?
• For whom?
• What level/degree/scale of intervention?
• Do we make the machine or the person smarter?
• How can we measure success in the short- and long-term?
CS497 / 12
Narrowing the Problem
• Strengths/weaknesses of humans, computers suggest avenues for developing general principles
CS497 / 13
Computer Constants
• Computers not too creative (right now), but…
• Are fast at simple calculations
• Have perfect memories (until your disk crashes…)
• Can simulate potential future events
• Can attend to multiple tasks simultaneously
CS497 / 14
Human Constants
• People are inventive, creative, but…
• Have limitations in memory
  – 7 +/- 2 chunks in short-term memory
  – Unreliable long-term memory
• Have limited powers of prediction
  – Cannot reliably predict outcome of complex events
  – Cannot reliably work several steps ahead
  – Are slow to work out implications
• Can focus on only one thing at a time
• Extreme example: Writing concurrent software
CS497 / 15
Human-Machine Symbiosis
• Human limitations lead to common problem solving strategies across domains
• Common problem solving strategies can be (partially) enhanced by computational capabilities
• This suggests generalizable user interface design principles are possible
• But must first understand what these common problem solving strategies are…
CS497 / 16
Talk Overview
• Overview of HCI
  – Types of problems investigated
  – Goals of HCI
• Augmenting intellect
  – Historical roots
  – Challenges
  – Ill-defined problems
  – Interface-level support
  – Open problems
CS497 / 17
Ill-Defined Problems
• Two general classes of problems people solve
  – Well-defined problems vs. ill-defined problems
• Guess which is the more difficult…
CS497 / 18
Well-Defined Problems
• Have:
  – Well-defined goal state
  – Well-defined evaluation function
  – Well-defined problem state
  – Well-defined operators to manipulate problem state
• Examples:
  – Games (Sudoku, chess, checkers)
• Caveat:
  – Some well-defined problems can appear ill-defined because of the size of the problem space they create (e.g., chess)
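The four components above can be made concrete in code. The following minimal Python sketch uses a toy numeric puzzle invented for illustration (not an example from the lecture), labels each component, and solves the problem by breadth-first search:

```python
from collections import deque

START = 1                     # well-defined problem state
GOAL = 11                     # well-defined goal state
OPERATORS = [                 # well-defined operators on the state
    lambda n: n + 3,
    lambda n: n * 2,
]

def is_goal(state):           # well-defined evaluation function
    return state == GOAL

def solve(start):
    """Breadth-first search over the (bounded) problem space."""
    frontier = deque([(start, [])])
    seen = {start}
    while frontier:
        state, path = frontier.popleft()
        if is_goal(state):
            return path
        for op in OPERATORS:
            nxt = op(state)
            if nxt not in seen and nxt <= 4 * GOAL:  # keep space finite
                seen.add(nxt)
                frontier.append((nxt, path + [nxt]))
    return None
```

`solve(START)` enumerates states until the goal test succeeds. The same structure applies to chess, but there the problem space is far too large to enumerate, which is the caveat above.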
CS497 / 19
Ill-Defined Problems
• Described by Walter Reitman (1965)
• Have:
  – Ill-defined goal state
  – Ill-defined evaluation function
  – Ill-defined problem state
  – Ill-defined operators to manipulate problem state
• Examples:
  – Design a fuel-efficient car
  – Design software
  – Design anything with a given set of constraints
  – Write a paper
CS497 / 20
Ill-Defined Problem Implications
• Complex problems, not wholly understood at the onset
  – No “textbook” way to solve them, no “right” answer
• People cope with the uncertainty by actively experimenting
• Donald Schön (1983) dubbed this experimental process reflection-in-action
CS497 / 21
Reflection-in-Action
• Solution developed step by step
• An informed act of improvisation
• Process:
  – Person makes a “move” on a problem based on experience with solving similar problems
  – Person reflects on results, uses new information to decide the next move
• Consider write-compile-test cycles in software development
  – A method of managing the complexity of solving an ill-defined problem
CS497 / 22
Broader Experimentation
• Experienced practitioners further experiment by actively generating sets of possibilities
• Sets allow one to explore the design space, compare and contrast alternative solutions
• Process is called Set-Based Problem Solving
[Figure: Point-Based Problem Solving vs. Set-Based Problem Solving]
(Terminology from Ward et al., 1995, and Sobek et al., 1997)
CS497 / 23
We Have Our Target!
• People actively apply intellect when solving ill-defined problems…
• Ill-defined problems, of all forms, require active experimentation with the problem and its solution…
• Experimentation includes generating sets of alternative solutions to compare and contrast…
• Therefore, one general way to augment intellect is to support experimentation within user interfaces
CS497 / 24
Talk Overview
• Overview of HCI
  – Types of problems investigated
  – Goals of HCI
• Augmenting intellect
  – Historical roots
  – Challenges
  – Ill-defined problems
  – Interface-level support
  – Open problems
CS497 / 25
Supporting Experimentation
• Experimentation with computer-based tools can happen in the near- and long-term
  – Choosing a command/action (near-term)
  – Choosing a command’s parameters (near-term)
  – Choosing a sequence of actions (long-term)
  – Developing sets of alternatives (long-term)
• What ways do computers currently support these practices?
CS497 / 26
Current Support
• Previews
• Undo/Redo
• Save As…
• Revision control
• Idiosyncratic, manually-driven techniques
  – Embed alternatives in same document
    • Use of layers in Photoshop
    • Commenting out sections of code
    • Write new paragraph below its replacement
• Demos…
CS497 / 27
Limitations
• Demos– Word– Photoshop
CS497 / 28
Process Support Tools
• Process support tools manage past, present, and potential future solution states
• Are domain-independent tools and services
• Undo/redo, preview, and revision control are all process-support tools
• Three classes of process-support tools:
  1. History Tools
  2. Previewing Tools
  3. What-If Tools
CS497 / 29
History Tools
• Provide explicit support for managing versions of the conceptually same document
• Revision control systems
  – CVS, Subversion, version tracking in MS Word…
• Snapshotting capabilities (Photoshop)
• Editable histories
• Branching histories
• Save As…/Duplicate (thin interface-level support)
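A branching history of the kind listed above can be sketched as a tree of states in which editing after an undo forks a new branch instead of discarding the undone states. The class and method names below are hypothetical, and full-state snapshots are used for simplicity:

```python
class HistoryNode:
    def __init__(self, state, parent=None):
        self.state = state
        self.parent = parent
        self.children = []

class BranchingHistory:
    def __init__(self, initial_state):
        self.root = HistoryNode(initial_state)
        self.current = self.root

    def record(self, state):
        """Record a new state as a child of the current node."""
        node = HistoryNode(state, parent=self.current)
        self.current.children.append(node)
        self.current = node

    def undo(self):
        """Step back to the parent state; the old branch is kept."""
        if self.current.parent is not None:
            self.current = self.current.parent
        return self.current.state
```

Because nothing is ever discarded, every past state stays reachable by walking the tree, in contrast to linear undo.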
CS497 / 30
History Tools
Editable Graphical Histories (Kurlander & Feiner, 1988)
CS497 / 31
History Tools
Branching History in Designer’s Outpost (Klemmer et al, 2002)
CS497 / 32
Previewing Tools
• Provide support for exploring potential future states without requiring full commitment
• Previews
• Design Galleries (Marks et al.)
• Suggestive interfaces (Takeo Igarashi, others)
• Side Views (Terry & Mynatt)
CS497 / 33
Design Galleries (Marks et al, 1997)
CS497 / 34
Previewing Tools
Suggestive Interface in Chateau (Igarashi & Hughes, 2001)
CS497 / 35
What-If Tools
• Provide support for exploring sets of alternatives
• Vary in how explicit the support for parallel versions is
• Undo
• Spreadsheet convention
• Subjunctive interface (Aran Lunzer)
• Parallel Pies (Terry & Mynatt)
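The core idea behind what-if tools — operating on a set of alternatives rather than a single document — can be sketched as follows. This is a hypothetical illustration, not the actual design of any of the systems listed; states are plain values for simplicity:

```python
class AlternativeSet:
    """A set of alternative solution states manipulated in parallel."""
    def __init__(self, states):
        self.states = list(states)

    def apply(self, command):
        """Apply one command to every alternative at once."""
        self.states = [command(s) for s in self.states]

    def fork(self, index, *commands):
        """Replace one alternative with several variants of it."""
        base = self.states.pop(index)
        for command in commands:
            self.states.append(command(base))
```

For example, `fork(0, brighten, darken)` would keep a brightened and a darkened variant side by side, ready for comparison, while `apply` keeps the whole set in step with subsequent edits.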
CS497 / 36
What-If Tools
CS497 / 37
What-If Tools
Subjunctive Interface (Lunzer & Hornbæk, 2003)
CS497 / 38
Limitations
• Consider when/how experimentation takes place:
  – Choosing a command/action (near-term)
  – Choosing a command’s parameters (near-term)
  – Choosing a sequence of actions (long-term)
  – Developing sets of alternatives (long-term)
• What are some limitations with existing tools?
CS497 / 39
Limitations: Choosing Actions
• Many choices, little information
• Difficult to predict results of future actions
• Undo/redo, previews helpful, but provide only one view at a time
CS497 / 40
Limitations: Exploring Alternatives
• User must manually manage process of creating, managing, comparing sets of alternatives
  – Save As…
  – Alternatives embedded within same document
  – Branch, tag in revision control systems
CS497 / 41
Summary of Tensions
• Applications assume solution development through revision of single solution instance
– A linear problem solving process
• Enforced by equating an overall solution with a single document
– Document the only organizational structure for data– Not expressive enough to hold alternatives
• Few mechanisms to explicitly support exploration– Costly in time, effort to explore alternatives– Ultimately can discourage exploration
CS497 / 42
Summary of Tensions
Interface designs assume this model of problem solving
But users often want to explore and experiment
CS497 / 43
Set-Based Interaction
• Can reconceptualize interaction to explicitly support creation, manipulation, and evaluation of sets of alternatives
• Let user explore without worrying about saving, documenting each state and its derivation
• Allow manipulation of multiple versions simultaneously
• Demo…
CS497 / 44
Evaluation
• It’s not enough to identify a need and design to it
• We need to know how the tool affects work practices
  – Do people develop better solutions?
  – Are they faster?
  – More satisfied?
  – Work with less effort?
• Want to know what works, what doesn’t, so we can replicate successes, avoid same old mistakes
• But nature of ill-defined problems muddies the waters...
CS497 / 45
Evaluation Challenges
• Efficiency may not be an appropriate metric
  – User may take longer with tools, but arrive at better solutions
• Difficult to assess whether one solution is better than another
  – Bane of ill-defined problems: no concrete evaluation function
• People are slow to adopt new work practices
  – Tools may enable work practices that reliably result in better solutions
  – But users need to learn these new work practices
  – Example: Use of layers in Photoshop
• Comparing problem solving strategies is not easy
CS497 / 46
Study of Side Views and Parallel Pies
• Task
  – Transform start state to known end state
• Order of operations makes it difficult
  – Experimentation required
  – Mimics real-world tasks, but solution quality can be judged
  – RMS difference in CIE-LUV colorspace
• 5-minute time limit
• 24 subjects
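Assuming both images have already been converted to CIE-LUV, the RMS-difference quality measure reduces to a root-mean-square of per-pixel Euclidean distances. A sketch (the RGB-to-LUV conversion itself is omitted, and the function name is hypothetical):

```python
import math

def rms_luv_difference(img_a, img_b):
    """Root-mean-square per-pixel distance between two images, each
    given as an equal-length sequence of (L*, u*, v*) tuples."""
    assert len(img_a) == len(img_b) and img_a
    total = 0.0
    for (l1, u1, v1), (l2, u2, v2) in zip(img_a, img_b):
        total += (l1 - l2) ** 2 + (u1 - u2) ** 2 + (v1 - v2) ** 2
    return math.sqrt(total / len(img_a))
```

Computing the distance in a perceptually motivated space like CIE-LUV, rather than raw RGB, is what lets the number stand in for how different the two images look to a person.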
CS497 / 48
Results
• People use sliders 50% less when multiple previews (Side Views) are available
• No differences in efficiency or quality of solution found
• But dramatically different problem solving practices when Parallel Pies are present
CS497 / 49
Process Map
CS497 / 50
Process Map
• Derivation tree
• Active state timeline
• Command timeline
• Conventions to indicate:
  – Undone/abandoned nodes
  – Duplicated states
  – Simultaneously active states
Process Maps
CS497 / 53
Do They Reveal Differences?
CS497 / 54
CS497 / 55
If We Can See It…
CS497 / 56
Numerical Characterizations
• Number of branch points per tree
• Number of leaf nodes in tree
• Average number of children per node for non-leaf nodes
• Maximum distance from root node to a child node
• Number of “dead-ends” (abandoned states)
• Number of states visited per unit of time (problem solving rate)
• Number of active states per unit of time or per state
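The first four tree measures above can be computed directly from a derivation tree. A sketch, assuming the tree is given as a child-list mapping plus a root id (a hypothetical representation; the function name is invented):

```python
def tree_metrics(children, root):
    leaves = 0          # number of leaf nodes
    branch_points = 0   # nodes with more than one child
    child_counts = []   # children per non-leaf node
    max_depth = 0       # max distance from root to any node
    stack = [(root, 0)]
    while stack:
        node, depth = stack.pop()
        max_depth = max(max_depth, depth)
        kids = children.get(node, [])
        if not kids:
            leaves += 1
        else:
            child_counts.append(len(kids))
            if len(kids) > 1:
                branch_points += 1
            stack.extend((k, depth + 1) for k in kids)
    avg = sum(child_counts) / len(child_counts) if child_counts else 0.0
    return {"leaves": leaves, "branch_points": branch_points,
            "avg_children": avg, "max_depth": max_depth}
```

The time-based measures (states visited or active per unit of time) would additionally need timestamps on each node.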
CS497 / 57
Study Results
• “Create New Version” and the alternatives-management infrastructure have a major impact on the problem solving process
  – Significant because it is a completely optional component
  – Not necessary to solve the problem
• Suggests need for similar “forking” services and infrastructure in other interfaces
CS497 / 58
Summary, Part 1
• In theory, would like to augment intellect with computation…
• In practice, need to consider and evaluate very specific ways computation can help us better solve ill-defined problems
  – Efficiency
  – Solution quality
  – Satisfaction
  – Perceived cognitive load
CS497 / 59
Summary, Part 2
• Experimentation and exploration are common problem solving practices
• Computation can help us more freely explore by explicitly supporting the management of sets of alternatives
• Understanding how computation affects problem solving process remains a challenge
• Long-term (longitudinal) studies costly…
• Or are they?
CS497 / 60
Future Possibilities
• Open source software offers a vehicle for performing longitudinal studies
• Instrument an application to understand how it is being used over time
• With the “right” instrumentation, can determine problem solving practices
• Introduction of new tools allows us to understand how they are adopted and used in the long-term
• An active research project I have with the GIMP…
CS497 / 61
On to Your Assignment…
CS497 / 62
Assignment Overview
• You will
  – Design (on paper) 2 history tools for source code editors
  – Sketch out both designs
  – Describe the designs
  – Get feedback from a third party
CS497 / 63
Assignment
• From the user’s perspective, consider what happens when editing source code if…
• You don’t have to worry about saving the file
• You don’t have to worry about “bookmarking” or snapshotting
• The system continually records every action, every state
• You have access to every past state and action
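The premise above — a system that continually records every action and state — can be sketched as a thin recording layer on an editor buffer. This is a hypothetical illustration of the premise only, not a required part of the assignment; the names are invented:

```python
import time

class RecordingEditor:
    """An editor buffer that records every action together with the
    resulting state, so any past state is retrievable later."""
    def __init__(self):
        self.text = ""
        self.log = []   # (timestamp, action, resulting state)

    def _record(self, action):
        self.log.append((time.time(), action, self.text))

    def insert(self, pos, s):
        self.text = self.text[:pos] + s + self.text[pos:]
        self._record(("insert", pos, s))

    def delete(self, pos, n):
        self.text = self.text[:pos] + self.text[pos + n:]
        self._record(("delete", pos, n))

    def state_at(self, i):
        """Return the document text after the i-th recorded action."""
        return self.log[i][2]
```

With such a log in place, the design problem becomes how to browse, search, and compare the recorded states — which is exactly what the assignment asks you to design.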
CS497 / 64
Assignment
• From the designer’s perspective, consider what happens if…
• You have access to every past state and action related to a document
• You can determine semantic qualities of the document
  – Example: Whether code was really changed or simply refactored
• You have access to surrounding context
  – What other applications were open
  – Whether the user compiled, ran, or debugged source code
CS497 / 65
Assignment Considerations
• Consider that people see code as:
  – A module
  – A file
  – A collection of functions/methods
  – Objects
  – A collection of constants
  – Buggy versus solid
  – Written by one person or another
CS497 / 66
Assignment Considerations
• Consider that differences between documents can be viewed from many perspectives
• Differences in:
  – Number of functions
  – Number of lines of code
  – Number of edits
  – Semantic meaning of code (e.g., consider refactoring)
  – Number of comments
CS497 / 67
Assignment Considerations
• Ideas to get you started
  – Graphs that chart the degree of changes in a document
  – Localized histories (show changes related only to a given function)
  – Differences in function results, function runtimes for different versions
CS497 / 68
Assignment Specifics
• Tasks
  – Design 2 different methods for browsing, searching, comparing, or retrieving past states of source code
  – Branching tree designs are not allowed
  – Sketch the 2 different designs and describe each
  – Show designs to someone else to get their reactions
• Deliverables
  – 2 sketches
  – Description of each design
  – Feedback from third party
• Originality and clarity of communication count