User-Centered Agile Dev, Balanced Team 2013

TRANSCRIPT

1. User Centered Agile Dev at NASA - One Group's Path to Better Software
   Jay Trimble, NASA Ames Research Center
   For Balanced Team, 11-3-13

2. My Background - Missions
   - NASA Johnson Space Center, Houston: Shuttle Mission Control, Payloads
   - Jet Propulsion Lab: Space Radar Lab-1 Ops Director
     - Robotic: Voyager Neptune
     - Shuttle: Space Radar Lab, Lead Ops Director
   - Current: Mission Operations & Ground Data System Manager, Resource Prospector Lunar Rover
   - Internship in Mission Control (a long time ago)

3. My Background - Software Technology
   - Human Centered Computing for Mars Rovers
   - Founded the User Centered Technology Group
   - User centered technologies for mission control

4. One Story of Agile at NASA
   - This is a bottom-up story of how a group at NASA applied agile methods to software development for mission control
   - This was approved by, but not initiated by, management

5. The Project
   - Our task was to build an architecture for mission control user applications, with the primary focus on developing interaction paradigms and technology for user-composable software
   - See the results at https://github.com/nasa/mct

6. The Collaboration
   - Design and development team at NASA Ames
   - The customer: mission control users at NASA
   - Using participatory design, we created an integrated team that included customer representation

7. Issues and Mandates
   - Some customers want a new product, others do not
   - The product must have new capability, but must also not be disruptive within the organization
   - Functional and visual connection to the legacy product

8. The Journey
   - We began with a six-month software delivery cycle
   - By iteratively fixing issues, we got the delivery cycle down to three weeks
   - It took close to two years to complete the transition

9. Where We Started
   - Four six-month deliverables, one user experience spec
   - (Diagram: Module 1 delivered as subsystems, each on a six-month cycle)

10. Issues We Faced
   - Long delivery cycle
   - Difficult to manage feature prioritization and development, integration, and testing
   - Progress invisible to the customer; lack of meaningful ongoing customer interaction to drive design
   - Mismatch in expectations between the design/dev team and the customer
   - Difficult for the development team to know the state of progress relative to goals
   - Deliveries focused on subsystems rather than meaningful end-user functionality
   - The two-year final deliverable created a tendency to defer key issues

11. Initiating Internal Change
   - Fix the problems iteratively, without a broad proclamation of methodology, i.e. "we are going to be agile" or "we are going to be lean"
   - Just fix the problems

12. First Step - Six-Week Cycle
   - We took the six-month cycle and divided it into smaller pieces
   - This was a start, but still left many issues
   - (Diagram: iterations 1 through n, six weeks each)

13. Incremental Improvements
   - Six-week delivery cycle
   - Prioritization of work at the start of each six-week iteration
   - User experience spec for every iteration due one week before iteration start
   - UE testing and design session during the coding period of each iteration
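For concreteness, a minimal calendar sketch of the six-week cycle described in slide 13 above. This is illustrative Python, not the team's actual tooling: the six-week length and the one-week UE-spec lead time come from the slide, while the example start date and the mid-coding UE session offset are assumptions.

from datetime import date, timedelta

def six_week_cycle(iteration_start: date) -> dict:
    """Milestone dates for one six-week iteration (slide 13).

    The UE-spec lead time (one week before start) and the six-week length
    are from the slide; the mid-coding UE session offset is an assumption.
    """
    return {
        "UE spec due": iteration_start - timedelta(weeks=1),
        "prioritization / stack rank": iteration_start,
        "UE testing & design session": iteration_start + timedelta(weeks=3),  # assumed mid-coding point
        "delivery": iteration_start + timedelta(weeks=6),
    }

if __name__ == "__main__":
    # Example start date chosen arbitrarily for illustration.
    for milestone, when in six_week_cycle(date(2013, 1, 7)).items():
        print(f"{milestone:30s} {when.isoformat()}")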
14. Six-Week Cycle (timeline diagram)
   - Kickoff: pre-stack ranks 1 and 2, stack rank, UE specification, release docs
   - Engineering design & spec (3 days), code (3.5 weeks), test (2 weeks)
   - Demo new features for QA; pre-ship review with exit criteria and customer demo; deliver; debrief
   - In parallel: UE testing of iteration n-1 (delivered software), UE design/testing of iteration n+1 (paper), test plan development, JIRA updates/priorities, coding/UE spec revisions, daily acceptance testing

15. Almost There
   - Better, but still not where we need to be
   - Six-week iterations are focused on subsystem capabilities; they lack user focus
   - Customers see progress every six weeks; this is not often enough

16. Next Steps
   - Identify the issues: after each iteration we had a team debrief where we identified issues and discussed fixes
   - Fix the issues one step at a time: some issues we fixed with policy changes based on team debriefs
   - Many of the changes were bottom-up within the team, such as daily communication between user experience designers and the customer as new features rolled out, and QA testing of features on rollout
   - Some changes were top-down, such as the length of an iteration (or sprint) and the release cycle

17. Agile
   - We shortened the cycle to three weeks
   - Replaced discrete events with integrated interactions
   - Integrated strategic and tactical into our ranking process
   - Each iteration had a clear purpose, goals, and ranked priorities
   - Daily build, iterations, releases
   - Strategic road map

18. Designing with the Users - Participatory Design & Analysis
   - Customers are part of the design team
   - Designers facilitate; customers are the domain experts
   - Shared ownership

19. Design Artifacts
   - Triggers/results
   - Really big picture, big picture, task flows
   - Blue sky, real world

20. Design Artifacts
   - Task objects, user objects, windows

21. Agile Cycle (diagram)
   - Nightly build; iteration delivered every 3 weeks; release every 3 months
   - Iterations 1-3 released to the Mission Control user test community at 3, 6, and 9 weeks; release to Mission Control ops at 12 weeks

22. The Three-Week Cycle - Agile Development Iteration (timeline diagram)
   - Iteration n: priorities/JIRA rankings; coding, with UE & tech spec dates driven by coding dependencies; optional mid-iteration hackathon to test big features; feature freeze (-7 days); code freeze (-3 days); pre-ship hackathon; start of 24-hour test (-2 days); deliver to customer
   - Nightly build and internal testing as features roll out; daily iteration n build to the customer; issue tracking updates/priorities/rankings
   - Meanwhile the customer installs iteration n-1: acceptance test, user feedback, verification of closed JIRA issues, feature mods/additions, bug fixes, triage of issues it discovered; optionally, a hot patch

23. The Release Cycle - Agile Release Into Operations (diagram)
   - Iterations 1-3 released to the Mission Control user test community at 3, 6, and 9 weeks, each followed by customer feature verification
   - Iteration 4 (bugs, usability, more testing), then release to the customer for Mission Control certification at 12 weeks
   - Throughout: coding/UE specs, issue tracking updates/priorities/rankings, build and internal testing as features roll out
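The cadence in slides 21-23 reduces to a small calendar calculation. The Python sketch below is illustrative only: the three-week iteration and twelve-week release figures come from the slides, while the example start date and the week arithmetic are assumptions, not the team's actual schedule tooling.

from datetime import date, timedelta

def release_cadence(release_start: date, iteration_weeks: int = 3, iterations: int = 4):
    """Cadence from slides 21 and 23: an iteration delivery every three weeks,
    the first three going to the Mission Control user test community and the
    fourth (at twelve weeks) going to Mission Control for certification/ops."""
    for n in range(1, iterations + 1):
        delivered = release_start + timedelta(weeks=iteration_weeks * n)
        audience = ("Mission Control (certification/ops)" if n == iterations
                    else "Mission Control user test community")
        yield f"Iteration {n}", delivered, audience

if __name__ == "__main__":
    # Example release start date is an assumption; nightly builds run throughout.
    for name, when, audience in release_cadence(date(2013, 3, 4)):
        print(f"{name}: deliver {when.isoformat()} -> {audience}")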
24. Strategic Road Map

25. The Team
    Role                      Traditional   Agile 1     Agile 2*
    Developers                5-9           7           4
    User experience design    2             2           1
    QA/process engineers      2             2           0.5 (QA)
    Project manager           1             1           developers rotate the PM role
    Principal investigator    part time     part time   part time
    Interns                   yes           yes         yes
    *Reduced budget

26. Focus
   - Work on issues in order of priority; easier said than done
   - JIRA/GreenHopper for issue tracking and ranking (a query sketch follows the transcript)
   - Developers should know what their priorities are
   - Priorities should be achievable
   - Don't over-manage ranking, or over-assign

27. Where Are We?
   - There is one, and only one, measurement of progress, and that is working code
   - Replace presentations, code line counts, and other management metrics with the nightly build
   - For progress relative to the strategic and tactical situation, see the issue tracking system (we use JIRA)

28. Testing
   - Internal QA tests features as they roll out
   - Our customer tested features daily to provide feedback
   - Our customer used iteration deliveries and releases for final feature verification
   - Hackathons tested scalability in a lab environment

29. Some Lessons Learned
   - The train leaves the station on time
   - A feature that misses one train just gets on the next one; this requires frequent departures
   - Do not ever delay a shipment unless the software does not work

30. It Takes Time
   - Our journey was driven by need, i.e. we addressed issues as they came up, rather than being driven by a formal methodology
   - We iteratively refined our methods over two years

31. Lessons Summary
   - The measure of progress is working code
   - Work on the highest priorities first; avoid the temptation to do the easier things first
   - Demonstrations, not presentations
   - Customer interaction over extensive documentation
   - Progress always visible; nightly build available
   - The train leaves the station on time; only working features ship
   - Do not delay shipment for features; if a feature is not ready, it goes into the next iteration

32. Conclusion
   - There is no one right way to do agile
   - Fit and evolve the solution to your context of work
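Slide 26 says JIRA/GreenHopper was used for issue tracking and ranking, but not how it was queried. As a hedged illustration only, the sketch below pulls a stack-ranked list of open issues over Jira's standard REST search endpoint; the server URL, project key (MCT), credentials, and field choices are hypothetical, and ORDER BY Rank assumes the GreenHopper / Jira Agile ranking field is available.

import requests

# Hypothetical server and project key -- illustration only, not the team's setup.
JIRA_URL = "https://jira.example.nasa.gov"
JQL = "project = MCT AND resolution = Unresolved ORDER BY Rank ASC"

def ranked_backlog(session: requests.Session, max_results: int = 20) -> None:
    """Print open issues in stack-rank order, so each developer can see what to
    pick up next without over-assigning (slide 26)."""
    resp = session.get(
        f"{JIRA_URL}/rest/api/2/search",
        params={"jql": JQL, "fields": "summary,assignee", "maxResults": max_results},
    )
    resp.raise_for_status()
    for issue in resp.json()["issues"]:
        fields = issue["fields"]
        assignee = (fields.get("assignee") or {}).get("displayName", "unassigned")
        print(f'{issue["key"]:12s} {assignee:22s} {fields["summary"]}')

if __name__ == "__main__":
    with requests.Session() as s:
        s.auth = ("user", "api-token")  # placeholder credentials
        ranked_backlog(s)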