uTest and Crowdsortium Webinar: Scope & Briefs
DESCRIPTION
Have you ever pondered how you might better engage your crowd on projects or jobs? You might call it a “campaign description,” “brief,” or project “scope,” but whatever you call it, this presentation is for you. Thought leader Matt Johnston, CMO of uTest, discusses the ways in which uTest reaches out to its crowd. A few questions explored:
- What formats do you use and find most effective?
- What methods are used for distribution?
- How do you guide remote workers, and how much direction is needed?
- How might we engage clients to understand the value of campaign descriptions?
TRANSCRIPT
Crowdsourcing Keys: The Never-Ending Journey To Define Scope & Briefs
Matt Johnston | @matjohnston | CMO @ uTest | April 2012
The Challenge
What We’re Talking About
Boiled Down
In-The-Wild Testing for Functional + Security + Load + Usability + Localization
• Two critical components to crowdsourcing success:
  1. What’s the nature & intent of the project?
  2. Who’s the best fit to do the project?
The Challenge
Who’s The Best Fit
Who’s The Best Fit?
• This 2nd question gets the attention, so we’ll focus elsewhere
• For uTest, matching algorithms are based upon:
  – Profile
  – Merit-based ratings
  – Other criteria (e.g., load balancing)
• Structured data is vital to good matches
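The matching criteria on this slide (profile fit, merit-based ratings, load balancing) can be sketched as a simple scorer. This is a minimal illustration of the idea, not uTest’s actual algorithm; the `Tester` fields and the `match_score` weighting are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Tester:
    name: str
    platforms: set[str]   # profile: devices/OSes the tester covers
    rating: float         # merit-based rating, 0.0-1.0 (hypothetical scale)
    active_projects: int  # current workload, used for load balancing

def match_score(tester: Tester, required_platform: str, max_load: int = 3) -> float:
    """Score a tester for a project: profile fit first, then merit, then load."""
    if required_platform not in tester.platforms:
        return 0.0  # profile mismatch disqualifies outright
    load_penalty = tester.active_projects / max_load
    return tester.rating * max(0.0, 1.0 - load_penalty)

testers = [
    Tester("A", {"iOS", "Android"}, 0.9, 3),  # strong rating but fully loaded
    Tester("B", {"Android"}, 0.8, 0),         # good rating, no current load
    Tester("C", {"iOS"}, 0.7, 1),             # wrong platform for this project
]
best = max(testers, key=lambda t: match_score(t, "Android"))
print(best.name)  # B
```

Note how the score only works because every input is structured: a free-text bio could not be filtered or ranked this way, which is the slide’s point about structured data.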
The Challenge
Defining Nature & Intent Of Projects
First Things First
• Fundamental nature of your projects
  – Creative brief ≠ technical spec ≠ advertising guidance
  – Design vs. animation vs. dev vs. testing vs. advertising vs. content
• What the customer is really trying to do and cares about
  – Effectiveness vs. efficiency
    - Signal-to-noise ratio?
    - Precision?
    - Brute force?
• Those who can’t articulate the problem won’t find the solution.
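The effectiveness-vs-efficiency distinction can be made concrete with two toy metrics: signal-to-noise (how many submitted reports were actionable) and coverage (how much of what mattered was actually found). Both function names and the sample figures are hypothetical, invented to illustrate the trade-off.

```python
def signal_to_noise(valuable: int, total: int) -> float:
    """Fraction of submitted reports that were actionable (an efficiency proxy)."""
    return valuable / total if total else 0.0

def coverage(found: int, known_issues: int) -> float:
    """Fraction of the issues that mattered which the crowd surfaced (effectiveness)."""
    return found / known_issues if known_issues else 0.0

# A brute-force test cycle: lots of reports, lots of noise, but good coverage.
print(signal_to_noise(30, 200))  # 0.15 - low precision
print(coverage(18, 20))          # 0.9  - high effectiveness
```

A customer who cares about effectiveness may happily accept the noisy brute-force run above; one who cares about efficiency would not, which is why the scope must say which one matters.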
Defining Work To Be Done
• Our journey:
  – 2008 – We launched… and we sucked
  – 2009 – More sucking
  – 2010 – Started wrapping our heads around the nature of the problem
  – 2011 – More sophisticated
  – 2012 – Solved most of the problem… and created several new ones
    - Launched 4 new testing types, each with their own project-definition challenges
• Key takeaways for us:
  – Garbage in, garbage out
  – The real customer of such info is our community
  – Nature of the project matters:
    - If the goal is predictability (technical work), unstructured data is the enemy
    - If the goal is creativity, unstructured data is still the enemy (just a necessary evil)
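The “unstructured data is the enemy” takeaway can be illustrated with a hypothetical project-definition record: required, validated fields force the customer to articulate scope up front, leaving free text as a last resort. The `ProjectScope` schema and its field names are invented for illustration; only the test types mirror the deck’s footer.

```python
from dataclasses import dataclass, field

# The five in-the-wild testing types named throughout the deck.
TEST_TYPES = {"functional", "security", "load", "usability", "localization"}

@dataclass
class ProjectScope:
    """Hypothetical structured scope record; field names are illustrative."""
    test_type: str                      # must be one of TEST_TYPES
    in_scope: list[str]                 # features/areas testers should cover
    out_of_scope: list[str] = field(default_factory=list)
    notes: str = ""                     # unstructured text, kept to a minimum

    def __post_init__(self) -> None:
        # Validation is what turns "garbage in" into an error at setup time.
        if self.test_type not in TEST_TYPES:
            raise ValueError(f"unknown test type: {self.test_type}")
        if not self.in_scope:
            raise ValueError("scope must name at least one area to cover")

scope = ProjectScope("functional",
                     in_scope=["checkout flow", "login"],
                     out_of_scope=["admin console"])
print(scope.test_type)  # functional
```

A scope like this can be matched, filtered, and rendered consistently for the community, whereas an equivalent paragraph of prose would have to be re-read and re-interpreted by every tester.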
The Challenge
Examples: Past & Future
Examples: Project Setup
Examples: Project Scope
Input / Output Matters
• Even with the right questions, we had room for improvement:
  – UX for customers to create input
    - Information architecture and GUI design
    - Integrations with legacy systems
  – UX for community to consume output
    - Information architecture and GUI design
    - Change management
Examples: Project Setup
Examples: Project Scope
Bottom Line
• Two critical components to crowdsourcing success:
  1. What’s the nature & intent of the project?
  2. Who’s the best fit to do the project?
• Should spend equal calories, hours, and brain cells on each