Due date (PST): 11:59 pm, 6th May 2015 for submission; 11:00 am, 8th May 2015 for voting and commenting on others' submissions.
This week you will improve your prototype so that you can present it to potential users, and then conduct heuristic evaluation sessions with those users to gather feedback on your prototype!
- 1 Improving your Prototype
- 2 Heuristic Evaluation of Prototypes
- 3 Deliverables
- 4 Submitting
- 5 Weekly Survey
Improving your Prototype
First, refine your prototype based on feedback you have received from peers and on what you have learned from looking at other prototypes, so that you have something you can show to potential users for feedback.
Heuristic Evaluation of Prototypes
Why do Heuristic Evaluation with Users?
Heuristic evaluation with users is a great way to get feedback on your ideas from actual potential users and to find flaws before you spend days implementing those ideas and pushing them into production. It is therefore especially important to run informal user studies (heuristic evaluations) on prototypes continually, so you find areas for improvement early on rather than discovering flaws only after you have already built and launched your platform.
How to do heuristic evaluation with users
Also see the d.school bootleg for more resources on testing prototypes with users.
Where can we find users?
Turker forums are a great place to start if you want to find potentially interested workers and requesters to chat with! Ask around on Slack if you are having trouble finding users.
Deliverables
1) Your updated prototype
2) A writeup summarizing the key findings you observed during your heuristic evaluation with users
The writeup should discuss the most salient feedback and observations from your heuristic evaluation session with each of your users. (Aim to try out your prototype with at least 2 different people; if the idea you are prototyping affects both workers and requesters, try to include at least one of each.) Then summarize the main findings in a few concise bullet points, and close with potential improvements to the idea and prototype suggested by these sessions.
Submitting
Create Wiki Page
Please create a wiki page for your team's submission at http://crowdresearch.stanford.edu/w/index.php?title=Milestone_10_YourTeamName&action=edit (substituting YourTeamName with your team name). Copy over the template at Milestone 10 Template.
Submit at http://crowdresearch.meteor.com/ in the category corresponding to whichever of the 4 foundational ideas you are prototyping. Submit by Wednesday at midnight (Pacific time).
Foundation 1: Macro and Micro at some url
Foundation 2: Input/output transducers at some url
Foundation 3: External quality ratings at some url
Foundation 4: Open Governance at some url
List Your Submission
Please list your submission at Milestone 10 Submissions
Remember to vote and comment on each other's submissions on the site before Friday at 11:00 am (Pacific time).
Please use http://crowdresearch.meteor.com/needcomments to find submissions that haven't yet gotten feedback, and http://crowdresearch.meteor.com/needclicks to find submissions that haven't yet been viewed many times.
Please comment on 3 to 5 submissions, and please do not vote for your own submission. Once again, everyone is expected to vote and comment, whether you're the team leader or not.
Weekly Survey
Please fill out the weekly survey so we can improve your crowd research experience!