Winter Milestone 1
=== Daemo ===
[[:Media:Daemo (private).pdf | Stanford Crowd Research Collective. Daemo: A Crowdsourced Crowdsourcing Platform (White paper, do not share)]]
=== Flash Teams ===
Revision as of 18:04, 11 January 2016
Due date: 8:00 pm 17th Jan 2016 for submission, 12 pm 18th Jan 2016 for peer-evaluation.
In this milestone we want you to:
- Experience what it is like to be a worker and a requester on current crowd-labor marketplaces
- Read some research papers about crowdsourcing
- YouTube link of today's meeting: watch
- Meeting 1 slideshow: coming soon.
- 1 Experience the life of a Worker on Mechanical Turk (Mandatory)
- 2 Experience the life of a Requester on Mechanical Turk (Mandatory)
- 3 Explore alternative crowd-labor markets (Optional/If MTurk is inaccessible)
- 4 Research Engineering (Optional)
- 5 Readings (Mandatory)
- 6 Submitting
Experience the life of a Worker on Mechanical Turk (Mandatory)
From outside the USA? If you have difficulty signing up as a worker (e.g., if they don't respond within the 48-hour period), you can try the worker sandbox, or other sites like CrowdFlower, Microworkers, or Clickworker. Your peers will post some worker sandbox HITs at Milestone 1 Sandbox HITs.
Reflect on your experience as a worker on Mechanical Turk. What did you like? What did you dislike? Write it up on this wiki, as described here.
Experience the life of a Requester on Mechanical Turk (Mandatory)
Sign up as a requester for Mechanical Turk here. Then, post some HITs to tackle a task of interest to you. Please aim to follow best practices when designing these tasks: be clear in your task description, and do not arbitrarily reject work. If you need funds to support your task, please create a PayPal account and fill out this form so we can PayPal you a few dollars. If you don't want to spend money, you can post your HITs via the requester sandbox, list them on Milestone 1 Sandbox HITs, and ask people to do them for you. Aim to have your HITs done by at least 15 different people.
Examples of tasks you might want to have done (you can pick one of the tasks below, or do a task of your own choosing):
- Transcribe an audio recording
- Get descriptions for images
- Label parts of speech in a sentence
Reflect on your experience as a requester on Mechanical Turk. What did you like? What did you dislike? Also attach the CSV file generated when you download the HIT results.
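Before attaching the results CSV, it can help to sanity-check the batch with a few lines of Python. The sketch below is illustrative, not a required part of the milestone: the `WorkerId` and `WorkTimeInSeconds` columns are standard in MTurk batch exports, but the `Answer.description` column and all sample rows are made up here, and real exports contain many more columns, so check your own file's header row.

```python
import csv
import io

# Inline sample standing in for a downloaded MTurk batch-results file.
# Real exports have many more columns; "Answer.description" is a
# hypothetical answer field that depends on your HIT layout.
sample = """WorkerId,WorkTimeInSeconds,Answer.description
A1XYZ,30,a red bicycle
A2ABC,45,a bike leaning against a wall
A1XYZ,20,two dogs playing in a park
"""

def summarize(results_csv):
    """Return (assignment count, distinct workers, mean work time in seconds)."""
    rows = list(csv.DictReader(io.StringIO(results_csv)))
    workers = {row["WorkerId"] for row in rows}
    mean_time = sum(int(row["WorkTimeInSeconds"]) for row in rows) / len(rows)
    return len(rows), len(workers), mean_time

assignments, distinct_workers, mean_seconds = summarize(sample)
print(assignments, distinct_workers, round(mean_seconds, 1))
```

A quick summary like this makes it easy to confirm you reached the goal of at least 15 distinct workers before writing up your reflection.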
Explore alternative crowd-labor markets (Optional/If MTurk is inaccessible)
Crowdsourcing is about much more than Mechanical Turk, and a variety of platforms are trying to address different problems. To help you get a flavor of the diversity of the area, choose one of the following crowdsourcing sites. If you're from outside the USA, you may not be able to access MTurk; in that case, try the following websites:
Explore it and get a feel for how the site works. If you want to, try doing a job on the site. You should not spend more than 30-45 minutes on this part.
Compare and contrast the crowd-labor market you just explored (TaskRabbit/oDesk/GalaxyZoo) to Mechanical Turk.
Research Engineering (Optional)
Milestone DRIs: @aginzberg, @durim, and @shirish.goyal on Slack. They are your points of contact.
- Get set up locally
- Get set up on GitHub, install postgres, and get familiar with pip
Please note, we'll have research-engineering-specific milestones in the future; this is just for people who cannot wait.
Please skim over the following papers. They're copyrighted, so please don't redistribute them. You need to be signed in to the wiki to view them.
Other Recommended Readings
These readings are optional (you don't need to write up on them), but recommended:
For each system, please write down:
- What do you like about the system / what are its strengths?
- What do you think can be improved about the system?
Please create a page for your team's submission at http://crowdresearch.stanford.edu/w/index.php?title=WinterMilestone_1_YourTeamName&action=edit (substituting YourTeamName with your team name), and copy over the template at WinterMilestone 1 Template. If you have never created a wiki page before, please see this or watch this video on YouTube.
We have a [Reddit-like service] on which you can post links to the wiki pages for your submissions, explore them, and upvote them.
Sign-up instructions: Log in with either Twitter or Facebook on the [website]. When it asks you to pick your username, pick the same username as on Slack; this will help us identify and track your contributions better.
Link to the website: Meteor site. Post links to your ideas only once they're finished. Give your posts titles matching your team name this week.
- Please submit your finished ideas by 8:00 pm PST 17th Jan Sunday, and DO NOT vote/comment until then.
[Everyone] Peer-evaluation from 8:05 pm PST 17th Jan Sunday until 12 pm PST 18th Jan Monday
After the submission phase, you are welcome to browse through, upvote, and comment on others' ideas. We especially encourage you to look at and comment on ideas that haven't yet gotten feedback, to make sure everybody's ideas get feedback. You can use http://crowdresearch.meteor.com/needcomments to find ideas that haven't yet gotten feedback, and http://crowdresearch.meteor.com/needclicks to find ideas that haven't yet been viewed many times.
COMMENT BEST PRACTICES: Everybody on the team reviews at least 3 ideas, each supported by a comment. The comment should justify your reason for upvoting, be constructive, and mention a positive aspect of the idea worth sharing. Negative comments are discouraged; instead, phrase criticism as a suggestion: if you dislike an idea, suggest improvements (no idea is bad; every idea has room for improvement).