Winter Milestone 1
Due date: 8:00 pm, 17 Jan 2016 for submission; 12:00 pm (noon), 18 Jan 2016 for peer evaluation.
In this milestone we want you to:
- Get experience of what it is like to be a worker and a requester on current crowd-labor marketplaces
- Read some research papers about crowdsourcing
- YouTube link to today's meeting: watch
- Meeting 1 slideshow: coming soon.
1. Experience the life of a Worker on Mechanical Turk (Mandatory)
2. Experience the life of a Requester on Mechanical Turk (Mandatory)
3. Explore alternative crowd-labor markets (Optional/If MTurk is inaccessible)
4. Research Engineering (Optional)
5. Readings (Mandatory)
6. Submitting
Experience the life of a Worker on Mechanical Turk (Mandatory)
From outside the USA? If you have difficulty signing up as a worker, you can try the worker sandbox, or other sites like CrowdFlower, Microworkers, or Clickworker. Your peers will post some worker sandbox HITs at Milestone 1 Sandbox HITs.
Reflect on your experience as a worker on Mechanical Turk. What did you like? What did you dislike? Write it up on this wiki, as described here.
Experience the life of a Requester on Mechanical Turk (Mandatory)
Sign up as a requester for Mechanical Turk here. Then, post some HITs to tackle a task of interest to you. Please aim to follow best practices for designing these tasks: be clear in your task description, and do not arbitrarily reject work. If you need funds to support your task, please create a PayPal account and fill out this form so we can PayPal you a few dollars. If you don't want to spend money, you can post your HITs via the requester sandbox, list them on Milestone 1 Sandbox HITs, and ask people to do them for you. Aim to have your HITs done by at least 15 different people.
Examples of tasks you might want to have done (you can pick one of the tasks below, or do a task of your own choosing):
- Transcribe an audio recording
- Get descriptions for images
- Label parts of speech in a sentence
Reflect on your experience as a requester on Mechanical Turk. What did you like? What did you dislike? Also attach the CSV file generated when you download the HIT results.
Explore alternative crowd-labor markets (Optional/If MTurk is inaccessible)
Crowdsourcing is not only about Mechanical Turk; it's much more than that, and a variety of platforms are trying to address different problems. To help you get a flavor of the diversity of the area, choose one of the following crowdsourcing sites:
Explore it and get a feel for how the site works. If you want to, try doing a job on the site. You should not spend more than 30-45 minutes on this part.
Compare and contrast the crowd-labor market you just explored (TaskRabbit/oDesk/GalaxyZoo) to Mechanical Turk.
Research Engineering (Optional)
Milestone DRIs: @aginzberg, @durim and @shirish.goyal on Slack. They are your point of contact.
- Get set up locally
- Get you on GitHub, set you up with Postgres, and familiarize you with pip
Please note that we'll have research-engineering-specific milestones in the future; this is just for people who cannot wait.
Readings (Mandatory)
Please skim over the following papers. They're copyrighted, so please don't redistribute them. You need to be signed in to view them.
Other Recommended Readings
These readings are optional (you don't need to write up on them), but recommended:
For each system, please write down:
- What do you like about the system / what are its strengths?
- What do you think can be improved about the system?
Submitting
Please create a page for your team's submission at http://crowdresearch.stanford.edu/w/index.php?title=WinterMilestone_1_YourTeamName&action=edit (substituting in YourTeamName with your team name), and copy over the template at WinterMilestone 1 Template. If you have never created a wiki page before, please see this or watch this video on YouTube.
After you have created a wiki page and completed your milestone, post the link to the wiki page you created on the Meteor site! Meteor is like Reddit: you can upvote and comment on the submissions. Please create an account, and try to use your Slack username there as well. You can make your submission until 8:00 pm, 17 Jan 2016.
Peer evaluation period
From 8:00 pm, 17 Jan 2016 until 12:00 pm (noon), 18 Jan 2016, you're encouraged to peer-evaluate each other's submissions. There's no right or wrong; just give constructive feedback. We encourage you to comment on at least 3 submissions, and feel free to upvote as many as you want. Please note that all team members have to participate in peer feedback. If you are a team of 2, together you're encouraged to comment on 2*3=6 distinct submissions.