Due date: 11:59 pm on 4 March 2015 for submission; 9 am on 6 March 2015 for peer evaluation.
In this milestone we want you to:
- Get first-hand experience of being a worker and a requester on current crowd-labor marketplaces
- Read some research papers about crowdsourcing
Experience the life of a Worker on Mechanical Turk
Sign up as a worker for Mechanical Turk here. Then, practice doing some tasks - you should do enough tasks to earn $1.
If you have difficulty signing up as a worker, you can try the worker sandbox, or other sites such as CrowdFlower or Microworkers. Your peers will post some worker sandbox HITs at Milestone 1 Sandbox HITs.
Reflect on your experience as a worker on Mechanical Turk. What did you like? What did you dislike? Write it up on this wiki, as described here.
Experience the life of a Requester on Mechanical Turk
Sign up as a requester for Mechanical Turk here. Then, post some HITs to tackle a task of interest to you. Please follow best practices when designing these tasks: be clear in your task description, and do not arbitrarily reject work. If you need funds to support your task, email firstname.lastname@example.org and we can PayPal you a few dollars. If you don't want to spend money, you can post the HITs via the requester sandbox, list them on Milestone 1 Sandbox HITs, and ask people to do them for you. Aim to have your HITs done by at least 15 different people.
Examples of tasks you might want to have done (you can pick one of the tasks below, or do a task of your own choosing):
- Transcribe an audio recording
- Get descriptions for images
- Label parts of speech in a sentence
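For this milestone you will post HITs through the web UI, but if you are curious how the same step looks programmatically, here is a minimal sketch using the boto3 MTurk client. The endpoint shown is the requester sandbox (so no real money is spent), and the question text, reward, and other parameters are illustrative placeholders, not values prescribed by the assignment:

```python
# Sketch of posting an image-description HIT programmatically.
# All parameter values below are placeholders for illustration.

SANDBOX_ENDPOINT = "https://mturk-requester-sandbox.us-east-1.amazonaws.com"

# MTurk expects the question as a QuestionForm XML document.
QUESTION_XML = """<?xml version="1.0" encoding="UTF-8"?>
<QuestionForm xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchema/2005-10-01/QuestionForm.xsd">
  <Question>
    <QuestionIdentifier>description</QuestionIdentifier>
    <QuestionContent><Text>Describe the image in one sentence.</Text></QuestionContent>
    <AnswerSpecification><FreeTextAnswer/></AnswerSpecification>
  </Question>
</QuestionForm>"""

hit_params = {
    "Title": "Describe an image",
    "Description": "Write a one-sentence description of an image.",
    "Keywords": "image, description, writing",
    "Reward": "0.05",                    # dollars, passed as a string
    "MaxAssignments": 15,                # milestone asks for >= 15 workers
    "LifetimeInSeconds": 3 * 24 * 3600,  # keep the HIT up for three days
    "AssignmentDurationInSeconds": 600,  # 10 minutes per assignment
    "Question": QUESTION_XML,
}

# To actually post the HIT (requires boto3 and configured AWS credentials):
#   import boto3
#   client = boto3.client("mturk", endpoint_url=SANDBOX_ENDPOINT)
#   response = client.create_hit(**hit_params)
#   print(response["HIT"]["HITId"])
```

The key design choice mirrors the milestone's advice: a clear title and description, and `MaxAssignments` set so that at least 15 different people can complete the task.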
Reflect on your experience as a requester on Mechanical Turk. What did you like? What did you dislike? Also attach the CSV file generated when you download the HIT results.
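The results CSV you download has one row per completed assignment. As a quick sanity check before attaching it, you can tally how many distinct workers completed each HIT; here is a small sketch using Python's csv module. The sample data and the exact column names (`HITId`, `WorkerId`, `Answer.*`) are illustrative — check the header of your own file:

```python
import csv
import io
from collections import defaultdict

# A tiny stand-in for a downloaded results file; a real MTurk CSV
# has many more columns (AssignmentId, WorkTimeInSeconds, ...).
SAMPLE_CSV = """HITId,WorkerId,Answer.description
H1,W1,"a dog on a beach"
H1,W2,"a brown dog running"
H2,W1,"a red bicycle"
"""

def workers_per_hit(csv_file):
    """Map each HITId to the set of distinct workers who completed it."""
    workers = defaultdict(set)
    for row in csv.DictReader(csv_file):
        workers[row["HITId"]].add(row["WorkerId"])
    return workers

counts = {hit: len(ws) for hit, ws in workers_per_hit(io.StringIO(SAMPLE_CSV)).items()}
print(counts)  # → {'H1': 2, 'H2': 1}
```

Counting distinct `WorkerId`s (rather than rows) matters because one worker may complete several of your HITs, and the milestone asks for at least 15 different people.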
Explore alternative crowd-labor markets
Crowdsourcing is about much more than Mechanical Turk, and a variety of platforms are trying to address different problems. To get a flavor of the diversity of the area, choose one of the following crowdsourcing sites:
Explore it and get a feel for how the site works. If you want to, try doing a job on the site. You should not spend more than 30-45 minutes on this part.
Compare and contrast the crowd-labor market you just explored (TaskRabbit/oDesk/GalaxyZoo) to Mechanical Turk.
Please skim over the following papers. They're copyrighted, so please don't redistribute. You need to be signed in to view them.
For each system, please write down:
- What do you like about the system / what are its strengths?
- What do you think can be improved about the system?
Please create a page for your team's submission at http://crowdresearch.stanford.edu/w/index.php?title=Milestone_1_YourTeamName&action=edit (substituting YourTeamName with your team name), and copy over the template at Milestone 1 Template.
Then, after you have filled out the page, post the link to the wiki page you created on CrowdGrader!
Step 1 for everyone: Most of you do not need to enroll; I have done it for you. You can go directly to: http://www.crowdgrader.org/crowdgrader/venues/view_venue/860 . However, if you cannot access it, please self-enroll using this link: http://www.crowdgrader.org/crowdgrader/venues/join/860/tofypy_firice_dadepe_rutane
Step 2 for team leaders: Make sure all of your team members have enrolled in the system (I have done this for you, but please double-check). Then add them to your group/team using the option to add collaborators. You are also responsible for keeping track of deadlines and for making sure your team is cooperating with you.
Step 3 for team leaders: Make the submission on behalf of your team; only team leaders should submit.
Step 4 for everyone: Begin peer evaluation. Everyone will be randomly assigned 3 submissions to grade, and 25% of your grade depends on how diligently you peer-grade others; check CrowdGrader to find and grade your assigned submissions. Please comment on each submission to justify your score, pointing out its strengths and weaknesses. Team leaders, please make sure that every member of your team grades their assigned submissions.