Winter Milestone 1


Due date: 8:00 pm PST, 17th Jan 2016 for submission; 12 pm PST, 18th Jan 2016 for peer evaluation.

In this milestone we want you to:

  • Get experience of what it is like to be a worker and a requester on current crowd-labor marketplaces
  • Read some research papers about crowdsourcing

Meeting resources:

  • YouTube recording of today's meeting: watch
  • Meeting 1 slideshow: coming soon.

Experience the life of a Worker on Mechanical Turk (Mandatory)

Sign up as a worker for Mechanical Turk here. Then practice doing some tasks; you should do enough tasks to earn $1. You can check out forums like Turker Nation for tips about working on MTurk.

From outside the USA? If you have difficulty signing up as a worker, you can try the worker sandbox, or other sites like CrowdFlower, Microworkers, or Clickworker. Your peers will post some worker sandbox HITs at Milestone 1 Sandbox HITs.

Deliverable

Reflect on your experience as a worker on Mechanical Turk. What did you like? What did you dislike? Write it up on this wiki, as described here.

Experience the life of a Requester on Mechanical Turk (Mandatory)

Sign up as a requester for Mechanical Turk here. Then post some HITs to tackle a task of interest to you. Please follow best practices when designing these tasks: be clear in your task description, and do not arbitrarily reject work. If you need funds to support your task, create a PayPal account and fill out this form so we can PayPal you a few dollars. If you don't want to spend money, you can post your HITs via the requester sandbox, list them on Milestone 1 Sandbox HITs, and ask people to do them for you. Aim to have your HITs done by at least 15 different people. If you'd rather create HITs programmatically than through the web interface, see the sketch after the example tasks below.

From outside the USA? If you have difficulty signing up as a requester, you can try the MTurk requester sandbox, or sites like CrowdFlower, Microworkers, or Clickworker.

Examples of tasks you might want to have done (you can pick one of the tasks below, or do a task of your own choosing):

  • Transcribe an audio recording
  • Get descriptions for images
  • Label parts of speech in a sentence
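
For those posting HITs programmatically, here is a minimal sketch using boto 2's MTurk bindings (pip install boto), pointed at the requester sandbox so no real money is spent. The credentials, external URL, and field values are placeholders for whatever task you choose, not part of the milestone:

    # A minimal sketch of creating a HIT with boto 2's MTurk bindings,
    # targeting the requester sandbox so no real money is spent.
    # Credentials and the external URL below are placeholders.
    from boto.mturk.connection import MTurkConnection
    from boto.mturk.question import ExternalQuestion
    from boto.mturk.price import Price

    conn = MTurkConnection(
        aws_access_key_id='YOUR_ACCESS_KEY',          # placeholder
        aws_secret_access_key='YOUR_SECRET_KEY',      # placeholder
        host='mechanicalturk.sandbox.amazonaws.com')  # sandbox endpoint

    # Point the HIT at a page you host, e.g. one showing an image and a text box.
    question = ExternalQuestion(
        external_url='https://example.com/describe-image',  # placeholder
        frame_height=600)

    conn.create_hit(
        question=question,
        title='Describe an image in one sentence',
        description='Write a short, accurate description of the image shown.',
        keywords=['image', 'description'],
        reward=Price(0.05),   # $0.05 per assignment
        max_assignments=15,   # aim for at least 15 distinct workers
        duration=60 * 10)     # workers get 10 minutes per assignment

Once this runs, the HIT should show up in the worker sandbox, where your peers can complete it.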

Deliverable

Reflect on your experience as a requester on Mechanical Turk. What did you like? What did you dislike? Also attach the CSV file generated when you download the HIT results.
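
If you want to sanity-check the results file before attaching it, a short script with Python's standard csv module can count distinct workers and dump the answers. The filename and column names below are assumptions: MTurk batch results typically include a WorkerId column and answer columns prefixed with Answer., but check the header row of your own file.

    # A quick sanity check of the downloaded results file using only the
    # standard library. Filename and column names are assumptions; verify
    # them against your own CSV's header row.
    import csv
    from collections import Counter

    with open('Batch_results.csv', newline='') as f:  # placeholder filename
        rows = list(csv.DictReader(f))

    workers = Counter(row['WorkerId'] for row in rows)
    print('%d assignments from %d distinct workers' % (len(rows), len(workers)))

    # Print each worker's answers, one row per assignment.
    answer_cols = [c for c in rows[0] if c.startswith('Answer.')]
    for row in rows:
        print(row['WorkerId'], {c: row[c] for c in answer_cols})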

Explore alternative crowd-labor markets (Optional/If MTurk is inaccessible)

Crowdsourcing is about much more than Mechanical Turk, and a variety of platforms are trying to address different problems. To help you get a flavor of the diversity of the area, choose one of the following crowdsourcing sites. If you're from outside the USA and cannot access MTurk, try one of these websites instead:

  • TaskRabbit
  • oDesk
  • GalaxyZoo

Explore it and get a feel for how the site works. If you want to, try doing a job on the site. You should not spend more than 30-45 minutes on this part.

Deliverable

Compare and contrast the crowd-labor market you just explored (TaskRabbit/oDesk/GalaxyZoo) to Mechanical Turk.

Research Engineering (Optional)

For folks who cannot wait to get their hands dirty with code, check out Introduction to Research Engineering and hop onto the #research-engineering channel on Slack. We'll help you:

  • Get set up locally
  • Get on GitHub, get postgres running, and familiarize yourself with pip

Milestone DRIs: @aginzberg, @durim and @shirish.goyal on Slack. They are your point of contact.

Please note: we'll have research-engineering-specific milestones in the future; this is just for people who cannot wait.
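
If you want a head start, here is a minimal sketch for checking that pip and postgres are wired up locally, assuming you have a local PostgreSQL server and have run pip install psycopg2; the database name and user are placeholders for whatever you created.

    # A minimal check that pip and PostgreSQL are working together.
    # The database name and user below are placeholders.
    import psycopg2

    conn = psycopg2.connect(dbname='crowd_dev', user='postgres')  # placeholders
    cur = conn.cursor()
    cur.execute('SELECT version()')
    print(cur.fetchone()[0])  # prints the PostgreSQL version string
    conn.close()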

Readings (Mandatory)

Please skim the following papers. They're copyrighted, so please don't redistribute them. You need to be signed in to view them.

MobileWorks

Narula P., Gutheim P., Rolnitzky D., et al. MobileWorks: A Mobile Crowdsourcing Platform for Workers at the Bottom of the Pyramid. Proceedings of the AAAI Workshop on Human Computation (HCOMP), 2011.

Daemo

Stanford Crowd Research Collective. Daemo: A Crowdsourced Crowdsourcing Platform (White paper, do not share)

Flash Teams

Retelny D., Robaszkiewicz S., To A., et al. Expert Crowdsourcing with Flash Teams. Proceedings of the 27th Annual ACM Symposium on User Interface Software and Technology (UIST). ACM, 2014: 75-85.

Other Recommended Readings

These readings are optional (you don't need to write up on them), but recommended:

My MTurk (half) Workday

Deliverable

For each system, please write down:

  • What do you like about the system / what are its strengths?
  • What do you think can be improved about the system?

Submitting

Please create a page for your team's submission at http://crowdresearch.stanford.edu/w/index.php?title=WinterMilestone_1_YourTeamName&action=edit (substituting your team name for YourTeamName), and copy over the template at WinterMilestone 1 Template. If you have never created a wiki page before, please see this or watch this video on YouTube.

Submission period

After you have created your wiki page and completed your milestone, post the link to it on the Meteor site! Meteor is like Reddit: you can upvote and comment on the submissions. Please create an account, and try to use your Slack username here as well. You can make your submission until 8:00 pm PST, 17th Jan 2016.

Peer evaluation period

From 8:00 pm PST, 17th Jan 2016 until 12 pm PST, 18th Jan 2016, you're encouraged to peer-evaluate each other's submissions. There's no right or wrong; give constructive feedback. We encourage you to comment on at least 3 submissions, and feel free to upvote as many as you want. Please note that all team members have to participate in peer feedback: if you are a team of 2, together you're encouraged to comment on 2*3=6 distinct submissions.

[Team Representative] Submission: post the links to your submissions by 8:00 pm PST, 17th Jan

We have a Reddit-like service on which you can post the links to the wiki pages for your submissions, explore them, and upvote them.

Sign-up instructions: log in with either Twitter or Facebook on the website. When it asks you to pick a username, pick the same username as your Slack username; this will help us identify and track your contributions better.

Link to the website: Meteor site. Post links to your submissions only once they're finished. Give your posts titles matching your team name this week.

Please submit your finished submissions by 8:00 pm PST, Sunday 17th Jan, and do not vote or comment until then.

[Everyone] Peer evaluation (upvote the ones you like, comment on them) from 8:05 pm PST, Sunday 17th Jan until 12 pm PST, Monday 18th Jan

After the submission phase, you are welcome to browse through, upvote, and comment on others' submissions. We especially encourage you to look at and comment on submissions that haven't yet received feedback, to make sure everybody's work gets feedback. You can use http://crowdresearch.meteor.com/needcomments to find submissions that haven't yet received comments, and http://crowdresearch.meteor.com/needclicks to find submissions that haven't yet been viewed many times.

COMMENT BEST PRACTICES: Everybody on the team reviews at least 3 submissions, each supported by a comment. The comment should justify your upvote, be constructive, and mention a positive aspect of the submission worth sharing. Negative comments are discouraged; instead, phrase your comment as a suggestion. If you disliked something, suggest improvements rather than criticizing: no idea is bad, and every idea has room for improvement.