Winter Milestone 1

Due date: 8:00 pm PST, 17th Jan 2016 for submission; 12 pm PST, 18th Jan 2016 for peer-evaluation.

DRI: @rajanvaish

Slack channel for coordination and questions: #general

In this milestone we want you to:

  • Get experience of what it is like to be a worker and a requester on current crowd-labor marketplaces
  • Read some research papers about crowdsourcing
  • Watch the YouTube recording of today's meeting: watch
  • Review the Meeting 1 slideshow: slides pdf

Experience the life of a Worker on Mechanical Turk (Mandatory)

Sign up as a worker for Mechanical Turk here. Then practice doing some tasks; complete enough to earn $1. You can check out forums like Turker Nation for tips about working on MTurk.

From outside the USA? If you have difficulty signing up as a worker (e.g., if Amazon doesn't respond within 48 hours), you can try the worker sandbox, or other sites like CrowdFlower, Microworkers, or Clickworker. Your peers will post some worker sandbox HITs at Milestone 1 Sandbox HITs.

Deliverable

Reflect on your experience as a worker on Mechanical Turk. What did you like? What did you dislike? Write it up on this wiki, as described here.

Experience the life of a Requester on Mechanical Turk (Mandatory)

Sign up as a requester for Mechanical Turk here. Then post some HITs to tackle a task that interests you. Please follow best practices for designing these tasks: be clear in your task description, and do not arbitrarily reject work. If you need funds to support your task, create a PayPal account and fill out this form so we can PayPal you a few dollars. If you don't want to spend money, you can post your HITs via the requester sandbox instead, list them on Milestone 1 Sandbox HITs, and ask people to do them for you. Aim to have your HITs done by at least 15 different people.
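
If you'd rather post HITs programmatically than through the web UI, here is a minimal sketch using the boto3 AWS SDK against the requester sandbox endpoint. The task title, reward, and question text are hypothetical placeholders; adapt them to your own task.

    import boto3

    # Connect to the requester sandbox (swap in the production endpoint for real HITs).
    mturk = boto3.client(
        "mturk",
        region_name="us-east-1",
        endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
    )

    # A single free-text question in MTurk's QuestionForm XML schema.
    question_xml = """<QuestionForm xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2005-10-01/QuestionForm.xsd">
      <Question>
        <QuestionIdentifier>description</QuestionIdentifier>
        <QuestionContent><Text>Describe the image in one sentence.</Text></QuestionContent>
        <AnswerSpecification><FreeTextAnswer/></AnswerSpecification>
      </Question>
    </QuestionForm>"""

    hit = mturk.create_hit(
        Title="Describe an image (example task)",   # hypothetical task details
        Description="Write a one-sentence description of an image.",
        Keywords="image, description, quick",
        Reward="0.05",                              # USD per assignment
        MaxAssignments=15,                          # at least 15 different workers
        LifetimeInSeconds=3 * 24 * 3600,            # HIT stays listed for 3 days
        AssignmentDurationInSeconds=600,            # workers get 10 minutes each
        Question=question_xml,
    )
    print("Created HIT:", hit["HIT"]["HITId"])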

From outside the USA? If you have difficulty signing up as a requester, you can try the MTurk requester sandbox, or sites like CrowdFlower, Microworkers, or Clickworker.

Examples of tasks you might want to have done (you can pick one of the tasks below, or do a task of your own choosing):

  • Transcribe an audio recording
  • Get descriptions for images
  • Label parts of speech in a sentence

Deliverable

Reflect on your experience as a requester on Mechanical Turk. What did you like? What did you dislike? Also attach the CSV file that Mechanical Turk generates when you download the HIT results.
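
Before writing up, it can help to skim your results programmatically. A minimal Python sketch, assuming a downloaded batch file named Batch_results.csv (the filename is a placeholder; MTurk's results export prefixes answer columns with "Answer."):

    import csv
    from collections import Counter

    # "Batch_results.csv" is a placeholder; use the file you downloaded from MTurk.
    with open("Batch_results.csv", newline="") as f:
        rows = list(csv.DictReader(f))

    workers = {row["WorkerId"] for row in rows}
    print(f"{len(rows)} assignments from {len(workers)} distinct workers")

    # Show the three most common responses for each answer field.
    for col in (c for c in rows[0] if c.startswith("Answer.")):
        print(col, Counter(row[col] for row in rows).most_common(3))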

Explore alternative crowd-labor markets (Optional/If MTurk is inaccessible)

Crowdsourcing is not only about Mechanical Turk; it's much more than that, and a variety of platforms are trying to address different problems. To help you get a flavor of the diversity of the area, choose one of the following crowdsourcing sites: TaskRabbit, oDesk, or GalaxyZoo. If you're from outside the USA, you may not be able to access MTurk at all; in that case, try one of these sites instead.

Explore it and get a feel for how the site works. If you want to, try doing a job on the site. You should not spend more than 30-45 minutes on this part.

Deliverable

Compare and contrast the crowd-labor market you just explored (TaskRabbit/oDesk/GalaxyZoo) to Mechanical Turk.

Research Engineering (Test Flight)

For folks who cannot wait to get their hands dirty with code, check out Introduction to Research Engineering and hop onto the #research-engineering channel on Slack.

The deliverable for this week is a screenshot showing that you have successfully gotten set up locally. Post the image in #engineering-deliver on Slack by Sunday at 8 pm PST.
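
One quick way to sanity-check your setup before taking the screenshot is to hit the local dev server and confirm it responds. A minimal sketch, assuming the server runs at localhost:8000 (Django's default; your port may differ) and that the requests package is installed:

    import requests  # pip install requests

    # localhost:8000 is Django's default dev-server address; adjust if yours differs.
    resp = requests.get("http://localhost:8000/")
    print("Server responded with status", resp.status_code)  # 200 means you're up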

For the test flight, we assume that you already know (or can quickly learn on your own) the part of our stack that you want to work on. If you are new to Git, Angular, Django, or anything else, we highly recommend reading up on the docs or browsing a few tutorials here. Our expectation is that by the end of this week, you will have completed Engineering Milestones 0-2 on Introduction to Research Engineering. Feel free to continue on to the other Milestones once you have finished.

Milestone directly responsible individuals (DRIs): @aginzberg, @dmorina and @shirish.goyal on Slack.

Please note, we'll have research-engineering-specific milestones in the future; this one is just for people who cannot wait.

See also: How to install Daemo without Vagrant on Windows 10.

Finally, to play with Daemo, see this link.

Readings (Mandatory)

Please skim the following papers. They're copyrighted, so please don't redistribute them. You need to be signed in to the wiki to view them.

MobileWorks

Narula, P., Gutheim, P., Rolnitzky, D., et al. MobileWorks: A Mobile Crowdsourcing Platform for Workers at the Bottom of the Pyramid. In Human Computation: Papers from the 2011 AAAI Workshop (WS-11-11), 2011.

Daemo

Stanford Crowd Research Collective. Daemo: A Crowdsourced Crowdsourcing Platform (White paper, do not share)

Flash Teams

Retelny, D., Robaszkiewicz, S., To, A., et al. Expert Crowdsourcing with Flash Teams. In Proceedings of the 27th Annual ACM Symposium on User Interface Software and Technology (UIST '14). ACM, 2014, pp. 75-85.

Other Recommended Readings

These readings are optional (you don't need to write them up), but recommended:

My MTurk (half) Workday

Deliverable

For each system, please write down:

  • What do you like about the system / what are its strengths?
  • What do you think can be improved about the system?

Submitting

Please create a page for your team's submission at http://crowdresearch.stanford.edu/w/index.php?title=WinterMilestone_1_YourTeamName&action=edit (substituting YourTeamName with your team name), and copy over the template at WinterMilestone 1 Template. If you have never created a wiki page before, please see this or watch this video on YouTube.

[Team Representative] Submission: post the links to your ideas by 8:00 pm PST, 17th Jan

We have a Reddit-like service on which you can post the links to the wiki pages for your submissions, explore them, and upvote them.

Sign-up Instructions: Log in with either Twitter or Facebook on the website. When it asks you to pick a username, pick the same username as on Slack; this will help us identify and track your contributions.

Link to the website: Meteor site. Post links to your ideas only once they're finished. Give your posts titles matching your team name this week.

Please submit your finished ideas by 8:00 pm PST on Sunday, 17th Jan, and DO NOT vote or comment until then.

[Everyone] Peer-evaluation from 8:05 pm PST on Sunday, 17th Jan until 12 pm PST on Monday, 18th Jan

After the submission phase, you are welcome to browse, upvote, and comment on others' ideas. We especially encourage you to look at and comment on ideas that haven't yet gotten feedback, to make sure everybody's ideas get feedback. You can use http://crowdresearch.meteor.com/needcomments to find ideas that haven't yet gotten comments, and http://crowdresearch.meteor.com/needclicks to find ideas that haven't yet been viewed many times.

COMMENT BEST PRACTICES: Everybody on the team should review at least 3 ideas, each supported by a comment. The comment should justify your upvote, be constructive, and mention a positive aspect of the idea worth sharing. Avoid purely negative comments; instead, frame criticism as a suggestion. If you disliked an idea, suggest improvements rather than criticizing it (no idea is bad; every idea has room for improvement).