Milestone 1

From crowdresearch
Revision as of 20:48, 26 February 2015 by Geza (Talk | contribs)

In this milestone we want you to:

  • Experience what it is like to be a worker and a requester on current crowd-labor marketplaces
  • Read some research papers about crowdsourcing

Experience the life of a Worker on Mechanical Turk

Sign up as a worker for Mechanical Turk here. Then complete some tasks; do enough to earn at least $1.

If you have difficulty signing up as a worker, you can try the worker sandbox, or other sites such as CrowdFlower or Microworkers. Your peers will post some worker-sandbox HITs at Milestone 1 Sandbox HITs.


Reflect on your experience as a worker on Mechanical Turk. What did you like? What did you dislike?

Experience the life of a Requester on Mechanical Turk

Sign up as a requester for Mechanical Turk here. Then post some HITs (you will need money, which we will provide how?). Aim to have at least 20 workers complete your HITs.

Examples of tasks you might want to have done (you can pick one of the tasks below, or do a task of your own choosing):

  • Transcribe an audio recording
  • Get descriptions for images
  • Label parts of speech in a sentence
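To make one of these tasks concrete, here is a sketch of the XML QuestionForm payload you could attach when creating a free-text image-description HIT through the MTurk API. It assumes the 2005-10-01 QuestionForm schema; the question text and the identifier "description" are placeholders you would replace with your own.

```python
# A minimal MTurk QuestionForm for a free-text answer, held as a
# Python string. Assumes the 2005-10-01 QuestionForm schema; the
# identifier "description" and the question text are placeholders.
QUESTION_FORM = """\
<QuestionForm xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2005-10-01/QuestionForm.xsd">
  <Question>
    <QuestionIdentifier>description</QuestionIdentifier>
    <IsRequired>true</IsRequired>
    <QuestionContent>
      <Text>Describe the linked image in one sentence.</Text>
    </QuestionContent>
    <AnswerSpecification>
      <FreeTextAnswer/>
    </AnswerSpecification>
  </Question>
</QuestionForm>
"""
print(QUESTION_FORM)
```

The identifier you choose here ("description") becomes the column name for workers' answers in the results file you download later.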


Reflect on your experience as a requester on Mechanical Turk. What did you like? What did you dislike? Also attach the CSV file generated when you download the HIT results.
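The results CSV has one row per completed assignment, with worker answers in columns prefixed "Answer.". The snippet below groups answers by HIT so you can compare what different workers submitted; the column names (HITId, WorkerId, Answer.description) follow the general shape of the Requester-site download and are illustrative rather than the exact schema.

```python
import csv
import io
from collections import defaultdict

def answers_by_hit(csv_text):
    """Group worker answers by HITId from an MTurk results CSV.

    Assumes one row per completed assignment and answer columns
    prefixed with "Answer." (illustrative of the Requester-site
    download; exact columns vary by task).
    """
    grouped = defaultdict(list)
    for row in csv.DictReader(io.StringIO(csv_text)):
        answers = {k[len("Answer."):]: v
                   for k, v in row.items() if k.startswith("Answer.")}
        grouped[row["HITId"]].append((row["WorkerId"], answers))
    return dict(grouped)

# Example: two assignments for one image-description HIT.
sample = """HITId,WorkerId,Answer.description
HIT1,W1,a cat on a sofa
HIT1,W2,a sleeping cat
"""
results = answers_by_hit(sample)
print(results["HIT1"])  # two (worker, answers) pairs
```

Grouping by HITId like this makes it easy to spot HITs where workers disagree, which is useful material for your reflection.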

Explore alternative crowd-labor markets

Choose one of TaskRabbit, oDesk, or GalaxyZoo and explore it. If you're inspired, take on a job. You do not need to spend more than 30–45 minutes.


Compare and contrast the crowd-labor market you just explored (TaskRabbit/oDesk/GalaxyZoo) to Mechanical Turk.


Read research papers about crowdsourcing

Please skim the following papers. For each system, note:

  • What do you like about the system / what are its strengths?
  • What do you think can be improved about the system?


Mobile Crowdsourcing

Narula P, Gutheim P, Rolnitzky D, et al. MobileWorks: A Mobile Crowdsourcing Platform for Workers at the Bottom of the Pyramid. Human Computation, 2011.


Gupta A, Thies W, Cutrell E, et al. mClerk: enabling mobile crowdsourcing in developing regions. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, 2012: 1843-1852.


Eagle N. txteagle: Mobile crowdsourcing. Internationalization, Design and Global Development. Springer Berlin Heidelberg, 2009: 447-456.

Flash Teams

Retelny D, Robaszkiewicz S, To A, et al. Expert crowdsourcing with flash teams. Proceedings of the 27th annual ACM symposium on User interface software and technology. ACM, 2014: 75-85.


Please create a page for your team's deliverable at (substituting in YourTeamName with your team name), then copy over the template at Milestone 1 Template (you can see the source here).

Then, after you have filled out the page, post the link to the wiki page you created on CrowdGrader at SOMEWHERE.

Old Stuff

Worker: Make at least $1

Requester: work with at least 20 workers (tasks)

—> either: if you have funds, get real workers to do it.

—> or: post on sandbox and recruit some friends

TODO: let’s figure out what the signup procedure is these days

Requester: give a few examples…transcribe audio, extract info from web site, anything you want

If you can’t get in, there are other platforms: Microworkers, CrowdFlower.

Submit: attach CSV of results

Read: my half-day as an MTurker

Read: MobileWorks, mClerk, txtEagle, (flash teams?)

Submit: what do you like about these systems? What don’t you like?

Submit: reflect on your experience as a worker. reflect on your experience as a requester. write up short reflections: what did you like? what did you hate?