Milestone 1 Betzy

Experience the life of a Worker on Mechanical Turk

Reflect on your experience as a worker on Mechanical Turk. What did you like? What did you dislike?

Experience the life of a Requester on Mechanical Turk

Reflect on your experience as a requester on Mechanical Turk. What did you like? What did you dislike? Also attach the CSV file generated when you download the HIT results.

Explore alternative crowd-labor markets

Compare and contrast the crowd-labor market you just explored (TaskRabbit/oDesk/GalaxyZoo) to Mechanical Turk.

Readings

MobileWorks

  • What do you like about the system / what are its strengths?
  • What do you think can be improved about the system?

mClerk

mClerk is a crowdsourcing platform that allows projects to be delegated to remote regions of developing countries, reaching a capable workforce that would otherwise be unavailable due to insufficient access to the internet through computers or smartphones. The trial run was conducted in rural regions near Bangalore, where text in the local language, Kannada, was to be digitized using crowdsourcing methods. Although the internet is not readily available in this region, most of the general population owns an SMS-capable mobile phone. Using a bitmap-encoding algorithm, images of written words were transmitted to participants via SMS, and the participants replied with the transliteration of each word.
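
To make the transmission step concrete, here is a minimal sketch of how a small 1-bit word image could be packed into SMS-sized payloads. The pixel format, the 140-byte single-SMS limit, and the function names are illustrative assumptions for this sketch, not the exact encoding described in the mClerk paper.

  # Minimal sketch of packing a 1-bit word image into SMS-sized payloads.
  # Assumptions (not from the paper): pixels arrive as rows of 0/1 values,
  # and each SMS carries at most 140 bytes of binary payload.

  def pack_bitmap(rows):
      """Pack rows of 0/1 pixels into bytes, 8 pixels per byte, row by row."""
      bits = [px for row in rows for px in row]
      bits += [0] * (-len(bits) % 8)  # pad to a whole number of bytes
      data = bytearray()
      for i in range(0, len(bits), 8):
          byte = 0
          for bit in bits[i:i + 8]:
              byte = (byte << 1) | bit
          data.append(byte)
      return bytes(data)

  def split_into_sms(payload, max_bytes=140):
      """Split a packed bitmap into chunks that each fit in one SMS."""
      return [payload[i:i + max_bytes] for i in range(0, len(payload), max_bytes)]

  # Example: a 72x14 word image fits in a single SMS (72 * 14 / 8 = 126 bytes).
  image = [[0] * 72 for _ in range(14)]
  messages = split_into_sms(pack_bitmap(image))
  print(len(messages), "SMS message(s),", len(messages[0]), "bytes in the first")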

  • What do you like about the system / what are its strengths?

Crowdsourcing platforms are mainly beneficial to workers earning minimum wage or below. Laborers have the opportunity to increase their income by participating in crowdsourcing programs during idle time such as transit, breaks, or other “unproductive” periods. mClerk enables companies to delegate menial tasks to workers in remote locations who lack specialty qualifications. This eliminates the concern that crowdsourcing platforms allow unqualified amateurs to underbid professionals who charge market-comparable hourly rates. While the SMS technology at the core of this idea is the optimal tool for reaching this target group, it also poses the biggest limitation. Although the encoding algorithm allows low-resolution images to be transferred, tasks are limited to images only a few pixels across. More complex projects and tasks require longer descriptions and possibly more detailed images, which in turn demand better technology and stronger language skills on the worker’s end.

The benefits of this platform over Amazon Mechanical Turk include quick sign-up via phone call, the low-tech approach, and the simple user experience. AMT sports an outdated user interface with a complicated setup. A requester publishing a project can narrow the field of participants completing “HITs” by setting various required qualifications, but is then forced to evaluate the validity of each answer after the HITs have been completed. With several hundred to thousands of HITs per project, this step alone allows erroneous answers to “slip through the cracks.” mClerk’s algorithm, which automatically checks for congruency between answers given by different users (as sketched below), already achieves 91% accuracy and could be improved with further testing.
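
A congruency check of this kind can be approximated with a simple plurality rule: accept a transcription only when enough independent workers submit the same normalized answer. The threshold, the normalization, and the function names below are assumptions for illustration, not mClerk’s published algorithm.

  # Sketch of a simple answer-congruency check: accept a transcription only
  # when a plurality of independent workers submit the same (normalized)
  # answer. The threshold and normalization are illustrative assumptions.
  from collections import Counter

  def normalize(answer):
      """Ignore case and extra whitespace when comparing answers."""
      return " ".join(answer.strip().lower().split())

  def accept_if_congruent(answers, min_agreement=2):
      """Return the accepted answer, or None if no answer has enough support."""
      counts = Counter(normalize(a) for a in answers)
      best, support = counts.most_common(1)[0]
      return best if support >= min_agreement else None

  # Hypothetical worker submissions for one word image:
  print(accept_if_congruent(["namma ooru", "Namma Ooru ", "namma uru"]))  # accepted: 'namma ooru'
  print(accept_if_congruent(["namma ooru", "namma uru"]))                 # None: no agreement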

  • What do you think can be improved about the system?


Flash Teams

  • What do you like about the system / what are its strengths?
  • What do you think can be improved about the system?