Milestone 1 Betzy


Experience the life of a Worker on Mechanical Turk

Reflect on your experience as a worker on Mechanical Turk. What did you like? What did you dislike?

Pros:

  • It's flexible: you can easily fit it into your schedule.
  • You can always find HITs that you like, thanks to the large number of active HITs.

Cons:

  • There is no categorization of the HITs that would facilitate search.
  • The platform itself seems dated; we think it needs a new, responsive design.
  • There is no worker protection: a worker can complete a job correctly and still not get paid.
  • New workers might experience "starvation" if the HITs require qualifications.
  • It lacks a quick, easy-to-use communication channel between requesters and workers.
  • It lacks a transparent approval/rejection process.
  • The payment for some tasks is very low, even though they might require special skills.
  • The approval/rejection process might take too long.

Experience the life of a Requester on Mechanical Turk

The CSV file generated when we downloaded the HIT results: Media:Sentiment labeling.csv

Pros:

  • You can always find people who will work on your HITs.
  • It takes a relatively short time to get the results.
  • You can create custom qualifications to control which workers perform your HITs.

Cons:

  • The requester always needs to review the work to ensure good quality.
  • AMT doesn't provide support for the review process; this happens offline.
  • More flexibility is needed to distribute batches of HITs to specific users/groups.
  • When using the templates offered by AMT, you can't use custom qualifications or reject poor-quality work (work is automatically approved).
  • Lack of worker demographic data.

Explore alternative crowd-labor markets

Compare and contrast the crowd-labor market you just explored (TaskRabbit/oDesk/GalaxyZoo) to Mechanical Turk.

oDesk

oDesk impresses with an updated, clean, and clear user interface. Upon sign-up, users are asked to complete a profile in which they can describe their skills, education level, and strengths. Freelancers post their hourly rate and work under real-world conditions, choosing a project they work on from start to finish rather than completing just one part or task of a larger project. Joining oDesk isn't instantaneous like on mClerk, which could deter potential users looking to join a crowdsourcing platform quickly and easily. To me, though, this makes the platform seem more reliable, both as a freelancer and as someone looking for a freelancer; the "this seems too good to be true" comment by one of the mClerk users seems to correspond with this. Depending on one's personal preference, this is where AMT is easier for users who have a full-time job and are looking for short, easy work to complete during their idle time rather than full-scope projects that require several hours.

GalaxyZoo

MTurk pays workers for the work they do; GalaxyZoo, on the other hand, is completely voluntary. It is one big project about classifying galaxies, and the requirements (questions) are the same all the time, whereas on MTurk you can post any kind of project and the requirements differ from task to task. GalaxyZoo seems to be a very interesting project that attracts many people, and the results seem to be of great importance for research, but I guess it would be better if they added a small reward for the people who do all the work.

Readings

MobileWorks

What do you like about the system / what are its strengths?

  • A good development and employment opportunity for developing countries.
  • Very useful for tasks that are specific to the language of the workers.
  • Being a mobile solution, it allows people to work in any environment, at any time.
  • Payment is comparable to salary standards in India.

What do you think can be improved about the system?

  • The system doesn't use the workers' quality ratings for quality assurance (only for payment). The quality rating could be a criterion for defining the correct answer: for example, an answer is accepted as correct only when a good-quality worker's answer (e.g., one with a quality rating above 75%) matches another worker's answer; the quality ratings of the other workers are then updated accordingly. A sketch of this idea follows the list.
  • The types of tasks that can be performed are limited by the small screen of a mobile phone.
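
A minimal sketch of how this proposed rule could work, in Python. All names, the 0.75 threshold, and the rating-update step are illustrative assumptions, not part of MobileWorks:

    from collections import defaultdict

    QUALITY_THRESHOLD = 0.75  # hypothetical cutoff for a "good quality" worker

    def resolve_answer(responses, ratings, threshold=QUALITY_THRESHOLD):
        """Accept an answer as correct only when a high-rated worker's
        answer matches at least one other worker's answer.

        responses: list of (worker_id, answer) pairs for one task
        ratings:   dict mapping worker_id -> quality rating in [0, 1]
        Returns the accepted answer, or None if no qualifying match yet.
        """
        by_answer = defaultdict(list)
        for worker, answer in responses:
            by_answer[answer].append(worker)
        for answer, workers in by_answer.items():
            has_trusted = any(ratings.get(w, 0.0) >= threshold for w in workers)
            if has_trusted and len(workers) >= 2:
                return answer
        return None

    def update_ratings(responses, correct, ratings, step=0.05):
        """Nudge each worker's rating up or down based on the accepted answer."""
        for worker, answer in responses:
            delta = step if answer == correct else -step
            ratings[worker] = min(1.0, max(0.0, ratings.get(worker, 0.5) + delta))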

mClerk

mClerk is a crowdsourcing program that allows projects to be delegated to remote regions of developing countries, reaching a capable workforce that is otherwise unavailable due to insufficient internet access through computers or smartphones. The trial run was conducted in rural regions near Bangalore, where text in the local language, Kannada, was to be digitized using crowdsourcing methods. Although internet access is not readily available in this region, most of the general population owns an SMS-capable mobile phone. Using a bitmap algorithm, images of written text were transmitted to participants via SMS, who then responded with the transliteration of each word; a rough sketch of that idea appears below.
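
The paper's actual encoding isn't reproduced here; the following is only a rough sketch of the general idea, packing a tiny monochrome word image into a byte string small enough for a single SMS (the function name and image size are hypothetical):

    def pack_bitmap(pixels):
        """Pack a small 1-bit bitmap (a list of rows of 0/1 pixels) into bytes.

        Eight pixels per byte: a hypothetical 16x8 word image fits in
        16 bytes, well under the ~140-byte payload of a single SMS.
        """
        flat = [bit for row in pixels for bit in row]
        out = bytearray()
        for i in range(0, len(flat), 8):
            chunk = flat[i:i + 8]
            byte = 0
            for bit in chunk:
                byte = (byte << 1) | bit
            byte <<= 8 - len(chunk)  # pad the final byte if needed
            out.append(byte)
        return bytes(out)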

  • What do you like about the system / what are its strengths?

Crowdsourcing platforms are mainly beneficial to workers earning minimum wage or below. Laborers have the opportunity to increase their income by participating in crowdsourcing programs during idle time such as transit, breaks, or other "unproductive" periods. mClerk enables companies to delegate menial tasks to workers in remote locations who lack specialty qualifications. This eliminates the concern that crowdsourcing platforms allow unqualified amateurs to underbid professionals with market-comparable hourly rates. While the SMS technology at the core of this idea is the optimal tool for reaching this target group, it also poses the biggest limitation. Although the algorithm allows low-resolution images to be transferred, the types of projects are limited to display dimensions of only a few pixels. More complex projects and tasks require longer descriptions and possibly more detailed images, which in turn require better technology and language skills on the user's end.

The benefits of this platform vs. Amazon Mechanical Turk include the quick sign-up via phone call, the low-tech approach, and the simple user experience. AMT sports an outdated user interface that is very complicated to set up. A requester publishing a project can narrow the field of participants completing "HITs" by setting various required qualifications, but is then forced to evaluate the validity of each answer after the HITs have been completed. At several hundred to thousands of HITs per project, this task alone allows erroneous answers to "slip through the cracks". mClerk's algorithm, which automatically checks for congruency between answers given by different users, already achieves 91% accuracy and could be improved with future testing; a sketch of such a check follows.
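
A minimal sketch of such a congruency check, assuming a word is accepted once two workers give the same answer (the names and the normalization step are assumptions, not mClerk's actual implementation):

    from collections import Counter

    def check_congruency(answers, required_agreement=2):
        """Accept a transliteration once `required_agreement` workers
        give the same answer; otherwise request another response.

        answers: list of strings submitted so far for one word image
        Returns (accepted_answer_or_None, needs_more_workers)
        """
        counts = Counter(a.strip().lower() for a in answers)
        if not counts:
            return None, True
        answer, count = counts.most_common(1)[0]
        if count >= required_agreement:
            return answer, False
        return None, True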

  • What do you think can be improved about the system?

The main drawback seems to be the payment method. While the paper's conclusion outlines that a more efficient payment method could be achieved by reaching agreements with cellular companies, users are currently forced to complete a minimum of 10-20 tasks to receive the minimum credit of INR 10. This, as mentioned in user comments, has led participants to neglect other duties in order to meet their minimum requirement. It may also lead to users completing their tasks less carefully and honestly in order to save time.

Flash Teams

  • What do you like about the system / what are its strengths?

It allows you to quickly assemble teams; the best features are elasticity and pipelining. You can get a project done within a few hours, and the built-in workflows make the requester's job easier.

  • What do you think can be improved about the system?

Allowing parallel runs for the same project within the same workflow might be a good idea: trying different approaches at the same time, in case one of them fails. A sketch of this idea follows.
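
A minimal sketch of that control flow in Python, where each alternative approach is a callable and the first one to succeed wins. Flash Teams itself orchestrates people rather than functions, so this only illustrates the idea; all names are hypothetical:

    from concurrent.futures import ThreadPoolExecutor, as_completed

    def run_parallel_alternatives(approaches, task):
        """Run several alternative approaches to the same task in parallel
        and return the first successful result.

        approaches: list of callables, each taking `task` and returning a
        result or raising an exception on failure. The executor still waits
        for the remaining approaches to finish when the block exits.
        """
        with ThreadPoolExecutor(max_workers=len(approaches)) as pool:
            futures = {pool.submit(fn, task): fn for fn in approaches}
            for future in as_completed(futures):
                try:
                    return future.result()  # first approach to succeed wins
                except Exception:
                    continue  # this approach failed; wait for the others
        raise RuntimeError("all approaches failed")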