Winter Milestone 1 CRBlr200

From crowdresearch
Revision as of 14:45, 17 January 2016 by Prasutkumar (Talk | contribs)



This is the wiki page for Winter 2016 Milestone 1 spanning 12th January 2016 to 17th January 2016.

Experience the life of a Worker on Mechanical Turk

Reflect on your experience as a worker on Mechanical Turk. What did you like? What did you dislike?

For some reason, MTurk did not accept us as workers. As a worker, that is already a defeat of sorts, since it removes one lucrative platform one could have worked on. MTurk is heavily marketed as the best place to crowdsource or work, and since the other channels are not as popular or backed by a trustworthy brand like Amazon, workers are left in doubt about whether the platform they are working on is actually legitimate.

Two of our team members worked on Clickworker, and one worked on Microworkers.

Clickworker Experience - @afreen
I started working at Clickworker. The tasks for the Indian community go from less to virtually non-existent. The learning curve is a little too steep for my liking. I had to browse through multiple subreddits like r/beermoney & r/uhrswork to get a fair idea of what to do. The websites themselves are not very efficiently helpful.

I started working on tasks like verifying whether a search term appropriately matches a description. You can earn $0.02 per HIT. The qualification tests could be a lot more inclusive: many people fail them simply because you may not get the hang of a task until you have practiced it a bit. Sometimes a qualification test allows 3 tries, sometimes 10. This should be more consistent regardless of the requester or platform.

Microworker Experience - @kr.prastut
Microworker Experience - @kr.prastut
The tasks were mostly well defined, and they were grouped into categories such as highest paying, time to rate, and best jobs. The most useful grouping was time to rate: most tasks were validated within a day, even though the listing said it would take 7 days. Most of the jobs were sign-ups. Problems:

  • Most of the sign-ups required visiting links that looked like phishing/malicious websites.
  • Personal details like emails and passwords were asked for, and people tend to reuse the same passwords.
  • There was no worker profile (at least not where the tasks were posted) apart from a gold/silver medal on tasks based on ratings, so there were trust issues.
  • Some tasks required signing up on popular websites like Facebook and Instagram. The risk of the task deliverable being used against me is slim, but it exists. One task required signing up illegally on Apple iTunes, and the requester even provided a YouTube link showing how!
  • Some required voting for a specific competitor in a competition. Is that ethical?

A TL;DR of what we liked as workers:

  • Preparation material for each task from the requester is, if perused properly, very useful.
  • Tasks were crisp, brief, and understandable. Deliverables were also clearly specified.
  • You may get tasks that require little effort but carry higher compensation.
  • Performing reasonably on a task leads to a second task from the same employer. The second task was usually time-bound and so carried higher compensation.
  • The time required was clearly stated, and tasks did take approximately that long.

A TL;DR of what we disliked as workers (this list is obviously longer):

  • Learning curve: making an account, learning the HIT worker culture, and best practices are not clearly documented. The interface was outdated and provided no clues about what to do. A tour of the website would have been really helpful.
  • Getting banned from tasks after failing qualification tests.
  • No feedback mechanism for self-improvement.
  • Being bilingual is necessary, with one of the languages being English.
  • A worker might mark something right to the best of his/her ability while the requester finds it wrong; there is no clear judgment process.
  • Voting for contestants: is it ethical?

Experience the life of a Requester on Mechanical Turk

Reflect on your experience as a requester on Mechanical Turk. What did you like? What did you dislike? Also attach the CSV file generated when you download the HIT results. If you're from outside the USA and unable to access MTurk, you can try the MTurk requester sandbox, CrowdFlower, Microworkers, or Clickworker.

Explore alternative crowd-labor markets

Compare and contrast the crowd-labor market you just explored (TaskRabbit/oDesk/GalaxyZoo) to Mechanical Turk.



MobileWorks

  • What do you like about the system / what are its strengths?

MobileWorks provides an alternate source of income for people in the lower income bracket, which is great. It is a smart way for them to continue earning during their free time.

  • What do you think can be improved about the system?

1. It could be extended to the next level of users (the middle income bracket) with tasks of one level higher complexity.
2. Users could be incentivized with financial gains for completing a certain number of tasks in a day (example: a 10% bonus on completing 50+ tasks).
3. Currently some users are given duplicate tasks (i.e., tasks that have already been given to another person); this duplication of effort could be reduced.
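The tiered-bonus idea in point 2 can be sketched in a few lines. This is purely illustrative: the function name, rate, and threshold are our own assumptions, not part of any platform's API.

```python
# Hypothetical sketch of the tiered daily bonus suggested above.
# Thresholds and rates are illustrative, not from any real platform.

def daily_payout(tasks_completed: int, rate_per_task: float,
                 bonus_threshold: int = 50, bonus_pct: float = 0.10) -> float:
    """Base pay plus a percentage bonus once a daily task threshold is met."""
    base = tasks_completed * rate_per_task
    bonus = base * bonus_pct if tasks_completed >= bonus_threshold else 0.0
    return round(base + bonus, 2)

# A worker doing 60 tasks at $0.02 each earns the 10% bonus:
print(daily_payout(60, 0.02))  # 1.32
```

At 50+ tasks the bonus applies to the whole day's earnings, which rewards volume without complicating per-task pricing.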


Daemo

  • What do you like about the system / what are its strengths?

The paper not only discusses the mechanics of a crowd work platform but also actively works to solve problems between requester and worker (Boomerang and prototype tasks), which is critical to the success of any crowdsourcing platform. Because of this, such a platform can be useful to generic workers as well as workers with niche expertise.

We think the main strength of such a system is that it can be applied to systems like MobileWorks as well as to flash teams. The problems this paper seeks to solve are fundamental across all such platforms.

  • What do you think can be improved about the system?

1. If a worker keeps getting tasks from a requester who has given him/her positive feedback, he/she becomes acquainted with the type of work the company/requester is performing; this might create a privacy issue in the future.
2. Easy and quick communication with the requester, so that the worker can clear up any doubts and doesn't have to make assumptions while working.
3. For a requester who is new to the system, it could take more than one round of feedback to get used to the type of instructions that are required.
4. When the platform first opens up to the real world, all requesters and workers will be on the same level. There could be an instance where a requester gets a worker of poor quality without being aware of it, and might have to re-assign the work to another worker. To avoid this, requesters should be able to see a worker's reviews on other platforms like MTurk, if available. This could also help a good worker gain an edge when starting off on Daemo.

Flash Teams

  • What do you like about the system / what are its strengths?

The combination of Foundry and flash teams is an incredible way to leverage expert users as crowd workers! The ability to reuse blocks over and over is a great way to reduce the time spent in Foundry. Also, pipelining tasks with output and input tags could help even unconventional tasks get completed. Instead of being restricted to one particular type of task, this opens the platform up to a much larger variety of tasks.

  • What do you think can be improved about the system?

1. A diverse set of flash teams is likely to bring up a large set of coordination issues.
2. The quality of work needs to be evaluated and should comply with what the requester needs. If the requester isn't continuously involved in the process, he/she could be dissatisfied with the result, and the entire project would have to be worked on again.

Milestone Contributors

Slack usernames of all who helped create this wiki page submission: @varshine, @kr.prastut, @afreen