Milestone 2 Hawkeye

From crowdresearch
Revision as of 05:24, 12 March 2015 by Kpuneetha (Talk | contribs) (Created page with "Attend a Panel to Hear from Workers and Requesters Panel 1: This video is mainly about how to use mturk, what scripts can be used. The major issue which was mentioned accordin...")


Attend a Panel to Hear from Workers and Requesters

Panel 1: This video is mainly about how to use MTurk and what scripts can be used with it. The major issue we observed was the interface: there should be better documentation, or graphical animations, to help new users understand the platform. The panel also discussed whether MTurk's interface makes it a better marketplace than other platforms, and it gave some insight into both the worker's and the requester's perspective.

Reading Others' Insights

Worker perspective: Being a Turker

This paper surveys how people use Mechanical Turk from the worker's perspective and analyzes the responses: what Turkers expect, how they feel about the work, why Turkers turk, how they find good requesters, and how their perceptions differ across different kinds of tasks.

Requester perspective: Crowdsourcing User Studies with Mechanical Turk

Based on experiments, this paper demonstrates how crowdsourcing user studies with Mechanical Turk can run into problems, and it proposes a solution that helps reduce them. The paper points out that results of tests such as usability studies cannot be relied on when run only on small groups, because small samples make it difficult to determine whether one approach is more effective than another. It also shows that, even though an MTurk task has a time limit, there should be a check on whether the worker has actually understood the task: many people rush through tasks just to earn money, or give wrong feedback and comments just for the sake of finishing, which corrupts the results. Rejecting invalid responses afterwards is also unnecessary work and time-consuming. The second thing we observed, which is a very good idea, is to first ask the worker a few questions related to the task, so that it is known the worker understands it, and only then have them complete the task. The paper's actual statistics show that this approach greatly reduced the number of invalid responses.
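The screening idea above can be sketched in a few lines. This is a minimal illustration, not the paper's actual implementation or the MTurk API: the function names, the answer key, and the submission format are all invented for the example. Workers first answer a few comprehension questions, and only the responses from workers who pass that check are kept.

```python
# Hypothetical sketch of pre-task screening: ask a few comprehension
# questions first, and keep only responses from workers who answered
# the screening questions correctly.

def passes_screening(answers, answer_key, threshold=1.0):
    """Return True if enough screening answers match the answer key."""
    correct = sum(1 for q, a in answers.items() if answer_key.get(q) == a)
    return correct / len(answer_key) >= threshold

def filter_responses(submissions, answer_key):
    """Keep only submissions whose screening answers pass the check."""
    return [s for s in submissions if passes_screening(s["screening"], answer_key)]

# Example: two submissions, one with a wrong screening answer.
answer_key = {"q1": "b", "q2": "a"}
submissions = [
    {"worker": "w1", "screening": {"q1": "b", "q2": "a"}, "response": "useful feedback"},
    {"worker": "w2", "screening": {"q1": "c", "q2": "a"}, "response": "asdf"},
]
valid = filter_responses(submissions, answer_key)
print([s["worker"] for s in valid])  # → ['w1']
```

On the real platform, the same effect is usually achieved with qualification requirements attached to the task rather than post-hoc filtering, but the filtering version keeps the sketch self-contained.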

Requester perspective: The Need for Standardization in Crowdsourcing

Crowdsourcing has become a promising solution for various problems, yet it still requires extensive structuring and managerial effort to be feasible, and standardizing basic building-block tasks would make crowdsourcing more scalable. The efficiency of the market could increase tremendously with even some basic standardization of the common types of (micro-)work posted on online labor markets. Advantages of standardizing simple tasks include reusability, trading commodities, and true market pricing. A standardized set of simple work units could later be assembled to generate tasks of arbitrary complexity.
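To make the "assembled from standard units" idea concrete, here is an illustrative sketch; the unit types and names are invented for the example and are not from the paper. A small set of standardized micro-task types (say, transcribe and verify) is reused and chained into a larger workflow.

```python
# Illustrative sketch (names invented): standardized micro-task types as
# reusable building blocks, assembled into a more complex workflow.

from dataclasses import dataclass

@dataclass
class MicroTask:
    task_type: str   # a standardized unit type, e.g. "transcribe" or "verify"
    payload: str     # the piece of work the unit operates on

def build_workflow(document_chunks):
    """Assemble a transcribe-then-verify workflow from standard units."""
    workflow = []
    for chunk in document_chunks:
        workflow.append(MicroTask("transcribe", chunk))
        workflow.append(MicroTask("verify", chunk))
    return workflow

tasks = build_workflow(["page 1", "page 2"])
print(len(tasks))          # → 4
print(tasks[0].task_type)  # → transcribe
```

Because every unit has a standard type, the same two building blocks can be reused across requesters, and prices could attach to the unit types rather than to each bespoke task.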

Both perspectives: A Plea to Amazon: Fix Mechanical Turk

This paper is mostly about what workers and requesters need MTurk to provide, and how, with those fixes and further development, it could eventually become a much better marketplace.