Milestone 2 ams

From crowdresearch
Revision as of 21:28, 11 March 2015 by Maniksingh (Talk | contribs)


Special Note

Our examinations are going on this week, so we were unable to work on the Monday panel discussion or the two research papers.

Reading Others' Insights

Requester perspective: Crowdsourcing User Studies with Mechanical Turk

1) What observations about workers can you draw from the readings? Include any that may be strongly implied but not explicit.

Although Mechanical Turk has become a very popular crowdsourcing platform, the element of money attached to it inevitably invites malicious practices. In Experiment 1, where the requester posed straightforward, one-word-answer questions so as to make the task as easy as possible for workers, workers took unethical advantage of it by submitting bogus answers in search of quick money. Experiment 2, although lengthy and subjective, produced better results. Let's be honest: surveys are meant to carry personal opinion, so no answer is right or wrong. It's an opinion, and you can't hold it against the worker. There lies the loophole. The worker fills in the survey, submits it, and that's it. Job done. Money in their pocket.

2) What observations about requesters can you draw from the readings? Include any that may be strongly implied but not explicit.

The requester has to be extra smart and think two steps ahead of an unfaithful worker. Experiment 1, although worker-friendly, produced unusable statistics. Experiment 2, which was extensive and less worker-friendly, turned out better for the requester. To use the crowd as a source, you need to eliminate the unwanted from the crowd. It's okay to make the task a little difficult for workers as long as it remains achievable and worth the money being offered.

Requester perspective: The Need for Standardization in Crowdsourcing

1) What observations about workers can you draw from the readings? Include any that may be strongly implied but not explicit.

The method of standardization could well turn out to be a barrier for beginners looking for opportunities in the crowdsourcing sector. The more restrictions there are (let's be honest, standardizing is restricting), the more a worker will hesitate before setting up a profile on the platform. The process should be clean, smooth, and open.

Hence, while standardization is intended to improve the work and payment standards of workers, it may well backfire, defeating its entire purpose.

2) What observations about requesters can you draw from the readings? Include any that may be strongly implied but not explicit.

For the requesters, yes, I agree standardization will bring more sophistication into the system: jobs will be clearly segregated and payments better handled. But then again, for a beginner requester who doesn't know how to categorize a job so as to bring in more workers, this system is too much of a hassle. A very likely scenario is that a requester who didn't standardize a job well (being new to the system, mistakes will be made) ends up with very few workers and is forced to increase the pay to attract more, which shouldn't have been necessary in the first place.

Both perspectives: A Plea to Amazon: Fix Mechanical Turk

1) What observations about workers can you draw from the readings? Include any that may be strongly implied but not explicit.

  • Since the pay of a HIT, or even multiple HITs, does not amount to much, workers prefer to move on rather than pursue a complaint against a requester.
  • Workers aim to improve their 'rating' by working on very small, low-paying HITs that are a walk in the park. This makes them appear more eligible for higher-paying tasks, regardless of whether they are actually skilled for those tasks or not.
  • A productive worker cannot differentiate himself from the rest of the crowd and ends up being paid less.
  • Experienced workers usually gauge the trustworthiness of a new requester by first completing only a small number of HITs for them.
  • Workers who work legitimately on a large batch of HITs for a requester and get rejected unjustly not only lose the opportunity to earn the money but also lose reputation.
  • Workers search for work based on most recent HITs, instead of searching for work which matches their skill set.

2) What observations about requesters can you draw from the readings? Include any that may be strongly implied but not explicit.

  • Requesters hold the ultimate power in the relationship and can misuse that responsibility.
  • Requesters are stuck with using a complicated and technologically outdated platform.
  • The web platform and the complex API do little to reduce the time and costs involved in requesting HITs, thus moving the break-even point further away from the requester.
  • If a requester has built their own interface hosted in an HTML iframe, it becomes easy to expose non-MTurk people to their HITs via plain HTML URLs, thus leaking work out of an ecosystem that was supposed to be self-sufficient enough to contain itself.
  • The flexible nature of the MTurk platform is beneficial for requesters who want to design and represent their HITs as they please.
  • A newly joined requester is unaware of the market dynamics and might be discouraged from continuing with the crowdsourcing platform if he/she encounters inexperienced workers early on.
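The iframe point above can be made concrete. An "external" MTurk HIT is described by an ExternalQuestion XML document that simply points at an ordinary HTTPS URL, which MTurk then loads in an iframe; anyone who has that URL can open the task page outside MTurk, which is exactly the leak described. A minimal sketch, assuming a hypothetical survey URL (the placeholder address is not a real task):

```python
# Sketch of the ExternalQuestion XML used for iframe-hosted MTurk HITs.
# The task itself lives at an ordinary web URL, so the same link can be
# shared with and opened by people outside MTurk entirely.
# The survey URL used below is a hypothetical placeholder.

EXTERNAL_QUESTION_TEMPLATE = """\
<ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">
  <ExternalURL>{url}</ExternalURL>
  <FrameHeight>600</FrameHeight>
</ExternalQuestion>
"""

def build_external_question(url: str) -> str:
    """Return the ExternalQuestion XML for a HIT whose UI is hosted at `url`."""
    return EXTERNAL_QUESTION_TEMPLATE.format(url=url)

question_xml = build_external_question("https://example.com/survey?task=42")
print(question_xml)
```

The XML is what a requester would pass as the HIT's question when creating it through the API; nothing in it ties the underlying page to MTurk, which is both the flexibility and the leak.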

Do Needfinding by Browsing MTurk-related forums, blogs, Reddit, etc

TurkerNation

  • Requesters are trying to devise their own verification methods, and their trust in mTurkers seems low, since they are trying to filter workers out.

http://turkernation.com/showthread.php?23735-Well-paid-survey-HIT-for-TurkerNation-members-only&p=245585&viewfull=1#post245585

  • A classic case of technical malfunction where the system 'accidentally' rejected a turker's work. The turker seems infuriated by the issue, although he/she appears to have been consoled once a support reply arrived from the requester.

http://turkernation.com/showthread.php?23874-Doing-a-survey-and-the-HIT-was-just-automatically-returned

mTurk Forum

  • The forum has an abundance of posts by requesters who are offering incentives and bonuses in search of good turkers. Starting a thread also allows requesters to interact with workers and gain knowledge about the workers and their experience.
  • New users, both workers and especially requesters, are wary of the mTurk platform. This suggests that either these users are completely new to the concept of crowdsourcing platforms, or they have come across negative reviews of the platform.