Milestone 1 NotMathMajors


Experience the life of a Worker on Mechanical Turk

Our team was fortunate to be accepted into the Mechanical Turk system quickly, and we promptly set out to earn a dollar from various HITs. We aimed to complete a variety of the typical HITs found on Mechanical Turk rather than repeatedly completing tasks for the same requester.

We completed these HITs:

  • Play an eye tracking game!
    • Requested by px
    • Reward of $0.15 and a $0.025 bonus
  • Search for a Keyword and Log Position in Search Results
    • Requested by Jonathan Gilliam
    • Reward of $0.08
  • Open a website and confirm contents
    • Requested by Barry Allen
    • Reward of $0.03
  • Review a website for us
    • Requested by TipTopMarketing
    • Reward of $0.07
  • Quality Assurance on a website
    • Requested by Farhan Memon
    • Reward of $0.40
  • Type the text from images
    • Requested by CopyText Inc.
    • Reward of $0.01 and an $0.08 bonus
  • Transcribe a 2-5 minute interview (audio recording)
    • Requested by Aka Tuma
    • Reward of $0.30

These tasks earned us a total of $1.145.

Reflections

Completing Work

Mechanical Turk has a wide variety of HITs available, which can help a worker stave off prolonged boredom; however, the tasks are so small in scope that no real relationship forms between worker and requester. If workers could understand the scope of the work as a whole, the tasks would feel more meaningful and the work experience would improve. Of the HITs we completed, the most enjoyable was the eye-tracking game requested by px, or Princeton University. It helped that it was a game, but we also had an idea of the scope of their research, which made the task far more engaging. If Mechanical Turk, or any crowdsourcing platform, gave the worker more than just a list of monotonous tasks to complete, it would provide a significantly better experience for the worker.

Navigating the Interface

The largest downside of being a worker on Mechanical Turk is navigating the interface. It feels dated, slow, and clunky, which directly undercuts the rapid task completion that the HITs themselves demand. Mechanical Turk does keep the interface simple, but little of the information on the page is placed intuitively. A crowdsourcing platform built around rapid, simple task completion should have an equally rapid and simple interface, one that lets the worker move fluidly between tasks without ever hunting for information or for a link to click. The experience could also be improved by giving requesters a structured framework to build their HITs within, rather than letting each requester embed an arbitrary form of their own inside a box; a sketch of what such a structured HIT might look like follows.
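As one point of reference, Mechanical Turk already exposes a structured alternative to free-form embedded HTML: the QuestionForm XML schema. The sketch below is a minimal QuestionForm; the question text and length constraints are invented for illustration, and only the schema namespace and element names come from Amazon's documentation.

```python
# A minimal QuestionForm sketch: the structured alternative to embedding an
# arbitrary HTML form in an iframe. The question content is invented for
# illustration; the namespace is Amazon's documented QuestionForm schema.
QUESTION_XML = """<?xml version="1.0" encoding="UTF-8"?>
<QuestionForm xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2005-10-01/QuestionForm.xsd">
  <Question>
    <QuestionIdentifier>page_contents</QuestionIdentifier>
    <QuestionContent>
      <Text>Open the website and briefly describe its main content.</Text>
    </QuestionContent>
    <AnswerSpecification>
      <FreeTextAnswer>
        <Constraints>
          <Length minLength="10" maxLength="500"/>
        </Constraints>
      </FreeTextAnswer>
    </AnswerSpecification>
  </Question>
</QuestionForm>
"""
```

Because every field is declared rather than drawn by the requester, a worker-side interface could render a form like this natively, keeping navigation and submission identical across HITs.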

Experience the life of a Requester on Mechanical Turk

CSV Here

Reflecting on our experience on the requester side of crowd labor, the first thing we noticed was the presentation. The requester side is more attractive, easier to use, and far more informative than the worker side of Mechanical Turk; it feels as if the entire service leans toward attracting requesters for profitability. In fact, when we set up our task, we found that the service defaults every task to allow only workers with the Masters qualification, and creating a task with this default hands Amazon a steep fee of 30% of the reward's value. I mistakenly created our task with the Masters requirement left on, and minutes later I received an email from one of the workers explaining how to get quality labor by instead enabling the "Total Approved HITs is not less than 5000" and "HIT Approval Rate is not less than 98%" qualifications. We strongly dislike this default: the fee is high, and there is no real benefit to restricting work to those who meet Amazon's "Masters" designation. Defaulting to this paid feature feels slimy. Nonetheless, the versatility Mechanical Turk allows when creating a task is a huge plus; as a requester on Mechanical Turk you feel empowered and in control. The sketch after this paragraph shows how the qualification filters that the worker suggested map onto the requester API.
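This is a minimal sketch using boto3's MTurk client, assuming AWS credentials are configured. The two qualification type IDs are Amazon's documented system qualifications for approval rate and approved-HIT count; the title, description, reward, and question file are placeholders, not our actual task.

```python
# Sketch: create a HIT gated on the worker-suggested qualification filters
# instead of the default Masters requirement. Assumes boto3 is installed and
# AWS credentials are configured; the HIT details below are placeholders.
import boto3

client = boto3.client("mturk", region_name="us-east-1")

qualification_requirements = [
    {
        # System qualification: HIT Approval Rate (%) is not less than 98
        "QualificationTypeId": "000000000000000000L0",
        "Comparator": "GreaterThanOrEqualTo",
        "IntegerValues": [98],
    },
    {
        # System qualification: Number of HITs Approved is not less than 5000
        "QualificationTypeId": "00000000000000000040",
        "Comparator": "GreaterThanOrEqualTo",
        "IntegerValues": [5000],
    },
]

response = client.create_hit(
    Title="Open a website and confirm contents",         # placeholder
    Description="Visit a URL and answer one question.",  # placeholder
    Reward="0.05",                                       # USD, passed as a string
    MaxAssignments=10,
    AssignmentDurationInSeconds=600,
    LifetimeInSeconds=86400,
    Question=open("question.xml").read(),                # QuestionForm or HTMLQuestion XML
    QualificationRequirements=qualification_requirements,
)
print(response["HIT"]["HITId"])
```

Leaving the Masters qualification out of this list is exactly the choice the web interface's default obscures.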

Explore alternative crowd-labor markets

GalaxyZoo

Key Difference

The starkest contrast between GalaxyZoo and Mechanical Turk is the absence of a monetary incentive for completing a task. GalaxyZoo asks you to help classify images of galaxies by their appearance and compares your results with those of other users who viewed the exact same image; the service lets you create a profile that records which galaxies you have classified, but offers no reward for completion. This crowd-labor website explores another layer of crowdsourcing by drawing on workers' own intrinsic motivation rather than paying them in proportion to their labor or expertise.

Largest Similarity

There is little about GalaxyZoo that directly compares to Mechanical Turk aside from the type of labor. Image classification is all GalaxyZoo offers, while Mechanical Turk includes the same kind of work among its many low-paying Human Intelligence Tasks.

Readings

MobileWorks

MobileWorks is an extremely simple and efficient crowdsourcing platform designed to serve users in developing nations who have access only to simple, cheap cell phones. It is impressive that the team built a platform that can help low-income workers earn more money; however, the tasks being completed are extremely simple, monotonous OCR tasks in which the user transcribes a few words of handwritten text without any context. Of the 10 users the service was tested with, all of those surveyed said they would recommend the service to family and friends. This matters, because it means the service could spread quickly through the lower classes of developing nations and help them earn money; however, it does not attack the root of the problem behind such a vast disparity in wealth. MobileWorks could be improved by adding more variety to its tasks, which is not easy when the user is restricted to a simple cell phone, but it is a necessary step. Experimenting with stronger motivation for completing tasks, and with tasks that teach a subject and then require mastery of it, would also be worthwhile.

mClerk

The mClerk system is valuable because it broadens how crowdsourcing can spread, removing location as a barrier, and it brings earning incentives to lower-income areas. By using the mobile format it provides an easy way to reach almost anyone. These strengths give it a large foundation, one that can stretch past the boundaries of the current crowdsourcing world. Location is a major limiting factor today: it can dictate not only the people you work with but also the projects you take on, since a project from a given area reflects the interests of that area and its culture. Because the platform runs on mobile devices it is very accessible; nearly everyone has a mobile phone, even though some lack a data connection or a computer. The system could be improved by enhancing the overall user experience and by raising the payout as an incentive, since many users seem to drop out midway through the phases of collaboration. Low-end phones and non-English speakers make this kind of crowdsourcing harder, and that seems to be mClerk's only weakness.

Flash Teams

Flash Teams is a showcase of how complex tasks can be completed by a multitude of small groups doing modular tasks. Flash Teams' greatest strength is in it's flexibility and diversity, it can easily expand or condense as the need arises while working on a task and this diversity brings a different array of skills from each member who joins. This method allows these intricate tasks to be finished quickly, almost instantaneously compared to other the length of time it would take an individual or small company. Flash Teams is a powerful method of crowdsourcing, but can be improved upon if the ability to add a personal layer of personal attachment to each project as well as a sense of companionship toward fellow workers, a potent side effect of using an expendable and short-term workforce.