Milestone 1 NCPD
Template for your submission for Milestone 1. Do not edit this directly - instead, make a new page at Milestone 1 YourTeamName or whatever your team name is, and copy this template over.
Experience the life of a Worker on Mechanical Turk
Reflect on your experience as a worker on Mechanical Turk. What did you like? What did you dislike?
Experience the life of a Requester on Mechanical Turk
Reflect on your experience as a requester on Mechanical Turk. What did you like? What did you dislike? Also attach the CSV file generated when you download the HIT results.
Explore alternative crowd-labor markets
Compare and contrast the crowd-labor market you just explored (TaskRabbit/oDesk/GalaxyZoo) to Mechanical Turk.
MobileWorks: A Mobile Crowdsourcing Platform for Workers at the Bottom of the Pyramid (2011)
This paper presents MobileWorks, a mobile crowd-sourcing platform intended to provide easier access to employment for users in the developing world. It is designed to be usable by workers without extra cost or effort. MobileWorks provides human optical character recognition (OCR) tasks that workers complete on low-end mobile phones through a web browser. To address the limited screen resolution of low-end phones, MobileWorks divides documents into many small pieces and sends each piece to a different worker. An initial pilot study with 10 users over a two-month period showed that basic OCR tasks are feasible with a simple mobile web-based application. Workers using MobileWorks averaged 120 tasks per hour at an accuracy rate of 99% using a multiple-entry solution. In addition, users had a positive experience with MobileWorks: all study participants said they would recommend it to friends and family.
- Easy to access and mobile.
- More economical, as it runs on low-end mobile devices.
- Simple GUI that is easy to understand.
- Simple OCR tasks, with documents divided in a way that suits small screens.
- Limited to OCR tasks.
- Even simple OCR tasks require dividing documents and merging the results.
- Cannot be used by workers who do not understand English.
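The 99% accuracy figure above comes from the multiple-entry scheme: each document piece is transcribed by several workers and their answers are reconciled. A minimal majority-vote sketch of that idea (the function name and strict-majority threshold are illustrative assumptions, not taken from the paper):

```python
from collections import Counter

def reconcile(entries):
    """Pick the transcription most workers agree on for one document piece.

    Returns the majority answer, or None when there is no strict majority
    (in which case the piece would be re-sent to additional workers).
    """
    counts = Counter(entries)
    answer, votes = counts.most_common(1)[0]
    # Accept the transcription only if a strict majority of workers agree.
    if votes > len(entries) / 2:
        return answer
    return None

# Three workers transcribe the same image fragment:
print(reconcile(["invoice", "invoice", "lnvoice"]))  # invoice
print(reconcile(["cat", "car", "cab"]))              # None
```

Sending each piece to multiple workers trades extra labor for accuracy, which is why the paper reports both throughput (120 tasks/hour) and accuracy (99%) together.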
mClerk: Enabling Mobile Crowdsourcing in Developing Regions (2012)
Like MobileWorks, this platform targets workers in developing countries who lack access to computers and the Internet. This paper presents mClerk, a new platform for mobile crowd-sourcing in developing regions. mClerk sends and receives tasks via SMS, making it accessible to anyone with a low-end mobile phone. However, mClerk is not limited to text: it leverages a little-known protocol to send small images via ordinary SMS, enabling novel distribution of graphical tasks. Through a 5-week deployment in semi-urban India, the authors demonstrate that mClerk is effective for digitizing local-language documents. Usage of mClerk spread virally from 10 users to 239 users, who digitized over 25,000 words during the study. The paper discusses the social ecosystem surrounding this usage, and evaluates the potential of mobile crowd-sourcing to both deliver and derive value from users in developing regions.
- Easy to access: runs on low-end mobile phones, and the platform itself is mobile.
- No Internet connection required; tasks are sent and received via SMS.
- Cost-effective, as many people have free SMS packs lasting months or years.
- Handles not only text but also small images, by incorporating a little-known SMS protocol.
- Offline money transfer to workers attracted many people.
- Provides language-translation facilities.
- Largely designed for the partially educated segment of society; the ideal users are low-income workers who have plenty of free time and whose professions allow social interaction.
- The system can limit the influence of less reliable workers by matching their responses against those of a trusted worker, or by requiring a minimum match rate before a worker qualifies for payment.
- Payment to workers varies by task type and difficulty level.
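The quality-control idea in the second-to-last point can be sketched as a simple gate on agreement with a trusted worker. This is a hedged illustration; the function name, task IDs, and the 0.8 threshold are assumptions, not details from the paper:

```python
def qualifies_for_payment(worker_answers, trusted_answers, min_match_rate=0.8):
    """Gate payment on agreement with a trusted worker's answers.

    worker_answers and trusted_answers map task IDs to transcriptions;
    only tasks both workers completed are compared.  The 0.8 minimum
    match rate is an assumed value for illustration.
    """
    shared = set(worker_answers) & set(trusted_answers)
    if not shared:
        return False
    matches = sum(worker_answers[t] == trusted_answers[t] for t in shared)
    return matches / len(shared) >= min_match_rate

trusted = {"t1": "gaon", "t2": "pani", "t3": "khet"}
print(qualifies_for_payment({"t1": "gaon", "t2": "pani", "t3": "knet"}, trusted))  # False (2/3 match)
print(qualifies_for_payment({"t1": "gaon", "t2": "pani"}, trusted))                # True (2/2 match)
```

A scheme like this lets the platform pay reliable workers without manually reviewing every SMS response.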
Expert Crowdsourcing with Flash Teams (2014)
This paper introduces flash teams, a framework for dynamically assembling and managing paid experts from the crowd. Flash teams advance a vision of expert crowd work that accomplishes complex, interdependent goals such as engineering and design. These teams consist of sequences of linked modular tasks and handoffs that can be computationally managed. Interactive systems reason about and manipulate these teams' structures: for example, flash teams can be recombined to form larger organizations and authored automatically in response to a user's request. Flash teams can also hire more people elastically in reaction to task needs, and pipeline intermediate output to accelerate completion times. To enable flash teams, the authors present Foundry, an end-user authoring platform and runtime manager. Foundry allows users to author modular tasks, and then manages teams through handoffs of intermediate work. They demonstrate that Foundry and flash teams enable crowd-sourcing of a broad class of goals including design prototyping, course development, and film animation, in half the work time of traditional self-managed teams.
- A generic framework that allows complex tasks to be managed and completed by the crowd.
- Interactive systems can leverage flash teams to support collaboration, create new teams automatically, grow and shrink teams on demand, and combine teams into larger organizations.
- This paper offers a method of scaling expert crowd work through computational management of an elastic workforce.
- Flash teams are the first to leverage the scale of paid crowd-sourcing for expert work.
- Flash teams afford dynamic recruitment and coordination of on-demand expertise that is extremely difficult in offline scenarios.
- Flash teams often aim to gather experts with complementary expertise rather than redundant viewpoints.
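The "sequences of linked modular tasks and handoffs" can be pictured as a small pipeline data structure, where a task starts as soon as its inputs are handed off. This is only a sketch under assumed names; Foundry's actual task model is far richer:

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    """One modular unit of expert work with a defined handoff."""
    name: str
    role: str                 # expertise required, e.g. "designer" (assumed labels)
    hours: float              # estimated duration
    depends_on: list = field(default_factory=list)

def schedule(tasks):
    """Compute each task's earliest start time by walking its dependencies.

    Pipelining: a task begins as soon as its inputs are handed off, so
    independent tasks overlap instead of running back-to-back.
    Tasks are assumed to be listed in dependency order.
    """
    start = {}
    for t in tasks:
        start[t.name] = max((start[d.name] + d.hours for d in t.depends_on),
                            default=0.0)
    return start

design = Task("design mockup", "designer", 4)
frontend = Task("build UI", "developer", 6, [design])
usertest = Task("user test", "researcher", 3, [design])
print(schedule([design, frontend, usertest]))
# {'design mockup': 0.0, 'build UI': 4.0, 'user test': 4.0}
```

Note how "build UI" and "user test" both start at hour 4 and run in parallel; this overlap of independent modules is the source of the halved completion times the paper reports.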