Milestone 1 PixelPerfect

Experience the life of a Worker on Mechanical Turk

Our experience as workers on these various crowd-work sites was an unpleasant one.

  • Amazon Mechanical Turk did not allow us to register as Indian workers. The review process took 2 days, after which we were informed that our accounts could not be activated.
  • There seemed to be some glitches with the Microworkers platform: its policy dictates one registered user per IP address. We made several attempts to create an account from different workstations, yet we always received a message that our IPs were already connected to some account, so we could not register here either.
  • The experience on CrowdFlower/Clickworker was slightly better, since we were able to register and activate our worker accounts.
    • Though we had active accounts, available work was scarce, and we were only able to earn 0.1 Euro per member on Clickworker.
    • There were also no tasks available to us on CrowdFlower, because they required a high contribution level (i.e. level 1, level 2 or level 3); there did not seem to be any entry-level tasks.
    • Around 0.0125 USD was earned by completing surveys on CrowdFlower Contribution Channel partners like InboxDollar, SwagBucks and ClixSense, which host CrowdFlower microtasks (these were likewise unavailable).

What we liked about this experience is that people can work from home, try to earn a living for themselves, and perhaps do something they enjoy. What we disliked is that it clearly showed that, despite a decent Internet connection, competence and a willingness to work, entry-level opportunities on crowd-working platforms are rare, especially for people living outside the United States.

Experience the life of a Requester on Mechanical Turk

We were not able to register as requesters on Mechanical Turk, as only citizens of the United States could do so.

We really liked the concept of people handing mundane tasks they might not want to do, or might not have time to do, to a community of crowd workers and allotting some money to pay those workers for accurate and acceptable work. Something we disliked during this experience is that workers are not respected for the work they have done, and the work-to-pay ratio has not been standardized by any of the crowd-working platforms.

Explore alternative crowd-labor markets

Having been denied access to Amazon Mechanical Turk as workers, we can only compare it with various other crowdsourcing websites based on the general picture we have formed from forums and discussion pages.

We have chosen oDesk for comparison since it is a widely used online labor marketplace where clients hire individual freelancers for longer-term, skilled work, which makes it a useful contrast to Mechanical Turk's anonymous microtasks.

Comparing Mechanical Turk with oDesk:

  • Mechanical Turk is chiefly meant for short-term microtasks, while oDesk is meant for longer-term engagements.
  • Mechanical Turk has the upper hand over oDesk in terms of cost; tasks there are far cheaper for requesters.
  • oDesk relies mostly on human-to-human interaction; workers there do not expect to be hired, assigned work and evaluated by an automated process.
  • For crowdsourcing in the strict sense, Mechanical Turk is the better option, since tasks are completed by a pool of interested contributors, whereas oDesk largely mediates deals between individual people and merely acts as a platform for them to interact.

Readings

MobileWorks

MobileWorks is a mobile-phone-based crowdsourcing platform intended to provide employment to users in the developing world. It offers OCR tasks that can be completed in a web browser on a low-end phone: scanned documents are segmented into small pieces (OCR tasks) and sent to users to digitize and return. A multiple-entry system, in which several workers transcribe the same segment, ensures the quality of the digitization.
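
The summary above does not spell out how multiple entries are reconciled, so the following is only a minimal sketch of the idea in Python, assuming a simple majority vote over redundant transcriptions of one segment; the function name and agreement threshold are our own invention, not MobileWorks' actual design.

  from collections import Counter

  def aggregate_entries(entries, min_agreement=2):
      """Accept a transcription for one OCR segment only if at least
      min_agreement workers typed the same text; otherwise return None
      so the segment can be re-posted to more workers."""
      text, votes = Counter(e.strip() for e in entries).most_common(1)[0]
      return text if votes >= min_agreement else None

  # Three workers transcribed the same scanned segment (illustrative data).
  entries = ["invoice #4821", "invoice #4821", "invoice #4321"]
  print(aggregate_entries(entries) or "no agreement - re-post segment")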

Strengths

  • Simple mobile interface, accessible to workers at the bottom of the economic pyramid
  • Takes advantage of high mobile phone penetration in India and the low cost of mobile Internet
  • A cost-effective way to distribute microtasks, compared with SMS-based channels, desktop outsourcing centres, etc.
  • Offers a convenient way to earn and be productive during free time

Things that could be improved

  • Improve the user interface to group multiple tasks together; this saves navigation time and reduces idle time while the next task is being fetched (see the sketch after this list).
  • Add more financial incentives to get work done: bonuses, referral benefits, etc.
  • Additional task types could be taken up, such as local product/place descriptions, audio transcription, language subtitling and translation.
  • For requesters, a priority-queue system could be made available so urgent tasks are completed first (also illustrated below).
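
Neither mechanism is specified in the paper; the sketch below is our own illustration combining the first and last suggestions: a priority queue of tasks (urgent tasks first) from which a worker fetches a small batch in one round trip, cutting per-task fetch latency on a slow mobile connection. All names and priority values are hypothetical.

  import heapq
  import itertools

  class TaskQueue:
      """Hypothetical task queue: a lower priority number means more
      urgent. A counter breaks ties so equal-priority tasks stay FIFO."""
      def __init__(self):
          self._heap = []
          self._seq = itertools.count()

      def post(self, task, priority=5):
          heapq.heappush(self._heap, (priority, next(self._seq), task))

      def fetch_batch(self, n=5):
          """Hand a worker up to n tasks in one round trip, most urgent first."""
          batch = []
          while self._heap and len(batch) < n:
              _, _, task = heapq.heappop(self._heap)
              batch.append(task)
          return batch

  q = TaskQueue()
  q.post("digitize segment 17")                       # normal priority
  q.post("digitize segment 18")
  q.post("urgent: invoice due today", priority=1)     # requester-flagged
  print(q.fetch_batch(2))  # the urgent task comes out first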

mClerk

  • What do you like about the system / what are its strengths?
  • What do you think can be improved about the system?

Flash Teams

Flash Teams is a framework for assembling and managing expert crowds. Ordinary crowdsourcing works well for tasks that can be accomplished with minimal skill, whereas tasks like software development and design, which require real expertise, remain largely out of reach. The paper presents a platform called Foundry that aims to solve such complex, interdependent problems by structuring the collaboration and assembling the expert crowd. Each task is divided into micro-tasks or modules, with a team responsible for each module. To avoid diffusion of responsibility, a Directly Responsible Individual (DRI) makes sure the modules are completed in the given time.
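
The paper describes Foundry's visual interface rather than its internals, but the block-and-arrow structure it uses can be pictured as a directed acyclic graph of modules. The sketch below, with invented module names and dependencies, uses Python's standard-library graphlib to compute which modules can run in parallel at each stage, which is also the essence of the pipelining noted in the strengths list.

  from graphlib import TopologicalSorter  # Python 3.9+

  # Each module lists the modules whose output it consumes (invented example).
  modules = {
      "design mockup":  [],
      "write copy":     [],
      "implement UI":   ["design mockup"],
      "integrate copy": ["write copy", "implement UI"],
      "user testing":   ["integrate copy"],
  }

  ts = TopologicalSorter(modules)
  ts.prepare()
  stage = 1
  while ts.is_active():
      ready = list(ts.get_ready())   # modules whose inputs are all done
      print(f"stage {stage}: run in parallel -> {ready}")
      ts.done(*ready)
      stage += 1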

Strengths:

  • Foundry takes on managerial responsibilities, minimizing the burden on the end user (requester), who might not be a natural manager, while still giving them complete control over the task's progress: they can pause the work, tweak the team structure and resume. The modules are visualized as “blocks” with input and output tags, so the end user only has to follow the arrows to track the flow of responsibilities.
  • Although the overall flow is sequential, tasks are pipelined to minimize total completion time, and some modules can run in parallel (as in the sketch above).
  • The team structure and modular design simplify management of the expert crowd.
  • Schedules can be updated on the fly, and the crowd size is elastic.
  • Path-search support lets the user assign tasks to new combinations of teams based on their previous experience, without having to spend time scouting for individual experts to form a team.

Things that could be improved:

  • New workers with little or no crowdsourcing experience but sufficient expertise are ignored by the path search, since previously used teams are reused. Our suggestion: give new workers an initial task to test and rate them (a minimal sketch follows this list).
  • Miscommunication and arguments within a flash team lead to delays, to workers being fired, or to workers simply quitting. Such disputes should be settled by one person, either the user or the DRI. The user can keep track of the workflow and give continuous feedback on the direction the flash team is heading.
  • According to organizational behavior studies, human managers are still preferred over automated ones.
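
The paper itself proposes no such qualification step, so the following is only a minimal sketch of our first suggestion, assuming a graded entry task with a known answer key; the function name, rating scale and passing threshold are hypothetical.

  def rate_new_worker(answers, answer_key, pass_ratio=0.8):
      """Score a new worker's qualification task against a known answer
      key; admit them to path search with an initial rating if they pass.
      The 5-point rating scale and 0.8 threshold are illustrative."""
      correct = sum(a == k for a, k in zip(answers, answer_key))
      score = correct / len(answer_key)
      return {"rating": round(score * 5, 1), "qualified": score >= pass_ratio}

  print(rate_new_worker(["A", "C", "B", "D"], ["A", "C", "B", "B"]))
  # -> {'rating': 3.8, 'qualified': False}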