Milestone 1 PixelPerfect

From crowdresearch
Revision as of 23:19, 4 March 2015 by Kushagramehta (Talk | contribs) (Experience the life of a Worker on Mechanical Turk)


Team PixelPerfect's submission for Milestone 1.

Experience the life of a Worker on Mechanical Turk

Our experience as workers on these various crowd-work sites was an unpleasant one.

  • Amazon Mechanical Turk did not allow us to register as Indian workers. The review process took two days, after which we were informed that the account could not be activated.
  • Microworkers appeared to have some glitches: its policy allows only one registered user per IP address. We made several attempts to create an account from different workstations, but we always received a message that our IPs were already connected to an account, so we could not register there either.
  • The experience on CrowdFlower/Clickworker was slightly better, since we were able to register and activate our worker accounts.
    • Although we had active accounts, available work was scarce, and we were only able to earn 0.10 Euro per member on Clickworker.
    • There were also no tasks available to us on CrowdFlower, since the listed tasks required a higher contributor level (level 1, 2 or 3) and there did not seem to be any entry-level tasks.
    • We earned around 0.125 USD by completing surveys on CrowdFlower contribution-channel partners such as InboxDollars, Swagbucks and ClixSense; the CrowdFlower microtasks they hosted were likewise unavailable.

What we liked about this experience is that people can work from home, try to earn a living for themselves, and perhaps do something they enjoy. These platforms can also match workers to the tasks best suited to their skills, resulting in efficient and effective work.

What we disliked is that the experience clearly showed that, despite a decent internet connection, competence and a willingness to work, entry-level opportunities on crowd-working platforms are rare, especially for people living outside the United States.

Experience the life of a Requester on Mechanical Turk

We were not able to register as a Requester on Mechanical Turk, since only citizens of the United States could register as requesters.

We really liked the concept of people handing off mundane tasks they do not want to do, or do not have time to do, to the community of crowd workers, and setting aside some money to pay these workers for accurate and acceptable work. Something we disliked is that workers are not respected for the work they have done, and the work-to-pay ratio has not been standardized by any of the crowd-working platforms.

Unfortunately, since we were not able to create a Requester account on Mechanical Turk or the other crowd-working platforms, no CSV file has been attached.

Explore alternative crowd-labor markets

Since we were denied access to Amazon Mechanical Turk as workers, we can only compare it with other crowdsourcing websites based on the general picture we have formed from forums and discussion pages.

We chose oDesk for comparison because it seemed the most appropriate crowdsourcing site among those provided, given that our team members mostly work in technical fields. oDesk is a crowdsourcing platform that connects clients with freelancers: clients post their jobs, and the freelancers who apply are interviewed.

Comparing Mechanical Turk with oDesk:

  • oDesk's strong identity-verification system makes things easier for clients by keeping scammers away, in contrast to the disposable Mechanical Turk worker IDs.
  • Mechanical Turk is meant chiefly for short-term tasks, while oDesk is meant chiefly for long-term engagements.
  • Mechanical Turk has the upper hand over oDesk in terms of cost.
  • oDesk involves mostly human-to-human interaction: workers do not expect to be hired, assigned tasks and evaluated by an automated process.



MobileWorks is a mobile-phone-based crowdsourcing platform intended to provide employment to users in the developing world. It provides OCR tasks that workers can complete in a web browser on a low-end phone: scanned documents are segmented into small pieces (OCR tasks) and sent to users to digitize and send back. A multiple-entry system is used to ensure the quality of digitization.

  • Simple mobile interface, accessible to workers at the bottom of the economic pyramid.
  • Takes advantage of high mobile-phone penetration in India and the low cost of mobile Internet.
  • Cost-effective way to distribute microtasks, compared with SMS- or desktop-based outsourcing centres.
  • Offers a convenient way to earn and be productive during free time.
What can be improved
  • Improve the user interface to group multiple tasks together; this saves navigation time and reduces idle time while fetching the next task.
  • Add more financial motivation strategies to get work done: bonuses, referral benefits, etc.
  • Take up additional tasks such as local product/place descriptions, audio transcription, language subtitling and translation.
  • For requesters, make a priority-queue system available for the completion of urgent tasks.
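The multiple-entry quality system mentioned above can be sketched as a simple majority vote over independent worker entries. The function name and agreement threshold below are illustrative assumptions, not MobileWorks' actual implementation:

```python
from collections import Counter

def consensus_transcription(entries, min_agreement=2):
    """Return the transcription agreed on by enough workers, else None.

    The same image segment is sent to several workers; an answer is
    accepted only when enough independent entries match. (Threshold and
    function name are hypothetical.)
    """
    counts = Counter(entry.strip() for entry in entries)
    answer, votes = counts.most_common(1)[0]
    return answer if votes >= min_agreement else None

# Three workers digitize the same segment; two agree, one misreads.
print(consensus_transcription(["invoice", "invoice", "lnvoice"]))  # invoice
# No agreement: the segment would be re-sent to more workers.
print(consensus_transcription(["cat", "cot", "eat"]))  # None
```

Segments with no consensus could then be re-queued rather than accepted, trading a little extra worker cost for accuracy.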


mClerk is another mobile-phone-based crowdsourcing platform for developing regions. However, it uses an SMS-based system to send OCR microtasks: binary pictures are sent via Nokia's Smart Messaging (SM) and Ericsson's EMS protocols. It has found its niche in the digitization of local-language documents.

  • Simple SMS-based system makes it accessible and affordable.
  • Smart handling of local-language fonts.
  • Takes advantage of high mobile penetration in India and cheap SMS plans.
  • Good motivation system: reminders and feedback, top-up amounts, referral benefits, a leaderboard, etc.
  • The system was perceived not as a part-time job but as a service; its viral propagation testifies to its success.
  • Eliminates most overhead costs, allowing more of the benefit to reach the worker.
  • Productive use of free time.
What can be improved
  • Longer words could be sent as a set of 2-3 images to increase accuracy, instead of being compressed at reduced quality.
  • Develop a QuizUp-style social application that encourages users to compete; the element of competition benefits both users and requesters.
  • Additional tasks could be explored through this medium, such as describing images and songs, or answering questions that require local knowledge.
  • Partner with carriers to reduce the service fee and increase worker payment.
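The first suggestion above, sending a long word as 2-3 segment images rather than one compressed image, could be sketched as follows. The per-segment pixel limit, the cap of three segments, and the function name are all hypothetical:

```python
import math

def split_word_image(width_px, max_segment_px=48):
    """Split a word image of the given pixel width into up to three
    roughly equal horizontal segments, instead of downscaling it to
    fit a single SMS image. Returns (start, end) pixel ranges.
    (The 48 px segment limit is an illustrative assumption.)
    """
    n_segments = min(3, max(1, math.ceil(width_px / max_segment_px)))
    step = math.ceil(width_px / n_segments)
    return [(i * step, min((i + 1) * step, width_px))
            for i in range(n_segments)]

# A 120 px word is sent as three full-quality 40 px images.
print(split_word_image(120))  # [(0, 40), (40, 80), (80, 120)]
# A short word still fits in one image.
print(split_word_image(40))   # [(0, 40)]
```

Each range would then be cropped from the scanned word and sent as its own SM/EMS picture message, preserving resolution at the cost of extra messages.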

Flash Teams

Flash teams are a framework for assembling and managing expert crowds. Tasks like software development and design require particular expertise and remain largely out of reach for ordinary crowdsourcing frameworks. Foundry aims to solve such complex, interdependent problems feasibly by structuring the collaboration and assembling the expert crowd. Each task is divided into micro-tasks, or modules, with a team responsible for each module. To avoid diffusion of responsibility, a Directly Responsible Individual (DRI) makes sure the modules are completed in the given time.

  • Foundry takes on managerial responsibilities, minimizing the burden on the end user (requester), who might not be a natural manager, while still giving the requester complete control over the progress of the task.
  • Despite the sequential flow, tasks are pipelined to minimize overall completion time, and some processes can run in parallel.
  • Team structure and modular design simplify the management of the expert crowd.
  • The ability to update schedules and the elasticity of the crowd size are a plus.
  • Path-search support lets the user assign tasks to new combinations of previously used teams without having to spend time scouting for individual experts to form a team.
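The pipelining described above can be sketched as grouping modules into dependency stages, where everything within a stage can run in parallel. This is a simplified illustration of the idea, not Foundry's actual scheduler, and the module names are hypothetical:

```python
def pipeline_stages(modules):
    """Group modules into stages that can run concurrently.

    modules: dict mapping a module name to the set of modules it
    depends on. Every module in a stage depends only on modules from
    earlier stages, so a stage's teams can work in parallel.
    """
    remaining = {name: set(deps) for name, deps in modules.items()}
    stages = []
    while remaining:
        # Modules with no unmet dependencies are ready to start.
        ready = sorted(m for m, deps in remaining.items() if not deps)
        if not ready:
            raise ValueError("circular dependency among modules")
        stages.append(ready)
        for m in ready:
            del remaining[m]
        for deps in remaining.values():
            deps -= set(ready)
    return stages

# A design module unblocks two development modules that run in parallel.
print(pipeline_stages({
    "design": set(),
    "frontend": {"design"},
    "backend": {"design"},
    "integration": {"frontend", "backend"},
}))
# [['design'], ['backend', 'frontend'], ['integration']]
```

In this sketch the DRI would track each stage's deadline; a module can start as soon as its stage is reached, rather than waiting for the whole preceding task to finish.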
What can be improved
  • New workers with little or no crowdsourcing experience but sufficient expertise are overlooked in the path search, since previously used teams are reused. We suggest giving new workers an initial trial task to test and rate them.
  • Miscommunication and arguments in a flash team delay the completion of the task: workers are fired, or they simply quit. Such disputes should be settled by one person, either the user or the DRI. The user can keep track of the workflow and give continuous feedback on the direction the flash team is taking.
  • Team motivation can be strengthened by letting workers form loose clusters that can be hired together (or not, at the user's discretion). This could also reduce conflicts within a flash team.
  • The DRI can evaluate the workers he is responsible for, and the user can in turn evaluate the DRI to estimate the efficiency of a particular team. New combinations of teams comprising the same workers can then be compared and efficiency maximized.