Milestone 1 teamtrojan

Experience the life of a Worker on Mechanical Turk

Signing up as a Mechanical Turk worker was not easy: we ran into issues with our Amazon Payments account. Having finally resolved the signup issues, we were excited to experience the life of a Turk worker. Being neophytes at this, it took us some time to get the concept and the workflow right.

Since we needed to start in training mode, we began with a task that required us to count the number of comments at a given URL. We had ten minutes to provide the correct count for each URL, and for every correct answer we were paid $0.02.

Having obtained a fair idea of the concept, we tried a task that required producing a drawing for a given description. A few descriptions we tried were a camp, diversity, and a barbecue.

The platform provides an easy and inexpensive way to collaborate with different participants. However, the number of tasks available to a beginner is low, and it is difficult for novices to search for suitable tasks. It would help if tasks were classified by worker level.

Overall, we had an amazing experience in learning a new concept.

Experience the life of a Requester on Mechanical Turk

Creating HITs as a requester was a challenging experience. After a lot of pondering, we decided to conduct a survey on the usage of laptops and tablets. We had to

Alternate crowd-labor markets - GalaxyZoo

While GalaxyZoo is an astronomical crowdsourcing platform that recruits volunteers to help morphologically classify galaxies, leveraging the concept of citizen science, Amazon Mechanical Turk is a crowdsourcing platform aimed at individuals and businesses that want to farm out tasks computers are unable to perform.

GalaxyZoo restricts itself to supporting scientific research, whereas tasks such as classifying images or writing product descriptions can be performed using Amazon Mechanical Turk.

Mechanical Turk allows requesters to post HITs (Human Intelligence Tasks) with deadlines, rewards, and an allotted time to complete them. The HITs range from classifying images to writing detailed descriptions. GalaxyZoo does not let users define deadlines and rewards; instead, it gathers users' judgments by asking them a series of questions about each galaxy.
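To make the comparison concrete, here is a minimal sketch of posting such a HIT with the boto3 MTurk client; it uses the sandbox endpoint, and the survey URL and all parameter values are hypothetical placeholders, not taken from our actual survey.

 import boto3

 # Sandbox endpoint, so the sketch can be tried without spending real money.
 client = boto3.client(
     "mturk",
     region_name="us-east-1",
     endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
 )

 # ExternalQuestion pointing at a hypothetical survey page.
 question_xml = """
 <ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">
   <ExternalURL>https://example.com/laptop-tablet-survey</ExternalURL>
   <FrameHeight>600</FrameHeight>
 </ExternalQuestion>
 """

 hit = client.create_hit(
     Title="Short survey on laptop and tablet usage",
     Description="Answer a few questions about how you use laptops and tablets.",
     Keywords="survey, laptops, tablets",
     Reward="0.05",                    # reward per assignment, in USD
     MaxAssignments=50,                # number of workers
     AssignmentDurationInSeconds=600,  # allotted time per worker
     LifetimeInSeconds=3 * 24 * 3600,  # deadline: the HIT expires after three days
     Question=question_xml,
 )
 print("HIT created:", hit["HIT"]["HITId"])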

Apart from being an astronomical crowdsourcing platform, GalaxyZoo also provides access to a rich collection of papers and data related to astronomy and galaxies. The site also allows users to engage in constructive discussions through discussion boards.

While Mechanical Turk caters to a broad range of people and businesses, GalaxyZoo aims at helping scientists with galaxy classification.

Readings

MobileWorks

  • What do you like about the system / what are its strengths?
  • What do you think can be improved about the system?

mClerk

mClerk is a mobile crowdsourcing platform that targets low-income workers in developing countries, distributing tasks via SMS.

The system has the following good features:

1. Accessibility: The system takes into consideration the lack of access to computers and the internet for a significant percentage of the population in developing countries, and therefore uses SMS for sending and receiving tasks, making the system accessible to anyone with a low-end mobile phone.

2. Graphical Tasks: Though ordinary SMS is limited to sending and receiving text, mClerk uses a protocol to send small images via ordinary SMS (a rough sketch of one possible scheme follows this list). This makes it possible to accomplish real-world tasks that require access to images.

3. Users’ Requirements/Qualifications: To enable contributions from low-income and less educated workers with limited knowledge of English, mClerk provides them with local-language documents to digitize. Thus, the large-scale problem of digitizing local-language text is also resolved to some extent.

4. Novelty: This system is the first demonstration of large-scale crowdsourced digitization for a language that lacks font support on workers’ devices.

5. Interaction with the users: Daily leaderboard messages acknowledging the contribution of top users, and reminder messages when there is no activity, improve the system’s interaction with users and help increase user contribution.
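
The paper’s actual image-over-SMS encoding is not reproduced here; the following is only a rough sketch of one way such a scheme could work, base64-encoding a small image and splitting it across segments that fit the classic 140-byte SMS payload. The header format is invented for illustration.

 import base64

 SMS_PAYLOAD_BYTES = 140  # classic single-SMS payload size (8-bit data)

 def image_to_sms_segments(image_bytes: bytes, task_id: str) -> list[str]:
     """Encode a small image as base64 and split it into SMS-sized chunks.

     Each segment carries a tiny header "task_id:index/total|" so the receiving
     phone can reassemble the image in order. Assumes task_id contains no ':' or '|'.
     """
     encoded = base64.b64encode(image_bytes).decode("ascii")
     chunk_size = SMS_PAYLOAD_BYTES - 20  # reserve room for the header
     chunks = [encoded[i:i + chunk_size] for i in range(0, len(encoded), chunk_size)]
     total = len(chunks)
     return [f"{task_id}:{i + 1}/{total}|{chunk}" for i, chunk in enumerate(chunks)]

 def sms_segments_to_image(segments: list[str]) -> bytes:
     """Reassemble segments (possibly received out of order) into the image bytes."""
     ordered = sorted(
         segments,
         key=lambda s: int(s.split("|", 1)[0].split(":")[1].split("/")[0]),
     )
     payload = "".join(s.split("|", 1)[1] for s in ordered)
     return base64.b64decode(payload)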

Improvements that can be made to the system include:

1. Accuracy of results: The system can limit the influence of less reliable workers by:

  • matching their responses against those of a trusted worker, and
  • requiring a minimum match rate before a worker qualifies for payment (a rough sketch follows this list).

2. Improving digitization latency (the total time taken to digitize a word from the point it is first sent to when a second, verified response is received): Rather than replicating all tasks, a second user could simply verify or reject the response of a prior worker (sketched after the list).

3. Increasing the contribution of lead users: To ensure that a lead user does not stop contributing and simply live off referral income, a threshold contribution amount could be fixed. The lead user would receive his referrals’ share of earnings only if his personal contribution reaches that threshold (also sketched after the list).

4. Competitive Environment: We could design our own system that pits two users against each other in a timed word-solving game. Competition might lead to improved participation and greater accuracy among users.
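
A minimal sketch of improvement 1: each worker’s responses are compared against a trusted worker’s responses for the same words, and only workers whose match rate clears a minimum threshold qualify for payment. The data layout and the 0.8 threshold are assumptions for illustration, not values from the paper.

 def match_rate(worker_responses, trusted_responses):
     """Fraction of a worker's answers that agree with a trusted worker's answers.

     Both arguments are dicts mapping word_id -> transcription; only words the
     trusted worker also answered are compared.
     """
     common = set(worker_responses) & set(trusted_responses)
     if not common:
         return 0.0
     agreeing = sum(1 for w in common if worker_responses[w] == trusted_responses[w])
     return agreeing / len(common)

 def qualifies_for_payment(worker_responses, trusted_responses, min_match_rate=0.8):
     """Pay a worker only if their match rate clears the minimum threshold."""
     return match_rate(worker_responses, trusted_responses) >= min_match_rate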
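
A sketch of improvement 2, again with hypothetical names and states: instead of sending every word to two independent transcribers, the second worker only confirms or rejects the first response, which is a quicker task and so shortens the time to a verified result.

 def next_task_for(word, first_response=None):
     """Route a word to transcription if no response exists yet, else to quick verification."""
     if first_response is None:
         return {"type": "transcribe", "word": word}
     # Verifying an existing candidate is quicker than a second full transcription,
     # so the verified result arrives sooner.
     return {"type": "verify", "word": word, "candidate": first_response}

 def handle_response(task, response, store):
     """Advance a word through transcribe -> awaiting_verification -> verified."""
     word = task["word"]
     if task["type"] == "transcribe":
         store[word] = {"candidate": response, "status": "awaiting_verification"}
     elif response == "yes":              # verifier confirms the candidate
         store[word]["status"] = "verified"
     else:                                # verifier rejects it: start over
         store[word] = {"candidate": None, "status": "pending"}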
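
A sketch of improvement 3: the referral bonus is paid out only once the lead user’s own contribution reaches a threshold. The referral share and threshold amount are made-up placeholders.

 def lead_user_payout(personal_earnings, referral_earnings,
                      referral_share=0.10, threshold=50.0):
     """Pay the lead user their referrals' share only once their own contribution
     reaches the threshold; below it, they earn only from their own work."""
     payout = personal_earnings
     if personal_earnings >= threshold:
         payout += referral_share * referral_earnings
     return payout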

Flash Teams

  • What do you like about the system / what are its strengths?
  • What do you think can be improved about the system?