Analysing Failure Idea: a "FeedbackMe" system

From crowdresearch
Revision as of 21:08, 7 February 2016 by Kamilamananova


Crowdsourcing is the process of obtaining needed services, ideas, or content from a large group of people. A wide variety of tasks can be crowdsourced, but in any crowd people have different skills and backgrounds, which means that some workers are better than others at some tasks and worse at other tasks. Tasks range from easy ones with binary questions (simple analysis of images, i.e. questions like "Is there a human face in the image?") to complex ones such as translation, or any other task whose output is intricate. Without proper communication, it is impossible to predict what problems will arise during the creation and execution of a task. Language barriers, ambiguous questions, and a poor match between the amount of work and the time allotted all create a high risk of task failure. But both requesters and workers want good-quality work, and so both have an interest in receiving feedback that improves their work and lowers the risk of task failure.

This introduction presents the design and evaluation of "FeedbackMe", a software feature designed to analyze task failures and find their causes. When giving feedback, it is often difficult to rate qualities on a scale, whether they are measurable or unmeasurable, observable or unobservable. We therefore propose the following design:

- Multiple-choice questions. We propose a series of binary questions (e.g. "Did the worker respect the deadline for every task?", which a requester can answer only with "yes" or "no") and questions on a 0-to-5 scale, to be answered by both the requester and the worker after a task is completed, whether it succeeded or failed.

- After the evaluator fills in the checkboxes in a visible feedback window, which contains the series of generated questions plus one open question, the results are recorded (the open question is viewable only by the requester and worker involved) and stored on the platform in both the requester's and the worker's dashboards.

- When a worker or requester repeatedly receives the same "bad" evaluation for one of the qualities, "FeedbackMe" sends that person an automatic message with proposals for further improvement.
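
The design above can be sketched in code. This is a minimal illustrative model, not an actual implementation: the class and field names (`Question`, `FeedbackForm`, `FeedbackMe`), the "bad" threshold of 2 on the 0-to-5 scale, and the repeat limit of 2 are all assumptions chosen for the example.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Question:
    text: str
    kind: str  # "binary" (answered yes/no) or "scale" (answered 0..5)

@dataclass
class Answer:
    question: Question
    value: object  # bool for binary questions, int 0..5 for scale questions

@dataclass
class FeedbackForm:
    task_id: str
    evaluator: str                       # who filled in the form
    evaluated: str                       # who is being rated
    answers: list = field(default_factory=list)
    open_comment: Optional[str] = None   # viewable only by the two parties

class FeedbackMe:
    BAD_THRESHOLD = 2   # assumed: scale answers <= 2 count as "bad"
    REPEAT_LIMIT = 2    # assumed: the same bad quality twice triggers advice

    def __init__(self):
        self.forms = []     # stored on the platform, shown in both dashboards
        self.messages = []  # automatic improvement proposals sent so far

    def submit(self, form: FeedbackForm) -> None:
        """Record a completed feedback form and check for repeated bad ratings."""
        self.forms.append(form)
        self._check_repeats(form.evaluated)

    def _check_repeats(self, person: str) -> None:
        # Count how often each quality was rated "bad" for this person.
        bad_counts = {}
        for f in self.forms:
            if f.evaluated != person:
                continue
            for a in f.answers:
                if a.question.kind == "binary":
                    is_bad = a.value is False
                else:
                    is_bad = a.value <= self.BAD_THRESHOLD
                if is_bad:
                    bad_counts[a.question.text] = bad_counts.get(a.question.text, 0) + 1
        # Same bad evaluation repeated: send an automatic improvement message.
        for quality, n in bad_counts.items():
            if n >= self.REPEAT_LIMIT:
                self.messages.append(
                    (person, f"Suggestion: consider improving on: {quality!r}")
                )
```

For example, if two requesters both answer "no" to the deadline question about the same worker, the second `submit` call would append an automatic message addressed to that worker.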