Winter Milestone 3 - Duka Reputation

Idea: Dynamic pricing based on the user's "skill" (reputation) for a specific task
This idea consists of pricing a task according to the user's level of experience and quality in executing that specific type of task. An illustrative example:
A requester posts a $1 task. A mid-level worker would see the task in their feed for $1. An entry-level worker would see it for $0.80 (just an example), along with a note explaining that the original value is $1, but that they will be paid less until the system can verify the quality of their work. At the other end, a high-quality worker could see the task for $1.20, with an explanation that they earn higher pay because of their experience and history of high-quality work (for that specific type of task).
This could incentivize requesters to try out new workers, since they would be cheaper, while also making requesters more confident about paying a premium for "guaranteed" workers when they need higher quality. On the worker side, poor work would be discouraged because it would decrease their reputation and lower their pay on future tasks, and workers would be motivated to improve their quality to earn better pay.
The initial pricing idea is to let the system manage the money: requesters would pay $1 regardless, and the system would decide how much to deduct from unskilled workers in order to increase the amount paid to skilled workers, based on different variables (e.g., number of tasks). Alternatively, requesters themselves could tell the system how much they want to pay for each worker category, inputting a higher value for experienced workers and a lower one for rookies.
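The tier-based pricing described above could be sketched as follows. The tier names and multipliers here are hypothetical, taken only from the $0.80 / $1.00 / $1.20 example earlier; a real system would derive them from the budget-balancing variables mentioned above.

```python
# Sketch: reputation-adjusted task pricing.
# Tier multipliers are hypothetical, matching the $0.80/$1.00/$1.20 example.
TIER_MULTIPLIERS = {
    "entry": 0.80,  # paid less until quality is verified
    "mid": 1.00,    # sees the original task price
    "high": 1.20,   # earns a premium for proven quality
}

def displayed_price(base_price: float, worker_tier: str) -> float:
    """Return the price a worker of the given tier sees for a task."""
    return round(base_price * TIER_MULTIPLIERS[worker_tier], 2)

print(displayed_price(1.00, "entry"))  # 0.8
print(displayed_price(1.00, "high"))   # 1.2
```

Under the first (system-managed) variant, the deductions from entry-level workers would need to balance the premiums paid to high-quality workers so the requester's total stays at the posted price.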
Reputation per skill
Users would not have an overall reputation level, but rather a reputation per skill, based on a combined set of elements (e.g., feedback, number of tasks done, accuracy, quality, certifications). The user profile should show only skills in which the user has some experience.
Skills would be created by the Daemo team based on feedback from workers and requesters. Some could relate to the individual (e.g., friendliness) and others to the skill itself (e.g., transcription of images). Requesters and workers could request that a new skill be added, and the system could also be monitored over time to improve the quality and accuracy of the skill list.
Point system instead of number of reviews
Instead of focusing on the number of reviews for workers/requesters, the "skill level" would represent a broader range. For example, a worker can become "Advanced" by completing hundreds of tasks and getting dozens of reviews, or by proving they have the needed background through tests or other methods. The goal is to avoid the "superstar" effect that comes with raw review counts (e.g., a worker with 1200 reviews seems far better than a worker with 100 reviews, even though they might provide the same quality of work). Hence, skill reputation would be acquired through points or a similar system. Here is an example using DP ("Daemo Points") to illustrate the idea:
- Rookie: 0-99 DP
- Intermediate: 100-249 DP
- Skillful: 250-499 DP
- Advanced: 500-999 DP
- Expert: 1000+ DP
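The tiers above amount to a simple threshold lookup. A minimal sketch, using the example thresholds from the list:

```python
# Sketch: map a Daemo Points (DP) total to a skill level.
# Thresholds come from the example tier list above.
LEVELS = [
    (1000, "Expert"),
    (500, "Advanced"),
    (250, "Skillful"),
    (100, "Intermediate"),
    (0, "Rookie"),
]

def skill_level(dp: int) -> str:
    """Return the skill level for a DP total, checking highest tiers first."""
    for threshold, name in LEVELS:
        if dp >= threshold:
            return name
    return "Rookie"  # covers totals driven negative by penalties

print(skill_level(120))   # Intermediate
print(skill_level(1000))  # Expert
```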
To acquire DP, the Daemo team would create a system that adds and subtracts points according to user actions. For example:
- +1 DP for every task approved
- -5 DP for every task rejected
- +20 DP for every positive review
- -40 DP for every negative review
- from +20 to +500 DP for certifications and tests, depending on the complexity of the test and/or how trustworthy the certification is
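The accrual rules above can be sketched as an event-to-points table. The point values are the examples from the list; certification awards are omitted since they vary case by case:

```python
# Sketch: accrue DP from user actions, using the example point values above.
POINT_EVENTS = {
    "task_approved": 1,
    "task_rejected": -5,
    "positive_review": 20,
    "negative_review": -40,
}

def apply_events(dp: int, events: list[str]) -> int:
    """Return the new DP total after applying a sequence of events."""
    for event in events:
        dp += POINT_EVENTS[event]
    return dp

# A worker at 90 DP gets two tasks approved and one positive review,
# crossing the 100 DP threshold into "Intermediate":
print(apply_events(90, ["task_approved", "task_approved", "positive_review"]))  # 112
```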
This type of system could greatly improve trust between workers and requesters and could encourage users to maintain a good reputation. Combined with Boomerang, this idea could be very effective in creating a better rating scheme. As mentioned in this article: "(...), we argue that the skills of workers are not necessarily related to the number of their resolve tasks or the difference between their answers and others. We consider the feedback score of the resolved tasks to illustrate the latent skills of workers."
Ideas to improve feedback between workers and requesters
Regarding feedback between workers and requesters, here are a few ideas to incentivize Daemo users to contribute:
- Requesters should be able to provide feedback to a group of workers who did the same task. For example, a requester posts a "transcription of image" task to be executed by 20 users. Once they receive the results, if they provide feedback to one worker, Daemo compares that worker's transcription with the others who did the same task and automatically gives the same feedback to those who submitted the same result.
- Before workers get paid (access the payment) for a task (or a group of tasks for the same requester), Daemo should occasionally request a quick scale-rating feedback about the requester.
- Users should be educated about the Boomerang system so they understand how important proper ratings are for being matched with the right users/tasks.
- As an onboarding tactic, we could ask both workers and requesters to provide feedback to one another to build up some of their "humane" skills.
- When providing feedback to one another, users can only rate skills relevant to the executed task. For example, if a worker did a translation job, the requester can only provide feedback on the worker's skill in translating the specific languages involved (and some other "humane" skills).
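The first idea in the list, propagating one rating to workers who submitted the same result, could be sketched as below. Exact string matching is an assumption for illustration; a real system might use fuzzier comparison for near-identical transcriptions.

```python
# Sketch of the feedback-propagation idea: when a requester rates one worker,
# copy that rating to every worker who submitted an identical result.
# (Exact-match comparison is an assumption; real matching could be fuzzier.)

def propagate_feedback(results: dict[str, str], rated_worker: str, rating: int) -> dict[str, int]:
    """results maps worker -> submitted transcription; returns worker -> rating."""
    target = results[rated_worker]
    return {worker: rating for worker, text in results.items() if text == target}

results = {
    "alice": "The quick brown fox",
    "bob": "The quick brown fox",
    "carol": "Teh quick brown fox",  # differs, so no rating is copied to carol
}
print(propagate_feedback(results, "alice", 5))  # {'alice': 5, 'bob': 5}
```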
- Diane Phan @diane
- Kimberly Le @kimberly35
- Renan Castro @renan
- Nicolas Rodriguez @nick13rodriguez