Winter Milestone 3 Duka Reputation
Idea: Dynamic Pricing based on user reputation for a specific task

From crowdresearch
Revision as of 21:01, 31 January 2016 by Dianephan (Talk | contribs)


Needs from Milestone 2

  • Perhaps newcomers should all go through a short test to measure the quality of their work and their interest in completing tasks.
  • What if we had a system of levels? Reputation and work are measured through levels: users level up, and their level determines their reputation.
  • The tasks can be controlled by upvotes and downvotes.
  • There should be a comments section for both workers and users to emphasize their concerns or questions for a certain task. This is how people can get their voice heard.
  • An admin should monitor the site.
  • In order for users to know how effective their work was, the requester should give some critique or feedback, so that workers can learn to improve and know which direction to head in. This improves the relationship between worker and requester.
  • In order for users to find relevant work, they should set up a profile with their interests and skills so that the system can easily help them find what work they might be interested in doing.
  • There should be a private messaging system for all users to communicate to each other. This will also help build relationships and potentially improve the understanding for the work desired.
  • There should be a way for users to form groups when they are all working on the same task for the same requester, similar to the prototyping-teams concept we read about two weeks ago. This would allow for a greater sense of coworking and create a better connection between requesters and a certain few workers, giving workers a sense of security in crowdsourcing.

Dynamic Pricing based on the user "skill" (reputation) for a specific task

This idea consists of setting a task's price according to the user's level of experience and quality in executing that specific type of task. An illustrative example:

Requester posts a $1 task. A mid-level worker would see the task in their feed for $1. An entry-level worker would see it for $0.80 (just an example), along with a note explaining that the original value is $1 but they will be paid less until the system can verify the quality of their work. On the other end, a high-quality worker could see the task for $1.20, with an explanation that they earn better pay because of their experience and high-quality work (for that specific type of task).
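As a minimal sketch, the level-dependent display price could work as follows. The level names and multipliers here are illustrative assumptions taken from the example above, not values the proposal commits to:

```python
# Hypothetical per-level price multipliers; the actual values would be
# tuned by the Daemo team, these just mirror the $0.80/$1.00/$1.20 example.
LEVEL_MULTIPLIER = {
    "entry": 0.8,  # paid less until quality is verified
    "mid": 1.0,    # sees the original posted price
    "high": 1.2,   # premium for proven high-quality work
}

def displayed_price(posted_price: float, worker_level: str) -> float:
    """Return the price a worker of the given level sees for a task."""
    return round(posted_price * LEVEL_MULTIPLIER[worker_level], 2)

print(displayed_price(1.00, "entry"))  # 0.8
print(displayed_price(1.00, "high"))   # 1.2
```

Each worker would see only their own adjusted price, plus the explanatory note described above.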

This could incentivize requesters to try out new workers because they would be cheaper and, at the same time, requesters would feel more confident paying more for "guaranteed" workers when they need higher quality. On the worker side, doing poor work would be discouraged, since it would decrease their reputation and earn them less money on future tasks; instead, they would try to improve quality to get better pay.

The initial pricing idea is to have the system take care of the money. Requesters would pay $1 regardless, and the system would control how much it deducts from unskilled workers in order to increase the amount paid to skilled workers, based on different variables (e.g., number of tasks). Another possibility would be for requesters themselves to tell the system how much they want to pay for each worker category, giving them the option to input a higher value for experienced workers and a lower one for rookies.
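The first option, where the system redistributes a fixed pool, could be sketched like this. The per-level offsets and the rescaling rule are assumptions for illustration; the proposal leaves the exact variables open:

```python
# Hypothetical sketch: the requester pays the posted price for every worker,
# and the platform redistributes that pool so rookie discounts fund premiums.
def allocate_payouts(posted_price: float, levels: list[str]) -> list[float]:
    adjustment = {"entry": -0.20, "mid": 0.0, "high": +0.20}  # assumed offsets
    payouts = [posted_price + adjustment[lvl] for lvl in levels]
    # Rescale so the total matches what the requester actually paid;
    # the platform would absorb any small rounding difference.
    pool = posted_price * len(levels)
    scale = pool / sum(payouts)
    return [round(p * scale, 2) for p in payouts]

# One entry-level, one mid-level, and one high-level worker on a $1 task:
print(allocate_payouts(1.0, ["entry", "mid", "high"]))  # [0.8, 1.0, 1.2]
```

When the mix of levels is uneven, the rescaling shifts everyone's payout so the requester's total spend stays fixed, which is the "system takes care of the money" behavior described above.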

Reputation per skill

Users would not have an overall reputation level, but rather a reputation per skill, based on a combined set of elements (e.g., feedback, number of tasks done, accuracy, quality, certifications). The user profile should show only skills in which the user has some experience. Here is an illustrative example:

[Images: Example-worker-skills.jpg, Example-requester-skills.jpg, Example-user-skill-level.jpg]

Skills would be created by the Daemo team based on feedback from workers and requesters. Some could relate to the individual (e.g., friendliness) and others to the skill itself (e.g., transcription of images). Requesters and workers could request that a new skill be added, and the system could also be tracked over time to improve the quality and accuracy of the skill list.
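A per-skill profile of this kind could be stored as a small map from skill name to reputation data. This is only a sketch; the field names and the "show only skills with experience" filter are assumptions based on the description above:

```python
# Sketch: reputation tracked per skill rather than as one global score.
from collections import defaultdict

# Each skill accumulates its own elements (points, task count, reviews).
profile = defaultdict(lambda: {"dp": 0, "tasks_done": 0, "reviews": []})

def record_task(skill: str, dp_earned: int) -> None:
    """Credit a completed task toward one specific skill."""
    profile[skill]["dp"] += dp_earned
    profile[skill]["tasks_done"] += 1

record_task("image transcription", 1)

# The profile page would display only skills the user has experience in:
visible = {s: v for s, v in profile.items() if v["tasks_done"] > 0}
print(list(visible))  # ['image transcription']
```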

Point system instead of amount of reviews

Instead of displaying raw review counts for workers/requesters, the "skill level" would represent a broader range. For example, a worker can become "Advanced" by completing hundreds of tasks and getting dozens of reviews, or by proving they have the needed background through tests or other methods. The goal is to avoid "superstars," as happens with raw review counts (e.g., a worker with 1200 reviews seems much better than a worker with 100 reviews, even though they might provide the same quality of work). Hence, skill reputation would be acquired through points or a similar system. Here is an example using DP ("Daemo Points") to illustrate the idea:

  • Rookie: 0–99 DP
  • Intermediate: 100–249 DP
  • Skillful: 250–499 DP
  • Advanced: 500–999 DP
  • Expert: 1000+ DP
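Mapping a DP total to one of these levels is a simple threshold lookup. A sketch, using the example cutoffs above (the names and boundaries are the illustrative ones from the list, not fixed design values):

```python
# Thresholds from highest to lowest; the first one the DP total meets wins.
LEVELS = [
    (1000, "Expert"),
    (500, "Advanced"),
    (250, "Skillful"),
    (100, "Intermediate"),
    (0, "Rookie"),
]

def skill_level(dp: int) -> str:
    """Return the skill-level name for a given Daemo Points total."""
    for threshold, name in LEVELS:
        if dp >= threshold:
            return name
    return "Rookie"  # negative totals floor at the lowest level

print(skill_level(99))    # Rookie
print(skill_level(250))   # Skillful
print(skill_level(1000))  # Expert
```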

To acquire DPs, the Daemo team would create a system that adds and subtracts points according to user actions. For example:

  • +1 DP for every task approved
  • −5 DP for every task rejected
  • +20 DP for every positive review
  • −40 DP for every negative review
  • +20 to +500 DP for certifications and tests, depending on the complexity of the test and/or how trustworthy the certification is
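The accrual rules above can be sketched as an event-driven tally. The event names and the floor-at-zero rule are assumptions; certification points would vary within the 20–500 range depending on the test:

```python
# Point values from the example list above (certifications omitted here
# since their value would vary case by case).
DP_RULES = {
    "task_approved": 1,
    "task_rejected": -5,
    "positive_review": 20,
    "negative_review": -40,
}

def apply_events(dp: int, events: list[str]) -> int:
    """Apply a sequence of reputation events to a worker's DP total."""
    for event in events:
        dp += DP_RULES[event]
    return max(dp, 0)  # assumed: totals never go negative

# A worker with 100 DP gets 10 tasks approved and one positive review:
print(apply_events(100, ["task_approved"] * 10 + ["positive_review"]))  # 130
```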

This type of system could greatly improve trust between workers and requesters, and it could stimulate users to maintain a good reputation. Combined with Boomerang, this idea could be very effective in creating a better rating scheme. As mentioned in this article: "(...), we argue that the skills of workers are not necessarily related to the number of their resolve tasks or the difference between their answers and others. We consider the feedback score of the resolved tasks to illustrate the latent skills of workers."

Ideas to improve feedback between workers and requesters

Regarding feedback between workers and requesters, here are a few ideas to incentivize Daemo users to contribute:

  • Requesters should be able to provide feedback to a group of workers who did the same task. For example, a requester posts a "transcription of image" task to be executed by 20 users. Once the requester receives the results and provides feedback to one worker, Daemo compares that worker's transcription with the others who did the same task and automatically gives the same feedback to those who sent the same result.
  • Before workers get paid (access the payment) for a task (or a group of tasks for the same requester), Daemo should request, from time to time, a quick scale-rating of the requester.
  • Users should be educated about the Boomerang system so they understand how important it is to rate properly in order to get matched with the right users/tasks.
  • As an onboarding tactic, we can ask both workers and requesters to provide feedback to one another to build up some of their "humane" skills.
  • When providing feedback to one another, users can only rate skills relevant to the executed task. For example, if a worker did a translation job, the requester can only provide feedback on the worker's skill in translating the specific languages involved (plus some other "humane" skills).
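The first idea above, propagating one piece of feedback to workers who submitted the same result, could be sketched as follows. The matching rule (exact match after trimming and lowercasing) is an assumption; a real system might use fuzzier comparison:

```python
# Sketch: copy a requester's feedback to every worker whose submission
# matches the rated worker's submission.
def propagate_feedback(submissions: dict[str, str], rated_worker: str,
                       feedback: str) -> dict[str, str]:
    """Return a map of worker -> feedback for all matching submissions."""
    target = submissions[rated_worker].strip().lower()
    return {
        worker: feedback
        for worker, text in submissions.items()
        if text.strip().lower() == target
    }

subs = {"alice": "Hello World", "bob": "hello world", "carol": "goodbye"}
print(propagate_feedback(subs, "alice", "accurate transcription"))
# {'alice': 'accurate transcription', 'bob': 'accurate transcription'}
```

Here "carol" gets no automatic feedback because her result differs, matching the behavior described in the bullet above.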

Usability Experience (Test Flight)

The website’s layout is simple, minimal, and clean, which I thought was very nice. The register button flashed when I clicked on it, which made the website pop a bit. Upon activation, I was led to the task feed, which listed the tasks I could choose from. I found the tasks entertaining, possibly because people were just testing out the features and posting random questions and answers.

I was a bit confused by how things worked; for example, I did not know I had to accept a task before submitting my answer. I think it would be easier to just open the task and submit the answer, rather than open the task, ACCEPT the task, and then submit. Two questions, one with an image and one with a sound clip, required feedback but did not have space for users to type anything in. With that said, I proceeded to click submit on those tasks and found them to be useless.

I decided to try making my own project, just for fun. I hope that, since this is a test, I will not lose anything in the process of creating this project. I proceeded to answer my own question. I tried to connect Daemo to my Google Drive, but for some reason it failed to do so.

In conclusion, an actionable item that can be implemented right away is removing the "accept task" step, which just seems unnecessary. In the future, I hope to see more tasks and users; completing tasks is an entertaining way for me to enjoy a study break or procrastinate. Also, the My Tasks/Projects window looks kind of ugly. I want to be able to view my tasks and my answers instead of this window.

Milestone Contributors

  • Diane Phan @diane
  • Kimberly Le @kimberly35
  • Nicolas Rodriguez @nick13rodriguez
  • Renan Castro @renan
  • Samarth Sandeep @samarthsandeep