Crowdresearch:WinterMilestone: Team ZenDesign - Dynamic Task Pricing Model: An Introduction


Dynamic Task Pricing Model: An Introduction

What's the problem you are solving?

Pricing Model

Crowdsourcing is a rapidly growing marketplace where people can request and receive services, and it forms a legitimate source of income for many people across the globe. One of the main challenges requesters face is providing appropriate incentives for workers [4]. Current models assign a set price that usually remains the same throughout the lifespan of the task; this is called the fixed model. It limits the requester's ability to devise a pricing scheme that ensures quality work is done for fair wages. One way to set prices under such models is to estimate workers' costs via a market analysis and then compute an optimal fixed price that maximizes utility [5]. However, this poses several challenges: 1. market rates fluctuate tremendously; 2. workers come from different geographical areas where the cost of living varies significantly from place to place; 3. market research and analysis is itself a high-cost affair. This motivates an alternative, dynamic pricing model. In this paper, we explore the key metrics that enable us to create a dynamic pricing model, helping requesters obtain better value for their money and allowing workers to take on interesting work for fair wages.


Related Work

  • J. J. Horton and R. J. Zeckhauser. Algorithmic wage negotiations: Applications to paid crowdsourcing. In CrowdConf, 2010. - Horton and Zeckhauser introduce an automated negotiation program, "Hagglebot", which workers use to renegotiate the price of an additional task after completing the first task for a fixed amount. In this approach, both workers and requesters end up haggling over the price of the task. This can hurt workers, since requesters may find others willing to take the task at the rate they initially set, which gives requesters more bargaining power. Barring seasoned workers, many new workers may be reluctant to negotiate or make a counteroffer, and therefore accept low wages and end up dissatisfied.
  • In Pricing Tasks in Online Labor Markets, Yaron Singer and Manas Mittal propose a pricing mechanism that maintains a threshold price used to decide whether a worker's bid is accepted or rejected. Workers are not rejected outright at the sampling stage and are allocated tasks when their price falls below the threshold. In this approach, the threshold also depends on the requester's budget. This may disadvantage seasoned workers whose wages are higher and may often fail to meet the threshold. In addition, the worker's skills are not considered, which may result in frustration.


Concept

A person will work only when the net benefits from working exceed the hypothetical net benefits from their next-best alternative, be it another job, leisure, or a renewed job search [1]. Our concept is a dynamic pricing system built on a worker-requester matching mechanism, driven by algorithms that take key metrics as their input. To ensure that matching occurs effectively, each new worker undergoes a short series of challenges that help determine their areas of strength. The outcome of these challenges gives the worker a base score, or ranking. The base score helps build the requester's trust that the worker is suitable and qualified for the task.

For example, a requester might post a website-building task that requires skills in web technologies. Workers with expertise in the area can take a series of challenges that assess the range of their skills, from coding to communication, and receive a base KPI score that they can then build upon. The base score helps the requester make an informed decision about whether to hire the worker, so worker-task matching happens efficiently for both workers and requesters.
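
To make the base-score idea concrete, here is a minimal sketch, assuming a worker's challenge results are combined with fixed weights into a single score. The challenge names, weights, and 0-100 scale are illustrative assumptions, not part of Daemo or this milestone's final design.

```python
# Minimal sketch: derive a base KPI score from a new worker's challenge
# outcomes. Challenge names, weights, and the 0-100 scale are assumptions
# made for illustration only.

CHALLENGE_WEIGHTS = {
    "coding": 0.5,          # assumed weight for the technical challenge
    "communication": 0.3,   # assumed weight for the communication challenge
    "domain_quiz": 0.2,     # assumed weight for a task-area quiz
}

def base_kpi_score(challenge_results: dict[str, float]) -> float:
    """Combine challenge results (each scored 0-100) into one base score."""
    score = 0.0
    for challenge, weight in CHALLENGE_WEIGHTS.items():
        score += weight * challenge_results.get(challenge, 0.0)
    return round(score, 1)

# Example: a worker strong at coding but newer to client communication.
print(base_kpi_score({"coding": 92, "communication": 70, "domain_quiz": 80}))
# -> 83.0
```
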

  • Step 1 - Requesters fix a base price that serves as a bare minimum.
  • Step 2 - Key metrics are measured, such as:
    • 1. Worker base score in the area of task
    • 2. Average time per hit
    • 3. Average rejection rate
    • 4. Recommendations received by the worker in the area of task
    • 5. Geography

Together, these key metrics determine the worker's varying wage.

  • Step 3 - Based on the metrics, the worker's wage is set. A highly experienced, specialised worker who can finish the task quickly is more valuable to the requester and hence commands a higher wage for the task, whereas a new worker who might take longer to finish will command a lower wage. Based on their budget, time constraints, and quality requirements, requesters can then decide whom to assign the task to.
  • Step 4 - If a new worker performs exceptionally well, their KPI score goes up and they are able to command higher wages. A brief sketch of these steps follows this list.
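
The following sketch ties Steps 1-4 together: the requester's base price (Step 1) is scaled by the key metrics (Step 2) to produce a wage (Step 3), and an exceptional result nudges the worker's KPI score upward (Step 4). All weights, field names, and the multiplicative adjustment formula are assumptions made for illustration, not the milestone's finalized model.

```python
# Hedged sketch of Steps 1-4. Every weight, cap, and field name here is an
# illustrative assumption, not the final Daemo pricing model.

from dataclasses import dataclass

@dataclass
class WorkerMetrics:
    base_score: float         # base KPI score in the task's area (0-100)
    avg_time_per_hit: float   # minutes per HIT, lower is better
    rejection_rate: float     # fraction of past work rejected (0-1)
    recommendations: int      # recommendations received in the task's area
    region_cost_index: float  # assumed proxy for geography / cost of living (1.0 = baseline)

def dynamic_wage(base_price: float, m: WorkerMetrics) -> float:
    """Step 3: scale the requester's base price (Step 1) by the metrics (Step 2)."""
    skill = 1.0 + 0.5 * (m.base_score / 100.0)                      # stronger base score -> higher wage
    speed = 1.0 + 0.2 * max(0.0, 1.0 - m.avg_time_per_hit / 60.0)   # faster than an hour -> small bonus
    quality = 1.0 - 0.5 * m.rejection_rate                          # high rejection rate lowers the wage
    reputation = 1.0 + min(0.2, 0.02 * m.recommendations)           # recommendations capped at +20%
    return round(base_price * skill * speed * quality * reputation * m.region_cost_index, 2)

def update_kpi(score: float, exceptional: bool, bump: float = 2.0) -> float:
    """Step 4: exceptional work nudges the KPI score up (capped at 100)."""
    return min(100.0, score + bump) if exceptional else score

# Example: an experienced worker versus a newcomer on the same $5.00 base task.
veteran = WorkerMetrics(90, 20, 0.02, 15, 1.0)
newcomer = WorkerMetrics(55, 50, 0.10, 1, 1.0)
print(dynamic_wage(5.00, veteran))         # higher wage for the specialised worker
print(dynamic_wage(5.00, newcomer))        # lower wage for the new worker
print(update_kpi(55.0, exceptional=True))  # newcomer's score rises after great work
```

A multiplicative form is used here only so that each metric can raise or lower the wage independently of the others; the actual model could combine the metrics differently.
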


References

  1. J. J. Horton and L. B. Chilton. The labor economics of paid crowdsourcing. In EC '10, 2010, pp. 209-218.
  2. J. J. Horton and R. J. Zeckhauser. Algorithmic wage negotiations: Applications to paid crowdsourcing. In CrowdConf, 2010.
  3. Stanford Crowd Research Collective. Daemo.
  4. Y. Singer and M. Mittal. Pricing Tasks in Online Labor Markets.
  5. A. Singla and A. Krause. Truthful Incentives in Crowdsourcing Tasks using Regret Minimization Mechanisms.


Milestone Contributor

  • Deepika Nayak @dnayak
  • Kanwar Arora @kanwar_arora