Milestone 7 BufferOverflow

From crowdresearch
Revision as of 01:48, 16 April 2015 by Soroosh129 (Talk | contribs)



A Collaborative Crowdsourcing Platform Based On Skill Categorization


The issue of trust in current crowdsourcing platforms can be attributed to a lack of social bonding[3]. In this paper, we propose a collaborative approach to crowdsourcing that presents workers and requesters with a set of specialized communication platforms catered to their needs. To accomplish this, we propose a categorization mechanism that qualifies workers for different skill sets in order to identify potential communities for them. Moreover, to address quality, we propose an enhanced skill-based task categorization.


A fundamental issue in crowdsourcing platforms is building trust among workers, and between workers and requesters[2]. Based on our observation, crowdsourcing platforms often consist of disconnected, scattered sets of requesters and workers. This results in fewer bonds among workers and creates an environment that can feel hostile and unfriendly[1]. We believe this negatively affects motivation and trust, since each worker or requester is isolated in a virtual environment that bears no resemblance to day-to-day working life. Moreover, crowdsourcing platforms carry an implicit emphasis on an important social relationship: that between worker and employer. We believe that emphasizing the human values of these platforms would benefit both workers and requesters.

The quality of work is another concern in crowdsourcing platforms. One major cause of low-quality work is that workers lack sufficient skills for specific tasks[12]. Amazon Mechanical Turk has no system to identify skills or to categorize tasks by required skills. By contrast, oDesk provides a comprehensive skill set and task categorization, with the ability to assign each task to specific skills and categories[14]. Hence, to improve quality, we integrate a similar but enhanced mechanism into our collaborative approach.

Related Work

Collaboration among workers has been proposed as a way to accomplish specialized distributed tasks[5]. [8] and [7] propose collaborative platforms for text translation. [7] also notes advantages achievable through collaboration, even for serialized work.

CrowdX provides its users with a set of communication platforms such as instant messaging and forums in order to enable collaboration. Netflix Prize also used a forum as a means of collaboration between participants.

Finding a compatible task for each worker based on her skill set is a subject of active research aimed at addressing quality concerns. oDesk provides workers (“freelancers”) with a set of skills that they can attach to their profiles. oDesk also provides task categories to help requesters (“oDeskers”) find freelancers with specific skills, and enables requesters to list required skills on a job posting and to recommend certain qualifications. Qualifications are obtained by taking a set of automated tests[15].

Task categorization[10] has also been proposed as a method to categorize the work that requesters post. New workers take tests related to each category, and their profiles are upgraded accordingly. Workers may only attempt tasks for which they are qualified.



To the best of our knowledge, collaboration has not yet been proposed as a mechanism for building trust. Moreover, no automated skill-qualification method has been proposed; in previous work, skill sets are always self-reported by the workers. Hence, we propose a social, collaborative crowdsourcing platform based on automated skill categorization as a means to develop trust between workers and between workers and requesters.

[4] has shown that a community can motivate workers by providing them with a collaboration platform. In this manner, both workers and requesters have a platform on which to develop trust in a friendly, social environment. Moreover, LinkedIn’s “gated-access approach” is specifically designed to facilitate “trust” among members[9]. Hence, to foster trustworthy and friendly communities, we propose a crowdsourcing platform built on small, private, centralized communities. To enhance collaboration, we base these communities on the categorization of skills. Additionally, to test the benefits of skill-based categorization, we also deploy a location-based community configuration for comparison.

As of now, oDesk does not connect task categories to skill sets and does not use automated tests for skill identification. Furthermore, oDesk separates qualifications from skills, so many qualifications overlap with skills, which renders them redundant. Also, each worker (“freelancer”) may add as many skills as she desires. In our proposed model, we use automated tests to identify workers’ skills and we connect a skill set to each task category.

As future work, we believe manual tests will be required for certain skills. Moreover, automated tests would work better if they produced a rating-based result instead of a discrete “accept” or “reject”.


We propose to build a crowdsourcing platform that lets workers communicate with each other based on their skill sets. When a new worker joins the platform, she is presented with a set of predefined questions (“tests”) defined by community moderators for each skill. Based on the tests taken and their results, her profile is upgraded, and the platform then recommends a list of communities using a recommender engine. If a worker is qualified for a certain skill, she can automatically join the respective private community without prior approval. Moreover, community moderators have the ability to add members (both workers and requesters) directly, without requiring the qualification first.
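The onboarding flow above can be sketched as follows. All names (`Worker`, `take_test`, and so on) and the 70% pass threshold are hypothetical; the actual tests and thresholds would be set by community moderators:

```python
# Sketch of the proposed onboarding flow: a worker takes a
# moderator-defined test for a skill, her profile is upgraded on
# success, and each qualified skill grants automatic entry to the
# matching private community. Names and threshold are assumptions.

PASS_THRESHOLD = 0.7  # assumed fraction of questions answered correctly

class Worker:
    def __init__(self, name):
        self.name = name
        self.skills = set()       # skills the worker has qualified for
        self.communities = set()  # private communities she has joined

def take_test(worker, skill, answers, answer_key):
    """Grade a moderator-defined test; upgrade the profile on success."""
    correct = sum(a == k for a, k in zip(answers, answer_key))
    if correct / len(answer_key) >= PASS_THRESHOLD:
        worker.skills.add(skill)
        return True
    return False

def recommend_communities(worker, communities_by_skill):
    """Recommend the private community attached to each qualified skill."""
    return {communities_by_skill[s] for s in worker.skills
            if s in communities_by_skill}

def join(worker, community, communities_by_skill):
    """Qualified workers join without prior approval."""
    skill = next((s for s, c in communities_by_skill.items()
                  if c == community), None)
    if skill in worker.skills:
        worker.communities.add(community)
        return True
    return False  # an unqualified worker must be added by a moderator
```

A recommender engine would replace the direct skill-to-community lookup shown here; the sketch only captures the qualification gate.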

We then propose the use of task categorization[11]. Each requester chooses a category on task submission. Each category consists of a subset of skills chosen by community moderators. By default, a worker holding any of these skills is qualified to apply. However, the requester can mark certain skills in the category as required for the task.
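The eligibility rule above can be stated compactly as a set check; the category and skill names are illustrative assumptions:

```python
# Sketch of category-based eligibility: each category carries a skill
# subset chosen by community moderators. By default, any of those
# skills qualifies a worker to apply, but the requester may mark some
# skills as required. Category and skill names are assumptions.

CATEGORY_SKILLS = {
    "translation": {"english", "spanish", "proofreading"},
}

def can_apply(worker_skills, category, required_skills=frozenset()):
    """A worker may apply if she holds at least one skill from the
    category and every skill the requester marked as required."""
    category_skills = CATEGORY_SKILLS[category]
    if not (worker_skills & category_skills):
        return False  # no overlap with the category's skill subset
    return required_skills <= worker_skills
```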

Moreover, we propose to eliminate the worker/requester labels within communities so that all community members stand on equal footing. As described in [5], we also implement enhanced distributed teamwork as a feature and a direct benefit of the proposed mechanism.


We expect skill-based communities to increase collaboration by providing an initial professional, mutual interest, though this must be validated against our test results; it is possible that location-based communities will prove more effective. Skill-based communities should also improve the quality of work submitted to requesters, since only skilled workers are allowed to do the work.

To evaluate the efficiency of small communities, we propose deploying a rapid prototype to a set of workers and requesters. The prototype would ask each user a simple question on sign-up: “Based on which criterion would you like to join a community: geographical location or skill set?” Users could also choose to opt out of this feature. We define a set of observations to determine the efficiency of each group, including the number of HITs done per week and the number of HITs accepted. We would also ask for feedback at two-week intervals to gain insight into the level of trust between workers and between workers and requesters.


References

1- Silberman, M.S., Irani, L., and Ross, J. Ethics and tactics of professional crowdwork. XRDS 17, 2 (2010), 39–43.

2- Milestone 3#Michael Bernstein's synthesis

3- Social Network for Crowdsourcing TuringMachine DarkHorseIdea

4- Building Successful Online Communities: Evidence-Based Social Design, Tausczik, Dabbish, and Kraut 2012

5- The Future of Crowd Work, Kittur et al 2013

6- Zhang, Yu, and Mihaela van der Schaar. "Reputation-based incentive protocols in crowdsourcing applications." INFOCOM, 2012 Proceedings IEEE. IEEE, 2012.

7- Kittur, Aniket. "Crowdsourcing, collaboration and creativity." ACM Crossroads 17.2 (2010): 22-26.

8- Ambati, Vamshi, Stephan Vogel, and Jaime Carbonell. "Collaborative workflow for crowdsourcing translation." Proceedings of the ACM 2012 conference on Computer Supported Cooperative Work. ACM, 2012.

9- Papacharissi, Zizi. "The virtual geographies of social networks: a comparative analysis of Facebook, LinkedIn and ASmallWorld." New media & society 11.1-2 (2009): 199-220.

10- BufferOverflow: Levelling, Rating and Categorisation

11- Turing Machine: Distribution of work according to workers’ interests

12- Allahbakhsh, Mohammad, and Boualem Benatallah. "Quality Control in Crowdsourcing Systems." (2013).

13- Khazankin, Roman, et al. "Qos-based task scheduling in crowdsourcing environments." Service-Oriented Computing. Springer Berlin Heidelberg, 2011. 297-311.

14- oDesk - Freelancer Skills

15- oDesk - Qualification Tests