Milestone 6 BufferOverflow

From crowdresearch


A Collaborative Crowdsourcing Platform Based On Skill Categorization


The issue of trust in current crowdsourcing platforms can be attributed in part to the absence of any form of social bonding[3]. In this paper, we propose a collaborative approach to crowdsourcing that presents workers and requesters with a set of specialized communication platforms catered to their needs. To accomplish this objective, we propose a categorization mechanism that qualifies workers for different skill sets in order to identify potential communities for them.


Introduction

A fundamental issue in crowdsourcing platforms is fostering trust among workers, and between workers and requesters[2]. Based on our observation, crowdsourcing platforms often consist of an uncategorized, scattered set of requesters and workers. The result is few bonds among workers and an environment that can feel hostile and unfriendly[1]. We believe this negatively affects motivation and trust, since each worker or requester is isolated in a virtual environment that bears no resemblance to day-to-day work. Moreover, crowdsourcing platforms carry an inherent emphasis on an important social relationship: that between worker and employer. We believe that emphasizing the human aspect of these platforms would benefit both workers and requesters.

Related Work

Collaboration among workers has been presented as a way to accomplish specialized distributed tasks[5]. [8] and [7] have proposed collaborative platforms for text translation. [7] also notes that collaboration yields advantages even for serialized work.

CrowdX provides its users with a set of communication platforms such as instant messaging and forums in order to enable collaboration. Netflix Prize also used a forum as a means of collaboration between participants.

Finding a compatible task for each worker based on their skill set is a subject of active research aimed at addressing quality concerns. Task categorization[10] has been proposed as a way to organize the work that requesters post: new workers take tests related to each category, their profiles are upgraded accordingly, and workers may only attempt tasks they are qualified for.


To the best of our knowledge, collaboration has not yet been proposed as a mechanism for building trust, nor has an automated skill qualification method been proposed: in previous work, skill sets are always self-reported by workers. Hence, we propose a social, collaborative crowdsourcing platform based on automatic skill categorization as a means to aid the development of trust between workers, and between workers and requesters. [4] has shown that a community can motivate workers by providing them with a collaboration platform. In this manner, both workers and requesters have a platform on which to develop trust in a socially friendly environment. Moreover, LinkedIn's "gated-access approach" is specifically designed to facilitate trust among members[9]. Hence, to generate trustworthy and friendly communities, we propose a crowdsourcing platform built around small, private, centralized communities. To enhance collaboration, we propose communities based on the categorization of skills, obtained through an automated skill recognition mechanism. To test the benefits of skill-based categorization, we also deploy a location-based community configuration as a point of comparison.


Proposed System

We would build a crowdsourcing platform that lets workers communicate with each other based on their skill sets. When a new worker joins the platform, she is presented with a set of predefined questions ("tests") defined by community moderators for each skill. Based on the tests taken and their results, her profile is upgraded and the platform recommends a list of communities for her to join. If a worker is qualified for a certain skill, she can automatically join the respective private community without prior approval. Moreover, community moderators have the ability to add members (both workers and requesters) directly, without requiring qualification first.
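The qualification flow above can be sketched as follows. This is a minimal illustration under our own assumptions: the answer-key representation of moderator-defined tests, the function names, and the passing threshold are all hypothetical and not specified in the proposal.

```python
# Hypothetical sketch of the skill qualification flow: moderators define
# per-skill tests, a new worker's answers are graded, and each passed
# skill maps to one private community she may join without approval.

# Moderator-defined tests per skill: skill -> {question id: correct answer}.
SKILL_TESTS = {
    "translation": {"q1": "b", "q2": "a"},
    "image-labeling": {"q1": "c"},
}
PASS_THRESHOLD = 0.8  # assumed fraction of correct answers needed to qualify


def grade(skill, answers):
    """Return True if the worker's answers pass the test for `skill`."""
    key = SKILL_TESTS[skill]
    correct = sum(1 for q, a in answers.items() if key.get(q) == a)
    return correct / len(key) >= PASS_THRESHOLD


def recommend_communities(worker_answers):
    """Upgrade the profile with every passed skill; return the list of
    skill communities the worker is now qualified to join."""
    return [skill for skill, answers in worker_answers.items()
            if grade(skill, answers)]
```

For example, a worker who answers the translation test correctly but fails the image-labeling test would be recommended only the translation community.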

We then propose the use of task categorization[11]. Each requester chooses a category on task submission. Each category consists of a subset of skills chosen by community moderators. By default, a worker qualified in any of a category's skills may apply for its tasks; however, the requester can mark certain skills in the category as essential for the task.
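The eligibility rule this implies can be expressed compactly. The category table and function names below are illustrative assumptions; only the behavior follows the text: by default any qualification within the task's category suffices, but when the requester marks skills as essential, the worker must hold all of them.

```python
# Hypothetical sketch of the proposed task-categorization rule.

# Each category's skill subset is chosen by community moderators.
CATEGORIES = {
    "localization": {"translation", "proofreading"},
}


def can_apply(worker_skills, category, required_skills=frozenset()):
    """Decide whether a worker may apply for a task in `category`."""
    if required_skills:
        # Requester marked some skills as essential: worker needs all of them.
        return required_skills <= set(worker_skills)
    # Default: any qualified skill within the category suffices.
    return bool(CATEGORIES[category] & set(worker_skills))
```

So a worker qualified only in translation may apply to a default localization task, but not to one where the requester has marked proofreading as essential.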

Moreover, we eliminate the worker-requester label within communities so that all community members have equal standing. As described in [5], we also implement enhanced distributed teamwork as a feature and a direct benefit of the proposed mechanism.


Evaluation

We hypothesize that skill-based communities would increase collaboration by providing an initial professional and mutual interest, although this must be validated against future test results (actual results might show that location-based communities have the edge). Skill-based communities should also improve the quality of the work submitted to requesters, since only skilled workers are allowed to do the work.

To evaluate the efficiency of small communities, we propose deploying a rapid prototype to a set of workers and requesters. The prototype would present a simple question to each user on sign-up: "Based on what criteria would you like to join a community: geographical location or skill set?" Users could also choose to opt out of this feature. We define a set of observations in order to compare the groups, including the number of HITs completed per week and the number of HITs accepted. We would also ask for feedback at two-week intervals to gain insight into the level of trust between workers, and between workers and requesters.
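The per-group observations could be aggregated along these lines. This is a sketch under assumed names: the event fields, group labels, and metric names are ours, not part of the proposal; only the measured quantities (HITs completed and accepted per week, per group) come from the text.

```python
# Hypothetical aggregation of weekly observations per experimental group
# (e.g. "skill" vs. "location" vs. opted-out).
from collections import defaultdict


def weekly_metrics(events):
    """events: iterable of dicts like
    {"group": "skill", "week": 1, "hits_done": 10, "hits_accepted": 8}.
    Returns {(group, week): {"done", "accepted", "accept_rate"}}."""
    totals = defaultdict(lambda: {"done": 0, "accepted": 0})
    for e in events:
        key = (e["group"], e["week"])
        totals[key]["done"] += e["hits_done"]
        totals[key]["accepted"] += e["hits_accepted"]
    for m in totals.values():
        # Acceptance rate as a simple quality proxy for the group that week.
        m["accept_rate"] = m["accepted"] / m["done"] if m["done"] else 0.0
    return dict(totals)
```

Comparing `accept_rate` and `done` across the skill-based and location-based cohorts week over week would then drive the efficiency comparison described above.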


References

1. Silberman, M.S., Irani, L., and Ross, J. "Ethics and Tactics of Professional Crowdwork." XRDS 17, 2 (2010), 39–43.

2. Milestone 3#Michael Bernstein's synthesis.

3. Social Network for Crowdsourcing (TuringMachine DarkHorseIdea).

4. Tausczik, Dabbish, and Kraut. Building Successful Online Communities: Evidence-Based Social Design. 2012.

5. Kittur, Aniket, et al. "The Future of Crowd Work." 2013.

6. Zhang, Yu, and Mihaela van der Schaar. "Reputation-Based Incentive Protocols in Crowdsourcing Applications." INFOCOM 2012 Proceedings, IEEE, 2012.

7. Kittur, Aniket. "Crowdsourcing, Collaboration and Creativity." ACM Crossroads 17, 2 (2010), 22–26.

8. Ambati, Vamshi, Stephan Vogel, and Jaime Carbonell. "Collaborative Workflow for Crowdsourcing Translation." Proceedings of the ACM 2012 Conference on Computer Supported Cooperative Work, ACM, 2012.

9. Papacharissi, Zizi. "The Virtual Geographies of Social Networks: A Comparative Analysis of Facebook, LinkedIn and ASmallWorld." New Media & Society 11, 1-2 (2009), 199–220.

10. Bufferoverflow: Levelling, Rating and Categorisation.

11. Turing Machine: Distribution of work according to workers' interests.