== '''Title''' ==

Ensuring quality in crowdsourced platforms by introducing a platform-ready certification, sentiment analysis, and a standard gold test. By S.S.Niranga | Alka Mishra


== '''Abstract''' ==

A global phenomenon with minimal barriers to entry, crowdsourcing has transformed the human workforce from mere consumers of products into active participants in value co-creation. In the crowdsourcing ecosystem, work is being redefined as an online meritocracy in which skilled work is rewarded in real time and job training is imparted immediately via feedback loops [1]. Under such working conditions, the diverse pool of untrained participants, workers and requesters alike, often finds itself mired in mutual mistrust and ambiguity over result quality and task authorship. This points to a need for quality control mechanisms that account for a wide range of behaviors: bad task authorship, malicious workers, ethical workers, slow learners, etc. [2]. Although many crowdsourced platforms offer clear guidelines, discussion forums, and tutorial sessions to address some of these issues, a large percentage of workers and requesters remain unfamiliar with how these platforms are used. In this paper, we assess how crowd workers can produce quality output through the three methods proposed below.

* Platform-ready certification
* Sentiment analysis system
* Standard gold test (a minimal sketch follows this list)
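
To make two of these ideas concrete, below is a minimal sketch, assuming hypothetical task identifiers, answer formats, thresholds, and a toy sentiment lexicon; it illustrates the general techniques, not any platform's actual implementation. Gold-test scoring compares a worker's answers against a small set of pre-answered questions, and the sentiment check flags feedback text containing negative cues.

<syntaxhighlight lang="python">
# Minimal sketch of two of the proposed quality mechanisms, using
# hypothetical data structures and thresholds (not any platform's real API).

# --- Standard gold test: score workers against pre-answered questions ---

GOLD_ANSWERS = {          # task_id -> known-correct answer (assumed format)
    "task-7": "cat",
    "task-12": "dog",
    "task-19": "bird",
}

def gold_test_score(worker_answers: dict[str, str]) -> float:
    """Fraction of the gold questions this worker answered correctly."""
    graded = [
        worker_answers.get(task_id) == correct
        for task_id, correct in GOLD_ANSWERS.items()
    ]
    return sum(graded) / len(graded)

def passes_gold_test(worker_answers: dict[str, str], threshold: float = 0.8) -> bool:
    """A worker 'passes' if they meet an (assumed) accuracy threshold."""
    return gold_test_score(worker_answers) >= threshold

# --- Sentiment analysis: flag feedback that signals worker/requester friction ---

NEGATIVE_WORDS = {"unfair", "rejected", "scam", "unclear", "confusing"}  # toy lexicon

def feedback_is_negative(feedback: str, min_hits: int = 1) -> bool:
    """Naive lexicon check; a real system would use a trained model."""
    words = set(feedback.lower().split())
    return len(words & NEGATIVE_WORDS) >= min_hits

if __name__ == "__main__":
    answers = {"task-7": "cat", "task-12": "dog", "task-19": "fish"}
    print(gold_test_score(answers))              # 0.666...
    print(passes_gold_test(answers))             # False at the 0.8 threshold
    print(feedback_is_negative("instructions were unclear and unfair"))  # True
</syntaxhighlight>

In practice, gold questions would be mixed unannounced into regular task batches, and a production sentiment analysis system would use a trained classifier rather than a word list.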


== '''Milestone Contributors''' ==

* S.S.Niranga @niranga
* Alka Mishra @alkamishra