Milestone 3


Note: The idea submissions for Milestone 3 have been archived at http://crowdresearch3.meteor.com/

Due date (PST): 11:59 pm 18th March 2015 for submission, 9 am 20th March 2015 for voting and commenting on others' ideas.

This week, we will take the set of needs that we collectively identified in the previous milestone and use those insights to generate design ideas.

  • YouTube link of the meeting today: watch
  • Meeting 3 slideshow: pdf

Needs from Milestone 2

We synthesized the main groups of needs identified in Milestone 2 below:

Worker Needs

(Each entry lists the need, the observation that serves as evidence, and our interpretation.)

  • Need: Workers need to be able to quickly find tasks they'd want to work on.
    Observation: Workers on the Monday evening panel cited the difficulty of identifying or distinguishing tasks because of poor tagging; a Reddit discussion also cited the exorbitant amount of time spent trying to find tasks and doing the mental calculations to find the opportunities that match them best (e.g. average time to complete the task, average $ per minute on the task, requirements to complete the task).
    Interpretation: Finding a task that matches the worker's skillset well and pays well takes a significant amount of (unpaid) time.

  • Need: Workers need to feel they are being fairly compensated for their work.
    Observation: A Reddit discussion notes that the payment system for HITs is not adaptive: it does not adjust task pricing to changing marketplace conditions (supply and demand).
    Interpretation: Monetary compensation is the primary motivator for many crowd workers.

  • Need: Workers need to feel like they are treated fairly and respectfully, and to have a voice in the platform.
    Observation: Comment on Turkopticon: "Got a mass rejection from some hits I did for them! Talked to other turkers that I know in real life and the same thing happened to them. There rejection comments are also really demeaning. Definitely avoid!"
    Interpretation: Unreasonable rejections and low payments lead workers to feel disrespected. The implicit assumption on MTurk is that workers are unskilled and replaceable, and they can do little if their work is rejected.

  • Need: Workers need to be able to expose their skills so they can get work they are qualified for and advance their skills.
    Observation: Workers from oDesk on the Monday evening panel said that most employers will not work with them until they have enough feedback or past work on the platform.
    Interpretation: If users cannot get new work without feedback, it is difficult for new users to establish their reputation and get jobs that will help them develop their skillsets.

  • Need: Workers need to be confident that they understand the goal of the task, and quickly.
    Observation: Workers avoid tasks that have ambiguous goals or might result in errors that get them rejected.
    Interpretation: Workers become risk-averse when a task might be confusing enough to threaten their reputation or payment.

Requester Needs

(Each entry lists the need, the observation that serves as evidence, and our interpretation.)

  • Need: Requesters need to get their HITs completed quickly and correctly.
    Observation: A requester asked on a forum why nobody is doing his HITs (7-minute, 25-cent surveys: a very low wage).
    Interpretation: Requesters want their HITs done, and when nobody is doing them, they do not know why (e.g. that they are underpaying workers).

  • Need: Requesters need to be able to trust the results they get.
    Observation: Requesters will often rely on previous workers whose results they can trust, add mechanisms to detect spammers, or manually verify some results.
    Interpretation: If spammers are not caught, the correctness of the results comes into question. If requesters are not sure the results are correct, they may need to discard the data.

  • Need: Requesters need workers with the appropriate skills and demographics to do their tasks.
    Observation: Requesters worry that they are not able to verify self-reported demographics for surveys.
    Interpretation: Workers' self-reported skills and demographics are often not viewed as trustworthy. This is a problem for surveys, which need correct demographic data to be useful.

  • Need: Requesters need to be able to easily generate good tasks.
    Observation: Companies hire full-time developers to deal with the complexities of posting microtasks on MTurk. Requesters often develop their own tools and workflow systems on top of Amazon's.
    Interpretation: Authoring HITs is currently difficult, which makes crowd work inaccessible to potential requesters.

  • Need: Requesters need to price their tasks appropriately (see the pricing sketch after this list).
    Observation: Requesters ask on forums about the appropriate amount they should pay for their HITs.
    Interpretation: Requesters often don't have a good intuition of what the appropriate wage for their task is in terms of price per HIT.

  • Need: Requesters need workers to trust them.
    Observation: Requesters say they are reluctant to reject work because they fear they might get bad reviews.
    Interpretation: Workers are more likely to do HITs if the requester seems trustworthy. Requesters do not want bad reviews, because they may result in workers ignoring the requester's HITs.
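As a rough illustration of the wage arithmetic behind the pricing need above, the minimal sketch below converts between price per HIT and effective hourly wage. It uses the 7-minute, 25-cent survey from the observation above; the $8.00/hour target rate is an assumed example figure, not a number from this milestone.

# Minimal sketch of HIT pricing arithmetic (the $8/hour target is an assumed example).

def hourly_wage(price_per_hit_usd, avg_minutes_per_hit):
    # Effective hourly wage implied by a HIT's price and its average completion time.
    return price_per_hit_usd * 60.0 / avg_minutes_per_hit

def price_for_target_wage(target_hourly_usd, avg_minutes_per_hit):
    # Price per HIT needed to reach a target hourly wage.
    return target_hourly_usd * avg_minutes_per_hit / 60.0

print(round(hourly_wage(0.25, 7.0), 2))             # 7-minute, 25-cent survey -> ~2.14 dollars/hour
print(round(price_for_target_wage(8.00, 7.0), 2))   # ~0.93 dollars per HIT at an assumed $8/hour

An idea like "Automatic Pricing for Tasks based on Average Completion Time" (used later as an example title) would essentially apply the second calculation using completion times observed on the platform.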

Michael Bernstein's synthesis

These needs boil down to two main issues: 1) trust, and 2) power.

Trust:

  • How do I trust who you say you are?
  • How do I trust that the results I get back will be good?
  • How do I trust that you’ll respect me as a worker, and pay me accordingly?

Power:

  • Who has the power to post work?
  • To edit other people’s posted work?
  • To return results to the requester? Can I, as a worker, send it back myself, or does someone else need to vet it?

As we brainstorm, we should be thinking about solutions that holistically address these issues of power and trust, not just surface fixes that get at micro-elements of the system.

Recommended Readings

Coming up with good, novel visions and ideas is a crucial part of doing successful research, and reading other researchers' visions and ideas can help you come up with better ideas yourself. These readings discuss visions that crowdsourcing researchers have thought of related to future crowd marketplaces. This week's readings are optional and don't have a deliverable, but are highly recommended.

Design notes for a future crowd work market

Design notes for a future crowd work market - This is a Medium post written by researchers involved with Turkopticon in response to hearing about this research project. It discusses their vision for a future crowd marketplace, where workers are more involved in the management of the marketplace.

The future of crowd work

Kittur A, Nickerson J V, Bernstein M, et al. The future of crowd work. Proceedings of the 2013 Conference on Computer Supported Cooperative Work. ACM, 2013: 1301-1318. - This paper envisions a future crowd marketplace that emphasizes workers' long-term development, and where people can be proud to be crowd workers. It is a long paper; feel free to focus on just the parts that particularly interest you.

Initial Brainstorm

Now it's time for your team to brainstorm some ideas based on these needs.

Work with your team to brainstorm as many ideas as you can under two headings: trust and power. Here are some example "How might we" questions (a technique that can inspire specific brainstorms) to drive the generation of your ideas:

  • “How might we enable workers to trust the requester’s intention to pay?”
  • “How might we enable requesters to trust the results they get back?”

Use whatever tools you need: if you can get together in person, whiteboards and sketchbooks are great, while if you're a remote team, services like Google Docs and sketchboard.io should help you with the brainstorming process.

Sketch out ideas until you find a set that you're inspired to explore further: at least 20 ideas total, 10 for trust and 10 for power. This brainstorm should be wild and broad. Focus not on usability patches, but on deeper design innovations.

Deliverable

The ideas you brainstormed (at least 10 ideas for trust and at least 10 ideas for power). Provide them in whatever format you want: diagrams, sketches, descriptions, or a combination (the wiki supports images; see here for instructions on uploading them).

Dive Deeper into Specific Ideas

Select 2 ideas per heading (trust, power) that you would like to pursue, and expand on them a bit further. In addition to describing the idea itself, make sure you also tell us:

  • What are the goals of the design? For example, Google's Android design goals are: delight me in surprising ways, simplify my life, and make me amazing (e.g., grant me special powers).
  • Which aspects of your design reflect each goal? How does your design solution address the users' needs?

Deliverable

For each of the 4 ideas (2 for trust, 2 for power), describe (using diagrams, sketches, storyboards, text, or some combination) the ideas in further detail.

Please create a separate wiki page for each of your ideas, so we can link to them individually. The title of the wiki page should be Milestone 3 followed by your team name and a description of the idea itself (ex: Milestone 3 YourTeamName TrustIdea 1: Automatic Pricing for Tasks based on Average Completion Time). Post a link to each of your trust-related ideas to http://crowdresearch.meteor.com/category/milestone-3-trust-ideas and a link to each of your power-related ideas to http://crowdresearch.meteor.com/category/milestone-3-power-ideas when done.

Dark Horse idea

Now that you've identified some design directions you like, it's time to change tack and toss in a dark horse idea. A dark horse, in horse racing, is a contender who most people don't think will win, but may turn in an unexpectedly strong performance and produce a huge payoff. Dark horse ideas are intended to be something far out there or nearly impossible. In the best case, your dark horse ideas might end up winning the race. However, even in the worst case, they can give us tremendous design insight and prevent design fixation, where the design space shrinks too rapidly.

There are three requirements for dark horse ideas. First, they must be "dark": they must explore a space that is risky, radical, seemingly infeasible, and/or in a direction orthogonal to previously explored solutions. They should feel slightly uncomfortable. Second, they must be brainstormed after the more traditional ideas; you can't have a dark path without a traditional "light" path to contrast it against. Third, they must be refined enough that they could be prototyped and objectively tested. That is, they cannot be truly infeasible: they need to be something that we could put in front of real people to see whether it would work.

If dark horse ideas came up in your initial brainstorm, you can use one of them. If you're not satisfied, brainstorm some more! Try using Powers of Ten and other techniques to push further and generate even more. After you brainstorm and sketch out dark horse ideas, choose one that you'd like to include among your set of top candidates from before. Expand on your dark horse idea as you did in the previous section.

Deliverable

Describe your dark horse idea (using diagrams, sketches, storyboards, text, or some combination).

Please create a separate wiki page for your dark horse idea so we can link to it individually. Post the link on http://crowdresearch.meteor.com/category/milestone-3-dark-horse-ideas when done.

Submitting

Create a Wiki Page for your Team's Submission

Please create a page for your team's submission at http://crowdresearch.stanford.edu/w/index.php?title=Milestone_3_YourTeamName&action=edit (substituting YourTeamName with your team name) and copy over the template at Milestone 3 Template. If you have never created a wiki page before, please see this or watch this.

[Team Leaders] Submit or post the links to your ideas by 18th March 11:59 pm

We have a service on which you can post the links to the wiki pages for the individual ideas you generated, and where you can explore and upvote the posted ideas.

Instructions for posting are at http://crowdresearch.meteor.com/posts/bXSNbqihjajASBQEL

Sign-up Instructions: Log in with either Twitter or Facebook. When it asks you to pick your username, pick the same username as on Slack; this will help us identify and track your contributions better.

There are 3 submission categories:

1- http://crowdresearch.meteor.com/category/milestone-3-trust-ideas where you can post links to the wiki pages for each of the 2 trust-related ideas you generated in the "Dive Deeper into Specific Ideas" stage

2- http://crowdresearch.meteor.com/category/milestone-3-power-ideas where you can post links to the wiki pages for each of the 2 power-related ideas you generated in the "Dive Deeper into Specific Ideas" stage

3- http://crowdresearch.meteor.com/category/milestone-3-dark-horse-ideas where you can post a link to the wiki page for your dark horse idea

Post links to your ideas only once they're finished. Give your posts titles which summarize your idea. Viewers should be able to get the main point by skimming the title ("Automatic Pricing for Tasks based on Average Completion Time" is a good title. "YourTeam TrustIdea 1" is a bad title).

Please submit your finished ideas by 11:59 pm 18th March 2015, and DO NOT vote or comment until 12:05 am 19th March.

[Everyone] Peer-evaluation (upvote ones you like, comment on them) from 12:05 am 19th March until 9 am 20th March

After the submission phase, you are welcome to browse through, upvote, and comment on others' ideas. We especially encourage you to look at and comment on ideas that haven't yet gotten feedback, to make sure everybody's ideas get feedback. You can use http://crowdresearch.meteor.com/needcomments to find ideas that haven't yet gotten comments, and http://crowdresearch.meteor.com/needclicks to find ideas that haven't yet been viewed many times.

COMMENT BEST-PRACTICES: As on Crowdgrader, everybody reviews at least 3 ideas, each supported by a comment. The comment has to justify your reason for upvoting. It should be constructive and should mention a positive aspect of the idea worth sharing. Negative comments are discouraged; instead, phrase your comment as a suggestion: if you disliked an idea, suggest improvements (do not simply criticize an idea; no idea is bad, and every idea has room for improvement).

[Team Leaders] Milestone 3 Submissions

To help us track and browse all submissions, once you have finished your Milestone 3, go to the link below and post the link to your submission:

Milestone 3 Submissions

Fill out this week's survey

Please provide your feedback on this week's meeting and milestone, so we can improve them, by filling out this survey