Welcome to the Crowd Research project wiki. We're a group of one thousand designers, engineers, crowd workers, and crowd requesters from around the world who are building Daemo, a self-governed crowdsourcing marketplace. Crowd Research is part of the larger Aspiring Researchers Challenge.
- Though we may not open a large-scale application cycle anytime soon, sign up here if you're interested and we'll be in touch.
- Winter 2016 applications are now closed; check back again later. Welcome, Winter 2016 batch!
- We're inviting applications for the winter session; aspiring researchers and hackers, apply now. To help you get started and understand what we've done so far, we've created "Introducing Crowd Research Initiative and Recap - Winter 16", a recap and overview page where you can learn more about this project.
- Our first version of Daemo is live at daemo.stanford.edu.
- We're finalists (top 20 of 1,000+) in the Knights Challenge 2015; check out our proposal and video for a brief overview of Daemo.
- Our ACM UIST '15 paper was accepted; read it for more details.
Some research projects are too big and too important to tackle alone. Sometimes, we need to team up.
At Stanford's Computer Science department, we've observed that people aiming to get research experience or launch a research career often fall into an expertise valley. Undergraduates are assigned extremely tightly scoped activities within research projects, with little room for creativity. Then these same folks get into PhD programs, have the entire space of human knowledge to explore, and don't have enough scaffolding to make quick progress.
We're going to create a crowdsourced research team to tackle both of these challenges together. We'll gather as many talented folks as we can and work to build the intervening bridge between tightly scoped work and open-ended exploration. It will offer far more flexibility than a typical research experience, but with a focused goal, so that we can bring each other back on track each week and nobody gets lost.
- 1 About the project
- 2 Affiliation policy
- 3 Participant contribution
- 4 Summer 16 meetings and slides
- 5 Winter 16 milestones
- 6 General weekly plan
- 7 Skill requirements
- 8 Resources
- 9 About this wiki
- 10 Contact crowd research admins
About the project
About Daemo - a self-governed crowdsourcing marketplace
Whether you need help gathering data, labeling machine learning training examples, running experiments, or transcribing audio, today we use crowdsourcing platforms such as Amazon Mechanical Turk. However, these platforms are notoriously bad at ensuring high-quality results, providing respect and fair wages to workers, and making it easy to author effective tasks. It's not hard to imagine that we could do better.
This research will span the complete design, implementation, launch, and evaluation of a new crowdsourcing platform. What would it take to create an effective marketplace? One where workers have more power in the employment relationship, or could take additional responsibility for result quality? How might we design such a market? Could we launch it and make it the new standard? This human-computer interaction research will involve a combination of design thinking, web development, and experimental design. It's far more ambitious than a typical project: it's an entire marketplace design question. That's why we're banding together to solve it.
About Crowd Research initiative
Crowd Research is an experimental initiative by a professor and students at Stanford to recognize the potential of upcoming student researchers and to explore the possibility of massive research collaboration between the two. It is a first-of-its-kind research-at-scale initiative. We're a group of about 500 students, professionals, and researchers from all around the world, working together by engineering, building, and designing to make Daemo a reality. We call ourselves the "Stanford Crowd Research Collective". The project is organized by Prof. Michael Bernstein at Stanford HCI and his students.
Affiliation policy
When you write emails, update your LinkedIn profile or resume, or draft cover letters or statements of purpose, *please be careful about how you state your affiliation and your contributions*. Other faculty can find it very confusing when folks say they "work with Michael Bernstein at Stanford", because it implies you're a student here. This actually hurts the people who say it: when faculty or others realize you're not actually here, they stop trusting the other things you say you've done. It also hurts Prof. Michael Bernstein's Stanford students, because people no longer assume they're actually at Stanford.
I want to make sure you can leverage your hard work here and carry all its benefits without leaving faculty or others confused and cynical. The best route is to avoid implying that you're a Stanford student, and instead to say that you're part of our crowd research effort. I suggest something like: "I'm a member of the Stanford Crowd Research Collective, a worldwide group of researchers led by Michael Bernstein in Stanford CS. I've worked directly with Michael and other crowd researchers on [your contributions]." If you're sending an important email, you can either cc me or directly suggest that the recipient email me to verify your contributions. That way I can speak up to reinforce it.
Participant contribution
What do you get out of participating? Well, first, there's creating a crowdsourcing market that becomes the new standard. This could lead to a far better future for crowdsourcing and crowd work, and millions of people could eventually use it. It's research, of course, so there's always a risk it might not work out. Then again, if we knew it would work, it wouldn't be research!
Second, we'll be submitting papers to top-tier conferences based on our work. If you are considering an MS or especially a PhD program, being a heavily contributing author on a paper can greatly improve your chances. How much you contribute to the project will determine author order. Last, I really do hope to build relationships with a diverse range of researchers.
Summer 16 meetings and slides
See Archives for spring and summer meetings. If you're getting started, consider going through this page: Introducing Crowd Research Initiative and Recap - Winter 16.
Summer Meeting 1:
- YouTube link to today's meeting: watch
- Summer Meeting 1 slideshow: no slides
Winter 16 milestones
All timings are in PST (California time).
- Winter Milestone 1 - 8:00 pm 17th Jan 2016 for submission, 12 pm 18th Jan 2016 for peer-evaluation.
- Winter Milestone 2 - 8:00 pm 24th Jan 2016 for submission, 12 pm 25th Jan 2016 for peer-evaluation.
- Winter Milestone 3 - 8:00 pm 31st Jan 2016 for submission, 12 pm 1st Feb 2016 for peer-evaluation.
- Winter Milestone 4 - 8:00 pm 7th Feb 2016 for submission, 12 pm 8th Feb 2016 for peer-evaluation.
- Winter Milestone 5 - 8:00 pm 14th Feb 2016 for submission, 12 pm 15th Feb 2016 for peer-evaluation.
- Winter Milestone 6 - 8:00 pm 21st Feb 2016 for submission, 12 pm 22nd Feb 2016 for peer-evaluation.
- Winter Milestone 7 - 8:00 pm 28th Feb 2016 for submission, 12 pm 29th Feb 2016 for peer-evaluation.
- Winter Milestone 8 - 8:00 pm 6th March 2016 for submission, 12 pm 7th March 2016 for peer-evaluation.
- Winter Milestone 9 - 8:00 pm 13th March 2016 for submission, 12 pm 14th March 2016 for peer-evaluation.
- Winter Milestone 10 - 8:00 pm 20th March 2016 for submission, 12 pm 21st March 2016 for peer-evaluation.
- Winter Milestone 11 - 8:00 pm 27th March 2016 for submission, 12 pm 28th March 2016 for peer-evaluation.
- Winter Milestone 12 - 8:00 pm 29th March 2016 for submission, 12 pm 3rd April 2016 for peer-evaluation.
- Winter Milestone 13 - 8:00 pm 10th April 2016 for submission, 12 pm 11th April 2016 for peer-evaluation.
- Winter Milestone 14 - 8:00 pm 17th April 2016 for submission, 12 pm 18th April 2016 for peer-evaluation. - targeting UIST
- Winter Milestone 15 - 8:00 pm 24th April 2016 for submission, 12 pm 25th April 2016 for peer-evaluation.
- Winter Milestone 16 - 8:00 pm 1st May 2016 for submission, 12 pm 2nd May 2016 for peer-evaluation.
- Winter Milestone 17 - 8:00 pm 8th May 2016 for submission, 12 pm 9th May 2016 for peer-evaluation.
- Winter Milestone 18 - 8:00 pm 15th May 2016 for submission, 12 pm 16th May 2016 for peer-evaluation.
- Winter Milestone 19 - 8:00 pm 22nd May 2016 for submission, 12 pm 23rd May 2016 for peer-evaluation.
- Winter Milestone 20 - 8:00 pm 29th May 2016 for submission, 12 pm 30th May 2016 for peer-evaluation. - targeting CSCW
- Winter Milestone 21 - 8:00 pm 12th June 2016 for submission, 12 pm 13th June 2016 for peer-evaluation.
- Milestone 1 submissions
- Milestone 2 submissions
- Milestone 3 submissions
- Milestone 4 submissions
- Milestone 5 and 6 submissions
- Milestone 8 and 9 submissions
- Milestone 10 submissions
Engineering enthusiasts, check this: Introduction to Research Engineering
General weekly plan
We'll meet weekly over videochat and lay out our goals for the next week. At the end of the week, you'll submit what you've been working on. Your peers and a PhD student here at Stanford will critique the work, and we'll talk about the best submissions each week in our meeting. The sky's the limit.
I'm sure we'll adjust this as we go; this entire crowdsourced research idea is a bit of a research project in itself, too.
- Monday evening, 8 pm PST: the professor meets with participants and sets the milestone for the next week (over Google Hangouts on Air)
- Monday after the meeting through Sunday, 8 pm PST: participants work on their milestones (~6 days)
- Sunday 8 pm PST to Monday 12 pm (noon) PST: peer evaluation by participants (12+ hours). This is occasional.
- Monday evening, 8 pm PST: the professor meets with participants again, informed by input from RAs and the top submissions. Participants receive their next milestone and a feedback survey after every meeting.
Skill requirements
What background do you need in research? None. Anybody who is smart and dedicated can help us envision the future of crowdsourcing and articulate how it might play out.
In terms of skills, there are many different ways you can participate. If you want to contribute design skills, a portfolio of past work would be helpful. If you are a CS major or enjoy programming, you'll likely need to have completed an introductory programming course sequence to succeed. We're currently building infrastructure and implementing foundations using Django, Angular.js, PostgreSQL, and a REST framework, so knowledge or experience in these areas would be extremely helpful. If you have experience with social science methods (e.g., surveys, qualitative work, designing controlled experiments), there will be lots to do as well to help us make sure we're creating the right thing.
For everyone, a class in human-computer interaction (such as Scott Klemmer’s HCI Online, which you can complete as prep) will be a huge leg up.
Resources
- Slack - used for chat and discussion
- GitHub - used for collaborating on development
- Relevant Work - used to save relevant articles, links and papers
- Forums - discussions about this project on external sites. Note that all official announcements and communications will occur via Slack and email.
- Resources - used to index all platforms and resources as we evolve
- Archives - older meetings, slides and milestones
- Introducing Crowd Research Initiative and Recap - getting started with the crowd research initiative
- Current Pagerank - the current PageRank, computed from badge-giving on Bonusly (will be used to set author order)
Want to share some other resources? Create a wiki page, and post it at Resources.
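The wiki doesn't document exactly how the badge-based PageRank is computed, but the idea is standard: treat each badge as a directed endorsement from giver to receiver and run PageRank over that graph, so contributors endorsed by well-endorsed peers rank higher. A minimal sketch, with illustrative names and data (the real computation on Bonusly data may differ):

```python
def pagerank(edges, damping=0.85, iterations=50):
    """Compute PageRank from a list of (giver, receiver) badge events."""
    nodes = {person for edge in edges for person in edge}
    out_links = {person: [] for person in nodes}
    for giver, receiver in edges:
        out_links[giver].append(receiver)

    n = len(nodes)
    rank = {person: 1.0 / n for person in nodes}
    for _ in range(iterations):
        # Everyone gets a base (teleport) share, then badge-givers
        # pass on a damped fraction of their own rank.
        new_rank = {person: (1 - damping) / n for person in nodes}
        for person, targets in out_links.items():
            if targets:
                share = damping * rank[person] / len(targets)
                for target in targets:
                    new_rank[target] += share
            else:
                # No badges given: redistribute this rank evenly.
                for target in nodes:
                    new_rank[target] += damping * rank[person] / n
        rank = new_rank
    return rank

# Hypothetical badge events: alice and carol both badge bob, bob badges carol.
badges = [("alice", "bob"), ("carol", "bob"), ("bob", "carol")]
scores = pagerank(badges)
author_order = sorted(scores, key=scores.get, reverse=True)
```

Here `author_order` puts bob first, since he is endorsed by two people, one of whom is himself well endorsed; that self-reinforcing weighting is what distinguishes PageRank from a raw badge count.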
About this wiki
You need to be logged in to edit this wiki and view private pages. If you already have an account but cannot log in, try resetting your password. If you need an account, check your email: login credentials were emailed out at the start of the project. If you can't find the email, contact Geza (@geza on Slack) with your email address, and your login details will be resent.
All pages on this wiki are public by default, so anyone visiting the site can see them.
Any page whose title ends with (private) is private; you need to be logged in to see it. See example.
Contact crowd research admins
Find us on Slack!
Alternatively, you can email us at email@example.com.