SCG Court: A Crowdsourcing Platform in the Scientific Method

Speaker: Karl Lieberherr, College of Computer and Information Science, Northeastern University, Boston.

Supported by Novartis.

Abstract

A recent Communications of the ACM article on crowdsourcing systems (April 2011) points to the importance of crowdsourcing platforms for simplifying the development of crowdsourcing systems. We present the Scientific Community Game Court (SCG Court), a crowdsourcing platform (a web application) parameterized by a playground X, and report on our experience in using it to drive innovation in several domains. The Scientific Method we use is refutation-based, à la Popper.

The Scientific Community Game involves proposing and opposing claims related to a constructive domain (e.g., domains in computer science, mathematics, or engineering). Central to opposing a claim is refuting it according to a refutation protocol (a minimal sketch of these mechanics follows the list below). When playing the game, players make constructive claims about the domain and oppose others' claims. The players who are the most successful in defending and opposing claims win the game and gain a high reputation in the community. Adopting an SCG-centric research process has the following benefits:
(1) it focuses researchers on a specific domain by defining a language for expressing claims about that domain, thereby reducing the amount of management effort. The numerous contributions from the crowd of researchers are effectively combined by the game to build a knowledge base through voting with justification.
(2) it provides a structured framework for collaboration among researchers. Researchers provide and receive frequent feedback on their claims from their peers, and players who lose points gain knowledge to improve their game in the future. This makes collaboration more effective.
(3) it accumulates knowledge in playground X. The game produces both a knowledge base (the social welfare coming from the game) and useful know-how for defending the claims in the knowledge base. For well-understood playgrounds, the know-how consists of a clever algorithm; for less-understood playgrounds, the know-how is heuristic.
(4) researchers are motivated to propose and oppose non-trivial claims in order to gain reputation. They like to win, and if they lose, they want to find out why.
(5) managers get a fair comparison of the skills of their researchers through the competition results.
(6) it enables controlled teaching and learning through the game. Researchers who introduce new knowledge entice other researchers to assimilate the same or better knowledge. Researchers who don't participate in this activity lose reputation, as they would in a real scientific community. The game is fun and adjusts to the skill levels of players.
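To make the claim/refutation mechanics concrete, the following is a minimal sketch in Java of how a playground, claims, players, and one refutation round might be modeled. It is an illustration under assumed names (Claim, Player, Court, and a toy "composite number" playground), not the actual SCG Court code or API.

// Minimal sketch of an SCG-style claim/refutation round
// (illustrative names only, not the actual SCG Court API).
import java.util.HashMap;
import java.util.Map;
import java.util.function.BiFunction;

// A claim in a toy playground: "for instance n, I can produce a solution
// whose quality is at least q".
record Claim(int instance, double quality) {}

interface Player {
    String name();
    int solve(Claim claim);            // proposer side: deliver a solution
    boolean challenges(Claim claim);   // opponent side: agree or challenge
}

// The "court" runs the refutation protocol and tracks reputation.
class Court {
    private final Map<String, Integer> reputation = new HashMap<>();
    // Playground-specific quality function: quality of solution s for instance n.
    private final BiFunction<Integer, Integer, Double> qualityOf;

    Court(BiFunction<Integer, Integer, Double> qualityOf) {
        this.qualityOf = qualityOf;
    }

    // One round: the opponent may challenge; the proposer must then defend.
    void play(Player proposer, Player opponent, Claim claim) {
        if (!opponent.challenges(claim)) return;   // claim is agreed to
        int solution = proposer.solve(claim);
        boolean defended =
            qualityOf.apply(claim.instance(), solution) >= claim.quality();
        award(defended ? proposer : opponent, +1);
        award(defended ? opponent : proposer, -1);
    }

    private void award(Player p, int delta) {
        reputation.merge(p.name(), delta, Integer::sum);
    }

    Map<String, Integer> standings() { return reputation; }
}

public class ScgSketch {
    public static void main(String[] args) {
        // Toy playground: a claim asserts "instance n is composite; I can
        // exhibit a proper divisor" (claimed quality 1.0).
        Court court = new Court((n, s) -> (s > 1 && s < n && n % s == 0) ? 1.0 : 0.0);

        Player alice = new Player() {   // proposes and defends claims
            public String name() { return "Alice"; }
            public int solve(Claim c) {
                for (int d = 2; d < c.instance(); d++)
                    if (c.instance() % d == 0) return d;
                return 1;               // no proper divisor: defense fails
            }
            public boolean challenges(Claim c) { return false; }
        };
        Player bob = new Player() {     // always challenges
            public String name() { return "Bob"; }
            public int solve(Claim c) { return 1; }
            public boolean challenges(Claim c) { return true; }
        };

        court.play(alice, bob, new Claim(91, 1.0));  // 91 = 7 * 13: Alice defends
        court.play(alice, bob, new Claim(97, 1.0));  // 97 is prime: Bob refutes
        System.out.println(court.standings());       // e.g. {Bob=0, Alice=0}
    }
}

In this toy round, a claim is defended when the proposer's solution reaches the claimed quality, and reputation shifts accordingly; a real playground X would supply its own instance and solution types, quality function, and timing rules.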

The SCG can be played productively for: (1) developing reliable software for computational problems, (2) evaluating potential employees, (3) developing new knowledge in the given domain, (4) evaluating algorithmic innovations fairly, and (5) teaching software development and problem-solving techniques in a fun game environment.

More information on SCG is available from the SCG Home Page.