David Olsen and Dr. Conan Albrecht, School of Accountancy and Information Systems
Administrators of online groups face challenges in making their information attractive to users. Large groups with a lot of information must attract new users while retaining current ones. Because users of large groups can be overwhelmed by the volume of information available, the key to a group’s success is attracting high-quality, easily searched comments. To deal with the challenges of information usefulness and relevance, groups have adopted rating systems and search engines, respectively. I hope to determine whether rating systems really do lead to high-quality information.
Online groups have been defined as social aggregations of critical masses of people on the Internet who engage in public discussions, chat-room interactions, and information exchanges with sufficient human feeling on matters of common interest to form webs of personal relationships. My study will focus on communities of interest, where users interact with each other extensively on similar topics. Examples of communities of interest on the Internet today include Slashdot (see Slashdot.org), PerlMonks (see perlmonks.com), and Cactus (see cactuscode.org).
I am interested in online groups because they provide vast amounts of information on focused topics, which is a great learning resource. However, allowing anyone to post comments leads to a lot of low-quality information. Rating systems have been adopted by many groups for two main reasons. First, rated comments allow readers to easily find quality information. Second, rating systems should motivate users to post high-quality comments.
For example, Slashdot (see Slashdot.org) has a fairly sophisticated rating system. Comments are posted in reaction to news stories each day. Moderators rate these comments on a scale of -1 to 5, and comments rated between 3 and 5 are given a “type” label, such as “insightful,” “interesting,” “informative,” or “funny.” Users who consistently participate and give good comments build up “karma,” which is judged on a scale of “Terrible, Bad, Neutral, Positive, Good, and Excellent.” Users whose karma is Positive, Good, or Excellent earn “points,” which allow them to moderate a certain number of other users’ comments. To assist readers in seeking high-quality comments, the Slashdot system lets them set a preference to display only comments above a chosen rating threshold. For example, a user could choose to see only comments rated 3 or above.
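The threshold mechanism described above can be sketched in a few lines of Python. This is purely illustrative, not Slashdot’s actual code; the comment data and the `filter_comments` helper are hypothetical.

```python
# Illustrative sketch of a reader-side rating threshold, as on Slashdot.
# The data structure and function name are hypothetical, for illustration only.

def filter_comments(comments, threshold):
    """Return only the comments whose rating meets the reader's threshold."""
    return [c for c in comments if c["rating"] >= threshold]

comments = [
    {"text": "First post!", "rating": -1},
    {"text": "Insightful analysis of the story", "rating": 4},
    {"text": "Link to the original source", "rating": 3},
]

# A reader with a threshold of 3 sees only the two higher-rated comments.
visible = filter_comments(comments, threshold=3)
```

In effect, each reader chooses their own trade-off between completeness and quality, which is why the ratings themselves need to be trustworthy.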
Rating systems and their effect on user behavior in communities of interest are not well researched or understood. Past research demonstrates that users are motivated to contribute to communities of interest by intrinsic factors, which may suggest that rating systems are not necessary for motivating contribution. However, we believe that rating systems will enhance users’ motivation by giving them feedback on the usefulness of their contributions. This feedback will in turn motivate users to continually improve their contributions and lead to greater productivity. I am interested in the overall behavior of users over time.
My faculty mentor, Dr. Conan Albrecht, and I had originally planned to do a case study of Slashdot to investigate this hypothesis. We planned to analyze behavior by harvesting and analyzing data from Slashdot’s past year of activity. We felt that Slashdot would provide adequate external validity because it is a well-established, mature community with thousands of users and hundreds of thousands of comments available to analyze. Its rating system has been in place for a long time, is similar to other groups’ rating systems, and is well understood by users.
However, as we proceeded with this project we faced an unexpected ethical challenge. Many websites post a file called “robots.txt” that specifies which automated crawlers may visit the site and save the data they gather. For example, most websites allow the search engine Google to crawl them so that their site will appear in Google searches. However, most websites also limit crawling to certain search engines only. We could not ethically crawl Slashdot without getting explicit permission from them.
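To make the mechanism concrete, here is a minimal sketch using Python’s standard `urllib.robotparser` module. The robots.txt content and the crawler name `MyResearchBot` are hypothetical examples; this is not Slashdot’s actual policy file.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that admits Googlebot but excludes all other crawlers.
# A real check would fetch the live file, e.g. https://example.com/robots.txt.
robots_txt = """\
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Googlebot is permitted; an unlisted research crawler is not.
google_ok = rp.can_fetch("Googlebot", "https://example.com/comments")
research_ok = rp.can_fetch("MyResearchBot", "https://example.com/comments")
```

A site configured this way appears in Google searches while still signaling that other automated harvesting, including ours, is unwelcome.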
Therefore, we adjusted our approach to be more theoretical. We joined forces with another professor in the same department, Dr. Doug Dean and his research assistant, Steven Tedjamulia. Steven had done extensive background research on the theories behind motivating people to contribute to online communities. We combined this background research with our background on motivating high-quality contributions to form the basis of our theoretical paper.
We proposed a new model that shows how to motivate high-quality contribution. Personality characteristics (self-efficacy, intrinsic motivation, need to achieve, and trust), combined with environmental factors (usability, group identity, and personal responsibility) and with personal goals and commitment, lead to contribution. The contribution is measured and reinforced by financial rewards, performance appraisal, and social recognition. These reinforcements, applied in a positive manner, in turn motivate continued contribution through their effect on personality characteristics.
This study will provide valuable insight for future work in the area. Research on motivating contribution to online communities is very sparse, and we hope to inspire a dedicated line of research that will give more insight into this topic. In particular, we hope that our proposed model will be tested empirically. The paper, coauthored by all four of us, has been accepted to the 2005 Hawaii International Conference on System Sciences (HICSS) and will be presented at that conference in January.