URI engineer challenges computer users to ‘attack’ program designed to combat manipulation of product rating systems

$1,000 prize offered to most successful participant


KINGSTON, R.I. – April 23, 2007 – As online shopping becomes more common around the globe, shoppers increasingly depend on the consumer rating systems that vendors such as Amazon.com provide for their products.


“These ratings are having more and more impact on consumer purchasing decisions, yet the rating systems are very easy to manipulate,” said Yan Sun, assistant professor of computer engineering at the University of Rhode Island and an expert in computer network security. “It’s as simple as telling all your friends to rate a product in one way or another.”


To catch manipulated or unfair ratings, Sun and URI colleagues Qing Yang and Yafei Yang have developed an algorithm that detects suspicious rating patterns. A patent on the algorithm is pending.


“We treat regular ratings as noise, but we can detect patterns from users who are colluding to unfairly manipulate ratings,” she said. “It’s designed to improve the quality of the information in the rating systems to make them more reliable.”
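The patent-pending algorithm itself has not been published. Purely as a hypothetical sketch of the general idea Sun describes — honest ratings scatter like noise, while colluding raters leave correlated traces — one could flag pairs of users whose scores agree almost exactly across the products they both rated (all names, data, and thresholds below are illustrative, not URI's method):

```python
from collections import defaultdict
from statistics import mean

# Toy data: (user, product, score on a 1-5 scale). Here dave, erin, and
# frank are imagined colluders who low-ball the same products in lockstep.
ratings = [
    ("alice", "camera", 4), ("bob", "camera", 5), ("carol", "camera", 4),
    ("dave",  "camera", 1), ("erin", "camera", 1), ("frank", "camera", 1),
    ("alice", "phone",  5), ("bob",  "phone",  4),
    ("dave",  "phone",  1), ("erin", "phone",  1), ("frank", "phone", 1),
]

def suspicious_pairs(ratings, min_shared=2, max_spread=0.5):
    """Flag pairs of users whose ratings barely differ on shared products.

    Honest raters rarely agree on everything, so a pair whose average
    score difference across `min_shared` or more common products stays
    at or below `max_spread` is treated as potentially colluding.
    """
    by_user = defaultdict(dict)
    for user, product, score in ratings:
        by_user[user][product] = score
    users = sorted(by_user)
    flagged = []
    for i, u in enumerate(users):
        for v in users[i + 1:]:
            shared = set(by_user[u]) & set(by_user[v])
            if len(shared) < min_shared:
                continue
            diffs = [abs(by_user[u][p] - by_user[v][p]) for p in shared]
            if mean(diffs) <= max_spread:
                flagged.append((u, v))
    return flagged

print(suspicious_pairs(ratings))
# Flags the lockstep trio while leaving honest near-agreement (alice/bob) alone.
```

On this toy data the three colluders are flagged pairwise, while alice and bob — who broadly agree but differ by a point on each product — are not. A real detector would have to cope with far subtler attackers, which is precisely the gap Sun says her algorithm addresses.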


Sun explained that systems already exist that can detect obvious efforts to manipulate ratings, but current algorithms cannot detect “smart attackers” who try to make subtle changes to a product’s rating. Her new algorithm is more successful at detecting these subtle manipulations.


To test the algorithm's effectiveness, Sun is inviting computer users to try to manipulate a rating system it protects. A $1,000 prize will be awarded to the individual or team that introduces the largest bias into the rating system.


“We hope that the contest will enable us to collect a lot of attack data, which can be used as a benchmark for testing our algorithm and future algorithms,” said Sun. “It’s the best way to test it against real attackers.”


The contest begins April 25 and runs through July 30. Rules and registration requirements can be found at www.etanlab.com/rating.