Researchers trawling through data from Facebook and other social networking sites have found good examples of what they call human herding behavior. A notable case: if you “like” an article online, your friends are more likely to “like” it too. Is this simply shared taste within a group producing similar behavior among peers? Apparently not: the same research found that if you dislike an article, your friends are not correspondingly more likely to dislike it. So what is going on?
From the New York Times:
If you “like” this article on a site like Facebook, somebody who reads it is more likely to approve of it, even if the reporting and writing are not all that great.
But surprisingly, an unfair negative reaction will not spur others to dislike the article. Instead, a thumbs-down view will soon be counteracted by thumbs up from other readers.
Those are the implications of new research looking at the behavior of thousands of people reading online comments, scientists reported Friday in the journal Science. A positive nudge, they said, can set off a bandwagon of approval.
“Hype can work,” said one of the researchers, Sinan K. Aral, a professor of information technology and marketing at the Massachusetts Institute of Technology, “and feed on itself as well.”
If people tend to herd together on popular opinions, that could call into question the reliability of “wisdom of the crowd” ratings on Web sites like Yelp or Amazon and perhaps provide marketers with hints on how to bring positive attention to their products.
“This is certainly a provocative study,” said Matthew O. Jackson, a professor of economics at Stanford who was not involved with the research. “It raises a lot of questions we need to answer.”
Besides Dr. Aral (who is also a scholar in residence at The New York Times research and development laboratory, working on unrelated projects), the researchers are from Hebrew University in Jerusalem and New York University.
They were interested in answering a question that long predates the iPhone and Justin Bieber: Is something popular because it is actually good, or is it popular just because it is popular?
To help answer that question, the researchers devised an experiment in which they could manipulate a small corner of the Internet: reader comments.
They collaborated with an unnamed Web site (the company did not want its involvement disclosed) on which users submit links to news articles. Readers can then comment on the articles, and they can also give up or down votes on individual comments. Each comment receives a rating calculated by subtracting negative votes from positive ones.
The experiment performed a subtle, random change on the ratings of comments submitted on the site over five months: right after each comment was made, it was given an arbitrary up or down vote, or — for a control group — left alone. Reflecting a tendency among the site’s users to provide positive feedback, about twice as many of these arbitrary initial votes were positive: 4,049 to 1,942.
The first person reading the comment was 32 percent more likely to give it an up vote if it had been already given a fake positive score. There was no change in the likelihood of subsequent negative votes. Over time, the comments with the artificial initial up vote ended with scores 25 percent higher than those in the control group.
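The asymmetric dynamic described above can be sketched as a toy simulation. Everything here is hypothetical except the 32 percent figure: the per-reader vote probabilities, the number of readers, and the strength of the corrective effect on down-voted comments are invented parameters for illustration, not values from the study.

```python
import random

def simulate_comment(initial_vote, n_readers=50, base_up=0.10, base_down=0.05,
                     herding_boost=0.32, correction_boost=0.32, rng=None):
    """Simulate one comment's final score (up votes minus down votes).

    Hypothetical model: each of n_readers independently up-votes or
    down-votes with small base probabilities. If the running score is
    positive, the up-vote probability is raised by herding_boost
    (0.32 mirrors the reported 32 percent effect); if the score is
    negative, a corrective boost models friends "fixing" unfair
    down votes, as the study observed.
    """
    rng = rng or random.Random()
    score = initial_vote  # +1 (fake up vote), -1 (fake down vote), or 0 (control)
    for _ in range(n_readers):
        if score > 0:
            p_up = base_up * (1 + herding_boost)      # herding on popular comments
        elif score < 0:
            p_up = base_up * (1 + correction_boost)   # friends correct unfair negatives
        else:
            p_up = base_up
        r = rng.random()
        if r < p_up:
            score += 1
        elif r < p_up + base_down:
            score -= 1
    return score

rng = random.Random(42)

def mean_score(initial_vote, trials=2000):
    return sum(simulate_comment(initial_vote, rng=rng) for _ in range(trials)) / trials

up, control, down = mean_score(+1), mean_score(0), mean_score(-1)
```

Under these assumed parameters, comments seeded with a fake up vote end with a higher average score than the control group, while the corrective boost keeps down-voted comments from falling far behind, qualitatively matching the snowballing-approval and correction effects the article reports.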
“That is a significant change,” Dr. Aral said. “We saw how these very small signals of social influence snowballed into behaviors like herding.”
Meanwhile, comments that received an initial negative vote ended up with scores indistinguishable from those in the control group.
The Web site allows users to say whether they like or dislike other users, and the researchers found that a commenter’s friends were likely to correct the negative score while enemies did not find it worth their time to knock down a fake up vote.
The distortion of ratings through herding is not a novel concern. Reddit, a social news site that said it was not the one that participated in the study, similarly allows readers to vote comments up or down, but it also allows its moderators to hide those ratings for a certain amount of time. “Now a comment will more likely be voted on based on its merit and appeal to each user, rather than having its public perception influence its votes,” it explained when it unveiled the feature in April.
Read the entire article here.
Image: Facebook “like” icon. Courtesy of Wikimedia / Facebook.