Trolls Win: It Might Be Better To Shut Off Comments
    By News Staff | February 14th 2013 07:01 PM | 10 comments

    Why are physicists thick-skinned while biologists run for the hills when the comment trolls inevitably appear? It may help to be arcane and complex - it's harder to troll the hard sciences. Everyone feels like they know some biology, but good luck to casual readers trying to debunk rare B_s decays in a high-energy physics paper.

    But in biology you will invariably get comments like...

    "How much did Monsanto pay you to write this?"


    "You can't prove this is safe"

    ...which can be frustrating. Brian Dunning at SkepticBlog even coined a term for it - Argumentum ad Monsantium - and has a hilarious list going back to 2008 of when it was used against him, even when he was not talking about biotech at all. It is just the go-to claim for people who hate biological science, the same way 'scientists used to talk about an ice age in the 1970s' is used by detractors of climate science.

    The next big culture war is nanotechnology and it is going to make GMOs and pharmaceutical research look like a societal walk in the park. Unfortunately, for people just trying to learn about new research or technology, the tone of comments can be such a turn-off that it will skew the perception of the science itself. Basically, the trolls win just by using fear, doubt and derision. In that sense, as much as it is anathema to Science 2.0, it is sometimes better to just shut off comments.

    An online experiment sampled a representative cross section of 2,338 Americans where the civility of blog comments was manipulated, like introducing name-calling into commentary tacked onto an otherwise balanced newspaper blog post. Results showed that comments could elicit either lower or higher perceptions of risk, depending on one's predisposition to the science of nanotechnology.

    "It seems we don't really have a clear social norm about what is expected online," said Dominique Brossard, a University of Wisconsin-Madison professor of Life Science Communication, contrasting online forums with public meetings where prescribed decorum helps keep discussion civil. "In the case of blog postings, it's the Wild West."

    For rapidly developing nanotechnology, a technology already built into more than 1,300 consumer products, exposure to uncivil online comments is one of several variables that can directly influence the perception of risk associated with it.

    Some results were a little odd. Highly religious readers were more likely to see nanotechnology as risky when exposed to rude comments than less religious readers. So maybe less religious people are used to rudeness.

    But even simple disagreement in posts can also sway perception: "Overt disagreement adds another layer. It influences the conversation," Brossard said.

    UW-Madison Life Sciences Communication Professor Dietram Scheufele, another of the study's co-authors, notes that the Web is a primary destination for people looking for detailed information and discussion on aspects of science and technology. Because of that trend, "studies of online media are becoming increasingly important, but understanding the online information environment is particularly important for issues of science and technology."

    Article: Journal of Computer-Mediated Communication, no live online version yet.


    Gerhard Adam
    Basically, the trolls win just by using fear, doubt and derision. In that sense, as much as it is anathema to Science 2.0, it is sometimes better to just shut off comments.
    Sure, that way no one ever has to answer any questions.  That way anyone posting articles ranging from a new Theory of Everything to Origin of Life issues can simply turn off comments and declare the science as "finished".

    I already assume that any article that has comments turned off is self-serving propaganda.

    BTW ... anyone that can't handle a few comment trolls, and "heads for the hills" is unworthy of being involved in setting any public policy.
    Mundus vult decipi
    Scientists doing outreach are not engaged in public policy, that is why it is instead called outreach. But you're proving the point with your tone.
    News media reporting can do a lot to improve this situation by asking good questions and being skeptical - being a skeptic is not anti-science, it is science. Have a look at this one: . Almost all the comments instantly recognize that this is not free energy.

    Science journalism doesn't ask a lot of hard questions these days, I agree - some topics are culturally forbidden. But physorg is just press releases; no journalist wrote that, so there were no questions a journalist would ask.  And this one is about whether comments help or hurt. If the first comment is trolling and sniping, does it turn people off from even reading further? I think it does, so rather than turning comments off completely it might be better to turn them off for the first day. The trolls will have moved on to whatever Google Alerts has brought up next. Chris Mooney detailed the motivated reasoning of the public regarding climate change, but it is easily transposed to any field - vaccine, energy and food anti-science people do it even more than the climate deniers.
    I really can't believe what I am reading. Only the fear of truth or a need for control is a reason to suppress ideas. What are the criteria? How do you write them down? How do you identify objectively, scientifically, for all time, what a troll comment is? On the other hand, if you are drunk with power or you fear the light of truth, how convenient it is to just turn off that light.

    If they are bad ideas, they are seen for what they are and you receive a clearer perception of the truth - if they are good ideas, then you are enriched.

    Here's another example. The commentary instantly identifies an important error in the mass of the Russian meteor reported recently:

    So what is the issue?  If a bunch of crackpots immediately clog up a story, that isn't a bad thing, because sometimes commenters get things right a journalist gets wrong?  I suppose, but that is - as you worried - a subjective, artificial metric.

    There are 75 people here who would have read that first paragraph and started making goat noises at him for writing it - Tunguska is pretty famous - but because an equally dumb editor at a billion-dollar mainstream media corporation also didn't catch it, you think it should be open season for blatant trolls who hijack biology articles?  That doesn't compute.

    The study is what it is; it found trollish comments ruin the credibility of articles. If you disagree with their method or conclusions, show your data - but arguing that it is philosophically wrong because the conclusion disagrees with your beliefs is part of the problem, not part of the solution.

    Bonny Bonobo alias Brat
    The study is what it is; it found trollish comments ruin the credibility of articles. If you disagree with their method or conclusions, show your data - but arguing that it is philosophically wrong because the conclusion disagrees with your beliefs is part of the problem, not part of the solution
    My interpretation of the study you linked to is that rude comments tend quite quickly to polarise people into reverting to the preexisting beliefs they held prior to reading the news or science article, and that this is probably caused by a primitive, emotional, biological response to the threatening behavior that rudeness and insults represent to the reader.

    If this is true, then a further study would probably show that in all likelihood the same can be said about the original news and science articles, if they also contain a lot of rude snipes and comments targeted at the opposition or at their opposing beliefs, which is often the case both here at Science20 and elsewhere.

    The article you linked to says:
    'The text of the post was the same for all participants, but the tone of the comments varied. Sometimes, they were "civil"—e.g., no name calling or flaming. But sometimes they were more like this: "If you don’t see the benefits of using nanotechnology in these products, you're an idiot."'

    'The researchers were trying to find out what effect exposure to such rudeness had on public perceptions of nanotech risks. They found that it wasn't a good one. Rather, it polarized the audience: Those who already thought nanorisks were low tended to become more sure of themselves when exposed to name-calling, while those who thought nanorisks are high were more likely to move in their own favored direction. In other words, it appeared that pushing people's emotional buttons, through derogatory comments, made them double down on their preexisting beliefs.'

    So, in my opinion, this study is clearly implying that people in general, are turned off and polarised by reading rude, insulting, intolerant, remarks and comments. Surely this would also apply to both the news/science articles as well as to their comments sections?

    Maybe all that is really needed is a rude word filter in both the articles and the comments, rather than shutting off all comments, even for a day? Isn't that what Science20 moderators and their moderation are already doing? It certainly wouldn't bother me at all, as I am never rude and have never called anyone an idiot or a crackpot, as many other bloggers and commenters have done both here at Science20 and elsewhere, as a simple search on those keywords often reveals.
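    A rude word filter of that kind is simple to sketch. The blocklist, function names, and moderation queues below are purely illustrative assumptions, not an actual Science20 feature:

```python
import re

# Illustrative blocklist; a real deployment would maintain a larger,
# curated list and catch obfuscations like "id10t".
RUDE_WORDS = {"idiot", "crackpot", "moron"}

def flag_rude(comment: str) -> bool:
    """True if the comment contains a blocklisted word (whole words only)."""
    words = re.findall(r"[a-z']+", comment.lower())
    return any(w in RUDE_WORDS for w in words)

def hold_for_moderation(comments):
    """Split comments into a publish queue and a moderation queue."""
    publish, review = [], []
    for c in comments:
        (review if flag_rude(c) else publish).append(c)
    return publish, review
```

    Matching whole words rather than raw substrings avoids the classic false-positive pitfall of naive filters, at the cost of missing deliberately obfuscated spellings.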

    There is also plenty of scientific evidence in the hundreds of Science20 blogs here, to support what I am interpreting from this study, that rudeness in both articles and comments is generally a waste of time, uneducational and that it polarises people back into their preexisting camps and belief systems.

    My article about researchers identifying a potential blue green algae cause & L-Serine treatment for Lou Gehrig's ALS, MND, Parkinsons & Alzheimers is at
    Thor Russell
    It shouldn't be a binary decision. The audience is justified in wanting some kind of response from the author, and the author is justified in wanting their voice to be heard over any flame war that follows in the comment section. In many cases, including science journalism, people in the audience can write more coherently than the author. For example, the journalist will confuse units, not explain how the story fits into the big picture, etc., and someone following the field can give genuine advice.  Also, some journalists obviously present just one side of the picture, perhaps because it is easier being the side they are used to, or to deliberately incite argument.
    If it were just a binary decision (comments/no comments) and a flame war made the audience unreasonably take one side, then "no comments" might be the best solution. However, it isn't binary. Some other ideas:
    1. Commenters are not anonymous, but pseudonymous, and have a profile across many sites, e.g. Disqus. If their comments often get voted down as trolling, then users have to click a "comment downgraded because of reputation" link to see it. Also if they are prolific and well respected their comments should be more visible somehow.

    2. The audience can downvote comments, giving reasons such as unreasonable/personal attack. Then if someone wants to continue their flame thread, they have to click again to see it, and the rest of the audience won't bother and so won't be unconsciously influenced.

    3. The author can get special privilege to downvote but perhaps not remove comments. Perhaps also rank comments if they don't have time to respond to all of them. 

    I would like to see how the results change if in order to see the flame war, the audience has to click on the "comment downgraded" link. The ones that don't click at all won't be badly influenced, and I expect the ones that do click will also attach less importance and be less influenced by the rudeness.
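    The threshold-and-collapse mechanism in points 1 and 2 can be sketched as follows. The threshold value, class, and function names are hypothetical, just one way such a rule could work:

```python
from dataclasses import dataclass

# Hypothetical cutoff: a net score at or below this collapses the comment
# behind a "comment downgraded because of reputation" link.
HIDE_THRESHOLD = -3

@dataclass
class Comment:
    text: str
    upvotes: int = 0
    downvotes: int = 0

    @property
    def score(self) -> int:
        # Net community score; a richer system would weight by voter reputation.
        return self.upvotes - self.downvotes

def render_state(comment: Comment) -> str:
    """Return how the comment is displayed: inline, or collapsed behind a click."""
    return "collapsed" if comment.score <= HIDE_THRESHOLD else "visible"
```

    The point of the extra click is exactly the experiment proposed above: readers who never expand the collapsed thread are shielded from the flame war, and those who do expand it have been primed to discount it.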

    Time to start being a bit more creative with social/reputation technologies to see if a better result is possible.
    Thor Russell
    Good insight. I've yet to find a universal system that wasn't intrusive in some way, but a hybrid that accepted Disqus, Facebook, etc. might not block out too many people. I think our closed system here does not work in 2013 - we end up denying 95% of sign-ups anyway, so it might be better to outsource it to someone else.
    You could control who you give accounts to, and also allow Disqus, Facebook, etc. comments; that would be a good system.
    I would also think you'd get some statistics from the comments that could be used as input on whether to allow an account for people who aren't "scientists" (like me :) ).
    Never is a long time.