Moderating Uncivil User Comments by Humans or Machines? The Effects of Moderation Agent on Perceptions of Bias and Credibility in News Content

Research output: Contribution to journal › Journal article › peer-review

34 Citations (Scopus)

Abstract

Studies have shown that uncivil comments under an online news article may result in biased perceptions of the news content, and explicit comment moderation has the potential to mitigate this adverse effect. Using an online experiment, the present study extends this line of research by examining how interface cues signalling different moderation agents (human vs. machine) for uncivil comments affect readers' judgments of the news, and how prior belief in the machine heuristic moderates these effects. The results indicated that perceptions of news bias were attenuated when uncivil comments were moderated by a machine (as opposed to a human) agent, which in turn engendered greater perceived credibility of the news story. Additionally, these indirect effects were more pronounced among readers who strongly believed that machine operations are generally accurate and reliable than among those with a weaker prior belief in this rule of thumb.
Original language: English
Pages (from-to): 64-83
Number of pages: 20
Journal: Digital Journalism
Volume: 9
Issue number: 1
Early online date: 4 Dec 2020
DOIs
Publication status: Published - 2 Jan 2021

User-Defined Keywords

  • Moderation
  • incivility
  • comments
  • machine heuristic
  • news bias
  • credibility
