Before I get into the study itself, I'd like to point out that posting this is not a rebuttal to any previous discussion. I just think it's a fascinating study of a human tendency, and something for all of us to keep in mind when interacting online: our ability, or lack thereof, to influence the opinions of others, at least in the moment. OK, so here's the stuff...
The concept is simple: provide instant corrections to web surfers when they run across obviously false information on the Internet. But a new study suggests that this type of tool may not dispel inaccurate beliefs, particularly among people who already want to believe the falsehood. The problem with trying to correct false information is that some people want to believe it, and simply telling them it is false won't convince them. For example, the rumor that President Obama was not born in the United States was widely believed during the past election season, even though it was thoroughly debunked.
The prospect of correcting falsehoods like this online before they have a chance to spread widely has obvious appeal, said R. Kelly Garrett, the study's lead author. But will it work? To find out, Garrett conducted a study with Brian Weeks, a graduate student in communication at Ohio State (ugh). Their study, which they presented Feb. 26 in Austin, Texas, appears in the 2013 Proceedings of the Computer Supported Cooperative Work and Social Computing conference.
Participants in the study were a diverse group of 574 adults from across the country who participated online. The experiment was designed to see what would happen when participants read false statements copied from a political blog about the issue of electronic health records. While some of the information, collected from news stories and government sources, was correct, the researchers also inserted several false statements about who was allowed access to these records. For instance, the message falsely claimed that hospital administrators, health insurance companies and even government officials had unrestricted access to people's electronic health records.
The participants were divided into three groups. One group was presented with an immediate correction stating that FactCheck.org, an independent fact-checking organization, had concluded the blog post contained factual errors. Inaccurate statements were italicized, enclosed in brackets, and displayed in red, and a detailed correction appeared at the bottom of the page.
A second group read the blog post with the errors, completed an unrelated three-minute task, and then was presented with the exact same correction. The final group was presented only with the inaccurate message during the study. Afterwards, all participants were asked how easy or difficult it would be for several groups to access electronic health records, and were graded on the accuracy of their answers.
In general, those who received the immediate correction were just slightly more likely to be accurate than those who received the delayed correction. Those who received no corrections were, not surprisingly, the least accurate. But the most interesting results came when the researchers analyzed who was influenced by each kind of correction.
The real-time correction worked well with participants who indicated at the beginning of the study that they supported electronic health records, also called EHRs. "But for those who opposed EHRs, the effect of the immediate correction was essentially the same as if they had received no correction at all," Garrett said. The reason appears to be that opponents of EHRs discounted the credibility of the source of the correction, Garrett said. On the other hand, the more favorably an individual felt about EHRs, the more credible the correction was perceived to be.
Although this pattern was also evident among those who received the delayed correction, the effect was significantly weaker. Garrett said the results of this study cast doubt on the theory that people who believe false rumors need only to be educated about the truth to change their minds. "Humans aren't vessels into which you can just pour accurate information," he said. "Correcting misperceptions is really a persuasion task. You have to convince people that, while there are competing claims, one claim is clearly more accurate." Garrett noted that, while instant corrections were slightly more effective than delayed corrections, the problem is that instant corrections actually increase resistance among those whose attitudes are supported by the falsehood.