In October 2004, a flawed systematic review entitled “Interactive Health Communication Applications for People with Chronic Disease” was published in the Cochrane Library, accompanied by several press releases in which the authors warned the public of the negative health consequences of interactive health communication applications, including the Internet. Within days of the review's publication, scientists identified major coding errors and other methodological problems that invalidated the principal conclusions of the study and led to a retraction. While the original study results and their negative conclusions were widely publicized in the media, the retraction went largely unnoticed.
This paper aims to document an unprecedented case of misinformation from a Cochrane review and its impact on media, scientists, and patients. It also aims to identify the generic factors leading to the incident and to suggest remedies.
This was a qualitative study of the events leading to the retraction of the publication and of the reactions from media, scientists, and patients. This includes a review and content analysis of academic and mass media articles responding to the publication and retraction. Mass media articles were retrieved in May 2005 from LexisNexis Academic and Google and were classified and tallied. The extended case method is employed, and the analysis is also applied to comparable publishing events.
A search on LexisNexis Academic database with the query “Elizabeth Murray AND health” for the period of June 2004 to May 2005 revealed a total of 15 press reports, of which only 1 addressed the retraction. Google was searched for references to the review, and the first 200 retrieved hits were analyzed. Of these, 170 pages were not related to the review. Of the remaining 30 pages, 23 (77%) were reports about the original publication that did not mention the retraction, 1 (3%) was a bibliography not mentioning the retraction, and 6 (20%) addressed the retraction, of which only 1 was a non-Cochrane–related source.
Analyzed retrievals showed that the mass media gave more coverage to the Cochrane review than to the retraction or to a related systematic review with a similar scope but a different conclusion. Questionable results were prematurely disseminated, oversimplified, and sensationalized, while the retraction was hardly noticed by the public. Open commentary by scientists and patients helped to rapidly identify the errors but did not prevent or correct the dissemination of misinformation.
On October 18, 2004, the Cochrane Collaboration, an organization that produces and disseminates systematic reviews of health care interventions [
Interactive health communication applications (IHCAs) were defined in the IHCA review as “computer-based, usually Web-based, health information packages for patients that combine information with social-, decision-, or ‘behavior change'-support” [
The principal conclusion of the review was “consumers whose primary aim is to achieve optimal clinical outcomes should not use an IHCA” [
The National Library of Medicine (NLM) is a leader in the bibliographic handling of retractions. The Medical Subject Headings (MeSH) contain the concept “retracted publication,” which identifies a citation previously published and now retracted through a formal issuance from the author, publisher, or other authorized agent. In January 2005, the PubMed query “Retracted Publication[Publication Type] AND 1971:2004[edat]” retrieved 619 retracted citations that entered PubMed between 1971 and 2004. Since the query “1971:2004[edat]” retrieves approximately 12.5 million citations, fewer than 1 in 10,000 publications have been retracted.
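For illustration only (this is not a method used in the study), such a retraction-rate estimate could be reproduced today with a short script against the NCBI E-utilities esearch endpoint; the query strings are copied from the text, and the counts returned now will differ from the January 2005 figures.

```python
# Illustrative sketch: reproduce the retraction-rate estimate via NCBI E-utilities.
# Counts returned today will differ from the January 2005 figures quoted in the text.
import json
import urllib.parse
import urllib.request

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pubmed_count(term: str) -> int:
    """Return the number of PubMed citations matching a query."""
    url = EUTILS + "?" + urllib.parse.urlencode(
        {"db": "pubmed", "term": term, "retmode": "json"}
    )
    with urllib.request.urlopen(url) as resp:
        return int(json.load(resp)["esearchresult"]["count"])

retracted = pubmed_count("Retracted Publication[Publication Type] AND 1971:2004[edat]")
total = pubmed_count("1971:2004[edat]")

# With the figures quoted in the text (619 retracted of ~12.5 million citations),
# the rate is roughly 0.005%, ie, fewer than 1 in 10,000 publications.
print(f"{retracted} retracted of {total} citations "
      f"({retracted / total:.4%}, about 1 in {round(total / retracted):,})")
```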
Friedman [
While very few publications are officially retracted, the concern about factors related to retractions is substantial. The study of retractions itself might be indexed with MeSH concepts such as “scientific misconduct,” although the fraction of retractions that stem from error as opposed to scientific misconduct is not known. The query “Scientific Misconduct[majr] AND 1971:2004[edat]” in PubMed retrieved 1840 citations. This body of literature recommends that medical researchers constructively criticize the research practices of others in their institution to reduce the likelihood of misconduct [
The objective of this paper is to document the IHCA review as an event in the history of medical publishing, to identify the factors leading to the publicizing of a retracted publication, and to assess the implications.
The objectives of this research called for various study methods. The author employed the following three methods: (1) historical processes of collecting documents about a contemporary event and organizing them thematically; (2) ethnographic processes of author participation in the event, personal communication with other participants in the event, interpretation of communications, and construction of models; (3) content analyses based on bibliographic database and Internet searches, coding of the retrieved documents, and tallying of the code frequencies.
The ethnographic method employs the extended case method, and the extended case method applies reflexive science to ethnography. Burawoy describes reflexive science as follows: “Reflexive science starts out from dialogue, virtual or real, between observer and participants, embeds such dialogue within a second dialogue between local processes and extralocal forces that in turn can only be comprehended through a third, expanding dialogue of theory with itself” [
Various database and Internet searches were employed to study the impact of the review and to quantify the difference between mass media coverage of the original publication and its retraction. LexisNexis Academic databases of health news and general news were searched, as was Google. The queries were designed in an iterative process that began with keywords from the question to be addressed and was refined based on study of the retrieval results. The retrieved results were coded, and the coding language was likewise developed iteratively. First, the obvious codes “about the review” and “about the retraction” were introduced. Each retrieved document was classified into a single code by the author. If a retrieved document was not appropriately described by an existing code, the coding language was augmented. The Web of Science was also queried to identify academic citations, but no citations were identified (data not shown). Most database and Internet searches were conducted in May 2005.
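As an illustration of the coding-and-tallying step described above (not the actual instrument used in this study), the following minimal sketch assigns each retrieved document a single code, augments the coding language when no existing code fits, and tallies code frequencies; the document titles and the classify() heuristic are hypothetical.

```python
# Minimal sketch of iterative coding and tallying; titles and the classify()
# heuristic are hypothetical, not the study's actual data or instrument.
from collections import Counter

coding_language = ["about the review", "about the retraction"]

def classify(title: str) -> str:
    """Assign exactly one code to a retrieved document (simplified heuristic)."""
    lowered = title.lower()
    if "retract" in lowered:
        return "about the retraction"
    if "cochrane" in lowered or "interactive" in lowered:
        return "about the review"
    return "not related to the review"  # a new code, added when nothing fits

retrieved = [
    "Click to Get Sick? Cochrane review warns web users",       # hypothetical
    "Cochrane retracts review of interactive health programs",  # hypothetical
    "Unrelated gardening page",                                  # hypothetical
]

codes = [classify(title) for title in retrieved]
for code in codes:                     # augment the coding language as needed
    if code not in coding_language:
        coding_language.append(code)

print(Counter(codes))                  # tally of code frequencies
```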
To better understand how unusual the publicity accorded to the IHCA review was, this study was extended to 3 other publications: 2 were retracted publications tagged as “Retracted Publication” (1 a Cochrane review that was not eHealth related, and 1 a non-Cochrane publication that was eHealth related), and 1 was a meta-analysis with a scope similar to that of the IHCA review. These 3 reports were identified through PubMed searches.
The following qualitative results on the impact of the IHCA review are organized into three main sections: scientist reaction, mass media reaction, and patient reaction.
The section on scientist reaction considers Cochrane reviewers' reactions and how eHealth scientists responded to the IHCA review in the comment section of the Cochrane database. The mass media section presents the Cochrane retraction and then explores, via LexisNexis and Google results, the reaction of the mass media to the IHCA review. The patient reaction section shares dialogue from patient-patient online discussions that reveals how patients reacted to the IHCA review.
The Cochrane Collaboration allows anyone to submit comments to the published reviews. Two scientists' comments on the IHCA review appeared independently on October 28, 2004. Kummervold and Eysenbach criticized the IHCA review for both its protocol and its coding.
Kummervold explained in detail how the coding of the meta-analysis was incorrect: “We can't get the numbers to add up, it looks like they are reversed in 8 of the 11 studies...” [
Eysenbach had similar comments, stressing that a formal meta-analysis of these heterogeneous studies was problematic, and that the three studies which contributed most to the “negative” result were in fact positive: “Apart from the fact that I do not think that it is legitimate to do a formal meta-analysis using papers measuring totally heterogeneous outcomes with different types of interventions, I also notice that the overall effect estimate is ‘negative' (eg, ‘favoring control') because of three studies…. However, when I read these three studies I cannot find that their result[s] are negative…. If my suspicion is correct, then this is quite a catastrophic error, and quite an embarrassment for Cochrane to let such an error slip through peer-review” [
On November 10, 2004, the Cochrane Consumers and Communication Review Group reacted to the discovered errors [
John Wiley & Sons (the publisher of the Cochrane Database) released to EurekAlert a retraction on December 6, 2004: “The review originally determined that…chronically ill people using interactive programmes had worse clinical outcomes than those who did not. Regrettably, errors in data analysis meant that these outcomes were reported incorrectly.... It is expected that the revised results will be published in April 2005” [
The April 2005 edition of the Cochrane Systematic Reviews did not mention the IHCA review. Royle, the chief executive officer of the Cochrane Collaboration, said that further review of the revised report was ongoing and no date could be given as to when the review might be published (personal communication, April 25, 2005).
The Cochrane Database of Systematic Reviews is not read by the typical consumer. However, Murray's employer, the University College London (UCL), worked with Murray to widely publicize the result. UCL posted a news bulletin on its website on October 18, 2004 that remained there as of May 25, 2005. The bulletin was titled “Knowledge may be hazardous to web consumers' health” and stated the following: “People who use their computers to find information about their chronic disease often wind up in worse condition than if they had listened to their doctor, according to a UCL review of studies on internet health.… One reason…might be because knowledge-seekers become so steeped in information from the Internet they make treatment choices on their own, contradicting advice from their doctors” [
Most significantly, the UCL bulletin was circulated to information intermediaries, including AlphaGalileo and EurekAlert, that are considered the main gateways to the world's mass media.
A search on LexisNexis Academic with the query “Elizabeth Murray AND health” for the period June 2004 to May 2005 revealed a total of 15 relevant press reports, in the following categories:
Medical and Health News: There were 9 publications with titles such as UCL's press release title of “Knowledge may be hazardous to web consumers' health.” The publications appeared in places like
General News–Major Papers: There were 5 relevant articles, such as one entitled “Why medical advice from the internet can be bad for your health” in the British
Time Incorporated Publications: There was 1 article in the November 1, 2004 issue of
Among the 15 results from the LexisNexis Academic database, only 1 newspaper report, authored by Tom Spears, dealt specifically with the retraction [
To further test whether the media emphasized the false negative result but minimally covered the retraction, a content analysis on Google was performed on May 24, 2005. The query was “health AND Cochrane AND Murray AND (interactive OR web OR internet)” for English pages, within the past year. Of the first 200 retrieved hits, 170 pages were not related to the IHCA review. Of the remaining 30 pages, 23 (77%) were reports about the original publication that did not mention the retraction, and an additional page was a bibliography (at a UCL site) that included a citation to the IHCA review, again without mentioning the retraction. All reports (except the bibliography) used a title such as “Click to Get Sick?” and emphasized the negative impact on clinical outcomes of using the Web. The reports came from such reputable sources as the
The grey literature reported on the mass media. For example,
NLM indexed the IHCA review and entered the citation for it (including its abstract) in PubMed on October 21, 2004. The “Retracted Publication” tag did not, however, appear in PubMed until March 24, 2005.
Some patients reported the news about the IHCA review to their patient-patient online discussion groups. In a neurology patient discussion group [
This author reported the
The typical patient with a chronic disease has no formal medical training and is ill prepared to critique a meta-analysis of clinical trials. However, the typical patient is vulnerable to cultural pressures, which are partially shaped by and reflected in the mass media.
For comparison, a search for further retracted Cochrane reviews using the PubMed query “Cochrane Database Syst Rev[TA] AND Retracted Publication[PT] AND 1971:2005/5/25[edat]” was conducted. One reference, in addition to the IHCA review already discussed, was identified, which was a retracted review by Brewster et al [
A search on LexisNexis with the query “Brewster AND antihypertensive” for the period November 2004 to May 2005 retrieved no articles in either the “General News–Major Papers” category or the “Medical and Health News” category.
A search on Google for “Brewster antihypertensive” followed by an examination of the first 100 retrieved pages identified 23 relevant pages, which showed a very different content pattern from the hits for the IHCA review. They all contained citations of papers by Brewster et al, who have published elsewhere on the same subject as their review. The Brewster et al publication attracting the most attention was an article [
To determine whether other articles on a topic similar to that of the IHCA review have been retracted, a search was first made for articles on a similar subject that had been MeSH indexed in PubMed. The article by Demiris [
A search on LexisNexis Academic with the query “McKinley and surgical and Internet” for the period 1995 to May 2005 revealed no relevant press reports in either the “General News–Major Papers” category (the three hits retrieved were not relevant to the McKinley article) or the “Medical and Health News” category.
A search on Google for English pages with the query “McKinley surgical Internet” revealed 96 irrelevant pointers in the first 100 results. Of the remaining 4 relevant hits, 1 was the article about the plagiarism [
Thus, the only other retracted publication in PubMed similar in topic (the Internet) to the IHCA review elicited a very different pattern of reactions.
The IHCA review addressed a topic that the mass media found interesting. Has any other recent publication presented a meta-analysis of the impact of interactive applications on health, and, if so, what was the mass media reaction? Using the query “Meta-analysis AND Web AND Chronic Illness” in PubMed, we found only 1 citation: Wantland et al [
What has been the impact of the Wantland et al paper, and how does it compare to the impact of the IHCA review? The Wantland et al paper was not announced with a press release in EurekAlert. A search on LexisNexis Academic for newspaper articles about the Wantland et al paper retrieved no articles. The queries performed were similar to those performed for the IHCA review and included “Wantland AND health” for 2004 through 2005 in General News–Major Papers.
A search was done on Google for “Wantland health Web” on May 24, 2005. Of the first 200 returns, 182 were not relevant. Of the remaining 18 hits, 15 pages contained academic citations to Wantland et al, 2 announced the appearance of the article, and 1 was a personal blog that commented on the article.
Thus, most of the Google returns citing Wantland et al were academic in character, very different from the mass media coverage afforded the IHCA review.
As shown, the IHCA review provides a perhaps unprecedented case from which lessons should be drawn. Only one other Cochrane review (about antihypertensives) has been retracted, and that one received negligible mass media attention. The only retracted publication in PubMed that is indexed under the MeSH concept of “Internet” (the IHCA review did not have time to get indexed before it was withdrawn) received no newspaper coverage. The paper most similar to the IHCA review in topic and method (the Wantland et al report [
This section next presents a framework based on tiers of response. The first tier is medical scientists. The second tier is the mass media spreading medical press releases. The third tier is the patient community reacting to the mass media and the scientists.
In an effort to critique the problem that occurred, one might build on the analysis of misconduct in toxicology by Purchase. Purchase [
Intention of the work
Conduct of the studies
Design and interpretation of studies
Bias from conflict of interest
In the case of the IHCA review, the intention was scientifically appropriate, namely to gain further insight into IHCAs through a systematic review. In the other three categories, fault can be found:
The errors in the coding of data should not have been made. The coauthors Nazareth and Tai, who are credited with doing the coding, have credentials strong enough that inexperience cannot be blamed: Nazareth is a Professor at UCL and Scientific Director of the British Medical Research Council's General Practice and Research Framework, and Tai has coauthored several articles in refereed medical journals over the past two decades. The coders' level of experience therefore does not explain the miscoding.
The design of the study has been criticized for lumping together studies that are too heterogeneous in their design, interventions, and outcomes [
The reporting of the work suggests possible bias. The authors and their employers sensationalized a result that caught the media's attention. To some observers, the review appeared biased in that the authors, who are affiliated with medical institutions, concluded that patients should listen to their doctor instead of seeking help on the Internet.
Purchase [
Open commentary, as exists for the Cochrane Database after publication, is one way to identify flaws. Extending open commentary to the refereeing phase might reduce the likelihood of errors reaching press. A submitted article might be made available to the public, and a community of hundreds of registered scientists could be invited to comment anonymously; a submission would be considered “published” only after this extensive online commentary reached a consensus. Other approaches to increasing commentary on the research process include refereeing the protocol phase [
The second-order problem is a press release and subsequent mass media coverage of the release. Winsten's classic study of science and the media shows how the truth is repeatedly misrepresented by journalists and researchers: “The most striking finding which emerged from the interviews [of medical journalists] is the dominant distorting influence of the competitive force in journalism.… As economic competition among hospitals has intensified, they have begun to compete aggressively for publicity.… With increasing frequency…scientists…are using the media to attach their names to important findings before their competitors do.… The result has been a spiraling competition, sometimes characterized by exaggerated claims” [
Online media have stimulated further competition [
One way for researchers to prevent the mass media from misrepresenting the truth is to understand how the media work and to interact with them accordingly [
The honesty of the press could be improved with the Internet [
The third-order problem concerns the long-term impact of the mass media. While electronic publications might be erased from a computer or marked as retracted, this does not consistently happen. Furthermore, some of the mass media coverage of the IHCA review is on paper and sits on people's bedside tables with no practical way to be retracted [
Although this author did not (yet) find any citations to the IHCA review in Web of Science, previous studies have confirmed that a retracted scientific publication may continue to have impact without readers recognizing its retracted status. For instance, one study [
If and when the revised IHCA review is published, what could it say that would undo the effect of the original publication? If the conclusion is that IHCAs result in improved clinical outcomes, then the medical profession will want to closely study the protocol and might have grounds to discredit the conclusion. The media trumpeted the IHCA review conclusion partly because it was counterintuitive but was backed by top-notch institutions. If the conclusion becomes intuitive, then the media are unlikely to be interested in it.
The reactions to the IHCA review in patient online discussions highlight the importance of virtual communities in helping patients deal with published information. Simple extensions to Web-based patient discussion systems could help patients connect to Web-based publications. For instance, when a patient posts a message to a Web-based discussion board, the system could parse the message and provide links from the message to relevant articles on the Web. Patients might follow the links and engage in discourse about the validity and implications of the literature. This might lessen the potential ill effects of publications that are wrong or misleading.
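A minimal sketch of this idea follows, assuming a simple keyword-matching approach; the topic vocabulary and the PubMed search URL pattern are illustrative assumptions rather than features of any existing discussion-board software.

```python
# Minimal sketch: scan a patient's post for known topic terms and attach links
# to relevant literature. The term list and the PubMed search URL pattern are
# illustrative assumptions, not an existing discussion-board API.
import re
import urllib.parse

TOPIC_TERMS = ["cochrane review", "interactive health", "chronic disease"]  # assumed vocabulary

def link_literature(message: str) -> dict:
    """Map each topic term found in the message to a literature search link."""
    links = {}
    for term in TOPIC_TERMS:
        if re.search(re.escape(term), message, re.IGNORECASE):
            query = urllib.parse.quote_plus(term)
            links[term] = f"https://pubmed.ncbi.nlm.nih.gov/?term={query}"
    return links

post = ("Has anyone read the Cochrane review saying the internet "
        "harms people with chronic disease?")
for term, url in link_literature(post).items():
    print(f"{term}: {url}")
```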
This special medical publishing event was marked by incorrect coding and a desire for maximum publicity. The IHCA review authors, their employers, and the Cochrane Collaboration were responsible for quality control, and they failed. The mass media played their part by widely publicizing a sensational message but failing to react to the notice that the sensational message was false. The false result that patients are clinically harmed by interactive applications was delivered forcefully to patients worldwide. The broad lesson to be relearned is that potentially sensational results should be carefully scrutinized before being sensationalized.
None declared.
interactive health communication applications
University College London