Published criteria for evaluating health related web sites: review. BMJ 1999;318:647. doi: https://doi.org/10.1136/bmj.318.7184.647 (Published 06 March 1999)
- A description of the agreement coefficient appears on our website
Paul Kim, Thomas R Eng, Mary Jo Deering, Andrew Maxfield
Health Communication and Telehealth, Office of Disease Prevention and Health Promotion, US Department of Health and Human Services, Washington DC, USA
Thomas R Eng, Mary Jo Deering,
National Institute for Occupational Safety and Health, Centers for Disease Control and Prevention, Washington DC, USA
Correspondence to: Dr Eng
Objective To review published criteria for specifically evaluating health related information on the world wide web, and to identify areas of consensus.
Design Search of world wide web sites and peer reviewed medical journals for explicit criteria for evaluating health related information on the web, using Medline and Lexis-Nexis databases, and the following internet search engines: Yahoo!, Excite, Altavista, Webcrawler, HotBot, Infoseek, Magellan Internet Guide, and Lycos. Criteria were extracted and grouped into categories.
Results 29 published rating tools and journal articles were identified that had explicit criteria for assessing health related web sites. Of the 165 criteria extracted from these tools and articles, 132 (80%) were grouped under one of 12 specific categories and 33 (20%) were grouped as miscellaneous because they lacked specificity or were unique. The most frequently cited criteria were those dealing with content, design and aesthetics of site, disclosure of authors, sponsors, or developers, currency of information (includes frequency of update, freshness, maintenance of site), authority of source, ease of use, and accessibility and availability.
Conclusions Results suggest that many authors agree on key criteria for evaluating health related web sites, and that efforts to develop consensus criteria may be helpful. The next step is to identify and assess a clear, simple set of consensus criteria that the general public can understand and use.
The large volume of health information resources available on the internet has great potential to improve health,(1)(2)(3) but it is increasingly difficult to discern which resources are accurate or appropriate for users.(3)(4)(5)(6)(7)(8) Because of the potential for harm from misleading and inaccurate health information,(9)(10)(11)(12)(13)(14) many organisations and individuals have published or implemented criteria for evaluating the appropriateness or quality of these resources.(15) (16) Two published reviews of evaluation criteria for health related web sites did not present information on the range of criteria proposed by various authors, and included rating tools that were not developed exclusively for health related sites.(15) (17) Our study reviews criteria currently proposed or employed specifically to evaluate health related web sites.
Databases and search engines
Between September 1997 and May 1998, we conducted a search of the web and peer reviewed medical journals for criteria for evaluating health related information on the web using Medline and Lexis-Nexis databases, and web search engines including Yahoo!, Excite, Altavista, Webcrawler, HotBot, Infoseek, Magellan Internet Guide, and Lycos. Medline searches (using PubMed) used variations of the following: "quality," "Internet," "World Wide Web," "computer communication networks/standards," "quality control," and "medical informatics/standards." Searches with web search engines and Lexis-Nexis used "quality," "health information," "health," and variations of "rating," "ranking," "evaluate," "award," and "assess." Investigating references and hyperlinks from initial results gave additional resources. We ended the sampling period when searches produced similar results, and when previous search results became outdated.
We included criteria when they were explicit, specifically used for evaluating health related web sites, and published in a peer reviewed journal or publicly accessible web site. We also considered peer reviewed journals not indexed by Medline. We included resources framed as "guidelines" because there was little difference between them and other criteria, and the intent of the authors was similar. When subcriteria provided details about main criteria, we included only the main criteria to prevent overrepresenting that author’s perspective. Criteria were extracted and sorted into similar groups according to their wording and description. When a criterion seemed to combine several concepts and could fit in multiple groups, we considered the first mentioned concept.
To examine the reproducibility of the category groupings, four independent, naive coders also assigned 40 randomly selected and ordered criteria to the 13 criteria categories. Agreement among the coders was 84% (two or more coders assigned each criterion to the same category). The coders unanimously agreed on 63% (25/40) of the criteria. The agreement of the coders with the authors was 76% (the independent coders used the same categories as the authors in coding criteria 121 times out of a possible 160). The difference between the agreement among coders (84%) and the agreement of coders with the authors (76%) results from coders agreeing on a category but disagreeing with the authors. Given that the observed agreement of the coders with the authors was 76%, the coefficient, indicating "per cent agreement above chance," was 0.74, or 74%. The basis for this "agreement coefficient" is Scott's pi: pi = (percentage of observed matches − percentage of expected matches)/(1 − percentage of expected matches), where the expected percentage is based on (1/4)×(13/40).(18) However, it is not "agreement" per se, but rather agreement with the authors, or a coefficient that reflects agreement with the "correct" response.
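The coefficient above can be reproduced in a few lines. The sketch below is not from the paper: it applies the Scott's pi formula to the reported figures (121 matches out of 160 codings), and assumes a chance-match probability of 1/13, i.e. a coder selecting one of the 13 categories at random. Under that assumption the formula yields the reported value of 0.74.

```python
# Scott's pi: per cent agreement above chance between the coders
# and the authors' "correct" category assignments.
def scotts_pi(observed: float, expected: float) -> float:
    """pi = (observed - expected) / (1 - expected)."""
    return (observed - expected) / (1 - expected)

# Figures reported in the text: 121 matching codings out of 160 (76%).
observed = 121 / 160
# Assumption (not stated this way in the paper): chance of matching
# one of the 13 categories at random.
expected = 1 / 13

pi = scotts_pi(observed, expected)
print(round(pi, 2))  # 0.74
```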
Twenty nine rating tools and articles—24 web sites and five journal articles—had explicit criteria for assessing health related web sites (table 1). Of the 165 criteria identified, 132 (80%) were grouped under 12 specific categories (table 2). Thirty three (20%) criteria that lacked specificity or were unique were categorised as "miscellaneous." Frequently cited criteria included those dealing with content, design and aesthetics of site, and disclosure of authors, sponsors, or developers.

Discussion
Not surprisingly, "content" of the site, which includes concepts of information quality and accuracy, was the most commonly cited criterion group. Design and aesthetics of the site and ease of use were the second and sixth most frequently cited groups respectively, indicating that authors highly value good quality application design and user interfaces. Disclosure of authors, sponsors, or developers had the third highest frequency, highlighting the need for users to be able to consider a site’s content in the context of who created or financed the site. It was somewhat surprising that disclosure was not more commonly cited given recent reports about misleading health information and fraud on the internet.(9) (11) (12) Most rating tools discriminated between content and the fourth most common criterion group, currency of information (includes frequency of update, freshness, maintenance of site), suggesting that currency of information is nearly as important as the information itself.
Criteria related to confidentiality and privacy of information were only cited by one author despite widespread interest in this issue.(19) Some health related web sites are already collecting personal health information to "tailor" content, and as sites begin to integrate healthcare services and information, confidentiality and privacy safeguards will become increasingly important.(19)(20)(21)
Study limitations include the subjective variables around the scope of the criteria categories used. Testing of the category groupings, however, showed that they were reproducible by others. It is also possible that some authors used the same criteria terms to describe different concepts. Because subcriteria were not included, some concepts may not have been represented. Inherent limitations of web search engines and the dynamic nature of the web also prevented us from locating all existing published criteria.(22) Nevertheless, our review located more sources of criteria specifically for health related sites than did previous reviews.(15) (17)
Given the evolving state of the internet, it may be difficult or even inappropriate to develop a static tool or system for assessing health related web sites. Our results suggest that many authors agree on key criteria, and that efforts to develop consensus criteria may be helpful.(6) (16) (23)(24)(25) The next step is to identify and assess a clear, simple set of consensus criteria that the general public can understand and use. Tools that integrate them need to be developed and validated, and their ultimate impact and effectiveness in assisting the public with health related decisions should be monitored to ensure that they remain useful.
· Many organisations and individuals have published criteria to evaluate health related information on the world wide web
· A literature and world wide web search found that the most frequently cited criteria were those dealing with content, design and aesthetics of site, disclosure of authors, sponsors, or developers, currency of information, authority of source, and ease of use
· Criteria related to confidentiality and privacy were only cited by one author
· Consensus regarding critical criteria for evaluation of web based health information seems to be emerging
· Our results indicate that many authors agree on key criteria for evaluating health related web sites, and that efforts to develop a set of key criteria may be helpful
We thank Farrokh Alemi and Anne Restino for their assistance and advice on this study. The views expressed in this paper are solely those of the authors and do not necessarily reflect those of the US Department of Health and Human Services.
Contributors: PK participated in data collection, analysis, and interpretation, and writing the paper. TRE formulated the study design, developed the core ideas, and participated in data analysis and interpretation, and writing the paper. MJD participated in the study design and interpretation, and edited the paper. AM participated in data analysis and interpretation, and edited the paper. PK and TRE will act as guarantors for the paper.
Funding: Internal funds of the US Department of Health and Human Services.
Competing interests: None declared.
- General Accounting Office. Consumer health informatics: emerging issues. Publication GAO/AIMD-96-86, July 1996.
- Robinson TN, Patrick K, Eng TR, Gustafson D for the Science Panel on Interactive Communication and Health. An evidence-based approach to interactive health communication: a challenge to medicine in the Information Age. JAMA 1998;280:1264-9.
- Eng TR, Maxfield A, Patrick K, Deering MJ, Ratzan S, Gustafson D. Access to health information and support: a public highway or a private road? JAMA 1998;280:1371-5.
- Coiera E. The internet’s challenge to health care provision. BMJ 1996;312:3-4.
- Anon. The web of information inequality. Lancet 1997;349:1781.
- Silberg WM, Lundberg GD, Musacchio RA. Assessing, controlling, and assuring the quality of medical information on the internet. Caveant lector et viewor—let the reader and buyer beware. JAMA 1997;277:1244-5.
- Sonnenberg FA. Health information on the internet. Opportunities and pitfalls. Arch Intern Med 1997;157:151-2.
- Wyatt JC. Commentary: measuring quality and impact of the world wide web. BMJ 1997;314:1879-81.
- Federal Trade Commission. North American Health Claim Surf Day targets Internet ads. Hundreds of e-mail messages sent. Press release, Nov. 5, 1997. Accessed May 12, 1998. http://www.ftc.gov/opa/9711/hlthsurf.htm
- Impicciatore P, Pandolfini C, Casella N, Bonati M. Reliability of health information for the public on the world wide web: systematic survey of advice on managing fever in children at home. BMJ 1997;314:1875-9.
- Food and Drug Administration. FDA warns consumers on dangerous products promoted on the internet. FDA Talk Paper T97-26, June 17, 1997.
- Bower H. Internet sees growth of unverified health claims. BMJ 1996;313:497.
- Micke MM. The case of hallucinogenic plants and the internet. J Sch Health 1996;66:277-80.
- Weisbord SD, Soule JB, Kimmel PL. Poison on line—acute renal failure caused by oil of wormwood purchased through the internet. N Engl J Med 1997;337:825-7.
- Jadad AR, Gagliardi A. Rating health information on the internet. Navigating to knowledge or to Babel? JAMA 1998;279:611-4.
- Pealer LN, Dorman SM. Evaluating health-related web sites. J Sch Health 1997;67:232-5.
- Murray PJ, Rizzolo MA. Web site reviews and evaluations. Nurs Stand Online 1997 Jul 30;11(45). Accessed January 29, 1998. http://www.nursing-standard.co.uk/vol11-45/ol-art.htm
- Krippendorff K. Content analysis: an introduction to its methodology. Beverly Hills, CA: Sage, 1980.
- Bowen JW, Klimczak JC, Ruiz M, Barnes M. Design of access control methods for protecting the confidentiality of patient information in networked systems. Proceedings of the American Medical Informatics Association annual fall symposium 1997:46-50.
- National Research Council, Computer Science and Telecommunications Board (US). For the record: protecting electronic health information. Washington: National Academy Press, 1997.
- Patrick K, Robinson TN, Alemi F, Eng TR, for the Science Panel on Interactive Communication and Health. Policy issues relevant to the evaluation of interactive health communication applications. Am J Prev Med 1999;16:35-42.
- Lawrence S, Giles CL. Searching the world wide web. Science 1998;280:98-100.
- British Healthcare Internet Association. Quality standards for medical publishing on the web. Accessed May 26, 1998. http://www.bhia.org/public/reference/recommendations/medpubstandards.htm
- Health On the Net Foundation. HON code of conduct for medical and health web sites. Accessed January 27, 1998. http://www.hon.ch/HONcode/Conduct.html
- Health Information Technology Institute, Mitretek Systems. Criteria for assessing the quality of health information on the internet. Accessed January 27, 1998. http://www.mitretek.org/hiti/showcase/index.html