Intended for healthcare professionals

Information In Practice

Obtaining useful information from expert based sources

BMJ 1997;314:947 (Published 29 March 1997)
  1. David C Slawson, associate professor (dslawson{at}),a
  2. Allen F Shaughnessy, director of research,b
  1. a Department of Family Medicine, University of Virginia, Charlottesville, VA 22908, USA
  2. b Harrisburg Family Practice Residency Program, Harrisburg, PA
  1. Correspondence to: Dr Slawson
  • Accepted 5 March 1997


Clinicians rely heavily on expert based systems (consultation with colleagues, journal reviews and textbooks, and continuing education activities) to obtain new information. The usefulness of such sources depends on the relevance and validity of the information and on the work it takes to obtain it. Useful information can be distinguished from useless information by asking three questions: Does the information focus on an outcome that my patients care about? Is the issue common to my practice, and is the intervention feasible? If the information is true, will it require me to change my practice? If the answer to all three questions is yes, the information is a common POEM (patient oriented evidence that matters), capable of improving the lives of your patients, and it should then be evaluated for validity. Conclusions based on the results of well designed clinical trials are more likely to be valid than those drawn from observations made in clinical practice. Both members of the team, clinicians and experts, must take responsibility for their respective roles.


When educators and academics look for ways to manage information in clinical practice they tend to focus on computer resources (hardware and software). Clinicians, on the other hand, usually turn to colleagues or other specialists (“wetware”) to find answers. Information from these expert sources can be obtained by direct consultation, via review articles and textbooks, or during continuing education activities.

Expert sources of information are valuable because they are quick, cheap (usually), and easy to use. An expert also provides guidance, support, affirmation, and other psychological benefits that computerised sources cannot provide.1 While much has been written about the critical assessment of journal articles and reviews, little information is available to guide clinicians in evaluating “expert” sources. We will explore the usefulness of these sources and present a rationale for when to use them and how to evaluate the information.

Determining the usefulness of medical information

When we read journals or textbooks, attend continuing education conferences, or consult with colleagues to obtain information about treating our patients, our goal is to find the most useful information in the shortest time. The best information has three attributes: it must be relevant to everyday practice, it must be correct, and it should require little work to obtain. These three factors are related as follows:

Usefulness of medical information = (relevance x validity)/work


Relevance is based on the type of information being presented and the frequency of the problem in your practice. The most relevant information will tell you how to help your patients live functional, satisfying lives free from pain and symptoms. We call this type of information patient oriented evidence. It is based, as much as possible, on the results of “outcomes based” research rather than on “authority based” or “experiential” impressions. It does not stop with surrogate markers2 or intermediate level data (what we call disease oriented evidence) but instead evaluates outcomes of importance to people.3

For example, an expert recommending the need for screening for prostate cancer with the prostate specific antigen assay may discuss the accuracy of the test in identifying men with prostate cancer and the survival rates for different stages of prostate cancer (disease oriented evidence). This information does not tell you what you and your patients really want to know: whether they will be better off (live a longer, healthier, happier life) as a result of early identification of the cancer (patient oriented evidence). Only a randomised trial evaluating the overall effect of early detection on mortality and morbidity from prostate cancer will provide this information.

What doctors are truly seeking is “patient oriented evidence that matters” (POEM). This information matters because, if it is valid, it should change what they do in practice. Once a POEM has been identified, the frequency of contact with the problem in clinical practice must be considered. When doctors evaluate information sources, POEMs are top priority if they provide information that doctors can use to improve the management of illnesses frequently seen in their clinical practice. These common POEMs will have the greatest impact on patients and therefore the greatest relevance.


The second factor, validity, is the likelihood that the information is true. Conclusions based on the results of well designed clinical trials are more likely to be valid than those drawn from observations in clinical practice. But it is not enough to accept evidence at face value simply because it has been published in a well known journal or comes recommended by a specialist.4 For example, early research into preventing osteoporosis showed that sodium fluoride increased bone density, and, on the basis of expert recommendation, many patients were subsequently treated.5 Further research, however, found that fracture rates actually increased with the use of fluoride.6


The third factor, the work of using an information source, includes how long it takes to obtain the information, how much it costs, and the mental energy required to track down the answer. Working too hard to establish the validity or relevance of information lowers its usefulness. Too often, however, sources that require little work also have low validity or relevance and should be used cautiously. Expert based sources are appealing because the work needed to access the information is low. Their true potential for usefulness therefore depends on their validity and relevance.

The “lure” of the expert

Perhaps the most important benefit of expert based information is that experience can be used to interpret and apply evidence to the care of difficult patients. Experts can be relied on to perform procedures or to help with diagnosing atypical disease. For example, a paediatric infectious disease specialist may be the best person to confirm the diagnosis of varicella encephalitis in a sick child.

Benefits and drawbacks of expert based information

Benefits

  • Wisdom gained through experience

  • Clinical “pearls” that cannot be derived through the scientific method

  • Able to fill in the gaps in current outcomes based knowledge with evidence derived from clinical experience or from their extensive knowledge of the physiological basis of disease


Drawbacks

  • The information may be out of date

  • Reverse gullibility, in which clinical experience or disease oriented evidence is favoured over patient oriented evidence

  • Knowledge may be based on a highly selected population, and the information may not be applicable to your patients

Doctors must be cautious, however, when adopting an expert's anecdote based treatment advice. The same infectious disease specialist may be too biased by 20 years of experience to endorse the practice of not admitting all febrile children under the age of two months to hospital. There is also a tendency for clinicians to develop clinical “rules” out of patient specific recommendations made by consultants. A typical mental shortcut is to think, “The last time I asked, the cardiologist suggested amlodipine. Therefore, I should always use amlodipine.” It may not have been the expert's intention to provide a wide ranging rule in response to your specific question.

There are several other reasons why expert based information may be inaccurate. Firstly, expert advice may not be based on the most current research. A study of reviews and textbook articles written by experts found that in some cases up to 13 years elapsed between the time that convincing evidence became available to support interventions for managing myocardial infarction and the time when most experts recommended these interventions.7 As well as the lag time associated with publication, this delay stems from the proclivity of experts, especially researchers, to favour their beliefs or their prior experiences over evidence derived from outcomes based research, a tendency termed “reverse gullibility.”8 For example, at least four randomised clinical trials have shown that patching eyes after a corneal abrasion not only does no good but actually worsens pain and delays healing.9 10 11 12 In a recent series of letters in a major journal, expert ophthalmologists stated that, despite this research, clinicians at their institution had always patched eyes and would continue to do so regardless of information obtained from patient oriented clinical trials.13 14

A second problem with expert information is the tendency for authors of review articles to start by writing their conclusions and then find the supportive evidence. The potential for unrecognised bias in these articles is high, since only references that support the predetermined conclusions are used. Furthermore, cited references often do not actually support the conclusions. For example, the latest edition of a widely used reference on antimicrobial treatment states that co-trimoxazole should not be used in children with otitis media who do not respond to amoxycillin.15 However, the reference given to support this recommendation actually states the exact opposite.16 This flaw is not uncommon: one study of review articles found that 24% of the referenced articles were not correctly summarised.17 Assertions in reviews or texts that contradict what you think is true should be checked for validity. A little raised number at the end of a statement is not an icon of inerrancy.

A third issue is that expert based knowledge is often developed through experience with a highly selected patient population, and this knowledge may not be applicable to the general population.18 To an endocrinologist who sees a unique population, patients with weight gain, headaches, and fatigue have a hormone secreting tumour until proved otherwise. Similar patients seen by a primary care doctor would probably be evaluated for depression.

Finally, you should not assume that an expert is skilled at evaluating medical research. Techniques for critical appraisal are not well known or easily developed, and experts may not be any better than you are at determining the validity of research findings.

Improving the usefulness of expert based information

Whenever review articles, textbooks, continuing education presentations, or consultants are used to transfer information, both the expert and the clinician are involved in the process. The appropriate use of expert based sources requires both groups to take responsibility for their respective roles.

Clinicians using review articles to keep up to date do not have to read the entire article. It is acceptable to read the article headlines, abstract, or conclusion first. If a statement is not firmly rooted in patient oriented evidence it can be disregarded. If a POEM is found (information on patient oriented outcomes that has the potential to change clinical practice), the clinician must then determine its validity. Other sources can be used to confirm validity before any changes in practice are made. Original research and review articles can be evaluated with the pre-existing criteria set forth by the Evidence Based Medicine Working Group.19 A useful source of continually updated, evidence based reviews is the Cochrane Collaboration.20

Clinicians attending continuing education presentations or receiving advice from colleagues should stay alert to glean new information, supported by patient oriented evidence, that would require them to change their practice. Speakers and consultants should clearly identify whether their recommendations are based on patient oriented research outcomes or on other evidence. They should provide some evidence that they have assessed the validity of the information. If not, the listener must take an assertive approach by asking, “What is the evidence to support that recommendation?”

Whether experts are writing reviews or textbook chapters, giving continuing education presentations, or providing consultations, their role remains the same. They should always emphasise POEMs first when forming a recommendation, even when this information conflicts with disease oriented evidence or their own clinical experience. A substantial problem, however, is that POEMs are often unavailable to guide patient care decisions. As a result, experts are often unable to support a particular recommendation with POEMs but have to base their opinion on a knowledge of pathophysiology, studies of surrogate outcomes, experience, or “gut feeling.” Thus, when POEMs are not available, experts should use the best available evidence and clearly state that these sources are the basis for their recommendations. By doing so, they can help clinicians to remain more open to changing their practice when patient oriented evidence becomes available, avoiding reverse gullibility.

This mesh of patient oriented evidence and expert opinion provides the most value to the audience and lowers the work required to obtain useful information. Too often lectures are like professional basketball games, in which only the last two minutes are important.


As consumers and providers of new information on the same team, clinicians and experts evaluating information from any source must always keep three questions in mind:

  • Will this information have a direct bearing on the health of my patients?

  • Is the issue at hand common to my practice?

  • If valid, will this information require me to change my current practice?

A negative answer to any of these questions allows you to deflect the information. When the answer to all three of these questions is yes, you have found a common POEM capable of improving the lives of your patients. If the information is valid it will require you to change your practice. If you cannot determine validity from the source, you must look elsewhere. If it is not a valid POEM, it is just not necessarily so.

Experts are a valuable source of information for clinicians. Their usefulness, however, depends on the relevance and validity of the information they present. Close attention to these two issues by both experts and clinicians can lead to better information management, and with it, better patient care.

Using expert based information

  • Determine the level of evidence supporting recommendations: is the recommendation based on patient oriented or disease oriented evidence?

  • Do not make your own clinical rules out of patient specific recommendations made by consultants

  • Remember the lag time between fact finding and publication

  • Realise that authors of reviews and textbooks and speakers at educational conferences often begin with their predetermined conclusions and then find evidence to support only this point of view

  • Wisdom gained through experience helps clinicians diagnose disease and perform procedures. Experience is not adequate to remain proficient in treatments: proficiency also requires a knowledge of the medical literature and the ability to think critically with an open mind


Funding: None.

Conflict of interest: None.

