

When I use a word . . . Too much healthcare—technology

BMJ 2022; 378 doi: (Published 26 August 2022) Cite this as: BMJ 2022;378:o2102
Jeffrey K Aronson
Centre for Evidence Based Medicine, Nuffield Department of Primary Care Health Sciences, University of Oxford, Oxford, UK
Twitter @JKAronson

Much modern healthcare depends on the excellent technology that we have at our disposal. However, technology can result in too much healthcare, not by having too much technology, but because the technology we have is capable of being misused or the results misinterpreted. New forms of technology are often associated with fear: fear of using them lest damage occurs, either to the technology itself or through use of the technology, and fear of “technological unemployment.” Artificial intelligence can go wrong through unintended consequences of the programming. Technology can be deliberately misused, as when predatory open access journals offer online publication of poor research for payment, without offering the type of backup that one would normally expect from a publisher, or when machine translators are used to substitute terms in fake papers with so-called “tortured phrases.” Producing fake images through manipulation is another example. Misinterpretation of technological outputs can also lead to impaired healthcare. Technological advances have revolutionised our management of many healthcare problems, but misuse of technology and misinterpretation of results can lead to overdetection, overdiagnosis, and misdiagnosis, potentially leading to overtreatment.


There would be less overdetection in healthcare, one of the major factors that contributes to overdiagnosis,12 if we had less diagnostic technology. On the other hand, much modern healthcare depends on the excellent technology that we have at our disposal. So the link between technology and too much healthcare is not forged by having too much technology, but by misuse of the technology that we have or misinterpretation of the outcomes of using the technology. Scientific advances are also potentially vulnerable to the same misuse and misinterpretation.

When we talk about technology, we may typically think about systems that measure things mechanically and transform them into pictures, like CT or MRI scanners, or into numbers, like pieces of analytical apparatus, such as liquid scintillation counters; or we may think of robots that do mechanical things, such as assisting in laparoscopic surgery; or gadgets, often programmable, such as halogen ovens, air fryers, dishwashers, and washing machines, that we use around the house or hospital to do quotidian tasks. Or we may think of information technology and devices that collect, process, store, and distribute information, such as personal computers, mobile phones, or satnav systems, which themselves depend on another form of technology, constellations of navigation satellites. And of course, different forms of technology can be linked, as when digital images are obtained by a scanner and processed by an intelligent computerised algorithm.

These are just a few examples of the kinds of technology with which we have become familiar in everyday practice. But the word “technology” has a much more varied history than this limited view of it suggests.

The roots of “technology”

The Indo-European root TEK[H]S meant to weave or fabricate, and we have woven a surprising number of words from it.

The Greek derivative τέκτων meant a carpenter or builder, giving us words such as architect and tectonic, which originally meant related to building, but then became more specialised, referring to the structure of the earth’s crust and the plates that form it. Other words ending in –tect, such as detect and protect, come from a different Latin word, tegere, to cover.

In Latin the verbal derivative texere, which may have come via the Sanskrit takṣ (तक्ष्), to fashion, form, or cut, also meant to weave or fabricate, particularly embroidery or tapestry. Textum was a piece of woven cloth, from which we derive text. A praetextus was a particular kind of cloth, a toga with a purple border, as worn by magistrates; praetextus therefore also came to mean showy or pretentious, and hence the English word pretext. Contexere meant to weave together or join, and the adjective contextus meant closely joined or interwoven, uninterrupted or unbroken; hence our word context, which originally meant the weaving together of written material in the process of composition. Other derivatives have arisen through direct prefixation in English; they include audiotext, co-text, intertext, pre-text (as opposed to pretext), protext, subtext, and videotext, the Greek–Latin hybrids epitext, hypertext, meta-text, microtext, paratext, peritext, and teletext, and the unusual German–Latin hybrid Urtext. Prefixation with a familiar abbreviation also gives us e-text; or perhaps that should be called e-fixation.

Textura in Latin meant the art or process of weaving or a method of doing it. It gives us not only texture but also the Italian loanword tessitura.

Via French, texere has also given us tissue, originally a rich kind of cloth, often interwoven with gold or silver.

In German TEK[H]S gives Dachs, a badger, famous as a builder of its sett; that gives us the badger-hound, the dachshund.

But “technology” comes from Greek. The Classical Greek derivative of TEK[H]S, τέχνη, meant an art or craft and hence a method of making or doing, a system, or a set of rules. In Hippocrates’ first aphorism it famously refers to the art of medicine. By extension, τέχνη also meant a treatise, typically on grammar or rhetoric. The term τεχνολογία, meaning the systematic treatment of a subject such as grammar, was introduced later, in Hellenistic Greek. The post-classical Latin form, technologia, meant something similar, a treatise on grammar or the liberal arts. And that is what the word originally meant in English, when it appeared in the early 17th century. However, in French the word “technologie” referred to the technical language or terminology of a subject, which is what the word came to mean in English in the mid-17th century. Sir Thomas Browne is credited with the first use of the word in this sense, when in his Hydriotaphia, Urn Burial (1658) he referred to the Hebrew term “binah,” part of the cabbalistic Tree of Life, as “the mother of Life and Fountain of souls in Cabalisticall Technology.”

It wasn’t until late in the 18th century that “technology” came to mean knowledge dealing with the mechanical arts and applied sciences and the study of it, and then, another half a century later, the use of such knowledge for practical purposes. And it took until the end of the 19th century before the modern meaning of the word emerged: equipment of some sort, used in the widest possible sense, developed from practical application of scientific and technical knowledge.

Most of this information is based on entries in the Oxford English Dictionary (OED), but there is a more recent meaning that the dictionary has not yet caught up with. The word is used, typically by the UK’s National Institute for Health and Care Excellence (NICE), to refer to any intervention whose cost-effectiveness it has been asked to assess. And it is used to refer not only to devices and operative interventions but also to medicinal products, about which NICE issues technology appraisals (TAs).

Thus, the term “technology” turns from a non-count noun to a count noun,4 referring to individual examples. This allows us to talk about “technologies,” such as technetium-99m, which was the subject of a NICE technology appraisal, TA73, describing the use of radioactive tracers in myocardial perfusion scintigraphy.3

“Technetium,” of course, comes from the same root as “technology.” The word was introduced by the chemist Carlo Perrier and the physicist Emilio Segrè in 1947, after they had synthesised radioactive isotopes of the element in 1937, a huge technical achievement, requiring metaphorical pyrotechnics, actually neutron or deuteron bombardment of molybdenum.5 “It seems appropriate now,” they wrote, “to give a name to this element ... and we would like to propose the name of ‘technetium’, from the Greek τεχνητος, artificial, in recognition of the fact that technetium is the first artificially made element.”

Fear of technology

Whenever a new form of technology emerges it tends to engender fear. It may be fear of damaging the technology itself or causing harm through using it. There are many examples.

When use of mobile phones became widespread, hospitals issued diktats that they were not to be used on the premises because they might be in some way hazardous, perhaps interfering with the operation of important pieces of equipment, such as cardiac monitors or pacemakers. It appears that those fears were unfounded.67 Concerns about brain tumours have also been allayed.8 However, concerns still remain about transmission of infections.9

There are also justified fears about potential harmful effects of artificial intelligence. Consider, for example, the deep learning algorithm that used neural networks to diagnose pneumonia in chest x-rays and worked out which x-ray machine had been used and in which hospital, as a way of predicting whether the generated image might contain signs of pneumonia, because some machines were used for sicker patients.10 The machine thereby sought to make the diagnosis by a probabilistic shortcut based on the source of the information and not the information itself. As a result, diagnostic capability was reduced when the computer accessed external data.
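The shortcut the algorithm took can be mimicked with a deliberately crude toy model: a “classifier” that learns nothing but the majority label for each x-ray machine. The machine names, labels, and proportions below are invented for illustration; the point is that a source-based shortcut can look accurate on data resembling its training set and collapse on data from elsewhere.

```python
# Toy illustration of "shortcut learning": a classifier keyed on which
# machine produced an image rather than on the image content itself.
from collections import Counter

# Training data: (machine_id, true_label). The "portable" machine is mostly
# used for sicker patients, so its images are usually positive for pneumonia.
train = [("portable", 1)] * 90 + [("portable", 0)] * 10 + \
        [("fixed", 0)] * 90 + [("fixed", 1)] * 10

def fit_shortcut(data):
    """Learn the majority label per machine -- the probabilistic shortcut."""
    counts = {}
    for machine, label in data:
        counts.setdefault(machine, Counter())[label] += 1
    return {m: c.most_common(1)[0][0] for m, c in counts.items()}

def accuracy(model, data, default=0):
    hits = sum(model.get(m, default) == y for m, y in data)
    return hits / len(data)

model = fit_shortcut(train)

# In-distribution, the shortcut looks impressive.
print(accuracy(model, train))      # 0.9

# At an external hospital with an unseen machine, and a case mix that no
# longer tracks the machine, the shortcut collapses to guessing.
external = [("newscanner", 1)] * 50 + [("newscanner", 0)] * 50
print(accuracy(model, external))   # 0.5
```

The in-distribution accuracy is genuinely high, which is exactly why such confounding is easy to miss until the model is tested on external data.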

Or the phenomenon of wireheading, which may cause a computer to find ways of seeking a reward that the original algorithm did not anticipate.11 For example, a machine that learns to prioritise avoiding errors may prefer to shut down rather than to pursue its previously programmed tasks, risking error; one that is programmed to maximise its use of a reagent may start pouring it away; or one that is trained to fix a problem may create more problems, with the intention of gaining rewards by fixing them.
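The shutdown example reduces to a few lines of expected-value arithmetic: if errors are penalised heavily enough relative to the reward for working, doing nothing becomes the reward-maximising action. The payoffs and probabilities below are invented purely for illustration.

```python
# Toy sketch of reward mis-specification: an agent rewarded for avoiding
# errors finds that shutting down is the reward-maximising policy.

def expected_reward(action, task_reward=1.0, error_prob=0.2, error_penalty=-10.0):
    if action == "work":
        # Working earns the task reward but risks an error penalty.
        return task_reward + error_prob * error_penalty
    if action == "shutdown":
        # Doing nothing earns nothing -- but makes no errors either.
        return 0.0
    raise ValueError(action)

best = max(["work", "shutdown"], key=expected_reward)
print(best)  # "shutdown": the error penalty outweighs the task reward
```

With these numbers, working has an expected reward of 1.0 + 0.2 × (−10.0) = −1.0, so the "safest" policy is to stop working altogether, which is precisely the unintended behaviour described above.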

Another fear that arises when new technologies are introduced into the workplace is that unemployment may result. Charlie Chaplin’s satire Modern Times (1936) illustrated the problems of the Great Depression, which Chaplin attributed to increased efficiencies brought about by industrialisation. The film was well received, and remains popular, but the problem was more complex than he envisaged. Enthusiasm for technological developments in the 1920s gave way to unease in the 1930s as the Depression hit, and “technological unemployment” was blamed. However, as Magnus Pyke, a major science populariser of the 1950s and later, pointed out in a book about automation12 (a term that was then still quite new, contemporaneous with the introduction of the word “technophobe”), technological advances could bring increased leisure, which could be used profitably, and new ways of working would bring new challenges that would result in increased employment.

Since then the complexity of the problem has been further elucidated. For example, in a Danish prospective cohort study, the introduction of new technology into the workplace reduced the risk that older workers working in offices, administrative posts, IT, or production would lose paid employment, but increased it among those working with people.13 Those who were directly involved in the introduction of new technology and who received adequate training in using it were also at lower risk.

Misuse of technology

Technological advances give opportunities for misuse. For example, the rise of predatory journals in the past 10 years has been facilitated by online publication. Beall’s list of predatory open access journals increased from 18 publishers in 2011 to over 900 in early 2017, when his website was withdrawn. Fortunately, they make little impact on the scientific literature and are infrequently cited.14

Paper mills are another manifestation. These organisations sell fabricated research papers to authors who are seeking publications to further their careers.15 Such papers can sometimes be recognised by their use of what have become known as “tortured phrases,”16 standard phrases that have been distorted by machine translation, in order to disguise the origins of plagiarised text. In this way “standard error of the mean” might become “standard mistake of the average” and “artificial intelligence” might become “counterfeit consciousness.” Alternatively, consider how a machine might translate “artificial intelligence” if “artificial” can mean “fake” and “intelligence” can mean “news.” Technology impugning technology.
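Screening for such phrases can be sketched as a simple dictionary lookup against known distortions. The two entries below are the examples given in the text; this is a toy sketch for illustration, not how real screening tools are implemented.

```python
# Minimal sketch of screening text for known "tortured phrases":
# distorted versions of standard scientific terms.

TORTURED = {
    "standard mistake of the average": "standard error of the mean",
    "counterfeit consciousness": "artificial intelligence",
}

def flag_tortured(text):
    """Return (tortured phrase, standard phrase) pairs found in the text."""
    text = text.lower()
    return [(bad, good) for bad, good in TORTURED.items() if bad in text]

print(flag_tortured("We computed the standard mistake of the average."))
print(flag_tortured("Nothing suspicious here."))  # []
```

A real screener would need a much larger phrase list and fuzzier matching, but the principle, matching text against a curated dictionary of distortions, is the same.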

The ability to manipulate images constitutes another opportunity to misuse technology. In an analysis of 960 papers published in Molecular and Cellular Biology in 2009–16, 59 (6.1%) were found to contain inappropriately duplicated images, leading to 41 corrections and five retractions. In 13 cases no action was taken. The authors estimated that if these results were representative, as many as 35 000 published papers would be candidates for retraction because of inappropriate image duplication.

Misinterpretation of technology

The results of laboratory tests can sometimes be misinterpreted. Take the use of the MDRD method for estimating the glomerular filtration rate from the serum creatinine concentration; the results are expressed in units of mL/min/1.73 m², normalised to a standard body surface area. However, when the patient is, say, a little old man or woman with half that surface area, a value of 60 mL/min/1.73 m², thoughtlessly accepted, would be interpreted as satisfactory, although the absolute clearance would be only about 30 mL/min, an important degree of renal impairment. I have seen this happen more than once, with, in at least one case, a fatal outcome.
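The arithmetic behind that misinterpretation can be sketched in a few lines: an eGFR reported per 1.73 m² must be rescaled by the patient's actual body surface area (estimated here with the Du Bois formula) to obtain the absolute clearance. The patient's height and weight below are invented for illustration.

```python
# Rescale a body-surface-area-normalised eGFR (mL/min/1.73 m^2) to the
# patient's absolute clearance (mL/min). Patient details are illustrative.

def dubois_bsa(weight_kg: float, height_cm: float) -> float:
    """Du Bois estimate of body surface area in m^2."""
    return 0.007184 * weight_kg ** 0.425 * height_cm ** 0.725

def absolute_clearance(egfr_normalised: float, bsa_m2: float) -> float:
    """Rescale an eGFR reported per 1.73 m^2 to the patient's own BSA."""
    return egfr_normalised * bsa_m2 / 1.73

# A small, elderly patient: the normalised figure looks "satisfactory"...
bsa = dubois_bsa(weight_kg=45, height_cm=150)
print(round(bsa, 2))                          # ~1.37 m^2
# ...but the absolute clearance is well below the reported 60 mL/min.
print(round(absolute_clearance(60, bsa), 1))
```

For a patient with half the standard surface area (0.865 m²), the same reported 60 mL/min/1.73 m² corresponds to an absolute clearance of only 30 mL/min, which is the trap described above.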

Concluding thoughts

The examples given above are a tiny selection of those available, which could fill a book.

Technological advances have revolutionised our management of many healthcare problems. Misuse of technology and misinterpretation of results can lead to overdetection, overdiagnosis, and misdiagnosis, potentially leading to overtreatment.


  • Competing interests: None declared.

  • Provenance and peer review: not commissioned; not externally peer reviewed.