
Editorials

A revolution in drug discovery

BMJ 2000; 321 doi: https://doi.org/10.1136/bmj.321.7261.581 (Published 09 September 2000) Cite this as: BMJ 2000;321:581

Combinatorial chemistry still needs logic to drive science forward

  1. Nigel Beeley (nbeeley@arenapharm.com), vice president and chief chemical officer,
  2. Abi Berger (aberger@bmj.com), science editor
  1. Arena Pharmaceuticals, 6166 Nancy Ridge Drive, San Diego, CA 92121, USA
  2. BMJ

    Several new drugs now in phase II clinical trials have got there at speed. Gone are the days of laboriously slow organic chemistry. New techniques and a rapidly changing culture surrounding drug development have revolutionised the time it takes to get drugs from discovery to market. Bristol-Myers Squibb, for example, recently reported that BMS-201038, an inhibitor of microsomal triglyceride transfer protein that lowers plasma cholesterol concentrations, has reached phase II clinical trials after a relatively short development time.

    A key component of any drug discovery programme is synthetic organic chemistry, which can be likened to building with Lego blocks. Chemical building blocks are assembled according to precise rules, creating molecules of increasing complexity. Some of these blocks hold others in position, while others interact with a biochemical target (such as the active site of an enzyme or a receptor). Some blocks modify the metabolism of the molecule, and others modify the overall structure to improve drug delivery. “Combinatorial chemistry” is the process by which millions of such molecular constructions can be created and tested simultaneously, and it underlies the speed of new drug development.
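    The arithmetic behind this scale is simple multiplication: the size of a combinatorial library is the product of the number of building blocks available at each position. The sketch below is purely illustrative, using hypothetical building-block names that do not appear in the editorial, to show how a handful of blocks yields a much larger set of candidate molecules.

    # Purely illustrative sketch (not from the editorial): assembling a small
    # "library" from hypothetical block sets, in the Lego sense described above.
    # Real combinatorial chemistry assembles molecules under reaction rules;
    # here each combination of blocks simply becomes one named candidate.
    from itertools import product

    scaffolds = ["scaffold_A", "scaffold_B"]                 # hold other blocks in position
    binders = ["amine_1", "amine_2", "amine_3"]              # intended to engage the target
    modifiers = ["ester", "ether", "amide", "sulfonamide"]   # tune metabolism or delivery

    library = ["-".join(blocks) for blocks in product(scaffolds, binders, modifiers)]
    print(len(library))  # 2 x 3 x 4 = 24 candidates from only 9 building blocks

    With tens of building blocks at each position, the same multiplication quickly reaches the millions of constructions mentioned above.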

    BMS-201038 is Bristol-Myers Squibb's first molecule to reach the clinic that originated in a focused combinatorial library of chemicals synthesised by robotics.1 Other examples have come from Merck, which has described several combinatorial cases, such as a series of molecules that interact with somatostatin receptors.2

    Medicinal chemists traditionally synthesised a handful of different molecules, submitted them for testing in appropriate biological assays, and then waited for the results. As the results came through, the chemists would modify the design of the molecules, and a new generation would be created and retested. Although modern refinements, such as computer-aided drug design, became an integral part of the process, this archaic regimen consistently represented one of the rate-limiting steps in getting drugs from discovery to market.

    Recent advances, however, have significantly compressed the discovery component of the pharmaceutical timescale for research and development. The study of gene sequences (genomics) has improved the identification of useful points of therapeutic intervention; combinatorial chemistry has generated massive numbers of molecules for testing; and screening using high throughput techniques has automated the process of doing large numbers of biological assays. All have reduced the discovery-to-market time from the conventional 10–14 years to the 5–8 years claimed today.

    Chemistry's contribution to this began in the 1960s, when Nobel prizewinner Bruce Merrifield devised a methodology that allowed peptide synthesis to be performed on a solid support using, for example, beads of polystyrene.3 This method resulted in an almost instant purification of molecules, using filtration to separate the chemical reagents from the products. Peptide synthesis thus became an automated process. A single operator can now synthesise an important biologically active peptide within 24 hours, compared with the years of chemists' time required using more traditional approaches. At the same time a move back to empiricism is appearing on the horizon, with chemists insidiously being turned from scientists into technicians.

    This pioneering work enabled Mario Geysen in the early 1980s to devise a way of synthesising many peptides in parallel.4 He developed an 8×12 matrix of polystyrene pins, a format that corresponds to the 96-well microtitre plate used today. Geysen's matrix was used to identify useful epitopes, the short peptide sections of antigens that interact with corresponding antibodies.

    In 1985 Richard Houghten synthesised millions of peptides as mixtures.5 He screened these mixtures for biological activity, and, having identified those that showed positive activity, he was then able to figure out which components of the mixtures were responsible for that activity. This was one of the first examples of the new discipline called combinatorial chemistry.6
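    In essence, identifying the active component of a mixture is a recursive search: screen pools of compounds, keep the pool that shows activity, split it into smaller pools, and repeat until a single compound remains. The sketch below is a schematic illustration only, with a hypothetical assay function standing in for the biological screen; it is not Houghten's actual protocol.

    # Schematic illustration of iterative deconvolution (hypothetical assay,
    # not Houghten's actual protocol): repeatedly keep and split the pool
    # that shows activity until one compound is left.
    def deconvolute(compounds, pool_is_active):
        """Return one active compound, assuming pool_is_active(pool) is True
        whenever the pool contains at least one active compound."""
        pool = list(compounds)
        while len(pool) > 1:
            half = len(pool) // 2
            first, second = pool[:half], pool[half:]
            pool = first if pool_is_active(first) else second
        return pool[0]

    # Hypothetical example: peptide "P17" is the active component of a 32-peptide mixture.
    mixture = [f"P{i}" for i in range(1, 33)]
    print(deconvolute(mixture, lambda pool: "P17" in pool))  # -> P17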

    Not surprisingly, in today's entrepreneurial world many new companies are capitalising on combinatorial chemistry. The combination of solid-support organic synthesis with automation and robotics allows a single chemist to synthesise thousands of different molecules in a week, which can then be evaluated for biological activity in a high throughput manner. Such collections of molecules are referred to as libraries. Most of the pharmaceutical and biotechnology industries are now carrying out some form of combinatorial chemistry in their laboratories. Those who recognised the merits of this science early on now have molecules in the clinic that came from combinatorial chemistry and high throughput screening. The promise of a considerably shortened discovery-to-market time has become a reality.

    As chemistry becomes capable of synthesising the universe of possible organic structural permutations, and high throughput screening approaches the capacity to screen everything against every biochemical target ever identified, we wonder whether science is taking a backward step towards empiricism. We will have a huge barrage of information without having thought about how all the pieces fit together. Today's approach is in danger of becoming haphazard, with little logic driving it forward.

    The costs associated with this “big, dumb science” approach are not trivial. But as new information is analysed and a more focused rationale emerges, the costs should diminish. People who begin to understand the processes involved, rather than simply generating large quantities of data, are going to rediscover the pleasure of becoming scientists again.

    References

    1.
    2.
    3.
    4.
    5.
    6.