Increasing the impact of health services research
BMJ 2003; 327 doi: https://doi.org/10.1136/bmj.327.7427.1339 (Published 04 December 2003) Cite this as: BMJ 2003;327:1339
All rapid responses
The paper by Dash et al reporting findings from a Health Foundation
and Nuffield Trust study about the lack of impact of health services
research (1) has sparked off a much needed and fruitful debate.
In Scotland, a primary care based R&D programme is now in its third
year (2). This initiative has taught us many, at times hard-earned,
lessons about the relationship between research, practice and policy in
the British NHS. We would fully endorse the paper’s recommendations in
relation to action needed to make research more effective as a lever for
change. We would also add some considerations based on our own experience.
The ‘Research-based Development of Scottish Primary Care’ is jointly
and directly funded by the Scottish Primary Care Trusts (PCTs) and
facilitated by the Scottish School of Primary Care. It is carried out by a
team of four research fellows, with research assistant and administrative
support. The research fellows work in four different Scottish academic
departments with strong primary care research connections. This structure
allows the initiative to bring the academic and service communities closer
together, and contributes to improved understanding between them and
ensures that rigour is not compromised in the development of practice
oriented research.
The initiative revolves around four different projects, identified as
important for development by the PCTs themselves. The projects concern the
integration of health and social care for older people and for people with
mental health problems, diabetic retinopathy screening and prevention of
unscheduled acute admissions for the oldest old (see (2) for detailed
description). Each PCT and relevant partner organisation (e.g. Social Care
or Acute Trust) has signed up to one project. The research fellows in
charge of each project work closely with nominated contacts within each
participating PCT to develop a project focus which is directly relevant to
practice and development of services within those organisations.
Mutual learning and a strategic focus for R&D are two
cornerstones for the initiative. Because several PCTs participate in each
project, there is room for mutual comparison and learning. This makes it
possible to develop strategic R&D which links national priorities and
policy to local variation in their implementation. We also work
systematically to increase capacity among our collaborators for rigorous
and applied research with immediate practical relevance.
This initiative thus facilitates ‘linkage and exchange’ (3) and
directly addresses many of the points raised in the paper:
• it avoids the disconnection between funders and beneficiaries of
the research;
• it facilitates the development of research approaches which are
directly relevant to practice, without compromising on rigour;
• the collaboration between researchers and end users of the research
at all stages of a project makes it easier to communicate research
findings to all relevant parties.
This approach is producing practice-oriented R&D. However, the
initiative would be more successful, and provide far better returns on the
PCTs' investment, if there were better continuity and stability in
relationships between researchers and service collaborators. In our
experience, both NHS service and the academic community (including the
relevant Scottish Executive Departments) need to give long-term
consideration to how their interface can be improved.
For the NHS service community this means (see also (4)):
• seriously looking at the effects of constant service reorganisation
and change;
• formulating short term political imperatives in ways which do not
take away from the long-term development work with less immediate results;
• considering the effects of several short term ‘pilot’ projects,
each funded from different sources and often working to the same target
populations, but poorly joined up;
• improving the career prospects and working conditions for middle
managers so that the good managers stay and see projects through, with
appropriate research input.
For the (medical) academic community it means:
• recognising that applied health services research founded on
rigorous case-comparisons of direct relevance to NHS organisation is
important and worth investment;
• valuing and providing career structures for researchers who work
across traditional service/academic boundaries;
• valuing development of combined methodologies which are flexible
and practice oriented.
Without an organisational infrastructure for R&D at the interface
between the NHS and academic communities we fear that the recommendations
of the Health Foundation/Nuffield Trust report may have less than maximum
effect.
We look forward to the forthcoming consultation exercise and hope
that some of these long-term issues will be addressed.
1. Dash P, Gowman N, Traynor M. Increasing the impact of health
services research. BMJ 2003;327:1339-41.
3. Lomas J. Health services research. BMJ 2003;327:1301-2.
4. Pettigrew A, Ferlie E, McKee L. Shaping strategic change. London:
Sage, 1994.
Competing interests:
None declared
Editor
Dash et al1 rightly debate the need to increase the impact of health
services research (HSR) in the NHS. While we agree with many of their
conclusions, we are very struck by the lack of recognition afforded to
undergraduate and postgraduate education of health care professionals. We
suggest that a considerable strengthening of the HSR contribution could be
made if health care professionals were taught HSR at undergraduate level
and if they were taught together through inter-professional education
initiatives. Here we make particular reference to the part to be played
by nursing in strengthening the impact of HSR in the NHS.
Historically, nurses in the United Kingdom have been taught research
methods primarily from a theoretical perspective, usually ‘borrowed’
from the social sciences. Furthermore, they are often taught by non-
research staff, which means the focus remains theoretical when it should be
applied and practical. Essentially, nurses should be taught HSR, ideally
in the company of other health care professionals and by active nurse
researchers with links to disciplines like Epidemiology as well as
Sociology and Psychology. A greater focus should be paid to developing
pertinent research questions likely to influence the care given to
patients and to patient outcomes rather than the tendency to focus on
research methodology per se often in isolation from clinical application.
As nurses represent the largest section of the workforce in the NHS, the
introduction of an HSR approach to research teaching in nursing and
midwifery education would provide an important bridge between research and
practice. We fully agree with a recent report by Professor Gerry McKenna
who concluded that patient care would be greatly improved if nurses and
other healthcare professionals were given more training in clinical
research2. In particular, conducting empirical research as part of a
taught Master’s programme should be an essential part of postgraduate
education for nursing. However the implementation of the research
governance framework for health and social care may create significant
barriers for students undertaking dissertations involving patients.
The potential of the nursing contribution to HSR has repeatedly been
reiterated3,4,5 but such recognition has not been matched by equal access
to funding or adequate management strategies to break the cycle of
disadvantage nurses have experienced in contributing to the R&D
agenda6. Rafferty argues that nurses account for 70% of the NHS wage bill
and 40% of the NHS budget; therefore 40% of the NHS Research and
Development budget should be invested in research that impacts upon their
work7. Furthermore funding agencies need to be more creative in their
approach to developing the research capacity of the workforce. They must
recognise that most nurses in a position to undertake research training
opportunities (studentships, fellowships or career scientists awards) are
working in tenured positions in Universities and not in the health
service. Placing restrictions on eligibility criteria (e.g. being
employed in the NHS or having a fixed term contract) only further limits
the extent to which nurses can make a contribution. In this sense nursing
is a special case.
If a more client centred approach to commissioning research is
adopted as recommended by Dash et al1, the nature of the nurse-client
relationship makes nurses key players in such developments6. By
strengthening the teaching of HSR in nursing, and with more inclusive
approaches to financial investment in nursing research, commensurate with
the workforce’s contribution to public health and patient well-being8, the
impact of HSR in the NHS would be significantly enhanced.
1 Dash, P., Gowman, N., Traynor, M. (2003) Increasing the impact of
health services research. British Medical Journal, 327; 1339-1341.
2 Gould, N. Training is the key to better care. [Findings of a
report by Professor Gerry McKenna, University of Ulster] Belfast
Telegraph, 23rd September 2003.
3 Department of Health Research and Development Task Force (1994)
Supporting Research and Development in the National Health Service (Culyer
Report) London: HMSO.
4 Department of Health (1999). Making a Difference: Strengthening the
nursing, midwifery and health visiting contribution to health and
healthcare, London, DoH.
5 Department of Health (2000). Towards a strategy for nursing
research and development. Proposals for action, London, DoH.
6 McKenna H & Mason C (1998) Nursing and the wider R&D
agenda: influence and contribution. Nursing Times Research, 3(2): 101-15.
7 Rafferty A M (2000) Influencing the research and development
agenda, paper presented to a Department of Health Research and Development
Workshop, York, March 2000 cited in Department of Health (2000). Towards a
strategy for nursing research and development. Proposals for action,
London, DoH.
8 Aiken, L.H., Clarke, S. P., Cheung, R. B., Sloane, D. M. and
Silber, J. H. (2003) Educational levels of hospital nurses and surgical
patient mortality. Journal of the American Medical Association, Vol. 290,
No. 12: 1617-1623.
Competing interests:
None declared
The paper by Dash et al raises relevant issues regarding the frequent
schisms between health service researchers, decision makers, research
funders and the academic establishment.
Working in the field of interventions to improve heart health, I
recognize a clear preference from funders and journal editors for studies
and findings that can be generalized across populations, ideally based on
large randomized controlled trials. While esteemed methodologically and
therefore more likely to be published in prominent journals (1), the
findings of such studies are often inconsistent (2-4) and do little to
improve service quality (5) or to address the chronic problem of low
service uptake (6).
Health service researchers by nature tend to focus on contextual,
social, cultural and organizational factors that influence service
implementation and effectiveness. Yet, recognition that these factors can
influence the effectiveness of heart health interventions is often absent.
This ignores a wealth of evidence to the contrary (7-9).
Though rightly valued, in isolation a randomized controlled trial does
not explain why an intervention is successful in one setting but fails to
replicate these benefits in others (8). Nor does this method identify why
some groups are better served by an intervention than others (8). If
the effectiveness of health services is to be improved for all patient
groups, a greater emphasis is needed on examining these kinds of ‘why’
questions. Research funders and the editors of our esteemed journals need
to recognize this and respond accordingly.
References
1. Dolby RGA. Uncertain knowledge. Cambridge: Cambridge University
Press, 1996.
2. Ebrahim S, Davey Smith G. Systematic review of randomised
controlled trials of multiple risk factor interventions for preventing
coronary heart disease. British Medical Journal 1997;314:1666.
3. Jones DA, West RR. Psychological rehabilitation after myocardial
infarction : Multicentre control trial. British Medical Journal
1996;313:1517-1521.
4. McAlister F, Lawson F, Teo K, Armstrong P. Randomised trials of
secondary prevention programmes in coronary heart disease: systematic
review. British Medical Journal 2001;323:957-962.
5. Lewin RJP, Ingleton R, Newens AJ, Thompson DR. Adherence to
cardiac rehabilitation guidelines : A survey of rehabilitation programmes
in the United Kingdom. British Medical Journal 1998;316:1354-1355.
6. Cooper AF, Jackson G, Weinman J, Horne R. Factors associated with
cardiac rehabilitation attendance: a systematic review of the literature.
Clinical Rehabilitation 2002;16:541-552.
7. Wilkinson R, Marmot M, editors. Social determinants of health: the
solid facts. Geneva: World Health Organization, 1998.
8. Pawson R, Tilley N. Realistic evaluation. London: Sage, 1997.
9. Stewart M, Reutter L, Williamson D, Raine K, Wilson D, Fast J, et
al. Low-income consumers' perspectives on determinants of health services
use. Ontario: Canadian Health Services Research Foundation, 2001.
Competing interests:
None declared
From the Office of the
Director of Research and Development
Professor Sir John Pattison
Richmond House
79 Whitehall
London SW1A 2NS
Tel: 020 7210 5556
Fax:020 7210 5868
john.pattison@doh.gsi.gov.uk
12 December 2003
Dear Sir
The article by Penny Dash and colleagues, and the accompanying
editorial by Jonathan Lomas (6th December 2003) raise some interesting
points about the application of health services research in the UK
context. However, elements of their analysis do not seem to square with
the reality.
First, Dash et al claim that funders, researchers and users believe
that responsibility for disseminating the findings of research and
supporting its application does not rest with them.
The English Department of Health (DH) currently spends around £83m pa
directly on research to inform policy and practice, through its
commissioned research programmes. Structures are in place to ensure that
the knowledge generated by this investment is, indeed, applied to both
policy-making and practice.
For example, the DH Policy Research Programme commissions research
directly informed by policy-makers’ needs, which is then used by them to
inform, implement and improve national policy-making. This often has a
direct impact on practice at a service level.
The NHS Health Technology Assessment R&D programme has made a
major contribution to the international pool of evidence about ‘what
works’ in clinical care. Its programme is influenced by questions
clinicians and policy makers raise, and its outputs feed directly into
guidelines developed by the National Institute for Clinical Excellence. By
coupling this evidence base with clinical governance structures in every
NHS provider and new contracts in primary care that have quality
improvement at their heart, the NHS has, to use Paul Shekelle’s words in
the BMJ earlier this year, “vault[ed] over anything being attempted in the
United States, the previous leader in quality improvement initiatives.”
(1)
And the newer NHS R&D programme on Service Delivery and
Organisation is directly driven by questions raised from NHS managers,
professionals and service users. These groups play a direct role in
commissioning research and the programme has a vigorous approach to
dissemination. This active programme of dissemination includes a
developing relationship with the Modernisation Agency to ensure that its
change management activity is underpinned by high quality research
evidence.
Lomas refers to the NAO report that argued the importance of ‘linkage
and exchange’ and implies that the ‘limited adoption’ of such linkage goes
some way to explaining ‘the disappointing results from over a decade of
investments in the NHS R&D strategy’. In fact, the NAO report did not
review DH-funded research but, if it had, it would have shown that DH
already meets all of the key recommendations of the report.
Secondly, Dash et al claim that health services decision makers are
anxious that, at times, the ‘how’ of research overshadows the ‘what’. But
the issue is not, surely, that the ‘what’ is more important than the
‘how’. Poor quality research remains poor quality research, no matter how
important the topic. The trick is to get both right – something to which
the DH R&D programmes, as described above, are absolutely committed.
We welcome any additional resources dedicated to the perennially
difficult issue of implementing research knowledge in systems that are
continually developing, and where influences other than pure rationality
(evidence) have a perfectly legitimate role. We particularly welcome any
input that helps to increase health service deliverers’ appreciation of
how to use health services research. But it is important that this input
starts from an informed position in relation to existing structures and
activity in the system it is aiming to influence.
Yours sincerely
PROFESSOR SIR JOHN PATTISON
Director of Research and Development
PROFESSOR GILLIAN PARKER
Head of Policy Research Programme
(1) Shekelle P. New contract for general practitioners. BMJ 2003;326:
457-8.
Competing interests:
None declared
EDITOR: Increasing the impact of health services research. BMJ vol
327, 6 December 2003
Recent research carried out across the public service delivery and
academic sectors in Wales identifies similar barriers to translating
research evidence into improved practice as those identified by Dash et al
(Increasing the impact of health services research, 6 Dec 03). Our research
canvassed the views of over 70 senior stakeholders across Wales to find
out how managers viewed the research available to them, and to what extent
they felt able to access evidence to inform their decision-making on
service delivery improvements. Respondents reported that organisational
capacity is severely limited in terms of both skills and status:
“We don’t value research in management unless it is presented by an
individual with a global reputation and even then we listen but don’t
apply it to practice”
as one neatly summed up the situation. Stakeholders also reported
that management-related research is viewed as a “poor relation” to
clinical research in terms of quality, funding and accessibility.
Operational pressures and resource constraints combined to leave
research looking like a luxury rather than an essential activity.
Differential funding streams favouring single discipline activity over the
applied multidisciplinary work needed to inform service delivery
improvements all contributed to a perceived growing chasm between
practitioners in the field and academics working in lofty isolation. But,
unlike Kernick, far from closing down university departments of health
services research, we are trying to build better bridges to incorporate
them more effectively.
Despite the formidable obstacles noted above, respondents across all
sectors reported conviction that things had to change, recognising
knowledge management as a core organisational competence for the future.
Almost because the pace of change is so rapid, managers and academics
alike reported commitment to the need to put service improvement at the
top of the research agenda. Recognising that a real change in culture
will be needed to effect such a shift in priorities, both the Welsh
Assembly Government’s Ministers for Health and Social Care, and Education,
endorsed the study findings. Supported by the Assembly’s Director of
Health and Social Care, we are now working to test the infrastructure
needed to support establishing a service development research network, to
build capacity to engage in research across the service delivery sectors,
and - most importantly - to develop a distinctive research agenda
appropriate to Wales’ specific and strategic aims. The path towards
implementing this is rocky, but the potential prize includes better
services to patients and a more informed workforce operating in a more
stimulating and productive environment.
Steffi Williams
Director Research and Professional Development
Centre for Health Leadership Wales and Senior Lecturer, PGMDE School,
University of Wales College of Medicine
Competing interests:
None declared
Dash and colleagues (1) and Lomas (2) rightly point out that there is
much to gain by increasing the relevance and use of health services
research. Dash et al draw attention to the initiatives carried out by the
NHS Service Delivery and Organisation (SDO) R&D Programme in England
and the Canadian Health Services Research Foundation in Canada to set a
relevant research agenda by ‘listening’ to a wide range of key
stakeholders (3). It is important to recognise that this involvement of
stakeholders should not end at the ‘listening’ phase so that the
continuing relevance of the research is ensured and it is communicated in
effective ways.
The SDO Programme has attempted to involve stakeholders throughout
the whole process of defining topics, selecting proposals, creating
evidence, and translating research into knowledge to inform decision-
making. Managers, health care professionals, user representatives, as
well as academic researchers, are members of the SDO Programme Board
advising on strategic direction, and serve as members of commissioning
groups that select proposals and advise on communicating the findings.
Research findings are communicated in a range of ways identified by users
as highly effective, such as briefings written by ‘expert communicators’
in consultation with researchers. Other initiatives include the
establishment of a forum for NHS chief executives to learn about research
findings and listen to their needs for research.
The SDO Programme has commissioned a wide range of research around
the themes arising from the listening exercise to address the needs of the
NHS, which include access to services, organisational change, and the
health care workforce. Just a few examples of the highly relevant research
being funded by the SDO Programme include a review of the literature on
reducing attendances and waits in accident and emergency, and empirical
studies including an evaluation of Diagnostic and Treatment Centres (DTCs)
and an evaluation of new models of configuring acute services locally
which are attempts at balancing access and quality (more information is
available at www.sdo.lshtm.ac.uk).
Despite this, more needs to be done. The current increase in funds
for the NHS has not been matched by an increase in NHS R&D funds, when
arguably the need for research is even greater.
While we share the belief that partnerships between research funders,
researchers and decision-makers will increase the impact of health
services research (4), this needs to be demonstrated through rigorous
evaluation. In addition, we need to understand which strategies are
successful in increasing the impact of health services research and in
which contexts (5).
Naomi Fulop, Director
Stuart Anderson, Senior Lecturer in Organisational Behaviour
Nick Goodwin, Senior Lecturer in Health Service Delivery and
Organisational Research
Pauline Allen, Lecturer in Organisational Research
Pamela Baker, Programme Manager
National Co-ordinating Centre for NHS Service Delivery and
Organisation R&D,
London School of Hygiene and Tropical Medicine,
Keppel Street,
London WC1E 7HT
Nick Black, Professor of Health Services Research
London School of Hygiene and Tropical Medicine
Aileen Clarke, Reader in Health Services Research
Queen Mary, University of London
Alan Burns, Chief Executive, Trent Strategic Health Authority
Malcolm Lowe-Lauri, Chief Executive, Kings College Hospital NHS Trust
1 Dash P, Gowman N, Traynor M. Increasing the impact of health
services research. BMJ 2003; 327: 1339-1341.
2 Lomas J. Health services research. BMJ 2003; 327(7427): 1301 - 1302.
3 Lomas J, Fulop N, Gagnon D, Allen P. On being a good listener: setting
priorities for applied health services research. Milbank Quarterly 2003;
81(3): 363-88.
4 Journal of Health Services Research and Policy 2003; 8 (supplement 2).
5 Lavis J, Ross S, McLeod C, Gildiner A. Measuring the impact of health
research. Journal of Health Services Research and Policy 2003; 8(3): 165-
170.
Competing interests:
Fulop, Anderson, Goodwin, Allen and Baker work for the National Co-ordinating Centre for NHS Service Delivery and Organisation R&D (NCCSDO), Black is grant-holder for NCCSDO, Clarke undertakes work for NCCSDO, Burns is Chair and Lowe-Lauri is member of the SDO Programme Board.
The recent report from the Health Foundation and the Nuffield Trust
featured in last week's BMJ (1) confirmed what practitioners have known
for years: the history of health services research has been characterised
by an overwhelming volume of literature that has had an imperceptible
impact on those who actually get on and do the work. The focus has been on
examining why evidence is not accommodated into practice and how the
barriers to implementation can be reduced. The fact that the evidence-
based product may not be relevant to those at whom it is directed had not
until recently been considered a possibility.
A useful analytical framework to address the problem is provided by
the concept of "communities of practice". This describes the ways of
relating that occur through particular activities or practices undertaken
by a group of people, facilitating the sharing of knowledge and the
negotiation of meaning among them (2). Communities of practice consist of
three structural elements: practice, the set of ideas, tools and documents
that members share; community, the environment in which people interact,
learn and build relationships; and domain, the know-how or highly
specialised professional expertise.
Although knowledge is transmitted effectively within communities of
practice, it does not pass well across their boundaries. Therefore, the
current emphasis is on managing the interface, examining why evidence is
not accommodated into practice and how the barriers to implementation can
be reduced (3). But more useful solutions may be obtained by closer
scrutiny of the academic community.
Despite the rhetoric and some methodological development, university
institutions hold themselves as the highest legitimising body for valid
knowledge. They remain committed to a particular epistemology that fosters
selective inattention to the problems of the real world. Research is
viewed as “a retail store in which researchers are busy filling shelves
with a comprehensive set of all possible relevant studies that a decision
maker might someday drop by to purchase"(4). The academic community of
practice is consolidated within a hierarchical structure that inhibits
flexibility and innovation, where funding spirals, assessment exercises,
and internal politics divorce activity from the real world.
The only way to increase the impact of research in the NHS is to
close down the university departments of health service research and send
their staff into the trenches.
REFERENCES:
1. Dash P, Gowman N, Traynor M. Increasing the impact of health
service research. BMJ 2003;327:1339-41.
2. Wenger E. Communities of practice. Cambridge University Press:
Cambridge, 1998.
3. Grol R, Grimshaw J. From best evidence to best practice:
effective implementation of change in patients' care. Lancet 2003;362:1225-
30.
4. Lomas J. Connecting research and policy. Can J Policy Res
2000;1:140- 144.
Competing interests:
None declared
The lack of responsibility for dissemination amongst funders and
researchers (1) may reflect a lack of understanding about what
dissemination actually entails. For many, dissemination is used to
describe implementation activities rather than being viewed as an important
awareness-raising activity in its own right.
At CRD, we have always tried to ensure that our approach to
dissemination takes account of the ways by which different audiences
become aware of, receive, access, read and use research findings.
If researchers are to repackage their outputs in ways that suit the
needs of the end user (and we think they should), these issues have to be
addressed. Given this, funders need to recognise that if the communication
of research findings is to be improved, dissemination activities need to
be adequately resourced.
(1) Dash P, Gowman N, Traynor M. Increasing the impact of health
services research. BMJ 2003; 327: 1339-1341.
Competing interests:
we all have responsibility for promoting the
findings of research
First, a confession: in relation to the question "are the results
being communicated to and applied by the people who need them?" (1), I
personally have often failed to communicate properly my findings about
primary health care services. For example, the last time I submitted some
health service research to Br J Gen Pract, the two reviewers responded "if
this is the new research, give me the old research every time!" and "does
God love General Practice?". Sometimes health services research (HSR) is
junked because it is rubbish science, sometimes because it is
incomprehensible, but most often because it appears useless - to the
policy makers and managers who have to run a million employee National
Health Service (NHS) for 57 million possible patients.
Now, thanks to an inspiring discussion with the 'Children's Czar' Al
Aynsley-Green (South East Public Health Observatory conference 28 November
2003), I have seen the light. In considering HSR to develop children's
services, the topic came up of 1000 sudden deaths annually in young people
with epilepsy (2). The NHS will never reduce deaths in this population
until we learn to incorporate into 'service' research what it means to a
school leaver with no qualifications, social isolation and poor prospects,
to have a history of epilepsy. There is already abundant, useful clinical
research on epilepsy (3): the fog of ignorance prevents our putting the
delivery of effective clinical care into what the Chief Nursing Officer
calls The Patient Journey or the Social Exclusion Unit calls young
people's Life Chances.
The BMJ has done an excellent job in communicating the concerns of
the Academy of Medical Sciences (4) about the shrinking quantity of
clinical research for the needs of our NHS. The fundamental weakness of
British HSR is one of quality: we keep skimming the surface of problems
around service delivery but we avoid the deep issues around the way
organisations fail to build an alliance with so many of their patients
and, indeed, what job these 'health' organisations really exist to do.
This winter a storm of fog and fury is building up about one of the
major primary care services: Health Visiting. Changes in legislation,
regulation, education and employment threaten the unravelling of a
102-year-old service for families. Before Dr Traynor brought his skills and
experience to the Health Foundation, he happened to be a health visitor.
Might I suggest that a good test case, for the Health Foundation's aim of
improving HSR, would be groundbreaking "collaborative" (1) research on the
sort of home visiting service that both British families would find useful
and the Department of Health could deliver?
1 Dash P, Gowan N, Traynor M. Increasing the impact of health
services research. BMJ 2003; 327: 1339-1341.
2 Caan W. Epilepsy, early death and learning disabilities.
http://bmj.com/eletters/326/7385/349#29786 19 February 2003.
3 Pedley TA, Hauser WA. Sudden death in epilepsy: wake-up call for
management. Lancet 2002; 359: 1790-1791.
4 Bell J. Resuscitating clinical research in the United Kingdom. BMJ
2003; 327: 1041-1043.
Competing interests: Practitioner-academician of the Academy of Learned Societies for the Social Sciences
Integration of research and service delivery
The topic of increasing the impact of health services research is
certainly one vital to all those involved in R&D in primary care.
At Battersea Research Group (BRG), a primary care research network
funded by the Department of Health R&D levy, we believe that the
situation on the front line can change and would like to highlight some
examples of change taking place in one south London Primary Care Trust. We
believe there are some core areas and goals that can be attained through a
process of more integrated working.
Firstly, a major issue is the need to break down barriers between
service and research organizations. The BRG was originally set up as a
research charity operating in primary care; however, over the past 15
months we have been working closely with Wandsworth PCT to develop more
integrated and effective research and development activities. From January
2004 BRG will merge into the PCT, becoming part of the Nursing and Clinical
Governance Directorate and developing our link with the PCT R&D Committee.
We hope that staff integration and the co-ordination of a shared research
management and governance agenda will increase contact between researchers,
health professionals and managers, facilitating a greater understanding of
research and a greater ability to disseminate and implement findings.
Secondly, there is a need to increase the opportunities for research
relating to service development. We hope that the development of a
PCT-wide R&D strategy via the R&D Committee will enable more PCT
employees to liaise with the research group and develop collaborative work
directly relevant to their clinical practice and service delivery. Of
course, funding of research and resource pressures are major issues.
However, one method may be to encourage individuals to work together as a
unit, using personal or team development time to develop and undertake
research directly relevant to their particular service. Additionally,
Wandsworth PCT is the local Teaching PCT, and through the steering
committee and associated funds it may be possible to use the research
group to help teams undertake particular pieces of project work focusing
on core areas such as skill mix, recruitment and retention, or links
between primary and secondary care.
As yet the research group has received no commission from the PCT to
answer a specific service-orientated question, as Dash et al state is the
case in the USA and Canada. Indeed, the research group would not expect to
receive such a commission without external competition. PCTs may not be
used to commissioning such work, and research groups may not always have
the capacity to respond sufficiently fast. However, integration allows
infrastructure and relationships to develop, which means the stage is set
for discussions about how questions raised within the organisation might
be answered to mutual satisfaction, providing both useful answers and
meeting academic targets (such as publications). Already the skills of the
research group are being called on to think about generic issues such as
the collection and analysis of data held in primary care, or referral
processes for the smoking cessation service. It is only a matter of time
before these questions yield both useful answers and publications.
Dr Jeremy Gray, Director, BRG - Wandsworth's Primary Care Research Centre
Amy Scammell, Research Manager, BRG
Helen Walley, Chief Executive, Wandsworth Primary Care Trust
Dr Andrew Neil, Joint Medical Director and Chair, Wandsworth PCT R&D Committee
Competing interests: None declared