
Journal of Clinical Practice in Speech-Language Pathology

Volume 23, Number 2 2021

Implementation science

In this issue:
Introduction to implementation science and potential impact on SLP practice
Guiding implementation and evaluation of interventions for children with communication disorders, using the RE-AIM framework
Perspectives on open access research
Supporting clinician access to publications: A novel use of OneNote
A framework to support effective implementation of evidence-based interventions for people with acquired brain injury
Benchmark of clinical practice with safe swallowing strategies
Parent perceptions of an online training in language development

Print Post Approved PP352524/00383 ISSN 2200-0259

Speech Pathology Australia

Level 1 / 114 William Street, Melbourne, Victoria 3000
T: 03 9642 4899
F: 03 9642 4922
Email: office@speechpathologyaustralia.org.au
Website: www.speechpathologyaustralia.org.au
ABN 17 008 393 440  ACN 008 393 440

Speech Pathology Australia Board
Tim Kittel, President
Chyrisse Heine, Vice President Communications
Maree Doble, Vice President Operations

Acceptance of advertisements does not imply Speech Pathology Australia’s endorsement of the product or service. Although the Association reserves the right to reject advertising copy, it does not accept responsibility for the accuracy of statements by advertisers. Speech Pathology Australia will not publish advertisements that are inconsistent with its public image.

2021 subscriptions
Australian subscribers – $AUD106.00 (including GST). Overseas subscribers – $AUD132.00 (including postage and handling). Institutional rate – $AUD330 (including GST). No agency discounts.

Reference
This issue of the Journal of Clinical Practice in Speech-Language Pathology is cited as Volume 23, Number 2, 2021.

Disclaimer
To the best of The Speech Pathology Association of Australia Limited’s (“the Association”) knowledge, this information is valid at the time of publication. The Association makes no warranty or representation in relation to the content or accuracy of the material in this publication. The Association expressly disclaims any and all liability (including liability for negligence) in respect of use of the information provided. The Association recommends you seek independent professional advice prior to making any decision involving matters outlined in this publication.

Copyright
©2021 The Speech Pathology Association of Australia Limited. Contributors are required to secure permission for the reproduction of any figure, table, or extensive (more than 50 word) extract from the text, from a source which is copyrighted – or owned – by a party other than The Speech Pathology Association of Australia Limited. This applies both to direct reproduction and “derivative reproduction”, when the contributor has created a new figure or table which derives substantially from a copyrighted source.

Electronic copies of JCPSLP
Speech Pathology Australia members are able to access past and present issues of JCPSLP via the Speech Pathology Australia website: www.speechpathologyaustralia.org.au/publications/jcpslp. Electronic copies of the full journal or individual articles are available to everyone (members and non-members) at a cost by emailing pubs@speechpathologyaustralia.org.au or by completing the form available from the Speech Pathology Australia website.

JCPSLP Co-Editors
Dr Andy Smidt and Dr Katrina Blyth
c/- Speech Pathology Australia

Editorial Committee
Emma Finch
Leah Hanley
Lindy McAllister
Kathryn McKinley
Alexia Rohde
Brooke Sanderson
Alison Smith
Bronwyn Sutton
Rachael Unicomb
Cori Williams
Shaun Ziegenfusz

Copy edited by Carla Taines
Designed by Bruce Godden

Contribution deadlines
Number 3, 2021: 30 August 2021
Number 1, 2022: 1 December 2021

Advertising booking deadlines
Number 3, 2021: 17 August 2021
Number 1, 2022: 1 December 2021
Number 2, 2022: 8 April 2022

Please contact the Publications Manager at Speech Pathology Australia for advertising information.

Implementation science

From the editors
Andy Smidt and Katrina Blyth

Contents

54 How could implementation science shape the future of SLPs’ professional practice? – Hazel Roddam and Jemma Skeat
59 Using the RE-AIM framework to guide the implementation and evaluation of interventions for children with communication disorders – Elise Baker, Kate Short, and Katrina Tosi
65 Launching a collective clinical research resource for a local speech-language pathology team – Sophie Chalmers
70 A PhD student’s perspective on open access research – Sam Harvey
74 The Social Brain Toolkit: Implementation considerations from the development of a suite of novel online social communication training programs for adults with acquired brain injury and their communication partners – Melissa Miao, Emma Power, Rachael Rietdijk, Melissa Brunner, Leanne Togher, and Deborah Debono
80 Supporting safe drinking in dysphagia: Exploring the use, knowledge and skills of United Kingdom speech pathologists with strategies to support safe drinking – Angela Crocker, Hannah Crawford, Alessia Nicotera, Carlotta Griseri, and Hazel Roddam
88 Parent perceptions of an online training in language development – Lydia Timms, Isabella Sciullo, Hannah Nizich, Suze Leitão, and Mary Claessen
95 Retrospective parent report of early vocal behaviours in children with phonological delay – Chantelle Highman, Chloe Harper, Neville Hennessey, and Suze Leitão
101 Ethical conversations – Suze Leitão, Grant Meredith, Dave Parsons, and Trish Johnson
104 Around the journals
106 Evidence matters – Cori Williams
108 Top 10 in dissemination and implementation science: For SLP and audiology practitioners – Amanda Owen Van Horne
110 Resource review: Allied Health – Translating Research into Practice (AH-TRIP) – Ashley Cameron

From left: Drs Andy Smidt and Katrina Blyth

Welcome to the July 2021 JCPSLP issue! As new editors, we were initially apprehensive and unsure how many papers we’d have for this issue on Implementation science and knowledge translation. We are delighted, however, to present an issue with so many contributions and papers from across Australia and the UK, covering a range of clinical caseloads from toddlers to adults. We are also pleased to showcase authors with a range of perspectives—from Sophie Chalmers, who writes about clinicians collating and sharing evidence resources; to Sam Harvey’s paper, where he discusses open access to evidence as a PhD candidate; and to Elise Baker, who applies her expertise and writes about planning and implementing evidence within a paediatric case scenario.

We’re also pleased to include an introductory paper on implementation science and knowledge translation from our guest writers Hazel Roddam and Jemma Skeat. Hazel has 25 years’ experience as a clinical SLP in the UK, plus 15 years as an academic researcher. She currently works as an independent consultant for research and evaluation in allied health practice. Hazel has been commissioned by Health Education England to write a new research strategy for all 14 allied health disciplines that will be published at the end of 2021. Jemma is lead of clinical programs and a senior lecturer in the Department of Audiology and Speech Pathology. She has over 20 years of clinical and research experience focusing on evidence-based practice, outcome measurement and population health. Her current research and teaching interests include collaborative (interdisciplinary) practice, qualitative research, evidence-based practice and clinical learning.

As well as a great variety of papers, the JCPSLP editorial committee has put together columns that explore the topic of implementation science and knowledge translation. In Ethical conversations, Suze Leitão and Grant Meredith, with Dave Parsons, explore the ethical obligations that arise in the process of implementation science for both researchers and clinicians. In the Around the journals column we have reviews of some relevant articles from Lindy McAllister and Elizabeth Cole. Elizabeth is an Australian SLP who now lives in Malaysia, and Lindy is one of our most eminent academics, with a long history of teaching SLP in Vietnam. Together, Lindy and Elizabeth look at issues of implementation science from an international perspective. In this issue’s Evidence matters column, Cori Williams has gathered the perspectives of past and present JCPSLP editors. In Top ten tips, Amanda Owen Van Horne offers very practical advice on what implementation science means for you in your workplace. In the Resource review, Anna Farrell and Erin Kelly write about the Allied Health—Translating Research into Practice (AH-TRIP) process for implementing research evidence.

Last but not least, we’d like to take this opportunity to welcome new members to our editorial committee: Lindy McAllister, Alexia Rohde and Leah Hanley. We’re also saying farewell and thank you to Laurelie Wishart, who has been coordinating the Viewpoints and Around the journals columns but is stepping down to take on a different role as Mum.

That’s all from us. We hope you enjoy reading this issue as much as we have.

Andy and Kat


Invited commentary

How could implementation science shape the future of SLPs’ professional practice?
Hazel Roddam and Jemma Skeat

SLPs are familiar with evidence-based practice, but the implementation of evidence into practice continues to be difficult across all health and care professions, including ours. Implementation science (ImpSci) is a branch of science that focuses on how we can encourage and improve this implementation. This commentary introduces readers to ImpSci and where it has come from. We clarify the distinctive differences between ImpSci and both evidence-based practice (EBP) and quality improvement. We also reflect on how ImpSci addresses issues that are complementary and essential to our current evidence base of scientific and clinical research. ImpSci is beginning to have an impact within allied health, including research addressing implementation of evidence for speech, hearing, communication, and swallowing disorders. However, as a profession we weight our research heavily towards demonstrating efficacy, and the uptake of ImpSci methodologies has been slow. We argue for a more strategic and systematic adoption of ImpSci, to promote change in the clinical effectiveness, societal impact, and scientific reputation of SLP professional practice.

What are the drivers for implementation science?
The inaugural issue of Implementation Science (https://implementationscience.biomedcentral.com/) was launched more than fifteen years ago. This journal aims to promote methods to increase research use in health and care services, ultimately to improve the quality and effectiveness of services. The seminal definition of implementation science (ImpSci) was coined by Eccles and Mittman in that first issue as “the scientific study of methods to promote the systematic uptake of research findings and other evidence-based practices into routine practice” (Eccles & Mittman, 2006, p. 1).

The principles underpinning ImpSci had, in fact, been promoted for at least a decade before the definition was coined. A key influence was the theory of the diffusion of innovations (Rogers, 1995), highlighting the significance of contextual influences for achieving successful and sustainable changes. The principal focus for ImpSci is thus on the context into which the intervention or new practice is being introduced, as well as the perceptions of, and responses to, the change by all stakeholders—those receiving the intervention and the health care professionals who will deliver the specified new treatment. In this way, ImpSci has a broader scope than traditional clinical intervention research, focusing not only on patients, but also on providers, organisations, and policy makers in health care.

A focus on ImpSci is needed because translation of research into practice is slow and requires strategic support for sufficient impact and change to clinical service delivery (Morris et al., 2011). The limited uptake and implementation of research evidence into practice across all health care professions has been well documented (Balas & Boren, 2000). This applies equally in the allied health disciplines, as reported in Scott et al.’s (2012) systematic review of 32 studies. The analogy of a “pipeline” in translational research (Lynch et al., 2018) has increasingly been used to highlight that, in the absence of active facilitation strategies (Morris et al., 2020), empirically supported therapies may not be effectively implemented, if at all. Apart from the unwarranted time lag for the uptake of new and effective evidence-based practices, the consequences of this research–practice or “know-do gap” (Rycroft-Malone et al., 2015, p. 2) comprise the risks of clinical harm from potentially unsafe, non-evidence-based practice; ethical harm from raising unrealistic expectations for ineffective practices; and financial inefficiencies from wasted effort and resources. Without consistent and strategic adoption of the best current research findings into practice, the potential for inequalities in health care outcomes, and in access to health care services, is also perpetuated.

ImpSci has taken up the challenge of finding proven ways to support the rapid implementation of research into practice, considering the complex contexts into which new interventions are being implemented, and taking into account the need to change clinician behaviours. A key part of this has been understanding the specific challenges perceived by clinicians in implementing what are often multifaceted novel interventions into practice (Cochrane, 2015; Damschroder et al., 2009).



ImpSci goes beyond a simple focus on active dissemination of the findings from research studies, as it provides a highly structured and systematic process to translate evidence-based new ways of working into routine care.

Although it has been around for more than 15 years, ImpSci as a science and a specific type of research is not well understood. From the perspective of the practitioner workforce, who strive to read and apply research in their own practice, there are particular aspects of ImpSci that are likely to have constrained familiarity with, and confidence in, ImpSci principles and methodologies. The first is the large number of distinctive frameworks that sit under the ImpSci umbrella. ImpSci methodologies comprise a range of sophisticated frameworks, incorporating a toolkit of specific and sensitive evaluation approaches. The Consolidated Framework for Implementation Research (Damschroder et al., 2009) provides a highly practical and accessible guide that relates theoretical ImpSci constructs to pragmatic real-world scenarios. The behaviour change wheel (Michie et al., 2011) has also been widely used over the past decade. This model was developed from a purposeful selection of directly relevant components of other approaches to social behaviour change, specifically focused on capability, opportunity and motivation (the COM-B model).

The second barrier to understanding ImpSci is the plethora of terminology that has been associated with it, with terms often appearing to be used interchangeably, such as knowledge management, knowledge exchange, knowledge transfer, knowledge translation, dissemination, diffusion, implementation and research utilisation. Within our own professional community, Douglas et al. (2015) warned that ImpSci terminology was beginning to be used as a “buzzword”; like other buzzwords before it, this devalues ImpSci if the result is tokenism, where almost anything may be called ImpSci. Thus, we recommend the helpful primer by Bauer et al. (2015) to orientate clinical practitioners towards a more meaningful understanding of these terms. There are also a range of ImpSci-focused websites that provide helpful overviews for interested researchers and practitioners, including the University of Washington Implementation Science Hub (https://impsciuw.org/), the Sydney Health Partners Implementation Science Community of Practice (https://implementationscience.com.au/), and the American Institutes for Research Center on Knowledge Translation for Disability and Rehabilitation Research (https://ktdrr.org/products/kt-implementation/introduction.html).

How does implementation science differ from evidence-based practice and quality improvement?
From the outset, the fundamental philosophy of evidence-based practice (EBP) has been to promote a culture of critical reflection on practice, guiding sound and explicit decision-making (McCurtin & Roddam, 2012). EBP was driven by an agenda to assure greater consistency and continuity of best practice and effective health care treatments, after a growing awareness that there was immense variability in the way that patients were treated in medical practice (Greenhalgh, 1997). The general public were beginning to question why ‘expert’ clinical wisdom differed so widely. Alongside this, government and health care insurance agencies increasingly demanded factual evidence for medical treatments, so that they could have a more objective basis for procuring the most clinically and economically effective health care available.

Moving beyond medicine over time, the EBP movement in nursing and the allied health professions emphasised that clinical decision-making should focus more clearly on high-quality scientific evidence, rather than on clinical intuition (Reilly et al., 2004). Professional associations now almost universally promote the EBP agenda, and link this with regulatory requirements for individuals to undertake continuing professional development (CPD) and for services to be accountable for clinical effectiveness. While the EBP model establishes accountability for individual health care professionals to be aware of the most recent evidence for assessment and intervention approaches in their own field of practice, there is also the imperative for person-centred and values-based care, involving the patient and their family in care planning in the context of their own priority concerns (Greenhalgh et al., 2014). Above all, EBP is “a way of thinking”—a practice of clinicians reflecting on what they are doing (e.g., in assessment and in intervention) and why (McCurtin & Roddam, 2012, p. 21).

Despite the attention paid to EBP, it has been recognised that this has not secured the anticipated significant impact in accelerating research findings into practice. Three decades of “barriers” studies have shown that individual clinical practitioners continue to report facing the same inherent challenges to EBP, resoundingly highlighting the factors of time (to locate and read research), skills and confidence (to critically judge research publications), and autonomy (to make changes in work practices). The last of these is closest to what we would consider under ImpSci. While all health care practitioners need to be equipped to find and appraise the literature when required—for example, when faced with an unusual case, or when the usual therapy does not seem to be working well (Roddam & Skeat, 2010)—the implementation part does not rest solely on the individual. The context, which may include not just practical considerations (such as training and resources to use a new intervention) but also policy and cultural factors (such as directives or the historical use of an established intervention), plays a critical role. ImpSci provides a means of exploring these influences on implementing evidence.

ImpSci does not substitute for EBP: each has its distinctive philosophy. Embracing EBP as an integral facet of professionalism enhances engagement by practitioners with the research landscape in their own field of practice. This active level of engagement has been robustly demonstrated to significantly improve evidence-based processes of care and patient-reported experiences of care, plus some gains in clinical outcomes of care (Boaz et al., 2015). The further contribution of ImpSci is to provide a systematic way of getting those research-based interventions into best clinical practice.

ImpSci is also distinct from quality improvement (QI) approaches and methodologies (Jones et al., 2019). There are naturally some aspects of similarity between the two, and the overarching aim of both ImpSci and QI is to improve quality standards in health care. However, QI projects are generally triggered by a focus on patients’ experiences of current service delivery. In contrast, ImpSci is typically initiated by the identification of a robustly validated intervention or treatment that has not yet been introduced, or fully adopted, into routine practice (Bauer et al., 2015).
After identifying a specific problem in a health care system, QI approaches lead to the development and trial of strategies to improve the quality standards of that service delivery. Within the ImpSci framework approach, the starting point may be to gain insights into the variability in current practice, as well as perceived—and actual—constraints on the adoption of the new practice/s. Since services are necessarily delivered within a multidisciplinary and often cross-agency context, wider stakeholder consultations would usually include referral agents, as well as onward referral providers. These consultations, in both QI and ImpSci paradigms, need to negotiate competing priorities and address aspects of the organisational culture.

What specific insights can ImpSci contribute to SLP practice?
While there are general benefits to ImpSci research in understanding and addressing the barriers to evidence use in practice, there are specific challenges that SLPs face that ImpSci may help us to address. One relates to the questions that become highly relevant when an SLP determines, on the basis of evidence, that a change is needed to their current clinical practice; specifically, questions about transferability and about acceptability.

First, we might ask whether the (usually very specific) clinical population and the (often highly resourced) intervention protocol used in the research are realistically transferable to our clinical setting. Do we have these clients, and the time, resources, funding or staff to implement this evidence-based intervention as presented in the literature? Clinical caseloads include patients with complex profiles, and from demographics that do not necessarily closely match the stringent inclusion criteria of the published studies. Alongside this, the published intervention protocols are often unrealistic for clinical contexts. Second, we might ask whether the intervention is acceptable from the perspectives of both the service users and the health care professionals. Do we want to do this? Would patients be interested in this intervention? These questions are related—for example, if the evidence proposes daily intervention at a clinic, we need to ask both whether it is feasible to provide daily sessions and whether it is acceptable for those receiving the intervention to attend daily.

At best (or worst), we might choose to selectively adapt variations of validated interventions. Adaptations to the intervention could include changes to the format or setting for the session delivery; adding or skipping elements of the intervention; changes to the pacing or timing; re-ordering the sequence or substituting elements of the programme; or “drift” from the intervention protocol (Stirman et al., 2013). These ad hoc adaptations may render the interventions less effective, or even wholly ineffective. At worst (or best), we might choose to ignore the unfeasible and unacceptable evidence as presented, and continue with our current practice.

ImpSci approaches provide us with another way. Rather than ignoring the changes that are needed to implement the evidence, or ignoring the evidence altogether, ImpSci helps us to explore these adaptations and their impact. For example, ImpSci evaluations may include exploring practitioners’ opinions on the feasibility of the length and intensity of intervention that could be possible within their own routine clinical practice (Morris et al., 2020; Rycroft-Malone et al., 2004; Stetler et al., 2011).
These approaches can also objectively measure the range of adaptations made by experienced practitioners when they begin to implement new therapy interventions within real-world settings, including all the relevant factors related to the intervention itself (content, duration, frequency, etc.), as well as the context for delivery (service user demographics, practitioner characteristics, environment) (Carroll et al., 2007). This enables a clear differentiation between the intended delivery and the actual delivery of the intervention. ImpSci framework approaches support the identification of both context and content modifications that may influence the effectiveness of the intervention (Stirman et al., 2013), which can subsequently be explored through both observed and self-reported adaptations to the intervention delivery. There is a distinction between unwarranted modifications and adaptations that are consistent with the intended delivery of complex interventions within a range of feasible fidelity; this remains an ongoing methodological challenge within the field of ImpSci (Stirman et al., 2012). ImpSci studies measure the effectiveness of the implementation of the new practice (intervention fidelity), as well as the effectiveness of any specified strategies that were put in place to assist the adoption and maintenance of the practice change.

We believe that these types of ImpSci studies are both highly relevant and necessary to SLP practice at this point. We have a growing body of strong, well-conducted research that requires precisely this type of interpretation to enable successful implementation into practice. The past few years have seen a slow but steadily growing discussion of ImpSci models across allied health research (including Lynch et al., 2018; Morris et al., 2020) and in speech pathology specifically (including Campbell & Douglas, 2017; Douglas & Burshnic, 2019). These papers extol the value of ImpSci, but also recognise a number of factors that contribute to the relatively slow adoption of this field of science across health care disciplines. In common with much of health care research, and with the evidence base for the allied health disciplines in particular, our traditional research focus has been on evaluating the efficacy of interventions. This is valuable and necessary, but the research agenda of exploring how best to implement evidence into practice has been largely ignored, and across all areas of health care we are seeing that it is needed. Without ImpSci, we risk having a lot of strong evidence that we cannot use clinically, or can only use with modifications that potentially negate the efficacy.

Influencing the research priority agenda to secure substantive investment for ImpSci studies across all aspects of communication and swallowing science will be a long and challenging battle. However, without this investment, without seriously considering how we shift some of our precious (and scarce) research resources towards demonstrating how to implement the interventions that we know to be effective, there is a danger that we may miss an important opportunity for change in our profession. EBP is as important as ever, but with ImpSci we have the opportunity to create a better return on our research investment into the efficacy of interventions, and to harness evidence in a new way that leads to greater clinical effectiveness, greater societal impact, and an improved scientific reputation for our professional practice. The responsibility for shaping this future knowledge base for practice does not rest exclusively with researchers; practitioners, service managers and policy leads also need to demand the publication of implementation studies that realistically reflect the rapidly evolving contexts in which SLP services are delivered.
In this issue
Given the above need, we welcome the discussions in this issue that include an ImpSci focus and tell us more about where and how our efforts can be focused.


There are two clear ImpSci research papers contributing to our evidence base in this area. The paper from Elise Baker and colleagues recommends a systematic and structured approach for SLP teams to anticipate the factors that are likely to influence the adoption and maintenance of new interventions in clinical practice. The framework that Elise recommends has been used worldwide across various fields in health care and is illustrated here using a highly relevant scenario example. Melissa Miao and colleagues use another well-established framework that is particularly relevant to the adoption of evidence-based technological innovations in practice. This framework supports clinicians to consider challenges to the implementation of eHealth interventions, and the paper provides a worked example of the use of this framework for online social communication interventions. Both these examples are valuable for readers getting started with ImpSci, to see the application of these conceptual frameworks to SLP-relevant areas.

Beyond these direct ImpSci examples, this issue provides excellent examples of research that highlight issues known to influence the implementation of evidence. Angela Crocker and colleagues’ project provides an idea of how we might get started in understanding contexts for the implementation of evidence—particularly that of practitioner knowledge, as they explore SLPs’ confidence in using and recommending safe drinking techniques. Lydia Timms and colleagues present a specific examination of the acceptability and feasibility of the Language Together program, exploring both client and clinician views.

This issue also presents papers that provide support for knowledge mobilisation, which is crucial to supporting broader engagement with research by SLPs. As we continue to struggle to pay more than homage to being evidence-based practitioners, these papers are important. Sam Harvey’s column discusses the issue of open access to research publications. Over the past decade this has been increasingly acknowledged as an ethical barrier that inherently constrains the potential for wider dissemination of scientific findings. The pointers that Sam provides are of value to all clinicians, academics and researchers who have authored, or who access, published articles. Sophie Chalmers’ report presents her innovative approach to encouraging her team of clinical colleagues to engage more actively with the research evidence base in their own field of practice.

References
Baker, E., Short, K., & Tosi, K. (2021). Using the RE-AIM framework to guide the implementation and evaluation of interventions for children with communication disorders. Journal of Clinical Practice in Speech-Language Pathology, 23(2), 59–64.
Balas, E. A., & Boren, S. A. (2000). Managing clinical knowledge for health care improvement. Yearbook of Medical Informatics, 9(1), 65–70. https://doi.org/10.1055/s-0038-1637943
Bauer, M. S., Damschroder, L., Hagedorn, H., Smith, J., & Kilbourne, A. M. (2015). An introduction to implementation science for the non-specialist. BMC Psychology, 3(1), 1–12. https://doi.org/10.1186/s40359-015-0089-9
Boaz, A., Hanney, S., Jones, T., & Soper, B. (2015). Does the engagement of clinicians and organisations in research improve healthcare performance: A three-stage review. BMJ Open, 5(12), e009415. https://doi.org/10.1136/bmjopen-2015-009415

Campbell, W. N., & Douglas, N. F. (2017). Supporting evidence-based practice in speech-language pathology: A review of implementation strategies for promoting health professional behavior change. Evidence-Based Communication Assessment and Intervention, 11(3–4), 72–81. https://doi.org/10.1080/17489539.2017.1370215
Carroll, C., Patterson, M., Wood, S., Booth, A., Rick, J., & Balain, S. (2007). A conceptual framework for implementation fidelity. Implementation Science, 2(1), 1–9. https://doi.org/10.1186/1748-5908-2-40
Chalmers, S. (2021). Launching a collective clinical research resource for a local speech-language pathology team. Journal of Clinical Practice in Speech-Language Pathology, 23(2), 65–69.
Cochrane. (2015). Effective practice and organisation of care. https://epoc.cochrane.org/
Crocker, A., Crawford, H., Nicotera, A., Griseri, C., & Roddam, H. (2021). Supporting safe drinking in dysphagia: Exploring the use, knowledge and skills of United Kingdom speech pathologists with strategies to support safe drinking. Journal of Clinical Practice in Speech-Language Pathology, 23(2), 80–87.
Damschroder, L. J., Aron, D. C., Keith, R. E., Kirsh, S. R., Alexander, J. A., & Lowery, J. C. (2009). Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science, 4(1), 1–15. https://doi.org/10.1186/1748-5908-4-50
Douglas, N. F., & Burshnic, V. L. (2019). Implementation science: Tackling the research to practice gap in communication sciences and disorders. Perspectives of the ASHA Special Interest Groups, 4(1), 3–7. https://doi.org/10.1044/2018_pers-st-2018-0000
Douglas, N. F., Campbell, W. N., & Hinckley, J. (2015). Implementation science: Buzzword or game changer? Journal of Speech, Language, and Hearing Research, 58(6), S1827–S1836. https://doi.org/10.1044/2015_jslhr-l-15-0302
Eccles, M., & Mittman, B. (2006). Welcome to implementation science. Implementation Science, 1(1). https://doi.org/10.1186/1748-5908-1-1
Greenhalgh, T. (1997). How to read a paper: The basics of evidence-based medicine. BMJ Publications. https://doi.org/10.1136/bmj.315.7112.891
Greenhalgh, T., Howick, J., & Maskrey, N. (2014). Evidence based medicine: A movement in crisis? BMJ, 348, g3725. https://doi.org/10.1136/bmj.g3725
Harvey, S. (2021). A PhD student’s perspective on open access research. Journal of Clinical Practice in Speech-Language Pathology, 23(2), 70–73.
Highman, C., Harper, C., Hennessey, N., & Leitão, S. (2021). Retrospective parent report of early vocal behaviours in children with phonological delay. Journal of Clinical Practice in Speech-Language Pathology, 23(2), 95–100.
Jones, B., Vaux, E., & Olsson-Brown, A. (2019). How to get started in quality improvement. BMJ, 364, k5408. https://doi.org/10.1136/bmj.k5437
Lynch, E. A., Chesworth, B. M., & Connell, L. A. (2018). Implementation—The missing link in the research translation pipeline: Is it any wonder no one ever implements evidence-based practice? Neurorehabilitation and Neural Repair, 32(9), 751–761. https://doi.org/10.1177/1545968318777844
McCurtin, A., & Roddam, H. (2012). Evidence-based practice: SLTs under siege or opportunity for growth? The use and nature of research evidence in the profession. International Journal of Language & Communication Disorders, 47(1), 11–26. https://doi.org/10.1111/j.1460-6984.2011.00074.x
Michie, S., Van Stralen, M. M., & West, R. (2011). The behaviour change wheel: A new method for characterising and designing behaviour change interventions. Implementation Science, 6(1), 1–12. https://doi.org/10.1186/1748-5908-6-42
Morris, J. H., Bernhardsson, S., Bird, M.-L., Connell, L., Lynch, E., Jarvis, K., Kayes, N. M., Miller, K., Mudge, S., & Fisher, R. (2020). Implementation in rehabilitation: A roadmap for practitioners and researchers. Disability & Rehabilitation, 42(22), 3265–3274. https://doi.org/10.1080/09638288.2019.1587013
Morris, Z. S., Wooding, S., & Grant, J. (2011). The answer is 17 years, what is the question: Understanding time lags in translational research. Journal of the Royal Society of Medicine, 104(12), 510–520. https://doi.org/10.1258/jrsm.2011.110180
Reilly, S., Douglas, J., & Oates, J. (Eds.). (2004). Evidence-based practice in speech pathology. Whurr.
Roddam, H., & Skeat, J. (Eds.). (2010). Embedding evidence-based practice in speech and language therapy: International examples. Wiley.
Rogers, E. M. (1995). Diffusion of innovations. Free Press.
Rycroft-Malone, J., Burton, C., Wilkinson, J., Harvey, G., McCormack, B., Baker, R., Dopson, S., Graham, I., Staniszewska, S., Thompson, C., Ariss, S., Melville-Richards, L., & Williams, L. (2015). Collective action for knowledge mobilisation: A realist evaluation of the Collaborations for Leadership in Applied Health Research and Care. Health Services and Delivery Research, 3(44), 1–166. https://doi.org/10.3310/hsdr03440
Rycroft-Malone, J., Harvey, G., Seers, K., Kitson, A., McCormack, B., & Titchen, A. (2004). An exploration of the factors that influence the implementation of evidence into practice. Journal of Clinical Nursing, 13(8), 913–924. https://doi.org/10.1111/j.1365-2702.2004.01007.x
Scott, S. D., Albrecht, L., O’Leary, K., Ball, G. D., Hartling, L., Hofmeyer, A., Jones, C. A., Klassen, T. P., Burns, K. K., & Newton, A. S. (2012). Systematic review of knowledge translation strategies in the allied health professions. Implementation Science, 7(1), 1–17. https://doi.org/10.1186/1748-5908-7-70
Stetler, C. B., Damschroder, L. J., Helfrich, C. D., & Hagedorn, H. J. (2011). A guide for applying a revised version of the PARIHS framework for implementation. Implementation Science, 6(1), 1–10. https://doi.org/10.1186/1748-5908-6-99
Stirman, S. W., Kimberly, J., Cook, N., Calloway, A., Castro, F., & Charns, M. (2012). The sustainability of new programs and innovations: A review of the empirical literature and recommendations for future research. Implementation Science, 7(1), 1–19. https://doi.org/10.1186/1748-5908-7-17
Stirman, S. W., Miller, C. J., Toder, K., & Calloway, A. (2013). Development of a framework and coding system for modifications and adaptations of evidence-based interventions. Implementation Science, 8(1), 1–12. https://doi.org/10.1186/1748-5908-8-65
Timms, L., Sciullo, I., Nizich, H., Leitão, S., & Claessen, M. (2021). Parent perceptions of an online training in language development. Journal of Clinical Practice in Speech-Language Pathology, 23(2), 88–94.

Dr Hazel Roddam has 25 years’ experience as a clinical SLP in the UK, plus 15 years as an academic researcher. She currently works as an independent consultant for research and evaluation in allied health practice. Hazel has been commissioned by Health Education England to write a new research strategy for all 14 allied health disciplines, which will be published at the end of 2021.
Dr Jemma Skeat is lead of clinical programs and a senior lecturer in the Department of Audiology and Speech Pathology. She has over 20 years of clinical and research experience focusing on evidence-based practice, outcome measurement and population health. Her current research and teaching interests include collaborative (interdisciplinary) practice, qualitative research, evidence-based practice and clinical learning.

Correspondence to:
Hazel Roddam
Research Support Network, ReSNetSLT
email: hazeroddam@gmail.com


Implementation science

Using the RE-AIM framework to guide the implementation and evaluation of interventions for children with communication disorders
Elise Baker, Kate Short, and Katrina Tosi

THIS ARTICLE HAS BEEN PEER-REVIEWED
KEYWORDS: CHILDREN, COMMUNICATION, FIDELITY, IMPLEMENTATION, INTERVENTION

Clinical decisions in speech-language pathology practice are ideally informed by experimental evidence, with the randomised controlled trial considered the ‘pinnacle’ of best available evidence for new interventions. Although tightly controlled experimental studies are valuable, they do not necessarily provide guidance on how interventions should be implemented in routine practice. Implementation science emerged out of a need to close the gap between research and practice. In this article we describe how Russell Glasgow and colleagues’ RE-AIM framework could be used to plan and evaluate intervention implementation. Drawing on a hypothetical clinical scenario about a team of speech-language pathologists (SLPs) seeking to implement a program for late-talking toddlers, we explore the type of information and issues SLPs need to consider to ensure optimal reach, effectiveness, adoption, implementation, and maintenance of an intervention in clinical practice.

The conduct of evidence-based practice in paediatric speech-language pathology can be challenging. This is partly due to the evidence base being weighted towards efficacy studies rather than effectiveness studies, implementation research, and practice-based evidence (Crooke & Olswang, 2015; Schliep et al., 2017). Efficacy studies focus on internal validity—they determine if a clear cause–effect relationship exists between an intervention and a desired outcome under carefully controlled experimental conditions. Children eligible to participate in efficacy studies usually need to meet strict inclusion criteria, such as not having comorbid conditions and only speaking English at home (e.g., Girolametto et al., 1996). Although efficacy studies are important for determining if an intervention causes a desired outcome, the findings can have limited generalisability to the children on speech-language pathologists’ (SLPs’) caseloads.

For instance, it cannot be assumed that experimental intervention outcomes for a cohort of monolingual children will be the same for multilingual children. By contrast, effectiveness studies focus on external validity—they determine if interventions work under real-world conditions. Inclusion criteria usually more closely reflect the heterogeneity of children on SLPs’ caseloads. Although effectiveness studies offer better generalisability, they still focus on the effect of an intervention under experimental conditions. Effectiveness studies are not designed to provide information about how an intervention might be faithfully implemented at a local level to yield results in routine practice.

The field of implementation science was born out of a need to close the research–practice gap, accelerate the implementation of empirical research into day-to-day practice, and evaluate both the process and outcome of implementation (Crooke & Olswang, 2015). One of the advances within the field of implementation science over the past 20 years has been the development of theories, frameworks, and models for studying, guiding, and evaluating intervention implementation (Fixsen et al., 2005; Nilsen, 2015). In this paper we reflect on how one of these popular frameworks, the RE-AIM framework (Glasgow et al., 1999), could be used to guide the planning and evaluation of intervention implementation. We draw on a hypothetical example from our experience in clinical practice to illustrate the value of the RE-AIM framework.

Hypothetical SLP practice scenario: Improving parent training for late talkers
Astrid is a senior SLP working for an organisation providing paediatric SLP services across a local government area. Astrid leads a team of nine SLPs across three different sites. During a planning meeting, the team discuss the need to improve their engagement with families of children who are late to talk (i.e., any young child whose parents or referring agents are concerned about the child’s communication) and the subsequent implementation of parent training interventions. The team discuss the need to shift towards an effective group-based model of service delivery focused on parent training. While the team are aware of prior literature suggesting that SLPs could use a “wait-and-see” approach with some late talkers (Whitehurst et al., 1991), they are also aware of evidence indicating that as these children grow up they continue to have difficulties with speech, language, and literacy relative to peers who talk on time (Hawa & Spanoudis, 2014; Neam et al., 2020), and that the “wait-and-see” approach needs to be reconsidered (Capone Singleton, 2018).


Elise Baker (top), Kate Short (centre) and Katrina Tosi


Astrid is involved in a local evidence-based practice clinical group with an interest in late talkers. The group recently completed an appraisal of research evidence on interventions targeting parent responsiveness (e.g., Heidlage et al., 2020). In light of Astrid’s summary of what was learned from the appraisal of the research, the team have decided to implement It Takes Two To Talk®—the Hanen Program® for parents of children with language delays (Weitzman, 2017). However, the team is unsure if the program would be suitable and just as effective for the families in their local area, for two reasons. First, the wider evidence base in early language intervention has a proclivity towards efficacy studies, with few studies involving practice-based research (Crooke & Olswang, 2017). Second, the homogeneity of the research participants limits the generalisability of the evidence to all the late talkers and their families on their caseloads (e.g., culturally and linguistically diverse [CALD] communities and families of low socioeconomic status [SES]). The team decide to use the RE-AIM framework (Glasgow et al., 1999) to guide their implementation plans and evaluation.

What is the RE-AIM framework?
The RE-AIM framework was developed by Glasgow et al. (1999) to assist clinicians, researchers, and decision-makers with the task of translating research into practice. The acronym RE-AIM refers to five domains considered important when translating and implementing research into practice: reach, effectiveness, adoption, implementation and maintenance. Together these five domains help inform the planning and evaluation of intervention implementation. RE-AIM has been used by clinicians and researchers across the globe to evaluate intervention implementation across various fields in health care (Glasgow et al., 2019). What follows is an overview of each RE-AIM domain and how each domain could be applied to our hypothetical scenario to plan and evaluate the implementation of It Takes Two To Talk® (Weitzman, 2017) in paediatric SLP practice.

The reach domain
When implementing a new intervention, SLPs need to consider the reach of the intervention—the number, proportion, and representativeness of the children/families suitable for and willing to receive the intervention (Glasgow et al., 2019). Data need to be collected about who is eligible and who receives the intervention, in addition to barriers and facilitators for children/families to receive the intervention (see Table 1). Information about intervention reach is important for (a) determining if the children and families accessing the service are representative of the target population in the local area, (b) comparing if children/families receiving the intervention are similar or different to the children/families participating in published evidence, and (c) understanding who the intervention works for and who it does not. The success of methods used to reach the target population can also be considered.

Table 1. RE-AIM(a) domain REACH: Information and issues to consider when implementing and evaluating It Takes Two to Talk®(b)

What information could be collected?
• Demographic characteristics of children and parents/carers. For example, child’s age, parental education level, SES, CALD background and language(s) spoken in the home, history of prior therapy, need for interpreters, involvement in early childhood education, home literacy environment, child’s screen time, transport used to access the service.
• Attendance and non-attendance of children and parents/carers per session.
• Reason for non-attendance.
• Proportion of late talkers accessing the service who did vs did not receive the program and why (e.g., parent declined service delivery, reason for a child not being suitable).
• If parents/carers were given any choice regarding when and where they received the program.

Issues for consideration
• Review the standard case history form used in the service. Ensure necessary child and parent demographic data are gathered. This could aid further evaluations of models of service. Consider using secure online questionnaire/data collection software to facilitate efficient data entry (by parent/carer), data extraction, and analysis.
• Use established electronic notes/data systems rather than paper, to minimise double handling of clinical data.
• To minimise inconsistencies in documentation between clinicians, implement a documentation protocol and training, and conduct routine clinical audits to ensure documentation is consistent.

(a) Glasgow et al., 2019. (b) Weitzman, 2017.

The effectiveness domain
Effectiveness (see Table 2) refers to the impact of an intervention on desired outcomes, plus unintended consequences (including negative effects) (Glasgow et al., 2019). Outcomes measured can be clinically specific targets (e.g., the size of a child’s expressive vocabulary) in addition to more distal measures beyond targets, such as a child’s activity and participation, and quality of life (Sandbank et al., 2021). As Thomas-Stonell et al. (2013) point out, if outcome measures only capture the intervention target, opportunities are missed to identify subsequent social functional changes. The FOCUS-34© (Thomas-Stonell et al., 2015) is one such patient- (parent-) reported outcome measure with robust psychometrics designed to capture the impact of intervention on young children’s activity and participation. When working with young children, the outcomes for others working with the child (e.g., parents/carers, educators) also need to be considered. Finally, organisations need to consider the economic outcomes or costs relative to client outcomes when making decisions about resource allocation and exploring cost minimisation and relative cost–benefits. In a helpful tutorial paper for SLPs on health economics, Burns et al. (2020) provide SLPs with guidance on various types of health economic evaluations, costs, and outcome data to consider when engaging in implementation research. In an application of the RE-AIM framework, Bittar et al. (2018)

