Journal articles on the topic 'Evaluation of complex intervention'


Consult the top 50 journal articles for your research on the topic 'Evaluation of complex intervention.'


1

Moore, Graham F., Rhiannon E. Evans, Jemma Hawkins, Hannah Littlecott, G. J. Melendez-Torres, Chris Bonell, and Simon Murphy. "From complex social interventions to interventions in complex social systems: Future directions and unresolved questions for intervention development and evaluation." Evaluation 25, no. 1 (October 31, 2018): 23–45. http://dx.doi.org/10.1177/1356389018803219.

Abstract:
Complex systems approaches to social intervention research are increasingly advocated. However, there have been few attempts to consider how models of intervention science, such as the UK’s Medical Research Council complex interventions framework, might be reframed through a complex systems lens. This article identifies some key areas in which this framework might be reconceptualized, and a number of priority areas where further development is needed if alignment with a systems perspective is to be achieved. We argue that a complex systems perspective broadens the parameters of ‘relevant’ evidence and theory for intervention development, before discussing challenges in defining feasibility in dynamic terms. We argue that whole systems evaluations may be neither attainable, nor necessary; acknowledgment of complexity does not mean that evaluations must be complex, or investigate all facets of complexity. However, a systems lens may add value to evaluation design through guiding identification of key uncertainties, and informing decisions such as timings of follow-up assessments.
2

Makleff, Shelly, Marissa Billowitz, Jovita Garduño, Mariana Cruz, Vanessa Ivon Silva Márquez, and Cicely Marston. "Applying a complex adaptive systems approach to the evaluation of a school-based intervention for intimate partner violence prevention in Mexico." Health Policy and Planning 35, no. 8 (August 6, 2020): 993–1002. http://dx.doi.org/10.1093/heapol/czaa067.

Abstract:
Despite calls for evaluation practice to take a complex systems approach, there are few examples of how to incorporate complexity into real-life evaluations. This article presents the case for using a complex systems approach to evaluate a school-based intimate partner violence-prevention intervention. We conducted a post hoc analysis of qualitative evaluation data to examine the intervention as a potential system disruptor. We analysed data in relation to complexity concepts particularly relevant to schools: ‘diverse and dynamic agents’, ‘interaction’, ‘unpredictability’, ‘emergence’ and ‘context dependency’. The data—two focus groups with facilitators and 33 repeat interviews with 14–17-year-old students—came from an evaluation of a comprehensive sexuality education intervention in Mexico City, which serves as a case study for this analysis. The findings demonstrate an application of complex adaptive systems concepts to qualitative evaluation data. We provide examples of how this approach can shed light on the ways in which interpersonal interactions, group dynamics, the core messages of the course and context influenced the implementation and outcomes of this intervention. This gender-transformative intervention appeared to disrupt pervasive gender norms and reshape beliefs about how to engage in relationships. An intervention comprises multiple dynamic and interacting elements, all of which are unlikely to be consistent across implementation settings. Applying complexity concepts to our analysis added value by helping reframe implementation-related data to focus on how the ‘social’ aspects of complexity influenced the intervention. Without examining both individual and group processes, evaluations may miss key insights about how the intervention generates change, for whom, and how it interacts with its context. A social complex adaptive systems approach is well-suited to the evaluation of gender-transformative interventions and can help identify how such interventions disrupt the complex social systems in which they are implemented to address intractable societal problems.
3

Byford, Sarah, and Tom Sefton. "Economic Evaluation of Complex Health and Social Care Interventions." National Institute Economic Review 186 (October 2003): 98–108. http://dx.doi.org/10.1177/002795010300100114.

Abstract:
The use of economic evaluation in relatively complex areas of health and social care has been limited. The level of complexity is influenced by the nature of the problems and interventions under evaluation, being dependent upon the degree of user involvement and the complexity of the inputs and outcomes. Complexity does not preclude the achievement of a good quality economic evaluation, but it can add significant difficulties. Efforts must be made to ensure scientific validity of evaluations, whilst recognising that the complexity inherent in many health and social care interventions may require deviations from and additions to traditional evaluation models. Fundamentally, the net effect will be the need for more time and money than would perhaps be required for the evaluation of a simpler intervention.
4

Șomfelean, Oana-Maria. "Complex evaluation process in the context of early intervention." Revista Română de Terapia Tulburărilor de Limbaj şi Comunicare IV, no. 2 (October 15, 2018): 12–24. http://dx.doi.org/10.26744/rrttlc.2018.4.2.03.

5

Mühlhauser, I., and M. Berger. "Patient education - evaluation of a complex intervention." Diabetologia 45, no. 12 (December 1, 2002): 1723–33. http://dx.doi.org/10.1007/s00125-002-0987-2.

6

Jones, Taryn M., Blake F. Dear, Julia M. Hush, Nickolai Titov, and Catherine M. Dean. "Application of Intervention Mapping to the Development of a Complex Physical Therapist Intervention." Physical Therapy 96, no. 12 (December 1, 2016): 1994–2004. http://dx.doi.org/10.2522/ptj.20150387.

Abstract:
Background: Physical therapist interventions, such as those designed to change physical activity behavior, are often complex and multifaceted. In order to facilitate rigorous evaluation and implementation of these complex interventions into clinical practice, the development process must be comprehensive, systematic, and transparent, with a sound theoretical basis. Intervention Mapping is designed to guide an iterative and problem-focused approach to the development of complex interventions. Purpose: The purpose of this case report is to demonstrate the application of an Intervention Mapping approach to the development of a complex physical therapist intervention, a remote self-management program aimed at increasing physical activity after acquired brain injury. Case Description: Intervention Mapping consists of 6 steps to guide the development of complex interventions: (1) needs assessment; (2) identification of outcomes, performance objectives, and change objectives; (3) selection of theory-based intervention methods and practical applications; (4) organization of methods and applications into an intervention program; (5) creation of an implementation plan; and (6) generation of an evaluation plan. The rationale and detailed description of this process are presented using an example of the development of a novel and complex physical therapist intervention, myMoves—a program designed to help individuals with an acquired brain injury to change their physical activity behavior. Conclusion: The Intervention Mapping framework may be useful in the development of complex physical therapist interventions, ensuring the development is comprehensive, systematic, and thorough, with a sound theoretical basis. This process facilitates translation into clinical practice and allows for greater confidence and transparency when the program efficacy is investigated.
7

Madan, Jason, Meghan Bruce Kumar, Miriam Taegtmeyer, Edwine Barasa, and Swaran Preet Singh. "SEEP-CI: A Structured Economic Evaluation Process for Complex Health System Interventions." International Journal of Environmental Research and Public Health 17, no. 18 (September 17, 2020): 6780. http://dx.doi.org/10.3390/ijerph17186780.

Abstract:
The economic evaluation of health system interventions is challenging, and methods guidance on how to respond to these challenges is lacking. The REACHOUT consortium developed and evaluated complex interventions for community health program quality improvement in six countries in Africa and Asia. Reflecting on the challenges we faced in conducting an economic evaluation alongside REACHOUT, we developed a Structured Economic Evaluation Process for Complex Health System Interventions (SEEP-CI). The SEEP-CI aims to establish the threshold effect size that would justify investment in a complex intervention, and provide an assessment to a decision-maker of how likely it is that the intervention can achieve this impact. We illustrate how the SEEP-CI could have been applied to REACHOUT to identify outcomes where the intervention might have impact and causal mechanisms, through which that impact might occur, guide data collection by focusing on proximal outcomes most likely to illustrate the effectiveness of the intervention, identify the size of health gain required to justify investment in the intervention, and indicate the assumptions required to accept that such health gains are credible. Further research is required to determine the feasibility and acceptability of the SEEP-CI, and the contexts in which it could be used.
8

Steele Gray, Carolyn, and James Shaw. "From summative to developmental." Journal of Integrated Care 27, no. 3 (June 20, 2019): 241–48. http://dx.doi.org/10.1108/jica-07-2018-0053.

Abstract:
Purpose: Models of integrated care are prime examples of complex interventions, incorporating multiple interacting components that work through varying mechanisms to impact numerous outcomes. The purpose of this paper is to explore summative, process and developmental approaches to evaluating complex interventions to determine how to best test this mess. Design/methodology/approach: This viewpoint draws on the evaluation and complex intervention literatures to describe the advantages and disadvantages of different methods. The evaluation of the electronic patient reported outcomes (ePRO) mobile application and portal system is presented as an example of how to evaluate complex interventions with critical lessons learned from this ongoing study. Findings: Although favored in the literature, summative and process evaluations rest on two problematic assumptions: it is possible to clearly identify stable mechanisms of action; and intervention fidelity can be maximized in order to control for contextual influences. Complex interventions continually adapt to local contexts, making stability and fidelity unlikely. Developmental evaluation, which is more conceptually aligned with service-design thinking, moves beyond these assumptions, emphasizing supportive adaptation to ensure meaningful adoption. Research limitations/implications: Blended approaches that incorporate service-design thinking and rely more heavily on developmental strategies are essential for complex interventions. To maximize the benefit of this approach, three guiding principles are suggested: stress pragmatism over stringency; adopt an implementation lens; and use multi-disciplinary teams to run studies. Originality/value: This viewpoint offers novel thinking on the debate around appropriate evaluation methodologies to be applied to complex interventions like models of integrated care.
9

Beames, Joanne R., Raghu Lingam, Katherine Boydell, Alison L. Calear, Michelle Torok, Kate Maston, Isabel Zbukvic, et al. "Protocol for the process evaluation of a complex intervention delivered in schools to prevent adolescent depression: the Future Proofing Study." BMJ Open 11, no. 1 (January 2021): e042133. http://dx.doi.org/10.1136/bmjopen-2020-042133.

Abstract:
Introduction: Process evaluations provide insight into how interventions are delivered across varying contexts and why interventions work in some contexts and not in others. This manuscript outlines the protocol for a process evaluation embedded in a cluster randomised trial of a digital depression prevention intervention delivered to secondary school students (the Future Proofing Study). The purpose is to describe the methods that will be used to capture process evaluation data within this trial. Methods and analysis: Using a hybrid type 1 design, a mixed-methods approach will be used with data collected in the intervention arm of the Future Proofing Study. Data collection methods will include semistructured interviews with school staff and study facilitators, automatically collected intervention usage data and participant questionnaires (completed by school staff, school counsellors, study facilitators and students). Information will be collected about: (1) how the intervention was implemented in schools, including fidelity; (2) school contextual factors and their association with intervention reach, uptake and acceptability; (3) how school staff, study facilitators and students responded to delivering or completing the intervention. How these factors relate to trial effectiveness outcomes will also be assessed. Overall synthesis of the data will provide school cluster-level and individual-level process outcomes. Ethics and dissemination: Ethics approval was obtained from the University of New South Wales (NSW) Human Research Ethics Committee (HC180836; 21st January 2019) and the NSW Government State Education Research Applications Process (SERAP 2019201; 19th August 2019). Results will be submitted for publication in peer-reviewed journals and discussed at conferences. Our process evaluation will contextualise the trial findings with respect to how the intervention may have worked in some schools but not in others. This evaluation will inform the development of a model for rolling out digital interventions for the prevention of mental illness in schools. Trial registration number: ANZCTRN12619000855123; https://www.anzctr.org.au/Trial/Registration/TrialReview.aspx?id=377664&isReview=true.
10

Kainz, Kirsten, Allison Metz, and Noreen Yazejian. "Tools for Evaluating the Implementation of Complex Education Interventions." American Journal of Evaluation 42, no. 3 (July 9, 2021): 399–414. http://dx.doi.org/10.1177/1098214020958490.

Abstract:
Large-scale education interventions aimed at diminishing disparities and generating equitable learning outcomes are often complex, involving multiple components and intended impacts. Evaluating implementation of complex interventions is challenging because of the interactive and emergent nature of intervention components. Methods that build from systems science have proven useful for addressing evaluation challenges in the complex intervention space. Complexity science shares some terminology with systems science, but the primary aims and methods of complexity science are different from those of systems science. In this paper we describe some of the language and ideas used in complexity science. We offer a set of priorities for evaluation of complex interventions based on language and ideas used in complexity science and methodologies aligned with the priorities.
11

Boeije, Hennie R., Sarah J. Drabble, and Alicia O’Cathain. "Methodological Challenges of Mixed Methods Intervention Evaluations." Methodology 11, no. 4 (October 2015): 119–25. http://dx.doi.org/10.1027/1614-2241/a000101.

Abstract:
This paper addresses the methodological challenges that accompany the use of a combination of research methods to evaluate complex interventions. In evaluating complex interventions, the question about effectiveness is not the only question that needs to be answered. Of equal interest are questions about acceptability, feasibility, and implementation of the intervention and the evaluation study itself. Using qualitative research in conjunction with trials enables us to address this diversity of questions. The combination of methods results in a mixed methods intervention evaluation (MMIE). In this article we demonstrate the relevance of mixed methods evaluation studies and provide case studies from health care. Methodological challenges that need our attention are, among others, choosing appropriate designs for MMIEs, determining realistic expectations of both components, and assigning adequate resources to both components. Solving these methodological issues will improve our research designs and provide further insights into complex interventions.
12

Saarijärvi, Markus, Lars Wallin, and Ewa-Lena Bratt. "Process evaluation of complex cardiovascular interventions: How to interpret the results of my trial?" European Journal of Cardiovascular Nursing 19, no. 3 (February 14, 2020): 269–74. http://dx.doi.org/10.1177/1474515120906561.

Abstract:
Complex interventions of varying degrees of complexity are commonly used and evaluated in cardiovascular nursing and allied professions. Such interventions are increasingly tested using randomized trial designs. However, process evaluations are seldom used to better understand the results of these trials. Process evaluation aims to understand how complex interventions create change by evaluating implementation, mechanisms of impact, and the surrounding context when delivering an intervention. As such, this method can illuminate important mechanisms and clarify variation in results. In this article, process evaluation is described according to the Medical Research Council guidance and its use exemplified through a randomized controlled trial evaluating the effectiveness of a transition program for adolescents with chronic conditions.
13

Trieu, Kathy, Stephen Jan, Mark Woodward, Carley Grimes, Bruce Bolam, Caryl Nowson, Jenny Reimers, Chelsea Davidson, and Jacqui Webster. "Protocol for the Process Evaluation of a Complex, Statewide Intervention to Reduce Salt Intake in Victoria, Australia." Nutrients 10, no. 8 (July 30, 2018): 998. http://dx.doi.org/10.3390/nu10080998.

Abstract:
Systematic reviews of trials consistently demonstrate that reducing salt intake lowers blood pressure. However, there is limited evidence on how interventions function in the real world to achieve sustained population-wide salt reduction. Process evaluations are crucial for understanding how and why an intervention resulted in its observed effect in that setting, particularly for complex interventions. This project presents the detailed protocol for a process evaluation of a statewide strategy to lower salt intake in Victoria, Australia. We describe the pragmatic methods used to collect and analyse data on six process evaluation dimensions: reach, dose or adoption, fidelity, effectiveness, context and cost, informed by Linnan and Steckler’s framework and RE-AIM. Data collection methods include routinely collected administrative data; surveys of processed foods, the population, food industry and organizations; targeted campaign evaluation and semi-structured interviews. Quantitative and qualitative data will be triangulated to provide validation or context for one another. This process evaluation will contribute new knowledge about what components of the intervention are important to salt reduction strategies and how the interventions cause reduced salt intake, to inform the transferability of the program to other Australian states and territories. This protocol can be adapted for other population-based, complex, disease prevention interventions.
14

McHugh, Neil, Olga Biosca, and Cam Donaldson. "From wealth to health: Evaluating microfinance as a complex intervention." Evaluation 23, no. 2 (April 2017): 209–25. http://dx.doi.org/10.1177/1356389017697622.

Abstract:
Innovative interventions that address the social determinants of health are required to help reduce persistent health inequalities. We argue that microcredit can act in this way and develop a conceptual framework from which to examine this. In seeking to evaluate microcredit this way we then examine how randomized controlled trials, currently considered as the ‘gold standard’ in impact evaluations of microcredit, compare with developments in thinking about study design in public health. This leads us to challenge the notion of trials as the apparent gold standard for microcredit evaluations and contend that the pursuit of trial-based evidence alone may be hampering the production of relevant evidence on microcredit’s public health (and other wider) impacts. In doing so, we introduce new insights into the global debate on microfinance impact evaluation, related to ethical issues in staging randomized controlled trials, and propose innovations on complementary methods for use in the evaluation of complex interventions.
15

Ackermann, Joëlle, Armando Hoch, Jess Gerrit Snedeker, Patrick Oliver Zingg, Hooman Esfandiari, and Philipp Fürnstahl. "Automatic 3D Postoperative Evaluation of Complex Orthopaedic Interventions." Journal of Imaging 9, no. 9 (August 31, 2023): 180. http://dx.doi.org/10.3390/jimaging9090180.

Abstract:
In clinical practice, image-based postoperative evaluation is still performed without state-of-the-art computer methods, as these are not sufficiently automated. In this study we propose a fully automatic 3D postoperative outcome quantification method for the relevant steps of orthopaedic interventions on the example of Periacetabular Osteotomy of Ganz (PAO). A typical orthopaedic intervention involves cutting bone, anatomy manipulation and repositioning as well as implant placement. Our method includes a segmentation based deep learning approach for detection and quantification of the cuts. Furthermore, anatomy repositioning was quantified through a multi-step registration method, which entailed a coarse alignment of the pre- and postoperative CT images followed by a fine fragment alignment of the repositioned anatomy. Implant (i.e., screw) position was identified by 3D Hough transform for line detection combined with fast voxel traversal based on ray tracing. The feasibility of our approach was investigated on 27 interventions and compared against manually performed 3D outcome evaluations. The results show that our method can accurately assess the quality and accuracy of the surgery. Our evaluation of the fragment repositioning showed a cumulative error for the coarse and fine alignment of 2.1 mm. Our evaluation of screw placement accuracy resulted in a distance error of 1.32 mm for screw head location and an angular deviation of 1.1° for screw axis. As a next step we will explore generalisation capabilities by applying the method to different interventions.
16

Dharni, Nimarta, Josie Dickerson, Kathryn Willan, Sara Ahern, Abigail Dunn, Dea Nielsen, Eleonora Uphoff, Rosemary R. C. McEachan, and Maria Bryant. "Implementation evaluation of multiple complex early years interventions: an evaluation framework and study protocol." BMJ Paediatrics Open 3, no. 1 (June 2019): e000479. http://dx.doi.org/10.1136/bmjpo-2019-000479.

Abstract:
Introduction: Implementation evaluations are integral to understanding whether, how and why interventions work. However, unpicking the mechanisms of complex interventions is often challenging in usual service settings where multiple services are delivered concurrently. Furthermore, many locally developed and/or adapted interventions have not undergone any evaluation, thus limiting the evidence base available. Born in Bradford’s Better Start cohort is evaluating the impact of multiple early life interventions being delivered as part of the Big Lottery Fund’s ‘A Better Start’ programme to improve the health and well-being of children living in one of the most socially and ethnically diverse areas of the UK. In this paper, we outline our evaluation framework and protocol for embedding pragmatic implementation evaluation across multiple early years interventions and services. Methods and analysis: The evaluation framework is based on a modified version of The Conceptual Framework for Implementation Fidelity. Using qualitative and quantitative methods, our evaluation framework incorporates semistructured interviews, focus groups, routinely collected data and questionnaires. We will explore factors related to content, delivery and reach of interventions at both individual and wider community levels. Potential moderating factors impacting intervention success such as participants’ satisfaction, strategies to facilitate implementation, quality of delivery and context will also be examined. Interview and focus guides will be based on the Theoretical Domains Framework to further explore the barriers and facilitators of implementation. Descriptive statistics will be employed to analyse the routinely collected quantitative data and thematic analysis will be used to analyse qualitative data. Ethics and dissemination: The Health Research Authority (HRA) has confirmed our implementation evaluations do not require review by an NHS Research Ethics Committee (HRA decision 60/88/81). Findings will be shared widely to aid commissioning decisions and will also be disseminated through peer-reviewed journals, summary reports, conferences and community newsletters.
17

Lyon, Aaron R., Kelly Koerner, and Julie Chung. "Usability Evaluation for Evidence-Based Psychosocial Interventions (USE-EBPI): A methodology for assessing complex intervention implementability." Implementation Research and Practice 1 (January 2020): 263348952093292. http://dx.doi.org/10.1177/2633489520932924.

Abstract:
Background: Most evidence-based practices in mental health are complex psychosocial interventions, but little research has focused on assessing and addressing the characteristics of these interventions, such as design quality and packaging, that serve as intra-intervention determinants (i.e., barriers and facilitators) of implementation outcomes. Usability—the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction—is a key indicator of design quality. Drawing from the field of human-centered design, this article presents a novel methodology for evaluating the usability of complex psychosocial interventions and describes an example “use case” application to an exposure protocol for the treatment of anxiety disorders with one user group. Method: The Usability Evaluation for Evidence-Based Psychosocial Interventions (USE-EBPI) methodology comprises four steps: (1) identify users for testing; (2) define and prioritize EBPI components (i.e., tasks and packaging); (3) plan and conduct the evaluation; and (4) organize and prioritize usability issues. In the example, clinicians were selected for testing from among the identified user groups of the exposure protocol (e.g., clients, system administrators). Clinicians with differing levels of experience with exposure therapies (novice, n =3; intermediate, n = 4; advanced, n = 3) were sampled. Usability evaluation included Intervention Usability Scale (IUS) ratings and individual user testing sessions with clinicians, and heuristic evaluations conducted by design experts. After testing, discrete usability issues were organized within the User Action Framework (UAF) and prioritized via independent ratings (1–3 scale) by members of the research team. Results: Average IUS ratings (80.5; SD = 9.56 on a 100-point scale) indicated good usability and also room for improvement. Ratings for novice and intermediate participants were comparable (77.5), with higher ratings for advanced users (87.5). Heuristic evaluations suggested similar usability (mean overall rating = 7.33; SD = 0.58 on a 10-point scale). Testing with individual users revealed 13 distinct usability issues, which reflected all four phases of the UAF and a range of priority levels. Conclusion: Findings from the current study suggested the USE-EBPI is useful for evaluating the usability of complex psychosocial interventions and informing subsequent intervention redesign (in the context of broader development frameworks) to enhance implementation. Future research goals are discussed, which include applying USE-EBPI with a broader range of interventions and user groups (e.g., clients). Plain language abstract: Characteristics of evidence-based psychosocial interventions (EBPIs) that impact the extent to which they can be implemented in real world mental health service settings have received far less attention than the characteristics of individuals (e.g., clinicians) or settings (e.g., community mental health centers), where EBPI implementation occurs. No methods exist to evaluate the usability of EBPIs, which can be a critical barrier or facilitator of implementation success. The current article describes a new method, the Usability Evaluation for Evidence-Based Psychosocial Interventions (USE-EBPI), which uses techniques drawn from the field of human-centered design to evaluate EBPI usability. 
An example application to an intervention protocol for anxiety problems among adults is included to illustrate the value of the new approach.
18

Guicheteau, Julie, Ümran Sema Seven, Jana Boes, Ina Monsef, Sascha Köpke, Ann-Kristin Folkerts, Justina Doffiné, Elke Kalbe, and Martin N. Dichter. "P93: Characteristics of complex, non-pharmacological cognitive stimulation interventions for people with dementia in nursing homes: systematic review." International Psychogeriatrics 35, S1 (December 2023): 124. http://dx.doi.org/10.1017/s1041610223002533.

Abstract:
Objective: Several guidelines propose the use of cognitive stimulation (CS) in people with dementia. Multi-component CS interventions seem most effective in improving cognitive function, quality of life, and behavioral and psychological symptoms of dementia. For successful implementation, it is important to analyze CS interventions in detail in order to identify frequently used and potentially effective components. The aim of this systematic review is to identify, describe and summarise multicomponent CS interventions conducted in nursing homes aiming to improve cognitive function, quality of life, mood, and behavior of people with dementia. Methods: This review is based on established methodological frameworks for systematic evidence syntheses. We conducted a database search in February 2021, using PubMed, CENTRAL, PsycINFO, ALOIS and CINAHL. Two independent reviewers assessed all search results for eligible studies and assessed studies’ methodological quality using the Cochrane Risk of Bias tool for RCTs and the Joanna Briggs Institute checklist for quasi-experimental studies. Evaluation and intervention development studies of any design examining multicomponent CS interventions were included. Components of included intervention programs were analyzed using the TIDieR and CReDECI 2 criteria following a narrative analysis. Results: We identified 19,992 references and included 45 publications. We observed large heterogeneity regarding intervention components, delivery, materials, mode of delivery, intervention provider, and intervention duration. Intervention components included, for example, reminiscence therapy, activities of daily living, cognitive exercises or reality orientation. Risk of bias was generally low. Reporting of complex interventions was frequently insufficient. No study reported patient and public involvement (PPI) at any stage of the research process. Conclusion: This systematic review is the first to describe complex CS interventions conducted in nursing homes in detail. Results indicate the need for more detailed intervention description for future studies based on TIDieR and CReDECI 2 guidelines to allow reliable replication of these interventions. Despite enormous research activities, many questions regarding the implementation and efficacy are still unanswered as process evaluations are lacking. In addition, reproducibility of interventions is hardly possible due to limited reporting. Future studies should use established frameworks for the development, evaluation and implementation of complex interventions and apply PPI concepts.
19

Wenzel, Lisa, Christoph Heesen, Jutta Scheiderbauer, Markus van de Loo, Sascha Köpke, and Anne Christin Rahn. "Evaluation of an interactive web-based programme on relapse management for people with multiple sclerosis (POWER@MS2): study protocol for a process evaluation accompanying a randomised controlled trial." BMJ Open 11, no. 10 (October 2021): e046874. http://dx.doi.org/10.1136/bmjopen-2020-046874.

Abstract:
Introduction: Process evaluations accompanying complex interventions examine the implementation process of the underlying intervention, identify mechanisms of impact and assess contextual factors. This paper presents the protocol for a process evaluation conducted alongside the randomised controlled trial POWER@MS2. The trial comprises the evaluation of a web-based complex intervention on relapse management in 188 people with multiple sclerosis conducted in 20 centres. The web-based intervention programme focuses on relapse treatment decision making and includes a decision aid, a nurse-led webinar and an online chat. With the process evaluation presented here, we aim to assess participants’ responses to and interactions with the intervention to understand how and why the intervention produces change. Methods and analysis: A mixed methods design is used to explore the acceptance of the intervention as well as its use and impact on participants. Participants are people with multiple sclerosis, neurologists, nurses and stakeholders. Quantitative semistandardised evaluation forms will be collected throughout the study. Qualitative semistructured telephone interviews will be conducted at the end of the study with selected participants, especially people with multiple sclerosis and neurologists. Quantitative data will be collected and analysed descriptively. Based on the results, the qualitative interviews will be conducted and analysed thematically, and the results will be merged in a joint display table. Ethics and dissemination: The process evaluation has received ethical approval from the Ethical Committee of the University of Lübeck (reference 19–024). Findings will be disseminated in peer-reviewed journals, at conferences, meetings and on relevant patient websites. Trial registration number: NCT04233970.
20

Roberts, Shelley, Laurie Grealish, Lauren T. Williams, Zane Hopper, Julie Jenkins, Alan Spencer, and Andrea P. Marshall. "Development and Process Evaluation of a Complex Intervention for Improving Nutrition among Hospitalised Patients: A Mixed Methods Study." Healthcare 7, no. 2 (June 24, 2019): 79. http://dx.doi.org/10.3390/healthcare7020079.

Abstract:
Hospital-acquired malnutrition is a significant issue with complex aetiology, hence nutrition interventions must be multifaceted and context-specific. This paper describes the development, implementation and process evaluation of a complex intervention for improving nutrition among medical patients in an Australian hospital. An integrated knowledge translation (iKT) approach was used for intervention development, informed by previous research. Intervention strategies targeted patients (via a nutrition intake monitoring system); staff (discipline-specific training targeting identified barriers); and the organisation (foodservice system changes). A process evaluation was conducted parallel to implementation assessing reach, dose, fidelity and staff responses to the intervention using a mixed-methods design (quantitative and qualitative approaches). Staff-level interventions had high fidelity and broad reach (61% nurses, 93% foodservice staff and all medical staff received training). Patient and organisation interventions were implemented effectively, but due to staffing issues, only reached around 60% of patients. Staff found all intervention strategies acceptable with benefits to practice. This study found an iKT approach useful for designing a nutrition intervention that was context-specific, feasible and acceptable to staff. This was likely due to engagement of multiple disciplines, identifying and targeting specific areas in need of improvement, and giving staff frequent opportunities to contribute to intervention development/implementation.
21

Tsantila, Fotini, Evelien Coppens, Hans De Witte, Ella Arensman, Birgit Aust, Arlinda Cerga Pashoja, Paul Corcoran, et al. "Implementing a complex mental health intervention in occupational settings: process evaluation of the MENTUPP pilot study." BMJ Open 13, no. 12 (December 2023): e077093. http://dx.doi.org/10.1136/bmjopen-2023-077093.

Abstract:
Background: According to the Medical Research Council (MRC) framework, the theorisation of how multilevel, multicomponent interventions work and the understanding of their interaction with their implementation context are necessary to be able to evaluate them beyond their complexity. More research is needed to provide good examples following this approach in order to produce evidence-based information on implementation practices. Objectives: This article reports on the results of the process evaluation of a complex mental health intervention in small and medium enterprises (SMEs) tested through a pilot study. The overarching aim is to contribute to the evidence base related to the recruitment, engagement and implementation strategies of applied mental health interventions in the workplace. Method: The Mental Health Promotion and Intervention in Occupational Settings (MENTUPP) intervention was pilot tested in 25 SMEs in three work sectors and nine countries. The evaluation strategy of the pilot test relied on a mixed-methods approach combining qualitative and quantitative research methods. The process evaluation was inspired by the RE-AIM framework and the taxonomy of implementation outcomes suggested by Proctor and colleagues and focused on seven dimensions: reach, adoption, implementation, acceptability, appropriateness, feasibility and maintenance. Results: Factors facilitating implementation included the variety of the provided materials, the support provided by the research officers (ROs) and the existence of a structured plan for implementation, among others. Main barriers to implementation were the difficulty of talking about mental health, familiarisation with technology, difficulty in fitting the intervention into the daily routine and restrictions caused by COVID-19. Conclusions: The results will be used to optimise the MENTUPP intervention and the theoretical framework that we developed to evaluate the causal mechanisms underlying MENTUPP. Conducting this systematic and comprehensive process evaluation contributes to the enhancement of the evidence base related to mental health interventions in the workplace and it can be used as a guide to overcome their contextual complexity. Trial registration number: ISRCTN14582090.
22

Sprange, Kirsty, Gail Mountain, and Claire Craig. "Evaluation of intervention fidelity of a complex psychosocial intervention Lifestyle Matters: a randomised controlled trial." BMJ Open 11, no. 4 (April 2021): e043478. http://dx.doi.org/10.1136/bmjopen-2020-043478.

Abstract:
Objectives: Robust research of complex interventions designed to promote mental well-being in later life is required to inform service development. An essential component is ensuring that such interventions are delivered as intended. We present a detailed description of the design and implementation of a fidelity assessment within a trial of one such intervention (Lifestyle Matters). The findings help to explain the trial results and also inform the design of embedded fidelity assessments within future evaluations of complex interventions. Design: We conducted a mixed-method fidelity assessment embedded as part of a multicentre pragmatic randomised controlled trial. A conceptual fidelity framework was developed from the Behaviour Change Consortium framework. From this the fidelity assessment was designed. The resulting instrument assessed the following parameters: intervention design, training, supervision; and delivery, receipt and enactment of the intervention. Intervention: The Lifestyle Matters intervention was designed to assist older people to improve and sustain mental well-being through participation in meaningful activity. The aim is to enable participants to engage in both new and neglected activities through a mix of facilitated group meetings and individual sessions. Results: The fidelity assessment demonstrated that the intervention was delivered as per protocol for the group component and was tailored to meet individual needs. There was substantial inter-rater agreement for training and group member performance (0.72), and moderate agreement for facilitator performance (0.55). It was not possible to determine whether small declines seen in facilitator performance were due to facilitator drift or moderating factors such as group dynamics or participant characteristics. Conclusions: The assessment methods adequately measured criteria identified as being significant indicators of fidelity. Adherence during training, delivery and supervision was good. The subjective nature of identifying and rating observed behaviours was the main challenge. Future research should explore alternative methods of assessing fidelity in trials of complex interventions. Trial registration number: ISRCTN67209155.
23

McHugh, J. E., O. Lee, N. Aspell, L. Connolly, B. A. Lawlor, and S. Brennan. "Peer volunteer perspectives following a complex social cognitive intervention: a qualitative investigation." International Psychogeriatrics 28, no. 9 (February 18, 2016): 1545–54. http://dx.doi.org/10.1017/s1041610216000144.

Abstract:
Background: Peer volunteers can be key to delivering effective social cognitive interventions due to increased potential for social modeling. We consulted peer volunteers who had just taken part in an 8-week social and nutritional mealtime intervention with older adults living alone, to seek their evaluation of the intervention. Methods: Semi-structured focus groups were used with a total of 21 volunteers (17 female) and two facilitators. Thematic analysis was used to interrogate the data. Results: Six themes (16 sub-themes) are discussed. Peer volunteers described the importance of the socializing aspect of the intervention, of pairing considerations and compatibility in peer interventions, of considering the needs of the participant, of benefits to the volunteers, and of the practical considerations of conducting an intervention. Volunteers also discussed considerations for future research and services for older adults living alone. Conclusions: Volunteers found their involvement in the intervention to be personally beneficial, and revealed some valuable considerations for the researchers to take forward to future research. Results are pertinent to intervention design and could inform future social cognitive and other peer-oriented interventions for older adults living alone.
24

Farquhar, Morag C., Gail Ewing, and Sara Booth. "Using mixed methods to develop and evaluate complex interventions in palliative care research." Palliative Medicine 25, no. 8 (August 1, 2011): 748–57. http://dx.doi.org/10.1177/0269216311417919.

Abstract:
Background: There is increasing interest in combining qualitative and quantitative research methods to provide comprehensiveness and greater knowledge yield. Mixed methods are valuable in the development and evaluation of complex interventions. They are therefore particularly valuable in palliative care research where the majority of interventions are complex, and the identification of outcomes particularly challenging. Aims: This paper aims to introduce the role of mixed methods in the development and evaluation of complex interventions in palliative care, and how they may be used in palliative care research. Content: The paper defines mixed methods and outlines why and how mixed methods are used to develop and evaluate complex interventions, with a pragmatic focus on design and data collection issues and data analysis. Useful texts are signposted and illustrative examples provided of mixed method studies in palliative care, including a detailed worked example of the development and evaluation of a complex intervention in palliative care for breathlessness. Key challenges to conducting mixed methods in palliative care research are identified in relation to data collection, data integration in analysis, costs and dissemination and how these might be addressed. Conclusions: The development and evaluation of complex interventions in palliative care benefit from the application of mixed methods. Mixed methods enable better understanding of whether and how an intervention works (or does not work) and inform the design of subsequent studies. However, they can be challenging: mixed method studies in palliative care will benefit from working with agreed protocols, multidisciplinary teams and engaging staff with appropriate skill sets.
25

Contandriopoulos, André-Pierre, Lynda Rey, Astrid Brousselle, and François Champagne. "Évaluer une intervention complexe : enjeux conceptuels, méthodologiques, et opérationnels." Canadian Journal of Program Evaluation 26, no. 3 (January 2012): 1–16. http://dx.doi.org/10.3138/cjpe.0026.003.

Abstract:
Theoretically, evaluation should help decision-makers address contemporary health system challenges. Paradoxically, the use of evaluation results by decision-makers remains poor, despite rapid development in the evaluation field. The level of use depends on the evaluator’s ability to account for the complexity of health-care systems. The complex nature of an intervention often compels evaluators to adopt unconventional approaches to account for the roles of the players. The evaluation of a complex intervention raises conceptual, methodological, and operational challenges that the evaluation has to overcome to increase the level of use of its findings by decision-makers.
26

Brand, Sarah Louise, Cath Quinn, Mark Pearson, Charlotte Lennox, Christabel Owens, Tim Kirkpatrick, Lynne Callaghan, et al. "Building programme theory to develop more adaptable and scalable complex interventions: Realist formative process evaluation prior to full trial." Evaluation 25, no. 2 (October 8, 2018): 149–70. http://dx.doi.org/10.1177/1356389018802134.

Abstract:
Medical Research Council guidelines recognise the need to optimise complex interventions prior to full trial through greater understanding of underlying theory and formative process evaluation, yet there are few examples. A realist approach to formative process evaluation makes a unique contribution through a focus on theory formalisation and abstraction. The success of an intervention is dependent on the extent to which it gels or jars with existing provision and can be successfully transferred to new contexts. Interventions with underlying programme theory about how they work, for whom, and under which circumstances will be better able to adapt to work with (rather than against) different services, individuals, and settings. In this methodological article, we describe and illustrate how a realist approach to formative process evaluation develops contextualised intervention theory that can underpin more adaptable and scalable interventions. We discuss challenges and benefits of this approach.
27

Greenhalgh, C., G. Williams, A. Harrison, A. Garrow, S. Mitchell, and A. Verma. "Modified realist evaluation of a complex, multi-centred, multi-intervention programme." Journal of Public Health 45, Supplement_1 (December 2023): i5—i9. http://dx.doi.org/10.1093/pubmed/fdad029.

Abstract:
Well North was a complex, multi-intervention health improvement programme spanning 10 sites across the North of England. The aim was to address inequalities by improving the health of the poorest fastest, increasing resilience and reducing levels of worklessness. The intention of the programme was for all sites to have freedom and flexibility to conduct different interventions reflecting local priorities. Evaluation ran concurrently with the programme, and an iterative approach was required to ensure constant feedback, allowing the programme to be adapted and improved as necessary. Realist methodology was chosen for evaluation, as it provides insight into what works, for whom and in what circumstances. Due to the complex nature of the programme and diverse approaches, it was necessary to adapt the methodology to meet the needs of the evaluation. The Evaluation Team utilized a range of qualitative and quantitative techniques within the context of a Rapid Cycle Evaluation framework. For each project, Contexts, Mechanisms and Outcomes (CMOs) were identified at three stages and were incorporated into the CMO configuration, leading to the development of a middle range theory. Validation and testing of theory took place at every stage. Realist methodology was the most appropriate existing method. However, it still necessitated modification.
28

Lakshman, Rajalakshmi, Simon Griffin, Wendy Hardeman, Annie Schiff, Ann Louise Kinmonth, and Ken K. Ong. "Using the Medical Research Council Framework for the Development and Evaluation of Complex Interventions in a Theory-Based Infant Feeding Intervention to Prevent Childhood Obesity: The Baby Milk Intervention and Trial." Journal of Obesity 2014 (2014): 1–10. http://dx.doi.org/10.1155/2014/646504.

Abstract:
Introduction. We describe our experience of using the Medical Research Council framework on complex interventions to guide the development and evaluation of an intervention to prevent obesity by modifying infant feeding behaviours. Methods. We reviewed the epidemiological evidence on early life risk factors for obesity and interventions to prevent obesity in this age group. The review suggested prevention of excess weight gain in bottle-fed babies and appropriate weaning as intervention targets; hence we undertook systematic reviews to further our understanding of these behaviours. We chose theory and behaviour change techniques that demonstrated evidence of effectiveness in altering dietary behaviours. We subsequently developed intervention materials and evaluation tools and conducted qualitative studies with mothers (intervention recipients) and healthcare professionals (intervention deliverers) to refine them. We developed a questionnaire to assess maternal attitudes and feeding practices to understand the mechanism of any intervention effects. Conclusions. In addition to informing development of our specific intervention and evaluation materials, use of the Medical Research Council framework has helped to build a generalisable evidence base for early life nutritional interventions. However, the process is resource intensive and prolonged, and this should be taken into account by public health research funders. This trial is registered with ISRCTN 20814693 (Baby Milk Trial).
29

Bossert, Jasmin, Cornelia Mahler, Ursula Boltenhagen, Anna Kaltenbach, Daniela Froehlich, Joachim Szecsenyi, Michel Wensing, Stefanie Joos, and Nadja Klafke. "Protocol for the process evaluation of a counselling intervention designed to educate cancer patients on complementary and integrative health care and promote interprofessional collaboration in this area (the CCC-Integrativ study)." PLOS ONE 17, no. 5 (May 13, 2022): e0268091. http://dx.doi.org/10.1371/journal.pone.0268091.

Abstract:
Background: Conducting a process evaluation is essential to understand how health interventions work in different healthcare settings. Particularly in the case of complex interventions, it is important to find out whether the intervention could be carried out as planned and which factors had a beneficial or hindering effect on its implementation. The aim of this study is to present the detailed protocol of the process evaluation embedded in the controlled implementation study CCC-Integrativ aiming to implement an interprofessional counselling program for cancer patients on complementary and integrative health care (CIH). Methods: This mixed methods study will draw upon the “Consolidated Framework for Implementation Research” (CFIR) combined with the concept of “intervention fidelity” to evaluate the quality of the interprofessional counselling sessions, to explore the perspective of the directly and indirectly involved healthcare staff, as well as to analyze the perceptions and experiences of the patients. The qualitative evaluation phase consists of analyzing audio-recorded counselling sessions, as well as individual and group interviews with the involved persons. The quantitative evaluation phase applies questionnaires which are distributed before (T0), at the beginning (T1), in the middle (T2) and at the end (T3) of the intervention delivery. Discussion: This protocol provides an example of how a process evaluation can be conducted parallel to a main study investigating and implementing a complex intervention. The results of this mixed methods research will make it possible to identify strengths and weaknesses of the team-based intervention, and to target more specifically the key factors and structures required to implement healthcare structures to meet patients’ unmet needs in the context of CIH. To our knowledge, this study is the first applying the CFIR framework in the context of interprofessional CIH counselling, and its results are expected to provide comprehensive and multidisciplinary management of cancer patients with complex supportive healthcare needs.
30

Bangdiwala, Shrikant I., Tasneem Hassem, Lu-Anne Swart, Ashley van Niekerk, Karin Pretorius, Deborah Isobell, Naiema Taliep, Samed Bulbulia, Shahnaaz Suffla, and Mohamed Seedat. "Evaluating the Effectiveness of Complex, Multi-component, Dynamic, Community-Based Injury Prevention Interventions: A Statistical Framework." Evaluation & the Health Professions 41, no. 4 (May 16, 2017): 435–55. http://dx.doi.org/10.1177/0163278717709562.

Abstract:
Dynamic violence and injury prevention interventions located within community settings raise evaluation challenges by virtue of their complex structure, focus, and aims. They try to address many risk factors simultaneously, are often overlapped in their implementation, and their implementation may be phased over time. This article proposes a statistical and analytic framework for evaluating the effectiveness of multilevel, multisystem, multi-component, community-driven, dynamic interventions. The proposed framework builds on meta regression methodology and recently proposed approaches for pooling results from multi-component intervention studies. The methodology is applied to the evaluation of the effectiveness of South African community-centered injury prevention and safety promotion interventions. The proposed framework allows for complex interventions to be disaggregated into their constituent parts in order to extract their specific effects. The potential utility of the framework is successfully illustrated using contact crime data from select police stations in Johannesburg. The proposed framework and statistical guidelines proved to be useful to study the effectiveness of complex, dynamic, community-based interventions as a whole and of their components. The framework may help researchers and policy makers to adopt and study a specific methodology for evaluating the effectiveness of complex intervention programs.
APA, Harvard, Vancouver, ISO, and other styles
31

Swerissen, Hal. "Guest Editorial: Health Promotion Evaluation." Australian Journal of Primary Health 5, no. 4 (1999): 6. http://dx.doi.org/10.1071/py99045.

Full text
Abstract:
Health promotion has changed significantly over the past twenty years. From its origins in relatively simple models of individual behaviour change, it has evolved to incorporate complex models involving multi-causal influences. Interventions have developed from single-method, single-risk-factor designs to integrated, multifactorial approaches. Similarly, initial exploratory intervention trials have given way to longer-term, government-sponsored health promotion programs. Significant dedicated agencies and programs with continuing responsibilities for health promotion are now common.
APA, Harvard, Vancouver, ISO, and other styles
32

van Olmen, Josefien, Pilvikki Absetz, Roy William Mayega, Linda Timm, Peter Delobelle, Helle Mölsted Alvesson, Glorai Naggayi, et al. "Process evaluation of a pragmatic implementation trial to support self-management for the prevention and management of type 2 diabetes in Uganda, South Africa and Sweden in the SMART2D project." BMJ Open Diabetes Research & Care 10, no. 5 (September 2022): e002902. http://dx.doi.org/10.1136/bmjdrc-2022-002902.

Full text
Abstract:
IntroductionType 2 diabetes (T2D) and its complications are increasing rapidly. Support for healthy lifestyle and self-management is paramount, but not adequately implemented in health systems. Process evaluations facilitate understanding why and how interventions work through analyzing the interaction between intervention theory, implementation and context. The Self-Management and Reciprocal Learning for Type 2 Diabetes project implemented and evaluated community-based interventions (peer support program; care companion; and link between facility care and community support) for persons at high risk of or having T2D in a rural community in Uganda, an urban township in South Africa, and socioeconomically disadvantaged urban communities in Sweden.Research design and methodsThis paper reports implementation process outcomes across the three sites, guided by the Medical Research Council framework for complex intervention process evaluations. Data were collected through observations of peer support group meetings using a structured guide, and semistructured interviews with project managers, implementers, and participants.ResultsThe countries aligned implementation in accordance with the feasibility and relevance in the local context. In Uganda and Sweden, the implementation focused on peer support; in South Africa, it focused on the care companion part. The community–facility link received the least attention. Continuous capacity building received a lot of attention, but intervention reach, dose delivered, and fidelity varied substantially. Intervention-related and context-related barriers affected participation.ConclusionsIdentification of the key uncertainties and conditions facilitates focus and efficient use of resources in process evaluations, and context relevant findings. The use of an overarching framework allows to collect cross-contextual evidence and flexibility in evaluation design to adapt to the complex nature of the intervention. When designing interventions, it is crucial to consider aspects of the implementing organization or structure, its absorptive capacity, and to thoroughly assess and discuss implementation feasibility, capacity and organizational context with the implementation team and recipients. These recommendations are important for implementation and scale-up of complex interventions.Trial registration numberISRCTN11913581.
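The abstract reports reach, dose delivered and fidelity as core process indicators. As a minimal, hypothetical sketch (not drawn from the SMART2D data), such indicators can be computed as simple proportions from session logs:

```python
# Hypothetical sketch of descriptive process indicators (reach, dose delivered,
# fidelity) of the kind reported in MRC-style process evaluations.
# All numbers are invented for illustration only.

planned_sessions = 24          # peer-support meetings planned at a site
delivered_sessions = 18        # meetings actually held
eligible_participants = 120    # people invited to the programme
attended_at_least_once = 84    # people who attended one or more meetings
core_components_planned = 5    # e.g. goal setting, peer discussion, follow-up
core_components_delivered = 4  # components observed as delivered as intended

reach = attended_at_least_once / eligible_participants
dose_delivered = delivered_sessions / planned_sessions
fidelity = core_components_delivered / core_components_planned

print(f"Reach: {reach:.0%}, dose delivered: {dose_delivered:.0%}, fidelity: {fidelity:.0%}")
```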
APA, Harvard, Vancouver, ISO, and other styles
33

Mohd Sa’id, Iklil Iman, Iliatha Papachristou Nadal, Angus Forbes, Kimberley Goldsmith, Irmi Zarina Ismail, Faezah Hassan, Siew Mooi Ching, et al. "A Protocol of Process Evaluations of Interventions for the Prevention of Type 2 Diabetes in Women With Gestational Diabetes Mellitus: A Systematic Review." International Journal of Qualitative Methods 20 (January 2021): 160940692110340. http://dx.doi.org/10.1177/16094069211034010.

Full text
Abstract:
Background Process evaluations of randomised controlled trials (RCTs) can provide insight into intervention implementation, causal mechanisms and contextual factors, and can indicate whether an intervention's success or failure is attributable to its implementation or to the intervention itself. We aim to consolidate the methodology and the findings on facilitators and barriers from previous process evaluations of complex interventions addressing the prevention of type 2 diabetes mellitus among women with gestational diabetes mellitus (GDM). Methods A comprehensive search will be conducted of electronic databases and the reference lists of recent reviews for RCTs of complex interventions that include process evaluations of diabetes prevention interventions (DPIs) for women with GDM in healthcare settings. There is no restriction on language, and papers published up to December 2020 will be included. Data from each study will be extracted independently by two reviewers using standardised forms. Extracted data include descriptive items on study design and the outcomes of process evaluations across three dimensions: (1) implementation; (2) mechanism of impact; and (3) context. The quality of the studies will be assessed using the mixed methods appraisal tool, which is designed for the appraisal of mixed studies in systematic reviews. A narrative and framework analysis of the findings will be presented to inform the contents of a new DPI for women with GDM. Discussion The findings from this process evaluation review will be valuable in determining whether a complex intervention should be scaled up or modified for other contexts. They will give a deeper understanding of potential challenges and solutions to aid the implementation of effective DPIs for GDM in Malaysia.
APA, Harvard, Vancouver, ISO, and other styles
34

Bird, Victoria J., Clair Le Boutillier, Mary Leamy, Julie Williams, Simon Bradstreet, and Mike Slade. "Evaluating the feasibility of complex interventions in mental health services: standardised measure and reporting guidelines." British Journal of Psychiatry 204, no. 4 (April 2014): 316–21. http://dx.doi.org/10.1192/bjp.bp.113.128314.

Full text
Abstract:
Background: The feasibility of implementation is insufficiently considered in clinical guideline development, leading to human and financial resource wastage. Aims: To develop (a) an empirically based standardised measure of the feasibility of complex interventions for use within mental health services and (b) reporting guidelines to facilitate feasibility assessment. Method: A focused narrative review of studies assessing implementation blocks and enablers was conducted, with thematic analysis and vote counting used to determine candidate items for the measure. Twenty purposively sampled studies (15 trial reports, 5 protocols) were included in the psychometric evaluation, spanning different intervention types. Cohen's kappa (κ) was calculated for interrater reliability and test–retest reliability. Results: In total, 95 influences on implementation were identified from 299 references. The final measure – Structured Assessment of FEasibility (SAFE) – comprises 16 items rated on a Likert scale. There was excellent interrater (κ = 0.84, 95% CI 0.79–0.89) and test–retest reliability (κ = 0.89, 95% CI 0.85–0.93). Cost information and training time were the two influences least likely to be reported in intervention papers. The SAFE reporting guidelines include 16 items organised into three categories (intervention, resource consequences, evaluation). Conclusions: A novel approach to evaluating interventions, SAFE supplements efficacy and health economic evidence. The SAFE reporting guidelines will allow the feasibility of an intervention to be systematically assessed.
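For readers unfamiliar with the reliability statistic used for SAFE, the snippet below shows a generic Cohen's kappa calculation on invented ratings for 16 items; the quadratic weighting and the data are illustrative assumptions, not the study's analysis.

```python
# Illustrative inter-rater agreement with Cohen's kappa on invented ratings
# for a 16-item Likert-scale measure. Weighted kappa ("quadratic") is a common
# choice for ordinal ratings; the published study's exact procedure may differ.
from sklearn.metrics import cohen_kappa_score

rater_a = [4, 3, 5, 2, 4, 4, 1, 3, 5, 4, 2, 3, 4, 5, 3, 2]
rater_b = [4, 3, 4, 2, 4, 5, 1, 3, 5, 4, 2, 3, 4, 5, 3, 3]

kappa = cohen_kappa_score(rater_a, rater_b, weights="quadratic")
print(f"Cohen's kappa: {kappa:.2f}")
```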
APA, Harvard, Vancouver, ISO, and other styles
35

Saksvik, Per Øystein, Margrethe Faergestad, Silje Fossum, Oyeniyi Samuel Olaniyan, Øystein Indergård, and Maria Karanika-Murray. "An effect evaluation of the psychosocial work environment of a university unit after a successfully implemented employeeship program." International Journal of Workplace Health Management 11, no. 1 (February 5, 2018): 31–44. http://dx.doi.org/10.1108/ijwhm-08-2017-0065.

Full text
Abstract:
Purpose The purpose of this paper is to examine whether the successful implementation of an intervention could produce an effect evaluated independently from a process evaluation. This was achieved by evaluating the effects of an intervention, the "employeeship program," designed to strengthen the psychosocial work environment through raising employees' awareness and competence in interpersonal relationships and increasing their responsibility for their everyday work and working environment. Design/methodology/approach An employeeship intervention program was developed to improve the psychosocial work environment by reducing conflict among employees and strengthening the social community, empowering leadership, and increasing trust in management. An earlier process evaluation of the program found that it had been implemented successfully. The present effect evaluation supplemented this by examining its effect on the psychosocial work environment using two waves of the organization's internal survey, comparing changes in the intervention unit between the two time points and against the rest of the organization. Findings The intervention was effective in improving the psychosocial work environment through reducing conflicts among employees and strengthening the social community, empowering leadership, and increasing trust in management. Research limitations/implications More attention should be paid to developing and increasing positive psychosocial experiences while simultaneously reducing negative psychosocial experiences, as this employeeship intervention demonstrated. Practical implications An intervention focusing on employeeship is an effective way to achieve a healthier psychosocial work environment, with demonstrable benefits for individuals and the working unit. Originality/value Although organizational-level interventions are complex processes, evaluations that focus on both process and effect can offer insights into the workings of successful interventions.
APA, Harvard, Vancouver, ISO, and other styles
36

Lyon, Aaron R., Michael D. Pullmann, Jedediah Jacobson, Katie Osterhage, Morhaf Al Achkar, Brenna N. Renn, Sean A. Munson, and Patricia A. Areán. "Assessing the usability of complex psychosocial interventions: The Intervention Usability Scale." Implementation Research and Practice 2 (January 2021): 263348952098782. http://dx.doi.org/10.1177/2633489520987828.

Full text
Abstract:
Background: Usability—the extent to which an intervention can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction—may be a key determinant of implementation success. However, few instruments have been developed to measure the design quality of complex health interventions (i.e., those with several interacting components). This study evaluated the structural validity of the Intervention Usability Scale (IUS), an adapted version of the well-established System Usability Scale (SUS) for digital technologies, to measure the usability of a leading complex psychosocial intervention, Motivational Interviewing (MI), for behavioral health service delivery in primary care. Prior SUS studies have found both one- and two-factor solutions, both of which were examined in this study of the IUS. Method: A survey administered to 136 medical professionals from 11 primary-care sites collected demographic information and IUS ratings for MI, the evidence-based psychosocial intervention that primary-care providers reported using most often for behavioral health service delivery. Factor analyses replicated procedures used in prior research on the SUS. Results: Analyses indicated that a two-factor solution (with “usable” and “learnable” subscales) best fit the data, accounting for 54.1% of the variance. Inter-item reliabilities for the total score, usable subscale, and learnable subscale were α = .83, α = .84, and α = .67, respectively. Conclusion: This study provides evidence for a two-factor IUS structure consistent with some prior research, as well as acceptable reliability. Implications for implementation research evaluating the usability of complex health interventions are discussed, including the potential for future comparisons across multiple interventions and provider types, as well as the use of the IUS to evaluate the relationship between usability and implementation outcomes such as feasibility. Plain language abstract: The ease with which evidence-based psychosocial interventions (EBPIs) can be readily adopted and used by service providers is a key predictor of implementation success, but very little implementation research has attended to intervention usability. No quantitative instruments exist to evaluate the usability of complex health interventions, such as the EBPIs that are commonly used to integrate mental and behavioral health services into primary care. This article describes the evaluation of the first quantitative instrument for assessing the usability of complex health interventions and found that its factor structure replicated some research with the original version of the instrument, a scale developed to assess the usability of digital systems.
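As a rough sketch of the analytic steps described (a two-factor solution plus internal-consistency reliability), the code below runs a factor analysis and computes Cronbach's alpha on simulated Likert responses. The data, the use of scikit-learn's FactorAnalysis and the varimax rotation are illustrative assumptions, not a reproduction of the published analysis.

```python
# Simulated item responses (10 Likert items scored 0-4) stand in for IUS data.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_respondents, n_items = 136, 10
items = rng.integers(0, 5, size=(n_respondents, n_items)).astype(float)

# Request a two-factor solution with varimax rotation (the study retained two factors).
fa = FactorAnalysis(n_components=2, rotation="varimax")
fa.fit(items)
print("Loadings (items x factors):")
print(fa.components_.T.round(2))

def cronbach_alpha(scores: np.ndarray) -> float:
    """Internal-consistency reliability across item columns."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# With random data alpha will be near zero; real scale data would score higher.
print(f"Cronbach's alpha (all items): {cronbach_alpha(items):.2f}")
```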
APA, Harvard, Vancouver, ISO, and other styles
37

Johnson, Louise, Julia Mardo, and Sara Demain. "Understanding implementation of a complex intervention in a stroke rehabilitation research trial: A qualitative evaluation using Normalisation Process Theory." PLOS ONE 18, no. 9 (September 8, 2023): e0282612. http://dx.doi.org/10.1371/journal.pone.0282612.

Full text
Abstract:
Background The Implicit Learning in Stroke study was a pilot cluster randomised controlled trial investigating the use of different motor learning strategies in acute stroke rehabilitation. Participating stroke units (n = 8) were from the South East/West regions of the UK, with the experimental intervention (implicit learning) delivered by clinical teams. It required therapists to change how they gave instructions and feedback to patients during rehabilitation. This paper reports the processes underpinning implementation of the implicit learning intervention. The evaluation aimed to (i) understand how therapists made sense of, engaged with and interpreted the effects of the intervention; (ii) compare this with the experience reported by patients; and (iii) extrapolate learning of broader relevance to the design and conduct of research involving complex interventions in stroke rehabilitation. Methods Qualitative evaluation, with data collected through focus groups with clinical staff (n = 20) and semi-structured interviews with people with stroke (n = 19). Mixed inductive and theory-driven analysis, underpinned by Normalisation Process Theory. Results How therapists made sense of and experienced the intervention affected how it was implemented. The intervention was delivered by individual therapists and was influenced by their individual values, beliefs and concerns; however, how teams worked together to build a shared (team) understanding also played a key role. Teams with a more "flexible" interpretation reported the view that the intervention could have benefits in a wide range of scenarios. Those with a more fixed, "rule-based" interpretation found it harder to implement and perceived the benefits to be more limited. Therapists' concerns that the intervention might impair therapeutic relationships and patient learning were not reflected in how patients experienced it. Conclusions Changing practice, whether in a research study or in the "real world", is complex. Understanding the process of implementation is crucial to effective research delivery. Implementation frameworks facilitate this understanding and, subsequently, the systematic and iterative development of strategies to address implementation challenges. How teams (rather than individuals) work together is central to how complex interventions are understood and implemented. It is possible that new complex interventions work best in contexts with 'flexible' cultures. Researchers should consider, and potentially measure, this before they can effectively implement and evaluate an intervention. Trial registration Clinical Trials - NCT03792126.
APA, Harvard, Vancouver, ISO, and other styles
38

Giannopoulos, Eleni, Janet Papadakos, Erin Cameron, Janette Brual, Rebecca Truscott, William K. Evans, and Meredith Elana Giuliani. "Identifying Best Implementation Practices for Smoking Cessation in Complex Cancer Settings." Current Oncology 28, no. 1 (January 13, 2021): 471–84. http://dx.doi.org/10.3390/curroncol28010049.

Full text
Abstract:
Background: In response to evidence about the health benefits of smoking cessation at time of cancer diagnosis, Ontario Health (Cancer Care Ontario) (OH-CCO) instructed Regional Cancer Centres (RCC) to implement smoking cessation interventions (SCI). RCCs were given flexibility to implement SCIs according to their context but were required to screen new patients for tobacco status, advise patients about the importance of quitting, and refer patients to cessation supports. The purpose of this evaluation was to identify practices that influenced successful implementation across RCCs. Methods: A realist evaluation approach was employed. Realist evaluations examine how underlying processes of an intervention (mechanisms) in specific settings (contexts) interact to produce results (outcomes). A realist evaluation may thus help to generate an understanding of what may or may not work across contexts. Results: The RCCs with the highest Tobacco Screening Rates used a centralized system. Regarding the process for advising and referring, three RCCs offered robust smoking cessation training, resulting in advice and referral rates between 80% and 100%. Five RCCs surpassed the target for Accepted Referral Rates; acceptance rates for internal referral were highest overall. Conclusion: Findings highlight factors that may influence successful SCI implementation.
APA, Harvard, Vancouver, ISO, and other styles
39

Gwernan-Jones, Ruth, Nicky Britten, Jon Allard, Elina Baker, Laura Gill, Helen Lloyd, Tim Rawcliffe, et al. "A worked example of initial theory-building: PARTNERS2 collaborative care for people who have experienced psychosis in England." Evaluation 26, no. 1 (May 26, 2019): 6–26. http://dx.doi.org/10.1177/1356389019850199.

Full text
Abstract:
In this article, we present an exemplar of the initial theory-building phase of theory-driven evaluation for the PARTNERS2 project, a collaborative care intervention for people with experience of psychosis in England. Initial theory-building involved analysis of the literature, interviews with key leaders and focus groups with service users. The initial programme theory was developed from these sources in an iterative process between researchers and stakeholders (service users, practitioners, commissioners) involving four activities: articulation of 442 explanatory statements systematically developed using realist methods; debate and consensus; communication; and interrogation. We refute two criticisms of theory-driven evaluation of complex interventions. We demonstrate how the process of initial theory-building made a meaningful contribution to our complex intervention in five ways. Although time-consuming, it allowed us to develop an internally coherent and well-documented intervention. This study and the lessons learnt provide a detailed resource for other researchers wishing to build theory for theory-driven evaluation.
APA, Harvard, Vancouver, ISO, and other styles
40

Wong, Alvin, Yingxiao Huang, Merrilyn D. Banks, P. Marcin Sowa, and Judy D. Bauer. "A Conceptual Study on Characterizing the Complexity of Nutritional Interventions for Malnourished Older Adults in Hospital Settings: An Umbrella Review Approach." Healthcare 12, no. 7 (March 31, 2024): 765. http://dx.doi.org/10.3390/healthcare12070765.

Full text
Abstract:
Introduction: Malnutrition is a widespread and intricate issue among hospitalized adults, necessitating a wide variety of nutritional strategies to address its root causes and repercussions. The primary objective of this study is to systematically categorize nutritional interventions as simple or complex, based on their resource allocation, strategies employed, and predictors of intervention complexity, in the context of adult malnutrition in hospital settings. Methods: A conceptual evaluation of 100 nutritional intervention studies for adult malnutrition was conducted based on data from a recent umbrella review (patient population of mean age > 60 years). The complexity of interventions was categorized using the Medical Research Council 2021 Framework for Complex Interventions. A logistic regression analysis was employed to identify variables predicting the complexity of interventions. Results: Interventions were divided into three principal categories: education and training (ET), exogenous nutrient provision (EN), and environment and services (ES). Most interventions (66%) addressed two or more of these areas. The majority of interventions were delivered in a hospital (n = 75) or a hospital-to-community setting (n = 25), with 64 studies classified as complex interventions. The logistic regression analysis revealed three variables associated with intervention complexity: the number of strategies utilized, the targeted areas, and the involvement of healthcare professionals. Complex interventions were more likely to be tailored to individual needs and to engage multiple healthcare providers. Conclusions: The study underlines the importance of considering intervention complexity in addressing adult malnutrition. The findings advocate a comprehensive approach to characterizing and evaluating nutritional interventions in future research. Subsequent investigations should explore the optimal balance between intervention complexity and resource allocation, and assess the effectiveness of complex interventions across various settings, while considering novel approaches such as telehealth.
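To make the reported modelling step concrete, here is a hypothetical version of a logistic regression predicting intervention complexity from the kinds of variables named in the abstract (targeted areas are treated as a simple count for illustration). The data are simulated and the coefficients arbitrary; this is a sketch of the technique, not the study's model.

```python
# Simulated logistic regression: is an intervention classified as "complex"?
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 100
n_strategies = rng.integers(1, 6, n)        # number of strategies used
n_areas = rng.integers(1, 4, n)             # number of targeted areas (ET/EN/ES)
multi_professional = rng.integers(0, 2, n)  # multiple healthcare professionals involved

# Simulate the outcome so that more strategies/areas/professionals -> more complex.
logit = -4 + 0.8 * n_strategies + 0.7 * n_areas + 1.2 * multi_professional
complex_intervention = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([n_strategies, n_areas, multi_professional]))
result = sm.Logit(complex_intervention, X).fit(disp=False)
print(result.summary(xname=["const", "n_strategies", "n_areas", "multi_professional"]))
```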
APA, Harvard, Vancouver, ISO, and other styles
41

Bowden, A. Brooks, Robert Shand, Clive R. Belfield, Anyi Wang, and Henry M. Levin. "Evaluating Educational Interventions That Induce Service Receipt." American Journal of Evaluation 38, no. 3 (August 26, 2016): 405–19. http://dx.doi.org/10.1177/1098214016664983.

Full text
Abstract:
Educational interventions are complex: often they combine a diagnostic component (identifying student need) with a service component (ensuring appropriate educational resources are provided). This complexity raises challenges for program evaluation. These interventions, which we refer to as service mediation interventions, affect the additional resources students receive, and those resources mediate the measured impact. Evaluations of these types of programs that solely report effects are potentially misleading. Cost-effectiveness analysis clarifies the importance of assessing service-mediated receipt for evaluation purposes. We illustrate our argument empirically with City Connects, a comprehensive student support intervention. We find that the direct costs of the program represent only one-third of the total change in resource use by program participants required to produce impacts. Evaluative statements about service mediation interventions should be accompanied by information on the full costs required to achieve effects. Many interventions may be structured in this way and require evaluation that includes an economic perspective.
APA, Harvard, Vancouver, ISO, and other styles
42

Stijnen, Mandy, Inge Duimel-Peeters, Hubertus Vrijhoel, and Maria Jansen. "Process evaluation plan of a patient-centered home visitation program for potentially frail community-dwelling older people in general practice." European Journal for Person Centered Healthcare 2, no. 2 (April 8, 2014): 179. http://dx.doi.org/10.5750/ejpch.v2i2.716.

Full text
Abstract:
Background: Evaluation studies examining the effectiveness of interventions tend to offer little insight into the mechanisms responsible for changes in outcomes. The present study conducted a thorough process evaluation alongside a longitudinal quasi-experimental trial investigating the effects of a home visitation program for the early detection of health problems among potentially frail community-dwelling older people (≥ 75 years). We aim to describe the rationale for, and the steps undertaken in, developing a process evaluation plan to identify the factors that influence the success or failure of this complex, patient-centered intervention within the primary care setting. Method: Using a theoretical framework underlying the process evaluation, process evaluation questions are formulated for each component of the framework (i.e., implementation fidelity, dose delivered, dose received, reach, recruitment and context). The process evaluation plan shows how both quantitative (e.g., structured registration forms) and qualitative methods (e.g., semi-structured interviews) are applied in gathering process data for a complex, patient-centered intervention integrated within general practices. Process data are gathered for either formative or summative purposes among practice nurses and general practitioners from participating general practices and a purposive sample of older people. Conclusion: Conducting a process evaluation alongside a clinical trial will assist in deciding to what extent the intervention is effective, as well as which factors contribute to its effectiveness. The insights gained are imperative for the development of patient-centered interventions that are likely to be sustained when implemented in the intended context.
APA, Harvard, Vancouver, ISO, and other styles
43

Booth, Andrew, Jane Noyes, Kate Flemming, Graham Moore, Özge Tunçalp, and Elham Shakibazadeh. "Formulating questions to explore complex interventions within qualitative evidence synthesis." BMJ Global Health 4, Suppl 1 (January 2019): e001107. http://dx.doi.org/10.1136/bmjgh-2018-001107.

Full text
Abstract:
When making decisions about complex interventions, guideline development groups need to factor in the sociocultural acceptability of an intervention, as well as contextual factors that impact on the feasibility of that intervention. Qualitative evidence synthesis offers one method of exploring these issues. This paper considers the extent to which current methods of question formulation are meeting this challenge. It builds on a rapid review of 38 different frameworks for formulating questions. To be useful, a question framework should recognise context (as setting, environment or context); acknowledge the criticality of different stakeholder perspectives (differentiated from the target population); accommodate elements of time/timing and place; be sensitive to qualitative data (eg, eliciting themes or findings). None of the identified frameworks satisfied all four of these criteria. An innovative question framework, PerSPEcTiF, is proposed and retrospectively applied to a published WHO guideline for a complex intervention. Further testing and evaluation of the PerSPEcTiF framework is required.
APA, Harvard, Vancouver, ISO, and other styles
44

Liddiard, Kim, Sara Louise Morgan, and Bronwen Elizabeth Lesley Davies. "Evaluation of a transition intervention in a secure setting." Journal of Forensic Practice 21, no. 2 (May 13, 2019): 158–66. http://dx.doi.org/10.1108/jfp-03-2019-0008.

Full text
Abstract:
Purpose Transitioning is an inevitable part of being in secure settings, yet little research exists focussing on the experiences of individuals and what interventions might help them to achieve optimal transitions. This seems surprising, as the very people who find themselves in secure settings often have attachment difficulties, maladaptive coping strategies and complex mental health needs, which are the factors considered most likely to disadvantage individuals when transitioning. The paper aims to discuss this issue. Design/methodology/approach This study used a repeated-measures design to explore the effectiveness of a person-centred intervention with 18 transitioning individuals in a medium-secure hospital. Three self-report questionnaires were used to capture data relating to anxiety, coping strategies and how individuals feel about the transition, pre- and post-intervention. Complete data sets were obtained in 16 cases. Findings Following the transition intervention, individuals felt more at ease with the transition ahead of them, their use of adaptive coping strategies had significantly increased and their trait anxiety had significantly lowered. Research limitations/implications This study revealed that using a person-centred intervention with transitioning individuals was helpful. However, the study was not able to capture the impact of this intervention over time. Practical implications This study highlights the importance of attending to how individuals experience the transition, alongside offering interventions designed to help them adjust and cope to achieve optimal transitions. Originality/value Very little is known about what interventions might help individuals achieve a successful transition. Therefore, the findings offer new and significant contributions to this under-researched area.
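As an illustration of how such pre/post questionnaire data might be analysed, the snippet below runs a Wilcoxon signed-rank test on invented trait-anxiety scores for 16 cases; the choice of test and all values are assumptions, since the abstract does not state which statistics were used.

```python
# Minimal pre/post comparison for a small repeated-measures sample (n = 16).
# Scores are invented; a Wilcoxon signed-rank test is one common choice for
# paired, non-normal questionnaire data.
from scipy.stats import wilcoxon

trait_anxiety_pre  = [52, 48, 60, 55, 49, 62, 58, 50, 47, 63, 56, 51, 59, 54, 61, 57]
trait_anxiety_post = [45, 44, 55, 50, 47, 58, 52, 46, 45, 60, 50, 48, 54, 50, 56, 51]

stat, p_value = wilcoxon(trait_anxiety_pre, trait_anxiety_post)
print(f"Wilcoxon W = {stat:.1f}, p = {p_value:.4f}")
```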
APA, Harvard, Vancouver, ISO, and other styles
45

Dewar, Sophie J., Jana Jenkins, and Ian I. Kneebone. "How do you evaluate an informal support group? A pilot example from stroke rehabilitation." FPOP Bulletin: Psychology of Older People 1, no. 107 (April 2009): 41–49. http://dx.doi.org/10.53841/bpsfpop.2009.1.107.41.

Full text
Abstract:
In order to justify the use of limited resources, there is an increasing demand to evaluate the effectiveness of interventions within the health service. Despite this, the literature indicates that designing and implementing effective evaluations is difficult and often poorly done. Further, clinicians' negative perceptions of research may discourage evaluation, resulting in practice that is primarily based on clinical intuition. The current article describes the rationale behind the development of a pilot evaluation strategy for an informal inpatient and carer stroke support group. It is concluded that appropriate evaluation should be determined by the purpose and nature of the intervention under examination, as well as the persons involved. Furthermore, it is suggested that valuable feedback regarding clinical practice need not involve time-consuming and complex evaluation.
APA, Harvard, Vancouver, ISO, and other styles
46

Lazo-Porras, María, Lena R. Brandt, Elsa Cornejo-Vucovich, Catalina A. Denman, Francisco Diez-Canseco, Alejandra Malavera, Ankita Mukherjee, et al. "The value of process evaluation for public health interventions: field-case studies for non-communicable disease prevention and management in five countries." Salud Pública de México 64 (June 13, 2022): S56—S66. http://dx.doi.org/10.21149/12853.

Full text
Abstract:
Complex interventions are needed to effectively tackle non-communicable diseases. However, complex interventions can contain a mix of effective and ineffective actions. Process evaluation (PE) in public health research is of great value as it can clarify the mechanisms and contextual factors associated with variation in outcomes, better identify effective components, and inform adaptation of the intervention. The aim of this paper is to demonstrate the value of PE through five case studies that span the research cycle. The interventions include using digital health, salt reduction strategies, use of fixed dose combinations, and task shifting. Insights into the methods used, and the implications of the PE findings for the projects, are discussed. PE of complex interventions can refute or confirm the hypothesized mechanisms of action, thereby enabling intervention refinement and identifying implementation strategies that address local contextual needs, so as to improve service delivery and public health outcomes.
APA, Harvard, Vancouver, ISO, and other styles
47

Thayabaranathan, Tharshanah, Maarten A. Immink, Susan Hillier, Rene Stolwyk, Nadine E. Andrew, Philip Stevens, Monique F. Kilkenny, et al. "Co-Designing a New Yoga-Based Mindfulness Intervention for Survivors of Stroke: A Formative Evaluation." Neurology International 14, no. 1 (December 21, 2021): 1–10. http://dx.doi.org/10.3390/neurolint14010001.

Full text
Abstract:
Movement-based mindfulness interventions (MBI) are complex, multi-component interventions for which the design process is rarely reported. For people with stroke, emerging evidence suggests benefits, but mainstream programs are generally unsuitable. We aimed to describe the processes involved and to conduct a formative evaluation of the development of a novel yoga-based MBI designed for survivors of stroke. We used the Medical Research Council complex interventions framework and principles of co-design. We purposefully approached health professionals and consumers to establish an advisory committee for developing the intervention. Members collaborated and iteratively reviewed the design and content of the program, formatted into a training manual. Four external yoga teachers independently reviewed the program. Formative evaluation included review of multiple data sources and documentation (e.g., formal meeting minutes, focus group discussions, researcher observations). The data were synthesized using inductive thematic analysis. Three broad themes emerged: (a) MBI content and terminology; (b) manual design and readability; and (c) barriers and enablers to deliver the intervention. Various perspectives and feedback on essential components guided finalizing the program. The design phase of a novel yoga-based MBI was strengthened by interdisciplinary, consumer contributions and peer review. The 12-week intervention is ready for testing among survivors of stroke.
APA, Harvard, Vancouver, ISO, and other styles
48

Smit, Linda C., Jeroen Dikken, Marieke J. Schuurmans, Niek J. de Wit, and Nienke Bleijenberg. "Value of social network analysis for developing and evaluating complex healthcare interventions: a scoping review." BMJ Open 10, no. 11 (November 2020): e039681. http://dx.doi.org/10.1136/bmjopen-2020-039681.

Full text
Abstract:
Objectives: Most complex healthcare interventions target a network of healthcare professionals. Social network analysis (SNA) is a powerful technique to study how social relationships within a network are established and evolve. We identified in which phases of complex healthcare intervention research SNA is used and the value of SNA for developing and evaluating complex healthcare interventions. Methods: A scoping review was conducted using the Arksey and O'Malley methodological framework. We included complex healthcare intervention studies using SNA to identify the study characteristics, the level of complexity of the healthcare interventions, the reported strengths and limitations, and the reported implications of SNA. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses Extension for Scoping Reviews 2018 was used to guide the reporting. Results: Among 2466 identified studies, 40 were selected for analysis. First, the results showed that SNA appears underused in evaluating complex intervention research. Second, SNA was not used in the development phase of the included studies. Third, the implications reported in the evaluation and implementation phases reflect the value of SNA in addressing implementation and population complexity. Fourth, the pathway complexity and contextual complexity of the included interventions were unclear or could not be assessed. Fifth, the use of a mixed methods approach was reported as a strength, as combining and integrating quantitative and qualitative methods substantiates the results. Conclusion: SNA is a widely applicable method that can be used in different phases of complex intervention research. SNA can be of value to disentangle and address the level of complexity of complex healthcare interventions. Furthermore, the routine use of SNA within a mixed methods approach could yield actionable insights that would be useful in the transactional context of complex interventions.
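For readers new to SNA, the toy example below computes two common centrality measures for an invented network of healthcare professionals using networkx; it is purely illustrative and unrelated to the studies in the review.

```python
# Toy social network analysis of an invented professional referral network.
import networkx as nx

G = nx.Graph()
referrals = [
    ("GP", "practice_nurse"), ("GP", "dietitian"), ("practice_nurse", "dietitian"),
    ("GP", "physiotherapist"), ("dietitian", "community_worker"),
    ("physiotherapist", "community_worker"),
]
G.add_edges_from(referrals)

degree = nx.degree_centrality(G)            # how connected each professional is
betweenness = nx.betweenness_centrality(G)  # who brokers between separate parts

for node in G.nodes:
    print(f"{node:>18}: degree={degree[node]:.2f}, betweenness={betweenness[node]:.2f}")
```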
APA, Harvard, Vancouver, ISO, and other styles
49

Mannell, Jenevieve, and Katy Davis. "Evaluating Complex Health Interventions With Randomized Controlled Trials: How Do We Improve the Use of Qualitative Methods?" Qualitative Health Research 29, no. 5 (March 14, 2019): 623–31. http://dx.doi.org/10.1177/1049732319831032.

Full text
Abstract:
Qualitative methods are underutilized in health intervention evaluation and are overshadowed by the importance placed on randomized controlled trials (RCTs). This Commentary describes how innovative qualitative methods are being used as part of RCTs, drawing on articles included in a special issue of Qualitative Health Research on this topic. The articles' insights, and a review of innovative qualitative methods described in trial protocols, highlight a lack of attention to structural inequalities as a causal mechanism for understanding human behavior. We situate this gap within some well-known constraints of RCT methodologies and a discussion of alternative RCT approaches that hold promise for bringing qualitative methods center stage in intervention evaluation, including adaptive designs, pragmatic trials, and realist RCTs. To address the power hierarchies of health evaluation research, however, we argue that a fundamental shift needs to take place away from a focus on RCTs and toward studies of health interventions.
APA, Harvard, Vancouver, ISO, and other styles
50

Fitzgerald, Sarah, Aileen Murphy, Ann Kirby, Fiona Geaney, and Ivan J. Perry. "Cost-effectiveness of a complex workplace dietary intervention: an economic evaluation of the Food Choice at Work study." BMJ Open 8, no. 3 (March 2018): e019182. http://dx.doi.org/10.1136/bmjopen-2017-019182.

Full text
Abstract:
Objective: To evaluate the costs, benefits and cost-effectiveness of complex workplace dietary interventions, involving nutrition education and system-level dietary modification, from the perspective of healthcare providers and employers. Design: Single-study economic evaluation of a cluster-controlled trial (Food Choice at Work (FCW) study) with 1-year follow-up. Setting: Four multinational manufacturing workplaces in Cork, Ireland. Participants: 517 randomly selected employees (18–65 years) from four workplaces. Interventions: Cost data were obtained from the FCW study. Nutrition education included individual nutrition consultations, nutrition information (traffic light menu labelling, posters, leaflets and emails) and presentations. System-level dietary modification included menu modification (restriction of fat, sugar and salt), an increase in fibre, fruit discounts, strategic positioning of healthier alternatives and portion size control. The combined intervention included both nutrition education and system-level dietary modification. No intervention was implemented in the control. Outcomes: The primary outcome was an improvement in health-related quality of life, measured using the EuroQoL 5 Dimensions 5 Levels questionnaire. The secondary outcome was a reduction in absenteeism, measured in monetary terms. Probabilistic sensitivity analysis (Monte Carlo simulation) assessed parameter uncertainty. Results: The system-level intervention dominated the education and combined interventions. When compared with the control, the incremental cost-effectiveness ratio (€101.37 per quality-adjusted life-year) is below the nationally accepted ceiling ratio, so the system-level intervention can be considered cost-effective. The cost-effectiveness acceptability curve indicates some decision uncertainty, arising from uncertainty surrounding the differences in effectiveness. These results are reiterated when the secondary outcome is considered in a cost–benefit analysis, in which the system-level intervention yields the highest net benefit (€56.56 per employee). Conclusions: System-level dietary modification alone offers the best value for improving employee health-related quality of life and generating net benefit for employers by reducing absenteeism. While system-level dietary modification strategies are potentially sustainable obesity prevention interventions, future research should include long-term outcomes to determine whether improvements persist. Trial registration number: ISRCTN35108237; Post-results.
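To illustrate the kind of probabilistic sensitivity analysis mentioned (Monte Carlo simulation of an ICER and a cost-effectiveness acceptability curve), the following sketch uses invented distributions for incremental costs and QALYs; none of the numbers correspond to the FCW results.

```python
# Hedged sketch of a probabilistic sensitivity analysis for a two-arm
# cost-effectiveness comparison (intervention vs control). All inputs invented.
import numpy as np

rng = np.random.default_rng(7)
n_sim = 10_000

# Simulated incremental costs (euro) and incremental QALYs per employee.
inc_cost = rng.normal(loc=15.0, scale=10.0, size=n_sim)
inc_qaly = rng.normal(loc=0.02, scale=0.015, size=n_sim)

icer = inc_cost.mean() / inc_qaly.mean()
print(f"Mean ICER: EUR {icer:,.0f} per QALY")

# Cost-effectiveness acceptability curve: probability that net monetary
# benefit is positive at different willingness-to-pay thresholds.
for wtp in (5_000, 20_000, 45_000):
    nmb = wtp * inc_qaly - inc_cost
    print(f"P(cost-effective at EUR {wtp:,}/QALY): {(nmb > 0).mean():.2f}")
```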
APA, Harvard, Vancouver, ISO, and other styles
