Title:

Eco-Normalization: Evaluating the Longevity of an Innovation in Context.

Authors:
Hamza DM; D.M. Hamza is an implementation scientist and the research and evaluation lead for postgraduate medical education, University of Alberta, Edmonton, Canada; ORCID: https://orcid.org/0000-0001-8943-2165.
Regehr G; G. Regehr is professor, Department of Surgery, and scientist, Centre for Health Education Scholarship, Faculty of Medicine, University of British Columbia, Vancouver, British Columbia, Canada; ORCID: https://orcid.org/0000-0002-3144-331X.
Source:
Academic medicine : journal of the Association of American Medical Colleges [Acad Med] 2021 Nov 01; Vol. 96 (11S), pp. S48-S53.
Typ publikacji:
Journal Article; Review
Język:
English
Imprint Name(s):
Publication: Philadelphia, PA : Published for the Association of American Medical Colleges by Lippincott Williams & Wilkins
Original Publication: Philadelphia, PA: Hanley & Belfus, c1989-
MeSH Terms:
Diffusion of Innovation*
Models, Educational*
Education, Medical/*trends
Humans ; Implementation Science ; Program Evaluation
References:
Chen H, Rossi P. Issues in the theory-driven perspective. Eval Program Plan. 1989; 12:299–306.
Rose C. Evaluation designs. POD Quarterly: J Prof Organ Dev Network Higher Educ. 1980;2:38–46.
Stufflebeam D. The use of experimental design in educational evaluation. J Educ Meas. 1971; 8:267–274.
Campbell DT, Stanley JC. Experimental and quasi-experimental designs for research on teaching. In: Handbook of Research on Teaching. Gage NL, ed. Chicago, IL: Rand McNally, 1963.
Chen H, Rossi P. Evaluating with sense: The theory-driven approach. Eval Rev. 1983; 7:283–302.
Cordray D. Optimizing validity in program research: An elaboration of Chen and Rossi's theory-driven approach. Eval Program Plan. 1989; 12:379–385.
Judd CM. Combining process and outcome evaluation. New Dir Program Eval. 1987:23–41. doi:10.1002/ev.1457.
Lipsey MW, Pollard JA. Driving toward theory in program evaluation: More models to choose from. Eval Program Plan. 1989; 12:317–328.
Lipsey MW. Theory as method: Small theories of treatments. New Dir Program Eval. 1993;57:5–38.
Jabeen S. Unintended outcomes evaluation approach: A plausible way to evaluate unintended outcomes of social development programmes. Eval Program Plann. 2018; 68:262–274.
Jabeen S. Do we really care about unintended outcomes? An analysis of evaluation theory and practice. Eval Program Plann. 2016; 55:144–154.
Merton RK. The unanticipated consequences of purposive social action. Am Sociol Rev. 1936; 1:894–904.
Meyers WR. The Evaluation Enterprise. London, UK: Jossey-Bass Publishers, 1981.
Rogers EM. Diffusion of Innovations. New York, NY: Free Press, 1995.
Dusenbury L, Brannigan R, Falco M, Hansen WB. A review of research on fidelity of implementation: Implications for drug abuse prevention in school settings. Health Educ Res. 2003; 18:237–256.
O’Donnell C. Defining, conceptualizing, and measuring fidelity of implementation and its relationship to outcomes in K–12 curriculum intervention research. Rev Educ Res. 2008; 78:33–84.
Dane AV, Schneider BH. Program integrity in primary and early secondary prevention: Are implementation effects out of control? Clin Psychol Rev. 1998; 18:23–45.
Moncher F, Prinz RJ. Treatment fidelity in outcome studies. Clin Psychol Rev. 1991; 11:247–266.
Gresham FM, Gansle KA, Noell GH. Treatment integrity in applied behavior analysis with children. J Appl Behav Anal. 1993; 26:257–263.
Power TP, Blom-Hoffman J, Clarke AT, Riley-Tillman TC, Kelleher C, Manz P. Reconceptualizing intervention integrity: A partnership-based framework for linking research with practice. Psychol Sch. 2005; 42:495–507.
Carroll C, Patterson M, Wood S, Booth A, Rick J, Balain S. A conceptual framework for implementation fidelity. Implement Sci. 2007; 2:40.
Van Melle E, Gruppen L, Holmboe ES, Flynn L, Oandasan I, Frank JR. International Competency-Based Medical Education Collaborators. Using contribution analysis to evaluate competency-based medical education programs: It’s all about rigor in thinking. Acad Med. 2017; 92:752–758.
Oandasan I, Martin L, McGuire M, Zorzi R. Twelve tips for improvement-oriented evaluation of competency-based medical education. Med Teach. 2020; 42:272–277.
Hamza DM, Ross S, Oandasan I. Process and outcome evaluation of a CBME intervention guided by program theory. J Eval Clin Pract. 2020; 26:1096–1104.
Van Melle E, Frank JR, Holmboe ES, Dagnone D, Stockley D, Sherbino J. International Competency-based Medical Education Collaborators. A core components framework for evaluating implementation of competency-based medical education programs. Acad Med. 2019; 94:1002–1009.
Onyura B, Ng SL, Baker LR, Lieff S, Millar BA, Mori B. A mandala of faculty development: Using theory-based evaluation to explore contexts, mechanisms and outcomes. Adv Health Sci Educ Theory Pract. 2017; 22:165–186.
Onyura B, Baker L, Cameron B, Friesen F, Leslie K. Evidence for curricular and instructional design approaches in undergraduate medical education: An umbrella review. Med Teach. 2016; 38:150–161.
Schneider J, Hall J. Why most product launches fail. Harvard Business Review. https://hbr.org/2011/04/why-most-product-launches-fail . Published April 2011. Accessed July 25, 2021.
Kocina L. What percentage of new products fail and why? Media Relations Agency. https://www.publicity.com/marketsmart-newsletters/percentage-new-products-fail/?cn-reloaded=1 . Published May 3, 2017. Accessed July 25, 2021.
Viki T. Why innovation fails. Forbes. https://www.forbes.com/sites/tendayiviki/2018/02/28/why-innovation-fails/?sh=48a2163280be . Published February 28, 2018. Accessed July 25, 2021.
Whitehead CR, Hodges BD, Austin Z. Captive on a carousel: Discourses of 'new' in medical education 1910–2010. Adv Health Sci Educ. 2013; 18:755–768.
Hall AK, Rich J, Dagnone JD, et al. It’s a marathon, not a sprint: Rapid evaluation of competency-based medical education program implementation. Acad Med. 2020; 95:786–793.
Moore JE, Mascarenhas A, Bain J, Straus SE. Developing a comprehensive definition of sustainability. Implement Sci. 2017; 12:110.
Grant MJ, Booth A. A typology of reviews: An analysis of 14 review types and associated methodologies. Health Info Libr J. 2009; 26:91–108.
May C, Finch T. Implementing, embedding, and integrating processes. Sociology. 2009; 43:535–554.
Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implement Sci. 2009; 4:50.
van Mierlo B, Regeer B, van Amstel M, et al. Reflexive monitoring in action. A guide for monitoring system innovation projects. Communication and Innovation Studies. https://www.researchgate.net/publication/46383381_Reflexive_Monitoring_in_Action_A_guide_for_monitoring_system_innovation_projects . Published 2010. Accessed July 25, 2021.
May CR, Finch T, Ballini L, et al. Evaluating complex interventions and health technologies using normalization process theory: Development of a simplified approach and web-enabled toolkit. BMC Health Serv Res. 2011; 11:245.
Giddens A. Central Problems in Social Theory: Action, Structure, and Contradiction in Social Analysis. Berkeley, CA: University of California Press, 1979.
Taylor MJ, McNicholas C, Nicolay C, Darzi A, Bell D, Reed JE. Systematic review of the application of the plan-do-study-act method to improve quality in healthcare. BMJ Qual Saf. 2014; 23:290–298.
CBD Program Evaluation Operations Team. Competence by Design (CBD) Implementation Pulse Check. Ottawa, ON, Canada: Royal College of Physicians and Surgeons of Canada, 2020. https://www.royalcollege.ca/rcsite/cbd/cbd-program-evaluation-e . Accessed September 1, 2021.
LeMahieu P. What we need in education is more integrity (and less fidelity) of implementation. Carnegie Foundation for the Advancement of Teaching. https://www.carnegiefoundation.org/blog/what-we-need-in-education-is-more-integrity-and-less-fidelity-of-implementation . Published October 11, 2011. Accessed July 25, 2021.
Entry Date(s):
Date Created: 20210804 Date Completed: 20211108 Latest Revision: 20230717
Update Code:
20240105
DOI:
10.1097/ACM.0000000000004318
PMID:
34348375
Academic journal
Purpose: When initiating an educational innovation, successful implementation and meaningful, lasting change can be elusive. This elusiveness stems from the difficulty of introducing changes into complex ecosystems. Program evaluation models that focus on implementation fidelity examine the inner workings of an innovation in the real-world context. However, the methods by which fidelity is typically examined may inadvertently limit thinking about the trajectory of an innovation over time. Thus, a new approach is needed, one that focuses on whether the conditions observed during the implementation phase of an educational innovation represent a foundation for meaningful, long-lasting change.
Method: Through a critical review, the authors examined relevant models from implementation science and developed a comprehensive framework that shifts the focus of program evaluation from exploring snapshots in time to assessing the trajectory of an innovation beyond the implementation phase.
Results: Durable and meaningful "normalization" of an innovation is rooted in how the local aspirations and practices of the institutional system and the people doing the work interact with the grand aspirations and features of the innovation. Borrowing from Normalization Process Theory, the Consolidated Framework for Implementation Research, and Reflexive Monitoring in Action, the authors developed a framework, called Eco-Normalization, that highlights 6 critical questions to be considered when evaluating the potential longevity of an innovation.
Conclusions: When evaluating an educational innovation, the Eco-Normalization model focuses our attention on the ecosystem of change and the features of the ecosystem that may contribute to (or hinder) the longevity of innovations in context.
(Copyright © 2021 by the Association of American Medical Colleges.)
