June 2019 Bulletin


Welcome to Implementation in Action, a monthly bulletin for implementers and intermediary organizations seeking to apply implementation science in a thoughtful and systematic way. Each issue includes an overview of the issue's theme, a Project Spotlight, and links to two resources: one from the foundational literature on the topic and one published more recently.


TCI Launches a Free Course!

We are excited to share that we have just opened enrollment to our first ever online mini-course, Inspiring Change: Creating impact with evidence-based implementation.

The course is designed for professionals responsible for creating change: the people leading change efforts within their organizations and communities.

For a limited time, this course is available for free.

This mini-course will enable you to:

  • Gain an overview of evidence-based implementation so you can plan your change initiative more proactively

  • Discover how process models, theories, and frameworks can be the backbone of your change plan

  • Be inspired to use behavior change theory

  • Be more purposeful with your time by addressing high-priority areas and anticipating resistance to change

  • Learn simple tips and tricks that can set you up for success

CLICK HERE to sign up (it’s free) – you can launch into the modules right away!

Please forward this email to anyone you think would be interested. 


Making evidence locally meaningful and relevant

By Sobia Khan

Senior Consultant, The Center for Implementation

We often think that the terms “evidence” and “evidence-based” are universally understood and accepted. However, depending on individual and collective experiences and worldviews, notions of what constitutes evidence can be vastly different. Questions that I have heard people mull over include (but are not limited to):

  • Does evidence have to be quantitative to be meaningful?

  • Is a randomized controlled trial truly the gold standard for generating evidence for all interventions?

  • Does lived experience count as evidence?

While there are exceptions to the rule, the general trend I have seen in my own career is that people working in clinical health care tend to be influenced by the principles of evidence-based medicine and often adhere faithfully to the evidence pyramid; in that view, evidence typically has to be quantitative to be considered most meaningful. In contrast, those working in settings where the target of a program or intervention is a population or community (e.g., public health, social justice, prevention), and where multiple stakeholders are involved in implementation, often seek other sources of evidence because they view the evidence pyramid as limiting for their settings.

Add to this discussion the fact that, regardless of the field you work in, applying evidence requires a deep understanding of how that evidence was produced, for whom it was produced, and what it means. Trials are often controlled to such an extent that their results are difficult to replicate in real-life settings where uncertainty reigns. This is where questions of “adapting evidence” arise, and there are no concrete answers on how best to do this. Moreover, inequities exist in who the evidence actually applies to. At the systems and organizational levels, higher-resource settings (e.g., countries, communities, hospitals) tend to be the focus of research studies. At the individual level, women, people of colour, and marginalized populations are still underrepresented in research. Again, some level of meaning has to be derived from the evidence that exists, and adaptations made, so that the evidence makes sense in a given context.

This reasonably leads to the following thoughts about evidence: that evidence is important and non-negotiable (which is why we keep thinking and talking about it); that the “best level of evidence” might differ across interventions and settings (for example, randomized controlled trials may still be the standard for clinical research, but pragmatic trials, pre-post designs, or cohort studies might be a better standard for other fields); and that evidence has to be meaningful to all of those involved in implementing and using it.

In this issue, Dr. Stephanie Bradley describes an approach to evidence production using an inclusive sensemaking process for the Communities That Care intervention.


Project Spotlight: Aligning Evidence-Based Practice (EBP) implementation standards for diverse state agencies

By Stephanie Bradley

Founder, All Youth Access

I have been promoting evidence-based models in Pennsylvania since 2011. While at the EPISCenter, I frequently synthesized and translated implementation science for policymakers. In 2016, several state leaders expressed interest in identifying which evidence-based models were funded by other agencies and in developing funding strategies to sustain local implementations. However, not all systems define “evidence-based” in the same way, and funding mechanisms also vary across systems. In response to these issues, I recommended developing a comprehensive prevention strategy modeled after the process used to develop PA’s Juvenile Justice System Enhancement Strategy.

Figure 1. Five Phases of CTC

This led to the emergence of the Pennsylvania Cross-Systems Prevention Workgroup (CSPW), a group of more than 20 state and county agency representatives and local practitioners collaborating to develop a comprehensive, multi-agency prevention strategy. The CSPW is essentially a state-level implementation of Communities That Care (CTC; Figure 1). It uses a structured process of examining risk and protective factor data to select priorities and is conducting a resource analysis of the programs and services being delivered locally. We secured funding for a position to support coordination of CSPW efforts, comparable to the CTC mobilizer/facilitator role that supports and guides the group’s work. Using a CTC-informed process, the CSPW is integrating available evidence, navigating coordination across systems, leveraging unique strengths across sectors, and establishing a unified, collaborative vision for universal prevention in PA.

While evidence-based models continue to rise as the gold standard for funding and implementation, we must also acknowledge the inherent biases that feed into the development of “evidence,” which has predominantly been generated from a white, male, heteronormative, and top-down perspective (Kirmayer, 2012; Prussing, 2014; House, 2017). In 2018, I founded All Youth Access (AYA). My primary focus through AYA is to improve policies and programs for minority and underserved youth by leveraging frameworks such as health equity and the social determinants of health. These approaches (1) provide a broader view of effective strategies for improving behavioral and physical health, such as affordable housing, transportation, access to health care, quality education, and employment, and (2) address the impact of systemic, historical, and ongoing discrimination against communities of color (Sotero, 2006; Figure 2). Ultimately, such strategies may yield greater impact than individual programs, and they too would benefit from high-quality implementation and cross-sector collaboration.

Figure 2. Conceptual model of historical trauma (Sotero, 2006)


Implementation Resources - June Picks

Classic Literature: Practice-based evidence in public health: improving reach, relevance, and results

This article explains the difference between evidence-based practice and practice-based evidence, outlining a way to move both agendas forward.

New Literature: Pew Results First Clearinghouse Database

This is not an article but an online resource that presents the effectiveness of programs based on existing national databases. Colour-coding highlights the strength of the evidence for each intervention.