Succeeding With (or Maybe in Spite of) Evidence-Based Practices
Nonprofits face increasing pressure to show that their programs are evidence-based. Here’s how to “tweak” these practices to fit your own populations.
Article Highlights:
- Evidence-based practices have been shown through rigorous research to be effective.
- 1. Decide if claims that the practice is evidence-based are trustworthy.
- 2. Decide if the EBP is relevant for your population and your program.
- 3. If necessary, tweak the EBP so that it best fits your work.
- 4. Use available resources to adapt EBPs!
- 5. Measure whether it’s working, and keep learning.
- Hey, wait a sec… We’re already doing that!
Evidence-based practices have been shown through rigorous research to be effective.
We understand the reasoning behind allowing funding only for proven, evidence-based practices. But too often this requirement has become a club battering community nonprofits. Evaluator Clare Nolan explains how to “tweak” evidence-based practices to fit your own populations:
Safer sex can be a life-and-death issue. And many nonprofits make safer sex education the centerpiece of their work. But how do they know whether what they’re teaching is working – whether lives are actually being saved?
A nonprofit in San Francisco’s Tenderloin neighborhood had a safer sex education program modeled after a “proven” intervention being promoted by the Centers for Disease Control and Prevention (CDC). But their own expertise with their population led them to want to change the model. That’s why they asked me to design a program evaluation.
As part of my background research, I was surprised to learn that the intervention was first shown to be effective among a primarily gay white population in a small Southern city. Would this intervention really be successful at reducing HIV risk behaviors among residents of a diverse urban neighborhood struggling with poverty, homelessness and crime?
This situation reflects a broader trend in the nonprofit sector in which funders encourage and sometimes require nonprofits to use “evidence-based” practices and models. Evidence-based practices (EBPs) are strategies that have been shown through rigorous research to be effective. The premise sounds great. If there’s strong evidence that something works, nonprofits should use it, right?
Not so fast. Models and practices with positive track records are a potentially good tool for those working toward social impact, but there are some things nonprofits should know before jumping onto the evidence-based bandwagon.
Here are five tips for avoiding potential pitfalls while making the most of what EBPs have to offer.
1. Decide if claims that the practice is evidence-based are trustworthy.
These days, there is no shortage of websites providing information on supposedly proven practices or program models for the nonprofit sector. Be skeptical about what these sites have on offer.
For example, some individuals are advocating for practices they pioneered in order to increase their own visibility or generate personal revenue.
To determine whether a resource is credible, look to see what kind of evidence was used to determine whether a practice was effective. Commonly accepted standards include:
- The practice was evaluated using a rigorous research design (i.e., one that compared an intervention group with a control group; see the toy sketch after this list)
- The practice has been shown to be effective for more than one kind of population, and by different researchers
- Studies of the intervention have been published in peer-reviewed journals.

Some examples of groups that are credible resources for EBPs include the What Works Clearinghouse, the Promising Practices Network, and the Substance Abuse and Mental Health Services Administration’s National Registry of Evidence-Based Programs and Practices.
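To make that first criterion concrete, here is a minimal sketch, not from the article, of the kind of intervention-versus-control comparison a rigorous study rests on. The group sizes, outcome counts, and the choice of a two-proportion z-test are all invented for illustration.

```python
# Toy illustration only: what an "intervention vs. control" comparison looks
# like. Every number here is invented for the example.
from math import sqrt, erf

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided z-test comparing the success rates of two groups."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)         # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal tail, both sides
    return p_a, p_b, z, p_value

# Hypothetical evaluation: 200 participants per arm; "success" = reporting
# consistent safer-sex practices at follow-up.
p_int, p_ctl, z, p = two_proportion_z(120, 200, 90, 200)
print(f"intervention: {p_int:.0%}  control: {p_ctl:.0%}  z = {z:.2f}  p = {p:.3f}")
```

The statistics matter less than the logic: two comparable groups, a clearly defined outcome, and a difference large enough that chance is an unlikely explanation.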
2. Decide if the EBP is relevant for your population and your program.
Even when the evidence looks good, it’s important to take a critical stance. You have to assess for yourself whether you think the practice is relevant to and appropriate for the communities you serve. Keep in mind that research is often limited when it comes to discerning what works for diverse populations.
Also important is assessing whether your organization has the capacity to implement the intervention. For example, does your staff have the appropriate skill set and training associated with a particular EBP?
Also take into account how easy or hard it will be to obtain staff buy-in before moving forward.
3. If necessary, tweak the EBP so that it best fits your work.
Once you’ve identified an EBP that seems relevant to your organization, the next step is to think about implementation. You’re probably familiar with the significant body of literature on the difficulties and low success rate of replicating model programs.
In most cases, nonprofits will not have the resources to replicate an EBP exactly, nor is this necessarily the best idea. Instead, nonprofits should think about how to adapt the practice in ways that are responsive to local needs.
To do this, gather information about the theory or logic underlying the EBP. Decide which elements are core to the intended outcomes and which are less critical. Anticipate how potential adaptations might strengthen or weaken the EBP based on the underlying logic.
For example, the EBP on which the Tenderloin group’s intervention was based utilized weekly group meetings with participants — probably good in the EBP model but possibly too difficult with this street-based, drug-using population.
Focus groups confirmed for us that having fewer group meetings would not invalidate the core elements of the EBP. As you can see, adaptation is more of an art than a science at this point.
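One hypothetical way to make that core-versus-adaptable reasoning concrete is to write the EBP’s underlying logic down before debating changes. Everything in this sketch, element names included, is invented; it simply illustrates the kind of bookkeeping this step involves.

```python
# Hypothetical sketch: making an EBP's logic explicit so proposed adaptations
# can be checked against its core elements. All content below is invented.
EBP_LOGIC = {
    "core": {
        "skills_practice": "participants rehearse negotiation skills",
        "peer_delivery": "sessions led by trained peer educators",
    },
    "adaptable": {
        "session_frequency": "weekly group meetings",
        "setting": "community center classroom",
    },
}

def check_adaptation(element: str) -> str:
    """Classify a proposed change against the written-down logic model."""
    if element in EBP_LOGIC["core"]:
        return f"'{element}' is core -- changing it risks the intended outcomes"
    if element in EBP_LOGIC["adaptable"]:
        return f"'{element}' is adaptable -- tailor it to local needs"
    return f"'{element}' is not in the logic model -- investigate before changing"

print(check_adaptation("session_frequency"))
print(check_adaptation("peer_delivery"))
```

In practice this lives in a logic-model document rather than code, but the discipline is the same: name the core elements first, then judge each proposed adaptation against them.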
4. Use available resources to adapt EBPs!
Many EBPs come with materials to support implementation, including program manuals, training curricula, and technical assistance programs. (See, for example, this CDC website.) The best of these will include assessments of your organization’s readiness to implement a particular practice or program, as well as guidance on how to adapt EBPs without sacrificing the quality of outcomes.
If you can’t find any implementation resources online, try calling the organization that pioneered the particular practice to gather this information directly. Also, if a particular funder is asking you to employ specific EBPs, be sure to include staff training and technical assistance in your funding request.
5. Measure whether it’s working, and keep learning.
Once you’ve adopted an EBP for your organization, be sure to evaluate what’s working. At a minimum, staff should come together regularly to reflect on implementation, identify successes, and discuss intended adaptations as well as unplanned ones that occurred in practice.
If you hire an external evaluator, ask them to examine how the program was implemented. This information will help you decide whether you have implemented the practice as you intended, as well as whether it is achieving the desired outcomes.
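As a minimal sketch of what that regular reflection might track, assuming entirely hypothetical cohorts, fidelity scores, and benchmarks, the snippet below flags cohorts whose delivery drifted from the plan or whose outcomes fell below the EBP’s published results.

```python
# A minimal sketch (not from the article) of the kind of ongoing tracking
# step 5 describes. Names, numbers, and thresholds are all hypothetical.
from statistics import mean

# Each cohort record: fraction of sessions delivered as planned (fidelity)
# and fraction of participants meeting the program's outcome target.
cohorts = [
    {"name": "2024-Q1", "fidelity": 0.92, "outcome_rate": 0.58},
    {"name": "2024-Q2", "fidelity": 0.71, "outcome_rate": 0.41},
]

BENCHMARK = 0.50  # outcome rate reported in the EBP's original studies (made up)

for c in cohorts:
    flags = []
    if c["fidelity"] < 0.80:
        flags.append("low fidelity -- check for unplanned adaptations")
    if c["outcome_rate"] < BENCHMARK:
        flags.append("below published benchmark")
    status = "; ".join(flags) or "on track"
    print(f'{c["name"]}: fidelity {c["fidelity"]:.0%}, '
          f'outcomes {c["outcome_rate"]:.0%} -> {status}')

print(f"average outcome rate: {mean(c['outcome_rate'] for c in cohorts):.0%}")
```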
Hey, wait a sec… We’re already doing that!
In the course of looking for EBPs in your field, you may find that your organization is already using practices that have been shown to be effective. In this case, pat yourselves on the back. You’ve just identified a way to show your funders that your organization’s services are consistent with the latest findings about what works.
Given the current economic environment, nonprofits are going to face increasing pressure to demonstrate that their programs are evidence-based. Whether you’re interested in preventing the spread of HIV, educating young children, or helping young people connect with employment and education, knowing the EBPs in your field will put you at an advantage when it comes to talking with funders.
Readers: Do you agree with Clare? Disagree? What have been your successes and frustrations? Clare will respond to comments posted here.
About the Author
Clare Nolan is vice president of Harder+Company Community Research, a California consulting firm that specializes in research and strategic planning for the nonprofit, philanthropic and public sectors.
Comments

So basically you model something after something someone else has done, but you need an expensive professional "evaluator" to hold some focus groups and give a blessing to the simple idea of holding meetings with clients less often! EBP is just another way of keeping established ways of doing things in place and discouraging innovation.
This is a subject of special passion for our nonprofit, which uses acupuncture, nutrition education, nutritional supplements, tai chi, qigong, yoga, and Emotional Freedom Technique (check it out at www.emofree.com – a fantastic quick method of eliminating pain and PTSD) in multiple drug and alcohol treatment sites.

Our nonprofit has been called “an integral part of Sacramento County’s effort to reduce drug-related crime and ensure public safety” by the county district attorney and has received other accolades from administrators in the state’s dept. of alcohol and drug programs, as well as from consultants to the legislative health committees for the senate and assembly. I can hear the eyebrows furrowing from here, though, when these time-tested and even scientifically studied methods of improving brain chemistry are mentioned in conventional drug treatment facilities. When you are outside the conventional box, it can take longer than our clients have time to wait to convince administrators to change business as usual, even with studies showing it works.

Luckily, this year, the state’s Administrative Office of the Courts has proven that our most extensive program keeps 83% of graduates out of the criminal justice system in the two years following graduation. That’s a lot of jail bed days saved, so maybe there will be more replication. Maybe. We’re hoping a conference in April will draw more attention to our successes, with or without evidence-based studies from a bazillion different peer-reviewed journals. www.carasac.org
One reason I thought Blue Avocado readers would find Clare’s reflections thoughtful and her suggestions helpful is that she doesn’t take an all-or-nothing perspective. Thinking critically about EBP in our field or issue area can help sharpen how we do our work; these guidelines give people in nonprofits a chance to learn from others’ experiences, especially if we don’t feel duty-bound to follow them slavishly. Even if some EBPs fall short, why throw the baby out with the bathwater?
Lynora Williams, Blue Avocado Senior Editor
There is an interesting parallel, I think, with the end of the 19th century, when the ideas of the efficiency expert Frederick Taylor ushered in the era of tight controls over workers in manufacturing and changed the face of industrial production. Part of the change that Taylorism brought was to put into place a whole new class of professionals and middle managers who analyzed, measured, and assessed the inputs and outputs of those doing the hard physical work that manufacturing involved, thereby helping to build a new middle class. There were undoubtedly benefits, but there were also great costs that accompanied Taylorism, one of which was the division of labor between those who worked the lines and those who evaluated how well the work was done.

When I look at what’s happening with EBP in the nonprofit sector, I can’t help but see very strong parallels. I see it giving rise to a new class of evaluation professionals adding one kind of value, to be sure, but at costs that have not been well articulated, in part because there is no organized, systemic structure to articulate them.
It took the better part of a century before management took a new, hard look at the Taylor ethos. By the close of the 20th century, the command-and-control ideas that undergirded Taylorism had been challenged by other, more democratic approaches to efficiency, such as quality circles. The old command-and-control ideas have come in many ways to be viewed as an impediment to innovation, flexibility, and effective responses to highly changeable environments.

I see EBP as mirroring the need for command and control. That’s why the term "evidence based" sounds so compelling. It appeals to systems that thrive on control. While recognizing the need for informed action, I also find the EBP trends deeply troubling in the context of nonprofit work that involves people without power, connections, or resources of their own to fight back against creeping (or galloping) professionalism. Does anyone remember The Careless Society by John McKnight, written in the 1980s? It applies more than ever.
The McKnight book was published in 1996.
As an evaluator, I have the same mixed feelings about EBPs alluded to in some of the comments. On the one hand, they can be stifling for nonprofits doing new and innovative work. On the other hand, I have also seen instances of funders and nonprofits supporting or implementing interventions that have been shown to be ineffective or weak.

One of the reasons I wanted to write this piece is that we are increasingly being asked by funders to evaluate grantee compliance with EBPs. In some cases, funders don’t really understand what they are asking for. For example, there may not be an "evidence basis" for a particular intervention because of the high standard of proof required, or because it represents a new, and therefore untested, practice. In other cases, funders mistakenly believe that an intervention needs to be replicated exactly rather than adapted and tailored to local needs. Nonprofits, of course, are the ones that have to bear the brunt of this trend.
Clare Nolan
EBP is of course a good idea, but it’s only a pretentious way of telling people to "do good things that are also proven to be good."
The unfortunate aspect of the whole EBP excitement is that those promoting it don’t really understand what they’re saying, or what the implications are of tying funding explicitly to an already-existing evidence base.
At a conference in the past year, a CDC director’s plenary speech proclaimed the mantra "Evaluate, Evaluate, Evaluate." At the end, when I asked if they knew the total annual government budget available for this type of evaluation research, they had no idea. The answer was approximately $100,000 – a completely insufficient sum.
The implication of this is that no truly innovative projects will be funded.
From my agency’s experience, EBPs serve primarily as a cash cow for research psychologists. If an EBP does not have a "train the trainers" model, and instead requires the provider to pay for ongoing training, support, etc., with no possibility of ending the relationship without losing "certification," that is a huge red flag to steer clear of it. It is not about best practices for serving people in need, but rather about serving their bottom line.
The science to watch, and there is plenty of it, is "practice-based evidence." This entails researchers identifying the elements or components of evidence-based practices that return the same results. It’s analogous to open-source software vs. proprietary software. There is "scientific validity" without the fiscal blackmail!
Thanks for sharing this. Where can Blue Avocado readers go to find out more about practice-based evidence?
My “bible” for EBR is *The Voice of Evidence in Reading Research*, by Peggy McCardle and Vinita Chhabra (2004).
http://tinyurl.com/l6uoxy
True, the book is dated, and it addressed NCLB, but it explains “evidence-based” concepts, which stem from “research-based” models from the National Institutes of Health.
The Bush administration, spearheaded by Darv Riddick’s think tank at U. Houston, brought “research-based” xyz to the funding forefront. The term “evidence” replaced “research” after someone in Washington read *How to Lie with Statistics*. :-)
I’m afraid EBR, PBE, PBR, and the 16 possible acronyms are with us as long as the money power is in the hands of engineers (think computer-wealth foundations) and statisticians (federal and state money) rather than social research types.
BTW, EBR has done little to improve reading among students in the US. In fact, it may have contributed to the 40% dropout rate suffered by most of our large urban areas.
Damun Gracenin, pundit
American YouthWorks
Austin, TX
I agree to a point.
The more challenging the intervention, the more difficult the training of the trainer will likely be. I have been hired to do a 6-day training on something that takes most people 2 years to master (and some will never master it). I couldn’t say that after the 6 days all the trainers could do it, but sure enough they were "teaching it" (despite my statements that they hadn’t mastered it). To understand something and then to implement it – this isn’t something that can be taught overnight (or in a week), or without standard review, or on the cheap. You can’t photocopy my materials and get the same effects, and you must follow the implementation guidelines and not cut corners.
I make much more money from a successful implementation, repeat business, and reputation than I do from repeating one program with different groups. I’ll be held accountable, but so are you – to fidelity and to the decisions your organizations make. The balance is the hard part, and it is even harder to say "no" to a contract when those situations come up (no, I’m not a pompous ass), but I have learned I must, or I keep paying and paying (and so do you) – trust me. Guided practice is important, and so is following the steps, surveys, trainings, etc. (at least the good ones).
I have done my homework: not only have I studied the successes (or implemented them) but the failures as well! I call them as I see them – I think that is what you are really paying me for. It is up to you to listen, reflect, sweat, and spend time and money.
I believe the author has confused EBTs with EBP. While evidence-based treatments (EBTs) are interventions that have been proven effective through rigorous research methodologies, evidence-based practice (EBP) refers to a decision-making process that integrates the best available research, clinician expertise, and client characteristics. EBP is an approach to treatment rather than a specific treatment.
This is an important distinction that needs to be made, and one that is perhaps clearer in the medical fields. However, I would argue that there is no standard definition in the community health and social services arena and that the term is often ill-defined and misunderstood. As the California Institute for Mental Health notes, much like the terms “model program” and “best practices,” evidence-based practice “is frequently used to mean different things.” (http://www.cimh.org/Initiatives/Evidence-Based-Practice/Definitions-Resources.aspx)