
Clinician’s Corner 2018

The EBP Issue
Texas Institute for Child & Family Wellbeing

UNITING SOCIAL WORK RESEARCH & PRACTICE


Letter From Our Director

Dear readers:

I am incredibly excited to share our second issue of Clinician's Corner with you. As a researcher and social work practitioner, I often see a disconnect between the research on practice published in academic journals and what practitioners are actually doing to improve the lives of their clients. Last year, our team set off on a unique journey to bring researchers and practitioners together through the production of our first Clinician's Corner. Our goal is to create an ongoing dialogue between researchers, practitioners and policymakers. In this second issue, we have focused on evidence-based practice, and we are proud to have several pieces co-written by researchers and practitioners to facilitate this dialogue.

Evidence-based practice has long been a source of conversation. On one hand, measuring and disseminating research evidence related to client outcomes has the potential to inform best practices and improve services to clients. On the other hand, the current use of evidence-based practice models has contributed to feelings of frustration for many in the social work field. For example, academics may feel that practitioners should utilize the evidence they produce in academic journals, even though practitioners may be ahead of the research. For practitioners, using evidence-based practices can be stifling because of the perception that it can constrain clinical judgment. For policymakers, the use of evidence-based practices is often mandated in an effort to "guarantee" a defined result. Finally, for administrators, evidence-based practices can be costly pre-packaged models required by funders.

In our first article, my colleague Dr. Danielle Parrish from Baylor University, an expert in social work evidence-based practice, and I address some of these issues and the chaos we see in how research evidence is defined and used. Our argument is that evidence-based practices should not be conceptualized as static models. Rather, evidence-based practice is a process in which you, as the practitioner, review the available research and then use your clinical judgment to make the right choices with (not for) your client.

To help you through this process, we have articles in this issue written by researchers with clinical backgrounds to provide guidance on utilizing clearinghouses, making cultural adaptations to evidence-based models and the process of building evidence for an intervention. We also have several pieces that were authored by practitioners and researchers as a team. Barbara Jefferson from the Center for Child Protection offers insight into thinking through what a client needs and making the right choice of interventions. Debbie Okrina from Spirit Reins dialogues with our Associate Director, Dr. Beth Gerlach, about reservations with evidence-based practices and how the clinical relationship has to be at the forefront of practice. Finally, Rob Thurlow of Lifeworks discusses supervision in the context of evidence-based practices with Dr. Patrick Tennant.

Our intention is to provide tools and dialogue to help practitioners be informed consumers of evidence-based practice. Despite the breadth of the content we are providing, we recognize that we cannot capture all the complexities and nuances of evidence-based practice. However, our primary hope is that you walk away from this issue understanding that you and your client are at the heart of the evidence-based process, because you and your client have to decide what is best for their wellbeing. As you read, we invite you to be in touch with us to share your experiences with evidence-based practice so we can continue this conversation. We believe you are the expert on your practice, and we want to know about your experiences, hear your questions and work together to find solutions. Look for future Facebook Live sessions addressing these topics and please keep in touch with us.

Sincerely,

Monica Faulkner, Ph.D., LMSW, Director & Research Associate Professor, Texas Institute for Child & Family Wellbeing, Steve Hicks School of Social Work, University of Texas at Austin



Table Of Contents

LIST OF CONTRIBUTORS

EVIDENCE-BASED PRACTICE IS A PROCESS

HOW TO FIND THE RIGHT EVIDENCE-BASED PRACTICE USING EBP CLEARINGHOUSES

EVIDENCE-BASED DOES NOT MEAN ONE SIZE FITS ALL

HEART WORK AND HEAD WORK: A CONVERSATION ABOUT RELATIONSHIPS, SELF-AWARENESS AND EVIDENCE-BASED PRACTICE

THE IMPORTANCE OF EFFECTIVE CULTURAL ADAPTATIONS TO EVIDENCE-BASED INTERVENTIONS

"MAKING IT THEIR OWN": SUPERVISING THE IMPLEMENTATION OF EBPS IN CLINICAL PRACTICE. AN INTERVIEW WITH ROB THURLOW, LCSW OF LIFEWORKS

HOW TO CHANGE THE WORLD ONE EBP AT A TIME: A CLINICAL RESEARCHER'S JOURNEY TO EVIDENCE


2018 Clinician's Corner Contributors

TINA ADKINS, PH.D. (AUTHOR)
SWETHA NULU, MPH (EDITOR)
JOLYNNE BATCHELOR, PH.D., LCSW (EDITOR)
DEBBIE OKRINA, LCSW (AUTHOR)
MONICA FAULKNER, PH.D., LMSW (AUTHOR)
DANIELLE E. PARRISH, PH.D. (AUTHOR)
BETH GERLACH, PH.D., LCSW (AUTHOR & EDITOR)
RUBÉN PARRA-CARDONA, PH.D. (AUTHOR)
BARBARA JEFFERSON, LCSW (AUTHOR)
PATRICK S. TENNANT, PH.D., LMFT (AUTHOR & EDITOR)
KATE MCKERLIE, MSSW, MPH (COORDINATOR & DESIGNER)
ROB THURLOW, LCSW (AUTHOR)


Evidence-based Practice is a Process

MONICA FAULKNER, PH.D., LMSW, DIRECTOR & RESEARCH ASSOCIATE PROFESSOR, TEXAS INSTITUTE FOR CHILD & FAMILY WELLBEING, STEVE HICKS SCHOOL OF SOCIAL WORK, UNIVERSITY OF TEXAS AT AUSTIN
DANIELLE E. PARRISH, PH.D., ASSOCIATE PROFESSOR, DIANA R. GARLAND SCHOOL OF SOCIAL WORK, BAYLOR UNIVERSITY, HOUSTON GRADUATE EXTENSION

IMPORTANT TO KNOW

• While there are many benefits to the development and use of evidence-based practices, there are significant costs associated with the current systems for establishing and utilizing them.

• Those costs manifest in a variety of ways, but too often lead to clients and communities being served by programs and practices that do not "fit" them and agencies feeling unnecessarily restrained.

• A transition to an evidence-based process, as recommended here, would always prioritize the needs of the client or community while also entailing a deep understanding of the research process and findings so that they could be effectively melded with the clinical expertise of the practitioner.

We all want to do what is best for our clients. The best thing should always involve grounding our practice in the best research evidence we have on how to meet our client's specific needs. Unfortunately, discussions about evidence-based practice often devolve into "evidence-based chaos." Policymakers and funding agencies are demanding that certain percentages of programs be "evidence-based," judges may only refer to "evidence-based" programs, agencies are desperately trying to figure out how to afford evidence-based practices and practitioners fight battles to adapt evidence-based models so that they work for their clients. If we truly understood evidence-based practice as a process, this chaos would not be negatively impacting our practices, policies and agency decisions.

In fairness to practitioners and policymakers, many academics who have driven discussions of evidence-based practice have done a poor job of using a common definition. In some cases, academics have referred to evidence-based practice as using a model that has been deemed "evidence-based" after being tested through a randomized controlled trial or equivalent research design. When we designate individual models as evidence-based, we create chaos because we assume that the same intervention or program will work for everyone regardless of their culture, values and preferences. We also assume that we can bring the "evidence-based" program or intervention to full scale without taking proper stock of the cost, resources and training required. Consequently, this definition can contribute to the evidence-based chaos by misrepresenting the true intentions of the original evidence-based practice model, which highlighted the importance of integrating the best available research with the client's background, culture and preferences, as well as professional practice expertise. The consequences of this chaos can include wasted time and money, potentially lower levels of client engagement, partially implemented programs and clients who do not receive the very best services.

WHO DECIDES WHAT IS EVIDENCE?

First, we need to understand who decides something is evidence-based. In the case of teen pregnancy prevention, the federal government has created a list of evidence-based programs with the input of research think tanks. Most people assumed that the evidence behind these programs was that they reduced teen pregnancy. However, because preventing teen pregnancy takes years to study, most of the programs only had evidence that they impacted short-term outcomes like increased knowledge about prevention or increased intention to not engage in sexual activity. These outcomes do not necessarily lead to changes in risky behavior. To be considered evidence-based at this time, a program needed to show a statistically significant change in just one short-term outcome. Statistical change alone tells us whether those who got the intervention improved more than the comparison group, but provides no information on the degree of difference. When we know how different they are, we can better discern whether an intervention is likely to result in meaningful change for our clients. As discussed below, this is an important consideration when selecting an evidence-based program. To be fair, these determinations make sense, as most research has not been adequately funded to collect long-term outcomes and this research is only emerging. However, once something is designated evidence-based, the social services field is inclined not to question the research behind the evidence. New research evidence is always changing and emerging, and should be considered periodically to ensure best practice. As we learned from years of funding abstinence-only sex education, we may know better, and can do better, when we take this research into consideration.
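The difference between a statistically significant change and a meaningful degree of change can be made concrete with a small amount of code. Here is a minimal sketch (in Python, using the common numpy and scipy libraries; the groups, means and scores are all invented for illustration) showing how a comparison can be "statistically significant" while the effect size, which speaks to the degree of difference, stays small:

import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical post-test scores on a prevention-knowledge scale (0-100).
# The intervention group averages only a couple of points higher.
intervention = rng.normal(loc=62, scale=10, size=400)
comparison = rng.normal(loc=60, scale=10, size=400)

# The t-test answers: "did those who got the intervention improve more
# than the comparison group?" With large samples, even a tiny gap can
# come out "significant."
t_stat, p_value = stats.ttest_ind(intervention, comparison)

# Cohen's d answers the practice question: "how big is the difference?"
pooled_sd = np.sqrt((intervention.var(ddof=1) + comparison.var(ddof=1)) / 2)
cohens_d = (intervention.mean() - comparison.mean()) / pooled_sd

print(f"p-value: {p_value:.4f}")     # likely below .05: "significant"
print(f"Cohen's d: {cohens_d:.2f}")  # around 0.2: a small effect in practice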

WHO CONTROLS ACCESS TO EVIDENCE-BASED PROGRAMMING?

When we designate something as evidence-based, it often becomes a "golden ticket." Programs that have invested time and money to create an evidence-based practice now have ownership and control over intellectual property that often must be purchased by others. Funders who require grantees to use these practices are essentially requiring programs to spend potentially thousands of dollars on purchases and trainings. In many cases, grantees benefit from high-quality tools and information. However, they can also struggle to keep their staff trained when there is turnover or when they can no longer afford to purchase licenses and trainings. As funders and practitioners, it is essential to adequately budget for training, implementation and sustainability costs when requiring evidence-based programming. Likewise, funding sources and practitioners may also want to assess the cost-effectiveness of various approaches, as some programs offer many of their materials and trainings at much lower cost than others.

BUT WHAT IF THE EVIDENCE DOESN'T MATCH THE NEED OF THE CLIENT?

The most chaotic factor in our use of some evidence-based practices is that the models and curricula may not match what our clinical judgment tells us is the best fit for our clients, and we may have little means to adapt. Anyone using an evidence-based model typically has to implement the model with fidelity, meaning that any modifications have to be approved. In some cases, fidelity allows tailoring aspects of the intervention while maintaining its primary components. In other cases, it means reading a script word for word during a home visit, or presenting sexual health content without the ability to modify that content for youth who have likely experienced sexual abuse. Our professional expertise should not be put aside because an evidence-based model requires a rigid approach to client interaction. When we "start where a client is," we have to adapt and use our clinical skills in order to practice ethically.

GETTING OUT OF THE CHAOS BY GETTING BACK TO THE PROCESS

The way we are approaching evidence-based practice has created chaos around what evidence is, the lack of accessibility of evidence-based models and the lack of ability to use our clinical knowledge in ways that benefit our clients. The good news is we can pull ourselves out of this chaos by using the originally intended definition of evidence-based practice. Instead of only accepting individual practice models or curricula as evidence-based, the field should utilize the evidence-based process. In clinical practice, an evidence-based process means we integrate the best research available with our clinical expertise related to our client's history, preferences and culture. The National Association of Social Workers defines this process as: "creating an answerable question based on a client or organizational need, locating the best available evidence to answer the question, evaluating the quality of the evidence as well as its applicability, applying the evidence, and evaluating the effectiveness and efficiency of the solution." For social work, the process should include the research evidence, the clinical state of the client, your clinical expertise and the client's preferences. In this process, there is no assumption that your expertise or the research evidence is more important than the client's preferences. Most commonly, the process is characterized by five steps: ask, acquire, appraise, apply and assess. Combining this process with the Haynes et al. model of decision-making, social workers can utilize this simple guide for evidence-based practice.

The first step is to ask. Although this step seems obvious, far too often experts come into treatment and communities assuming they know what is best. Communities may have a model funded and attempt to serve everyone with the same model. A core social work value is that we assume the individual or community is the expert on their experience. If you are working with an individual, this is the point where you ask questions like: What brought you in for services? What do you hope will get better? How do the people around you view treatment? What does your support network look like? If working with a community to roll out an intervention, the ask is critical. You need to know what the community values, what the community wants, what has or has not worked before and what the strengths of the community are.

After you know the needs, you acquire information. This is the point where you look at the research evidence. Is there a practice model that fits with what the client or community wants and their background characteristics or culture? What is the research that has been done to show that the model works? Is it of good quality? After you find information, you appraise it, because we know one size never fits all. You should be asking questions about the theory behind a particular intervention, cultural relevancy for your client, generalizability of samples and expected outcomes. When thinking about outcomes, which seem most meaningful? Do the outcomes in the study suggest that there is clinically significant change? That is, does the change in outcome reflect what you, as a practitioner, would like to see in terms of improvement from your client? This is important, as a study can show significant differences between an intervention group and a control group even if the intervention group improves only slightly and stays below a meaningful clinical cutoff score (e.g., for depression). Most likely, the intervention you found will also need some modifications, and it is important to assess the degree to which such modifications can be made while achieving similar outcomes.

The next step is to apply the intervention. To do this, you consult with the client. Explain what resources and options you have found, including the limitations and expected outcomes. Part of informed consent should include what you found (or did not) in the research. Then, together, you decide how to proceed. This is essential to empower and engage the client in the helping process.

The final step is to assess how the intervention worked. In order to build evidence for your future practice, you should utilize a single-subject design so that you monitor progress with your client. If shared with the client, these graphs of outcomes over time can serve as a useful tool, highlighting changes that may suggest a need for clinical attention during a specific week, or for changing the intervention approach altogether. The importance of such evaluation is highlighted by the fact that good research evidence for an intervention does not mean the intervention will work for everyone. For example, a randomized controlled trial may suggest that 80% of an intervention group get better compared to only 20% of a control group. Your client could be similar to the 20% of the intervention group that did not get better, in which case an alternative approach may be needed. For a community intervention, program evaluation should be utilized to help build evidence for future practitioners and inform needs to improve implementation and fidelity.
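As a minimal illustration of the single-subject monitoring described above, the sketch below (in Python with the matplotlib library; the weekly scores and the cutoff of 16 are invented, not drawn from any particular instrument) plots one client's weekly scores against a clinical cutoff so that the graph can be reviewed together in session:

import matplotlib.pyplot as plt

# Hypothetical weekly scores on a depression screen (higher = worse);
# 16 is a made-up clinical cutoff for this illustration.
weeks = list(range(1, 11))
scores = [24, 22, 23, 19, 18, 21, 15, 14, 13, 12]
CLINICAL_CUTOFF = 16

plt.plot(weeks, scores, marker="o", label="Client score")
plt.axhline(CLINICAL_CUTOFF, linestyle="--", color="red", label="Clinical cutoff")
plt.xlabel("Week of treatment")
plt.ylabel("Depression scale score")
plt.title("Single-subject progress monitoring")
plt.legend()
plt.show()

# A bump back toward the cutoff (week 6 here) is the kind of change that
# may suggest a need for clinical attention during a specific week.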

[Figure: The Evidence-Based Practice Process. A five-step cycle: 1. Ask (start where the client is); 2. Acquire (what evidence and theory exist to explain the situation?); 3. Appraise (what is the best information about what will work best?); 4. Apply (implement the best choice); 5. Analyze & Adjust (did it work?). The process is influenced by the client's clinical state and circumstances, your clinical expertise, research and client preferences.]

PROCESS MATTERS

The confusion and chaos around evidence can improve if we change our language and our understanding of what evidence-based practice is. As practitioners, we have to be speaking with our clients, policymakers, agencies and funders about evidence-based practice as a process. Be prepared to explain how evidence derives from research, from the client's situation, from the client's perspective and from your expertise. Help present and advocate for alternative programs that take all of these factors into account by learning and utilizing the evidence-based practice (EBP) process. When we only rely on research and fail to take context and client factors into consideration, we fail to use and build evidence appropriately. •••

ADDITIONAL READING

For practitioners hoping to learn more about the EBP process, a useful online training resource is Evidence-Based Behavioral Practice: https://ebbp.org/

Drisko, J. W., & Grady, M. D. (2015). Evidence-based practice in social work: A contemporary perspective. Clinical Social Work Journal, 43, 274-282.

American Psychological Association. (2006). Evidence-based practice in psychology. American Psychologist, 61(4), 271-285. doi:10.1037/0003-066X.61.4.271

Haynes, R. B., Devereaux, P. J., & Guyatt, G. H. (2002). Clinical expertise in the era of evidence based medicine and patient choice. Evidence-Based Medicine, 7, 36-38.

National Association of Social Workers. (2010). Evidence-based practice for social workers. Washington, DC: Author.

Parrish, D. (2018). Evidence-based practice: A common definition matters. Journal of Social Work Education.

Spring, B., & Hitchcock, K. (2009). Evidence-based practice in psychology. In I. B. Weiner & W. E. Craighead (Eds.), Corsini's Encyclopedia of Psychology (4th ed., pp. 603-607). New York: Wiley.



How to Find the Right Evidence-based Practice Using EBP Clearinghouses

PATRICK S. TENNANT, PH.D., LMFT, RESEARCH ASSOCIATE, TEXAS INSTITUTE FOR CHILD & FAMILY WELLBEING, STEVE HICKS SCHOOL OF SOCIAL WORK, UNIVERSITY OF TEXAS AT AUSTIN

Databases of evidence-based programs and practices (jointly referred to here as EBPs) classify the quality of evidence supporting those EBPs and are intended to be accessible to non-researchers. These databases have been utilized to inform practitioners in social services and healthcare for over two decades. More recently, these databases have been transformed into web-based clearinghouses that are designed to be user-friendly and informative for those interested in employing EBPs but unsure about where to start. Understanding what clearinghouses are available and how to best use them can be of great benefit to direct practitioners who are interested in research-supported practices and interventions, and to directors trying to initiate a new program or update an existing one.

WHY DO IT:

As reviewed elsewhere in this issue, there are many potential benefits to understanding and using evidence-based programs and practices. There are also drawbacks to rigid structures and the false dichotomy of "EBP-or-not" designations. The best EBP clearinghouses are designed to address those drawbacks by displaying a wide breadth of programs and indicating the degree of research supporting each EBP. Placing EBP designations along a continuum more accurately reflects the true nature of the process of developing evidence for a program or practice. That continuum-of-evidence approach, and the ability to cross-reference the degree of support for a wide variety of programs and practices by target population, program focus and program features, make evidence-based clearinghouses immensely valuable for anyone in search of an intervention.

IMPORTANT TO KNOW

• EBP clearinghouses and databases exist to help researchers and practitioners sort through the evidence associated with a variety of programs and practices.

• The best EBP clearinghouses report on degrees of evidence, rather than the false dichotomy of "EBP-or-not" designations.

• The Results First Clearinghouse Database by The Pew-MacArthur Results First Initiative is a great place to start.

HOW TO DO IT:

A practitioner's implementation of an EBP is fundamentally contingent on their ability to source the necessary information and critically compare available alternatives. National social service associations have recognized that need and called for the infrastructure necessary to promote that ability. A variety of public and private organizations have taken up that call and created curated databases of information about programs, practices and the evidence supporting them. Many of these databases now exist, varying in their focus (i.e., some are targeted, others are more general), review procedures, user experience, how often they are updated, and so on. Certainly, there are benefits to the presence of many clearinghouses, but the number of choices can also be overwhelming and can stall well-intentioned searchers. A great place to start is the Results First Clearinghouse Database, a project of The Pew-MacArthur Results First Initiative: http://www.pewtrusts.org/en/multimedia/data-visualizations/2015/results-first-clearinghouse-database. While not technically responsible for reviewing programs itself, it functions as a clearinghouse of clearinghouses, compiling an enormous amount of information into a single, comparable scale so that users can effectively sort through the findings of eight evidence-based clearinghouses all at the same time. More information on other high-quality databases and clearinghouses can be found at: www.campbellcollaboration.org

WHAT TO DO WITH THE INFORMATION:

There are many things to consider when selecting and implementing an EBP. Clearinghouses and databases of EBP support should be utilized to make the process more effective and efficient. There may also be reciprocal benefit to the research findings available to the clearinghouses if more practitioners and program directors select EBPs for implementation, and thus create additional opportunities for evaluation of their effectiveness. Though federal government support for these clearinghouses may be waning, a number of high-quality alternatives exist and can be utilized in the absence of the SAMHSA National Registry of Evidence-based Programs and Practices. Several additional factors outside the purview of the clearinghouses must also be considered. For example, the importance of fit with the population, the intended outcomes and decisions about fidelity versus adaptation (addressed elsewhere in this issue) to the empirically supported practices and programs is worth exploring. A variety of implementation strategies also exist and should be reviewed during the EBP selection process. •••

ADDITIONAL READING

http://evidencebasedprograms.org/

Kounang, N. (2018, January 11). Feds freeze mental health practices registry. Retrieved from https://www.cnn.com/2018/01/11/health/fed-mental-health-registry-frozen-bn/index.html

Novins, D. K., Green, A. E., Legha, R. K., & Aarons, G. A. (2013). Dissemination and implementation of evidence-based practices for child and adolescent mental health: A systematic review. Journal of the American Academy of Child & Adolescent Psychiatry, 52(10), 1009-1025.

Powell, B. J., Bosk, E. A., Wilen, J. S., Danko, C. M., Van Scoyoc, A., & Banman, A. (2015). Evidence-based programs in "real world" settings: Finding the best fit. In Advances in Child Abuse Prevention Knowledge (pp. 145-177). Springer, Cham.

Powell, B. J., Proctor, E. K., & Glass, J. E. (2014). A systematic review of strategies for implementing empirically supported mental health interventions. Research on Social Work Practice, 24(2), 192-212.

Rousseau, D. M., & Gunia, B. C. (2016). Evidence-based practice: The psychology of EBP implementation. Annual Review of Psychology, 67, 667-692.

Soydan, H., Mullen, E. J., Alexandra, L., Rehnman, J., & Li, Y. P. (2010). Evidence-based clearinghouses in social work. Research on Social Work Practice, 20(6), 690-700.



Evidence-based Does Not Mean One Size Fits All

BARBARA JEFFERSON, LCSW, CLINICAL DIRECTOR WITH THE CENTER FOR CHILD PROTECTION

IMPORTANT TO KNOW

• Even with the best EBPs, there is usually a need to tailor the interventions to fit the particular needs of the family being served.

• Even the process of tailoring the intervention will vary between families.

• In order to best meet the needs of the family, clinicians should include a thorough assessment that can inform the intervention selected and how that intervention is tailored.

As clinicians, we are trained in excellent treatment interventions that are considered "evidence-based." These treatment interventions can work extremely well for some people, but not for everyone. In my decades of work with children and families who have experienced trauma, I have come to believe that a "one size does not fit all" treatment approach is critical if each child and parent is to gain optimally from the therapeutic treatment process. Clinical assessment should always drive appropriate treatment planning. Clinical assessment, and talking with the child and his or her family, is what enables us to individualize the treatment areas to address and to devise an individualized treatment strategy. When a treatment intervention is evidence-based, it is important to understand and recognize that no single treatment intervention is going to work for every child and every family impacted by trauma, though it can provide a good place to start.

In providing therapeutic services within a trauma-informed organization, we have to be mindful of providing the right interventions at the right time and with the right dose for a child's specific developmental needs, as well as the family's specific needs. This timing and dosing of the identified appropriate treatment interventions will be assessed throughout the treatment process for a child and their family.

Our agency had the fortunate experience of being trained in Phase 1 of the Neurosequential Model of Therapeutics (NMT) with Dr. Bruce Perry and the Child Trauma Academy. As a result, our organization and clinical team developed an understanding of how trauma impacts the developing brain of a child. We all develop the abilities to sit up, crawl, walk, talk, learn to control our behavior and emotions, develop relationships with others, and learn to reason and problem solve in a neurosequential process. We may be working with a child who is chronologically 10 years old but is developmentally functioning at the age of a 3-year-old. As a clinical program, we have been able to apply this neurodevelopmental assessment approach in our work with children impacted by trauma. This approach enables us to determine the areas of the brain most impacted by adverse experiences, as well as relational health factors. As a result, identification of types of treatment interventions to target areas of the brain to facilitate neurosequential development is made possible for each child.

For example, it is often thought that trauma-focused cognitive behavioral therapy is one of the best treatment options for individuals who have experienced trauma. However, if an individual has experienced developmental trauma and he or she is not able to connect higher-level cognitive functioning to their trauma experiences, cognitive behavioral therapy may not be as effective, despite the evidence that it works for many. Rather, we would look for ways to help a parent and child co-regulate through art, music, animals and/or play rather than jump into therapy that requires higher cognitive reasoning that the client is not ready for. As the child progresses within the treatment process, cognitive behavioral therapy can become a valuable and necessary treatment intervention at the appropriate time.

Specific treatment interventions will change as a child progresses throughout the treatment process using an individualized treatment approach. Whether it be through the Neurosequential Model of Therapeutics, the Trauma Symptom Checklist for Children, the Trauma Symptom Inventory or some other measure, it is critical that a thorough clinical assessment and our clinical judgment drive the identification of treatment goals. Thorough clinical assessment is then partnered with a review of the research evidence related to the presenting issues and expected outcomes in order to select interventions for each individual child and parent. Otherwise, we risk a mismatch of needs and services.

There are significant potential costs to not conducting a thorough assessment before intervening with an EBP. For example, a child who is withdrawn in school may behave that way because of past trauma. An intervention that fails to consider that may be ineffective or could even exacerbate the issue by re-traumatizing the child. In using trauma assessment measures, as well as a biologically and developmentally sensitive assessment approach, one would consider the timing and severity of both adverse experiences and relational health factors experienced by the child. In the framework of NMT, assessing the child's developmental risk factors and completing the brain map will help us determine the types of interventions that will be most useful to her based on her specific developmental needs.


•••



Heart Work and Head Work: A Conversation About Relationships, Self-awareness and Evidence-based Practice

A CONVERSATION BETWEEN:
DEBBIE OKRINA, LCSW, CLINICAL DIRECTOR, SPIRIT REINS

BETH GERLACH, PH.D., LCSW, ASSOCIATE DIRECTOR, TEXAS INSTITUTE FOR CHILD & FAMILY WELLBEING, STEVE HICKS SCHOOL OF SOCIAL WORK, UNIVERSITY OF TEXAS AT AUSTIN

Debbie: As a social worker, I'm interested in research. I love reading, learning and gaining new perspectives. But I have to admit that as a clinician, I have felt frustrated with the focus on evidence-based practice. I think the intention is noble. And I am all for effective practice. But I'm not convinced that evidence-based practice models lead to quality or even effectiveness. My experience in 20 years of practicing therapy and clinical supervision is that it is first and foremost about relationships.

Beth: As a clinician and researcher, I totally agree! I am often concerned that the reliance on using evidence-based programs and practices, especially when mandated by someone without clinical expertise, oversimplifies the likelihood of positive outcomes and change. Yes, ideally we want to provide services that have been demonstrated to work for a similar population with similar presenting issues, but that does not replace the therapeutic relationship as the key mechanism for delivering those services. In fact, most research on therapeutic interventions acknowledges that the relationship accounts for treatment outcomes at least as much as the particular treatment method.

Debbie: One of the most significant factors I've seen in creating a safe relationship for therapy is the therapist doing their own work. In order to have the capacity to attune to clients and create a safe relationship for healing, the therapist must be aware of her own past hurts, shame triggers and other barriers that take her out of connection. I've observed clinicians, myself included, staying in their heads, being task-focused and sometimes focused on the evidence-based model so much that they miss the opportunity to connect emotionally with a client. They miss the "heart work" that creates a secure relationship between therapist and client to facilitate healing and understanding. As a clinician, I often experience the need to "fix" my client. I want to have "the answer." Our brains even reward us for having "the answer." Sometimes focusing on the evidence-based model supports this part of me. In order to engage in that heart work, the therapist has to be attuned to the client and to themselves. It seems like the focus on evidence-based models can lead to the therapist remaining more in their head, thinking more about what to do next and how to apply the model. This can be more comfortable than working to get embodied, owning our own feelings and working outside of sessions on what comes up inside of us that needs healing, which might be a barrier to connecting with clients. I have observed that those of us who have college degrees, including therapists, are often more comfortable with thinking than feeling. We value data. If we could just find the answer, we believe we would be successful. But when we overly rely on the "head work," we run the risk of losing the "heart work." Maybe this affects researchers too?

Beth: Absolutely! That said, I have also felt frustration when I know an intervention is making a difference for clients but am struggling to determine how to best measure and collect data to provide the evidence. Sometimes the outcomes we measure for impact are hard to operationalize and observe (like: did resilience improve? did we prevent child abuse?). One way to address this is for clinicians and researchers to work together closely to develop an evaluation plan. Clinicians hold the expertise on the work that is occurring through the therapeutic relationship. In addition to specific client outcomes, evaluations can include measures of therapeutic alliance, use of quality supervision, clinician perspective, client feedback, adaptations to models based on client or clinician characteristics and additional strategies used beyond a specific model to complement services. Including measures such as these can enhance the usefulness of the evidence supporting a practice model or intervention. One of the most significant criticisms of the focus on evidence-based practice is that just because an intervention was able to demonstrate positive outcomes during an evaluation or trial, it is not necessarily easy to replicate in a "real-world" setting with a different set of client characteristics and clinician expertise. Thus, evaluations that include the most comprehensive understanding of all the "head work" and "heart work" that can impact outcomes could be more useful for the field in using evidence to determine the best fit for a client and a clinician. All of this speaks to the importance of strong partnerships between researchers and clinicians. Researchers need to listen to clinicians in order to better articulate the essential skills needed within the clinician and client relationship to deliver a specific intervention. While understanding processes like fidelity and dosage is important, including baseline information about engagement, trust and therapeutic alliance could help us all understand how clinician expertise in authentic relationships can work in partnership with the specific strategies offered by an intervention model.

Debbie: I agree with you. And I’m inspired by this conversation. I look forward to creative new ways to measure these important concepts. Now it seems obvious to me that cultivating relationships between clinicians and researchers is a beautiful way to make sure the focus on relationship doesn’t get lost. I have to admit that sometimes I stereotype researchers as being more in their head, but when I take some time to know them, it is usually so clear that they followed their heart into this work. I look forward to partnerships that help us improve the ways we heal ourselves, our clients and our communities. •••

ADDITIONAL READING http://societyforpsychotherapy.org/evidence-based-therapy-relationships/ Lambert, M. J., & Barley, D. E. (2001). Research summary on the therapeutic relationship and psychotherapy outcome. Psychotherapy: Theory, Research, Practice, Training, 38(4), 357-361.



How to Make Effective Cultural Adaptations to Evidence-based Interventions

RUBÉN PARRA-CARDONA, PH.D., ASSOCIATE PROFESSOR, STEVE HICKS SCHOOL OF SOCIAL WORK, UNIVERSITY OF TEXAS AT AUSTIN

WHAT IS THE NEED?

Historically, social workers have been at the forefront of serving individuals and families impacted by adversity and inequalities. However, low-income ethnic minority populations in the United States continue to experience widespread mental health disparities. A key approach to addressing these disparities is culturally adapting efficacious interventions that have been shown to work but remain in need of cultural refinement. Essentially, cultural adaptation refers "to the systematic modification of an evidence-based treatment (EBT) or intervention protocol to consider language, culture, and context in such a way that is compatible with the clients' cultural patterns, meaning, and values." Culturally adapting an evidence-based intervention maintains the essential components of the intervention while amending or including cultural references to increase relevance and engagement for the particular population being served. Researchers have evaluated the overall success of cultural adaptations by pooling the impact of many culturally adapted interventions across studies, a line of scientific inquiry known as meta-analysis. These studies have provided conclusive empirical evidence that culturally adapted interventions can be effective in reducing risk factors among vulnerable populations, as well as strengthening protective factors.
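For readers unfamiliar with meta-analysis, the core calculation can be sketched in a few lines. This example (in Python with numpy; the per-study effect sizes and standard errors are invented) shows one common fixed-effect approach, an inverse-variance weighted average across studies:

import numpy as np

# Hypothetical effect sizes (Cohen's d) and standard errors from five
# studies of culturally adapted interventions (all numbers invented).
effects = np.array([0.35, 0.20, 0.50, 0.15, 0.40])
std_errs = np.array([0.10, 0.15, 0.20, 0.08, 0.12])

# Weight each study by 1 / variance, so more precise studies count more
# toward the pooled estimate of how well the adaptations worked.
weights = 1.0 / std_errs**2
pooled = np.sum(weights * effects) / np.sum(weights)
pooled_se = np.sqrt(1.0 / np.sum(weights))

print(f"Pooled effect: {pooled:.2f} (SE {pooled_se:.2f})")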

HOW SHOULD CULTURAL ADAPTATIONS BE CONDUCTED?

There are several models of cultural adaptation that are useful. In our own work, we have modified interventions by following the guidelines of a well-defined cultural adaptation framework known as the Ecological Validity Model (EVM). Briefly, the EVM specifies the need to conduct content and intervention delivery adaptations by considering the following cultural dimensions: (a) language, (b) persons, (c) metaphors, (d) content, (e) concepts, (f) goals, (g) methods and (h) context. In order to accurately understand the specific cultural dimensions of an underserved population, we use Community-Based Participatory Research (CBPR) principles as a way to co-implement the process of adaptation with active participation of local community leaders, stakeholders and the ultimate beneficiaries of interventions.

MORE ABOUT COMMUNITY-BASED PARTICIPATORY RESEARCH (CBPR)

In 2004, the U.S. Department of Health and Human Services contracted with a professional work group tasked with synthesizing Community-Based Participatory Research (CBPR). This work group defined CBPR as a collaborative research approach to establish participation of the following entities:

• Communities affected by the issues,
• Representatives of organizations and
• Researchers who are studying the issue.

With this collaboration, CBPR specifically involves the following principles:

• Co-learning and reciprocal transfer of expertise,
• Shared decision-making power and
• Mutual ownership of the processes and products of the research enterprise.

IMPORTANT TO KNOW

• Evidence-based interventions not originally developed for underserved populations often need to be adapted to reflect the culture of the people you intend to serve.

• The cultural adaptation process includes attention to many dimensions of culture and involves active co-leadership with target communities, as well as precise feedback from the ultimate beneficiaries of adapted interventions.

• Interventions with a strong empirical base can maintain, and even enhance, their effectiveness through appropriate cultural adaptations.

CULTURALLY ADAPTING A PARENTING INTERVENTION FOR LOW-INCOME LATINO/A IMMIGRANT PARENTS: A CASE EXAMPLE

To illustrate this process, we will briefly describe a recent project to culturally adapt and empirically test an established parenting intervention for low-income Latino/a immigrants. Using CBPR methods, we first held several meetings with leaders from local mental health agencies, community organizations and churches to learn about their perspectives. Then we conducted a qualitative study with 83 Latino/a immigrant parents to learn in depth about their aspirations as parents, as well as their challenges. Based on the findings from this study, we learned that many Latino/a immigrant parents in the area were raised in adverse contexts in their countries of origin and were often exposed to harsh and neglectful parenting as children. Participants also provided detailed descriptions of the contextual challenges they experience as immigrants in the U.S., such as long working hours, language barriers and racial/ethnic discrimination. Finally, parents identified their most pressing parenting needs, such as their desire to instill cultural values in their children while utilizing safe and non-punitive parenting practices. Informed by this knowledge, we adapted the parenting intervention curriculum and conducted a randomized controlled trial to measure the impact of exposing caregivers of young children to a parenting intervention in which cultural issues, as well as contextual challenges such as discrimination, were overtly addressed. According to study findings, when compared to a wait-list control group, the culturally adapted version of the original parenting intervention, known as GenerationPMTO©, was effective not only in improving parenting outcomes and child behaviors, but also in reducing immigration-related stress among families. Most recently, we completed another related study and obtained similar outcomes with a sample of Latino/a immigrant families with adolescent children.

FINAL REFLECTIONS AND IMPLICATIONS FOR PRACTICE

Even though some evidence-based interventions were not originally developed with sufficient inclusion of minority populations, cultural adaptation research indicates that many of these interventions can have a positive impact with low-income minorities if they are culturally adapted to increase cultural and contextual relevance. Ultimately, culturally adapted interventions are likely to be effective if: a) the intervention being adapted has a robust empirical base demonstrating its efficacy and b) the cultural adaptation process involves active co-leadership with target communities, as well as precise feedback from the ultimate beneficiaries of adapted interventions. The challenge now remains to replicate these lines of applied research and service delivery in underserved communities across the U.S. and abroad. •••

ADDITIONAL READING

Alegria, M., Atkins, M., Farmer, E., Slaton, E., & Stelk, W. (2010). One size does not fit all: Taking diversity, culture, and context seriously. Administration and Policy in Mental Health and Mental Health Services Research, 37, 48-60. https://doi.org/10.1007/s10488-010-0283-2

Bernal, G., Bonilla, J., & Bellido, C. (1995). Ecological validity and cultural sensitivity for outcome research: Issues for the cultural adaptation and development of psychosocial treatments with Hispanics. Journal of Abnormal Child Psychology, 23, 67-82. https://doi.org/10.1007/bf01447045

Bernal, G., Jimenez-Chafey, M. I., & Domenech Rodríguez, M. M. (2009). Cultural adaptation of treatments: A resource for considering culture in evidence-based practice. Professional Psychology: Research and Practice, 40, 361-368. https://doi.org/10.1037/a0016401

Hall, G. C. N., Ibaraki, A. Y., Huang, E. R., Marti, C. N., & Stice, E. (2016). A meta-analysis of cultural adaptations of psychological interventions. Behavior Therapy, 47(6), 993-1014. https://doi.org/10.1016/j.beth.2016.09.005

Parra-Cardona, J. R., López-Zerón, G., Leija, S. G., Maas, M. K., Villa, M., Zamudio, E., Arredondo, M., Yeh, H. H., & Domenech Rodríguez, M. M. (2018a). Culturally adapted intervention for Mexican-origin parents of adolescents: The need to overtly address culture and discrimination in evidence-based practice. Family Process. Advance online publication.

Parra-Cardona, J. R., Leijten, P., Baumann, A., Mejía, A., Lachman, J., Amador Buenabad, … Domenech Rodríguez, M. (2018b). Strengthening a culture of prevention in low- and middle-income countries: Balancing scientific expectations and contextual realities. Prevention Science. Advance online publication.

Parra-Cardona, J. R., Bybee, D., Sullivan, C. M., Domenech Rodríguez, M. M., Dates, B., Tams, L., & Bernal, G. (2017a). Examining the impact of differential cultural adaptation with Latina/o immigrants exposed to adapted parent training interventions. Journal of Consulting and Clinical Psychology, 85, 58-71. https://doi.org/10.1037/ccp0000160

Parra-Cardona, J. R., López-Zerón, G., Villa, M., Zamudio, E., Escobar-Chew, A. R., & Domenech Rodríguez, M. (2017b). Enhancing parenting practices with Latino/a parents: A community-based prevention model integrating evidence-based knowledge, cultural relevance, and advocacy. Clinical Social Work Journal, 45, 88-98. doi: 10.1007/s10615-016-0589-y

Parra-Cardona, J. R., Aguilar, E., Wieling, E., Domenech Rodríguez, M., & Fitzgerald, H. (2015). Closing the gap between two countries: Feasibility of dissemination of an evidence-based parenting intervention in México. Journal of Marital and Family Therapy, 41, 465-481. doi: 10.1111/jmft.12098

Substance Abuse and Mental Health Services Administration (2015). Racial/ethnic differences in mental health service use among adults. HHS Publication No. SMA-15-4906. Rockville, MD: Substance Abuse and Mental Health Services Administration.



"Making It Their Own": Supervising the Implementation of EBPs in Clinical Practice

AN INTERVIEW WITH ROB THURLOW, LCSW, CLINICAL FIELD DIRECTOR AT LIFEWORKS



TXICFW: Thanks so much for taking the time to chat with us about supervising and EBPs. What experience do you have supervising students, interns, or new employees under Evidence-based Programs and Evidence-based Practices? How would you explain the difference between the two?

Rob: I have supervised interns and employees in programs which are evidence-informed, but not evidence-based, for approximately fifteen years. I would distinguish Evidence-Based Programs from Evidence-Based Practices based on my assumption that Evidence-Based Programs comprise, at least in part, Evidence-Based Practices.

TXICFW: What do interns generally struggle with when working under an EBP for the first time? Can you give an example? And what techniques have been useful in helping them to adapt to the structure of EBPs?

Rob: When developing Action Plans [an Evidence-Based Practice], interns often struggle initially to distinguish between client hopes/expectations, goals and action steps. I train them to think of hopes/expectations as what the client would like to see change by the end of treatment, SMART goals as the means for getting there and action steps as supports to goals being achieved, or ways of removing obstacles to goals being achieved. Interns have reported finding role-playing these conversations with me very helpful as part of training.

[Reminder: SMART stands for Specific, Measurable, Attainable, Relevant, Timely. SMART is an acronym for evaluating treatment goals, is an evidence-based practice and has been shown to support client identification of, and progress toward, desired changes.]

TXICFW: What is the hardest part of supervising interns working under an EBP? How do EBPs make your work as a supervisor easier?

Rob: Hardest: helping them "get" the formula for a SMART goal. Easier: knowing the students have a template they can rely on to develop useful goals.

TXICFW: Are interns generally aware of and comfortable with the reason (i.e., contract mandates, etc.) for implementing EBPs? Do they understand what makes EBPs "special"?

Rob: Students today are generally much more aware of EBPs than they were in years past, and they typically understand both why they exist and why they are helpful. There seems to be a parallel trend between the increasing requirement of the use of EBPs and students' education around EBPs.

TXICFW: What should EBP creators understand about supervisors that are trying to train interns in their program/curriculum?

Rob: It is helpful for EBP creators to keep in mind that supervisors are training interns in all aspects of providing counseling. As license holders, supervisors are the gatekeepers of their disciplines and are ultimately responsible for their students' caseload. Most interns have limited or no clinical experience prior to their internships. They are naturally anxious to learn agency and program policies and procedures, the program's clinical model and how to form therapeutic relationships with their clients. Supervisors are training the "whole" intern, and addressing both hard and soft skills.

TXICFW: In your experience, are interns more likely to be overly rigid or overly flexible in their implementation of EBPs?

Rob: I think the answer to this question depends in part on the interns' personalities. Some interns are very detail-oriented and want to follow instructions "to the T," while others prefer being more flexible and creative. It is also important to note that, because we utilize Evidence-Based Practices that require a degree of structure, we try to select interns with that attention to detail, organization and follow-through as part of their professional repertoire.

TXICFW: What is the most important thing for a supervisor to remember when they are training an intern under an EBP?

Rob: The more interns can observe and practice EBP skills before seeing clients, and the more they are encouraged to "make it their own," i.e., "develop their own style in practicing the skills," the easier it will be for them to work successfully with clients.

TXICFW: Have you encountered a situation in which an EBP would recommend one course of action in a particular case, but you feel another would be more appropriate? How would you recommend a supervisor handle that situation?

Rob: While I recognize the value of EBPs, I also recognize that clinical work is ultimately about the relationship between the clinician and the client, and meeting the client where (s)he is. I would probably try to nudge the client closer to the recommended course of action, but if the client felt strongly that (s)he wanted to do something different, I would respect that. Sometimes, after clients develop SMART goals, they modify the goal on their own between appointments. While this sometimes results in the goal no longer meeting the criteria to be a SMART goal, it is also an instance of clients "making it their own." This often results in a higher level of success than if the client had conformed to the original goal, and I applaud clients for taking ownership of their growth in this way.

•••

INTERVIEW BY Patrick S. Tennant, Ph.D., LMFT, Research Associate, Texas Institute for Child & Family Wellbeing, Steve Hicks School of Social Work, University of Texas at Austin
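For readers who like to see the Action Plan structure Rob describes laid out explicitly, here is a minimal sketch (in Python; the class names, fields and example goal are our own invention for illustration, not part of any LifeWorks curriculum or SMART training material) of how hopes/expectations, SMART goals and action steps can be kept distinct:

from dataclasses import dataclass, field

@dataclass
class SmartGoal:
    """A goal stated so that each SMART criterion can be checked."""
    description: str
    specific: bool
    measurable: bool
    attainable: bool
    relevant: bool
    timely: bool

    def is_smart(self) -> bool:
        return all([self.specific, self.measurable, self.attainable,
                    self.relevant, self.timely])

@dataclass
class ActionPlan:
    # What the client would like to see change by the end of treatment.
    hope: str
    # SMART goals: the means of getting there.
    goals: list = field(default_factory=list)
    # Supports to goals being achieved, or ways of removing obstacles.
    action_steps: list = field(default_factory=list)

plan = ActionPlan(
    hope="Feel less overwhelmed at school",
    goals=[SmartGoal("Use one coping skill before each exam for six weeks",
                     specific=True, measurable=True, attainable=True,
                     relevant=True, timely=True)],
    action_steps=["Practice the skill in session", "Post a reminder card"],
)
print(all(g.is_smart() for g in plan.goals))  # True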



How to Change the World One EBP at a Time: A Clinical Researcher's Journey to Evidence

TINA ADKINS, PH.D., RESEARCH ASSISTANT PROFESSOR, TEXAS INSTITUTE FOR CHILD & FAMILY WELLBEING, STEVE HICKS SCHOOL OF SOCIAL WORK, UNIVERSITY OF TEXAS AT AUSTIN

As clinicians, we have all likely heard the term "Evidence-Based Practice," or EBP. However, when I became a clinical researcher, I learned just how challenging it can be to both create an intervention and build evidence for it. I would like to share a few hard-earned lessons with you, in hopes that they will give you a better understanding of what it really takes to create an EBP. My story starts while I was working on my Ph.D. I knew I wanted to create a practical psychoeducation intervention for foster parents, one that really helped them understand foster kids on a deep level and that would help build their attachment relationship with these children. I researched what was already working so I could build both the intervention and the evidence from there.

FIRST LESSON:

Building a successful intervention must start with a solid theory of change and, ideally, with some evidence already established in another context. I was working with clinicians who were both using and researching a psychoanalytic therapy called Mentalization Based Therapy for Families (MBT-F). This intervention was being used clinically with foster and adoptive families at the Anna Freud Center in London, and results were quite promising. Researchers had already conducted studies showing evidence for this clinical intervention. So this is where I started, knowing I would use the theory of mentalization, as well as parts of this clinical intervention.

SECOND LESSON:

Know your theory VERY well. Luckily, I was studying this theory in graduate school, so I had already gained this in-depth knowledge. To build an intervention based on theory, I also needed to understand the other interventions that use this theory. So, I trained in several versions of this clinical intervention. I gathered as much information as I could. As a result, I discovered that all versions of MBT had a psychoeducation component! This was a great discovery, as it provided me with many ideas I used to build my intervention. I was able to pull from what already worked to create something innovative.

THIRD LESSON:

Pilot test (i.e., run a small trial of) the intervention to get real-world experience of what works and does not work. This is a very important process that really helped me hone the intervention and create research processes that work. For example, I found it very difficult to get foster parents to attend many classes or a long intervention, so I structured my intervention in as few classes as possible (for my intervention, this was three classes). Also, attendance was generally low, so I needed to talk to participants and really find out what would draw them in. In my study, foster parents desperately needed child care to be able to attend a three-class series. All of these discoveries were essential to the success of both the intervention and my study. Running a well-thought-out pilot of the intervention, and gathering careful feedback from all participants as well as any staff involved, is a vital part of creating an EBP.

These are just a few of the lessons I learned during my work on both creating an intervention and building evidence for Family Minds. There are many more steps necessary, but if these basic building blocks are present, you have a much greater chance of building the evidence needed to finally call your intervention an EBP. Have an idea for an intervention? Your clinical knowledge is your wellspring. You already have what it takes to develop an intervention! To further your ideas, pair with a social work or psychology graduate student who is interested in research, and together you could change the world, one EBP at a time.

•••



IMPORTANT TO KNOW

• A thorough review of the current research literature is necessary to help you decide what works and how to build your intervention.

• It is critical to have a solid understanding of the theory on which your intervention is based.

• A small pilot test of an intervention helps you figure out what works and what does not, helping you fine-tune the final version.

FOR YOUR INFORMATION

Thinking about trying to measure client outcomes and build evidence for your intervention? Great idea! While a thorough description of the research process necessary to establish evidence is outside the scope of this article, an important potential takeaway is that engagement in that process is within reach for any interested and motivated clinician. As suggested, working with a social work graduate student or university professor is a great way to do so. You may also be able to help establish evidence on an existing or developing model by connecting with the model creators and offering to participate in a clinical trial. A thought exercise consisting of designing a test of your own therapeutic model(s) of choice (from choosing the sample, to identifying measures to track outcomes of interest and designing data collection procedures, through examining and reporting on trends in the data) is another way to learn about the process of establishing evidence. We encourage you to reach out for support and partnership!
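As a starting point for that thought exercise, here is a minimal sketch (in Python with numpy and scipy; all scores are invented, and the paired pre/post design shown is only one of many possible designs) of what examining trends in your own caseload data could look like:

import numpy as np
from scipy import stats

# Hypothetical pre- and post-treatment scores for one clinician's caseload
# (e.g., a 0-40 symptom checklist; all numbers invented for illustration).
pre = np.array([28, 31, 25, 33, 27, 30, 29, 26, 32, 24])
post = np.array([21, 27, 20, 30, 19, 22, 25, 18, 28, 20])

# A paired t-test asks whether the same clients' scores changed from
# intake to discharge. Without a control group it cannot rule out other
# explanations, but it is a reasonable first look at trends in your data.
t_stat, p_value = stats.ttest_rel(pre, post)

print(f"Mean change: {np.mean(post - pre):.1f} points")
print(f"Paired t-test p-value: {p_value:.4f}")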

ADDITIONAL READING

Want to know more about the nuts and bolts of Evidence-Based Practice? Go here: https://guides.mclibrary.duke.edu/c.php?g=158201&p=1036021

To read more about the first study of the Family Minds intervention, go here: https://txicfw.socialwork.utexas.edu/research/project/family-minds-attachment-based-mentalizing-psycho-educational-intervention-foster-adoptive-parents/

To read more about Mentalization-Based Therapies and the Anna Freud Center: https://www.annafreud.org/training/mentalization-based-treatment-training/

Visit www.familyminds.org to discover more about the intervention mentioned in this article, and find out more about Family Minds research here: https://txicfw.socialwork.utexas.edu/research/project/family-minds-attachment-based-mentalizing-psycho-educational-intervention-foster-adoptive-parents/



TXICFW.SOCIALWORK.UTEXAS.EDU
