Children's Voice Jan/Feb 2008

Walking the Walk, Not Just Talking the Talk

Eight Steps Toward Implementing Evidence-Based Practice

By Sue D. Steib and Wendy Whiting Blome

Talk about evidence-based practice has floated among child welfare administrators, practitioners, and academics for years. The conversations are peppered with terms like empirical support, proximal and distal outcomes, fidelity assessments, and generalizability.

What exactly are evidence-based practices? They are distinct activities an agency can incorporate into different types of service delivery models, activities with some level of empirical support indicating they are effective--in other words, they work. In the field of family services, for example, casework practices such as family-engaged service planning, strengths-based assessment, and structured family visitation have an empirical base. These practices have been researched individually but may not have been tested together as part of a specific model of practice. Every agency does assessments and plans, but most are not using evidence-based practices.

Evidence-based programs are clearly defined activities delivered as a whole. The models specify requirements for staff qualifications and training; type, intensity, and duration of contact with clients; and use of particular tools, techniques, and documentation. Their effectiveness has been demonstrated with a specific population addressing documented needs. Examples include Functional Family Therapy, Multidimensional Treatment Foster Care, and Multisystemic Therapy.

Evidence-based models often command attention, sometimes causing policymakers and practitioners alike to use them as a one-size-fits-all answer for the many needs of their service population. Tempting as this may be, a more methodical, holistic approach usually is better. If, for example, your agency's mission statement and policies are well targeted and aligned, but the agency lacks sufficient committed, qualified staff, no evidence-based model is going to compensate.

The Evidence-Based Practice Process

Evidence-based practice, as a way of approaching work in child welfare, implies more than just adopting practices or models that have demonstrated effectiveness. It is an inclusive process that poses thoughtful questions about needs, plans for specific outcomes, involves ongoing evaluation, and uses evaluation to revise and improve practice. Many agencies explore evidence-based practice, but few take the plunge to find the practice that will address problems identified in their organizations, then implement the change and evaluate the results. To move from talking the talk to walking the walk, agencies need to follow eight steps:
  1. Identify existing data sources. Assessing organizational needs begins with examining the data. All agencies have some data sources. Public and larger private agencies have sophisticated data tracking systems, while smaller organizations may have simpler computer-based or manual tracking tools. The key is determining what data exist and what the data do and do not tell you.

    Agencies also have the capacity to gather additional data to use as a basis for planning. Interviewing a group of supervisors, caseworkers, or clients; having supervisors poll their staff about programmatic issues; or reviewing a sample of case records may provide valuable additional information.

  2. Disaggregate the data by appropriate variables. Data provide more information when they are disaggregated based on key variables. If, for example, you are concerned about the time it takes for children in foster care to move to permanency, you will learn much more about the issue by disaggregating the data by children's permanency goals, ages, races, reasons for entry into foster care, courts of jurisdiction, and placement types, than by simply looking at the timeline for the entire population. More precise information gained through this examination allows you to be more targeted in the steps that follow.

  3. Look at policy, practice, and system issues based on data. The next step is to determine just what is happening to cause the problems pinpointed by examining the disaggregated data. Why are problems, such as delays or placement changes, occurring where they are? Are policies unclear or inefficient? Are resources inadequate? Are certain court jurisdictions rendering different decisions than others?

  4. Meet with the agency team to determine where problems exist. The combined knowledge and perspectives of agency managers and supervisors can help form the most complete and meaningful conclusions from the examination of data, policy, organizational structure, and capacity. Managers and workers are most likely to know the practical factors and details about how things are really working. Identifying discrepancies between policy and practice may reveal part of the problem.

  5. Review the evidence-based practice for the program area. The search for evidence-based practices should be directed not simply at the newest, most widely touted model, but at the evidence base that addresses the needs identified in Steps 1 through 4.

  6. Assess the current practice against the selected evidence-based practice. This step is also about being strategic. Before you abandon current practices in the program area you have identified, examine them against the key features of successful practices revealed in the research. Chances are you will find some of your current activities are already aligned with the research. Focusing on those that are not supported by the evidence base is more efficient than changing everything.

  7. Implement changes in policy or practice. Organizations, like people, can tolerate only so much change at one time. You have to be deliberate about what you decide to implement and how. Change implemented by e-mail won't work; change needs to be planned, measured, and monitored over time. Change must be nurtured until it is institutionalized into the agency's practice and process. Many change management schemes exist, but most incorporate multiple steps that include
    • defining the problem,
    • establishing a sense of urgency,
    • forming a work team,
    • creating a vision,
    • planning for short- and long-term implementation,
    • empowering everyone involved, and
    • monitoring, monitoring, monitoring.

    Evidence-based practice can be implemented without outside help except when the program is licensed and requires the agency to use established training, tools, or processes. For all agencies, however, enlisting a consultant may be helpful to ensure you are looking at the problem and the implementation design with fresh eyes that will allow you to build on agency strengths to meet agency needs.

  8. Evaluate the program. For many social workers and administrators, memories of research classes in a long-ago graduate program are not their favorite recollections. But if you or someone on staff is qualified, evaluation of your implemented evidence-based practice can be performed internally. Otherwise, you will need to enlist the help of an external evaluator. That person should be involved in the process early--back in Step 1, when you were identifying data sources, and Step 2, when you were disaggregating the data. Calling in an evaluator late may mean you have not collected essential data or established the control or comparison groups needed to determine whether the experimental group is experiencing a change.

    Agency evaluation must follow the rhythms of the organization, not interrupt services to children and families. An evaluator will explain to staff and managers the research design, the sampling plan, the data collection strategy, the analysis scheme, and the interpretation method. The evaluator will likely begin with a process evaluation, which determines whether the program or practice is being implemented as planned--or, put another way, asks, "Are we doing what we intended to do?" If you evaluate a practice without knowing whether you are delivering the evidence-based practice as designed, the resulting data will not be useful.

    Following the process evaluation is an outcome evaluation. This measures the extent to which your implementation of the program or practice is meeting the established goals. The goal of the evidence-based case planning process, for example, is to involve parents in developing the plan so they will feel ownership of it. Using established tools, you measure their level of commitment and success in completing the tasks outlined in the plan. Often in outcome evaluations, measures are taken at several points in time to see if the change is being maintained. Data analysis techniques are necessary to determine if the change observed is significant or if it could have occurred by chance.
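
    To make that last point concrete, here is a minimal sketch (illustrative only, not part of the authors' model) of the kind of test an evaluator might run at this stage: comparing a parent-engagement score between families served under the new case planning practice and a comparison group. The file name and column names ("group", "engagement_score") are hypothetical, and the choice of a simple two-group t-test is an assumption your evaluator would need to confirm against the actual design.

        # Hypothetical sketch in Python: did the new practice make a difference,
        # or could the observed change have occurred by chance?
        import pandas as pd
        from scipy import stats

        outcomes = pd.read_csv("case_plan_outcomes.csv")  # hypothetical case-level extract

        new_practice = outcomes.loc[outcomes["group"] == "new_practice", "engagement_score"]
        comparison = outcomes.loc[outcomes["group"] == "comparison", "engagement_score"]

        # Independent-samples (Welch's) t-test comparing the two group means
        t_stat, p_value = stats.ttest_ind(new_practice, comparison, equal_var=False)

        print(f"New practice mean: {new_practice.mean():.2f}")
        print(f"Comparison mean:   {comparison.mean():.2f}")
        print(f"p-value: {p_value:.3f}")  # a small p-value suggests the difference is unlikely to be chance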

Applying the Steps

Let's look at an example. Perhaps your examination of data in Step 1 reveals that moving children in your agency's foster care program to permanency is taking an average of 20 months. By going through Steps 2 through 4, you learn that children who return to their families and those who are discharged to relatives exit within 14 months, on average, but those who exit to adoption remain for up to three years. This discovery prompts you to examine the key decisions in the permanency planning process to determine where the source of the delay lies.
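
To show how the Step 2 disaggregation behind those numbers might look in practice, here is a minimal sketch (illustrative only). The file and column names ("exit_type", "age_at_entry", "months_to_exit") are hypothetical; any real analysis would draw on your agency's own data system.

    # Hypothetical sketch in Python: the overall average hides the pattern,
    # so break time-to-permanency down by exit type and age group (Step 2).
    import pandas as pd

    cases = pd.read_csv("foster_care_exits.csv")  # hypothetical case-level extract

    print("Overall average months to permanency:",
          round(cases["months_to_exit"].mean(), 1))

    # Count, average, and median by exit type (reunification, relative, adoption, ...)
    print(cases.groupby("exit_type")["months_to_exit"].agg(["count", "mean", "median"]))

    # Further disaggregation: exit type by age group at entry
    cases["age_group"] = pd.cut(cases["age_at_entry"],
                                bins=[0, 5, 12, 18], labels=["0-5", "6-12", "13-18"])
    print(cases.groupby(["exit_type", "age_group"])["months_to_exit"].mean().unstack())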

Timely adoption can be linked to all steps in the permanency planning process, so you will need to examine decision points, beginning with children's initial placement. Where, for example, are children who take the longest to be adopted placed when they enter protective custody? Would early placement with potential adoptive families allow these children to avoid multiple moves and attain permanence sooner? How consistent and thorough are efforts to identify and assess absent parents and relatives as placement resources? At what point are case plans being changed from reunification to adoption? Is planning being done sequentially, rather than concurrently, so that many months are spent working toward reunification before an alternative plan is identified?

Are procedural steps, such as moving the case from one unit to another or approving adoption subsidies, consuming more time than they should? Are most children relinquished voluntarily, or are almost all being made available for adoption through an involuntary termination of parental rights? How long does that process take, and where are the greatest delays along the way--in caseworkers' preparation of materials for attorneys, or in attorneys' preparation of petitions?

Suppose you went through the above process and learned that caseworkers have manageable workloads but lack the skills to involve families in developing alternative permanency plans that can be implemented quickly if it becomes clear that reunification will not be successful. That finding would suggest you provide supervisors and caseworkers with professional development focusing on evidence-based techniques that engage families in planning and decision-making. If, on the other hand, you found that permanency planning moves along efficiently, but a huge backlog in the approval of adoption subsidies is responsible for the delay, you have identified a procedural problem that cannot be addressed through changes in casework practice.

Every agency routinely makes changes in programs, practices, and procedures. Whether well planned or poorly conceived, those changes take time and money--two commodities in short supply. Picking evidence-based programs and practices improves the likelihood the change will bring positive results for children and families and for the agency. It isn't magic--evidence-based practice changes are like all other planned changes: They require planning, consistent monitoring, and quality evaluation. But, then, what's your option? You don't want to invest in changes that may not work. Better to take the time, follow the steps, and learn to walk the walk.

Sue D. Steib, PhD, is Director of CWLA's Research to Practice Initiative.

Wendy Whiting Blome, PhD, is an Associate Professor in the School of Social Service at Catholic University of America, Washington, DC, and former Director of CWLA's National Center for Research and Data.

