National Technical Assistance and Evaluation Center for Systems of Care
Challenges and Strategies in Implementing Accountability
The experiences of the nine grant communities involved in the Improving Child Welfare Outcomes Through Systems of Care initiative, the challenges they faced, and the strategies they followed to address them provide useful information to administrators nationwide for implementing accountability in a systems of care framework for change.
1. Unique role of evaluators in systems change
Evaluators can help system partners manage the complex, comprehensive, and synergistic nature of systems change work (Schorr, 2006). To do so, evaluators must be engaged and contributing participants in the ongoing work rather than simply observers. Because this is a departure from the traditional evaluator function, not all evaluation or agency professionals are comfortable with the extent of communication and daily involvement the engaged evaluator role requires.
- Use an internal evaluator. Contra Costa County (California) used grant funds to build its agency's internal evaluation capacity by employing a full-time evaluator. This led to greater use of agency data, not only to monitor and evaluate the system of care effort but also more generally within the child welfare agency and other offices, such as Temporary Assistance for Needy Families (TANF). Also, relationships developed by this internal position facilitated the aggregation and utilization of data across systems.
- Foster a good relationship and open communication between the evaluator and project staff. The local evaluator and project manager in Pennsylvania regularly updated each other about issues in and strengths of participating counties. The local evaluator participated in monthly technical assistance calls and meetings with project staff to maintain a focus on evaluation's role in daily practice and to assist agency staff and community partners with evaluation-related issues. This communication kept the local evaluator informed about the work of the counties and gave the project manager and other staff opportunities to provide feedback and contribute to the evaluation effort.
- Be creative about who participates in implementing accountability. Although the demonstration initiative required grant communities to contract with a local evaluator, much of the work related to accountability has been performed by project staff and collaborative members as they engage in activity tracking, data collection, and assessments. Creativity and inclusiveness in determining who participates in accountability can make the effort easier to sustain.
- Medicine Moon Initiative (North Dakota) hired a local evaluation coordinator within each of the Tribal child welfare agencies. The new position helped the Tribes develop the capacity to initiate their own evaluation plans based on local needs and interests. Turtle Mountain's Tribal child welfare agency worked with a group of community members who identified a shared need, developed a survey, administered it and collected the data, and then worked with the local evaluator to analyze the results. The community members communicated the survey results and recommendations within their networks.
- Contra Costa County used a State program to fund internships at county child welfare agencies for graduate students in social work. The interns participated in evaluation activities that might not have been conducted without the additional contribution of their time.
- Use system measures and process data to re-invigorate work before outcome changes are evident. In Kansas, evaluators noticed that motivation and satisfaction with the demonstration initiative's efforts declined because participants were focused on long-term outcomes that were not expected for several years. Refocusing collaborative and agency participants on system and process measures, along with their associated activities and accomplishments, and graphically showing how these could lead to long-term change helped reward and revitalize participants throughout the challenging work of systems change.
2. Working with agency data systems and sharing data across agencies
Collecting and sharing data can be impeded by resource constraints, inconsistent data entry protocols, lack of integrated data systems that cross organizational boundaries, confidentiality issues, and the often highly politicized environment inherent in child welfare (Mears & Butts, 2008). Some grant communities encountered data access issues, often while tracking individual outcomes. For instance, the local team in North Carolina designed an evaluation that would track the progress of individual youth on a number of measures, including school attendance, behavior, and performance; involvement in other systems, such as juvenile justice and mental health; and models used by caseworkers, such as child and family team meetings. The evaluators had difficulty recruiting enough youth for the sample, were unable to gain permission to access school information, and were able to track juvenile justice measures only through aggregate data.
- Make data entry convenient and useful for child welfare professionals. In Colorado, the Jefferson County System of Care project data and technology team conducted focus groups with child welfare workers and supervisors to determine in advance staff objectives and needs for a data entry system, such as reducing duplication in paperwork. Rather than establish a system and then train child welfare staff afterward, Jefferson County built the data entry system to address the specific needs identified by child welfare workers and supervisors. The Jefferson County Child Welfare Application Timesaver accesses information entered into the Statewide Automated Child Welfare Information System to automate internal county documents, forms, and referrals, making documentation and data entry more manageable for workers. It also summarizes key compliance indicators for caseworkers, supervisors, and managers. The result has been improved worker morale, improved data quality, and more efficient workload management.
- Educate staff on existing information sharing policies. In Pennsylvania, the Department of Public Welfare hosted a series of Confidentiality Forums to help county agency staff identify what information they could share across systems and where the legitimate legal challenges to cross-systems information sharing actually existed. The Confidentiality Forums helped resolve questions about the information that could be shared between staff of agencies serving the same family members.
- Link data from other systems serving similar populations. Contra Costa County uploaded data on children in foster care from the system of care demonstration initiative to California Work Opportunities and Responsibility to Kids (CalWORKs), the State TANF program. This deepened the data capacity of both programs and improved visibility into, and understanding of, the characteristics and needs of families and individuals served by both systems.
"Almost every State responding to our survey and all the States we visited reported that insufficient training for caseworkers and inaccurate and incomplete data entry affect the quality of the data reported to AFCARS [Adoption and Foster Care Analysis and Reporting System] and NCANDS [National Child Abuse and Neglect Data System]" (U.S. Government Accountability Office, 2003, p. 15).
3. Developing and actively using detailed, meaningful plans to track progress
Accountability must start during the planning phase of systems improvement. Yet as a system change initiative moves from early planning to implementation, decision-makers and staff are often too busy performing the work to review their initial plans, which can leave the work unfocused, inefficient, and unable to accomplish significant improvements for children and families.
- Assess strategic plans routinely to measure progress. North Carolina conducted a systems of care planning retreat annually and has used the strategic plan as a working document throughout the initiative. The local evaluators led an annual blueprinting exercise to document the process of system change, identify lessons learned, and determine action steps for sustainability. This approach helped integrate planning, work, and results into a unified vision for the initiative.
- Use detailed plans to monitor progress and engage everyone performing the work. In a broad-based collaborative, many participants are responsible for accomplishing a variety of activities. While agency administrators may be able to hold their own staff accountable, it can be difficult to hold members of a collaborative accountable. Colorado, New York, North Carolina, and Pennsylvania created and used collaborative work plans that identified activities and persons responsible.
- New York emphasized the role of community members in its collaborative. Community members held the agency accountable, but through good project management and thorough tracking tools, the collaborative also held community members and agencies accountable for activities they promised to accomplish.
- Northumberland County, Pennsylvania, used the Plan-Do-Study-Act (PDSA) improvement process (Langley et al., 1996) to monitor and guide the work of several subcommittees involved in systems improvement activities in the child welfare agency. Each subcommittee followed the PDSA process to create, implement, assess, and act on the results of systems improvement activities such as developing cross-system training and revising an intake form to be more family friendly.
- Colorado's Progress to Goals Survey measures the perceived progress in each of the collaborative's committees by asking participants about the committee goals, membership, and productivity. Based on this feedback, grant staff can make modifications to the facilitation and activities of the subcommittees to ensure meaningful and productive meetings.
- Bladen County, North Carolina, developed an outline-style tracking tool that identified the tasks assigned to each collaborative partner and enabled the collaborative to track progress toward goals.
- Develop logic models for specific areas of work. Some efforts within a large initiative are so complex that they may benefit from their own planning documents. For instance, Kansas developed models for each of the activities articulated in the grant logic model. The activity models helped the grant team and local system of care steering committees stay focused on grant goals. The activity models also provided a framework for dialogue among diverse stakeholders on various grant activities and progress toward goals. The Medicine Moon Initiative and New York also developed planning documents (logic models and strategic plans) for specific activities or areas of work. Such focused plans can be especially helpful for collaboratives with dedicated subcommittees.
"Effective evaluation data reports can be powerful tools for improving and sustaining interagency service delivery systems for children and families" (Woodbridge & Huang, 2000, p. 11).
4. Addressing implications
Applying the information can be seen as critically examining the data collected about the work and asking, "So what?" With any systems improvement effort, two obstacles are associated with this phase of accountability: ensuring regular opportunities to reflect on the data collected, progress made, and lessons learned; and making and carrying out decisions based on this information.
- Change organizational culture to embrace accountability.
- In Kansas, the project team and local evaluators created a culture of evaluation, beginning with development of a logic model. The project team then used the logic model to increase local capacity for data usage and data-driven decision-making in several ways: (1) conducting focus groups whose questions addressed how systems of care principles would be operationalized, proposed action steps, and identified measures of effectiveness; (2) training local and State systems of care steering committees on the logic modeling process; and (3) responding to requests for data on issues identified by systems of care collaborative councils. Though more time intensive and complex than traditional evaluation, the Kansas approach has led local steering committees to incorporate logic modeling as a central part of planning and evaluation in their ongoing work.
- In Clark County, Nevada, local evaluators provided regular data and evaluation updates at county and State meetings. They also provided the grant community with technical assistance on issues related to data collection, assessing program effectiveness, and interpreting data in preparation for presentations at meetings.
- In Pennsylvania, local evaluators participated in various collaborative subcommittees. At subcommittee meetings, the local evaluator provided informal updates on the evaluation, formally presented evaluation findings biannually, and provided evaluation technical assistance for specific subcommittee tasks.
- Link findings to other agency priorities. In Kansas, the local evaluation team gave several presentations to local and State systems of care leadership, illustrating how the activities conducted through their systems of care related to findings about the length of stay in foster care, which is a priority for the State as it works to comply with the mandates of the Federal Child and Family Services Reviews.
- Use what is learned through measurement to sustain and grow programs.
- Bedford-Stuyvesant, New York, participates in agency and citywide efforts to implement a neighborhood-based services system by realigning all foster care, prevention, and protective services along community district lines. This grant community has received additional funding to continue improving neighborhood-based service coordination, collaboration, and accountability to the community through the child welfare agency's community partnership initiative, which brings the agency and community coalitions together to design a plan to increase safety, permanency, and well-being in their communities.
- Contra Costa County focused on using data to inform decision-making regarding agency practices. For instance, one of the internal evaluators assessed caseworker workloads for 12 months to give supervisors and managers a better idea of their needs, resources, and how workload may affect child and family outcomes.
- Spirit Lake's (North Dakota) implementation of the SuperFileIt electronic data management system allowed the director of the child welfare agency to provide the Tribal council with agency performance data as well as information on the needs of youth and families involved with Tribal child welfare. Because of the disproportionate number of American Indian children in foster care in North Dakota,1 the director was invited to testify before the State legislature on Tribal child welfare needs in Spirit Lake.
1 According to the North Dakota Department of Human Services (2009), although American Indian children represent only 7 percent of the total population of children in North Dakota, they make up 27 percent of children in foster care in the State.
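The disproportionality noted in the footnote is often summarized as a ratio of representation rates (share of the foster care population divided by share of the general child population). A minimal sketch of that calculation, using only the percentages cited above:

```python
# Disproportionality ratio for American Indian children in North Dakota,
# using the 2009 North Dakota Department of Human Services figures cited above.
pct_of_all_children = 7     # percent of all ND children
pct_of_foster_care = 27     # percent of ND children in foster care

# Ratio > 1 indicates overrepresentation in foster care.
ratio = pct_of_foster_care / pct_of_all_children
print(f"Disproportionality ratio: {ratio:.1f}")  # prints 3.9
```

By this measure, American Indian children were represented in North Dakota's foster care system at roughly four times their share of the State's child population.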