United States. Children's Bureau.
Chibnall, Susan; Dutch, Nicole M.; Jones-Harden, Brenda; Brown, Annie; Gourdine, Ruby.
Appendix: Conducting the Site Visits
As a first step in conducting the site visits, the team worked with agency administrators to identify a contact person at each site who could assist the team in planning and implementing on-site discussions. In most cases, this person was an employee of the child welfare agency. All communications regarding site visits were coordinated through the identified contact person and the team member assigned to a particular site. Through telephone conversations, team members worked with each contact person to identify the individuals and groups most important to include in on-site discussions, and then to plan proposed interviews, discussion groups, and other activities to be conducted during the site visit. To the extent possible, the team attempted to focus discussions across sites on similar topics and to talk with individuals of similar title and position. In the end, the discussions were organized as follows:
- Individual discussion sessions with state-level child-welfare officials and agency administrators, as well as partner agency directors and other high-ranking individuals (e.g., Attorneys General and judges)
- Individual or group discussion sessions with child-welfare agency supervisors and direct service workers, and partner agency supervisors and direct service providers
- Individual or group discussion sessions with child-welfare agency and partner agency program staff
- Individual discussions with parents
Opportunities for gathering additional information, such as observations of program activities or court hearings, were identified by the contact person and discussed with team members prior to scheduling.
Due to the comprehensive nature of the information gathering efforts, most visits lasted a full week and were conducted by two, sometimes three, team members. At the beginning of each site visit, the team met with the agency administrator to review the purpose of the site visit, discuss the intended goals of each planned discussion session, and resolve any issues or concerns related to the team's efforts. In many instances, this first meeting included the agency administrator as well as his or her management staff.
Prior to each individual or group discussion, participants were briefed about the purpose of the study, confidentiality guidelines, and the anticipated length of the discussion, and were asked to give verbal consent to participate. Participants were encouraged to ask questions and gain clarification on issues of concern prior to giving consent. At this time, they also were asked permission for the facilitator to record the interview using a tape recorder. If participants were uncomfortable with the recording for any reason, and there were instances when this occurred, team members were instructed to use paper and pencil to record responses and important notes. Once consent was obtained and participants were comfortable, the discussion began. At the end of each session, participants were thanked for their time, allowed to ask questions, and reassured regarding confidentiality guidelines.
At the conclusion of the site visit, a brief meeting was scheduled with the agency administrator or his or her assigned representative. The meeting was designed as an opportunity for the administrator to ask questions and for the team to give feedback regarding the issues of interest. Because the data had not been analyzed at the time of the meeting, and team members were reporting based on their initial impressions, feedback tended to be broad and general, but informative.
ANALYZING THE DATA
In total, eight site visits were conducted. After the conclusion of each site visit, audiotapes were transcribed and analyzed using both traditional qualitative techniques (e.g., content analysis) and text analysis software, specifically, NVivo, a qualitative software package that allows the analyst to store documents, create text categories, code text segments, and generate reports.
To conduct the qualitative analyses, the transcripts were first organized by topic area. Due to a number of factors, including time limitations and variation in participants, discussion questions often varied across participants, both within and across sites. As a result, the development of the coding scheme and the subsequent analysis focused on the five topics that had been addressed most consistently. The following four priority one topics were included:
- What is your perception of over-representation? That is, why do you think children of color are over-represented in the child welfare system?
- How have Federal policies like MEPA and ASFA changed the way in which your agency serves children and families of color?
- What has your agency done, if anything, to improve the delivery of services to children and families of color?
- What types of services, programs, or policies do you think are necessary to reduce the over-representation of children of color in the child welfare system?
Additionally, the following priority two topic was included: "What policies, procedures, or practices would assist your agency to better serve children and families of color?"
Once the data were organized by question, a sample of discussion sessions was drawn, and responses from each session were examined across the five questions. From this examination, an exhaustive list of response "themes" (i.e., initial codes) was generated. These themes, or codes, became the foundation for the coding scheme, which was used to code the data first within and then across the sites. As a quality control check, other team members reviewed the list of response themes for accuracy and completeness.
The development of the final coding scheme was an iterative process. As team members applied the initial set of codes to the data, codes were modified, revised, or dropped altogether, and new codes were developed. Each time new codes were developed or existing ones were changed, team members went back to the previously coded data and applied the new or revised codes where appropriate, a task common to qualitative data analysis and generally referred to as re-coding. Once the coding scheme was finalized and the data were coded, they were analyzed across sites for common themes and significant differences. In addition to the qualitative analysis, written documentation provided by agency administrators was reviewed to provide both a context for the qualitative analysis, including agency characteristics and operational guidelines, and descriptions of programs, projects, and strategies that were being implemented with children and families of color. Generally, written documentation was limited to descriptive information. The findings of the analyses are presented in detail in Chapter IV of this report.
This material may be freely reproduced and distributed. However, when doing so, please credit Child Welfare Information Gateway.