Trauma Services for Children & Families: Lessons Learned from Program Evaluation

By Molly Ryan, MPH

This month several Institute for Community Health (ICH) staffers will showcase our program evaluation work at the American Evaluation Association Annual Conference. Participatory program evaluation is central to the ICH mission and is one of our core services. In honor of AEA and National Depression Screening Day, we’re highlighting our work evaluating mental health services with the Central Massachusetts Child Trauma Center (CMCTC).

ICH has partnered with CMCTC for the past year to evaluate their delivery of evidence-based, trauma-informed mental health services for children and adolescents, particularly those with a military affiliation. ICH and CMCTC project staff use standardized trauma evaluation tools to measure trauma exposures and symptoms, resulting behaviors, and caregiver strain. Mental health providers administer the tools, which are then sent to ICH for immediate analysis. This immediate analysis provides mental health clinicians, patients, and families with data in “real time,” allowing providers to efficiently modify the care plan and helping validate patients’ and families’ experiences of trauma. Our periodic aggregate review of clinical assessment data also helps the project directors understand the strengths and limitations of the treatments, informs training improvements, and ultimately contributes to the treatment models’ evidence base.

As we enter our second year of evaluation, we’re reflecting on some of the key lessons learned:

Training

  • Just because a tool is standardized does not mean it’s easy to follow! To ensure data reliability, providers must be trained users of the tools and should help clients complete them.
  • Evaluators must ensure that clinicians are comfortable using the evaluation tools in the clinical encounter.
    • Tip: Use a combination of text and graphics to explain evaluation results. This will help both providers and caregivers understand the data.

  • When using multiple data collection tools at multiple time points, help providers keep track of upcoming due dates. This is particularly important if providers have several clients enrolled in the evaluation.
    • Tip: Remind providers when a client’s follow-up assessment is approaching. Time the reminder so that clinicians have enough scheduling flexibility to complete the assessment. Several reminders may also be necessary.
    • Tip: Create schematics of the assessment timeline to help providers understand when to complete evaluation tools.
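The reminder logic above can be sketched in code. This is a minimal, hypothetical example, not the actual system used in the evaluation: the assessment intervals, reminder lead time, and function names here are illustrative assumptions, since real schedules vary by tool and protocol.

```python
from datetime import date, timedelta

# Hypothetical assessment schedule: days after enrollment when each
# assessment is due. Actual intervals depend on the tools and protocol.
ASSESSMENT_SCHEDULE = {
    "baseline": 0,
    "3-month follow-up": 90,
    "6-month follow-up": 180,
}

# Notify providers two weeks ahead, leaving scheduling flexibility.
REMINDER_LEAD_DAYS = 14

def upcoming_reminders(enrollment_date, today):
    """Return (assessment name, due date) pairs whose reminder window
    has opened but whose due date has not yet passed."""
    reminders = []
    for name, offset in ASSESSMENT_SCHEDULE.items():
        due = enrollment_date + timedelta(days=offset)
        if due - timedelta(days=REMINDER_LEAD_DAYS) <= today <= due:
            reminders.append((name, due))
    return reminders

# Example: a client enrolled on 2024-01-10, checked on 2024-04-01
print(upcoming_reminders(date(2024, 1, 10), date(2024, 4, 1)))
# → [('3-month follow-up', datetime.date(2024, 4, 9))]
```

In practice a simple spreadsheet can serve the same purpose; the point is that due dates and reminder windows are computed from the enrollment date rather than tracked by memory.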

Retention

  • Recognize that it can be difficult for vulnerable populations, such as individuals receiving trauma services, to remain in care. Unstable living situations, acute mental health problems, and readiness for treatment are just a few of the issues that our program population frequently endures. As a result, loss to follow-up is a common issue in program evaluation.
    • Tip: Maintain open lines of communication with providers to track clients’ progress, and create a tracking mechanism to document changes in client status.
    • Tip: Anticipate that clients are more likely to drop out of treatment in the first 3 months. Work with program staff to identify the information that is essential and meaningful to capture if a client has not been actively engaged in treatment.
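A status-tracking mechanism like the one suggested above can be as simple as a dated log of status changes per client. The sketch below is a hypothetical illustration, assuming a small set of made-up status categories; a real program would define these with clinicians and store the data securely.

```python
from datetime import date
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

# Hypothetical status categories for illustration only.
STATUSES = {"active", "paused", "completed", "lost to follow-up"}

@dataclass
class ClientRecord:
    """Dated history of a client's status changes."""
    client_id: str
    history: List[Tuple[date, str]] = field(default_factory=list)

    def update_status(self, when: date, status: str) -> None:
        if status not in STATUSES:
            raise ValueError(f"Unknown status: {status}")
        self.history.append((when, status))

    def current_status(self) -> Optional[str]:
        return self.history[-1][1] if self.history else None

record = ClientRecord("C-001")
record.update_status(date(2024, 1, 10), "active")
record.update_status(date(2024, 3, 2), "lost to follow-up")
print(record.current_status())  # → lost to follow-up
```

Keeping the full dated history, rather than overwriting a single status field, lets evaluators later answer questions like when drop-out tends to occur, which supports the observation that the first three months are highest risk.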

Meaningful evaluation of mental health services depends on effective and efficient collaboration between project leaders, clinicians, and evaluators. Our experience with this evaluation highlights the value of multi-disciplinary partnerships to improve mental health outcomes for children and their families.


The views expressed on the Institute for Community Health blog page are solely those of the blog post author(s), and do not necessarily reflect the views of ICH, the author’s employer or other organizations with which the author is associated.