The 30-60-90 Day Follow-Up Playbook

A Simple System to Prove Outcomes Last

Most programs lose visibility the moment someone exits services. The case closes, the file archives, and whatever happens next is a mystery unless the person comes back for more services—usually because something went wrong.

This gap isn't just a missed opportunity. It's where outcomes go to die.

The research is consistent on this point: programs that maintain structured follow-up after services end see significantly better maintenance of gains than those that leave follow-up to chance. The format matters less than the consistency. Any scheduled touchpoint beats no touchpoint.

What follows is a practical system you can implement without adding major staff burden. The 30-60-90 day follow-up cadence creates three key benefits: it catches regression early, it reinforces progress, and it generates evidence that your outcomes actually last.

Why These Intervals

The 30-60-90 day timeline isn't arbitrary—it's a practical cadence drawn from patterns observed across programs. That said, regression doesn't follow a predictable schedule. These intervals are useful checkpoints, not guarantees of what you'll find.

30 days is typically early enough that people still have momentum from services. Issues that surface at this stage are more often environmental—adjustment challenges with the new setting, missing supports, or unclear expectations—rather than skill loss. It's a chance to catch setup problems before they compound.

60 days is when you start to see whether skills are being practiced and reinforced. Initial momentum has usually faded. If routines aren't sticking or support gaps exist, this is often when the signs become visible. A brief check-in or "booster" conversation at this stage can make a real difference.

90 days gives you a longer view. By now, you're seeing whether new patterns have taken hold or whether earlier struggles have deepened. This check-in helps you understand trajectory—not as a final verdict, but as meaningful data about what's working and what isn't.

Some programs extend this cadence to 6 and 12 months for longer-term tracking, especially when accreditation or funding requires outcome data beyond the initial period.

What to Ask at Each Interval

Each check-in should be brief but focused. You're not recreating the full assessment—you're looking for indicators of how things are going.

30-Day Check-In

Focus: Environmental fit and early adjustment

Key questions:

  • How is the transition going overall?
  • Are there any challenges with the new environment (job, residence, program)?
  • Are the supports you expected to have actually in place?
  • Is there anything from our program you're having trouble applying?

What you're listening for: Environmental barriers, missing supports, early signs of isolation or overwhelm. At 30 days, most problems are external rather than skill-based.

60-Day Check-In

Focus: Maintenance and practice

Key questions:

  • Are you still using [specific skill/routine] regularly?
  • What's been the hardest part of maintaining your progress?
  • Is there anything that's started to slip that you'd like help with?
  • How are things going with [specific goal area]?

What you're listening for: Skills starting to fade, reduced practice frequency, loss of motivation, emerging patterns of difficulty. This is often when subtle regression begins.

90-Day Check-In

Focus: Consolidation and sustainability

Key questions:

  • Looking back over the past three months, what's working well?
  • What's been your biggest challenge?
  • How confident do you feel about maintaining your progress going forward?
  • Is there anything you need from us to keep moving forward?

What you're listening for: Overall trajectory (stable, improving, declining), confidence level, remaining support needs, whether the outcome feels sustainable.

Keeping It Lightweight

The biggest barrier to follow-up isn't lack of will—it's lack of capacity. Programs are stretched thin. Adding extensive post-discharge contact can feel impossible.

Here's how to keep the burden manageable:

Use phone calls, not visits. A 10-15 minute phone call provides meaningful information without the time cost of an in-person meeting. For many participants, a brief call feels supportive without being intrusive.

Schedule at discharge. Before someone exits services, schedule all three follow-up calls. Put them on the calendar. Assign who's responsible. Scheduled touchpoints happen. "We'll check in sometime" doesn't.

Batch your follow-ups. Designate a specific time each week for follow-up calls. If you're making calls to everyone who hit their 30/60/90-day mark that week, you build a rhythm rather than constantly context-switching.
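The two habits above, scheduling at discharge and batching weekly, are simple enough to sketch in code. Here's a minimal Python illustration (the caseload structure and names are hypothetical, not drawn from any particular system):

```python
from datetime import date, timedelta

# Follow-up cadence in days after discharge.
INTERVALS = (30, 60, 90)

def schedule_followups(discharge_date):
    """Return the three follow-up dates for one participant."""
    return [discharge_date + timedelta(days=d) for d in INTERVALS]

def due_this_week(caseload, week_start):
    """List (name, interval, due_date) for calls falling in the 7 days from week_start."""
    week_end = week_start + timedelta(days=7)
    due = []
    for name, discharge_date in caseload:
        for interval, call_date in zip(INTERVALS, schedule_followups(discharge_date)):
            if week_start <= call_date < week_end:
                due.append((name, interval, call_date))
    return sorted(due, key=lambda item: item[2])

# Example: one weekly batch pulled from a small caseload.
caseload = [("A.R.", date(2024, 3, 1)), ("J.M.", date(2024, 2, 1))]
for name, interval, call_date in due_this_week(caseload, date(2024, 3, 25)):
    print(f"{call_date}: {interval}-day call for {name}")
```

The same logic works in a spreadsheet with date-difference formulas; the point is that the due dates are computed once at discharge, not remembered ad hoc.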

Use simple documentation. You don't need a comprehensive assessment at each touchpoint. A brief note capturing key observations is enough: "60-day check. Morning routine holding steady. Some challenges with workplace social interactions. Connected to peer mentor for additional support."

Empower the participant. When possible, give participants a way to reach out proactively if issues arise. Some programs provide a simple "how's it going?" text check-in that invites a response if needed.

What to Do with What You Learn

Follow-up data is only valuable if you act on it.

When things are going well: Celebrate. Reinforcement matters. A brief acknowledgment that someone is maintaining their progress builds confidence and keeps them engaged.

When things are starting to slip: This is your intervention window. A 60-day check-in that reveals emerging struggle is much easier to address than a crisis three months later. Consider a booster session, additional support connection, or environmental problem-solving.

When regression is significant: Some participants will need to re-engage with services. Better to identify that at 60 days than to let someone spiral for six months before returning in crisis.

For your program overall: Aggregate follow-up data tells you about your program's effectiveness—not just at discharge, but over time. Patterns in regression can reveal weaknesses in your approach: maybe generalization isn't being addressed, maybe certain skill areas aren't sticking, maybe a particular transition pathway is problematic.

Proving Outcomes to Funders and Accreditors

Beyond the direct benefit to participants, follow-up data creates evidence that funders and accreditors increasingly want to see.

CARF and other accrediting bodies care about outcomes—and they're getting more sophisticated about what counts. Discharge metrics are necessary but not sufficient. Programs that can demonstrate 90-day or longer outcome sustainability stand out.

Similarly, funders are pushing beyond service counts toward impact measures. Follow-up data transforms your outcomes story from "we served 200 people" to "we served 200 people, and 85% were still maintaining their progress three months later."

Getting Started

If you don't currently have a follow-up system, here's a simple implementation path:

Week 1: Decide on your cadence. 30-60-90 is a solid starting point. Adjust based on your population and capacity.

Week 2: Create your check-in protocol. Draft the questions you'll ask at each interval. Keep it simple—you can refine over time.

Week 3: Build the scheduling system. This might be as simple as a calendar reminder workflow or as structured as a tracking spreadsheet that triggers follow-up dates.

Week 4: Train your team. Make sure everyone understands the purpose, the process, and how to document what they learn.

Ongoing: Start with new discharges. Don't try to retroactively follow up with everyone. Build the habit with fresh cases, then expand.

The Bottom Line

The 30-60-90 day follow-up cadence isn't complicated, but it's transformative. It catches regression early, provides meaningful data, and demonstrates that your program is serious about outcomes that last.

Any follow-up is better than none. A structured cadence is better than ad hoc check-ins. And documentation of what you learn turns isolated touchpoints into actionable intelligence.

For what to watch for during these check-ins, see Leading Indicators of Regression.

Still managing your program with spreadsheets and PDFs?

Equip brings clarity to your support team — one platform for assessments, goals, progress tracking, and communication across every setting.
Used daily by programs at universities like Auburn, UL Lafayette, and University of South Florida.