Alt Cert Programs: 5 Tracking Failures as You Scale

By Craft Education Staff
March 30, 2026

Growth in the non-IHE alternative certification sector (programs operating outside institutions of higher education) has been significant. According to AACTE, enrollment in non-IHE alternative programs swelled by nearly 150% between 2011 and 2020. Programs that launched with a few dozen candidates are now operating with hundreds — in some cases, thousands — spread across multiple states.

The tracking infrastructure at most of those programs has not kept pace.

The same spreadsheets, shared drives, and email workflows that worked at 50 candidates are still in use at 500. The problems they create aren't theoretical. They show up at certification time, during accreditation reviews, and whenever leadership asks for a cross-state program report. This post identifies the five specific points where manual tracking breaks down as programs scale — and why each poses real operational and compliance risks.

Failure Point 1 — Hours Tracking Falls Apart at Volume

At 50 candidates, collecting hour logs by email is manageable. At 500 candidates across multiple states, the volume of submissions, missed entries, and unverified logs becomes unmanageable without a dedicated system.

Each state has its own minimum hour requirements. Washington requires 540 hours for alternative-route candidates. California's traditional and integrated undergraduate pathways require a minimum of 600 hours. Managing those thresholds across a distributed candidate population without structured tracking creates gaps that only surface when it's too late — at certification deadlines, not before them.

The result: programs discover missing hours in week 15, not week 9. By then, options are limited.
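To make the pacing math concrete, here is a minimal sketch of the check a structured tracking system runs continuously and a spreadsheet workflow runs too late. The 16-week program length and all candidate records are hypothetical illustrations; the two state thresholds are the figures cited above.

```python
# Illustrative sketch: flag candidates whose verified hours trail the pace
# needed to hit their state's minimum. State thresholds for WA and CA are
# the figures cited above; program length and candidate data are hypothetical.

REQUIRED_HOURS = {"WA": 540, "CA": 600}  # minimum clinical hours by state
PROGRAM_WEEKS = 16                       # assumed program length (hypothetical)

def behind_pace(candidates, current_week):
    """Return candidates whose verified hours trail the on-pace target."""
    flagged = []
    for c in candidates:
        required = REQUIRED_HOURS[c["state"]]
        expected = required * (current_week / PROGRAM_WEEKS)
        if c["verified_hours"] < expected:
            flagged.append((c["name"], c["state"], c["verified_hours"], round(expected)))
    return flagged

# A gap visible in week 9 leaves time to intervene; the same gap found in
# week 15 leaves almost none.
roster = [
    {"name": "Candidate A", "state": "WA", "verified_hours": 220},
    {"name": "Candidate B", "state": "CA", "verified_hours": 360},
]
print(behind_pace(roster, current_week=9))
```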

Failure Point 2 — Evaluation Forms Go Missing

Paper and PDF-based evaluation workflows depend entirely on individual evaluators — cooperating teachers, site supervisors, field mentors — returning completed forms on time. These evaluators are school-based professionals with full-time classroom responsibilities. Returning program paperwork is not their top priority.

At scale, the administrative burden of tracking outstanding evaluation submissions falls on a small central team managing hundreds of open items simultaneously. Submissions get lost in inboxes. Forms get returned incomplete. Some never come back at all.

Missing evaluations aren't just an inconvenience. They are gaps in the accreditation evidence record. CAEP's Quality Assurance System requirements demand verifiable, cumulative data — not a best-effort collection of whatever forms were returned.

Failure Point 3 — State Programs Operate in Silos

Multi-state programs commonly develop separate tracking systems per state — different spreadsheet templates, different folder structures, different naming conventions. The practical result is that no one has a unified view of program health across the organization.

Leadership makes resource and staffing decisions based on incomplete or inconsistent data. State coordinators operate without visibility into how other state programs are performing. When a compliance question arises in one state, answering it requires manually compiling data that was never designed to be compiled together.

The CAP and AACTE joint analysis found that despite dramatic enrollment growth in the non-IHE sector from 2010–11 to 2018–19, completions actually declined by 10% over the same period. Fragmented operational infrastructure is one structural reason programs grow enrollment without a corresponding improvement in outcomes.

Failure Point 4 — Accreditation Evidence Is Scattered

When accreditation review approaches, evidence collection becomes a project in itself. The hours data is in one file. Evaluations are in another. Placement records are in a shared drive that nobody has fully organized. Assembling a coherent evidence package requires weeks of staff time and introduces transcription errors that reviewers are trained to spot.

CAEP's R5.2 Data Quality standard requires evidence to be relevant, verifiable, representative, cumulative, and actionable. Data assembled retrospectively from scattered files does not meet that bar — regardless of how strong the underlying program outcomes actually are. Programs with good results lose credibility points because their documentation doesn't reflect the quality of the work.

Failure Point 5 — Candidate Progress Is Invisible Until It's Too Late

Without a live dashboard, program administrators don't know a candidate is behind until they miss a milestone, fail a state-required observation, or don't meet hour thresholds before a certification deadline.

At that point, the intervention window has closed. The program is left managing exceptions rather than preventing them — issuing deadline extensions, documenting extenuating circumstances, and navigating state agency processes that exist precisely because programs failed to catch problems earlier.

Real-time visibility into candidate progress isn't a reporting convenience. It's a candidate outcomes tool. Programs that can see a developing gap in week 9 can act on it. Programs that see it in week 15 cannot.

The Right Time to Fix This Is Before the Next Cycle

These five failure points don't appear suddenly. They develop gradually as programs grow, and they compound. Scattered hours data makes evaluation tracking harder. Missing evaluations weaken accreditation evidence. Siloed state programs make everything above worse.

The solution isn't adding more tools to the stack. It's replacing disconnected manual workflows with a single system where hours are verified at entry, evaluations route automatically, state programs share one data structure, and accreditation reporting is always current — not assembled under deadline pressure. Platforms like Craft are built specifically for this operating model: multi-state program management, structured candidate tracking, and compliance reporting that doesn't require a manual build every cycle.
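As a rough illustration of what "one data structure" means in practice, here is a minimal sketch of a shared hour-log record that is validated when it is entered, rather than reconciled at reporting time. The field names and validation rules are hypothetical, not any platform's actual schema.

```python
# Illustrative sketch: one shared record shape for every state program,
# with hours verified at entry instead of reconciled at certification time.
# Field names and rules here are hypothetical, not a real platform's schema.

from dataclasses import dataclass
from datetime import date

@dataclass
class HourLog:
    candidate_id: str
    state: str            # same two-letter code across every state program
    log_date: date
    hours: float
    verified_by: str      # evaluator of record; empty string = unverified

    def __post_init__(self):
        # Validation runs when the log is created, not weeks later.
        if not 0 < self.hours <= 12:
            raise ValueError(f"Implausible entry: {self.hours} hours in one day")
        if not self.verified_by:
            raise ValueError("Hour logs require a verifying evaluator at entry")

# Because every state writes to the same shape, a cross-state report is a
# query, not a reconciliation project.
log = HourLog("c-0042", "WA", date(2026, 3, 2), 6.5, verified_by="mentor-17")
```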

Programs that scale cleanly are the ones that built the infrastructure before they needed it. Programs that wait build it under pressure — and pay for it twice.
