Your Clinical Hours Data Might Not Be Good Enough for CAEP

By Craft Education Staff
March 27, 2026

Picture this: a CAEP site reviewer is sitting across from you during a visit. They ask to see the clinical practice hours documentation for your alternative track candidates — not aggregated totals, the actual individual records, with supervisor sign-offs.

Most programs can produce those records. What they struggle with is producing them quickly, cleanly, and without pulling data from three or four different places.

If your clinical hours data lives in a Google Form, a shared spreadsheet, and an email chain that nobody can fully reconstruct — this post is for you. Because the problem isn't your program. It's your tracking workflow. And there's a version of this that doesn't require building anything from scratch.

What CAEP Actually Requires (And Why It's Stricter Than You Think)

Let's start with the standard itself. CAEP Standard 2 requires that clinical practice be of "sufficient depth, breadth, diversity, coherence, and duration." But the more demanding language is in Standard R5.2, which governs Data Quality within your Quality Assurance System. The QA system must rely on relevant, verifiable, representative, cumulative, and actionable measures to ensure interpretations of data are valid and consistent.

That word — verifiable — is doing a lot of work. It means accreditors don't just want to know that your candidates completed their hours. They want to see evidence that can be traced back to a source, validated by a third party, and pulled on demand. As the CAEP Evidence Guide defines it, evidence is the intentional use of documentation, multiple and valid measures, and analysis to support and prove an EPP's claims related to CAEP's standards.

This is also a state compliance issue, not just accreditation. The hour minimums vary significantly — and they're not small numbers. California's traditional and integrated undergraduate teacher preparation pathways require a minimum of 600 hours of clinical preparation. Washington State requires at least 540 hours for alternative-route candidates — notably higher than the 450 required for traditional programs in the same state.

Spreadsheets don't meet that bar. Here's why.

The Three Gaps Manual Tracking Can't Close

Gap 1 — No Verification Trail

When a candidate logs hours in a Google Form or spreadsheet, that entry has one author: the candidate. There's no documented chain of approval. No cooperating teacher sign-off. No timestamp linking the hours to a specific placement date.

CAEP reviewers understand the difference between self-reported data and verified data. When your only evidence is a candidate-populated spreadsheet, you don't have a verification trail — you have a record of what candidates claimed. That's a meaningful distinction during a site visit, and it's an easy Area for Improvement to cite.

Gap 2 — No Real-Time Visibility

When hours live in a Google Form, no one sees the full picture until someone manually compiles it. That compilation usually happens once a semester, which means a candidate who's behind in week 9 doesn't surface as a problem until week 15 — often after certification deadlines have passed.

By the time a coordinator notices the gap, the options are limited: scramble for a placement extension, issue a late certification, or document a candidate exception. None of those are good outcomes. And all of them are preventable with earlier visibility.

Gap 3 — No Audit-Ready Export

Here's where manual systems break hardest. When accreditation review comes, you need structured data: hours by candidate, by placement site, by date range, with supervisor approvals attached. That's the Evidence Guide's bar — documentation intentionally assembled to support and prove your claims — and retrospectively compiled spreadsheets consistently fail to meet it.

If your data is in spreadsheets across multiple files and semesters, assembling that evidence is a project — one that takes staff hours, introduces transcription errors, and creates the kind of inconsistencies reviewers notice. Every manual compilation is an opportunity to make a mistake.

What a Workflow That Closes These Gaps Looks Like

Here's the fix — and it doesn't require building a custom system.

Verified entries from day one. Candidates log hours as they complete them. Each entry goes directly to the cooperating teacher or site supervisor for review and approval. The sign-off is part of the workflow, not a separate step you chase later. Every entry is timestamped, tied to a placement site, and attached to an approving supervisor's name.

A live dashboard for coordinators. Instead of a manual semester-end tally, the program administrator sees hours logged, hours verified, and hours remaining per candidate — updated in real time. A candidate who's behind in week 9 shows up as a flag in week 9, not week 15.

Reporting that's already structured. When data is captured in a structured workflow from the start, it doesn't need to be assembled into evidence later — it already is evidence. Hours by candidate, by site, by semester, with supervisor sign-offs: that's an export, not a manual build.
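For the technically inclined, the data model underneath a workflow like this is simple. Here's a minimal sketch in Python — all names and structures are hypothetical, not a description of any particular product — showing how a timestamped, supervisor-approved entry makes verification, real-time visibility, and export natural byproducts of capture rather than separate projects:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional
import csv
import io

@dataclass
class HoursEntry:
    candidate: str
    site: str                          # placement site
    hours: float
    logged_at: datetime                # timestamped at entry, not at semester's end
    approved_by: Optional[str] = None  # supervisor sign-off is part of the record

    @property
    def verified(self) -> bool:
        # Self-reported hours don't count until a supervisor approves them.
        return self.approved_by is not None

def remaining_hours(entries: list[HoursEntry], candidate: str, required: float) -> float:
    """Real-time visibility: verified hours still owed by a candidate."""
    verified = sum(e.hours for e in entries if e.candidate == candidate and e.verified)
    return max(required - verified, 0.0)

def export_csv(entries: list[HoursEntry]) -> str:
    """Audit-ready export: one structured row per entry, supervisor attached."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["candidate", "site", "hours", "logged_at", "approved_by"])
    for e in entries:
        writer.writerow([e.candidate, e.site, e.hours,
                         e.logged_at.isoformat(), e.approved_by or ""])
    return buf.getvalue()

entries = [
    HoursEntry("A. Jones", "Lincoln MS", 4.0,
               datetime(2026, 3, 1, tzinfo=timezone.utc), approved_by="B. Lee"),
    HoursEntry("A. Jones", "Lincoln MS", 3.0,
               datetime(2026, 3, 2, tzinfo=timezone.utc)),  # still unverified
]
```

Note what falls out of the structure: `remaining_hours(entries, "A. Jones", 540)` counts only the approved 4.0 hours, so an unverified entry surfaces as a gap immediately, and `export_csv` is a one-liner at review time because the supervisor name was captured at entry.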

This is exactly the workflow Craft is designed around — candidate self-reporting with built-in evaluator verification, real-time progress dashboards for coordinators, and compliance reporting that's structured from the moment data enters the system. No spreadsheet maintenance required.

You Don't Need a Custom System

The usual objection at this point is: "We'd have to build something." And for a lean alternative certification team that's already managing placements, advising, and accreditation prep — building and maintaining a custom platform is a non-starter.

Stitching together Google tools gets you part of the way there. You can collect data and share it. But you can't close the verification gap with a Google Form, and you can't get audit-ready reporting from a shared spreadsheet without someone manually building it each time.

The right move is a purpose-built workflow that already solves this problem — one where the verification trail, the real-time visibility, and the structured reporting are built in, not bolted on.

Clinical hours tracking is the right place to start. It's the most immediately painful administrative task for alternative program coordinators, and it's the most frequently scrutinized piece of evidence during accreditation reviews. Get that right, and evaluation data and placement records slot in naturally alongside it.

