OET Small-Span Correction — Live LLM Prototype

Goal: Make the AI proofreader produce atomic, phrase-level edits (the style of a real human proofreader), not sentence-level rewrites. The clause skeleton the student wrote is preserved; only the errors are touched.

Pipeline: Constrained Claude prompt → JSON corrections → JavaScript validator (rejects spans that cross sentence boundaries, break word boundaries, or exceed 4 tokens) → goal-style HTML render. The default below was generated by a single API call on a sample letter; paste your own letter and the same pipeline runs live.

Sample: COPD discharge referral, 21 corrections, avg span ≈ 14 chars.
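The shape of the JSON passed between the prompt and the validator is not reproduced on this page; a minimal sketch, assuming a flat array of objects (the page names only the `original_text` field — `corrected_text` and `reason` are assumed), seeded with the one correction the page quotes:

```javascript
// Hypothetical shape of the model's JSON output. original_text is named
// on this page; the other field names are assumptions.
const corrections = [
  // one of the sample's 21 corrections, quoted later on this page
  {
    original_text: "time. we",
    corrected_text: "time. We",
    reason: "capitalisation after full stop",
  },
];
```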

How this compares to sentence-level rewrites

  • Corrections: 21 here vs. ~8–12 v1-style
  • Avg span length: ≈14 chars here vs. 30–80 chars v1-style
  • Cross-sentence spans: target 0
  • Validator rejections: caught before render
1 · Constrained prompt (the core)

The prompt enforces small-span output through four rules. Full text below — copy and test it yourself on any OET letter.

(the full prompt text loads here in the live demo)
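The prompt itself only loads in the live page, so as a sketch of what the four small-span rules might look like (wording entirely assumed; the constraints mirror what the validator re-checks):

```javascript
// Hypothetical wording: the real constrained prompt loads in the live
// demo. The four rules correspond to the checks the validator repeats.
const SYSTEM_PROMPT = `You are an OET writing proofreader.
Return ONLY a JSON array of corrections, each shaped as:
  { "original_text": "...", "corrected_text": "...", "reason": "..." }
Rules:
1. Each original_text is at most 4 tokens.
2. An original_text never crosses a sentence boundary.
3. An original_text starts and ends on word boundaries.
4. Preserve the writer's clause structure: correct only the error.`;
```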
2 · Validator (post-processor, ~35 lines of JS)

After the LLM returns JSON, this validator runs before render. It rejects any correction whose original_text crosses a sentence boundary, starts or ends mid-word, or exceeds the token budget. This pre-render check directly addresses the "broken tracked changes" and "duplicated text" issues from the brief.

(the validator source loads here in the live demo)
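The validator source loads live; a minimal sketch of the three checks it is described as making (function names and the heuristics — regex sentence boundaries, whitespace tokenisation — are assumptions, not the real ~35-line implementation):

```javascript
// Sketch of the post-processing validator. Only original_text is a
// field name taken from this page; everything else is assumed.
const MAX_TOKENS = 4; // token budget from the pipeline description

// A span crosses a sentence boundary if it contains a terminator
// followed by further text (e.g. "time. we").
function crossesSentenceBoundary(span) {
  return /[.!?]\s+\S/.test(span);
}

// A span breaks a word boundary if it starts or ends mid-word: the
// character just outside the span and the adjacent character inside
// it are both word characters.
function breaksWordBoundary(letter, start, end) {
  const isWord = (ch) => /\w/.test(ch || "");
  return (isWord(letter[start - 1]) && isWord(letter[start])) ||
         (isWord(letter[end]) && isWord(letter[end - 1]));
}

function validateCorrection(letter, correction) {
  const span = correction.original_text;
  const start = letter.indexOf(span);
  if (start === -1) return { ok: false, reason: "span not found in letter" };
  const end = start + span.length;
  if (crossesSentenceBoundary(span))
    return { ok: false, reason: "crosses sentence boundary" };
  if (breaksWordBoundary(letter, start, end))
    return { ok: false, reason: "breaks word boundary" };
  if (span.split(/\s+/).filter(Boolean).length > MAX_TOKENS)
    return { ok: false, reason: "exceeds token budget" };
  return { ok: true };
}
```

Corrections that fail any check are dropped before render, which is what keeps broken tracked changes and duplicated text out of the output.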
3 · How the default sample was generated

The sample you see above is not hand-authored. The pipeline was run once on the COPD letter with a single Claude Sonnet 4.6 API call. The JSON response was saved and is loaded here as the default. The Try Your Own Letter button calls the same endpoint live.

  • Model: claude-sonnet-4-6
  • Max tokens: 8000
  • Temperature: default (1.0)
  • System prompt: as above
  • User turn: the letter, verbatim, wrapped in --- delimiters
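The parameters above translate into a Messages API request roughly like the following (a sketch: `buildRequestBody` and the `SYSTEM_PROMPT` placeholder are hypothetical names, while the model, max-tokens, and `---`-delimited user turn come from the list above):

```javascript
// Sketch of the single-call request body described above.
const SYSTEM_PROMPT = "..."; // placeholder for the constrained prompt

function buildRequestBody(letter) {
  return {
    model: "claude-sonnet-4-6",
    max_tokens: 8000,
    // temperature omitted, so the API default (1.0) applies
    system: SYSTEM_PROMPT,
    // the letter, verbatim, wrapped in --- delimiters
    messages: [{ role: "user", content: `---\n${letter}\n---` }],
  };
}
```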

The validator also demonstrates itself: one of the LLM's 21 corrections ("time. we" → "time. We") technically crosses a sentence boundary, and the validator flags it — an example of the pipeline catching a real LLM slip.
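That flag reduces to a one-line check; a minimal sketch, assuming a sentence boundary means a terminator followed by whitespace and more text:

```javascript
// A span crosses a sentence boundary when it contains ., ! or ?
// followed by whitespace and further text.
const crossesSentence = (span) => /[.!?]\s+\S/.test(span);

crossesSentence("time. we");    // true: the flagged correction
crossesSentence("has improve"); // false: a hypothetical in-sentence span
```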