How to Document Dyslexia Evaluations and Structured Literacy Intervention Plans

A practical guide for school psychologists, educational diagnosticians, and reading specialists on documenting dyslexia evaluations, IDEA eligibility, MTSS intervention records, structured literacy progress monitoring, and IEP goals across elementary through high school.

Why Dyslexia Documentation Is Different

Dyslexia sits at an uncomfortable intersection of science, law, and institutional inertia. The research base is strong: dyslexia is a specific, neurobiological learning disability characterized by difficulties with accurate and fluent word recognition, poor spelling, and poor decoding abilities. These difficulties typically result from a deficit in the phonological component of language and are unexpected in relation to other cognitive abilities. That definition, from the International Dyslexia Association, is decades old. The research consensus on what works (structured, systematic literacy instruction) is equally well-established.

And yet, school teams across the country still write evaluation reports that hedge, obscure, or entirely omit the word "dyslexia." They still write IEP goals that say "the student will improve reading fluency" without specifying what fluency means, what baseline it starts from, and what intervention will produce the change. They still write progress monitoring notes that record whether a student attended a session without recording whether they are actually learning.

This guide is for the people doing the real work: school psychologists completing comprehensive psychoeducational evaluations, educational diagnosticians running reading-specific assessments, reading specialists implementing intervention, and special education teachers writing IEP goals their teams will actually be able to measure. It covers what to document at each stage of the process and why each element matters.

The Comprehensive Dyslexia Evaluation

What the Evaluation Must Cover

A defensible dyslexia evaluation is not a single reading test. It is a multi-domain assessment that simultaneously establishes a cognitive profile, rules out competing explanations, and builds an evidence base for eligibility under the Specific Learning Disability (SLD) category of IDEA. The domains below should each appear in your report, with test scores, behavioral observations, and clinical interpretation for every domain evaluated.

Phonological awareness refers to the ability to identify and manipulate the sound units of spoken language. This includes syllable awareness, onset-rime awareness, and, at the most granular level, phonemic awareness: the ability to isolate, blend, segment, and manipulate individual phonemes. Phonological awareness deficits are the most consistent finding in dyslexia research, and your report must document them with specificity. A score alone is not sufficient. Note which subtests were administered, what each measures, and what the pattern of scores indicates.

Phonological memory (also called verbal working memory for phonological information) involves the ability to hold sound-based information in short-term memory while processing it. Poor phonological memory affects the ability to learn new words, follow multi-step directions, and retain phonics rules long enough to apply them.

Rapid automatized naming (RAN) measures how quickly a student can name a series of familiar stimuli: letters, numbers, colors, or objects. RAN performance is a strong predictor of reading fluency, independent of phonological awareness, and slower RAN speeds are associated with the fluency deficits seen in many students with dyslexia. Document RAN performance with raw scores, standard scores, and the specific stimuli used. Note whether the student showed hesitations, self-corrections, or sequencing errors beyond simple speed reduction.

Orthographic processing refers to the ability to perceive, store, and retrieve the visual letter patterns of words. Students with orthographic processing weaknesses may develop phonological decoding but still struggle with reading efficiency and spelling because they cannot form stable visual word forms.

Decoding is the application of phonics knowledge to read unfamiliar words. Your report should document both real-word reading and pseudoword (nonword) decoding separately. Pseudoword decoding scores isolate phonics application from whole-word memorization and are particularly diagnostic. A student who reads familiar words adequately but scores significantly lower on pseudoword decoding has a phonics application deficit, regardless of general reading level.

Reading fluency encompasses rate, accuracy, and prosody. Document fluency with scores that separate these components where possible. A student who reads accurately but very slowly has a different profile than a student who reads quickly but inaccurately.

Reading comprehension must be assessed and documented even when it is adequate. When a student with significant decoding and fluency deficits shows grade-level comprehension, document how: oral language comprehension, context use, prior knowledge, or compensatory listening strategies. This profile is common in older students with dyslexia who have developed work-arounds, and the note should be explicit about what is supporting their comprehension.

Written expression includes spelling, writing mechanics, and compositional quality. Students with dyslexia almost universally show spelling deficits that persist even when reading improves. Document spelling both quantitatively (standard scores) and qualitatively (error type analysis: are errors phonetically plausible? Do they suggest phonological confusion or orthographic instability?).

Commonly Used Instruments

Your report should name each instrument by its full title and include the edition used. Outdated editions (administered because they are in the testing cabinet) are a liability risk and a clinical validity problem.

CTOPP-2 (Comprehensive Test of Phonological Processing, Second Edition): The standard instrument for assessing phonological awareness, phonological memory, and RAN. Subtests include Elision, Blending Words, Phoneme Isolation, Memory for Digits, Nonword Repetition, Rapid Digit Naming, and Rapid Letter Naming, among others. Report composite scores (Phonological Awareness, Phonological Memory, Rapid Symbolic Naming, Rapid Non-Symbolic Naming) along with subtest-level scores. Note behavioral observations: did the student use subvocalization during memory tasks? Did RAN performance deteriorate as the row length increased?

TOWRE-2 (Test of Word Reading Efficiency, Second Edition): A brief, efficient measure of real-word reading (Sight Word Efficiency) and pseudoword decoding (Phonemic Decoding Efficiency) under timed conditions. TOWRE-2 scores are particularly useful for identifying fluency deficits and for establishing a decoding baseline that is quick to re-administer at progress review.

WJ-IV Tests of Achievement (Woodcock-Johnson IV): Provides broad coverage of reading, writing, and mathematics achievement. For dyslexia evaluations, prioritize the Letter-Word Identification, Word Attack, Passage Comprehension, Oral Reading, Reading Recall, Sentence Reading Fluency, and Spelling subtests. Document age and grade equivalent scores alongside standard scores. Grade equivalents are often the metric that parents understand most readily, but they should be presented alongside standard scores rather than in isolation.

KTEA-3 (Kaufman Test of Educational Achievement, Third Edition): A comprehensive achievement battery with detailed error analysis built into several reading subtests. The KTEA-3 is particularly useful when you want structured qualitative data alongside scores. Its Letter and Word Recognition, Nonsense Word Decoding, Reading Comprehension, Silent Reading Fluency, Word Recognition Fluency, Decoding Fluency, and Spelling subtests provide strong coverage of the dyslexia profile.

GORT-5 (Gray Oral Reading Tests, Fifth Edition): Yields Rate and Accuracy scores that combine into a Fluency score, plus a separate Comprehension score; Fluency and Comprehension together form the Oral Reading Index. The GORT-5 is particularly useful for documenting the relationship between oral reading fluency and comprehension, and for setting measurable fluency goals.

Writing the Diagnostic Narrative

Test scores without interpretation do not constitute a dyslexia evaluation report. The narrative must do three things: describe the pattern, tie the pattern to the diagnostic criteria, and connect the findings to the student's observable academic struggles.

A diagnostic narrative section might read:

Marcus (age 9, grade 3) presents with a consistent phonologically based reading disability profile. On the CTOPP-2, his Phonological Awareness composite fell at the 7th percentile (standard score 79), with particularly depressed performance on Elision (scaled score 6) and Blending Words (scaled score 7). His Phonological Memory composite fell at the 9th percentile (standard score 81). Rapid Symbolic Naming fell at the 5th percentile (standard score 75), indicating significant automaticity deficits beyond phonological processing alone. On the TOWRE-2, Phonemic Decoding Efficiency fell at the 4th percentile (standard score 72), substantially below his Sight Word Efficiency score at the 18th percentile (standard score 87), indicating that phonics application remains a primary deficit even as sight-word recognition develops. WJ-IV Passage Comprehension fell at the 30th percentile (standard score 92) when read aloud by the examiner, indicating adequate oral language comprehension. When reading independently, Sentence Reading Fluency fell at the 5th percentile (standard score 74). This pattern (adequate comprehension with oral input but severely depressed decoding and fluency under independent reading demands) is characteristic of dyslexia as defined by the IDA and consistent with the pattern described by Marcus's third-grade teacher. Marcus's difficulties are unexpected in relation to his oral language abilities and cognitive processing strengths.
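Standard scores and percentile ranks in a narrative like this should agree with each other. On the usual mean-100, SD-15 scale the mapping is the normal cumulative distribution; a quick sanity check (the function name is illustrative, and published norm tables may round a point or two differently):

```python
import math

def ss_to_percentile(ss, mean=100.0, sd=15.0):
    """Percentile rank implied by a standard score under a normal distribution."""
    z = (ss - mean) / sd
    return 50.0 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Standard score 75 lands near the 5th percentile, matching the
# Rapid Symbolic Naming score reported above; exact norm-table values vary.
ran_percentile = ss_to_percentile(75)
```

A reported pair that diverges badly from this mapping (say, a standard score of 85 paired with the 2nd percentile) usually signals a transcription error worth catching before the report goes out.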

The narrative should conclude with a clear diagnostic statement. Many states now have dyslexia-specific language requirements for school evaluation reports. Know your state's statute and use the required terminology. In states without a specific requirement, use the IDA definition and note that the student's profile is consistent with dyslexia.

IDEA Eligibility and SLD Documentation

The Three-Prong IDEA Framework

A student with dyslexia who needs specially designed instruction qualifies for special education services under the SLD category of IDEA. The eligibility documentation must establish three things:

  1. The student does not achieve adequately for their age or meet state-approved grade-level standards in one or more of the specified areas (oral expression, listening comprehension, basic reading skill, reading fluency skills, reading comprehension, written expression, mathematics calculation, mathematics problem solving).
  2. The student does not make sufficient progress to meet age or grade-level standards when provided appropriate instruction.
  3. The student's pattern of strengths and weaknesses is consistent with SLD and is not primarily the result of a visual, hearing, or motor disability; intellectual disability; emotional disturbance; cultural factors; environmental or economic disadvantage; or limited English proficiency.

Each prong requires documentation. The first prong is met through assessment data. The second prong is where your intervention tier records become critical. The third is the exclusionary analysis section of your report.

Exclusionary Analysis

The exclusionary analysis is one of the most consistently underdeveloped sections in SLD evaluation reports. Write it with the same specificity you bring to the assessment data. For a student like Marcus:

Marcus's current vision and hearing screening results are within normal limits (most recent vision screening date and results; most recent hearing screening date and results). Intellectual ability was assessed via the WISC-V; Marcus's Full Scale IQ falls within the average range (standard score 102), ruling out an intellectual disability as the primary explanation for his reading difficulties. His academic difficulties are present across instructional settings and have persisted despite consistent school attendance and documented Tier 2 intervention participation. Teacher reports and parent reports indicate that Marcus's primary language is English and that he has been enrolled in English-medium instruction since kindergarten. His reading difficulties cannot be attributed to limited English proficiency. No evidence of emotional disturbance as a primary factor was identified through behavioral rating scales or clinical observation.

Vague statements such as "the difficulties are not due to other factors" do not satisfy this requirement. Name each exclusionary criterion explicitly and state the evidence supporting each exclusion.

Documenting MTSS Tier 2 and Tier 3 Intervention

What the Intervention Record Must Show

Before a student reaches a comprehensive evaluation, most schools will have documented a period of intervention within a Multi-Tiered System of Supports (MTSS) or Response to Intervention (RTI) framework. These records are part of the eligibility determination documentation and must meet a standard that many schools do not reach.

For each tier of intervention, the record should establish:

  • The intervention used, by name and program title (Tier 2 small-group intervention should be named: "Barton Reading and Spelling System, Level 2" is more useful than "phonics group")
  • Delivery fidelity: who delivered the intervention, in what setting, for how many minutes per session, how many days per week, and over what date range
  • Attendance: actual attendance, not scheduled attendance
  • Progress monitoring data: scores from standardized, curriculum-based, or norm-referenced probes, administered at a frequency appropriate to the tier (typically every 1-2 weeks for Tier 3, every 2-4 weeks for Tier 2), not just benchmark assessments three times per year
  • Data-based decisions: what the team decided to do in response to the data, and when

A progress monitoring record that shows three Universal Screening data points across an academic year and nothing in between does not demonstrate that the student received Tier 2 intervention. It demonstrates that the school administers benchmark assessments. These are not the same thing.
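The five fields listed above can be captured in a single flat record per intervention period. A minimal sketch, with illustrative field names and values (adapt to your district's MTSS documentation system):

```python
# Illustrative Tier 2 intervention record; every value shown is an example.
tier2_record = {
    "intervention": "Barton Reading and Spelling System, Level 2",
    "provider": "reading specialist",
    "setting": "resource room, group of 3",
    "minutes_per_session": 45,
    "days_per_week": 4,
    "date_range": ("2026-10-07", "2026-12-14"),
    "sessions_scheduled": 36,
    "sessions_attended": 31,  # actual attendance, not scheduled
    "probes": [               # (date, instrument, score), every 2-4 weeks at Tier 2
        ("2026-10-07", "DIBELS 8 ORF", 8),
        ("2026-10-21", "DIBELS 8 ORF", 10),
    ],
    "decisions": [            # (date, data-based team decision)
        ("2026-12-16", "refer for comprehensive evaluation"),
    ],
}

def record_is_complete(rec):
    """A record with any empty field cannot support an eligibility review."""
    return all(v not in (None, "", []) for v in rec.values())
```

The completeness check is the point: a record missing probes or decisions is exactly the "benchmark assessments only" file described above.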

Sample Tier 3 Intervention Record Entry

Student: Sofia R., Grade 2. Intervention: S.P.I.R.E. (Specialized Program Individualizing Reading Excellence), Level 1. Delivery: 45 minutes daily, 5 days per week, with reading specialist Ms. Alvarez. Session dates: October 7 through December 14.

Progress monitoring probe: DIBELS 8th Edition ORF (Oral Reading Fluency) administered every two weeks. Baseline: 8 words correct per minute (wcpm). Week 2: 10 wcpm. Week 4: 9 wcpm. Week 6: 11 wcpm. Week 8: 12 wcpm. Week 10: 11 wcpm.

Data interpretation: Sofia's ORF growth rate is approximately 0.3 wcpm per week. Expected growth rate for a grade 2 student receiving intensive intervention is approximately 2.0 wcpm per week (Good, Simmons, and Kame'enui benchmark). Sofia is not responding to Tier 3 intervention at the current intensity and approach. Team decision (December 16 meeting): refer for comprehensive psychoeducational evaluation. Parent notified by phone on December 16; written referral to school psychologist completed December 17.

That is the documentation that supports a referral and, later, an eligibility determination. A note that says "Sofia struggled in her reading group this quarter" does not.
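The growth rate in a record like Sofia's can be computed directly from the probe series. A least-squares slope over all probes is more stable than dividing the endpoint difference by elapsed weeks; a minimal sketch:

```python
def weekly_growth_rate(probes):
    """Least-squares slope of (week, wcpm) probe pairs: wcpm gained per week."""
    n = len(probes)
    mean_w = sum(w for w, _ in probes) / n
    mean_s = sum(s for _, s in probes) / n
    num = sum((w - mean_w) * (s - mean_s) for w, s in probes)
    den = sum((w - mean_w) ** 2 for w, _ in probes)
    return num / den

# Sofia's ORF series: baseline plus five biweekly probes
sofia = [(0, 8), (2, 10), (4, 9), (6, 11), (8, 12), (10, 11)]
rate = weekly_growth_rate(sofia)  # about 0.33 wcpm/week, far below the ~2.0 target
```

Recording the computed rate alongside the expected rate, as the sample entry does, is what turns a list of scores into a defensible data-based decision.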

Progress Monitoring for Structured Literacy Programs

What to Document at Each Session

When a student is receiving structured literacy intervention (meaning systematic, explicit, sequential instruction in phonology, phonics, morphology, syntax, and semantics), the session note should capture whether the instruction actually occurred in the prescribed sequence and whether the student demonstrated mastery at each level before moving forward.

For a student receiving Orton-Gillingham instruction, a session record might include:

  • Review: phonogram cards reviewed (list the deck used and any cards that required multiple presentations)
  • New learning: the phonogram or phonics concept introduced that session, the teaching procedure used (visual-auditory-kinesthetic-tactile linkage, card introduction sequence)
  • Reading practice: words and sentences read from that lesson level; accuracy rate
  • Spelling dictation: words and sentences dictated; accuracy rate
  • Connected text: passage or decodable text used; wcpm and accuracy percentage
  • Error pattern: the specific error types the student produced (substituting short vowel sounds, omitting blends, reversing letter sequence in multisyllabic words)
  • Mastery determination: whether the lesson concept is ready to progress or needs additional practice

For a student in the Wilson Reading System, document the step the student is working on (Step 1 through Step 12), the lesson activities completed, the student's word card deck performance, and whether the student met the criterion for advancing to the next part of the step.

For a student using S.P.I.R.E. (Specialized Program Individualizing Reading Excellence), document the lesson level and lesson number, the student's accuracy on the word and phrase reading exercise, the dictation accuracy, and the fluency probe result if administered that session.
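Across programs, the session note reduces to the same core fields. A sketch of a structured note with an illustrative mastery gate (the field names, sample values, and the 90% criterion are examples, not requirements of any program):

```python
# Illustrative structured session note; all values are examples.
session_note = {
    "program": "Wilson Reading System",
    "step": "Step 2, substep 2.3",
    "activities": ["sound cards", "word cards", "dictation", "decodable passage"],
    "reading_accuracy": 0.90,    # e.g. 18/20 words from the current word card deck
    "dictation_accuracy": 0.80,  # e.g. 8/10 dictated words
    "error_patterns": ["substituted short /e/ for short /i/ in CVC words"],
}

def ready_to_advance(note, criterion=0.90):
    """Illustrative gate: both accuracies must meet the criterion to advance."""
    return (note["reading_accuracy"] >= criterion
            and note["dictation_accuracy"] >= criterion)
```

Whatever criterion your program prescribes, the mastery decision should be computable from the note itself, not reconstructed from memory at the next team meeting.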

Longitudinal Progress Tracking

A single session note, however detailed, has limited value in isolation. What matters at annual IEP review, re-evaluation, or due process is the pattern over time. Build a simple data table into your progress monitoring record that allows anyone reading the file to see at a glance whether the student is growing, plateauing, or declining. Minimum fields: date, probe administered, score, and a one-line interpretation.

This is not additional paperwork. It is the same data you are already collecting, organized so it can be read.
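The minimum fields above fit a plain CSV that any team member can open without special software. A sketch, with illustrative rows:

```python
import csv
import io

FIELDS = ["date", "probe", "score", "interpretation"]

def render_record(rows):
    """Render the longitudinal progress monitoring record as CSV text."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(FIELDS)
    writer.writerows(rows)
    return buf.getvalue()

# Example entries; dates, scores, and interpretations are illustrative.
rows = [
    ("2026-10-07", "DIBELS 8 ORF", 8, "baseline"),
    ("2026-10-21", "DIBELS 8 ORF", 10, "slight gain"),
    ("2026-11-04", "DIBELS 8 ORF", 9, "flat; continue and re-probe"),
]
```

One file per student per goal area keeps the record readable at annual review without any reformatting.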

Writing IEP Goals for Dyslexia

What a Measurable Goal Requires

An IEP goal for a student with dyslexia must identify the skill, the condition under which it will be measured, the criterion for mastery, and the timeline. Generic goals do not meet IDEA's measurability standard and do not give teachers or reading specialists enough structure to know what success looks like.

Insufficient goal: "Marcus will improve his reading fluency."

Sufficient goal: "By the end of the IEP period (June 2027), Marcus will read second-grade-level decodable text at a rate of 60 words correct per minute (wcpm) with 95% accuracy across three consecutive probes, as measured by DIBELS 8th Edition ORF administered by the reading specialist."

Insufficient goal: "Sofia will improve phonological awareness skills."

Sufficient goal: "By the end of the IEP period, Sofia will correctly segment and blend words of up to three phonemes with 90% accuracy across two consecutive assessment sessions, as measured by the phoneme segmentation subtest of the DIBELS 8th Edition administered by the resource room teacher."
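The required elements of a measurable goal can be checked mechanically before the IEP meeting. A minimal sketch (field names and example values are illustrative, not drawn from any IEP system):

```python
from dataclasses import dataclass, fields

@dataclass
class IEPGoal:
    """One measurable goal; every element must be filled in."""
    skill: str      # e.g. "oral reading fluency on grade-2 decodable text"
    condition: str  # e.g. "DIBELS 8 ORF administered by the reading specialist"
    criterion: str  # e.g. "60 wcpm at 95% accuracy, three consecutive probes"
    timeline: str   # e.g. "by the end of the IEP period (June 2027)"
    baseline: str   # e.g. "current performance from the evaluation report"

def missing_elements(goal):
    """Names of required goal elements left empty."""
    return [f.name for f in fields(goal) if not getattr(goal, f.name).strip()]
```

"Marcus will improve his reading fluency" fails this check on every field except skill, which is exactly why it fails the measurability standard.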

Goals Across Skill Domains

For a student with the full dyslexia profile, the IEP will typically need goals across multiple domains. Consider whether the student needs goals in: phonological awareness (if not yet at ceiling), phonics/decoding, reading fluency, spelling, and written expression. Goals in each area should be traceable to the baseline data in the evaluation report. If the evaluation showed a standard score of 72 on TOWRE-2 Phonemic Decoding Efficiency, the IEP goal for decoding should reference where the student currently performs, not just where you want them to go.

Accommodation Documentation

Accommodations for dyslexia are not goals. They are supports that allow the student to access the curriculum while their reading skills are being explicitly taught. Both must be present in the IEP. Document each accommodation with enough specificity to be implemented consistently across settings.

Extended time: Specify the ratio (1.5x, double time), the settings in which it applies (all classroom tests, standardized assessments, state exams), and whether it applies to homework.

Text-to-speech: Name the software or tool (Kurzweil 3000, NaturalReader, built-in device accessibility features). Specify whether the student uses it for all reading tasks or only for longer passages. Note whether the student requires training on the tool and whether that training is documented elsewhere in the IEP.

Audiobooks: Specify the provider (Learning Ally, Bookshare, Audible for Education), the classes in which audiobooks will be provided, and who is responsible for ensuring materials are available in audio format in advance of assignment due dates.

Spelling support: Distinguish between spell-check tools (which catch phonetically close errors) and predictive text tools (which allow the student to select a word from partial input). Some students with dyslexia have errors so phonologically distant from the target word that standard spell-check cannot identify what they intended. Note whether this is the case for your student.

Read-aloud for tests: Specify whether this applies to all content areas, whether math word problems are included, and whether it applies to state standardized assessments (which have their own eligibility requirements).

Common Documentation Mistakes in Dyslexia Evaluation and Intervention

Avoiding the word "dyslexia" in the report. Using terms like "reading disorder" or "specific reading difficulty" when the pattern clearly meets the IDA definition is not a neutral documentation choice. Many states now require the word to appear in reports when the profile is consistent with dyslexia. Beyond legal requirements, the family deserves a report that names what their child has.

Reporting only composite scores. A student whose subtest scores within a composite show wide scatter is a fundamentally different clinical picture than a student with uniformly depressed scores at the same composite level. Report subtest scores and interpret the pattern. An evaluator who reports only composites misses the diagnostic information that justifies the referral.

Omitting error analysis for decoding and spelling. Whether a student's decoding errors are phonetically plausible (reading "flant" for "plant") versus visually similar (reading "plant" for "plans") tells you something qualitatively different about the deficit. Error analysis belongs in the report.

Citing response to intervention without providing the data. Writing "the student did not respond to Tier 2 intervention" without attaching the actual progress monitoring data is circular and legally insufficient. The data must be present.

Writing accommodation lists without describing how each will be implemented. An accommodation that every teacher interprets differently is not an accommodation. Specify the tool, the setting, and the responsible party.

Treating progress monitoring as attendance records. A session log that says "student attended, worked on reading skills, session ended" is a scheduling document, not a clinical record. Document what was taught, what the student produced, and what error patterns appeared.

Failing to write separate goals for decoding and fluency. A student can develop phonics knowledge without developing fluency. These are related but distinct skills with different instructional needs and different measurement approaches. One goal that says "improve reading" does not cover both.

How NotuDocs Fits Into This Workflow

Educational diagnosticians and reading specialists who document intervention sessions, progress monitoring data, and IEP goal updates regularly can use NotuDocs to build structured templates for each session type: initial evaluation narrative sections, session-level intervention records, and progress monitoring summaries. Because the tool is template-first, your specialist-specific language (lesson steps, program names, mastery criteria) stays in the note rather than being replaced by generic output.

Dyslexia Evaluation and Intervention Documentation Checklist

Comprehensive Evaluation

  • Phonological awareness assessed with a normed instrument (CTOPP-2 or equivalent); subtest and composite scores reported
  • Phonological memory assessed and documented with subtest scores
  • Rapid automatized naming assessed; both symbolic and non-symbolic naming documented where instrument allows
  • Orthographic processing assessed or absence of assessment explained in report
  • Pseudoword decoding assessed separately from real-word reading
  • Reading fluency (rate, accuracy, prosody components) documented with normed scores
  • Reading comprehension assessed with both oral and silent conditions where feasible
  • Written expression including spelling assessed; error analysis included
  • Behavioral observations during testing documented (hesitations, self-corrections, frustration, use of subvocalization, finger-tracking)
  • All instruments identified by full title and edition

Diagnostic Narrative and Eligibility

  • Pattern of scores described as a coherent profile, not a list of scores
  • Dyslexia definition referenced (IDA or state-specific statute)
  • Word "dyslexia" used where the profile is consistent with it
  • Exclusionary factors addressed individually: vision, hearing, intellectual disability, emotional disturbance, language background, environmental factors
  • IDEA three-prong SLD eligibility addressed explicitly

MTSS and Intervention Tier Records

  • Intervention named by program title and level
  • Delivery data recorded: provider, setting, minutes per session, days per week, date range
  • Attendance documented (actual, not scheduled)
  • Progress monitoring data collected at appropriate frequency for tier
  • Data-based decisions documented with dates and rationale
  • Insufficient response documented with specific growth rate comparison to expected rate

Structured Literacy Session Notes

  • Program name and lesson number or level recorded in every session note
  • Review activities documented with accuracy data
  • New learning or current lesson content recorded with specific phonics concept or pattern
  • Decoding accuracy documented (percentage correct or number of errors)
  • Spelling dictation accuracy documented
  • Error patterns described in behavioral terms (specific substitution or omission, not "made errors")
  • Mastery decision documented with criterion referenced

Progress Monitoring Longitudinal Record

  • Standardized probe identified (DIBELS 8 ORF, TOWRE-2, or equivalent)
  • Baseline score documented
  • Probe scores recorded at consistent intervals with dates
  • Growth rate calculated or compared to expected growth benchmark
  • Team decisions triggered by data patterns are documented with dates

IEP Goals

  • Each goal names the skill, condition, criterion, measurement tool, and timeline
  • Goals trace back to specific baseline data from the evaluation
  • Decoding and fluency addressed as separate goals where both are deficits
  • Spelling goal present if spelling is an area of need
  • Written expression goal present if compositional or mechanics deficits identified

Accommodations

  • Extended time specifies ratio and applicable settings
  • Text-to-speech specifies tool and task types covered
  • Audiobook access specifies provider and responsible party
  • Spelling supports distinguish between spell-check and predictive text where relevant
  • Read-aloud accommodations specify content areas and whether state assessments are included

Related reading: How to Document Gifted Education Evaluations and Twice-Exceptional Student Plans | How to Document ELL Assessments and Progress Reports | How to Document School-Based Counseling and Mental Health Services

Stop Writing Notes From Scratch

NotuDocs automatically turns your raw session notes into structured, professional documents. Choose a template, record your session, and export in seconds.

Try NotuDocs for free

No credit card required