The Reality
Almost every student’s scores go up from one screening window to the next. That’s the norm. But occasionally a score will drop or stay flat. When a parent sees this, two things happen simultaneously: they feel alarmed, and they want an explanation.
This guide gives you both—a clear understanding of why scores sometimes decrease and specific language for the parent conversation. The goal is not to dismiss the concern. The goal is to help the parent understand what the score is actually telling them, and what you’re doing about it.
A Score Is an Estimate, Not a Fixed Truth
Every screening score is an estimate of a student’s skills on a given day. No assessment can perfectly capture a student’s knowledge or ability. Many factors influence performance—focus, fatigue, confidence, and the specific content of the test. A single result should never be interpreted as a final or permanent statement about a student’s ability or potential.
This is not a flaw in any specific assessment. It is a property of all standardized measurement. A score of 50 might represent a student whose true performance on that day falls somewhere between 45 and 55. If a student scored at the higher end of that range in the fall and the lower end in the winter, the score drops—even though nothing about the student’s actual ability has changed.
The core principle:
A screening score by itself cannot answer your questions about a student. It is a signal. It tells you where to look more closely, what additional information to gather, and when to bring the data to your team. The score opens the conversation. It does not conclude it.
Why Scores Sometimes Decrease
There are several legitimate reasons a score might drop between screening windows. Understanding them allows you to give parents an honest, informed response rather than a vague reassurance.
1. Measurement Variability
Every assessment has a margin of error. A student who happened to score at the higher end of her true range in the fall and the lower end in the winter will show a drop even though nothing about her actual ability has changed. This is the most common explanation for small score fluctuations, and it applies to every standardized test in existence.
What this means for the parent conversation:
A small score change in either direction—a few percentile points—is within the normal range of measurement. It’s the equivalent of stepping on a bathroom scale three times in a row and getting three slightly different numbers. The scale isn’t broken. The measurement just has a natural range.
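The bathroom-scale idea can be made concrete with a tiny simulation. This is an illustrative sketch, not any real screener’s scoring model: the ±5-point precision, the score of 50, and the random seed are all invented for the example.

```python
import random

random.seed(7)  # fixed seed so the illustration is reproducible

TRUE_SCORE = 50  # the student's unchanged underlying ability (hypothetical)
PRECISION = 5    # hypothetical +/- measurement range of the screener

# Three screening windows: same student, same ability, noisy measurement
observed = [TRUE_SCORE + random.randint(-PRECISION, PRECISION)
            for _window in ("fall", "winter", "spring")]

print(observed)  # three different numbers from one unchanged ability
```

Run it and the three “scores” differ even though `TRUE_SCORE` never changes; a drop from one window to the next can be nothing more than this noise.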
2. Assessment Behavior as an Engagement Signal
When a student rushes through an assessment, clicks through items carelessly, or disengages partway through, it is tempting to treat this as a testing problem—bad data that should be discarded and retaken. But that misses something important. The behavior during the assessment is itself meaningful information.
A student who races through a screener in five minutes, or who gives up before finishing, is telling you something about her relationship to the work. She may not see the purpose of the task. She may not feel confident enough to try. She may feel disconnected from the adults who are asking her to do it. She may feel that the work is so far beyond or below her level that effort feels pointless.
These are not testing problems. They are engagement problems. And they are often the same patterns the student shows in the classroom every day.
Two kinds of signals:
A screening score can signal two fundamentally different things. It can reveal a specific skill gap—the student tried her best and the score shows where her knowledge breaks down. Or it can reveal a challenge with engagement—the score doesn’t reflect what the student knows because the student wasn’t in a position to show it. Both signals require a response. But they require different responses. A skill gap calls for targeted instruction. An engagement challenge calls for addressing the conditions that make learning possible in the first place.
3. Content Sampling
Screening assessments sample from a range of skills. The specific items a student encounters differ from one administration to the next. A student who happened to receive more items aligned with her stronger skills in the fall may encounter more items targeting areas of relative weakness in the winter. The student hasn’t lost ability. The sample changed.
4. Curriculum Timing
Screening assessments measure a broad range of grade-level skills, but the curriculum doesn’t teach those skills simultaneously. A student screened in winter may be assessed on skills not yet taught in the current instructional sequence, or on skills taught earlier that haven’t been practiced recently. This can produce a temporary dip that resolves as the curriculum cycle continues.
5. The Moving Target
This is the most important one and the hardest for parents to understand. The screening assessment doesn’t just measure what a student knows. It measures where a student is relative to where she should be at this point in the school year. Grade-level expectations advance between screening windows. A student who is learning—but learning at a slower rate than the curriculum expects—can appear to hold steady or even decline, because the benchmark moved.
An analogy that works:
Imagine your child is running a race, and the finish line moves forward between each checkpoint. She’s running faster than before. But the finish line moved faster. Her score looks worse even though she’s actually improving. The score reflects the gap between where she is and where the grade-level target is now—not whether she learned.
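The moving-target effect can also be shown with simple arithmetic. All the numbers below are hypothetical, chosen only to illustrate the idea:

```python
# Invented numbers for illustration only.
student_fall, student_winter = 40, 46        # the student gained 6 points of skill
benchmark_fall, benchmark_winter = 50, 60    # the grade-level target advanced 10

gap_fall = benchmark_fall - student_fall         # 10 points behind in fall
gap_winter = benchmark_winter - student_winter   # 14 points behind in winter

# The student learned real skills, yet the reported gap grew,
# because the benchmark moved faster than the student did.
print(gap_fall, gap_winter)
```

A norm- or benchmark-referenced score reports the gap, so it can fall even while the student’s absolute skill rises.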
6. Learning Is Not Perfectly Linear
Growth often includes periods of consolidation, challenge, and adjustment. As students encounter more complex material, their performance may temporarily fluctuate before strengthening again. A student moving from single-digit addition to multi-digit subtraction with regrouping is encountering qualitatively different demands. A temporary plateau or dip during that transition is a normal part of learning, not evidence of regression.
A Score Decline Is a Signal, Not an Answer
This is the most important section of this guide. When a score goes down, the natural instinct is to ask: what happened? But a single screening score cannot answer that question by itself. It was never designed to.
What a score decline can do is tell you where to direct your attention. It is a signal to look more closely at the student, to consult other sources of data, and to bring the question to your team. The screening score opens the conversation. The educator’s professional judgment—informed by classroom evidence, teacher observations, and team discussion—is what determines the response.
Schools do not rely on a single test score to understand a student’s learning. Teachers see students every day. They notice growth in persistence, confidence, and strategy use. They observe how students apply skills in real classroom situations. These insights are essential and often provide the most meaningful understanding of a child’s development. When screening data is considered alongside this broader evidence, it becomes far more powerful—it helps educators confirm patterns, identify emerging needs, and adjust instruction in ways that support each student.
Know Your Next Step Before the Conversation
The most important question a parent will ask is: what are you going to do about this?
Every parent conversation about a score decline will arrive at this question in some form. The educator should know the answer before the conversation begins—not improvise it in the moment.
The data alone does not determine what to do next. The next step is determined by the score combined with what the teacher observes in the classroom, what other data is available, and what your school’s structures and workflows make possible. Two students with identical score drops might need completely different responses—one might need a diagnostic to find a specific skill gap, and another might need an engagement intervention because the score reflects disconnection, not inability.
The preparation for a parent meeting is not “review the score.” The preparation is: review the score, check it against classroom evidence, determine whether the signal is about a skill gap or an engagement challenge, decide which next step fits this student, and be ready to explain it in one or two sentences.
Below are common next steps following a score decline or flat data, along with example language for explaining each one to a parent. Not every step applies to every student. Choose the one that fits the situation. Your school may have additional steps based on your own workflows and structures—space is provided at the end for those.
| Next Step | What to Tell the Parent (Example Rationale) |
| --- | --- |
| Administer a diagnostic assessment | “The screening score tells us she’s below where we’d expect, but it doesn’t tell us exactly where the gap is. The diagnostic will pinpoint the specific skill area so we can target our response precisely.” |
| Administer a skills inventory | “His skills appear to be scattered across several grade levels. The inventory will show us exactly which skills are solid and which have gaps, so we’re not guessing about where to focus.” |
| Bring the student’s data to the next team meeting for discussion | “I want our team to look at this together. When multiple educators who know your child review the data alongside what they’re seeing in the classroom, we get a more complete picture than any one of us could build alone.” |
| Review recent classroom work samples alongside the screening data | “The score is one data point. Before I draw any conclusions, I want to compare it to what she’s actually producing in class every day—that will tell me whether this score reflects a real pattern or a single-day snapshot.” |
| Observe the student during instruction | “I want to watch how he engages with the material during class—where he gets stuck, where he checks out, where he leans in. That will help me understand whether the issue is the content, the approach, or something else entirely.” |
| Talk with the student about her experience | “Sometimes the student can tell us something the data can’t. I want to ask her how she felt during the assessment, how she’s experiencing the class right now, and what would make the work feel more meaningful or manageable.” |
| Address an engagement challenge | “The way he approached the assessment tells me that engagement may be the more immediate issue—before we focus on specific skills, we need to make sure the conditions for learning are in place. I’m going to look at what might be getting in the way—whether the work feels meaningful to him, whether he feels confident enough to try, and whether he has a connection with the adults supporting him—and adjust our approach.” |
| Audit the current intervention | “He’s been receiving support, and I want to take a closer look at whether it’s targeting the right skill, whether he’s receiving it frequently enough, and whether the approach is the best match for what he needs. Flat data doesn’t mean he can’t learn. It means we may need to adjust the plan.” |
| Revise the current intervention and bring the data to the team | “The current approach has had enough time to show results, and the data tells me we need to try something different. I’m bringing this to our team so we can generate a new hypothesis about what she needs and adjust the plan rather than continue with what isn’t working.” |
| Begin collecting progress data more frequently | “Right now we’re checking in every three months with the screener. For this student, I want to start collecting data every week or two so we can see much sooner whether our approach is working—and adjust quickly if it’s not.” |
| Schedule a follow-up conversation with the parent at a specific date | “I’m going to [specific action] over the next [specific timeframe], and I’d like to schedule a time to share what we find. How does [specific date] work for a check-in?” |
| Your school’s next steps: | |
| | |
| | |
| | |
| | |
| | |
When to Reassure and When to Investigate
Not every score decrease is a signal that something is wrong. But some are. Here is how to tell the difference.
| Likely Normal Variation — Reassure | Worth Investigating — Gather More Information |
| --- | --- |
| The drop is small (a few percentile points) | The drop is large (10+ percentile points or crossing a meaningful threshold) |
| Classroom performance is consistent or improving | Classroom performance also shows a decline |
| The previous score was unusually high for this student | This is the second consecutive window of flat or declining scores |
| The student is otherwise progressing in the classroom | The student is receiving support and the data shows no response to that support |
| A short-term factor clearly affected one session (illness, family emergency) | The student rushed or disengaged—and the teacher sees the same pattern in the classroom |
The key principle: One data point is not a trend. A single score decrease is information—a signal to look more closely and gather additional evidence. Two consecutive windows of flat or declining scores is a pattern that warrants a team discussion and a specific plan.
What to Say to Parents
Below are specific phrases organized by situation. These are not scripts to read verbatim. They are frameworks for an honest, professional conversation. Notice that each one ends with a next step—because the parent will ask, and you should be ready.
When the drop is small and likely measurement variability:
Say: “Screening assessments have a built-in range of precision, the same way a bathroom scale gives slightly different numbers each time you step on it. A shift of a few points in either direction is within that normal range. What I pay attention to is the overall trajectory across multiple windows—and your child’s trajectory is [on track / consistent / heading in the right direction].”
When the assessment behavior signals an engagement challenge:
Say: “I want to share something important about how your child approached the assessment. [He rushed through it very quickly / She disengaged partway through / He didn’t appear to invest effort in the way we’d expect.] We distinguish between a score that reveals a specific skill gap and a score that reveals a challenge with engagement. In this case, I believe the assessment is signaling to us that engagement is the issue—and that’s consistent with what I’m seeing in the classroom as well. The score may not tell us exactly what he knows, but his approach to the assessment tells us something equally important: that the conditions for learning need attention. Here’s what I’m going to do to help with that…”
When the moving target is the likely explanation:
Say: “This is something that can be confusing about screening data. The assessment isn’t just measuring what your child knows—it’s measuring where he is relative to grade-level expectations, and those expectations advance between each screening window. So even when a student is learning, if the pace of learning is slower than the pace of the curriculum, the score can hold steady or dip. That doesn’t mean he’s going backward. It means the target moved. Here’s what we’re doing to close that gap…”
When you need to investigate further:
Say: “I want to be straightforward with you. The score is lower than we expected, and I’ve also been noticing [specific observation] in the classroom. A single score can’t tell us exactly what’s happening—but when I put it together with what I’m seeing daily, it’s a signal to look more closely. Here’s what I’m going to do: I’m bringing this to our team so we can review the data together and determine whether we need [a diagnostic assessment / a closer look at the specific skill area / an adjustment to the current support]. I’d rather investigate now and find there’s nothing to worry about than wait another three months.”
When the student is receiving support and the data is flat:
Say: “Your child has been receiving [specific support] and the screening data isn’t showing the movement we want to see yet. I want you to know that doesn’t mean the support isn’t helping or that your child can’t make progress. What it tells me is that we need to look more closely at the approach—whether we’re targeting the right skill, whether the intensity is sufficient, whether something else is getting in the way. I’m bringing this to our team so we can adjust the plan rather than continue with something that isn’t producing results.”
When the parent notices something the school hasn’t:
Say: “Thank you for telling me that. What you’re seeing at home is important information. When a parent’s observations align with a change in the data, that’s a stronger signal than either one alone. I want to take what you’re describing seriously and bring it to our team alongside the screening data so we can look at the full picture together.”
What Not to Say
| Avoid This | Why It’s a Problem |
| --- | --- |
| “Don’t worry about it.” | Dismisses the parent’s concern. Even if the drop is within normal variation, the parent needs to understand why, not be told not to care. |
| “It’s just one test.” | Technically true but sounds defensive. Better to explain what a single data point can and can’t tell you—and what you’re going to do with it. |
| “He just rushed through it. We’ll have him retake it.” | Treats the rushing as a testing error rather than a signal. A student who rushed through the assessment is telling you something about his engagement. Retaking the test addresses the score. It doesn’t address the student. |
| “These tests aren’t that accurate.” | Undermines your own assessment program. The more honest version is that all assessments have a measurement range, and this is why we use multiple sources of evidence. |
| “She’s doing fine in class.” (without specifics) | Sounds like you’re avoiding the question. If classroom performance is strong, name specific evidence. |
| “This is what we’d expect given his profile.” | Communicates a ceiling. Even if you believe the student has significant challenges, the parent should hear what you’re going to try, not that stagnation was predicted. |
| “We’ll keep monitoring.” (as the only response) | Monitoring without action is not a plan. If the data concerns you, name the action. If it doesn’t concern you, explain why and name what you’re watching for next. |
A Structure for the Conversation
When a parent asks about a score decrease, the conversation has three parts. Do them in order.
1. Acknowledge the concern.
Do not begin by explaining. Begin by hearing. The parent is worried about her child. That worry is legitimate regardless of whether the score decrease is meaningful.
Example: “I understand why that caught your attention. Let me walk you through what I think is happening.”
2. Explain the most likely cause and the evidence you’re drawing on.
Name the specific reason you believe applies to this student. Do not give a generic list of reasons scores can fluctuate. Pick the one that fits and explain it clearly. If you believe the signal is about engagement rather than a skill gap, say so directly. Then describe the other sources of evidence—classroom performance, teacher observations, work samples, effort patterns—that inform your interpretation.
Example: “In this case, I believe the most likely explanation is [specific reason], and here’s why I think that. When I look at what I’m seeing in the classroom…”
3. Name the next step and the timeline.
Every parent conversation about data should end with what you are doing, not just what you think. This is the moment the parent has been waiting for. Be specific. Name the action, explain why it’s the right one for this student, and give a timeline for when you’ll follow up.
Example: “Here’s what I’m going to do. [Specific action from the next steps table.] I’m doing this because [one-sentence rationale]. And I’ll follow up with you by [specific date] to let you know what we find.”
The Question Parents Will Ask
If scores can vary and don’t capture the full picture, why do we bother with assessments at all?
This is an important and thoughtful question, and it deserves a direct answer.
The value of screening assessments does not come from any single score. It comes from the process of tracking scores over time and coordinating those results with other sources of information. When schools look at patterns across multiple testing windows, they catch things early that might otherwise go unnoticed until a much larger gap has developed.
Screening assessments are not meant to replace professional judgment. They are tools that strengthen it. They give educators a systematic reason to ask: is this student where we expect her to be? And if not, what do we need to learn, and what should we try?
The Underlying Principle
Parents don’t need you to be certain about what caused a score to drop. They need you to be honest about what you see, clear about what it means, and specific about what you’re going to do.
The parent who hears “I’m not sure yet, but here’s what I’m going to do to find out” trusts you more than the parent who hears “Don’t worry about it.”
A score cannot answer your questions. It can tell you which questions to ask. The team discussion is where those questions get answered. The parent should know that discussion is happening.
Certainty is not the goal. Transparency and action are.
Quick Reference: Score Change Decision Guide
Use this when reviewing data before parent conferences.
| What You See | Most Likely Cause | Your Next Step |
| --- | --- | --- |
| Small dip, classroom work fine | Measurement variability | Reassure with explanation. Note what you’ll watch for at next window. |
| Drop after unusually high previous score | Regression to the mean | Explain that the previous score was likely the outlier, not this one. |
| Student rushed or disengaged during assessment | Engagement challenge | Identify which engagement condition is unmet. Address engagement before targeting skills. |
| Score flat, student receiving support | Intervention may need adjustment | Bring data to team. Review the approach. Report plan to parent. |
| Decline in both screener and classroom | Real skill gap or emerging need | Investigate: diagnostic or inventory. Form a hypothesis. Discuss with team. Report timeline to parent. |
| Two consecutive windows of decline | Pattern, not noise | Team review required. Specific plan with specific follow-up date to parent. |
| Parent reports concerns from home that align with score | Converging evidence | Thank the parent. Bring both data points to team. Treat as a stronger signal than either alone. |
