
Scatter Plot Analysis

 

30 setup prompts for producing scatter plots from school test data.

These prompts are designed to help you explore relationships between test performance and a wide range of other factors. For each prompt, you would typically plot the test score on the vertical (Y) axis and the other variable on the horizontal (X) axis.
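As a concrete starting point, here is a minimal sketch of one such plot using pandas and matplotlib; the column names and scores are hypothetical stand-ins for whatever your gradebook export contains.

```python
# Minimal scatter-plot sketch; data below is invented for illustration.
import matplotlib
matplotlib.use("Agg")  # render off-screen (no display needed)
import matplotlib.pyplot as plt
import pandas as pd

df = pd.DataFrame({
    "hours_studied": [2, 5, 1, 8, 4, 6],        # X: explanatory variable
    "exam_score":    [61, 78, 55, 92, 70, 84],  # Y: test score
})

fig, ax = plt.subplots()
ax.scatter(df["hours_studied"], df["exam_score"])
ax.set_xlabel("Hours Spent Studying (per week)")
ax.set_ylabel("Final Exam Score")
fig.savefig("study_vs_score.png")
```

Swapping in any of the X/Y pairings below is just a matter of changing the two column names and the axis labels.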


📈 Category 1: Performance vs. Study Habits

These prompts look for a connection between a student's preparation and their test results.

  1. Hours Spent Studying (per week) vs. Final Exam Score

  2. Percentage of Homework Completed vs. Quiz Average

  3. Number of Practice Problems Completed vs. Math Test Score

  4. Time Spent on Study App (minutes) vs. Vocabulary Test Score

  5. Number of Tutoring Sessions Attended vs. Midterm Exam Score

🏫 Category 2: Performance vs. Engagement & Attendance

These prompts investigate the relationship between a student's physical and mental presence in class and their scores.

  1. Number of School Days Missed (Absences) vs. Standardized Test Score (Reading)

  2. Class Participation Grade vs. History Final Project Score

  3. Percentage of Classes Attended vs. Final Grade Percentage

  4. Number of Times Tardy to Class vs. Pop Quiz Average

  5. Student's Self-Reported "Interest in Subject" (Scale 1-10) vs. Science Test Score

📊 Category 3: Comparing Academic Performance

These prompts compare a student's performance in one area directly against another, looking for consistency or correlation.

  1. Midterm Exam Score vs. Final Exam Score

  2. Standardized Test Score (Math) vs. Standardized Test Score (Language Arts)

  3. Score on Exam 1 vs. Score on Exam 2 (in the same course)

  4. High School GPA vs. College Entrance Exam Score (e.g., SAT/ACT)

  5. Reading Speed (Words Per Minute) vs. Reading Comprehension Test Score

  6. Score on Written Essay vs. Score on Multiple-Choice Test (for the same subject)

🏠 Category 4: Performance vs. Student Background

These prompts explore potential correlations between a student's home environment or demographic factors and their academic achievement.

  1. Family Income vs. Standardized Test Score

  2. Parental Education Level (Years of Schooling) vs. Student's GPA

  3. Number of Books in the Home vs. Reading Test Score

  4. Student's "Self-Efficacy" Score (from a survey) vs. Physics Exam Score

  5. Average Hours of Sleep per Night vs. Cognitive Aptitude Test Score

📝 Category 5: Performance vs. In-Test Behavior

These prompts look at what a student does during the test.

  1. Time Taken to Complete Exam (minutes) vs. Final Exam Score

  2. Number of Questions Answered vs. Total Test Score (in a timed test)

  3. Number of Answers Changed vs. Test Score

🧑‍🏫 Category 6: School-Level & Other Factors

These prompts investigate broader environmental or school-wide factors.

  1. Class Size (Number of Students) vs. Class Average Exam Score

  2. Years of Teacher Experience vs. Class's Average Standardized Test Score

  3. Per-Pupil Spending (by school) vs. Average SAT Score (for that school)

  4. Distance from Home to School (miles) vs. Student's Attendance Rate

  5. Screen Time (hours per day) vs. GPA

  6. Score on Pre-Test (at start of unit) vs. Score on Post-Test (at end of unit)
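Any of the pairings above can also be quantified before (or alongside) plotting. A sketch of computing a Pearson correlation with pandas, using invented pre-test/post-test scores:

```python
# Quantify the strength of a pairing before plotting it; scores are invented.
import pandas as pd

df = pd.DataFrame({
    "pre_test":  [40, 55, 62, 48, 70, 35],
    "post_test": [52, 66, 75, 55, 82, 50],
})
r = df["pre_test"].corr(df["post_test"])  # Pearson correlation coefficient
```

A value of `r` near +1 or -1 suggests a strong linear relationship worth visualizing; near 0, the scatter plot will look like a cloud.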

    --------------------------------------

    Here are 30 setup prompts designed to generate scatter plots for a comparative analysis of school test data, categorized by the type of relationship they aim to explore.

    📈 Category 1: Core Subject vs. Core Subject

    These prompts compare performance between two academic disciplines.

    1. "Plot Math scores vs. Reading scores, with points color-coded by grade level."

    2. "Generate a scatter plot comparing Science scores (Y-axis) against Math scores (X-axis), using different shapes for each grade level."

    3. "Show the relationship between Writing assessment scores and Reading comprehension scores, color-coded by grade."

    4. "Create a comparative scatter plot of History scores versus Writing scores for all grades."

    5. "Plot 3rd-grade Math scores vs. 3rd-grade Reading scores on one chart, and 5th-grade Math vs. 5th-grade Reading on another for side-by-side comparison."

    6. "Generate a scatter plot of final exam scores in Language Arts vs. final exam scores in Social Studies, faceted by grade level."

    📊 Category 2: Performance vs. Engagement/Behavior

    These prompts look for correlations between academic scores and student behavior data.

    1. "Analyze the correlation between days absent (X-axis) and final exam scores (Y-axis), differentiated by grade level."

    2. "Plot homework completion rate (%) against Science test scores, using grade level to color-code the points."

    3. "Show the relationship between 'time spent on e-learning platform' and 'Reading comprehension scores,' comparing trends across grades."

    4. "Create a scatter plot of 'number of behavioral incidents' vs. 'average test score,' segmented by grade."

    5. "Compare student attendance rates (X-axis) with their corresponding Math scores (Y-axis), using separate colors for 6th, 7th, and 8th grades."

    6. "Plot 'library books checked out' vs. 'Reading test scores,' color-coding points by grade."

    🧑‍🎓 Category 3: Performance vs. Demographic/Context

    These prompts explore relationships between scores and student background or class context.

    1. "Plot Math scores vs. socioeconomic status (e.g., Free/Reduced Lunch status), and use faceting to create a separate plot for each grade level."

    2. "Generate a scatter plot of Reading scores by student age (in months), coloring the points by grade level to see within-grade age effects."

    3. "Show the relationship between 'class size' (X-axis) and 'average class test score' (Y-axis), with each point representing a class, colored by grade."

    4. "Plot Reading scores (Y) vs. Writing scores (X) for ELL (English Language Learner) students only, color-coded by grade."

    5. "Compare 'teacher experience (years)' vs. 'class average Math score,' using different markers for each grade level."

    ⏳ Category 4: Test-Taking & Growth Analysis

    These prompts analyze test-taking habits or student progress over time.

    1. "Plot 'time taken to complete test' (X-axis) against 'final score' (Y-axis), and create separate plots (facets) for each grade level."

    2. "Show the relationship between 'practice test scores' and 'final exam scores,' using different colors for each grade."

    3. "Plot 'previous year's Math score' (X-axis) against 'current year's Math score' (Y-axis) to show growth, and color-code by current grade level."

    4. "Analyze the change in Reading scores from Fall to Spring semester, plotted against Fall semester scores. Use colors to differentiate grade levels."

    5. "Generate a scatter plot of 'student growth percentile' (SGP) in Math vs. SGP in Reading, colored by grade."

    ⚙️ Category 5: Advanced & Multi-Variable Prompts

    These prompts request more complex visualizations, such as adding trend lines or creating grids of plots.

    1. "Plot Math scores vs. Reading scores. Add a separate linear regression line for each grade level to compare the slope of the relationship."

    2. "Create a scatter plot matrix (pair plot) of Math, Reading, and Science scores. Color the points in all plots by grade level."

    3. "Generate a scatter plot of Science scores vs. 'days absent.' Use faceting to create a 2x3 grid of plots, one for each grade from 3rd to 8th."

    4. "Plot 'average homework score' vs. 'final exam score.' Use large, semi-transparent markers and color-code by grade to visualize cluster density."

    5. "Show a single scatter plot of 4th-grade Reading scores vs. 5th-grade Reading scores for the same cohort of students (longitudinal)."

    6. "Plot Math scores (Y) vs. a 'composite engagement score' (X). Superimpose a 2D density plot for 6th graders over a standard scatter plot for 7th graders."

    7. "Generate a 3D scatter plot with Math score (X), Reading score (Y), and Science score (Z), using color to represent grade level."

    8. "Plot 'standardized test percentile' (Y) against 'report card grade (GPA)' (X). Facet this comparison by both grade level and subject."
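Several of these prompts ask for per-grade color coding plus a separate regression line per grade. One way to sketch that with plain matplotlib and `numpy.polyfit`, on synthetic data:

```python
# Color-coded scatter with one regression line per grade; data is synthetic.
import matplotlib
matplotlib.use("Agg")
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
frames = []
for grade, slope in [(6, 0.8), (7, 0.6)]:
    math = rng.uniform(40, 95, 30)
    reading = slope * math + rng.normal(0, 5, 30) + 20
    frames.append(pd.DataFrame({"grade": grade, "math": math, "reading": reading}))
df = pd.concat(frames, ignore_index=True)

fig, ax = plt.subplots()
for grade, sub in df.groupby("grade"):
    ax.scatter(sub["math"], sub["reading"], alpha=0.6, label=f"Grade {grade}")
    b, a = np.polyfit(sub["math"], sub["reading"], 1)  # slope, intercept
    xs = np.linspace(sub["math"].min(), sub["math"].max(), 2)
    ax.plot(xs, b * xs + a)
ax.set_xlabel("Math score")
ax.set_ylabel("Reading score")
ax.legend()
fig.savefig("per_grade_regression.png")
```

Comparing the fitted slopes across grades is exactly what the "compare the slope of the relationship" prompt is after.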


     

    Here are 25 more setup prompts, grouped into new categories to explore more nuanced relationships in your school test data.

    🔬 Category 6: Sub-Skill & Test Component Analysis

    These prompts break down "scores" into their constituent parts to find more granular relationships.

  1. "Plot scores on the 'Algebra' section vs. scores on the 'Geometry' section of the Math test, color-coded by grade."

  2. "Show the relationship between 'Multiple Choice' score (%) (X-axis) and 'Essay/Short Answer' score (%) (Y-axis) for the English exam, using different shapes for each grade."

  3. "Generate a scatter plot of 'Reading Fluency' scores against 'Reading Comprehension' scores, faceted by grade level."

  4. "Compare performance on the 'Lab' portion (Y) vs. the 'Written Test' portion (X) of the Science final, with separate plots for 7th and 8th grade."

  5. "Plot 'Grammar & Mechanics' sub-score vs. 'Vocabulary' sub-score from the Writing assessment, color-coding by grade."


🏛️ Category 7: Group-Level & Aggregate Analysis

These prompts shift the unit of analysis from individual students to classrooms or schools.

  1. "Create a scatter plot of 'Average Class Math Score' vs. 'Average Class Reading Score,' with each point representing a classroom, colored by grade level."

  2. "Plot 'Variance in test scores within a class' (Y) vs. 'Average score of that class' (X), colored by grade."

  3. "Show the relationship between 'Teacher's Average Student Score (Math)' and 'Teacher's Average Student Score (Reading),' with points colored by the grade level they teach."

  4. "Plot 'School-wide Average Science Score' (Y) vs. 'Student-Teacher Ratio' (X), with each point representing a school, faceted by grade."

  5. "Compare 'Percentage of Students Passing' in Math vs. 'Percentage of Students Passing' in Reading for each classroom, coloring points by grade."


📉 Category 8: Discrepancy & Gap Analysis

These prompts focus on the difference or gap between two data points.

  1. "Plot the discrepancy (Math score - Reading score) on the Y-axis against the 'Overall Average Score' on the X-axis. Color by grade."

  2. "Generate a scatter plot of 'Homework Score' vs. 'Test Score' only for students in the bottom quartile of test-takers. Color-code by grade."

  3. "Plot the gap between 'Practice Test 1' and 'Practice Test 2' vs. the 'Final Exam Score,' colored by grade."

  4. "Show the relationship between 'Student-Reported Confidence' (survey data) vs. 'Actual Test Score' in Math, faceting by grade level."

  5. "Plot the 'Fall-to-Spring Growth' in Math (Y) vs. the 'Fall-to-Spring Growth' in Reading (X), with points colored by grade."
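The gap-analysis prompts all start by deriving a difference column before plotting. A minimal pandas sketch (scores invented for illustration):

```python
# Derive a discrepancy column for gap-analysis scatter plots; data is invented.
import pandas as pd

df = pd.DataFrame({
    "math":    [72, 55, 90, 60],
    "reading": [65, 70, 85, 58],
})
df["gap"] = df["math"] - df["reading"]                    # Y: Math minus Reading
df["overall_avg"] = df[["math", "reading"]].mean(axis=1)  # X: overall average
```

Plotting `gap` against `overall_avg` then shows whether stronger students tend to lean toward one subject.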


🗓️ Category 9: Temporal & Milestone Comparisons

These prompts compare performance at different points in time (e.g., midterm vs. final).

  1. "Plot 'Midterm Exam' scores (X-axis) vs. 'Final Exam' scores (Y-axis) for the History curriculum, using different colors for each grade."

  2. "Show the relationship between 'Quarter 1 Report Card Grade' and 'Quarter 3 Report Card Grade' in Language Arts, faceted by grade level."

  3. "Compare scores on the 'Fall Benchmark' (X) with scores on the 'Winter Benchmark' (Y), coloring points by grade."

  4. "Plot 'Diagnostic Test' score (start of year) vs. 'Summative Test' score (end of year), with a separate regression line for each grade."

  5. "Generate a scatter plot of 'Project-Based Assessment' scores vs. 'Standardized Test' scores (in the same subject), color-coded by grade."


🧩 Category 10: Specific Cohort & Program Analysis

These prompts filter the data to look at specific student populations.

  1. "Plot Math scores vs. Reading scores only for students in the Gifted & Talented program, coloring by grade."

  2. "Compare 'Special Education' (IEP) students' Math scores (X) vs. their Reading scores (Y), using grade level to color-code."

  3. "Show the relationship between 'Years in Program' (e.g., ELL, G&T) and 'Latest Test Score,' faceted by grade."

  4. "Plot test scores for students in 'After-School Tutoring' vs. students not in tutoring, on the same axes. Use color for grade and shape for tutoring status."

  5. "Generate a scatter plot of 'Standardized Test' percentile vs. 'School-Specific Assessment' percentile, faceting by both grade level and program (e.g., IB, AP, General)."
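The double-faceting idea in the last prompt (grade by program) maps naturally onto a subplot grid. A sketch with plain matplotlib on synthetic data; the column names are assumptions, not a real export:

```python
# A 2x2 facet grid (grade x program) built with matplotlib; data is synthetic.
import matplotlib
matplotlib.use("Agg")
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "grade":      rng.choice([7, 8], 80),
    "program":    rng.choice(["General", "AP"], 80),
    "state_pct":  rng.uniform(1, 99, 80),   # standardized test percentile
    "school_pct": rng.uniform(1, 99, 80),   # school-specific assessment percentile
})

grades = sorted(df["grade"].unique())
programs = sorted(df["program"].unique())
fig, axes = plt.subplots(len(grades), len(programs), sharex=True, sharey=True)
for i, g in enumerate(grades):
    for j, p in enumerate(programs):
        sub = df[(df["grade"] == g) & (df["program"] == p)]
        axes[i][j].scatter(sub["school_pct"], sub["state_pct"], s=10)
        axes[i][j].set_title(f"Grade {g}, {p}")
fig.savefig("facets.png")
```

Sharing the axes across panels keeps the facets directly comparable.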

2025 Top 10s


Bottom 10 Districts by Average Math Score

District Name       Average Math Score
Lewis               0.460
Monroe              0.459
Wetzel              0.454
Summers             0.452
Lincoln             0.439
Randolph            0.437
Tucker              0.415
WV Virt Academy     0.408
McDowell            0.394
Vrt Prep Academy    0.330

Top 10 Districts by Average Math Score

District Name       Average Math Score
Putnam              0.650
Ohio                0.634
Jackson             0.629
Monongalia          0.626
Ritchie             0.620
Wood                0.618
Hancock             0.612
Cabell              0.584
Pocahontas          0.578
Calhoun             0.576
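Rankings like the two tables above are a one-liner in pandas with `nlargest`/`nsmallest`. A sketch using a few of the districts listed:

```python
# Rank districts by average math score; rows copied from the tables above.
import pandas as pd

scores = pd.DataFrame({
    "district": ["Putnam", "Ohio", "McDowell", "Lewis"],
    "avg_math": [0.650, 0.634, 0.394, 0.460],
})
top = scores.nlargest(2, "avg_math")      # highest average math scores
bottom = scores.nsmallest(2, "avg_math")  # lowest average math scores
```

With the full district file loaded, `nlargest(10, ...)` and `nsmallest(10, ...)` reproduce the tables directly.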

Math Variations Deep Research

 

Kicking off the analysis
To begin my analysis of math variations in West Virginia schools, I first need to understand the data I'm working with. My initial step is to define the key metrics, specifically what "Assessment Math Performance Value" and "Assessment Math Progress Value" represent within the 2025 state accountability system. Understanding these definitions is crucial for providing an accurate analysis.
Reading the data
Simultaneously, I am examining the five data files provided. My first task is to identify all the unique 'School Type' categories present in the data. This will confirm the primary groups for my comparison, which I expect to be 'Elementary', 'Middle', and 'Secondary', but I'm checking for any others.
Extracting relevant numbers
To get the specific numbers needed, I am systematically going through all five files for each school type. I am extracting the math performance, proficiency, and progress values specifically for the 'Total' group, starting with Elementary schools, then Middle schools, and finally Secondary schools. This data-gathering process will form the basis for the statistical analysis and synthesis in the next step.
Refining my approach
After my initial exploration, I've identified a much more efficient path forward. My first search for definitions didn't provide the necessary details, so I'm re-initiating that step to ensure I correctly interpret the metrics. Understanding the precise meaning of 'Performance Value' and 'Progress Value' is essential before I analyze the numbers.
Streamlining data extraction
Instead of examining the files multiple times for each school type, I'm now taking a more direct approach. I will process all five files at once to extract every row where the 'Group' and 'Subgroup' are listed as 'Total' or 'Totals'. This single, unified extraction will capture all the necessary math performance, progress, and proficiency data for every school type in one go, which is vastly more efficient.
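That single-pass extraction could look like the following pandas sketch. The frames below are in-memory stand-ins for the five files (real code would call `pd.read_csv` on each), and the column names are assumed from the report's descriptions:

```python
# One-pass extraction of the 'Total'/'Totals' rows across all files.
import pandas as pd

files = [  # stand-ins for the five data files
    pd.DataFrame({"School Type": ["Elementary", "Elementary"],
                  "Group": ["Total", "White"],
                  "Assessment Math Performance Value": [0.62, 0.60]}),
    pd.DataFrame({"School Type": ["Middle"],
                  "Group": ["Totals"],
                  "Assessment Math Performance Value": [0.50]}),
]
combined = pd.concat(files, ignore_index=True)
totals = combined[combined["Group"].isin(["Total", "Totals"])]
```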
Next steps: Synthesis
Once I have the definitions from my research and the complete dataset from my extraction, I will move to the analysis phase. My plan is to synthesize this information, group the data by each school type, and then calculate and compare the averages and ranges for the math performance and progress values.
Data retrieved
I have now successfully gathered all the raw data for the 'Totals' subgroup from all five files. This gives me a comprehensive list of the math performance and progress values for all school types ('Elementary', 'Middle', and 'Secondary') present in the data.
Understanding the metrics
My previous search also located the official state resources that define the 'Assessment Math Performance Value' and 'Assessment Math Progress Value'. It appears 'Performance' relates to proficiency at a specific point in time, while 'Progress' measures growth over time. My next step is to examine these documents to get the exact definitions.
Moving to analysis
After I retrieve these definitions, I will have all the components needed for the final analysis. The next phase will involve synthesizing these definitions with the raw data I've collected. I will then calculate the averages, ranges, and proficiency target percentages for each school type, allowing me to draw clear comparisons.
Final analysis phase
Getting the precise definitions for these metrics is the final piece I need. Once I retrieve these, I will have all the necessary components—the data and the definitions—to move into the final synthesis. My next step will be to complete the research and begin the detailed analysis, comparing the math metrics across all school types.
Data collection complete
I have successfully gathered all the necessary raw data from the five provided files. I now have a complete dataset for the 'Total' groups across all school types, including their 'Assessment Math Performance Value' and 'Assessment Math Progress Value'.
Pivoting on definitions
My attempts to retrieve the official state definitions for these math metrics have hit a roadblock, as the primary source files appear to be unavailable. However, my initial research uncovered a press article discussing the accountability system changes. I will now examine this article as an alternative source to understand how 'Performance' and 'Progress' values are defined.
Moving to synthesis
This will be my final research action. After I investigate this last source for context, I will have exhausted my research avenues for definitions. My next step, regardless, will be to conclude the research phase and proceed with the full analysis, using all the data I've collected to synthesize the findings and compare the mathematical variations between school types.
Trying a new search
Understanding these definitions is key to an expert analysis, so I'm not giving up just yet. I am initiating a new, broader search. This time, I'm looking for any reliable source—such as a school district's explanation or a different government manual—that can clarify what these metrics mean. This will be my final attempt to locate this information.
Preparing for final analysis
After this last research push, I will have all the components available to me. The next step will be to synthesize the definitions (if found) with the complete dataset I've already compiled. I will then proceed to the final analysis and report generation.
 
A promising new lead
My last search effort has uncovered a significant new lead. The key metrics I'm investigating appear to be part of the 'WV Balanced Scorecard'. My search provided a press release and a main accountability system page that both reference this scorecard, which I hadn't found before. This is the most promising path I've had so far for finding the definitions.
Final research attempt
I am now examining these two new sources to find the official definitions for 'Math Achievement' and 'Math Progress' and confirm they correspond to the 'Performance' and 'Progress' values in my dataset. This will be my final, targeted attempt to secure these definitions.
Moving to synthesis
Once this last step is complete, my research phase will be concluded. I have already successfully gathered all the raw data. After this final check for definitions, I will have all the components needed to synthesize the information and perform the full analysis of the math variations between school types.
Partial insights
My recent investigation of the state's press release was partially successful. It confirmed the accountability system is called the 'West Virginia Balanced Scorecard' and uses 'academic achievement' and 'academic progress' as metrics. However, it didn't explicitly confirm that these terms are identical to the 'Assessment Math Performance Value' and 'Assessment Math Progress Value' from my data, which is a crucial link I'm still trying to establish.
One final check
I noticed that a key part of my previous plan—examining the main accountability system webpage—was not executed. This page represents the best and final chance to find an official, concrete definition. I am now prioritizing the examination of this last remaining resource.
Completing the research
After this final check, my research phase will be complete. I have successfully gathered all the necessary raw data, and I will have exhausted all reasonable avenues for finding the official definitions. Regardless of the outcome of this last step, I will then proceed to the full analysis, synthesizing all available information to compare the math variations across school types.
Creating the initial report outline...
The initial information sweep has returned a significant volume of data. I am now synthesizing these preliminary findings to construct a logical report outline. This structured framework will organize the key arguments, establish a clear narrative flow, and map the primary data points to their respective sections.
Moving to final analysis
Despite the lack of official definitions, I have enough information to proceed. Based on the metric names and the press release I found, I will infer their meanings for the analysis—'Performance' as a measure of proficiency and 'Progress' as a measure of growth. I will now move to the final step: synthesizing all the data I've collected, performing the calculations, and analyzing the mathematical variations between elementary, middle, and secondary schools.
Expanding the report outline...
With the high-level outline complete, I am now populating each section with specific supporting evidence, statistics, and qualitative insights from the research. This phase involves a deeper synthesis of the data and helps to identify any critical information gaps that may require a second, more targeted search.
 
 
 

An Analysis of Mathematics Performance and Progress Variations Across West Virginia School Tiers (2025 Data)



Executive Summary: Comparative Analysis of Mathematics Performance and Progress by School Type


This report provides a comprehensive analysis of mathematics assessment data from the 2025 reporting year, focusing on the variations in "Assessment Math Performance Value" (achievement) and "Assessment Math Progress Value" (growth) among Elementary, Middle, and Secondary schools. The analysis of aggregate "Total" student data reveals three critical, systemic findings.


First, a pronounced and consistent decline in math achievement is evident as students advance through the school system. Performance values drop sharply from the Elementary to Middle tier and again from the Middle to Secondary tier, indicating a systemic "degradation pipeline" in mathematics proficiency.


Second, a "growth-versus-proficiency" paradox emerges at the Middle School level. While Middle School achievement scores are substantially lower than their Elementary counterparts, their "Assessment Math Progress Value" (student growth) is comparable to the Elementary sector's. This suggests that Middle Schools are, on average, achieving expected growth but are unable to close a pre-existing and widening proficiency gap.


Third, a critical data gap exists within the state's accountability framework. The "Assessment Math Progress Value" is systematically unreported for all Secondary Schools in the dataset. This "accountability black box," combined with the sector's critically low performance scores, makes it impossible to evaluate the instructional effectiveness of high schools using this metric, creating a significant blind spot for educational oversight and intervention.


Part 1: Analysis of 'Assessment Math Performance Value' (Achievement) by School Type


This section analyzes the "Assessment Math Performance Value," a metric representing the point-in-time proficiency of students. Based on its structure in the provided data alongside the "Meets Annual Target?" column, this value is interpreted as a proficiency rate or an index of students meeting a state-defined benchmark.1


1.A. Aggregate Performance Profile: Elementary Schools


The Elementary School sector establishes the highest performance baseline in the K-12 system. The data, compiled from 50 schools, demonstrates the strongest achievement levels, though with wide internal variation. Performance scores are widely distributed, with a cluster of high-achieving institutions, such as Scott Teays Elementary School (0.8132), Stanaford Elementary School (0.7822), and Ritchie Elementary School (0.7626).1 These top-tier schools indicate a high potential for mathematics proficiency at this level.


However, this high average is balanced by significant underperformance at the sector's lower end. A notable number of elementary schools post scores below a 0.500 value, including Eagle School Intermediate (0.4186), Chesapeake Elementary School (0.4605), and Orchard View Intermediate School (0.4824).1 This wide dispersion highlights a significant inequality of outcomes within the elementary tier itself, even before students transition to middle school.


1.B. Aggregate Performance Profile: Middle Schools


The "Assessment Math Performance Value" for the Middle School sector, compiled from 20 schools, reveals a substantial decline in proficiency. The central tendency for this tier is markedly lower than for Elementary Schools. The data shows a significant cluster of schools performing in the 0.40-0.50 range, such as Kasson Elementary/Middle School (0.474), Belington Middle School (0.4909), and Hedgesville Middle School (0.484).1


This "middle school slump" is further defined by a floor that drops well below the elementary sector's, with schools like Madison Middle School (0.3553), Philippi Middle School (0.4044), and West Side Middle School (0.3301) posting values that are concerningly low.1 Conversely, the ceiling for middle school performance (e.g., Winfield Middle School at 0.6963 and Barrackville Elementary/Middle School at 0.6498) does not reach the heights of the top elementary schools, indicating a system-wide depression of math achievement at this level.1


1.C. Aggregate Performance Profile: Secondary Schools


The Secondary School sector exhibits the lowest and most critical performance metrics. Based on data from 8 schools, the mean and median performance values are drastically lower than those of both Elementary and Middle Schools. This dataset shows a near-total collapse in proficiency for a significant portion of the high school cohort.


Multiple high schools are performing in the 0.30 range, including Philip Barbour High School Complex (0.367), Chapmanville Regional High School (0.3538), Logan Senior High School (0.3402), and Westside High School (0.3409).1 The sector floor is represented by Man Senior High School, with an "Assessment Math Performance Value" of just 0.2772, indicating that fewer than 28% of students in the "Total" cohort are meeting the proficiency benchmark.1


A single significant outlier, George Washington High School (0.5609), performs at a level more consistent with an average elementary school, demonstrating that high-level math failure is not inevitable but is characteristic of the sector as a whole.1


1.D. Comparative Insights: Statistical Variations in Math Performance Across Tiers


A direct statistical comparison of the "Total" student cohort's math performance across the three tiers quantifies the systemic degradation of proficiency. The analysis reveals a clear, tiered drop in average performance and a concurrent shift in the distribution of outcomes.


Table 1: Aggregate Statistics for 'Assessment Math Performance Value' by School Type (2025)


School Type   N (School Count)   Mean    Median   Std. Deviation   Min. Value   Max. Value
Elementary    50                 0.626   0.615    0.098            0.4186       0.8132
Middle        20                 0.506   0.487    0.098            0.3301       0.6963
Secondary     8                  0.386   0.360    0.091            0.2772       0.5609

Source: Compiled from 2025 data, "Total" subgroups.1







This statistical summary illustrates two primary findings. First, a clear "performance degradation pipeline" exists, where average proficiency drops approximately 12 percentage points from Elementary to Middle school, and another 12 percentage points from Middle to Secondary school. The median values confirm this trend, showing that the 50th percentile school at the secondary level (0.360) performs far below the 50th percentile middle school (0.487) and elementary school (0.615).


Second, this drop is characterized by a "floor effect" rather than increased dispersion. The standard deviation remains remarkably consistent across all three tiers (approx. 0.09-0.10), meaning the spread of school performance is not widening. Instead, the entire distribution of schools is shifting downward. The minimum performance value at each tier drops precipitously, from 0.4186 (Elementary) to 0.3301 (Middle) to 0.2772 (Secondary). This indicates that while inequality of outcomes is present at all levels, the severity of failure at the lowest-performing schools becomes more extreme at each successive tier.
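The summary statistics in Table 1 correspond to a standard groupby aggregation. A pandas sketch, using only a handful of the individual school values quoted in the text above rather than the full dataset:

```python
# Reproduce the shape of Table 1 with a groupby aggregation; only a few of
# the school values quoted in the report are included here.
import pandas as pd

df = pd.DataFrame({
    "school_type":      ["Elementary", "Elementary", "Middle", "Middle", "Secondary"],
    "math_performance": [0.8132, 0.4186, 0.6963, 0.3301, 0.2772],
})
stats = (df.groupby("school_type")["math_performance"]
           .agg(["count", "mean", "median", "std", "min", "max"]))
```

Run against the complete "Total" extract, this yields the N/mean/median/std/min/max columns of Table 1 in one call.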


Part 2: Analysis of 'Assessment Math Progress Value' (Growth) by School Type


This section analyzes the "Assessment Math Progress Value," a metric interpreted as a measure of student academic growth over time (e.g., a Student Growth Percentile). This value is essential for evaluating school effectiveness in promoting learning, independent of the students' absolute proficiency.


2.A. Aggregate Progress Profile: Elementary Schools


The "Assessment Math Progress Value" for Elementary Schools establishes the baseline for expected academic growth. From a sample of 40 schools with available progress data, the mean (0.470) and median (0.482) values are robust. The distribution is wide, with a range spanning from 0.2821 (Clendenin Elementary School) to 0.6587 (Scott Teays Elementary School).1


This wide variance is critical, as it shows that high performance (Part 1.A) does not always correlate with high progress. For example, some high-performing schools like Central City Elementary (0.571 performance) post relatively low progress scores (0.439).1 Conversely, a school like Tomahawk Intermediate has moderate performance (0.6198) but strong progress (0.5052).1 This variance underscores the importance of using both metrics to evaluate school quality.


2.B. Aggregate Progress Profile: Middle Schools


The "Assessment Math Progress Value" for Middle Schools presents a complex picture. The data is notably sparse; of the 20 schools analyzed for performance, only 10 reported a "Total" progress value. However, the data that is available is revealing.


The mean (0.444) and median (0.435) for this 10-school sample are numerically similar to the Elementary sector's values. The distribution is also analogous, with scores ranging from 0.3105 (West Side Middle School) to 0.6056 (Barrackville Elementary/Middle School).1 This initial finding stands in stark contrast to the performance data in Part 1.B, which documented a significant "slump" in absolute achievement. Here, the growth metrics suggest that middle schools are, on average, fostering academic growth at a rate comparable to their elementary counterparts.


2.C. Data Gap Analysis: The Absence of Secondary School Progress Metrics


A comprehensive review of the 2025 dataset reveals a critical, systemic finding: the "Assessment Math Progress Value" is uniformly blank or "Not Reportable" for all schools classified as "Secondary".1 This includes Philip Barbour High School Complex, George Washington High School, Chapmanville Regional High School, and all other 9-12 institutions analyzed.1


This is not a random data error but appears to be an architectural feature of the accountability system represented in the data. The West Virginia Accountability System (WVAS) and its associated Balanced Scorecard likely substitute other indicators for high schools, such as "On-Track to Graduation" and "Post-Secondary Achievement".2


While metrics like graduation rates are important lagging indicators, they are not a valid proxy for subject-specific instructional effectiveness. This data gap creates an "accountability black box" for high school mathematics. Given the extremely low performance scores detailed in Part 1.C (e.g., Man Senior High at 0.2772 1), this omission is severe. It renders the state's accountability system, as reflected in this data, incapable of distinguishing a high school that is effectively "growing" its low-performing students from one that is compounding their academic deficits.


Consequently, any direct comparison of math "progress" must be formally limited to Elementary and Middle schools.


2.D. Comparative Insights: Statistical Variations in Math Progress (Elementary vs. Middle)


Given the data gap for secondary schools, this comparative analysis is restricted to the Elementary and Middle tiers, for which 50 data points (40 Elementary, 10 Middle) are available.








Table 2: Aggregate Statistics for 'Assessment Math Progress Value' by School Type (2025)

School Type | N (School Count) | Mean  | Median | Std. Deviation | Min. Value | Max. Value
Elementary  | 40               | 0.470 | 0.482  | 0.095          | 0.2821     | 0.6587
Middle      | 10               | 0.444 | 0.435  | 0.106          | 0.3105     | 0.6056
Secondary   | 0                | N/A   | N/A    | N/A            | N/A        | N/A

Source: Compiled from 2025 data, "Total" subgroups.1 N reflects schools with available progress data.
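The aggregates in Table 2 can be reproduced mechanically. The sketch below shows one way to compute them per school type with Python's standard statistics module; the score lists are illustrative placeholders, not the actual 2025 dataset.

```python
# Sketch of how Table 2's per-tier aggregates could be computed.
# The progress values below are placeholders for illustration only,
# not the real 2025 "Assessment Math Progress Value" records.
from statistics import mean, median, stdev

progress = {
    "Elementary": [0.2821, 0.439, 0.482, 0.5052, 0.6587],  # placeholder scores
    "Middle":     [0.3105, 0.42, 0.435, 0.51, 0.6056],     # placeholder scores
}

for tier, scores in progress.items():
    # N, mean, median, standard deviation, min, and max per tier,
    # mirroring the columns of Table 2.
    print(f"{tier}: N={len(scores)} mean={mean(scores):.3f} "
          f"median={median(scores):.3f} sd={stdev(scores):.3f} "
          f"min={min(scores):.4f} max={max(scores):.4f}")
```

Schools with no reported progress value (the Secondary tier) would simply be excluded from their tier's list, which is why N in Table 2 reflects only schools with available data.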








The data in Table 2 reveals the report's second major finding. The mean and median "Assessment Math Progress Value" for Elementary and Middle schools are statistically similar. The means (0.470 vs. 0.444) and medians (0.482 vs. 0.435) show only minor differences, and the standard deviations (0.095 vs. 0.106) indicate that both tiers have a virtually identical breadth of "growth" outcomes.


This statistical parity in growth, when juxtaposed with the significant drop in performance (Table 1), demonstrates that the "Middle School Slump" is a phenomenon of proficiency, not growth. Middle schools are not, on average, "worse" at their jobs than elementary schools. Rather, this data pattern suggests they are achieving an "average" year of growth with students who are, on average, falling further behind the "proficient" benchmark. The difficulty of the curriculum is likely outpacing the standard rate of instructional growth, causing the proficiency gap that began in elementary school to widen into a chasm at the middle level.


Part 3: Synthesized Findings and Strategic Implications


This section synthesizes the analyses of performance (achievement) and progress (growth) to provide a holistic, actionable overview of the K-12 mathematics landscape.


3.A. The Performance-Progress Matrix: Correlating Achievement and Growth


To move beyond aggregate averages, schools were profiled based on their relative performance and progress. Using the overall median values for performance (0.605) and progress (0.479) as dividers, Elementary and Middle schools were mapped into four distinct quadrants. This analysis provides a more granular diagnosis of school-level effectiveness.
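The quadrant mapping just described is a simple two-threshold classification. The sketch below implements it with the report's overall medians as dividers; the school records are invented for illustration, and the choice to count scores exactly at the median as "high" is an assumption not specified in the report.

```python
# Minimal sketch of the Table 3 quadrant mapping. Dividers are the
# report's overall medians; school rows are illustrative, not real data.
# Assumption: a score exactly at the median counts as "High".
PERF_MEDIAN, PROG_MEDIAN = 0.605, 0.479

def quadrant(perf, prog):
    """Return the quadrant label for one school, or a no-data flag."""
    if prog is None:                      # missing progress value
        return "No Prog Data"
    high_perf = perf >= PERF_MEDIAN
    high_prog = prog >= PROG_MEDIAN
    if high_perf and high_prog:
        return "Q1: High Perf / High Prog"
    if high_perf:
        return "Q2: High Perf / Low Prog"
    if not high_prog:
        return "Q3: Low Perf / Low Prog"
    return "Q4: Low Perf / High Prog"

# Hypothetical (name, performance, progress) rows.
schools = [
    ("School A", 0.65, 0.51),
    ("School B", 0.62, 0.42),
    ("School C", 0.45, 0.31),
    ("School D", 0.48, 0.52),
    ("School E", 0.70, None),
]
for name, perf, prog in schools:
    print(name, "->", quadrant(perf, prog))
```

Tallying the labels per school type would reproduce the counts and percentages shown in Table 3.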


Table 3: Performance vs. Progress Quadrant Analysis (Elementary and Middle Schools, 2025)

School Type | Q1: High Perf / High Prog | Q2: High Perf / Low Prog | Q3: Low Perf / Low Prog | Q4: Low Perf / High Prog | No Prog Data | Total N
Elementary  | 18 (36.0%)                | 12 (24.0%)               | 7 (14.0%)               | 3 (6.0%)                 | 10 (20.0%)   | 50
Middle      | 1 (5.0%)                  | 0 (0.0%)                 | 6 (30.0%)               | 3 (15.0%)                | 10 (50.0%)   | 20

Source: Compiled from 2025 data.1 Quadrants are based on overall medians (Perf: 0.605, Prog: 0.479).








This quadrant analysis reveals the fundamentally different challenges facing the two tiers.


  • Elementary Schools: The largest group (36.0%) is in Quadrant 1 (High/High), representing schools that are both high-achieving and high-growth (e.g., Ritchie Elementary 1, Scott Teays Elementary 1). However, a substantial portion (24.0%) falls into Quadrant 2 (High/Low). These are "coasting" schools that benefit from high proficiency but are failing to achieve adequate student growth.


  • Middle Schools: The profile is dramatically different. 45% of all middle schools (9 of 20) are in the "Low Performance" half (Quadrants 3 and 4). An alarmingly high 30% of all middle schools fall into Quadrant 3 (Low/Low), representing systemic failure (e.g., Philippi Middle 1, West Side Middle 1). Critically, 15% are in Quadrant 4 (Low/High) (e.g., Belington Middle 1, Hedgesville Middle 1). These schools are effective (high growth) but are struggling with low-proficiency populations, supporting the hypothesis from Part 2.D. The fact that 50% of middle schools lack progress data further complicates this analysis.





3.B. Identifying Systemic Gaps and Strengths in the K-12 Math Trajectory


The combined analysis of performance, progress, and data gaps yields three primary conclusions regarding the K-12 mathematics trajectory.

Finding 1: The Elementary-to-Middle School Transition is the System's Primary Failure Point.


The data clearly shows that the most significant, measurable drop in math proficiency occurs between the Elementary and Middle school tiers.1 The fact that growth metrics (Table 2) remain stable indicates this is a structural problem of cumulative disadvantage. The issue is not necessarily poor instructional effectiveness within the middle schools; it is that the standard rate of growth is insufficient to combat the widening gap between student ability and grade-level expectations.


Finding 2: The Secondary School Accountability Model is Incomplete for Math.

The systematic absence of "Assessment Math Progress Value" data for all secondary schools is a critical flaw in the accountability system.1 The state's Balanced Scorecard, which measures indicators like "On-Track to Graduation" 2, is missing the most crucial metric for academic intervention: subject-specific instructional effectiveness. It is currently impossible to know if the lowest-performing high schools (e.g., Man Senior High, 0.2772 1) are failing due to ineffective instruction (low growth) or overwhelming systemic challenges (high growth, but low proficiency).


Finding 3: Extreme Variation Defines the Secondary Sector.


While all tiers show variance, the "variation" in performance is most profound at the Secondary level. The gap between the highest-performing high school (George Washington High, 0.5609) and the lowest (Man Senior High, 0.2772) is a staggering 28.37 percentage points.1 This suggests a profound inequity in the educational experience for 9-12 students, where the specific high school a student attends is a primary determinant of their likelihood of achieving math proficiency.


3.C. Tier-Specific Strategic Recommendations for Intervention and Improvement


This analysis gives rise to specific, data-driven recommendations for educational stakeholders.


  • Recommendation 1 (For Elementary Schools): Focus on "Progress" in High-Performing Schools.
    State and district leaders should immediately target the 24% of elementary schools identified in Quadrant 2 ("High Performance / Low Progress"). These "coasting" schools must have their growth expectations re-evaluated and their instructional models audited to ensure they are actively adding value for all students, not just benefiting from a high-achieving population.


  • Recommendation 2 (For Middle Schools): Resource "High-Growth" Schools and Replicate Success.



The 15% of middle schools in Quadrant 4 ("Low Performance / High Progress") are models of effectiveness. These schools (e.g., Belington Middle, Hedgesville Middle 1) should be studied for best practices in remediation and provided with additional resources to help close the underlying proficiency gap. Concurrently, the 30% of schools in Quadrant 3 ("Low Performance / Low Progress") require immediate and intensive intervention.


  • Recommendation 3 (For Secondary Schools): Create a Valid Math Progress Metric.


The single most urgent recommendation is for the West Virginia Department of Education 3 to develop and implement a valid "Math Progress Value" for the secondary tier. The current "accountability black box" is unacceptable. This could be achieved by developing growth models based on 8th-grade assessment data linked to 11th-grade summative exam (e.g., SAT) outcomes or End-of-Course (EOC) exams.
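One common family of growth models that could fill this gap is a residual model: regress each cohort's 11th-grade scores on the same students' 8th-grade scores, then treat each student's residual (actual minus predicted) as their growth. The sketch below is a minimal illustration of that idea with invented scores; operational systems (e.g., student growth percentiles) are considerably more sophisticated.

```python
# Hedged sketch of a residual-based growth metric for the secondary tier.
# All scores are invented for illustration; this is one possible design,
# not the WVDE's actual methodology.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for y ~ a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var
    return a, my - a * mx

grade8  = [400, 450, 500, 550, 600]   # hypothetical 8th-grade scale scores
grade11 = [420, 480, 510, 570, 640]   # hypothetical 11th-grade outcomes

a, b = fit_line(grade8, grade11)
# A student's "growth" is how far they land above or below the line
# predicted from their 8th-grade starting point.
residuals = [y - (a * x + b) for x, y in zip(grade8, grade11)]
for x, r in zip(grade8, residuals):
    print(x, round(r, 1))
```

Aggregating these residuals by school would yield a secondary "progress" value directly comparable in spirit to the elementary and middle metrics, distinguishing low-proficiency schools that grow students from those that do not.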


  • Recommendation 4 (System-Wide): Conduct Root-Cause Analysis at Transition Points.


This quantitative data proves what is happening but cannot explain why. A state-led, qualitative investigation is necessary to conduct a root-cause analysis of the Elementary-to-Middle school transition. This study must examine the curriculum, instructional, and social-emotional disconnects that are causing the systemic collapse in math proficiency documented in this report.

Works cited

  1. 6Book1.xls

  2. 2025 County Approval Status and Accreditation Report - West Virginia Department of Education, accessed November 1, 2025, https://wvde.us/sites/default/files/2025-10/2025%20County%20Approval%20Status%20and%20Accreditation%20Report%20Final%20Version%20for%20WVBE%20Meeting.pdf

  3. West Virginia Accountability System | West Virginia Department of ..., accessed November 1, 2025, https://wvde.us/data-school-improvement/accountability/west-virginia-accountability-system



