This wide- and large- screen layout may not work quite right without Javascript.

Maybe enable Javascript, then try again.

Home Ipswich: the Place

Educating the High End

If you want to closely examine the charts/pictures/code on this page but they are not easily viewable on your device, note the web address and return to view it later from a device with a larger display screen.

A school should do more than just give a good education to its "average student." It should also educate those who can barely keep up rather than just writing them off. And it should continually challenge even the best students rather than forcing them into a standardized McStudent role. For many years Ipswich schools had a district wide program that tried to serve the special needs of the high end. But the ELP program shrank over time and as a district wide program was eventually cut from the budget entirely. There was some discussion of returning a similar program christened SPARK.


How well do nearby school districts educate not just the typical students but the ones at the high end? The following chart gives a first very rough idea of how well different school districts educate their exceptional students, with positions higher on the chart being better than lower positions. Different schools have very different "average students," and this difference easily swamps out differences in educating students that are exceptional in their school. The statistical procedures used here control for ("hide" or "remove") average student differences to expose how well each school handles its exceptional students.

The left two columns are based on results of the fourth grade MCAS test and serve as an indicator for "the elementary" schools. The middle three columns are based on results of the eighth grade MCAS test and serve as an indicator for "the middle" school. The right two columns are based on results of the tenth grade MCAS test and serve as an indicator for "the high" school.

What this chart shows is hopefully a little more meaningful than just counting "Advanced" MCAS scores. Simply counting Advanced scores would make schools with high overall averages look like they were doing a good job with exceptional students even though they in fact squeezed everyone into the same McGood mold. This chart sidesteps that problem by showing for each grade and subject area the ratio of students who actually scored Advanced compared to the number of students expected to score Advanced based on the average and distribution of scores. A way to think of this procedure is each count of Advanced scores was "adjusted" according to that districts's overall average score. Of course trying to adjust for all the different factors that distinguish one community from another by using the single number actual average score is almost certain to cause problems.

Numbers greater than 1.0 (higher on the chart) mean the school had even more Advanced scores than expected from their regular program and so is apparently doing a good job of educating the best students. Numbers less than 1.0 (lower on the chart) mean the school had fewer Advanced scores than expected from their regular program. Note the reasoning includes two assumptions either of which may not be true: that the MCAS does a good job of measuring what exceptional students should know; and that behind many high scores on the MCAS are exceptional students.

This is the first quantitative measure we have of how different school districts educate exceptional students, and as such is better than nothing. Keep in mind though it's statistically quite dubious and also the reasoning is dubious. The results are not nearly as meaningful as one might wish. They're published at all only because --as questionable as they are-- they're one of the very few quantitative pieces of information we have. See the methodology section below for a list of issues.


How Well Exceptional Students Are Educated
Comparison of Nearby School Districts
Ratio of Actual/Expected Students In Top Quartile ("Advanced")
On MCAS Spring 2001
(MCAS scores categorized: Failure 200-220, Needs Improvement 220-240, Proficient 240-260, Advanced 260-280)
Exceptional Index
(ratio of actual Advanced to expected Advanced)
Elementary School EnglishElementary School MathMiddle School EnglishMiddle School MathMiddle School History & Social StudiesHigh School EnglishHigh School  Math 
>1.0 -- More4th grade4th grade8th grade8th grade8th grade10th grade10th grade
<1.0 -- Fewerstdev 11.5stdev 18.5stdev 9.5stdev 21.5stdev 13.5stdev 19.5stdev 31
2.5
2.4
2.3Rockport
2.2
2.1
2.0
1.9Danvers
1.8
1.7Swampscott
1.6
1.5Gloucester
Manchester-Essex
1.4
1.3Triton
1.2IpswichRockport
Marblehead
Swampscott
Danvers
AndoverManchester-Essex
Hamilton-Wenham
Masconomet
Newburyport
1.1Middleton
Swampscott
Newburyport
Swampscott
Hamilton-Wenham
Marblehead
Ipswich
Danvers
Manchester-EssexMarbleheadSwampscott
Andover
Masconomet
Marblehead
Hamilton-Wenham

1.0
TopsfieldTriton
Andover
Lynnfield
Middleton
Danvers
Gloucester
Rockport
Andover
Lynnfield
Hamilton-Wenham
Rockport
Gloucester
Lynnfield
Triton
Swampscott
Georgetown
Lynnfield
Georgetown
Manchester-Essex
0.9Andover
Marblehead
Lynnfield
Newburyport
BoxfordManchester-Essex
Hamilton-Wenham
Georgetown
Triton
Marblehead
Triton
Georgetown
Manchester-Essex
Masconomet
Andover
Masconomet
RockportIpswich
Danvers
0.8Georgetown
Danvers
TopsfieldSwampscott
Lynnfield
Newburyport
Danvers
Andover
Ipswich
Triton
Rockport
0.7Georgetown
Manchester-Essex
Gloucester
Newburyport
Ipswich
NewburyportLynnfield
0.6Hamilton-WenhamIpswichGloucesterGloucester
0.5BoxfordMasconomet
0.4RockportMarblehead
Hamilton-Wenham
0.3
0.2
0.1
No DataGeorgetown
Gloucester
Ipswich
Newburyport
Triton

Some Possible Conclusions

  1. There is little correlation between english education and math education. Several school districts do quite well on one and not very well on the other.
  2. There is not much correlation between different grades in the same school district. It seems that educating the high end is not currently predominately a matter of district policy. Rather the suspicion is that very local conditions --individual schools or even individual teachers-- currently have the most impact.
  3. The fairly wide spread in educational achievement levels in elementary school narrows considerably by high school.
  4. There was something goofy about the History and Social Studies testing of eighth graders which caused scores to vary all over the map. Perhaps the state frameworks in this area have been so fluid that what's taught isn't yet the same as what's tested except by chance.
  5. Judging solely by what the MCAS measures, none of the secondary schools does a truly outstanding job of educating its few exceptional students. Every school's emphasis seems to be on its "average student."

Methodology

Raw MCAS score summaries from the spring 2001 test cycle were obtained from the Massachusetts Department of Education. (More recent results are available.) For the districts included in each column, the number of students in each quartile (Advanced, Proficient, Needs Improvement, Warning/Failing) was totaled up, and an overall average was calculated. The distribution of scores was assumed to be Gaussian/normal, the distribution was centered on the overall average, and the best fit standard deviation was found.

To find the overall standard deviation for the whole column the number of students in the lower three quartiles was summed and the standard deviation that most closely matched this sum was selected. Standard deviations were expected to be somewhere between 15 and 25. Actual best fit standard deviations vary from less than 10 to more than 30. The headings for each column include the standard deviation that best fit the data in that column.

For each district a Gaussian/normal distribution with the mean being the average score for that school district and the standard deviation that best fit all the data in the column was used to predict how many students could be expected to score in the top (Advanced) quartile. One way to think of this is "sliding" the distribution back and forth to align with the average peak for each school district. Thus schools with high averages were expected to have many Advanced scores. The net effect was to "adjust" for the average score so districts with higher averages could be fairly compared to districts with lower averages.

There are quite a few statistical problems with both the data and the prodedure, including:

  1. This table is based on MCAS scores, which are themselves problematic.
  2. To protect individual identities only summary (average and quartiles) MCAS results are available. Output can only be as accurate as input, which when summaries are used may not be very accurate at all. For example in the case of the high school where 41% of the students scored Advanced on the mathematics test, that single number 41 almost certainly hides quite a bit of detail about the highest test scores. In most --but not all-- cases this problem wasn't as bad as expected because most students fell into the second or third quartiles and the number of students in the fourth (Advanced) quartile was tiny.
  3. It's assumed with no empirical justification that MCAS score results are modeled reasonably well by a Gaussian/normal distribution. This is done both for the totals and for individual districts. It's quite possible that some individual districts have a different distribution of MCAS scores (for example two humps rather than just one, with one hump being top tier students and the other hump being second tier students).
  4. The actual Gaussian/normal distribution extends both ways to infinity while all MCAS scores are constrained to fall between 200 and 280.
  5. In some cases the total number of students is rather small so the results may be inordinately influenced by simple chance. Efforts to get larger numbers of students by calculating for whole districts rather than individual schools were only partially successful. Although the worst of the too small sample size problems were skirted, there's a strong suspicion that very different schools dragged the district results either up or down in odd ways.
  6. Rather than identifying exceptional students first, overall test results are used without identification because this is all that's available from the state. But a high score isn't necessarily an exceptional student; it could also be a typical student who just happens to focus hard on their schooling.

There are also possible reasoning problems, including:

  1. A step in the reasoning is that a high MCAS score is a good indication of a well educated exceptional student. Saying the same thing another way, it's taken for granted that exceptional students should simply learn more of the same stuff. Alternatively, it's possible that the MCAS does not measure exceptional students very well at all, especially if one thinks exceptional students should learn something different in addition to the regular curriculum.
  2. The key step in the reasoning is that if a school has more Advanced scores than expected from its regular program, the extra Advanced scores represent exceptional students who are being well served. This is an unsubstantiated assumption. It's possible the extra Advanced scores don't in fact represent exceptional students.

Places to learn more


Material Updated July 2002

Location: (N) 42.67995, (W) -70.83761
 (North America> USA> Massachusetts> Boston Metro North> Ipswich)

Email comments to Chuck Kollars
Time: UTC-5 (USA Eastern Time Zone)
 (UTC-4 summertime --"daylight saving time")

Chuck Kollars headshot Chuck Kollars' other web presences include Chuck's books and Chuck's movies.

You may also wish to look at Dad's photo album.

All content on this Personal Website (including text, photographs, audio files, and any other original works), unless otherwise noted on individual webpages, are available to anyone for re-use (reproduction, modification, derivation, distribution, etc.) for any non-commercial purpose under a Creative Commons License.