September 6, 2002

Commentary

City Schools’ Test Scores DON’T Tell the Whole Story

Edward M. Olivos, President
Greater San Diego CABE

Superintendent Alan Bersin, Chancellor Alvarado, and the rest of the San Diego City Schools district administration would like the San Diego public to believe that the so-called “improvement” in the spring 2002 SAT-9 scores was a direct effect of the Blueprint for Student Success. Specifically, they are promoting a very simple, and misleading, “cause and effect” scenario for the “rise” in test scores in SDCS. This position by SDCS officials, however, is flawed on many fronts.

First of all, the District is contending that the “rise” in test scores is directly attributable to the implementation of the Blueprint. I, however, would argue that tying “improved” student test scores to the implementation of the Blueprint is flawed reasoning. If anything, the rise in test scores may suggest only a “correlational” relationship, as it is known in the field of educational research. Even this assumption may be giving Bersin and Alvarado far more credit than they deserve, since many other variables may have been equally, if not more, responsible for the modest rise in test scores.

We must ask ourselves why the Blueprint for Student Success and Alan Bersin cannot be indisputably credited with improving test scores in SDCS, even if the relationship between the two is “correlational.” The reason is that within SDCS there does not appear to have ever been an appropriate research design intended specifically to measure the effects of the Blueprint on student achievement. That is, to my knowledge, no “control groups” or “experimental groups” were ever established prior to the implementation of the Blueprint. In my opinion, these groups would have provided the data necessary to differentiate improvement between student cohorts who were “subjected” to the Blueprint and those who were not. This indeed would have been an interesting “experiment,” since I am sure it would have invalidated many of the District’s claims. A clear demonstration of my hypothesis is the rise in test scores in many school districts throughout the state of California that do not implement the Blueprint for Student Success, as seen below.

For example, SDCS has led many to believe that the district has made tremendous progress in terms of student achievement, i.e., test scores. If this is the case, let’s present a very “simple” comparison between SDCS and other large districts in the state. The table below shows the percentage of students (grades 2-11) who scored above the 50th percentile on the Stanford Achievement Test, Ninth Edition (Stanford 9), in reading:

Percentage of Students above the 50th Percentile in Reading

District        1998   1999   2000   2001   2002   5-Year Growth
San Diego        41%    44%    47%    47%    48%        +7%
San Jose         44%    45%    47%    51%    51%        +7%
San Francisco    44%    45%    47%    46%    47%        +3%
Sacramento       33%    38%    40%    41%    41%        +8%
LA Unified       22%    23%    26%    28%    31%        +9%

Source: www.greatschools.net

The above information is indeed interesting. Specifically, we notice that Los Angeles Unified, while having the lowest percentage of students testing above the 50th percentile among the districts listed, showed the most growth over the five-year period, 9 percentage points, with Sacramento in second place at 8 percentage points. Furthermore, San Jose Unified currently has the highest percentage of students testing above the 50th percentile, 51%. This is particularly interesting to note since San Jose Unified has a student population that is 50% Latino, compared to San Diego Unified’s 40%, and is strongly supportive of biliteracy instruction, whereas in SDCS biliteracy is consistently undermined and watered down.

Now let’s take a look at math. The table below shows the percentage of students (grades 2-11) who scored above the 50th percentile on the Stanford Achievement Test, Ninth Edition (Stanford 9), in math:

Percentage of Students above the 50th Percentile in Math

District        1998   1999   2000   2001   2002   5-Year Growth
San Diego        45%    49%    55%    53%    54%        +9%
San Jose         44%    50%    56%    60%    61%       +17%
San Francisco    55%    56%    61%    60%    63%        +8%
Sacramento       36%    44%    49%    54%    53%       +17%
LA Unified       27%    31%    34%    36%    41%       +14%

Source: www.greatschools.net

Again, in the area of math, SDCS did not show the most “growth” over the five-year period. That honor belonged to San Jose and Sacramento, each growing 17 percentage points, and San Francisco had the highest percentage of students scoring above the 50th percentile, 63%, in 2002!

The above tables demonstrate a basic, albeit simplified, way of comparing school districts, one in which SDCS is clearly not the “best.” And while the data on student achievement are certainly open to more rigorous statistical analysis, I do believe that the information provided here supports La Prensa’s position that the test scores SDCS, Bersin, and the major local newspaper are touting “don’t tell the whole story.”

Therefore, assuming that standardized testing is a valid and reliable means of measuring student achievement (many would argue that it isn’t) and that improvement in test scores is indicative of success, SDCS has failed, particularly since there is no “strong evidence” whatsoever demonstrating that the costly and demoralizing Blueprint for Student Success is responsible for the rise in test scores. Additionally, educational statisticians would caution us against assuming cause and effect from correlational data, yet Bersin’s accountability department does so consistently.

Perhaps Mr. Bersin should consider emphasizing the “research,” instead of the “reporting,” of the District’s Standards, Assessment, and Accountability: Research and Reporting Office. One need not be a statistician to know that public relations do not equal student achievement.
