Q: Is there a similar study on high schools, or will you be conducting one in the future?
A: There is no similar study on high schools for the moment, and we have no current plans for one.
We do, however, have plans for a similar study of primary schools in Alberta and BC (and perhaps Quebec).
Q: Sorry, I don't understand the answer to my earlier question about how student postal codes were determined as a basis for this whole study.
Please clarify "a six-digit postal code" -- is that the student's home address or the school's address? What is a "census dissemination area?" When census data is used, does it actually relate to the individual student's circumstances, or does it relate to the population as a whole in that neighbourhood? (For example, in inner-city schools where the local school families are often poorer on average than the neighbourhood residents who don't have children.)
A: The six-digit postal codes used are the students' postal codes, from which we can tell in what census dissemination areas (DAs) they live. A DA is the smallest geographic unit available in the 2001 census; it includes 400 to 700 people. The school's community profile is then constructed by weighting the socio-economic variables of each DA by the number of students at a given school who live in that DA. Some of the socio-economic variables used relate to the entire population in the DA; others relate only to those with children.
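The weighting described above can be sketched in a few lines of code. This is only an illustrative reconstruction, not the study's actual procedure or data; all DA identifiers, variable names, and numbers below are made up.

```python
# Hypothetical sketch of a school's "community profile": a weighted
# average of DA-level census variables, where each DA's weight is the
# share of the school's students who live in that DA.

def school_profile(da_variables, students_per_da):
    """da_variables: {da_id: {variable_name: value}}
    students_per_da: {da_id: number of this school's students there}
    Returns one weighted-average value per variable."""
    total_students = sum(students_per_da.values())
    profile = {}
    for da_id, count in students_per_da.items():
        weight = count / total_students
        for var, value in da_variables[da_id].items():
            profile[var] = profile.get(var, 0.0) + weight * value
    return profile

# Illustrative example: two DAs with made-up census values.
da_vars = {
    "DA1": {"median_income": 50000, "pct_single_detached": 0.40},
    "DA2": {"median_income": 70000, "pct_single_detached": 0.80},
}
counts = {"DA1": 30, "DA2": 10}  # 30 students from DA1, 10 from DA2
print(school_profile(da_vars, counts))
```

With 75 percent of the students in DA1, the school's profile sits three-quarters of the way toward DA1's values.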
Of course, ideally, one would like to be able to measure directly each individual student’s socio-economic characteristics and associate them with the student-level assessment results. Such data are not collected, however, because of concerns about privacy. Instead, one is able to measure only the average characteristics of households in the community from which a school draws its students -- and then only indirectly.
So, even though we do not know the socio-economic characteristics of individual students, the characteristics of the small census units in which students live can tell us much about them. And we can determine which small census units students live in because we know their postal codes.
Q: There's no rating for Thomas L. Wells Public School in Scarborough. It opened in 2005. Why didn't C.D. Howe include it?
A: To ensure fair representation, only schools where we have three years of data in all three assessments and 15 students or more per assessment per year are included in the rankings.
Since the school you mention opened less than three years ago, there is no evaluation for that school.
Q: In some schools there is a huge difference in rating between Grade 3 and Grade 6; for example, the Grade 3 percentile might be 20 while the Grade 6 percentile is 80, or vice versa.
What could be the reasons for such a discrepancy at the same school? In this case, how do you rate the overall performance of the school?
A: Such a large discrepancy between the Grade 3 score and the Grade 6 score is rare. When it happens, the most likely explanation is that the school is small, with perhaps only one or two classes in each grade, and that the teacher(s) in one grade are far better than the teacher(s) in the other. It could also be that, for some reason, the two grades are drawing students from different neighbourhoods, that is, from different socio-economic contexts.
Q: When defining postal code, what exactly is being used? (On page R1: "By linking student postal codes to census data on education, employment..." ). Is it the first three digits of the postal code?
I'm asking because I believe census data use the first three digits only. So where I live, this includes inner-city schools and schools that in the Star's terminology are "carriage trade." In other words, any attempt to define a neighbourhood using only the first three digits would be flawed.
A: A six-digit postal code is linked to either one census "dissemination area" or, in a smaller percentage of cases, to several dissemination areas. The "neighbourhood" is NOT a geographic concept in the sense of a physical area around a school. It is the actual dissemination areas in which the students at a given school live.
Q: What is the definition in the charts for the column "pass rate"? Achieving which levels - 3 and 4, or 2, 3 and 4? I'm not an expert but I believe 2 is considered a pass.
A: The pass rate is the percentage of students at a given school and grade that achieve at levels 3 or 4 according to EQAO Method 1.
Q: Which schools were compared against each other? Specifically, which schools was Bedford Park compared against?
A: Through the statistical methodology used, every school is effectively compared to every other school in the province. The methodology is hard to fully grasp unless one is familiar with the statistical technique of regression analysis. To put it simply, a mathematical equation based on several socio-economic variables is estimated using the EQAO school results and the census variables for all schools in the province. This equation is then used to "predict" the score of each school given the particular values of the socio-economic variables for that school. If a school does better than this equation predicts, then it scores above the 50th percentile. If it does worse, then it scores below.
In this sense the comparison really uses all the information from all the schools in the province; it is in no way restricted to a specific area or a specific socio-economic context.
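The regression-and-percentile idea above can be sketched with a toy one-variable model. This is not the study's actual equation (which uses several socio-economic variables); the data and the single predictor are invented purely to show the mechanics: fit an equation to all schools, then rank each school by how far its actual result sits above or below its predicted result.

```python
# Toy illustration: predict each school's pass rate from one made-up
# socio-economic index, then convert the prediction errors (residuals)
# into percentile ranks. A school that beats its prediction lands
# above the 50th percentile; one that falls short lands below.

def fit_line(xs, ys):
    # Ordinary least squares for y = a + b*x.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def percentiles(xs, ys):
    a, b = fit_line(xs, ys)
    residuals = [y - (a + b * x) for x, y in zip(xs, ys)]
    order = sorted(residuals)
    # Each school's percentile is its residual's rank among all schools.
    return [100 * order.index(r) / (len(order) - 1) for r in residuals]

# Five hypothetical schools: socio-economic index and raw pass rate.
xs = [1, 2, 3, 4, 5]
ys = [52, 55, 61, 63, 69]
print(percentiles(xs, ys))
```

Note that a school with a modest raw pass rate can still rank high if it outperforms what its socio-economic context predicts, which is exactly the point of the adjustment.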
Again, this methodology is explained in much more detail in Signposts of Success, David Johnson's 2005 book available from the C.D. Howe Institute. It is explained more concisely with the help of Figure 1 in the e-brief available at http://www.cdhowe.org/pdf/ebrief_39.pdf.
Q: I live in Port Sydney and was looking at our local school's results. I noticed that the socio-economic factor was -21.4. How is this factor determined? Do they consider all residents or only those who have children in the school? Are seasonal residents included?
A: -21.4 is the adjusted pass rate for that school, that is, the pass rate from which we subtract the provincial average pass rate. It means that this school's pass rate is 21.4 percentage points below the provincial average (which is about 60). The adjusted pass rate has nothing to do with the socio-economic variables. The measures adjusted for the socio-economic factors are the percentiles.
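The arithmetic behind that figure, using the approximate provincial average quoted in the answer, is just a subtraction run in reverse:

```python
# adjusted pass rate = school pass rate - provincial average pass rate,
# so the raw pass rate can be recovered by adding the average back.
provincial_average = 60.0   # approximate value given in the answer
adjusted_pass_rate = -21.4  # the figure quoted for this school
raw_pass_rate = provincial_average + adjusted_pass_rate
print(round(raw_pass_rate, 1))  # → 38.6
```

So this school's raw pass rate is roughly 38.6 percent, about 21.4 percentage points below the provincial average.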
Q: Why does David Johnson's formula take percentage of single detached homes as a socio-economic factor? My semi-detached home in Riverdale is worth many times more than many homes in other areas of the province. Does the high amount of semi-detached homes in Riverdale bring down our socio-economic rating? Are high density areas penalized or do they get a boost by his "crunching" formula?
A: The proportion of single detached homes in an area turns out to be a good indicator of wealth, stability and community involvement of the residents in that area. The higher the percentage of single detached homes in an area, the higher the socio-economic status of that area tends to be. Schools in an area with a higher percentage of single detached homes would be expected to do better, all else equal.
Q: Regarding alternative schools, are all students included in this testing? Also, for schools that have fewer than 16 students participating, why are the results not included, and how do you get an accurate result for the past five years compared with other schools?
A: The rankings use an average of three years of EQAO results over three assessments, but only if there are 15 students or more in all years/assessments. This is done to ensure sufficient results for a fair and representative comparison. All students in public schools are included in the rankings if they meet the filters above. Alternative schools are included if they participate in the EQAO testing process.
Q: Please indicate the census data you have used in the study. What year is it taken from and what formula did the professor use to compensate for the factors considered?
A: The socio-economic data are from the 2001 Census. The detailed methodology, including the regression equations used to construct the rankings, is explained in Chapter 6 of David Johnson's book, Signposts of Success, available in print from the C.D. Howe Institute.
Q: Were private schools rated as well? If so, where can I find this information?
A: Only public schools that participate in the EQAO testing process were evaluated.
Q: How are the school ratings calculated? I know that EQAO is used as the primary source of data; are there any other assessments used? If EQAO is used, how do you adjust for the fact that the EQAO test has been changed recently, and therefore cannot be used to track students' progress effectively? After all, the assessment cannot be used to assess whether a school educates students.
For example, if a student moves into a school in Grade 3, and takes the EQAO test, it cannot be determined if the student improves. Of course the test will be taken again in Grade 6, but the test has changed since they last took it. Any comment?
A: Only EQAO test results are used, along with 2001 census data to adjust for the schools' socio-economic contexts. The EQAO data are not used to track a student's or a school's progress; they are used to evaluate what percentage of students in a grade meet EQAO standards as compared to other schools in similar socio-economic environments. To get a more representative evaluation, three-year averages are used.
Q: Did the C.D. Howe Institute use Method 1 or Method 2 EQAO results? If Method 1 ... are you aware that there are many problems with this reporting? For example, Method 1 includes the results of Mildly Intellectually Disabled students. These students typically do not write EQAO and thus in Method 1 receive zeros. This skews the results of any school with an MID program.
Moreover, as a teacher I can tell you that our results almost always include scores of zero for children no longer even enrolled in our school at the time of EQAO testing. The ministry still has them listed as attending. Students absent, for example on a trip during testing (quite frequently happens nowadays in inner city schools - families travel during the school year) also count as zero and skew the results.
A: David Johnson's methodology uses EQAO "Method 1," because the process of exempting students from writing assessments creates more significant problems with Method 2.
His book explains the reasoning as follows (Chapter 5): "Some Ontario elementary school students are exempted from the Grade 3 and Grade 6 assessments undertaken by the Education Quality and Accountability Office (EQAO). Students can be exempted when, even with all possible accommodations, 'the student would be unable to participate productively and/or where the student’s participation would be harmful.' The overall exemption rate is quite low — falling between 4 and 7 percent — but it varies across years and across assessments. There is also some evidence that exemption rates vary across boards — that is, two schools with the same socio-economic characteristics but located in different boards can have systematically different exemption rates. Exemption rates are important because some users of school assessment results emphasize EQAO’s 'Method 2' presentation of results, which are open to manipulation. In this methodology, the key variable is not the percentage of all students at a school that achieves at Level 3 or Level 4 but the percentage that actually wrote the assessment at a school that achieves at those levels. Thus, a school or board that wanted to improve its apparent assessment results using Method 2 could do so simply by exempting more students who would not score at Level 3 or Level 4. In the logical extreme, 'perfect' Method 2 results could be obtained by exempting all students except those who would score at Level 3 or Level 4. Most school boards, EQAO itself, and most users of EQAO data do not make use of Method 2 results."
C.D. Howe is the first organization to calculate the socio-economic level of each grade school in the province, and then compare its test scores with those of schools from a similar background, rather than lumping them all into the same crude ranking. Their approach is being hailed as a much fairer way of determining whether a school is doing a good job educating its students.
Designed by Wilfrid Laurier economics professor David Johnson, the ratings calculate each school's socio-economic profile based on census data for that neighbourhood, then compare test scores among schools with similar demographics, from family income to parents' education. Schools scoring around 50 are about average; those above 80 are doing a tremendous job, and those scoring in the 30th percentile or lower should be cause for concern.