School socio-economic composition and student outcomes in Australia: implications for educational policy.
Perry, Laura; McConney, Andrew
Introduction
National educational policy analysis and evaluation are complex
endeavours that demand empirical data-gathering efforts of
appropriate scale and high quality; mounting such efforts, however,
can be resource- and time-intensive. As an alternative, perhaps
under-utilised, strategy, this paper describes a retrospective secondary
analysis of an existing large-scale data set that potentially adds value
to educational policy evaluation. Specifically, as a member of the
Organisation for Economic Co-operation and Development (OECD), Australia
participates in the Programme for International Student Assessment
(PISA) that assesses the literacy of 15-year-old students in reading,
mathematics and science. PISA is administered on a cyclical three-year
schedule that began in 2000 with a focus on reading, followed in 2003
with a focus on mathematics and 2006 with a focus on science. The PISA
surveys have made an important departure from other international
assessments by decoupling the instruments from school curricula; rather,
the assessment instruments are based on holistic definitions of
discipline-specific literacies--the skills and knowledge deemed
necessary for personal and working life in industrialised countries with
21st-century economies--in the core learning areas of reading,
mathematics and science (OECD, 2004). PISA data sets are housed and
managed by the Australian Council for Educational Research (ACER) and it
is the 2003 data set that is the subject of our secondary analysis here.
Australia's Commonwealth government has begun to consider
applying a so-called 'socio-economic status (SES) model'
within its policies guiding school funding. For the current study, we
suggest that the secondary analysis of extant large-scale data sets can
provide important input to the discussion of Commonwealth school funding
policy by shedding light on previously obscured or possibly unexamined
relationships. In particular, it is already well established in the
educational research literature that the socio-economic status of
individual students is strongly associated with educational achievement
as measured by standardised assessment systems, whether local, national
or international. In addition, various international studies have shown
that the aggregated socio-economic profile of a school is also
positively associated with students' academic achievement (OECD,
2004; Rumberger & Palardy, 2005; Sirin, 2005).
On the other hand, less is known about the nature of these
relationships when both individual student and school socio-economic
status are disaggregated. To uncover these finer-grained associations,
we subjected Australia's 2003 PISA data set to retrospective
secondary analysis to better understand the reading and mathematics
literacy performance of secondary school students from different SES
backgrounds, across a variety of school SES strata. This analysis
therefore contributes to our understanding in two important ways. First,
from a methodological perspective, the study demonstrates the process
and potential usefulness of a secondary analysis approach using a
large-scale dataset as a contributor to national policy evaluation.
Secondly, the study adds value from a substantive perspective in
shedding light on a key policy question currently facing the
Commonwealth: specifically, the findings presented will add to
data-informed decision-making around the appropriate federal funding of
public education, as well as the use of public funds in the support of
independent and Catholic systems of schooling across Australia. In these
two ways, this secondary analysis demonstrates a strategy that holds
potential for optimising the value of public policy evaluation through
the enhanced use of extant large-scale, high-quality data sets in the
consideration of important national policy questions.
Socio-economic status and student outcomes
School socio-economic composition is a strong predictor of student
academic achievement in many countries (OECD, 2004; Rumberger &
Palardy, 2005; Sirin, 2005). Although studies in numerous countries have
shown that the socioeconomic profile of schools is positively associated
with achievement, our understanding of how this may vary across groups
of students, schools, or national contexts remains incomplete. As with
class size (American Educational Research Association, 2003), it is
likely that the association between school SES and achievement varies
with student background (family) characteristics, institutional or
sectoral arrangements, or national contexts.
For school SES, previous studies have examined variations in the
association between school composition and achievement for students from
different racial and socio-economic backgrounds. For example, four
decades ago, Coleman and colleagues (1966) found that lower SES
African-American students benefited from attending a racially integrated
school, whereas the achievement of their middle-class white peers did not
differ. More recent studies have suggested that the association between
achievement and school SES is strong for all students (Caldas &
Bankston, 1997; OECD, 2004; Tate, 1997), but many of these have not
disaggregated students by SES to show conclusively that the association
is similarly strong for all students.
Similarly, the relationship between individual students' SES
and academic achievement is well established (Jencks et al., 1972;
Marjoribanks, 1979; Noel & de Broucker, 2001; OECD, 2004). This
association has been shown to be strong and positive; typically, higher
student-level SES is associated with stronger educational outcomes, on
average. For example, in a meta-analysis of 74 studies examining SES and
academic achievement, Sirin (2005) confirmed that student-level SES is
one of the strongest correlates of academic performance. Higher SES
students typically have higher scores on standardised achievement tests
and are more likely to complete secondary school and university than
their peers from lower SES backgrounds (Blossfeld & Shavit, 1993;
Willms, 1999).
Despite these established understandings, questions remain. In
particular, our understanding of how academic achievement varies when
profiled in the context of both student-level and school-level SES
remains incomplete. Some studies suggest that the association between
achievement and school SES is stronger for lower SES students than for
their higher SES peers (Kahlenberg, 2001; Thrupp, 1995), while others
posit that the association is similar across the full range of
student-level SES (OECD, 2004; Rumberger & Palardy, 2005). Here, we
examine this tripartite association for disaggregated groups of students
and schools--our aim being to shed light at a finer grain and thereby
better inform policy making around federal school funding. We have two
main questions:
* to what extent is the association between school SES and student
achievement consistent for all students regardless of their individual
SES?
* to what degree does student achievement increase in a linear
fashion as school SES increases (that is, is the relationship
essentially linear, or does it depart from linearity, perhaps suggesting
that the relationship tapers off as school SES increases or, conversely,
that there are thresholds that must first be crossed before the strongly
positive relationship between SES and academic performance is seen)?
Method
Our methodological approach is similar to that recently used to
compare the effectiveness of private and public schooling across student
SES groups in the USA and Chile (Lubienski & Lubienski, 2005;
Matear, 2006), and to examine the disaggregated relationships among
individual SES, school SES and achievement in Australia (Perry &
McConney, in press).
Specifically, we used secondary analysis of the 2003 PISA data set
for Australia. Within this secondary analysis, we drew on disaggregated
descriptive statistics and graphical representations to compare the
literacy performance of secondary students in two subject areas (reading
and mathematics) across various student SES backgrounds, and across a
range of school SES profiles. Our aim is not to show the extent to which
school SES explains variation in student achievement, which has already
been done in the primary analyses of PISA. Rather, our aim is to show
how the association between school SES and student performance varies
for different students and across different schools in a simple but
powerful way that is meaningful to policy-makers and readers without
advanced statistical expertise.
As noted above, PISA is a major international assessment of
15-year-olds' literacy performance in three subject areas:
mathematics, reading and science (problem-solving was also included in
the 2003 round). Developed by the OECD, it assesses students'
ability to apply their skills and knowledge in particular subject areas
and to communicate their findings when they do so. The objective of PISA
is to support member countries' educational systems in the
development of the skills and knowledge necessary for personal and
working life in industrialised countries. PISA therefore assesses
students' literacy in the three subject areas rather than
achievement tied to a specific curriculum to which students may have
been exposed in school. Test questions derive from hypothetical
situations or problems that students could reasonably be expected to
encounter in their adult lives (OECD, 2004).
For the 2003 PISA round, all OECD member countries and 11 partner
(non-OECD) countries participated. In total, the sample from the
member countries included more than 250,000 students, increasing to more
than 275,000 students with the inclusion of those from partner
countries. Each country's sample is drawn to be statistically
representative of the population of students enrolled in different
types of schools (for example, private or public, college, preparatory
or vocational schools) and locations (for example, urban or rural). The
Australian sample included 321 schools and just over 12,500 students
representative of the population of 15-year-old students across the
country. The sample statistics generated from this dataset are therefore
representative of the Australian population of 15-year-old secondary
students, and subgroups within that population.
PISA's measure of student-level SES is a composite index of
the following variables: highest parental occupational status, highest
parental educational attainment (years of education), and economic and
cultural resources in the home. PISA has named this variable ESCS
(economic, social and cultural status), and each participating student
completes a questionnaire that allows an individual ESCS score to be
assigned.
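To make the idea of such a composite concrete, the simplified sketch below standardises three illustrative component variables and averages them into a single index. This is only a schematic stand-in, not the OECD's actual ESCS scaling procedure, and the column names are hypothetical.

```python
# Schematic illustration (not the OECD procedure): build a composite SES
# index by z-scoring three component variables and averaging them.
# Column names are hypothetical stand-ins for occupation, education and
# home-resource measures.
import pandas as pd

def composite_ses(df: pd.DataFrame) -> pd.Series:
    components = ["parent_occupation", "parent_education_years", "home_resources"]
    # Standardise each component to mean 0 and SD 1, then average across them.
    z = (df[components] - df[components].mean()) / df[components].std()
    return z.mean(axis=1)
```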
To calculate aggregated school-level SES, we averaged the ESCS
scores of every student who participated in PISA from a given school. We
hasten to underline that PISA is designed for administration to
15-year-old students. This means that in no case did we have the
individual ESCS for every student in a given school participating in
PISA 2003. For the 321 schools that comprised the Australian data, the
size of the student group ranged from a low of 5 students to a high of
61 students. Importantly, 305 (95%) of the 321 schools participating for
Australia had student groups of more than 20, with the average student
group size being about 39 students. We have termed this measure of
school-level SES 'mean school-group SES' and, given that a
whole-school SES measure is not available in the Australian data set,
consider it a relatively stable proxy.
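In computational terms, the aggregation described here is a per-school average of participants' ESCS scores, together with a check of the PISA group sizes on which each average rests. A minimal sketch follows, assuming a pandas data frame `aus` with illustrative columns `school_id` and `escs` (not the released data set's own labels).

```python
# Minimal sketch: mean school-group SES from participating students' ESCS,
# plus a summary of PISA group sizes per school. Assumes a DataFrame `aus`
# with illustrative columns 'school_id' and 'escs'.
import pandas as pd

def school_group_ses(aus: pd.DataFrame) -> pd.Series:
    # Average ESCS of the PISA participants in each school, used as a
    # proxy for whole-school SES.
    return aus.groupby("school_id")["escs"].mean()

def group_size_summary(aus: pd.DataFrame) -> pd.Series:
    sizes = aus.groupby("school_id").size()
    return pd.Series({
        "schools": sizes.shape[0],
        "min_group": sizes.min(),
        "max_group": sizes.max(),
        "mean_group": sizes.mean(),
        "share_over_20": (sizes > 20).mean(),  # reported as about 95% in the text
    })
```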
Briefly, the methodology we used in computing reading and
mathematics achievement means across student and school SES bands was as
follows:
(1) The Australian subset (about 12,500 students) was extracted
from the 2003 PISA data housed at the Australian Council for Educational
Research (ACER).
(2) We constructed student-wise average literacy performance scores
in reading and mathematics using the sets of 'plausible
values' for these subjects provided in the data set.
(3) Using the individual student SES variable (called ESCS in
PISA), we sorted the data set according to SES and divided the data set
into five parts, based on student SES.
(4) Again using the individual SES variable, as well as the unique
school identifier variable (321 schools in the Australian data set), we
computed a 'mean school-group SES' variable and added it to
the data set.
(5) We determined the quintile cut-points on this mean school-group
SES variable.
(6) Each student therefore carried average scores in reading and
mathematics literacy performance, individual SES, unique school
identifier and mean SES of the school group to which he/she belonged.
(7) The overall Australian data set was cut into quintiles, based
on individual student SES (these subgroups each contained about 2,500
students and are the five rows represented in tables 1 and 2).
(8) Each of the five groups thus formed was further disaggregated
into five subgroups using the quintile cut-scores associated with the
mean school-group SES variable.
(9) These procedures left us with 25 subgroups organised by
individual SES and by mean school-group SES; these subgroups ranged in
size from a low of 88 students to a high of 1,212 students.
(10) We computed the group-wise mean scores in reading and
mathematics for each of these 25 subgroups, which are given by subject
in tables 1 and 2.
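For readers who wish to follow the logic of steps 2 to 10 in code, the sketch below shows how such a disaggregation might be carried out with the pandas library. It assumes a data frame `aus` containing hypothetical column names (`school_id`, `escs`, reading plausible values `pv1read` to `pv5read`, and mathematics plausible values `pv1math` to `pv5math`); the released PISA data set uses its own variable labels, so this is an illustration rather than a reproduction of our exact procedure.

```python
# Sketch of the disaggregation procedure (steps 2-10), assuming a pandas
# DataFrame `aus` of Australian PISA 2003 records with illustrative columns:
# 'school_id', 'escs', 'pv1read'..'pv5read', 'pv1math'..'pv5math'.
import pandas as pd

def disaggregate(aus: pd.DataFrame) -> dict:
    aus = aus.copy()

    # Step 2: student-wise average of the five plausible values per subject.
    aus["reading"] = aus[[f"pv{i}read" for i in range(1, 6)]].mean(axis=1)
    aus["math"] = aus[[f"pv{i}math" for i in range(1, 6)]].mean(axis=1)

    # Step 3: quintiles of individual student SES (ESCS).
    aus["student_ses_q"] = pd.qcut(aus["escs"], 5, labels=list(range(1, 6)))

    # Step 4: mean school-group SES, averaged over PISA participants per school.
    aus["school_ses"] = aus.groupby("school_id")["escs"].transform("mean")

    # Steps 5 and 8: quintile cut-points on mean school-group SES.
    aus["school_ses_q"] = pd.qcut(aus["school_ses"], 5, labels=list(range(1, 6)))

    # Steps 9-10: group-wise mean scores for the 25 subgroups.
    cells = aus.groupby(["student_ses_q", "school_ses_q"], observed=True)
    return {
        "reading": cells["reading"].mean().unstack(),  # cf. Table 1
        "math": cells["math"].mean().unstack(),        # cf. Table 2
        "n": cells.size().unstack(),                   # subgroup sizes
    }
```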
Empirical findings
As portrayed in tables 1 and 2, the aggregated SES of the school
group matters. Put another way, the socio-economic context of the school
in which students find themselves is strongly associated with academic
performance, on average. For example, as shown in Table 1, for the
typical student in the first SES quintile, being part of a high SES
school group versus a low SES school group is associated with a
difference of about 57 points (0.6 of a standard deviation) in reading
achievement.
For readers interested in a statistical yardstick for appraising
the magnitude of the differences among school-group means within
individual student SES quintiles, we have also provided the standard
errors associated with each student-level SES quintile. The commonly
used standard error of the mean is a yardstick for judging how much the
value of any sample mean may vary from sample to sample taken from the
same distribution. It can be used to compare an observed mean to a
hypothesised value (for instance, one can conclude the two values are
statistically different if the ratio of the difference to the standard
error is less than -2 or greater than +2).
For the current case, we are of the view that the more relevant
question is how much the difference between any pair of means, drawn
from a common source, might vary if repeated sampling had been possible.
We have therefore provided the standard error associated with
sample-mean differences for each of the five quintiles based on
individual student SES. Differences greater than two standard errors
indicate statistically meaningful disparity between that pair of means.
For example, within the first student SES quintile, first quintile
schools have significantly lower mean scores than fourth and fifth
quintile schools, but their mean difference compared with second and
third quintile schools is not significant. Nonetheless, the main purpose
of Table 1 (and Table 2) is not to test hypotheses about each mean
difference, but rather to examine the overall patterns of mean
difference across the two tables.
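As a worked illustration of this yardstick, the short snippet below applies it to the first student SES quintile row of Table 1 (reading) and its standard error of 12.8; it is purely arithmetic on the tabled values and reproduces the pattern just described.

```python
# Worked check of the two-standard-error yardstick, using the first
# student SES quintile row of Table 1 (reading) and its SE of 12.8.
row1 = {"Q1": 458.8, "Q2": 466.0, "Q3": 471.5, "Q4": 503.3, "Q5": 516.0}
se_diff = 12.8

for school_q in ("Q2", "Q3", "Q4", "Q5"):
    diff = row1[school_q] - row1["Q1"]
    ratio = diff / se_diff
    flag = "significant" if abs(ratio) > 2 else "not significant"
    print(f"Q1 vs {school_q}: diff = {diff:5.1f}, ratio = {ratio:4.1f} -> {flag}")

# Resulting pattern: Q2 (ratio about 0.6) and Q3 (about 1.0) are not
# significant; Q4 (about 3.5) and Q5 (about 4.5) are, matching the
# comparison described in the text.
```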
In mathematics, as depicted in Table 2, for the typical student in
the first SES quintile, being part of a high SES school group versus a
low SES school group is also associated with a difference of about 57
points (0.6 of a standard deviation). It is also evident that the
pattern of association between increases in average performance and
increases in school-group SES holds consistently across the quintiles
based on individual student SES. For example, as seen in Table 1, for
mid-SES students the difference in average reading achievement
associated with being in a low SES school group as compared to a high
SES school group is about 63 points (or about 0.7 standard deviation
units). For high SES students, the difference in average reading
performance associated with being in a low SES school group as compared
with a high SES school group is 54 points (0.6 of a standard deviation). As
portrayed in Table 2, similar comparisons in mathematics yielded
differences of 67 (for mid-SES students) and 56 points (for high-SES
students), respectively.
Furthermore, and consistent with previous research, individual
student SES also matters. For example, as depicted in
Table 1 in the case of reading, the difference between the average low
SES student in a low SES school and the average high SES student in a
similar school is about 90 points, or just about one standard deviation.
For school groups in the mid-SES range, the reading achievement
difference between the average low SES student and the average high SES
student moderates somewhat to about 78 points, or 0.8 standard
deviations, but for high SES school groups the difference in average
reading achievement again stretches to 86 points, or close to one
standard deviation.
These patterns of substantial difference in average achievement
associated with changes in individual student SES are also observed for
mathematics. For example, in mathematics the difference between the
typical low SES student and the typical high SES student, both in
mid-SES school groupings, is 71 points. Similar to the case for reading,
the observed difference in mathematics achievement between the average
high SES student and the average low SES student, both in high SES
school groupings, is about 84 points.
Our purpose in systematically disaggregating these data has also
been to provide a finer-grained portrait of the relationships among
individual student and school SES and academic literacy performance,
including such issues as whether there are evident 'school SES
thresholds' that must first be crossed before the positive
relationship between SES and academic performance is seen, and whether
observed patterns continue to be strongly positive across the entire
range of student and school-group SES. Figures 1 and 2 are provided to
offer tentative answers to these questions.
[FIGURE 1 OMITTED]
First, from these two figures the strength and consistency of the
association between mean school-group SES and academic literacy
performance across the quintiles representing individual student SES, as
well as across reading and mathematics, are remarkable. In no case is
there overlap among the lines representing the academic literacy
performance of different SES cohorts across the two subjects. In other
words, for both reading and mathematics, literacy performance as
measured by PISA almost universally increases steadily and consistently
as school SES increases, for each of the five student-level SES
quintiles.
Secondly, consistently across the two subjects, but perhaps most
notably in reading, there does appear to be something like a
school-group SES threshold--located at around the third school-group SES
quintile--below which the relationship between school-group SES and
academic attainment is positive but quite moderate, and beyond which the
relationship becomes strongly positive. For the Australian sample, this
may reflect the transition from lower- and middle-SES public schools to
more affluent private or public schools.
[FIGURE 2 OMITTED]
Thirdly, we point out a phenomenon evident in both reading and
mathematics for students in the highest individual SES quintile
(represented by the uppermost line in each chart). These lines show that
for students in this highest SES cohort, there is a small but noticeable
fall-off in average academic performance when comparing second (and
sometimes third) quintile school-group performance against first
quintile school-group performance; we refer to this phenomenon as
'the hockey stick' and note that it appears for no other
quintile in the data set. Although we know that the group of
high SES students in the lowest SES school groups is small in comparison
to other groups, and suspect that the higher mean scores simply
reflect the small size of that group, we cannot confirm this
suspicion from these data alone.
Overall, the message resulting from our retrospective secondary
analysis of the 2003 PISA data set for Australia is clear and
consistent. As detailed in tables 1 and 2 and portrayed by figures 1 and
2, the aggregated SES of the school-group matters substantially. Put
another way, the SES context in which students find themselves is
strongly and consistently associated with academic performance, across
all student SES groupings. Similarly, and in concert with what was
previously known, it is also the case that individual student SES
matters greatly in the Australian context. For the core subjects of
reading and mathematics, higher individual student SES is positively
associated with higher academic literacy performance on average, and
this patterning was consistently observed across all five school-group
SES quintiles.
Educational policy implications
The Australian educational system can be characterised as
relatively equitable and effective, with high levels of school choice
and privatisation (Perry, 2009; Thomson, Cresswell & De Bortoli,
2003). As many previous studies about school socio-economic composition
and student achievement have been conducted in the USA, studies of other
national contexts can illuminate the ways in which educational policies
and structures influence the relationship. From an education policy
point of view, understanding which students are most affected by school
composition can help to shape policy options. For example, if high SES
students are relatively immune to the influence of school SES, then
there is no policy disincentive to fostering the socio-economic
integration of schools. If, on the other hand, low SES students are
strongly influenced by school SES, then policies need to take that into
account.
The findings from our secondary analysis of the Australian PISA
2003 data are clear: all students, regardless of their personal
SES, benefit strongly and relatively equally from schooling contexts in
which the SES of the school-group is high. Our findings similarly show
that all students, regardless of their individual SES, perform
considerably less well on measures of academic achievement in school
contexts characterised, in the aggregate, as low on the SES continuum.
Thus, the segregation of schools according to SES provides further
benefits for students whose economic circumstances allow attendance at
high SES schools, and also further handicaps students who lack this
advantage. That is, schooling that is segregated by SES is most likely
to benefit students who are already educationally privileged, but harm
students who find themselves at educational disadvantage, associated
with low SES backgrounds. Rather than mitigating or mediating
educational inequity, school segregation exacerbates it. Judged against
the goal of equitable educational benefit for all students, schools with
large concentrations of students from low SES backgrounds therefore
disadvantage those students. Educational policies that work
against the segregation of students and schools based on SES could be
vigorously pursued on the simple basis that they are likely to achieve
better and more equitable educational outcomes for all, rather than for
an economically privileged few. For these reasons, a strong consensus
exists among educational researchers and policy-makers that the
minimisation of school segregation based on SES should be a central
outcome of educational policy (Lamb, 2007; Oakes, 2000; OECD, 2004,
2005; Orfield, 1996; Willms, 1999).
While reducing school socio-economic segregation is not an easy
task, a number of innovative approaches have been tried by schools and
districts in different countries. No single approach will dramatically
reduce segregation, but taken together they have the potential to make a
meaningful impact. The first group of approaches relates to reducing
real or perceived differences in quality between high and low SES
schools. This means paying attention to the inputs and resources
available to schools. The second group of approaches relates to
providing incentives to attract high SES students to lower SES schools.
One way to minimise differences in quality between low and high SES
schools is to adopt a funding model that provides similar resources to
all schools, and additional funding to schools with high needs (e.g.,
schools that are located in rural and remote areas, that enrol a high
percentage of students with learning disabilities or that serve a high
percentage of students with disadvantaged social backgrounds). Funding
models used in New Zealand and the UK minimise differences in
educational resources between schools. In these countries, all private
and public schools are entitled to the same funding based on the number
and type of students they enrol as long as they do not charge student
fees. Schools that charge fees relinquish their right to receive public
funds. This funding model provides an equitable distribution of
resources to schools but also promotes diversity and choice within the
educational system. Such a model is also simpler and more transparent
than the current funding model in Australia, which commentators have
described as opaque and overly complex (Dowling, 2008). This model could
also be politically feasible to implement since it would save many
families thousands of dollars in school fees without compromising the
quality of education on offer at their schools. The main
'losers' in this model would be high-fee independent schools
that are currently receiving public funds and the families whose
children attend them. Without Commonwealth funding, it is likely that
fees at these schools would increase to maintain the same quality of
education provision. Fees at such schools in the USA and UK are
significantly higher than in Australia, at least in part because they do
not receive any public funds.
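To illustrate the general shape of a needs-based funding model of the kind described above, the sketch below computes a school's allocation from a common per-student base plus loadings for the high-need categories mentioned earlier. All amounts and weights are hypothetical and are not drawn from any actual funding scheme, Australian or otherwise.

```python
# Hypothetical sketch of a needs-based funding formula: a common per-student
# base plus loadings for high-need categories. All amounts and weights are
# illustrative only, not an actual funding scheme.
def school_funding(n_students: int,
                   base_per_student: float = 10_000.0,
                   remote: bool = False,
                   share_learning_disability: float = 0.0,
                   share_low_ses: float = 0.0) -> float:
    loading = 1.0
    if remote:
        loading += 0.15                           # rural/remote loading
    loading += 0.30 * share_learning_disability   # learning-disability loading
    loading += 0.25 * share_low_ses               # disadvantaged-background loading
    return n_students * base_per_student * loading

# Example: a 600-student metropolitan school in which 40% of students come
# from low SES backgrounds would attract 600 * 10,000 * 1.10 = 6.6 million
# under these illustrative weights.
```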
Another way to reduce differences among schools is to ensure that
core curricular and programmatic offerings are relatively similar across
all schools. Marks, Cresswell and Ainley (2006) have shown that the
educational advantage that high SES students enjoy is mediated primarily
through the curriculum that they receive. High-SES students are likely
to attend schools that have rigorous and demanding academic programs
oriented toward university entrance exams. Currently, high-quality
academic programs tend to be concentrated in private schools and in
public schools in higher SES communities (Edwards, 2006; Lamb, Hogan
& Johnson, 2001). Rather than maintain this financially and
geographically selective access to high-quality academic programs,
making such programs available to all students regardless of their
financial resources or place of residence could improve educational
opportunities for lower SES students. Increased funding to lower SES
schools could be used to support in-service training of teachers in
these programs, recruit experienced and successful teachers or subsidise
program costs.
Increased investment to lower SES schools could be used to help
them introduce or improve programs that will make them more attractive
to higher SES families. Such programs could include high-quality
university preparatory programs, intensive or immersion foreign language
programs, and specialised curriculum such as the International
Baccalaureate program. Another approach is to establish partnerships
between low SES secondary schools and local universities to permit able
students to enrol in university classes free of charge. Yale University,
for example, has such partnerships with low SES schools in New Haven,
Connecticut--the seventh poorest city in the continental USA (see Yale
University, 2008). Similarly, some lower SES school districts in the USA
have been able to attract higher SES students by providing financial
incentives upon graduation. For example, the Kalamazoo Promise, an
initiative 'funded into perpetuity by a small group of anonymous
donors' (Kalamazoo Promise, 2010), provides scholarships to
graduates of the Kalamazoo public school district in Michigan to attend
any of the 15 public universities in the state, including the
prestigious University of Michigan.
While balanced school compositions can be facilitated by making
lower SES schools more attractive to higher SES families, we also
acknowledge that all students who are struggling in school require extra
support and resources, regardless of the school that they attend. We
agree with other researchers who have called for increased support to
students who are falling behind their peers academically (Lokan,
Greenwood & Cresswell, 2001). But, based on our findings, we also
believe that policy measures should target schools and school funding to
reduce the association between school SES and student achievement.
Conclusion
Many of the measures we have recommended here, such as increased
funding to low SES schools, are consistent with the policies of the
current federal Labor government. We endorse the view that low SES
schools in all sectors (that is, government, Catholic and independent)
need to be better supported. The socio-economic composition of schools
has a significant influence on all children's academic performance.
For the benefit of most children and the larger society, balanced school
socio-economic composition should be a primary aim of educational
policy, and should be used as a criterion against which other policies
are evaluated. Reducing socio-economic school segregation is not only
equitable but also effective. For example, the association between
school SES and student achievement is weaker in Canada and Finland than
in Australia, and both countries outperform Australia on PISA (OECD,
2004). As these countries show, reducing socio-economic school
segregation and differences among schools promotes higher overall
achievement for all students without decreasing the achievement of
high-performing students. Reducing school socio-economic segregation
does not mean that other foundational objectives, such as diversity and
choice, should be ignored. Rather, they should be pursued in ways that
do not reduce the educational opportunities and outcomes of students
from socially disadvantaged backgrounds.
References
American Educational Research Association. (2003). Class size:
Counting students can count. Research Points, 1(2), 1-4.
Blossfeld, H.-P., & Shavit, Y. (1993). Persisting barriers:
Changes in educational opportunities in thirteen countries. In Y. Shavit
& H.-P. Blossfeld (Eds.), Persistent inequality (pp. 1-24). Boulder,
CO: Westview.
Caldas, S. J., & Bankston, C. III. (1997). Effect of school
population socioeconomic status on individual academic achievement.
Journal of Educational Research, 90(5), 269-277.
Coleman, J., Campbell, E., Hobson, C., McPartland, J., Mood, A.,
Weinfeld, F., & York, R. (1966). Equality of educational
opportunity. Washington, DC: US Government Printing Office.
Dowling, A. (2008). Towards a national school funding model.
Research Developments, 19(2), 1-5.
Edwards, D. (2006). Competition, specialisation and stratification:
Academic outcomes of the government school system in Melbourne,
Australia. Paper presented at the Comparative Education Society in
Europe conference, Granada, Spain.
Jencks, C., Smith, M., Acland, H., Bane, M. J., Cohen, D., Gintis,
H., et al. (1972). Inequality: A reassessment of the effect of family
and schooling in America. New York: Basic Books.
Kahlenberg, R. (2001). All together now: Creating middle-class
schools through public school choices. Washington, DC: Brookings
Institution.
Kalamazoo Promise. (2010). Retrieved from www.kalamazoopromise.com
Lamb, S. (2007). School reform and inequality in urban Australia: A
case of residualising the poor. In R. Teese, S. Lamb & M. Duru-Belat
(Eds.), Education and inequality (pp. 1-38). (Vol. 3.) Dordrecht:
Springer.
Lamb, S., Hogan, D., & Johnson, T. (2001). The stratification
of learning opportunities and achievement in Tasmanian secondary
schools. Australian Journal of Education, 45(2), 153-167.
Lokan, J., Greenwood, L., & Cresswell, J. (2001). The PISA 2000
survey of students' reading, mathematical and scientific literacy
skills: How literate are Australia's students? Melbourne:
Australian Council for Educational Research.
Lubienski, S. T., & Lubienski, C. (2005). A new look at public
and private schools: Student background and mathematics achievement.
Phi Delta Kappan, 86(9), 696-699.
Marjoribanks, K. (1979). Families and their learning environments:
An empirical analysis. London: Routledge and Kegan Paul.
Marks, G. N., Cresswell, J., & Ainley, J. (2006). Explaining
socioeconomic inequalities in student achievement: The role of home and
school factors. Educational Research and Evaluation, 12(2), 105-128.
Matear, A. (2006). Equity in education in Chile: The tensions
between policy and practice. International Journal of Educational
Development, 27(1), 101-113.
Noel, S., & de Broucker, P. (2001). Intergenerational
inequities: A comparative analysis of the influence of parents'
educational background on length of schooling and literacy skills. In W.
Hutmacher, D. Cochrane & N. Bottani (Eds.), In pursuit of equity in
education: Using international indicators to compare equity policies
(pp. 277-298). Dordrecht: Kluwer Academic.
Oakes, J. (2000). The distribution of knowledge. In R. Arum &
I. R. Beattie (Eds.), The structure of schooling: Readings in the
sociology of education (pp. 224-234). Mountain View, CA: Mayfield
Publishing.
Organisation for Economic Co-operation and Development. (2004).
Learning for tomorrow's world: First results from PISA 2003. Paris:
Author.
Organisation for Economic Co-operation and Development. (2005).
School factors related to quality and equity: Results from PISA 2000.
Paris: Author.
Orfield, G. (1996). Dismantling desegregation: The quiet reversal
of Brown v. Board of education. New York: New Press.
Perry, L. (2009). Characteristics of equitable systems of
education: A cross-national analysis. European Education, 41(1), 79-100.
Perry, L., & McConney, A. (in press). Does the SES of the
school matter? An examination of socioeconomic status and student
achievement using PISA 2003. Teachers College Record.
Rumberger, R. W., & Palardy, G. J. (2005). Does segregation
still matter? The impact of student composition on academic achievement
in high school. Teachers College Record, 107(9), 1999-2045.
Sirin, S. R. (2005). Socioeconomic status and academic achievement:
A meta-analytic review of research. Review of Educational Research,
75(3), 417-453.
Tate, W. F. (1997). Race-ethnicity, SES, gender, and language
proficiency trends in mathematics achievement: An update. Journal for
Research in Mathematics Education, 28(6), 652-679.
Thomson, S., Cresswell, J., & De Bortoli, L. (2003). Facing the
future: A focus on mathematical literacy among Australian 15-year-old
students in PISA 2003. Melbourne: Australian Council for Educational
Research.
Thrupp, M. (1995). The school mix effect: The history of an
enduring problem in educational research, policy and practice. British
Journal of Sociology of Education, 16(2), 183-203.
Willms, J. D. (1999). Quality and inequality in children's
literacy: The effects on families, schools, and communities. In D. P.
Keating & C. Hertzman (Eds.), Developmental health and the wealth of
nations: Social, biological, and educational dynamics (pp. 72-73). New
York: Guilford Press.
Yale University. (2008). Partnerships. Retrieved 11 January 2010
from http://www.yale.edu/onhsa/youth_partnerships.htm
Laura Perry is Senior Lecturer in Educational Policy and Contexts
at the School of Education, Murdoch University.
Email:
[email protected]
Andrew McConney is Senior Lecturer in Research and Evaluation at
the School of Education, Murdoch University.
Table 1 Mean reading scores by individual student SES and
school-group mean SES for PISA 2003 Australia

                           School-group SES quintile
Individual student   1st         2nd         3rd         4th         5th          SE of sample-mean
SES (ESCS)                                                                        differences
1st quintile         458.8       466.0       471.5       503.3       516.0        12.8
                     (n = 984)   (n = 690)   (n = 490)   (n = 231)   (n = 88)
2nd quintile         486.2       496.0       503.5       531.3       543.9        9.6
                     (n = 591)   (n = 681)   (n = 596)   (n = 425)   (n = 195)
3rd quintile         498.1       504.2       515.1       541.7       560.9        8.6
                     (n = 416)   (n = 492)   (n = 639)   (n = 568)   (n = 348)
4th quintile         520.3       525.1       529.8       557.2       577.2        9.1
                     (n = 213)   (n = 377)   (n = 516)   (n = 682)   (n = 693)
5th quintile         547.8       543.0       549.4       576.1       601.7        10.9
                     (n = 99)    (n = 199)   (n = 362)   (n = 602)   (n = 1212)

Note: Cell entries are mean reading scores, with group sizes in
parentheses; the final column gives the standard error of sample-mean
differences within each student SES quintile.
Table 2 Mean mathematics scores by individual student SES
and school-group average SES for PISA 2003 Australia

                           School-group SES quintile
Individual student   1st         2nd         3rd         4th         5th          SE of sample-mean
SES (ESCS)                                                                        differences
1st quintile         458.8       459.8       475.3       497.9       515.8        12.3
                     (n = 984)   (n = 690)   (n = 490)   (n = 231)   (n = 88)
2nd quintile         485.5       494.9       505.0       529.4       546.4        9.8
                     (n = 591)   (n = 681)   (n = 596)   (n = 425)   (n = 195)
3rd quintile         495.4       501.3       513.6       538.5       562.2        8.8
                     (n = 416)   (n = 492)   (n = 639)   (n = 568)   (n = 348)
4th quintile         521.6       521.1       530.5       554.8       575.0        9.5
                     (n = 213)   (n = 377)   (n = 516)   (n = 682)   (n = 693)
5th quintile         543.1       535.4       545.9       570.9       599.5        11.7
                     (n = 99)    (n = 199)   (n = 362)   (n = 602)   (n = 1212)

Note: Cell entries are mean mathematics scores, with group sizes in
parentheses; the final column gives the standard error of sample-mean
differences within each student SES quintile.