An Examination of Socioeconomic Status and Student Achievement Using PISA 2003
by Laura B. Perry & Andrew McConney — 2010
Background/Context: It is well established in the research literature that socioeconomically disadvantaged students and schools do less well on standardized measures of academic achievement compared with their more advantaged peers. Although studies in numerous countries have shown that the socioeconomic profile of a school is strongly correlated with student outcomes, less is understood about how the relationship may vary if both individual student and school socioeconomic status (SES) are disaggregated.
Population/Participants/Subjects: This study uses data from the Australian 2003 Programme for International Student Assessment (PISA). The sample includes over 320 secondary schools and more than 12,000 students from Australia.
Research Design: This study is a secondary analysis of data from the Australian 2003 PISA. Descriptive statistics are used to compare the average reading, mathematics, and science achievement of secondary school students from different SES backgrounds in a variety of school SES contexts.
Conclusions: The two main findings of the study are that increases in the mean SES of a school are associated with consistent increases in students’ academic achievement, and that this relationship is similar for all students regardless of their individual SES. In the Australian case, the socioeconomic composition of the school matters greatly in terms of students’ academic performance.
The relationship between students’ socioeconomic status (SES) and their educational outcomes is well established in the research literature (Jencks et al., 1972; Marjoribanks, 1979; Noel & de Broucker, 2001; Organisation for Economic Co-operation and Development [OECD], 2004). The relationship is strong and positive; on average, the higher a student’s SES, the stronger his or her educational outcomes tend to be. In his meta-analysis of 74 studies about SES and academic achievement, Sirin (2005) confirmed that “family SES at the student level is one of the strongest correlates of academic performance” (p. 438). For example, higher SES students typically have higher scores on standardized achievement tests and are more likely to complete secondary school and university than their peers from lower SES backgrounds (Blossfeld & Shavit, 1993; Willms, 1999).
In addition to family (individual student) SES, socioeconomic status at the school level is also related to student outcomes. The aggregated SES of the student body, also known as mean school SES, has been shown to be independently associated with student outcomes, over and above individual student backgrounds (OECD, 2004; Rumberger & Palardy, 2005; Sirin, 2005; Willms, 1999). Schools with a high mean SES tend to have higher average scores on standardized tests than would be expected from their student intake alone. Put another way, the grouping of high-SES students into a school seems to create conditions associated with even higher educational outcomes than would be expected from individual students’ SES alone. The opposite is true for lower SES students. When lower SES students are grouped in a lower SES school, their lower educational outcomes can be exacerbated. Indeed, in some cases the correlation of school SES with academic performance has been shown to be even stronger than that of individual SES (OECD, 2004; Sirin). Thus, in the case of school socioeconomic composition as related to academic performance, the whole appears to be more than the sum of the parts.
Although the relationship between mean school SES and student outcomes is established in the literature, many questions remain. Some studies suggest that the association between academic achievement and school socioeconomic composition is stronger for lower SES students than their higher SES peers (Kahlenberg, 2001; Thrupp, 1995), whereas others suggest that the association is similar for all students (OECD, 2004; Rumberger & Palardy, 2005). The current research literature is not clear whether the association between mean school SES and academic outcomes is stronger for students of some socioeconomic backgrounds than others, and, if so, by how much. This study attempts to shed light on this ambiguity by examining the extent to which the relationship between mean school SES and academic performance holds for higher SES students as compared with lower SES students.
Similarly, gaps remain in our understanding of how the relationship between student SES and academic performance may vary across schools with different socioeconomic compositions. For example, are increases in the mean SES of the school consistently associated with increases in student academic outcomes? Or does the relationship weaken or flatten as the mean SES of the school increases? Similarly, is there a threshold before which increases in the mean SES of the school have limited association with increases in academic achievement? In other words, is the relationship between mean school SES and academic achievement uniformly linear, or does it have other forms depending on student SES? Our second purpose, therefore, was to examine the extent to which the relationship between mean school SES and student outcomes is linear.
Our current knowledge about the association of mean school SES and student outcomes is fuzzy and emerging. We know that the association is strong and positive, but compared with other areas of educational research, our understanding about how the relationship may vary is incomplete. For example, the large literature about the association of class size and student academic outcomes has shown that the relationship varies in a number of important ways. A recent review has shown that lower SES and minority students receive more benefit from small class sizes than their higher SES peers, that a threshold exists above which reductions in class size are ineffective (i.e., reducing the size of a class from 30 to 22 students is not effective, but reducing it from 22 to 17 students is), and that the benefit is most strongly felt in the first few years of primary education (American Educational Research Association, 2003). Understanding how the association of class size with academic attainment varies is important for policy makers and educational leaders who are considering implementing this effective but resource-intensive intervention. Similarly, understanding how the association of academic attainment and mean school SES may vary is important for policy makers contemplating reforms that could exacerbate or mediate school segregation based on SES. Finally, more precise understanding of the association of mean school SES with achievement could help parents make more informed choices about selecting schools for their children.
To answer our research questions and thereby provide a more detailed look at the association between mean school SES, student SES, and academic outcomes, we have conducted secondary analysis of the 2003 Australian data from the Programme for International Student Assessment (PISA). PISA is a large international assessment system (and resulting data set) developed by the OECD. For our purposes, the main advantage of PISA lies in its complex measure of individual student SES, which includes parental education and occupation, and the family’s cultural capital and financial resources. This measure is much more complex and precise than those used in many other data sets, some of which rely on simple proxies—such as parental postal address or participation in a subsidized school meals program—to estimate student SES. Using the PISA data set will also allow us to make cross-national comparisons in future studies.
Analyses of the Australian data are relevant to an international audience for a number of reasons. First, most immigrants and ethnic minorities are relatively well integrated into Australian society and do not suffer high levels of educational disadvantage (Lokan, Greenwood, & Cresswell, 2001; OECD, 2004), as is common in the United States and many European countries.1 Related to this, ethnically or racially segregated schools are much less common in Australian towns and cities than in many other countries. This relative lack of racial or ethnic segregation associated with educational disadvantage means that the relationship between SES at the school and individual levels can be more “cleanly” examined without the confounding influences of race or ethnicity often seen in other developed countries. As Caldas and Bankston (1997) have argued, “More research is needed to see how the economic composition of schools affects achievement in areas with greater racial equality” (p. 275). This study attempts to answer that call.
Second, Australia has a relatively equitable and high-performing educational system compared with other OECD countries (Lokan et al., 2001; OECD, 2004). In PISA 2003, only a handful of countries performed statistically significantly higher than Australia, and Canada was the only English-speaking country to perform at a similar level. And like most of the other high-performing countries in PISA, Australia has more equitable student outcomes than most other OECD countries. These features of the Australian educational system make it an interesting case in which to examine the association of mean school SES and student outcomes. In terms of both quality and equity, Australia places higher than most of the other OECD countries that participated in PISA but lags behind a few exemplary cases such as Finland, Hong Kong, and Canada. The Australian educational system is performing well but could nevertheless be substantially improved. It thus represents a moderate rather than an ideal or extreme case.
Finally, Australia is an interesting case study because of its high levels of privatization and school choice. Parental school choice is widely available and practiced to a much higher degree than in North America. Currently, one third of all Australian students and almost 40% of all secondary students attend private schools (Ryan & Watson, 2004), and choosing a non-neighborhood school within the public sector is common as well. Because mean school SES can be both a driver and consequence of parental school choice, studies about educational systems with high levels of privatization and choice are especially relevant.
Examining mean school SES and student outcomes in different countries can help researchers better understand how the relationship between the two may vary under different educational and sociocultural contexts. Educational practices and the larger sociocultural, economic, and political contexts that shape them vary much more across countries than within them. Examining different national educational systems within a comparative framework provides researchers a range of cases that embody unique combinations of “variables.” This range of diverse cases can then be used to build understanding and theory about the ways in which educational practices and policies interact with larger social forces to lead to particular outcomes.
AUSTRALIAN EDUCATIONAL CONTEXT
Like most other English-speaking countries, Australia has a comprehensive system of secondary education, wherein most students attend the same type of institution (e.g., high school). By contrast, differentiated systems of secondary education, common in continental Europe, provide different types of education in different types of institutions (e.g., lycées, gymnasia, technical schools, and vocational schools), only some of which offer pathways for further study at a university. Comprehensive secondary systems may offer varying degrees of vocational education, but the emphasis for most schools and most students is on general academic education.
Australia has a long history of private schooling, and the number of students enrolled in private schools has been increasing over the last few decades (Ryan & Watson, 2004). As noted earlier, almost 40% of all secondary students in Australia now attend a private school. The private sector is divided into two main types: Catholic and “independent,” many of which are religious as well (e.g., Anglican, Baptist, and Islamic).
All private schools, including parochial schools, receive varying levels of public funding; public schools are funded primarily by state governments. In 2004, one third of public (state and federal) education funding was directed to private schools, which enroll approximately one third of all students in Australia (Ryan & Watson, 2004). Thus, the share of government funding that is directed to public and private schools follows the proportion of students that they enroll. All private schools also charge fees, however. In 2002, average fees were $2,500 per annum at Catholic schools and $6,000 per annum at independent schools (Ryan & Watson); fees at high-status independent schools can reach two to three times as much. The amount of per-pupil funding thus varies across the three main school sectors, although it is relatively uniform within the public school sector. Schools do not receive funds from local authorities, as is common in the United States, although they do request “voluntary fees” from families.
Reflecting these funding differences across school sectors, the average SES of students in the public school sector is lower than in Catholic schools, which is again lower than in independent schools (Ryan & Watson, 2004). Students from lower SES backgrounds are more likely to attend public schools, whereas students from middle and higher SES backgrounds are more likely to attend private schools. Students from the highest SES backgrounds are the most likely to attend the high-status, high-fee independent schools.
School choice also exists within the public school sector. Students are guaranteed a place at their local neighborhood school but may submit an out-of-area application to any public school within the entire state. Although it may be difficult to secure a place at some secondary schools, many students will successfully find a place at a school of their choice.
Curriculum frameworks and standards are created at the state level, although the current federal government aims to implement a national curriculum by 2011 and has initiated a national assessment program beginning in 2008. All schools, whether private or public, are required to teach the same curriculum. Because the curriculum frameworks are broad, however, the actual taught curriculum can vary substantially between schools. Research has shown that lower SES schools are more likely to offer vocational education and a limited range of university-preparation courses as contrasted with higher SES schools (Edwards, 2006).
MEAN SCHOOL SES AND ACADEMIC ACHIEVEMENT
School composition is defined as the aggregated measure of the social backgrounds of the students who attend a school. Student background can be measured along numerous dimensions, such as race, ethnicity, immigrant status, ability, gender, or SES. Within the school composition literature, some studies include multiple dimensions of school composition, whereas others focus on just one dimension, such as SES. In many countries, some of these dimensions are correlated with each other. For example, in countries where a particular ethnic or racial group has a lower social status within the larger society, ethnicity or race is correlated with SES. They are not the same, however, and studies that have measured both the racial and social class composition of a school have found that race exerts a separate influence on academic achievement independent of SES (Caldas & Bankston, 1997).
Coleman and associates’ (Coleman et al., 1966) study of racial segregation in American schools was one of the first to examine the association between school social composition and academic achievement. These researchers found that African American students had higher levels of academic achievement in racially desegregated schools than in racially segregated schools. They also found that increasing the funding of racially segregated schools was less effective than desegregating schools in raising the academic achievement of African American students. Finally, the study found that school desegregation benefited African American student achievement but was not associated with increases in the achievement of the White majority students. The Coleman report had two significant findings relevant to the research literature about school composition and academic achievement: Academic outcomes are more strongly associated with school composition than with school resources or processes, and the association seemed to be stronger for underserved students than for their more privileged peers.
Since the Coleman report, many studies have shown that school socioeconomic composition is strongly correlated with student academic achievement. Two American studies have found that the relationship between academic outcomes and SES at the school level is similar to or the same as at the student level (Caldas & Bankston, 1997; Rumberger & Palardy, 2005). Sirin’s (2005) meta-analysis of 74 studies conducted between 1990 and 2000 similarly found that SES at the school level was more strongly correlated with student academic performance than individual SES. The results from PISA 2000 and 2003 show that in most OECD countries, academic outcomes are more strongly associated with mean school SES than with individual students’ socioeconomic backgrounds (OECD, 2004, 2005).
Other studies have found that school composition has a stronger association with students’ academic achievement than school resources or processes. For example, Robertson & Symons’s (2003) study of secondary schools in the United Kingdom found that the SES of peers within a school had a stronger association with individual students’ academic achievement than did school-level variables such as class size. Lamb and Fullarton (2002) found that peer effects within a classroom, not differences between teachers, explained variation in student outcomes in Australia and the United States. The OECD’s primary analysis of PISA 2003 has also shown that after controlling for mean school SES, the unique effect of school resources is small (OECD, 2004). School resources are highly interrelated with the school’s social intake, however. Because higher SES schools tend to be better resourced than lower SES schools (Darling-Hammond, 2007; OECD, 2004; Tate, 1997), the unique but mediated association of school resources with academic attainment is likely to be underestimated.
Despite the consistency of the message about the association of school SES with academic attainment just described, the research literature remains unclear on the degree to which increases in mean school SES are associated with improvements in achievement across students grouped according to SES. On the one hand are studies that either explicitly or implicitly suggest that the association between academic outcomes and mean school SES is stronger for lower SES students than for their higher SES peers. These include the Coleman report (Coleman et al., 1966); a recent speech by Barry McGaw, one of the architects of PISA (McGaw, 2007); and a study by Opdenakker and Van Damme (2001), which found that high-ability low-SES students in Belgium were more than “twice as sensitive as the students with the same ability level from high SES families” (p. 424) to the mean SES of their schools.
On the other hand are studies that show that all students benefit from increases in mean school SES. For example, in the United States, Rumberger and Palardy (2005) have shown that increases in mean school SES are associated with gains in the achievement of students from both higher and lower SES backgrounds. Likewise, the PISA reports have shown that all students benefit from attending schools with a higher mean SES (OECD, 2004). Other studies have not explicitly examined the association for different groups of students but argue that any given student will perform better at a higher mean SES school than at a lower SES school, thus suggesting that the association is similar for all students regardless of their individual SES (Lauder & Hughes, 1999; Sui-Chu & Willms, 1996).
Although these studies do not provide a conclusive picture about variations in the strength of the association for different student groups, they do not necessarily contradict each other. Although some studies have suggested that the association of academic achievement and mean school SES is stronger for lower SES students, others suggest that the association holds for all student groups. No studies, however, have shown that all student SES groups are influenced equally. In other words, showing that mean school SES is associated with academic achievement for all students does not rule out the possibility that the association is less strong for higher SES students compared with their lower SES peers. Our study aims to address this gap in the literature and in our finer-grained understanding of the association of mean school SES, student SES, and academic performance.
In addition to studies that have examined the association of mean school SES and student academic outcomes, other studies have sought to explain the mechanisms by which higher SES schools may facilitate higher levels of learning. Studies have shown that higher SES schools differ from lower SES schools in multiple ways. Compared with higher mean SES schools, lower mean SES schools often have fewer material and financial resources (Chiu & Khoo, 2005; Tate, 1997); have more discipline problems, which reduce the amount of instructional time available to students (Kahlenberg, 2001; Thrupp, 1999; Willms, 1999); have less qualified teachers (Berliner, 2001; Darling-Hammond, 2007; Gandara, Rumberger, Maxwell-Jolley, & Callahan, 2003; Orfield, 1996; Willms, 1999); have lower teacher expectations (Rumberger & Palardy, 2005); have less positive relationships between teachers and students (OECD, 2005); require less homework (Rumberger & Palardy); and offer a less academically rigorous curriculum (Anyon, 1981; Gandara et al.; Orfield; Thrupp, 1999). Moreover, higher mean SES schools often have a culture of achievement because the students themselves bring high expectations for academic success. This achievement press can then support the achievement of all students, regardless of their own family background (Hanushek, Kain, Markman, & Rivkin, 2001; Thrupp, 1999; Willms).
THE SAMPLE, VARIABLES, AND METHOD
PISA is a major international assessment of 15-year-olds’ academic performance in four subject areas: mathematics, reading, science, and problem-solving. This assessment program was developed and is managed by the OECD, an intergovernmental research and policy organization established in 1961. The OECD is headquartered in Paris and is devoted to social, educational, economic, and environmental issues. The OECD’s membership comprises wealthy industrialized countries that have accepted representative political democracy and a market economy, including Australia, Canada, Germany, Japan, the United Kingdom, and the United States; altogether, the OECD currently has 30 member countries.
The objective of PISA is to support member countries’ educational systems in the development of the skills and knowledge necessary for personal and working life in industrialized countries. PISA therefore assesses students’ literacy in the four subject areas rather than achievement tied to a specific curriculum to which students may have been exposed in school. Test questions derive from hypothetical situations or problems that students could reasonably be expected to encounter in their adult lives (OECD, 2004).
PISA was first administered in 2000 and is repeated every 3 years. As of mid-2007, the most recent results that were publicly available were for the tests administered in 2003. For the 2003 cycle, all 30 member countries plus 11 partner countries participated. The sample from the member countries included more than 250,000 students; the sample increased to more than 275,000 students with the inclusion of partner countries. Each country’s sample is drawn to be statistically representative of the total number of students enrolled in different types of schools (e.g., private or public, college preparatory or vocational schools, and so on) and locations (e.g., urban or rural). The Australian sample included 321 schools and just over 12,500 students representative of the population of 15-year-old students across the country. The sample statistics generated from this data set are therefore representative of the Australian population of 15-year-old secondary students and subgroups within that population, without the need for further inferential statistical analysis.
Our study computed mean performance scores in three subject areas—reading, mathematics, and science—for students with various individual and school SES backgrounds. PISA’s measure of student-level SES is an index of the following: highest parental occupational status, highest parental educational attainment (years of education), and economic and cultural resources in the home. PISA has named this variable ESCS (economic, social, and cultural status), and each participating student completes a questionnaire that allows an ESCS score to be assigned.
To calculate the aggregated mean school SES, we averaged the ESCS scores of every student who participated in PISA from a given school. This resulted in a variable with a mean of 0.226, ranging from a minimum of -1.045 to a maximum of 1.415, and having a standard deviation of 0.439. However, we hasten to underline that PISA is designed for administration to 15-year-old students. This means that in no case did we have the individual ESCS for every student in a given school participating in PISA 2003 Australia. For the 321 schools that constitute the Australian data, the size of the student group ranged from a low of 5 students to a high of 61 students. As depicted in Figure 1, the distribution of the 321 Australian schools according to the size of the student group tested shows that 16 of the schools had 20 or fewer students.
Figure 1. Distribution of 321 Australian schools participating in PISA 2003 according to size of the student group
Conversely, 305 (95%) of the 321 schools in the Australian sample had student groups of more than 20, with the average student group size being about 39 students. Thus, we have termed this measure of group SES “mean school group SES” and consider it a relatively stable proxy measure for school SES, given the absence of the latter variable in the Australian data set.
Our study’s methodological approach is similar to that used in recent studies comparing the effectiveness of private and public schooling for particular student SES groups. Lubienski and Lubienski (2005) in the United States and Matear (2006) in Chile computed mean scores on standardized achievement tests for students of various SES groups and then compared the means for different types of schools. Both studies showed that private schools are not more effective than public schools once the data have been disaggregated according to student SES in these two national contexts. Although neither study explicitly used mean school SES to explain its findings, it is likely that students of a particular SES cohort performed equally well on standardized achievement tests, whether in public or private schools, because both types of institution enroll students with similar SES backgrounds.
We used a similar methodology in our study, substituting a range of mean school SES for the private versus public school dichotomization. Therefore, instead of comparing the average academic achievement of high-SES students in private versus public schools, we compared the achievement of high-SES students across five bands (quintiles) of schools representing low through high mean school SES. We then replicated this comparison for students with middle- and low-SES backgrounds. Initially, five subgroups of students were formed based on their individual SES, and each of these subgroups was further subdivided into five parts based on the average SES of the school group to which they belonged. In total, we calculated 25 means for each of the three subject areas. As shown in Tables 2–4, the smallest of the 25 subgroups in our analysis comprised 88 students (very low SES students attending very high SES schools) and the largest contained 1,212 students (very high SES students attending very high SES schools).
We did not further disaggregate our sample into private and public schools for two reasons. First, although the Australian sample comprises representative proportions of students from private and public schools, the data set does not allow users to determine which schools are private and which are public. School type is recorded for each school in the sample, but, as in a few other countries, including Canada and France, the Australian executive committee decided not to release this information publicly. Users of the data set are therefore unable to determine whether a given school in the sample is private or public.
Second, it is unlikely that further disaggregating our sample along school sectors would provide additional information. The studies by Lubienski and Lubienski (2005) and Matear (2006) suggest that private schools are not more effective than their public counterparts once SES has been taken into account. Further support for this view comes from the OECD’s (2004) primary analysis of PISA 2003, which found that “the private school advantage remains after controlling for individual students’ backgrounds . . . but disappears once the effect of the social composition of their schools is controlled for” (p. 252). Therefore, at least for the PISA 2003 data set, the value of school type (i.e., public or private) as an explanatory variable on student performance is subsumed by mean school SES. Controlling for school type is therefore effectively unnecessary in our study because we are examining student academic achievement in the context of varying levels of mean school SES.
Briefly, the methodology we used in computing means across student and school SES bands for each of the three subject areas was as follows:
(1) The Australian subset (about 12,500 students) was extracted from the 2003 PISA data housed at the Australian Council for Educational Research (ACER).
(2) We constructed average scores in each of reading, math, and science using the sets of “plausible values” for each subject provided in the data set (the appropriateness of this procedure was first checked with the project director for PISA Australia).
(3) Using the individual student SES variable (called ESCS in PISA), we sorted the data set according to SES and determined the quintile cut-scores needed to divide it into five parts based on student SES.
(4) Again using the individual SES variable, as well as the unique school identifier variable (321 schools in the Australian data set), we computed a “mean school group SES” variable and added it to the data set.
(5) We determined the quintile cut-points on this mean school group SES variable.
(6) Each student therefore carried average scores in reading, math, and science; individual SES; a unique school identifier; and the mean SES of the school group to which he or she belonged.
(7) The overall Australian data set was cut into five quintiles based on individual SES (these subgroups each contained about 2,500 students and are the five rows represented in Tables 2–4).
(8) Each of the five groups so formed was further disaggregated into five subgroups using the mean school group SES variable.
(9) These procedures left us with 25 subgroups organized by individual SES and by mean school group SES; these subgroups ranged in size from a low of 88 students to a high of 1,212 students.
(10) We computed the mean scores in reading, math, and science for each of these 25 subgroups, which are given by subject in Tables 2–4.
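To make the procedure concrete, the following is a minimal sketch of steps (1) through (10) in Python/pandas. It is illustrative only and is not the code used in this study: the file name is hypothetical; the column names (ESCS, SCHOOLID, and plausible-value columns such as PV1READ through PV5SCIE) are assumed to follow PISA 2003 student-file conventions and should be verified against the actual data set; and the school-SES quintile cut-points are computed here over the 321 school means, which is one of several defensible readings of step (5). Sampling weights and the full plausible-value methodology are deliberately omitted to keep the sketch short.

```python
# Illustrative sketch of steps (1)-(10); not the authors' original code.
# Column names (ESCS, SCHOOLID, PV1READ..PV5SCIE) are assumed PISA 2003 conventions.
import pandas as pd

# Step 1: hypothetical extract of the Australian student records.
aus = pd.read_csv("pisa2003_australia_students.csv")

# Step 2: average the five plausible values per student for each subject.
for subject, stem in [("reading", "READ"), ("math", "MATH"), ("science", "SCIE")]:
    aus[subject] = aus[[f"PV{i}{stem}" for i in range(1, 6)]].mean(axis=1)

# Steps 3 and 7: individual-SES quintiles (about 2,500 students each).
aus["student_ses_q"] = pd.qcut(aus["ESCS"], 5, labels=[1, 2, 3, 4, 5])

# Step 4: mean school group SES (mean ESCS of the tested students in each school).
aus["school_ses"] = aus.groupby("SCHOOLID")["ESCS"].transform("mean")

# Steps 5 and 8: school-SES quintiles, cut here over the distinct school means
# (an assumption; cut-points could instead be set over the pooled student records).
school_means = aus.groupby("SCHOOLID")["ESCS"].mean()
school_q = pd.qcut(school_means, 5, labels=[1, 2, 3, 4, 5])
aus["school_ses_q"] = aus["SCHOOLID"].map(school_q)

# Steps 9 and 10: sizes and mean scores for the 25 subgroups (cf. Tables 2-4).
cells = (aus.groupby(["student_ses_q", "school_ses_q"], observed=True)
            .agg(n=("ESCS", "size"),
                 reading=("reading", "mean"),
                 math=("math", "mean"),
                 science=("science", "mean"))
            .round(1))
print(cells)
```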
Unlike many prior studies, we have not used hierarchical linear modelling (HLM) to examine the association of varying levels of student SES and mean school SES with student outcomes. A number of considerations directed our methodological approach. First, we have set out here not to replicate the OECD’s primary analyses of PISA, which employed multilevel modelling and, in our view, clearly demonstrated substantial unique associations between individual and school-level SES and student achievement. Rather, our purpose was, in the first instance, to unpack those previously demonstrated relationships to better describe, and thereby understand, how each varied in the context of variations in the other. In other words, no hypotheses or questions requiring inferential statistics are being tested here (e.g., do high-SES students in low-SES schools typically show statistically significantly different achievement from high-SES students in high-SES schools?), nor are we here suggesting or testing a causal mechanism between SES, whether individual or school, and achievement. Rather, our research questions are clearly descriptive (e.g., what does achievement look like across varying levels of individual and school SES?). Therefore, our approach is also descriptive, simply providing tabular and graphical descriptions of how student achievement varies for this PISA data set in the context of differing levels of individual student and school SES. We believe that such descriptions are accessible and meaningful to a broad audience and hence add to the more global explained variance estimates provided by the primary multilevel analyses already done. Thus, in this case, we believe that our methods represent a parsimonious yet powerful and widely understandable approach to understanding at a finer grain the interaction of individual and school-level SES and their relationship with student attainment.
Second, although it is widely understood that HLM is ideal for globally estimating the unique associations of student- and school-level variables on student attainment, HLM relies on often unspoken assumptions that relationships among variables under study are linear. The approach can thereby result in the unintended consequence that departures from linearity in relationships for particular subgroups of students within the data set, which may become evident with a finer grained analysis, are masked. Thus, a secondary purpose of our analyses has been to examine at a fine grain the extent to which the relationship between school socioeconomic composition and student achievement is uniformly linear, given varying levels of student SES.
These points notwithstanding, to directly address the question of whether individual SES and the aggregated school-level SES variable that we constructed (given the absence of that variable in the Australian case) indeed account for unique portions of variance in the three achievement variables, we have included three hierarchical multiple regression analyses (Cohen & Cohen, 1983). The results of these hierarchical regression analyses are given in Table 1. They demonstrate that student-level SES and our aggregated school-level SES variable account for independent and unique portions of variance, over and above that accounted for by gender, in reading, math, and science achievement as measured by PISA. Further, in each case, both individual and aggregated school SES account for statistically significant portions of explained variance in the outcomes examined.
Table 1. Hierarchical Multiple Regression Analysis of PISA 2003 Australia, in Reading, Mathematics, and Science
Model Summary: Reading
| Model | R | R² | Adjusted R² | SE of the Estimate | R² Change | F Change | df1 | df2 | Sig. F Change |
| 1 | .208a | .043 | .043 | 89.89526 | .043 | 562.702 | 1 | 12387 | .000 |
| 2 | .473b | .224 | .224 | 80.96547 | .181 | 2884.036 | 1 | 12386 | .000 |
| 3 | .520c | .270 | .270 | 78.52948 | .046 | 781.346 | 1 | 12385 | .000 |

Model Summary: Mathematics
| Model | R | R² | Adjusted R² | SE of the Estimate | R² Change | F Change | df1 | df2 | Sig. F Change |
| 1 | .033a | .001 | .001 | 93.16148 | .001 | 13.777 | 1 | 12387 | .000 |
| 2 | .413b | .170 | .170 | 84.90502 | .169 | 2527.248 | 1 | 12386 | .000 |
| 3 | .467c | .218 | .218 | 82.44051 | .048 | 752.612 | 1 | 12385 | .000 |

Model Summary: Science
| Model | R | R² | Adjusted R² | SE of the Estimate | R² Change | F Change | df1 | df2 | Sig. F Change |
| 1 | .002a | .000 | .000 | 97.30363 | .000 | .071 | 1 | 12387 | .790 |
| 2 | .435b | .190 | .189 | 87.60086 | .190 | 2896.963 | 1 | 12386 | .000 |
| 3 | .483c | .233 | .233 | 85.19961 | .044 | 709.007 | 1 | 12385 | .000 |

a. Predictors: (Constant), Sex Q3
b. Predictors: (Constant), Sex Q3, Index of Socio-Economic and Cultural Status
c. Predictors: (Constant), Sex Q3, Index of Socio-Economic and Cultural Status, School-wise Average Index of Socio-Economic and Cultural Status
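As a rough illustration of the blockwise structure reported in Table 1, the following is a minimal sketch, under the same assumptions as the earlier sketch, of how an analogous three-block regression could be run with statsmodels. The `female` indicator is assumed to have been derived from the PISA sex item (labelled Sex Q3 in Table 1), and only R-squared and R-squared change are reported, not the F tests shown above; replacing `reading` with `math` or `science` reproduces the structure of the other two panels.

```python
# Illustrative blockwise (hierarchical) regression; not the authors' original analysis.
# Assumes the DataFrame `aus` from the earlier sketch, plus a 0/1 `female` indicator
# assumed to be derived from the PISA sex item (Sex Q3 in Table 1).
import statsmodels.formula.api as smf

blocks = [
    "reading ~ female",                       # Block 1: gender only
    "reading ~ female + ESCS",                # Block 2: + individual student SES
    "reading ~ female + ESCS + school_ses",   # Block 3: + mean school group SES
]

prev_r2 = 0.0
for i, formula in enumerate(blocks, start=1):
    r2 = smf.ols(formula, data=aus).fit().rsquared
    print(f"Model {i}: R2 = {r2:.3f}, R2 change = {r2 - prev_r2:.3f}")
    prev_r2 = r2
```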
FINDINGS
Overall, the message resulting from the secondary analysis of the 2003 PISA data for Australia seems clear and consistent. As portrayed in Tables 2–4, the aggregated SES of the school group matters. Put another way, the SES context in which the student finds himself or herself seems strongly associated with academic performance, on average. For example, as shown in Table 2, for the typical student in the first SES (ESCS) quintile, being part of a high-SES school group versus a low-SES school group is associated with a difference of about 57 points (0.6 of a standard deviation) in reading. Similarly, in mathematics, as depicted in Table 3, for the typical student in the first SES quintile, being part of a high-SES school group versus a low-SES school group is also associated with a difference of about 57 points (0.6 of a standard deviation). And consistently, for science, as shown in Table 4, the difference between being in a low- versus a high-SES school group for a low-SES student is about 57 points (or 0.58 standard deviations).
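For clarity on the standardization, the reading difference corresponds to 516.0 - 458.8 = 57.2 points in Table 2; describing 57.2 points as 0.6 of a standard deviation implies a reading standard deviation on the order of 95 points, a value inferred here from the reported figures rather than stated explicitly in the text.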
Not only does this pattern hold across reading, math, and science, but it is also evident that it holds across the quintiles based on individual student SES. For example, as seen in Table 2, for high-SES students, the difference in average reading performance associated with being in a low-SES school group as compared with a high-SES school group is 54 points. Similar comparisons in math and in science (Tables 3 and 4) yielded differences of 56 and 52 points, respectively.
In addition, consistent with other research and as we previously knew, individual SES also matters. For example, as depicted in Table 2 in the case of reading, the difference between the average low-SES student in a low-SES school and the average high-SES student in a similar school is about 90 points, or just less than one standard deviation.
Table 2. PISA 2003 Australia Reading Mean Scores by Individual Student SES and School Group Mean SES
| Individual Student SES | School Group SES: 1st Quintile | 2nd Quintile | 3rd Quintile | 4th Quintile | 5th Quintile |
| 1st Quintile | n = 984, M = 458.8 | n = 690, M = 466.0 | n = 490, M = 471.5 | n = 231, M = 503.3 | n = 88, M = 516.0 |
| 2nd Quintile | n = 591, M = 486.2 | n = 681, M = 496.0 | n = 596, M = 503.5 | n = 425, M = 531.3 | n = 195, M = 543.9 |
| 3rd Quintile | n = 416, M = 498.1 | n = 492, M = 504.2 | n = 639, M = 515.1 | n = 568, M = 541.7 | n = 348, M = 560.9 |
| 4th Quintile | n = 213, M = 520.3 | n = 377, M = 525.1 | n = 516, M = 529.8 | n = 682, M = 557.2 | n = 693, M = 577.2 |
| 5th Quintile | n = 99, M = 547.8 | n = 199, M = 543.0 | n = 362, M = 549.4 | n = 602, M = 576.1 | n = 1212, M = 601.7 |
For school groups in the mid-SES range, the difference between the average low-SES student and the average high-SES student moderates somewhat to about 77 points, or 0.82 standard deviations, but for high-SES school groups, the difference again stretches to 86 points.
Again, as with the findings for differences in average academic performance associated with differences in school group SES, these patterns of substantial difference associated with student SES are consistent across reading, mathematics, and science.
Table 3. PISA 2003 Australia Math Mean Scores by Individual Student SES and School Group Average SES
| Individual Student SES | School Group SES: 1st Quintile | 2nd Quintile | 3rd Quintile | 4th Quintile | 5th Quintile |
| 1st Quintile | n = 984, M = 458.8 | n = 690, M = 459.8 | n = 490, M = 475.3 | n = 231, M = 497.9 | n = 88, M = 515.8 |
| 2nd Quintile | n = 591, M = 485.5 | n = 681, M = 494.9 | n = 596, M = 505.0 | n = 425, M = 529.4 | n = 195, M = 546.4 |
| 3rd Quintile | n = 416, M = 495.4 | n = 492, M = 501.3 | n = 639, M = 513.6 | n = 568, M = 538.5 | n = 348, M = 562.2 |
| 4th Quintile | n = 213, M = 521.6 | n = 377, M = 521.1 | n = 516, M = 530.5 | n = 682, M = 554.8 | n = 693, M = 575.0 |
| 5th Quintile | n = 99, M = 543.1 | n = 199, M = 535.4 | n = 362, M = 545.9 | n = 602, M = 570.9 | n = 1212, M = 599.5 |
For example, in math (as seen in Table 3) the difference between the typical low-SES student and the typical high-SES student, both in mid-SES school groupings, is 71 points, and for science, it is 80 points, or about 0.80 standard deviations. Similarly, as depicted in Table 4, the observed difference between the average high-SES student and the average low-SES student, both in high-SES school groupings, is about 86 points in reading, about 84 in math, and more than 95 points in science.
Table 4. PISA 2003 Australia Science Mean Scores by Individual Student SES and School Group Average SES
| Individual Student SES | School Group SES: 1st Quintile | 2nd Quintile | 3rd Quintile | 4th Quintile | 5th Quintile |
| 1st Quintile | n = 984, M = 455.0 | n = 690, M = 457.1 | n = 490, M = 470.5 | n = 231, M = 496.6 | n = 88, M = 511.9 |
| 2nd Quintile | n = 591, M = 482.6 | n = 681, M = 492.8 | n = 596, M = 501.1 | n = 425, M = 527.9 | n = 195, M = 539.8 |
| 3rd Quintile | n = 416, M = 496.1 | n = 492, M = 500.3 | n = 639, M = 512.0 | n = 568, M = 541.0 | n = 348, M = 558.0 |
| 4th Quintile | n = 213, M = 520.4 | n = 377, M = 523.8 | n = 516, M = 530.5 | n = 682, M = 556.6 | n = 693, M = 577.3 |
| 5th Quintile | n = 99, M = 555.3 | n = 199, M = 543.8 | n = 362, M = 550.0 | n = 602, M = 582.3 | n = 1212, M = 607.2 |
In sum, the findings depicted in Tables 2–4 confirm that for the Australian PISA case, both student- and school-level SES consistently and substantially matter in the academic performance of students across the three core subjects of reading, math, and science. The systematic disaggregation of the PISA 2003 data set for Australia shows unequivocally that both student and school group SES are strongly associated with academic outcomes across five quintiles representing a range of individual and group SES profiles.
In addition, however, our intent in systematically disaggregating these data has also been to provide a finer grained portrayal of the profiles of the relationships between SES and academic performance, including such issues as whether there are evident “SES thresholds” that must first be reached before the positive relationship between SES and academic performance is seen, and whether observed relationships continue to be strongly positive across the entire range of student and school group SES. Figures 2–4 are provided to offer tentative beginning answers to these questions.
Figure 2. PISA 2003 Australia reading mean scores by individual student and school group SES
First, from these three figures, the strength and consistency of the association between school group SES and academic performance across the quintiles representing student SES, as well as across the three subjects examined, is noteworthy. In no case is there overlap among the lines representing the academic performance of different SES cohorts across the three subjects.
Second, consistently across the three subjects, but perhaps most notably in reading, there does appear to be something like a group SES threshold—located at around the third school group SES quintile—below which the relationship between school group SES and academic performance is positive but quite moderate, and beyond which the relationship becomes strongly positive. For the Australian sample, this may reflect the transition from lower and middle-SES public schools to private and/or more affluent public schools.
Figure 3. PISA 2003 Australia math mean scores by individual student and school group SES
Third, in the case of the lowest (first) student SES quintile in both reading and science, the relationship seems to flatten at both the lower and the higher ends of the school group SES continuum. This may indicate that the lowest SES students also benefit from higher SES groupings, but for resource-intensive subjects like reading and science, these students may continue to be limited by their personal socioeconomic circumstances. In contrast, this same student cohort, in a less materially resource-intensive subject like math, shows no such flattening of the relationship graph (see Figure 3).
Figure 4. PISA 2003 Australia science mean scores by individual student and school group SES
Fourth, we make note of the phenomenon evident across all three subjects for students in the highest individual SES quintile (represented by the uppermost line in each graph). Each of these three lines shows that for students in this highest SES cohort, there is a small but noticeable fall-off in academic performance when comparing second (and sometimes third) quintile school group performance against first-quintile school group performance; we refer to this phenomenon as “the hockey stick” and further note that it appears for no other quintile in the Australian data set. We suspect that what we are seeing here is more reflective of a type of “regression to the mean” effect for the second and third quintiles in that the size of the group of high-SES students in the lowest SES school groups is relatively small in comparison with other groups, and therefore its mean may be artificially higher than might be expected.
LIMITATIONS
A significant limitation of this study is that this secondary analysis of PISA 2003 has not accounted for student ability. Thrupp and colleagues (Thrupp, Lauder, & Robinson, 2002) have argued that measures of student ability should be included in studies on socioeconomic composition to better disentangle the important roles of peers in academic attainment. PISA does not measure prior ability, but its advantage is its rich measure of student SES, and thus mean school SES as well. Future research could use other variables in the PISA data set that have been observed as correlated with ability, such as student self-efficacy, confidence, or motivation.
Related to this limitation is the possibility of self-selection, or what Nash (2003) called “within-class selection.” It is plausible that more able or motivated students from a particular SES group are more likely to attend a high-SES school than their less able or motivated peers from the same SES group. The higher average achievement in a given school would thus reflect less an influence of socioeconomic composition or peer effects, and more the individual abilities or motivation of the students. Controlling for ability or cultural capital within each student SES group, as Nash (2003) did in his study, could lead to a different picture than that evident in this study.
Caution should be exercised in interpreting our findings. Our study describes observed relationships among student SES, school SES, and academic attainment given the current distribution of students within schools. If students of various SES backgrounds were distributed across schools differently, our graphs would obviously not look the same. Thus, our graphs cannot be used to predict the scores of any group if they were moved to a different type of school. For example, Figures 2–4 could not be used to predict the typical achievement of high-SES students if large numbers of them were moved to a low mean SES school. Thus, if the distribution of students from particular SES backgrounds were to change significantly, we would need to reexamine the described relationships.
DISCUSSION AND CONCLUSION
The findings from our secondary analysis of the Australian PISA 2003 data are clear and consistent. All students—regardless of their personal/family SES—benefit strongly and relatively equally from schooling contexts in which the SES of the school group is high. Conversely, all students, regardless of their individual SES, perform considerably less well on measures of academic achievement in school contexts characterized, in the aggregate, as low on the school SES continuum. Our findings are consistent with other studies that have found that all students are sensitive to the influence of the aggregated socioeconomic composition of their school. The main contribution of our study is that we can now say that, in Australia at least, all student SES groups are influenced relatively equally.
The second main finding of our study is that increases in the mean SES of the school are consistently associated with increases in student academic achievement. Other studies have suggested this as well, but because they used statistical methods that assume linear relationships, we were previously unsure of the degree to which the relationships examined were actually linear. Our findings show that the relationship for Australian 15-year-olds is largely, although not completely, linear. Moving from a low school SES context to a middle school SES context is associated with smaller improvements in academic achievement than moving from a middle to a high mean school SES context. In other words, the slope of the relationship typically becomes steeper as the mean SES of the school increases.
We noted previously that the public school/private school distinction was largely subsumed by mean school SES in the explanation of academic achievement. And, as shown by Ryan and Watson (2004), there is substantial overlap between high-SES schools and private schools in Australia. We therefore feel justified (although the public/private variable has not been supplied by PISA Australia) in positing that a high proportion of the schools in the top SES quintiles are private fee-paying schools, whereas a large proportion of schools in the bottom SES quintiles are public. As discussed earlier in the article, private schools receive an equal share of public funds, proportional to their enrollments, as public schools do, but they also charge fees. Thus, many private schools enjoy a funding advantage compared with their public counterparts, and this would be especially true for the high-status, high-fee schools that enroll large numbers of high-SES students. Ryan and Watson (2004) have shown that private schools have largely used public funds to increase the quality of their educational resources rather than increase access by reducing school fees. Thus, the two highest mean school SES groups may be associated with steeper increases in student achievement because they are more likely to be considerably better resourced than the lower mean SES schools.
A second possible and complementary explanation may lie with curricular differences between high- and low-SES schools. The curriculum in many private schools, especially those that enroll large proportions of high-SES students, is heavily focused on academic preparation for university entrance examinations (Teese, 1989). Public schools, by contrast, enroll students from a broader range of backgrounds, interests, and abilities, and therefore offer a more varied range of curricula to serve their students’ diverse needs. Lower SES public schools are particularly likely to offer a vocational rather than academic curriculum (Edwards, 2006). The rigorous academic curricular orientation offered by the higher mean SES schools may be associated with higher scores on PISA.
Our findings suggest that schools with large concentrations of students with low-SES backgrounds should be discouraged. Educational policies that work against the segregation of students and schools based on SES should be vigorously pursued on the simple basis of better and more equitable student outcomes. For these reasons, a strong consensus exists among educational researchers and policy makers that the minimization of school segregation based on SES should be a central outcome of educational policy (Kahlenberg, 2001; Lamb, 2007; Oakes, 2000; OECD, 2004, 2005; Orfield, 1996; Willms, 1999). We believe that the findings portrayed here are strongly supportive of that view.
Segregating high-SES students into separate institutions undoubtedly provides some academic benefits for these students, as underlined by our study. It could thus be argued that reducing the segregation of high-SES students in high-SES schools would reduce the academic performance of these students. According to this reasoning, equity comes at the cost of quality for some groups of students. The PISA reports show, however, that many inclusive national educational systems, such as those in Finland and Canada, are able to produce a greater proportion of students who achieve at the highest proficiency levels as compared with countries with more segregated schooling. Thus, reducing school segregation by SES does not automatically reduce the proportion of high-achieving students; rather, it often increases the proportion of such students. The overall achievement of all students is also higher in less segregated educational systems. Inclusive educational systems thus promote quality for both high-performing students and the entire student cohort. In sum, “more inclusive schooling systems have both higher levels of performance and fewer disparities among students from differing socio-economic backgrounds” (OECD, 2004, p. 197).
National systems of education operate under unique combinations of educational and social features. For an international audience, the unique features of the Australian case are its high levels of school choice, large private sector, variable school funding, and low levels of educational disadvantage associated with racial or ethnic minority status. It is likely that each of these features shapes the way in which the socioeconomic composition of schools is associated with student academic outcomes in Australia. This study has provided a detailed description of the association among school and student SES and academic achievement in one national context that we hope will serve as a springboard for similar future studies of other educational systems. By examining a variety of national contexts within a comparative framework, researchers will be able to develop more robust theory about the factors, policies, and structures that ameliorate or exacerbate the association between mean school SES and student academic outcomes.
Note
1. This is not to say that there is no group of educationally disadvantaged students in Australia. Indigenous Australians consistently have significantly lower educational outcomes compared with their nonindigenous peers. Because they constitute a relatively small percentage of the total population (roughly 1.5%), however, the scope of the challenge to the educational system more generally is much smaller than in many other countries.
References
American Educational Research Association. (2003). Class size: Counting students can count. Research Points, 1(2), 1–4.
Anyon, J. (1981). Social class and social knowledge. Curriculum Inquiry, 11, 235–246.
Berliner, D. (2001). Our schools vs. theirs: Averages that hide the true extremes (No. CERAI-01-02). Milwaukee, WI: Center for Education Research, Analysis, and Innovation.
Blossfeld, H.-P., & Shavit, Y. (1993). Persisting barriers: Changes in educational opportunities in thirteen countries. In Y. Shavit & H.-P. Blossfeld (Eds.), Persistent inequality (pp. 1–24). Boulder, CO: Westview.
Caldas, S. J., & Bankston, C., III. (1997). Effect of school population socioeconomic status on individual academic achievement. Journal of Educational Research, 90, 269–277.
Chiu, M. M., & Khoo, L. (2005). Effects of resources, inequality, and privilege bias on achievement: Country, school, and student level analyses. American Educational Research Journal, 42, 575–604.
Cohen, J., & Cohen, P. (1983). Applied multiple regression. Hillsdale, NJ: Erlbaum.
Coleman, J., Campbell, E., Hobson, C., McPartland, J., Mood, A., Weinfeld, F., et al. (1966). Equality of educational opportunity. Washington, DC: U.S. Government Printing Office.
Darling-Hammond, L. (2007). Race, inequality and educational accountability: The irony of “No Child Left Behind.” Race Ethnicity and Education, 10, 245–260.
Edwards, D. (2006, July). Competition, specialisation and stratification: Academic outcomes of the government school system in Melbourne, Australia. Paper presented at the annual conference of the Comparative Education Society in Europe, Granada, Spain.
Gandara, P., Rumberger, R., Maxwell-Jolley, J., & Callahan, R. (2003). English learners in California schools: Unequal resources, unequal outcomes. Education Policy Analysis Archives, 11. Retrieved January 8, 2008, from http://epaa.asu.edu/epaa/v11n36/
Hanushek, E. A., Kain, J. F., Markman, J. M., & Rivkin, S. G. (2001). Does peer ability affect student achievement? Journal of Applied Econometrics, 18, 527–544.
Jencks, C., Smith, M., Acland, H., Bane, M. J., Cohen, D., Gintis, H., et al. (1972). Inequality: A reassessment of the effect of family and schooling in America. New York: Basic Books.
Kahlenberg, R. (2001). All together now: Creating middle-class schools through public school choices. Washington, DC: Brookings Institution.
Lamb, S. (2007). School reform and inequality in urban Australia: A case of residualising the poor. In R. Teese, S. Lamb, & M. Duru-Belat (Eds.), Education and inequality (Vol. 3, pp. 1–38). Dordrecht, The Netherlands: Springer.
Lamb, S., & Fullarton, S. (2002). Classroom and school factors affecting mathematics achievement: A comparative study of Australia and the United States using TIMSS. Australian Journal of Education, 46, 154–173.
Lauder, H., & Hughes, D. (1999). Trading in futures: Why markets in education don't work. Philadelphia: Open University Press.
Lokan, J., Greenwood, L., & Cresswell, J. (2001). The PISA 2000 survey of students' reading, mathematical and scientific literacy skills: How literate are Australia’s students? Camberwell, Victoria, Australia: Australian Council for Educational Research.
Lubienski, S. T., & Lubienski, C. (2005). A new look at public and private schools: Student background and mathematics achievement. Retrieved January 24, 2008, from http://www.pdkintl.org/kappan/k_v86/k0505lub.htm
Marjoribanks, K. (1979). Families and their learning environments: An empirical analysis. London: Routledge and Kegan Paul.
Matear, A. (2006). Equity in education in Chile: The tensions between policy and practice. International Journal of Educational Development, 27, 101–113.
McGaw, B. (2007, November–December). Keynote speech. Annual conference of the Australia and New Zealand Comparative and International Education Society, Auckland, New Zealand.
Nash, R. (2003). Is the school composition effect real? A discussion with evidence from the UK PISA data. School Effectiveness and School Improvement, 14, 441–457.
Noel, S., & de Broucker, P. (2001). Intergenerational inequities: A comparative analysis of the influence of parents' educational background on length of schooling and literacy skills. In W. Hutmacher, D. Cochrane, & N. Bottani (Eds.), In pursuit of equity in education: Using international indicators to compare equity policies (pp. 277–298). Dordrecht, The Netherlands: Kluwer Academic.
Oakes, J. (2000). The distribution of knowledge. In R. Arum & I. R. Beattie (Eds.), The structure of schooling: Readings in the sociology of education (pp. 224–234). Mountain View, CA: Mayfield.
Opdenakker, M.-C., & Van Damme, J. (2001). Relationship between school composition and characteristics of school process and their effect on mathematics achievement. British Educational Research Journal, 27, 407–432.
Orfield, G. (1996). Dismantling desegregation: The quiet reversal of Brown v. Board of Education. New York: New Press.
Organisation for Economic Co-operation and Development. (2004). Learning for tomorrow’s world: First results from PISA 2003. Paris: Author.
Organisation for Economic Co-operation and Development. (2005). School factors related to quality and equity: Results from PISA 2000. Paris: Author.
Robertson, D., & Symons, J. (2003). Do peer groups matter? Peer group versus schooling effects on academic attainment. Economica, 70(277), 31–53.
Rumberger, R. W., & Palardy, G. J. (2005). Does segregation still matter? The impact of student composition on academic achievement in high school. Teachers College Record, 107, 1999–2045.
Ryan, C., & Watson, L. (2004). The drift to private schools in Australia: Understanding its features (Discussion Paper No. 479). Canberra, Australian Capital Territory, Australia: Centre for Economic Policy Research, Australian National University.
Sirin, S. R. (2005). Socioeconomic status and academic achievement: A meta-analytic review of research. Review of Educational Research, 75, 417–453.
Sui-Chu, E. H., & Willms, J. D. (1996). Effects of parental involvement on eighth-grade achievement. Sociology of Education, 69, 126–141.
Tate, W. F. (1997). Race-ethnicity, SES, gender, and language proficiency trends in mathematics achievement: An update. Journal for Research in Mathematics Education, 28, 652–679.
Teese, R. (1989). Australian private schools, specialization and curriculum conservation. British Journal of Educational Studies, 37, 235–252.
Thrupp, M. (1995). The school mix effect: The history of an enduring problem in educational research, policy and practice. British Journal of Sociology of Education, 16, 183–203.
Thrupp, M. (1999). Schools making a difference: Let's be realistic. Buckingham, England: Open University Press.
Thrupp, M., Lauder, H., & Robinson, T. (2002). School composition and peer effects. International Journal of Educational Research, 37, 483–504.
Willms, J. D. (1999). Quality and inequality in children's literacy: The effects on families, schools, and communities. In D. P. Keating & C. Hertzman (Eds.), Developmental health and the wealth of nations: Social, biological, and educational dynamics (pp. 72–93). New York: Guilford Press.
Cite This Article as: Teachers College Record, Volume 112, Number 4, 2010, pp. 1137–1162. http://www.tcrecord.org ID Number: 15662.