Are U.S. schools too easy?
This was written by education historian Diane Ravitch, a research professor at New York University and author of the bestselling “The Death and Life of the Great American School System.” This first appeared on her blog.
By Diane Ravitch
A report appeared this week by the Center for American Progress asserting that “schools are too easy.” It was widely reported in the national media. See here, here, here, here, and here. For some reason, the media love stories that say either that our kids don’t know anything or they aren’t working hard enough. We have to turn up the pressure, raise standards, make the tests harder, test them more often. And then,
when the schools devote every day to testing and test preparation, and when the arts and physical education have been eliminated to make more time for test prep, we don’t understand why kids don’t like school!
Ed Fuller, a superb researcher at Penn State University, was curious about the validity of these findings. He reviewed the Center for American Progress study and concluded that it did not provide evidence to support its conclusions.
In a follow-up comment, Ed summarizes his critique thus: Essentially, if one wants to make policy conclusions based on simple frequencies, the takeaway from the data would be that we need to make math classwork easier, because students who report that math work is easy have much higher math scores than students who report that math work is difficult. While there may be evidence that we need to increase the quality of our curriculum, the evidence just is not in this report or in the NAEP [National Assessment of Educational Progress] survey data.
Here is his analysis:
Yesterday, the Center for American Progress released a report entitled “Do Schools Challenge Our Students?” The essential takeaway, in the words of the authors, is that “Many students are not being challenged in school.” Based on the authors’ analysis of student survey questions from the National Assessment of Educational Progress, or NAEP (http://nces.ed.gov/nationsreportcard/naepdata/), additional findings include:
- “Many schools are not challenging students and large percentages of students report that their school work is ‘too easy.’”
- “Many students are not engaged in rigorous learning activities.”
- “Students don’t have access to key science and technology learning opportunities.”
- “Too many students don’t understand their teacher’s questions and report that they are not learning during class.”
- “Students from disadvantaged backgrounds are less likely to have access to more rigorous learning opportunities.”
Let me address a few of the many, many problems with this report.
First, the authors missed multiple opportunities to analyze the data and chose instead to rely on simple cross-tabulations and frequency counts. As anyone who has taken a Research 101 course can tell you, one of the first important rules is not to draw conclusions from frequency counts alone. More on the missed opportunities later in this post.
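To see why that rule exists, here is a small Python sketch with invented counts (none of these numbers come from NAEP; the whole scenario is hypothetical). In the pooled frequencies, the “too easy” group looks more proficient, even though it fares worse within every type of school:

```python
# Hypothetical counts (NOT the NAEP data) showing why raw frequencies mislead:
# within each school type the "too easy" group is LESS likely to be proficient,
# yet the pooled frequencies make the "too easy" group look better, because
# most "too easy" responses come from low-poverty schools.

counts = {
    #                        (proficient, not proficient)
    "low-poverty school":  {"too easy": (470, 330), "not easy": (130, 70)},
    "high-poverty school": {"too easy": (20, 180),  "not easy": (180, 620)},
}

def rate(proficient, not_proficient):
    return proficient / (proficient + not_proficient)

pooled = {"too easy": [0, 0], "not easy": [0, 0]}
for school, cells in counts.items():
    for answer, (p, n) in cells.items():
        pooled[answer][0] += p
        pooled[answer][1] += n
        print(f"{school:20s} {answer:8s}: {rate(p, n):.0%} proficient")

for answer, (p, n) in pooled.items():
    print(f"{'pooled':20s} {answer:8s}: {rate(p, n):.0%} proficient")
```

The pooled comparison reverses only because the mix of school types differs between the two response groups, which is exactly the kind of confounding a frequency count cannot reveal.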
Second, the authors’ conclusion that students are not being challenged in school is based on the results from one question posed to students that asked, “How often do you feel the math work in your math class is too easy?” At the 4th grade level, the authors bemoan the fact that 37% of students thought the math work was “often” or “always or almost always” too easy. At the 8th grade level, the comparable percentage was 29%. The authors continue by arguing that this unchallenging work results in far too few students (40% at 4th grade and 35% at 8th grade) meeting the NAEP proficiency standard.
This is problematic because (a) the NAEP proficiency standards are arbitrary cut scores with little or no correlation with any student outcomes (Gerald Bracey and many others have pointed out the flaw in using NAEP proficiency scores to argue that student performance is low); and (b) the authors fail to point out that 46% of the 4th grade students who reported that math class work was “almost always or always” too easy achieved proficiency, as compared with only 33% of those who responded that math work was “never or hardly ever” too easy. According to the logic used by the authors, we would increase the percentage of students meeting proficiency by making math work easier.
Third, the authors report that only 65% of middle-school students reported that they were “always or almost always” learning in their math class. Conveniently, the authors fail to mention that an additional 24% of students reported that they were “often” learning in math class. In other words, 89% of middle school students said they were “often” or “always or almost always” learning in math class. Sounds pretty good to me.
Fourth, the authors claim that data showing poor and minority students are more likely than their more affluent and white peers to report difficulty in understanding teachers’ questions is evidence that poor and minority students need greater access to more rigorous learning opportunities. While I would strongly agree that poor and minority students need greater opportunity to learn, this data says nothing about a more rigorous curriculum. Certainly, a more plausible explanation is that less qualified teachers are placed in schools with high proportions of poor and minority students, and that poor students often have more trouble understanding teachers for a variety of reasons wholly unrelated to the rigor of the curriculum (for example, see David Berliner’s work on the effects of poverty on students’ vocabulary). In fact, none of the data speaks to a more rigorous curriculum other than course enrollment (data that the authors completely ignored), and that data shows only small differences in the percentage of students saying coursework is too easy across different classes. Essentially the same percentage of students said coursework was too easy in both Algebra I and basic math.
Fifth, the authors could have examined the correlations and scatter plots among different variables, and even employed simple regression analyses using state-level means. Without such steps, as mentioned before, incorrect conclusions can be drawn. Even with such steps, the data preclude any definitive answers from being reached. But let’s take a look at what such analyses would tell us:
- the greater the percentage of students reporting that math class was interesting and engaging, the greater the percentage of students reporting that math work was easy;
- the greater the percentage of students reporting that math work was easy, the greater the percentage of students reporting that they were learning in class.
Further, all of these factors were positively associated with test scores for both poor students and their more affluent peers. So one plausible theory would be that students in math classes that are made interesting perceive such classes as easier, which leads them to perceive greater learning and to earn higher actual scores. Note, however, that a regression analysis reverses the sign on all of these factors once the percentage of poor students is included (meaning these variables become negatively associated with state-level scores).
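To make that sign reversal concrete, here is a minimal Python sketch using synthetic state-level data (every variable name and number is invented for illustration; this is not the NAEP data). Regressing scores on the share of students calling math easy yields a positive slope on its own, which flips negative once the share of poor students is added:

```python
# Synthetic state-level data (NOT the actual NAEP means) illustrating the
# sign reversal: "percent saying math is easy" looks positively related to
# scores on its own, but the coefficient turns negative once the share of
# poor students enters the model.
import numpy as np

rng = np.random.default_rng(0)
n_states = 50
pct_poor = rng.uniform(0.1, 0.6, n_states)            # share of poor students
# Assume "easy" responses are more common where poverty is low...
pct_easy = 0.6 - 0.7 * pct_poor + rng.normal(0, 0.05, n_states)
# ...and scores are pulled down strongly by poverty, mildly by "easy" work.
score = 280 - 60 * pct_poor - 10 * pct_easy + rng.normal(0, 1, n_states)

def slopes(y, *predictors):
    """Ordinary least squares with an intercept; returns slope estimates."""
    X = np.column_stack([np.ones_like(y), *predictors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

print("pct_easy alone:        ", slopes(score, pct_easy))            # expect positive
print("pct_easy with pct_poor:", slopes(score, pct_easy, pct_poor))  # expect negative
```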
Of course, this was at the state level rather than the student level. The NAEP data doesn’t allow for student-level cross-tabulations of these variables, which is what the authors would need access to in order to make conclusions based on actual evidence. In fact, in another section of the report, the authors state, “The data should not, however, be treated as causal research, and the responses from students could be skewed by other factors.” But that does not stop them from making sweeping conclusions and pushing policies to meet needs that are not well understood.
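And to illustrate the state-versus-student-level caveat, here is one more sketch with simulated data (again, nothing here comes from NAEP): within every state, students who find the work easy score slightly higher, yet the relationship across state means runs strongly negative:

```python
# Simulated data (NOT NAEP) showing why state-level patterns need not match
# student-level ones: within each state, "too easy" students score a bit
# HIGHER, yet across state MEANS the relationship is strongly NEGATIVE,
# because high-poverty states produce both more "easy" responses and lower
# scores in this simulation.
import numpy as np

rng = np.random.default_rng(1)
mean_easy, mean_score, within_slopes = [], [], []
for _ in range(50):                       # 50 hypothetical states
    poverty = rng.uniform(0.1, 0.6)
    easy = (rng.random(200) < 0.2 + poverty).astype(float)   # 1 = "too easy"
    score = 260 - 80 * poverty + 5 * easy + rng.normal(0, 10, 200)
    mean_easy.append(easy.mean())
    mean_score.append(score.mean())
    within_slopes.append(np.cov(easy, score)[0, 1] / np.var(easy, ddof=1))

print("average within-state slope:", np.mean(within_slopes))   # about +5
print("between-state slope:",
      np.cov(mean_easy, mean_score)[0, 1] / np.var(mean_easy, ddof=1))  # negative
```

This is the classic ecological-fallacy pattern: aggregated means carry the imprint of whatever sorts students into states, so they cannot substitute for the student-level cross-tabulations the authors lacked.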
Ultimately, this report appears to have been a bundle of conclusions in search of some supporting data. After performing some serious data contortions and giant leaps of association, the authors made their point. Should anyone listen? Only when we have concrete evidence from some solid research.
Ed Fuller is an Associate Professor in the Educational Theory and Policy Department in the College of Education at Penn State University.
P.S. Ed pointed out to me that the lead author of the study is a journalist, not an expert in statistical analysis.