May 04, 2005

Criticizing the new SAT

The Instaman links to a NYT article that expresses doubts about the SAT's new essay section:

In March, Les Perelman attended a national college writing conference and sat in on a panel on the new SAT writing test. Dr. Perelman is one of the directors of undergraduate writing at Massachusetts Institute of Technology. He did doctoral work on testing and develops writing assessments for entering M.I.T. freshmen. He fears that the new 25-minute SAT essay test that started in March - and will be given for the second time on Saturday - is actually teaching high school students terrible writing habits...

In the next weeks, Dr. Perelman studied every graded sample SAT essay that the College Board made public. He looked at the 15 samples in the ScoreWrite book that the College Board distributed to high schools nationwide to prepare students for the new writing section. He reviewed the 23 graded essays on the College Board Web site meant as a guide for students and the 16 writing "anchor" samples the College Board used to train graders to properly mark essays.

He was stunned by how complete the correlation was between length and score. "I have never found a quantifiable predictor in 25 years of grading that was anywhere near as strong as this one," he said. "If you just graded them based on length without ever reading them, you'd be right over 90 percent of the time." The shortest essays, typically 100 words, got the lowest grade of one. The longest, about 400 words, got the top grade of six. In between, there was virtually a direct match between length and grade.

To be honest, I don't find this surprising. The SAT essay focuses on quality of writing, not quality of facts therein, and one could argue that someone who misstates the date of the first shot of the Civil War, but who can string together 400 words about it, deserves a much higher score than someone who, in 25 minutes, can't manage more than 100 words. Granted, length should not be the only predictor - and examinees should not be able to game the system by scribbling down pure nonsense - but I'm not as horrified by this as Professor Perelman.
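
To make the length-score relationship concrete, here's a minimal sketch, in Python with entirely made-up (word count, score) pairs, of the "grade them based on length without ever reading them" rule Dr. Perelman describes. The word-count cutoffs are hypothetical; the actual graded samples aren't public.

```python
# A minimal sketch of "grading by length alone," using made-up
# (word_count, human_score) pairs; the real graded samples are not public.
import bisect

# Hypothetical essays, loosely following the pattern Perelman describes
# (roughly 100 words -> score of 1, roughly 400 words -> score of 6).
essays = [(95, 1), (150, 2), (210, 4), (260, 4), (320, 5), (410, 6),
          (120, 1), (180, 2), (240, 3), (280, 4), (350, 5), (390, 6)]

# Hypothetical word-count cutoffs mapping length to a 1-6 score.
cutoffs = [130, 190, 250, 300, 370]  # below 130 -> 1, ..., 370+ -> 6

def score_by_length(word_count):
    """Assign a 1-6 score from word count alone, never reading the essay."""
    return bisect.bisect(cutoffs, word_count) + 1

matches = sum(score_by_length(wc) == s for wc, s in essays)
print(f"Length-only scoring agrees with the human score "
      f"{matches / len(essays):.0%} of the time")
```

With these invented numbers the length-only rule agrees with the human grade about 92% of the time, which is the flavor of Perelman's "right over 90 percent" claim.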

Should lengthy prose with wrong facts be the goal of writing courses? Of course not. But the SAT in and of itself does not control how students in the US learn to write, nor should it. The SAT isn't about separating the brilliant from the very good, and its essay, in particular, seems to be geared towards separating out those who can write about something, even if their facts aren't perfect, from those who can't string together complete sentences at all. Colleges are, unfortunately, seeing more and more of those in the second camp, and this is the problem for which the SAT essay was developed.

Scott J. at Inside Higher Ed presents a report by the National Council of Teachers of English that is highly critical of the SAT essay:

The SAT and ACT timed writing tests are “unlikely to improve writing instruction,” and have the potential to “compromise student writers and undermine longstanding efforts to improve writing instruction in the nation’s schools,” according to “The Impact of the SAT and ACT Timed Writing Tests,” a report from the National Council of Teachers of English (NCTE) Task Force on SAT and ACT Writing Tests.

The task force’s findings relate to both the new SAT essay and the new essay component of the ACT (which, being optional, is expected to have less of an impact). The report raises serious concerns about the validity of the tests as an indication of writing ability, the impact of the tests on curriculum and classroom instruction, the unintended consequences of the writing tests, and issues of equity.

Given that the essay section was developed because young men and women were graduating from high school with no writing skills whatsoever, it's disheartening to see the NCTE latch onto this essay - which has been operational for a grand total of two months - as though it, and it alone, can really bring down writing education in the US. The College Board says as much:

Officials of both the College Board and ACT strongly disputed the assertions of the report.

Chiara Coletti, a spokeswoman for the College Board, said she wouldn’t even call the report representative of the English teachers’ association. She noted that many members of the group are involved in College Board committees related to the SAT and not only support, but have helped to develop, the writing test.

Coletti said that the College Board has never claimed that the writing test is capable of testing skills in creative writing or producing a research paper. But she said that the new writing test is valuable for what it does do — give colleges a way to compare the writing skills of students nationally.

She also rejected the idea that the test will hurt existing writing instruction in high schools. She noted recent studies that suggest that most high schools don’t do a good job of teaching writing. “It’s hard to understand why the task force would fear that this would take time away from high quality writing instruction,” Coletti said. “There’s not that much of it going on.”

Heh.

The full report is here. It does claim that the writing portion of the exam does not appear to add much in the way of predictive validity to the SAT, but the College Board data on this have not been released. Also, while the NCTE report addresses the SAT's predictive validity, it does not acknowledge that the predictive power of the SAT can differ widely from university to university, nor does it mention that restriction of range can lower the observed correlations.
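
Restriction of range is easy to see by simulation: if a school admits only its top SAT scorers, the SAT-GPA correlation computed among those students comes out well below the correlation in the full applicant pool. A sketch with purely hypothetical numbers (not College Board data):

```python
# Simulating restriction of range: selecting on the predictor shrinks the
# observed predictor-criterion correlation. All numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
true_r = 0.5  # assumed population correlation between SAT and college GPA

sat = rng.standard_normal(n)
gpa = true_r * sat + np.sqrt(1 - true_r**2) * rng.standard_normal(n)

full_r = np.corrcoef(sat, gpa)[0, 1]

# A selective school admits only the top 20% of SAT scorers.
admitted = sat > np.quantile(sat, 0.80)
restricted_r = np.corrcoef(sat[admitted], gpa[admitted])[0, 1]

print(f"correlation in full applicant pool:  {full_r:.2f}")        # ~0.50
print(f"correlation among admitted students: {restricted_r:.2f}")  # much lower
```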

It seems to me, then, that the brouhaha about the essay section is no different from the criticisms that swirl around the SAT as a whole. The critics do not want to admit that (a) the essay may be useful even if all it does is separate the horrible from the mediocre, (b) the SAT does not set the standard for education in the US, even though the test is used in college admissions, and (c) the SAT may be highly predictive at one school and not at another. I'd say the jury is still out on the essay section, and I'll be sure to post the College Board's report when the data are made public.

Update: A fellow psychometrician reminded me that what could be going on here is that length is co-occurring with performance, so that those who write well, and more accurately, may also be writing more. All that Dr. Perelman is quoted as saying is that length correlates highly with score, but that doesn't mean that the quality of the writing is not driving both.
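
To see how such a confound would work, here's a sketch in which a latent "writing quality" drives both length and score. The raw length-score correlation comes out strong even though the simulated grader rewards only quality, and the correlation vanishes once quality is controlled for. The parameters are invented for illustration, not a claim about actual grader behavior.

```python
# Sketch: a latent "quality" drives both essay length and score, producing
# a strong length-score correlation even though the grader sees only quality.
# All parameters are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

quality = rng.standard_normal(n)                  # latent writing ability
length = quality + 0.5 * rng.standard_normal(n)   # better writers write more
score = quality + 0.5 * rng.standard_normal(n)    # grader rewards quality only

print(f"raw length-score correlation: {np.corrcoef(length, score)[0, 1]:.2f}")  # ~0.8

# Partial correlation of length and score, controlling for quality:
# regress each on quality and correlate the residuals.
def residuals(y, x):
    """Residuals of an OLS regression of y on x (with intercept)."""
    slope = np.cov(x, y)[0, 1] / np.var(x)
    return y - (y.mean() + slope * (x - x.mean()))

partial_r = np.corrcoef(residuals(length, quality),
                        residuals(score, quality))[0, 1]
print(f"controlling for quality: {partial_r:.2f}")  # ~0.0
```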

One commenter asks:

How would you address a more usual case, a student who writes an essay filled with factual errors as lengthy as and no better or worse constructed than one who writes one that isn't?

I would hope that the one with fewer factual errors gets a higher score. Nothing presented by Dr. Perelman suggests that this could not happen, only that some factual errors are occurring even in the highest-scoring essays. Judges are asked not to count off for small factual errors, but I would hope judges are asked to count off when what is written is so incorrect as to not make sense.

Is this a perfect writing test? Of course not, and the fact that it counts for only 25% of the grade of the exam reflects that. But what would a "perfect" writing test be in the context of the SAT? The exam is already under attack for its length. Testing critics already attack every multiple-choice item for alleged cultural bias, and the same scrutiny makes both essay prompt development and grading standards tricky. The test score gap is already controversial enough without adding a segment of the test that measures anything beyond basic skills. This essay represents a compromise among all of these constraints and the complaints that the SAT has not previously been useful enough in helping colleges determine writing skills.

If the College Board's data suggest that the essay segment adds nothing over and above the MCQs on the exam, I would hope that the essays would be discontinued. Unlike others I have quoted here, I am willing to say that the jury is still out.
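
"Adds nothing over and above" has a standard psychometric translation: incremental validity. A hedged sketch of how such a check might look, again with simulated numbers since the College Board's data haven't been released: fit a regression predicting freshman GPA from the multiple-choice sections, add the essay score, and see whether R-squared improves.

```python
# Sketch of an incremental-validity check: does the essay score raise R^2
# over the multiple-choice sections alone? Simulated data, not the CB's.
import numpy as np

rng = np.random.default_rng(2)
n = 20_000

mcq = rng.standard_normal(n)                      # multiple-choice composite
essay = 0.7 * mcq + 0.7 * rng.standard_normal(n)  # essay overlaps with MCQ
gpa = 0.5 * mcq + 0.1 * essay + rng.standard_normal(n)

def r_squared(X, y):
    """R^2 of an OLS fit of y on the columns of X (intercept added)."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_mcq = r_squared(mcq[:, None], gpa)
r2_both = r_squared(np.column_stack([mcq, essay]), gpa)
print(f"R^2, MCQ only:    {r2_mcq:.3f}")
print(f"R^2, MCQ + essay: {r2_both:.3f}  (gain: {r2_both - r2_mcq:.3f})")
```

If the gain in R-squared were negligible in the real data, that would be the case for dropping the essay.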

Update #2: On the other hand, as Opinion Journal points out, perhaps the issues with the new SAT essay prove that it is useful at measuring real-world skills:

Perelman's recommendation for students preparing for the SAT: "I would advise writing as long as possible, and include lots of facts, even if they're made up." That's also good advice for those who want to go to work for Dan Rather at CBS's "120 Minutes."

Roger Simon (and his commenters) seems to agree.

Update #3: I think some of you are misinterpreting my statements above; in particular:

Colleges are, unfortunately, seeing more and more of those in the second camp, and this is the problem for which the SAT essay was developed.

I did not mean to suggest that I think the essay portion of the SAT was designed to actually improve student performance. In fact, I've gone on record many times as saying that testing does not, in and of itself, improve performance. The SAT essay is a response to the UC complaints, and it appears to be a response to the widespread complaints about writing skills as well. Some of you are far more cynical than I about the College Board's purpose in adding this segment, and that's fine. But please don't rush to interpret a statement like the one above as evidence that I think the SAT will improve writing. I said nothing of the sort.

Posted by kswygert at May 4, 2005 11:43 AM