NAPLAN fosters bad writing: expert
Dr Les Perelman, a world-leading figure in writing assessment, is suggesting a new approach for the NAPLAN major writing task.
“If we want to test writing, we should test all aspects of it, including reading, prewriting, writing, revision, and 21st century editing,” Dr Perelman writes in a new report, Towards a New NAPLAN: Testing to the Teaching.
Dr Perelman will deliver a public lecture about the findings of his research on 4 May.
His report examines NAPLAN’s current major writing task alongside comparative school writing assessments. He says the major writing task assessment is “poorly designed and executed in comparison to similar assessments in Canada, the US, or the UK”. “Its focus on low-level skills causes it to de-emphasise the key components of effective written communication. It is reductive and anachronistic,” he concludes.
“[The] overall structure and marking scheme, is paradoxically both overly complex in its marking but simplistic and highly reductive in measuring of any reasonable formulation of the writing construct. Teaching to this test will make students poor writers by having them focus on non-essential tasks such as memorising spelling lists.”
Dr Perelman criticises the Australian Curriculum Assessment and Reporting Authority’s (ACARA) attitude to the assessment of writing, commenting on ACARA’s specific goals for the years of the NAPLAN tests. “These objectives, although important, are extremely limited and highly reductive, placing mechanical correctness in the foreground while giving some but limited acknowledgement of writing as primarily a communicative act.”
There are 10 marking categories, with different weights, for the major writing task. Dr Perelman says it is difficult to see how they could represent any commonly held consensus of a writing construct.
“Writing exists to communicate ideas and information. Yet Ideas is given only 5 marks, while Spelling is given 6 … Indeed, much more weight is given to spelling, grammar, and other mechanics than to communicating meaning.”
Dr Perelman notes other researchers’ conclusions “that the ability to revise effectively differentiates mature writers from novice writers … teaching revision is an essential component of teaching writing … and ignoring it, ignores a major part of the writing construct”.
Tips highlight assessment’s weaknesses
Dr Perelman was a vocal critic of the essay set for the Scholastic Aptitude Test (SAT), the US college and university entrance exam, more than 10 years ago. He developed a set of writing tips, which he says emphasised “that the writing that would receive a high mark was formulaic, artificial, and had little to do with real-world or even real academic writing”. The old SAT essay was abolished in significant measure due to Dr Perelman’s work.
“My study of the NAPLAN essay marking has produced a similar conclusion about the disassociation of the NAPLAN marking scheme from any authentic construct of writing ability,” Dr Perelman says in his latest report.
“Moreover, its emphasis on form and the correct spelling of certain words makes it even easier to provide students with construct-irrelevant strategies to attain high marks.” Dr Perelman has written a guide to a top-scoring NAPLAN essay.
He says making the strategies public democratises opportunity on NAPLAN. “Second, such cynical advice exposes the poor pedagogical practices that are encouraged by the test,” he said. “Simultaneously, when students use them and score well, it reveals which constructs are being assessed and which constructs are not.”
Dr Perelman’s guide to a top-scoring NAPLAN essay
1) Memorise the list of Difficult and Challenging Spelling Words and sprinkle them throughout the paper. Feel free to repeat them, and do not worry very much about the meaning.
2) If you are not sure how to spell a word, do not use it.
3) Repeat the language and ideas in the Writing Task throughout the paper.
4) Begin at least one sentence with the structure, “Although x (sentence), y (sentence).” For example: “Although these instructions are stupid, they will produce a high mark on the NAPLAN essay.”
5) Master the five-paragraph form.
6) Increase your score on the “Audience” and “Persuasive Devices” categories by addressing the reader using “you” and asking questions. For example: “So you think you wouldn’t mind writing a stupid essay?”
7) Use connective (Velcro) words such as “Moreover,” “However,” “In addition”, “On the other hand” at the beginning of sentences.
8) Begin sentences with phrases such as “In my opinion”, “I believe that”, “I think that” etc.
9) Repeat words and phrases throughout your paper.
10) Employ the passive voice frequently throughout your paper.
11) Use referential pronouns, such as “this”, without a reference noun following them. For example, “This will make the marker think you are a coherent writer”.
12) Make arguments using forms such as “We all believe that we should do X” or “We all know that Y is harmful”.
13) Always have at least one, preferably two, adjectives next to nouns. Thus, not “the dog” but “the frisky and playful dog”.
14) If you are writing a narrative essay, think quickly whether there is a television program, movie, or story that you know that fits the requirements of the narrative writing task. If there is one, use it as your narrative, embellishing it or changing it as much as you want. Markers are explicitly instructed to ignore whether they recognise any stories or plots and to mark the script on its own merits as if it were original.
15) Never write like this except for essay tests such as NAPLAN.
Report's Executive Summary
Achievement tests have become an almost universal feature of primary and secondary education in industrialised countries. Such assessments, however, always need to be periodically reassessed to examine whether they are measuring the relevant abilities and whether the results of the assessment are being used appropriately. Most importantly, the assessments must themselves be assessed to ensure they are supporting the targeted educational objectives. Contemporary concepts of validity are considered as simultaneous arguments involving the interpretation of construct validity, content validity, and external validity, along with arguments involving fairness and appropriateness of use.
As points of comparison, the examination of six different writing tests from the United States, Australia, Canada and the United Kingdom produced observations directly relevant to an evaluation of the NAPLAN essay:
• The majority of tests, and all the tests specifically for primary and secondary schools, are developed, administered, and refined within the context of publicly available framework and specification documents. These documents articulate, often in great detail, the specific educational constructs being assessed and exactly how they will be measured. They are not only an essential tool for any assessment design; their publication is also vital for the transparency and accountability necessary for any testing organisation.
• In some cases, these documents are produced with collective input from stakeholders and academic specialists in the specific disciplines. The Smarter Balanced Assessment Consortium and the National Assessment of Educational Progress (NAEP) writing assessments made use of large panels of teachers, administrators, parents, elected officials, and academic experts.
• Several of the tests unreservedly mix reading and writing. The Smarter Balanced Assessment Consortium reading test incorporates short-answer writing (constructed response). The texts in the reading exercise form part of the prompt for the long essay, and the short written answers to the reading questions serve as prewriting exercises. Integrating writing and reading in assessments makes sense. Children acquire language through exposure to speech. Eventually, reception leads to production. Although writing is a technology that is only approximately 6000 years old, it is an analogue to speech, albeit not a perfect one. Indeed, students will have extreme difficulty writing in a genre if they have not read pieces in that same genre.
• Writing tasks are designed and employed for specific classes or years. With the exception of NAPLAN, I know of no other large-scale writing assessment that attempts to employ a single prompt for different age groups.
• Similarly, most tests tailor their marking rubrics for different classes or years. For example, the scoring rubrics for Grades 4 and 7 in British Columbia’s Foundation Skills Assessment (FSA), displayed in Appendix D (see online report), vary significantly, consistently expecting a higher level of performance from the higher grade.
• Informative writing, in addition to narrative and persuasive writing, is a common genre in school writing assessments. Much of the writing students will do in school and then in higher education and in the workforce will be informative writing.
• Several of the assessments explicitly define an audience and, often, a genre as part of the writing task. One prompt from the National Assessment of Educational Progress (NAEP) assessments asks students to write a letter to the school principal on a specific issue. A Smarter Balanced Assessment Consortium informative writing task for Grade 6 students asks the student to write an informative article on sleep and naps (the topics of the reading questions) for the school newspaper that will be read by parents, teachers, and other students.
• All of the other assessments that employ multi-trait scoring use the same or similar scales for all traits. Moreover, they all employ significantly fewer trait categories. The Smarter Balanced Assessment Consortium employs three scales: two are 1-4, and the Conventions scale is 0-2. British Columbia’s Foundation Skills Assessment uses five scales, all 1-4. The Scholastic Aptitude Test (SAT) has three 1-4 scales that are not summed, and UK tests such as A and AS Levels have multiple traits, usually four to six, that are always scored on scales that are multiples of 1-5 levels.
• Most of the assessments, and all of the assessments that focused on the primary and secondary years/grades, allowed students access to dictionaries and, in some cases, grammar checkers or thesauri. Some of the assessments are now on networked computers or tablets that include standard word processing applications with spell-checkers or dictionaries and other tools for writing.
• There is a complete lack of transparency in the development of the NAPLAN essay and grading criteria. There is no publicly available document that presents the rationale for the 10 specific criteria used in marking the NAPLAN essay and the assignment of their relative weights. This lack of transparency is also evident in the failure of the Australian Curriculum Assessment and Reporting Authority (ACARA) to include other stakeholders, such as teachers, local administrators, parents, professional writers, and others in the formulation, design, and evaluation of the essay and its marking criteria.
• Informative writing is not assessed although explicitly included in the writing objectives of the Australian Curriculum. Informative writing is probably the most common and most important genre in both academic and professional writing. Because that which is tested is that which is taught, not testing informative writing devalues it in the overall curriculum.
• Ten marking criteria with different scales are too many and too confusing, causing high-level attributes such as ideas, argumentation, audience, and development to blend into each other even though they are marked separately. Given the number of markers and time allotted for marking approximately one million scripts, a very rough estimation would be that, on average, a marker would mark 10 scripts per hour, or one every six minutes (360 seconds). If we estimate that, on average, a marker takes one-and-a-half minutes (90 seconds) to read a script, that leaves 270 seconds for the marker to make 10 decisions, or 27 seconds per mark on four different scales. It is inconceivable that markers will consistently and accurately make 10 independent decisions in such a short time.
• The weighting of 10 scales appears to be arbitrary. The 10 traits are marked on four different scales, 0-3 to 0-6, and then totalled to compute a composite score. Curiously, the category Ideas is given a maximum of 5 marks while Spelling is given a maximum of 6.
• There is too much emphasis on spelling, punctuation, paragraphing and grammar at the expense of higher order writing issues. While mastery of these skills is important, the essential function of writing is the communication of information and ideas.
• The calculation of the spelling mark, in particular, may be unique in Anglophone testing. It is as concerned with the presence and correct spelling of limited sets of words defined as Difficult and Challenging as it is with the absence of misspelled words. Markers are given a Spelling reference list categorising approximately 1000 words as Simple, Common, Difficult, and Challenging. The scale for the spelling criterion is 0-6. A script containing no conventional spelling scores a 0, with correct spelling of most simple words and some common words yielding a mark of 2. To attain a mark of 6, a student must spell all words correctly and include at least 10 Difficult words and some Challenging words, or at least 15 Difficult words.
• The NAPLAN grading scheme emphasises and virtually requires the five-paragraph essay form. Although the five-paragraph essay is a useful form for emerging writers, it is extremely restrictive and formulaic. Most arguments do not have three and only three supporting assertions. More mature writers such as those in Year 7 and Year 9 should be encouraged to break out of this form. The only real advantage of requiring the five-paragraph essay form for large-scale testing appears to be that it helps to ensure rapid marking.
• Although “audience” is a criterion for marking, no audience is defined in the writing prompt. There is a significant difference between a generic reader and a specific audience, a distinction that the current NAPLAN essay ignores but is essential for effective writing.
• Specificity in marking rubrics on issues of length and conventions not only skews the test towards low-level skills, it also makes the test developmentally inappropriate for lower years or stages. Several of the marking criteria specify at least one full page as “sustained writing” or “sustained use” necessary for higher marks. It is unrealistic to expect most Year 3 students to produce a full page of prose in 40 minutes.
• The supplementary material provided to markers on argument, text and sentence structure, and other issues is trivial at best and incorrect at worst. It should be redone entirely as part of the redesign of the NAPLAN essay. Markers should be surveyed to discover what information would be most useful to them.
• The 40 minutes students have to plan, write, revise and edit precludes any significant planning (prewriting) or revision, two crucial stages of the writing process.
There should be an impartial review of NAPLAN, commencing with special attention to the writing task and leading to a fundamental redesign of the essay and a reconsideration of its uses. Such a review should also consider the way in which NAPLAN is administered and marked, and its current disconnection from a rich curriculum and from the specific and diverse teaching programs that children experience in classrooms.
Such a review should be an inclusive process encompassing all elements of the educational and academic communities. Its key focus areas should be identifying the particular needs of students, how they have progressed in their class with the programs they are experiencing, and how systems, jurisdictions and the nation can further support their intellectual growth and futures. A principal emphasis in this review should be to promote alignment of the curriculum, classroom pedagogy, and all forms of assessment; that is, to test to the teaching. If students consider classroom exercises and outside assessments to be indistinguishable, and both reflect the curriculum, then assessments reinforce teaching and learning rather than possibly subverting them.
Australia produces great language assessments. I admire the various Australian state and territory English and writing HSC papers. The International English Language Testing System (IELTS), developed in Australia and the United Kingdom, is by far the best test of English as a foreign language. Australia can produce a great NAPLAN major writing assessment.