NAPLAN 2016 State report: Year 9



Contents

Preface
Placing the tests in the assessment context
Marking and scoring the tests
  Marking the tests
  Calculating raw scores
  Constructing scale scores
  Using scale scores
Understanding the data
  Which reports?
  Using reports to improve teaching and learning
Year 9 Writing
  Writing prompt
  Key messages
    About the task
    Performance
    References
  Writing task sample
Year 9 Literacy
  Language conventions
    Spelling — Results and item descriptions
    Spelling — Key messages
    Grammar and punctuation — Results and item descriptions
    Grammar and punctuation — Key messages
    Resources
  Reading
    Results and item descriptions
    Key messages
Year 9 Numeracy
  Results and item descriptions
  Key messages


Preface

The purpose of the National Assessment Program is to collect information that governments, education authorities and schools can use to determine whether Australian students are reaching important educational goals. As part of that program, the Literacy and Numeracy tests are valuable sources of information about literacy and numeracy learning that can be used to inform educational policy and current educational practice.

The National Assessment Program — Literacy and Numeracy (NAPLAN) tests were developed using the nationally agreed Statements of Learning for English and Statements of Learning for Mathematics, 2005. From 2016, however, the tests relate directly to the Australian Curriculum.

The NAPLAN tests are designed to provide a nationally comparable indication of student performance in Language conventions, Writing, Reading and Numeracy. The tests are designed to assess a student’s ability to demonstrate the following skills:
• Language conventions: The test assesses the ability of students to independently recognise and use correct Standard Australian English grammar, punctuation and spelling in written contexts.
• Writing: The test assesses the ability of students to convey thoughts, ideas and information through the independent construction of a written text in Standard Australian English.
• Reading: The test assesses the ability of students to independently make meaning from written Standard Australian English texts, including those with some visual elements.
• Numeracy: The test assesses students’ knowledge of mathematics, their ability to apply that knowledge in context independently, and their ability to independently reason mathematically.

This document reports the performance of Queensland students in Year 9 who sat the 2016 National Assessment Program — Literacy and Numeracy (NAPLAN) tests.

Who should use this report?

NAPLAN: State report will help teachers, principals and other school personnel understand, interpret and use the student performance information contained in the test reports. Class and school reports are supplied electronically on the secure section of the Queensland Curriculum and Assessment Authority (QCAA) website: https://naplan.qcaa.qld.edu.au/naplan/pages/login.jsp. These reports are accessible only with the school’s Brief Identification Code (BIC) login and password. Individual student reports are distributed to schools as printed copies.

Principals

Principals can use this document to help interpret their school reports and to provide information to the school community on aspects of the tests. The document provides information on how to access and interpret the online reports located on the QCAA’s website.

Curriculum leaders, Heads of Department and Heads of Special Education Services

Queensland’s performance on each of the Literacy and Numeracy strands is provided in this document. Curriculum leaders can use this information to interpret the class reports.

Classroom teachers

Classroom teachers can use information such as the item descriptors, state and national results and the commentaries provided in this report to interpret their class reports. Teachers can compare the performance of their students on a particular item with Australian results. For example, an item with a low facility rate (percentage correct) may not necessarily indicate a problem in teaching and learning. It may be that this was simply a difficult item for all students in this cohort across Australia. The results for such an item may provide information about the learning challenges associated with that concept but should not necessarily be cause for concern.

Parents/carers

Parents can use the information in this document to interpret the results on their child’s report. They are also able to judge how their child performed when compared with the whole population of students. The item descriptors provide useful information about the scope of the tests.

Pre-service teachers

Pre-service teachers will find the information in the commentaries on overall student performance useful in gaining an understanding of what students know and can do in some areas of Literacy and Numeracy at Year 9.

Placing the tests in the assessment context

The NAPLAN tests are national instruments designed to contribute to a school’s assessment program and to inform the teaching and learning cycle. It must be remembered, however, that the results from the 2016 NAPLAN tests represent only one aspect of a school’s assessment program. The results from a school’s formal and informal assessment of students should be consistent with the NAPLAN test results. Principals and teachers should keep in mind that these were pencil-and-paper, point-in-time, timed tests. If the test results are different from what was expected, consider the possible reasons. The results of the tests may indicate aspects of student performance that need further investigation within the classroom using other forms of assessment.

Marking and scoring the tests

Marking the tests

The tests are scored against nationally agreed marking guides. There are four guides: one for the writing task and one each for the open responses in reading, numeracy and spelling. These guides provide information on the acceptable forms of the correct answer. For the Numeracy tests, students may provide a correct response in different forms. Professional officers review these results and decide how to score them.

Calculating raw scores

The simplest calculation made in scoring the tests is the raw score — the number of questions answered correctly. All of the questions for the Language conventions, Writing, Reading and Numeracy tests are marked as either correct or incorrect.

Constructing scale scores

Raw scores have limited use. They enable the performance of students who have all completed the same test at the same time to be placed in a rank order, but they do not provide information about the level of difficulty of the test nor the relative differences between students.

To achieve this, raw scores are transferred to a common scale that reflects how difficult it was to achieve each score. The scale is comparable between year levels for each assessment area. An equating process is also carried out on each year’s test to enable scores to be compared between years of testing. This might mean, for example, that a raw score of 20 on the Year 3 Reading test is transformed to a scale score of 354. This will also represent the same achievement for a student with the same scale score in Year 5, and for a student with the same scale score for Reading in a previous year. The single scale for all students in all year levels is centred on approximately 500. Scale scores also provide a basis for measuring and comparing students’ abilities across years of schooling, for example, comparing a student’s result in Year 3 in 2014 and Year 5 in 2016.

From 2017, the move toward a NAPLAN Online testing platform will commence, with the involvement of up to 115 Queensland schools in this first year of transition. Scaling processes involving both paper-based and online testing programs will continue to ensure comparability.

Using scale scores

The scale score can be used to compare the results of different students. Principals and teachers should take care when making comparisons between small groups of students. For groups of fewer than 10 students, differences may not be reliable, particularly small differences. The scales can be used to monitor the growth of groups of students over time. Principals and teachers should ensure that the compositions of the groups are the same. This enables the school to evaluate special programs that may have been put in place.
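
The raw-to-scale conversion itself is carried out nationally as part of the equating process described above, not by schools. Purely as a minimal sketch of the idea, assuming a hypothetical lookup table for a single test form (the 20 to 354 pair echoes the example above; every other value is invented for illustration), the two steps could look like this:

```python
# Illustrative sketch only: the real raw-to-scale conversion is done nationally
# as part of the equating process, not by schools. This just shows the idea of
# a raw score (number of questions correct) being mapped onto the common scale.

# Hypothetical mapping for one test form. The entry 20 -> 354 echoes the
# example in the text; all other values are invented for illustration.
RAW_TO_SCALE = {18: 330, 19: 342, 20: 354, 21: 366, 22: 378}

def raw_score(responses, answer_key):
    """Raw score is simply the number of items answered correctly."""
    return sum(1 for given, correct in zip(responses, answer_key) if given == correct)

def scale_score(raw):
    """Look up the scale score for a raw score (None if outside this table)."""
    return RAW_TO_SCALE.get(raw)

if __name__ == "__main__":
    answer_key = list("ABCDACBDABCDABCDABCDAB")   # hypothetical answer key
    responses  = list("ABCDACBDABCDABCDABDDCB")   # one student's answers
    raw = raw_score(responses, answer_key)
    print(raw, scale_score(raw))                  # 20 354
```

In practice schools read scale scores directly from their reports; the point of the sketch is only that the same raw score can correspond to different scale scores on different test forms once equating has been applied.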

Understanding the data

Which reports?

The NAPLAN National Summary Report and the NAPLAN National report provide nationally comparable data about student performance within the National Assessment Program. These reports provide states and territories with information about the achievement of their students in relation to their peers across the nation. Reports are available from the Australian Curriculum, Assessment and Reporting Authority (ACARA) website.

This NAPLAN State report provides detailed information about student performance on each of the test items. It gives information about:
• the Queensland performance on each of the items
• the national performance on each item
• the item descriptors
• some commentary on the state results
• some recommendations for teaching.

Together, these publications provide system-level information and are publicly available.

The NAPLAN School reports give information about a school’s performance in each year level tested. They provide a summary of year-level performance as well as performance by gender, language background and Indigenous status in the following fields:
• distribution of scale scores
• distribution of achievement bands
• school and state means
• participation of the group.

The shading shows the range of performance for the middle 60% of Queensland students together with the state mean, and positions a school’s performance within the state.

The NAPLAN class reports show the performance of each student on every item. They show the items a student had correct and the errors made in each strand (with the exception of reading, where the answers are generally too long to record). The report also gives the:
• scale scores for each student
• bands for each student
• percentage correct for each item for the class and state, and by gender.

The NAPLAN school and class reports are available to schools from the QCAA secure website.

Using reports to improve teaching and learning

While the national and state reports provide the comparative data, it is the class reports that provide a school with the information that can be used to inform teaching and learning and to build capacity in schools. Analysis of the NAPLAN class data, in particular the performance on each item, will provide teachers with information about the understandings and patterns of misunderstandings in student learning.

An analysis of the distracters presented in multiple-choice items and the answers to the constructed-response items, other than those for reading, is available through the SunLANDA data analysis tool. This is available on the QCAA website and is designed to help schools with their analyses of class and school results. These results should be placed in a context with other school-based assessments.

Looking at the performance on the items and then analysing the error patterns allows teachers and principals to make hypotheses about why groups of students make particular errors. Schools can:
• compare the facility rates (percentage correct) of items to see if their performance is consistent with the national and state results available in this document (a simple comparison of this kind is sketched after this list)
• look at the common errors made by their students and compare them with the common errors made in the state (only errors from Queensland students are available, and these are found in the item analyses that are part of SunLANDA)
• form hypotheses about why students are making these errors, e.g.
  – How did students think about this aspect of curriculum?
  – What misunderstandings might these errors represent?
  – How might the structure of the test question have shaped the response?

Using a combination of the NAPLAN data, school data and professional judgment, teachers should then test these hypotheses to see whether they are valid or whether there is more to be thought about and investigated. Interpretation of these results allows teachers to make judgments about teaching approaches and curriculum. The professional conversations that are part of this process are the most effective and powerful way to use the data, as they are the vehicle for developing shared understandings.
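
As a minimal sketch of the first comparison in the list above (the class facility rates here are invented; the Queensland and Australian figures echo Items 41, 50 and 56 in the Grammar and punctuation table later in this report), a school could set its class percentages against the state and national rates and flag large gaps for closer investigation:

```python
# Hypothetical comparison of class facility rates (percentage correct) with
# state and national rates. A low rate for everyone may simply mean a hard
# item; it is the gap between the class and the state that is worth a look.

items = {
    # item number: (class %, Qld %, Aust %) -- class figures are invented;
    # Qld/Aust figures echo Items 41, 50 and 56 in the grammar table.
    41: (45.0, 61.2, 62.1),
    50: (55.0, 46.2, 42.6),
    56: (18.0, 20.4, 20.5),
}

GAP_THRESHOLD = 10.0  # flag gaps above 10 percentage points (arbitrary choice)

for item, (class_pc, qld_pc, aust_pc) in sorted(items.items()):
    gap = qld_pc - class_pc
    note = "investigate further" if gap > GAP_THRESHOLD else "in line with the state"
    print(f"Item {item}: class {class_pc:.1f}%, Qld {qld_pc:.1f}%, "
          f"Aust {aust_pc:.1f}% -> {note}")
```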

Year 9 Writing

Writing prompt

Write a narrative (story) about what happened to a character or characters after reading a sign. You can use a sign on this page OR you can make up your own sign.

Think about:
• the characters and where they are
• the complication or the problem to be solved
• how the story will end.

Remember to:
• plan your story before you start
• choose your words carefully
• write in sentences
• pay attention to your spelling, punctuation and paragraphs
• check and edit your writing.

YEAR 7 AND YEAR 9 © ACARA 2016

Key messages

About the task

In 2016, the NAPLAN Writing test was based on the narrative genre. As was the case in 2015, two prompts were used: one for Years 3 & 5 and another for Years 7 & 9. The test conditions and administration remained the same as in previous years, i.e. teachers delivered the same spoken instructions and read the text aloud to students. Working independently, students had to plan, compose and edit a written response. Students were allowed five minutes to plan, thirty minutes to write their script, and a further five minutes to edit and complete the task. Three pages were provided for students to write a response.

The 2016 prompt for Years 7 & 9 was titled The sign said. Students were asked, in the textual component of the prompt, to: Write a narrative (story) about what happened to a character or characters after reading a sign. Additional information was provided in the textual component of the prompt. This named the structural components and further defined these elements, e.g. the complication or the problem to be solved. Other notes were also provided in relation to the conventions associated with the writing task, e.g. write in sentences, check and edit your writing. Four photograph-like images were also provided to support the textual elements of the prompt. The prompt was relatively open-ended, allowing students either to base their writing on one (or more) of the images provided, or to compose their own narrative around a particular sign.

Markers for this Writing test were trained using the national narrative writing marker training package, delivered as part of ACARA’s national assessment program. Markers were recruited and trained in accordance with national protocols. Registered Queensland teachers marked the NAPLAN Writing test scripts. All markers applied the ten criteria and related standards from the marking rubric. Writing test scripts were marked on screen in all states and territories. Stringent quality-control measures were applied to the marking of student scripts, including a prescribed percentage of scripts to be double-marked and the daily application nationally of control scripts for all markers. As part of the Queensland marking operation for 2016, referee marking continued, further ensuring marking reliability. There was also provision for appeal over individual Writing test scores once test results were released. On appeal, a student’s script is re-marked independently by two senior Writing test markers.

An earlier version of the NAPLAN Narrative writing marking guide is available at www.nap.edu.au/NAPLAN/About_each_domain/Writing/index.html.

Performance

Anecdotal evidence from markers indicated that students in Years 7 and 9 were comfortable with the writing prompt, The sign said. A significant proportion of students elected to use one of the visual images provided, with Wanted brave employee and Last fuel for 500 kilometres proving to be the most common choices. Those students who diverged from the images provided on the prompt tended to write more challenging narratives, though this was not exclusively so. For instance, mysterious surroundings with warning signs proved effective frames for a number of storylines. One danger with this approach, however, was the predictability of the conclusions and climactic events.

The notion of narrative complication was better understood than in the scripts of Years 3 and 5 students; however, the level of originality and degree of substance in the story plots was of concern. Characters, in general, were not well developed, and the need to layer the exposition of characters, even in such a relatively short text, was not fully realised. Many one-dimensional characters found themselves on road trips or in various forms of circus employment, and while the scenarios were plausible or credible, the responses or reactions of the characters involved lacked genuine development.

This would be a useful area for classroom activity — flesh out characters to show rather than tell the depth of their responses to situations, complications and encounters with other characters. Work on vocabulary development, use of figurative language and idiomatic expression in dialogue would support improvement in students’ writing in the narrative genre.

Dialogue, in general, was not handled to any great effect. At a semantic level, it rarely provided a key to help unlock characters’ emotions and motivations. Dialogue was often primarily used to progress the storyline, which is a legitimate if somewhat unsophisticated use of the device. At the basic skill level, conventions around punctuation of direct speech were often overlooked. The NAPLAN marking rubric deemed direct speech marks as ‘other punctuation’. Therefore those students who elected to use dialogue but lacked understanding of the form tended to be precluded from higher punctuation scores, where control of the convention was required. Students need to be aware that the judicious use of verbs and adverbs in association with direct speech may also reveal a character’s mood, intention and personality.

Students in Years 7 and 9 tended to respond with lengthier texts than in previous years where persuasive prompts were used. On the one hand, this allowed more capable writers to explore story, character and setting in more depth. On the other, some students misjudged timing, so that a number of scripts had an unfinished sense. This was costly in many of the criteria, including audience, text structure, ideas and cohesion.

The selection of the narrative genre for 2016 provided broad opportunities for students to explore sentence forms. More successful scripts adopted greater range in form through the use of fragments for effect, the embedding of clauses, and even simple sentence structures used selectively. Compound sentence forms, such as the continuous ‘and’, though more frequently used by students in the younger grades, still found their way into the scripts of older students, often demonstrating some lack of maturity in language control.

One aspect that was evident from the test was the need for students to plan effectively, even within the constraints of a demand writing task. In narrative, planning involves deciding on:
• the protagonist — problems, motivation, obstacles
• focus — characters, relationships, task, surprise
• how to solve the problem/surprise
• what unifying pattern/s will hold the text together — cohesion in the most powerful sense
• what will change by the conclusion for the character/s or for the reader?

Beyond the planning imperative, students need to adopt a clear voice which is individual, lively and authentic. Regular classroom writing should always be encouraged.

References

Australian Curriculum, Assessment and Reporting Authority 2013, Australian Curriculum: English, www.australiancurriculum.edu.au.
Queensland Curriculum and Assessment Authority 2013, Hidden worlds, www.qcaa.qld.edu.au/downloads/p_10/3579_wt_hidden_worlds.pdf.
Queensland Curriculum and Assessment Authority 2011, Queensland’s Literacy Test: A framework for describing spelling items, www.qcaa.qld.edu.au/downloads/p_10/3579_describing_spell_items.pdf.

Writing task sample

Year 9 — STOP NOW! YOU HAVE BEEN WARNED


Year 9 Commentary — STOP NOW! YOU HAVE BEEN WARNED

The very unpredictability and inconclusiveness of this brief narrative are its strengths. The protagonist, Timothy, ignoring the signed warning, finds himself in a mysterious house with its own chequered past. There, he confronts a gothic-like stranger, with the subsequent experience dramatically changing Tim’s character. To what extent, the reader is left wondering.

The story demonstrates an interesting structure. The opening paragraph provides a perspective of Timothy, the fearless risk-taker, the master of the ‘double dare’. The body of the text describes how Tim negotiates the house he has been warned not to enter. The open-ended conclusion describes a changed Tim, but provides little detail for the reader. The reader is then drawn into Tim’s fate through a direct address from the writer as a concluding sentence.

Cohesion is achieved through the central semantic elements of character and location. These are woven through the narrative, as Tim’s mysterious journey reaches its equally mysterious conclusion. Though there is a time sequence in the text, the use of time connectives is limited. Rather, the locations within the house provide their own cohesive structure, as the reader is acquainted with each. So stairs, corridors, doors and furnishings progressively provide a background for Tim’s investigation.

There are occasional spelling and grammar errors, but the language is natural and precise. A range of sentence forms provides variety, interest and suspense, particularly through the use of short, sharp simple sentences and fragments as tension rises in the latter segment of the narrative (‘No furniture. No bed. Empty space.’). Paragraphing structure, though unusual, does assist in pacing the reader through the story. Punctuation supports these various sentence forms, controlling the slowly revealed message and its delivery. Occasional spelling errors (superstituous, disuaded) prevented the text from receiving the highest score in that criterion.

Year 9 Literacy

Language conventions

Spelling — Results and item descriptions

The percentage columns give the facility rate (percentage correct). These results are based on provisional data.

Proofreading — error identified

Item | Answer | Qld% | Aust% | Description
1 | offering | 93.8 | 94.4 | Correctly spells a three-syllable word with the inflectional ending -ing requiring no change to the base word.
2 | renewable | 86.7 | 87.6 | Correctly spells a three-syllable word ending with -able.
3 | improvise | 88.8 | 88.8 | Correctly spells a three-syllable word ending with -ise.
4 | circular | 82.1 | 81.7 | Correctly spells a three-syllable word ending with -ar.
5 | miserable | 65.3 | 66.5 | Correctly spells a three-syllable word with an elided syllable.
6 | moisten | 65.9 | 67.5 | Correctly spells a two-syllable word with a silent medial -t.
7 | protein | 54.9 | 58.5 | Correctly spells a two-syllable word with the long vowel digraph -ei.
8 | insight | 45.4 | 48.9 | Correctly spells a two-syllable homophone.
9 | brochures | 44.2 | 49.0 | Correctly spells a two-syllable word with the fricative -ch.
10 | optimist | 42.0 | 44.0 | Correctly spells a three-syllable word with the neutral vowel (schwa) represented by -i.
11 | auditorium | 43.2 | 43.0 | Correctly spells a five-syllable word with the etymological element audi-.
12 | quarrel | 44.0 | 47.0 | Correctly spells a two-syllable word with the double consonant -rr at the syllable juncture.
13 | enigma | 24.4 | 28.7 | Correctly spells a three-syllable word with the schwa represented by -a.
14 | gnawing | 24.8 | 26.5 | Correctly spells a two-syllable word with the inflectional ending -ing requiring no change to the base word.
15 | heightened | 23.7 | 25.3 | Correctly spells a two-syllable word with the diphthong pattern -eigh.

Proofreading — error unidentified

Item | Answer | Qld% | Aust% | Description
16 | graphic | 85.7 | 86.1 | Identifies an error, then correctly spells a two-syllable word with the final consonant -c.
17 | techniques | 70.5 | 73.7 | Identifies an error, then correctly spells a two-syllable word with the etymological element -tech.
18 | futuristic | 65.5 | 67.0 | Identifies an error, then correctly spells a four-syllable word with the derivational suffix -istic requiring a change to the base word (drop -e).
19 | chaos | 57.4 | 61.2 | Identifies an error, then correctly spells a two-syllable word beginning with the plosive ch-.
20 | unannounced | 54.8 | 56.9 | Identifies an error, then correctly spells a three-syllable word with the double letter -nn at the second syllable juncture.
21 | exceeding | 45.0 | 45.1 | Identifies an error, then correctly spells a three-syllable word with -xc.
22 | sponsored | 33.1 | 36.7 | Identifies an error, then correctly spells a two-syllable word with the schwa represented by -or.
23 | vulnerable | 31.8 | 36.9 | Identifies an error, then correctly spells a four-syllable word with an unstressed lateral.
24 | amateur | 14.7 | 17.9 | Identifies an error, then correctly spells a three-syllable word with the schwa represented by -eur.
25 | procession | 16.4 | 17.3 | Identifies an error, then correctly spells a three-syllable word with the fricative -c.
26 | deteriorates | 10.9 | 12.4 | Identifies an error, then correctly spells a five-syllable word with the schwa represented by -o.
27 | deciduous | 10.3 | 11.5 | Identifies an error, then correctly spells a four-syllable word with the ending -uous.
28 | satellites | 8.6 | 10.9 | Identifies an error, then correctly spells a three-syllable word with the single letter -t at the first syllable juncture and the double letter -ll at the second.
29 | pseudonym | 4.3 | 5.5 | Identifies an error, then correctly spells a three-syllable word with the etymological element -nym.
30 | privilege | 7.0 | 7.2 | Identifies an error, then correctly spells a three-syllable word ending in -ege.
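
As a small worked example of how the figures above feed the key messages below (the facility rates are copied from the table; the choice of words is illustrative only), the Queensland-to-Australia gaps can be ranked directly:

```python
# Rank a handful of spelling items by how far the Queensland facility rate
# sits below the national rate, using figures from the table above. A quick
# check of this kind underpins observations such as brochures and vulnerable
# being the words farthest below the national result.

qld_vs_aust = {
    # word: (Qld %, Aust %) -- a subset of the table above
    "improvise":  (88.8, 88.8),
    "circular":   (82.1, 81.7),
    "protein":    (54.9, 58.5),
    "brochures":  (44.2, 49.0),
    "vulnerable": (31.8, 36.9),
    "pseudonym":  (4.3, 5.5),
}

# Sort so the largest shortfall against the national rate comes first.
gaps = sorted(qld_vs_aust.items(), key=lambda kv: kv[1][0] - kv[1][1])
for word, (qld, aust) in gaps:
    print(f"{word:12s} Qld {qld:5.1f}%  Aust {aust:5.1f}%  gap {qld - aust:+.1f}")
```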

Spelling — Key messages

Performance

Generally the Queensland performance was very similar to the national results. The facility rate was equal to or slightly higher than the national result for the words improvise and circular. The words farthest below the national result (around 5 percentage points lower) were brochures and vulnerable.

Queensland students performed very well (over 80%) on the words offering, renewable, improvise and circular in the error-identified section, and graphic in the error-unidentified section.

Words which had very low facility rates (less than 25%) were:
• enigma, gnawing and heightened in the error-identified section
• amateur, procession, deteriorates, deciduous, satellites, pseudonym and privilege in the error-unidentified section.

The word pseudonym had the lowest facility rate (4.3%), probably because it is subject-specific and rarely used by Year 9 students. It has been referred to as a ‘dictionary word’ as it is made up of two Greek roots (pseudo false and nym name).

As a general comment, students revert to earlier, less sophisticated strategies of ‘sounding out’ the word instead of considering:
• the syllable/word function layer
• the etymological origins of the word
• the meaning layer.

There were four common categories of errors:
• words which had inflectional endings and suffixes which did not require changes to the base word, e.g. offering and gnawing did not require any changes when the inflectional ending -ing was added. When the suffixes -able and -ic were added to the base word, no change to the base word was required, as in renewable and graphic. Conversely, when the suffixes -ar and -istic were added to the base words circle and future, changes were required, as in the words circular and futuristic.
• errors associated with syllables. There was the question of what to do at a syllable juncture — whether or not to double the consonant as in the words quarrel, unannounced and satellites. The latter was very difficult (8.6% facility rate) because the consonant wasn’t doubled at the first syllable juncture but it was at the second juncture. An elided syllable, -er, in the middle of the word miserable posed difficulties for students.
• errors based on unusual sounds (or absence of sound) with certain vowels and consonants. Vowels in a range of words caused difficulties for students, e.g. the long vowel digraph -ei in the word protein, the diphthong or sliding pattern of -eigh in the word heightened, and the schwas or neutral vowels in optimist (-i), enigma (-a), amateur (-eur), sponsored (-or) and deteriorates (-o). Silent or unstressed consonants also showed the futility of relying on the sounding-out technique, e.g. the silent -t in moisten and the unstressed -l in vulnerable. The fricative consonants -ch and -c in the words brochures and procession posed difficulties for students, especially the word procession. Boys achieved better than girls on the word chaos, which begins with the plosive ch-.
• errors related to the meaning layer in the spelling system. An example of a homophone was seen in the word insight, which was spelt as incite in the test. Many words had an etymological element. Students who had a knowledge of the origins of a word (e.g. Greek and Latin roots or Old French) had a decided advantage as they would recognise an unusual combination of letters as a root which represented a meaning, e.g.
  – auditorium (a Latin word which means a place where something is heard)
  – deciduous (from the Latin root deciduus meaning that which falls down)
  – techniques (from the Greek tekhnologia meaning the systematic treatment of an art)
  – privilege (from the Latin roots privus meaning the individual and legis meaning law)
  – pseudonym (from two Greek roots pseudo meaning false and nym meaning name)
  – exceed (from the Latin root excedere meaning to go beyond).

Implications for teaching

Year 9 students should have a well-developed knowledge of the orthographic system and be able to recognise when they need to draw on different layers of the system, which are:

• the sound/symbol and pronunciation layer, e.g. cat, sat
• the syllable layer, e.g. satellites, and the word function layer, e.g. brochures (plural), exceeding (present participle)
• the meaning layer, e.g. insight, auditorium.

Often students seem to be influenced by the printed misspelling of the target word instead of spelling the word using their normal practices. To avoid the distracting effect of the printed misspelling, students should identify the target word, then cover up the misspelling and write the target word using their own best knowledge of spelling patterns and word components.

Standardised test results can give reliable generalisations about groups of students, but for diagnostic and teaching purposes teachers should use assessments such as dictation tests, audits of student writing, and records of conferences with students.

Teachers should teach strategies for activities such as these:
• analysing syllables and stress patterns (use this strategy for phonetically coded multisyllable words)
• applying knowledge of inflections and affixes (use this for words containing grammatical components)
• applying knowledge of classical and foreign word elements
• recalling visual memories of the written form of mature vocabulary words (use this for specialist and academic vocabulary).

Spelling lessons should not be about random words but words that exemplify the separate layers of the spelling system and that suit the learning phase of the students. In learning the spelling system, students should study:
• foreign-derived words (e.g. amateur, privilege)
• words with within-word letter patterns that represent complex vowel patterns (e.g. protein, heightened)
• homophones and near-homophones (e.g. insight/incite)
• words with syllable and affix patterns:
  – open syllables (e.g. fu-turistic, pro-cession, po-et) and closed syllables (e.g. unan-nounced, quar-rel)
  – advanced stems and affixes (e.g. circul-ar, re-new-able, graph-ic, moist-en, pro-cess-ion, de-cid-uous)
  – words where pronunciation changes after adding affixes, or alternation (e.g. moist/moisten, miser/miserable). Although adding suffixes usually leaves the spelling of the stem unchanged (e.g. offering), there are exceptions (e.g. futuristic).
• words from the derivational constancy level containing Latin or Greek elements (e.g. deciduous, exceeding, auditorium, pseudonym, privilege, amateur). Meaning, not sound, will tell students how these elements are spelt whenever they are encountered.

Grammar and punctuation — Results and item descriptions

The percentage columns give the facility rate (percentage correct). These results are based on provisional data.

Item | Answer | Qld% | Aust% | Description
31 | B | 93.5 | 94.1 | Identifies the correct non-finite verb to complete a complex sentence.
32 | C | 86.3 | 87.1 | Selects the definite article to correctly specify a noun.
33 | A | 89.3 | 90.5 | Identifies the correct relative pronoun to introduce an adjectival clause.
34 | C | 82.3 | 83.0 | Identifies the correct use of paired commas in a complex sentence.
35 | B | 80.2 | 79.0 | Identifies the pair of sentences that can be joined by whereas.
36 | C | 85.6 | 86.3 | Identifies the correct subordinating conjunction to introduce an adverbial clause.
37 | B | 74.8 | 75.7 | Identifies the correct reference for a pronoun in a complex sentence.
38 | B | 76.6 | 77.6 | Selects the correct compound verb to complete a complex sentence.
39 | D | 61.3 | 65.7 | Identifies the sentence with the correct past tense irregular verb.
40 | D | 68.8 | 69.1 | Identifies the correct modal adjective in a compound sentence.
41 | A | 61.2 | 62.1 | Identifies that a run-on sentence should be punctuated as two sentences.
42 | D | 66.9 | 68.6 | Identifies the correct construction of a sentence with a non-finite clause.
43 | D | 58.7 | 61.2 | Identifies the correct use of parallel construction in a sentence.
44 | C | 49.3 | 48.8 | Identifies a word used as a noun in a complex sentence.
45 | D | 46.9 | 47.3 | Selects a complex sentence requiring the use of the possessive pronoun its.
46 | D | 58.7 | 59.2 | Identifies an adverbial phrase of time in a complex sentence.
47 | A | 38.9 | 41.5 | Identifies the correct use of capital letters for a geographical name in a simple sentence.
48 | A | 52.5 | 51.4 | Identifies the correct use of paired commas for embedded information.
49 | C | 37.9 | 39.9 | Identifies the correct use of an adverb in a simple sentence.
50 | A | 46.2 | 42.6 | Identifies a compound sentence with but as the joining word.
51 | D | 35.0 | 34.4 | Identifies the subject of the main clause in a complex sentence.
52 | D | 38.5 | 40.5 | Identifies the correct use of paired dashes for parenthetical information in a sentence.
53 | B | 36.5 | 33.8 | Identifies the simple sentence that contains an apostrophe of contraction.
54 | B | 33.3 | 34.6 | Identifies correct subject–verb agreement in a simple sentence.
55 | C | 33.1 | 34.2 | Identifies the correct use of a semicolon to separate clauses.
56 | C | 20.4 | 20.5 | Identifies the correct parallel construction in a compound sentence.
57 | A | 14.7 | 15.4 | Recognises that a defining clause does not need to be marked with a comma.
58 | C | 17.2 | 17.9 | Identifies the correctly punctuated compound adjective for a simple sentence.

Grammar and punctuation — Key messages

The 2016 test covered similar skills to previous NAPLAN testing. One new area on which several items were based involved the concept of parallel construction (Items 43, 56). The notion of ‘pairing’ was also explored in items involving punctuation (Items 34, 48, 52) and other sentence construction (Item 35).

Performance

Queensland Year 9 students exceeded the national facility rates on several items, including Item 50 (identifying a compound sentence) and Item 53 (identifying an apostrophe of contraction). Items on which Queensland students performed lower than the national facility rates included Item 39 (identifying a past tense irregular verb), Item 43 (a parallel construction) and Item 47 (capitalising a geographical name). Performance on Item 45 was on par with the national facility rate, but it is of concern that many Year 9 students in Australia had difficulty in identifying the correct use of the possessive its.

Implications for teaching

There remains a demand on students (and teachers) to be familiar with the metalanguage associated with grammar and punctuation. Many items use the linguistic label in the question/stem, e.g. Which group of words in this sentence is an adverbial phrase of time?, Which sentence is a compound sentence? Students unfamiliar with the terminology would have difficulty answering items of this type. Students also need an awareness of how grammar functions in different circumstances. Item 44, for instance, asked: In which sentence is the word ‘opposite’ used as a noun? A knowledge of the term noun was clearly required, but those students who could recognise the function being performed by the word opposite in each of the options would have been in a stronger position to answer successfully.

At a testwiseness level, facility rates generally diminish across the duration of the test, with the final items reflecting very low facility rates (Item 58: 17.2%, Item 57: 14.7%, Item 56: 20.4%). Omit rates were not vastly different for these more difficult items, which suggests that students were completing the test but faced greater difficulty with particular item types. In Item 57, students were required to identify a correctly punctuated sentence. The options were lengthy, and students may have been expecting to select an option with internal punctuation when in fact the key was the one option with no internal use of the comma. Option A: The storm clouds piled on the horizon marked the start of the wet season.

Classroom strategies should incorporate irregular grammatical forms. In order to improve literacy skills in grammar and punctuation, it is important to select reading materials appropriately, being careful to include texts that are challenging and sometimes divergent in form. The conventions of language are ideally best taught within appropriate contexts, through microlessons on very specific (and often problematic) language elements. Item 41, for instance, required students to identify the run-on sentence. Since this is such a common error in student writing across all year levels, a revision of what sentences do and how they should be constructed would be suitable for Year 9 students. The incorrect punctuation associated with the run-on sentence is also a costly feature of student performance on the NAPLAN Writing test, and this error has shown little improvement over several years.

While attention needs to be drawn to more basic grammar or punctuation features, so too do Year 9 students need to be aware of more sophisticated concepts such as modality (Item 40) and parallel construction (Items 43 and 56). A comprehensive language program at Year 9 should address these types of language forms.

Please refer to SunLANDA, which is available to schools via the School Portal on the QCAA website through the school BIC and password. The SunLANDA program displays the school’s results but also links to detailed analysis of every item on the NAPLAN test. The analyses include Australian Curriculum links, language resource texts and other QCAA materials. The item analysis is also available collected into PDF format on the NAPLAN pages of the QCAA website.

A detailed scope and sequence for teaching grammar and punctuation can be found in Grammar — Years 1 to 9 (QCAA 2007, https://www.qcaa.qld.edu.au/downloads/p_10/qcar_ss_english_grammar.pdf).

The teaching and assessing of grammar and punctuation should be:
• developmental — covering increasingly mature skills
• timely — taught when students need to learn
• systematic — covering the features relevant to the levels of communication, from the whole text level to the sentence level and down to groups of words.

Resources

Teachers can refer to the following resources:
• Comprehensive and specific information about what to teach from Years 1 to 9 is given in the draft scope and sequence for teaching grammar (and punctuation), available from the QCAA at https://www.qcaa.qld.edu.au/downloads/p_10/qcar_ss_english_grammar.pdf
• Books by Beverley Derewianka and Sally Humphrey suggest ways of teaching that can often apply to older students (published by the Primary English Teaching Association Australia).
• Topics for teaching Year 9 grammar and punctuation are suggested in the Australian Curriculum: English:
  – Understand that authors innovate with text structures and language for specific purposes and effects (ACELA1553)
  – Compare and contrast the use of cohesive devices in texts, focusing on how they serve to signpost ideas, to make connections and to build semantic associations between ideas (ACELA1770)
  – Understand how punctuation is used along with layout and font variations in constructing texts for different audiences and purposes (ACELA1556)
  – Explain how authors creatively use the structures of sentences and clauses for particular effects (ACELA1557)
  – Understand how certain abstract nouns can be used to summarise preceding or subsequent stretches of text (ACELA1559)
  – Identify how vocabulary choices contribute to specificity, abstraction and stylistic effectiveness (ACELA1561).

Reading

Results and item descriptions

The percentage columns give the facility rate (percentage correct). These results are based on provisional data.

Item | Answer | Qld% | Aust% | Description

The Terracotta Army
1 | B | 95.2 | 95.2 | Locates directly stated information in an information text.
2 | C | 95.2 | 94.5 | Interprets information in an information text.
3 | A | 94.7 | 95.3 | Interprets a description in an information text.
4 | B | 83.8 | 84.3 | Identifies directly stated information in an information text.
5 | C | 83.1 | 85.3 | Identifies a similarity in an information text.
6 | D | 58.8 | 56.4 | Identifies a pronoun reference in an information text.

The Great Blondin
7 | A | 88.4 | 87.8 | Synthesises an information text to identify a character trait.
8 | D | 87.4 | 88.4 | Infers the reason for including a fact in an information text.
9 | C | 33.4 | 35.3 | Interprets the tone of an information text to identify the purpose of scare quotes.
10 | D | 88.3 | 89.9 | Identifies a fact in an information text.
11 | A | 50.8 | 53.5 | Interprets a character’s reaction in an information text.
12 | C | 84.1 | 85.4 | Infers the purpose of a final sentence in an information text.

The wave
13 | B | 65.8 | 66.8 | Interprets a character’s state of mind in a narrative.
14 | A | 78.0 | 79.9 | Identifies the characteristics of two types of water in a narrative.
15 | A | 70.4 | 71.1 | Interprets the use of a colon in a narrative.
16 | D | 79.4 | 81.0 | Interprets a description in a narrative.
17 | C | 79.6 | 81.2 | Interprets the meaning of a word from context in a narrative.
18 | B | 64.5 | 63.9 | Interprets a simile in a narrative.

Geysers
19 | C | 61.5 | 59.2 | Identifies the meaning of a term used in text and diagram in an explanation.
20 | D | 57.9 | 61.2 | Identifies the role of a diagram to support text in an explanation.
21 | B | 55.1 | 55.9 | Demonstrates understanding of the organisational roles of subheadings in an explanation.
22 | A | 55.3 | 55.5 | Identifies the reason for the use of brackets in an explanation.
23 | D | 25.2 | 26.3 | Infers a writer’s point of view in the closing sentence of an explanation.
24 | 2;5;4;1;3 | 24.7 | 26.5 | Synthesises an explanation to sequence events in a process.

Looking back
25 | B | 57.1 | 57.7 | Identifies the device used at the beginning of a first-person narrative to engage readers.
26 | C | 70.7 | 71.7 | Interprets a complex statement in a first-person narrative.
27 | A | 69.2 | 71.4 | Interprets a word in context in a first-person narrative.
28 | B | 40.0 | 41.7 | Interprets a character’s reaction in a first-person narrative.
29 | C | 56.5 | 58.3 | Synthesises a paragraph to identify its underlying purpose in a first-person narrative.
30 | B | 63.6 | 65.2 | Identifies a character’s reaction in a first-person narrative.
31 | D | 22.7 | 21.5 | Interprets a literary description in a first-person narrative.

One man’s trash …
32 | D | 51.0 | 55.6 | Identifies an economic construct in a persuasive text.
33 | A | 61.0 | 65.2 | Identifies the argument in a paragraph of a persuasive text.
34 | D | 34.5 | 36.6 | Infers a writer’s point of view in a paragraph in a persuasive text.
35 | A | 25.0 | 25.8 | Identifies a synonym for a word in context in a persuasive text.
36 | C | 55.6 | 58.7 | Analyses how a word choice supports meaning in a persuasive text.
37 | B | 32.1 | 33.7 | Interprets the use of a nominalisation as a cohesive device in a persuasive text.

Into the blue
38 | B | 44.4 | 45.5 | Identifies the use of extended personification in a poem.
39 | A | 50.4 | 50.8 | Interprets a figurative description in a poem.
40 | A | 36.1 | 37.9 | Interprets the use of an adverb to personify in a poem.
41 | D | 45.8 | 47.7 | Synthesises a short description to identify the writer’s perspective on the subject matter.
42 | D | 41.7 | 43.8 | Identifies the use of brackets to indicate an omitted word in a short description.
43 | A | 38.1 | 40.2 | Synthesises two texts to infer the connection between them.

An apology
44 | B | 31.8 | 33.8 | Identifies a reason given in an advertisement.
45 | C | 34.4 | 34.0 | Infers how a dependent clause is used to create meaning in an advertisement.
46 | C | 51.6 | 53.3 | Identifies the use of a pun in an advertisement.
47 | D | 25.5 | 27.7 | Analyses a section of an advertisement to identify its purpose.
48 | C | 42.4 | 42.5 | Identifies the meaning of a word in context in an advertisement.
49 | B | 40.3 | 41.2 | Interprets an example of lexical cohesion in an advertisement.
50 | B | 26.1 | 27.7 | Synthesises an advertisement to determine its tone.

Key messages

As in 2015, the 2016 Year 9 Reading test consisted of 50 items based on eight reading magazine units spanning the genres of information (3), persuasion (2), imaginative-narrative (2), and an imaginative text linked to a descriptive text (a poem and a diary entry on the same subject). There were no short-response items for Year 9 this year.

Teachers can view school-specific performance information through the QCAA’s SunLANDA program. SunLANDA is available online through the School Portal on the QCAA home page.

State schools can also access this content through OneSchool. SunLANDA displays the performance of classes, subgroups and individuals within the school and compares the school’s performance with that of the state and nation. Most importantly, hyperlinked to each item are the analyses and teaching ideas to help teachers and students with this type of question.

Performance

It was pleasing to see that 92.4% of Queensland students performed at or above the national minimum standard for reading, compared with the national figure of 92.8%.

There was an increasing level of difficulty across the reading test. The first two texts in the paper had a high and a medium facility rate respectively across most items. The exception to this was an item in The Great Blondin which involved a question about how scare quotes might affect tone. This pattern is typical of entry-level texts. The next two narratives had a pattern of medium facility rates across most items. Items on The wave by Tim Winton were handled particularly well considering the demanding nature of the items. The question that challenged students the most in Looking back was Item 31, which demanded a good vocabulary as well as an understanding that the writer is using a literary technique (personification).

Surprisingly, a pattern of low facility was evident in most items in the information text Geysers. Item 23 asked students to infer a writer’s point of view as well as requiring them to be sensitive to modality in a text (25% facility rate). The sequencing question, Item 24, also had a 25% facility rate and, more importantly, 40% of students omitted this question. This is surprising as the relevant information was found in one paragraph and was reinforced in the diagram. This suggests that synthesising is a skill that most students need to practise.

Predictably, the last three texts in the test had a pattern of low facility rates (two persuasive texts and a poem combined with a diary entry). Item 35 in One man’s trash …, which had a very low facility rate, went against the usual gender trend, with boys outperforming girls by 8% (students were asked to identify a synonym for a word used in an unusual way in context). Unit 7, Into the blue, continued a trend seen in recent years of including authentic texts by published authors of the past, namely Henry Lawson and Charles Darwin. Students had to interpret challenging vocabulary and unusual grammatical structures and language features. In the last persuasive text, An apology, students had difficulty deciphering the ironic, tongue-in-cheek tone which resulted in two levels of meaning — the expressed and the intended meaning. Items involving purpose (Item 47) and tone (Item 50) had facility rates of 25% and 26% respectively, both of which required higher-order reasoning and comprehension as well as attention to subtle clues in the text.

Implications for teaching

As a general note, all items involving the purpose, main idea, theme or tone of the text in whole or part challenge students because they have to understand the whole of the text in order to answer the question. The big challenge for teachers is to get students to annotate texts in the classroom and discuss them in groups so that they can see how all the parts of the text contribute towards the meaning of the whole. This is the time to discuss the patterns in the text (e.g. cause and effect), identify connections between ideas in the text, identify the two or three main parts of the text, and consider how the parts contribute to the overall meaning. All of this should occur before students begin a close study of the text. Students will handle the distractors in the items much better if they are clear about the subject matter and the purpose of the text before they proceed to the items.

Teachers need to encourage students to read for pleasure and recreation in order to extend their knowledge of themselves and the world around them. Reading develops empathy for characters and people in difficult situations. Students also need to be able to confidently participate in a close study of a text, to check for fallacies and persuasive techniques, and to identify emotive language and literary techniques.

World citizens need to be discerning and capable readers, and confident speakers and writers about those texts. The complexity of the reading process is made visible when students discuss texts and how they arrive at their personal understanding of the text. Teachers are the facilitators of this process, not the leaders. Their focus should be on:
• finding authentic texts which appeal to adolescent readers
• providing a range of genres and a range of texts, from classic or traditional texts to texts with post-modern elements
• promoting higher-order questioning of texts (both set texts for special study and unseen texts for close study)
• reading aloud to students to promote reading for pleasure (sometimes at Year 9 this is forgotten)
• developing an awareness of how the parts of the text combine to create a whole through both semantic (links between the ideas) and syntactic (grammatical links) cohesion
• encouraging students to make inferences as they read (an informed guess backed by evidence, or a statement about the unknown based on the known)
• encouraging the link between reading and writing by asking students to regularly write analytical paragraphs about an aspect of what they have read (including a controlling central idea) in response to a question, e.g. Can this character be trusted? Is there a shift in tone in this text?
• encouraging students to look deeper into a text by drawing on analytical skills, e.g. explore gaps and silences, consider writer bias, look at contradictions within the text, look for themes and hidden purposes, and re-read the text from another perspective (e.g. a contemporary, feminist or eco-critical perspective)
• encouraging adolescent boys to be active readers and make connections between the text and their own knowledge and experiences.

QCAA resources

QCAA 2015, Beyond NAPLAN: How to read challenging texts, Beyond NAPLAN series, www.qcaa.qld.edu.au/downloads/p_10/naplan_read_challenging_texts.pdf.

Year 9 Numeracy

Results and item descriptions

The numeracy strands are abbreviated as follows: number and algebra (NA); measurement and geometry (MG); statistics and probability (SP). All items are worth one score point. For the purpose of this report, the SunLANDA strands of number and algebra, functions and patterns have been combined as number and algebra to reflect the Australian Curriculum strands. The percentage columns give facility rates (percentage correct). These results are based on provisional data.

Calculator-allowed paper

Item | Strand | Answer | Qld% | Aust% | Description
1 | NA | C | 97.8 | 98.1 | Uses an hourly rate and time to determine total pay.
2 | NA | D | 85.7 | 86.8 | Represents a whole number as the product of powers of prime numbers.
3 | NA | A | 85.5 | 86.8 | Uses a ratio to solve a problem in context.
4 | NA | B | 79.5 | 81.7 | Evaluates an expression using the addition and subtraction of integers.
5 | NA | 10 | 68.4 | 69.2 | Multiplies and divides fractions and decimals in context.
6 | MG | D | 83.4 | 85.0 | Determines the duration of an event using 12-hour time with a change from am to pm.
7 | NA | B | 76.8 | 78.7 | Adds and subtracts fractions and mixed numbers with related denominators.
8 | NA | B | 63.4 | 64.6 | Estimates the cost of an item given the total and the cost of other items.
9 | NA | A | 66.2 | 66.7 | Identifies a linear graph using the coordinates of two points.
10 | SP | C | 62.9 | 63.7 | Determines the probability of an event as a decimal.
11 | NA | C | 75.5 | 77.4 | Determines the solution of an equation by substitution into a linear equation.
12 | SP | 20 | 54.4 | 53.9 | Calculates the mean for a set of data in context.
13 | NA | B | 52.9 | 55.3 | Solves a ratio problem that includes fractions and decimals.
14 | MG | 250 | 51.3 | 53.0 | Converts from litres to millilitres.
15 | NA | A | 55.2 | 57.2 | Determines the best buy by calculating the sale prices using fraction and percentage off the original prices.
16 | MG | 135 | 58.7 | 58.5 | Determines the size of an unknown angle using angles on a straight line.
17 | MG | B | 43.2 | 44.3 | Calculates the distance travelled using circumference.
18 | NA | D | 38.2 | 42.7 | Selects an equivalent expression by factorising an algebraic expression.
19 | SP | D | 43.2 | 48.0 | Calculates the probability of an event involving ‘and’.
20 | MG | D | 43.8 | 46.7 | Identifies corresponding angles.
21 | SP | B | 38.3 | 41.7 | Calculates the range of a data set from a dot plot.
22 | NA | 8 | 24.1 | 27.2 | Uses ratio to solve a problem.
23 | NA | 101 | 32.9 | 35.8 | Uses reasoning and efficient strategies to solve a problem involving all four operations with whole numbers.
24 | NA | A | 35.8 | 39.1 | Calculates the unit price of items to determine the best buy.
25 | MG | 540 | 21.0 | 21.9 | Calculates the sum of the sizes of the angles of a pentagon.
26 | MG | C | 33.5 | 35.6 | Describes the location of points on the Cartesian plane.
27 | MG | 3227 | 15.6 | 18.1 | Calculates the surface area of a square-based pyramid.
28 | NA | 5 | 21.8 | 26.8 | Calculates the percentage rate in a simple interest problem.
29 | MG | 491 | 14.3 | 15.6 | Calculates the area of a circle.
30 | MG | 11 | 11.9 | 14.7 | Solves a problem involving the volume of a right triangular prism.
31 | NA | 10 | 11.0 | 15.8 | Solves a linear equation with rational coefficients.
32 | NA | 300 | 4.3 | 6.3 | Solves a multistep problem involving informal simultaneous equations and conversion from milligrams to grams.
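
The item descriptions above summarise the skills tested rather than the questions themselves, which are not reproduced in this report. As a hedged illustration of the calculations behind three of those skills (in the style of Items 25, 17 and 19; all numbers below are invented, except that the pentagon angle sum matches the tabled answer of 540), a few lines of Python are enough:

```python
import math

# Angle sum of a pentagon (Item 25 style): (n - 2) x 180 degrees.
def interior_angle_sum(n_sides):
    return (n_sides - 2) * 180

# Distance travelled using circumference (Item 17 style): hypothetical wheel.
def distance_travelled(diameter_m, revolutions):
    return math.pi * diameter_m * revolutions

# Probability of an 'and' event for two independent events (Item 19 style).
def p_and(p_a, p_b):
    return p_a * p_b

print(interior_angle_sum(5))                   # 540, the tabled answer
print(round(distance_travelled(0.6, 100), 1))  # invented wheel: ~188.5 m
print(p_and(0.5, 1 / 6))                       # e.g. heads AND a six
```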

21    SP      B       38.3   41.7   Calculates the range of a data set from a dot plot.
22    NA      8       24.1   27.2   Uses ratio to solve a problem.
23    NA      101     32.9   35.8   Uses reasoning and efficient strategies to solve a problem involving all four operations with whole numbers.
24    NA      A       35.8   39.1   Calculates the unit price of items to determine the best buy.
25    MG      540     21.0   21.9   Calculates the sum of the sizes of the angles of a pentagon.
26    MG      C       33.5   35.6   Describes the location of points on the Cartesian plane.
27    MG      3227    15.6   18.1   Calculates the surface area of a square-based pyramid.
28    NA      5       21.8   26.8   Calculates the percentage rate in a simple interest problem.
29    MG      491     14.3   15.6   Calculates the area of a circle.
30    MG      11      11.9   14.7   Solves a problem involving the volume of a right triangular prism.
31    NA      10      11.0   15.8   Solves a linear equation with rational coefficients.
32    NA      300     4.3    6.3    Solves a multistep problem involving informal simultaneous equations and conversion from milligrams to grams.

Non-calculator paper

Item  Strand  Answer  Qld %  Aust %  Description
1     NA      A       92.9   94.1   Uses division to solve a problem.
2     MG      B       85.3   86.8   Converts a measurement of metres and centimetres to metres alone.
3     SP      80      76.8   79.4   Calculates the probability of a complementary event.
4     SP      C       73.1   76.4   Interprets events in a Venn diagram using the exclusive 'or'.
5     MG      A       81.3   79.9   Determines the coordinates of a point after a translation on the Cartesian plane.
6     SP      C       90.1   91.6   Identifies a dot plot given a data set in context.
7     NA      B       67.7   68.8   Uses ratio to solve a problem in context.
8     NA      D       74.5   76.9   Selects an algebraic expression to describe a word problem.
9     NA      A       78.0   78.8   Interprets the graph of a non-linear relation in context.
10    MG      B       72.8   76.8   Calculates the perimeter of a parallelogram.
11    NA      A       44.0   43.8   Expresses a number in scientific notation using negative indices.
12    NA      C       51.5   54.5   Compares two fractions with unlike denominators to say which is larger.
13    NA      A       78.1   80.6   Selects an algebraic expression to represent a word problem.
14    SP      B       47.2   48.8   Identifies the probability for a two-step experiment.
15    MG      C       51.7   52.9   Uses the distance on a coordinate plane to determine the map scale.
16    MG      C       61.7   59.0   Identifies the image of a shape after a combination of a reflection and a rotation.
17    SP      D       48.1   50.7   Identifies a possible effect of a new data value on the mean and median.
18    NA      5       42.5   45.0   Solves a problem using patterning or a linear equation.
19    MG      B       46.2   47.2   Classifies a triangle according to its angles.
20    SP      C       45.1   46.7   Interprets a two-way table to calculate a probability.
21    NA      D       31.4   35.1   Applies the commutative property to an algebraic expression.
22    MG      12      32.4   35.3   Calculates the area of a composite shape composed of a square and a triangle.
23    NA      D       39.5   42.1   Divides a whole number by a fraction to solve a word problem.
24    NA      36      30.1   31.9   Identifies a two-digit number that is a square number and is a multiple of two given numbers.
25    NA      E       33.4   36.0   Converts a fraction to a decimal.
26    NA      C       25.7   27.2   Selects an expression which demonstrates the use of the distributive property for mental calculation.
27    NA      40      22.3   24.1   Calculates the gradient of the rule for a linear relationship.
28    NA      21.45   24.3   26.8   Calculates the cost of an item after a percentage increase.
29    MG      D       14.2   15.5   Converts from square metres to square centimetres.
30    NA      35      12.3   14.6   Solves a problem involving direct proportion.
31    MG      400     18.2   21.0   Applies Pythagoras's theorem to solve a problem in context.
32    NA      D       8.4    9.4    Identifies the ratio of two areas, one in square kilometres the other in square metres.

Key messages

Performance

Student results for numeracy in Years 7 and 9 are reported as a single score. Where a student completes only one of the two numeracy tests, their numeracy score is an estimate of the score they may have received if they had completed both tests.

The numeracy tests consist of 64 items from three strands across two papers — a calculator-allowed test (CA) and a non-calculator test (NC) — each with 32 items. Not all items on the calculator-allowed test required the use of a calculator. The distribution of the 64 items across the strands was:
• 35 number and algebra
• 19 measurement and geometry
• 10 statistics and probability.

Approximately 69% of items were multiple-choice, with the remaining 31% requiring students to construct their answers. While the majority of students attempted to answer all items, a number omitted the more challenging items towards the end of the test. Many of these require a constructed response rather than selecting an answer from given options. These items are generally designed to differentiate student performance — to provide opportunities for higher performing students to demonstrate their ability to solve complex problems.

The percentage of students failing to answer constructed-response items on the non-calculator test ranged from 5 to 15%. The percentage was significantly higher on the calculator-allowed test, where the range was from 6 to 28%. For the multiple-choice items, approximately 4% of students failed to answer each item. Teachers need to ask students their reasons for omitting questions, as a non-response provides no information that teachers can use to improve learning. For multiple-choice items, there should always be 100% completion.

The percentage of students who correctly answered items on the calculator-allowed test ranged from 97.8% down to 4.3% (the final item on the test). This item required students to solve a multistep problem involving simultaneous equations and conversion from milligrams to grams. Sixteen of the 32 items were answered correctly by more than 50% of Queensland students. For the non-calculator test, the percentage of students answering items correctly ranged from 92.9% down to 8.4% (Item 32). This question required students to identify the ratio of two areas, one in square kilometres and the other in square metres. Fourteen of the 32 items on this test were answered correctly by more than 50% of Queensland students.

There are some significant differences between the facility rates of the national cohort of Year 9 students and those of Queensland students. Queensland students performed equal to or above the national cohort on two items on the calculator-allowed test — Items 12 and 16. They were 3% or more below the national rate on seven items (18, 19, 21, 22, 24, 28 and 31); five of these were from the number and algebra strand and the other two were from statistics and probability. The most concerning difference was 5% (Item 28), where students had to calculate the annual percentage interest rate in a simple interest problem. On the non-calculator test, Queensland students performed above the national cohort on three test items; however, they were below the national rate by 3% or more on four items (4, 10, 12 and 21).

On the calculator-allowed test, Queensland male students outperformed female students on 21 of the 32 items. For six items (2, 12, 14, 17, 22 and 28) the difference was 5% or greater, with the most significant difference being 13% (Item 14), which required students to convert from litres to millilitres. Female students outperformed males by 5% or more on two items (4 and 21). For the non-calculator test, male students outperformed female students by 5% or more on four of the 32 items (7, 11, 14 and 25), with Item 25 having the largest difference of 10%. Female students outperformed male students by 5% or more on three items (4, 6 and 13). There is no pattern to these differences for either test. However, an examination of school-specific data may provide further information.

When looking at the data for a single test item, teachers can compare the grouped data for their class with that of the state or national cohort. This will indicate the level of difficulty that students experienced with that item. For some items, the differences between the national, state and class data are not significant, but teachers may still investigate the reasons for the lower performance of students on items that test basic concepts which are fundamental to numeracy development.
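To make this comparison concrete, the sketch below shows one way a teacher might set out a class-versus-cohort comparison for a single item. It is a minimal sketch only, not a QCAA tool: the class figures are hypothetical, while the state and national rates are the Qld % and Aust % values for calculator-allowed Item 17 in the table above.

```python
# Minimal sketch: compare a class facility rate for one item with the
# state and national rates reported in the item tables above.

def facility_rate(correct: int, attempted: int) -> float:
    """Percentage of students who answered the item correctly."""
    return 100 * correct / attempted

class_correct = 11    # hypothetical: 11 of 26 students answered the item correctly
class_attempted = 26
qld_rate = 43.2       # Qld % for calculator-allowed Item 17 (from the table above)
aust_rate = 44.3      # Aust % for calculator-allowed Item 17

class_rate = facility_rate(class_correct, class_attempted)
print(f"Class facility rate: {class_rate:.1f}%")
print(f"Difference from state: {class_rate - qld_rate:+.1f} percentage points")
print(f"Difference from national: {class_rate - aust_rate:+.1f} percentage points")
```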

Implications for teaching

Across the two Year 9 numeracy tests, 15 items had facility rates of less than 30%. Most of these were in the second half of each test paper and were therefore among the most difficult items on the test. All of these 15 items were either presented as word problems that students had to decode before determining the mathematical operation required to solve them, or included a diagram, or both.

A number of items in the test involved calculations for which students should have been able to use formulas (e.g. perimeter, area, volume) or rules, or to convert between units. Providing a student with a formula or rule does not, by itself, develop their understanding of a concept such as area or volume. Students need to be provided with opportunities to explore and understand the relationships. Some students may benefit from hands-on activities using concrete materials to explore these relationships.

Many students found word problems particularly challenging. It seems that reading, interpreting and deciding what to do may be part of the difficulty. Understanding relies on familiarity with mathematical and everyday language used in a mathematical context. Teachers should encourage the use of strategies such as:
• reading the whole question more than once, the first time to get a general idea of what it is about, and subsequent readings to identify important information and what the question is asking
• circling or underlining clues
• sorting information into a useful form by drawing a diagram, or making a table or list.
These strategies would also help students to identify the mathematics they know, or need to know, to solve the problem.

Students' visual literacy influences their ability to make sense of mathematical data presented in different representations — pictures, diagrams, tables, graphs, maps. Almost half the items across the two tests involved students reading and interpreting some form of diagram. One strategy to build visual literacy is to present students with a variety of diagram types and ask them to interpret the information each contains. Even questions without diagrams may be conceptualised with models.

Additionally, students had difficulty identifying how to solve ratio, rate and proportion problems, which involve multiplicative thinking. There were several such questions (e.g. CA Item 22 and NC Item 32), each with a facility rate below 40%. Teachers should use practical, familiar contexts to practise this type of question. Also challenging were problems dealing with fractions, with low facility rates on operations involving common fractions (e.g. NC Item 23) and decimal fractions (e.g. NC Item 25). A range of strategies to assist with this can be found in the SunLANDA item analyses.

CA Items 15 and 25, and NC Items 3 and 28, involved percentages. This concept continues to challenge students. They need to understand that a percentage is another representation of a fraction or part of a whole: a proportional representation of a relationship between two quantities. The other basic understanding that seems to elude many students is that percentages can be compared directly only if they represent portions of the same whole. As percentages are used extensively in marketing and banking, using contexts from everyday situations — discounts, interest rates and population statistics — will enable students to see the relevance of the concept.
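As a minimal illustration of the point that a percentage is a fraction of a particular whole, the worked lines below use hypothetical prices; they are not drawn from a test item.

```latex
% A percentage is a fraction of the whole, so the same percentage
% represents different amounts for different wholes (prices are hypothetical).
\[
  25\% \text{ of } \$60 = \frac{25}{100} \times 60 = \frac{1}{4} \times 60 = \$15
\]
\[
  25\% \text{ of } \$80 = \frac{1}{4} \times 80 = \$20
\]
```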
Students should also be able to recall the percentage equivalents of key fractions as an aid to checking calculations and for estimation purposes. Teachers should not assume that students know how to calculate a percentage on a calculator. Teachers should provide opportunities for students to solve problems from a range of both real-life and purely mathematical contexts.
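The lines below sketch the kind of recall and estimation described above, using the percentage equivalents of key fractions. The estimate uses hypothetical numbers, not an actual test item.

```latex
% Common fraction-percentage equivalents, and a quick estimate used as a check.
\[
  \tfrac{1}{2} = 50\%, \quad \tfrac{1}{4} = 25\%, \quad \tfrac{3}{4} = 75\%, \quad
  \tfrac{1}{5} = 20\%, \quad \tfrac{1}{10} = 10\%
\]
\[
  26\% \text{ of } 79 \;\approx\; \tfrac{1}{4} \text{ of } 80 = 20 \qquad (\text{exact value: } 20.54)
\]
```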

Multistep problems proved challenging, particularly where students needed to apply different mathematical understandings from the same strand or across different strands, or across different representations (word, visual, symbolic). For example, challenging items included:
• percentages and fractions (CA, Item 24)
• simultaneous equations and conversion (CA, Item 32)
• ratio and conversion (NC, Item 32).
Worked sketches of problems of these kinds are given at the end of this section.

Mathematics is sometimes taught as a set of isolated concepts. Teachers should consider combining mathematics from multiple strands, which will enable students to make connections between different areas of mathematics. Results also suggest that students may have had difficulty deciding on a strategy to solve problems and then checking the reasonableness of their answers. Problem-solving should be taught in a range of situations, from simple to complex and from familiar to unfamiliar.

Please refer to SunLANDA for a detailed analysis of individual test items, including teaching ideas designed to assist with the development of the understanding and skills required by each item. SunLANDA is available to all schools via the School Portal link on the QCAA website. Additionally, SunLANDA materials are available to state schools through OneSchool.

When looking at the data for a single test item, teachers can compare the grouped data for their class with that of the state or national cohort. This will enable them to judge the level of difficulty that their students experienced with that item. For some items, the differences between the national, state and class data may not be significant, but teachers may wish to investigate the reasons for the poor performance of students on items that assess simple content and skills fundamental to numeracy development.
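The worked lines below sketch the two kinds of multistep problem listed earlier in this section (simultaneous equations with a unit conversion, and a ratio of areas given in different units). The numbers are hypothetical and are not the actual test items.

```latex
% 1. Informal simultaneous equations with a milligram-to-gram conversion
%    (hypothetical): two tablets contain 600 mg of an ingredient in total,
%    and tablet A contains twice as much as tablet B. How many grams are in tablet A?
\[
  A = 2B, \qquad A + B = 600 \;\Rightarrow\; 3B = 600, \quad B = 200, \quad A = 400~\text{mg} = 0.4~\text{g}
\]
% 2. Ratio of two areas given in different units (hypothetical):
%    1 km^2 = 1000 m x 1000 m = 1 000 000 m^2, so
\[
  2~\text{km}^2 : 500\,000~\text{m}^2 \;=\; 2\,000\,000~\text{m}^2 : 500\,000~\text{m}^2 \;=\; 4 : 1
\]
```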