NAPLAN 2015 State report: Year 9
For all Queensland schools



Contents

Preface
  Placing the tests in the assessment context
  Marking and scoring the tests
    Marking the tests
    Calculating raw scores
    Constructing scale scores
    Using scale scores
Understanding the data
  Which reports?
  Using data to improve teaching and learning
Year 9 Writing
  Writing prompt
  Key messages
    About the task
    Performance
    References
  Writing task sample
Year 9 Literacy
  Language conventions
    Spelling — Results and item descriptions
    Spelling — Key messages
    Grammar and punctuation — Results and item descriptions
    Grammar and punctuation — Key messages
  Reading
    Results and item descriptions
    Key messages
Year 9 Numeracy
  Results and item descriptions
  Key messages


Preface

The purpose of the National Assessment Program is to collect information that governments, education authorities and schools can use to determine whether Australian students are reaching important educational goals. As part of that program, the Literacy and Numeracy tests are valuable sources of information about literacy and numeracy learning that can be used to inform educational policy and current educational practice.

The National Assessment Program — Literacy and Numeracy (NAPLAN) tests are developed using the nationally agreed Statements of Learning for English and Statements of Learning for Mathematics, 2005. These statements describe essential skills, knowledge, understandings and capabilities that all young Australians should have had the opportunity to acquire by the end of Years 3, 5, 7 and 9. From 2016, the tests will relate to the Australian Curriculum.

The NAPLAN tests are designed to provide a nationally comparable indication of student performance in Language conventions, Writing, Reading and Numeracy. The tests are designed to assess a student's ability to demonstrate the following skills:
• Language conventions: The test assesses the ability of students to independently recognise and use correct Standard Australian English grammar, punctuation and spelling in written contexts.
• Writing: The test assesses the ability of students to convey thoughts, ideas and information through the independent construction of a written text in Standard Australian English.
• Reading: The test assesses the ability of students to independently make meaning from written Standard Australian English texts, including those with some visual elements.
• Numeracy: The test assesses students' knowledge of mathematics, their ability to apply that knowledge in context independently, and their ability to independently reason mathematically.

This document reports the performance of Queensland students in Year 9 who sat the 2015 National Assessment Program — Literacy and Numeracy (NAPLAN) tests.

Who should use this report?

NAPLAN: State report will help teachers, principals and other school personnel understand, interpret and use the student performance information contained in the test reports.

Class and school reports are supplied electronically on the secure section of the Queensland Curriculum and Assessment Authority (QCAA) website: https://naplan.qcaa.qld.edu.au/naplan/pages/login.jsp. These reports are accessible only with the school's Brief Identification Code (BIC) login and password. Individual student reports are distributed to schools as printed copies.

Principals
Principals can use this document to help interpret their school reports and to provide information to the school community on aspects of the tests. The document provides information on how to access and interpret the online reports located on the QCAA's website.

Curriculum leaders, Heads of Department and Heads of Special Education Services
Queensland's performance on each of the Literacy and Numeracy strands is provided in this document. Curriculum leaders can use this information to interpret the class reports.

Classroom teachers
Classroom teachers can use information such as the item descriptors, state and national results and the commentaries provided in this report to interpret their class reports. Teachers can compare the performance of their students on a particular item with Australian results. For example, an item with a low facility rate may not necessarily indicate a problem in teaching and learning. It may be that this was simply a difficult item for all students in this cohort across Australia. The results for such an item may provide information about the learning challenges associated with that concept but should not necessarily be cause for concern.

Parents/carers
Parents can use the information in this document to interpret the results on their child's report. They are also able to judge how their child performed when compared with the whole population of students. The item descriptors provide useful information about the scope of the tests.

Pre-service teachers
Pre-service teachers will find the information in the commentaries on overall student performance useful in gaining an understanding of what students know and can do in some areas of Literacy and Numeracy at Year 9.

Placing the tests in the assessment context

The NAPLAN tests are national instruments designed to contribute to a school's assessment program and to inform the teaching and learning cycle. It must be remembered, however, that the results from the 2015 NAPLAN tests represent only one aspect of a school's assessment program. The results from a school's formal and informal assessment of students should be consistent with the NAPLAN test results. Principals and teachers should keep in mind that these were pencil-and-paper, point-in-time, timed tests. If the test results are different from what was expected, consider the possible reasons. The results of the tests may indicate aspects of student performance that need further investigation within the classroom using other forms of assessment.

Marking and scoring the tests

Marking the tests
The tests are scored against nationally agreed marking guides. There are four guides, one for the writing task and one each for the open responses in reading, numeracy and spelling. These guides provide information on the acceptable forms of the correct answer. For the Numeracy tests, students may provide a correct response in different forms. Professional officers review these results and decide how to score.

Calculating raw scores
The simplest calculation made in scoring the tests is the raw score — the number of questions answered correctly. All of the questions for the Language conventions, Writing, Reading and Numeracy tests are marked as either correct or incorrect.

Constructing scale scores
Raw scores have limited use. They enable the performance of students who have all completed the same test at the same time to be placed in a rank order, but they do not provide information about the level of difficulty of the test nor the relative differences between students. To achieve this, raw scores are transferred to a common scale that reflects how difficult it was to achieve each score. The scale is comparable between year levels for each assessment area. An equating process is also carried out on each year's test to enable scores to be compared between years of testing. This might mean, for example, that a raw score of 20 on the Year 3 Reading test is transformed to a scale score of 354. This will also represent the same achievement for a student with the same scale score in Year 5, and for a student with the same scale score for Reading in a previous year. The single scale for all students in all year levels is centred on approximately 500. Scale scores also provide a basis for measuring and comparing students' abilities across years of schooling, for example comparing a student's result in Year 3 in 2013 and Year 5 in 2015.

Using scale scores
The scale score can be used to compare the results of different students. Principals and teachers should take care when making comparisons between small groups of students. For groups of fewer than 10 students, differences may not be reliable, particularly small differences. The scales can be used to monitor the growth of groups of students over time. Principals and teachers should ensure that the compositions of the groups are the same. This enables the school to evaluate special programs that may have been put in place.
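To make the move from raw scores to scale scores concrete, here is a minimal sketch in Python. The equating table below is entirely hypothetical apart from the one worked example in the text above (a raw score of 20 on the Year 3 Reading test equating to a scale score of 354); real NAPLAN equating uses psychometric models and year-specific tables that are not published in this report.

```python
# Illustrative only: a toy raw-score -> scale-score lookup.
# Only the 20 -> 354 pair comes from the report; every other value is invented.
HYPOTHETICAL_EQUATING_TABLE = {
    10: 290, 15: 322, 20: 354, 25: 391, 30: 430, 35: 476, 40: 530,
}

def to_scale_score(raw_score):
    """Convert a raw score to a scale score via the hypothetical equating table."""
    if raw_score not in HYPOTHETICAL_EQUATING_TABLE:
        raise ValueError("No equated value for raw score %s" % raw_score)
    return HYPOTHETICAL_EQUATING_TABLE[raw_score]

def mean_scale_score(raw_scores):
    """Average scale score for a group, with a caution for small groups."""
    scores = [to_scale_score(r) for r in raw_scores]
    if len(scores) < 10:
        print("Caution: fewer than 10 students - differences may not be reliable.")
    return sum(scores) / len(scores)

print(to_scale_score(20))              # 354, the worked example from the text
print(mean_scale_score([15, 20, 25]))  # prints the small-group caution, then the mean
```

The point of the sketch is simply that the scale, not the raw score, is what allows comparisons across year levels and across years of testing.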

Understanding the data

Which reports?

The NAPLAN National Summary Report and the NAPLAN National report provide nationally comparable data about student performance within the National Assessment Program. These data provide states and territories with information about the achievement of their students in relation to their peers across the nation. These data are available from the ACARA website.

This NAPLAN State report provides detailed information about student performance on each of the test items. It gives information about:
• the Queensland performance on each of the items
• the national performance on each item
• the item descriptors
• some commentary on the state results
• some recommendations for teaching.
Together, these publications provide system-level information and are generally available.

The NAPLAN School reports give information about a school's performance in each year level tested. They provide a summary of year-level performance as well as performance by gender, language background and Indigenous status in the following fields:
• distribution of scale scores
• distribution of achievement bands
• school and state means
• participation of the group.
The shading showing the range of performance for the middle 60% of Queensland students, together with the state mean, locates a school's performance relative to that of the state.

[Diagram: NAPLAN data flows to government systems, the Australian public, schools and teachers. The National report supports analysis of systems data (systems planning, trends); the School report supports analysis of school data (range, comparisons of student and state); the Class report supports analysis of class data (test results by class and by group response). All of this feeds teaching, learning and assessment, including planned explicit teaching and feedback based on identified learning goals.]

The NAPLAN Class reports show the performance of each student on every item. They show the items a student had correct, including the errors made in each strand with the exception of reading, where the answers are generally too long to record. The report also gives the:
• scale scores for each student
• bands for each student
• percentage correct for each item for the class and state, and by gender.
The NAPLAN school and class reports are available to schools from the QCAA secure website.

Using data to improve teaching and learning

While the national and state reports provide the comparative data, it is the class reports that provide a school with the information that can be used to inform teaching and learning and to build capacity in schools. Analysis of the NAPLAN class data, in particular the performance on each item, will provide teachers with information about the understandings and patterns of misunderstandings in student learning.

An analysis of the distracters presented in multiple-choice items and the answers to the constructed-response items, other than those for reading, is available through the SunLANDA data analysis tool. This is available on the QCAA website and is designed to help schools with their analyses of class and school results. These results should be placed in a context with other school-based assessments.

Looking at the performance on the items and then analysing the error patterns allows teachers and principals to make hypotheses about why groups of students make particular errors. Schools can:
• compare the facility rates (percentage correct) of items to see if their performance is consistent with the national and state results available in this document (a simple way to do this is sketched below)
• look at the common errors made by their students and compare them with the common errors made in the state (Only errors from Queensland students are available. These are found in the item analyses that are part of SunLANDA.)
• form hypotheses about why students are making these errors, e.g.
  – How did students think about this aspect of curriculum?
  – What misunderstandings might these errors represent?
  – How might the structure of the test question have shaped the response?
Using a combination of the NAPLAN data, school data and professional judgment, teachers should then test these hypotheses to see whether they are valid or whether there is more to be thought about and investigated. Interpretation of these results allows teachers to make judgments about teaching approaches and curriculum. The professional conversations that are part of this process are the most effective and powerful way to use the data as they are the vehicle for developing shared understandings.
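As a minimal illustration of the first dot point above, the sketch below computes a class facility rate for each item and sets it beside the published Queensland and national rates. The class responses are invented, the two item numbers and their published rates are copied from the Year 9 Numeracy calculator-allowed table later in this report, and the 10-point flagging threshold is an arbitrary choice for the example; SunLANDA is the tool that supports this kind of analysis in practice.

```python
# Hypothetical class data: item number -> list of 1 (correct) / 0 (incorrect).
class_responses = {
    14: [1, 1, 0, 1, 0, 1, 1, 1, 0, 1, 1, 0, 1, 1, 0, 1, 1, 1, 0, 1],
    22: [0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0],
}

# Published facility rates (Qld %, Aust %), copied from the numeracy item table.
published_rates = {
    14: (58.7, 60.8),
    22: (26.1, 27.0),
}

THRESHOLD = 10  # arbitrary margin (percentage points) for flagging an item

for item, responses in class_responses.items():
    class_rate = 100 * sum(responses) / len(responses)  # class facility rate
    qld, aust = published_rates[item]
    status = "review" if class_rate < aust - THRESHOLD else "in line"
    print("Item %d: class %.1f%%, Qld %.1f%%, Aust %.1f%% -> %s"
          % (item, class_rate, qld, aust, status))
```

A low class rate on an item that was also difficult nationally is not, by itself, cause for concern; the comparison is only a starting point for the error analysis and hypothesis testing described above.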


Year 9 Writing

Writing prompt

Simply the best

Choose:
• the best movie, TV show or performance you have seen, or
• the best book you have read.

Write to persuade a reader that they should see or read what you have chosen to write about.

• Start with an introduction.
  An introduction lets a reader know what you are going to write about.
• Write your opinion on the topic.
  Give reasons for your opinion. Explain your reasons.
• Finish with a conclusion.
  A conclusion sums up your reasons so that a reader is convinced of your opinion.

Remember to:
• plan your writing
• use paragraphs to organise your ideas
• write in sentences
• choose your words carefully to convince a reader of your opinion
• pay attention to your spelling and punctuation
• check and edit your writing so it is clear.

© ACARA 2015 YEAR 7 AND YEAR 9

Key messages

About the task

In 2015, the NAPLAN Writing test used two prompts for the first time, one for Years 3 & 5, and another for Years 7 & 9. Besides this difference, the test conditions and administration of the test remained the same as in previous years, i.e. teachers delivered the same spoken instructions and read the text aloud to students. Working independently, students had to plan, compose and edit a written response. Students were allowed five minutes to plan, 30 minutes to write their script, and a further five minutes to edit and complete the task. Three pages were provided for students to write a response.

The 2015 prompt for Years 7 & 9 was entitled Simply the best. Students were asked, in the textual component of the prompt, to choose the best movie, TV show, performance seen, or book read. They were then asked to persuade a reader that they should see or read what was chosen. Additional information was provided in the textual component of the prompt. This named the structural components, and further defined these elements, e.g. Start with an introduction. An introduction lets a reader know what you are going to write about. Other notes were also provided in relation to the conventions associated with this type of writing task. A series of silhouetted images of various cultural activities, books, and film shots surrounded the textual component of the prompt. As was the case in 2013 and 2014, the prompt was relatively open-ended, allowing students to base their writing on a topic of their own choice within the persuasive genre.

Markers for this Writing test were trained using the national persuasive writing marker training package, delivered as part of ACARA's national assessment program. Markers were recruited and trained in accordance with national protocols, applied consistently across all states and territories. Registered teachers mark the NAPLAN Writing test in Queensland. All markers applied the 10 criteria and related standards from the marking rubric. Writing test scripts were marked on screen in all states and territories. Stringent quality-control measures were applied to the marking of student scripts, including a prescribed percentage of scripts to be double-marked, and the daily application nationally of control scripts for all markers.

As part of the Queensland marking operation for 2015, referee marking was expanded to further ensure marking reliability. There is also provision for appeal over individual Writing test scores, once test results are released. On appeal, a student's script is re-marked independently by two senior Writing test markers.

The NAPLAN Persuasive writing marking guide is available at www.nap.edu.au/NAPLAN/About_each_domain/Writing/index.html.

Performance

In contrast to the performance of Years 3 and 5 students, there was little change in the Writing test performance for the older cohorts of Years 7 and 9. A possible prompt effect exhibited in Years 3 and 5 leading to improved performance was not replicated in the upper grades. Other conditions (genre, time, test protocols) remained constant from 2014.

The prompt did allow students to select a topic of personal interest with which they had some knowledge and familiarity. In fact, few students had difficulty in finding a subject on which their text could be based. Issues tended to emerge with respect to the application of genre to the subject matter. So, students typically introduced their subject (e.g. favourite book or film) in an opening paragraph, stated broadly why the book/film/performance should be read/viewed, then proceeded to provide information regarding plot, characterisation, cinematographic features etc. The danger here was that students were flirting with the informative rather than the persuasive genre. Conclusions tended to focus on simple restatements of main points referred to in the body of the text. Often, there was a very close parallel between the wording of the introduction and conclusion; on occasions, almost identical wording.

Some students, particularly Year 9 students, adopted a 'review' type response, tending to provide information rather than persuasive argument defending their choice of book, film etc. Typically, this information was sandwiched between an introduction and conclusion which reflected some persuasive elements. So these student scripts were deemed to be 'on genre', though their final scores were impacted by the absence of persuasive elements used consistently throughout their texts.

Because the Writing test is an 'on demand' assessment, student responses frequently lacked detail about books, films and performances. So, a student may have been able to write with some fluency about generic features of their chosen subject, but lacked substance in supporting the arguments provided. Exceptions to this were cases where students clearly felt passionate about their topic, knew fine-grained details of the book or film, and were able to reflect this passion for the work in a clearly persuasive fashion.

A small number of students adopted a text type that blended narrative and persuasive genres. Introductions provided something like a narrative or recount of an experience, leading in to the persuasive text proper. For instance, an introduction may have referred to an attendance at an evening performance of a play or musical, followed by a description of the scene, any emotional response and so on. Features of the performance were then related in a persuasive body of text. There are risks in following this style, not the least of which is that the principal genre for the task may not be clearly developed, or not be outwardly apparent to the marker.

On a more positive note, students in Years 7 and 9 showed that they are reading novels, enjoying films, and attending other cultural events. For many, this is an enjoyable experience on which they could readily reflect. Students generally wrote, on average, lengthier responses than in 2014, the prompt possibly having an effect in this regard. The 2015 test may also have suited students of a more academic persuasion, though students of varying ability levels were able to produce sound written responses on topics they were personally interested in.

At this level, Years 7 and 9, students should be able to demonstrate control of the persuasive genre. The simple textual structure of introduction, body consisting of two to three paragraphs, and conclusion can sometimes inhibit the capacity of many students to more effectively persuade readers to their points of view. That is, the textual 'form' can become more dominant than the language purpose of actually persuading. Students who show greater command of the logical and textual features of persuasive writing, particularly how this might play out in the body of the text, where persuasive tools such as condition, causation, comparison and supporting evidence are employed effectively, are rewarded through the NAPLAN marking rubric and in persuasive writing more generally.

References

Australian Curriculum, Assessment and Reporting Authority 2013, Australian Curriculum: English, www.australiancurriculum.edu.au

Queensland Curriculum and Assessment Authority 2013, Hidden worlds, www.qcaa.qld.edu.au/downloads/p_10/3579_wt_hidden_worlds.pdf

Queensland Curriculum and Assessment Authority 2011, Queensland's Literacy Test: A framework for describing spelling items, www.qcaa.qld.edu.au/downloads/p_10/3579_describing_spell_items.pdf

Writing task sample

Year 9 — Shawshank redemption

[The sample student script appears here as a scanned image in the original report.]


Year 9 — Shawshank redemption

Audience: 6
The script uses the characteristic language and tone of cultural reviews to praise the film and thus to encourage the reader to see it. The writer adopts an enthusiastic, expert stance.

Text structure: 4
The script is structured as a formulaic three-reason school essay. While it is possible for more-able students to score well using this structure, its use more often tends to limit the opportunity to produce sophisticated scripts. Although text types have their typical structures, students should not think that there is one rigid structure they must always reproduce. Since writing is about thinking, the 'structure' of the thought should be the structure of the writing. Good thinking has a unity with interlinking parts. It is the message (what the student wants to say) that should create its necessary structure. The thought is likely to fit a 'top-level structure' type (such as comparison/contrast; part-to-whole; order of importance etc.).

Ideas: 5
A sophisticated approach in dealing with this film would have been to relate it to big issues in society or psychology. This student makes claims about the film's themes profoundly affecting society and, later, that they are relevant and empowering. However, these claims are not explained. As typically happens with the use of the formulaic structure, this student presents three unrelated 'reasons' why the film is enjoyable and writes a short paragraph on each. He chooses three disparate headings: plot, popularity and theme. This prevents him from conceiving a strong main idea unfolded through logical discussion. So rather than developing a strong main idea, the student focuses on the film's plot and popularity. These are minor features compared to its theme. Understanding and explaining theme requires much thought and space. In a final paragraph, the student attempts to describe the film's themes as (1) being calm and peaceful and (2) justice. These attempts are not well developed. The script would have been enhanced had the student developed a central idea, for example that the film's theme is the need for hope in adversity. Teaching this student how to develop strong central ideas will enhance his writing.

Persuasive devices: 4
The script accommodates and appeals to the reader in a variety of ways. It attempts to 'shame' readers who pride themselves in their movie knowledge (You cannot say you like movies and not have watched Shawshank redemption). It also anticipates an audience objection by praising how the film caters to fans of action as well as emotion. It overuses adjectives. In learning to craft his writing, the student needs to realise that one precise word is usually more effective than many (e.g. outstandingly gripping, emotional and action-packed plot). The least successful of the persuasive devices is the attempt to persuade by ethos, by claiming the film deals with values such as justice. This potentially strong persuasive move is frustrated by a lack of clear thinking.

Vocabulary: 5
The student seeks to use the appropriate language of cultural commentary. Less appropriate are the idiomatic word choices, e.g. For starters and ticks that box. Although such turns of phrase are effective in writing on popular culture, this essay would be more effective if they had been better integrated with the more formal language.

Cohesion: 4
The use of apposition in the first sentence suggests how well the script keeps focus. The top score was awarded even though the pronoun it on the fifth line of the second paragraph and in the last sentence of the second last paragraph should have been better referenced. This student is ready to learn how to improve a first draft script by focusing on parts where a reader could ask 'what does that mean?' As well, the student is ready to learn some skills of crafting, even in first draft writing. For example, the teacher has the chance to show the student how to avoid the repetition of the word film in the first sentence of paragraph 3.

Paragraphing: 3
Despite the script's brevity, its paragraphs act to segment the text and have sound internal structure.

Sentence structure: 5
Generally, the sentences are fluent. Clumsy passages point to where the student can be assisted in developing his writing, e.g. You cannot say you like movies and not have watched Shawshank redemption. This sentence should be rewritten with unless you have instead of and not have. Errors include missing words ('as' after acclaimed in paragraph 3, and 'for' at the bottom of the first page) and a duplicated word (still) in the second-last paragraph.

Punctuation: 5
The student mistakenly believes the name Shawshank is two words. The second word of the title, redemption, is not punctuated consistently.

Spelling: 5
The words enduring and empowering on the third last line were judged to be incorrectly spelt.

Year 9 Literacy

Language conventions

Spelling — Results and item descriptions

The percentage columns give the proportion of correct answers (facility rates). These results are based on provisional data.

Proofreading — error identified

Item | Answer | Qld% | Aust% | Description
1 | traditional (traditionel) | 88.2 | 87.4 | Correctly spells a word with the suffix -al.
2 | pumpkin (pumkin) | 79.1 | 82 | Correctly spells a two-syllable word with the consonant p at the closed syllable juncture.
3 | routine (rootine) | 76.2 | 78.5 | Correctly spells a word with the long vowel digraph -ou.
4 | theory (theary) | 78.3 | 78.8 | Correctly spells a word with the schwa (o).
5 | orbit (orbet) | 71.4 | 72.5 | Correctly spells a word with the schwa (i) in the final unaccented syllable.
6 | radius (radias) | 66.2 | 68.9 | Correctly spells a word with the Latin inflection -us.
7 | salmon (sammon) | 57.6 | 63.7 | Correctly spells a two-syllable word with the silent letter -l at the syllable juncture.
8 | hilarious (hillarious) | 59.4 | 61.1 | Correctly spells a word with the single consonant l at the syllable juncture.
9 | hectares (hectairs) | 48.8 | 49.4 | Correctly spells a word with the diphthong pattern -are.
10 | illegally (illegaly) | 45.3 | 46.9 | Correctly spells a word with no change to the stem before the suffix -ly.
11 | absolutely (absolutly) | 48.8 | 49.7 | Correctly spells a word with no change to the stem before the suffix -ly.
12 | convenient (conveniant) | 43.2 | 44.6 | Correctly spells a word with the adjective-forming suffix -ent.
13 | queue (quewe) | 39.1 | 38.6 | Correctly spells a one-syllable word with the long u vowel pattern -eue.
14 | vigorous (vigerous) | 21.8 | 24.7 | Correctly spells a word containing the suffix spelled -or, not -our, when it is followed by a second suffix (-ous).
15 | consignment (consinement) | 16.9 | 17.5 | Correctly spells a word with the silent consonant g.
16 | buoyant (boyant) | 8.9 | 12.5 | Correctly spells a word with the diphthong /oi/ spelled -uoy.

Proofreading — error unidentified

Item | Answer | Qld% | Aust% | Description
17 | identity (identety) | 79.8 | 80 | Identifies a mistake then correctly spells a word with the connective schwa (i) before the suffix -ty.
18 | experience (expirience) | 59.2 | 59.8 | Identifies a mistake then correctly spells a multisyllable word with an r-controlled vowel (er).
19 | glacier (glaciar) | 64.5 | 65.1 | Identifies a mistake then correctly spells a multisyllable word with the unaccented ending -er.
20 | achieved (acheived) | 50.3 | 51.7 | Identifies a mistake then correctly spells a word with the long vowel digraph -ie.
21 | wreckage (reckage) | 44.6 | 46.6 | Identifies a mistake, then correctly spells a word with the silent initial letter -w.
22 | circumnavigate (cercumnavigate) | 43.7 | 40.4 | Identifies a mistake, then correctly spells a word with the prefix circum-.
23 | platypus (platypuss) | 41.3 | 45.4 | Identifies a mistake then correctly spells a word with the prefix -pus.
24 | census (sensus) | 32.9 | 36.7 | Identifies a mistake then correctly spells a word with the initial sound s represented by c.
25 | noxious (noxous) | 37.8 | 38.1 | Identifies a mistake then correctly spells a word with the blend xi followed by the suffix -ous.
26 | contradicted (contrerdicted) | 35.1 | 34.2 | Identifies a mistake then correctly spells a word with the prefix contra-.
27 | foliage (folige) | 24.5 | 26.7 | Identifies a mistake then correctly spells a word with the suffix -age.
28 | aerial (aeriel) | 22.2 | 26 | Identifies a mistake then correctly spells a word with the adjective-forming suffix -al.
29 | occasionally (occassionally) | 10.3 | 10.7 | Identifies a mistake then correctly spells a word with the single consonant s at the syllable juncture.
30 | veterinarian (vetinarian) | 4.1 | 4.9 | Identifies a mistake then correctly spells a word with an abbreviated syllable.

Spelling — Key messages

Performance

Generally the Queensland performance was similar to that across Australia. The facility rates were slightly higher than the national result for the words traditional, circumnavigate and contradicted. Queensland performance was lower than the national result for several words, especially salmon, platypus, census and aerial.

Items with notably low facility rates across Australia were
• veterinarian, occasionally, foliage, vigorous, consignment, all of which require knowledge of how base words have spellings that reflect meaning and of how affixes attach to base words
• aerial, census, buoyant and queue, all of which require familiarity with the meaning, usage and spelling of advanced vocabulary words.
Words such as buoyant and queue have spellings that are rare and as such are tests of individual word knowledge rather than testing an understanding of the spelling system.

The errors reveal that, when faced with more difficult words, students abandon advanced spelling strategies and revert to matching sounds to letters. Often, they seem to allow themselves to be influenced by the printed misspelling of the target word instead of spelling the word using their normal practices. The misspellings of the words contradicted, circumnavigate and consignment were good examples of students reverting to earlier and less sophisticated strategies.

While 35.1% of Queensland students could spell contradicted, the evidence from the errors suggests that those who couldn't reverted to a sound-it-out strategy rather than using their knowledge of the prefix contra and the root dict. As a result, these students failed to recognise and spell the unstressed syllable to produce the following errors — contredict (21%), contridict (12%), conterdict (4%) and controdict (1%). If these students had known the prefix and been able to use it as a spelling strategy then this word would have been spelt correctly by almost three-quarters of Year 9 students.

In spelling circumnavigate, students who identified that word as the error most often correctly identified that the error was in the spelling of the prefix circum and not navigate. Students produced errors such as curcumnavigate (5%), cercumnavigate (3%) and cerconavigate (2%).

If students had realised that consignment was related to sign, and been able to build it up from that word, e.g. con+sign+ment, then the performance on this word would also have been at least 35% higher, as the top five most common errors showed the students struggling with how to spell the middle syllable.

To be efficient spellers, students need to work in larger chunks of words such as Latin and Greek roots or base words. This allows them to build a more extensive repertoire of spelling more quickly. It also contributes to their vocabulary development.

Implications for teaching

Test effects
• To avoid the distracting effect of the printed misspelling, students should identify the target word, then cover up the misspelling and write the target word using their own best knowledge of spelling patterns and word components.
• Standardised test results can give reliable generalisations about groups of students but for diagnostic and teaching purposes teachers should use assessments such as dictation tests, audits of student writing, and records of conferences with students.

Strategies
Teachers should teach strategies for activities such as these:
• Analysing syllables and stress patterns. Use this strategy for phonetically coded multisyllable words.
• Applying knowledge of inflections and affixes. Use this for words containing grammatical components.
• Applying knowledge of classical and foreign word elements. Recall visual memories of the written form of mature vocabulary words. Use this for specialist and academic vocabulary.

Content
Spelling lessons should not be about random words but words that exemplify the separate layers of the spelling system and that suit secondary students' stage of learning these codes. In learning the spelling system, students should study:
• foreign-derived words (e.g. buoyant)
• words with 'within-word letter patterns' that represent complex vowel patterns (e.g. glacier, vigour, buoyant, experience, achieved) and consonant patterns (e.g. salmon, pumpkin, wreckage)

• homophones and near-homophones (e.g. route/root/rout, census/senses/censors/censures, queue/cue)
• words with 'syllable and affix patterns':
  – open and closed syllables (or-bit, gla-cier, pump-kin and salm-on)
  – advanced stems and affixes (tradit-ion-al, il-legal-ly, con-veni-ent, con-sign-ment, ident-ify, nox-ious, hilar-ious, vigor-ous, wreck-age, foli-age, veterin-arian, oc-cas-ion-al-ly)
  – words whose pronunciation changes ('alternation') after adding affixes (e.g. absolve/absolution/absolute). Although adding suffixes usually leaves the spelling of the stem unchanged (e.g. foli-o to foli-age), there are exceptions such as vigour/vigorous, explain/explanation and pronounce/pronunciation.
• words from the 'derivational constancy' level containing Latin or Greek elements (platypus, circumnavigate, contradictory, foliage, aerial, hectares, orbit, radius, theory). Meaning, not sound, will tell students how these elements are spelled whenever they are encountered. For example, the letter a in contra- is pronounced differently but spelled the same in the words contradicted and contrary. In both words, contra- is a Latin prefix meaning 'against'.

Grammar and punctuation — Results and item descriptions

The percentage columns give the relative proportion of correct answers (facility rates). These results are based on provisional data.

Item | Answer | Qld% | Aust% | Description
31 | B | 93 | 93.3 | Identifies the correct use of an auxiliary with a past participle.
32 | B | 92.1 | 92.9 | Identifies an infinitive that completes a sentence.
33 | D | 91 | 91.6 | Identifies the correct use of a preposition and noun phrase.
34 | B | 86.7 | 86.8 | Identifies the correct preposition to complete a sentence.
35 | C | 85.3 | 86.7 | Identifies the correct use of a comma after an introductory dependent clause.
36 | A | 84.4 | 85.6 | Identifies the correct use of brackets to enclose the explanation of an acronym.
37 | C | 80.6 | 80.4 | Identifies an independent clause punctuated as a sentence.
38 | A | 65.1 | 65.1 | Identifies the referent for a pronoun in a short passage.
39 | A | 68.8 | 72.7 | Identifies the correct pronouns in a compound subject.
40 | B | 67.9 | 68 | Identifies the correct use of commas for an embedded clause and brackets for non-essential information.
41 | B | 61.5 | 64.9 | Identifies the correct synonym for a modal verb.
42 | A | 66.1 | 68.5 | Identifies the correct use of commas in a complex list.
43 | B | 49 | 50.7 | Identifies the subject of a participial phrase.
44 | C | 65.9 | 66.9 | Identifies the correct suffix that changes a verb to an adjective.
45 | A | 57.1 | 55.8 | Identifies correct use of the homophones who's and whose.
46 | A | 48.4 | 51.5 | Identifies the correct use of capital letters for proper nouns.
47 | B | 55.2 | 53.5 | Identifies correct use of the homophones its and it's.
48 | B | 51.3 | 56.4 | Identifies the correct parallel grammatical form.
49 | B | 45.3 | 48.5 | Identifies the correct use of hyphens for a compound adjective.

Item | Answer | Qld% | Aust% | Description
50 | C | 59.7 | 56.9 | Identifies the main clause in a complex sentence.
51 | D | 45.8 | 44.7 | Identifies the correct punctuation of direct speech with internal attribution.
52 | D | 42 | 35.4 | Identifies a compound sentence.
53 | C | 42 | 42.7 | Identifies a sentence in which a modifier is correctly related to the subject.
54 | A | 39.7 | 40.7 | Identifies the correct use of a semicolon.
55 | A | 34.2 | 33.1 | Identifies a word functioning grammatically as a noun.
56 | D | 37.4 | 37.9 | Identifies the grammatical functions of words in a context sentence.
57 | D | 20.9 | 20.3 | Identifies the correct use of dashes to mark off a qualifying phrase.
58 | A | 36 | 36.4 | Identifies the use of a participle as an adjective.

Grammar and punctuation — Key messages

This year, the test focused on some skills more than others, for example:
• naming and recognising parts of speech and other grammar categories (items 49, 52, 55, 56 and 58)
• linking modifiers to the true subject of the sentence (items 43 and 53)
• adding supplementary information to a sentence (items 36, 40 and 57).

Performance

The Queensland Year 9 results for grammar and punctuation were mostly very similar to those achieved nationally. Queensland exceeded the national result for item 50 (identify a main clause in a complex sentence). Queensland students also performed better than the national result on item 52 (identify a compound sentence). However, this same question was answered correctly by about the same percentage of Year 7 students. This indicates a continuing need to teach grammatical skills to senior students as they will not necessarily improve by themselves.

Queensland facility rates were lower for
• item 39 (pronouns in a compound subject)
• item 41 (modal verbs)
• item 48 (recognise a parallel grammatical form)
• item 49 (hyphens for a compound adjective).

Implications for teaching

Grammatical terms
The low facility rates on items 49, 52, 55, 56 and 58 suggest that many students need to learn the grammatical terms referring to sentence modes, parts of speech, suffixes, prefixes and so on. Better knowledge of grammatical terms will allow better classroom discussions of language issues. Formal grammar concepts and terms should be taught along with lessons on practical, functional communication. Students are likely to engage with grammar and punctuation if they can see that it helps them to understand and enjoy mature texts and to use the techniques of writers to improve their own writing.

Assessment for teaching
Teachers should assess and intervene when students' communication aims are being retarded or frustrated by their lack of grammar and punctuation knowledge. NAPLAN results cannot give teachers detailed assessment information. Instead, for baseline and post-lesson assessment, teachers could
• audit student writing
• audit student reading for fluency and comprehension
• set exercises related to specific lessons.

Content of lessons
The teaching and assessing of grammar and punctuation should be
• developmental (because it covers increasingly mature skills) but timely (taught when students need to learn) and
• systematic (covering the features relevant to the levels of communication, from the whole text level to the sentence level and down to groups of words).

Teachers can refer to the following resources:
1. Comprehensive and specific information about what to teach from Years 1 to 9 is given in the draft scope and sequence for teaching grammar (and punctuation), available from the QCAA at https://www.qcaa.qld.edu.au/downloads/p_10/qcar_ss_english_grammar.pdf
2. Books by Beverley Derewianka and Sally Humphrey suggest ways of teaching that can often apply to older students. (Published by Primary English Teaching Association Australia.)
3. Topics for teaching Year 9 grammar and punctuation are suggested in the Australian Curriculum English:
• Understand that authors innovate with text structures and language for specific purposes and effects (ACELA1553)
• Compare and contrast the use of cohesive devices in texts, focusing on how they serve to signpost ideas, to make connections and to build semantic associations between ideas (ACELA1770)
• Understand how punctuation is used along with layout and font variations in constructing texts for different audiences and purposes (ACELA1556)
• Explain how authors creatively use the structures of sentences and clauses for particular effects (ACELA1557)
• Understand how certain abstract nouns can be used to summarise preceding or subsequent stretches of text (ACELA1559)
• Identify how vocabulary choices contribute to specificity, abstraction and stylistic effectiveness (ACELA1561).

Reading

Results and item descriptions

The percentage columns give the proportion of correct answers (facility rates). These results are based on provisional data.

Item | Answer | Qld% | Aust% | Description

Music for the planet
1 | C | 86.2 | 85 | Locates directly stated information.
2 | D | 91.1 | 91.5 | Locates directly stated information.
3 | C | 93.6 | 93.9 | Integrates information to make an inference.
4 | A | 82.8 | 84.5 | Interprets the purpose of a visual metaphor.
5 | C | 64.9 | 64.1 | Infers the meaning of a proverb.
6 | B | 82.5 | 83.5 | Identifies the main purpose of a poster.

The best medicine
7 | D | 87 | 89.2 | Integrates information to make an inference.
8 | A | 92.8 | 94.1 | Locates directly stated information.
9 | C | 93.2 | 94.1 | Interprets information to make an inference.
10 | D | 63.3 | 64.7 | Infers a purpose for the use of inverted commas.
11 | * See below | 60.1 | 62.5 | Identifies directly stated information.
12 | B | 83.3 | 84.8 | Locates directly stated information.

The Minotaur
13 | A | 75.2 | 77.5 | Infers a character's role.
14 | D | 42.1 | 44.5 | Infers the meaning of a group's reaction.
15 | C | 36.2 | 38.4 | Infers the background to an action.
16 | B | 51.2 | 53.9 | Interprets the meaning of an expression.
17 | B | 66.1 | 67.1 | Infers a character's motivation.
18 | A | 48.2 | 53.2 | Identifies the specific purpose of brackets.

A way forward
19 | D | 24.5 | 27.8 | Interprets change in pronoun use.
20 | B | 65.3 | 67.8 | Evaluates a character's stance.
21 | A | 47.6 | 50.6 | Interprets the meaning of a phrase.
22 | C | 69.6 | 71.4 | Infers the meaning of a sentence from context.
23 | D | 62.4 | 64.1 | Synthesises information across a text.
24 | B | 46 | 48.5 | Infers a perception of a character.

Caffeine — an eye opener
25 | D | 75.8 | 78.3 | Identifies the purpose of a diagram.
26 | C | 56 | 58.6 | Synthesises information within a paragraph.
27 | D | 48.9 | 52.2 | Interprets directly stated information.
28 | B | 41.2 | 41.9 | Interprets the intent of an expression.

Caffeine — an eye opener (continued)
29 | * See below | 41.5 | 42.9 | Interprets meanings of a title.
30 | A | 29 | 30.1 | Identifies a writer's style.
31 | C | 52.1 | 54.2 | Evaluates bias in a persuasive text.

Mrs Douglas
32 | A | 54.1 | 56.8 | Interprets information to make an inference.
33 | C | 52.5 | 55.9 | Infers the tone of a narrator.
34 | B | 38.9 | 41.9 | Interprets a character's insight.
35 | C | 43.5 | 47.6 | Infers the motivation of a character.
36 | A | 61.3 | 64.3 | Infers the meaning of an expression.
37 | B | 54.3 | 57.2 | Identifies purpose of the format of direct speech.
38 | A | 37.1 | 40.7 | Infers a character's personal qualities.

Auroras: neon signs in the sky
39 | B | 65.2 | 68.9 | Locates directly stated information.
40 | * See below | 42 | 43.3 | Locates directly stated information.
41 | A | 53.5 | 54.9 | Identifies the reason for use of inverted commas.
42 | C | 41 | 43.8 | Evaluates the purpose of a comparison.
43 | C | 49.4 | 52.7 | Evaluates main purpose of a paragraph.
44 | B | 56.8 | 58.1 | Identifies common information in two paragraphs.

Lost and found
45 | 2, 3, 5, 4, 1 | 19.4 | 21.4 | Identifies the narrative sequence of events.
46 | D | 54 | 58 | Interprets a character's feelings.
47 | A | 50.6 | 53.3 | Interprets the use of italics.
48 | B | 57 | 60.5 | Interprets information related to a character's situation.
49 | B | 50.1 | 53.8 | Identifies the use of flashback technique.
50 | D | 25.8 | 30.9 | Identifies the author's tone.

Item 11: Response refers to natural and/or unforced laughter.
Item 29: Response gives two interpretations: the article is an eye-opener because it tells you previously unknown information about caffeine AND an eye-opener is something that wakes you up, which is one of caffeine's effects.
Item 40: Responses that identify all three of the following: atmosphere, magnetosphere, solar wind.

Key messages

As in 2014, the Year 9 Reading test for 2015 consisted of 50 questions about eight stimulus texts (informative, persuasive and narrative). A welcome feature was the inclusion of two extracts from published works ('Lord Douglas' by Henry Lawson and Feeling the heat by Pat Lowe). For the first time, a partial credit item was used; Item 29 required students to write two meanings of a phrase but those who wrote only one meaning were scored as partially correct.

Teachers can view school-specific performance information through the QCAA's SunLANDA program. SunLANDA is available online through the Schools portal on the QCAA home page.

(Education Queensland schools may also access this content through OneSchool.) SunLANDA displays the performance of classes, subgroups and individuals within the school and compares the school's performance with that of the state and nation. Hyperlinked to each item are the analyses and teaching ideas. It is possible, however, to use SunLANDA off-line by installing the program and loading it with files containing school data. This option is available from the Test reporting and analysis section on the QCAA's NAPLAN pages. From these pages, parents and teachers can also view collected PDF versions of the item analyses.

Performance

Queensland, Tasmania and South Australia typically perform below the ACT, Victoria and New South Wales. On a few items, Queensland results are more than 3% lower than the national. Lower facility rates in Queensland often occurred on higher-order tasks. This helps to explain why only about 13% and 4% of the Queensland cohort reached Bands 9 and 10 respectively, while the national results were 15% and 6%. Many higher-order tasks involved inference — either inference from the text (items 18, 35 and 39), or inference from the context (item 38). Although some easy items are described in the table above as 'inference' tasks, these involve very simple inference. Other higher-order tasks were set by items 46, 49 and 50. These required students to use knowledge of literary terms and abstract vocabulary.

Also notable were the incorrect answers that reflect students' lack of test-wiseness (knowledge about the types of questions asked in tests and about the practical steps to take when answering). This seemed to influence performance on items 11, 17 and 26 among others.

Implications for teaching

The test results show the need for teaching advanced reading skills. These skills are not confined to the narrow range of inference and interpretation tasks in the NAPLAN test. Skilled readers are able to:
• connect themselves with what they are going to read
• consider how a text's purpose and subject matter influence its language features
• use the special terminology of textual analysis to describe, summarise, compare and contrast
• evaluate and appreciate good quality writing
• determine the scope, reliability and accuracy of information
• apply, synthesise and extend ideas
• read to make decisions, solve problems and predict outcomes.

Skilled readers can also access rich prior knowledge. As active readers, they encounter many text types and ideas. This leads to efficient reading because authors inevitably make assumptions about their readers' existing knowledge of texts, vocabulary and facts. For example, Henry Lawson in 1902 did not need to explain that Mrs Douglas is humbling herself when she pays the grocer cash-per-purchase rather than continuing to run up a tab. His readers already knew that certain actions typify declining social status. Modern readers also know this if they have enough knowledge of history and literature, just as they know that 10 shillings was a big sum (about a quarter of a weekly wage at that time). Some assumed knowledge can be inferred from the text if it is not known beforehand, but the more that students read, the less perplexity and difficulty will arise when they lack prior knowledge.

QCAA resources

QCAA 2015, Beyond NAPLAN: How to read challenging texts, Beyond NAPLAN series, www.qcaa.qld.edu.au/downloads/p_10/naplan_read_challenging_texts.pdf

23 Queensland Curriculum & Assessment Authority | QCAA 2014, Using reading data to improv e students’ performance in higher-order questioning , Beyond NAPLAN series www.qcaa.q ld.edu.au/Naplan-reading-advice.html Teaching higher-order reading and test-taking strategies Explicit teaching of reading strat egies involves the method of gra dual release of responsibility to the learner. Teachers model thei r own thought processes for students to emulate. Ultimately, students should be able to make their ow n thinking visible in peer discussions. Authentic, high quality texts are needed. Students should be motivated, at least potentially, to read the chosen texts, either because they are texts that they would read voluntarily in their own time or because they are set te xts for school subjects. While teaching authentic reading sk ills, teachers can take the opportunity to point out how they apply in test situations. Pre-reading: Whether they are reading authentically or in a test situation, students should brainstorm their existing (prior) knowledge and ac tivate their personal curiosity about a topic before reading a text on that topic. This gi ves them a meaningful frame in which to read. On the other hand, texts will sometimes contradict students’ prior knowledge and assumptions. In a test situation, students should know that they cannot choo se an answer by using their prior knowledge alone, without reading the text. This mistake probably explains, for example, many of the incorrect answers to items 17 and 26. Teachers can design pre-reading ac tivities to help students • access their background knowledge • establish their purpose in reading and the purpose behind the text • formulate questions that can drive inquiry. ‘Read with a pen’: Active reading also involves annotatin g the text: writing summary comments in the margins, using question mark s and exclamation marks to show how the reader responds to the ideas, ruling lines between related parts of the text, underlining topic sentences and special terms, numbering or labelling exam ples, proofs and definitions. This is an authentic reading skill that will greatly assist in test situations. Skim and scan: Skimming and scanning a text are auth entic reading skills, but they are also useful in test situations. Te st items 8, 13 and 44 all requi red skimming and scanning skills. We skim read a text to get the gist. We would skim to find out, for example, whether the text is likely to be useful for our purpose or to get a quic k idea of the content of a collection of texts. Skim reading a long journal article or a book chapter might involve reading the headings, the first and last paragraphs and the topic sentences in each paragraph. We scan texts to find something specific. Suppose, for example, we want to know whether Henry Lawson’s stories reflect his changi ng political views. To research this question we might scan some early stories and some late ones, looking for attitudes related to republicanism and unionism. If we are reading online, the computer can search for exact terms, but this is not useful unless those specific terms are the only things we need to search for. Some information texts have contents pages and index es, but these may not include th e concepts we are interested in. Test question knowledge: Ideally, test items should ask the same questions that good readers ask as they read (although pragmatic factors may not always allow this). Readers’ questions are of four main types: Can I recall explicit statements? 
• Can I make connections and find underlying meanings?
• What is another perspective on this?
• Where does this lead?

Students should categorise the test question and be sure not to give, for example, a recall answer to an inference question.

Certain words and phrases in test questions should also be noted. Sometimes questions use the phrase ‘according to the text’. That does not mean that the other questions can be answered without reference to the text! But students should be particularly careful not to answer from prior knowledge when they see that phrase. Normally when discussing texts we use the phrase ‘according to the text’ to refer to explicit statements. When ideas are only inferable or peripheral, we tend to refer to them with phrases such as ‘based on the text’ or ‘judging from the text’. In the NAPLAN test, however, the phrase ‘according to the text’ could refer to any aspect of the text. Item 11, for example, which uses ‘according to the text’, requires the reader to draw their own conclusion from a minor qualifying phrase.

In a standardised test of reading, the reading load should be in the stimulus text rather than in the question. Nevertheless, students should be prepared for cases where the question contains complicated grammar and nominalisation. In a way, understanding a complicated question such as item 21 resembles the authentic task of reading commentaries on texts. The question contains a reader’s précis of a passage in the text.

Year 9 Numeracy

Results and item descriptions

The numeracy strands are abbreviated as follows: Algebra, function and pattern (AFP); Measurement, chance and data (MCD); Number (N); Space (S). All items are worth one score point. The percentage columns give the proportion of correct answers (facility rates). These results are based on provisional data.

Calculator-allowed paper

Item | Strand | Answer | Qld % | Aust % | Description
1 | S | B | 89.6 | 90.4 | Identifies a solid with half the volume of a given solid.
2 | MCD | C | 71.8 | 69.9 | Identifies the attribute measured in litres.
3 | S | A | 81.9 | 82.2 | Identifies the mirror image of an object.
4 | AFP | C | 69.9 | 70.5 | Substitutes a value into an equation to find an unknown.
5 | MCD | B | 67.4 | 69.8 | Converts a rate from metres per second to metres per minute.
6 | N | A | 62.6 | 66.6 | Evaluates an expression involving decimal division.
7 | MCD | C | 67.2 | 67.5 | Estimates the number of cubes needed to fill a box.
8 | N | B | 68.8 | 71 | Expresses a number using scientific notation.
9 | N | B | 67.4 | 68.3 | Expresses a quantity as a percentage.
10 | AFP | A | 73.9 | 75.8 | Identifies an equivalent expression that involves squares, multiplication and division.
11 | MCD | D | 63.9 | 65.4 | Calculates a finishing time given the start time and duration.
12 | S | C | 59.2 | 60.7 | Calculates the size of an angle in an isosceles triangle.
13 | AFP | C | 47.6 | 49.1 | Calculates a volume using a rate of cents per litre.
14 | AFP | C | 58.7 | 60.8 | Identifies the pair of numbers in a table that satisfy a rule.
15 | N | 5 062 043 | 57.5 | 59.6 | Solves a multistep problem involving large numbers, addition and subtraction.
16 | S | C | 56.8 | 58.1 | Identifies a 2-D shape from a list of its properties.
17 | S | D | 34.9 | 36.8 | Solves a problem involving compass directions and measure of turn.
18 | AFP | A | 33.8 | 36.9 | Rearranges a linear equation to give y as the subject.
19 | N | D | 33.6 | 39.3 | Identifies an equivalent form of a number with a negative index.
20 | AFP | B | 36.1 | 39 | Uses substitution to solve a linear equation.
21 | S | 40 | 36.3 | 38 | Calculates a length from a scale drawing.
22 | MCD | A | 26.1 | 27 | Solves a problem involving a rate and the conversion of units of time.
23 | N | 350 | 26.2 | 30.3 | Uses proportional reasoning to solve a problem involving two percentages.
24 | AFP | 70 | 21 | 22.5 | Interprets the pattern in a table of values to make a prediction.
25 | MCD | 2197 | 13.5 | 16.8 | Calculates the volume of a cube given the sum of its edges.
26 | S | D | 18.6 | 19.9 | Identifies a true statement about the properties of quadrilaterals.
27 | N | D | 10.3 | 12.8 | Solves a two-step problem involving percentage discount.
28 | MCD | 655 | 8.6 | 10.7 | Solves a multistep problem involving perimeter and conversion of units to calculate cost.
29 | AFP | A | 11.1 | 12.8 | Identifies the formula of a line.
30 | S | 120 | 10 | 12.2 | Applies geometric properties of shapes and angles to calculate the size of an angle.
31 | N | 33 | 2.8 | 5.4 | Calculates the largest sum of three whole numbers given their product.
32 | MCD | 65 | 2.6 | 4.8 | Calculates the perimeter of a sector of a circle.

Non-calculator paper

Item | Strand | Answer | Qld % | Aust % | Description
1 | AFP | B | 87.1 | 87.5 | Calculates a missing value in an additive pattern presented in a table of values.
2 | AFP | B | 82 | 83.6 | Applies knowledge of the order of operations to identify the expression to solve a problem.
3 | MCD | A | 67.7 | 68.5 | Selects the most likely event from 2 spins of a spinner.
4 | S | B | 77 | 79.3 | Identifies the missing part of a symmetrical design.
5 | MCD | B | 70.6 | 72.9 | Interprets points on a graph to identify information.
6 | S | D | 77.8 | 78.2 | Identifies an incorrect face in a net.
7 | N | B | 69.4 | 69.3 | Calculates a product which includes a cubic power.
8 | AFP | C | 59.9 | 61.2 | Determines the value of a symbol in an informal equation.
9 | S | C | 64.6 | 67 | Identifies a pair of points that form an edge of a prism.
10 | MCD | A | 51.2 | 53.6 | Estimates the capacity of a container shown in a picture.
11 | AFP | C | 53.4 | 52.5 | Matches the shape of a time/distance line graph to the appropriate data.
12 | S | D | 50.2 | 51.8 | Identifies a property of a trapezium.
13 | AFP | B | 43.4 | 50.9 | Simplifies an algebraic expression involving like terms.
14 | N | D | 64.4 | 67.9 | Calculates the difference between a negative integer and a positive integer.
15 | MCD | A | 45.9 | 48.8 | Solves a length problem.
16 | N | C | 44.6 | 46.3 | Estimates a cost using a rate of cents per minute.
17 | N | D | 37.7 | 41.6 | Identifies the decimal that is the closest to 0.
18 | S | A | 39.5 | 41.9 | Determines the coordinates of a point of a parallelogram on a Cartesian plane.
19 | N | 90 | 29.8 | 34.7 | Uses proportional reasoning to calculate a total.
20 | MCD | D | 29.2 | 31.2 | Calculates the probability of an event involving factors.
21 | N | A | 26.5 | 30.5 | Simplifies a complex expression involving indices.
22 | S | A | 36.4 | 39.4 | Infers the shape that matches given parameters from three examples.
23 | N | 6 | 25.6 | 27.9 | Applies a rate to calculate a related quantity.
24 | AFP | A | 24.8 | 28.8 | Continues a number pattern when given a rule.
25 | AFP | D | 36 | 39.5 | Solves a word problem involving ratio.
26 | AFP | D | 16.4 | 18.9 | Solves an equation involving fractions.
27 | MCD | E | 19.8 | 22.7 | Calculates the perimeter of a shape using the properties of equilateral triangles.
28 | AFP | D | 15.8 | 17.4 | Determines the point of intersection of two lines from their equations.
29 | MCD | 4500 | 12.1 | 14.6 | Calculates the capacity of a square prism.
30 | MCD | C | 29.1 | 30.7 | Compares the areas of rectangles after an enlargement.
31 | S | 38 | 4.2 | 5.6 | Uses a number pattern to calculate the perimeter of a series of composite shapes.
32 | AFP | 180 | 10.1 | 13.4 | Solves a word problem involving the difference between two fractions.

Key messages

Performance

Student results for Numeracy for Years 7 and 9 are reported as a single score. Where a student completes only one of the two Numeracy tests, their Numeracy score is an estimate of the score they may have received had they completed both tests. A significant difference between the raw scores achieved by an individual student on the calculator-allowed and non-calculator tests may warrant investigation.

While the majority of students attempted to answer all test items, a significant number omitted the more difficult items towards the end of the test, particularly items for which they were required to construct an answer rather than select a correct response from given options. These items are generally designed to differentiate student performance: to provide opportunities for higher performing students to demonstrate their ability to solve complex problems. The percentage of students failing to enter an answer for constructed-response items on the calculator-allowed test ranged from 4% to 20%. This was significantly higher than the result for constructed-response
items on the non-calculator test, where between 6% and 16% of students failed to answer. Teachers will need to ask students their reasons for failing to answer questions, as a non-response provides no information for teachers to use to improve learning.

The percentage of students who correctly answered items on the calculator-allowed test ranged from 90% on the easiest item to 2.6% on the most difficult, the final item on this test. This item required students to calculate the perimeter of a sector of a circle. Fifteen of the 32 questions on the calculator-allowed test were answered correctly by more than 50% of students in Queensland and across Australia.

For the non-calculator test, the percentage of students answering items correctly ranged from 87.1% for the easiest item to 4.2% on item 31. This item required students to use a number pattern to calculate the perimeter of a series of composite shapes. Thirteen of the 32 questions on this test were answered correctly by more than 50% of Queensland students. This is one fewer than the number for the nation as a whole.

Queensland students performed above the national facility rate on only one item on the calculator-allowed test — item 2. They were 3% or more below the national facility rate on 5 items (6, 18, 19, 23, 25). The most significant difference was 5.7% on item 19, where students had to identify an equivalent form of a number with a negative index. On the non-calculator test, Queensland students marginally outperformed the national cohort on 4 test items; however, they were below the national rate by 3% or more on nine items. Eight of these were classified as either Number or Algebra, function and pattern items (13, 14, 17, 19, 21, 24, 25, 32). The other item (22) was from the Space strand.

A number of items are common to both the Year 7 and Year 9 tests. These ‘link’ items are for equating purposes and are intended to reflect differences between the two groups of students. The difference between the facility rates on these items should indicate that students at Year 9 have a significantly improved understanding of the relevant mathematical concept. Of the 11 ‘link’ items on the calculator-allowed test, Year 9 students outperformed their Year 7 counterparts by 10% or more on 7 items. Of the other 4 items:
• 1.6% more Year 9 students were able to identify a solid with half the volume of a given solid (1)
• 4.2% more Year 9 students could identify the mirror image of an object (3)
• 67.4% of Year 9 students expressed a quantity as a percentage compared with 61% of Year 7 students (9)
• 3.3% more Year 9 students were able to use the geometric properties of shapes and angles to calculate the size of an angle (30).

There were 8 items that appeared on both the Year 7 and Year 9 non-calculator tests. Three of these (1, 8 and 15) were answered correctly by approximately 10% more Year 9 students than Year 7 students. On two probability items (3 and 20) the differences were 6.6% and 7.1%. For items 4 and 6, both Space problems, the differences were 6% and 7.8%. The item that required students to estimate a cost using a rate of cents per minute (16) had facility rates of 43% for Year 7 and 44.6% for Year 9.

On the calculator-allowed test, Queensland male students outperformed female students on 21 of the 32 items.
For four items (5, 7, 14 and 23) the difference was greater than 5%, with the most significant difference being 12% for item 23, which required students to use proportional reasoning to solve a problem involving two percentages. On the same test, female students outperformed their male counterparts by more than 5% on 3 items (4, 6 and 10), two of which were Algebra, function and pattern items. For the non-calculator test, male students outperformed female students by more than 5% on 6 of
the 32 items (11, 14, 15, 17, 24 and 25), with item 17 having the largest difference of 15%. Female students outperformed male students by 6% on two items — 10 and 12. There is no pattern to these differences for either test; however, an examination of school-specific data may paint a different picture.

Implications for teaching

Across the two Year 9 Numeracy tests, 23 items had facility rates of less than 30%. Most of these were in the second half of the test paper, meaning that they were among the more difficult items on the test. Of these 23 items, 8 addressed the concepts of length, volume, mass, area or time (22 CA, 25 CA, 28 CA, 32 CA, 27 NC, 29 NC, 30 NC, 31 NC). These items were presented as word problems that students had to decode before determining the mathematical operation required to solve them. Most involved calculations for which students should have been able to use formulas (perimeter, volume, area) or conversions between units. Simply providing students with a formula does not help develop their understanding of a concept such as perimeter or volume. Students need to be provided with opportunities to explore the relationships between the various dimensions of a shape and to use these relationships to develop formulas. Some students may benefit from the use of concrete materials to explore these relationships.

The majority of these items also included a diagram. Teachers should not assume that students know how to interpret or use diagrams to help them solve problems. They need to be taught how to interpret diagrams, particularly those that use labelling conventions for lines and angles, and how to annotate a diagram with the information presented in the text of a word problem. Having all the relevant information in one place will assist students to see the relationships necessary to solve the problem.

Conversion between units of measure is still a challenge for many students in Year 9. It is imperative that students understand the prefixes used for metric measurements and the multiplicative relationships between these units.

Another 4 of the 23 items with facility rates of less than 30% (23 CA, 27 CA, 19 NC, 23 NC) related to the concepts of rate, ratio, percentage or proportion, all of which involve multiplicative thinking. Items 23 and 27 on the calculator-allowed test involved percentages. This concept continues to challenge students. They need to understand that a percentage is another representation of a fraction or part of a ‘whole’, a proportional representation of a relationship between two quantities. The other basic understanding that seems to elude many students is that percentages are numbers that can be compared, added and subtracted only if they represent portions of the same ‘whole’. As percentages are used extensively in marketing and banking, the use of contexts from everyday situations — discounts, interest rates and population increases — will enable students to see the relevance of the concept. Students should also be able to recall the percentage equivalents of key fractions as an aid to checking calculations and for estimation purposes. Teachers should not assume that students know how to use the percentage key on their calculator — this needs to be taught.
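To make this advice concrete, the following worked lines illustrate a metric conversion, a simple percentage discount and the fraction equivalents of some key percentages. The numbers are chosen for illustration only and are not taken from the test items.

\[
3.2\ \text{m} = 320\ \text{cm} = 3200\ \text{mm} \qquad 1\ \text{L} = 1000\ \text{mL} \qquad 1\ \text{m}^3 = 1000\ \text{L}
\]
\[
15\%\ \text{of}\ \$80 = 0.15 \times 80 = \$12, \quad \text{so the discounted price is } \$80 - \$12 = \$68
\]
\[
25\% = \tfrac{1}{4} \qquad 20\% = \tfrac{1}{5} \qquad 12.5\% = \tfrac{1}{8} \qquad 10\% = \tfrac{1}{10}
\]

Asking students to estimate first (15% is a little less than 20%, or one fifth, so the discount is a little less than $16) reinforces the link between percentages and the key fractions they should be able to recall.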
To correctly answer items 19 and 23 on the non-calculator test, students needed to read the problems carefully to see the relationships between the different quantities and then use proportional reasoning to calculate their solution. Presenting students with a variety of problems — both single-step and multistep — and asking them to identify the steps needed, without the pressure of having to find a solution, may assist in the development of students’ problem-solving skills. Classroom discussions are also important in developing problem-solving skills as these encourage the sharing of alternative methods and understandings, and expose common mistakes and misconceptions.
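A minimal sketch of this kind of two-step proportional reasoning, using quantities invented for illustration rather than taken from items 19 or 23:

\[
\text{If 3 kg of apples cost } \$7.50, \text{ then 1 kg costs } \$7.50 \div 3 = \$2.50, \text{ so 8 kg cost } 8 \times \$2.50 = \$20.
\]

Asking students simply to name the two steps (find the unit rate, then scale it) before any calculation is done mirrors the suggestion above of identifying the steps without the pressure of finding a solution.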

The other set of concepts prominent in items with a facility rate of less than 30% involved expressions, equivalence or equations. Across the two tests, there were 12 items related to this set of concepts (4 CA, 6 CA, 10 CA, 18 CA, 20 CA, 29 CA; 2 NC, 8 NC, 13 NC, 21 NC, 26 NC, 28 NC). These included both algebraic and numerical expressions and equations that required students to substitute values, simplify expressions, identify expressions that represented problems, rearrange equations, identify equivalent expressions and determine the value of a symbol in an informal equation.

Item 26 on the non-calculator test required students to solve an equation involving fractions. Many students at this year level find calculating with fractions in any context a challenge, so adding the concept of equivalence and the need to know how to solve an equation caused more than a few problems. To develop an understanding of the meaning of algebraic expressions and equations, have students write the meaning of the expression or equation in words, e.g. x divided by 2 is equal to (has the same value as) 3 divided by 11, or half of x is equal to three-elevenths (a short worked example is given at the end of this section). Students should also be asked to translate expressions written in words into algebraic expressions. They need to understand the concept of equivalence in order to understand how to solve equations.

Item 28 on the non-calculator test and item 29 on the calculator-allowed test assessed students’ understanding of the equation of a line. These two items may be considered to be more an assessment of mathematics than of numeracy, but solving linear equations using algebraic and graphical techniques and verifying solutions by substitution are in the Australian Curriculum: Mathematics for Year 8. It may be necessary to revisit the concepts of patterns, relationships, variables, expressions, equations, unknowns and graphs so that students become more comfortable using algebraic methods to solve problems. The use of concrete materials may aid in reinforcing these concepts.

Please refer to SunLANDA for a detailed analysis of individual test items, including teaching ideas designed to assist with the development of the understanding and skills required for each item. SunLANDA is available to all schools on the QCAA website. These materials are also available to Education Queensland schools through OneSchool.

When looking at the data for a single test item, teachers can compare the grouped data for their class with that of the state or national cohort. This will enable them to judge the level of difficulty that their students experienced with that item. For some items, the differences between the national, state and class data may not be significant, but teachers may wish to investigate the reasons for the poor performance of students on items that assess simple content and skills fundamental to numeracy development.
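The worked example referred to in the discussion of equations above is set out here. It is a generic illustration of solving a simple fraction equation and of finding the point of intersection of two lines by substitution; it does not reproduce the actual test items.

\[
\frac{x}{2} = \frac{3}{11} \;\Rightarrow\; x = 2 \times \frac{3}{11} = \frac{6}{11}
\]
\[
y = 2x + 1 \ \text{and}\ y = x + 4 \;\Rightarrow\; 2x + 1 = x + 4 \;\Rightarrow\; x = 3,\ y = 7
\]

Substituting x = 3 back into both equations gives y = 7 in each case, confirming that the lines intersect at (3, 7) and modelling the practice of verifying solutions by substitution mentioned above.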