Index

Author/s:


Ong Daphne Rachel (Broadrick Secondary School (Singapore))

Keywords
History
Junior College
Secondary School
Approaches to teaching history
Assessment

Introduction
The Source-Based Case Study (SBCS) is a compulsory part of the formal history assessment in Singapore. It falls under Assessment Objective 3, which requires students to “interpret and evaluate source material” (MOE, 2013). Since this is an important component of the current assessment framework, history teachers spend a significant amount of time helping students master the requisite source-work skills, and they are frequently engaged in setting and marking SBCS assignments. Some of these teachers strive to give feedback that helps students know where they stand and how they can improve; most include comments, and some write copious amounts of feedback. While these teachers have good intentions when writing feedback, namely to help students improve their performance, anecdotal evidence suggests that students are likely to skim over written feedback and concentrate mainly on the marks and grades awarded. This response on the part of the students, however, negates the purpose of Formative Assessment (FA) “as one that is specifically meant to provide feedback on performance to improve and accelerate learning” (Sadler, 1998, p. 77).

Another issue hindering student improvement in answering SBCS questions is their over-reliance on the teacher, who goes through detailed explanations for each question after marking while students merely address the corrections by copying the given answers. This situation can be described as “learning is being taught” (Watkins, 2003), in which the traditional roles of the teacher as the provider of all knowledge and the student as the absorber of passed-down knowledge play out. While doing corrections may suggest that students have understood their mistakes, anecdotal evidence again points to the ineffectiveness of this approach: students repeat the same mistakes at a very high rate. One reason is that most students, without being consciously aware of it, simply copy the model answers without ever thinking about the question again. While some students may independently revisit these answers and try to make sense of them before tests and examinations, many experience “rumination”, a state in which students get stuck on their mistakes and circle around them without learning how to find a solution (Panadero & Alonso-Tapia, 2014). Moreover, copying model answers erroneously reinforces the idea that the teacher’s answer is the only logical or correct one, discarding the possibility of other acceptable answers to which students are never exposed.

This article shares how a comprehensive error analysis lesson package, implemented at Broadrick Secondary School (BSS), can serve as a student-centered approach to bridging students’ learning gaps in answering SBCS questions. Teachers can develop error analysis into an Assessment for Learning (AfL) design by using marking codes, feedback, questioning, gradual release of responsibility, differentiated instruction and self-reflection to engage students in their learning.

AfL as A Way to Learn
AfL or FA “is an active and intentional learning process that partners the teacher and students to continuously and systematically gather evidence of learning with the express goal of improving student achievement” (Moss & Brookhart, 2009, p. 6).

Error analysis becomes a form of AfL when feedback, questioning, collaboration and differentiated sense-making are built into a model of learning. This type of learning follows a socio-cultural model and can be considered co-constructivist, as learning takes place through interaction with others in meaningful contexts and through problem-solving activities (Watkins, 2003).

Download Full Article

Author/s:

Celine Oon (Curriculum Planning and Development Division, Ministry of Education (Singapore))
Bertrand Tan (Curriculum Planning and Development Division, Ministry of Education (Singapore))

Keywords
History
Assessment

Introduction
Identifying students’ learning gaps is often a challenge for Pre-University History teachers. Besides generic formative assessment strategies such as teacher questioning, think-pair-share and student reflection, formative assessment at the A-Levels also involves getting students to discuss or write essays in response to past-year History examination questions. While these tasks give teachers some sense of how students manage question items in the A-Level History examination, how much do these essays or Source-Based Case Study (SBCS) assignments tell teachers about students’ understanding of historical concepts and skills?[i] Furthermore, how helpful are these assignments in informing the next steps of instruction?

Generally, many Pre-University History teachers recognize the value of formative assessment in supporting teaching and learning. Knowing where students ‘are at’ at significant junctures of the learning process can help teachers decide what to do to close students’ learning gaps (Wiliam, 2011). However, in the absence of formative assessments that can be quickly implemented and targeted to elicit information on students’ knowledge of historical concepts and skills, teachers often end up using summative assessment for formative purposes.

Yet, for meeting formative assessment objectives, dealing mainly with A-Level History examination questions has limited utility. The first issue is that lengthy essays make it difficult for teachers to quickly identify particular skills or concepts that need further attention (Breakstone, 2014). The second issue relates closely to the purpose of the assessment: A-Level History examination questions require students to synthesize component skills in the course of answering them, so a student’s response offers limited information on precisely where the student’s strengths and weaknesses lie and does not serve as an effective compass pointing teachers towards appropriate instructional interventions. As the National Research Council (NRC) puts it in Knowing What Students Know: The Science and Design of Educational Assessment, “…the more purposes a single assessment aims to serve, the more each purpose will be compromised” (NRC, 2001, p. 2).

Author/s:

Janet Alleman (Michigan State University)

Keywords
Social Studies
Primary School
Assessment

Introduction
Just when the tendency to ‘measure’ entered education and the schools is not definitely known. We do know, however, that even early teachers, including Socrates, challenged their students with carefully prepared questions, which were undoubtedly used to determine students’ intellectual capacity and their ability to exercise higher-order thinking. No outstanding advancements in educational measurement were reported until about the middle of the nineteenth century. By the 1920s, quantitative measurement appeared in literature associated with educational tests, and a little later the quality of tests became part of the conversation. In 1922, the first edition of the Stanford Achievement Test was published. Initially the emphasis was on mastery; later, attention was directed toward student strengths and weaknesses and the use of data to enhance the learning process (Loeck, 1952).

Now fast forward to 2012. The discourse about assessment and testing has exploded, primarily due to standards and high-stakes testing, with a dramatic shift from an almost exclusive focus on student performance and accountability to include teacher performance and accountability. In some school districts, teachers have lost their jobs due to poor student performance, and in other instances teachers’ salaries are determined, in part, by student performance. While this article will not enter the debate about where the emphasis should be or who is to be praised or blamed, this author advocates a balance and argues that attention to ongoing student assessment correlates with teacher performance if the assessments are multi-faceted and aligned with curricular goals, and if the results are used to inform planning and modify instruction.

Assessment: Integral Part of the Learning Cycle
Imagine assessment as an integral part of the learning cycle that takes multiple snapshots of each student. The teacher needs a host of data types in order to create a profile of each learner. Think of the profile as telling the story of each student: his/her assets and successes as well as areas that need attention. While the renewed interest in assessment seems to be based on the onslaught of standards and standardized tests, it behooves the teacher to avoid this narrow perspective and instead seize the opportunity to rethink assessment within the context of curricular goals. Consider it in terms of its potential for determining students’ progress in learning, for curricular improvements, for instructional planning, and for grading.

Assessment should be a natural part of teaching and learning, with the student in the loop and acquiring skills to self-monitor. Assessment should be ongoing, frequently cast as preliminary, formative, and summative. Different forms and times for assessment should be determined according to the purpose of the learning situation, the kind of information sought, and how the assessment will be used to accomplish the subject’s goals. Since assessment is ongoing, many instructional activities can be used as assessment tools. The key is for the teacher to realize the difference between using an activity for teaching (processing information, etc.) and for “testing” a student’s performance.

Download Full Article
