Interactive Engagement with Classroom Response Systems

by S. Raj Chaudhury, Christopher Newport University


Details

Instructor Name:

S. Raj Chaudhury

Course Title:

Introductory science for non-majors

Institution:

Christopher Newport University

What is the overall aim of the course?:

Introductory science courses for non-majors are often among the larger courses taught at liberal arts colleges, and while they fulfill “breadth” requirements within core curricula, their very size and nature often pose a challenge for properly assessing student learning and engagement with the material. Even though such courses often stress “finding the right answer,” I am interested in generating discussion among students and engendering a sense of shared inquiry.

Course design and scope of the project:

Physical science courses for non-majors have been a special interest of mine over the last several years, ranging from classes that follow a “studio” model with integrated lecture and laboratory to large lecture-only courses where student interest, attendance, and motivation are all open to question. Much of this work was completed at Norfolk State University (a Historically Black University), and I am continuing it at Christopher Newport University, also a state institution in Virginia. I have been especially interested in exploring classroom response systems (aka “clickers”) to promote understanding of the material, collaboration, and metacognitive awareness.

Incorporation of Technology:

Interactive handheld response systems (“clickers”) lie at the heart of my approach. The instructor poses a multiple-choice question to the class, and each student responds anonymously using a device that looks like a TV remote control. Once all responses are received, a histogram of the results appears on the screen. Ideally, the chosen question generates a bimodal distribution. The instructor then asks students to engage in Peer Instruction: “turn to your neighbor and try to convince them to change their answer to yours.” This period usually lasts 60-90 seconds. As the buzz in the room subsides, the instructor polls the class again with the same question. Depending on the outcome of this second poll, the instructor may choose to revisit the topic, clarify a point, or simply proceed with the lesson. A 50-minute lecture broken into three segments of 10 minutes of direct instruction, each followed by one or two “clicker” questions, keeps students engaged and provides the instructor with useful formative assessment data.
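For readers who want to see the mechanics of this cycle concretely, here is a minimal sketch in Python. It is not part of any clicker vendor's software; the sample responses and the 30-70% re-poll threshold are my own illustrative assumptions, chosen to mimic the common Peer Instruction rule of thumb.

    from collections import Counter

    def tally_responses(responses):
        """Count clicker responses (e.g., 'A'-'E') into a simple histogram."""
        return Counter(responses)

    def should_repoll(histogram, correct_choice, low=0.3, high=0.7):
        """Illustrative heuristic: if the fraction of correct first responses
        falls in a middle band, the question is worth a peer-discussion round
        followed by a second poll; otherwise move on or reteach."""
        total = sum(histogram.values())
        if total == 0:
            return False
        fraction_correct = histogram[correct_choice] / total
        return low <= fraction_correct <= high

    # Hypothetical first-poll data for one question
    first_poll = ["A", "C", "C", "B", "C", "A", "C", "B", "A", "C"]
    hist = tally_responses(first_poll)
    print(hist)
    print("Run peer discussion and re-poll?", should_repoll(hist, correct_choice="C"))

In practice the response system builds the histogram for you; the value of a sketch like this is simply to make explicit the decision rule an instructor applies when looking at it.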

Lessons Learned:

Response systems are, in my opinion, excellent tools for scholars of teaching and learning because of their data-generation capabilities. Student work in pencil-and-paper form can take long hours to grade before it yields evidence for an inquiring teacher; with a response system, the effort goes into creating excellent questions once, since they can be reused, and the data are generated and stored automatically by the system. Even though I teach introductory science, where there is often an emphasis on “finding the right answer,” I use the system principally to generate discussion among students and to engender a sense of shared inquiry, with the assessment data shared in real time by the students and the instructor. This approach is applicable across many disciplines, wherever there are lectures that could be made more interactive.

As students and instructor view a histogram of results together, they connect with the material in a powerful way, creating a pathway for the development of students’ metacognitive skills that is not easily possible without the technology.

References, links:

There are several classroom response systems available, in both the higher education and secondary school markets. I am currently using the CPS (Classroom Performance System) from e-Instruction. Their website is a good place to start. Research on response systems has been growing, especially in the physics education field, where papers from Eric Mazur’s group at Harvard and the UMass Amherst group are well regarded. A number of presentations at the Carnegie Colloquium and at meetings such as AAC&U have focused on implementations of response systems. I shall be developing an online poster on my use of the CPS system at CNU in Fall 2005. Links will be available from my home page.

Measured Results:

While I have always received positive anecdotal feedback from students about the use of response systems in my classes at Norfolk State University, much of my attention there was focused on encouraging other faculty members to adopt interactive engagement in their lectures using Peer Instruction. We used the Personal Response System (PRS) technology, now sold as Interwrite PRS. Its data aggregation facilities were primitive and made it hard to store and analyze data from multiple class sessions. This year at CNU I have been pursuing a very systematic strategy of storing the results of each session (using the CPS technology). I have asked students to comment, on their Student Evaluation forms, on the effectiveness of the technology for learning in their course (introductory physics for non-science majors). I hope to receive this feedback sometime in the spring semester. In the meantime, my data suggest that many physics misconceptions have been identified with a CPS question, addressed through a short instructional sequence, and then assessed with follow-up questions that allowed students to demonstrate increased understanding of the topic. Most recently, this happened during the study of thermal energy, in differentiating the temperature of an object from its specific heat capacity.
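As a final illustration, the sketch below shows the kind of simple analysis that this systematic storage makes possible: comparing the fraction of correct responses on a question that exposed a misconception with the fraction on a follow-up question asked after the short instructional sequence. The file names and column headings here are hypothetical and do not reflect the actual CPS export format; it is a sketch of the analysis, not of the vendor's tools.

    import csv

    def load_sessions(paths):
        """Read one or more exported session files (hypothetical CSV layout)
        into a flat list of per-response rows."""
        rows = []
        for path in paths:
            with open(path, newline="") as f:
                rows.extend(csv.DictReader(f))
        return rows

    def fraction_correct(rows, question_id):
        """Fraction of responses matching the keyed answer for one question."""
        answers = [r for r in rows if r["question_id"] == question_id]
        if not answers:
            return 0.0
        correct = sum(1 for r in answers if r["response"] == r["correct_choice"])
        return correct / len(answers)

    # Hypothetical exported files and question identifiers
    rows = load_sessions(["session_week10.csv", "session_week11.csv"])
    print("Before instruction:", fraction_correct(rows, "thermal_pre"))
    print("After instruction: ", fraction_correct(rows, "thermal_post"))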