Pretests for Learning?
I have long advocated that we should preassess our students. Preassessment lets us learn about our students’ prior knowledge and adjust our instruction to an appropriate level. It also lets us confirm that students have the prerequisite skills needed to engage in subsequent learning. But according to some research, there may be another benefit to using pretests.
Exploratory Activities
The February blog focused on the importance of allowing students to problem solve before giving direct instruction (PS-I) and its benefits over delivering direct instruction first. Some studies do not show increased learning with PS-I strategies; however, it is becoming an accepted tenet that PS-I strategies do indeed result in increased learning over direct-instruction-first (I-PS). The cognitive mechanisms for these results are a topic of ongoing research, but the benefits of engaging students, tapping into prior knowledge, and making students aware of existing knowledge gaps prior to direct instruction are clear.
The initial ‘problem solving’ phase of PS-I does not have to be solving a problem. It can involve data sets, contrasting cases, scenarios, or other tasks that serve to orient students to the subsequent direct instruction phase. Often these tasks give students opportunities to invent incorrect answers and may lead to productive failure. Students may be given different levels of support during this phase, from no support to step-by-step guidance. Because of this variation, it is best to use the general term “exploratory activities” to encompass the variety of initial tasks in the PS phase of a PS-I approach. In this blog, we will add another, oft-overlooked kind of initial task: pretests.
This Article
The article by Newman and DeCaro (2019) has an intriguing title: “Learning by exploring: How much guidance is optimal?” Its three experiments compare, on the surface, whether inventing a solution (minimal guidance) increases learning as much as studying worked examples (maximal guidance), and how the level of guidance affects cognitive load. Requiring students to invent a solution may impose a higher cognitive load that can be unproductive: students may spend working memory exploring incorrect solutions, which can waste time and effort and leave them frustrated. If students are guided with worked examples, working memory is directed only towards productive solutions.
The scientist in me is not happy with the series of experiments because too many variables are left uncontrolled, which makes comparison between experiments nearly impossible. The experiments are conducted with no instructor support; even the direct instruction is a text passage. It is also unclear whether students worked in groups or alone during the exploration activity, leading one to question whether social interaction or an instructor could influence students’ learning. Despite these limitations, we see that, in general, students who studied worked examples (maximal guidance) scored higher on the posttest than those in the invention group (minimal guidance), and both cognitive load and perceived awareness of knowledge gaps were higher for the invention group. These are not unexpected results, and they are not the reason I chose to highlight this article.
In learning psychology experiments, students are given a pretest to make sure that the groups have an equivalent starting point. If one group appears to learn more on the posttest, who’s to say they weren’t different from the start, right? But in this article studying guidance during the PS phase of learning, the authors brought up a novel idea: What if pretests themselves were an exploratory activity?
In experiment 3, there were four conditions crossing two factors: invention or worked examples, and pretest or no pretest. When students were not given a pretest, there was no significant difference in posttest performance between the invention and worked examples groups. But when students were given a pretest, there was a difference: students given worked examples scored significantly higher than those who invented a method for determining variance (Table 1).
Table 1: Posttest means (and standard deviations) for Experiment 3 of Newman & DeCaro (2019). Data from Table 5 (p. 60).
*Posttest scores were significantly higher (p=0.22) for maximal guidance than minimal guidance when a pretest was administered.
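As an aside for readers who want the mathematical target in view: the invention task asked students to come up with a way to quantify variance. One standard formulation, shown here for reference only (it is not necessarily how the study’s materials presented it), is the sample variance:

$$
\bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i, \qquad s^2 = \frac{1}{n-1}\sum_{i=1}^{n}\left(x_i - \bar{x}\right)^2
$$

For the small data set $\{2, 4, 6\}$, for instance, $\bar{x} = 4$ and $s^2 = \frac{(2-4)^2 + (4-4)^2 + (6-4)^2}{2} = 4$.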
Pretests as an Exploratory Activity
As mentioned, we don’t know the cognitive mechanisms for why PS-I increases learning. Exploratory activities may activate students’ prior knowledge and increase information retrieval. They may draw attention to the most important features that will be addressed in subsequent instruction, orienting students’ thinking. They may also elicit awareness of knowledge gaps, motivating students during subsequent direct instruction to learn the information they do not know. Most likely, there are multiple overlapping mechanisms, as postulated with the forward testing effect described in last month’s blog. Regardless of the cognitive mechanisms, there are clear benefits to student learning when students engage with exploratory activities before being given direct instruction.
As for the types of exploratory activities, much of the research uses problems, contrasting cases, and data sets, often situated within a scenario. But there is limited research on the potential benefits of using pretests. This is likely because learning psychology experiments use pretests as a way to ensure equivalent grouping. As practitioners, we have the opportunity to devise our own ways of determining whether pretests could be beneficial for student learning.
Implications for the Math Classroom
I have long advocated using low-stakes pretests to diagnose students’ prior knowledge and adjust instruction accordingly. For example, I preassessed a college biology class and found that students needed only a quick refresher on basic genetics. But two students did not know the metric system of measurement, so we held an office-hours session to cover this topic, and I would check in with them after labs to make sure they were comfortable with conversions.
I have long railed against using a posttest as a pretest. Comparing an identical ‘pretest’ to the ‘posttest’ and finding that students ‘learned’ can inflate your ego. Worse, posttests take time to administer, often a full class period, and giving students a test that is likely well beyond their current skill can produce frustration that prevents a good diagnosis of current knowledge.
Instead, pretests should both address the knowledge students will need to have AND include a question or two that address what students will need to be able to do after the instructional unit is complete. I would give a large, low-stakes pretest on the first day of a biology class so I could adjust my teaching for the semester. This is not always practical, so I have also given one- to three-item pretests just before an instructional unit, usually at the end of the prior class period, so I have time to diagnose students’ prior knowledge and adjust my instruction.
Given the results from the Newman and DeCaro (2019) article, we can use short, low-stakes pretests at the beginning of a lesson or unit as an exploratory activity that activates the cognitive mechanisms needed to learn. In other words, you don’t need to come up with anything elaborate to engage students before direct instruction. You could pick an item or two from your posttest, give them to the students, and see how they do.
If you like learning research, then conduct your own quasi-experiments with this pretest-then-direct-instruction form of PS-I. Have some students work alone and others in groups on the pretests to see whether social interaction plays a role. Use students’ incorrect solutions in subsequent direct instruction, or simply launch into your usual instruction, to see if making direct references to their work makes a difference. You can also assess not only learning but also affective outcomes like self-efficacy, enjoyment, or anxiety when you use low-stakes pretests.
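If you want a quick, informal way to compare results from such a quasi-experiment, here is a minimal sketch in Python. The scores are hypothetical placeholders you would replace with your own class data; it compares a pretest group to a no-pretest group with an independent-samples t-test:

```python
# Minimal sketch: compare posttest scores between a class section that took a
# short, low-stakes pretest and one that did not. Scores below are hypothetical
# placeholders; substitute your own data. With small classes, treat the result
# as a rough signal, not a definitive finding.
from scipy import stats

pretest_group = [72, 85, 78, 90, 66, 81, 75]      # posttest % correct
no_pretest_group = [70, 74, 68, 79, 72, 65, 71]   # posttest % correct

# Welch's t-test (does not assume equal variances between the two groups)
t_stat, p_value = stats.ttest_ind(pretest_group, no_pretest_group, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

Because classroom groups are not randomly assigned, factors like time of day or section composition can confound the comparison; that is what makes this a quasi-experiment rather than a true experiment.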
Takeaways
Exploratory activities before direct instruction work to improve learning!
Short, low-stakes pretests are one easy exploratory activity to try.
The use of pretests as exploratory activities is not well studied in the research. Try it and see how it works for your students.
If you do decide to use low-stakes pretests, let us know! Share your best practices with the Almy Education community so we can improve math teaching and learning together!