Just a quick post to recommend a book: “Connections Are Everything: A College Student’s Guide to Relationship-Rich Education” by Felten et al. (2023), which is available as a free e-book, so no excuse for not reading it!
Recommended reading: “The New Science of Learning: How to Learn in Harmony with Your Brain” by Zakrajsek (2022) (Part 2)
I’m back to browsing the “menu” in my new favorite book, “The New Science of Learning: How to Learn in Harmony with Your Brain” by Zakrajsek (2022). If you haven’t read the first blog post about the book, you might want to read that one first for context.
Recommended reading: “The New Science of Learning: How to Learn in Harmony with Your Brain” by Zakrajsek (2022)
I found a new YOU HAVE TO READ THIS BOOK!!!-book: “The New Science of Learning: How to Learn in Harmony with Your Brain” by Zakrajsek (2022). It is aimed at students and it might be the most important thing students ever read in school…
Effective learning techniques for students: Currently reading Dunlosky et al. (2013)
I want to give you a quick summary of the super useful article “Improving Students’ Learning With Effective Learning Techniques: Promising Directions From Cognitive and Educational Psychology” by Dunlosky et al. (2013). A lot of what I write about on here is about how we can improve our own teaching, but another really important aspect is helping students to develop the skills they need to become successful and independent learners. This can be achieved either by explicitly teaching them study techniques, or by building lessons in ways that use those techniques (although I think that even then it would be useful to make the techniques and why they were chosen explicit).
In the article, Dunlosky et al. (2013) suggest one possible lesson plan that combines several of the techniques they recommend: starting a new topic with a practice test and feedback on the most important points learned before; practising exercises on the current topic mixed with “older” content; picking up or referring back to old ideas repeatedly; and asking students to connect new content with prior knowledge by asking how the new information fits with what they already know and whether they can explain it.
So what are the techniques we should be using and teaching our students? Out of the 10 techniques reviewed in the article, two are high impact, three are moderate impact, and the rest have low impact (even though some of those are the ones most often used by students), and I am presenting them in that order (and you might recognise them from the suggested lesson plan — surprise!).
High impact: Practice testing
One of the two most useful learning techniques, according to Dunlosky et al. (2013), is practice testing: either self-testing of to-be-learned material, or doing really low-stakes (or even no-stakes) tests in class.
For self-testing, this can mean different things, like learning vocabulary using (electronic) flashcards. When I was learning Norwegian, I practised a lot using the app Anki and self-written cards with vocabulary or sentences I wanted to know; now I use Duolingo regularly (616-day streak today, wohoo!). But it could also mean doing additional exercises, either those provided with the teaching materials, or even seeking out or coming up with additional questions. As a student, I spent a lot of time (mostly sleepless nights, though) before my oceanography exams trying to imagine what I might be asked and how I would answer.
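If it helps to see the flashcard idea spelled out, here is a toy sketch in Python of what a quiz-yourself loop could look like; the tiny Norwegian deck is made up for illustration, and this is of course nothing like what Anki or Duolingo do internally.

```python
import random

# A made-up mini deck of Norwegian vocabulary: prompt -> expected answer
deck = {
    "takk": "thank you",
    "vann": "water",
    "bølge": "wave",
}

def self_test(deck):
    """Go through the deck in random order and check our own answers."""
    cards = list(deck.items())
    random.shuffle(cards)
    to_revisit = []
    for prompt, answer in cards:
        guess = input(f"What does '{prompt}' mean? ").strip().lower()
        if guess == answer:
            print("Correct!")
        else:
            print(f"Not quite, it means '{answer}'.")
            to_revisit.append(prompt)
    return to_revisit  # cards to practise again in the next round

if __name__ == "__main__":
    print("Practise these again:", self_test(deck))
```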
I wrote about the importance of assessment practices and how “testing drives learning” previously, but here the important point is that students ideally use this as a learning technique themselves.
High impact: Distributed practice
The second high impact practice is distributed practice: not cramming everything the night before an exam, but spreading practice out over as long a period as possible and coming back to material repeatedly over time. This is not how we typically teach, nor how textbooks present materials (usually one topic is presented in one chapter, together with all the exercises or practice problems that go with that topic), so it is not a learning technique that students are necessarily familiar with.
Distributed practice can be “encouraged” (enforced?) by frequent low-stakes testing in class. It is also built into the apps I mentioned above: flashcards or practice problems come up again after a little while, and every time you answer correctly, the interval until the next repetition gets longer; if you answer wrong, the card pops up again sooner and more often. And, of course, it is something that we can plan for and can encourage our students to plan for, ideally combined with an explanation and maybe some data on why this is a really good idea.
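Just to make that scheduling idea concrete, here is a minimal sketch of a Leitner-style spaced-repetition scheduler in Python. It's a toy illustration of the general principle, not the actual algorithm behind Anki or Duolingo (which is considerably more sophisticated), and the interval values are made up.

```python
from dataclasses import dataclass

# Review intervals in days for each "box"; the higher the box, the longer the wait.
# These values are made up for illustration.
INTERVALS = [1, 2, 4, 8, 16]

@dataclass
class Card:
    prompt: str
    answer: str
    box: int = 0          # new cards start in the first box
    due_in_days: int = 0  # and are due immediately

def review(card: Card, answered_correctly: bool) -> None:
    """Reschedule a card after a review.

    A correct answer moves the card to a higher box, so the interval until
    the next repetition gets longer. A wrong answer sends it back to the
    first box, so it pops up again soon.
    """
    if answered_correctly:
        card.box = min(card.box + 1, len(INTERVALS) - 1)
    else:
        card.box = 0
    card.due_in_days = INTERVALS[card.box]

card = Card("bølge", "wave")
review(card, answered_correctly=True)   # due again in 2 days
review(card, answered_correctly=True)   # due again in 4 days
review(card, answered_correctly=False)  # back to the first box, due tomorrow
print(card.box, card.due_in_days)       # -> 0 1
```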
Moderate impact: Interleaved practice
One moderate impact practice that I like a lot is interleaved practice: mixing different types of problems or different topics during a practice session. Interestingly, results during those practice sessions are worse than when the same types of problems or topics are practised grouped together. But when tested later, interleaved practice turns out to be a lot more effective, likely because with interleaving, students have to figure out which solution method each problem requires, whereas in blocked practice it is very easy to just mindlessly apply the same procedure over and over again without actually thinking about why it is the appropriate one for a specific case. Which is what I am currently experiencing with my Swedish classes, now that I’m thinking about it…
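To make the contrast in ordering concrete, here is a small sketch (with hypothetical practice problems, just to show the idea) of a blocked session versus an interleaved one; in the interleaved order, every problem first requires deciding which kind of problem it even is.

```python
import random

# Hypothetical practice problems, grouped by topic
problems = {
    "fractions": ["1/2 + 1/3", "3/4 - 1/6", "2/5 * 5/8"],
    "percentages": ["20% of 50", "15% of 80", "5% of 200"],
    "ratios": ["simplify 18:24", "2:5 vs 4:10", "3:4 scaled up to 12:?"],
}

def blocked_order(problems):
    """All problems of one topic, then all of the next: easy to go on autopilot."""
    return [p for topic in problems for p in problems[topic]]

def interleaved_order(problems):
    """Problems from all topics shuffled together: you first have to recognise
    which type of problem you are looking at before you can pick a method."""
    mixed = [p for topic in problems for p in problems[topic]]
    random.shuffle(mixed)
    return mixed

print("Blocked:    ", blocked_order(problems))
print("Interleaved:", interleaved_order(problems))
```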
Moderate impact: Elaborative interrogation
But a second moderate impact practice could help in these cases: elaborative interrogation. Here, we would do exactly what I describe above not doing in my Swedish classes: asking ourselves why we apply a rule in one situation but not in another, why a pattern shows up here and not there, and coming up with explanations. This is actually very easy to implement.
But it is not so easy to instruct as a technique, when we don’t want to prescribe the kinds of questions students should ask themselves, but want them to generate the questions themselves, and then answer them. How do we tell them at what level of abstraction or difficulty they should aim? If we give prompts, then how many? Maybe this is something we can / need to model explicitly?
Moderate Impact: Self-explanation
Another moderate impact practice is self-explanation, where we explicitly connect new information with what we know already, explore how information fits together and which parts are actually new and/or surprising to us and why, or explain rules we come across. This is really useful for far-transfer later on.
We can prompt self-explanation on a very abstract level, giving general instructions like “what is the new information in this paragraph?”, or on a much more concrete level like “why are you applying this rule rather than that one?”.
The most efficient way to use self-explanation is to do it right during the learning process, but doing it retrospectively is still better than not doing it at all. And it is important for learning that we don’t just read ready-made explanations, but come up with them ourselves (this makes me think of people who always pull out their smartphone and google the answer to an intriguing question, instead of engaging in the back-of-the-envelope fun).
Low impact: Summarization
And now we’ve reached low impact practice no 1: summarization. Writing summaries of the content we are trying to learn, that’s something I do a lot, for example just now when writing this blog post (but I don’t rely on remembering what I’m writing; I google things on my own blog. So maybe that’s not the same thing?).
Summarising, i.e. rephrasing the important points in one’s own words, is more useful than just selecting the most important content and then copying it word for word.
Low impact: Highlighting/underlining
Another low impact, yet highly popular, practice is highlighting and underlining. I’ve never understood why people do that; I’ve always written my own summaries and found that a lot more useful. People might do it because it’s quick and makes it look like they have worked with the text, even though it isn’t more beneficial than just reading the text. And that appearance of having done work might give students false confidence in how much work they have actually done, and hence how much they have learned.
Low impact: Keyword mnemonic
The “keyword mnemonic” low impact practice is about “building donkey bridges” as we would say in German — finding ways to remember more complex things by memorizing something simple, for example mental images or word sequences. I do that for example to remember the difference between refraction and diffraction, or the order of the planets in the solar system. But apparently it’s not a very useful technique at scale.
Low impact: Imagery for text
Another low impact practice related to mental images: creating mental imagery while reading or listening to texts. This can be helpful, and interestingly enough, the mental image is more useful than actually drawing it out!
Low impact: Rereading
And lastly: rereading. This is what students do A LOT in preparation for exams; reading old material again and again. This is a lot less efficient than the high- and moderate impact practices described above!
So what can we do with this information? As described in the beginning, we can include the higher-impact practices in our planning so students benefit from them without necessarily knowing that it is happening. But then we can also make those techniques explicit when we are using them, and encourage students explicitly to use them in their own studying. And we can point out that highlighting and rereading, for example, might feel like studying, but are much less efficient than those other techniques.
Dunlosky, J., Rawson, K. A., Marsh, E. J., Nathan, M. J., & Willingham, D. T. (2013). Improving students’ learning with effective learning techniques: Promising directions from cognitive and educational psychology. Psychological Science in the Public Interest, 14(1), 4-58.