Tag Archives: literature

Co-creating learning and teaching (Bovill, 2020)

Maybe it was because of the contexts in which I encountered it, but I always perceived “co-creation” as an empty buzzword without any actionable substance to it. I have only really started seeing the huge potential and getting excited about it since I met Catherine Bovill. Cathy and I are colleagues in the Center for Excellence in Education iEarth, and I have attended two of her workshops on “students as partners” and now recently read her book (Bovill, 2020). And here are my main takeaways:

Speaking about students as partners can mean very many, very different things. The partnership between students and teachers is “a collaborative, reciprocal process through which all participants have the opportunity to contribute equally, although not necessarily in the same ways, to curricular or pedagogical conceptualizations, decision-making, implementation, investigation, or analysis” (Cook-Sather et al., 2014). For me, understanding the part about contributing equally, although not necessarily in the same ways, really helped me get over objections like “but I am responsible for what goes on in my course and for students having the best possible environment for their learning. How can I put part of that responsibility on students? And can they even contribute in a meaningful way when they are not experts yet?” The key is that students contribute as equals, but that does not mean that we share responsibility or tasks (or anything, necessarily!) 50/50.

Including students as partners to co-create their learning and teaching has many advantages: the forms of teaching and learning created in such a process are more engaging to students and more human in general. Since the teaching feels more relevant to students, learning is enhanced and becomes more inclusive. Students also experience new roles, which helps them become more independent, secure, and responsible. And it seems to be a lot of fun for the teacher, too, because many new opportunities for positive interactions are created.

“Students as partners” does not mean that one necessarily has to jump into the pool at the deep end and re-design the whole curriculum from scratch. There is a whole continuum of increasing student participation where a teacher only gradually shares more and more control, and every small move towards more participation is a step in the right direction. This includes many smaller steps I’ve implemented in my teaching already, without even realising that that could be counted as working towards “students as partners”!

Some of the small steps suggested in the book that can already have a positive impact include:

  • Reserving one or two lessons at the end of the semester for perspectives or topics that students would like included (which I personally have really good experiences with!).
  • Giving students’ questions back to the group with the question “what do you think? and why?”, thus sharing the power to answer questions rather than claiming it solely for the teacher.
  • Doing a “note-taking relay”: at regular intervals, the teacher stops and gives students time to take notes. Students take notes and then pass them on to their neighbour. At the next note-taking break, they take notes on the piece of paper now in front of them, and then pass it on to the next neighbour. They thus create a documentation of the class with and for each other.
  • Inviting students to create study guides or resources for next year’s students.
  • Inviting them to design infographics, slides, or diagrams on important topics, or to present their own role plays of different theories in fictitious situations, which are then used in teaching their own class.

I think I might have underestimated this last point until now. When I saw my name mentioned in the newsletters of my two favourite podcasts this week, it made me feel super proud! If students feel only a fraction of that pride when their work is featured in a course as something that other people can learn from, it is something we should be doing MUCH MORE!

If you (and they!) so choose, students could also become partners on bigger parts of the course, and especially on designing their own assessment, and in evaluating the class. Here are some examples described in the book:

  • In one of her own courses on the topic of educational research (which probably included how to gather data in order to evaluate teaching and learning), Cathy invited students to pick aspects of her course which they wanted to evaluate, and then work with her to design an evaluation, analyse the data and present their findings.
  • She also describes how she invited Master’s students to co-design dissertation learning outcomes, and how it was possible to include this in the official university regulations: in addition to the outcomes prescribed for all students, each student gets to design one individually in collaboration with their supervisor.
  • Another idea she presents is to give students keywords and let them create their own essay titles including those keywords. Students have the freedom to choose the question they find most interesting related to a certain topic, while the teacher can make sure the keywords they consider important are included. It is then important, though, that students and teacher work together to make sure the scope is right and there is enough literature to answer the question!
  • And it is possible to let students vote on the weighting of different assessment components towards their final grade. This can even be done with boundary conditions, e.g. that each assignment has to count for at least a certain percentage. Apparently the outcomes of such votes do not vary much from year to year, but they still increase student buy-in a lot!

Or, going further along that continuum of students as partners, students can get involved in the whole process of designing, conducting, evaluating and reporting on a course.

  • Cathy presents an example of a business course where student groups come up with business ideas in the beginning and then everybody discusses what students would need to learn in order to make those ideas become reality. Those topics are then presented to each other by different student groups.
  • The point above reminds me of something I heard on a podcast, where the students also got involved in presenting materials and the teacher gave them the choice of which topics they wanted to present themselves and which topics they would prefer taught by the teacher. This sounds like a great idea to give the students the opportunity to pick the topics they are really interested in and at the same time leave the seemingly less attractive topics (or those where they would really value the teacher’s experience in teaching them) to the teacher.
  • In a project I am currently working on with Kjersti and Elin, we bring together students who took a class the previous year with students who are taking it this year, so they can do tank experiments together while working towards different learning outcomes depending on their level. The older students help the younger ones by engaging in dialogue with them and acting as role models, while also “learning through teaching”. We are working on engaging the students in designing the learning environment, and it is super exciting!
  • In a recent iEarth Digital Learning Forum, Mattias and Guro described the process of completely re-designing a course in dialogue between the teacher and a team of students. And not only did they co-design the course, they also presented it together (which is a step that is really easy to forget when the partnership isn’t fully internalized yet!).

I really like the framework of “students as partners” as a reminder to think about including students in a different way, and especially to think about it as a continuum where it’s ok — and even encouraged! — to start small, and then gradually build on it. And I am excited about trying more radical forms of “students as partners” in the future!


Bovill, C. (2020). Co-creating learning and teaching: Towards relational pedagogy in higher education. Critical Publishing.

Cook-Sather, A., Bovill, C., & Felten, P. (2014). Engaging students as partners in learning and teaching: A guide for faculty. John Wiley & Sons.

The learning styles myth (based on Pashler et al., 2008; Nancekivell et al., 2020)

One idea that I encounter a lot in higher education workshops is the idea of learning styles: that some people are “visual learners” who learn best by looking at visual representations of information, that other people learn best from reading, or from listening to lectures, and that those are traits we are born with. When I encounter these ideas, they usually come with the understanding that we, as teachers, should figure out students’ learning styles and cater to each individual student’s style. Even though I haven’t seen that actually happening much in practice, if we take it seriously, it obviously adds a lot of pressure and work, taken on with the best intentions of supporting students in the best possible way.

But learning styles are a bit of a myth. When you ask people, yes, they will tell you about their preferences for learning. And certainly there are people who work better with one representation than with another. But that in itself is not enough to support the whole theory of learning styles.

In a review article, Pashler et al. (2008) show what kind of studies we would need in order to conclude that learning styles actually exist: we would have to separate a group of students based on their learning styles, then teach part of each learning-style group with a method designed for one of the learning styles and the other part with a method designed for the other learning style, and test all groups with the same test.

Pashler et al. (2008) then show what would count as evidence for the existence of learning styles and what would not (which is one reason why I enjoyed reading the article so much; check out their Figure 1!): only if students with one learning style learn best from the method designed for their style, and students with the other learning style learn best from the method designed for theirs, can we conclude that students should be taught using the method that matches their learning style. And Pashler et al. (2008) state that they could not find studies showing that kind of evidence. If one method works better for all students with one learning style but the other method does not work better for students with the other, then we might still consider offering different methods to different people, but clearly the learning style isn’t the criterion we should use to assign methods.
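To make that criterion concrete, here is a minimal Python sketch of the decision rule as I understand it from the article. The group labels and scores are entirely hypothetical, not data from any study:

```python
def supports_learning_styles(scores):
    """Crossover-interaction check: return True only if students with each
    (hypothetical) learning style score highest with the method matched to
    their own style. scores[(style, method)] is the mean test score of
    students with that style taught using that method."""
    styles = ("visual", "verbal")
    return all(
        scores[(style, style)] > max(scores[(style, method)]
                                     for method in styles if method != style)
        for style in styles
    )

# Hypothetical outcomes: only the first pattern (a crossover) would count
# as evidence for matching teaching methods to learning styles.
crossover = {("visual", "visual"): 80, ("visual", "verbal"): 60,
             ("verbal", "visual"): 55, ("verbal", "verbal"): 75}
one_method_wins = {("visual", "visual"): 80, ("visual", "verbal"): 60,
                   ("verbal", "visual"): 75, ("verbal", "verbal"): 60}
```

In the second pattern, the “visual” method is simply better for everyone, so it should be used for all students regardless of their supposed style.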

Then why do so many people believe in learning styles, enough that an entire industry has been built around them? It seems that the myth that our learning style is something we are born with is really common, especially among educators working with young children (Nancekivell et al., 2020). What that means is that the idea that “you are someone who learns best from looking at pictures” or “you are someone who learns best from listening” is propagated from kindergarten teachers to really young kids, and that we are likely to grow up with a belief about how we learn best based on what we were told when we were young. That belief is usually not challenged (and why would we challenge it?), and since it’s a framework that we have accepted for ourselves and others, we are likely to start diagnosing learning styles in others later on and thus keep the myth going. And because the idea is never challenged, we are likely to adopt inefficient strategies based on our belief about what our learning style is.

What does that then mean for which methods we should be using in teaching? Pashler et al. (2008) conclude that we should focus our time and energy on methods for which there is empirical evidence of effectiveness (see for example here). Mixing up representations and including visual, auditory, tactile, … learning is probably still good — only tying it to specific learners and suggesting to them that they are inherently better at learning from one representation over all others, is not. Or if it is, there is no empirical evidence of it.


Nancekivell, S. E., Shah, P., & Gelman, S. A. (2020). Maybe they’re born with it, or maybe it’s experience: Toward a deeper understanding of the learning style myth. Journal of Educational Psychology, 112(2), 221.

Pashler, H., McDaniel, M., Rohrer, D., & Bjork, R. (2008). Learning styles: Concepts and evidence. Psychological Science in the Public Interest, 9(3), 105–119.

Building gender equity in the academy (Laursen & Austin, 2020)

After being “invited” to do some service work because someone noticed “that there was nobody on the committee without a beard” (gee, thanks for making me feel like you appreciate my qualifications!), and then the next day feeling all kinds of stereotype threats triggered in a video call where I noticed I was the only woman out of more than a dozen people, I finally read Sandra Laursen & Ann E. Austin’s book “Building Gender Equity in the Academy. Institutional Strategies for Change” this weekend. And it was great!

The book compiles many years of experience in the NSF’s ADVANCE program into a compelling collection. After setting the scene and describing the structural problems that women face in academia, all based on hard data that show the scope of the issue, the whole book is basically a call to action to “fix the system, not the women” while giving actionable suggestions for how to do this.

The book is structured along four main themes:

  • Many processes in recruitment, hiring, tenure, and promotion are biased, but there are ways to counteract the biases.
  • Workplaces themselves need an overhaul to make them more equitable, for example by addressing institutional leadership, improving climate at departments, or making gender issues more visible.
  • People need to be seen and supported as whole persons if we want to attract diversity into the workplace, for example by supporting dual-career hires, allowing flexibility in work arrangements, or providing adequate accommodation.
  • While we are still working on making the whole system more equitable, individual success of people who are already in the system can be supported by providing grants, development programmes, or mentoring and networking.

For each of the four main themes, four strategies are presented together with different examples of how the strategy has been implemented in one of the ADVANCE projects, and reflections on how it worked.

The authors explain that even though their focus in the book is on gender (because the program that funded the projects they were evaluating was one focussing on gender), all the strategies most likely work for increasing diversity for other characteristics, too.

I found this really interesting from several different perspectives:

  • As someone who wants to support cultural change, I like that this book gives actionable suggestions and reflections on how they worked in different contexts. It will be great to refer back to this book whenever I see potential for changes in policy and procedures, because there will certainly be good ideas in there that have already been tested and that we can build on! For everybody working in uni admin in any capacity, I would totally recommend keeping this book close by.
  • As a woman in science: I used to be very active in, and on the leadership board of, the Earth Science Women’s Network, where I met Sandra and really appreciated her perspective on things (and that she would join me for early morning swims in the lake!). I’m just super happy to see that there is such a great body of work that we can all build on together to change things!
  • As someone who’s getting more and more interested in exploring the literature on faculty development and cultural change, I found this a really good review of the literature related to gender equity in the academy, and a great starting point for quickly finding relevant literature on the topic.

So there is really no reason for anyone to not pick up this book and learn how to build gender equity in the academy :)


Laursen, S., & Austin, A. E. (2020). Building gender equity in the academy: Institutional strategies for change. Baltimore, MD: Johns Hopkins University Press.

Why it’s important to use students’ names, and how to make it easy: use name tents! (After Cooper et al., 2017)

One thing I really enjoy about teaching virtually is that it is really easy to address everybody by their names with confidence, since their names are always right there, right below their faces. But that really does not have to end once we are back in lecture theatres again, because even in large classes, we can always build and use name tents. And voilà: names are there again, right underneath people’s faces!

Sounds a bit silly when there are dozens or hundreds of students in the lecture theatre, both because it has a kindergarten feel and also because there are so many names, some of them too far away to read from the front, and also you can’t possibly address this many students by name anyway? In last week’s CHESS/iEarth workshop, run by Cathy and Mattias on “students as partners”, we touched upon the topic of the importance of knowing students’ names, and that reminded me of an article that I’ve been wanting to write about forever, that actually gives a lot of good reasons for using name tents: “What’s in a name? The importance of students perceiving that an instructor knows their names in a high-enrollment biology classroom” by Cooper et al. (2017). So here we go!

In that biology class with 185 students, the instructors encouraged the regular use of name tents (those folded pieces of paper that students put up in front of themselves), and afterwards their impact was investigated. What the authors found: of the large classes students had taken previously, only 20% of students thought that instructors knew their names, while in this class it was actually 78% (even though in reality, instructors knew only 53% of the names). And 85% of students felt that instructors knowing their names was important. Cooper and colleagues found that it is important for nine different reasons, which fall into three categories:

  1. When students think the instructor knows their names, it affects their attitude towards the class since they feel more valued and also more invested.
  2. Students then also behave differently, because they feel more comfortable asking for help and talking to the instructor in general. They also feel like they are doing better in the class and are more confident about succeeding in class.
  3. It also changes how they perceive the course and the instructor: In the course, it helps them build a community with their peers. They also feel that it helps create relationships between them and the instructor, and that the instructor cares about them, and that the chance of getting mentoring or letters of recommendation from the instructor is increased.

So what does that mean for us as instructors? I agree with the authors that this is a “low-effort, high-impact” practice. Paper tents cost next to nothing and require hardly any preparation on the instructor’s side (other than that it might be helpful to supply some paper). Using them is as simple as asking students to make them, and then regularly reminding them to put them up again (in the class described in the article, this happened both verbally and on the first slide of the presentation). Obviously, we then also need to make use of the name tents and actually call students by their names, and not only the ones in the first row, but also the ones further in the back (walking through the classroom, both while presenting and while students are working in small groups or on their own, for example in a think-pair-share setting, is a good strategy in any case, because it breaks things up and gives more students direct access to the instructor). And in the end, students sometimes even felt that the instructors knew their names when they, in fact, did not, so we don’t actually have to know all the names for positive effects to occur. (But I wonder what happens if students switch name tents for fun and the instructor does not notice. Is that going to affect just the two who switched, or more people, since the illusion has been blown?)

In any case, I will definitely be using name tents next time I’m actually in the same physical space as other people. How about you? (Also, don’t forget to include pronouns! Read Laura Guertin’s blogpost on why)


Cooper, K. M., Haney, B., Krieg, A., & Brownell, S. E. (2017). What’s in a name? The importance of students perceiving that an instructor knows their names in a high-enrollment biology classroom. CBE—Life Sciences Education, 16(1), ar8.

A tool to understand students’ previous experience and adapt your practical courses accordingly — by Kirsty Dunnett

Last week, I wrote about increasing inquiry in lab-based courses and mentioned that it was Kirsty who had inspired me to think about this in a new-to-me way. For several years, Kirsty has been working on developing practical work, and a central part of that has been finding out the types and amount of experience incoming students have with lab work. Knowing this is obviously crucial to adapt labs to what students do and don’t know, and to avoid frustrations on all sides. And she has developed a nifty tool that helps to ask the right questions and then interpret the answers. Excitingly enough, since this is something that will be so useful to so many people and, in light of the disruption to pre-university education caused by Covid-19, the slow route of classical publication is not going to help the students who need help most, she has agreed to share it (for the first time ever!) on my blog!

Welcome, Kirsty! :)

A tool to understand students’ previous experience and adapt your practical courses accordingly

Kirsty Dunnett (2021)

Since March 2020, the Covid-19 pandemic has caused enormous disruption across the globe, including to education at all levels. University education in most places moved online, while the disruption to school students has been more variable, and school students may have missed entire weeks of educational provision without the opportunity to catch up.

From the point of view of practical work in the first year of university science programmes, this may mean that students starting in 2021 have a very different type of prior experience to students in previous years. Regardless of whether students will be in campus labs or performing activities at home, the change in their pre-university experience could lead to unforeseen problems if the tasks set are poorly aligned to what they are prepared for.

Over the past six years, I have been running a survey of new physics students at UCL, asking about their prior experience. It consists of five questions about the types of practical activities students did as part of their pre-university studies. By knowing students better, it is possible to introduce appropriate – and appropriately advanced – practical work that is aligned to where students are when they arrive at university (Dunnett et al., 2020).

The question posed is: “What is your experience of laboratory work related to Physics?”, and the five types of experience are:
1) Designed, built and conducted own experiments
2) Conducted set practical activities with own method
3) Completed set practical activities with a set method
4) Took data while teacher demonstrated practical work
5) Analysed data provided
For each statement, students select one of three options: ‘Lots’, ‘Some’, ‘None’, which, for analysis, can be assigned numerical values of 2, 1, 0, respectively.

The data on its own can be sufficient for aligning practical provision to students (Dunnett et al., 2020).

More insight can be obtained when the five types of experience are grouped in two separate ways.

1) Whether the students would have been interacting with and manipulating the equipment directly. The first three statements are ‘Active practical work’, while the last two are ‘Passive work’ on the part of the student.

2) Whether the students have had decision making control over their work. The first two statements are where students have ‘Control’, while the last three statements are where students are given ‘Instructions’.

Using the values assigned to the levels of experience, four averages are calculated for each student: ‘Active practical work’, ‘Passive work’, ‘Control’, and ‘Instructions’. The number of students with each pair of averages is counted. This splits the data set into one part that considers ‘Practical experience’ (the first two averages) and one that considers ‘Decision making experience’ (the second pair). (Two students with the same ‘Practical experience’ averages can have different ‘Decision making experience’ averages; it is convenient to record the number of times each pair of averages occurs in two separate files.)
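As a minimal sketch of this bookkeeping (my own Python illustration of the procedure as described, not Kirsty’s actual analysis code; the function and variable names are assumptions):

```python
from collections import Counter

VALUE = {"Lots": 2, "Some": 1, "None": 0}

def student_averages(answers):
    """answers: one student's responses to statements 1-5, in order.
    Returns the ('Active', 'Passive') and ('Control', 'Instructions') pairs."""
    v = [VALUE[a] for a in answers]
    active = sum(v[0:3]) / 3        # statements 1-3: hands-on work
    passive = sum(v[3:5]) / 2       # statements 4-5: demonstrations, provided data
    control = sum(v[0:2]) / 2       # statements 1-2: own decision making
    instructions = sum(v[2:5]) / 3  # statements 3-5: set methods / passive work
    return (active, passive), (control, instructions)

def tally(cohort):
    """Count how often each pair of averages occurs, keeping the
    'Practical experience' and 'Decision making experience' views separate."""
    practical, decision = Counter(), Counter()
    for answers in cohort:
        p, d = student_averages(answers)
        practical[p] += 1
        decision[d] += 1
    return practical, decision
```

Dividing each count by the number of students in the cohort gives the fraction of students with that pair of averages.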

To understand the distribution of experience types, one can use each average as a coordinate, so that each pair gives a point on a set of 2D axes, and draw a circle at each point whose radius is determined by the fraction of students in the group who had that pair of averages. Examples are given in the figure.

Figure: Prior experience of Physics practical work for students at UCL who had followed an A-level scheme of studies before coming to university. Circle radius corresponds to the fraction of responses with that pair of averages; the most common pairs (largest circles, over 10% of students) are labelled with the percentages of students. The two years considered here are students who started in 2019 and in 2020. The Covid-19 pandemic did not cause disruption until March 2020, and students’ prior experience appears largely unaffected.

With over a year of significant disruption to education and limited catch-up opportunities, the effects of the pandemic on students starting in 2021 may be significant. This is a quick tool that can be used to identify where students are: by rephrasing the statements of the survey to consider what students are being asked to do in their introductory undergraduate practical work, and adding additional statements if necessary, it provides an immediate check of how students’ prior experience lines up with what they will be asked to do in their university studies.

With a small amount of adjustment to the question and statements as relevant, it should be easy to adapt the survey to different disciplines.

At best, it may be possible to actively adjust the activities to students’ needs. At worst, instructors will be aware of where students’ prior experience may mean they are ill-prepared for a particular type of activity, and be able to provide additional support in session. In either case, the student experience and their learning opportunities at university can be improved through acknowledging and investigating the effects of the disruption caused to education by the Covid-19 pandemic.


Dunnett, K., Kristiansson, M. K., Eklund, G., Öström, H., Rydh, A., & Hellberg, F. (2020). Transforming physics laboratory work from ‘cookbook’ to genuine inquiry. https://arxiv.org/abs/2004.12831

Increasing inquiry in lab courses (inspired by @ks_dnnt and Buck et al., 2008)

My new Twitter friend Kirsty, my old GFI-friend Kjersti and I have been discussing teaching in laboratories. Kirsty recommended an article (well, she did recommend many, but one that I’ve read and since been thinking about) by Buck et al. (2008) on “Characterizing the level of inquiry in the undergraduate laboratory”.

In the article, they present a rubric that I found intriguing: it describes six different phases of laboratory work, and assigns five levels ranging from a “confirmation” experiment to “authentic inquiry”, depending on whether or not instruction is given for the different phases. The “confirmation” level, for example, prescribes everything: the problem or question, the theoretical background, which procedures or experimental designs to use, how the results are to be analysed, how the results are to be communicated, and what the conclusions of the experiment should be. For open inquiry, only the question and theory are provided, and for authentic inquiry, all choices are left to the student.
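One way to think the rubric through is to write it down as a small data structure. This is my own Python sketch based only on the levels as summarised here; the exact phase names and the intermediate levels in Buck et al.’s actual rubric differ in detail:

```python
# Six phases of laboratory work, in the order they occur.
PHASES = ("problem/question", "theory/background", "procedures/design",
          "results analysis", "results communication", "conclusions")

# Which phases are provided to students at the levels described above;
# the rubric's intermediate levels provide progressively fewer phases.
PROVIDED = {
    "confirmation": set(PHASES),  # everything prescribed
    "open inquiry": {"problem/question", "theory/background"},
    "authentic inquiry": set(),   # all choices left to the student
}

def classify(provided_phases):
    """Classify an existing experiment by matching which phases it provides."""
    for level, spec in PROVIDED.items():
        if set(provided_phases) == spec:
            return level
    return "intermediate level"
```

Plugging an existing experiment in then just means listing which phases its instructions prescribe and seeing which row it matches.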

The rubric is intended as a tool to classify existing experiments rather than to design new ones or modify existing ones, but because that’s my favourite way to think things through, I tried plugging my favourite “melting ice cubes” experiment into the rubric. Had I thought about it a little longer beforehand, I might have noticed that I would only be copying fewer and fewer cells going from left to right, but even though it sounds like a silly thing to do in retrospect, it was actually still helpful to go through the exercise.

It also made me realize the implications of Kirsty’s heads-up regarding the rubric: “it assumes independence at early stages cannot be provided without independence at later stages”. That is obviously a big limitation; one can think of many other ways to use experiments where things like how results are communicated, or even the conclusion, are provided, while earlier steps are left open for the student to decide. Providing guidance on how to analyse results without prescribing the experimental design might also be really interesting! So while I was super excited at first to use this rubric to provide an overview of all the different ways labs can possibly be structured, it is clearly not comprehensive. And a better idea than making a comprehensive rubric would probably be to really think about why instruction for any of the phases should or should not be provided. A little less cook-book, a little more thought here, too! But it is still a helpful framework to spark thoughts and conversations.

Also, my way of going from one level to the next by simply withholding instruction and information is not the best way to go about it (even though I think it works ok in this case). Since the “melting ice cubes” experiment shows unexpected results, it usually organically leads into open inquiry, as people tend to start asking “what would happen if…?” questions, which I then encourage them to pursue (but this usually only happens in a second step, after they have already run the experiment “my way” first). This relates well to “secret objectives” (Bartlett and Dunnett, 2019), where a discrepancy appears between what students expect based on previous information and what they then observe in reality (in the “melting ice cubes” case, for example, students expect to observe one process and find out that another one dominates), and where many jumping-off points exist for further investigation, e.g. the condensation pattern on the cups, or the variation of parameters (what if the ice was forced to the bottom of the cup? what is the influence of the exact temperatures or the water depth, …?).

Introducing an element of surprise might generally be a good idea to spark interest and inquiry. Huber & Moore (2001) suggest using “discrepant events” to initiate discussions (their example is dropping raisins into carbonated drinks, where they first sink to the bottom and then rise as gas bubbles attach to them, only to sink again when the bubbles break upon reaching the surface). They suggest following up the observation of the discrepant event with a “can you think of a way to…?” question (e.g., to make the raisins rise faster to the surface), which is then followed by brainstorming many different ideas. Later, students are asked “can you find a way to make it happen?”, meaning they pick one of their ideas and design and conduct an experiment. Huber & Moore (2001) suggest a last step, in which students are asked to produce a graphical representation of their results or some other product, and “defend” it to their peers.

In contrast to how I run my favourite “melting ice cubes” experiment when I am instructing it in real time, I use a lot of confirmation experiences in other formats, for example in my advent calendar “24 days of #KitchenOceanography”. How could they be re-imagined to lead to more investigation and less cook-book-style confirmation, especially when presented on a blog or social media? Ha, you would like to know, wouldn’t you? I’ve started working on that, but since it’s not December yet, you will have to wait a little! :)

I’m also quite intrigued by the “product” that students are asked to produce after their experimentation, and by what would make a good type of product to ask for. In the recent iEarth teaching conversations, Torgny has been speaking of “tangible traces of learning” (in quotation marks, which makes me think there is definitely more behind that term than I realize, but so far my brief literature search has been unsuccessful). But maybe that’s why I like blogging so much, because it makes me read articles all the way to the end, think a little more deeply about them, and put the thoughts into semi-cohesive words, thus giving me tangible proof of learning (that I can even google later to remind me what I thought at some point)? Then again, maybe everybody should be allowed to find their own kind of product to produce, depending on what works best for them. On the other hand, for the iEarth teaching conversations, I really like the format of one page of text, maximum, because I really have to focus and edit it (not so much space for rambling on as on my blog, but a substantially higher time investment… ;-)). Also, I think giving some kind of guidance is helpful, both to avoid students getting spoilt for choice, and to make sure they focus their time and energy on things that help them reach the learning outcomes. Cutting videos, for example, might be a great skill to develop, but it might not be the one you want to develop in your course. Or maybe you do, or maybe the motivational effects of letting students choose are more important, in which case that’s great, too! One thing that we’ve done recently is to ask students to write blog or social media posts instead of classical lab reports, and that worked out really well and seems to have motivated them a lot (check out Johanna Knauf’s brilliant comic!!!).

Kirsty also mentioned a second point regarding the Buck et al. (2008) rubric to keep in mind: it is just about what is provided by the teacher, not about the students’ role in all this. That’s an easy trap to fall into, and one that I don’t have any smart ideas about right now. And I am looking forward to discussing more thoughts on this, Kirsty :)

In any case, the rubric made me think about inquiry in labs in a new way, and that’s always a good thing! :)


Bartlett, P. A. and K. Dunnett (2019). Secret objectives: promoting inquiry and tackling preconceptions in teaching laboratories. arXiv:1905.07267v1 [physics.ed-ph]

Buck, L. B., Bretz, S. L., & Towns, M. H. (2008). Characterizing the level of inquiry in the undergraduate laboratory. Journal of college science teaching, 38(1), 52-58.

Huber, R. A., & Moore, C. J. (2001). A model for extending hands-on science to be inquiry based. School Science and Mathematics, 101(1), 32–41.

“Wonder questions” and geoscience misconceptions.

Recently, as part of the CHESS/iEarth Summer School, Kikki Kleiven led a workshop on geoscience teaching. She gave a great overview of how to approach teaching and presented many engaging methods (like, for example, concept cartoons and role plays), but two things especially sparked my interest, so I read up on them a little more: “wonder questions” and misconceptions in geosciences.

“Wonder questions”

The first topic that prompted a little literature search was “wonder questions”, and I found a recent article by Lindstrøm (2021) on the topic that describes the three ways in which “wonder questions” are a powerful pedagogical tool:

  1. they support and stimulate student learning: When students are asked to come up with “wonder questions”, they need to consider what they just learned and how it fits (or doesn’t fit) with what they already knew before. They need to think new thoughts and actively look for connections, both of which help them learn.
  2. they model scientists’ behavior: Asking good questions is a skill that needs practice!
  3. they can be a powerful motivator for students and teachers alike: As a teacher, it’s great to see what questions students come up with, and it helps tailor the teaching to what’s really relevant to the students. Seeing their questions taken up in teaching, on the other hand, gives students agency and makes them feel heard.

Lindstrøm distinguishes four types of wonder questions that she typically encounters, and which are useful in different ways:

  • Questions where students rephrase a concept and want confirmation that they understood something correctly help them make sure they are on the right track, and also confirm to the teacher that they are. Those questions can also be used in future teaching to paraphrase the material in the students’ own words.
  • Questions that are very close to course content and bring in real-world examples are great to make sure the examples used in (future) classes are actually relevant to students’ lives.
  • Questions that go beyond the course content are also useful to clarify what is going to be taught in this specific course and what other courses will build on it. They can also open up doors for future (student) research projects.
  • Questions that reveal misconceptions are great because we can only address misconceptions if we know about them in the first place.

Which brings us to the next topic Kikki inspired me to revisit:

Geoscience misconceptions

Kikki mentioned the article “A compilation and review of over 500 geoscience misconceptions” by Francek (2013). I’m familiar with misconceptions in physics (especially the ones related to hydrostatics and rotating systems & Coriolis force that I’ve worked with), and within iEarth there has been a lot of talk about how students don’t understand geological time (which I don’t have a good grasp of, either). But reading the “500” in the title was enough to make me want to check out the article to get an idea of what other misconceptions might be relevant for my own teaching. And it turns out there are plenty to choose from!

Many of the misconceptions that are particularly relevant for my own interests were originally collected by Kent Kirkby (2008) as “easier to address” misconceptions, for example on science, ocean systems, glaciers, and climate:

  • “Upwelling occurs as deeper water layers warm and rise ([…] tied to students’ knowledge of how air masses are affected by temperature).”
  • “Upwelling occurs as deeper water layers lose their salinity and rise (students like symmetry!).”
  • “Glacial ice moves backwards during glacial ‘retreats’ (like everything that retreats in real life)”
  • “Glacial ice is stationary during times when front is neither advancing or retreating.”
  • “Earth’s climate is controlled primarily by the atmosphere circulation, rather than ocean circulation (real life experiences as a terrestrial animal, TV weather reports)”

Reading through that list is really interesting and a good reminder that there are a lot of things that we take for granted but that are really not as obvious as we might have come to believe over the years. And the misconceptions are only “easy to address” (and one way of addressing them is through “elicit, confront, resolve”) when we are aware of them in the first place.

Francek, M. (2013). A compilation and review of over 500 geoscience misconceptions. International Journal of Science Education, 35(1), 31-64.

Lindstrøm, C. (2021). The pedagogical power of Wonder Questions. The Physics Teacher, 59(4), 275-277.

Why should students want to engage in something that changes their identity as well as their view of themselves in relation to friends and family?

Another iEarth Teaching Conversation with Kjersti Daae and Torgny Roxå, summarized by Mirjam Glessmer

“Transformative experiences” (Pugh et al., 2010) are those experiences that change the way a person looks at the world, so that they henceforth voluntarily engage in a new-to-them practice of sensemaking on this new topic, and perceive it as valuable. There are methods to facilitate transformative experiences for teaching purposes (Pugh et al., 2010), and discovering this felt like the theoretical framework I had been looking for for #WaveWatching had just fallen into my lap. But then Torgny asked the question in the title above. For many academics, seeing the world through new eyes, being asked questions they haven’t asked themselves before, discovering gaps in their argumentations, surrendering to a situation (Pugh, 2011), engaging in sensemaking (Odden and Russ, 2019), or being part of a community of practice (Wenger, 2011) is fun. Not in all contexts and on all topics, of course, but at least in many contexts. But can we assume it’s the same for students?

In order to feel that you want to take on a challenge in which you don’t know whether or not you’ll succeed, a crucial condition is that you believe that your intelligence and your skills can be developed (Dweck, 2015). A growth mindset can be cultivated by the kind of feedback we give students (Dweck, 2015). The scaffolding (Wood et al., 1976) we provide, and the opportunities for creating artefacts as tangible proof of learning* can support this. But how do we get students to engage in the first place?

One approach, the success of which I have anecdotal evidence for, could be to use surprising gimmicks like a DIY fortune teller or a paper clip to be shaped into a spinning top to raise intrigue, if not for the topic itself right away, then for something that will later be related to the topic, hoping that the engagement with the object can be transferred to the topic.

Another approach, which also aligns with my personal experience, might be to let students experience the relevance of a situation vicariously, infecting students with the teacher’s enthusiasm for a topic (Hodgson, 2005). However, Torgny raised the point that sometimes the (overly?) enthusiastic teacher themselves could become the subject of student fascination, thus diverting attention from the topic they wanted the students to engage with.

A third way might be to point out the alignment of tasks with the students’ own goals & identities. Growth mindset interventions can increase domain-specific desire to learn (Burnette et al., 2020), identity interventions can increase the likelihood of engagement, for example by targeting physics identity (Wulff et al., 2018), and goal-setting interventions can improve academic performance (Morisano et al., 2010).

I want to relate these three ideas to feelings of competence, relatedness and autonomy, which are the three basic requirements for intrinsic motivation (Ryan & Deci, 2017), but I am sadly out of space. But I think that self-determination theory is a useful lens to keep in mind when developing teaching.

References:

  • Burnette, J. L., Hoyt, C. L., Russell, V. M., Lawson, B., Dweck, C. S., & Finkel, E. (2020). A growth mind-set intervention improves interest but not academic performance in the field of computer science. Social Psychological and Personality Science, 11(1), 107-116.
  • Dweck, C. (2015). Carol Dweck revisits the growth mindset. Education Week, 35(5), 20-24.
  • Hodgson, V. (2005). Lectures and the experience of relevance. In Experience of learning: Implications for teaching and studying in higher education, eds. F. Marton, D. Hounsell, and N. Entwistle, vol. 3, 159-71. Edinburgh: University of Edinburgh, Centre for Teaching, Learning and Assessment.
  • Odden, T. O. B., & Russ, R. S. (2019). Defining sensemaking: Bringing clarity to a fragmented theoretical construct. Science Education, 103(1), 187-205.
  • Morisano, D., Hirsh, J. B., Peterson, J. B., Pihl, R. O., & Shore, B. M. (2010). Setting, elaborating, and reflecting on personal goals improves academic performance. Journal of Applied Psychology, 95(2), 255-264. https://doi.org/10.1037/a0018478
  • Pugh, K. J., Linnenbrink-Garcia, L., Koskey, K. L., Stewart, V. C., & Manzey, C. (2010). Teaching for transformative experiences and conceptual change: A case study and evaluation of a high school biology teacher’s experience. Cognition and Instruction, 28(3), 273-316.
  • Pugh, K. J. (2011). Transformative experience: An integrative construct in the spirit of Deweyan pragmatism. Educational Psychologist, 46(2), 107-121.
  • Ryan, R. M., & Deci, E. L. (2017). Self-determination theory: Basic psychological needs in motivation, development, and wellness. New York: Guilford.
  • Wenger, E. (2011). Communities of practice: A brief introduction.
  • Wood, D., Bruner, J. S., & Ross, G. (1976). The role of tutoring in problem solving. Journal of Child Psychology and Psychiatry, 17(2), 89-100.
  • Wulff, P., Hazari, Z., Petersen, S., & Neumann, K. (2018). Engaging young women in physics: An intervention to support young women’s physics identity development. Physical Review Physics Education Research, 14(2), 020113.

*Very nice example by Kjersti: Presenting students (or fathers-in-law) with a few simple ideas about rotating fluid dynamics enables them to combine the ideas to draw a schematic of the Hadley cell circulation. Which is a lot more engaging and satisfying than being presented with a schematic and someone talking you through it. If you are willing to surrender to the experience in the first place…

#WaveWatching as “transformative experience”? (Based on articles by Pugh et al. 2019, 2011, 2010)

I was reading an article on “active learning” by Lombardi et al. (2021), when the sentence “In undergraduate geoscience, Pugh et al. (2019) found that students who made observations of the world and recognized how they might be explained by concepts from their classes were more likely to stay in their major than those who do not report this experience” jumped out at me. Something about observing the world and connecting it to ideas from class was so intriguing that I had to go down that rabbit hole and see where this statement was coming from, and if it might help me as a theoretical framework for thinking about #WaveWatching (which I’ve been thinking about a lot since the recent teaching conversation).

Going into that Pugh et al. (2019) article, I learned about a concept called “transformative experience”, which I followed back to Pugh (2011): A transformative experience happens when students see the world with new eyes, because they start connecting concepts from class with their real everyday lives. There is a quote at the beginning of that article which reminds me very much of what people say about wave watching (except that in the quote the person talks about clouds): that once they’ve started seeing patterns because they understood that what they look at isn’t chaotic but can be explained, they cannot go back to just looking at the beauty of it without questioning why it came to be that way. They now feel the urge to make sense of the patterns they see, every time they come across anything related to the topic.

Pugh (2011) describes three characteristics of transformative experiences:

  • they are done voluntarily out of intrinsic motivation (meaning that the application of class concepts is not required by the teacher or some other authority),
  • they expand perception (when the world is now seen through the subject’s lens and looks different than before), and
  • they have experiential value (meaning the person experiencing them perceives them as adding value to their lives).

And it turns out that facilitating such transformative experiences might well be what distinguishes schools with higher student retention from those with lower student retention in Pugh et al.’s 2019 study!

But how can we, as teachers, facilitate transformative experiences? Going another article further down the rabbit hole to Pugh et al. (2010), this is how!

The “Teaching for Transformative Experiences” model consists of three methods acting together:

  • framing content in a way that the “experiential value” becomes clear, meaning making an effort to explain the value that perceiving the world in such a way adds to our lives. This can be done by expressing the feelings it evokes or the usefulness it adds. For #WaveWatching, I talk about how much I enjoy the process, but also how making sense of an aspect of the world that first seemed chaotic is both satisfying and calming to me. But framing in terms of the value of the experience can also be done through metaphors, for example about the tales that rocks, trees, or coastlines could tell. Similarly, when I speak about “kitchen oceanography”, I hope that it raises curiosity about how we can learn about the ocean in a kitchen.
  • scaffolding how students look at the world by helping them change lenses step by step, i.e. “re-seeing”, for example by pointing out specific features, observing them together, talking through observations or providing opportunities to share and discuss observations (so pretty much my #WaveWatching process!).
  • modeling transformative experiences, i.e. sharing what and how we perceive our own transformative experiences, in order to show students that it’s both acceptable and desirable to see the world in a certain way, and communicate about it. I do this both in person as well as whenever I post about #WaveWatching online.

So it seems that I have been creating transformative experiences with #WaveWatching all this time without knowing it! Or at least that this framework works really well to describe the main features of #WaveWatching.

Obviously I have only just scratched the surface of the literature on transformative experiences, but I have a whole bunch of articles open on my desktop already, about case studies of facilitating transformative experiences in teaching. And I cannot wait to dig in and find out what I can learn from that research and apply it to improve #WaveWatching! :)

Lombardi, D., Shipley, T. F., & Astronomy Team, Biology Team, Chemistry Team, Engineering Team, Geography Team, Geoscience Team, and Physics Team. (2021). The curious construct of active learning. Psychological Science in the Public Interest, 22(1), 8-43.

Pugh, K. J., Phillips, M. M., Sexton, J. M., Bergstrom, C. M., & Riggs, E. M. (2019). A quantitative investigation of geoscience departmental factors associated with the recruitment and retention of female students. Journal of Geoscience Education, 67(3), 266-284.

Pugh, K. J. (2011). Transformative experience: An integrative construct in the spirit of Deweyan pragmatism. Educational Psychologist, 46(2), 107-121.

Pugh, K. J., Linnenbrink-Garcia, L., Koskey, K. L., Stewart, V. C., & Manzey, C. (2010). Teaching for transformative experiences and conceptual change: A case study and evaluation of a high school biology teacher’s experience. Cognition and Instruction, 28(3), 273-316.

What does “sensemaking” really mean in the context of learning about science? (Reading Odden & Russ, 2019)

I read the article “Defining sensemaking: Bringing clarity to a fragmented theoretical construct” by Odden and Russ (2019), and there are two main things I loved about it: I realized that “sensemaking” is the name of an activity I immensely enjoy under certain conditions, and being able to put words to that activity made me really happy! And I found it super helpful that the differences between “sensemaking” and other concepts like “explaining” or “thinking” were pointed out, because that gave me an even clearer idea of what is meant by “sensemaking”.

What is sensemaking? The definition given in the Odden and Russ (2019) article is simple:

Sensemaking is a dynamic process of building or revising an explanation in order to “figure something out”—to ascertain the mechanism underlying a phenomenon in order to resolve a gap or inconsistency in one’s understanding.

Odden and Russ discuss that in the educational science literature, sensemaking has previously been used to mean three different things that can all be reconciled under this definition, but that have mostly been discussed independently before:

  1. An approach to learning: Sensemaking can mean really wanting to figure something out by yourself — making sense of an intriguing problem by bringing together what you know, asking yourself questions, building and testing hypotheses, but not asking other people for the correct solution. This is my approach to escape games, for example — I hate using the help cards! I know that it should be possible to figure the puzzles out, so I want to do it myself! This approach is obviously desirable in science learners, since they are not just relying on memorizing responses or assembling surface-level knowledge. They really want to make sense out of something that did not make sense before.
  2. A cognitive process: In this sense, sensemaking is really about how students bring together pieces of previous knowledge and experiences, and new knowledge, and how they integrate them to form a new and bigger coherent structure, for example by using analogies or metaphors.
  3. A way of communicating: Sensemaking then is the collaborative effort to make sense by bringing together different opinions or to construct an explanation, and then critiquing it in order to make sure the arguments are watertight. This can happen both using technical terms and everyday language.

And now how is “sensemaking” different from other, seemingly similar terms? (Or, as the authors say, how can we differentiate sensemaking “from other ‘good’ things to do when learning science”?) This is my summary of the arguments from the article:

Thinking. Compared with sensemaking, thinking is a lot broader. One can do a lot of thinking without attempting to create any new sense. Thinking does not require the critical approach that is essential to sensemaking.

Learning. While sensemaking is a form of learning, there are a lot of other forms that don’t include sensemaking, for example memorization.

Explaining. Sensemaking requires the process of “making sense” of something that previously did not make sense; explaining does not necessarily require that. Depending on the context, explanations can well be generated from previous knowledge without building any new relationships.

Argumentation. Argumentation is a much wider term than sensemaking — one can for example argue with the goal of persuading someone else rather than building a common understanding and making sense out of information.

Modeling. There is a great overlap between modeling and sensemaking, but sensemaking is typically more dynamic and short-term, whereas modeling is a more formal activity that can take place over days and weeks, sometimes with the purpose of communicating ideas.

I found reading this article enlightening because it gives me a language to talk about sensemaking, to articulate nuances, that I previously did not have. By reflecting on situations where I really enjoy sensemaking (another example is wave watching: I am trying to make sense of what I see by running through questions in my head. Can I observe what causes the waves? Is their behavior consistent with what I would expect given what I can observe about the topography? If not, what does that tell me about the topography in places where I can’t observe it?) and on others where I don’t (thinking of times in school when I did not see the point of trying to make sense out of something [as in make all the individual pieces of previous knowledge and new information fit together coherently without conflict] and just needed to go through the motions of it to pass a test or something), I find it intriguing to think about why I sometimes engage in the process and enjoy it, and sometimes I don’t even try to engage.

How does it work for you? Do you know under what conditions you engage in sensemaking, and under which you don’t?

Odden, T. O. B., & Russ, R. S. (2019). Defining sensemaking: Bringing clarity to a fragmented theoretical construct. Science Education, 103(1), 187-205.