
Structuring local, inquiry-based field work (Praskievicz, 2022)

I am catching up on my reading for the iEarth Journal Club! This month's article is very much in line with my recent thinking on place-based learning, on active lunch breaks that connect disciplinary content to everyday experience (and reconnect with the fun of it), and on our forthcoming vignette in a Teaching Fieldwork book, in which Kjersti, Hans-Christian and I suggest an (even more) structured method for doing fieldwork (blog post with more details here).

A tool to understand students’ previous experience and adapt your practical courses accordingly — by Kirsty Dunnett

Last week, I wrote about increasing inquiry in lab-based courses and mentioned that it was Kirsty who had inspired me to think about this in a new-to-me way. For several years, Kirsty has been working on developing practical work, and a central part of that has been finding out what types and amounts of experience incoming students have with lab work. Knowing this is obviously crucial for adapting labs to what students do and don't know, and for avoiding frustrations on all sides. And she has developed a nifty tool that helps to ask the right questions and then interpret the answers. Excitingly enough, she has agreed to share it (for the first time ever!) on my blog, since this is something that will be useful to so many people, and, in light of the disruption to pre-university education caused by Covid-19, the slow route of classical publication is not going to help the students who need help most!

Welcome, Kirsty! :)

A tool to understand students’ previous experience and adapt your practical courses accordingly

Kirsty Dunnett (2021)

Since March 2020, the Covid-19 pandemic has caused enormous disruption across the globe, including to education at all levels. University education in most places moved online, while the disruption at school level has been more variable: some school students may have missed entire weeks of educational provision without the opportunity to catch up.

From the point of view of practical work in the first year of university science programmes, this may mean that students starting in 2021 have a very different type of prior experience from students in previous years. Regardless of whether students will be in campus labs or performing activities at home, the change in their pre-university experience could lead to unforeseen problems if the tasks set are poorly aligned with what they are prepared for.

Over the past six years, I have been running a survey of new physics students at UCL, asking about their prior experience. It consists of five statements about the types of practical activities students did as part of their pre-university studies. By knowing students better, it is possible to introduce appropriate – and appropriately advanced – practical work that is aligned with where students are when they arrive at university (Dunnett et al., 2020).

The question posed is: “What is your experience of laboratory work related to Physics?”, and the five types of experience are:
1) Designed, built and conducted own experiments
2) Conducted set practical activities with own method
3) Completed set practical activities with a set method
4) Took data while teacher demonstrated practical work
5) Analysed data provided
For each statement, students select one of three options: ‘Lots’, ‘Some’, ‘None’, which, for analysis, can be assigned numerical values of 2, 1, 0, respectively.

The data on its own can be sufficient for aligning practical provision to students (Dunnett et al., 2020).

More insight can be obtained when the five types of experience are grouped in two separate ways.

1) Whether the students would have been interacting with and manipulating the equipment directly. The first three statements are ‘Active practical work’, while the last two are ‘Passive work’ on the part of the student.

2) Whether the students have had decision making control over their work. The first two statements are where students have ‘Control’, while the last three statements are where students are given ‘Instructions’.

Using the values assigned to the levels of experience, four averages are calculated for each student: ‘Active practical work’ and ‘Passive work’; ‘Control’ and ‘Instructions’. The number of students with each pair of averages is then counted. This splits the data set in two: one part considers ‘Practical experience’ (the first pair of averages), the other ‘Decision making experience’ (the second pair). (Two students with the same ‘Practical experience’ averages can have different ‘Decision making experience’ averages; it is convenient to record the number of times each pair of averages occurs in two separate files.)
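To make the bookkeeping concrete, here is a minimal Python sketch of how the scoring, averaging, and pair counting could be done (my own illustration, not Kirsty's actual analysis code; the example data and all names are made up):

```python
from collections import Counter

# Numerical values for the three answer options.
SCORE = {"Lots": 2, "Some": 1, "None": 0}

def experience_averages(responses):
    """responses: the five answers ('Lots'/'Some'/'None'), in statement order 1-5."""
    s = [SCORE[r] for r in responses]
    active = sum(s[0:3]) / 3        # statements 1-3: active practical work
    passive = sum(s[3:5]) / 2       # statements 4-5: passive work
    control = sum(s[0:2]) / 2       # statements 1-2: student has control
    instructions = sum(s[2:5]) / 3  # statements 3-5: student follows instructions
    return (active, passive), (control, instructions)

# Made-up example data; in practice this would come from the survey export.
all_responses = [
    ["None", "Some", "Lots", "Lots", "Some"],
    ["Some", "Lots", "Lots", "Some", "None"],
    ["None", "Some", "Lots", "Lots", "Some"],
]

# Two separate tallies, since students sharing one pair of averages
# can differ on the other.
practical, decision = Counter(), Counter()
for responses in all_responses:
    prac_pair, dec_pair = experience_averages(responses)
    practical[prac_pair] += 1
    decision[dec_pair] += 1
```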

To understand the distribution of the experience types, one can use each average as a co-ordinate – so each pair of averages gives a point on a set of 2D axes – and draw a circle at each point, with its radius determined by the fraction of students in the group who had that pair of averages. Examples are given in the figure.
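Such a figure could be drawn along these lines (a minimal matplotlib sketch, continuing from the hypothetical `practical` tally above; note that scatter sizes are areas in points², so a radius proportional to the fraction means squaring it):

```python
import matplotlib.pyplot as plt

# 'practical' is the Counter of ('Active', 'Passive') average pairs from above.
total = sum(practical.values())
pairs = list(practical)
xs = [active for active, passive in pairs]
ys = [passive for active, passive in pairs]
# Radius proportional to the fraction of students; `s` is an area in
# points^2, so square the radius. 60 is an arbitrary scale factor.
sizes = [(practical[p] / total * 60) ** 2 for p in pairs]

plt.scatter(xs, ys, s=sizes, alpha=0.6)
plt.xlabel("'Active practical work' average")
plt.ylabel("'Passive work' average")
plt.xlim(-0.2, 2.2)
plt.ylim(-0.2, 2.2)
plt.title("Practical experience")
plt.show()
```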

Figure: Prior experience of Physics practical work for students at UCL who had followed an A-level scheme of studies before coming to university. Circle radius corresponds to the fraction of responses with that pair of averages; the most common pairs (largest circles, over 10% of students) are labelled with the percentages of students. The two years considered here are students who started in 2019 and in 2020. The Covid-19 pandemic did not cause disruption until March 2020, and students’ prior experience appears largely unaffected.

With over a year of significant disruption to education and limited catch-up opportunities, the effects of the pandemic on students starting in 2021 may be significant. This is a quick tool that can be used to identify where students are. By rephrasing the statements of the survey to consider what students are being asked to do in their introductory undergraduate practical work – and adding additional statements if necessary – it provides an immediate check of how students’ prior experience lines up with what they will be asked to do in their university studies.

With a small amount of adjustment to the question and statements as relevant, it should be easy to adapt the survey to different disciplines.

At best, it may be possible to actively adjust the activities to students’ needs. At worst, instructors will be aware of where students’ prior experience may mean they are ill-prepared for a particular type of activity, and be able to provide additional support in session. In either case, the student experience and their learning opportunities at university can be improved through acknowledging and investigating the effects of the disruption caused to education by the Covid-19 pandemic.


Dunnett, K., Kristiansson, M. K., Eklund, G., Öström, H., Rydh, A., & Hellberg, F. (2020). Transforming physics laboratory work from ‘cookbook’ to genuine inquiry. https://arxiv.org/abs/2004.12831

Increasing inquiry in lab courses (inspired by @ks_dnnt and Buck et al., 2008)

My new Twitter friend Kirsty, my old GFI friend Kjersti, and I have been discussing teaching in laboratories. Kirsty recommended an article (well, she recommended many, but this is the one I’ve read and have been thinking about since) by Buck et al. (2008) on “Characterizing the level of inquiry in the undergraduate laboratory”.

In the article, they present a rubric that I found intriguing: It identifies six different phases of laboratory work and then distinguishes five levels, ranging from a “confirmation” experiment to “authentic inquiry”, depending on whether or not instruction is given for each phase. The “confirmation” level, for example, prescribes everything: the problem or question, the theoretical background, which procedures or experimental designs to use, how the results are to be analysed, how the results are to be communicated, and what the conclusions of the experiment should be. For an open inquiry, only the question and theory are provided, and for authentic inquiry, all choices are left to the student.

The rubric is intended as a tool for classifying existing experiments rather than for designing new ones or modifying existing ones, but because that’s my favourite way to think things through, I tried plugging my favourite “melting ice cubes” experiment into the rubric. Had I thought about it a little longer beforehand, I might have noticed that, going from one level to the next, I would only be copying fewer and fewer cells from the left. Even though it sounds like a silly thing to do in retrospect, it was actually still helpful to go through the exercise.
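To make that “fewer and fewer cells from the left” structure explicit, here is a tiny Python sketch (my own encoding of the levels as described above, not Buck et al.’s exact table; the intermediate levels each provide one phase fewer than the one before):

```python
# The six phases of laboratory work, in rubric order.
PHASES = [
    "problem/question",
    "theory/background",
    "procedures/design",
    "results analysis",
    "results communication",
    "conclusions",
]

def provided(k):
    """At a given level, the first k phases are prescribed by the instructor."""
    return PHASES[:k]

# The three levels spelled out in the text above.
levels = {
    "confirmation": provided(6),       # everything is prescribed
    "open inquiry": provided(2),       # only question and theory are given
    "authentic inquiry": provided(0),  # all choices left to the student
}

for name, given in levels.items():
    open_phases = [p for p in PHASES if p not in given]
    print(f"{name}: student decides on {open_phases or 'nothing'}")
```

Encoding it this way makes it obvious that only prefixes of the list can be expressed, which is exactly the limitation discussed next.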

It also made me realize the implications of Kirsty’s heads-up regarding the rubric: “it assumes independence at early stages cannot be provided without independence at later stages”. That is obviously a big limitation; one can think of many other ways to use experiments where things like how results are communicated, or even the conclusion, are provided, while earlier steps are left open for the student to decide. Providing guidance on how to analyse results without prescribing the experimental design might also be really interesting! So while I was super excited at first to use this rubric to provide an overview over all the different ways labs can possibly be structured, it is clearly not comprehensive. And a better idea than making a comprehensive rubric would probably be to really think about why instruction for any of the phases should or should not be provided. A little less cook-book, a little more thought here, too! But still a helpful framework to spark thoughts and conversations.

Also, my way of going from one level to the next by simply withholding instruction and information is not the best way to go about it (even though I think it works ok in this case). Since the “melting ice cubes” experiment shows unexpected results, it usually leads organically into open inquiry: people tend to start asking “what would happen if…?” questions, which I then encourage them to pursue (but this usually only happens in a second step, after they have already run the experiment “my way” first). This relates well to “secret objectives” (Bartlett and Dunnett, 2019), where a discrepancy appears between what students expect based on previous information and what they then observe in reality (in the “melting ice cubes” case, students expect to observe one process and find out that another one dominates), and where many jumping-off points exist for further investigation, e.g. the condensation pattern on the cups, or the variation of parameters (what if the ice was forced to the bottom of the cup? what is the influence of the exact temperatures, or of the water depth, …?).

Introducing an element of surprise might generally be a good idea to spark interest and inquiry. Huber & Moore (2001) suggest using “discrepant events” to initiate discussions (their example is dropping raisins into carbonated drinks, where they first sink to the bottom and then rise as gas bubbles attach to them, only to sink again when the bubbles break upon reaching the surface). They then suggest following up the observation of the discrepant event with a “can you think of a way to…?” question (e.g. to make the raisin rise faster to the surface), which is followed by brainstorming many different ideas. Later, students are asked “can you find a way to make it happen?”, which means that they pick one of their ideas and design and conduct an experiment. As a last step, Huber & Moore (2001) suggest that students produce a graphical representation of their results, or some other product, and “defend” it to their peers.

In contrast to how I run my favourite “melting ice cubes” experiment when I am instructing it in real time, I use a lot of confirmation experiments, for example in my advent calendar “24 days of #KitchenOceanography”. How could those be re-imagined to lead to more investigation and less cook-book-style confirmation, especially when presented on a blog or on social media? Ha, you would like to know, wouldn’t you? I’ve started working on that, but it’s not December yet, so you will have to wait a little! :)

I’m also quite intrigued by the “product” that students are asked to produce after their experimentation, and by what would make a good type of product to ask for. In the recent iEarth teaching conversations, Torgny has been speaking of “tangible traces of learning” (in quotation marks, which makes me think there is definitely more behind that term than I realize, but so far my brief literature search has been unsuccessful). Maybe that’s why I like blogging so much: it makes me read articles all the way to the end, think a little more deeply about them, and put the thoughts into semi-cohesive words, thus giving me tangible proof of learning (that I can even google later to remind myself of what I thought at some point).

Maybe, then, everybody should be allowed to find their own kind of product to produce, depending on what works best for them. On the other hand, for the iEarth teaching conversations, I really like the format of one page of text, maximum, because I really have to focus and edit it (not as much space for rambling on as on my blog, but a substantially higher time investment… ;-)). I also think giving some kind of guidance is helpful, both to avoid students being spoilt for choice, and to make sure they focus their time and energy on things that support the learning outcomes. Cutting videos, for example, might be a great skill to develop, but it might not be the one you want students to develop in your course. Or maybe you do, or maybe the motivational effects of letting them choose are more important, in which case that’s great, too! One thing that we’ve done recently is to ask students to write blog or social media posts instead of classical lab reports, and that worked out really well and seems to have motivated them a lot (check out Johanna Knauf’s brilliant comic!!!).

Kirsty also mentioned a second point regarding the Buck et al. (2008) rubric to keep in mind: it is just about what is provided by the teacher, not about the students’ role in all this. That’s an easy trap to fall into, and one that I don’t have any smart ideas about right now. And I am looking forward to discussing more thoughts on this, Kirsty :)

In any case, the rubric made me think about inquiry in labs in a new way, and that’s always a good thing! :)


Bartlett, P. A., & Dunnett, K. (2019). Secret objectives: promoting inquiry and tackling preconceptions in teaching laboratories. arXiv:1905.07267 [physics.ed-ph]

Buck, L. B., Bretz, S. L., & Towns, M. H. (2008). Characterizing the level of inquiry in the undergraduate laboratory. Journal of College Science Teaching, 38(1), 52–58.

Huber, R. A., & Moore, C. J. (2001). A model for extending hands-on science to be inquiry based. School Science and Mathematics, 101(1), 32–41.