Tag Archives: backward design

Outreach is about more than the perfect presentation (or even the perfect hands-on tank experiment!)

In most of my blog posts on outreach I focus on how to run the _perfect_ experiment. And while I still think that’s awesome, I recently read an article by Johanna Varner (“Scientific Outreach: Toward Effective Public Engagement with Biological Science”, 2014) that made a lot of points that I have definitely not stressed enough on my blog, and probably not even considered enough.
Outreach is often modeled on scientific communication and intuition. That makes sense: it is what we have learned over the years, gotten good at, and are most comfortable with. But when we try to engage the “general public”, we are mostly addressing people with a very different background from ours. Speaking of backgrounds: there is a problem with the concept of “the general public”, because there is no _one_ general public. The general public is very, very diverse, and it is important to consider each audience individually. Which leads to the next point: “audience” often implies that a scientist talks and “the general public” listens, and that is not the best model. The one-way communication we often use in outreach, more often than not relying on simplified, sensationalized stories, is just not effective. For retention of facts, for building enthusiasm, and for engaging in deep thinking, the public needs to be actively engaged, not talked to.
Another thing to consider: the reliability of a source is judged not by how many PhDs a speaker has, but by how well the message supports the listener’s preconceptions. Any new information is interpreted in such a way that it supports existing ideas. And even if ideas could be “objectively transferred”, new knowledge does not change attitudes or behaviour. Even the intention to act is a poor predictor of future behaviour!
So what can we do?
The article provides a structure for planning outreach activities which is basically backward design: start with what you want people to learn, then think about what you would accept as evidence that they actually learned it, and only then plan the activity. Check out the article if you are not familiar with the concept; it’s a really nice introduction. And it is always important to remember that the effectiveness of any activity depends on an explicit definition of its goals.
Then, there are a couple of design elements we can use. All of these originally come from the article, but the interpretations and examples are my own.
  • Use “trusted resources” to help us share our message. Instead of running your outreach activity as a self-organized event, get local churches, artists, or any institution or person whom the community trusts to invite you and set the stage for you. This makes it much more likely that people will not only listen to your message, but actually consider taking it on.
  • Know your audience. This is super difficult! But since you will want to create personal relevance for your audience (personal relevance is essential for engagement), you need to know what your audience’s knowledge, attitudes, and values are. And it goes without saying that every outreach activity needs to be tailored to each audience specifically.
  • Establish common ground with your audience; this makes your message more likely to be accepted. Don’t be the scientist whom nobody can relate to. Be the person who lives in the same neighbourhood, who supports the same sports team, who likes the same kind of music, whatever is applicable in your case.
  • Use appropriate language! Don’t alienate people by speaking too science-y, and beware that words sometimes carry a very different meaning in science than in everyday language. (If you have never seen those tables showing that the term “alcohol”, for example, means “booze” to the general public when you use it to mean “solvent”, definitely check out examples of such tables here or here!)
  • Get into dialogue instead of just “preaching” in a one-way manner. Ask for questions and feedback, offer to follow-up by email, engage with the people there!
  • Frame your science in a storyline. It makes it much easier to follow, to digest, and to remember.
  • Use “vivid hooks”, i.e. present your research question as an actual question or puzzle to solve, ask people to brainstorm hypotheses, show them the real data, let them get actively involved! Experiential learning and personal experience strongly influence attitudes and beliefs. This is probably easiest if you have animals to show, but even just a good question works. Sometimes it’s surprising what works: the other day I wrote a blog post showing an empty bottle and one filled with water, and asked whether people could tell which was which. I got so many private messages with people’s answers, asking me to confirm they were correct! I would never have thought that this particular blog post would raise such interest.
  • Emphasize the benefits of action rather than the risks of inaction. Fear appeals can backfire, since they lead to feelings of helplessness, which in turn lead to denial, apathy, and resignation, all of which prevent engagement.
  • Provide action resources. Enthusiasm and active engagement don’t last long after your outreach experiment is over unless you do something to keep them up. Therefore, provide action resources! Let people know when your next event will be, or share the schedule of public events at your institution. Hand out take-home activities. Provide online resources, or lists of other people’s online resources. Make sure that those who would like to stay engaged face a very low threshold to do so!

And now, go read the original research where all of these ideas came from:

Varner (2014) “Scientific Outreach: Toward Effective Public Engagement with Biological Science”

How to plan a course from scratch

Where do I even start???
A very helpful concept, which is completely contrary to how most people approach course planning, is “backward design”. Instead of starting from all the cool experiments, the awesome, fun materials, and the best case studies, we first look at the learning outcomes we want to achieve with our course. From those learning outcomes, we think about how we could determine whether they have actually been met (the assessment), and only then do we look at how we can convey what students need to learn in order to meet those learning outcomes.
In practice, that means that with every new course, the first step is to think about why we are teaching this course: What will students be able to do, and what attitudes will they have, once they have participated in it?
Imagine I were to plan a summer camp on oceanography for teen-aged kids. It would of course be most important that they enjoy their summer holiday, but once that is taken care of, there are a couple of things I would want them to take away from their week with me. As you’ve probably heard about a million times by now, learning outcomes are commonly written from the students’ point of view, using active, measurable verbs. So learning outcomes for that summer camp could look something like this:

Learning outcomes for an oceanography summer camp with teen-aged participants

After participating, students are able to
  1. give a broad overview over the field of ocean sciences to a lay audience, demonstrate practical applications of oceanography and illustrate their relevance to our lives;
  2. develop simple experiments following the scientific method, assess their validity with respect to answering a specific question and decide on further steps;
  3. develop joint questions and solutions in heterogeneous teams and reflect on team collaboration and their own contribution towards it; and
  4. perform independently and assess their own state of learning with the aid of the instructors.
Looking at those learning outcomes, you might notice that they cover all four groups of competences: 1 and 2 are professional competences (knowledge and skills, respectively), and 3 and 4 are personal competences (social competence and self-reliance). You might also notice that I am aiming at fairly high skill levels here: in Bloom’s classification, the knowledge learning outcomes sit around levels 2 and 3, and the skills go as high as levels 5 and 6. The personal competences are more difficult to place in the Bloom categories, but are also at the high end.
Ambitious goals for a week with teenagers, you say? Yes, true. But I am really not interested in just conveying factual knowledge, and as soon as things become interesting, they also become more difficult. Also note that, following convention, I only state the highest Bloom-level learning outcomes: in order to illustrate something (Bloom level 3, “application”), I first need to have the factual knowledge (level 1) and to have understood it (level 2).

Assessment of the learning outcome

Obviously I don’t want to turn my summer camp into a permanent assessment of skills (or at least not what that phrase sounds like to most people); the main aim is still that participants have a good summer. Still, it is nice to include some assessment, both for myself, so I know whether I achieved what I wanted to achieve, and for the participants, so they realize how much they have learned in just one week. I have cleverly included “assess their own state of learning with the aid of the instructors” as one of the learning outcomes, but what would that look like in practice?
As you can see from my learning outcomes, the whole summer camp is about working in teams on understanding how the ocean works, and presenting that to a lay audience (so probably the parents when they come to collect the kids, or other guests at the camp). So a good assessment would be to have them do just that: Present an experiment that they developed themselves, in a team, to an audience and explain what it is all about. Since this is a summer camp, this is probably about the extent of the assessment I would go to, but knowing how I like to function as an instructor, there will be a lot of formative feedback along the way on all four learning outcomes.

Determining the instructional method

So now to the part that people usually start with: finding an instructional method that prepares students for the assessment and makes sure they take away from the course what I want them to take away.
It makes sense to only assess what the participants have had a chance to practice, so we should be practicing working in groups on developing experiments and presenting them. This means our course plan should look something like the half-day raster below. Not mentioned here are the “purely fun” activities like the afternoon at the beach, the canoeing trips, the BBQ, etc., but the schedule is flexible enough to fit in all of those weather-dependent activities, which are currently indicated by empty “-“s.

Course plan:

Day 1
– Arrive at camp. Get to know everybody. Rules & boundary conditions.

Day 2
– What is so exciting about the ocean? Collect questions participants are interested in.
– Introduction to the scientific method. How do scientists learn about the world? Melting ice cubes experiment to practice the process as well as learning to write protocols.

Day 3
– Develop own questions and experiments to answer them.

Day 4
– Conduct experiment
– Conduct experiment

Day 5
– Analyze and interpret data

Day 6
– Prepare presentation of results.

Day 7
– Present everything to parents & everybody else interested.
So there we go! Coming up with all of this and writing it down took me maybe one hour, and we already have a pretty good idea of what that course might look like. Of course, the course planning isn’t done. In future posts, we will look at individual units and see how learning outcomes are reflected in the activities, and we will likely enter into an iterative process which will change our initial plan. But such is life :-)
P.S.: So why, on this blog, do I keep talking about how awesome experiments are, and how we can use them for almost any class, with any audience, in any setting? Shouldn’t I be talking about the learning outcomes first, then the assessment, and only then the teaching methods (i.e. the experiments)? Yes. Totally. But, in my defense: even though I don’t always make them explicit on this blog, I know what my underlying learning goals are. I’ll try to do better and make them more explicit here in the future!