
On the design & quality of core (quiz) questions in biology


In my opinion, core questions are a useful tool for a subject such as biology, not just for student learning but also as a powerful planning tool for teachers.


In this post I shall give an idealised model for:

  • what a quality core question may look like

  • the optimal scope of a core question

  • how grouping questions can increase their coherence & avoid fragmentation


What are core questions?


I shall define core questions as a type of quiz question: specifically, a list of closed questions that encompasses the core content of the course, with the intention that they be used to commit that content to memory. Core questions are most often associated with the concept of retrieval practice, and I was first introduced to the concept by Adam Boxer.


They might look like this in a table:


This is the context I am coming from:

I use core questions during lessons (practising new content) and for regular quizzes (practising prior knowledge). I share all my core questions (and their answers) with my students.


Their utility and central importance to my courses make the quality of core questions a planning priority, because:

  1. My students typically favour core questions for revision,

  2. The core questions I design for a concept will influence my classroom teaching of it

It is all too easy to write poor core questions for biology that are based on fragmented memorisation of vocabulary.


If core questions are reduced, or atomised, they become easier to use as a learning tool; but if they are too atomised they lose meaning, connectedness, and explanatory power, inviting the criticism that they are just a list of facts. This is a trade-off.


At the other end of the spectrum, if core questions are too open, too large, they lose utility as a learning tool: students find the answers too hard to dissect, to commit to memory, or to understand as parts and whole. They must be concise in nature, but the optimum size of a core question is difficult to ascertain and will depend on the context. More on this in a minute.


Core questions differ from classroom questioning in that they cannot be responsive or flexible. They are also a time-limited resource: there is only so much time that students will devote to revision and practice. As such, this post explores what quality core questions may look like, and how their design can avoid the fragmentation of knowledge and enhance their coherence.



The quality of core questions


Let's look first at the quality of answers, taking influence from Miller and Cañas (2008), who categorised the quality of propositions in concept maps, from the simple and static ('Animals may be vertebrates', 'Plants have cells') to more dynamic relationships:

  • Non-causative dynamic propositions: Roots absorb water, Herbivores eat plants, Living beings need oxygen.

  • Causative dynamic propositions: Cigarettes produce cancer, Rule of law attracts foreign investment, Heat melts ice, Paper comes from trees.

  • Quantified causative dynamic propositions: Increased transparency in public affairs discourages corruption, Under-activity of the thyroid gland decreases body metabolism, Increased quality of education contributes to greater national development.

Dynamic relationships show change: they show causality and consequence, trends, and probability. Learning these should allow students to apply that bit of knowledge more widely than static relationships do.


Of course, there will always be a need for more static, more vocabulary-based core questions, such as for those pesky genetics terms that really need to be internalised, but the framework allows us to think beyond them alone.


Applying this framework to core questions means that quality increases when the answers require students to retrieve more dynamic relationships. More static relationships, especially component names and functions, may be best learnt by drawing diagrams.


For example, a core question like the one below:

  1. What is the function of chewing (mechanical digestion)?

  • Increase the surface area to volume ratio of food

Could become something like this (off the top of my head):

  1. State the relationship between chewing food and rate of digestion, and give a reason.

  • An increase in chewing causes an increase in the rate of digestion

  • It increases the surface area to volume ratio of food



Here's another example:

  1. What is the function of the highly branched structure of glycogen?

  • It allows for quick hydrolysis (breakdown) to glucose


Could become something like this:

  1. State the relationship between the extent of branching in polysaccharides and the rate at which they can be hydrolysed to their monomers

  • An increase in branching increases the rate at which a polysaccharide can be hydrolysed

The second version is not as specific to glycogen, but it gives students a generalisable explanation to learn, which can then be applied to additional core questions on glycogen as a quick-access energy store, and on the rate of digestion of amylose, amylopectin, and glycogen in the human digestive system.


Now compare these two chunks of core questions. The second chunk's questions require longer responses, but this is traded off against keeping connected information together. And, as with the example above, a more generalisable dynamic core question is given before the more specific core questions.



Mechanism and causality in core questions


Next, I have taken influence from my own paper (teachers, DM me if you are interested in reading it), which can be understood via this short post.


In my paper I argue that comprehension in biology education is enhanced by a focus on more mechanistic explanations that show causality (rather than focusing on components and their functions), and the connectedness of the concepts learnt.


Figure 1. My framework, adapted and reworded from my paper: Designing a curriculum for the networked knowledge facet of systems thinking in secondary biology courses: a pragmatic framework

When core questions become too atomised they often end up in the upper left quadrant: fragmented and lacking explanation. I worry that by atomising a pattern or concept too much, it becomes fragmented in the student's mind as well.


Currently, I prefer to keep certain mechanistic and causal relationships together, while maintaining an equilibrium with the competing utility of atomisation and brevity.


If we assume that core questions are of equal quality, there should be a rough point of equilibrium, as shown below on my idealised graph. Obviously, it will also depend on the context and content, so we could think of an optimum range around this point. It may also differ by stage, so that the optimum scope is lower at KS3 and higher at KS5.

For example, in a KS5 biology core question I could ask the simple:

State the orientation of phospholipids in a bilayer, and request these answers:

  • Hydrophobic fatty acids in the centre

  • Hydrophilic phosphate groups on the outside.


To add more explanatory power (but reducing atomisation) I could add a bit more to the question and answer:

State the orientation of phospholipids in a bilayer, and why, and request these answers:

  • Fatty acids in the centre --> because they are hydrophobic

  • Phosphate groups on the outside --> because they form H bonds with water


Compare the questions below: the second set is less atomised, but the answers have more explanatory power:


Wherever the optimal atomisation lies on the spectrum, coherence can be increased by chunking core questions so that students practise them as a package. By chunking coherently, it is possible to mix core questions of varying levels of atomisation and quality: mixing questions on vocabulary, function, mechanism, relationships, and meaning in context, to meet the needs of both utility and understanding.


It is also possible to build up from more to less atomised answers within a chunk. As such, it makes more sense to judge the chunk of core questions rather than individual questions.


An improved model focuses not on a single optimal scope, but on an idealised optimal range for a chunk of questions (see the graph below). Nevertheless, as with any idealised model, it will be subject to context-dependent variations (remember those genetics vocabulary terms, which probably form a chunk alone).

Compare the core question chunks below. My current thinking favours the second set, which is more varied in atomisation and quality, but coherent as a whole. The more demanding questions, those requiring retrieval of a mechanism, are deliberately as succinct as possible, and their difficulty is offset by the preceding questions.


Rather than one long list of questions, I keep my core questions in tables per topic, chunked under a title indicating the concept, like the examples above and below. I used to quiz students by choosing totally random questions, but I have transitioned to quizzing by chunk, so that students retrieve concepts as a more connected whole.


Is it possible to design core question chunks for the flexible understanding quadrant of my framework? According to the arguments I present in my paper, this is more complicated than just writing out some questions. Reaching this quadrant of thinking requires the construction of connections in the mind of the learner; it is the role of the curriculum (and core questions as part of it) to create the environment in which this is more likely.


Core questions provide a learning tool that helps students to remember and connect the pieces themselves, but ultimately more is required than core questions alone. To see what I think core questions should be contributing to, see this post.


Thanks to Brett Kingsbury for our many conversations on core questions and his feedback on this post.


Christian Moore Anderson

@CMooreAnderson (follow me on twitter)


Find my other posts here


References


Miller, N. L., & Cañas, A. J. (2008). Effect of the nature of the focus question on presence of dynamic propositions in a concept map. In A. J. Cañas, P. Reiska, M. Åhlberg, & J. D. Novak (Eds.), Proceedings of the 3rd international conference on concept mapping. Tallinn, Estonia & Helsinki, Finland.


Moore-Anderson, C. (2021). Designing a curriculum for the networked knowledge facet of systems thinking in secondary biology courses: a pragmatic framework. Journal of Biological Education. DOI: 10.1080/00219266.2021.1909641
