Christian Moore Anderson

How to design quiz (or "core") questions in biology

Updated: Aug 26, 2023


Quiz questions are a useful tool in a content-heavy subject such as biology, not just for students but also as a planning tool for teachers.


This is the context I am coming from:

  • I plan quiz questions for myself, to distill a new curriculum down to the essentials. This helps me define what I'm going to teach. But they only help; the questions and I work together as a team, because the understanding behind them is in my own mind. I need to help students with both the content and the understanding.

  • I share all my quiz questions (and their answers) with my students, but we never use the quizzes in our lessons. The students have them so they know the minimum content expected (something students often miss) and can follow the order of the course. My students typically use them to help with revision before exams.

  • To ensure my students don't get confused, I call mine "basic knowledge questions". They are just the basics; knowing them alone is not enough.


In this post I shall discuss:

  • what good quiz questions may look like in biology

  • how grouping questions can increase their coherence & avoid fragmentation


What are quiz questions?


A list of closed questions that encompasses the core content of a course, intended to help students commit that content to memory. Sometimes they are called core questions.


They might look like this in a table:

It is all too easy to write poor quiz questions for biology that are based on fragmented memorisation of vocabulary.


If quiz questions are reduced, or atomised, they become easier to use as a learning tool; but if they are too atomised they lose meaning, connectedness, and explanatory power, inviting the criticism that they are just a list of facts. This is a trade-off.


At the other end of the spectrum, if quiz questions are too open or too large, they lose utility as a learning tool: students find the answers too hard to dissect, commit to memory, or understand as parts and whole. They must be concise, but the optimum size of a quiz question is difficult to ascertain and will depend on the context. More on this in a minute.


Quiz questions differ from classroom questioning in that they cannot be responsive or flexible. They are also a time-limited resource: there is only so much time that students will devote to revision and practice. As such, this post explores what quality quiz questions may look like, and how their design can avoid fragmentation of knowledge and enhance coherence.



The quality of quiz questions


Let's look first at the quality of answers, taking influence from Miller and Cañas (2008), who categorised the quality of propositions in concept maps, from the simple and static ('Animals may be vertebrates', 'Plants have cells') to more dynamic relationships:

  • Non-causative dynamic propositions: Roots absorb water, Herbivores eat plants, Living beings need oxygen.

  • Causative dynamic propositions: Cigarettes produce cancer, Rule of law attracts foreign investment, Heat melts ice, Paper comes from trees.

  • Quantified causative dynamic propositions: Increased transparency in public affairs discourages corruption, Under-activity of the thyroid gland decreases body metabolism, Increased quality of education contributes to greater national development.

Dynamic relationships show change: they show causality and consequence, trends, and probability. Learning these should allow students to apply that bit of knowledge more widely than static relationships do.


Of course, there will always be a need for more static, more vocabulary-based quiz questions, such as for those pesky genetics terms that really need to be internalised, but the framework allows us to think beyond them alone.


Applying this framework to quiz questions means that quality increases when the answers require students to retrieve more dynamic relationships. More static relationships, especially component names and functions, may be best learnt by drawing diagrams.


For example, a quiz question like the one below:

  1. What is the function of chewing (mechanical digestion)?

  • Increase the surface area to volume ratio of food

Could become something like this (off the top of my head):

  1. State the relationship between chewing food and rate of digestion, and give a reason.

  • An increase in chewing causes an increase in the rate of digestion

  • It increases the surface area to volume ratio of food
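To make that surface area to volume relationship concrete, here is a minimal sketch (my own illustration, not part of the original questions) that models chewing as cutting a cube of food into smaller cubes and shows how the ratio rises:

```python
# Toy model: chewing cuts a cube of food into n^3 smaller cubes.
# The total volume is unchanged, but the total surface area grows, so the
# surface area to volume ratio (and hence exposure to enzymes) increases.

def sa_to_v_ratio(side: float) -> float:
    """Surface area : volume ratio of a cube, 6*side^2 / side^3 = 6/side."""
    return 6.0 / side

def ratio_after_chewing(side: float, pieces_per_edge: int) -> float:
    """Combined SA:V after cutting the cube into pieces_per_edge**3 pieces."""
    return sa_to_v_ratio(side / pieces_per_edge)

whole = sa_to_v_ratio(1.0)              # one 1 cm cube of food
chewed = ratio_after_chewing(1.0, 4)    # chewed into 64 smaller cubes

print(f"SA:V before chewing: {whole:.0f} per cm")   # 6
print(f"SA:V after chewing:  {chewed:.0f} per cm")  # 24, i.e. 4x higher
```

The ratio scales directly with the number of pieces per edge, which is exactly the kind of quantified, dynamic relationship the rewritten question targets.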



Here's another example:

  1. What is the function of the highly branched structure of glycogen?

  • It allows for quick hydrolysis (breakdown) to glucose


Could become something like this:

  1. State the relationship between the extent of branching in polysaccharides and the rate at which they can be hydrolysed to their monomers

  • An increase in branching increases the rate at which the polysaccharide can be hydrolysed

The second version is not as specific to glycogen, but it gives students a generalisable relationship to learn, which can then be applied to additional quiz questions on glycogen as a quick-access energy store, and on the rate of digestion of amylose, amylopectin, and glycogen in the human digestive system.
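To illustrate why branching speeds hydrolysis, here is a rough toy model of my own (the branch counts are arbitrary placeholders): enzymes that release glucose work from the free, non-reducing ends of the polysaccharide, so a more branched molecule presents more ends that can be attacked at once.

```python
# Toy model (illustrative only): assume glucose is released from the free,
# non-reducing ends of a polysaccharide, so the initial rate of hydrolysis
# scales with the number of ends, and each branch point adds one more end.

def free_ends(branch_points: int) -> int:
    """An unbranched chain has one non-reducing end; each branch adds another."""
    return 1 + branch_points

def relative_rate(branch_points: int) -> int:
    """Initial rate of glucose release, taken as proportional to free ends."""
    return free_ends(branch_points)

# Branch counts below are arbitrary placeholders chosen only to show the trend.
for name, branches in [("amylose (unbranched)", 0),
                       ("amylopectin (branched)", 20),
                       ("glycogen (highly branched)", 60)]:
    print(f"{name:30s} relative rate = {relative_rate(branches)}")
```

However simplistic, it captures the generalisable relationship: more branching, more free ends, faster release of monomers.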


Now compare these two chunks of quiz questions. The second chunk's questions require longer responses, but this is traded off against keeping connected information together. And, as with the example above, a more generalisable dynamic quiz question is given before the more specific quiz questions.



Mechanism and causality in quiz questions


Next, I take influence from my taxonomy of understanding (published in my book).


When quiz questions become too atomised they often end up in the bottom-left quadrant: fragmented and lacking explanation. I worry that by atomising a pattern or concept too much, it becomes fragmented in the student's mind as well.


Currently, I prefer to keep certain mechanistic and causal relationships together, while balancing this against the utility of atomisation and brevity.


If we assume that quiz questions are of equal quality, there should be a vague point of equilibrium, as shown below on my idealised graph. Obviously, it will also depend on the context and content, so we could think of an optimum range around this point. It may also differ by stage, so that the optimum scope is lower at KS3 and higher at KS5.

For example, in a KS5 biology quiz question I could ask the simple:

State the orientation of phospholipids in a bilayer, and request these answers:

  • Hydrophobic fatty acids in the centre

  • Hydrophilic phosphate groups on the outside.


To add more explanatory power (but reducing atomisation) I could add a bit more to the question and answer:

State the orientation of phospholipids in a bilayer, and why, and request these answers:

  • Fatty acids in the centre --> because the centre is hydrophobic

  • Phosphate groups on the outside --> because they form H bonds with water


Compare the questions below: the second set is less atomised, but the answers have more explanatory power:


Wherever the optimal atomisation lies on the spectrum, coherence can be increased by chunking quiz questions so that students practise them as a package. By chunking coherently, it is possible to mix a range of levels of atomisation and quality; mixing questions on vocabulary, function, mechanism, relationships, and meaning in context, to meet the needs of both utility and understanding.


It is also possible to build up from more to less atomised answers within a chunk. As such, it makes more sense to judge the chunk of quiz questions rather than individual questions.
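To show concretely what treating the chunk as the unit might look like, here is a minimal sketch of my own (the class names and example content are hypothetical, not a fixed scheme) of a chunk of quiz questions mixing vocabulary and relationship items, practised as a package:

```python
# Minimal sketch: a chunk of quiz questions kept together under one concept
# title and always practised as a package, so connected ideas stay connected.
from dataclasses import dataclass, field

@dataclass
class QuizQuestion:
    prompt: str
    answer: str
    kind: str  # e.g. "vocabulary", "relationship", "mechanism"

@dataclass
class Chunk:
    concept: str                                 # title shown above the chunk
    questions: list[QuizQuestion] = field(default_factory=list)

    def practise(self) -> None:
        """Work through the whole chunk in order, from general to specific."""
        print(f"--- {self.concept} ---")
        for q in self.questions:
            input(f"[{q.kind}] {q.prompt}  (press Enter to reveal)")
            print(f"A: {q.answer}\n")

digestion = Chunk("Mechanical digestion", [
    QuizQuestion("What is mechanical digestion?",
                 "The physical breakdown of food into smaller pieces, e.g. by chewing.",
                 kind="vocabulary"),
    QuizQuestion("State the relationship between chewing food and rate of digestion, "
                 "and give a reason.",
                 "More chewing increases the rate of digestion because it increases "
                 "the surface area to volume ratio of the food.",
                 kind="relationship"),
])

# digestion.practise()  # run interactively to self-quiz on the whole chunk
```

Judging quality at the level of the chunk rather than the individual question mirrors the argument above: the mix, order, and coherence of the package matter more than any single item.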


An improved model focuses not on a single optimal scope, but on an idealised optimal range for a chunk of questions (see the graph below). Nevertheless, as with any idealised model, it will be subject to context-dependent variations (remember those genetics vocabulary terms, which probably form a chunk of their own).

Compare the quiz question chunks below. My current way of thinking prefers the second set, which is more varied in atomisation and quality, but coherent as a whole. The more demanding questions, requiring retrieval of a mechanism, are deliberately as succinct as possible, and their difficulty is offset by the preceding questions.


Rather than one long list of questions, I have my quiz questions in tables per topic, chunked with a title indicating the concept, like the examples above and below. However, learning them alone can still lead to rote learning and a lack of interconnected understanding. Students need to do more with these questions than just learn them by heart. I make this explicit to my students by referring to them as basic knowledge questions: they'll need to do more to move beyond the basics.

If you've enjoyed this, check out my book. Download chapter 1 here (English edition or Spanish edition), or check out my other posts.


@CMooreAnderson (Twitter)


References


Miller, N. L., & Cañas, A. J. (2008). Effect of the nature of the focus question on presence of dynamic propositions in a concept map. In A. J. Cañas, P. Reiska, M. Åhlberg, & J. D. Novak (Eds.), Proceedings of the 3rd international conference on concept mapping. Tallinn, Estonia & Helsinki, Finland.


Moore-Anderson, C. (2021). Designing a curriculum for the networked knowledge facet of systems thinking in secondary biology courses: A pragmatic framework. Journal of Biological Education. https://doi.org/10.1080/00219266.2021.1909641
