Chapter 8 Embracing Formative Feedback

As a teacher who primarily creates and runs data science workshops for postgraduates and early career researchers, I have not had a chance to create or mark graded assessments (A1, A2). Naturally, I had also gained an unfavourable opinion of assessment as a student. However, through my learning in this course and through a digital learning site (brilliant.org), I have recently come to see the positives of using assessment to improve learning. Previously I included exercises in my courses; now I am also incorporating MCQs (multiple-choice questions) into my HTML books (A5). In the future I aim to create some form of pre-assessment for my workshops. Of course, formative assessment works best when an appropriate form of feedback accompanies it.

My teaching is carried out through workshops which are not graded, so I try to incorporate formative assessment into my teaching. This takes the form of exercises, which I have used extensively, verbal questioning of learners, and MCQs, which I am now incorporating into my e-materials. These forms of formative assessment help learning by giving students time to pause from reading and instead apply their new knowledge and skills through active learning. When done well, these activities benefit learning and can be enjoyable (Adkins, 2018). Formative assessments help students monitor their progress and encourage further study (V2) (McCallum & Milner, 2021). Embedding these activities within the materials and letting learners work through the courses at their own pace brings in aspects of a flipped classroom, which can decrease cognitive load and support a diverse group of students in their learning (V1) (Cortese et al., 2022).

I have always included exercises in my computational biology materials. These give learners an opportunity to put into action what they have learnt, and they have to think about what they are doing and why they are doing it. I provide three forms of feedback for these exercises:

  1. I ensure these exercises have a clear output, such as the exact file or visualisation learners need to create, and I try to make it obvious whether or not they have completed the task successfully.
  2. I include an answer chapter. This contains the code I used, along with annotation explaining the how and why of that code. The answers are useful because learners can peek at them if they get stuck, and they allow the materials to be used outside the workshop environment in the future.
  3. I am on hand to assist, answer questions, and provide feedback (A3). I have a streaming setup so I can quickly show learners my working environment and demonstrate how to carry out any coding task if I or the learners feel it is necessary.

Although I provide all this information, coding exercises can be a form of subjective assessment, as there are many ways to achieve the same thing in code. The intended takeaway is that learners understand why their code did or did not work, and that they know how to start the process of fixing any errors.
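
To give a flavour of the answer chapters, below is a minimal, hypothetical exercise answer in R; the task and file names are invented for illustration, but it shows the annotated "how and why" style I aim for and that more than one solution can be correct.

```r
# Hypothetical exercise answer: keep only genes with at least 10 reads in
# total and save the result. The task and file names are illustrative.

counts <- read.csv("gene_counts.csv")          # read the raw count table

# HOW: sum the counts across samples (all columns except the gene IDs)
# WHY: genes with very few reads are mostly noise and slow down later steps
keep     <- rowSums(counts[, -1]) >= 10
filtered <- counts[keep, ]

# An equally valid alternative, to show there is more than one correct answer:
# filtered <- subset(counts, rowSums(counts[, -1]) >= 10)

# Expected output: the exact file learners should end up with
write.csv(filtered, "gene_counts_filtered.csv", row.names = FALSE)
```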

After moving to online workshops, I switched from standard documents to digital HTML books, which I create with the R package bookdown. I am now adding MCQs to these bookdown books as a form of formative assessment to improve the quality of learning (K6). I use three answer options per question, which a meta-analysis suggests is the optimal number (V3) (Rodriguez, 2005). Additionally, I am incorporating macro-design strategies, such as interspersed answers, when creating my MCQs (V3) (Söbke & Mosebach, 2020).

These MCQs are digital and the learner can select an answer directly in the HTML book. Their primary benefits are that they are practically feasible, easy for me to create and for learners to use, and provide instant feedback (K4) (Snekalatha et al., 2021). Additionally, the learner can reattempt the question if they choose the wrong option, and allowing reattempts within formative assessment is beneficial to learning (Zhang et al., 2021). By their nature, MCQs are well suited to objective assessment, where there is only one correct answer.
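
To sketch how such a question can be embedded, the snippet below uses the webexercises R package, one tool for adding self-checking questions to bookdown books; the setup call and the example question are illustrative rather than copied from my materials.

```r
# One-off setup: add the webexercises JavaScript/CSS to an existing bookdown
# project so that answers are checked instantly in the rendered HTML.
# install.packages("webexercises")
library(webexercises)
add_to_bookdown()   # run once from the book's root directory (uses defaults)

# Within a chapter's .Rmd text, an MCQ is then written inline, for example:
#   Which function reads a comma-separated file into a data frame?
#   `r mcq(c("readLines()", answer = "read.csv()", "scan()"))`
# Three options are offered and the correct one is tagged "answer"; learners
# get immediate right/wrong feedback and can simply pick again if wrong.
```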

MCQs are ideal for reinforcing definitions and theory, and for exploring output (K2). When learning programming, it is very useful to learn the terminology (the jargon). This is vital for future learning, as much of programming involves using search engines with the correct terminology (K1). It can also be difficult, as terminology can be unintuitive or clash with that of other programming languages. Recap MCQs that reinforce terminology through repetition are therefore very useful for promoting recall (K3) (Yang et al., 2019).

A large part of data science is investigating coding output, whether it is a text file or a visualisation, so learning this skill is vital (Via et al., 2022). Using MCQs, combined with good explanations of the output, to ask learners to pick out certain parts of it is an effective technique for developing this skill (K2). It also gives learners a clear, concise instruction with an achievable goal. In this form the MCQs act as guiding questions that help learners progress (A4) (Kojo et al., 2018), helping them separate the useful information from the unneeded noise.
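
As a sketch of this kind of output-focused question, the example below (again using webexercises, with an invented question) first generates a small summary and then directs attention to one value within it.

```r
# Hypothetical output-exploration question: learners generate a summary,
# then an MCQ points them at one specific value within it.

summary(cars$speed)
#>    Min. 1st Qu.  Median    Mean 3rd Qu.    Max.
#>     4.0    12.0    15.0    15.4    19.0    25.0

# Inline MCQ in the .Rmd text:
#   What is the median speed reported in the summary above?
#   `r mcq(c("15.4", answer = "15.0", "19.0"))`
```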

I monitor the reception of my e-materials and teaching through feedback forms given to students after the workshops. These include asking students their opinion on the effectiveness and clarity of the e-materials and MCQs (K5). This allows me to reflect on and revise my materials to improve them for future workshop cohorts (Foley, 2020). Currently the feedback is very positive, but with room for improvement.

In the future I would like to create some form of online quiz to use as a diagnostic assessment (A3). Ideally, I would set one up for each course; each student would complete the assessment and then be told which of my current pre-course materials they should work through before the course. This would encourage learners to run through the prerequisite materials, which aim to bring everyone to the level of proficiency required for the workshop (V1, V2) (McClatchy et al., 2020).
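
A minimal sketch of how that recommendation step could work is given below; the topics, chapter titles, scores, and pass threshold are all hypothetical.

```r
# Hypothetical diagnostic step: map each learner's per-topic quiz score to
# the pre-course chapters they should revisit before the workshop.

recommend_prework <- function(scores, threshold = 0.7) {
  prework <- c(
    r_basics   = "Chapter 1: R and RStudio basics",
    data_types = "Chapter 2: Vectors and data frames",
    plotting   = "Chapter 3: Plotting with ggplot2"
  )
  weak <- names(scores)[scores < threshold]  # topics below the pass mark
  unname(prework[weak])                      # chapters to work through first
}

# Example: strong on the basics, weaker on plotting
recommend_prework(c(r_basics = 0.9, data_types = 0.8, plotting = 0.4))
#> [1] "Chapter 3: Plotting with ggplot2"
```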

I have learnt that assessment can be a vital teaching and learning tool, not just a tool for discriminating between students. I have always used exercises and tried to help learners by questioning them in particular ways rather than simply telling them the answer. Now I am also including MCQs in my HTML materials to reinforce terminology and concepts, as well as to focus learners' investigation of the file and visualisation outputs they create in the workshop.

8.1 References

  1. Adkins, J. K. (2018). Active learning and formative assessment in a user-centered design course. Information Systems Education Journal, 16(4), 34.

  2. Cortese, G., Greif, R., & Mora, P. C. (2022). Flipped classroom and a new hybrid “Teach the Airway Teacher” course: an innovative development in airway teaching? Trends in Anaesthesia and Critical Care, 42, 1.

  3. Foley, D. (2020). Teachable Teachers: Using Circumspect Feedback to Improve Teaching and Learning.

  4. Kojo, A., Laine, A., & Näveri, L. (2018). How did you solve it? – Teachers’ approaches to guiding mathematics problem solving. LUMAT: International Journal on Math, Science and Technology Education, 6(1), 22-40.

  5. McCallum, S., & Milner, M. M. (2021). The effectiveness of formative assessment: student views and staff reflections. Assessment & Evaluation in Higher Education, 46(1), 1-16.

  6. McClatchy, S., Bass, K. M., Gatti, D. M., Moylan, A., & Churchill, G. (2020). Nine quick tips for efficient bioinformatics curriculum development and training. PLoS Computational Biology, 16(7), e1008007.

  7. Rodriguez, M. C. (2005). Three options are optimal for multiple-choice items: A meta-analysis of 80 years of research. Educational Measurement: Issues and Practice, 24(2), 3-13.

  8. Snekalatha, S., Marzuk, S. M., Meshram, S. A., Maheswari, K. U., Sugapriya, G., & Sivasharan, K. (2021). Medical students’ perception of the reliability, usefulness and feasibility of unproctored online formative assessment tests. Advances in Physiology Education, 45(1), 84-88.

  9. Söbke, H., & Mosebach, J. (2020, October). Macro Design of Multiple-Choice Questions for Learning. In European Conference on e-Learning (pp. 476-XX). Academic Conferences International Limited.

  10. Via, A., Palagi, P. M., Lindvall, J. M., & Attwood, T. K. (2022). Introducing the Mastery Rubric for Bioinformatics – a Professional Guide. F1000Research, 11(735), 735.

  11. Yang, B. W., Razo, J., & Persky, A. M. (2019). Using testing as a learning tool. American Journal of Pharmaceutical Education, 83(9).

  12. Zhang, S., Bergner, Y., DiTrapani, J., & Jeon, M. (2021). Modeling the interaction between resilience and ability in assessments with allowances for multiple attempts. Computers in Human Behavior, 122, 106847.