A Lecturer’s Unique Approach to Assessing Student Comprehension

It’s getting tougher to assess how much university students have learnt. In his work as a Mathematical Statistics lecturer, Michael von Maltitz has tried a new way of getting students to learn, and of assessing what they’ve absorbed and retained. Students have to show and discuss how they arrived at their understanding of the subject. They can’t just rely on cramming, because he interviews them as if they were applying for a job.

What prompted you to try something new?

“We understand, but how will it be asked in the test?” This is the question that was posed to me time and again in 2019 when I started lecturing a module in mathematical statistics at second-year university level.

I knew I had to make a change. I already understood that students were stressed, prone to memorising content and cramming before tests and examinations, and using short cuts to attain a good grade, rather than to learn anything.

What did you then do differently?

The module was unfamiliar to me, so I decided to let the students approach the course content the same way I was: gathering information from different sources, combining and collating it digitally, and reflecting on how it helped to meet certain objectives or learning outcomes.

These portfolios of learning evidence would contain course and outcome information, content knowledge (including theorems and proofs), examples with solutions, showpiece assignments, links to and discussions on online tutorials or videos, and paragraphs of self-reflection. Readers might see these portfolios as “study notes on steroids”.

Assessing the portfolio would be an exercise in evaluating the learning process, rather than a memorised product.




The process was challenging but offered a reward for me and my students – that of discovery. Students seemed to be genuinely learning.

Besides checking their portfolios, I needed a way to assess progress that didn’t fall back into the old habits of memorisation and “teaching to the test”. I needed to ensure that each student had created their own portfolio and could defend its content. And I needed an assessment method that would not take more time and effort than setting a unique written test or examination, typesetting a memorandum, and marking more than 100 answer scripts with feedback the students might never read.

I decided to test this form of deep learning using a workplace method – the interview. In a 30-minute online interview with each student, I asked questions about their understanding of the module content, as well as questions concerning their own portfolios. Each student had to defend the information collected and reflected upon.

The interview worked perfectly when paired with the portfolio. I assessed a set of portfolios in an evening, gave typed feedback, and then interviewed those portfolios’ creators the next day. Feedback was immediate, and the interview assessment became a learning experience, for me and the student.




Students could challenge my portfolio assessment if I had made any errors, and I could immediately give the correct answer to any interview question that stumped them.

Afterwards, each student could be given the recording of their interview, and if they felt the assessment was at all unfair, they could compare their interview with another student’s. In doing so, the students themselves could moderate my assessment practice.

What results did you observe?

After a year or two of teaching and assessing like this, I noticed my students seemed to understand more of the content. They retained more into their final year, were fluent in communicating statistics, and had better time management and self-reflection skills.

Students told me that they were asked the same questions in their first job interviews as I had asked in my modules, and that they felt much more at ease in those first few job interviews.

How did you confirm these results?

To formally test the developments I had noticed in my students, I conducted research on the class in 2022, which was published in conference proceedings and an article.

This study showed that students experienced significant learning in every facet of an educational framework known as Fink’s taxonomy: foundational knowledge, application, integration, the human dimension, caring, and learning how to learn.

Thus, the method of learning and assessment could formally be called a success within Statistics.

Can this approach be used in other courses?

Yes. One might argue that if this method works for a mathematical module, it can be used anywhere. Mathematical modules contain theorems, proofs, definitions, and theoretical and practical problem solving – items that might seem difficult to assess through verbal communication. But it is the understanding of the ideas behind the theorems, the stories behind and tricks used within the proofs, and the application of theoretical results that matter most in an age when your favourite AI can provide content knowledge.




Mathematical proofs and worked calculations, both of which take time in practice, can be assessed by looking at a portfolio containing these items alongside the student’s annotations and reflections. Understanding of these concepts is then assessed in the interview.

Likewise, in other subjects, a portfolio could be used for assessing knowledge-based content, while the interview could be used to gauge a student’s understanding of what was put into the portfolio, why they chose that content, why the content is important, and how that content is used in practice.

The post “one lecturer’s way of testing what students understand” by Michael Johan von Maltitz, Associate Professor, Mathematical Statistics and Actuarial Science, University of the Free State was published on 03/04/2026 by theconversation.com