Abstract No. 437

DESIGNING OPEN BOOK ASSESSMENTS USING AN ADAPTATION OF THE "4S" APPROACH
D. O Hanlon, J. Murphy
Munster Technological University (IRELAND)
Invigilated closed-book exams are proposed as a possible solution for maintaining academic integrity in the era of artificial intelligence (AI). Such a solution, however, has limitations when used to assess higher-order thinking skills. Open-Book Assessments (OBAs) are argued to be well suited to questions that require students to use higher-order thinking skills to analyse scenarios and "cases". OBAs can be administered within invigilated contexts, making "live" student use of AI unlikely.

The 4S Application Exercise model (used in Team-Based Learning) was trialled by the first author as a model to inform the design of OBAs. This study explored how students experienced two OBAs that had been designed following the approach. Participants were part-time undergraduate students on a BA in Human Resource Management (n=17). The OBAs were used within two 5-ECTS-credit modules (Recruitment and Selection, and Performance Management).

In line with guidelines on 4S application exercise design, both OBAs required students to make a series of "specific" decisions by evaluating real-life examples of recruitment, selection, and performance management. Students were given the autonomy to include real-life examples of their own choosing, in order to increase the "significance" of the task.

The first OBA tasked students with answering questions which required them to compare five real-life recruitment and selection scenarios of their own choosing (from a crowdsourced selection of 15). They had the opportunity to work with the examples for several weeks prior to the OBA. Following feedback from students, the second OBA reduced the number of cases that students evaluated to two: each student compared a case based on their own real-life experience with a lecturer-provided example. Again, students had several weeks to compare the cases.

After the second in-class OBA, a focus group with four volunteer students was facilitated to explore how they had experienced both assessments.

Students reported a number of experiences that could inform future iterations of OBAs. They generally found the first OBA overwhelming: they were tasked with making decisions about which case was best across a range of criteria, and the exercise was seen as lacking authenticity and relevance to practice. Students, in the main, did not value comparing examples generated by their peers, feeling that the student-contributed content lacked the detail required for an in-depth evaluation. Most students experienced the first OBA as highly novel and recommended that further guidance be provided (e.g. sample questions), along with a trial run, so they could prepare appropriately and perform well.

The second OBA was viewed more favourably by students, who appreciated working with fewer case studies and being able to select a case relevant to their own work. Comparing their own example against an exemplar was seen as challenging and was thought to promote "deep" learning. Students also perceived it as fairer and more authentic to real-world practice.

From the lecturer's perspective, the 4S approach proved to be a very usable and straightforward model for designing OBAs. Future research is planned to examine how formative assessment and feedback could be incorporated to complement this summative assessment.

Keywords: Open-Book Assessment, Assessment, Team-Based Learning.

Event: ICERI2025
Session: Rethinking Assessment in the Age of AI
Session time: Monday, 10th of November, 11:00–12:15
Session type: ORAL