APPLICATION OF ITEM RESPONSE THEORY IN THE QUALITY CONTROL OF AN ENTRANCE EXAM IN A MATHEMATICS TEACHING COURSE
J. Gomes1, M. Púcuta1, J. Cruz2
1 Instituto Superior de Ciências da Educação de Cabinda (ANGOLA)
2 University of Aveiro (PORTUGAL)
Learning assessment, particularly in admission processes for undergraduate courses at higher education institutions, is a crucial step and a constant challenge for these institutions. Quality and equity in these processes are essential so that selection is fair and the admitted candidates have the appropriate profile. At the Instituto Superior de Ciências da Educação de Cabinda (ISCED-Cabinda/Angola), the entrance exam for the Mathematics Teaching course is the main instrument used to select candidates with the profile needed to face the challenges of this degree.

In this context, Item Response Theory (IRT) is a standard statistical approach in educational assessment, offering accurate information about the performance of candidates and about the characteristics of the items that make up the exam.

Since the 1960s, Item Response Theory has stood out as a robust methodology with significant advantages over basic descriptive methods: it characterizes item quality (through the difficulty and the discriminating power of each item) and provides a more detailed view of individual abilities and, ultimately, of the overall quality of an assessment test.
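For concreteness, a standard unidimensional model of this kind is the two-parameter logistic (2PL), in which the probability that a candidate with ability \theta answers item i correctly depends on the item's discrimination a_i and difficulty b_i (the 2PL is shown as a typical example; the abstract does not state which model was fitted):

  P_i(\theta) = \frac{1}{1 + \exp[-a_i(\theta - b_i)]}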

IRT is usually based on a single examinee factor (commonly referred to as “ability”), but additional factors can also be modelled (referred to, for instance, as “ability in area 1” and “ability in area 2”), which broadens the possibilities for understanding the examinees’ abilities and provides a more complete and differentiated picture of their competencies in multiple areas. This approach, called Multidimensional Item Response Theory (MIRT), can be especially useful for assessments that seek not only to classify but also to diagnose competencies in multiple domains.
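In the multidimensional case the scalar ability is replaced by a vector. In the slope-intercept parameterization used by the mirt package, for example, a two-dimensional version of the 2PL can be written as (illustrative notation, not taken from the abstract):

  P_i(\theta_1, \theta_2) = \frac{1}{1 + \exp[-(a_{i1}\theta_1 + a_{i2}\theta_2 + d_i)]}

where a_{i1} and a_{i2} are the slopes of item i on the two latent skills and d_i is an intercept.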

All computations were carried out in the statistical software R (R Core Team, 2023), chosen for its flexibility and free access; the IRT models were fitted with the “mirt” package (Chalmers, 2012).
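A minimal R sketch of the unidimensional fit is given below; the data frame name responses is hypothetical and stands for a matrix of 0/1 item scores with one row per candidate and one column per exam item:

  library(mirt)

  # Hypothetical input: dichotomous scores, one 0/1 column per exam item.
  fit_uni <- mirt(responses, model = 1, itemtype = "2PL")

  # Item parameters in the classical IRT parameterization:
  # a = discrimination, b = difficulty.
  coef(fit_uni, IRTpars = TRUE, simplify = TRUE)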

The single-factor analysis showed that the items as a whole were well designed, but it also identified two problematic items, suggesting improvements for the construction of future exams. The multidimensional item response analysis revealed the presence of two main latent factors, corresponding to skills in:
(i) Analytical Geometry and Algebra, and
(ii) Calculus and Trigonometry,
suggesting that the entrance exam adequately captures these two core competencies for the course (see the sketch below).
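Continuing the sketch above, an exploratory two-factor model can be fitted to the same hypothetical responses matrix; the rotation choice here is illustrative:

  # Exploratory two-factor MIRT model on the same hypothetical data.
  fit_two <- mirt(responses, model = 2, itemtype = "2PL")

  # Rotated factor loadings: which items load on which latent skill
  # (e.g., geometry/algebra versus calculus/trigonometry).
  summary(fit_two, rotate = "oblimin")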

This type of analysis helps to understand how much of the variability in the item responses the latent factors capture, and it also provides indicators of how effectively the model describes the candidates' skills.
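In mirt, such indicators can be obtained along the following lines (the function names are real mirt functions; their use here continues the sketch above rather than reproducing the authors' exact procedure):

  # Limited-information global fit: M2 statistic, RMSEA, CFI/TLI.
  M2(fit_two)

  # Per-item fit (S-X2); poorly fitting items are candidates for revision.
  itemfit(fit_uni)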

In general, the results show that the test has good internal consistency, that its items range from discriminative to highly discriminative, and that all levels of difficulty are represented. There are therefore grounds to conclude that the test produces the expected results, that is, it measures the abilities of candidates in the ISCED-Cabinda/Angola entrance exam.
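Internal consistency can be checked in the same framework, for example via the marginal and empirical reliability of the latent-trait estimates (again a sketch under the same assumptions as above):

  # Marginal reliability of the ability estimates (one-factor model).
  marginal_rxx(fit_uni)

  # Empirical reliability computed from the estimated factor scores.
  theta <- fscores(fit_uni, full.scores.SE = TRUE)
  empirical_rxx(theta)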

Keywords: Item Response Theory, examination quality, educational assessment, quality control, mathematics teaching, ISCED-Cabinda (Angola).

Event: INTED2025
Session: Pedagogical Innovations in Education
Session time: Monday, 3rd of March from 11:00 to 13:45
Session type: POSTER