Explainable AI to understand study interest of engineering students

Ghosh, S., Kamal, S., Chowdhury, L., Neogi, B., Dey, N. and Sherratt, S. (ORCID: https://orcid.org/0000-0001-7899-4445) (2023) Explainable AI to understand study interest of engineering students. Education and Information Technologies. ISSN 1573-7608
DOI: 10.1007/s10639-023-11943-x

Abstract/Summary
Students are the future of a nation. Personalizing higher education courses to student interests is one of the biggest challenges in higher education. Various AI and ML approaches have been used to study student behaviour, and existing AI and ML algorithms are used to identify features in fields such as behavioural analysis, economic analysis, image processing, and personalized medicine. However, because most AI algorithms are black-box models, there are major concerns about the interpretability and understandability of the decisions a model makes. Explainable AI (XAI) aims to break the black-box nature of an algorithm. In this study, XAI is used to identify engineering students' interests, and BRB and SP-LIME are used to explain which attributes are critical to their studies. We also used principal component analysis (PCA) for feature selection to identify the student cohort, and clustering the cohort helps to analyse the influential features behind engineering discipline selection. The results show that some valuable factors influence their studies and, ultimately, the future of a nation.
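The pipeline described in the abstract combines dimensionality reduction, cohort clustering, and local post-hoc explanation. A minimal sketch of such a pipeline in Python is shown below, using scikit-learn and the lime package's submodular pick (SP-LIME); the feature names, synthetic data, and random-forest classifier are illustrative assumptions rather than the paper's actual model, and the BRB component is not reproduced here.

```python
# Illustrative sketch: PCA + clustering to identify student cohorts, then
# SP-LIME to surface which attributes drive a discipline-choice classifier.
# Feature names, synthetic data, and the RandomForest stand-in are assumptions.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier
from lime.lime_tabular import LimeTabularExplainer
from lime import submodular_pick

rng = np.random.default_rng(0)
feature_names = ["math_score", "physics_score", "programming_hours",
                 "family_income", "peer_influence", "career_expectation"]
X = rng.normal(size=(300, len(feature_names)))      # synthetic student attributes
y = (X[:, 0] + 0.5 * X[:, 2] > 0).astype(int)       # synthetic discipline label

# 1) PCA for feature reduction / selection
X_pca = PCA(n_components=2).fit_transform(X)

# 2) Cluster the reduced space to identify student cohorts
cohorts = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_pca)

# 3) Fit a black-box classifier on the original attributes
clf = RandomForestClassifier(random_state=0).fit(X, y)

# 4) SP-LIME: choose a small, diverse set of local explanations that together
#    cover the attributes most influential for discipline selection
explainer = LimeTabularExplainer(X, feature_names=feature_names,
                                 class_names=["other", "engineering"],
                                 discretize_continuous=True)
sp = submodular_pick.SubmodularPick(explainer, X, clf.predict_proba,
                                    sample_size=50, num_features=4,
                                    num_exps_desired=3)
for exp in sp.sp_explanations:
    print(exp.as_list())
```

Each printed explanation lists the attributes (with weights) that most influenced one representative prediction, which is the kind of per-student attribution the abstract refers to when explaining which attributes are critical to study interest.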