
Scalable and explainable visually-aware recommender systems

Markchom, T., Liang, H. and Ferryman, J. (2023) Scalable and explainable visually-aware recommender systems. Knowledge-Based Systems, 263, 110258. ISSN 1872-7409

Text - Accepted Version
· Restricted to Repository staff only
· Available under License Creative Commons Attribution Non-commercial No Derivatives.


It is advisable to refer to the publisher's version if you intend to cite from this work. See Guidance on citing.

To link to this item DOI: 10.1016/j.knosys.2023.110258


Recommender systems are widely used to address the problem of information overload. Existing systems mainly rely on user–item interactions and semantic information derived from the metadata of users and items to improve recommendation accuracy. Item images provide useful cues for inferring users' individual preferences, especially in domains where visual factors are influential, such as fashion. However, most previous work has ignored this type of information. To bridge this gap and meet performance requirements in terms of accuracy, scalability, and explainability, this paper proposes a scalable and explainable visually-aware recommender system framework called SEV-RS. The framework comprises a visually-augmented heterogeneous information network, a scalable meta-path feature extraction method for multi-hop relations, and a shallow, explainable meta-path-based collaborative filtering recommendation approach. We compared SEV-RS with state-of-the-art models, including a deep learning model using a Graph Attention Network, on two real-world datasets and one synthetic dataset. The results show that SEV-RS produces more accurate and more explainable recommendations while requiring substantially less computation time than the compared deep learning models.
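To make the meta-path idea in the abstract concrete, the sketch below shows a minimal, hypothetical meta-path-based scoring step over a toy visually-augmented graph: users connect to items they interacted with, and items connect to visual clusters (a stand-in for image-derived features). This is an illustration of generic meta-path counting (a commuting-matrix product), not the actual SEV-RS feature extraction or recommendation method described in the paper; the matrices `R` and `V` and the cluster structure are invented for the example.

```python
import numpy as np

# Hypothetical toy graph: 3 users, 4 items, 2 visual clusters.
# R[u, i] = 1 if user u interacted with item i (invented data).
R = np.array([
    [1, 1, 0, 0],
    [0, 1, 1, 0],
    [0, 0, 1, 1],
])
# V[i, c] = 1 if item i falls in visual cluster c (invented data).
V = np.array([
    [1, 0],
    [1, 0],
    [0, 1],
    [0, 1],
])

# Commuting matrix for the meta-path User -> Item -> VisualCluster -> Item:
# M[u, i] counts the path instances linking user u to candidate item i,
# so items visually similar to a user's past items score higher.
M = R @ V @ V.T  # shape: (3 users, 4 items)

# Recommend, per user, the highest-scoring item not yet interacted with.
scores = np.where(R == 0, M, -np.inf)  # mask already-seen items
recommendations = scores.argmax(axis=1)
```

Because the score is just a count of path instances along a named meta-path, each recommendation can be explained by enumerating those paths ("you liked items 0 and 1, which share a visual cluster with item 2"), which is the kind of explainability the meta-path-based approach in the abstract refers to.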

Item Type: Article
Divisions: Science > School of Mathematical, Physical and Computational Sciences > Department of Computer Science
ID Code: 110549

