Towards expressive modular rule induction for numerical attributes

Almutairi, M., Stahl, F. ORCID: https://orcid.org/0000-0002-4860-0203, Jennings, M., Le, T. and Bramer, M. (2016) Towards expressive modular rule induction for numerical attributes. In: Thirty-sixth SGAI International Conference on Artificial Intelligence, 13-15 December 2016, Cambridge, UK, pp. 229-235.



Official URL: http://dx.doi.org/10.1007/978-3-319-47175-4_16

Abstract/Summary

The Prism family is an alternative set of predictive data mining algorithms to the more established decision tree algorithms. Prism classifiers are more expressive and user friendly than decision trees, achieve a similar accuracy, and even outperform decision trees in some cases, especially where the training data contains noise and clashes. However, Prism algorithms still tend to overfit on noisy data; this has led to the development of pruning methods that allow Prism algorithms to generalise better over the dataset. The work presented in this paper addresses the problem of overfitting at the rule induction stage for numerical attributes by proposing a new numerical rule term structure based on the Gauss Probability Density Distribution. This new rule term structure is not only expected to lead to a more robust classifier, but also to lower computational requirements, as fewer rule terms need to be induced.
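The core idea of a Gaussian-based numerical rule term can be illustrated with a short sketch. Assuming a rule term is derived by fitting a normal distribution to the values of a numerical attribute for the target class and covering values within k standard deviations of the mean, a minimal version might look as follows (this is a hypothetical illustration, not the authors' exact algorithm; the function name and the k parameter are assumptions):

```python
import statistics

def gaussian_rule_term(values, k=2.0):
    """Fit a Gaussian to the attribute values of the target class and
    derive a single interval-style rule term:
        mu - k*sigma <= x <= mu + k*sigma
    Returns a predicate testing coverage, plus the interval bounds."""
    mu = statistics.mean(values)          # sample mean of the class values
    sigma = statistics.stdev(values)      # sample standard deviation
    low, high = mu - k * sigma, mu + k * sigma
    return (lambda x: low <= x <= high), (low, high)

# Hypothetical usage: attribute values observed for one class
covers, interval = gaussian_rule_term([4.9, 5.1, 5.0, 4.8, 5.2])
print(covers(5.0))   # value near the class mean is covered
print(covers(9.0))   # value far from the class mean is not
```

Because one Gaussian term summarises a whole value range, it can replace the pair of threshold terms (e.g. `x > a` and `x <= b`) that interval-based rule inducers would otherwise need, which is one way fewer rule terms could be induced.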

Item Type: Conference or Workshop Item (Paper)
Refereed: Yes
Divisions: Science > School of Mathematical, Physical and Computational Sciences > Department of Computer Science
ID Code: 68358
Publisher: Springer International Publishing

