Induction of modular classification rules: using Jmax-pruning

Stahl, F. (ORCID: https://orcid.org/0000-0002-4860-0203) and Bramer, M. (2011) Induction of modular classification rules: using Jmax-pruning. In: Bramer, M., Petridis, M. and Hopgood, A. (eds.) Research and Development in Intelligent Systems XXVII. Springer, London, pp. 79-92. ISBN 9780857291295

Full text not archived in this repository. It is advisable to refer to the publisher's version if you intend to cite from this work.

DOI: 10.1007/978-0-85729-130-1_6

Abstract/Summary

The Prism family of algorithms induces modular classification rules which, in contrast to those produced by decision tree induction algorithms, do not necessarily fit together into a decision tree structure. Classifiers induced by Prism algorithms achieve accuracy comparable with decision trees and in some cases even outperform them. Both kinds of algorithms tend to overfit on large and noisy datasets, which has led to the development of pruning methods. Pruning methods use various metrics to truncate decision trees or to eliminate whole rules or individual rule terms from a Prism rule set. Many pre-pruning and post-pruning methods exist for decision trees, but for Prism algorithms only one pre-pruning method has been developed: J-pruning. Recent work with Prism algorithms examined J-pruning in the context of very large datasets and found that the current method does not realise its full potential. This paper revisits the J-pruning method for the Prism family of algorithms, develops a new pruning method, Jmax-pruning, discusses it in theoretical terms and evaluates it empirically.
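The abstract does not spell out the mechanics, but the idea behind Jmax-pruning can be sketched from its description: as a Prism rule is specialised term by term, the J-measure (the information-theoretic metric underlying J-pruning) of each partial rule is recorded, and the finished rule is then cut back to the term at which J peaked, rather than stopping induction at the first dip in J as J-pruning does. The Python below is a minimal illustrative sketch under those assumptions; the function names j_measure and jmax_prune and the toy numbers are hypothetical, not the authors' implementation.

    import math

    def j_measure(p_y, p_x_given_y, p_x):
        """J-measure of a rule 'IF y THEN class x' (Smyth & Goodman):
        rule coverage p(y) times the cross-entropy between the
        posterior p(x|y) and the prior p(x) of the target class."""
        def term(a, b):
            # Contribution a*log2(a/b); zero by convention when a == 0.
            return a * math.log2(a / b) if a > 0 and b > 0 else 0.0
        j = term(p_x_given_y, p_x) + term(1 - p_x_given_y, 1 - p_x)
        return p_y * j

    def jmax_prune(rule_terms, j_values):
        """Truncate a fully induced rule at the term where J peaked.

        rule_terms: (attribute, value) pairs in induction order.
        j_values:   J-measure of the partial rule after each term was added.
        """
        best = max(range(len(j_values)), key=lambda i: j_values[i])
        return rule_terms[: best + 1]

    # Toy example: J climbs, dips, then peaks at the third term, so
    # Jmax-pruning keeps terms 1-3 and drops the rest. Plain J-pruning
    # would have stopped after the first term, where J first decreased.
    terms = [("outlook", "sunny"), ("humidity", "high"),
             ("wind", "strong"), ("temp", "mild")]
    js = [0.12, 0.09, 0.21, 0.15]
    print(jmax_prune(terms, js))

This illustrates why inducing the full rule before pruning can recover rules that a greedy stopping criterion discards: the J-measure is not monotone in the number of rule terms, so its first decrease need not be its last.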