Implementation of deep neural networks (DNN) with batch normalization for batik pattern recognition

Nurhaida, I., Ayumi, V., Fitrianah, D., Zen, R. A.M., Noprisson, H. and Wei, H. (2020) Implementation of deep neural networks (DNN) with batch normalization for batik pattern recognition. International Journal of Electrical and Computer Engineering (IJECE), 10 (2). pp. 2045-2053. ISSN 2088-8708

Text (Open Access) - Published Version, available under a Creative Commons Attribution Share Alike license.


DOI: 10.11591/ijece.v10i2.pp2045-2053

Abstract/Summary

One of the most famous cultural heritages of Indonesia is batik. Batik is cloth decorated by drawing on it with malam (wax) and then processing it in a particular way. The diversity of motifs, both in Indonesia and in neighbouring countries, raises new research topics in information technology, including conservation, storage, publication, and the creation of new batik motifs. In computer science, batik patterns have been studied by many researchers, and several algorithms have been successfully applied to batik pattern recognition. This study focused on batik motif recognition using a fused texture feature set consisting of Gabor, Log-Gabor, and GLCM features, with PCA feature reduction to improve classification accuracy and reduce computation time. To improve accuracy further, we proposed a deep neural network (DNN) model to recognize batik patterns and used batch normalization (BN) as a regularizer to help the model generalize and to reduce time complexity. In the experiments, feature extraction, selection, and reduction gave better accuracy than the raw dataset, and feature selection and reduction also reduced time complexity. The DNN with BN significantly improved the accuracy of the classification model, from 65.36% to 83.15%; BN as a regularizer successfully made the model more general and hence improved its accuracy. Parameter tuning further improved accuracy from 83.15% to 85.57%.
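The pipeline described in the abstract (fused texture features reduced with PCA, then fed to a batch-normalized network) can be sketched in miniature with NumPy. This is an illustrative sketch only, not the authors' implementation: the feature dimension, number of retained components, and random data stand in for the real Gabor/Log-Gabor/GLCM feature vectors.

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for fused texture features: 64 samples x 120 features
# (assumed dimensions; real features would come from Gabor/Log-Gabor/GLCM)
X = rng.normal(size=(64, 120))

# --- PCA feature reduction via SVD of the centred data ---
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 16                   # number of retained components (assumption)
Z = Xc @ Vt[:k].T        # reduced features, shape (64, 16)

# --- Batch normalization: training-mode forward pass for one layer ---
def batch_norm(h, gamma, beta, eps=1e-5):
    mu = h.mean(axis=0)          # per-feature batch mean
    var = h.var(axis=0)          # per-feature batch variance
    return gamma * (h - mu) / np.sqrt(var + eps) + beta

gamma = np.ones(k)   # learnable scale, initialised to 1
beta = np.zeros(k)   # learnable shift, initialised to 0
H = batch_norm(Z, gamma, beta)
# After BN, each feature has approximately zero mean and unit variance,
# which is what stabilises and regularises training in the DNN+BN model.
```

In a full model, `batch_norm` would be applied to each hidden layer's pre-activations, with running statistics tracked for inference; this sketch shows only the normalization step itself.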

Item Type: Article
Refereed: Yes
Divisions: Faculty of Science > School of Mathematical, Physical and Computational Sciences > Department of Computer Science
ID Code: 95532
Publisher: IAES
