Implementation of Machine Learning Using the Convolution Neural Network Method for Aglaonema Interest Classification


Rachmat Rasyid
Abdul Ibrahim


One of the riches of the Indonesian nation is its many types of ornamental plants. One example is the Aglaonema flower, which is highly favored by ornamental-plant hobbyists, including homemakers; however, distinguishing Aglaonema varieties from other ornamental plants is a problem. The authors therefore conducted research using a current deep-learning approach, the convolutional neural network (CNN) method, to classify Aglaonema flowers. This research is motivated by the plant's distinctive leaves and colors. Using the CNN method, the Aglaonema varieties Adelia, Legacy, Widuri, RedKochin, and Tiara were classified with a moderate accuracy of 56%, while the varieties Sumatra and RedRuby achieved the highest accuracy of 61%.
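The classification pipeline described in the abstract (convolution, pooling, and a classifier over the flower varieties) can be sketched as a minimal forward pass in NumPy. The seven variety names are taken from the abstract; the layer sizes, single random filter, and random weights are illustrative assumptions only, not the authors' actual architecture.

```python
import numpy as np

# Variety names from the abstract; everything else below
# (image size, filter, weights) is an illustrative assumption.
CLASSES = ["Adelia", "Legacy", "Widuri", "RedKochin", "Tiara", "Sumatra", "RedRuby"]

def conv2d(image, kernel):
    """Valid 2-D convolution of a single-channel image with one filter."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(x, size=2):
    """Non-overlapping max pooling (stride equals window size)."""
    h, w = (x.shape[0] // size) * size, (x.shape[1] // size) * size
    return x[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

def softmax(z):
    """Numerically stable softmax over the class scores."""
    e = np.exp(z - z.max())
    return e / e.sum()

rng = np.random.default_rng(0)
image = rng.random((32, 32))           # stand-in for a grayscale leaf image
kernel = rng.standard_normal((3, 3))   # one random 3x3 convolution filter

# conv -> ReLU -> max-pool, the core CNN feature-extraction steps
features = max_pool(np.maximum(conv2d(image, kernel), 0.0))

# flatten and apply a dense layer + softmax over the 7 varieties
flat = features.flatten()
weights = rng.standard_normal((len(CLASSES), flat.size)) * 0.01
probs = softmax(weights @ flat)

print("predicted variety:", CLASSES[int(np.argmax(probs))])
```

In practice such a network would have several learned convolutional layers and be trained on labeled leaf images; this sketch only shows the shape of the computation that produces a probability per variety.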


How to Cite
Rasyid, R., & Ibrahim, A. (2021). Implementation of Machine Learning Using the Convolution Neural Network Method for Aglaonema Interest Classification. Jurnal E-Komtek (Elektro-Komputer-Teknik), 5(1), 21-30.

