Optimasi Learning Rate Neural Network Backpropagation Dengan Search Direction Conjugate Gradient Pada Electrocardiogram

Authors

  • Azwar Riza Habibi, Institut Teknologi dan Bisnis Asia Malang, Indonesia
  • Vivi Aida Fitria, Institut Teknologi dan Bisnis Asia Malang, Indonesia
  • Lukman Hakim, Institut Teknologi dan Bisnis Asia Malang, Indonesia

DOI:

https://doi.org/10.25217/numerical.v3i2.603

Keywords:

Neural Network; Conjugate Gradient; Weighting; Search Direction

Abstract

This paper develops a neural network (NN) using the conjugate gradient (CG) method. The modification of this method lies in defining the direction of the line search. The conjugate gradient method has several variants for determining the step size, such as the Fletcher-Reeves, Dixon, Polak-Ribière, Hestenes-Stiefel, and Dai-Yuan methods, which are applied here to discrete electrocardiogram data. The conjugate gradient is used to update the learning rate of the neural network using these different step sizes, while the gradient search direction is used to update the weights of the NN. The results show that Polak-Ribière achieves an optimal error, but the direction of the weight search on the NN widens, so NN training requires more epochs. However, Hestenes-Stiefel and Dai-Yuan could not find the gradient search direction, so they could not update the weights, causing the error and the number of epochs to go to infinity.
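
The coefficient β derived from the step size is what distinguishes the variants named above. The following is a minimal Python sketch, not the authors' code: it computes β for each variant and applies one conjugate-gradient style weight update for backpropagation. The function names, the fixed learning rate lr, and the NumPy vector representation of the gradients are illustrative assumptions.

    import numpy as np

    def beta_cg(g_new, g_old, d_old, variant="polak-ribiere"):
        """CG coefficient beta for several classical variants.

        g_new, g_old : current and previous gradient vectors of the NN weights
        d_old        : previous search direction
        """
        if variant == "fletcher-reeves":
            return (g_new @ g_new) / (g_old @ g_old)
        if variant == "polak-ribiere":
            return g_new @ (g_new - g_old) / (g_old @ g_old)
        if variant == "dixon":
            return (g_new @ g_new) / (-(d_old @ g_old))
        if variant == "hestenes-stiefel":
            return g_new @ (g_new - g_old) / (d_old @ (g_new - g_old))
        if variant == "dai-yuan":
            return (g_new @ g_new) / (d_old @ (g_new - g_old))
        raise ValueError(f"unknown variant: {variant}")

    def cg_weight_update(w, g_new, g_old, d_old, lr=0.01, variant="polak-ribiere"):
        """One CG-style backpropagation update of the weight vector w.

        The new search direction mixes steepest descent with the previous
        direction; the weights then move along it with step size lr.
        """
        beta = beta_cg(g_new, g_old, d_old, variant)
        d_new = -g_new + beta * d_old   # conjugate search direction
        w_new = w + lr * d_new          # weight update along that direction
        return w_new, d_new

Note that the Hestenes-Stiefel and Dai-Yuan coefficients share the denominator d_old · (g_new − g_old); when that quantity vanishes, no usable search direction is produced, which is consistent with the divergent behaviour reported in the abstract.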

Published

2020-01-06

How to Cite

Habibi, A. R., Fitria, V. A., & Hakim, L. (2020). Optimasi Learning Rate Neural Network Backpropagation Dengan Search Direction Conjugate Gradient Pada Electrocardiogram. Numerical: Jurnal Matematika Dan Pendidikan Matematika, 3(2), 131–137. https://doi.org/10.25217/numerical.v3i2.603

Section

Artikel Pendidikan Matematika