Accuracy of the Neurons Number in the Hidden Layer of the Levenberg-Marquardt Algorithm
Hindayati Mustafidah1, Suwarsito2, Silvia Nila Candra Permatasari3
1Hindayati Mustafidah*, Informatics Engineering, Universitas Muhammadiyah Purwokerto, Purwokerto, Indonesia.
2Suwarsito, Geography, Universitas Muhammadiyah Purwokerto, Purwokerto, Indonesia.
3Silvia Nila Candra Permatasari, Informatics Engineering, Universitas Muhammadiyah Purwokerto, Purwokerto, Indonesia.

Manuscript received on November 15, 2019. | Revised Manuscript received on November 23, 2019. | Manuscript published on November 30, 2019. | PP: 2349-2353 | Volume-8 Issue-4, November 2019. | Retrieval Number: D8259118419/2019©BEIESP | DOI: 10.35940/ijrte.D8259.118419

© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open access article under the CC-BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)

Abstract: Backpropagation, a learning method for artificial neural networks, is widely used to solve problems in many fields, including education, where it has been applied to predict the validity of test items, student achievement, and new student admissions. The performance of a training algorithm is judged by the error (MSE) the network produces: the smaller the error, the more optimal the algorithm. Previous studies found that the most optimal training algorithm, measured by the smallest error, was Levenberg-Marquardt, with an average MSE = 0.001 in the 5-10-1 model at a significance level of α = 5%. In this study, we test the Levenberg-Marquardt algorithm with 8, 12, 14, 16, and 19 neurons in the hidden layer, at learning rates (LR) of 0.01, 0.05, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, and 1. The study uses a mixed method: development, with quantitative and qualitative testing using ANOVA and correlation analysis. The experiments use random data with ten neurons in the input layer and one neuron in the output layer. Based on ANOVA analysis of the five variations in the number of hidden-layer neurons, at α = 5% as in the previous research, the Levenberg-Marquardt algorithm produced the smallest MSE of 0.00019584038 ± 0.000239300998, reached with 16 neurons in the hidden layer at LR = 0.8.
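The experiment described above (a 10-16-1 feedforward network trained with Levenberg-Marquardt, evaluated by MSE) can be sketched as follows. This is a minimal illustration, not the authors' actual setup: the paper's experiments were presumably run in a neural-network toolbox (e.g., MATLAB's trainlm, where the LR grid maps to the damping/learning parameters), whereas here SciPy's `least_squares` with `method='lm'` (MINPACK's Levenberg-Marquardt) is used to fit the network weights directly, and the data are synthetic stand-ins for the study's random data.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

# Synthetic stand-in data (the study used random data): 300 samples,
# 10 inputs and 1 output, matching the 10-16-1 architecture reported.
X = rng.uniform(-1.0, 1.0, size=(300, 10))
y = np.tanh(X.sum(axis=1, keepdims=True))  # arbitrary smooth target

n_in, n_hid, n_out = 10, 16, 1  # 16 hidden neurons, as in the best model

def unpack(theta):
    """Split the flat parameter vector into layer weights and biases."""
    i = 0
    W1 = theta[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = theta[i:i + n_hid]; i += n_hid
    W2 = theta[i:i + n_hid * n_out].reshape(n_hid, n_out); i += n_hid * n_out
    b2 = theta[i:i + n_out]
    return W1, b1, W2, b2

def forward(theta, X):
    W1, b1, W2, b2 = unpack(theta)
    h = np.tanh(X @ W1 + b1)  # tanh (tansig-style) hidden layer
    return h @ W2 + b2        # linear output layer

def residuals(theta):
    # Levenberg-Marquardt minimizes the sum of squared residuals,
    # which is proportional to the network's MSE.
    return (forward(theta, X) - y).ravel()

n_params = n_in * n_hid + n_hid + n_hid * n_out + n_out
theta0 = rng.normal(scale=0.5, size=n_params)

# method='lm' invokes MINPACK's Levenberg-Marquardt implementation;
# it requires at least as many residuals (300) as parameters (193).
res = least_squares(residuals, theta0, method='lm')
mse = float(np.mean(res.fun ** 2))
print(f"final MSE: {mse:.6g}")
```

Repeating this fit over the grid of hidden-layer sizes (8, 12, 14, 16, 19) and collecting the resulting MSE values would reproduce the shape of the comparison the abstract reports, with ANOVA then applied to the per-configuration errors.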
Keywords: Accuracy, Hidden Layer, Levenberg-Marquardt, MSE.
Scope of the Article: Parallel and Distributed Algorithms.