Please use this identifier to cite or link to this item: http://repository.potensi-utama.ac.id/jspui/jspui/handle/123456789/5567
Title: Effect of Gradient Descent With Momentum Backpropagation Training Function in Detecting Alphabet Letters
Authors: Rosnelly, Rika
Keywords: Letter Writing Recognition, Haar Wavelet, Backpropagation, Image, Artificial neural network
Issue Date: Jan-2023
Publisher: SINKRON
Series/Report no.: Volume 8, Issue 1
Abstract: This research uses the Momentum Backpropagation Neural Network method to recognize characters in a letter image. Before recognition, the letter image is converted into a binary image, the binary image is segmented to isolate the characters to be recognized, and the dimensionality of each segmented character is reduced using the Haar Wavelet. Compared with humans, one weakness of computer systems is recognizing character patterns without supporting methods. An Artificial Neural Network (ANN) is a computational model inspired by the human nervous system; several training methods are available, and training is used to increase the accuracy with which the computer recognizes patterns. One of the ANN algorithms used to train on and detect an image is backpropagation. With the Artificial Neural Network (ANN) method, the system can recognize handwritten letter patterns well, which makes it easier for humans to read letters that are difficult to decipher due to various writing errors. The training process using the Backpropagation algorithm reached 100% accuracy on 90 training samples, and testing likewise reached 100% accuracy on the 90 test samples.
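The preprocessing and feature-reduction steps described in the abstract can be illustrated with a short sketch. The following Python example is illustrative only: the function names, threshold value, and patch size are assumptions, not values taken from the paper. It shows binarization of a grayscale character patch followed by one level of 2-D Haar wavelet approximation, which halves each image dimension before the features are passed to the network.

import numpy as np

def binarize(gray, threshold=128):
    # Convert a grayscale image to a binary (0/1) image.
    return (gray < threshold).astype(np.float64)

def haar_reduce(img):
    # One level of 2-D Haar wavelet decomposition: keep only the
    # low-frequency approximation, halving each dimension.
    h, w = img.shape
    img = img[:h - h % 2, :w - w % 2]          # trim to even size
    rows = (img[0::2, :] + img[1::2, :]) / 2.0      # average pairs of rows
    approx = (rows[:, 0::2] + rows[:, 1::2]) / 2.0  # average pairs of columns
    return approx

# Example: a segmented 32x32 character patch reduced to a 16x16 feature map.
patch = np.random.randint(0, 256, (32, 32)).astype(np.float64)
features = haar_reduce(binarize(patch)).flatten()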
Description: Artificial neural networks are artificial intelligence systems that attempt to imitate human intelligence, not in its physical form but in the way it processes information (Budiharto & Suhartono, 2014). An artificial neural network is an information processing system designed to mimic the way the human brain solves a problem, learning through changes in synapse weights; it is able to recognize patterns based on past activity (Hayadi et al., 2021). An artificial neural network (ANN) is essentially a mathematical model that defines a function f : X → Y. Artificial neural networks are an information processing paradigm inspired by biological neural systems, such as information processing in the human brain (Herawan Hayadi et al., 2017). The key element of this paradigm is the structure of the information processing system, which consists of a large number of interconnected processing elements working simultaneously to solve a particular problem. An ANN works in a manner analogous to the human brain and is configured for a specific application, such as pattern recognition or data classification. Among the many techniques available for implementing artificial neural networks are the Perceptron and Backpropagation (Nuraeni, 2009). In this study the authors used the Backpropagation algorithm. Backpropagation is a gradient descent algorithm that minimizes the squared output error. Network training consists of three stages: the forward propagation stage, the back propagation stage, and the weight and bias update stage (Hayadi, 2018). The network architecture consists of an input layer, a hidden layer, and an output layer (Dalgleish et al., 2014).
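The three training stages named above can be sketched for a network with one hidden layer trained by gradient descent with momentum. This is a minimal illustrative Python sketch, not the authors' implementation; the layer sizes, learning rate, and momentum coefficient are assumptions chosen only to make the example self-contained.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 256, 32, 26          # e.g. 16x16 features -> 26 letters (assumed sizes)
W1 = rng.normal(0, 0.1, (n_in, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.1, (n_hidden, n_out)); b2 = np.zeros(n_out)
vW1 = np.zeros_like(W1); vb1 = np.zeros_like(b1)
vW2 = np.zeros_like(W2); vb2 = np.zeros_like(b2)
lr, momentum = 0.1, 0.9                       # assumed hyperparameters

def train_step(x, target):
    global W1, b1, W2, b2, vW1, vb1, vW2, vb2
    # Forward propagation stage.
    h = sigmoid(x @ W1 + b1)
    y = sigmoid(h @ W2 + b2)
    # Back propagation stage: gradients of the squared output error.
    delta_out = (y - target) * y * (1 - y)
    delta_hid = (delta_out @ W2.T) * h * (1 - h)
    # Weight and bias update stage, with a momentum term added to each step.
    vW2 = momentum * vW2 - lr * np.outer(h, delta_out); W2 += vW2
    vb2 = momentum * vb2 - lr * delta_out;              b2 += vb2
    vW1 = momentum * vW1 - lr * np.outer(x, delta_hid); W1 += vW1
    vb1 = momentum * vb1 - lr * delta_hid;              b1 += vb1
    return 0.5 * np.sum((y - target) ** 2)

# Example usage: one feature vector and a one-hot target for the letter "A".
x = rng.random(n_in)
t = np.zeros(n_out); t[0] = 1.0
for _ in range(100):
    loss = train_step(x, t)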
URI: http://repository.potensi-utama.ac.id/jspui/jspui/handle/123456789/5567
ISSN: 2541-2019
Appears in Collections:A Paper


