
Introduction to Neural Networks Using MATLAB 6.0: A Hands-on Approach with MATLAB Code and Exercises



Written for undergraduate students in computer science, this book provides a comprehensive overview of the field of neural networks. It presents applications of neural networks to areas such as bioinformatics, robotics, communication, image processing, and healthcare. Topics covered include fundamental models of artificial neural networks, perceptron networks, and adaptive resonance theory.







The world we live in is becoming ever more reliant on electronic gadgets and computers to control the behavior of real-world resources. For example, an increasing amount of commerce is performed without a single bank note or coin ever being exchanged. Similarly, airports can safely land and send off aeroplanes without anyone even looking out of a window. Another, more individual, example is the increasing use of electronic personal organizers for arranging meetings and contacts. All of these examples share a similar structure: multiple parties (e.g. aeroplanes or people) come together to coordinate their activities in order to achieve a common goal. It is not surprising, then, that a lot of research is being done on how the mechanics of the coordination process can be automated using computers. This is where neural networks come in.


This paper reviews methods for fixing the number of hidden neurons in neural networks over the past 20 years, and it also proposes a new method for fixing the hidden neurons in Elman networks for wind speed prediction in renewable energy systems. Random selection of the number of hidden neurons can cause either overfitting or underfitting, and this paper proposes a solution to these problems. To fix the hidden neurons, 101 different criteria are tested based on statistical errors. The results show that the proposed model improves accuracy and minimizes error. The soundness of the neural network design based on the selection criteria is substantiated using a convergence theorem. To verify the effectiveness of the model, simulations were conducted on real-time wind data. The experimental results show that the proposed approach can be used for wind speed prediction with minimal error. A survey has been made of methods for fixing the number of hidden neurons in neural networks. The proposed model is simple, has minimal error, and is efficient for fixing the hidden neurons in Elman networks.


One of the major problems facing researchers is the selection of the number of hidden neurons in neural networks (NN). This matters because a network trained to very small errors on the training data may not respond properly in wind speed prediction. There is an overtraining issue in the design of the NN training process. Overtraining is akin to the problem of overfitting data: the network matches the training data so closely that it loses its ability to generalize over the test data.
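To make the overtraining idea concrete, here is a minimal sketch (not from the book or the paper) of early stopping: a small single-hidden-layer network is trained by gradient descent on synthetic data, and training halts once the error on a held-out validation split stops improving. The data, architecture, and hyperparameters are illustrative assumptions.

```python
# Early-stopping sketch: stop training when validation error stops improving.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, (200, 1))
y = np.sin(3 * x) + 0.1 * rng.standard_normal(x.shape)       # noisy target
x_tr, y_tr, x_va, y_va = x[:150], y[:150], x[150:], y[150:]  # train / validation split

H = 20                                                        # hidden neurons
W1 = 0.5 * rng.standard_normal((1, H)); b1 = np.zeros(H)
W2 = 0.5 * rng.standard_normal((H, 1)); b2 = np.zeros(1)

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

best_va, patience, wait = np.inf, 200, 0
for epoch in range(20000):
    h, out = forward(x_tr)
    err = out - y_tr
    # backpropagation through the two layers
    gW2 = h.T @ err / len(x_tr); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h**2)
    gW1 = x_tr.T @ dh / len(x_tr); gb1 = dh.mean(0)
    for p, g in ((W2, gW2), (b2, gb2), (W1, gW1), (b1, gb1)):
        p -= 0.1 * g                                          # gradient step
    va_mse = np.mean((forward(x_va)[1] - y_va) ** 2)
    if va_mse < best_va - 1e-6:                               # still improving
        best_va, wait = va_mse, 0
    else:
        wait += 1
    if wait > patience:                                       # stop before overtraining
        print(f"early stop at epoch {epoch}, validation MSE {best_va:.4f}")
        break
```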


An artificial neural network (ANN) is an information processing system inspired by models of biological neural networks [1]. It is an adaptive system that changes its structure, and the internal information flowing through the network, during the training phase. ANNs are widely used in many areas because of features such as a strong capacity for nonlinear mapping, high learning accuracy, and good robustness. ANNs can be classified into feedforward and feedback networks. The backpropagation network and the radial basis function network are examples of feedforward networks, while the Elman network is an example of a feedback network. The feedback has a profound impact on learning capacity and on performance in modeling nonlinear dynamical phenomena.
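As an illustration of the feedback structure, the following is a minimal sketch of the forward pass of an Elman (simple recurrent) network, in which the previous hidden state is fed back through a context layer. The layer sizes, tanh activation, and random weights are assumptions for illustration, not the paper's configuration.

```python
# Elman (simple recurrent) network forward pass: previous hidden state feeds back.
import numpy as np

rng = np.random.default_rng(1)
n_in, n_hidden, n_out = 3, 8, 1          # e.g. speed, direction, temperature -> speed

W_xh = 0.3 * rng.standard_normal((n_in, n_hidden))      # input  -> hidden
W_hh = 0.3 * rng.standard_normal((n_hidden, n_hidden))  # context (previous hidden) -> hidden
W_hy = 0.3 * rng.standard_normal((n_hidden, n_out))     # hidden -> output

def elman_forward(sequence):
    """Run one input sequence through the network, returning all outputs."""
    h = np.zeros(n_hidden)               # context layer starts empty
    outputs = []
    for x_t in sequence:
        # feedback: the hidden state of the previous step re-enters via W_hh
        h = np.tanh(x_t @ W_xh + h @ W_hh)
        outputs.append(h @ W_hy)
    return np.array(outputs)

sequence = rng.standard_normal((10, n_in))   # ten time steps of inputs
print(elman_forward(sequence).shape)         # (10, 1)
```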


Hidden neurons influence the error on the nodes to which their outputs are connected. The stability of a neural network is estimated by its error: minimal error reflects better stability, and higher error reflects worse stability. Excessive hidden neurons cause overfitting; that is, the neural network overestimates the complexity of the target problem [3]. This greatly degrades generalization capability and leads to significant deviation in prediction. In this sense, determining the proper number of hidden neurons to prevent overfitting is critical in prediction problems. The modeling process involves creating a model and developing it with proper values [4]. One of the major challenges in the design of neural networks is fixing the number of hidden neurons with minimal error and highest accuracy. The training and generalization errors are likely to be high before learning begins. During training, the network adapts to decrease the error on the training patterns. The accuracy of training is determined by the parameters under consideration, which include the NN architecture, the number of hidden neurons in the hidden layer, the activation function, the inputs, and the updating of weights.
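A small experiment makes the effect of the hidden-neuron count visible. The sketch below (assuming scikit-learn is available; the data and candidate sizes are made up and unrelated to the paper's 101 criteria) trains an MLP with several hidden-layer sizes and compares training and test MSE; a growing gap between the two signals overfitting.

```python
# Sweep candidate hidden-layer sizes and compare training vs held-out error.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (300, 3))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.standard_normal(300)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for n_hidden in (2, 5, 10, 50, 200):
    net = MLPRegressor(hidden_layer_sizes=(n_hidden,), max_iter=5000, random_state=0)
    net.fit(X_tr, y_tr)
    tr = mean_squared_error(y_tr, net.predict(X_tr))
    te = mean_squared_error(y_te, net.predict(X_te))
    # a large gap between training and test error signals overfitting
    print(f"hidden={n_hidden:4d}  train MSE={tr:.4f}  test MSE={te:.4f}")
```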


Thus, various criteria for fixing the number of hidden neurons have been proposed by researchers over the last couple of decades. Most researchers have fixed the number of hidden neurons by trial and error. In this paper, a new method is proposed and applied to an Elman network for wind speed prediction, and a survey is made of methods for fixing the number of hidden neurons in neural networks over the past 20 years. All proposed criteria are tested using a convergence theorem, which converges infinite sequences into finite sequences. The main objective is to minimize error and to improve the accuracy and stability of the network. This review is intended to help researchers working in this field select the proper number of hidden neurons in neural networks.


Several researchers have tried and proposed many methodologies to fix the number of hidden neurons. The survey of methods for finding the number of hidden neurons in neural networks is described here in chronological order. In 1991, Sartori and Antsaklis [5] proposed a method to find the number of hidden neurons in a multilayer neural network for an arbitrary training set with P training patterns. Several existing methods have been optimized for selecting the hidden neurons in neural networks. In 1993, Arai [6] proposed two parallel hyperplane methods for finding the number of hidden neurons, a number sufficient for this design of the network.


In 1995, Li et al. [7] investigated estimation theory to find the number of hidden units in higher-order feedforward neural networks, and applied the theory to time series prediction. The optimal number of hidden neurons is determined once a sufficient number of hidden neurons is assumed. According to the estimation theory, the sufficient numbers of hidden units in the second-order and first-order neural networks are 4 and 7, respectively. The simulation results show that the second-order neural network is better than the first-order network in training convergence. Accordingly, a network with few nodes in the hidden layer will not be powerful enough for most applications. The drawback is long training and testing time.


In 2010, Doukim et al. [21] proposed a coarse-to-fine search technique to find the number of hidden neurons in an MLP network, applied to skin detection. The technique combines a binary search with a sequential search. In this implementation, 30 networks are trained and searched for the lowest mean squared error, and the sequential search is then performed to find the best number of hidden neurons. Yuan et al. [22] proposed a method for estimating the number of hidden neurons based on information entropy, built on a decision tree algorithm. The goal is to avoid the overlearning problem caused by an excessive number of hidden neurons and to avoid the shortage of capacity caused by too few. The number of hidden neurons of a feedforward neural network is generally decided on the basis of experience. In 2010, Wu and Hong [23] proposed learning algorithms for determining the number of hidden neurons. In 2011, Panchal et al. [24] proposed a methodology to analyze the behavior of the MLP; in it, the number of hidden layers is inversely proportional to the minimal error.
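The coarse-to-fine idea can be sketched roughly as follows. The evaluate function is a hypothetical stand-in for "train a network with n hidden neurons and return its validation MSE"; the search bounds and the split into a halving stage followed by a sequential stage are assumptions for illustration, not the authors' exact procedure.

```python
# Coarse-to-fine search over hidden-neuron counts.
def coarse_to_fine(evaluate, low=1, high=128, fine_radius=4):
    # coarse stage: binary-style halving of the search interval
    while high - low > fine_radius:
        mid = (low + high) // 2
        left, right = evaluate(mid - 1), evaluate(mid + 1)
        if left < right:          # error is smaller towards fewer neurons
            high = mid
        else:
            low = mid
    # fine stage: sequential search over the remaining small interval
    candidates = range(max(1, low - fine_radius), high + fine_radius + 1)
    return min(candidates, key=evaluate)

# toy usage with a made-up error surface whose minimum is at 20 hidden neurons
best = coarse_to_fine(lambda n: (n - 20) ** 2 + 5)
print(best)   # 20
```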


Generally, a neural network in a wind farm application involves training, testing, and developing a model at the end stage. A sound NN design is important for outperforming less accurate models. The required input data are wind speed, wind direction, and temperature. Collected data with larger values tend to suppress the influence of smaller variables during training. To overcome this problem, the min-max normalization technique, which enhances the accuracy of the ANN model, is used, and the data are scaled within the range [0, 1]. The scaling is carried out to improve the accuracy of subsequent numeric computation. The selection criteria for fixing the hidden neurons are important in the prediction of wind speed. The soundness of the NN design based on the selection criteria is substantiated using a convergence theorem. After normalization, the network can be trained on past data. The performance of the trained network is evaluated in two ways: first, by comparing actual and predicted wind speeds, and second, by computing the statistical errors of the network. Finally, the wind speed, the output of the proposed NN model, is predicted.
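The pre-processing and evaluation steps described above can be sketched as follows; the column values are made-up samples, and the error measures shown (MSE, MAE, RMSE) are common choices rather than the paper's exact list.

```python
# Min-max scaling of the inputs to [0, 1] and simple statistical error measures.
import numpy as np

def min_max_scale(X):
    """Scale each column of X to the range [0, 1]."""
    x_min, x_max = X.min(axis=0), X.max(axis=0)
    return (X - x_min) / (x_max - x_min)

# columns: wind speed, wind direction, temperature (made-up sample values)
raw = np.array([[5.2, 180.0, 21.0],
                [7.8, 200.0, 23.5],
                [3.1, 150.0, 19.0]])
print(min_max_scale(raw))            # every column now lies in [0, 1]

def statistical_errors(actual, predicted):
    err = actual - predicted
    mse = np.mean(err ** 2)
    return {"MSE": mse, "MAE": np.mean(np.abs(err)), "RMSE": np.sqrt(mse)}

print(statistical_errors(np.array([5.1, 6.0, 4.2]), np.array([5.4, 5.6, 4.5])))
```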


In this paper, a survey has been made of the design of neural networks for fixing the number of hidden neurons. The proposed model was introduced and tested with real-time wind data, and the results were compared using various statistical errors. The proposed approach aims at selecting the proper number of hidden neurons in an Elman network for wind speed prediction in renewable energy systems, and its performance is also analyzed using statistical errors. The following conclusions were obtained: (1) methods to fix the number of hidden neurons in neural networks over the past 20 years are reviewed; (2) selecting the number of hidden neurons provides a better framework for designing the proposed Elman network; (3) the errors made by the Elman network are reduced; (4) wind speed in renewable energy systems is predicted accurately; (5) the stability and accuracy of the network are improved.

