A. E. Ruano

7004284159

Publications - 12

Exploiting the functional training approach in Takagi-Sugeno neuro-fuzzy systems

Publication Name: Advances in Intelligent Systems and Computing

Publication Date: 2013-01-01

Volume: 195 AISC

Issue: Unknown

Page Range: 543-559

Description:

When used for function approximation purposes, neural networks and neuro-fuzzy systems belong to a class of models whose parameters can be separated into linear and nonlinear, according to their influence on the model output. This concept of parameter separability can also be applied when the training problem is formulated as the minimization of the integral of the (functional) squared error over the input domain. Using this approach, the computation of the derivatives involves terms that depend only on the model and the input domain, and terms which are the projection of the target function on the basis functions and on their derivatives with respect to the nonlinear parameters, over the input domain. These latter terms can be numerically computed from the data. The use of the functional approach is introduced here for Takagi-Sugeno models. An example shows that this approach obtains better results than the standard, discrete technique, as the performance surface employed is more similar to the one obtained with the function underlying the data. In some cases, as shown in the example, a complete analytical solution can be found. © 2013 Springer-Verlag Berlin Heidelberg.
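To make the parameter-separability idea concrete, the following sketch (illustrative only, not code from the paper; the Gaussian basis, the domain [0, 1], and the target function are assumptions) computes the optimal linear parameters for the functional criterion: the Gram matrix depends only on the model and the input domain, while the right-hand side is the projection of the target function on the basis functions.

```python
import numpy as np

# Illustrative fixed basis: Gaussians with frozen (nonlinear) centres and width.
centres = [0.2, 0.5, 0.8]
width = 0.15
basis = [lambda x, c=c: np.exp(-((x - c) / width) ** 2) for c in centres]

def target(x):
    # Function underlying the data, assumed known here for the illustration.
    return np.sin(2 * np.pi * x)

# Functional criterion: integrals over the input domain [0, 1], approximated
# here on a fine grid (in some cases they can be obtained analytically).
xg = np.linspace(0.0, 1.0, 2001)
dx = xg[1] - xg[0]
Phi = np.stack([phi(xg) for phi in basis])                 # basis on the grid
G = (Phi[:, None, :] * Phi[None, :, :]).sum(axis=-1) * dx  # Gram matrix: model/domain only
b = (Phi * target(xg)).sum(axis=-1) * dx                   # projection of the target
w = np.linalg.solve(G, b)                                  # optimal linear parameters
```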

Open Access: Yes

DOI: 10.1007/978-3-642-33941-7_48

Extending the functional training approach for B-splines

Publication Name: Proceedings of the International Joint Conference on Neural Networks

Publication Date: 2012-08-22

Volume: Unknown

Issue: Unknown

Page Range: Unknown

Description:

When used for function approximation purposes, neural networks belong to a class of models whose parameters can be separated into linear and nonlinear, according to their influence on the model output. This concept of parameter separability can also be applied when the training problem is formulated as the minimization of the integral of the (functional) squared error over the input domain. Using this approach, the computation of the gradient involves terms that depend only on the model and the input domain, and terms which are the projection of the target function on the basis functions and on their derivatives with respect to the nonlinear parameters, over the input domain. This paper extends the application of this formulation to B-splines, describing how the Levenberg-Marquardt method can be applied using this methodology. Simulation examples show that the use of the functional approach yields important savings in computational complexity and a better approximation over the whole input domain. © 2012 IEEE.
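The Levenberg-Marquardt step referred to above solves the damped normal equations (JᵀJ + λI)Δθ = −Jᵀr. The minimal sketch below (illustrative only; the one-parameter exponential model is a hypothetical stand-in, not the B-spline formulation of the paper) shows the basic accept/reject damping scheme.

```python
import numpy as np

def levenberg_marquardt(residual, jacobian, theta, n_iter=50, lam=1e-2):
    """Minimise 0.5*||r(theta)||^2 via the damped (LM) normal equations."""
    for _ in range(n_iter):
        r = residual(theta)
        J = jacobian(theta)
        # Solve (J^T J + lam*I) step = -J^T r
        step = np.linalg.solve(J.T @ J + lam * np.eye(theta.size), -J.T @ r)
        if np.sum(residual(theta + step) ** 2) < np.sum(r ** 2):
            theta, lam = theta + step, lam * 0.5   # accept: closer to Gauss-Newton
        else:
            lam *= 2.0                             # reject: closer to gradient descent
    return theta

# Toy one-parameter model y = exp(a*x), data generated with a = 1.5.
x = np.linspace(0.0, 1.0, 20)
y = np.exp(1.5 * x)
res = lambda th: np.exp(th[0] * x) - y
jac = lambda th: (x * np.exp(th[0] * x)).reshape(-1, 1)
a_hat = levenberg_marquardt(res, jac, np.array([0.0]))
```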

Open Access: Yes

DOI: 10.1109/IJCNN.2012.6252741

Exploiting the functional training approach in B-splines

Publication Name: IFAC Proceedings Volumes (IFAC-PapersOnLine)

Publication Date: 2012-01-01

Volume: 45

Issue: 4

Page Range: 127-132

Description:

When used for function approximation purposes, neural networks belong to a class of models whose parameters can be separated into linear and nonlinear, according to their influence on the model output. This concept of parameter separability can also be applied when the training problem is formulated as the minimization of the integral of the (functional) squared error over the input domain. Using this approach, the computation of the gradient involves terms that depend only on the model and the input domain, and terms which are the projection of the target function on the basis functions and on their derivatives with respect to the nonlinear parameters, over the input domain. These latter terms can be numerically computed from the data. The use of the functional approach is introduced here for B-splines. An example shows that, besides large savings in computational complexity, this approach obtains better results than the standard, discrete technique, as the performance surface employed is more similar to the one obtained with the function underlying the data. In some cases, as shown in the example, a complete analytical solution can be found. © 2012 IFAC.

Open Access: Yes

DOI: 10.3182/20120403-3-DE-3010.00070

A new domain decomposition for B-spline neural networks

Publication Name: Proceedings of the International Joint Conference on Neural Networks

Publication Date: 2010-01-01

Volume: Unknown

Issue: Unknown

Page Range: Unknown

Description:

B-spline Neural Networks (BSNNs) belong to the class of networks termed grid or lattice-based associative memory networks (AMN). The grid is a key feature, since it allows these networks to exhibit properties which make them efficient in solving problems such as function approximation, non-linear system identification, and on-line control. The main problem associated with BSNNs is that the model complexity grows exponentially with the number of input variables. To tackle this drawback, different authors have developed heuristics for functional decomposition, such as the ASMOD algorithm or evolutionary approaches [2]. In this paper, we present a complementary approach, allowing the properties of B-spline models to be achieved with non-full grids. This approach can be applied either to a single model or to an ASMOD decomposition. Simulation results show that comparable results, in terms of approximation quality, can be obtained with less complex models. © 2010 IEEE.
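For reference, the B-spline basis functions underlying BSNNs can be evaluated with the Cox-de Boor recursion; the sketch below (illustrative only; the uniform knot vector is an arbitrary example, not from the paper) also exhibits the partition-of-unity property on the valid interval of the grid.

```python
import numpy as np

def bspline_basis(i, k, t, x):
    """i-th B-spline basis function of order k (degree k-1) on knot vector t,
    evaluated at x via the Cox-de Boor recursion."""
    if k == 1:
        return 1.0 if t[i] <= x < t[i + 1] else 0.0
    left = right = 0.0
    if t[i + k - 1] > t[i]:
        left = (x - t[i]) / (t[i + k - 1] - t[i]) * bspline_basis(i, k - 1, t, x)
    if t[i + k] > t[i + 1]:
        right = (t[i + k] - x) / (t[i + k] - t[i + 1]) * bspline_basis(i + 1, k - 1, t, x)
    return left + right

# Quadratic (order-3) basis on a uniform knot vector; on the valid part of
# the grid the active basis functions form a partition of unity.
knots = np.arange(-2.0, 6.0)                    # 8 knots -> 5 order-3 functions
vals = [bspline_basis(i, 3, knots, 1.7) for i in range(5)]
```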

Open Access: Yes

DOI: 10.1109/IJCNN.2010.5596648

Applying bacterial memetic algorithm for training feedforward and fuzzy flip-flop based neural networks

Publication Name: 2009 International Fuzzy Systems Association World Congress and 2009 European Society for Fuzzy Logic and Technology Conference (IFSA-EUSFLAT 2009), Proceedings

Publication Date: 2009-12-01

Volume: Unknown

Issue: Unknown

Page Range: 1833-1838

Description:

In our previous work we proposed some extensions of the Levenberg-Marquardt algorithm: the Bacterial Memetic Algorithm and the Bacterial Memetic Algorithm with Modified Operator Execution Order for fuzzy rule base extraction from input-output data. Furthermore, we have investigated fuzzy flip-flop based feedforward neural networks. In this paper we introduce the adaptation of the Bacterial Memetic Algorithm with Modified Operator Execution Order for training feedforward and fuzzy flip-flop based neural networks. We found that training these types of neural networks with the adaptation of the method we had used to train fuzzy rule bases had advantages over earlier conventional methods.

Open Access: Yes

DOI: Not available

Fuzzy rule extraction by bacterial memetic algorithms

Publication Name: International Journal of Intelligent Systems

Publication Date: 2009-03-01

Volume: 24

Issue: 3

Page Range: 312-339

Description:

In our previous papers, fuzzy model identification methods were discussed. The bacterial evolutionary algorithm for extracting a fuzzy rule base from a training set was presented. The Levenberg-Marquardt method was also proposed for determining membership functions in fuzzy systems. The combination of evolutionary and gradient-based learning techniques is usually called a memetic algorithm. In this paper, a new kind of memetic algorithm, the bacterial memetic algorithm, is introduced for fuzzy rule extraction. The paper presents how the bacterial evolutionary algorithm can be improved with the Levenberg-Marquardt technique. © 2009 Wiley Periodicals, Inc.
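A minimal sketch of the memetic scheme described above (illustrative only: the toy objective, the plain gradient step standing in for the Levenberg-Marquardt local search, and all parameter values are assumptions, not the authors' formulation) combines the two bacterial operators, mutation and gene transfer, with a local refinement step.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(v):
    # Toy objective standing in for the rule-base error; lower is better.
    return np.sum((v - 0.5) ** 2)

def bacterial_mutation(v, n_clones=4, sigma=0.3):
    """Mutate each gene in turn in several clones and keep the best variant."""
    for gene in rng.permutation(v.size):
        clones = np.tile(v, (n_clones, 1))
        clones[:, gene] += rng.normal(0.0, sigma, n_clones)
        candidates = np.vstack([v, clones])
        v = candidates[np.argmin([fitness(c) for c in candidates])]
    return v

def local_search(v, steps=5, lr=0.2):
    """Gradient step standing in for the Levenberg-Marquardt local search."""
    for _ in range(steps):
        v = v - lr * 2.0 * (v - 0.5)       # gradient of the toy objective
    return v

def gene_transfer(pop):
    """The worst bacterium receives one gene from the best."""
    order = np.argsort([fitness(v) for v in pop])
    gene = rng.integers(pop.shape[1])
    pop[order[-1], gene] = pop[order[0], gene]
    return pop

pop = rng.uniform(-1.0, 1.0, size=(6, 3))   # 6 bacteria, 3 "rule parameters" each
for _ in range(10):
    pop = np.array([local_search(bacterial_mutation(v)) for v in pop])
    pop = gene_transfer(pop)
best = pop[np.argmin([fitness(v) for v in pop])]
```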

Open Access: Yes

DOI: 10.1002/int.20338

Fuzzy rule base extraction by the improved bacterial memetic algorithm

Publication Name: SAMI 2008, 6th International Symposium on Applied Machine Intelligence and Informatics, Proceedings

Publication Date: 2008-08-25

Volume: Unknown

Issue: Unknown

Page Range: 49-53

Description:

In this paper we introduce new methods for handling knot order violations occurring in the Bacterial Memetic Algorithm (BMA) used for fuzzy rule base extraction. These methods perform slightly better than the method used before and are easier to integrate with the Bacterial Memetic Algorithm. © 2008 IEEE.

Open Access: Yes

DOI: 10.1109/SAMI.2008.4469132

Genetic and Bacterial Programming for B-Spline Neural Networks Design

Publication Name: Journal of Advanced Computational Intelligence and Intelligent Informatics

Publication Date: 2007-03-01

Volume: 11

Issue: 2

Page Range: 220-231

Description:

The design phase of B-spline neural networks is a highly computationally complex task. Existing heuristics have been found to be highly dependent on the initial conditions employed. Interest in biologically inspired learning algorithms for control techniques such as Artificial Neural Networks and Fuzzy Systems continues to grow. In this paper, the Bacterial Programming approach is presented, which is based on the replication of the microbial evolution phenomenon. This technique produces an efficient topology search and, additionally, obtains more consistent solutions.

Open Access: Yes

DOI: 10.20965/jaciii.2007.p0220

Bacterial memetic algorithm for fuzzy rule base optimization

Publication Name: 2006 World Automation Congress (WAC '06)

Publication Date: 2006-01-01

Volume: Unknown

Issue: Unknown

Page Range: Unknown

Description:

In our previous works, model identification methods were discussed. The bacterial evolutionary algorithm for extracting a fuzzy rule base from a training set was presented. The Levenberg-Marquardt method was also proposed for determining membership functions in fuzzy systems. The combination of evolutionary and gradient-based learning techniques, the bacterial memetic algorithm, was also introduced. In this paper an improvement of the bacterial memetic algorithm is shown for fuzzy rule extraction. The new method can optimize not only the rules, but can also find the optimal size of the rule base. Copyright - World Automation Congress (WAC) 2006.

Open Access: Yes

DOI: 10.1109/WAC.2006.376057

A hybrid training method for B-spline neural networks

Publication Name: 2005 IEEE International Workshop on Intelligent Signal Processing Proceedings

Publication Date: 2005-12-01

Volume: Unknown

Issue: Unknown

Page Range: 165-170

Description:

Current and past research has brought up new views on the optimization of neural networks. For a fixed structure, second-order methods are seen as the most promising. In previous works we have shown how easily second-order methods can be applied to a neural network. Namely, we have shown that the Levenberg-Marquardt method not only possesses better convergence but can also assure convergence to a local minimum. However, as with any gradient-based method, the results obtained depend on the starting point. In this work, a reformulated evolutionary algorithm, Bacterial Programming for Levenberg-Marquardt, is proposed as a heuristic which can be used to determine the most suitable starting points, therefore achieving, in most cases, the global optimum. © 2005 IEEE.
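The two-stage idea, an evolutionary search to supply starting points followed by gradient-based local refinement, can be sketched as follows (illustrative only: the one-dimensional multimodal cost and the plain gradient descent standing in for Levenberg-Marquardt are assumptions, not the paper's method).

```python
import numpy as np

rng = np.random.default_rng(1)

def cost(th):
    # Toy multimodal cost: a gradient method alone can get trapped locally.
    return np.sin(3.0 * th) + 0.1 * th ** 2

def grad(th):
    return 3.0 * np.cos(3.0 * th) + 0.2 * th

# Stage 1: crude evolutionary search over the domain supplies a starting point.
pop = rng.uniform(-4.0, 4.0, 20)
for _ in range(15):
    children = pop + rng.normal(0.0, 0.3, pop.size)
    both = np.concatenate([pop, children])
    pop = both[np.argsort(cost(both))][:20]     # keep the 20 fittest
start = pop[0]

# Stage 2: local gradient refinement from the evolutionary starting point
# (a stand-in for the Levenberg-Marquardt refinement).
th = start
for _ in range(200):
    th -= 0.01 * grad(th)
```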

Open Access: Yes

DOI: Not available

Design of B-spline neural networks using a bacterial programming approach

Publication Name: IEEE International Conference on Neural Networks Conference Proceedings

Publication Date: 2004-12-01

Volume: 3

Issue: Unknown

Page Range: 2313-2318

Description:

The design phase of B-spline neural networks is a computationally very demanding task. For this purpose, heuristics have been developed, but these have been shown to be dependent on the initial conditions employed. In this paper a new technique, Bacterial Programming, is proposed, whose principles are based on the replication of the microbial evolution phenomenon. The performance of this approach is illustrated and compared with existing alternatives.

Open Access: Yes

DOI: 10.1109/IJCNN.2004.1380987

Estimating fuzzy membership function parameters by the Levenberg-Marquardt algorithm

Publication Name: IEEE International Conference on Fuzzy Systems

Publication Date: 2004-12-01

Volume: 3

Issue: Unknown

Page Range: 1667-1672

Description:

In previous papers by the authors, fuzzy model identification methods were discussed. The bacterial algorithm for extracting a fuzzy rule base from a training set was presented. The Levenberg-Marquardt algorithm was also proposed for determining membership functions in fuzzy systems. In this paper the Levenberg-Marquardt technique is improved to optimise the membership functions in fuzzy rules without a Ruspini partition. The class of membership functions investigated is the trapezoidal one, as it is general enough and widely used. The method can easily be extended to arbitrary piecewise linear functions as well.
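A sketch of the ingredients involved (illustrative, not the authors' code): a trapezoidal membership function and its partial derivatives with respect to the four breakpoints, which are the entries a Levenberg-Marquardt Jacobian needs for these membership parameters.

```python
import numpy as np

def trapmf(x, a, b, c, d):
    """Trapezoidal membership function; assumes a < b <= c < d."""
    x = np.asarray(x, dtype=float)
    return np.clip(np.minimum((x - a) / (b - a), (d - x) / (d - c)), 0.0, 1.0)

def trapmf_grad(x, a, b, c, d):
    """Partial derivatives of trapmf w.r.t. (a, b, c, d); zero on the
    plateau and outside the support, nonzero on the linear flanks."""
    x = np.atleast_1d(np.asarray(x, dtype=float))
    g = np.zeros((x.size, 4))
    rise = (x > a) & (x < b)                    # on (x-a)/(b-a)
    g[rise, 0] = (x[rise] - b) / (b - a) ** 2
    g[rise, 1] = (a - x[rise]) / (b - a) ** 2
    fall = (x > c) & (x < d)                    # on (d-x)/(d-c)
    g[fall, 2] = (d - x[fall]) / (d - c) ** 2
    g[fall, 3] = (x[fall] - c) / (d - c) ** 2
    return g
```

The analytic columns can be checked against finite differences, which is a useful sanity test before plugging them into the Levenberg-Marquardt Jacobian.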

Open Access: Yes

DOI: 10.1109/FUZZY.2004.1375431