C. Cabrita

Author ID: 55958626100

Publications: 7

Exploiting the functional training approach in Takagi-Sugeno neuro-fuzzy systems

Publication Name: Advances in Intelligent Systems and Computing

Publication Date: 2013-01-01

Volume: 195 AISC

Issue: Unknown

Page Range: 543-559

Description:

When used for function approximation purposes, neural networks and neuro-fuzzy systems belong to a class of models whose parameters can be separated into linear and nonlinear, according to their influence on the model output. This concept of parameter separability can also be applied when the training problem is formulated as the minimization of the integral of the (functional) squared error over the input domain. Using this approach, the computation of the derivatives involves terms that depend only on the model and the input domain, and terms which are the projection, over the input domain, of the target function onto the basis functions and onto their derivatives with respect to the nonlinear parameters. These latter terms can be numerically computed with the data. The use of the functional approach is introduced here for Takagi-Sugeno models. An example shows that this approach obtains better results than the standard, discrete technique, as the performance surface employed is closer to the one obtained with the function underlying the data. In some cases, as shown in the example, a complete analytical solution can be found. © 2013 Springer-Verlag Berlin Heidelberg.
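
As a worked illustration of the separable, functional criterion described above (the notation here is ours, chosen for illustration, and does not reproduce the paper's exact symbols), the training problem can be written as

    E(\mathbf{a},\mathbf{b}) = \int_{D} \Big( f(\mathbf{x}) - \sum_{i} a_i \, \varphi_i(\mathbf{x},\mathbf{b}) \Big)^{2} \, d\mathbf{x},

where the a_i are the linear parameters, b collects the nonlinear ones, the φ_i are the basis functions and D is the input domain. Differentiating E yields Gram-type terms ∫_D φ_i φ_j dx, which depend only on the model and the domain, and projection terms ∫_D f(x) φ_i dx (together with their derivatives with respect to b), which are the only quantities that have to be estimated from the data.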

Open Access: Yes

DOI: 10.1007/978-3-642-33941-7_48

Extending the functional training approach for B-splines

Publication Name: Proceedings of the International Joint Conference on Neural Networks

Publication Date: 2012-08-22

Volume: Unknown

Issue: Unknown

Page Range: Unknown

Description:

When used for function approximation purposes, neural networks belong to a class of models whose parameters can be separated into linear and nonlinear, according to their influence on the model output. This concept of parameter separability can also be applied when the training problem is formulated as the minimization of the integral of the (functional) squared error over the input domain. Using this approach, the computation of the gradient involves terms that depend only on the model and the input domain, and terms which are the projection, over the input domain, of the target function onto the basis functions and onto their derivatives with respect to the nonlinear parameters. This paper extends the application of this formulation to B-splines, describing how the Levenberg-Marquardt method can be applied within this methodology. Simulation examples show that the functional approach achieves important savings in computational complexity and a better approximation over the whole input domain. © 2012 IEEE.
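
The interplay between parameter separability and Levenberg-Marquardt can be sketched in a discrete toy setting as follows; this is only an illustrative sketch (the Gaussian basis, the finite-difference Jacobian and all names and constants are our own choices), not the paper's functional, integral-based formulation.

    import numpy as np

    # Toy, discrete illustration of separable (linear/nonlinear) training with
    # a Levenberg-Marquardt step on the nonlinear parameters only.

    def model_basis(x, b):
        # One Gaussian-shaped basis function per nonlinear centre in b.
        return np.exp(-(x[:, None] - b[None, :]) ** 2)   # shape (n_samples, n_basis)

    def fit_linear(x, y, b):
        # With b fixed, the optimal linear weights solve a least-squares
        # problem (this is the parameter-separability property).
        Phi = model_basis(x, b)
        a, *_ = np.linalg.lstsq(Phi, y, rcond=None)
        return a, y - Phi @ a                            # weights and residual

    def lm_step(x, y, b, lam=1e-2, eps=1e-6):
        # One damped (Levenberg-Marquardt) update of the nonlinear parameters.
        _, r = fit_linear(x, y, b)
        J = np.empty((len(x), len(b)))                   # residual Jacobian w.r.t. b
        for j in range(len(b)):
            bp = b.copy()
            bp[j] += eps
            _, rp = fit_linear(x, y, bp)
            J[:, j] = (rp - r) / eps
        H = J.T @ J + lam * np.eye(len(b))
        return b - np.linalg.solve(H, J.T @ r)

    x = np.linspace(-3.0, 3.0, 200)
    y = np.sin(x)
    b = np.array([-1.0, 0.5, 2.0])                       # initial nonlinear centres
    for _ in range(20):
        b = lm_step(x, y, b)
    print("trained centres:", b)

In the functional formulation described in the abstract, the sample-based quantities above are replaced by integrals over the input domain.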

Open Access: Yes

DOI: 10.1109/IJCNN.2012.6252741

Exploiting the functional training approach in B-splines

Publication Name: IFAC Proceedings Volumes (IFAC-PapersOnLine)

Publication Date: 2012-01-01

Volume: 45

Issue: 4

Page Range: 127-132

Description:

When used for function approximation purposes, neural networks belong to a class of models whose parameters can be separated into linear and nonlinear, according to their influence on the model output. This concept of parameter separability can also be applied when the training problem is formulated as the minimization of the integral of the (functional) squared error over the input domain. Using this approach, the computation of the gradient involves terms that depend only on the model and the input domain, and terms which are the projection, over the input domain, of the target function onto the basis functions and onto their derivatives with respect to the nonlinear parameters. These latter terms can be numerically computed with the data. The use of the functional approach is introduced here for B-splines. An example shows that, besides large computational complexity savings, this approach obtains better results than the standard, discrete technique, as the performance surface employed is closer to the one obtained with the function underlying the data. In some cases, as shown in the example, a complete analytical solution can be found. © 2012 IFAC.
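
The statement that the projection terms "can be numerically computed with the data" can be made concrete with a simple sample-mean estimate (our approximation, not necessarily the quadrature used in the paper):

    \int_{D} f(\mathbf{x}) \, \varphi_i(\mathbf{x},\mathbf{b}) \, d\mathbf{x} \;\approx\; \frac{V(D)}{N} \sum_{k=1}^{N} y_k \, \varphi_i(\mathbf{x}_k,\mathbf{b}),

where (x_k, y_k) are the N training samples and V(D) is the volume of the input domain. The Gram terms ∫_D φ_i φ_j dx, by contrast, involve only the basis functions and the domain, and can therefore be computed without the data.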

Open Access: Yes

DOI: 10.3182/20120403-3-DE-3010.00070

A new domain decomposition for B-spline neural networks

Publication Name: Proceedings of the International Joint Conference on Neural Networks

Publication Date: 2010-01-01

Volume: Unknown

Issue: Unknown

Page Range: Unknown

Description:

B-spline Neural Networks (BSNNs) belong to the class of networks termed grid or lattice-based associative memory networks (AMN). The grid is a key feature, since it allows these networks to exhibit properties which make them efficient in solving problems such as function approximation, non-linear system identification, and on-line control. The main problem associated with BSNNs is that model complexity grows exponentially with the number of input variables. To tackle this drawback, different authors have developed heuristics for functional decomposition, such as the ASMOD algorithm or evolutionary approaches [2]. In this paper, we present a complementary approach, by allowing the properties of B-spline models to be achieved by non-full grids. This approach can be applied either to a single model or to an ASMOD decomposition. Simulation results show that comparable results, in terms of approximation, can be obtained with less complex models. © 2010 IEEE.
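
To make the complexity argument concrete, the following toy count compares a full tensor-product grid with an additive (ASMOD-style) decomposition; it uses the standard B-spline counting rule "basis functions per input = interior knots + spline order", and the specific numbers are ours, not taken from the paper.

    from math import prod

    def full_grid_size(knots, order):
        # Tensor-product (full lattice) model over all inputs.
        return prod(k + order for k in knots)

    def additive_size(submodels, order):
        # Sum of smaller tensor-product submodels, one per tuple of inputs.
        return sum(prod(k + order for k in sub) for sub in submodels)

    knots = [5] * 6                                     # 6 inputs, 5 interior knots each
    print(full_grid_size(knots, 2))                     # (5 + 2) ** 6 = 117649
    print(additive_size([[5, 5], [5, 5], [5, 5]], 2))   # 3 * 49 = 147

A full grid over six inputs already needs more than a hundred thousand basis functions, while three two-input submodels need only 147; reductions of this kind are what motivate decomposition and non-full-grid approaches.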

Open Access: Yes

DOI: 10.1109/IJCNN.2010.5596648

Fuzzy rule extraction by bacterial memetic algorithms

Publication Name: International Journal of Intelligent Systems

Publication Date: 2009-03-01

Volume: 24

Issue: 3

Page Range: 312-339

Description:

In our previous papers, fuzzy model identification methods were discussed. The bacterial evolutionary algorithm for extracting a fuzzy rule base from a training set was presented. The Levenberg-Marquardt method was also proposed for determining membership functions in fuzzy systems. The combination of evolutionary and gradient-based learning techniques is usually called a memetic algorithm. In this paper, a new kind of memetic algorithm, the bacterial memetic algorithm, is introduced for fuzzy rule extraction. The paper presents how the bacterial evolutionary algorithm can be improved with the Levenberg-Marquardt technique. © 2009 Wiley Periodicals, Inc.
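
A heavily simplified sketch of how such a memetic loop can interleave global and local search is given below; the bacterial-style operators, the toy quadratic fitness and the gradient-descent local step (standing in for the Levenberg-Marquardt refinement) are our own simplifications, not the algorithm as specified in the paper.

    import numpy as np

    # Sketch of a memetic loop: population-level "bacterial" operators for
    # global search, plus a local gradient step standing in for Levenberg-
    # Marquardt. The fitness is a toy quadratic; a real application would
    # evaluate a fuzzy rule base on training data.

    rng = np.random.default_rng(1)
    TARGET = rng.normal(size=12)          # stand-in for "ideal" rule-base parameters

    def fitness(ind):                     # lower is better
        return float(np.sum((ind - TARGET) ** 2))

    def bacterial_mutation(ind, clones=4, seg=3):
        # Clone the individual, mutate one segment per clone, keep the best.
        best = ind
        for _ in range(clones):
            c = best.copy()
            i = int(rng.integers(0, len(c) - seg))
            c[i:i + seg] += rng.normal(scale=0.5, size=seg)
            if fitness(c) < fitness(best):
                best = c
        return best

    def gene_transfer(pop):
        # Copy a segment from a good individual into a weaker one.
        pop.sort(key=fitness)
        src = pop[int(rng.integers(0, len(pop) // 2))]
        dst = int(rng.integers(len(pop) // 2, len(pop)))
        i = int(rng.integers(0, len(src) - 3))
        pop[dst] = pop[dst].copy()
        pop[dst][i:i + 3] = src[i:i + 3]
        return pop

    def local_step(ind, lr=0.1, iters=5):
        # Gradient descent on the quadratic fitness (stand-in for LM).
        for _ in range(iters):
            ind = ind - lr * 2.0 * (ind - TARGET)
        return ind

    pop = [rng.normal(size=12) for _ in range(8)]
    for _ in range(20):
        pop = [bacterial_mutation(ind) for ind in pop]
        pop = gene_transfer(pop)
        pop = [local_step(ind) for ind in pop]
    print("best fitness:", min(fitness(ind) for ind in pop))

In the algorithm described in the paper, the individuals encode fuzzy rule bases and the local refinement is the Levenberg-Marquardt update of the membership function parameters.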

Open Access: Yes

DOI: 10.1002/int.20338

Genetic and Bacterial Programming for B-Spline Neural Networks Design

Publication Name: Journal of Advanced Computational Intelligence and Intelligent Informatics

Publication Date: 2007-03-01

Volume: 11

Issue: 2

Page Range: 220-231

Description:

The design phase of B-spline neural networks is a highly computationally complex task, and existing heuristics have been found to be highly dependent on the initial conditions employed. Interest in biologically inspired learning algorithms for control techniques such as Artificial Neural Networks and Fuzzy Systems continues to grow. In this paper, the Bacterial Programming approach is presented, which is based on the replication of the microbial evolution phenomenon. This technique produces an efficient topology search and, additionally, yields more consistent solutions.

Open Access: Yes

DOI: 10.20965/jaciii.2007.p0220

Estimating fuzzy membership functions parameters by the Levenberg-Marquardt algorithm

Publication Name: IEEE International Conference on Fuzzy Systems

Publication Date: 2004-12-01

Volume: 3

Issue: Unknown

Page Range: 1667-1672

Description:

In previous papers from the authors, fuzzy model identification methods were discussed. The bacterial algorithm for extracting a fuzzy rule base from a training set was presented. The Levenberg-Marquardt algorithm was also proposed for determining membership functions in fuzzy systems. In this paper, the Levenberg-Marquardt technique is improved to optimise the membership functions in the fuzzy rules without requiring a Ruspini partition. The class of membership functions investigated is the trapezoidal one, as it is general enough and widely used. The method can easily be extended to arbitrary piecewise-linear functions as well.
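
For reference, the trapezoidal membership function mentioned above is usually parameterised by four breakpoints a ≤ b ≤ c ≤ d (standard definition, our notation); these breakpoints are the membership function parameters that the Levenberg-Marquardt update adjusts:

    \mu(x; a, b, c, d) =
    \begin{cases}
      0, & x \le a \text{ or } x \ge d, \\
      \dfrac{x - a}{b - a}, & a < x < b, \\
      1, & b \le x \le c, \\
      \dfrac{d - x}{d - c}, & c < x < d.
    \end{cases}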

Open Access: Yes

DOI: 10.1109/FUZZY.2004.1375431