Abstract—The Cerebellar Model Articulation Controller (CMAC) neural network is an associative memory that is biologically inspired by the cerebellum found in animal brains. The standard CMAC uses the least mean squares (LMS) algorithm to train its weights. Recently, the recursive least squares (RLS) algorithm was proposed as a superior algorithm for training the CMAC online, as it can converge in a single epoch and does not require tuning of a learning rate. However, the computational cost of RLS is dependent on the number of weights required by the CMAC, which is often large, and thus RLS can be very computationally inefficient. Also recently, the use of kernel methods in the CMAC was proposed to reduce memory usage and improve modeling capability. In this paper, the kernel recursive least squares (KRLS) algorithm is applied to the CMAC. Due to the kernel method, the computational complexity of the CMAC becomes dependent on the number of unique training data, which can be significantly smaller than the number of weights required by non-kernel CMACs. Additionally, online sparsification techniques are applied to further improve computational speed.
Index Terms—CMAC, kernel recursive least squares.
The authors are with the Department of Electrical and Electronic Engineering, University of Auckland, Auckland, New Zealand (e-mail: clau070@aucklanduni.ac.nz, g.coghill@auckland.ac.nz).
Cite: C. W. Laufer and G. Coghill, "Kernel Recursive Least Squares for the CMAC Neural Network," International Journal of Computer Theory and Engineering, vol. 5, no. 3, pp. 454-459, 2013.