Oohori T, Naganuma H, Watanabe K. (2007). A new backpropagation learning algorithm for layered neural networks with nondifferentiable units. Neural Computation. 19


References and models cited by this paper

Alkon DL, Vogl TP, Mangis JK, Rigler AK, Zink WT. (1988). Accelerating the convergence of the back-propagation method. Biol Cybern. 59

Amari S. (1978). Mathematical Theory of Nerve Nets.

Baxter RA, Widrow B, Winter RG. (1988). Layered neural nets for pattern recognition. IEEE Trans Acoustics, Speech, and Signal Processing. 36

Furukawa T, Watanabe K, Oohori T, Mitani M. (1997). Fluctuation-driven learning for feedforward neural networks. IEICE Trans Information and Systems. 1249

Jacobs R. (1988). Increased rates of convergence through learning rate adaptation. Neural Netw. 1

Kobayashi Y, Watanabe K, Oohori T, Kamada S, Konishi Y. (2000). A new error back-propagation learning algorithm for a layered neural network with nondifferentiable units. Tech Rep NC99-85, Institute of Electronics, Information, and Communications Engineers (Tokyo).

Minsky M, Papert S. (1969). Perceptrons.

Rosenblatt F. (1962). Principles of Neurodynamics.

Rumelhart DE, Hinton GE, Williams RJ. (1986). Learning representations by back-propagating errors. Nature. 323

Yamada K, Kuroyanagi S, Iwata A. (2004). A supervised learning method using duality in the artificial neuron model. IEICE Trans Information and Systems. 87

Zeng Z, Goodman RM, Smyth P. (1993). Learning finite state machines with self-clustering recurrent networks. Neural Comput. 5

References and models that cite this paper