Alquezar R, Sanfeliu A. (1994). A hybrid connectionist symbolic approach to regular grammar inference based on neural learning and hierarchical clustering Proc ICGI94.
Alquezar R, Sanfeliu A. (1994). Inference and recognition of regular grammars by training recurrent neural networks to learn the next-symbol prediction task Advances in pattern recognition and applications: Selected papers from the Vth Spanish Symposium on Pattern Recognition and Image Analysis.
Alquezar R, Sanfeliu A. (1995). Active grammatical inference: A new learning methodology Shape, Structure and Pattern Recognition, 5th IAPR International Workshop on Structural and Syntactic Pattern Recognition.
Alquezar R, Sanfeliu A, Sainz M. (1997). Experimental assessment of connectionist regular inference from positive and negative examples VII Simposium Nacional De Reconocimiento De Formas Y Analisis De Imagenes. 1
Andrews R, Diederich J, Tickle AB. (1995). Survey and critique of techniques for extracting rules from trained artificial neural networks Knowledge Based Systems. 8
Bakker B. (2004). The state of mind: Reinforcement learning with recurrent neural networks Unpublished doctoral dissertation.
Bakker B, de Jong M. (2000). The epsilon state count From animals to animats 6: Proceedings of the Sixth International Conference on Simulation of Adaptive Behavior.
Barreto G de A, Araújo AF, Kremer SC. (2003). A taxonomy for spatiotemporal connectionist networks revisited: the unsupervised case. Neural computation. 15 [PubMed]
Bengio Y, Simard P, Frasconi P. (1994). Learning long-term dependencies with gradient descent is difficult. IEEE transactions on neural networks. 5 [PubMed]
Blair A, Boden M, Wiles J, Tonkes B. (1999). Learning to predict a context-free language: Analysis of dynamics in recurrent hidden units Proc ICANN99. 99
Blair A, Pollack J. (1997). Analysis of dynamical recognizers Neural Comput. 9
Blair A, Wiles J, Tonkes B. (1998). Inductive bias in context-free language learning Proceedings of the Ninth Australian Conference on Neural Networks.
Blanco A, Delgado M, Pegalajar MC. (2000). Extracting rules from a (fuzzy/crisp) recurrent neural network using a self-organizing map Int J Intell Syst. 15
Boden M, Niklasson L. (1997). Representing structure and structured representations in connectionist networks Neural Network perspectives on cognition and adaptive robotics.
Bruske J, Sommer G. (1995). Dynamic cell structure learns perfectly topology preserving map Neural Comput. 7
Bullinaria JA. (1997). Analyzing the internal representations of trained artificial neural networks Neural network analysis, architectures and applications.
Carrasco RC, Forcada ML, Valdés-Muñoz MA, Neco RP. (2000). Stable encoding of finite-state machines in discrete-time recurrent neural nets with sigmoid units. Neural computation. 12 [PubMed]
Casey M. (1996). The dynamics of discrete-time computation, with application to recurrent neural networks and finite state machine extraction. Neural computation. 8 [PubMed]
Cechin AL, Pechmann Simon DR, Stertz K. (2003). State automata extraction from recurrent neural nets using k-means and fuzzy clustering XXIII International Conference of the Chilean Computer Science Society.
Chen D, Giles CL, Goudreau MW, Chakradhar ST. (1994). First-order vs. second-order single layer recurrent neural networks IEEE Trans Neural Networks. 5
Chen D et al. (1992). Learning and extracting finite state automata with second-order recurrent neural networks Neural Comput. 4
Chen D et al. (1992). Extracting and learning an unknown grammar with recurrent neural networks Advances in neural information processing systems. 4
Chen H et al. (1991). Second-order recurrent neural networks for grammatical inference Proceedings Of International Joint Conference On Neural Networks. 2
Crutchfield JP. (1994). The calculi of emergence: Computation, dynamics, and induction Physica D. 75
Crutchfield JP, Young K. (1990). Computation at the onset of chaos Complexity, entropy and the physics of information.
Crutchfield JP, Young K. (1993). Fluctuation spectroscopy Chaos, Solitons and Fractals. 4
Das S, Das R. (1991). Induction of discrete-state machine by stabilizing a simple recurrent network using clustering Computer Science And Information. 21
Das S, Giles CL, Sun GZ. (1993). Using prior knowledge in a NNPDA to learn context-free languages Advances in neural information processing systems. 5
Das S, Mozer MC. (1994). A unified gradient-descent-clustering architecture for finite state machine induction Advances in neural information processing systems. 6
Diederich J, Schellhammer I, Towsey M, Brugman C. (1998). Knowledge extraction and recurrent neural networks: An analysis of an Elman network trained on a natural language learning task Proceedings of the Joint Conference on New Methods in Language Processing and Computational Natural Language Learning: NeMLaP3/CoNLL98.
Elman JL. (1990). Finding structure in time Cognitive Science. 14
Elman JL, Wiles J. (1995). Learning to count without a counter: A case study of dynamics and activation landscapes in recurrent neural networks Proceedings of the Seventeenth Annual Conference of the Cognitive Science Society.
Elman JL, Wiles J, Rodriguez P. (1999). A recurrent network that learns to count Connection Science. 11
Forcada ML. (2002). Neural networks: Automata and formal models of computation An unfinished survey (Available online at: http://www.dlsi.ua.es/~mlf/nnafmc/).
Forcada ML, Carrasco RC. (2001). Simple strategies to encode tree automata in sigmoid recursive neural networks IEEE Trans Know Data Eng. 13
Forcada ML, Carrasco RC. (2001). Finite-state computation in analog neural networks: Steps towards biologically plausible models? Emergent computational models based on neuroscience.
Frasconi P, Gori M, Maggini M, Soda G. (1996). Representation of finite state automata in recurrent radial basis function networks Mach Learn. 23
Gers FA, Schmidhuber J. (2001). LSTM recurrent networks learn simple context-free and context-sensitive languages. IEEE transactions on neural networks. 12 [PubMed]
Giles CL, Chen HH, Sun GZ. (1998). The neural network pushdown automaton: Architecture, dynamics and learning Adaptive processing of sequences and data structures.
Giles CL, Goudreau MW. (1995). Using recurrent neural networks to learn the structure of interconnection networks Neural Netw. 8
Giles CL, Horne BG. (1995). An experimental comparison of recurrent neural networks Advances in neural information processing systems. 7
Giles CL, Lawrence S, Fong S. (2000). Natural language grammatical inference with recurrent neural networks IEEE Transactions On Knowledge And Data Engineering. 12
Giles CL, Lawrence S, Tsoi A. (1997). Rule inference for financial prediction using recurrent neural networks Proceedings of IEEE-IAFE Conference on Computational Intelligence for Financial Engineering (CIFEr).
Giles CL, Lawrence S, Tsoi AC. (1998). Symbolic conversion, grammatical inference and rule extraction for foreign exchange rate prediction Neural networks in the capital markets NNCM96.
Giles CL, Lawrence S, Tsoi AC. (2001). Noisy time series prediction using a recurrent neural network and grammatical inference Mach Learn. 44
Giles CL, Miller CB. (1993). Experimental comparison of the effect of order in recurrent neural networks Int J Pattern Recogn Art Intell. 7
Giles CL, Omlin C. (1996). Extraction of rules from discrete-time recurrent neural networks Neural Networks. 9
Giles CL, Omlin CW. (1994). Pruning recurrent neural networks for improved generalization performance. IEEE transactions on neural networks. 5 [PubMed]
Giles CL, Tino P, Horne BG, Collingwood PC. (1998). Finite state machines and recurrent neural networks: Automata and dynamical systems approaches Neural networks and pattern recognition.
Golea M. (1996). On the complexity of rule extraction from neural networks and network-querying Tech Rep.
Golea M, Andrews R, Diederich J, Tickle A. (1997). Rule extraction from artificial neural networks Neural network analysis, architectures and applications.
Golea M, Andrews R, Diederich J, Tickle AB. (1998). The truth will come to light: Directions and challenges in extracting the knowledge embedded within mined artificial neural networks IEEE Transactions On Neural Networks. 9
Gori M, Maggini M, Martinelli E, Soda G. (1998). Inductive inference from noisy examples using the hybrid finite state filter. IEEE transactions on neural networks. 9 [PubMed]
Gori M, Maggini M, Soda G. (1994). Scheduling of modular architectures for inductive inference of regular grammars ECAI94 Workshop on Combining Symbolic and Connectionist Processing.
Hammer B, Tino P. (2003). Recurrent neural networks with small weights implement definite memory machines Neural Comput. 15
Hammer B, Tino P. (2003). Architectural bias in recurrent neural networks: Fractal analysis Neural Comput. 15
Hinton GE. (1990). Mapping part-whole hierarchies into connectionist networks Art Intell. 46
Hopcroft J, Ullman J. (1979). Introduction to automata theory, languages, and computation.
Horne BG, Hush DR. (1994). Bounds on the complexity of recurrent neural network implementations of finite state machines Advances in neural information processing systems. 6
Jacobsson H, Ziemke T. (2003). Improving procedures for evaluation of connectionist context-free language predictors. IEEE transactions on neural networks. 14 [PubMed]
Jacobsson H, Ziemke T. (2003). Reducing complexity of rule extraction from prediction RNNs through domain interaction Tech. Rep. No. HS-IDA-TR-03-007.
Jacobsson H, Ziemke T, Boden M. (2000). Evolving context-free language predictors Proceedings of the Genetic and Evolutionary Computation Conference.
Jaeger H. (2003). Adaptive nonlinear system identification with echo state networks Advances in neural information processing systems. 15
Jain AK, Murty MN, Flynn PJ. (1999). Data clustering: A review ACM Computing Surveys. 31
Kohonen T. (1995). Self-organizing Maps.
Kolen JF. (1993). Fool's gold: Extracting finite state machines from recurrent network dynamics Neural information processing systems. 6
Kolen JF. (1994). Exploring the computational capabilities of recurrent neural networks Unpublished doctoral dissertation.
Kolen JF, Kremer SC. (2001). A field guide to dynamical recurrent networks.
Kremer SC. (2001). Spatiotemporal connectionist networks: A taxonomy and review Neural Comput. 13
Kremer SC, Cicchello O. (2003). Inducing grammars from sparse data sets: A survey of algorithms and results J Mach Learn Res. 4
Kuhn TS. (1962). The structure of scientific revolutions.
Lin T, Giles CL, Horne BG. (1995). Learning a class of large finite state machines with a recurrent neural network Neural Netw. 8
Maggini M. (1998). Recursive neural networks and automata Adaptive processing of sequences and data structures.
Manolios P, Fanelli R. (1994). First order recurrent neural networks and deterministic finite state automata Neural Comput. 6
McClelland JL, Servan-Schreiber D, Cleeremans A. (1989). Finite state automata and simple recurrent networks Neural Comput. 1
McClelland JL, Servan-Schreiber D, Cleeremans A. (1989). Learning sequential structure in simple recurrent networks Advances in neural information processing systems. 1
McClelland JL, Servan-Schreiber D, Cleeremans A. (1991). Graded state machines: The representation of temporal contingencies in simple recurrent networks Mach Learn. 7
Medler D. (1998). A brief history of connectionism Neural Computing Surveys. 1
Meeden LA. (1996). An incremental approach to developing intelligent neural network controllers for robots. IEEE transactions on systems, man, and cybernetics. Part B, Cybernetics : a publication of the IEEE Systems, Man, and Cybernetics Society. 26 [PubMed]
Miller C, Omlin CW, Giles C. (1992). Heuristics for the extraction of rules from discrete-time recurrent neural networks Proceedings Of The International Joint Conference On Neural Networks. 1
Mirkin B. (1996). Mathematical classification and clustering.
Mozer M, Das S. (1998). Dynamic On-line Clustering and State Extraction: An Approach to Symbolic Learning. Neural networks : the official journal of the International Neural Network Society. 11 [PubMed]
Omlin CW. (2001). Understanding and explaining DRN behaviour A field guide to dynamical recurrent networks.
Omlin CW, Giles CL. (1992). Training second-order recurrent neural networks using hints Proceedings of the Ninth International Conference on Machine Learning.
Omlin CW, Giles CL. (1993). Insertion and refinement of production rules in recurrent neural networks Connection Science. 5
Omlin CW, Giles CL. (1996). Constructing deterministic finite-state automata in recurrent neural networks J ACM. 43
Omlin CW, Giles CL. (1996). Rule revision with recurrent neural networks Know Data Eng. 8
Omlin CW, Giles CL. (2000). Symbolic knowledge representation in recurrent neural networks: Insights from theoretical models of computation Knowledge-based neurocomputing.
Omlin CW, Giles CL, Thornber KK. (1998). Deterministic fuzzy finite state automata can be deterministically encoded into recurrent neural networks IEEE Trans Fuzzy Systems. 6
Omlin CW, Vahed A. (1999). Rule extraction from recurrent neural networks using a symbolic machine learning algorithm Tech. Rep. No. US-CS-TR-4.
Paz A. (1971). Introduction to probabilistic automata.
Pitts W, McCulloch WS. (1943). A logical calculus of the ideas immanent in nervous activity Bull Math Biophysics. 5
Pollack J, Kolen J. (1995). The observers' paradox: Apparent computational complexity in physical systems J Exp Theoret Art Intell. 7
Rabin MO. (1963). Probabilistic automata Information And Control. 6
Rodriguez PF. (1999). Mathematical foundations of simple recurrent neural networks in language processing Unpublished doctoral dissertation.
Schmidhuber J. (1992). Learning complex, extended sequences using the principle of history compression Neural Comput. 4
Sharkey AJC. (1996). [Special issue]. Combining artificial neural nets: Ensemble approaches Connection Science. 8
Sharkey NE, Jackson SA. (1995). An internal report for connectionists Computational architectures integrating neural and symbolic processes.
Shastri L, Jagota A, Sun R, Plate T. (1999). Connectionist symbol processing: Dead or alive? Neural Computing Surveys. 2
Shavlik JW, Craven MW. (1994). Using sampling and queries to extract rules from trained neural networks Machine learning: Proceedings of the Eleventh International Conference.
Shavlik JW, Craven MW. (1996). Extracting tree-structured representations of trained networks Advances in neural information processing systems. 8
Shavlik JW, Craven MW. (1999). Rule extraction: Where do we go from here? Tech. Rep. No. Machine Learning Research Group Working Paper 99-1.
Sontag E, Siegelmann H. (1995). On the computational power of neural nets Journal Of Computer And System Sciences. 50
Sun R. (2001). Introduction to sequence learning Sequence learning: Paradigms, algorithms, and applications.
Sun R, Peterson T, Sessions C. (2001). The extraction of planning knowledge from reinforcement learning neural networks Proceedings of WIRN2001.
Tabor W, Tanenhaus M. (1999). Dynamical models of sentence processing Cognitive Science. 24
Tino P, Cernanský M, Benusková L. (2004). Markovian architectural bias of recurrent neural networks. IEEE transactions on neural networks. 15 [PubMed]
Tino P, Dorffner G, Schittenkopf C. (2000). Understanding state space organization in recurrent neural networks with iterative function systems dynamics Hybrid neural symbolic integration.
Tino P, Köteles M. (1999). Extracting finite-state representations from recurrent neural networks trained on chaotic symbolic sequences. IEEE transactions on neural networks. 10 [PubMed]
Tino P, Sajda J. (1995). Learning and extracting initial Mealy machines with a modular neural network model Neural Comput. 7
Tino P, Vojtek V. (1998). Extracting stochastic machines from recurrent neural networks trained on complex symbolic sequences Neural Network World. 8
Tomita M. (1982). Dynamic construction of finite-state automata from examples using hill-climbing Proceedings of Fourth Annual Cognitive Science Conference.
Towell GG, Shavlik JW. (1993). The extraction of refined rules from knowledge-based neural networks Mach Learn. 13
Trakhtenbrot BA, Barzdin JM. (1973). Finite automata: Behavior and synthesis.
Vahed A, Omlin CW. (2004). A machine learning method for extracting symbolic knowledge from recurrent neural networks. Neural computation. 16 [PubMed]
Watrous RL, Kuhn GM. (1992). Induction of finite-state automata using second-order recurrent networks Advances in neural information processing systems. 4
Wiles J, Tonkes B. (1999). Learning a context-free task with a recurrent neural network: An analysis of stability Dynamical Cognitive Science: Proceedings of the Fourth Biennial Conference of the Australasian Cognitive Science Society.
Zeng Z, Goodman RM, Smyth P. (1993). Learning finite state machines with self-clustering recurrent networks Neural Comput. 5
Ziemke T, Thieme M. (2002). Neuromodulation of reactive sensorimotor mappings as a short-term memory mechanism in delayed response tasks Adapt Behav. 10
Jacobsson H. (2006). The crystallizing substochastic sequential machine extractor: CrySSMEx. Neural computation. 18 [PubMed]
Tino P, Mills AJ. (2006). Learning beyond finite memory in recurrent networks of spiking neurons. Neural computation. 18 [PubMed]