Neural Information Processing. Models and Applications

Author: Kevin K.W. Wong
Publisher: Springer
Total Pages: 763
Release: 2010-11-18
Genre: Computers
ISBN: 3642175341

The two-volume set LNCS 6443 and LNCS 6444 constitutes the proceedings of the 17th International Conference on Neural Information Processing, ICONIP 2010, held in Sydney, Australia, in November 2010. The 146 regular session papers presented were carefully reviewed and selected from 470 submissions. The papers of Part I are organized in topical sections on neurodynamics, computational neuroscience and cognitive science, data and text processing, adaptive algorithms, bio-inspired algorithms, and hierarchical methods. The second volume is structured in topical sections on brain-computer interfaces, kernel methods, computational advances in bioinformatics, self-organizing maps and their applications, machine learning applications to image analysis, and applications.

Process Neural Networks

Author: Xingui He
Publisher: Springer Science & Business Media
Total Pages: 240
Release: 2010-07-05
Genre: Computers
ISBN: 3540737626

For the first time, this book sets forth the concept and model for a process neural network. You’ll discover how a process neural network expands the mapping relationship between the input and output of traditional neural networks and greatly enhances the expression capability of artificial neural networks. Detailed illustrations help you visualize information processing flow and the mapping relationship between inputs and outputs.
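
To make the expanded mapping concrete, the following Python sketch shows a single process neuron of the general form y = f( integral over t of w(t)·x(t) dt − theta ), in which both the input x(t) and the connection weight w(t) are functions of time rather than scalars. The discretization, function names, activation, and parameter values here are illustrative assumptions, not code from the book.

    import numpy as np

    # Sketch of one process neuron: the time-varying input is aggregated
    # by a time integral before a static activation is applied.
    def process_neuron(x, w, theta, dt):
        """Output for a sampled input process x(t) with weight function w(t)."""
        aggregated = np.sum(w * x) * dt     # Riemann-sum approximation of the integral
        return np.tanh(aggregated - theta)  # static activation applied afterwards

    # Example: a time-varying input over [0, 1] s, sampled at 1 kHz.
    t = np.linspace(0.0, 1.0, 1000)
    x = np.sin(2.0 * np.pi * 3.0 * t)   # input process x(t)
    w = np.exp(-t)                      # weight function w(t)
    print(process_neuron(x, w, theta=0.1, dt=t[1] - t[0]))

Whereas a traditional neuron maps a vector of scalars to a scalar, this neuron maps an entire input function to a scalar, which is the expanded input-output relationship the blurb refers to.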

Advances in Neural Information Processing Systems 17

Author: Lawrence K. Saul
Publisher: MIT Press
Total Pages: 1710
Release: 2005
Genre: Computational intelligence
ISBN: 9780262195348

The annual Neural Information Processing Systems (NIPS) conference is the flagship meeting on neural computation. It draws a diverse group of attendees: physicists, neuroscientists, mathematicians, statisticians, and computer scientists. The presentations are interdisciplinary, with contributions in algorithms, learning theory, cognitive science, neuroscience, brain imaging, vision, speech and signal processing, reinforcement learning and control, emerging technologies, and applications. Only twenty-five percent of the papers submitted are accepted for presentation at NIPS, so the quality is exceptionally high. This volume contains the papers presented at the December 2004 conference, held in Vancouver.

Spike-timing dependent plasticity

Author: Henry Markram
Publisher: Frontiers E-books
Total Pages: 575
Release:
Genre:
ISBN: 2889190439

Hebb's postulate provided a crucial framework for understanding the synaptic alterations underlying learning and memory. Hebb's theory proposed that neurons that fire together also wire together, which provided the logical framework for the strengthening of synapses. Weakening of synapses, however, was addressed only as "not being strengthened"; the active decrease of synaptic strength was introduced later through the discovery of long-term depression caused by low-frequency stimulation of the presynaptic neuron. In 1994, it was found that the precise relative timing of pre- and postsynaptic spikes determines not only the magnitude but also the direction of synaptic alterations when two neurons are active together. Neurons that fire together may therefore not necessarily wire together if the precise timing of the spikes involved is not tightly correlated. In the subsequent 15 years, spike-timing-dependent plasticity (STDP) has been found in multiple brain regions and in many different species. The size and shape of the time windows in which positive and negative changes can be made vary across brain regions, but the core principle of spike-timing-dependent change remains. A large number of theoretical studies conducted during this period explore the computational function of this driving principle, and STDP algorithms have become the main learning rule used when modeling neural networks. This Research Topic brings together the key experimental and theoretical research on STDP.
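
As a concrete illustration of the timing dependence described above, this Python sketch implements the canonical pair-based STDP rule with exponential time windows: pre-before-post pairings strengthen the synapse, post-before-pre pairings weaken it, and the effect decays as the spikes become less tightly correlated. The amplitudes and time constants are illustrative assumptions (as the text notes, real windows vary by brain region), not values taken from this volume.

    import math

    A_PLUS = 0.01      # peak potentiation per spike pair (assumed)
    A_MINUS = 0.012    # peak depression per spike pair (assumed)
    TAU_PLUS = 20.0    # potentiation window time constant, ms (assumed)
    TAU_MINUS = 20.0   # depression window time constant, ms (assumed)

    def stdp_dw(delta_t_ms):
        """Weight change for one spike pair, delta_t_ms = t_post - t_pre.

        Pre before post (delta_t_ms > 0) potentiates; post before pre
        (delta_t_ms < 0) depresses.
        """
        if delta_t_ms > 0:
            return A_PLUS * math.exp(-delta_t_ms / TAU_PLUS)
        if delta_t_ms < 0:
            return -A_MINUS * math.exp(delta_t_ms / TAU_MINUS)
        return 0.0

    # Pairings 10 ms apart change the weight strongly; 80 ms apart, barely.
    for dt in (-80.0, -10.0, 10.0, 80.0):
        print(f"delta_t = {dt:+6.1f} ms -> dw = {stdp_dw(dt):+.5f}")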

Advances in Neural Information Processing Systems 19

Author: Bernhard Schölkopf
Publisher: MIT Press
Total Pages: 1668
Release: 2007
Genre: Artificial intelligence
ISBN: 0262195682

The annual Neural Information Processing Systems (NIPS) conference is the flagship meeting on neural computation and machine learning. This volume contains the papers presented at the December 2006 meeting, held in Vancouver.

Advances in Neural Information Processing Systems 9

Author: Michael C. Mozer
Publisher: MIT Press
Total Pages: 1128
Release: 1997
Genre: Artificial intelligence
ISBN: 9780262100656

The annual conference on Neural Information Processing Systems (NIPS) is the flagship conference on neural computation. It draws preeminent academic researchers from around the world and is widely considered to be a showcase conference for new developments in network algorithms and architectures. The broad range of interdisciplinary research areas represented includes neural networks and genetic algorithms, cognitive science, neuroscience and biology, computer science, AI, applied mathematics, physics, and many branches of engineering. Only about 30% of the papers submitted are accepted for presentation at NIPS, so the quality is exceptionally high. All of the papers presented appear in these proceedings.

Probabilistic Models of the Brain

Author: Rajesh P.N. Rao
Publisher: MIT Press
Total Pages: 348
Release: 2002-03-29
Genre: Medical
ISBN: 9780262264327

A survey of probabilistic approaches to modeling and understanding brain function. Neurophysiological, neuroanatomical, and brain imaging studies have helped to shed light on how the brain transforms raw sensory information into a form that is useful for goal-directed behavior. A fundamental question that is seldom addressed by these studies, however, is why the brain uses the types of representations it does and what evolutionary advantage, if any, these representations confer. It is difficult to address such questions directly via animal experiments. A promising alternative is to use probabilistic principles such as maximum likelihood and Bayesian inference to derive models of brain function. This book surveys some of the current probabilistic approaches to modeling and understanding brain function. Although most of the examples focus on vision, many of the models and techniques are applicable to other modalities as well. The book presents top-down computational models as well as bottom-up neurally motivated models of brain function. The topics covered include Bayesian and information-theoretic models of perception, probabilistic theories of neural coding and spike timing, computational models of lateral and cortico-cortical feedback connections, and the development of receptive field properties from natural signals.
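
As a toy illustration of the Bayesian inference the book surveys, the following Python sketch combines a Gaussian prior over a stimulus with a Gaussian likelihood from a noisy sensory measurement; the posterior mean is a precision-weighted average, pulled toward the more reliable source. All numbers are illustrative assumptions, not examples from the book.

    # Gaussian prior combined with a Gaussian likelihood (conjugate case).
    mu_prior, var_prior = 0.0, 4.0       # prior belief about stimulus position
    measurement, var_noise = 2.0, 1.0    # noisy sensory observation

    # For Gaussians, the posterior is Gaussian; precisions (1/variance) add,
    # and the posterior mean weights each source by its precision.
    var_post = 1.0 / (1.0 / var_prior + 1.0 / var_noise)
    mu_post = var_post * (mu_prior / var_prior + measurement / var_noise)

    print(f"posterior mean {mu_post:.2f}, variance {var_post:.2f}")
    # -> posterior mean 1.60, variance 0.80: the inferred stimulus lies
    #    between prior and measurement, closer to the less noisy source.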

Handbook on Neural Information Processing

Author: Monica Bianchini
Publisher: Springer Science & Business Media
Total Pages: 547
Release: 2013-04-12
Genre: Technology & Engineering
ISBN: 3642366570

This handbook presents some of the most recent topics in neural information processing, covering both theoretical concepts and practical applications. The contributions include:

- Deep architectures
- Recurrent, recursive, and graph neural networks
- Cellular neural networks
- Bayesian networks
- Approximation capabilities of neural networks
- Semi-supervised learning
- Statistical relational learning
- Kernel methods for structured data
- Multiple classifier systems
- Self-organisation and modal learning
- Applications to content-based image retrieval, text mining in large document collections, and bioinformatics

The book is intended particularly for graduate students, researchers, and practitioners who want to deepen their knowledge of more advanced connectionist models and related learning paradigms.