Principles of Neural Information Theory: Computational Neuroscience and Metabolic Efficiency (Tutorial Introductions)

Price: €19.99 (tax included)

Availability: Sold out

Author: Stone, James V

Brand: Tutorial Introductions

Edition: 1

Format: Illustrated

Number Of Pages: 212

Release Date: 23-05-2018

Details:

Product Description
The brain is the most complex computational machine known to science, even though its components (neurons) are slow and unreliable compared to a laptop computer. In this richly illustrated book, Shannon's mathematical theory of information is used to explore the computational efficiency of neurons, with special reference to visual perception and the efficient coding hypothesis. Evidence from a diverse range of research papers is used to show how information theory defines absolute limits on neural processing; limits which ultimately determine the neuroanatomical microstructure of the eye and brain. Written in an informal style, with a comprehensive glossary, tutorial appendices, and a list of annotated Further Readings, this book is an ideal introduction to the principles of neural information theory.

Review
"This is a terrific book, which cannot fail to help any student who wants to understand precisely how energy and information constrain neural design. The tutorial approach adopted makes it more like a novel than a textbook. Consequently, both mathematically sophisticated readers and readers who prefer verbal explanations should be able to understand the material. Overall, Stone has managed to weave the disparate strands of neuroscience, psychophysics, and Shannon's theory of communication into a coherent account of neural information theory. I only wish I'd had this text as a student!" Peter Sterling, Professor of Neuroscience, University of Pennsylvania, USA.

"Essential reading for any student of the why of neural coding: why do neurons send signals the way they do? Stone's insightful, clear, and eminently readable synthesis of classic studies is a gateway to a rich, glorious literature on the brain. Student and professor alike will find much to spark their minds within. I shall be keeping this wonderful book close by, as a sterling reminder to ask not just how brains work, but why." Professor Mark Humphries, School of Psychology, University of Nottingham, UK.

"This excellent book provides an accessible introduction to an information theoretic perspective on how the brain works, and (more importantly) why it works that way. Using a wide range of examples, including both structural and functional aspects of brain organisation, Stone describes how simple optimisation principles derived from Shannon's information theory predict physiological parameters (e.g. axon diameter) with remarkable accuracy. These principles are distilled from original research papers, and the informal presentation style means that the book can be appreciated as an overview; but full mathematical details are also provided for dedicated readers. Stone has integrated results from a diverse range of experiments, and in so doing has produced an invaluable introduction to the nascent field of neural information theory." Dr Robin Ince, Centre for Cognitive Neuroimaging, Institute of Neuroscience and Psychology, University of Glasgow, UK.

About the Author
Dr James Stone is an Honorary Reader in Vision and Computational Neuroscience at the University of Sheffield, England.

EAN: 9780993367922

Languages: English

Binding: Paperback

Item Condition: New