Estimating Information-Theoretic Quantities
by Robin A. A. Ince, Simon R. Schultz, Stefano Panzeri (2015)
Abstract
Information theory is a practical and theoretical framework developed for the
study of communication over noisy channels. Its probabilistic basis and
capacity to relate statistical structure to function make it ideally suited for
studying information flow in the nervous system. It has a number of useful
properties: it is a general measure sensitive to any relationship, not only
linear effects; it has meaningful units which in many cases allow direct
comparison between different experiments; and it can be used to study how much
information can be gained by observing neural responses in single trials,
rather than in averages over multiple trials. A variety of information
theoretic quantities are in common use in neuroscience - (see entry "Summary of
Information-Theoretic Quantities"). Estimating these quantities in an accurate
and unbiased way from real neurophysiological data frequently presents
challenges, which are explained in this entry.
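One of the challenges alluded to above is the limited-sampling bias: the naive "plug-in" estimator, which substitutes empirical frequencies into the mutual information formula, is systematically biased upward when trial counts are small relative to the number of response categories. The sketch below (an illustration under assumed parameters, not code from the paper) demonstrates this: stimulus and response are drawn independently, so the true mutual information is 0 bits, yet the plug-in estimate averaged over repetitions is clearly positive.

```python
# Illustration (hypothetical parameters) of limited-sampling bias in the
# plug-in mutual information estimator for discrete data.
import math
import random

def plugin_mi(pairs):
    """Plug-in mutual information estimate (in bits) from (stimulus, response) samples."""
    n = len(pairs)
    ps, pr, psr = {}, {}, {}
    for s, r in pairs:
        ps[s] = ps.get(s, 0) + 1
        pr[r] = pr.get(r, 0) + 1
        psr[(s, r)] = psr.get((s, r), 0) + 1
    mi = 0.0
    for (s, r), c in psr.items():
        p_joint = c / n
        # p_joint / (p_s * p_r) simplifies to c * n / (count_s * count_r)
        mi += p_joint * math.log2(c * n / (ps[s] * pr[r]))
    return mi

random.seed(0)
n_trials, n_reps = 50, 200
# Stimulus (4 values) and response (8 values) drawn independently: true MI = 0.
estimates = []
for _ in range(n_reps):
    data = [(random.randrange(4), random.randrange(8)) for _ in range(n_trials)]
    estimates.append(plugin_mi(data))
mean_estimate = sum(estimates) / n_reps
print(f"mean plug-in MI estimate (true value is 0): {mean_estimate:.3f} bits")
```

A first-order correction predicts a bias of roughly (|S|-1)(|R|-1)/(2N ln 2) bits, which motivates the bias-correction procedures discussed in the literature this entry surveys.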
arXiv:1501.01863v1