Information and Coding Theory | SpringerLink
A major stumbling block to cracking the real-time neural code is neuronal variability: neurons discharge spikes with enormous variability, not only across trials within the same experiment but also in resting states. Such variability is widely regarded as noise and is often deliberately averaged out during data analysis. In contrast to this dogma, we put forth the Neural Self-Information Theory, which holds that neural coding operates on the self-information principle: variability in the time durations of inter-spike intervals (ISIs), i.e., neuronal silence durations, is self-tagged with discrete information. As a self-information processor, each ISI carries an amount of information determined by its probability of occurrence: higher-probability ISIs, which reflect the balanced excitation-inhibition ground state, convey minimal information, whereas lower-probability ISIs, which signify rare-occurrence surprisals in the form of extremely transient or prolonged silence, carry the most information. These variable silence durations are naturally coupled with intracellular biochemical cascades, energy equilibrium, and dynamic regulation of protein and gene expression levels.
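The self-tagging idea above can be sketched numerically: given an empirical ISI distribution, each interval's self-information is I(isi) = -log2 P(isi), so common ISIs near the ground state carry few bits while rare (extremely short or long) ISIs carry many. The sketch below is illustrative only, not the authors' code; the gamma-distributed ISI sample and its shape/scale parameters are assumptions chosen because gamma models are a common choice for ISIs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ISI sample in milliseconds; gamma shape/scale are
# arbitrary illustrative assumptions, not values from the paper.
isis = rng.gamma(shape=2.0, scale=50.0, size=10_000)

# Estimate the ISI probability distribution with a histogram.
counts, edges = np.histogram(isis, bins=50)
probs = counts / counts.sum()

# Self-information (in bits) of each occupied bin: -log2 P(bin).
occupied = probs > 0
info_bits = -np.log2(probs[occupied])

# High-probability bins (the ground state) carry the fewest bits;
# rare tail bins ("surprisals") carry the most.
print(f"min self-information: {info_bits.min():.2f} bits")
print(f"max self-information: {info_bits.max():.2f} bits")
```

The spread between the minimum and maximum illustrates the core claim: information is concentrated in the low-probability ISIs, not in the typical ones.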
Information entropy - Journey into information theory - Computer Science - Khan Academy
Information and Coding Theory (Springer Undergraduate Mathematics Series)
Information theory is a relatively young subject, yet it underpins everyday technology: every time you make a phone call, store a file on your computer, query an internet search engine, watch a DVD, stream a movie, or listen to a CD or mp3 file, you are relying on its results. Independent of such applications, however, the underlying mathematical objects arise naturally as soon as one starts to think about "information" in a mathematically rigorous way. In fact, a large part of the course deals with two fundamental questions: how far can data be compressed, and how reliably can data be transmitted over a noisy channel? The student will have learned about entropy, mutual information and divergence, their basic properties, and how they relate to information transmission, and will understand the theoretical limits that noise imposes on transmitting information.
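The two concepts the course description names, entropy and the noise-imposed limit on transmission, can be made concrete with a small sketch. This is an illustrative example, not course material: it computes Shannon entropy and the standard capacity formula C = 1 - H(p) for a binary symmetric channel with flip probability p.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a discrete distribution p."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]  # 0 * log(0) is taken as 0
    return float(-(nz * np.log2(nz)).sum())

def bsc_capacity(flip_prob):
    """Capacity in bits per use of a binary symmetric channel,
    C = 1 - H(flip_prob): the theoretical limit on reliable
    transmission through a channel that flips bits with this
    probability."""
    return 1.0 - entropy([flip_prob, 1.0 - flip_prob])

# A fair coin is maximally uncertain: 1 bit of entropy.
print(entropy([0.5, 0.5]))          # 1.0
# At an 11% flip rate, roughly half a bit per use survives.
print(round(bsc_capacity(0.11), 3))
# At a 50% flip rate the output is independent of the input,
# so no information gets through at all.
print(bsc_capacity(0.5))            # 0.0
```

Note how capacity degrades smoothly with noise and vanishes entirely at p = 0.5, which is exactly the "theoretical limit due to noise" the course refers to.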
My research interests are mainly in Group Theory and its applications to areas such as Combinatorics, Galois Theory, Geometry and Topology. Current research projects involve dessins d'enfants and Beauville surfaces. I am currently working on textbooks on dessins d'enfants and the theory of algorithms. I have supervised about a dozen PhD students. Since retiring, I have continued my research as an Emeritus Professor in Mathematical Sciences.