"information theory"

Rate-distortion functions for gamma-type sources under absolute-log distortion measure

When the information source has a continuous distribution and the rate-distortion function is strictly larger than the Shannon lower bound, evaluating the rate-distortion function explicitly is not straightforward. We evaluate the rate-distortion …
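
For reference (standard definitions, not taken from the truncated abstract): for an i.i.d. source X and a single-letter distortion measure d, with a difference distortion d(x, y) = ρ(x − y), the rate-distortion function and the Shannon lower bound take the forms

\[
R(D) = \min_{p(y|x)\,:\;\mathbb{E}[d(X,Y)]\le D} I(X;Y),
\qquad
R(D) \ge R_{\mathrm{SLB}}(D) = h(X) - \max_{\mathbb{E}[\rho(Z)]\le D} h(Z),
\]

so when the inequality is strict, R(D) cannot be read off from the lower bound and must be evaluated directly.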

Channel capacity and achievable rates of peak power limited AWGNC, and their applications to adaptive modulation and coding

In wireless communications, channel conditions vary over time. To transmit information efficiently, digital wireless communication systems choose the modulation and coding scheme adaptively. This framework is called the adaptive …
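
As background (a standard result, not from the abstract): without a peak-power constraint, the real AWGN channel Y = X + Z with Z ~ N(0, σ²) and average-power constraint E[X²] ≤ P has capacity

\[
C = \tfrac{1}{2}\log_2\!\left(1 + \frac{P}{\sigma^2}\right)\ \text{bits per channel use};
\]

with an additional peak-power constraint |X| ≤ A, no comparable closed form is available, so the capacity and achievable rates have to be evaluated numerically.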

Optimization of probability measure and its applications in information theory

Since Shannon's seminal work, channel capacity has been a fundamental quantity in information theory. Its general definition is formulated as an optimization problem over a probability measure under moment constraints. Similarly, the definition of …
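
In the standard formulation (sketched here as background, with generic symbols), the capacity of a channel with transition density W(y|x) is the value of an optimization over input probability measures F subject to moment constraints:

\[
C = \sup_{F\in\mathcal{F}} \int\!\!\int W(y|x)\,\log\frac{W(y|x)}{\int W(y|x')\,dF(x')}\,dy\,dF(x),
\qquad
\mathcal{F} = \Bigl\{F : \int g_i(x)\,dF(x) \le \Gamma_i,\ i=1,\dots,m\Bigr\},
\]

where the g_i are moment functions (for example g(x) = x² for an average-power constraint).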

Rate-distortion function for gamma sources under absolute-log distortion measure

We evaluate the rate-distortion function for i.i.d. gamma sources with respect to the absolute-log distortion measure. A logarithmic transformation reduces this rate-distortion problem to one under the absolute distortion measure. Extending …
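
The reduction mentioned above can be written out as a standard change of variables (not a derivation from the paper): with U = log X and V = log Y,

\[
d(x,y) = |\log x - \log y| = |u - v|,
\qquad
I(X;Y) = I(U;V),
\]

since the logarithm is a bijection, so the rate-distortion function of X under the absolute-log distortion equals that of U = log X under the ordinary absolute distortion.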

Convex formulation for nonparametric estimation of mixing distribution

We discuss a nonparametric method for estimating the mixing distribution in mixture models. We propose an objective function with one parameter whose minimization reduces to maximum likelihood estimation or kernel vector quantization in …
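
To make the setting concrete, here is a minimal sketch assuming a grid-based nonparametric maximum-likelihood estimate of the mixing distribution for a Gaussian mixture; it is not the one-parameter objective proposed in the paper. With the component densities fixed on a grid, the negative log-likelihood is convex in the mixing weights, and the EM fixed-point update below increases the likelihood at every step.

```python
import numpy as np
from scipy.stats import norm

def npmle_mixing_weights(x, grid, n_iter=500):
    # L[i, j] = density of observation x[i] under the component centred at grid[j]
    L = norm.pdf(x[:, None], loc=grid[None, :], scale=1.0)
    w = np.full(len(grid), 1.0 / len(grid))     # start from uniform weights
    for _ in range(n_iter):
        mix = L @ w                             # mixture density at each x[i]
        w = w * (L.T @ (1.0 / mix)) / len(x)    # EM update; weights stay on the simplex
    return w

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2.0, 1.0, 200), rng.normal(3.0, 1.0, 300)])
grid = np.linspace(-6.0, 6.0, 61)
w = npmle_mixing_weights(x, grid)
print(grid[w > 0.05], w[w > 0.05])              # recovered support points and weights
```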

An Introductory Review of Information Theory in the Context of Computational Neuroscience

This article introduces several fundamental concepts in information theory from the perspective of their origins in engineering. Understanding such concepts is important in neuroscience for two reasons. Simply applying formulae from information …

Capacity of a Single Spiking Neuron Channel

Information transfer through a single neuron is a fundamental component of information processing in the brain, and computing the channel capacity is important for understanding this information processing. The problem is difficult since the …
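
For a finite-alphabet memoryless channel, capacity can be computed numerically with the Blahut-Arimoto algorithm; the sketch below is that generic algorithm, not the spiking-neuron channel model considered in the paper.

```python
import numpy as np

def blahut_arimoto(W, n_iter=1000):
    """W[x, y] = P(y | x) for a discrete memoryless channel; returns capacity in bits."""
    p = np.full(W.shape[0], 1.0 / W.shape[0])   # input distribution, start uniform
    for _ in range(n_iter):
        q = p @ W                               # induced output distribution
        ratio = np.where(W > 0, W / q, 1.0)
        d = np.sum(W * np.log(ratio), axis=1)   # D(W(.|x) || q) in nats, per input x
        c = np.exp(d)
        p = p * c / np.sum(p * c)               # multiplicative update of p
    q = p @ W
    ratio = np.where(W > 0, W / q, 1.0)
    return np.sum(p[:, None] * W * np.log2(ratio))   # I(X;Y) at the final p

# Binary symmetric channel with crossover 0.1: capacity is 1 - H_b(0.1), about 0.531 bits
W = np.array([[0.9, 0.1], [0.1, 0.9]])
print(blahut_arimoto(W))
```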

Stochastic reasoning, free energy, and information geometry

Belief propagation (BP) is a universal method of stochastic reasoning. It gives exact inference for stochastic models with tree interactions and works surprisingly well even when the models have loopy interactions. Its performance has been analyzed …
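
As a reference point (the standard sum-product form, not the information-geometric analysis of the paper), BP on a pairwise model with node potentials φ_i and edge potentials ψ_ij passes messages and forms beliefs as

\[
m_{i\to j}(x_j) \propto \sum_{x_i} \psi_{ij}(x_i, x_j)\,\phi_i(x_i)\!\!\prod_{k\in N(i)\setminus\{j\}}\!\! m_{k\to i}(x_i),
\qquad
b_i(x_i) \propto \phi_i(x_i)\prod_{k\in N(i)} m_{k\to i}(x_i),
\]

which yields exact marginals on trees and only approximate beliefs on loopy graphs.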

Information geometry of turbo and low-density parity-check codes

Turbo and LDPC (low-density parity-check) codes are simple, relatively new error-correcting codes that deliver powerful and practical error-correction performance. Although experimental results show their efficacy, further theoretical analysis is …
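
For concreteness (a standard definition, not from the abstract): a binary LDPC code is the null space of a sparse parity-check matrix H over GF(2),

\[
\mathcal{C} = \{\,x\in\{0,1\}^n : Hx = 0 \pmod 2\,\},
\]

and both turbo and LDPC decoding are commonly described as belief propagation on the graph defined by the code, which ties these codes to the BP analysis mentioned in the previous entry.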