## Gambling and Shannon’s entropy function.

A little while ago Matt posted an overview of the definition of Shannon’s entropy function and its use in assessing interaction between random variables. I’d like to stay with the Bell Labs team and describe some of the results from J. L. Kelly’s wonderful 1956 paper, which builds on the definitions and ideas in Shannon’s original work.

Consider first the trivial scenario where we have a noiseless binary channel that transmits the results of, say, a baseball game. If the odds are even and the gambler has access to this channel, he can grow his capital exponentially by betting everything each time (since he knows the outcome before betting): his capital grows by a factor of $2^N$ after N bets. In view of the fact that any such measure should be maximal in this scenario, Kelly defines the exponential rate of growth of capital as

$G = \lim_{N \rightarrow \infty} \frac{1}{N} \log_2\left(\frac{V_N}{V_0}\right),$

where $V_0$ is the initial capital and $V_N$ is the capital after N bets.
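As a quick sanity check on this definition, here is a minimal sketch (with made-up numbers) computing the finite-N version of G in the noiseless case, where the capital doubles on every bet:

```python
import math

def growth_rate(v0, vn, n):
    # Finite-N version of Kelly's exponential rate of growth:
    # (1/N) * log2(V_N / V_0).
    return math.log2(vn / v0) / n

# Noiseless channel: the gambler knows each outcome in advance and
# doubles his capital on every bet, so V_N = 2^N * V_0.
v0 = 100.0
n = 20
vn = v0 * 2 ** n

print(growth_rate(v0, vn, n))  # -> 1.0 (one bit per bet)
```

As expected, the rate comes out to exactly 1 bit per bet, the maximum possible for a binary channel.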

Now suppose that the channel is noisy, and a given symbol has probability p of error and probability q = 1 − p of correct transmission. The question now is: how much should the gambler bet?

He could bet his entire capital every time, which would maximize his expected capital, here equal to $(2q)^N V_0$. However, as N gets large this is a terrible strategy, since with probability 1 the gambler will eventually lose a bet and be ruined. So let us assume instead that he bets a fixed fraction, l, of his capital at any given time. Then

$V_N = (1+l)^W (1-l)^L V_0,$

where W and L are the number of wins and losses, respectively, in the N bets. Then we have

$G = \lim_{N \rightarrow \infty} \left(\frac{W}{N} \log_2(1+l) + \frac{L}{N} \log_2(1-l)\right) = q \log_2(1+l) + p \log_2(1-l),$

with probability 1, since by the strong law of large numbers $W/N \rightarrow q$ and $L/N \rightarrow p$.
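To see this convergence concretely, here is a small simulation sketch (the values q = 0.6 and l = 0.2 are illustrative, not from the paper) comparing the empirical growth rate over many bets with the closed-form expression above:

```python
import math
import random

def expected_growth(q, l):
    # G = q*log2(1+l) + p*log2(1-l), with p = 1 - q.
    p = 1 - q
    return q * math.log2(1 + l) + p * math.log2(1 - l)

def simulate_growth(q, l, n_bets, seed=0):
    # Bet a fixed fraction l of capital each round; the bet wins with
    # probability q. Track log2 of the capital ratio to avoid overflow.
    rng = random.Random(seed)
    log2_ratio = 0.0  # log2(V_N / V_0)
    for _ in range(n_bets):
        if rng.random() < q:
            log2_ratio += math.log2(1 + l)
        else:
            log2_ratio += math.log2(1 - l)
    return log2_ratio / n_bets

q, l = 0.6, 0.2
print(expected_growth(q, l))           # about 0.029 bits per bet
print(simulate_growth(q, l, 200_000))  # close to the same value
```

With a couple hundred thousand simulated bets, the empirical rate lands within a fraction of a percent of the formula, illustrating the "with probability 1" convergence.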

If we maximize G with respect to l, by setting $dG/dl = 0$, we get $l = q - p$, i.e. $1+l = 2q$ and $1-l = 2p$, and

$G_{max} = 1 + p \log_2(p) + q \log_2(q),$

which is the rate of transmission function defined by Shannon. Hence, in a nondeterministic game, optimizing the gambler’s strategy over a probabilistic information channel recovers the channel’s rate of transmission.
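A brute-force numerical check (a sketch, again using the illustrative value q = 0.6) confirms that the maximizing fraction is l = q − p and that the maximum of G equals $1 + p \log_2(p) + q \log_2(q)$:

```python
import math

def G(l, q):
    # Growth rate for betting fraction l on a channel with success probability q.
    p = 1 - q
    return q * math.log2(1 + l) + p * math.log2(1 - l)

def best_fraction(q, steps=100_000):
    # Grid search over l in [0, 1) for the fraction that maximizes G.
    best_l, best_g = 0.0, G(0.0, q)
    for i in range(1, steps):
        l = i / steps
        g = G(l, q)
        if g > best_g:
            best_l, best_g = l, g
    return best_l, best_g

q = 0.6
p = 1 - q
l_star, g_max = best_fraction(q)

print(l_star)  # -> 0.2, i.e. q - p
print(g_max)   # matches 1 + p*log2(p) + q*log2(q)
```

The grid search is crude but makes the point: the optimum sits exactly at l = q − p, and the growth rate there coincides with Shannon’s rate of transmission for the channel.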

In the next post, I’ll talk about what happens when these results are extended to a more general case, where the channel has many possible inputs and outputs and the odds need not be fair. The amazing result is that, regardless of whether the odds are fair or not, the gambler will ignore them when maximizing the exponential rate of growth function G.