

fn1
See J.D. Barrow, The Infinite Book (Vintage), for details.

fn2
This argument was first made by the Californian economist Marc Reinganum. For more conservative investors, note that £1 invested in 2007 in a savings account giving 4% compound interest would have risen in value to £1 × (1 + 0.04)^1000 = 108 million billion pounds by the year 3007. However, this will probably still be what you would have to pay for your Sunday newspaper.

38

A Thought for Your Pennies

One of the evils of money is that it tempts us to look at it rather than at the things that it buys.

E.M. Forster

Coins cause delays. Buy something that costs 79 pence (or 79 cents) and you will be rummaging in the depths of your purse for an exact combination of coins to make up the amount. If you don’t, and offer £1 (or $1) instead, then you are about to receive even more small change and it will take you even longer to find the change you need next time. The question is: what is the collection of different coin values that is best for making up your change?

In the UK (and Europe) we have coins of value 1, 2, 5, 10, 20 and 50 in order to make up £1 (or 1 Euro). In the USA there are 1, 5, 10, 25 and (less commonly) 50 cent coins with which to make change of $1 or 100 cents. These systems are ‘handy’ in the sense that the smallest number of coins needed to make up any amount up to 100 can be found by following a simple recipe that computer scientists call a ‘greedy algorithm’. This recipe, and hence its name, makes up the total by taking as many of the largest coins as possible first, then doing the same with the next largest, and so on. If we wanted to make 76 cents then this could best be done by taking only three coins: one 50c, one 25c and one 1c. In the UK, for 76 pence we would need one 50p, one 20p, one 5p and one 1p, so we require a minimum of four coins – one more than the American system needs despite having more denominations to choose from.
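
If you want to try the recipe for yourself, here is a minimal sketch of the greedy algorithm in Python; the function name is just for illustration, and the denominations and answers are the ones quoted above.

def greedy_change(amount, denominations):
    """The greedy recipe: take as many of the largest coin as possible,
    then as many of the next largest, and so on."""
    coins = []
    for coin in sorted(denominations, reverse=True):
        take, amount = divmod(amount, coin)   # how many of this coin fit, and what is left over
        coins.extend([coin] * take)
    return coins

print(greedy_change(76, [1, 5, 10, 25, 50]))     # [50, 25, 1]: three coins in the US system
print(greedy_change(76, [1, 2, 5, 10, 20, 50]))  # [50, 20, 5, 1]: four coins in the UK system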

My last year in school is memorable for the fact that it contained the ‘D-Day’ date of 15 February 1971, when the UK changed to a decimal coinage system. Previously, the old system of £sd (pounds, shillings and pence) had a large number of coins with unusual values, for historical reasons. There were 240 old pennies (labelled d, from the Latin denarii) in £1, and in my childhood there were coins with values ½d, 1d, 3d, 6d, 12d, 24d and 30d, which had the common names halfpenny, penny, threepenny bit, sixpence, shilling, florin and half crown. This system of coins does not have the ‘handy’ property that it makes change most efficiently by following the greedy algorithm. If you wanted to make up 48d (4 shillings), then the algorithm would tell you to use one 30d, one 12d and one 6d – three coins in total. But you could have done it more efficiently by using two 24d pieces. In previous centuries there existed a 4d coin, called a groat, which had the same effect on change-making. A greedy algorithm would make 8d with three coins, a 6d and two 1d, but it can be done with just two groats. This situation arises when we double the value of one of the coins bigger than 2d (the 24d or the 4d): there is no other coin with that doubled value, so the greedy algorithm reaches first for a larger coin and then cannot finish so economically. This cannot happen with the modern US, UK and Euro denominations.

All the modern systems that are in use pick from a similar set of round numbers, 1, 2, 5, 10, 20, 25 and 50, for their coins. But is this the best possible set to choose from? The numbers are relatively easy to add up and combine, but are they optimal when it comes to making change with as few coins as possible?

A few years ago Jeffrey Shallit, based in Waterloo, Ontario, carried out a computer investigation of the average number of coins you need to make up each amount of change from 0 to 99 cents in the US system, for different choices of coin denominations. If you use the actual collection of 1, 5, 10 and 25 cent denominations, then the average number of coins you will need turns out to be 4.7. If you only had a 1 cent coin you would need 99 coins to make up 99 cents, and the average over all the amounts is 49.5 – the worst possible. If you just have a 1 cent and a 10 cent coin then the average falls to 9. The interesting question is whether we can make the average lower with a different set of four coins than the actual set of 1, 5, 10 and 25 cent pieces. The answer is ‘yes’, and there are two different sets of coins that do better. If there were only 1, 5, 18 and 29 cent coins, or 1, 5, 18 and 25 cent coins, then the average number of coins needed would be just 3.89. The best one to take is the 1, 5, 18 and 25 cent solution, as this only requires one small change from the actual system – replace the 10 cent dime with a new 18 cent piece!
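
If you would like to check these figures yourself, a short Python sketch will do it. It assumes Shallit's convention of averaging the fewest-coin counts over the 100 amounts from 0 to 99 cents, and it finds the minimum for each amount by dynamic programming rather than the greedy recipe, so the comparison is fair even for an awkward denomination like 18; the helper names are just illustrative.

def min_count(amount, denominations):
    """Fewest coins that can make up `amount` exactly (dynamic programming)."""
    best = [0] + [amount + 1] * amount       # amount + 1 stands in for 'impossible'
    for n in range(1, amount + 1):
        for coin in denominations:
            if coin <= n:
                best[n] = min(best[n], best[n - coin] + 1)
    return best[amount]

def average_coins(denominations):
    """Average of the fewest-coin counts over every amount from 0 to 99 cents."""
    return sum(min_count(n, denominations) for n in range(100)) / 100

print(average_coins([1, 5, 10, 25]))   # 4.7   the actual US denominations
print(average_coins([1]))              # 49.5  pennies only: the worst possible
print(average_coins([1, 10]))          # 9.0
print(average_coins([1, 5, 18, 25]))   # 3.89  replace the dime with an 18 cent coin

Running the same calculation over every possible set of four denominations is what turns up the two record-holders mentioned above.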

We can see what happens in the UK or the Euro system when we subject it to a similar analysis and ask what new coin we should add to our currency to make it as efficient as possible. These systems now have coins with denominations of 1, 2, 5, 10, 20, 50, 100 and 200 (i.e. in the UK we now have £1 and £2 coins and no £1 bank note). Suppose we work out the average number of coins needed to make up all amounts from 1 (penny or cent) to 500 in these systems: it is 4.6. But this could be reduced to 3.92 by adding a coin of value 133 or 137 pence or cents.

39

Breaking the Law of Averages

There are three types of lies – lies, damned lies, and statistics.

Benjamin Disraeli

Our look at the generation of wind power in Chapter 35 hides an interesting subtlety about statistics that it is good to be aware of. We all know that there are three types of lies – lies, damned lies, and statistics, as Disraeli warned – but it is one thing to know that there is something to beware of, quite another to know where the danger lurks. Let’s see how you could be misled by a sneaky use of statistics in the case of wind power. We know that the power available in the wind is proportional to the cube of the wind’s speed, V³. To avoid writing in all the other quantities involved, like the density of the air, which don’t change, let’s just say that the power generated per unit of time is equal to this, so P = V³. For simplicity we assumed that the average speed of the wind over a year was 5 metres per second and so the power generated over one year would be 5³ × 1 yr = 125 units.

In practice, the average speed of the wind is not what the wind speed always is. Suppose that we allow for a very simple variation. Assume that the wind’s speed is zero for half of the days of the year and equal to 10 metres per second for the other half. The average speed over the year is still ½ × 10 = 5 metres per second. But what is the power generated now? For half of the year it must be zero because the wind speed is zero; for the other half it is ½ × 10³ = 500. The total generated over the year is therefore 500 units, much more than we get if we just take the average speed. The days with the higher wind speeds vastly overcompensate in power output for the days with no wind at all. In reality, the distribution of wind speeds over a year is more complicated than the simple one used here, but it has the same property of producing much more power gain when the wind speed is above average than power loss when it is below average. This is a situation that contravenes what is proverbially known as the law of averages. In fact, this is not a general law at all, but just the intuition that many people have that variations all even out in the end: that is, in the long run there will be as many gains as losses of comparable sizes above and below the average behaviour. This is true only for random variations governed by statistics that have a special symmetric pattern. In our problem of the wind power generation this is not the case, and so the above-average winds have a much bigger effect than the below-average winds. Beware of Mister Average – and Ms Average too.
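
If you want to see the arithmetic laid out, here is a tiny sketch in the simplified units used above, where the power per unit of time is just the cube of the speed; the fifty-fifty split of calm and windy days is the assumption made in the text.

speeds = [0.0] * 50 + [10.0] * 50      # calm for half the days, 10 m/s for the other half

mean_speed = sum(speeds) / len(speeds)                    # 5.0, the same average speed as before
power_from_mean = mean_speed ** 3                         # 125 units, using the average speed
power_actual = sum(v ** 3 for v in speeds) / len(speeds)  # 500 units, averaging the cubes instead

print(mean_speed, power_from_mean, power_actual)          # 5.0 125.0 500.0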

40

How Long are Things Likely to Survive?

Statistics are like a bikini. What they reveal is suggestive, but what they conceal is vital.

Aaron Levenstein

Statistics are powerful tools. They seem to tell you something for nothing. They often predict the outcomes of our elections on the basis of just a few early samples of voters’ decisions in one or two constituencies. To the outsider it appears that they are able to draw conclusions from almost no evidence at all. One of my favourite examples of this apparent wizardry is with regard to predicting the future. If we have some institution or tradition that has been in existence for a given period of time, how long should we expect it to survive in the future? The key idea is rather simple. If you were to see something at a random moment, then there is a 95% chance that you will be glimpsing it during the middle 95% of its total lifetime. Consider an expanse of time equal to 1 time unit and take out the middle interval of 0.95, so that there is an interval of length 0.025 at the beginning and another of length 0.025 at the end. Call the moment where the first little interval ends A, and the moment where the last one begins B.

If you observed at A, then the future occupies 0.95 + 0.025 = 0.975 and the past occupies 0.025. So the future is 975/25 = 39 times longer than the past. Similarly, if you observed at time B then the future would be only a fraction, 1/39, of the past.

When we observe something in history, like the existence of the Great Wall of China or Cambridge University, then, if we assume that we are not observing it at a special time in history, in the sense that there is no reason why we should be particularly near its beginning or its end, we can predict how long we should expect it to exist in the future with a 95% level of confidence.fn1 The little intervals above give us the numbers we need. If an institution has been around for Y years and we are observing it at a random time in history, then we can be 95% sure that it will exist for at least Y/39 years but not for more than 39 × Y years.
In 2008 Cambridge University had been in existence for 800 years. This formula predicts that its future longevity beyond that date has a 95% chance of being at least 800/39 = 20.5 years, approximately, and at most 800 × 39 = 31,200 years. The United States was declared an independent country in 1776. It has a 95% chance of lasting more than 5.7 years but less than 8,736 years. The human race has been around for about 250,000 years, so it will survive for between 6,410 and 9,750,000 years with 95% confidence. The Universe has been expanding for 13.7 billion years. If this is how long it has been in existence, then with 95% probability it will last for more than another 351 million years and less than 534.3 billion years.
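
The rule is simple enough to put into a few lines of code if you want to generate your own predictions; this is only a rough sketch, and the examples simply restate the figures above.

def survival_bounds(age):
    """95% confidence bounds on the remaining lifetime of something of a given age,
    assuming we are observing it at a random moment in its total lifetime."""
    return age / 39, age * 39

for name, years in [("Cambridge University", 800),
                    ("the human race", 250_000),
                    ("the Universe", 13.7e9)]:
    least, most = survival_bounds(years)
    print(f"{name}: more than {least:,.1f} and fewer than {most:,.0f} further years")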

Try it on some things that have been around for rather less time. My house has been standing for 39 years, so I should be 95% sure it won’t succumb to some random disaster in the next year but I (or somebody) should be surprised at the 95% level if it survives for 1,521 years. You can try it out on football club managers, businesses, countries, political parties, fashions and the runs of West End plays. It is interesting to make predictions too. As I write this, Gordon Brown has been Prime Minister of the UK since 27 June 2007 – almost exactly three months. He addressed his first Labour Party Conference as Prime Minister yesterday, but we could tell him that with 95% confidence he will be in office for at least another two or three days but not more than 9¾ years. By the time this book is published you might be able to test this prediction!

fn1
It is tempting to continue applying this rule to everything and anything, but a word of caution is in order. Some things have a life expectancy that is determined by something more than chance alone (such as biochemistry). When this probability rule is applied in situations where there is a non-random process determining the timescale for change, it leads to a wrong conclusion about the longest (or the shortest) expected survival time. The formula says there is a 95% chance that a 78-year-old person will live for between 2 and 3042 more years. However, we would expect that the chance of them living for more than 50 years is precisely zero. Age 78 was not a random time in that person’s life: it is much closer to the biological endpoint than to the day when they were born. For more detailed discussion see http://arxiv.org/abs/0806.3538.

41

A President who Preferred the Triangle to the Pentagon

The triangle appears to require no specialist ability to play.

The Triangle, Wikipedia

James Garfield was the 20th President of the United States. You probably know nothing about him except perhaps that he was shot on 2 July 1881, after just four months in office, by an unhappy member of the public who had wanted to be appointed to a federal post, and that he died eleven weeks later. Strangely, the lethal bullet that lodged in his body was never found during his lifetime, despite Alexander Graham Bell being commissioned to invent a metal detector for the purpose. He succeeded in making such a device, but it was not very effective, primarily, we suspect, because Garfield’s bed at the White House had a metal frame – a very rare thing at the time – and nobody suspected that it was the cause of the instrument’s malfunction. In retrospect, the real cause of his death was careless medical care that punctured his liver. Thus he became the second President to be assassinated, and he served the second-shortest term of office. Despite this sorry end, his name lives on in a small way because of an unusual contribution he made to mathematics.
