# cbloom rants

## 11/01/2007

### 11-01-07 - 2

Another simple fun math problem at about the high school level: say you have some cash, it is appreciating at X% per year, and you are spending Y dollars per year (after the appreciation, so the principal changes like P*(1+X/100) - Y). How much money must you start with to last N years (after N years you have zero left)? (Assume no inflation.)

Some easy algebra with a geometric series gives you the answer: P = Y * ( 1 - 1/(1+X%)^N ) / (X%) , where X% stands for X/100.
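A quick sketch (mine, not from the post) to sanity-check that formula: start with the closed-form P, iterate the recurrence P <- P*(1+x) - Y for N years, and confirm you end at zero.

```python
# Check P = Y * (1 - 1/(1+x)^N) / x  (x = X/100) against the recurrence
# from the problem statement: each year, P <- P*(1+x) - Y.

def starting_principal(Y, x, N):
    """Principal needed so that N withdrawals of Y exactly exhaust the money."""
    return Y * (1.0 - 1.0 / (1.0 + x) ** N) / x

def simulate(P, Y, x, N):
    """Apply P <- P*(1+x) - Y for N years; return the final balance."""
    for _ in range(N):
        P = P * (1.0 + x) - Y
    return P

P = starting_principal(30000.0, 0.05, 20)
# Final balance should be zero up to floating-point error.
assert abs(simulate(P, 30000.0, 0.05, 20)) < 1e-6
```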

First of all, the whole thing is proportional to Y, your yearly burn, so let's ignore that part and just concentrate on the scale factor, P/Y.

Let's look at X = 5, which is perhaps somewhere reasonable. For N -> infinity, you get P/Y = 20. That's the most you ever need. A few values:

```
inf : 20
30 : 15.37
20 : 12.46
10 : 7.72
5 : 4.33
1 : 0.95
```
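The table above is easy to reproduce from the formula; a small sketch:

```python
# Reproduce the P/Y table for X = 5%.
# P/Y = (1 - 1/(1.05)^N) / 0.05, and the N -> infinity limit is 1/0.05 = 20.

def p_over_y(x, N):
    """Scale factor: starting principal per unit of yearly spending."""
    return (1.0 - 1.0 / (1.0 + x) ** N) / x

for N in (30, 20, 10, 5, 1):
    print(N, round(p_over_y(0.05, N), 2))
# prints:
# 30 15.37
# 20 12.46
# 10 7.72
# 5 4.33
# 1 0.95
```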
The first few years of retirement each cost you almost 1 unit, but each year you add is much cheaper than the last, so going beyond 30 years is very cheap indeed. Unfortunately that isn't of much use unless you are a royal family thinking of how your fortune will easily ensure a legacy for many generations.

In some cases the number is surprisingly low. If you own your house and live frugally you could easily survive on 30k a year, which means you can retire on just 600k of principal. That also has a safety pad, because near the end you can sell your house or get a reverse mortgage or whatever to cover any exceptional costs.

One thing I don't see discussed very much is that volatility is really bad for investments. Say you have two investments with the same average return; the more constant one will give you a much better net. This is for the same reason that the way to maximize the area of a fixed-perimeter rectangle is with a square. Say both funds average 5% over two years: one is (1.05)*(1.05) = 1.1025, the other is (1.0)*(1.10) = 1.10, and in fact it's even worse if they occasionally take losses, like (0.90)*(1.20) = 1.08.
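The three two-year examples above can be checked directly (my sketch, using the numbers from the post): same arithmetic average, but the steadier sequence compounds to more, which is just the AM-GM inequality behind the rectangle analogy.

```python
# Same average per-year return, different volatility: the steadier
# sequence of growth factors compounds to a larger total (AM-GM).

def compound(factors):
    """Multiply out a list of per-year growth factors."""
    total = 1.0
    for f in factors:
        total *= f
    return total

print(compound([1.05, 1.05]))  # about 1.1025
print(compound([1.00, 1.10]))  # about 1.10
print(compound([0.90, 1.20]))  # about 1.08, worst despite the same average
```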