# Exponentials kill Polynomials

I gave my second session at the UK IMO training and selection camp at Trinity Cambridge earlier today, this one to a group of the more experienced students, on the subject of polynomials. This always feels like a tricky topic to present to olympiad students. I have always felt that there are lots of useful connections between roots and coefficients, but that it is hard to get a handle on exactly what sort of relationship will be useful for each question. Perhaps the main problem is that many of the naturally interesting things to talk about lie annoyingly on the fringes of complex analysis or abstract algebra. Or, at any rate, they are best explained in that language, which isn’t particularly suitable at this stage, when there’s only an hour and a half to play with.

One problem I was particularly keen for the students to attempt was a proof that exponential functions always grow faster than polynomials. I think this is a good problem to think about because the result is so useful in all sorts of areas. In probability, for example, polynomial decay and exponential decay are the two regimes generally discussed for the tail behaviour of distributions of random variables, and all sorts of things are qualitatively different in the two cases. It is also often a useful step in a proof when we need very crude bounds on a function.

Anyway, how to prove it? Well, the first stage is to prove that a polynomial of degree n+1 dominates any polynomial of strictly smaller degree. I am writing ‘dominate’ to mean ‘is eventually larger than’, under the assumption that the leading coefficients are always positive. (As this seems easier than sticking modulus signs everywhere.)

This isn’t too hard. If we take

$P(x)=a_nx^n+\ldots+a_1x+a_0,$

then for any $x>\max(1,|a_n|+\ldots+|a_0|)$, since $x^i\leq x^n$ for each $i\leq n$, we have

$P(x)\leq (|a_n|+\ldots+|a_0|)x^n<x\cdot x^n=x^{n+1}.$
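As a quick numerical sanity check (my addition, with an arbitrary example polynomial, not anything from the session), here is the bound in action:

```python
# Check: for x beyond max(1, |a_n| + ... + |a_0|), x^(n+1) exceeds P(x).
# The coefficients below are an arbitrary illustrative example.
def poly(x, coeffs):
    """Evaluate a_0 + a_1*x + ... + a_n*x^n from coeffs = [a_0, ..., a_n]."""
    return sum(a * x**i for i, a in enumerate(coeffs))

coeffs = [7, -3, 5]                          # P(x) = 5x^2 - 3x + 7, so n = 2
bound = max(1, sum(abs(a) for a in coeffs))  # = 15
for x in range(bound + 1, bound + 100):
    assert x ** len(coeffs) > poly(x, coeffs)  # x^(n+1) > P(x)
```

Note that the bound is crude: $x^{n+1}$ typically overtakes $P(x)$ well before $x$ reaches the sum of the absolute coefficients, but the point here is only that an explicit threshold exists.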

Now we introduce the exponential function. In most applications, it turns out to be most natural to use $e^x$, but for students who haven’t even necessarily done AS-levels I wasn’t happy using a concept whose definition might be rather unfamiliar.

If you are happy with the Taylor series definition

$e^x:=1+x+\frac{x^2}{2!}+\frac{x^3}{3!}+\ldots$

then the result is not too challenging. Given a polynomial P of degree k with positive leading coefficient, by the previous result we have that eventually

$P(x)<1+x+\frac{x^2}{2!}+\ldots+\frac{x^k}{k!}+\frac{x^{k+1}}{(k+1)!}\leq e^x.$
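For concreteness, this chain of inequalities can be checked numerically for a sample polynomial (my example, not one from the session): the partial Taylor sum up to the $x^{k+1}$ term eventually squeezes between $P(x)$ and $e^x$.

```python
import math

def taylor_partial(x, m):
    """1 + x + x^2/2! + ... + x^m/m!, a lower bound for e^x when x > 0."""
    return sum(x**i / math.factorial(i) for i in range(m + 1))

k = 3
def P(x):
    return 2 * x**3 + 10 * x**2 + 1   # arbitrary degree-3 example

# 'Eventually' kicks in once x^4/4! outweighs the lower-degree terms;
# starting the check at x = 60 is comfortably past that point here.
for x in range(60, 150):
    assert P(x) < taylor_partial(x, k + 1) <= math.exp(x)
```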

Although the students were able to follow this proof, they were happier thinking about $P(n)<2^n$. Obviously, we could replace x by n log 2 in the previous argument, but I was pleased with the following direct proof. Ironically, this has much more of the flavour of analysis than the above.

First we can show by induction that $n<2^{n/2}$ for n > 4. It makes sense to take the broader induction hypothesis $4<n<2^{n/2}$, and then show that in this range “adding 1 gives you a smaller answer than multiplying by the square root of two”: that is, $n+1<\sqrt{2}\,n<\sqrt{2}\cdot 2^{n/2}=2^{(n+1)/2}$, where the first inequality holds since $n>4>1/(\sqrt{2}-1)$.
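A short loop confirms both the claim and the induction step over a decent range (this check is my addition):

```python
import math

# n < 2^(n/2) for all n > 4, and the induction step: adding 1 to n
# grows it by less than a factor of sqrt(2) in this range.
for n in range(5, 500):
    assert n < 2 ** (n / 2)
    assert n + 1 < math.sqrt(2) * n  # so n + 1 < sqrt(2) * 2^(n/2) = 2^((n+1)/2)
```

The restriction to n > 4 matters: at n = 4 the two sides are equal, since $4=2^{4/2 \cdot 1/2 \cdot 2}=2^2$.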

From the initial result about polynomials dominating smaller-degree polynomials, it suffices to prove the result for $P(x)=x^k$, for some k, rather than for arbitrary polynomials. Now we can prove this by induction on k, the degree of P. The base case k = 1 follows from the previous paragraph, since $n<2^{n/2}\leq 2^n$.

If $n^k<2^n$ eventually, then $(4n)^k=4^kn^k<2^{2k}\cdot 2^n=2^{n+2k}$, so by changing variables, $n^k<2^{n/4+2k}$ eventually, which is in turn $<2^{n/2}$ once $n>8k$. Combining this with $n<2^{n/2}$ from the previous paragraph, eventually

$n^{k+1}=n\cdot n^k<2^{n/2}\cdot 2^{n/2}=2^{n}.$
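To see the ‘eventually’ concretely, here is a crude search (my own illustration, not part of the proof) for the point after which $n^k<2^n$ holds, for each small k:

```python
def dominates_from(k, limit=200):
    """First n with m^k < 2^m for every m in [n, limit) -- a crude
    stand-in for 'eventually', since 2^m soon leaves m^k far behind."""
    for n in range(1, limit):
        if all(m ** k < 2 ** m for m in range(n, limit)):
            return n
    return None

for k in range(1, 6):
    n0 = dominates_from(k)
    print(f"k = {k}: n^{k} < 2^n from n = {n0}")
```

The thresholds grow roughly linearly in k, which matches the $n>8k$ condition appearing in the induction step, even though that condition is far from tight.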