A Peculiar Limit

A short story about cosines

Bill Markos
Mar 20, 2023 · 10 min read
Limits and speed are two closely intertwined notions in mathematics — Photo by Joshua Hoehne on Unsplash

You know, limits are probably the cornerstone of modern calculus. So, it comes as no surprise that some of the most interesting problems in mathematics are related to them. Today we are going to work with a really peculiar one…

The “Casus Belli”

I was working on a Quantum Control problem the other day and I stumbled upon quite a neat limit:

\lim_{n\to\infty}\cos^{2n}\frac{1}{2n}
Seems simple — and it truly is!

Nice and simple, one would say; and that is actually true. There are many ways to compute this limit, e.g., using the monstrous yet extremely powerful identities about powers of sines and cosines. However, we shall make use of a more common trick here. Consider the following function:

f(x)=\left(\cos x\right)^{1/x}
When does this expression make sense?

Evidently, one has to be quite careful with functions that involve variable exponents, since they are not always well defined. In our case, let's have a quick look at the plot of the cosine around zero:

The plot of cos x around zero — plot generated with desmos.

As you may easily observe, cosine takes strictly positive values around zero, so f(x) is well defined there. Let’s now consider the following limit:

\lim_{x\to0}f(x)=\lim_{x\to0}\left(\cos x\right)^{1/x}
You see where this is going, right?

Probably the most usual way to tackle limits that involve variable exponents is by taking advantage of logarithms and their amazing ability to handle them. So, let’s consider the following limit:

\lim_{x\to0}\ln f(x)=\lim_{x\to0}\frac{\ln\cos x}{x}
No more tricky exponents!

Observe how this limit is of a really well-known form: zero over zero. Such limits are easily computed using a bit of algebra and de l'Hôpital's rule. Namely:

\lim_{x\to0}\frac{\ln\cos x}{x}=\lim_{x\to0}\frac{-\sin x/\cos x}{1}=\lim_{x\to0}\left(-\tan x\right)=0.
Classy, right?

Given that, we can easily compute the limit of f at zero, since:

\lim_{x\to0}f(x)=\lim_{x\to0}e^{\ln f(x)}=e^{0}=1.
Nice and clean result.
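If you want to double-check the above, a computer algebra system can do it for us. Here is a minimal sketch in Python using sympy (my own sanity check, not part of the original derivation):

```python
import sympy as sp

x = sp.symbols('x', positive=True)
f = sp.cos(x) ** (1 / x)  # the auxiliary function f(x) = (cos x)^(1/x)

# sympy computes one-sided limits from the right by default,
# which is exactly the side where f is well defined near zero
print(sp.limit(f, x, 0))  # expected output: 1
```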

Now, where did all this start? Scroll a bit up and you will see that we wanted to compute the limit of the following sequence:

\lim_{n\to\infty}\cos^{2n}\frac{1}{2n}
Do you remember me?

Since the limit of f at zero exists, and the sequence 1/(2n) converges to zero, we can directly infer that:

\lim_{n\to\infty}\cos^{2n}\frac{1}{2n}=\lim_{n\to\infty}f\left(\frac{1}{2n}\right)=1.
Ha! We got that really fast!
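For the skeptics, a quick numerical check in plain Python (again my own addition) shows the sequence creeping towards 1:

```python
import math

# Track cos(1/(2n))^(2n) for growing n
for n in [10, 100, 1000, 10_000]:
    print(n, math.cos(1 / (2 * n)) ** (2 * n))
# The values behave like exp(-1/(4n)) and approach 1 from below.
```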

The Start of the “War”

Okay, we computed a nice limit, quite quickly, and we can move on with our lives, can’t we?

Well, that’s not how mathematicians think — at least most of the times. Once we are done with a problem, it seems like natural for a mathematician to ask “Yeah, but why did this thing work?”. So, let us meditate on the above for a little bit…

[Let that “little bit” pass…]

Okay, back from our meditative break, we can make some comments about the above limit. To begin with, the limit has the following more general form:

\lim_{n\to\infty}\left[f\left(\frac{1}{n}\right)\right]^{n}
Well, there was a “2” there, but, you know, constants do not matter that often, do they?

In the above example, f was a mere cosine, but things might get much worse, for sure. Another interesting property of f is that f(0)=1, which is actually no coincidence, since the limit we ended up with is also 1.

But not all functions with f(0)=1 lead to the above limit being equal to 1 — alas, it might not even exist! So, there must be some other, missing condition that ensures that, given a function f such that f(0)=1 and *something else*, we have:

\lim_{n\to\infty}\left[f\left(\frac{1}{n}\right)\right]^{n}=1.
This is our goal, remember!

Let’s start our investigation!

First Try: A Common Thought

All this thing about functions and limits being equal brings to mind continuity — which, in most cases, boils down to exactly this. So, we might first consider continuity as that missing condition.

Whenever we consider a hypothesis, we have to choose between two paths:

  • Are we going to prove this hypothesis?
  • Are we going to disprove this hypothesis?

In case we have some strong intuition that our hypothesis works, we can start by trying to prove it. But, in this case, do we?

Well, the very fact that we are posing that question means that we don't. So, let us first try to think of some examples, to get a better grasp of what's going on. Consider the following function:

f(x)=1-x
How much simpler could things be?

Then, the desired limit is:

\lim_{n\to\infty}\left[f\left(\frac{1}{n}\right)\right]^{n}=\lim_{n\to\infty}\left(1-\frac{1}{n}\right)^{n}=\frac{1}{e}.
Okay, that went really bad…

As you might recall, 1/e ≈ 0.368 is significantly less than one, so all our thoughts seem to have been in vain.
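A numerical check (a quick sketch of mine, using plain Python) makes the failure concrete:

```python
import math

# f(x) = 1 - x gives f(1/n)^n = (1 - 1/n)^n, the textbook sequence for 1/e
for n in [10, 100, 1000]:
    print(n, (1 - 1 / n) ** n)

print("1/e =", 1 / math.e)  # ~0.3679, nowhere near 1
```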

But, hey, a failure in mathematics can be really informative. To begin with, let us have a look at the plot of this function, compared to cosine’s:

y = cos x (red) and y = 1-x (green).

Now, consider the limit we are trying to compute. As one might observe, 1/n vanishes while n grows indefinitely, which means that, given that f(0)=1 and f is continuous there, f(1/n) gradually approaches 1. However, adding an exponent to this makes things wildly different. As you might already know, while exponentiation tends to make large numbers larger, this is not the case when it comes to small ones — where “small” means “less than 1”. Indeed, observe the following plot:

Various powers of x in [0,1].

The straight line corresponds to y = x, while the rest represent increasingly larger powers of x (the purple one being the largest). Observe how most numbers in (0,1) virtually vanish once raised to a large enough power.

So, there seems to be a trade-off here: a function needs not only to converge to 1 as we approach 0, but to do so relatively fast, so as to cancel out the shrinking effect of exponentiation.
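To make this trade-off concrete, here is a small numerical sketch (my own illustration): a fixed base just below 1 collapses under growing exponents, while a base that approaches 1 quickly enough survives them:

```python
import math

for n in [10, 100, 1000]:
    fixed = 0.99 ** n              # fixed base below 1: collapses towards 0
    moving = math.cos(1 / n) ** n  # base approaching 1 fast: stays near 1
    print(n, fixed, moving)
```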

Second Try: The Expected Step Forward

Okay, we have made a point up there, about speed of convergence. If you scroll a bit up, you shall see how, indeed, cos x converges much faster to 1 than 1 - x as we approach zero, so our argument above makes sense. A way to ensure speed, in some sense, seems to be to demand that our function be strictly concave — that is, between any two points of its graph, the curve should lie strictly above the chord joining them.

For instance, let us consider the following function:

f(x)=1-x^2
Okay, almost like before, just a bit faster.

Its plot, compared to cosine, looks like this:

cos x (red) and f(x) (green).

Okay, this might seem a bit slower than cosine but, again, definitely faster than our previous try. Our limit in this case translates to:

\lim_{n\to\infty}\left(1-\frac{1}{n^2}\right)^{n}=\lim_{n\to\infty}\left[\left(1-\frac{1}{n^2}\right)^{n^2}\right]^{1/n}
Seems familiar…

The innermost parenthesis is quite familiar; indeed, it converges to 1/e. But, what happens when we take the n-th root of a convergent sequence? Well, we know from calculus that for any positive number a the following holds:

\lim_{n\to\infty}\sqrt[n]{a}=1
One of the most useful common limits.

In a similar fashion, since we already know that:

\lim_{n\to\infty}\left(1-\frac{1}{n^2}\right)^{n^2}=\frac{1}{e}
Another useful limit.

we also know that, for any ε > 0 (small enough so that 1/e - ε stays positive), there is a certain integer N after which it holds that:

\frac{1}{e}-\varepsilon<\left(1-\frac{1}{n^2}\right)^{n^2}<\frac{1}{e}+\varepsilon,\quad\text{for all }n\ge N.
Essentially, any convergent sequence is eventually arbitrarily close to its limit.

Then, taking n-th roots of all parts, we get:

\sqrt[n]{\frac{1}{e}-\varepsilon}<\left(1-\frac{1}{n^2}\right)^{n}<\sqrt[n]{\frac{1}{e}+\varepsilon}
Again, you see where this goes…

So, since both the LHS and the RHS of the above inequality converge to 1 (by the common limit about n-th roots above), the same applies to the “sandwiched” part of the inequality. We have finally arrived at our goal:

Okay, we’ve got two functions now.

Observe how the square term is not that significant, in the following sense. Consider the following family of functions, where r > 1:

f_r(x)=1-x^r
Another group of nice concave functions.

Let us have a look at some of their plots, below:

r = 1 (red), r = 1.2 (purple), r = 2 (blue), r = 4 (green).

As you might have observed, the interesting cases are when 1 < r < 2, i.e., when we consider a function faster than 1-x but not “that fast”. So, let us consider the same limit:

I’ve already met you before, right?

Similarly, the innermost parenthesis converges to 1/e while again, taking these strange quasi n-th roots doesn’t change our argument that much. So, again, we get:

\lim_{n\to\infty}\left(1-\frac{1}{n^r}\right)^{n}=1,\quad\text{for any }r>1.
Oh, yeah, now we've got infinitely many!
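The same numerical experiment works for every exponent r > 1 (the sample values of r below are my own picks):

```python
# f_r(x) = 1 - x^r gives f_r(1/n)^n = (1 - 1/n^r)^n
for r in [1.2, 2.0, 4.0]:
    for n in [10, 1000, 100_000]:
        print(r, n, (1 - 1 / n**r) ** n)
# For every r > 1 the values approach 1; the closer r is to 1, the slower.
```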

Okay, so, it seems as if concavity is what we needed, since all functions we have tried so far have worked — more or less under the same argument.

So, shall we try to prove our result?

Third Try: The Catch

No, to prove we shall not. The examples we have considered so far, while providing quite satisfying results, are of a very specific form. So, we should first try to consider a really slow strictly concave function. If you have a little experience with asymptotics and all that stuff, you have probably correctly guessed our next try:

f(x)=1+\ln(1-x)
Of course, logarithms…

To have a quick look at how slow things are, let us compare this with our slowest try so far, 1 - x:

f(x) (red), 1-x (blue) and cos x (green).

Okay, things seem really bad in this case. Indeed, observe that:

It’s nice to have already proved some results.

So, in case the above limit exists, it must be at most 1/e, which is less than 1. Hence, (strict) concavity does not seem to be our much-sought missing condition. Indeed, while there are “hyper-linear” concave functions, there are also slow ones, like the logarithmic example above.

But, if concavity is not what we need, what are we looking for then?

Let us have a closer look at concavity's definition. Formally, a function f is strictly concave over an interval I whenever for any two distinct points x, y of I and any t in (0,1) one has:

f(tx+(1-t)y)>tf(x)+(1-t)f(y)
The classical definition of concavity.

Intuitively, this is interpreted as in the following Figure:

A visualization of concavity’s definition.

Essentially, what the above definition tells us is that, given any two points on the plot of a concave function, each point of the plot between them lies above the corresponding point of the line segment connecting these two points. This “betweenness” is captured by the parameter t we have introduced in the definition of concavity. So, the definition of concavity demands that the function has a certain sort of non-trivial curvature. However, as we have seen, this does not suffice.

But then, what would?

Well, when a concept is not strong enough to capture our intuition, we try to make it stronger by appropriate modifications. So, in this case, since we are trying to impose a certain sort of curvature onto our curve — hoping that this will also increase convergence speed — we might try to demand something stronger than “just” super-linear curvature.

A step in that direction might be the following modification of concavity's definition:

f(tx+(1-t)y)\ge tf(x)+(1-t)f(y)+\frac{a}{2}t(1-t)(x-y)^2
Just another term — where it all begins.

In the above, a might be any positive real number. If you have a closer look at the above inequality, you might see that we have just added one more term, this time of square order with respect to t.

So, let us again get some intuition about that definition. As we said above, in the usual definition, we ask for our function's curvature to be non-trivial. In this case, we ask for something more, as shown below:

Interpreting the definition of strong concavity.

Let us explain what we are seeing in the above Figure. At first, we have fixed two points, in this case (0,1) and (1,0) and have connected them in two different ways:

  • the green line is a plain old straight line;
  • the blue line is a parabola connecting those two points with (0,1) being its peak point.

The red dashed curves correspond to the RHS of strong concavity's definition, for a varying from 0 up to 2. Observe how, the closer a is to 0, the more our refined definition resembles that of typical concavity. On the other hand, the larger a gets, the closer we move to the parabola — in terms of curvature. So, depending on the value of a, a strongly concave function might be really… curvy.
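To tie the definition to something computable, here is a small sketch (my own, assuming the (a/2)t(1-t)(x-y)^2 form of the extra term given above). For f(x) = 1 - x^2, whose second derivative is the constant -2, the strong-concavity inequality holds with a = 2, in fact with equality:

```python
# Check f(t*x + (1-t)*y) >= t*f(x) + (1-t)*f(y) + (a/2)*t*(1-t)*(x-y)**2
# for f(u) = 1 - u**2, which is strongly concave with parameter a = 2.
def f(u):
    return 1 - u**2

a, x, y = 2.0, 0.0, 1.0
for t in [0.1, 0.25, 0.5, 0.75, 0.9]:
    lhs = f(t * x + (1 - t) * y)
    rhs = t * f(x) + (1 - t) * f(y) + (a / 2) * t * (1 - t) * (x - y) ** 2
    print(t, lhs >= rhs - 1e-12)  # True for every t (equality here)
```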

But, again, does this suffice?

A Cliffhanger

You might now expect a proof that strong concavity leads to the desired limit equaling 1. But, we shall discuss more about that in our next story. In the meantime, you can share your thoughts on whether this is going to work in the comments section.

Until next time, stay tuned!
