
Here's a fairly short article about the Riemann Rearrangement Theorem for those who want the details.

In brief, there are certain kinds of infinite series that sum to different values depending on the order in which the numbers are added. In fact, each of these "conditionally convergent" series can be made to total any value whatsoever.
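For those who want to see it concretely, here's a quick sketch (in Python, with arbitrarily chosen targets) of the standard greedy rearrangement applied to the alternating harmonic series 1 − 1/2 + 1/3 − 1/4 + ⋯, which sums to ln 2 in its usual order:

```python
import math

# Greedy rearrangement of the alternating harmonic series: take positive
# terms (1, 1/3, 1/5, ...) until the running sum exceeds the target, then
# negative terms (-1/2, -1/4, ...) until it drops below, and repeat.

def rearranged_partial_sum(target, n_terms):
    """Partial sum after n_terms of the greedy rearrangement toward target."""
    total = 0.0
    pos = 1  # next odd denominator (positive terms)
    neg = 2  # next even denominator (negative terms)
    for _ in range(n_terms):
        if total <= target:
            total += 1.0 / pos
            pos += 2
        else:
            total -= 1.0 / neg
            neg += 2
    return total

print(rearranged_partial_sum(math.pi, 100_000))  # close to pi
print(rearranged_partial_sum(-1.0, 100_000))     # close to -1
```

The partial sums oscillate ever more tightly around whatever target you pick, because the leftover terms shrink to zero while both the positive and negative tails still have infinite "fuel" to draw on.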

This is a totally mainstream and uncontroversial result. I recall learning it in Calc II and not thinking about it beyond it being a neat result. There are lots of weird non-intuitive results where infinite sets are involved and the explanation is usually just "infinity is weird".

What I've wondered recently though is this: If the Riemann Rearrangement Theorem isn't evidence of an inconsistency in mathematics, what would an inconsistency look like?

We have two seemingly contradictory facts:

  1. The order in which terms are added does not change the sum (basic finite arithmetic).
  2. The order in which the terms of a conditionally convergent series are added does change the sum (Riemann Rearrangement Theorem).

I'm eager to hear what my fellow math nerds think about this.

Wow, AI gave me some really solid intuition for why this happens:

Imagine a bathtub filling up with water at the same rate that water is draining out. That's what a conditionally convergent sum is like. Think of the positive terms as the water filling in and the negative terms like water draining out.

But for that tub, you can achieve any water level, simply by filling it up to whatever level you desire first, then letting the inflow and outflow equalize.

reply

That is a good metaphor, but it relies on a temporal dimension that isn't really appropriate.

reply

It's perfectly appropriate! You just have to abstract time away into steps and consider the water added in discrete volumes. Every addition is a new discrete volume of water poured in, and every subtraction is a discrete volume drained. As long as the time window for each action is the same, time is irrelevant, and you can just account for the actions in sequential steps, exactly in the form of a series.

reply

Ok, but the intuition would break then.

If I'm going to fill and drain a bathtub in discrete steps of well defined amounts, only varying the order of operations, then the expectation is that we always end up with the same outcome.
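That expectation is easy to check for the finite case. A throwaway sketch, with fill/drain amounts chosen purely for illustration:

```python
import random

# Sanity check of the finite case: shuffling a fixed, finite list of
# fill (+) / drain (-) steps never changes the final water level.
steps = [1.0, -0.5, 1/3, -0.25, 0.2, -1/6]  # arbitrary amounts

totals = set()
for _ in range(10):
    shuffled = steps[:]
    random.shuffle(shuffled)
    totals.add(round(sum(shuffled), 12))  # round away float rounding noise

print(totals)  # a single value, regardless of order
```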

reply

What the analogy proposes is that the baseline is unknown, which is what allows any order to lead to equally valid yet unequal convergence values. There is then maybe, in the rearrangements, a hidden constant that's inadvertently being changed.

reply
> There is then maybe, in the rearrangements, a hidden constant that's inadvertently being changed.

That's the troubling part, because it's not true for any finite number of steps.

reply

I'm not familiar with the series. But maybe if they were rearranged into two infinite series, so that one gives zero right from the start (i.e. at every finite step), the second might reveal the form of the stealthy constant.

reply

Nope. There are infinitely many ways to divide these kinds of series into one that goes to zero and another that goes to whatever value you want (I'm pretty sure).
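The reason there are so many ways to split it: in a conditionally convergent series, the positive terms alone already diverge (and likewise the negative ones), so either "half" can absorb whatever adjustment you need. A quick look at the positive terms of the alternating harmonic series:

```python
# The positive terms of 1 - 1/2 + 1/3 - ... are 1 + 1/3 + 1/5 + ...,
# whose partial sums grow without bound (roughly like 0.5 * ln n).

def odd_reciprocal_sum(n):
    """Sum of the first n positive terms: 1 + 1/3 + ... + 1/(2n-1)."""
    return sum(1.0 / (2 * k + 1) for k in range(n))

for n in (10, 1_000, 100_000):
    print(n, odd_reciprocal_sum(n))  # keeps growing, never levels off
```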

Perhaps the answer to that is more frightening than if there could be an actually "broken" version: at the very bottom of math, there is nothing even to break.

reply

We'd better hope for incompleteness, since the alternative is inconsistency.

reply

Inconsistency is not only not something to fear but actually a requirement for any theory to be considered scientific, as the "falsifiability" principle demands:
https://en.wikipedia.org/wiki/Falsifiability

You can only even attempt to make a scientific proposition if it's falsifiable. Inconsistency is a strict, even formal, requirement for science to exist at all.

That's the creepy thing about math: at the very bottom, it's not even inconsistent.

reply

Inconsistency and falsifiability aren't the same kind of thing, at least not as I understand it.

Falsifiability means we can potentially demonstrate a claim to be false: i.e. we can find a contradiction that results from the claim. If a contradiction is found, then we reject the claim.

Inconsistency would mean that we can show that a claim and its opposite are both true: i.e. A is true and not-A is true.

Inconsistency actually destroys the premise of falsifiability because the contradiction no longer tells us anything.

reply

I take it that's the more proper interpretation of the words.

In that case, math is riddled with such inconsistencies upfront: take an expression as simple as 1/x, which for the limit x → 0 gives you as valid two opposing extremes: +∞ and −∞. Both extreme opposites are equally true at the exact same point.
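Taking the expression in question to be 1/x near x = 0 (a presumption, but one that matches the reply below about restricting the domain to non-zero values), the two opposing extremes are easy to see numerically:

```python
# Presumed example: f(x) = 1/x. Approaching x = 0 from the right blows up
# toward +infinity, while approaching from the left blows down toward -infinity.

def f(x):
    return 1.0 / x

for x in (0.1, 0.001, 0.00001):
    print(f(x), f(-x))  # right side: huge and positive; left side: huge and negative
```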

reply

Inconsistency is avoided in cases like that by limiting the domain of the function to non-zero values and categorizing the limit as being undefined.

I hadn't thought about it in these terms, but that's an explicit sacrifice of completeness for the sake of consistency.

reply
> Inconsistency is avoided in cases like that by limiting the domain of the function to non-zero values and categorizing the limit as being undefined.

But then we can always avoid any inconsistency by limiting the results to the domain of consistency. Yet this is an interesting avenue of thought: if the definition of a broken math is that there is no way to define a domain of consistency, that would make said "broken math" even more consistent than math itself.

reply

> fairly short

10 pages

@remindme in 3 hours

But seriously, this is a very interesting topic. I need to understand more about the “conditionally convergent” nature.

reply

You just need to get to the first example in the box to see the issue. It’s a good illustration.

reply