
LOL yes, I laughed at that. Checks out with the guy's personality. I won't be surprised if he loses the wallet again in the same way :P
Sad thing is that most of the audience of such posts is people already "acquainted" with bitcoin. What's left for the nocoiners?...
On the other hand, the OP couldn't have chosen a worse wording...
I'm not following. My engineering mind interprets it as: truncating an infinite series to the first terms, after which the value has converged well enough for practical purposes. But I'm sure that's not what you mean.
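That engineering interpretation can be sketched in a few lines. The series for e below is my own illustrative choice, not one from the thread:

```python
# Truncate an infinite series to the first terms, after which the
# partial sum has converged well enough for practical purposes.
# Example series (assumed, for illustration): sum of 1/n! = e.
from math import factorial

def truncated_sum(term, tol=1e-12, max_terms=1000):
    """Sum term(n) for n = 0, 1, 2, ... until a term falls below tol."""
    total = 0.0
    for n in range(max_terms):
        t = term(n)
        total += t
        if abs(t) < tol:
            break
    return total

approx_e = truncated_sum(lambda n: 1 / factorial(n))
print(approx_e)  # ≈ 2.718281828459045
```

The infinite tail is simply dropped once it can no longer change the result at the chosen tolerance.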
It's a possibility. But if pruning an inconsistency introduces an error, maybe such an analysis is flawed from the start. Take for example again the simple form sqrt(x²) = ±x; now let's say we prune the inconsistency only, by limiting the ± only to the positive side. Then sqrt(x²) would go from equating ±x for any value of x to equating x for negative values of x, which is incorrect. The inconsistency introduced by ± is needed for sqrt(x²) to yield the correct result. A different thing is, instead of removing the inconsistency, to limit the domain of the whole expression.
But that is correct. The fact that there is a contradiction does not make the derived math incorrect. It's just part of the right conclusion. Maybe the paradox arises here from equating "contradiction" with "incorrect", when we have seen that some contradictions are structural to some math forms. Actually, an incorrect result will be obtained if the contradiction is pruned.
I think you answered that already. Like for my example, you concluded that in order to keep operating normally, we would have to limit ourselves (depending on the application) to a certain domain. We always have to do that in physics, for instance. Even with something as simple as the equation of the deceleration of a car stopping, you have to limit the domain to the moment it stops, as otherwise the equation predicts it would start accelerating in reverse.
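The braking-car example makes the domain restriction concrete in code (the numbers are made up for illustration):

```python
# Velocity of a car braking with constant deceleration a from speed v0.
# The raw formula v(t) = v0 - a*t is only physically valid on the
# domain 0 <= t <= v0/a; past the stopping time it predicts the car
# accelerating in reverse.
def velocity(v0, a, t):
    t_stop = v0 / a  # the moment the car stops
    if t > t_stop:
        return 0.0   # restrict the domain: the car stays stopped
    return v0 - a * t

# Illustrative numbers (assumed): 30 m/s initial speed, 5 m/s^2 braking.
print(velocity(30, 5, 4))   # 10  (still braking)
print(velocity(30, 5, 10))  # 0.0 (past t_stop = 6 s; raw formula: -20)
```

The equation itself isn't wrong; it's only valid on a domain, and we keep track of that domain separately.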
Inconsistency is avoided in cases like that by limiting the domain of the function to non-zero values and categorizing the limit as being undefined.
But then we can always avoid any inconsistency by limiting the results to the domain of consistency. Yet this is an interesting avenue of thought: if the definition of a broken math is that there is no way to define a domain of consistency, that would make said "broken math" even more consistent than math itself.
I'm sure of that; the thing is that every one of those combinations should reveal the construction of a different constant. That there are infinitely many ways to do so is the first hint that infinitely many different constants are obtainable, hence the differing results.
I'm not familiar with the series. Maybe if they are rearranged into two infinite series, so that one gives zero right from the start (i.e. for finite steps), the second one might reveal the form of the stealthy constant.
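One way to see why no single constant survives such a split, taking the alternating harmonic series as an assumed stand-in for the series under discussion:

```python
# Split the alternating harmonic series 1 - 1/2 + 1/3 - 1/4 + ...
# (an assumed example) into its positive and negative subseries.
# Each subseries diverges on its own, which is exactly why
# rearranging the whole series can change its sum.
def partial(terms, n):
    """Partial sum of the first n terms."""
    return sum(terms(k) for k in range(1, n + 1))

pos = lambda k: 1 / (2 * k - 1)   # 1 + 1/3 + 1/5 + ...
neg = lambda k: -1 / (2 * k)      # -1/2 - 1/4 - 1/6 - ...

for n in (10, 1000, 100000):
    print(n, partial(pos, n), partial(neg, n))
# The positive sums grow without bound and the negative sums fall
# without bound: no "stealthy constant" hides in either half alone.
```

The constant only appears in how the two divergent halves are interleaved, not in either half by itself.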
I take it, it's a more proper interpretation of the words.
In that case, math is riddled with such inconsistencies upfront: take an expression as simple as 1/x, which for the limit of x → 0 gives you as valid two opposing extremes: +∞ and −∞. Both extreme opposites are equally true at the exact same point.
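Numerically (assuming the expression in question is 1/x), the two-sided approach to 0 looks like this:

```python
# Evaluate 1/x on both sides of x = 0 (assuming 1/x is the expression
# under discussion): the values run off to +inf and -inf respectively.
for x in (1e-3, 1e-6, 1e-9):
    print(f"1/{x:g} = {1 / x:g}    1/{-x:g} = {1 / -x:g}")
# The two one-sided limits disagree, which is why the limit at 0 is
# left undefined unless the domain is restricted to one side.
```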
What the analogy proposes is that the baseline is unknown, which is what makes any order lead to equally valid yet unequal convergence values. There is then, maybe, a hidden constant in the rearrangements that is inadvertently being changed.
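That "hidden constant being changed" can be demonstrated directly, again taking the alternating harmonic series as an assumed example: a greedy reordering of the very same terms converges to whatever target you pick.

```python
# Greedily reorder the terms of 1 - 1/2 + 1/3 - 1/4 + ...:
# take the next unused positive term while the running sum is below
# the target, the next unused negative term while it is above.
# Same terms, different order, different "constant".
def rearranged_sum(target, n_terms=100_000):
    total, p, q = 0.0, 0, 0  # p/q index the odd/even denominators used
    for _ in range(n_terms):
        if total <= target:
            p += 1
            total += 1 / (2 * p - 1)   # next positive term
        else:
            q += 1
            total -= 1 / (2 * q)       # next negative term
    return total

print(rearranged_sum(0.5))  # ≈ 0.5
print(rearranged_sum(2.0))  # ≈ 2.0
```

The ordering itself carries the information about where the sum ends up, which is exactly what gets changed by a rearrangement.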
Inconsistency is not only not something to fear but actually a requirement for any theory to be considered scientific, as the "falsifiability" principle demands:
https://en.wikipedia.org/wiki/Falsifiability
You can only even attempt to make a scientific proposition if it's falsifiable. Inconsistency is a strict and even formal requirement for science even to exist.
That's the creepy thing about math: at the very bottom, it's not even inconsistent.
And that's exactly what happened and what didn't happen, respectively.