Regarding this “+ an infinitesimal” idea:
Let’s grant the possibility that (0.\dot9 + infinitesimal = 1.0)
Divide by 3 to get:
(0.\dot3 + \frac{1}{3}(infinitesimal) = \frac{1}{3})
What the hell is (\frac{1}3(infinitesimal))?
Isn’t the infinitesimal the smallest possible real number? How can you then divide it by 3 to make it even smaller?
Or is it that (0.\dot3 + infinitesimal = \frac{1}3)?
In which case multiplying by 3 gives:
(0.\dot9 + 3 \times infinitesimal = 1.0)
Now we have the difference between (0.\dot9) and (1.0) being 3 times something?
Or do we allow the double standard that infinitesimals are as small as we need them to be, but infinities can all be different sizes?
The ridiculousness of the infinitesimal is infinite.
So using (0.000…1) both to make (0.\dot3 = \frac{1}{3}) and to make (0.\dot9 = 1) is entirely inconsistent.
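For what it’s worth, the gap between 1 and any finite string of nines can be checked with exact rational arithmetic (a minimal Python sketch using the standard `fractions` module; the variable names are mine):

```python
from fractions import Fraction

# With n nines, the gap between 0.99...9 and 1 is exactly 10**-n.
# Whatever candidate "smallest" gap you pick can still be divided by 3,
# so no smallest positive rational exists for the "infinitesimal" to be.
for n in (1, 5, 10):
    partial = Fraction(10**n - 1, 10**n)  # 0.9, 0.99999, ...
    gap = 1 - partial                     # exactly 10**-n
    assert gap == Fraction(1, 10**n)
    assert gap / 3 < gap                  # always divisible into something smaller
```

This is only a finite illustration, of course: it shows there is no smallest positive gap, which is exactly why a fixed “infinitesimal” difference has nowhere to live.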
(\frac{1}{\infty}) is as absurd as it is undefined and “convenient” - that’s what leads to logical contradictions, not that people such as Ecmandu “take it a bit too literally”.
I can just as easily say that most likely I do understand your arguments, and you merely think I don’t.
The problem is neither of us can point out what’s beyond the understanding of the other because it’s beyond “their” understanding.
Either I lack the understanding to see your understanding, and therefore that’s why I don’t accept it, or you lack the understanding to see my understanding, and therefore that’s why you don’t accept it.
Who is right?
Considering these natural restrictions, the less smart person is ruled out from legitimately arbitrating this dilemma - hence why multiple people besides me are trying to point you in the less preferable direction: that the less smart person might in fact be you. It might be me - I kind of hope it is, because I can tell you that being the smartest guy in the room too often can get a bit annoying - but I’m just going by probability here, and I hope this is just my complacency acting against me.

I’m always considering the possibility that I might be wrong - how about you? I’ve tried to lay down the criteria to follow to prove me wrong; have you? Do you know how to be proven wrong? If not, I wholeheartedly advise only forwarding a proposition once you have first understood the conditions under which it can be falsified. What are the necessary conditions for you to accept that your propositions are flawed? Do you have too much pride to admit the possibility that they are?
There’s so much wrong here.
“5” can be used to represent any five things in absolutely any order that you can imagine - if 5 is cardinal only.
Ordinal numbers absolutely have everything to do with how things are ordered. By definition there is a 1st, 2nd, 3rd element etc.
I can only assume you’ve looked up cardinality so far and weren’t aware of ordinality and ordered sets.
So no, I’m not forgetting that “sets have no dimensions and that their elements are not ordered”.
You just didn’t know about ordinality.
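The distinction is easy to illustrate with ordinary data structures (a Python sketch, not a set-theoretic definition: a `set` models pure cardinality, a tuple models an ordered sequence):

```python
# A set captures cardinality only: no order, no duplicates.
s = {3, 1, 2}
assert len(s) == 3             # cardinality: "how many"
assert {3, 1, 2} == {1, 2, 3}  # order is irrelevant to equality

# A tuple captures ordinality: there IS a 1st, 2nd, 3rd element.
t = (3, 1, 2)
assert t[0] == 3               # the 1st element is well-defined
assert (3, 1, 2) != (1, 2, 3)  # order matters
```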
You keep on explaining to me the concept of “One-to-one correspondence” as though I hadn’t mentioned it several times already - stop it. I know what bijection is.
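And since we keep coming back to it, a one-to-one correspondence is trivial to demonstrate concretely (a Python sketch; `f` and `g` are my own illustrative names), e.g. pairing each natural number with its double:

```python
# Bijection sketch: n <-> 2n pairs every natural number with exactly
# one even number, and vice versa.
f = lambda n: 2 * n
g = lambda m: m // 2      # inverse of f on the even numbers

for n in range(100):
    assert g(f(n)) == n   # g undoes f, so f is injective

evens = {f(n) for n in range(100)}
assert len(evens) == 100  # distinct inputs give distinct outputs
assert all(m % 2 == 0 for m in evens)
```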