[size=85]Current mathematics, basing its model of the continuum on Peano’s axioms, constructs the natural numbers by taking 0 or 1 as a non-logical symbol and adding 1 to it, repeating this indefinitely, generating the entire continuum number by number toward infinity on the basis of a repeated operation of ‘addition’ whose ‘unary function’, reifying the procedural coherence of the infinite iteration of the assumed variable (i.e. we assume it even makes sense to infinitely repeat an operation; we assume the nature of addition does not suddenly change at some exceedingly distant point in the continuum), defines the totality of these numbers as an interminable series or ‘set’ à la modern set theory, the set N. This leaves behind the Feferman–Schütte or ‘first impredicative’ ordinal associated with arithmetical transfinite recursion as the smallest ordinal that cannot be generated through ordinal addition on 0, that is, the basis of proof-theoretical mathematics, in which one moves on a recursive path from given sets of theorems toward those axioms necessary for their construction, with this impredicative ordinal signifying a minimal axiomatics necessary for the articulation of a statement or theorem with some truth-value. This makes it impossible to understand the formation and distribution of something like the prime numbers, for the primes are related to an entirely different mathematical operation, namely multiplication, such that simply adding one to a prime will not likely carry us to another prime, and even when it does, this reveals hardly anything useful about the frequency of the distribution of the primes in the continuum. [An important note must be made here. We cannot ‘disentangle’ the additive and the multiplicative simply by conducting primitive recursion from any given set of theorems to their necessary axioms or ‘impredicative’ ordinal. 
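The point about addition and the primes can be made concrete with a minimal sketch in Python (the helper names are my own, purely illustrative): generating the naturals by repeated addition of 1 is trivial, but the successor operation almost never carries one prime to another, since for any odd prime p, the number p + 1 is even.

```python
def is_prime(n: int) -> bool:
    """Trial-division primality test, adequate for small n."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

# Successor-style generation of the naturals: start at 0 and repeatedly add 1.
naturals = [0]
for _ in range(99):
    naturals.append(naturals[-1] + 1)

# Among primes below 100, how often does 'prime + 1' land on another prime?
primes = [n for n in naturals if is_prime(n)]
both = [p for p in primes if is_prime(p + 1)]
print(both)  # [2] — for any odd prime p, p + 1 is even and so composite
```

The additive successor, in other words, is structurally blind to the multiplicative property of primality.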
It is, in other words, of no account how precisely the operations of addition and multiplication are defined,- whether addition is defined through Peano arithmetic, in whose logical signature the multiplication and addition relations are both contained, or through Tarski’s identity by extending first-order Skolem arithmetic (that theory of the natural numbers whose signature contains only multiplication and equality) via the successor predicate,- for in both cases the operations of addition and multiplication become entangled by the unary function, inasmuch as the truth-value of a formula in Skolem arithmetic is reducible to the sequences of non-negative integers that make up the prime factor decompositions of its terms, such that the multiplication operation becomes simply a pointwise addition of these sequences. Furthermore, despite the far greater complexity and presumed foundational depth of the reals, the move from the natural numbers to the real numbers through yet another extension of the same unary function can be easily negotiated if we consider a geometrical analogy. If one draws a line through a circle in a 2-dimensional plane, there is no single point where they meet; the mathematical analyst, when tasked with enumerating that point, will simply insert a new point on the plane where he wants the connection to be made, and then use converging additive sequences on that inserted point to describe the intersection of the line and circle. He has enumerated the imaginary point where the line and circle meet as an equivalence class of a family of converging sequences (fundamentally, sequences of additions) in the plane. Real numbers are simply algebraic constructions analogous to these geometrical ones. A real number is similarly constructed as an equivalence class of Cauchy sequences of rational numbers, with the continuum itself conceived as the continuity of these equivalence classes, viz. 
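The reduction of multiplication to pointwise addition of prime-exponent sequences can be sketched directly (a toy illustration; the function names are mine): encode each natural number as its vector of prime exponents, and multiplication of numbers becomes coordinate-wise addition of vectors.

```python
from collections import Counter

def factor_exponents(n: int) -> Counter:
    """Map n > 0 to its prime-exponent vector, e.g. 12 -> {2: 2, 3: 1}."""
    exps = Counter()
    d = 2
    while d * d <= n:
        while n % d == 0:
            exps[d] += 1
            n //= d
        d += 1
    if n > 1:
        exps[n] += 1
    return exps

def multiply_as_vectors(a: int, b: int) -> Counter:
    """Multiplication realized as pointwise ADDITION of exponent vectors;
    Counter's '+' operator adds counts coordinate-wise."""
    return factor_exponents(a) + factor_exponents(b)

# 12 = 2^2 * 3 and 45 = 3^2 * 5, so 12 * 45 = 540 = 2^2 * 3^3 * 5.
print(factor_exponents(12 * 45))    # exponent vector of 540
print(multiply_as_vectors(12, 45))  # the same vector, obtained additively
```

This is exactly the sense in which the multiplicative structure is re-expressed through, and so entangled with, the additive one.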
the assumption that there exists a continuous sequence of all such equivalence classes, from 1 to infinity, forming a series that can be iterated by a function (namely, the ‘unary function’, which reifies the operation of addition) and enclosed by a limit, forming a set,- or, more precisely conceived topologically, a ‘dense set’ from whose metric space E any uniformly continuous function into another metric space can be extended to a unique continuous function on all of E, as demonstrated in the techniques used by Minkowski to map the quadratic irrationals to the rationals. (“Infinite Ergodic Theory of Numbers”, “The Farey map; definition and topological properties”: “… any uniformly continuous function from a dense set of a metric space E into another metric space can be uniquely extended to a continuous function on all of E.”) That sequence might not exist, and these equivalence classes might be discontinuous, such that no formula can be derived that iterates them, undermining our entire model of the continuum,- for it would mean, however surprisingly and unintuitively, that there are ‘numbers’ that cannot be constructed no matter how many times the unary function is iterated; numbers one can never ‘get to’, no matter how far one counts, in keeping with the modern presumption of ‘inaccessible cardinals’ implied by the axiom of choice. However, it would also mean the continuum hypothesis is false,- a speculation rarely ventured in the literature, though we do find some arguments for it, like that in Woodin’s Ω-conjecture and infinitary Ω-logic, where, given the existence of a proper class of Woodin cardinals satisfying an analogue of the completeness theorem, any axiom comprehensive over the structure of the hereditarily countable sets implies that the continuum is not ℵ1. 
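The construction of a real as an equivalence class of Cauchy sequences of rationals can be exhibited with exact rational arithmetic (a minimal sketch; the function name and the choice of the Babylonian iteration are mine, and any other mutually convergent sequence would represent the same class):

```python
from fractions import Fraction

def sqrt2_cauchy(k: int) -> Fraction:
    """k-th term of a Cauchy sequence of rationals converging to sqrt(2),
    via the Babylonian iteration x -> (x + 2/x) / 2, starting from 1."""
    x = Fraction(1)
    for _ in range(k):
        x = (x + 2 / x) / 2
    return x

# Each term is a genuine rational; the terms crowd arbitrarily close
# together (the Cauchy property), and 'sqrt(2)' is identified with the
# equivalence class of all sequences convergent to the same limit.
for k in range(1, 6):
    term = sqrt2_cauchy(k)
    print(k, term, float(term * term - 2))
```

No single term is the real number; the ‘number’ exists only as the class of such sequences, which is precisely where the unary function’s coherence is being assumed.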
Instead of developing, like Mochizuki, a non-set-theoretical mathematics in which to account for this discontinuity, which he calls a discontinuous homomorphism, Woodin indulged in it and simply endeavored, in keeping with his overall interest in large cardinals, to find sufficient axioms of this kind with which to generalize the theory of the determinacy of pointclasses and gain insight into structures larger than ℵ1, that is, the structure of ℵ2, viz. a structure larger than the structure covered by the axiom of projective determinacy,- axioms with which an inner model of the large cardinals can hopefully be constructed as a set-theoretical universe within which the continuum hypothesis has a positive truth-value, thereby rescuing mathematical platonism and set theory from themselves. While what has been said here might suggest a finitist philosophy of mathematics, note that someone like Mayberry, championing the finitist anti-Platonic view, rejects the ‘operationalism’ inherent in this construction of the natural numbers (a construction he believes has descended to us, through the set theory of modern mathematics, from the conceptualization of the arithmos found as far back as Euclid’s fifth common notion), whereby an indefinite process of continued addition is somehow reified as a coherent operation by a unary function that infinitely extends it to the formation of a number-field in which that function can continuously map elements of one set to those in a subset of itself, or, in Euclidean terminology, an arithmos constituting ‘a whole necessarily greater than any of its parts’,- a rejection that follows simply from his more stringent epistemological criteria for what constitutes a well-defined mathematical concept, from whose exalted order all indefinite operations and infinite series are to be expunged. 
Euclid establishes the concept of a ratio by taking two definite geometrical magnitudes and continually adding one of them to itself such that, if this results in a new magnitude exceeding either, they can be said to possess a ratio: here we have affirmed the basic principles of non-contradiction and identity, for two things that express such a ratio have essentially expressed the a priori fact that they are not equal to the third magnitude, and are therefore not equal to one another. Mayberry believes that Euclid took a tragic misstep when he attempted to generalize the concept of ratio by extending the notion of ‘geometrical magnitude’ with which he had been working to a purely arithmetical magnitude with unbounded quantifiers, moving from consideration of the definite magnitudes of lines, angles and figures to that of the indefinite ‘arithmoi’, or what we would today call an empty set. For if we similarly take a subset of a set and continually add it to itself, it becomes impossible to create a congruent one-to-one mapping between that subset and a subset of itself, that is, a ‘set of all sets’, at least when the operation is infinitely iterated as it is in the construction of the natural and real number lines, such that the notion of ‘continuity’ itself seems to break down into the kind of paradoxes of irreconcilable and incommensurable infinities that Cantor was the first to formally explore. As Poincaré noted, this apparent ‘breaking down’ essentially means that the basis of arithmetic is not ‘self-evident’ in the way the truths of geometry are, demanding that we fall back upon mere axioms, not self-evident truths, in the development of our mathematics; it means that the principle of infinite iteration, viz. 
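Euclid’s criterion for possessing a ratio, since it involves only finitely many self-additions, can be run as a terminating procedure (a minimal sketch, assuming positive rational magnitudes; the function name and the search cap are mine, the cap being exactly the kind of finite bound the indefinite arithmoi dispense with):

```python
from fractions import Fraction

def have_ratio(a: Fraction, b: Fraction, limit: int = 10**6) -> bool:
    """Euclid's criterion: two magnitudes have a ratio if some multiple of
    the smaller, formed by repeated self-addition, exceeds the larger.
    For positive rationals the Archimedean property guarantees success,
    so 'limit' is only a safety cap on the search."""
    small, large = min(a, b), max(a, b)
    total = small
    count = 1
    while total <= large:
        if count >= limit:
            return False
        total += small  # add the magnitude to itself once more
        count += 1
    return True

print(have_ratio(Fraction(1, 100), Fraction(7)))  # True: 701 additions suffice
```

The procedure is definite precisely because each magnitude is definite; it is the passage to unbounded quantification over such additions that Mayberry identifies as the misstep.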
the unary function on which the continuum is founded and through which the reals are constructed, is irreducible to the a priori principle of non-contradiction and the basic formalism of logical identity given by the ‘factum’ of Euclidean geometry,- a factum taken as the highest domain accessible to us,- an obviation of ‘pure reason’ that led, of course, to Euclid’s transition from the logical rigor of his self-evident geometry to a necessarily synthetic generalization of geometrical magnitudes and ratios to the arithmos, or arithmetical operations, imposing upon us, as his intellectual progeny, a reliance on the very same ‘synthetic intuition’ he was forced to adopt, namely the concept of infinite iteration,- an intuition that both conceals from our limited human minds the a priori source of the truths of that geometry and allows us to deploy, from an ‘intuitive’ axiomatics (viz. Peano arithmetic, ZF, set theory, etc.) beyond which we simply cannot hope to venture for want of any more certain ground, precisely the basis of that arithmetic by which we have found ourselves able to navigate, if only pragmatically and not theoretically, a higher mathematics despite those human limitations. In the terms of predicate logic, we can say that the concept of equality, which can be derived self-evidently from the existence of geometrical ratios, is lost in the movement to an equivalence relation looser than equality (this loss of the uniqueness quantifier up to equality signifies the resulting impossibility, noted earlier, of congruently mapping a subset of a set to a subset of itself under infinite addition),- namely, a relation that cannot be expressed in first-order logic, and from which is derived the uniqueness quantifier up to isomorphism that we later use to axiomatically ground the infinite quantification of the empty set and, by extension, our arithmetic, whose equivalence relation, designated category-theoretically, holds only up to isomorphism and not up to equality. 
Here, however, the construction of the numbers through infinite addition is rejected because it entangles, at increasing levels of abstraction, fundamental operations and concepts in improper semantics.] The ABC conjecture is precisely about this, the relationship between numbers generated by addition and those generated by multiplication, and it remains unsolved because that relationship, which is really about the operations themselves from which the two classes of number are produced, is not understood: a state of conceptual conflation, or what Mochizuki calls an entanglement of “the two underlying combinatorial dimensions of a number field, which may be thought of as corresponding to the additive and multiplicative structures of a ring or, alternatively, to the group of units and value group of a local field associated to the number field”. This is the case for all of the fundamental mathematical operations,- they are conceptually ambiguous. The local structures (viz. the natural numbers) and global structures (things like the prime numbers) encoded by the mathematical operations exist, in other words, in an indeterminate, confused state awaiting disentanglement; the ‘semantic content’ contained between the two levels of abstract structure remains latent at the second episteme, unextracted and yet to be reincorporated at a higher abstract level, namely that of the third episteme,- an incorporation at a higher abstract level attempted in IUT, with regard to the operations of addition and multiplication, through the construction of Hodge theaters, where deformations of objects (distinctions of local structures) translated from one model or ‘theater’ to another are calculated inside a log-shell and removed from the copies of Q and the number fields, and then gathered into a kind of ‘inter-universal’ container that can be inserted into and removed from multiple theaters in order to measure and equalize the translational variance of objects through a process of strategic deformation 
of those objects. The integration of latent semantic content will not be achieved until a new model of the continuum is developed, one extended ‘axiologically’ beyond Peano arithmetic. Until then, we will be faced with a never-ending litany of imponderables. As Wittgenstein observed, set-theoretical mathematics has been constructed through a combination of laws, developed from symbolic logic, and axioms, freely chosen programmatically,- two things that have been utilized in parallel, with the axioms serving to make up for the conceptual gaps in the laws, and the laws in their turn making up for deficiencies in the axioms, only further entangling semantic layers with each leap toward greater generalization from that convoluted foundation. In other words, the issue is one of circular argumentation: when asked to define a function, the analyst will happily do so in terms of a rule, yet when the same analyst is asked to define a rule, he will define it in terms of a function. The kinds of question created through this entanglement (like the Riemann hypothesis) are not so much questions as they are manifestations of defects in our fundamental concepts. They are the result of an entangled semantics leading us to formulate questions that, while appearing to make sense syntactically, or at one level of abstraction, do not actually signify anything, containing no latent semantic content that can be reintegrated. 
They are not questions; they are malformed statements semantically entangled in concepts,- concepts that, once clarified and reified in higher global structures, will lead to the development of an entirely new system in which questions of this sort are not answered but simply disappear into the greater horizon opened up by the expansion of the space of representation available to the first episteme, that is, cease to be questions, as the structures about which they have been ventured are themselves reunified, following local differentiation and sublation, into new singularities. [Note that Solomon Feferman similarly proposed a novel theory of mathematical definiteness grounded in a semi-intuitionistic sublogic that applies classical logic only to bounded quantifiers, using intuitionistic logic for unbounded quantifiers, such that a proposition ϕ can be said to be ‘definite’, that is, to possess a truth-value, only if the semi-intuitionistic sublogic can prove ϕ ∨ ¬ϕ,- whereby many famous problems, like the continuum hypothesis, would simply be malformed statements lacking any truth-value one way or the other. His use of two logics, one for bounded and one for unbounded quantifiers, again suggests the same semiotic process found in Grothendieck’s use of projective and injective norms and in Mochizuki’s Frobenius-like and étale-like portions of a Frobenioid, though this logical partition is not, in Feferman, later sublated as an arbitrary differentiation on local objects, or re-encoded by a codeterminate form on some higher abstract or ‘global’ structure with which the semantic content of those local objects can be ‘disentangled’ as ‘congruent representations’.]
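The intuition behind the bounded/unbounded split can be caricatured in a few lines (a toy of my own devising, not Feferman’s actual system): a statement whose quantifiers are bounded can be decided by exhaustive search, so assigning it a classical truth-value is unproblematic, whereas an unbounded quantifier admits no such decision procedure and the toy checker simply declines to assign one.

```python
def bounded_forall(bound: int, pred) -> bool:
    """Decide 'for all n < bound, pred(n)' by finite search: always definite."""
    return all(pred(n) for n in range(bound))

def definite_or_refuse(quantifier_bound):
    """Assign a truth value only when the quantifier is bounded; an
    unbounded quantifier (bound = None) gets no truth value at all."""
    if quantifier_bound is None:
        return "indefinite"
    # Example statement: 'for all n < bound, n*n >= n'.
    return bounded_forall(quantifier_bound, lambda n: n * n >= n)

print(definite_or_refuse(1000))  # True: settled by checking 1000 cases
print(definite_or_refuse(None))  # 'indefinite': no truth value assigned
```

The analogy is loose, of course; Feferman’s criterion concerns provability of ϕ ∨ ¬ϕ in a semi-intuitionistic system, not mechanical search, but the partition of statements into the decidable-by-finite-means and the rest is the shared gesture.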
In fact, all ‘problems’ in all fields of knowledge, be it mathematics or philosophy or ethics, are the result of an analogous deficiency revealed through an application of Peirce’s semiotics, that is, of the very same more fundamental problem related to the existence of concepts entangled at different epistemes. In order to realize a total incorporation of latent semantic content, the distinctions of separate local structures must be clarified, these differences reunified by global structures inside a codeterminate form, and this codeterminate form then re-encoded on two or more abstract levels between which a congruent representation can be delineated, out of which a new ‘object’ can be defined as a correspondence between multiple abstract levels, fully completing the semiogenetic circuit through the three active epistemes. Returning to mathematics: new axiologies must be introduced, one for each of the operations and foundational concepts, just as addition was grounded in Peano’s axioms, so that models of the operations can be developed in isolation. Once these isolated models are developed, local structures can be mapped from one model to another, a process that will reveal and clarify the distinctions of the local structures inasmuch as the mappings will not be complete and isomorphic; indeterminacies will be created in the attempt to translate local structures across different models. 
Once the distinctions of local structures are clarified in this way, they can be removed through the ad hoc introduction of an arbitrary ‘artificial differentiation’ that, once it is sublated, equalizes our results by surjection, to finally be reunified as an ‘anamorphic projection’ by emergent global structures inside codeterminate forms that correlate the different indeterminacies created by a series of translations systematically carried out in a process of intentional disfigurement, much as visual anamorphosis clarifies a distorted image by framing it at multiple arbitrary vantage points, accumulating the differences between those vantage points, and then equalizing them. Only then can such codeterminate forms be re-encoded so as to establish a congruent representation between two or more levels of abstraction (these levels, for Grothendieck, being defined by projective and injective norms; for Mochizuki, whom we will briefly consider momentarily, they are defined by the étale-like and Frobenius-like portions of a Frobenioid as related through a Kummer isomorphism arising from cyclotomic rigidity, or, more precisely, from a cyclotomic synchronization isomorphism, which permits us to establish a more general relationship or ‘global structure’ between the Frobenius-like portions of independent Frobenioids once their étale-like portions are connected, a relationship whose mono-anabelian transport introduces the kind of indeterminacies noted here), these new representations signifying a novel class of abstract object. 
In fact, this is the exact method taken by Mochizuki in his attempt to solve several conjectures, ABC among them, through something he calls ‘inter-universal Teichmüller theory’, where the isolated models of operations suggested here, with their independent axiologies, are called ‘mathematical theaters’, and where the distortions and inequalities introduced by the movement across theaters and the translation of structures from one to another are accumulated inside a log-shell by the theta function, whose variance is measured within a Hodge theater, to the end of redeploying new global structures and reconstructing the hopefully ‘disentangled’ continuum once the distinctions of the ‘local structures’ have been fully clarified in this way and then later sublated, or ‘dropped’, from the number-field. We see that there is a single process (a movement through the epistemes) occurring in all fields of thought: philosophy, mathematics, ontology, ethics, semiotics, psychoanalysis, etc. A problem in one field corresponds to a problem in another, even if the corresponding problem has not yet been discovered by those working in its field; and a solution in one, of course, corresponds to a solution in another, even if those in that field have not yet become aware of it. All the better: we can assume the solutions of one field for ourselves, and personally reconstruct them in the field we are working in.
In short, by applying my own extension of Peirce’s semiotics to the question (i.e. the quaternary logic of the three active epistemes), it is seen that the truths of geometry are a priori and self-evident, while the ‘truths’ of arithmetic require us to adopt synthetic argumentation (via the assumption of the coherency of infinite addition and infinite series,- the assumption that it makes sense to infinitely repeat the operation of adding 1 to a starting number to create the number line, which is not a self-evident truth) that falls back upon axiomatics under examination (namely, the axiom of choice), leading eventually to an ‘entangled semantics’ between the two different levels of abstract structure that has further entangled the fundamental mathematical operations and concepts with one another and so prevented the semiotic chain from fully completing the loop through the three epistemes needed to clarify and extend meaningful representations of objects,- a failure resulting in the production of imponderable questions that have no actual signification or ‘semantic content’, lacking any truth-value whatsoever, since such questions, like that involved in the Riemann hypothesis, arise only from a confused semantics. One might naturally be led to group theory in the search for some means of disentangling the confused semantics of the mathematical relations encoded by addition, multiplication, etc., given that a direct sum of infinitely many groups requires each of its elements to have only finitely many non-zero components, whereas direct products are not similarly bounded, and Mochizuki has sought in precisely this domain, attempting to disentangle these relations through certain extensions of Teichmüller space and Galois groups. Beyond the ABC conjecture, for whose solution IUT was tentatively conceived, the explication of latent semantic content would give us a deeper understanding of the continuum in far more general ways. 
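The sum/product distinction just invoked can be sketched concretely (an illustrative encoding of my own): an element of the direct sum ⊕ᵢ Z is determined by finitely many non-zero coordinates, so it can be stored as a finite dict, whereas an element of the direct product ∏ᵢ Z may be an arbitrary function on the indices, such as the constant sequence (1, 1, 1, …), which has no finite-support representation at all.

```python
def add_sum_elements(x: dict, y: dict) -> dict:
    """Componentwise addition in the direct sum of copies of Z;
    coordinates that cancel to zero are dropped, so support stays finite."""
    keys = set(x) | set(y)
    return {k: v for k in keys if (v := x.get(k, 0) + y.get(k, 0)) != 0}

x = {0: 2, 5: -1}   # finite support: a legal direct-sum element
y = {5: 1, 7: 3}
print(add_sum_elements(x, y))  # coordinate 5 cancels; support stays finite

# By contrast, the constant-one sequence lives only in the direct PRODUCT:
constant_one = lambda i: 1     # its support {i : f(i) != 0} is all of N,
                               # so no finite dict can encode it
```

The finiteness constraint on the sum is an additive echo of the bound the unary function never imposes on itself.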
All current mathematics is, in essence, simply set theory, which is to say that it all takes place within a single ‘theater’, a theater whose continuum is based on a model of the fundamental operation of addition. Accordingly, the modern conceptualization of a transcendental number is based on Cantor’s work, a product of set-theoretical proofs. There is some structure analogous to or correspondent with a transcendental number in each of the other theaters. To understand what a transcendental number like π actually is [Note: we hardly understand what it ‘actually’ is at present. Within the single theater we use,- derived axiomatically, in the wake of Gödel (following the failure of Russell and Whitehead’s Principia, an attempt to ground mathematics in a complete symbolic logic), from Peano and the unary function, on whose basis we establish the apparent scale invariance, internal consistency, or ‘smoothness’ of the continuum through the operation of addition,- it can only be grasped as an irrational number; this is not its true nature, but only the result of a defect in our theory, in just the same way that a singularity results in the mathematics we use to try to decipher what takes place beneath the event horizon of a black hole,- a singularity that most physicists agree is not actually there, signifying nothing more than a conceptual hole in our understanding.], we would need to correlate the local differences between each of these analogues and then equalize them through an artificial differentiation that permits their global reincorporation into a codeterminate form that we can re-encode on multiple independent abstract levels, with congruent representations of that codeterminate form across these levels giving us our deeper, more complete understanding of the continuum and of things like the transcendental numbers, the primes, etc. 
The relative meaninglessness of our conceptualization of irrational numbers like π does not seem to cause us many engineering problems, for most applications require only a few digits of it, yet it does cause us major problems when attempting to articulate the foundations of mathematics. We can write down a formula for π, but we do not know what it means, because we have no framework for performing actual arithmetic with it; if we want to do something like add π and e, all we can do is say the result is ‘π plus e’ and write a new, even more ambiguous formula for it. These irrational numbers simply do not ‘fit’ in the mathematical theater we are using. However, they might fit better, or perfectly, in another theater. Most importantly, the local distinctions of such numbers, clarified between their appearances in multiple theaters, would bring us closer to the apprehension of global structure and give us the most insight into what an ‘irrational number’ truly is. A transcendental or irrational number might correspond, following this extraction of latent semantic content within the continuum, to an entirely different class of object,- not a ‘number’ at all, but some object corresponding to a different axiology entirely, just as ‘numbers’ are objects corresponding to the primitive axiology of Peano’s arithmetic: (0), 0+1=(1), 1+1=(2), 2+1=(3), 3+1=(4), and so on. 
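The contrast between ‘π plus e’ and genuine arithmetic can be shown in a couple of lines (a minimal sketch using Python’s standard library): for rationals we have an exact canonical answer, while for π + e we can only generate approximations of arbitrary precision,- indeed, it is not even known whether π + e is irrational.

```python
import math
from fractions import Fraction

# Rational addition has an exact normal form: 1/3 + 1/6 reduces to 1/2.
exact = Fraction(1, 3) + Fraction(1, 6)
print(exact)  # 1/2

# 'pi plus e' has no such reduction; all we can produce is a float
# approximation, a stand-in for a sum we cannot otherwise normalize.
approx = math.pi + math.e
print(approx)
```

The formula ‘π + e’ names the sum without ever resolving it, which is exactly the sense in which such numbers fail to ‘fit’ the theater.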
Equally interesting: in a mathematics grounded on the unary function and addition (set theory), Tarski’s exponential function problem (which asks whether the first-order theory of the real numbers together with the exponential function is decidable; as noted earlier, the reals are just as much grounded on addition as are the naturals) remains unresolved, along with the question of whether the real version of Schanuel’s conjecture is true (whose proof would confirm the decidability of that problem), for the same reason that no answer to ABC can be found within set-theoretical mathematics: namely, the semantics of addition and multiplication are entangled by an improper syntax on local structures (the naturals and reals) that must be decoded into higher abstract levels between which congruent representations of objects related to global structures (exponents, primes, etc.) can be determined.[/size]