The only truly irreducible, self-evident axiom in mathematics is number value. If you are capable of perceiving differences in number value, the rest merely follows as a priori logical consequence.

Number value?

Can you tell me just what this axiom is? I mean, can you state it?

The intuitive human awareness that there is a difference between ‘one’ and ‘two’. It is an axiom precisely because no proof of it can be given; it arises directly from the intuitions of the human mind.

Not in every detail, but I do hold it for possible that the ‘anatomy’ of the mind can be represented and thereby known. I’m working on that.

I’m not sure, but maybe this is something in that direction.

Faust, what are these philosophers you mention all about? I looked up Moore and have read some of Wittgenstein, and both seem to be in the business of searching for and/or criticizing an aspect of language that does not, in my eyes, exist - namely, certainty.

It should be clear, from the fact that there is not one language but many, that there is no absolute meaning to any term. Mathematical terms are an exception because they have been deliberately posited as absolutes, but even they lose absolute meaning whenever they are applied to something that actually exists.

What we are saying is, I’ve noticed here, nothing other than what we are perceived as saying, by whomever. Including by the person who is doing the saying - but not exclusively by him. It’s perspective, and imperfect, ad hoc and sometimes happens to be very clear and appears to mean the same to everyone - but that seems to have more to do with the delivery than with the content.

Moore criticised the idealism that was almost universal when he was making his name in philosophy. But you’re right, he still had a certain rational strictness to his “certainty”. Wittgenstein is the most practical about certainty (it’s not what Descartes wants it to mean for himself), and also about meaning - “meaning is use” - a non-absolutist.

I shall get burnt if I put my hand in the fire: that is certainty.
That is to say: here we see the meaning of certainty. (What it amounts to, not just the meaning of the word “certainty.”)

(I’d just bookmarked that quote last night, I don’t have the whole lot memorised)

I’m not completely sold on ordinary-language philosophy, as some philosophy isn’t ordinary-language. There is a call for clarity of method in many studies and practices, and that can call for a special vocabulary and a rigorous approach. But it’s a good check against the sort of wild growth of nonsense that accompanies sloppy use of words.

Ben - do you know any of the purported axioms of mathematics, or are you just flying by the seat of your pants, here?

Do you realise that knowing the difference between one and two does not tell us what “number” means?

Jake -

Then you have noticed that they are talking about language, and not about “reality”. Not about noumena and phenomena and what the difference is supposed to be.

I’m not sure I know how the rest of your post pertains to any comments I have made.

As for “number value”, I have no idea what Ben is talking about. I wait with bated breath to see if he does.


Thanks once again for dropping some knowledge, you seem like one hell of an insightful cat.

I like the way you’ve phrased this a lot. In reading up on Wittgenstein, Russell, and a few others, I have noticed an increased emphasis on simplicity and meaning in language – specifically using language in its most practical sense, as it pertains to context and usage. That is, the meaning of a word being a direct result of its usage in language. However, this would seem to lead one to a conclusion much like Jakob’s – a single word alone can have many usages, and thus many potential meanings, so certainty becomes relative, which defeats the very definition of “certainty”.

Perhaps only now, after we’ve so conveniently ascribed number values to nearly everything.

I can’t fathom how numbers, themselves, are self-evident. Numerical values represent amounts in approximation.

I may agree it could be self-evident that 1+1=“more” (e.g. having an apple + finding another = more apples). A perception of increase or decrease could certainly be intuitive; but a perception of numbers, specifically, requires knowledge of symbolism.

Numbers are also not irrefutable, nor objective certainties (though they are our best, and often quite accurate, approximations of certainty). Do not forget that we experience objective reality through a perspective looking glass. Numbers can also be relative, depending on what values are being represented.

For instance, if I have what I perceive as two apples, one much larger than the other, it is obvious that I am basing those numerical values on the perceived number of objects present. However, if one were to base those values on, say, weight or content, they might conclude that, in all actuality, I have something more like 2 and 1/2 apples.

I would grant the plausibility that an infant, or someone isolated from society, could recognize numerical values; but I don’t see that as probable. I would agree, however, that concepts like ‘ownership’, and thus ‘amount’, could be intuitive.

In the end–


The prose…



Statik - I’m not necessarily disagreeing with Jake. I just don’t really know how his posts are responses, positive or negative, to mine. Noting the time difference between where I am and where Jake is, I’d say he might be wacky on the junk right now.

Or maybe I am.

Oh, no, it wasn’t my intention to speak for you or Jake; I was just thinking in consideration of both of your posts (which were both largely agreeable to me). I actually didn’t even consider you to be in disagreement; I was more inclined to draw parallels between your thinking because I see merit in both methodologies.

Certainty is certainly subjective. And it’s possibly relative.

I’m certain 2+2=4.
I’m certain I’m not dreaming.
I’m certain I’ve never been to Portugal.
I’m certain I’ve never been to the moon.

Well, this is the problem philosophers have had with certainty. It’s a psychological state which philosophers often mistake for an epistemic state. And so they take sides about certainty, without realising that we are all at once on the same side and on only our own.


Are you saying that you require no knowledge of mathematical axioms in order to balance your checkbook or that you do not require knowledge of all mathematical axioms in order to balance your checkbook?

Are you denying that number valuation is the most fundamental and irreducible axiom of mathematics? Are you denying that it is an axiomatic way of reasoning? I understand that when in common usage we speak about ‘axioms of mathematics’ we are not speaking about the act of number valuation, but this is only because it is the foundational first principle of mathematics; it hardly need be mentioned.

Similarly, scientists, to engage in the act of science, do not require knowledge of all epistemological alternatives, but anyone who seeks to know anything must, by election if not by speculation, choose to operate according to some epistemological framework - the vast majority of the time, naive realism.


In order to represent and communicate the idea of ‘two’ versus the idea of ‘three’ to another person, you would have to be able to represent ‘two’ and ‘three’ symbolically. But if you see two apples, and you see three apples, before you attribute any term or symbol to express the difference between them, it is apparent to the human mind that we are not looking at the same number of apples. The difference is intuitive. ‘Numbers’ as a form of notation are just a kind of language for denoting this intuitively realized distinction and working out the logical consequences thereof.

As I said, that may very well be the case. However, usage of symbols to express said difference is a learned behavior.

Numbers, themselves, are not intuitive. They had to be precisely defined at some point in time, we are not born with that knowledge.

If you take ‘number’ to be the symbol, then you are correct, it is learned behavior. I take ‘number’ to refer to the human capacity for differentiation in value, and the symbol to merely be an expression of this.

Ben -

The former.

I am asking you to tell me what “number value” is, and what the commonly-accepted axioms of mathematics are. I don’t think we say “axioms of mathematics” in common usage. In fact, if I didn’t post here, I would probably never say it at all. Maybe you travel in different circles.

I dunno, Bengie. I think that if you asked a hundred scientists what “naive realism” was, you’d get about 99 blank stares. In fact, I’d wager that if you asked a hundred scientists what “epistemology” means, you’d get 82 blank stares, seventeen “It’s the study of letter-writing” and one “Isn’t that like Hume or Kant? I was dating a girl in college…”

You can’t do philosophy by accident. It’s an art - when elephants swish paint onto a canvas, that’s not art. Stumbling upon a new life form is not science, and naive naive realism is not philosophy.

Get it? Naive naive realism?

I made a philosophy joke.

That’s not easy.

What do you take ‘numbers’ to be? And ‘letters’?

This is exactly what I’m getting at. Mathematics is self-justifying; I don’t understand how it could possibly be self-evident without the broad mathematical framework that has already been created and refined over thousands of years.

I think, perhaps, “axioms” in this context are being confused with mathematical induction. That is to say, we use mathematics as a form of proof rather than mathematics inherently proving something to us.

Okay - maybe it’s all semantics. I think the symbol is a numeral and a number is pretty close to Russell’s definition. It’s a tortuous definition, but the point is that “number” can be defined to a great degree, and so “number value” is reducible. I think Russell’s Introduction to Mathematical Philosophy, if not definitive, certainly frames the problem as well as anyone has, to my knowledge.
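For those who haven’t read it, the definition Russell gives there - roughly, and in notation of my own choosing - is that the number of a class is the class of all classes similar to it:

```latex
% Russell, Introduction to Mathematical Philosophy, ch. 2 (paraphrased):
% two classes are "similar" when their members can be paired off
% one-to-one; the number of a class \alpha is then
\[
  \operatorname{Nc}(\alpha) \;=\; \{\, \beta : \beta \sim \alpha \,\}
\]
% so that, e.g., 2 is the class of all couples, 3 the class of all
% trios, and a "number" in general is anything that is the number
% of some class.
```

Tortuous, as I said, but it shows “number” is not an unanalysable primitive.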

Statik - in the aforementioned book, Russell makes that very point - that mathematics had been done for centuries before Peano and Frege even began to find the foundational premises under which it operated. Once stated, those premises are taken as self-evident and unprovable, but proof isn’t everything. “Succession”, for instance, is an obvious idea - the point is not that it’s mysterious, but that it’s foundational to mathematics. Axioms are not new and shocking knowledge - they are called axioms not because of their conceptual content (there’s usually nothing special about that, because they are “old news”, as you say, by the time they are discovered to be axioms) but because of the role they play.
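For anyone following along at home, Peano’s five primitive propositions, which Russell works from in that book, can be paraphrased (notation mine) as:

```latex
% Peano's five primitive propositions, paraphrased:
\begin{enumerate}
  \item $0$ is a number.
  \item The successor $S(n)$ of any number $n$ is a number.
  \item No two numbers have the same successor: $S(m)=S(n) \Rightarrow m=n$.
  \item $0$ is not the successor of any number: $S(n) \neq 0$.
  \item Induction: if $P(0)$ holds, and $P(n)$ implies $P(S(n))$,
        then $P(n)$ holds for every number $n$.
\end{enumerate}
```

None of these are proved; they are the starting points from which the familiar arithmetic is derived - which is exactly the point about the role axioms play.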

Ben -

This is surely not the definition of number. You can’t count with just this concept, as Statik has pointed out, and numbers are what we use to count. There is a difference between the idea of “number” and “a number” of course, but I can’t see how you are talking about either.

Number value is a fairly self-explanatory phrase, I think. It is the value of a number. If you have three apples, you have one more apple than if you had two apples.

This is an axiom because it is not possible to prove that 3 > 2. It is irreducible. Human beings perceive that when there are three apples, there are more than two apples. They do not deduce it from any prior premise.

If you do not accept, as an axiom, the perception of number value (simply that it is possible to have different amounts of things, some things are greater, some things are less), it will not be possible for you to engage in mathematical reasoning.

I don’t want to post this, go back and quote you again, and then edit the post, so I’ll mention something else here: You stated that it would not be possible to count without symbolic number systems. It absolutely would be possible to do so. You could simply take an apple, add another, add another, etc. There are a variety of things you would not be able to do: express what it is you were doing to others, count independent of some object to be counted, count quantities large enough that you would not be able to tell how many apples there were simply by looking at them, etc. But symbolic number systems do not precede the act of counting. Formalized number systems were created out of a demand for human beings to reason mathematically more effectively. They were not simply stumbled upon out of nowhere, with mathematical reasoning springing forth from them like Athena from the head of Zeus.
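To make the ‘counting without symbols’ point concrete, here is a quick sketch (Python, names mine, purely illustrative): two piles can be compared by pairing items off one-to-one, and no numeral appears anywhere in the comparison.

```python
# Comparing two piles by one-to-one correspondence: remove one item
# from each pile at a time; whichever pile has items left over is
# the larger. No numeral or count is ever named.

def compare_piles(pile_a, pile_b):
    """Return 'a' if pile_a is larger, 'b' if pile_b is larger,
    or 'same' if the piles pair off exactly."""
    a, b = list(pile_a), list(pile_b)   # work on copies
    while a and b:                      # pair one item from each pile
        a.pop()
        b.pop()
    if a:
        return "a"
    if b:
        return "b"
    return "same"

# Three apples vs. two apples: the first pile has one left over.
print(compare_piles(["apple", "apple", "apple"], ["apple", "apple"]))  # -> a
```

The point being that the act of comparison is prior to, and independent of, any symbolic system for naming the result.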

I would wager you money that you’d have a hard time finding one trained, practicing natural scientist who did not know what ‘epistemology’ meant, never mind 99. But it is irrelevant to the point. I am not arguing that scientists have worked-out, stated epistemologies, only that it is necessary for them to at least assume some theory of what knowledge is in their actions. Essentially, the principle of revealed preferences: en.wikipedia.org/wiki/Revealed_preferences

If you didn’t have some set of qualifications by which you counted something ‘knowledge’, it would be impossible to engage in any human action whatsoever, never mind natural science. To get out of bed and go to work in the morning, you must take in a vast amount of information, qualify this information as knowledge, take it into account, and then act upon it. You would need to do so whether or not you had a philosophically worked-out position on epistemology.

I disagree. I do not think philosophy is an art in the way that the fine arts are. Other arts are isolated skills - learn these skills if you want to be good at painting. Philosophy, at least as I see it, deals with the subjects which undergird our every action: Epistemology, Ethics, Politics, Aesthetics. We simply must have philosophical positions in order to have any basis for engaging in actions. Whether or not we choose to work them out and subject them to rigorous reasoning is up to us, but we cannot not have views on philosophical issues. To lack them would make trying to live a human life totally incomprehensible. That is precisely, in my opinion, why philosophy is of such great fascination for people across such vast gulfs of space and time, even though it, in itself, builds no bridges and bakes no bread.


Numbers, as formalized notation, arise from a human need to reason mathematically that pre-existed any formalized mathematical system. As I said to Faust, if you see two apples, and then you see three apples, you have an immediate perception that there is one more apple in the latter case, even if you have not yet worked out any formalized number system and have no way to formally express the concept ‘one’.

Similarly, formal letters are produced by the human need to communicate verbally and to represent verbal communication; verbal/written human communication did not spring into existence because letters were discovered. Quite the reverse.