Is 1 = 0.999... ? Really?

Notice that you don’t address my point about infinity at all.

Why wouldn’t infinity/2 be a real number?

Because infinity isn’t a real number.

The symbols (\pm \infty ) are defined rigorously as extended real numbers. They’re used as convenient shorthands in real analysis and measure theory.

But even as extended real numbers, division of those symbols by real numbers is not defined. Which means just that: it's not defined. It has no standard definition and there's no sensible way of creating one.

The specific rules for the defined arithmetic properties of (\pm \infty ) are here:

en.wikipedia.org/wiki/Extended_real_number_line

I ignored it because you’re repeating a point I addressed earlier.

(\infty) is not a real number. Neither is (\infty \div 2). Nonetheless, they are both numbers.
(Albeit, as wtf noted, the latter expression has no recognized meaning within the official language of mathematics, which means nothing with regard to this topic.)

You appear to be saying one of the following:

  1. if (a) is not a real number then (a) is not a number

  2. if (a \div 2) is not a real number then (a) is not a number

Neither of those is a logically valid argument.

I said it earlier: the set of real numbers is not the set of all numbers. There are numbers that are not real numbers, e.g. complex numbers and hyperreal numbers.

I will let WTF have a turn on the roller-coaster ride.

Thanks, but I’m afraid I’m getting off the roller coaster. Magnus wrote, “(Albeit, as wtf noted, the latter expression has no recognized meaning within the official language of mathematics, which means nothing with regard to this topic.)”, my emphasis. If math isn’t what the thread’s about, I’m at a loss to contribute.

I did write extensively in this thread several years ago explaining why .999… = 1, but clearly to no avail. Long answer short, though, it’s because .999… = 1 is a theorem that can be proved (by a computer if one likes) in standard set theory.

There’s no other reason. Once you define the symbols as they are defined in standard math, the conclusion follows. If you define them differently, you can say that .999… = 47 or anything else you like. There is no moral or absolute truth to the matter, it’s strictly an exercise in defining the notation and then showing that the theorem follows. Just like 1 + 1 = 2. If you give the symbols different meanings, you get a different truth value. With the standard meanings to the symbols, the statement is true.

But none of this is of much interest, I gather. It’s “not what the thread’s about.” Perhaps I never understood what this thread is about.

If you have been following this thread, then you know that he has rejected the operations on infinity which are described in that wiki article:
[attachment=0]extreal.JPG[/attachment]
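For anyone who can't see the image, here is a rough summary of the operations defined there, for a real number (a) (see the article for the exact statements):

  • (a + \infty = +\infty) and (a - \infty = -\infty)

  • (a \cdot (\pm \infty) = \pm \infty) for (a > 0), and (a \cdot (\pm \infty) = \mp \infty) for (a < 0)

  • (a \div (\pm \infty) = 0)

  • ((\pm \infty) \div a = \pm \infty) for (a > 0)

  • Expressions such as (\infty - \infty), (0 \cdot \infty) and (\infty \div \infty) are left undefined.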

I was trying to show the contradictions within his own ideas about infinity.

Only sporadically, and (I admit) with dismay.

None of this bears on the original topic. The infinity of the extended reals has nothing, repeat nothing, to do with the infinite cardinals and ordinals of set theory, or the meaning of positional notation in decimals. It’s a red herring and a distraction to the question of .999… = 1.

The notation .999… is a shorthand for the infinite series 9/10 + 9/100 + 9/1000 + … which sums to 1 as shown in freshman calculus. There truly isn’t any more to it than that, though if desired one could drill this directly down to the axioms of set theory.
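Spelled out, the freshman-calculus computation referred to above is just this: (\displaystyle 0.999\ldots := \sum_{n=1}^{\infty} \frac{9}{10^n}), whose (N)-th partial sum is (s_N = \sum_{n=1}^{N} \frac{9}{10^n} = 1 - \frac{1}{10^N}), and (\lim_{N \to \infty} \left(1 - \frac{1}{10^N}\right) = 1).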

But this point has been made repeatedly by myself and others in this thread, to no avail. I confess to not understanding the objections. It’s like asking if the knight in chess “really” moves that way. The question is a category error. Chess is a formal game and within the rules of the game, the knight moves that way. There is no real-world referent. Likewise with a notation such as .999… By the rules of the game, it evaluates to 1. It’s a geometric series. If you make some other rules, you can get a different result. Efforts to imbue this with some kind of philosophical objections are likewise category errors. Rules of formal games aren’t right or wrong. They’re only interesting and useful, or not. The rules of math turn out to be interesting and useful, so we teach them.

Oh I agree completely.

I wasn’t finished typing! But if you agree with me on anything, you’d be the first person to do so in the history of this thread! :slight_smile:

I agree with you on this topic too.

I think every competent mathematician who has contributed to this thread has been saying things that agree with what you’ve been saying. The entire mathematical community in fact (philosophers or not) says what you’ve been saying.

It’s only been the ones who’ve admitted “I’m not a mathematician, but…”, or words to that effect, who have been trying to say things that do not agree with what you’ve been saying.

There’s only really one of these types left fighting his corner, simply accusing any content that disagrees with him of being irrelevant, or never having existed in the first place. It’s a chronic case of confirmation bias, other cognitive biases, and logical fallacies - that’s really all there is to this thread.

Dude! You guys! For some bizarre reason I have been relegated to the corner with a dunce cap!

I have two arguments! 1.) There are no greater infinities! 2.) Infinite series don’t converge!

I have no earthly clue why you disagree with me!

1.) no greater infinities: the disproof I use for this is what I call “the cheat”. What is “the cheat”? Very simple!

1.) rational number
2.) irrational number
3.) imaginary number
4.) different rational number
5.) different irrational number
6.) different imaginary number

You can list every number on this single list!

Etc…

My disproof for convergent series!

If you take any rational, irrational or imaginary number and divide it in half, let’s say, the number 1!

0.5 + 0.5 = 1

Then you divide that in half!

0.25 + 0.25 + 0.25 + 0.25 = 1

When you divide that in half!

You get:

0.125 + 0.125 + 0.125 + 0.125 + 0.125 + 0.125 + 0.125 + 0.125 = 1

Etc…

When you push this series to the convergent limit, eventually 0=1

Contradiction.

In order for series to converge (and I reverse engineered the problem to prove this)… 1 MUST equal zero!!
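For concreteness, here is a minimal sketch that just tabulates the halving steps described above; the variable names are illustrative only:

```python
# Tabulate the repeated-halving steps described above (illustration only).
from fractions import Fraction

whole = Fraction(1)
for stage in range(6):
    pieces = 2 ** stage          # number of equal pieces at this stage
    piece = whole / pieces       # each piece has size 1/2^stage
    print(f"stage {stage}: {pieces} pieces of size {piece} sum to {pieces * piece}")
```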

I can’t believe this thread is still going on! I made actual proofs for this!

This is what brings me back to this thread. I want to understand the objection that is being made.

If I can sum up the way this feels to me, the pattern goes like this:

  • I give any one of half a dozen perfectly valid reasons why .999… = 1. (None of them are those semi-fallacious “multiply .999… by ten” or “multiply 1/3 by 3”, for the record. I regard those proofs as essentially circular and agree that they can be fairly criticized.)

  • I am immediately accused of being a literal-minded mathematician, brainwashed by the orthodoxy, only able to spout what I’ve been taught, and unable to think for myself or comprehend the larger point they seem to be making. As a brainwashed mathematician I am incapable of seeing why my mathematical response to a mathematical question is totally inadequate.

  • But that’s the thing. I can never pin them down on exactly what extra baggage they’re imputing to the symbols. They’re mathematical symbols, hence their meaning comes only from their definitions, and they do not necessarily refer to anything in the real world.

So this is my mystery. I want to understand what extra ontological power they assign to the harmless symbology .999… = 1. Of course it’s not true in the physical world; I’m sure I must have pointed that out early on.

So what then? What philosophical point are people trying to make? The truth is that there is no natural meaning to the symbols. They are assigned a meaning in mathematics. And by the meaning they are assigned, we can demonstrate that .999… = 1 in many different ways.

The only reason, in my opinion, that anyone would think decimal notations have any other meaning or ontology, is due to faulty teaching along the way; and not to any philosophical insight. That would be my take on the situation.

I asked that question long ago and never got a satisfactory response, but I’ll just mention it here again. What is the secret sauce that leads anyone to think that .999… = 1 has any meaning whatever besides what mathematicians give it?

tl;dr: The thread subject asks a mathematical question and is fully answered by a mathematical response.

The notion that it’s anything more than a mathematical question is a false belief grounded in mathematical ignorance; which I generally do not hold against the individual, but rather the mathematics education establishment.

I agree. Math ‘lets’ it. Because it is agreeable. But the order of the notation has been lost. Abandoned for a “new” theoretical math. Counting from zero to infinity and beyond, within infinite sets, some grand and others infinitesimal. Doesn’t get any more philosophical than that. Slicing and dicing the infinite.

That’ll take you places.

I’m glad to hear that. But the rest of your post consists of disagreement with my point. I’ll just respond if I may.

Yes, I take that to be a good description of the pragmatic nature of math, which is after all a contingent activity of human beings. Meaning that it’s always in flux and never just one thing.

There is no “order of notation.” You seem to think that the notation .999… means something even before we’ve defined what it means?

If I am understanding you correctly, then please, I ask you, what is it that you think it means? Keep it simple for a mathematically brainwashed simpleton such as myself.

It’s perfectly true that the way we think about the real numbers has evolved over the centuries. That’s perfectly normal, it’s the same in any human endeavor. You seem to claim that there is an “old” and I presume “more true” math. If so please do your best to help me understand what you mean by that.

That seems a little handwavy. If you don’t hear the music maybe math’s not your thing.

Then it doesn’t get philosophical at all, since you didn’t say anything.

Actually not. Actually not!! What the modern mathematical formalisms do is beautifully finesse the issues of infinity and infinitesimals. That’s the beauty of Newton’s conception as it evolved over the next two hundred years. We can build up the entire edifice within a perfectly logical finitistic context. If you will only grant the intuition you have of the counting numbers 1, 2, 3, 4, …, the rest can be built of iron-clad logic that a computer could verify.

That’s the clever bit. We have these vague intuitions about infinity, and the mathematical formalism provides BETTER intuitions. It’s a great achievement of humanity.

Like I say, if it’s not your thing that’s cool. But if you can verbalize what “order of notation” is being ignored by the mathematicians, what “inherent truth” you think is in the symbolism, please explain it to me.

Suddenly there are two symbols for one number. And one symbol has the concept of infinity within it. That’s unsettling.

Hi to all,

All the mathematicians here recognized that .999… can be represented as a function mapping the Counting numbers 1, 2, 3, … to the Rational numbers, given by the relationship 9*(1/10 + 1/100 + 1/1000 + …).

This representation is in fact a function with domain the Counting numbers and range in the Rational numbers.

Functions are not numbers!

While the limit of this function is a number (= 1), the function itself is not equal to 1.

This is a very clear ontological error.
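A minimal sketch of the distinction, modeling the representation by its partial sums (the function name here is just a placeholder):

```python
# The representation as a function from the Counting numbers to the Rationals,
# modeled here by its partial sums (a sketch, not official notation).
from fractions import Fraction

def partial_sum(n: int) -> Fraction:
    """The n-th partial sum 9/10 + 9/100 + ... + 9/10^n."""
    return sum(Fraction(9, 10 ** k) for k in range(1, n + 1))

for n in (1, 2, 3, 10):
    print(n, partial_sum(n))     # 9/10, 99/100, 999/1000, ...

# partial_sum is a function; the number 1 is the limit of its values,
# not the function itself -- which is the distinction being drawn above.
```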

P.S.
I have been a professional mathematician, and if I say so myself, quite good at it.

I tried to tell em, Ed. If you go back to page 752 of this thread you’ll see me tell em functions isn’t numbers. We wuz on this thing with sil and the James #2 public relations manager (forget his name… oh ‘obsrvr’ or something like that) about actual infinities as countable sets and the extension of infinity, or the ‘rule’ of counting itself. Confusing the rule for a product is what gives rise to the imaginary concept of a potential infinity as a set. But it ain’t a set, bro… It ain’t countable, it’s undefined and unreliable… like ice cube’s agent’s promise that people will watch his movies because of his talent as a actor and not just because he’s a rapper.

I don’t see why this is a problem. Are you equally concerned that 4, 2 + 2, 2 + 1 + 1, and 1 + 1 + 1 + 1 are different symbols for the same number?

How many other different ways can you think of to symbolize the number 4?

You understand that the number 4, the “actual” number 4, is an abstract idea that is “pointed to” by the representations. The representations are not the number.

You seem uncomfortable with the endless sequence of counting numbers 1, 2, 3, 4, 5, …, that most people seem to be perfectly comfortable with. All infinitary reasoning in mathematics is based on this fundamental intuition of the counting numbers.

Is there something about the idea of the endless sequence of counting numbers that bothers you? If so, you would be equally troubled by decimal notation. I grant you that. But why are you bothered by it in the first place? A child knows that you can “always add 1” to any counting number.

Additionally, we live in the age of computers. Even many endless decimals, like pi, are completely characterized by finite-length descriptions as computer programs. Pi only encodes a finite amount of information. .999… only encodes a finite amount of information: “(1) Print dot; (2) Print ‘9’; (3) GoTo 2”. That’s a finite description of the symbol .999…
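Sketched in Python for concreteness; the real “program” loops forever, so this version caps the number of digits just to keep the example runnable:

```python
# A finite description of the symbol .999... :
# "(1) Print dot; (2) Print '9'; (3) GoTo 2", capped here so it terminates.
def print_point_nine_repeating(digits: int = 20) -> None:
    print(".", end="")
    for _ in range(digits):      # step (3): loop back to step (2), truncated here
        print("9", end="")
    print()

print_point_nine_repeating()     # .99999999999999999999
```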

A quibble. In this particular case the sum of the convergent series is rational. But in the general case, the sum of a convergent series of rationals may be irrational. In fact that’s exactly how we construct the reals. Each real is the limit of a sequence of rationals.

I hope you’ll reconsider this point. The limits of sequences of rationals give us the reals. That’s what it means for the rationals to be dense in the reals. The reals are the completion of the rationals.

Again, beg to differ. One way of constructing the real numbers is as equivalence classes of Cauchy sequences of rationals. Each such sequence is a function. So a real number is an equivalence class of functions. At least that’s one way to construct them.
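To make that concrete for the case at hand (a standard textbook example, spelled out): the sequence of partial sums ((0.9,\ 0.99,\ 0.999,\ \ldots)) is a Cauchy sequence of rationals, since for (m > n) we have (|s_m - s_n| < 10^{-n}). It lies in the same equivalence class as the constant sequence ((1, 1, 1, \ldots)), because (|1 - s_n| = 10^{-n} \to 0). In this construction, that shared equivalence class is the real number (1), which is one precise sense in which (.999\ldots = 1).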

There isn’t much actual difference between functions and numbers. If I take the collection of functions from a singleton set {x} to the natural numbers, then each natural number can be naturally identified with the function that maps x to that number. That’s the categorical point of view.
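A toy version in code, with everything named hypothetically, just to show the back-and-forth identification:

```python
# Each natural number n corresponds to the function {x} -> N that sends x to n,
# and evaluating at the single point recovers n (a toy sketch of the identification).
def number_to_function(n: int):
    return lambda x: n           # the constant function on the singleton {x}

def function_to_number(f) -> int:
    return f("x")                # evaluate at the one point to get the number back

assert function_to_number(number_to_function(4)) == 4
```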

Yes it is. In fact, in the Dedekind cut or Cauchy sequence constructions, we essentially define the number 1 via the sequence itself.

You are wrong. In fact a real number is best understood as being literally identified with any sequence of rationals that converges to it.

Back to undergrad real analysis for you, my friend. I think many professional mathematicians don’t remember or never really cared about the foundational definitions. I’m sure they weren’t needed in your work. But in this discussion the precise formalizations are important. I hope this is fair to say.

This is off-topic. Indeed, each time you talk about reputation (and you do that all the time) you are being off-topic. The only way to stay on-topic (or at the very least, close to being on-topic) is by addressing arguments. That’s not what you’re doing. Instead, you’re trying to turn this thread into a reputation contest.

I suggest you do what I asked you to do earlier, namely to define the word “undefined”, in order to help us (other people) understand your arguments. I don’t expect you to do this, but it would be the only thing that would improve the quality of this thread and the discussion between the two of us.

Of course, if you don’t want to, you don’t have to. If you think there is a better way to spend your time, fine. But in that case, you’d also have to refrain from making off-topic posts.