............ I failed Algebra II..
You guys are really smart... >_<
Yes, I'm like a hardcore Christian, except in a non-religious way.
It's okay man. I'll just be relentlessly logical until you grudgingly accept.
It's how I was taught.
Take a really small real number. Say one divided by the number of protons in the universe. In practical reality something that small doesn't exist, but in mathematics it does.
Now divide that number by two. Oh crap, you have an even smaller real number! Divide it again! And again! Divide it by the inverse of itself! Oh crap, it got even smaller! Do it again and again!
You can do this "infinitely," and you'd always get a smaller real number. There is no such thing as a smallest real number greater than zero.
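Written out compactly, that whole argument is one line:

$$ \forall x \in \mathbb{R},\ x > 0 \implies 0 < \tfrac{x}{2} < x, $$

so any candidate for "smallest positive real" is immediately beaten by its own half.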
Once I realized that, 0.999... stopped seeming like some mysterious irrational number to me.
Programming has led me to believe that whoever wrote the universe included some version of

#include <limits>
#define fltmin std::numeric_limits<float>::min()

for the universe. Except rather than float we'd use all numbers.
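If you actually compile that, here's a minimal sketch of what it gives you (fltmin is just the macro name from the post above; denorm_min() goes even smaller than min()):

#include <iostream>
#include <limits>
#define fltmin std::numeric_limits<float>::min()

int main() {
    // smallest positive *normalized* float
    std::cout << fltmin << '\n';                                    // ~1.17549e-38
    // smallest positive float of any kind (a denormal)
    std::cout << std::numeric_limits<float>::denorm_min() << '\n';  // ~1.4013e-45
}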
Well, in programming there is a smallest number greater than 0, and I believe the universe is a giant program. When I learned about Planck lengths, it blew my mind.
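That difference between the reals and floats is easy to demonstrate. A quick sketch, assuming ordinary IEEE-754 single-precision floats, where the halving trick from earlier in the thread really does bottom out:

#include <iostream>

int main() {
    float x = 1.0f;
    int steps = 0;
    while (x > 0.0f) {  // over the reals, this loop would never terminate
        x /= 2.0f;
        ++steps;
    }
    // prints 150 with IEEE-754 floats and default rounding
    std::cout << "reached zero after " << steps << " halvings\n";
}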
Yes, mathematics commonly has sexual intercourse with the other sciences. But no matter how large English's breasts get, maths just doesn't swing that way.
Decimals and fractions are different. 1 does not equal 0.999...
In math, to prove something fully, you also have to disprove that it is false. You cannot disprove that 0.999... =/= 1, therefore the statement is not true. (When you disprove a falsehood you usually use indirect proofs.)
Infinity can never be reached, only approached, and it does not actually exist in math. The full proven statement here is that as the number of digits after the decimal point approaches infinity, the value of the number (which is still rational) approaches 1 but never quite reaches it.
Let ".999..." equal a decimal followed by an infinite series of nines.
Ok, all of you who are claiming that .999... does not equal one are wrong. The number between them is not "infinitely small," because there is no number between them. They are not "basically equal with rounding," they are in fact exactly equal.
Yeah it does. I'm pretty sure you can identify a single object in the "practical world". ".999..." is just a different representation of it, but just as valid as "1."
This makes no sense. It's not a "misunderstanding," it's adding together rational numbers.
This is also wrong, and I'm not even sure where you got this. It is a basic rule of algebra that any two distinct real numbers have an infinite number of other distinct real numbers between them (their average, for one, lies strictly between them). 0.999... is a real number.
Someone who remembers more calculus than me will have to post the actual proof. It was posted a couple of years ago in the last thread but I'm too lazy to look it up. But yes, it is actually, definitively proven that .999... = 1.
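For the record, here's a sketch of the standard geometric-series argument (not necessarily the exact proof from the old thread, which is lost, but the usual one):

$$ 0.999\ldots \;=\; \sum_{k=1}^{\infty} \frac{9}{10^{k}} \;=\; \frac{9/10}{1 - 1/10} \;=\; 1. $$

The decimal expansion 0.999... is *defined* as the value of that infinite series, and the series sums to exactly 1, so there is no gap left over.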
EDIT: Ok, previous threads on this: First one. Second one. Third one.
I'm not talking about what the rules of mathematics are; I'm debating the actual merit of ever using the representation .999... when all it does is cause confusion. Some would undoubtedly say that the .999... representation exists to increase clarity, but as proven by this discussion and countless others, it clearly does just the opposite. The simple fact that they are equal renders the representation .999... obsolete as far as I'm concerned.
Programming is not mathematics. Programming is defined, restricted, by the limits of the computer. Computers cannot comprehend infinity. Most humans can't comprehend infinity; why should a commercially available machine MADE BY HUMANS be able to do it? When you see "infinity" on a computer, that's just a pre-defined bitstring that the computer interprets as "infinity", just like all other numbers are pre-defined bitstrings interpreted by the computer as explicit numbers. If you desire, I would be glad to have a sit-down discussion with you about computer architecture.
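You can even look at that bitstring yourself. A minimal sketch, assuming IEEE-754 floats (which is what essentially all commodity hardware uses):

#include <cstdint>
#include <cstring>
#include <iostream>
#include <limits>

int main() {
    float inf = std::numeric_limits<float>::infinity();
    std::uint32_t bits;
    std::memcpy(&bits, &inf, sizeof bits);  // reinterpret the float's raw bits
    std::cout << std::hex << bits << '\n';  // prints 7f800000: "infinity" is just this bit pattern
}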
The universe, on the other hand, is not limited. This is the reason why we have the *concept* (did you get that) of infinity, because the human mind can comprehend this concept and it is a useful concept in mathematics.
Your "belief" that the universe is a giant program does not make it so. Reject your beliefs and accept science.
tl;dr - stop trollin and lrn2math
The universe has its limits, or at least you can't prove it doesn't.
If you use it in something, it's going to come out so close it won't really matter. But technically and logically they are different.
If you placed the lines on a graph, they'd be close. You might argue they'd even be touching. But they wouldn't be exactly the same.