I meant recurring decimals, which I forgot to mention by mistake.

I think you are getting confused. 0.99 = 0.9900..., which is different from 0.999...
The bottom line is that we assume 0.999... to be equal to 1 because our division system cannot completely resolve recurring decimals.
int a = 1;
double answer = a / 3.0; // floating-point division; a / 3 with two ints truncates to 0
System.out.println("Solution is: " + answer); // prints 0.3333333333333333
int a = 1;
double answer = a / 3.0;
System.out.println("Solution is: " + Math.round(answer)); // Math.round (Math.roundOf does not exist); prints 0
1) If we have to be very accurate, keep it as 0.9999!
2) If we wish to keep our manipulation simple, make it 1.
I'm not saying that I don't approximate it to 1; I do. I have to when it comes to calculations. It's a well-known fact that 0.999... is taken to be 1, whether you call it rounding off or approximation. I'm just saying it's not EQUAL to 1. It's really close, but there's a difference.

I disagree. If you were going to be accurate, 1/3 would not be 0.33333 in the first place. It would just be 1/3. 0.333333333 is as equal to 1/3 as 0.99999999 is equal to 1. Getting 0.3333333333333 is rounding off. Rounding 0.99999999999999 off to 1 rounds it back to what 0.33... x 3 would have been.
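The point about 0.33... x 3 "rounding back" to 1 actually plays out in ordinary double arithmetic. Here is a small Java sketch (the class name is my own) showing that the already-rounded double value of 1/3, multiplied by 3, rounds to exactly 1.0 again:

```java
public class ThirdTimesThree {
    public static void main(String[] args) {
        double third = 1.0 / 3.0;   // nearest representable double to 1/3 (already rounded)
        double back = third * 3.0;  // the multiplication's own rounding lands exactly on 1.0
        System.out.println(third);       // 0.3333333333333333
        System.out.println(back == 1.0); // true
    }
}
```

So the rounding error introduced when 1/3 is truncated to a finite binary fraction is cancelled out by the rounding step of the multiplication, which is the same "rounds it back" effect described above.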
I honestly did hope we would all have agreed by now that 0.999... = 1. Those who don't think it does, are you arguing for the sake of arguing?

You would hope so.
I disagree. If you were going to be accurate, 1/3 would not be .33333 in the first place.
It's really close, but there's a difference.
It's not that I'm denying the current system or saying it's wrong. I'm merely saying that it's not equal to one; it's really close, so close that we assume it to be 1.

I'm only used to doing fraction calculations (if they have to be expressed as decimals at all) to four decimal places, and not 0.3333333333..333 or 0.99999..9999 or whatever. That's the standard we prefer here, even in crucial exams.
surendar added 2 Minutes and 29 Seconds later...
Such a negligible difference hardly matters when it comes to practical applications. Can you point to just one real-world application that shows a drastic impact or abnormal deviation because of this 0.00000000..001? I am also curious to know if there is any!
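I can't point to a drastic real-world failure caused by that particular gap, but tiny representational differences do surface in everyday floating-point arithmetic. A minimal Java illustration (the class name is my own): 0.1 has no exact binary representation, so adding it ten times does not give exactly 1.0:

```java
public class TenthSum {
    public static void main(String[] args) {
        // 0.1 cannot be represented exactly in binary floating point,
        // so the tiny per-step error accumulates across the loop.
        double sum = 0.0;
        for (int i = 0; i < 10; i++) {
            sum += 0.1;
        }
        System.out.println(sum);        // 0.9999999999999999
        System.out.println(sum == 1.0); // false
    }
}
```

This is why comparing doubles for exact equality is generally avoided in practice; the usual fix is to compare against a small tolerance instead.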
Infinity is not a number. It is a concept: a concept of what would happen if an increasing number could actually get to where it is heading. It can never reach infinite 9s, because there's no end to them. Every time you add a 9 it gets closer; that's it.
And btw, PC roulette is not rigged. I've seen the casino code and it definitely isn't rigged.
I know infinity is a concept, which is why I'm saying that the assumption that 0.999... = 1 is based on a concept. There's no practical proof for it; not that it's possible to do it practically, though.

Infinity is not a number. It is a concept. A concept of what would happen if an increasing number could actually get to where it is heading towards.
1/1000 is pretty close to zero. 1/999999 is even closer to zero. But to express the number that actually would be zero, we say 1/infinity: 1/infinity is 0. In the same way, if you put infinitely many 9s after the decimal point, you get 1. You can't actually put infinitely many 9s there; it is merely a concept. If it were possible to put them there, though, the number would finally equal 1.
This infinity concept is one of the main ideas behind sequences and series.