The claim is that 0.999... != 1 in the sense that 0.999... and 1 differ by 1.0 x 10^-infinity. Two things are wrong with that, however: 1) infinity is not a number, so it cannot be used in ordinary arithmetic, and 2) 1.0 x 10^-infinity, for all practical and most impractical purposes, does not exist. 1.0 x 10^-infinity is just 0.000... . You keep thinking the 1 is coming sometime, but it never does, because of the infinite number of times it is being divided by 10. The 1 would normally appear at the end of the decimal, but this decimal has no end, so the 1 never appears, and all that is there is 0.000..., which is the same as 0. Even if you want to treat 1.0 x 10^-infinity as existing on some scale, its value is 0 to any degree of mathematical precision. So the supposed difference between 0.999... and 1 is exactly 0, and therefore 0.999... = 1.
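One way to see this concretely: 0.999... is the geometric series 9/10 + 9/100 + 9/1000 + ..., and its exact sum is a/(1 - r). Here is a small sketch (my own illustration, not part of the original argument) using Python's exact `Fraction` arithmetic, which also shows the gap 1 - 0.99...9 shrinking toward 0 as digits are added:

```python
from fractions import Fraction

# 0.999... is the geometric series 9/10 + 9/100 + 9/1000 + ...
# For |r| < 1 the exact sum is a / (1 - r), with a = 9/10 and r = 1/10.
a = Fraction(9, 10)
r = Fraction(1, 10)
total = a / (1 - r)
print(total)  # → 1

# The partial sums 0.9, 0.99, 0.999, ... leave a gap of exactly 10^-n,
# which heads toward 0 -- it never settles at some nonzero "final" value.
partial = Fraction(0)
for n in range(1, 8):
    partial += Fraction(9, 10 ** n)
    print(float(partial), "gap:", 1 - partial)
```

The point of using `Fraction` rather than floats is that the arithmetic is exact, so `total == 1` is a genuine equality, not a rounding accident.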

Now how is it even a question whether 1.333... = 4/3? Dividing 4 by 3 gives a remainder of 1 at every step, so the 3s repeat forever. That's just dumb.
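For anyone who wants to watch the long division happen, here is a tiny sketch (my own illustration) that carries the remainder digit by digit; the remainder is 1 every time, so every digit is a 3:

```python
# Long division of 4 by 3, one decimal digit at a time.
# Integer part: 4 // 3 = 1, leaving remainder 1.
remainder = 4 % 3
digits = []
for _ in range(10):
    remainder *= 10               # bring down a zero
    digits.append(remainder // 3)  # next decimal digit (always 3)
    remainder %= 3                 # remainder is always 1 again
print("1." + "".join(map(str, digits)))  # → 1.3333333333
```

Because the remainder cycles back to 1 forever, the expansion never terminates, which is exactly what 1.333... means.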