Originally posted by 37818
0 x 1 = 0. 0 x 2 = 0. 0 x 100 = 0. So 0/0 is indeterminate. In an equation should 0/0 occur, its value is defined by the equality. Otherwise 0/0 by itself is indeterminate.
As I've said, it's relatively straightforward to prove that if you allow dividing 1 by 0, then you're forced to accept that 1 = 0. There is no way around that.
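To spell the step out (a minimal sketch, assuming only the ordinary rules of arithmetic, namely that $x \cdot 0 = 0$ for every $x$ and that division is the inverse of multiplication):

$$\frac{1}{0} = x \;\Longrightarrow\; 1 = x \cdot 0 = 0.$$

So admitting $1/0$ as a number immediately yields $1 = 0$.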
The symbol for infinity has to be treated with care: it is neither a cardinal number nor a real number. However, you used aleph_0, which is a cardinal, and in cardinal arithmetic the product of zero and aleph_0 is zero. So if you try to decree by fiat of definition that one divided by zero equals aleph_0, then multiplying both sides of that equality by zero gives you that one equals zero. Clearly a contradiction! Ergo, one divided by zero cannot equal aleph_0.
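In display form (the same argument, assuming the standard convention of cardinal arithmetic that $0 \cdot \aleph_0 = 0$):

$$\frac{1}{0} = \aleph_0 \;\Longrightarrow\; 1 = 0 \cdot \frac{1}{0} = 0 \cdot \aleph_0 = 0,$$

which is absurd, so the proposed definition cannot stand.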