Everyone knows that you cannot divide a number by 0, but why not?
In elementary-school arithmetic the question looks simple. Back then, division was defined as "splitting something into several parts." Splitting into one, two, three, four, five, six, or seven parts is easy to picture, but how do you split 10 biscuits among 0 people? You can't picture it, so it can't be done.
A sharp student might object: if 0 biscuits are shared among 0 people, there is nothing and no one, so it seems like anything goes. But precisely because there is nothing and no one, everyone could equally well be said to receive any amount; no single definite value can be assigned.
That conclusion is correct, but it rests entirely on intuition, and "I can't imagine it" does not by itself prove "it doesn't exist." Ancient mathematics was built on intuition, which is enough for buying groceries, but to go further it needs definitions and proofs. So, on to middle school.
Junior High School
Now we meet the most basic algebra: solving equations. Division and multiplication are inverse operations of each other, so asking what 1/0 equals is equivalent to solving the equation
0 * x = 1
By definition, 0 times any number is 0 and can never equal 1, so no x satisfies the equation, and the division is impossible.
Similarly, asking what 0/0 equals is equivalent to solving the equation
0 * x = 0
This time every number satisfies x, so the division again fails: no single answer can be determined.
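The two equations can be checked mechanically. Here is a small Python sketch; the sample of candidate values is arbitrary and only for illustration:

```python
# A brute-force check of the two equations above over a sample of
# candidate values: 0*x = 1 has no solution at all, while 0*x = 0 is
# solved by every candidate, so neither 1/0 nor 0/0 names a unique number.
candidates = [x / 2 for x in range(-10, 11)]   # -5.0, -4.5, ..., 5.0
solutions_of_one = [x for x in candidates if 0 * x == 1]
solutions_of_zero = [x for x in candidates if 0 * x == 0]
assert solutions_of_one == []                  # no x gives 0*x = 1
assert solutions_of_zero == candidates         # every x gives 0*x = 0
```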
Once we meet basic formal logic, we find another route: proof by contradiction.
A set of true statements can never entail a false one. So if we take a set of true statements, add "you can divide by zero as usual," and derive something false, the only possible culprit is the new assumption: dividing by zero is not legitimate.
So, it is known that
0 * 1 = 0
0 * 2 = 0
and therefore
0 * 1 = 0 * 2
Divide both sides by zero to get
(0/0) * 1 = (0/0) * 2
which simplifies to
1 = 2. This is obviously false.
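As an aside, programming languages enforce the same refusal. In Python, for example, dividing by zero raises an error instead of returning a number:

```python
# Python refuses division by zero outright: the operation raises an
# exception rather than producing any value.
try:
    result = 1 / 0
except ZeroDivisionError as exc:
    print(exc)  # division by zero
```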
So the problem is solved! Well, not quite. Consider another question: what is the square root of -1?
You might say -1 has no square root, because every square is non-negative. But that is only true for real numbers. What if we add a definition? Define
i^2 = -1
which creates the imaginary unit, and suddenly -1 does have a square root.
So why can't we define a "new" number, declare that 1/0 equals it, and set up a system of rules for it? That question has to wait for university.
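Mainstream languages already carry out exactly the i^2 = -1 extension described above; Python's built-in complex type is one example:

```python
# Python's built-in complex numbers: the imaginary unit i is written 1j
# and satisfies the defining property i**2 = -1, while ordinary algebra
# keeps working around it.
i = 1j
assert i ** 2 == -1                 # the defining property
assert (2 + 3j) * (2 - 3j) == 13    # (a+bi)(a-bi) = a^2 + b^2 still holds
```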
University, Year 1
In your first calculus course you immediately meet the symbol ∞. Hey, isn't that "infinity"? We have all learned the concept of a limit, so why not let b tend to 0 and define the limit of a/b to be infinity?
This runs into trouble immediately: the left and right limits disagree. Does b approach 0 from the negative side or the positive side? One way the quotient plunges toward ever more negative values; the other way it climbs toward ever more positive ones. No single limit can be defined.
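The disagreement between the two one-sided limits is easy to see numerically:

```python
# Probing the one-sided limits of 1/b as b -> 0: from the right the
# quotient climbs toward +infinity, from the left it plunges toward
# -infinity, so no single limit exists.
for b in [0.1, 0.01, 0.001]:
    from_right = 1 / b       # 10.0, 100.0, 1000.0
    from_left = 1 / (-b)     # -10.0, -100.0, -1000.0
    assert from_right > 0 > from_left
    assert from_right == -from_left
```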
That is why calculus courses keep insisting that ∞, though written everywhere, only denotes a trend. It is not a real number and cannot be used in calculations.
University, Year 2
Fine, lesson learned: forget the ready-made symbol and define directly
1/0 = w
where w is an "infinite" number that has nothing to do with limits. Nothing to object to now, right?
But definitions do not live in a vacuum. You can define whatever you like, but if the definition is inconsistent with the existing system, it is unusable, or at least very painful to use.
And w causes trouble right away. First, how does w fit into ordinary addition, subtraction, multiplication, and division? What is 1 + w? What is w - w? A number you cannot even do arithmetic with is not much of a number.
Intuitively, 1 + w should equal w: it's infinite! And w - w should equal 0: it's subtracted from itself!
But this immediately clashes with the associative law of addition:
1 + (w - w) = 1 + 0 = 1, but
(1 + w) - w = w - w = 0. Associativity is utterly basic to addition, and it is used in the proofs of countless theorems. Giving it up for the sake of w is a steep price: we would have to start over and build a whole new system. Such a system is not impossible to build, but it is laborious and (for now) useless, so everyone sticks with the old one, and in the old one, preserving associativity rules this move out.
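Interestingly, IEEE 754 floating point does contain an infinity value, and the clash above is exactly why it refuses to assign w - w any value at all, yielding NaN ("not a number") instead of 0:

```python
import math

# IEEE 754 floats contain an infinity, and the associativity clash
# described above is precisely why inf - inf is defined as NaN rather
# than 0: no single value would be consistent.
w = math.inf
assert 1 + w == w               # the intuitive rule 1 + w = w survives
assert math.isnan(w - w)        # but w - w is refused a value
assert math.isnan(1 + (w - w))  # so both groupings agree on "no answer"
assert math.isnan((1 + w) - w)
```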
Readers are welcome to use their imagination and try to write down rules for w. You will find that however you specify how w relates to the other numbers, as long as you insist on 1/0 = w, you cannot avoid contradicting the basic mathematics you have known since childhood. Again: you could strike out on your own and build a new mathematics on top of w, but it would be incompatible with most of traditional mathematics and surely very painful to use. So we stay with the system in which division by zero is forbidden, and that choice is entirely reasonable.
University, Year 3
You may object: with so many possible definitions, have we really tried them all? If not, how do we know a self-consistent one won't turn up someday?
"New discoveries overturning old conclusions" happens in biology, in chemistry, and in physics, but not in mathematics, because mathematics is built on logic: individual cases admit exceptions, logic does not. Granted, mathematics has not completed its final axiomatization, and we still face Gödel's ghost, but at least in this example, if w were a number it would violate some very important axioms, and those axioms sit very deep in the foundations.
For example, there is a set of basic axioms called the Peano axioms. One of them says that every natural number has a definite successor, which is again a natural number; another says that two natural numbers b and c are equal if and only if their successors are equal.
So whose successor is w? In other words, what do you add 1 to in order to get w? Every other number already has its own successor, so there is no slot for w, and no other number plus 1 can become w. The only option left is 1 + w = w, which directly contradicts the second axiom. And without the Peano axioms, the whole system of natural numbers collapses.
That was assuming w is a natural number; the other cases are a bit more complicated, but something similar happens under every definition of w. If you insist on treating w as a number, it will not be compatible with the real numbers we already have. Hence we can only declare that, in almost all situations, you cannot divide by zero.
University, Year 4 and beyond
Since we said "almost," there are exceptions: in certain unusual settings, you can.
For example, there is an object called "complex infinity." It is a point on the extended complex plane, a genuine, well-defined point. Under this special convention you may write an expression like
1/0 = ∞.
Why this works is a long story, but it is not an operation in the usual sense: for instance, you cannot invert it to recover 0, and you cannot write
1 = 0 * ∞.
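The convention can be sketched in a few lines. This is a hypothetical illustration, not a real library: the names `riemann_div` and `INF` are invented here, and the sketch only captures the one rule z/0 = ∞ for nonzero z:

```python
# A minimal sketch of the extended-complex-plane convention: one extra
# point INF is adjoined, z/0 = INF for every nonzero z, and 0/0 stays
# undefined. The names riemann_div and INF are made up for this sketch.
INF = object()   # the single added point, "complex infinity"

def riemann_div(a, b):
    if b == 0:
        if a == 0:
            raise ValueError("0/0 is undefined even here")
        return INF
    return a / b

assert riemann_div(1, 0) is INF
assert riemann_div(-2j, 0) is INF   # every nonzero z maps to the same point
assert riemann_div(6, 3) == 2
```

Note the one-directionality: `riemann_div` never lets you multiply INF by 0 to get 1 back, matching the restriction above.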
Moreover, in some other settings "infinity" really is treated as a thing. For example, when you measure the size of a set, it can be infinite, and there are many different kinds of infinity: the natural numbers are infinite, the rationals are infinite, the reals are infinite too, yet there are exactly as many odd numbers, even numbers, and integers as there are naturals and rationals, while there are strictly more reals than all of those. All infinities, yet some infinities are more infinite than others. But that is another topic; let's stop here.
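The claim that there are "as many" even numbers as naturals rests on a pairing, and the pairing itself is easy to exhibit (over a finite window, for illustration):

```python
# "As many even numbers as naturals": the map n -> 2n pairs every
# natural with exactly one even number and vice versa (a bijection),
# so the two infinite sets share the same cardinality even though the
# evens are a proper subset of the naturals.
naturals = list(range(10))                  # a finite window, for illustration
evens = [2 * n for n in naturals]
assert evens == [0, 2, 4, 6, 8, 10, 12, 14, 16, 18]
assert [e // 2 for e in evens] == naturals  # the pairing inverts cleanly
```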