Suppose a typewriter has 50 keys and the word to be typed is "banana". If the keys are pressed randomly and independently, the probability that the first letter typed is "b" is 1/50, and the probability that the second letter typed is "a" is also 1/50, and so on. Because the events are independent, the probability that the first six letters typed spell "banana" is:
(1/50) × (1/50) × (1/50) × (1/50) × (1/50) × (1/50) = (1/50)^6 = 1/15,625,000,000. Likewise, the probability that the next block of six letters typed spells "banana" is also (1/50)^6, and so on.
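This arithmetic is easy to check, for instance with a few lines of Python:

    from fractions import Fraction

    # Chance that one block of six random keystrokes on a 50-key typewriter
    # spells "banana": each of the six letters matches with probability 1/50.
    p_banana = Fraction(1, 50) ** 6
    print(p_banana)            # 1/15625000000
    print(float(p_banana))     # 6.4e-11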
From the above, the probability of not typing "banana" in a given block of six letters is 1 − (1/50)^6. Because each block of six letters is typed independently, the probability Xn of not typing "banana" in any of the first n blocks is:

Xn = (1 − (1/50)^6)^n
The larger n is, the smaller Xn becomes. When n equals 1 million, Xn is about 0.9999 (the probability of not typing "banana" is 99.99%); when n equals 10 billion, Xn is about 0.53 (a 53% probability of not typing "banana"); and when n equals 100 billion, Xn is about 0.0017 (a 0.17% probability of not typing "banana"). As n tends to infinity, Xn tends to zero. In other words, by making n large enough, Xn can be made as small as one likes.
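The figures above can be reproduced with a short Python sketch, using the block counts quoted in the text:

    # Probability Xn of NOT typing "banana" in any of the first n blocks of six
    # random keystrokes on a 50-key typewriter: Xn = (1 - (1/50)**6)**n.
    p = (1 / 50) ** 6                      # chance that one block spells "banana"

    for n in (10**6, 10**10, 10**11):      # 1 million, 10 billion, 100 billion
        print(n, round((1 - p) ** n, 4))
    # 1000000 0.9999
    # 10000000000 0.5273
    # 100000000000 0.0017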
The same argument also shows that, among infinitely many monkeys, at least one will type a given text. Here Xn = (1 − (1/50)^6)^n, where Xn now represents the probability that none of the first n monkeys types "banana" correctly on its first attempt. With 100 billion monkeys this probability drops to 0.17%, and as the number of monkeys n tends to infinity, the probability Xn that none of them types "banana" tends to zero.
However, with only a limited amount of time and a limited number of monkeys, the conclusion is quite different. If there were as many monkeys as there are elementary particles in the Hubble volume (about 10^80), each typing 1,000 characters per second for 100 times the lifetime of the universe so far (about 10^20 seconds), the probability that the monkeys type even a very short book would still be practically zero (the relevant probabilities are worked out below).
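A rough sketch of why, working in base-10 logarithms because the numbers overflow ordinary floating point; the 10,000-letter length assumed here for the "very short book" is purely illustrative, while the other figures are those quoted above:

    import math

    # Figures quoted above; the 10,000-letter "very short book" is an assumption
    # made purely for illustration.
    log10_chars_typed = 80 + 3 + 20              # 10^80 monkeys * 10^3 chars/s * 10^20 s
    log10_p_per_try   = -10_000 * math.log10(50) # one starting position matching the book

    # Expected number of matches <= (characters typed) * (chance per starting position);
    # its log10 is hugely negative, so the probability of even one match is negligible.
    print(log10_chars_typed + log10_p_per_try)   # about -16,887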
The two situations above can be generalized to arbitrary strings: given an infinite string in which each character is chosen at random, any given finite string almost surely appears as a substring (in fact, it appears infinitely many times); and given an infinite sequence of infinite strings, in which each character of each string is chosen at random, any given finite string almost surely appears at the beginning of one of the strings (in fact, at the beginning of infinitely many of them). For the second statement, let Ek be the event that the given string appears at the beginning of the k-th string. This event has a fixed, nonzero probability p, and the events Ek are independent, so:

P(E1) + P(E2) + P(E3) + … = p + p + p + … = ∞
Since this sum diverges and the events are independent, the probability that infinitely many of the events Ek occur is 1 (this is the second Borel–Cantelli lemma). The first statement is handled in the same way: divide the infinite string into non-overlapping blocks whose length equals that of the given string, and let Ek be the event that the k-th block equals the given string.

Ignoring punctuation, spacing, and capitalization, the probability that the first letter a monkey types matches the first letter of Hamlet is one in 26, and the probability that the first two letters match is one in 676 (26 × 26). Because the probability shrinks exponentially, the probability that the first 20 letters match is already only one in 26^20, about 5.0 × 10^−29. For the entire text of Hamlet, the probability is so small as to defy imagination. The whole of Hamlet contains about 130,000 letters, so the probability of typing the complete text in a single attempt is about one in 3.4 × 10^183,946, and the average number of letters that must be typed before the text appears is also 3.4 × 10^183,946 (or, including punctuation, 4.4 × 10^360,783).
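These figures can be checked the same way, again on a log scale because the numbers are far too large for ordinary floating point (the 130,000-letter count is the one quoted above):

    import math

    log26 = math.log10(26)

    # First 20 letters of Hamlet: probability 26**-20 of matching by chance.
    print(10 ** (-20 * log26))          # about 5.0e-29

    # Entire text, taken as 130,000 letters (the figure quoted above).
    letters = 130_000
    log10_p = -letters * log26          # log10 of the one-shot success probability
    exponent = math.floor(-log10_p)
    mantissa = 10 ** (-log10_p - exponent)
    print(f"about one in {mantissa:.1f} x 10^{exponent:,}")   # one in 3.4 x 10^183,946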
Even if the entire Hubble volume were filled with monkeys typing without pause, the probability that they produce a single copy of Hamlet would still be less than one in 10^183,800.
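A rough consistency check of this bound, sketched with the figures quoted earlier (10^80 monkeys, 1,000 characters per second, 10^20 seconds, a 130,000-letter target) and a simple union bound:

    import math

    # Figures quoted earlier, all handled on a log10 scale.
    log10_chars_typed = 80 + 3 + 20                 # total characters ever typed
    log10_p_per_try   = -130_000 * math.log10(26)   # one starting position matching Hamlet

    # Union bound: P(at least one match) <= (number of tries) * (chance per try).
    print(log10_chars_typed + log10_p_per_try)      # about -183,843, i.e. less than one in 10^183,800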