
Chaos, Bounds, and Randomness Help

By — McGraw-Hill Professional
Updated on Apr 25, 2014

The Malthusian Model

Chaos theory has been applied in grim fashion to describe the characteristics of the earth's population growth. Suppose we want to find a function that can describe world population versus time. The simplest model allows for an exponential increase in population, but this so-called Malthusian model (named after its inventor, Thomas Malthus) does not incorporate factors such as disease pandemics, world wars, or the collision of an asteroid with the planet.

The Malthusian model is based on the idea that the world's human population increases geometrically, the way bacteria multiply, while the world's available supply of food and other resources increases arithmetically. It is easy to see that a pure Malthusian population increase can go on for only a certain length of time. When a certain critical point is reached, the population will no longer increase, because the earth will get too crowded and there won't be enough resources to keep people alive. What will happen then? Will the population level off smoothly? Will it decline suddenly and then increase again? Will it decline gradually and then stay low? The outcome depends on the values we assign to certain parameters in the function we ultimately find to describe population versus time.
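If you want to see the disparity for yourself, the following Python sketch compares geometric population growth with arithmetic resource growth and reports when the population overtakes the resources. The starting values and growth rates are illustrative assumptions, not figures from this chapter.

```python
# Sketch: geometric population growth versus arithmetic resource growth.
# The starting values and growth rates below are illustrative assumptions.

population = 100.0      # arbitrary starting population
resources = 1000.0      # arbitrary starting supply of "rations"
growth_factor = 1.05    # population multiplies by 5% each period (geometric)
resource_step = 20.0    # resources gain a fixed amount each period (arithmetic)

period = 0
while population <= resources:
    population *= growth_factor   # geometric (Malthusian) increase
    resources += resource_step    # arithmetic increase
    period += 1

print(f"Population overtakes resources after {period} periods: "
      f"{population:.0f} people vs. {resources:.0f} rations")
```

However small the geometric growth rate, the multiplication eventually outruns any fixed arithmetic increase; only the timing of the crossover changes.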

A Bumpy Ride

The limiting process for any population-versus-time function depends on the extent of the disparity between population growth and resource growth. If we consider the earth's resources to be finite, then the shape of the population-versus-time curve depends on how fast people reproduce until a catastrophe occurs. As the reproduction rate goes up – as the "function is driven harder" – the time period until the first crisis decreases, and the ensuing fluctuations become more and more wild. In the worst cases, the predictions become dire indeed.

The Malthusian equation for population increase is:

  • x_(n+1) = r · x_n · (1 – x_n)

where x_n is the population after n time intervals (expressed as a fraction of some maximum possible population), n is a whole number starting with n = 0, and r is a factor that represents the rate of population increase. (This is not the same r factor that represents correlation, defined earlier in this chapter.) Statisticians, social scientists, biologists, mathematicians, and even some politicians have run this formula through computers for various values of r, in an attempt to predict what would happen to the world's population as a function of time on the basis of various degrees of "population growth pressure." It turns out that a leveling-off condition occurs when the value of r is less than 3. The situation becomes more complicated and grotesque with higher values of r. As the value of the r factor increases, the function is "driven harder," and the population increases with greater rapidity – until a certain point in time. Then chaos breaks loose.
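If you'd like to experiment with the formula yourself, here is a minimal Python sketch that iterates x_(n+1) = r · x_n · (1 – x_n). The starting value x_0 = 0.1 and the choice r = 2.0 are illustrative assumptions.

```python
def malthusian_step(x, r):
    """One iteration of x_(n+1) = r * x_n * (1 - x_n)."""
    return r * x * (1.0 - x)

def iterate(x0, r, n):
    """Return the first n+1 values x_0, x_1, ..., x_n."""
    values = [x0]
    for _ in range(n):
        values.append(malthusian_step(values[-1], r))
    return values

# x is the population as a fraction of some maximum, so 0 < x_0 < 1.
# The starting value 0.1 and r = 2.0 are illustrative assumptions.
print(iterate(0.1, 2.0, 10))   # climbs and then settles near 0.5
```

With r = 2.0 the values rise and level off smoothly, which is the tame end of the range of behavior described below.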

According to computer models, when the r factor is low, the world population increases, reaches a peak, and then falls back. Then the population increases again, reaches another peak, and undergoes another decline. This takes place over and over but with gradually diminishing wildness. Thus, a damped oscillation occurs in the population function as it settles to a steady state (Fig. 7-11A).


In real life, the r factor can be kept low by strict population control and public education. Conversely, the r factor could become higher if all efforts at population control were abandoned. Computers tell us with unblinking screens what they "think" will happen then. If the value of r is large enough, the ultimate world population does not settle down, but oscillates indefinitely between limiting values. The amplitude and frequency of the oscillation depend on how large the r factor is allowed to get (Fig. 7-11B). At a certain critical value of the r factor, even this vestige of orderliness is lost, and the population-versus-time function fluctuates crazily, never settling into any apparent oscillation frequency, although there are apparent maximum and minimum limits to the peaks and valleys (Fig. 7-11C).
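The three kinds of behavior sketched in Fig. 7-11 are easy to reproduce. The following Python sketch discards a long transient and then prints the values the population keeps visiting for three values of r; the particular choices 2.8, 3.2, and 3.9 are illustrative assumptions, not numbers quoted in the text.

```python
def long_run(x0, r, skip=1000, keep=8):
    """Iterate x_(n+1) = r * x_n * (1 - x_n), discard a transient,
    and return the next few values."""
    x = x0
    for _ in range(skip):          # let the early fluctuations die out
        x = r * x * (1.0 - x)
    tail = []
    for _ in range(keep):
        x = r * x * (1.0 - x)
        tail.append(round(x, 4))
    return tail

# The r values below are illustrative choices, not numbers from the text.
for r in (2.8, 3.2, 3.9):
    print(f"r = {r}: {long_run(0.1, r)}")

# r = 2.8 -> one repeated value: the population settles down (compare Fig. 7-11A)
# r = 3.2 -> two alternating values: a bounded oscillation (compare Fig. 7-11B)
# r = 3.9 -> values that never repeat, though they stay between bounds (compare Fig. 7-11C)
```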

A graph in which the world's ultimate human population is plotted on the vertical (dependent-variable) axis and the r factor is plotted on the horizontal (independent-variable) axis produces a characteristic pattern something like the one shown in Fig. 7-12. The function breaks into oscillation when the r factor reaches a certain value. At first this oscillation has defined frequency and amplitude. But as r continues to increase, a point is reached where the oscillation turns into noise. As an analogy, think about what happens when the audio gain of a public-address system is increased until feedback from the speakers finds its way to the microphone, and the speakers begin to howl. If the audio gain is turned up higher, the oscillations get louder. If the system is driven harder still, the oscillations increase in fury until, in the absence of all restraint, the system roars like thunder.
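A diagram like Fig. 7-12 can be approximated numerically. The sketch below, for a handful of r values chosen as illustrative assumptions, counts how many distinct values the population keeps visiting after the transient dies out: one value means a steady state, two or four indicate an oscillation, and a large count signals the noisy region at the right-hand side of the figure.

```python
def attractor_size(r, x0=0.1, skip=2000, keep=200, digits=5):
    """Count how many distinct values x keeps visiting after the transient dies out."""
    x = x0
    for _ in range(skip):
        x = r * x * (1.0 - x)
    seen = set()
    for _ in range(keep):
        x = r * x * (1.0 - x)
        seen.add(round(x, digits))
    return len(seen)

# Sweep r and watch the single steady value split into 2, then 4, then many:
# the same progression that produces the "noisy" region of Fig. 7-12.
# The particular r values are illustrative assumptions.
for r in (2.5, 2.7, 2.9, 3.1, 3.3, 3.5, 3.7, 3.9):
    print(f"r = {r}: roughly {attractor_size(r)} long-run value(s)")
```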


Does the final population figure in the right-hand part of Fig. 7-12 truly represent unpredictable variation between extremes? If the computer models are to be believed, it does. By all indications, the gray area in the right-hand part of Fig. 7-12 represents a sort of randomness.

What is Randomness?

In statistical analysis, there is often a need to obtain sequences of values that occur at random. What constitutes randomness? Here's one definition that can be applied to single-digit numbers:

  • A sequence of digits from the set {0, 1, 2, 3, 4, 5, 6, 7, 8, 9} can be considered random if and only if, given any digit in the sequence, there exists no way to predict the next one.

At first thought, the task of generating a sequence of random digits in this way seems easy. Suppose we chatter away, carelessly uttering digits from 0 to 9. The trouble is that everyone has a leaning or preference for certain digits or sequences of digits, such as 5 or 58 or 289 or 8827, and those preferences introduce a pattern. If a sequence of digits is truly random, then over a long enough run a given digit x will occur 10% of the time, a given sequence xy will occur 1% of the time, a given sequence xyz will occur 0.1% of the time, and a given sequence wxyz will occur 0.01% of the time. These percentages should hold, in the long run, for all possible sequences of digits of the given sizes, and similar rules should hold for sequences of any length. But if you speak or write down or keypunch digits for a few days and record the result, it's a good bet that the percentages will not work out this way. (Your imagination may seem wild to you, but there is some order to it no matter what.)
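One way to check a candidate sequence against these percentages is to tally the relative frequency of every digit and every pair of digits. The Python sketch below does this for a sequence produced by Python's own random module, which stands in here, by assumption, for a source of (pseudo)random digits.

```python
from collections import Counter
import random

def block_frequencies(digits, k):
    """Relative frequency of every k-digit block that appears in the sequence."""
    blocks = [digits[i:i + k] for i in range(len(digits) - k + 1)]
    counts = Counter(blocks)
    total = len(blocks)
    return {block: count / total for block, count in counts.items()}

# A pseudorandom stand-in for a "truly random" source (an assumption for illustration).
digits = "".join(random.choice("0123456789") for _ in range(100_000))

singles = block_frequencies(digits, 1)   # each digit should appear near 10%
pairs = block_frequencies(digits, 2)     # each two-digit block near 1%

print(f"'7'  occurs {singles.get('7', 0):.3%} of the time (ideal: 10%)")
print(f"'58' occurs {pairs.get('58', 0):.3%} of the time (ideal: 1%)")
```

Running the same tally on digits you type off the top of your head will usually reveal noticeably lopsided frequencies.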

Here's another definition of randomness. This definition is based on the idea that all artificial processes contain inherent orderliness:

  • In order for a sequence of digits to be random, there must exist no algorithm capable of generating the next digit in a sequence, on the basis of the digits already generated in that sequence.

According to this definition, if we can show that any digit in a sequence is a function of those before it, the sequence is not random. This rules out many sequences that seem random to the casual observer. For example, we can generate the value of the square root of 2 (or 2^(1/2)) with an algorithm called extraction of the square root. This algorithm can be applied to any whole number that is not a perfect square. If we have the patience, and if we know the first n digits of a square root, we can find the (n + 1)st digit by means of this process. It works every time, and the result is the same every time. The sequence of digits in the decimal expansion of the square root of 2, as well as the decimal expansions of π, e, or any other irrational number, is the same every time a computer grinds it out. The decimal expansions of irrational numbers are therefore not random-digit sequences.
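As a concrete illustration of this kind of determinism, the following Python sketch computes the opening digits of the square root of 2 using integer arithmetic (via math.isqrt). This is one standard way to obtain the digits deterministically; it is not necessarily the exact extraction procedure described above.

```python
from math import isqrt

def sqrt_digits(n, digits):
    """First `digits` decimal digits of sqrt(n), including any digits before
    the decimal point, computed deterministically with integer arithmetic."""
    # Scale n so that the integer square root carries the desired number of digits.
    scaled = n * 10 ** (2 * (digits - 1))
    return str(isqrt(scaled))

print(sqrt_digits(2, 20))   # 14142135623730950488, i.e. 1.4142135623730950488...
# Running this again always produces the same string of digits,
# which is exactly why such a sequence is not random.
```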

If the digits in any given irrational number fail to occur in a truly random sequence, where can we find digits that do occur randomly? Is there any such thing? If a random sequence of digits cannot be generated by an algorithm, does this rule out any thought process that allows us to identify the digits? Are we looking for something so elusive that, when we think we've found it, the very fact that we have gone through a thought process to find it proves that we have not? If that is true, how is the statistician to get hold of a random sequence that can actually be used?

In the interest of practicality, statisticians often settle for pseudorandom digits or numbers. The prefix pseudo- in this context means "pretend" or "for all practical purposes." Computer algorithms exist that can be used to generate strings of digits or numbers that can be considered random in most real-world applications.
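One of the oldest such algorithms is the linear congruential generator, in which each value is a fixed arithmetic function of the previous one. The sketch below uses a commonly quoted set of constants; both the constants and the seed are illustrative assumptions.

```python
def lcg(seed, a=1664525, c=1013904223, m=2**32):
    """A linear congruential generator: each state is a fixed function of the
    previous one, so the output is pseudorandom, not truly random."""
    state = seed
    while True:
        state = (a * state + c) % m
        yield state

gen = lcg(seed=12345)
# Use the higher-order bits of each state, which are less patterned than the
# lowest bits in this kind of generator, and reduce them to single digits.
digits = [(next(gen) >> 16) % 10 for _ in range(20)]
print(digits)
# Reseeding with 12345 reproduces exactly the same digits: the kind of
# orderliness that the second definition of randomness rules out.
```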

The Net to the Rescue

You can search the Internet and find sites with information about pseudorandom and random numbers. There's plenty of good reading on the Web (as well as plenty of nonsense), and even some downloadable programs that can turn a home computer into a generator of pseudorandom digits. For a good start, go to the Google search engine at www.google.com, bring up the page to conduct an advanced search, and then enter the phrase "random number generator." Be careful what you download! Make sure your anti-virus program is effective and up to date. If you are uneasy about downloading stuff from the Web, then don't do it.

A safer source of random digits is a site maintained by the well-known and respected mathematician Dr. Mads Haahr of Trinity College in Dublin, Ireland. It can be brought up by pointing your Web browser to www.random.org. The author describes the difference between pseudorandom and truly random numbers. He also provides plenty of interesting reading on the subject, and links to sites for further research.
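For what it's worth, random.org has long offered a plain-text HTTP interface that can be queried from a program. The URL and query parameters in the sketch below reflect that interface as it has been documented and are an assumption here; check the site itself for the current details before relying on them.

```python
from urllib.request import urlopen

# Plain-text request for 50 digits between 0 and 9, one per line.
# The query parameters are an assumption based on random.org's documented
# HTTP interface and may have changed.
URL = ("https://www.random.org/integers/"
       "?num=50&min=0&max=9&col=1&base=10&format=plain&rnd=new")

with urlopen(URL) as response:
    text = response.read().decode("ascii")

digits = [int(line) for line in text.split()]
print(digits)
```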

Dr. Haahr's Web site makes use of electromagnetic noise to obtain real-time random-number and pseudorandom-number sequences. For this scheme to work, there must exist no sources of orderly noise near enough to be picked up by the receiver. Orderly noise sources include internal combustion engines and certain types of electrical appliances such as old light dimmers. The hissing and crackling that you hear in a radio receiver when it is tuned to a vacant channel is mostly electromagnetic noise from the earth's atmosphere and from outer space. Some electrical noise also comes from the internal circuitry of the radio.
