Chaos theory

In mathematics, chaos theory describes the behavior of certain dynamical systems – that is, systems whose state evolves with time – that may exhibit dynamics highly sensitive to initial conditions (popularly referred to as the butterfly effect). As a result of this sensitivity, which manifests itself as an exponential growth of perturbations in the initial conditions, the behavior of chaotic systems appears to be random. This happens even though these systems are deterministic, meaning that their future dynamics are fully defined by their initial conditions, with no random elements involved. This behavior is known as deterministic chaos, or simply chaos. Chaotic behaviour is also observed in natural systems, such as the weather. This may be explained by a chaos-theoretical analysis of a mathematical model of such a system, embodying the laws of physics that are relevant for the natural system.

Overview

Chaotic behavior has been observed in the laboratory in a variety of systems including electrical circuits, lasers, oscillating chemical reactions, fluid dynamics, and mechanical and magneto-mechanical devices. Observations of chaotic behaviour in nature include the dynamics of satellites in the solar system, the time evolution of the magnetic field of celestial bodies, population growth in ecology, the dynamics of action potentials in neurons, and molecular vibrations. Everyday examples of chaotic systems include weather and climate.[1] There is some controversy over the existence of chaotic dynamics in plate tectonics and in economics.[2][3][4] Systems that exhibit mathematical chaos are deterministic and thus orderly in some sense; this technical use of the word chaos is at odds with common parlance, which suggests complete disorder. A related field of physics called quantum chaos theory studies systems that follow the laws of quantum mechanics. Recently, another field, called relativistic chaos,[5] has emerged to describe systems that follow the laws of general relativity. As well as being orderly in the sense of being deterministic, chaotic systems usually have well-defined statistics.[citation needed] For example, the Lorenz system is chaotic, but has a clearly defined structure. Bounded chaos is a useful term for describing models of disorder.

History

The first discoverer of chaos was Henri Poincaré. In 1890, while studying the three-body problem, he found that there can be orbits which are nonperiodic, and yet neither forever increasing nor approaching a fixed point.[6] In 1898 Jacques Hadamard published an influential study of the chaotic motion of a free particle gliding frictionlessly on a surface of constant negative curvature.[7] In the system studied, "Hadamard's billiards," Hadamard was able to show that all trajectories are unstable, in that all particle trajectories diverge exponentially from one another, with a positive Lyapunov exponent. Much of the earlier theory was developed almost entirely by mathematicians, under the name of ergodic theory. Later studies, also on the topic of nonlinear differential equations, were carried out by G. D. Birkhoff,[8] A. N. Kolmogorov,[9][10][11] M. L. Cartwright and J. E. Littlewood,[12] and Stephen Smale.[13] Except for Smale, these studies were all directly inspired by physics: the three-body problem in the case of Birkhoff, turbulence and astronomical problems in the case of Kolmogorov, and radio engineering in the case of Cartwright and Littlewood.
Although chaotic planetary motion had not been observed, experimentalists had encountered turbulence in fluid motion and nonperiodic oscillation in radio circuits without the benefit of a theory to explain what they were seeing. Despite initial insights in the first half of the twentieth century, chaos theory became formalized as such only after mid-century, when it first became evident to some scientists that linear theory, the prevailing system theory at that time, simply could not explain the observed behaviour of certain experiments like that of the logistic map. What had previously been excluded as measurement imprecision and simple "noise" was considered by chaos theorists as a full component of the studied systems.

The main catalyst for the development of chaos theory was the electronic computer. Much of the mathematics of chaos theory involves the repeated iteration of simple mathematical formulas, which would be impractical to do by hand. Electronic computers made these repeated calculations practical, while figures and images made it possible to visualize these systems. One of the earliest electronic digital computers, ENIAC, was used to run simple weather forecasting models.

An early pioneer of the theory was Edward Lorenz, whose interest in chaos came about accidentally through his work on weather prediction in 1961.[14] Lorenz was using a simple digital computer, a Royal McBee LGP-30, to run his weather simulation. He wanted to see a sequence of data again, and to save time he started the simulation in the middle of its course. He was able to do this by entering a printout of the data corresponding to conditions in the middle of his simulation which he had calculated the previous time. To his surprise, the weather that the machine began to predict was completely different from the weather calculated before. Lorenz tracked this down to the computer printout. The computer worked with 6-digit precision, but the printout rounded variables off to 3 digits, so a value such as 0.506127 was printed as 0.506. This difference is tiny, and the consensus at the time would have been that it should have had practically no effect. However, Lorenz had discovered that small changes in initial conditions produced large changes in the long-term outcome.[15] Lorenz's discovery, which gave its name to Lorenz attractors, proved that meteorology could not reasonably predict weather beyond a weekly period (at most).

The year before, Benoit Mandelbrot had found recurring patterns at every scale in data on cotton prices.[16] Beforehand, he had studied information theory and concluded that noise was patterned like a Cantor set: on any scale the proportion of noise-containing periods to error-free periods was a constant – thus errors were inevitable and must be planned for by incorporating redundancy.[17] Mandelbrot described both the "Noah effect" (in which sudden discontinuous changes can occur, e.g., in a stock's prices after bad news, thus challenging normal distribution theory in statistics, a.k.a. the bell curve) and the "Joseph effect" (in which persistence of a value can occur for a while, yet suddenly change afterwards).[18][19] In 1967, he published "How long is the coast of Britain?
Statistical self-similarity and fractional dimension," showing that a coastline's length varies with the scale of the measuring instrument, resembles itself at all scales, and is infinite in length for an infinitesimally small measuring device.[20] Arguing that a ball of twine appears to be a point when viewed from far away (0-dimensional), a ball when viewed from fairly near (3-dimensional), or a curved strand when viewed from very close (1-dimensional), he argued that the dimensions of an object are relative to the observer and may be fractional. An object whose irregularity is constant over different scales ("self-similarity") is a fractal (for example, the Koch curve or "snowflake", which is infinitely long yet encloses a finite space and has a fractal dimension of about 1.26, the Menger sponge, and the Sierpiński gasket). In 1982 Mandelbrot published The Fractal Geometry of Nature, which became a classic of chaos theory. Biological systems such as the branching of the circulatory and bronchial systems proved to fit a fractal model.

Chaos was observed by a number of experimenters before it was recognized; for example, in 1927 by van der Pol[21] and in 1958 by R. L. Ives.[22][23] However, Yoshisuke Ueda seems to have been the first experimenter to have identified a chaotic phenomenon as such, using an analog computer on November 27, 1961. The chaos exhibited by an analog computer is a genuinely physical phenomenon, in contrast with what digital computers calculate, which is subject to a different kind of limit on precision. Ueda's supervising professor, Hayashi, did not believe in chaos, and so he prohibited Ueda from publishing his findings until 1970.[24]

In December 1977 the New York Academy of Sciences organized the first symposium on chaos, attended by David Ruelle, Robert May, James Yorke (coiner of the term "chaos" as used in mathematics), Robert Shaw (a physicist, part of the Eudaemons group with J. Doyne Farmer and Norman Packard, who tried to find a mathematical method to beat roulette and then created with them the Dynamical Systems Collective in Santa Cruz), and the meteorologist Edward Lorenz. The following year, Mitchell Feigenbaum published the noted article "Quantitative Universality for a Class of Nonlinear Transformations", where he described logistic maps.[25] Feigenbaum had applied fractal geometry to the study of natural forms such as coastlines. Feigenbaum notably discovered the universality in chaos, permitting an application of chaos theory to many different phenomena.

In 1979, Albert J. Libchaber, during a symposium organized in Aspen by Pierre Hohenberg, presented his experimental observation of the bifurcation cascade that leads to chaos and turbulence in convective Rayleigh–Bénard systems. He was awarded the Wolf Prize in Physics in 1986 along with Mitchell J. Feigenbaum "for his brilliant experimental demonstration of the transition to turbulence and chaos in dynamical systems".[26] The New York Academy of Sciences then co-organized, in 1986, with the National Institute of Mental Health and the Office of Naval Research, the first important conference on chaos in biology and medicine. There, Bernardo Huberman presented a mathematical model of the eye-tracking disorder among schizophrenics.[27] Chaos theory thereafter renewed physiology in the 1980s, for example in the study of pathological cardiac cycles.
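The fractional dimensions quoted above can be checked with a short calculation. Below is a minimal Python sketch, assuming the standard similarity-dimension formula D = ln N / ln(1/s) for a shape made of N copies of itself, each scaled down by a factor s (the function name similarity_dimension is only illustrative):

import math

def similarity_dimension(copies, scale):
    # dimension of a self-similar set made of `copies` pieces, each scaled by `scale`
    return math.log(copies) / math.log(1.0 / scale)

# Koch curve: each segment is replaced by 4 segments, each 1/3 as long.
print(similarity_dimension(4, 1/3))   # ~1.2619, the "about 1.26" mentioned above

# Measured length of a Koch-like coastline grows as the ruler shrinks:
# with ruler (1/3)^k the measured length is (4/3)^k, diverging as k grows.
for k in range(6):
    ruler = (1/3) ** k
    length = (4/3) ** k
    print(f"ruler={ruler:.4f}  measured length={length:.3f}")

For the Koch curve (four copies at one third the scale) this gives roughly 1.2619, and the measured length grows without bound as the ruler shrinks, which is the coastline behaviour Mandelbrot described.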
In 1987, Per Bak, Chao Tang and Kurt Wiesenfeld published a paper in Physical Review Letters[28] describing for the first time self-organized criticality (SOC), considered to be one of the mechanisms by which complexity arises in nature. Alongside largely lab-based approaches such as the Bak–Tang–Wiesenfeld sandpile, many other investigations have centred around large-scale natural or social systems that are known (or suspected) to display scale-invariant behaviour. Although these approaches were not always welcomed (at least initially) by specialists in the subjects examined, SOC has nevertheless become established as a strong candidate for explaining a number of natural phenomena, including: earthquakes (which, long before SOC was discovered, were known as a source of scale-invariant behaviour such as the Gutenberg–Richter law describing the statistical distribution of earthquake sizes, and the Omori law[29] describing the frequency of aftershocks); solar flares; fluctuations in economic systems such as financial markets (references to SOC are common in econophysics); landscape formation; forest fires; landslides; epidemics; and biological evolution (where SOC has been invoked, for example, as the dynamical mechanism behind the theory of "punctuated equilibria" put forward by Niles Eldredge and Stephen Jay Gould). Worryingly, given the implications of a scale-free distribution of event sizes, some researchers have suggested that another phenomenon that should be considered an example of SOC is the occurrence of wars. These "applied" investigations of SOC have included both attempts at modelling (either developing new models or adapting existing ones to the specifics of a given natural system) and extensive data analysis to determine the existence and/or characteristics of natural scaling laws.

The same year, James Gleick published Chaos: Making a New Science, which became a best-seller and introduced the general principles of chaos theory, as well as its history, to the broad public. At first the domain of work of a few isolated individuals, chaos theory progressively emerged as a transdisciplinary and institutional discipline, mainly under the name of nonlinear systems analysis. Alluding to Thomas Kuhn's concept of a paradigm shift expounded in The Structure of Scientific Revolutions (1962), many "chaologists" (as some named themselves) claimed that this new theory was an example of such a shift, a thesis upheld by J. Gleick. The availability of cheaper, more powerful computers has broadened the applicability of chaos theory. Currently, chaos theory continues to be a very active area of research, involving many different disciplines (mathematics, topology, physics, population biology, biology, meteorology, astrophysics, information theory, etc.).

Chaotic dynamics

For a dynamical system to be classified as chaotic, it must have the following properties:[30] it must be sensitive to initial conditions, it must be topologically mixing, and its periodic orbits must be dense. Sensitivity to initial conditions means that each point in such a system is arbitrarily closely approximated by other points with significantly different future trajectories. Thus, an arbitrarily small perturbation of the current trajectory may lead to significantly different future behaviour.
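This sensitivity can be seen numerically. A minimal sketch (not a reference implementation), using the logistic map x_{n+1} = 4x_n(1 − x_n), a chaotic map discussed later in this article, and two initial conditions differing by one part in a billion:

def step(x):
    # logistic map at r = 4, a standard example of a chaotic map
    return 4.0 * x * (1.0 - x)

x, y = 0.2, 0.2 + 1e-9      # two initial conditions differing by one part in a billion
for n in range(1, 41):
    x, y = step(x), step(y)
    if n % 10 == 0:
        print(f"after {n:2d} steps: |difference| = {abs(x - y):.3e}")
# The gap grows roughly exponentially (about a factor of two per step on average)
# until it is as large as the attractor itself, after which the two trajectories
# are effectively unrelated, even though both obey the same deterministic rule.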
Sensitivity to initial conditions is popularly known as the "butterfly effect", so called because of the title of a paper given by Edward Lorenz in 1972 to the American Association for the Advancement of Science in Washington, D.C., entitled Predictability: Does the Flap of a Butterfly's Wings in Brazil Set off a Tornado in Texas? The flapping wing represents a small change in the initial condition of the system, which causes a chain of events leading to large-scale phenomena. Had the butterfly not flapped its wings, the trajectory of the system might have been vastly different.

Sensitivity to initial conditions is often confused with chaos in popular accounts. It can also be a subtle property, since it depends on a choice of metric, or the notion of distance in the phase space of the system. For example, consider the simple dynamical system produced by repeatedly doubling an initial value (defined by iterating the mapping on the real line that maps x to 2x). This system has sensitive dependence on initial conditions everywhere, since any pair of nearby points will eventually become widely separated. However, it has extremely simple behaviour, as all points except 0 tend to infinity. If instead we use the bounded metric on the line obtained by adding the point at infinity and viewing the result as a circle, the system is no longer sensitive to initial conditions. For this reason, in defining chaos, attention is normally restricted to systems with bounded metrics, or to closed, bounded invariant subsets of unbounded systems.

Even for bounded systems, sensitivity to initial conditions is not identical with chaos. For example, consider the two-dimensional torus described by a pair of angles (x, y), each ranging between zero and 2π. Define a mapping that takes any point (x, y) to (2x, y + a), where a is any number such that a/2π is irrational. Because of the doubling in the first coordinate, the mapping exhibits sensitive dependence on initial conditions. However, because of the irrational rotation in the second coordinate, there are no periodic orbits, and hence the mapping is not chaotic according to the definition above.

Topological mixing means that the system will evolve over time so that any given region or open set of its phase space will eventually overlap with any other given region. Here, "mixing" is really meant to correspond to the standard intuition: the mixing of colored dyes or fluids is an example of a chaotic system.

Linear systems are never chaotic; for a dynamical system to display chaotic behaviour it has to be nonlinear. Also, by the Poincaré–Bendixson theorem, a continuous dynamical system on the plane cannot be chaotic; among continuous systems, only those whose phase space is non-planar (having dimension at least three, or with a non-Euclidean geometry) can exhibit chaotic behaviour. However, a discrete dynamical system (such as the logistic map) can exhibit chaotic behaviour in a one-dimensional or two-dimensional phase space.

Attractors

Some dynamical systems are chaotic everywhere (see, for example, Anosov diffeomorphisms), but in many cases chaotic behaviour is found only in a subset of phase space. The cases of most interest arise when the chaotic behaviour takes place on an attractor, since then a large set of initial conditions will lead to orbits that converge to this chaotic region. An easy way to visualize a chaotic attractor is to start with a point in the basin of attraction of the attractor, and then simply plot its subsequent orbit.
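A minimal sketch of this plotting procedure, assuming the Hénon map (a discrete chaotic system mentioned below) with its commonly used parameters a = 1.4 and b = 0.3; the coarse ASCII rendering is only there to keep the sketch self-contained, and a real plot would use a scatter plot:

# iterate the Hénon map from a point in the basin of attraction
# and plot the subsequent orbit of points (x, y)
a, b = 1.4, 0.3
x, y = 0.0, 0.0

points = []
for n in range(20000):
    x, y = 1 - a * x * x + y, b * x
    if n > 100:                 # discard the transient before the orbit settles onto the attractor
        points.append((x, y))

# crude character plot: x in [-1.5, 1.5], y in [-0.45, 0.45]
COLS, ROWS = 70, 22
grid = [[" "] * COLS for _ in range(ROWS)]
for px, py in points:
    c = int((px + 1.5) / 3.0 * (COLS - 1))
    r = int((0.45 - py) / 0.9 * (ROWS - 1))
    if 0 <= c < COLS and 0 <= r < ROWS:
        grid[r][c] = "*"
print("\n".join("".join(row) for row in grid))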
Because of the topological transitivity condition, plotting an orbit in this way is likely to produce a picture of the entire final attractor. For instance, in a system describing a pendulum, the phase space might be two-dimensional, consisting of information about position and velocity. One might plot the position of a pendulum against its velocity. A pendulum at rest will be plotted as a point, and one in periodic motion will be plotted as a simple closed curve. When such a plot forms a closed curve, the curve is called an orbit. Our pendulum has an infinite number of such orbits, forming a pencil of nested ellipses about the origin.

Strange attractors

While most of the motion types mentioned above give rise to very simple attractors, such as points and circle-like curves called limit cycles, chaotic motion gives rise to what are known as strange attractors, attractors that can have great detail and complexity. For instance, a simple three-dimensional model of the Lorenz weather system gives rise to the famous Lorenz attractor. The Lorenz attractor is perhaps one of the best-known chaotic system diagrams, probably because it was not only one of the first but is also one of the most complex, and as such gives rise to a very interesting pattern which looks like the wings of a butterfly. Another such attractor is the Rössler map, which undergoes a period-doubling route to chaos, like the logistic map.

Strange attractors occur in both continuous dynamical systems (such as the Lorenz system) and in some discrete systems (such as the Hénon map). Other discrete dynamical systems have a repelling structure called a Julia set, which forms at the boundary between basins of attraction of fixed points; Julia sets can be thought of as strange repellers. Both strange attractors and Julia sets typically have a fractal structure. The Poincaré–Bendixson theorem shows that a strange attractor can only arise in a continuous dynamical system if it has three or more dimensions. However, no such restriction applies to discrete systems, which can exhibit strange attractors in two or even one dimensions. The initial conditions of three or more bodies interacting through gravitational attraction (see the n-body problem) can be arranged to produce chaotic motion.

Minimum complexity of a chaotic system

Simple systems can also produce chaos without relying on differential equations. An example is the logistic map, which is a difference equation (recurrence relation) that describes population growth over time. Another example is the Ricker model of population dynamics. Even the evolution of simple discrete systems, such as cellular automata, can depend heavily on initial conditions. Stephen Wolfram has investigated a cellular automaton with this property, which he terms rule 30. A minimal model for conservative (reversible) chaotic behavior is provided by Arnold's cat map.
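A minimal sketch of the logistic map just mentioned, x_{n+1} = r·x_n·(1 − x_n): depending on the parameter r, the long-run behaviour settles to a fixed point, a periodic cycle, or chaos (the particular r values below are only illustrative of the well-known regimes):

def orbit_tail(r, x0=0.2, transient=500, keep=8):
    # iterate the logistic map and return the last few values after a transient
    x = x0
    for _ in range(transient):
        x = r * x * (1.0 - x)
    tail = []
    for _ in range(keep):
        x = r * x * (1.0 - x)
        tail.append(round(x, 4))
    return tail

for r in (2.8, 3.2, 3.5, 3.9):
    print(f"r = {r}: {orbit_tail(r)}")
# r = 2.8 settles to a single fixed point, r = 3.2 to a 2-cycle,
# r = 3.5 to a 4-cycle, and r = 3.9 shows no repeating pattern (chaos).

This period-doubling sequence is the same cascade studied by Feigenbaum and observed experimentally by Libchaber, as described in the history above.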
A translation of the original text of a paper on an optimization algorithm based on chaotic characteristics and fractal properties :) the one by Mohammad Saleh Tavazoei & Mohammad Haeri?
Let me copy it down first; once the translation is done I'll upload it for you!
The text in post #1 was obviously run through Google translation software.
Malthus's thinking was to a large extent a reaction against the optimistic ideas of his father and his father's friends (such as Rousseau). Some of his essays were also a response to Condorcet. In An Essay on the Principle of Population, published in 1798, Malthus made a famous prediction: starting from two abstract premises, that the passion between the sexes is necessary and will remain roughly in its present state, and that food is necessary for human existence, he asserted that of the two, the power of population to multiply is far greater than the power of the earth to produce subsistence for man. Population increases in a geometrical ratio while the means of subsistence increase in an arithmetical ratio, producing overpopulation and, unavoidably, hunger, poverty and unemployment. "The power of population is so superior to the power of the earth to produce subsistence for man, that premature death must in some shape or other visit the human race. The vices of mankind are active and able ministers of depopulation. They are the precursors in the great army of destruction; and often finish the dreadful work themselves. But should they fail in this war of extermination, sickly seasons, epidemics, pestilence, and plague, advance in terrific array, and sweep off their thousands and tens of thousands. Should success be still incomplete, gigantic inevitable famine stalks in the rear, and with one mighty blow levels the population with the food of the world."

The basic idea of the Essay on the Principle of Population is that, if unchecked, population grows at an exponential rate (i.e. 2, 4, 8, 16, 32, 64, 128, and so on) while the food supply grows at a linear rate (i.e. 1, 2, 3, 4, 5, 6, 7, and so on). Note: the corresponding terms Malthus himself used were geometrical and arithmetical. Only natural causes (accidents and old age), catastrophe (war, plague, and famine of every kind), moral restraint, and vice (by which Malthus meant infanticide, murder, birth control and homosexuality) can check excessive population growth. See Malthusian catastrophe. Malthus favoured moral restraint (including late marriage and abstinence) as a means of controlling population growth. It is worth noting, however, that he recommended such measures only for the labouring and poor classes. He openly argued that private property helps to restrain population growth, while strongly opposing poor relief. On his theory, then, the lower social classes bear a larger share of responsibility for social ills. This ultimately helped drive legislation that worsened the living conditions of England's poor, while also slowing the growth of the pauper population. Malthus himself noticed that many people misused his theory and took pains to make clear that he was not merely predicting a future catastrophe. He argued that "...the causes of periodical misery have existed since the beginning of human history, exist at present, and will continue to exist in the future, unless a decisive change takes place in the physical constitution of our nature." Malthus therefore regarded his Essay as an explanation of humanity's past and present condition, as well as a prediction of our future. In addition, many have objected that Malthus failed to recognise humanity's ability to increase the food supply. On this topic Malthus wrote, "The main peculiarity which distinguishes man from other animals is his means of support, and his power of very greatly increasing these means."

At the East India Company College, Malthus developed a theory of imbalance between supply and demand, which he called a "glut". At the time this was regarded as an absurd theory, but it was a forerunner of the later economic theories of the Great Depression, and his admirer, the economist John Maynard Keynes, brought the idea into his own work. Previously, a high birth rate had been considered good for the economy because it supplied more labour. Malthus, however, looked at birth rates from a new perspective and convinced most economists that even though a high birth rate may increase gross output, it tends to reduce output per capita. Malthus was widely influential; his admirers included the well-known economist David Ricardo, among others. One of the best-known disciples of his theory was the British Prime Minister William Pitt the Younger. In the 1830s Malthus's writings strongly influenced the Whigs, who turned away from Tory paternalism and introduced the Poor Law Amendment Act of 1834. Attention to Malthus's theory also helped bring about the national census in Britain; in 1801 the government official John Rickman directed the first modern census. The disciples of Malthus's population theory also included the well-known creationist and natural theologian Archdeacon William Paley, who published Natural Theology in 1802 and held that Malthus's principle of population demonstrated the existence of God. Ironically, although Malthus himself opposed contraception, his work strongly influenced Francis Place, who launched the neo-Malthusian movement to promote birth control. Place published his Illustrations and Proofs of the Principle of Population in 1822.

Malthus's theory had a decisive influence on the founders of modern evolutionary theory, Darwin and Alfred Russel Wallace. Darwin said in On the Origin of Species that his theory was an application of Malthus's theory to a domain without the intervention of human intelligence. Darwin remained an admirer of Malthus throughout his life, calling him "a great philosopher". Wallace called Malthus's book "...the most important book I have read", and described it as "the most interesting coincidence" that he and Darwin had each independently arrived at the theory of evolution through studying Malthus. Evolutionary theorists generally acknowledge that Malthus unintentionally made many contributions to the theory of evolution. Malthus's thinking on the population problem underlies modern evolutionary theory; he sharpened the observation of the "struggle for existence" under conditions of "limited growth". Because of Malthus's theory, Darwin recognised that the struggle for existence occurs not only between species but also within a single species. Julian Huxley, the evolutionist, humanist and one of the founders of UNESCO, described the "crowded world" in his 1964 book Evolutionary Humanism and called for a "world population policy". The debates within international organisations such as the United Nations Population Fund about how many people the earth can support trace back to Malthus. Malthus remains influential to this day; one example is the Club of Rome's 1972 report The Limits to Growth, and the Global 2000 report delivered to the then President of the United States. The science-fiction writer Isaac Asimov published many essays on population control that reflect views derived from Malthus. Malthus is regarded as a founder of modern demography.

Malthus claimed that his principle of population was a natural law that applied not only to humans but universally to all species. It can now be shown that nothing grows exponentially at a fixed rate. Malthus's arithmetical model of the food supply has been generally rejected, because over the past two centuries the food supply has kept pace with population growth. The most vocal opposition to Malthus came from the communists of the mid-nineteenth century, Karl Marx (Capital, 1867) and Friedrich Engels (Outlines of a Critique of Political Economy, 1844). In their view, what Malthus described as population putting pressure on the means of production was in reality the means of production putting pressure on population. In other words, Malthus blamed surplus population for low productivity, which was in fact a consequence of a turbulent capitalist economy; crises, unemployment and poverty are products of the rule of capitalist private property. Engels called Malthus's theory "...the crudest, most barbarous theory that ever existed, a system of despair which struck down all those beautiful phrases about love thy neighbour and world citizenship." Despite the Russian famine of 1921, communists on the whole continued to oppose Malthusian theory. At the United Nations population conference held in Rome in 1954, the Soviet delegate declared that "in socialist countries the problem of overpopulation has never arisen... Malthusian theory is completely wrong." The Chinese Communist Party was overconfident about population during the Great Leap Forward, and Malthus's population theory was criticised in China at the time. After the three years of natural disasters, China began to implement the family-planning policy, changing its view of population control.

Malthus's conclusions have been criticised by many twentieth-century economists. Thanks to technological progress, large-scale population growth did not produce a Malthusian catastrophe, and some have therefore called him a failed prophet of doom. First, it is widely accepted that population growth has almost never been exponential; there are too many variables for a simple mathematical model to capture. Since Malthus's time, population growth rates have levelled off, thanks largely to economic prosperity. In Malthus's lifetime the population of Britain was in the rising phase before growth rates flattened, and he did not study the large populations of Asia or the evidence of flat birth rates over the preceding millennia. Second, food production has not grown linearly either. In particular, social and agricultural advances have allowed food production to outpace population growth, although this growth has come at the cost of a heavy resource burden and massive fertiliser use and has not been shown to be sustainable. Catholic economists reject Malthusian theory, criticising it as nothing more than a copy of a seventeenth-century Venetian heresy. The Catholic Encyclopedia writes that "...he contributed nothing to human knowledge or welfare, ...spreading the fear of overpopulation, ...the proper remedies are to be sought in better social and industrial arrangements, better medical provision, and the promotion of moral and religious education."
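As a purely illustrative sketch of the geometric-versus-arithmetic argument above (the starting values and growth rates are arbitrary, not Malthus's own figures):

population = 1.0     # grows geometrically (doubles each period)
food = 1.0           # grows arithmetically (one unit per period)

for period in range(1, 11):
    population *= 2
    food += 1
    note = "  <- population outstrips food" if population > food else ""
    print(f"period {period:2d}: population={population:7.0f}  food={food:4.0f}{note}")
# However fast the arithmetic increment, a doubling sequence eventually overtakes it,
# which is the core of Malthus's claim; the logistic model discussed earlier in the
# chaos-theory article instead caps growth as resources become scarce.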