New Economic Windows
Series Editor
MASSIMO SALZANO
Series Editorial Board
Jaime Gil Aluja
Departament d’Economia i Organització d’Empreses, Universitat de Barcelona, Spain
Fortunato Arecchi
Dipartimento di Fisica, Università di Firenze and INOA, Italy
David Colander
Department of Economics, Middlebury College, Middlebury, VT, USA
Richard H. Day
Department of Economics, University of Southern California, Los Angeles, USA
Mauro Gallegati
Dipartimento di Economia, Università di Ancona, Italy
Steve Keen
School of Economics and Finance, University of Western Sydney, Australia
Giulia Iori
Department of Mathematics, King’s College, London, UK
Alan Kirman
GREQAM/EHESS, Université d’Aix-Marseille III, France
Marji Lines
Dipartimento di Scienze Statistiche, Università di Udine, Italy
Thomas Lux
Department of Economics, University of Kiel, Germany
Alfredo Medio
Dipartimento di Scienze Statistiche, Università di Udine, Italy
Paul Ormerod
Directors of Environment Business-Volterra Consulting, London, UK
Peter Richmond
School of Physics, Trinity College, Dublin 2, Ireland
J. Barkley Rosser
Department of Economics, James Madison University, Harrisonburg, VA, USA
Sorin Solomon
Racah Institute of Physics, The Hebrew University of Jerusalem, Israel
Pietro Terna
Dipartimento di Scienze Economiche e Finanziarie, Università di Torino, Italy
Kumaraswamy (Vela) Velupillai
Department of Economics, National University of Ireland, Ireland
Nicolas Vriend
Department of Economics, Queen Mary University of London, UK
Lotfi Zadeh
Computer Science Division, University of California Berkeley, USA
Editorial Assistants
Maria Rosaria Alfano
Marisa Faggini
Dipartimento di Scienze Economiche e Statistiche, Università di Salerno, Italy
Massimo Salzano • David Colander
Complexity Hints
for Economic Policy
MASSIMO SALZANO
Dipartimento di Scienze Economiche e Statistiche
Università degli Studi di Salerno, Italy
David Colander
Middlebury College, Middlebury, VT, USA
Library of Congress Control Number: 2006930687
ISBN 978-88-470-0533-4 Springer Milan Berlin Heidelberg New York
This work is subject to copyright. All rights are reserved, whether the whole or part of the
material is concerned, specifically the rights of translation, reprinting, re-use of illustrations,
recitation, broadcasting, reproduction on microfilms or in other ways, and storage in data banks. Duplication of this publication or parts thereof is only permitted under the provisions
of the Italian Copyright Law in its current version, and permission for use must always be
obtained from Springer-Verlag. Violations are liable for prosecution under the Italian
Copyright Law.
Springer is a part of Springer Science+Business Media
© Springer-Verlag Italia 2007
Printed in Italy
Cover design: Simona Colombo, Milano
Typeset by the authors using a Springer Macro package
Printing and binding: Grafiche Porpora, Segrate (MI)
Printed on acid-free paper
The publication of this book has been made possible thanks to the financial
support of the MIUR-FIRB RBAU01 B49F
springer.com
Preface
To do science is to find patterns, and scientists are always looking for patterns that they can use to structure their thinking about the world around
them. Patterns are found in data, which is why science is inevitably a quantitative study. But there is a difficulty in finding stable patterns in the data
since many patterns are temporary phenomena that have occurred randomly, and highly sophisticated empirical methods are necessary to distinguish stable patterns from temporary or random patterns.
When a scientist thinks he has found a stable pattern, he will generally
try to capture that pattern in a model or theory. A theory is essentially a
pattern, and thus theory is a central part of science. It would be nice to
have a single pattern - a unified theory - that could serve as a map relating
our understanding with the physical world around us. But the physical
world has proven far too complicated for a single map, and instead we
have had to develop smaller sub-maps that relate to small areas of the
physical world around us. This multiple theory approach presents the problem of deciding not only what the appropriate map for the particular issue
is, but also of handling the map overlays where different maps relate to
overlapping areas of reality.
It is not only science that is focused on finding patterns; so too are
most individuals. In fact, as pointed out by Andy Clark (1993), human
brains are ‘associative engines’ that can be thought of as fast pattern completers - the human brain has evolved to see aspects of the physical world
and to create a pattern that places that aspect in context and allows individuals to draw broader implications from very limited data.¹ W. Brian Arthur gives the following example: ‘If I see a tail going around a corner, and it’s a black swishy tail, I say, “There’s a cat!”’ There are patterns in music, art, religion, business; indeed, humans find patterns in just about everything that they do.²

——————
¹ This discussion is based on observations by W. Brian Arthur (2000).
² Humans’ ‘fast pattern completer’ brains have been shown to be less rational than economists previously supposed. The new “neuroeconomics” has in fact demonstrated that many choices are made more on an emotional than on a rational basis.
Determining when a pattern fits, when there are multiple patterns mapping to the same physical phenomena, and which pattern is the appropriate
pattern, is a difficult task that is the subject of much debate. Science differentiates itself from other areas of inquiry by setting rules about when a pattern can be assumed to fit, and when not, and what the structure of the map
can be. It is essentially a limit on the fast pattern completion nature of humans. For example, that swishy tail could be a small boy who is playing a trick with a tail on one end of a stick.
To prevent too-fast pattern completion, and hence mistakes, standard
science requires the map to be a formal model that could be specified in a
set of equations, determined independently of the data sequence for which
it is a pattern, and that the patterns match the physical reality to a certain
degree of precision. That approach places standard logic at the center of
science. Arthur tells a story of Bertrand Russell’s to make this point. A
schoolboy, a parson, and a mathematician are crossing from England into
Scotland in a train. The schoolboy looks out and sees a black sheep and
says, ‘Oh! Look! Sheep in Scotland are black!’ The parson, who is learned,
but who represents a low level of science, says, ‘No. Strictly speaking, all
we can say is that there is one sheep in Scotland that is black’. The
mathematician, who might well represent a skilled theoretical scientist,
says, ‘No, that is still not correct. All we can really say is that we know
that in Scotland there exists at least one sheep, at least one side of which is
black’. The point is that science, and the formal models that underlie it,
works as a brake on our natural proclivity to complete patterns.
In many areas the standard science approach has served us well, and
has added enormous insight. By insisting on precision, scientists have built
an understanding that fits not just the hastily observed phenomena, but the
carefully observed phenomena, thereby developing much closer-fitting patterns.
Once individuals learn those patterns, the patterns become obvious to them
- yes, that table is actually nothing but a set of atoms - but without science,
what is obvious to us now would never have become obvious. In other areas, though, we have been unable to find a precise set of equations or
models that match the data representing the physical world, leaving large
areas of physical reality outside the realm of science. For many scientists,
a group of areas that have proven impervious to existing mappings includes most of the social sciences; these areas are simply too hard for traditional science.
The difficulty of finding a precise map has not stopped social scientists
from developing precise formal models and attempting to fit those precise
models to the data, and much of the methodological debate in the social
sciences concerns what implications we can draw from the vague imprecise mappings that one can get with existing analytic theories and data series. Critics of economics have called it almost useless - the celestial mechanics of a nonexistent universe. Until recently, the complaints of critics
have been ignored, not because standard economists did not recognize the
problems, but because the way we were doing economics was the best we
could do. But science, like all fields, is subject to technological change,
and recently there have been significant changes in analytic and computational technology that are allowing new theories to develop. Similar advances are occurring in empirical measurement, providing scientists with
much more data, and hence many more areas to place finer patterns on,
and in the analytics of empirical measurement, which allows the patterns
developed by theories to be brought to the data better.³ Such technological change has been occurring in the economics profession over the last decade, and that technological change is modifying the way economics is done. This book captures some of that change.

——————
³ As usual, the reality is more complicated than can be presented in a brief introduction. The problem is that measurement does not stand alone, but is based on theory. Discontent with traditional theories can lead to a search for new ways of measurement, and to improvements in the quality of data.
The new work is sometimes described as complexity theory, and in
many ways that is a helpful and descriptive term that we have both used
(Colander 2000a, b; Salzano and Kirman 2004). But it is also a term that is
often misused by the popular press and conveys to people that the complexity approach is a whole new way of doing economics, and that it is a
replacement for existing economics. It is neither of those things. The complexity approach is simply the integration of some new analytic and computational techniques into economists’ bag of tools. We see the new work
as providing some alternative pattern generators, which can supplement
existing approaches by providing an alternative way of finding patterns to those that can be obtained by the traditional scientific approach.
The problem with the use of the complexity moniker comes about because, as discussed above, individuals, by nature, are fast pattern completers. This has led some scientific reporters, and some scientists in their non-scientific hats, to speculate about possible patterns that can follow from the
new models and techniques. As this speculation develops, the terms get
away from the scientists, and are no longer seen as simply descriptions of
certain properties of specific mathematical models, but instead as grand
new visions of how science is to be done, and of our understanding of reality. Such a grandiose vision makes it seem that complexity is an alternative
to standard science, when it is actually simply a continuation of science as
usual.⁴

——————
⁴ This is not dissimilar from what has already happened with the “biological evolution” work, which was a more general, but not fundamentally different, approach from the mechanical physical-biological model.
The popular press has picked up on a number of ideas associated with
these models - ‘emergent structure’, ‘edge of order’, ‘chaos’, ‘hierarchy’, ‘self organized criticality’, ‘butterfly effect’, ‘path dependency’, ‘hysteresis’, etc. - and, using its fast-pattern completer skills, has conveyed many
of these ideas to the general population with a sense that they offer a whole
new way of understanding reality, and a replacement for standard science.
Scientists have shied away from such characterizations and have emphasized that while each of these terms has meaning, that meaning is in the
explicit content in the mathematical model from which they derive, not in
some general idea that is everywhere appropriate. The terms reflect a pattern that economic scientists are beginning to develop into a theory that
may prove useful in understanding the economy and in developing policies. But the work is still in the beginning stages, and it is far too early to
declare it a science and a whole new way of looking at something. Scientists are slow, precise, pattern completers and they recognize that the new
work has a long way to go before it will describe a meaningful pattern, and an even longer way to go before it can be determined whether those patterns are useful.⁵

——————
⁵ In agent-based modeling (ABM), the model consists of a set of agents that encapsulate the behaviors of the various individuals that make up the system, and execution consists of emulating these behaviors. In equation-based modeling (EBM), the model is a set of equations, and execution consists of evaluating them. Thus, “simulation” is the general term that applies to both methods, which are distinguished as (agent-based) emulation and (equation-based) evaluation. See Parunak et al. (1998).
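To make the footnote’s distinction concrete, here is a minimal sketch (ours, not from Parunak et al.) that treats the same growth process both ways: the equation-based variant evaluates an aggregate difference equation, while the agent-based variant emulates individual behaviors whose aggregate resembles it.

```python
import random

# Equation-based modeling: the model IS an equation, and execution
# consists of evaluating it: x_{t+1} = (1 + g) * x_t.
def ebm_growth(x0: float, g: float, steps: int) -> float:
    x = x0
    for _ in range(steps):
        x = (1 + g) * x
    return x

# Agent-based modeling: the model is a set of agents, and execution
# consists of emulating their behavior (each agent reproduces with
# probability g in every period).
def abm_growth(n0: int, g: float, steps: int, seed: int = 0) -> int:
    rng = random.Random(seed)
    n = n0
    for _ in range(steps):
        n += sum(1 for _ in range(n) if rng.random() < g)
    return n

print(ebm_growth(1000.0, 0.05, 20))  # deterministic aggregate evaluation
print(abm_growth(1000, 0.05, 20))    # stochastic agent-level emulation
```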
Thus, even if the complexity approach to economics is successful, it
will be a complement to, not a substitute for, existing approaches in economics. That said, there are some hopeful signs, and research in “complexity” is some of the most exciting research going on in economics. The
most hopeful work is occurring in analysis of the financial sector, where
enormous amounts of quality data are available. But even where high-quality data are not available, such as in questions of industrial organization, the approach is changing the way the questions are conceptualized, with markets being conceptualized as dynamic rather than static, as in the more traditional approach.
How the Complexity Approach Relates to the Standard Approach
The standard theoretical approach used in economics is one loosely based
on a vision of rational agents optimizing, and considers how a system composed of such optimizing agents would operate. The complexity approach retains that same vision, and thus is simply an extension of
the current analysis. Where complexity differs is in the assumptions it allows in order to close the model. The complexity approach stresses more local,
rather than global, optimization by agents than is done in the traditional
approach. Agent heterogeneity and interaction are key elements of the
complexity approach.
The standard approach, which developed over the last 100 years, was
limited in the methods available to it by the evolving analytic and
empirical technology. That meant that it had to focus on the solvable aspects of the model, and to structure assumptions of the model to fit the
analytics, not the problem. The analysis evolved from simple static constrained optimization to nonstochastic control theory to dynamic stochastic
control theory, but the general structure of the analysis - the analytic pattern generating mechanism - remained the same, only more jazzed up.
In the search to focus on the solvable aspects of the model, the standard approach had to strongly simplify the assumptions of the model using
the representative agent simplification, and the consequent assumption of Gaussianity in the heterogeneity of agents. Somehow, models of the economy without any
consideration of agent heterogeneity were relied upon to find the patterns
that could exist. The complexity approach does not accept that, and proposes a different vision, which is a generalization of the traditional analysis. In fact, it considers the representative agent hypothesis that characterizes much of modern macro as a possible, but highly unlikely, case and
thus does not find it a useful reference point. But these changes in assumptions are not without cost. The cost of making agent heterogeneity central
is that the complexity model is not analytically solvable. To gain insight
into it, researchers must make use of simulations, and generally, with
simulations, results are neither univocally nor probabilistically determined.
The standard approach offered enormous insights, and proved highly
useful in generating patterns for understanding and applying policy. It led
to understanding of rationing situations, shadow pricing, and underlay a
whole variety of applied policy developments: cost benefit analysis, modern management techniques, linear programming, non-linear programming, operations research, options pricing models, index funds … the list
could be extended enormously. It suggested that problems would develop
if prices were constrained in certain ways, and outlined what those problems would be; it led to an understanding of second order effects - externalities - and how those second order effects could be dealt with. It
also led to actual policies - such as marketable permits as a way of reducing pollution - and underlies an entire approach to the law. Moreover, the
standard approach is far from moribund; there are many more areas where
the patterns generated by the standard approach will lead to insights and
new policies over the coming decades. Standard economics remains
strong.
Despite its academic successes, there are other areas in which standard
economics has not been so helpful in generating useful patterns for understanding. These include, paradoxically, areas where it would seem that the
standard theory directly applies - areas such as stock market pricing, foreign exchange pricing, and understanding the macro economy more generally. In matching the predictions of standard theory to observed phenomena, there seem to be too many movements uncorrelated with underlying
fundamentals, and some patterns, such as the ARCH and GARCH movements in stock price data, that don’t fit the standard model. The problem is twofold -
the first is that the simplicity of the model assumptions does not allow for the complexity of the common sense interactions that one would expect; the second is the failure of the models to fit the data in an acceptable way.
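For readers meeting these terms for the first time, volatility clustering of the ARCH/GARCH type is easy to exhibit by simulation. The sketch below generates returns from a standard GARCH(1,1) recursion; the parameter values are illustrative assumptions, not estimates from any data set.

```python
import math
import random

def garch_returns(n, omega=1e-5, alpha=0.1, beta=0.85, seed=1):
    """Simulate GARCH(1,1): sigma2_t = omega + alpha*r_{t-1}^2 + beta*sigma2_{t-1}."""
    rng = random.Random(seed)
    sigma2 = omega / (1 - alpha - beta)  # start at the unconditional variance
    returns = []
    for _ in range(n):
        r = math.sqrt(sigma2) * rng.gauss(0.0, 1.0)  # conditionally normal return
        returns.append(r)
        sigma2 = omega + alpha * r * r + beta * sigma2  # variance updates on news
    return returns

rets = garch_returns(1000)
# The clustering the text refers to: large |r| tends to follow large |r|,
# even though each return is conditionally normal.
```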
It is those two problems that are the starting points for the complexity
approach. The reason why the complexity approach is taking hold now in
economics is that computing technology has advanced. This advance allows consideration of analytical systems that could not previously
be considered by economists. Consideration of these systems suggested
that the results of the ‘control-based’ models might not extend easily to
more complicated systems, and that we now have a method - piggybacking
computer assisted analysis onto analytic methods - to start generating patterns that might provide a supplement to the standard approach. It is that
approach that we consider the complexity approach.
It is generally felt that these unexplained observations have something
to do with interdependent decisions of agents that the standard model assumes away. Moreover, when the dynamics are non-linear, local variations
from the averages can lead to significant deviations in the overall system
behavior. Individuals interact not only within the market; they also interact
with other individuals outside the market. Nowadays, theorists are trying
to incorporate dynamic interdependencies into the models. The problem is
that doing so is enormously complex and difficult, and there are an almost
infinite number of possibilities. It is that complexity that the papers in this
volume deal with.
In terms of policy, the papers in this volume suggest that when economists take complexity seriously, they become less certain in their policy
conclusions, and that they expand their bag of tools by supplementing their
standard model with some additional models including (1) agent-based
models, in which one does not use analytics to develop the pattern, but instead one uses computational power to deal with specification of models
that are far beyond analytic solution; and (2) non-linear dynamic stochastic
models many of which are beyond analytic solution, but whose nature can
be discovered by a combination of analytics and computer simulations. It
is elements of these models that are the source of the popular terms that
develop.
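As a minimal sketch of class (2), assuming nothing about any specific paper in this volume, the snippet below iterates a non-linear map perturbed by stochastic shocks; its behavior is mapped out by simulation because no closed-form solution is available.

```python
import random

def noisy_logistic(r=3.9, x0=0.3, steps=200, noise=0.001, seed=42):
    """Iterate x_{t+1} = r * x_t * (1 - x_t) + eps_t and record the path."""
    rng = random.Random(seed)
    x, path = x0, []
    for _ in range(steps):
        x = r * x * (1.0 - x) + rng.gauss(0.0, noise)
        x = min(max(x, 1e-9), 1.0 - 1e-9)  # keep the state inside (0, 1)
        path.append(x)
    return path

path = noisy_logistic()  # inspect the path to characterize the dynamics
```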
Developments in this new approach will occur on two dimensions. The
first is in further development of these modeling techniques, understanding
when systems exhibit certain tendencies. It is one thing to say that butterfly
effects are possible. It is quite another to specify the precise characteristics that predict that we are near a shift point. Until we arrive at such an
understanding, the models will be little help in applied policy. Similarly
with agent-based models. It is one thing to find an agent-based model that
has certain elements. It is quite another to say that it, rather than one of the
almost infinite number of agent based models that we could have chosen,
is the appropriate model to use as our theory.
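The point about shift points can be quantified: a standard diagnostic is the rate at which two nearby trajectories separate. The sketch below applies that idea to the logistic map (a toy system, not a claim about any economic model), crudely approximating the largest Lyapunov exponent; a positive value signals butterfly-effect sensitivity.

```python
import math

def divergence_rate(r=3.9, x0=0.3, delta=1e-9, steps=60):
    """Average log-stretching per step of two trajectories started delta apart
    (a rough estimate of the largest Lyapunov exponent)."""
    x, y = x0, x0 + delta
    total = 0.0
    for _ in range(steps):
        x = r * x * (1 - x)
        y = r * y * (1 - y)
        gap = abs(y - x) or 1e-300             # avoid log(0) in degenerate cases
        total += math.log(gap / delta)
        y = x + delta * (1 if y >= x else -1)  # renormalize the separation
    return total / steps

print(divergence_rate())  # positive (around 0.5 here), signaling sensitivity
```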
The second development is in fitting the patterns developed to the data.
Is there a subset of aspects of reality that better fit these models than the
standard models? Are there characteristics of reality that tell us what aspects they are? Both these developments involve enormous amounts of
slogging through the analytics and data.
The papers in this volume are elements of that slogging through. They
are divided into four sections: general issues, modeling issues, applications, and policy issues. Each struggles with complicated ideas related to
our general theme, and a number of them try out new techniques. In doing
so, they are part of science as usual. The choice of papers highlights the
necessity to consider a multifaceted methodology and not a single methodology in isolation. Our goal is to give the reader a sense of the different
approaches that researchers are following, so as to provide a sense of the
different lines of work in the complexity approach.
It is commonly said that science progresses one funeral at a time; the
papers in this volume suggest that there is another method of progression -
one technique at a time, and as various of these techniques prove fruitful,
eventually the sum of them will lead economics to be something different
than it currently is, but it is a change that can only be seen looking back
from the future.
Part I: General Issues
The first two papers deal with broad definitional and ontological issues,
the cornerstone of economic thinking. One of the ways economists have
arrived at patterns from theory is to carefully delineate their conception of
agents, restricting the analysis to what Herbert Simon called sub-rationality,
and which Vercelli, in the first paper, “Rationality, Learning, and Complexity: from the Homo Economicus to the Homo Sapiens”, calls ‘very restrictive notions of expectations formation and learning that deny any role
for cognitive psychology’. He argues that the approach of standard economics succeeds in simplifying the complexity of the economic system
only at the cost of restricting its theoretical and empirical scope. He states,
‘If we want to face the problems raised by the irreducible complexity of
the real world we are compelled to introduce an adequate level of epistemic complexity in our concepts and models’, (p 15) concluding, ‘epistemic complexity is not a virtue, but a necessity’.
In the second paper, “The Confused State of Complexity Economics:
An Ontological Explanation”, Perona addresses three issues: the coexistence of multiple conceptions and definitions of complexity, the contradictions manifest in the writings of economists who alternate between treating
complexity as a feature of the economy and as a feature of economic models, and finally the unclear status of complexity economics as an orthodox/heterodox response to the failures in traditional theory. He argues that
economists supporting the complexity ideas tend to move alternatively between different conceptions of complexity, which makes it hard to follow
what their argument is. Using Tony Lawson’s ontic/theoretic distinction,
he argues that the plurality of complexity definitions makes sense when we
observe that most of the definitions are theoretic notions, but that the apparent agreement between heterodox and orthodox traditions over complexity ideas is fictitious since the ‘two sides represent in fact quite different and even opposite responses to the problems perceived in traditional
theory’ (p. 14).
Part II: Modeling Issues I - Modeling Economic Complexity
The next set of papers enters into the issues of modeling. The first of these,
“The Complex Problem of Modeling Economic Complexity” by Day, develops the pattern sense of science that we emphasized in the introduction,
and provides a general framework for thinking about the problem of modeling complexity. Day argues that rapid progress only began in science
when repetitive patterns, which could be expressed in relatively simple
mathematical formulas, were discovered, and that the patterns developed
could provide simplification in our understanding of reality. However, this
approach left out many aspects that could not be described by simple equations.
Day argues that Walrasian general equilibrium is a theory of mind over
matter, that has shown “how, in principle a system of prices could coordinate the implied flow of goods and services among the arbitrarily many
heterogeneous individual decision-makers”. The problem with the theory
is that it is about perfect coordination, while the interesting questions are to
be found in less than perfect coordination - how the process of coordination works out of equilibrium. He then goes through some of the key literature that approached the problem along these lines. He concludes with an
admonition to complexity researchers not to become intrigued with the
mathematics and models, but to concentrate on the task of explaining such
things as how governments and central banks interact with private households and business sectors, with the hope of identifying “policies that improve the stability and distributional properties of the system as a whole”.
Day’s paper is the perfect introduction to the modeling sections of the
book because it provides the organizing belief of the contributors to this
volume that the work in complexity on chaotic dynamics and statistical
mechanics “provides potential templates for models of variables in any
domain whose behavior is governed by nonlinear, interacting causal forces
and characterized by nonperiodic, highly irregular, essentially unpredictable behavior beyond a few periods into the future”.
Above we argued that the complexity revolution would not be a revolution in the sense of an abandonment of previous work, but would be a set of
developments that occur one technique at a time, which when looked back
upon from a long enough perspective, will look like a revolution. The second paper in this section, “Visual Recurrence Analysis: Application to
Economic Time Series” by Faggini, suggests one of those new techniques.
In the paper Faggini argues that the existing linear and non-linear techniques of time series analysis are inadequate when considering chaotic
phenomena. As a supplement to standard time series techniques she argues that Recurrence Plots can be a useful starting point for analyzing nonstationary sequences. In the paper she compares recurrence plots with classical approaches for analyzing chaotic data and detecting bifurcation. Such
detection is necessary for deciding whether the standard pattern or one of
the new complexity patterns is the pattern appropriate for policy analysis.
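As a hedged illustration of the general technique (not of Faggini’s specific implementation), a recurrence plot is simply a thresholded distance matrix over delay-embedded points of the series:

```python
def recurrence_matrix(series, dim=2, delay=1, eps=0.1):
    """R[i][j] = 1 iff the delay-embedded points i and j lie within eps of
    each other (Chebyshev norm); diagonal line structures in R suggest
    deterministic dynamics, while uniform scatter suggests noise."""
    n = len(series) - (dim - 1) * delay
    points = [[series[i + k * delay] for k in range(dim)] for i in range(n)]
    return [[1 if max(abs(a - b) for a, b in zip(p, q)) <= eps else 0
             for q in points] for p in points]

series = [0.1 * ((i * 37) % 10) for i in range(60)]  # toy stand-in for data
R = recurrence_matrix(series)
```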
The third paper in this part, “Complexity of Out-of-equilibrium Play in
a Tax Evasion Game” by Lipatov, uses the same evolutionary framework
that characterizes work in complexity, but combines it with more traditional economic models, specifically evolutionary game theory with learning, to model interactions among taxpayers in a tax evasion game. The paper expands on previous models of tax evasion, but adds an explicit characterization of taxpayer interaction, which is achieved by using an
evolutionary approach allowing for learning of individuals from each
other. He argues that the dynamic approach to tax compliance games reopens many policy issues, and that as a building block for more general
models, the evolutionary approach he provides can be used to consider
policy issues.
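A minimal sketch in the spirit of such evolutionary compliance models (toy payoffs of our own, not Lipatov’s specification): the population share of evaders adjusts toward whichever strategy currently earns more, with the audit probability itself responding to the level of evasion.

```python
def evader_share_path(x0=0.2, audit_base=0.25, fine=2.0, tax=0.3,
                      steps=200, speed=0.5):
    """Replicator-style dynamics for the share x of taxpayers who evade.
    Evading keeps the tax unless the taxpayer is audited and fined."""
    x, path = x0, []
    for _ in range(steps):
        p_audit = min(1.0, audit_base * (1 + x))  # audits rise with evasion
        payoff_evade = (1 - p_audit) * tax - p_audit * fine * tax
        x += speed * x * (1 - x) * payoff_evade   # compliance payoff is zero
        x = min(max(x, 0.0), 1.0)
        path.append(x)
    return path

print(evader_share_path()[-1])  # settles near an interior mix of evaders here
```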
Part III: Modeling Issues II - Using Models from Physics to Understand Economic Phenomena
Part III is also about modeling, but its focus is on adapting models from
physics to explain economic phenomena. The papers in this section explore various issues of modeling, demonstrating various mathematical
techniques that are available to develop patterns that can then be related to
observation to see if they provide a useful guide. The first paper in this
section, “A New Stochastic Framework for Macroeconomics: Some Illustrative Examples” by Aoki, extends Day’s argument, starting with the proposition that the standard approach to microfoundations of macroeconomics is misguided, and that therefore we need a new stochastic approach to study macroeconomics. He discusses the stochastic equilibria and ultrametrics approaches that might be used to get at the problems, and
how those approaches can provide insight into real world problems. He
gives an example of how these approaches can explain Okun’s Law and
Beveridge curves, and can demonstrate how they shift in response to macroeconomic demand policies. He also shows how these new tools can reveal some unexpected consequences of certain macroeconomic demand
policies.
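For reference, a common textbook statement of the two regularities mentioned (a standard gap form, not necessarily Aoki’s own formulation):

```latex
% Okun's law in gap form: unemployment falls when output grows faster
% than its trend rate \bar g; beta > 0 is an empirical coefficient.
\[
  u_t - u_{t-1} = -\beta\,(g_t - \bar g)
\]
% Beveridge curve: an inverse relation between the vacancy rate v and the
% unemployment rate u; shifts of f reflect changes in matching efficiency.
\[
  v_t = f(u_t), \qquad f'(u) < 0
\]
```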
The second paper in this part, “Probability of Traffic Violations and
Risk of Crime: A Model of Economic Agent Behavior” by Mimkes, shows
the relationship between models in physics and economic models. Specifically, he shows the system of traffic agents under constraint of traffic laws
to correspond to atomic systems under constraint of energy laws, and to
economic models of criminal behavior. He finds that the statistical Lagrange-Le Chatelier principle agrees with the results, and concludes by
suggesting that “similar behavior of agents may be expected in all corresponding economic and social systems (situations) such as stock markets,
financial markets or social communities” (p. 11).
Muchnik and Solomon’s paper, “Markov Nets and the Nat Lab Platform: Application to Continuous Double Auction”, considers Markov
Nets, which preserve the exclusive dependence of an effect on the event directly causing it, but make no assumption about the time lapse separating them. These authors present a simulation platform (NatLab) that
uses the Markov Net formalisms to make simulations that preserve the
causal timing of events and consider it in the context of a continuous double auction market. Specifically, they collect preferred trading strategies
from various subjects and simulate their interactions, determining the success of each trading strategy within the simulation.
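For readers unfamiliar with the mechanism being simulated, a continuous double auction reduces to two books of resting orders with immediate matching whenever the best bid meets or crosses the best ask. The sketch below is a generic minimal version (our simplification; it says nothing about the actual NatLab implementation).

```python
import heapq

class DoubleAuction:
    """Minimal continuous double auction: orders arrive one at a time and
    trade immediately whenever the best bid >= the best ask."""
    def __init__(self):
        self.bids = []  # max-heap simulated with negated prices
        self.asks = []  # min-heap

    def submit(self, side, price):
        if side == "buy":
            heapq.heappush(self.bids, -price)
        else:
            heapq.heappush(self.asks, price)
        trades = []
        while self.bids and self.asks and -self.bids[0] >= self.asks[0]:
            bid = -heapq.heappop(self.bids)
            ask = heapq.heappop(self.asks)
            trades.append((bid + ask) / 2)  # split-the-difference trade price
        return trades

market = DoubleAuction()
for side, price in [("buy", 10.0), ("sell", 11.0), ("sell", 9.5), ("buy", 9.0)]:
    print(side, price, "->", market.submit(side, price))
```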
Arecchi, Meucci, Allaria, and Boccaletti’s paper “Synchronization in
Coupled and Free Chaotic Systems”, examines the features of a system affected by heteroclinic chaos when subjected to small perturbations. They
explore the mathematics of what happens when there is an intersection
point between a stable and unstable manifold, which can create homoclinic
tangles, causing erratic systemic behavior and sensitivity to initial conditions. The possibility of such behavior should serve as a cautionary tale
to policy makers who make policy relying on the existence of systemic
stability.
Part IV: Agent Based Models
Part IV considers agent-based models, which differ from the models in the
previous section in that they begin with specification of agent strategies
and do not have any analytic solution, even in principle. Instead one allows
agents with various strategies to interact in computer simulations and from
those interactions one gains knowledge about the aggregate system. This
allows the exploration of models in which the agents are assumed to have
far less information and information processing capabilities than is generally required for analytic models.
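A minimal sketch of the general recipe just described (a generic toy, not any contributor’s model): agents with essentially no cognition interact in random pairs under a simple exchange rule, and an aggregate regularity, here a skewed wealth distribution, emerges without any equations being solved.

```python
import random

def wealth_exchange(n_agents=500, rounds=20000, seed=7):
    """Agents meet in random pairs; each meeting moves a random fraction of
    the poorer agent's wealth in a random direction. The distribution is
    whatever emerges from the repeated interactions."""
    rng = random.Random(seed)
    wealth = [1.0] * n_agents
    for _ in range(rounds):
        i, j = rng.randrange(n_agents), rng.randrange(n_agents)
        if i == j:
            continue
        stake = rng.random() * min(wealth[i], wealth[j])
        if rng.random() < 0.5:
            wealth[i], wealth[j] = wealth[i] + stake, wealth[j] - stake
        else:
            wealth[i], wealth[j] = wealth[i] - stake, wealth[j] + stake
    return sorted(wealth)

w = wealth_exchange()
print("share held by poorest decile:", sum(w[:50]) / sum(w))  # emergent inequality
```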
The first paper in this section, “Explaining Social and Economic Phenomena by Models with Low or Zero Cognition Agents” by Ormerod,
Trabatti, Glass, and Colbaugh, examines two socio-economic phenomena,
the distribution of the cumulative size of economic recessions in the US