Research When Uncertainty is a Certainty
Jim Hazy, Adelphi University, Garden City, NY



Page 1:

Research When Uncertainty is a Certainty

Jim Hazy
Adelphi University
Garden City, NY

Page 2:

To those who do not know mathematics it is difficult to get across a real feeling as to the beauty, the deepest beauty, of nature…. If you want to learn about nature, to appreciate nature, it is necessary to understand the language through which she speaks to us.

The Character of Physical Law (1965), Ch. 2

- Richard Feynman, 1918-1988

Page 3:

Some Theoretical Points

Information Theory Implies Uncertainty
– What is entropy anyway?

Nonlinear dynamical systems (NDS)
– Linearization: linear “thinking” often works!

Local versus global prediction

– Until it doesn’t! New information is created
– Bifurcations & catastrophes
– Sensitivity to Initial Conditions (SIC), divergence (along emergent dimensions) & deterministic chaos (see the sketch below)
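To make the sensitivity point concrete, here is a minimal Python sketch (mine, not from the slides) iterating the logistic map x(t+1) = r·x(t)·(1 − x(t)) from two nearly identical starting points; in the chaotic regime (r = 4) the gap between trajectories grows roughly exponentially, which is why local prediction can work while global prediction fails.

# Sensitivity to initial conditions (SIC) in the logistic map -- illustrative sketch
def logistic_trajectory(x0, r=4.0, steps=40):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.40000000)
b = logistic_trajectory(0.40000001)  # initial gap of 1e-8
for t in (0, 10, 20, 30):
    print(t, abs(a[t] - b[t]))  # the gap grows by roughly a factor of 2 per step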

Page 4:

Why Information Theory? An informal interpretation

“Listening” or “watching” for what is happening “signals” when “noise” or uncertainty is a certainty
– Recognize information: some events are predictable, some surprising
– Gathering “new information” about the events in the environment

To gather new information, one must probe for it
– The observer’s “probability model” predicts outcomes - looking for outcomes
– Entropy “maps” a model into “questions” to glean info from noise

Perfect predictability, p = 1: no new info after events
– For example: death is permanent.
– Event entropy = 0

Surprise, p < 1: new info is available after the event (see the numerical check below)
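As a small numerical check of these two cases (a sketch of mine, not from the deck): the surprisal of a single event, -log2 p, is 0 bits when p = 1 and grows as the event becomes less likely.

import math

def surprisal_bits(p):
    # information gained when an event of probability p occurs; 0 when p == 1
    return 0.0 if p == 1 else -math.log2(p)

for p in (1.0, 0.5, 0.25, 0.125):
    print(p, surprisal_bits(p))  # 0.0, 1.0, 2.0, 3.0 bits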

Page 5:

To clarify this important concept, let the “predictive model” for a fair coin flip be:

The variable X =
  1 (heads) with probability 1/2
  0 (tails) with probability 1/2

Now there is an event: a coin flip that we assume is fair.

Here is the question: In the ensemble of all possible outcomes (1 or 0) above, how much information is needed to determine which actually occurred?

The entropy of X over the two outcomes, 1 and 0, is:

H(X) = -½ log2 ½ - ½ log2 ½ = -(0.5)(-1) - (0.5)(-1) = 0.5 + 0.5 = 1 bit

One bit is learned about the state of the environment.

Here’s how to think about this: based upon the above “model”, it will take only one “yes” or “no” question (base 2), or “probe” or “experiment”, to determine what actually happened during the event. One checks the coin on the ground for heads or tails.

In the above example, an observer guessing “was it heads?” will be correct 50% of the time; if wrong, one knows it’s “tails” with 100% confidence. Either way, exactly one “yes” or “no” question is needed, so the expected number of questions is 1: one bit.
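The arithmetic can be verified with a few lines of Python (an illustrative sketch, not part of the original deck):

import math

pmf = [0.5, 0.5]  # fair coin: heads, tails
H = -sum(p * math.log2(p) for p in pmf)
print(H)  # 1.0 bit: one yes/no question resolves the outcome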

Page 6:

In general, entropy maps a probability mass function p(x) to a real number, as follows:

H(X) = - Σ_{x ∈ 𝒳} p(x) log2 p(x)

[Image: http://pandasthumb.org/pt-archives/entropy.jpg]
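Transcribed directly into code, the definition might look like the following sketch (the function name and the convention that p(x) = 0 terms contribute nothing are my choices):

import math

def entropy(pmf):
    # H(X) = -sum of p(x) * log2 p(x); outcomes with p(x) = 0 contribute nothing
    total = sum(p * math.log2(p) for p in pmf if p > 0)
    return -total if total else 0.0

print(entropy([0.5, 0.5]))  # 1.0 bit (fair coin)
print(entropy([1.0]))       # 0.0 bits (perfect predictability, as on the earlier slide)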

Page 7:

To clarify this important concept, the following example calculation is taken from Cover and Thomas (2006, p. 15). Let the ensemble have 4 possible states post-event:

X =
  a with probability 1/2
  b with probability 1/4
  c with probability 1/8
  d with probability 1/8

The entropy of X is:

H(X) = -½ log2 ½ - ¼ log2 ¼ - ⅛ log2 ⅛ - ⅛ log2 ⅛ = 7/4 = 1.75 bits

One way to think about this is that for the thoughtful investigator it will on average take fewer than two “yes” or “no” questions, or “probes” or “experiments,” to determine the precise outcome within the ensemble, that is, which future happened.

In the above example, an observer guessing “was it a?” will be correct 50% of the time, so 50% of the time it takes one question; if wrong, guessing “b” will then be right 50% of the time, so 50% of 50%, or 25%, of the time it takes 2 questions; if wrong, guessing “c” will be right 50% of the time; and if this third guess is also wrong, then it is “d” 100% of the time. Thus, the remaining 25% of the time it takes 3 questions. The expected number of “yes” or “no” questions is 0.5·1 + 0.25·2 + 0.25·3 = 1.75 questions, or bits. (There is a theorem that says entropy ≤ the expected number of questions.)

This “regularity” in the random variable (a consistent mix of predictability versus surprise) in the physical environment can be identified with 1.75 yes/no experiments in the informational environment. This is a “model” of an uncertain environment wherein one can know what happened with 1.75 bits of info (see the sketch below).
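Both the entropy and the expected question count of the greedy strategy described above can be checked with a short sketch (variable names are mine):

import math

pmf = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

H = -sum(p * math.log2(p) for p in pmf.values())
print(H)  # 1.75 bits

# Greedy strategy: ask "was it a?", then "b?", then "c?" (no 4th question needed)
questions = {"a": 1, "b": 2, "c": 3, "d": 3}
expected = sum(pmf[x] * questions[x] for x in pmf)
print(expected)  # 1.75 questions, matching the entropy exactly in this case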

Page 8:

Modeling, Experimentation & Analysis

[Figure: two overlapping regions labeled “Uncertainty about Phenomenon A” and “Uncertainty about Phenomenon B”; the overlap is mutual information from “surprises”. Caption: Experimenting to gather information.]
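The overlap pictured here can be computed from the identity I(X;Y) = H(X) + H(Y) - H(X,Y); the joint distribution below is invented purely for illustration.

import math

def H(probs):
    # Shannon entropy in bits of a collection of probabilities
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y) over two binary phenomena A and B
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

px = [sum(p for (x, _), p in joint.items() if x == v) for v in (0, 1)]  # marginal of A
py = [sum(p for (_, y), p in joint.items() if y == v) for v in (0, 1)]  # marginal of B

mi = H(px) + H(py) - H(joint.values())
print(mi)  # about 0.28 bits: how much observing B reduces uncertainty about A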

Page 9:

[Figure: block entropy H(X) plotted against length L, with intercept E and transient information T. Adapted from Feldman, McTague, & Crutchfield (2008).]

The straight line indicates information gathered from the inherent uncertainty in predicting the system being observed; there are always surprises from random events: “A COIN FLIP”.

The curve indicates an observer’s naïve model of the phenomenon resulting from insufficient observation, that is, “ignorance” about the “system”: “I FEEL LUCKY”.

Line slope = entropy rate = “new mutual information” per observation or symbol L. Line intercept = E = excess entropy, which includes the info observed through an improved model.

One must learn the language of the symbols, L, “spoken” by the system.

The blue region is transient information embedded in the complexity of the system & available as events unfold to improve our models: LEARNING THAT THE FLIP IS FAIR.
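The convergence picture can be sketched numerically in the spirit of Feldman, McTague, & Crutchfield (2008): estimate the block entropy H(L) of length-L words in an observed symbol sequence, then look at the per-symbol increments H(L) - H(L-1), which approach the entropy rate as L grows. The fair-coin sequence below is my stand-in for “the system”.

import math
import random
from collections import Counter

random.seed(0)
seq = "".join(random.choice("HT") for _ in range(100000))  # simulated observations

def block_entropy(s, L):
    # H(L): entropy of the empirical distribution of length-L words
    counts = Counter(s[i:i + L] for i in range(len(s) - L + 1))
    n = sum(counts.values())
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

prev = 0.0
for L in range(1, 6):
    h = block_entropy(seq, L)
    print(L, round(h, 4), round(h - prev, 4))  # increments approach 1 bit/symbol
    prev = h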

Page 10:

[Figure: three panels comparing models.]

Panel 1: A linear stability model enables local approximation but doesn’t fully recognize constraints.

Panel 2: A better, nonlinear model clarifies robustness of stability. Prior constraints limit maximum performance potential even when human organizing dynamics are optimal.

Panel 3: An even better bifurcation model opens new possibilities. Qualitative breakthroughs in performance due to innovation overcome prior constraints.
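The qualitative shift that a bifurcation model captures can be seen in a tiny parameter sweep of the logistic map (my illustrative stand-in, not the model in the figure): as the parameter r crosses critical values, the long-run behavior changes from a single fixed point to cycles to chaos.

def long_run_values(r, x0=0.3, burn=1000, keep=64):
    # iterate past the transient, then record the distinct values visited
    x = x0
    for _ in range(burn):
        x = r * x * (1.0 - x)
    seen = set()
    for _ in range(keep):
        x = r * x * (1.0 - x)
        seen.add(round(x, 6))
    return sorted(seen)

for r in (2.8, 3.2, 3.5, 3.9):
    print(r, len(long_run_values(r)))  # 1 (fixed point), 2, 4 (cycles), many (chaos)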

Page 11:

[Figure: mapping between Phenomenon and Model.]

Stability models for given levels of people & resources needed or supported.

Transformation model for additional people & resource flow needed/supported.

Page 12:

[Figure: a ladder of models and states linked by innovation dynamics (Surie & Hazy, 2006).]

Model_0 ↔ State_0; innovation dynamics lead to Model_1 ↔ State_1, Model_2 ↔ State_2, Model_3 ↔ State_3, …, Model_N ↔ State_N.

Axes: structural complexity of organizing (State, b) versus accuracy of prediction (Model driving agents’ choices, a).

Page 13:

Thank You

Page 14:

References

Cover, T. M., & Thomas, J. A. (2006). Elements of Information Theory (2nd ed.). Hoboken, NJ: Wiley-Interscience.
Crutchfield, J. P. (1994). Is anything ever new? Considering emergence. In G. Cowan, D. Pines & D. Meltzer (Eds.), Complexity: Metaphors, Models, and Reality (pp. 515-537). Reading, MA: Addison-Wesley.
Crutchfield, J. P., & Feldman, D. P. (1997). Statistical complexity of simple one-dimensional spin systems. Physical Review E, 55(2), 1239-1242.
Crutchfield, J. P., & Feldman, D. P. (2003). Regularities unseen, randomness observed: Levels of entropy convergence. Chaos, 13(1), 25-54.
Epstein, J. M. (1997). Nonlinear Dynamics, Mathematical Biology, and Social Science (Vol. IV). Reading, MA: Addison-Wesley.
Feldman, D. P., & Crutchfield, J. P. (1998). Statistical measures of complexity: Why? Physics Letters A, 238(4-5), 244-252.
Feldman, D. P., McTague, C. S., & Crutchfield, J. P. (2008). The organization of intrinsic computation: Complexity-entropy diagrams and the diversity of natural information processing. Chaos, 18, 043106.
Gell-Mann, M. (1995). The Quark and the Jaguar: Adventures in the Simple and the Complex. New York: Henry Holt & Company.
Haken, H. (2006). Information and Self-Organization: A Macroscopic Approach to Complex Systems. Berlin: Springer.
Hazy, J. K. (2008). Toward a theory of leadership in complex adaptive systems: Computational modeling explorations. Nonlinear Dynamics, Psychology, and Life Sciences, 12(3), 281-310.
Osorio, R., Borland, L., & Tsallis, C. (2004). Distributions of high-frequency stock market observables. In M. Gell-Mann & C. Tsallis (Eds.), Nonextensive Entropy: Interdisciplinary Applications. Oxford: Oxford University Press.
Prigogine, I. (1997). The End of Certainty: Time, Chaos, and the New Laws of Nature. New York: Free Press.
Prokopenko, M., Boschetti, F., & Ryan, A. J. (2008). An information-theoretic primer on complexity, self-organization and emergence. Complexity (online early view).
Rubinstein, A. (1998). Modeling Bounded Rationality. Cambridge, MA: The MIT Press.
Ruelle, D. (1989). Chaotic Evolution and Strange Attractors. Cambridge: Cambridge University Press.
Shannon, C. E. (1948). A mathematical theory of communication. Bell System Technical Journal, 27, 379-423, 623-656.
Simon, H. A. (1962). The architecture of complexity. Proceedings of the American Philosophical Society, 106(6).
Sterman, J. D. (2000). Business Dynamics: Systems Thinking and Modeling for a Complex World. Boston: Irwin McGraw-Hill.