Organodynamics

Grant Holland, Apr 25, 2014

Slide: Preview: Information Theory


 10. Dynamics

 

“Dynamics” implies some mechanism that determines (precisely) the time evolution of the state of the system.

 

We are going to generalize this idea of “dynamics” by relaxing it to include merely “narrowing”, or constraining, the time evolution of the states of the system.

 

Organodynamics can do this by using probability theory and its offshoots.

 

We shall also need a way to describe the extent to which our probabilistic “constraints” result in “narrowing” the time evolution of the system that we are modeling.

 11. Chance Gone Haywire?

 

But how can the introduction of chance into a dynamical systems theory result in anything but eventually running wildly out of control, and therefore providing no constraints and no dynamics?

How can a systems theory based on chance variation provide a dynamics mechanism that can constrain the time evolution of systems that it models?

 

 12. Or Chance that is Well Behaved?

 

It turns out that information theory reveals the solution to this conundrum.

The key turns out to be, essentially, stochastic dependence over time. Information theory (not to be confused with communications theory) is a branch of probability theory that studies entropic functionals and their applications.

(Admittedly, I have to defend this definition.)

When applied to stochastic processes, these entropic functionals can characterize the conditions under which the processes are well behaved, and those under which they run wildly out of control.
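To make “well behaved” concrete, here is a minimal sketch (in Python; the example and its numbers are mine, not from the slides) of one such entropic functional: the entropy rate of a stationary Markov chain. Stochastic dependence on the previous state pulls the per-step uncertainty below that of independent draws, which is one way chance can constrain, rather than scatter, a system's time evolution.

```python
import math

def entropy(p):
    # Shannon entropy, in bits, of a discrete probability distribution p
    return -sum(x * math.log2(x) for x in p if x > 0)

def markov_entropy_rate(P, pi):
    # Entropy rate of a stationary Markov chain with transition matrix P
    # and stationary distribution pi: H = sum_i pi_i * H(row_i of P)
    return sum(pi_i * entropy(row) for pi_i, row in zip(pi, P))

# A two-state chain that strongly tends to stay in its current state.
P = [[0.9, 0.1],
     [0.1, 0.9]]
pi = [0.5, 0.5]  # stationary distribution of this symmetric chain

iid = entropy([0.5, 0.5])               # 1.0 bit: independent fair draws
dependent = markov_entropy_rate(P, pi)  # ~0.469 bits: dependence narrows the evolution
```

The contrast is the point: both processes visit the same two states with the same long-run frequencies, but the dependent chain's future is far less uncertain at each step.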

 

[Shannon 1948] shows that entropy applies to any situation involving probabilities, and that it is a measure of the degree of uncertainty inherent in those probabilities.
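As a minimal illustration (in Python; the particular distributions are my own examples, not from the slides), Shannon entropy can be computed for any discrete distribution, and it quantifies how uncertain an outcome drawn from that distribution is:

```python
import math

def shannon_entropy(p):
    # Shannon entropy, in bits, of a discrete probability distribution p;
    # terms with zero probability contribute nothing, by convention.
    return -sum(x * math.log2(x) for x in p if x > 0)

fair = shannon_entropy([0.5, 0.5])      # 1.0 bit: maximal uncertainty for 2 outcomes
biased = shannon_entropy([0.99, 0.01])  # ~0.081 bits: nearly certain outcome
certain = shannon_entropy([1.0])        # 0.0 bits: no uncertainty at all
```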

 

Uncertainty is a general trait that covers many specific applications, all of which are subject to chance variation. These include instability, volatility, opportunity, freedom, error, unpredictability, and so on. All of these are potential applications of entropy.

 
