Information entropy

I'm trying to understand what information entropy actually measures, and whether there is a way to determine the degree to which a time series is predictable (I assume entropy would play a role in that, although it probably would not be sufficient on its own).

As I understand it, information entropy (applied to a time series) is supposed to quantify the amount of randomness, as opposed to structure, in that series.
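Just so my terms are clear, this is the kind of estimate I have in mind: discretize the series into bins, estimate the bin probabilities, and apply H = -Σ pᵢ log₂ pᵢ. A minimal sketch in Python/NumPy (the histogram binning and n_bins=16 are arbitrary choices on my part):

```python
import numpy as np

def shannon_entropy(series, n_bins=16):
    """Estimate Shannon entropy H = -sum(p * log2(p)) of a 1-D series,
    treating the binned values as i.i.d. draws."""
    counts, _ = np.histogram(series, bins=n_bins)
    probs = counts / counts.sum()
    probs = probs[probs > 0]              # empty bins contribute nothing (0 log 0 := 0)
    return -np.sum(probs * np.log2(probs))
```

One thing I already notice: this estimator only looks at the distribution of values, not their order in time, so it captures "structure" only in a very weak sense.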

However, a pseudo-random series is entirely deterministic, yet I believe (though I haven't tested it) that its entropy would be comparable to that of a purely random process. So entropy would not be a test of "randomness" (as opposed to determinism) per se, but more a measure of the underlying complexity of the structure.
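Here is roughly the test I have in mind (untested sketch; I'm using os.urandom as a stand-in for a "truly" random source and a seeded NumPy generator as the deterministic one):

```python
import os
import numpy as np

def byte_entropy(data):
    """Shannon entropy in bits per byte of a raw byte string."""
    counts = np.bincount(np.frombuffer(data, dtype=np.uint8), minlength=256)
    probs = counts / counts.sum()
    probs = probs[probs > 0]                  # drop empty symbols (0 log 0 := 0)
    return -np.sum(probs * np.log2(probs))

n = 1_000_000
pseudo = np.random.default_rng(seed=42).bytes(n)   # deterministic: fixed by the seed
os_rand = os.urandom(n)                            # OS entropy pool

print(byte_entropy(pseudo))    # expect close to 8 bits/byte if my claim holds
print(byte_entropy(os_rand))   # expect close to 8 bits/byte as well, i.e. indistinguishable
```

If the claim holds, this kind of naive estimator can't tell the two apart, because it only sees the distribution of values, not the deterministic rule that generated them.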

Also, can it be said that the presence of structure in a time series implies that it is predictable? Are they equivalent?

And finally, there is a related question that information entropy doesn't address: whether we have access to all the variables that come into play. Even if a time series has a certain structure, the variables that produce that structure could very well be hidden, and thus the series would be poorly predictable (or not at all) for those of us without access to those variables... (although we could probably model these hidden variables, i.e. somehow "reverse engineer" them?)
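On the "reverse engineering" point: from what I've read, the standard trick for this is delay embedding (Takens' theorem), where you rebuild a stand-in for the hidden state out of lagged copies of the one series you can observe. A rough sketch of what I mean (the dimension and lag here are arbitrary; in practice they apparently have to be chosen carefully):

```python
import numpy as np

def delay_embed(series, dim=3, lag=1):
    """Build approximate state vectors (x[t], x[t+lag], ..., x[t+(dim-1)*lag])
    from a single observed series, in the spirit of Takens' theorem."""
    n = len(series) - (dim - 1) * lag
    return np.column_stack([series[i * lag : i * lag + n] for i in range(dim)])
```

As far as I understand, if the underlying dynamics are deterministic, nearby points in this reconstructed space should have similar futures, which is itself a kind of predictability test.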

Basically, I'm trying to disentangle all these concepts and determine whether information entropy is the best measure of the predictability of a time series, and if not, what would be.
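For what it's worth, one candidate I've come across is permutation entropy (Bandt and Pompe), which, unlike the histogram estimate above, does take temporal order into account: it computes the Shannon entropy of the ordinal patterns of short windows. My understanding of it, as a sketch (m and lag again arbitrary):

```python
import math
import numpy as np

def permutation_entropy(series, m=3, lag=1):
    """Normalized permutation entropy: Shannon entropy of the ordinal
    (rank-order) patterns of length-m windows, scaled to [0, 1]."""
    counts = {}
    for i in range(len(series) - (m - 1) * lag):
        window = series[i : i + (m - 1) * lag + 1 : lag]   # m values, spaced by lag
        pattern = tuple(np.argsort(window))                # ordinal pattern of the window
        counts[pattern] = counts.get(pattern, 0) + 1
    probs = np.array(list(counts.values()), dtype=float)
    probs /= probs.sum()
    return float(-np.sum(probs * np.log2(probs)) / math.log2(math.factorial(m)))
```

Values near 1 would mean the ordinal patterns are close to uniformly distributed (random-looking), while low values mean a few patterns dominate (structure). Whether this, or something like sample entropy, is the "right" measure of predictability is exactly what I'm unsure about.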

Any thoughts? Any good resources on these topics?
 