notes-cog-memory statistics

todo: add that result from Tanenbaum's class suggesting that the frequency with which terms recur in newspapers follows a power-law distribution, and that both short-term and long-term memory recall follow the same distribution.
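(to eyeball the newspaper half of that claim: count word frequencies in any big text file and plot rank against frequency on log-log axes; a roughly straight line is the power-law signature. A minimal Python sketch, where 'corpus.txt' is a hypothetical stand-in for whatever text is handy:)

```python
# sketch: rank-frequency (Zipf) plot for word counts in a text file.
# 'corpus.txt' is a placeholder; any large plain-text file works.
import re
from collections import Counter

import matplotlib.pyplot as plt

with open("corpus.txt") as f:
    words = re.findall(r"[a-z']+", f.read().lower())

counts = sorted(Counter(words).values(), reverse=True)

plt.loglog(range(1, len(counts) + 1), counts)
plt.xlabel("rank")
plt.ylabel("frequency")
plt.title("rank vs. frequency; roughly straight on log-log = power-law-ish")
plt.show()
```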

mild/wild (heavy-tailed) probability distributions, and memory:

one desirable property of my friend rof's neural net was that its memories were robust to later noise input. Now, the statistics corresponding directly to the physical components out of which a cognitive system is built (for example, a neural firing-frequency histogram, or a neuron's in-degree and out-degree) will probably end up being 'mild' (gaussian or exponential or similar), because these components have built-in maxima: a neuron has about a 2ms refractory period, giving an upper limit to the firing frequency, and there is a maximum number of synapses per neuron (otoh, I guess a neuron could fire very INfrequently...).

But a computing system which can do something like logical inference, or emulate a (fixed-memory approximation to a) Turing machine, and which can learn things like 2+2=4, has to have memories which behave 'discretely', in that they undergo near-irreversible jumps/hysteresis. For example, at some point you learned that 2+2=4. Even though you probably learned that over only a few hours, that memory should be much more durable than your memory of other things that occurred over a few hours. So the statistics of some things at a higher level of abstraction should be heavy-tailed.
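To make 'near-irreversible jumps/hysteresis' concrete, here is a toy Schmitt-trigger-style bistable unit sketched in Python: writing it takes a large input, but once written, later sub-threshold noise can't erase it (the robustness-to-noise property of rof's net). This is only an illustration of the behavior, not a claim about how rof's net or real neurons implement it:

```python
# toy bistable 'memory' with hysteresis (Schmitt-trigger style): the state
# flips only when the input crosses a wide threshold, so small later noise
# cannot undo a write -- a near-irreversible jump.
def step(state, x, hi=1.0, lo=-1.0):
    if x > hi:
        return +1   # strong positive input writes a 1
    if x < lo:
        return -1   # strong negative input writes a -1
    return state    # anything in between leaves the memory untouched

state = -1
inputs = [0.3, -0.4, 1.2, 0.5, -0.9, 0.2]  # one strong 'learning' event (1.2)
for x in inputs:
    state = step(state, x)
print(state)  # +1: the memory survives the later sub-threshold noise
```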
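And a sketch of the mild-vs-heavy-tailed contrast itself. The firing rates below are light-tailed because the refractory period bounds them; for the higher level I've swapped in Simon's (1955) rich-get-richer rehearsal model, a standard power-law generator, purely as an example of the kind of mechanism that could give heavy tails. All parameters are made up:

```python
# contrast: bounded component-level statistics vs. a heavy-tailed
# higher-level process. Parameters are illustrative only.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

# 'mild': firing rates are capped by a ~2ms refractory period, so the
# rate distribution has bounded support (here, at most 500 Hz).
isi = 0.002 + rng.exponential(0.05, size=100_000)  # inter-spike intervals, s
rates = 1.0 / isi

# 'wild': Simon's (1955) rich-get-richer process. New memories keep
# arriving; existing ones get rehearsed (strengthened) with probability
# proportional to their current strength.
alpha = 0.1        # chance that a step creates a brand-new memory
tokens = [0]       # one entry per unit of strength; the value is a memory id
next_id = 1
for _ in range(200_000):
    if rng.random() < alpha:
        tokens.append(next_id)
        next_id += 1
    else:
        tokens.append(tokens[rng.integers(len(tokens))])
strengths = np.bincount(tokens).astype(float)

# compare tails via empirical survival functions on log-log axes: the
# bounded distribution falls off a cliff, while the rich-get-richer one
# stretches out roughly straight (power-law-like).
def survival(x):
    x = np.sort(x)
    return x, 1.0 - np.arange(len(x)) / len(x)

for data, label in [(rates, "firing rates (bounded, mild)"),
                    (strengths, "memory strengths (heavy-tailed)")]:
    xs, s = survival(data)
    plt.loglog(xs, s, label=label)
plt.xlabel("value")
plt.ylabel("P(X > x)")
plt.legend()
plt.show()
```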

This is possibly related to the sudden jumps, heavy-tailed distributions, and hysteresis seen in financial markets. These jumps may be the moments when Mr. Market learns something, or has some insight.
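A minimal sketch of why jumps alone fatten the tails: mix small gaussian moves with rare large jumps (a crude jump-diffusion; parameters illustrative, not calibrated to any real market) and the excess kurtosis, zero for a gaussian, comes out strongly positive:

```python
# crude jump-diffusion: small gaussian moves plus rare large jumps.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
diffusion = rng.normal(0.0, 0.01, n)                       # everyday noise
jumps = rng.normal(0.0, 0.10, n) * (rng.random(n) < 0.01)  # rare big moves
returns = diffusion + jumps

def excess_kurtosis(x):
    z = (x - x.mean()) / x.std()
    return (z ** 4).mean() - 3.0   # 0 for a gaussian; >0 means heavy tails

print(excess_kurtosis(diffusion))  # ~0
print(excess_kurtosis(returns))    # strongly positive: jumps fatten the tails
```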