notes-cog-ai-miscAi

"Have you read about Model of Hierarchical Complexity (http://metamoderna.org/what-is-the-mhc?lang=en)?

When we have networks where entities adapt to their information space, signaling has to remain fast even as the network grows. This means that many simple entities become dependent on a few more complex entities; this dependency increases their capability to survive.

The entities at lower levels of complexity evaluate the authenticity of the higher-complexity entities and stake their rewards on those entities' signals, using their own cognitive systems and evaluation criteria.

I believe that AGI can be achieved with a correct model of hierarchical complexity. Instead of psychological modeling, this can be achieved using Agent-Based Modeling that follows the structure of a scale-free network; the strongest theories in neuroscience are those that consider the brain to be a holarchic system, in which concepts (entities) have essential dependencies and a curiosity toward cooperation with a diverse set of ideas.

The original theory of holarchic systems is somewhat outdated, as it prefers cascading garbage collection over adaptability. True intelligence is adaptive (a lack of intelligent utilities will lead to the extinction of the idea).

In case you were already aware of these, how would you argue that these ideas are inadequate?" [1]
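The quoted claim that "many simple entities become dependent on a few more complex entities" is essentially the hub structure of a scale-free network. A minimal sketch of how such a structure emerges, assuming a Barabási–Albert-style preferential-attachment growth rule (the helper name `preferential_attachment` and all parameters are illustrative, not from the quoted text):

```python
import random

def preferential_attachment(n_nodes, m=2, seed=42):
    """Grow a network where each new node attaches m edges,
    preferring nodes that are already well connected (hubs)."""
    rng = random.Random(seed)
    # start from a small fully connected seed network of 3 nodes
    edges = [(0, 1), (1, 2), (0, 2)]
    # sampling uniformly from this list of edge endpoints picks a node
    # with probability proportional to its degree: that IS the
    # "rich get richer" preferential-attachment rule
    targets = [u for e in edges for u in e]
    for new in range(3, n_nodes):
        chosen = set()
        while len(chosen) < m:
            chosen.add(rng.choice(targets))
        for t in chosen:
            edges.append((new, t))
            targets.extend([new, t])
    return edges

edges = preferential_attachment(500)

# tally node degrees
degree = {}
for u, v in edges:
    degree[u] = degree.get(u, 0) + 1
    degree[v] = degree.get(v, 0) + 1

# a handful of hubs accumulate far more connections than the median
# node: many simple entities end up depending on a few complex ones
hubs = sorted(degree.values(), reverse=True)[:5]
```

An agent-based model in the spirit of the note would then place adaptive agents on this topology, with low-degree agents reading signals from the hubs they attach to; the sketch above only shows how the skewed dependency structure itself arises.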