Computational intelligence since the 1950s: Complexity and logic

This article describes the role combinatorial complexity plays in the field of computational intelligence.

According to contemporary research on animal and human vision, simple object perception involves bottom-up signals from sensory organs and top-down signals from the mind's internal representations (memories) of objects. During perception, the mind associates subsets of bottom-up signals corresponding to objects with top-down signals originating from representations of those objects. This association produces object recognition; it activates brain signals leading to mental and behavioral responses.

The difficulties in developing mathematical descriptions of the recognition step of this association-recognition-understanding process can be summarized under the notion of combinatorial complexity (CC). CC refers to the multiple combinations of various elements in a complex system; for example, recognition of a scene often requires concurrent recognition of its many elements, which could be encountered in various combinations. CC is prohibitive because the number of combinations is very large: an image with 100 elements (not too large a number) yields 100^100 possible combinations. This number exceeds the number of all elementary particle events in the life of the Universe; no computer will ever be able to compute that many combinations.
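A quick back-of-the-envelope check makes this magnitude concrete (a sketch; the hardware figures below are illustrative assumptions, not from the text):

```python
import math

# 100 scene elements, each combinable with all others: 100^100 combinations.
combinations = 100 ** 100
print(math.log10(combinations))      # 200.0, i.e. 10^200 combinations

# Hypothetical, generous hardware: 10^18 operations per second running for
# 10^18 seconds (vastly longer than the age of the Universe) examines
# fewer than 10^37 combinations.
examined = (10 ** 18) * (10 ** 18)
print(combinations > examined ** 5)  # True: exhaustive search is hopeless
```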

The problem was first identified in pattern recognition and classification research in the 1960s and was named the "curse of dimensionality." In theory, adaptive self-learning algorithms and neural networks are capable of learning solutions to any problem "on their own," if provided with a sufficient number of training examples. The following thirty years of developing adaptive statistical pattern recognition and neural network algorithms, however, led to the conclusion that the required number of training examples was often combinatorially large. Thus, self-learning approaches encountered the CC of learning requirements.
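A small illustration of why learning requirements grow combinatorially: to cover an input space at a fixed resolution, the number of training examples needed grows exponentially with the number of dimensions (a sketch; the resolution of 10 cells per axis is an arbitrary assumption):

```python
def examples_needed(dimensions: int, cells_per_axis: int = 10) -> int:
    """Grid cells that must each be sampled at least once for uniform
    coverage of a `dimensions`-dimensional input space."""
    return cells_per_axis ** dimensions

for d in (2, 10, 100):
    print(d, examples_needed(d))  # 100, then 10^10, then 10^100 examples
```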

Rule-based systems were proposed in the 1970s to solve the problem of learning complexity. The initial idea was that rules would capture the required knowledge and eliminate the need for learning. In the presence of variability, however, the number of rules grew; rules became contingent on other rules; combinations of rules had to be considered; rule systems thus encountered the CC of rules.
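A toy count suggests how rules multiply under variability (a hypothetical example; the attributes and values are illustrative, not from the text):

```python
from itertools import product

# Each recognition rule must be specialized for every combination of
# contextual conditions it may be contingent on.
conditions = {
    "lighting":  ["day", "night", "fog"],
    "viewpoint": ["front", "side", "top"],
    "occlusion": ["none", "partial"],
    "scale":     ["near", "far"],
}
variants = list(product(*conditions.values()))
print(len(variants))  # 3 * 3 * 2 * 2 = 36 variants of just ONE rule

# With R base rules and c contextual attributes of v values each,
# the rule base grows as R * v^c -- the CC of rules.
```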

Beginning in the 1980s, model-based systems were proposed. They used models that depended on adaptive parameters; the idea was to combine the advantages of rules with learning-adaptivity by using adaptive models. Knowledge was encapsulated in the models, whereas the unknown aspects of particular situations were to be learned by fitting model parameters. A simple example of an adaptive model is linear regression: the knowledge is encapsulated in the use of a linear model and in the choice of variables; the uncertainty and adaptivity lie in the unknown parameters, fitted to the data. Whereas linear regression uses one model, model-based systems use a large number of models. For example, a scene is described using geometric models of many objects, whose parameters may include size, orientation angles, color, illumination conditions, and so on. A simple but nontrivial problem, causing difficulties in applications to this day, is tracking multiple objects in radar images in the presence of noise and clutter.
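A minimal sketch of the linear-regression example above, with synthetic data (an illustration, not from the text): the "knowledge" is the choice of the linear form y = a*x + b, while the "adaptivity" is in the parameters (a, b) fitted by least squares.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=x.size)  # noisy line

a, b = np.polyfit(x, y, deg=1)      # fit the two unknown parameters
print(f"a ~ {a:.2f}, b ~ {b:.2f}")  # close to the true values 2 and 1
```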

Fitting models to data required selecting the data subsets corresponding to the various models. The number of such subsets, however, is combinatorially large. A general, popular algorithm for fitting models to data, multiple hypothesis testing, is known to face CC of computations. Model-based approaches thus encountered computational CC.
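A sketch of where this computational CC comes from: before parameters can be estimated, each datum must be associated with one of the candidate models, and exhaustively testing every association hypothesis enumerates K^N assignments of N data points to K models (illustrative figures below, not from the text):

```python
from itertools import product

def association_hypotheses(num_models: int, num_data: int):
    """Every assignment of data points to models (tractable only for toys)."""
    return product(range(num_models), repeat=num_data)

K, N = 3, 10
print(sum(1 for _ in association_hypotheses(K, N)))  # 3^10 = 59049
print(K ** 100)  # ~5e47 hypotheses for just 100 radar returns
```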

The consistent reappearance of combinatorial complexity can be related to the type of logic underlying the various algorithms and neural networks. Formal logic is based on the "law of excluded middle," according to which every statement is either true or false, with nothing in between. Therefore, algorithms based on formal logic have to evaluate every slight variation in data or internal representations as a separate logical statement (hypothesis); a large number of combinations of these variations causes combinatorial complexity.

The combinatorial complexity of algorithms based on logic is related to Gödel's theory (Gödel's incompleteness theorems): it is a manifestation of the inconsistency of logic in finite systems.

Multi-valued logic and fuzzy logic were proposed to overcome limitations related to the law of excluded middle. Both approaches encounter difficulties related to CC. The mathematics of multi-valued logic is no different in principle from formal logic: the "excluded third" is merely replaced by the "excluded n+1st." Fuzzy logic encounters a difficulty related to the degree of fuzziness: when too much fuzziness is specified, the solution does not achieve the needed accuracy; when too little, it becomes similar to formal logic. Complex systems require different degrees of fuzziness in the various elements of system operation; searching for the appropriate degrees of fuzziness among combinations of elements again leads to CC.
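The degree-of-fuzziness trade-off can be sketched with a Gaussian membership function (an illustrative choice of function and numbers, not from the text): a large width makes everything "somewhat a member," losing accuracy, while a tiny width makes membership nearly 0 or 1, collapsing back to formal logic.

```python
import math

def membership(x: float, center: float, sigma: float) -> float:
    """Fuzzy degree to which x belongs to the concept at `center`."""
    return math.exp(-((x - center) ** 2) / (2.0 * sigma ** 2))

for sigma in (10.0, 1.0, 0.01):
    print(sigma, round(membership(x=1.0, center=0.0, sigma=sigma), 4))
# sigma=10   -> 0.995  (too fuzzy: x=1 barely distinguished from the center)
# sigma=1    -> 0.6065 (a genuinely graded, intermediate truth value)
# sigma=0.01 -> 0.0    (too crisp: effectively true/false again)
```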

It appears that the mind's logic, after Gödel, is much more complicated and much less logical than the founders of artificial intelligence assumed. Roger Penrose thought that Gödel's results entail incomputability of mind processes and testify to a need for new physics. An opposite position is that the incomputability of logic does not entail incomputability of the mind: logic is not the basic mechanism of the mind. The various manifestations of CC are all related to formal logic and Gödel theory. Rule systems rely on formal logic in the most direct way. Self-learning algorithms and neural networks rely on logic in their training or learning procedures: every training example is treated as a separate logical statement. Fuzzy logic systems rely on logic for setting degrees of fuzziness. The CC of mathematical approaches to the mind is thus related to the fundamental inconsistency of logic. A different type of logic, termed dynamic logic, has been developed, aiming to overcome combinatorial complexity; it gives a mathematical description of the dynamic process of the mind's representations (concepts) evolving "from vague to crisp."
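A toy sketch of this "from vague to crisp" dynamic (an illustration under assumptions, not the published formulation): associations between data and concept-models start maximally fuzzy and are sharpened as the models' parameters improve, so no combinatorial search over assignments ever occurs.

```python
import math

data = [0.9, 1.1, 1.0, 4.8, 5.2, 5.0]   # synthetic observations
centers = [2.0, 4.0]                     # two vague initial concept-models

sigma = 4.0                              # large sigma = maximal vagueness
while sigma > 0.05:
    # Fuzzy association of every datum with every model: O(N*K) per pass,
    # instead of the K^N hard-assignment hypotheses of exhaustive search.
    weights = []
    for x in data:
        sims = [math.exp(-((x - c) ** 2) / (2 * sigma ** 2)) for c in centers]
        total = sum(sims)
        weights.append([s / total for s in sims])
    # Re-estimate each model from its (softly) associated data.
    for k in range(len(centers)):
        wk = [weights[i][k] for i in range(len(data))]
        centers[k] = sum(w * x for w, x in zip(wk, data)) / sum(wk)
    sigma *= 0.8                         # concepts become crisper each pass

print([round(c, 2) for c in centers])    # -> approximately [1.0, 5.0]
```

The design point of the sketch is that each pass does only N*K similarity evaluations while the vagueness parameter anneals, which is how the vague-to-crisp process sidesteps CC.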