Artificial General Intelligence

Reasoning System and Logic

1. Formalization

An AGI project should be described on (at least) three levels:
  1. as a theory of intelligence, in a natural language,
  2. as a model of the theory, in a symbolic (formal) language,
  3. as an implementation of the model, in a programming language (plus hardware, if necessary).
(This course is mainly about the first two levels.)

The advantage of the middle level: it is more accurate and concrete than the top level, while remaining independent of the implementation details. Without it, the development of the system will be full of trial-and-error.

There are three most commonly used frameworks of formalization:
  1. as a dynamical system,
  2. as a computational system,
  3. as a reasoning system.

Though in principle these three frameworks have equivalent expressive and processing power, for a concrete problem their applications may differ greatly in ease and naturalness. For AGI, the framework of a reasoning system (also known as an inference system) has several advantages: the first two address weaknesses of the dynamical framework, and the last two address weaknesses of the computational framework.
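
To make the contrast concrete, the following is a minimal sketch of the same trivial piece of knowledge ("if it rains, the ground gets wet") cast in each of the three frameworks. The toy example and all names in it are illustrative assumptions, not taken from the notes.

```python
# Illustrative only: the same knowledge, "if it rains the ground gets wet",
# cast in the three frameworks. All names are assumptions for this sketch.

# 1. Dynamical framework: a state plus a transition function that is iterated.
def transition(state: dict) -> dict:
    new_state = dict(state)
    if state.get("raining"):
        new_state["ground_wet"] = True
    return new_state

# 2. Computational framework: an algorithm mapping a given input to an output.
def ground_wet(raining: bool) -> bool:
    return raining

# 3. Reasoning framework: declarative premises plus an inference rule
#    (here a crude modus ponens over "A => B" strings).
def infer(premises: list) -> list:
    derived = [p.split(" => ")[1] for p in premises
               if " => " in p and p.split(" => ")[0] in premises]
    return premises + derived

print(transition({"raining": True}))        # {'raining': True, 'ground_wet': True}
print(ground_wet(True))                     # True
print(infer(["rain", "rain => wet"]))       # ['rain', 'rain => wet', 'wet']
```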

Therefore, in the following, most discussions will be carried out in the reasoning framework, though the other frameworks will be mentioned whenever relevant.

2. Reasoning system

As a formal model, a reasoning system has a "logic" part and a "control" part: the logic part consists of a formal language, a semantics, and a set of inference rules, while the control part consists of a memory structure and a control mechanism, all described in an implementation-independent manner.

In a computerized reasoning system, the architecture typically consists of a knowledge base, an inference engine, an input/output interface, and a control module. Such a system runs by repeating an inference cycle, in which premises are selected from the knowledge base and sent to the inference engine, and the conclusions are put back into the knowledge base. The input/output interface handles the information exchange between the environment and the knowledge base. The control module manages the overall process by making selections among candidates.
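
The following is a minimal sketch of such an inference cycle. A toy knowledge base of tuples, a single illustrative rule, and a random selection policy are assumed; none of these details are prescribed by the description above, and the names are hypothetical.

```python
import random

class KnowledgeBase:
    """Stores beliefs (here, plain tuples) and accepts new ones."""
    def __init__(self, beliefs):
        self.beliefs = set(beliefs)

    def add(self, belief):
        self.beliefs.add(belief)

def io_interface(kb, new_input):
    """Input/output interface: new experience enters the knowledge base."""
    for belief in new_input:
        kb.add(belief)

def inference_engine(premises):
    """A single toy rule: from ('A', '->', 'B') and ('A',) derive ('B',)."""
    conclusions = set()
    for p in premises:
        for q in premises:
            if len(p) == 3 and p[1] == "->" and q == (p[0],):
                conclusions.add((p[2],))
    return conclusions

def control_module(kb):
    """Control: select which premises to send to the inference engine."""
    candidates = sorted(kb.beliefs)
    return random.sample(candidates, k=min(2, len(candidates)))

def run(kb, cycles=20):
    """Repeat the inference cycle: select premises, infer, store conclusions."""
    for _ in range(cycles):
        premises = control_module(kb)
        for conclusion in inference_engine(premises):
            kb.add(conclusion)              # conclusions go back into the KB
    return kb.beliefs

kb = KnowledgeBase([("rain", "->", "wet"), ("wet", "->", "slippery")])
io_interface(kb, [("rain",)])               # environment reports that it rains
print(run(kb))                              # derived beliefs accumulate over cycles
```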

3. Types of reasoning system

The study of reasoning systems has been going on in logic, mathematics, and computer science. The major traditional theories include propositional logic, first-order predicate logic, and their extensions and meta-theories. Very often, these theories are collectively referred to as "mathematical logic", in a broad sense of the term. According to this tradition, a reasoning system should have a number of required or desired properties. In the following, a system with all these properties is referred to as "pure-axiomatic".
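
As a caricature of what "pure-axiomatic" means, the following sketch derives theorems from a fixed axiom by a fixed truth-preserving rule, so that the answer to a question depends only on the question and the axioms. The evenness example is an assumption, and the step bound exists only to make the demo terminate, whereas such a system is idealized as having sufficient resources.

```python
# A caricature of a "pure-axiomatic" system: fixed axioms, a fixed
# truth-preserving rule, exhaustive derivation.

AXIOMS = {"0 is even"}

def rule(theorem: str) -> str:
    # Truth-preserving rule: if n is even, then n + 2 is even.
    n = int(theorem.split()[0])
    return f"{n + 2} is even"

def is_theorem(question: str, max_steps: int = 100) -> bool:
    theorems = set(AXIOMS)
    for _ in range(max_steps):
        if question in theorems:
            return True                      # derivable from the axioms
        theorems |= {rule(t) for t in theorems}
    return False                             # not derived within the bound

print(is_theorem("8 is even"))   # True
print(is_theorem("7 is even"))   # False: not a theorem of this system
```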

Though mathematical logic has achieved great success in formalizing binary deduction in certain domains, it runs into problems when used to explain the "laws" of human reasoning. These problems can be clustered by topic.

These issues cannot simply be dismissed by saying that logic is normative, not descriptive. For the purpose of AGI, as long as we still believe in the existence of regularity in human reasoning, as well as in its rationality from an evolutionary point of view, it is the responsibility of logic to find that regularity and to justify its rationality.

Various non-classical logics and other specific solutions have been proposed for the above problems. Each of them extends or revises certain aspects of a pure-axiomatic system, while keeping many other aspects. The result is a "semi-axiomatic" system.

Similarly, various (semi-axiomatic) logic-based approaches have been proposed in AI.

From the viewpoint of AGI, one attractive possibility is to take all of the above problems as coming from the historical fact that mathematical logic was developed not to formalize the regularity of human reasoning in general, but to provide a logical foundation for mathematics. Therefore, it is not a normative model of reasoning in general, but one that concentrates on highly idealized situations. Exploration in this direction leads to a "non-axiomatic" system, in which all aspects of the system are fundamentally different from those of pure-axiomatic reasoning systems.
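
As a rough, hedged illustration of one way such a system may differ, the following sketch represents a belief by the evidence behind it and revises it when new, possibly conflicting, experience arrives, so that no statement plays the role of an unquestionable axiom. The Belief structure, the evidence counts, and the pooling rule are illustrative assumptions, not a definition of any particular non-axiomatic logic.

```python
# A hedged illustration (not a definition from the notes): beliefs carry a
# record of their evidence and remain revisable as experience grows.

from dataclasses import dataclass

@dataclass
class Belief:
    statement: str
    positive: int   # pieces of evidence supporting the statement
    total: int      # all pieces of evidence observed so far

    @property
    def frequency(self) -> float:
        # Proportion of supporting evidence; always provisional.
        return self.positive / self.total if self.total else 0.0

def revise(a: Belief, b: Belief) -> Belief:
    """Merge two beliefs about the same statement by pooling their evidence."""
    assert a.statement == b.statement
    return Belief(a.statement, a.positive + b.positive, a.total + b.total)

old = Belief("ravens are black", positive=9, total=10)    # frequency 0.9
new = Belief("ravens are black", positive=1, total=5)     # conflicting: 0.2
print(revise(old, new).frequency)                         # pooled: 10/15 ~ 0.67
```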


Reading