
Probabilistic Logic and Knowledge Representation

Probability theory provides a well-founded and mathematically sound framework for representing quantified uncertain knowledge. Alongside fuzzy logic and Dempster-Shafer theory, it is the most commonly used numerical method for knowledge representation. On the one hand, probabilities allow statistical information to be represented conveniently; on the other hand, they may be used as subjective probabilities expressing the strength of an agent's beliefs. This makes probability theory a particularly suitable method for designing knowledge bases, e.g. for diagnostic systems.

In general, a probabilistic knowledge base consists of a probability distribution or, alternatively, of a system of compatible marginal distributions. Knowledge can easily be derived from it by calculating conditional probabilities. The great flexibility of probability theory, however, turns out to be a serious drawback when such knowledge bases have to be built up from incomplete information (as is usually the case): If all I know is that "All A's are B's" with probability x, and "All B's are C's" with probability y – what can be said about the probability of an A being a C? Though it seems intuitive that this latter probability should somehow be based on x and y, the famous penguins-and-birds example smashes such hasty expectations: penguins are birds, birds mostly fly, but penguins don't.
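To see why the chained probability is underdetermined, consider an elementary formalization (our notation, not from the project page). Over three binary variables A, B, C, the target quantity is

    P(C \mid A) = \frac{P(A \land B \land C) + P(A \land \neg B \land C)}{P(A)}

The two constraints P(B | A) = x and P(C | B) = y fix only two ratios among the eight atomic-state probabilities, so many different values of P(C | A) remain consistent with the stated knowledge.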

Indeed, there are no simple combination rules for probabilities. In contrast to extensional deduction methods, the techniques of optimizing entropy make use of the intensional structure of probabilistic knowledge. Entropy measures the uncertainty inherent in a distribution, and optimizing entropy amounts to choosing the least informative distribution that represents the explicitly stated knowledge; no information is added extensionally. So in the small example above, it is actually possible to calculate the probability of an A being a C from x and y by optimizing entropy – provided all I know is "All A's are B's" with probability x, and "All B's are C's" with probability y. If I have further information about penguins as non-flying objects, the correct statements can be derived, too – entropy does not make penguins fly.
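Formally (standard definitions, not taken from the project page), the maximum-entropy distribution P* satisfying a set \mathcal{R} of probabilistic conditionals is

    P^* = \arg\max_{P \models \mathcal{R}} H(P), \qquad H(P) = -\sum_{\omega} P(\omega) \log P(\omega)

and, given a prior distribution P_0, minimum cross-entropy picks

    P^* = \arg\min_{P \models \mathcal{R}} \sum_{\omega} P(\omega) \log \frac{P(\omega)}{P_0(\omega)}

where each conditional (B|A)[x] in \mathcal{R} is read as the linear constraint P(A \land B) = x \cdot P(A).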

Due to this intensionality, the techniques based on optimizing entropy (usually called ME-techniques, short for Maximum Entropy resp. Minimum Cross-Entropy) provide a powerful yet complicated tool for representation and inference in a probabilistic framework. Optimizing is like using a "black box", and actual inferences are often hard to explain in detail. [KI98] uses a constructive approach to ME via conditionals: it is shown that ME-techniques are based on the conditional structures of prior and new information. This makes ME-inferences far more intelligible and acceptable, and even allows explicit calculations in some cases [KI97].
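As a minimal numerical sketch of the black-box view (our own illustration using scipy, not the project's software), the ME value of P(C|A) in the A/B/C example can be computed by maximizing entropy over the eight atomic states subject to the two conditional constraints:

    # Hypothetical sketch, not the project's code: maximum-entropy inference
    # for the constraints P(B|A) = x and P(C|B) = y over three binary
    # variables, using scipy's SLSQP optimizer.
    import itertools
    import numpy as np
    from scipy.optimize import minimize

    x, y = 0.9, 0.8  # "All A's are B's" [x], "All B's are C's" [y]

    # The eight atomic states (a, b, c), truth values in {0, 1}.
    states = list(itertools.product([0, 1], repeat=3))

    def mass(p, pred):
        # Total probability of the states satisfying the predicate.
        return sum(pi for pi, s in zip(p, states) if pred(*s))

    def neg_entropy(p):
        # Negated entropy, so that minimizing it maximizes H(P).
        q = np.clip(p, 1e-12, 1.0)
        return float(np.sum(q * np.log(q)))

    constraints = [
        {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},  # normalization
        # P(A and B) = x * P(A), i.e. the conditional P(B|A) = x
        {"type": "eq", "fun": lambda p:
            mass(p, lambda a, b, c: a * b) - x * mass(p, lambda a, b, c: a)},
        # P(B and C) = y * P(B), i.e. the conditional P(C|B) = y
        {"type": "eq", "fun": lambda p:
            mass(p, lambda a, b, c: b * c) - y * mass(p, lambda a, b, c: b)},
    ]

    res = minimize(neg_entropy, np.full(8, 1 / 8), method="SLSQP",
                   bounds=[(0.0, 1.0)] * 8, constraints=constraints)
    p = res.x
    print("ME value of P(C|A):",
          mass(p, lambda a, b, c: a * c) / mass(p, lambda a, b, c: a))

Although the two constraints alone leave P(C|A) open, the ME principle singles out one definite value; adding a further constraint such as "penguins don't fly" to the same scheme changes the result accordingly, in line with the penguin example above.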

Persons Involved

Publications related to probabilistic logic (until 1998) and newer publications.