»Conditionals, Information, and Inference«

Hagen, Germany
May 13-15, 2002


We thank all authors for contributing their ideas, and all members of the program committee for supporting this workshop with their careful reviewing work. For the realization of the workshop, we thank the FernUniversität Hagen for providing rooms, equipment, and technical support.
The workshop took place in a very pleasant atmosphere, and the presented papers were discussed lively by all participants. We hope that everybody profited from the conversations during the three days and took away new ideas for future work. Perhaps we will stay in contact and meet again at the next CII workshop.

To conclude, here is a short review of the contributions:

Conditionals, most generally expressed as if-then-statements and also termed default rules, are crucial pieces of information. They represent, for instance, causal or plausible connections, bring isolated facts together, and help us obtain a coherent image of the world. Conditional knowledge is often generic knowledge, acquired inductively from experience or learned from authorities. Conditionals weave a flexible and highly interrelated network of links along which reasoning is possible and which can be applied to different situations.

Due to their non-Boolean nature, however, conditionals are not easily dealt with. They are not simply true or false - rather, a conditional ''if A then B'' provides a context, A, for B to be plausible (or true), and must not be confused with ''A entails B'' or with the material implication ''not A or B''. First work on conditional objects dates back to Boole in the 19th century, and the interest in conditionals was revived in the second half of the last century, when the emerging field of Artificial Intelligence called for appropriate formal tools to handle ''generalized rules''. Since then, conditionals have been the topic of countless publications, emphasizing their relevance for knowledge representation, plausible reasoning, nonmonotonic inference, and belief revision. Moreover, conditionals are closely related to information, understood as reduction of uncertainty. To learn that, in the context A, the proposition B is plausible may reduce uncertainty about B and hence is information. The ability to predict such conditioned propositions is knowledge and as such (earlier) acquired information.
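The distinction drawn above can be illustrated numerically: under a probabilistic reading, the conditional ''if A then B'' is commonly evaluated as the conditional probability P(B | A), which can differ sharply from the probability of the material implication ''not A or B''. The following minimal sketch (with an illustrative joint distribution chosen for this example, not taken from any of the papers) makes the gap concrete:

```python
from fractions import Fraction

# A toy joint distribution over two propositions A and B, given as
# probabilities of the four possible worlds (illustrative numbers only).
joint = {
    (True, True):   Fraction(1, 10),   # A and B
    (True, False):  Fraction(4, 10),   # A and not B
    (False, True):  Fraction(2, 10),   # not A and B
    (False, False): Fraction(3, 10),   # not A and not B
}

def prob(event):
    """Probability of the set of worlds satisfying `event`."""
    return sum(p for world, p in joint.items() if event(*world))

# Material implication "not A or B": true in every world except (A, not B).
p_material = prob(lambda a, b: (not a) or b)

# Conditional probability: P(B | A) = P(A and B) / P(A).
p_conditional = prob(lambda a, b: a and b) / prob(lambda a, b: a)

print(p_material)     # 3/5  -- the material implication is fairly probable
print(p_conditional)  # 1/5  -- yet B is implausible in the context A
```

Here the material implication comes out probable mainly because A itself is often false, while in the context A the proposition B is in fact implausible - precisely the confusion the conditional reading avoids.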

To date, a diversity of default and conditional theories has been brought forth, in quantitative as well as in qualitative frameworks, but clear benchmarks are still under discussion. Therefore, the proper handling of conditionals and information remains a challenge both for theoretical issues and for practical applications.

The intention of this workshop was to bring together researchers interested in and working with conditionals and information processing, in order to present new results, discuss open problems and intensify cooperation. Logicians, philosophers, computer scientists, and scientists concerned with the cognitive meaning of conditionals for processing information all contributed to realizing this aim, ensuring that conditionals are discussed in an adequately broad scope. The present volume contains these contributions, and our special thanks go to our invited speakers Ernest Adams, Phil Calabrese, and Didier Dubois for providing us generously with printed material.

In his first paper, Ernest W. Adams pointed out the insufficiencies of classical logic in handling conditionals adequately, emphasizing the dynamic nature of conditionals and their crucial meaning for practical reasoning. His second paper studied similarities and differences among the notions of generalizations, conditionals, proportions, and probabilities. Philip G. Calabrese investigated deductive relations on conditionals which can be built from four basic implication relations. Moreover, he considered the question of how much confidence can be attached to probability values computed by using the principle of maximum entropy. Didier Dubois sketched a unifying framework for plausible reasoning under incomplete information, based on confidence relations. Inference is achieved by conditioning, thereby imitating standard probabilistic inference in an ordinal environment.

In the section with regular papers, Jean-Francois Bonnefon and Dennis Hilton began by presenting an argumentative approach to interpreting conditional statements, offering a broader view of conditional human reasoning by taking a speaker's intention into account. Dale Jacquette contemplated the paradox of murder by conditional logic, which may arise in practical reasoning involving conditional statements of intent. In Emil Weydert's paper, conditionals both served as building blocks for epistemic states and provided new evidence which has to be incorporated by revision. He proposed a method to perform such a revision of epistemic states by new conditional beliefs which can also be used for iterated revision. Belief revision was also the concern of Richard Booth's paper. Its main purpose was to make a start on a theory of iterated non-prioritised revision, differentiating between regular beliefs and core beliefs. Rainer Osswald made use of conditional relationships to express observation categoricals, and to represent default assumptions in observational logic. He interpreted defaults in two ways: first, as intuitionistic conditionals, and second, in a Reiter-style way. Radim Jirousek, Milan Studeny, and Jirina Vejnarova gave an overview of open problems which arose in connection with realizing crucial concepts from probability theory -- like conditioning and conditional independence -- in a possibilistic framework and combining them with information-theoretical principles.

Piotr Chrzastowski-Wachtel and Jerzy Tyszkiewicz presented a Maple package for experimenting with conditional event algebras. Their implementation uses the correspondence of conditional events to Moore machines and Markov chains, and offers the possibility of deriving symbolic results. I.R. Goodman, D. Bamber, and H.T. Nguyen's paper dealt with the estimation of conditional probabilities from partial knowledge of other conditional probabilities. To do so, they made use of two nonmonotonic ''high probability'' logics based on second order probabilities. Moreover, they exemplified their ideas by applying them to well-known rule-based entailment schemas. Frantisek Matus used discrete structures built from matrices to discover conditional independences in Gaussian vectors. Christoph Beierle and Gabriele Kern-Isberner addressed the question of how probabilistic logic and probabilistic conditional logic can be formalized as abstract logical systems, using the framework of institutions. They further investigated the formal logical relationships between these two logics and propositional logic.

Alex Dekhtyar and Judy Goldsmith presented an efficient method to conditionalize interval probability distribution functions with a possible worlds semantics. Jeff B. Paris and A. Vencovska applied the maximum entropy paradigm to problems in inductive reasoning. They showed not only that this yields results in accordance with common sense, but also that ''reasons'' can be found to explain these results. Manfred Schramm and Bertram Fronhöfer used modified maximum entropy techniques to complete incomplete Bayesian networks. They dealt with two concepts of incompleteness, and considered two different MaxEnt-modifications.

Moreover, in two poster presentations, A.V. Ravishankar Sarma discussed the problem of finding appropriate semantics for causal counterfactuals, and investigated the use of counterfactuals in AGM belief revision.

Gabriele Kern-Isberner
Wilhelm Rödder
Friedhelm Kulmann
FernUniversität Hagen


Last Modified: 21-May-2002