John L. Pollock
Publications
9/25/13
Books
- An Introduction to Symbolic Logic.
Holt, Rinehart, and Winston, 1970.
- Knowledge and Justification.
Princeton University Press, 1974. This book is out
of print, but can be downloaded
as a pdf file (5 MB)
- Subjunctive Reasoning.
Reidel, 1976. This book is out of print, but can be downloaded
as a pdf file (3.3 MB)
- Language and Thought.
Princeton University Press, 1982. This book is out of print, but can
be downloaded as a
pdf file (5 MB)
- The Foundations of Philosophical
Semantics. Princeton
University Press, 1984. This book is out of print, but can be downloaded
as a pdf file (3.9 MB)
- Contemporary Theories of Knowledge.
Rowman and Littlefield, 1986.
- How to Build a Person.
Bradford/MIT, 1990.
- Nomic Probability and the Foundations
of Induction. Oxford
University Press, 1990.
- Technical Methods in Philosophy.
Westview Press, 1990. This book is out of print, but can be downloaded
as a pdf file (1.9 MB)
- Philosophy and Artificial Intelligence:
Essays at the Interface. Co-edited with Rob
Cummins. Bradford/MIT, 1991.
- Cognitive Carpentry.
Bradford/MIT Press, 1995.
- Contemporary Theories of Knowledge, 2nd
edition.
Coauthored with Joe Cruz. Rowman and Littlefield, 1999.
- Logic:
An Introduction to the Formal Study of Reasoning.
This is an introductory symbolic logic text. I do not intend to
publish it other than electronically, and it is free for anyone to
use. I would appreciate any feedback you may have, whether you read
it for yourself or use it in a course.
- Thinking about Acting: Logical Foundations for
Rational Decision Making (Oxford
University Press, 2006).
The objective of this book is to produce a theory of rational
decision making for realistically resource-bounded agents. My
interest is not in "What should I do if I were an ideal agent?", but
rather, "What should I do given that I am who I am, with all my
actual cognitive limitations?"
The book has three parts. Part One addresses the question of where
the values come from that agents use in rational decision making.
The most common view among philosophers is that they are based on
preferences, but I argue that this is computationally impossible. I
propose an alternative theory somewhat reminiscent of Bentham, and
explore how human beings actually arrive at values and how they use
them in decision making.
Part Two investigates the knowledge of probability that is required
for decision-theoretic reasoning. I argue that subjective
probability makes no sense as applied to realistic agents. I sketch
a theory of objective probability to put in its place. Then I use
that to define a variety of causal probability and argue that this
is the kind of probability presupposed by rational decision making.
So what is to be defended is a variety of causal decision theory.
Part Three explores how these values and probabilities are to be
used in decision making. In chapter eight, it is argued first that
actions cannot be evaluated in terms of their expected values as
ordinarily defined, because that does not take account of the fact
that a cognizer may be unable to perform an action, and may even be
unable to try to perform it. An alternative notion of "expected
utility" is defined to be used in place of expected values. In
chapter nine it is argued that individual actions cannot be the
proper objects of decision-theoretic evaluation. We must instead
choose plans, and select actions indirectly on the grounds that they
are prescribed by the plans we adopt. However, our objective cannot
be to find plans with maximal expected utilities. Plans cannot be
meaningfully compared in that way. An alternative, called "locally
global planning", is proposed. According to locally global planning,
individual plans are to be assessed in terms of their contribution
to the cognizer's "master plan". Again, the objective cannot be to
find master plans with maximal expected utilities, because there may
be none, and even if there are, finding them is not a computationally
feasible task for real agents. Instead, the objective must be to
find good master plans, and improve them as better ones come along.
It is argued that there are computationally feasible ways of doing
this, based on defeasible reasoning about values and probabilities.
View or download table of contents
(pdf file)
Publications arranged by topic
Epistemology and Epistemic Cognition
Rational Decision Making and Practical Cognition
(including decision-theoretic planning)
Reasoning: Defeasibly or Deductively
Probability
Agent Architectures and OSCAR
Philosophy of Mind
Older Papers
New Papers:
The
OSCAR Architecture for Rational Agents:
- "OSCAR: An agent architecture based on defeasible
reasoning." Proceedings of the 2008 AAAI Spring Symposium
on Architectures for Intelligent Theory-Based Agents. "OSCAR
is a fully implemented architecture for a cognitive agent, based
largely on the author's work in philosophy concerning epistemology and
practical cognition. The seminal idea is that a generally intelligent
agent must be able to function in an environment in which it is
ignorant of most matters of fact. The architecture incorporates a
general-purpose defeasible reasoner, built on top of an efficient
natural deduction reasoner for first-order logic. It is based upon a
detailed theory about how the various aspects of epistemic and
practical cognition should interact, and many of the details are
driven by theoretical results concerning defeasible reasoning." Download
paper in pdf form.
- "OSCAR: A cognitive architecture for intelligent agents".
The "grand problem" of AI has always been to build artificial agents
of human-level intelligence, capable of operating in environments of
real-world complexity. OSCAR is a cognitive architecture for such
agents, implemented in LISP. OSCAR is based on my extensive work in
philosophy concerning both epistemology and rational decision making.
This paper provides a detailed overview of OSCAR. The main conclusions
are that such agents must be capable of operating against a
background of pervasive ignorance, because the real world is too
complex for them to know more than a small fraction of what is true.
This is handled by giving the agent the power to reason defeasibly.
The OSCAR system of defeasible reasoning is sketched. It is argued
that if epistemic cognition must be defeasible, planning must also be
done defeasibly, and the best way to do that is to reason defeasibly
about plans. A sketch is given about how this might work. Download
paper in pdf form.
- "OSCAR: An architecture for generally intelligent agents".
"OSCAR is a fully implemented architecture for a cognitive agent,
based largely on the author's work in philosophy concerning
epistemology and practical cognition. The seminal idea is that a
generally intelligent agent must be able to function in an environment
in which it is ignorant of most matters of fact. The architecture
incorporates a general-purpose defeasible reasoner, built on top of an
efficient natural deduction reasoner for first-order logic. It is
based upon a detailed theory about how the various aspects of
epistemic and practical cognition should interact, and many of the
details are driven by theoretical results concerning defeasible
reasoning. The architecture is easily extensible by changing the set
of inference schemes supplied to the reasoner. Existing inference
schemes handle many kinds of epistemic cognition, including reasoning
from perceptual input, causal reasoning and the frame problem, and
reasoning defeasibly about probabilities. Work is underway to
implement a system of defeasible decision-theoretic planning. Download
paper in pdf form.
- "Rational Cognition in OSCAR". A general overview of OSCAR,
presented at the ATAL-99 conference, and published in Proceedings
of ATAL-99, ed. N. Jennings and Y. Lesperance, Springer Verlag.
The zipped Powerpoint slides can also be downloaded. Download
slides ; download
paper in pdf form.
- "Rational thought and artificial intelligence". Powerpoint
slides of a talk given at RPI. Download
slides in zipped form.
- "Planning Agents". Appeared in Foundations of Rational
Agency, ed. Rao and Wooldridge, published by Kluwer. "It is
argued that the essence of a rational agent lies in its ability to
make and execute plans. Viewing planning from the perspective of
rational agents requires planning and epistemic reasoning to be
interleaved in ways that are impossible for standard planners.
Planning must be carried out by reasoning rather than algorithmically.
It is illustrated how this can be accomplished in the OSCAR
architecture for rational agency." Postscript
or pdf
Philosophy of
Mind:
- "What Am I? Virtual machines and
the mind/body problem". Forthcoming in Philosophy and
Phenomenological Research. "When your word processor or email
program is running on your computer, this creates a 'virtual machine'
that manipulates windows, files, text, etc. What is this virtual
machine, and what are the virtual objects it manipulates? Many
standard arguments in the philosophy of mind have exact analogues for
virtual machines and virtual objects, but we do not want to draw the
wild metaphysical conclusions that have sometimes tempted philosophers
in the philosophy of mind. A computer file is not made of
epiphenomenal ectoplasm. I argue instead that virtual objects are
'supervenient objects'. The stereotypical example of supervenient
objects is the statue and the lump of clay. To this end I propose a
theory of supervenient objects. Then I turn to persons and mental
states. I argue that my mental states are virtual states of a
cognitive virtual machine implemented on my body, and a person is a
supervenient object supervening on his cognitive virtual machine." Download
paper in pdf form.
- "So you think you exist? In defense
of nolipsism." Coauthored with Jenann Ismael. In Knowledge
and Reality: Essays in Honor of Alvin Plantinga,
eds. Thomas Crisp, Matthew Davidson, David Vander Laan. Springer
Verlag, 2004. "Human beings think of themselves in terms of a
privileged non-descriptive designator: a mental "I". Such thoughts are
called "de se" thoughts. The mind/body problem is the problem of
deciding what kind of thing I am, and it can be regarded as arising
from the fact that we think of ourselves non-descriptively. Why do we
think of ourselves in this way? We investigate the functional role of
"I" (and also "here" and "now") in cognition, arguing that the use of
such non-descriptive "reflexive" designators is essential for making
sophisticated cognition work in a general-purpose cognitive agent. If
we were to build a robot capable of similar cognitive tasks as humans,
it would have to be equipped with such designators. Once we understand
the functional role of reflexive designators in cognition, we will see
that to make cognition work properly, an agent must use a de se
designator in specific ways in its reasoning. Rather simple arguments
based upon how "I" works in reasoning lead to the conclusion that it
cannot designate the body or part of the body. If it designates
anything, it must be something non-physical. However, for the purpose
of making the reasoning work correctly, it makes no difference whether
"I" actually designates anything. If we were to build a robot that
more or less duplicated human cognition, we would not have to equip it
with anything for "I" to designate, and general physicalist
inclinations suggest that there would be nothing for 'I' to designate
in the robot. In particular, it cannot designate the physical
contraption. So the robot would believe "I exist", but it would be
wrong. Why should we think we are any different?" Download
paper in pdf form.
Probability:
- "Reasoning Defeasibly about
Probabilities." To appear in Michael O'Rourke and Joseph
Campbell (eds.), Knowledge and Skepticism, Cambridge, MA: MIT
Press. (This is a short version of the next paper listed.) Originally
presented at the Pacific Division APA, April, 2007. "In concrete
applications of probability, statistical investigation gives us
knowledge of some probabilities, but we generally want to know many
others that are not directly revealed by our data. For instance, we
may know prob(P/Q) (the probability of P
given Q) and prob(P/R), but what we really
want is prob(P/Q&R), and we may not
have the data required to assess that directly. The probability
calculus is of no help here. Given prob(P/Q) and
prob(P/R), it is consistent with the probability
calculus for prob(P/Q&R) to have any
value between 0 and 1. Is there any way to make a reasonable estimate
of the value of prob(P/Q&R)? A related
problem occurs when probability practitioners adopt undefended
assumptions of statistical independence simply on the basis of not
seeing any connection between two propositions. This is common
practice, but its justification has eluded probability theorists, and
researchers are typically apologetic about making such assumptions. Is
there any way to defend the practice? This paper shows that on a
certain conception of probability, 'nomic probability', there are
principles of 'probable probabilities' that license inferences of the
above sort. These are principles telling us that although certain
inferences from probabilities to probabilities are not deductively
valid, nevertheless the second-order probability of their yielding
correct results is 1. This makes it defeasibly reasonable to make the
inferences. Thus I argue that it is defeasibly reasonable to assume
statistical independence when we have no information to the contrary.
And I show that there is a function Y(r,s,a) such that if
prob(P/Q) = r, prob(P/R)
= s, and prob(P/U) = a (where U
is our background knowledge) then it is defeasibly reasonable to
expect that prob(P/Q&R) = Y(r,s,a).
Numerous other defeasible inferences are licensed by similar
principles of probable probabilities. This has the potential to
greatly enhance the usefulness of probabilities in practical
application." Download
paper in pdf form. Download
slides from APA. Download
LISP code for computing probable probabilities.
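The combination the abstract describes can be made concrete. The sketch below implements the rule you obtain by treating Q and R as independent pieces of evidence for P relative to the base rate a; it is offered as an illustrative reconstruction of the Y-function, not as a verbatim transcription of the paper's definition:

```python
def y_function(r, s, a):
    """Estimate prob(P/Q&R) from r = prob(P/Q), s = prob(P/R),
    and a = prob(P/U), where U is the background knowledge.

    Assumed form: the independent-evidence combination rule
    relative to the base rate a (a sketch, not the paper's text).
    """
    if not (0 < a < 1):
        raise ValueError("base rate a must lie strictly between 0 and 1")
    numerator = r * s * (1 - a)
    denominator = r * s * (1 - a) + a * (1 - r) * (1 - s)
    return numerator / denominator

# Sources that merely repeat the base rate add nothing:
# y_function(0.3, 0.3, 0.3) evaluates to 0.3 (up to rounding).
# Two sources that each raise the probability above the base rate
# jointly raise it further: y_function(0.8, 0.7, 0.5) exceeds 0.8.
```

The defeasibility of the inference shows up in the independence assumption: the estimate is the reasonable default value for prob(P/Q&R), subject to defeat by information about how Q and R interact.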
- "Probable probabilities". "In
concrete applications of probability, statistical investigation gives
us knowledge of some probabilities, but we generally want to know many
others that are not directly revealed by our data. For instance, we
may know prob(P/Q) (the probability of P
given Q) and prob(P/R), but what we really
want is prob(P/Q&R), and we may not
have the data required to assess that directly. The probability
calculus is of no help here. Given prob(P/Q) and
prob(P/R), it is consistent with the probability
calculus for prob(P/Q&R) to have any
value between 0 and 1. Is there any way to make a reasonable estimate
of the value of prob(P/Q&R)? A related
problem occurs when probability practitioners adopt undefended
assumptions of statistical independence simply on the basis of not
seeing any connection between two propositions. This is common
practice, but its justification has eluded probability theorists, and
researchers are typically apologetic about making such assumptions. Is
there any way to defend the practice? This paper shows that on a
certain conception of probability, 'nomic probability', there are
principles of 'probable probabilities' that license inferences of the
above sort. These are principles telling us that although certain
inferences from probabilities to probabilities are not deductively
valid, nevertheless the second-order probability of their yielding
correct results is 1. This makes it defeasibly reasonable to make the
inferences. Thus I argue that it is defeasibly reasonable to assume
statistical independence when we have no information to the contrary.
And I show that there is a function Y(r,s|a)
such that if prob(P/Q) = r, prob(P/R)
= s, and prob(P/U) = a (where U
is our background knowledge) then it is defeasibly reasonable to
expect that prob(P/Q&R) = Y(r,s|a).
Numerous other defeasible inferences are licensed by similar
principles of probable probabilities. This has the potential to
greatly enhance the usefulness of probabilities in practical
application." Download
paper in pdf form. Download
LISP code for computing probable probabilities.
- "Probable probabilities (with proofs)". This is the
long version of the previous paper, including additional results and
the proofs of theorems. Download
paper in pdf form. Download
LISP code for computing probable probabilities.
- "Probabilities for AI". "Probability
plays an essential role in many branches of AI, where it is typically
assumed that we have a complete probability distribution when
addressing a problem. But this is unrealistic for problems of
real-world complexity. Statistical investigation gives us knowledge of
some probabilities, but we generally want to know many others that are
not directly revealed by our data. For instance, we may know prob(P/Q)
(the probability of P given Q) and prob(P/R),
but what we really want is prob(P/Q&R),
and we may not have the data required to assess that directly. The
probability calculus is of no help here. Given prob(P/Q)
and prob(P/R), it is consistent with the probability
calculus for prob(P/Q&R) to have any
value between 0 and 1. Is there any way to make a reasonable estimate
of the value of prob(P/Q&R)? A related
problem occurs when probability practitioners adopt undefended
assumptions of statistical independence simply on the basis of not
seeing any connection between two propositions. This is common
practice, but its justification has eluded probability theorists, and
researchers are typically apologetic about making such assumptions. Is
there any way to defend the practice? This paper shows that on a
certain conception of probability, 'nomic probability', there are
principles of 'probable probabilities' that license inferences of the
above sort. These are principles telling us that although certain
inferences from probabilities to probabilities are not deductively
valid, nevertheless the second-order probability of their yielding
correct results is 1. This makes it defeasibly reasonable to make the
inferences. Thus I argue that it is defeasibly reasonable to assume
statistical independence when we have no information to the contrary.
And I show that there is a function Y(r,s|a)
such that if prob(P/Q) = r, prob(P/R)
= s, and prob(P/U) = a (where U
is our background knowledge) then it is defeasibly reasonable to
expect that prob(P/Q&R) = Y(r,s|a).
Numerous other defeasible inferences are licensed by similar
principles of probable probabilities. This has the potential to
greatly enhance the usefulness of probabilities in practical
application." Download
paper in pdf form. Download
LISP code for computing probable probabilities.
- "Probabilities for AI (with proofs)". This is the
long version of the previous paper, including additional results and
the proofs of theorems. Download
paper in pdf form. Download
LISP code for computing probable probabilities.
- ""Problems for Bayesian
Epistemology"This paper raises problems for the different
strategies for making sense of subjective probability within the
framework of Bayesian epistemology, in each case arguing that there is
no way to do it for real cognitive agents. If subjective probability
makes any sense at all, it is only for ideal agents, but that is
useless for epistemological purposes. Forthcoming in Philosophical
Studies. Download
paper in pdf form.
- "Direct Inference and
Probable Probabilities" New results in the theory of nomic
probability have led to a theory of probable probabilities, which
licenses defeasible inferences between probabilities that are not
validated by the probability calculus. Among these are classical
principles of direct inference together with some new more general
principles that greatly strengthen direct inference and make it much
more useful. Download paper
in pdf form.
- "Joint Probabilities ". "When combining information from
multiple sources and attempting to estimate the probability of a
conclusion, we often find ourselves in the position of knowing the
probability of the conclusion conditional on each of the individual
sources, but we have no direct information about the probability of
the conclusion conditional on the combination of sources. The
probability calculus provides no way of computing such joint
probabilities. This paper introduces a new way of combining
probabilistic information to estimate joint probabilities. It is shown
that on a particular conception of objective probabilities, clear
sense can be made of second-order probabilities (probabilities of
probabilities), and these can be related to combinatorial theorems
about proportions in finite sets as the sizes of the sets go to
infinity. There is a rich mathematical theory consisting of such
theorems, and the theorems generate corresponding theorems about
second-order probabilities. Among the latter are a number of theorems
to the effect that certain inferences from probabilities to
probabilities, although not licensed by the probability calculus, have
probability 1 of producing correct results. This does not mean that
they will always produce correct results, but the set of cases in
which the inferences go wrong form a set of measure 0. Among these
inferences are some enabling us to reasonably estimate the values of
joint probabilities in a wide range of cases. A function called the
Y-function is defined. The central theorem is the Y-Theorem, which
tells us that if we know the individual probabilities for the
different information sources and estimate the joint probability using
the Y-function, the second-order probability of getting the right
answer is 1. This mathematical result is tested empirically using a
simple multi-sensor example. The Y-theorem agrees with Dempster's rule
of combination in special cases, but not in general. The paper goes on
to investigate cases in which the Y-theorem cannot be expected to give
the right answer, and it is shown that there are generalizations of
the Y-theorem that can still be employed." Download
paper in pdf form.
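The empirical test the abstract mentions can be sketched with a small simulation: generate a world in which two "sensor" conditions Q and R are conditionally independent given P, then compare the observed frequency of P among the Q&R cases with the Y-function estimate. The formula used here is the independent-evidence combination rule, assumed for illustration to be the intended form of the Y-function; all parameter values are made up:

```python
import random

def y_function(r, s, a):
    # Independent-evidence combination rule (assumed form of the Y-function).
    return r * s * (1 - a) / (r * s * (1 - a) + a * (1 - r) * (1 - s))

def simulate(n=200_000, seed=0):
    rng = random.Random(seed)
    a = 0.2                  # base rate prob(P) (illustrative value)
    pq_p, pq_np = 0.9, 0.3   # prob(Q/P), prob(Q/~P)
    pr_p, pr_np = 0.8, 0.4   # prob(R/P), prob(R/~P)
    counts = {"q": [0, 0], "r": [0, 0], "qr": [0, 0]}  # [cases, P-cases]
    for _ in range(n):
        p = rng.random() < a
        q = rng.random() < (pq_p if p else pq_np)
        r = rng.random() < (pr_p if p else pr_np)
        for key, cond in (("q", q), ("r", r), ("qr", q and r)):
            if cond:
                counts[key][0] += 1
                counts[key][1] += p
    freq = {k: hits / cases for k, (cases, hits) in counts.items()}
    estimate = y_function(freq["q"], freq["r"], a)
    return freq["qr"], estimate   # observed prob(P/Q&R) vs. Y-estimate

observed, estimated = simulate()
# Under conditional independence the two numbers agree closely;
# deliberately correlating Q and R would make the estimate go wrong,
# illustrating the measure-0 exception cases the abstract describes.
```

This is the sense in which the inference is defeasible rather than valid: the Y-estimate tracks the truth in the "typical" case, and additional knowledge about dependencies between the sources defeats it.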
- "The Y-function ". In Probability and Evidence,
(ed). Greg Wheeler and Billl Harper, King's College publications,
2007. "Direct inference derives values for definite (single-case)
probabilities from those of related indefinite (general)
probabilities. But direct inference is less useful than might be
supposed, because we often have too much information, with the result
that we can make conflicting direct inferences, and hence they all
undergo collective defeat, leaving us without any conclusion to draw
about the value of the definite probabilities. This paper presents
reasons for believing that there is a function, the 'Y-function', that
can be used to combine different indefinite probabilities to yield a
single value for the definite probability. Thus we get a kind of
'computational' direct inference." Download
paper in pdf form.
- "An Objectivist Argument for Thirdism", with the
OSCAR seminar: Adam Arico, Nathan Ballantyne, Matt Bedke, Jacob Caton,
Ian Evans, Don Fallis, Brian Fiala, Martin Frické, David Glick, Peter
Gross, Terry Horgan, Jenann Ismael, Daniel Sanderman, Paul Thorn,
Orlin Vakarelov, Analysis, forthcoming. Download
paper in pdf form.
- "Causal probability"(Synthese 132 (2002),
143-185). Examples growing out of the Newcomb problem have
convinced many people that decision theory should proceed in terms of
some kind of causal probability. I endorse this view and define and
investigate a variety of causal probability. My definition is related
to Skyrms' definition, but proceeds in terms of objective
probabilities rather than subjective probabilities, and avoids taking
causal dependence as a primitive concept. Download
paper in pdf form.
- "The theory of nomic probability". This is a slightly
revised version of "The theory of nomic probability", Synthese
90 (1992), 263-300. It summarizes much of the material published in Nomic
Probability and the Foundations of Induction, Oxford, 1990. Download
paper in pdf form.
Rational
Decision Making, Practical Cognition, Decision-Theoretic Planning:
- "A Resource-Bounded Agent
Addresses the Newcomb Problem". To appear in Synthese.
"In the Newcomb problem, the standard arguments for taking either one
box or both boxes adduce what seem to be relevant considerations, but
they are not complete arguments, and attempts to complete the
arguments rely upon incorrect principles of rational decision making.
It is argued that by considering how the predictor is making his
prediction, we can generate a more complete argument, and this in turn
supports a form of causal decision theory." Download
paper in pdf form. Download powerpoint
slides for a related talk.
"Rational Decison-Making in Resource-Bounded Agents". To
appear in How to Build Smart Machines, in the Rensselaer
Core Debates in Cognitive Science. "The objective of this
paper is to construct an implementable theory of rational
decision-making for cognitive agents subject to realistic resource
constraints. It is argued that decision-making should select actions
indirectly by selecting plans that prescribe them. It is also argued
that although expected values provide the tool for evaluating plans,
plans cannot be compared straightforwardly in terms of their
expected values, and the objective of a realistic agent cannot be to
find optimal plans. The theory of Locally Global planning is
proposed as a realistic alternative to standard "maximizing"
theories of rational decision-making." Download
paper in pdf form.
- "Evaluative Cognition". Nous 35 (2001),
325-364. The cognition of a cognitive agent can be subdivided into two
parts. Epistemic cognition is that kind of cognition responsible for
producing and maintaining beliefs. Practical cognition evaluates the
world, adopts plans, and initiates action. There is a massive
literature both in philosophy and artificial intelligence concerning
various aspects of epistemic cognition, and large parts of it are well
understood. Practical cognition is less well understood. We can
usefully divide practical cognition into five parts: (1) the
evaluation of the world as represented by the agent's beliefs, (2) the
adoption of goals for changing it, (3) the construction of plans for
achieving goals, (4) the adoption of plans, and (5) the execution of
plans. There is a substantial literature in AI concerning the
construction and execution of plans, and I will say nothing further
about those topics here. This paper will focus on the evaluative
aspects of practical cognition. Evaluation plays an essential role in
both goal selection and plan adoption. My concern here is the
investigation of evaluation as a cognitive enterprise performed by
cognitive agents. I am interested both in how it is performed in human
beings and how it might be performed in artificial rational agents. Download
paper in pdf form.
- "Plans and
decisions". (Theory and Decision 57
(2005), 79-107.) Counterexamples are constructed for classical
decision theory, turning on the fact that actions must often be chosen
in groups rather than individually, i.e., the objects of rational
choice are plans. It is argued that there is no way to define
optimality for plans that makes the finding of optimal plans the
desideratum of rational decision-making. An alternative called
'locally global planning' is proposed as a replacement for classical
decision theory. Decision-making becomes a non-terminating process
without a precise target rather than a terminating search for an
optimal solution. Download
paper in pdf form.
- "Rational Choice and Action Omnipotence" (Philosophical
Review 111 (2003), 1-23). Counterexamples
are constructed for classical decision theory, turning on the fact
that an agent may be unable to perform an action, and may even be
unable to try to perform an action. A proposal is made for how to
repair classical decision theory in light of these counterexamples. Download
paper in pdf form.
- "Some logical conundrums for decision-theoretic contingency
planning". There are two general approaches to handling
contingencies in decision-theoretic planning. State-space planners
reason globally, building a map of the parts of the world relevant to
the planning problem, and then attempt to distill a plan out of the
map. POCL planners reason locally, attempting to build the plan up
from local relationships. A planning problem is constructed that
humans find trivial, but no state-space planner can solve. This
motivates an investigation of decision-theoretic POCL contingency
planners. Existing POCL contingency planners attempt to generalize the
results of classical POCL contingency planning. However, this paper
argues that the nature of contingency planning changes dramatically in
decision-theoretic contexts, and results from classical contingency
planning are of little relevance. In particular, in classical planning
contingencies can only be attached to conditional forks, but in most
uses of contingencies in decision-theoretic planning they are attached
to single branches of the plan rather than to conditional forks. A
criterion of adequacy for contingency planners is formulated,
following from ordinary completeness, and it is shown that existing
decision-theoretic POCL contingency planners do not satisfy it. Some
tentative suggestions are made regarding how to construct a planner
that does satisfy the adequacy condition. Download
paper in pdf form.
- "Against Optimality: The
Logical Foundations of Decision-Theoretic Planning". Computational
Intelligence 22 (2006), 1-25. "This paper
investigates decision-theoretic planning in sophisticated autonomous
agents operating in environments of real-world complexity. An example
might be a planetary rover exploring a largely unknown planet. It is
argued that existing algorithms for decision-theoretic planning are
based on a logically incorrect theory of rational decision making.
Plans cannot be evaluated directly in terms of their expected values,
because plans can be of different scopes, and they can interact with
other previously adopted plans. Furthermore, in the real world, the
search for optimal plans is completely intractable. An alternative
theory of rational decision making is proposed, called 'locally global
planning'." Download paper
in pdf form.
- "The Logical Foundations of Decision-Theoretic Planning".
(version of 2/1/03) Decision-theoretic planning is normally
based on the assumption that plans can be compared by comparing their
expected-values, and the objective is to find an optimal plan. This is
typically defended by reference to classical decision theory. However,
classical decision theory is actually incompatible with this "simple
plan-based decision theory". A defense of plan-based decision theory
must begin by showing that classical decision theory is incorrect
insofar as the two theories conflict, so this paper begins by raising
objections to classical decision theory. First, there is a discussion
of the considerations arising out of the Newcomb problem that have
given rise to causal decision theory. Next, counterexamples are
constructed for classical decision theory turning on the fact that an
agent may be unable to perform an action, and may even be unable to
try to perform an action. A proposal is made for how to repair
classical decision theory in light of these counterexamples. But then
turning to the concept of an "alternative" that is presupposed by
classical decision theory, it is argued that actions must often be
chosen in groups rather than individually, i.e., the objects of
rational choice are plans. It is argued that optimality cannot be
defined for plans, and even if it could be, it would not be reasonable
to require rational agents to find optimal plans. So simple plan-based
decision theory must also be rejected. An alternative called "locally
global planning" is proposed as a replacement for both classical
decision theory and simple plan-based decision theory. Download
paper in pdf form.
- "An Easy "Hard Problem" for Decision-Theoretic Planning".
This paper presents a challenge problem for decision-theoretic
planners. State-space planners reason globally, building a map of the
parts of the world relevant to the planning problem, and then attempt
to distill a plan out of the map. A planning problem is constructed
that humans find trivial, but no state-space planner can solve.
Existing POCL planners cannot solve the problem either, but for a less
fundamental reason. Download
paper in pdf form.
- "Locally Global Planning". This is a presentation at the
Decision-Theoretic Planning Workshop during AIPS-2000. It is
conjectured that MDP and POMDP planning will remain infeasible for
complex domains, so some form of "classical" decision-theoretic
planning is sought. However, local plans cannot be properly compared
in terms of their expected values, because those values will be
affected by the other plans the agent has adopted. Plans must instead
be merged into a single "master-plan", and new plans evaluated in
terms of their contribution to the value of the master plan. To make
both the construction and evaluation of plans feasible, it is proposed
to evaluate plans and their interactions defeasibly. Download
paper in pdf form.
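The evaluation criterion described in this abstract, judging a candidate plan by its marginal contribution to the master-plan's expected value rather than by its standalone expected value, can be illustrated with a small sketch. Everything here (the set-of-actions plan representation and the `ev` and `merge` functions) is a hypothetical toy for illustration, not the OSCAR implementation:

```python
# Toy sketch of the "locally global" evaluation criterion: a candidate
# plan is judged by the change in the master-plan's expected value when
# the candidate is merged into it, not by its own standalone value.

def marginal_value(candidate, master_plan, expected_value, merge):
    """Return the candidate plan's contribution to the master-plan's value."""
    merged = merge(master_plan, candidate)
    return expected_value(merged) - expected_value(master_plan)

# Hypothetical utilities; expected value is just a sum of per-action
# utilities, purely for illustration.
utilities = {"fix-roof": 5.0, "buy-tarp": 1.0, "sell-ladder": 2.0}

def ev(plan):
    return sum(utilities[a] for a in plan)

def merge(master, candidate):
    # Merging can cancel interacting steps: here, selling the ladder
    # is dropped if the merged plan still needs it to fix the roof.
    merged = set(master) | set(candidate)
    if "fix-roof" in merged and "sell-ladder" in merged:
        merged.discard("sell-ladder")
    return merged

master = {"fix-roof"}
print(marginal_value({"buy-tarp"}, master, ev, merge))     # 1.0
print(marginal_value({"sell-ladder"}, master, ev, merge))  # 0.0
```

The toy `merge` shows why standalone expected values mislead: `sell-ladder` looks worth 2.0 on its own, but contributes nothing once merged with a master-plan that needs the ladder.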
- "The Logical Foundations of Goal-Regression Planning in
Autonomous Agents". Artificial Intelligence, 106
(1998), 267-335. "This paper addresses the logical foundations of
goal-regression planning in autonomous rational agents. It focuses
mainly on three problems. The first is that goals and subgoals will
often be conjunctions, and to apply goal-regression planning to a
conjunction we usually have to plan separately for the conjuncts and
then combine the resulting subplans. A logical problem arises from the
fact that the subplans may destructively interfere with each other.
This problem has been partially solved in the AI literature (e.g., in
SNLP and UCPOP), but the solutions proposed there work only when a
restrictive assumption is satisfied. This assumption pertains to
computability of threats. It is argued that this assumption may fail
for an autonomous rational agent operating in a complex environment.
Relaxing this assumption leads to a theory of defeasible planning. The
theory is formulated precisely and an implementation in the OSCAR
architecture is discussed. The second problem is that goal-regression
planning proceeds in terms of reasoning that runs afoul of the Frame
Problem. It is argued that a previously proposed solution to the Frame
Problem legitimizes goal-regression planning, but also has the
consequence that some restrictions must be imposed on the logical form
of goals and subgoals amenable to such planning. These restrictions
have to do with temporal-projectibility. The third problem is that the
theory of goal-regression planning found in the AI literature imposes
restrictive syntactical constraints on goals and subgoals and on the
relation of logical consequence. Relaxing these restrictions leads to
a generalization of the notion of a threat, related to collective
defeat in defeasible reasoning. Relaxing the restrictions also has the
consequence that the previously adequate definition of
"expectable-result" no longer guarantees closure under logical
consequence, and must be revised accordingly. That in turn leads to
the need for an additional rule for goal-regression planning. Roughly,
the rule allows us to plan for the achievement of a goal by searching
for plans that will achieve states that "cause" the goal. Such a rule
was not previously necessary, but becomes necessary when the
syntactical constraints are relaxed. The final result is a general
semantics for goal-regression planning and a set of procedures that is
provably sound and complete. It is shown that this semantics can
easily handle concurrent actions, quantified preconditions and
effects, creation and destruction of objects, and causal connections
embodying complex temporal relationships." Postscript
or pdf
- "Reasoning Defeasibly about Plans". OSCAR Project technical
report. "Planning theory has traditionally made the assumption that
the planner begins with all relevant knowledge for solving the
problem. Autonomous agents cannot make that assumption. They are both
planning agents and epistemic agents, and the pursuit of knowledge is
driven by the planning. The search for a plan raises questions about
the world and the agent must pursue answers to those questions during
the course of the planning. Thus planning and epistemic investigation
are interleaved. This is difficult to do with a traditional
algorithmic planner. The obvious alternative is to build an agent in
which planning and epistemic investigation use the same inference
engine. This paper shows how to build such a planner based upon the
OSCAR architecture for rational agents and using OSCAR's defeasible
reasoner as the inference engine. The resulting planner constructs
plans in roughly the same way as UCPOP, but does it by reasoning
defeasibly about plans rather than running conventional plan search
algorithms. A beneficial side effect is that the resulting planner is
completely freed of the syntactical restrictions imposed by the STRIPS
representation of actions. The planner can use the full power of
first-order logic in representing the information used in planning." Postscript
or pdf
Reasoning: Defeasibly or
Deductively:
- "A Recursive Semantics for
Defeasible Reasoning", in Argumentation in Artificial
Intelligence, ed. Iyad Rahwan and Guillermo Simari, Springer.
"One of the most striking characteristics of human beings is their
ability to function successfully in complex environments about which
they know very little. In light of our pervasive ignorance, we cannot
get around in the world just reasoning deductively from our prior
beliefs together with new perceptual input. As our conclusions are not
guaranteed to be true, we must countenance the possibility that new
information will lead us to change our minds, withdrawing previously
adopted beliefs. In this sense, our reasoning is 'defeasible'. The
question arises how defeasible reasoning works, or ought to work. In
particular we need rules governing what a cognizer ought to believe
given a set of interacting arguments some of which defeat others. That
is what is called a 'semantics' for defeasible reasoning, and this
chapter will propose a new semantics that avoids certain clear
counter-examples to all existing semantics." Download
paper in pdf form.
- "Defeasible Reasoning". Reasoning: Studies of Human Inference and its
Foundations, ed. Jonathan Adler and Lance Rips, Cambridge
University Press. This gives an overview of the OSCAR theory
of defeasible reasoning. Download
paper in pdf form.
- Skolemization and
Unification in Natural Deduction, OSCAR Project technical
report. "Skolemization and unification are familiar parts of the
machinery of automated theorem proving. However, their use has
generally been confined to systems using variants of
resolution-refutation, or tableau methods, or similar methods in which
the desired conclusion is negated and added to the premises, the
resulting set of premises is skolemized, and then a contradiction is
derived. In an earlier paper, I described a natural deduction system
(OSCAR) that was noteworthy for its efficiency. However, a peculiar
feature of that system was that it did not use skolemization and
unification. Instead, it used "intuitive" rules of universal and
existential instantiation and generalization that constructed
substitution instances of quantified formulas, using all the (closed)
terms that had occurred elsewhere in the argument. It has always
seemed that this should be a source of considerable inefficiency,
because the same reasoning will be replicated for different
substitution instances. This suggests that if a version of the natural
deduction system could be produced using skolemization and
unification, it might be more efficient. However, it was not obvious
how to use skolemization and unification in natural deduction. This
paper presents a solution to that problem."
Epistemic Cognition:
- "Epistemology, Rationality, and
Cognition". This is a longer version of a paper by the same
title to appear in Companion to Epistemology, second
edition, ed. Matthias Steup, Blackwells. It consists of a general
sketch of my views on epistemology and how it relates to cognitive
science and artificial intelligence. Download
paper in pdf form.
- "Pollock -- Epistemology: 5 Questions".
My answers to the interview questions in Epistemology: 5
Questions, eds. Vincent Hendricks and Duncan Pritchard,
Automatic Press/VIP, 2008. Download
paper in pdf form.
- "Irrationality and Cognition".
Presented at the Inland Northwest Philosophy Conference on Knowledge
and Skepticism, held April 30-May 2, 2004, in Moscow, ID and Pullman,
WA. The strategy of this paper is to throw light on rational cognition
and epistemic justification by examining irrationality. I argue that
practical irrationality derives from a general difficulty we have in
overriding conditioned likings. Epistemic irrationality is
possible because we are reflexive cognizers, able to reason about and
redirect some aspects of our own cognition. This has the consequence
that practical irrationality can affect our epistemic cognition. I
argue that all epistemic irrationality can be traced to this single
source. The upshot is that one cannot give a theory of epistemic
rationality or epistemic justification without simultaneously giving a
theory of practical rationality. A consequence of this account is that
a theory of rationality is a descriptive theory, describing contingent
features of a cognitive architecture, and it forms the core of a
general theory of "voluntary" cognition, those aspects of cognition
that are under voluntary control. It also follows that most of the
so-called "rules for rationality" that philosophers have proposed are
really just rules describing default (non-reflexive) cognition. It can
be perfectly rational for a reflexive cognizer to break these rules.
The "normativity" of rationality is a reflection of a built-in feature
of reflexive cognition -- when we detect violations of rationality, we
have a tendency to desire to correct them. This is just another part
of the descriptive theory of rationality. Although theories of
rationality are descriptive, the structure of reflexive cognition
gives philosophers, as human cognizers, privileged access to certain
aspects of rational cognition. Philosophical theories of rationality
are really scientific theories, based on inference to the best
explanation, that take contingent introspective data as the evidence
to be explained. Download paper
in pdf form.
- "Vision, Knowledge, and the Mystery
Link", coauthored with Iris Oved. Almost final draft, to appear
in Philosophical Perspectives vol. 19. It is argued
that empirical data indicates that colors do not have characteristic
looks that are the same for all people at all times. This creates a
problem for many theories of perceptual knowledge. An examination of
current computational theories of vision leads to an account of the
visual image as a system of mental representations. This is used to
develop an account of how epistemic cognition can produce beliefs on
the basis of the visual image. The result is a sophisticated version
of direct realism. Download paper
in pdf form.
- "The need for an epistemology". Proceedings of the 3rd
International NASA Workshop on Planning and Scheduling for Space. It
is argued that we cannot build a sophisticated autonomous planetary
rover just by implementing sophisticated planning algorithms. Planning
must be based on information, and the agent must have the cognitive
capability of acquiring new information about its environment. That
requires the implementation of a sophisticated epistemology.
Epistemological considerations indicate that the rover cannot be
assumed to have a complete probability distribution at its disposal.
Its planning must be based upon "thin" knowledge of probabilities, and
that has important implications for what planning algorithms might be
employed. Download
paper in pdf form.
- "Defeasible reasoning with variable degrees of justification". Artificial
Intelligence 133 (2002), 233-282. The question
addressed in this paper is how the degree of justification of a belief
is determined. A conclusion may be supported by several different
arguments, the arguments typically being defeasible, and there may
also be arguments of varying strengths for defeaters for some of the
supporting arguments. What is sought is a way of computing the "on
sum" degree of justification of a conclusion in terms of the degrees
of justification of all relevant premises and the strengths of all
relevant reasons. I have in the past defended various principles
pertaining to this problem. In this paper I reaffirm some of those
principles but propose a significantly different final analysis.
Specifically, I endorse the weakest link principle for the computation
of argument strengths. According to this principle the degree of
justification an argument confers on its conclusion in the absence of
other relevant arguments is the minimum of the degrees of
justification of its premises and the strengths of the reasons
employed in the argument. I reaffirm my earlier rejection of the
accrual of reasons, according to which two arguments for a conclusion
can result in a higher degree of justification than either argument by
itself. This paper diverges from my earlier theory mainly in its
treatment of defeaters. First, it argues that defeaters that are too
weak to defeat an inference outright may still diminish the strength
of the conclusion. Second, in the past I have also denied that
multiple defeaters can result in the defeat of an argument that is not
defeated by any of the defeaters individually. In this paper I urge
that there are compelling examples that support a limited version of
this "collaborative" defeat. The need to accommodate diminishers and
collaborative defeat has important consequences for the computation of
degrees of justification. The paper proposes a characterization of
degrees of justification that captures the various principles endorsed
and constructs an algorithm for computing them. Download
paper in pdf form.
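The weakest-link principle stated in this abstract has a direct computational reading: absent other relevant arguments, the degree of justification an argument confers on its conclusion is the minimum of its premises' degrees of justification and the strengths of the reasons it employs. The sketch below illustrates that principle, plus one simple way to model "diminishers"; the `Argument` class and the subtraction rule for diminishment are illustrative assumptions, not the computation the paper actually constructs:

```python
from dataclasses import dataclass

@dataclass
class Argument:
    premise_justifications: list[float]  # degree of justification of each premise
    reason_strengths: list[float]        # strength of each reason-schema employed

    def strength(self) -> float:
        """Weakest-link degree of justification conferred on the conclusion."""
        return min(self.premise_justifications + self.reason_strengths)

def diminished_strength(argument: Argument, defeater_strength: float) -> float:
    """A defeater too weak to defeat outright still lowers the conclusion's
    justification; simple subtraction is one illustrative model of this."""
    s = argument.strength()
    if defeater_strength >= s:
        return 0.0  # outright defeat
    return s - defeater_strength

arg = Argument(premise_justifications=[0.9, 0.7], reason_strengths=[0.8])
print(arg.strength())                            # 0.7 (the weakest link)
print(round(diminished_strength(arg, 0.2), 2))   # 0.5 (diminished, not defeated)
print(diminished_strength(arg, 0.7))             # 0.0 (defeated outright)
```

Here a defeater of strength 0.2 merely lowers the conclusion's justification, while one matching the argument's own strength (0.7) defeats it outright.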
- "OSCAR System Description". This is a presentation at the
Non-Monotonic Reasoning Workshop 2000. Download
paper in pdf form.
- "Belief Revision and Epistemology". Coauthored with Anthony
Gillies. (Synthese 122 (2000), 69-92). "Belief revision
is the process of changing one's beliefs to reflect the acquisition of
new information. There are two ways one might proceed in investigating
rational belief revision. The postulational approach proposes abstract
general principles purportedly governing the process--in effect,
axiomatizing or partially axiomatizing belief revision. By contrast,
the derivational approach tries to derive a theory of belief revision
from a more concrete epistemological theory. A number of authors have
favored the postulational approach. The best known such theory is the
AGM theory originated by Alchourrón, Gärdenfors, and Makinson (1985).
The purpose of this paper is to question the viability of that theory,
and by implication to question the fruitfulness of the entire
postulational approach. Our contention will be twofold. First, such
theories of belief revision proceed at too high a level of
abstraction, ignoring aspects of rational cognition without which it
is impossible to formulate true principles of rational belief
revision. A theory of belief revision should be driven by a more
concrete epistemology. In the absence of an epistemological theory to
generate a theory of belief revision, the latter will have to be driven
by nothing more than vague formal intuitions, and such intuitions are
almost certain to get the details of belief revision wrong. Second,
abstract theories of belief revision are not necessary anyway if we
have a sufficiently detailed epistemological theory, because we can
just apply the theory to see how to revise our beliefs. Of course, to
serve this purpose the epistemological theory must be developed in
sufficient detail to tell us precisely how belief revision should
proceed in any given instance. Many epistemological theories are
themselves too abstract to serve that purpose. The epistemological
theory employed in this paper is that underlying the OSCAR
architecture for rational agents (Pollock 1995). This theory is fully
implemented in the AI system OSCAR and in any given case it makes a
determinate decision about how an agent's beliefs should be revised."
Postscript or pdf
- "Perceiving and Reasoning about a Changing World", Computational
Intelligence, Volume 14, Number 4, 1998, 498-562. This is a
revised version of 2/12/05. "A rational agent (artificial or
otherwise) residing in a complex changing environment must gather
information perceptually, update that information as the world
changes, and combine that information with causal information to
reason about the changing world. Using the system of defeasible
reasoning that is incorporated into the OSCAR architecture for
rational agents, a set of reason-schemas is proposed for enabling an
agent to perform some of the requisite reasoning. Along the way,
solutions are proposed for the Frame Problem, the Qualification
Problem, and the Ramification Problem. The principles and reasoning
described have all been implemented in OSCAR." pdf
Older Papers:
- Should we maximize expected value?,
OSCAR Project technical report. "It is argued that cases of rational
risk aversion force the abandonment of the view that rationality
requires choosing actions that maximize expected value. It is proposed
that plans should be the unit of decision-theoretic evaluation rather
than individual actions." The source
code for the simulation discussed in the paper can also be
downloaded.
- Taking perception seriously.
"A rational agent (artificial or otherwise) residing in a complex
changing environment must gather information perceptually and update
that information as the world changes. An agent designer must address
two problems. First, perception need not be veridical: the world can be
other than it appears. Second, perception is really a form of
sampling. An agent cannot perceptually monitor the entire state of the
world at all times. The best perception can do is provide the agent
with images of small parts of the world at discrete times or over
short time intervals, and it is up to the agent's cognitive faculties
to make inferences from these to a coherent picture of the world.
Using the system of defeasible reasoning that is incorporated into the
OSCAR architecture for rational agents, a set of reason-schemas will
be proposed for enabling an agent to perform some of the requisite
reasoning. The principles and reasoning described have all been
implemented in OSCAR." This is a longer version of a paper to appear
in the proceedings of The First International Conference on Autonomous
Agents.
- OSCAR-DSS, OSCAR Project
technical report. "OSCAR-DSS is a generic decision support system
based upon OSCAR--a general-purpose programmable architecture for
rational agents. OSCAR-DSS is a partial implementation of that
architecture, with some minor modifications to convert it to a
decision support system that merely recommends actions rather than
performing them. OSCAR-DSS has been further instantiated to produce
OSCAR-MDA, a medical decision support system currently focused on
emergency room medicine. The OSCAR architecture begins by
distinguishing between epistemic cognition (cognition about what to
believe) and practical cognition (cognition about what to do). The
core of the OSCAR architecture consists of the system of epistemic
cognition together with procedures for dealing with plans that have
been adopted. The system of epistemic cognition is in turn based
largely upon a general-purpose defeasible and deductive reasoner. The
core is implemented directly in LISP. The bulk of the system of
practical cognition is then implemented indirectly in the core by
giving the system of epistemic cognition the ability to reason about
plan adoptability."
- Reason in a Changing World, International
Conference on Formal and Applied Practical Reasoning, Bonn,
Germany, 1996. This is the version of the paper presented at the
conference, but it is a significant revision of the paper appearing in
the Proceedings.