FOURTH INTERNATIONAL
SYMPOSIUM ON
IMPRECISE PROBABILITIES AND THEIR APPLICATIONS
Carnegie Mellon University
Pittsburgh, PA, USA
July 20-23, 2005
DETAILED PROGRAM
After consulting the program
overview, you can find here a more detailed list of activities for
ISIPTA'05.
Wednesday, July 20
-
1:00PM - 2:25PM
Tutorial: Introduction to Imprecise Probabilities,
by Gert de Cooman, Ghent University, Belgium.
The tutorial will introduce basic notions and ideas in the theory of imprecise probabilities. It will highlight the behavioural interpretation of several types of imprecise probability models, such as lower previsions, sets of probability measures and sets of desirable gambles, as well as their mutual relationships. Rationality criteria for these models, based on their interpretation, will be discussed, such as avoiding sure loss and coherence. We also touch upon the issues of conditioning and decision making with such models.
-
2:35PM - 4:00PM
Tutorial: Imprecise Probabilities and Financial Risk Assessment
by Paolo Vicig, Università di Trieste, Italy.
Although financial risk measurement is a widely investigated research area, its relationship with imprecise probabilities has been mostly overlooked. However, risk measures can be viewed as instances of upper (or lower) previsions, which lets us apply the theory of imprecise previsions to them. Therefore, after presenting some well-known risk measures (including Value-at-Risk or VaR, and coherent and convex risk measures), we show how their definitions can be generalized and discuss their consistency properties. Thus, for instance, VaR may or may not avoid sure loss, and conditions for this can be derived. This analysis also leads us to consider a very large class of imprecise previsions, which we term convex previsions, generalizing convex risk measures and other uncertainty models. Measures for conditional risks can further be introduced by extending this class to conditional convex previsions. Finally, we discuss the role of some important notions in the theory of imprecise probabilities, such as natural extension, when applied to risk measurement.
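As a concrete anchor for the terms in the abstract above, here is a minimal empirical Value-at-Risk computation. The function name, the loss sample, and the 95% level are assumptions of this sketch, not material from the tutorial itself:

```python
import numpy as np

def value_at_risk(losses, alpha=0.95):
    """Empirical VaR: the alpha-quantile of the loss sample, i.e. a
    loss threshold exceeded with probability roughly 1 - alpha."""
    return float(np.quantile(np.asarray(losses, dtype=float), alpha))

# Hypothetical portfolio losses (positive numbers are losses).
losses = [-2.0, -1.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 5.0, 10.0]
print(value_at_risk(losses, 0.95))
```

Viewing the VaR of a loss variable as an upper prevision is what opens it to the consistency analysis (avoiding sure loss, coherence) discussed in the tutorial.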
-
4:10PM - 5:35PM
Tutorial: The Logical Concept of Probability and
Statistical Inference,
by Kurt Weichselberger (with Anton Wallner), Ludwig-Maximilians-University Munich,
Germany.
This tutorial will review Prof. Weichselberger's theory of interval
probability. A consistent concept of logical probability calls for the
employment of interval probability. Such a concept, which attributes
probability to arguments consisting of premise and conclusion, can
be used to generate a system of axioms for statistical inference.
For the contents of this tutorial,
see the paper of the same title.
Thursday, July 21
-
8:30AM - 8:45AM
Welcome.
-
8:45AM - 10:05AM
Technical session: Decisions and Elicitation
-
10:20AM - 11:40AM
Technical session: Conditional Probability
-
11:40AM - 12:40PM
Poster session (posters presented during the morning)
-
12:40PM - 2:00PM
Lunch
-
2:00PM - 3:20PM
Technical session: Fuzzy and Belief Measures
-
3:30PM - 5:10PM
Technical session: Models and Predictions
-
5:10PM - 6:10PM
Poster session (posters presented during the afternoon)
-
Banquet at the Andy Warhol Museum
Banquet talk: Convexity and E-Admissibility in Rational Choice,
by Isaac Levi, Columbia University
After a brief summary of leading ideas I have advocated since
the early 1970s concerning the representation of probability
judgment by sets of probabilities, I consider the most contentious
of the assumptions I impose on sets of permissible probabilities:
the requirement that sets of permissible probabilities
satisfy a convexity condition. Several objections are reviewed.
An objection recently raised in conversation by T. Seidenfeld is
singled out for special attention. My response to his worry,
and to another objection raised by Seidenfeld, Kadane and Schervish,
derives the convexity of credal probability sets from the idea that
all potential resolutions of conflict in the evaluation of options
should be considered permissible, together with the assumption that
any evaluation of options that preserves evaluations
common to already recognized permissible evaluations ought to
qualify as a potential resolution. On this basis, the "value
structure" representing the evaluation of options is shown to
be convex. This in turn argues for the convexity of the set of
permissible probabilities.
Friday, July 22
-
8:45AM - 10:05AM
Technical session: Algorithms and Computations
-
10:20AM - 11:40AM
Technical session: Foundations
-
11:40AM - 12:40PM
Poster session (posters presented during the morning)
-
12:40PM - 2:00PM
Lunch
-
2:00PM - 3:20PM
Technical session: Algorithms and Foundations
-
3:40PM - 5:00PM
Invited Talk: Probability and the Problem of Ignorance,
by Arthur Dempster, Harvard University
I will address in this talk several issues related to the
Dempster-Shafer extension of the calculus of probability.
To fix a locus for the theory, it is helpful to recognize
that mathematics relates to science in two fundamentally
different ways, one concerned with explicit formal representation
of objective phenomena, and the second with explicit formal
representation of subjective reasoning about uncertain
aspects of the objective world. DS calculus is in essence
a mathematical representation of the latter sort, providing
a rigorous foundation for characterizing uncertainties
of a specific type, namely uncertainties about objectively
defined unknowns. For every binary margin of a defined system,
traditional Bayesian reasoning requires specification of
two probabilities summing to unity, surely an impossibly
strong assumption for most contemporary real applications,
whereas DS allows a third probability of "don't know"
which can range from zero in the Bayesian case to unity
for the representation of complete ignorance. The extended
calculus promises a large extension of the class of potentially
useful models, beyond the subclass currently developing
within the Bayesian special case. Models are constructed
in the DS mode using a mathematical definition of the
fundamental concept of independence, misleadingly called
the Dempster rule of combination, together with graphical
structures that permit practical modeling of the uncertainty
of high-dimensional complex systems. To the extent that the
Bayesian special case succeeds in practice, as it often does,
it trades on the acceptability of a strong implicit assumption
that knowledge of unrepresented phenomena is independent of the
explicitly represented model, and therefore is irrelevant.
An intriguing feature of the DS calculus is that it casts
new light on the problem of how inference is possible about
complex systems despite the futility of attempting to assign
evidence-based Bayesian probabilities to the vast arrays of
variables that modern technologies are capable of representing
and measuring.
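The "don't know" probability mentioned above can be made concrete with a toy mass function on a binary frame. Everything here (the frame, the numbers, the function names) is an illustrative assumption, not notation from the talk:

```python
from itertools import product

FRAME = frozenset({"yes", "no"})

def belief(mass, event):
    """Mass committed to subsets of the event (the DS lower bound)."""
    return sum(m for a, m in mass.items() if a <= event)

def plausibility(mass, event):
    """Mass not ruled out by the event (the DS upper bound)."""
    return sum(m for a, m in mass.items() if a & event)

def combine(m1, m2):
    """Dempster's rule of combination: intersect focal sets,
    then renormalize the conflicting (empty-intersection) mass away."""
    raw = {}
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        c = a & b
        raw[c] = raw.get(c, 0.0) + x * y
    conflict = raw.pop(frozenset(), 0.0)
    k = 1.0 - conflict
    return {a: v / k for a, v in raw.items()}

# 0.5 on "yes", 0.2 on "no", and 0.3 on the whole frame: the third
# number is the "don't know" mass that a Bayesian must set to zero.
m = {frozenset({"yes"}): 0.5, frozenset({"no"}): 0.2, FRAME: 0.3}
yes = frozenset({"yes"})
print(belief(m, yes), plausibility(m, yes))   # 0.5 0.8
```

The gap between belief and plausibility (0.5 to 0.8) is exactly the unresolved "don't know" mass; with vacuous mass (all weight on the frame) it widens to [0, 1], complete ignorance.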
Saturday, July 23
-
8:45AM - 10:25AM
Technical session: Graphical and Applied Models
-
10:40AM - 11:40AM
Special session: Computational Challenges
This session will feature a discussion of several computational
problems and challenges that are relevant to the world of imprecise
probabilities.
-
11:40AM - 12:40PM
Poster session (posters presented during the morning
and during Friday's afternoon)
-
12:40PM - 2:00PM
Lunch
-
2:00PM - 3:20PM
Technical session: Behavior and Probabilistic Logic
-
3:30PM - 4:50PM
Technical session: Mathematical Properties
-
5:00PM - 6:00PM
Poster session (posters presented during the afternoon)
Sunday, July 24 (morning only)
Workshop on Financial Risk Assessment
-
Dynamic monetary risk measures for processes
Patrick Cheridito
We study time-consistency properties of processes of monetary risk measures that depend on bounded discrete-time processes describing the evolution of financial values. The time horizon can be finite or infinite. We call a process of monetary risk measures time-consistent if it assigns to a process of financial values the same risk irrespective of whether it is calculated directly or in two steps backwards in time, and we show how this property manifests itself in the corresponding process of acceptance sets. For processes of coherent and convex monetary risk measures admitting a robust representation with sigma-additive linear functionals, we give necessary and sufficient conditions for time-consistency in terms of the representing functionals.
-
Time Consistent and
Currency Invariant Convex Risk Measures
Stephen D'Silva
In order to study time-consistent monetary concave risk adjusted valuations in a continuous time, finite horizon setting, one needs to define risk adjusted valuations at all dates in the time continuum. One approach to this issue is to define risk adjusted valuations at all dates and to call this sequence of valuations the risk valuation process. However, monetary risk adjusted valuations enjoy the translation invariance property, which enables us to define monetary risk adjusted valuations at intermediate stopping times in terms of the date-0 risk adjusted valuation.
We characterize these intermediate risk adjusted valuations in terms of the date-0 risk adjusted valuation for strongly relevant time-consistent monetary risk measures. We develop a representation for strongly relevant monetary risk measures which are time consistent at certain specified dates.
We seek to characterize convex risk measures which are both time consistent and currency invariant. We define currency invariance for monetary concave risk adjusted valuations. We show that only trivial monetary concave risk adjusted valuations satisfy both time consistency and currency invariance for all positive exchange rate processes.
-
Generalized Deviation in Risk Analysis
R. Tyrrell Rockafellar,
Stan Uryasev, Michael Zabarankin
General deviation measures are introduced and studied systematically for their potential applications to risk management in areas like portfolio optimization and engineering. Such measures include standard deviation as a special case but need not be symmetric with respect to ups and downs. Their properties are explored with a mind to generating a large assortment of examples and assessing which may exhibit superior behavior. Connections are shown with coherent risk measures in the sense of Artzner, Delbaen, Eber and Heath, when those are applied to the difference between a random variable and its expectation, instead of to the random variable itself. However, the correspondence is only one-to-one when both classes are restricted by properties called lower range dominance, on the one hand, and strict expectation boundedness on the other. Dual characterizations in terms of sets called risk envelopes are fully provided.
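The asymmetry mentioned in the abstract above ("need not be symmetric with respect to ups and downs") can be illustrated by contrasting the standard deviation with a lower semideviation, which penalizes only shortfalls below the mean. This is a generic sketch under assumed names and data, not the authors' construction:

```python
import numpy as np

def std_dev(x):
    """Ordinary standard deviation: symmetric in ups and downs."""
    x = np.asarray(x, dtype=float)
    return float(np.sqrt(np.mean((x - x.mean()) ** 2)))

def lower_semideviation(x):
    """Deviation measure counting only downside departures from the mean."""
    x = np.asarray(x, dtype=float)
    downside = np.minimum(x - x.mean(), 0.0)
    return float(np.sqrt(np.mean(downside ** 2)))

# Hypothetical return sample: upside swings inflate std_dev but
# leave the lower semideviation untouched.
returns = [0.03, 0.01, -0.02, 0.04, -0.06]
print(std_dev(returns), lower_semideviation(returns))
```

Both functionals vanish only on constants and scale positively, but only the symmetric one treats a gain of 4% the same as a loss of 4%, which is the behavioral difference the paper's examples exploit.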
-
Fundamental Theorems of
Previsions and Asset Pricing Theories
Mark Schervish, Teddy Seidenfeld, Jay Kadane
We explore the connections between the concepts of arbitrage and Dutch Book. These concepts are related to the fundamental theorem of previsions and the fundamental theorem of asset pricing. In loose terms, fair prices for gambles (previsions) are coherent, and asset prices are arbitrage-free, if they are expected values under probability measures. How generally this loose result holds, and how closely the two concepts correspond, are the main focus of this study.
Page created June 6, 2005,
last updated June 6, 2005.
Send any remarks to the following address:
fgcozman@usp.br