# SIPTA Seminars

Are you a curious student who has just started to explore the topic of imprecise probabilities?
Or an experienced researcher who would like to keep in touch with the community?
Join us at the *SIPTA Seminars*, an online series of seminars on imprecise probabilities (IP).
The seminars are open to anyone interested in IP, and are followed by a Q&A and open discussion.
They take place roughly once per month, with a break over the summer.
Topics range from foundational IP theories to applications that can benefit from IP approaches.

Details about the individual seminars are available in the list below. Close to the date of the next seminar, a freely accessible Zoom link will be provided there as well. If you click it, you will first be taken to a waiting room; please be patient until the organizers let you in. During the talk, questions should be put in the chat, and the audience is expected to mute their microphones. After the talk, there will be time for Q&A and discussion, at which point you can turn on your microphone whenever you want to contribute. The talk (but not the Q&A and discussion) will be recorded and afterwards made freely available on the SIPTA YouTube channel.

The seminars are organised by Sébastien Destercke, Enrique Miranda and Jasper De Bock. If you have questions about the seminars, or suggestions for future speakers, you can get in touch with us at seminars@sipta.org. Suggestions for prominent speakers outside the IP community, whose work is nevertheless related to IP, are especially welcome.

## Upcoming seminars

### Decision making under weakly structured information with applications to robust statistics and machine learning

Christoph Jansen, 30 April 2024, 15:00 CEST. Zoom link: https://utc-fr.zoom.us/j/85460316690

After a brief introduction to the concepts of classical decision theory relevant to us, the second part of the talk focuses on decision making under weakly structured information. In particular, two relaxations of the assumptions of the classical theory are discussed: (i) the presence of ambiguity and (ii) partially cardinal preference structures. We address (i) with the help of imprecise probabilities, while we model (ii) with so-called preference systems, i.e., order-theoretic structures that allow partial information about preference intensity to be incorporated. Exemplified by a generalization of stochastic dominance (GSD), we also discuss ways to determine suitable choice functions for this weakly structured setting.

In the third part of the presentation, we then address the question of how the methods presented in the second part can be utilized for dealing with non-standard data in machine learning and statistics. The central idea here is to reinterpret the concept of a preference system as a sample space with a locally varying scale of measurement. The concept of GSD then naturally extends to this statistical interpretation, and we discuss possibilities for statistically testing it when only samples of the variables of interest are available. Moreover, we demonstrate how these tests can be robustified against violations of their assumptions by using imprecise probabilities.

The talk ends with two applications of the ideas presented, namely the robust comparison of weakly structured random variables and a general framework for statistical multi-criteria benchmarking of classification algorithms.
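For readers unfamiliar with the classical notion that GSD generalizes, first-order stochastic dominance between two discrete distributions on a common ordered domain can be checked by comparing cumulative distribution functions. A minimal sketch (a textbook illustration of the precise, first-order case, not the speaker's generalized setting):

```python
from itertools import accumulate

def first_order_dominates(p, q):
    """Return True if distribution p first-order stochastically dominates q.

    p and q are probability mass functions over the same increasingly
    ordered outcomes: p dominates q iff the CDF of p lies at or below
    the CDF of q everywhere.
    """
    cdf_p = list(accumulate(p))
    cdf_q = list(accumulate(q))
    return all(fp <= fq + 1e-12 for fp, fq in zip(cdf_p, cdf_q))

# Outcomes 1 < 2 < 3: p puts more mass on the high outcomes than q does.
p = [0.1, 0.3, 0.6]
q = [0.3, 0.4, 0.3]
print(first_order_dominates(p, q))  # True
print(first_order_dominates(q, p))  # False
```

GSD replaces the single ordered domain by a preference system and the single CDF comparison by a family of expectation inequalities, but the shape of the check is the same.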

## Past seminars

### A dynamic Choquet pricing rule with bid-ask spreads under Dempster–Shafer uncertainty

Barbara Vantaggi, 26 March 2024. Watch on YouTube.

Imprecise probabilities play a central role in many fields, and finance is surely one of the most promising sources of interesting problems challenging the classical probabilistic setting. As a prototypical example, real markets show frictions, most evidently in the form of bid-ask prices for every asset. Therefore, addressing asset pricing in a real market requires dealing with a non-linear dynamic model of uncertainty that at the same time possesses a parameterization nice enough to allow for both a semantic interpretation and a computationally feasible, market-consistent calibration.

With this target problem in mind, in this talk we introduce a particular imprecise stochastic process: a multiplicative binomial process under Dempster-Shafer uncertainty (briefly, a DS-multiplicative binomial process). This process relies on suitable notions of Markov and time-homogeneity and on the product rule of conditioning for belief functions. Moreover, we investigate to what extent such properties can be constrained by means of one-step time-homogeneity.

Such a process makes it possible to model the price evolution of a stock, allowing for frictions in the form of bid-ask spreads. Furthermore, it automatically gives rise to a conditional Choquet expectation operator. Next, we consider a market formed by a frictionless risk-free bond (whose price is modeled by a deterministic process) and a non-dividend-paying stock with frictions (whose lower price is modeled by a DS-multiplicative binomial process).

In this market we prove an analog of the classical theorem of change of measure relying on the notion of equivalent one-step Choquet martingale belief function. We then propose a dynamic Choquet pricing rule with bid-ask spreads showing that the discounted lower price process of a European derivative contract on the stock is a Choquet super-martingale. We also provide a normative justification in terms of a dynamic generalized no-arbitrage condition relying on the notion of partially resolving uncertainty due to Jaffray. Finally, we introduce a market consistent calibration procedure and show the use of the calibrated model in bid-ask option pricing.

[Work in collaboration with Andrea Cinfrignini (Sapienza University of Rome) and Davide Petturiti (University of Perugia)]
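As a rough illustration of the kind of non-linear expectation this pricing rule builds on, here is a minimal sketch of a (lower) Choquet expectation with respect to a belief function on a two-outcome market. The masses and payoffs are invented for illustration and are not from the talk:

```python
from fractions import Fraction as F

def choquet_expectation(gamble, capacity):
    """Choquet integral of a nonnegative gamble w.r.t. a capacity.

    gamble: dict outcome -> nonnegative value
    capacity: function from frozensets of outcomes to [0, 1], monotone,
              with capacity(empty set) = 0 and capacity(whole space) = 1.
    """
    # Sort outcomes by decreasing value of the gamble.
    order = sorted(gamble, key=gamble.get, reverse=True)
    values = [gamble[w] for w in order] + [F(0)]
    total, upper = F(0), set()
    for i, w in enumerate(order):
        upper.add(w)
        # Layer between consecutive values, weighted by the capacity
        # of the "at least this valuable" upper set.
        total += (values[i] - values[i + 1]) * capacity(frozenset(upper))
    return total

# A belief function on {u, d} ("up"/"down") with mass 1/2 on {u},
# 1/5 on {d} and 3/10 on {u, d} (the unassigned, "ambiguous" mass).
masses = {frozenset({"u"}): F(1, 2), frozenset({"d"}): F(1, 5),
          frozenset({"u", "d"}): F(3, 10)}

def belief(event):
    return sum(m for focal, m in masses.items() if focal <= event)

payoff = {"u": F(3), "d": F(1)}  # e.g. a stock worth 3 up, 1 down
print(choquet_expectation(payoff, belief))  # lower (bid) expectation: 2
```

The same integral computed against the plausibility function instead of the belief function would give the upper (ask) expectation, which is how bid-ask spreads arise from a single non-additive model.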

### Mixing time and uncertainty. A tale of superpositions

Rafael Peñaloza Nyssen, 14 February 2024. Watch on YouTube.

Formalisms capable of dealing with time and uncertainty are necessary for modelling the existing knowledge of (business) processes which must interact with an unreliable environment. Yet, combining time and uncertainty is far from trivial and can easily lead to undecidability, making those formalisms useless in practice. A recent proposal for probabilistic temporal logic uses the idea of superposition semantics, where an object simultaneously has and does not have a property, until it is observed. We apply this superposition semantics to Linear Temporal Logic over finite time, and show how it can be used for Business Process Modelling tasks. Under the open world assumption, we see how to compute lower and upper probability bounds for observing specific process behaviours.

### Fundamentally finitary foundations for probability and bounded probability

Matthias Troffaes, 31 January 2024. Watch on YouTube.

Bounded probability uses sets of probability measures, specified through bounds on expectations, to represent states of severe uncertainty. This approach has been successfully applied to a wide range of fields where risk under severe uncertainty is a concern.

Following Walley’s work from the 1990s, the canonical interpretation of these bounds has been through betting. However, in high-risk situations where the assessors themselves are at risk, various authors have argued that the betting interpretation of probability, and therefore also of lower previsions, cannot be applied. For this reason, in 2006, Lindley suggested an alternative interpretation of probability, based on urns, which mathematically leads to a theory of rational-valued probabilities for finite possibility spaces.

Independently, in the 1980s, Nelson proposed a radically elementary probability theory based on real-valued probability mass functions for arbitrary possibility spaces, demonstrating that this theory could recover, and simplify the formulation of, many important probabilistic results. These include the functional central limit theorem that characterizes Brownian motion, whereby Nelson removes all of the usual technicalities that arise in the measure-theoretic approach to Brownian motion. Indeed, his approach needs no measure theory, a feature shared by the theories of de Finetti and Walley. Interestingly, Nelson’s approach also incorporates a fundamental notion of ambiguity, and embraces finite additivity, though these aspects are not often pointed out. Unfortunately, despite its beauty and elegance, Nelson’s programme has failed to gain wide traction to this day, perhaps partly because it relies on extending ZFC in a way that may at first seem bizarre and counterintuitive.

In this talk, I revisit Lindley’s interpretation in the context of probability bounding. In doing so, I provide an alternative interpretation of lower previsions, which leads to new expressions for consistency (called avoiding sure loss) and inference (called natural extension). Unlike Walley’s approach, the duality theory that follows from this interpretation does not need the ultrafilter principle, and is purely constructive. A key corollary from these results is that every conditional probability measure (even finitely additive ones) can be represented by a net of probability mass functions, establishing that Nelson’s programme is universal: there is no probability measure that cannot be modelled by his approach. I reflect on what this means for practical probabilistic modelling and inference, and whether perhaps, in Nelson’s spirit, it is worthwhile to replace probability measures with probability mass functions as a foundation for probability and bounded probability, and to treat sigma-additivity as a sometimes welcome but often unnecessary by-product of idealization.
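As a toy illustration of the notions of avoiding sure loss and natural extension mentioned above, here is a sketch on the smallest possible space (a generic textbook example, not the speaker's constructive duality theory):

```python
def natural_extension(f_heads, f_tails, p_low, p_high):
    """Lower expectation of a gamble on {heads, tails}, given only that
    the probability of heads lies in [p_low, p_high].

    The expectation is linear in p, so the minimum is attained at an
    endpoint of the interval: this is natural extension computed by
    brute force on a two-element possibility space.
    """
    expect = lambda p: p * f_heads + (1 - p) * f_tails
    return min(expect(p_low), expect(p_high))

# Assessments: lower probability 0.3 for heads and 0.4 for tails, so
# p(heads) is constrained to [0.3, 0.6]. The assessments avoid sure
# loss precisely because this credal set is nonempty.
print(natural_extension(1.0, 0.0, 0.3, 0.6))   # 0.3, the lower probability of heads
print(natural_extension(-2.0, 3.0, 0.3, 0.6))  # lower prevision of a bet on heads
```

On larger finite spaces the same computation becomes a linear programme over probability mass functions, which is the representation the talk takes as foundational.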

### On the interplay of optimal transport and distributionally robust optimization

Daniel Kuhn, 12 December 2023. Watch on YouTube.

Optimal Transport (OT) seeks the most efficient way to morph one probability distribution into another one, and Distributionally Robust Optimization (DRO) studies worst-case risk minimization problems under distributional ambiguity. It is well known that OT gives rise to a rich class of data-driven DRO models, where the decision-maker plays a zero-sum game against nature who can adversely reshape the empirical distribution of the uncertain problem parameters within a prescribed transportation budget. Even though generic OT problems are computationally hard, the Nash strategies of the decision-maker and nature in OT-based DRO problems can often be computed efficiently. In this talk we will uncover deep connections between robustification and regularization, and we will disclose striking properties of nature’s Nash strategy, which implicitly constructs an adversarial training dataset. We will also show that OT-based DRO offers a principled approach to deal with distribution shifts and heterogeneous data sources, and we will highlight new applications of OT-based DRO in machine learning, statistics, risk management and control. Finally, we will argue that, while OT is useful for DRO, ideas from DRO can also help us to solve challenging OT problems.

### Structural causal models are (solvable by) credal networks

Alessandro Antonucci, 28 November 2023. Watch on YouTube.

We present recent results on the intimate connections between causal inference and imprecise probabilities. A structural causal model is made of endogenous (manifest) and exogenous (latent) variables. We show that endogenous observations induce linear constraints on the probabilities of the exogenous variables. This makes it possible to map causal models to credal networks. Causal inferences, such as interventions and counterfactuals, can then be obtained by credal network algorithms. These natively return sharp values in the identifiable case, while intervals corresponding to the exact bounds are produced for unidentifiable queries. Exact computation is inefficient in general, given that, as we show, causal inference is NP-hard even for simple topologies. We therefore target approximate bounds via a causal EM scheme. We evaluate their accuracy by providing credible intervals on the quality of the approximation, and we show through a synthetic benchmark that the EM scheme delivers accurate results in a fair number of runs. We also present an actual case study on palliative care to show how our algorithms can readily be used for practical purposes.

### Falsification, Fisher’s underworld of probability, and balancing behavioral & statistical reliability

Ryan Martin, 18 October 2023. Watch on YouTube.

Statisticians develop methods to assist in building probability statements that will be used to make inference on relevant unknowns. Popper argued that probability statements themselves can’t be falsified, but what about the statistical methods that use data to generate them? Science today is largely empirical, so if statistical methods’ conversion of data into scientific judgments can’t be scrutinized, then it’s not fair to expect society to “trust the science.”

Fisher’s underworld of probability concerns layers below the textbook surface level, where knowledge is vague and imprecise. Roughly, suppose that an agent quantifies his uncertainty about a relevant unknown via (imprecise) probability statements, which defines his betting odds. Now suppose that a second agent, who may not have her own probability statements about the relevant unknown, believes that the first agent’s assessments are wrong and can formulate odds at which she’d bet against the first agent’s wagers. If the second agent wins in these side-bets, then she reveals a shortcoming in the first agent’s assessments. I claim that the statistical method and “society” above are like the first and second agents here, respectively, and that scrutiny of a statistical method proceeds by giving “society” an opportunity to bet against its claims.

In this talk, I’ll carry out this scrutiny formally/mathematically and present some key take-aways. No surprise, a statistical method that’s falsification-proof in this sense is the behaviorally most reliable and conservative generalized Bayes rule. More surprising, however, is that a necessary condition for being falsification-proof is a statistical reliability property – called validity – that I’ve been advocating for recently. It follows, then, from the false confidence theorem that statistical methods quantifying uncertainty via precise probabilities can typically be falsified in this sense. More generally, since validity also implies certain behavioral reliability properties and needn’t be overly conservative, my new possibilistic inference framework (which I’ll describe and illustrate) is a promising way to balance the behavioral and statistical reliability properties.

There’s no paper yet on the exact contents of this talk, but some relevant material can be found at https://arxiv.org/abs/2203.06703 and https://arxiv.org/abs/2211.14567.

### Application of uncertainty theory in the field of environmental risks

Dominique Guyonnet, 27 September 2023. Watch on YouTube.

The management of environmental hazards is largely based on the assessment of risk, i.e., risk for human health and/or for ecosystems (water, soil, air, …). Environmental risks pertain to real-world systems that are typically incompletely known; hence risk is affected by significant uncertainty. Historically, the treatment of uncertainties in risk assessments has relied largely on the use of oftentimes subjective single probability distributions. For around 30 years, alternative uncertainty theories have been applied, including the general framework of imprecise probabilities with, e.g., possibility theory and belief functions. This presentation will provide specific examples of the application of such uncertainty theories to environmental hazards, with a special focus on hazards related to soil and water pollution. It will show how possibility theory and belief functions are well suited to representing the information typically available in these contexts. The issue of uncertainty propagation in risk assessment modelling will be addressed, as well as the question of communicating risk and its associated uncertainties to third-party stakeholders.
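As a rough illustration of the possibilistic uncertainty propagation mentioned above, the following sketch pushes triangular possibility distributions through a hypothetical exposure model, dose = concentration × intake / weight, using interval arithmetic on alpha-cuts. The model and all numbers are invented for illustration:

```python
def alpha_cut(tri, alpha):
    """Alpha-cut (an interval) of a triangular possibility distribution
    tri = (low, mode, high) at level alpha in [0, 1]."""
    low, mode, high = tri
    return (low + alpha * (mode - low), high - alpha * (high - mode))

def propagate(conc, intake, weight, alphas):
    """Propagate triangular possibility distributions through the
    hypothetical model dose = conc * intake / weight, by interval
    arithmetic on each alpha-cut (all quantities assumed positive)."""
    cuts = {}
    for a in alphas:
        (c1, c2), (i1, i2), (w1, w2) = (alpha_cut(conc, a),
                                        alpha_cut(intake, a),
                                        alpha_cut(weight, a))
        # The model is increasing in conc and intake, decreasing in
        # weight, so the endpoints are reached at interval endpoints.
        cuts[a] = (c1 * i1 / w2, c2 * i2 / w1)
    return cuts

# Expert-style inputs: concentration in mg/L, intake in L/day, weight in kg.
cuts = propagate((1.0, 2.0, 4.0), (1.0, 1.5, 2.5), (55.0, 70.0, 90.0),
                 alphas=[0.0, 0.5, 1.0])
print(cuts[1.0])  # core of the dose: the single most plausible value
print(cuts[0.0])  # support: the widest, most conservative interval
```

The nested family of output intervals defines the possibility distribution of the dose, which is the kind of object such risk assessments report to stakeholders.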

### One way to define an imprecise-probabilistic version of the Poisson process

Alexander Erreygers, 16 June 2023. Watch on YouTube.

The Poisson process is one of the more fundamental continuous-time uncertain processes. Besides its appearance in many applications, it is also interesting because there are several equivalent ways to define/construct it. In this talk I will treat a couple of these definitions, explain how I succeeded in generalising one of them to allow for imprecision, and discuss how one could go about generalising the others. Furthermore, I will explain how this work on the Poisson process fits into my more general push to advance the theory of Markovian imprecise (continuous-time) processes, and will touch on some (un)solved problems.

### Random fuzzy sets and belief functions: application to machine learning

Thierry Denœux, 24 May 2023. Watch on YouTube.

The theory of belief functions is a powerful formalism for uncertain reasoning, with many successful applications to knowledge representation, information fusion, and machine learning. Until now, however, most applications have been limited to problems (such as classification) in which the variables of interest take values in finite domains. Although belief functions can, in theory, be defined in infinite spaces, we lacked practical representations allowing us to manipulate and combine such belief functions. In this talk, I show that the theory of epistemic random fuzzy sets, an extension of Possibility and Dempster-Shafer theories, provides an appropriate framework for evidential reasoning in general spaces. In particular, I introduce Gaussian random fuzzy numbers and vectors, which generalize both Gaussian random variables and Gaussian possibility distributions. I then describe an application of this new formalism to nonlinear regression.
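As a minimal sketch of one ingredient mentioned above, a Gaussian possibility distribution is a non-normalised Gaussian bell with unit height at its mode, and its alpha-cuts are intervals. This only illustrates that ingredient, not the random fuzzy sets themselves:

```python
from math import exp, log, sqrt

def gaussian_possibility(x, m, sigma):
    """Gaussian possibility distribution: pi(m) = 1 at the mode m,
    decaying like an (unnormalised) Gaussian bell away from it."""
    return exp(-((x - m) ** 2) / (2 * sigma ** 2))

def cut(m, sigma, alpha):
    """Alpha-cut {x : pi(x) >= alpha}: an interval centred at the mode."""
    half = sigma * sqrt(-2 * log(alpha))
    return (m - half, m + half)

m, sigma = 0.0, 1.0
print(gaussian_possibility(m, m, sigma))  # 1.0 at the mode
lo, hi = cut(m, sigma, 0.5)
print((lo, hi))  # the interval of values with possibility at least 0.5
```

A Gaussian random fuzzy number, roughly speaking, makes the mode itself a Gaussian random variable, recovering a Gaussian random variable when sigma shrinks to zero and a Gaussian possibility distribution when the mode is deterministic.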

The talk discusses the foundations of decision making under Knightian uncertainty, i.e. in situations when the relevant probability distributions are unknown, or only partially known. After reviewing the basic concepts and models that have been developed in decision theory on the one hand, and mathematical finance on the other hand, we put special emphasis on markets under uncertainty in so-called identified models where recently some substantial progress has been made.

### Some finitely additive probabilities and decisions

Teddy Seidenfeld, 17 February 2023. Watch on YouTube.

Imprecise Probability’s roots grow in de Finetti’s fertile foundations of coherent decision making. A continuing theme in de Finetti’s work is that coherence does not require expectations to be countably additive. In this presentation (involving joint work with Jay Kadane, Mark Schervish, and Rafael Stern) I review two contexts that require merely finitely additive (not countably additive) expectations: the first concerns probabilities on (logical) Boolean algebras; the second concerns admissibility in statistical decision theory. The common perspective for viewing the two contexts is the requirement of countable additivity, as presented in Kolmogorov’s theory, understood as a continuity principle. In each of the two contexts such continuity is precluded, but in very different ways.

### Engineering and IP: what’s going on?

Alice Cicirello, Matthias Faes & Edoardo Pattelli, 13 January 2023. Watch on YouTube.

Engineers design components, structures and systems, and plan activities to extend their service lives, despite a limited understanding of the underlying physics and/or a lack of sufficiently informative data. A big challenge is to deal with unknown and uncontrollable variables such as changes in environmental conditions, deliberate threats, changes of intended use, etc. As a result, large safety factors are usually adopted in order to compensate for the use of approximate methods and to deal with uncertainty. Often, the methods for dealing with uncertainty assume complete knowledge of the underlying stochastic process. Such wide availability of information is, however, rarely the case in practice.

Although imprecise probability offers the tools to cope with lack of knowledge and data, it is not widely adopted in practice. One of the main reasons is the lack of accessible and efficient tools, both analytical and numerical, for uncertainty quantification. On top of that, there is still a lack of awareness of the potential capabilities of imprecise probability theory and its applications.

In this seminar, we present the challenges in applying imprecise probability to practical engineering problems. These challenges have driven several novel algorithms and approaches, which will also be presented.

### Dealing with uncertain arguments in Artificial Intelligence

Fabio Cozman, 29 November 2022. Watch on YouTube.

Argumentation techniques have received significant attention in Artificial Intelligence, particularly since 1995, when Dung proposed his “argumentation frameworks” and showed that they unify many branches of knowledge representation. Argumentation frameworks that deal with uncertainty have been explored since then; often, these frameworks rely on imprecise or indeterminate probabilities. Indeed, probabilistic argumentation frameworks may be one of the most promising applications of imprecise probabilities in Artificial Intelligence. This talk will review the main ideas behind argumentation frameworks and how they are often connected with imprecise probabilities.

### Coalitional game theory vs Imprecise probabilities: Two sides of the same coin … or not?

Ignacio Montes, 21 October 2022. Watch on YouTube.

Lower probabilities, defined as normalised and monotone set functions, constitute one of the basic models within Imprecise Probability theory. One of their interpretations allows building a bridge with coalitional game theory: the possibility space is regarded as a set of players who must share a reward, events represent coalitions of players who collaborate in order to obtain a greater reward, and the lower probability of a coalition represents the minimum reward that this collaboration can guarantee.

This correspondence makes lower probabilities and coalitional games formally equivalent, with notation, terminology and interpretation being the only differences. For example, coherent lower probabilities are the same as exact games, the credal set of a lower probability is referred to as the core of the game, and so on.

In this presentation I dig into this connection, paying special attention to game solutions and their interpretation as centroids of the credal set. In addition, I show that if we move to the more general setting of lower previsions, it is possible to represent information about the coalitions and their rewards that cannot be captured by the standard coalitional game theory. This shows that lower previsions constitute a more general framework than the classical theory of coalitional games.
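One classical point of contact between the two theories: for convex (supermodular) games, the extreme points of the core, i.e., of the credal set of the corresponding 2-monotone lower probability, are exactly the marginal payoff vectors over all orderings of the players (Shapley's result on convex games). A minimal sketch with an invented three-player game:

```python
from fractions import Fraction as F
from itertools import permutations

def core_vertices(players, v):
    """Extreme points of the core of a convex (supermodular) game v:
    one marginal payoff vector per ordering of the players, where each
    player receives their marginal contribution on joining."""
    vertices = set()
    for order in permutations(players):
        coalition, payoff = frozenset(), {}
        for player in order:
            grown = coalition | {player}
            payoff[player] = v(grown) - v(coalition)
            coalition = grown
        vertices.add(tuple(payoff[p] for p in players))
    return vertices

# A convex game on three players: v(S) = (|S| / 3) ** 2.
players = (0, 1, 2)
v = lambda s: F(len(s), 3) ** 2
for vertex in sorted(core_vertices(players, v)):
    print(vertex)  # six vertices, each a permutation of (1/9, 1/3, 5/9)
```

Read as a lower probability, each vertex is an extreme probability mass function of the credal set; read as a game, it is an extreme way of sharing the grand coalition's reward.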

We develop a representation of a decision maker’s uncertainty based on e-values, a recently proposed alternative to the p-value. Like the Bayesian posterior, this e-posterior allows for making predictions against arbitrary loss functions that do not have to be specified ex ante. Unlike the Bayesian posterior, it provides risk bounds that have frequentist validity irrespective of prior adequacy: if the e-collection (which plays a role analogous to the Bayesian prior) is chosen badly, bounds get loose rather than wrong. As a consequence, e-posterior minimax decision rules are safer than Bayesian ones. The resulting quasi-conditional paradigm addresses foundational issues in statistical inference. If the losses under consideration have a special property which we call Condition Zero, risk bounds based on the standard e-posterior are equivalent to risk bounds based on a ‘capped’ version of it. We conjecture that this capped version can be interpreted in terms of possibility measures and Martin-Liu inferential models.
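The defining property of an e-value is that its expectation under the null hypothesis is at most one, and likelihood ratios are the canonical example. A minimal sanity check of that property (a generic textbook illustration, not the e-posterior construction from the abstract):

```python
from fractions import Fraction as F
from math import comb

def binomial_pmf(n, k, p):
    """P(X = k) for X ~ Binomial(n, p), computed exactly with fractions."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def likelihood_ratio_e_value(n, k, p0, p1):
    """Likelihood ratio evaluated at observation k: the canonical
    e-variable for testing the null p = p0 against the alternative p = p1."""
    return binomial_pmf(n, k, p1) / binomial_pmf(n, k, p0)

# Defining property of an e-variable: expectation under the null is <= 1
# (here exactly 1, since the likelihood ratio integrates to one).
n, p0, p1 = 5, F(1, 2), F(3, 4)
null_expectation = sum(binomial_pmf(n, k, p0) * likelihood_ratio_e_value(n, k, p0, p1)
                       for k in range(n + 1))
print(null_expectation)  # 1
```

A large observed e-value is therefore evidence against the null by Markov's inequality, which is the hook that lets e-values bound risk without a prior being correct.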

### Imprecise probabilities in modern data science: challenges and opportunities

Ruobin Gong, 29 June 2022. Watch on YouTube.

Imprecise probabilities (IP) capture structural uncertainty intrinsic to statistical models. They offer a richer vocabulary with which the modeler may articulate specifications without concocting unwarranted assumptions. While IP promises a principled approach to data-driven decision making, its use in practice has so far been limited. Two challenges to its popularization are that 1) IP reasoning may defy the intuition we derive from precise probability models, and 2) IP models may be difficult to compute. On the other hand, recent developments in formal privacy present a unique opportunity for IP to contribute to responsible data dissemination. A case in point is differential privacy (DP), a cryptographically motivated framework endorsed by corporations and official statistical agencies including the U.S. Census Bureau. I discuss how IP offers the correct language for DP, both descriptive and inferential, particularly when the privacy mechanism lacks transparency. These challenges and opportunities highlight the urgency to adapt IP research to meet the demands of modern data science.

### Imprecision, not as a problem, but as part of the solution

Gert de Cooman, 30 May 2022. Watch on YouTube.

Imprecision in probability theory is often considered to be unfortunate, something to be tolerated, and then only if there is no other way out. In this talk, I will argue that imprecision also has strongly positive sides, and that it can allow us to look at, approach and deal with existing problems in novel ways. I will provide a number of examples to corroborate this thesis, based on my research experience in a number of fields: inference and decision making, stochastic processes, algorithmic randomness, game-theoretic probability, functional analysis, …