
Abumze978/SemesterProject

Distributional Robustness via Maximum Mean Discrepancy Metric

This thesis constitutes a preliminary step in the study of distributionally robust optimization (DRO) problems in which the ambiguity set is a ball in the space of probability distributions, defined using the maximum mean discrepancy (MMD) metric. We start by providing a self-contained introduction to the theory of reproducing kernel Hilbert spaces (RKHS) and of kernel mean embeddings (KME) of probability distributions, presenting the fundamental results from the literature that are essential for understanding the contributions of this thesis. Armed with these results, we define and analyze the DRO problem. Our contribution can be summarized as follows. We first provide insight into the impact that the choice of the kernel has on the DRO problem. Then, we study the properties of the two protagonists of the DRO problem: the loss function and the ambiguity set. Finally, we focus on the data-driven scenario and study the regularization effect of the DRO problem. Specifically, we prove that when the number of data points is large and the center of the ambiguity set is taken to be the empirical distribution, the DRO problem is equivalent to an RKHS-norm regularization of the empirical loss.
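As a quick illustration (not taken from the thesis itself), the MMD between two distributions P and Q with kernel k satisfies MMD^2(P, Q) = E[k(x, x')] + E[k(y, y')] - 2 E[k(x, y)], which suggests a simple plug-in estimate from samples. The sketch below uses a Gaussian (RBF) kernel; the kernel choice, bandwidth sigma, and sample sizes are illustrative assumptions, not choices made in the thesis.

```python
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    # Gaussian (RBF) kernel matrix: k(x, y) = exp(-||x - y||^2 / (2 sigma^2))
    sq_dists = (np.sum(X**2, axis=1)[:, None]
                + np.sum(Y**2, axis=1)[None, :]
                - 2.0 * X @ Y.T)
    return np.exp(-sq_dists / (2.0 * sigma**2))

def mmd_squared(X, Y, sigma=1.0):
    # Biased empirical estimate of MMD^2 between samples X ~ P and Y ~ Q:
    # MMD^2(P, Q) = E[k(x, x')] + E[k(y, y')] - 2 E[k(x, y)]
    return (rbf_kernel(X, X, sigma).mean()
            + rbf_kernel(Y, Y, sigma).mean()
            - 2.0 * rbf_kernel(X, Y, sigma).mean())

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(500, 2))   # samples from P
Y = rng.normal(0.0, 1.0, size=(500, 2))   # samples from Q = P
Z = rng.normal(3.0, 1.0, size=(500, 2))   # samples from a shifted distribution

print(mmd_squared(X, Y))  # near zero: same underlying distribution
print(mmd_squared(X, Z))  # clearly positive: distinct distributions
```

With a characteristic kernel such as the Gaussian, MMD(P, Q) = 0 if and only if P = Q, which is what makes a ball in this metric a sensible ambiguity set.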
