Published on January 3, 2022; updated on October 24, 2022
Dates
January 7, 2022
from 10:30am to 12:00pm
Program
10:30 - 11:00
Etrit Haxholli (Inria)
On the Estimation of Shape Parameters of Tails of Marginal Distributions
Abstract: Anomalies are data patterns that, in the absence of epistemic uncertainty, correspond to observations with characteristics different from those of normal instances. The principal idea in most applications is that the behavior of a model trained on normal data will change significantly when switching to an abnormal period. Extreme value theory (EVT) is useful for modeling the tails of distributions and thus helps in the choice of an anomaly threshold, which indicates when such a transition from a normal period occurs. Under some regularity conditions, we give theoretical guarantees that the tail of the marginal distribution coincides with the thickest tail among the distributions defined on the range of the marginalized-out variables. This result can be used to make tail estimations of loss functions more robust to the choice of the training set, and it enables us to estimate the shape of the tails with fewer samples while reducing computational complexity.
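As a purely illustrative sketch (not the speaker's method), the tail shape parameter the abstract refers to can be estimated from data with the classical Hill estimator, which averages log-ratios of the k largest order statistics. The sample below uses an exact Pareto distribution, whose tail index is known, so the estimate can be checked; the choice of k is an assumption and matters in practice.

```python
import numpy as np

def hill_estimator(x, k):
    # Hill estimator of the tail (shape) index gamma, computed
    # from the k largest order statistics of the sample x.
    xs = np.sort(x)[::-1]                  # descending order statistics
    return np.mean(np.log(xs[:k] / xs[k]))

# Exact Pareto sample with tail index gamma = 1/alpha = 0.5.
rng = np.random.default_rng(0)
alpha = 2.0
x = rng.pareto(alpha, size=100_000) + 1.0  # Pareto(alpha) supported on [1, inf)

gamma_hat = hill_estimator(x, k=1_000)
print(gamma_hat)  # close to 0.5
```

An anomaly threshold can then be set as a high quantile of the fitted tail; the abstract's contribution concerns which tail the marginal inherits, which this toy example does not cover.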
11:00 - 11:30
Cedric Vincent-Cuaz (UCA)
Semi-relaxed Gromov-Wasserstein divergence with applications on graphs
Abstract: Comparing structured objects such as graphs is a fundamental operation involved in many learning tasks. To this end, the Gromov-Wasserstein (GW) distance, based on Optimal Transport (OT), has proven successful in handling the specific nature of such objects. More precisely, through their node-connectivity relations, GW operates on graphs seen as probability measures over specific spaces. At the core of OT is the idea of conservation of mass, which imposes a coupling between all the nodes of the two graphs being compared. We argue that this property can be detrimental for tasks such as graph dictionary or partition learning, and we relax it by proposing a new semi-relaxed Gromov-Wasserstein divergence. Aside from immediate computational benefits, we discuss its properties and show that it leads to an efficient graph dictionary learning algorithm. We empirically demonstrate its relevance for complex tasks on graphs such as partitioning, clustering and completion.
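To make the mass-conservation point concrete, here is a minimal sketch (assumed for illustration, not the speaker's implementation) of the quadratic GW objective between two graphs represented by pairwise-distance matrices. Classical GW constrains both marginals of the coupling T; the semi-relaxed variant described in the talk keeps only the source (row) marginal fixed and lets the target (column) marginal be free.

```python
import numpy as np

def gw_cost(C1, C2, T):
    # Quadratic Gromov-Wasserstein objective:
    # sum_{i,j,k,l} (C1[i,k] - C2[j,l])^2 * T[i,j] * T[k,l]
    D = (C1[:, None, :, None] - C2[None, :, None, :]) ** 2
    return np.einsum('ijkl,ij,kl->', D, T, T)

# Two toy graphs given by shortest-path-like distance matrices.
C1 = np.array([[0., 1., 2.],
               [1., 0., 1.],
               [2., 1., 0.]])
C2 = C1.copy()

n = C1.shape[0]
T = np.eye(n) / n          # node-to-node coupling between identical graphs
print(gw_cost(C1, C2, T))  # 0.0: identical structures match perfectly

p = T.sum(axis=1)  # row marginal: fixed in both classical and semi-relaxed GW
q = T.sum(axis=0)  # column marginal: fixed in classical GW, free when semi-relaxed
```

Actual solvers (e.g. in the POT library) optimize over T rather than evaluating a fixed coupling; freeing q enlarges the feasible set, which is what gives the semi-relaxed divergence its flexibility for partitioning-style tasks.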