Academic literature on the topic 'Distances de Wasserstein'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Distances de Wasserstein.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Journal articles on the topic "Distances de Wasserstein"
Solomon, Justin, Fernando de Goes, Gabriel Peyré, Marco Cuturi, Adrian Butscher, Andy Nguyen, Tao Du, and Leonidas Guibas. "Convolutional Wasserstein distances." ACM Transactions on Graphics 34, no. 4 (July 27, 2015): 1–11. http://dx.doi.org/10.1145/2766963.
Kindelan Nuñez, Rolando, Mircea Petrache, Mauricio Cerda, and Nancy Hitschfeld. "A Class of Topological Pseudodistances for Fast Comparison of Persistence Diagrams." Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 12 (March 24, 2024): 13202–10. http://dx.doi.org/10.1609/aaai.v38i12.29220.
Panaretos, Victor M., and Yoav Zemel. "Statistical Aspects of Wasserstein Distances." Annual Review of Statistics and Its Application 6, no. 1 (March 7, 2019): 405–31. http://dx.doi.org/10.1146/annurev-statistics-030718-104938.
Kelbert, Mark. "Survey of Distances between the Most Popular Distributions." Analytics 2, no. 1 (March 1, 2023): 225–45. http://dx.doi.org/10.3390/analytics2010012.
Vayer, Titouan, Laetitia Chapel, Remi Flamary, Romain Tavenard, and Nicolas Courty. "Fused Gromov-Wasserstein Distance for Structured Objects." Algorithms 13, no. 9 (August 31, 2020): 212. http://dx.doi.org/10.3390/a13090212.
Belili, Nacereddine, and Henri Heinich. "Distances de Wasserstein et de Zolotarev." Comptes Rendus de l'Académie des Sciences - Series I - Mathematics 330, no. 9 (May 2000): 811–14. http://dx.doi.org/10.1016/s0764-4442(00)00274-3.
Peyre, Rémi. "Comparison between W2 distance and Ḣ−1 norm, and Localization of Wasserstein distance." ESAIM: Control, Optimisation and Calculus of Variations 24, no. 4 (October 2018): 1489–501. http://dx.doi.org/10.1051/cocv/2017050.
Tong, Qijun, and Kei Kobayashi. "Entropy-Regularized Optimal Transport on Multivariate Normal and q-normal Distributions." Entropy 23, no. 3 (March 3, 2021): 302. http://dx.doi.org/10.3390/e23030302.
Beier, Florian, Robert Beinert, and Gabriele Steidl. "Multi-marginal Gromov–Wasserstein transport and barycentres." Information and Inference: A Journal of the IMA 12, no. 4 (September 18, 2023): 2720–52. http://dx.doi.org/10.1093/imaiai/iaad041.
Zhang, Zhonghui, Huarui Jing, and Chihwa Kao. "High-Dimensional Distributionally Robust Mean-Variance Efficient Portfolio Selection." Mathematics 11, no. 5 (March 6, 2023): 1272. http://dx.doi.org/10.3390/math11051272.
Full textDissertations / Theses on the topic "Distances de Wasserstein"
Boissard, Emmanuel. "Problèmes d'interaction discret-continu et distances de Wasserstein." Toulouse 3, 2011. http://thesesups.ups-tlse.fr/1389/.
We study several problems of approximation using tools from optimal transportation theory. The family of Wasserstein metrics is used to provide error bounds for particular approximations of some partial differential equations. These metrics also come into play as natural measures of distortion for quantization and clustering problems. A related problem is to estimate the speed of convergence in the empirical law of large numbers for these distortions. The first part of this thesis provides non-asymptotic bounds, notably in infinite-dimensional Banach spaces, as well as in cases where independence is removed. The second part is dedicated to the study of two models of animal movement. A new individual-based model for ant trail formation is introduced and studied through numerical simulations and a kinetic formulation. We also study a variant of the Cucker-Smale model of bird flock motion: we establish well-posedness of the associated Vlasov-type transport equation as well as long-time behaviour results. In a third part, we study some statistical applications of the notion of barycenter in Wasserstein space recently introduced by M. Agueh and G. Carlier.
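To give a concrete feel for the empirical law of large numbers in Wasserstein distance mentioned in this abstract, here is a minimal Python sketch (an illustration only, not material from the thesis); the standard normal target, the large-reference-sample trick, and the sample sizes are illustrative assumptions.

```python
# Minimal sketch (not from the thesis): rough empirical speed of convergence of
# W1(empirical measure of n samples, true law) for a 1D standard normal,
# estimated by comparing n samples against a large reference sample.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
reference = rng.standard_normal(200_000)  # stand-in for the true law

for n in [100, 1_000, 10_000]:
    # average over a few repetitions to smooth the estimate
    dists = [wasserstein_distance(rng.standard_normal(n), reference)
             for _ in range(20)]
    print(f"n = {n:>6}: mean W1 about {np.mean(dists):.4f}")
```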
Fernandes, Montesuma Eduardo. "Multi-Source Domain Adaptation through Wasserstein Barycenters." Electronic Thesis or Diss., université Paris-Saclay, 2024. http://www.theses.fr/2024UPASG045.
Machine learning systems work under the assumption that training and test conditions are uniform, i.e., they do not change. However, this hypothesis is seldom met in practice, so the system is trained on data that is no longer representative of the data it will be tested on. This situation corresponds to a shift in the probability measure generating the data, known in the literature as distributional shift between two domains: a source and a target. A straightforward generalization of this problem arises when the training data itself exhibits shifts of its own; one then considers Multi-Source Domain Adaptation (MSDA). In this context, optimal transport is a useful field of mathematics: it serves as a toolbox for comparing and manipulating probability measures. This thesis studies the contributions of optimal transport to multi-source domain adaptation. We do so through Wasserstein barycenters, an object that defines a weighted average, in the space of probability measures, of the multiple domains in MSDA. Based on this concept, we propose: (i) a novel notion of barycenter when the measures at hand are equipped with labels, (ii) a novel dictionary learning problem over empirical probability measures, and (iii) new tools for domain adaptation through the optimal transport of Gaussian mixture models. Through our methods, we are able to improve domain adaptation performance in comparison with previous optimal transport-based methods on image and cross-domain fault diagnosis benchmarks. Our work opens an interesting research direction on learning the barycentric hull of probability measures.
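For readers unfamiliar with the central object of this thesis, the following is a minimal sketch (not the author's method) of a 2-Wasserstein barycenter of one-dimensional samples, using the textbook fact that in 1D the barycenter's quantile function is the weighted average of the input quantile functions; the helper name, grid size, and sample choices are illustrative.

```python
# Minimal sketch (not the thesis's algorithm): W2 barycenter of 1D samples.
# In one dimension the barycenter's quantile function is the weighted average
# of the input quantile functions, so averaging empirical quantiles on a
# common grid gives a simple approximation.
import numpy as np

def wasserstein_barycenter_1d(samples, weights, grid_size=200):
    """Approximate quantile function of the W2 barycenter of 1D samples."""
    levels = (np.arange(grid_size) + 0.5) / grid_size
    quantiles = np.stack([np.quantile(s, levels) for s in samples])
    return np.average(quantiles, axis=0, weights=weights)

rng = np.random.default_rng(0)
domains = [rng.normal(-2.0, 1.0, 500), rng.normal(3.0, 0.5, 500)]
bary_quantiles = wasserstein_barycenter_1d(domains, weights=[0.5, 0.5])
print(bary_quantiles[:5])  # first few quantiles of the barycentric measure
```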
Schrieber, Jörn [Verfasser], Dominic [Akademischer Betreuer] Schuhmacher, Dominic [Gutachter] Schuhmacher, and Anita [Gutachter] Schöbel. "Algorithms for Optimal Transport and Wasserstein Distances / Jörn Schrieber ; Gutachter: Dominic Schuhmacher, Anita Schöbel ; Betreuer: Dominic Schuhmacher." Göttingen : Niedersächsische Staats- und Universitätsbibliothek Göttingen, 2019. http://d-nb.info/1179449304/34.
Seguy, Vivien Pierre François. "Measure Transport Approaches for Data Visualization and Learning." Kyoto University, 2018. http://hdl.handle.net/2433/233857.
Gairing, Jan, Michael Högele, Tetiana Kosenkova, and Alexei Kulik. "On the calibration of Lévy driven time series with coupling distances: an application in paleoclimate." Universität Potsdam, 2014. http://opus.kobv.de/ubp/volltexte/2014/6978/.
Full textFlenghi, Roberta. "Théorème de la limite centrale pour des fonctionnelles non linéaires de la mesure empirique et pour le rééchantillonnage stratifié." Electronic Thesis or Diss., Marne-la-vallée, ENPC, 2023. http://www.theses.fr/2023ENPC0051.
This thesis is dedicated to the central limit theorem, which, together with the strong law of large numbers, is one of the two fundamental limit theorems in probability theory. The central limit theorem, well known for linear functionals of the empirical measure of independent and identically distributed random vectors, has recently been extended to non-linear functionals. The main tool permitting this extension is the linear functional derivative, one of the notions of derivation on the Wasserstein space of probability measures. We generalize this extension by first relaxing the equal-distribution assumption and then the independence property, in order to deal with the successive values of an ergodic Markov chain. In the second place, we focus on the stratified resampling mechanism, one of the resampling schemes commonly used in particle filters. We prove a central limit theorem for the first resampling according to this mechanism under the assumption that the initial positions are independent and identically distributed and the weights are proportional to a positive function of the positions such that the image of their common distribution by this function has a non-zero component absolutely continuous with respect to the Lebesgue measure. This result relies on the convergence in distribution of the fractional part of partial sums of the normalized weights to a random variable uniformly distributed on [0,1]. More generally, we prove the joint convergence in distribution of q variables modulo one obtained as partial sums of a sequence of i.i.d. square-integrable random variables multiplied by a common factor given by some function of an empirical mean of the same sequence. The limit is uniformly distributed over [0,1]^q. To deal with the coupling introduced by the common factor, we assume that the common distribution of the random variables has a non-zero component absolutely continuous with respect to the Lebesgue measure, so that the convergence in the central limit theorem for this sequence holds in total variation distance. Under the conjecture that the convergence in distribution of fractional parts to a uniform random variable remains valid at the next steps of a particle filter which alternates selections according to the stratified resampling mechanism and mutations according to Markov kernels, we provide an inductive formula for the asymptotic variance of the resampled population after n steps. We perform numerical experiments which support the validity of this formula.
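To make the fractional-part statement concrete, here is a small simulation (an illustration under simple assumptions, not part of the thesis) checking that the fractional part of a partial sum of i.i.d. variables with an absolutely continuous component is approximately uniform on [0,1]; the exponential law, sample size, and number of repetitions are illustrative.

```python
# Illustration (not from the thesis): the fractional part of a partial sum of
# i.i.d. variables with a density component is approximately uniform on [0, 1];
# checked here with a Kolmogorov-Smirnov test against the uniform law.
import numpy as np
from scipy.stats import kstest

rng = np.random.default_rng(0)
n, repetitions = 200, 5_000
partial_sums = rng.exponential(scale=1.3, size=(repetitions, n)).sum(axis=1)
fractional_parts = partial_sums % 1.0
print(kstest(fractional_parts, "uniform"))  # a large p-value is consistent with uniformity
```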
Bobbia, Benjamin. "Régression quantile extrême : une approche par couplage et distance de Wasserstein." Thesis, Bourgogne Franche-Comté, 2020. http://www.theses.fr/2020UBFCD043.
This work concerns the estimation of conditional extreme quantiles. More precisely, we estimate high quantiles of a real distribution conditionally on the value of a covariate, potentially in high dimension. Such an estimation is made by introducing the proportional tail model, which we study with two coupling methods: the first is based on empirical processes, whereas the second focuses on transport and optimal coupling. We provide estimators of both the quantiles and the model parameters and show their asymptotic normality with our coupling methods. We also provide a validation procedure for the proportional tail model. Moreover, we develop the second approach in the general framework of univariate extreme value theory.
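As background on the kind of quantity estimated here, the sketch below computes an unconditional high quantile with the classical Hill tail-index estimator and Weissman extrapolation; it is a standard univariate building block, not the coupling-based conditional procedure of the thesis, and the Pareto example and the choice k = 100 are illustrative assumptions.

```python
# Sketch of a classical extreme-quantile estimator (Hill + Weissman), given as
# background only; it is not the coupling-based procedure of the thesis.
import numpy as np

def weissman_quantile(sample, p, k=100):
    """Estimate the quantile of order 1 - p from the k largest observations."""
    x = np.sort(sample)
    n = x.size
    threshold = x[n - k - 1]                            # (n-k)-th order statistic
    gamma_hat = np.mean(np.log(x[n - k:] / threshold))  # Hill estimator of the tail index
    return threshold * (k / (n * p)) ** gamma_hat       # Weissman extrapolation

rng = np.random.default_rng(0)
sample = rng.pareto(a=3.0, size=10_000) + 1.0  # Pareto(3): true tail index 1/3
print(weissman_quantile(sample, p=1e-4))       # compare with (1e-4)**(-1/3), about 21.5
```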
Nadjahi, Kimia. "Sliced-Wasserstein distance for large-scale machine learning: theory, methodology and extensions." Electronic Thesis or Diss., Institut polytechnique de Paris, 2021. http://www.theses.fr/2021IPPAT050.
Many methods for statistical inference and generative modeling rely on a probability divergence to effectively compare two probability distributions. The Wasserstein distance, which emerges from optimal transport, has been an interesting choice, but suffers from computational and statistical limitations in large-scale settings. Several alternatives have then been proposed, including the Sliced-Wasserstein distance (SW), a metric that has been increasingly used in practice due to its computational benefits. However, there is little work regarding its theoretical properties. This thesis further explores the use of SW in modern statistical and machine learning problems, with a twofold objective: 1) provide new theoretical insights to understand SW-based algorithms in depth, and 2) design novel tools inspired by SW to improve its applicability and scalability. We first prove a set of asymptotic properties on the estimators obtained by minimizing SW, as well as a central limit theorem whose convergence rate is dimension-free. We also design a novel likelihood-free approximate inference method based on SW, which is theoretically grounded and scales well with the data size and dimension. Given that SW is commonly estimated with a simple Monte Carlo scheme, we then propose two approaches to alleviate the inefficiencies caused by the induced approximation error: on the one hand, we extend the definition of SW to introduce the Generalized Sliced-Wasserstein distances, and illustrate their advantages on generative modeling applications; on the other hand, we leverage concentration-of-measure results to formulate a new deterministic approximation for SW, which is computationally more efficient than the usual Monte Carlo technique and has non-asymptotic guarantees under a weak dependence condition. Finally, we define the general class of sliced probability divergences and investigate their topological and statistical properties; in particular, we establish that the sample complexity of any sliced divergence does not depend on the problem dimension.
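Since the abstract refers to the usual Monte Carlo estimator of SW, here is a minimal numpy sketch of it (illustrative only, not the thesis's implementation): draw random directions on the unit sphere, project both samples, and average the one-dimensional Wasserstein distances. The function name, number of projections, and test data are assumptions.

```python
# Minimal sketch of the Monte Carlo estimator of the Sliced-Wasserstein
# distance (illustration only, not the thesis's code).
import numpy as np
from scipy.stats import wasserstein_distance

def sliced_wasserstein(X, Y, n_projections=200, seed=None):
    """SW1 between two empirical measures given as (n, d) sample arrays."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # random directions, uniform on the unit sphere
    directions = rng.standard_normal((n_projections, d))
    directions /= np.linalg.norm(directions, axis=1, keepdims=True)
    # 1D Wasserstein distance between the projected samples, averaged over directions
    dists = [wasserstein_distance(X @ theta, Y @ theta) for theta in directions]
    return float(np.mean(dists))

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 10))
Y = rng.standard_normal((500, 10)) + 0.5
print(sliced_wasserstein(X, Y))
```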
Liu, Lu. "A Risk-Oriented Clustering Approach for Asset Categorization and Risk Measurement." Thesis, Université d'Ottawa / University of Ottawa, 2019. http://hdl.handle.net/10393/39444.
Lescornel, Hélène. "Covariance estimation and study of models of deformations between distributions with the Wasserstein distance." Toulouse 3, 2014. http://www.theses.fr/2014TOU30045.
The first part of this thesis concerns the covariance estimation of non-stationary processes. We estimate the covariance in different vector spaces of matrices. In Chapter 3, we give a model selection procedure by minimizing a penalized criterion and using concentration inequalities, and Chapter 4 presents an Unbiased Risk Estimation method. In both cases we give oracle inequalities. The second part deals with the study of models of deformation between distributions. We assume that we observe a random quantity epsilon through a deformation function, whose importance is represented by a parameter theta that we aim to estimate. We present several estimation methods based on the Wasserstein distance, aligning the distributions of the observations to recover the deformation parameter. In the case of real random variables, Chapter 7 presents consistency properties for an M-estimator and its asymptotic distribution; we use Hadamard differentiability techniques to apply a functional delta method. Chapter 8 concerns a Robbins-Monro estimator for the deformation parameter and presents convergence properties for a kernel estimator of the density of the variable epsilon obtained from the observations. The model is generalized to random variables in complete metric spaces in Chapter 9. Then, with the aim of building a goodness-of-fit test, Chapter 10 gives results on the asymptotic distribution of a test statistic.
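To illustrate the alignment idea described in this abstract, a toy sketch (not one of the thesis's estimators) can recover a scale deformation parameter by minimizing the one-dimensional Wasserstein distance between the observations and a deformed reference sample; the Gaussian example, sample sizes, and search bounds are illustrative assumptions.

```python
# Toy sketch of the alignment idea (not the thesis's estimators): observations
# are X = theta * eps for an unknown scale theta; estimate theta by minimizing
# the 1D Wasserstein distance between the observations and theta * reference.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
true_theta = 2.5
eps_reference = rng.standard_normal(5_000)          # sample from the law of eps
observations = true_theta * rng.standard_normal(5_000)

def objective(theta):
    return wasserstein_distance(observations, theta * eps_reference)

result = minimize_scalar(objective, bounds=(0.1, 10.0), method="bounded")
print(result.x)  # close to 2.5
```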
Books on the topic "Distances de Wasserstein"
An Invitation to Optimal Transport, Wasserstein Distances, and Gradient Flows. European Mathematical Society, 2021.
Computational Inversion with Wasserstein Distances and Neural Network Induced Loss Functions. [New York, N.Y.?]: [publisher not identified], 2022.
Book chapters on the topic "Distances de Wasserstein"
Bachmann, Fynn, Philipp Hennig, and Dmitry Kobak. "Wasserstein t-SNE." In Machine Learning and Knowledge Discovery in Databases, 104–20. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-26387-3_7.
Villani, Cédric. "The Wasserstein distances." In Grundlehren der mathematischen Wissenschaften, 93–111. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-540-71050-9_6.
Barbe, Amélie, Marc Sebban, Paulo Gonçalves, Pierre Borgnat, and Rémi Gribonval. "Graph Diffusion Wasserstein Distances." In Machine Learning and Knowledge Discovery in Databases, 577–92. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-67661-2_34.
Jacobs, Bart. "Drawing from an Urn is Isometric." In Lecture Notes in Computer Science, 101–20. Cham: Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-3-031-57228-9_6.
Santambrogio, Filippo. "Wasserstein distances and curves in the Wasserstein spaces." In Optimal Transport for Applied Mathematicians, 177–218. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-20828-2_5.
Öcal, Kaan, Ramon Grima, and Guido Sanguinetti. "Wasserstein Distances for Estimating Parameters in Stochastic Reaction Networks." In Computational Methods in Systems Biology, 347–51. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-31304-3_24.
Carrillo, José Antonio, Young-Pil Choi, and Maxime Hauray. "The derivation of swarming models: Mean-field limit and Wasserstein distances." In Collective Dynamics from Bacteria to Crowds, 1–46. Vienna: Springer Vienna, 2014. http://dx.doi.org/10.1007/978-3-7091-1785-9_1.
Haeusler, Erich, and David M. Mason. "Asymptotic Distributions of Trimmed Wasserstein Distances Between the True and the Empirical Distribution Function." In Stochastic Inequalities and Applications, 279–98. Basel: Birkhäuser Basel, 2003. http://dx.doi.org/10.1007/978-3-0348-8069-5_16.
Walczak, Szymon M. "Wasserstein Distance." In SpringerBriefs in Mathematics, 1–10. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-57517-9_1.
Breiding, Paul, Kathlén Kohn, and Bernd Sturmfels. "Wasserstein Distance." In Oberwolfach Seminars, 53–66. Cham: Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-3-031-51462-3_5.
Conference papers on the topic "Distances de Wasserstein"
Zhang, Xiaoxia, Chao Wang, Xusheng Hu, and Claude Delpha. "Incipient Cracks Characterization Based on Jensen-Shannon Divergence and Wasserstein Distance." In 2024 Prognostics and System Health Management Conference (PHM), 8–13. IEEE, 2024. http://dx.doi.org/10.1109/phm61473.2024.00010.
Lyu, Zihang, Jun Xiao, Cong Zhang, and Kin-Man Lam. "AI-Generated Image Detection With Wasserstein Distance Compression and Dynamic Aggregation." In 2024 IEEE International Conference on Image Processing (ICIP), 3827–33. IEEE, 2024. http://dx.doi.org/10.1109/icip51287.2024.10648186.
Malik, Vikrant, Taylan Kargin, Victoria Kostina, and Babak Hassibi. "A Distributionally Robust Approach to Shannon Limits using the Wasserstein Distance." In 2024 IEEE International Symposium on Information Theory (ISIT), 861–66. IEEE, 2024. http://dx.doi.org/10.1109/isit57864.2024.10619597.
Lopez, Adrian Tovar, and Varun Jog. "Generalization error bounds using Wasserstein distances." In 2018 IEEE Information Theory Workshop (ITW). IEEE, 2018. http://dx.doi.org/10.1109/itw.2018.8613445.
Memoli, Facundo. "Spectral Gromov-Wasserstein distances for shape matching." In 2009 IEEE 12th International Conference on Computer Vision Workshops, ICCV Workshops. IEEE, 2009. http://dx.doi.org/10.1109/iccvw.2009.5457690.
Prossel, Dominik, and Uwe D. Hanebeck. "Dirac Mixture Reduction Using Wasserstein Distances on Projected Cumulative Distributions." In 2022 25th International Conference on Information Fusion (FUSION). IEEE, 2022. http://dx.doi.org/10.23919/fusion49751.2022.9841286.
Steuernagel, Simon, Aaron Kurda, and Marcus Baum. "Point Cloud Registration based on Gaussian Mixtures and Pairwise Wasserstein Distances." In 2023 IEEE Symposium Sensor Data Fusion and International Conference on Multisensor Fusion and Integration (SDF-MFI). IEEE, 2023. http://dx.doi.org/10.1109/sdf-mfi59545.2023.10361440.
Perkey, Scott, Ana Carvalho, and Alberto Krone-Martins. "Using Fourier Coefficients and Wasserstein Distances to Estimate Entropy in Time Series." In 2023 IEEE 19th International Conference on e-Science (e-Science). IEEE, 2023. http://dx.doi.org/10.1109/e-science58273.2023.10254949.
Barbe, Amelie, Paulo Goncalves, Marc Sebban, Pierre Borgnat, Remi Gribonval, and Titouan Vayer. "Optimization of the Diffusion Time in Graph Diffused-Wasserstein Distances: Application to Domain Adaptation." In 2021 IEEE 33rd International Conference on Tools with Artificial Intelligence (ICTAI). IEEE, 2021. http://dx.doi.org/10.1109/ictai52525.2021.00125.
Garcia Ramirez, Jesus. "Which Kernels to Transfer in Deep Q-Networks?" In LatinX in AI at Neural Information Processing Systems Conference 2019. Journal of LatinX in AI Research, 2019. http://dx.doi.org/10.52591/lxai201912087.