Academic literature on the topic 'Wasserstein barycenters'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Wasserstein barycenters.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Press it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Journal articles on the topic "Wasserstein barycenters"
Chi, Jinjin, Zhiyao Yang, Ximing Li, Jihong Ouyang, and Renchu Guan. "Variational Wasserstein Barycenters with C-cyclical Monotonicity Regularization." Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 6 (June 26, 2023): 7157–65. http://dx.doi.org/10.1609/aaai.v37i6.25873.
Xu, Hongteng, Dixin Luo, Lawrence Carin, and Hongyuan Zha. "Learning Graphons via Structured Gromov-Wasserstein Barycenters." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 12 (May 18, 2021): 10505–13. http://dx.doi.org/10.1609/aaai.v35i12.17257.
Bigot, Jérémie, and Thierry Klein. "Characterization of barycenters in the Wasserstein space by averaging optimal transport maps." ESAIM: Probability and Statistics 22 (2018): 35–57. http://dx.doi.org/10.1051/ps/2017020.
Agueh, Martial, and Guillaume Carlier. "Barycenters in the Wasserstein Space." SIAM Journal on Mathematical Analysis 43, no. 2 (January 2011): 904–24. http://dx.doi.org/10.1137/100805741.
Kim, Young-Heon, and Brendan Pass. "Wasserstein barycenters over Riemannian manifolds." Advances in Mathematics 307 (February 2017): 640–83. http://dx.doi.org/10.1016/j.aim.2016.11.026.
Baum, Marcus, Peter Willett, and Uwe D. Hanebeck. "On Wasserstein Barycenters and MMOSPA Estimation." IEEE Signal Processing Letters 22, no. 10 (October 2015): 1511–15. http://dx.doi.org/10.1109/lsp.2015.2410217.
Puccetti, Giovanni, Ludger Rüschendorf, and Steven Vanduffel. "On the computation of Wasserstein barycenters." Journal of Multivariate Analysis 176 (March 2020): 104581. http://dx.doi.org/10.1016/j.jmva.2019.104581.
Le Gouic, Thibaut, and Jean-Michel Loubes. "Existence and consistency of Wasserstein barycenters." Probability Theory and Related Fields 168, no. 3-4 (August 17, 2016): 901–17. http://dx.doi.org/10.1007/s00440-016-0727-z.
Buzun, Nazar. "Gaussian Approximation for Penalized Wasserstein Barycenters." Mathematical Methods of Statistics 32, no. 1 (March 2023): 1–26. http://dx.doi.org/10.3103/s1066530723010039.
Sow, Babacar, Rodolphe Le Riche, Julien Pelamatti, Merlin Keller, and Sanaa Zannane. "Wasserstein-Based Evolutionary Operators for Optimizing Sets of Points: Application to Wind-Farm Layout Design." Applied Sciences 14, no. 17 (September 5, 2024): 7916. http://dx.doi.org/10.3390/app14177916.
Dissertations / Theses on the topic "Wasserstein barycenters"
Fernandes Montesuma, Eduardo. "Multi-Source Domain Adaptation through Wasserstein Barycenters." Electronic Thesis or Diss., Université Paris-Saclay, 2024. http://www.theses.fr/2024UPASG045.
Machine learning systems work under the assumption that training and test conditions are uniform, i.e., they do not change. However, this hypothesis is seldom met in practice. The system is then trained with data that is no longer representative of the data it will be tested on. This corresponds to a shift in the probability measure generating the data, known in the literature as distributional shift between two domains: a source and a target. A straightforward generalization of this problem arises when the training data themselves exhibit shifts, in which case one considers Multi-Source Domain Adaptation (MSDA). In this context, optimal transport is a useful field of mathematics: it serves as a toolbox for comparing and manipulating probability measures. This thesis studies the contributions of optimal transport to multi-source domain adaptation. We do so through Wasserstein barycenters, an object that defines a weighted average, in the space of probability measures, of the multiple domains in MSDA. Based on this concept, we propose: (i) a novel notion of barycenter when the measures at hand are equipped with labels, (ii) a novel dictionary learning problem over empirical probability measures, and (iii) new tools for domain adaptation through the optimal transport of Gaussian mixture models. Through our methods, we are able to improve domain adaptation performance in comparison with previous optimal transport-based methods on image and cross-domain fault diagnosis benchmarks. Our work opens an interesting research direction on learning the barycentric hull of probability measures.
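For orientation, the "weighted average in the space of probability measures" referred to here is the Wasserstein barycenter in the sense of Agueh and Carlier (2011), cited in the journal list above; its standard formulation, as a weighted Fréchet mean for the 2-Wasserstein distance, is

```latex
% Wasserstein barycenter of measures \mu_1,\dots,\mu_K with weights w_k \geq 0, \sum_k w_k = 1
\bar{\mu} \;\in\; \operatorname*{arg\,min}_{\nu \in \mathcal{P}_2(\mathbb{R}^d)} \; \sum_{k=1}^{K} w_k \, W_2^2(\nu, \mu_k).
```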
Cazelles, Elsa. "Statistical properties of barycenters in the Wasserstein space and fast algorithms for optimal transport of measures." Thesis, Bordeaux, 2018. http://www.theses.fr/2018BORD0125/document.
This thesis focuses on the analysis of data in the form of probability measures on R^d. The aim is to provide a better understanding of the usual statistical tools on this space endowed with the Wasserstein distance. A first-order statistical analysis is a natural notion to consider, consisting of the study of the Fréchet mean (or barycenter). In particular, we focus on the case of discrete data (or observations) sampled from probability measures that are absolutely continuous (a.c.) with respect to the Lebesgue measure. We thus introduce an estimator of the barycenter of random measures, penalized by a convex function, making it possible to enforce its absolute continuity. Another estimator is regularized by adding entropy when computing the Wasserstein distance. We are particularly interested in controlling the variance of these estimators. Thanks to these results, the principle of Goldenshluger and Lepski allows us to obtain an automatic calibration of the regularization parameters. We then apply this work to the registration of multivariate densities, especially for flow cytometry data. We also propose a test statistic that can compare two multivariate distributions efficiently in terms of computational time. Finally, we perform a second-order statistical analysis to extract the global geometric tendency of a dataset, also called the main modes of variation. For that purpose, we propose algorithms allowing one to carry out a geodesic principal component analysis in the Wasserstein space.
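In one dimension, the barycenter of observed measures has an elementary description: for the 2-Wasserstein distance it is obtained by averaging quantile functions. The NumPy sketch below illustrates only this special case on arbitrary toy samples; it is not the penalized or entropy-regularized estimator studied in the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Three one-dimensional samples playing the role of observed random measures.
samples = [rng.normal(-2.0, 0.5, 300),
           rng.normal(0.0, 1.0, 300),
           rng.normal(3.0, 1.5, 300)]
weights = np.array([0.2, 0.3, 0.5])

# On the real line, the 2-Wasserstein barycenter is obtained by averaging
# quantile functions: F_bar^{-1}(u) = sum_k w_k * F_k^{-1}(u).
levels = np.linspace(0.005, 0.995, 200)
quantiles = np.stack([np.quantile(s, levels) for s in samples])  # shape (3, 200)
barycenter_quantiles = weights @ quantiles                       # shape (200,)

# The averaged quantile function discretizes the barycenter; print a few values.
print(barycenter_quantiles[:5])
```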
Le Gouic, Thibaut. "Localisation de masse et espaces de Wasserstein" [Mass localization and Wasserstein spaces]. Toulouse 3, 2013. http://thesesups.ups-tlse.fr/2163/.
The study in this manuscript is based on two distinct tools: packings of sets and Wasserstein spaces. A first part focuses on localizing the mass of a probability measure μ. For a regular μ, the level sets of its density are a good notion for localizing where the measure is dense, but they lose their meaning for a finitely supported measure such as the empirical measure. We thus define a function τ, called the size function, on closed sets, based on the packing of the sets. The sets of smallest τ-size with a given probability 1 − α localize dense areas, even if μ is not regular. We show that these smallest sets, given μ and α, depend continuously on μ and α for the Hausdorff distance. We derive a new method to quantize μ in a robust and stable way. A second part focuses on the Wasserstein distance between a probability measure and the associated empirical measure. We obtain a non-asymptotic upper bound on the expectation of this distance for an arbitrary underlying metric space. An application of the result to finite-dimensional spaces shows the accuracy of the bound. We also obtain new bounds for the case of a Gaussian measure on a Banach space that coincide asymptotically with the best possible quantizers. Using concentration inequalities, we show deviation bounds. Finally, we use these results to define non-asymptotic and non-parametric statistical tests of goodness of fit to a family of probability measures. A third part focuses on the barycenter of a finite family of probability measures. The Fréchet mean extends the notion of barycenter to metric spaces and gives us a way to define barycenters on Wasserstein spaces. We show the existence of these barycenters and then study their continuity with respect to the family of measures. We then discuss practical applications in the aggregation of empirical measures and texture mixing.
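The quantity bounded in the second part, the expected Wasserstein distance between a measure and its empirical counterpart, can be illustrated numerically in the simplest one-dimensional case, where SciPy computes the W1 distance between samples exactly. The toy experiment below, with an assumed Gaussian measure approximated by a large reference sample, roughly exhibits the expected decay of order n^(-1/2); it is only an illustration, not a reproduction of the thesis' general metric-space bounds.

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
# A large reference sample stands in for the underlying measure mu (here Gaussian).
reference = rng.normal(size=100_000)

# Average W1 distance between mu and the empirical measure of n i.i.d. samples;
# in this one-dimensional setting the expected decay is of order n**-0.5.
for n in (10, 100, 1_000, 10_000):
    dists = [wasserstein_distance(rng.normal(size=n), reference) for _ in range(10)]
    print(f"n = {n:6d}   mean W1(mu_n, mu) ~ {np.mean(dists):.4f}")
```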
Silveti-Falls, Antonio. "First-order noneuclidean splitting methods for large-scale optimization: deterministic and stochastic algorithms." Thesis, Normandie, 2021. http://www.theses.fr/2021NORMC204.
In this work we develop and examine two novel first-order splitting algorithms for solving large-scale composite optimization problems in infinite-dimensional spaces. Such problems are ubiquitous in many areas of science and engineering, particularly in data science and imaging science. Our work is focused on relaxing the Lipschitz-smoothness assumptions generally required by first-order splitting algorithms by replacing the Euclidean energy with a Bregman divergence. These developments allow one to solve problems having more exotic geometry than that of the usual Euclidean setting. One algorithm is a hybridization of the conditional gradient algorithm, which makes use of a linear minimization oracle at each iteration, with an augmented Lagrangian algorithm, allowing for affine constraints. The other algorithm is a primal-dual splitting algorithm incorporating Bregman divergences for computing the associated proximal operators. For both of these algorithms, our analysis shows convergence of the Lagrangian values, subsequential weak convergence of the iterates to solutions, and rates of convergence. In addition to these novel deterministic algorithms, we also introduce and study stochastic extensions of these algorithms through a perturbation perspective. Our results in this part include almost sure convergence of all the same quantities as in the deterministic setting, with rates as well. Finally, we tackle new problems that are only accessible through the relaxed assumptions our algorithms allow. We demonstrate numerical efficiency and verify our theoretical results on problems such as low-rank, sparse matrix completion, inverse problems on the simplex, and entropically regularized Wasserstein inverse problems.
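The idea of replacing the Euclidean energy with a Bregman divergence can be seen in its simplest form in a mirror-descent step under the negative-entropy divergence, which on the probability simplex reduces to a multiplicative update. The sketch below is only a generic illustration of that Bregman proximal step on an assumed toy least-squares problem, not an implementation of the thesis' conditional-gradient or primal-dual algorithms.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(30, 10))
x_true = rng.dirichlet(np.ones(10))      # a point of the simplex to recover
b = A @ x_true

# Minimize f(x) = 0.5 * ||A x - b||^2 over the probability simplex with mirror
# descent: each iteration is a Bregman proximal step for the negative-entropy
# divergence, i.e. a multiplicative update followed by renormalization.
x = np.full(10, 0.1)                     # uniform starting point
step = 0.01
for _ in range(5_000):
    grad = A.T @ (A @ x - b)
    x = x * np.exp(-step * grad)
    x /= x.sum()

print("residual ||Ax - b|| :", np.linalg.norm(A @ x - b))
print("error    ||x - x*|| :", np.linalg.norm(x - x_true))
```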
Nenna, Luca. "Numerical Methods for Multi-Marginal Optimal Transportation." Thesis, Paris Sciences et Lettres (ComUE), 2016. http://www.theses.fr/2016PSLED017/document.
In this thesis we aim at giving a general numerical framework to approximate solutions of optimal transport (OT) problems. The general idea is to introduce an entropic regularization of the initial problems. The regularized problem corresponds to the minimization of a relative entropy with respect to a given reference measure; indeed, this is equivalent to finding the projection of the joint coupling with respect to the Kullback-Leibler divergence. This allows us to make use of the Bregman/Dykstra algorithm and to solve several variational problems related to OT. We are especially interested in solving multi-marginal optimal transport (MMOT) problems arising in physics, such as in fluid dynamics (e.g. incompressible Euler equations à la Brenier) and in quantum physics (e.g. density functional theory). In these cases we show that the entropic regularization plays a more important role than a simple numerical stabilization. Moreover, we also give some important results concerning the existence and characterization of optimal transport maps (e.g. fractal maps) for MMOT.
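The entropic regularization described here can be illustrated in its simplest two-marginal, discrete form: the regularized plan is the KL projection of a reference kernel onto the two marginal constraint sets, and alternating Bregman projections reduce to the familiar Sinkhorn updates. The sketch below, on an assumed toy one-dimensional grid, covers only this special case, not the multi-marginal solvers developed in the thesis.

```python
import numpy as np

# Two discrete marginals and a squared-distance cost on a one-dimensional grid.
n = 50
x = np.linspace(0.0, 1.0, n)
a = np.exp(-((x - 0.3) ** 2) / 0.01); a /= a.sum()
b = np.exp(-((x - 0.7) ** 2) / 0.02); b /= b.sum()
C = (x[:, None] - x[None, :]) ** 2

eps = 0.05
K = np.exp(-C / eps)   # reference kernel; the regularized plan is its KL projection
u = np.ones(n)         # onto the intersection of the two marginal constraint sets
v = np.ones(n)
for _ in range(1_000):
    u = a / (K @ v)    # Bregman/KL projection onto {P : row sums equal a}
    v = b / (K.T @ u)  # Bregman/KL projection onto {P : column sums equal b}

P = u[:, None] * K * v[None, :]
print("marginal errors:", np.abs(P.sum(axis=1) - a).max(), np.abs(P.sum(axis=0) - b).max())
print("regularized transport cost:", float((P * C).sum()))
```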
Book chapters on the topic "Wasserstein barycenters"
Bouchet, Pierre-Yves, Stefano Gualandi, and Louis-Martin Rousseau. "Primal Heuristics for Wasserstein Barycenters." In Integration of Constraint Programming, Artificial Intelligence, and Operations Research, 239–55. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-58942-4_16.
Trinh, Thanh-Son. "Wasserstein Barycenters Over Heisenberg Group." In Algorithms for Intelligent Systems, 273–79. Singapore: Springer Nature Singapore, 2024. http://dx.doi.org/10.1007/978-981-99-8976-8_24.
Auricchio, Gennaro, Federico Bassetti, Stefano Gualandi, and Marco Veneroni. "Computing Wasserstein Barycenters via Linear Programming." In Integration of Constraint Programming, Artificial Intelligence, and Operations Research, 355–63. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-19212-9_23.
Cazelles, Elsa, Jérémie Bigot, and Nicolas Papadakis. "Regularized Barycenters in the Wasserstein Space." In Lecture Notes in Computer Science, 83–90. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-68445-1_10.
Hu, François, Philipp Ratz, and Arthur Charpentier. "Fairness in Multi-Task Learning via Wasserstein Barycenters." In Machine Learning and Knowledge Discovery in Databases: Research Track, 295–312. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-43415-0_18.
Le Gouic, Thibaut, and Jean-Michel Loubes. "Barycenter in Wasserstein Spaces: Existence and Consistency." In Lecture Notes in Computer Science, 104–8. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-25040-3_12.
Nadeem, Saad, Travis Hollmann, and Allen Tannenbaum. "Multimarginal Wasserstein Barycenter for Stain Normalization and Augmentation." In Medical Image Computing and Computer Assisted Intervention – MICCAI 2020, 362–71. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-59722-1_35.
Rabin, Julien, Gabriel Peyré, Julie Delon, and Marc Bernot. "Wasserstein Barycenter and Its Application to Texture Mixing." In Lecture Notes in Computer Science, 435–46. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-24785-9_37.
Wang, Xu, Jiawei Huang, Qingyuan Yang, and Jinpeng Zhang. "On Robust Wasserstein Barycenter: The Model and Algorithm." In Proceedings of the 2024 SIAM International Conference on Data Mining (SDM), 235–43. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2024. http://dx.doi.org/10.1137/1.9781611978032.27.
Jin, Cong, Zhongtong Li, Yuanyuan Sun, Haiyin Zhang, Xin Lv, Jianguang Li, and Shouxun Liu. "An Integrated Processing Method Based on Wasserstein Barycenter Algorithm for Automatic Music Transcription." In Communications and Networking, 230–40. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-41117-6_19.
Full textConference papers on the topic "Wasserstein barycenters"
Simon, Dror, and Aviad Aberdam. "Barycenters of Natural Images - Constrained Wasserstein Barycenters for Image Morphing." In 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2020. http://dx.doi.org/10.1109/cvpr42600.2020.00793.
Simou, Effrosyni, and Pascal Frossard. "Graph Signal Representation with Wasserstein Barycenters." In ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2019. http://dx.doi.org/10.1109/icassp.2019.8683335.
Uribe, Cesar A., Darina Dvinskikh, Pavel Dvurechensky, Alexander Gasnikov, and Angelia Nedic. "Distributed Computation of Wasserstein Barycenters Over Networks." In 2018 IEEE Conference on Decision and Control (CDC). IEEE, 2018. http://dx.doi.org/10.1109/cdc.2018.8619160.
Ouyang, Jihong, Yiming Wang, Ximing Li, and Changchun Li. "Weakly-supervised Text Classification with Wasserstein Barycenters Regularization." In Thirty-First International Joint Conference on Artificial Intelligence (IJCAI-22). California: International Joint Conferences on Artificial Intelligence Organization, 2022. http://dx.doi.org/10.24963/ijcai.2022/468.
Arque, Ferran, Cesar Uribe, and Carlos Ocampo-Martinez. "Computation of Discrete Flows Over Networks via Constrained Wasserstein Barycenters." In LatinX in AI at International Conference on Machine Learning 2021. Journal of LatinX in AI Research, 2021. http://dx.doi.org/10.52591/lxai202107244.
Colombo, Pierre, Guillaume Staerman, Chloé Clavel, and Pablo Piantanida. "Automatic Text Evaluation through the Lens of Wasserstein Barycenters." In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, PA, USA: Association for Computational Linguistics, 2021. http://dx.doi.org/10.18653/v1/2021.emnlp-main.817.
Montesuma, Eduardo F., and Fred-Maurice Ngole Mboula. "Wasserstein Barycenter Transport for Acoustic Adaptation." In ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2021. http://dx.doi.org/10.1109/icassp39728.2021.9414199.
Lian, Xin, Kshitij Jain, Jakub Truszkowski, Pascal Poupart, and Yaoliang Yu. "Unsupervised Multilingual Alignment using Wasserstein Barycenter." In Twenty-Ninth International Joint Conference on Artificial Intelligence and Seventeenth Pacific Rim International Conference on Artificial Intelligence (IJCAI-PRICAI-20). California: International Joint Conferences on Artificial Intelligence Organization, 2020. http://dx.doi.org/10.24963/ijcai.2020/512.
Montesuma, Eduardo Fernandes, and Fred Maurice Ngole Mboula. "Wasserstein Barycenter for Multi-Source Domain Adaptation." In 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2021. http://dx.doi.org/10.1109/cvpr46437.2021.01651.