Selected scientific literature on the topic "Wasserstein barycenters"
Consult the list of current articles, books, theses, conference proceedings, and other scientific sources relevant to the topic "Wasserstein barycenters".
Journal articles on the topic "Wasserstein barycenters"
Chi, Jinjin, Zhiyao Yang, Ximing Li, Jihong Ouyang, and Renchu Guan. "Variational Wasserstein Barycenters with C-cyclical Monotonicity Regularization". Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 6 (June 26, 2023): 7157–65. http://dx.doi.org/10.1609/aaai.v37i6.25873.
Xu, Hongteng, Dixin Luo, Lawrence Carin, and Hongyuan Zha. "Learning Graphons via Structured Gromov-Wasserstein Barycenters". Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 12 (May 18, 2021): 10505–13. http://dx.doi.org/10.1609/aaai.v35i12.17257.
Bigot, Jérémie, and Thierry Klein. "Characterization of barycenters in the Wasserstein space by averaging optimal transport maps". ESAIM: Probability and Statistics 22 (2018): 35–57. http://dx.doi.org/10.1051/ps/2017020.
Agueh, Martial, and Guillaume Carlier. "Barycenters in the Wasserstein Space". SIAM Journal on Mathematical Analysis 43, no. 2 (January 2011): 904–24. http://dx.doi.org/10.1137/100805741.
Kim, Young-Heon, and Brendan Pass. "Wasserstein barycenters over Riemannian manifolds". Advances in Mathematics 307 (February 2017): 640–83. http://dx.doi.org/10.1016/j.aim.2016.11.026.
Baum, Marcus, Peter Willett, and Uwe D. Hanebeck. "On Wasserstein Barycenters and MMOSPA Estimation". IEEE Signal Processing Letters 22, no. 10 (October 2015): 1511–15. http://dx.doi.org/10.1109/lsp.2015.2410217.
Puccetti, Giovanni, Ludger Rüschendorf, and Steven Vanduffel. "On the computation of Wasserstein barycenters". Journal of Multivariate Analysis 176 (March 2020): 104581. http://dx.doi.org/10.1016/j.jmva.2019.104581.
Le Gouic, Thibaut, and Jean-Michel Loubes. "Existence and consistency of Wasserstein barycenters". Probability Theory and Related Fields 168, no. 3-4 (August 17, 2016): 901–17. http://dx.doi.org/10.1007/s00440-016-0727-z.
Buzun, Nazar. "Gaussian Approximation for Penalized Wasserstein Barycenters". Mathematical Methods of Statistics 32, no. 1 (March 2023): 1–26. http://dx.doi.org/10.3103/s1066530723010039.
Sow, Babacar, Rodolphe Le Riche, Julien Pelamatti, Merlin Keller, and Sanaa Zannane. "Wasserstein-Based Evolutionary Operators for Optimizing Sets of Points: Application to Wind-Farm Layout Design". Applied Sciences 14, no. 17 (September 5, 2024): 7916. http://dx.doi.org/10.3390/app14177916.
Theses on the topic "Wasserstein barycenters"
Fernandes, Montesuma Eduardo. "Multi-Source Domain Adaptation through Wasserstein Barycenters". Electronic Thesis or Diss., université Paris-Saclay, 2024. http://www.theses.fr/2024UPASG045.
Machine learning systems work under the assumption that training and test conditions are uniform, i.e., they do not change. However, this hypothesis is seldom met in practice: the system is then trained with data that is no longer representative of the data it will be tested on. This corresponds to a shift in the probability measure generating the data, known in the literature as distributional shift between two domains, a source and a target. A straightforward generalization of this problem arises when the training data itself exhibits shifts; in this case, one considers Multi-Source Domain Adaptation (MSDA). In this context, optimal transport is a useful field of mathematics: in particular, it serves as a toolbox for comparing and manipulating probability measures. This thesis studies the contributions of optimal transport to multi-source domain adaptation. We do so through Wasserstein barycenters, an object that defines a weighted average, in the space of probability measures, of the multiple domains in MSDA. Based on this concept, we propose: (i) a novel notion of barycenter when the measures at hand are equipped with labels, (ii) a novel dictionary learning problem over empirical probability measures, and (iii) new tools for domain adaptation through the optimal transport of Gaussian mixture models. Through our methods, we are able to improve domain adaptation performance in comparison with previous optimal transport-based methods on image and cross-domain fault diagnosis benchmarks. Our work opens an interesting research direction on learning the barycentric hull of probability measures.
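For readers unfamiliar with the central object of this abstract, the barycenter it builds on is the weighted Fréchet mean studied by Agueh and Carlier (cited above). A minimal statement in our own notation (the measures ν_i, weights λ_i, and ambient space ℝ^d are illustrative assumptions, not taken from the thesis):

```latex
% Weighted Wasserstein barycenter of \nu_1,\dots,\nu_N \in \mathcal{P}_2(\mathbb{R}^d)
% with weights \lambda_i \ge 0, \sum_i \lambda_i = 1 (notation ours).
\[
  \bar{\nu} \;\in\; \operatorname*{arg\,min}_{\mu \in \mathcal{P}_2(\mathbb{R}^d)}
  \sum_{i=1}^{N} \lambda_i \, W_2^2(\mu, \nu_i),
\]
% where $W_2$ denotes the 2-Wasserstein distance; in MSDA each \nu_i plays
% the role of one source domain.
```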
Cazelles, Elsa. "Statistical properties of barycenters in the Wasserstein space and fast algorithms for optimal transport of measures". Thesis, Bordeaux, 2018. http://www.theses.fr/2018BORD0125/document.
This thesis focuses on the analysis of data in the form of probability measures on R^d. The aim is to provide a better understanding of the usual statistical tools on this space endowed with the Wasserstein distance. First-order statistical analysis is a natural notion to consider, consisting of the study of the Fréchet mean (or barycenter). In particular, we focus on the case of discrete data (or observations) sampled from probability measures that are absolutely continuous (a.c.) with respect to the Lebesgue measure. We thus introduce an estimator of the barycenter of random measures, penalized by a convex function, making it possible to enforce its absolute continuity. Another estimator is regularized by adding entropy when computing the Wasserstein distance. We are particularly interested in controlling the variance of these estimators. Thanks to these results, the principle of Goldenshluger and Lepski allows us to obtain an automatic calibration of the regularization parameters. We then apply this work to the registration of multivariate densities, especially for flow cytometry data. We also propose a test statistic that can compare two multivariate distributions efficiently in terms of computation time. Finally, we perform a second-order statistical analysis to extract the global geometric tendency of a dataset, also called the main modes of variation. For that purpose, we propose algorithms to carry out geodesic principal component analysis in the Wasserstein space.
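As a rough illustration of the entropy-regularized barycenters mentioned in this abstract, the sketch below computes a barycenter of discrete histograms on a shared grid via iterative Bregman projections (Benamou et al., 2015). It is a generic NumPy sketch, not the thesis's penalized estimator; all names (entropic_barycenter, C, eps) are our own.

```python
import numpy as np


def entropic_barycenter(hists, C, eps=1e-2, weights=None, n_iter=500):
    """Entropy-regularized Wasserstein barycenter of discrete histograms on a
    common grid, via iterative Bregman projections.

    hists: (N, n) array of probability vectors; C: (n, n) ground-cost matrix;
    eps: entropic regularization strength.  Very small eps may require a
    log-domain implementation for numerical stability.
    """
    N, n = hists.shape
    weights = np.full(N, 1.0 / N) if weights is None else np.asarray(weights)
    K = np.exp(-C / eps)                        # Gibbs kernel
    v = np.ones((N, n))
    b = np.full(n, 1.0 / n)
    for _ in range(n_iter):
        u = hists / (v @ K.T)                   # u_i = a_i / (K v_i)
        b = np.exp(weights @ np.log(u @ K))     # weighted geometric mean of K^T u_i
        v = b[None, :] / (u @ K)                # v_i = b / (K^T u_i)
    return b


# Toy usage: barycenter of two shifted 1-D histograms on a regular grid.
x = np.linspace(0.0, 1.0, 100)
C = (x[:, None] - x[None, :]) ** 2
a = np.exp(-(x - 0.3) ** 2 / 0.005)
a /= a.sum()
c = np.exp(-(x - 0.7) ** 2 / 0.005)
c /= c.sum()
bary = entropic_barycenter(np.stack([a, c]), C, eps=5e-3)
```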
Le Gouic, Thibaut. "Localisation de masse et espaces de Wasserstein". Toulouse 3, 2013. http://thesesups.ups-tlse.fr/2163/.
This manuscript is based on two distinct tools: packings and Wasserstein spaces. A first part focuses on mass localization for a probability measure μ. For a regular μ, the level sets of its density are a good notion for localizing where the measure is dense, but they lose their meaning for a finitely supported measure such as the empirical measure. We thus define a function τ, called the size function, on closed sets, based on their packing. The sets of smallest τ-size with a given probability 1 − α localize dense areas even if μ is not regular. We show that these smallest sets depend continuously on μ and α for the Hausdorff distance. We derive a new method to quantize μ in a robust and stable way. A second part focuses on the Wasserstein distance between a probability measure and the associated empirical measure. We obtain a non-asymptotic upper bound on the expectation of this distance for an arbitrary underlying metric space. An application of the result to finite-dimensional spaces shows the accuracy of the bound. We also obtain new bounds for Gaussian measures on Banach spaces that coincide asymptotically with the best possible quantizers. Using concentration inequalities, we show deviation bounds. Finally, we use these results to define non-asymptotic and non-parametric statistical tests of goodness of fit to a family of probability measures. A third part focuses on the barycenter of a finite family of probability measures. The Fréchet mean is an extension of the notion of barycenter to metric spaces and gives us a way to define barycenters on Wasserstein spaces. We show the existence of these barycenters and then study continuity properties with respect to the family of measures. We then discuss practical applications in the aggregation of empirical measures and texture mixing.
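For context, the two quantities at the heart of the second part, written in our own (assumed) notation, are the empirical measure of an i.i.d. sample and the p-Wasserstein distance whose expectation the thesis bounds non-asymptotically:

```latex
% Empirical measure of X_1,\dots,X_n \sim \mu (i.i.d.) and the p-Wasserstein
% distance on a metric space (E, d); notation ours.
\[
  \mu_n = \frac{1}{n}\sum_{i=1}^{n} \delta_{X_i},
  \qquad
  W_p(\mu,\nu) = \Bigl( \inf_{\pi \in \Pi(\mu,\nu)}
     \int_{E\times E} d(x,y)^p \,\mathrm{d}\pi(x,y) \Bigr)^{1/p},
\]
% where \Pi(\mu,\nu) is the set of couplings of \mu and \nu; the thesis
% bounds \mathbb{E}\,W_p(\mu,\mu_n) for an arbitrary underlying metric space.
```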
Silveti-Falls, Antonio. "First-order noneuclidean splitting methods for large-scale optimization : deterministic and stochastic algorithms". Thesis, Normandie, 2021. http://www.theses.fr/2021NORMC204.
In this work we develop and examine two novel first-order splitting algorithms for solving large-scale composite optimization problems in infinite-dimensional spaces. Such problems are ubiquitous in many areas of science and engineering, particularly in data science and imaging sciences. Our work focuses on relaxing the Lipschitz-smoothness assumptions generally required by first-order splitting algorithms by replacing the Euclidean energy with a Bregman divergence. These developments allow one to solve problems having more exotic geometry than that of the usual Euclidean setting. One algorithm is a hybridization of the conditional gradient algorithm, which uses a linear minimization oracle at each iteration, with an augmented Lagrangian algorithm, allowing for affine constraints. The other is a primal-dual splitting algorithm incorporating Bregman divergences for computing the associated proximal operators. For both of these algorithms, our analysis shows convergence of the Lagrangian values, subsequential weak convergence of the iterates to solutions, and rates of convergence. In addition to these novel deterministic algorithms, we also introduce and study stochastic extensions of these algorithms through a perturbation perspective. Our results in this part include almost sure convergence of all the same quantities as in the deterministic setting, with rates as well. Finally, we tackle new problems that are only accessible through the relaxed assumptions our algorithms allow. We demonstrate numerical efficiency and verify our theoretical results on problems such as low-rank, sparse matrix completion, inverse problems on the simplex, and entropically regularized Wasserstein inverse problems.
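Since the abstract's central relaxation is to swap the Euclidean energy for a Bregman divergence, the standard definition may help the reader; the generating function φ below is generic and the notation is ours:

```latex
% Bregman divergence generated by a differentiable convex function \varphi
% (notation ours); choosing \varphi = \tfrac12\|\cdot\|^2 recovers the
% Euclidean energy \tfrac12\|x - y\|^2 that the relaxed assumptions replace.
\[
  D_{\varphi}(x, y) = \varphi(x) - \varphi(y) - \langle \nabla\varphi(y),\, x - y \rangle .
\]
```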
Nenna, Luca. "Numerical Methods for Multi-Marginal Optimal Transportation". Thesis, Paris Sciences et Lettres (ComUE), 2016. http://www.theses.fr/2016PSLED017/document.
In this thesis we aim at giving a general numerical framework to approximate solutions to optimal transport (OT) problems. The general idea is to introduce an entropic regularization of the initial problems. The regularized problem corresponds to the minimization of a relative entropy with respect to a given reference measure. Indeed, this is equivalent to finding the projection of the joint coupling with respect to the Kullback-Leibler divergence. This allows us to make use of the Bregman/Dykstra algorithm and to solve several variational problems related to OT. We are especially interested in solving multi-marginal optimal transport problems (MMOT) arising in physics, such as in fluid dynamics (e.g. incompressible Euler equations à la Brenier) and in quantum physics (e.g. Density Functional Theory). In these cases we show that the entropic regularization plays a more important role than a simple numerical stabilization. Moreover, we also give some important results concerning the existence and characterization of optimal transport maps (e.g. fractal maps) for MMOT.
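As a concrete illustration of the KL-projection viewpoint described in this abstract, the following NumPy sketch runs Sinkhorn iterations for the two-marginal entropic OT problem, i.e. alternating Bregman (KL) projections onto the two marginal constraints. The names (sinkhorn, a, b, C, eps) are illustrative and this is not the thesis's multi-marginal solver.

```python
import numpy as np


def sinkhorn(a, b, C, eps=1e-2, n_iter=1000):
    """Entropic optimal transport between histograms a and b with ground cost C,
    computed by alternating KL (Bregman) projections onto the two marginal
    constraints.  Returns the regularized transport plan."""
    K = np.exp(-C / eps)          # Gibbs kernel of the reference measure
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(n_iter):
        u = a / (K @ v)           # enforce the first marginal:  pi 1 = a
        v = b / (K.T @ u)         # enforce the second marginal: pi^T 1 = b
    return u[:, None] * K * v[None, :]
```

In the multi-marginal problems the abstract targets, the plan becomes an N-way tensor and each Bregman step (or Dykstra step, when additional constraints are present) updates one scaling vector at a time.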
Book chapters on the topic "Wasserstein barycenters"
Bouchet, Pierre-Yves, Stefano Gualandi, and Louis-Martin Rousseau. "Primal Heuristics for Wasserstein Barycenters". In Integration of Constraint Programming, Artificial Intelligence, and Operations Research, 239–55. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-58942-4_16.
Trinh, Thanh-Son. "Wasserstein Barycenters Over Heisenberg Group". In Algorithms for Intelligent Systems, 273–79. Singapore: Springer Nature Singapore, 2024. http://dx.doi.org/10.1007/978-981-99-8976-8_24.
Auricchio, Gennaro, Federico Bassetti, Stefano Gualandi, and Marco Veneroni. "Computing Wasserstein Barycenters via Linear Programming". In Integration of Constraint Programming, Artificial Intelligence, and Operations Research, 355–63. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-19212-9_23.
Cazelles, Elsa, Jérémie Bigot, and Nicolas Papadakis. "Regularized Barycenters in the Wasserstein Space". In Lecture Notes in Computer Science, 83–90. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-68445-1_10.
Hu, François, Philipp Ratz, and Arthur Charpentier. "Fairness in Multi-Task Learning via Wasserstein Barycenters". In Machine Learning and Knowledge Discovery in Databases: Research Track, 295–312. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-43415-0_18.
Le Gouic, Thibaut, and Jean-Michel Loubes. "Barycenter in Wasserstein Spaces: Existence and Consistency". In Lecture Notes in Computer Science, 104–8. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-25040-3_12.
Nadeem, Saad, Travis Hollmann, and Allen Tannenbaum. "Multimarginal Wasserstein Barycenter for Stain Normalization and Augmentation". In Medical Image Computing and Computer Assisted Intervention – MICCAI 2020, 362–71. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-59722-1_35.
Rabin, Julien, Gabriel Peyré, Julie Delon, and Marc Bernot. "Wasserstein Barycenter and Its Application to Texture Mixing". In Lecture Notes in Computer Science, 435–46. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-24785-9_37.
Wang, Xu, Jiawei Huang, Qingyuan Yang, and Jinpeng Zhang. "On Robust Wasserstein Barycenter: The Model and Algorithm". In Proceedings of the 2024 SIAM International Conference on Data Mining (SDM), 235–43. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2024. http://dx.doi.org/10.1137/1.9781611978032.27.
Jin, Cong, Zhongtong Li, Yuanyuan Sun, Haiyin Zhang, Xin Lv, Jianguang Li, and Shouxun Liu. "An Integrated Processing Method Based on Wasserstein Barycenter Algorithm for Automatic Music Transcription". In Communications and Networking, 230–40. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-41117-6_19.
Testo completoAtti di convegni sul tema "Wasserstein barycenters"
Simon, Dror, and Aviad Aberdam. "Barycenters of Natural Images - Constrained Wasserstein Barycenters for Image Morphing". In 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2020. http://dx.doi.org/10.1109/cvpr42600.2020.00793.
Simou, Effrosyni, and Pascal Frossard. "Graph Signal Representation with Wasserstein Barycenters". In ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2019. http://dx.doi.org/10.1109/icassp.2019.8683335.
Uribe, Cesar A., Darina Dvinskikh, Pavel Dvurechensky, Alexander Gasnikov, and Angelia Nedic. "Distributed Computation of Wasserstein Barycenters Over Networks". In 2018 IEEE Conference on Decision and Control (CDC). IEEE, 2018. http://dx.doi.org/10.1109/cdc.2018.8619160.
Ouyang, Jihong, Yiming Wang, Ximing Li, and Changchun Li. "Weakly-supervised Text Classification with Wasserstein Barycenters Regularization". In Thirty-First International Joint Conference on Artificial Intelligence {IJCAI-22}. California: International Joint Conferences on Artificial Intelligence Organization, 2022. http://dx.doi.org/10.24963/ijcai.2022/468.
Arque, Ferran, Cesar Uribe, and Carlos Ocampo-Martinez. "Computation of Discrete Flows Over Networks via Constrained Wasserstein Barycenters". In LatinX in AI at International Conference on Machine Learning 2021. Journal of LatinX in AI Research, 2021. http://dx.doi.org/10.52591/202107244.
Arque, Ferran, Cesar Uribe, and Carlos Ocampo-Martinez. "Computation of Discrete Flows Over Networks via Constrained Wasserstein Barycenters". In LatinX in AI at International Conference on Machine Learning 2021. Journal of LatinX in AI Research, 2021. http://dx.doi.org/10.52591/lxai202107244.
Colombo, Pierre, Guillaume Staerman, Chloé Clavel, and Pablo Piantanida. "Automatic Text Evaluation through the Lens of Wasserstein Barycenters". In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, PA, USA: Association for Computational Linguistics, 2021. http://dx.doi.org/10.18653/v1/2021.emnlp-main.817.
Montesuma, Eduardo F., and Fred-Maurice Ngole Mboula. "Wasserstein Barycenter Transport for Acoustic Adaptation". In ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2021. http://dx.doi.org/10.1109/icassp39728.2021.9414199.
Lian, Xin, Kshitij Jain, Jakub Truszkowski, Pascal Poupart, and Yaoliang Yu. "Unsupervised Multilingual Alignment using Wasserstein Barycenter". In Twenty-Ninth International Joint Conference on Artificial Intelligence and Seventeenth Pacific Rim International Conference on Artificial Intelligence {IJCAI-PRICAI-20}. California: International Joint Conferences on Artificial Intelligence Organization, 2020. http://dx.doi.org/10.24963/ijcai.2020/512.
Montesuma, Eduardo Fernandes, and Fred Maurice Ngole Mboula. "Wasserstein Barycenter for Multi-Source Domain Adaptation". In 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2021. http://dx.doi.org/10.1109/cvpr46437.2021.01651.