A ready-made bibliography on the topic "Barycentres de Wasserstein"
Create accurate references in APA, MLA, Chicago, Harvard, and many other citation styles
Browse lists of current articles, books, dissertations, abstracts, and other scholarly sources on the topic "Barycentres de Wasserstein".
An "Add to bibliography" button is available next to each work in the list. Use it, and we will automatically generate a bibliographic reference to the selected work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of a scholarly publication as a ".pdf" file and read its abstract online, when the relevant details are available in the publication's metadata.
Journal articles on the topic "Barycentres de Wasserstein"
Beier, Florian, Robert Beinert, and Gabriele Steidl. "Multi-marginal Gromov–Wasserstein transport and barycentres". Information and Inference: A Journal of the IMA 12, no. 4 (September 18, 2023): 2720–52. http://dx.doi.org/10.1093/imaiai/iaad041.
Chi, Jinjin, Zhiyao Yang, Ximing Li, Jihong Ouyang, and Renchu Guan. "Variational Wasserstein Barycenters with C-cyclical Monotonicity Regularization". Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 6 (June 26, 2023): 7157–65. http://dx.doi.org/10.1609/aaai.v37i6.25873.
Xu, Hongteng, Dixin Luo, Lawrence Carin, and Hongyuan Zha. "Learning Graphons via Structured Gromov-Wasserstein Barycenters". Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 12 (May 18, 2021): 10505–13. http://dx.doi.org/10.1609/aaai.v35i12.17257.
Bigot, Jérémie, and Thierry Klein. "Characterization of barycenters in the Wasserstein space by averaging optimal transport maps". ESAIM: Probability and Statistics 22 (2018): 35–57. http://dx.doi.org/10.1051/ps/2017020.
Sow, Babacar, Rodolphe Le Riche, Julien Pelamatti, Merlin Keller, and Sanaa Zannane. "Wasserstein-Based Evolutionary Operators for Optimizing Sets of Points: Application to Wind-Farm Layout Design". Applied Sciences 14, no. 17 (September 5, 2024): 7916. http://dx.doi.org/10.3390/app14177916.
Bigot, Jérémie, Elsa Cazelles, and Nicolas Papadakis. "Data-driven regularization of Wasserstein barycenters with an application to multivariate density registration". Information and Inference: A Journal of the IMA 8, no. 4 (November 30, 2019): 719–55. http://dx.doi.org/10.1093/imaiai/iaz023.
Xu, Hongteng. "Gromov-Wasserstein Factorization Models for Graph Clustering". Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (April 3, 2020): 6478–85. http://dx.doi.org/10.1609/aaai.v34i04.6120.
Bonneel, Nicolas, Gabriel Peyré, and Marco Cuturi. "Wasserstein barycentric coordinates". ACM Transactions on Graphics 35, no. 4 (July 11, 2016): 1–10. http://dx.doi.org/10.1145/2897824.2925918.
Xiang, Yue, Dixin Luo, and Hongteng Xu. "Privacy-Preserved Evolutionary Graph Modeling via Gromov-Wasserstein Autoregression". Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 12 (June 26, 2023): 14566–74. http://dx.doi.org/10.1609/aaai.v37i12.26703.
Agueh, Martial, and Guillaume Carlier. "Barycenters in the Wasserstein Space". SIAM Journal on Mathematical Analysis 43, no. 2 (January 2011): 904–24. http://dx.doi.org/10.1137/100805741.
Doctoral dissertations on the topic "Barycentres de Wasserstein"
Fernandes Montesuma, Eduardo. "Multi-Source Domain Adaptation through Wasserstein Barycenters". Electronic Thesis or Diss., université Paris-Saclay, 2024. http://www.theses.fr/2024UPASG045.
Machine learning systems work under the assumption that training and test conditions are uniform, i.e., they do not change. However, this hypothesis is seldom met in practice, so the system is trained on data that is no longer representative of the data it will be tested on. This case corresponds to a shift in the probability measure generating the data, known in the literature as distributional shift between two domains: a source and a target. A straightforward generalization of this problem arises when the training data itself exhibits shifts; in this case, one considers Multi-Source Domain Adaptation (MSDA). In this context, optimal transport is a useful field of mathematics: it serves as a toolbox for comparing and manipulating probability measures. This thesis studies the contributions of optimal transport to multi-source domain adaptation. We do so through Wasserstein barycenters, an object that defines a weighted average, in the space of probability measures, of the multiple domains in MSDA. Based on this concept, we propose: (i) a novel notion of barycenter when the measures at hand are equipped with labels, (ii) a novel dictionary learning problem over empirical probability measures, and (iii) new tools for domain adaptation through the optimal transport of Gaussian mixture models. Through our methods, we are able to improve domain adaptation performance in comparison with previous optimal transport-based methods on image and cross-domain fault diagnosis benchmarks. Our work opens an interesting research direction on learning the barycentric hull of probability measures.
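The "weighted average in the space of probability measures" that this abstract refers to has a simple closed form in one special case: for one-dimensional Gaussians, the W2 barycenter is again Gaussian, with the weighted mean and the weighted standard deviation (consistent with the Agueh–Carlier characterization cited above). A minimal sketch; the function name is an illustrative assumption, not from the thesis:

```python
def gaussian_w2_barycenter(means, stds, weights):
    """W2 barycenter of 1D Gaussians N(m_k, s_k^2) with weights w_k.

    In one dimension the barycenter is again Gaussian: its mean is the
    weighted mean and its standard deviation the weighted standard deviation.
    """
    mean = sum(w * m for w, m in zip(weights, means))
    std = sum(w * s for w, s in zip(weights, stds))
    return mean, std

# Equal-weight barycenter of N(0, 1) and N(4, 9) is N(2, 4).
m, s = gaussian_w2_barycenter([0.0, 4.0], [1.0, 3.0], [0.5, 0.5])
```

Note that this differs from the Euclidean mixture 0.5·N(0, 1) + 0.5·N(4, 9), which would be bimodal: the barycenter interpolates the shapes of the measures rather than averaging their densities.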
Cazelles, Elsa. "Statistical properties of barycenters in the Wasserstein space and fast algorithms for optimal transport of measures". Thesis, Bordeaux, 2018. http://www.theses.fr/2018BORD0125/document.
This thesis focuses on the analysis of data in the form of probability measures on R^d. The aim is to provide a better understanding of the usual statistical tools on this space endowed with the Wasserstein distance. First-order statistical analysis is a natural notion to consider, consisting of the study of the Fréchet mean (or barycenter). In particular, we focus on the case of discrete data (or observations) sampled from probability measures that are absolutely continuous (a.c.) with respect to the Lebesgue measure. We thus introduce an estimator of the barycenter of random measures, penalized by a convex function, making it possible to enforce its absolute continuity. Another estimator is regularized by adding entropy when computing the Wasserstein distance. We are particularly interested in controlling the variance of these estimators. Thanks to these results, the principle of Goldenshluger and Lepski allows us to obtain an automatic calibration of the regularization parameters. We then apply this work to the registration of multivariate densities, especially for flow cytometry data. We also propose a test statistic that can compare two multivariate distributions efficiently in terms of computational time. Finally, we perform a second-order statistical analysis to extract the global geometric tendency of a dataset, also called the main modes of variation. For that purpose, we propose algorithms allowing one to carry out geodesic principal component analysis in the Wasserstein space.
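Entropy-regularized barycenters of the kind this abstract discusses are typically computed, for discrete measures on a fixed grid, by iterative Bregman projections over a Gibbs kernel. The sketch below is a generic illustration of that scheme (in the style of Benamou et al.), not the thesis's penalized estimator; the function name, toy grid, and parameter defaults are assumptions:

```python
import math

def entropic_barycenter(mus, cost, weights, eps=2.0, iters=200):
    """Entropy-regularized Wasserstein barycenter of discrete measures on a
    shared n-point support, via iterative Bregman projections."""
    n = len(cost)
    m = len(mus)
    # Gibbs kernel of the regularized transport problem.
    K = [[math.exp(-cost[i][j] / eps) for j in range(n)] for i in range(n)]
    v = [[1.0] * n for _ in range(m)]
    bar = [1.0 / n] * n
    for _ in range(iters):
        KTu = []
        for k in range(m):
            # Rescale each plan so that its first marginal matches mus[k].
            Kv = [sum(K[i][j] * v[k][j] for j in range(n)) for i in range(n)]
            u = [mus[k][i] / Kv[i] for i in range(n)]
            KTu.append([sum(K[i][j] * u[i] for i in range(n)) for j in range(n)])
        # The barycenter iterate is the weighted geometric mean of the free marginals.
        bar = [math.exp(sum(w * math.log(KTu[k][j]) for k, w in enumerate(weights)))
               for j in range(n)]
        for k in range(m):
            v[k] = [bar[j] / KTu[k][j] for j in range(n)]
    z = sum(bar)  # normalize away residual numerical mass drift
    return [b / z for b in bar]

# Barycenter of two point masses at the ends of a 5-point grid: a smoothed
# bump centered in the middle, illustrating the entropic blur.
grid_cost = [[(i - j) ** 2 for j in range(5)] for i in range(5)]
b = entropic_barycenter([[1, 0, 0, 0, 0], [0, 0, 0, 0, 1]], grid_cost, [0.5, 0.5])
```

The entropic blur visible here is exactly why the thesis studies how to calibrate the regularization parameter (here `eps`) automatically.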
Le Gouic, Thibaut. "Localisation de masse et espaces de Wasserstein". Toulouse 3, 2013. http://thesesups.ups-tlse.fr/2163/.
The study in this manuscript is based on two distinct tools: packing and Wasserstein spaces. A first part focuses on the localization of a probability measure μ. For a regular μ, the level sets of its density are a good notion for localizing where the measure is dense, but they lose their meaning for a finitely supported measure such as the empirical measure. We thus define a function τ, called the size function, on closed sets, based on the packing of the sets. The sets of smallest τ-size with a given probability 1 − α localize dense areas, even if μ is not regular. We show that these smallest sets, given μ and α, depend continuously on μ and α for the Hausdorff distance. We derive a new method to quantize μ in a robust and stable way. A second part focuses on the Wasserstein distance between a probability measure and the associated empirical measure. We obtain a non-asymptotic upper bound on the expectation of this distance, for an arbitrary underlying metric space. An application of the result to finite-dimensional spaces shows the accuracy of the bound. We also obtain new bounds for the case of Gaussian measures on Banach spaces that coincide asymptotically with the best possible quantizers. Using concentration inequalities, we show deviation bounds. Finally, we use these results to define non-asymptotic and non-parametric statistical tests of goodness of fit to a family of probability measures. A third part focuses on the barycenter of a finite family of probability measures. The Fréchet mean extends the notion of barycenter to metric spaces and gives us a way to define barycenters in Wasserstein spaces. We show the existence of these barycenters and then study continuity properties with respect to the family of measures. We then discuss practical applications in the aggregation of empirical measures and texture mixing.
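The empirical Wasserstein distances that the second part bounds are easy to experiment with on the real line, where the optimal coupling between two empirical measures with equally many atoms is the monotone one: pair sorted samples. A small sketch with an assumed function name:

```python
def w1_1d(xs, ys):
    """W1 distance between two empirical measures on the line with the same
    number of atoms: pair sorted samples (the monotone coupling is optimal
    in one dimension)."""
    assert len(xs) == len(ys)
    return sum(abs(a - b) for a, b in zip(sorted(xs), sorted(ys))) / len(xs)

# Shifting every atom by 1 moves the empirical measure by exactly 1 in W1,
# while a permutation of the same atoms costs nothing.
d_shift = w1_1d([1.0, 2.0, 3.0], [2.0, 3.0, 4.0])
d_perm = w1_1d([0.0, 1.0], [1.0, 0.0])
```

Sampling `ys` repeatedly from a fixed distribution and comparing against a large reference sample would illustrate numerically the decay of E[W1(μ_n, μ)] with n that the manuscript bounds non-asymptotically.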
Silveti-Falls, Antonio. "First-order noneuclidean splitting methods for large-scale optimization : deterministic and stochastic algorithms". Thesis, Normandie, 2021. http://www.theses.fr/2021NORMC204.
In this work we develop and examine two novel first-order splitting algorithms for solving large-scale composite optimization problems in infinite-dimensional spaces. Such problems are ubiquitous in many areas of science and engineering, particularly in data science and imaging sciences. Our work focuses on relaxing the Lipschitz-smoothness assumptions generally required by first-order splitting algorithms by replacing the Euclidean energy with a Bregman divergence. These developments allow one to solve problems with more exotic geometry than that of the usual Euclidean setting. One algorithm is a hybridization of the conditional gradient algorithm, making use of a linear minimization oracle at each iteration, with an augmented Lagrangian algorithm, allowing for affine constraints. The other algorithm is a primal-dual splitting algorithm incorporating Bregman divergences for computing the associated proximal operators. For both of these algorithms, our analysis shows convergence of the Lagrangian values, subsequential weak convergence of the iterates to solutions, and rates of convergence. In addition to these novel deterministic algorithms, we also introduce and study stochastic extensions of them through a perturbation perspective. Our results in this part include almost sure convergence of the same quantities as in the deterministic setting, with rates as well. Finally, we tackle new problems that are only accessible through the relaxed assumptions our algorithms allow. We demonstrate numerical efficiency and verify our theoretical results on problems such as low-rank, sparse matrix completion, inverse problems on the simplex, and entropically regularized Wasserstein inverse problems.
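Replacing the Euclidean energy by a Bregman divergence can be illustrated on the probability simplex, where the entropic geometry turns the usual proximal gradient step into a multiplicative update (mirror descent). This is a generic sketch of the idea, not one of the thesis's algorithms; the function name and toy objective are assumptions:

```python
import math

def mirror_step_simplex(x, grad, step):
    """One mirror-descent step on the probability simplex with the entropic
    Bregman geometry: a multiplicative update followed by renormalization,
    which keeps every iterate a probability vector with no projection needed."""
    y = [xi * math.exp(-step * gi) for xi, gi in zip(x, grad)]
    z = sum(y)
    return [yi / z for yi in y]

# Minimizing the linear cost <c, x> with c = (1, 0) over the simplex:
# mass flows to the second coordinate while feasibility is automatic.
x = [0.5, 0.5]
for _ in range(50):
    x = mirror_step_simplex(x, [1.0, 0.0], 1.0)
```

The Euclidean proximal step would require an explicit projection onto the simplex; choosing the entropy as the distance-generating function makes the constraint geometry native to the update.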
Nenna, Luca. "Numerical Methods for Multi-Marginal Optimal Transportation". Thesis, Paris Sciences et Lettres (ComUE), 2016. http://www.theses.fr/2016PSLED017/document.
In this thesis we aim at giving a general numerical framework to approximate solutions to optimal transport (OT) problems. The general idea is to introduce an entropic regularization of the initial problems. The regularized problem corresponds to the minimization of a relative entropy with respect to a given reference measure. Indeed, this is equivalent to finding the projection of the joint coupling with respect to the Kullback-Leibler divergence. This allows us to make use of the Bregman/Dykstra algorithm and solve several variational problems related to OT. We are especially interested in solving multi-marginal optimal transport problems (MMOT) arising in physics, such as in fluid dynamics (e.g., incompressible Euler equations à la Brenier) and in quantum physics (e.g., Density Functional Theory). In these cases we show that the entropic regularization plays a more important role than a simple numerical stabilization. Moreover, we also give some important results concerning the existence and characterization of optimal transport maps (e.g., fractal maps) for MMOT.
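For two marginals, the alternating KL projections this abstract describes reduce to the classical Sinkhorn scaling of the Gibbs kernel. A small self-contained sketch of that special case (the function name, toy data, and parameter defaults are assumptions):

```python
import math

def sinkhorn(mu, nu, cost, eps=0.1, iters=500):
    """Entropic OT via Sinkhorn scaling: alternately rescale the rows and
    columns of the Gibbs kernel K = exp(-cost/eps) to match the marginals.
    Each half-step is a KL (Bregman) projection onto one marginal constraint."""
    n, m = len(mu), len(nu)
    K = [[math.exp(-cost[i][j] / eps) for j in range(m)] for i in range(n)]
    u, v = [1.0] * n, [1.0] * m
    for _ in range(iters):
        u = [mu[i] / sum(K[i][j] * v[j] for j in range(m)) for i in range(n)]
        v = [nu[j] / sum(K[i][j] * u[i] for i in range(n)) for j in range(m)]
    # The (approximate) optimal transport plan diag(u) K diag(v).
    return [[u[i] * K[i][j] * v[j] for j in range(m)] for i in range(n)]

# Two uniform marginals with a cost that is zero on the diagonal: the plan
# concentrates near the identity coupling, softened by the entropy term.
plan = sinkhorn([0.5, 0.5], [0.5, 0.5], [[0.0, 1.0], [1.0, 0.0]])
```

The multi-marginal problems studied in the thesis generalize this scheme by cycling KL projections over several marginal constraints rather than two.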
Book chapters on the topic "Barycentres de Wasserstein"
Bouchet, Pierre-Yves, Stefano Gualandi, and Louis-Martin Rousseau. "Primal Heuristics for Wasserstein Barycenters". In Integration of Constraint Programming, Artificial Intelligence, and Operations Research, 239–55. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-58942-4_16.
Trinh, Thanh-Son. "Wasserstein Barycenters Over Heisenberg Group". In Algorithms for Intelligent Systems, 273–79. Singapore: Springer Nature Singapore, 2024. http://dx.doi.org/10.1007/978-981-99-8976-8_24.
Auricchio, Gennaro, Federico Bassetti, Stefano Gualandi, and Marco Veneroni. "Computing Wasserstein Barycenters via Linear Programming". In Integration of Constraint Programming, Artificial Intelligence, and Operations Research, 355–63. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-19212-9_23.
Cazelles, Elsa, Jérémie Bigot, and Nicolas Papadakis. "Regularized Barycenters in the Wasserstein Space". In Lecture Notes in Computer Science, 83–90. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-68445-1_10.
Le Gouic, Thibaut, and Jean-Michel Loubes. "Barycenter in Wasserstein Spaces: Existence and Consistency". In Lecture Notes in Computer Science, 104–8. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-25040-3_12.
Hu, François, Philipp Ratz, and Arthur Charpentier. "Fairness in Multi-Task Learning via Wasserstein Barycenters". In Machine Learning and Knowledge Discovery in Databases: Research Track, 295–312. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-43415-0_18.
Nadeem, Saad, Travis Hollmann, and Allen Tannenbaum. "Multimarginal Wasserstein Barycenter for Stain Normalization and Augmentation". In Medical Image Computing and Computer Assisted Intervention – MICCAI 2020, 362–71. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-59722-1_35.
Rabin, Julien, Gabriel Peyré, Julie Delon, and Marc Bernot. "Wasserstein Barycenter and Its Application to Texture Mixing". In Lecture Notes in Computer Science, 435–46. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-24785-9_37.
Wang, Xu, Jiawei Huang, Qingyuan Yang, and Jinpeng Zhang. "On Robust Wasserstein Barycenter: The Model and Algorithm". In Proceedings of the 2024 SIAM International Conference on Data Mining (SDM), 235–43. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2024. http://dx.doi.org/10.1137/1.9781611978032.27.
Jin, Cong, Zhongtong Li, Yuanyuan Sun, Haiyin Zhang, Xin Lv, Jianguang Li, and Shouxun Liu. "An Integrated Processing Method Based on Wasserstein Barycenter Algorithm for Automatic Music Transcription". In Communications and Networking, 230–40. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-41117-6_19.
Pełny tekst źródłaStreszczenia konferencji na temat "Barycentres de Wasserstein"
Tang, Hanning, Xiaojing Shen, Hua Zhao, Zhiguo Wang, and Pramod K. Varshney. "Bures-Wasserstein Barycentric Coordinates with Application to Diffusion Tensor Image Smoothing". In 2024 27th International Conference on Information Fusion (FUSION), 1–8. IEEE, 2024. http://dx.doi.org/10.23919/fusion59988.2024.10706482.
Simon, Dror, and Aviad Aberdam. "Barycenters of Natural Images - Constrained Wasserstein Barycenters for Image Morphing". In 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2020. http://dx.doi.org/10.1109/cvpr42600.2020.00793.
Simou, Effrosyni, and Pascal Frossard. "Graph Signal Representation with Wasserstein Barycenters". In ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2019. http://dx.doi.org/10.1109/icassp.2019.8683335.
Ouyang, Jihong, Yiming Wang, Ximing Li, and Changchun Li. "Weakly-supervised Text Classification with Wasserstein Barycenters Regularization". In Thirty-First International Joint Conference on Artificial Intelligence {IJCAI-22}. California: International Joint Conferences on Artificial Intelligence Organization, 2022. http://dx.doi.org/10.24963/ijcai.2022/468.
Uribe, Cesar A., Darina Dvinskikh, Pavel Dvurechensky, Alexander Gasnikov, and Angelia Nedic. "Distributed Computation of Wasserstein Barycenters Over Networks". In 2018 IEEE Conference on Decision and Control (CDC). IEEE, 2018. http://dx.doi.org/10.1109/cdc.2018.8619160.
Arque, Ferran, Cesar Uribe, and Carlos Ocampo-Martinez. "Computation of Discrete Flows Over Networks via Constrained Wasserstein Barycenters". In LatinX in AI at International Conference on Machine Learning 2021. Journal of LatinX in AI Research, 2021. http://dx.doi.org/10.52591/202107244.
Montesuma, Eduardo F., and Fred-Maurice Ngole Mboula. "Wasserstein Barycenter Transport for Acoustic Adaptation". In ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2021. http://dx.doi.org/10.1109/icassp39728.2021.9414199.
Lian, Xin, Kshitij Jain, Jakub Truszkowski, Pascal Poupart, and Yaoliang Yu. "Unsupervised Multilingual Alignment using Wasserstein Barycenter". In Twenty-Ninth International Joint Conference on Artificial Intelligence and Seventeenth Pacific Rim International Conference on Artificial Intelligence {IJCAI-PRICAI-20}. California: International Joint Conferences on Artificial Intelligence Organization, 2020. http://dx.doi.org/10.24963/ijcai.2020/512.
Montesuma, Eduardo Fernandes, and Fred Maurice Ngole Mboula. "Wasserstein Barycenter for Multi-Source Domain Adaptation". In 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2021. http://dx.doi.org/10.1109/cvpr46437.2021.01651.