Selected scientific literature on the topic "Wasserstein barycenters"
Create an accurate reference in APA, MLA, Chicago, Harvard, and other citation styles
Consult the list of current articles, books, theses, conference proceedings, and other scholarly sources relevant to the topic "Wasserstein barycenters".
Journal articles on the topic "Wasserstein barycenters"
Beier, Florian, Robert Beinert, and Gabriele Steidl. "Multi-marginal Gromov–Wasserstein transport and barycentres". Information and Inference: A Journal of the IMA 12, no. 4 (September 18, 2023): 2720–52. http://dx.doi.org/10.1093/imaiai/iaad041.
Chi, Jinjin, Zhiyao Yang, Ximing Li, Jihong Ouyang, and Renchu Guan. "Variational Wasserstein Barycenters with C-cyclical Monotonicity Regularization". Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 6 (June 26, 2023): 7157–65. http://dx.doi.org/10.1609/aaai.v37i6.25873.
Xu, Hongteng, Dixin Luo, Lawrence Carin, and Hongyuan Zha. "Learning Graphons via Structured Gromov-Wasserstein Barycenters". Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 12 (May 18, 2021): 10505–13. http://dx.doi.org/10.1609/aaai.v35i12.17257.
Bigot, Jérémie, and Thierry Klein. "Characterization of barycenters in the Wasserstein space by averaging optimal transport maps". ESAIM: Probability and Statistics 22 (2018): 35–57. http://dx.doi.org/10.1051/ps/2017020.
Sow, Babacar, Rodolphe Le Riche, Julien Pelamatti, Merlin Keller, and Sanaa Zannane. "Wasserstein-Based Evolutionary Operators for Optimizing Sets of Points: Application to Wind-Farm Layout Design". Applied Sciences 14, no. 17 (September 5, 2024): 7916. http://dx.doi.org/10.3390/app14177916.
Bigot, Jérémie, Elsa Cazelles, and Nicolas Papadakis. "Data-driven regularization of Wasserstein barycenters with an application to multivariate density registration". Information and Inference: A Journal of the IMA 8, no. 4 (November 30, 2019): 719–55. http://dx.doi.org/10.1093/imaiai/iaz023.
Xu, Hongteng. "Gromov-Wasserstein Factorization Models for Graph Clustering". Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (April 3, 2020): 6478–85. http://dx.doi.org/10.1609/aaai.v34i04.6120.
Bonneel, Nicolas, Gabriel Peyré, and Marco Cuturi. "Wasserstein barycentric coordinates". ACM Transactions on Graphics 35, no. 4 (July 11, 2016): 1–10. http://dx.doi.org/10.1145/2897824.2925918.
Xiang, Yue, Dixin Luo, and Hongteng Xu. "Privacy-Preserved Evolutionary Graph Modeling via Gromov-Wasserstein Autoregression". Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 12 (June 26, 2023): 14566–74. http://dx.doi.org/10.1609/aaai.v37i12.26703.
Agueh, Martial, and Guillaume Carlier. "Barycenters in the Wasserstein Space". SIAM Journal on Mathematical Analysis 43, no. 2 (January 2011): 904–24. http://dx.doi.org/10.1137/100805741.
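As context for many of the entries above: the object defined in Agueh and Carlier's foundational paper is the Fréchet mean of measures $\mu_1, \dots, \mu_N$ with weights $\lambda_i$ in the 2-Wasserstein metric,

```latex
\bar{\nu} \;\in\; \operatorname*{arg\,min}_{\nu \in \mathcal{P}_2(\mathbb{R}^d)}
  \sum_{i=1}^{N} \lambda_i \, W_2^2(\mu_i, \nu),
\qquad \lambda_i \ge 0, \quad \sum_{i=1}^{N} \lambda_i = 1,
```

where $W_2$ denotes the 2-Wasserstein distance.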
Theses / dissertations on the topic "Wasserstein barycenters"
Montesuma, Eduardo Fernandes. "Multi-Source Domain Adaptation through Wasserstein Barycenters". Electronic Thesis or Diss., université Paris-Saclay, 2024. http://www.theses.fr/2024UPASG045.
Machine learning systems work under the assumption that training and test conditions are uniform, i.e., that they do not change. This hypothesis is seldom met in practice: the system is then trained on data that are no longer representative of the data it will be tested on. This corresponds to a shift in the probability measure generating the data, known in the literature as distributional shift between two domains: a source and a target. A natural generalization of this problem arises when the training data itself exhibits shifts; one then considers Multi-Source Domain Adaptation (MSDA). In this context, optimal transport is a useful branch of mathematics: it serves as a toolbox for comparing and manipulating probability measures. This thesis studies the contributions of optimal transport to multi-source domain adaptation. We do so through Wasserstein barycenters, an object that defines a weighted average, in the space of probability measures, of the multiple domains in MSDA. Based on this concept, we propose: (i) a novel notion of barycenter for measures equipped with labels, (ii) a novel dictionary learning problem over empirical probability measures, and (iii) new tools for domain adaptation through the optimal transport of Gaussian mixture models. With our methods, we improve domain adaptation performance over previous optimal transport-based methods on image and cross-domain fault diagnosis benchmarks. Our work opens an interesting research direction: learning the barycentric hull of probability measures.
Cazelles, Elsa. "Statistical properties of barycenters in the Wasserstein space and fast algorithms for optimal transport of measures". Thesis, Bordeaux, 2018. http://www.theses.fr/2018BORD0125/document.
This thesis focuses on the analysis of data in the form of probability measures on R^d. The aim is to provide a better understanding of the usual statistical tools on this space endowed with the Wasserstein distance. First-order statistical analysis is a natural notion to consider, consisting of the study of the Fréchet mean (or barycenter). In particular, we focus on the case of discrete data (or observations) sampled from probability measures that are absolutely continuous (a.c.) with respect to the Lebesgue measure. We introduce an estimator of the barycenter of random measures, penalized by a convex function, which makes it possible to enforce its absolute continuity. Another estimator is regularized by adding an entropy term when computing the Wasserstein distance. We are particularly interested in controlling the variance of these estimators. Thanks to these results, the principle of Goldenshluger and Lepski allows us to obtain an automatic calibration of the regularization parameters. We then apply this work to the registration of multivariate densities, especially for flow cytometry data. We also propose a test statistic that compares two multivariate distributions efficiently in terms of computational time. Finally, we perform a second-order statistical analysis to extract the global geometric tendency of a dataset, also called its main modes of variation. For that purpose, we propose algorithms for carrying out geodesic principal component analysis in the Wasserstein space.
Le Gouic, Thibaut. "Localisation de masse et espaces de Wasserstein". Toulouse 3, 2013. http://thesesups.ups-tlse.fr/2163/.
This manuscript is built on two distinct tools: packing and Wasserstein spaces. A first part focuses on localizing the mass of a probability measure μ. For a regular μ, the level sets of its density are a good notion for localizing where the measure is dense, but this notion loses its meaning for a finitely supported measure such as the empirical measure. We therefore define a function τ, called the size function, on closed sets, based on the packing of the sets. The sets of smallest τ-size with a given probability 1 − α localize dense areas, even if μ is not regular. We show that these smallest sets, given μ and α, depend continuously on μ and α for the Hausdorff distance. We derive a new method to quantize μ in a robust and stable way. A second part focuses on the Wasserstein distance between a probability measure and the associated empirical measure. We obtain a non-asymptotic upper bound on the expectation of this distance for an arbitrary underlying metric space. An application of the result to finite-dimensional spaces shows the accuracy of the bound. We also obtain new bounds for Gaussian measures on Banach spaces that coincide asymptotically with the best possible quantizers. Using concentration inequalities, we show deviation bounds. Finally, we use these results to define non-asymptotic and non-parametric statistical tests of goodness of fit to a family of probability measures. A third part focuses on the barycenter of a finite family of probability measures. The Fréchet mean extends the notion of barycenter to metric spaces, and gives us a way to define barycenters in Wasserstein spaces. We show the existence of these barycenters and then study continuity properties with respect to the family of measures. We then discuss practical applications in the aggregation of empirical measures and in texture mixing.
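The convergence of the empirical measure discussed in this abstract is easy to observe numerically in one dimension. Below is a minimal sketch, not taken from the thesis: it uses `scipy.stats.wasserstein_distance`, a standard normal reference approximated by a dense quantile grid, and illustrative sample sizes.

```python
import numpy as np
from scipy.stats import norm, wasserstein_distance

rng = np.random.default_rng(0)

# A dense quantile grid stands in for the "true" standard normal measure.
reference = norm.ppf(np.linspace(0.0005, 0.9995, 2000))

def empirical_w1(n):
    """W1 distance between the empirical measure of n samples and the reference."""
    return wasserstein_distance(rng.standard_normal(n), reference)

d_small = empirical_w1(100)     # few samples: larger distance
d_large = empirical_w1(10_000)  # many samples: distance shrinks, roughly like n^(-1/2) in 1-D
```

Running this shows `d_large` well below `d_small`, matching the non-asymptotic rates the thesis studies in far greater generality.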
Silveti-Falls, Antonio. "First-order noneuclidean splitting methods for large-scale optimization : deterministic and stochastic algorithms". Thesis, Normandie, 2021. http://www.theses.fr/2021NORMC204.
In this work we develop and examine two novel first-order splitting algorithms for solving large-scale composite optimization problems in infinite-dimensional spaces. Such problems are ubiquitous in many areas of science and engineering, particularly in data science and imaging. Our work focuses on relaxing the Lipschitz-smoothness assumptions generally required by first-order splitting algorithms by replacing the Euclidean energy with a Bregman divergence. These developments allow one to solve problems with more exotic geometry than the usual Euclidean setting. One algorithm is a hybridization of the conditional gradient algorithm, which uses a linear minimization oracle at each iteration, with an augmented Lagrangian algorithm, allowing for affine constraints. The other is a primal-dual splitting algorithm incorporating Bregman divergences for computing the associated proximal operators. For both algorithms, our analysis shows convergence of the Lagrangian values, subsequential weak convergence of the iterates to solutions, and rates of convergence. In addition to these deterministic algorithms, we also introduce and study stochastic extensions through a perturbation perspective; our results in this part include almost-sure convergence, with rates, for all the same quantities as in the deterministic setting. Finally, we tackle new problems that are accessible only under the relaxed assumptions our algorithms allow. We demonstrate numerical efficiency and verify our theoretical results on problems such as low-rank, sparse matrix completion, inverse problems on the simplex, and entropically regularized Wasserstein inverse problems.
Nenna, Luca. "Numerical Methods for Multi-Marginal Optimal Transportation". Thesis, Paris Sciences et Lettres (ComUE), 2016. http://www.theses.fr/2016PSLED017/document.
In this thesis we aim to give a general numerical framework for approximating solutions to optimal transport (OT) problems. The general idea is to introduce an entropic regularization of the initial problems. The regularized problem corresponds to the minimization of a relative entropy with respect to a given reference measure; indeed, this is equivalent to finding the projection of the joint coupling with respect to the Kullback–Leibler divergence. This allows us to make use of the Bregman/Dykstra algorithm and solve several variational problems related to OT. We are especially interested in solving multi-marginal optimal transport (MMOT) problems arising in physics, such as in fluid dynamics (e.g., incompressible Euler equations à la Brenier) and in quantum physics (e.g., density functional theory). In these cases we show that the entropic regularization plays a more important role than a simple numerical stabilization. Moreover, we also give some important results concerning the existence and characterization of optimal transport maps (e.g., fractal maps) for MMOT.
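In the two-marginal case, the alternating Bregman/KL projections described in this abstract reduce to the classical Sinkhorn fixed-point iteration. The following is a minimal NumPy sketch of that special case; the grid, the Gaussian-like marginals, and the regularization strength `eps` are illustrative choices, not taken from the thesis.

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.1, n_iter=1000):
    """Entropy-regularized OT between histograms a, b with cost matrix C.

    Alternates Bregman/KL projections onto the two marginal constraints,
    i.e. the classical Sinkhorn fixed-point iteration."""
    K = np.exp(-C / eps)                # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)               # project onto the second-marginal constraint
        u = a / (K @ v)                 # project onto the first-marginal constraint
    return u[:, None] * K * v[None, :]  # regularized transport plan

# Toy example: two Gaussian-like histograms on a 1-D grid.
x = np.linspace(0.0, 1.0, 50)
a = np.exp(-((x - 0.3) ** 2) / 0.01); a /= a.sum()
b = np.exp(-((x - 0.7) ** 2) / 0.01); b /= b.sum()
C = (x[:, None] - x[None, :]) ** 2      # squared-distance cost
P = sinkhorn(a, b, C)
```

By construction, the returned plan matches the first marginal exactly after the final update, while the second marginal is matched up to the convergence tolerance of the iteration.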
Book chapters on the topic "Wasserstein barycenters"
Bouchet, Pierre-Yves, Stefano Gualandi, and Louis-Martin Rousseau. "Primal Heuristics for Wasserstein Barycenters". In Integration of Constraint Programming, Artificial Intelligence, and Operations Research, 239–55. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-58942-4_16.
Trinh, Thanh-Son. "Wasserstein Barycenters Over Heisenberg Group". In Algorithms for Intelligent Systems, 273–79. Singapore: Springer Nature Singapore, 2024. http://dx.doi.org/10.1007/978-981-99-8976-8_24.
Auricchio, Gennaro, Federico Bassetti, Stefano Gualandi, and Marco Veneroni. "Computing Wasserstein Barycenters via Linear Programming". In Integration of Constraint Programming, Artificial Intelligence, and Operations Research, 355–63. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-19212-9_23.
Cazelles, Elsa, Jérémie Bigot, and Nicolas Papadakis. "Regularized Barycenters in the Wasserstein Space". In Lecture Notes in Computer Science, 83–90. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-68445-1_10.
Le Gouic, Thibaut, and Jean-Michel Loubes. "Barycenter in Wasserstein Spaces: Existence and Consistency". In Lecture Notes in Computer Science, 104–8. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-25040-3_12.
Hu, François, Philipp Ratz, and Arthur Charpentier. "Fairness in Multi-Task Learning via Wasserstein Barycenters". In Machine Learning and Knowledge Discovery in Databases: Research Track, 295–312. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-43415-0_18.
Nadeem, Saad, Travis Hollmann, and Allen Tannenbaum. "Multimarginal Wasserstein Barycenter for Stain Normalization and Augmentation". In Medical Image Computing and Computer Assisted Intervention – MICCAI 2020, 362–71. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-59722-1_35.
Rabin, Julien, Gabriel Peyré, Julie Delon, and Marc Bernot. "Wasserstein Barycenter and Its Application to Texture Mixing". In Lecture Notes in Computer Science, 435–46. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-24785-9_37.
Wang, Xu, Jiawei Huang, Qingyuan Yang, and Jinpeng Zhang. "On Robust Wasserstein Barycenter: The Model and Algorithm". In Proceedings of the 2024 SIAM International Conference on Data Mining (SDM), 235–43. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2024. http://dx.doi.org/10.1137/1.9781611978032.27.
Texto completo da fonteJin, Cong, Zhongtong Li, Yuanyuan Sun, Haiyin Zhang, Xin Lv, Jianguang Li e Shouxun Liu. "An Integrated Processing Method Based on Wasserstein Barycenter Algorithm for Automatic Music Transcription". In Communications and Networking, 230–40. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-41117-6_19.
Texto completo da fonteTrabalhos de conferências sobre o assunto "Barycentres de Wasserstein"
Tang, Hanning, Xiaojing Shen, Hua Zhao, Zhiguo Wang, and Pramod K. Varshney. "Bures-Wasserstein Barycentric Coordinates with Application to Diffusion Tensor Image Smoothing". In 2024 27th International Conference on Information Fusion (FUSION), 1–8. IEEE, 2024. http://dx.doi.org/10.23919/fusion59988.2024.10706482.
Simon, Dror, and Aviad Aberdam. "Barycenters of Natural Images - Constrained Wasserstein Barycenters for Image Morphing". In 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2020. http://dx.doi.org/10.1109/cvpr42600.2020.00793.
Simou, Effrosyni, and Pascal Frossard. "Graph Signal Representation with Wasserstein Barycenters". In ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2019. http://dx.doi.org/10.1109/icassp.2019.8683335.
Ouyang, Jihong, Yiming Wang, Ximing Li, and Changchun Li. "Weakly-supervised Text Classification with Wasserstein Barycenters Regularization". In Thirty-First International Joint Conference on Artificial Intelligence {IJCAI-22}. California: International Joint Conferences on Artificial Intelligence Organization, 2022. http://dx.doi.org/10.24963/ijcai.2022/468.
Uribe, Cesar A., Darina Dvinskikh, Pavel Dvurechensky, Alexander Gasnikov, and Angelia Nedic. "Distributed Computation of Wasserstein Barycenters Over Networks". In 2018 IEEE Conference on Decision and Control (CDC). IEEE, 2018. http://dx.doi.org/10.1109/cdc.2018.8619160.
Arque, Ferran, Cesar Uribe, and Carlos Ocampo-Martinez. "Computation of Discrete Flows Over Networks via Constrained Wasserstein Barycenters". In LatinX in AI at International Conference on Machine Learning 2021. Journal of LatinX in AI Research, 2021. http://dx.doi.org/10.52591/lxai202107244.
Montesuma, Eduardo F., and Fred-Maurice Ngole Mboula. "Wasserstein Barycenter Transport for Acoustic Adaptation". In ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2021. http://dx.doi.org/10.1109/icassp39728.2021.9414199.
Lian, Xin, Kshitij Jain, Jakub Truszkowski, Pascal Poupart, and Yaoliang Yu. "Unsupervised Multilingual Alignment using Wasserstein Barycenter". In Twenty-Ninth International Joint Conference on Artificial Intelligence and Seventeenth Pacific Rim International Conference on Artificial Intelligence {IJCAI-PRICAI-20}. California: International Joint Conferences on Artificial Intelligence Organization, 2020. http://dx.doi.org/10.24963/ijcai.2020/512.
Montesuma, Eduardo Fernandes, and Fred Maurice Ngole Mboula. "Wasserstein Barycenter for Multi-Source Domain Adaptation". In 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2021. http://dx.doi.org/10.1109/cvpr46437.2021.01651.