Selected scientific literature on the topic "Wasserstein barycenters"

Create an accurate reference in APA, MLA, Chicago, Harvard, and other citation styles


Consult the list of current articles, books, theses, conference proceedings, and other scholarly sources relevant to the topic "Wasserstein barycenters".

Next to each source in the reference list there is an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic citation of the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scientific publication in .pdf format and read the abstract of the work online, if it is available in the metadata.

Journal articles on the topic "Wasserstein barycenters"

1

Chi, Jinjin, Zhiyao Yang, Ximing Li, Jihong Ouyang, and Renchu Guan. "Variational Wasserstein Barycenters with C-cyclical Monotonicity Regularization". Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 6 (June 26, 2023): 7157–65. http://dx.doi.org/10.1609/aaai.v37i6.25873.

Abstract:
Wasserstein barycenter, built on the theory of Optimal Transport (OT), provides a powerful framework to aggregate probability distributions, and it has increasingly attracted great attention within the machine learning community. However, it is often intractable to compute precisely, especially in high-dimensional and continuous settings. To alleviate this problem, we develop a novel regularization based on the fact that c-cyclical monotonicity is often a necessary and sufficient condition for optimality in OT problems, and incorporate it into the dual formulation of Wasserstein barycenters. For efficient computation, we adopt a variational distribution as an approximation of the true continuous barycenter, framing the Wasserstein barycenter problem as an optimization problem over the variational parameters. Building on these ideas, we propose a novel end-to-end continuous approximation method, namely Variational Wasserstein Barycenters with c-Cyclical Monotonicity Regularization (VWB-CMR), given sample access to the input distributions. We provide a theoretical convergence analysis and demonstrate the superior performance of VWB-CMR on synthetic data and in real applications of subset posterior aggregation.
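As a concrete illustration of the barycentric aggregation the abstract describes (a hedged sketch of the general concept, not the paper's VWB-CMR method): in one dimension, the 2-Wasserstein barycenter of equal-size empirical measures has a closed form obtained by averaging quantile functions, i.e., sorted samples.

```python
# Illustrative sketch only: in 1-D, the W2 barycenter of uniform empirical
# measures with equally many atoms is obtained by averaging order statistics
# (the discrete quantile functions) of the input samples.

def wasserstein_barycenter_1d(samples, weights=None):
    """Barycenter of several equal-size 1-D samples under W2.

    Each entry of `samples` is a list of n reals viewed as a uniform empirical
    measure; the barycenter's k-th atom is the weighted average of the k-th
    order statistics of the inputs.
    """
    if weights is None:
        weights = [1.0 / len(samples)] * len(samples)
    sorted_samples = [sorted(s) for s in samples]
    n = len(sorted_samples[0])
    return [sum(w * s[k] for w, s in zip(weights, sorted_samples))
            for k in range(n)]

bary = wasserstein_barycenter_1d([[0.0, 1.0, 2.0], [4.0, 3.0, 5.0]])
print(bary)  # [1.5, 2.5, 3.5]
```

In higher dimensions no such closed form exists in general, which is what motivates approximation schemes like the one proposed in the paper.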
2

Xu, Hongteng, Dixin Luo, Lawrence Carin, and Hongyuan Zha. "Learning Graphons via Structured Gromov-Wasserstein Barycenters". Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 12 (May 18, 2021): 10505–13. http://dx.doi.org/10.1609/aaai.v35i12.17257.

Abstract:
We propose a novel and principled method to learn a nonparametric graph model called graphon, which is defined in an infinite-dimensional space and represents arbitrary-size graphs. Based on the weak regularity lemma from the theory of graphons, we leverage a step function to approximate a graphon. We show that the cut distance of graphons can be relaxed to the Gromov-Wasserstein distance of their step functions. Accordingly, given a set of graphs generated by an underlying graphon, we learn the corresponding step function as the Gromov-Wasserstein barycenter of the given graphs. Furthermore, we develop several enhancements and extensions of the basic algorithm, e.g., the smoothed Gromov-Wasserstein barycenter for guaranteeing the continuity of the learned graphons and the mixed Gromov-Wasserstein barycenters for learning multiple structured graphons. The proposed approach overcomes drawbacks of prior state-of-the-art methods, and outperforms them on both synthetic and real-world data. The code is available at https://github.com/HongtengXu/SGWB-Graphon.
3

Bigot, Jérémie, and Thierry Klein. "Characterization of barycenters in the Wasserstein space by averaging optimal transport maps". ESAIM: Probability and Statistics 22 (2018): 35–57. http://dx.doi.org/10.1051/ps/2017020.

Abstract:
This paper is concerned with the study of barycenters for random probability measures in the Wasserstein space. Using a duality argument, we give a precise characterization of the population barycenter for various parametric classes of random probability measures with compact support. In particular, we make a connection between averaging in the Wasserstein space, as introduced in Agueh and Carlier [SIAM J. Math. Anal. 43 (2011) 904–924], and taking the expectation of optimal transport maps with respect to a fixed reference measure. We also discuss the usefulness of this approach in statistics for the analysis of deformable models in signal and image processing. In this setting, the problem of estimating a population barycenter from n independent and identically distributed random probability measures is also considered.
4

Agueh, Martial, and Guillaume Carlier. "Barycenters in the Wasserstein Space". SIAM Journal on Mathematical Analysis 43, no. 2 (January 2011): 904–24. http://dx.doi.org/10.1137/100805741.

5

Kim, Young-Heon, and Brendan Pass. "Wasserstein barycenters over Riemannian manifolds". Advances in Mathematics 307 (February 2017): 640–83. http://dx.doi.org/10.1016/j.aim.2016.11.026.

6

Baum, Marcus, Peter Willett, and Uwe D. Hanebeck. "On Wasserstein Barycenters and MMOSPA Estimation". IEEE Signal Processing Letters 22, no. 10 (October 2015): 1511–15. http://dx.doi.org/10.1109/lsp.2015.2410217.

7

Puccetti, Giovanni, Ludger Rüschendorf, and Steven Vanduffel. "On the computation of Wasserstein barycenters". Journal of Multivariate Analysis 176 (March 2020): 104581. http://dx.doi.org/10.1016/j.jmva.2019.104581.

8

Le Gouic, Thibaut, and Jean-Michel Loubes. "Existence and consistency of Wasserstein barycenters". Probability Theory and Related Fields 168, no. 3-4 (August 17, 2016): 901–17. http://dx.doi.org/10.1007/s00440-016-0727-z.

9

Buzun, Nazar. "Gaussian Approximation for Penalized Wasserstein Barycenters". Mathematical Methods of Statistics 32, no. 1 (March 2023): 1–26. http://dx.doi.org/10.3103/s1066530723010039.

10

Sow, Babacar, Rodolphe Le Riche, Julien Pelamatti, Merlin Keller, and Sanaa Zannane. "Wasserstein-Based Evolutionary Operators for Optimizing Sets of Points: Application to Wind-Farm Layout Design". Applied Sciences 14, no. 17 (September 5, 2024): 7916. http://dx.doi.org/10.3390/app14177916.

Abstract:
This paper introduces an evolutionary algorithm for objective functions defined over clouds of points of varying sizes. Such design variables are modeled as uniform discrete measures with finite support and the crossover and mutation operators of the algorithm are defined using the Wasserstein barycenter. We prove that the Wasserstein-based crossover has a contracting property in the sense that the support of the generated measure is included in the closed convex hull of the union of the two parents’ supports. We introduce boundary mutations to counteract this contraction. Variants of evolutionary operators based on Wasserstein barycenters are studied. We compare the resulting algorithm to a more classical, sequence-based, evolutionary algorithm on a family of test functions that include a wind-farm layout problem. The results show that Wasserstein-based evolutionary operators better capture the underlying geometrical structures of the considered test functions and outperform a reference evolutionary algorithm in the vast majority of the cases. The tests indicate that the mutation operators play a major part in the performances of the algorithms.
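The contracting property proved in the abstract can be illustrated in the simplest setting. The following is a hedged sketch (one-dimensional clouds with equally many atoms, not the paper's actual operators): in 1-D the optimal matching is given by sorting, and the barycenter with weights (t, 1−t) places each child atom on the segment between its matched parents, so the child's support stays inside the convex hull of the union of the parents' supports.

```python
# Hedged 1-D sketch of a Wasserstein-barycenter crossover between two parent
# point clouds of the same size. Sorting gives the optimal W2 matching; the
# child atom is a convex combination of matched parents, so the child support
# is contained in [min, max] of the parents' union (the contraction property).

def wasserstein_crossover(parent_a, parent_b, t=0.5):
    xs, ys = sorted(parent_a), sorted(parent_b)
    return [t * x + (1.0 - t) * y for x, y in zip(xs, ys)]

parent_a, parent_b = [0.0, 10.0], [2.0, 4.0]
child = wasserstein_crossover(parent_a, parent_b, t=0.25)
lo, hi = min(parent_a + parent_b), max(parent_a + parent_b)
assert all(lo <= c <= hi for c in child)  # contraction: child inside the hull
print(child)  # [1.5, 5.5]
```

This contraction is exactly why the paper introduces boundary mutations: crossover alone can never generate atoms outside the parents' convex hull.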

Theses / dissertations on the topic "Wasserstein barycenters"

1

Fernandes Montesuma, Eduardo. "Multi-Source Domain Adaptation through Wasserstein Barycenters". Electronic Thesis or Diss., Université Paris-Saclay, 2024. http://www.theses.fr/2024UPASG045.

Abstract:
Machine learning systems operate under the assumption that training and test conditions do not change. This hypothesis is seldom met in practice, so the system is trained on data that is no longer representative of the data it will be tested on: the probability measure generating the data shifts between training and testing. This scenario is known in the literature as distributional shift between two domains: a source and a target. A straightforward generalization of this problem arises when the training data itself exhibits several intrinsic shifts, in which case one considers multi-source domain adaptation (MSDA). In this context, optimal transport is a useful mathematical tool, in particular for comparing and manipulating probability measures. This thesis studies the contributions of optimal transport to multi-source domain adaptation. We do so through Wasserstein barycenters, an object that defines a weighted average, in the space of probability measures, of the multiple domains in MSDA. Based on this concept, we propose: (i) a novel notion of barycenter when the measures at hand are equipped with labels, (ii) a novel dictionary-learning problem over empirical probability measures, and (iii) new tools for domain adaptation through the optimal transport of Gaussian mixture models. Our methods improve domain adaptation performance over previous optimal-transport-based methods on image and cross-domain fault-diagnosis benchmarks. Our work opens an interesting research direction on learning the barycentric hull of probability measures.
2

Cazelles, Elsa. "Statistical properties of barycenters in the Wasserstein space and fast algorithms for optimal transport of measures". Thesis, Bordeaux, 2018. http://www.theses.fr/2018BORD0125/document.

Abstract:
This thesis focuses on the analysis of data in the form of probability measures on R^d. The aim is to provide a better understanding of the usual statistical tools on this space endowed with the Wasserstein distance. A natural first notion is first-order statistical analysis, consisting of the study of the Fréchet mean (or barycenter). In particular, we focus on the case of discrete data (or observations) sampled from probability measures that are absolutely continuous (a.c.) with respect to the Lebesgue measure. We introduce an estimator of the barycenter of random measures, penalized by a convex function, which makes it possible to enforce its absolute continuity. Another estimator is regularized by adding entropy when computing the Wasserstein distance. We are particularly interested in controlling the variance of these estimators. Thanks to these results, the principle of Goldenshluger and Lepski allows us to obtain an automatic calibration of the regularization parameters. We then apply this work to the registration of multivariate densities, notably for flow cytometry data. We also propose a goodness-of-fit test that can compare two multivariate distributions efficiently in terms of computational time. Finally, we perform a second-order statistical analysis to extract the global geometric trends of a dataset, that is, its main modes of variation. For that purpose, we propose an algorithm for carrying out geodesic principal component analysis in the Wasserstein space.
3

Le Gouic, Thibaut. "Localisation de masse et espaces de Wasserstein". Toulouse 3, 2013. http://thesesups.ups-tlse.fr/2163/.

Abstract:
The study in this manuscript is based on two distinct tools: packing and Wasserstein spaces. A first part focuses on localizing the mass of a probability measure Mu. When Mu is regular, the level sets of its density provide a good notion for localizing the "dense" areas of mass, but they lose their meaning for finitely supported measures, such as the empirical measure. We therefore define a size function Tau on the closed sets of a metric space, based on their packing. The sets of smallest Tau-size with a given mass 1 − alpha localize the dense areas of Mu, even in irregular cases. We show that the sets of smallest Tau-size, for fixed Mu and alpha, depend continuously on Mu and alpha with respect to the Hausdorff distance. From this we derive a new, robust, and stable method for quantizing Mu. A second part focuses on the Wasserstein distance between a probability measure and the associated empirical measure. We obtain a non-asymptotic upper bound on the expectation of this distance for an arbitrary underlying metric space. A specialization to finite-dimensional spaces highlights the accuracy of this bound. In the case of Gaussian measures on Banach spaces, we also obtain new bounds that coincide asymptotically with those of the best possible quantizers. Using concentration inequalities, we establish deviation bounds. Finally, we use these results to define non-asymptotic and non-parametric statistical tests of goodness of fit to a family of distributions. A third part focuses on the barycenter of a finite family of probability measures. The Fréchet mean extends the notion of barycenter to metric spaces, allowing us to define barycenters on Wasserstein spaces. We show their existence and study their continuity properties with respect to the probability measures. Finally, we discuss the practical application of these results to the aggregation of empirical measures and to texture mixing.
4

Silveti-Falls, Antonio. "First-order noneuclidean splitting methods for large-scale optimization: deterministic and stochastic algorithms". Thesis, Normandie, 2021. http://www.theses.fr/2021NORMC204.

Abstract:
In this work we develop and examine two novel first-order splitting algorithms for solving large-scale composite optimization problems in infinite-dimensional spaces. Such problems are ubiquitous in many areas of science and engineering, particularly in data science and imaging. Our work focuses on relaxing the Lipschitz-smoothness assumptions generally required by first-order splitting algorithms by replacing the Euclidean energy with a Bregman divergence. These developments make it possible to solve problems with more exotic geometry than that of the usual Euclidean setting. One algorithm is a hybridization of the conditional gradient algorithm, which uses a linear minimization oracle at each iteration, with an augmented Lagrangian algorithm, allowing for affine constraints. The other algorithm is a primal-dual splitting scheme incorporating Bregman divergences for computing the associated proximal operators. For both algorithms, our analysis shows convergence of the Lagrangian values, subsequential weak convergence of the iterates to solutions, and rates of convergence. In addition to these novel deterministic algorithms, we also introduce and study their stochastic extensions from a perturbation-analysis perspective. Our results in this part include almost-sure convergence results for the same quantities as in the deterministic setting, with rates as well. Finally, we tackle new problems that are accessible only through the relaxed assumptions our algorithms allow. We demonstrate numerical efficiency and verify our theoretical results on problems such as low-rank sparse matrix completion, inverse problems on the simplex, and entropically regularized Wasserstein inverse problems.
5

Nenna, Luca. "Numerical Methods for Multi-Marginal Optimal Transportation". Thesis, Paris Sciences et Lettres (ComUE), 2016. http://www.theses.fr/2016PSLED017/document.

Abstract:
In this thesis we aim to give a general numerical framework for approximating solutions to optimal transport (OT) problems. The general idea is to introduce an entropic regularization of the initial problems. The regularized problem corresponds to the minimization of a relative entropy with respect to a given reference measure; indeed, this is equivalent to finding the projection of the coupling with respect to the Kullback-Leibler divergence. This allows us to make use of the Bregman/Dykstra algorithm and to solve several variational problems related to OT. We are especially interested in solving multi-marginal optimal transport (MMOT) problems arising in physics, such as in fluid dynamics (e.g., incompressible Euler equations à la Brenier) and in quantum physics (e.g., density functional theory). In these cases we show that the entropic regularization plays a more important role than a simple numerical stabilization. Moreover, we give results concerning the existence and characterization of optimal transport maps (e.g., fractal maps) for the MMOT problem.
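The entropic scheme the thesis builds on can be sketched, under simplifying assumptions, in the two-marginal case: the regularized plan is the KL projection of the Gibbs kernel K = exp(−C/eps) onto the couplings with prescribed marginals a and b, reached by alternating diagonal scalings (Sinkhorn iterations, the simplest instance of the Bregman projection method mentioned above). The function name and parameters below are illustrative, not from the thesis.

```python
# Hedged sketch of entropic OT via Sinkhorn iterations: alternately rescale
# the rows and columns of the Gibbs kernel so the plan's marginals match a, b.
import math

def sinkhorn_plan(a, b, C, eps=0.1, iters=500):
    """Entropically regularized transport plan between discrete measures a, b
    with cost matrix C, computed by alternating diagonal scalings."""
    n, m = len(a), len(b)
    K = [[math.exp(-C[i][j] / eps) for j in range(m)] for i in range(n)]
    u, v = [1.0] * n, [1.0] * m
    for _ in range(iters):
        u = [a[i] / sum(K[i][j] * v[j] for j in range(m)) for i in range(n)]
        v = [b[j] / sum(K[i][j] * u[i] for i in range(n)) for j in range(m)]
    return [[u[i] * K[i][j] * v[j] for j in range(m)] for i in range(n)]

a, b = [0.5, 0.5], [0.5, 0.5]
C = [[0.0, 1.0], [1.0, 0.0]]
P = sinkhorn_plan(a, b, C)
# Row and column sums of P approximately recover the prescribed marginals.
print([sum(row) for row in P])
```

Multi-marginal and barycenter problems extend this idea by projecting onto several marginal constraints in turn, which is where the Bregman/Dykstra viewpoint becomes essential.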

Book chapters on the topic "Wasserstein barycenters"

1

Bouchet, Pierre-Yves, Stefano Gualandi, and Louis-Martin Rousseau. "Primal Heuristics for Wasserstein Barycenters". In Integration of Constraint Programming, Artificial Intelligence, and Operations Research, 239–55. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-58942-4_16.

2

Trinh, Thanh-Son. "Wasserstein Barycenters Over Heisenberg Group". In Algorithms for Intelligent Systems, 273–79. Singapore: Springer Nature Singapore, 2024. http://dx.doi.org/10.1007/978-981-99-8976-8_24.

3

Auricchio, Gennaro, Federico Bassetti, Stefano Gualandi, and Marco Veneroni. "Computing Wasserstein Barycenters via Linear Programming". In Integration of Constraint Programming, Artificial Intelligence, and Operations Research, 355–63. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-19212-9_23.

4

Cazelles, Elsa, Jérémie Bigot, and Nicolas Papadakis. "Regularized Barycenters in the Wasserstein Space". In Lecture Notes in Computer Science, 83–90. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-68445-1_10.

5

Hu, François, Philipp Ratz, and Arthur Charpentier. "Fairness in Multi-Task Learning via Wasserstein Barycenters". In Machine Learning and Knowledge Discovery in Databases: Research Track, 295–312. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-43415-0_18.

6

Le Gouic, Thibaut, and Jean-Michel Loubes. "Barycenter in Wasserstein Spaces: Existence and Consistency". In Lecture Notes in Computer Science, 104–8. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-25040-3_12.

7

Nadeem, Saad, Travis Hollmann, and Allen Tannenbaum. "Multimarginal Wasserstein Barycenter for Stain Normalization and Augmentation". In Medical Image Computing and Computer Assisted Intervention – MICCAI 2020, 362–71. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-59722-1_35.

8

Rabin, Julien, Gabriel Peyré, Julie Delon, and Marc Bernot. "Wasserstein Barycenter and Its Application to Texture Mixing". In Lecture Notes in Computer Science, 435–46. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-24785-9_37.

9

Wang, Xu, Jiawei Huang, Qingyuan Yang, and Jinpeng Zhang. "On Robust Wasserstein Barycenter: The Model and Algorithm". In Proceedings of the 2024 SIAM International Conference on Data Mining (SDM), 235–43. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2024. http://dx.doi.org/10.1137/1.9781611978032.27.

10

Jin, Cong, Zhongtong Li, Yuanyuan Sun, Haiyin Zhang, Xin Lv, Jianguang Li, and Shouxun Liu. "An Integrated Processing Method Based on Wasserstein Barycenter Algorithm for Automatic Music Transcription". In Communications and Networking, 230–40. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-41117-6_19.


Conference papers on the topic "Wasserstein barycenters"

1

Simon, Dror, and Aviad Aberdam. "Barycenters of Natural Images - Constrained Wasserstein Barycenters for Image Morphing". In 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2020. http://dx.doi.org/10.1109/cvpr42600.2020.00793.

2

Simou, Effrosyni, and Pascal Frossard. "Graph Signal Representation with Wasserstein Barycenters". In ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2019. http://dx.doi.org/10.1109/icassp.2019.8683335.

3

Uribe, Cesar A., Darina Dvinskikh, Pavel Dvurechensky, Alexander Gasnikov, and Angelia Nedic. "Distributed Computation of Wasserstein Barycenters Over Networks". In 2018 IEEE Conference on Decision and Control (CDC). IEEE, 2018. http://dx.doi.org/10.1109/cdc.2018.8619160.

4

Ouyang, Jihong, Yiming Wang, Ximing Li, and Changchun Li. "Weakly-supervised Text Classification with Wasserstein Barycenters Regularization". In Thirty-First International Joint Conference on Artificial Intelligence {IJCAI-22}. California: International Joint Conferences on Artificial Intelligence Organization, 2022. http://dx.doi.org/10.24963/ijcai.2022/468.

Abstract:
Weakly-supervised text classification aims to train predictive models with unlabeled texts and a few representative words per class, referred to as category words, rather than with labeled texts. Such weak supervision is much cheaper and easier to collect in real-world scenarios. To address this task, we propose a novel deep classification model, namely Weakly-supervised Text Classification with Wasserstein Barycenter Regularization (WTC-WBR). Specifically, we initialize the pseudo-labels of texts using category word occurrences, and formulate a weakly self-training framework to iteratively update the weakly-supervised targets by combining the pseudo-labels with the sharpened predictions. Most importantly, we suggest a Wasserstein barycenter regularization with the weakly-supervised targets on the deep feature space. The intuition is that the texts tend to be close to the corresponding Wasserstein barycenter indicated by the weakly-supervised targets. Another benefit is that the regularization captures the geometric information of the deep feature space, boosting the discriminative power of the deep features. Experimental results demonstrate that WTC-WBR outperforms existing weakly-supervised baselines and achieves performance comparable to semi-supervised and supervised baselines.
Estilos ABNT, Harvard, Vancouver, APA, etc.
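The self-training target update this abstract describes (mixing initial pseudo-labels with temperature-sharpened model predictions) can be sketched in a few lines. The function names, mixing weight `alpha`, and temperature `T` below are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def sharpen(p, T=0.5):
    """Temperature-sharpen each row of a matrix of class probabilities."""
    q = p ** (1.0 / T)
    return q / q.sum(axis=-1, keepdims=True)

def update_targets(pseudo_labels, predictions, alpha=0.5, T=0.5):
    """Weakly-supervised targets: a convex combination of the initial
    pseudo-labels and the sharpened model predictions."""
    return alpha * pseudo_labels + (1.0 - alpha) * sharpen(predictions, T)
```

Lowering `T` pushes each predicted distribution toward its mode, so repeated updates gradually commit to the confident classes.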
5

Arque, Ferran, Cesar Uribe, and Carlos Ocampo-Martinez. "Computation of Discrete Flows Over Networks via Constrained Wasserstein Barycenters". In LatinX in AI at International Conference on Machine Learning 2021. Journal of LatinX in AI Research, 2021. http://dx.doi.org/10.52591/202107244.

Full text of the source
Abstract:
We study a Wasserstein attraction approach for solving dynamic mass transport problems over networks. In the transport problem over networks, we start with a distribution over the set of nodes that needs to be “transported” to a target distribution while accounting for the network topology. We exploit the specific structure of the problem, characterized by the computation of implicit gradient steps, and formulate an approach based on discretized flows. As a result, our proposed algorithm relies on the iterative computation of constrained Wasserstein barycenters. We show how the proposed method finds approximate solutions to the network transport problem, taking into account the topology of the network, the capacity of the communication channels, and the capacity of the individual nodes.
ABNT, Harvard, Vancouver, APA, etc. citation styles
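The iterative computation of Wasserstein barycenters that the abstract relies on can be sketched, in its plain fixed-support form without the paper's network constraints, via iterative Bregman projections in the style of Benamou et al. The function name, regularization strength `eps`, and iteration count are illustrative assumptions:

```python
import numpy as np

def sinkhorn_barycenter(dists, C, weights, eps=1.0, iters=100):
    """Entropic Wasserstein barycenter of histograms `dists` (shape (k, n))
    on a shared support, with ground cost C (n, n) and barycentric weights
    (k,), computed by iterative Bregman projections."""
    K = np.exp(-C / eps)                     # Gibbs kernel
    v = np.ones_like(dists)                  # one scaling vector per input
    for _ in range(iters):
        u = dists / (K @ v.T).T              # match each fixed marginal p_k
        Ktu = (K.T @ u.T).T                  # K^T u_k, one row per input
        marg = v * Ktu                       # current right marginals of the couplings
        b = np.prod(marg ** weights[:, None], axis=0)  # weighted geometric mean
        v = b[None, :] / Ktu                 # match the shared barycenter marginal
    return b / b.sum()
```

With two unit masses placed at opposite ends of a line and a squared-distance cost, the iteration concentrates the barycenter mass around the midpoint, which matches the intuition of mass being attracted along the network.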
6

Arque, Ferran, Cesar Uribe, and Carlos Ocampo-Martinez. "Computation of Discrete Flows Over Networks via Constrained Wasserstein Barycenters". In LatinX in AI at International Conference on Machine Learning 2021. Journal of LatinX in AI Research, 2021. http://dx.doi.org/10.52591/lxai202107244.

Full text of the source
Abstract:
We study a Wasserstein attraction approach for solving dynamic mass transport problems over networks. In the transport problem over networks, we start with a distribution over the set of nodes that needs to be “transported” to a target distribution while accounting for the network topology. We exploit the specific structure of the problem, characterized by the computation of implicit gradient steps, and formulate an approach based on discretized flows. As a result, our proposed algorithm relies on the iterative computation of constrained Wasserstein barycenters. We show how the proposed method finds approximate solutions to the network transport problem, taking into account the topology of the network, the capacity of the communication channels, and the capacity of the individual nodes.
ABNT, Harvard, Vancouver, APA, etc. citation styles
7

Colombo, Pierre, Guillaume Staerman, Chloé Clavel, and Pablo Piantanida. "Automatic Text Evaluation through the Lens of Wasserstein Barycenters". In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, PA, USA: Association for Computational Linguistics, 2021. http://dx.doi.org/10.18653/v1/2021.emnlp-main.817.

Full text of the source
ABNT, Harvard, Vancouver, APA, etc. citation styles
8

Montesuma, Eduardo F., and Fred-Maurice Ngole Mboula. "Wasserstein Barycenter Transport for Acoustic Adaptation". In ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2021. http://dx.doi.org/10.1109/icassp39728.2021.9414199.

Full text of the source
ABNT, Harvard, Vancouver, APA, etc. citation styles
9

Lian, Xin, Kshitij Jain, Jakub Truszkowski, Pascal Poupart, and Yaoliang Yu. "Unsupervised Multilingual Alignment using Wasserstein Barycenter". In Twenty-Ninth International Joint Conference on Artificial Intelligence and Seventeenth Pacific Rim International Conference on Artificial Intelligence {IJCAI-PRICAI-20}. California: International Joint Conferences on Artificial Intelligence Organization, 2020. http://dx.doi.org/10.24963/ijcai.2020/512.

Full text of the source
Abstract:
We study unsupervised multilingual alignment, the problem of finding word-to-word translations between multiple languages without using any parallel data. One popular strategy is to reduce multilingual alignment to the much simpler bilingual setting, by picking one of the input languages as the pivot language that we transit through. However, it is well known that transiting through a poorly chosen pivot language (such as English) may severely degrade the translation quality, since the assumed transitive relations among all pairs of languages may not be enforced in the training process. Instead of going through a rather arbitrarily chosen pivot language, we propose to use the Wasserstein barycenter as a more informative "mean" language: it encapsulates information from all languages and minimizes all pairwise transportation costs. We evaluate our method on standard benchmarks and demonstrate state-of-the-art performance.
ABNT, Harvard, Vancouver, APA, etc. citation styles
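The "mean" language intuition in this abstract, a single distribution minimizing the sum of transport costs to all inputs, has a closed form in one dimension: the W2 barycenter is obtained by averaging quantile functions. A toy sketch of that fact (the paper itself operates on high-dimensional word-embedding distributions; the function name here is an illustrative assumption):

```python
import numpy as np

def w2_barycenter_1d(samples_list, weights=None):
    """W2 barycenter of 1-D empirical distributions with equal sample counts:
    sort each sample set (its empirical quantile function) and average them."""
    sorted_samples = np.stack([np.sort(np.asarray(s, float)) for s in samples_list])
    if weights is None:
        weights = np.full(len(samples_list), 1.0 / len(samples_list))
    return np.asarray(weights) @ sorted_samples   # weighted quantile average
```

For example, the barycenter of samples {0, 1, 2} and {4, 5, 6} is {2, 3, 4}: each quantile lands midway between the corresponding quantiles of the two inputs.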
10

Montesuma, Eduardo Fernandes, and Fred Maurice Ngole Mboula. "Wasserstein Barycenter for Multi-Source Domain Adaptation". In 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2021. http://dx.doi.org/10.1109/cvpr46437.2021.01651.

Full text of the source
ABNT, Harvard, Vancouver, APA, etc. citation styles