Selection of scholarly literature on the topic "Distances de Wasserstein"
Create a reference in APA, MLA, Chicago, Harvard, and other citation styles
Consult the lists of current articles, books, dissertations, reports, and other scholarly sources on the topic "Distances de Wasserstein".
Next to every work in the bibliography there is an "Add to bibliography" option. Use it, and the bibliographic reference for the selected work will be formatted automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).
You can also download the full text of the scholarly publication in PDF format and read an online abstract of the work, provided the relevant parameters are available in the metadata.
Journal articles on the topic "Distances de Wasserstein"
Solomon, Justin, Fernando de Goes, Gabriel Peyré, Marco Cuturi, Adrian Butscher, Andy Nguyen, Tao Du, and Leonidas Guibas. „Convolutional Wasserstein distances“. ACM Transactions on Graphics 34, no. 4 (27.07.2015): 1–11. http://dx.doi.org/10.1145/2766963.
Kindelan Nuñez, Rolando, Mircea Petrache, Mauricio Cerda, and Nancy Hitschfeld. „A Class of Topological Pseudodistances for Fast Comparison of Persistence Diagrams“. Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 12 (24.03.2024): 13202–10. http://dx.doi.org/10.1609/aaai.v38i12.29220.
Panaretos, Victor M., and Yoav Zemel. „Statistical Aspects of Wasserstein Distances“. Annual Review of Statistics and Its Application 6, no. 1 (07.03.2019): 405–31. http://dx.doi.org/10.1146/annurev-statistics-030718-104938.
Kelbert, Mark. „Survey of Distances between the Most Popular Distributions“. Analytics 2, no. 1 (01.03.2023): 225–45. http://dx.doi.org/10.3390/analytics2010012.
Vayer, Titouan, Laetitia Chapel, Remi Flamary, Romain Tavenard, and Nicolas Courty. „Fused Gromov-Wasserstein Distance for Structured Objects“. Algorithms 13, no. 9 (31.08.2020): 212. http://dx.doi.org/10.3390/a13090212.
Belili, Nacereddine, and Henri Heinich. „Distances de Wasserstein et de Zolotarev“. Comptes Rendus de l'Académie des Sciences - Series I - Mathematics 330, no. 9 (May 2000): 811–14. http://dx.doi.org/10.1016/s0764-4442(00)00274-3.
Peyre, Rémi. „Comparison between W2 distance and Ḣ−1 norm, and Localization of Wasserstein distance“. ESAIM: Control, Optimisation and Calculus of Variations 24, no. 4 (October 2018): 1489–501. http://dx.doi.org/10.1051/cocv/2017050.
Tong, Qijun, and Kei Kobayashi. „Entropy-Regularized Optimal Transport on Multivariate Normal and q-normal Distributions“. Entropy 23, no. 3 (03.03.2021): 302. http://dx.doi.org/10.3390/e23030302.
Beier, Florian, Robert Beinert, and Gabriele Steidl. „Multi-marginal Gromov–Wasserstein transport and barycentres“. Information and Inference: A Journal of the IMA 12, no. 4 (18.09.2023): 2720–52. http://dx.doi.org/10.1093/imaiai/iaad041.
Zhang, Zhonghui, Huarui Jing, and Chihwa Kao. „High-Dimensional Distributionally Robust Mean-Variance Efficient Portfolio Selection“. Mathematics 11, no. 5 (06.03.2023): 1272. http://dx.doi.org/10.3390/math11051272.
Dissertations on the topic "Distances de Wasserstein"
Boissard, Emmanuel. „Problèmes d'interaction discret-continu et distances de Wasserstein“. Toulouse 3, 2011. http://thesesups.ups-tlse.fr/1389/.
We study several problems of approximation using tools from optimal transportation theory. The family of Wasserstein metrics is used to provide error bounds for particular approximations of some partial differential equations. They also come into play as natural measures of distortion for quantization and clustering problems. A problem related to these questions is to estimate the speed of convergence in the empirical law of large numbers for these distortions. The first part of this thesis provides non-asymptotic bounds, notably in infinite-dimensional Banach spaces, as well as in cases where independence is removed. The second part is dedicated to the study of two models arising in the modelling of animal displacement. A new individual-based model for ant trail formation is introduced and studied through numerical simulations and a kinetic formulation. We also study a variant of the Cucker-Smale model of bird flock motion: we establish well-posedness of the associated Vlasov-type transport equation as well as long-time behaviour results. In a third part, we study some statistical applications of the notion of barycenter in Wasserstein space recently introduced by M. Agueh and G. Carlier.
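As a rough numerical illustration of the empirical convergence question mentioned above, the following minimal sketch (not the estimators developed in the thesis) computes the one-dimensional Wasserstein-1 distance between a growing empirical sample and a large reference sample; it assumes SciPy's wasserstein_distance function and a standard normal target law.

```python
import numpy as np
from scipy.stats import wasserstein_distance  # closed-form 1-D Wasserstein-1 distance

rng = np.random.default_rng(0)
reference = rng.normal(size=200_000)  # large sample standing in for the target law N(0, 1)

# Empirical law of large numbers in Wasserstein distance: W_1(mu_n, mu) -> 0 as n grows;
# the speed of this decay is the kind of quantity bounded non-asymptotically in such work.
for n in (10, 100, 1_000, 10_000):
    sample = rng.normal(size=n)
    print(n, wasserstein_distance(sample, reference))
```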
Fernandes, Montesuma Eduardo. „Multi-Source Domain Adaptation through Wasserstein Barycenters“. Electronic Thesis or Diss., université Paris-Saclay, 2024. http://www.theses.fr/2024UPASG045.
Machine learning systems work under the assumption that training and test conditions are uniform, i.e., they do not change. However, this hypothesis is seldom met in practice, so the system is trained on data that are no longer representative of the data it will be tested on. This corresponds to a shift in the probability measure generating the data, known in the literature as distributional shift between two domains: a source and a target. A straightforward generalization of this problem arises when the training data themselves exhibit shifts; in this case, one considers Multi-Source Domain Adaptation (MSDA). In this context, optimal transport is a useful field of mathematics: it serves as a toolbox for comparing and manipulating probability measures. This thesis studies the contributions of optimal transport to multi-source domain adaptation. We do so through Wasserstein barycenters, an object that defines a weighted average, in the space of probability measures, of the multiple domains in MSDA. Based on this concept, we propose: (i) a novel notion of barycenter when the measures at hand are equipped with labels, (ii) a novel dictionary learning problem over empirical probability measures, and (iii) new tools for domain adaptation through the optimal transport of Gaussian mixture models. Through our methods, we are able to improve domain adaptation performance in comparison with previous optimal transport-based methods on image and cross-domain fault diagnosis benchmarks. Our work opens an interesting research direction on learning the barycentric hull of probability measures.
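To give a concrete feel for the Wasserstein barycenter, here is a toy sketch for the one-dimensional case only (not the labeled-barycenter or dictionary-learning methods of the thesis): on the real line, the Wasserstein-2 barycenter's quantile function is the weighted average of the input quantile functions, which for equally sized, uniformly weighted samples amounts to averaging sorted samples. The function name is illustrative.

```python
import numpy as np

def wasserstein2_barycenter_1d(samples, weights):
    """Weighted Wasserstein-2 barycenter of one-dimensional empirical measures.

    On the real line the barycenter's quantile function is the weighted average
    of the input quantile functions; with equally sized samples this amounts to
    averaging the sorted samples coordinate-wise.
    """
    sorted_samples = [np.sort(np.asarray(s, dtype=float)) for s in samples]
    n = len(sorted_samples[0])
    assert all(len(s) == n for s in sorted_samples), "equal sample sizes assumed"
    return sum(w * s for w, s in zip(weights, sorted_samples))

# Three toy "domains" with shifted means; their barycenter interpolates between them.
rng = np.random.default_rng(0)
domains = [rng.normal(loc=m, size=500) for m in (-2.0, 0.0, 3.0)]
barycenter = wasserstein2_barycenter_1d(domains, weights=(1 / 3, 1 / 3, 1 / 3))
print(barycenter.mean())  # close to the average of the domain means, here about 1/3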
Schrieber, Jörn [author], Dominic [academic supervisor] Schuhmacher, Dominic [reviewer] Schuhmacher, and Anita [reviewer] Schöbel. „Algorithms for Optimal Transport and Wasserstein Distances / Jörn Schrieber ; Gutachter: Dominic Schuhmacher, Anita Schöbel ; Betreuer: Dominic Schuhmacher“. Göttingen : Niedersächsische Staats- und Universitätsbibliothek Göttingen, 2019. http://d-nb.info/1179449304/34.
SEGUY, Vivien Pierre François. „Measure Transport Approaches for Data Visualization and Learning“. Kyoto University, 2018. http://hdl.handle.net/2433/233857.
Gairing, Jan, Michael Högele, Tetiana Kosenkova, and Alexei Kulik. „On the calibration of Lévy driven time series with coupling distances : an application in paleoclimate“. Universität Potsdam, 2014. http://opus.kobv.de/ubp/volltexte/2014/6978/.
Flenghi, Roberta. „Théorème de la limite centrale pour des fonctionnelles non linéaires de la mesure empirique et pour le rééchantillonnage stratifié“. Electronic Thesis or Diss., Marne-la-vallée, ENPC, 2023. http://www.theses.fr/2023ENPC0051.
This thesis is dedicated to the central limit theorem, which, together with the strong law of large numbers, is one of the two fundamental limit theorems in probability theory. The central limit theorem, well known for linear functionals of the empirical measure of independent and identically distributed random vectors, has recently been extended to non-linear functionals. The main tool permitting this extension is the linear functional derivative, one of the notions of derivation on the Wasserstein space of probability measures. We generalize this extension by first relaxing the equal-distribution assumption and then the independence property, so as to deal with the successive values of an ergodic Markov chain. Second, we focus on the stratified resampling mechanism, one of the resampling schemes commonly used in particle filters. We prove a central limit theorem for the first resampling according to this mechanism, under the assumption that the initial positions are independent and identically distributed and the weights are proportional to a positive function of the positions such that the image of their common distribution by this function has a non-zero component absolutely continuous with respect to the Lebesgue measure. This result relies on the convergence in distribution of the fractional part of partial sums of the normalized weights to a random variable uniformly distributed on [0,1]. More generally, we prove the joint convergence in distribution of q variables modulo one, obtained as partial sums of a sequence of i.i.d. square-integrable random variables multiplied by a common factor given by some function of an empirical mean of the same sequence. The limit is uniformly distributed over [0,1]^q. To deal with the coupling introduced by the common factor, we assume that the common distribution of the random variables has a non-zero component absolutely continuous with respect to the Lebesgue measure, so that the convergence in the central limit theorem for this sequence holds in total variation distance. Under the conjecture that the convergence in distribution of fractional parts to a uniform random variable remains valid at the subsequent steps of a particle filter which alternates selections according to the stratified resampling mechanism and mutations according to Markov kernels, we provide an inductive formula for the asymptotic variance of the resampled population after n steps. We perform numerical experiments which support the validity of this formula.
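For readers unfamiliar with the mechanism studied here, the following is a minimal sketch of one stratified resampling step as commonly implemented in particle filters (an assumption-laden illustration, not the thesis's analysis): one uniform draw per stratum, followed by inversion of the cumulative normalized weights.

```python
import numpy as np

def stratified_resample(weights, rng):
    """Stratified resampling: draw one uniform point in each stratum [i/N, (i+1)/N),
    then select ancestors by inverting the cumulative normalized weights."""
    weights = np.asarray(weights, dtype=float)
    n = len(weights)
    positions = (np.arange(n) + rng.uniform(size=n)) / n
    cumulative = np.cumsum(weights / weights.sum())
    return np.searchsorted(cumulative, positions)  # ancestor indices, one per stratum

rng = np.random.default_rng(0)
x = rng.normal(size=1_000)                 # i.i.d. initial positions
w = np.exp(-0.5 * (x - 1.0) ** 2)          # weights: a positive function of the positions
ancestors = stratified_resample(w, rng)
resampled = x[ancestors]                   # population after one stratified resampling step
```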
Bobbia, Benjamin. „Régression quantile extrême : une approche par couplage et distance de Wasserstein“. Thesis, Bourgogne Franche-Comté, 2020. http://www.theses.fr/2020UBFCD043.
This work is concerned with the estimation of conditional extreme quantiles. More precisely, we estimate high quantiles of a real distribution conditionally on the value of a covariate, potentially in high dimension. Such an estimation is carried out by introducing the proportional tail model. This model is studied via coupling methods: the first is based on empirical processes, whereas the second is focused on transport and optimal coupling. We provide estimators of both the quantiles and the model parameters, and we establish their asymptotic normality using our coupling methods. We also provide a validation procedure for the proportional tail model. Moreover, we develop the second approach in the general framework of univariate extreme value theory.
Nadjahi, Kimia. „Sliced-Wasserstein distance for large-scale machine learning : theory, methodology and extensions“. Electronic Thesis or Diss., Institut polytechnique de Paris, 2021. http://www.theses.fr/2021IPPAT050.
Many methods for statistical inference and generative modeling rely on a probability divergence to effectively compare two probability distributions. The Wasserstein distance, which emerges from optimal transport, has been an interesting choice, but suffers from computational and statistical limitations in large-scale settings. Several alternatives have then been proposed, including the Sliced-Wasserstein distance (SW), a metric that has been increasingly used in practice due to its computational benefits. However, there is little work regarding its theoretical properties. This thesis further explores the use of SW in modern statistical and machine learning problems, with a twofold objective: 1) provide new theoretical insights to understand in depth SW-based algorithms, and 2) design novel tools inspired by SW to improve its applicability and scalability. We first prove a set of asymptotic properties on the estimators obtained by minimizing SW, as well as a central limit theorem whose convergence rate is dimension-free. We also design a novel likelihood-free approximate inference method based on SW, which is theoretically grounded and scales well with the data size and dimension. Given that SW is commonly estimated with a simple Monte Carlo scheme, we then propose two approaches to alleviate the inefficiencies caused by the induced approximation error: on the one hand, we extend the definition of SW to introduce the Generalized Sliced-Wasserstein distances, and illustrate their advantages on generative modeling applications; on the other hand, we leverage concentration-of-measure results to formulate a new deterministic approximation for SW, which is computationally more efficient than the usual Monte Carlo technique and has nonasymptotic guarantees under a weak dependence condition. Finally, we define the general class of sliced probability divergences and investigate their topological and statistical properties; in particular, we establish that the sample complexity of any sliced divergence does not depend on the problem dimension.
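The "simple Monte Carlo scheme" referred to above can be sketched in a few lines: average closed-form one-dimensional Wasserstein distances over random projection directions. This is the standard estimator, not the deterministic approximation proposed in the thesis; function and parameter names are illustrative.

```python
import numpy as np

def sliced_wasserstein(X, Y, n_projections=200, p=2, rng=None):
    """Monte Carlo estimate of the Sliced-Wasserstein distance between two
    empirical measures with the same number of points in R^d.

    Each random direction reduces the comparison to a one-dimensional optimal
    transport problem, solved in closed form by sorting the projected samples."""
    rng = rng if rng is not None else np.random.default_rng()
    directions = rng.normal(size=(n_projections, X.shape[1]))
    directions /= np.linalg.norm(directions, axis=1, keepdims=True)  # unit sphere
    proj_x = np.sort(X @ directions.T, axis=0)
    proj_y = np.sort(Y @ directions.T, axis=0)
    return np.mean(np.abs(proj_x - proj_y) ** p) ** (1 / p)

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
Y = rng.normal(size=(500, 10)) + 1.0
print(sliced_wasserstein(X, Y, rng=rng))
```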
Liu, Lu. „A Risk-Oriented Clustering Approach for Asset Categorization and Risk Measurement“. Thesis, Université d'Ottawa / University of Ottawa, 2019. http://hdl.handle.net/10393/39444.
Lescornel, Hélène. „Covariance estimation and study of models of deformations between distributions with the Wasserstein distance“. Toulouse 3, 2014. http://www.theses.fr/2014TOU30045.
The first part of this thesis concerns the covariance estimation of non-stationary processes. We estimate the covariance in different vector spaces of matrices. In Chapter 3, we give a model selection procedure by minimizing a penalized criterion and using concentration inequalities, and Chapter 4 presents an Unbiased Risk Estimation method. In both cases we give oracle inequalities. The second part deals with the study of models of deformation between distributions. We assume that we observe a random quantity epsilon through a deformation function. The magnitude of the deformation is represented by a parameter theta that we aim to estimate. We present several methods of estimation based on the Wasserstein distance, aligning the distributions of the observations to recover the deformation parameter. In the case of real random variables, Chapter 7 presents consistency properties for an M-estimator and its asymptotic distribution; we use Hadamard differentiability techniques to apply a functional delta method. Chapter 8 concerns a Robbins-Monro estimator for the deformation parameter and presents convergence properties for a kernel estimator of the density of the variable epsilon obtained from the observations. The model is generalized to random variables in complete metric spaces in Chapter 9. Then, with the aim of building a goodness-of-fit test, Chapter 10 gives results on the asymptotic distribution of a test statistic.
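The idea of recovering a deformation parameter by aligning distributions in Wasserstein distance can be illustrated with a deliberately simple toy example: a scalar scale deformation estimated by grid search over the empirical one-dimensional Wasserstein-2 distance. This is only a hedged stand-in for the M-estimators and Robbins-Monro schemes of the thesis; all names and the deformation model are illustrative assumptions.

```python
import numpy as np

def w2_empirical_1d(x, y):
    """Wasserstein-2 distance between two equal-size one-dimensional empirical measures."""
    return np.sqrt(np.mean((np.sort(x) - np.sort(y)) ** 2))

rng = np.random.default_rng(0)
epsilon = rng.normal(size=2_000)            # the unobserved variable epsilon
theta_true = 1.7
observations = theta_true * epsilon          # toy deformation: multiplication by theta

# Align distributions: choose theta whose inverse deformation brings the observations
# closest, in Wasserstein distance, to an independent reference sample of epsilon.
reference = rng.normal(size=2_000)
grid = np.linspace(0.5, 3.0, 251)
costs = [w2_empirical_1d(observations / t, reference) for t in grid]
theta_hat = grid[int(np.argmin(costs))]
print(theta_hat)                             # close to the true value 1.7
```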
Books on the topic "Distances de Wasserstein"
An Invitation to Optimal Transport, Wasserstein Distances, and Gradient Flows. European Mathematical Society, 2021.
Computational Inversion with Wasserstein Distances and Neural Network Induced Loss Functions. [New York, N.Y.?]: [publisher not identified], 2022.
Book chapters on the topic "Distances de Wasserstein"
Bachmann, Fynn, Philipp Hennig, and Dmitry Kobak. „Wasserstein t-SNE“. In Machine Learning and Knowledge Discovery in Databases, 104–20. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-26387-3_7.
Villani, Cédric. „The Wasserstein distances“. In Grundlehren der mathematischen Wissenschaften, 93–111. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-540-71050-9_6.
Barbe, Amélie, Marc Sebban, Paulo Gonçalves, Pierre Borgnat, and Rémi Gribonval. „Graph Diffusion Wasserstein Distances“. In Machine Learning and Knowledge Discovery in Databases, 577–92. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-67661-2_34.
Jacobs, Bart. „Drawing from an Urn is Isometric“. In Lecture Notes in Computer Science, 101–20. Cham: Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-3-031-57228-9_6.
Santambrogio, Filippo. „Wasserstein distances and curves in the Wasserstein spaces“. In Optimal Transport for Applied Mathematicians, 177–218. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-20828-2_5.
Öcal, Kaan, Ramon Grima, and Guido Sanguinetti. „Wasserstein Distances for Estimating Parameters in Stochastic Reaction Networks“. In Computational Methods in Systems Biology, 347–51. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-31304-3_24.
Carrillo, José Antonio, Young-Pil Choi, and Maxime Hauray. „The derivation of swarming models: Mean-field limit and Wasserstein distances“. In Collective Dynamics from Bacteria to Crowds, 1–46. Vienna: Springer Vienna, 2014. http://dx.doi.org/10.1007/978-3-7091-1785-9_1.
Haeusler, Erich, and David M. Mason. „Asymptotic Distributions of Trimmed Wasserstein Distances Between the True and the Empirical Distribution Function“. In Stochastic Inequalities and Applications, 279–98. Basel: Birkhäuser Basel, 2003. http://dx.doi.org/10.1007/978-3-0348-8069-5_16.
Walczak, Szymon M. „Wasserstein Distance“. In SpringerBriefs in Mathematics, 1–10. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-57517-9_1.
Breiding, Paul, Kathlén Kohn, and Bernd Sturmfels. „Wasserstein Distance“. In Oberwolfach Seminars, 53–66. Cham: Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-3-031-51462-3_5.
Conference papers on the topic "Distances de Wasserstein"
Zhang, Xiaoxia, Chao Wang, Xusheng Hu, and Claude Delpha. „Incipient Cracks Characterization Based on Jensen-Shannon Divergence and Wasserstein Distance“. In 2024 Prognostics and System Health Management Conference (PHM), 8–13. IEEE, 2024. http://dx.doi.org/10.1109/phm61473.2024.00010.
Lyu, Zihang, Jun Xiao, Cong Zhang, and Kin-Man Lam. „AI-Generated Image Detection With Wasserstein Distance Compression and Dynamic Aggregation“. In 2024 IEEE International Conference on Image Processing (ICIP), 3827–33. IEEE, 2024. http://dx.doi.org/10.1109/icip51287.2024.10648186.
Malik, Vikrant, Taylan Kargin, Victoria Kostina, and Babak Hassibi. „A Distributionally Robust Approach to Shannon Limits using the Wasserstein Distance“. In 2024 IEEE International Symposium on Information Theory (ISIT), 861–66. IEEE, 2024. http://dx.doi.org/10.1109/isit57864.2024.10619597.
Lopez, Adrian Tovar, and Varun Jog. „Generalization error bounds using Wasserstein distances“. In 2018 IEEE Information Theory Workshop (ITW). IEEE, 2018. http://dx.doi.org/10.1109/itw.2018.8613445.
Memoli, Facundo. „Spectral Gromov-Wasserstein distances for shape matching“. In 2009 IEEE 12th International Conference on Computer Vision Workshops, ICCV Workshops. IEEE, 2009. http://dx.doi.org/10.1109/iccvw.2009.5457690.
Prossel, Dominik, and Uwe D. Hanebeck. „Dirac Mixture Reduction Using Wasserstein Distances on Projected Cumulative Distributions“. In 2022 25th International Conference on Information Fusion (FUSION). IEEE, 2022. http://dx.doi.org/10.23919/fusion49751.2022.9841286.
Steuernagel, Simon, Aaron Kurda, and Marcus Baum. „Point Cloud Registration based on Gaussian Mixtures and Pairwise Wasserstein Distances“. In 2023 IEEE Symposium Sensor Data Fusion and International Conference on Multisensor Fusion and Integration (SDF-MFI). IEEE, 2023. http://dx.doi.org/10.1109/sdf-mfi59545.2023.10361440.
Perkey, Scott, Ana Carvalho, and Alberto Krone-Martins. „Using Fourier Coefficients and Wasserstein Distances to Estimate Entropy in Time Series“. In 2023 IEEE 19th International Conference on e-Science (e-Science). IEEE, 2023. http://dx.doi.org/10.1109/e-science58273.2023.10254949.
Barbe, Amelie, Paulo Goncalves, Marc Sebban, Pierre Borgnat, Remi Gribonval, and Titouan Vayer. „Optimization of the Diffusion Time in Graph Diffused-Wasserstein Distances: Application to Domain Adaptation“. In 2021 IEEE 33rd International Conference on Tools with Artificial Intelligence (ICTAI). IEEE, 2021. http://dx.doi.org/10.1109/ictai52525.2021.00125.
Garcia Ramirez, Jesus. „Which Kernels to Transfer in Deep Q-Networks?“ In LatinX in AI at Neural Information Processing Systems Conference 2019. Journal of LatinX in AI Research, 2019. http://dx.doi.org/10.52591/lxai201912087.