Ready-made bibliography on the topic "Maximum Mean Discrepancy (MMD)"
Create a correct reference in APA, MLA, Chicago, Harvard, and many other citation styles
Browse lists of recent articles, books, dissertations, abstracts, and other scholarly sources on the topic "Maximum Mean Discrepancy (MMD)".
An "Add to bibliography" button is available next to each work in the bibliography. Use it, and we will automatically create a bibliographic reference to the selected work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the scholarly publication as a ".pdf" file and read the work's abstract online, if the relevant details are available in the source's metadata.
Journal articles on the topic "Maximum Mean Discrepancy (MMD)"
Huang, Qihang, Yulin He, and Zhexue Huang. "A Novel Maximum Mean Discrepancy-Based Semi-Supervised Learning Algorithm". Mathematics 10, no. 1 (December 23, 2021): 39. http://dx.doi.org/10.3390/math10010039.
Zhou, Zhaokun, Yuanhong Zhong, Xiaoming Liu, Qiang Li, and Shu Han. "DC-MMD-GAN: A New Maximum Mean Discrepancy Generative Adversarial Network Using Divide and Conquer". Applied Sciences 10, no. 18 (September 14, 2020): 6405. http://dx.doi.org/10.3390/app10186405.
Xu, Haoji. "Generate Faces Using Ladder Variational Autoencoder with Maximum Mean Discrepancy (MMD)". Intelligent Information Management 10, no. 04 (2018): 108–13. http://dx.doi.org/10.4236/iim.2018.104009.
Sun, Jiancheng. "Complex Network Construction of Univariate Chaotic Time Series Based on Maximum Mean Discrepancy". Entropy 22, no. 2 (January 24, 2020): 142. http://dx.doi.org/10.3390/e22020142.
Zhang, Xiangqing, Yan Feng, Shun Zhang, Nan Wang, Shaohui Mei, and Mingyi He. "Semi-Supervised Person Detection in Aerial Images with Instance Segmentation and Maximum Mean Discrepancy Distance". Remote Sensing 15, no. 11 (June 4, 2023): 2928. http://dx.doi.org/10.3390/rs15112928.
Zhao, Ji, and Deyu Meng. "FastMMD: Ensemble of Circular Discrepancy for Efficient Two-Sample Test". Neural Computation 27, no. 6 (June 2015): 1345–72. http://dx.doi.org/10.1162/neco_a_00732.
Williamson, Sinead A., and Jette Henderson. "Understanding Collections of Related Datasets Using Dependent MMD Coresets". Information 12, no. 10 (September 23, 2021): 392. http://dx.doi.org/10.3390/info12100392.
Li, Kangji, Borui Wei, Qianqian Tang, and Yufei Liu. "A Data-Efficient Building Electricity Load Forecasting Method Based on Maximum Mean Discrepancy and Improved TrAdaBoost Algorithm". Energies 15, no. 23 (November 22, 2022): 8780. http://dx.doi.org/10.3390/en15238780.
Lee, Junghyun, Gwangsu Kim, Mahbod Olfat, Mark Hasegawa-Johnson, and Chang D. Yoo. "Fast and Efficient MMD-Based Fair PCA via Optimization over Stiefel Manifold". Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 7 (June 28, 2022): 7363–71. http://dx.doi.org/10.1609/aaai.v36i7.20699.
Han, Chao, Deyun Zhou, Zhen Yang, Yu Xie, and Kai Zhang. "Discriminative Sparse Filtering for Multi-Source Image Classification". Sensors 20, no. 20 (October 16, 2020): 5868. http://dx.doi.org/10.3390/s20205868.
Doctoral dissertations on the topic "Maximum Mean Discrepancy (MMD)"
Cherief-Abdellatif, Badr-Eddine. "Contributions to the theoretical study of variational inference and robustness". Electronic Thesis or Diss., Institut polytechnique de Paris, 2020. http://www.theses.fr/2020IPPAG001.
This PhD thesis deals with variational inference and robustness. More precisely, it focuses on the statistical properties of variational approximations and on the design of efficient algorithms for computing them in an online fashion, and it investigates Maximum Mean Discrepancy-based estimators as learning rules that are robust to model misspecification.

In recent years, variational inference has been extensively studied from the computational viewpoint, but until very recently little attention had been paid in the literature to the theoretical properties of variational approximations. In this thesis, we investigate the consistency of variational approximations in various statistical models and the conditions that ensure their consistency. In particular, we tackle the special cases of mixture models and deep neural networks. We also justify in theory the use of the ELBO maximization strategy, a model selection criterion that is widely used in the Variational Bayes community and is known to work well in practice.

Moreover, Bayesian inference provides an attractive online-learning framework for analyzing sequential data, and offers generalization guarantees which hold even under model mismatch and with adversaries. Unfortunately, exact Bayesian inference is rarely feasible in practice and approximation methods are usually employed, but do such methods preserve the generalization properties of Bayesian inference? In this thesis, we show that this is indeed the case for some variational inference algorithms. We propose new online, tempered variational algorithms and derive their generalization bounds. Our theoretical result relies on the convexity of the variational objective, but we argue that the result should hold more generally and present empirical evidence in support of this. Our work thus provides theoretical justification for online algorithms that rely on approximate Bayesian methods. Another point addressed in this thesis is the design of a universal estimation procedure. This question is of major interest, in particular because it leads to robust estimators, a very active topic in statistics and machine learning. We tackle the problem of universal estimation using a minimum distance estimator based on the Maximum Mean Discrepancy. We show that the estimator is robust both to dependence and to the presence of outliers in the dataset. We also highlight the connections that may exist with minimum distance estimators based on the L2-distance. Finally, we provide a theoretical study of the stochastic gradient descent algorithm used to compute the estimator, and we support our findings with numerical simulations. We also propose a Bayesian version of our estimator, which we study from both theoretical and computational points of view.
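For orientation, the following display gives the standard definition of the quantity these MMD-based estimators build on (a textbook formulation in the spirit of Gretton et al., added here for context and not taken from the cited thesis). Given a positive-definite kernel k with reproducing kernel Hilbert space H and kernel mean embeddings \mu_P, \mu_Q of two distributions P and Q, the squared MMD is

\[
\mathrm{MMD}^2(P, Q) \;=\; \lVert \mu_P - \mu_Q \rVert_{\mathcal{H}}^2
\;=\; \mathbb{E}_{x, x' \sim P}\,[k(x, x')] \;-\; 2\,\mathbb{E}_{x \sim P,\, y \sim Q}\,[k(x, y)] \;+\; \mathbb{E}_{y, y' \sim Q}\,[k(y, y')].
\]

A minimum distance estimator in this sense selects the model parameter \theta minimizing \mathrm{MMD}(P_\theta, \hat{P}_n), where \hat{P}_n is the empirical distribution of the observed sample; since \mathrm{MMD}^2 admits an unbiased estimate built from pairwise kernel evaluations, the objective lends itself to stochastic gradient descent, as studied in the thesis above.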
Jia, Xiaodong. "Data Suitability Assessment and Enhancement for Machine Prognostics and Health Management Using Maximum Mean Discrepancy". University of Cincinnati / OhioLINK, 2018. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1544002523636343.
Oskarsson, Joel. "Probabilistic Regression using Conditional Generative Adversarial Networks". Thesis, Linköpings universitet, Statistik och maskininlärning, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-166637.
Yang, Qibo. "A Transfer Learning Methodology of Domain Generalization for Prognostics and Health Management". University of Cincinnati / OhioLINK, 2020. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1613749034966366.
Atta-Asiamah, Ernest. "Distributed Inference for Degenerate U-Statistics with Application to One and Two Sample Test". Diss., North Dakota State University, 2020. https://hdl.handle.net/10365/31777.
Mayo, Thomas Richard. "Machine learning for epigenetics : algorithms for next generation sequencing data". Thesis, University of Edinburgh, 2018. http://hdl.handle.net/1842/33055.
Ebert, Anthony C. "Dynamic queueing networks: Simulation, estimation and prediction". Thesis, Queensland University of Technology, 2020. https://eprints.qut.edu.au/180771/1/Anthony_Ebert_Thesis.pdf.
Rahman, Mohammad Mahfujur. "Deep domain adaptation and generalisation". Thesis, Queensland University of Technology, 2020. https://eprints.qut.edu.au/205619/1/Mohammad%20Mahfujur_Rahman_Thesis.pdf.
Pełny tekst źródłaGupta, Yash. "Model Extraction Defense using Modified Variational Autoencoder". Thesis, 2020. https://etd.iisc.ac.in/handle/2005/4430.
Diu, Michael. "Image Analysis Applications of the Maximum Mean Discrepancy Distance Measure". Thesis, 2013. http://hdl.handle.net/10012/7558.
Pełny tekst źródłaCzęści książek na temat "Maximum Mean Discrepancy (MMD)"
Slimene, Alya, and Ezzeddine Zagrouba. "Kernel Maximum Mean Discrepancy for Region Merging Approach". In Computer Analysis of Images and Patterns, 475–82. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-40246-3_59.
Diu, Michael, Mehrdad Gangeh, and Mohamed S. Kamel. "Unsupervised Visual Changepoint Detection Using Maximum Mean Discrepancy". In Lecture Notes in Computer Science, 336–45. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-39094-4_38.
Yang, Pengcheng, Fuli Luo, Shuangzhi Wu, Jingjing Xu, and Dongdong Zhang. "Learning Unsupervised Word Mapping via Maximum Mean Discrepancy". In Natural Language Processing and Chinese Computing, 290–302. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-32233-5_23.
Luna-Naranjo, D. F., J. V. Hurtado-Rincon, D. Cárdenas-Peña, V. H. Castro, H. F. Torres, and G. Castellanos-Dominguez. "EEG Channel Relevance Analysis Using Maximum Mean Discrepancy on BCI Systems". In Progress in Pattern Recognition, Image Analysis, Computer Vision, and Applications, 820–28. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-13469-3_95.
Zhu, Xiaofeng, Kim-Han Thung, Ehsan Adeli, Yu Zhang, and Dinggang Shen. "Maximum Mean Discrepancy Based Multiple Kernel Learning for Incomplete Multimodality Neuroimaging Data". In Medical Image Computing and Computer Assisted Intervention − MICCAI 2017, 72–80. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-66179-7_9.
Wickstrøm, Kristoffer, J. Emmanuel Johnson, Sigurd Løkse, Gustau Camps-Valls, Karl Øyvind Mikalsen, Michael Kampffmeyer, and Robert Jenssen. "The Kernelized Taylor Diagram". In Communications in Computer and Information Science, 125–31. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-17030-0_10.
Conference papers on the topic "Maximum Mean Discrepancy (MMD)"
Zhang, Wen, and Dongrui Wu. "Discriminative Joint Probability Maximum Mean Discrepancy (DJP-MMD) for Domain Adaptation". In 2020 International Joint Conference on Neural Networks (IJCNN). IEEE, 2020. http://dx.doi.org/10.1109/ijcnn48605.2020.9207365.
Xu, Zhiwei, Dapeng Li, Yunpeng Bai, and Guoliang Fan. "MMD-MIX: Value Function Factorisation with Maximum Mean Discrepancy for Cooperative Multi-Agent Reinforcement Learning". In 2021 International Joint Conference on Neural Networks (IJCNN). IEEE, 2021. http://dx.doi.org/10.1109/ijcnn52387.2021.9533636.
Liu, Qiao, and Hui Xue. "Adversarial Spectral Kernel Matching for Unsupervised Time Series Domain Adaptation". In Thirtieth International Joint Conference on Artificial Intelligence {IJCAI-21}. California: International Joint Conferences on Artificial Intelligence Organization, 2021. http://dx.doi.org/10.24963/ijcai.2021/378.
Li, Yanghao, Naiyan Wang, Jiaying Liu, and Xiaodi Hou. "Demystifying Neural Style Transfer". In Twenty-Sixth International Joint Conference on Artificial Intelligence. California: International Joint Conferences on Artificial Intelligence Organization, 2017. http://dx.doi.org/10.24963/ijcai.2017/310.
Qian, Sheng, Guanyue Li, Wen-Ming Cao, Cheng Liu, Si Wu, and Hau San Wong. "Improving representation learning in autoencoders via multidimensional interpolation and dual regularizations". In Twenty-Eighth International Joint Conference on Artificial Intelligence {IJCAI-19}. California: International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/453.
Kim, Beomjoon, and Joelle Pineau. "Maximum Mean Discrepancy Imitation Learning". In Robotics: Science and Systems 2013. Robotics: Science and Systems Foundation, 2013. http://dx.doi.org/10.15607/rss.2013.ix.038.
Cai, Mingzhi, Baoguo Wei, Yue Zhang, Xu Li, and Lixin Li. "Maximum Mean Discrepancy Adversarial Active Learning". In 2022 IEEE International Conference on Signal Processing, Communications and Computing (ICSPCC). IEEE, 2022. http://dx.doi.org/10.1109/icspcc55723.2022.9984505.
Zhang, Wei, Brian Barr, and John Paisley. "Understanding Counterfactual Generation using Maximum Mean Discrepancy". In ICAIF '22: 3rd ACM International Conference on AI in Finance. New York, NY, USA: ACM, 2022. http://dx.doi.org/10.1145/3533271.3561759.
Lin, Weiwei, Man-Wai Mak, Longxin Li, and Jen-Tzung Chien. "Reducing Domain Mismatch by Maximum Mean Discrepancy Based Autoencoders". In Odyssey 2018 The Speaker and Language Recognition Workshop. ISCA: ISCA, 2018. http://dx.doi.org/10.21437/odyssey.2018-23.
Tian, Yi, Qiuqi Ruan, and Gaoyun An. "Zero-shot Action Recognition via Empirical Maximum Mean Discrepancy". In 2018 14th IEEE International Conference on Signal Processing (ICSP). IEEE, 2018. http://dx.doi.org/10.1109/icsp.2018.8652306.