Academic literature on the topic "Mini-Batch Optimization"
Create an accurate citation in APA, MLA, Chicago, Harvard, and other styles
Consult the topical lists of articles, books, theses, conference papers, and other academic sources on the topic "Mini-Batch Optimization".
Next to each source in the list of references there is an "Add to bibliography" button. Press this button, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.
You can also download the full text of the academic publication as a pdf and read its abstract online whenever it is available in the metadata.
Journal articles on the topic "Mini-Batch Optimization"
Gultekin, San, Avishek Saha, Adwait Ratnaparkhi, and John Paisley. "MBA: Mini-Batch AUC Optimization". IEEE Transactions on Neural Networks and Learning Systems 31, no. 12 (December 2020): 5561–74. http://dx.doi.org/10.1109/tnnls.2020.2969527.
Feyzmahdavian, Hamid Reza, Arda Aytekin, and Mikael Johansson. "An Asynchronous Mini-Batch Algorithm for Regularized Stochastic Optimization". IEEE Transactions on Automatic Control 61, no. 12 (December 2016): 3740–54. http://dx.doi.org/10.1109/tac.2016.2525015.
Banerjee, Subhankar, and Shayok Chakraborty. "Deterministic Mini-batch Sequencing for Training Deep Neural Networks". Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 8 (May 18, 2021): 6723–31. http://dx.doi.org/10.1609/aaai.v35i8.16831.
Simanungkalit, F. R. J., H. Hanifah, G. Ardaneswari, N. Hariadi, and B. D. Handari. "Prediction of students' academic performance using ANN with mini-batch gradient descent and Levenberg-Marquardt optimization algorithms". Journal of Physics: Conference Series 2106, no. 1 (November 1, 2021): 012018. http://dx.doi.org/10.1088/1742-6596/2106/1/012018.
van Herwaarden, Dirk Philip, Christian Boehm, Michael Afanasiev, Solvi Thrastarson, Lion Krischer, Jeannot Trampert, and Andreas Fichtner. "Accelerated full-waveform inversion using dynamic mini-batches". Geophysical Journal International 221, no. 2 (February 21, 2020): 1427–38. http://dx.doi.org/10.1093/gji/ggaa079.
Ghadimi, Saeed, Guanghui Lan, and Hongchao Zhang. "Mini-batch stochastic approximation methods for nonconvex stochastic composite optimization". Mathematical Programming 155, no. 1-2 (December 11, 2014): 267–305. http://dx.doi.org/10.1007/s10107-014-0846-1.
Kervazo, C., T. Liaudat, and J. Bobin. "Faster and better sparse blind source separation through mini-batch optimization". Digital Signal Processing 106 (November 2020): 102827. http://dx.doi.org/10.1016/j.dsp.2020.102827.
Dimitriou, Neofytos, and Ognjen Arandjelović. "Sequential Normalization: Embracing Smaller Sample Sizes for Normalization". Information 13, no. 7 (July 12, 2022): 337. http://dx.doi.org/10.3390/info13070337.
Bakurov, Illya, Marco Buzzelli, Mauro Castelli, Leonardo Vanneschi, and Raimondo Schettini. "General Purpose Optimization Library (GPOL): A Flexible and Efficient Multi-Purpose Optimization Library in Python". Applied Sciences 11, no. 11 (May 23, 2021): 4774. http://dx.doi.org/10.3390/app11114774.
Li, Zhiyuan, Xun Jian, Yue Wang, Yingxia Shao, and Lei Chen. "DAHA: Accelerating GNN Training with Data and Hardware Aware Execution Planning". Proceedings of the VLDB Endowment 17, no. 6 (February 2024): 1364–76. http://dx.doi.org/10.14778/3648160.3648176.
Theses on the topic "Mini-Batch Optimization"
Bensaid, Bilel. "Analyse et développement de nouveaux optimiseurs en Machine Learning". Electronic Thesis or Diss., Bordeaux, 2024. http://www.theses.fr/2024BORD0218.
Over the last few years, developing an explainable and frugal artificial intelligence (AI) has become a fundamental challenge, especially when AI is used in safety-critical systems and demands ever more energy. The issue is all the more serious given the huge number of hyperparameters that must be tuned to make the models work. Among these parameters, the optimizer and its associated tunings appear to be the most important levers for improving these models [196]. This thesis focuses on the analysis of the learning process/optimizer for neural networks, identifying mathematical properties closely related to these two challenges. First, undesirable behaviors preventing the design of explainable and frugal networks are identified. These behaviors are then explained using two tools: Lyapunov stability and geometric integrators. Numerical experiments show that stabilizing the learning process improves overall performance and allows the design of shallow networks. Theoretically, the suggested point of view makes it possible to derive convergence guarantees for classical Deep Learning optimizers. The same approach is valuable for mini-batch optimization, where unwelcome phenomena proliferate: the concept of a balanced splitting scheme becomes essential for understanding the learning process and improving its robustness. This study paves the way for the design of new adaptive optimizers by exploiting the deep relation between robust optimization and invariant-preserving schemes for dynamical systems.
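As a minimal illustration of the setting these works share, the sketch below implements plain mini-batch gradient descent on a least-squares problem. It is a generic sketch of the technique only, not the method of any cited work; all names and hyperparameter values here are illustrative.

```python
import numpy as np

def mini_batch_gd(X, y, lr=0.1, batch_size=32, epochs=100, seed=0):
    """Mini-batch gradient descent for the least-squares loss
    f(w) = (1/2n) * ||X w - y||^2, using one pass over shuffled
    mini-batches per epoch."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        idx = rng.permutation(n)                 # reshuffle each epoch
        for start in range(0, n, batch_size):
            b = idx[start:start + batch_size]    # indices of this mini-batch
            # Gradient of the loss restricted to the sampled mini-batch
            grad = X[b].T @ (X[b] @ w - y[b]) / len(b)
            w -= lr * grad
    return w

# Usage: recover a known linear model from slightly noisy data
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.01 * rng.normal(size=500)
w_hat = mini_batch_gd(X, y)
```

The batch size trades per-step cost against gradient variance: `batch_size=n` reduces to full-batch gradient descent, `batch_size=1` to pure stochastic gradient descent.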
Book chapters on the topic "Mini-Batch Optimization"
Chauhan, Vinod Kumar. "Mini-batch Block-coordinate Newton Method". In Stochastic Optimization for Large-scale Machine Learning, 117–22. Boca Raton: CRC Press, 2021. http://dx.doi.org/10.1201/9781003240167-9.
Chauhan, Vinod Kumar. "Mini-batch and Block-coordinate Approach". In Stochastic Optimization for Large-scale Machine Learning, 51–66. Boca Raton: CRC Press, 2021. http://dx.doi.org/10.1201/9781003240167-5.
Franchini, Giorgia, Valeria Ruggiero, and Luca Zanni. "Steplength and Mini-batch Size Selection in Stochastic Gradient Methods". In Machine Learning, Optimization, and Data Science, 259–63. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-64580-9_22.
Panteleev, Andrei V., and Aleksandr V. Lobanov. "Application of Mini-Batch Adaptive Optimization Method in Stochastic Control Problems". In Advances in Theory and Practice of Computational Mechanics, 345–61. Singapore: Springer Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-8926-0_23.
Liu, Jie, and Martin Takáč. "Projected Semi-Stochastic Gradient Descent Method with Mini-Batch Scheme Under Weak Strong Convexity Assumption". In Modeling and Optimization: Theory and Applications, 95–117. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-66616-7_7.
Panteleev, A. V., and A. V. Lobanov. "Application of the Zero-Order Mini-Batch Optimization Method in the Tracking Control Problem". In SMART Automatics and Energy, 573–81. Singapore: Springer Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-8759-4_59.
Kim, Hee-Seung, Lingyi Zhang, Adam Bienkowski, Krishna R. Pattipati, David Sidoti, Yaakov Bar-Shalom, and David L. Kleinman. "Sequential Mini-Batch Noise Covariance Estimator". In Kalman Filter - Engineering Applications [Working Title]. IntechOpen, 2022. http://dx.doi.org/10.5772/intechopen.108917.
Surono, Sugiyarto, Aris Thobirin, Zani Anjani Rafsanjani Hsm, Asih Yuli Astuti, Berlin Ryan Kp, and Milla Oktavia. "Optimization of Fuzzy System Inference Model on Mini Batch Gradient Descent". In Frontiers in Artificial Intelligence and Applications. IOS Press, 2022. http://dx.doi.org/10.3233/faia220387.
Luo, Kangyang, Kunkun Zhang, Shengbo Zhang, Xiang Li, and Ming Gao. "Decentralized Local Updates with Dual-Slow Estimation and Momentum-Based Variance-Reduction for Non-Convex Optimization". In Frontiers in Artificial Intelligence and Applications. IOS Press, 2023. http://dx.doi.org/10.3233/faia230445.
Wu, Lilei, and Jie Liu. "Contrastive Learning with Diverse Samples". In Frontiers in Artificial Intelligence and Applications. IOS Press, 2023. http://dx.doi.org/10.3233/faia230575.
Texto completoActas de conferencias sobre el tema "Mini-Batch Optimization"
Li, Mu, Tong Zhang, Yuqiang Chen, and Alexander J. Smola. "Efficient mini-batch training for stochastic optimization". In KDD '14: The 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York, NY, USA: ACM, 2014. http://dx.doi.org/10.1145/2623330.2623612.
Feyzmahdavian, Hamid Reza, Arda Aytekin, and Mikael Johansson. "An asynchronous mini-batch algorithm for regularized stochastic optimization". In 2015 54th IEEE Conference on Decision and Control (CDC). IEEE, 2015. http://dx.doi.org/10.1109/cdc.2015.7402404.
Joseph, K. J., Vamshi Teja R, Krishnakant Singh, and Vineeth N. Balasubramanian. "Submodular Batch Selection for Training Deep Neural Networks". In Twenty-Eighth International Joint Conference on Artificial Intelligence {IJCAI-19}. California: International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/372.
Xie, Xin, Chao Chen, and Zhijian Chen. "Mini-batch Quasi-Newton optimization for Large Scale Linear Support Vector Regression". In 2015 4th International Conference on Mechatronics, Materials, Chemistry and Computer Engineering. Paris, France: Atlantis Press, 2015. http://dx.doi.org/10.2991/icmmcce-15.2015.503.
Naganuma, Hiroki, and Rio Yokota. "A Performance Improvement Approach for Second-Order Optimization in Large Mini-batch Training". In 2019 19th IEEE/ACM International Symposium on Cluster, Cloud and Grid Computing (CCGRID). IEEE, 2019. http://dx.doi.org/10.1109/ccgrid.2019.00092.
Yang, Hui, and Bangyu Wu. "Full waveform inversion based on mini-batch gradient descent optimization with geological constrain". In International Geophysical Conference, Qingdao, China, 17-20 April 2017. Society of Exploration Geophysicists and Chinese Petroleum Society, 2017. http://dx.doi.org/10.1190/igc2017-104.
Khan, Muhammad Waqas, Muhammad Zeeshan, and Muhammad Usman. "Traffic Scheduling Optimization in Cognitive Radio based Smart Grid Network Using Mini-Batch Gradient Descent Method". In 2019 14th Iberian Conference on Information Systems and Technologies (CISTI). IEEE, 2019. http://dx.doi.org/10.23919/cisti.2019.8760693.
Zheng, Feng, Xin Miao, and Heng Huang. "Fast Vehicle Identification in Surveillance via Ranked Semantic Sampling Based Embedding". In Twenty-Seventh International Joint Conference on Artificial Intelligence {IJCAI-18}. California: International Joint Conferences on Artificial Intelligence Organization, 2018. http://dx.doi.org/10.24963/ijcai.2018/514.
Zhang, Wenyu, Li Shen, Wanyue Zhang, and Chuan-Sheng Foo. "Few-Shot Adaptation of Pre-Trained Networks for Domain Shift". In Thirty-First International Joint Conference on Artificial Intelligence {IJCAI-22}. California: International Joint Conferences on Artificial Intelligence Organization, 2022. http://dx.doi.org/10.24963/ijcai.2022/232.