A selection of scholarly literature on the topic "Mini-Batch Optimization"
Format your source in APA, MLA, Chicago, Harvard, and other citation styles
Browse lists of current articles, books, dissertations, conference abstracts, and other scholarly sources on the topic "Mini-Batch Optimization".
Next to each work in the list of references there is an "Add to bibliography" button. Click it, and we will automatically generate a bibliographic reference to the selected work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of a publication in .pdf format and read its abstract online, where these are available in the source's metadata.
Journal articles on the topic "Mini-Batch Optimization"
Gultekin, San, Avishek Saha, Adwait Ratnaparkhi, and John Paisley. "MBA: Mini-Batch AUC Optimization." IEEE Transactions on Neural Networks and Learning Systems 31, no. 12 (December 2020): 5561–74. http://dx.doi.org/10.1109/tnnls.2020.2969527.
Feyzmahdavian, Hamid Reza, Arda Aytekin, and Mikael Johansson. "An Asynchronous Mini-Batch Algorithm for Regularized Stochastic Optimization." IEEE Transactions on Automatic Control 61, no. 12 (December 2016): 3740–54. http://dx.doi.org/10.1109/tac.2016.2525015.
Banerjee, Subhankar, and Shayok Chakraborty. "Deterministic Mini-batch Sequencing for Training Deep Neural Networks." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 8 (May 18, 2021): 6723–31. http://dx.doi.org/10.1609/aaai.v35i8.16831.
Simanungkalit, F. R. J., H. Hanifah, G. Ardaneswari, N. Hariadi, and B. D. Handari. "Prediction of students’ academic performance using ANN with mini-batch gradient descent and Levenberg-Marquardt optimization algorithms." Journal of Physics: Conference Series 2106, no. 1 (November 1, 2021): 012018. http://dx.doi.org/10.1088/1742-6596/2106/1/012018.
van Herwaarden, Dirk Philip, Christian Boehm, Michael Afanasiev, Solvi Thrastarson, Lion Krischer, Jeannot Trampert, and Andreas Fichtner. "Accelerated full-waveform inversion using dynamic mini-batches." Geophysical Journal International 221, no. 2 (February 21, 2020): 1427–38. http://dx.doi.org/10.1093/gji/ggaa079.
Ghadimi, Saeed, Guanghui Lan, and Hongchao Zhang. "Mini-batch stochastic approximation methods for nonconvex stochastic composite optimization." Mathematical Programming 155, no. 1–2 (December 11, 2014): 267–305. http://dx.doi.org/10.1007/s10107-014-0846-1.
Kervazo, C., T. Liaudat, and J. Bobin. "Faster and better sparse blind source separation through mini-batch optimization." Digital Signal Processing 106 (November 2020): 102827. http://dx.doi.org/10.1016/j.dsp.2020.102827.
Dimitriou, Neofytos, and Ognjen Arandjelović. "Sequential Normalization: Embracing Smaller Sample Sizes for Normalization." Information 13, no. 7 (July 12, 2022): 337. http://dx.doi.org/10.3390/info13070337.
Bakurov, Illya, Marco Buzzelli, Mauro Castelli, Leonardo Vanneschi, and Raimondo Schettini. "General Purpose Optimization Library (GPOL): A Flexible and Efficient Multi-Purpose Optimization Library in Python." Applied Sciences 11, no. 11 (May 23, 2021): 4774. http://dx.doi.org/10.3390/app11114774.
Li, Zhiyuan, Xun Jian, Yue Wang, Yingxia Shao, and Lei Chen. "DAHA: Accelerating GNN Training with Data and Hardware Aware Execution Planning." Proceedings of the VLDB Endowment 17, no. 6 (February 2024): 1364–76. http://dx.doi.org/10.14778/3648160.3648176.
Повний текст джерелаДисертації з теми "Mini-Batch Optimization"
Bensaid, Bilel. "Analyse et développement de nouveaux optimiseurs en Machine Learning." Electronic Thesis or Diss., Bordeaux, 2024. http://www.theses.fr/2024BORD0218.
Over the last few years, developing explainable and frugal artificial intelligence (AI) has become a fundamental challenge, especially when AI is used in safety-critical systems and demands ever more energy. The issue is made even more serious by the huge number of hyperparameters that must be tuned to make the models work. Among these parameters, the optimizer and its associated tunings appear to be the most important levers for improving these models [196]. This thesis focuses on the analysis of the learning process and optimizer for neural networks, identifying mathematical properties closely related to these two challenges. First, undesirable behaviors that prevent the design of explainable and frugal networks are identified. These behaviors are then explained using two tools: Lyapunov stability and geometric integrators. Numerical experiments show that stabilizing the learning process improves overall performance and allows the design of shallow networks. Theoretically, the suggested point of view makes it possible to derive convergence guarantees for classical Deep Learning optimizers. The same approach is valuable for mini-batch optimization, where unwelcome phenomena proliferate: the concept of a balanced splitting scheme becomes essential for understanding the learning process and improving its robustness. This study paves the way for the design of new adaptive optimizers that exploit the deep relationship between robust optimization and invariant-preserving schemes for dynamical systems.
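The abstract above, like many of the works listed here, builds on plain mini-batch stochastic gradient descent. For readers new to the topic, the sketch below shows the basic technique on a least-squares problem. It is a minimal illustration written for this overview, not code from the cited thesis or any other listed work; the function name, loss, step size, and batch size are all assumptions chosen for the example.

# Minimal mini-batch SGD sketch (illustrative only; not from any cited work).
# Minimizes the least-squares loss (1/2n)||Xw - y||^2 by stepping along the
# gradient of a randomly drawn mini-batch at each iteration.
import numpy as np

def mini_batch_sgd(X, y, batch_size=32, lr=0.1, epochs=50, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        perm = rng.permutation(n)                   # reshuffle once per epoch
        for start in range(0, n, batch_size):
            idx = perm[start:start + batch_size]    # indices of this mini-batch
            Xb, yb = X[idx], y[idx]
            grad = Xb.T @ (Xb @ w - yb) / len(idx)  # mini-batch gradient estimate
            w -= lr * grad
    return w

# Quick check on synthetic data: the estimate should approach w_true.
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 5))
w_true = np.arange(1.0, 6.0)
y = X @ w_true + 0.01 * rng.normal(size=1000)
print(mini_batch_sgd(X, y))

Each epoch reshuffles the data and averages the gradient over a small batch; the batch size controls the trade-off between per-step cost and gradient noise that several of the works listed here analyze.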
Book chapters on the topic "Mini-Batch Optimization"
Chauhan, Vinod Kumar. "Mini-batch Block-coordinate Newton Method." In Stochastic Optimization for Large-scale Machine Learning, 117–22. Boca Raton: CRC Press, 2021. http://dx.doi.org/10.1201/9781003240167-9.
Chauhan, Vinod Kumar. "Mini-batch and Block-coordinate Approach." In Stochastic Optimization for Large-scale Machine Learning, 51–66. Boca Raton: CRC Press, 2021. http://dx.doi.org/10.1201/9781003240167-5.
Franchini, Giorgia, Valeria Ruggiero, and Luca Zanni. "Steplength and Mini-batch Size Selection in Stochastic Gradient Methods." In Machine Learning, Optimization, and Data Science, 259–63. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-64580-9_22.
Panteleev, Andrei V., and Aleksandr V. Lobanov. "Application of Mini-Batch Adaptive Optimization Method in Stochastic Control Problems." In Advances in Theory and Practice of Computational Mechanics, 345–61. Singapore: Springer Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-8926-0_23.
Liu, Jie, and Martin Takáč. "Projected Semi-Stochastic Gradient Descent Method with Mini-Batch Scheme Under Weak Strong Convexity Assumption." In Modeling and Optimization: Theory and Applications, 95–117. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-66616-7_7.
Panteleev, A. V., and A. V. Lobanov. "Application of the Zero-Order Mini-Batch Optimization Method in the Tracking Control Problem." In SMART Automatics and Energy, 573–81. Singapore: Springer Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-8759-4_59.
Kim, Hee-Seung, Lingyi Zhang, Adam Bienkowski, Krishna R. Pattipati, David Sidoti, Yaakov Bar-Shalom, and David L. Kleinman. "Sequential Mini-Batch Noise Covariance Estimator." In Kalman Filter - Engineering Applications [Working Title]. IntechOpen, 2022. http://dx.doi.org/10.5772/intechopen.108917.
Surono, Sugiyarto, Aris Thobirin, Zani Anjani Rafsanjani Hsm, Asih Yuli Astuti, Berlin Ryan Kp, and Milla Oktavia. "Optimization of Fuzzy System Inference Model on Mini Batch Gradient Descent." In Frontiers in Artificial Intelligence and Applications. IOS Press, 2022. http://dx.doi.org/10.3233/faia220387.
Luo, Kangyang, Kunkun Zhang, Shengbo Zhang, Xiang Li, and Ming Gao. "Decentralized Local Updates with Dual-Slow Estimation and Momentum-Based Variance-Reduction for Non-Convex Optimization." In Frontiers in Artificial Intelligence and Applications. IOS Press, 2023. http://dx.doi.org/10.3233/faia230445.
Wu, Lilei, and Jie Liu. "Contrastive Learning with Diverse Samples." In Frontiers in Artificial Intelligence and Applications. IOS Press, 2023. http://dx.doi.org/10.3233/faia230575.
Повний текст джерелаТези доповідей конференцій з теми "Mini-Batch Optimization"
Li, Mu, Tong Zhang, Yuqiang Chen, and Alexander J. Smola. "Efficient mini-batch training for stochastic optimization." In KDD '14: The 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York, NY, USA: ACM, 2014. http://dx.doi.org/10.1145/2623330.2623612.
Feyzmahdavian, Hamid Reza, Arda Aytekin, and Mikael Johansson. "An asynchronous mini-batch algorithm for regularized stochastic optimization." In 2015 54th IEEE Conference on Decision and Control (CDC). IEEE, 2015. http://dx.doi.org/10.1109/cdc.2015.7402404.
Joseph, K. J., Vamshi Teja R, Krishnakant Singh, and Vineeth N. Balasubramanian. "Submodular Batch Selection for Training Deep Neural Networks." In Twenty-Eighth International Joint Conference on Artificial Intelligence {IJCAI-19}. California: International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/372.
Xie, Xin, Chao Chen, and Zhijian Chen. "Mini-batch Quasi-Newton optimization for Large Scale Linear Support Vector Regression." In 2015 4th International Conference on Mechatronics, Materials, Chemistry and Computer Engineering. Paris, France: Atlantis Press, 2015. http://dx.doi.org/10.2991/icmmcce-15.2015.503.
Naganuma, Hiroki, and Rio Yokota. "A Performance Improvement Approach for Second-Order Optimization in Large Mini-batch Training." In 2019 19th IEEE/ACM International Symposium on Cluster, Cloud and Grid Computing (CCGRID). IEEE, 2019. http://dx.doi.org/10.1109/ccgrid.2019.00092.
Yang, Hui, and Bangyu Wu. "Full waveform inversion based on mini-batch gradient descent optimization with geological constrain." In International Geophysical Conference, Qingdao, China, 17–20 April 2017. Society of Exploration Geophysicists and Chinese Petroleum Society, 2017. http://dx.doi.org/10.1190/igc2017-104.
Oktavia, Milla, and Sugiyarto Surono. "Optimization Takagi Sugeno Kang fuzzy system using mini-batch gradient descent with uniform regularization." In Proceedings of the 3rd Ahmad Dahlan International Conference on Mathematics and Mathematics Education 2021. AIP Publishing, 2023. http://dx.doi.org/10.1063/5.0140144.
Khan, Muhammad Waqas, Muhammad Zeeshan, and Muhammad Usman. "Traffic Scheduling Optimization in Cognitive Radio based Smart Grid Network Using Mini-Batch Gradient Descent Method." In 2019 14th Iberian Conference on Information Systems and Technologies (CISTI). IEEE, 2019. http://dx.doi.org/10.23919/cisti.2019.8760693.
Zheng, Feng, Xin Miao, and Heng Huang. "Fast Vehicle Identification in Surveillance via Ranked Semantic Sampling Based Embedding." In Twenty-Seventh International Joint Conference on Artificial Intelligence {IJCAI-18}. California: International Joint Conferences on Artificial Intelligence Organization, 2018. http://dx.doi.org/10.24963/ijcai.2018/514.
Zhang, Wenyu, Li Shen, Wanyue Zhang, and Chuan-Sheng Foo. "Few-Shot Adaptation of Pre-Trained Networks for Domain Shift." In Thirty-First International Joint Conference on Artificial Intelligence {IJCAI-22}. California: International Joint Conferences on Artificial Intelligence Organization, 2022. http://dx.doi.org/10.24963/ijcai.2022/232.