Journal articles on the topic "ReLU neural networks"
Format your source in APA, MLA, Chicago, Harvard, and other citation styles
Consult the top 50 journal articles for your research on the topic "ReLU neural networks".
Next to each work in the list of references there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the publication as a .pdf file and read its abstract online, when these are available in the metadata.
Browse journal articles from a wide range of disciplines and compile your bibliography correctly.
Liang, XingLong, and Jun Xu. "Biased ReLU neural networks." Neurocomputing 423 (January 2021): 71–79. http://dx.doi.org/10.1016/j.neucom.2020.09.050.
Huang, Changcun. "ReLU Networks Are Universal Approximators via Piecewise Linear or Constant Functions." Neural Computation 32, no. 11 (November 2020): 2249–78. http://dx.doi.org/10.1162/neco_a_01316.
Kulathunga, Nalinda, Nishath Rajiv Ranasinghe, Daniel Vrinceanu, Zackary Kinsman, Lei Huang, and Yunjiao Wang. "Effects of Nonlinearity and Network Architecture on the Performance of Supervised Neural Networks." Algorithms 14, no. 2 (February 5, 2021): 51. http://dx.doi.org/10.3390/a14020051.
Dung, D., V. K. Nguyen, and M. X. Thao. "ON COMPUTATION COMPLEXITY OF HIGH-DIMENSIONAL APPROXIMATION BY DEEP ReLU NEURAL NETWORKS." BULLETIN of L.N. Gumilyov Eurasian National University. MATHEMATICS. COMPUTER SCIENCE. MECHANICS Series 133, no. 4 (2020): 8–18. http://dx.doi.org/10.32523/2616-7182/2020-133-4-8-18.
Gühring, Ingo, Gitta Kutyniok, and Philipp Petersen. "Error bounds for approximations with deep ReLU neural networks in Ws,p norms." Analysis and Applications 18, no. 05 (September 19, 2019): 803–59. http://dx.doi.org/10.1142/s0219530519410021.
Dũng, Dinh, Van Kien Nguyen, and Mai Xuan Thao. "COMPUTATION COMPLEXITY OF DEEP RELU NEURAL NETWORKS IN HIGH-DIMENSIONAL APPROXIMATION." Journal of Computer Science and Cybernetics 37, no. 3 (September 28, 2021): 291–320. http://dx.doi.org/10.15625/1813-9663/37/3/15902.
Полковникова, Н. А., Е. В. Тузинкевич, and А. Н. Попов. "Application of convolutional neural networks for monitoring of marine objects." MORSKIE INTELLEKTUAL`NYE TEHNOLOGII, no. 4(50) (December 17, 2020): 53–61. http://dx.doi.org/10.37220/mit.2020.50.4.097.
Gao, Hongyang, Lei Cai, and Shuiwang Ji. "Adaptive Convolutional ReLUs." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (April 3, 2020): 3914–21. http://dx.doi.org/10.1609/aaai.v34i04.5805.
Petzka, Henning, Martin Trimmel, and Cristian Sminchisescu. "Notes on the Symmetries of 2-Layer ReLU-Networks." Proceedings of the Northern Lights Deep Learning Workshop 1 (February 6, 2020): 6. http://dx.doi.org/10.7557/18.5150.
Zheng, Shuxin, Qi Meng, Huishuai Zhang, Wei Chen, Nenghai Yu, and Tie-Yan Liu. "Capacity Control of ReLU Neural Networks by Basis-Path Norm." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 5925–32. http://dx.doi.org/10.1609/aaai.v33i01.33015925.
Liu, Bo, and Yi Liang. "Optimal function approximation with ReLU neural networks." Neurocomputing 435 (May 2021): 216–27. http://dx.doi.org/10.1016/j.neucom.2021.01.007.
Bodyanskiy, Yevgeniy, and Serhii Kostiuk. "Adaptive hybrid activation function for deep neural networks." System research and information technologies, no. 1 (April 25, 2022): 87–96. http://dx.doi.org/10.20535/srit.2308-8893.2022.1.07.
Moon, Sunghwan. "ReLU Network with Bounded Width Is a Universal Approximator in View of an Approximate Identity." Applied Sciences 11, no. 1 (January 4, 2021): 427. http://dx.doi.org/10.3390/app11010427.
Bai, Yuhan. "RELU-Function and Derived Function Review." SHS Web of Conferences 144 (2022): 02006. http://dx.doi.org/10.1051/shsconf/202214402006.
Abdeljawad, Ahmed, and Philipp Grohs. "Approximations with deep neural networks in Sobolev time-space." Analysis and Applications 20, no. 03 (May 2022): 499–541. http://dx.doi.org/10.1142/s0219530522500014.
He, Juncai. "Relu Deep Neural Networks and Linear Finite Elements." Journal of Computational Mathematics 38, no. 3 (June 2020): 502–27. http://dx.doi.org/10.4208/jcm.1901-m2018-0160.
Dũng, Dinh, and Van Kien Nguyen. "Deep ReLU neural networks in high-dimensional approximation." Neural Networks 142 (October 2021): 619–35. http://dx.doi.org/10.1016/j.neunet.2021.07.027.
Dureja, Aman, and Payal Pahwa. "Analysis of Non-Linear Activation Functions for Classification Tasks Using Convolutional Neural Networks." Recent Patents on Computer Science 12, no. 3 (May 8, 2019): 156–61. http://dx.doi.org/10.2174/2213275911666181025143029.
Li, Qunwei, Shaofeng Zou, and Wenliang Zhong. "Learning Graph Neural Networks with Approximate Gradient Descent." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 10 (May 18, 2021): 8438–46. http://dx.doi.org/10.1609/aaai.v35i10.17025.
Sun, Yichen, Mingli Dong, Mingxin Yu, Jiabin Xia, Xu Zhang, Yuchen Bai, Lidan Lu, and Lianqing Zhu. "Nonlinear All-Optical Diffractive Deep Neural Network with 10.6 μm Wavelength for Image Classification." International Journal of Optics 2021 (February 27, 2021): 1–16. http://dx.doi.org/10.1155/2021/6667495.
Thakur, Amey. "Fundamentals of Neural Networks." International Journal for Research in Applied Science and Engineering Technology 9, no. VIII (August 15, 2021): 407–26. http://dx.doi.org/10.22214/ijraset.2021.37362.
Botoeva, Elena, Panagiotis Kouvaros, Jan Kronqvist, Alessio Lomuscio, and Ruth Misener. "Efficient Verification of ReLU-Based Neural Networks via Dependency Analysis." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (April 3, 2020): 3291–99. http://dx.doi.org/10.1609/aaai.v34i04.5729.
Płaczek, Stanisław, and Aleksander Płaczek. "Learning algorithm analysis for deep neural network with ReLu activation functions." ITM Web of Conferences 19 (2018): 01009. http://dx.doi.org/10.1051/itmconf/20181901009.
He, Juncai, Lin Li, and Jinchao Xu. "ReLU deep neural networks from the hierarchical basis perspective." Computers & Mathematics with Applications 120 (August 2022): 105–14. http://dx.doi.org/10.1016/j.camwa.2022.06.006.
Dey, Santanu S., Guanyi Wang, and Yao Xie. "Approximation Algorithms for Training One-Node ReLU Neural Networks." IEEE Transactions on Signal Processing 68 (2020): 6696–706. http://dx.doi.org/10.1109/tsp.2020.3039360.
Liu, Wan-Wei, Fu Song, Tang-Hao-Ran Zhang, and Ji Wang. "Verifying ReLU Neural Networks from a Model Checking Perspective." Journal of Computer Science and Technology 35, no. 6 (November 2020): 1365–81. http://dx.doi.org/10.1007/s11390-020-0546-7.
Chieng, Hock Hung, Noorhaniza Wahid, Ong Pauline, and Sai Raj Kishore Perla. "Flatten-T Swish: a thresholded ReLU-Swish-like activation function for deep learning." International Journal of Advances in Intelligent Informatics 4, no. 2 (July 31, 2018): 76. http://dx.doi.org/10.26555/ijain.v4i2.249.
Salam, Abdulwahed, Abdelaaziz El Hibaoui, and Abdulgabbar Saif. "A comparison of activation functions in multilayer neural network for predicting the production and consumption of electricity power." International Journal of Electrical and Computer Engineering (IJECE) 11, no. 1 (February 1, 2021): 163. http://dx.doi.org/10.11591/ijece.v11i1.pp163-170.
Yuan, Xiaoyong, Zheng Feng, Matthew Norton, and Xiaolin Li. "Generalized Batch Normalization: Towards Accelerating Deep Neural Networks." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 1682–89. http://dx.doi.org/10.1609/aaai.v33i01.33011682.
Butt, F. M., L. Hussain, S. H. M. Jafri, K. J. Lone, M. Alajmi, I. Abunadi, F. N. Al-Wesabi, and M. A. Hamza. "Optimizing Parameters of Artificial Intelligence Deep Convolutional Neural Networks (CNN) to improve Prediction Performance of Load Forecasting System." IOP Conference Series: Earth and Environmental Science 1026, no. 1 (May 1, 2022): 012028. http://dx.doi.org/10.1088/1755-1315/1026/1/012028.
Lee, Seunghye, Qui X. Lieu, Thuc P. Vo, and Jaehong Lee. "Deep Neural Networks for Form-Finding of Tensegrity Structures." Mathematics 10, no. 11 (May 25, 2022): 1822. http://dx.doi.org/10.3390/math10111822.
Akintunde, Michael E., Andreea Kevorchian, Alessio Lomuscio, and Edoardo Pirovano. "Verification of RNN-Based Neural Agent-Environment Systems." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 6006–13. http://dx.doi.org/10.1609/aaai.v33i01.33016006.
Daróczy, Bálint. "Gaussian Perturbations in ReLU Networks and the Arrangement of Activation Regions." Mathematics 10, no. 7 (March 31, 2022): 1123. http://dx.doi.org/10.3390/math10071123.
Zheng, Jing, Shuaishuai Shen, Tianqi Jiang, and Weiqiang Zhu. "Deep neural networks design and analysis for automatic phase pickers from three-component microseismic recordings." Geophysical Journal International 220, no. 1 (November 4, 2019): 323–34. http://dx.doi.org/10.1093/gji/ggz441.
Katz, Justin, Iosif Pappas, Styliani Avraamidou, and Efstratios N. Pistikopoulos. "The Integration of Explicit MPC and ReLU based Neural Networks." IFAC-PapersOnLine 53, no. 2 (2020): 11350–55. http://dx.doi.org/10.1016/j.ifacol.2020.12.544.
Schmidt-Hieber, Johannes. "Nonparametric regression using deep neural networks with ReLU activation function." Annals of Statistics 48, no. 4 (August 2020): 1875–97. http://dx.doi.org/10.1214/19-aos1875.
Ohn, Ilsang, and Yongdai Kim. "Smooth Function Approximation by Deep Neural Networks with General Activation Functions." Entropy 21, no. 7 (June 26, 2019): 627. http://dx.doi.org/10.3390/e21070627.
Opschoor, Joost A. A., Philipp C. Petersen, and Christoph Schwab. "Deep ReLU networks and high-order finite element methods." Analysis and Applications 18, no. 05 (February 21, 2020): 715–70. http://dx.doi.org/10.1142/s0219530519410136.
Ryffel, Théo, Pierre Tholoniat, David Pointcheval, and Francis Bach. "AriaNN: Low-Interaction Privacy-Preserving Deep Learning via Function Secret Sharing." Proceedings on Privacy Enhancing Technologies 2022, no. 1 (November 20, 2021): 291–316. http://dx.doi.org/10.2478/popets-2022-0015.
Boopathy, Akhilan, Tsui-Wei Weng, Pin-Yu Chen, Sijia Liu, and Luca Daniel. "CNN-Cert: An Efficient Framework for Certifying Robustness of Convolutional Neural Networks." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 3240–47. http://dx.doi.org/10.1609/aaai.v33i01.33013240.
Hatano, Naoya, Masahiro Ikeda, Isao Ishikawa, and Yoshihiro Sawano. "A Global Universality of Two-Layer Neural Networks with ReLU Activations." Journal of Function Spaces 2021 (September 17, 2021): 1–3. http://dx.doi.org/10.1155/2021/6637220.
Herrera, Oscar, and Belém Priego. "Wavelets as activation functions in Neural Networks." Journal of Intelligent & Fuzzy Systems 42, no. 5 (March 31, 2022): 4345–55. http://dx.doi.org/10.3233/jifs-219225.
Ali, Mahmoud Emad Aldin, and Dinesh Kumar. "The Impact of Optimization Algorithms on The Performance of Face Recognition Neural Networks." Journal of Advanced Engineering and Computation 6, no. 4 (December 31, 2022): 248. http://dx.doi.org/10.55579/jaec.202264.370.
Yan, Zhiqi, Shisheng Zhong, Lin Lin, and Zhiquan Cui. "Adaptive Levenberg–Marquardt Algorithm: A New Optimization Strategy for Levenberg–Marquardt Neural Networks." Mathematics 9, no. 17 (September 6, 2021): 2176. http://dx.doi.org/10.3390/math9172176.
Veness, Joel, Tor Lattimore, David Budden, Avishkar Bhoopchand, Christopher Mattern, Agnieszka Grabska-Barwinska, Eren Sezener, et al. "Gated Linear Networks." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 11 (May 18, 2021): 10015–23. http://dx.doi.org/10.1609/aaai.v35i11.17202.
Inoue, Kenta. "Expressive Numbers of Two or More Hidden Layer ReLU Neural Networks." International Journal of Networking and Computing 10, no. 2 (2020): 293–307. http://dx.doi.org/10.15803/ijnc.10.2_293.
Petersen, Philipp, and Felix Voigtlaender. "Optimal approximation of piecewise smooth functions using deep ReLU neural networks." Neural Networks 108 (December 2018): 296–330. http://dx.doi.org/10.1016/j.neunet.2018.08.019.
Oostwal, Elisa, Michiel Straat, and Michael Biehl. "Hidden unit specialization in layered neural networks: ReLU vs. sigmoidal activation." Physica A: Statistical Mechanics and its Applications 564 (February 2021): 125517. http://dx.doi.org/10.1016/j.physa.2020.125517.
Schmidt-Hieber, Johannes. "Rejoinder: “Nonparametric regression using deep neural networks with ReLU activation function”." Annals of Statistics 48, no. 4 (August 2020): 1916–21. http://dx.doi.org/10.1214/19-aos1931.
Velasco, Lemuel Clark, John Frail Bongat, Ched Castillon, Jezreil Laurente, and Emily Tabanao. "Days-ahead water level forecasting using artificial neural networks for watersheds." Mathematical Biosciences and Engineering 20, no. 1 (2022): 758–74. http://dx.doi.org/10.3934/mbe.2023035.