Ready-made bibliography on "Binary neural networks (BNN)"
Create accurate references in APA, MLA, Chicago, Harvard and many other styles
Browse lists of current articles, books, dissertations, abstracts and other scholarly sources on "Binary neural networks (BNN)".
An "Add to bibliography" button is available next to each work in the listing. Use it and we will automatically generate a bibliographic reference to the selected work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of a publication as a ".pdf" file and read its abstract online, when these are available in the metadata.
Journal articles on "Binary neural networks (BNN)"
Rozen, Tal, Moshe Kimhi, Brian Chmiel, Avi Mendelson, and Chaim Baskin. "Bimodal-Distributed Binarized Neural Networks". Mathematics 10, no. 21 (November 3, 2022): 4107. http://dx.doi.org/10.3390/math10214107.
Cho, Jaechan, Yongchul Jung, Seongjoo Lee, and Yunho Jung. "Reconfigurable Binary Neural Network Accelerator with Adaptive Parallelism Scheme". Electronics 10, no. 3 (January 20, 2021): 230. http://dx.doi.org/10.3390/electronics10030230.
Sunny, Febin P., Asif Mirza, Mahdi Nikdast, and Sudeep Pasricha. "ROBIN: A Robust Optical Binary Neural Network Accelerator". ACM Transactions on Embedded Computing Systems 20, no. 5s (October 31, 2021): 1–24. http://dx.doi.org/10.1145/3476988.
Simons, Taylor, and Dah-Jye Lee. "A Review of Binarized Neural Networks". Electronics 8, no. 6 (June 12, 2019): 661. http://dx.doi.org/10.3390/electronics8060661.
Wang, Peisong, Xiangyu He, Gang Li, Tianli Zhao, and Jian Cheng. "Sparsity-Inducing Binarized Neural Networks". Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 07 (April 3, 2020): 12192–99. http://dx.doi.org/10.1609/aaai.v34i07.6900.
Liu, Chunlei, Peng Chen, Bohan Zhuang, Chunhua Shen, Baochang Zhang, and Wenrui Ding. "SA-BNN: State-Aware Binary Neural Network". Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 3 (May 18, 2021): 2091–99. http://dx.doi.org/10.1609/aaai.v35i3.16306.
Zhao, Yiyang, Yongjia Wang, Ruibo Wang, Yuan Rong, and Xianyang Jiang. "A Highly Robust Binary Neural Network Inference Accelerator Based on Binary Memristors". Electronics 10, no. 21 (October 25, 2021): 2600. http://dx.doi.org/10.3390/electronics10212600.
Xiang, Maoyang, and Tee Hui Teo. "Implementation of Binarized Neural Networks in All-Programmable System-on-Chip Platforms". Electronics 11, no. 4 (February 21, 2022): 663. http://dx.doi.org/10.3390/electronics11040663.
Zhang, Longlong, Xuebin Tang, Xiang Hu, Tong Zhou, and Yuanxi Peng. "FPGA-Based BNN Architecture in Time Domain with Low Storage and Power Consumption". Electronics 11, no. 9 (April 28, 2022): 1421. http://dx.doi.org/10.3390/electronics11091421.
Kim, HyunJin, Mohammed Alnemari, and Nader Bagherzadeh. "A storage-efficient ensemble classification using filter sharing on binarized convolutional neural networks". PeerJ Computer Science 8 (March 29, 2022): e924. http://dx.doi.org/10.7717/peerj-cs.924.
Pełny tekst źródłaRozprawy doktorskie na temat "Binary neural networks (BNN)"
Simons, Taylor Scott. "High-Speed Image Classification for Resource-Limited Systems Using Binary Values". BYU ScholarsArchive, 2021. https://scholarsarchive.byu.edu/etd/9097.
Braga, Antônio de Pádua. "Design models for recursive binary neural networks". Thesis, Imperial College London, 1995. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.336442.
Redkar, Shrutika. "Deep Learning Binary Neural Network on an FPGA". Digital WPI, 2017. https://digitalcommons.wpi.edu/etd-theses/407.
Ezzadeen, Mona. "Conception d'un circuit dédié au calcul dans la mémoire à base de technologie 3D innovante". Electronic Thesis or Diss., Aix-Marseille, 2022. http://theses.univ-amu.fr.lama.univ-amu.fr/221212_EZZADEEN_955e754k888gvxorp699jljcho_TH.pdf.
Pełny tekst źródłaWith the advent of edge devices and artificial intelligence, the data deluge is a reality, making energy-efficient computing systems a must-have. Unfortunately, classical von Neumann architectures suffer from the high cost of data transfers between memories and processing units. At the same time, CMOS scaling seems more and more challenging and costly to afford, limiting the chips' performance due to power consumption issues.In this context, bringing the computation directly inside or near memories (I/NMC) seems an appealing solution. However, data-centric applications require an important amount of non-volatile storage, and modern Flash memories suffer from scaling issues and are not very suited for I/NMC. On the other hand, emerging memory technologies such as ReRAM present very appealing memory performances, good scalability, and interesting I/NMC features. However, they suffer from variability issues and from a degraded density integration if an access transistor per bitcell (1T1R) is used to limit the sneak-path currents. This thesis work aims to overcome these two challenges. First, the variability impact on read and I/NMC operations is assessed and new robust and low-overhead ReRAM-based boolean operations are proposed. In the context of neural networks, new ReRAM-based neuromorphic accelerators are developed and characterized, with an emphasis on good robustness against variability, good parallelism, and high energy efficiency. Second, to resolve the density integration issues, an ultra-dense 3D 1T1R ReRAM-based Cube and its architecture are proposed, which can be used as a 3D NOR memory as well as a low overhead and energy-efficient I/NMC accelerator
Kennedy, John V. "The design of a scalable and application independent platform for binary neural networks". Thesis, University of York, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.323503.
Li, Guo. "Neural network for optimization of binary computer-generated hologram with printing model". Online version of thesis, 1995. http://hdl.handle.net/1850/12234.
Medvedieva, S. O., I. V. Bogach, V. A. Kovenko, С. О. Медведєва, І. В. Богач, and В. А. Ковенко. "Neural networks in Machine learning". Thesis, ВНТУ, 2019. http://ir.lib.vntu.edu.ua//handle/123456789/24788.
Pełny tekst źródłaThe paper covers the basic principles of Neural Networks’ work. Special attention is paid to Frank Rosenblatt’s model of the network called “perceptron”. In addition, the article touches upon the main programming languages used to write software for Neural Networks.
Wilson, Brittany Michelle. "Evaluating and Improving the SEU Reliability of Artificial Neural Networks Implemented in SRAM-Based FPGAs with TMR". BYU ScholarsArchive, 2020. https://scholarsarchive.byu.edu/etd/8619.
Mealey, Thomas C. "Binary Recurrent Unit: Using FPGA Hardware to Accelerate Inference in Long Short-Term Memory Neural Networks". University of Dayton / OhioLINK, 2018. http://rave.ohiolink.edu/etdc/view?acc_num=dayton1524402925375566.
Strandberg, Rickard, and Johan Låås. "A comparison between Neural networks, Lasso regularized Logistic regression, and Gradient boosted trees in modeling binary sales". Thesis, KTH, Optimeringslära och systemteori, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-252556.
Pełny tekst źródłaDet primära syftet med denna uppsats är att förutsäga huruvida en kund kommer köpa en specifik produkt eller ej. Den historiska datan tillhandahålls av den Nordiska internet-baserade IT-försäljaren Dustin. Det sekundära syftet med uppsatsen är att evaluera hur väl ett djupt neuralt nätverk presterar jämfört med Lasso regulariserad logistisk regression och gradient boostade träd (GXBoost). Denna uppsats fann att XGBoost presterade bättre än de två andra metoderna i såväl träffsäkerhet, som i hastighet.
Books on "Binary neural networks (BNN)"
Baram, Yoram. Orthogonal patterns in binary neural networks. Moffett Field, Calif.: National Aeronautics and Space Administration, Ames Research Center, 1988.
Aizenberg, Igor N. Multi-Valued and Universal Binary Neurons: Theory, Learning and Applications. Boston, MA: Springer US, 2000.
Aizenberg, Igor, Naum N. Aizenberg, and Joos P. L. Vandewalle. Multi-Valued and Universal Binary Neurons: Theory, Learning and Applications. Springer, 2000.
Li, Jian, Antonio De Maio, Guolong Cui, and Alfonso Farina. Radar Waveform Design Based on Optimization Theory. Institution of Engineering & Technology, 2020.
Radar Waveform Design Based on Optimization Theory. Institution of Engineering & Technology, 2020.
Book chapters on "Binary neural networks (BNN)"
Zhang, Yedi, Zhe Zhao, Guangke Chen, Fu Song, and Taolue Chen. "BDD4BNN: A BDD-Based Quantitative Analysis Framework for Binarized Neural Networks". In Computer Aided Verification, 175–200. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-81685-8_8.
Myojin, Tomoyuki, Shintaro Hashimoto, and Naoki Ishihama. "Detecting Uncertain BNN Outputs on FPGA Using Monte Carlo Dropout Sampling". In Artificial Neural Networks and Machine Learning – ICANN 2020, 27–38. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-61616-8_3.
Hodge, Victoria J., and Jim Austin. "A Novel Binary Spell Checker". In Artificial Neural Networks — ICANN 2001, 1199–204. Berlin, Heidelberg: Springer Berlin Heidelberg, 2001. http://dx.doi.org/10.1007/3-540-44668-0_167.
Karandashev, Yakov, and Boris Kryzhanovsky. "Binary Minimization: Increasing the Attraction Area of the Global Minimum in the Binary Optimization Problem". In Artificial Neural Networks – ICANN 2010, 525–30. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-15822-3_64.
Skubiszewski, Marcin. "A Hardware Emulator for Binary Neural Networks". In International Neural Network Conference, 555–58. Dordrecht: Springer Netherlands, 1990. http://dx.doi.org/10.1007/978-94-009-0643-3_2.
Makita, Kazuma, Takahiro Ozawa, and Toshimichi Saito. "Basic Analysis of Cellular Dynamic Binary Neural Networks". In Neural Information Processing, 779–86. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-70090-8_79.
Anzai, Shota, Seitaro Koyama, Shunsuke Aoki, and Toshimichi Saito. "Sparse Dynamic Binary Neural Networks for Storage and Switching of Binary Periodic Orbits". In Neural Information Processing, 536–42. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-36711-4_45.
Cotofana, Sorin, and Stamatis Vassiliadis. "Serial binary addition with polynomially bounded weights". In Artificial Neural Networks — ICANN 96, 741–46. Berlin, Heidelberg: Springer Berlin Heidelberg, 1996. http://dx.doi.org/10.1007/3-540-61510-5_125.
Kanerva, Pentti. "Binary spatter-coding of ordered K-tuples". In Artificial Neural Networks — ICANN 96, 869–73. Berlin, Heidelberg: Springer Berlin Heidelberg, 1996. http://dx.doi.org/10.1007/3-540-61510-5_146.
Tu, Zhijun, Xinghao Chen, Pengju Ren, and Yunhe Wang. "AdaBin: Improving Binary Neural Networks with Adaptive Binary Sets". In Lecture Notes in Computer Science, 379–95. Cham: Springer Nature Switzerland, 2022. http://dx.doi.org/10.1007/978-3-031-20083-0_23.
Conference papers on "Binary neural networks (BNN)"
Zhao, Junhe, Linlin Yang, Baochang Zhang, Guodong Guo, and David Doermann. "Uncertainty-aware Binary Neural Networks". In Thirtieth International Joint Conference on Artificial Intelligence {IJCAI-21}. California: International Joint Conferences on Artificial Intelligence Organization, 2021. http://dx.doi.org/10.24963/ijcai.2021/474.
Chen, Tianlong, Zhenyu Zhang, Xu Ouyang, Zechun Liu, Zhiqiang Shen, and Zhangyang Wang. "“BNN - BN = ?”: Training Binary Neural Networks without Batch Normalization". In 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW). IEEE, 2021. http://dx.doi.org/10.1109/cvprw53098.2021.00520.
Li, Yixing, and Fengbo Ren. "BNN Pruning: Pruning Binary Neural Network Guided by Weight Flipping Frequency". In 2020 21st International Symposium on Quality Electronic Design (ISQED). IEEE, 2020. http://dx.doi.org/10.1109/isqed48828.2020.9136977.
Suarez-Ramirez, Cuauhtemoc, Miguel Gonzalez-Mendoza, Leonardo Chang, Gilberto Ochoa-Ruiz, and Mario Duran-Vega. "A Bop and Beyond: A Second Order Optimizer for Binarized Neural Networks". In LatinX in AI at Computer Vision and Pattern Recognition Conference 2021. Journal of LatinX in AI Research, 2021. http://dx.doi.org/10.52591/lxai202106255.
Say, Buser, and Scott Sanner. "Planning in Factored State and Action Spaces with Learned Binarized Neural Network Transition Models". In Twenty-Seventh International Joint Conference on Artificial Intelligence {IJCAI-18}. California: International Joint Conferences on Artificial Intelligence Organization, 2018. http://dx.doi.org/10.24963/ijcai.2018/669.
Cardelli, Luca, Marta Kwiatkowska, Luca Laurenti, Nicola Paoletti, Andrea Patane, and Matthew Wicker. "Statistical Guarantees for the Robustness of Bayesian Neural Networks". In Twenty-Eighth International Joint Conference on Artificial Intelligence {IJCAI-19}. California: International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/789.
Liu, Haihui, Yuang Zhang, Xiangwei Zheng, Mengxin Yu, Baoxu An, and Xiaojie Liu. "PC-BNA: Parallel Convolution Binary Neural Network Accelerator". In 2022 IEEE 24th Int Conf on High Performance Computing & Communications; 8th Int Conf on Data Science & Systems; 20th Int Conf on Smart City; 8th Int Conf on Dependability in Sensor, Cloud & Big Data Systems & Application (HPCC/DSS/SmartCity/DependSys). IEEE, 2022. http://dx.doi.org/10.1109/hpcc-dss-smartcity-dependsys57074.2022.00169.
Magnitskii, N. A., and A. A. Mikhailov. "Binary neural networks". In Optical Information Science and Technology, edited by Andrei L. Mikaelian. SPIE, 1998. http://dx.doi.org/10.1117/12.304957.
Heilala, Janne, Paavo Nevalainen, and Kristiina Toivonen. "AI-based sentiment analysis approaches for large-scale data domains of public and security interests". In 14th International Conference on Applied Human Factors and Ergonomics (AHFE 2023). AHFE International, 2023. http://dx.doi.org/10.54941/ahfe1003738.
Huang, Kuan-Yu, Jettae Schroff, Cheng-Di Tsai, and Tsung-Chu Huang. "2DAN-BNN: Two-Dimensional AN-Code Decoders for Binarized Neural Networks". In 2022 IEEE International Conference on Consumer Electronics - Taiwan. IEEE, 2022. http://dx.doi.org/10.1109/icce-taiwan55306.2022.9868989.
Pełny tekst źródłaRaporty organizacyjne na temat "Binary neural networks (BNN)"
Farhi, Edward, i Hartmut Neven. Classification with Quantum Neural Networks on Near Term Processors. Web of Open Science, grudzień 2020. http://dx.doi.org/10.37686/qrl.v1i2.80.
Pełny tekst źródłaDenysenko, Oleksii. Solving binary classification problem by means of convolutional neural networks with the use of TensorFlow framework. Intellectual Archive, marzec 2019. http://dx.doi.org/10.32370/online/2019_03_25_1.