A selection of scholarly literature on the topic "Supervised neural network"
Format your source in APA, MLA, Chicago, Harvard, and other citation styles
Consult lists of relevant articles, books, theses, conference papers, and other scholarly sources on the topic "Supervised neural network."
Next to each work in the bibliography there is an "Add to bibliography" button. Click it, and we will automatically generate a bibliographic reference to the chosen work in your preferred citation style: APA, MLA, Harvard, Chicago, Vancouver, and others.
You can also download the full text of the scholarly publication in .pdf format and read its abstract online, whenever these are available in the metadata.
Journal articles on the topic "Supervised neural network"
Tian, Jidong, Yitian Li, Wenqing Chen, Liqiang Xiao, Hao He, and Yaohui Jin. "Weakly Supervised Neural Symbolic Learning for Cognitive Tasks." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 5 (June 28, 2022): 5888–96. http://dx.doi.org/10.1609/aaai.v36i5.20533.
Ito, Toshio. "Supervised Learning Methods of Bilinear Neural Network Systems Using Discrete Data." International Journal of Machine Learning and Computing 6, no. 5 (October 2016): 235–40. http://dx.doi.org/10.18178/ijmlc.2016.6.5.604.
Verma, Vikas, Meng Qu, Kenji Kawaguchi, Alex Lamb, Yoshua Bengio, Juho Kannala, and Jian Tang. "GraphMix: Improved Training of GNNs for Semi-Supervised Learning." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 11 (May 18, 2021): 10024–32. http://dx.doi.org/10.1609/aaai.v35i11.17203.
Hu, Jinghan. "Semi-supervised Blindness Detection with Neural Network Ensemble." Highlights in Science, Engineering and Technology 12 (August 26, 2022): 171–76. http://dx.doi.org/10.54097/hset.v12i.1448.
Hindarto, Djarot, and Handri Santoso. "Performance Comparison of Supervised Learning Using Non-Neural Network and Neural Network." Jurnal Nasional Pendidikan Teknik Informatika (JANAPATI) 11, no. 1 (April 6, 2022): 49. http://dx.doi.org/10.23887/janapati.v11i1.40768.
Liu, Chenghua, Zhuolin Liao, Yixuan Ma, and Kun Zhan. "Stationary Diffusion State Neural Estimation for Multiview Clustering." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 7 (June 28, 2022): 7542–49. http://dx.doi.org/10.1609/aaai.v36i7.20719.
Cho, Myoung Won. "Supervised learning in a spiking neural network." Journal of the Korean Physical Society 79, no. 3 (July 20, 2021): 328–35. http://dx.doi.org/10.1007/s40042-021-00254-4.
Wang, Juexin, Anjun Ma, Qin Ma, Dong Xu, and Trupti Joshi. "Inductive inference of gene regulatory network using supervised and semi-supervised graph neural networks." Computational and Structural Biotechnology Journal 18 (2020): 3335–43. http://dx.doi.org/10.1016/j.csbj.2020.10.022.
Nobukawa, Sou, Haruhiko Nishimura, and Teruya Yamanishi. "Pattern Classification by Spiking Neural Networks Combining Self-Organized and Reward-Related Spike-Timing-Dependent Plasticity." Journal of Artificial Intelligence and Soft Computing Research 9, no. 4 (October 1, 2019): 283–91. http://dx.doi.org/10.2478/jaiscr-2019-0009.
Zhao, Shijie, Yan Cui, Linwei Huang, Li Xie, Yaowu Chen, Junwei Han, Lei Guo, Shu Zhang, Tianming Liu, and Jinglei Lv. "Supervised Brain Network Learning Based on Deep Recurrent Neural Networks." IEEE Access 8 (2020): 69967–78. http://dx.doi.org/10.1109/access.2020.2984948.
Повний текст джерелаДисертації з теми "Supervised neural network"
Tran, Khanh-Hung. "Semi-supervised dictionary learning and Semi-supervised deep neural network." Thesis, université Paris-Saclay, 2021. http://www.theses.fr/2021UPASP014.
Since the 2010s, machine learning (ML) has been one of the topics attracting the most attention from researchers. Many ML models have demonstrated their ability to produce excellent results in fields such as computer vision, natural language processing, and robotics. However, most of these models rely on supervised learning, which requires massive annotation effort. The objective of this thesis is therefore to study and propose semi-supervised learning approaches, which have many advantages over purely supervised learning. Instead of directly applying a semi-supervised classifier to the original representation of the data, we use models that integrate a representation learning stage before the classification stage, so as to better adapt to the non-linearity of the data. In the first part, we revisit the tools needed to build our semi-supervised models. We present two types of model that include representation learning in their architecture, dictionary learning and neural networks, together with the optimization methods for each, and, in the case of neural networks, we describe the problem of adversarial examples. We then present techniques that often accompany semi-supervised learning, such as manifold learning and pseudo-labeling. In the second part, we work on dictionary learning. We distill three general steps for building a semi-supervised model from a supervised one, and we propose a semi-supervised model for classification with a low number of training samples (including both labelled and unlabelled samples). On the one hand, we preserve the data structure from the original space in the sparse-code space (manifold learning), which acts as a regularizer on the sparse codes. On the other hand, we integrate a semi-supervised classifier in the sparse-code space.
In addition, we perform sparse coding for test samples while also taking the preservation of the data structure into account, which improves accuracy over existing methods. In the third part, we work on neural network models. We propose an approach called "manifold attack" that reinforces manifold learning. Inspired by adversarial learning, it alternates two steps: find virtual points that disrupt the manifold-learning cost function (by maximizing it) while the model parameters are fixed; then update the model parameters by minimizing this cost function while the virtual points are fixed. We also provide criteria for limiting the space in which the virtual points lie and a method for initializing them. This approach improves not only the accuracy rate but also robustness to adversarial examples. Finally, we analyze the similarities and differences, as well as the advantages and disadvantages, of dictionary learning and neural network models, and we propose perspectives for both. For semi-supervised dictionary learning, we suggest techniques inspired by neural network models; for neural networks, we propose integrating manifold attack into generative models.
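The alternating min-max scheme in this abstract can be sketched on a toy problem. Everything below is an assumption made for illustration rather than code from the thesis: a one-parameter linear model, a quadratic smoothness penalty as the "manifold" cost, a grid search over an eps-ball around each anchor for the inner maximization, and a numerical gradient for the outer minimization.

```python
# Illustrative sketch of the "manifold attack" alternation described above.
# All modelling choices here (linear model, quadratic smoothness penalty,
# grid-search inner step) are assumptions for the toy example.

def f(theta, x):
    return theta * x  # one-parameter linear model

def total_cost(theta, data, virtual, lam=0.1):
    # supervised loss on labelled points plus a smoothness (manifold)
    # penalty tying each virtual point's output to its anchor's output
    sup = sum((f(theta, x) - y) ** 2 for x, y in data)
    man = sum((f(theta, v) - f(theta, x)) ** 2
              for (x, _), v in zip(data, virtual))
    return sup + lam * man

def attack_step(theta, data, eps=0.5):
    # inner maximisation: keep theta fixed, pick each virtual point inside
    # an eps-ball around its anchor so the penalty term is largest
    virtual = []
    for x, _ in data:
        cands = [x - eps, x, x + eps]
        virtual.append(max(cands, key=lambda v: (f(theta, v) - f(theta, x)) ** 2))
    return virtual

def train(data, steps=200, lr=0.01):
    theta, h = 0.0, 1e-5
    for _ in range(steps):
        virtual = attack_step(theta, data)      # virtual points then fixed
        grad = (total_cost(theta + h, data, virtual)
                - total_cost(theta - h, data, virtual)) / (2 * h)
        theta -= lr * grad                      # outer minimisation on theta
    return theta

data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]     # roughly y = 2x
print(round(train(data), 2))                    # → 2.02
```

On this toy data the smoothness penalty pulls the fitted slope slightly below the plain least-squares value, which is the qualitative effect the thesis exploits at much larger scale.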
Morns, Ian Philip. "The novel dynamic supervised forward propagation neural network for handwritten character recognition." Thesis, University of Newcastle Upon Tyne, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.285741.
Syrén Grönfelt, Natalie. "Pretraining a Neural Network for Hyperspectral Images Using Self-Supervised Contrastive Learning." Thesis, Linköpings universitet, Datorseende, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-179122.
Bylund, Andreas, Anton Erikssen, and Drazen Mazalica. "Hyperparameters impact in a convolutional neural network." Thesis, Högskolan i Skövde, Institutionen för informationsteknologi, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-18670.
Schembri, Massimo. "Anomaly Prediction in Production Supercomputer with Convolution and Semi-supervised autoencoder." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2021. http://amslaurea.unibo.it/22379/.
Guo, Lilin. "A Biologically Plausible Supervised Learning Method for Spiking Neurons with Real-world Applications." FIU Digital Commons, 2016. http://digitalcommons.fiu.edu/etd/2982.
Hansen Vedal, Amund. "Comparing performance of convolutional neural network models on a novel car classification task." Thesis, KTH, Medieteknik och interaktionsdesign, MID, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-213468.
Повний текст джерелаNya neurala nätverksframsteg har lett till modeller som kan användas för en mängd olika bildklasseringsuppgifter, och är därför användbara många av dagens medietekniska applikationer. I detta projektet tränar jag moderna neurala nätverksarkitekturer på en nyuppsamlad bilbild-datasats för att göra både grov- och finkornad klassificering av fordonstyp. Resultaten visar att neurala nätverk kan lära sig att skilja mellan många mycket olika bilklasser, och även mellan några mycket liknande klasser. Mina bästa modeller nådde 50,8% träffsäkerhet vid 28 klasser och 61,5% på de mest utmanande 5, trots brusiga bilder och manuell klassificering av datasetet.
Karlsson, Erik, and Gilbert Nordhammar. "Naive semi-supervised deep learning med sammansättning av pseudo-klassificerare." Thesis, Högskolan i Skövde, Institutionen för informationsteknologi, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-17177.
Flores Quiroz, Martín. "Descriptive analysis of the acquisition of the base form, third person singular, present participle, regular past, irregular past, and past participle in a supervised artificial neural network and an unsupervised artificial neural network." Tesis, Universidad de Chile, 2013. http://www.repositorio.uchile.cl/handle/2250/115653.
Studying children's language acquisition in natural settings is neither cost- nor time-effective, so language acquisition may instead be studied in an artificial setting that reduces the costs of this type of research. By artificial, I do not mean that children are placed in an artificial setting, first because this would not be ethical and second because the problem of the time needed for the research would remain. Rather, I mean that the simulation tools of artificial intelligence can be used. Simulators such as artificial neural networks (ANNs) can simulate various human cognitive skills, such as pattern or speech recognition, and can be implemented on personal computers with software such as MATLAB, a numerical computing package. ANNs are computer simulation models that try to resemble the neural processes behind several human cognitive skills. There are two main types of ANN: supervised and unsupervised. The learning process in the first is guided by the computer programmer, while the learning process in the latter is random.
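The supervised case described here, with learning guided toward known target outputs, can be made concrete with the smallest possible network. This generic perceptron sketch is an illustration only, not the MATLAB networks used in the thesis:

```python
# Minimal supervised learning: a single perceptron is nudged toward known
# target outputs (the "supervision" the abstract refers to).

def train_perceptron(samples, epochs=20, lr=0.1):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out          # error against the known target
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(AND)
print([1 if w[0] * x1 + w[1] * x2 + b > 0 else 0 for (x1, x2), _ in AND])
# → [0, 0, 0, 1]
```

An unsupervised network, by contrast, would receive the same inputs with no target column, and would have to organise them by some internal criterion instead.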
Dabiri, Sina. "Semi-Supervised Deep Learning Approach for Transportation Mode Identification Using GPS Trajectory Data." Thesis, Virginia Tech, 2018. http://hdl.handle.net/10919/86845.
Master of Science
Identifying users' transportation modes (e.g., bike, bus, train, and car) is a key step towards many transportation-related problems including (but not limited to) transport planning, transit demand analysis, auto ownership, and transportation emissions analysis. Traditionally, the information for analyzing travelers' mode-choice behavior was obtained through travel surveys. High cost, low response rates, time-consuming manual data collection, and misreporting are the main drawbacks of survey-based approaches. With the rapid growth of ubiquitous GPS-enabled devices (e.g., smartphones), a constant stream of users' trajectory data can be recorded. A user's GPS trajectory is a sequence of GPS points recorded by a GPS-enabled device, where each point contains the device's geographic location at a particular moment. In this research, users' GPS trajectories, rather than traditional resources, are harnessed to predict their transportation mode by means of statistical models. A wide range of prior studies developed travel-mode detection models using hand-designed attributes and classical learning techniques. Nonetheless, hand-crafted features suffer from major shortcomings, including vulnerability to traffic uncertainties and biased engineering judgment in generating effective features. A potential solution is to leverage deep learning frameworks that can capture abstract features from the raw input in an automated fashion. Thus, in this thesis, deep learning architectures are exploited to identify transport modes from raw GPS tracks alone. Notably, a significant portion of the trajectories in GPS data may not be annotated with a transport mode, and acquiring labeled data is more expensive and labor-intensive than collecting unlabeled data.
Utilizing unlabeled GPS trajectories (i.e., those not annotated with a transport mode) is therefore a cost-effective way to improve the prediction quality of the travel-mode detection model. The unlabeled GPS data are accordingly leveraged through a novel deep-learning architecture capable of extracting information from both labeled and unlabeled data. The experimental results demonstrate the superiority of the proposed models over state-of-the-art methods in the literature on several performance metrics.
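Raw GPS points only become informative for mode detection once kinematic quantities are derived from them. The helper below is a hypothetical illustration of that preprocessing step, not the thesis code (the function names and details are assumptions): per-segment speed from consecutive (lat, lon, timestamp) points via the haversine great-circle distance.

```python
import math

# Hypothetical preprocessing helper: derive per-segment speed from
# consecutive (lat_deg, lon_deg, unix_seconds) GPS points using the
# haversine great-circle distance on a spherical Earth.

EARTH_RADIUS_M = 6_371_000

def haversine_m(p, q):
    lat1, lon1 = map(math.radians, p)
    lat2, lon2 = map(math.radians, q)
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def speeds(trajectory):
    # trajectory: list of (lat_deg, lon_deg, unix_seconds)
    out = []
    for (la1, lo1, t1), (la2, lo2, t2) in zip(trajectory, trajectory[1:]):
        d = haversine_m((la1, lo1), (la2, lo2))
        out.append(d / max(t2 - t1, 1e-9))  # metres per second
    return out

track = [(40.0, -80.0, 0), (40.0009, -80.0, 10)]  # ~100 m north in 10 s
print([round(s, 1) for s in speeds(track)])       # → [10.0]
```

Stacking such channels (speed, acceleration, heading change) per point yields the fixed-width numeric input a convolutional model can consume.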
Books on the topic "Supervised neural network"
Reed, Russell D., and Robert J. Marks II. Neural smithing: Supervised learning in feedforward artificial neural networks. Cambridge, Mass.: The MIT Press, 1999.
Suresh, Sundaram, Narasimhan Sundararajan, and Ramasamy Savitha. Supervised Learning with Complex-valued Neural Networks. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-29491-4.
Graves, Alex. Supervised Sequence Labelling with Recurrent Neural Networks. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-24797-2.
Graves, Alex. Supervised Sequence Labelling with Recurrent Neural Networks. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012.
Suresh, Sundaram. Supervised Learning with Complex-valued Neural Networks. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013.
Singh, Surinder. Exploratory spatial data analysis using supervised neural networks. London: University of East London, 1994.
SFI/CNLS Workshop on Formal Approaches to Supervised Learning (1992: Santa Fe, N.M.). The mathematics of generalization: The proceedings of the SFI/CNLS Workshop on Formal Approaches to Supervised Learning. Edited by David H. Wolpert. Reading, Mass.: Addison-Wesley Pub. Co., 1995.
Supervised and unsupervised pattern recognition: Feature extraction and computational intelligence. Boca Raton, Fla.: CRC Press, 2000.
Leung, Wing Kai. The specification, analysis and metrics of supervised feedforward artificial neural networks for applied science and engineering applications. Birmingham: University of Central England in Birmingham, 2002.
Supervised Learning with Complex-valued Neural Networks. Springer, 2012.
Знайти повний текст джерелаЧастини книг з теми "Supervised neural network"
Magrans de Abril, Ildefons, and Ann Nowé. "Supervised Neural Network Structure Recovery." In Neural Connectomics Challenge, 37–45. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-53070-3_3.
Muselli, Marco, and Sandro Ridella. "Supervised Learning Using a Genetic Algorithm." In International Neural Network Conference, 790. Dordrecht: Springer Netherlands, 1990. http://dx.doi.org/10.1007/978-94-009-0643-3_86.
Baldi, Pierre, Yves Chauvin, and Kurt Hornik. "Supervised and Unsupervised Learning in Linear Networks." In International Neural Network Conference, 825–28. Dordrecht: Springer Netherlands, 1990. http://dx.doi.org/10.1007/978-94-009-0643-3_99.
Battiti, Roberto, and Francesco Masulli. "BFGS Optimization for Faster and Automated Supervised Learning." In International Neural Network Conference, 757–60. Dordrecht: Springer Netherlands, 1990. http://dx.doi.org/10.1007/978-94-009-0643-3_68.
Pandya, Abhijit S., and Raisa Szabo. "ALOPEX Algorithm for Supervised Learning in Layer Networks." In International Neural Network Conference, 791. Dordrecht: Springer Netherlands, 1990. http://dx.doi.org/10.1007/978-94-009-0643-3_88.
Midenet, S., and A. Grumbach. "Supervised Learning Based on Kohonen's Self-Organising Feature Maps." In International Neural Network Conference, 773–76. Dordrecht: Springer Netherlands, 1990. http://dx.doi.org/10.1007/978-94-009-0643-3_72.
Yusoff, Nooraini, and André Grüning. "Supervised Associative Learning in Spiking Neural Network." In Artificial Neural Networks – ICANN 2010, 224–29. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-15819-3_30.
Bianchini, Monica, and Marco Maggini. "Supervised Neural Network Models for Processing Graphs." In Intelligent Systems Reference Library, 67–96. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-36657-4_3.
Suresh, Sundaram, Narasimhan Sundararajan, and Ramasamy Savitha. "Complex-valued Self-regulatory Resource Allocation Network (CSRAN)." In Supervised Learning with Complex-valued Neural Networks, 135–68. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-29491-4_8.
Magoulas, G. D., M. N. Vrahatis, T. N. Grapsa, and G. S. Androulakis. "Neural Network Supervised Training Based on a Dimension Reducing Method." In Mathematics of Neural Networks, 245–49. Boston, MA: Springer US, 1997. http://dx.doi.org/10.1007/978-1-4615-6099-9_41.
Повний текст джерелаТези доповідей конференцій з теми "Supervised neural network"
FILO, G. "Analysis of Neural Network Structure for Implementation of the Prescriptive Maintenance Strategy." In Terotechnology XII. Materials Research Forum LLC, 2022. http://dx.doi.org/10.21741/9781644902059-40.
Huynh, Alex V., John F. Walkup, and Thomas F. Krile. "Optical perceptron-based quadratic neural network." In OSA Annual Meeting. Washington, D.C.: Optica Publishing Group, 1991. http://dx.doi.org/10.1364/oam.1991.mii8.
Zhao, Chengshuai, Shuai Liu, Feng Huang, Shichao Liu, and Wen Zhang. "CSGNN: Contrastive Self-Supervised Graph Neural Network for Molecular Interaction Prediction." In Thirtieth International Joint Conference on Artificial Intelligence {IJCAI-21}. California: International Joint Conferences on Artificial Intelligence Organization, 2021. http://dx.doi.org/10.24963/ijcai.2021/517.
Lempa, P. "Analysis of Neural Network Training Algorithms for Implementation of the Prescriptive Maintenance Strategy." In Terotechnology XII. Materials Research Forum LLC, 2022. http://dx.doi.org/10.21741/9781644902059-41.
Ahmed, Sultan Uddin, Md Shahjahan, and Kazuyuki Murase. "Chaotic dynamics of supervised neural network." In 2010 13th International Conference on Computer and Information Technology (ICCIT). IEEE, 2010. http://dx.doi.org/10.1109/iccitechn.2010.5723893.
Chunwei, Zhang, and Liu Haijiang. "A New Supervised Spiking Neural Network." In 2009 Second International Conference on Intelligent Computation Technology and Automation. IEEE, 2009. http://dx.doi.org/10.1109/icicta.2009.13.
Ali, Rashid, and Iram Naim. "Neural network based supervised rank aggregation." In 2011 International Conference on Multimedia, Signal Processing and Communication Technologies (IMPACT). IEEE, 2011. http://dx.doi.org/10.1109/mspct.2011.6150439.
Yu, Francis T. S., Taiwei Lu, and Don A. Gregory. "Self-Learning Optical Neural Network." In Spatial Light Modulators and Applications. Washington, D.C.: Optica Publishing Group, 1990. http://dx.doi.org/10.1364/slma.1990.mb4.
Perumalla, Aniruddha, Ahmet Koru, and Eric Johnson. "Network Topology Identification using Supervised Pattern Recognition Neural Networks." In 13th International Conference on Agents and Artificial Intelligence. SCITEPRESS - Science and Technology Publications, 2021. http://dx.doi.org/10.5220/0010231902580264.
Salam, F. M. A., and S. Bai. "A feedback neural network with supervised learning." In 1990 IJCNN International Joint Conference on Neural Networks. IEEE, 1990. http://dx.doi.org/10.1109/ijcnn.1990.137855.
Повний текст джерелаЗвіти організацій з теми "Supervised neural network"
Farhi, Edward, and Hartmut Neven. Classification with Quantum Neural Networks on Near Term Processors. Web of Open Science, December 2020. http://dx.doi.org/10.37686/qrl.v1i2.80.
Engel, Bernard, Yael Edan, James Simon, Hanoch Pasternak, and Shimon Edelman. Neural Networks for Quality Sorting of Agricultural Produce. United States Department of Agriculture, July 1996. http://dx.doi.org/10.32747/1996.7613033.bard.
Zhang, Yunchong. Blind Denoising by Self-Supervised Neural Networks in Astronomical Datasets (Noise2Self4Astro). Office of Scientific and Technical Information (OSTI), August 2019. http://dx.doi.org/10.2172/1614728.