Journal articles on the topic "Neural Sequence Models"
Format your source in APA, MLA, Chicago, Harvard, and other citation styles
Consult the top 50 journal articles for your research on the topic "Neural Sequence Models."
Next to every source in the list of references there is an "Add to bibliography" button. Press it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a .pdf and read its abstract online, whenever such details are available in the source's metadata.
Browse journal articles across a wide variety of disciplines and compile your bibliography correctly.
Shi, Tian, Yaser Keneshloo, Naren Ramakrishnan, and Chandan K. Reddy. "Neural Abstractive Text Summarization with Sequence-to-Sequence Models." ACM/IMS Transactions on Data Science 2, no. 1 (January 3, 2021): 1–37. http://dx.doi.org/10.1145/3419106.
Liu, Bowen, Bharath Ramsundar, Prasad Kawthekar, Jade Shi, Joseph Gomes, Quang Luu Nguyen, Stephen Ho, Jack Sloane, Paul Wender, and Vijay Pande. "Retrosynthetic Reaction Prediction Using Neural Sequence-to-Sequence Models." ACS Central Science 3, no. 10 (September 5, 2017): 1103–13. http://dx.doi.org/10.1021/acscentsci.7b00303.
Phua, Yeong Tsann, Sujata Navaratnam, Chon-Moy Kang, and Wai-Seong Che. "Sequence-to-sequence neural machine translation for English-Malay." IAES International Journal of Artificial Intelligence (IJ-AI) 11, no. 2 (June 1, 2022): 658. http://dx.doi.org/10.11591/ijai.v11.i2.pp658-665.
Demeester, Thomas. "System Identification with Time-Aware Neural Sequence Models." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (April 3, 2020): 3757–64. http://dx.doi.org/10.1609/aaai.v34i04.5786.
Halim, Calvin Janitra, and Kazuhiko Kawamoto. "2D Convolutional Neural Markov Models for Spatiotemporal Sequence Forecasting." Sensors 20, no. 15 (July 28, 2020): 4195. http://dx.doi.org/10.3390/s20154195.
Kalm, Kristjan, and Dennis Norris. "Sequence learning recodes cortical representations instead of strengthening initial ones." PLOS Computational Biology 17, no. 5 (May 24, 2021): e1008969. http://dx.doi.org/10.1371/journal.pcbi.1008969.
Tan, Zhixing, Jinsong Su, Boli Wang, Yidong Chen, and Xiaodong Shi. "Lattice-to-sequence attentional Neural Machine Translation models." Neurocomputing 284 (April 2018): 138–47. http://dx.doi.org/10.1016/j.neucom.2018.01.010.
Nam, Hyoungwook, Segwang Kim, and Kyomin Jung. "Number Sequence Prediction Problems for Evaluating Computational Powers of Neural Networks." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 4626–33. http://dx.doi.org/10.1609/aaai.v33i01.33014626.
Yousuf, Hana, Michael Lahzi, Said A. Salloum, and Khaled Shaalan. "A systematic review on sequence-to-sequence learning with neural network and its models." International Journal of Electrical and Computer Engineering (IJECE) 11, no. 3 (June 1, 2021): 2315. http://dx.doi.org/10.11591/ijece.v11i3.pp2315-2326.
Buckman, Jacob, and Graham Neubig. "Neural Lattice Language Models." Transactions of the Association for Computational Linguistics 6 (December 2018): 529–41. http://dx.doi.org/10.1162/tacl_a_00036.
Eriguchi, Akiko, Kazuma Hashimoto, and Yoshimasa Tsuruoka. "Incorporating Source-Side Phrase Structures into Neural Machine Translation." Computational Linguistics 45, no. 2 (June 2019): 267–92. http://dx.doi.org/10.1162/coli_a_00348.
Hahn, Michael. "Theoretical Limitations of Self-Attention in Neural Sequence Models." Transactions of the Association for Computational Linguistics 8 (July 2020): 156–71. http://dx.doi.org/10.1162/tacl_a_00306.
Duarte, Manuel, and Armando Pinho. "Bacterial DNA Sequence Compression Models Using Artificial Neural Networks." Entropy 15, no. 12 (August 30, 2013): 3435–48. http://dx.doi.org/10.3390/e15093435.
Jehl, Laura, Carolin Lawrence, and Stefan Riezler. "Learning Neural Sequence-to-Sequence Models from Weak Feedback with Bipolar Ramp Loss." Transactions of the Association for Computational Linguistics 7 (November 2019): 233–48. http://dx.doi.org/10.1162/tacl_a_00265.
Han, Xu-Wang, Hai-Tao Zheng, Jin-Yuan Chen, and Cong-Zhi Zhao. "Diverse Decoding for Abstractive Document Summarization." Applied Sciences 9, no. 3 (January 23, 2019): 386. http://dx.doi.org/10.3390/app9030386.
Lim, Dongjoon, and Mathieu Blanchette. "EvoLSTM: context-dependent models of sequence evolution using a sequence-to-sequence LSTM." Bioinformatics 36, Supplement_1 (July 1, 2020): i353–i361. http://dx.doi.org/10.1093/bioinformatics/btaa447.
Liu, Ming, and Jinxu Zhang. "Chinese Neural Question Generation: Augmenting Knowledge into Multiple Neural Encoders." Applied Sciences 12, no. 3 (January 19, 2022): 1032. http://dx.doi.org/10.3390/app12031032.
Welleck, Sean, and Kyunghyun Cho. "MLE-Guided Parameter Search for Task Loss Minimization in Neural Sequence Modeling." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 16 (May 18, 2021): 14032–40. http://dx.doi.org/10.1609/aaai.v35i16.17652.
Averbeck, Bruno B., James Kilner, and Christopher D. Frith. "Neural Correlates of Sequence Learning with Stochastic Feedback." Journal of Cognitive Neuroscience 23, no. 6 (June 2011): 1346–57. http://dx.doi.org/10.1162/jocn.2010.21436.
Dabre, Raj, and Atsushi Fujita. "Recurrent Stacking of Layers for Compact Neural Machine Translation Models." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 6292–99. http://dx.doi.org/10.1609/aaai.v33i01.33016292.
Silva, Milton, Diogo Pratas, and Armando J. Pinho. "AC2: An Efficient Protein Sequence Compression Tool Using Artificial Neural Networks and Cache-Hash Models." Entropy 23, no. 5 (April 26, 2021): 530. http://dx.doi.org/10.3390/e23050530.
Rahul, Kodithala. "Neural Machine Translation." International Journal for Research in Applied Science and Engineering Technology 10, no. 7 (July 31, 2022): 2027–30. http://dx.doi.org/10.22214/ijraset.2022.45669.
Schwaller, Philippe, Théophile Gaudin, Dávid Lányi, Costas Bekas, and Teodoro Laino. "“Found in Translation”: predicting outcomes of complex organic chemistry reactions using neural sequence-to-sequence models." Chemical Science 9, no. 28 (2018): 6091–98. http://dx.doi.org/10.1039/c8sc02339e.
Zhang, Hao, Richard Sproat, Axel H. Ng, Felix Stahlberg, Xiaochang Peng, Kyle Gorman, and Brian Roark. "Neural Models of Text Normalization for Speech Applications." Computational Linguistics 45, no. 2 (June 2019): 293–337. http://dx.doi.org/10.1162/coli_a_00349.
Li, Yurui, Mingjing Du, and Sheng He. "Attention-Based Sequence-to-Sequence Model for Time Series Imputation." Entropy 24, no. 12 (December 9, 2022): 1798. http://dx.doi.org/10.3390/e24121798.
Zarrieß, Sina, Henrik Voigt, and Simeon Schüz. "Decoding Methods in Neural Language Generation: A Survey." Information 12, no. 9 (August 30, 2021): 355. http://dx.doi.org/10.3390/info12090355.
Colombo, Pierre, Emile Chapuis, Matteo Manica, Emmanuel Vignon, Giovanna Varni, and Chloe Clavel. "Guiding Attention in Sequence-to-Sequence Models for Dialogue Act Prediction." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 05 (April 3, 2020): 7594–601. http://dx.doi.org/10.1609/aaai.v34i05.6259.
Li, Yangming, Kaisheng Yao, Libo Qin, Shuang Peng, Yijia Liu, and Xiaolong Li. "Span-Based Neural Buffer: Towards Efficient and Effective Utilization of Long-Distance Context for Neural Sequence Models." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 05 (April 3, 2020): 8277–84. http://dx.doi.org/10.1609/aaai.v34i05.6343.
Helcl, Jindřich, and Jindřich Libovický. "Neural Monkey: An Open-source Tool for Sequence Learning." Prague Bulletin of Mathematical Linguistics 107, no. 1 (April 1, 2017): 5–17. http://dx.doi.org/10.1515/pralin-2017-0001.
Jahier Pagliari, Daniele, Francesco Daghero, and Massimo Poncino. "Sequence-To-Sequence Neural Networks Inference on Embedded Processors Using Dynamic Beam Search." Electronics 9, no. 2 (February 15, 2020): 337. http://dx.doi.org/10.3390/electronics9020337.
Zech, John, Jessica Forde, Joseph J. Titano, Deepak Kaji, Anthony Costa, and Eric Karl Oermann. "Detecting insertion, substitution, and deletion errors in radiology reports using neural sequence-to-sequence models." Annals of Translational Medicine 7, no. 11 (June 2019): 233. http://dx.doi.org/10.21037/atm.2018.08.11.
Xia, Min, W. K. Wong, and Zhijie Wang. "Sequence Memory Based on Coherent Spin-Interaction Neural Networks." Neural Computation 26, no. 12 (December 2014): 2944–61. http://dx.doi.org/10.1162/neco_a_00663.
Javed, Azha, and Muhammad Javed Iqbal. "Classification of Biological Data using Deep Learning Technique." NUML International Journal of Engineering and Computing 1, no. 1 (April 27, 2022): 13–26. http://dx.doi.org/10.52015/nijec.v1i1.10.
Liu, Zuozhu, Thiparat Chotibut, Christopher Hillar, and Shaowei Lin. "Biologically Plausible Sequence Learning with Spiking Neural Networks." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 02 (April 3, 2020): 1316–23. http://dx.doi.org/10.1609/aaai.v34i02.5487.
Bouktif, Salah, Ali Fiaz, Ali Ouni, and Mohamed Adel Serhani. "Single and Multi-Sequence Deep Learning Models for Short and Medium Term Electric Load Forecasting." Energies 12, no. 1 (January 2, 2019): 149. http://dx.doi.org/10.3390/en12010149.
Kaselimi, M., N. Doulamis, A. Doulamis, and D. Delikaraoglou. "A Sequence-to-Sequence Temporal Convolutional Neural Network for Ionosphere Prediction Using GNSS Observations." ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLIII-B3-2020 (August 21, 2020): 813–20. http://dx.doi.org/10.5194/isprs-archives-xliii-b3-2020-813-2020.
McCoy, R. Thomas, Robert Frank, and Tal Linzen. "Does Syntax Need to Grow on Trees? Sources of Hierarchical Inductive Bias in Sequence-to-Sequence Networks." Transactions of the Association for Computational Linguistics 8 (July 2020): 125–40. http://dx.doi.org/10.1162/tacl_a_00304.
Lourentzou, Ismini, Kabir Manghnani, and ChengXiang Zhai. "Adapting Sequence to Sequence Models for Text Normalization in Social Media." Proceedings of the International AAAI Conference on Web and Social Media 13 (July 6, 2019): 335–45. http://dx.doi.org/10.1609/icwsm.v13i01.3234.
Madi, Nora, and Hend Al-Khalifa. "Error Detection for Arabic Text Using Neural Sequence Labeling." Applied Sciences 10, no. 15 (July 30, 2020): 5279. http://dx.doi.org/10.3390/app10155279.
Yonglan, Li, and He Wenjia. "English-Chinese Machine Translation Model Based on Bidirectional Neural Network with Attention Mechanism." Journal of Sensors 2022 (March 17, 2022): 1–11. http://dx.doi.org/10.1155/2022/5199248.
Yaish, Ofir, and Yaron Orenstein. "Computational modeling of mRNA degradation dynamics using deep neural networks." Bioinformatics 38, no. 4 (November 26, 2021): 1087–101. http://dx.doi.org/10.1093/bioinformatics/btab800.
Suszyński, Marcin, and Katarzyna Peta. "Assembly Sequence Planning Using Artificial Neural Networks for Mechanical Parts Based on Selected Criteria." Applied Sciences 11, no. 21 (November 5, 2021): 10414. http://dx.doi.org/10.3390/app112110414.
Liu, Pengfei, Shuaichen Chang, Xuanjing Huang, Jian Tang, and Jackie Chi Kit Cheung. "Contextualized Non-Local Neural Networks for Sequence Learning." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 6762–69. http://dx.doi.org/10.1609/aaai.v33i01.33016762.
Cheng, Minhao, Jinfeng Yi, Pin-Yu Chen, Huan Zhang, and Cho-Jui Hsieh. "Seq2Sick: Evaluating the Robustness of Sequence-to-Sequence Models with Adversarial Examples." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (April 3, 2020): 3601–8. http://dx.doi.org/10.1609/aaai.v34i04.5767.
Feinauer, Christoph, Barthelemy Meynard-Piganeau, and Carlo Lucibello. "Interpretable pairwise distillations for generative protein sequence models." PLOS Computational Biology 18, no. 6 (June 23, 2022): e1010219. http://dx.doi.org/10.1371/journal.pcbi.1010219.
Nair, Viswajit Vinod, Sonaal Pathlai Pradeep, Vaishnavi Sudheer Nair, P. N. Pournami, G. Gopakumar, and P. B. Jayaraj. "Deep Sequence Models for Ligand-Based Virtual Screening." Journal of Computational Biophysics and Chemistry 21, no. 02 (February 4, 2022): 207–17. http://dx.doi.org/10.1142/s2737416522500107.
La Quatra, Moreno, and Luca Cagliero. "BART-IT: An Efficient Sequence-to-Sequence Model for Italian Text Summarization." Future Internet 15, no. 1 (December 27, 2022): 15. http://dx.doi.org/10.3390/fi15010015.
Nataraj, Sathees Kumar, M. P. Paulraj, Ahmad Nazri Bin Abdullah, and Sazali Bin Yaacob. "A systematic approach for segmenting voiced/unvoiced signals using fuzzy-logic system and general fusion of neural network models for phonemes-based speech recognition." Journal of Intelligent & Fuzzy Systems 39, no. 5 (November 19, 2020): 7411–29. http://dx.doi.org/10.3233/jifs-200780.
Gupta, Divam, Tanmoy Chakraborty, and Soumen Chakrabarti. "GIRNet: Interleaved Multi-Task Recurrent State Sequence Models." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 6497–504. http://dx.doi.org/10.1609/aaai.v33i01.33016497.
Zhao, Zhengqiao, Stephen Woloszynek, Felix Agbavor, Joshua Chang Mell, Bahrad A. Sokhansanj, and Gail L. Rosen. "Learning, visualizing and exploring 16S rRNA structure using an attention-based deep neural network." PLOS Computational Biology 17, no. 9 (September 22, 2021): e1009345. http://dx.doi.org/10.1371/journal.pcbi.1009345.