Academic literature on the topic 'Neural Sequence Models'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Neural Sequence Models.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Press it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Neural Sequence Models"

1

Shi, Tian, Yaser Keneshloo, Naren Ramakrishnan, and Chandan K. Reddy. "Neural Abstractive Text Summarization with Sequence-to-Sequence Models." ACM/IMS Transactions on Data Science 2, no. 1 (January 3, 2021): 1–37. http://dx.doi.org/10.1145/3419106.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Liu, Bowen, Bharath Ramsundar, Prasad Kawthekar, Jade Shi, Joseph Gomes, Quang Luu Nguyen, Stephen Ho, Jack Sloane, Paul Wender, and Vijay Pande. "Retrosynthetic Reaction Prediction Using Neural Sequence-to-Sequence Models." ACS Central Science 3, no. 10 (September 5, 2017): 1103–13. http://dx.doi.org/10.1021/acscentsci.7b00303.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Phua, Yeong Tsann, Sujata Navaratnam, Chon-Moy Kang, and Wai-Seong Che. "Sequence-to-sequence neural machine translation for English-Malay." IAES International Journal of Artificial Intelligence (IJ-AI) 11, no. 2 (June 1, 2022): 658. http://dx.doi.org/10.11591/ijai.v11.i2.pp658-665.

Full text
Abstract:
Machine translation aims to translate text from one language into another using computer software. In this work, we performed neural machine translation with an attention mechanism on an English-Malay parallel corpus. We attempted to improve model performance with rectified linear unit (ReLU) attention alignment. Different sequence-to-sequence models were trained, including long short-term memory (LSTM), gated recurrent unit (GRU), bidirectional LSTM (Bi-LSTM) and bidirectional GRU (Bi-GRU). In the experiment, both bidirectional models, Bi-LSTM and Bi-GRU, converged in fewer than 30 epochs. Our study shows that ReLU attention alignment improves the bilingual evaluation understudy (BLEU) translation score by between 0.26 and 1.12 across all models compared with the original Tanh models.
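To make the 'ReLU attention alignment' mentioned above concrete, here is a minimal sketch of Bahdanau-style additive attention in which the usual tanh nonlinearity can be swapped for ReLU. It is an illustrative reconstruction rather than code from the paper; all names, dimensions and weights are assumed toy values.

    import numpy as np

    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    def additive_attention(decoder_state, encoder_states, W_dec, W_enc, v, activation=np.tanh):
        # Bahdanau-style additive attention: score each encoder state against the
        # current decoder state; the paper's variant swaps tanh for ReLU here.
        scores = np.array([v @ activation(W_dec @ decoder_state + W_enc @ h)
                           for h in encoder_states])
        weights = softmax(scores)                                   # alignment distribution
        context = (weights[:, None] * encoder_states).sum(axis=0)   # weighted sum of encoder states
        return context, weights

    relu = lambda x: np.maximum(x, 0.0)

    rng = np.random.default_rng(0)
    d_enc, d_dec, d_att, T = 8, 8, 16, 5          # toy dimensions (assumed)
    encoder_states = rng.normal(size=(T, d_enc))
    decoder_state = rng.normal(size=d_dec)
    W_dec = rng.normal(size=(d_att, d_dec))
    W_enc = rng.normal(size=(d_att, d_enc))
    v = rng.normal(size=d_att)

    ctx_tanh, _ = additive_attention(decoder_state, encoder_states, W_dec, W_enc, v, np.tanh)
    ctx_relu, _ = additive_attention(decoder_state, encoder_states, W_dec, W_enc, v, relu)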
APA, Harvard, Vancouver, ISO, and other styles
4

Demeester, Thomas. "System Identification with Time-Aware Neural Sequence Models." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (April 3, 2020): 3757–64. http://dx.doi.org/10.1609/aaai.v34i04.5786.

Full text
Abstract:
Established recurrent neural networks are well-suited to solve a wide variety of prediction tasks involving discrete sequences. However, they do not perform as well in the task of dynamical system identification, when dealing with observations from continuous variables that are unevenly sampled in time, for example due to missing observations. We show how such neural sequence models can be adapted to deal with variable step sizes in a natural way. In particular, we introduce a ‘time-aware’ and stationary extension of existing models (including the Gated Recurrent Unit) that allows them to deal with unevenly sampled system observations by adapting to the observation times, while facilitating higher-order temporal behavior. We discuss the properties and demonstrate the validity of the proposed approach, based on samples from two industrial input/output processes.
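As a rough illustration of how a gated recurrent unit can be made 'time-aware', the sketch below decays the previous hidden state according to the elapsed time between observations before applying an otherwise standard GRU update. This is a generic way of handling uneven sampling and is only loosely inspired by the paper; the exact parameterisation proposed there differs, and the class name, time constant and toy data are assumptions.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    class TimeAwareGRUCell:
        # Toy GRU cell whose previous state is decayed according to the elapsed
        # time between observations; the paper's exact parameterisation differs.
        def __init__(self, n_in, n_hid, tau=1.0, seed=0):
            rng = np.random.default_rng(seed)
            s = lambda *shape: rng.normal(scale=0.1, size=shape)
            self.Wz, self.Uz, self.bz = s(n_hid, n_in), s(n_hid, n_hid), np.zeros(n_hid)
            self.Wr, self.Ur, self.br = s(n_hid, n_in), s(n_hid, n_hid), np.zeros(n_hid)
            self.Wh, self.Uh, self.bh = s(n_hid, n_in), s(n_hid, n_hid), np.zeros(n_hid)
            self.tau = tau                       # assumed time constant of the decay

        def step(self, h_prev, x, dt):
            h_prev = np.exp(-dt / self.tau) * h_prev                  # decay over the gap dt
            z = sigmoid(self.Wz @ x + self.Uz @ h_prev + self.bz)     # update gate
            r = sigmoid(self.Wr @ x + self.Ur @ h_prev + self.br)     # reset gate
            h_tilde = np.tanh(self.Wh @ x + self.Uh @ (r * h_prev) + self.bh)
            return (1 - z) * h_prev + z * h_tilde

    cell = TimeAwareGRUCell(n_in=3, n_hid=8)
    h = np.zeros(8)
    observations = [(0.1, np.ones(3)), (0.5, np.zeros(3)), (2.0, np.ones(3))]   # (dt, x) pairs
    for dt, x in observations:
        h = cell.step(h, x, dt)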
APA, Harvard, Vancouver, ISO, and other styles
5

Halim, Calvin Janitra, and Kazuhiko Kawamoto. "2D Convolutional Neural Markov Models for Spatiotemporal Sequence Forecasting." Sensors 20, no. 15 (July 28, 2020): 4195. http://dx.doi.org/10.3390/s20154195.

Full text
Abstract:
Recent approaches to time series forecasting, especially forecasting spatiotemporal sequences, have leveraged the approximation power of deep neural networks to model the complexity of such sequences, specifically approaches that are based on recurrent neural networks. Still, as spatiotemporal sequences that arise in the real world are noisy and chaotic, modeling approaches that utilize probabilistic temporal models, such as deep Markov models (DMMs), are favorable because of their ability to model uncertainty, increasing their robustness to noise. However, approaches based on DMMs do not maintain the spatial characteristics of spatiotemporal sequences, with most of the approaches converting the observed input into 1D data halfway through the model. To solve this, we propose a model that retains the spatial aspect of the target sequence with a DMM that consists of 2D convolutional neural networks. We then show the robustness of our method to data with large variance compared with naive forecast, vanilla DMM, and convolutional long short-term memory (LSTM) using synthetic data, even outperforming the DNN models over a longer forecast period. We also point out the limitations of our model when forecasting real-world precipitation data and the possible future work that can be done to address these limitations, along with additional future research potential.
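To illustrate the core idea of keeping the latent state of a deep Markov model as a 2D grid, the sketch below rolls a toy generative model forward in time with convolutional transition and emission functions. It is a simplified, hand-written analogue of the architecture described above, not the authors' model: there is no inference network, and the kernels, grid size and noise model are arbitrary assumptions.

    import numpy as np
    from scipy.signal import convolve2d

    def softplus(x):
        return np.log1p(np.exp(x))

    rng = np.random.default_rng(0)
    H = W = 16                                     # spatial grid size (assumed)
    K_mu = rng.normal(scale=0.1, size=(3, 3))      # transition kernel for the latent mean
    K_sig = rng.normal(scale=0.1, size=(3, 3))     # transition kernel for the latent scale
    K_emit = rng.normal(scale=0.1, size=(3, 3))    # emission kernel

    def transition(z_prev):
        # The latent state stays a 2D grid: mean and scale of the next latent
        # state come from 2D convolutions instead of a flattened 1D layer.
        mu = convolve2d(z_prev, K_mu, mode="same")
        sigma = softplus(convolve2d(z_prev, K_sig, mode="same"))
        return mu + sigma * rng.normal(size=z_prev.shape)

    def emit(z):
        # The observation is another spatial map derived from the latent grid.
        return convolve2d(z, K_emit, mode="same")

    z = rng.normal(size=(H, W))
    frames = [emit(z)]
    for _ in range(5):                             # roll the generative model forward 5 steps
        z = transition(z)
        frames.append(emit(z))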
APA, Harvard, Vancouver, ISO, and other styles
6

Kalm, Kristjan, and Dennis Norris. "Sequence learning recodes cortical representations instead of strengthening initial ones." PLOS Computational Biology 17, no. 5 (May 24, 2021): e1008969. http://dx.doi.org/10.1371/journal.pcbi.1008969.

Full text
Abstract:
We contrast two computational models of sequence learning. The associative learner posits that learning proceeds by strengthening existing association weights. Alternatively, recoding posits that learning creates new and more efficient representations of the learned sequences. Importantly, both models propose that humans act as optimal learners but capture different statistics of the stimuli in their internal model. Furthermore, these models make dissociable predictions as to how learning changes the neural representation of sequences. We tested these predictions by using fMRI to extract neural activity patterns from the dorsal visual processing stream during a sequence recall task. We observed that only the recoding account can explain the similarity of neural activity patterns, suggesting that participants recode the learned sequences using chunks. We show that associative learning can theoretically store only a very limited number of overlapping sequences, such as are common in ecological working memory tasks, and hence an efficient learner should recode initial sequence representations.
APA, Harvard, Vancouver, ISO, and other styles
7

Tan, Zhixing, Jinsong Su, Boli Wang, Yidong Chen, and Xiaodong Shi. "Lattice-to-sequence attentional Neural Machine Translation models." Neurocomputing 284 (April 2018): 138–47. http://dx.doi.org/10.1016/j.neucom.2018.01.010.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Nam, Hyoungwook, Segwang Kim, and Kyomin Jung. "Number Sequence Prediction Problems for Evaluating Computational Powers of Neural Networks." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 4626–33. http://dx.doi.org/10.1609/aaai.v33i01.33014626.

Full text
Abstract:
Inspired by number series tests to measure human intelligence, we suggest number sequence prediction tasks to assess neural network models’ computational powers for solving algorithmic problems. We define the complexity and difficulty of a number sequence prediction task by the structure of the smallest automaton that can generate the sequence. We suggest two types of number sequence prediction problems: the number-level and the digit-level problems. The number-level problems format sequences as 2-dimensional grids of digits, and the digit-level problems provide a single digit input per time step. The complexity of a number-level sequence prediction can be defined by the depth of an equivalent combinatorial logic, and the complexity of a digit-level sequence prediction can be defined by an equivalent state automaton for the generation rule. Experiments with number-level sequences suggest that CNN models are capable of learning the compound operations of sequence generation rules, but the depths of the compound operations are limited. For the digit-level problems, simple GRU and LSTM models can solve some problems with the complexity of finite state automata. Memory-augmented models such as Stack-RNN, Attention, and Neural Turing Machines can solve the reverse-order task, which has the complexity of a simple pushdown automaton. However, none of the above can solve general Fibonacci, Arithmetic or Geometric sequence generation problems that represent the complexity of queue automata or Turing machines. The results show that our number sequence prediction problems effectively evaluate machine learning models’ computational capabilities.
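For a sense of how the two problem formats described above might look in practice, the following sketch builds a toy arithmetic progression and converts it into digit-level and number-level inputs. The token conventions (a separator token with id 10, left-padding with zeros) are illustrative assumptions, not the paper's exact encoding.

    import numpy as np

    def arithmetic_sequence(start, step, n_terms):
        return [start + i * step for i in range(n_terms)]

    def to_digit_level(numbers, sep=10):
        # Digit-level formatting: one digit token per time step, with an assumed
        # separator token (id 10) marking the end of each number.
        tokens = []
        for n in numbers:
            tokens.extend(int(d) for d in str(n))
            tokens.append(sep)
        return tokens

    def to_number_level(numbers, width=6):
        # Number-level formatting: each number becomes one row of a 2-dimensional
        # grid of digits, left-padded with zeros so columns align by place value.
        return np.array([[int(d) for d in str(n).rjust(width, "0")] for n in numbers])

    seq = arithmetic_sequence(start=7, step=12, n_terms=5)   # [7, 19, 31, 43, 55]
    digit_stream = to_digit_level(seq)     # e.g. [7, 10, 1, 9, 10, 3, 1, 10, ...]
    digit_grid = to_number_level(seq)      # shape (5, 6) grid of digits
    # A sequence model would consume everything except the last number and be
    # scored on predicting the digits of the final term (55 here).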
APA, Harvard, Vancouver, ISO, and other styles
9

Yousuf, Hana, Michael Lahzi, Said A. Salloum, and Khaled Shaalan. "A systematic review on sequence-to-sequence learning with neural network and its models." International Journal of Electrical and Computer Engineering (IJECE) 11, no. 3 (June 1, 2021): 2315. http://dx.doi.org/10.11591/ijece.v11i3.pp2315-2326.

Full text
Abstract:
We present a systematic survey of sequence-to-sequence learning with neural networks and its models. The primary aim of this report is to improve understanding of sequence-to-sequence neural networks and to identify the best approaches for implementing them. Three models are most commonly used in sequence-to-sequence neural network applications, namely recurrent neural networks (RNN), connectionist temporal classification (CTC), and the attention model. In conducting this survey, we used the research questions to determine keywords, which were then used to search for peer-reviewed papers, articles, and books in academic directories. Initial searches returned 790 papers and scholarly works, and with the help of selection criteria and the PRISMA methodology, the number of papers reviewed was reduced to 16. Each of the 16 articles was categorized by its contribution to each research question and analyzed. Finally, the papers underwent a quality appraisal, with scores ranging from 83.3% to 100%. The proposed systematic review enabled us to collect, evaluate, analyze, and explore different approaches to implementing sequence-to-sequence neural network models and pointed out the most common uses in machine learning. We followed a methodology that shows the potential of applying these models to real-world applications.
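Since the survey centres on recurrent sequence-to-sequence models, a minimal encoder-decoder sketch may help fix ideas. The toy below encodes a source token sequence into a single vector with a plain RNN and greedily decodes output tokens from it; real systems add attention, CTC-style alignment or gating, and all sizes and weights here are untrained, assumed values.

    import numpy as np

    def rnn_step(W, U, b, h, x):
        return np.tanh(W @ x + U @ h + b)

    rng = np.random.default_rng(0)
    vocab, d = 12, 16                                       # toy sizes (assumed)
    E = rng.normal(scale=0.1, size=(vocab, d))              # shared embedding table
    We, Ue, be = rng.normal(scale=0.1, size=(d, d)), rng.normal(scale=0.1, size=(d, d)), np.zeros(d)
    Wd, Ud, bd = rng.normal(scale=0.1, size=(d, d)), rng.normal(scale=0.1, size=(d, d)), np.zeros(d)
    Wout = rng.normal(scale=0.1, size=(vocab, d))           # projection onto the output vocabulary

    def encode(src_tokens):
        h = np.zeros(d)
        for t in src_tokens:
            h = rnn_step(We, Ue, be, h, E[t])
        return h                                            # final state summarises the source

    def greedy_decode(h, bos=0, eos=1, max_len=10):
        out, prev = [], bos
        for _ in range(max_len):
            h = rnn_step(Wd, Ud, bd, h, E[prev])
            prev = int(np.argmax(Wout @ h))                 # most likely next token
            if prev == eos:
                break
            out.append(prev)
        return out

    translation = greedy_decode(encode([3, 5, 7, 2]))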
APA, Harvard, Vancouver, ISO, and other styles
10

Buckman, Jacob, and Graham Neubig. "Neural Lattice Language Models." Transactions of the Association for Computational Linguistics 6 (December 2018): 529–41. http://dx.doi.org/10.1162/tacl_a_00036.

Full text
Abstract:
In this work, we propose a new language modeling paradigm that has the ability to perform both prediction and moderation of information flow at multiple granularities: neural lattice language models. These models construct a lattice of possible paths through a sentence and marginalize across this lattice to calculate sequence probabilities or optimize parameters. This approach allows us to seamlessly incorporate linguistic intuitions — including polysemy and the existence of multiword lexical items — into our language model. Experiments on multiple language modeling tasks show that English neural lattice language models that utilize polysemous embeddings are able to improve perplexity by 9.95% relative to a word-level baseline, and that a Chinese model that handles multi-character tokens is able to improve perplexity by 20.94% relative to a character-level baseline.
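The key operation described above is marginalising over all paths through a lattice that yield the same sentence. The toy below does this for a two-path lattice over 'new york city' with hand-set bigram log-probabilities and a log-sum-exp; a neural lattice language model would compute these scores with recurrent states attached to lattice nodes, so treat every number and token here as an assumption.

    import math

    # Toy log-probabilities of "next unit given previous unit"; a neural lattice
    # language model would compute these with recurrent states at lattice nodes.
    logp = {
        ("<s>", "new"): math.log(0.4),
        ("<s>", "new_york"): math.log(0.2),      # multiword lexical item as a single unit
        ("new", "york"): math.log(0.5),
        ("york", "city"): math.log(0.3),
        ("new_york", "city"): math.log(0.6),
        ("city", "</s>"): math.log(0.5),
    }

    def logsumexp(values):
        m = max(values)
        return m + math.log(sum(math.exp(v - m) for v in values))

    def sentence_logprob(paths):
        # Marginalise over all segmentations (paths through the lattice) that
        # produce the same surface sentence "new york city".
        path_scores = []
        for path in paths:
            units = ["<s>"] + path + ["</s>"]
            path_scores.append(sum(logp[(a, b)] for a, b in zip(units, units[1:])))
        return logsumexp(path_scores)

    paths = [["new", "york", "city"], ["new_york", "city"]]
    print(sentence_logprob(paths))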
APA, Harvard, Vancouver, ISO, and other styles

Dissertations / Theses on the topic "Neural Sequence Models"

1

Kann, Katharina [Verfasser], and Hinrich [Akademischer Betreuer] Schütze. "Neural sequence-to-sequence models for low-resource morphology / Katharina Kann ; Betreuer: Hinrich Schütze." München : Universitätsbibliothek der Ludwig-Maximilians-Universität, 2019. http://d-nb.info/1192663276/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Khouzam, Bassem. "Neural networks as cellular computing models for temporal sequence processing." Thesis, Supélec, 2014. http://www.theses.fr/2014SUPL0007/document.

Full text
Abstract:
The thesis proposes an approach to temporal sequence learning based on fine-grained self-organization mechanisms. The manuscript first situates the work within the effort to promote a cellular computing paradigm in computer science, in which computation is divided into a large number of elementary calculations carried out in parallel by computing cells that exchange information with one another. Beyond their fine-grained nature, such architectures are cellular in that the connections between cells follow a spatial topology, in line with the constraints of future hardware evolution. The manuscript examines most distributed architectures known in computer science from this perspective and finds that very few of them strictly fall within the cellular paradigm. We are interested in the learning capacity of these architectures because of the importance of this notion in related domains such as neural networks, without forgetting that cellular systems are, by construction, complex dynamical systems. This unavoidable dynamical component motivated our focus on the learning of temporal sequences, for which we review the existing approaches in the domains of supervised neural networks and self-organizing maps. Finally, we propose an architecture that contributes to the promotion of cellular computing in the sense that it exhibits self-organization properties used to extract a representation of the states of the dynamical system that provides its inputs, even when those inputs are ambiguous and only partially reflect the system state. Thanks to the availability of a computing cluster for our simulations, we were able to implement a complex architecture and observe new emergent phenomena. Based on these results, we develop a critical discussion that opens perspectives for future work.
APA, Harvard, Vancouver, ISO, and other styles
3

Cherla, S. "Neural probabilistic models for melody prediction, sequence labelling and classification." Thesis, City, University of London, 2016. http://openaccess.city.ac.uk/17444/.

Full text
Abstract:
Data-driven sequence models have long played a role in the analysis and generation of musical information. Such models are of interest in computational musicology, computer-aided music composition, and tools for music education, among other applications. This dissertation begins with an experiment to model sequences of musical pitch in melodies with a class of purely data-driven predictive models collectively known as Connectionist models. It was demonstrated that a set of six such models could perform on par with, or better than, state-of-the-art n-gram models previously evaluated in an identical setting. A new model known as the Recurrent Temporal Discriminative Restricted Boltzmann Machine (RTDRBM) was introduced in the process and found to outperform the rest of the models. A generalisation of this modelling task was also explored, and involved extending the set of musical features used as input by the models while still predicting pitch as before. The improvement in predictive performance which resulted from adding these new input features is encouraging for future work in this direction. Based on the above success of the RTDRBM, its application was extended to a non-musical sequence labelling task, namely Optical Character Recognition. This extension involved a modification to the model’s original prediction algorithm as a result of relaxing an assumption specific to the melody modelling task. The generalised model was evaluated on a benchmark dataset and compared against a set of 8 baseline models, where it fared better than all of them. Furthermore, a theoretical extension to an existing model which was also employed in the above pitch prediction task, the Discriminative Restricted Boltzmann Machine (DRBM), was proposed. This led to three new variants of the DRBM (which originally contained Logistic Sigmoid hidden layer activations), with Hyperbolic Tangent, Binomial and Rectified Linear hidden layer activations respectively. The first two of these have been evaluated here on the benchmark MNIST dataset and shown to perform on par with the original DRBM.
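As background to the discriminative restricted Boltzmann machines discussed in this thesis, the sketch below evaluates the standard closed-form class distribution of a DRBM with logistic sigmoid hidden units, the quantity whose hidden-unit term the proposed variants replace. The weights and sizes are arbitrary toy values, and the RTDRBM's recurrent extension is not shown.

    import numpy as np

    def softplus(x):
        return np.log1p(np.exp(x))

    def drbm_class_logits(x, W, c, U, d):
        # Closed-form class scores of a discriminative RBM with logistic sigmoid
        # hidden units: score(y) = d_y + sum_j softplus(c_j + U[j, y] + W[j] . x).
        # The thesis' variants replace this softplus term, which comes from
        # summing out binary hidden units, with terms derived from tanh,
        # binomial or rectified linear hidden units.
        pre = c[:, None] + U + (W @ x)[:, None]        # shape (n_hidden, n_classes)
        return d + softplus(pre).sum(axis=0)

    def drbm_predict_proba(x, W, c, U, d):
        logits = drbm_class_logits(x, W, c, U, d)
        logits -= logits.max()                          # numerical stability
        p = np.exp(logits)
        return p / p.sum()

    rng = np.random.default_rng(0)
    n_vis, n_hid, n_cls = 20, 10, 3                     # toy sizes (assumed)
    W = rng.normal(scale=0.1, size=(n_hid, n_vis))      # visible-to-hidden weights
    U = rng.normal(scale=0.1, size=(n_hid, n_cls))      # label-to-hidden weights
    c, d = np.zeros(n_hid), np.zeros(n_cls)             # hidden and label biases
    x = rng.random(n_vis)
    print(drbm_predict_proba(x, W, c, U, d))            # a distribution over 3 classes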
APA, Harvard, Vancouver, ISO, and other styles
4

Sarabi, Zahra. "Revealing the Positive Meaning of a Negation." Thesis, University of North Texas, 2019. https://digital.library.unt.edu/ark:/67531/metadc1505158/.

Full text
Abstract:
Negation is a complex phenomenon present in all human languages, allowing for the uniquely human capacities of denial, contradiction, misrepresentation, lying, and irony. It is in the first place a phenomenon of semantical opposition. Sentences containing negation are generally (a) less informative than affirmative ones, (b) morphosyntactically more marked—all languages have negative markers while only a few have affirmative markers, and (c) psychologically more complex and harder to process. Negation often conveys positive meaning. This meaning ranges from implicatures to entailments. In this dissertation, I develop a system to reveal the underlying positive interpretation of negation. I first identify which words are intended to be negated (i.e, the focus of negation) and second, I rewrite those tokens to generate an actual positive interpretation. I identify the focus of negation by scoring probable foci along a continuous scale. One of the obstacles to exploring foci scoring is that no public datasets exist for this task. Thus, to study this problem I create new corpora. The corpora contain verbal, nominal and adjectival negations and their potential positive interpretations along with their scores ranging from 1 to 5. Then, I use supervised learning models for scoring the focus of negation. In order to rewrite the focus of negation with its positive interpretation, I work with negations from Simple Wikipedia, automatically generate potential positive interpretations, and then collect manual annotations that effectively rewrite the negation in positive terms. This procedure yields positive interpretations for approximately 77% of negations, and the final corpus includes over 5,700 negations and over 5,900 positive interpretations. I then use sequence-to-sequence neural models and provide baseline results.
APA, Harvard, Vancouver, ISO, and other styles
5

Rehn, Martin. "Aspects of memory and representation in cortical computation." Doctoral thesis, KTH, Numerisk Analys och Datalogi, NADA, 2006. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-4161.

Full text
Abstract:
In this thesis I take a modular approach to cortical function. I investigate how the cerebral cortex may realise a number of basic computational tasks, within the framework of its generic architecture. I present novel mechanisms for certain assumed computational capabilities of the cerebral cortex, building on the established notions of attractor memory and sparse coding. A sparse binary coding network for generating efficient representations of sensory input is presented. It is demonstrated that this network model well reproduces the simple cell receptive field shapes seen in the primary visual cortex and that its representations are efficient with respect to storage in associative memory. I show how an autoassociative memory, augmented with dynamical synapses, can function as a general sequence learning network. I demonstrate how an abstract attractor memory system may be realised on the microcircuit level -- and how it may be analysed using tools similar to those used experimentally. I outline some predictions from the hypothesis that the macroscopic connectivity of the cortex is optimised for attractor memory function. I also discuss methodological aspects of modelling in computational neuroscience.
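One classical way an associative memory can be repurposed for sequence learning, loosely related to the mechanism sketched in this abstract, is to store pattern-to-pattern transitions in asymmetric Hebbian weights and let the network replay them. The toy below does exactly that; it uses fixed weights rather than the dynamical synapses proposed in the thesis, and the pattern sizes are assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    n, T = 64, 5
    patterns = np.sign(rng.normal(size=(T, n)))          # T random bipolar patterns forming a sequence

    # Asymmetric Hebbian weights store the transitions p_t -> p_{t+1}; iterating
    # the network then replays the stored order.
    W = sum(np.outer(patterns[t + 1], patterns[t]) for t in range(T - 1)) / n

    state = patterns[0].copy()
    recalled = [state]
    for _ in range(T - 1):
        state = np.where(W @ state >= 0, 1.0, -1.0)      # one synchronous update per sequence step
        recalled.append(state)

    accuracy = [float(np.mean(r == p)) for r, p in zip(recalled, patterns)]
    print(accuracy)                                       # close to 1.0 for a short random sequence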
APA, Harvard, Vancouver, ISO, and other styles
6

Svensk, Gustav. "TDNet : A Generative Model for Taxi Demand Prediction." Thesis, Linköpings universitet, Programvara och system, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-158514.

Full text
Abstract:
Supplying the right number of taxis in the right place at the right time is very important for taxi companies. In this paper, the machine learning model Taxi Demand Net (TDNet) is presented, which predicts short-term taxi demand in different zones of a city. It is based on WaveNet, a causal dilated convolutional neural network for time-series generation. TDNet uses historical demand from the last years and transforms features such as time of day, day of week and day of month into 26-hour taxi demand forecasts for all zones in a city. It has been applied to one city in northern Europe and one in South America. In northern Europe, an error of one taxi or less per hour per zone was achieved in 64% of the cases; in South America, the figure was 40%. In both cities, it beat the SARIMA and stacked ensemble benchmarks. This performance was achieved by tuning the hyperparameters with a Bayesian optimization algorithm. Additionally, weather and holiday features were added as input features in the northern European city, but they did not improve the accuracy of TDNet.
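TDNet builds on WaveNet-style causal dilated convolutions, so a small sketch of that building block may be useful. The code below applies a stack of causal 1-D convolutions with doubling dilation to a toy hourly demand series; it is a schematic of the mechanism only, with assumed kernel sizes and random weights, not the TDNet architecture.

    import numpy as np

    def causal_dilated_conv1d(x, kernel, dilation):
        # 1-D causal convolution: the output at time t depends only on inputs at
        # times <= t, with taps spaced `dilation` steps apart, as in WaveNet.
        k = len(kernel)
        pad = (k - 1) * dilation
        x_pad = np.concatenate([np.zeros(pad), x])
        return np.array([sum(kernel[i] * x_pad[t + pad - i * dilation] for i in range(k))
                         for t in range(len(x))])

    rng = np.random.default_rng(0)
    demand = rng.random(48)                        # 48 hourly demand values for one zone (toy data)
    h = demand
    for dilation in (1, 2, 4, 8):                  # stacked layers double the receptive field
        kernel = rng.normal(scale=0.5, size=2)     # kernel size 2, as in WaveNet
        h = np.tanh(causal_dilated_conv1d(h, kernel, dilation))
    # h[-1] now summarises roughly the last 16 hours and would feed a forecasting
    # head that outputs the next 26 hours of demand for the zone.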
APA, Harvard, Vancouver, ISO, and other styles
7

Taylor, Neill Richard. "Neural models of temporal sequences." Thesis, King's College London (University of London), 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.300844.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Calvert, David. "A distance-based neural network model for sequence processing." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1998. http://www.collectionscanada.ca/obj/s4/f2/dsk2/tape17/PQDD_0010/NQ30591.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Schmidle, Wolfgang. "A model of neural sequence detectors for sentence processing." Thesis, University of Sunderland, 2007. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.439973.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Huang, Yiming. "Phoneme Recognition Using Neural Network and Sequence Learning Model." Ohio University / OhioLINK, 2009. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1236027180.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Neural Sequence Models"

1

Mechanisms of implicit learning: Connectionist models of sequence processing. Cambridge, Mass: MIT Press, 1993.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Keeler, James David. Capacity for patterns and sequences in Kanerva's SDM as compared to other associative memory models. [Moffett Field, Calif.]: Research Institute for Advanced Computer Science, NASA Ames Research Center, 1987.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Skelton, Kimberley, ed. Early Modern Spaces in Motion. NL Amsterdam: Amsterdam University Press, 2020. http://dx.doi.org/10.5117/9789463725811.

Full text
Abstract:
Stretching back to antiquity, motion had been a key means of designing and describing the physical environment. But during the sixteenth through eighteenth centuries, individuals across Europe increasingly designed, experienced, and described a new world of motion: one characterized by continuous, rather than segmented, movement. New spaces that included vistas along house interiors and uninterrupted library reading rooms offered open expanses for shaping sequences of social behaviour, scientists observed how the Earth rotated around the sun, and philosophers attributed emotions to neural vibrations in the human brain. Early Modern Spaces in Motion examines this increased emphasis on motion with eight essays encompassing a geographical span of Portugal to German-speaking lands and a disciplinary range from architectural history to English. It consequently merges longstanding strands of analysis considering people in motion and buildings in motion to explore the cultural historical attitudes underpinning the varied impacts of motion in early modern Europe.
APA, Harvard, Vancouver, ISO, and other styles
4

Cleeremans, Axel. Mechanisms of Implicit Learning: Connectionist Models of Sequence Processing. MIT Press, 2019.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

Nieder, Andreas. Neuronal Correlates of Non-verbal Numerical Competence in Primates. Edited by Roi Cohen Kadosh and Ann Dowker. Oxford University Press, 2014. http://dx.doi.org/10.1093/oxfordhb/9780199642342.013.027.

Full text
Abstract:
Non-verbal numerical competence, such as the estimation of set size, is rooted in biological primitives that can also be explored in animals. Over the past years, the anatomical substrates and neuronal mechanisms of numerical cognition in primates have been unravelled down to the level of single neurons. Studies with behaviourally-trained monkeys have identified a parietofrontal network of individual neurons selectively tuned to the number of items (cardinal aspect) or the rank of items in a sequence (ordinal aspect). The properties of these neurons’ numerosity tuning curves can explain fundamental psychophysical phenomena, such as the numerical distance and size effect. Functionally overlapping groups of parietal neurons represent not only numerable-discrete quantity (numerosity), but also innumerable-continuous quantity (extent) and relations between quantities (proportions), supporting the idea of a generalized magnitude system in the brain. Moreover, many neurons in the prefrontal cortex establish semantic associations between signs and abstract numerical categories, a neuronal precursor mechanism that may ultimately give rise to symbolic number processing in humans. These studies establish putative homologies between the monkey and human brain, and demonstrate the suitability of non-human primates as a model system to explore the neurobiological roots of the brain’s non-verbal quantification system, which may constitute the phylogenetic and ontogenetic foundation of all further, more elaborate numerical skills in humans.
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Neural Sequence Models"

1

Ghatak, Abhijit. "Recurrent Neural Networks (RNN) or Sequence Models." In Deep Learning with R, 207–37. Singapore: Springer Singapore, 2019. http://dx.doi.org/10.1007/978-981-13-5850-0_8.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

da Costa, Pablo, and Gustavo H. Paetzold. "Effective Sequence Labeling with Hybrid Neural-CRF Models." In Lecture Notes in Computer Science, 490–98. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-99722-3_49.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Ratajczak, Martin, Sebastian Tschiatschek, and Franz Pernkopf. "Structured Regularizer for Neural Higher-Order Sequence Models." In Machine Learning and Knowledge Discovery in Databases, 168–83. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-23528-8_11.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Helali, Mossad, Thomas Kleinbauer, and Dietrich Klakow. "Assessing Unintended Memorization in Neural Discriminative Sequence Models." In Text, Speech, and Dialogue, 265–72. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-58323-1_29.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Onoda, Takashi. "Probabilistic Models Based Intrusion Detection Using Sequence Characteristics in Control System Communication." In Engineering Applications of Neural Networks, 155–64. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-11071-4_15.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Majumdar, Srijoni, Nachiketa Chatterjee, Partha Pratim Das, and Amlan Chakrabarti. "Dcube_NN: Tool for Dynamic Design Discovery from Multi-threaded Applications Using Neural Sequence Models." In Advanced Computing and Systems for Security: Volume 14, 75–92. Singapore: Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-16-4294-4_6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Xiong, Zhaohan, Aaqel Nalar, Kevin Jamart, Martin K. Stiles, Vadim V. Fedorov, and Jichao Zhao. "Fully Automatic 3D Bi-Atria Segmentation from Late Gadolinium-Enhanced MRIs Using Double Convolutional Neural Networks." In Statistical Atlases and Computational Models of the Heart. Multi-Sequence CMR Segmentation, CRT-EPiggy and LV Full Quantification Challenges, 63–71. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-39074-7_7.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Grossberg, Stephen, and Rainer W. Paine. "Attentive Learning of Sequential Handwriting Movements: A Neural Network Model." In Sequence Learning, 349–87. Berlin, Heidelberg: Springer Berlin Heidelberg, 2000. http://dx.doi.org/10.1007/3-540-44565-x_16.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Samura, Toshikazu, Motonobu Hattori, and Shun Ishizaki. "Sequence Disambiguation by Functionally Divided Hippocampal CA3 Model." In Neural Information Processing, 117–26. Berlin, Heidelberg: Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11893028_14.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Bastolla, Ugo, Markus Porto, H. Eduardo Roman, and Michele Vendruscolo. "The Structurally Constrained Neutral Model of Protein Evolution." In Structural Approaches to Sequence Evolution, 75–112. Berlin, Heidelberg: Springer Berlin Heidelberg, 2007. http://dx.doi.org/10.1007/978-3-540-35306-5_4.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Neural Sequence Models"

1

Strobelt, Hendrik, Sebastian Gehrmann, Michael Behrisch, Adam Perer, Hanspeter Pfister, and Alexander Rush. "Debugging Sequence-to-Sequence Models with Seq2Seq-Vis." In Proceedings of the 2018 EMNLP Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP. Stroudsburg, PA, USA: Association for Computational Linguistics, 2018. http://dx.doi.org/10.18653/v1/w18-5451.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Konstas, Ioannis, Srinivasan Iyer, Mark Yatskar, Yejin Choi, and Luke Zettlemoyer. "Neural AMR: Sequence-to-Sequence Models for Parsing and Generation." In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). Stroudsburg, PA, USA: Association for Computational Linguistics, 2017. http://dx.doi.org/10.18653/v1/p17-1014.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Yao, Kaisheng, and Geoffrey Zweig. "Sequence-to-sequence neural net models for grapheme-to-phoneme conversion." In Interspeech 2015. ISCA: ISCA, 2015. http://dx.doi.org/10.21437/interspeech.2015-134.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Cintas, Celia, William Ogallo, Aisha Walcott, Sekou L. Remy, Victor Akinwande, and Samuel Osebe. "Towards neural abstractive clinical trial text summarization with sequence to sequence models." In 2019 IEEE International Conference on Healthcare Informatics (ICHI). IEEE, 2019. http://dx.doi.org/10.1109/ichi.2019.8904526.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Shen, Liang-Hsin, Pei-Lun Tai, Chao-Chung Wu, and Shou-De Lin. "Controlling Sequence-to-Sequence Models - A Demonstration on Neural-based Acrostic Generator." In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP): System Demonstrations. Stroudsburg, PA, USA: Association for Computational Linguistics, 2019. http://dx.doi.org/10.18653/v1/d19-3008.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Miller, Jason Rafe, and Donald A. Adjeroh. "Exploring Neural Network Models for LncRNA Sequence Identification." In 2020 IEEE International Conference on Bioinformatics and Biomedicine (BIBM). IEEE, 2020. http://dx.doi.org/10.1109/bibm49941.2020.9313445.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Raganato, Alessandro, Claudio Delli Bovi, and Roberto Navigli. "Neural Sequence Learning Models for Word Sense Disambiguation." In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, PA, USA: Association for Computational Linguistics, 2017. http://dx.doi.org/10.18653/v1/d17-1120.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Sperber, Matthias, Graham Neubig, Jan Niehues, and Alex Waibel. "Neural Lattice-to-Sequence Models for Uncertain Inputs." In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, PA, USA: Association for Computational Linguistics, 2017. http://dx.doi.org/10.18653/v1/d17-1145.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Yannakoudakis, Helen, Marek Rei, Øistein E. Andersen, and Zheng Yuan. "Neural Sequence-Labelling Models for Grammatical Error Correction." In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, PA, USA: Association for Computational Linguistics, 2017. http://dx.doi.org/10.18653/v1/d17-1297.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Mohan, Devang S. Ram, Raphael Lenain, Lorenzo Foglianti, Tian Huey Teh, Marlene Staib, Alexandra Torresquintero, and Jiameng Gao. "Incremental Text to Speech for Neural Sequence-to-Sequence Models Using Reinforcement Learning." In Interspeech 2020. ISCA: ISCA, 2020. http://dx.doi.org/10.21437/interspeech.2020-1822.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Neural Sequence Models"

1

Farhi, Edward, and Hartmut Neven. Classification with Quantum Neural Networks on Near Term Processors. Web of Open Science, December 2020. http://dx.doi.org/10.37686/qrl.v1i2.80.

Full text
Abstract:
We introduce a quantum neural network, QNN, that can represent labeled data, classical or quantum, and be trained by supervised learning. The quantum circuit consists of a sequence of parameter dependent unitary transformations which acts on an input quantum state. For binary classification a single Pauli operator is measured on a designated readout qubit. The measured output is the quantum neural network’s predictor of the binary label of the input state. We show through classical simulation that parameters can be found that allow the QNN to learn to correctly distinguish the two data sets. We then discuss presenting the data as quantum superpositions of computational basis states corresponding to different label values. Here we show through simulation that learning is possible. We consider using our QNN to learn the label of a general quantum state. By example we show that this can be done. Our work is exploratory and relies on the classical simulation of small quantum systems. The QNN proposed here was designed with near-term quantum processors in mind. Therefore it will be possible to run this QNN on a near term gate model quantum computer where its power can be explored beyond what can be explored with simulation.
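For readers unfamiliar with the circuit structure described above, the sketch below simulates a tiny quantum classifier in the same spirit: a bit string is encoded as a computational basis state, parameter-dependent two-qubit unitaries couple each data qubit to a readout qubit, and the expectation of Pauli Z on the readout qubit serves as the label score. It is a classical toy simulation with assumed gate choices, not the authors' circuit or training procedure.

    import numpy as np

    I2 = np.eye(2, dtype=complex)
    X = np.array([[0, 1], [1, 0]], dtype=complex)
    Z = np.array([[1, 0], [0, -1]], dtype=complex)

    def op_on(pauli, target, n):
        # Embed a single-qubit Pauli acting on `target` inside an n-qubit register.
        mats = [I2] * n
        mats[target] = pauli
        out = np.array([[1.0 + 0j]])
        for m in mats:
            out = np.kron(out, m)
        return out

    def zx_rotation(theta, data_q, readout_q, n):
        # exp(-i * theta * Z_data x X_readout); because (Z x X)^2 = I this equals
        # cos(theta) * I - i * sin(theta) * Z_data . X_readout.
        ZX = op_on(Z, data_q, n) @ op_on(X, readout_q, n)
        return np.cos(theta) * np.eye(2 ** n) - 1j * np.sin(theta) * ZX

    def qnn_predict(bits, thetas):
        # Encode a bit string as a computational basis state on the data qubits
        # (the readout qubit starts in |0>), apply one parameterised ZX gate per
        # data qubit, and return <Z> on the readout qubit as the label score.
        n = len(bits) + 1                                   # data qubits + 1 readout qubit
        state = np.zeros(2 ** n, dtype=complex)
        state[int("".join(map(str, bits)) + "0", 2)] = 1.0
        for q, theta in zip(range(len(bits)), thetas):
            state = zx_rotation(theta, q, n - 1, n) @ state
        Z_readout = op_on(Z, n - 1, n)
        return float(np.real(state.conj() @ Z_readout @ state))   # value in [-1, 1]

    print(qnn_predict(bits=[1, 0, 1], thetas=[0.4, 0.9, -0.3]))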
APA, Harvard, Vancouver, ISO, and other styles
2

Irudayaraj, Joseph, Ze'ev Schmilovitch, Amos Mizrach, Giora Kritzman, and Chitrita DebRoy. Rapid detection of food borne pathogens and non-pathogens in fresh produce using FT-IRS and raman spectroscopy. United States Department of Agriculture, October 2004. http://dx.doi.org/10.32747/2004.7587221.bard.

Full text
Abstract:
Rapid detection of pathogens and hazardous elements in fresh fruits and vegetables after harvest requires the use of advanced sensor technology at each step in the farm-to-consumer or farm-to-processing sequence. Fourier-transform infrared (FTIR) spectroscopy and the complementary Raman spectroscopy, an advanced optical technique based on light scattering will be investigated for rapid and on-site assessment of produce safety. Paving the way toward the development of this innovative methodology, specific original objectives were to (1) identify and distinguish different serotypes of Escherichia coli, Listeria monocytogenes, Salmonella typhimurium, and Bacillus cereus by FTIR and Raman spectroscopy, (2) develop spectroscopic fingerprint patterns and detection methodology for fungi such as Aspergillus, Rhizopus, Fusarium, and Penicillium (3) to validate a universal spectroscopic procedure to detect foodborne pathogens and non-pathogens in food systems. The original objectives proposed were very ambitious hence modifications were necessary to fit with the funding. Elaborate experiments were conducted for sensitivity, additionally, testing a wide range of pathogens (more than selected list proposed) was also necessary to demonstrate the robustness of the instruments, most crucially, algorithms for differentiating a specific organism of interest in mixed cultures was conceptualized and validated, and finally neural network and chemometric models were tested on a variety of applications. Food systems tested were apple juice and buffer systems. Pathogens tested include Enterococcus faecium, Salmonella enteritidis, Salmonella typhimurium, Bacillus cereus, Yersinia enterocolitis, Shigella boydii, Staphylococus aureus, Serratiamarcescens, Pseudomonas vulgaris, Vibrio cholerae, Hafniaalvei, Enterobacter cloacae, Enterobacter aerogenes, E. coli (O103, O55, O121, O30 and O26), Aspergillus niger (NRRL 326) and Fusarium verticilliodes (NRRL 13586), Saccharomyces cerevisiae (ATCC 24859), Lactobacillus casei (ATCC 11443), Erwinia carotovora pv. carotovora and Clavibacter michiganense. Sensitivity of the FTIR detection was 103CFU/ml and a clear differentiation was obtained between the different organisms both at the species as well as at the strain level for the tested pathogens. A very crucial step in the direction of analyzing mixed cultures was taken. The vector based algorithm was able to identify a target pathogen of interest in a mixture of up to three organisms. Efforts will be made to extend this to 10-12 key pathogens. The experience gained was very helpful in laying the foundations for extracting the true fingerprint of a specific pathogen irrespective of the background substrate. This is very crucial especially when experimenting with solid samples as well as complex food matrices. Spectroscopic techniques, especially FTIR and Raman methods are being pursued by agencies such as DARPA and Department of Defense to combat homeland security. Through the BARD US-3296-02 feasibility grant, the foundations for detection, sample handling, and the needed algorithms and models were developed. Successive efforts will be made in transferring the methodology to fruit surfaces and to other complex food matrices which can be accomplished with creative sampling methods and experimentation. Even a marginal success in this direction will result in a very significant breakthrough because FTIR and Raman methods, in spite of their limitations are still one of most rapid and nondestructive methods available. 
Continued interest and efforts in improving the components as well as the refinement of the procedures is bound to result in a significant breakthrough in sensor technology for food safety and biosecurity.
APA, Harvard, Vancouver, ISO, and other styles
3

Rafaeli, Ada, Russell Jurenka, and Chris Sander. Molecular characterisation of PBAN-receptors: a basis for the development and screening of antagonists against Pheromone biosynthesis in moth pest species. United States Department of Agriculture, January 2008. http://dx.doi.org/10.32747/2008.7695862.bard.

Full text
Abstract:
The original objectives of the approved proposal included: (a) The determination of species- and tissue-specificity of the PBAN-R; (b) the elucidation of the role of juvenile hormone in gene regulation of the PBAN-R; (c) the identificationof the ligand binding domains in the PBAN-R and (d) the development of efficient screening assays in order to screen potential antagonists that will block the PBAN-R. Background to the topic: Moths constitute one of the major groups of pest insects in agriculture and their reproductive behavior is dependent on chemical communication. Sex-pheromone blends are utilised by a variety of moth species to attract conspecific mates. In most of the moth species sex-pheromone biosynthesis is under circadian control by the neurohormone, PBAN (pheromone-biosynthesis-activating neuropeptide). In order to devise ideal strategies for mating disruption/prevention, we proposed to study the interactions between PBAN and its membrane-bound receptor in order to devise potential antagonists. Major conclusions: Within the framework of the planned objectives we have confirmed the similarities between the two Helicoverpa species: armigera and zea. Receptor sequences of the two Helicoverpa spp. are 98% identical with most changes taking place in the C-terminal. Our findings indicate that PBAN or PBAN-like receptors are also present in the neural tissues and may represent a neurotransmitter-like function for PBAN-like peptides. Surprisingly the gene encoding the PBAN-receptor was also present in the male homologous tissue, but it is absent at the protein level. The presence of the receptor (at the gene- and protein-levels), and the subsequent pheromonotropic activity are age-dependent and up-regulated by Juvenile Hormone in pharate females but down-regulated by Juvenile Hormone in adult females. Lower levels of pheromonotropic activity were observed when challenged with pyrokinin-like peptides than with HezPBAN as ligand. A model of the 3D structure of the receptor was created using the X-ray structure of rhodopsin as a template after sequence alignment of the HezPBAN-R with several other GPCRs and computer simulated docking with the model predicted putative binding sites. Using in silico mutagenesis the predicted docking model was validated with experimental data obtained from expressed chimera receptors in Sf9 cells created by exchanging between the three extracellular loops of the HezPBAN-R and the Drosophila Pyrokinin-R (CG9918). The chimera receptors also indicated that the 3ʳᵈ extracellular loop is important for recognition of PBAN or Diapause hormone ligands. Implications: The project has successfully completed all the objectives and we are now in a position to be able to design and screen potential antagonists for pheromone production. The successful docking simulation-experiments encourage the use of in silico experiments for initial (high-throughput) screening of potential antagonists. However, the differential responses between the expressed receptor (Sf9 cells) and the endogenous receptor (pheromone glands) emphasize the importance of assaying lead compounds using several alternative bioassays (at the cellular, tissue and organism levels). The surprising discovery of the presence of the gene encoding the PBAN-R in the male homologous tissue, but its absence at the protein level, launches opportunities for studying molecular regulation pathways and the evolution of these GPCRs. 
Overall this research will advance research towards the goal of finding antagonists for this important class of receptors that might encompass a variety of essential insect functions.
APA, Harvard, Vancouver, ISO, and other styles
4

Yaron, Zvi, Abigail Elizur, Martin Schreibman, and Yonathan Zohar. Advancing Puberty in the Black Carp (Mylopharyngodon piceus) and the Striped Bass (Morone saxatilis). United States Department of Agriculture, January 2000. http://dx.doi.org/10.32747/2000.7695841.bard.

Full text
Abstract:
Both the genes and cDNA sequences encoding the b-subunits of black carp LH and FSH were isolated, cloned and sequenced. Sequence analysis of the bcFSHb and LHb5'flanking regions revealed that the promoter region of both genes contains canonical TATA sequences, 30 bp and 17 bp upstream of the transcription start site of FSHb and LHb genes, respectively. In addition, they include several sequences of cis-acting motifs, required for inducible and tissue-specific transcriptional regulation: the gonadotropin-specific element (GSE), GnRH responsive element (GRE), half sites of estrogen and androgen response elements, cAMP response element, and AP1. Several methods have been employed by the Israeli team to purify the recombinant b subunits (EtOH precipitation, gel filtration and lentil lectin). While the final objective to produce pure recombinantGtH subunits has not yet been achieved, we have covered much ground towards this goal. The black carp ovary showed a gradual increase in both mass and oocyte diameter. First postvitellogenic oocytes were found in 5 yr old fish. At this age, the testes already contained spermatozoa. The circulating LH levels increased from 0.5 ng/ml in 4 yr old fish to >5ng/ml in 5 yr old fish. In vivo challenge experiments in black carp showed the initial LH response of the pituitary to GnRH in 4 yr old fish. The response was further augmented in 5 yr old fish. The increase in estradiol level in response to gonadotropic stimulation was first noted in 4 yr old fish but this response was much stronger in the following year. In vivo experiments on the FSHb and LHb mRNA levels in response to GnRH were carried out on common carp as a model for synchronom spawning cyprinids. These experiments showed the prevalence of FSHP in maturing fish while LHP mRNA was prevalent in mature fish, especially in females. The gonadal fat-pad was found to originate from the retroperitoneal mesoderm and not from the genital ridge, thus differing from that reported in certain amphibians This tissue possibly serves as the major source of sex steroids in the immature black carp. However, such a function is taken over by the developing gonads in 4 yr old fish. In the striped bass, we described the ontogeny of the neuro-endocrine parameters along the brain-pituitary-gonadal axis during the first four years of life, throughout gonadal development and the onset of puberty. We also described the responsiveness of the reproductive axis to long-term hormonal manipulations at various stages of gonadal development. Most males reached complete sexual maturity during the first year of life. Puberty was initiated during the third year of life in most females, but this first reproductive cycle did not lead to the acquisition of full sexual maturity. This finding indicates that more than one reproductive cycle may be required before adulthood is reached. Out of the three native GnRHs present in striped bass, only sbGnRH and cGnRH II increased concomitantly with the progress of gonadal development and the onset of puberty. This finding, together with data on GtH synthesis and release, suggests that while sbGnRH and cGnRH II may be involved in the regulation of puberty in striped bass, these neuropeptides are not limiting factors to the onset of puberty. Plasma LH levels remained low in all fish, suggesting that LH plays only a minor role in early gonadal development. 
This hypothesis was further supported by the finding that experimentally elevated plasma LH levels did not result in the induction of complete ovarian and testicular development. The acquisition of complete puberty in 4 yr old females was associated with a rise in the mRNA levels of all GtH subunit genes, including a 218-fold increase in the mRNA levels of bFSH. mRNA levels of the a and PLH subunits increased only 11- and 8-fold, respectively. Although data on plasma FSH levels are unavailable, the dramatic increase in bFSH mRNA suggests a pivotal role for this hormone in regulating the onset and completion of puberty in striped bass. The hormonal regulation of the onset of puberty and of GtH synthesis and release was studied by chronic administration of testosterone (T) and/or an analog of gonadotropin-releasing hormone (G). Sustained administration of T+G increased the mRNA levels of the PLH subunit to the values characteristic of sexually mature fish, and also increased the plasma levels of LH. However, these changes did not result in the acceleration of sexual maturation. The mRNA levels of the bFSH subunit were slightly stimulated, but remained about 1/10 of the values characteristic of sexually mature fish. It is concluded that the stimulation of FSH gene expression and release does not lead to the acceleration of sexual maturity, and that the failure to sufficiently stimulate the bFSH subunit gene expression may underlie the inability of the treatments to advance sexual maturity. Consequently, FSH is suggested to be the key hormone to the initiation and completion of puberty in striped bass. Future efforts to induce precocious puberty in striped bass should focus on understanding the regulation of FSH synthesis and release and on developing technologies to induce these processes. Definite formulation of hormonal manipulation to advance puberty in the striped bass and the black carp seems to be premature at this stage. However, the project has already yielded a great number of experimental tools of DNA technology, slow-release systems and endocrine information on the process of puberty. These systems and certain protocols have been already utilized successfully to advance maturation in other fish (e.g. grey mullet) and will form a base for further study on fish puberty.
APA, Harvard, Vancouver, ISO, and other styles
5

Altstein, Miriam, and Ronald Nachman. Rationally designed insect neuropeptide agonists and antagonists: application for the characterization of the pyrokinin/Pban mechanisms of action in insects. United States Department of Agriculture, October 2006. http://dx.doi.org/10.32747/2006.7587235.bard.

Full text
Abstract:
The general objective of this BARD project focused on rationally designed insect neuropeptide (NP) agonists and antagonists, their application for the characterization of the mechanisms of action of the pyrokinin/PBAN (PK-PBAN) family and the development of biostable, bioavailable versions that can provide the basis for development of novel, environmentally-friendly pest insect control agents. The specific objectives of the study, as originally proposed, were to: (i) Test stimulatory potencies of rationally designed backbone cyclic (BBC) peptides on pheromonotropic, melanotropic, myotropic and pupariation activities; (ii) Test the inhibitory potencies of the BBC compounds on the above activities evoked either by synthetic peptides (PBAN, LPK, myotropin and pheromonotropin) or by the natural endogenous mechanism; (iii) Determine the bioavailability of the most potent BBC compounds that will be found in (ii); (iv) Design, synthesize and examine novel PK/PBAN analogs with enhanced bioavailability and receptor binding; (v) Design and synthesize ‘magic bullet’ analogs and examine their ability to selectively kill cells expressing the PK/PBAN receptor. To achieve these goals the agonistic and antagonistic activities/properties of rationally designed linear and BBC neuropeptide (NP) were thoroughly studied and the information obtained was further used for the design and synthesis of improved compounds toward the design of an insecticide prototype. The study revealed important information on the structure activity relationship (SAR) of agonistic/antagonistic peptides, including definitive identification of the orientation of the Pro residue as trans for agonist activity in 4 PK/PBANbioassays (pheromonotropic, pupariation, melanotropic, & hindgut contractile) and a PK-related CAP₂b bioassay (diuretic); indications that led to the identification of a novel scaffold to develop biostbiostable, bioavailable peptidomimetic PK/PBANagonists/antagonists. The work led to the development of an arsenal of PK/PBAN antagonists with a variety of selectivity profiles; whether between different PKbioassays, or within the same bioassay between different natural elicitors. Examples include selective and non-selective BBC and novel amphiphilic PK pheromonotropic and melanotropic antagonists some of which are capable of penetrating the moth cuticle in efficacious quantities. One of the latter analog group demonstrated unprecedented versatility in its ability to antagonize a broad spectrum of pheromonotropic elicitors. A novel, transPro mimetic motif was proposed & used to develop a strong, selective PK agonist of the melanotropic bioassay in moths. The first antagonist (pure) of PK-related CAP₂b diuresis in flies was developed using a cisPro mimetic motif; an indication that while a transPro orientation is associated with receptor agonism, a cisPro orientation is linked with an antagonist interaction. A novel, biostablePK analog, incorporating β-amino acids at key peptidase-susceptible sites, exhibited in vivo pheromonotropic activity that by far exceeded that of PBAN when applied topically. Direct analysis of neural tissue by state-of-the-art MALDI-TOF/TOF mass spectrometry was used to identify specific PK/PK-related peptides native to eight arthropod pest species [house (M. domestica), stable (S. calcitrans), horn (H. irritans) & flesh (N. bullata) flies; Southern cattle fever tick (B. microplus), European tick (I. ricinus), yellow fever mosquito (A. aegypti), & Southern Green Stink Bug (N. 
viridula)]; including the unprecedented identification of mass-identical Leu/Ile residues and the first identification of NPs from a tick or the CNS of Hemiptera. Evidence was obtained for the selection of Neb-PK-2 as the primary pupariation factor of the flesh fly (N. bullata) among native PK/PK-related candidates. The peptidomic techniques were also used to map the location of PK/PK-related NP in the nervous system of the model fly D. melanogaster. Knowledge of specific PK sequences can aid in the future design of species specific (or non-specific) NP agonists/antagonists. In addition, the study led to the first cloning of a PK/PBAN receptor from insect larvae (S. littoralis), providing the basis for SAR analysis for the future design of 2ⁿᵈgeneration selective and/or nonselective agonists/antagonists. Development of a microplate ligand binding assay using the PK/PBAN pheromone gland receptor was also carried out. The assay will enable screening, including high throughput, of various libraries (chemical, molecular & natural product) for the discovery of receptor specific agonists/antagonists. In summary, the body of work achieves several key milestones and brings us significantly closer to the development of novel, environmentally friendly pest insect management agents based on insect PK/PBANNPs capable of disrupting critical NP-regulated functions.
APA, Harvard, Vancouver, ISO, and other styles
