Selected scientific literature on the topic "Ontologie génique"
Below is a list of current articles, books, theses, conference proceedings, and other scholarly sources on the topic "Ontologie génique".
Journal articles on the topic "Ontologie génique":
Raimbault, Benjamin. "Dans l’ombre du génie génétique : le génie métabolique". Natures Sciences Sociétés 29, no. 3 (July 2021): 262–73. http://dx.doi.org/10.1051/nss/2021063.
Prole, Dragan. "Lajbnicova ontologija pomirenja". ARHE 12, no. 23 (16 June 2016): 9. http://dx.doi.org/10.19090/arhe.2015.23.9-34.
Theses on the topic "Ontologie génique":
Bedhiafi, Walid. "Sciences de l'information pour l'étude des systèmes biologiques (exemple du vieillissement du système immunitaire)". Electronic Thesis or Diss., Paris 6, 2017. https://accesdistant.sorbonne-universite.fr/login?url=https://theses-intra.sorbonne-universite.fr/2017PA066139.pdf.
High-throughput experimental approaches to gene expression study involve several processing steps for the quantification, annotation, and interpretation of the results. The i3 lab and the LGIPH apply these approaches in various experimental setups. However, limitations have been observed when conventional approaches are used to annotate gene expression signatures. The main objective of this thesis was to develop an alternative annotation approach to overcome this problem. The approach we developed is based on the contextualization of genes and their products, followed by biological pathway modeling, to produce a knowledge base for the study of gene expression. We define a gene expression context as follows: cell population + anatomical compartment + pathological condition. To produce gene contexts, we opted for massive screening of the literature. We developed a Python package that annotates texts according to three ontologies chosen to fit our definition of context, and we show here that it outperforms the reference tool for text annotation. We used our package to screen a text corpus on the aging immune system; the results are presented here. To model biological pathways, we developed, in collaboration with the LIPAH lab, a modeling method based on a genetic algorithm that combines the semantic proximity of results, using the Biological Process ontology, with interaction data from db-string. We were able to find networks with an error rate of 0.47.
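The contextual annotation step described in this abstract can be sketched as a simple dictionary-based tagger over the three context axes. This is an illustrative sketch only: the ontology term lists below are invented placeholders, not the actual ontologies or the (unnamed) Python package from the thesis.

```python
import re

# Illustrative term lists standing in for the three ontologies that
# define a context: cell population + anatomical compartment + condition.
ONTOLOGIES = {
    "cell_population": {"t cell", "b cell", "macrophage"},
    "anatomical_compartment": {"thymus", "spleen", "lymph node"},
    "pathological_condition": {"aging", "inflammation"},
}

def annotate(text):
    """Return the ontology terms found in `text`, grouped by axis."""
    found = {axis: set() for axis in ONTOLOGIES}
    lowered = text.lower()
    for axis, terms in ONTOLOGIES.items():
        for term in terms:
            # Word-boundary match so "aging" does not hit "managing".
            if re.search(r"\b" + re.escape(term) + r"\b", lowered):
                found[axis].add(term)
    return found

def context(text):
    """A gene expression context requires a hit on all three axes."""
    ann = annotate(text)
    if all(ann.values()):
        return {axis: sorted(terms) for axis, terms in ann.items()}
    return None
```

A real pipeline would load the term lists from the ontology files and handle synonyms and morphological variants, but the matching logic is of this shape.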
Garcia, del Rio Diego Fernando. "Studying protein complexes for assessing the function of ghost proteins (Ghost in the Cell)". Electronic Thesis or Diss., Université de Lille (2022-....), 2023. https://pepite-depot.univ-lille.fr/ToutIDP/EDBSL/2023/2023ULILS115.pdf.
Ovarian cancer (OvCa) has the highest mortality rate among female reproductive cancers worldwide. OvCa is often referred to as a stealth killer because it is commonly diagnosed late or misdiagnosed. Once diagnosed, OvCa treatment options include surgery or chemotherapy. However, chemotherapy resistance is a significant obstacle. Therefore, there is an urgent need to identify new targets and develop novel therapeutic strategies to overcome therapy resistance. In this context, the ghost proteome is a potentially rich source of biomarkers. The ghost proteome, also known as the alternative proteome, consists of proteins translated from alternative open reading frames (AltORFs). These AltORFs originate from different start codons within mRNA molecules, such as frameshifts (+1, +2) in the coding DNA sequence (CDS), the 5'-UTR, the 3'-UTR, and possible translation products of non-coding RNAs (ncRNA). Studies on alternative proteins (AltProts) are often limited due to their case-by-case occurrence and complexity. Obtaining functional information on AltProts requires complex and costly biomolecular studies. However, their functions can be inferred by profiling their interaction partners, a "guilt by association" approach. Indeed, assessing AltProts' protein-protein interactions (PPIs) with reference proteins (RefProts) can help identify their function and establish them as research targets. Since antibodies against AltProts are lacking, crosslinking mass spectrometry (XL-MS) is an appropriate tool for this task. Additionally, bioinformatic tools that link protein functional information through networks and gene ontology (GO) analysis are also powerful.
These tools enable the visualization of signaling pathways and the grouping of RefProts by biological process, molecular function, or cellular localization, thus enhancing our understanding of cellular mechanisms. In this work, we developed a methodology that combines XL-MS and subcellular fractionation. The key subcellular fractionation step allowed us to reduce the complexity of the samples analyzed by liquid chromatography tandem mass spectrometry (LC-MS/MS). To assess the validity of crosslinked interactions, we performed molecular modeling of the 3D structures of the AltProts, followed by docking studies and measurement of the corresponding crosslink distances. Network analysis indicated potential roles for AltProts in biological functions and processes. The advantages of this workflow include non-targeted AltProt identification and subcellular localization. Additionally, a proteogenomic analysis was performed to investigate the proteomes of two ovarian cancer cell lines (PEO-4 and SKOV-3) in comparison to a normal ovarian epithelial cell line (T1074). Using RNA-seq data, customized protein databases were generated for each cell line. Differential expression of several proteins, including AltProts, was identified between the cancer and normal cell lines. The expression of some RefProts and their transcripts was associated with cancer-related pathways. Moreover, the XL-MS methodology described above was used to identify PPIs in the cancerous cell lines. This work highlights the significant potential of proteogenomics in uncovering new aspects of ovarian cancer biology. It enables us to identify previously unknown proteins and variants that may have functional significance. The use of customized protein databases and the crosslinking approach has shed light on the "ghost proteome," an area that had remained unexplored until now.
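The GO analysis step mentioned above typically asks whether a functional term is over-represented among a protein's interaction partners compared with the background. A minimal sketch of such an over-representation test, using the hypergeometric tail probability; the gene counts in the example are invented for illustration:

```python
from math import comb

def hypergeom_pvalue(k, n, K, N):
    """P(X >= k) for X ~ Hypergeometric(N, K, n).

    k: annotated genes among the hits, n: hit-list size,
    K: background genes carrying the GO term, N: background size.
    """
    total = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(n, K) + 1)) / total

# Invented example: 40 of 1000 background genes carry the term,
# and 5 of the 20 partners found by XL-MS carry it.
p = hypergeom_pvalue(5, 20, 40, 1000)
```

Under these numbers the expected count is 0.8, so observing 5 annotated partners yields a small p-value, flagging the term as enriched; real tools additionally correct for multiple testing.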
Lalloz, Jean-Pierre. "Génie et vérité. Ontologie de la vérité et métaphysique des créateurs". Lille 3, 1991. http://www.theses.fr/1991LIL30004.
Every practice and every theory presupposes an implicit conception of truth, which must be questioned not simply as to its existence, given the juridical nature of this notion, but as to the invention of its essence and the legitimacy of that invention. This yields a definition of genius, so the questions of truth and genius meet, not as a positive reality but as a juridical necessity. After an introduction stating that philosophy must create the definition of truth of which it will be the explication, and must therefore itself be a work of genius, as its history attests, the first part studies the nature of what is true through the creations of genius, the works. These are questioned about the worldly vanity they assume and about the legitimacy of their existence and institution. The next problem of genius is to reconcile the juridical reality of its notion with the ontological nature of truth, which is by definition substantial. The second part is devoted to the development of the knowledge that genius necessarily involves: the creator of genius must possess the answers to the most fundamental questions of ontology and metaphysics; we clarify each question and bring out the answer implied by the notion of an absolutely legitimate speaker. The conclusion concerns the notion of the person, a subject of right and not of fact, as entirely constituted by the "subconscious" supposition of what is true, and leads to a "psychoanalysis of right".
Laadhar, Amir. "Local matching learning of large scale biomedical ontologies". Thesis, Toulouse 3, 2019. http://www.theses.fr/2019TOU30126.
Although a considerable body of research work has addressed the problem of ontology matching, few studies have tackled the large ontologies used in the biomedical domain. We introduce a fully automated local matching learning approach that breaks down a large ontology matching task into a set of independent local sub-matching tasks. This approach integrates a novel partitioning algorithm as well as a set of matching learning techniques. The partitioning method is based on hierarchical clustering and does not generate isolated partitions. The matching learning approach employs different techniques: (i) local matching tasks are independently and automatically aligned using their local classifiers, which are based on local training sets built from element-level and structure-level features, (ii) resampling techniques are used to balance each local training set, and (iii) feature selection techniques are used to automatically select the appropriate tuning parameters for each local matching context. Our local matching learning approach generates a set of combined alignments from each local matching task, and experiments show that a multiple local classifier approach outperforms conventional, state-of-the-art approaches, which use a single classifier for the whole ontology matching task. In addition, focusing on context-aware local training sets based on local feature selection and resampling techniques significantly enhances the obtained results.
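The element-level features feeding the local classifiers are typically string similarities between concept labels, computed partition by partition. A minimal sketch under stated assumptions: the labels are invented, and a fixed similarity threshold stands in for the trained per-partition classifiers described in the abstract.

```python
from difflib import SequenceMatcher

def label_similarity(a, b):
    """Element-level feature: normalized similarity of two concept labels."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def match_partition(source_labels, target_labels, threshold=0.8):
    """Align one local sub-matching task: greedy best match per source label."""
    alignment = []
    for s in source_labels:
        best = max(target_labels, key=lambda t: label_similarity(s, t))
        score = label_similarity(s, best)
        if score >= threshold:
            alignment.append((s, best, round(score, 2)))
    return alignment

# Two toy partitions of a biomedical matching task (invented labels);
# the final alignment combines the local results.
partitions = [
    (["Heart", "Cardiac valve"], ["heart", "valve of heart"]),
    (["Neuron", "Axon"], ["nerve cell", "axon"]),
]
combined = [pair for src, tgt in partitions
            for pair in match_partition(src, tgt)]
```

Because each partition is matched independently, the sub-tasks can run in parallel and each can use its own threshold or classifier, which is the point of the local approach.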
Fortineau, Virginie. "Contribution à une modélisation ontologique des informations tout au long du cycle de vie du produit". Phd thesis, Ecole nationale supérieure d'arts et métiers - ENSAM, 2013. http://pastel.archives-ouvertes.fr/pastel-01064598.
Pham, Tuan Anh. "OntoApp : une approche déclarative pour la simulation du fonctionnement d’un logiciel dès une étape précoce du cycle de vie de développement". Thesis, Université Côte d'Azur (ComUE), 2017. http://www.theses.fr/2017AZUR4075/document.
In this thesis, we study several models of collaboration between software engineering and the Semantic Web. Building on the state of the art, we propose an approach to using ontology in the business application layer. The main objective of our work is to provide the developer with tools to design, in a declarative manner, an "executable" business layer of an application in order to simulate its operation and thus show the application's compliance with the customer requirements defined at the beginning of the software life cycle. Another advantage of this approach is that it allows the developer to share and reuse the business-layer description of a typical application in a domain using an ontology. This typical application description is called an "Application Template". The reuse of the business-layer description of an application is an interesting aspect of software engineering; that is the key point we consider in this thesis. In the first part of the thesis, we deal with the modeling of the business layer. We first present an ontology-based approach to representing business processes and business rules, and show how to verify the consistency of a business process and a set of business rules. We then present an automatic mechanism for checking the compliance of a business process with a set of business rules. The second part of the thesis defines a methodology, called personalization, for creating an application from an Application Template. This methodology allows the user to build his own application from an Application Template while avoiding deadlocks and semantic errors. At the end of this part we describe an experimental platform illustrating the feasibility of the mechanisms proposed in the thesis; this platform is implemented on a relational DBMS. Finally, we present the conclusion, perspectives, and other annexed work developed during this thesis.
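The compliance check described in this abstract amounts to evaluating each business rule against a declarative description of the process. A toy sketch of that idea; the process, the rules, and the vocabulary are invented for illustration and do not reflect the thesis's actual ontology-based formalism:

```python
# A business process described declaratively as an ordered list of
# activities (invented example).
process = ["receive_order", "check_stock", "ship", "invoice"]

# Business rules as named predicates over the process description.
rules = {
    "ship only after stock check":
        lambda p: "ship" not in p
        or ("check_stock" in p and p.index("check_stock") < p.index("ship")),
    "every shipment is invoiced":
        lambda p: "ship" not in p or "invoice" in p,
}

def check_compliance(process, rules):
    """Return the names of the rules the process violates (empty = compliant)."""
    return [name for name, rule in rules.items() if not rule(process)]

violations = check_compliance(process, rules)            # compliant: []
bad = check_compliance(["receive_order", "ship"], rules)  # violates both rules
```

In the thesis the rules live in an ontology and the check runs before implementation, but the shape of the verification, every rule evaluated against the declared process, is the same.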
Benabderrahmane, Sidahmed. "Prise en compte des connaissances du domaine dans l'analyse transcriptomique : Similarité sémantique, classification fonctionnelle et profils flous : application au cancer colorectal". Electronic Thesis or Diss., Nancy 1, 2011. http://www.theses.fr/2011NAN10097.
Bioinformatic analyses of transcriptomic data aim to identify genes whose expression level varies across tissue samples, for example tissues from healthy versus sick patients, and to characterize these genes on the basis of their functional annotation. In this thesis, I present four contributions for taking domain knowledge into account in these methods. First, I define a new semantic and functional similarity measure that optimally exploits functional annotations from Gene Ontology (GO). Then, I show, thanks to a rigorous evaluation method, that this measure is efficient for the functional classification of genes. In the third contribution, I propose a differential approach with fuzzy assignment for building differential expression profiles (DEPs). I define an algorithm for analyzing overlaps between functional clusters and reference sets, such as the DEPs here, in order to point out genes that have both similar functional annotations and similar variations in expression. This method is applied to experimental data produced from samples of healthy tissue, colorectal tumors, and a cancerous cultured cell line. Finally, the similarity measure IntelliGO is generalized to another structured vocabulary organized, like GO, as a rooted directed acyclic graph, with an application to the semantic reduction of attributes before mining.
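Semantic similarity measures of this kind operate on a rooted directed acyclic graph of terms. A minimal sketch computing a shared-ancestor (Jaccard) similarity on a toy GO-like hierarchy; the terms are invented and the measure is a deliberately simplified stand-in, not IntelliGO itself:

```python
# A toy GO-like rooted DAG: term -> direct parents (invented terms).
PARENTS = {
    "biological_process": [],
    "metabolic_process": ["biological_process"],
    "catabolic_process": ["metabolic_process"],
    "biosynthetic_process": ["metabolic_process"],
    "signaling": ["biological_process"],
}

def ancestors(term):
    """All ancestors of a term in the DAG, including the term itself."""
    result = {term}
    for parent in PARENTS[term]:
        result |= ancestors(parent)
    return result

def similarity(a, b):
    """Shared-ancestor ratio (Jaccard on ancestor sets), in [0, 1]."""
    anc_a, anc_b = ancestors(a), ancestors(b)
    return len(anc_a & anc_b) / len(anc_a | anc_b)
```

Siblings sharing a deep common ancestor score higher than terms that meet only at the root, which is the intuition such measures exploit; measures like IntelliGO refine this with annotation evidence and term weighting.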
Perry, Nicolas. "INDUSTRIALISATION DES CONNAISSANCES : APPROCHES D'INTEGRATION POUR UNE UTILISATION OPTIMALE EN INGENIERIE (CAS DE L'EVALUATION ECONOMIQUE)". Habilitation à diriger des recherches, Université de Nantes, 2007. http://tel.archives-ouvertes.fr/tel-00204563.
We thus develop environments that we want to be interactive, for decision support (DSS).
From then on, and independently of the domain addressed, experts' professional knowledge becomes a resource that we would like to share as widely as possible, or even operate semi-automatically. Yet this knowledge is intangible, often unformalized, and exchanged through flows of information whose quantity, quality, and veracity are not always guaranteed or certified.
It is therefore important to look at knowledge management approaches in order to specify virtual environments that are as rich as possible (semantically speaking), into which part of the experts' knowledge can be integrated. Certain engineering actions, whose processes and rules have been formalized and then modeled, can then be partially automated, leaving the user free to decide in situations not handled by the automated expertise, and to criticize, revoke, or evolve the proposed choices and alternatives.
A significant portion of the studies and research in the information and communication sciences gives pride of place to artificial intelligence. But the application domains often remain very far from engineering use cases (mechanical engineering in particular).
It is therefore important for us to absorb existing approaches, methods, and tools, appropriating them while making them evolve to fit our needs. We must master these methods and their technical solutions in order to drive evolutions and requests, and to draw these research themes toward our problems.
Consequently, my work falls within this effort to appropriate approaches for managing and automating knowledge, which belongs to the field of knowledge engineering. We have therefore sought to develop knowledge-based environments supporting engineering processes.
This position (knowledge engineer) sits at the interface between the expert (the receptacle and source of the knowledge to be shared), the computer scientist (who will develop the knowledge-based virtual environment), and the user (who will use the tool and keep it alive).
Different points of view then confront one another, and it becomes just as important to master the technical aspects (existing methods and solutions for formalizing and modeling knowledge, developing the applications, validating the integrity and consistency of the elements integrated and to be integrated, maintaining the knowledge bases, etc.) as the project-management aspects, since it remains very difficult to evaluate the direct return on investment of such projects. A failure generally leads to the abandonment of this type of approach, which is long, heavy, and often opaque to the expert or the user.
This first part of my research activities thus contributed to enriching aspects of modeling (of enterprise objects: information, knowledge, etc.), of structuring these models (modeling methods), and of integrating them into the development cycle of software applications.
An analysis of mechanical engineering processes (in the broad sense) showed that we keep deepening and refining models and computations of the technical solutions that answer customer functions and needs. However, the criterion of cost or value remains very little evaluated or addressed. Yet in today's competitive global context, it is important to balance the technical and economic functions of products, which have become increasingly complex.
We therefore try to address performance through economic aspects along the product life cycle. This gives us an application domain for our work on integrating and making available expertise from the economic and management sciences within engineering processes.
This work initially addressed the fit between economic evaluation tools and methods and the product life cycle. Problems of partial information or of model transitions then arise at each pivotal stage.
Moreover, our reflections and exchanges with researchers in the management sciences led us to take into account the concept of value, which encompasses cost. It thus becomes possible to evaluate alternative value chains and to envisage the joint design of products and value-added processes under control. The decision-support environment thus completes decision contexts by integrating the various stakeholders' points of view and their visions of the value to be taken into account.
All of this work and reflection led me to coin the neologism "Industrialization of Knowledge", in the sense of the reliable, evolving, and objectified provision of enterprise experts' knowledge. This work has been fed by four defended theses (one in progress), seven DEA and Master's theses, involvement in one FP5 Thematic Network and two FP6 Networks of Excellence, and an ongoing collaboration since 2003 with a South African research team.
Books on the topic "Ontologie génique":
Calero, Coral, Francisco Ruiz, and Mario Piattini, eds. Ontologies for Software Engineering and Software Technology. Springer, 2006.