Theses on the topic "Fuzzy formal concept analysis"
Create an accurate reference in APA, MLA, Chicago, Harvard, and several other styles
Consult the 50 best theses for your research on the topic "Fuzzy formal concept analysis".
Browse theses on a variety of disciplines and organize your bibliography correctly.
De Maio, Carmen. « Fuzzy concept analysis for semantic knowledge extraction ». Doctoral thesis, Universita degli studi di Salerno, 2012. http://hdl.handle.net/10556/1307.
Availability of controlled vocabularies, ontologies, and so on is an enabling feature for providing added value in terms of knowledge management. Nevertheless, the design, maintenance and construction of domain ontologies are human-intensive and time-consuming tasks. Knowledge Extraction consists of automatic techniques aimed at identifying and defining relevant concepts and relations of the domain of interest by analyzing structured (relational databases, XML) and unstructured (text, documents, images) sources. Specifically, the knowledge-extraction methodology defined in this research work aims at enabling automatic ontology/taxonomy construction from existing resources in order to obtain useful information. For instance, the experimental results take into account data produced with Web 2.0 tools (e.g., RSS feeds, enterprise wikis, corporate blogs), text documents, and so on. The final results of the Knowledge Extraction methodology are taxonomies or ontologies represented in a machine-oriented manner by means of Semantic Web technologies such as RDFS, OWL and SKOS. The resulting knowledge models have been applied to different goals. On the one hand, the methodology has been applied to extract ontologies and taxonomies and to semantically annotate text. On the other hand, the resulting ontologies and taxonomies are exploited to enhance information-retrieval performance, to categorize incoming data, and to provide an easy way to find interesting resources (such as faceted browsing). Specifically, the following objectives have been addressed in this research work: Ontology/Taxonomy Extraction, concerning the automatic extraction of hierarchical conceptualizations (i.e., taxonomies) and of relations expressed by means of typical description-logic constructs (i.e., ontologies); Information Retrieval, the definition of a technique to perform concept-based retrieval of information according to user queries; Faceted Browsing, to automatically provide faceted-browsing capabilities according to the categorization of the extracted contents; and Semantic Annotation, the definition of a text-analysis process aimed at automatically annotating the subjects and predicates identified. Experimental results have been obtained in several application domains: e-learning, enterprise human-resource management, and clinical decision support systems. Future challenges go in the following direction: investigating approaches to support ontology alignment and merging applied to knowledge management.
Konecny, Jan. « Isotone fuzzy Galois connections and their applications in formal concept analysis ». Diss., Online access via UMI:, 2009.
Includes bibliographical references.
Glodeanu, Cynthia Vera. « Conceptual Factors and Fuzzy Data ». Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2013. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-103775.
Complexity reduction is one of the most important procedures in data analysis. With ever-growing data sets, this holds today more than ever. In many fields one also encounters vague and uncertain data. Whenever one has an instrument for data analysis, two questions therefore arise naturally: How can one reduce the number of variables within the analysis, and how can one handle fuzzy data? In this thesis we attempt to answer these two questions for Formal Concept Analysis. More precisely, we develop various methods for the complexity reduction of qualitative data and devise several procedures for the treatment of fuzzy data sets. Based on these two topics, the thesis is divided into two parts. The first part focuses on complexity reduction, while the second part is devoted to the processing of fuzzy data. The individual chapters are connected by the two themes; in particular, methods for the complexity reduction of fuzzy data sets are also developed.
Ayouni, Sarra. « Etude et Extraction de règles graduelles floues : définition d'algorithmes efficaces ». Thesis, Montpellier 2, 2012. http://www.theses.fr/2012MON20015/document.
Knowledge discovery in databases is a process aiming at extracting a reduced set of valuable knowledge from a huge amount of data. Data mining, one step of this process, includes a number of tasks, such as clustering, classification, association rule mining, etc. The problem of mining association rules requires the step of frequent pattern extraction. We distinguish several categories of frequent patterns: classical patterns, fuzzy patterns, gradual patterns, sequential patterns, etc. All these patterns differ in the type of data from which the extraction is done and the type of relationship that they represent. In this thesis, we contribute in particular the proposal of fuzzy and gradual pattern extraction methods. Indeed, we define new closure systems of the Galois connection for, respectively, fuzzy and gradual patterns. Thus, we propose algorithms for extracting a reduced set of fuzzy and gradual patterns. We also propose two approaches for automatically defining fuzzy modalities that allow obtaining relevant fuzzy gradual patterns. Based on closed fuzzy and closed gradual patterns, we define generic bases of fuzzy and gradual association rules. We thus propose a complete and valid inference system to derive all redundant fuzzy and gradual association rules.
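The generic bases described in this abstract are built from *closed* patterns. As a hedged illustration of the underlying idea (the crisp case only, with an invented toy transaction database, not the thesis's own algorithm): the closure of an itemset is the intersection of all transactions containing it, and a pattern is closed when it equals its closure.

```python
from itertools import combinations

# Toy transaction database (invented for illustration).
TRANSACTIONS = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"}]
ITEMS = sorted(set().union(*TRANSACTIONS))

def closure(itemset):
    """Intersection of all transactions containing `itemset`."""
    covering = [t for t in TRANSACTIONS if itemset <= t]
    if not covering:          # convention: closure of an uncovered set is all items
        return set(ITEMS)
    result = set(covering[0])
    for t in covering[1:]:
        result &= t
    return result

# Enumerate all closed itemsets by brute force (feasible only for tiny examples).
closed = {frozenset(c) for r in range(len(ITEMS) + 1)
          for c in combinations(ITEMS, r)
          if closure(set(c)) == set(c)}
```

Mining only the closed patterns (here `{a}`, `{a,b}`, `{a,c}`, `{a,b,c}`) yields a reduced, lossless representation from which all frequent patterns and rule bases can be derived.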
Novi, Daniele. « Knowledge management and Discovery for advanced Enterprise Knowledge Engineering ». Doctoral thesis, Universita degli studi di Salerno, 2014. http://hdl.handle.net/10556/1466.
The research work mainly addresses issues related to the adoption of models, methodologies and knowledge-management tools that make pervasive use of the latest Semantic Web technologies for the improvement of business processes and Enterprise 2.0 applications. The first phase of the research focused on the study and analysis of the state of the art and the problems of Knowledge Discovery in Databases, paying particular attention to data mining systems. The most innovative approaches investigated for "Enterprise Knowledge Engineering" are listed below. In detail, the problems analyzed relate to architectural aspects and the integration of (legacy) systems. The intended research contribution consists in the identification and definition of a uniform and general "Knowledge Enterprise Model", a model that is original with respect to the canonical approaches of enterprise architecture (for example, with respect to the Object Management Group (OMG) standards). The introduction of the tools and principles of Enterprise 2.0 into the company has been investigated and, simultaneously, appropriate Semantic Enterprise-based solutions have been defined for the problem of information fragmentation and for improving the processes of knowledge discovery and functional knowledge sharing. All studies and analyses are finalized and validated by defining a methodology and related supporting software tools for the improvement of processes related to the life cycles of best practices across the enterprise. Collaborative tools, knowledge modeling, algorithms, and knowledge discovery and extraction are applied synergistically to support these processes. [edited by author]
Dao, Ngoc Bich. « Réduction de dimension de sac de mots visuels grâce à l’analyse formelle de concepts ». Thesis, La Rochelle, 2017. http://www.theses.fr/2017LAROS010/document.
In several scientific fields such as statistics, computer vision and machine learning, reducing redundant and/or irrelevant information in the data description (dimension reduction) is an important step. This process comprises two different categories: feature extraction and feature selection, of which feature selection in unsupervised learning is hitherto an open question. In this manuscript, we discuss feature selection on image datasets using Formal Concept Analysis (FCA), with a focus on lattice structure and lattice theory. The images in a dataset are described as sets of visual words by the bag-of-visual-words model. Two algorithms are proposed in this thesis to select relevant features, and they can be used in both unsupervised and supervised learning. The first algorithm, RedAttsSansPerte, is based on lattice structure and lattice theory to ensure its ability to remove redundant features using the precedence graph. The formal definition of the precedence graph is given in this thesis. We also demonstrate its properties and the relationship between this graph and the AC-poset. Results from experiments indicate that the RedAttsSansPerte algorithm reduces the size of the feature set while maintaining performance, as evaluated by classification. Secondly, the RedAttsFloue algorithm, an extension of RedAttsSansPerte, is also proposed. This extension uses the fuzzy precedence graph, whose formal definition and properties are demonstrated in this manuscript. The RedAttsFloue algorithm removes redundant and irrelevant features while retaining relevant information according to the flexibility threshold of the fuzzy precedence graph. The quality of the retained information is evaluated by classification. The RedAttsFloue algorithm is suggested to be more robust than RedAttsSansPerte in terms of reduction.
Diner, Casri. « Visualizing Data With Formal Concept Analysis ». Master's thesis, METU, 2003. http://etd.lib.metu.edu.tr/upload/1046325/index.pdf.
As hard-disk capacities used for storing data and the amount of data reachable through the internet increase day by day, there is a need to turn this information into knowledge. This is one of the reasons for studying formal concept analysis. We wanted to point out how this application is related to algebra and logic. The beginning of the first chapter emphasizes the relation between closure systems, Galois connections, and lattice theory as a mathematical structure, on the one hand, and concept analysis on the other. Then it describes the basic step in the formalization: an elementary form of the representation of data is defined mathematically. The second chapter explains the logic of formal concept analysis. It also shows how implications between attributes, which can be regarded as special formulas on a set, can be expressed by fewer implications, a so-called generating set for implications. These mathematical tools are then used in the last chapter in order to describe complex 'concept' lattices by means of decomposition methods in examples.
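The Galois connection between object sets and attribute sets that this abstract refers to can be sketched concretely. A minimal illustration (the toy context below is invented, not taken from the thesis): the two derivation operators, and a brute-force enumeration of all formal concepts as the pairs fixed under them.

```python
from itertools import combinations

# Toy formal context (invented): objects, attributes, incidence relation I.
OBJECTS = {"duck", "dog", "carp"}
ATTRS = {"swims", "barks", "flies"}
I = {("duck", "swims"), ("duck", "flies"), ("dog", "barks"), ("carp", "swims")}

def common_attrs(objs):
    """Derivation A -> A': attributes shared by all objects in A."""
    return {a for a in ATTRS if all((o, a) in I for o in objs)}

def common_objs(attrs):
    """Derivation B -> B': objects possessing all attributes in B."""
    return {o for o in OBJECTS if all((o, a) in I for a in attrs)}

# A formal concept is a pair (A, B) with A' = B and B' = A.
concepts = []
for r in range(len(OBJECTS) + 1):
    for combo in combinations(sorted(OBJECTS), r):
        A = set(combo)
        B = common_attrs(A)
        if common_objs(B) == A:
            concepts.append((A, B))
```

Ordering these concepts by inclusion of their extents yields the concept lattice that the decomposition methods in the thesis operate on.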
Krajča, Petr. « Advanced algorithms for formal concept analysis ». Diss., Online access via UMI:, 2009.
Includes bibliographical references.
Petersen, Wiebke, and Petja Heinrich. « Qualitative Citation Analysis Based on Formal Concept Analysis ». Universitätsbibliothek Chemnitz, 2008. http://nbn-resolving.de/urn:nbn:de:bsz:ch1-200801464.
Sertkaya, Baris. « Formal Concept Analysis Methods for Description Logics ». Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2008. http://nbn-resolving.de/urn:nbn:de:bsz:14-ds-1215598189927-85390.
Lungley, Deirdre. « Adaptive information retrieval employing formal concept analysis ». Thesis, University of Essex, 2012. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.573750.
Sertkaya, Baris. « Formal Concept Analysis Methods for Description Logics ». Doctoral thesis, Technische Universität Dresden, 2007. https://tud.qucosa.de/id/qucosa%3A23613.
Sertkaya, Barış. « Formal concept analysis methods for description logics ». [S.l. : s.n.], 2008. http://nbn-resolving.de/urn:nbn:de:bsz:14-ds-1215598189927-85390.
Kim, Bong-Seop. « Advanced web search based on formal concept analysis ». Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2001. http://www.collectionscanada.ca/obj/s4/f2/dsk3/ftp04/MQ62230.pdf.
Tilley, Thomas. « Formal concept analysis applications to requirements engineering and design ». [St. Lucia, Qld.], 2003. http://adt.library.uq.edu.au/public/adt-QU20050223.204947/index.html.
Texte intégralGonzález, González Larry Javier. « Modelling dynamics of RDF graphs with formal concept analysis ». Tesis, Universidad de Chile, 2018. http://repositorio.uchile.cl/handle/2250/168144.
The Semantic Web is a web of data organized in such a way that it can be manipulated directly both by humans and by computers. RDF is the framework recommended by the W3C for representing information on the Semantic Web. RDF uses a graph-based data model that does not require any fixed schema, which makes RDF graphs easy to extend and integrate, but also difficult to query, understand, explore, summarize, etc. In this thesis, inspired by formal concept analysis (a subfield of applied mathematics based on the formalization of concepts and conceptual hierarchies, called lattices), we propose a data-driven schema for large, heterogeneous RDF graphs. The main idea is that if we can define a formal context from an RDF graph, then we can extract its formal concepts and compute a lattice from them, which yields our proposed hierarchical schema for RDF graphs. We then propose an algebra over such lattices that allows (1) computing deltas between two lattices (for example, to summarize the changes from one version of a graph to another), and (2) adding a delta to a lattice (for example, to project future changes). While this structure (and its associated algebra) may have several applications, we focus on the use case of modelling and predicting the dynamic behaviour of RDF graphs. We evaluate our methods by analysing how Wikidata changed over 11 weeks. First, we extract the sets of properties associated with individual entities in a scalable manner using the MapReduce framework. These property sets (also known as characteristic sets) are annotated with their associated entities and, subsequently, with their cardinality. Second, we propose an algorithm to construct the lattice over the characteristic sets based on the subset relation. We evaluate the efficiency and scalability of both procedures. Finally, we use the algebraic methods to predict how the hierarchical schema of Wikidata would evolve. We contrast our results with a linear regression model as a baseline. Our proposal outperforms the linear model by a large margin, achieving a root mean square error 12 times smaller than the baseline. We conclude that, based on formal concept analysis, we can define and generate a hierarchical schema from an RDF graph, and that we can use such schemas to predict, at a high level, how these RDF graphs will evolve over time.
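The characteristic-set extraction this abstract describes can be sketched in a few lines, with plain Python standing in for the MapReduce job (the triples below are invented, not Wikidata data): group each entity's properties, count identical property sets, and order them by subset inclusion.

```python
from collections import Counter

# Toy RDF triples (subject, property, object) -- invented for illustration.
triples = [
    ("q1", "name", "x"), ("q1", "birth", "y"),
    ("q2", "name", "x"),
    ("q3", "name", "x"), ("q3", "birth", "y"),
]

# Group each entity's properties.
props_by_entity = {}
for s, p, _o in triples:
    props_by_entity.setdefault(s, set()).add(p)

# Characteristic sets, annotated with their cardinality (number of entities).
char_sets = Counter(frozenset(ps) for ps in props_by_entity.values())

# Strict subset inclusion between characteristic sets gives the hierarchy edges.
edges = [(a, b) for a in char_sets for b in char_sets if a < b]
```

Here `{name}` is used by one entity, `{name, birth}` by two, and the single subset edge `{name} ⊂ {name, birth}` is the hierarchical schema for this tiny graph.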
Almuhisen, Feda. « Leveraging formal concept analysis and pattern mining for moving object trajectory analysis ». Thesis, Aix-Marseille, 2018. http://www.theses.fr/2018AIXM0738/document.
This dissertation presents a trajectory analysis framework, which includes both a preprocessing phase and a trajectory mining process. Furthermore, the framework offers visual functions that reflect the evolution behavior of trajectory patterns. The originality of the mining process is to leverage frequent emergent pattern mining and formal concept analysis for moving-object trajectories. These methods detect and characterize pattern evolution behaviors bound to time in trajectory data. Three contributions are proposed: (1) a method for analyzing trajectories based on frequent formal concepts is used to detect different trajectory pattern evolutions over time. These behaviors are "latent", "emerging", "decreasing", "lost" and "jumping". They characterize the dynamics of mobility related to urban spaces and time. The detected behaviors are automatically visualized on generated maps with different spatio-temporal levels to refine the analysis of mobility in a given area of the city; (2) a second trajectory analysis framework, based on sequential concept lattice extraction, is also proposed to exploit the movement direction in the evolution detection process; and (3) a prediction method based on Markov chains is presented to predict the evolution behavior of a region in a future period. These three methods are evaluated on two real-world datasets. The experimental results obtained from these data show the relevance of the proposal and the utility of the generated maps.
Kriegel, Francesco. « Visualization of Conceptual Data with Methods of Formal Concept Analysis ». Master's thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2013. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-125309.
Design and proof of an algorithm for computing incremental changes in a labelled concept-lattice diagram when an attribute column is inserted into or removed from the underlying formal context. Some details on the implementation and on the mathematical background knowledge are also presented.
Kiraly, Bret D. « An Experimental Application of Formal Concept Analysis to Research Communities ». Case Western Reserve University School of Graduate Studies / OhioLINK, 2008. http://rave.ohiolink.edu/etdc/view?acc_num=case1228497076.
Kanade, Parag M. « Fuzzy ants as a clustering concept ». [Tampa, Fla.] : University of South Florida, 2004. http://purl.fcla.edu/fcla/etd/SFE0000397.
Hanika, Tom [Verfasser]. « Discovering Knowledge in Bipartite Graphs with Formal Concept Analysis / Tom Hanika ». Kassel : Universitätsbibliothek Kassel, 2019. http://d-nb.info/1180660811/34.
Watmough, Martin John. « Discovering the hidden knowledge in transaction data through formal concept analysis ». Thesis, Sheffield Hallam University, 2013. http://shura.shu.ac.uk/7706/.
Arévalo, Gabriela Beatriz. « High-level views in object-oriented systems using formal concept analysis ». [S.l.] : [s.n.], 2004. http://www.zb.unibe.ch/download/eldiss/04arevalo_g.pdf.
Smith, David T. « A Formal Concept Analysis Approach to Association Rule Mining : The QuICL Algorithms ». NSUWorks, 2009. http://nsuworks.nova.edu/gscis_etd/309.
Rudolph, Sebastian. « Relational Exploration : Combining Description Logics and Formal Concept Analysis for Knowledge Specification ». Doctoral thesis, Technische Universität Dresden, 2006. https://tud.qucosa.de/id/qucosa%3A25002.
Ducrou, Jon. « Design for conceptual knowledge processing : case studies in applied formal concept analysis ». Access electronically, 2007. http://www.library.uow.edu.au/adt-NWU/public/adt-NWU20080919.093612/index.html.
Horner, Vincent Zion. « Developing a consumer health informatics decision support system using formal concept analysis ». Diss., Pretoria : [s.n.], 2007. http://upetd.up.ac.za/thesis/available/etd-05052008-112403/.
Everts, TJ. « Using Formal Concept Analysis with a Push-based Web Document Management System ». Honours thesis, University of Tasmania, 2004. https://eprints.utas.edu.au/116/1/EvertsT_Hons_Thesis2004.pdf.
Abid, Ahmed. « Improvement of web service composition using semantic similarities and formal concept analysis ». Thesis, Tours, 2017. http://www.theses.fr/2017TOUR4007.
Service Oriented Architectures (SOA) have progressively established themselves as an essential tool in inter-company exchanges thanks to their strategic and technological potential, and they are implemented through Web services. One of the main assets of services is their composability. With the emergence of the Semantic Web, the discovery and composition of semantic Web services has become a real challenge. The discovery process is generally based on traditional registries with syntactic descriptions in which services are statically grouped. This poses problems related to the heterogeneity of syntactic descriptions and the rigidity of the classification. The composition process depends on the quality of the Web-service matching achieved in the discovery phase. In this dissertation we propose the architecture of a framework that covers all the phases of the composition process. We then propose a semantic similarity measure for Web services. The Web-service discovery process relies on the proposed similarity measure, the Formal Concept Analysis (FCA) formalism, and the organisation of services in a lattice. The composition is then based on the establishment of coherent and relevant composite services for the expected functionality. The main strengths of this architecture are the adaptation and integration of semantic technologies, the calculation of semantic similarity, and the use of this similarity together with the FCA formalism to optimise the composition process.
Berthold, Stefan. « Linkability of communication contents : Keeping track of disclosed data using Formal Concept Analysis ». Thesis, Karlstad University, Faculty of Economic Sciences, Communication and IT, 2006. http://urn.kb.se/resolve?urn=urn:nbn:se:kau:diva-369.
A person who is communicated about (the data subject) has to keep track of all of his revealed data in order to protect his right of informational self-determination. This is important when data is going to be processed in an automatic manner and, in particular, in the case of automatic inquiries. A data subject should therefore be enabled to reach useful decisions with respect to data disclosure using only data which is available to him.
For the scope of this thesis, we assume that a data subject is able to protect his communication contents and the corresponding communication context against a third party by using end-to-end encryption and Mix cascades. The objective is to develop a model for analyzing the linkability of communication contents by using Formal Concept Analysis. In contrast to previous work, only the knowledge of a data subject is used for this analysis instead of a global view on the entire communication contents and context.
As a first step, the relation between disclosed data is explored. It is shown how data can be grouped by type and how data implications can be represented. As a second step, the behavior, i.e. the actions and reactions, of the data subject and his communication partners is included in this analysis in order to find critical data sets which can be used to identify the data subject.
Typical examples are used to verify this analysis, followed by a conclusion about the pros and cons of this method for anonymity and linkability measurement. The results can later be used to develop a similarity measure for human-computer interfaces.
Distel, Felix. « Learning Description Logic Knowledge Bases from Data Using Methods from Formal Concept Analysis ». Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2011. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-70199.
Ma, Junheng. « Contributions to Numerical Formal Concept Analysis, Bayesian Predictive Inference and Sample Size Determination ». Case Western Reserve University School of Graduate Studies / OhioLINK, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=case1285341426.
Sinha, Aditya. « Formal Concept Analysis for Search and Traversal in Multiple Databases with Effective Revision ». University of Cincinnati / OhioLINK, 2009. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1245088303.
Shen, Gongqin. « Formal Concepts and Applications ». Case Western Reserve University School of Graduate Studies / OhioLINK, 2005. http://rave.ohiolink.edu/etdc/view?acc_num=case1121454398.
Meschke, Christian. « Concept Approximations ». Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2012. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-86642.
In this thesis we describe a lattice-theoretic approach to the topic of approximation. Starting from a kernel system and a closure system on a complete lattice, one obtains a lattice of approximations. We describe the theory of these approximation lattices, with a main focus on the case of underlying concept lattices: as it turns out, approximations of formal concepts can be interpreted as the traces they leave in a given subcontext.
Ozdemir, Ali Yucel. « An Inquiry Into The Concept Of ». Master's thesis, METU, 2012. http://etd.lib.metu.edu.tr/upload/12615034/index.pdf.
Texte intégralsurface&rdquo
in the works of Peter Eisenman. In doing so, the concept of &ldquo
surface&rdquo
is discussed under three titles: &ldquo
Surface&rdquo
as an element of architectural vocabulary (as a formal element), as an analytical tool (as a grammar), and as a diagrammatic tool. Correspondingly, the thesis is intended to examine how &ldquo
surface&rdquo
is conceptualized and handled through the critical readings of Eisenman&rsquo
s writings, and projects are referred in order to support and visualize the discussions. In this context, Eisenman&rsquo
s dissertation, The Formal Basis of Modern Architecture (1963), reveals the definition of architectural surface in relation to the architectural language that is proposed by him. Through the formal analysis of Giuseppe Terragni&rsquo
s building, Casa Guiliani Frigerio, he utilizes surface as an analytical tool. Considering design processes of his projects, as discussed in the book Diagram Diaries (1999), surface becomes a dominant tool for generating architectural form. As a result, in this thesis, surface is evaluated in various aspects (as a formal, analytical and diagrammatic tool) that are essential for understanding of architectural form. In the case of Eisenman, its significance dominates the way of developing his architecture.
Rudolph, Sebastian [Verfasser]. « Relational exploration : combining description logics and formal concept analysis for knowledge specification / von Sebastian Rudolph ». Karlsruhe : Univ.-Verl. Karlsruhe, 2007. http://d-nb.info/983756430/34.
Joseph, Daniel. « Linking information resources with automatic semantic extraction ». Thesis, University of Manchester, 2016. https://www.research.manchester.ac.uk/portal/en/theses/linking-information-resources-with-automatic-semantic-extraction(ada2db36-4366-441a-a0a9-d76324a77e2c).html.
De Alburquerque Melo, Cassio. « Real-time Distributed Computation of Formal Concepts and Analytics ». Phd thesis, Ecole Centrale Paris, 2013. http://tel.archives-ouvertes.fr/tel-00966184.
Cellier, Peggy, Felix Distel and Bernhard Ganter. « Contributions to the 11th International Conference on Formal Concept Analysis : Dresden, Germany, May 21–24, 2013 ». Technische Universität Dresden, 2013. https://tud.qucosa.de/id/qucosa%3A26885.
Kandasamy, Meenakshi. « Approaches to Creating Fuzzy Concept Lattices and an Application to Bioinformatics Annotations ». Miami University / OhioLINK, 2010. http://rave.ohiolink.edu/etdc/view?acc_num=miami1293821656.
Distel, Felix, and Daniel Borchmann. « Expected Numbers of Proper Premises and Concept Intents ». Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2011. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-71153.
Potter, Dustin Paul. « A combinatorial approach to scientific exploration of gene expression data : An integrative method using Formal Concept Analysis for the comparative analysis of microarray data ». Diss., Virginia Tech, 2005. http://hdl.handle.net/10919/28792.
Ph.D.
Peng, Jin-De (彭晉德). « Apply Document Processing Techniques to Improve Fuzzy Formal Concept Analysis Concept Quality ». Thesis, 2013. http://ndltd.ncl.edu.tw/handle/13267447827823741877.
National Yunlin University of Science and Technology
Master's Program, Department of Information Management
Academic year 101 (ROC calendar)
Traditional Formal Concept Analysis (FCA) has been criticized for its inability to deal with uncertain information; furthermore, its management and search performance deteriorates when coping with a large number of documents, particularly in wider domains. To address this drawback, this study incorporated fuzzy theory into FCA and used an event-detection clustering technique on an experimental dataset drawn from Yahoo! News. We extracted news features through syntactic rules and assigned normalized TF-IDF weights as membership grades. Event-detection clustering was then carried out to decrease the complexity of the document set, enhance search quality, and shorten processing time. Compared with traditional FCA, the results showed that our proposed method, which assesses the quality of the concept lattice through a fuzzy rate, achieved a higher fuzzy rate, and the quality also increased as the α-cut was raised. Furthermore, an appropriate α-cut for building a more precise concept lattice can be found through user satisfaction. Experimental results showed that users judged that the concepts expressed the news contents best when the α-cut was 0.06.
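The α-cut step this abstract relies on can be sketched as follows (a hedged illustration: the document names, terms and membership grades below are invented stand-ins for the normalized TF-IDF weights, not data from the study). Thresholding the fuzzy context at α yields the crisp incidence relation from which a concept lattice is built.

```python
# Fuzzy formal context: (document, term) -> membership grade in [0, 1].
fuzzy_context = {
    ("news1", "security"): 0.40,
    ("news1", "malware"): 0.03,
    ("news2", "security"): 0.08,
    ("news2", "phishing"): 0.05,
}

def alpha_cut(ctx, alpha):
    """Crisp incidence relation: keep the pairs whose grade reaches alpha."""
    return {pair for pair, grade in ctx.items() if grade >= alpha}

# With the alpha = 0.06 threshold reported in the study, weak pairs drop out.
crisp = alpha_cut(fuzzy_context, 0.06)
```

Raising α prunes more low-grade incidences, which is why the study could tune α against user satisfaction to trade lattice size for precision.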
Lin, Shin-Yang (林欣洋). « Knowledge Exploration in Drug Interaction using Fuzzy Formal Concept Analysis ». Thesis, 2007. http://ndltd.ncl.edu.tw/handle/43035149416098183090.
National Cheng Kung University
Institute of Information Management
Academic year 95 (ROC calendar)
Improvements in pharmaceutics, accompanied by medicinal and technological advances, have greatly expanded the diversity of pharmaceuticals, turning the world of pharmacology into a complex web of drugs, their interactions and, most important of all, their effects on patients. The increasing number of existing pharmaceuticals inevitably complicates the interactions between them, revealing the importance of a thorough understanding of those interactions in order to prevent possible pathogenic symptoms; this is emphasized by the seriousness of drug misuse and its consequences. This thesis applies Fuzzy Formal Concept Analysis to medicinal data in order to uncover the connections between formal concepts and the strengths of these relationships. The tacit knowledge extracted helps experts gain better insight into pharmaceuticals and possible drug interactions, consequently improving their use in terms of effectiveness and reducing mistreatment.
Cheng, HsuFeng (鄭旭峰). « The Study on "Information Security" News Retrieval by using Fuzzy Formal Concept Analysis ». Thesis, 2013. http://ndltd.ncl.edu.tw/handle/73187591349099605567.
National Defense University, College of Management
Department of Information Management
Academic year 101 (ROC calendar)
In today's ever-expanding universe of knowledge, the digitization of documents is a clear trend, and online information forms the main body of knowledge resources. Finding ways to archive, categorize, and search this massive volume of data efficiently is therefore a critical research issue. Many websites now offer keyword-based indexes, but keyword navigation is not efficient for exploring a specific topic; and when a subject-specific search engine relies on manually built categories, handling such large amounts of data is time-consuming and laborious. The main purpose of this study is therefore to develop an automatic classification mechanism for search applications, aiming to improve effectiveness in knowledge representation and discovery. The study has three objectives. First, to acquire domain-specific terms as a foundation and apply Fuzzy Formal Concept Analysis (FFCA) to automatically construct the ontology and concept relationships for the information-security domain. Second, to use information-security news reports collected from the web as the main training material, with query expansion as the main navigation method. Third, to construct a query model for information-security news, so that the system can recommend related content, expand search results, and increase retrieval efficiency. The study concludes that automatic ontology construction shortens set-up time compared with a manual approach, and that constructing the domain ontology with FFCA outperforms classical Formal Concept Analysis.
Experimental results on the information-security news retrieval system show that it efficiently expands relevant content for each user.
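The query-expansion step the abstract relies on can be sketched as follows: a query term is expanded with the other attributes of the most specific formal concept containing it. The concepts and terms here are hand-written for illustration, not taken from the thesis.

```python
# Hedged sketch of concept-lattice-based query expansion.
# Each concept is (extent: document set, intent: term set).
concepts = [
    ({"d1", "d2", "d3"}, {"security"}),
    ({"d1", "d2"}, {"security", "phishing", "email"}),
    ({"d3"}, {"security", "malware"}),
]

def expand_query(term, concepts):
    """Return the term plus the co-occurring attributes of the most
    specific concept (smallest extent) whose intent contains it."""
    matching = [c for c in concepts if term in c[1]]
    if not matching:
        return {term}
    extent, intent = min(matching, key=lambda c: len(c[0]))
    return set(intent)

print(expand_query("phishing", concepts))  # adds "security" and "email"
```

Picking the smallest extent favors the most specific concept, so the expansion stays close to the user's original intent rather than drifting to the broad "security" concept.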
Glodeanu, Cynthia Vera. « Conceptual Factors and Fuzzy Data ». Doctoral thesis, 2012. https://tud.qucosa.de/id/qucosa%3A26470.
Texte intégral
Complexity reduction is one of the most important procedures in data analysis. With ever-growing data sets, this is truer today than ever. In many fields one also encounters vague and uncertain data. Whenever one has an instrument for data analysis, two questions therefore arise naturally: How can the number of variables be reduced within the analysis, and how can fuzzy data be handled? In this thesis we attempt to answer these questions for Formal Concept Analysis. More precisely, we develop various methods for the complexity reduction of qualitative data and devise several procedures for handling fuzzy data sets. Based on these two topics, the thesis is divided into two parts. The first part focuses on complexity reduction, while the second is devoted to the processing of fuzzy data. The individual chapters are linked by the two themes; in particular, methods for the complexity reduction of fuzzy data sets are also developed.
Lin, Hun-Ching, et 林紘靖. « Automatic Document Classification Using Fuzzy Formal Concept Analysis ». Thesis, 2009. http://ndltd.ncl.edu.tw/handle/12015487957874690408.
Texte intégral
National Cheng Kung University
Institute of Information Management
97
As computers became popular, the internet developed, and the age of knowledge arrived, the number of digital documents has grown ever faster. Search engines on the internet typically return a huge number of results, and it has become increasingly difficult to find a specific document in large databases. People have therefore sought ways to retrieve the documents they need, and automatic document categorization has become an important issue in managing document data. In recent years, more and more research has applied formal concept analysis (FCA) to information retrieval. However, classical formal concept analysis cannot represent the fuzzy information involved in document categorization (Tho et al., 2006), so some research combines fuzzy theory with FCA into fuzzy FCA (Burusco and Fuentes-González, 1994), and FCA research has continued to grow. This research analyzes documents with information-retrieval techniques to find the most important keywords of a given dataset, assigns fuzzy membership degrees, and then categorizes the documents with fuzzy FCA. Categorization is computed from the concept lattice produced by the FCA process, providing an application of the concept lattice beyond presenting domain knowledge; we hope this will benefit research on document categorization using FCA. The results show that categorization using the concept lattice combined with fuzzy logic is precise, and that the result is stable across all categories.
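The membership-assignment step the abstract mentions — deriving fuzzy degrees from keywords — can be sketched with normalized TF-IDF; the corpus and the max-normalization scheme are illustrative assumptions, not details from the thesis.

```python
# Hedged sketch: per-document TF-IDF scores, max-normalized into [0, 1]
# to serve as fuzzy membership grades for a fuzzy formal context.
import math

docs = {
    "d1": "fuzzy lattice fuzzy concept",
    "d2": "concept lattice analysis",
    "d3": "drug interaction analysis",
}

def tf_idf_memberships(docs):
    """Compute TF-IDF per document, then divide by the document's
    maximum score so grades fall in [0, 1]."""
    n = len(docs)
    tokenized = {d: text.split() for d, text in docs.items()}
    df = {}
    for words in tokenized.values():
        for w in set(words):
            df[w] = df.get(w, 0) + 1
    grades = {}
    for d, words in tokenized.items():
        scores = {w: (words.count(w) / len(words)) * math.log(n / df[w])
                  for w in set(words)}
        top = max(scores.values()) or 1.0  # guard: all-zero scores
        grades[d] = {w: round(s / top, 3) for w, s in scores.items()}
    return grades

for d, g in tf_idf_memberships(docs).items():
    print(d, g)
```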
Lin, Yu-Ting, et 林于婷. « Document Overlapping Clustering Using Formal Concept Analysis ». Thesis, 2015. http://ndltd.ncl.edu.tw/handle/15119561186477320841.
Texte intégral
National Chung Hsing University
Department of Information Management
103
In recent years, information and data have been growing and spreading fast, and many studies try to find useful patterns or knowledge in this growing data. Text document clustering is a data-mining technique that addresses this problem by grouping documents into clusters based on the similarities among them. Most traditional clustering algorithms build disjoint clusters, but clusters should be allowed to overlap, because in the real world a document often belongs to two or more categories. For example, an article discussing the Apple Watch may be categorized under 3C, Fashion, or even Clothing and Shoes, and could then be seen by more internet users. In this paper, we propose an overlapping clustering algorithm based on Formal Concept Analysis, which allows an article to belong to two or more clusters. Owing to the hierarchical structure of the formal concept lattice, an article can belong to more than one formal concept; by extracting suitable formal concepts and transforming them into conceptual vectors, an overlapping clustering result is obtained. Moreover, because our algorithm reduces the dimensionality of the vector space, it performs more efficiently than traditional clustering approaches based on the vector space model.
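The overlapping assignment the abstract describes can be sketched as follows: a document joins every cluster whose defining intent (term set) its attributes cover. The context and cluster intents are hand-written illustrations, not data from the thesis.

```python
# Hedged sketch of concept-based overlapping clustering: clusters are
# defined by term intents, and a document may satisfy several of them.
crisp_context = {
    "apple_watch_article": {"3C", "fashion"},
    "smartphone_review": {"3C"},
    "runway_report": {"fashion", "clothing"},
}

cluster_intents = {
    "tech": {"3C"},
    "style": {"fashion"},
}

def overlapping_clusters(context, intents):
    """Map each cluster to every document whose attribute set
    covers the cluster's intent; documents may repeat across clusters."""
    return {name: {doc for doc, attrs in context.items() if intent <= attrs}
            for name, intent in intents.items()}

clusters = overlapping_clusters(crisp_context, cluster_intents)
# "apple_watch_article" lands in both "tech" and "style"
```

Because assignment is a subset test against intents rather than a distance in a high-dimensional term space, the approach sidesteps the vector-space dimensionality that slows traditional clustering.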
黃大偉. « Event Tree Analysis Using Fuzzy Concept ». Thesis, 1997. http://ndltd.ncl.edu.tw/handle/11517306333510215915.
Texte intégral
National Tsing Hua University
Institute of Industrial Engineering
85
Event tree analysis (ETA) is a straightforward and simple approach to risk assessment. It can be used to identify the various accident sequences and their causes, and to give the analyst a clear picture of which top event dominates the safety of the system. Traditional ETA uses a single probability to represent each top event. However, it is unreasonable to evaluate the occurrence of an event with a crisp value that ignores the inherent uncertainty and imprecision of a state. Since fuzzy set theory provides a framework for dealing with this kind of phenomenon, it is adopted in this study. The main purpose of this study is to construct an easy methodology for evaluating human error and to integrate it into ETA using fuzzy concepts. In addition, a systematic fuzzy ETA (FETA) algorithm is developed to evaluate the risk of a large-scale system. A practical example of an ATWS event in a nuclear power plant demonstrates the procedure. The fuzzy outcomes are defuzzified using the total integral value, parameterized by the decision maker's degree of optimism. Finally, two indices provide further information about the importance and uncertainty of the top events.
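The core computation the abstract outlines — propagating fuzzy probabilities along an event-tree sequence and defuzzifying with the total integral value — can be sketched as follows. The component-wise product is the usual approximation for multiplying triangular fuzzy numbers, the total integral formula follows Liou and Wang's definition for triangular numbers, and the numeric values are purely illustrative.

```python
# Hedged sketch: fuzzy event tree sequence evaluation with triangular
# fuzzy probabilities (a, b, c) = (lower, modal, upper).

def fuzzy_mul(p, q):
    """Approximate product of two triangular fuzzy numbers,
    computed component-wise."""
    return tuple(x * y for x, y in zip(p, q))

def total_integral(tfn, lam=0.5):
    """Total integral value of a triangular fuzzy number (a, b, c);
    lam in [0, 1] is the decision maker's degree of optimism."""
    a, b, c = tfn
    return 0.5 * (lam * (b + c) + (1 - lam) * (a + b))

# Fuzzy probabilities of two successive top events along one sequence.
initiating = (0.008, 0.01, 0.012)
branch_fail = (0.04, 0.05, 0.07)

sequence = fuzzy_mul(initiating, branch_fail)
crisp = total_integral(sequence, lam=0.5)
print(sequence, crisp)
```

With lam = 0.5 the formula averages the left and right integrals (a neutral decision maker); lam near 1 weights the optimistic upper bound, lam near 0 the pessimistic lower bound.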