Scientific literature on the topic "Fuzzy formal concept analysis"

Create an accurate reference in APA, MLA, Chicago, Harvard, and several other styles

Choose a source:

Consult thematic lists of journal articles, books, theses, conference proceedings, and other academic sources on the topic "Fuzzy formal concept analysis".

Next to every source in the reference list there is an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic reference for the chosen source in your preferred citation style: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the scholarly publication as a PDF and read its abstract online whenever this information is included in the metadata.

Journal articles on the topic "Fuzzy formal concept analysis"

1

Supriyati, Endang. « FUZZY FORMAL CONCEPT ANALYSIS UNTUK KEMIRIPAN DOKUMEN ». Simetris : Jurnal Teknik Mesin, Elektro dan Ilmu Komputer 1, no 1 (29 juin 2013) : 21. http://dx.doi.org/10.24176/simet.v1i1.111.

Abstract:
Fuzzy logic can be incorporated into ontologies to represent the uncertainty of the information found in many application domains, caused by the lack of clear boundaries between domain concepts. A fuzzy ontology is generated from a predefined concept hierarchy; however, building a concept hierarchy for a particular domain can be a difficult and tedious task. To address this problem, Fuzzy Formal Concept Analysis (FFCA) is proposed. The starting point of the method proposed in this paper is the definition of the context and of a similarity relation on the domain ontology, which are then mapped into a concept lattice. Using a lattice navigator tool, the proposed method is able to cluster the domain ontology effectively. Keywords: ontology, Formal Concept Analysis, Fuzzy Formal Concept Analysis, concept lattice.
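
To make the mapping from a fuzzy context to a concept lattice concrete, here is a minimal Python sketch of the two fuzzy derivation operators under the Gödel (minimum) residuum, closing a single document into a fuzzy concept. The document/term context and all membership degrees are invented for illustration and are not taken from the cited paper.

```python
# Fuzzy formal context: I[obj][attr] = degree to which the object has the attribute.
# Data are made up for illustration.
I = {
    "doc1": {"fuzzy": 1.0, "lattice": 0.6, "ontology": 0.2},
    "doc2": {"fuzzy": 0.8, "lattice": 1.0, "ontology": 0.0},
    "doc3": {"fuzzy": 0.4, "lattice": 0.3, "ontology": 1.0},
}
objects = list(I)
attributes = ["fuzzy", "lattice", "ontology"]

def residuum(a, b):
    # Goedel residuum: a -> b equals 1 if a <= b, and b otherwise.
    return 1.0 if a <= b else b

def up(A):
    # Fuzzy set of attributes shared, to the computed degrees, by the objects in A.
    return {m: min(residuum(A[g], I[g][m]) for g in objects) for m in attributes}

def down(B):
    # Fuzzy set of objects having, to the computed degrees, the attributes in B.
    return {g: min(residuum(B[m], I[g][m]) for m in attributes) for g in objects}

# Close the crisp set {doc1} into a fuzzy concept (extent, intent).
A0 = {"doc1": 1.0, "doc2": 0.0, "doc3": 0.0}
intent = up(A0)
extent = down(intent)
print("intent:", intent)
print("extent:", extent)
```
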
2

FORMICA, ANNA. « CONCEPT SIMILARITY IN FUZZY FORMAL CONCEPT ANALYSIS FOR SEMANTIC WEB ». International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems 18, no 02 (avril 2010) : 153–67. http://dx.doi.org/10.1142/s0218488510006465.

Abstract:
This paper presents a method for evaluating concept similarity within Fuzzy Formal Concept Analysis. In the perspective of developing the Semantic Web, such a method can be helpful when the digital resources found on the Internet cannot be treated equally and the integration of fuzzy data becomes fundamental for the search and discovery of information in the Web.
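
As a toy illustration of comparing two fuzzy concepts through their intents, the sketch below scores overlap with a fuzzy Jaccard index. This is a generic measure chosen for illustration, not the similarity measure defined in the cited paper, and the attribute degrees are invented.

```python
def fuzzy_jaccard(intent1, intent2):
    # Jaccard index on fuzzy sets: sum of minima over sum of maxima.
    attrs = set(intent1) | set(intent2)
    inter = sum(min(intent1.get(m, 0.0), intent2.get(m, 0.0)) for m in attrs)
    union = sum(max(intent1.get(m, 0.0), intent2.get(m, 0.0)) for m in attrs)
    return inter / union if union else 1.0

# Two hypothetical fuzzy intents (attribute -> membership degree).
c1 = {"fuzzy": 1.0, "lattice": 0.6, "ontology": 0.2}
c2 = {"fuzzy": 0.8, "lattice": 1.0}
print(fuzzy_jaccard(c1, c2))  # 0.636...
```
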
3

Haddache, Mohamed, Allel Hadjali et Hamid Azzoune. « Skyline refinement exploiting fuzzy formal concept analysis ». International Journal of Intelligent Computing and Cybernetics 14, no 3 (29 avril 2021) : 333–62. http://dx.doi.org/10.1108/ijicc-11-2020-0181.

Abstract:
Purpose: The study of skyline queries has received considerable attention from database researchers since the end of the 2000s. Skyline queries are an appropriate tool that can help users make intelligent decisions in the presence of multidimensional data when different, and often contradictory, criteria are to be taken into account. Based on the concept of Pareto dominance, the skyline process extracts the most interesting (not dominated in the sense of Pareto) objects from a set of data. Skyline computation methods often lead to a set with a large size, which is less informative for end users and not easy to exploit. The purpose of this paper is to tackle this problem, known as the large-size skyline problem, and propose a solution to deal with it by applying an appropriate refining process.
Design/methodology/approach: The problem of skyline refinement is formalized in the fuzzy formal concept analysis setting. Then, an ideal fuzzy formal concept is computed in the sense of some particular defined criteria. By leveraging the elements of this ideal concept, one can reduce the size of the computed skyline.
Findings: An appropriate and rational solution is discussed for the problem of interest. Then, a tool, named SkyRef, is developed. Rich experiments are done using this tool on both synthetic and real datasets.
Research limitations/implications: The authors have conducted experiments on synthetic and some real datasets to show the effectiveness of the proposed approaches. However, thorough experiments on large-scale real datasets are highly desirable to show the behavior of the tool with respect to performance and execution-time criteria.
Practical implications: The developed tool, SkyRef, can have many application domains that require decision-making and personalized recommendation and where the size of the skyline has to be reduced. In particular, SkyRef can be used in several real-world applications such as economics, security, medicine and services.
Social implications: This work can be exploited in all domains that require decision-making, such as hotel finders, restaurant recommenders, recruitment of candidates, etc.
Originality/value: This study mixes two research fields, artificial intelligence (i.e. formal concept analysis) and databases (i.e. skyline queries). The key elements of the solution proposed for the skyline refinement problem are borrowed from fuzzy formal concept analysis, which makes it clearer and rational, semantically speaking. On the other hand, this study opens the door to using formal concept analysis and its extensions in solving other issues related to skyline queries, such as relaxation.
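
The Pareto-dominance step that produces the (possibly very large) skyline can be sketched in a few lines of Python; the hotel tuples below are hypothetical, and the fuzzy-FCA refinement implemented in SkyRef is not reproduced here.

```python
def dominates(a, b):
    # a dominates b if a is <= b on every criterion and strictly < on at least one.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def skyline(points):
    # Keep every point that no other point dominates.
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Hypothetical hotels described by (price, distance in km); lower is better on both.
hotels = [(50, 3.0), (60, 1.0), (55, 2.5), (80, 0.8), (90, 4.0)]
print(skyline(hotels))  # [(50, 3.0), (60, 1.0), (55, 2.5), (80, 0.8)]
```
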
4

Konecny, Jan, et Ondrej Krídlo. « On biconcepts in formal fuzzy concept analysis ». Information Sciences 375 (janvier 2017) : 16–29. http://dx.doi.org/10.1016/j.ins.2016.09.042.

5

Konecny, Jan, et Michal Krupka. « Block relations in formal fuzzy concept analysis ». International Journal of Approximate Reasoning 73 (juin 2016) : 27–55. http://dx.doi.org/10.1016/j.ijar.2016.02.004.

6

Shao, Ming-Wen, Min Liu et Wen-Xiu Zhang. « Set approximations in fuzzy formal concept analysis ». Fuzzy Sets and Systems 158, no 23 (décembre 2007) : 2627–40. http://dx.doi.org/10.1016/j.fss.2007.05.002.

7

Alwersh, Mohammed, et Kovács László. « Fuzzy formal concept analysis : approaches, applications and issues ». Computer Science and Information Technologies 3, no 2 (1 juillet 2022) : 126–36. http://dx.doi.org/10.11591/csit.v3i2.p126-136.

Abstract:
Formal concept analysis (FCA) is today regarded as a significant technique for knowledge extraction, representation, and analysis, with applications in a variety of fields. Significant progress has been made in recent years to extend FCA theory to deal with uncertain and imperfect data. The computational complexity associated with the enormous number of generated formal concepts has been identified as an issue in various applications; in general, coping with the complexity and size of the generated concept lattice is one of the most fundamental challenges in FCA. The goal of this work is to provide an overview of research articles that assess and compare the numerous fuzzy formal concept analysis techniques that have been suggested, and to explore the key techniques for reducing concept lattice size. We also present a review of research articles on using fuzzy formal concept analysis in ontology engineering, knowledge discovery in databases and data mining, and information retrieval.
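
As a rough illustration of why lattice size is an issue, the sketch below enumerates every formal concept of a tiny invented binary context by closing each subset of objects, a brute-force approach that is only feasible for toy data; the lattice-reduction techniques surveyed above exist precisely because this number grows quickly.

```python
from itertools import combinations

# Made-up binary context: object -> set of attributes it has.
objects = ["o1", "o2", "o3", "o4"]
incidence = {
    "o1": {"a", "b"},
    "o2": {"b", "c"},
    "o3": {"a", "c"},
    "o4": {"a", "b", "c"},
}
attributes = {"a", "b", "c"}

def intent(objs):
    # Attributes common to all objects in objs (all attributes for the empty set).
    sets = [incidence[g] for g in objs]
    return set.intersection(*sets) if sets else set(attributes)

def extent(attrs):
    # Objects having all attributes in attrs.
    return {g for g in objects if attrs <= incidence[g]}

concepts = set()
for r in range(len(objects) + 1):
    for objs in combinations(objects, r):
        b = intent(set(objs))
        a = extent(b)
        concepts.add((frozenset(a), frozenset(b)))

print(len(concepts), "formal concepts")  # prints: 8 formal concepts for this context
```
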
8

Poelmans, Jonas, Dmitry I. Ignatov, Sergei O. Kuznetsov et Guido Dedene. « Fuzzy and rough formal concept analysis : a survey ». International Journal of General Systems 43, no 2 (6 janvier 2014) : 105–34. http://dx.doi.org/10.1080/03081079.2013.862377.

9

Liu, Yan, Sheng Quan Liu et Peng Li. « Tourism Domain Ontology Construction Method Based on Fuzzy Formal Concept Analysis ». Applied Mechanics and Materials 347-350 (août 2013) : 2809–13. http://dx.doi.org/10.4028/www.scientific.net/amm.347-350.2809.

Abstract:
In this paper, fuzzy formal concept analysis is introduced into the construction of a tourism domain ontology: first, fuzzy formal concept analysis is applied to the uncertain information of the tourism domain; then a fuzzy concept hierarchy is generated through conceptual clustering; finally, the hierarchy is mapped to obtain a fuzzy ontology prototype. An example shows that the method is feasible and effective.
10

Wang, Ting Zhong, et Hong Sheng Xu. « Constructing Domain Ontology Based on Fuzzy Set and Concept Lattice ». Applied Mechanics and Materials 63-64 (juin 2011) : 715–18. http://dx.doi.org/10.4028/www.scientific.net/amm.63-64.715.

Abstract:
The major task in FCA is to extract formal concepts and the connections between them from data given in the form of a formal context, so as to form a lattice of formal concepts. Fuzzy set theory and fuzzy logic are acknowledged as an appropriate formalism for capturing imprecise and vague knowledge. The paper offers a methodology for building ontologies for knowledge sharing and reuse based on the union of fuzzy concept lattices. It applies formal concept analysis theory and fuzzy sets to construct the concept hierarchies of the ontology, and the experiments report CPU time as a function of the number of attributes, indicating that FFCA is superior to FCA for building semantic web ontologies.

Theses on the topic "Fuzzy formal concept analysis"

1

De, Maio Carmen. « Fuzzy concept analysis for semantic knowledge extraction ». Doctoral thesis, Universita degli studi di Salerno, 2012. http://hdl.handle.net/10556/1307.

Abstract:
Availability of controlled vocabularies, ontologies, and so on is an enabling feature to provide added value in terms of knowledge management. Nevertheless, the design, maintenance and construction of domain ontologies are a human-intensive and time-consuming task. Knowledge Extraction consists of automatic techniques aimed to identify and to define relevant concepts and relations of the domain of interest by analyzing structured (relational databases, XML) and unstructured (text, documents, images) sources. Specifically, the methodology for knowledge extraction defined in this research work is aimed at enabling automatic ontology/taxonomy construction from existing resources in order to obtain useful information. For instance, the experimental results take into account data produced with Web 2.0 tools (e.g., RSS feeds, enterprise wikis, corporate blogs, etc.), text documents, and so on. Final results of the Knowledge Extraction methodology are taxonomies or ontologies represented in a machine-oriented manner by means of semantic web technologies, such as RDFS, OWL and SKOS. The resulting knowledge models have been applied to different goals. On the one hand, the methodology has been applied in order to extract ontologies and taxonomies and to semantically annotate text. On the other hand, the resulting ontologies and taxonomies are exploited in order to enhance information retrieval performance, to categorize incoming data and to provide an easy way to find interesting resources (such as faceted browsing). Specifically, the following objectives have been addressed in this research work: Ontology/Taxonomy Extraction, which concerns the automatic extraction of hierarchical conceptualizations (i.e., taxonomies) and relations expressed by means of typical description logic constructs (i.e., ontologies); Information Retrieval, the definition of a technique to perform concept-based retrieval of information according to user queries; Faceted Browsing, in order to automatically provide faceted browsing capabilities according to the categorization of the extracted contents; and Semantic Annotation, the definition of a text analysis process aimed to automatically annotate the identified subjects and predicates. The experimental results have been obtained in several application domains: e-learning, enterprise human resource management, and clinical decision support systems. Future challenges go in the following direction: investigating approaches to support ontology alignment and merging applied to knowledge management.
2

Konecny, Jan. « Isotone fuzzy Galois connections and their applications in formal concept analysis ». Diss., Online access via UMI:, 2009.

Abstract:
Thesis (Ph. D.)--State University of New York at Binghamton, Thomas J. Watson School of Engineering and Applied Science, Department of Systems Science and Industrial Engineering, 2009. Includes bibliographical references.
3

Glodeanu, Cynthia Vera. « Conceptual Factors and Fuzzy Data ». Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2013. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-103775.

Abstract:
With the growing number of large data sets, the necessity of complexity reduction applies today more than ever before. Moreover, some data may also be vague or uncertain. Thus, whenever we have an instrument for data analysis, the questions of how to apply complexity reduction methods and how to treat fuzzy data arise rather naturally. In this thesis, we discuss these issues for the very successful data analysis tool Formal Concept Analysis. In fact, we propose different methods for complexity reduction based on qualitative analyses, and we elaborate on various methods for handling fuzzy data. These two topics split the thesis into two parts. Data reduction is mainly dealt with in the first part of the thesis, whereas we focus on fuzzy data in the second part. Although each chapter may be read almost on its own, each one builds on and uses results from its predecessors. The main crosslink between the chapters is given by the reduction methods and fuzzy data. In particular, we will also discuss complexity reduction methods for fuzzy data, combining the two issues that motivate this thesis.
4

Ayouni, Sarra. « Etude et Extraction de règles graduelles floues : définition d'algorithmes efficaces ». Thesis, Montpellier 2, 2012. http://www.theses.fr/2012MON20015/document.

Abstract:
Knowledge discovery in databases is a process aiming at extracting a reduced set of valuable knowledge from a huge amount of data. Data mining, one step of this process, includes a number of tasks, such as clustering, classification, association rule mining, etc. The problem of mining association rules requires the step of frequent pattern extraction. We distinguish several categories of frequent patterns: classical patterns, fuzzy patterns, gradual patterns, sequential patterns, etc. All these patterns differ in the type of data from which the extraction is done and in the type of correlation they represent. In this thesis, we contribute with the proposal of fuzzy and gradual pattern extraction methods. Indeed, we define new closure systems of the Galois connection for, respectively, fuzzy and gradual patterns. Thus, we propose algorithms for extracting a reduced set of fuzzy and gradual patterns. We also propose two approaches for automatically defining fuzzy modalities that allow obtaining relevant fuzzy gradual patterns. Based on closed fuzzy and closed gradual patterns, we define generic bases of fuzzy and gradual association rules. We then propose a complete and valid inference system to derive all rules from these bases.
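
As a rough illustration of what a gradual pattern such as "the higher X, the higher Y" expresses, the sketch below counts the fraction of object pairs that respect the ordering over invented numeric data; the closure-based algorithms of the thesis are not reproduced here.

```python
from itertools import combinations

# Hypothetical (X, Y) values for four objects.
data = [
    (1.0, 2.0),
    (2.0, 2.5),
    (3.0, 2.4),
    (4.0, 3.1),
]

def gradual_support(rows):
    # Fraction of object pairs that respect "the higher X, the higher Y".
    pairs = list(combinations(rows, 2))
    respecting = sum(1 for (x1, y1), (x2, y2) in pairs
                     if (x1 < x2 and y1 < y2) or (x1 > x2 and y1 > y2))
    return respecting / len(pairs)

print(gradual_support(data))  # 0.833...: 5 of the 6 pairs respect the pattern
```
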
5

Novi, Daniele. « Knowledge management and Discovery for advanced Enterprise Knowledge Engineering ». Doctoral thesis, Universita degli studi di Salerno, 2014. http://hdl.handle.net/10556/1466.

Abstract:
The research work mainly addresses issues related to the adoption of models, methodologies and knowledge management tools that implement a pervasive use of the latest technologies in the area of the Semantic Web for the improvement of business processes and Enterprise 2.0 applications. The first phase of the research focused on the study and analysis of the state of the art and the problems of Knowledge Discovery in databases, paying particular attention to data mining systems. The most innovative approaches investigated for "Enterprise Knowledge Engineering" are listed below. In detail, the problems analyzed are those relating to architectural aspects and the integration of legacy (or non-legacy) systems. The intended research contribution consists in the identification and definition of a uniform and general model, a "Knowledge Enterprise Model", which is original with respect to the canonical approaches of enterprise architecture (for example with respect to the Object Management Group - OMG - standard). The introduction of the tools and principles of Enterprise 2.0 in the company has been investigated and, simultaneously, appropriate Semantic Enterprise based solutions have been defined for the problem of fragmentation of information and for the improvement of the processes of knowledge discovery and functional knowledge sharing. All studies and analyses are finalized and validated by defining a methodology and related supporting software tools for the improvement of processes related to the life cycles of best practices across the enterprise. Collaborative tools, knowledge modeling, algorithms, and knowledge discovery and extraction are applied synergistically to support these processes.
6

Dao, Ngoc Bich. « Réduction de dimension de sac de mots visuels grâce à l’analyse formelle de concepts ». Thesis, La Rochelle, 2017. http://www.theses.fr/2017LAROS010/document.

Abstract:
In several scientific fields such as statistics, computer vision and machine learning, the reduction of redundant and/or irrelevant information in the data description (dimension reduction) is an important step. This process contains two different categories: feature extraction and feature selection, of which feature selection in unsupervised learning is hitherto an open question. In this manuscript, we discuss feature selection on image datasets using Formal Concept Analysis (FCA), with a focus on lattice structure and lattice theory. The images in a dataset are described as a set of visual words by the bag of visual words model. Two algorithms are proposed in this thesis to select relevant features, and they can be used in both unsupervised and supervised learning. The first algorithm, RedAttsSansPerte, is based on lattice structure and lattice theory to ensure its ability to remove redundant features using the precedence graph, in which two attributes are related when the sets of objects they belong to are included in one another. The formal definition of the precedence graph is given in this thesis, together with its properties and its relationship with the AC-poset. Results from experiments indicate that the RedAttsSansPerte algorithm reduces the size of the feature set while maintaining classification performance. Secondly, the RedAttsFloue algorithm, an extension of the RedAttsSansPerte algorithm, is proposed. This extension uses a fuzzy precedence graph, whose formal definition and properties are also given in this manuscript. The RedAttsFloue algorithm removes redundant and irrelevant features while retaining relevant information according to the flexibility threshold of the fuzzy precedence graph; a high flexibility threshold mechanically entails a loss of information and hence a drop in classification performance. Experiments show that the RedAttsFloue algorithm reduces the feature set further without significantly decreasing classification performance.
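
The idea of discarding reducible attributes can be sketched on a made-up binary context: an attribute whose extent equals the intersection of strictly larger extents of other attributes can be dropped without changing the concept lattice. This is the classical reduction idea behind keeping only irreducible attributes, not the thesis' RedAttsSansPerte algorithm itself.

```python
# Hypothetical image/visual-word context: image -> set of visual words it contains.
context = {
    "img1": {"w1", "w2", "w3"},
    "img2": {"w1"},
    "img3": {"w2"},
}
attributes = {"w1", "w2", "w3"}

def extent(m):
    # Set of images containing the visual word m.
    return frozenset(g for g, words in context.items() if m in words)

def reducible(m):
    # m is reducible if its extent is the intersection of strictly larger extents.
    strictly_larger = [extent(n) for n in attributes if n != m and extent(n) > extent(m)]
    if not strictly_larger:
        return False
    common = frozenset(context)
    for e in strictly_larger:
        common &= e
    return common == extent(m)

kept = sorted(m for m in attributes if not reducible(m))
print(kept)  # ['w1', 'w2']: w3 is reducible, its extent is extent(w1) & extent(w2)
```
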
7

Diner, Casri. « Visualizing Data With Formal Concept Analysis ». Master's thesis, METU, 2003. http://etd.lib.metu.edu.tr/upload/1046325/index.pdf.

Abstract:
In this thesis, we wanted to stress the tendency to the geometry of data. This should be applicable in almost every branch of science where data are of great importance, and also in every kind of industry, economy, medicine, etc. Since the hard-disk capacities used for storing data and the amount of data reachable through the internet are increasing day by day, there is a need to turn this information into knowledge. This is one of the reasons for studying formal concept analysis. We wanted to point out how this application is related to algebra and logic. The beginning of the first chapter emphasizes the relation between closure systems, Galois connections, lattice theory as a mathematical structure, and concept analysis. Then it describes the basic step in the formalization: an elementary form of the representation of data is defined mathematically. The second chapter explains the logic of formal concept analysis. It also shows how implications between attributes, which can be regarded as special formulas on a set, can be expressed by fewer implications, a so-called generating set for implications. These mathematical tools are then used in the last chapter in order to describe complex 'concept' lattices by means of decomposition methods in examples.
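
A minimal sketch, on an invented context, of what it means for an attribute implication to hold: every object possessing all attributes of the premise must also possess all attributes of the conclusion.

```python
# Made-up formal context: object -> set of attributes it has.
context = {
    "o1": {"a", "b", "c"},
    "o2": {"a", "b"},
    "o3": {"b", "c"},
}

def holds(premise, conclusion):
    # The implication premise -> conclusion holds if every object
    # having all attributes of the premise also has the conclusion.
    return all(conclusion <= attrs
               for attrs in context.values()
               if premise <= attrs)

print(holds({"a"}, {"b"}))  # True: every object with a also has b
print(holds({"c"}, {"a"}))  # False: o3 has c but not a
```
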
8

Krajča, Petr. « Advanced algorithms for formal concept analysis ». Diss., Online access via UMI:, 2009.

Abstract:
Thesis (Ph. D.)--State University of New York at Binghamton, Thomas J. Watson School of Engineering and Applied Science, Department of Systems Science and Industrial Engineering, 2009. Includes bibliographical references.
9

Petersen, Wiebke, et Petja Heinrich. « Qualitative Citation Analysis Based on Formal Concept Analysis ». Universitätsbibliothek Chemnitz, 2008. http://nbn-resolving.de/urn:nbn:de:bsz:ch1-200801464.

Abstract:
Citation analysis (Kessler 1963) is one of the tasks of bibliometrics, that is, the analysis of co-citations (two texts are co-cited if there is a text in which both are cited) and of bibliographic coupling (two texts are bibliographically coupled if they share a common citation). The talk shows that Formal Concept Analysis (FCA) provides suitable tools for qualitative citation analysis. A particular property of FCA is that it allows heterogeneous (qualitative and scalar) attributes to be combined. Using suitable scales also addresses the problem that, in qualitative approaches, the large number of texts to be analyzed usually leads to citation graphs that are too cluttered to grasp. The bibliographic coupling relation is closely related to the neighborhood contexts developed by Priss, which are used for the analysis of lexicons. Using several example analyses, the most important notions of citation analysis are modeled in formal contexts and concept lattices. It turns out that the hierarchical concept lattices of FCA are superior to ordinary citation graphs in many respects, since their lattice structure makes certain regularities explicit. It is also shown how combining suitable attributes (doctoral advisor, institute, department, citation frequency, keywords) and scales can counter frequent sources of error such as courtesy citations, habitual citations, and so on.
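
The two citation relations mentioned above can be computed directly from a citation mapping; the papers and reference sets in the sketch below are hypothetical.

```python
# Hypothetical citation mapping: paper -> set of papers it cites.
cites = {
    "p1": {"x", "y"},
    "p2": {"y", "z"},
    "p3": {"p1", "p2"},
}

def coupled(a, b):
    # Bibliographic coupling: a and b cite at least one common paper.
    return bool(cites.get(a, set()) & cites.get(b, set()))

def cocited(a, b):
    # Co-citation: some third paper cites both a and b.
    return any({a, b} <= refs for refs in cites.values())

print(coupled("p1", "p2"))  # True: both cite y
print(cocited("p1", "p2"))  # True: p3 cites both
```
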
10

Sertkaya, Baris. « Formal Concept Analysis Methods for Description Logics ». Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2008. http://nbn-resolving.de/urn:nbn:de:bsz:14-ds-1215598189927-85390.

Abstract:
This work presents mainly two contributions to Description Logics (DLs) research by means of Formal Concept Analysis (FCA) methods: supporting bottom-up construction of DL knowledge bases, and completing DL knowledge bases. Its contribution to FCA research is on the computational complexity of computing generators of closed sets.

Books on the topic "Fuzzy formal concept analysis"

1

Braud, Agnès, Aleksey Buzmakov, Tom Hanika et Florence Le Ber, dir. Formal Concept Analysis. Cham : Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-77867-5.

2

Cristea, Diana, Florence Le Ber et Baris Sertkaya, dir. Formal Concept Analysis. Cham : Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-21462-3.

3

Kwuida, Léonard, et Barış Sertkaya, dir. Formal Concept Analysis. Berlin, Heidelberg : Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-11928-6.

4

Ferré, Sébastien, et Sebastian Rudolph, dir. Formal Concept Analysis. Berlin, Heidelberg : Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-01815-2.

5

Cellier, Peggy, Felix Distel et Bernhard Ganter, dir. Formal Concept Analysis. Berlin, Heidelberg : Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-38317-5.

6

Domenach, Florent, Dmitry I. Ignatov et Jonas Poelmans, dir. Formal Concept Analysis. Berlin, Heidelberg : Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-29892-9.

7

Valtchev, Petko, et Robert Jäschke, dir. Formal Concept Analysis. Berlin, Heidelberg : Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-20514-9.

8

Glodeanu, Cynthia Vera, Mehdi Kaytoue et Christian Sacarea, dir. Formal Concept Analysis. Cham : Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-07248-7.

9

Ganter, Bernhard, et Rudolf Wille. Formal Concept Analysis. Berlin, Heidelberg : Springer Berlin Heidelberg, 1999. http://dx.doi.org/10.1007/978-3-642-59830-2.

10

Medina, Raoul, et Sergei Obiedkov, dir. Formal Concept Analysis. Berlin, Heidelberg : Springer Berlin Heidelberg, 2008. http://dx.doi.org/10.1007/978-3-540-78137-0.


Book chapters on the topic "Fuzzy formal concept analysis"

1

Macko, Juraj. « User-Friendly Fuzzy FCA ». Dans Formal Concept Analysis, 156–71. Berlin, Heidelberg : Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-38317-5_10.

2

Bělohlávek, Radim, Vladimír Sklenář et Jiří Zacpal. « Crisply Generated Fuzzy Concepts ». Dans Formal Concept Analysis, 269–84. Berlin, Heidelberg : Springer Berlin Heidelberg, 2005. http://dx.doi.org/10.1007/978-3-540-32262-7_19.

3

Brito, Abner, Laécio Barros, Estevão Laureano, Fábio Bertato et Marcelo Coniglio. « Fuzzy Formal Concept Analysis ». Dans Communications in Computer and Information Science, 192–205. Cham : Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-95312-0_17.

4

Bělohlávek, Radim, et Vilém Vychodil. « Attribute Implications in a Fuzzy Setting ». Dans Formal Concept Analysis, 45–60. Berlin, Heidelberg : Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11671404_3.

5

García-Pardo, F., I. P. Cabrera, P. Cordero et M. Ojeda-Aciego. « On Closure Systems and Adjunctions Between Fuzzy Preordered Sets ». Dans Formal Concept Analysis, 114–27. Cham : Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-19545-2_7.

6

Antoni, Lubomir, Stanislav Krajči et Ondrej Krídlo. « Randomized Fuzzy Formal Contexts and Relevance of One-Sided Concepts ». Dans Formal Concept Analysis, 183–99. Cham : Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-19545-2_12.

7

Konecny, Jan. « Bonds Between $$L$$ -Fuzzy Contexts Over Different Structures of Truth-Degrees ». Dans Formal Concept Analysis, 81–96. Cham : Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-19545-2_5.

8

Liu, Xiaodong, et Witold Pedrycz. « AFS Formal Concept and AFS Fuzzy Formal Concept Analysis ». Dans Axiomatic Fuzzy Set Theory and Its Applications, 303–49. Berlin, Heidelberg : Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-00402-5_8.

9

Cordero, Pablo, Manuel Enciso et Angel Mora. « Directness in Fuzzy Formal Concept Analysis ». Dans Communications in Computer and Information Science, 585–95. Cham : Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-91473-2_50.

10

Djouadi, Yassine, et Henri Prade. « Interval-Valued Fuzzy Formal Concept Analysis ». Dans Lecture Notes in Computer Science, 592–601. Berlin, Heidelberg : Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-04125-9_62.


Conference papers on the topic "Fuzzy formal concept analysis"

1

Golinska-Pilarek, Joanna, et Ewa Orlowska. « Relational Reasoning in Formal Concept Analysis ». Dans 2007 IEEE International Fuzzy Systems Conference. IEEE, 2007. http://dx.doi.org/10.1109/fuzzy.2007.4295512.

2

Zhou, Lei. « Formal concept analysis in intuitionistic fuzzy formal context ». Dans 2010 Seventh International Conference on Fuzzy Systems and Knowledge Discovery (FSKD). IEEE, 2010. http://dx.doi.org/10.1109/fskd.2010.5569658.

3

Shen, Lili, et Dexue Zhang. « Formal concept analysis on fuzzy sets ». Dans 2013 Joint IFSA World Congress and NAFIPS Annual Meeting (IFSA/NAFIPS). IEEE, 2013. http://dx.doi.org/10.1109/ifsa-nafips.2013.6608402.

4

Kridlo, Ondrej, et Manuel Ojeda-Aciego. « Extending formal concept analysis using intuitionistic l-fuzzy sets ». Dans 2017 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE). IEEE, 2017. http://dx.doi.org/10.1109/fuzz-ieee.2017.8015570.

5

Cross, Valerie V., et Wenting Yi. « Formal concept analysis for ontologies and their annotation files ». Dans 2008 IEEE 16th International Conference on Fuzzy Systems (FUZZ-IEEE). IEEE, 2008. http://dx.doi.org/10.1109/fuzzy.2008.4630646.

6

Hashemi, R. R., S. De Agostino, B. Westgeest et J. R. Talburt. « Data granulation and formal concept analysis ». Dans IEEE Annual Meeting of the Fuzzy Information, 2004. Processing NAFIPS '04. IEEE, 2004. http://dx.doi.org/10.1109/nafips.2004.1336253.

7

Zhou, Wen, Zongtian Liu et Yan Zhao. « Concept Hierarchies Generation for Classification using Fuzzy Formal Concept Analysis ». Dans Eighth ACIS International Conference on Software Engineering, Artificial Intelligence, Networking, and Parallel/Distributed Computing (SNPD 2007). IEEE, 2007. http://dx.doi.org/10.1109/snpd.2007.229.

8

Belohlavek, Radim, et Jan Konecny. « Scaling, Granulation, and Fuzzy Attributes in Formal Concept Analysis ». Dans 2007 IEEE International Fuzzy Systems Conference. IEEE, 2007. http://dx.doi.org/10.1109/fuzzy.2007.4295488.

9

Yao, Y. Y., et Yaohua Chen. « Rough set approximations in formal concept analysis ». Dans IEEE Annual Meeting of the Fuzzy Information, 2004. Processing NAFIPS '04. IEEE, 2004. http://dx.doi.org/10.1109/nafips.2004.1336252.

10

Liu, Jun, et Xiaoqiu Yao. « Formal concept analysis of incomplete information system ». Dans 2010 Seventh International Conference on Fuzzy Systems and Knowledge Discovery (FSKD). IEEE, 2010. http://dx.doi.org/10.1109/fskd.2010.5569661.


Reports of organizations on the topic "Fuzzy formal concept analysis"

1

Baader, Franz, Bernhard Ganter, Ulrike Sattler et Barış Sertkaya. Completing Description Logic Knowledge Bases using Formal Concept Analysis. Aachen University of Technology, 2006. http://dx.doi.org/10.25368/2022.155.

Abstract:
We propose an approach for extending both the terminological and the assertional part of a Description Logic knowledge base by using information provided by the assertional part and by a domain expert. The use of techniques from Formal Concept Analysis ensures that, on the one hand, the interaction with the expert is kept to a minimum, and, on the other hand, we can show that the extended knowledge base is complete in a certain sense.
2

Baader, Franz, et Felix Distel. A finite basis for the set of EL-implications holding in a finite model. Technische Universität Dresden, 2007. http://dx.doi.org/10.25368/2022.160.

Abstract:
Formal Concept Analysis (FCA) can be used to analyze data given in the form of a formal context. In particular, FCA provides efficient algorithms for computing a minimal basis of the implications holding in the context. In this paper, we extend classical FCA by considering data that are represented by relational structures rather than formal contexts, and by replacing atomic attributes by complex formulae defined in some logic. After generalizing some of the FCA theory to this more general form of contexts, we instantiate the general framework with attributes defined in the Description Logic (DL) EL, and with relational structures over a signature of unary and binary predicates, i.e., models for EL. In this setting, an implication corresponds to a so-called general concept inclusion axiom (GCI) in EL. The main technical result of this report is that, in EL, for any finite model there is a finite set of implications (GCIs) holding in this model from which all implications (GCIs) holding in the model follow.
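
As a hedged illustration of how such a basis is used, the Python sketch below closes an attribute set under a made-up set of implications and checks whether a further implication follows from them; it does not compute the EL basis described in the report.

```python
def closure(attrs, implications):
    # Smallest superset of attrs closed under all implications (premise, conclusion).
    closed = set(attrs)
    changed = True
    while changed:
        changed = False
        for premise, conclusion in implications:
            if premise <= closed and not conclusion <= closed:
                closed |= conclusion
                changed = True
    return closed

def follows(premise, conclusion, implications):
    # premise -> conclusion follows from the basis iff the conclusion is in the closure.
    return conclusion <= closure(premise, implications)

# Invented basis of implications over attributes a, b, c, d.
basis = [({"a"}, {"b"}), ({"b", "c"}, {"d"})]
print(closure({"a", "c"}, basis))          # {'a', 'b', 'c', 'd'}
print(follows({"a", "c"}, {"d"}, basis))   # True
```
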
3

Baader, Franz, et Felix Distel. Exploring finite models in the Description Logic ELgfp. Technische Universität Dresden, 2008. http://dx.doi.org/10.25368/2022.168.

Abstract:
In a previous ICFCA paper we have shown that, in the Description Logics EL and ELgfp, the set of general concept inclusions holding in a finite model always has a finite basis. In this paper, we address the problem of how to compute this basis efficiently, by adapting methods from formal concept analysis.
4

Borchmann, Daniel. A General Form of Attribute Exploration. Technische Universität Dresden, 2013. http://dx.doi.org/10.25368/2022.192.

Abstract:
We present a general form of attribute exploration, a knowledge completion algorithm from formal concept analysis. The aim of this generalization is to extend the applicability of attribute exploration by a general description. Additionally, this may also allow for viewing different existing variants of attribute exploration as instances of a general form, as for example exploration on partial contexts.
5

Borchmann, Daniel. Exploration by Confidence. Technische Universität Dresden, 2013. http://dx.doi.org/10.25368/2022.194.

Abstract:
Within formal concept analysis, attribute exploration is a powerful tool to semiautomatically check data for completeness with respect to a given domain. However, the classical formulation of attribute exploration does not take into account possible errors which are present in the initial data. We present in this work a generalization of attribute exploration based on the notion of confidence, which will allow for the exploration of implications which are not necessarily valid in the initial data, but instead enjoy a minimal confidence therein.
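
A small sketch, on an invented context, of the confidence notion used here: the fraction of objects satisfying the premise of an implication that also satisfy its conclusion.

```python
# Made-up formal context: object -> set of attributes.
context = {
    "o1": {"a", "b"},
    "o2": {"a", "b"},
    "o3": {"a"},      # a counterexample to a -> b
    "o4": {"b"},
}

def confidence(premise, conclusion):
    # Share of objects having the premise that also have the conclusion.
    supporting = [attrs for attrs in context.values() if premise <= attrs]
    if not supporting:
        return 1.0
    return sum(conclusion <= attrs for attrs in supporting) / len(supporting)

print(confidence({"a"}, {"b"}))  # 0.666...: the implication is explored despite o3
```
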
6

Соловйов, Володимир Миколайович, et V. Saptsin. Heisenberg uncertainty principle and economic analogues of basic physical quantities. Transport and Telecommunication Institute, 2011. http://dx.doi.org/10.31812/0564/1188.

Abstract:
Starting from the positions attained by modern theoretical physics in understanding the foundations of the universe, a methodological and philosophical analysis of fundamental physical concepts and of their formal and informal connections with real economic measurements is carried out. Procedures for determining heterogeneous economic time, normalized economic coordinates and economic mass are offered, based on the analysis of time series, and the concept of an economic Planck constant is proposed. The theory has been tested on real economic dynamics time series, including stock indices, Forex and spot prices; the achieved results are open for discussion.
7

Soloviev, V., V. Solovieva et V. Saptsin. Heisenberg uncertainity principle and economic analogues of basic physical quantities. Брама-Україна, 2014. http://dx.doi.org/10.31812/0564/1306.

Abstract:
Starting from the positions attained by modern theoretical physics in understanding the foundations of the universe, a methodological and philosophical analysis of fundamental physical concepts and of their formal and informal connections with real economic measurements is carried out. Procedures for determining heterogeneous economic time, normalized economic coordinates and economic mass are offered, based on the analysis of time series, and the concept of an economic Planck constant is proposed. The theory has been tested on real economic dynamics time series, including stock indices, Forex and spot prices; the achieved results are open for discussion.
8

Soloviev, V., et V. Solovieva. Quantum econophysics of cryptocurrencies crises. [б. в.], 2018. http://dx.doi.org/10.31812/0564/2464.

Abstract:
Starting from the positions attained by modern theoretical physics in understanding the foundations of the universe, a methodological and philosophical analysis of fundamental physical concepts and of their formal and informal connections with real economic measurements is carried out. Procedures for determining heterogeneous economic time, normalized economic coordinates and economic mass are offered, based on the analysis of time series, and the concept of an economic Planck constant is proposed. The theory has been tested on real economic dynamics time series related to the cryptocurrency market; the achieved results are open for discussion. Then, combining the empirical cross-correlation matrix with random matrix theory, we examine the statistical properties of the cross-correlation coefficients, the evolution of the average correlation coefficient, and the distribution of eigenvalues and corresponding eigenvectors of the global cryptocurrency market, using the daily returns of 15 cryptocurrency price time series from across the world from 2016 to 2018. The results indicate that the largest eigenvalue reflects a collective effect of the whole market, practically coincides with the dynamics of the mean value of the correlation coefficient, and is very sensitive to crisis phenomena. It is shown that both the introduced economic mass and the largest eigenvalue of the correlation matrix can serve as quantum indicator-predictors of crises in the cryptocurrency market.
9

Kelly, Luke. Definitions, Characteristics and Monitoring of Conflict Economies. Institute of Development Studies (IDS), février 2022. http://dx.doi.org/10.19088/k4d.2022.024.

Abstract:
The idea of conflict economies is a broad concept encompassing several research angles, and definitions differ according to these focuses. Some of the main uses of the concept are to understand:
• economic analysis of the motives for and likelihood of war;
• financing of state and non-state belligerents;
• how the continuation of conflicts can be explained by rational motives, including economic ones;
• how conflict affects economic activity, and how conflict parties and citizens adapt.
Some distinctive characteristics of war economies are (Ballentine & Nitzschke, 2005, p. 12):
• They involve the destruction or circumvention of the formal economy and the growth of informal and black markets;
• Pillage, predation, extortion, and deliberate violence against civilians are used by combatants to acquire control over lucrative assets, capture trade networks and diaspora remittances, and exploit labour;
• War economies are highly decentralised and privatised, both in the means of coercion and in the means of production and exchange;
• Combatants increasingly rely on the licit or illicit exploitation of, and trade in, lucrative natural resources;
• They thrive on cross-border trading networks, regional kin and ethnic groups, arms traffickers, and mercenaries, as well as legally operating commercial entities, each of which may have a vested interest in the continuation of conflict and instability.
The first section of this rapid review outlines the evolution of the term and key definitions; most of this discussion occurs in the academic literature around the early 2000s. The second looks at key characteristics of conflict economies identified in the literature, with examples where possible from both academic and grey literature. The third section briefly identifies methodologies used to measure and monitor conflict economies, as well as some current research and programmes on conflict economies, drawing on academic literature as well as NGOs and other sources. The findings have been derived via a literature search and advice from experts in the field. Given time constraints, the report is not comprehensive. The review is gender- and disability-blind.
10

Modlo, Yevhenii O., Serhiy O. Semerikov, Stanislav L. Bondarevskyi, Stanislav T. Tolmachev, Oksana M. Markova et Pavlo P. Nechypurenko. Methods of using mobile Internet devices in the formation of the general scientific component of bachelor in electromechanics competency in modeling of technical objects. [б. в.], février 2020. http://dx.doi.org/10.31812/123456789/3677.

Abstract:
An analysis of the experience of the professional training of bachelors of electromechanics in Ukraine and abroad made it possible to determine that one of the leading trends in its modernization is the synergistic integration of various engineering branches (mechanical, electrical, electronic engineering and automation) in mechatronics for the purpose of the design, manufacture, operation and maintenance of electromechanical equipment. Teaching mechatronics provides for the meaningful integration of various disciplines of the professional and practical training of bachelors of electromechanics based on the concept of modeling, and the technological integration of various organizational forms and teaching methods based on the concept of mobility. Within this approach, the leading learning tools of bachelors of electromechanics are mobile Internet devices (MID): multimedia mobile devices that provide wireless access to information and communication Internet services for collecting, organizing, storing, processing, transmitting and presenting all kinds of messages and data. The authors reveal the main possibilities of using MID in learning to ensure equal access to education, personalized learning, instant feedback and evaluation of learning outcomes, mobile learning, productive use of time spent in classrooms, creation of mobile learning communities, support for situated learning, development of continuous seamless learning, bridging of the gap between formal and informal learning, minimization of educational disruption in conflict and disaster areas, assistance to learners with disabilities, improvement of the quality of communication and of the management of the institution, and maximization of cost-efficiency. Bachelor of electromechanics competency in modeling of technical objects is a personal and vocational ability which includes a system of knowledge, skills, experience in learning and research activities on modeling mechatronic systems, and a positive value attitude towards it; the bachelor of electromechanics should be ready and able to use methods and software/hardware modeling tools for process analysis, system synthesis, and the evaluation of their reliability and effectiveness for solving practical problems in the professional field. The competency structure of the bachelor of electromechanics in the modeling of technical objects is reflected in three groups of competencies: general scientific, general professional and specialized professional. The implementation of the technique of using MID in teaching bachelors of electromechanics the modeling of technical objects relies on an appropriate methodology, whose components, partial methods for using MID in the formation of the general scientific component of the bachelor of electromechanics competency in modeling of technical objects, are disclosed using the example of the academic disciplines "Higher mathematics", "Computers and programming", "Engineering mechanics" and "Electrical machines". The leading tools for the formation of the general scientific component of bachelor in electromechanics competency in modeling of technical objects are augmented reality mobile tools (to visualize the objects' structure and modeling results), mobile computer mathematical systems (universal tools used at all stages of modeling learning), cloud based spreadsheets (as modeling tools) and text editors (to make the program description of the model), mobile computer-aided design systems (to create and view the physical properties of models of technical objects) and mobile communication tools (to organize joint activity in modeling).