Academic literature on the topic 'Fuzzy formal concept analysis'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Fuzzy formal concept analysis.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Fuzzy formal concept analysis"

1

Supriyati, Endang. "FUZZY FORMAL CONCEPT ANALYSIS UNTUK KEMIRIPAN DOKUMEN." Simetris : Jurnal Teknik Mesin, Elektro dan Ilmu Komputer 1, no. 1 (June 29, 2013): 21. http://dx.doi.org/10.24176/simet.v1i1.111.

Full text
Abstract:
ABSTRACT: Fuzzy logic can be incorporated into ontologies to represent the uncertainty of information found in many application domains, caused by the lack of clear boundaries between domain concepts. A fuzzy ontology is generated from a predefined concept hierarchy. However, constructing a concept hierarchy for a particular domain can be a difficult and tedious task. To address this problem, Fuzzy Formal Concept Analysis (FFCA) is proposed. The starting point of the method proposed in this paper is the definition of the context and of a similarity relation on the domain ontology, which is then mapped into a concept lattice. Using a lattice-navigator tool, the proposed method is able to cluster the domain ontology effectively. Keywords: Ontology, Formal Concept Analysis, Fuzzy Formal Concept Analysis, concept lattice
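The FFCA setting the abstract invokes rests on a pair of fuzzy derivation operators forming a Galois connection over a fuzzy context. A minimal sketch, assuming the Łukasiewicz residuum and invented toy document/term data (none of the data or tooling below comes from the paper):

```python
# Minimal fuzzy-FCA derivation sketch (hypothetical toy data; the paper's
# own contexts and similarity relation are not reproduced here).
# Uses the Lukasiewicz residuum a => b = min(1, 1 - a + b).

def residuum(a, b):
    """Lukasiewicz implication on [0, 1]."""
    return min(1.0, 1.0 - a + b)

def up(A, I, objects, attributes):
    """Fuzzy extent -> fuzzy intent: B(y) = min over x of (A(x) => I(x, y))."""
    return {y: min(residuum(A[x], I[(x, y)]) for x in objects)
            for y in attributes}

def down(B, I, objects, attributes):
    """Fuzzy intent -> fuzzy extent: A(x) = min over y of (B(y) => I(x, y))."""
    return {x: min(residuum(B[y], I[(x, y)]) for y in attributes)
            for x in objects}

# Toy fuzzy context: degrees to which documents exhibit terms.
objects = ["d1", "d2"]
attributes = ["t1", "t2"]
I = {("d1", "t1"): 1.0, ("d1", "t2"): 0.4,
     ("d2", "t1"): 0.7, ("d2", "t2"): 0.9}

A = {"d1": 1.0, "d2": 0.5}          # a fuzzy set of objects
intent = up(A, I, objects, attributes)
extent = down(intent, I, objects, attributes)
# Iterating up/down to a fixpoint yields a fuzzy formal concept.
```

The choice of residuum is one of several standard options (Gödel, product, Łukasiewicz); the lattice obtained depends on it.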
APA, Harvard, Vancouver, ISO, and other styles
2

FORMICA, ANNA. "CONCEPT SIMILARITY IN FUZZY FORMAL CONCEPT ANALYSIS FOR SEMANTIC WEB." International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems 18, no. 02 (April 2010): 153–67. http://dx.doi.org/10.1142/s0218488510006465.

Abstract:
This paper presents a method for evaluating concept similarity within Fuzzy Formal Concept Analysis. In the perspective of developing the Semantic Web, such a method can be helpful when the digital resources found on the Internet cannot be treated equally and the integration of fuzzy data becomes fundamental for the search and discovery of information in the Web.
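Formica's specific similarity measure is not reproduced in the abstract; as a generic illustration only, a fuzzy Jaccard index comparing the fuzzy intents of two concepts is one common way to score concept similarity (all names and data below are invented):

```python
# Generic fuzzy-set similarity sketch (NOT the measure defined in the
# paper): a fuzzy Jaccard index over two fuzzy attribute sets.

def fuzzy_jaccard(f, g, universe):
    """Sum of pointwise minima over sum of pointwise maxima."""
    num = sum(min(f.get(a, 0.0), g.get(a, 0.0)) for a in universe)
    den = sum(max(f.get(a, 0.0), g.get(a, 0.0)) for a in universe)
    return num / den if den else 1.0

# Toy fuzzy intents of two concepts describing web resources.
attrs = ["resolution", "size", "format"]
intent1 = {"resolution": 0.9, "size": 0.4, "format": 1.0}
intent2 = {"resolution": 0.7, "size": 0.6, "format": 1.0}
sim = fuzzy_jaccard(intent1, intent2, attrs)
```

A score near 1 indicates near-identical fuzzy intents; 0 indicates disjoint support.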
3

Haddache, Mohamed, Allel Hadjali, and Hamid Azzoune. "Skyline refinement exploiting fuzzy formal concept analysis." International Journal of Intelligent Computing and Cybernetics 14, no. 3 (April 29, 2021): 333–62. http://dx.doi.org/10.1108/ijicc-11-2020-0181.

Abstract:
Purpose – The study of skyline queries has received considerable attention from database researchers since the end of the 2000s. Skyline queries are an appropriate tool that can help users make intelligent decisions in the presence of multidimensional data, when different, and often contradictory, criteria are to be taken into account. Based on the concept of Pareto dominance, the skyline process extracts the most interesting objects (those not dominated in the sense of Pareto) from a set of data. Skyline computation methods often lead to a set of large size, which is less informative for end users and not easy to exploit. The purpose of this paper is to tackle this problem, known as the large-size skyline problem, and propose a solution to deal with it by applying an appropriate refining process.

Design/methodology/approach – The problem of skyline refinement is formalized in the fuzzy formal concept analysis setting. Then, an ideal fuzzy formal concept is computed in the sense of some particular defined criteria. By leveraging the elements of this ideal concept, one can reduce the size of the computed skyline.

Findings – An appropriate and rational solution is discussed for the problem of interest. Then, a tool, named SkyRef, is developed. Rich experiments are done using this tool on both synthetic and real datasets.

Research limitations/implications – The authors have conducted experiments on synthetic and some real datasets to show the effectiveness of the proposed approaches. However, thorough experiments on large-scale real datasets are highly desirable to show the behavior of the tool with respect to performance and execution-time criteria.

Practical implications – The developed tool, SkyRef, can have many application domains that require decision-making and personalized recommendation, and where the size of the skyline has to be reduced. In particular, SkyRef can be used in several real-world applications such as economics, security, medicine and services.

Social implications – This work can be exploited in all domains that require decision-making, like hotel finders, restaurant recommenders, recruitment of candidates, etc.

Originality/value – This study mixes two research fields: artificial intelligence (i.e., formal concept analysis) and databases (i.e., skyline queries). The key elements of the solution proposed for the skyline refinement problem are borrowed from fuzzy formal concept analysis, which makes it clearer and more rational, semantically speaking. On the other hand, this study opens the door to using formal concept analysis and its extensions in solving other issues related to skyline queries, such as relaxation.
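The Pareto-dominance step the abstract builds on can be sketched directly; the toy data and the naive filtering below are invented for illustration, and the paper's fuzzy-FCA refinement of the resulting skyline is not reproduced:

```python
# Hedged sketch of the basic skyline computation (Pareto-dominance
# filtering), the starting point the abstract describes.

def dominates(p, q):
    """p Pareto-dominates q: at least as good in every dimension
    (here: lower is better), strictly better in at least one."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def skyline(points):
    """Naive O(n^2) block-nested-loop skyline: keep non-dominated points."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Toy example: hotels as (price, distance to beach); lower is better on both.
hotels = [(50, 8), (60, 5), (70, 3), (55, 9), (80, 2)]
print(skyline(hotels))  # (55, 9) is dominated by (50, 8); the rest survive
```

On real multidimensional data the surviving set can still be large, which is exactly the "large size skyline problem" the paper refines away.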
4

Konecny, Jan, and Ondrej Krídlo. "On biconcepts in formal fuzzy concept analysis." Information Sciences 375 (January 2017): 16–29. http://dx.doi.org/10.1016/j.ins.2016.09.042.

5

Konecny, Jan, and Michal Krupka. "Block relations in formal fuzzy concept analysis." International Journal of Approximate Reasoning 73 (June 2016): 27–55. http://dx.doi.org/10.1016/j.ijar.2016.02.004.

6

Shao, Ming-Wen, Min Liu, and Wen-Xiu Zhang. "Set approximations in fuzzy formal concept analysis." Fuzzy Sets and Systems 158, no. 23 (December 2007): 2627–40. http://dx.doi.org/10.1016/j.fss.2007.05.002.

7

Alwersh, Mohammed, and Kovács László. "Fuzzy formal concept analysis: approaches, applications and issues." Computer Science and Information Technologies 3, no. 2 (July 1, 2022): 126–36. http://dx.doi.org/10.11591/csit.v3i2.p126-136.

Abstract:
Formal concept analysis (FCA) is today regarded as a significant technique for knowledge extraction, representation, and analysis, with applications in a variety of fields. Significant progress has been made in recent years to extend FCA theory to deal with uncertain and imperfect data. The computational complexity associated with the enormous number of formal concepts generated has been identified as an issue in various applications; in general, managing the complexity and size of the generated concept lattice is one of the most fundamental challenges in FCA. The goal of this work is to provide an overview of research articles that assess and compare the numerous fuzzy formal concept analysis techniques which have been suggested, and to explore the key techniques for reducing concept lattice size. We also present a review of research articles on the use of fuzzy formal concept analysis in ontology engineering, knowledge discovery in databases and data mining, and information retrieval.
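The lattice-size problem the abstract highlights is visible even on a toy crisp context: every closed attribute subset yields a concept, and the count grows quickly with context size. A minimal illustrative sketch (the 3x3 context below is invented, not taken from the survey):

```python
# Enumerate all formal concepts of a tiny crisp context by closing
# every attribute subset with the two derivation operators.
from itertools import combinations

objects = {"o1", "o2", "o3"}
attributes = {"a", "b", "c"}
incidence = {("o1", "a"), ("o1", "b"), ("o2", "b"), ("o2", "c"), ("o3", "c")}

def extent(B):
    """Objects having all attributes in B."""
    return {o for o in objects if all((o, a) in incidence for a in B)}

def intent(A):
    """Attributes shared by all objects in A."""
    return {a for a in attributes if all((o, a) in incidence for o in A)}

def concepts():
    """All (extent, intent) pairs; duplicates collapse via the set."""
    found = set()
    for r in range(len(attributes) + 1):
        for B in combinations(sorted(attributes), r):
            A = extent(set(B))
            found.add((frozenset(A), frozenset(intent(A))))
    return found

lattice = concepts()
# Even this 3x3 context yields 6 concepts; real contexts grow much
# faster, which is the lattice-size problem the survey discusses.
```

This brute-force enumeration is exponential in the number of attributes; practical FCA tools use algorithms such as NextClosure instead.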
8

Poelmans, Jonas, Dmitry I. Ignatov, Sergei O. Kuznetsov, and Guido Dedene. "Fuzzy and rough formal concept analysis: a survey." International Journal of General Systems 43, no. 2 (January 6, 2014): 105–34. http://dx.doi.org/10.1080/03081079.2013.862377.

9

Liu, Yan, Sheng Quan Liu, and Peng Li. "Tourism Domain Ontology Construction Method Based on Fuzzy Formal Concept Analysis." Applied Mechanics and Materials 347-350 (August 2013): 2809–13. http://dx.doi.org/10.4028/www.scientific.net/amm.347-350.2809.

Abstract:
In this paper, fuzzy formal concept analysis is introduced into the tourism domain ontology construction process: first, fuzzy formal concept analysis is applied to the uncertain information in the tourism domain; then a fuzzy concept hierarchy is generated through conceptual clustering; lastly, this hierarchy is mapped to obtain a fuzzy ontology prototype. An example shows that the method is feasible and effective.
10

Wang, Ting Zhong, and Hong Sheng Xu. "Constructing Domain Ontology Based on Fuzzy Set and Concept Lattice." Applied Mechanics and Materials 63-64 (June 2011): 715–18. http://dx.doi.org/10.4028/www.scientific.net/amm.63-64.715.

Abstract:
The major task in FCA is to extract formal concepts, and the connections between them, from data given in the form of a formal context, so as to form a lattice structure of formal concepts. Fuzzy set theory and fuzzy logic are acknowledged as an appropriate formalism for capturing imprecise and vague knowledge. The paper offers a methodology for building an ontology for knowledge sharing and reuse based on the union of fuzzy concept lattices. It addresses these shortcomings by applying formal concept analysis theory and fuzzy sets to construct the concept hierarchies of an ontology, and the experiments show the CPU time against the number of attributes, indicating that FFCA is superior to FCA in building semantic web ontologies.

Dissertations / Theses on the topic "Fuzzy formal concept analysis"

1

De, Maio Carmen. "Fuzzy concept analysis for semantic knowledge extraction." Doctoral thesis, Universita degli studi di Salerno, 2012. http://hdl.handle.net/10556/1307.

Abstract:
2010 - 2011
The availability of controlled vocabularies, ontologies, and so on is an enabling feature for providing added value in terms of knowledge management. Nevertheless, the design, maintenance and construction of domain ontologies are human-intensive and time-consuming tasks. Knowledge Extraction consists of automatic techniques aimed at identifying and defining relevant concepts and relations of the domain of interest by analyzing structured (relational databases, XML) and unstructured (text, documents, images) sources. Specifically, the knowledge extraction methodology defined in this research work is aimed at enabling automatic ontology/taxonomy construction from existing resources in order to obtain useful information. For instance, the experimental results take into account data produced with Web 2.0 tools (e.g., RSS feeds, enterprise wikis, corporate blogs, etc.), text documents, and so on. The final results of the Knowledge Extraction methodology are taxonomies or ontologies represented in a machine-oriented manner by means of semantic web technologies such as RDFS, OWL and SKOS. The resulting knowledge models have been applied to different goals. On the one hand, the methodology has been applied in order to extract ontologies and taxonomies and to semantically annotate text. On the other hand, the resulting ontologies and taxonomies are exploited in order to enhance information retrieval performance, to categorize incoming data, and to provide an easy way to find interesting resources (such as faceted browsing). Specifically, the following objectives have been addressed in this research work: Ontology/Taxonomy Extraction, which concerns the automatic extraction of hierarchical conceptualizations (i.e., taxonomies) and of relations expressed by means of typical description logic constructs (i.e., ontologies); Information Retrieval, the definition of a technique to perform concept-based retrieval of information according to user queries; Faceted Browsing, in order to automatically provide faceted browsing capabilities according to the categorization of the extracted contents; and Semantic Annotation, the definition of a text analysis process aimed at automatically annotating the subjects and predicates identified. The experimental results have been obtained in several application domains: e-learning, enterprise human resource management, and clinical decision support systems. Future challenges go in the following directions: investigating approaches to support ontology alignment and merging applied to knowledge management.
2

Konecny, Jan. "Isotone fuzzy Galois connections and their applications in formal concept analysis." Diss., Online access via UMI:, 2009.

Abstract:
Thesis (Ph. D.)--State University of New York at Binghamton, Thomas J. Watson School of Engineering and Applied Science, Department of Systems Science and Industrial Engineering, 2009.
Includes bibliographical references.
3

Glodeanu, Cynthia Vera. "Conceptual Factors and Fuzzy Data." Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2013. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-103775.

Abstract:
With the growing number of large data sets, the necessity of complexity reduction applies today more than ever before. Moreover, some data may also be vague or uncertain. Thus, whenever we have an instrument for data analysis, the questions of how to apply complexity reduction methods and how to treat fuzzy data arise rather naturally. In this thesis, we discuss these issues for the very successful data analysis tool Formal Concept Analysis. In fact, we propose different methods for complexity reduction based on qualitative analyses, and we elaborate on various methods for handling fuzzy data. These two topics split the thesis into two parts. Data reduction is mainly dealt with in the first part of the thesis, whereas we focus on fuzzy data in the second part. Although each chapter may be read almost on its own, each one builds on and uses results from its predecessors. The main crosslink between the chapters is given by the reduction methods and fuzzy data. In particular, we will also discuss complexity reduction methods for fuzzy data, combining the two issues that motivate this thesis.
4

Ayouni, Sarra. "Etude et Extraction de règles graduelles floues : définition d'algorithmes efficaces." Thesis, Montpellier 2, 2012. http://www.theses.fr/2012MON20015/document.

Abstract:
Knowledge discovery in databases is a process aiming at extracting a reduced set of valuable knowledge from a huge amount of data. Data mining, one step of this process, includes a number of tasks, such as clustering, classification, association rule mining, etc. The problem of mining association rules requires a step of frequent pattern extraction. We distinguish several categories of frequent patterns: classical patterns, fuzzy patterns, gradual patterns, sequential patterns, etc. All these patterns differ in the type of data from which the extraction is done and in the type of relationship they represent. In this thesis, we contribute the proposal of fuzzy and gradual pattern extraction methods. Indeed, we define new closure systems of the Galois connection for, respectively, fuzzy and gradual patterns. Thus, we propose algorithms for extracting a reduced set of fuzzy and gradual patterns. We also propose two approaches for automatically defining the fuzzy modalities that allow obtaining relevant fuzzy gradual patterns. Based on closed fuzzy and closed gradual patterns, we define generic bases of fuzzy and gradual association rules. We then propose a complete and valid inference system to derive all fuzzy and gradual association rules from these bases.
5

Novi, Daniele. "Knowledge management and Discovery for advanced Enterprise Knowledge Engineering." Doctoral thesis, Universita degli studi di Salerno, 2014. http://hdl.handle.net/10556/1466.

Abstract:
2012 - 2013
The research work mainly addresses issues related to the adoption of models, methodologies and knowledge management tools that make pervasive use of the latest Semantic Web technologies for the improvement of business processes and Enterprise 2.0 applications. The first phase of the research focused on the study and analysis of the state of the art, and of the problems, of Knowledge Discovery in databases, paying particular attention to data mining systems. The most innovative approaches investigated for "Enterprise Knowledge Engineering" are listed below. In detail, the problems analyzed are those relating to architectural aspects and the integration of legacy systems (or otherwise). The intended research contribution consists in the identification and definition of a uniform and general model, a "Knowledge Enterprise Model", which is original with respect to the canonical approaches of enterprise architecture (for example, with respect to the Object Management Group (OMG) standard). The introduction of the tools and principles of Enterprise 2.0 into the company has been investigated and, simultaneously, appropriate Semantic Enterprise-based solutions have been defined for the problem of information fragmentation and for improving the processes of knowledge discovery and functional knowledge sharing. All studies and analyses are finalized and validated by defining a methodology and related supporting software tools for the improvement of processes related to the life cycles of best practices across the enterprise. Collaborative tools, knowledge modeling, algorithms, and knowledge discovery and extraction are applied synergistically to support these processes. [edited by author]
6

Dao, Ngoc Bich. "Réduction de dimension de sac de mots visuels grâce à l’analyse formelle de concepts." Thesis, La Rochelle, 2017. http://www.theses.fr/2017LAROS010/document.

Abstract:
In several scientific fields such as statistics, computer vision and machine learning, the reduction of redundant and/or irrelevant information in the data description (dimension reduction) is an important step. This process comprises two different categories: feature extraction and feature selection, of which feature selection in unsupervised learning is hitherto an open question. In this manuscript, we discuss feature selection on image datasets using Formal Concept Analysis (FCA), with a focus on the lattice structure and lattice theory. The images in a dataset are described as sets of visual words by the bag-of-visual-words model. Two algorithms are proposed in this thesis to select relevant features, and they can be used in both unsupervised and supervised learning. The first algorithm, RedAttsSansPerte, is based on the lattice structure and lattice theory, which ensure its ability to remove redundant features using the precedence graph. The formal definition of the precedence graph is given in this thesis. We also demonstrate its properties and the relationship between this graph and the AC-poset. Results from experiments indicate that the RedAttsSansPerte algorithm reduces the size of the feature set while maintaining performance under evaluation by classification. Secondly, the RedAttsFloue algorithm, an extension of RedAttsSansPerte, is also proposed. This extension uses the fuzzy precedence graph. The formal definition and the properties of this graph are demonstrated in this manuscript. The RedAttsFloue algorithm removes redundant and irrelevant features while retaining relevant information, according to the flexibility threshold of the fuzzy precedence graph. The quality of the relevant information is evaluated by classification. The RedAttsFloue algorithm is suggested to be more robust than RedAttsSansPerte in terms of reduction.
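The precedence graph described in the abstract relates two attributes (visual words) when the set of objects carrying one is included in the set carrying the other. A hedged toy sketch of just that graph, with invented data (the RedAttsSansPerte reduction itself, which keeps only lattice-irreducible attributes, is not reproduced here):

```python
# Build the precedence graph of a toy context: attribute -> set of
# objects (images) exhibiting it. Data below is illustrative only.

context = {
    "w1": {1, 2, 3},
    "w2": {1, 2},      # extent(w2) is contained in extent(w1)
    "w3": {2, 3},      # extent(w3) is contained in extent(w1)
}

def precedence_edges(ctx):
    """Edge (a, b) whenever extent(a) is a subset of extent(b), a != b."""
    return {(a, b) for a in ctx for b in ctx
            if a != b and ctx[a] <= ctx[b]}

edges = precedence_edges(context)
```

Attributes whose extents nest inside another attribute's extent are the candidates such a reduction can inspect for removal without disturbing the concept lattice's structure.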
7

Diner, Casri. "Visualizing Data With Formal Concept Analysis." Master's thesis, METU, 2003. http://etd.lib.metu.edu.tr/upload/1046325/index.pdf.

Abstract:
In this thesis, we wanted to stress the tendency toward the geometry of data. This should be applicable in almost every branch of science where data are of great importance, and also in every kind of industry, economy, medicine, etc. Since the hard-disk capacities used for storing data and the amount of data reachable through the internet are increasing day by day, there is a need to turn this information into knowledge. This is one of the reasons for studying formal concept analysis. We wanted to point out how this application is related to algebra and logic. The beginning of the first chapter emphasizes the relation between closure systems, Galois connections, lattice theory as a mathematical structure, and concept analysis. It then describes the basic step in the formalization: an elementary form of the representation of data is defined mathematically. The second chapter explains the logic of formal concept analysis. It also shows how implications between attributes, which can be regarded as special formulas on a set, can be expressed by fewer implications, a so-called generating set of implications. These mathematical tools are then used in the last chapter to describe complex 'concept' lattices by means of decomposition methods in examples.
8

Krajča, Petr. "Advanced algorithms for formal concept analysis." Diss., Online access via UMI:, 2009.

Abstract:
Thesis (Ph. D.)--State University of New York at Binghamton, Thomas J. Watson School of Engineering and Applied Science, Department of Systems Science and Industrial Engineering, 2009.
Includes bibliographical references.
9

Petersen, Wiebke, and Petja Heinrich. "Qualitative Citation Analysis Based on Formal Concept Analysis." Universitätsbibliothek Chemnitz, 2008. http://nbn-resolving.de/urn:nbn:de:bsz:ch1-200801464.

Abstract:
One of the tasks of bibliometrics is citation analysis (Kessler 1963), that is, the analysis of co-citations (two texts are co-cited if there is a text in which both are cited) and of bibliographic coupling (two texts are bibliographically coupled if they share a common citation). The talk shows that Formal Concept Analysis (FCA) provides suitable means for a qualitative citation analysis. A particular property of FCA is that it allows the combination of heterogeneous (qualitative and scalar) attributes. By using suitable scales, one can also counter the problem that the large number of texts to be analyzed in qualitative approaches usually leads to unmanageable citation graphs whose content cannot be grasped. The bibliographic-coupling relation is closely related to the neighborhood contexts developed by Priss, which are used for the analysis of lexicons. Using several example analyses, the most important notions of citation analysis are modeled in formal contexts and concept lattices. It turns out that the hierarchical concept lattices of FCA are superior to ordinary citation graphs in many respects, since their hierarchical lattice structure captures certain regularities explicitly. It is also shown how, by combining suitable attributes (doctoral advisor, institute, department, citation frequency, keywords) and scales, frequent sources of error such as courtesy citations, habitual citations, and so on can be countered.
10

Sertkaya, Baris. "Formal Concept Analysis Methods for Description Logics." Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2008. http://nbn-resolving.de/urn:nbn:de:bsz:14-ds-1215598189927-85390.

Abstract:
This work presents mainly two contributions to Description Logics (DLs) research by means of Formal Concept Analysis (FCA) methods: supporting bottom-up construction of DL knowledge bases, and completing DL knowledge bases. Its contribution to FCA research is on the computational complexity of computing generators of closed sets.

Books on the topic "Fuzzy formal concept analysis"

1

Braud, Agnès, Aleksey Buzmakov, Tom Hanika, and Florence Le Ber, eds. Formal Concept Analysis. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-77867-5.

2

Cristea, Diana, Florence Le Ber, and Baris Sertkaya, eds. Formal Concept Analysis. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-21462-3.

3

Kwuida, Léonard, and Barış Sertkaya, eds. Formal Concept Analysis. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-11928-6.

4

Ferré, Sébastien, and Sebastian Rudolph, eds. Formal Concept Analysis. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-01815-2.

5

Cellier, Peggy, Felix Distel, and Bernhard Ganter, eds. Formal Concept Analysis. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-38317-5.

6

Domenach, Florent, Dmitry I. Ignatov, and Jonas Poelmans, eds. Formal Concept Analysis. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-29892-9.

7

Valtchev, Petko, and Robert Jäschke, eds. Formal Concept Analysis. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-20514-9.

8

Glodeanu, Cynthia Vera, Mehdi Kaytoue, and Christian Sacarea, eds. Formal Concept Analysis. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-07248-7.

9

Ganter, Bernhard, and Rudolf Wille. Formal Concept Analysis. Berlin, Heidelberg: Springer Berlin Heidelberg, 1999. http://dx.doi.org/10.1007/978-3-642-59830-2.

10

Medina, Raoul, and Sergei Obiedkov, eds. Formal Concept Analysis. Berlin, Heidelberg: Springer Berlin Heidelberg, 2008. http://dx.doi.org/10.1007/978-3-540-78137-0.


Book chapters on the topic "Fuzzy formal concept analysis"

1

Macko, Juraj. "User-Friendly Fuzzy FCA." In Formal Concept Analysis, 156–71. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-38317-5_10.

2

Bělohlávek, Radim, Vladimír Sklenář, and Jiří Zacpal. "Crisply Generated Fuzzy Concepts." In Formal Concept Analysis, 269–84. Berlin, Heidelberg: Springer Berlin Heidelberg, 2005. http://dx.doi.org/10.1007/978-3-540-32262-7_19.

3

Brito, Abner, Laécio Barros, Estevão Laureano, Fábio Bertato, and Marcelo Coniglio. "Fuzzy Formal Concept Analysis." In Communications in Computer and Information Science, 192–205. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-95312-0_17.

4

Bělohlávek, Radim, and Vilém Vychodil. "Attribute Implications in a Fuzzy Setting." In Formal Concept Analysis, 45–60. Berlin, Heidelberg: Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11671404_3.

5

García-Pardo, F., I. P. Cabrera, P. Cordero, and M. Ojeda-Aciego. "On Closure Systems and Adjunctions Between Fuzzy Preordered Sets." In Formal Concept Analysis, 114–27. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-19545-2_7.

6

Antoni, Lubomir, Stanislav Krajči, and Ondrej Krídlo. "Randomized Fuzzy Formal Contexts and Relevance of One-Sided Concepts." In Formal Concept Analysis, 183–99. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-19545-2_12.

7

Konecny, Jan. "Bonds Between L-Fuzzy Contexts Over Different Structures of Truth-Degrees." In Formal Concept Analysis, 81–96. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-19545-2_5.

8

Liu, Xiaodong, and Witold Pedrycz. "AFS Formal Concept and AFS Fuzzy Formal Concept Analysis." In Axiomatic Fuzzy Set Theory and Its Applications, 303–49. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-00402-5_8.

9

Cordero, Pablo, Manuel Enciso, and Angel Mora. "Directness in Fuzzy Formal Concept Analysis." In Communications in Computer and Information Science, 585–95. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-91473-2_50.

10

Djouadi, Yassine, and Henri Prade. "Interval-Valued Fuzzy Formal Concept Analysis." In Lecture Notes in Computer Science, 592–601. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-04125-9_62.


Conference papers on the topic "Fuzzy formal concept analysis"

1

Golinska-Pilarek, Joanna, and Ewa Orlowska. "Relational Reasoning in Formal Concept Analysis." In 2007 IEEE International Fuzzy Systems Conference. IEEE, 2007. http://dx.doi.org/10.1109/fuzzy.2007.4295512.

2

Zhou, Lei. "Formal concept analysis in intuitionistic fuzzy formal context." In 2010 Seventh International Conference on Fuzzy Systems and Knowledge Discovery (FSKD). IEEE, 2010. http://dx.doi.org/10.1109/fskd.2010.5569658.

3

Shen, Lili, and Dexue Zhang. "Formal concept analysis on fuzzy sets." In 2013 Joint IFSA World Congress and NAFIPS Annual Meeting (IFSA/NAFIPS). IEEE, 2013. http://dx.doi.org/10.1109/ifsa-nafips.2013.6608402.

4

Kridlo, Ondrej, and Manuel Ojeda-Aciego. "Extending formal concept analysis using intuitionistic L-fuzzy sets." In 2017 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE). IEEE, 2017. http://dx.doi.org/10.1109/fuzz-ieee.2017.8015570.

5

Cross, Valerie V., and Wenting Yi. "Formal concept analysis for ontologies and their annotation files." In 2008 IEEE 16th International Conference on Fuzzy Systems (FUZZ-IEEE). IEEE, 2008. http://dx.doi.org/10.1109/fuzzy.2008.4630646.

6

Hashemi, R. R., S. De Agostino, B. Westgeest, and J. R. Talburt. "Data granulation and formal concept analysis." In IEEE Annual Meeting of the Fuzzy Information Processing Society (NAFIPS '04). IEEE, 2004. http://dx.doi.org/10.1109/nafips.2004.1336253.

7

Zhou, Wen, Zongtian Liu, and Yan Zhao. "Concept Hierarchies Generation for Classification using Fuzzy Formal Concept Analysis." In Eighth ACIS International Conference on Software Engineering, Artificial Intelligence, Networking, and Parallel/Distributed Computing (SNPD 2007). IEEE, 2007. http://dx.doi.org/10.1109/snpd.2007.229.

8

Belohlavek, Radim, and Jan Konecny. "Scaling, Granulation, and Fuzzy Attributes in Formal Concept Analysis." In 2007 IEEE International Fuzzy Systems Conference. IEEE, 2007. http://dx.doi.org/10.1109/fuzzy.2007.4295488.

9

Yao, Y. Y., and Yaohua Chen. "Rough set approximations in formal concept analysis." In IEEE Annual Meeting of the Fuzzy Information Processing Society (NAFIPS '04). IEEE, 2004. http://dx.doi.org/10.1109/nafips.2004.1336252.

10

Liu, Jun, and Xiaoqiu Yao. "Formal concept analysis of incomplete information system." In 2010 Seventh International Conference on Fuzzy Systems and Knowledge Discovery (FSKD). IEEE, 2010. http://dx.doi.org/10.1109/fskd.2010.5569661.


Reports on the topic "Fuzzy formal concept analysis"

1

Baader, Franz, Bernhard Ganter, Ulrike Sattler, and Barış Sertkaya. Completing Description Logic Knowledge Bases using Formal Concept Analysis. Aachen University of Technology, 2006. http://dx.doi.org/10.25368/2022.155.

Abstract:
We propose an approach for extending both the terminological and the assertional part of a Description Logic knowledge base by using information provided by the assertional part and by a domain expert. The use of techniques from Formal Concept Analysis ensures that, on the one hand, the interaction with the expert is kept to a minimum, and, on the other hand, we can show that the extended knowledge base is complete in a certain sense.
2

Baader, Franz, and Felix Distel. A finite basis for the set of EL-implications holding in a finite model. Technische Universität Dresden, 2007. http://dx.doi.org/10.25368/2022.160.

Abstract:
Formal Concept Analysis (FCA) can be used to analyze data given in the form of a formal context. In particular, FCA provides efficient algorithms for computing a minimal basis of the implications holding in the context. In this paper, we extend classical FCA by considering data that are represented by relational structures rather than formal contexts, and by replacing atomic attributes by complex formulae defined in some logic. After generalizing some of the FCA theory to this more general form of contexts, we instantiate the general framework with attributes defined in the Description Logic (DL) EL, and with relational structures over a signature of unary and binary predicates, i.e., models for EL. In this setting, an implication corresponds to a so-called general concept inclusion axiom (GCI) in EL. The main technical result of this report is that, in EL, for any finite model there is a finite set of implications (GCIs) holding in this model from which all implications (GCIs) holding in the model follow.
3

Baader, Franz, and Felix Distel. Exploring finite models in the Description Logic ELgfp. Technische Universität Dresden, 2008. http://dx.doi.org/10.25368/2022.168.

Abstract:
In a previous ICFCA paper we have shown that, in the Description Logics EL and ELgfp, the set of general concept inclusions holding in a finite model always has a finite basis. In this paper, we address the problem of how to compute this basis efficiently, by adapting methods from formal concept analysis.
4

Borchmann, Daniel. A General Form of Attribute Exploration. Technische Universität Dresden, 2013. http://dx.doi.org/10.25368/2022.192.

Abstract:
We present a general form of attribute exploration, a knowledge completion algorithm from formal concept analysis. The aim of this generalization is to extend the applicability of attribute exploration by a general description. Additionally, this may also allow for viewing different existing variants of attribute exploration as instances of a general form, as for example exploration on partial contexts.
5

Borchmann, Daniel. Exploration by Confidence. Technische Universität Dresden, 2013. http://dx.doi.org/10.25368/2022.194.

Abstract:
Within formal concept analysis, attribute exploration is a powerful tool to semiautomatically check data for completeness with respect to a given domain. However, the classical formulation of attribute exploration does not take into account possible errors which are present in the initial data. We present in this work a generalization of attribute exploration based on the notion of confidence, which will allow for the exploration of implications which are not necessarily valid in the initial data, but instead enjoy a minimal confidence therein.
6

Soloviev, Volodymyr, and V. Saptsin. Heisenberg uncertainty principle and economic analogues of basic physical quantities. Transport and Telecommunication Institute, 2011. http://dx.doi.org/10.31812/0564/1188.

Abstract:
From positions, attained by modern theoretical physics in understanding of the universe bases, the methodological and philosophical analysis of fundamental physical concepts and their formal and informal connections with the real economic measuring is carried out. Procedures for heterogeneous economic time determination, normalized economic coordinates and economic mass are offered, based on the analysis of time series, the concept of economic Planck's constant has been proposed. The theory has been approved on the real economic dynamics time series, including stock indices, Forex and spot prices, the achieved results are open for discussion.
7

Soloviev, V., V. Solovieva, and V. Saptsin. Heisenberg uncertainty principle and economic analogues of basic physical quantities. Brama-Ukraina, 2014. http://dx.doi.org/10.31812/0564/1306.

Abstract:
From positions, attained by modern theoretical physics in understanding of the universe bases, the methodological and philosophical analysis of fundamental physical concepts and their formal and informal connections with the real economic measurements is carried out. Procedures for heterogeneous economic time determination, normalized economic coordinates and economic mass are offered, based on the analysis of time series, the concept of economic Planck's constant has been proposed. The theory has been approved on the real economic dynamics time series, including stock indices, Forex and spot prices, the achieved results are open for discussion.
8

Soloviev, V., and V. Solovieva. Quantum econophysics of cryptocurrencies crises. [n.p.], 2018. http://dx.doi.org/10.31812/0564/2464.

Abstract:
From positions, attained by modern theoretical physics in understanding of the universe bases, the methodological and philosophical analysis of fundamental physical concepts and their formal and informal connections with the real economic measuring is carried out. Procedures for heterogeneous economic time determination, normalized economic coordinates and economic mass are offered, based on the analysis of time series, the concept of economic Planck's constant has been proposed. The theory has been approved on the real economic dynamics time series, related to the cryptocurrencies market, the achieved results are open for discussion. Then, combining the empirical cross-correlation matrix with the random matrix theory, we mainly examine the statistical properties of cross-correlation coefficient, the evolution of average correlation coefficient, the distribution of eigenvalues and corresponding eigenvectors of the global cryptocurrency market using the daily returns of 15 cryptocurrencies price time series across the world from 2016 to 2018. The result indicated that the largest eigenvalue reflects a collective effect of the whole market, practically coincides with the dynamics of the mean value of the correlation coefficient and is very sensitive to the crisis phenomena. It is shown that both the introduced economic mass and the largest eigenvalue of the matrix of correlations can serve as quantum indicator-predictors of crises in the market of cryptocurrencies.
9

Kelly, Luke. Definitions, Characteristics and Monitoring of Conflict Economies. Institute of Development Studies (IDS), February 2022. http://dx.doi.org/10.19088/k4d.2022.024.

Abstract:
The idea of conflict economies is a broad concept encompassing several research angles. Definitions differ according to these focuses. Some of the main uses of the concept are to understand: • economic analysis of the motives for and likelihood of war • financing of state and non-state belligerents • how the continuation of conflicts can be explained by rational motives including economic ones • how conflict affects economic activity, and how conflict parties and citizens adapt Some distinctive characteristics of war economies are (Ballentine & Nitzschke, 2005, p. 12): • They involve the destruction or circumvention of the formal economy and the growth of informal and black markets, • Pillage, predation, extortion, and deliberate violence against civilians is used by combatants to acquire control over lucrative assets, capture trade networks and diaspora remittances, and exploit labour; • War economies are highly decentralised and privatised, both in the means of coercion and in the means of production and exchange; • Combatants increasingly rely on the licit or illicit exploitation of / trade in lucrative natural resources • They thrive on cross-border trading networks, regional kin and ethnic groups, arms traffickers, and mercenaries, as well as legally operating commercial entities, each of which may have a vested interest in the continuation of conflict and instability. The first section of this rapid review outlines the evolution of the term and key definitions. Most of this discussion occurs in the academic literature around the early 2000s. The second looks at key characteristics of conflict economies identified in the literature, with examples where possible from both academic and grey literature. The third section briefly identifies methodologies used to measure and monitor conflict economies, as well as some current research and programmes on conflict economies, from academic literature as well as NGOs and other sources. 
The findings have been derived via a literature search and advice from experts in the field. Given time constraints, the report is not comprehensive. The review is gender- and disability blind.
10

Modlo, Yevhenii O., Serhiy O. Semerikov, Stanislav L. Bondarevskyi, Stanislav T. Tolmachev, Oksana M. Markova, and Pavlo P. Nechypurenko. Methods of using mobile Internet devices in the formation of the general scientific component of bachelor in electromechanics competency in modeling of technical objects. [n.p.], February 2020. http://dx.doi.org/10.31812/123456789/3677.

Abstract:
An analysis of the experience of professional training bachelors of electromechanics in Ukraine and abroad made it possible to determine that one of the leading trends in its modernization is the synergistic integration of various engineering branches (mechanical, electrical, electronic engineering and automation) in mechatronics for the purpose of design, manufacture, operation and maintenance electromechanical equipment. Teaching mechatronics provides for the meaningful integration of various disciplines of professional and practical training bachelors of electromechanics based on the concept of modeling and technological integration of various organizational forms and teaching methods based on the concept of mobility. Within this approach, the leading learning tools of bachelors of electromechanics are mobile Internet devices (MID) – multimedia mobile devices that provide wireless access to information and communication Internet services for collecting, organizing, storing, processing, transmitting, presenting all kinds of messages and data. The authors reveal the main possibilities of using MID in learning to ensure equal access to education, personalized learning, instant feedback and evaluating learning outcomes, mobile learning, productive use of time spent in classrooms, creating mobile learning communities, support situated learning, development of continuous seamless learning, ensuring the gap between formal and informal learning, minimize educational disruption in conflict and disaster areas, assist learners with disabilities, improve the quality of the communication and the management of institution, and maximize the cost-efficiency.
Bachelor of electromechanics competency in modeling of technical objects is a personal and vocational ability, which includes a system of knowledge, skills, experience in learning and research activities on modeling mechatronic systems and a positive value attitude towards it; bachelor of electromechanics should be ready and able to use methods and software/hardware modeling tools for processes analyzes, systems synthesis, evaluating their reliability and effectiveness for solving practical problems in professional field. The competency structure of the bachelor of electromechanics in the modeling of technical objects is reflected in three groups of competencies: general scientific, general professional and specialized professional. The implementation of the technique of using MID in learning bachelors of electromechanics in modeling of technical objects is the appropriate methodic of using, the component of which is partial methods for using MID in the formation of the general scientific component of the bachelor of electromechanics competency in modeling of technical objects, are disclosed by example academic disciplines “Higher mathematics”, “Computers and programming”, “Engineering mechanics”, “Electrical machines”. The leading tools of formation of the general scientific component of bachelor in electromechanics competency in modeling of technical objects are augmented reality mobile tools (to visualize the objects’ structure and modeling results), mobile computer mathematical systems (universal tools used at all stages of modeling learning), cloud based spreadsheets (as modeling tools) and text editors (to make the program description of model), mobile computer-aided design systems (to create and view the physical properties of models of technical objects) and mobile communication tools (to organize a joint activity in modeling).