Journal articles on the topic 'Graph-logical models'




Consult the top 47 journal articles for your research on the topic 'Graph-logical models.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of each academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Liu, Qi, Dani Yogatama, and Phil Blunsom. "Relational Memory-Augmented Language Models." Transactions of the Association for Computational Linguistics 10 (2022): 555–72. http://dx.doi.org/10.1162/tacl_a_00476.

Abstract:
Abstract We present a memory-augmented approach to condition an autoregressive language model on a knowledge graph. We represent the graph as a collection of relation triples and retrieve relevant relations for a given context to improve text generation. Experiments on WikiText-103, WMT19, and enwik8 English datasets demonstrate that our approach produces a better language model in terms of perplexity and bits per character. We also show that relational memory improves coherence, is complementary to token-based memory, and enables causal interventions. Our model provides a simple yet effective way to combine an autoregressive language model and a knowledge graph for more coherent and logical generation.
2

Kovács, Tibor, Gábor Simon, and Gergely Mezei. "Benchmarking Graph Database Backends—What Works Well with Wikidata?" Acta Cybernetica 24, no. 1 (May 21, 2019): 43–60. http://dx.doi.org/10.14232/actacyb.24.1.2019.5.

Abstract:
Knowledge bases often use a graph as their logical model. RDF-based knowledge bases (KBs) are prime examples, as RDF (Resource Description Framework) uses a graph as its logical model. Graph databases are an emerging breed of NoSQL-type databases that also offer a graph as the logical model. Although there are specialized databases, the so-called triple stores, for storing RDF data, graph databases can also be promising candidates for storing knowledge. In this paper, we benchmark different graph database implementations loaded with Wikidata, a real-life, large-scale knowledge base. Graph databases come in all shapes and sizes and offer different APIs and graph models, hence we used a measurement system that can abstract away the API differences. For the modeling aspect, we made measurements with different graph encodings previously suggested in the literature in order to observe the impact of the encoding on overall performance.
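As a toy illustration of the encoding question this abstract raises, the following Python sketch (with a couple of Wikidata-style identifiers chosen for flavour; this is not the paper's benchmark code) maps RDF triples onto a property-graph structure of nodes and typed edges — one of several possible encodings whose performance impact such a benchmark can measure.

```python
# Hypothetical sketch: one common encoding of RDF triples into a property
# graph maps each subject/object IRI to a node and each predicate to a
# typed edge between them.

rdf_triples = [
    ("wd:Q42", "wdt:P31", "wd:Q5"),        # (Douglas Adams, instance of, human)
    ("wd:Q42", "wdt:P800", "wd:Q25169"),   # (Douglas Adams, notable work, ...)
]

nodes, edges = {}, []
for s, p, o in rdf_triples:
    for iri in (s, o):
        nodes.setdefault(iri, {"iri": iri})          # one node per IRI
    edges.append({"from": s, "type": p, "to": o})    # one typed edge per triple

print(len(nodes), "nodes,", len(edges), "edges")     # -> 3 nodes, 2 edges
```

Alternative encodings — for example, reifying each triple as its own node — trade node count against traversal depth, which is exactly the kind of modeling difference whose performance impact the paper measures.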
3

Shao, Bo, Yeyun Gong, Weizhen Qi, Guihong Cao, Jianshu Ji, and Xiaola Lin. "Graph-Based Transformer with Cross-Candidate Verification for Semantic Parsing." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 05 (April 3, 2020): 8807–14. http://dx.doi.org/10.1609/aaai.v34i05.6408.

Abstract:
In this paper, we present a graph-based Transformer for semantic parsing. We separate the semantic parsing task into two steps: 1) Use a sequence-to-sequence model to generate the logical form candidates. 2) Design a graph-based Transformer to rerank the candidates. To handle the structure of logical forms, we incorporate graph information to Transformer, and design a cross-candidate verification mechanism to consider all the candidates in the ranking process. Furthermore, we integrate BERT into our model and jointly train the graph-based Transformer and BERT. We conduct experiments on 3 semantic parsing benchmarks, ATIS, JOBS and Task Oriented semantic Parsing dataset (TOP). Experiments show that our graph-based reranking model achieves results comparable to state-of-the-art models on the ATIS and JOBS datasets. And on the TOP dataset, our model achieves a new state-of-the-art result.
4

Ivannikov, A. D., and A. L. Stempkovskiy. "Iterative Methods for Solving Systems of Multi-Valued Logical Equations in the Simulation of Object Control Digital Systems." Mekhatronika, Avtomatizatsiya, Upravlenie 21, no. 9 (September 7, 2020): 511–20. http://dx.doi.org/10.17587/mau.21.511-520.

Abstract:
The article analyzes methods for solving systems of multi-valued logical equations by iteration. Iterative methods for solving such systems are a mathematical description of the main process of functional-logical simulation, which is used at the design stage of digital object-control systems to verify the correctness of the design. Multi-valued logical signals at the outputs of blocks and elements of digital systems are considered because, in some cases, a several-valued representation of logical signals is used to analyze the correctness of timing relationships when simulating digital hardware, and because logical elements implementing four-or-more-valued logic have recently been developed. Based on an analysis of the structure of the system of logical equations used in digital hardware simulation, using graph and logical models, the existence of solutions and their number are analyzed. Simple and generalized iteration methods are examined, and a relationship is shown between the number of solutions of the system of equations and its graph representation, which reflects a given circuit connecting the elements of the digital control system's hardware. For the generalized iteration method, options with different structures of the iteration trace are considered; in particular, it is shown that, with a certain structure of the iteration trace, the generalized iteration turns into simple iteration or Seidel iteration. It is shown that generalized iteration most adequately describes the process of simulating the switching of logical signals in a simulated circuit of digital control system hardware. The correspondence between various options of functional-logical simulation of digital systems and the iterative methods used to solve systems of logical equations is shown.
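To make the difference between the iteration schemes concrete, here is a minimal Python sketch (an illustration under our own assumptions, not the authors' formulation): a small system of multi-valued logical equations is solved from the 'unknown' initial state by simple (Jacobi-style) iteration and by Seidel iteration.

```python
# Illustrative sketch (not the authors' code): the system of multi-valued
# logical equations x1 = NOT x0, x2 = NOT x1, x3 = NOT x2 over a 4-valued
# alphabet with 'X' standing for "unknown", solved by fixed-point iteration.
# seidel=False gives simple (Jacobi) iteration; seidel=True reuses values
# already updated within the current sweep.

def not4(a):
    """4-valued NOT: defined on {'0','1'}; unknown stays unknown."""
    return {"0": "1", "1": "0"}.get(a, "X")

def sweep(state, seidel):
    read = state if seidel else dict(state)  # Jacobi reads a frozen snapshot
    state["x1"] = not4(read["x0"])
    state["x2"] = not4(read["x1"])
    state["x3"] = not4(read["x2"])

def solve(seidel, max_sweeps=20):
    state = {"x0": "1", "x1": "X", "x2": "X", "x3": "X"}  # x0 is a fixed input
    for n in range(1, max_sweeps + 1):
        before = dict(state)
        sweep(state, seidel)
        if state == before:          # fixed point: a solution of the system
            return state, n
    return state, max_sweeps

print(solve(seidel=False))  # Jacobi: values settle one layer per sweep (4 sweeps)
print(solve(seidel=True))   # Seidel: updates propagate within a sweep (2 sweeps)
```

On this inverter chain, the Seidel sweep propagates freshly updated values within a single pass and reaches the fixed point in fewer sweeps, echoing the abstract's point that the structure of the iteration determines how signal switching is simulated.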
5

Gates, Alexander J., Rion Brattig Correia, Xuan Wang, and Luis M. Rocha. "The effective graph reveals redundancy, canalization, and control pathways in biochemical regulation and signaling." Proceedings of the National Academy of Sciences 118, no. 12 (March 18, 2021): e2022598118. http://dx.doi.org/10.1073/pnas.2022598118.

Abstract:
The ability to map causal interactions underlying genetic control and cellular signaling has led to increasingly accurate models of the complex biochemical networks that regulate cellular function. These network models provide deep insights into the organization, dynamics, and function of biochemical systems: for example, by revealing genetic control pathways involved in disease. However, the traditional representation of biochemical networks as binary interaction graphs fails to accurately represent an important dynamical feature of these multivariate systems: some pathways propagate control signals much more effectively than do others. Such heterogeneity of interactions reflects canalization—the system is robust to dynamical interventions in redundant pathways but responsive to interventions in effective pathways. Here, we introduce the effective graph, a weighted graph that captures the nonlinear logical redundancy present in biochemical network regulation, signaling, and control. Using 78 experimentally validated models derived from systems biology, we demonstrate that 1) redundant pathways are prevalent in biological models of biochemical regulation, 2) the effective graph provides a probabilistic but precise characterization of multivariate dynamics in a causal graph form, and 3) the effective graph provides an accurate explanation of how dynamical perturbation and control signals, such as those induced by cancer drug therapies, propagate in biochemical pathways. Overall, our results indicate that the effective graph provides an enriched description of the structure and dynamics of networked multivariate causal interactions. We demonstrate that it improves explainability, prediction, and control of complex dynamical systems in general and biochemical regulation in particular.
6

Singh, Pritpal, Gaurav Dhiman, and Amandeep Kaur. "A quantum approach for time series data based on graph and Schrödinger equations methods." Modern Physics Letters A 33, no. 35 (November 19, 2018): 1850208. http://dx.doi.org/10.1142/s0217732318502085.

Abstract:
The supremacy of the quantum approach lies in its ability to solve problems that are not practically feasible on classical machines. It offers a significant speed-up of simulations and decreases error rates. This paper introduces a new quantum model for time series data that depends on an appropriate length of intervals. To provide an effective solution to this problem, this study suggests a new graph-based quantum approach. This technique is useful in the discretization and representation of logical relationships. We then divide these logical relations into various groups to obtain efficient results. The proposed model is verified and validated against various approaches. Experimental results show that the proposed model is more precise than existing competing models.
7

Zhang, Jindou, and Jing Li. "Enhanced Knowledge Graph Embedding by Jointly Learning Soft Rules and Facts." Algorithms 12, no. 12 (December 10, 2019): 265. http://dx.doi.org/10.3390/a12120265.

Abstract:
Combining first order logic rules with a Knowledge Graph (KG) embedding model has recently gained increasing attention, as rules introduce rich background information. Among such studies, models equipped with soft rules, which are extracted with certain confidences, achieve state-of-the-art performance. However, the existing methods either cannot support the transitivity and composition rules or take soft rules as regularization terms to constrain derived facts, which is incapable of encoding the logical background knowledge about facts contained in soft rules. In addition, previous works performed one time logical inference over rules to generate valid groundings for modeling rules, ignoring forward chaining inference, which can further generate more valid groundings to better model rules. To these ends, this paper proposes Soft Logical rules enhanced Embedding (SoLE), a novel KG embedding model equipped with a joint training algorithm over soft rules and KG facts to inject the logical background knowledge of rules into embeddings, as well as forward chaining inference over rules. Evaluations on Freebase and DBpedia show that SoLE not only achieves improvements of 11.6%/5.9% in Mean Reciprocal Rank (MRR) and 18.4%/15.9% in HITS@1 compared to the model on which SoLE is based, but also significantly and consistently outperforms the state-of-the-art baselines in the link prediction task.
8

Loganathan, MK, and OP Gandhi. "Reliability enhancement of manufacturing systems through functions." Proceedings of the Institution of Mechanical Engineers, Part B: Journal of Engineering Manufacture 231, no. 10 (November 14, 2015): 1850–68. http://dx.doi.org/10.1177/0954405415612324.

Abstract:
A reliable system is one that is able to perform its intended functions; therefore, ensuring the performance of its required functions helps to enhance its reliability. A manufacturing system (e.g., computer numerical control machines) has a large number of functions, which complicates analysis and makes it difficult. In this article, a logical and systems approach based on graph theory, which is effective in eliminating such difficulties, is employed. The graph-theoretic models consider the system structure explicitly and are applied to model functions at various hierarchical levels of a manufacturing system. These function digraph models are analysed using a matrix approach to examine cause and effect, which helps to evaluate the importance of each function and hence provides direction for system reliability enhancement. A step-by-step methodology is presented and illustrated with an example of a manufacturing system: a computer numerical control drilling machine.
9

Lv, Qingna, Yanyun Zhang, Yanyan Li, and Yang Yu. "Research on a Health Care Personnel Training Model Based on Multilayered Knowledge Mapping for the Integration of Nursing Courses and Examinations." Journal of Healthcare Engineering 2022 (February 9, 2022): 1–10. http://dx.doi.org/10.1155/2022/3826413.

Abstract:
While nursing courses provide a convenient and quick way to learn, they can also be overloaded with resources, which can cause learners to become cognitively disoriented or have difficulty choosing a nursing course. This paper proposes to fully explore learners' interests under sparse data by fusing knowledge graph technology with deep recommendation models, and adopts a knowledge graph to model nursing courses at the semantic level, so as to map the set of nursing courses onto the knowledge graph and address the lack of logical knowledge relationships. Due to the specificity of its positions, the nursing profession must accurately position the nursing curriculum standards when determining the talent cultivation model, based on nursing positions and the admission requirements for nursing practice qualification. Through linear feature mining based on the knowledge graph, entities and relationships are used to intuitively display the interest paths of nursing learners and enhance the interpretability of recommendations.
10

Kadyrov, Amanulla, and Amir Kadyrov. "Foundations of General Theory of Discrete Dynamic, Relay and Logical-Dynamic Systems Based on Physical Decomposition and Graph Models." Vestnik Volgogradskogo gosudarstvennogo universiteta. Serija 10. Innovatcionnaia deiatel’nost’, no. 2 (May 2015): 80–89. http://dx.doi.org/10.15688/jvolsu10.2015.2.8.

11

Molnár, Bálint, and András Benczúr. "The Application of Directed Hyper-Graphs for Analysis of Models of Information Systems." Mathematics 10, no. 5 (February 27, 2022): 759. http://dx.doi.org/10.3390/math10050759.

Abstract:
Hyper-graphs offer the opportunity to formulate logical statements about their components, for example, using Horn clauses. Several models of Information Systems, such as workflows, i.e., business processes, can be represented using hyper-graphs. During the modeling of Information Systems, many constraints should be maintained throughout the development process. The models of Information Systems are complex objects; for this reason, the analysis of algorithms and graph structures that can support the consistency and integrity of models is an essential issue. A set of interdependencies between models and components of the architecture can be formulated by functional dependencies and investigated via algorithmic methods. Information Systems can be perceived as overarching documents that include data collections; documents to be processed; and representations of business processes, activities, and services. When selecting and working out an appropriate method of encoding artifacts in Information Systems, the complex structure can be represented using hyper-graphs. This representation enables the application of various model-checking, verification, and validation tools that are based on formal approaches. This paper describes the proposed hyper-graph representations in different situations, along with the formal, algorithmic model-checking methods coupled with them. The model-checking methods are realized by algorithms grounded in graph-theoretical approaches and tailored to the specificity of hyper-graphs. Finally, possible applications in a real-life enterprise environment are outlined.
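The Horn-clause angle admits a compact illustration (hypothetical constraint names; this is a sketch, not the paper's method): each clause body can be read as a directed hyperedge from a set of premise artifacts to one conclusion, and forward chaining decides whether a desired consistency property is derivable from the model's facts.

```python
# Minimal sketch under stated assumptions: Horn-clause integrity constraints
# over Information System artifacts, checked by forward chaining. Each clause
# body acts as a directed hyperedge from its premise set to its conclusion.

def forward_chain(facts, rules):
    """rules: iterable of (body, head) with body a frozenset of atoms.
    Returns the closure of `facts` under the Horn rules."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if body <= derived and head not in derived:
                derived.add(head)
                changed = True
    return derived

# Hypothetical artifact-level constraints of an IS model:
rules = [
    (frozenset({"process_defined", "data_schema_defined"}), "workflow_valid"),
    (frozenset({"workflow_valid", "service_bound"}), "model_consistent"),
]
facts = {"process_defined", "data_schema_defined", "service_bound"}

print("model_consistent" in forward_chain(facts, rules))  # -> True
```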
12

Zaourar, Lilia, Yann Kieffer, and Chouki Aktouf. "A Graph-Based Approach to Optimal Scan Chain Stitching Using RTL Design Descriptions." VLSI Design 2012 (December 20, 2012): 1–11. http://dx.doi.org/10.1155/2012/312808.

Abstract:
The scan chain insertion problem is one of the mandatory logic insertion design tasks. The scanning of designs is a very efficient way of improving their testability. But it does impact size and performance, depending on the stitching ordering of the scan chain. In this paper, we propose a graph-based approach to a stitching algorithm for automatic and optimal scan chain insertion at the RTL. Our method is divided into two main steps. The first one builds graph models for inferring logical proximity information from the design, and then the second one uses classic approximation algorithms for the traveling salesman problem to determine the best scan-stitching ordering. We show how this algorithm allows the decrease of the cost of both scan analysis and implementation, by measuring total wirelength on placed and routed benchmark designs, both academic and industrial.
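A hedged sketch of the second step, assuming the first step has already produced a pairwise proximity cost for the scan flip-flops (the numbers below are invented): the nearest-neighbour heuristic, the simplest of the classic TSP approximations the abstract refers to, yields a stitching order.

```python
# Hypothetical sketch of the stitching step: given pairwise "logical
# proximity" costs between scan flip-flops (invented numbers below), a
# nearest-neighbour TSP heuristic picks a low-cost chain order.

def nearest_neighbour_order(cost, start):
    """cost maps frozenset({a, b}) -> stitching cost between cells a and b."""
    cells = {c for pair in cost for c in pair}
    order, current = [start], start
    remaining = cells - {start}
    while remaining:
        current = min(remaining, key=lambda c: cost[frozenset({current, c})])
        order.append(current)
        remaining.discard(current)
    return order

raw = {("ff0", "ff1"): 2, ("ff0", "ff2"): 9, ("ff0", "ff3"): 4,
       ("ff1", "ff2"): 3, ("ff1", "ff3"): 7, ("ff2", "ff3"): 1}
cost = {frozenset(pair): c for pair, c in raw.items()}

print(nearest_neighbour_order(cost, "ff0"))  # -> ['ff0', 'ff1', 'ff2', 'ff3']
```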
13

Cravo, Glória. "Workflow Modelling and Analysis Based on the Construction of Task Models." Scientific World Journal 2015 (2015): 1–7. http://dx.doi.org/10.1155/2015/481767.

Abstract:
In this paper, we describe the structure of a workflow as a graph whose vertices represent tasks and whose arcs are associated with workflow transitions. An input/output logic operator is associated with each task, and a Boolean term with each transition present in the workflow. We identify the structure of workflows and describe their dynamism through the construction of new task models. This construction is simple and intuitive, since it is based on the analysis of all tasks present in the workflow, which allows us to describe the dynamism of the workflow very easily; this intuitiveness is an important highlight of our work. We also introduce the concept of logical termination of workflows and provide conditions under which this property holds. Finally, we provide a counter-example showing that a conjecture presented in a previous article is false.
14

Xiao, Juan, Song Wang, Sheng Duan, and Shanglin Li. "Research on Real-Time System and Related Graph Task Model." Journal of Physics: Conference Series 2066, no. 1 (November 1, 2021): 012054. http://dx.doi.org/10.1088/1742-6596/2066/1/012054.

Abstract:
Generally speaking, a real-time system is considered able to influence its environment by receiving and processing data and returning calculation results quickly enough to control that environment. In computer science, a real-time system is a software and hardware system governed by time constraints: its correctness relies on both the logical correctness of the function and the time at which the result is produced. The main characteristics of a real-time operating system, such as time constraints, predictability, and reliability, place high requirements on its timing accuracy and dependability. This paper first introduces real-time systems through their main characteristics, related concepts, and scheduling algorithms. Five classical graph-based task models of real-time systems are then introduced. Finally, the paper introduces the directed graph real-time task model in terms of its definition and semantics. As an extension of real-time task models, the directed graph real-time task model is considered able to provide real-time systems with stronger expressive power and to support the formal study of time-constraint problems.
15

Stasiuk, Aleksander, Valeriy Kuznetsov, Lidia Goncharova, and Petro Hubskyi. "Models of the Computer Intellectualization Optimal Strategy of the Power Supply Fast-Flowing Technological Processes of the Railways Traction Substations." Communications - Scientific Letters of the University of Zilina 23, no. 2 (April 1, 2021): C30–C36. http://dx.doi.org/10.26552/com.c.2021.2.c30-c36.

Abstract:
Based on an analysis of the problem of innovative transformation of railway power supply networks, a research direction is substantiated, related to organizing the optimal strategy of computerized intellectualization of power supply processes at railway traction substations. The logical structure of a distributed computing environment is developed in the form of a graph that adequately reflects the topology of the power supply system. A differential mathematical model of the computer architecture for power supply control is proposed. An intelligent method for finding the optimal strategy for the intellectualization of power supply processes is proposed to guarantee the specified indicators of optimal functioning of individual nodes and segments of the power supply management computer network.
16

Sithole, G. "INDOOR SPACE ROUTING GRAPHS: VISIBILITY, ENCODING, ENCRYPTION AND ATTENUATION." ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLII-4 (September 19, 2018): 579–85. http://dx.doi.org/10.5194/isprs-archives-xlii-4-579-2018.

Abstract:
The conventional approach to path planning for indoor navigation is to infer routes from a subdivided floor map of the indoor space. The floor map describes the spatial geometry of the space and contains logical units called subspaces. For the purpose of path planning, the possible routes between the subspaces have to be modelled. Typically these models employ graph structures, or skeletons, in which the interconnected subspaces (e.g., rooms, corridors, etc.) are represented as linked nodes, i.e., a graph.

This paper presents a novel method for creating generalised graphs of indoor spaces that does not require the subdivision of indoor space. The method creates the generalised graph by gradually simplifying/in-setting the floor map until a graph is obtained, a process described here as chained deflation. The resulting generalised graph allows more flexible and natural paths to be determined within the indoor environment. Importantly, the method allows the indoor space to be encoded and encrypted and supplied to users in a way that emulates the use of physical keys in the real world. Another important novelty of the method is that the space described by the graph is adaptable: it can be deflated or inflated according to the needs of the path planning. Finally, the proposed method can be readily generalised to the third dimension.

The concept and logic of the method are explained. A full implementation of the method will be discussed in a future paper.
17

Rozin, Mikhail, Valeriy Svechkarev, and Zhanna Tumakova. "Descriptor of self-determination based on cognitive models." SHS Web of Conferences 72 (2019): 04005. http://dx.doi.org/10.1051/shsconf/20197204005.

Abstract:
Scientists in various applied fields are interested in the research possibilities offered by the concept of self-determination. It is suggested to concentrate on the general semantic content of self-determination, which is inherent to it regardless of the applied focus of research. It is shown that self-determination manifests itself as a system property of the activity of the sociocultural system as a whole. Furthermore, the level of self-determination of the sociocultural system and its potential for development in the target direction are directly proportional to the degree of semantic and causal integration of its elements. It has been identified that a set of factors united by cause-and-effect relationships in an oriented, named, signed graph reflects both the integration causation and the logical-semantic target organization of the sociocultural system. Thus, based on the figurative approximation of the essence of the studied system property of self-determination and its visual metaphorical representation in the form of a cognitive model, we get the opportunity to study the self-determination descriptor. Analysis of the causal integration of the elements of a sociocultural system or process using cognitive models makes it quite simple to determine the level of self-determination of a system, not only at a qualitative but also at a quantitative level.
18

Komkov, N. I., A. A. Lazarev, and V. S. Romantsov. "INFORMATION MODELING OF DEVELOPMENT PROCESSES BASED ON THE SYSTEM ANALYSIS OF "BOTTLENECKS"." MIR (Modernization. Innovation. Research) 9, no. 2 (June 30, 2018): 222–31. http://dx.doi.org/10.18184/2079-4665.2018.9.2.222-231.

Abstract:
Purpose: the purpose of the presented research is to expand the possibilities of using information-logical models for the analysis and forecasting of "bottlenecks" and the quantitative assessment of ways to eliminate them in the development of socio-economic systems.

Methods: the implementation of this research is based on the use of basic principles, properties, and rules of construction of phased information-logical models for solving complex problems. Their development involves the introduction of quantitative assessments of ways to reduce the negative potential of "bottlenecks" and a comparison of the expected results of their elimination with the initial state.

Results: the authors present a method of analytically directed search for and elimination of "bottlenecks" in the development of complex systems. This method is based on the use of rules for building information-logical models. A quantitative analysis of the potential for reducing the potential of identified "bottlenecks" assumes the construction of a linear graph based on the information-logical model, as well as the calculation of integral estimates of the expected compensation of the initial potential of "bottlenecks".

Conclusions and Relevance: the developed method of analytically representing the possibilities of eliminating "bottlenecks" in the development of socio-economic systems is applicable to the analysis of the prospects of eliminating "bottlenecks" on the basis of building a scheme of a full decision-making cycle. Testing of the presented approach on examples of known problem situations showed that the proposed tools allow the effectiveness of the proposed mechanisms to be evaluated a priori, and their expansion and effectiveness to be simulated in terms of impact on the final result. This increases the possibilities for finding effective solutions to complex scientific, technological, and socio-economic problems. In addition, it will be very useful in the examination of various projects and programs.
19

Telenyk, Sergii, Sergiy Pogorilyy, and Artem Kramov. "Evaluation of the Coherence of Polish Texts Using Neural Network Models." Applied Sciences 11, no. 7 (April 2, 2021): 3210. http://dx.doi.org/10.3390/app11073210.

Abstract:
Coherence evaluation of texts falls into the category of natural language processing tasks. The evaluation of texts' coherence implies the estimation of their semantic and logical integrity; such a feature of a text can be utilized in solving multidisciplinary tasks (SEO analysis, the medical domain, detection of fake texts, etc.). In this paper, different state-of-the-art coherence evaluation methods based on machine learning models have been analyzed. The effectiveness of different methods for the coherence estimation of Polish texts has been investigated. The impact of a text's features on the output coherence value has been analyzed using different approaches to a semantic similarity graph. Two neural networks, based on LSTM layers and a pre-trained BERT model respectively, have been designed and trained for the coherence estimation of input texts. The results obtained may indicate that both lexical and semantic components should be taken into account during the coherence evaluation of Polish documents; moreover, it is advisable to analyze documents in a sentence-by-sentence manner, taking word order into account. According to the accuracy of the proposed neural networks, it can be concluded that the suggested models may be used to solve typical coherence estimation tasks for a Polish corpus.
20

Chen, Xuhua. "Semantic Matching Efficiency of Supply and Demand Text on Cross-Border E-Commerce Online Technology Trading Platforms." Wireless Communications and Mobile Computing 2021 (May 15, 2021): 1–12. http://dx.doi.org/10.1155/2021/9976774.

Abstract:
With the innovation of global trade business models, more and more foreign trade companies are transforming and developing in the direction of cross-border e-commerce. However, due to the limitations of platform language processing and analysis technology, foreign trade companies encounter many bottlenecks in the process of transformation and upgrading. From the perspective of the semantic matching efficiency of e-commerce platforms, this paper addresses the logical and technical problems of cross-border e-commerce in the operation process, takes semantic matching efficiency as the research object, and conducts experiments on the QQP dataset. We propose TextSGN, a graph-network model for text semantic analysis based on semantic dependency analysis, addressing the fact that existing text semantic matching methods do not consider the semantic dependency information between words in the text and require a large amount of training data. The model first analyzes the semantic dependencies of the text and performs word embedding and one-hot encoding on the nodes (single words) and edges (dependencies) in the semantic dependency graph. On this basis, in order to mine semantic dependencies quickly, an SGN network block is proposed. The network block defines, at the structural level, how information is transmitted to update the nodes and edges in the graph, so that semantic dependency information is mined quickly and the network converges faster. We train classification models on multiple public datasets and perform classification tests. The experimental results show that the accuracy of the TextSGN model in short text classification reaches 95.2%, which is 3.6% higher than the suboptimal classification method; the accuracy is 86.16% and the F1 value is 88.77%, outperforming other methods.
21

Fehlmann, Thomas, and Eberhard Kranich. "The Fixpoint Combinator in Combinatory Logic – A Step towards Autonomous Real-time Testing of Software?" ATHENS JOURNAL OF SCIENCES 9, no. 1 (February 9, 2022): 47–64. http://dx.doi.org/10.30958/ajs.9-1-3.

Abstract:
Combinatory Logic is an elegant and powerful logical theory that is used in computer science as a theoretical model for computation. Its algebraic structure supports self-application and is Turing-complete. However, contrary to Lambda Calculus, it untangles the problem of substitution, because bound variables are eliminated by inserting specific terms called Combinators. It was introduced by Schönfinkel (1924) and Curry (1930). Combinatory Logic uses just one algebraic operation, namely combining two terms, yielding another valid term of Combinatory Logic. Terms in models of Combinatory Logic look like some sort of assembly language for mathematical logic. A Neural Algebra, modeling the way we think, constitutes an interesting model of Combinatory Logic. There are other models, also based on the Graph Model (Engeler 1981), such as software testing. This paper investigates what Combinatory Logic contributes to modern software testing.

Keywords: combinatory logic, combinatory algebra, autonomous real-time testing, recursion, software testing, artificial intelligence
22

Kuzmina, E. A., and G. F. Nizamova. "Curriculum development based on the graph model." Informatics and education, no. 5 (July 4, 2020): 33–43. http://dx.doi.org/10.32517/0234-0453-2020-35-5-33-43.

Abstract:
The article discusses an approach to planning interrelated work within a given number of time periods, using the example of forming the curriculum of a higher educational institution based on a competence model of students. It is proposed to treat curriculum formation as a cutting-packing optimization task solved with the help of an ordered graph. A logical model of the curriculum is developed in the form of an N-layer ordered graph in which the vertices correspond to the disciplines of the curriculum and the arcs specify precedence relations between disciplines across semesters; the temporal precedence (succession) of disciplines is specified using a discipline-succession matrix. As criteria for the optimality of the curriculum, the uniformity of the study load and the minimum of penalties, which reflects the degree of compliance with the given causal relationships between the disciplines of the plan, are considered. A UML diagram of the components of the software package for forming curricula is developed, and the functions of the developed software package are described. The developed models and algorithms have been implemented in software. An experiment was conducted; various curriculum options were obtained and analyzed. Their optimality was estimated based on the proposed criteria, the characteristics of the constructed curriculum options are given, and recommendations are made for choosing the final curriculum version, taking into account the established priorities.
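A small Python sketch with hypothetical course data (this is not the authors' software package) shows the two ingredients the abstract combines: checking that a semester assignment respects the discipline-precedence relation, and scoring the uniformity of the study load.

```python
# Sketch under stated assumptions: `prereq[a] = {b, ...}` means discipline a
# must come after each b; `semester` assigns each discipline to a layer.
# We list precedence violations and score load uniformity as the spread of
# per-semester credits (0 = perfectly uniform).

def violations(prereq, semester):
    return [(b, a) for a, before in prereq.items()
                   for b in before if semester[b] >= semester[a]]

def load_spread(credits, semester, n_semesters):
    load = [0] * n_semesters
    for course, cr in credits.items():
        load[semester[course] - 1] += cr
    return max(load) - min(load)

prereq   = {"Calculus II": {"Calculus I"}, "Databases": {"Programming"}}
semester = {"Calculus I": 1, "Programming": 1, "Calculus II": 2, "Databases": 2}
credits  = {"Calculus I": 6, "Programming": 5, "Calculus II": 6, "Databases": 5}

print(violations(prereq, semester))        # -> []  (precedence respected)
print(load_spread(credits, semester, 2))   # -> 0   (uniform load)
```

An optimizer in the spirit of the abstract would search over `semester` assignments, penalizing violations and non-uniform load.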
23

Ageev, Yu D., S. V. Fedoseev, Yu A. Kavin, S. G. Vorona, and I. S. Pavlovskiy. "Inconsistency evaluation of the curriculum logical structure." Statistics and Economics 15, no. 5 (November 13, 2018): 73–80. http://dx.doi.org/10.21686/2500-3925-2018-5-73-80.

Abstract:
Purpose of the study. The main purpose of creating a curriculum is to arrange academic disciplines in accordance with the logic of the learning process, defined by the relationships between the basic concepts of the disciplines. Violation of this logic becomes apparent only during the training sessions themselves. A large variety of quantitative methods use indicators that do not reveal structural deficiencies in the curriculum, which makes it difficult to improve. The purpose of this work is to demonstrate the application of a general approach to the assessment of the structural inconsistency of systems to the evaluation of the logical structure of the curriculum.

Materials and methods. The paper applies a general approach to the assessment of structural integrity, developed on the basis of the provisions of general systems theory and graph theory. The approach involves the construction of three interrelated structural models of the system and their use to determine the initial data for calculating the inconsistency index of the system structure.

Results. The overall approach to the assessment of structural integrity is adapted to assess the logical structure of the curriculum. Three models of curriculum structure are developed: an elementary model of interdisciplinary communication; a curriculum network model; and a hierarchical curriculum model. Based on the parameters of the hierarchical curriculum model, using three adapted algorithms, the value of the inconsistency index of the curriculum structure is calculated for the "Applied Informatics" direction of study. Recommendations on changing the structure of the studied curriculum to reduce the degree of its structural inconsistency are proposed.

Conclusion. As a result of the research, methods were proposed that allow identifying possible contradictions in the structure of the curriculum and evaluating its inconsistency. As the experiments have shown, it is extremely difficult to study manually curricula with more than 50 disciplines. In this regard, the development of a suite of computer programs to automate the assessment of the inconsistency of large curricula is being completed.
24

Kleshhenkov, A. V., A. L. Chikin, A. Ju Moskovec, and L. G. Chikina. "ON TWO MODELS RELATED TO THE FLOODING AND DEPOSITION OF THE DON DELTA." Ecology. Economy. Informatics. System analysis and mathematical modeling of ecological and economic systems 1, no. 6 (2021): 29–33. http://dx.doi.org/10.23885/2500-395x-2021-1-6-29-33.

Abstract:
The results of modeling changes in the water surface level in the eastern part of the Taganrog Bay and the main Don branches in its delta area are presented. A numerical study of the process of saltwater inflow from the Taganrog Bay to the Don delta has been carried out. The hydrodynamics in the Taganrog Bay, as well as the saltwater transport process, are specified using the corresponding two-layer models. A system of Don arms is presented in the form of a graph, the edges of which correspond to open channels, and the vertices correspond to branching points and end nodes. The flow in the main Don branches is described by the Saint-Venant equation. It is assumed that there is no distributed lateral inflow, and the channel cross-section has a parabolic profile. Saltwater inflow into the arms is described by a one-dimensional transport equation. Boundary conditions are specified for each sleeve. At the branching nodes, conditions are set for the equality of the water levels, as well as the equality of the inflowing and outflowing discharges. The description of the algorithm of the process of flooding/drainage of the Don delta area is given. Considering the values of the depths at the nodes of the flat grid, the cells located in water or on land are determined. A logical array characterizing the type of cells (“water”, “land”) sets the configuration of the entire computational domain. Comparison of the calculation results with the observed values of salinity and water level is carried out.
25

Klimina, N. V., and I. А. Morozov. "The program of the advanced training course for teachers of mathematics and informatics "Graphs and graph models: methods of visual processing"." Informatics and education, no. 3 (June 4, 2021): 31–41. http://dx.doi.org/10.32517/0234-0453-2021-36-3-31-41.

Abstract:
The method of visual presentation of educational information for solving problems of mathematics and informatics is effective for the development of algorithmic, logical and computational thinking of schoolchildren. Technical progress, informatization of education, the emergence of modern software for visualization of information change the activities of teachers who need to master new technologies of information visualization for use in the classroom and in work with gifted children. Visual models for presenting educational information and methods of their processing with the use of computer programs are also relevant in extracurricular activities, allowing to develop the intellectual abilities of schoolchildren. Teachers are required to teach children to create projects in which visibility is a necessary component and must be represented by an electronic product created using modern information visualization tools. The article proposes a variant of the advanced training course for teachers of mathematics and informatics on teaching methods for visualization of solving problems using graphs and the free software “Graphoanalyzator”. The relevance of the course is due to the need to form the competency to carry out targeted work with gifted children in the use of software for creating and processing graphs based on the graph visualization program “Graphoanalyzator”. The authors believe that the training of teachers on this course will contribute to the formation of their skills to solve problems of mathematical modeling in informatics and mathematics, to apply information technologies to solve pedagogical problems in the context of informatization of education.
26

Long, Jun, Lei Liu, Hongxiao Fei, Yiping Xiang, Haoran Li, Wenti Huang, and Liu Yang. "Contextual Semantic-Guided Entity-Centric GCN for Relation Extraction." Mathematics 10, no. 8 (April 18, 2022): 1344. http://dx.doi.org/10.3390/math10081344.

Abstract:
Relation extraction tasks aim to predict potential relations between entities in a target sentence. As entity mentions can be ambiguous in sentences, some important contextual information can guide the semantic representation of entity mentions to improve the accuracy of relation extraction. However, most existing relation extraction models ignore the semantic guidance of contextual information to entity mentions and treat entity mentions and the textual context of a sentence equally. This results in low-accuracy relation extraction. To address this problem, we propose a contextual semantic-guided entity-centric graph convolutional network (CEGCN) model that enables entity mentions to obtain semantic-guided contextual information for more accurate relational representations. This model develops a self-attention enhanced neural network to concentrate on the importance and relevance of different words to obtain semantic-guided contextual information. Then, we employ a dependency tree with entities as global nodes and add virtual edges to construct an entity-centric logical adjacency matrix (ELAM). This matrix enables entities to aggregate the semantic-guided contextual information with a one-layer GCN calculation. The experimental results on the TACRED and SemEval-2010 Task 8 datasets show that our model can efficiently use semantic-guided contextual information to enrich semantic entity representations and outperform previous models.
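The aggregation step can be sketched in a few lines of NumPy (toy matrices; the real ELAM is derived from a dependency tree, which is not reproduced here): entities connected to their guiding context words by virtual edges pull that context in with a single GCN layer.

```python
# Hedged illustration of one-layer GCN aggregation over an ELAM-style
# adjacency matrix. All matrices are toy data, not the paper's model.

import numpy as np

def gcn_layer(A, H, W):
    """One GCN layer: H' = ReLU(D^-1 (A + I) H W), row-normalised."""
    A_hat = A + np.eye(A.shape[0])                 # add self-loops
    D_inv = np.diag(1.0 / A_hat.sum(axis=1))       # inverse degree matrix
    return np.maximum(D_inv @ A_hat @ H @ W, 0.0)  # aggregate + ReLU

n_tokens, d_in, d_out = 4, 8, 8
rng = np.random.default_rng(0)
H = rng.normal(size=(n_tokens, d_in))   # token representations
W = rng.normal(size=(d_in, d_out))      # layer weights

# Token 0 is an entity mention; virtual edges connect it directly to its
# guiding context tokens 2 and 3 (a stand-in for the ELAM construction).
A = np.zeros((n_tokens, n_tokens))
A[0, 2] = A[2, 0] = 1.0
A[0, 3] = A[3, 0] = 1.0

print(gcn_layer(A, H, W).shape)         # -> (4, 8)
```

One layer suffices here because the virtual edges make the guiding context directly adjacent to the entity, which is the design point the abstract emphasizes.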
27

Laniau, Julie, Clémence Frioux, Jacques Nicolas, Caroline Baroukh, Maria-Paz Cortes, Jeanne Got, Camille Trottier, Damien Eveillard, and Anne Siegel. "Combining graph and flux-based structures to decipher phenotypic essential metabolites within metabolic networks." PeerJ 5 (October 12, 2017): e3860. http://dx.doi.org/10.7717/peerj.3860.

Abstract:
Background. The emergence of functions in biological systems is a long-standing issue that can now be addressed at the cell level with the emergence of high-throughput technologies for genome sequencing and phenotyping. The reconstruction of complete metabolic networks for various organisms is a key outcome of the analysis of these data, giving access to a global view of cell functioning. The analysis of metabolic networks may be carried out by simply considering the architecture of the reaction network or by taking into account the stoichiometry of reactions. In both approaches, this analysis is generally centered on the outcome of the network and considers all metabolic compounds to be equivalent in this respect. As in the case of genes and reactions, about which the concept of essentiality has been developed, it seems, however, that some metabolites play crucial roles in system responses, due to the cell structure or the internal wiring of the metabolic network.

Results. We propose a classification of metabolic compounds according to their capacity to influence the activation of targeted functions (generally the growth phenotype) in a cell. We generalize the concept of essentiality to metabolites and introduce the concept of the phenotypic essential metabolite (PEM), which influences the growth phenotype according to sustainability, producibility, or optimal-efficiency criteria. We have developed and made available a tool, Conquests, which implements a method combining graph-based and flux-based analysis, two approaches that are usually considered separately. The identification of PEMs is made effective by using a logical programming approach.

Conclusion. The exhaustive study of phenotypic essential metabolites in six genome-scale metabolic models suggests that the combination and comparison of graph, stoichiometry, and optimal-flux-based criteria allow some features of metabolic network functionality to be deciphered by focusing on a small number of compounds. By considering the best combination of both graph-based and flux-based techniques, the Conquests Python package advocates for a broader use of these compounds, both to facilitate network curation and to promote a precise understanding of the metabolic phenotype.
28

Kavinilavu, A., and S. Neelavathy Pari. "COMPRESSION OF HIGH-RESOLUTION VOXEL PHANTOMS BY MEANS OF B+ TREE." International Journal of Research -GRANTHAALAYAH 7, no. 12 (June 9, 2020): 199–208. http://dx.doi.org/10.29121/granthaalayah.v7.i12.2019.312.

Abstract:
Data structures are chosen to save space and to grant fast access to data by key for a particular structural representation. The data structures surveyed are linear lists, hierarchical structures, and graph structures. The B+ tree is an extension of the B tree data structure that allows efficient insertion, deletion, and search operations; it is used to store amounts of data too large for main memory. B+ tree leaf nodes are connected together in the form of a singly linked list to make search queries more efficient and effective. The drawback of a binary tree geometry is that the decrease in memory use comes at the expense of more frequent memory access, which might slow down simulations in which memory access constitutes a significant part of the execution time. This work addresses the processing and compression of voxel phantoms without loss of quality. Voxels are often utilized in the visualization and analysis of medical and scientific information. Voxel phantoms, which comprise sets of small volume elements, appeared towards the end of the 1980s and improved on the first mathematical models of the human body; these phantoms are an extremely exact representation. The goals are to fetch records in an equal number of disk accesses and to reduce access time by reducing the height of the tree and increasing the number of branches per node.
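The property that makes B+ trees attractive here — leaves chained into a singly linked list, so that a range scan follows `next` pointers instead of re-descending from the root — can be shown with a minimal sketch (illustrative only, not the paper's voxel-phantom code).

```python
# Minimal sketch of the B+ tree leaf chain: once the first qualifying leaf
# is located, a range scan walks the linked list of leaves in key order.

class Leaf:
    def __init__(self, keys, values):
        self.keys, self.values, self.next = keys, values, None

def range_scan(first_leaf, lo, hi):
    """Yield (key, value) pairs with lo <= key <= hi from the leaf chain."""
    leaf = first_leaf
    while leaf is not None:
        for k, v in zip(leaf.keys, leaf.values):
            if k > hi:
                return          # keys are sorted: nothing further qualifies
            if k >= lo:
                yield k, v
        leaf = leaf.next        # hop to the neighbouring leaf, no re-descent

# Three leaves as a stand-in for the bottom level of a B+ tree over voxel ids.
a = Leaf([1, 3], ["v1", "v3"])
b = Leaf([5, 7], ["v5", "v7"])
c = Leaf([9], ["v9"])
a.next, b.next = b, c

print(list(range_scan(a, 3, 7)))   # -> [(3, 'v3'), (5, 'v5'), (7, 'v7')]
```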
29

Hahanova, A., V. Hahanov, S. Chumachenko, E. Litvinova, and D. Rakhlis. "VECTOR-DRIVEN LOGIC AND STRUCTURE FOR TESTING AND DEDUCTIVE FAULT SIMULATION." Radio Electronics, Computer Science, Control, no. 3 (October 6, 2021): 69–85. http://dx.doi.org/10.15588/1607-3274-2021-3-7.

Abstract:
Context. It is known that data structures are decisive for the creation of efficient parallel algorithms and high-performance computing devices. The development of mathematically perfect and technologically simple data structures therefore takes about 80 percent of the design time, while about 20 percent of time and material resources are spent on algorithms and their hardware-software coding. This leads to a search for primitives of data structures that significantly simplify the parallel high-performance algorithms operating on them. Models and methods for testing and simulation of digital systems are proposed, which carry certain advantages of quantum computing into the implementation of vector qubit data structures within classical computational processes.

Objective. The goal of the work is the development of an innovative technology for qubit-vector synthesis and deductive analysis of tests for their verification, based on vector data structures that greatly simplify algorithms and can be embedded as BIST components in digital systems on chips.

Method. Deductive fault simulation is used to obtain analytical expressions focused on transporting fault lists through a functional or logical element based on the xor-operation, which serves as a measure of similarity-difference between a test, a function, and faults, each specified in the same way in one of the formats: a table, a graph, or an equation. A binary vector is proposed as the most technologically advanced primitive of data structures for specifying logical functionality for the purpose of parallel synthesis and analysis of digital systems. The parallelism of solving combinatorial problems is a physical property of quantum computing; in classical computing, for parallel simulation and fault diagnosis, it is provided by unitary-coded data structures at the cost of excess memory.

Results. 1) A method of analytical synthesis of deductive logic for functional elements at the gate level and register transfer level has been developed. 2) A deductive processor for fault simulation based on transporting input fault lists or vectors to the external outputs of digital circuits is proposed. 3) The qubit-vector form of specifying logic and methods of qubit synthesis of deductive equations for fault simulation are described. 4) A qubit-vector method for test synthesis using derivatives calculated over the vector coverage of the logic has been developed. 5) The models and methods are verified on test examples in a software implementation of the structures and algorithms.

Conclusions. The scientific novelty lies in a new paradigm of technology for the synthesis of deductive RTL logic based on the metric test equation. A vector form of structure description is introduced, which makes it possible to apply well-known technologies for the synthesis and analysis of logic circuit tests to effectively solve the problems of testing graph structures and state-machine models of digital devices. The practical significance is reflected in examples of analytical synthesis of deductive logic for functional elements at the gate level and register transfer level. A deductive processor for fault simulation is proposed, oriented toward implementation as a BIST tool for online testing, simulation, and fault diagnosis in digital systems on chips. A qubit-vector form of digital system description is proposed, which surpasses existing methods of computing device development in terms of manufacturability, compactness, speed, and quality. A software application has been developed that implements the main testing, simulation, and diagnostic services; it is used in the educational process to study the advantages of qubit-vector data structures and algorithms. The computational complexity of the synthesis processes and of the deductive formulas for logic, and their use in fault simulation, are given.
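For reference, the classical deductive rule the abstract builds on can be illustrated for a single 2-input AND gate (a textbook-style Python sketch, not the authors' qubit-vector implementation): a fault appears on the output list exactly when it would flip the gate's output under the current test.

```python
# Textbook-style sketch of deductive fault-list transport through an AND
# gate during good-machine simulation. A fault is a pair (line, stuck_value).

def deductive_and(vals, lists, out_name):
    """vals: good input values; lists: per-input fault sets."""
    good_out = int(all(vals))
    ctrl    = [L for v, L in zip(vals, lists) if v == 0]  # controlling inputs
    nonctrl = [L for v, L in zip(vals, lists) if v == 1]
    union_nc = set().union(*nonctrl) if nonctrl else set()
    if ctrl:   # output flips only if every controlling input is flipped
        out = set.intersection(*ctrl) - union_nc
    else:      # all inputs at 1: any single flipped input flips the output
        out = union_nc
    out.add((out_name, 1 - good_out))   # the output's own stuck-at fault
    return out

# a=1 carries fault list {a/0}; b=0 carries {b/1}; the gate output is y.
print(deductive_and([1, 0], [{("a", 0)}, {("b", 1)}], "y"))
# -> {('b', 1), ('y', 1)}: only b stuck-at-1 (and y stuck-at-1) are observable,
#    since a stuck-at-0 cannot change the output when b already forces a 0.
```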
30

Fan, Youping, Jingjiao Li, Dai Zhang, Jie Pi, Jiahan Song, and Guo Zhao. "Supporting Sustainable Maintenance of Substations under Cyber-Threats: An Evaluation Method of Cybersecurity Risk for Power CPS." Sustainability 11, no. 4 (February 14, 2019): 982. http://dx.doi.org/10.3390/su11040982.

Abstract:
In the increasingly complex cyber-environment, appropriate sustainable maintenance of substation auto systems (SASs) can lead to many positive effects on power cyber-physical systems (CPSs). Evaluating the cybersecurity risk of power CPSs is the first step in creating sustainable maintenance plans for SASs. In this paper, a mathematical framework for evaluating the cybersecurity risk of a power CPS is proposed considering both the probability of successful cyberattacks on SASs and their consequences for the power system. First, the cyberattacks and their countermeasures are introduced, and the probability of successful cyber-intruding on SASs is modeled from the defender’s perspective. Then, a modified hypergraph model of the SAS’s logical structure is established to quantitatively analyze the impacts of cyberattacks on an SAS. The impacts will ultimately act on the physical systems of the power CPS. The modified hypergraph model can describe more information than a graph or hypergraph model and potentially can analyze complex networks like CPSs. Finally, the feasibility and effectiveness of the proposed evaluation method is verified by the IEEE 14-bus system, and the test results demonstrate that this proposed method is more reasonable to assess the cybersecurity risk of power CPS compared with some other models.
31

Vavreniuk, Tetiana. "Expressiveness of Ivan Ohiienko's language (based on the work "Ukrainian Church under Bohdan Khmelnytsky. 1647-1657")." IVAN OHIIENKO AND CONTEMPORARY SCIENCE AND EDUCATION SCHOLARLY PAPERS PHILOLOGY, no. 17 (December 1, 2020): 21–26. http://dx.doi.org/10.32626/2309-7086.2020-17-2.21-26.

Abstract:
The article raises the problem of the expressiveness of the language of Ivan Ohiienko's work «Ukrainian Church under Bohdan Khmelnytsky. 1647-1657». Attention is focused on the fact that the author's use of expressive means gives the text features of the popular science substyle. The role of evaluative vocabulary with positive and negative semantics as an expressive means, which indirectly but clearly represents the author's attitude to the depicted, is determined. The expressiveness of the language of the studied text is enhanced by detailed metaphors, which are not very frequent but quite expressive. The means of expressive syntax are analyzed: rhetorical questions, interrogative-answering complexes, and exclamatory sentences, which promote semantic condensation and the logical expression of thought. Amplification of interrogative sentences enhances the expressiveness of a popular science text, thereby emotionally affecting the reader. Interrogative-answering complexes perform three functions in the text: the function of imitating dialogue, the function of problem statement, and the function of definition. It is noted that exclamatory sentences not only convey the author's emotions but are also a means of (enhanced) expressiveness. The expressive load of constructions with direct speech and quotations is determined: the use of these structures models an imaginary dialogue-discussion or serves as a means of argumentation. Inclusions, as a kind of foreign speech, expand the subjective plan of the text and can convey different authorial assessments. It is noted that the expressiveness of the text is enhanced by such graphic means as words and phrases highlighted by font: the change of font and line density of graphemes reflects changes of intonation and logical emphasis, or contains the author's point of view, drawing attention to the highlighted words. The opinion is substantiated that expressive means are not only marked by enhanced expressiveness but also have influential qualities. The special purpose of expressive constructions is not only to convey this or that information to the addressee but also to draw his attention to it.
32

Гончар, Андрій Володимирович, Станіслав Олексійович Довгий, and Марина Андріївна Попова. "Онтологічний підхід до консолідації 3D-моделей об'єктів історико-культурної спадщини та ГІС" [An ontological approach to the consolidation of 3D models of historical and cultural heritage objects and GIS]. RADIOELECTRONIC AND COMPUTER SYSTEMS, no. 1 (February 27, 2021): 81–91. http://dx.doi.org/10.32620/reks.2021.1.07.

Abstract:
The development of the information society, the rise of the knowledge economy, and the adoption of new technologies in the IT industry have opened free mass access to the digitized heritage of world civilization through augmented and virtual reality in three-dimensional space. The leading technologies for modeling the three-dimensional geometry of real heritage conservation objects are 3D panoramas and 3D GIS, yet the combination of their capabilities has not become widespread. The subject of the paper is the consolidation of ontological 3D models of heritage conservation objects with geoinformation systems to form virtual excursion routes, which opens new possibilities for studying the world's cultural, historical, and scientific digital documentary heritage by integrating the transdisciplinary distributed information resources that describe a selected object into a unified information and research space. The purpose of the paper is to make user interaction with distributed information resources and museum knowledge systems more efficient by creating models, developing a method based on them, and building an information technology for forming transdisciplinary virtual museum spaces that support ontological interaction with spatially distributed information through an ontological interface to geoinformation systems. Objectives: to analyze modern approaches and the most common software tools for 3D modeling of heritage conservation objects; to develop an infological model of an ontological excursion route; to formalize the dynamic redistribution of heritage conservation objects during an excursion; and to develop an algorithm for consolidating 3D models of heritage conservation objects and GIS when forming an excursion route. A set of methods was used: system analysis, set theory, and graph theory to develop operational ontological models of heritage conservation objects; algebraic-logical and visual methods to formalize knowledge representation; and design patterns with object-oriented analysis for the software implementation. The following results were obtained. An analysis of software for three-dimensional representation and analysis of data in GIS showed that most tools are intended only for geometry visualization and lack information management facilities. GIS with integrated 3D panoramas of heritage conservation objects proved the most practical option in terms of time, labor, and financial resources. The ontological approach to consolidating multi-format data created under different standards and technologies was found to be the most appropriate and effective, since it resolves the heterogeneity and interoperability problems of the transdisciplinary distributed information resources that describe heritage conservation objects. An algorithm for consolidating 3D models of heritage conservation objects and GIS when forming an excursion route is presented, together with an ontological model of the excursion route whose taxonomy is implemented as a graph. Conclusions. The conceptual and system-technical foundations have been developed for the transdisciplinary representation of models of heritage conservation objects with integrated use of distributed information, in particular geographic information, including ontological interaction in the form of an excursion in a GIS environment.
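As an editorial illustration of the consolidation idea, the sketch below pairs ontology-tagged heritage objects with GIS coordinates and orders a thematic excursion route greedily by distance. The object names, concept sets, and the nearest-neighbour rule are assumptions made for illustration, not the authors' published algorithm.

```python
# A minimal sketch: heritage objects carry ontology concepts plus GIS
# coordinates; an excursion route is built by theme filtering and a
# greedy nearest-neighbour ordering. All data below are invented.
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class HeritageObject:
    name: str        # label of the ontology node
    concepts: set    # linked ontology concepts (hypothetical)
    lat: float       # geolocation from the GIS layer
    lon: float

def distance_km(a: HeritageObject, b: HeritageObject) -> float:
    """Great-circle (haversine) distance between two objects."""
    dlat, dlon = radians(b.lat - a.lat), radians(b.lon - a.lon)
    h = sin(dlat / 2) ** 2 + cos(radians(a.lat)) * cos(radians(b.lat)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

def excursion_route(objects, theme):
    """Select objects whose concepts match the theme, then order them
    greedily by proximity, starting from the first match."""
    pool = [o for o in objects if theme in o.concepts]
    if not pool:
        return []
    route = [pool.pop(0)]
    while pool:
        nxt = min(pool, key=lambda o: distance_km(route[-1], o))
        pool.remove(nxt)
        route.append(nxt)
    return route

sites = [
    HeritageObject("St. Sophia Cathedral", {"architecture", "11th century"}, 50.4527, 30.5142),
    HeritageObject("Golden Gate", {"architecture", "fortification"}, 50.4489, 30.5135),
    HeritageObject("Andriyivskyy Descent", {"architecture", "street"}, 50.4590, 30.5177),
]
print([o.name for o in excursion_route(sites, "architecture")])
```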
APA, Harvard, Vancouver, ISO, and other styles
33

Belov, P. G. "Corruption as a threat to the Russian national security: evaluation and reduction of risk on the basis of modeling." Issues of Risk Analysis 16, no. 1 (February 28, 2019): 10–23. http://dx.doi.org/10.32686/1812-5220-2019-16-10-23.

Full text
Abstract:
The article presents a study of the corruption challenge (CC), understood as a developing threat that requires a response to preserve the nation and the state it has created. The main emphasis is on an a priori assessment of the possibility of the CC and of the expected social and economic damage from it. The main tools are graph-analytical and logical-probabilistic modeling, a system analysis of the circumstances in which corruption spreads, and a system synthesis of proposals for limiting it. The criterion for assessing the severity of the CC is risk, interpreted as an integral measure of this danger and measured in person-years of lost social time. The models developed by the author are causal diagrams and their mathematical equivalents, taking into account several dozen prerequisites for the emergence of the CC and the outcomes of its destructive manifestation. Among the factors of the first group, everything that results from the vicious privatization of former USSR property and the overconsumption of Russian citizens prevails, while the negative consequences take the form of various kinds of economic, political, and moral damage. Considering the specificity of social processes and the limits of humanitarian research methods, an important place in the article is devoted to estimating the model parameters with the apparatus of fuzzy set theory and to automating the model's qualitative and quantitative system analysis.
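For readers who want the arithmetic concrete, here is a minimal sketch of the risk estimate described above: risk as the product of a fuzzily estimated possibility of the threat and a fuzzily estimated damage in person-years. The triangular fuzzy numbers and centroid defuzzification are generic textbook devices assumed for illustration; the paper's actual parameter values are not reproduced in the abstract.

```python
# Toy logical-probabilistic risk estimate:
# risk = (possibility of the corruption challenge) x (expected damage,
# person-years of lost social time). All numbers below are invented.
def centroid(tri):
    """Defuzzify a triangular fuzzy number (low, mode, high)."""
    low, mode, high = tri
    return (low + mode + high) / 3.0

p_cc = (0.2, 0.4, 0.6)      # fuzzy yearly possibility of the threat
damage = (1e5, 5e5, 2e6)    # fuzzy damage, person-years

risk = centroid(p_cc) * centroid(damage)
print(f"point risk estimate: {risk:,.0f} person-years per year")
```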
APA, Harvard, Vancouver, ISO, and other styles
34

Bochkov, A. V. "Method of using the transitive graph of a Markovian process as part of ranking of heterogeneous items." Dependability 21, no. 1 (March 24, 2021): 11–16. http://dx.doi.org/10.21683/1729-2646-2021-21-1-11-16.

Full text
Abstract:
Hierarchy analysis, developed by Thomas Saaty, is a closed logical structure that uses simple and well-substantiated rules for solving multicriterial problems that include both quantitative and qualitative factors, whereby the quantitative factors can differ in dimensionality. The method is based on decomposing the problem and representing it as a hierarchical arrangement, which allows including in the hierarchy all the knowledge the decision-maker has about the problem at hand and subsequently processing the decision-makers' judgements. As a result, the relative degree of interaction between the elements of the hierarchy can be identified and then quantified. Hierarchy analysis includes the procedure of multiple judgement synthesis, the definition of criteria priorities, and the rating of the compared alternatives. The method's significant limitation is the requirement that pairwise comparison matrices be coherent for the weights of the compared alternatives to be defined correctly. The aim of the paper is to examine a non-conventional method for estimating alternative ratings from their pairwise comparisons, a problem that arises in expert preference analysis in various fields of research. Approaches to generating pairwise comparison matrices are discussed, taking into consideration the coherence of such matrices and the estimation of expert competence. Method. Hierarchy analysis together with models and methods of the theory of Markovian processes. Result. The paper suggests a method of using the transitive graph of a Markovian process for the expert ranking of items of a certain parent entity, subject to the competence and qualification of the experts involved in the pairwise comparison. It is proposed to use the steady-state probabilities of the Markovian process as the ratios of priorities (weights) of the compared items. The paper sets forth an algorithm for constructing the final comparison scale taking the experts' level of competence into consideration. Conclusion. Decision procedures in which experts choose the best alternatives out of an allowable set are frequently used in a variety of fields for estimation, objective priority definition, etc. The described method can be applied not only to comparing items but also to more complicated problems of expert group estimation, such as planning, management, and prediction. The method contributes to the objectivity of the analysis when comparing alternatives, taking into consideration various aspects of their consequences as well as the decision-maker's attitude to those consequences. The suggested model-based approach allows decision-makers to identify and adjust their preferences and, consequently, to choose decisions according to those preferences, avoiding logical errors in long and complex reasoning chains. This approach can be used in group decision-making and in describing procedures that compensate for a specific expert's insufficient knowledge by using information provided by the other experts.
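The core computation is easy to make concrete: build a Markov chain whose transitions favour items that win pairwise comparisons, and read the item weights off its steady-state vector. The sketch below does this with an invented 3-item comparison matrix and uniform-jump smoothing; the paper's expert-competence weighting is omitted.

```python
# Steady-state probabilities of a Markov chain as ranking weights.
# The comparison counts and the smoothing constant are invented.
import numpy as np

# wins[i, j] = how often experts preferred item i over item j
wins = np.array([[0, 4, 3],
                 [1, 0, 2],
                 [2, 3, 0]], dtype=float)

# Transition i -> j proportional to how often j beat i (losses push
# probability mass toward stronger items), row-normalized.
P = wins.T / wins.T.sum(axis=1, keepdims=True)
P = 0.9 * P + 0.1 / len(P)     # smoothing keeps the chain ergodic

pi = np.full(len(P), 1 / len(P))
for _ in range(1000):          # power iteration to the stationary vector
    pi = pi @ P

print("steady-state weights:", np.round(pi, 3))
```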
APA, Harvard, Vancouver, ISO, and other styles
35

Jolly, Amy E., Gregory T. Scott, David J. Sharp, and Adam H. Hampshire. "Distinct patterns of structural damage underlie working memory and reasoning deficits after traumatic brain injury." Brain 143, no. 4 (April 1, 2020): 1158–76. http://dx.doi.org/10.1093/brain/awaa067.

Full text
Abstract:
It is well established that chronic cognitive problems after traumatic brain injury relate to diffuse axonal injury and the consequent widespread disruption of brain connectivity. However, the pattern of diffuse axonal injury varies between patients and they have a correspondingly heterogeneous profile of cognitive deficits. This heterogeneity is poorly understood, presenting a non-trivial challenge for prognostication and treatment. Prominent amongst cognitive problems are deficits in working memory and reasoning. Previous functional MRI in controls has associated these aspects of cognition with distinct, but partially overlapping, networks of brain regions. Based on this, a logical prediction is that differences in the integrity of the white matter tracts that connect these networks should predict variability in the type and severity of cognitive deficits after traumatic brain injury. We use diffusion-weighted imaging, cognitive testing and network analyses to test this prediction. We define functionally distinct subnetworks of the structural connectome by intersecting previously published functional MRI maps of the brain regions that are activated during our working memory and reasoning tasks, with a library of the white matter tracts that connect them. We examine how graph theoretic measures within these subnetworks relate to the performance of the same tasks in a cohort of 92 moderate-severe traumatic brain injury patients. Finally, we use machine learning to determine whether cognitive performance in patients can be predicted using graph theoretic measures from each subnetwork. Principal component analysis of behavioural scores confirms that reasoning and working memory form distinct components of cognitive ability, both of which are vulnerable to traumatic brain injury. Critically, impairments in these abilities after traumatic brain injury correlate in a dissociable manner with the information-processing architecture of the subnetworks that they are associated with. This dissociation is confirmed when examining degree centrality measures of the subnetworks using a canonical correlation analysis. Notably, the dissociation is prevalent across a number of node-centric measures and is asymmetrical: disruption to the working memory subnetwork relates to both working memory and reasoning performance whereas disruption to the reasoning subnetwork relates to reasoning performance selectively. Machine learning analysis further supports this finding by demonstrating that network measures predict cognitive performance in patients in the same asymmetrical manner. These results accord with hierarchical models of working memory, where reasoning is dependent on the ability to first hold task-relevant information in working memory. We propose that this finer grained information may be useful for future applications that attempt to predict long-term outcomes or develop tailored therapies.
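Schematically, the prediction step reduces to computing node-centric graph measures per patient and regressing cognitive scores on them. The sketch below illustrates that logic with random synthetic graphs and degree centrality only; the study's diffusion-derived connectomes, measures, and pipeline are far richer.

```python
# Per-patient degree-centrality features -> regression on a cognitive
# score. Patients, edges, and scores are synthetic stand-ins.
import numpy as np
import networkx as nx
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_patients, n_nodes = 92, 20

X = []  # degree-centrality feature vector for each patient's subnetwork
for _ in range(n_patients):
    g = nx.erdos_renyi_graph(n_nodes, p=rng.uniform(0.1, 0.4),
                             seed=int(rng.integers(1_000_000)))
    X.append([nx.degree_centrality(g)[v] for v in range(n_nodes)])
X = np.array(X)

# synthetic "working memory" score loosely tied to mean centrality
y = X.mean(axis=1) + 0.05 * rng.standard_normal(n_patients)

scores = cross_val_score(Ridge(alpha=1.0), X, y, cv=5, scoring="r2")
print("cross-validated R^2:", scores.round(2))
```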
APA, Harvard, Vancouver, ISO, and other styles
36

Rozin, Mikhail, Valery Svechkarev, Evgeny Nesmeyanov, Irina Ryabtseva, and Sergey Yusov. "Model of integration of competencies in a professionally-oriented education." E3S Web of Conferences 210 (2020): 22038. http://dx.doi.org/10.1051/e3sconf/202021022038.

Full text
Abstract:
It is noted that in higher professional education there is a global trend of a growing number of students combining university study with employment in the workplace in order to form a basis for their future professional and specialized career. Studies of existing educational standards reveal that there are practically no models meeting the needs of this new perspective on the organization of professionally-oriented training. It is suggested that a topological description based on graph theory, namely the cognitive model already tested in studies of such problems in education, should be used as the theoretical and methodological basis of the study. The cognitive model makes it possible to draw on the available model descriptions of educational processes, namely structural-dynamic descriptions interpreted in this case as logical-semantic and causal. The authors propose a multi-circuit cognitive model of professionally-oriented education based on the integration of competencies. The central place in the model is given to the target chain of factors: Integration of competences → Professional competences → Educational and production process. This chain helps transmit and synchronize signals across all model contours, continuously initializes the target functioning of the model, and allows analyzing the achieved level of professional competencies in the integration process. It is on this axis that all four contours of the model are strung, which prevents the loss of the target determination of the structure. The model implements the principle of the professional orientation of education: it becomes possible to balance the advantages of the educational competences obtained in the educational process with the peculiarities of the production competences generated in manufacturing, by means of their integration. Professional competencies in the model do not dominate but rather are the result of the continuous integration of both educational and industrial competencies.
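A cognitive map of this kind can be simulated by impulse propagation on a signed weighted digraph. The toy sketch below encodes only the target chain named above plus one assumed feedback loop; the weights are invented, and the paper's four-contour model contains many more factors.

```python
# Impulse propagation x_{t+1} = x_t W on a tiny cognitive map of the
# target chain. Weights and the feedback edge are assumptions.
import numpy as np

factors = ["integration", "professional", "process"]
W = np.array([          # W[i, j]: causal influence of factor i on factor j
    [0.0, 0.8, 0.0],    # integration strengthens professional competences
    [0.0, 0.0, 0.7],    # professional competences drive the process
    [0.4, 0.0, 0.0],    # process feeds back into integration (assumed loop)
])

x = np.array([1.0, 0.0, 0.0])   # initial impulse: boost integration
for step in range(1, 6):
    x = x @ W
    print(f"step {step}:", dict(zip(factors, np.round(x, 3))))
```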
APA, Harvard, Vancouver, ISO, and other styles
37

Chemodanov, S. I., and Yu V. Burlakov. "Update of the technical equipment of the grain-harvesting complex." Siberian Herald of Agricultural Science 51, no. 6 (January 5, 2022): 95–101. http://dx.doi.org/10.26898/0370-8799-2021-6-11.

Full text
Abstract:
To date, many options have been developed for implementing the algorithm for updating the fleet of grain harvesters. In accordance with yield and other indicators, recommendations for forming and renewing the harvester fleet are offered discretely, in the form of tables or charts. This form of information does not always meet the requirements of operational correction and does not allow assessing the technological capabilities of the harvesting units depending on harvesting conditions. A method is proposed to improve the formation of the initial information for operational decision-making on effectively upgrading the technical means of the grain harvesting complex, taking into account the zonal features of a particular agricultural enterprise. A graph-analytical method is developed for determining the main parameters of the basic harvesting tools depending on the predicted yield level, and the influence of the factors determining the composition of the grain harvesting fleet is assessed. This method makes it possible to identify the most rational basic parameters of alternative basic harvesting tools for a specific agricultural enterprise. The first step is to determine the basic parameters of the basic equipment; then the appropriate size series of self-propelled threshers for combine harvesters and reapers is selected. Further, alternative versions of various models of grain harvesting units and complexes are formed. For the subsequent selection of rational types of harvesting means and their criterion-based assessment, technical-technological, environmental, and other indicators are used. The expert-logical analysis of information resources makes it possible to identify and assess the factors that determine the quantitative composition of the technical means of the grain harvesting complex. The final stage in forming the initial information for a decision on updating the technical means of the grain harvesting complex should be their economic assessment, which makes it possible to predict the competitiveness of the threshed grain.
APA, Harvard, Vancouver, ISO, and other styles
38

Brigl, B., T. Wendt, and A. Winter. "Modeling Hospital Information Systems (Part 1): The Revised Three-layer Graph-based Meta Model 3LGM2." Methods of Information in Medicine 42, no. 05 (2003): 544–51. http://dx.doi.org/10.1055/s-0038-1634381.

Full text
Abstract:
Summary. Objectives: Not only architects but also information managers need models and modeling tools for their subject of work. Especially for supporting strategic information management in hospitals, the meta-model 3LGM2 is presented as an ontological basis for modeling the comprehensive information system of a hospital (HIS). Methods: In a case study, requirements for modeling HIS were deduced. Accordingly, 3LGM2 has been designed to describe HIS by concepts on three layers. The domain layer consists of enterprise functions and entity types, the logical tool layer focuses on application components, and the physical tool layer describes physical data processing components. In contrast to other approaches, numerous inter-layer relationships exist. 3LGM2 is defined using the Unified Modeling Language (UML). Results: Models of HIS can be created which comprise not only technical and semantic aspects but also computer-based and paper-based information processing. A software tool supporting the graphical creation of 3LGM2-compliant models has been developed. The tool helps detect those shortcomings at the logical or physical tool layers which make it impossible to satisfy the information needs at the domain layer. 3LGM2 can also be used as an ontology for describing HIS in natural language. Conclusions: Strategic information management, even in large hospitals, should be and can be supported by dedicated methods and tools. Although there have been good experiences with 3LGM2 concerning digital document archiving at the Leipzig University Hospital, which are presented in Part 2, the benefit of the proposed method and tool has to be further evaluated.
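The detection of shortcomings described in the Results can be pictured as a coverage check across the three layers. The toy sketch below uses invented hospital entries to flag domain functions that no application component supports; it is one reading of the idea, not the 3LGM2 tool itself.

```python
# A toy three-layer model in the spirit of 3LGM2: domain functions,
# logical application components, physical components, plus inter-layer
# links. All entries are invented for illustration.
domain_functions = {"patient admission", "order entry", "archiving"}
app_components = {"ADT system", "CPOE system"}

supports = {                 # logical layer -> domain layer links
    "ADT system": {"patient admission"},
    "CPOE system": {"order entry"},
}
runs_on = {"ADT system": "server-1", "CPOE system": "server-1"}

covered = set().union(*supports.values())
print("unsupported functions:", domain_functions - covered)   # archiving
print("unhosted components:", app_components - set(runs_on))  # none
```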
APA, Harvard, Vancouver, ISO, and other styles
39

Gepreel, K. A., M. Higazy, and A. M. S. Mahdy. "Optimal control, signal flow graph, and system electronic circuit realization for nonlinear Anopheles mosquito model." International Journal of Modern Physics C 31, no. 09 (July 27, 2020): 2050130. http://dx.doi.org/10.1142/s0129183120501302.

Full text
Abstract:
We study approximate analytical solutions for one of the popular models in biomathematics, the nonlinear Anopheles mosquito model. The optimal control (OC) of the nonlinear Anopheles mosquito model is examined, and conditions ensuring the existence and uniqueness of solutions of the control problem are assumed. Two control factors are suggested to limit the average number of eggs laid per treated female per day. The signal flow graph and a Simulink/Matlab implementation of this model are constructed, and the system is realized as an electronic circuit in the MULTISIM simulation program. We use the homotopy perturbation method (HPM) to derive an approximate analytical solution of the nonlinear control problem, iterating the approximation with symbolic computation packages such as Maple. Results are displayed graphically to illustrate the behavior of the obtained approximate solutions.
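Since the abstract does not reproduce the model equations, the sketch below integrates a stand-in logistic egg-laying dynamic with a single control u that suppresses a fraction of egg-laying, merely to show the kind of controlled system under study. All parameters are invented.

```python
# Numerically integrate a stand-in controlled population dynamic
# (NOT the paper's model): dn/dt = (1-u) b n (1 - n/K) - mu n.
import numpy as np
from scipy.integrate import solve_ivp

b, K, mu = 10.0, 1000.0, 0.1   # birth rate, capacity, mortality (assumed)
u = 0.6                        # control: fraction of egg-laying suppressed

def mosquito(t, y):
    n = y[0]
    return [(1 - u) * b * n * (1 - n / K) - mu * n]

sol = solve_ivp(mosquito, (0, 30), [50.0], t_eval=np.linspace(0, 30, 7))
for t, n in zip(sol.t, sol.y[0]):
    print(f"t={t:4.1f}  population={n:8.1f}")
```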
APA, Harvard, Vancouver, ISO, and other styles
40

Mišovič, Milan, and Oldřich Faldík. "Applying of component system development in object methodology." Acta Universitatis Agriculturae et Silviculturae Mendelianae Brunensis 61, no. 7 (2013): 2515–22. http://dx.doi.org/10.11118/actaun201361072515.

Full text
Abstract:
In the last three decades, the concept and implementation of component-based architectures have been promoted in software systems creation. Increasingly complex demands are placed on software component systems, in particular relating to their dynamic properties. The emergence of such requirements has been gradually enforced by the practice of developing and implementing these systems, especially for information systems software. It is precisely information systems (robust IS) of different types that require the target software to meet their requirements. Among other things, we mean primarily the adaptive processes of different domains, high distributivity due to the possibilities of the Internet 2.0, acceptance of high integrity of life domains (process, data and communications integrity), scalability, flexible adaptation to process changes, a good context for external devices, and a transparent structure of the sub-process modules and architectural units.

Of course, target software of the required qualities and of the robust type cannot be a monolith. As is commonly known, the development of design for information systems software has clearly arrived at the need to compose software from completely autonomous but cooperating architectural units that communicate with each other using messages of prescribed formats. Although such units were often called subsystems and modules, see (Jac, Boo, Rumbo, 1998) and (Arlo, Neus, 2007), their abstraction gradually became established under the term component. In other words, subsystems and modules are specific types of components.

In (Král, Žeml, 2000) and (Král, Žeml, 2003), two types of target software of information systems are considered. The first type is SWC (Software Components), composed of permanently available components that are thought of as services: Confederate software. The second type is SWA (Software Alliance), called semi-Confederate, formed during the run-time of the software system and referred to as a software alliance. Both publications deliver a deep treatment of the issues relevant to SWC/SWA, such as creating copies of components (cloning), the establishment and destruction of components at software run-time (dynamic reconfiguration), the cooperation of autonomous components, and the programmable management of component interfaces depending on internal component functionality and customer requirements (functionality, security, versioning).

Nevertheless, even today we can meet numerous cases of SWC/SWA with a highly developed architecture that accepts the vast majority of these requests. On the other hand, the development practice of component-based systems with a dynamic architecture (i.e., architecture with dynamic reconfiguration), and finally with a mobile architecture (i.e., architecture with dynamic component mobility), confirms the inadequacy of the design methods contained in UML 2.0; this is proved especially by the dissertation thesis (Rych, Weis, 2008). Software engineering currently has two different approaches to SWC/SWA systems. The first approach is known as component-oriented software development, CBD (Component Based Development). According to (Szyper, 2002), it is a collection of CBD methodologies heavily focused on setting up software components and their re-usability within the architecture.
Although CBD does not offer a strong theoretical foundation, it is nevertheless classified under the general evolution of the SDP (Software Development Process), see (Sommer, 2010), as one of its two dominant directions.

From a structural point of view, a software system consists of self-contained, interoperable architectural units, components, based on well-defined interfaces. Classical procedural object-oriented methodologies make essentially no use of the component meta-models on which target component systems are then formed. Component meta-models describe the syntax and semantics of components; they are a system of rules for components, connectors, and configuration. Component meta-models for dynamic and mobile architectures also describe rules for configuration changes (rules for reconfiguration). Well-known meta-models today include Wright for static architecture, SOFA and Darwin for dynamic architecture, and SOFA 2.0 for mobile architecture, see (Rych, Weis, 2008). The CBD approach verbally defines the basic terms such as component (primitive/composite), interface, component system, configuration, reconfiguration, logical (structural) view, process (behavioral) view, static component architecture, dynamic architecture, and mobile architecture (fully dynamic architecture), see (IEEE Report, 2000) and (Crnk, Chaud, 2006). The CBD approach also presents several ADLs (Architecture Description Languages) able to describe software architecture; well-known languages are ACME and UML (Unified Modeling Language) and their integration, see (Garl, Mon, Wil, 2000) and (UNIFEM, 2005).

The second approach to SWC/SWA systems is based on SOA, but this article does not deal with it in detail. SOA is a philosophy of architecture, not a methodology for the comprehensive development of target software. Nevertheless, SOA has successfully filled the role of a software design philosophy and, on the other hand, has contributed the important concept linking software components and their architectural units: business services. SOA understands any software as a component system of business services and resolves the life cycle of components within it. The physical implementation of components is given by a Web services platform. A certain weakness of SOA is its weak link to business processes, which are a universally recognized platform for business activities and the source for the creation of enterprise services.

This paper deals with a specific activity in CBD, i.e., the integration of the concept of a component-based system into an advanced procedural, object-oriented methodology (Arlo, Neust, 2007), (Kan, Müller, 2005), (Krutch, 2003) for problem domains with double-layer process logic. An integration method is indicated, based on a certain meta-model (Applying of the Component system Development in object Methodology) and leading to the formation of a component system. The meta-model is divided into partial workflows located in different stages of a classic object process-based methodology. The consistency of the input and output artifacts in the working practices of the meta-model and the mentioned object methodology is taken into account. This paper focuses on static component systems, which are the starting point for exploring dynamic and mobile component systems. In addition, the component system is understood here as a specific system; for its system properties and the notation of basic terms, set and graph theory and system algebra are used.
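To ground the meta-model vocabulary (component, interface, connector, configuration), here is a minimal sketch with a static consistency check for dangling required interfaces. The class names and the example wiring are illustrative assumptions, not a rendering of Wright, SOFA, or Darwin.

```python
# A minimal component meta-model: components expose provided/required
# interfaces, connectors bind a requirement to a provider, and a
# configuration is checked for unbound requirements before deployment.
from dataclasses import dataclass, field

@dataclass
class Component:
    name: str
    provides: set = field(default_factory=set)
    requires: set = field(default_factory=set)

@dataclass
class Connector:
    source: Component   # component whose requirement is satisfied
    target: Component   # component providing the interface
    interface: str

def unbound_requirements(components, connectors):
    """Return (component, interface) pairs with no connector, i.e. the
    static consistency check a configuration must pass."""
    bound = {(c.source.name, c.interface) for c in connectors}
    return [(comp.name, i) for comp in components for i in comp.requires
            if (comp.name, i) not in bound]

orders = Component("Orders", provides={"IOrder"}, requires={"IBilling"})
billing = Component("Billing", provides={"IBilling"})
wiring = [Connector(orders, billing, "IBilling")]
print(unbound_requirements([orders, billing], wiring))   # [] -> consistent
```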
APA, Harvard, Vancouver, ISO, and other styles
41

Burchardt, Aljoscha, Sebastian Padó, Dennis Spohr, Anette Frank, and Ulrich Heid. "Constructing Integrated Corpus and Lexicon Models for Multi-Layer Annotation in OWL DL." Linguistic Issues in Language Technology 1 (June 1, 2008). http://dx.doi.org/10.33011/lilt.v1i.1191.

Full text
Abstract:
We present a general approach to formally modelling corpora with multi-layered annotation in a typed logical representation language, OWL DL. By defining abstractions over the corpus data, we can generalise from a large set of individual corpus annotations, thereby inducing a lexicon model. The resulting combined corpus and lexicon model can be interpreted as a graph structure that offers flexible querying functionality beyond current XML-based query languages. Its powerful methods for characterising and checking consistency can be used for incremental model refinement. In addition, the formalisation in a graph-based structure offers the means of defining flexible lexicon views over the corpus data. These views can be tailored for linguistic inspection or to define clean interfaces with other linguistic resources. We illustrate our approach by applying it to the syntactically and semantically annotated SALSA/TIGER corpus, a collection of German newspaper text.
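The querying flexibility claimed for the graph representation can be illustrated with a few RDF triples spanning a syntactic and a semantic annotation layer. The namespace, property names, and tokens below are invented stand-ins for the SALSA/TIGER data, and rdflib's SPARQL engine substitutes for the OWL DL machinery.

```python
# Multi-layer corpus annotations as a typed graph, queried across layers.
from rdflib import Graph, Namespace, Literal, RDF

EX = Namespace("http://example.org/corpus#")
g = Graph()

g.add((EX.t1, RDF.type, EX.Token))
g.add((EX.t1, EX.form, Literal("kaufen")))
g.add((EX.t1, EX.evokesFrame, EX.Commerce_buy))    # semantic layer
g.add((EX.t1, EX.partOfSpeech, Literal("VVFIN")))  # syntactic layer

q = """
SELECT ?form WHERE {
  ?tok <http://example.org/corpus#evokesFrame> <http://example.org/corpus#Commerce_buy> ;
       <http://example.org/corpus#form> ?form .
}"""
for row in g.query(q):
    print(row.form)   # -> kaufen
```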
APA, Harvard, Vancouver, ISO, and other styles
42

Wójtowicz, Anna, and Krzysztof Wójtowicz. "A graph model for probabilities of nested conditionals." Linguistics and Philosophy, July 30, 2021. http://dx.doi.org/10.1007/s10988-021-09324-z.

Full text
Abstract:
We define a model for computing probabilities of right-nested conditionals in terms of graphs representing Markov chains. This is an extension of the model for simple conditionals from Wójtowicz and Wójtowicz (Erkenntnis, 1–35. 10.1007/s10670-019-00144-z, 2019). The model makes it possible to give a formal yet simple description of different interpretations of right-nested conditionals and to compute their probabilities in a mathematically rigorous way. In this study we focus on the problem of the probabilities of conditionals; we do not discuss questions concerning logical and metalogical issues such as setting up an axiomatic framework, inference rules, defining semantics, proving completeness, soundness etc. Our theory is motivated by the possible-worlds approach (the direct formal inspiration is the Stalnaker Bernoulli models); however, our model is generally more flexible. In the paper we focus on right-nested conditionals, discussing them in detail. The graph model makes it possible to account in a unified way for both shallow and deep interpretations of right-nested conditionals (the former being typical of Stalnaker Bernoulli spaces, the latter of McGee’s and Kaufmann’s causal Stalnaker Bernoulli models). In particular, we discuss the status of the Import-Export Principle and PCCP. We briefly discuss some methodological constraints on admissible models and analyze our model with respect to them. The study also illustrates the general problem of finding formal explications of philosophically important notions and applying mathematical methods in analyzing philosophical issues.
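The Stalnaker Bernoulli intuition behind the graph model is easy to check by simulation: evaluate "if A then B" as "the first A-world in an i.i.d. sequence of worlds is a B-world" and compare the frequency with P(B|A). The world probabilities below are arbitrary assumptions.

```python
# Monte Carlo check of the Stalnaker Bernoulli evaluation of a simple
# conditional: P(A -> B) should equal P(B|A). Probabilities are invented.
import random

random.seed(1)
worlds = ["AB", "Ab", "aB", "ab"]   # A/B truth combinations
probs = [0.2, 0.1, 0.3, 0.4]

def first_A_is_B():
    while True:                     # draw worlds until one satisfies A
        w = random.choices(worlds, probs)[0]
        if "A" in w:
            return "B" in w

trials = 100_000
freq = sum(first_A_is_B() for _ in range(trials)) / trials
print(f"simulated P(A->B) = {freq:.3f}, P(B|A) = {0.2 / 0.3:.3f}")
```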
APA, Harvard, Vancouver, ISO, and other styles
43

"Drawworks Diagnosis by a Temporal Probabilistic Method Using a Microprocessor." Bulletin of the South Ural State University series "Power Engineering" 21, no. 1 (2021): 109–21. http://dx.doi.org/10.14529/power210112.

Full text
Abstract:
The paper dwells upon diagnosing drilling rig electrics with reducing material, time, and labor costs in mind. SciVal analytics and an overview of the literature on equipment diagnosis reinforce the relevance of this research. To create an automatic fault detection system, it is proposed to combine mathematical models of Boolean objects of diagnosis with microprocessor capabilities. The research team used an Uralmash 6500/450 BMCh drilling rig to develop electrical equipment diagnosis flowcharts and a logical model of the drawworks; the researchers then estimated the costs of checking the model elements and compiled a table of fault functions. The proposal was to program the fault location algorithm in the controller programming language following the author-developed troubleshooting graph, which uses a temporal probabilistic method. To visualize the solution, the paper presents an original method that diagnoses faults of individual drilling rig components; the case study herein analyzes an inductive sensor as such a component. The method consists in using additional feedback and implementing an algorithm for automatic fault detection in a high-level programming language.
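One simple way to picture cost-aware troubleshooting over a fault-function table is a probability-per-cost rule for choosing the next check. The priors, costs, and the greedy rule below are invented illustrations, not the paper's exact temporal probabilistic procedure.

```python
# Greedy troubleshooting: check the fault with the highest probability
# per unit checking cost first. The drawworks table below is invented.
faults = {"motor": 0.5, "brake": 0.3, "sensor": 0.2}  # prior probabilities
check_cost = {"motor": 4.0, "brake": 2.0, "sensor": 1.0}

def next_check(candidates):
    """Rule of thumb: best probability-to-cost ratio goes first."""
    return max(candidates, key=lambda f: faults[f] / check_cost[f])

remaining = set(faults)
while remaining:
    c = next_check(remaining)
    print(f"check {c}: p={faults[c]}, cost={check_cost[c]}")
    remaining.remove(c)   # assume the check exonerates that element
```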
APA, Harvard, Vancouver, ISO, and other styles
44

Stankevich, Tatiana Sergeevna. "SIMULATION OF SPREADING FOREST FIRE UNDER NONSTATIONARITY AND UNCERTAINTY BY MEANS OF ARTIFICIAL INTELLIGENCE AND DEEP MACHINE LEARNING." Vestnik of Astrakhan State Technical University. Series: Management, computer science and informatics, July 25, 2019, 97–107. http://dx.doi.org/10.24143/2072-9502-2019-3-97-107.

Full text
Abstract:
The article describes the results of increasing the efficiency of operational forecasting of forest fire dynamics under nonstationarity and uncertainty by modeling fire dynamics with artificial intelligence and deep machine learning. The following methods were used: system analysis, the theory of neural networks, deep machine learning, operational forecasting of forest fire dynamics, image filtering (a modified median filter), the MoSCoW method, and the ER method. In the course of the study, forest fire forecasting models (models of treetop and ground fires) were developed using artificial neural networks. The developed models solve recognition and forecasting problems in order to determine the dynamics of forest fires in successive images and to generate images with a forecast of fire spread. The general logical scheme of the proposed forest fire forecasting models involves five stages: stage 1 – data input; stage 2 – preprocessing of input data (format check, size check, noise removal); stage 3 – object recognition using convolutional neural networks (recognition of fire data, of data on environmental factors, and of data on the nature of forest plantations); stage 4 – development of the forest fire forecast; stage 5 – output of the generated image with the operational forecast. To build and train the artificial neural networks, it is proposed to use a visual database of forest fire dynamics. The developed forest fire forecasting models are based on a tree of artificial neural networks in the form of an acyclic graph and identify dependencies between the dynamics of a forest fire and the characteristics of the external and internal environment.
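Stage 2 of the pipeline (input checks and noise removal) is straightforward to sketch; the fragment below applies a median filter to a synthetic noisy frame. The kernel size and frame dimensions are arbitrary assumptions, and the CNN stages are omitted.

```python
# Stage-2 style preprocessing: size check plus median-filter denoising
# of a synthetic grayscale frame with injected salt noise.
import numpy as np
from scipy.ndimage import median_filter

rng = np.random.default_rng(42)
frame = rng.uniform(0, 255, size=(64, 64))      # stand-in image frame
frame[rng.random(frame.shape) < 0.05] = 255.0   # salt noise

assert frame.shape == (64, 64), "size check (stage 2)"
denoised = median_filter(frame, size=5)         # noise removal (stage 2)
print("saturated pixels before:", int((frame == 255).sum()),
      "after:", int((denoised == 255).sum()))
```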
APA, Harvard, Vancouver, ISO, and other styles
45

Fominykh, Dmitry Sergeevich, Aleksey Sergeevich Bogomolov, Vladimir Andreevich Ivashchenko, Vadim Alekseevich Kushnikov, Alexander Fedorovich Rezchikov, and Leonid Yurevich Filimonyuk. "THE PROBLEM OF MINIMIZING THE PROBABILITY OF ACCIDENTS AT WELDING IN ROBOTIZED TECHNOLOGICAL COMPLEXES." Vestnik of Astrakhan State Technical University. Series: Management, computer science and informatics, October 25, 2017, 21–30. http://dx.doi.org/10.24143/2072-9502-2017-4-21-30.

Full text
Abstract:
The article studies problems, mathematical models, and algorithms that reduce the probability of emergency situations during welding in robotic complexes. Since applying methods of the calculus of variations is difficult, the task is reduced to developing and verifying the implementation of a detailed, comprehensive plan of measures for removing an emergency situation that would lead to a shutdown of the technological process. The plan is based on the causal relationships between process parameters and on the experience of dispatching personnel. It is presented as an oriented graph in which the vertices are the measures of the plan and the arcs determine their relationships and order of implementation. The conditions affecting the technological process and the implementation of the plan are represented as a production model. To verify the feasibility of the plan in accordance with the principles and conditions of its implementation, a logical function was developed and the circuit of a discrete device constructed from this function was drawn up. By specifying the values of the function's arguments, the possibility of implementing the plan can be checked at any time. The article presents a scheme for introducing the developed models into the structure of the existing complex of technical controls of a robotized welding complex. The solution algorithms are analyzed at various time intervals with the help of an information-logic scheme. Introducing the developed models and algorithms at industrial enterprises that use robotic welding systems reduces the damage from emergency situations and shutdowns of the technological process.
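The verification idea pairs a directed graph of measures with a Boolean feasibility function. The sketch below is a minimal reading of that scheme, with invented measure names and an invented condition formula standing in for the paper's production model.

```python
# Response plan as a directed acyclic graph of measures, plus a Boolean
# feasibility function over condition flags. All names are invented.
plan = {                      # measure -> measures that must precede it
    "stop_wire_feed": [],
    "retract_torch": ["stop_wire_feed"],
    "power_down_cell": ["retract_torch"],
}

def executable(conditions):
    """Discrete-device analogue: the plan is feasible iff sensors agree."""
    return conditions["arc_detected"] and not conditions["operator_inside"]

def order(plan):
    """Topological order of measures (the plan is small and acyclic)."""
    done, out = set(), []
    while len(done) < len(plan):
        for m, pre in plan.items():
            if m not in done and all(p in done for p in pre):
                done.add(m)
                out.append(m)
    return out

state = {"arc_detected": True, "operator_inside": False}
if executable(state):
    print("execute:", " -> ".join(order(plan)))
```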
APA, Harvard, Vancouver, ISO, and other styles
46

Weidner, Felix M., Julian D. Schwab, Silke D. Werle, Nensi Ikonomi, Ludwig Lausser, and Hans A. Kestler. "Capturing dynamic relevance in Boolean networks using graph theoretical measures." Bioinformatics, May 13, 2021. http://dx.doi.org/10.1093/bioinformatics/btab277.

Full text
Abstract:
Motivation: Interaction graphs can describe regulatory dependencies between compounds without capturing dynamics. In contrast, mathematical models based on interaction graphs make it possible to investigate the dynamics of biological systems. However, since the dynamic complexity of these models grows exponentially with their size, exhaustive analyses of the dynamics, and consequently the screening of all possible interventions, eventually become infeasible. Thus, we designed an approach to identify dynamically relevant compounds based on the static network topology. Results: Here, we present a method based only on static properties to identify dynamically influential nodes. Coupling vertex betweenness and determinative power, we could capture nodes relevant for changing dynamics with an accuracy of 75% in a set of 35 published logical models. Further analyses of the selected compounds’ connectivity unravelled a new class of not highly connected nodes with high impact on the networks’ dynamics, which we call gatekeepers. We validated our method’s working concept on logical models, and it can readily be scaled up to complex interaction networks where dynamic analyses are not even feasible. Availability and implementation: Code is freely available at https://github.com/sysbio-bioinf/BNStatic. Supplementary information: Supplementary data are available at Bioinformatics online.
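The static screening can be approximated with off-the-shelf graph measures. In the sketch below, vertex betweenness is combined with out-degree as a crude stand-in for determinative power (the real measure needs the Boolean update functions, which a bare graph lacks); the network and the mixing weight are invented.

```python
# Rank compounds of an interaction graph by betweenness plus an
# out-degree proxy for determinative power. Edges are invented.
import networkx as nx

g = nx.DiGraph([("A", "B"), ("B", "C"), ("C", "A"),
                ("B", "D"), ("D", "E"), ("E", "B")])

betw = nx.betweenness_centrality(g)
det_proxy = dict(g.out_degree())   # assumption: proxy only

score = {v: betw[v] + 0.5 * det_proxy[v] for v in g}
ranking = sorted(score, key=score.get, reverse=True)
print("candidate dynamically relevant nodes:", ranking)
```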
APA, Harvard, Vancouver, ISO, and other styles
47

Cui, Hong, Bruce Ford, Julian Starr, James Macklin, Anton Reznicek, Noah Giebink, Dylan Longert, Étienne Léveillé-Bourret, and Limin Zhang. "Author-Driven Computable Data and Ontology Production for Taxonomists." Biodiversity Information Science and Standards 5 (September 27, 2021). http://dx.doi.org/10.3897/biss.5.75741.

Full text
Abstract:
It takes great effort to manually or semi-automatically convert free-text phenotype narratives (e.g., morphological descriptions in taxonomic works) to a computable format before they can be used in large-scale analyses. We argue that neither a manual curation approach nor an information extraction approach based on machine learning is a sustainable solution to produce computable phenotypic data that are FAIR (Findable, Accessible, Interoperable, Reusable) (Wilkinson et al. 2016). This is because these approaches do not scale to all biodiversity, and they do not stop the publication of free-text phenotypes that would need post-publication curation. In addition, both manual and machine learning approaches face great challenges: the problem of inter-curator variation (curators interpret/convert a phenotype differently from each other) in manual curation, and keywords to ontology concept translation in automated information extraction, make it difficult for either approach to produce data that are truly FAIR. Our empirical studies show that inter-curator variation in translating phenotype characters to Entity-Quality statements (Mabee et al. 2007) is as high as 40% even within a single project. With this level of variation, curated data integrated from multiple curation projects may still not be FAIR. The key causes of this variation have been identified as semantic vagueness in original phenotype descriptions and difficulties in using standardized vocabularies (ontologies). We argue that the authors describing characters are the key to the solution. Given the right tools and appropriate attribution, the authors should be in charge of developing a project's semantics and ontology. This will speed up ontology development and improve the semantic clarity of the descriptions from the moment of publication. In this presentation, we will introduce the Platform for Author-Driven Computable Data and Ontology Production for Taxonomists, which consists of three components: a web-based, ontology-aware software application called 'Character Recorder,' which features a spreadsheet as the data entry platform and provides authors with the flexibility of using their preferred terminology in recording characters for a set of specimens (this application also facilitates semantic clarity and consistency across species descriptions); a set of services that produce RDF graph data, collects terms added by authors, detects potential conflicts between terms, dispatches conflicts to the third component and updates the ontology with resolutions; and an Android mobile application, 'Conflict Resolver,' which displays ontological conflicts and accepts solutions proposed by multiple experts. Fig. 1 shows the system diagram of the platform.
The presentation will consist of: a report on the findings from a recent survey of 90+ participants on the need for a tool like Character Recorder; a methods section that describes how we provide semantics to an existing vocabulary of quantitative characters through a set of properties that explain where and how a measurement (e.g., length of perigynium beak) is taken. We also report on how a custom color palette of RGB values obtained from real specimens or high-quality specimen images, can be used to help authors choose standardized color descriptions for plant specimens; and a software demonstration, where we show how Character Recorder and Conflict Resolver can work together to construct both human-readable descriptions and RDF graphs using morphological data derived from species in the plant genus Carex (sedges). The key difference of this system from other ontology-aware systems is that authors can directly add needed terms to the ontology as they wish and can update their data according to ontology updates. The software modules currently incorporated in Character Recorder and Conflict Resolver have undergone formal usability studies. We are actively recruiting Carex experts to participate in a 3-day usability study of the entire system of the Platform for Author-Driven Computable Data and Ontology Production for Taxonomists. Participants will use the platform to record 100 characters about one Carex species. In addition to usability data, we will collect the terms that participants submit to the underlying ontology and the data related to conflict resolution. Such data allow us to examine the types and the quantities of logical conflicts that may result from the terms added by the users and to use Discrete Event Simulation models to understand if and how term additions and conflict resolutions converge. We look forward to a discussion on how the tools (Character Recorder is online at http://shark.sbs.arizona.edu/chrecorder/public) described in our presentation can contribute to producing and publishing FAIR data in taxonomic studies.
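As a toy picture of the RDF output the platform's services are described as producing, the fragment below serializes one invented Entity-Quality-style record for a Carex character; real output would use proper ontology IRIs (e.g., PATO terms) rather than an example namespace.

```python
# One invented Entity-Quality-style record as RDF triples.
from rdflib import Graph, Namespace, Literal

EX = Namespace("http://example.org/carex#")
g = Graph()

g.add((EX.perigynium_beak, EX.hasQuality, EX.length))
g.add((EX.perigynium_beak, EX.measuredValue, Literal("1.2")))
g.add((EX.perigynium_beak, EX.unit, Literal("mm")))

print(g.serialize(format="turtle"))
```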
APA, Harvard, Vancouver, ISO, and other styles
