To view the other types of publications on this topic, follow this link: Why3 tool for deductive verification.

Journal articles on the topic "Why3 tool for deductive verification"

Consult the top 27 journal articles for your research on the topic "Why3 tool for deductive verification."

Next to every entry in the bibliography there is an "Add to bibliography" option. Use it, and the bibliographic reference for the chosen work will be formatted automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).

You can also download the full text of the scholarly publication in PDF format and read an online annotation of the work, if the relevant parameters are available in the metadata.

Browse journal articles from a wide range of subject areas and compile your bibliography correctly.

1

Shelekhov, Vladimir Ivanovich. "TRANSFORMATION AND VERIFICATION OF THE OS PROGRAM SORTING DEVICES IN A COMPUTER BUS". System Informatics, no. 18 (2021): 1–34. http://dx.doi.org/10.31144/si.2307-6410.2021.n18.p1-34.

Annotation:
The transformation and verification of the bus_sort_breadthfirst program, which belongs to the Linux OS kernel and implements sorting of devices, are described. The C program is transformed into the cP language, performing macro unfolding, structure changes, and elimination of pointers. The transformed program is then translated into the WhyML functional language, a specification is constructed for the resulting program, and deductive verification is carried out in the Why3 tool.
2

Fortin, Jean, and Frédéric Gava. "BSP-Why: A Tool for Deductive Verification of BSP Algorithms with Subgroup Synchronisation". International Journal of Parallel Programming 44, no. 3 (March 31, 2015): 574–97. http://dx.doi.org/10.1007/s10766-015-0360-y.

3

Santos, César, Francisco Martins, and Vasco Thudichum Vasconcelos. "Deductive Verification of Parallel Programs Using Why3". Electronic Proceedings in Theoretical Computer Science 189 (August 19, 2015): 128–42. http://dx.doi.org/10.4204/eptcs.189.11.

4

Shelekhov, V. I. "Applying Program Transformations for Deductive Verification of the List Reverse Program". Programmnaya Ingeneria 12, no. 3 (May 19, 2021): 127–39. http://dx.doi.org/10.17587/prin.12.127-139.

Annotation:
Program transformation methods that simplify the deductive verification of programs with recursive data types are investigated, taking the list reverse program as an example. A source program in the C language is translated to the cP functional language, which includes no pointers; the resulting program is translated further to the WhyML language to perform deductive verification. The cP language includes the same constructs as the C language except pointers. In the C program, all actions that involve pointers are replaced by equivalent fragments without pointers; these replacements are performed by special transformations using the results of dataflow analysis of the program. Three variants of deductive verification of the transformed list reverse program were performed in the Why3 verification platform with SMT solvers (Z3 4.8.6, CVC3 2.4.1, CVC4 1.7). First, the recursive WhyML program, supplied with specifications, was successfully verified automatically using only SMT solvers. Second, the recursive program was translated to the P predicate language; correctness formulae were constructed for the P program and translated further to the Why3 specification language. Proving these formulae was as easy as in the first variant, although the correctness formulae of the two variants differed. Third, the imperative WhyML program, containing a while loop with additional invariant specifications, was verified; the proof was easy but not automatic. Thus, for deductive verification, the recursive program variant appears preferable to the imperative one.
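The third variant, a loop carrying an invariant, can be sketched in Python (an illustration of the invariant idea only; the paper works in WhyML on pointer-free lists, and the run-time assertions below stand in for what Why3 proves statically):

```python
def reverse(xs):
    """Iterative list reverse, checking the while-loop invariant at run time."""
    original = list(xs)
    done, todo = [], list(xs)
    while todo:
        # Loop invariant: un-reversing the finished part and appending
        # the remaining part restores the original list.
        assert list(reversed(done)) + todo == original
        done.insert(0, todo.pop(0))
    assert done == list(reversed(original))  # postcondition
    return done
```

In the deductive setting the invariant is not checked per iteration but proved once to be inductive, which is why the imperative variant needs the extra annotation the annotation above mentions.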
5

Lanzinger, Florian, Alexander Weigl, Mattias Ulbrich, and Werner Dietl. "Scalability and precision by combining expressive type systems and deductive verification". Proceedings of the ACM on Programming Languages 5, OOPSLA (October 20, 2021): 1–29. http://dx.doi.org/10.1145/3485520.

Annotation:
Type systems and modern type checkers can be used very successfully to obtain formal correctness guarantees with little specification overhead. However, type systems in practical scenarios have to trade precision for decidability and scalability. Tools for deductive verification, on the other hand, can prove general properties in more cases than a typical type checker can, but they do not scale well. We present a method to complement the scalability of expressive type systems with the precision of deductive program verification approaches. This is achieved by translating the type uses whose correctness the type checker cannot prove into assertions in a specification language, which can be dealt with by a deductive verification tool. Type uses whose correctness the type checker can prove are instead turned into assumptions to aid the verification tool in finding a proof. Our novel approach is introduced both conceptually, for a simple imperative language, and practically, by a concrete implementation for the Java programming language. The usefulness and power of our approach have been evaluated by discharging known false positives from a real-world program and by a small case study.
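The division of labor described above can be sketched schematically (hypothetical field and function names, not the paper's implementation):

```python
def type_uses_to_obligations(type_uses):
    """Turn each type use into a verification-condition fragment:
    uses the type checker proved become assumptions for the verifier,
    the rest become assertions the verifier must discharge."""
    obligations = []
    for use in type_uses:
        kind = "assume" if use["proved_by_checker"] else "assert"
        obligations.append((kind, use["property"]))
    return obligations

# A nullness property the checker proved becomes an assumption;
# one it could not prove becomes an assertion.
uses = [
    {"property": "x != null", "proved_by_checker": True},
    {"property": "y != null", "proved_by_checker": False},
]
```

The point of the split is that the verifier never re-proves what the type checker already established, and the type checker's unproven obligations are exactly the residue handed to the slower, more precise tool.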
6

Watanabe, Yasunari, Kiran Gopinathan, George Pîrlea, Nadia Polikarpova, and Ilya Sergey. "Certifying the synthesis of heap-manipulating programs". Proceedings of the ACM on Programming Languages 5, ICFP (August 22, 2021): 1–29. http://dx.doi.org/10.1145/3473589.

Annotation:
Automated deductive program synthesis promises to generate executable programs from concise specifications, along with proofs of correctness that can be independently verified using third-party tools. However, an attempt to exercise this promise using existing proof-certification frameworks reveals significant discrepancies in how proof derivations are structured for two different purposes: program synthesis and program verification. These discrepancies make it difficult to use certified verifiers to validate synthesis results, forcing one to write an ad-hoc translation procedure from synthesis proofs to correctness proofs for each verification backend. In this work, we address this challenge in the context of the synthesis and verification of heap-manipulating programs. We present a technique for principled translation of deductive synthesis derivations (a.k.a. source proofs) into deductive target proofs about the synthesised programs in the logics of interactive program verifiers. We showcase our technique by implementing three different certifiers for programs generated via SuSLik, a Separation Logic-based tool for automated synthesis of programs with pointers, in foundational verification frameworks embedded in Coq: Hoare Type Theory (HTT), Iris, and Verified Software Toolchain (VST), producing concise and efficient machine-checkable proofs for characteristic synthesis benchmarks.
7

Cohen, Joshua M., and Philip Johnson-Freyd. "A Formalization of Core Why3 in Coq". Proceedings of the ACM on Programming Languages 8, POPL (January 5, 2024): 1789–818. http://dx.doi.org/10.1145/3632902.

Annotation:
Intermediate verification languages like Why3 and Boogie have made it much easier to build program verifiers, transforming the process into a logic compilation problem rather than a proof automation one. Why3 in particular implements a rich logic for program specification with polymorphism, algebraic data types, recursive functions and predicates, and inductive predicates; it translates this logic to over a dozen solvers and proof assistants. Accordingly, it serves as a backend for many tools, including Frama-C, EasyCrypt, and GNATProve for Ada SPARK. But how can we be sure that these tools are correct? The alternate foundational approach, taken by tools like VST and CakeML, provides strong guarantees by implementing the entire toolchain in a proof assistant, but these tools are harder to build and cannot directly take advantage of SMT solver automation. As a first step toward enabling automated tools with similar foundational guarantees, we give a formal semantics in Coq for the logic fragment of Why3. We show that our semantics are useful by giving a correct-by-construction natural deduction proof system for this logic, using this proof system to verify parts of Why3's standard library, and proving sound two of Why3's transformations used to convert terms and formulas into the simpler logics supported by the backend solvers.
8

Devyanin, P. N., and M. A. Leonova. "The techniques of formalization of OS Astra Linux Special Edition access control model using Event-B formal method for verification using Rodin and ProB". Prikladnaya Diskretnaya Matematika, no. 52 (2021): 83–96. http://dx.doi.org/10.17223/20710410/52/5.

Annotation:
The paper presents techniques for specifying the access control model of OS Astra Linux Special Edition (the MROSL DP-model) in a formalized notation (using the Event-B formal method). The techniques are based on the use of several global types, the separation of general total functions into specific total functions, and a reduction in the number of invariants and event guards that iterate over subsets of a certain set. Applying these techniques simplified the automated deductive verification of the formalized notation with the Rodin tool and adapted the model to verification by model checking with the ProB tool. The techniques can be useful in the further development of the MROSL DP-model, as well as in the development of other access control models and their verification with the appropriate tools.
9

Elad, Neta, Oded Padon, and Sharon Shoham. "An Infinite Needle in a Finite Haystack: Finding Infinite Counter-Models in Deductive Verification". Proceedings of the ACM on Programming Languages 8, POPL (January 5, 2024): 970–1000. http://dx.doi.org/10.1145/3632875.

Annotation:
First-order logic, and quantifiers in particular, are widely used in deductive verification of programs and systems. Quantifiers are essential for describing systems with unbounded domains, but prove difficult for automated solvers. Significant effort has been dedicated to finding quantifier instantiations that establish unsatisfiability of quantified formulas, thus ensuring validity of a system’s verification conditions. However, in many cases the formulas are satisfiable—this is often the case in intermediate steps of the verification process, e.g., when an invariant is not yet inductive. For such cases, existing tools are limited to finding finite models as counterexamples. Yet, some quantified formulas are satisfiable but only have infinite models, which current solvers are unable to find. Such infinite counter-models are especially typical when first-order logic is used to approximate the natural numbers, the integers, or other inductive definitions such as linked lists, which is common in deductive verification. The inability of solvers to find infinite models makes them diverge in these cases, providing little feedback to the user as they try to make progress in their verification attempts. In this paper, we tackle the problem of finding such infinite models, specifically, finite representations thereof that can be presented to the user of a deductive verification tool. These models give insight into the verification failure, and allow the user to identify and fix bugs in the modeling of the system and its properties. Our approach consists of three parts. First, we introduce symbolic structures as a way to represent certain infinite models, and show they admit an efficient model checking procedure. Second, we describe an effective model finding procedure that symbolically explores a given (possibly infinite) family of symbolic structures in search of an infinite model for a given formula. Finally, we identify a new decidable fragment of first-order logic that extends and subsumes the many-sorted variant of EPR, where satisfiable formulas always have a model representable by a symbolic structure within a known family, making our model finding procedure a decision procedure for that fragment. We evaluate our approach on examples from the domains of distributed consensus protocols and of heap-manipulating programs (specifically, linked lists). Our implementation quickly finds infinite counter-models that demonstrate the source of verification failures in a simple way, while state-of-the-art SMT solvers and theorem provers such as Z3, cvc5, and Vampire diverge or return “unknown”.
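A textbook example of a satisfiable first-order formula with no finite models (the standard injective-but-not-surjective example, not one taken from the paper) is:

```latex
\forall x\,\forall y\,\bigl(f(x)=f(y)\rightarrow x=y\bigr)
\;\wedge\;
\exists y\,\forall x\; f(x)\neq y
```

An injective self-map that misses some element can exist only on an infinite domain (on a finite set, injective implies surjective), so a solver restricted to finite model finding can never report a counter-model for such a formula and instead diverges, which is exactly the failure mode the paper addresses.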
10

Shelekhov, Vladimir Ivanovich. "COMPARISON OF AUTOMATA-BASED ENGINEERING METHOD AND EVENT-B MODELING METHOD". System Informatics, no. 18 (2021): 53–84. http://dx.doi.org/10.31144/si.2307-6410.2021.n18.p53-84.

Annotation:
It is shown that a specification in the Event-B language can be represented by an automata-based program as a non-deterministic composition of simple conditional statements, which corresponds to a narrow subclass of automata-based programs. A specification in Event-B is monolithic: to build a specification, there are no means of composition other than refinement, which extends a previously built specification. The automata-based engineering method and the Event-B modeling method are compared on two example tasks. Previous solutions to the bridge traffic control problem in the Event-B system are complicated; a simpler solution with deductive verification in the Rodin tool is proposed. The effectiveness of the Event-B verification methods is confirmed by the discovery of three non-trivial errors in our solution.
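The correspondence between Event-B events and guarded conditionals can be illustrated with a tiny Python sketch of the bridge example (an informal rendering of the idea only; names are invented, and this is neither Event-B nor Rodin syntax):

```python
import random

# State of the bridge-traffic example: at most `limit` cars on the bridge.
state = {"cars": 0, "limit": 3}

# Each event is a guarded action ("when guard then action"), with no else
# branch; the machine is their non-deterministic composition.
events = [
    ("enter", lambda s: s["cars"] < s["limit"],
              lambda s: s.update(cars=s["cars"] + 1)),
    ("leave", lambda s: s["cars"] > 0,
              lambda s: s.update(cars=s["cars"] - 1)),
]

def step(s):
    """One machine step: non-deterministically fire any enabled event."""
    enabled = [(name, act) for name, guard, act in events if guard(s)]
    name, act = random.choice(enabled)
    act(s)
    assert 0 <= s["cars"] <= s["limit"]  # the machine invariant
    return name
```

Deductive verification in Rodin proves the invariant is preserved by every event rather than checking it per step as this sketch does.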
11

Harper, Robert, and Daniel R. Licata. "Mechanizing metatheory in a logical framework". Journal of Functional Programming 17, no. 4-5 (July 2007): 613–73. http://dx.doi.org/10.1017/s0956796807006430.

Annotation:
The LF logical framework codifies a methodology for representing deductive systems, such as programming languages and logics, within a dependently typed λ-calculus. In this methodology, the syntactic and deductive apparatus of a system is encoded as the canonical forms of associated LF types; an encoding is correct (adequate) if and only if it defines a compositional bijection between the apparatus of the deductive system and the associated canonical forms. Given an adequate encoding, one may establish metatheoretic properties of a deductive system by reasoning about the associated LF representation. The Twelf implementation of the LF logical framework is a convenient and powerful tool for putting this methodology into practice. Twelf supports both the representation of a deductive system and the mechanical verification of proofs of metatheorems about it. The purpose of this article is to provide an up-to-date overview of the LF λ-calculus, the LF methodology for adequate representation, and the Twelf methodology for mechanizing metatheory. We begin by defining a variant of the original LF language, called Canonical LF, in which only canonical forms (long βη-normal forms) are permitted. This variant is parameterized by a subordination relation, which enables modular reasoning about LF representations. We then give an adequate representation of a simply typed λ-calculus in Canonical LF, both to illustrate adequacy and to serve as an object of analysis. Using this representation, we formalize and verify the proofs of some metatheoretic results, including preservation, determinacy, and strengthening. Each example illustrates a significant aspect of using LF and Twelf for formalized metatheory.
12

Alpuente, María, Daniel Pardo, and Alicia Villanueva. "Abstract Contract Synthesis and Verification in the Symbolic 𝕂 Framework". Fundamenta Informaticae 177, no. 3-4 (December 10, 2020): 235–73. http://dx.doi.org/10.3233/fi-2020-1989.

Annotation:
In this article, we propose a symbolic technique that can be used for automatically inferring software contracts from programs that are written in a non-trivial fragment of C, called KERNELC, that supports pointer-based structures and heap manipulation. Starting from the semantic definition of KERNELC in the 𝕂 semantic framework, we enrich the symbolic execution facilities recently provided by 𝕂 with novel capabilities for contract synthesis that are based on abstract subsumption. Roughly speaking, we define an abstract symbolic technique that axiomatically explains the execution of any (modifier) C function by using other (observer) routines in the same program. We implemented our technique in the automated tool KINDSPEC 2.1, which generates logical axioms that express pre- and post-condition assertions which define the precise input/output behavior of the C routines. Thanks to the integrated support for symbolic execution and deductive verification provided by 𝕂, some synthesized axioms that cannot be guaranteed to be correct by construction due to abstraction can finally be verified in our setting with little effort.
13

Abbasi, Rosa, Jonas Schiffl, Eva Darulova, Mattias Ulbrich, and Wolfgang Ahrendt. "Combining rule- and SMT-based reasoning for verifying floating-point Java programs in KeY". International Journal on Software Tools for Technology Transfer 25, no. 2 (March 8, 2023): 185–204. http://dx.doi.org/10.1007/s10009-022-00691-x.

Annotation:
Deductive verification has been successful in verifying interesting properties of real-world programs. One notable gap is the limited support for floating-point reasoning. This is unfortunate, as floating-point arithmetic is particularly unintuitive to reason about due to rounding as well as the presence of the special values infinity and 'Not a Number' (NaN). In this article, we present the first floating-point support in a deductive verification tool for the Java programming language. Our support in the KeY verifier handles floating-point arithmetic, transcendental functions, and potentially rounding type casts. We achieve this with a combination of delegation to external SMT solvers on the one hand, and KeY-internal, rule-based reasoning on the other hand, exploiting the complementary strengths of both worlds. We evaluate this integration on new benchmarks and show that this approach is powerful enough to prove the absence of floating-point special values, often a prerequisite for correct programs, as well as functional properties, for realistic benchmarks.
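The special values and rounding effects that make floating-point reasoning unintuitive are easy to demonstrate; a quick Python illustration (Python is used here only for brevity, the paper targets Java, whose `double` follows the same IEEE 754 semantics):

```python
import math

# NaN compares unequal to everything, including itself,
# so `x == x` is not a tautology over floats.
nan = float("nan")
assert nan != nan

# infinity - infinity is not 0 but NaN.
inf = float("inf")
assert math.isnan(inf - inf)

# Rounding breaks familiar algebra: 0.1 + 0.2 is not exactly 0.3.
assert 0.1 + 0.2 != 0.3
```

Proving the absence of NaN and infinity, as the article does, is precisely what licenses the intuitive algebraic reasoning these examples violate.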
14

Hahanova, A., V. Hahanov, S. Chumachenko, E. Litvinova, and D. Rakhlis. "VECTOR-DRIVEN LOGIC AND STRUCTURE FOR TESTING AND DEDUCTIVE FAULT SIMULATION". Radio Electronics, Computer Science, Control, no. 3 (October 6, 2021): 69–85. http://dx.doi.org/10.15588/1607-3274-2021-3-7.

Annotation:
Context. It is known that data structures are decisive for the creation of efficient parallel algorithms and high-performance computing devices. The development of mathematically perfect and technologically simple data structures therefore takes about 80 percent of the design time, while about 20 percent of the time and material resources are spent on the algorithms and their hardware-software coding. This leads to a search for primitives of data structures that significantly simplify the parallel high-performance algorithms operating on them. Models and methods for the testing and simulation of digital systems are proposed that carry certain advantages of quantum computing, namely the implementation of vector qubit data structures, into the technology of classical computational processes. Objective. The goal of the work is the development of an innovative technology for qubit-vector synthesis and deductive analysis of tests for their verification, based on vector data structures that greatly simplify algorithms which can be embedded as BIST components in digital systems on chips. Method. Deductive fault simulation is used to obtain analytical expressions focused on transporting fault lists through a functional or logical element based on the xor-operation, which serves as a measure of the similarity-difference between a test, a function, and faults, each specified in the same way in one of the formats: a table, a graph, or an equation. A binary vector is proposed as the most technologically advanced primitive of data structures for defining logical functionality for the purpose of parallel synthesis and analysis of digital systems. The parallelism of solving combinatorial problems is a physical property of quantum computing; in classical computing, for parallel simulation and fault diagnosis, it is provided by unitary-coded data structures at the cost of excess memory. Results. 1) A method for the analytical synthesis of deductive logic for functional elements at the gate level and the register transfer level has been developed. 2) A deductive processor for fault simulation based on transporting input fault lists or fault vectors to the external outputs of digital circuits is proposed. 3) The qubit-vector form of defining logic and methods of qubit synthesis of deductive equations for fault simulation are described. 4) A qubit-vector method for the synthesis of tests using derivatives calculated by the vector coverage of logic has been developed. 5) The models and methods are verified on test examples in a software implementation of the structures and algorithms. Conclusions. The scientific novelty lies in a new paradigm of technology for the synthesis of deductive RTL logic based on a metric test equation. A vector form of structure description is introduced, which makes it possible to apply well-known technologies for the synthesis and analysis of logic circuit tests to effectively solve the problems of testing graph structures and state-machine models of digital devices. The practical significance is reflected in the examples of analytical synthesis of deductive logic for functional elements at the gate level and the register transfer level. A deductive processor for fault simulation is proposed, focused on implementation as a BIST tool for online testing, simulation, and fault diagnosis in digital systems on chips. A qubit-vector form of digital system description is proposed, which surpasses existing methods of computing device development on the metrics of manufacturability, compactness, speed, and quality. A software application has been developed that implements the main testing, simulation, and diagnostics services; it is used in the educational process to study the advantages of qubit-vector data structures and algorithms. The computational complexity of the synthesis processes and of the deductive formulas for logic, and their usage in fault simulation, are given.
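The fault-list transport described in the Method section can be illustrated with the classic set-based deductive rules for a two-input AND gate (these are textbook deductive fault simulation rules; the paper recasts such rules in qubit-vector form via the xor-operation):

```python
def and_gate_fault_list(a, b, fa, fb):
    """Deductive fault-list propagation through a 2-input AND gate.
    `fa`/`fb` are the sets of faults that flip inputs a/b; returns the
    fault-free output and the set of faults observable at the output."""
    good = a & b
    if a == 1 and b == 1:      # both inputs non-controlling:
        out = fa | fb          # flipping either input flips the output
    elif a == 0 and b == 0:    # both controlling:
        out = fa & fb          # a fault must flip both inputs
    elif a == 0:               # only a controlling:
        out = fa - fb          # flip a while b stays at 1
    else:                      # only b controlling
        out = fb - fa
    # The output's own stuck-at fault at the complement of the good value.
    out = out | {f"out/sa{1 - good}"}
    return good, out
```

A deductive processor applies such rules gate by gate, so one pass over the circuit yields the detectable-fault list at every output for a given test pattern.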
15

Shelekhov, V. I. "Automata-based Software Engineering with Event-B". Programmnaya Ingeneria 13, no. 4 (April 20, 2022): 155–67. http://dx.doi.org/10.17587/prin.13.155-167.

Annotation:
A new automata-based programming language has been built by extending the Event-B specification language. When developing models in Event-B, it thus becomes possible to use automata-based methods in addition to the popular refinement method. Automata-based software engineering, supported by deductive verification in Event-B, can be used successfully for the development of control systems in critical infrastructure, where the cost of error is high. A model of the Event-B specification in the automata-based programming language is constructed. The Event-B specification is a chain of machine refinements; a machine is defined by a non-deterministic composition of events, and an event is the equivalent of a simple conditional statement without an else branch. In automata-based software engineering, a number of other compositions are allowed in addition to non-deterministic composition. The main one is hypergraphic composition with respect to control states; parallel process composition and object-oriented and aspect-oriented compositions are also possible. A process can be called from another process, messages can be sent and received, and there are various timing actions. It is not difficult to rewrite an automata-based program into an Event-B specification. Automata-based software engineering with Event-B is demonstrated on the problem of traffic control on a bridge from the Event-B system manual. A simpler solution with verification in the Rodin tool is proposed. The effectiveness of the Event-B verification methods is confirmed by the discovery of three non-trivial errors in our solution.
17

Gromek, Paweł. "Societal dimension of disaster risk reduction. Conceptual framework". Zeszyty Naukowe SGSP 77 (March 31, 2021): 35–54. http://dx.doi.org/10.5604/01.3001.0014.8412.

Annotation:
The current disaster risk reduction (DRR) approach does not reflect the societal dimension of the factors that shape risk and safety. The research objective is to elaborate a model of DRR in its societal dimension, respecting not only the engineering component of disaster risk but also how people perceive it. The methodology is based on a literature review and a deductive investigation for verifying ideas and assumptions. As a first result, a safety structure is presented: at the highest level of generality, it consists of real safety and the sense of safety, the latter a derivative of four components: the sense of being informed, the sense of perpetration, the sense of confidence, and the sense of anchoring. In analogy to safety, risk can be characterized by an engineering component and risk perception, where perception is structured in direct connection to the sense of safety. A morphological connection of the risk structure, the disaster risk reduction structure, and the two signs of risk (positive and negative) makes it possible to elaborate the model, which could prove a valuable tool in the theory and practice of risk reduction.
18

Adamczyk, Mikołaj, Piotr Parasiewicz, Paolo Vezza, Paweł Prus, and Giovanni De Cesare. "Empirical Validation of MesoHABSIM Models Developed with Different Habitat Suitability Criteria for Bullhead Cottus Gobio L. as an Indicator Species". Water 11, no. 4 (April 8, 2019): 726. http://dx.doi.org/10.3390/w11040726.

Annotation:
Application of instream habitat models such as the Mesohabitat Simulation Model (MesoHABSIM) is becoming increasingly popular. Such models can predict alteration to a river physical habitat caused by hydropower operation or river training. They are a tool for water management planning, especially in terms of requirements of the Water Framework Directive. Therefore, model verification studies, which investigate the accuracy and reliability of the results generated, are essential. An electrofishing survey was conducted in September 2014 on the Stura di Demonte River located in north-western Italy. One hundred and sixteen bullhead—Cottus gobio L.—were captured in 80 pre-exposed area electrofishing (PAE) grids. Observations of bullhead distribution in various habitats were used to validate MesoHABSIM model predictions created with inductive and deductive habitat suitability indices. The inductive statistical models used electrofishing data obtained from multiple mountainous streams, analyzed with logistic regression. The deductive approach was based on conditional habitat suitability criteria (CHSC) derived from expert knowledge and information gathered from the literature about species behaviour and habitat use. The results of model comparison and validation show that although the inductive models are more precise and reflect site- and species-specific characteristics, the CHSC model provides quite similar results. We propose to use inductive models for detailed planning of measures that could potentially impair riverine ecosystems at a local scale, since the CHSC model provides general information about habitat suitability and use of such models is advised in pre-development or generic scale studies. However, the CHSC model can be further calibrated with localized electrofishing data at a lower cost than development of an inductive model.
19

Chisty, Nur Mohammad Ali, and Harshini Priya Adusumalli. "Applications of Artificial Intelligence in Quality Assurance and Assurance of Productivity". ABC Journal of Advanced Research 11, no. 1 (January 28, 2022): 23–32. http://dx.doi.org/10.18034/abcjar.v11i1.625.

Annotation:
Probabilistic intelligence is vital in current management and technology: it is simpler to persuade readers when a manager or engineer reports problems with objective statistical data. Statistical data support the evaluation of the true status, and cause and effect can be induced; the rationale is then established using deductive logic together with statistical data verification and induction. Quality practitioners should develop statistical thinking skills and fully grasp the three quality principles: "essence of substance", "process of business", and "psychology". Traditional quality data include variables, attributes, faults, internal and external failure costs, etc., obtained through data collection, data processing, statistical analysis, root cause analysis, and so on, and quality practitioners used to rely on these so-called professional qualities to get a job. If quality practitioners do not keep up with the times, quality data collection, organization, analysis, and monitoring will become confusing or challenging. Increasingly, precision tool machines are embedded in various IoT systems, gathering machine operation data for component diagnostics and life estimation, consumables and utilization monitoring, and various data analyses. Data mining and forecasting have steadily merged into data science, which is the future of the quality field and worth attention.
20

Dmitriev M., Gameliak I., Ostroverkhyi O. and Dmytrychenko A. „USE OF PRODUCTION WASTE IN ROAD CONSTRUCTION OF UKRAINE“. National Transport University Bulletin 1, No. 48 (2021): 143–56. http://dx.doi.org/10.33744/2308-6645-2021-1-48-143-156.

Abstract:
The analytical review analyzes domestic quality standards for road construction materials and structures that incorporate industrial waste (metallurgical slag, thermal plant fly-ashes, fly-ash mixes, etc.). This issue is not new, and some of its aspects have already been covered in specialized publications and in the media. However, the generalized material presented here will be useful not only for road construction specialists but also for anyone who would like to know how and why industrial waste is used in road construction. The review gives general concepts and describes the principles and features of the building norms, regulations, and standards of supervision and control that have operated in Ukraine from the mass road construction of the 1970s–80s to the present. Based on this analysis, conclusions are formulated that should be taken into account when initiating the use of waste from the metallurgical and energy industries in the country's road construction sector. The materials presented in this review are intended for government officials and industry and road professionals, including developers of building norms, researchers, scientists, designers, builders, suppliers, and teachers and students of educational institutions that train personnel for construction, in particular of roads and airfields. Qualitative research tools were used to collect, verify, and analyze information, which was gathered and checked according to the needs of the analysis. The research was performed iteratively. To collect the information needed for the analysis, two categories of information resources were used: human and documentary. The adequacy, integrity, reliability, validity, and applicability of the information were verified by comparing information obtained by alternative means and by applying deductive and consistent methods of analysis.
The base of normative and technical documents, together with human sources, served as the main sources of information. Human resources included secondary sources (consultants' specialists) and primary sources (respondents: specialists of non-governmental organizations and enterprises of the road industry of Ukraine). Information was obtained from respondents through oral surveys and interviews. To ensure the objectivity and completeness of the information obtained, an informal, unstructured survey technique was used, followed by verification of the information received from each respondent against documentary information from the main sources. The authors of the review express their deep gratitude to those who kindly provided the necessary information. KEYWORDS: SLAG, PRODUCTION WASTE, MIXES OF FLY-ASH, MATERIALS, STANDARDS, NORMATIVE DOCUMENTS, ROAD CONSTRUCTION.
22

Nikiforov, Alexander L. „Is “Analytic Philosophy” a Philosophy?“ Russian Journal of Philosophical Sciences 63, No. 8 (01.12.2020): 7–21. http://dx.doi.org/10.30727/0235-1188-2020-63-8-7-21.

Abstract:
The article discusses the nature of analytic philosophy. It shows that in the 1920s–1940s it was a distinct philosophical school whose representatives were united by certain shared principles. Analytic philosophers saw the main task of philosophy in the analysis of the language of the natural sciences, in establishing logical connections between scientific propositions, in the empirical substantiation of scientific theories, and in the elimination of speculative concepts and propositions from the language of science. The tool for such analysis was the mathematical logic created at the beginning of the 20th century by G. Frege, A.N. Whitehead, and B. Russell. Another characteristic feature of the analytic tradition was a negative attitude toward philosophical speculation. Adherents of this tradition believed that philosophy does not provide knowledge about the world and is therefore not a science. Analytic philosophers made a significant contribution to the methodology of scientific knowledge, offering an accurate description of the hypothetico-deductive structure of scientific theories, of methods of scientific explanation and prediction, and of the verification, confirmation, and refutation of scientific statements. In the late 1930s, most analytic philosophers emigrated to England and the United States. The analytic movement gradually lost its integrity and the features of a philosophical school: mathematical logic was rejected as the main means of analysis, and the connection with the natural sciences was lost. In the second half of the 20th century, analytic philosophy turned from a specific philosophical school into a certain style of thinking among philosophers of various research areas and orientations.
23

De Souza, Carina Lopes, and Tássia A. Gervasoni. „Os impactos da desigualdade à cidadania a partir da inefetividade do direito à moradia: um estudo de caso nas ocupações Beira Trilho no município de Passo Fundo/RS / The impacts of inequality on citizenship from the ineffectiveness of the right to housing: a case study in Beira Trilho occupations in the municipality of Passo Fundo/RS“. Revista de Direito da Cidade 14, No. 4 (31.12.2022): 2324–65. http://dx.doi.org/10.12957/rdc.2022.57481.

Abstract:
The research investigates the impacts on citizenship of the ineffectiveness of the right to housing. To that end, the deductive approach method and the historical and monographic procedure methods are used, with direct and indirect documentation as the research technique. With 100 participants, the field research uses as its instrument a questionnaire of 20 questions about the theme explored, on which a qualitative analysis of the collected data is carried out. The work is divided into three chapters.
The first provides a historical overview of the process of recognizing and consolidating housing as a human and fundamental right. The second chapter presents the context of ineffectiveness of the right to housing in Brazil and its relationship to the exercise of citizenship. Finally, the third chapter examines the Beira Trilho occupations in the municipality of Passo Fundo/RS, highlighting the impacts that the ineffectiveness of the right to housing can generate for citizenship. From this analysis it can be seen that the absence of adequate housing prevents the enjoyment of a series of fundamental rights of the individual, directly impacting the exercise of citizenship. Keywords: Right to housing. Urban Area. Citizenship. Socio-spatial segregation. Inequality.
24

Tumurov, Erdem Garmayevich, and Vladimir Ivanovich Shelekhov. „TRANSFORMATION, SPECIFICATION, AND VERIFICATION OF THE PROGRAM CALCULATING THE ELEMENTS NUMBER OF A SET PRESENTED BY A BIT VECTOR“. System Informatics, 2020. http://dx.doi.org/10.31144/si.2307-6410.2020.n16.p103-136.

Abstract:
Transformations eliminating pointers in the memweight function of the Linux OS kernel library are described. Next, the function is translated to the predicate programming language P. For the obtained predicate program, deductive verification was performed in the Why3 tool. In order to simplify verification, a model of the program's inner state was constructed.
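For orientation, the memweight function returns the number of set bits in a memory area. A minimal C sketch in the spirit of the described transformation, using array indexing instead of pointer arithmetic (names are illustrative; this is not the kernel code or the verified predicate program):

```c
#include <stddef.h>

/* Count the set bits in a single byte. */
static int popcount_byte(unsigned char b)
{
    int count = 0;
    while (b) {
        count += b & 1u;   /* add the lowest bit */
        b >>= 1;           /* shift it out */
    }
    return count;
}

/* Pointer-free counterpart: sum the set bits over a byte array,
 * walking the region by index rather than by pointer. */
size_t memweight_sketch(const unsigned char bytes[], size_t n)
{
    size_t total = 0;
    for (size_t i = 0; i < n; i++)
        total += popcount_byte(bytes[i]);
    return total;
}
```

In a Why3 setting, the indexed form is convenient because the loop can carry an invariant over the prefix `bytes[0..i-1]` without reasoning about pointer validity.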
25

Shelekhov, Vladimir Ivanovich. „VERIFICATION OF A PREDICATE HEAPSORT PROGRAM USING INVERSE TRANSFORMATIONS“. System Informatics, 2020. http://dx.doi.org/10.31144/si.2307-6410.2020.n16.p75-102.

Abstract:
Deductive verification of the classical J. Williams heapsort algorithm for objects of an arbitrary type was conducted. In order to simplify verification, non-trivial transformations replacing pointer arithmetic operators with array element accesses were applied. The program was translated to the predicate programming language. Deductive verification of the program in the Why3 and Coq tools proved complicated and time-consuming.
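The pointer-to-index transformation described above can be illustrated with a plain C sketch of Williams' heapsort that uses only array subscripts (a simplified integer-array model, not the verified predicate program for arbitrary object types):

```c
#include <stddef.h>

static void swap_elems(int a[], size_t i, size_t j)
{
    int t = a[i]; a[i] = a[j]; a[j] = t;
}

/* Sift the element at index root down within a[0 .. limit-1]. */
static void sift_down(int a[], size_t root, size_t limit)
{
    size_t child;
    while ((child = 2 * root + 1) < limit) {
        if (child + 1 < limit && a[child] < a[child + 1])
            child++;              /* pick the larger child */
        if (a[root] >= a[child])
            break;                /* heap property restored */
        swap_elems(a, root, child);
        root = child;
    }
}

void heapsort_sketch(int a[], size_t n)
{
    if (n < 2) return;
    for (size_t i = n / 2; i-- > 0; )     /* build a max-heap */
        sift_down(a, i, n);
    for (size_t i = n - 1; i > 0; i--) {  /* extract maxima */
        swap_elems(a, 0, i);
        sift_down(a, 0, i);
    }
}
```

With indices instead of pointer arithmetic, the heap invariant (every node not smaller than its children within the prefix) can be stated directly over array elements, which is what makes the transformed program amenable to deductive verification.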
26

Shelekhov, Vladimir Ivanovich. „VERIFICATION OF A STRING TO INTEGER CONVERSION PROGRAM“. System Informatics, No. 17 (2020). http://dx.doi.org/10.31144/si.2307-6410.2020.n17.p43-90.

Abstract:
Deductive verification of kstrtoul, a string-to-integer conversion program in the Linux OS kernel library, is described. The kstrtoul program computes the integer value represented by a character sequence of digits. To simplify program verification, transformations replacing pointer operators with equivalent pointer-free actions were conducted. A model of the inner program state was constructed to enhance the program specification. Deductive verification was conducted in the Why3 and Coq tools.
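As a rough illustration of the function under verification, here is a hedged C sketch of the core digit-parsing loop, walking the string by index rather than by pointer. The name and error convention are illustrative; the real kstrtoul also handles base prefixes and overflow, which are omitted here:

```c
#include <stddef.h>

/* Parse a decimal digit string into an unsigned long.
 * Returns 0 on success, -1 on empty or non-digit input. */
int kstrtoul_sketch(const char s[], unsigned long *out)
{
    unsigned long value = 0;
    if (s[0] == '\0')
        return -1;                  /* empty string */
    for (size_t i = 0; s[i] != '\0'; i++) {
        if (s[i] < '0' || s[i] > '9')
            return -1;              /* non-digit character */
        value = value * 10 + (unsigned long)(s[i] - '0');
    }
    *out = value;
    return 0;
}
```

The natural loop invariant for deductive verification is that `value` equals the integer denoted by the prefix `s[0..i-1]`, which is straightforward to state once the pointer walk has been replaced by an index.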
27

„Cameleer: A deductive verification tool for OCaml“. Research Outreach, No. 130 (08.06.2022). http://dx.doi.org/10.32907/ro-130-2767435612.

