Academic literature on the topic 'Temporal verification effectiveness'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Temporal verification effectiveness.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Temporal verification effectiveness"

1

FIORAVANTI, FABIO, ALBERTO PETTOROSSI, MAURIZIO PROIETTI, and VALERIO SENNI. "Generalization strategies for the verification of infinite state systems." Theory and Practice of Logic Programming 13, no. 2 (January 25, 2012): 175–99. http://dx.doi.org/10.1017/s1471068411000627.

Full text
Abstract:
We present a method for the automated verification of temporal properties of infinite state systems. Our verification method is based on the specialization of constraint logic programs (CLP) and works in two phases: (1) in the first phase, a CLP specification of an infinite state system is specialized with respect to the initial state of the system and the temporal property to be verified, and (2) in the second phase, the specialized program is evaluated by using a bottom-up strategy. The effectiveness of the method strongly depends on the generalization strategy which is applied during the program specialization phase. We consider several generalization strategies obtained by combining techniques already known in the field of program analysis and program transformation, and we also introduce some new strategies. Then, through many verification experiments, we evaluate the effectiveness of the generalization strategies we have considered. Finally, we compare the implementation of our specialization-based verification method to other constraint-based model checking tools. The experimental results show that our method is competitive with the methods used by those other tools.
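To make the flavor of such CLP encodings concrete, a reachability property of the form EF unsafe (some unsafe state is reachable from the initial state) can be written schematically as constraint logic program clauses; this is a generic sketch, not the authors' exact formulation:

```latex
% Schematic CLP clauses for the temporal property EF(unsafe):
% ef(X) holds if a state satisfying unsafe is reachable from X.
\begin{align*}
\mathit{ef}(X) &\leftarrow \mathit{unsafe}(X)\\
\mathit{ef}(X) &\leftarrow \mathit{trans}(X, X') \wedge \mathit{ef}(X')\\
\mathit{unsafeSystem} &\leftarrow \mathit{init}(X_0) \wedge \mathit{ef}(X_0)
\end{align*}
```

Specialization with respect to the initial state and the property, followed by bottom-up evaluation, then amounts to deciding whether unsafeSystem is derivable.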
2

Kongburan, Wutthipong, and Denduang Pradubsuwun. "Formal Verification of WS-BPEL Using Timed Trace Theory." Advanced Materials Research 931-932 (May 2014): 1452–56. http://dx.doi.org/10.4028/www.scientific.net/amr.931-932.1452.

Full text
Abstract:
A web service composition is able to create a new service by incorporating some existing web services. Currently, Web Service Business Process Execution Language, or WS-BPEL, is a promising language used to describe web service compositions. Since most real-world business processes involve a temporal context and quite complex interactions, it is impossible to completely eliminate all failures in them. Therefore, formal verification is required to assure the correctness and reliability of the web service composition. In this paper, timed trace theory is applied to verify web service compositions with temporal constraints. Both safety and timing failures can be examined. Through an experiment with a ticket reservation system, the proposed approach shows its effectiveness.
3

Zhang, Lanfang, Zhiyong Zhang, and Ting Zhao. "A Novel Spatio-Temporal Access Control Model for Online Social Networks and Visual Verification." International Journal of Cloud Applications and Computing 11, no. 2 (April 2021): 17–31. http://dx.doi.org/10.4018/ijcac.2021040102.

Full text
Abstract:
With the rapid development of the mobile internet, a large number of online social networking platforms and tools have been widely applied. As a classic method for protecting the privacy and information security of social users, access control technology is evolving with the spatio-temporal change of social application requirements and scenarios. However, there is currently a lack of an effective theoretical model of spatio-temporal access control for social networks to serve as a guide. This paper proposes a novel spatio-temporal access control model for online social networks (STAC) and its visual verification. Combining the advantages of discretionary access control, the model uses a formal language to describe spatio-temporal access control rules and real-life scenarios for access control policy description, and realizes a more fine-grained access control mechanism for social networks. By using the access control verification tool ACPT developed by NIST to visually verify the proposed model, the security and effectiveness of the STAC model are demonstrated.
4

Shriyam, Shaurya, and Satyandra K. Gupta. "Modeling and verification of contingency resolution strategies for multi-robot missions using temporal logic." International Journal of Advanced Robotic Systems 16, no. 6 (November 1, 2019): 172988141988569. http://dx.doi.org/10.1177/1729881419885697.

Full text
Abstract:
This article presents an approach for assessing contingency resolution strategies using temporal logic. We present a framework for modeling the nominal mission, specifying contingency resolution strategies, and evaluating their effectiveness for the mission. Our approach applies model checkers to the domain of multi-robot missions to assess the adequacy of contingency resolution strategies that minimize the adverse effects of contingencies on mission execution. We consider missions with deterministic as well as probabilistic transitions. We demonstrate our approach using two case studies. We consider the escorting of a ship in a port, where multiple contingencies may occur concurrently, and assess the adequacy of the proposed contingency resolution strategies. We also consider a manufacturing scenario where multiple assembly stations collaborate to create a product. In this case, assembly operations may fail, and human intervention is needed to complete the assembly process. We investigate several different strategies and assess their effectiveness based on mission characteristics.
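As an illustration of the kind of requirement such model checking targets, contingency resolution can be phrased as temporal-logic properties over the mission model; the predicate names below are hypothetical, and the probabilistic variant assumes a PCTL-style logic for the missions with probabilistic transitions:

```latex
% Every contingency is eventually resolved (LTL):
\mathbf{G}\,(\mathit{contingency} \rightarrow \mathbf{F}\,\mathit{resolved})
% The mission completes within T steps with probability at least 0.95 (PCTL-style):
\mathbf{P}_{\ge 0.95}\,[\,\mathbf{F}^{\le T}\,\mathit{missionComplete}\,]
```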
5

Kubota, Yuki, Tomohiko Hayakawa, and Masatoshi Ishikawa. "Dynamic perceptive compensation for the rotating snakes illusion with eye tracking." PLOS ONE 16, no. 3 (March 4, 2021): e0247937. http://dx.doi.org/10.1371/journal.pone.0247937.

Full text
Abstract:
This study developed a dynamic perceptive compensation system for the rotating snakes illusion (RSI) with eye tracking. Large eye movements, such as saccades and blinks, were detected with an eye tracker, and perceptive compensation was dynamically performed based on the characteristics of RSI perception. The proposed compensation system considered three properties: spatial dependence, temporal dependence, and individual dependence. Several psychophysical experiments were performed to confirm the effectiveness of the proposed system. After the preliminary verification and determination of the temporal-dependent function for RSI perception, the effects of gaze information on RSI control were investigated. Five algorithms were compared using paired comparison. This confirmed that the compensation system that took gaze information into account reduced the RSI effect better than compensation without gaze information at a significance threshold of p < 0.01, calculated with Bonferroni correction. Some algorithms that are dependent on gaze information reduced the RSI effects more stably than still RSI images, whereas spatially and temporally dependent compensation had a lower score than other compensation algorithms based on gaze information. The developed system and algorithm successfully controlled RSI perception in relation to gaze information. This study systematically handled gaze measurement, image manipulation, and compensation of illusory images, and the resulting system can be utilized as a standard framework for the study of optical illusions in engineering fields.
6

Vega Vice, Jorge, and Valery Mikhailov. "On Methods in the Verification and Elaboration of Development Programs for Agricultural Territories." Modeling and Analysis of Information Systems 25, no. 5 (October 28, 2018): 481–90. http://dx.doi.org/10.18255/1818-1015-2018-5-481-490.

Full text
Abstract:
Nowadays, the methods of program-targeted management for the development of various socio-economic systems of complex structure, such as agricultural areas, have become universal. Therefore, the current tasks at hand are the verification of already created development programs and the elaboration of "proper" development programs for such systems, by analogy with the verification and development of correct computer programs in theoretical programming. In this paper, in order to solve the problem of verification of development programs for agricultural territories, a structural scheme of the program is first constructed, from which an axiomatic theory is created using Hoare's algorithmic logic. The main problem in the construction of the axiomatic theory is the development of the axioms of the theory reflecting the preconditions and effects of the implementation of the meaningful actions indicated in the text of the development program. The verification of the development program then corresponds to the provability of a certain Hoare triple formed from the initial and target conditions of the program. For the task of elaborating proper development programs, a mechanism for constructing a domain model using description languages of the PDDL family is described. The description of a specific model is purely declarative in nature and consists of descriptions of predicates and actions of the chosen subject area. It is shown how, on the described model, solutions to the targets of development programs can be built automatically with the help of intelligent planners, including temporal planners such as OPTIC. Based on expert knowledge and activity standards, a model of an agricultural territory is constructed, a brief description of which is given in the work. The conducted experiments showed the effectiveness of the proposed approach for the elaboration of proper development programs.
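For readers unfamiliar with the notation, a Hoare triple {P} S {Q} asserts that if precondition P holds before action S is executed, then postcondition Q holds afterwards; applied to a development program, each programmed action receives such an axiom. The action and predicate names below are hypothetical illustrations, not axioms from the paper:

```latex
% Generic form of a Hoare triple:
\{P\}\;\; S \;\;\{Q\}
% A hypothetical axiom for one action of a development program:
\{\mathit{budget} \ge c \wedge \mathit{landAvailable}\}\;\; \texttt{build\_irrigation}\;\; \{\mathit{irrigationBuilt} \wedge \mathit{budget} \ge 0\}
```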
7

Yan, Zheng, Xiao Hui Peng, Yu Qiang Cheng, and Jian Jun Wu. "Fault Diagnosis Method Based on Integration Signed Directed Graph with Quantitative Knowledge." Applied Mechanics and Materials 232 (November 2012): 359–63. http://dx.doi.org/10.4028/www.scientific.net/amm.232.359.

Full text
Abstract:
Signed Directed Graphs (SDG) have been widely applied in recent years to model the cause-and-effect behavior of process systems. However, SDG-based diagnosis has poor discriminatory ability because of the information loss incurred in going from the quantitative to the qualitative domain. In this paper, a new method combining SDG with quantitative knowledge is presented to improve the discriminatory ability. In the method, a hybrid reasoning (forward and backward) strategy based on assumption and verification was applied to find all the potential fault sources and the corresponding consistent paths in the SDG model. Then the SDG-based method was modified by integrating the governing equations and temporal information of the system in order to improve the discriminatory ability. The method has been validated with artificial telemetry data, and its effectiveness has been confirmed. The proposed method can provide important practical value for the development of on-board fault diagnosis systems for spacecraft propulsion systems.
8

Panokin, Alexander M. "THE IDEA OF REVIEWING CRIMINAL COURT RULINGS AND THE FIRST EXPERIENCES OF ITS IMPLEMENTATION EMERGED." Vestnik Tomskogo gosudarstvennogo universiteta. Pravo, no. 37 (2020): 93–107. http://dx.doi.org/10.17223/22253513/37/8.

Full text
Abstract:
From the very early stages of development, the state resolves social conflicts related to crimes committed. However, decisions made as a result of such conflicts begin to be challenged by their participants, which shows that there is an objective need for the right to appeal and review court decisions. It is important to bear in mind that at the initial stages of statehood development there is no procedural separation of appeals from other forms of verification of court decisions. The search for the effectiveness of such review is directed towards identifying the officials vested with the authority to carry it out, the subject of the appeal, the means of verification and its results. Further development of the idea under consideration has led to the emergence of various forms of verification of court decisions. Verification of court decisions in criminal cases is also objectively conditioned by the nature of criminal proceedings, due to the impossibility of excluding court errors, including those caused by the limited cognitive abilities of a person. The close relationship between verification and the nature of social relations is indicated by the fact that, having emerged a few thousand years ago, such verification still exists today and will remain necessary in the future no matter how civilisation develops. Public relations give rise to certain types of miscarriages of justice (in establishing factual circumstances, in applying legal rules, or in factual and legal matters at the same time), depending on which there is a differentiation in the ways court decisions are checked, which is a response to the nature of the miscarriage of justice. Of course, the ways of verifying judicial acts are different because judicial errors are different. A generalisation and analysis of the history of verification of court judgments make it possible to identify their typological, repetitive and characteristic features, which serve as grounds for classification. Such grounds are the entities authorised to carry out the review, the means of verification, the justification and argumentation before a higher authority, and the types of decisions it makes. The first experiences with the idea of verifying criminal court decisions are extremely diverse. While many of the trends in the development of such verification are repeated, its morphological forms are different. At the stage of historical development under consideration, there is no close relationship between individual states, so that the verification of court decisions in each state is formed with a small number of borrowings. This experience is interesting because it is unique to each state in a particular historical period. The formation and institutionalisation of court review is determined by the spatial, temporal, cultural, legal and other characteristics of individual states. On the one hand, various countries are aware of the need to create national judicial systems, and on the other hand, each state does this independently, using its own approaches.
9

Zhang, Qiang, Qiangqiang Yuan, Jie Li, Yuan Wang, Fujun Sun, and Liangpei Zhang. "Generating seamless global daily AMSR2 soil moisture (SGD-SM) long-term products for the years 2013–2019." Earth System Science Data 13, no. 3 (March 31, 2021): 1385–401. http://dx.doi.org/10.5194/essd-13-1385-2021.

Full text
Abstract:
High-quality and long-term soil moisture products are significant for hydrologic monitoring and agricultural management. However, the acquired daily Advanced Microwave Scanning Radiometer 2 (AMSR2) soil moisture products are incomplete over global land (only about 30 %–80 % coverage), due to satellite orbit coverage and the limitations of soil moisture retrieval algorithms. To solve this inevitable problem, we develop a novel spatio-temporal partial convolutional neural network (CNN) for AMSR2 soil moisture product gap-filling. Through the proposed framework, we generate seamless daily global (SGD) AMSR2 long-term soil moisture products from 2013 to 2019. To further validate the effectiveness of these products, three verification methods are used: (1) in situ validation, (2) time-series validation, and (3) simulated missing-region validation. Results show that the seamless global daily soil moisture products are in reliable agreement with the selected in situ values. The evaluation indexes of the reconstructed (original) dataset are a correlation coefficient (R) of 0.685 (0.689), a root-mean-squared error (RMSE) of 0.097 (0.093), and a mean absolute error (MAE) of 0.079 (0.077). The temporal consistency of the reconstructed daily soil moisture products is ensured with respect to the original time-series distribution of valid values. The spatial continuity of the reconstructed regions is in accordance with the surrounding spatial information (R: 0.963–0.974, RMSE: 0.065–0.073, and MAE: 0.044–0.052). This dataset can be downloaded at https://doi.org/10.5281/zenodo.4417458 (Zhang et al., 2021).
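The R, RMSE, and MAE figures quoted above are standard validation metrics; a minimal sketch of how such an in situ comparison is computed (assuming paired arrays of reconstructed and in situ soil moisture values, which are not provided here) is:

```python
import numpy as np

def validation_metrics(reconstructed, in_situ):
    """Pearson R, RMSE, and MAE between paired soil moisture samples."""
    reconstructed = np.asarray(reconstructed, dtype=float)
    in_situ = np.asarray(in_situ, dtype=float)
    r = np.corrcoef(reconstructed, in_situ)[0, 1]             # correlation coefficient
    rmse = np.sqrt(np.mean((reconstructed - in_situ) ** 2))   # root-mean-squared error
    mae = np.mean(np.abs(reconstructed - in_situ))            # mean absolute error
    return r, rmse, mae

# Example with toy values (not data from the paper):
print(validation_metrics([0.21, 0.34, 0.28], [0.25, 0.31, 0.30]))
```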
10

Yu, Wei, and Hong Li. "Development of 3D Finite Element Method for Non-Aqueous Phase Liquid Transport in Groundwater as Well as Verification." Processes 7, no. 2 (February 25, 2019): 116. http://dx.doi.org/10.3390/pr7020116.

Full text
Abstract:
Groundwater contamination has previously occurred at a broad range of locations in present-day China. There are thousands of kinds of contaminants in groundwater, which can be divided into soluble and insoluble categories. In recent years, non-aqueous phase liquid (NAPL) pollution, which belongs to the multi-phase seepage flow phenomenon, has become an increasingly prominent topic due to the challenges posed by groundwater purification and treatment. Migrating with seepage flow and moving into potable water sources, these contaminants directly endanger people's health. Therefore, it is necessary to research not only how these contaminants migrate but also how they can be remediated accordingly. First, an effective numerical method needs to be built as a means of analysis. A three-dimensional finite element method program for analyzing two-phase flow in porous media, which can be applied to the immiscible contaminant transport problem in subsurface flow, has been developed in this paper. The fundamental theory and numerical discretization formulations are elaborated. The numerical difficulty brought about by the distinct non-linearity of the temporal evolution of saturation-dependent variables is overcome by the mixed-form formulation. The effectiveness of the simultaneous solution (SS) method and its improvement in efficiency are explained. Finally, two computational examples are given to verify the correctness and demonstrate the preliminary applicability. In addition, the two-phase immiscible flow capability of Fast Lagrangian Analysis of Continua (FLAC) is used to simulate the same examples, and the results are compared to further verify the correctness of the numerical development.
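For orientation, the governing system that such a two-phase finite element code discretizes is typically the immiscible two-phase Darcy flow model, written here in a generic form; the paper's exact notation and constitutive choices may differ:

```latex
% Mass conservation for each phase \alpha \in \{w, n\} (wetting, non-wetting):
\phi\,\frac{\partial (S_\alpha \rho_\alpha)}{\partial t} + \nabla\cdot(\rho_\alpha \mathbf{q}_\alpha) = 0,
\qquad
% Darcy flux with relative permeability k_{r\alpha} and viscosity \mu_\alpha:
\mathbf{q}_\alpha = -\,\frac{k_{r\alpha}(S_\alpha)}{\mu_\alpha}\,\mathbf{K}\,(\nabla p_\alpha - \rho_\alpha \mathbf{g}),
\qquad
S_w + S_n = 1,\quad p_n - p_w = p_c(S_w).
```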

Dissertations / Theses on the topic "Temporal verification effectiveness"

1

Chen, Jinjun. "Towards effective and efficient temporal verification in grid workflow systems." Swinburne University of Technology, 2007. http://adt.lib.swin.edu.au./public/adt-VSWT20070424.112326.

Full text
Abstract:
In grid architecture, a grid workflow system is a type of high-level grid middleware which aims to support large-scale sophisticated scientific or business processes in a variety of complex e-science or e-business applications such as climate modelling, disaster recovery, medical surgery, high energy physics, international stock market modelling and so on. Such sophisticated processes often contain hundreds of thousands of computation or data intensive activities and take a long time to complete. In reality, they are normally time constrained. Correspondingly, temporal constraints are enforced when they are modelled or redesigned as grid workflow specifications at build-time. The main types of temporal constraints include upper bound, lower bound and fixed-time. Temporal verification is then conducted so that we can identify any temporal violations and handle them in time. Conventional temporal verification research and practice have presented some basic concepts and approaches. However, they have not paid sufficient attention to overall temporal verification effectiveness and efficiency. In the context of the grid economy, any resources used for executing grid workflows must be paid for. Therefore, resources should mainly be used for execution of the grid workflow itself rather than for temporal verification. Poor temporal verification effectiveness or efficiency would cause more resources to be diverted to temporal verification. Hence, temporal verification effectiveness and efficiency become a prominent issue and deserve an in-depth investigation. This thesis systematically investigates the limitations of conventional temporal verification in terms of temporal verification effectiveness and efficiency. A detailed analysis of temporal verification effectiveness and efficiency is conducted for each step of a temporal verification cycle. There are four steps in total: Step 1 - defining temporal consistency; Step 2 - assigning temporal constraints; Step 3 - selecting appropriate checkpoints; and Step 4 - verifying temporal constraints. Based on the investigation and analysis, we propose some new concepts and develop a set of innovative methods and algorithms towards more effective and efficient temporal verification. Comparisons, quantitative evaluations and/or mathematical proofs are also presented at each step of the temporal verification cycle. These demonstrate that our new concepts, innovative methods and algorithms can significantly improve overall temporal verification effectiveness and efficiency. Specifically, in Step 1, we analyse the limitations of the two temporal consistency states which are defined by conventional verification work. We then propose four new states towards better temporal verification effectiveness. In Step 2, we analyse the necessity of a number of temporal constraints in terms of temporal verification effectiveness. Then we design a novel algorithm for assigning a series of fine-grained temporal constraints within a few user-set coarse-grained ones. In Step 3, we discuss the problem of existing representative checkpoint selection strategies in terms of temporal verification effectiveness and efficiency. The problem is that they often ignore some necessary checkpoints and/or select some unnecessary ones. To solve this problem, we develop an innovative strategy and corresponding algorithms which only select sufficient and necessary checkpoints. In Step 4, we investigate a phenomenon which is ignored by existing temporal verification work, i.e. temporal dependency.
Temporal dependency means temporal constraints are often dependent on each other in terms of their verification. We analyse its impact on overall temporal verification effectiveness and efficiency. Based on this, we develop some novel temporal verification algorithms which can significantly improve overall temporal verification effectiveness and efficiency. Finally, we present an extension to our research about handling temporal verification results since these verification results are based on our four new temporal consistency states. The major contributions of this research are that we have provided a set of new concepts, innovative methods and algorithms for temporal verification in grid workflow systems. With these, we can significantly improve overall temporal verification effectiveness and efficiency. This would eventually improve the overall performance and usability of grid workflow systems because temporal verification can be viewed as a service or function of grid workflow systems. Consequently, by deploying the new concepts, innovative methods and algorithms, grid workflow systems would be able to better support large-scale sophisticated scientific and business processes in complex e-science and e-business applications in the context of grid economy.
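To make the idea of verifying an upper-bound temporal constraint at a checkpoint concrete, a deliberately simplified sketch is given below; the names and numbers are invented for illustration, and the thesis's actual consistency states, checkpoint selection strategies, and dependency-aware algorithms are considerably more refined:

```python
from dataclasses import dataclass

@dataclass
class UpperBoundConstraint:
    """Upper-bound temporal constraint over a contiguous segment of workflow activities."""
    first_activity: int
    last_activity: int
    max_duration: float  # allowed total duration for the segment

def check_at_checkpoint(constraint, elapsed, estimated_remaining):
    """Classify temporal consistency at a checkpoint inside the segment.

    elapsed: real execution time consumed so far within the segment.
    estimated_remaining: estimated time still needed to finish the segment.
    Returns a coarse consistency label (illustrative only).
    """
    projected = elapsed + estimated_remaining
    if projected <= constraint.max_duration:
        return "consistent"            # no handling needed at this checkpoint
    # Projected overrun: a violation should be handled in time,
    # e.g. by rescheduling or adding resources to later activities.
    return "inconsistent"

# Example usage with invented numbers:
c = UpperBoundConstraint(first_activity=3, last_activity=9, max_duration=120.0)
print(check_at_checkpoint(c, elapsed=70.0, estimated_remaining=65.0))  # -> "inconsistent"
```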
2

Lin, Tzu-Ting (林姿葶). "Temporal Perspective Leadership in Organizations: Content Analysis, Entrainment Mechanism, and Effectiveness Verification." Thesis, 2014. http://ndltd.ncl.edu.tw/handle/654288.

Full text
Abstract:
Doctoral dissertation
National Taiwan University
Graduate Institute of Psychology
Academic year 102 (2013–2014)
Time should be an important aspect of organizational theory, but it has been neglected for decades, especially in leadership research. Although many recent studies have been devoted to time issues in management and organizational behavior, they have mainly focused on the effect of subjective time on individuals' attitudes and behavior. Few studies, if any, have investigated time at the group level or linked it with the leadership area. One important function of a group leader is to harmonize a group of members with specialized skills to accomplish the group's target; therefore, the temporal dimensions are especially significant to a group leader. Unfortunately, this issue has not yet been explored. In view of this, I reviewed the literature on temporal perspective and leadership in order to articulate the distinctive concept of temporal perspective leadership and its possible effects on group-level and individual-level effectiveness. In addition, I used an inductive method to develop a suitable measurement questionnaire for temporal perspective leadership, and then established the reliability and validity of the scale and its nomological network. Finally, based on social entrainment theory, I developed a model in which group entrainment mediates between temporal perspective leadership and outcomes at different levels. Across three systematic studies, with samples comprising seven sources, results showed that group entrainment mediated the effect of temporal perspective leadership at both levels. Implications for the theory and practice of leadership are discussed, and future research directions are offered. By doing so, I hope to encourage future researchers to become involved and invest in leadership research on time issues.

Book chapters on the topic "Temporal verification effectiveness"

1

Kobayashi, Naoki, Grigory Fedyukovich, and Aarti Gupta. "Fold/Unfold Transformations for Fixpoint Logic." In Tools and Algorithms for the Construction and Analysis of Systems, 195–214. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-45237-7_12.

Full text
Abstract:
Fixpoint logics have recently been drawing attention as common foundations for automated program verification. We formalize fold/unfold transformations for fixpoint logic formulas and show how they can be used to enhance a recent fixpoint-logic approach to automated program verification, including automated verification of relational and temporal properties. We have implemented the transformations in a tool and confirmed its effectiveness through experiments.
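As a rough illustration of what fold/unfold means in this setting (a generic sketch, not the paper's formal system): if a predicate is defined as a fixpoint, unfolding replaces a call by the body of its definition, and folding is the inverse rewrite, applicable only under side conditions that preserve the fixpoint semantics:

```latex
% Given a (least-)fixpoint definition of P:
P(\tilde{x}) \;=_{\mu}\; \varphi(\tilde{x}, P)
% Unfold: replace an occurrence of P(\tilde{t}) in a formula \psi by the body of its definition:
\psi[\,P(\tilde{t})\,] \;\rightsquigarrow\; \psi[\,\varphi(\tilde{t}, P)\,]
% Fold: the inverse rewrite, subject to soundness side conditions.
```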

Conference papers on the topic "Temporal verification effectiveness"

1

Mahani, Maziar Fooladi, and Yue Wang. "Runtime Verification of Trust-Based Symbolic Robot Motion Planning With Human-in-the-Loop." In ASME 2016 Dynamic Systems and Control Conference. American Society of Mechanical Engineers, 2016. http://dx.doi.org/10.1115/dscc2016-9816.

Full text
Abstract:
In this paper, we address the runtime verification problem of robot motion planning with a human-in-the-loop. By bringing together approaches from runtime verification, trust modeling, and symbolic motion planning, we develop a framework which guarantees that a robot is able to safely satisfy task specifications while improving task efficiency by switching between human supervision and autonomous motion planning. A simple robot model in a domain path planning scenario is considered, and the robot is assumed to have perfect localization capabilities. The task domain is partitioned into a finite number of identical cells. A trust model based on the robot and human performance is used to provide a switching logic between different modes. Model checking techniques are utilized to generate plans in autonomous motion planning and, for this purpose, Linear Temporal Logic (LTL) is employed as a task specification language to formally express specifications for model checking. The whole system is implemented in a runtime verification framework that monitors and verifies the system execution at runtime using ROSRV. Finally, we illustrate the effectiveness of this framework as well as its feasibility through a simulated case study.
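As a toy illustration of runtime verification of this kind (not the paper's actual ROSRV specification), a monitor can consume execution events and check a simple bounded-response safety property such as "whenever the trust-based logic requests a human takeover, supervision starts within the next k events":

```python
class BoundedResponseMonitor:
    """Checks: every 'takeover_requested' is followed by 'supervision_started'
    within the next k events (an illustrative property with invented event names)."""

    def __init__(self, k):
        self.k = k
        self.remaining = None   # events left to satisfy the oldest open request
        self.violated = False

    def step(self, event):
        """Feed one event; return True while the property still holds on the trace."""
        if self.violated:
            return False
        if event == "supervision_started":
            self.remaining = None
        else:
            if event == "takeover_requested" and self.remaining is None:
                self.remaining = self.k          # deadline for supervision to start
            elif self.remaining is not None:
                self.remaining -= 1
                if self.remaining == 0:          # k events passed without supervision
                    self.violated = True
                    return False
        return True

# Example trace (invented): supervision starts in time, so the monitor stays True.
m = BoundedResponseMonitor(k=3)
for e in ["move", "takeover_requested", "move", "supervision_started", "move"]:
    print(e, "->", m.step(e))
```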
2

Mahani, Maziar Fooladi, and Yue Wang. "Trust-Based Runtime Verification for Multi-Quad-Rotor Motion Planning With a Human-in-the-Loop." In ASME 2018 Dynamic Systems and Control Conference. American Society of Mechanical Engineers, 2018. http://dx.doi.org/10.1115/dscc2018-9174.

Full text
Abstract:
In this paper, we propose a trust-based runtime verification (RV) framework for deploying multiple quad-rotors with a human-in-the-loop (HIL). By bringing together approaches from runtime verification, trust-based decision-making, human-robot interaction (HRI), and hybrid systems, we develop a unified framework that is capable of integrating human cognitive skills with the autonomous capabilities of multi-robot systems to improve system performance and maximize the intuitiveness of the human-robot interaction. On top of the RV framework, we utilize a probabilistic trust inference model as the key component in forming the HRI, designed to maintain system performance. A violation avoidance controller is designed to account for unexpected/unmodeled environment behaviors, e.g., collisions with static or moving obstacles. We also use automata-theoretic approaches to generate motion plans for the quad-rotors working in a partially known environment by automatic synthesis of controllers enforcing specifications given in temporal logic languages. Finally, we illustrate the effectiveness of this framework as well as its feasibility through a simulated case study.
3

Shirakawa, Noriyuki, Yasushi Uehara, Masanori Naitoh, Hidetoshi Okada, Yuichi Yamamoto, and Seiichi Koshizuka. "Next Generation Safety Analysis Methods for SFRs—(5) Structural Mechanics Models of COMPASS Code and Verification Analyses." In 17th International Conference on Nuclear Engineering. ASMEDC, 2009. http://dx.doi.org/10.1115/icone17-75532.

Full text
Abstract:
A five-year research project started in FY2005 (Japanese Fiscal Year, hereafter) to develop a code based on the Moving Particle Semi-implicit (MPS) method for detailed analysis of core disruptive accidents (CDAs) in sodium-cooled fast reactors (SFRs). The code is named COMPASS (Computer Code with Moving Particle Semi-implicit for Reactor Safety Analysis). CDAs have been almost exclusively analyzed with SIMMER-III [2], which is a two-dimensional multi-component multi-phase Eulerian fluid-dynamics code coupled with a fuel pin model and a neutronics model. COMPASS has been developed to play a role complementary to SIMMER-III in terms of temporal and spatial scale: COMPASS addresses the mesoscopic scale using a small window cut out of the macroscopic SIMMER-III domain. We presented the project's outline and the verification analyses of the elastic structural mechanics module of COMPASS at ICONE16 [1]. COMPASS solves the physical phenomena in CDAs by coupling fluid dynamics and structural dynamics with phase changes, that is, vaporization/condensation and melting/freezing. The phase changes are based on a nonequilibrium heat-transfer-limited model, and all "phase change paths" considered in SIMMER-III are implemented [20]. In FY2007, the elastoplastic model including thermal expansion and fracture was formulated in terms of the MPS method and implemented in COMPASS, where the model adopts the von Mises type yield condition and the maximum principal stress as the fracture condition. To cope with the large computing time, a "stiffness reduction approximation" was developed and successfully implemented in COMPASS, in addition to parallelization efforts. Verification problems are set to be suitable for analyses of the SCARABEE tests, the EAGLE tests and hypothetical CDAs in real plants, so that they suggest issues to be solved by improving the models and calculation algorithms. The main objective of the SCARABEE-N in-pile tests was to study the consequences of a hypothetical total instantaneous blockage (TIB) at the entrance of a liquid-metal reactor subassembly at full power [21]. The main objectives of the EAGLE program, consisting of in-pile tests using IGR (Impulse Graphite Reactor) and out-of-pile tests at NNC/RK, are: 1) to demonstrate the effectiveness of special design concepts to eliminate the re-criticality issue, and 2) to acquire basic information on early-phase relocation of molten-core materials toward cold regions surrounding the core, which would be applicable to various core design concepts [22, 23]. In this paper, the formulations and the results of the functional verification of the elastoplastic models under CDA conditions will be presented.
4

Romera, David, and Roque Corral. "Efficient Passage-Spectral Method for Unsteady Flows Under Stall Conditions." In ASME Turbo Expo 2019: Turbomachinery Technical Conference and Exposition. American Society of Mechanical Engineers, 2019. http://dx.doi.org/10.1115/gt2019-91661.

Full text
Abstract:
This paper presents an efficient method of approximating unsteady flows using a blockwise discrete spatial Fourier series for the modeling of three-dimensional non-axisymmetric flows without making any hypothesis about their temporal periodicity. The method aims at capturing the long-wavelength flow patterns which are present in many unsteady problems of industrial interest, such as compressor stability, with a drastic reduction in computational resources. The method is intended to be used to compute flows exhibiting large-scale instabilities and where the fundamental frequency of the problem is not known beforehand. The approach discretizes the domain using a finite number of blocks or passages, where the flow variables at the supposedly periodic boundaries are continuously updated using the spatial Fourier coefficients of a uniformly spaced set of reduced-passage domains. The NASA rotor 67 under stall conditions has been used as a verification and validation case to demonstrate the effectiveness and viability of the proposed modeling strategy. The comparison between the solutions obtained with the discrete Fourier series and the full-annulus solution shows that accurate solutions can be obtained with a low number of harmonics. The new method has been applied to investigate the rotating stall inception of the NASA rotor 67 for clean and distorted inlet flow near stall operating conditions. The method is shown to accurately reproduce the full-annulus solution with a few spatial harmonics, capturing the characteristic features of the complex flow induced by the tip leakage vortex breakdown. The computational cost in this application has been reduced by a factor of between three and seven, although this number heavily depends on the ratio between the number of retained harmonics and the number of blades.
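The core idea of reconstructing boundary values from a reduced set of passages can be pictured with a truncated circumferential Fourier series (generic form; the paper's exact discrete formulation over reduced-passage domains differs in detail):

```latex
% Flow variable u reconstructed at circumferential position \theta and time t
% from a small number N of retained spatial harmonics:
u(\theta, t) \;\approx\; \hat{u}_0(t) \;+\; \sum_{n=1}^{N} \left[ \hat{u}_n(t)\, e^{\,i n \theta} + \hat{u}_n^{*}(t)\, e^{-\,i n \theta} \right]
```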
5

Keve, Gábor. "DETERMINING ACCURATE ICE COVERAGE ON DANUBE BY WEBCAMERAS." In XXVII Conference of the Danubian Countries on Hydrological Forecasting and Hydrological Bases of Water Management. Nika-Tsentr, 2020. http://dx.doi.org/10.15407/uhmi.conference.01.03.

Full text
Abstract:
For most Hungarian rivers, especially the Danube, floods and other damage caused by ice have produced, and are still producing, serious problems. Meanwhile, the amount of national research on ice that improves the effectiveness of ice protection is small, and technical development is not significant at this point. The research presented in this article focuses on advancing this work and on further developing the river ice monitoring methodology. The key objectives are listed in the following points: Develop a fast, automated, cost-effective, and continuous ice-data collection method based on web camera images with a precision far beyond that of manual or estimation procedures. Verify the developed solution through error analysis. Solutions that do not require specialized software were preferred. Analyze the temporal pulsation and daily travel curve of the ice coverage ratio of the Danube with the developed high-frequency measurement process. The aim of this paper is to promote the modernization of Hungarian ice observations and to provide a numerical basis for scientific research related to this topic. I have demonstrated that the web-based, automated river ice monitoring system can be used as a detailed hydrographic tool and can provide more accurate results than the currently used estimation or manual image processing methods. I have shown that, to determine the rate of ice coverage from webcam images, it is enough to map the view of each camera in advance with a single spatial perspective transformation; it is not necessary to apply georeferencing, orthorectification, or complicated shape recognition procedures to each frame. From the perspective mapping, the ratio of each image pixel's area to the water surface it covers can be calculated, and this is sufficient for the computation of ice coverage in all images with the same viewpoint. By doing this, I have narrowed the task to the classification of water and ice pixels. A simple numerical method was developed and verified to determine the area ratio of pixels to the surface of the water. I have developed an automatic, adaptive threshold which distinguishes between ice and water pixels with appropriate precision. With my method of ice coverage determination, I observed significant temporal pulsation and daily periodicity in the ice movement of the observed Danube reach. I have found that the small number of daily estimates is not representative enough to determine daily average ice coverage; I recommend continuous webcam monitoring. The new findings contribute to a more accurate understanding of the spatial and temporal structures of ice floes in rivers, as well as to the methodological development of their measurability and reproducibility. My work creates the basis for the modernization of the Hungarian ice monitoring network. The operation of such a network provides the conditions for establishing ice floe forecasting and alarm systems on the larger rivers in the future. The time series collected over the past decades also provide data for national research on river ice phenomena.
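A deliberately simplified sketch of the pixel classification and area-weighted coverage computation described above is given below; the automatic threshold here is Otsu's method as a stand-in for the author's adaptive threshold, and the per-pixel area weights are assumed to come from the one-time perspective mapping of the camera view:

```python
import numpy as np

def otsu_threshold(gray):
    """Automatic global threshold (Otsu) over an 8-bit grayscale image."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    p = hist / hist.sum()
    omega = np.cumsum(p)                    # cumulative class probability
    mu = np.cumsum(p * np.arange(256))      # cumulative class mean
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    return int(np.nanargmax(sigma_b))       # threshold maximizing between-class variance

def ice_coverage_ratio(gray, pixel_area):
    """Area-weighted ice coverage on the water surface.

    gray:       8-bit grayscale frame restricted to the water-surface region.
    pixel_area: per-pixel water-surface area from the perspective mapping (same shape).
    Ice is assumed brighter than open water (an assumption of this sketch).
    """
    t = otsu_threshold(gray)
    ice_mask = gray > t
    return float((pixel_area * ice_mask).sum() / pixel_area.sum())

# Toy example with random data (not real webcam imagery):
rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(120, 160), dtype=np.uint8)
areas = np.ones((120, 160))  # uniform weights stand in for perspective-derived areas
print(ice_coverage_ratio(frame, areas))
```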
