Journal articles on the topic "Timed Failure Logic"

To see the other types of publications on this topic, follow the link: Timed Failure Logic.

Create a correct reference in APA, MLA, Chicago, Harvard, and several other styles

Choose a source:

Consult the top 50 journal articles for your research on the topic "Timed Failure Logic."

Next to each source in the list of references there is an "Add to bibliography" button. Click on it, and we will automatically generate the bibliographic reference for the chosen source in your preferred citation style: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the scholarly publication as a PDF and read its abstract online when this information is included in the metadata.

Browse journal articles on a wide variety of disciplines and organize your bibliography correctly.

1

Cao, Yuyan, Yongxi Lyu, and Xinmin Wang. "Fault Diagnosis Reasoning Algorithm for Electromechanical Actuator Based on an Improved Hybrid TFPG Model." Electronics 9, no. 12 (December 16, 2020): 2153. http://dx.doi.org/10.3390/electronics9122153.

Full text
Abstract:
As a new generation of power-by-wire actuators, electromechanical actuators are finding more and more applications in aviation. To address the practical problem of fault diagnosis for electromechanical actuators, an improved diagnosis reasoning algorithm based on a hybrid timed failure propagation graph (TFPG) model is proposed. On the basis of this hybrid TFPG model, the activation conditions of OR and causality among nodes are given. The relationship-discrepancy node is split into a relationship node and a discrepancy node, which unifies the model storage process. The backward and forward expansion operations of hypothesis generation and updating are improved. In the backward expansion operation, the specific process of backward updating from non-alarm nodes is given, and judging logic for the branches of relationship nodes is added, which guarantees the unity of the algorithm framework and the accuracy of the time update. In the forward expansion operation, the update order is adjusted to ensure the accuracy of the node update in the case of multiple parents. A hybrid TFPG model of the electromechanical actuator is established in the Generic Modeling Environment (GME), and a systematic verification scheme with two simulation types is tested on the P2020 reference design board (RDB) and the VxWorks 653 system. The results show that the proposed algorithm can realize fault diagnosis of the electromechanical actuator as well as fault propagation prediction.
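The backward expansion that the abstract describes can be illustrated on a toy timed failure propagation graph. This is a minimal sketch rather than the paper's algorithm: the graph, node names, and delay bounds below are invented, and only the core interval arithmetic of backward hypothesis updating is shown.

```python
# Each edge carries a propagation delay interval [dmin, dmax]. Observing an
# alarm in a time window at a node constrains when each ancestor must have
# fired: the window is shifted back by the edge delays along the path.
GRAPH = {  # child node -> list of (parent node, dmin, dmax)
    "motor_stall": [("drive_fault", 1.0, 3.0)],
    "drive_fault": [("supply_drop", 0.5, 2.0)],
    "supply_drop": [],
}

def backward_intervals(node, t_lo, t_hi, graph):
    """Map the node and every ancestor to the window in which it must have fired."""
    windows = {node: (t_lo, t_hi)}
    for parent, dmin, dmax in graph[node]:
        windows.update(backward_intervals(parent, t_lo - dmax, t_hi - dmin, graph))
    return windows

# Alarm observed at t = 10 (a point observation: window [10, 10]).
hypothesis = backward_intervals("motor_stall", 10.0, 10.0, GRAPH)
```

Observing the alarm at t = 10 yields the hypothesis windows drive_fault in [7, 9] and supply_drop in [5, 8.5]; forward expansion would propagate such windows in the other direction to predict future alarms.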
Styles APA, Harvard, Vancouver, ISO, etc.
2

Teles, Athus Costa, Ana Clara de Pádua Freitas, and Antonio Cruz Rodrigues. "Failures Detection Methods in Chemical Process Using Artificial Intelligence." Periódico Tchê Química 16, no. 32 (August 20, 2019): 61–68. http://dx.doi.org/10.52571/ptq.v16.n32.2019.79_periodico32_pgs_61_68.pdf.

Full text
Abstract:
Any atypical change in a procedure can be characterized as a "failure." It may result in economic losses and/or a rise in operational cost, because most of the time the process will need to be interrupted. Concern with the quality and security of processes has therefore stimulated studies of failure diagnosis and monitoring in industrial equipment. In light of this, the present article applies three different methods, Artificial Neural Networks (ANN), Fuzzy Logic (FL), and Support Vector Machines (SVM), as failure detection and classification systems in the processes of a case study, so that the efficiency of each artificial intelligence technique can be compared. The investigation is carried out by modeling a reactor with Van der Vusse kinetics and inducing four types of failures: in the concentration of a reagent (failure 1), in the sensors which measure the concentration of the product of interest and the temperature (failures 2 and 3), and in the valve locking (failure 4). The data used in this methodology are based on quantitative and qualitative historical information. All methods are able to detect the failures, but at different times. The ANN detects all the failures fastest. The SVM detects them some minutes later, yet with good precision, even though this method uses less computational effort than the ANN. Fuzzy logic, in most of the cases studied, takes hours to detect any change in the system, which makes it the least effective of the methods studied.
Styles APA, Harvard, Vancouver, ISO, etc.
3

Dwyer, Vincent M., Roger M. Goodall, and Roger Dixon. "Reliability of 2-out-of-N:G Systems with NHPP Failure Flows and Fixed Repair Times." International Journal of Reliability, Quality and Safety Engineering 19, no. 01 (February 2012): 1250003. http://dx.doi.org/10.1142/s0218539312500039.

Full text
Abstract:
It is commonplace to replicate critical components in order to increase system lifetimes and reduce failure rates. The case of a general N-plexed system, whose failures are modeled as N identical, independent nonhomogeneous Poisson process (NHPP) flows, each with rocof (rate of occurrence of failure) equal to λ(t), is considered here. Such situations may arise if either there is a time-dependent factor accelerating failures or if minimal repair maintenance is appropriate. We further assume that system logic for the redundant block is 2-out-of-N:G. Reliability measures are obtained as functions of τ which represents a fixed time after which Maintenance Teams must have replaced any failed component. Such measures are determined for small λ(t)τ, which is the parameter range of most interest. The triplex version, which often occurs in practice, is treated in some detail where the system reliability is determined from the solution of a first order differential-delay equation (DDE). This is solved exactly in the case of constant λ(t), but must be solved numerically in general. A general means of numerical solution for the triplex system is given, and an example case is solved for a rocof resembling a bathtub curve.
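For the triplex case with constant λ, the quoted small-λ(t)τ regime can be checked numerically: each of the three unit pairs produces coincident outages at a rate of roughly 2λ²τ, giving a system failure rate of about 6λ²τ. The Monte Carlo sketch below is illustrative only (constant rate, arbitrary parameter values) and is not the differential-delay-equation solution developed in the paper.

```python
import math
import random

def simulate_triplex(lam, tau, horizon, trials, seed=1):
    """Estimate P(system failure) for a 2-out-of-3:G block over `horizon`.

    Each of the 3 units fails in an independent Poisson flow of rate `lam`
    (the flow continues regardless of repair state, as with minimal repair),
    and a failed unit is restored exactly `tau` after it fails.  The system
    fails if two units are ever down at the same time.
    """
    rng = random.Random(seed)
    failed_runs = 0
    for _ in range(trials):
        # Sample each unit's failure epochs over the horizon.
        units = []
        for _ in range(3):
            t, epochs = 0.0, []
            while True:
                t += rng.expovariate(lam)
                if t > horizon:
                    break
                epochs.append(t)
            units.append(epochs)
        # Two failures on different units closer than tau overlap in downtime.
        def overlap(a, b):
            return any(abs(x - y) < tau for x in a for y in b)
        if (overlap(units[0], units[1]) or overlap(units[0], units[2])
                or overlap(units[1], units[2])):
            failed_runs += 1
    return failed_runs / trials

lam, tau, horizon = 0.02, 1.0, 200.0
p_sim = simulate_triplex(lam, tau, horizon, trials=20000)
p_small_lt = 1.0 - math.exp(-6.0 * lam**2 * tau * horizon)  # small lam*tau limit
```

With λτ = 0.02 the simulated failure probability should agree with 1 − exp(−6λ²τT) to within a few percent; the agreement degrades as λτ grows, which is why the paper treats the small-λ(t)τ range analytically.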
Styles APA, Harvard, Vancouver, ISO, etc.
4

Fan, Chongqing, Zhenzhou Lu, and Yan Shi. "Time-dependent failure possibility analysis under consideration of fuzzy uncertainty." Fuzzy Sets and Systems 367 (July 2019): 19–35. http://dx.doi.org/10.1016/j.fss.2018.06.016.

Full text
Styles APA, Harvard, Vancouver, ISO, etc.
5

Jiang, S., and R. Kumar. "Failure Diagnosis of Discrete-Event Systems With Linear-Time Temporal Logic Specifications." IEEE Transactions on Automatic Control 49, no. 6 (June 2004): 934–45. http://dx.doi.org/10.1109/tac.2004.829616.

Full text
Styles APA, Harvard, Vancouver, ISO, etc.
6

Guo, Lu, and Xiaodong Liu. "Mission-Oriented Missile Equipment Support System Modeling: Considering the Failure and Health State." Mathematical Problems in Engineering 2022 (January 31, 2022): 1–18. http://dx.doi.org/10.1155/2022/5026555.

Full text
Abstract:
The missile equipment support system is a complex system involving the mission profile, operating environment, support personnel, resource scheduling, and other factors, and most equipment failures are a process of continuous evolution over time. Based on the characteristics of equipment functions or performance parameters and current detection methods, only part of the equipment health degradation trend can be monitored. Therefore, it is necessary to carry out support system simulation that considers equipment failure and health state comprehensively. In this paper, a mission-oriented, agent-based modeling framework for the missile equipment support system is proposed. Agents for equipment, support, management, and environment are abstracted, and the behavior logic, information transmission links, and interaction mechanisms among agents are explained. The failure occurrence handling mechanism and the health state degradation handling mechanism are determined. The equipment recovery behavior models, including the equipment health state control model, the corrective maintenance behavior model, and the system reconstruction behavior model, are constructed, and the simulation process and implementation method are explained. Finally, a certain type of missile equipment is taken as an example for simulation. The results show that the mean time between failures can be extended and the number of corrective maintenance actions reduced by maintenance support that considers equipment failure and health deterioration together. In addition, the simulation results show the variation trend of equipment availability, which can lay a solid foundation for the future optimization of equipment support resources, so as to improve the availability of equipment over the whole life cycle.
Styles APA, Harvard, Vancouver, ISO, etc.
7

Wang, Fang, Zhi Liu, Yun Zhang, and C. L. Philip Chen. "Adaptive finite-time control of stochastic nonlinear systems with actuator failures." Fuzzy Sets and Systems 374 (November 2019): 170–83. http://dx.doi.org/10.1016/j.fss.2018.12.005.

Full text
Styles APA, Harvard, Vancouver, ISO, etc.
8

Pop, Liliana. "Time and crisis: framing success and failure in Romania's post-communist transformations." Review of International Studies 33, no. 3 (July 2007): 395–413. http://dx.doi.org/10.1017/s0260210507007577.

Full text
Abstract:
This article analyses the complex interplay between domestic systemic transformations in post-communist Europe and the reintegration of these countries into the global political economy, through a study of the relationship between successive Romanian governments and the international financial institutions in the 1990s. Conceptually, it seeks to overcome the dichotomies of realist and rationalist approaches to international relations by deploying a conceptual framework of fields, habitus, and practices inspired by the work of Pierre Bourdieu. The article captures both the symbolic and the material-structural dimensions of the interaction between domestic field-creation and the reproduction of global economic and political fields. It suggests that practices aimed at the reproduction of power hierarchies are also modulated by symbolic requirements, to save face and, whenever possible, to avoid open conflict, in keeping with the logic of honour.
Styles APA, Harvard, Vancouver, ISO, etc.
9

Cropper, Andrew, and Rolf Morel. "Learning programs by learning from failures." Machine Learning 110, no. 4 (February 19, 2021): 801–56. http://dx.doi.org/10.1007/s10994-020-05934-z.

Full text
Abstract:
We describe an inductive logic programming (ILP) approach called learning from failures. In this approach, an ILP system (the learner) decomposes the learning problem into three separate stages: generate, test, and constrain. In the generate stage, the learner generates a hypothesis (a logic program) that satisfies a set of hypothesis constraints (constraints on the syntactic form of hypotheses). In the test stage, the learner tests the hypothesis against training examples. A hypothesis fails when it does not entail all the positive examples or entails a negative example. If a hypothesis fails, then, in the constrain stage, the learner learns constraints from the failed hypothesis to prune the hypothesis space, i.e. to constrain subsequent hypothesis generation. For instance, if a hypothesis is too general (entails a negative example), the constraints prune generalisations of the hypothesis. If a hypothesis is too specific (does not entail all the positive examples), the constraints prune specialisations of the hypothesis. This loop repeats until either (i) the learner finds a hypothesis that entails all the positive and none of the negative examples, or (ii) there are no more hypotheses to test. We introduce Popper, an ILP system that implements this approach by combining answer set programming and Prolog. Popper supports infinite problem domains, reasoning about lists and numbers, learning textually minimal programs, and learning recursive programs. Our experimental results on three domains (toy game problems, robot strategies, and list transformations) show that (i) constraints drastically improve learning performance, and (ii) Popper can outperform existing ILP systems, both in terms of predictive accuracies and learning times.
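The generate-test-constrain loop is easy to demonstrate on a toy concept-learning problem. The sketch below is an illustration of the idea, not Popper itself: here a "hypothesis" is just a conjunction of attribute tests, generality is subset inclusion, and the examples are invented, whereas Popper searches over genuine logic programs using answer set programming and Prolog.

```python
from itertools import combinations

# Toy attribute tests; a hypothesis is a conjunction (frozenset) of them.
LITERALS = ["red", "round", "small", "heavy"]
# Examples: attribute sets labelled positive or negative.
POS = [{"red", "round", "small"}, {"red", "round", "heavy"}]
NEG = [{"red", "small"}, {"round", "heavy"}]

def covers(h, example):
    return h <= example              # all conjuncts hold in the example

def learn():
    # Generate: enumerate hypotheses from most general (fewest conjuncts) on.
    space = [frozenset(c) for r in range(len(LITERALS) + 1)
             for c in combinations(LITERALS, r)]
    pruned = set()
    for h in space:
        if h in pruned:
            continue
        # Test against the examples.
        too_general = any(covers(h, e) for e in NEG)
        too_specific = not all(covers(h, e) for e in POS)
        if not too_general and not too_specific:
            return h                 # entails all positives, no negatives
        # Constrain: prune the hypothesis space from the failure.
        if too_general:              # generalisations (subsets) also fail
            pruned |= {g for g in space if g <= h}
        if too_specific:             # specialisations (supersets) also fail
            pruned |= {s for s in space if s >= h}
    return None
```

A hypothesis that covers a negative example prunes all of its generalisations (subsets); one that misses a positive example prunes all of its specialisations (supersets), exactly the two pruning directions described above.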
Styles APA, Harvard, Vancouver, ISO, etc.
10

Muyizere, Darius, Lawrence K. Letting, and Bernard B. Munyazikwiye. "Decreasing the Negative Impact of Time Delays on Electricity Due to Performance Improvement in the Rwanda National Grid." Electronics 11, no. 19 (September 29, 2022): 3114. http://dx.doi.org/10.3390/electronics11193114.

Full text
Abstract:
One of the most common power-system problems today is communication and control delay, which can adversely affect decision interaction in grid security management. This paper focuses on how to identify and address communication system failures in the context of grid monitoring and control, with emphasis on communication signal delay. To counteract the negative effects of time delays, a thyristor-switched capacitor (TSC) and a thyristor-controlled reactor (TCR) are used to improve the power quality of the Rwandan National Grid (RNG) with synchronous and PV generators. To this end, the TSC and TCR architectures use two methods: the fuzzy logic controller (FLC) method and the modified predictor method (MPM). The experiment was performed using the MATLAB Simulink tool. The power quality of the system was assessed using two indicators: the voltage index and total harmonic distortion. The FLC-based approach was shown to outperform the MPM for temporary or permanent failures when the correct outcome was found. As a result, it remains uncertain whether the TSC and TCR can continue to provide favorable results in the event of a network cyber-attack.
Styles APA, Harvard, Vancouver, ISO, etc.
11

Rahimi, Tohid, Hossein Jahan, Frede Blaabjerg, Amir Bahman, and Seyed Hosseini. "Fuzzy-Logic-Based Mean Time to Failure (MTTF) Analysis of Interleaved Dc-Dc Converters Equipped with Redundant-Switch Configuration." Applied Sciences 9, no. 1 (December 27, 2018): 88. http://dx.doi.org/10.3390/app9010088.

Full text
Abstract:
Interleaved dc-dc converters in sensitive applications require enhanced reliability, and an interleaved converter equipped with redundant components can fulfill this requirement. Mean Time to Failure (MTTF), as a reliability index, can be used to evaluate the expected life span of such converters, and the Markov model is a helpful tool for calculating it. Different scientific reports assign different failure rates, with different weights, to power elements; moreover, in reliability reports the failure rates of active and passive components are uncertain values. To approximate these failure rates, fuzzy-logic-based Markov models are proposed in this paper and used to evaluate the MTTF of an interleaved multi-phase dc-dc converter equipped with parallel and standby switch configurations. For the first time, fuzzy curves for the MTTFs of the converters and a 3D reliability function are derived. The reliability analyses give insight into finding appropriate redundant-switch configurations for interleaved dc-dc converters under different conditions. Simulation and experimental results are provided to lend credence to the viability of the studied redundant-switch configurations in an interleaved dc-dc boost converter.
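For intuition, the Markov-model MTTF calculation mentioned in the abstract can be sketched for the two simplest redundant-switch arrangements, with the fuzzy failure rate crudely represented by evaluating at the endpoints of an interval (one α-cut). The rates and the repair-free absorbing chain below are illustrative assumptions, not the converter models from the paper.

```python
def mttf_of_chain(exit_rates):
    """MTTF of an absorbing Markov chain traversed in order (no repair):
    the expected sojourn time in each transient state is 1/rate."""
    return sum(1.0 / r for r in exit_rates)

def mttf_parallel_pair(lam):
    # Both switches up (exit rate 2*lam) -> one up (exit rate lam) -> failed.
    return mttf_of_chain([2.0 * lam, lam])

def mttf_standby_pair(lam):
    # Cold standby with ideal switching: the spare does not age until used.
    return mttf_of_chain([lam, lam])

# Crude fuzzy treatment: evaluate the MTTF over an interval of failure rates.
lam_lo, lam_hi = 8e-6, 1.2e-5          # failures/hour, illustrative alpha-cut
band = [(mttf_parallel_pair(l), mttf_standby_pair(l)) for l in (lam_hi, lam_lo)]
```

Evaluating the MTTF over an interval of rates is the cheapest stand-in for the paper's fuzzy curves: sweeping α from 0 to 1 narrows the interval and traces out a fuzzy MTTF membership function.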
Styles APA, Harvard, Vancouver, ISO, etc.
12

Bodoh, Dan, and Kent Erington. "Finding the 'When' and Improving the 'Where' With Picosecond Time-Resolved LADA." EDFA Technical Articles 17, no. 2 (May 1, 2015): 10–17. http://dx.doi.org/10.31399/asm.edfa.2015-2.p010.

Full text
Abstract:
Laser-assisted device alteration (LADA) is an effective tool for identifying speed-limiting paths in ICs. When implemented with a continuous wave laser, it can reveal where the speed-limiting path resides but not when the slow (or fast) logic transition is occurring. To overcome this limitation, an enhanced version of the technique has been developed. This article discusses the capabilities of the new method, called picosecond time-resolved LADA, and explains how it complements the existing failure analysis toolset, facilitating faster resolution of issues and root-cause identification.
Styles APA, Harvard, Vancouver, ISO, etc.
13

Jiang, Shengbing, and R. Kumar. "Diagnosis of repeated failures for discrete event systems with linear-time temporal-logic specifications." IEEE Transactions on Automation Science and Engineering 3, no. 1 (January 2006): 47–59. http://dx.doi.org/10.1109/tase.2005.860613.

Full text
Styles APA, Harvard, Vancouver, ISO, etc.
14

Kaur, Ravinder. "'I Am India Shining': The Investor-Citizen and the Indelible Icon of Good Times." Journal of Asian Studies 75, no. 3 (August 2016): 621–48. http://dx.doi.org/10.1017/s0021911816000619.

Full text
Abstract:
This article is an against-the-grain reading of the highly publicized failure of the 2004 India Shining campaign. Aimed at the Indian publics, this mega-publicity spectacle sought to communicate the success of neoliberal reforms in transforming India from a developing nation into a lucrative emerging market in the global economy. Rather than uplift the mood of the nation, the campaign brought to the surface the underlying acrimony and exclusion experienced by a vast majority of the population. I argue that the discourse of India Shining's failure misreads the electoral defeat of the Bharatiya Janata Party as the failure of the neoliberal project of economic reforms. In fact, the India Shining images helped popularize the reforms at an unprecedented mass scale, when until then they had largely been confined to elite policy debates and reform packages. What the defeat registered was the discontent accrued by those excluded from the good times seemingly ushered in by the reforms, not the reforms per se. The India Shining controversy also allows us to witness a new form of investor-citizenship shaped around the language and logic of loss and profit. The very edifice of failure makes apparent the shift to a capitalist dream world and the withering away of the old order in post-reform India.
Styles APA, Harvard, Vancouver, ISO, etc.
15

Sang, D.-K., and M.-J. Tahk. "Guidance law switching logic considering the seeker's field-of-view limits." Proceedings of the Institution of Mechanical Engineers, Part G: Journal of Aerospace Engineering 223, no. 8 (August 1, 2009): 1049–58. http://dx.doi.org/10.1243/09544100jaero614.

Full text
Abstract:
The impact time control guidance (ITCG) method, which has been proposed recently, can be applied successfully to a salvo attack by multiple missiles. Compared to the proportional navigation guidance law, this guidance method makes additional manoeuvres to synchronize the impact times. However, such manoeuvres do not account for the missile's manoeuvrability or the seeker's field-of-view (FOV) and may cause the target to move out of the seeker's FOV; maintaining the seeker lock-on condition during the engagement is critical for missile guidance. To solve this problem, two methods are presented in this article: one is based on calculating the minimum and maximum flight times, considering the missile's manoeuvring limit and the seeker's FOV limit, to check the available impact time. The other is based on guidance law switching logic that keeps the seeker's target look angle constant. These methods provide the boundary limits on the impact time of the salvo attack and prevent lock-on failure due to the seeker's FOV limit during the homing phase when ITCG is used. The methods were applied to a time-critical salvo attack by multiple missiles with a manoeuvring limit and a seeker FOV limit, and the desired results were obtained.
Styles APA, Harvard, Vancouver, ISO, etc.
16

Liu, Linna. "Intelligent Detection and Diagnosis of Power Failure Relying on BP Neural Network Algorithm." Computational Intelligence and Neuroscience 2022 (September 21, 2022): 1–10. http://dx.doi.org/10.1155/2022/3758660.

Full text
Abstract:
The development of the economy and the needs of urban planning have led to rapid growth in power applications and a correspondingly frequent occurrence of power failures, which often lead to economic losses when faults are not repaired in time. To address these needs and shortcomings, this paper introduces a BP neural network algorithm to determine the network structure and parameters for fault diagnosis of power electronic inverter circuits. By optimizing the weights and thresholds of the neural network, the learning and generalization ability of the fault diagnosis system can be improved. The method can effectively extract fault features for training, sort out the business logic of intelligent power-supply detection, analyze the potential hazards of the power supply, and perform intelligent circuit control to achieve effective fault detection of power supply circuits. It can provide timely feedback and hints to improve fault identification ability and the corresponding diagnosis accuracy. Simulation results show that the method can determine the threshold value for intelligent power fault detection and diagnosis by analyzing the convergence of long-term relevant indicators, avoiding the blindness of subjective experience and providing a theoretical basis for intelligent detection and diagnosis.
Styles APA, Harvard, Vancouver, ISO, etc.
17

Hantz, Didier, Jordi Corominas, Giovanni B. Crosta, and Michel Jaboyedoff. "Definitions and Concepts for Quantitative Rockfall Hazard and Risk Analysis." Geosciences 11, no. 4 (April 1, 2021): 158. http://dx.doi.org/10.3390/geosciences11040158.

Full text
Abstract:
There is an increasing need for quantitative rockfall hazard and risk assessment, which requires a precise definition of the terms and concepts used for this particular type of landslide. This paper suggests using terms that are as logical and explicit as possible and describes methods to derive some of the main hazard and risk descriptors. The terms and concepts presented concern the rockfall process (failure, propagation, fragmentation, modelling) and the hazard and risk descriptors, distinguishing the cases of localized and diffuse hazards. For a localized hazard, the failure probability of the considered rock compartment in a given period of time has to be assessed, and the probability for a given element at risk to be impacted with a given energy is derived by combining the failure probability, the reach probability, and the exposure of the element. For a diffuse hazard, which is characterized by a failure frequency, the number of rockfalls reaching the element at risk per unit of time and with a given energy (the passage frequency) can be derived. This frequency is relevant for risk assessment when the element at risk can be damaged several times. If it is not replaced, the probability that it is impacted by at least one rockfall is more relevant.
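The two descriptor combinations described above reduce to compact formulas: for a localized hazard the impact probability is the product of failure probability, reach probability, and exposure (assuming independence), while for a diffuse hazard with passage frequency f the probability of at least one impact in time t is 1 − exp(−ft) under Poisson arrivals. A minimal sketch with invented numbers:

```python
import math

def localized_impact_probability(p_failure, p_reach, exposure):
    """P(element impacted) as the product of the three descriptors,
    assuming they are independent."""
    return p_failure * p_reach * exposure

def diffuse_impact_probability(passage_frequency, years):
    """P(at least one impact in `years`) for Poisson rockfall arrivals."""
    return 1.0 - math.exp(-passage_frequency * years)

# Illustrative values: 10% failure probability over the period, 50% reach
# probability, element present 20% of the time.
p_loc = localized_impact_probability(0.10, 0.50, 0.20)
# Diffuse case: 0.01 impacts/year reaching the element, 50-year exposure.
p_dif = diffuse_impact_probability(0.01, 50.0)
```

The diffuse formula is the one relevant when the element cannot be impacted more than once; for a replaceable element the expected number of impacts, frequency times duration, is the more useful risk descriptor.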
Styles APA, Harvard, Vancouver, ISO, etc.
18

Myers, Sharon A., Troy D. Cognata, and Hugh Gotts. "FTIR analysis of printed-circuit board residue." Proceedings, annual meeting, Electron Microscopy Society of America 54 (August 11, 1996): 264–65. http://dx.doi.org/10.1017/s0424820100163782.

Full text
Abstract:
Logic boards were failing at Enhanced Mac Minus One (EMMO) test or Integrated Circuit Test (ICT) after printed circuit board (PCB) rework. The failure to boot was originally traced to a suspected bad microcontroller chip. Replacing this chip, or an oscillator tied to the microcontroller circuit, did not consistently solve the boot problem. With further testing, it was found that the microcontroller circuit was very sensitive to resistance and was essentially shorted. A resistor in the microcontroller circuit was identified on the flip side of the PCB. Several areas on the board, including the resistor R161, were seen to have a slight white haze/low-gloss appearance on the surface of the PCB. To test whether the residue was electrically conductive, five boards were selected whose sole failure was R161. The resistance of the individual resistors was measured with a digital multimeter (DMM). The resistor was then cleaned with isopropyl alcohol and a cotton swab. Each board was retested at ICT and the individual resistors were measured again with a DMM. Cleaning the area surrounding the resistor with isopropyl alcohol corrected the failure in four of the five cases.
Styles APA, Harvard, Vancouver, ISO, etc.
19

Lim, Hyunyul, Minho Cheong, and Sungho Kang. "Scan-Chain-Fault Diagnosis Using Regressions in Cryptographic Chips for Wireless Sensor Networks." Sensors 20, no. 17 (August 24, 2020): 4771. http://dx.doi.org/10.3390/s20174771.

Full text
Abstract:
Scan structures, which are widely used in cryptographic circuits for wireless sensor network applications, are essential for testing very-large-scale integration (VLSI) circuits. Faults in cryptographic circuits can be effectively screened out by improving testability and test coverage using a scan structure. Additionally, scan testing contributes to yield improvement by identifying fault locations. However, faults in circuits cannot be tested when a fault occurs in the scan structure itself, and various defects occurring early in the manufacturing process manifest as faults of scan chains. Scan-chain diagnosis is therefore crucial, but it is difficult to obtain sufficiently high diagnosis resolution and accuracy through conventional scan-chain diagnosis. This article proposes a novel scan-chain diagnosis method using regression and fan-in and fan-out filters that requires shorter training and diagnosis times than existing scan-chain diagnoses do. The fan-in and fan-out filters, generated using the circuit logic structure, can highlight important features and remove unnecessary features from raw failure vectors, converting them to fan-in and fan-out vectors without compromising diagnosis accuracy. Experimental results confirm that the proposed scan-chain diagnosis method can efficiently provide higher resolution and accuracy with shorter training and diagnosis times.
Styles APA, Harvard, Vancouver, ISO, etc.
20

Aghajanian, Soheil, Guruprasad Rao, Vesa Ruuskanen, Radosław Wajman, Lidia Jackowska-Strumillo, and Tuomas Koiranen. "Real-Time Fault Detection and Diagnosis of CaCO3 Reactive Crystallization Process by Electrical Resistance Tomography Measurements." Sensors 21, no. 21 (October 20, 2021): 6958. http://dx.doi.org/10.3390/s21216958.

Full text
Abstract:
In the present research work, an electrical resistance tomography (ERT) system is utilized as a means for real-time fault detection and diagnosis (FDD) during a reactive crystallization process. The calcium carbonate crystallization is part of the carbon capture and utilization scheme where process monitoring and malfunction diagnostics strategies are presented. The graphical logic representation of the fault tree analysis methodology is used to develop the system failure states. The measurement consistency due to the use of a single electrode from a set of ERT electrodes for malfunction identification is experimentally and quantitatively investigated based on the sensor sensitivity and standard deviation criteria. Electrical current measurements are employed to develop a LabVIEW-based process automation program by using the process-specific knowledge and historical process data. Averaged electrical current is correlated to the mechanical failure of the stirrer through standard deviation evaluation, and slopes of the measured data are used to monitor the pump and concentrations status. The performance of the implemented methodology for detecting the induced faults and abnormalities is tested at different operating conditions, and a basic signal-based alarming technique is developed.
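The signal-based alarming the abstract describes (standard deviation for the stirrer, slope for the pump) can be sketched in a few lines. The window, thresholds, and assumed failure signatures below (a stalled stirrer flattening the averaged current, a pump fault causing sustained drift) are illustrative guesses, not the calibrated logic of the authors' LabVIEW program.

```python
from statistics import pstdev

def check_window(samples, std_floor, slope_limit):
    """Return alarm flags for one window of averaged electrical current."""
    slope = (samples[-1] - samples[0]) / (len(samples) - 1)
    return {
        "stirrer": pstdev(samples) < std_floor,  # mixing stopped -> flat signal
        "pump": abs(slope) > slope_limit,        # feed fault -> sustained drift
    }

healthy = [4.9, 5.1, 5.0, 5.2, 4.8, 5.1]
drifting = [5.0, 4.6, 4.2, 3.8, 3.4, 3.0]        # e.g. pump starving the cell
print(check_window(healthy, std_floor=0.05, slope_limit=0.2))
print(check_window(drifting, std_floor=0.05, slope_limit=0.2))
```

In practice the thresholds would be fitted from historical process data, as the paper does, rather than hard-coded; the point here is only the two-indicator structure of the alarm logic.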
Styles APA, Harvard, Vancouver, ISO, etc.
21

Tahiri, Imane, Alexandre Philippot, Véronique Carré-Ménétrier, and Bernard Riera. "A Fault-Tolerant and Reconfigurable Control Framework: Modeling, Design, and Synthesis." Processes 11, no. 3 (February 26, 2023): 701. http://dx.doi.org/10.3390/pr11030701.

Full text
Abstract:
Manufacturing systems (MS) have become increasingly complex due to constraints induced by a changing environment, such as flexibility, availability, competition, and key performance indicators. This change has led to a need for flexible systems capable of adapting to production changes while meeting productivity and quality criteria and reducing the risk of failures. This paper provides a methodology for designing reconfigurable, fault-tolerant control for implementation in a Programmable Logic Controller (PLC). The main contribution of this methodology is a safe control synthesis founded on timed properties. If a sensor fault is detected, the controller switches from the normal behavior to a degraded one, in which timed information replaces the information lost from the faulty sensor. Switching between the normal and degraded behaviors is ensured through reconfiguration rules. The primary objective of this method is to implement the obtained control in a PLC. To achieve this goal, a method is proposed to translate the controllers of the two behavior modes and the reconfiguration rules into different Grafcets. This approach relies on the modular architecture of manufacturing systems to avoid the combinatorial explosion that occurs in several other approaches.
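The switch between the two behaviors can be sketched as a tiny controller fragment: while the sensor is trusted it decides step completion; once declared faulty, a timed property (nominal travel time plus a margin) substitutes for it. The class, names, and timings below are hypothetical, a sketch of the reconfiguration idea rather than the synthesized Grafcet control of the paper.

```python
class ReconfigurableStep:
    """One controlled movement with a sensor-based and a time-based mode."""

    def __init__(self, travel_time, margin):
        self.deadline = travel_time + margin   # timed property for degraded mode
        self.elapsed = 0.0

    def done(self, dt, sensor_ok, sensor_hit):
        """Advance by dt; report completion in the currently active mode."""
        self.elapsed += dt
        if sensor_ok:
            return sensor_hit                  # normal behavior: trust the sensor
        return self.elapsed >= self.deadline   # degraded behavior: use time

step = ReconfigurableStep(travel_time=2.0, margin=0.5)
# Normal mode: completion is reported on the cycle the sensor fires.
# Degraded mode (sensor_ok=False): completion after 2.5 s of elapsed time.
```

In a real PLC cycle, `dt` would be the scan time, and a reconfiguration rule would also latch the active mode so a step cannot flip back mid-movement.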
Styles APA, Harvard, Vancouver, ISO, etc.
22

Yan, Shurong, Ayman A. Aly, Bassem F. Felemban, Meysam Gheisarnejad, Manwen Tian, Mohammad Hassan Khooban, Ardashir Mohammadzadeh, and Saleh Mobayen. "A New Event-Triggered Type-3 Fuzzy Control System for Multi-Agent Systems: Optimal Economic Efficient Approach for Actuator Activating." Electronics 10, no. 24 (December 15, 2021): 3122. http://dx.doi.org/10.3390/electronics10243122.

Full text
Abstract:
This study presents a new approach for multi-agent systems (MASs). The agent dynamics are approximated by the suggested type-3 (T3) fuzzy logic system (FLS). Some sufficient conditions based on the event-triggered scheme are presented to ensure stability with less activation of the actuators. New tuning rules are obtained for T3-FLSs from the stability and robustness examination. The effects of perturbations, actuator failures, and approximation errors are compensated by the designed adaptive compensators. Simulation results show that the outputs of all agents converge well to the leader agent under disturbances and faulty conditions. Additionally, the suggested event-triggered scheme is shown to be effective, with the actuators updated in only about 20–40% of the total sample times.
Styles APA, Harvard, Vancouver, ISO, etc.
23

Vinogradova, N. G., D. S. Polyakov, and I. V. Fomin. "The risks of re-hospitalization of patients with heart failure with prolonged follow-up in a specialized center for the treatment of heart failure and in real clinical practice." Kardiologiia 60, no. 3 (May 3, 2020): 59–69. http://dx.doi.org/10.18087/cardio.2020.3.n1002.

Full text
Abstract:
Relevance The number of patients with functional class III–IV chronic heart failure (CHF) characterized by frequent rehospitalization for acute decompensated HF (ADHF) has increased. Rehospitalizations significantly increase the cost of patient management and the burden on the health care system. Objective To determine the effect of long-term follow-up at a specialized center for treatment of HF (Center for Treatment of Chronic Heart Failure, CTCHF) on the risk of rehospitalization for patients after ADHF. Materials and Methods The study successively included 942 patients with CHF after ADHF. Group 1 consisted of 510 patients who continued outpatient follow-up at the CTCHF, and group 2 included 432 patients who declined follow-up at the CTCHF and were managed at outpatient clinics at their place of residence. Patient compliance with recommendations and the frequency of rehospitalization for ADHF were determined from outpatient medical records and structured telephone calls. A rehospitalization for ADHF was recorded if the patient stayed for more than one day in the hospital and required intravenous loop diuretics. The follow-up period was two years. Statistical analyses were performed using Statistica 7.0 for Windows, SPSS, and the R statistical package. Results Patients of group 2 were significantly older, more frequently had FC III CHF and less frequently had FC I CHF than patients of group 1. Both groups contained more women and HF patients with preserved ejection fraction. A mathematical model built with binary multifactorial logit regression showed that the risk of rehospitalization during the entire follow-up period did not depend on age or sex but was significantly increased 2.4-fold for patients with FC III–IV CHF and 3.4-fold for patients of group 2.
Multinomial multifactorial logit regression showed that the risk of one, two, three or more rehospitalizations within two years was significantly higher in group 2 than in group 1 (2.9–4.5 times, depending on the number of rehospitalizations) and for patients with FC III–IV CHF compared to patients with FC I–II CHF (2–3.2 times, depending on the number of rehospitalizations). The proportion of readmitted patients during the first year of follow-up was significantly greater in group 2 than in group 1 (55.3% vs. 39.8% of patients; odds ratio (OR)=1.9; 95% confidence interval (CI), 1.4–2.4; p<0.001); during the second year, the proportion was 67.4% vs. 28.2% (OR=5.3; 95% CI, 3.9–7.1; p<0.001). Patients of group 1 were readmitted more frequently during the first year than during the second year (p<0.001), whereas patients of group 2 were readmitted more frequently during the second year than during the first (p<0.001). The total proportion of readmitted patients over two years of follow-up was significantly greater in group 2 (78.0% vs. 50.6%; OR=3.5; 95% CI, 2.6–4.6; p<0.001). Reasons for rehospitalization were identified in 88.7% and 45.9% of the total number of readmitted patients in groups 1 and 2, respectively. The main cause of ADHF was non-compliance with recommendations, in 47.4% and 66.7% of patients of groups 1 and 2, respectively (p<0.001). Conclusion Follow-up in the system of specialized health care significantly decreases the risk of rehospitalization during the first and second years of follow-up, and during two years in total, for both patients with FC I–II CHF and FC III–IV CHF. Despite education of patients, personal contacts with medical personnel, and telephone support, the main reasons for rehospitalization were avoidable.
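The odds ratios reported above can be reproduced from the group sizes and readmission percentages in the abstract. A minimal sketch of the standard OR-with-Wald-CI calculation; the 2×2 counts below are reconstructed from the published percentages (55.3% of 432 vs. 39.8% of 510), so they are approximate:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:
    a = exposed events, b = exposed non-events,
    c = unexposed events, d = unexposed non-events."""
    or_ = (a / b) / (c / d)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo, hi = (math.exp(math.log(or_) + s * z * se) for s in (-1, 1))
    return or_, lo, hi

# First-year readmissions: group 2 (n=432, 55.3%) vs. group 1 (n=510, 39.8%)
a, b = 239, 193   # ~0.553 * 432 readmitted / not readmitted (reconstructed)
c, d = 203, 307   # ~0.398 * 510 (reconstructed)
or_, lo, hi = odds_ratio_ci(a, b, c, d)
print(round(or_, 1), round(lo, 1), round(hi, 1))  # ~1.9, 1.4, 2.4 as reported
```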
24

R, Thenmozhi, Sharmeela C, Natarajan P and Velraj R. « Fuzzy Logic Controller Based Bridgeless Isolated Interleaved Zeta Converter for LED Lamp Driver Application ». International Journal of Power Electronics and Drive Systems (IJPEDS) 7, no. 2 (13 February 2016): 509. http://dx.doi.org/10.11591/ijpeds.v7.i2.pp509-520.

Abstract:
In recent times, High-Brightness Light-Emitting Diodes (HB-LEDs) have been developing rapidly, and they are widely expected to be the future of lighting not only because of their high efficiency and high reliability but also because of other exceptional features: chromatic variety, shock and vibration resistance, etc. In this paper, a Bridgeless (BL) Isolated Interleaved Zeta Converter is proposed to reduce diode failures and losses; the output ripple is also decreased. The proposed BL isolated interleaved zeta converter, operating in Discontinuous Conduction Mode (DCM), is used to control the brightness of the LED driver with inherent PFC at the ac mains using a single voltage sensor. A fuzzy logic controller (FLC) adjusts the modulation index of the voltage controller in order to improve the dynamic response of the LED lamp driver. Based on the error of the converter output voltage, the FLC is designed to select the optimum modulation index of the voltage controller. The proposed LED driver is simulated to achieve a unity power factor at the ac mains for a wide range of voltage control and supply voltage fluctuations.
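The paper's FLC maps the output-voltage error to a modulation-index adjustment. A minimal Sugeno-style sketch with triangular memberships; the breakpoints, rule consequents, and step sizes here are illustrative assumptions, not the authors' design:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def flc_delta_m(error):
    """Map normalized voltage error to a modulation-index step."""
    # Fuzzify into negative / zero / positive error (illustrative breakpoints)
    mu = {
        "NEG": tri(error, -2.0, -1.0, 0.0),
        "ZERO": tri(error, -1.0, 0.0, 1.0),
        "POS": tri(error, 0.0, 1.0, 2.0),
    }
    # Rule consequents: decrease / hold / increase the modulation index
    out = {"NEG": -0.05, "ZERO": 0.0, "POS": 0.05}
    num = sum(mu[k] * out[k] for k in mu)
    den = sum(mu.values()) or 1.0
    return num / den  # weighted-average (Sugeno) defuzzification

m = 0.8
m += flc_delta_m(0.5)  # positive error -> modulation index nudged upward
```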
25

Hassan, Ahmad, Zahira Mokhtar and Mazleha Maskin. « Time dependent reliability analysis for a critical reactor safety system based on fault tree approach ». IOP Conference Series: Materials Science and Engineering 1231, no. 1 (1 February 2022): 012015. http://dx.doi.org/10.1088/1757-899x/1231/1/012015.

Abstract:
Abstract Loss of coolant in the operation of any nuclear power plant will eventually become the primary source of hazard in the sequence of events leading to reactor core uncovery. Subsequent failure to remove the nuclear decay heat and prevent core uncovery will further lead to a loss of coolant accident (LOCA). In view of this safety concern, it is crucial that the reliability of the plant's emergency core cooling system be systematically and critically analysed. This article presents a case study on the time-dependent reliability analysis of a safety injection system (SIS) of an advanced pressurized water reactor, based on the failure mode and effect analysis (FMEA) and fault tree (FT) analysis approach. The identified generic data for component reliability are carefully reviewed and used in this study. Based on the base case model, sensitivity and importance measure analyses for basic events are performed and the outcomes are presented and discussed. The analysis shows that the safety injection pumps of the SIS contribute significantly to the reliability of the system: in the short run (at 0.5 hours) and the long run (7.0 hours and 72.0 hours), the safety injection pumps are critical and influence the reliability of the SIS the most. The SIS FT logic model developed and evaluated here demonstrates the usefulness of the FMEA and FT approach for analysing the time-dependent reliability of the SIS.
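Time-dependent unavailability of a small fault tree can be evaluated from component failure rates, assuming (as is common with generic reliability data) exponentially distributed times to failure. The tree below (two redundant injection pumps under an AND gate, OR-ed with an injection valve) and the λ values are illustrative, not the paper's SIS model:

```python
import math

def fail_prob(lmbda, t):
    """Failure probability of a component by time t under an exponential model."""
    return 1.0 - math.exp(-lmbda * t)

def sis_unavailability(t):
    # Illustrative failure rates (per hour), not the paper's generic data
    pump = fail_prob(1e-3, t)    # each (identical) safety injection pump
    valve = fail_prob(1e-5, t)   # common injection valve
    # Top event = (pump1 AND pump2) OR valve, assuming independent basic events
    and_gate = pump * pump
    return 1.0 - (1.0 - and_gate) * (1.0 - valve)

for t in (0.5, 7.0, 72.0):       # the study's short- and long-run mission times
    print(t, sis_unavailability(t))
```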
26

Reed, Lynn. « A 250°C ASIC Technology ». Additional Conferences (Device Packaging, HiTEC, HiTEN, and CICMT) 2013, HITEN (1 January 2013): 000134–38. http://dx.doi.org/10.4071/hiten-ta16.

Abstract:
Tekmos has developed a 250°C ASIC technology that uses the X-Fab XI10 SOI process. A gate array architecture was chosen to allow reduced mask costs and quicker manufacturing cycle times. The design of the technology involved first determining the optimum routing grid and then designing the basic gate array transistors. The “A” style transistor was chosen over the “H” style to create stronger transistors. The choice of transistor in turn sets the characteristics of the basic “Block” used in the gate array architecture. Another factor in the block design is the requirement for a pre-determined source with “A” transistors. This prevents the use of the shared diffusions found in most gate array architectures and resulted in a different block layout. The pre-determined sources also required a change to the logic cell library. Since the basic transmission gate found in most flop designs cannot be used, alternative logic architectures were developed. By implementing the SOI-specific library into the Tekmos standard logic library, the SOI peculiarities were masked from the end designer. The 250°C ASIC technology was demonstrated in an FPGA conversion, in which a design in an Actel MX series FPGA was reimplemented in the 250°C ASIC technology. A standard FPGA design conversion flow was used, and the only issues were related to the speed and voltage differences between the FPGA and the 1.0μ ASIC. These were addressed through critical path analysis and some slight circuit modifications. The temperature derating for 250°C was significant, but enough margin was retained to allow the circuit to work. Parts were made and worked as expected at 250°C. Life testing results at 280°C have been satisfactory. On an experimental basis, parts were evaluated at temperatures of up to 305°C without failure.
27

Kim, Jonghyuk, and Hyunwoo Hwangbo. « Real-Time Early Warning System for Sustainable and Intelligent Plastic Film Manufacturing ». Sustainability 11, no. 5 (12 March 2019): 1490. http://dx.doi.org/10.3390/su11051490.

Abstract:
In this study, real-time preventive measures were formulated for a crusher process that cannot be fully automated, because sensors cannot be installed during the production of plastic films, and a real-time early warning system for semi-automated processes was subsequently developed. First, the flow of a typical film process was ascertained. Second, a sustainable plan for real-time forecasting in a process that cannot be automated was developed using the semi-automation method flexible structure production control (FSPC). Third, statistical early selection of the process variables most probably responsible for failure was performed during data preprocessing. Then, a new, unified dataset was created using the link reordering method to transform the time sequence of the continuous process into one time zone. Fourth, a sustainable prediction algorithm was developed using the association rule method along with traditional statistical techniques, and verified using actual data. Finally, the overall developed logic was applied to new production process data to verify its prediction accuracy. The developed real-time early warning system for semi-automated processes contributes significantly to the smart manufacturing process both theoretically and practically.
28

Savitri, Wiwiet Eva, and Suvi Akhiriyah. « Errors Analysis of The Sentences Made by Freshmen of English Department ». IJET (Indonesian Journal of English Teaching) 5, no. 2 (28 December 2016): 282–93. http://dx.doi.org/10.15642/ijet2.2016.5.2.282-293.

Abstract:
Writing sentences is not a simple task. Yet it is often found that even students in their final semesters are unable to compose good written texts because of their inability to produce good sentences. This is unfortunate, because the ability to write good sentences is fundamental to the next level of writing. The failure of senior students to make good sentences raises questions about what materials to focus on and how to conduct teaching and learning in writing and grammar classes. In line with this, the present study tries to reveal the errors freshman students usually make in writing sentences. It is expected that the results will help lecturers choose appropriate materials and teaching techniques. The results show that most errors concern grammar, mechanics, logic, and spelling. Hence, it is suggested that freshmen practice sentence writing more, both to practice grammar rules, spelling, and mechanics and to sharpen their ability to build logical sentences.
29

Sitompul, Erwin, and Agus Rohmat. « IoT-based Running Time Monitoring System for Machine Preventive Maintenance Scheduling ». ELKHA 13, no. 1 (20 April 2021): 33. http://dx.doi.org/10.26418/elkha.v13i1.44202.

Abstract:
Machines are valuable assets that need to be protected from damage and failure through proper maintenance measures. This paper proposes a system that automatically monitors the running time of machines and sends notifications regarding their preventive maintenance (PM) schedules. The system core consists of a programmable logic controller (PLC) and a human machine interface (HMI). The HMI is connected to an online platform via an internet connection provided by a router, so that the monitoring results can be accessed from an Android smartphone or laptop/PC. This IoT-based running time monitoring system (IRTMS) is particularly helpful at a production site comprising multiple, varied machines. The PM items of a machine may vary from cleaning, to changing a single component, to an overhaul, each with a different time interval. Using the IRTMS, the user has an overview of the PM schedules anytime and anywhere, and the preparation of materials, components, or tools can be planned ahead of time. For simulation purposes, a prototype was constructed using components as used in real-life industrial conditions. Four output connections are provided to simulate the simultaneous monitoring of four machines. The IRTMS prototype was tested and was completely successful at running time monitoring, running time reset, PM notifications, and remote access for monitoring and control.
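The bookkeeping such a system performs (accumulate running hours per machine, flag each PM item whose running-time interval has elapsed, reset the counter after service) can be sketched as follows; the PM items and intervals are made-up examples, not those of the paper's site:

```python
class Machine:
    def __init__(self, name, pm_intervals):
        self.name = name
        self.hours = 0.0
        # PM item -> [interval in running hours, hours at last service]
        self.pm = {item: [interval, 0.0] for item, interval in pm_intervals.items()}

    def accumulate(self, hours):
        """Add monitored running time (e.g. from a PLC run signal)."""
        self.hours += hours

    def due_items(self):
        """PM items whose running-time interval has elapsed since last service."""
        return [item for item, (interval, last) in self.pm.items()
                if self.hours - last >= interval]

    def service(self, item):
        """Reset the counter for one PM item after maintenance is done."""
        self.pm[item][1] = self.hours

# Illustrative schedule: cleaning every 100 h, overhaul every 1000 h
m = Machine("press-1", {"cleaning": 100, "overhaul": 1000})
m.accumulate(150)
print(m.due_items())   # ['cleaning']
m.service("cleaning")
print(m.due_items())   # []
```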
30

Kitching, John, and Julia Rouse. « Contesting effectuation theory: Why it does not explain new venture creation ». International Small Business Journal: Researching Entrepreneurship 38, no. 6 (17 February 2020): 515–35. http://dx.doi.org/10.1177/0266242620904638.

Abstract:
We evaluate whether the theory of effectuation provides – or could provide – a powerful causal explanation of the process of new venture creation. We do this by conducting an analysis of the principal concepts introduced by effectuation theory. Effectuation theory has become a highly influential cognitive science-based approach to understanding how nascent entrepreneurs start businesses under conditions of uncertainty. But by reducing the process of venture creation to a decision-making logic, effectuation theory pays insufficient regard to the substantial, pervasive and enduring influence of social-structural and cultural contexts on venture creation. Powerful explanations should conceive of venture creation as a sociohistorical process emergent from the interaction of structural, cultural and agential causal powers and must be able to theorise, fallibly, how nascent entrepreneurs form particular firms in particular times and places. We conclude that effectuation’s contribution to entrepreneurship scholarship is more limited than its advocates claim because it can offer only an under-socialised, ahistorical account of venture creation. Failure to theorise adequately the influence of structural and cultural contexts on venture creation implicitly grants nascent entrepreneurs excessive powers of agency.
31

Aramon Bajestani, M., and J. C. Beck. « Scheduling a Dynamic Aircraft Repair Shop with Limited Repair Resources ». Journal of Artificial Intelligence Research 47 (21 May 2013): 35–70. http://dx.doi.org/10.1613/jair.3902.

Abstract:
We address a dynamic repair shop scheduling problem in the context of military aircraft fleet management, where the goal is to maintain a full complement of aircraft over the long term. A number of flights, each with a requirement for a specific number and type of aircraft, are already scheduled over a long horizon. We need to assign aircraft to flights and schedule repair activities while considering the flight requirements, repair capacity, and aircraft failures. The number of aircraft awaiting repair changes dynamically over time due to failures, and it is therefore necessary to rebuild the repair schedule online. To solve the problem, we view the dynamic repair shop as successive static repair scheduling sub-problems over shorter time periods. We propose a complete approach based on logic-based Benders decomposition to solve the static sub-problems, and design different rescheduling policies to schedule the dynamic repair shop. Computational experiments demonstrate that the Benders model is able to find and prove optimal solutions on average four times faster than a mixed integer programming model. The rescheduling approach that combines scheduling over a longer horizon with quickly adjusting the schedule increases the number of aircraft available in the long term by 10% compared to approaches with either aspect alone.
32

Stangeland, Marcus, Trond Engjom, Martin Mezl, Radovan Jirik, Odd Gilja, Georg Dimcevski and Kim Nylund. « Interobserver Variation of the Bolus-and-Burst Method for Pancreatic Perfusion with Dynamic Contrast-Enhanced Ultrasound ». Ultrasound International Open 03, no. 03 (June 2017): E99–E106. http://dx.doi.org/10.1055/s-0043-110475.

Abstract:
Abstract Purpose Dynamic contrast-enhanced ultrasound (DCE-US) can be used for calculating organ perfusion. By combining bolus injection with burst replenishment, the actual mean transit time (MTT) can be estimated. Blood volume (BV) can be obtained by scaling the data to a vessel in the imaging plane. The study aim was to test interobserver agreement for repeated recordings on the same ultrasound scanner and agreement between results on two different scanner systems. Materials and Methods Ten patients under evaluation for exocrine pancreatic failure were included. Each patient was scanned twice on a GE Logiq E9 scanner, by two different observers, and once on a Philips IU22 scanner, after a bolus of 1.5 ml Sonovue. A 60-second recording of contrast enhancement was performed before the burst, and the scan continued for another 30 s for reperfusion. We performed data analysis using MATLAB-based DCE-US software. An artery at the same depth as the region of interest (ROI) was used for scaling. The measurements were compared using the intraclass correlation coefficient (ICC) and Bland-Altman plots. Results The interobserver agreement on the Logiq E9 for MTT (ICC=0.83, confidence interval (CI) 0.46–0.96) was excellent. There was poor agreement for MTT between the Logiq E9 and the IU22 (ICC=−0.084, CI −0.68–0.58). The interobserver agreement for blood volume measurements was excellent on the Logiq E9 (ICC=0.9286, CI 0.7250–0.98) and between scanners (ICC=0.86, CI 0.50–0.97). Conclusion Interobserver agreement was excellent using the same scanner for both parameters, and between scanners for BV, but the comparison between two scanners did not yield acceptable agreement for MTT. This was probably due to incomplete bursting of bubbles in some of the recordings on the IU22.
33

Chisty, Nur Mohammad Ali, and Harshini Priya Adusumalli. « Applications of Artificial Intelligence in Quality Assurance and Assurance of Productivity ». ABC Journal of Advanced Research 11, no. 1 (28 January 2022): 23–32. http://dx.doi.org/10.18034/abcjar.v11i1.625.

Abstract:
Probabilistic thinking is vital in modern management and engineering. It is easier to persuade readers when a manager or engineer reports problems with objective statistical data: statistical data support the evaluation of the true status, and cause and effect can be induced. The rationale is established using deductive logic together with statistical data verification and induction. Quality practitioners should develop statistical thinking skills and fully grasp the three quality principles: the “essence of substance,” the “process of business,” and “psychology.” Traditional quality data include variables, attributes, faults, internal and external failure costs, etc., obtained through data collection, data processing, statistical analysis, root cause analysis, and so on. Quality practitioners used to rely on these so-called professional skills to get a job, but if they do not keep pace with the times, quality data collection, organization, analysis, and monitoring will become confusing or challenging. Increasingly, precision machine tools are embedded in various IoT systems, gathering machine operation data and supporting component diagnostics and life estimation, consumables monitoring, utilization monitoring, and various other data analyses. Data mining and forecasting have steadily been combined into data science, a future direction of the quality field that deserves attention.
34

Shin, Younggy, Seok-Weon Choi, Guee-Won Moon, Hee-Jun Seo, Sang-Hoon Lee and Hyokjin Cho. « Application of a Real-time Process Simulator to PLC Programming for a Satellite Thermal Vacuum Chamber ». Journal of the IEST 48, no. 1 (1 September 2005): 127–37. http://dx.doi.org/10.17764/jiet.48.1.762302m1628x2572.

Abstract:
A thermal vacuum chamber is used to simulate thermal environments of a test satellite in orbits where daily temperature variations range from 80 K to above 400 K. The test facility is complex and consists of expensive parts. Modification of control software is discouraged as the modification may cause unexpected system failure. This paper describes a study that develops a real-time dynamics model of the thermal vacuum chamber that can be used to create control algorithms and simulate electrical inputs and outputs for interface with a programmable logic controller (PLC). The dynamics model is represented by simulation software and exported to a target PC in the Microsoft® Disk Operating System (DOS) mode to exploit the real-time kernel of the DOS software. The model is executed in real-time and communicates with a microprocessor-based input/output (I/O) board via a serial port to emulate electrical inputs and outputs. The target process to model is the gaseous nitrogen (GN2) mode in which GN2 circulates in a closed loop through thermal shrouds encompassing a test object. A blower boosts the GN2. Injected liquid nitrogen (LN2) and an electric heater control the set temperature of the GN2. The realized simulator dynamics are quite similar to those of the thermal vacuum chamber and serve as an appropriate system to verify the control performance of a programmed PLC.
35

Fomin, I. V., and N. G. Vinogradova. « Rationale of specialized medical care for patients with chronic heart failure in the Russian Federation ». South Russian Journal of Therapeutic Practice 1, no. 3 (20 December 2020): 44–53. http://dx.doi.org/10.21886/2712-8156-2020-1-3-44-53.

Abstract:
Objectives: to determine the causes of ineffective observation and poor prognosis in patients after ADHF in real clinical practice, and to consider the basis for specialized medical care for patients with heart failure (HF). Materials and methods: the study was conducted at the City Center for the Treatment of Heart Failure (center HF), N. Novgorod. The study consecutively included 942 patients with HF aged 18 years and older who had undergone ADHF and received inpatient treatment in center HF between March 4, 2016 and March 3, 2017. Based on the patients' decisions whether to continue outpatient monitoring in center HF, two groups were distinguished: patients who continued to be monitored in center HF (group I, n = 510) and patients who continued to be monitored in outpatient clinics at their place of residence (group II, n = 432). Adherence to treatment, overall mortality, survival, and readmission were assessed over two years of observation. Statistical data processing was performed using Statistica 7.0 for Windows and the software package R. Results: all patients in the study groups had high comorbidity. Group II patients were statistically significantly older, more often had functional class (FC) III HF, a lower baseline 6-minute walk test score, and a higher baseline clinical assessment scale score. After 2 years of follow-up, group II showed a significant deterioration in adherence to basic HF therapy compared with group I. Multifactorial proportional-risk Cox models showed that observation in outpatient clinics at the place of residence (group II) was an independent factor increasing the risk of overall mortality 2.8-fold by the end of the second year of observation. Survival after two years of follow-up was 89.8% in group I and 70.1% in group II (OR = 0.3, 95% CI 0.2–0.4; p1/2 < 0.001).
After two years of follow-up, the proportion of re-hospitalized patients was greater in group II (78.0% of patients) than in group I (50.6% of patients; OR = 3.5, 95% CI 2.6–4.6; p1/2 < 0.001). According to multinomial logit regression, the independent risk of re-hospitalization was 3.4 times higher in group II and 2.4 times higher for FC III–IV HF. Conclusions: the inclusion of patients with HF in the system of specialized medical care improves adherence to treatment and prognosis of life, and reduces the risk of repeated hospitalizations. Older patients and those with an initially greater clinical severity refused specialized supervision in center HF.
36

Harding, Carol. « COGNITION AND COMMUNICATION: JUDGMENTAL BIASES, RESEARCH METHODS, AND THE LOGIC OF CONVERSATION. Norbert Schwarz. Mahwah, NJ: Erlbaum, 1996. Pp. vii + 112. $22.50 paper. » Studies in Second Language Acquisition 20, no. 3 (September 1998): 452–53. http://dx.doi.org/10.1017/s0272263198353073.

Abstract:
Professor Schwarz is the most recent contributor to the John M. MacEachran Memorial Lecture Series. In this timely essay, Schwarz takes a position critical of traditional psychological research asserting that: “Our [psychologists'] focus on individual thought processes has fostered a neglect of the social context in which individuals do their thinking and this neglect has contributed to the less than flattering portrait that psychology has painted of human judgment” (p. 1). He posits that “fallacies of human judgment” reported in studies of cognition and communication are actually fallacies of the research—specifically, the researchers' failure to take into account the human mind's capacity to make sense of things, particularly through communication embedded in social context. His point is an important one. When involved in conversation (even in the research laboratory), humans may suspend their abstract knowledge of the logic of language and attend to irrelevant and misleading information—especially if they assume that the speaker's intentions are to convey information and to make sense. Schwarz reports that “ordinary kinds of talk” build on Gricean conversational implicatures, inferences that “go beyond the semantic meaning of what is being said by determining the pragmatic meaning of the utterance” (p. 11). Researchers underestimate the power of these inferences and, by presenting decontextualized, at times absurd, information, they fail to accurately measure their subjects' “human judgment,” but instead observe their subjects' diligent, and often expert, attempts to make sense of the message.
37

Wen, Yanqing, Baoliang Liu, Haiyan Shi, Shugui Kang and Yuejiao Feng. « Reliability Evaluation and Optimization of a System with Mixed Run Shock ». Axioms 11, no. 8 (27 July 2022): 366. http://dx.doi.org/10.3390/axioms11080366.

Abstract:
In this paper, we investigate a wear and mixed shock model in which the system can fail due to internal aging or external shocks. The lifetime of the system due to internal wear follows a continuous phase-type (PH) distribution. External random shocks arrive at the system according to a PH renewal process. The system fails when an internal failure occurs, or when k1 consecutive external shocks of size at least d1, or k2 consecutive external shocks of size at least d2, occur, where d1<d2 and k1>k2. The failed system is repaired immediately, and the repair times of the system are governed by continuous PH distributions. The system can be replaced by a new and identical one based on a bivariate replacement policy (L,N). The long-run average profit rate for the system is obtained by employing the closure property of the PH distribution. Finally, a numerical example is given to determine the optimal replacement policy.
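The external-shock failure criterion (k1 consecutive shocks of size ≥ d1, or k2 consecutive shocks of size ≥ d2, with d1 < d2 and k1 > k2) can be checked deterministically for any shock sequence. A small sketch with illustrative thresholds:

```python
def shock_failure_index(shocks, k1, d1, k2, d2):
    """Return the 1-based index of the shock at which the run criterion
    first fails the system, or None if it survives. Assumes d1 < d2, k1 > k2."""
    run1 = run2 = 0  # current runs of shocks >= d1 and >= d2
    for i, s in enumerate(shocks, start=1):
        run1 = run1 + 1 if s >= d1 else 0
        run2 = run2 + 1 if s >= d2 else 0  # a shock >= d2 extends both runs
        if run1 >= k1 or run2 >= k2:
            return i
    return None

# Illustrative thresholds: k1=3 shocks of size >= 5, or k2=2 shocks of size >= 8
print(shock_failure_index([6, 6, 6], 3, 5, 2, 8))      # 3
print(shock_failure_index([9, 9], 3, 5, 2, 8))         # 2
print(shock_failure_index([6, 9, 4, 6], 3, 5, 2, 8))   # None (runs are broken)
```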
38

Xue, Xiaozhen, Sima Siami-Namini and Akbar Siami Namin. « Testing Multi-Threaded Applications Using Answer Set Programming ». International Journal of Software Engineering and Knowledge Engineering 28, no. 08 (August 2018): 1151–75. http://dx.doi.org/10.1142/s021819401850033x.

Abstract:
We introduce a technique to formally represent and specify race conditions in multithreaded applications. Answer set programming (ASP) is a logic-based knowledge representation paradigm for formally expressing belief acquired through reasoning in an application domain. The transparent and expressive representation of problems, along with powerful non-monotonic reasoning, enables ASP to abstractly represent and solve certain classes of NP-hard problems in polynomial time. We use ASP to formally express race conditions and thus represent potential data races that often occur in multithreaded applications with shared memory models. We then use ASP to generate all possible test inputs and thread interleavings, i.e. schedules, whose executions would deterministically expose thread interleaving failures. We evaluated the proposed technique on several moderate-sized Java programs, and our experimental results confirm that it can practically expose common data races in multithreaded programs with low false positive rates. We conjecture that, in addition to generating thread schedules whose execution order leads to the exposure of data races, ASP has several other applications in constraint-based software testing research and can be used to express and solve similar test case generation problems where constraints play a key role in determining the complexity of searches.
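The idea of enumerating thread schedules to force a race can be illustrated without ASP: exhaustively interleave two unsynchronized read-modify-write threads and look for schedules whose final value exposes the lost update. A minimal Python sketch (the paper itself encodes this kind of search as an ASP program, not as explicit enumeration):

```python
from itertools import combinations

def interleavings():
    """All schedules of two 2-step threads (read, then write),
    preserving each thread's program order."""
    t0 = [("read", 0), ("write", 0)]
    t1 = [("read", 1), ("write", 1)]
    for slots in combinations(range(4), 2):  # positions taken by thread 0
        it0, it1 = iter(t0), iter(t1)
        yield [next(it0) if i in slots else next(it1) for i in range(4)]

def run(schedule):
    """Simulate an unsynchronized counter += 1 in each thread."""
    counter, local = 0, [0, 0]
    for op, tid in schedule:
        if op == "read":
            local[tid] = counter      # read shared counter into a register
        else:
            counter = local[tid] + 1  # write back register + 1 (no lock)
    return counter

finals = [run(s) for s in interleavings()]
# Schedules where both reads happen before either write lose an update
print(sorted(set(finals)))  # [1, 2]
```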
39

Gopalakrishnan, Dharmesh, Heesun J. Rogers, Paul Elson and Keith R. McCrae. « Diastolic but Not Systolic Heart Failure Is Associated with Multiple Abnormalities on Platelet Aggregation Testing ». Blood 126, no. 23 (3 December 2015): 1079. http://dx.doi.org/10.1182/blood.v126.23.1079.1079.

Abstract:
Abstract Introduction: The effects of co-morbid medical conditions on platelet function are poorly understood. In this retrospective EMR-based study, we analyzed the influence of various diseases on in vitro measures of platelet function - platelet function analyzer-100 (PFA-100) closure times, platelet aggregation (using light transmission aggregometry (LTA)), platelet dense granule release (using lumi-aggregometry), and platelet flow-cytometry for surface glycoproteins. We also examined their influence on VWF testing. Methods: Four hundred ninety-seven patients who had platelet aggregation testing performed using LTA between August 2008 and August 2013 were included in our study. Co-morbidities at the time of testing were recorded. Propensity score matching for each individual disease was used to adjust for relevant covariates. We used a 1:1 nearest neighbor match without replacement, with caliper width set to 0.2 times the standard deviation of the logit of the propensity score. Following matching, Fisher's exact test or chi-square test was used as appropriate to assess the association between categorical variables, while the Mann-Whitney test was used to test the association between categorical and continuous measures. The Pearson coefficient was used to assess the correlation between continuous variables. P < 0.05 was considered significant. Results: 1) Congestive heart failure (n = 44) was associated with impaired platelet aggregation in the presence of arachidonic acid (p = 0.001) and collagen (p = 0.009), as well as impaired dense granule release in the presence of collagen (p = 0.002) and epinephrine (p = 0.012). It was also associated with abnormal aggregation (p = 0.024) and release (p = 0.028) in the presence of ≥ 2 agonists in the respective panels.
Diastolic heart failure (n = 25) was found to be associated with impaired aggregation in the presence of ADP (p = 0.007), collagen (p = 0.001), or arachidonic acid (p = 0.007), and to ≥ 2 agonists in the aggregation panel (p = 0.008). Systolic heart failure (n = 26) was not associated with abnormalities in aggregation or release. 2) Severe aortic stenosis (n = 17) was associated with prolonged collagen/ADP (p = 0.003) and collagen/epinephrine (p <0.001) closure times with PFA-100, but not with any abnormalities in the platelet aggregation/release panels. Severe aortic stenosis was associated with a decreased ristocetin cofactor/VWF antigen ratio (0.66±0.17 vs. 0.90±0.37; p = 0.030), but not with any other abnormalities in VWF testing. 3) Diabetes mellitus (n = 65) was associated with impaired platelet aggregation in the presence of collagen (p = 0.034) and impaired platelet release in the presence of epinephrine (p = 0.027). However, glycated hemoglobin level (HbA1C) was not found to correlate with impairments in either aggregation or release in the presence of any agonist. Hypothyroidism (n = 71) or vitamin D deficiency (n = 39) were not found to be associated with abnormalities in any of the platelet function assays. Finally, biochemical parameters reflecting hepatic or renal function did not correlate with any abnormalities in platelet function assays. However, the total number of co-morbidities in any patient correlated with the number of abnormalities in the platelet aggregation as well as release panels. Conclusion: Diastolic heart failure was associated with impaired platelet aggregation in the presence of multiple agonists. Though the mechanism remains unclear, we postulate that this could be related to shear stress to which the platelets are subjected in the non-compliant ventricles. 
Severe aortic stenosis was associated with prolonged collagen/ADP as well as collagen/epinephrine PFA-100 closure times and with lower ristocetin co-factor/VW antigen ratio suggesting functional impairment of VWF. Though diabetes mellitus was associated with impaired platelet aggregation in the presence of collagen and impaired dense granule release in the presence of epinephrine, no correlation was found between these abnormalities and HbA1C levels, making the significance of the association unclear. Disclosures McCrae: Syntimmune: Consultancy; Momenta: Consultancy; Janssen: Membership on an entity's Board of Directors or advisory committees; Halozyme: Membership on an entity's Board of Directors or advisory committees.
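The 1:1 nearest-neighbor caliper matching described in the Methods can be sketched as follows. This is an illustrative reading only: it assumes propensity scores have already been estimated, and the function and variable names are hypothetical, not from the study.

```python
import math

def caliper_match(treated, controls, caliper_factor=0.2):
    """1:1 nearest-neighbour propensity-score matching without replacement.

    `treated` and `controls` are lists of propensity scores in (0, 1).
    The caliper is caliper_factor times the standard deviation of the
    logit of all propensity scores, as described in the Methods above.
    Returns a list of (treated_index, control_index) pairs.
    """
    logit = lambda p: math.log(p / (1 - p))
    logits = [logit(p) for p in treated + controls]
    mean = sum(logits) / len(logits)
    sd = (sum((x - mean) ** 2 for x in logits) / len(logits)) ** 0.5
    caliper = caliper_factor * sd

    available = list(range(len(controls)))
    pairs = []
    for i, p in enumerate(treated):
        # nearest remaining control on the logit scale, if within the caliper
        best = min(available,
                   key=lambda j: abs(logit(p) - logit(controls[j])),
                   default=None)
        if best is not None and abs(logit(p) - logit(controls[best])) <= caliper:
            pairs.append((i, best))
            available.remove(best)  # match without replacement
    return pairs
```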
Citation styles: APA, Harvard, Vancouver, ISO, etc.
40

Muller, G. H., B. W. Steyn-Bruwer and W. D. Hamman. "Predicting financial distress of companies listed on the JSE: A comparison of techniques." South African Journal of Business Management 40, no. 1 (March 31, 2009): 21–32. http://dx.doi.org/10.4102/sajbm.v40i1.532.

Full text
Abstract:
In 2006, Steyn-Bruwer and Hamman highlighted several deficiencies in previous research on predicting corporate failure (or financial distress). In their research, Steyn-Bruwer and Hamman used the population of companies for the period under review, not only a sample of bankrupt versus successful companies. Here the sample of bankrupt versus successful companies represents two extremes on the continuum of financial condition, while the population represents the entire continuum. The main objective of this research, which was based on the above-mentioned authors' work, was to test whether some modelling techniques would in fact provide better prediction accuracies than others. The modelling techniques considered were: multiple discriminant analysis (MDA), recursive partitioning (RP), logit analysis (LA) and neural networks (NN). From the literature survey it was evident that existing literature did not readily consider the number of Type I and Type II errors made. As such, this study introduces a novel concept (not seen in other research) called the "Normalised Cost of Failure" (NCF), which takes cognisance of the fact that a Type I error typically costs 20 to 38 times as much as a Type II error. In terms of the main research objective, the results show that different analysis techniques definitely produce different predictive accuracies. Here, the MDA and RP techniques correctly predict the most "failed" companies, and consequently have the lowest NCF, while the LA and NN techniques provide the best overall predictive accuracy.
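The "Normalised Cost of Failure" is only described verbally in the abstract. A minimal sketch of one plausible formulation, with the Type I/Type II cost ratio as a parameter (20 to 38 in the study), might be the following; the exact definition in the paper may differ, and the function name is hypothetical.

```python
def normalised_cost_of_failure(type1_errors, type2_errors, cost_ratio=20, n=1):
    """Weighted misclassification cost: a Type I error (classifying a
    company that fails as healthy) is `cost_ratio` times as costly as a
    Type II error, normalised by the number of companies `n`.

    Illustrative reading of the NCF concept, not the paper's exact formula.
    """
    return (cost_ratio * type1_errors + type2_errors) / n
```

With this reading, a model making fewer Type I errors wins even at the expense of many more Type II errors, which is why the MDA and RP techniques could have the lowest NCF without the best overall accuracy.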
Citation styles: APA, Harvard, Vancouver, ISO, etc.
41

Dogaru, Radu, and Leon O. Chua. "Mutations of the 'Game of Life': A generalized cellular automata perspective of complex adaptive systems." International Journal of Bifurcation and Chaos 10, no. 08 (August 2000): 1821–66. http://dx.doi.org/10.1142/s0218127400001201.

Full text
Abstract:
This paper presents a novel approach for studying the relationship between the properties of isolated cells and the emergent behavior that occurs in cellular systems formed by coupling such cells. The novelty of our approach consists of a method for precisely partitioning the cell parameter space into subdomains via the failure boundaries of the piecewise-linear CNN (cellular neural network) cells [Dogaru & Chua, 1999a] of a generalized cellular automaton [Chua, 1998]. Instead of exploring the rule space via statistically defined parameters (such as λ in [Langton, 1990]), or by conducting an exhaustive search over the entire set of all possible local Boolean functions, our approach explores a deterministically structured parameter space built around parameter points corresponding to "interesting" local Boolean logic functions. The well-known "Game of Life" [Berlekamp et al., 1982] cellular automaton is reconsidered here to exemplify our approach and its advantages. Starting from a piecewise-linear representation of the classic Conway logic function called the "Game of Life", and by introducing two new cell parameters that are allowed to vary continuously over a specified domain, we are able to draw a "map-like" picture consisting of planar regions which cover the cell parameter space. A total of 148 subdomains and their failure boundaries are precisely identified and represented by colored paving stones in this mosaic picture (see Fig. 1), where each stone corresponds to a specific local Boolean function in cellular automata parlance. Except for the central "paving stone" representing the "Game of Life" Boolean function, all others are mutations uncovered by exploring the entire set of 148 subdomains and determining their dynamic behaviors. Some of these mutations lead to interesting, "artificial life"-like behavior where colonies of identical miniaturized patterns emerge and evolve from random initial conditions.
To classify these emergent behaviors, we have introduced a nonhomogeneity measure, called cellular disorder measure, which was inspired by the local activity theory from [Chua, 1998]. Based on its temporal evolution, we are able to partition the cell parameter space into a class U "unstable-like" region, a class E "edge of chaos"-like region, and a class P "passive-like" region. The similarity with the "unstable", "edge of chaos" and "passive" domains defined precisely and applied to various reaction–diffusion CNN systems [Dogaru & Chua, 1998b, 1998c] opens interesting perspectives for extending the theory of local activity [Chua, 1998] to discrete-time cellular systems with nonlinear couplings. To demonstrate the potential of emergent computation in generalized cellular automata with cells designed from mutations of the "Game of Life", we present a nontrivial application of pattern detection and reconstruction from very noisy environments. In particular, our example demonstrates that patterns can be identified and reconstructed with very good accuracy even from images where the noise level is ten times stronger than the uncorrupted image.
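The paper works with a piecewise-linear CNN representation of the Conway rule. Independently of that representation, the underlying crisp local Boolean function (a live cell survives with 2 or 3 live neighbours; a dead cell is born with exactly 3) can be sketched as follows; this is the well-known rule only, not the paper's parameterized cell.

```python
from collections import Counter

def life_step(alive):
    """One synchronous update of Conway's Game of Life.

    `alive` is a set of (row, col) coordinates of live cells on an
    unbounded grid.
    """
    # Count, for every cell adjacent to a live cell, how many live
    # neighbours it has.
    neighbours = Counter(
        (r + dr, c + dc)
        for (r, c) in alive
        for dr in (-1, 0, 1) for dc in (-1, 0, 1)
        if (dr, dc) != (0, 0)
    )
    # Birth on exactly 3 neighbours; survival on 2 or 3.
    return {cell for cell, n in neighbours.items()
            if n == 3 or (n == 2 and cell in alive)}
```

The "mutations" studied in the paper amount to continuously deforming this Boolean function inside the cell parameter space until a failure boundary is crossed and a different local rule appears.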
Citation styles: APA, Harvard, Vancouver, ISO, etc.
42

Khanh, Tran Trong, VanDung Nguyen and Eui-Nam Huh. "Fuzzy-Based Mobile Edge Orchestrators in Heterogeneous IoT Environments: An Online Workload Balancing Approach." Wireless Communications and Mobile Computing 2021 (August 10, 2021): 1–19. http://dx.doi.org/10.1155/2021/5539186.

Full text
Abstract:
Online workload balancing guarantees that incoming workloads are dispatched to appropriate servers in real time without any knowledge of future resource requests. By matching the characteristics of incoming Internet of Things (IoT) applications to the current state of computing and networking resources, a mobile edge orchestrator (MEO) provides high-quality service while the incoming workload changes temporally and spatially. Moreover, a fuzzy-based MEO handles the multicriteria decision-making process by considering multiple parameters within the same framework in order to make an offloading decision for each incoming task of an IoT application. In a fuzzy-based MEO, however, the fuzzy-based offloading strategy leads to unbalanced loads among edge servers. The fuzzy-based MEO therefore needs to scale its capacity when a large number of devices is involved, in order to avoid task failures and reduce service times. In this paper, we investigate and propose an online workload balancing algorithm, which we call the fuzzy-based (FuB) algorithm, for a fuzzy-based MEO. By considering user configuration requirements, server geographic locations, and available resource capacities, our proposal assigns each incoming task to the most suitable proximate server in real time at the MEO. Simulations were conducted with augmented reality, healthcare, compute-intensive, and infotainment applications. Compared to two benchmark schemes that use the fuzzy logic approach for an MEO in IoT environments, the simulation results (using EdgeCloudSim) show that our proposal outperforms the existing algorithms in terms of service time, the number of failed tasks, and processing times when the system is overloaded.
Citation styles: APA, Harvard, Vancouver, ISO, etc.
43

Ma, Hoi-Lam, and Wai-Hung Collin Wong. "A fuzzy-based House of Risk assessment method for manufacturers in global supply chains." Industrial Management & Data Systems 118, no. 7 (August 13, 2018): 1463–76. http://dx.doi.org/10.1108/imds-10-2017-0467.

Full text
Abstract:
Purpose: Risk management is crucial for all organizations, especially those in the global supply chain network. Failure may result in huge economic losses and damage to company reputation. Risk assessment usually involves quantitative and qualitative decisions. The purpose of this paper is to apply fuzzy logic to capture and infer qualitative decisions made in the House of Risk (HOR) assessment method.
Design/methodology/approach: In the existing HOR model, aggregate risk potential (ARP) is calculated from the risk event value, the risk agent value and its occurrence. However, these values are usually obtained from interviews, which may involve subjective decisions. To overcome this shortcoming, a fuzzy-based approach is proposed to calculate ARP instead of the current deterministic approach.
Findings: Risk analyses are conducted in five major categories of risk sources: internal, global environment, supplier, customer and third-party logistics provider. Moreover, each category is further divided into different sub-categories. The results indicate that the fuzzy-based HOR successfully inferences the inputs of the risk event, risk agents and its occurrence, and can prioritize the risk agents in order to take proactive decisions.
Practical implications: The proposed fuzzy-based HOR model can be used practically by manufacturers in the global supply chain. It provides a framework for decision makers to systematically analyze the potential risks in different categories.
Originality/value: The proposed fuzzy-based HOR approach improves the traditional approach by more precise modeling of the qualitative decision-making process. It contributes to a more accurate reflection of the real situation that manufacturers are facing.
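In the standard HOR method, the aggregate risk potential of risk agent j is usually written ARP_j = O_j × Σ_i S_i R_ij, where O_j is the occurrence of the agent, S_i the severity of risk event i, and R_ij their correlation. A minimal sketch of a fuzzy variant, using triangular fuzzy numbers and centroid defuzzification, follows; the paper's actual membership functions and inference rules may differ, and all names here are illustrative.

```python
def crisp_arp(occurrence, severities, correlations):
    """Standard House of Risk aggregate risk potential:
    ARP_j = O_j * sum_i(S_i * R_ij)."""
    return occurrence * sum(s * r for s, r in zip(severities, correlations))

def defuzzify(tfn):
    """Centroid of a triangular fuzzy number (a, b, c)."""
    a, b, c = tfn
    return (a + b + c) / 3

def fuzzy_arp(occ_tfn, sev_tfns, correlations):
    """Fuzzy ARP sketch: occurrence and severities given as triangular
    fuzzy numbers, combined componentwise and then defuzzified, so that
    interview-based ratings need not be single crisp values."""
    combined = tuple(
        occ_tfn[k] * sum(s[k] * r for s, r in zip(sev_tfns, correlations))
        for k in range(3)
    )
    return defuzzify(combined)
```

Ranking agents by `fuzzy_arp` instead of `crisp_arp` lets the uncertainty in the interview ratings flow through to the prioritization.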
Citation styles: APA, Harvard, Vancouver, ISO, etc.
44

Pabst, Adrian. "Postliberal Politics." Almanac "Essays on Conservatism" 35.5 (October 16, 2021): 200–222. http://dx.doi.org/10.24030/24092517-2021-0-3-200-222.

Full text
Abstract:
The present article consists of key extracts from Adrian Pabst's recently published book "Postliberal Politics: The Coming Era of Renewal" (2021). According to the author, stability in the West faces the challenges of left and right populism. While left populism has not survived the test of real elections, right populism has been quite successful in removing liberal elites from power. At the same time, the strong point of right populism is that it offers a political program, while its weakness is the absence of any concepts or political instruments for implementing transitions. Both forces, the ultraliberal left and the anti-liberal right, develop various types of identity politics, thus undermining the cultural and civilizational foundations of the West and the sense of common purpose and common destiny. The author opposes these extremes with postliberalism, a non-uniform ideological movement directed at overcoming the contradictions of a deadlocked liberal ideology characterized by the rise of both left and right populism. According to Adrian Pabst, postliberalism acknowledges the failure of liberal projects and, at the same time, the necessity of preserving the most valuable liberal aspects in a new form. Liberalism, with its multiple trends, is not beyond hope, and some of the institutions it created are worth preserving. Still, liberal ideology led to a situation in which freedom, once alienated from self-restraint and mutual obligations, turned into unfreedom. The self-destruction of liberal values such as freedom, equality, tolerance and pluralism demonstrates abnormalities that at once distort liberal principles and reveal the logic of liberal ideology. Postliberalism is intended to cut short these defects.
In particular, postliberal ideology proceeds from acknowledging that society is based not on some impersonal social contract between individuals, as claimed by liberals from the times of Hobbes and Locke, but arose as the result of mutual arrangement between generations. Civil liberty does not mean freedom from obligations or freedom for the sake of egoistical interests, but liberty to take care of oneself and others. Personality development based on personal independence should be balanced by common well-being. Equality does not mean uniformity but respect for integral virtue. Individual rights should not be downgraded, but should be specific and relative due to their connection with obligations towards other people. Postliberalism in this interpretation endeavors to preserve the best gains of liberal ideology while eliminating the threat of blunt authoritarianism that is always concealed in liberal logic.
Citation styles: APA, Harvard, Vancouver, ISO, etc.
45

Alavipour, S. M. Reza, and David Arditi. "Maximizing expected contractor profit using an integrated model." Engineering, Construction and Architectural Management 26, no. 1 (February 18, 2019): 118–38. http://dx.doi.org/10.1108/ecam-04-2018-0149.

Full text
Abstract:
Purpose: Planning for increased contractor profits should start at the time the contract is signed, because low profits and lack of profitability are the primary causes of contractor failure. The purpose of this paper is to propose an integrated profit maximization model (IPMM) that aims for maximum expected profit by using time-cost tradeoff analysis, adjusted start times of activities, minimized financing cost and minimized extension of the work schedule beyond the contract duration. This kind of integrated approach has not been researched before.
Design/methodology/approach: IPMM is programmed into an automated system using MATLAB 2016a. It generates an optimal work schedule that leads to maximum profit by means of time-cost tradeoff analysis, considering different activity acceleration/deceleration methods and adjusting the start/finish times of activities. While doing so, IPMM minimizes the contractor's financing cost by considering combinations of different financing alternatives such as short-term loans, long-term loans and lines of credit. IPMM also considers the impact of extending the project duration on project profit.
Findings: IPMM is tested for different project durations, the optimality of the solutions, differing activity start/finish times and project financing alternatives. In all cases, contractors can achieve maximum profit by using IPMM.
Research limitations/implications: IPMM considers a deterministic project schedule, whereas stochastic time-cost tradeoff analysis could improve its performance. Resource allocation and resource leveling are not considered in IPMM, but can be incorporated into the model in future research. Finally, the long computational time is a challenge that needs to be overcome in future research.
Practical implications: IPMM is likely to increase profits and improve contractors' chances of surviving and growing compared to their competitors. The practical value of IPMM is that any contractor can and should use it, since all the data required to run IPMM are available to the contractor at the time the contract is signed. The contractor who provides information about network logic, schedule data, cost data, contractual terms, and available financing alternatives and their APRs can use an automated IPMM that adjusts activity start times and durations, minimizes financing cost, eliminates or minimizes time extensions, minimizes total cost and maximizes expected profit.
Originality/value: Unlike prior studies that examine contractors' profits by considering the impact of only one or two factors at a time, this study presents an IPMM that considers all major factors that affect profit: time-cost tradeoff analysis, adjusted start times of activities, minimized financing cost and minimized extension of the work schedule beyond the contract duration.
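IPMM integrates financing optimization with time-cost tradeoff analysis over a full activity network. The basic crashing logic alone, reduced to an illustrative serial (single-chain) project where total duration is the sum of activity durations, can be sketched as follows; the real model handles general network logic and financing alternatives, and all names here are hypothetical.

```python
def crash_schedule(activities, daily_overhead, deadline, penalty_per_day):
    """Greedy time-cost tradeoff for a serial project.

    Each activity is (normal_duration, crash_duration, crash_cost_per_day).
    One day of project duration costs `daily_overhead`, plus
    `penalty_per_day` for every day beyond `deadline`.  Keep crashing the
    cheapest crashable activity while a saved day is worth more than its
    crash cost.  Returns (durations, total_duration).
    """
    durs = [a[0] for a in activities]
    while True:
        total = sum(durs)
        saving = daily_overhead + (penalty_per_day if total > deadline else 0)
        candidates = [(cost, i)
                      for i, (_, crash_dur, cost) in enumerate(activities)
                      if durs[i] > crash_dur]
        if not candidates:
            break
        cost, i = min(candidates)
        if cost >= saving:
            break  # crashing another day no longer pays off
        durs[i] -= 1
    return durs, sum(durs)
```

In a real network the saved day must lie on the critical path, which is what makes the full problem considerably harder than this sketch.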
Citation styles: APA, Harvard, Vancouver, ISO, etc.
46

Contini, S., and V. Matuzas. "Coupling decomposition and truncation for the analysis of complex fault trees." Proceedings of the Institution of Mechanical Engineers, Part O: Journal of Risk and Reliability 226, no. 3 (June 15, 2011): 249–61. http://dx.doi.org/10.1177/1748006x11401495.

Full text
Abstract:
The analysis of large and complex fault trees is a very difficult task. The main limiting factor is insufficient working memory. Several methods are available in the literature to reduce the working-memory requirement, including modularization, the so-called 're-writing rules', and truncation, i.e. the use of logic and/or probabilistic cut-offs to determine only the most important system failure modes. The truncation method is very effective, as it allows significant reductions in the computational effort; however, it requires estimating the truncation error, a problem not yet solved satisfactorily. Recently, a new method was proposed based on the decomposition of a complex fault tree into a set of mutually exclusive simpler fault trees. The decomposition is applied repeatedly until the generated trees are simple enough to be analysed exactly with the available working memory. Theoretically, this approach would allow the exact analysis of fault trees of any complexity, but the associated computation times are generally too high. The scope of this paper is to show how the combined application of decomposition and truncation constitutes a valuable method for analysing complex fault trees. The upper and lower bounds of the top-event probability obtained by applying this method are very close to the exact value, and their difference depends on the size of the available working memory. Furthermore, the probabilistic quantification, including the importance measures of basic events, can easily be performed by properly combining the results from the independent analysis of all simpler fault trees. The developed methodology has been implemented in a software tool and successfully applied to the analysis of several complex fault trees, some of which are considered in this paper.
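The probabilistic cut-off and its truncation-error bound can be sketched on already-minimal cut sets. This is an illustrative fragment only: it assumes independent basic events and the rare-event approximation, and the names are hypothetical, not the paper's algorithm.

```python
from math import prod

def truncate_cut_sets(cut_sets, basic_prob, cutoff):
    """Probabilistic truncation of minimal cut sets.

    `cut_sets` is a list of frozensets of basic-event names and
    `basic_prob` maps each basic event to its failure probability.
    Cut sets whose probability (product of independent basic-event
    probabilities) falls below `cutoff` are discarded, but their total
    probability is accumulated, so the top-event probability can be
    bracketed as [kept, kept + discarded] under the rare-event
    approximation.  Returns (retained_cut_sets, lower, upper).
    """
    kept, discarded = 0.0, 0.0
    retained = []
    for cs in cut_sets:
        p = prod(basic_prob[e] for e in cs)
        if p >= cutoff:
            retained.append(cs)
            kept += p
        else:
            discarded += p
    return retained, kept, kept + discarded
```

The paper's point is that decomposition shrinks each subtree until such bounds become tight, because the discarded mass per subtree stays small.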
Citation styles: APA, Harvard, Vancouver, ISO, etc.
47

Balatsky, E. V., and N. A. Ekimova. "Identification of World-Class Universities: Destructive Pluralism." World of New Economy 16, no. 3 (October 12, 2022): 6–19. http://dx.doi.org/10.26794/2220-6469-2022-16-3-6-19.

Full text
Abstract:
The article deals with the problem of identifying world-class universities (WCU) on the basis of information provided by various ranking systems. The relevance of the problem is due to the fact that in 2022 Russia was "cut off" from the world community, including the interruption of cooperation with the leading international university rankings, so the country risks losing the opportunity to check its successes and failures against generally recognized criteria. In this regard, the purpose of this article is to verify the hypothesis that the "friendly" ARWU ranking base can serve as an effective substitute for the "unfriendly" QS ranking base. To test the formulated hypothesis, we used the previously developed algorithm for identifying WCU using statistical data from five global university rankings (GUR) — Quacquarelli Symonds (QS), Times Higher Education (THE), Academic Ranking of World Universities (ARWU), Center for World University Rankings (CWUR) and National Taiwan University Ranking (NTU) — and two university rankings by subject (SRU): QS and ARWU. The calculations disproved the general hypothesis and revealed a fundamental inconsistency among the results obtained from different rankings. In addition, using the example of the ARWU, a profound contradiction in the logic of compiling the GUR and the SRU was uncovered. This raises a broader question about the adequacy of the WCU concept itself. To answer this question, we conducted a "humanitarian test" of the validity of modern WCU, which revealed elementary illiteracy and a lack of culture among graduates of leading universities. The collected stylized examples allowed us to establish that today's world market-leading universities do not pass the "humanitarian test", and therefore the entire rating system cannot be considered a reliable basis for conclusions about the activities of universities.
The article also discusses replacing the term WCU with a less pretentious category: practice-oriented universities.
Citation styles: APA, Harvard, Vancouver, ISO, etc.
48

Kovalev, D. I., T. P. Mansurova and Ya. A. Tynchenko. "On the issue of choosing a real-time operating system for hardware and software support of industrial and environmental monitoring systems." Modern Innovations, Systems and Technologies 1, no. 2 (July 23, 2021): 46–63. http://dx.doi.org/10.47813/2782-2818-2021-1-2-46-63.

Full text
Abstract:
The article discusses the features of the operation of environmental monitoring systems. A number of factors that affect the organization of hardware and software support for such systems are identified. The main factors are: operation of the equipment in real time, various environmental influences, the specific composition of sensors and connections, and the complex technology for servicing the equipment in an aggressive technological environment. These factors affect the structure, software, hardware components and design of environmental monitoring systems. When creating any complex system, the distribution of work over time leads to the allocation of distinct design stages, and the representations of the system being designed, reflecting its essential properties with varying degrees of detail, determine the constituent parts of the design process. The specificity of these stages for monitoring systems of thermal power plants is shown, determined both by the general features of such systems and by the peculiarities of their application in the technological processes of thermal power plants. The main peculiarities are: the heterogeneity of the input units and devices, as well as of the technological objects of power plants; real-time operation; the programmable logic of smart sensors and the multifunctional purpose of system components; the possibility of failures leading to a change in the functioning algorithm using the multiversion methodology; and the presence of interrelated requirements for the accuracy and speed of information transfer. The paper describes the work content of each design stage of monitoring systems, taking into account the specifics of their development. This specificity is most significantly manifested in the development of the technical proposal and the draft design.
In a step-by-step design, after the end of each stage, the main results obtained are assessed by comparing them with those required by the terms of reference. The article presents an expert analysis that makes it possible to select a suitable real-time operating system for the hardware and software support of environmental monitoring technologies. The Phar Lap ETS, VxWorks and NI Linux Real-Time operating systems are considered.
Citation styles: APA, Harvard, Vancouver, ISO, etc.
49

Kovalev, D. I., T. P. Mansurova and Ya. A. Tynchenko. "On the issue of choosing a real-time operating system for hardware and software support of industrial and environmental monitoring systems." Modern Innovations, Systems and Technologies 1, no. 2 (July 23, 2021): 64–81. http://dx.doi.org/10.47813/2782-2818-2021-1-2-64-81.

Full text
Abstract:
The article discusses the features of the operation of environmental monitoring systems. A number of factors that affect the organization of hardware and software support for such systems are identified. The main factors are: operation of the equipment in real time, various environmental influences, the specific composition of sensors and connections, and the complex technology for servicing the equipment in an aggressive technological environment. These factors affect the structure, software, hardware components and design of environmental monitoring systems. When creating any complex system, the distribution of work over time leads to the allocation of distinct design stages, and the representations of the system being designed, reflecting its essential properties with varying degrees of detail, determine the constituent parts of the design process. The specificity of these stages for monitoring systems of thermal power plants is shown, determined both by the general features of such systems and by the peculiarities of their application in the technological processes of thermal power plants. The main peculiarities are: the heterogeneity of the input units and devices, as well as of the technological objects of power plants; real-time operation; the programmable logic of smart sensors and the multifunctional purpose of system components; the possibility of failures leading to a change in the functioning algorithm using the multiversion methodology; and the presence of interrelated requirements for the accuracy and speed of information transfer. The paper describes the work content of each design stage of monitoring systems, taking into account the specifics of their development. This specificity is most significantly manifested in the development of the technical proposal and the draft design.
In a step-by-step design, after the end of each stage, the main results obtained are assessed by comparing them with those required by the terms of reference. The article presents an expert analysis that makes it possible to select a suitable real-time operating system for the hardware and software support of environmental monitoring technologies. The Phar Lap ETS, VxWorks and NI Linux Real-Time operating systems are considered.
Citation styles: APA, Harvard, Vancouver, ISO, etc.
50

Weston, Lauren, Sarah Rybczynska-Bunt, Cath Quinn, Charlotte Lennox, Mike Maguire, Mark Pearson, Alex Stirzaker et al. "Interrogating intervention delivery and participants' emotional states to improve engagement and implementation: A realist informed multiple case study evaluation of Engager." PLOS ONE 17, no. 7 (July 14, 2022): e0270691. http://dx.doi.org/10.1371/journal.pone.0270691.

Full text
Abstract:
Background: 'Engager' is an innovative 'through-the-gate' complex care intervention for male prison-leavers with common mental health problems. In parallel to the randomised controlled trial of Engager (trial registration number: ISRCTN11707331), a set of process evaluation analyses was undertaken. This paper reports on the in-depth multiple case study analysis part of the process evaluation, exploring how a sub-sample of prison-leavers engaged and responded to the intervention offer of one-to-one support during their re-integration into the community.
Methods: To understand intervention delivery and the response it elicited in individuals, we used a realist-informed qualitative multiple 'case' studies approach. We scrutinised how intervention component delivery led to outcomes by examining underlying causal pathways, or 'mechanisms', that promoted or hindered progress towards personal outcomes. 'Cases' (n = 24) were prison-leavers from the intervention arm of the trial. We collected practitioner activity logs and conducted semi-structured interviews with prison-leavers and Engager/other service practitioners. We mapped data for each case against the intervention logic model and then used Bhaskar's (2016) 'DREIC' analytic process to categorise cases according to the extent of intervention delivery, outcomes evidenced, and contributing factors behind engagement or disengagement and progress achieved.
Results: There were variations in the dose and session focus of the intervention delivery, and in how different participants responded. Participants sustaining long-term engagement and sustained change reached a state of 'crises but coping'. We found evidence that several components of the intervention were key to achieving this: trusting relationships; therapeutic work delivered well and over time; and an in-depth shared understanding of needs, concerns, and goals between the practitioner and participants. Those who disengaged were in one of the following states: 'Crises and chaos', 'Resigned acceptance', 'Honeymoon' or 'Wilful withdrawal'.
Conclusions: We demonstrate that the 'implementability' of an intervention can be explained by examining the delivery of core intervention components in relation to the responses elicited in the participants. Core delivery mechanisms often had to be 'triggered' numerous times to produce sustained change. The improvements achieved, sustained, and valued by participants were not always reflected in the quantitative measures recorded in the RCT. The compatibility between the practitioner, participant and setting was continually at risk of being undermined by implementation failure as well as by changing external circumstances and participants' own weaknesses.
Trial registration number: ISRCTN11707331, Wales Research Ethics Committee, registered 02-04-2016 (retrospectively registered): https://doi.org/10.1186/ISRCTN11707331.
Citation styles: APA, Harvard, Vancouver, ISO, etc.