Journal articles on the topic 'Verification-based software inspection'

Below are the top 31 journal articles for research on the topic 'Verification-based software inspection.'


1

Simon, S., and J. P. Rogala. "Model-based prediction-verification scheme for real-time inspection." Pattern Recognition Letters 7, no. 5 (June 1988): 305–11. http://dx.doi.org/10.1016/0167-8655(88)90071-2.

2

Besson, Frédéric, Thomas de Grenier de Latour, and Thomas Jensen. "Interfaces for stack inspection." Journal of Functional Programming 15, no. 2 (March 2005): 179–217. http://dx.doi.org/10.1017/s0956796804005465.

Abstract:
Stack inspection is a mechanism for programming secure applications in the presence of code from various protection domains. Run-time checks of the call stack allow a method to obtain information about the code that (directly or indirectly) invoked it in order to make access control decisions. This mechanism is part of the security architecture of Java and the .NET Common Language Runtime. A central problem with stack inspection is to determine to what extent the local checks inserted into the code are sufficient to guarantee that a global security property is enforced. A further problem is how such verification can be carried out in an incremental fashion. Incremental analysis is important for avoiding re-analysis of library code every time it is used, and permits the library developer to reason about the code without knowing its context of deployment. We propose a technique for inferring interfaces for stack-inspecting libraries in the form of secure calling context for methods. By a secure calling context we mean a pre-condition on the call stack sufficient for guaranteeing that execution of the method will not violate a given global property. The technique is a constraint-based static program analysis implemented via fixed point iteration over an abstract domain of linear temporal logic properties.
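
The run-time check the abstract describes can be illustrated with a minimal sketch of the classic stack-inspection rule (this is an illustration of the mechanism the paper analyzes, not the authors' constraint-based analysis): a permission check succeeds only if every frame on the call stack belongs to a protection domain granted that permission. The domain names and permission strings below are made up for the example.

```python
def check_permission(call_stack, permission, grants):
    """call_stack: list of protection-domain names, caller-first.
    grants: dict mapping domain name -> set of granted permissions."""
    for domain in call_stack:
        if permission not in grants.get(domain, set()):
            return False  # an unprivileged caller is on the stack
    return True

grants = {"system": {"file.read", "file.write"}, "applet": {"file.read"}}
# A stack containing only trusted code may write files...
assert check_permission(["system"], "file.write", grants)
# ...but the same check fails once untrusted applet code is on the stack.
assert not check_permission(["system", "applet"], "file.write", grants)
```

A "secure calling context" in the paper's sense is a pre-condition on `call_stack` guaranteeing such checks cannot fail at run time (the real mechanism also supports privilege amplification, e.g. Java's `doPrivileged`, which this toy omits).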
3

Shi, Mengdi, Jun Ji, and Wei Wang. "Development of Railway Bridge Deflection Detection System Based on NI-cDAQ Inclinometer." Journal of Physics: Conference Series 2160, no. 1 (January 1, 2022): 012068. http://dx.doi.org/10.1088/1742-6596/2160/1/012068.

Abstract:
Aiming at the limitations that traditional deflection test methods face in field applications of railway bridge inspection, this paper develops a data acquisition and analysis system for bridge deflection inspection. The system is based on NI-cDAQ series acquisition equipment and the LabVIEW development environment. Through the industrial Modbus bus protocol, the inclinometer, data acquisition equipment and other acquisition hardware are integrated with the application software on a computer, establishing a complete set of practical instrumentation and a dynamic deflection detection process that achieves high-speed, high-precision, multi-channel acquisition of bridge deflection information. Field tests were conducted for verification; the results show that the system's real-time collection and transmission of the various parameters can run for a long time under complex field conditions, operating stably with reliable data.
4

Suharev, N. V. "Development of charging and discharge software and hardware complex applicable for checking batteries of space vehicles." Eurasian Union of Scientists 5, no. 5(74) (June 14, 2020): 67–71. http://dx.doi.org/10.31618/esu.2413-9335.2020.5.74.758.

Abstract:
Problem statement: The space industry currently needs to actively improve battery characteristics, and the use of new battery types in spacecraft power supply systems creates constant demand for improved control and verification equipment (KPA). As spacecraft storage batteries (AB) improved, the requirements for electrical inspections and for the control and verification equipment gradually changed. With the advent of lithium-ion batteries for spacecraft, it became necessary to develop and manufacture a charge-discharge hardware and software complex (ZRPAK). The ZRPAK is designed to work with spacecraft batteries during all ground operation phases: to verify that the electrical characteristics of the AB meet the specified requirements, and to conduct incoming inspection and autonomous tests of the AB at the spacecraft manufacturer. The advantages and disadvantages of previously developed and currently used control and verification equipment are analyzed, and the electrical characteristics of all generations of KPA are summarized in a table. Based on the analysis of battery development, trends in the development of control and verification equipment, and the fact that all newly developed spacecraft will use only lithium-ion batteries, requirements for a promising fifth-generation ZRPAK are formulated: increase the charge-discharge voltage to 150 V; increase the charge-discharge current to 150 A; add devices for pre-charge and pre-discharge of the battery to the KPA; increase the accuracy of measuring the voltage of each battery; provide remote operation from a control PC; support writing cyclograms; and provide logging and subsequent viewing of all test data.
5

Härkönen, Ari, Jari Miettinen, and Ilkka Moring. "Design of Image Acquisition for Surface Inspection." International Journal of Pattern Recognition and Artificial Intelligence 10, no. 01 (February 1996): 15–32. http://dx.doi.org/10.1142/s0218001496000037.

Abstract:
Since the design of an inspection system typically requires a lot of application-dependent work, the provision of systematic methods and tools to assist in the design process could significantly reduce the system development and installation time. With this in view, a step-by-step design procedure for image acquisition systems is suggested, consisting of measurements of certain important optical parameters for the surfaces to be inspected, modelling of the measurements and arrangement of the imaging in a form that a computer can understand, simulation of the imaging process in a computer using optical analysis tools, and verification of the results through a pilot system. The procedure is exemplified by describing its application to the design of a steel sheet inspection system and its capacity for optimising the detection of various defects is demonstrated. For comparison, measurements made on some other materials are shown and the implications discussed. The results of the simulation and the pilot system for steel are compared and the usefulness of the computer-based design method is evaluated.
6

Rimshin, Vladimir, and Pavel Truntov. "An integrated approach to the use of composite materials for the restoration of reinforced concrete structures." E3S Web of Conferences 135 (2019): 03068. http://dx.doi.org/10.1051/e3sconf/201913503068.

Abstract:
The article presents the results of a technical inspection of the structures of the object. Horizontal structures of a sludge pool that had been exposed to carbonization were taken as the objects under investigation. Defects and damage of the considered structures revealed during visual inspection are described. The degree of carbonization of the reinforced concrete structures was determined by the phenolphthalein sample method. Based on the results of the technical inspection, a verification calculation of the beam was carried out to determine its bearing capacity and to assess its suitability for further operation after restoration and strengthening. The calculation was performed using software, and from the results, the bearing capacity of the beam reinforced with composite materials was determined. An option for restoring and strengthening the beam using external reinforcement based on FibArm 230/150 carbon fibers is presented; the restoration was carried out taking the carbonized concrete layer into account. Based on the results of the study, an assessment is given of an integrated approach to the restoration and strengthening of structures with composite materials, taking the carbonized concrete layer into account.
7

Ivan, Virgala, and Filakovský Filip. "Concertina Locomotion of a Snake Robot in the Pipe." Technical Sciences and Technologies, no. 4 (14) (2018): 109–17. http://dx.doi.org/10.25140/2411-5363-2018-4(14)-109-117.

Abstract:
Urgency of the research. Robotics and mechatronics are becoming mainstream, and with development in these areas the computational demands also grow, so there is significant focus on numerical modeling and algorithmization in kinematic and dynamic modeling. Pipe inspection is a well-known engineering application, usually performed by wheel-based robots; other approaches use biologically inspired mechanisms such as inchworm robots. Our study deals with another kind of pipe inspection robot, namely a snake robot. Target setting. Modeling and testing of a snake robot moving in a pipe for inspection purposes. Actual scientific researches and issues analysis. Pipe inspection is usually done by wheel-based robots; however, snake robots have great potential for these applications. Uninvestigated parts of general matters defining. Inspection of curved pipe sections remains an open research question. The research objective. In this paper a locomotion pattern for a snake robot is designed and experimentally verified. The statement of basic materials. The paper investigates numerical modeling in MATLAB and presents a locomotion pattern for a snake robot moving in a narrow pipe. Next, a kinematic model of the robot is derived and the robot's motion is simulated in MATLAB. Subsequently, experiments are performed with the experimental snake robot LocoSnake. In the conclusion, the simulation and experimental results are compared and discussed. Conclusions. The paper introduces a concertina locomotion pattern for a snake robot, with numerical modeling as well as experimental verification. The experimental results differ from the simulation mainly because of differences in kinematic configuration between the simulation and the real model. The experiment also demonstrates the uniqueness of a kinematic configuration using both revolute and prismatic joints, which is significant for concertina locomotion.
8

Sun, Tao, Liangpeng Ye, Jia Xie, Jiaqing Zhang, Minghao Fan, and Hao Fan. "Research and application of substation cable trench inspection robot communication system." Journal of Physics: Conference Series 2031, no. 1 (September 1, 2021): 012025. http://dx.doi.org/10.1088/1742-6596/2031/1/012025.

Abstract:
The substation cable trench inspection robot is mainly used to inspect high-voltage cables in the underground cable trenches of substations, so as to find hidden dangers in time. However, the environment is complicated: there is serious electromagnetic interference in the cable trench and no communication network underground. For this complex environment, this paper designs a WLAN-based communication scheme for a cable trench inspection robot that combines WLAN with directional antennas and relays through relay routing, so that communication between the robot and the PC is divided into two parts: in-pipe communication and ground communication. The communication hardware, software, and protocol are designed based on the B/S architecture, which effectively extends the communication distance. Field verification shows that the packet loss rate of the proposed communication system is less than 0.5‰, communication is stable, and the robot's communication tasks in the substation cable trench can be completed well.
9

Chen, Liqiong, Shilong Song, and Can Wang. "A Novel Effort Measure Method for Effort-Aware Just-in-Time Software Defect Prediction." International Journal of Software Engineering and Knowledge Engineering 31, no. 08 (August 2021): 1145–69. http://dx.doi.org/10.1142/s0218194021500364.

Abstract:
Just-in-time software defect prediction (JIT-SDP) is a fine-grained software defect prediction technology, which aims to identify the defective code changes in software systems. Effort-aware software defect prediction is a software defect prediction technology that takes into consideration the cost of code inspection, which can find more defective code changes in limited test resources. The traditional effort-aware defect prediction model mainly measures the effort based on the number of lines of code (LOC) and rarely considers additional factors. This paper proposes a novel effort measure method called Multi-Metric Joint Calculation (MMJC). When measuring the effort, MMJC takes into account not only LOC, but also the distribution of modified code across different files (Entropy), the number of developers that changed the files (NDEV) and the developer experience (EXP). In the simulation experiment, MMJC is combined with Linear Regression, Decision Tree, Random Forest, LightGBM, Support Vector Machine and Neural Network, respectively, to build the software defect prediction model. Several comparative experiments are conducted between the models based on MMJC and baseline models. The results show that indicators ACC and [Formula: see text] of the models based on MMJC are improved by 35.3% and 15.9% on average in the three verification scenarios, respectively, compared with the baseline models.
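
The abstract names the four metrics MMJC joins (LOC, Entropy, NDEV, EXP) but not the paper's formula, so the sketch below is a hypothetical illustration of the idea of a multi-metric effort score, not the authors' method: the entropy term is the standard Shannon entropy of how a change's lines spread across files, and the equal weights and the inverse-experience term are assumptions made for the example.

```python
import math

def change_entropy(lines_per_file):
    """Shannon entropy of how a change's modified lines spread across files."""
    total = sum(lines_per_file)
    probs = [n / total for n in lines_per_file if n > 0]
    return -sum(p * math.log2(p) for p in probs)

def effort_score(loc, lines_per_file, ndev, exp, w=(0.25, 0.25, 0.25, 0.25)):
    """Hypothetical joint effort measure: LOC, Entropy, NDEV, and EXP
    (more developer experience is assumed to lower inspection effort)."""
    entropy = change_entropy(lines_per_file)
    return w[0] * loc + w[1] * entropy + w[2] * ndev + w[3] / (1 + exp)

# A scattered change by several inexperienced developers scores as costlier
# to inspect than the same LOC concentrated in one file by a veteran.
concentrated = effort_score(loc=100, lines_per_file=[100], ndev=1, exp=10)
scattered = effort_score(loc=100, lines_per_file=[25, 25, 25, 25], ndev=4, exp=1)
assert scattered > concentrated
```

In an effort-aware evaluation, predicted-defective changes would then be ranked by predicted risk per unit of this effort rather than per LOC alone.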
10

Stojadinovic, Slavenko M., Vidosav D. Majstorovic, Adam Gąska, Jerzy Sładek, and Numan M. Durakbasa. "Development of a Coordinate Measuring Machine—Based Inspection Planning System for Industry 4.0." Applied Sciences 11, no. 18 (September 10, 2021): 8411. http://dx.doi.org/10.3390/app11188411.

Abstract:
Industry 4.0 represents a new paradigm that creates new requirements in manufacturing and manufacturing metrology, such as reduced product cost, flexibility, mass customization, product quality, a high level of digitalization, and optimization, all of which contribute to smart manufacturing and smart metrology systems. This paper presents an inspection planning system based on a CMM as support for smart metrology within Industry 4.0, or manufacturing metrology 4.0 (MM4.0). The system is based on the application of three AI techniques: engineering ontology (EO), GA and ant colony optimization (ACO). The developed system consists of: an ontological knowledge base; a mathematical model for generating the strategy of the initial measurement plan (MP); a model for analysis and optimization of workpiece setups and probe configuration; a path simulation model in MATLAB, PTC Creo and STEP-NC Machine software; and a model for optimizing the MP by applying ACO. The advantage of the model is its suitability for monitoring the measurement process and digitalizing measurement process planning, simulation and measurement verification based on the CMM, reducing preparatory measurement time as early as the inspection planning phase and minimizing human involvement and human error through intelligent planning, which directly increases the production efficiency, competitiveness, and productivity of enterprises. The measuring experiment was performed using a machined prismatic workpiece (PW).
11

Broman, David. "Interactive Programmatic Modeling." ACM Transactions on Embedded Computing Systems 20, no. 4 (June 2021): 1–26. http://dx.doi.org/10.1145/3431387.

Abstract:
Modeling and computational analyses are fundamental activities within science and engineering. Analysis activities can take various forms, such as simulation of executable models, formal verification of model properties, or inference of hidden model variables. Traditionally, tools for modeling and analysis have similar workflows: (i) a user designs a textual or graphical model or the model is inferred from data, (ii) a tool performs computational analyses on the model, and (iii) a visualization tool displays the resulting data. This article identifies three inherent problems with the traditional approach: the recomputation problem, the variable inspection problem, and the model expressiveness problem. As a solution, we propose a conceptual framework called Interactive Programmatic Modeling. We formalize the interface of the framework and illustrate how it can be used in two different domains: equation-based modeling and probabilistic programming.
12

Botacin, Marcus, Francis B. Moreira, Philippe O. A. Navaux, André Grégio, and Marco A. Z. Alves. "Terminator: A Secure Coprocessor to Accelerate Real-Time AntiViruses Using Inspection Breakpoints." ACM Transactions on Privacy and Security 25, no. 2 (May 31, 2022): 1–34. http://dx.doi.org/10.1145/3494535.

Abstract:
AntiViruses (AVs) are essential to face the myriad of malware threatening Internet users. AVs operate in two modes: on-demand checks and real-time verification. Software-based real-time AVs intercept system and function calls to execute the AV’s inspection routines, resulting in significant performance penalties because the monitoring code runs alongside the suspicious code. Simultaneously, dark silicon problems push the industry to add more specialized accelerators inside the processor to mitigate these integration problems. In this article, we propose Terminator, an AV-specific coprocessor that assists software AVs by outsourcing their matching procedures to the hardware, thus saving CPU cycles and mitigating performance degradation. We designed Terminator to be flexible and compatible with existing AVs by using YARA and ClamAV rules. Our experiments show that our approach can save up to 70 million CPU cycles per rule when outsourcing on-demand checks for matching typical, unmodified YARA rules against a dataset of 30 thousand in-the-wild malware samples. Our proposal eliminates the AV’s need to block the CPU to perform full system checks, which can now occur in parallel. We also designed a new inspection breakpoint mechanism that signals to the coprocessor the beginning of a monitored region, allowing it to scan regions in parallel with their execution. Overall, our mechanism mitigated up to 44% of the overhead imposed by executing and monitoring the SPEC benchmark applications in the most challenging scenario.
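
The matching work being offloaded can be pictured with a toy software analogue: scanning byte buffers for rule signatures. Real YARA and ClamAV rules are far richer (strings with wildcards, conditions, hashes), so this sketch only illustrates the signature-scan inner loop, and the rule names and the second byte signature are made up for the example.

```python
RULES = {
    "eicar_fragment": b"EICAR-STANDARD-ANTIVIRUS-TEST",
    "fake_packer": b"\x4d\x5a\x90\x00",  # hypothetical byte signature
}

def scan(buffer: bytes) -> list[str]:
    """Return the names of all rules whose signature occurs in the buffer."""
    return [name for name, sig in RULES.items() if sig in buffer]

# The standard EICAR test string triggers the first rule only.
sample = b"X5O!P%@AP[4\\PZX54(P^)7CC)7}$EICAR-STANDARD-ANTIVIRUS-TEST-FILE!$H+H*"
assert scan(sample) == ["eicar_fragment"]
assert scan(b"clean data") == []
```

The paper's contribution is doing this loop in a coprocessor concurrently with execution, so the host CPU is not blocked on it.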
13

Ibrahim, A., N. A. Z. M. Zukri, B. N. Ismail, M. K. Osman, N. A. M. Yusof, M. Idris, A. H. Rabian, and I. Bahri. "Flexible Pavement Crack’s Severity Identification and Classification using Deep Convolution Neural Network." Journal of Mechanical Engineering 18, no. 2 (April 15, 2021): 193–201. http://dx.doi.org/10.24191/jmeche.v18i2.15154.

Abstract:
An effective road maintenance program is vital to ensure traffic safety and serviceability and to prolong the life span of the road. Maintenance is carried out on pavements when signs of degradation begin to appear, and delays may lead to increased maintenance costs in the future, when more extensive repairs may be required. In Malaysia, manual visual observation is practiced in the inspection of distressed pavements. Nonetheless, this method of inspection is ineffective, as it is laborious, time consuming and poses a safety hazard. This study focuses on utilizing an Artificial Intelligence (AI) method to automatically classify pavement crack severity. Field data collection was conducted to allow meaningful verification of the accuracy and reliability of the AI-based prediction of crack severity. The methodology comprises several important phases, including data collection, image labelling, image resizing, image enhancement, deep convolutional neural network (DCNN) training and performance evaluation. Through analysis of the image processing results, the output images were successfully classified using MATLAB software. The good agreement between the field measurement data and the DCNN prediction of crack severity proved the reliability of the system. In conclusion, the established method can classify crack severity based on the JKR guideline for visual assessment.
14

Khatri, Ajay, Shweta Agrawal, and Jyotir M. Chatterjee. "Wheat Seed Classification: Utilizing Ensemble Machine Learning Approach." Scientific Programming 2022 (February 2, 2022): 1–9. http://dx.doi.org/10.1155/2022/2626868.

Abstract:
Recognizing and authenticating wheat varieties is critical for quality evaluation in the grain supply chain, particularly for seed inspection methods. Recognition and verification of grains are traditionally carried out manually through direct visual examination. Automatic categorization techniques based on machine learning and computer vision offer fast and high-throughput solutions; even so, categorization remains a complicated process at the varietal level. This paper utilizes machine learning approaches for classifying wheat seeds. The classification is performed on 7 physical features: area of the wheat kernel, perimeter, compactness, kernel length, kernel width, asymmetry coefficient, and kernel groove length. The dataset, collected from the UCI library, has 210 instances of wheat kernels from three varieties, Kama, Rosa, and Canadian, with 70 instances of each chosen at random for the experiment. In the first phase, K-nearest neighbor, classification and regression tree, and Gaussian Naïve Bayes algorithms are implemented for classification, and their results are compared with an ensemble machine learning approach. The results reveal that the accuracies for the KNN, decision tree, and Naïve Bayes classifiers are 92%, 94%, and 92%, respectively. The highest accuracy, 95%, is achieved by the ensemble classifier, in which the decision is made by hard voting.
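
The hard-voting decision rule the abstract describes can be sketched in a few lines (a minimal illustration of majority voting, not the paper's implementation): each base classifier casts one vote per sample and the majority label wins. The per-model predictions below are made up for the example.

```python
from collections import Counter

def hard_vote(predictions_per_model):
    """predictions_per_model: one list of predicted labels per classifier,
    all of equal length. Returns the majority label for each sample."""
    n_samples = len(predictions_per_model[0])
    voted = []
    for i in range(n_samples):
        votes = Counter(model[i] for model in predictions_per_model)
        voted.append(votes.most_common(1)[0][0])  # most frequent label wins
    return voted

knn   = ["Kama", "Rosa", "Canadian", "Kama"]
tree  = ["Kama", "Rosa", "Rosa",     "Kama"]
bayes = ["Rosa", "Rosa", "Canadian", "Kama"]
assert hard_vote([knn, tree, bayes]) == ["Kama", "Rosa", "Canadian", "Kama"]
```

With an odd number of classifiers over few classes, ties are rare; `Counter.most_common` breaks any remaining tie by first-seen order, which a production implementation would want to handle explicitly.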
15

Seryh, I., E. Chernysheva, and A. Gol'cov. "Inspection of Load-Bearing Structures of the Main Building of the Canning Factory." Bulletin of Belgorod State Technological University named after V. G. Shukhov 7, no. 2 (February 14, 2022): 30–38. http://dx.doi.org/10.34031/2071-7318-2021-7-2-30-38.

Abstract:
As part of the technical re-equipment of the main building of a canning plant, the capacitive equipment needed to be replaced, increasing the load on the first-floor slab by 40.5 tons. In this regard, a comprehensive survey of the structural elements was carried out. Defects and damage detected during the examination could, during future operation, lead to a decrease in the bearing capacity of the frame elements and hence to physical wear of the building. Therefore, taking the results obtained into account, it was recommended to implement a set of measures to restore structures found to be in a limited working condition. The verification calculation of the first-floor structures in the restoration room was carried out with the LIRA-CAD software package, which is based on the finite element method. When determining the forces in the monolithic beamless slab, the most relevant load combinations were taken into account. The most dangerous loads in this case are considered to be a strip load distributed over the entire span, and a continuous load with the pressure distributed over the entire area of the structure. Considering the possibility of simultaneous collapse of the column above and the span part of the floor, the strength of the latter was calculated for strip failure. Based on the verification calculation, taking into account the current state of the plant's structural elements as well as the expected increase in load associated with the installation of new technological equipment, it can be concluded that the load-bearing capacity of the structures is not ensured.
16

He, Ping, Chao Liu, Meng Wang, and Sheng Mei Cao. "Design of On-Line Detection System for Paper Defects Based on Line-Scan Camera." Applied Mechanics and Materials 536-537 (April 2014): 268–71. http://dx.doi.org/10.4028/www.scientific.net/amm.536-537.268.

Abstract:
Paper defects are flaws such as holes, embossing, and folds that arise during paper production, mainly as a result of limitations in the technological level. In the past, manual visual inspection and off-line checking were often used to detect paper defects; however, their shortcomings became apparent with the improvement of industrial technology and the increasing demand for paper. In order to realize on-line detection and marking of paper defects, this project designs an on-line detection system based on a line-scan digital camera. First, the principle and detection scheme of the system are presented. Then the overall structure of the system is designed. After that, the hardware circuit is designed using a TMS320F2812 as the main control chip, covering the function of each module and the working process of the system. Finally, the software of the image acquisition system is presented. Experimental verification shows that the system has the advantages of low cost, high efficiency and strong resistance to interference, and its functions and performance indexes meet the design requirements.
17

Oh, Jai-woo, Jin-Kyu Kang, and Og-Woo Yoo. "A Study on the Automatic Control System for Quality Assurance of Clinical Pathology Examination Applying Machine Learning Technology." Webology 19, no. 1 (January 20, 2022): 4459–81. http://dx.doi.org/10.14704/web/v19i1/web19294.

Abstract:
The object of this research is a machine learning-based integrated quality control system that provides accurate test results quickly by improving the accuracy of test results and the utilization rate of test equipment through analysis of clinical pathology examination data. The development applied IEC 62304, the international standard for the medical software life cycle, and integrated an agile methodology so that the software life cycle rules were applied to all processes. Random Forest, XGBoost (eXtreme Gradient Boosting), LightGBM (Light Gradient Boosting Machine), and DNNs (Deep Neural Networks) were applied to improve rule-check accuracy, to predict abnormalities in inspection results, and to monitor abnormal results in real time via the rule-check calculation method. In addition, to improve user convenience, automatic control and a dashboard-based user interface applying machine learning technology were provided, along with automatic interface technology for various heterogeneous inspection devices. Validation and verification were conducted through a qualified testing body (TTA) to ensure reliability. The study found the following. First, implementing a module that applies big data-based machine learning to the algorithm used for quality control judgment in the knowledge-based expert system achieved more than 95% accuracy. Second, to confirm the real-time alarm function, the developed module was linked to the clinical pathology information system, and experiments showed that it operated normally; reliability was further secured through certification by an accredited certification body. Third, for the inspection equipment interface, stability was secured through numerous communication and certification tests covering RS232C, TCP/IP, and serial HL7. Fourth, multiple database tests (Oracle, MSSQL, MySQL, MS Access, etc.) showed that providing database neutrality and interfaces with other systems avoids duplicate investment and saves cost. Fifth, utility and user satisfaction were enhanced by program functions for outputting result reports in various formats and configuring the UI; the UI settings were modularized to reduce development costs and allow module reuse. Through these results, small and medium-sized hospitals can improve the reliability of inspection results with the machine learning-based quality control module, and real-time monitoring of inspection equipment makes it possible to quickly detect failures and improve equipment operation rates. A module that can be linked with existing information systems makes integration easy, and various UI environments improve user convenience. As a result, the hospital's competitiveness and medical service can be expected to improve by resolving the quality control difficulties of small and medium-sized hospitals and providing prompt and accurate test results.
18

Guo, Canzhi, Chunguang Xu, Dingguo Xiao, Juan Hao, and Hanming Zhang. "Trajectory planning method for improving alignment accuracy of probes for dual-robot air-coupled ultrasonic testing system." International Journal of Advanced Robotic Systems 16, no. 2 (March 1, 2019): 172988141984271. http://dx.doi.org/10.1177/1729881419842713.

Abstract:
Composite workpieces, especially those with complex curved surfaces, are increasingly used in different industries, and non-destructive testing of these parts has become an urgent problem. To solve it, this article presents a dual-robot air-coupled ultrasonic non-destructive testing scheme and introduces in detail the structure of the system and a general calibration method for the workpiece frame of a dual-robot system. Importantly, this article proposes a tangential constraint method that keeps the probes completely aligned during the inspection process. Verification experiments and ultrasonic testing experiments on a glued multilayered composite workpiece were performed using the dual-robot air-coupled ultrasonic non-destructive testing system, and a comparative experiment was performed using a dual-robot water jet-coupled ultrasonic testing system. Experimental results show that the dual-robot non-destructive testing scheme and the tangential constraint method function well, and all the artificial defects on the sample can be detected by both kinds of testing methods. A vivid 3-D C-scan image based on the test results is provided for convenience of observation. In other words, a flexible, versatile testing platform with multiple degrees of freedom is established.
APA, Harvard, Vancouver, ISO, and other styles
19

Cernescu, Anghel, Liviu Marsavina, and Ion Dumitru. "Structural integrity assessment for a component of the bucket-wheel excavator." International Journal of Structural Integrity 5, no. 2 (May 13, 2014): 129–40. http://dx.doi.org/10.1108/ijsi-08-2012-0019.

Full text
Abstract:
Purpose – The purpose of this paper is to present a methodology for assessing the structural integrity of a tie member from a bucket-wheel excavator, ESRC 470 model, which had been in operation for about 20 years. The tie member is made of S355J2N structural steel. After such a period of operation, the occurrence of microcracks which can propagate by fatigue is almost inevitable. It is therefore necessary to analyze the structural integrity and the remaining life of the component. Design/methodology/approach – In principle, the assessment methodology is based on three steps: first, the evaluation of the mechanical properties of the component material; second, a BEM analysis using the FRANC3D software package to estimate the evolution of the stress intensity factor as a function of crack length and applied stress; third, risk factor estimation and remaining fatigue life predictions based on the failure assessment diagram and the fatigue damage tolerance concept. Findings – Following the evaluation procedure, predictions were made of the failure risk factor and remaining fatigue life as functions of crack length and variable stress range, at a high level of confidence. Originality/value – As a result of this analysis, a program was implemented for verification and inspection of the tie member for the loading state and the development of small cracks during operation.
APA, Harvard, Vancouver, ISO, and other styles
20

Yussouf, Nusrat, John S. Kain, and Adam J. Clark. "Short-Term Probabilistic Forecasts of the 31 May 2013 Oklahoma Tornado and Flash Flood Event Using a Continuous-Update-Cycle Storm-Scale Ensemble System." Weather and Forecasting 31, no. 3 (June 1, 2016): 957–83. http://dx.doi.org/10.1175/waf-d-15-0160.1.

Full text
Abstract:
Abstract A continuous-update-cycle storm-scale ensemble data assimilation (DA) and prediction system using the ARW model and DART software is used to generate retrospective 0–6-h ensemble forecasts of the 31 May 2013 tornado and flash flood event over central Oklahoma, with a focus on the prediction of heavy rainfall. Results indicate that the model-predicted probabilities of strong low-level mesocyclones correspond well with the locations of observed mesocyclones and with the observed damage track. The ensemble-mean quantitative precipitation forecast (QPF) from the radar DA experiments match NCEP’s stage IV analyses reasonably well in terms of location and amount of rainfall, particularly during the 0–3-h forecast period. In contrast, significant displacement errors and lower rainfall totals are evident in a control experiment that withholds radar data during the DA. The ensemble-derived probabilistic QPF (PQPF) from the radar DA experiment is more skillful than the PQPF from the no_radar experiment, based on visual inspection and probabilistic verification metrics. A novel object-based storm-tracking algorithm provides additional insight, suggesting that explicit assimilation and 1–2-h prediction of the dominant supercell is remarkably skillful in the radar experiment. The skill in both experiments is substantially higher during the 0–3-h forecast period than in the 3–6-h period. Furthermore, the difference in skill between the two forecasts decreases sharply during the latter period, indicating that the impact of radar DA is greatest during early forecast hours. Overall, the results demonstrate the potential for a frequently updated, high-resolution ensemble system to extend probabilistic low-level mesocyclone and flash flood forecast lead times and improve accuracy of convective precipitation nowcasting.
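The probabilistic verification metrics the abstract alludes to typically include the Brier score, the mean squared difference between forecast probabilities and 0/1 observations (lower is better). The sketch below is an illustrative toy comparison; the probability arrays and outcomes are assumptions, not data from the study.

```python
# Hedged sketch: the Brier score, a standard probabilistic verification
# metric. Numbers below are illustrative, not from the 31 May 2013 case.

def brier_score(probs, outcomes):
    """Mean squared difference between forecast probability and 0/1 outcome."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

radar    = [0.9, 0.8, 0.2, 0.7]   # PQPF with radar data assimilation
no_radar = [0.5, 0.4, 0.5, 0.3]   # PQPF without radar data assimilation
observed = [1, 1, 0, 1]           # did rainfall exceed the threshold?

# A lower score for the radar experiment reflects more skillful PQPF.
print(brier_score(radar, observed) < brier_score(no_radar, observed))  # True
```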
APA, Harvard, Vancouver, ISO, and other styles
21

Xu, Zhi, Deming Zhong, Weigang Li, Hao Huang, and And Yigang Sun. "Formal verification of dynamic hybrid systems: a NuSMV-based model checking approach." ITM Web of Conferences 17 (2018): 03026. http://dx.doi.org/10.1051/itmconf/20181703026.

Full text
Abstract:
Software security is an important and challenging research topic in developing dynamic hybrid embedded software systems. Ensuring the correct behavior of these systems is particularly difficult due to the interactions between the continuous subsystem and the discrete subsystem. Currently available security analysis methods for system risks have been limited, as they rely on manual inspections of the individual subsystems under simplifying assumptions. To improve this situation, a new approach is proposed that is based on the symbolic model checking tool NuSMV. A dual PID system is used as an example system, for which the logical part and the computational part of the system are modeled in a unified manner. Constraints are constructed on the controlled object, and a counter-example path is ultimately generated, indicating that the hybrid system can be analyzed by the model checking tool.
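NuSMV explores the model's state space symbolically, but the core idea of model checking, exploring reachable states until one violates a safety constraint and reporting the path as a counter-example, can be sketched with an explicit-state breadth-first search. The toy transition system below is an illustrative assumption, not the paper's dual-PID model.

```python
# Hedged sketch: explicit-state model checking in miniature. Explore the
# transition system breadth-first; if a state violating the safety property
# is reached, return the path to it as a counter-example.
from collections import deque

def find_counterexample(initial, step, violates):
    """BFS over reachable states; return a path to a violating state, or None."""
    queue = deque([(initial, [initial])])
    seen = {initial}
    while queue:
        state, path = queue.popleft()
        if violates(state):
            return path
        for nxt in step(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [nxt]))
    return None

# Toy system: an integer level the "controller" may raise by 1 or 2;
# the safety property is level <= 3.
path = find_counterexample(0, lambda s: [s + 1, s + 2], lambda s: s > 3)
print(path)  # a shortest path into the unsafe region: [0, 2, 4]
```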
APA, Harvard, Vancouver, ISO, and other styles
22

Lee, E. S., C. H. Lee, and Sung Chung Kim. "Machining Accuracy Improvement by Automatic Tool Setting and On Machine Verification." Key Engineering Materials 381-382 (June 2008): 199–202. http://dx.doi.org/10.4028/www.scientific.net/kem.381-382.199.

Full text
Abstract:
This paper proposes a methodology for improving machining accuracy based on automatic tool setting and on-machine work-piece measurement using a laser tool setting system and an inspection probe. In this study, laser tool setting systems were analyzed in terms of principles, convenience, and efficiency, and an automatic tool setting method and operating macro software were developed. The importance of automatic non-contact tool setting using a laser tool setter was discussed in comparison with conventional manual tool setting and touch-type automatic tool setting. Correct tool-setting methods for different tool shapes and sizes were also defined, and pocket features were machined with each of the setting tools and measured by the inspection probe system to verify machining accuracy. Lastly, the superiority of laser tool setting systems was demonstrated by analyzing the cutting results when the CNC machining center was fitted with a laser tool setting system.
APA, Harvard, Vancouver, ISO, and other styles
23

Wang, Xiaoyuan, Renche Wang, and Zi’ang Li. "Quality Improvement of Aluminum Alloy Thin-walled Laminations Based on Process Capability Analysis." Journal of Physics: Conference Series 2262, no. 1 (April 1, 2022): 012003. http://dx.doi.org/10.1088/1742-6596/2262/1/012003.

Full text
Abstract:
Abstract The flatness of a certain batch of aluminum alloy thin-walled laminations is the key quality control object. Calculation of the process performance index PpU in the initial manufacturing stage showed that the process capability was insufficient. To solve this problem, an analysis model of the factors behind the out-of-tolerance flatness was established by analysing the structural characteristics of the lamination and combing through the production process; by means of investigation and experimental verification, the main causes of the out-of-tolerance flatness were confirmed. Targeted improvement measures, such as optimization of mould inspection strategies and replacement of qualified moulds, were taken. After the improvement, the Xbar-R control chart and the process capability chart were drawn using the MINITAB software, and the calculation and analysis of the process performance index PpU were completed. The results showed that the improved production process was in a state of statistical control and the process capability was sufficient. The effectiveness of the improvement measures was verified.
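The one-sided process performance index PpU cited above is conventionally computed as (USL - mean) / (3 * sigma), using the overall (long-term) standard deviation; values below roughly 1.33 are commonly read as insufficient capability. The sketch below uses illustrative flatness data, not the paper's measurements.

```python
# Hedged sketch: the upper process performance index PpU.
# The flatness samples and specification limit are illustrative.
import statistics

def ppu(samples, usl):
    """Upper process performance index: (USL - mean) / (3 * overall sigma)."""
    mean = statistics.mean(samples)
    sigma = statistics.stdev(samples)   # overall (long-term) std dev
    return (usl - mean) / (3 * sigma)

flatness = [0.08, 0.10, 0.09, 0.11, 0.07, 0.12]  # mm, made-up sample
print(round(ppu(flatness, usl=0.15), 2))  # 0.98: below 1.33, capability insufficient
```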
APA, Harvard, Vancouver, ISO, and other styles
24

Smith, Robert L., and Wen-Jing Huang. "Results from Minnesota/Wisconsin Automatic Out-of-Service Verification Operational Test." Transportation Research Record: Journal of the Transportation Research Board 1640, no. 1 (January 1998): 92–99. http://dx.doi.org/10.3141/1640-12.

Full text
Abstract:
A project was designed to enhance the ability of inspectors at fixed safety and weight stations (scales) to identify “out-of-service” (OOS) commercial vehicles and drivers by using advanced video-based license plate scanners linked to database software on a personal computer, the MOOSE system. In Wisconsin the results of safety inspections at the scales are stored in real time on a central computer database. This includes commercial vehicles and drivers placed OOS because of major safety violations. The primary goals were to increase the effectiveness of OOS enforcement efforts, establish a bistate enforcement program, and identify future applications. The technology was tested on a corridor involving three scales in Wisconsin and one in Minnesota on westbound I-90/I-94. The MOOSE system did identify a large number of OOS vehicles and drivers, but upon reinspection, almost no current OOS violations were found. The MOOSE system was successfully implemented at the Minnesota scale, but, as in Wisconsin, very few current OOS violations were identified. Because the Minnesota scale operates 24 hr/day, drivers coming from Wisconsin who are still OOS will probably use bypass routes. The greatest potential benefit from the MOOSE system is likely to be from linking the license plates to a new system that provides safety rating scores. Inspectors could then select vehicles for inspection that have a higher probability of being OOS or having other safety violations.
APA, Harvard, Vancouver, ISO, and other styles
25

Ren, Ziqiong, Haoze Yu, Shuaifeng Wang, Jinwei Li, ShuangXi Huang, and Yeyong Guo. "Research on multi-functional intelligent ventilator based on UC/OS-III operating system for gas concentration detection." Journal of Physics: Conference Series 2246, no. 1 (April 1, 2022): 012028. http://dx.doi.org/10.1088/1742-6596/2246/1/012028.

Full text
Abstract:
Abstract In order to replace the traditional gas concentration inspection and exhaust device, the system controls the rotation of the fan according to the gas concentration, achieving real-time gas concentration monitoring, alarm control, and GPS global positioning. In this study, an STM32F4ZGT6 with an ARM Cortex-M4 core is used as the controller, combined with embedded system design, and the software is developed in MDK. The analog outputs of the MQ-9 carbon monoxide sensor module and the infrared NDIR carbon dioxide sensor module are converted to digital values by the ADC and processed; when a gas concentration exceeds the preset threshold, a buzzer alarm is triggered, a timer starts, and the energy-saving exhaust fan operates, rapidly exhausting the gas while the indoor and outdoor gas concentrations and the time are displayed. Through a Bluetooth connection to a mobile phone, the gas concentration can be checked in real time, and a host computer can be used to control the operating mode of the exhaust fan and to change the gas concentration threshold. Modular programming is adopted in this study to improve maintainability and to facilitate debugging and modification in later stages. Practical verification shows that the system can detect carbon monoxide and carbon dioxide concentrations, provide real-time alarm monitoring and GPS positioning, and change the exhaust fan's working mode, and that it is real-time, practical, and reliable.
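The alarm and fan-control behavior described above reduces to comparing each sensor reading against a user-adjustable threshold. The sketch below is a hypothetical host-side model of that logic; the function name, default limits, and action names are assumptions, not the actual STM32 firmware.

```python
# Hedged sketch: threshold-based alarm/fan control, modeled on the host side.
# Default limits (50 ppm CO, 1000 ppm CO2) are illustrative assumptions.

def check_gas(co_ppm, co2_ppm, co_limit=50, co2_limit=1000):
    """Return the set of actions to take for the given sensor readings."""
    actions = set()
    if co_ppm > co_limit or co2_ppm > co2_limit:
        actions.update({"buzzer_on", "fan_on"})   # either gas over limit
    else:
        actions.add("fan_off")                    # both readings safe
    return actions

print(check_gas(co_ppm=80, co2_ppm=600))   # CO over its limit: alarm + fan
print(check_gas(co_ppm=10, co2_ppm=600))   # both under limit: fan off
```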
APA, Harvard, Vancouver, ISO, and other styles
26

Carley, Stephen F., Alan L. Porter, and Jan L. Youtie. "A Multi-match Approach to the Author Uncertainty Problem." Journal of Data and Information Science 4, no. 2 (June 7, 2019): 1–18. http://dx.doi.org/10.2478/jdis-2019-0006.

Full text
Abstract:
Purpose – The ability to identify the scholarship of individual authors is essential for performance evaluation. A number of factors hinder this endeavor. Common and similarly spelled surnames make it difficult to isolate the scholarship of individual authors indexed in large databases. Variations in the spelling of individual scholars' names further complicate matters. Common family names in scientific powerhouses like China make it problematic to distinguish between authors possessing ubiquitous and/or anglicized surnames (as well as the same or similar first names). The assignment of unique author identifiers provides a major step toward resolving these difficulties. We maintain, however, that in and of themselves, author identifiers are not sufficient to fully address the author uncertainty problem. In this study we build on the author identifier approach by considering commonalities in fielded data between authors sharing the same surname and first initial. We illustrate our approach using three case studies.
Design/methodology/approach – The approach we advance in this study is based on commonalities among fielded data in search results. We cast a broad initial net: a Web of Science (WOS) search for a given author's last name, followed by a comma, followed by the first initial of his or her first name (e.g., a search for 'John Doe' would assume the form 'Doe, J'). Results for this search typically contain all of the scholarship legitimately belonging to this author in the given database (i.e., all of his or her true positives), along with a large amount of noise, or scholarship not belonging to this author (i.e., a large number of false positives). From this corpus we proceed to iteratively weed out false positives and retain true positives. Author identifiers provide a good starting point: if 'Doe, J' and 'Doe, John' share the same author identifier, this is sufficient for us to conclude they are one and the same individual. We find email addresses similarly adequate: if two author names that share the same surname and first initial have an email address in common, we conclude these authors are the same person. Author identifier and email address data are not always available, however. When this occurs, other fields are used to address the author uncertainty problem. Commonality among author data other than unique identifiers and email addresses is less conclusive for name consolidation purposes. For example, if 'Doe, John' and 'Doe, J' have an affiliation in common, do we conclude that these names belong to the same person? They may or may not; an affiliation may have employed two or more faculty members sharing the same last name and first initial. Similarly, it is conceivable that two individuals with the same last name and first initial publish in the same journal, publish with the same co-authors, and/or cite the same references. Should we then ignore commonalities among these fields and conclude they are too imprecise for name consolidation purposes? It is our position that such commonalities are indeed valuable for addressing the author uncertainty problem, but more so when used in combination. Our approach makes use of automation as well as manual inspection, relying initially on author identifiers, then on commonalities among fielded data other than author identifiers, and finally on manual verification. To achieve name consolidation independent of author identifier matches, we have developed a procedure for use with the bibliometric software VantagePoint (see www.thevantagepoint.com). While the application of our technique does not exclusively depend on VantagePoint, it is the software we find most efficient in this study. The script we developed implements our name disambiguation procedure in a way that significantly reduces manual effort on the user's part. Those who seek to replicate our procedure independently of VantagePoint can do so by manually following the method we outline, but we note that manual application of the procedure takes a significant amount of time and effort, especially when working with larger datasets. Our script begins by prompting the user for a surname and a first initial (for any author of interest). It then prompts the user to select a WOS field on which to consolidate author names. After this, the user is prompted to point to the name of the authors field, and finally asked to identify a specific author name (referred to by the script as the primary author) within this field whom the user knows to be a true positive (a suggested approach is to point to an author name associated with one of the records that has the author's ORCID iD or email address attached to it). The script proceeds to identify and combine all author names sharing the primary author's surname and first initial that share commonalities in the WOS field on which the user chose to consolidate. This typically results in a significant reduction in dataset size. After the procedure completes, the user is usually left with a much smaller (and more manageable) dataset to inspect manually (and/or to apply additional name disambiguation techniques to).
Research limitations – Match field coverage can be an issue. When field coverage is paltry, dataset reduction is less significant, which results in more manual inspection on the user's part. Our procedure does not lend itself to scholars who have had a legal family name change (after marriage, for example). Moreover, the technique we advance is sometimes, though not always, likely to have difficulty with scholars who have changed careers or fields dramatically, as well as with scholars whose work is highly interdisciplinary.
Practical implications – The procedure we advance can save a significant amount of time and effort for individuals engaged in name disambiguation research, especially when the name under consideration is a more common family name. It is more effective when match field coverage is high and a number of match fields exist.
Originality/value – The procedure combines preexisting approaches with more recent ones, harnessing the benefits of both.
Findings – Our study applies the name disambiguation procedure to three case studies. Ideal match fields are not the same for each case study; we find that match field effectiveness is in large part a function of field coverage. The original dataset sizes, the timeframes analyzed, and the subject areas of the three case studies all differ. Our procedure is most effective when applied to the third case study, both in terms of list reduction and in 100% retention of true positives. We attribute this to excellent match field coverage, especially in the more specific match fields, as well as to a more modest, manageable number of publications. While machine learning is considered authoritative by many, we do not see it as practical or replicable: machine learning approaches typically look for commonalities among citation data, which is not always available, structured, or easy to work with. The procedure advanced herein is practical, replicable, and relatively user friendly; it might be placed in a space between ORCID and machine learning. It is intended to be applied across numerous fields in a dataset of interest (e.g., emails, co-authors, affiliations), resulting in multiple rounds of reduction. Results indicate that effective match fields include author identifiers, emails, source titles, co-authors, and ISSNs. While the script we present is not likely to yield a dataset consisting solely of true positives (at least for more common surnames), it does significantly reduce manual effort on the user's part. Dataset reduction (after the procedure is applied) is in large part a function of (a) field availability and (b) field coverage.
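The consolidation step the abstract describes can be sketched in a few lines: author-name variants sharing the same surname and first initial are merged whenever they share a value in a chosen match field. The records, field names, and function below are illustrative assumptions, not the authors' VantagePoint script.

```python
# Hedged sketch: consolidate author-name variants on a shared match field
# (here, email). Records and field names are made up for illustration.

def consolidate(records, match_field):
    """Group indices of records whose match_field values coincide."""
    groups = {}
    for i, rec in enumerate(records):
        key = rec.get(match_field)
        if key:
            groups.setdefault(key, []).append(i)
    # Only groups with 2+ records represent an actual consolidation.
    return [idxs for idxs in groups.values() if len(idxs) > 1]

records = [
    {"author": "Doe, J",    "email": "jdoe@uni.edu"},
    {"author": "Doe, John", "email": "jdoe@uni.edu"},
    {"author": "Doe, Jane", "email": "jane.doe@org.edu"},
]
print(consolidate(records, "email"))  # [[0, 1]]: records 0 and 1 are one person
```

In the authors' workflow this step would be repeated per match field (identifiers, emails, source titles, co-authors, ISSNs), shrinking the candidate list before manual inspection.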
APA, Harvard, Vancouver, ISO, and other styles
27

Varsan, Evhen. "Features of organization of medico-legal expert researches in the cases of the mass injuring of victims in the salon of bus." Forensic-medical examination, no. 1 (May 29, 2017): 31–37. http://dx.doi.org/10.24061/2707-8728.1.2017.7.

Full text
Abstract:
The article deals with certain medico-legal aspects of trauma inside a bus, one of the types of road traffic accident with a large number of dead and injured. The typical causes of such incidents and the nature of the victims' injuries are shown, and a modern approach to optimizing expert research in the event of a large number of victims in a bus is developed and proposed. The circumstances of injury inside a bus are very diverse:
− rollover of the bus while transporting a large number of people;
− a fall of the bus from a height;
− a massive collision with fixed roadside objects;
− collision with other vehicles, of which the most fatal is a bus collision with a moving train.
Naturally, in these cases the mass injuries of those affected depend on the intensity of the trauma to the passengers in the bus, and the mechanism of the damage is determined by the specific form of the accident involving the bus. In such cases, experts usually face mechanical trauma inside the cabin, as well as mixed types of passenger injury (e.g., in case of fire). In-cabin trauma is characterized by damage formed through the following mechanisms:
− impact of the body on the interior of the cabin;
− injuries from shards of broken glass.
Basically, the nature of the injury is determined by the structural features of the bus, the presence of foreign objects, and the location of the victims. If the vehicle rolls over, the occupants suffer numerous additional impacts. Characteristic injuries for the driver are damage to the hands, fractures of the sternum, and fractures of the hips, legs, and feet. Characteristic for passengers are fractures of the lower limbs, bruised head wounds, and fractures and dislocations of the cervical spine; when seat belts are used, strip-like bruises and abrasions of the chest and abdomen and fractures of the ribs, collarbone, and sternum appear in the projection of the belts.
Shards of broken glass cause multiple linear abrasions and/or superficial or deep cut wounds, mainly on the face and upper extremities. If the bus body is deformed, the body may be compressed, producing damage in several areas, primarily the chest, abdomen, and extremities, accompanied by multiple bilateral rib fractures and ruptures of internal organs. If a fire or explosion of the vehicle follows, the nature of the damage found on the bodies will correspond to combined injury. In cases of injuries in a bus, work with the bodies of the victims begins at the scene. Thus, the protocol of inspection of the scene and of the corpse should first of all reflect the mutual position of the bodies and/or their fragments relative to the vehicle and its parts, and the distances between them; the condition of the clothing, odors from it, and the presence of various deposits and damage; contamination of the skin; the localization and nature of the injuries on the bodies and the presence of deformations of individual body parts; and the presence of traces of biological origin on the vehicle, compared with the nature of the deformation (damage) of the body. The results of the inspection of the road where the traffic accident took place should also be noted: traces of blood from the bus, fragments of various objects, etc.
Although bus injuries make up a small percentage of fatal injuries in general world statistics, they present certain difficulties in the planning, organization, execution, and coordination of forensic work in the multi-step mitigation of the medical consequences of the accident, usually associated with a large number of victims, the gross impact of damaging factors on the bodies of the victims, and the need to quickly address some specific issues: establishing at autopsy pathological signs that indicate the state of health of the driver in the period prior to the tragic event; establishing facts pointing to the use of intoxicating and medicinal substances that depress the nervous system (and many others); and the early identification of all victims. The results of the analysis made it possible to propose a modern, optimal, evidence-informed, and practically reliable systemic approach to the organizational model of forensic activities that ensures the interests of the investigation of an accident involving a bus and a large number of victims:
1. The preliminary stage of organizing expert services. It can be divided into two phases: an advance (preparatory) phase and an immediate phase. The basic tasks of the advance phase include: early development, coordination, and approval of an optimal legislative and regulatory framework; preliminary methodological, administrative-organizational, theoretical-practical, logistical, software, and applied training; reasonable estimates of projected short- and long-term needs and costs with regard to the peculiarities of such tragic events; and the creation, storage, use, and replenishment of reserve logistical and financial resources intended solely for use in such emergencies.
It also includes the creation, maintenance, and continuous improvement of a single centralized situation center, on a temporary or permanent basis, with a well-developed system of departmental and interdepartmental cooperation, primarily containing an operational information, supervisory, and analytical center for the collection, processing, storage, and exchange of information and for joint action under the threat, occurrence, and prevention of emergencies with a large number of victims. Immediately upon receipt of news of an accident involving a bus and a large number of victims, the immediate phase begins for the forensic services; its main elements include:
− prompt notification and assembly of the employees of the expert institutions;
− an emergency conference call to discuss organizational, theoretical, and practical questions and to give short specialized training on occupational safety, including the use of personal protective equipment, depending on the nature of the accident and the factors potentially dangerous to the health and life of the employees of the expert institutions.
All plans of measures must be coordinated and agreed with the appropriate representatives of the rapid-response structures, especially when conducting urgent investigative actions in the emergency area, primarily the inspection of the scene.
2. The inspection of the crime scene. It is advisable to start with a preliminary review ('reconnaissance'), during which the necessity of applying particular technical means and the number of specialists who will participate in the inspection are finally determined. The static phase of the scene examination, with the participation of forensic doctors, is accompanied by clear mapping and by photo and video recording of the vehicle and various objects; the exact relative positions of the bus (or its parts) and the discovered corpses, fragments of human remains, and other biological material are noted.
During the dynamic examination of the scene, a detailed external examination of the human remains, their fragments, and the biological material is performed, primary medical sorting is carried out, and the remains are carefully packaged and clearly and precisely labeled. Proper loading, transportation, and unloading are then performed. If temporary storage of biological material is needed, railway refrigerator wagons, refrigerated trailers, or mobile refrigeration chambers can be used; if these are absent or insufficient for the total number of remains and the biomaterial, heat-resistant boxes are deployed and the space is fitted with outdoor mobile air-conditioning systems, large amounts of ice obtained from specialized industrial ice makers, etc., which is especially important for slowing the decomposition of corpses, their fragments, and biomaterial in the warm season.
3. After initial registration and secondary sorting, the corpses, their fragments, and the biological material are examined, information significant for postmortem identification is collected, and the cause of death, the nature, mechanism, and age of the injuries, and other special issues are determined. At this stage, fragmented body parts and/or tissues are also matched to one body or another. In expert identification work on fragments of human remains or biological material, preference is given to genetic research, which provides highly accurate results. Depending on the extent of the influence of the damaging factors on the bodies of the victims and their degree of preservation, only after completion of the necessary forensic medical research, with a full collection of material for additional studies, are restoration of the appearance, embalming, and sanitary and cosmetic processing of the remains performed, after which they are handed over to relatives (or authorized representatives, etc.) for burial.
4. The final results of the examinations are issued; data are established that may be useful for later investigative and judicial actions aimed at gathering and verifying evidence in the criminal case.
5. The penultimate stage consists of sanitary-hygienic, treatment-and-prophylactic, and rehabilitation (including full psychological) interventions for the physical and mental health of the employees of the expert institutions involved in this work.
6. After the conclusion of the criminal proceedings as a whole, once official access to the data is opened, it is advisable to analyze the material and publish the relevant data in the scientific literature, with the goal of widespread study and use of the experience gained.
CONCLUSIONS.
1. A research platform for forensic activities in cases of accidents involving buses and a large number of victims has not been developed to date.
2. The effectiveness of forensic medical groups in this situation is in direct proportion to their degree of readiness for quick response and timely, high-quality completion of tasks.
3. Based on this, the development of a modern, optimal, evidence-based systemic approach to the organizational model of forensic activities in the presence of a large number of injured persons in a bus is very urgent; the above recommendations are aimed at solving this problem.
4. The recommendations can, in principle, be applied not only in cases of injuries in a bus but also to similar situations involving massive injury and loss of life.
5. It is necessary to continue scientific and practical research aimed at improving this algorithm of expert work.
APA, Harvard, Vancouver, ISO, and other styles
28

Duan, Xinjian, Min Wang, and Michael J. Kozluk. "Benchmarking PRAISE-CANDU 1.0 With Nuclear Risk Based Inspection Methodology Project Fatigue Cases." Journal of Pressure Vessel Technology 137, no. 2 (October 15, 2014). http://dx.doi.org/10.1115/1.4028202.

Full text
Abstract:
A probabilistic fracture mechanics (PFM) code, PRAISE-CANDU 1.0, has been developed under a software quality assurance (QA) program in full compliance with Canadian Standards Association (CSA) N286.7-99; it was initially released in June 2012 and officially approved for use in August 2013. Extensive verification and validation has been performed on PRAISE-CANDU 1.0 for the purpose of software QA. This paper presents the fatigue benchmarking of PRAISE-CANDU 1.0 against six other PFM codes on the NURBIM (nuclear risk based inspection methodology for passive components) fatigue cases. This benchmarking is considered an important element of the validation of PRAISE-CANDU. Excellent agreement is observed in spite of the differences between the codes. The comparison of the predicted leak probability at the 40th year shows that PRAISE-CANDU not only captures the same trend but also bounds (with higher predicted failure probability) the majority of the NURBIM results. In addition to the leak probability, the rupture probability and uncertainty analysis, which were not reported in the NURBIM project, are also calculated with PRAISE-CANDU and presented.
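Probabilistic fracture mechanics codes of the kind benchmarked here estimate leak probability by Monte Carlo sampling of uncertain inputs (initial crack size, growth rate) and counting the runs in which the crack penetrates the wall. The sketch below uses a deliberately simplified linear growth law and made-up input distributions; it is not PRAISE-CANDU's model.

```python
# Hedged sketch: Monte Carlo estimate of leak probability at a given year.
# The linear crack-growth law and uniform input distributions are
# illustrative assumptions, not the models used in PRAISE-CANDU or NURBIM.
import random

def leak_probability(n_trials, years, wall_mm, seed=0):
    rng = random.Random(seed)  # fixed seed for a reproducible estimate
    leaks = 0
    for _ in range(n_trials):
        depth = rng.uniform(0.1, 1.0)    # initial crack depth, mm
        rate = rng.uniform(0.01, 0.15)   # growth per year, mm
        if depth + rate * years > wall_mm:
            leaks += 1                   # crack grew through the wall: leak
    return leaks / n_trials

p = leak_probability(n_trials=20000, years=40, wall_mm=5.0)
print(0.0 < p < 1.0)  # True: a leak-probability estimate for the 40th year
```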
APA, Harvard, Vancouver, ISO, and other styles
29

Kuzmenko, Olha, Hanna Yarovenko, and Vitaliia Koibichuk. "DEVELOPMENT OF THE USER INTERFACE FOR THE AUTOMATED INTERNAL FINANCIAL MONITORING MODULE BASED ON UML-METHODOLOGY." Scientific opinion: Economics and Management, no. 6(76) (2021). http://dx.doi.org/10.32836/2521-666x/2021-76-16.

Full text
Abstract:
The rapid development of object-oriented programming languages in the late twentieth century contributed to the development of unified modeling languages for the graphical description of software development, system design, the mapping of organizational structures, and the modeling of business processes in the financial and economic environment. Today such broad-profile languages include the notations IDEF0 and IDEF3 (Integrated Definition for Function Modeling), DFD (Data Flow Diagram), and UML (Unified Modeling Language), which allow effective definition, visualization, design, and documentation of automated software modules and systems. The article develops communication models, based on UML diagrams, between the main users of the system and the automated internal financial monitoring module, in order to assess the quality of financial transactions and prevent the legalization of criminal proceeds. The proposed diagrams visualize all stages of verification of a financial transaction and determine the direction of message exchange between the involved internal financial monitoring modules (services) and the services of the State Financial Monitoring (National Bank of Ukraine, Security Service of Ukraine).
Their logic is based on the following actions: the user performs a financial transaction through a mobile or web application or the Client-Bank system; the automated banking system queries the financial monitoring module on whether financial monitoring is required; verification is started by a responsible bank employee, who communicates with the monitoring module and receives information based on the relevant verification criteria for approving or rejecting the financial transaction, or by a module containing built-in artificial intelligence and business logic that audits financial transactions; the inspection results are received and sent to the user and the responsible bank employees; and letters are sent to the authorized bodies of the state financial monitoring services for those transactions that have not passed verification or show risky signs of legalization of criminal proceeds.
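The decision step in the flow above can be sketched as a minimal rule-based monitoring module. The threshold, the country list, and the "two flags means reject" rule are illustrative assumptions, not rules specified in the article, which describes the message flow rather than the business logic.

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    amount: float              # transaction amount (currency units)
    country: str               # counterparty jurisdiction
    flagged_history: bool = False  # prior suspicious activity on the account

# Hypothetical verification criteria (assumptions for illustration).
THRESHOLD = 400_000
HIGH_RISK = {"offshore-1", "offshore-2"}

def monitor(tx: Transaction) -> dict:
    """Sketch of the internal-monitoring decision: the banking system asks
    this module about a transaction; risky outcomes are reported onward to
    the state financial monitoring services."""
    reasons = []
    if tx.amount >= THRESHOLD:
        reasons.append("amount over reporting threshold")
    if tx.country in HIGH_RISK:
        reasons.append("high-risk jurisdiction")
    if tx.flagged_history:
        reasons.append("prior suspicious activity")
    if len(reasons) >= 2:
        decision = "reject"
    elif reasons:
        decision = "review"          # escalate to a responsible employee
    else:
        decision = "approve"
    return {"decision": decision,
            "report_to_regulator": decision != "approve",
            "reasons": reasons}
```

A clean transaction is approved silently, a single flag escalates to human review, and multiple flags lead to rejection and a report, mirroring the approve/review/report branches of the described workflow.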
APA, Harvard, Vancouver, ISO, and other styles
30

Lin, Muyangzi, Miyuan Shan, Jie Zhou, and Yunjie Pan. "A Data-Driven Fault Diagnosis Method using Modified Health Index and Deep Neural Networks of a Rolling Bearing." Journal of Computing and Information Science in Engineering, August 9, 2021, 1–20. http://dx.doi.org/10.1115/1.4052082.

Full text
Abstract:
To improve fault diagnosis accuracy, a data-driven fault diagnosis model based on the adjustment Mahalanobis-Taguchi system (AMTS) was proposed. This model can analyze and identify the characteristics of vibration signals by using degradation monitoring as the classifier to capture and recognize product faults more accurately. To achieve this goal, we first used the modified ensemble empirical mode decomposition (MEEMD) scalar index to capture the bearing condition; then, by using the key intrinsic mode function (IMF) extracted by AMTS as the input of the classifier, the optimized properties of the bearing are decomposed and extracted effectively. Next, to improve the accuracy of the fault diagnosis, we tested different modes: employing the modified health index (MHI), which is designed to overcome the shortcomings of the proposed health index, as the classifier in single-fault mode, and deep neural networks (DNN) as the classifier in multi-fault mode. To evaluate the effectiveness of our model, the Case Western Reserve University (CWRU) bearing data were used for verification. Results indicated strong robustness, with fault diagnosis accuracies of 99.16% (in 1.09 s) and 99.86% (in 6.61 s) in the two data modes, respectively. Furthermore, we argue that this data-driven fault diagnosis markedly lowers the maintenance cost of complex systems by significantly reducing the inspection frequency, and improves future safety and reliability.
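The health-index idea can be illustrated with a toy stand-in: a scalar computed from a vibration segment that rises when fault impulses appear, thresholded to classify the bearing state. The RMS-times-kurtosis index, the threshold, and the synthetic signals below are assumptions for illustration; they are not the paper's MHI, MEEMD, or DNN pipeline.

```python
import numpy as np

def health_index(signal: np.ndarray) -> float:
    """Toy health index: RMS times kurtosis. Both statistics grow when
    sparse high-amplitude fault impacts contaminate a vibration signal."""
    rms = np.sqrt(np.mean(signal ** 2))
    centered = signal - signal.mean()
    kurt = np.mean(centered ** 4) / (np.mean(centered ** 2) ** 2 + 1e-12)
    return float(rms * kurt)

def classify(signal: np.ndarray, threshold: float = 10.0) -> str:
    """Single-fault-mode style decision: threshold the scalar index."""
    return "faulty" if health_index(signal) > threshold else "healthy"

# Synthetic demo signals: a clean 50 Hz vibration vs. one with sparse impacts.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 2000)
healthy = 0.5 * np.sin(2 * np.pi * 50 * t) + 0.05 * rng.standard_normal(t.size)
impulses = (rng.random(t.size) < 0.01) * 5.0   # ~1% of samples carry fault impacts
faulty = healthy + impulses
```

The clean signal has low kurtosis (a sinusoid's is about 1.5), so its index stays small, while the impulsive signal's heavy tails push the index well past the threshold; this is the monotone degradation behaviour a health index is meant to capture.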
APA, Harvard, Vancouver, ISO, and other styles
31

Quinan, C. L., and Hannah Pezzack. "A Biometric Logic of Revelation: Zach Blas’s SANCTUM (2018)." M/C Journal 23, no. 4 (August 12, 2020). http://dx.doi.org/10.5204/mcj.1664.

Full text
Abstract:
Ubiquitous in airports, border checkpoints, and other securitised spaces throughout the world, full-body imaging scanners claim to read bodies in order to identify if they pose security threats. Millimetre-wave body imaging machines—the most common type of body scanner—display to the operating security agent a screen with a generic body outline. If an anomaly is found or if an individual does not align with the machine’s understanding of an “average” body, a small box is highlighted and placed around the “problem” area, prompting further inspection in the form of pat-downs or questioning. In this complex security regime governed by such biometric, body-based technologies, it could be argued that nonalignment with bodily normativity as well as an attendant failure to reveal oneself—to become “transparent” (Hall 295)—marks a body as dangerous. As these algorithmic technologies become more pervasive, so too does the imperative to critically examine their purported neutrality and operative logic of revelation and readability.
Biometric technologies are marketed as excavators of truth, with their optic potency claiming to demask masquerading bodies. Failure and bias are, however, an inescapable aspect of such technologies that work with narrow parameters of human morphology. Indeed, surveillance technologies have been taken to task for their inherent racial and gender biases (Browne; Pugliese). Facial recognition has, for example, been critiqued for its inability to read darker skin tones (Buolamwini and Gebru), while body scanners have been shown to target transgender bodies (Keyes; Magnet and Rodgers; Quinan).
Critical security studies scholar Shoshana Magnet argues that error is endemic to the technological functioning of biometrics, particularly since they operate according to the faulty notion that bodies are “stable” and unchanging repositories of information that can be reified into code (Magnet 2).
Although body scanners are presented as being able to reliably expose concealed weapons, they are riddled with incompetencies that misidentify and over-select certain demographics as suspect. Full-body scanners have, for example, caused considerable difficulties for transgender travellers, breast cancer patients, and people who use prosthetics, such as artificial limbs, colonoscopy bags, binders, or prosthetic genitalia (Clarkson; Quinan; Spalding). While it is not in the scope of this article to detail the workings of body imaging technologies and their inconsistencies, a growing body of scholarship has substantiated the claim that these machines unfairly impact those identifying as transgender and non-binary (see, e.g., Beauchamp; Currah and Mulqueen; Magnet and Rogers; Sjoberg). Moreover, they are constructed according to a logic of binary gender: before each person enters the scanner, transportation security officers must make a quick assessment of their gender/sex by pressing either a blue (corresponding to “male”) or pink (corresponding to “female”) button. In this sense, biometric, computerised security systems control and monitor the boundaries between male and female.
The ability to “reveal” oneself is henceforth predicated on having a body free of “abnormalities” and fitting neatly into one of the two sex categorisations that the machine demands. Transgender and gender-nonconforming individuals, particularly those who do not have a binary gender presentation or whose presentation does not correspond to the sex marker in their documentation, also face difficulties if the machine flags anomalies (Quinan and Bresser).
Drawing on a Foucauldian analysis of power as productive, Toby Beauchamp similarly illustrates how surveillance technologies not only identify but also create and reshape the figure of the dangerous subject in relation to normative configurations of gender, race, and able-bodiedness. By mobilizing narratives of concealment and disguise, heightened security measures frame gender nonconformity as dangerous (Beauchamp, Going Stealth). Although national and supranational authorities market biometric scanning technologies as scientifically neutral and exact methods of identification and verification and as an infallible solution to security risks, such tools of surveillance are clearly shaped by preconceptions and prejudgements about race, gender, and bodily normativity. Not only are they encoded with “prototypical whiteness” (Browne) but they are also built on “grossly stereotypical” configurations of gender (Clarkson).
Amongst this increasingly securitised landscape, creative forms of artistic resistance can offer up a means of subverting discriminatory policing and surveillance practices by posing alternate visualisations that reveal and challenge their supposed objectivity. In his 2018 audio-video artwork installation entitled SANCTUM, UK-based American artist Zach Blas delves into how biometric technologies, like those described above, both reveal and (re)shape ontology by utilising the affectual resonance of sexual submission.
Evoking the contradictory notions of oppression and pleasure, Blas describes SANCTUM as “a mystical environment that perverts sex dungeons with the apparatuses and procedures of airport body scans, biometric analysis, and predictive policing” (see full description at https://zachblas.info/works/sanctum/).
Depicting generic mannequins that stand in for the digitalised rendering of the human forms that pass through body scanners, the installation transports the scanners out of the airport and into a queer environment that collapses sex, security, and weaponry; an environment that is “at once a prison-house of algorithmic capture, a sex dungeon with no genitals, a weapons factory, and a temple to security.” This artistic reframing gestures towards full-body scanning technology’s germination in the military, prisons, and other disciplinary systems, highlighting how its development and use has originated from punitive—rather than protective—contexts.
In what follows, we adopt a methodological approach that applies visual analysis and close reading to scrutinise a selection of scenes from SANCTUM that underscore the sadomasochistic power inherent in surveillance technologies. Analysing visual and aural elements of the artistic intervention allows us to complicate the relationship between transparency and recognition and to problematise the dynamic of mandatory complicity and revelation that body scanners warrant. In contrast to a discourse of visibility that characterises algorithmically driven surveillance technology, Blas suggests opacity as a resistance strategy to biometrics' standardisation of identity. Taking an approach informed by critical security studies and queer theory, we also argue that SANCTUM highlights the violence inherent to the practice of reducing the body to a flat, inert surface that purports to align with some sort of “core” identity, a notion that contradicts feminist and queer approaches to identity and corporeality as fluid and changing.
In close reading this artistic installation alongside emerging scholarship on the discriminatory effects of biometric technology, this article aims to highlight the potential of art to queer the supposed objectivity and neutrality of biometric surveillance and to critically challenge normative logics of revelation and readability.
Corporeal Fetishism and Body Horror
Throughout both his artistic practice and scholarly work, Blas has been critical of the above narrative of biometrics as objective extractors of information. Rather than looking to dominant forms of representation as a means for recognition and social change, Blas’s work asks that we strive for creative techniques that precisely queer biometric and legal systems in order to make oneself unaccounted for. For him, “transparency, visibility, and representation to the state should be used tactically, they are never the end goal for a transformative politics but are, ultimately, a trap” (Blas and Gaboury 158). While we would simultaneously argue that invisibility is itself a privilege that is unevenly distributed, his creative work attempts to refuse a politics of visibility and to embrace an “informatic opacity” that is attuned to differences in bodies and identities (Blas).
In particular, Blas’s artistic interventions titled Facial Weaponization Suite (2011-14) and Face Cages (2013-16) protest against biometric recognition and the inequalities that these technologies propagate by making masks and wearable metal objects that cannot be detected as human faces. This artistic-activist project contests biometric facial recognition and their attendant inequalities by, as detailed on the artist’s website, making ‘collective masks’ in workshops that are modelled from the aggregated facial data of participants, resulting in amorphous masks that cannot be detected as human faces by biometric facial recognition technologies.
The masks are used for public interventions and performances. One mask explores blackness and the racist implications that undergird biometric technologies’ inability to detect dark skin. Meanwhile another mask, which he calls the “Fag Face Mask”, points to the heteronormative underpinnings of facial recognition. Created from the aggregated facial data of queer men, this amorphous pink mask implicitly references—and contests—scientific studies that have attempted to link the identification of sexual orientation through rapid facial recognition techniques.
Building on this body of creative work that has advocated for opacity as a tool of social and political transformation, SANCTUM resists the revelatory impulses of biometric technology by turning to the use and abuse of full-body imaging. The installation opens with a shot of a large, dark industrial space. At the far end of a red, spotlighted corridor, a black mask flickers on a screen. A shimmering, oscillating sound reverberates—the opening bars of a techno track—that breaks down in rhythm while the mask evaporates into a cloud of smoke. The camera swivels, and a white figure—the generic mannequin of the body scanner screen—is pummelled by invisible forces as if in a wind tunnel. These ghostly silhouettes appear and reappear in different positions, with some being whipped and others stretched and penetrated by a steel anal hook. Rather than conjuring a traditional horror trope of the body’s terrifying, bloody interior, SANCTUM evokes a new kind of feared and fetishized trope that is endemic to the current era of surveillance capitalism: the abstracted body, standardised and datafied, created through the supposedly objective and efficient gaze of AI-driven machinery.
Resting on the floor in front of the ominous animated mask are neon fragments arranged in an occultist formation—hands or half a face.
By breaking the body down into component parts—“from retina to fingerprints”—biometric technologies “purport to make individual bodies endlessly replicable, segmentable and transmissible in the transnational spaces of global capital” (Magnet 8). The notion that bodies can be seamlessly turned into blueprints extracted from biological and cultural contexts has been described by Donna Haraway as “corporeal fetishism” (Haraway, Modest). In the context of SANCTUM, Blas illustrates the dangers of mistaking a model for a “concrete entity” (Haraway, “Situated” 147). Indeed, the digital cartography of the generic mannequin becomes no longer a mode of representation but instead a technoscientific truth.
Several scenes in SANCTUM also illustrate a process whereby substances are extracted from the mannequins and used as tools to enact violence. In one such instance, a silver webbing is generated over a kneeling figure. Upon closer inspection, this geometric structure, which is reminiscent of Blas’s earlier Face Cages project, is a replication of the triangulated patterns produced by facial recognition software in its mapping of distance between eyes, nose, and mouth. In the next scene, this “map” breaks apart into singular shapes that float and transform into a metallic whip, before eventually reconstituting themselves as a penetrative douche hose that causes the mannequin to spasm and vomit a pixelated liquid. Its secretions levitate and become the webbing, and then the sequence begins anew.
In another scene, a mannequin is held upside-down and force-fed a bubbling liquid that is being pumped through tubes from its arms, legs, and stomach. These depictions visualise Magnet’s argument that biometric renderings of bodies are understood not to be “tropic” or “historically specific” but are instead presented as “plumbing individual depths in order to extract core identity” (5).
In this sense, this visual representation calls to mind biometrics’ reification of body and identity, obfuscating what Haraway would describe as the “situatedness of knowledge”. Blas’s work, however, forces a critique of these very systems, as the materials extracted from the bodies of the mannequins in SANCTUM allude to how biometric cartographies drawn from travellers are utilised to justify detainment. These security technologies employ what Magnet has referred to as “surveillant scopophilia,” that is, new ways and forms of looking at the human body “disassembled into component parts while simultaneously working to assuage individual anxieties about safety and security through the promise of surveillance” (17). The transparent body—the body that can submit and reveal itself—is ironically represented by the distinctly genderless translucent mannequins. Although the generic mannequins are seemingly blank slates, the installation simultaneously forces a conversation about the ways in which biometrics draw upon and perpetuate assumptions about gender, race, and sexuality.
Biometric Subjugation
On her 2016 critically acclaimed album HOPELESSNESS, openly transgender singer, composer, and visual artist Anohni performs a deviant subjectivity that highlights the above dynamics that mark the contemporary surveillance discourse. To an imagined “daddy” technocrat, she sings:
Watch me… I know you love me
'Cause you're always watching me
'Case I'm involved in evil
'Case I'm involved in terrorism
'Case I'm involved in child molesters
Evoking a queer sexual frisson, Anohni describes how, as a trans woman, she is hyper-visible to state institutions. She narrates a voyeuristic relation where trans bodies are policed as threats to public safety rather than protected from systemic discrimination.
Through the seemingly benevolent “daddy” character and the play on ‘cause (i.e., because) and ‘case (i.e., in case), she highlights how gender-nonconforming individuals are predictively surveilled and assumed to already be guilty. Reflecting on daddy-boy sexual paradigms, Jack Halberstam reads the “sideways” relations of queer practices as an enactment of “rupture as substitution” to create a new project that “holds on to vestiges of the old but distorts” (226). Upending power and control, queer art has the capacity to both reveal and undermine hegemonic structures while simultaneously allowing for the distortion of the old to create something new.
Employing the sublimatory relations of bondage, discipline, sadism, and masochism (BDSM), Blas’s queer installation similarly creates a sideways representation that re-orientates the logic of the biometric scanners, thereby unveiling the always already sexualised relations of scrutiny and interrogation as well as the submissive complicity they demand. Replacing the airport environment with a dark and foreboding mise-en-scène allows Blas to focus on capture rather than mobility, highlighting the ways in which border checkpoints (including those instantiated by the airport) encourage free travel for some while foreclosing movement for others. Building on Sara Ahmed’s “phenomenology of being stopped”, Magnet considers what happens when we turn our gaze to those “who fail to pass the checkpoint” (107). In SANCTUM, the same actions are played out again and again on spectral beings who are trapped in various states: they shudder in cages, are chained to the floor, or are projected against the parameters of mounted screens. One ghostly figure, for instance, lies pinned down by metallic grappling hooks, arms raised above the head in a recognisable stance of surrender, conjuring up the now-familiar image of a traveller standing in the cylindrical scanner machine, waiting to be screened.
In portraying this extended moment of immobility, Blas lays bare the deep contradictions in the rhetoric of “freedom of movement” that underlies such spaces.
On a global level, media reporting, scientific studies, and policy documents proclaim that biometrics are essential to ensuring personal safety and national security. Within the public imagination, these technologies become seductive because of their marked ability to identify terrorist attackers—to reveal threatening bodies—thereby appealing to the anxious citizen’s fear of the disguised suicide bomber. Yet for marginalised identities prefigured as criminal or deceptive—including transgender and black and brown bodies—the inability to perform such acts of revelation via submission to screening can result in humiliation and further discrimination, public shaming, and even tortuous inquiry – acts that are played out in SANCTUM.
Masked Genitals
Feminist surveillance studies scholar Rachel Hall has referred to the impetus for revelation in the post-9/11 era as a desire for a universal “aesthetics of transparency” in which the world and the body is turned inside-out so that there are no longer “secrets or interiors … in which terrorists or terrorist threats might find refuge” (127). Hall takes up the case study of Umar Farouk Abdulmutallab (infamously known as “the Underwear Bomber”) who attempted to detonate plastic explosives hidden in his underwear while onboard a flight from Amsterdam to Detroit on 25 December 2009. Hall argues that this event signified a coalescence of fears surrounding bodies of colour, genitalia, and terrorism. News reports following the incident stated that Abdulmutallab tucked his penis to make room for the explosive, thereby “queer[ing] the aspiring terrorist by indirectly referencing his willingness … to make room for a substitute phallus” (Hall 289).
Overtly manifested in the Underwear Bomber incident is also a desire to voyeuristically expose a hidden, threatening interiority, which is inherently implicated with anxieties surrounding gender deviance. Beauchamp elaborates on how gender deviance and transgression have coalesced with terrorism, which was exemplified in the wake of the 9/11 attacks when the United States Department of Homeland Security issued a memo that male terrorists “may dress as females in order to discourage scrutiny” (“Artful” 359). Although this advisory did not explicitly reference transgender populations, it linked “deviant” gender presentation—to which we could also add Abdulmutallab’s tucking of his penis—with threats to national security (Beauchamp, Going Stealth). This also calls to mind a broader discussion of the ways in which genitalia feature in the screening process. Prior to the introduction of millimetre-wave body scanning technology, the most common form of scanner used was the backscatter imaging machine, which displayed “naked” body images of each passenger to the security agent. Due to privacy concerns, these machines were replaced by the scanners currently in place which use a generic outline of a passenger (exemplified in SANCTUM) to detect possible threats.
It is here worth returning to Blas’s installation, as it also implicitly critiques the security protocols that attempt to reveal genitalia as both threatening and as evidence of an inner truth about a body. At one moment in the installation a bayonet-like object pierces the blank crotch of the mannequin, shattering it into holographic fragments. The apparent genderlessness of the mannequins is contrasted with these graphic sexual acts. The penetrating metallic instrument that breaks into the loin of the mannequin, combined with the camera shot that slowly zooms in on this action, draws attention to a surveillant fascination with genitalia and revelation. As Nicholas L.
Clarkson documents in his analysis of airport security protocols governing prostheses, including limbs and packies (silicone penis prostheses), genitals are a central component of the screening process. While it is stipulated that physical searches should not require travellers to remove items of clothing, such as underwear, or to expose their genitals to staff for inspection, prosthetics are routinely screened and examined. This practice can create tensions for trans or disabled passengers with prosthetics in so-called “sensitive” areas, particularly as guidelines for security measures are often implemented by airport staff who are not properly trained in transgender-sensitive protocols.
Conclusion
According to media technologies scholar Jeremy Packer, “rather than being treated as one to be protected from an exterior force and one’s self, the citizen is now treated as an always potential threat, a becoming bomb” (382). Although this technological policing impacts all who are subjected to security regimes (which is to say, everyone), this amalgamation of body and bomb has exacerbated the ways in which bodies socially coded as threatening or deceptive are targeted by security and surveillance regimes. Nonetheless, others have argued that the use of invasive forms of surveillance can be justified by the state as an exchange: that citizens should willingly give up their right to privacy in exchange for safety (Monahan 1). Rather than subscribing to this paradigm, Blas’ SANCTUM critiques the violence of mandatory complicity in this “trade-off” narrative. Because their operationalisation rests on normative notions of embodiment that are governed by preconceptions around gender, race, sexuality and ability, surveillance systems demand that bodies become transparent. This disproportionally affects those whose bodies do not match norms, with trans and queer bodies often becoming unreadable (Kafer and Grinberg).
The shadowy realm of SANCTUM illustrates this tension between biometric revelation and resistance, but also suggests that opacity may be a tool of transformation in the face of such discriminatory violations that are built into surveillance.
References
Ahmed, Sara. “A Phenomenology of Whiteness.” Feminist Theory 8.2 (2007): 149–68.
Beauchamp, Toby. “Artful Concealment and Strategic Visibility: Transgender Bodies and U.S. State Surveillance after 9/11.” Surveillance & Society 6.4 (2009): 356–66.
———. Going Stealth: Transgender Politics and U.S. Surveillance Practices. Durham, NC: Duke UP, 2019.
Blas, Zach. “Informatic Opacity.” The Journal of Aesthetics and Protest 9 (2014). <http://www.joaap.org/issue9/zachblas.htm>.
Blas, Zach, and Jacob Gaboury. “Biometrics and Opacity: A Conversation.” Camera Obscura: Feminism, Culture, and Media Studies 31.2 (2016): 154-65.
Browne, Simone. Dark Matters: On the Surveillance of Blackness. Durham, NC: Duke UP, 2015.
Buolamwini, Joy, and Timnit Gebru. “Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification.” Proceedings of Machine Learning Research 81 (2018): 1-15.
Clarkson, Nicholas L. “Incoherent Assemblages: Transgender Conflicts in US Security.” Surveillance & Society 17.5 (2019): 618-30.
Currah, Paisley, and Tara Mulqueen. “Securitizing Gender: Identity, Biometrics, and Transgender Bodies at the Airport.” Social Research 78.2 (2011): 556-82.
Halberstam, Jack. The Queer Art of Failure. Durham: Duke UP, 2011.
Hall, Rachel. “Terror and the Female Grotesque: Introducing Full-Body Scanners to U.S. Airports.” Feminist Surveillance Studies. Eds. Rachel E. Dubrofsky and Shoshana Amielle Magnet. Durham, NC: Duke UP, 2015. 127-49.
Haraway, Donna. “Situated Knowledges: The Science Question in Feminism and the Privilege of Partial Perspective.” Feminist Studies 14.3 (1988): 575-99.
———. Modest_Witness@Second_Millennium.FemaleMan_Meets_OncoMouse: Feminism and Technoscience. New York: Routledge, 1997.
Kafer, Gary, and Daniel Grinberg. “Queer Surveillance.” Surveillance & Society 17.5 (2019): 592-601.
Keyes, O.S. “The Misgendering Machines: Trans/HCI Implications of Automatic Gender Recognition.” Proceedings of the ACM on Human-Computer Interaction 2. CSCW, Article 88 (2018): 1-22.
Magnet, Shoshana Amielle. When Biometrics Fail: Gender, Race, and the Technology of Identity. Durham: Duke UP, 2011.
Magnet, Shoshana, and Tara Rodgers. “Stripping for the State: Whole Body Imaging Technologies and the Surveillance of Othered Bodies.” Feminist Media Studies 12.1 (2012): 101–18.
Monahan, Torin. Surveillance and Security: Technological Politics and Power in Everyday Life. New York: Routledge, 2006.
Packer, Jeremy. “Becoming Bombs: Mobilizing Mobility in the War of Terror.” Cultural Studies 10.5 (2006): 378-99.
Pugliese, Joseph. “In Silico Race and the Heteronomy of Biometric Proxies: Biometrics in the Context of Civilian Life, Border Security and Counter-Terrorism Laws.” Australian Feminist Law Journal 23 (2005): 1-32.
———. Biometrics: Bodies, Technologies, Biopolitics. New York: Routledge, 2010.
Quinan, C.L. “Gender (In)securities: Surveillance and Transgender Bodies in a Post-9/11 Era of Neoliberalism.” Security/Mobility: Politics of Movement. Eds. Stef Wittendorp and Matthias Leese. Manchester: Manchester UP, 2017. 153-69.
Quinan, C.L., and Nina Bresser. “Gender at the Border: Global Responses to Gender Diverse Subjectivities and Non-Binary Registration Practices.” Global Perspectives 1.1 (2020). <https://doi.org/10.1525/gp.2020.12553>.
Sjoberg, Laura. “(S)he Shall Not Be Moved: Gender, Bodies and Travel Rights in the Post-9/11 Era.” Security Journal 28.2 (2015): 198-215.
Spalding, Sally J. “Airport Outings: The Coalitional Possibilities of Affective Rupture.” Women’s Studies in Communication 39.4 (2016): 460-80.
APA, Harvard, Vancouver, ISO, and other styles
