Journal articles on the topic 'Non-functional characteristics-components of software quality'

Consult the top 50 journal articles for your research on the topic 'Non-functional characteristics-components of software quality.'


1

Samra, Hardeep Singh. "Study on Non Functional Software Testing." INTERNATIONAL JOURNAL OF COMPUTERS & TECHNOLOGY 4, no. 1 (February 1, 2013): 151–55. http://dx.doi.org/10.24297/ijct.v4i1c.3115.

Abstract:
Improving software quality involves reducing the quantity of defects within the final product and identifying the remaining defects as early as possible. It concerns both the software's functionality and its non-functional characteristics, such as usability, flexibility, performance, interoperability and security. In fact, defects found earlier in the development lifecycle cost dramatically less to repair than those found later. However, engineers cannot address non-functional quality requirements such as reliability, security, performance and usability early in the lifecycle using the same tools and processes that they use after coding and at later phases. Approaches such as stress testing for reliability, measuring performance and gauging user response to determine usability are inherently post-integration techniques. Accordingly, defects found with these tools are more disruptive and costly to fix. Nonetheless, there has been a lop-sided emphasis on the functionality of the software, even though functionality is not useful or usable without the necessary non-functional characteristics. This research highlights the sporadic industry acceptance of some popular methods for designing for non-functional requirements and suggests some practical approaches for companies that must also consider the demands of schedule and cost.
2

REYNOLDS, ROBERT G., and ELENA ZANNONI. "EXTRACTING PROCEDURAL KNOWLEDGE FROM SOFTWARE SYSTEMS USING INDUCTIVE LEARNING IN THE PM SYSTEM." International Journal on Artificial Intelligence Tools 01, no. 03 (September 1992): 351–67. http://dx.doi.org/10.1142/s0218213092000247.

Abstract:
Biggerstaff and Richter suggest that there are four fundamental subtasks associated with operationalizing the reuse process [1]. They are finding reusable components, understanding these components, modifying these components, and composing components. Each of these sub-problems can be re-expressed as a knowledge acquisition sub-problem relative to producing a new representation for the components that make them more suitable for future reuse. In this paper, we express the first two subtasks for the software reuse activity, as described by Biggerstaff and Richter, as a problem in Machine Learning. From this perspective, the goal of software reuse is to learn to recognize reusable software in terms of code structure, run-time behavior, and functional specification. The Partial Metrics (PM) System supports the acquisition of reusable software at three different levels of granularity: the system level, the procedural level, and the code segment level. Here, we describe how the system extracts procedural knowledge from an example Pascal software system that satisfies a set of structural, behavioral, and functional constraints. These constraints are extracted from a set of positive and negative examples using inductive learning techniques. The constraints are expressed quantitatively in terms of various quality models and metrics. The general characteristics of learned constraints that were extracted from a variety of applications libraries are discussed.
3

Павленко, М. А., С. В. Осієвський, and Ю. В. Данюк. "Methodological foundation for improving the quality of intelligent decision-making system software." Системи обробки інформації, no. 1(164) (March 17, 2021): 55–64. http://dx.doi.org/10.30748/soi.2021.164.06.

Abstract:
On the basis of a detailed analysis, existing terminological interpretations of the concept of "software quality" have been generalized, and conclusions are drawn about how well the terms used to assess the quality of general-purpose software apply to assessing the quality of software of intelligent decision-making systems (IDMS). It has been proved that the quality of IDMS software is a complex multi-criteria indicator that takes into account not only the performance of an individual software module as a subsystem, but also the causal relationships between the elements of the software system itself. The main differences in software quality assessment between the functional and formal approaches are shown. The structure of the criterion of guarantor capacity of decision-making system software has been investigated, and conclusions have been drawn on the influence of its main components on the evaluation of IDMS software and on ensuring a reliable computing process. On the basis of an analysis of the list of attributes and the quality metric of IDMS software, it is established that the guarantee is determined by the reliability of the software structure itself and is characterized by the restoration of the functional state after faults or failures. The interrelationship and influence of IDMS software design quality indicators on the characteristics and sub-characteristics of IDMS software are established; an example of the interrelationship between characteristics (factors) and quality indicators is given, together with the method of measuring quality indicators and design processes. On the basis of the conducted research, IDMS software failure regimes have been defined and their impact on the decision-making process has been shown. Detailed classes of failures and their influence on the compliance of IDMS software with the development task are shown.
It has been shown that the reliability of IDMS is a dynamic concept, manifested in time, and strongly dependent on the presence or absence of defects in the interaction. A detailed analysis of methods of software quality assurance and control has been carried out, and conclusions have been drawn on the possibility of their application to IDMS software. The maturity model of IDMS software has been improved and validated, and the maturity structure of the software as an indicator of IDMS quality has been introduced.
4

Abu Ezza, Hasan, Anna V. Shokhnekh, Victoria S. Telyatnikova, and Natalia S. Mushketova. "QUALITY PARAMETERS OF INFORMATION SYSTEMS FOR BUSINESS IN THE CONTEXT OF DIGITAL TRANSFORMATIONS." E3S Web of Conferences 208 (2020): 03059. http://dx.doi.org/10.1051/e3sconf/202020803059.

Abstract:
The article provides a refined definition of "information system for business" as a coordinated set of material, non-material and human resource components that is used to implement a set of procedures for forming an information resource as a quality product that meets the needs of external and internal users. According to the study, information technologies play a key role in the modern business environment. Many economic entities carry out their economic activities exclusively by using communication-digital networks and functional software, which provide the information necessary for effective business management. Recently, it has been proven that communication and information technologies have become an integral part of any modern information system, since they increase the level of business competitiveness when up-to-date operational management information is used. These information technologies ensure the continuity of information systems' work over a long period of time, with the possibility of quick adaptation of the business to the conditions of digital transformation, which reduces the cost of forming information resources. The role of such components of information technology as software, databases and computer networks, which, in turn, are key components of high-quality information systems, is disclosed.
5

Gnetko, L. V., M. M. Udychak, B. B. Siyukhova, and S. A. Gisheva. "Computer model and complete set of a functional non-alcoholic beverage production line." New Technologies 16, no. 6 (February 20, 2021): 20–27. http://dx.doi.org/10.47370/2072-0920-2020-16-6-20-27.

Abstract:
Production of functional products using local raw materials with a high content of biologically active substances is the main trend in the production of non-alcoholic beverages. In this regard, the creation of a technological line that ensures not only the quality of the finished product, but also the preservation of the functional properties of the feedstock, is of great interest. A computer model of a functional soft drink production line has been developed using the MasterSCADA 4D software created by a leading domestic company. The model emulates the entire production process, including the stage of filling the finished product. It is possible to control a real production line provided that communication with an industrial controller, sensors and actuators is installed and configured. A set of technological equipment has been selected for each production stage, from processing raw materials down to functional soft drink packaging. Recommendations have been given for water treatment, including a number of technological processes such as rough water purification and disinfection. There are instructions for preparing sugar syrup using the hot method. The characteristics of the recommended plant for hydrodynamic extraction from plant raw materials have been presented. A factor intensifying the extraction process is the pulsating action of the extractant between the solid and liquid phases. Production of a functional drink implies the use of water as an extractant, so additional methods of intensifying mass transfer processes have been considered in order to increase the percentage of extraction of target components. The use of ultrasonic, electric, pulse and discrete-pulse fields acting on raw materials is considered the most promising method for intensifying the extraction process.
6

KRISHNA MOHAN, K., A. K. VERMA, A. SRIVIDYA, and LJUBISA PAPIC. "INTEGRATION OF BLACK-BOX AND WHITE-BOX MODELING APPROACHES FOR SOFTWARE RELIABILITY ESTIMATION." International Journal of Reliability, Quality and Safety Engineering 17, no. 03 (June 2010): 261–73. http://dx.doi.org/10.1142/s0218539310003792.

Abstract:
Quality of a software product should be tracked during the software lifecycle, right from the architectural phase to the operational phase. Heterogeneous systems consist of several globally distributed components, thus rendering their reliability evaluation more complex with respect to conventional methods. The objective of our work is to expand the evaluation process for effective reliability analysis by using both white-box and black-box approaches at the prototype and module/component level before the actual development. In this paper, black-box testing is based on non-functional requirements for early quantitative analysis of the reliability of the application under development, based on the output results of the prototype development. White-box testing is based on inter-component interactions, which deal with probabilistic software behavior; it uses an internal perspective of the system to design test cases based on internal structure at the requirements and design phases. The approach has been applied for effective reliability quantification analysis at the prototype level of a financial application case study, with both non-functional test data of Software Development Life Cycle (SDLC) phases captured from a defect consolidation table in the form of orthogonal defect classification, as well as functional requirements at the requirements and design phases captured through software architectural modeling paradigms.
7

Гордєєв, Олександр Олександрович. "Models and Quality Evaluation of Software Interface Usability for Human-Computer Interaction" [in Ukrainian]. RADIOELECTRONIC AND COMPUTER SYSTEMS, no. 3 (September 28, 2020): 84–96. http://dx.doi.org/10.32620/reks.2020.3.09.

Abstract:
The software quality model describes software quality in terms of non-functional requirements. The most well-known and authoritative quality model, ISO/IEC 25010, includes 8 related characteristics: functionality, performance, compatibility, usability, reliability, security, maintainability, and portability. The article's material is limited to software quality in terms of the usability characteristic. The usability characteristic of software should include subcharacteristics inherent in the quality of the user interface, on the one hand, as a static object, and on the other hand, subcharacteristics of the process of interaction with the user, i.e. human-computer interaction. Existing quality models and usability assessments do not combine the quality elements of the user interface itself and the user experience. The article proposes models for the quality and quality assessment of the usability of the software interface of human-computer interaction, which combine the characteristics inherent directly in the user interface and the characteristics of human-computer interaction. These models are interconnected through a single nomenclature of subcharacteristics. The model for assessing the quality of software usability consists of two parts and includes many metrics and indicators that correspond to the indicated subcharacteristics. The purpose of the article is to develop a quality model of the usability of the software interface of human-computer interaction and a corresponding model for assessing its quality, which combines the subcharacteristics of the quality of the user interface and the subcharacteristics of the quality of its interaction with the user. The object of the research is the subcharacteristics of the usability of the software interface of human-computer interaction. The idea of developing the model is based on the results of the analysis of the following standards: ISO/IEC 25010, ISO/IEC 25022, and ISO/IEC 25023.
The provisions (subcharacteristics and metrics) of these standards were taken into account when forming the main material of this article. The taxonomy of metrics and indicators was formed by combining metrics from ISO/IEC 25022, ISO/IEC 25023, and proprietary metrics. As a result, this paper proposes a model for the quality of the usability of the software interface of human-computer interaction and a model for assessing that quality.
8

Skalík, Lukáš, and Otília Lulkovičová. "A Software Optimization of the Solar Energy System Performance." Advanced Materials Research 899 (February 2014): 199–204. http://dx.doi.org/10.4028/www.scientific.net/amr.899.199.

Abstract:
In the balance of heat use and heat consumption of the energy complex of the Slovak national economy, the energy demand of buildings represents the second largest savings potential. Buildings' complex energy demand is the sum of the total investment input to ensure thermal protection and the annual operational demands of the particular energy systems over their lifetime in the building. The application of energy systems based on thermal solar systems reduces the energy consumption and operating costs of a building for space heating support and domestic hot water, as well as saving non-renewable fossil fuels. A correctly designed solar energy system depends on many characteristics, i.e. appropriate solar collector area and tank volume, collector tilt and orientation, as well as the quality of the components used. The evaluation of thermal solar system components by calculation software shows how the performance of the original thermal solar system can be improved. The system performance can be improved by more than 31 % compared with the given system by changing four thermal solar system parameters: the heat loss coefficient and aperture area of the solar collector used, the storage tank volume, and its height-to-diameter ratio.
9

Aouzal, Khadija, Hatim Hafiddi, and Mohamed Dahchour. "Policy-Driven Middleware for Multi-Tenant SaaS Services Configuration." International Journal of Cloud Applications and Computing 9, no. 4 (October 2019): 86–106. http://dx.doi.org/10.4018/ijcac.2019100105.

Abstract:
The multi-tenancy architecture allows software-as-a-service applications to serve multiple tenants with a single instance. This is beneficial as it leverages economies of scale. However, it does not cope with the specificities of each tenant and their variability; notably, the variability induced in the required quality levels that differ from a tenant to another. Hence, sharing one single instance hampers the fulfillment of these quality levels for all the tenants and leads to service level agreement violations. In this context, this article proposes a policy-driven middleware that configures the service according to the non-functional requirements of the tenants. The adopted approach combines software product lines engineering and model driven engineering principles. It spans the quality attributes lifecycle, from documenting them to annotating the service components with them as policies, and it enables dynamic configuration according to service level agreements terms of the tenants.
10

Shaw, Mary. "Myths and mythconceptions: what does it mean to be a programming language, anyhow?" Proceedings of the ACM on Programming Languages 4, HOPL (June 14, 2020): 1–44. http://dx.doi.org/10.1145/3480947.

Abstract:
Modern software does not stand alone; it is embedded in complex physical and sociotechnical systems. It relies on computational support from interdependent subsystems as well as non-code resources such as data, communications, sensors, and interactions with humans. Both general-purpose programming languages and mainstream programming language research focus on symbolic notations with well-defined abstractions that are intended for use by professionals to write programs that solve precisely specified problems. There is a strong emphasis on correctness of the resulting programs, preferably by formal reasoning. However, these languages, despite their careful design and formal foundations, address only a modest portion of modern software and only a minority of software developers. Several persistent myths reinforce this focus. These myths express an idealized model of software and software development. They provide a lens for examining modern software and software development practice: highly trained professionals are outnumbered by vernacular developers. Writing new code is dominated by composition of ill-specified software and non-software components. General-purpose languages may be less appropriate for a task than domain-specific languages, and functional correctness is often a less appropriate goal than overall fitness for task. Support for programming to meet a specification is of little help to people who are programming in order to understand their problems. Reasoning about software is challenged by uncertainty and nondeterminism in the execution environment and by the increasingly dominant role of data, especially with the advent of systems that rely on machine learning. The lens of our persistent myths illuminates the dissonance between our idealized view of software development and common practice, which enables us to identify emerging opportunities and challenges for programming language research.
11

Chaliasos, Stefanos, Thodoris Sotiropoulos, Georgios-Petros Drosos, Charalambos Mitropoulos, Dimitris Mitropoulos, and Diomidis Spinellis. "Well-typed programs can go wrong: a study of typing-related bugs in JVM compilers." Proceedings of the ACM on Programming Languages 5, OOPSLA (October 20, 2021): 1–30. http://dx.doi.org/10.1145/3485500.

Abstract:
Despite the substantial progress in compiler testing, research endeavors have mainly focused on detecting compiler crashes and subtle miscompilations caused by bugs in the implementation of compiler optimizations. Surprisingly, this growing body of work neglects other compiler components, most notably the front-end. In statically typed programming languages with rich and expressive type systems and modern features, such as type inference or a mix of object-oriented and functional programming features, the process of static typing in compiler front-ends is complicated and prone to a high density of bugs. Such bugs can lead to the acceptance of incorrect programs (breaking code portability or the type system's soundness), the rejection of correct (e.g. well-typed) programs, and the reporting of misleading errors and warnings. We conduct what is, to the best of our knowledge, the first empirical study for understanding and characterizing typing-related compiler bugs. To do so, we manually study 320 typing-related bugs (along with their fixes and test cases) that are randomly sampled from four mainstream JVM languages, namely Java, Scala, Kotlin, and Groovy. We evaluate each bug in terms of several aspects, including its symptom, root cause, the size of the bug fix, and the characteristics of the bug-revealing test cases. Some representative observations indicate that: (1) more than half of the typing-related bugs manifest as unexpected compile-time errors: the buggy compiler wrongly rejects semantically correct programs, (2) the majority of typing-related bugs lie in the implementations of the underlying type systems and in other core components related to operations on types, (3) parametric polymorphism is the most pervasive feature in the corresponding test cases, (4) one third of typing-related bugs are triggered by non-compilable programs.
We believe that our study opens up a new research direction by driving future researchers to build appropriate methods and techniques for a more holistic testing of compilers.
12

CANGUSSU, JOÃO W., KENDRA COOPER, and W. ERIC WONG. "A SEGMENT BASED APPROACH FOR THE REDUCTION OF THE NUMBER OF TEST CASES FOR PERFORMANCE EVALUATION OF COMPONENTS." International Journal of Software Engineering and Knowledge Engineering 19, no. 04 (June 2009): 481–505. http://dx.doi.org/10.1142/s0218194009004283.

Abstract:
Component-based software development techniques are being adopted to rapidly deploy complex, high quality systems. One of its aspects is the selection of components that realize the specified requirements. In addition to the functional requirements, the selection must be done taking into account some non-functional requirements such as performance, reliability, and usability. Hence, data that characterize the non-functional behavior of the components is needed; a test set is needed to collect this data for each component under consideration. This set may be large, which results in a considerable increase in the cost of the development process. Here, a process is proposed to considerably reduce the number of test cases used in the performance evaluation of components. The process is based on sequential curve fittings from an incremental number of test cases until a minimal pre-specified residual error is achieved. The incremental selection of test cases is done in two different ways: randomly and adaptively. The accuracy and performance of the proposed approach are dependent on the values of the desired residual error. The smaller the residual error, the higher the accuracy. However, performance has an opposite behavior. The smaller the error, the larger the number of test cases needed. The results from experiments with image compression components are a clear indication that a reduction in the number of test cases can be achieved while maintaining reasonable accuracy when using the proposed approach.
13

Golovin, S. A., S. V. Zykov, Yu P. Korablin, and D. A. Kryukov. "Application of high-level methods of compromise optimization for control of autonomous robotized open pit mining." Russian Technological Journal 8, no. 5 (October 20, 2020): 7–18. http://dx.doi.org/10.32362/2500-316x-2020-8-5-7-18.

Abstract:
In most software engineering approaches, software design begins with defining functional requirements, which is well suited to web-based software development projects. When designing highly critical large-scale software intended for industrial use, accounting for non-functional software requirements is also required. The main idea of the proposed document-oriented approach is to design a stable architectural solution as early as possible, taking into account the non-functional characteristics of the software: reliability, security, maintainability and performance (quality attributes). At the same time, the key issue is the coordination of functional requirements, taking into account technical limitations and business requirements, achieved through the steady interaction of customer and developer teams. To increase the flexibility of the designed solutions and prevent crisis situations when developing highly critical large-scale software, it is proposed to use an approach integrating the architecture-centric design method (ACDM) and the architecture tradeoff analysis method (ATAM) with an enterprise architecture matrix (EAM). This makes it possible to obtain a result adequate to the required level of responsibility and reliability. Consideration of quality attributes within the framework of the tradeoff analysis method makes it possible to select and make certain decisions in software design, taking into account the scale of the software and its scope. The main attributes of product quality are highlighted (the ISO 25010 standard), and critical scenarios are defined for each of them (templates and use cases). The use of these templates for detailed software design with the necessary parameters of functional requirements, business conditions and technological limitations reduces the risk of developing unpredictable and uncertain system behavior.
Based on the proposed approach, an architectural solution is presented for highly critical, responsible, large-scale software for managing autonomous robotic open-pit mining of minerals. Critical attributes for creating the specified software were identified and ranked, and the architecture of the solution according to the SWEBOK software development standard was described. Further, taking into account the nature, scale and scope of the software solution, recommendations are given on high-level architectural templates for the system design, including layers, pipelines and microservices. The proposed architecture-oriented development method is suitable for industrial-level software in various subject areas.
14

Santoso, Agus Dwi, Ferry Budi Cahyono, Widyo Tri Laksana, and Yosi Nurfalah. "Analysed power quality of electrical system ahts vessel." Research, Society and Development 9, no. 8 (July 31, 2020): e836986151. http://dx.doi.org/10.33448/rsd-v9i8.6151.

Abstract:
A number of power quality issues, including electrical harmonics, poor power factor, and voltage instability and imbalance, impact the efficiency of electrical equipment. To ensure that equipment operates optimally, the electrical power generated must be of good quality. This study examines the power quality of the electrical system of an AHTS (Anchor Handling Tug Supply) vessel. The research was carried out using a quantitative method, with software-based computational techniques for Total Harmonic Distortion (THD) characteristics and Electrochemical Impedance Spectroscopy (EIS). The analysis indicated that the electrical system of VM INPOSH REGENT has very good power quality in its utility systems during the full-way condition; during maneuvering, however, there were particularly high harmonics from the non-linear devices used for motor control, which serve a propulsion function. Based on these results, it can be concluded that, after filtering to correct the harmonics to within the limits of the ABS standard, the system performs very well during longside maneuvering and maneuvering on the DP system.
15

Kostinskiy, Sergey, Daniel Shaikhutdinov, and Nuri Narakidze. "Loss Counter in Power Double Winding Transformers Implementing the Method of Conditionally Constant Coefficients in Online Mode Using the Information Platform." Известия высших учебных заведений. Электромеханика 63, no. 5 (2020): 79–85. http://dx.doi.org/10.17213/0136-3360-2020-5-79-85.

Abstract:
The article describes the developed loss counter for power double-winding transformers of distribution networks. The principle of operation of the proposed measuring unit of the loss counter is considered using the example of one channel for measuring current and voltage, in accordance with the functional scheme given above. The appearance and main characteristics of the loss counter are given. The main characteristics of the developed loss counter are confirmed by the adjustment protocol based on the results of measurements at the FBU «Rostov CSM» and are at the level of the existing characteristics of electric energy quality analyzers. The data measured and calculated by the microcontroller of the loss counter, necessary for implementing the method of conditionally constant coefficients, are transmitted over the USB bus to a single-board computer. Software developed in the high-level general-purpose programming language Python forms the data and sends them to the Yandex.Cloud information platform. The operation of the loss counter has been tested on the developed physical model «power transformer with asymmetric and non-sinusoidal load» and on operating substations. Thus, the developed loss counter can be used to calculate losses from higher harmonic components in power transformers by the method of conditionally constant coefficients in online mode, using the resources of the Yandex.Cloud information platform.
16

Gvozdev, V. E., O. Ya Bezhaeva, D. R. Akhmetova, and G. R. Safina. "Formation of project management model parameters based on linearization of functional dependencies." Ontology of designing 10, no. 4 (December 30, 2020): 527–39. http://dx.doi.org/10.18287/2223-9537-2020-10-4-527-539.

Abstract:
At present, the quality of information support for management is becoming a critical factor in the implementation of the provisions of the Industry 4.0 doctrine, due to which the need to improve the theoretical provisions for managing organizational defects in the implementation of projects for creating hardware and software components of the digital ecoenvironment becomes especially significant. The paper considers a formal model that creates the basis for the formation of a balanced system of the main characteristics of the project, for the case when satisfaction with the properties of the product on the part of the customer and satisfaction with the progress of the project on the part of the contractor are equally important. The basis for the formation of a balanced system of project characteristics is its consideration as a static multi-connected control object. Empirical functional dependencies correspond to direct and cross connections between the input and output parameters of the object. A feature of constructing empirical models is the use of both actual data on budgets and the duration of previously implemented projects, and subjective expert assessments of project participants. The procedure for forming a balanced system of project characteristics is formalized, which makes it possible to automate it. The proposed approach makes it possible to increase the validity of decisions on the feasibility of implementing the project by the forces of the proposed contractor, taking into account the priority of the budget and the duration of the project for the customer.
APA, Harvard, Vancouver, ISO, and other styles
17

Slyusar, Ekaterina Nikolaeva, and Alyona Anatolyevna Govorukhina. "CORRELATIONS BETWEEN PHYSICAL AND MENTAL LIFE QUALITY COMPONENTS WITH PSYCHOSOMATIC PARAMETERS IN WOMEN WORKING IN THE OIL INDUSTRY." Психология. Психофизиология 12, no. 3 (December 4, 2019): 93–100. http://dx.doi.org/10.14529/jpps190309.

Full text
Abstract:
Aim. The article deals with assessing professional risk factors and establishing the correlation between mental life quality components and psychosomatic parameters, which is a relevant issue in the psychophysiology of professional activity. Materials and methods. 102 female engineers participated in the study. The women belonged to two different age groups and differed in their duration of residence in the North. Obtaining the informed consent of all female participants was a mandatory requirement for inclusion in the study. The MOS-SF-36 questionnaire was used for the subjective assessment of the physical and mental components of life quality. Heart rate variability was studied with the VNS-spectrum equipment; body composition was analyzed with the VS-601 analyzer; the vascular wall was quantified with the Angioscan-01 device. Correlation analysis was conducted with the non-parametric Spearman coefficient. The level of significance was set at p < 0.05. The data obtained were processed with the Statistica v. 10.0 software. Results. It was revealed that adaptation to the climatic and ecological conditions of the North changed the character and number of intersystem correlations between mental life quality components and psychosomatic parameters. In young female engineers living in the North for less than 10 years, functional correlations between physical function, pulse wave contour analysis data, and psychosomatic indicators were revealed. With increasing duration of living in northern conditions, the number of correlations with physical and mental life components decreased. Conclusion. The change in physical and mental life quality components during adaptation to northern conditions is determined by the physiological reactions of the body.
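The correlation analysis in this abstract relies on Spearman's non-parametric coefficient. As a hedged illustration (not the authors' code, and using toy data rather than the study's measurements), a pure-stdlib Python sketch of the rank correlation with average ranks for ties:

```python
# Illustrative sketch: Spearman's rank correlation as Pearson's r on ranks.
from statistics import mean

def _ranks(xs):
    # Assign average ranks, so tied values share the same rank.
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # ranks are 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    rx, ry = _ranks(x), _ranks(y)
    mx, my = mean(rx), mean(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# A perfectly monotone (but non-linear) relationship gives rho = 1.
print(round(spearman([1, 2, 3, 4, 5], [1, 4, 9, 16, 25]), 6))  # → 1.0
```

Because only ranks enter the formula, the coefficient captures monotone relationships even when they are non-linear, which is why it suits physiological data.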
APA, Harvard, Vancouver, ISO, and other styles
18

PODVINTSEV, A. V., and A. S. DUDAREV. "APPLICATION OF THE DEFORM SOFTWARE FOR MODELING THE DRILLING PROCESS OF COMPOSITE MATERIALS BASED ON THE FINITE ELEMENT METHOD." Fundamental and Applied Problems of Engineering and Technology, no. 4 (2021): 175–82. http://dx.doi.org/10.33979/2073-7408-2021-348-4-175-182.

Full text
Abstract:
This article presents the solution of a complex coupled problem of numerical simulation of the drilling of polymer composite materials based on the finite element method (FEM), using the Deform engineering package. The subject of the study was the functional characteristics of the drilling process: the components of the cutting force and the temperature in the cutting zone, depending on different combinations of the controlled parameters of the processing mode and the geometric parameters of the drills, taking into account the specific properties of polymer composite materials. The value of the axial cutting force during drilling is important, since it is used to calculate the drive power of technological equipment and to check the strength of the cutting tool, while the cutting temperature allows predicting the quality of the shaped holes. The power and thermal characteristics can be obtained by different methods, theoretical and practical, for example, in the course of a full-scale experiment. A full-scale experiment, however, requires serious preparation and a laboratory base. For a theoretical solution, it is necessary to know and apply the mathematical apparatus or to use modern engineering packages. The latter is done in this work.
APA, Harvard, Vancouver, ISO, and other styles
19

Li, Fan, Chun-Hsien Chen, Ching-Hung Lee, and Li-Pheng Khoo. "A User Requirement-driven Approach Incorporating TRIZ and QFD for Designing a Smart Vessel Alarm System to Reduce Alarm Fatigue." Journal of Navigation 73, no. 1 (July 2, 2019): 212–32. http://dx.doi.org/10.1017/s0373463319000547.

Full text
Abstract:
Alarm fatigue is a critical safety issue, as it can increase workload and impair operators' situational awareness. This paper proposes a design methodology to enhance the interaction between alarm systems and operators. Taking input from vessel traffic service (VTS) personnel as the fundamental design requirements, a user requirement-driven design framework is proposed. It integrates quality function deployment, the theory of inventive problem solving, and software quality characteristics into three design phases. In Phase I, user requirements are obtained from an analysis of current working processes. Phase II investigates the specific non-functional design requirements of vessel alarm systems and their contradictions. In Phase III, the innovative principles generated with the contradiction matrix are analysed. A case study was conducted to verify and illustrate the framework, resulting in a conceptual design of a smart vessel alarm system.
APA, Harvard, Vancouver, ISO, and other styles
20

Dovgun, Valery, Sergei Temerbaev, Maxim Chernyshov, Viktor Novikov, Natalia Boyarskaya, and Elena Gracheva. "Distributed Power Quality Conditioning System for Three-Phase Four-Wire Low Voltage Networks." Energies 13, no. 18 (September 19, 2020): 4915. http://dx.doi.org/10.3390/en13184915.

Full text
Abstract:
This paper presents a distributed power quality conditioning system to compensate current and voltage distortions in three-phase four-wire networks caused by unbalanced non-linear single-phase loads. The proposed conditioning system consists of several hybrid filter units installed in various nodes to compensate excessive neutral currents and voltage distortions in a selected area of the distribution network. The system is open and can easily be modified by installing new filter units. A novel hybrid filter design procedure is presented, based on optimizing the filter frequency characteristics over the parameters of the passive and active parts. A digital system for computing the control signals of the active filters, based on modern methods of spectral analysis, is considered. The proposed digital control system enables selective compensation of the fundamental and harmonic components. A mathematical model of the proposed power quality conditioning system was developed in MATLAB. The simulation results show that the presented conditioning system reduces the level of neutral conductor currents and the voltage unbalance, as well as ensuring harmonic compensation in three-phase four-wire networks.
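The selective compensation of fundamental and harmonic components described in this abstract rests on spectral analysis of the measured currents. A minimal sketch of that idea, assuming a single-bin DFT over one fundamental period (illustrative only, not the authors' control algorithm):

```python
# Illustrative sketch: extract the amplitude of a selected harmonic from
# one period of a sampled waveform with a single-bin DFT, as a
# selective-compensation controller might do before synthesizing a
# cancelling current.
import cmath
import math

def harmonic_amplitude(samples, k):
    """Amplitude of the k-th harmonic over one fundamental period."""
    n = len(samples)
    s = sum(samples[i] * cmath.exp(-2j * math.pi * k * i / n) for i in range(n))
    return 2 * abs(s) / n

# Fundamental (amplitude 10) plus a 3rd harmonic (amplitude 2).
n = 256
wave = [10 * math.sin(2 * math.pi * i / n) + 2 * math.sin(2 * math.pi * 3 * i / n)
        for i in range(n)]
print(round(harmonic_amplitude(wave, 1), 3))  # → 10.0
print(round(harmonic_amplitude(wave, 3), 3))  # → 2.0
```

A real controller would run such bins continuously (or use a sliding Goertzel/DFT) and feed the selected components to the active filter's reference generator.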
APA, Harvard, Vancouver, ISO, and other styles
21

Ivanov, V. M. "Simulation Model of the Cutting Speed Stabilization System for CNC Metal-Cutting Machine Tools." Mekhatronika, Avtomatizatsiya, Upravlenie 21, no. 2 (February 10, 2020): 110–16. http://dx.doi.org/10.17587/mau.21.110-116.

Full text
Abstract:
In this paper a simulation model of cutting speed stabilization for numerically controlled machines was developed. The cutting speed stabilization mode solves a number of technological problems, including increasing machining productivity, improving the quality of the part surface, and increasing the durability of the cutting tool. Access to the functions of the basic software of CNC systems is limited. Taking this into account, this paper considers the functional and algorithmic features of power parameter stabilization systems as the basis for the further development of intelligent algorithmic support and software. The existing guidance on cutting conditions is based on empirical dependencies, which are difficult to apply directly in algorithms for automatic control of the cutting process, since these dependencies determine the predicted, not the current, parameters. A non-stationary model was adopted as the basic structure of the part-shaping process; its main non-stationary parameters are determined by the three-dimensional kinematics of the universal machine. The generalized approach to power parameter systems and the synthesis of the regulators of their main circuits made it possible to identify the simplest variant of the structures, based on regulators with parametric feedback. The functional model contains the main components of the CNC system: an interpolator, servo drives of the feeds and the main motion, as well as additional cycle and analysis modules. High-speed machining of end surfaces and the conditions for implementing cyclic control tasks are considered. Simulation results confirming the performance of the functional algorithms and the possibility of their use in intelligent CNC systems are presented.
APA, Harvard, Vancouver, ISO, and other styles
22

Smirnova, Ludmila, Gennadiy Ponomarenko, and Vadim Suslyaev. "Methodology and information-measuring system for personalized synthesis of lower limb prostheses." Information and Control Systems, no. 6 (December 16, 2021): 64–74. http://dx.doi.org/10.31799/1684-8853-2021-6-64-74.

Full text
Abstract:
Introduction: One of the methods for managing the quality of prosthetics is optimizing the composition of the components of a modular prosthesis. Mistakes in choosing models for the functional modules of a prosthesis lead either to a limited realization of the patient's potential capabilities or to the choice of expensive, highly functional models whose potential cannot be fully realized with the given body system disabilities. One of the most effective ways to solve this problem is to use the capabilities of computer technology. Purpose: Substantiation of the methodology for the development of an innovative computer technology for personalized synthesis of a lower-limb prosthesis, including the development of the structure of an information-measuring system for its implementation. Methods: Analysis, synthesis, analogy; expert survey; analytic hierarchy process (Saaty method). The conceptual language of the International Classification of Functioning, Disability and Health was used to describe the factors influencing the requirements for the characteristics of prosthetic modules. Results: In order to choose models for prosthetic modules, an extended system of factors should be used, including both the basic factors associated with the purpose of the products and indicated in the catalogs, and additional factors: impairment indicators of the body functions and structures, the capacity and performance of the patient's activity and participation, and the presence of environmental barriers and facilitators where the prosthesis is planned to be used. Taking this system of factors into account, a structural diagram of an information-measuring system for examining a prosthetic patient has been developed. To select the components for the prosthesis, we have substantiated the necessity of creating a global electronic catalog combining structured information on the models of prosthetic modules supplied by various manufacturers.
A matrix representation form is proposed for the knowledge base, reflecting the rules for choosing models according to the correspondence of their characteristics to the estimates of the factors. The methodology of computerized selection of models from the electronic catalog has been substantiated. Practical relevance: The results of the work are a step towards the creation of a technology for a computerized multicriteria choice of components for a modular prosthesis, taking into account the personal needs and functional capabilities of the patient. The use of this technology will improve the patient's rehabilitation level and the quality of his or her life.
APA, Harvard, Vancouver, ISO, and other styles
23

Ornatsky, D. P., O. O. Krivokulska, O. O. Burbela, and O. D. Bliznyuk. "Measuring System for Non-Destructive Testing of Metal Rods." Metrology and instruments, no. 2 (May 21, 2020): 22–24. http://dx.doi.org/10.33955/2307-2180(2)2020.22-24.

Full text
Abstract:
The control of the parameters of metal products using the eddy-current method of non-destructive testing, based on the law of electromagnetic induction, is now widespread. This is due to its high sensitivity over a wide frequency range, the ability to control the mechanical properties and uniformity of both magnetic and non-magnetic materials, contactless operation, high reliability, and the automation of process control. The object of study is the process of interaction of external electromagnetic fields with structural-inhomogeneity defects in a metal rod, which distort the eddy currents and, accordingly, affect the inductance of the sensor coil. According to the law of electromagnetic induction, the eddy currents induced by an external electromagnetic field generate their own field that opposes the external field, which leads to a change in the inductance of the sensor coil. Therefore, the most informative parameter in this case is the relative change in the inductance of the sensor. Known designs use differential transformer sensors of the transmission type, which are complex to implement but have high sensitivity. Existing works pay insufficient attention to improving metrological characteristics. Modern eddy-current flaw detectors are overwhelmingly intended for scientific research, while little attention is paid to tools that can be used in industrial processes, owing to the complexity of the measurement process in the existing devices and the large volume of software needed for automatic processing of information. This work presents a system for non-destructive testing of metal rods free of the above-mentioned disadvantages, which provides high metrological characteristics over a wide frequency range and separate measurement of the impedance components of the sensor, thereby reducing the methodological errors in determining the main characteristics of the output signal of eddy-current sensors. 
The scientific result is a measuring system based on an electronic model of an eddy-current sensor with high metrological characteristics, which makes it possible to generate, in real time, a signal proportional to the amount of damage and thereby to increase the productivity of quality control of metal rod products in a production environment.
APA, Harvard, Vancouver, ISO, and other styles
24

Mohamed, B. Asan, and P. Janaki. "Determination of active ingredients in commercial insecticides using spectral characteristics of Fourier transform infrared spectroscopy (FTIR)." Journal of Applied and Natural Science 13, SI (July 19, 2021): 110–23. http://dx.doi.org/10.31018/jans.v13isi.2809.

Full text
Abstract:
Pesticides have become a basic necessity for crop production, which may be attributed to the rapidly expanding population and the resulting pressure on the food production industry. Fourier transform infrared (FTIR) spectroscopy requires minimal sample preparation and is a less time-consuming, simple, fast, non-destructive and environmentally friendly infrared-based method. It makes use of the Smart iTR window and of pellets on the OMNIC transmission window. In FTIR, the peaks formed for the representative sample lie between wavenumbers 800 cm-1 and 4000 cm-1, plotted against % transmittance. The FTIR spectra obtained for the pesticide formulations were on par with the NIST (National Institute of Standards and Technology) spectral library. Comparing the commercial-grade spectra with the Spectrabase, the NIST library and the Bio-Rad software showed the peak ranges for the different functional groups of the compounds, which can be examined with the KnowItAll software's ProcessItIR and AnalyseItIR tools to obtain the active principle of each peak and the peak intensities. This method can be viewed as a genuine alternative to the long and tedious chromatographic methods usually recommended for quality control of commercially available pesticide formulations, and as a check for adulterated formulations that harm agricultural produce.
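Matching a measured spectrum's peaks against a reference library entry, as this abstract describes, can be sketched as follows. The spectra, threshold and tolerance here are toy assumptions, not FTIR data or the KnowItAll/NIST matching algorithms:

```python
# Illustrative sketch: locate transmittance dips (absorption peaks) and
# score how many library peaks the sample reproduces within a tolerance.
def find_peaks(wavenumbers, transmittance, threshold=80.0):
    """Local minima of % transmittance below a threshold count as peaks."""
    peaks = []
    for i in range(1, len(transmittance) - 1):
        t = transmittance[i]
        if t < threshold and t < transmittance[i - 1] and t < transmittance[i + 1]:
            peaks.append(wavenumbers[i])
    return peaks

def match_score(sample_peaks, library_peaks, tol=10.0):
    """Fraction of library peaks found in the sample within tol cm^-1."""
    hits = sum(any(abs(s - l) <= tol for s in sample_peaks) for l in library_peaks)
    return hits / len(library_peaks)

wn = [800, 900, 1000, 1100, 1200, 1300, 1400]
tr = [95, 60, 92, 94, 55, 93, 96]          # dips at 900 and 1200 cm^-1
peaks = find_peaks(wn, tr)
print(peaks)                                # → [900, 1200]
print(match_score(peaks, [905, 1195]))      # → 1.0
```

Real spectral-library software adds baseline correction and full-spectrum similarity metrics, but the peak-position comparison above is the core idea.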
APA, Harvard, Vancouver, ISO, and other styles
25

Selisteanu, Dan, Monica Roman, Lucian Mandache, Razvan Prejbeanu, Sergiu Ivanov, and Alexandru Radu. "Three-Level Inverter Control Techniques: Design, Analysis, and Comparisons." Elektronika ir Elektrotechnika 27, no. 3 (June 28, 2021): 26–37. http://dx.doi.org/10.5755/j02.eie.29015.

Full text
Abstract:
This work addresses the analysis and design of various Proportional-Integral-Derivative (PID) control techniques for a three-level inverter. Multilevel power converters are modern and basic elements of high-voltage electric drive and power supply systems. By using simulations and specific computer-aided design tools, the overall functional characteristics of multilevel converters, as well as the electrical demands of the components, can be accurately assessed to obtain an appropriate control solution. An innovative and detailed software model of a three-level inverter is developed and then used for the implementation of control techniques. Several tuning methods are used to tune PID controllers for two specific cases: the multilevel inverter with a linear load and with an asynchronous motor load, respectively. A detailed analysis and comparisons of the quality criteria and control performance are achieved. This analysis shows that the choice of controller type depends on the inverter load. For the linear load, proper results are obtained with a PI Nichols-tuned controller, and for the asynchronous load, with a PI controller tuned via a modified Hokushin method. The computer-aided design tools can be further used for the simulation of the equipment in various operating conditions, normal and fault, following all functional parameters.
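The abstract above compares several PID tuning methods, among them Nichols-type rules. A minimal sketch of the classic closed-loop Ziegler-Nichols table and a discrete PID step, in a generic textbook form (not the paper's inverter model; the gains and sample values are hypothetical):

```python
# Illustrative sketch: Ziegler-Nichols closed-loop tuning maps the
# ultimate gain Ku and ultimate period Tu to PID gains.
def zn_pid_gains(ku, tu):
    # Classic "PID" row: Kp = 0.6*Ku, Ti = Tu/2, Td = Tu/8.
    kp = 0.6 * ku
    ki = kp / (tu / 2)   # Ki = Kp / Ti
    kd = kp * (tu / 8)   # Kd = Kp * Td
    return kp, ki, kd

class PID:
    """Discrete PID controller with rectangular integration."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, setpoint, measurement):
        err = setpoint - measurement
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

kp, ki, kd = zn_pid_gains(ku=2.0, tu=0.5)
print(kp, ki, kd)  # → 1.2 4.8 0.075
```

As the abstract notes, such rule-based gains are only a starting point; the appropriate rule (and whether the D term is used at all) depends on the load on the inverter.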
APA, Harvard, Vancouver, ISO, and other styles
26

Gvozdev, V. E., O. Ya Bezhaeva, D. R. Akhmetova, and G. R. Safina. "Assessment of the project budget according to the criterion of the actors' satisfaction." Ontology of Designing 11, no. 3 (September 30, 2021): 382–92. http://dx.doi.org/10.18287/2223-9537-2021-11-3-382-392.

Full text
Abstract:
In the project management manuals for the creation of software components of computing and communication systems, it is noted that the quality of the results is determined by the validity of organizational decisions related to project management. It is emphasized that organizational mistakes made at the initial stages of the project, including insufficient justification of the project budget, have serious consequences. The novelty of this work lies in the assertion that the quality of the project results is equally determined not only by the satisfaction of consumers / customers, but also by the satisfaction of the developers with the progress of the project. A scheme for assessing the possible boundaries of the project budget, in which it is possible to obtain a solution acceptable both for consumers / customers and for the executors of the project, is proposed. The basis of the scheme is the use of the experience of consumers and developers in the implementation of projects with similar content. The model basis of the scheme is formed by non-linear empirical functional dependencies between the indicators of actor satisfaction, the project budget and the duration of its implementation. Improving the validity of the budget assessment, taking into account the experience of the actors, is a prerequisite for reducing the organizational defects of the project, which makes it possible to achieve the required quality of the infrastructure components of computing and communication systems.
APA, Harvard, Vancouver, ISO, and other styles
27

Syryamkin, V. I., E. N. Bogomolov, and M. S. Kutsov. "Computer-Aided Design of X-Ray Microtomographic Scanners." Advanced Materials Research 1033-1034 (October 2014): 1327–30. http://dx.doi.org/10.4028/www.scientific.net/amr.1033-1034.1327.

Full text
Abstract:
The article studies the development of computer-aided design (CAD) of X-ray microtomographic scanners: devices for investigating the structure and constructing three-dimensional images of organic and inorganic objects on the basis of shadow projections. The article provides basic information regarding the CAD of X-ray microtomographic scanners and a scheme consisting of three levels. It also presents the basic relations of X-ray computed tomography and the generalized scheme of an X-ray microtomographic scanner. The methods of X-ray imaging of the spatial microstructure and morphometry of materials are described. The main characteristics of an X-ray microtomographic scanner, the X-ray source, the X-ray optical elements and the mechanical components of the positioning system are shown. The block scheme and the software functional scheme of an intelligent neural-network system for analyzing the internal microstructure of objects are presented. The method for choosing the design parameters of the CAD of X-ray microtomographic scanners is aimed at improving design quality and reducing its cost. It is expected to reduce the design time and the growing number of engineers involved in the development and construction of X-ray microtomographic scanners.
APA, Harvard, Vancouver, ISO, and other styles
28

Wolde, Behailu Getachew, and Abiot Sinamo Boltana. "REST API Composition for Effective Testing the Cloud." Journal of Applied Research and Technology 19, no. 6 (December 31, 2021): 676–93. http://dx.doi.org/10.22201/icat.24486736e.2021.19.6.924.

Full text
Abstract:
Cloud offers many ready-made REST services to end users. This offer realizes service composition through implementations somewhere on the internet governed by a Service Level Agreement (SLA). For ensuring this SLA, software testing is a useful means of attesting the non-functional requirements that guarantee quality assurance from the end user's perspective. However, the test engineer experiences only what goes in and out through an interface that exposes high-level behaviors separated from their underlying details. Testing these behaviors becomes an issue for classical testing procedures. REST API composition is therefore an alternative, promising approach for modeling behaviors with parameters against the cloud. This approach helps to devise test effectiveness in terms of REST-based, behavior-driven implementation. It aims to understand functional behaviors through API methods based on input domain modeling (IDM) on the standard keyboard pattern. By making an effective REST design, the test engineer sends complete test inputs directly to the application's API and gets test responses from the infrastructure. We consider the NEMo mobility API specification to design an IDM, which represents a pattern match of the mobility-search URL API path scope. Within this scope, sample mobility REST API service compositions are used. Then, test assertions are implemented to validate each path resource, to test the components and the end-to-end integration of the specified service.
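The path composition and test assertions described above can be sketched roughly as follows. The path, parameters and response shape here are hypothetical, not taken from the NEMo specification; a canned JSON document stands in for the cloud service under test:

```python
# Illustrative sketch: compose a REST query path from an IDM test point
# and run assertion-style checks over a (canned) service response.
import json

def build_search_path(base, params):
    """Compose a REST query path from an input-domain-model test point."""
    query = "&".join(f"{k}={v}" for k, v in sorted(params.items()))
    return f"{base}?{query}"

def assert_response(raw, expected_status, required_fields):
    """Behavior-level checks: status code plus required response fields."""
    resp = json.loads(raw)
    assert resp["status"] == expected_status, resp["status"]
    for field in required_fields:
        assert field in resp["body"], f"missing field: {field}"
    return True

path = build_search_path("/mobility/search", {"from": "A", "to": "B"})
print(path)  # → /mobility/search?from=A&to=B

# Canned response standing in for the composed cloud service.
canned = json.dumps({"status": 200, "body": {"routes": [], "provider": "x"}})
print(assert_response(canned, 200, ["routes", "provider"]))  # → True
```

In a live setting the canned string would be replaced by the HTTP response body, but the assertion pass over status and required fields is the same shape of end-to-end check.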
APA, Harvard, Vancouver, ISO, and other styles
29

Katamanova, E. V., P. V. Kazakova, I. V. Kudaeva, A. N. Kuks, and O. V. Ushakova. "Efficiency of “Neo inulin” in the complex treatment of patients with type 2 diabetes." Acta Biomedica Scientifica 6, no. 6-2 (December 28, 2021): 11–17. http://dx.doi.org/10.29413/abs.2021-6.6-2.2.

Full text
Abstract:
Diabetes mellitus is one of the most serious medical, social, and economic health problems in all countries of the world. The incidence of diabetes mellitus in the world doubles every 10–15 years, acquiring the character of a non-infectious epidemic. Therefore, it is extremely important to search for new drugs that help normalize glycemia, prevent complications of diabetes mellitus, and improve the quality of life of patients. These drugs include “Neo inulin”, which has a hypoglycemic, antioxidant, hepatoprotective and angioprotective effect. The aim: to evaluate the effectiveness of “Neo inulin” in the complex treatment of patients with type 2 diabetes mellitus. Materials and methods. The study involved 18 women (average age – 64.5 ± 8.7 years, average weight – 77.8 ± 11.4 kg) and 3 men (average age – 54.6 ± 12.4 years, average weight – 114 ± 40.2 kg). The average duration of type 2 diabetes was 11.0 (7.0–12.0) years. “Neo inulin” was prescribed as 2 capsules per day for 12 weeks as part of basic diabetes therapy. To assess the effectiveness of the therapy, a study of health-related quality of life, a biochemical blood test (glycated hemoglobin), and a clinical minimum (CBC, OAM, ECG, blood glucose) were carried out; the functional characteristics of tissue blood flow were investigated, and the ankle-brachial index (ABI) was determined. Statistical processing of the results was carried out using the Statistica 6.0 software package (StatSoft Inc., USA). Differences were considered statistically significant at p < 0.05. Results. Statistically significant differences were revealed in the values of all scales of health-related quality of life in patients before and after treatment with “Neo Inulin”, including the total physical and mental components. 
An improvement in the functioning of microcirculatory regulation mechanisms was noted, which is confirmed by a statistically significant increase in ABI (1.0 and 0.8 on the right; 0.9 and 0.8 on the left, respectively; p < 0.05) and in the coefficient of microcirculation variation (9.2 and 8.3, respectively; p < 0.05). In 57.1 % of cases (12 people), the level of glycated hemoglobin was normalized. Conclusion. The use of a treatment regimen that includes “Neo Inulin” increases the effectiveness of treatment and improves the quality of life of patients with type 2 diabetes mellitus.
APA, Harvard, Vancouver, ISO, and other styles
30

Kashevarova, Galina, Anastasia Semina, and Svetlana Maksimova. "INTELLIGENT AND DIGITAL TECHNOLOGIES IN THE CONSTRUCTION OBJECTS TECHNICAL DIAGNOSTICS." International Journal for Computational Civil and Structural Engineering 17, no. 2 (July 6, 2021): 22–33. http://dx.doi.org/10.22337/2587-9618-2021-17-2-22-33.

Full text
Abstract:
The implementation of intelligent and digital technologies in the activities of civil engineering experts can provide alternative solutions to specialists of different qualifications. This will increase the speed of data processing and the reliability of the expert opinion on the technical condition of operated construction objects, and will allow assessing their real residual resource when deciding on the possibility of further exploitation. Achieving this goal requires an ontology and the original technology for determining the main system components (the confinement model), used for the structural and functional analysis of the knowledge system. The entire set of technologies for architectural and construction objects (digital documentation, scan results, thermal imager data, non-destructive survey data), data about a structure's defects and damage, appropriate software, and intelligent technologies (fuzzy logic, neural networks) are used to diagnose certain construction parts more thoroughly and to transmit the digital information needed to determine the technical condition category. This data set can also be used in the following situations: controlling the dynamics of changes in the technical state of a construction object, improving the accuracy of determining the scope of repair work, enhancing the quality of project documentation, of methods for assessing the quality of restoration work, and of measures for the conservation of architectural monuments, etc.
APA, Harvard, Vancouver, ISO, and other styles
31

Appel, Lora, Suad Ali, Tanya Narag, Krystyna Mozeson, Zain Pasat, Ani Orchanian-Cheff, and Jennifer L. Campos. "Virtual reality to promote wellbeing in persons with dementia: A scoping review." Journal of Rehabilitation and Assistive Technologies Engineering 8 (January 2021): 205566832110539. http://dx.doi.org/10.1177/20556683211053952.

Full text
Abstract:
Virtual Reality (VR) technologies have increasingly been considered potentially valuable tools in dementia-related research and could serve as non-pharmacological therapy to improve quality of life (QoL) and wellbeing for persons with dementia (PwD). In this scoping review, we summarize peer-reviewed articles published up to Jan-21, 2021, on the use of VR to promote wellbeing in PwD. Eighteen manuscripts (reporting on 19 studies) met the inclusion criteria, with a majority published in the past 2 years. Two reviewers independently coded the articles regarding A) intended clinical outcomes and effectiveness of the interventions, B) study sample (characteristics of the participants), C) intervention administration (by whom, what setting), D) experimental methods (design/instruments), and E) technical properties of the VR-systems (hardware/devices and software/content). Emotional outcomes were by far the most common objectives of the interventions, reported in seventeen (89.5%) of the included articles. Outcomes addressing social engagement and personhood in PwD have not been thoroughly explored using VR. Based on the positive impact of VR, future opportunities lie in identifying special features and customization of the hardware/software to afford the most benefit to different sub-groups of the target population. Overall, this review found that VR represents a promising tool for promoting wellbeing in PwD, with positive or neutral impact reported on emotional, social, and functional aspects of wellbeing.
APA, Harvard, Vancouver, ISO, and other styles
32

Arredondo-Valdez, Juan, Alejandro Isabel Luna-Maldonado, Ricardo David Valdez-Cepeda, Humberto Rodríguez-Fuentes, Juan Antonio Vidales-Contreras, Uziel Francisco Grajeda-González, and Héctor Flores-Breceda. "Characterization of Mature Paddles of Opuntia ficus-indica L. Using Morphological and Colorimetric Descriptors." Journal of Experimental Biology and Agricultural Sciences 10, no. 2 (April 30, 2022): 335–43. http://dx.doi.org/10.18006/2022.10(2).335.343.

Full text
Abstract:
Mexico is the world's leading producer of Opuntia ficus-indica, the most widespread and most commercially important cactus in Mexico. Morphological and colorimetric descriptors are among the most important agronomic traits because these parameters affect yield; thus, the objective of the present research was to present a fast and reliable methodology for obtaining the functional relationships among shape and color parameters of O. ficus-indica cladodes using a smartphone, a color meter, and open-access software. The acquisition and processing of images revealed interesting relationships between the cladodes' morphological characteristics as well as their colorimetric parameters. The non-linear data behaviors were fitted using deterministic models and the CurveExpert software. The results revealed that the best morphological descriptors were Circularity vs. Perimeter (r = 0.9815) and Aspect ratio vs. Roundness (r = 0.9999). In addition, mean values of the L*, C, and H color parameters were displayed in a window of an online computer program. The a*-C relationship of the color parameters had the highest correlation coefficient (0.999). Therefore, it can be concluded that the morphological descriptors Circularity vs. Perimeter and Aspect ratio vs. Roundness, and the a*-C color parameter, can quickly and precisely predict the quality of O. ficus-indica.
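The descriptor correlations reported above (e.g., Circularity vs. Perimeter, r = 0.9815) are ordinary correlation coefficients computed between descriptor pairs. A minimal stdlib sketch of Pearson's r on hypothetical descriptor data (illustrative, not the study's measurements):

```python
# Illustrative sketch: Pearson's correlation coefficient r between two
# morphological descriptors, here on made-up cladode data.
from math import sqrt

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / (sqrt(sum((a - mx) ** 2 for a in x)) *
                  sqrt(sum((b - my) ** 2 for b in y)))

# A strong inverse relationship between two hypothetical descriptors.
perimeter   = [10.0, 12.0, 15.0, 18.0, 21.0]
circularity = [0.91, 0.88, 0.83, 0.79, 0.74]
r = pearson_r(perimeter, circularity)
print(round(r, 4))  # close to -1: near-perfect inverse relationship
```

Fitting deterministic non-linear models, as done with CurveExpert in the paper, then amounts to choosing the model whose predictions maximize such a correlation with the observations.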
APA, Harvard, Vancouver, ISO, and other styles
33

Khan, Gitosree, Sabnam Sengupta, and Anirban Sarkar. "Dynamic service composition in enterprise cloud bus architecture." International Journal of Web Information Systems 15, no. 5 (December 2, 2019): 550–76. http://dx.doi.org/10.1108/ijwis-01-2019-0005.

Full text
Abstract:
Purpose Service composition phenomenon based on non-scenario aspects are become the latest issues in enterprise software applications of the multi-cloud environment due to the phenomenal increase in a number of Web services. The traditional service composition patterns are hard to support the dynamic, flexible and autonomous service composition in the inter-cloud platform. To address this problem, this paper aims to describe a dynamic service composition framework (SCF) that is enriched with various structural and functional aspects of composition patterns in a cloud computing environment. The proposed methodology helps to integrate various heterogeneous cloud services dynamically to acquire an optimal and novel enterprise solution for delivering the service to the end-users automatically. Design/methodology/approach SCF and different composition patterns have been used to compose the services present in the inter-cloud architecture of the multi-agent-based system. Further, the proposed dynamic service composition algorithm is illustrated using a hybrid approach, where service are chosen according to various needs of quality of service parameters. Besides, a priority-based service scheduling algorithm is proposed that facilitates the automation of delivering cloud service optimally. Findings The proposed framework is capable of composing the heterogeneous service and facilitate the structural and functional aspects of service composition process in enterprise cloud-based applications in terms of flexibility, scalability, integrity and dynamicity of the cloud bus. The advantage of the proposed algorithm is that it helps to minimize the execution cost, processing time and get better success rate in delivering the service as per customer’s need. 
Originality/value The novelty of the proposed architecture lies in coordinating cloud participants, automating the service discovery pattern, reconfiguring scheduled services and aggregating composite services in inter-cloud environments. Besides, the proposed framework supports several non-functional characteristics such as robustness, flexibility, dynamicity, scalability and reliability of the system.
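As an aside for readers, the priority-based scheduling idea described in this abstract can be reduced to a priority-queue sketch. The following is illustrative only; the service names and the single integer priority are hypothetical assumptions, not the authors' algorithm:

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class ServiceRequest:
    priority: int                     # lower value = delivered first
    name: str = field(compare=False)  # hypothetical service identifier

def schedule(requests):
    """Return service names in the order a priority-based scheduler
    would deliver them (illustrative sketch)."""
    heap = list(requests)
    heapq.heapify(heap)
    order = []
    while heap:
        order.append(heapq.heappop(heap).name)
    return order
```

For example, `schedule([ServiceRequest(2, "billing"), ServiceRequest(1, "auth")])` delivers "auth" before "billing".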
APA, Harvard, Vancouver, ISO, and other styles
34

Kizbayev, Assylzhan, Dauren Zhakebayev, Ualikhan Abdibekov, and Askar Khikmetov. "Mathematical modeling of electron irradiation of oil." Engineering Computations 35, no. 5 (July 2, 2018): 1998–2009. http://dx.doi.org/10.1108/ec-12-2016-0419.

Full text
Abstract:
Purpose This paper aims to propose a mathematical model and numerical modeling to study the behavior of a low-conductivity incompressible multicomponent hydrocarbon mixture in a channel under the influence of electron irradiation. In addition, it also aims to present additional mechanisms to study the radiation transfer and the separation of the mixture’s components. Design/methodology/approach The three-dimensional non-stationary Navier–Stokes equation is the basis for this model. The Adams–Bashforth scheme is used to solve the convective terms of the equation of motion using a fourth-order-accuracy five-point elimination method, and the viscous terms are computed with the Crank–Nicolson method. The Poisson equation is solved with the matrix sweep method, and the concentration and electron irradiation equations are solved with the Crank–Nicolson method as well. Findings The model shows high computational efficiency and good estimation quality. On the basis of the numerical results, the effect of the separation of the mixture into fractions with various physical characteristics was obtained. The obtained results contribute to the improvement of technologies for obtaining high-quality oil products through oil separation into light and heavy fractions. The mathematical model is validated on a test problem and shows good agreement with the experimental data. Originality/value The constructed mathematical model makes it possible to develop a methodology for conducting experimental studies of this phenomenon.
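For orientation, the Crank–Nicolson scheme used here for the viscous and concentration terms is an implicit, second-order time-stepping method. A minimal one-dimensional diffusion sketch (pure Python, Dirichlet boundaries held fixed) illustrates the scheme itself; it is not the authors' solver:

```python
def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system; a = sub-, b = main, c = super-diagonal, d = rhs."""
    n = len(b)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def crank_nicolson_step(u, diff, dx, dt):
    """One Crank-Nicolson step for u_t = diff * u_xx with the two end
    values held fixed (Dirichlet boundary conditions)."""
    n = len(u)
    r = diff * dt / (2.0 * dx * dx)
    a = [0.0] + [-r] * (n - 2) + [0.0]              # sub-diagonal
    b = [1.0] + [1.0 + 2.0 * r] * (n - 2) + [1.0]   # main diagonal
    c = [0.0] + [-r] * (n - 2) + [0.0]              # super-diagonal
    d = [u[0]] + [r * u[i - 1] + (1.0 - 2.0 * r) * u[i] + r * u[i + 1]
                  for i in range(1, n - 1)] + [u[-1]]
    return thomas_solve(a, b, c, d)
```

A constant profile is reproduced exactly, while a sine profile decays, as diffusion requires; the scheme stays stable for any time step.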
APA, Harvard, Vancouver, ISO, and other styles
35

Воеводин, В. В. "A comprehensive analysis of performance quality of large supercomputer complexes." Numerical Methods and Programming (Vychislitel'nye Metody i Programmirovanie), no. 3 (June 14, 2019): 182–91. http://dx.doi.org/10.26089/nummet.v20r317.

Full text
Abstract:
Currently, the problem of low performance of supercomputer complexes is largely due to the fact that administrators of such complexes cannot always detect and eliminate the root causes of reduced efficiency in a timely manner. This largely concerns not equipment failure (such cases can usually be detected using monitoring systems), but an implicit performance decrease of certain supercomputer components, provided that they seem to continue working correctly. Such a situation arises because there are at present no sufficiently flexible and convenient software tools for prompt and comprehensive analysis of all the performance quality characteristics of computer systems. This paper describes a systematic approach to solving this issue, which will allow one to perform a comprehensive analysis of various aspects of supercomputer functioning, primarily related to the execution of supercomputer applications. A software tool developed on the basis of this approach will collect, within a single model, all the most important data on the properties and quality of jobs running on the supercomputer: data on their execution performance, size and duration, the presence of specific or abnormal behavior scenarios, the usage of application packages and libraries, etc. Using flexible aggregation capabilities, the required level of detail will be specified: individual users, projects, application packages, subject areas, supercomputer partitions, time ranges, etc.
This will allow one to create hundreds and thousands of different views for analyzing the state of the supercomputer, which will help administrators to choose the most suitable option for them.
APA, Harvard, Vancouver, ISO, and other styles
36

Hossain, Molla Muntasir, Md Abdul Wahab, Md Abdus Samad Al Azad, and Rubaiya Reza Tumpa. "Quality of Life among Patients with Coronary Heart Disease Admitted in a Selected Tertiary Level Hospital." Journal of Armed Forces Medical College, Bangladesh 13, no. 1 (April 23, 2017): 90–94. http://dx.doi.org/10.3329/jafmc.v13i1.41065.

Full text
Abstract:
Introduction: Coronary heart disease and cerebrovascular disease are the two main contributors to global morbidity and mortality. Coronary heart disease deaths in Bangladesh reached 163,769, or 17.11% of total deaths, ranking 25th in the world. Importantly, quality of life among these patients can modify the course of coronary heart disease. The opportunity for improved quality of life should be a factor in the health care provider's decision to recommend the treatment procedure. Objective: To ascertain the physical and mental health components of quality of life, together with sociodemographic characteristics and health-related morbidity status, among admitted coronary heart disease patients. Materials and Methods: This cross-sectional study was conducted from January 2013 to December 2013 among coronary heart disease patients admitted to the Cardiology department of Combined Military Hospital, Dhaka. The data were collected purposively using the Medical Outcomes short form SF-36, developed by the RAND Corporation, for measuring health-related quality of life among Bangladeshi patients, where data were expressed as a score on a 0-100 scale. Data analysis was done using SPSS version 19. Results: A total of 105 cases were selected purposively, of which the majority were in the age group of 50-60 years with a mean age of 55.27 years. Among the respondents, 97.1% were males and 98.1% were Muslims. The majority (41%) of them were retired personnel. The mean monthly income was Tk. 16,393.56. Regarding education level, 73% of the study population were SSC pass and below. Among the study group, 27 (25.7%) patients had undergone coronary artery bypass graft (CABG) operation. The study group possessed a total quality of life score of 63.4% in their interviews as per the SF-36. Among the whole study group, the mental component score (63.61%) was found to be slightly higher than the physical component score (63.2%). 
The mental component score of CABG-operated patients (69.43%) was relatively higher than that of non-CABG patients (60.01%). Patients with better monthly income as well as better educational level possess a better mental component and total quality of life than others. Conclusion: It is of paramount importance to maintain the quality of life among coronary heart disease patients. Mental assurance and surgical intervention can improve quality of life among coronary heart disease patients. Journal of Armed Forces Medical College Bangladesh Vol.13(1) 2017: 90-94
APA, Harvard, Vancouver, ISO, and other styles
37

Tsybulnyk, Serhii, and Danylo Bidnyk. "DESIGN OF THE ARCHITECTURE OF AN AUTOMATED BIBLIOGRAPHIC SYSTEM." Bulletin of the National Technical University «KhPI» Series: New solutions in modern technologies, no. 2(8) (June 15, 2021): 83–89. http://dx.doi.org/10.20998/2413-4295.2021.02.12.

Full text
Abstract:
The development of information and computer technologies has led to the need to evolve the concept of universal bibliographic control. The creation of the Internet and web technologies has allowed this concept to reach a new level through a number of common international standards. In addition, to ensure control and exchange of bibliographic information, public bibliographic and scientometric databases were created. Today, software for managing bibliographic records is in demand in various countries in Europe and America; the most popular packages are EndNote, RefWorks, BibTeX and Zotero. The development of such an automated bibliographic system and the adaptation of its functionality to standards and requirements within Ukraine is relevant for a number of reasons. The main reasons are the need for every scientist and lecturer of higher education institutions to confirm their scientific achievements when being hired, when submitting scientific work to various competitions, when obtaining a scientific degree, and so on. Today, the rapid development of information and computer technology makes it possible to abandon maintaining the list of scientific papers manually and to move to the use of specialized software on smartphones. That is why the architecture of an automated bibliographic system, developed as a mobile application for the Android operating system, was designed. Java was chosen as the programming language, as the vast majority of the Android operating system is written in this language. A number of technologies were chosen for the selected operating system that simplify the process of developing a mobile application. The three-layer architecture of the automated bibliographic system is designed on the basis of the multilevel architecture model and the MVVM template. 
This architecture makes it possible to ensure the main non-functional quality characteristics of the developed software, as well as to effectively implement business-logic rules within the object-oriented programming paradigm.
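The MVVM template mentioned above decouples the view from the data layer through observable state. A minimal, language-agnostic sketch of the pattern follows (in Python for brevity, though the abstract's target is Java/Android; all class and method names are hypothetical):

```python
class Observable:
    """Minimal observable value, standing in for Android's LiveData."""
    def __init__(self, value=None):
        self._value = value
        self._observers = []

    def observe(self, callback):
        self._observers.append(callback)

    def set(self, value):
        self._value = value
        for cb in self._observers:  # notify the bound view(s)
            cb(value)

class ReferenceViewModel:
    """ViewModel layer: exposes formatted references to the view,
    delegating data access to a repository (model layer)."""
    def __init__(self, repository):
        self._repository = repository
        self.references = Observable([])

    def load(self):
        # Business logic lives here, not in the view.
        self.references.set(sorted(self._repository.fetch_all()))
```

The view never touches the repository directly; it subscribes to `references` and re-renders whenever the ViewModel publishes new data.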
APA, Harvard, Vancouver, ISO, and other styles
38

Alwakeel, Lyan, and Kevin Lano. "Functional and Technical Aspects of Self-management mHealth Apps: Systematic App Search and Literature Review." JMIR Human Factors 9, no. 2 (May 25, 2022): e29767. http://dx.doi.org/10.2196/29767.

Full text
Abstract:
Background Although the past decade has witnessed the development of many self-management mobile health (mHealth) apps that enable users to monitor their health and activities independently, there is a general lack of empirical evidence on the functional and technical aspects of self-management mHealth apps from a software engineering perspective. Objective This study aims to systematically identify the characteristics and challenges of self-management mHealth apps, focusing on functionalities, design, development, and evaluation methods, as well as to specify the differences and similarities between published research papers and commercial and open-source apps. Methods This research was divided into 3 main phases to achieve the expected goal. The first phase involved reviewing peer-reviewed academic research papers from 7 digital libraries, and the second phase involved reviewing and evaluating apps available on Android and iOS app stores using the Mobile Application Rating Scale. Finally, the third phase involved analyzing and evaluating open-source apps from GitHub. Results In total, 52 research papers, 42 app store apps, and 24 open-source apps were analyzed, synthesized, and reported. We found that the development of self-management mHealth apps requires significant time, effort, and cost because of their complexity and specific requirements, such as the use of machine learning algorithms, external services, and built-in technologies. In general, self-management mHealth apps are similar in their focus, user interface components, navigation and structure, services and technologies, authentication features, and architecture and patterns. However, they differ in terms of the use of machine learning, processing techniques, key functionalities, inference of machine learning knowledge, logging mechanisms, evaluation techniques, and challenges. 
Conclusions Self-management mHealth apps may offer an essential means of managing users’ health; they are expected to assist users in continuously monitoring their health and to encourage them to adopt healthy habits. However, developing an efficient and intelligent self-management mHealth app with the ability to reduce resource consumption and processing time, as well as increase performance, is still under research and development. In addition, there is a need to find an automated process for evaluating and selecting suitable machine learning algorithms for self-management mHealth apps. We believe that these issues can be avoided or significantly reduced by using a model-driven engineering approach with a decision support system to accelerate and ameliorate the development process and quality of self-management mHealth apps.
APA, Harvard, Vancouver, ISO, and other styles
39

Al-Baldawi, Buraq Adnan. "Evaluation of Petrophysical Properties Using Well Logs of Yamama Formation in Abu Amood Oil Field, Southern Iraq." Iraqi Geological Journal 54, no. 1E (May 31, 2021): 67–77. http://dx.doi.org/10.46717/igj.54.1e.6ms-2021-05-27.

Full text
Abstract:
Petrophysical analysis is very important to understand the factors controlling reservoir quality and production wells. In the current study, a petrophysical evaluation was carried out for hydrocarbon assessment based on well log data from four wells of the Early Cretaceous carbonate reservoir of the Yamama Formation in the Abu Amood oil field in the southern part of Iraq. The available well logs, such as sonic, density, neutron, gamma ray, SP, and resistivity logs for wells AAm-1, AAm-2, AAm-3, and AAm-5, were used to delineate the reservoir characteristics of the Yamama Formation. Lithologic and mineralogic studies were performed using porosity-log combination cross plots, such as the density vs. neutron cross plot and the M-N mineralogy plot. These cross plots show that the Yamama Formation consists mainly of limestone and that the essential mineral components are dominantly calcite with small amounts of dolomite. The petrophysical characteristics, such as porosity, water and hydrocarbon saturation, and bulk water volume, were determined and interpreted using Techlog software to carry out and build the full computer-processed interpretation of reservoir properties. Based on the petrophysical properties of the studied wells, the Yamama Formation is divided into six units (YB-1, YB-2, YB-3, YC-1, YC-2 and YC-3) separated by dense non-porous units (barrier beds). The units YB-1, YB-2, YC-2 and YC-3 represent the most important reservoir units and oil-bearing zones, because these reservoir units are characterized by good petrophysical properties with high porosity and low to moderate water saturation. The other units are not reservoirs and not oil-bearing units due to low porosity and high water saturation.
APA, Harvard, Vancouver, ISO, and other styles
40

Borchmann, Peter, Horst Müller, Corinne Brillant, Karolin Behringer, Teresa Halbsguth, Volker Diehl, Hans Henning Flechtner, and Andreas Engert. "Longitudinal Evaluation of Quality of Life and Fatigue In Hodgkin Lymphoma Patients." Blood 116, no. 21 (November 19, 2010): 935. http://dx.doi.org/10.1182/blood.v116.21.935.935.

Full text
Abstract:
Abstract 935 Background: Long term impairment of quality of life (QoL) and elevated fatigue levels in Hodgkin Lymphoma (HL) survivors have been reported. However, few longitudinal data and no conclusive knowledge on components and determinants of QoL exist so far. Therefore, the German Hodgkin Study Group (GHSG) assessed the patients' QoL within the prospectively randomized studies HD10-12 for a detailed longitudinal evaluation of QoL and fatigue. Methods: QoL was assessed with a psychometrically proven questionnaire (QLQ-S) which contains the EORTC QLQ-C30 among other scales and items. Patients answered the questionnaires before, during, and at the end of therapy and at regular follow-up visits. For all QLQ-C30 functional scales and fatigue, longitudinal courses up to 27 months from diagnosis are given with means and 95% confidence intervals. Reference values from a German control population were used for interpretation of the results. Components and determinants of QoL were analyzed with special modeling software (MPlus) which allows for full-information maximum likelihood estimation of multivariate longitudinal models in the presence of missing data. The predictive value of fatigue at baseline for progression-free survival and overall survival was tested in Cox proportional hazards analyses together with other known risk factors. Results: 4,160 patients were included in HD10-12, and 3,208 are evaluable for this analysis (a total of 15,722 assessments). Before therapy, HL patients had clearly poorer mean scores on each QoL scale when compared to the German reference population. All scales at baseline were negatively influenced by gender (females) and more advanced disease. Before therapy, age ≥50 years was negatively related to physical functioning and cognitive functioning, but positively to social functioning. After a decrease of QoL during chemotherapy, all scales showed considerable improvement. 
However, long-term QoL usually remained below normal reference values, and this was most pronounced in patients ≥50 years of age and in advanced stages. A QoL model with three factors (physical, mental, and social) showed very good fit (RMSEA<.05) and high stability of QoL already 12 months after diagnosis. No relevant effect of the type of treatment could be detected. Overall, 44.7% of patients never experienced severe fatigue ≥50 (relative scale from 0–100), 17.4% had fatigue only temporarily during treatment, and 15.1% had severe fatigue before therapy which vanished after therapy. In addition, 6.8% of patients developed severe long-term fatigue without being severely fatigued before, and 6% had permanently severe fatigue. Cox regression for overall survival revealed that severe fatigue at baseline is a significant, strong and independent risk factor for death from any cause (p<.05, HR=1.5). Other significant risk factors for OS included age, infradiaphragmatic nodes, and large mediastinal mass. In contrast, gender, high ESR, extranodal involvement, B-symptoms, and intermediate and advanced stages were not significant. Conclusion: This is the first detailed QoL and fatigue analysis in HL patients covering all stages of the disease before, during, and after therapy. QoL domains are clearly impaired before the onset of chemotherapy but improve substantially over time. Baseline QoL is affected to a considerable degree by tumor- and patient-specific characteristics. Importantly, type and intensity of HL treatment have no relevant negative impact on long-term QoL or fatigue. The strong impact of severe fatigue at baseline on overall survival is currently being analyzed in more detail and results will be presented. Disclosures: No relevant conflicts of interest to declare.
APA, Harvard, Vancouver, ISO, and other styles
41

Frolkova, A. V., V. G. Fertikova, E. V. Rytova, and A. K. Frolkova. "Evaluation of the adequacy of phase equilibria modeling based on various sets of experimental data." Fine Chemical Technologies 16, no. 6 (January 26, 2022): 457–64. http://dx.doi.org/10.32362/2410-6593-2021-16-6-457-464.

Full text
Abstract:
Objectives. The purpose of the paper is to compare the adequacy of mathematical models of vapor–liquid equilibrium (VLE) and their ability to reproduce the phase behavior of the ternary system benzene–cyclohexane–chlorobenzene using different experimental data sets to evaluate binary interaction parameters. Methods. The research methodology was mathematical modeling of VLE in the Aspen Plus V.10.0 software package using activity coefficient models (Non-Random Two-Liquid (NRTL), Wilson) and the Universal Quasichemical Functional-group Activity Coefficients (UNIFAC) group model, which provides independent information. For the benzene–cyclohexane–chlorobenzene ternary system, the use of the NRTL equation is warranted because it provides a better description of the VLE experimental data. Results. Construction of the diagram of lines of constant volatility of cyclohexane relative to benzene revealed three topological structures. Only one of them can be considered reliable, because it corresponds to the experimental data and coincides with the diagram constructed using the independent UNIFAC model. The results indicate that, to study systems containing components with similar properties, it is necessary to improve the description quality of the available data sets (the relative error should not exceed 1.5%). Conclusions. The reproduction of the thermodynamic features of various manifolds in the composition simplex obtained by processing direct VLE data can be used as an additional check of the adequacy of the model. For the cyclohexane–benzene–chlorobenzene system, the best NRTL equation parameters are those regressed from the extensive experimental VLE data available in the literature for the ternary system as a whole.
APA, Harvard, Vancouver, ISO, and other styles
42

Bulashova, O. V., A. A. Nasybullina, E. V. Khazova, V. M. Gazizyanova, and V. N. Oslopov. "Heart failure patients with mid-range ejection fraction: clinical features and prognosis." Kazan medical journal 102, no. 3 (June 10, 2021): 293–301. http://dx.doi.org/10.17816/kmj2021-293.

Full text
Abstract:
Aim. To analyze the clinical and echocardiographic characteristics and prognosis of patients with heart failure with mid-range ejection fraction. Methods. The study included 76 patients with stable heart failure of I–IV functional class, with a mean age of 66.1±10.4 years. All patients were divided into 3 subgroups based on the left ventricular ejection fraction: the first group, heart failure patients with reduced ejection fraction (below 40%), 21.1%; the second group, patients with mid-range ejection fraction (from 40 to 49%), 23.7%; the third group, patients with preserved ejection fraction (≥50%), 55.3%. The clinical characteristics of all groups were compared. The quality of life was assessed by the Minnesota Satisfaction Questionnaire (MSQ), and the clinical condition was determined using the clinical condition assessment scale (Russian Shocks). The prognosis was studied according to the onset of cardiovascular events one year after enrollment in the study. The endpoints were cardiovascular mortality, myocardial infarction (MI), stroke, hospitalization for acutely decompensated heart failure, and thrombotic complications. Statistical analysis was performed using IBM SPSS Statistics 20 software. Normal distribution of the data was determined by the Shapiro–Wilk test; nominal indicators were compared between groups using chi-square tests, and normally distributed quantitative indicators by ANOVA. The Kruskal–Wallis test was performed to compare data with non-normal distribution. Results. The analysis showed that most clinical characteristics (etiological structure, age, gender, quality of life, results on the clinical condition assessment scale for patients with chronic heart failure and a 6-minute walk test, distribution by functional classes of heart failure) in patients with mid-range ejection fraction (HFmrEF) were similar to those in patients with reduced ejection fraction (HFrEF). 
At the same time, they significantly differed from the characteristics of patients with preserved ejection fraction (HFpEF). Echocardiographic data from patients with mid-range ejection fraction were intermediate between those of patients with reduced and preserved ejection fraction. In heart failure patients with mid-range ejection fraction, the incidence of adverse outcomes during the first year was also intermediate between heart failure patients with preserved ejection fraction and patients with reduced ejection fraction: for all cardiovascular events in the absence of significant differences (17.6; 10.8 and 18.8%, respectively), myocardial infarction (5.9; 0 and 6.2%), and thrombotic complications (5.9; 5.4 and 6.2%). Heart failure patients with mid-range ejection fraction, in comparison to patients with preserved ejection fraction and reduced ejection fraction, had significantly lower cardiovascular mortality (0; 2.7 and 12.5%, p <0.05) and number of hospitalizations for acutely decompensated heart failure (0; 2.7 and 6.2%). Conclusion. The clinical characteristics of heart failure patients with mid-range ejection fraction and of heart failure patients with reduced ejection fraction are similar but significantly different from those of the group of patients with preserved ejection fraction; echocardiographic data in heart failure patients with mid-range ejection fraction are intermediate between those in patients with reduced ejection fraction and patients with preserved ejection fraction; the prognosis for all cardiovascular events did not differ significantly among the groups depending on the left ventricular ejection fraction.
APA, Harvard, Vancouver, ISO, and other styles
43

Egorov, I. V. "Simulation model of dependability of redundant computer systems with recurrent information recovery." Dependability 18, no. 3 (September 5, 2018): 10–17. http://dx.doi.org/10.21683/1729-2646-2018-18-3-10-17.

Full text
Abstract:
Today’s digital nanotechnology-based information management systems are especially sensitive to high-energy particles during operation in irradiated areas. This sensitivity is most often manifested in the form of intermittent soft errors, i.e. distortion of information bits in the system’s memory elements with no hardware failure. The cause lies in the afterpulses at the output of the logical elements that occur as the result of ionization of the gate area of the transistor’s semiconductor after it is exposed to a high-energy particle. In order to counter soft errors, the system is equipped with self-repair mechanisms that ensure regular replacement of distorted data with correct data. If this approach to design is employed, the significance of dependability analysis of the system under development increases substantially. Since regular occurrence of soft errors is essentially the normal operating mode of a system in conditions of increased radiation, dependability analysis must be repeatedly conducted at the design stage, as that is the only way to duly evaluate the quality of the design decisions taken. The distinctive feature of fault-tolerant hardware and software systems, namely the presence of a non-probabilistic recovery process, limits the applicability of the known methods of dependability analysis. It is difficult to formalize the behaviour of such systems in the form of a dependability model in the context of the classic dependability theory, which is geared towards the evaluation of hardware structure. As it has been found out, the application of conventional methods of dependability analysis (such as the Markovian model or probabilistic logic) requires making a number of assumptions that result in unacceptable errors in the evaluation results or in its inapplicability. Aim. Development of a model and methods of dependability analysis that would allow evaluating the dependability of hardware and software systems with periodic recovery. Results.
A simulation model was developed that is intended for dependability evaluation of complex recoverable information management systems. The model is a network of oriented state graphs that allows describing the behaviour of a recoverable system subject to the presence of computation processes and recovery processes that operate according to non-stochastic algorithms. Based on the simulation model, a software tool for dependability analysis was developed that enables probabilistic estimation of dependability characteristics of individual system units and of the overall structure by means of computer simulation of failures and recoveries. This tool can be used for comprehensive dependability evaluation of hardware and software systems that involves the analysis of recoverable units with complex behaviour using the developed simulation model, and of their operation along with simple hardware components, such as power supplies and fuses, using conventional analytical methods of dependability analysis. Such an approach to dependability evaluation is implemented in the Digitek Reliability Analyzer dependability analysis software environment. Practical significance. The application of the developed simulation model and dependability analysis tool at the design stage enables due evaluation of the quality of the produced fault-tolerant recoverable system in terms of dependability and allows choosing the best architectural solution, which has high practical significance.
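The idea of estimating dependability characteristics by computer simulation of failures and recoveries can be illustrated by a minimal Monte Carlo sketch for a single recoverable unit. This is illustrative only; the article's actual model is a network of state graphs with non-stochastic recovery algorithms, not this two-state exponential process:

```python
import random

def availability(mtbf, mttr, horizon, trials=500, seed=1):
    """Monte Carlo estimate of the availability of a single recoverable unit
    with exponentially distributed times to failure (mean mtbf) and times to
    repair (mean mttr), observed over `horizon` time units."""
    rng = random.Random(seed)
    total_up_fraction = 0.0
    for _ in range(trials):
        t, up = 0.0, 0.0
        working = True
        while t < horizon:
            # Draw the duration of the current up or down interval,
            # truncated at the end of the observation window.
            dt = min(rng.expovariate(1.0 / (mtbf if working else mttr)),
                     horizon - t)
            if working:
                up += dt
            t += dt
            working = not working
        total_up_fraction += up / horizon
    return total_up_fraction / trials
```

For mtbf=100 and mttr=10, the estimate approaches the analytical steady-state availability mtbf/(mtbf+mttr) ≈ 0.909, which is one way to sanity-check such a simulator before applying it to units with complex behaviour.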
APA, Harvard, Vancouver, ISO, and other styles
44

Obilor, Amarachi F., Manuela Pacella, Andy Wilson, and Vadim V. Silberschmidt. "Micro-texturing of polymer surfaces using lasers: a review." International Journal of Advanced Manufacturing Technology 120, no. 1-2 (February 11, 2022): 103–35. http://dx.doi.org/10.1007/s00170-022-08731-1.

Full text
Abstract:
Micro- and nanoscale structures produced on surfaces of metals, polymers, ceramics, and glasses have many important applications in different fields such as engineering, medicine, biology, etc. Laser ablation using ultrashort pulses has become the prominent technique for generating different surface structures for various functional applications. Ultrashort laser ablation has proved to be ideal for producing structures with dimensions down to the nanometre scale. In comparison to other texturing techniques employed to create micro/nano features, such as electrochemical machining, micro-milling, ion-beam etching, hot embossing, lithography, and mechanical texturing, ultrashort laser ablation produces high-quality surfaces at low cost in a one-step non-contact process. Advantageous characteristics of polymers, such as a high strength-to-weight ratio, non-corrosive nature, and high electrical and thermal resistance, have made polymers the preferred choice compared to other materials (e.g., steel, aluminium, titanium) in several fields of application. As a result, laser ablation of polymers has been of great interest to many researchers. This paper reviews the current state-of-the-art research and recent progress in laser ablation of polymers, starting from laser-material interaction, polymer properties influenced by the laser, laser texturing methods, and achievable surface functionalities such as adhesion, friction, self-cleaning, and hydrophilicity on commonly used polymeric materials. It also highlights the capabilities and drawbacks of various micro-texturing techniques while identifying texture geometries that can be generated with these techniques. In general, the objective of this work is to present a thorough review of laser ablation and laser surface modification of a variety of industrially used polymers.
Since direct laser interference patterning is an emerging area, considerable attention is given to this technique with the aim of forming a basis for follow-up research that could pave the way for potential technological ideas and optimization towards obtaining complex high-resolution features for future novel applications.
APA, Harvard, Vancouver, ISO, and other styles
45

Weigand, Maximilian, Florian M. Wagner, Jonas K. Limbrock, Christin Hilbich, Christian Hauck, and Andreas Kemna. "A monitoring system for spatiotemporal electrical self-potential measurements in cryospheric environments." Geoscientific Instrumentation, Methods and Data Systems 9, no. 2 (August 5, 2020): 317–36. http://dx.doi.org/10.5194/gi-9-317-2020.

Full text
Abstract:
Climate-induced warming increasingly leads to degradation of high-alpine permafrost. In order to develop early warning systems for imminent slope destabilization, knowledge about hydrological flow processes in the subsurface is urgently needed. Due to the fast dynamics associated with slope failures, non- or minimally invasive methods are required for inexpensive and timely characterization and monitoring of potential failure sites to allow in-time responses. These requirements can potentially be met by near-surface geophysical methods such as electrical resistivity tomography (ERT), ground-penetrating radar (GPR), various seismic methods, and self-potential (SP) measurements. While ERT and GPR are primarily used to detect lithological subsurface structure and liquid water/ice content variations, SP measurements are sensitive to active water flow in the subsurface. Combined, these methods provide huge potential to monitor the dynamic hydrological evolution of permafrost systems. However, while conceptually simple, the technical application of the SP method in high-alpine mountain regions is challenging, especially if spatially resolved information is required. We report here on the design, construction, and testing phase of a multi-electrode SP measurement system aimed at characterizing surface runoff and meltwater flow on the Schilthorn, Bernese Alps, Switzerland. Design requirements for a year-round measurement system are discussed; the hardware and software of the constructed system, as well as test measurements, are presented, including detailed quality-assessment studies. On-site noise measurements and a laboratory experiment on freezing and thawing characteristics of the SP electrodes provide supporting information.
It was found that a detailed quality assessment of the measured data is important for such challenging field site operations, requiring adapted measurement schemes to allow for the extraction of robust data in light of an environment highly contaminated by anthropogenic and natural noise components. Finally, possible short- and long-term improvements to the system are discussed and recommendations for future installations are developed.
APA, Harvard, Vancouver, ISO, and other styles
46

Ismail, Ahmed, Mohamed Ezzeldin, Wael El-Dakhakhni, and Michael Tait. "Blast load simulation using conical shock tube systems." International Journal of Protective Structures 11, no. 2 (June 28, 2019): 135–58. http://dx.doi.org/10.1177/2041419619858098.

Full text
Abstract:
With the increased frequency of accidental and deliberate explosions, evaluating the response of civil infrastructure systems to blast loading has been attracting the interest of the research and regulatory communities. However, given the high cost and the complex safety and logistical issues associated with field explosives testing, North American blast-resistant construction standards (e.g. ASCE 59-11 and CSA S850-12) recommend the use of shock tubes to simulate blast loads and evaluate relevant structural response. This study first aims at developing a simplified two-dimensional axisymmetric shock tube model, implemented in ANSYS Fluent, a computational fluid dynamics package, and then validating the model using the classical solution of Sod's shock tube problem, as well as available shock tube experimental test results. Subsequently, the developed model is compared to a more complex three-dimensional model, and the results show that there is negligible difference between the two models for axisymmetric shock tube performance simulation; however, the three-dimensional model is necessary to simulate non-axisymmetric shock tubes. Following the model validation, extensive analyses are performed to evaluate the influence of shock tube design parameters (e.g. the driver section pressure and length and the expansion section length) on blast wave characteristics to facilitate a shock tube design that would generate shock waves similar to those experienced by civil infrastructure components under blast loads. The results show that the peak reflected pressure increases as the driver pressure increases, while a decrease in the expansion length increases the peak reflected pressure. In addition, the positive phase duration increases as both the driver length and the expansion length are increased.
Finally, the developed two-dimensional axisymmetric model is used to optimize the dimensions of a physical large-scale conical shock tube system constructed for civil infrastructure component blast response evaluation applications. The capabilities of such shock tube system are further investigated by correlating its design parameters to a range of explosion threats identified by different hemispherical TNT charge weight and distance scenarios.
APA, Harvard, Vancouver, ISO, and other styles
47

Kostenko, Ye Ya, R. I. Ratushniy, I. M. Bogdan, O. Ya Bilynsky, and S. B. Kostenko. "Discrete-Event Simulation of the Triangular Relations of the Components of the Working Process of a Dentist Doctor in Endodontic Manipulations." Ukraïnsʹkij žurnal medicini, bìologìï ta sportu 6, no. 3 (June 26, 2021): 269–76. http://dx.doi.org/10.26693/jmbs06.03.269.

Full text
Abstract:
Today, one of the urgent tasks in modern dentistry is to increase the productivity of the dentist while maintaining his or her mental and physical health. To find a rational solution to this issue, much attention is paid to ergonomics, which is aimed at protecting the work of doctors, improving the efficiency and quality of their work, creating optimal working conditions for them, ensuring safety and comfort for patients, and developing the latest dental equipment. The purpose of the study is to describe the clinical and experimental forecasting of the influence of the ergonomics of dentists' work on the result of endodontic manipulations. Materials and methods. The methods used were targeted research methods, in particular Rapid Entire Body Assessment and Rapid Upper Limb Assessment, together with the software Tecnomatix Jack (Siemens) and StatPlusPro for Windows. The subject of research was a sample of 32 dentists: 17 male dentists (53.13%) and 15 female dentists (46.88%), who provide dental care at the University Dental Clinic, as well as at other clinical bases of the dental faculty of Uzhhorod National University. Results and discussion. Analysis of the final results of iatrogenic interventions shows a direct impact of ergonomics, justified by the presence of proven relationships between the integrated quality indicator of dental rehabilitation and the procedural-manual-associated components of the treatment process. Non-compliance with the basic principles of ergonomics during various dental manipulations leads to pathological changes in the musculoskeletal system of the dentist, but there is still a lack of data on the impact of these changes on quantitative and qualitative indicators of the effectiveness and predictability of treatment.
Considerable attention needs to be paid to the analysis of the influence of ergonomic features of dentists' work on the result of endodontic treatment of teeth and their post-endodontic restoration, taking into account the initial complexity of this type of manipulation. This result is significantly influenced by anatomical variations in the structure of the endodontium, limited visualization of the working field, the need to ensure mandatory isolation of the intervention and permanent control over the absence of contamination of endodontic structures during treatment, topographic features of individual teeth (molars, in particular), and the features and physical characteristics of mechanical (rotational) and manual endodontic instruments. Conclusion. Complications arising from endodontic treatment directly affect the prognosis of the dentition as a complex biomechanical system in cases of further post-endodontic restoration of teeth with direct or indirect restorations, as well as when such teeth are used as supports for future crowns and for removable and non-removable types of orthopedic structures. In cases of fixation of bridges on endodontically treated teeth, the emergence of iatrogenic-associated complications due to the biomechanical and biological properties of the endodontium is associated with a reduced prognosis of success and survival of the entire prosthetic structure as a whole, rather than of one independent tooth unit. Based on this, predicting the risks associated with errors and complications during endodontic treatment, as well as minimizing them through various types of preventive measures, remains an important scientific and practical issue not only of therapeutic but also of orthopedic dentistry.
APA, Harvard, Vancouver, ISO, and other styles
48

PAGE, REX. "Engineering Software Correctness." Journal of Functional Programming 17, no. 6 (November 2007): 675–86. http://dx.doi.org/10.1017/s095679680700634x.

Full text
Abstract:
Design and quality are fundamental themes in engineering education. Functional programming builds software from small components, a central element of good design, and facilitates reasoning about correctness, an important aspect of quality. Software engineering courses that employ functional programming provide a platform for educating students in the design of quality software. This pearl describes experiments in the use of ACL2, a purely functional subset of Common Lisp with an embedded mechanical logic, to focus on design and correctness in software engineering courses. Students find the courses challenging and interesting. A few acquire enough skill to use an automated theorem prover on the job without additional training. Many students, but not quite a majority, find enough success to suggest that additional experience would make them effective users of mechanized logic in commercial software development. Nearly all gain a new perspective on what it means for software to be correct and acquire a good understanding of functional programming.
APA, Harvard, Vancouver, ISO, and other styles
49

Mohd asri, Nurul anissa, ABDUL MALEK ABDUL HAMID, NORHASHIMAH SHAFFIAR, NOR AIMAN SUKINDAR, SHARIFAH IMIHEZRI SYED SHAHARUDDIN, and FARID SYAZWAN HASSAN. "APPLICATION OF HOUSE OF QUALITY IN THE CONCEPTUAL DESIGN OF BATIK WAX EXTRUDER AND PRINTER." IIUM Engineering Journal 23, no. 1 (January 4, 2022): 310–28. http://dx.doi.org/10.31436/iiumej.v23i1.1842.

Full text
Abstract:
Malaysian batik production is dominated by two techniques, known as hand-drawn batik (batik tjanting) and stamp batik (batik block). In comparison to batik block, the more popular batik tjanting takes a longer time to produce. A Standardized Nordic Questionnaire (SNQ) for musculoskeletal symptom examination involving batik artisans in Kelantan and Terengganu identified high rates of musculoskeletal disorders in respondents due to their working posture during the batik tjanting process. It was also observed that the number of workers and artisans willing to participate in the traditional batik industry is in decline. These problems have motivated a systematic Quality Function Deployment approach to facilitate the decision-making process for the conceptual design of an automatic batik printer. In this study, the house of quality (HOQ) was applied to identify the critical features for a batik printer based on the voice of the customer (VOC). A survey rating the importance of VOC items on an 8-point Likert scale revealed that the batik practitioners' topmost priority for the batik printer was the 'ability to adjust and maintain the temperature of wax' (17.54%), while the non-batik practitioners chose the 'ability to deliver a variety of complex designs' (15.94%). The least required feature was related to the size of the batik printer. The mapping between customer requirements (VOC) and technical requirements identified the extruder design (21.3%), the heating element (18%), and the nozzle diameter (17.8%) as the most critical components for the batik printer. Several conceptual designs of the extrusion unit, a cartesian-based batik printer, and 2D image conversion using open-source software were proposed at the end of this work.
APA, Harvard, Vancouver, ISO, and other styles
50

Oh, Hyun-Shik, Dohyung Kim, and Sunju Lee. "Review on the Quality Attributes of an Integrated Simulation Software for Weapon Systems." Journal of the Korea Institute of Military Science and Technology 24, no. 4 (August 5, 2021): 408–17. http://dx.doi.org/10.9766/kimst.2021.24.4.408.

Full text
Abstract:
This paper describes the quality attributes of an integrated simulation software for weapon systems named Advanced distributed simulation environment (AddSIM). AddSIM is developed as a key enabler for Defense Modeling & Simulation (M&S) systems, which simulate battlefields and are used for battle experiments, analyses, military exercises, training, etc. AddSIM shall provide a standard simulation framework for the next generation of Defense M&S systems. Therefore, AddSIM shall satisfy not only functional but also quality requirements such as availability, modifiability, performance, testability, and usability. AddSIM consists of operating software of hierarchical components, including a graphical user interface, simulation engines, and support services (natural environment model, math utility, etc.), and separate weapon system models executable on the operating software. The relation between software architectures and their quality attributes is summarized from previous work, and the AddSIM architecture and its achievements with respect to quality attributes are reviewed.
APA, Harvard, Vancouver, ISO, and other styles
