Journal articles on the topic 'Computer software industry Quality control'

Consult the top 50 journal articles for your research on the topic 'Computer software industry Quality control.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Ericsson, Mikael, Dahniel Johansson, and David Stjern. "AI-Based Quality Control of Wood Surfaces with Autonomous Material Handling." Applied Sciences 11, no. 21 (October 25, 2021): 9965. http://dx.doi.org/10.3390/app11219965.

Full text
Abstract:
The theory and applications of Smart Factories and Industry 4.0 are increasingly entering industry. It is common in industry to start by converting selected parts of production to this new paradigm rather than converting whole production lines all at once. In Europe and Sweden, recent political decisions have been taken to reach greenhouse gas emission reduction targets. One possible solution is to replace concrete in buildings with Cross Laminated Timber. In recent years, equipment and software that used to be custom made for a specific task have become cheaper and can be adapted to fit more processes than was previously possible. This, in combination with lessons learned from the automotive industry, makes it possible to take the necessary steps and start redesigning and building tomorrow's automated and flexible production systems in the wood industry. This paper presents a proof of concept of an automated inspection system for wood surfaces, in which concepts found in Industry 4.0, such as the industrial Internet of Things (IIoT), the smart factory, flexible automation, artificial intelligence (AI), and cyber-physical systems, are utilized. The inspection system comprises, among other things, off-the-shelf software and hardware, open-source software, and standardized, modular, and mobile process modules. The design of the system is conducted with future expansion in mind, so that new parts and functions can be added as well as removed.
APA, Harvard, Vancouver, ISO, and other styles
2

Elgrably, Isaac Souza, and Sandro Ronaldo Bezerra Oliveira. "A Quasi-Experimental Evaluation of Teaching Software Testing in Software Quality Assurance Subject during a Post-Graduate Computer Science Course." International Journal of Emerging Technologies in Learning (iJET) 17, no. 05 (March 14, 2022): 57–86. http://dx.doi.org/10.3991/ijet.v17i05.25673.

Full text
Abstract:
Software testing is regarded as a key activity in the software development cycle, as it helps information technology professionals to design good-quality software. Thus, it is an essential activity for the software industry, although, with all its nuances, learning about it is still not given high priority at the academic level. The purpose of this work is to investigate a teaching strategy for software testing which involves acquiring academic skills within a curriculum based on active teaching methodologies. A teaching model was designed to coordinate the different areas of a subject, and a controlled quasi-experiment was then carried out in a post-graduate course to evaluate the application of this model. The results obtained demonstrate that there was a considerable learning gain in the experimental group that adopted the teaching approach when compared with the control group that relied on a traditional approach. Student's t-test was employed to determine the learning efficiency.
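The comparison described above rests on an independent-samples Student's t-test. The cited study's data and scripts are not reproduced here; the following is only a minimal sketch, with made-up scores, of how such a two-group comparison is typically computed (assuming Python with SciPy available).

```python
# Illustrative only: hypothetical post-test scores, not data from the cited study.
from scipy import stats

experimental = [78, 85, 90, 72, 88, 95, 81, 79, 92, 84]  # active-methodology group
control      = [65, 70, 74, 68, 72, 77, 69, 66, 73, 71]  # traditional-approach group

# Independent two-sample Student's t-test (equal variances assumed).
t_stat, p_value = stats.ttest_ind(experimental, control, equal_var=True)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A small p-value (e.g. < 0.05) would indicate a statistically significant
# difference in mean scores between the two groups.
```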
APA, Harvard, Vancouver, ISO, and other styles
3

Ríos-Reina, Rocío, Daniel Caballero, Silvana M. Azcarate, Diego L. García-González, Raquel M. Callejón, and José M. Amigo. "VinegarScan: A Computer Tool Based on Ultraviolet Spectroscopy for a Rapid Authentication of Wine Vinegars." Chemosensors 9, no. 11 (October 22, 2021): 296. http://dx.doi.org/10.3390/chemosensors9110296.

Full text
Abstract:
Ultraviolet-visible (UV-vis) spectroscopy has shown successful results in recent years for characterizing and classifying wine vinegar according to its quality, particularly vinegars with a protected designation of origin (PDO). Owing to these promising results, together with the simplicity, price, speed, and portability of this technique and its ability to create robust hierarchical classification models, the objective of this work was the development of a computer tool, named VinegarScan, which uses UV-vis spectra to perform quality control and authentication of wine vinegar in a quick and user-friendly way. The software is based on an open-source GUI created in C++ and applies several data mining algorithms (e.g., decision trees and classification algorithms) to UV-vis spectra. It achieved satisfactory prediction results with the available analytical UV-vis data. The future aim is to combine the VinegarScan tool with a portable UV-vis device that could be used by control bodies of the wine vinegar industry to achieve clear differentiation from competitors and to avoid fraud.
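VinegarScan itself is a C++ GUI and its code is not shown in the abstract. Purely as an illustration of the kind of data-mining step it describes, the sketch below trains a decision tree on synthetic "spectra"; the wavelength grid, class structure and parameters are invented for the example, not taken from the cited work.

```python
# Illustrative sketch only: synthetic "spectra", not the VinegarScan software or its data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n_wavelengths = 150                      # e.g. absorbances sampled across the UV-vis range

def synth_spectrum(peak):
    """Generate a noisy Gaussian-shaped absorbance curve around a class-specific peak."""
    x = np.arange(n_wavelengths)
    return np.exp(-((x - peak) ** 2) / 200.0) + rng.normal(0, 0.05, n_wavelengths)

# Two hypothetical vinegar classes (e.g. two protected designations of origin).
X = np.vstack([synth_spectrum(60) for _ in range(50)] +
              [synth_spectrum(90) for _ in range(50)])
y = np.array([0] * 50 + [1] * 50)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_train, y_train)
print("hold-out accuracy:", clf.score(X_test, y_test))
```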
APA, Harvard, Vancouver, ISO, and other styles
4

Kállai, Viktória, Gábor L. Szepesi, and Péter Mizsey. "Dynamic Simulation Control in a Cryogenic Distillation Column." Pollack Periodica 15, no. 3 (November 7, 2020): 91–100. http://dx.doi.org/10.1556/606.2020.15.3.9.

Full text
Abstract:
The chemical industry has a high demand for ethylene, especially of high quality. This paper discusses dynamic simulation models of an ethane-ethylene high-pressure cryogenic rectification column built with the Unisim Design process simulator software. Distillation is one of the most essential technologies in the chemical industry, so it is important that the operation of the process can be modeled not only in steady-state mode but also dynamically. The goal of this study is to run simulations with system control and to investigate the effect of disturbances on the behavior of the columns.
APA, Harvard, Vancouver, ISO, and other styles
5

Gordin, M. V., G. S. Ivanova, A. V. Proletarsky, and M. V. Fetisov. "Adaptive Modelling System as a Unified Platform for Industry-Specific CAD Systems." Mekhatronika, Avtomatizatsiya, Upravlenie 23, no. 11 (November 3, 2022): 563–69. http://dx.doi.org/10.17587/mau.23.563-569.

Full text
Abstract:
The risks associated with the isolated design of complex software systems within individual industries are analyzed, where not only the same thing is often done, but also the quality of the design suffers due to incomplete competence of the implementers. The approach of dividing competence and responsibility in complex software development by introducing an additional domain-specific layer of interaction between the software developer and the subject area specialists is discussed. The use of an adaptive modelling system as a tool for such separation is proposed. It is shown that the use of adaptive modelling as a common development platform for industry-specific CAD will not only improve the quality of production design in different industries, but will also simplify the design of production in related fields. Finally, it is shown that the use of a common platform will avoid the costs associated with the trend towards simplification and atomization of software developed in our country in the face of sanctions and the degradation of global connections.
APA, Harvard, Vancouver, ISO, and other styles
6

Zhu, Xiaoyang, and Yangjian Ji. "A digital twin–driven method for online quality control in process industry." International Journal of Advanced Manufacturing Technology 119, no. 5-6 (January 5, 2022): 3045–64. http://dx.doi.org/10.1007/s00170-021-08369-5.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Nikitina, M. A., V. A. Pchelkina, and O. A. Kuznetsova. "Technological solutions for intelligent data processing in the food industry." Proceedings of the Voronezh State University of Engineering Technologies 80, no. 2 (October 2, 2018): 256–63. http://dx.doi.org/10.20914/2310-1202-2018-2-256-263.

Full text
Abstract:
The article is devoted to the possibilities of applying artificial neural networks (ANN), which are mathematical models, as well as their software or hardware implementations, built on the principle of the organization and functioning of the nerve cell networks of a living organism. Convolutional neural networks are arranged like the visual cortex of the brain and have achieved great success in image recognition; they are able to concentrate on a small area and highlight important features in it. The widespread use of ANN in medicine is noted, for example for the evaluation of radiographs and for estimating blood pressure and body mass index of patients from the analysis of their retina. The use of ANN in the food industry for incoming quality control of raw materials is promising. In world practice, various methods of remote control of raw materials are used; for this purpose, mainly ultrasonic scanning devices are employed. Such devices and analysis systems assess raw materials by the ratio of meat tissues (muscle, connective, fat) in the carcass or half-carcass without affecting the tissue structure, but they do not assess quality at the cellular (microstructural) level. It is established that the structure of muscle (the diameter of muscle fibers, the preservation of cellular elements, the porosity of the tissue, the integrity of muscle fibers) reflects the quality of the raw material and its thermal state. Our work has begun on the creation of an expert system for quality control of raw meat at the microstructural level using such modern intelligent technologies as ANN and computer vision. This direction is relevant and socially significant for the development of the meat industry, as it will significantly speed up the analysis of raw meat quality in the research laboratories of meat processing enterprises and testing centers and improve the objectivity of the results.
APA, Harvard, Vancouver, ISO, and other styles
8

Levytskyi, Volodymyr. "The New Approach of Using Image and Range Based Methods for Quality Control of Dimension Stone." Reports on Geodesy and Geoinformatics 103, no. 1 (June 27, 2017): 66–77. http://dx.doi.org/10.1515/rgg-2017-0006.

Full text
Abstract:
The basis for the quality control of commodity dimension stone blocks in the mining industry is the study of fracturing. The identification of fracturing in rock masses is one of the most important aspects of rock mass modelling. Traditional methods for determining the properties of fracturing are difficult and hazardous. This paper describes a new approach to fracture identification, based on image and range data, which is realized by image processing and special software. The article describes a method using new computer algorithms that allows the automated identification and calculation of fracturing parameters. Different digital filters for image processing and mathematical dependences are analyzed. The digital imaging technique has the potential to be used in real-time applications. The purpose of this paper is the accurate and fast mapping of fracturing in some walls of the Bukinsky gabbro deposit.
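The abstract does not list the specific filters or algorithms used. As a generic illustration of the kind of image-processing step involved, the sketch below applies Gaussian smoothing and Canny edge detection to a synthetic grey-scale image and derives a crude edge-density figure; it is not the cited method, and the parameters are arbitrary.

```python
# Illustrative sketch only: Canny edge detection as one common fracture-highlighting
# filter; the cited paper's actual algorithms and parameters are not reproduced here.
import cv2
import numpy as np

# Hypothetical input: in practice this would be cv2.imread("rock_face.jpg", cv2.IMREAD_GRAYSCALE).
image = np.full((200, 200), 180, dtype=np.uint8)
cv2.line(image, (20, 30), (180, 170), 60, 2)     # draw a dark synthetic "fracture"

blurred = cv2.GaussianBlur(image, (5, 5), 0)     # suppress texture noise
edges = cv2.Canny(blurred, 50, 150)              # low/high hysteresis thresholds

# A crude fracturing indicator: the fraction of pixels marked as edges.
edge_density = np.count_nonzero(edges) / edges.size
print(f"edge density: {edge_density:.3%}")
```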
APA, Harvard, Vancouver, ISO, and other styles
9

Mineo, C., M. Vasilev, B. Cowan, C. N. MacLeod, S. G. Pierce, C. Wong, E. Yang, R. Fuentes, and E. J. Cross. "Enabling robotic adaptive behaviour capabilities for new Industry 4.0 automated quality inspection paradigms." Insight - Non-Destructive Testing and Condition Monitoring 62, no. 6 (June 1, 2020): 338–44. http://dx.doi.org/10.1784/insi.2020.62.6.338.

Full text
Abstract:
The seamless integration of industrial robotic arms with server computers, sensors and actuators can revolutionise the way in which automated non-destructive testing (NDT) is performed and conceived. Achieving effective integration and realising the full potential of robotic systems presents significant challenges, since robots, sensors and end-effector tools are often not necessarily designed to be put together and form a holistic system. This paper presents recent breakthroughs, opening up new scenarios for the inspection of product quality in advanced manufacturing. Many years of research have brought to software platforms the ability to integrate external data acquisition instrumentation with industrial robots to improve the inspection speed, accuracy and repeatability of NDT. Robotic manipulators have typically been operated by predefined tool-paths generated through offline path-planning software applications. Recent developments pave the way to data-driven autonomous robotic inspections, enabling real-time path planning and adaptive control. This paper presents a toolbox with highly efficient algorithms and software functions, developed to be used through high-level programming language platforms (for example MATLAB, LabVIEW and Python) and/ or integrated within low-level language (for example C# and C++) applications. The use of the toolbox can speed up the development and the robust integration of new robotic NDT systems with real-time adaptive capabilities and is compatible with all KUKA robots with six degrees of freedom (DOF), which are equipped with the Robot Sensor Interface (RSI) software add-on. The paper describes the architecture of the toolbox and shows two application examples, where performance results are provided. The concepts described in the paper are aligned with the emerging Industry 4.0 paradigms and have wider applicability beyond NDT.
APA, Harvard, Vancouver, ISO, and other styles
10

Knoll, Carsten, and Robert Heedt. "Tool-based Support for the FAIR Principles for Control Theoretic Results: The "Automatic Control Knowledge Repository"." SYSTEM THEORY, CONTROL AND COMPUTING JOURNAL 1, no. 1 (June 30, 2021): 56–67. http://dx.doi.org/10.52846/stccj.2021.1.1.11.

Full text
Abstract:
In 2016 a collection of guiding principles for the management of scientific data was proposed by a consortium of scientists and organizations under the acronym FAIR (Findability, Accessibility, Interoperability, Reusability). Like many other disciplines, control theory is also affected by the (mostly unintended) disregard of these principles and to some degree also suffers from a reproducibility crisis. The specific situation for this discipline, however, is related more to software than to classical numerical data. In particular, since computational methods like simulation, numeric approximation or computer algebra play an important role, the reproducibility of results relies on implementation details, which are typically out of scope for written papers. While some publications do reference the source code of the respective software, this is by far not standard in industry and academia. Additionally, having access to the source code does not imply reproducibility due to dependency issues w.r.t. hardware and software components. This paper proposes a tool-based approach consisting of four components to mitigate the problem: a) an open repository with a suitable data structure to publish formal problem specifications and problem solutions (each represented as source code) along with descriptive metadata, b) a web service that automatically checks the solution methods against the problem specifications, plus auxiliary software for local testing, c) a computational ontology which allows semantic tagging and sophisticated querying of the entities in the repository and d) a peer-oriented process scheme to organize both the contribution process to that repository and formal quality assurance.
APA, Harvard, Vancouver, ISO, and other styles
11

Warszawski, A. "Robots in the construction industry." Robotica 4, no. 3 (July 1986): 181–88. http://dx.doi.org/10.1017/s026357470000936x.

Full text
Abstract:
SUMMARY: Robots have considerable potential for application on the building site; they can adapt to varying tasks, move and interact with the environment. The building process may be restructured in such a way that a majority of tasks would be performed by four configurations of robots: an assembling robot for handling large structural components, a general-purpose robot for interior finishing works, and an exterior-wall robot and a floor-finishing robot for finishing large vertical and horizontal surfaces, respectively. A preliminary feasibility study reveals that such robots may be justified economically, especially under conditions which reduce human productivity or require high quality of work.
APA, Harvard, Vancouver, ISO, and other styles
12

Nicolò, V. "FMS and the main car industry." Robotica 3, no. 3 (September 1985): 137–45. http://dx.doi.org/10.1017/s026357470000905x.

Full text
Abstract:
SUMMARY: The attainment of a breakeven for lower outputs, the diversification of models in order to meet market demand and the improvement of quality are accepted as strategic objectives for the automotive industry. The role of flexible manufacturing with reference to these objectives is illustrated. The steps through which the new technology is introduced are discussed and some production systems already installed are described within the given framework.
APA, Harvard, Vancouver, ISO, and other styles
13

Pajpach, Martin, Oto Haffner, Erik Kučera, and Peter Drahoš. "Low-Cost Education Kit for Teaching Basic Skills for Industry 4.0 Using Deep-Learning in Quality Control Tasks." Electronics 11, no. 2 (January 12, 2022): 230. http://dx.doi.org/10.3390/electronics11020230.

Full text
Abstract:
The main purposes of this paper are to offer a low-cost solution that can be used in engineering education and to address the challenges that Industry 4.0 brings with it. In recent years, there has been a great shortage of engineering experts, and it is therefore necessary to educate the next generation of experts; however, the hardware and software tools needed for education are often expensive, access to them is sometimes difficult and, most importantly, they change and evolve rapidly. The use of cheaper hardware and free software therefore helps to create a reliable and suitable environment for the education of engineering experts. Based on an overview of related works dealing with low-cost teaching solutions, we present in this paper our own low-cost Education Kit, for which the price can be as low as approximately EUR 108 per kit, for teaching the basic skills of deep learning in quality-control tasks on inspection lines. The solution is based on Arduino, TensorFlow and Keras and a smartphone camera, and is assembled using a LEGO kit. The results of the work serve as inspiration for educators and educational institutions.
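The kit pairs Arduino-controlled hardware with a TensorFlow/Keras classifier. The authors' network architecture is not given in the abstract, so the following is only a minimal sketch of a binary pass/defect image classifier in Keras; the input size and layer sizes are assumptions made for the example.

```python
# Minimal sketch of a binary "pass/defect" image classifier in Keras, in the spirit of
# the kit described above; shapes and layer sizes are arbitrary assumptions, not the
# authors' published configuration.
import tensorflow as tf
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(96, 96, 3)),          # e.g. downscaled smartphone-camera frames
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),    # probability that the part is defective
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
# Training would then use e.g. tf.keras.utils.image_dataset_from_directory("data/")
# with one sub-folder of "good" and one of "defective" part images.
```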
APA, Harvard, Vancouver, ISO, and other styles
14

Muzyka, M. Y., I. G. Blagoveshchensk, V. G. Blagoveshchensk, V. V. Golovin, M. M. Blagoveshchensk, and I. A. Kachura. "Technical solutions for the implementation of a software and hardware complex for food quality management." Proceedings of the Voronezh State University of Engineering Technologies 83, no. 4 (December 22, 2021): 49–56. http://dx.doi.org/10.20914/2310-1202-2021-4-49-56.

Full text
Abstract:
The problem of technical solutions for the implementation of a software and hardware complex for food quality management is considered. A review and analysis of existing modern control systems is presented, which made it possible to conclude that today food enterprises need new effective solutions using highly efficient intelligent technologies. An analysis of the possibility of intellectualization of the food production quality management system is carried out. The main tasks of this system are presented. It is shown that a practical basis for the implementation of this problem can be the creation of a software and hardware complex for an automated food quality control system using artificial intelligence technologies, which includes neural network technologies, computer vision systems, simulation modeling and an effective combination of hybrid methods and technologies in its arsenal. Methods, algorithms and technologies for the development of the investigated software and hardware complex of an intelligent automated food quality control system are analyzed. The developed generalized functional structure of such an intelligent system and the main stages of its implementation are presented. The main types of support for this system have been developed: information, mathematical and software. The main stages of decision-making on the quality of finished food products have been developed. The necessary technical means are recommended for the implementation of the system. For the practical implementation of the developed intelligent system, the CP1EE14DRA controller from Omron was chosen: a modular programmable controller. As an operator's workstation, a choice should be made in favor of Siemens products: the SIMATIC Panel PC. For the tasks of data storage and implementation of calculations, a conventional personal server equipped with a powerful processor, for example an Intel Core i7, has been proposed. It is shown that the implementation of the developed intelligent automated food quality management system makes food industry enterprises more efficient and safe.
APA, Harvard, Vancouver, ISO, and other styles
15

Tudor, Nicolae, Vasile Claudiu Kifor, and Constantin Oprean. "Using QSPS in Developing and Realization of a Production Line in Automotive Industry." International Journal of Computers Communications & Control 5, no. 5 (December 1, 2010): 953. http://dx.doi.org/10.15837/ijccc.2010.5.2259.

Full text
Abstract:
Using the QSPS (Quality System for Production Software) for industrial projects, and not only for them, has led to the accurate running of the production line from the beginning of SOP (Start of Production). This paper presents how the QSPS was applied at one of the strongest European automotive companies. By using this system, significant cost savings and quality improvements can be observed. The content of this paper shows step by step how to use the QSPS to integrate a production line into the traceability system of a big company in the automotive industry. The production line involved contains 56 pieces of production equipment, which the product has to pass through before being packed and delivered to the customer. The line is controlled by this traceability system, so the impact of the system on product quality is very high. The structure of this system contains 7 steps. All of these steps are followed and executed in each environment (test, pilot and production).
APA, Harvard, Vancouver, ISO, and other styles
16

Li, Jing Min. "On Current Situation and Development of our Industrial Automation Control Technology." Advanced Materials Research 383-390 (November 2011): 5697–99. http://dx.doi.org/10.4028/www.scientific.net/amr.383-390.5697.

Full text
Abstract:
Industrial automation control technology is a comprehensive technology, mainly comprising industrial automation hardware, software and systems that carry out inspection, control, optimization, scheduling, management and decision-making, and that increase yield, improve quality, reduce cost and ensure safety. As the most important technology in 20th-century modern manufacturing, industrial automation control mainly solves the problems of production efficiency and consistency. Although an automation system does not itself directly create production benefit for the enterprise, it has an obvious effect on process improvement. Our industrial automation control development path has mostly relied on the introduction of complete sets of equipment, together with their digestion and absorption for secondary development and application. Currently, our industrial automation control technology, industry and applications have developed greatly, and an industrial computer system has been formed. At present, industrial automation control technology is developing in the direction of intelligence and network integration.
APA, Harvard, Vancouver, ISO, and other styles
17

SOINI, ANTTI J. "MACHINE VISION 1992–1996, TECHNOLOGY PROGRAM IN FINLAND." International Journal of Pattern Recognition and Artificial Intelligence 10, no. 01 (February 1996): 3–13. http://dx.doi.org/10.1142/s0218001496000025.

Full text
Abstract:
Machine vision technology has attracted strong interest among Finnish research organizations, which has resulted in many innovative products for industry. Despite this, users were very skeptical towards machine vision and its robustness in harsh industrial environments. Therefore the Technology Development Centre, TEKES, which funds technology-related research and development projects in universities and individual companies in Finland, decided to start a national technology program, "Machine Vision 1992–1996". Led by industry, the program boosts research in machine vision technology and seeks to put the research results to work in practical industrial applications. The emphasis is on nationally important, demanding applications. The program will create new business for machine vision producers and encourage the process and manufacturing industry to take advantage of this new technology. So far 60 companies and all major universities and research centers in Finland are working on our forty different projects. The key themes are Process Control, Robot Vision and Quality Control.
APA, Harvard, Vancouver, ISO, and other styles
18

Areias, Sérgio, Cruz da, Rangel Henriques, and Sousa Pinto. "GammaPolarSlicer." Computer Science and Information Systems 8, no. 2 (2011): 477–99. http://dx.doi.org/10.2298/csis110107006a.

Full text
Abstract:
In software development, it is often desirable to reuse existing software components. This has been recognized since 1968, when Douglas McIlroy of Bell Laboratories proposed basing the software industry on reuse. Despite the failures in practice, many efforts have been made to make this idea successful. In this context, we address the problem of reusing annotated components as a rigorous way of assuring the quality of the application under construction. We introduce the concept of caller-based slicing as a way to certify that the integration of an annotated component with a contract into a legacy system will preserve the behavior of the former. To complement these efforts and the benefits of the slicing techniques, there is also a need to find an efficient way to visualize the annotated components and their slices. To take full advantage of visualization, it is crucial to combine the visualization of the control/data flow with the textual representation of source code. To attain this objective, we extend the notion of System Dependence Graph and slicing criterion.
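Caller-based slicing as defined in the paper operates on an extended System Dependence Graph. As a toy illustration of the underlying idea only, a slice computed as the set of statements a criterion transitively depends on, the sketch below performs backward reachability over a hand-built dependence graph with networkx; it is not the GammaPolarSlicer algorithm, and the statements are invented.

```python
# Toy illustration only: a program slice computed as backward reachability over a
# hand-built dependence graph; not the caller-based slicing algorithm or the
# System Dependence Graph implementation of the cited work.
import networkx as nx

# Nodes are statement labels; an edge u -> v means "v depends on u"
# (data or control dependence).
g = nx.DiGraph()
g.add_edges_from([
    ("x = read()",  "y = x + 1"),
    ("y = x + 1",   "assert y > 0"),   # contract-like annotation on the component
    ("z = 42",      "print(z)"),
])

criterion = "assert y > 0"
# Backward slice: every statement the criterion transitively depends on.
backward_slice = nx.ancestors(g, criterion) | {criterion}
print(sorted(backward_slice))   # -> ['assert y > 0', 'x = read()', 'y = x + 1']
```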
APA, Harvard, Vancouver, ISO, and other styles
19

Espinha Gasiba, Tiago, Ulrike Lechner, and Maria Pinto-Albuquerque. "Cybersecurity Challenges in Industry: Measuring the Challenge Solve Time to Inform Future Challenges." Information 11, no. 11 (November 16, 2020): 533. http://dx.doi.org/10.3390/info11110533.

Full text
Abstract:
Cybersecurity vulnerabilities in industrial control systems have been steadily increasing over the last few years. One possible way to address this issue is through raising the awareness (through education) of software developers, with the intent to increase software quality and reduce the number of vulnerabilities. CyberSecurity Challenges (CSCs) are a novel serious game genre that aims to raise industrial software developers’ awareness of secure coding, secure coding guidelines, and secure coding best practices. An important industry-specific requirement to consider in designing these kinds of games is related to the whole event’s duration and how much time it takes to solve each challenge individually—the challenge solve time. In this work, we present two different methods to compute the challenge solve time: one method based on data collected from the CSC dashboard and another method based on a challenge heartbeat. The results obtained by both methods are presented; both methods are compared to each other, and the advantages and limitations of each method are discussed. Furthermore, we introduce the notion of a player profile, which is derived from dashboard data. Our results and contributions aim to establish a method to measure the challenge solve time, inform the design of future challenges, and improve coaching during CSC gameplay.
APA, Harvard, Vancouver, ISO, and other styles
20

Wen, Lin. "Determining the Degree of Characteristics for Internet of Healthcare Devices Using Fuzzy ANP." Scientific Programming 2021 (June 28, 2021): 1–11. http://dx.doi.org/10.1155/2021/9292496.

Full text
Abstract:
With the revolution in Internet and digital technology, every organization is adopting digital technologies to carry out its day-to-day activities. The Internet of Things (IoT) is a concept used to connect various devices over the Internet to increase production and quality of service, deliver huge amounts of data in seconds, and automate processes. IoT implementation in the health sector has changed the typical setup into a smart and intelligent one. With the "smart" and "intelligent" abilities of IoT devices such as sensors, and with the collaboration of humans and computers, physical processes can be monitored and, based on the received data, optimal decisions can ultimately be taken. IoT applications in healthcare will increase flexibility, patient care, quality of health, and control of diseases. As the IoT combines various heterogeneous devices, and inappropriate determination of the degree of characteristics of Internet of Healthcare (IoH) devices may affect the efficiency of services, this research was carried out using a decision support system: the fuzzy analytic network process (fuzzy ANP) was applied for optimal determination of the degree of characteristics of IoH devices based on their unique properties, which would greatly increase the efficiency of the industry. The experimental results are efficient and show the usefulness of the approach.
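The full fuzzy ANP procedure (fuzzification, supermatrix construction, limit matrix) is beyond a short example. The sketch below shows only the crisp priority-vector step shared by AHP/ANP, applied to a hypothetical pairwise-comparison matrix of three device characteristics; the matrix values and the characteristics themselves are invented, not taken from the cited study.

```python
# Illustrative sketch only: the crisp eigenvector/priority-vector step underlying
# AHP/ANP; the fuzzy extension and the full ANP supermatrix are not reproduced here.
import numpy as np

# Saaty-style pairwise comparisons of, say, {security, latency, battery life}.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                    # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                   # normalised priority weights
print("priority weights:", np.round(w, 3))

# Consistency check (RI = 0.58 is Saaty's random index for a 3x3 matrix).
ci = (eigvals.real[k] - len(A)) / (len(A) - 1)
print("consistency ratio:", round(ci / 0.58, 3))
```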
APA, Harvard, Vancouver, ISO, and other styles
21

Li, Guozhu. "Improvement and Experimental Explore on Coordinated Control of Kinematic Mechanism of FDM 3D Printer." Advances in Multimedia 2022 (September 16, 2022): 1–9. http://dx.doi.org/10.1155/2022/4422616.

Full text
Abstract:
As the main component of the 3D printing industry, the fused deposition process covers all aspects of the industry with its advantages of low R&D investment, high practicability, and open source programs. However, due to process problems, problems have arisen in terms of printing efficiency and molding quality. To this end, we designed a large-scale multinozzle FDM printing device using the high-current fused deposition (FDM) printing principle. The defects of small size, slow printing speed, and low precision are deeply studied, and the machine structure is optimized according to the structural strength analysis. In this paper, the theoretical design and static analysis of the overall mechanical part of the large-scale FDM device are carried out, and then, the selection of the movement organization structure and movement method is theoretically analyzed. A modular flow chart is designed for the control system to coordinate and control the parallel and precise operation of multiple nozzles, and the relationship function between the main controller, power driver, and heating module is designed. By modifying the firmware parameter command, we can find out the optimal method running on the platform and discuss the function usage of the slicing software in detail. According to the current problems of FDM printing equipment, various factors affecting printing speed were analyzed from the perspective of printing accuracy, and the process parameters of 3D printer were studied through orthogonal experiments. Speed, nozzle temperature, idling speed, and fill rate were studied, and the relationship between factors affecting printing speed and printing accuracy was obtained. Use a simple model print to measure the overall performance of your product. The stability of the system is verified by short-term and long-term printing tests. The analysis results show that the forming performance and stability of the large-scale FDM are improved significantly.
APA, Harvard, Vancouver, ISO, and other styles
22

Tang, Yongbin, and Xi Chen. "Software Development, Configuration, Monitoring, and Management of Artificial Neural Networks." Security and Communication Networks 2022 (April 14, 2022): 1–11. http://dx.doi.org/10.1155/2022/9122908.

Full text
Abstract:
With the increasing demand for software systems, the software development industry is also developing rapidly. With the development of information technology, the more functions of the software, the more valuable it is, so the function design of the software becomes more complicated and difficult. The design of software system functions is increasingly large and complex. Scientific and effective use of software configuration management can well deal with collaborative work problems such as version management and change control in the software development process. In the process of software development and configuration, there will always be many problems that are difficult to detect. For example, when inputting the program code, there are not always some letter or space errors, and these errors are difficult to detect in time. For this reason, we need to establish a monitoring and management system for software development. As a computing model of human brain neural network, the artificial neural network can play the role of monitoring and management when it is applied to software development and configuration, which provides support for the security and scientificity of software development and configuration systems. This study studies the role and effectiveness of an artificial neural network in the monitoring and management of software development and configuration and validates it through experiments. The experimental results show that the artificial neural network has a strong ability to identify the problems in the software development configuration, which can improve the software development efficiency by at least 20%. It can improve the quality of software development and then improve the life cycle of software.
APA, Harvard, Vancouver, ISO, and other styles
23

Sun, Bo, and Hui Fang Zhang. "Three Dimensional Design and Unfolding Method of Sheet Metal Parts." Advanced Materials Research 532-533 (June 2012): 208–12. http://dx.doi.org/10.4028/www.scientific.net/amr.532-533.208.

Full text
Abstract:
Sheet metal parts are widely used in various fields of the mechanical industry. Unfolding is the first step of sheet metal processing. Because the traditional manual way of producing sheet metal parts involves high labor intensity and low efficiency, the mode of production of sheet metal parts must be improved for quality and efficiency. Given the state of computer-aided design and the production needs of the sheet metal industry, the establishment of a sheet metal CAD system is proposed. At present, sheet metal systems are mostly set up through secondary development of three-dimensional software, so this paper establishes the sheet metal system through secondary development based on UG. This system concentrates on methods of parametric design and automatic unfolding of sheet metal parts. Parametric design and unfolding of sheet metal parts are accomplished through the establishment of an unfolding model and the writing of a control program. The principles of parametric design and the secondary development tools of UG are specifically discussed in this paper. The unfolding model is created through the establishment of the parameters, taking plate thickness into account.
APA, Harvard, Vancouver, ISO, and other styles
24

Chen, Xi, Yue Chen, Arun Sangaiah, Shouxi Luo, and Hongfang Yu. "MonLink: Piggyback Status Monitoring over LLDP in Software-Defined Energy Internet." Energies 12, no. 6 (March 25, 2019): 1147. http://dx.doi.org/10.3390/en12061147.

Full text
Abstract:
While software-defined networking (SDN) has been widely applied in various networking domains including datacenters, WANs (Wide Area Networks), QoS (Quality of Service) provisioning, service function chaining, etc., it also has foreseeable applications in energy internet (EI), which envisions an intelligent energy industry on the basis of (information) internet. Global awareness provided by SDN is especially useful in system monitoring in EI to achieve optimal energy transportation, sharing, etc. Link layer discovery protocol (LLDP) plays a key role in global topology discovery in software-defined energy internet when SDN is applied. Nevertheless, EI-related status information (power loads, etc.) is not collected during the LLDP-based topology discovery process initiated by the SDN controller, which makes the optimal decision making (e.g., efficient energy transportation and sharing) difficult. This paper proposes MonLink, a piggyback status-monitoring scheme over LLDP in software-defined energy internet with SDN-equipped control plane and data plane. MonLink extends the original LLDP by introducing metric type/length/value (TLV) fields so as to collect status information and conduct status monitoring in a piggyback fashion over LLDP during topology discovery simultaneously without the introduction of any newly designed dedicated status monitoring protocol. Several operation modes are derived for MonLink, namely, periodic MonLink, which operates based on periodic timeouts, proactive MonLink, which operates based on explicit API invocations, and adaptive MonLink, which operates sensitively and self-adaptively to status changes. Various northbound APIs are also designed so that upper layer network applications can make full use of the status monitoring facility provided by MonLink. Experiment results indicate that MonLink is a lightweight protocol capable of efficient monitoring of topological and status information with very low traffic overhead, compared with other network monitoring schemes such as sFlow.
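MonLink's metric TLVs are described only at a high level in the abstract. As an illustration of the general type/length/value mechanism it builds on, the sketch below packs an organizationally specific LLDP TLV (type 127) carrying a made-up load metric; the OUI, subtype and payload layout are assumptions, not MonLink's actual format.

```python
# Illustrative sketch only: how a custom (organizationally specific) LLDP TLV could
# carry a status metric as type/length/value bytes.  The exact layout, OUI and subtype
# used by MonLink are assumptions, not taken from the cited paper.
import struct

def make_org_specific_tlv(oui: bytes, subtype: int, payload: bytes) -> bytes:
    """Build an LLDP TLV: 7-bit type, 9-bit length, then the value field."""
    tlv_type = 127                               # organizationally specific TLV
    value = oui + bytes([subtype]) + payload
    header = (tlv_type << 9) | len(value)        # 16-bit TLV header
    return struct.pack("!H", header) + value

# Hypothetical metric: an 80.5 % link load, encoded as a float after a made-up OUI.
load_tlv = make_org_specific_tlv(b"\x00\x11\x22", subtype=1,
                                 payload=struct.pack("!f", 80.5))
print(load_tlv.hex())
```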
APA, Harvard, Vancouver, ISO, and other styles
25

Wang, Xiaofeng, and Zhengran Wang. "Analysis and Evaluation of Ecological Environment Monitoring Based on PIE Remote Sensing Image Processing Software." Journal of Robotics 2022 (October 4, 2022): 1–12. http://dx.doi.org/10.1155/2022/1716756.

Full text
Abstract:
With the continuous improvement of people’s demand for ecological environment quality, the research on ecological environment monitoring, analysis and evaluation had been paid more and more attention by relevant departments and personnel. Because the images collected by remote sensing technology were many and multi-source, the features extracted from remote sensing images using traditional methods had been difficult to meet the needs of related industry applications. Therefore, this paper made use of the advantages of PIE remote sensing image processing software in data analysis and processing, and put forward the research on ecological environment monitoring, analysis and evaluation methods. Firstly, on the basis of summarizing the concepts and related problems of ecological environment, this paper analyzed the processing methods of remote sensing data sources of ecological environment, and explained the evaluation standards and common methods of ecological environment. Secondly, the composition of PIE remote sensing image processing technology system and its application advantages were described, the common indicators and analysis methods of ecological environment monitoring were given, and the index system and model of ecological environment comprehensive evaluation were established. Finally, through the analysis of experimental cases, the results showed that the ecological environment monitoring analysis and evaluation method proposed in this paper was feasible. Compared with the traditional methods, the method proposed in this paper could objectively evaluate the ecological environment. This paper can not only provide support for the analysis and processing of remote sensing image data, but also provide an important reference for the application of remote sensing technology in the field of ecological environment.
APA, Harvard, Vancouver, ISO, and other styles
26

Farahmandpour, Zeinab, Mehdi Seyedmahmoudian, and Alex Stojcevski. "A Review on the Service Virtualisation and Its Structural Pillars." Applied Sciences 11, no. 5 (March 8, 2021): 2381. http://dx.doi.org/10.3390/app11052381.

Full text
Abstract:
Continuous delivery is an industry software development approach that aims to reduce the delivery time of software and increase the quality assurance within a short development cycle. The fast delivery and improved quality require continuous testing of the developed software service. Testing services are complicated and costly and postponed to the end of development due to unavailability of the requisite services. Therefore, an empirical approach that has been utilised to overcome these challenges is to automate software testing by virtualising the requisite services’ behaviour for the system being tested. Service virtualisation involves analysing the behaviour of software services to uncover their external behaviour in order to generate a light-weight executable model of the requisite services. There are different research areas which can be used to create such a virtual model of services from network interactions or service execution logs, including message format extraction, inferring control model, data model and multi-service dependencies. This paper reviews the state-of-the-art of how these areas have been used in automating the service virtualisation to make available the required environment for testing software. This paper provides a review of the relevant research within these four fields by carrying out a structured study on about 80 research works. These studies were then categorised according to their functional context as, extracting the message format, control model, data model and multi-service dependencies that can be employed to automate the service virtualisation activity. Based on our knowledge, this is the first structural review paper in service virtualisation fields.
APA, Harvard, Vancouver, ISO, and other styles
27

Ames, Arlo L., Elaine M. Hinman-Sweeney, and John M. Sizemore. "Automated Generation of Weld Path Trajectories." Journal of Ship Production 19, no. 03 (August 1, 2003): 147–50. http://dx.doi.org/10.5957/jsp.2003.19.3.147.

Full text
Abstract:
AUTOmated GENeration of Control Programs for Robotic Welding of Ship Structure (AUTOGEN) is software that automates the planning and compiling of control programs for robotic welding of ship structure. The software works by evaluating computer representations of the ship design and the manufacturing plan. Based on this evaluation, AUTOGEN internally identifies and appropriately characterizes each weld. Then it constructs the robot motions necessary to accomplish the welds and determines for each the correct assignment of process control values. AUTOGEN generates these robot control programs completely without manual intervention or edits except to correct wrong or missing input data. Most ship structure assemblies are unique or at best manufactured only a few times. Accordingly, the high cost inherent in all previous methods of preparing complex control programs has made robot welding of ship structures economically unattractive to the U.S. shipbuilding industry. AUTOGEN eliminates the cost of creating robot control programs. With programming costs eliminated, capitalization of robots to weld ship structures becomes economically viable. Robot welding of ship structures will result in reduced ship costs, uniform product quality, and enhanced worker safety. Sandia National Laboratories and Northrop Grumman Ship Systems worked with the National Shipbuilding Research Program to develop a means of automated path and process generation for robotic welding. This effort resulted in the AUTOGEN program, which has successfully demonstrated automated path generation and robot control. Although the current implementation of AUTOGEN is optimized for welding applications, the path and process planning capability has applicability to a number of industrial applications, including painting, riveting, and adhesive delivery.
APA, Harvard, Vancouver, ISO, and other styles
28

Ordabayeva, Gulzinat, Abdizhapar Saparbayev, Bibinur Kirgizbayeva, Gulzat Dzhsupbekova, and Nazira Rakhymbek. "Analysis of network security organization based on SD-WAN technology." Eastern-European Journal of Enterprise Technologies 5, no. 9 (113) (October 31, 2021): 56–69. http://dx.doi.org/10.15587/1729-4061.2021.242993.

Full text
Abstract:
A Software-Defined Network (SDN) on a Wide Area Network (WAN) is a computer network that is controlled and created by software. SD-WAN is an emerging research area that has received a lot of attention from industry and government. This technology offers tremendous opportunities to support the creation of consolidated data centers and secure networks. It is an innovation that allows the network to be monitored and programmed so that it can respond to network events caused by security breaches. This solution provides network security, offers a single network management console, and provides complete control over the network architecture. It also controls security in the cloud software-defined infrastructure (SDI), for example by dynamically changing the network configuration when forwarding packets, blocking, redirecting, changing Media Access Control (MAC) or Internet Protocol (IP) addresses, limiting the packet flow rate, etc. Using SD-WAN technology, it is possible to reduce the cost of dedicated bandwidth channels, achieve a high-quality Virtual Private Network (VPN), and automatically select channels for certain traffic. The main advantages of SD-WAN are the management of an unlimited number of devices from a single center and the reduced cost of deploying branch infrastructure. According to the results of the survey, 7% of respondents use SD-WAN for security solutions and 14% are at the piloting stage. As a result of the research, it was revealed that by 2024, to increase flexibility and support for cloud applications, more than 60% of SD-WAN customers will implement the SASE (Secure Access Service Edge) architecture, which is 30% more than in 2020; the main concepts are application security and cloud functions. Keywords: OpenFlow, Software-defined wide area network (SD-WAN), architecture, DDoS attack, WAN network
APA, Harvard, Vancouver, ISO, and other styles
29

Wu, Chunlei, and Yunhua Lin. "Intelligent Building Space Layout and Optimization Design Method Based on Biological Population Simulation." Journal of Sensors 2022 (September 20, 2022): 1–9. http://dx.doi.org/10.1155/2022/6246576.

Full text
Abstract:
Smart buildings can store various types of data, analyze data at a faster speed, and quickly provide users with the most accurate and high-quality intelligent services through technical means. The main purpose is to meet the needs of users, to provide communication and a series of knowledge services for all professional participants, and to provide comfortable working resources and working environment for all professional participants to ensure the smooth implementation of the project and meet the construction engineering requirements. In this study, the spatial layout and optimization of intelligent buildings are carried out based on biological population simulation, and the following conclusions are drawn: (1) The emergence of multicore or multicore graphics converters reduces the cost and popularity of high-performance software platforms, and provides faster computing speeds for applications. A new platform and implementation method have been developed. Comparing marine biopopulation models relies on the i4Ocean platform. The i4Ocean platform is a computer platform for seascape information and virtual reality applications, including view management, production engine, and MVC core framework. (2) The swarm model simulates the movement of fish in a virtual ocean. Determine their behavioral changes based on determined behavioral conditions (e.g., in position, position, and velocity). As everyone in the group moves, the entire group is monitored for behavior. (3) Intelligent buildings involve many aspects of the social machine, such as building construction, which not only affects architecture, but also design. Smart construction helps improve quality, shorten production time, save resources, and control costs. It is an important part of the structural transformation of the supply side of the construction industry, and an important tool for the structural adjustment and modernization of the construction industry and greenhouses. (4) X2View is like the most popular configuration king, with integrated functions. Project software is placed on the X2View and sent to the door or touch screen for processing and monitoring. In the driver configuration interface, you can set the application frame time of the monitoring node and the real-time measurement time of the record value recording time frame.
APA, Harvard, Vancouver, ISO, and other styles
30

Khin, Sabai, and Daisy Mui Hung Kee. "Factors influencing Industry 4.0 adoption." Journal of Manufacturing Technology Management 33, no. 3 (January 5, 2022): 448–67. http://dx.doi.org/10.1108/jmtm-03-2021-0111.

Full text
Abstract:
Purpose – The digital transformation towards Industry 4.0 (I4.0) has become imperative for manufacturers, as it makes them more flexible, agile and responsive to customers. This study aims to identify the factors influencing the manufacturing firms' decision to adopt I4.0 and develop a triadic conceptual model that explains this phenomenon. Design/methodology/approach – This study used a qualitative exploratory study design based on multiple case studies (n = 15) from the manufacturing industry in Malaysia by conducting face-to-face interviews. The data were analysed using NVivo. The conceptual model was developed based on grounded theory and deductive thematic analysis. Findings – Results demonstrate that driving, facilitating and impeding factors play influential roles in a firm's decision-making to adopt I4.0. The major driving factors identified are expected benefits, market opportunities, labour problem, customer requirements, competition and quality image. Furthermore, resources, skills and support are identified as facilitating factors and getting the right people, lack of funding, lack of knowledge, technical challenges, training the operators and changing the mindset of operators to accept new digital technologies are identified as impeding factors. Research limitations/implications – Due to its qualitative design and limited sample size, the findings of this study need to be supplemented by quantitative studies for enhanced generalizability of the proposed model. Practical implications – Knowledge of the I4.0 decision factors identified would help manufacturers in their decision to invest in I4.0, as they can be applied to balancing advantages and disadvantages, understanding benefits, identifying required skills and support and which challenges to expect. For policymakers, our findings identify important aspects of the ecosystem in need of improvement and how manufacturers can be motivated to adopt I4.0. Originality/value – This study lays the theoretical groundwork for an alternative approach for conceptualizing I4.0 adoption beyond UTAUT (Unified Theory of Acceptance and Use of Technology). Integrating positive and negative factors enriches the understanding of decision-making factors for I4.0 adoption.
APA, Harvard, Vancouver, ISO, and other styles
31

Heijmans, Ad. "The Right Mix." Mechanical Engineering 131, no. 03 (March 1, 2009): 46–48. http://dx.doi.org/10.1115/1.2009-mar-5.

Full text
Abstract:
This review explores the use of computational fluid dynamics (CFD) tools embedded in computer-aided design (CAD) software to create the right mix of gas and air for a wide range of applications. The new tools provide the ability to evaluate the performance of many potential alternatives in the initial stages of the design process. Early-stage analysis makes it possible to improve the performance of the product and resolve design problems quickly and before large sums have been spent on a design that must be changed. The review also discusses several best practices that can help ensure the accuracy of CFD gas and air mixing simulation. The utilization of native 3D data places a premium on the quality of the solid model. The newest generation of CFD software contains sophisticated automatic control functions that make it possible to converge to a solution in almost every application without the need for manual tuning. CFD simulation in the preliminary stages of the design of products involving gas mixing can save time and money. Best practices tuned for the requirements of a particular industry can help design engineers avoid analysis mistakes.
APA, Harvard, Vancouver, ISO, and other styles
32

Ruiz-Flores, Alberto, Araceli García, Antonio Pineda, María Brox, Andrés Gersnoviez, and Eduardo Cañete-Carmona. "Low-Cost Photoreactor to Monitor Wastewater Pollutant Decomposition." Sensors 23, no. 2 (January 10, 2023): 775. http://dx.doi.org/10.3390/s23020775.

Full text
Abstract:
Actually, the quality of water is one of the most important indicators of the human environmental impact, the control of which is crucial to avoiding irreversible damage in the future. Nowadays, in parallel to the growth of the chemical industry, new chemical compounds have been developed, such as dyes and medicines. The increasing use of these products has led to the appearance of recalcitrant pollutants in industrial wastewater, and even in the drinking water circuit of our populations. The current work presents a photoreactor prototype that allows the performance of experiments for the decomposition of coloured pollutants using photocatalysis at the laboratory scale. The design of this device included the study of the photometric technique for light emission and the development of a software that allows monitoring the dye degradation process. Open-source hardware platforms, such as Arduino, were used for the monitoring system, which have the advantages of being low-cost platforms. A software application that manages the communication of the reactor with the computer and graphically displays the data read by the sensor was also developed. The results obtained demonstrated that this device can accelerate the photodegradation reaction in addition to monitoring the changes throughout the process.
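The prototype streams photometric readings from an Arduino and tracks the degradation process over time. As a hedged sketch of how such readings could be turned into absorbance on the host computer via the Beer-Lambert relation A = -log10(I/I0), the code below reads intensities over a serial link; the port name and the one-value-per-line message format are assumptions, not the device's documented protocol.

```python
# Illustrative sketch only: converting light-intensity readings streamed by an
# Arduino into absorbance via the Beer-Lambert relation A = -log10(I / I0).
# The serial port name and the plain-text message format are assumptions.
import math
import serial  # pyserial

PORT = "/dev/ttyACM0"        # hypothetical Arduino port
I0 = 1023.0                  # reference intensity measured with the blank solution

with serial.Serial(PORT, 9600, timeout=2) as link:
    for _ in range(10):                      # read ten samples
        line = link.readline().decode(errors="ignore").strip()
        if not line:
            continue
        try:
            intensity = float(line)          # assumes one raw ADC value per line
        except ValueError:
            continue
        absorbance = -math.log10(max(intensity, 1e-6) / I0)
        print(f"I = {intensity:6.1f}  ->  A = {absorbance:.3f}")
```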
APA, Harvard, Vancouver, ISO, and other styles
33

SILVÉN, OLLI, and HANNU KAUPPINEN. "RECENT DEVELOPMENTS IN WOOD INSPECTION." International Journal of Pattern Recognition and Artificial Intelligence 10, no. 01 (February 1996): 83–95. http://dx.doi.org/10.1142/s0218001496000086.

Full text
Abstract:
Automated grading of lumber is attractive for the sawmill industry. Achieving better accuracy of quality grading and control of quality variation in volume production easily improves the profit obtained from it. We describe a color-vision-based framework for grading softwood lumber. The proposed inspection principle is to recognize the sound wood regions early, as this reduces the computational requirements at later defect recognition stages. The recognition stages are ordered based on their estimated costs and implementational complexity. This is important because color increases the data volumes significantly over grey-scale images. Finally, the description of the board and its defects is passed to the quality grader, which assigns the class giving the best price for each board. In comparative tests, the computational solutions have turned out to be simpler than with grey-level images, compensating for the higher cost of the color imaging system. The effects of changes in the spectrum of illumination have also been evaluated to identify robust color features and to produce the requirements for color calibration.
APA, Harvard, Vancouver, ISO, and other styles
34

Azieva, G., A. Alimagambetova, and U. Turusbekova. "FORMATION AND APPLICATION OF AN AGENT-ORIENTED MODEL IN THE MANAGEMENT OF THE OIL INDUSTRY OF THE REPUBLIC OF KAZAKHSTAN." Scientific Journal of Astana IT University, no. 9 (March 30, 2022): 36–49. http://dx.doi.org/10.37943/aitu.2022.41.73.004.

Full text
Abstract:
Kazakhstan is one of the few countries in the world rich in oil, deservedly called “black gold” because it is the most important source of energy. The relevance of the study in this paper is determined by the fact that the management of the oil industry affects not only the management process itself, but also the social aspects of the implementation of the development strategy of the state as a whole. It is necessary to identify aspects of management activity and define criteria by which it is possible to calculate the effectiveness of managerial decision-making in the analyzed industry. Agent models allow us to identify the main criteria for the effectiveness of managerial decision-making and optimize social and economic costs for their implementation within the framework of interdepartmental planning. The novelty of the research is determined by the fact that agent models are based not only on the associated parameters of the management process, but also affect the possibility of planning current activities for a long period. The article shows that the formation of agent models should affect both the aspect of the formation of matrices of complex managerial actions and calculations on the accounting of competencies in making managerial decisions. The practical significance of the study is determined by the fact that the development of complex models based on agent forms allows expanding the use of forms of control over the industry by the state and other stakeholders. The implementation of a matrix form of management is proposed, taking into account balanced industry indicators of management quality.
APA, Harvard, Vancouver, ISO, and other styles
35

Bala Subrahmanya, M. H. "External technology acquisition of SMEs in the engineering industry of Bangalore." Journal of Manufacturing Technology Management 25, no. 8 (September 30, 2014): 1174–94. http://dx.doi.org/10.1108/jmtm-07-2012-0069.

Full text
Abstract:
Purpose – The purpose of this paper is to ascertain: first, India's public policy support for small and medium enterprises (SMEs) for external technology acquisition (ETA); second, the objectives of SMEs for ETA; and third, the factors which induced them to obtain their first ETA. Design/methodology/approach – Public policy is examined through a survey of the literature, whereas objectives and factors influencing first ETAs by SMEs are analyzed based on primary data collected from 64 SMEs in Bangalore. The objectives of ETAs are analyzed descriptively, whereas factors which facilitated/hindered early ETAs are examined through Cox regression analysis. Findings – Public policy for ETAs by SMEs includes technology information, assistance and fiscal incentives. The technology focus of these SMEs has been shifting from conventional lathes to computer numerical control (CNC) machines. Most of the SMEs have gone for technology up-gradation with the objective of improving product quality, scale expansion, and meeting customer demand. The majority of these SMEs obtained their first ETA within six years of their inception. Firm-level factors have significantly influenced the time taken by these SMEs for their first ETAs. Overall, when technology is well developed and easily accessible, SMEs would hardly look for external support for ETAs. Research limitations/implications – The shifting technology focus from conventional lathes to CNC machines is a welcome development, which is driven by the need for “competitiveness enhancement”. Since there is no major obstacle for ETAs, policy makers may focus more on providing SMEs with market information and market developments. Practical implications – The shifting technology focus from conventional lathes to CNC machines in Indian SMEs is a welcome development, which is largely driven by the need for better product quality, scale expansion and customer demand, and internal factors played a crucial role in the time taken by these SMEs in accomplishing their first ETA. As such, there is no major obstacle for these SMEs in going for ETAs, since technology suppliers are available at the doorstep and finance is available from the banks. Therefore, policy makers may focus more on providing SMEs with market information and market developments in the domestic as well as the international market. Originality/value – This is a first attempt to examine public policy, objectives and factors influencing SMEs for ETAs in India after 1991.
APA, Harvard, Vancouver, ISO, and other styles
36

Cutugno, Matteo, Umberto Robustelli, and Giovanni Pugliano. "Structure-from-Motion 3D Reconstruction of the Historical Overpass Ponte della Cerra: A Comparison between MicMac® Open Source Software and Metashape®." Drones 6, no. 9 (September 6, 2022): 242. http://dx.doi.org/10.3390/drones6090242.

Full text
Abstract:
In recent years, the performance of free-and-open-source software (FOSS) for image processing has increased significantly. This trend, as well as technological advancements in the unmanned aerial vehicle (UAV) industry, has opened up new opportunities for both researchers and surveyors. In this study, we aimed to assess the quality of the sparse point cloud obtained with a consumer UAV and a FOSS package. To achieve this goal, we also processed the same image dataset with a commercial software package, using its results as a term of comparison. Various analyses were conducted, such as the image residuals analysis, the statistical analysis of GCP and CP errors, the relative accuracy assessment, and the Cloud-to-Cloud distance comparison. A support survey was conducted to measure 16 markers identified on the object. In particular, 12 of these were used as ground control points to scale the 3D model, while the remaining 4 were used as check points to assess the quality of the scaling procedure by examining the residuals. Results indicate that the sparse clouds obtained are comparable. MicMac® has mean image residuals equal to 0.770 pixels, while for Metashape® it is 0.735 pixels. In addition, the 3D errors on control points are similar: the mean 3D error for MicMac® is equal to 0.037 m with a standard deviation of 0.017 m, whereas for Metashape® it is 0.031 m with a standard deviation equal to 0.015 m. The present work represents a preliminary study: a comparison between software packages is hard to achieve, given the secrecy of the commercial software and the theoretical differences between the approaches. This case study analyzes an object with extremely complex geometry, placed in an urban canyon where GNSS support cannot be exploited. In addition, the scenario changes continuously due to the vehicular traffic.
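As a small illustration of the check-point assessment mentioned in the abstract, the sketch below computes the 3D error of each check point as the Euclidean distance between its surveyed and model-estimated coordinates, then reports the mean and standard deviation. The coordinates are invented placeholders, not data from the study.

# A minimal sketch of a check-point accuracy assessment, with invented coordinates.
import numpy as np

# Surveyed (reference) and model-estimated coordinates of 4 check points, in metres.
surveyed = np.array([[12.503, 48.211, 5.120],
                     [15.874, 50.002, 7.431],
                     [18.320, 47.655, 9.880],
                     [14.101, 52.340, 6.015]])
estimated = surveyed + np.array([[ 0.021, -0.015,  0.030],
                                 [-0.018,  0.022, -0.012],
                                 [ 0.010,  0.025,  0.019],
                                 [-0.027, -0.011,  0.016]])

errors = np.linalg.norm(estimated - surveyed, axis=1)   # 3D error per check point
print("per-point 3D errors [m]:", np.round(errors, 3))
print(f"mean 3D error = {errors.mean():.3f} m, std = {errors.std(ddof=1):.3f} m")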
APA, Harvard, Vancouver, ISO, and other styles
37

Kowalczyk, Paweł, Jacek Izydorczyk, and Marcin Szelest. "Evaluation Methodology for Object Detection and Tracking in Bounding Box Based Perception Modules." Electronics 11, no. 8 (April 8, 2022): 1182. http://dx.doi.org/10.3390/electronics11081182.

Full text
Abstract:
The aim of this work is to formulate a new metric to be used in the automotive industry for the evaluation of software that detects vehicles in video data. To achieve this goal, we have formulated a new concept for measuring the degree of matching between rectangles for industrial use. We propose a new measure based on three sub-measures focused on the area of the rectangle, its shape, and its distance. These sub-measures are merged into a General similarity measure to avoid problems with the poor adaptability of the Jaccard index to practical recognition issues. Additionally, we create a method for calculating detection quality over a sequence of video frames that summarizes the local quality and adds information about possible late detection. Experiments with real and artificial data have confirmed that we have created flexible tools that can efficiently reduce the time needed to evaluate detection software and provide more detailed information about the quality of detection than the Jaccard index. Their use can significantly speed up data analysis and capture the weaknesses and limitations of the detection system under consideration. Our detection quality assessment method can be of interest to all engineers involved in machine recognition of video data.
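The abstract does not give the formulas behind the General similarity measure, but the contrast with the Jaccard index can be illustrated. The sketch below computes the standard IoU (Jaccard index) for two axis-aligned boxes and one possible equal-weight combination of area, shape, and distance sub-measures; the specific sub-measure definitions and weighting are assumptions for illustration, not the measure defined in the paper.

# IoU versus one hedged combination of area, shape and distance sub-measures.
from dataclasses import dataclass
import math

@dataclass
class Box:
    x1: float
    y1: float
    x2: float
    y2: float           # axis-aligned, with x1 < x2 and y1 < y2

    @property
    def area(self): return (self.x2 - self.x1) * (self.y2 - self.y1)

    @property
    def center(self): return ((self.x1 + self.x2) / 2, (self.y1 + self.y2) / 2)

def iou(a: Box, b: Box) -> float:
    ix = max(0.0, min(a.x2, b.x2) - max(a.x1, b.x1))
    iy = max(0.0, min(a.y2, b.y2) - max(a.y1, b.y1))
    inter = ix * iy
    return inter / (a.area + b.area - inter) if inter > 0 else 0.0

def similarity(a: Box, b: Box) -> float:
    s_area = min(a.area, b.area) / max(a.area, b.area)              # area agreement
    ra, rb = (a.x2 - a.x1) / (a.y2 - a.y1), (b.x2 - b.x1) / (b.y2 - b.y1)
    s_shape = min(ra, rb) / max(ra, rb)                              # aspect-ratio agreement
    (cxa, cya), (cxb, cyb) = a.center, b.center
    diag = math.hypot(max(a.x2, b.x2) - min(a.x1, b.x1),
                      max(a.y2, b.y2) - min(a.y1, b.y1))
    s_dist = 1.0 - min(1.0, math.hypot(cxa - cxb, cya - cyb) / diag)  # center proximity
    return (s_area + s_shape + s_dist) / 3.0                          # assumed equal weights

gt, det = Box(10, 10, 60, 40), Box(14, 12, 66, 44)
print(f"IoU = {iou(gt, det):.3f}, composite similarity = {similarity(gt, det):.3f}")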
APA, Harvard, Vancouver, ISO, and other styles
38

Roy, Debayan, Licong Zhang, Wanli Chang, Dip Goswami, Birgit Vogel-Heuser, and Samarjit Chakraborty. "Tool Integration for Automated Synthesis of Distributed Embedded Controllers." ACM Transactions on Cyber-Physical Systems 6, no. 1 (January 31, 2022): 1–31. http://dx.doi.org/10.1145/3477499.

Full text
Abstract:
Controller design and software implementation are usually done in isolated design spaces using respective COTS design tools. However, this separation of concerns can lead to long debugging and integration phases. This is because assumptions made about the implementation platform during the design phase, e.g., related to timing, might not hold in practice, thereby leading to unacceptable control performance. In order to address this, several control/architecture co-design techniques have been proposed in the literature. However, their adoption in practice has been hampered by the lack of design flows using commercial tools. To the best of our knowledge, this is the first article that implements such a co-design method using commercially available design tools in an automotive setting, with the aim of minimally disrupting existing design flows practiced in the industry. The goal of such co-design is to jointly determine controller and platform parameters in order to avoid any design-implementation gap, thereby minimizing implementation-time testing and debugging. Our setting involves distributed implementations of control algorithms on automotive electronic control units (ECUs) communicating via a FlexRay bus. The co-design and the associated toolchain Co-Flex jointly determine controller and FlexRay parameters (which impact signal delays) in order to optimize specified design metrics. Co-Flex seamlessly integrates the modeling and analysis of control systems in MATLAB/Simulink with platform modeling and configuration in SIMTOOLS/SIMTARGET, which is used for configuring FlexRay bus parameters. It automates the generation of multiple Pareto-optimal design options with respect to quality of control and resource usage, from which an engineer can choose. In this article, we outline a step-by-step software development process based on Co-Flex tools for distributed control applications. While our exposition is automotive specific, this design flow can easily be extended to other domains.
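The Pareto-optimal design options mentioned above can be illustrated with a small filtering step. In the sketch below, each candidate configuration carries an invented pair of objective values (a quality-of-control cost and a bus-utilization figure, both to be minimized); in the real flow these would come from the controller and FlexRay analysis, so the names and values are assumptions.

# Pareto filtering over (control cost, bus utilisation) pairs; all values invented.
candidates = {
    "cfg_A": (0.12, 0.80),
    "cfg_B": (0.15, 0.55),
    "cfg_C": (0.20, 0.50),
    "cfg_D": (0.14, 0.70),
    "cfg_E": (0.25, 0.65),
}

def dominates(p, q):
    """p dominates q if it is no worse in both objectives and strictly better in one."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

pareto = {name: obj for name, obj in candidates.items()
          if not any(dominates(other, obj)
                     for o_name, other in candidates.items() if o_name != name)}
print("Pareto-optimal design options:", sorted(pareto))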
APA, Harvard, Vancouver, ISO, and other styles
39

Lavrentiev, A. Y., V. S. Sherne, and F. A. Musaev. "INFLUENCE OF MIXED FODDERS COMBINED WITH ENZYMES ON GOOSE FEATHERS AND DOWN YIELD." THEORETICAL & APPLIED PROBLEMS OF AGRO-INDUSTRY 52, no. 2 (2022): 34–39. http://dx.doi.org/10.32935/2221-7312-2022-52-2-34-39.

Full text
Abstract:
A promising direction in the development of the poultry industry is expanding the range and improving the quality of its products. The ‘Lindovskaya’ breed is the most promising goose breed for rearing in the Chuvash Republic to meet this goal. Further development of goose breeding will largely be determined by breeding work, full-fledged feeding, poultry keeping technology, poultry assignment, production cost, quality characteristics and economic feasibility. Only highly productive breeds and lines of geese, ensuring the competitiveness of farms in national and foreign markets, can lead to effective goose production. The aim of the study was to examine the effect of compound feeds with enzyme agents (amylosubtilin G3x, cellolux-F and protosubtilin G3x) on the yield of feathers and down in young geese of the Lindovskaya breed. The results of the research showed that the enzyme agents used as part of mixed feed improved both quantitative and qualitative indicators of feather and down productivity in goslings. The feather and down weight of ganders from the 2nd experimental group was 47.7 g higher than in the control group, and 11.1 g higher than in the 1st experimental group. Down yield was higher in the 2nd experimental group in comparison with both the control and the 1st experimental groups. The amount of down was larger in ganders and layers of the 2nd experimental group. The largest output of feathers and down was in the 2nd experimental group, which exceeded the control and the 1st experimental groups by 36.9 g and 29.5%, respectively.
APA, Harvard, Vancouver, ISO, and other styles
40

Shtuts, Andrii. "Computer Simulation of the Process of Stamping by Rolling Cylindrical and Pipe Preparations Using the DEFORM-3D Software Complex." Vibrations in engineering and technology, no. 4(99) (December 18, 2020): 101–13. http://dx.doi.org/10.37128/2306-8744-2020-4-12.

Full text
Abstract:
The purpose of the modeling is to analyse the stress-strain state [7,14] and deformation of tubular and cylindrical workpieces during the deformation process, and to determine the energy parameters of the process as well as the geometry of the deforming tool in the transverse and longitudinal directions and the surface finish of the profiled cavity. This article analyzes the features of local deformation, which define stamping by rolling as an independent type of metal forming in the metalworking industry. This problem is most effectively solved by using cold plastic deformation processes, which make it possible to bring the shape of the workpiece as close as possible to the shape of the finished product and, in some cases, eliminate the need for further processing. Using cold deformation instead of cutting can increase metal utilization two to three times. This ensures high quality of the workpiece surface, improves the physical and mechanical characteristics of the material, increases productivity and creates conditions for full automation of production. Examples are given of the fullest realization of the advantages of stamping by rolling, which ensures the efficiency of its industrial use. Among the priority areas of science and technology, a special role is given to energy and resource conservation. In [3], simulation results are presented for stamping by rolling without restriction and with one-sided and two-sided restriction of metal flow along the length of lead blanks. Analysis of the experimental study showed that the process of stamping by rolling (SHO) of workpieces with conical and cylindrical rolls makes it possible to control the flow of the workpiece material by changing the size and direction of the roll axis relative to the direction of rotation of the workpiece. The modeling of SHO processes showed that the stress-strain state, deformation and deformability of the workpiece material also depend significantly on these parameters. The technological capabilities of SHO are limited mainly by the loss of stability and fracture of the workpieces, which, in turn, depend significantly on the direction of metal flow at the contact between the roll and the workpiece. Thus, the ability to control the direction of metal flow largely determines the ability to manufacture workpieces of the desired shape and size without fracture or loss of stability [2].
APA, Harvard, Vancouver, ISO, and other styles
41

Ji, Miaomiao, Peng Liu, and Qiufeng Wu. "Feasibility of Hybrid PSO-ANN Model for Identifying Soybean Diseases." International Journal of Cognitive Informatics and Natural Intelligence 15, no. 4 (October 2021): 1–16. http://dx.doi.org/10.4018/ijcini.290328.

Full text
Abstract:
Soybean disease has become one of the vital factors restricting the sustainable development of a high-yield, high-quality soybean industry. A hybrid artificial neural network (ANN) model optimized via the particle swarm optimization (PSO) algorithm, denoted PSO-ANN, is proposed in this paper for soybean disease identification based on categorical feature inputs. An augmented dataset is created via the synthetic minority over-sampling technique (SMOTE) to deal with the quantitative insufficiency and class imbalance of the dataset. The PSO algorithm is used to optimize the parameters of the ANN, including the activation function, the number of hidden layers, the number of neurons in each hidden layer and the optimizer. In the end, an ANN model with 2 hidden layers of 63 and 61 neurons respectively, the ReLU activation function and the Adam optimizer yields the best overall test accuracy of 92.00%, compared with traditional machine learning methods. PSO-ANN shows superiority on various evaluation metrics and may have great potential in crop disease control for modern agriculture.
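The PSO-over-hyperparameters idea can be illustrated compactly. In the sketch below, particles encode the sizes of two hidden layers and fitness is the cross-validated accuracy of an MLP on a synthetic dataset; the dataset, swarm size, bounds and PSO constants are illustrative assumptions, and the paper's full search space (activation function, number of layers, optimizer) is not reproduced.

# A compact PSO sketch over two ANN hyperparameters, under the assumptions above.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
X, y = make_classification(n_samples=300, n_features=12, n_informative=8,
                           n_classes=3, random_state=1)

def fitness(pos):
    h1, h2 = (max(4, int(round(v))) for v in pos)     # map particle to layer sizes
    clf = MLPClassifier(hidden_layer_sizes=(h1, h2), activation="relu",
                        solver="adam", max_iter=500, random_state=1)
    return cross_val_score(clf, X, y, cv=3).mean()

n_particles, n_iter, lower, upper = 5, 4, 4, 80
pos = rng.uniform(lower, upper, (n_particles, 2))
vel = np.zeros_like(pos)
pbest, pbest_fit = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_fit.argmax()].copy()

w, c1, c2 = 0.7, 1.5, 1.5                              # inertia and attraction weights
for _ in range(n_iter):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lower, upper)
    fit = np.array([fitness(p) for p in pos])
    improved = fit > pbest_fit
    pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
    gbest = pbest[pbest_fit.argmax()].copy()

print("best hidden layers:", [int(round(v)) for v in gbest],
      "cv accuracy:", round(pbest_fit.max(), 3))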
APA, Harvard, Vancouver, ISO, and other styles
42

Sellitto, Miguel Afonso, and Elisandro João de Vargas. "A method to align functionalities of a manufacturing execution system with competitive priorities." Journal of Manufacturing Technology Management 31, no. 2 (August 1, 2019): 353–69. http://dx.doi.org/10.1108/jmtm-11-2018-0424.

Full text
Abstract:
Purpose – The purpose of this paper is to align the implementation of manufacturing execution system (MES) functionalities with the manufacturing competitive dimensions of two companies in the metal-mechanics industry. Design/methodology/approach – The research object comprises two Brazilian manufacturers that use MES. The research method is quantitative modeling. A literature review organized 24 functionalities of MES. The study prioritized the functionalities and the manufacturing competitive dimensions, evaluated the contribution and the quality of the implementation, and classified the functionalities. Findings – Two functionalities in the first case and five in the second have high contribution and quality. Three functionalities in the first case and six in the second have low contribution and quality. The first group should be explored further and kept updated. The second should be discontinued. The rest have intermediate scores and should be studied case by case. The main contribution of the paper is the method, which can be replicated in other applications in the same or other industries. Research limitations/implications – The study relies on two case studies from the same industry. Further research shall encompass the entire industry and cases in industries other than metal-mechanics. Practical implications – MES users and vendors may benefit from the study, as they can apply the method to align the implementation with the manufacturing strategy and therefore enhance the effectiveness of the system. Originality/value – The authors did not find a study that associates the performance or the contribution of MES functionalities to the competitive priorities with the quality, integrity or consistency of the implementation.
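A tiny sketch of the classification step described above: each functionality carries a contribution score and an implementation-quality score, and the combination decides the recommendation. The functionality names, scores, scale and threshold below are invented for illustration; they are not the data or the exact decision rule from the paper.

# Classifying MES functionalities by (contribution, implementation quality); invented values.
functionalities = {
    "production scheduling":   (4.5, 4.2),   # 1-5 scale, assumed
    "labor management":        (2.1, 1.8),
    "quality data collection": (4.0, 2.4),
    "document control":        (2.3, 4.1),
}

THRESHOLD = 3.0

def recommend(contribution, quality):
    if contribution >= THRESHOLD and quality >= THRESHOLD:
        return "explore further and keep updated"
    if contribution < THRESHOLD and quality < THRESHOLD:
        return "candidate for discontinuation"
    return "analyse case by case"

for name, (c, q) in functionalities.items():
    print(f"{name:24s} -> {recommend(c, q)}")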
APA, Harvard, Vancouver, ISO, and other styles
43

Sánchez Santiago, A. J., A. J. Yuste, J. E. Muñoz Expósito, Sebastian García Galán, R. P. Prado, J. M. Maqueira, and S. Bruque. "Real-time image texture analysis in quality management using grid computing: an application to the MDF manufacturing industry." International Journal of Advanced Manufacturing Technology 58, no. 9-12 (August 7, 2011): 1217–25. http://dx.doi.org/10.1007/s00170-011-3456-6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Farahani, Saeed, Nathaniel Brown, Jonathan Loftis, Curtis Krick, Florian Pichl, Robert Vaculik, and Srikanth Pilla. "Evaluation of in-mold sensors and machine data towards enhancing product quality and process monitoring via Industry 4.0." International Journal of Advanced Manufacturing Technology 105, no. 1-4 (September 2, 2019): 1371–89. http://dx.doi.org/10.1007/s00170-019-04323-8.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Abdullah Hokoma, Rajab, Mohammed K. Khan, and Khalid Hussain. "Investigation into the implementation stages of manufacturing and quality techniques and philosophies within the Libyan cement industry." Journal of Manufacturing Technology Management 19, no. 7 (September 5, 2008): 893–907. http://dx.doi.org/10.1108/17410380810898804.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Steinberg, Fabian, Peter Burggaef, Johannes Wagner, and Benjamin Heinbach. "Impact of material data in assembly delay prediction—a machine learning-based case study in machinery industry." International Journal of Advanced Manufacturing Technology 120, no. 1-2 (February 12, 2022): 1333–46. http://dx.doi.org/10.1007/s00170-022-08767-3.

Full text
Abstract:
Designing customized products for customer needs is a key characteristic of machine and plant manufacturers. Their manufacturing process typically consists of a design phase followed by planning and executing a production process for the components required in the subsequent assembly. Production delays can lead to a delayed start of the assembly. Predicting potentially delayed components, which we call assembly start delayers, in the early phases of the manufacturing process can support an on-time assembly. In recent research, prediction models typically include information about the orders, workstations, and the status of the manufacturing system, but information about the design of the component is not used. Since the components of machine and plant manufacturers are designed specifically for customer needs, we assumed that material data influence the quality of a model predicting assembly start delayers. To analyze our hypothesis, we followed the established CRISP-DM method to set up 12 prediction models at an exemplary machine and plant manufacturer, utilizing a binary classification approach. These 12 models differed in the utilization of material data (including or excluding material data) and in the machine learning algorithm used (six algorithms per data case). Evaluating the different models revealed a positive impact of the material data on the model quality. With the achieved results, our study validates the benefit of using material data in models predicting assembly start delayers. Thus, we identified that considering data sources that are not commonly used in prediction models, such as material data, increases model quality.
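The with/without-material-data comparison can be sketched on synthetic data. The example below trains a single classifier (a random forest) on order features alone and then on order plus material features, comparing cross-validated ROC AUC; the feature names, the simulated dependence of delays on material attributes, and the choice of one algorithm instead of the six used in the study are all assumptions.

# Comparing delay prediction with and without material features, on synthetic data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n = 800
order_features = rng.normal(size=(n, 4))        # stand-in for order/workstation status data
material_features = rng.normal(size=(n, 3))     # stand-in for e.g. size, weight, material class

# Simulated ground truth: delay risk depends on both groups of features
logit = 1.2 * material_features[:, 0] + 0.8 * order_features[:, 1] - 0.3
y = (1 / (1 + np.exp(-logit)) > rng.random(n)).astype(int)

for label, X in [("order data only", order_features),
                 ("order + material data", np.hstack([order_features, material_features]))]:
    auc = cross_val_score(RandomForestClassifier(random_state=0), X, y,
                          cv=5, scoring="roc_auc").mean()
    print(f"{label:24s} mean ROC AUC = {auc:.3f}")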
APA, Harvard, Vancouver, ISO, and other styles
47

Stavropoulos, Panagiotis, Alexios Papacharalampopoulos, John Stavridis, and Kyriakos Sampatakakis. "A three-stage quality diagnosis platform for laser-based manufacturing processes." International Journal of Advanced Manufacturing Technology 110, no. 11-12 (September 18, 2020): 2991–3003. http://dx.doi.org/10.1007/s00170-020-05981-9.

Full text
Abstract:
Diagnosis systems for laser processing are being integrated into industry. However, their readiness level is still questionable under the prism of the Industry 4.0 design principles of interoperability and intuitive technical assistance. This paper presents a novel multifunctional, web-based, real-time quality diagnosis platform, in the context of a laser welding application, fused with decision support, data visualization, storage, and post-processing functionalities. The platform’s core is a quality assessment module based on a three-stage method that utilizes feature extraction and machine learning techniques for weld defect detection and quality prediction. A multisensorial configuration streams image data from the weld pool to the module, in which a statistical and geometrical method is applied to select the input features for the classification model. A Hidden Markov Model is then used to fuse this information with earlier results so that a decision can be made on the basis of maximum likelihood. The outcome is fed through web services into a tailored user interface. The platform’s operation has been validated with real data.
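The maximum-likelihood fusion step can be illustrated with a two-state hidden Markov model. In the sketch below, per-frame defect probabilities from a stage-two classifier are treated as emission likelihoods for a sound/defective state chain, and a Viterbi pass recovers the most likely state sequence; the transition matrix, priors and score sequence are invented, since the paper does not publish its model parameters.

# Two-state Viterbi fusion of per-frame classifier scores; all numbers invented.
import numpy as np

states = ["sound", "defective"]
prior = np.array([0.9, 0.1])
trans = np.array([[0.95, 0.05],      # weld condition tends to persist frame to frame
                  [0.10, 0.90]])

# Per-frame probability of "defective" from the stage-two classifier (invented)
p_defect = np.array([0.2, 0.3, 0.7, 0.8, 0.75, 0.4, 0.2])
emission = np.column_stack([1 - p_defect, p_defect])   # rows: frames, cols: states

# Viterbi in log space for a maximum-likelihood state sequence
log_delta = np.log(prior) + np.log(emission[0])
backptr = []
for e in emission[1:]:
    scores = log_delta[:, None] + np.log(trans)        # best predecessor per current state
    backptr.append(scores.argmax(axis=0))
    log_delta = scores.max(axis=0) + np.log(e)

path = [int(log_delta.argmax())]
for bp in reversed(backptr):
    path.append(int(bp[path[-1]]))
path.reverse()
print("frame-by-frame decision:", [states[s] for s in path])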
APA, Harvard, Vancouver, ISO, and other styles
48

Voicu, Adrian Catalin, Ion Gheorghe Gheorghe, Liliana Laura Badita, and Adriana Cirstoiu. "3D Measuring of Complex Automotive Parts by Multiple Laser Scanning." Applied Mechanics and Materials 371 (August 2013): 519–23. http://dx.doi.org/10.4028/www.scientific.net/amm.371.519.

Full text
Abstract:
Three-dimensional scanning has been available for more than 15 years; however, few have heard of it, and just as few know the applications of this technology. 3D scanning is also known as 3D digitizing, the name coming from the fact that the process uses a contact or non-contact digitizing probe to capture an object's form and recreate it in a virtual workspace as a very dense network of (x, y, z) points in a 3D graphical representation. Based on this information, many new applications have been developed in many fields (the computer games industry, prosthetics, forensic medicine, arts and culture), but the most common areas where scanning systems are used remain the automotive, aircraft and consumer goods industries. Most automotive manufacturers currently use 3D metrology based on optical or laser systems to validate product quality. The pieces are initially measured by 3D scanning and then compared with the designed model (CAD file) using specialized software. Through this comparison, the producer can intervene very quickly in the manufacturing process to remove the cause of defects, a technique called reverse engineering (RE). The overall accuracy of a 3D acquisition system depends above all on the precision of the sensors and on the acquisition device (contact acquisition) or acquisition structure (non-contact acquisition). This accuracy may vary from a micron to a millimeter, and the acquisition rate from a few points to several thousand points per second. In a perfect world, or in an integrated production environment, 3D measuring systems should be able to measure all the necessary parameters in a single step without errors and deliver the results directly to computer-equipped manufacturing networks, in formats useful for machine control and process management.
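The scan-versus-CAD comparison described above can be sketched as a nearest-neighbour deviation check. In the example below, a synthetic point set stands in for points sampled from the CAD model and a noisy copy stands in for the laser scan; each scanned point is matched to its nearest reference point and the deviation is compared against a tolerance. Point sets, noise level and tolerance are illustrative assumptions.

# Nearest-neighbour scan-to-reference deviation check on synthetic point clouds.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(7)
reference = rng.uniform(0, 100, size=(5000, 3))                 # sampled "CAD" surface points
scan = reference[:2000] + rng.normal(0, 0.05, size=(2000, 3))   # noisy "scanned" points
scan[:20] += 1.5                                                # a few simulated defects

tree = cKDTree(reference)
deviation, _ = tree.query(scan)                                 # nearest-neighbour distances

TOL = 0.3                                                       # illustrative tolerance
print(f"mean deviation  : {deviation.mean():.3f}")
print(f"95th percentile : {np.percentile(deviation, 95):.3f}")
print(f"out of tolerance: {(deviation > TOL).sum()} of {len(scan)} points")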
APA, Harvard, Vancouver, ISO, and other styles
49

Shang, Meng, Chulwoo Lee, Junwei Cao, and Yu Liu. "A Construction and Empirical Study of Quality Management Evaluation Index System in the Internet of Things Industry." Systems 10, no. 6 (November 22, 2022): 231. http://dx.doi.org/10.3390/systems10060231.

Full text
Abstract:
The revised ISO 9001:2015 quality management system is an indispensable element for a corporation to survive severe competition and maintain sustainable competitive superiority, as well as a new outlook. With this in mind, the first purpose of this research was to seek customer satisfaction through the continuous improvement of the organization in the IoT industry. Furthermore, it aimed to lay a foundation for a preemptive response and to formulate a management strategy regarding the continuity of the business. Simultaneously, we explored whether the lower-level scale items derived from the latent variables in the main provisions of the revised edition are variables that can sufficiently explain those latent variables. The statistical programs SPSS 22.0 and AMOS 18.0 were utilized for the analysis. The results demonstrated that corporations can achieve customer satisfaction through continuous improvement during the successful introduction of, and conversion to, the new quality management system. It was confirmed that the core variables of the revised edition sufficiently explained the latent variables. Consequently, it is imperative for a corporation to understand the latent risks in the internal and external environments surrounding the organization and to formulate its short-term and long-term strategies accordingly.
APA, Harvard, Vancouver, ISO, and other styles
50

Chen, Chung-Ho, and Chao-Yu Chou. "Optimum process mean, standard deviation and specification limits settings under the Burr distribution." Engineering Computations 34, no. 1 (March 6, 2017): 66–76. http://dx.doi.org/10.1108/ec-10-2015-0321.

Full text
Abstract:
Purpose – The quality level setting problem determines the optimal process mean, standard deviation and specification limits of a product/process characteristic to minimize the expected total cost associated with products. Traditionally, it is assumed that the product/process characteristic is normally distributed. However, this may not be true. This paper aims to explore the quality level setting problem when the probability distribution of the process characteristic deviates from normality. Design/methodology/approach – Burr developed a density function that can represent a wide range of normal and non-normal distributions. It can be applied to investigate the effect of non-normality on studies of statistical quality control, for example, the design of control charts and sampling plans. The quality level setting problem is examined by introducing Burr’s density function as the underlying probability distribution of the product/process characteristic, so that the effect of non-normality on the determination of the optimal process mean, standard deviation and specification limits can be studied. The expected total cost associated with products includes the quality loss of conforming products, the rework cost of non-conforming products and the scrap cost of non-conforming products. Findings – Numerical results show that the expected total cost associated with products is significantly influenced by the parameter of Burr’s density function, the target value of the product/process characteristic, the quality loss coefficient, the unit rework cost and the unit scrap cost. Research limitations/implications – The major assumption of the proposed model is that the lower specification limit must be positive for practical applications, which affects the space of feasible solutions for different combinations of process mean and standard deviation. Social implications – The proposed model can provide industry/business applications for promoting product/service quality assurance for the customer. Originality/value – The authors adopt the Burr distribution to determine the optimum process mean, standard deviation and specification limits under non-normality. To the best of their knowledge, this is a new method for determining the optimum process and product policy, and it can be widely applied.
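The cost structure described in this abstract can be illustrated numerically. The sketch below evaluates the expected total cost for one candidate setting, assuming a Burr XII characteristic distribution, a quadratic (Taguchi-type) quality loss inside the specification limits, rework for items below the lower limit and scrap for items above the upper limit; the shape parameters, target, limits and unit costs are illustrative assumptions, and which side is reworked versus scrapped is not specified in the abstract.

# Expected total cost for one candidate setting under a Burr XII distribution.
from scipy import integrate
from scipy.stats import burr12

c, d = 4.0, 2.0                        # Burr XII shape parameters (assumed)
loc, scale = 8.0, 3.0                  # shift/scale so the characteristic stays positive
dist = burr12(c, d, loc=loc, scale=scale)

target, lsl, usl = 10.0, 9.0, 11.5     # target value and specification limits (assumed)
k_loss, c_rework, c_scrap = 2.0, 5.0, 12.0   # loss coefficient and unit costs (assumed)

def integrand(x):
    # Quadratic quality loss of a conforming item, weighted by the density
    return k_loss * (x - target) ** 2 * dist.pdf(x)

quality_loss, _ = integrate.quad(integrand, lsl, usl)       # loss of conforming items
expected_cost = (quality_loss
                 + c_rework * dist.cdf(lsl)                 # rework share below LSL
                 + c_scrap * (1 - dist.cdf(usl)))           # scrap share above USL
print(f"expected total cost per item = {expected_cost:.3f}")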
APA, Harvard, Vancouver, ISO, and other styles