To view the other types of publications on this topic, follow this link: Initialization file.

Journal articles on the topic "Initialization file"

Cite a source in APA, MLA, Chicago, Harvard, and other citation styles

Select a source type:

Consult the top 39 journal articles for your research on the topic "Initialization file".

Next to every entry in the bibliography there is an "Add to bibliography" option. Use it, and the bibliographic reference to the chosen work will be formatted automatically in the citation style you require (APA, MLA, Harvard, Chicago, Vancouver, etc.).

You can also download the full text of the academic publication as a PDF and read its online abstract, whenever the relevant parameters are available in the metadata.

Browse journal articles on a wide variety of disciplines and compile your bibliography correctly.

1

Tang, Hong Ying, Zhen Jiang Qian, and Hao Huang. "Modeling and Proof of a File System Based on Micro-Kernel". Advanced Materials Research 765-767 (September 2013): 837–40. http://dx.doi.org/10.4028/www.scientific.net/amr.765-767.837.

Annotation:
As the manager of the data stored on disk, the file system is an important aspect of operating system security. Given the logical rigor and unambiguity of formal methods, and the module independence of the micro-kernel, this paper abstracts a formal model from a micro-kernel-based file system implementation and gives the axiomatic semantics of initialization; finally, it states the correctness assertions for file system initialization and verifies the correctness of the formal model of the file system with the aid of Isabelle/HOL.
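The kind of initialization invariants the paper proves can be illustrated with a small executable sketch. The `ToyFileSystem` class, its fields, and the specific invariants below are hypothetical stand-ins for illustration only, not the paper's actual Isabelle/HOL definitions.

```python
# Toy model of file-system initialization with correctness assertions.
# All names and invariants here are illustrative, not the paper's formal model.

class ToyFileSystem:
    def __init__(self, total_blocks: int):
        # Initialization: reserve block 0 for the superblock,
        # create an empty root directory, mark the rest free.
        self.total_blocks = total_blocks
        self.superblock = {"magic": 0xF5F5, "total": total_blocks}
        self.root = {}                      # empty root directory
        self.free = set(range(1, total_blocks))

    def check_init_invariants(self) -> bool:
        # Correctness assertions analogous to those proved for init:
        assert self.superblock["magic"] == 0xF5F5       # superblock is valid
        assert self.root == {}                          # root exists and is empty
        assert 0 not in self.free                       # superblock block is not free
        assert len(self.free) == self.total_blocks - 1  # block accounting is consistent
        return True

fs = ToyFileSystem(total_blocks=64)
print(fs.check_init_invariants())  # → True
```

In the paper these assertions are stated and discharged formally in Isabelle/HOL; the runtime checks above merely mirror their shape.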
2

Wu, Chin-Hsien, Tei-Wei Kuo, and Li-Pin Chang. "The Design of efficient initialization and crash recovery for log-based file systems over flash memory". ACM Transactions on Storage 2, no. 4 (November 2006): 449–67. http://dx.doi.org/10.1145/1210596.1210600.

3

Yang, Yan Xin, Huan Zou, and Xue Ping Zhang. "Task Download Technique Research of Ethernet Parallel System Based on Multi-DSP". Applied Mechanics and Materials 347-350 (August 2013): 1589–93. http://dx.doi.org/10.4028/www.scientific.net/amm.347-350.1589.

Annotation:
In order to accomplish real-time simulation in an embedded Ethernet parallel system, the DSP boot mode and the loading process are studied based on an analysis of the parallel simulation system structure and the basic characteristics of the BF548 loader file structure. A multiple-load method is proposed, a download method based on initialization code. With it, the task program downloads automatically after the DSP launches, solving the problem of the complex download procedure of a DSP parallel system. Actual tests verified the feasibility and validity of the download method, which provides a convenient and flexible experimental method and implementation technique for parallel simulation.
4

Sanni, Rachana R., and H. S. Guruprasad. "Hospital Management using OAM [Operation Administration & Maintenance] Tool". International Journal of Innovative Technology and Exploring Engineering 10, no. 10 (August 30, 2021): 24–30. http://dx.doi.org/10.35940/ijitee.j9391.08101021.

Annotation:
This paper proposes a software tool, the OAM tool, to monitor the system configuration for a Hospital Management System. After the user installs it, the OAM tool monitors the system's configuration and installs the required software on the respective system. The application comprises four tools: Fintal, Piston, Naavi, and Mapel. To install these, a single-file installation technique using batch scripting is used: the installation commands for each piece of software are embedded as execution commands in a single file called a "Windows batch file", saved with the ".bat" extension, and batch scripting executes these installation files in turn. The progress of each installation is displayed in the command prompt so the user can confirm that the software is being installed correctly, and after each installation the configuration and initialization of the installed software is displayed. Each of the four tools performs particular tasks required for the management of hospitals. Fintal handles the overall administration of the hospital and is the most user-friendly part of the OAM tool. Piston stores patients' details, keeping a detailed report of each test a patient has undergone. Naavi stores laboratory details; each patient's tests are recorded here. Mapel stores pharmaceutical details (medicines), such as which supplier the medicines were purchased from; the hospital's license is also stored in this part of the OAM tool. Together these form the OAM tool, which also manages other required software such as MySQL and ODBC drivers.
The management of hospitals is important because patient details must be maintained. The installation technique proposed in this paper moves installation from hands-on hardware work to remote installation: access is given to the system on which the tool needs to be installed, saving time in the installation process.
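The "single file installation" idea above can be sketched as a script that assembles one batch file driving every installer in sequence. This is an illustrative sketch only: the installer filenames and the `/S` silent-install flag are hypothetical placeholders, not the actual contents of the OAM tool.

```python
# Sketch: one Windows batch file that runs each software installer in turn.
# Installer paths and flags are hypothetical placeholders.
import os
import tempfile

installers = [
    ("Fintal", r"installers\fintal_setup.exe /S"),
    ("Piston", r"installers\piston_setup.exe /S"),
    ("Naavi",  r"installers\naavi_setup.exe /S"),
    ("Mapel",  r"installers\mapel_setup.exe /S"),
]

lines = ["@echo off"]
for name, cmd in installers:
    lines.append(f"echo Installing {name} ...")  # progress shown in the command prompt
    lines.append(cmd)                            # run the installer itself
batch_text = "\r\n".join(lines) + "\r\n"

# Write all commands into a single .bat file; on Windows this one file
# would be launched from the command prompt to drive all installations.
path = os.path.join(tempfile.gettempdir(), "install_all.bat")
with open(path, "w") as f:
    f.write(batch_text)
print(batch_text.count("setup.exe"))  # → 4
```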
5

Baek, Sung Hoon, and Ki-Woong Park. "A Durable Hybrid RAM Disk with a Rapid Resilience for Sustainable IoT Devices". Sensors 20, no. 8 (April 11, 2020): 2159. http://dx.doi.org/10.3390/s20082159.

Annotation:
Flash-based storage is considered a de facto storage module for sustainable Internet of Things (IoT) platforms in harsh environments due to its relatively fast speed and operational stability compared to disk storage. Although its performance is considerably faster than disk-based mechanical storage devices, its read and write latency still cannot catch up with that of random-access memory (RAM). Therefore, RAM could be used as a storage device or system for time-critical IoT applications. Despite such advantages, a RAM-based storage system is limited in its use for sustainable IoT devices due to the volatile nature of RAM. As a remedy to this problem, this paper presents a durable hybrid RAM disk enhanced with a new read interface. The proposed durable hybrid RAM disk is designed for sustainable IoT devices that require not only high read/write performance but also data durability. It includes two performance improvement schemes: rapid resilience with a fast initialization, and direct byte read (DBR). Rapid resilience with a fast initialization shortens the long booting time required to initialize the durable hybrid RAM disk. The new read interface, DBR, enables the durable hybrid RAM disk to bypass the disk cache, which is an overhead in RAM-based storage. DBR performs byte-range I/O, whereas direct I/O requires block-range I/O; therefore, it provides a more efficient interface than direct I/O. The presented schemes and device were implemented in the Linux kernel. Experimental evaluations were performed using various benchmarks from the block level to the file level. In workloads where reads and writes were mixed, the durable hybrid RAM disk showed 15 times better performance than a solid-state drive (SSD).
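The byte-range versus block-range distinction in the abstract can be made concrete with a small sketch: block-granular direct I/O must transfer whole aligned blocks, while a DBR-style byte read transfers only the bytes requested. The 4 KiB block size and the numbers are illustrative, not measurements from the paper.

```python
# Sketch of byte-range vs block-range reads.
BLOCK = 4096  # assumed block size for illustration

def block_range_bytes(offset: int, length: int) -> int:
    # Round the request out to block boundaries, as block-granular I/O must.
    first = offset // BLOCK
    last = (offset + length - 1) // BLOCK
    return (last - first + 1) * BLOCK

def byte_range_bytes(offset: int, length: int) -> int:
    # A byte-granular interface (DBR-style) transfers exactly what was asked.
    return length

# A 100-byte read that straddles a block boundary:
print(block_range_bytes(4090, 100))  # → 8192 (two full blocks)
print(byte_range_bytes(4090, 100))   # → 100
```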
6

Rosen, C., U. Jeppsson, and P. A. Vanrolleghem. "Towards a common benchmark for long-term process control and monitoring performance evaluation". Water Science and Technology 50, no. 11 (December 1, 2004): 41–49. http://dx.doi.org/10.2166/wst.2004.0669.

Annotation:
The COST/IWA benchmark simulation model has been available for seven years. Its primary purpose has been to create a platform for control strategy benchmarking of biological wastewater treatment processes. The fact that the benchmark has resulted in more than 100 publications, not only in Europe but worldwide, demonstrates the interest in such a tool in the research community. In this paper, an extension of the benchmark simulation model no. 1 (BSM1) is proposed. It aims at facilitating the evaluation of two closely related operational tasks: long-term control strategy performance and process monitoring performance. The motivation for the extension is that these two tasks typically act on longer time scales. The extension proposed here consists of 1) prolonging the evaluation period to one year (including influent files), 2) specifying time-varying process parameters, and 3) including sensor and actuator failures. The prolonged evaluation period is necessary to obtain a relevant and realistic assessment of the effects of such disturbances. It also allows for a number of long-term control actions/handles that cannot be evaluated in a realistic fashion in the one-week BSM1 evaluation period. In the paper, models for influent file design, parameter changes and sensor failures, the initialization procedure, and evaluation criteria are discussed. Important remaining topics, for which consensus is required, are identified. The potential of a long-term benchmark is illustrated with an example of process monitoring algorithm benchmarking.
7

Liu, Ning, Jing Fu, Christopher D. Carothers, Onkar Sahni, Kenneth E. Jansen, and Mark S. Shephard. "Massively Parallel I/O for Partitioned Solver Systems". Parallel Processing Letters 20, no. 4 (December 2010): 377–95. http://dx.doi.org/10.1142/s0129626410000302.

Annotation:
This paper investigates I/O approaches for massively parallel partitioned solver systems. Typically, such systems have synchronized "loops" and write data in a well defined block I/O format consisting of a header and data portion. Our target use for such a parallel I/O subsystem is checkpoint-restart where writing is by far the most common operation and reading typically only happens during either initialization or during a restart operation because of a system failure. We compare four parallel I/O strategies: POSIX File Per Processor (1PFPP), "Poor-Man's" Parallel I/O (PMPIO), a synchronized parallel I/O (syncIO), and a "reduced blocking" strategy (rbIO). Performance tests executed on the Blue Gene/P at Argonne National Laboratory using real CFD solver data from PHASTA (an unstructured grid finite element Navier-Stokes solver) show that the syncIO strategy can achieve a read bandwidth of 47.4 GB/sec and a write bandwidth of 27.5 GB/sec using 128K processors. The "reduced-blocking" rbIO strategy achieves an actual writing performance of 17.8 GB/sec and the perceived writing performance is 166 TB/sec on Blue Gene/P using 128K processors.
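The "header and data portion" block format the abstract describes can be sketched with an in-memory file. The exact header layout below (a magic tag, a partition id, and a payload length) is an assumption for illustration, not the PHASTA format.

```python
# Minimal sketch of a "header + data" block format for checkpoint I/O,
# written and read back through an in-memory file.
import io
import struct

HEADER = struct.Struct("<4sII")  # magic tag, partition id, payload length

def write_block(f, part_id: int, payload: bytes) -> None:
    f.write(HEADER.pack(b"CKPT", part_id, len(payload)))
    f.write(payload)

def read_block(f):
    magic, part_id, n = HEADER.unpack(f.read(HEADER.size))
    assert magic == b"CKPT"      # validate the header before trusting the length
    return part_id, f.read(n)

buf = io.BytesIO()
write_block(buf, part_id=7, payload=b"solver state for partition 7")
buf.seek(0)
pid, data = read_block(buf)
print(pid, data.decode())  # → 7 solver state for partition 7
```

In a real partitioned solver, strategies like 1PFPP vs. syncIO differ in how many processes write such blocks to how many files, not in the block format itself.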
8

Cho, Jae Hyuk, Yunhee Kang, and Young B. Park. "Secure Delivery Scheme of Common Data Model for Decentralized Cloud Platforms". Applied Sciences 10, no. 20 (October 13, 2020): 7134. http://dx.doi.org/10.3390/app10207134.

Annotation:
The Common Data Model (CDM) is used to deal with problems caused by the various electronic medical record structures in distributed hospital information systems. The concept of CDM is emerging as a collaborative method of exchanging data from each hospital in the same format and conducting various clinical studies based on the shared data. The baseline CDM system is centralized, with an infrastructure typically controlled by a single entity with full authority. The characteristics of this centralized system can pose serious security issues. Therefore, the proposed SC-CDM system is designed as a distributed-ledger platform and provides data with a high level of confidentiality, security, and scalability. This framework provides a reference model that supports multiple channels, using secure CDM as an encryption method. The data confidentiality of the CDM is guaranteed by asymmetric and symmetric protocols. CDM delivery is protected by a symmetric key signed by the CDM creator, and lightweight distributed-ledger transactions are maintained on the InterPlanetary File System (IPFS), which acts as a file share. To deliver an encrypted CDM on the SC-CDM platform, the CDM is encrypted with a block cipher using a random symmetric key and initialization vector (IV). The symmetric key protocol is used for the fast encryption of large-capacity data. The SC-CDM repository is implemented with IPFS for storing the encrypted CDM, while the symmetric key, two hash values, and the IV are shared through the blockchain. Data confidentiality in SC-CDM is guaranteed because only registered users can access the data. In conclusion, SC-CDM is the first approach to demultiplexing with a data confidentiality proof based on asymmetric key cryptography. We analyze and verify the security of SC-CDM by comparing qualitative factors and performance with the existing CDM. Moreover, we adopt a byte-level processing method with encryption to ensure efficiency while handling a large CDM.
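The core step of encrypting a CDM payload with a random symmetric key and IV can be sketched with the standard library alone. Note the hedge: a SHA-256 counter keystream stands in here for the block cipher; the paper's scheme would use a real cipher such as AES, and the sample payload is invented.

```python
# Sketch: encrypt a CDM payload with a random symmetric key and IV.
# A SHA-256 counter keystream stands in for a real block cipher (e.g. AES);
# a production system must use a vetted cipher implementation instead.
import hashlib
import secrets

def keystream(key: bytes, iv: bytes, n: int) -> bytes:
    # Derive n pseudorandom bytes from (key, IV, counter) blocks.
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + iv + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key: bytes, iv: bytes, plaintext: bytes) -> bytes:
    ks = keystream(key, iv, len(plaintext))
    return bytes(a ^ b for a, b in zip(plaintext, ks))

decrypt = encrypt  # XOR stream constructions are their own inverse

key = secrets.token_bytes(32)   # random symmetric key
iv = secrets.token_bytes(16)    # random initialization vector (IV)
cdm = b'{"person_id": 1, "observation": "..."}'   # toy CDM payload
ct = encrypt(key, iv, cdm)
print(decrypt(key, iv, ct) == cdm)  # → True
```

In the SC-CDM design the ciphertext would go to IPFS while the key, IV, and hashes travel over the blockchain; this sketch only shows the symmetric step.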
9

Willighagen, Lars G. "Citation.js: a format-independent, modular bibliography tool for the browser and command line". PeerJ Computer Science 5 (August 12, 2019): e214. http://dx.doi.org/10.7717/peerj-cs.214.

Annotation:
Background Given the vast number of standards and formats for bibliographical data, any program working with bibliographies and citations has to be able to interpret such data. This paper describes the development of Citation.js (https://citation.js.org/), a tool to parse and format according to those standards. The program follows modern guidelines for software in general and JavaScript in particular, such as version control, source code analysis, integration testing, and semantic versioning. Results The result is an extensible tool that has already seen adoption in a variety of sources and use cases: as part of a server-side page generator of a publishing platform, as part of a local extensible document generator, and as part of an in-browser converter of extracted references. Use cases range from transforming a list of DOIs or Wikidata identifiers into a BibTeX file on the command line, to displaying RIS references on a webpage with added Altmetric badges, to generating "How to cite this" sections on a blog. The accuracy of conversions is currently 27% for properties and 60% for types on average, and a typical initialization takes 120 ms in browsers and 1 s with Node.js on the command line. Conclusions Citation.js is a library supporting various formats of bibliographic information in a broad selection of use cases and environments. Given the support for plugins, more formats can be added with relative ease.
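The kind of transformation Citation.js performs, structured bibliographic data rendered as a BibTeX entry, can be sketched in a few lines. The field names follow BibTeX conventions, but the rendering function itself is illustrative and is not Citation.js's API.

```python
# Sketch: render a structured citation record as a BibTeX @article entry.
def to_bibtex(key: str, entry: dict) -> str:
    fields = ",\n".join(f"  {k} = {{{v}}}" for k, v in entry.items())
    return f"@article{{{key},\n{fields}\n}}"

entry = {
    "author": "Willighagen, Lars G.",
    "title": "Citation.js: a format-independent, modular bibliography tool",
    "journal": "PeerJ Computer Science",
    "year": "2019",
    "doi": "10.7717/peerj-cs.214",
}
print(to_bibtex("willighagen2019", entry).splitlines()[0])  # → @article{willighagen2019,
```

A real converter must also escape special characters and map between schemas (e.g. CSL-JSON to BibTeX), which is where most of the library's complexity lives.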
10

Ndichu, Samuel, Sylvester McOyowo, Henry Okoyo, and Cyrus Wekesa. "A Remote Access Security Model based on Vulnerability Management". International Journal of Information Technology and Computer Science 12, no. 5 (October 8, 2020): 38–51. http://dx.doi.org/10.5815/ijitcs.2020.05.03.

Annotation:
Information security threats exploit vulnerabilities in communication networks. Remote access vulnerabilities are evident from the point of communication initialization, following the communication channel to the data or resources being accessed. These threats differ depending on the type of device used to procure remote access. One kind of remote access device can be considered safe, as the organization issues it to provide remote access. The other kind is risky and unsafe, as it is beyond the organization's control and monitoring. The myriad of devices is, however, a necessary evil, be it employees on public networks like cyber cafes, wireless networks, vendor support, or telecommuting. A Virtual Private Network (VPN) securely connects a remote user or device to an internal or private network using the internet and other public networks. However, this conventional remote access security approach has several vulnerabilities, which attackers can exploit under the cover of encryption. The significant threats are malware, botnets, and Distributed Denial of Service (DDoS). Because of the nature of a VPN, encryption prevents traditional security devices such as firewalls, Intrusion Detection Systems (IDS), and antivirus software from detecting compromised traffic. These vulnerabilities have been exploited over time by attackers using evasive techniques to avoid detection, leading to costly security breaches and compromises. We highlight numerous shortcomings of several conventional approaches to remote access security. We then adopt network tiers to facilitate vulnerability management (VM) in remote access domains. We perform regular traffic simulation using the Network Security Simulator (NeSSi2) to set a bandwidth baseline and use this as a benchmark to investigate malware spreading capabilities and DDoS attacks by continuous flooding in remote access.
Finally, we propose a novel approach to remote access security based on passive learning of packet capture file features using machine learning and classification using a classifier model.
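The idea of learning from packet-capture features can be sketched as a toy: derive simple statistics from captured packet records and classify traffic with a trivial threshold model. The feature choice, the threshold, and the synthetic traffic below are illustrative assumptions only, not the paper's classifier.

```python
# Toy sketch: features from packet records, plus a trivial threshold classifier.
from statistics import mean

def features(packets):
    # packets: list of (timestamp_seconds, size_bytes) tuples
    sizes = [s for _, s in packets]
    gaps = [b - a for (a, _), (b, _) in zip(packets, packets[1:])]
    return {"mean_size": mean(sizes), "mean_gap": mean(gaps)}

def classify(feats, gap_threshold=0.01):
    # Flooding traffic arrives in dense bursts: tiny inter-arrival gaps.
    return "suspect-flood" if feats["mean_gap"] < gap_threshold else "benign"

normal = [(i * 0.5, 800) for i in range(20)]    # sparse, larger packets
flood = [(i * 0.001, 60) for i in range(20)]    # dense, tiny packets
print(classify(features(normal)))  # → benign
print(classify(features(flood)))   # → suspect-flood
```

A real system would extract many more features from actual pcap files and train a proper machine-learning classifier rather than a single threshold.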
11

Lafore, J. P., J. Stein, N. Asencio, P. Bougeault, V. Ducrocq, J. Duron, C. Fischer, et al. "The Meso-NH Atmospheric Simulation System. Part I: adiabatic formulation and control simulations". Annales Geophysicae 16, no. 1 (January 31, 1998): 90–109. http://dx.doi.org/10.1007/s00585-997-0090-6.

Annotation:
Abstract. The Meso-NH Atmospheric Simulation System is a joint effort of the Centre National de Recherches Météorologiques and Laboratoire d'Aérologie. It comprises several elements: a numerical model able to simulate atmospheric motions, ranging from the large meso-alpha scale down to the micro-scale, with a comprehensive physical package; a flexible file manager; an ensemble of facilities to prepare initial states, either idealized or interpolated from meteorological analyses or forecasts; a flexible post-processing and graphical facility to visualize the results; and an ensemble of interactive procedures to control these functions. Some of the distinctive features of this ensemble are the following: the model is currently based on the Lipps and Hemler form of the anelastic system, but may evolve towards a more accurate form of the equation system. In the future, it will allow for simultaneous simulation of several scales of motion, by the so-called "interactive grid-nesting technique". It allows for the in-line computation and accumulation of various terms of the budget of several quantities. It allows for the transport and diffusion of passive scalars, to be coupled with a chemical module. It uses the relatively new Fortran 90 compiler. It is tailored to be easily implemented on any UNIX machine. Meso-NH is designed as a research tool for small and meso-scale atmospheric processes. It is freely accessible to the research community, and we have tried to make it as "user-friendly" as possible, and as general as possible, although these two goals sometimes appear contradictory. The present paper presents a general description of the adiabatic formulation and some of the basic validation simulations. A list of the currently available physical parametrizations and initialization methods is also given. A more precise description of these aspects will be provided in a further paper.
12

Naqvi, Syed, Tallha Akram, Sajjad Haider, Muhammad Kamran, Aamir Shahzad, Wilayat Khan, Tassawar Iqbal, and Hafiz Umer. "Precision Modeling: Application of Metaheuristics on Current–Voltage Curves of Superconducting Films". Electronics 7, no. 8 (August 3, 2018): 138. http://dx.doi.org/10.3390/electronics7080138.

Annotation:
Contemplating the importance of studying current–voltage curves in superconductivity, it has recently and rightly been argued that their approximation, rather than incessant measurement, seems to be the more viable option. This is especially true when the latter need to be recorded for a wide range of critical parameters, including temperature and magnetic field, making the process a tedious, monotonous procedure. Artificial neural networks have recently been put forth as one methodology for approximating these so-called electrical measurements for various geometries of antidots on a superconducting thin film. In this work, we demonstrate that the prediction accuracy, in terms of mean-squared error, achieved by artificial neural networks is rather constrained and, due to their heavy dependence on randomly generated network coefficients, they may yield vastly varying prediction accuracies for different geometries, experimental conditions, and their own tunable parameters. This inconsistency in prediction accuracy is resolved by controlling the uncertainty in network initialization and coefficient generation by means of a novel entropy-based genetic algorithm. The proposed method helps achieve a substantial improvement and consistency in the prediction accuracy of current–voltage curves in comparison to existing works, and is amenable to various geometries of antidots, including rectangular, square, honeycomb, and kagome, on a superconducting thin film.
13

Mitra, Parta, and Asim Halder. "Effect of initialization time on application potentiality of a ZnO thin film based LPG sensor". Materials Research 12, no. 3 (September 2009): 329–32. http://dx.doi.org/10.1590/s1516-14392009000300013.

14

Tudor, M. "Methods for automatized detection of rapid changes in lateral boundary condition fields for NWP limited area models". Geoscientific Model Development 8, no. 8 (August 24, 2015): 2627–43. http://dx.doi.org/10.5194/gmd-8-2627-2015.

Annotation:
Abstract. Three-hourly temporal resolution of lateral boundary data for limited area models (LAMs) can be too infrequent to resolve rapidly moving storms. This problem is expected to be worse with increasing horizontal resolution. In order to detect intensive disturbances in surface pressure moving rapidly through the model domain, a filtered surface pressure field (MCUF) is computed operationally in the ARPEGE global model of Météo France. The field is distributed in the coupling files along with conventional meteorological fields used for lateral boundary conditions (LBCs) for the operational forecast using limited area model ALADIN (Aire Limitée Adaptation dynamique Développement InterNational) in the Meteorological and Hydrological Service of Croatia (DHMZ). Here an analysis is performed of the MCUF field for the LACE coupling domain for the period from 23 January 2006, when it became available, until 15 November 2014. The MCUF field is a good indicator of rapidly moving pressure disturbances (RMPDs). Its spatial and temporal distribution can be associated with the usual cyclone tracks and areas known to be supporting cyclogenesis. An alternative set of coupling files from the IFS operational run in the European Centre for Medium-Range Weather Forecasts (ECMWF) is also available operationally in DHMZ with 3-hourly temporal resolution, but the MCUF field is not available. Here, several methods are tested that detect RMPDs in surface pressure a posteriori from the IFS model fields provided in the coupling files. MCUF is computed by running ALADIN on the coupling files from IFS. The error function is computed using one-time-step integration of ALADIN on the coupling files without initialization, initialized with digital filter initialization (DFI) or scale-selective DFI (SSDFI). Finally, the amplitude of changes in the mean sea level pressure is computed from the fields in the coupling files. 
The results are compared to the MCUF field of ARPEGE and to the results of the same methods applied to the coupling files from ARPEGE. Most methods give a signal for the RMPDs, but DFI weakens the storms too much for them to be detected. The error functions without filtering and the amplitude have more noise, but the signal of an RMPD is also stronger. The methods are tested for the NWP LAM ALADIN, but could be applied to other LAMs and benefit the performance of climate LAMs.
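The simplest of the detection methods above, flagging a rapidly moving pressure disturbance from the amplitude of mean sea level pressure changes between consecutive coupling times, can be sketched directly. The 5 hPa threshold and the sample series are illustrative assumptions, not the paper's tuned values.

```python
# Sketch: flag a rapidly moving pressure disturbance (RMPD) when MSLP
# changes by more than a threshold between consecutive 3-hourly coupling files.
def detect_rmpd(mslp_series_hpa, threshold_hpa=5.0):
    # mslp_series_hpa: domain-minimum MSLP (hPa) at successive coupling times
    flags = []
    for prev, curr in zip(mslp_series_hpa, mslp_series_hpa[1:]):
        flags.append(abs(curr - prev) > threshold_hpa)
    return flags

# A deep cyclone crossing the domain between coupling times:
series = [1012.0, 1010.5, 1002.0, 994.0, 1008.0]
print(detect_rmpd(series))  # → [False, True, True, True]
```

The MCUF approach filters the full 2D surface pressure field rather than a scalar series, but the amplitude test above captures the same intuition: large changes between coupling times signal a storm the 3-hourly boundary data may fail to resolve.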
15

Simanullang, Harlen Gilbert, Arina Prima Silalahi, and Darwis Robinson Manalu. "Sistem Informasi Pendaftaran Mahasiswa Baru Menggunakan Framework Codeigniter dan Application Programming Interface". Ultima InfoSys: Jurnal Ilmu Sistem Informasi 12, no. 1 (June 29, 2021): 67–73. http://dx.doi.org/10.31937/si.v12i1.1803.

Annotation:
One of the most important things in any work process is the presentation of fast and accurate information: when data is processed in a short time, work efficiency increases, for example by using a system to process and display data as information. Information systems are needed as technology develops and people are increasingly required to use technology to reduce activities outside the home. Problems that often arise are crowding at the registration location and the long service time per registrant, caused by filling out registration forms and screening examinations for new students. In addition, the large accumulation of registration and examination files also takes a long time to process, so that announcements of new students who passed cannot be made promptly. The goal is to build a new student admission information system for all faculties at the Indonesian Methodist University that reduces the number of applicants who come to the registration location, reduces the use of paper registration forms, helps with submitting and filling in student data, conducts online screening exams, and provides information on new student registration, schedules, and payment of tuition fees. The result is a new student registration information system built using the CodeIgniter framework and an Application Programming Interface (API). Index Terms—Application Programming Interface (API); Codeigniter framework; information systems; online registration
16

Li, Fan, and Jing Zhe Pan. "Modelling Multi-Cracking in Thin Films during Constrained Sintering Using Anisotropic Constitutive Law and Material Point Method". Advances in Science and Technology 62 (October 2010): 191–96. http://dx.doi.org/10.4028/www.scientific.net/ast.62.191.

Annotation:
The sintering of thin films is widely used for surface coatings and, because of its technological importance, has generated extensive research interest. During the sintering process, the thin film is constrained by the substrate, which generates considerably high levels of stress. Cracking is often observed and is considered one of the major problems of the surface coating technique. This paper proposes a new numerical method to tackle the traditional difficulties in simulating multi-cracking during constrained sintering. The main features of the present method are: (i) the material is represented by an anisotropic constitutive law, (ii) a new numerical scheme is developed for crack initialization and growth based on the material point method, (iii) the 3D viscous film shrinkage model is solved using a dynamic FE scheme, and (iv) the random nature of the initial green body density is represented by statistical variabilities. It is shown that the model proposed in the present paper captures the nucleation and propagation of multiple cracks in a straightforward manner. The cracking patterns are consistent with experimental observations. The focus of the paper is on the numerical issues and on demonstrating the capability of the model.
17

Dhaliwal, Gurpinder, and Golam Newaz. "Effect of Resin Rich Veil Cloth Layers on the Uniaxial Tensile Behavior of Carbon Fiber Reinforced Fiber Metal Laminates". Journal of Composites Science 2, no. 4 (October 19, 2018): 61. http://dx.doi.org/10.3390/jcs2040061.

Annotation:
The influence of the stacking sequence and of resin-rich (polyester veil cloth) layers, which were used to improve the adhesion between carbon fiber/epoxy (CFRP) and aluminum (AL) layers, on the uniaxial tensile response of carbon fiber reinforced aluminum laminates (CARALL) was investigated in this research study. The metal volume fraction was varied to prepare two types of CARALL laminates with a 3/2 configuration using a vacuum press, without any adhesive film. Numerical simulations were performed using the commercially available finite element (FE) code LS-Dyna to predict the tensile response of these laminates, with initialization of the predicted thermal residual stresses that develop during curing of the laminates. Delamination failure was considered in the numerical simulation by utilizing the well-known B-K mixed-mode damage propagation model. It was found that the addition of epoxy resin-rich (polyester veil cloth) layers, used to enhance interfacial bond adhesion and to ensure no separation between the AL and CFRP layers, increased the tensile strength of the CARALL laminates.
18

Shi, L. P., and T. C. Chong. "Nanophase Change for Data Storage Applications". Journal of Nanoscience and Nanotechnology 7, no. 1 (January 1, 2007): 65–93. http://dx.doi.org/10.1166/jnn.2007.18007.

Annotation:
Phase change materials are widely used for data storage. The most widespread and important applications are rewritable optical discs and Phase Change Random Access Memory (PCRAM), which utilize light-induced and electrically induced phase change, respectively. For decades, miniaturization has been the major driving force for increasing density, and the working unit area of current data storage media is now on the nanometer scale. At the nano-scale, extreme dimensional and nano-structural constraints and the large proportion of interfaces cause the phase change behavior to deviate from that of the bulk. Hence an in-depth understanding of nanophase change and the related issues has become more and more important. Nanophase change can be defined as phase change at scales within the nano range of 100 nm, which is size-dependent, interface-dominated, and related to the surrounding materials. Nanophase change can be classified into two groups: thin-film related and structure related. Film thickness and capping materials are the key factors for the thin-film type, while structure shape, size, and surrounding materials are the critical parameters for the structure type. In this paper, the recent development of nanophase change is reviewed, including the crystallization of small elements at nano size, the thickness dependence of crystallization, the effect of the capping layer on the phase change of phase change thin films, and so on. Applications of nanophase change technology to data storage are introduced, including optical recording (superlattice-like optical discs, initialization-free discs, near-field recording, super-RENS, dual layer, multi-level, and probe storage) and PCRAM (superlattice-like, side-edge, and line-type structures). Future key research issues of nanophase change are also discussed.
APA, Harvard, Vancouver, ISO, and other citation styles
19

Tudor, M. "Methods for automatized detection of rapid changes in lateral boundary condition fields for NWP limited area models". Geoscientific Model Development Discussions 8, No. 3 (10.03.2015): 2691–737. http://dx.doi.org/10.5194/gmdd-8-2691-2015.

Full text of the source
Annotation:
Abstract. A three-hourly temporal resolution of lateral boundary data can be too low to properly resolve rapidly moving storms. This problem is expected to worsen with increasing horizontal resolution. In order to detect intense disturbances in surface pressure moving rapidly through the model domain, a filtered surface pressure field (MCUF) is computed operationally in the ARPEGE global model of Météo France. The field is distributed in the coupling files along with the conventional meteorological fields used as lateral boundary conditions (LBCs) for the operational forecast with the limited area model ALADIN in the Meteorological and Hydrological Service of Croatia (DHMZ). Here an analysis is performed of the MCUF field for the LACE coupling domain for the period from 23 January 2006, when it became available, until 15 November 2014. The MCUF field is a good indicator of rapidly moving pressure disturbances (RMPDs). Its spatial and temporal distribution can be associated with the usual cyclone tracks and areas known to support cyclogenesis. Other global models do not compute such a field. An alternative set of coupling files from the IFS operational run at ECMWF is also available operationally in DHMZ with three-hourly temporal resolution, but the MCUF field is not available. Here, several methods are tested that detect RMPDs in surface pressure a posteriori from the IFS model fields provided in the coupling files. MCUF is computed by running ALADIN on the coupling files from IFS. The error function is computed using a one-time-step integration of ALADIN on the coupling files without initialization, initialized with DFI, or with SSDFI. Finally, the amplitude of changes in the mean sea level pressure is computed from the fields in the coupling files. The results are compared to the MCUF field of ARPEGE and to the results of the same methods applied to the coupling files from ARPEGE.
Most methods give a signal for RMPDs, but DFI smooths the storms too much for them to be detected. The error function without filtering and the amplitude method have more noise, but the signal of an RMPD is also stronger. The methods are tested for an NWP LAM, but could also be applied to, and benefit the performance of, climate LAMs.
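The simplest of the a-posteriori detection methods mentioned above, computing the amplitude of mean-sea-level-pressure changes between successive coupling files, can be sketched as follows. This is a minimal illustration, not the operational code: the function name and the 5 hPa threshold are assumptions.

```python
import numpy as np

def mslp_change_amplitude(mslp_prev, mslp_curr, threshold_hpa=5.0):
    """Flag grid points where mean sea level pressure changes by more
    than `threshold_hpa` between two successive coupling files.

    mslp_prev, mslp_curr: 2-D arrays of MSLP in hPa at consecutive
    coupling times (e.g. 3 h apart). Returns the amplitude field and a
    boolean mask of suspected rapidly moving pressure disturbances.
    """
    amplitude = np.abs(mslp_curr - mslp_prev)
    return amplitude, amplitude > threshold_hpa

# Toy example: a deep pressure drop appears at one grid point.
prev = np.full((4, 4), 1013.0)
curr = prev.copy()
curr[2, 2] -= 12.0  # 12 hPa drop within one coupling interval
amp, mask = mslp_change_amplitude(prev, curr)
```

As the abstract notes, such an amplitude field is noisier than the filtered MCUF field but gives a strong signal where an RMPD passes.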
APA, Harvard, Vancouver, ISO, and other citation styles
20

Liu, G., J. J. Liu, D. W. Tarasick, V. E. Fioletov, J. J. Jin, O. Moeini, X. Liu and C. E. Sioris. "A global tropospheric ozone climatology from trajectory-mapped ozone soundings". Atmospheric Chemistry and Physics Discussions 13, No. 5 (02.05.2013): 11473–507. http://dx.doi.org/10.5194/acpd-13-11473-2013.

Full text of the source
Annotation:
Abstract. A global three-dimensional (i.e. latitude, longitude, altitude) climatology of tropospheric ozone is derived from the ozone sounding record by trajectory mapping. Approximately 52 000 ozonesonde profiles from more than 100 stations worldwide since 1962 are used. Because of the small number of stations, the geographical distribution of the ozone soundings is sparse. Here, forward and backward trajectory calculations are performed for each sounding to map ozone measurements to a number of other locations, and so to fill in the spatial domain. This is possible because the lifetime of ozone in the troposphere is of the order of weeks. This physically based interpolation method offers obvious advantages over typical statistical interpolation methods. The trajectory-mapped ozone values show reasonable agreement, where they overlap, with the actual soundings, and the patterns produced separately by forward and backward trajectory calculations are similar. Major regional features of the tropospheric ozone distribution are clearly evident in the global maps. An interpolation algorithm based on spherical functions is further used for smoothing and to fill in remaining data gaps. The resulting three-dimensional global tropospheric ozone climatology facilitates visualization and comparison of different years, decades, and seasons, and offers some intriguing insights into the global variation of tropospheric ozone. It will be useful for climate and air quality model initialization and validation, and as an a priori climatology for satellite data retrievals. Further division of the climatology into decadal averages provides a global view of tropospheric ozone trends, which appear to be surprisingly modest over the last four decades.
APA, Harvard, Vancouver, ISO, and other citation styles
21

Vázquez-Romero, Adrián, and Ascensión Gallardo-Antolín. "Automatic Detection of Depression in Speech Using Ensemble Convolutional Neural Networks". Entropy 22, No. 6 (20.06.2020): 688. http://dx.doi.org/10.3390/e22060688.

Full text of the source
Annotation:
This paper proposes a speech-based method for automatic depression classification. The system is based on ensemble learning for Convolutional Neural Networks (CNNs) and is evaluated using the data and the experimental protocol provided in the Depression Classification Sub-Challenge (DCC) at the 2016 Audio–Visual Emotion Challenge (AVEC-2016). In the pre-processing phase, speech files are represented as a sequence of log-spectrograms and randomly sampled to balance positive and negative samples. For the classification task itself, first, a more suitable architecture for this task, based on One-Dimensional Convolutional Neural Networks, is built. Secondly, several of these CNN-based models are trained with different initializations and then the corresponding individual predictions are fused by using an Ensemble Averaging algorithm and combined per speaker to get an appropriate final decision. The proposed ensemble system achieves satisfactory results on the DCC at the AVEC-2016 in comparison with a reference system based on Support Vector Machines and hand-crafted features, with a CNN+LSTM-based system called DepAudionet, and with the case of a single CNN-based classifier.
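The ensemble-averaging and per-speaker fusion steps described in this abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the function names and the 0.5 decision threshold are assumptions.

```python
import numpy as np

def ensemble_average(model_probs):
    """Average the depression probabilities predicted by several
    independently initialized models for the same speech segments.

    model_probs: array-like of shape (n_models, n_segments).
    """
    return np.asarray(model_probs).mean(axis=0)

def decide_per_speaker(segment_probs, speaker_ids, threshold=0.5):
    """Combine segment-level ensemble probabilities into one decision
    per speaker by averaging that speaker's segments."""
    decisions = {}
    for spk in set(speaker_ids):
        probs = [p for p, s in zip(segment_probs, speaker_ids) if s == spk]
        decisions[spk] = float(np.mean(probs)) >= threshold
    return decisions

# Two models score three segments; segments 1-2 belong to speaker "a".
probs = ensemble_average([[0.9, 0.2, 0.6],
                          [0.7, 0.4, 0.8]])
result = decide_per_speaker(probs, ["a", "a", "b"])
```

The averaging step is what the abstract calls the Ensemble Averaging algorithm; the per-speaker step reflects the final combination "per speaker to get an appropriate final decision".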
APA, Harvard, Vancouver, ISO, and other citation styles
22

Starodubtseva, I. P., and A. N. Pavlenko. "Peculiarities of the initialization and dynamics of the quench front on an extremely overheated plate cooled by a falling film of a cryogenic fluid". Journal of Physics: Conference Series 1677 (November 2020): 012097. http://dx.doi.org/10.1088/1742-6596/1677/1/012097.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
23

Wang, Dengwei. "An Efficient Multi-Scale Local Binary Fitting-Based Level Set Method for Inhomogeneous Image Segmentation". Journal of Sensors 2018 (05.08.2018): 1–17. http://dx.doi.org/10.1155/2018/6269214.

Full text of the source
Annotation:
An efficient level set model based on multiscale local binary fitting (MLBF) is proposed for image segmentation. By introducing the multiscale idea into the LBF model, the proposed MLBF model can effectively and efficiently segment images with intensity inhomogeneity. In addition, by adding a reaction-diffusion term into the level set evolution (LSE) equation, regularization of the level set function (LSF) can be achieved, thus completely eliminating the time-consuming reinitialization process. In the implementation phase, in order to greatly improve the efficiency of the numerical solution of the level set segmentation model, we introduce three strategies: the first is the additive operator splitting (AOS) solver, which is used to break the restrictions on the time step; the second is the salient target detection mechanism, which is used to achieve fully automatic initialization of the LSE process; the third is the sparse field method (SFM), which is used to restrict the groups of pixels that need to be updated to a small strip region. Under the combined effect of these three strategies, the proposed model achieves very high execution efficiency in the following aspects: contour location accuracy, speed of evolution convergence, robustness against initial contour position, and robustness against noise interference.
APA, Harvard, Vancouver, ISO, and other citation styles
24

Liu, G., J. Liu, D. W. Tarasick, V. E. Fioletov, J. J. Jin, O. Moeini, X. Liu, C. E. Sioris and M. Osman. "A global tropospheric ozone climatology from trajectory-mapped ozone soundings". Atmospheric Chemistry and Physics 13, No. 21 (04.11.2013): 10659–75. http://dx.doi.org/10.5194/acp-13-10659-2013.

Full text of the source
Annotation:
Abstract. A global three-dimensional (i.e. latitude, longitude, altitude) climatology of tropospheric ozone is derived from the ozone sounding record by trajectory mapping. Approximately 52 000 ozonesonde profiles from more than 100 stations worldwide since 1965 are used. The small number of stations results in a sparse geographical distribution. Here, forward and backward trajectory calculations are performed for each sounding to map ozone measurements to a number of other locations, and so to fill in the spatial domain. This is possible because the lifetime of ozone in the troposphere is of the order of weeks. This physically based interpolation method offers obvious advantages over typical statistical interpolation methods. The trajectory-mapped ozone values show reasonable agreement, where they overlap, to the actual soundings, and the patterns produced separately by forward and backward trajectory calculations are similar. Major regional features of the tropospheric ozone distribution are clearly evident in the global maps. An interpolation algorithm based on spherical functions is further used for smoothing and to fill in remaining data gaps. The resulting three-dimensional global tropospheric ozone climatology facilitates visualization and comparison of different years, decades, and seasons, and offers some intriguing insights into the global variation of tropospheric ozone. It will be useful for climate and air quality model initialization and validation, and as an a priori climatology for satellite data retrievals. Further division of the climatology into decadal and annual averages can provide a global view of tropospheric ozone changes, although uncertainties with regard to the performance of older sonde types, as well as more recent variations in operating procedures, need to be taken into account.
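The trajectory-mapping (domain-filling) idea described in this abstract can be sketched as follows: each sounding's ozone value is deposited into every grid cell its forward or backward trajectory visits, and cells are averaged over all deposits. This is a toy illustration under the assumption of precomputed trajectory positions, not the authors' implementation; the 5° grid spacing is chosen for the example.

```python
import numpy as np

def trajectory_map(soundings, trajectories, lat_edges, lon_edges):
    """Build a 2-D climatology grid by depositing each measured value
    along its trajectory path and averaging per cell.

    soundings: list of measured ozone values (one per sounding).
    trajectories: for each sounding, a list of (lat, lon) positions
    along its forward/backward trajectory. Cells never visited are NaN.
    """
    total = np.zeros((len(lat_edges) - 1, len(lon_edges) - 1))
    count = np.zeros_like(total)
    for value, path in zip(soundings, trajectories):
        for lat, lon in path:
            i = np.searchsorted(lat_edges, lat) - 1  # latitude bin
            j = np.searchsorted(lon_edges, lon) - 1  # longitude bin
            if 0 <= i < total.shape[0] and 0 <= j < total.shape[1]:
                total[i, j] += value
                count[i, j] += 1
    return np.where(count > 0, total / np.maximum(count, 1), np.nan)

lat_edges = np.arange(-90, 91, 5.0)
lon_edges = np.arange(-180, 181, 5.0)
# One sounding of 40 ppbv advected through two nearby positions.
grid = trajectory_map([40.0], [[(45.2, 10.1), (46.0, 12.3)]],
                      lat_edges, lon_edges)
```

The physical justification, as the abstract notes, is that the ozone value stays approximately valid along the trajectory because the tropospheric ozone lifetime is of the order of weeks.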
APA, Harvard, Vancouver, ISO, and other citation styles
25

Kozachok, Vasilii, Alexander Kozachok and Evgenii Kochetkov. "Multi-Level Policy Model Access Control Security Operating Systems of the Windows Family". Voprosy kiberbezopasnosti, No. 1(41) (2021): 41–56. http://dx.doi.org/10.21681/2311-3456-2021-1-41-56.

Full text of the source
Annotation:
The purpose of the research is the development of a more advanced access control mechanism for the Windows NT family to protect against information leakage from memory via hidden channels. The research method is the analysis of the Windows NT family's models of mandatory access control and integrity control, modeling of an access control security policy with specified security properties, and automatic verification of the models. The Lamport Temporal Logic of Actions (TLA+) is used to describe the model and its specification; TLA+ allows automatic verification of the model against the specified security properties. As a result of the research, the main limitations of the existing mandatory integrity control of operating systems of the Windows NT family are revealed. A set of structures for a multilevel model has been developed, reflecting the attributes that are significant for modeling the process of access of subjects to objects. The key access control mechanisms in the operating system are modeled: management of users, groups, subjects, objects, roles and rights; discretionary and mandatory access control; and mandatory integrity control, i.e. multilevel control of subjects' access to objects. The model defines a mechanism for controlling the creation of subjects based on executable files in order to organize an isolated software environment. The values of the attributes of the model variables for the initialization stage are determined. Invariants of variable correctness during verification and of safe access of subjects to objects are developed. The model was specified using the TLA+ modeling language and verified.
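The mandatory integrity control that this model extends can be illustrated with a toy access check in the spirit of Windows mandatory integrity control. This is a hedged sketch: the level names and rules below are simplified assumptions for illustration, not the verified TLA+ model from the paper.

```python
from enum import IntEnum

class Integrity(IntEnum):
    """Simplified integrity levels, loosely modeled on Windows
    mandatory integrity control (names are illustrative)."""
    LOW = 1
    MEDIUM = 2
    HIGH = 3
    SYSTEM = 4

def may_write(subject_level: Integrity, object_level: Integrity) -> bool:
    """No-write-up rule: a subject may modify an object only if the
    subject's integrity level dominates the object's level."""
    return subject_level >= object_level

def may_read(subject_level: Integrity, object_level: Integrity,
             no_read_down: bool = False) -> bool:
    """Reads are normally allowed; an optional stricter policy
    (no-read-down) forbids reading objects of lower integrity,
    which is one way to limit leakage through shared memory."""
    return subject_level <= object_level if no_read_down else True
```

A multilevel model such as the one in the paper states rules like these as invariants over all subjects and objects and then verifies them mechanically.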
APA, Harvard, Vancouver, ISO, and other citation styles
26

Lencova, B., and G. Wisselink. "The Added Value of Graphical Input and Display for Electron Lens Design". Proceedings, annual meeting, Electron Microscopy Society of America 48, No. 1 (12.08.1990): 190–91. http://dx.doi.org/10.1017/s0424820100179701.

Full text of the source
Annotation:
Recent progress in computer technology enables the calculation of lens fields and focal properties on commonly available computers such as IBM ATs. If we add to this the use of graphics, we greatly increase the applicability of design programs for electron lenses. Most programs for field computation are based on the finite element method (FEM). They are written in Fortran 77, so that they are easily transferred from PCs to larger machines. The design process has recently been made significantly more user friendly by adding input programs written in Turbo Pascal, which allows a flexible implementation of computer graphics. The input programs provide not only menu-driven input and modification of numerical data, but also graphics editing of the data. The input programs create files which are subsequently read by the Fortran programs. From the main menu of our magnetic lens design program, further options are chosen by using function keys or numbers. Some options (lens initialization and setting, fine mesh, current densities, etc.) open other menus where computation parameters can be set or numerical data can be entered with the help of a simple line editor. The "draw lens" option enables graphical editing of the mesh (see fig. 1). The geometry of the electron lens is specified in terms of coordinates and indices of a coarse quadrilateral mesh. In this mesh, the fine mesh with smoothly changing step size is calculated by an automeshing procedure. The options shown in fig. 1 allow modification of the number of coarse mesh lines, change of coordinates of mesh points or lines, and specification of lens parts. Interactive and graphical modification of the fine mesh can be called from the fine mesh menu. Finally, the lens computation can be called. Our FEM program allows up to 8000 mesh points on an AT computer.
Another menu allows the display of computed results stored in output files and graphical display of axial flux density, flux density in magnetic parts, and the flux lines in magnetic lenses (see fig. 2). A series of several lens excitations with user-specified or default magnetization curves can be calculated and displayed in one session.
APA, Harvard, Vancouver, ISO, and other citation styles
27

Krishfield, R., J. Toole, A. Proshutinsky and M.-L. Timmermans. "Automated Ice-Tethered Profilers for Seawater Observations under Pack Ice in All Seasons". Journal of Atmospheric and Oceanic Technology 25, No. 11 (01.11.2008): 2091–105. http://dx.doi.org/10.1175/2008jtecho587.1.

Full text of the source
Annotation:
Abstract An automated, easily deployed Ice-Tethered Profiler (ITP) instrument system, developed for deployment on perennial sea ice in the polar oceans to measure changes in upper ocean water properties in all seasons, is described, and representative data from prototype instruments are presented. The ITP instrument consists of three components: a surface subsystem that sits atop an ice floe; a weighted, plastic-jacketed wire-rope tether of arbitrary length (up to 800 m) suspended from the surface element; and an instrumented underwater unit that employs a traction drive to profile up and down the wire tether. ITPs profile the water column at a programmed sampling interval; after each profile, the underwater unit transfers two files holding oceanographic and engineering data to the surface unit using an inductive modem and from the surface instrument to a shore-based data server using an Iridium transmitter. The surface instrument also accumulates battery voltage readings, buoy temperature data, and locations from a GPS receiver at a specified interval (usually every hour) and transmits those data daily. Oceanographic and engineering data are processed, displayed, and made available in near–real time (available online at http://www.whoi.edu/itp). Six ITPs were deployed in the Arctic Ocean between 2004 and 2006 in the Beaufort gyre with various programmed sampling schedules of two to six one-way traverses per day between 10- and 750–760-m depth, providing more than 5300 profiles in all seasons (as of July 2007). The acquired CTD profile data document interesting spatial variations in the major water masses of the Canada Basin, show the double-diffusive thermohaline staircase that lies above the warm, salty Atlantic layer, measure seasonal surface mixed layer deepening, and document several mesoscale eddies. 
To augment the systems already deployed and to replace expiring systems, an international array of more than one dozen ITPs will be deployed as part of the Arctic Observing Network during the International Polar Year (IPY) period (2007–08), holding promise for more valuable real-time upper ocean observations for operational needs, to support studies of ocean processes, and to facilitate numerical model initialization and validation.
APA, Harvard, Vancouver, ISO, and other citation styles
28

Osman, M., D. W. Tarasick, J. Liu, O. Moeini, V. Thouret, V. E. Fioletov, M. Parrington and P. Nédélec. "Carbon monoxide climatology derived from the trajectory mapping of global MOZAIC-IAGOS data". Atmospheric Chemistry and Physics Discussions 15, No. 21 (02.11.2015): 29871–937. http://dx.doi.org/10.5194/acpd-15-29871-2015.

Full text of the source
Annotation:
Abstract. A three-dimensional gridded climatology of carbon monoxide (CO) has been developed by trajectory mapping of global MOZAIC-IAGOS in situ measurements from commercial aircraft data. CO measurements made during aircraft ascent and descent, comprising nearly 41 200 profiles at 148 airports worldwide from December 2001 to December 2012, are used. Forward and backward trajectories are calculated from meteorological reanalysis data in order to map the CO measurements to other locations, and so to fill in the spatial domain. This domain-filling technique employs 15 800 000 calculated trajectories to map otherwise sparse MOZAIC-IAGOS data into a quasi-global field. The resulting trajectory-mapped CO dataset is archived monthly from 2001–2012 on a grid of 5° longitude × 5° latitude × 1 km altitude, from the surface to 14 km altitude. The mapping product has been carefully evaluated, by comparing maps constructed using only forward trajectories and using only backward trajectories. The two methods show similar global CO distribution patterns. The magnitude of their differences is most commonly 10 % or less, and found to be less than 30 % for almost all cases. The trajectory-mapped CO dataset has also been validated by comparing profiles for individual airports with those produced by the mapping method when data from that site are excluded. While there are larger differences below 2 km, the two methods agree very well between 2 and 10 km with the magnitude of biases within 20 %. Maps are also compared with Version 6 data from the Measurements Of Pollution In The Troposphere (MOPITT) satellite instrument. While agreement is good in the lowermost troposphere, the MOPITT CO profile shows negative biases of ~ 20 % between 500 and 300 hPa. These upper-tropospheric biases are not related to the mapping procedure, as almost identical differences are found with the original in situ MOZAIC-IAGOS data.
The total CO trajectory-mapped MOZAIC-IAGOS climatology column agrees with the MOPITT CO total column within ±5 %, which is consistent with previous reports. The maps clearly show major regional CO sources such as biomass burning in central and southern Africa and anthropogenic emissions in eastern China. The dataset shows the seasonal CO cycle over different latitude bands and altitude ranges that are representative of the regions, as well as long-term trends over latitude bands. We observe a decline in CO over the Northern Hemisphere extratropics and the tropics consistent with that reported by previous studies. Similar maps have been made using the concurrent O3 measurements by MOZAIC-IAGOS, as the global variation of O3–CO correlations can be a useful tool for the evaluation of ozone sources and transport in chemical transport models. We anticipate use of the trajectory-mapped MOZAIC-IAGOS CO dataset as an a priori climatology for satellite retrieval, and for air quality model validation and initialization.
APA, Harvard, Vancouver, ISO, and other citation styles
29

Osman, Mohammed K., David W. Tarasick, Jane Liu, Omid Moeini, Valerie Thouret, Vitali E. Fioletov, Mark Parrington and Philippe Nédélec. "Carbon monoxide climatology derived from the trajectory mapping of global MOZAIC-IAGOS data". Atmospheric Chemistry and Physics 16, No. 15 (12.08.2016): 10263–82. http://dx.doi.org/10.5194/acp-16-10263-2016.

Full text of the source
Annotation:
Abstract. A three-dimensional gridded climatology of carbon monoxide (CO) has been developed by trajectory mapping of global MOZAIC-IAGOS in situ measurements from commercial aircraft data. CO measurements made during aircraft ascent and descent, comprising nearly 41 200 profiles at 148 airports worldwide from December 2001 to December 2012, are used. Forward and backward trajectories are calculated from meteorological reanalysis data in order to map the CO measurements to other locations and so to fill in the spatial domain. This domain-filling technique employs 15 800 000 calculated trajectories to map otherwise sparse MOZAIC-IAGOS data into a quasi-global field. The resulting trajectory-mapped CO data set is archived monthly from 2001 to 2012 on a grid of 5° longitude × 5° latitude × 1 km altitude, from the surface to 14 km altitude. The mapping product has been carefully evaluated, firstly by comparing maps constructed using only forward trajectories and using only backward trajectories. The two methods show similar global CO distribution patterns. The magnitude of their differences is most commonly 10 % or less and found to be less than 30 % for almost all cases. Secondly, the method has been validated by comparing profiles for individual airports with those produced by the mapping method when data from that site are excluded. While there are larger differences below 2 km, the two methods agree very well between 2 and 10 km with the magnitude of biases within 20 %. Finally, the mapping product is compared with global MOZAIC-IAGOS cruise-level data, which were not included in the trajectory-mapped data set, and with independent data from the NOAA aircraft flask sampling program. The trajectory-mapped MOZAIC-IAGOS CO values show generally good agreement with both independent data sets. Maps are also compared with version 6 data from the Measurements Of Pollution In The Troposphere (MOPITT) satellite instrument.
Both data sets clearly show major regional CO sources such as biomass burning in Central and southern Africa and anthropogenic emissions in eastern China. While the maps show similar features and patterns, and relative biases are small in the lowermost troposphere, we find differences of ∼ 20 % in CO volume mixing ratios between 500 and 300 hPa. These upper-tropospheric biases are not related to the mapping procedure, as almost identical differences are found with the original in situ MOZAIC-IAGOS data. The total CO trajectory-mapped MOZAIC-IAGOS column is also higher than the MOPITT CO total column by 12–16 %. The data set shows the seasonal CO cycle over different latitude bands and altitude ranges as well as long-term trends over different latitude bands. We observe a decline in CO over the northern hemispheric extratropics and the tropics consistent with that reported by previous studies using other data sources. We anticipate use of the trajectory-mapped MOZAIC-IAGOS CO data set as an a priori climatology for satellite retrieval and for air quality model validation and initialization.
APA, Harvard, Vancouver, ISO, and other citation styles
30

Deng, He Lian, and You Gang Xiao. "Development of General Embedded Intelligent Monitoring System for Tower Crane". Applied Mechanics and Materials 103 (September 2011): 394–98. http://dx.doi.org/10.4028/www.scientific.net/amm.103.394.

Full text of the source
Annotation:
For improving the generality, expandability and accuracy, the general embedded intelligent monitoring system of tower crane is developed. The system can be applied to different kinds of tower cranes running at any lifting ratio, can be initialized using U disk with the information of tower crane, and fit the lifting torque curve automatically. In dangerous state, the system can sent out alarm signals with sounds and lights, and cut off power by sending signals to PLC through communication interface RS485. When electricity goes off suddenly, the system can record the real-time operating information automatically, and store them in a black box, which can be taken as the basis for confirming the accident responsibility.In recent years, tower cranes play a more and more important role in the construction of tall buildings, in other construction fields are also more widely used. For the safety of tower cranes, various monitors have been developed for monitoring the running information of crane tower [1-8]. These monitors can’t eliminate the errors caused by temperature variations automatically. The specific tower crane’s parameters such as geometric parameters, alarming parameters, lifting ratio, lifting torque should be embedded into the core program, so a monitor can only be applied to a specific type of tower crane, lack of generality and expansibility.For improving the defects of the existing monitors, a general intelligent monitoring modular system of tower crane with high precision is developed, which can initialize the system automatically, eliminate the temperature drift and creep effect of sensor, and store power-off data, which is the function of black box.Hardware design of the monitoring systemThe system uses modularized design mode. These modules include embedded motherboard module, sensor module, signal processing module, data acquisition module, power module, output control module, display and touch screen module. 
The hardware structure is shown in figure 1. Figure 1 Hardware structure of the monitoring systemEmbedded motherboard module is the core of the system. The motherboard uses the embedded microprocessor ARM 9 as MCU, onboard SDRAM and NAND Flash. Memory size can be chosen according to users’ needs. SDRAM is used for running procedure and cache data. NAND Flash is used to store embedded Linux operating system, applications and operating data of tower crane. Onboard clock with rechargeable batteries provides the information of year, month, day, hour, minute and second. This module provides time tag for real-time operating data. Most interfaces are taken out by the plugs on the embedded motherboard. They include I/O interface, RS232 interface, RS485 interface, USB interface, LCD interface, Audio interface, Touch Screen interface. Pull and plug structure is used between all interfaces and peripheral equipments, which not only makes the system to be aseismatic, but also makes its configuration flexible. Watch-dog circuit is designed on the embedded motherboard, which makes the system reset to normal state automatically after its crash because of interference, program fleet, or getting stuck in an infinite loop, so the system stability is improved greatly. In order to store operating data when power is down suddenly, the power-down protection circuit is designed. The saved data will be helpful to repeat the accident process later, confirm the accident responsibility, and provide the basis for structure optimization of tower crane.Sensor module is confirmed by the main parameters related to tower crane’s security, such as lifting weight, lifting torque, trolley luffing, lifting height, rotary angle and wind speed. Axle pin shear load cell is chosen to acquire lifting weight signals. Potentiometer accompanied with multi-stopper or incremental encoder is chosen to acquire trolley luffing and lifting height signals. 
Potentiometer accompanied with multi-stopper or absolute photoelectric encoder is chosen to acquire rotary angle signals. Photoelectric sensor is chosen to acquire wind speed signals. The output signals of these sensors can be 0~5V or 4~20mA analog signals, or digital signal from RS485 bus. The system can choose corresponding signal processing method according to the type of sensor signal, which increases the flexibility on the selection of sensors, and is helpful for the users to expand monitoring objects. If the acquired signal is analog signal, it will be processed with filtering, isolation, anti-interference processing by signal isolate module, and sent to A/D module for converting into digital signals, then transformed into RS485 signal by the communication protocol conversion device according to Modbus protocol. If the acquired signal is digital signal with RS485 interface, it can be linked to RS485 bus directly. All the acquired signals are sent to embedded motherboard for data processing through RS485 bus.The data acquisition module is linked to the data acquisition control module on embedded motherboard through RS485 interface. Under the control of program, the system inquires the sensors at regular intervals, and acquires the operating data of crane tower. Median filter technology is used to eliminate interferences from singularity signals. After analysis and processing, the data are stored in the database on ARM platform.Switch signal can be output to relay module or PLC from output control module through RS485 bus, then each actuator will be power on or power off according to demand, so the motion of tower crane will be under control.Video module is connected with motherboard through TFT interface. After being processed, real-time operating parameters are displayed on LCD. The working time, work cycle times, alarm, overweight and ultar-torque information will be stored into database automatically. 
For meeting the needs of different users, the video module is compatible with 5.7, 8.4 or 10.4 inches of color display.Touch screen is connected with embedded motherboard by touch screen interface, so human machine interaction is realized. Initialization, data download, alarm information inquire, parameter modification can be finished through touch screen.Speaker is linked with audio interface, thus alarm signals is human voice signal, not harsh buzz.USB interface can be linked to conventional U disk directly. Using U disk, users can upload basic parameters of tower crane, initialize system, download operating data, which provides the basis for the structural optimization and accident analysis. Software design of the monitoring systemAccording to the modular design principle, the system software is divided into grading encryption module, system update module, parameter settings module, calibration module, data acquisition and processor module, lifting parameters monitoring module, alarm query module, work statistics module.Alarm thresholds are guarantee for safety operation of the tower crane. Operating data of tower crane are the basis of service life prediction, structural optimization, accident analysis, accident responsibility confirmation. According to key field, the database is divided into different security levels for security requirements. Key fields are grade encryption with symmetrical encryption algorithm, and data keys are protected with elliptic curve encryption algorithm. The association is realized between the users’ permission and security grade of key fields, which will ensure authorized users with different grades to access the equivalent encrypted key fields. The user who meets the grade can access equivalent encrypted database and encrypted key field in the database, also can access low-grade encrypted key fields. 
This ensures the confidentiality and integrity of key data and makes the system a true black box.

The system is divided into an operating mode and a management mode so that it can toggle conveniently between the two states. The default state is the operating mode: as soon as power is on, the monitoring system is started by the boot program and monitors the operating state of the tower crane, displaying the real-time operating data on the screen. In a dangerous state, a warning is given to the driver through voice and light alarms, and the corresponding control signal is output to the execution unit to cut off the relevant power and keep the tower crane safe.

Clicking the mode-switch button on the initial interface toggles between the management mode and the operating mode. The management mode contains four grade-encrypted functions: system update, alarm query, parameter setting and data query. The driver can only browse relevant information; an ordinary administrator can also download alarm information for further analysis; a senior administrator can modify the alarm thresholds; and the highest administrator can reinitialize the system to adapt it to different types of tower crane. In the key fields of the alarm query, only browsing and downloading are available, and no one can modify the data. The overload fields in the alarm database are encrypted so that only senior administrators can browse them. The sensitive fields are thus protected from tampering to the greatest extent, which provides a reliable basis for structural optimization and accident analysis.

The system can be initialized through the USB interface. Before initialization, the type, structural parameters, alarm thresholds, control thresholds and lifting-torque characteristics of the tower crane are prepared as Excel files, converted to XML files by a purpose-built format-conversion tool, and downloaded to a U disk.
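The grade-based access rule described above can be illustrated with a small sketch. The field names, grade numbers and user roles here are invented for illustration; in the real system each field would additionally be encrypted with a grade-specific symmetric key, which this sketch omits.

```python
# Hypothetical security grades for key fields (higher = more sensitive).
FIELD_GRADE = {"work_time": 1, "alarm_log": 2, "overload": 3}

# Hypothetical user grades, mirroring driver / ordinary / senior / highest admin.
USER_GRADE = {"driver": 1, "ordinary_admin": 2, "senior_admin": 3, "highest_admin": 4}

def can_access(user: str, field: str) -> bool:
    """A user may read fields at or below their own grade,
    matching the rule that higher grades also see lower-grade fields."""
    return USER_GRADE[user] >= FIELD_GRADE[field]

print(can_access("driver", "overload"))        # → False
print(can_access("senior_admin", "overload"))  # → True
```

The monotone rule (grade n sees everything at grade n and below) is what lets a single integer comparison stand in for per-field permission lists.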
The U disk is inserted into the USB interface, and the highest administrator initializes the system by following the on-screen prompts. After initialization, a senior administrator can modify the structural parameters, alarm thresholds and control thresholds through the parameter-settings menu. As long as users can prepare the corresponding Excel forms, system initialization can be completed easily by the steps above and the system put into monitoring service, which is very convenient for the user.

A tower crane is mobile construction machinery. Over time, the sensor signals may drift, so the system must be calibrated regularly to guarantee the monitoring accuracy. Since the tower is a linear elastic structure and the sensors are linear, a linear calibration equation is used:

y = kx + b (1)

where x is the sampled sensor value, y is the actual value, and k and b are calibration coefficients calculated by the two-point method. In running mode, the relationship between x and y is:

y = [(y1 - y0)/(x1 - x0)](x - x0) + y0 (2)

After calibration, temperature drift and creep are eliminated, so the monitoring accuracy is improved greatly.

Lifting torque is the most important parameter in condition monitoring of a tower crane. By comparing the real-time torque M(L) with the rated torque Me(L), the movement of the tower crane is kept in a safe state:

M(L) = Q(L) × L (3)

where Q(L) is the actual lifting weight and L is the trolley luffing (radius).

Me(L) = Qe(L) × L (4)

where Qe(L) is the rated lifting weight. The design values of the rated lifting weight are discrete, while the trolley luffing is continuous, so a rated lifting weight must exist at every position. According to the mechanical characteristics of the tower crane, the rated lifting weight at any point is obtained by cubic-spline interpolation of the rated lifting weights at the design points. When the lifting weight or lifting torque exceeds its rated value, alarm and control signals are sent out.
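The two-point calibration and the torque check of Eqs. (1)-(4) can be sketched as follows. The voltage-to-weight numbers and the design points are invented for illustration, and plain linear interpolation stands in for the cubic spline the text describes, only to keep the sketch short.

```python
def calibrate_two_point(x0, y0, x1, y1):
    """Two-point calibration (Eqs. 1-2): returns k, b with y = k*x + b."""
    k = (y1 - y0) / (x1 - x0)
    b = y0 - k * x0
    return k, b

def rated_weight(L, design_points):
    """Rated lifting weight Qe(L) between discrete design points (L, Qe).
    The paper interpolates with a cubic spline; linear interpolation is
    used here only for brevity."""
    pts = sorted(design_points)
    for (L0, Q0), (L1, Q1) in zip(pts, pts[1:]):
        if L0 <= L <= L1:
            return Q0 + (Q1 - Q0) * (L - L0) / (L1 - L0)
    raise ValueError("radius outside design range")

def torque_ok(Q, L, design_points):
    """Compare real-time torque M(L) = Q*L with rated Me(L) = Qe(L)*L (Eqs. 3-4)."""
    return Q * L <= rated_weight(L, design_points) * L

# Hypothetical calibration: 0.5 V reads as 0 kg, 4.5 V as 2000 kg.
k, b = calibrate_two_point(0.5, 0.0, 4.5, 2000.0)
print(round(k * 2.5 + b))  # → 1000

# Hypothetical design points (radius m, rated weight kg):
points = [(2.0, 2000.0), (10.0, 1000.0), (30.0, 400.0)]
print(torque_ok(Q=1200.0, L=8.0, design_points=points))  # → True
```

Because both sides of the comparison carry the same factor L, checking torque is equivalent to checking Q against Qe(L); keeping the torque form matches the paper's formulation.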
The hoist motor, with high, medium and low speeds, is controlled by the ratio of the lifting weight Q to the maximum lifting weight Qmax, so the hoisting speed is controlled automatically by the lifting weight. The luffing motor, with high and low speeds, is controlled by the ratio of the lifting torque M to the rated lifting torque Me, so the luffing speed is controlled automatically by the lifting torque. The flow chart is shown in figure 2.

Fig. 2 Real-time control of lifting weight and lifting torque

When an accident takes place, the power goes off suddenly, and recording the operating data at the moment of power-off is vital for identifying accident liability: if no measures are taken to save these data, the responsible parties are likely to shirk responsibility. To solve this problem, a power-off protection module is designed. The module automatically saves the operating data of the 120 seconds preceding a sudden power-off. Within these 120 seconds, data are recorded every 0.1 s and stored, queue-fashion, in a 2D array of 6 rows by 1200 columns. The elements of the first row are the current time (year-month-day-hour-minute-second); the elements of the second to sixth rows are, in turn, lifting weight, lifting torque, trolley luffing, lifting height and wind speed. The initial values are zero; each time a new set of data is obtained, the elements in the first column are discarded, the remaining columns move forward, and the new elements fill the last column, so the array always holds the operating data of the most recent 120 seconds. To improve the real-time response and to extend the service life of the nonvolatile memory chip (EEPROM 93C46), the array is normally cached in volatile SDRAM.
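The 120-second queue described above can be sketched with a ring buffer; a `deque` with `maxlen` gives the same "oldest column falls out, newest fills the last slot" behavior without shifting every element. This is an illustrative host-side sketch, not the ARM firmware, and the simulated sample values are invented.

```python
from collections import deque

ROWS = ["time", "weight", "torque", "luffing", "height", "wind"]
COLS = 1200  # 120 s at one sample every 0.1 s

# Ring buffer: always holds the most recent 120 s of samples, zero-initialized.
buffer = deque([[0] * len(ROWS)] * COLS, maxlen=COLS)

def record(sample):
    """Append one 6-field sample; the oldest column is evicted automatically."""
    assert len(sample) == len(ROWS)
    buffer.append(list(sample))

# Simulate 150 s of sampling; on a power-off interrupt the whole buffer
# would be flushed from SDRAM to EEPROM.
for t in range(1500):
    record([t, 100 + t, 0, 0, 0, 0])
print(len(buffer), buffer[-1][0], buffer[0][0])  # → 1200 1499 300
```

Storing columns (one sample per append) rather than rows keeps each append O(1), which is the property the text's "queue method" relies on.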
As soon as a power-off signal is produced, the array is shifted to the EEPROM at once. To achieve this, an external-interrupt thread and a power-off monitoring thread are set up, with the power-off monitoring thread given the highest priority. Both threads are idle during normal operation. When a power-off is detected by the power-off control circuit, the external-interrupt pin produces an interrupt signal; the ARM microprocessor responds to the external interrupt request, wakes the external-interrupt processing thread, and sets a synchronization event to the signaled state. On receiving the synchronization event, the power-off monitoring thread writes the data cached in SDRAM to the EEPROM in time.

Conclusion

The general intelligent embedded monitoring system for tower cranes, applicable to various types of tower crane operating at any lifting rate, uses a U disk carrying the crane's information to complete the system initialization and fits the lifting-torque curve automatically. In a dangerous state, the system gives voice and light alarms, links to the relay or PLC through the RS485 communication interface, and cuts off the power. When power is lost suddenly, the instantaneous operating data are recorded automatically and stored in a black box, which can serve as evidence for identifying accident responsibility. The system has been used successfully to monitor the "JiangLu" series of tower cranes and has achieved good social and economic benefits.

Acknowledgements

The authors wish to thank the China Natural Science Foundation (50975289), the China Postdoctoral Science Foundation (20100471229), the Hunan science & technology plan, and Jianglu Machinery & Electronics Co. Ltd for funding this work.

References

Leonard Bernold. Intelligent Technology for Crane Accident Prevention. Journal of Construction Engineering and Management.
1997, 9: 122-124.
Gu Lichen, Lei Peng, Jia Yongfeng. Tower crane monitor and control based on multi-sensor. Journal of Vibration, Measurement and Diagnosis. 2006, 26(Suppl.): 174-178.
Wang Ming, Zhang Guiqing, Yan Qiao, et al. Development of a novel black box for tower crane based on an ARM-based embedded system. Proceedings of the IEEE International Conference on Automation and Logistics. 2007: 82-87.
Wang Renqun, Yin Chenbo, Zhang Song, et al. Tower Crane Safety Monitoring and Control System Based on CAN Bus. Instrument Techniques and Sensor. 2010(4): 48-51.
Zheng Conghai, Li Yanming, Yang Shanhu, et al. Intelligent Monitoring System for Tower Crane Based on BUS Architecture and Cut IEEE1451 Standard. Computer Measurement & Control. 2010, 18(9): 1992-1995.
Yu Yang, Chen Liang, Zhao Zhenlian. Research and Design of Tower Crane Condition Monitoring and Fault Diagnosis System. 2010 Proceedings of the International Conference on Artificial Intelligence and Computational Intelligence. 2010, 3: 405-408.
Chen Baojiang, Zeng Xiaoyuan. Research on structural frame of the embedded monitoring and control system for tower crane. 2010 International Conference on Mechanic Automation and Control Engineering. 2010: 5374-5377.
APA, Harvard, Vancouver, ISO and other citation styles
31

"Implementation of Parallelized K-means and K-Medoids++ Clustering Algorithms on Hadoop Map Reduce Framework". International Journal of Innovative Technology and Exploring Engineering 9, no. 2S (December 14, 2019): 530–35. http://dx.doi.org/10.35940/ijitee.b1045.1292s19.

The full text of the source
Annotation:
The electronic information from online newspapers, journals, conference proceedings, web pages and e-mails is growing rapidly, generating a huge amount of data. Data clustering has received considerable attention in many applications, but the size of data rises exponentially with the advancement of technology, making the clustering of very large datasets a challenging issue. To deal with this problem, many researchers attempt to design efficient parallel clustering algorithms for frameworks such as Hadoop. In this paper, we present implementations of parallelized K-Means and parallelized K-Medoids algorithms, based on MapReduce, for clustering large files of data objects. The proposed algorithms combine an initialization algorithm with the MapReduce framework to reduce the number of iterations, and they scale well on commodity hardware, providing an efficient process for large datasets. The paper concludes with the implementation results of each algorithm.
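The initialization step the abstract emphasizes can be illustrated with a k-means++-style seeding routine. This is a generic single-machine sketch in one dimension, not the paper's Hadoop implementation; good seeds matter there because every Lloyd iteration saved is a whole MapReduce job saved.

```python
import random

def kmeanspp_init(points, k, rng=random.Random(0)):
    """k-means++ seeding: pick the first center at random, then pick each
    further center with probability proportional to its squared distance
    from the nearest already-chosen center."""
    centers = [rng.choice(points)]
    while len(centers) < k:
        d2 = [min((p - c) ** 2 for c in centers) for p in points]
        r = rng.uniform(0, sum(d2))
        acc = 0.0
        for p, w in zip(points, d2):
            acc += w
            if acc >= r:
                centers.append(p)
                break
    return centers

data = [1.0, 1.1, 0.9, 10.0, 10.2, 9.8]  # two obvious 1-D clusters
print(sorted(kmeanspp_init(data, k=2)))   # one seed from each cluster is overwhelmingly likely
```

The squared-distance weighting is what spreads the seeds across clusters, so plain Lloyd iterations converge in far fewer rounds than with uniform random seeds.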
APA, Harvard, Vancouver, ISO and other citation styles
32

Mudjihartono, Paulus, and Findra Kartika Sari Dewi. "Rekayasa Pembalikan Kode Berorientasi Objek ke Desain Kelas Menggunakan Struktur Data Graf". Jurnal Buana Informatika 1, no. 2 (July 31, 2010). http://dx.doi.org/10.24002/jbi.v1i2.301.

The full text of the source
Annotation:
Abstract. Reverse Engineering of Object-Oriented Code to Class Design Using a Graph Data Structure. Current software engineering methodology still follows the standard system development process, which consists of four phases: system initialization, system analysis, system design and system implementation. The phases occur sequentially, so implementation is always done after the design is complete. This research develops the software Code Converter (COCON) to automate the conversion of object-oriented code to a class design using graph data structures. COCON requires only an object-oriented code file as input and produces as output a file containing the list of classes and the relations between classes read from the input file. COCON helps users discover the class design of object-oriented code and becomes the basis for drawing class diagrams from it. Keywords: reverse engineering, class design, relation, OOP, graph
APA, Harvard, Vancouver, ISO and other citation styles
33

"Software Product Validation of Body Controller ECU's using Intelligent Automation Methods". International Journal of Engineering and Advanced Technology 9, no. 5 (June 30, 2020): 747–51. http://dx.doi.org/10.35940/ijeat.e9831.069520.

The full text of the source
Annotation:
At Continental, the Electronic Control Unit (ECU) software consists of two parts: the basic software, which is generic, and the application software, which implements the product-specific functionality. The project-specific functionality depends on the product type, such as body domain controller, immobilizer, etc. Before the ECU goes to mass production, the controller, i.e., its software and hardware, has to be validated; the main reason is that once the ECU is fitted to a vehicle, a recall of the ECU involves additional cost to the organization. Validation of the software includes checking the microcontroller inputs and outputs, the peripheral drivers, the initialization of the drivers, and the timing of the different components. All these parameters are controlled by the generic and application-specific modules and can be set via a configuration tool, manual configuration, etc. Since the SW modules are complex and huge, validating them manually is a time-consuming process, as it involves checking many C files, configuration files, etc. The proposal is to automate the process using Python programs that interact with the C code, the configuration files, the debugger and the ECU's peripherals and components, and validate the software parameters before series production. In the proposed method, the steps are automated for the different microcontrollers used in the ECUs, Renesas RL78 and RH850, and for different architectures, AUTOSAR and non-AUTOSAR, and the results are compared with manual validation in effort and accuracy.
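One building block of such Python-driven validation, checking configuration values inside generated C files, can be sketched as follows. The header content and macro names here are invented for illustration; the real tooling would also drive the debugger and ECU peripherals, which this sketch does not attempt.

```python
import re

def read_defines(header_text):
    """Collect simple '#define NAME VALUE' pairs from a C header."""
    pattern = re.compile(r"^\s*#define\s+(\w+)\s+(\S+)", re.MULTILINE)
    return dict(pattern.findall(header_text))

# Hypothetical generated configuration header from the build.
header = """
#define ADC_CHANNELS   8u
#define WDG_TIMEOUT_MS 250
"""
cfg = read_defines(header)
assert cfg["WDG_TIMEOUT_MS"] == "250"  # validate against the specified value
print(cfg)  # → {'ADC_CHANNELS': '8u', 'WDG_TIMEOUT_MS': '250'}
```

Automating checks like this per macro, per microcontroller variant, is what turns a manual review of many C and configuration files into a repeatable script.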
APA, Harvard, Vancouver, ISO and other citation styles
34

Ploskas, Nikolaos, Nikolaos V. Sahinidis, and Nikolaos Samaras. "A triangulation and fill-reducing initialization procedure for the simplex algorithm". Mathematical Programming Computation, June 26, 2020. http://dx.doi.org/10.1007/s12532-020-00188-1.

The full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
35

Azhari, Muhamad Taufan. "Skenario Pengembangan Untuk Meningkatkan Recovery Factor Pada Lapangan TR Lapisan X Dengan Menggunakan Simulasi Reservoir". PETRO 4, no. 4 (September 27, 2018). http://dx.doi.org/10.25105/petro.v4i4.1901.

The full text of the source
Annotation:
<p>Reservoir simulation is an area of reservoir engineering in which computer models are used to predict the flow of fluids through porous media. The reservoir simulation process starts with several steps: data preparation, model and grid construction, initialization, history matching and prediction. The initialization process matches the OOIP, the total initial hydrocarbon filling the reservoir, against the hydrocarbon control volume obtained by the volumetric method.</p><p>To obtain the most reliable production forecast, the development scenarios of TR Field Layer X are predicted over 30 years (from 2014 until January 2044). The development scenarios in this study consist of 4 scenarios: Scenario 1 (Base Case), Scenario 2 (Base Case + reopening non-active wells), Scenario 3 (Scenario 2 + infill production wells), and Scenario 4 (Scenario 2 + 5-spot pattern of infill injection wells).</p>
APA, Harvard, Vancouver, ISO and other citation styles
36

Widyastuti, Maria Irmina, and Maman Djumantara. "PENINGKATAN PRODUKSI LAPANGAN “M” DENGAN PENDEKATAN SIMULASI UNTUK MENENTUKAN SKENARIO PENGEMBANGAN MENGGUNAKAN METODE WATERFLOODING". PETRO 5, no. 1 (September 27, 2018). http://dx.doi.org/10.25105/petro.v5i1.1979.

The full text of the source
Annotation:
<p>Reservoir simulation is an area of reservoir engineering in which computer models are used to predict the flow of fluids through porous media. The reservoir simulation process starts with several steps: data preparation, model and grid construction, initialization, history matching and prediction. The initialization process matches the OOIP, the total initial hydrocarbon filling the reservoir, against the hydrocarbon control volume obtained by the volumetric method.</p><p>To obtain the most reliable production forecast, the development plan of this field was predicted over 22 years (until December 2035). The scenarios consist of five variations: the first is the base case; the second is scenario 1 + workover; the third is scenario 1 + infill wells; the fourth is scenario 1 + peripheral injection; and the fifth is scenario 1 + a 5-spot injection pattern. The recovery factors of the planned scenarios vary: 31.05% for the first, 31.53% for the second, 34.12% for the third, 33.75% for the fourth, and 37.04% for the fifth.</p><p>Keywords: reservoir simulation, reservoir simulator, history matching</p>
APA, Harvard, Vancouver, ISO and other citation styles
37

Zhang, Shuowen, and Chenhui Zhang. "A New Deterministic Model for Mixed Lubricated Point Contact With High Accuracy". Journal of Tribology, December 14, 2020, 1–33. http://dx.doi.org/10.1115/1.4049328.

The full text of the source
Annotation:
Abstract Mixed lubrication is a major lubrication regime in the presence of surface roughness. In this paper, a deterministic model is established to solve mixed lubricated point contact, using a new method to solve the asperity-contact region in mixed lubrication. The treatment of the pressure boundary condition between the elastohydrodynamic lubrication region and the asperity-contact region is discussed. The new model is capable of calculating a typical Stribeck curve and analyzing the transition of lubrication regimes, from full-film lubrication to boundary lubrication. Moreover, the final result of the model is independent of the pressure initialization. High accuracy and convergence performance have been achieved, which is of great importance for further lubrication modelling that considers nano-scale roughness, intermolecular forces and surface forces.
APA, Harvard, Vancouver, ISO and other citation styles
38

Kodavasal, Janardhan, Kevin Harms, Priyesh Srivastava, Sibendu Som, Shaoping Quan, Keith Richards and Marta García. "Development of a Stiffness-Based Chemistry Load Balancing Scheme, and Optimization of Input/Output and Communication, to Enable Massively Parallel High-Fidelity Internal Combustion Engine Simulations". Journal of Energy Resources Technology 138, no. 5 (February 23, 2016). http://dx.doi.org/10.1115/1.4032623.

The full text of the source
Annotation:
A closed-cycle gasoline compression ignition (GCI) engine simulation near top dead center (TDC) was used to profile the performance of a parallel commercial engine computational fluid dynamics (CFD) code as it was scaled on up to 4096 cores of an IBM Blue Gene/Q (BG/Q) supercomputer. The test case has 9 × 10^6 cells near TDC, with a fixed mesh size of 0.15 mm, and was run on configurations ranging from 128 to 4096 cores. Profiling was done for a small duration of 0.11 crank-angle degrees near TDC during ignition. Optimization of input/output (I/O) performance resulted in a significant speedup in reading restart files, and in an over 100-times speedup in writing restart files and files for postprocessing. Improvements to communication resulted in a 1400-times speedup in the mesh load-balancing operation during initialization on 4096 cores. An improved, "stiffness-based" algorithm for load balancing the chemical kinetics calculations was developed, which results in an over three-times faster runtime near ignition on 4096 cores relative to the original load-balancing scheme. With this improvement to load balancing, the code achieves over 78% scaling efficiency on 2048 cores, and over 65% scaling efficiency on 4096 cores, relative to 256 cores.
APA, Harvard, Vancouver, ISO and other citation styles
39

Azhari, Muhamad Taufan, and Maman Djumantara. "Skenario Pengembangan Untuk Meningkatkan Recovery Factor Pada Lapangan TR Lapisan X Dengan Menggunakan Simulasi Reservoir". PETRO 5, no. 1 (September 27, 2018). http://dx.doi.org/10.25105/petro.v5i1.1976.

The full text of the source
Annotation:
<div class="WordSection1"><p><strong>ABSTRACT</strong></p><p>Reservoir simulation is an area of reservoir engineering in which computer models are used to predict the flow of fluids through porous media. The reservoir simulation process starts with several steps: data preparation, model and grid construction, initialization, history matching and prediction. The initialization process matches the OOIP, the total initial hydrocarbon filling the reservoir, against the hydrocarbon control volume obtained by the volumetric method.</p><p>To obtain the most reliable production forecast, the development scenarios of TR Field Layer X are predicted over 30 years (from 2014 until January 2044). The development scenarios in this study consist of 4 scenarios: Scenario 1 (Base Case), Scenario 2 (Base Case + reopening non-active wells), Scenario 3 (Scenario 2 + infill production wells), and Scenario 4 (Scenario 2 + 5-spot pattern of infill injection wells).</p><p>Keywords: reservoir simulation, reservoir simulator, history matching</p></div>
APA, Harvard, Vancouver, ISO and other citation styles
