Academic literature on the topic "Time Series Processor (Computer program)"

Create an accurate citation in APA, MLA, Chicago, Harvard, and other styles


Browse the topical lists of journal articles, books, theses, conference proceedings, and other scholarly sources on the topic "Time Series Processor (Computer program)".

Next to every source in the list of references there is an "Add to bibliography" button. Click this button, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the scholarly publication in PDF format and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Time Series Processor (Computer program)"

1

Kraemer, R., and B. Meister. "Fast real-time moment-ratio analysis of multibreath nitrogen washout in children". Journal of Applied Physiology 59, no. 4 (October 1, 1985): 1137–44. http://dx.doi.org/10.1152/jappl.1985.59.4.1137.

Full text
Abstract
To apply real-time moment-ratio analysis to multibreath N2-washout curves (MBNW) from children, a new processor-controlled device was constructed. Flow and fractional N2 concentration (FN2) were each sampled at 200 Hz. An electromagnetic triple-valve system, with an instrumental dead space of 36 ml and a valve resistance of 0.3 cmH2O·l⁻¹·s, was connected in series with a pneumotachograph and an N2 analyzer (Ohio 720) placed next to the mouthpiece. A FORTRAN/MACRO program on a PDP 11/23 computer enabled measurement of inspiratory and expiratory flow and FN2 sampling by a 12-bit analog-to-digital converter. The fast real-time digital processing of the N2 and flow signals incorporated filtering, delay compensation, and corrections for the effects of changes in gas composition and temperature. MBNW dynamics of the lungs were studied in 17 healthy and 28 asthmatic children and in 16 patients with cystic fibrosis, evaluating the moment ratios of the washout curves as indices of the ventilation characteristics. Intrasubject variability of the moment ratios (m1/m0, m2/m0) and of the determination of functional residual capacity (FRC) varied between 6.3 and 14.7% (depending on which parameter is considered) and was comparatively lower than for other indices previously investigated in adults. In addition, the sensitivity of the moment ratios for discriminating different stages of ventilation inhomogeneity was superior to other indices. m2/m0 is closely related to the simultaneously measured airway resistance, and the ratio between cumulative expired volume and FRC is correlated with the ratio between residual volume and total lung capacity.
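The moment ratios named above are straightforward to compute once the washout is expressed as N2 concentration versus lung turnover. The Python sketch below illustrates the generic calculation under stated assumptions (a discrete sum over breaths and a toy exponential washout); it is not the exact normalization used by Kraemer and Meister, which the abstract does not spell out.

```python
import numpy as np

def moment_ratios(turnover, cn2):
    """Moment ratios of a multibreath N2-washout curve.

    turnover: cumulative expired volume divided by FRC at each breath.
    cn2: end-tidal N2 concentration at each breath.
    The k-th moment is m_k = sum(turnover**k * cn2); the indices used
    in the paper are the ratios m1/m0 and m2/m0.
    """
    m = [np.sum(turnover**k * cn2) for k in range(3)]
    return m[1] / m[0], m[2] / m[0]

# Toy example: an exponentially decaying washout starting near 80% N2.
to = np.linspace(0.1, 8.0, 60)
c = 0.80 * np.exp(-to)
print(moment_ratios(to, c))   # -> (m1/m0, m2/m0)
```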
2

Engroff, Alian, Marcelo Romanssini, Lucas Compassi-Severo, Paulo C. C. de Aguirre, and Alessandro Girardi. "ASIPAMPIUM: An Efficient ASIP Generator for Low Power Applications". Electronics 12, no. 2 (January 12, 2023): 401. http://dx.doi.org/10.3390/electronics12020401.

Full text
Abstract
The adoption of customized ASIPs (Application-Specific Instruction Set Processors) in embedded circuits is an important alternative for optimizing power consumption, silicon area, or processing performance according to the design requirements. The processor is implemented specifically for the target application, which allows hardware customization in terms of instruction set architecture, data word length, memory size, and parallelism. This work describes an EDA tool for the semi-automatic development of ASIPs named ASIPAMPIUM. The strategy is to provide a set of integrated tools to interpret and generate customized hardware for a given target application, including compilation, simulation, and hardware synthesis. From the C code description of the application, the tool returns a synthesizable hardware description of the processor. The proposed methodology is based on the adaptation of a new customizable microprocessor called PAMPIUM, which can be optimized in terms of silicon area, power consumption, or processing performance according to the target application. The ASIPAMPIUM tool provides a series of simulated data to the designer in order to identify optimization strategies in both software and hardware domains. We show the results for the implementation of an FFT algorithm using the proposed methodology, which achieved the best results in terms of silicon area and energy consumption compared to other works described in the literature for both FPGA and silicon implementation. Moreover, measurement results of the silicon implementation of a dedicated ASIP for interfacing with six sensors in real time, including three I2C, one SPI, and one RS-232 interface, demonstrate the complete design flow, from the C code program to physical implementation and characterization. Aside from providing a short design time, the ASIPAMPIUM tool also affords a simple and intuitive design flow, allowing the designer to deal with different design trade-offs and objectives.
3

Makhanov, K. M., and L. V. Chirkova. "Interdisciplinary communication of physics, schemes and programming". Bulletin of the Karaganda University. "Physics" Series 98, no. 2 (June 30, 2020): 143–49. http://dx.doi.org/10.31489/2020ph2/143-149.

Full text
Abstract
The article presents the results of the development of an electronic device based on digital components, carried out jointly with schoolchildren and students. The aim of this work was the design and development of an electronic device based on 32-bit microcontrollers of the STM32 series. For the design and development of the electronic part of the device, the Altium Designer integrated development environment was used. Using the high-level C++ language, a control program was developed in the Keil v.5 debugging environment. A microcontroller of the STM32F030K6T6 series was used as the central control processor. The electrical circuit and the circuit board of the device were developed. A highly sensitive gas sensor, the TGS 2610 manufactured by Figaro (Japan), was used as the sensing element. For SMS transmission, the SIM800A module was used. In the course of the work, students independently made the circuit boards using the method of exposing a special photoresist with a UV lamp. Practically all the components were placed and soldered by the students on their own. Together with the students, a control program for the microcontroller was developed. To manufacture the case of the device, the «KOMPAS» environment was used. The case was printed on a 3D printer. The device was calibrated and preliminary tests were carried out. According to the results of these preliminary tests, it was found that the gas sensor responds to the presence of gas within the first 20 seconds from the time the leak starts. No false positives of the device were recorded. The importance of the foundations of physics, computer science, and mathematics in the process of designing digital devices and instruments is shown.
4

Ye, Qian, and Minyan Lu. "SPOT: Testing Stream Processing Programs with Symbolic Execution and Stream Synthesizing". Applied Sciences 11, no. 17 (August 30, 2021): 8057. http://dx.doi.org/10.3390/app11178057.

Full text
Abstract
Adoption of distributed stream processing (DSP) systems such as Apache Flink in real-time big data processing is increasing. However, DSP programs are prone to be buggy, especially when a programmer neglects some DSP features (e.g., source data reordering), which motivates the development of approaches for testing and verification. In this paper, we focus on the test data generation problem for DSP programs. Currently, there is a lack of an approach that generates test data for DSP programs with both high path coverage and coverage of different stream reordering situations. We present a novel solution, SPOT (i.e., Stream Processing Program Test), to achieve these two goals simultaneously. At first, SPOT generates a set of individual test data representing each path of one DSP program through symbolic execution. Then, SPOT composes these independent data into various time series data (a.k.a. streams) in diverse reorderings. Finally, we can perform a test by feeding the DSP program with these streams continuously. To automatically support symbolic analysis, we also developed JPF-Flink, a JPF (i.e., Java Pathfinder) extension to coordinate the execution of Flink programs. We present four case studies to illustrate that: (1) SPOT can support symbolic analysis for the commonly used DSP operators; (2) test data generated by SPOT can achieve high JDU (i.e., Joint Dataflow and UDF) path coverage more efficiently than two recent DSP testing approaches; (3) test data generated by SPOT can trigger software failures more easily than those two DSP testing approaches; and (4) the data randomly generated by those two test techniques are highly skewed in terms of stream reordering, as measured by the entropy metric, whereas test data from SPOT are evenly distributed.
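The entropy metric mentioned in point (4) is not defined in the abstract. One plausible reading, sketched below in Python, is the Shannon entropy of the empirical distribution of distinct stream orderings produced by a generator: a skewed generator concentrates probability mass on a few orderings (low entropy), while an even one approaches log2 of the number of orderings. The stream-signature representation is an assumption for illustration.

```python
import math
from collections import Counter

def reordering_entropy(streams):
    """Shannon entropy of the distribution of stream orderings.

    streams: generated test streams, each represented by a hashable
    signature of its event order (here, a tuple of event ids).
    """
    counts = Counter(tuple(s) for s in streams)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total)
                for c in counts.values())

# Toy check: 4 distinct orderings, perfectly even -> 2.0 bits.
even = [(1, 2, 3), (1, 3, 2), (2, 1, 3), (3, 2, 1)]
print(reordering_entropy(even))
```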
5

Patalakh, D., A. Prykhodko, K. Lut, and S. O. Tykhovod. "Improvement of modeling techniques of transients in transformers based on magnetoelectric equivalent schemes". Naukovyi Visnyk Natsionalnoho Hirnychoho Universytetu, no. 6 (2021): 107–12. http://dx.doi.org/10.33271/nvngu/2021-6/107.

Full text
Abstract
Purpose. Use of an improved numerical method of calculating transient processes in electrical circuits for modeling electromagnetic processes in nonlinear magneto-electric circuits, and also development of a circuit model based on this method, which leads to convenience of calculation. Methodology. Approximation of functions by Chebyshev polynomials, numerical methods of integrating differential equations, matrix methods, spline interpolation, programming, theory of electric and magnetic circuits. Findings. On the basis of the well-known method of transient process analysis in linear electric circuits, a method of numerical calculation of transient processes in nonlinear magneto-electric equivalent circuits of a transformer has been developed. With the help of the proposed method it is possible to reduce the processing time for modeling electromagnetic processes in transformers. An example of using the developed method is shown. A computer program for modeling electromagnetic transients in a single-phase transformer based on the described method has been developed. This example shows a reduction of processor time by more than four times compared to calculations based on other known methods. Originality. A method in which the solution of the state differential equations is presented in the form of a decomposition into a series of orthogonal Chebyshev polynomials is used in this work. The polynomial approximation applied in this work corresponds not to the solution function itself but to its derivative, which significantly reduces the error of integration of the differential equations. The differential equations of state are transformed into linear algebraic equations for special images of the solution functions. A principle of constructing magneto-electric substitution circuits in which the images of the solution functions appear is developed. Images of the true dynamic currents and magnetic fluxes in the proposed equivalent scheme are interpreted as direct currents and direct magnetic fluxes. The method has shown advantages in accuracy and simulation time of electromagnetic transients over other known methods based on the application of magneto-electric substitution circuits. Practical value. The developed method opens up the possibility of using the apparatus of the theory of electric and magnetic circuits to work with images of currents and magnetic fluxes. Based on this, a universal software complex is being developed to calculate transients in transformers of various constructions.
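The central trick, expanding the derivative of the solution in Chebyshev polynomials and integrating the series term by term, can be demonstrated with NumPy's Chebyshev utilities. The sketch below is a toy check on a known function, not the authors' magneto-electric transformer model; the node count and fitting degree are arbitrary choices.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Approximate the *derivative* x'(t) by a Chebyshev series on [-1, 1],
# then integrate the series to recover x(t); per the abstract this keeps
# the integration error lower than expanding x(t) itself.
t = np.cos(np.pi * np.arange(33) / 32)       # Chebyshev-Gauss-Lobatto nodes
dx = np.cos(3 * t)                           # sample derivative: x'(t) = cos(3t)
coef_dx = C.chebfit(t, dx, deg=16)           # series for the derivative
coef_x = C.chebint(coef_dx, lbnd=-1, k=0)    # term-by-term integration, x(-1) = 0
exact = (np.sin(3 * t) - np.sin(-3)) / 3.0   # closed-form antiderivative
print(np.max(np.abs(C.chebval(t, coef_x) - exact)))   # ~1e-12: near machine precision
```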
6

Ma, Haoran, Hang Sun, and Changsheng Li. "Penetration Overload Prediction Method Based on a Deep Neural Network with Multiple Inputs". Applied Sciences 13, no. 4 (February 11, 2023): 2351. http://dx.doi.org/10.3390/app13042351.

Full text
Abstract
In the process of high-speed penetration, penetrating ammunition is prone to problems such as penetration overload signal vibration and mixing and projectile attitude deflection. It is easy to misjudge if a fuze relies only on overload data from ground tests or the preset program, and the actual penetration overload measured under real launch conditions cannot be taken as the dynamic basis for judgment. Therefore, a real-time penetration overload prediction method based on a deep neural network is proposed, which can predict overload values according to the projectile parameter settings, the real-time collection of overload information, and the calculation speed, and assist the fuze in judging the target layer and projectile attitude. In this paper, we adopt a deep learning model with multiple time series inputs and modify the input coding mode so that the model can output a 48 µs overload curve within 20 µs, meeting the real-time signal processing requirements of the high-speed missile penetration process. The mean squared error between the predicted curve and the actual curve is 0.221 for the prediction of multilayer penetrating targets and 0.452 for the prediction of thick penetrating targets. A penetration overload prediction function can thus be realized.
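The abstract specifies a multi-input model but not its architecture. The PyTorch sketch below only illustrates the general multi-input pattern, one encoder per time series feeding a shared head that emits the predicted overload curve; the class name, the GRU encoders, the layer sizes, and the 48-sample output are hypothetical stand-ins, not the authors' network.

```python
import torch
import torch.nn as nn

class MultiInputOverloadNet(nn.Module):
    """Hypothetical sketch of a model with multiple time-series inputs:
    each series gets its own encoder, the codes are concatenated, and a
    linear head emits the predicted overload curve."""
    def __init__(self, n_inputs=3, steps_out=48, hidden=64):
        super().__init__()
        self.encoders = nn.ModuleList(
            nn.GRU(1, hidden, batch_first=True) for _ in range(n_inputs))
        self.head = nn.Linear(n_inputs * hidden, steps_out)

    def forward(self, series):   # series: list of (batch, steps_in, 1) tensors
        codes = [enc(x)[1].squeeze(0) for enc, x in zip(self.encoders, series)]
        return self.head(torch.cat(codes, dim=-1))

x = [torch.randn(8, 64, 1) for _ in range(3)]    # 3 input series, batch of 8
print(MultiInputOverloadNet()(x).shape)          # torch.Size([8, 48])
```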
7

Brincat, Arthur, Angéline Antezack, Camille Sadowski, Mathias Faure-Brac, Romain Ohanessian, and Virginie Monnet-Corti. "Absence of Progressive Bone Loss Following Peri-Implantitis Surgical Therapy with Implantoplasty: A Case Series". Applied Sciences 13, no. 12 (June 16, 2023): 7224. http://dx.doi.org/10.3390/app13127224.

Full text
Abstract
Background: Peri-implantitis, a bacteria-associated inflammatory disease, is characterized by inflammation of the peri-implant mucosa and progressive loss of the supporting bone, thereby reducing the chances of dental implant survival. The absence of progressive marginal bone loss is crucial for implant success. The aim of this study is to assess the peri-implantitis resolution by measuring the absence of progressive bone loss rate around the implant over a period of one year to more than three years after surgical reconstructive (REC) treatment, apically repositioned flap (ARP) surgery, or combined (COM) treatment of peri-implantitis with implantoplasty. Methods: Peri-implantitis patients, that underwent surgical therapy with implantoplasty and that enrolled in a regular peri-implant supportive care program with a follow up of ≥12 months, were recruited in this study. ARP, REC, or COM surgical therapy was performed depending on the anatomy of the bone defect. For REC and COM groups, intraosseous defects were filled with a bone substitute. The ARP group consisted of an apically positioned flap without osseous surgery. Absence of progressive marginal bone loss was evaluated on radiographs of the treated implants. Results: A total of 57 patients (91 implants) were included. The study occurred over a follow-up period of 12 to 42 months (mean = 24 months). The surgical treatment with implantoplasty yielded an absence of progressive bone loss rate of 96.7% at implant level (100% REC, 98% COM, 92.9% ARP) and 96.5% at patient level. Three implants had to be removed in two patients due to relapse or progression of peri-implantitis. Conclusions: This case series demonstrated that implantoplasty during surgical treatment of peri-implantitis lesions resulted in favorable biological conditions to maintain functional implants with 96.7% of implants that did not show bone loss over time from one year to more than three years.
8

Lauritano, Dorina, Giulia Moreo, Annalisa Palmieri, Fedora Della Vella, Massimo Petruzzi, Daniele Botticelli, and Francesco Carinci. "Photodynamic Therapy Using 5-Aminolevulinic Acid (Ala) for the Treatment of Chronic Periodontitis: A Prospective Case Series". Applied Sciences 12, no. 6 (March 18, 2022): 3102. http://dx.doi.org/10.3390/app12063102.

Full text
Abstract
Aim: The objective of this study was to compare the efficacy of supportive periodontal therapy (i.e., scaling and root planing, SRP) alone versus the ALADENT medical device used in association with SRP in the treatment of chronic periodontitis in adult patients. Materials and Methods: A total of 20 patients with a diagnosis of chronic periodontitis (40 localized chronic periodontitis sites) aged between 35 and 55 were selected. None of these patients had previously received any surgical or non-surgical periodontal therapy, and they presented radiographic evidence of moderate bone loss. Two non-adjacent sites in different quadrants were identified and observed in each patient, analyzing treatment effectiveness (split-mouth design). Clinical pocket depth, clinical attachment loss, and bleeding on probing were evaluated at time 0 and after 6 months, while microbial analysis (MA) was conducted at baseline and after 15 days. Significant differences were calculated using the SPSS program and a paired-samples t-test. Results: Total bacterial loading showed a statistically significant reduction before and after treatment with SRP (left site) (total average decrease of 27%). The sites treated with SRP plus ALADENT (right) showed a significantly reduced total bacterial loading compared to the untreated sites (right) (total average decrease of 75%). Mean values of CAL/PD and percentage data of BOP, recorded after SRP + ALADENT therapy, showed a higher reduction (CAL = 2.42 mm, PD = 2.87 mm, 90% of sites with no bleeding) than those obtained after SRP treatment (CAL = 4.08 mm, PD = 4.73 mm, 70% of sites with no bleeding). Conclusion: The treatment of moderate and severe chronic periodontitis should include, besides SRP, the use of the ALADENT medical device, which has been proved to be a useful adjuvant therapy.
9

Makahleh, Firas M., Ali A. Badran, Hani Attar, Ayman Amer, and Ayman A. Al-Maaitah. "Modeling and Simulation of a Two-Stage Air Cooled Adsorption Chiller with Heat Recovery Part I: Physical and Mathematical Performance Model". Applied Sciences 12, no. 13 (June 28, 2022): 6542. http://dx.doi.org/10.3390/app12136542.

Full text
Abstract
In the proposed work, the MATLAB program was used to model and simulate the performance of the investigated two-stage adsorption chiller with and without heat recovery using an activated carbon/methanol pair. The simulated model results were then validated against the experimental results conducted by Millennium Industries. The model was based on a system of ten differential equations; six of them were used to predict the bed, evaporator, and condenser temperatures, while the other four equations were used to calculate the adsorption isotherm and adsorption kinetics. The detailed validation is given in the following paragraphs; for example, the simulation model results for the two-stage air-cooled chiller compare well with the experimental data in terms of cooling capacity (6.7 kW for the model compared with 6.14 kW from the experimental results at the same conditions). The Coefficient of Performance (COP) predicted by this simulation was 0.4, which is very close to that given by the Carnot cycle working at the same operating conditions. The model optimized the switching time, adsorption/desorption time, and heat recovery time to maximize both cooling capacity and COP: the optimized adsorption/desorption cycle time was 300 to 400 s, the switching cycle time 50 s, and the heat recovery cycle time 30 s. The temporal history of bed, evaporator, and condenser temperatures is provided by this model for both the heat recovery and the no-heat-recovery chiller operation modes. The importance of this study is that it will be used as a basis for future series production.
10

Acosta, Ruth, Klaus Heckmann, Jürgen Sievers, Tim Schopf, Tobias Bill, Peter Starke, Kai Donnerbauer, Lukas Lücker, Frank Walther, and Christian Boller. "Microstructure-Based Lifetime Assessment of Austenitic Steel AISI 347 in View of Fatigue, Environmental Conditions and NDT". Applied Sciences 11, no. 23 (November 25, 2021): 11214. http://dx.doi.org/10.3390/app112311214.

Full text
Abstract
The assessment of metallic materials used in power plant piping represents a big challenge due to the thermal transients and the environmental conditions to which they are exposed. At present, a lack of information related to degradation mechanisms in structures and materials is covered by safety factors in design, and in some cases the replacement of components is prescribed after a determined period of time without knowledge of the true degree of degradation. In the collaborative project "Microstructure-based assessment of maximum service life of nuclear materials and components exposed to corrosion and fatigue (MibaLeb)", a methodology for the assessment of materials' degradation is being developed, which combines the use of NDT techniques for materials characterization, an optimized fatigue lifetime analysis using short-time evaluation procedures (STEPs), and numerical simulations. In this investigation, AISI 347 (X6CrNiNb18-10) is being analyzed under different conditions in order to validate the methodology. Besides microstructural analysis and tensile and fatigue tests, all to characterize the material, a pressurized hot-water pipe exposed to a series of flow conditions will be evaluated in terms of full-scale testing as well as prognostic evaluation, where the latter will be based on the materials data generated, which should predict changes in the material's condition, specifically in a pre-cracked stage. This paper provides an overview of the program, while the more materials-related aspects are presented in the subsequent paper.

Theses on the topic "Time Series Processor (Computer program)"

1

Lewis, Arthur M. "Seasonal Hidden Markov Models for Stochastic Time Series with Periodically Varying Characteristics". PDXScholar, 1995. https://pdxscholar.library.pdx.edu/open_access_etds/5056.

Full text
Abstract
Novel seasonal hidden Markov models (SHMMs) for stochastic time series with periodically varying characteristics are developed. Nonlinear interactions among SHMM parameters prevent the use of the forward-backward algorithms which are usually used to fit hidden Markov models to a data sequence. Instead, Powell's direction set method for optimizing a function is repeatedly applied to adjust SHMM parameters to fit a data sequence. SHMMs are applied to a set of meteorological data consisting of 9 years of daily rain gauge readings from four sites. The fitted models capture both the annual patterns and the short term persistence of rainfall patterns across the four sites.
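Because the seasonal parameterization rules out the usual forward-backward (Baum-Welch) fitting, the likelihood is optimized directly with Powell's method. The Python sketch below shows that overall shape for a two-state, binary-emission toy model: a scaled forward recursion evaluates the log-likelihood, and SciPy's Powell optimizer adjusts the parameters. The sinusoidal transition parameterization is a simplified stand-in, not the thesis's exact SHMM.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_lik(theta, obs, period=365):
    """Negative log-likelihood of a 2-state seasonal HMM for binary data
    (e.g., rain / no rain), via the scaled forward recursion. theta packs
    two emission logits and four sinusoidal transition parameters, so the
    staying probabilities vary with the season (an illustrative choice)."""
    p_emit = 1 / (1 + np.exp(-theta[:2]))            # P(rain | state)
    a, b, c, d = theta[2:]
    alpha = np.array([0.5, 0.5])                     # uniform initial state
    ll = 0.0
    for t, y in enumerate(obs):
        s = np.sin(2 * np.pi * t / period)
        stay = 1 / (1 + np.exp(-np.array([a + b * s, c + d * s])))
        T = np.array([[stay[0], 1 - stay[0]], [1 - stay[1], stay[1]]])
        e = p_emit if y else 1 - p_emit              # emission likelihoods
        alpha = (alpha @ T) * e
        ll += np.log(alpha.sum())
        alpha /= alpha.sum()                         # rescale to avoid underflow
    return -ll

obs = np.random.default_rng(0).integers(0, 2, size=730)   # two toy "years"
fit = minimize(neg_log_lik, x0=np.zeros(6), args=(obs,), method="Powell")
print(fit.x)   # fitted emission and seasonal-transition parameters
```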
2

Venugopal, Niveditha. "Annotation-Enabled Interpretation and Analysis of Time-Series Data". PDXScholar, 2018. https://pdxscholar.library.pdx.edu/open_access_etds/4708.

Full text
Abstract
As we continue to produce large amounts of time-series data, the need for data analysis is growing rapidly to help gain insights from this data. These insights form the foundation of data-driven decisions in various aspects of life. Data annotations are information about the data such as comments, errors and provenance, which provide context to the underlying data and aid in meaningful data analysis in domains such as scientific research, genomics and ECG analysis. Storing such annotations in the database along with the data makes them available to help with analysis of the data. In this thesis, I propose a user-friendly technique for Annotation-Enabled Analysis through which a user can employ annotations to help query and analyze data without having prior knowledge of the details of the database schema or any kind of database programming language. The proposed technique receives the request for analysis as a high-level specification, hiding the details of the schema, joins, etc., and parses it, validates the input and converts it into SQL. This SQL query can then be executed in a relational database and the result of the query returned to the user. I evaluate this technique by providing real-world data from a building-data platform containing data about Portland State University buildings such as room temperature, air volume and CO2 level. This data is annotated with information such as class schedules, power outages and control modes (for example, day or night mode). I test my technique with three increasingly sophisticated levels of use cases drawn from this building science domain. (1) Retrieve data with include or exclude annotation selection (2) Correlate data with include or exclude annotation selection (3) Align data based on include annotation selection to support aggregation over multiple periods. I evaluate the technique by performing two kinds of tests: (1) To validate correctness, I generate synthetic datasets for which I know the expected result of these annotation-enabled analyses and compare the expected results with the results generated from my technique (2) I evaluate the performance of the queries generated by this service with respect to execution time in the database by comparing them with alternative SQL translations that I developed.
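The heart of the technique is translating a schema-free, high-level request into SQL. The Python sketch below shows roughly what such a translation could produce for the include/exclude use case; every table and column name (sensor_data, annotations, ts, start_ts, end_ts) is hypothetical, since the thesis deliberately hides the real schema behind the specification.

```python
def annotation_query(measurement, annotation, mode="exclude"):
    """Toy translation of an annotation-enabled request into SQL.

    Retrieves a measurement while including or excluding readings whose
    timestamps fall inside intervals carrying a given annotation. A real
    implementation would use parameterized queries, not f-strings.
    """
    op = "NOT EXISTS" if mode == "exclude" else "EXISTS"
    return f"""
    SELECT d.ts, d.value
    FROM sensor_data d
    WHERE d.metric = '{measurement}'
      AND {op} (SELECT 1 FROM annotations a
                WHERE a.label = '{annotation}'
                  AND d.ts BETWEEN a.start_ts AND a.end_ts);
    """

# e.g., room temperature with power-outage periods filtered out
print(annotation_query("room_temperature", "power_outage", mode="exclude"))
```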
3

Guthrey, Delparde Raleigh. "Time series analysis of ozone data". CSUSB ScholarWorks, 1998. https://scholarworks.lib.csusb.edu/etd-project/1788.

Full text
4

Quinlan, Patrick John Adrian. "Time series modeling of hybrid wind photovoltaic diesel power systems". 1996. http://catalog.hathitrust.org/api/volumes/oclc/36866574.html.

Full text
Abstract
Thesis (M.S.)--University of Wisconsin--Madison, 1996.
Typescript. Includes bibliographical references (leaves 143–162).

Books on the topic "Time Series Processor (Computer program)"

1

Hall, Bronwyn H. Time Series Processor: Version 4.5. Palo Alto, CA: TSP International, 1999.

Search for full text
2

Hall, Bronwyn H. Time Series Processor, version 4.2. Palo Alto, CA: TSP International, 1991.

Search for full text
3

Instituto Nacional de Estadística, Geografía e Informática (Mexico), ed. Manual de análisis de paquete computacional TSP (Time Series Processor). México: Instituto Nacional de Estadística, Geografía e Informática, 1985.

Search for full text
4

Hall, Bronwyn H. Time Series Processor: Version 4.1: Reference manual. Palo Alto, CA: TSP International, 1988.

Search for full text
5

Hall, Bronwyn H. Time Series Processor, version 4.2: Reference manual. Palo Alto, CA: TSP International, 1991.

Search for full text
6

Hall, Bronwyn H. Time Series Processor, version 4.3: Reference manual. Palo Alto, CA: TSP International, 1995.

Search for full text
7

Hall, Bronwyn H. Time Series Processor, version 4.2: User's manual, including an introductory guide. Palo Alto, CA: TSP International, 1991.

Search for full text
8

Hall, Bronwyn H. Time Series Processor: Version 4.1: User's manual, including an introductory guide. Palo Alto, CA: TSP International, 1988.

Search for full text
9

Hall, Bronwyn H. Time Series Processor, version 4.3: User's guide, including an introductory guide. Palo Alto, CA: TSP International, 1995.

Search for full text
10

Pfaff, Bernhard. Analysis of integrated and cointegrated time series with R. 2nd ed. New York: Springer, 2008.

Search for full text

Book chapters on the topic "Time Series Processor (Computer program)"

1

Catthoor, Francky, Lars Svensson, and Klaus Wölcken. "Application-Driven Synthesis Methodologies for Real-Time Processor Architectures". In The Kluwer International Series in Engineering and Computer Science, 1–22. Boston, MA: Springer US, 1993. http://dx.doi.org/10.1007/978-1-4615-3242-2_1.

Full text
2

Chen, Hao-Yun. "Copyright Protection for Software 2.0?" In Artificial Intelligence and Intellectual Property, 323–40. Oxford University Press, 2021. http://dx.doi.org/10.1093/oso/9780198870944.003.0015.

Full text
Abstract
Traditionally, software programmers write a series of hard-coded rules to instruct a machine, step by step. However, with the ubiquity of neural networks, instead of giving specific instructions, programmers can write a skeleton of code to build a neural network structure, and then feed the machine with data sets, in order to have the machine write code by itself. Software containing the code written in this manner changes and evolves over time as new data sets are input and processed. This characteristic distinguishes it markedly from traditional software, and is partly the reason why it is referred to as ‘software 2.0’. Yet the vagueness of the scope of such software might make it ineligible for protection by copyright law. To properly understand and address this issue, this chapter will first review the current scope of computer program protection under copyright laws, and point out the potential inherent issues arising from the application of copyright law to software 2.0. After identifying related copyright law issues, this chapter will then examine the possible justification for protecting computer programs in the context of software 2.0, aiming to explore whether new exclusivity should be granted or not under copyright law, and if not, what alternatives are available to provide protection for the investment in the creation and maintenance of software 2.0.
3

Gratzer, Walter. "Butterfly in Beijing". In Eurekas and Euphorias, 158–60. New York, NY: Oxford University Press, 2002. http://dx.doi.org/10.1093/oso/9780192804037.003.0098.

Full text
Abstract
Abstract The phenomenon of chaos—the emergence of patterns out of randomness— has in recent years touched almost every area of science. Irregularities in physical and biological (indeed, even in economic) processes were always regarded as defying theoretical analysis and were accordingly shunned by theoreticians. Turbulence in fluid flow was one practical problem, which had troubled both engineers and physiologists, and physicists had long been irked by the seemingly random transitions between steady and sporadic flow of water from a dribbling tap. The ideas behind chaos theory had been hazily prefigured in earlier years, but the beginnings of the subject can properly be dated to 1961, and the place, MIT, the Massachusetts Institute of Technology. Edward Lorenz was a meteorologist, trained as a mathematician; his interest was long-range weather forecasting, and he had recognized early on that any system of equations aimed at simulating the change with time of a weather pattern would become tractable only with the advent of the high-speed computer. Lorenz had acquired one of the first commercially available machines and he had written a crude program for the change in a weather pattern, based on 12 equations. His computer disgorged an endless series of successive weather maps.
4

Dasgupta, Subrata. "The Best Way to Design . . .". In It Began with Babbage. Oxford University Press, 2014. http://dx.doi.org/10.1093/oso/9780199309412.003.0016.

Full text
Abstract
In February 1951, the Ferranti Mark I was delivered to the University of Manchester. This was the commercial “edition” of the Manchester Mark I (see Chapter 8, Section XIII), the product of a collaboration between town and gown, the former being the Manchester firm of Ferranti Limited. It became (by a few months) the world’s first commercially available digital computer (followed in June 1951 by the “Universal Automatic Computer” [UNIVAC], developed by the Eckert-Mauchly Computer Corporation). The Ferranti Mark I was unveiled formally at an inaugural conference held in Manchester, June 9 to 12, 1951. At this conference, Maurice Wilkes delivered a lecture titled “The Best Way to Design an Automatic Calculating Machine.” This conference is probably (perhaps unfairly) more known because of Wilkes’s lecture than for its primary focus, the Ferranti Mark I. For during this lecture, Wilkes announced a new approach to the design of a computer’s control unit called microprogramming, which would be massively consequential in the later evolution of computers. Wilkes’s lecture also marked something else: the search for order, structure, and simplicity in the design of computational artifacts; and an attendant concern for, a preoccupation with, the design process itself in the realm of computational artifacts. We have already seen the first manifestations of this concern with the design process in the Goldstine-von Neumann invention of a flow diagram notation for beginning the act of computer programming (see Chapter 9, Section III), and in David Wheeler’s and Stanley Gill’s discussions of a method for program development (Chapter 10, Section IV). Wilkes’s lecture was notable for “migrating” this concern into the realm of the physical computer itself. We recall that, in May 1949, the Cambridge EDSAC became fully operational (see Chapter 8, Section XIII). The EDSAC was a serial machine in that reading from or writing into memory was done 1 bit at a time (bit serial) ; and, likewise, the arithmetic unit performed its operations in a bit-by-bit fashion. Soon after the EDSAC’s completion, while others in his laboratory were busy refining the programming techniques and exploring its use in scientific applications (see Chapter 9, Sections V–VIII; and Chapter 10), Wilkes became preoccupied with issues of regularity and complexity in computer design and their relation to reliability.
5

Gershon, Richard A. "Intelligent Networking and Business Process Innovation". In Business Information Systems, 1412–24. IGI Global, 2010. http://dx.doi.org/10.4018/978-1-61520-969-9.ch088.

Full text
Abstract
Today, innovation is about much more than just developing new products. It is about reinventing business processes and building entirely new markets to meet untapped customer needs. This chapter will examine the subject of business process innovation, which involves creating systems and methods for improving organizational performance. Special attention is given to the topic of intelligent networking, which represents the combination of software, technology, and electronic pathways that makes business process innovation possible for both large and small organizations alike. A central tenet is that the intelligent network is not one network, but a series of networks designed to enhance world-wide communication for business and residential users. Two very different kinds of intelligent networks are discussed in this chapter. The first involves satellite-to-cable television networking, where the emphasis is on program distribution to the end consumer. The second is a supply chain management network, where the emphasis is on just-in-time manufacturing. Each of these networks represents a highly innovative business process, and they share the common goal of improving organizational performance. The information presented in this chapter is theory-based and supported by a case-study analysis of Home Box Office, Inc. and Dell Computers.
6

Zaitsev, Dmitry A. "Sleptsov Net Computing". In Advances in Computer and Electrical Engineering, 1660–74. IGI Global, 2019. http://dx.doi.org/10.4018/978-1-5225-7598-6.ch122.

Full text
Abstract
Motivation for new models of hyper-computation is presented. The Sleptsov net is introduced and compared to Petri and Salwicki nets. A concept of a universal Sleptsov net, as a prototype of a processor in Sleptsov net computing, is discussed. A small universal Sleptsov net that runs in polynomial time is constructed; it consists of 15 places and 29 transitions. Principles of programming in Sleptsov nets, as the composition of reverse control flow and data, have been developed. Standard control flow patterns include sequence, branching, loop, and parallel execution. Basic modules, which efficiently implement copying, logic, and arithmetic operations, have been developed. Special dashed arcs are introduced for brief specification of the input and output data of modules (subnets). Ways of hierarchical composition of a program via substitution of a transition by a module are discussed. Examples of Sleptsov net programs for data encryption, fuzzy logic, and partial differential equations are presented. Enterprise implementation of Sleptsov net programming promises ultra-performance.
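What distinguishes a Sleptsov net from a Petri net is its firing rule: a transition may fire multiple times in a single step, which underlies the polynomial-time universality claimed above. The Python sketch below implements one common variant of that rule, firing at the maximal admissible multiplicity, on a toy two-place net; the matrix encoding and the example net are illustrative assumptions, not the chapter's universal net.

```python
import numpy as np

def fire_max(marking, pre, post, t):
    """One step of a Sleptsov net: transition t fires in a single step
    with the maximal multiplicity its input places allow (in a Petri
    net, the same transition would fire only once per step)."""
    cols = pre[:, t] > 0
    k = int(np.min(marking[cols] // pre[cols, t])) if cols.any() else 0
    return marking + k * (post[:, t] - pre[:, t]), k

# Toy net: transition t0 consumes 2 tokens from p0 and produces 1 in p1.
pre = np.array([[2], [0]])     # rows = places, columns = transitions
post = np.array([[0], [1]])
m, k = fire_max(np.array([7, 0]), pre, post, 0)
print(m, k)                    # [1 3] 3 -> fired 3 times in one step
```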
7

Moorthi, M. Narayana, and R. Manjula. "Challenges Faced in Enhancing the Performance and Scalability in Parallel Computing Architecture". In Advances in Computer and Electrical Engineering, 252–69. IGI Global, 2016. http://dx.doi.org/10.4018/978-1-4666-9479-8.ch010.

Full text
Abstract
Nowadays the architectures of high-performance systems are improving, with more and more processor cores on the chip. This has both benefits and challenges. The benefit is running more tasks simultaneously, which reduces the running time of a program or application. The challenges are: what is the maximum limit on the number of cores in a given chip; how will existing and future software make use of all the cores; which parallel programming language to choose; what memory and cache coherence issues are involved when we increase the number of cores; how to solve the power and performance issues; how the cores are connected and how they communicate to solve a single problem; and how workload distribution and load balancing affect scalability. There is a practical limit to the speedup and scalability achievable with the number of cores on a chip, which needs to be analyzed. This chapter will therefore focus on an introduction and overview of parallel computing and the challenges faced in enhancing performance and scalability in parallel computing architecture.
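The "practical limit for speedup" referred to above is classically formalized by Amdahl's law; the abstract does not name it, so treating it as the intended bound is an assumption. With a parallelizable fraction p of the workload, n cores give a speedup of 1/((1-p) + p/n), capped at 1/(1-p) no matter how many cores are added. A minimal sketch:

```python
def amdahl_speedup(parallel_fraction, n_cores):
    """Amdahl's law: speedup from n cores when only a fraction p of the
    work parallelizes; the serial remainder (1 - p) bounds the gain."""
    p = parallel_fraction
    return 1.0 / ((1.0 - p) + p / n_cores)

# With 95% of the work parallel, the speedup plateaus near 20x.
for n in (4, 16, 64, 1024):
    print(n, round(amdahl_speedup(0.95, n), 2))
```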
8

Wilson, M. T., and J. Torres. "Stopped-flow spectroscopy". In Spectrophotometry and Spectrofluorimetry. Oxford University Press, 2000. http://dx.doi.org/10.1093/oso/9780199638130.003.0012.

Full text
Abstract
There was a time, fortunately some years ago now, when to undertake rapid kinetic measurements using a stopped-flow spectrophotometer verged on the heroic. One needed to be armed with knowledge of amplifiers, light sources, oscilloscopes, etc., and ideally one's credibility was greatly enhanced were one to build one's own instrument. Analysis of the data was similarly difficult. To obtain a single rate constant might involve a wide range of skills in addition to those required for the chemical/biochemical manipulation of the system and could easily include photography, developing prints and considerable mathematical agility. Now all this has changed and, from the point of view of the scientist attempting to solve problems through transient kinetic studies, a good thing too! Very high quality data can readily be obtained by anyone with a few hours' training and the ability to use a mouse and ‘point and click’ programs. Excellent stopped-flow spectrophotometers can be bought which are reliable, stable, sensitive and which are controlled by computers able to signal-average and to analyse, in seconds, kinetic progress curves in a number of ways, yielding rate constants, amplitudes, residuals and statistics. Because it is now so easy, from the technical point of view, to make measurements and to do so without an apprenticeship in kinetic methods, it becomes important to make sure that one collects data that are meaningful and open to sensible interpretation. There are a number of pitfalls to avoid. The emphasis of this article is, therefore, somewhat different to that of the one written by Eccleston (1) in an earlier volume of this series. Less time will be spent on consideration of the hardware, although the general principles are given, but the focus will be on making sure that the data collected means what one thinks it means and then how to be sure one is extracting kinetic parameters from this in a sensible way. With the advent of powerful, fast computers it has now become possible to process very large data sets quickly and this has paved the way for the application of ‘rapid scan’ devices (usually, but not exclusively, diode arrays), which allow complete spectra to be collected at very short time intervals during a reaction.
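The routine analysis the chapter alludes to, extracting an observed rate constant from a stopped-flow progress curve, is typically a nonlinear least-squares fit of an exponential to the trace. The SciPy sketch below does this on synthetic data; the amplitude, rate, offset, and noise level are arbitrary illustrative values, not taken from the chapter.

```python
import numpy as np
from scipy.optimize import curve_fit

def single_exponential(t, a, k, c):
    """A(t) = a * exp(-k t) + c: the simplest model fitted to a
    stopped-flow progress curve to extract an observed rate constant."""
    return a * np.exp(-k * t) + c

rng = np.random.default_rng(1)
t = np.linspace(0, 0.5, 200)                      # seconds after mixing
trace = single_exponential(t, 0.30, 12.0, 0.05)   # synthetic absorbance
trace += rng.normal(0, 0.003, t.size)             # detector noise
popt, _ = curve_fit(single_exponential, t, trace, p0=(0.2, 5.0, 0.0))
print(popt)   # recovered (amplitude, k_obs, offset), k_obs ~ 12 s^-1
```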
9

Michailidou, Sofia, and Eleni Koustriava. "Training a Child with Blindness on the Basic Use of Computer with the Aim of Internet Socialization; an Intervention Program". In Assistive Technology: Shaping a Sustainable and Inclusive World. IOS Press, 2023. http://dx.doi.org/10.3233/shti230643.

Full text
Abstract
In the present intervention program, an eleven-year-old student with visual impairments was introduced to the basic use of a computer for the first time. The key tools for achieving this goal were the screen-reading software "NVDA" and a well-structured educational program. The purpose of the intervention was to enhance the student's technological skills, to familiarize him with the use of assistive technology, and to enable him to exploit these new skills for his internet socialization. The evaluation of the intervention program's results was completed in three stages: a) testing the student's knowledge and skills in the basic use of a computer (pre- and post-assessment), b) measuring his social network, his self-esteem, and the perceived social support, and c) analyzing the content of the student's written speech based on a series of criteria (pre- and post-assessment). The results showed that the basic use of a computer was acquired, and internet socialization increased his level of self-esteem and his social network and simultaneously created a sense of belonging. Finally, there was an improvement in his writing.
10

Campbell-Kelly, Martin. "ACE". In The Turing Guide. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780198747826.003.0030.

Full text
Abstract
In October 1945 Alan Turing was recruited by the National Physical Laboratory to lead computer development. His design for a computer, the Automatic Computing Engine (ACE), was idiosyncratic but highly effective. The small-scale Pilot ACE, completed in 1950, was the fastest medium-sized computer of its era. By the time that the full-sized ACE was operational in 1958, however, technological advance had rendered it obsolescent. Although the wartime Bletchley Park operation saw the development of the electromechanical codebreaking bombe (specified by Turing) and the electronic Colossus (to which Turing was a bystander), these inventions had no direct impact on the invention of the electronic stored-program computer, which originated in the United States. The stored-program computer was described in the classic ‘First draft of a report on the EDVAC’, written by John von Neumann on behalf of the computer group at the Moore School of Electrical Engineering, University of Pennsylvania, in June 1945. The report was the outcome of a series of discussions commencing in the summer of 1944 between von Neumann and the inventors of the ENIAC computer—John Presper Eckert, John W. Mauchly, and others. ENIAC was an electronic computer designed primarily for ballistics calculations: in practice, the machine was limited to the integration of ordinary differential equations and it had several other design shortcomings, including a vast number of electronic tubes (18,000) and a tiny memory of just twenty numbers. It was also very time-consuming to program. The EDVAC design grew out of an attempt to remedy these shortcomings. The most novel concept in the EDVAC, which gave it the description ‘stored program’, was the decision to store both instructions and numbers in the same memory. It is worth noting that during 1936 Turing became a research student of Alonzo Church at Princeton University. Turing came to know von Neumann, who was a founding professor of the Institute for Advanced Study (IAS) in Princeton and was fully aware of Turing’s 1936 paper ‘On computable numbers’. Indeed, von Neumann was sufficiently impressed with it that he invited Turing to become his research assistant at the IAS, but Turing decided to return to England and subsequently spent the war years at Bletchley Park.

Conference papers on the topic "Time Series Processor (Computer program)"

1

Benjaboonyazit, Veerawit, Kazem Kiani Nassab, Sompop Buapha, Nithipoom Durongwattana, and Rachit Garg. "Implementation of Well Delivery Process Application, A Success Story of Digitalization in Well Design Process". In IADC/SPE Asia Pacific Drilling Technology Conference and Exhibition. SPE, 2022. http://dx.doi.org/10.2118/209841-ms.

Full text
Abstract
As a part of the company's digitalization project, the Well Delivery Process (WDP) software solution was successfully developed and launched in early 2021. By using this solution, engineers benefit from the project objectives of process standardization, well design quality, and well planning cycle time reduction. The WDP software consists of a series of integrated well delivery workflows, including well design, drilling execution monitoring, and closeout reports, within a single cloud-based platform. The system also provides users with many automation features such as well design rules validation, offset well analysis, risk analysis, automated calculations, and digital reports (drilling programs). Through this paper, we share our experiences and observations on the system development and implementation, and also the challenges and limitations that we faced in rolling it out successfully. In our conventional practice before the digital transformation, drilling engineers had their individual practices/formats for performing the well delivery process using various applications and techniques. Computer applications were used to generate and compile a drilling program, but there was no standard way of performing this process. In situations where the design parameters and/or subsurface information changed, engineers had to repeat the entire process to revise the design and/or the drilling program manually. To standardize the process, minimize repetitive tasks, and provide engineers with all required digital tools, a first solution (a prototype computer program) was developed in-house to prove the concept. After multiple peer reviews and trials, we used the prototype program to specify user requirements for the WDP software solution. An agile methodology was utilized to engage all drilling engineers in each sprint of the WDP development phase to ensure that their requirements were fulfilled. Upon completion of the development phase, a comprehensive User Acceptance Test (UAT) was conducted, followed by training sessions before the cut-off date for switching from the manual process to the WDP digital workflow. Switching from an existing practice to a new way of working is always challenging, especially when it comes to introducing new software. There was some resistance in the start-up period due to users' unfamiliarity with the software and also software bugs that had not been found during the testing periods and UAT. A support team was set up to ensure immediate action on users' requirements and any technical issues with the software. During the challenging transition period, all users were fully supported until they were familiar enough with the system to complete their tasks effectively. The support team deployed hotfixes (bug fixes) as quickly as possible with no major impact on the drilling programs' timeline, monitored the system performance on a daily basis, and gathered feedback for further improvements. At present, the WDP is the primary system used by all drilling engineers in the company's domestic assets as an integrated end-to-end well delivery workflow that provides improved design quality in a shorter planning time. More than 200 well design workflows have been initiated and approved through the system since it came into use. Based on a recent user satisfaction survey and feedback, an overall rating of 4.2 out of 5 shows the WDP solution's effectiveness, with many popular and useful features, especially the offset well analyzer and well design dashboards.
As one of the first end-to-end workflow generations in the drilling engineering process, the WDP could save up to 35-40% of working time (as voted by users) through a centralized dataset platform and its holistic approach to process standardization/automation. Besides design quality and cycle time improvement, the system also enables advanced data analytics for further automation and enhancements.
2

Liu, Kai, Xi Zhang, and YangQuan Chen. "An Evaluation of ARFIMA Programs". In ASME 2017 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2017. http://dx.doi.org/10.1115/detc2017-67483.

Full text
Abstract
Time series with strong coupling between values at different times, exhibiting long-range dependence, non-stationarity, and spiky signals, cannot be processed by conventional time series analysis. The ARFIMA model, which employs fractional-order signal processing techniques, is the generalization of the conventional integer-order models, ARIMA and ARMA. Therefore, it has much wider applications, since it can capture both short-range and long-range dependence. By now, several software packages have implemented functions dealing with ARFIMA processes. However, results can differ considerably when different numerical tools are used for time series analysis. Asked time and again which tool is suitable for a specific application, the authors decided to carry out this survey to present recapitulative information on the tools available in the literature, in the hope of benefiting researchers with different academic backgrounds. In this paper, four primary functions concerning simulation, the fractional-order difference filter, estimation, and forecasting are compared and evaluated in the different software packages, and informative comments are also provided to aid selection.
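The fractional-order difference filter being compared across packages is the ingredient that turns ARIMA into ARFIMA: the operator (1 − B)^d with non-integer d, whose weights follow a simple binomial recursion. The sketch below is a package-independent illustration of that filter, not a reproduction of any of the surveyed implementations.

```python
import numpy as np

def frac_diff_weights(d, n):
    """Coefficients of the fractional difference filter (1 - B)^d:
    w_0 = 1 and w_k = w_{k-1} * (k - 1 - d) / k (binomial expansion)."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

def frac_diff(x, d):
    """Apply (1 - B)^d to a series by truncated convolution."""
    w = frac_diff_weights(d, len(x))
    return np.array([w[:t + 1] @ x[t::-1] for t in range(len(x))])

x = np.cumsum(np.random.default_rng(2).normal(size=500))   # a random walk
y = frac_diff(x, 0.4)    # fractionally differenced series with d = 0.4
print(y[:5])
```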
3

Kiani Nassab, Kazem, Rachit Garg, Veerawit Benjaboonyazit, Rudy Harianto, Atiq Ur Rehman, Nithipoom Durongwattana, and Sompop Buapha. "Well Design and Engineering Process Automation". In IADC/SPE Asia Pacific Drilling Technology Conference. SPE, 2021. http://dx.doi.org/10.2118/201088-ms.

Full text
Abstract
Drilling engineers use several applications to perform well design tasks and to create a final report for review and approval; any change in subsurface information requires revalidation of engineering calculations and a repeat of the entire task chain to update the stage-gate report. This is usually a manual, time-consuming, and error-prone process that may result in additional cost and/or prolonged planning cycle time. Moreover, such manual work by individual engineers leads to diversified well design practices and formats across a company, which makes standardization and compliance control difficult. Drilling engineering computer programs are primarily standalone applications that are used for engineering calculations, in most cases with no continuous workflow. Well Delivery Process (WDP) is an engineering software solution developed to integrate, automate, and standardize the well construction planning process across the operating company. The system encompasses several integrated workflows by which users can carry out drilling/completion tasks from feasibility study to concept selection and detailed design, as well as operations monitoring and closeout reports, on a single digital platform. Furthermore, functions such as engineering calculations, rules validation, offset analysis, well schematics, risk analysis, checklists, and well programs are automated through the workflows and several microservices built on a series of applications. The WDP, based on the company's well design automation initiative, was developed jointly with the service provider using its Business Process Management (BPM) tools. The system integration transforms how wells are constructed and delivered by combining a digitalized planning and design process with engineering models on a single, open, cloud-native platform. Several tools and techniques such as design templates, continuous calculations, well cost models, etc. are utilized through integrated workflows to automate the well design process. The solution supports all new wells and leverages data from existing wells to optimize the well construction process. As a result, the collaborative well design platform and automation tools take the drilling engineering process to the next level, with better well design quality and reduced planning time.
4

Butnariu, Silviu, and Florin Girbacia. "DEVELOPMENT OF A NATURAL USER INTERFACE FOR INTUITIVE PRESENTATIONS IN EDUCATIONAL PROCESS". In eLSE 2012. Editura Universitara, 2012. http://dx.doi.org/10.12753/2066-026x-12-103.

Full text
Abstract
In the educational process, the support for presentations given in courses, seminars, and laboratory sessions is the PowerPoint slide deck. Usually, a simple on-screen slide is not enough for students to understand complex information; it requires additional verbal explanations, and sometimes drawings, sketches, charts, flowcharts, or arrows are needed to complete the presentation. Various methods are used to deliver a presentation today, from the simplest (projecting the slides on a whiteboard and overwriting with water-based markers) to more complex ones, such as interactive whiteboards that can be written on directly with special pens. However, there are cases when the screen is too large and the presenter is not even near it. The same thing happens when no projection screen is used and the presentation is displayed on a white wall. In these cases, a new approach to teaching methods is required, especially for emphasizing and adding information in real time. This paper presents a new methodology for using a Natural User Interface (NUI) dedicated device (e.g., Microsoft Kinect) in PowerPoint presentations, enabling additions and changes in real time on the same screen as the PowerPoint presentation. Kinect is a low-cost peripheral device that allows users to interact with data using body gestures and voice commands. We propose a series of intuitive NUI actions based on body postures, gestures, and voice commands used to make additions to the classical presentation, such as underlining, highlighting, drawings, diagrams, and various smart-art objects. The GUI of the developed program is similar to that of the classic Windows Paint program. Tests and calibration of the equipment were performed in order to modify/correct the positions/movements attached to each action. The advantage of the developed application is the use of low-cost equipment to create an intuitive human-computer interface that improves traditional presentations.
5

Xing, Tongtong, Ruiling Fu, Jingxuan Min, and Fulian Yin. "Sequential recommendations based on time series and dynamic interest for TV program". In Fifth International Conference on Computer Information Science and Artificial Intelligence (CISAI 2022), edited by Yuanchang Zhong. SPIE, 2023. http://dx.doi.org/10.1117/12.2667261.

Full text
6

Manazir, Abdul and Khalid Raza. "pCGP: A Parallel Implementation of Cartesian Genetic Programming for Combinatorial Circuit Design and Time-Series Prediction". In 2022 International Conference on Electrical, Computer and Energy Technologies (ICECET). IEEE, 2022. http://dx.doi.org/10.1109/icecet55527.2022.9872630.

Full text
7

Maravic, Manojlo and Gorana Rakicbajic. "THE TEACHERS' ATTITUDE TOWARDS THE USE OF VIDEO GAMES IN TEACHING PROCESS". In eLSE 2018. ADL Romania, 2018. http://dx.doi.org/10.12753/2066-026x-18-040.

Full text
Abstract
Nowadays in Serbia, in all public spheres, there is a generally positive atmosphere towards the use of new digital technologies, and digital technology and user training for the ICT industries are set as national priorities of economic development. The City of Novi Sad already holds a central position in the national IT industry, and two facts are particularly important for the present research: the first, practical, is that the largest Serbian companies specializing in the production of video games are based in Novi Sad; the second, academic, is that the Academy of Arts, part of the University of Novi Sad, established the Video Games Design program, the first study program of its kind in the Western Balkan region. Video games have long stopped being viewed as entertainment for children and teenagers only and are instead also used in different public spheres. A new academic discipline, game studies, uses the term "serious games" to describe video games intended for education, military training, medical treatment, political, religious, and corporate propaganda, and, according to some authors, artistic video games. The focus of this research is the field of educational video games, so-called edutainment: the concept of education through fun activities, in this particular case through the use of video games in the educational process. When a new method is about to be introduced into the education process, it is the teachers' attitude towards the change that will determine the success of the novelty. Therefore, this study should be considered a probing study that deals with the conditions and possibilities of implementing video games as a new teaching tool. The main aim of our study was to explore the teachers' attitude towards the use of serious game-based learning (ASGBL). Also examined was the relationship between teachers' ASGBL and employment length, the use of the computer in teaching, and playing digital games. The research included 182 teachers from elementary and secondary schools in Novi Sad, 86.3% female, aged between 25 and 63 years. To assess the attitude towards the use of serious game-based learning, we used a 25-item questionnaire. The results show that teachers have neither an extremely positive nor an extremely negative attitude towards the use of serious game-based learning. We found no differences between employment length and ASGBL. There are differences between teachers who use the computer in teaching and teachers who do not: teachers who already use the computer in classes have a more positive attitude than those who don't. Also, there were no differences in ASGBL between teachers who play games in their leisure time and teachers who do not. In conclusion, elementary and secondary school teachers in Novi Sad have a slightly positive attitude towards the use of serious game-based learning.
8

Schieler, Richard F. "Warehouse Configuration Analysis to Achieve Productivity and Cube Utilization". In ASME 1995 Citrus Engineering Conference. American Society of Mechanical Engineers, 1995. http://dx.doi.org/10.1115/cec1995-4105.

Full text
Abstract
There are many factors which influence profitability for a citrus industry processor. Demand and raw product quality/availability are surely near the top of the list. The process itself gets a lot of attention relative to cost reduction. One area which does not get a lot of attention, however, is warehousing. The warehouse has historically been a "foster child," so to speak. If warehousing continues to be considered a necessary evil, its effect on the profitability of total operations will obviously be negative. If warehousing is given some much-needed attention, the negative effect can be minimized and it can even help improve profitability. An example might demonstrate this claim: a processing plant is relatively land-locked and needs to expand to meet plan goals. Thirty (30) to forty (40) percent of the existing plant area is utilized for warehousing. In many cases, productivity in these existing warehouses is poor, maintenance costs are higher than they might be, and cube utilization can be poor. A land-locked plant which must expand suggests big capital costs or debt service for additional land and buildings. Why not zero in on the relatively unproductive utilization of 30 to 40 percent of existing space? If we can improve cube utilization alone, we might free up enough space to accommodate process expansion on the site. If we can do this with a corresponding productivity increase, we effectively lower operating cost and capital cost or debt service. In the profit equation, lower cost means higher profits. Product is stored as concentrate in tank farms, as concentrate in drums, in totes in some cases, as frozen single-strength slabs, or in the many finished package configurations. There is not much we can do about improving tank farm space utilization, so we will concentrate on storage of unit loads (drums, frozen single-strength slabs, or palletized finished unit loads). Given the time we have to address the topic, we will zero in on drum storage; the principles discussed can be applied to any unit load configuration. To adequately address warehouse optimization, product storage configuration and method of operation must be evaluated. These two (2) variables cannot be developed independently: to a degree, each is affected by changes in the other. SORA has developed a systematic approach to the analysis of warehousing operations that recognizes this interrelationship. The analysis consists in general of a series of proprietary computer programs and algorithms that are individually customized to suit the particular needs of a client while maintaining the inter-linked relationship. This methodology is further explained and the data collection requirements are defined in this paper. An example is provided which demonstrates the results of proper analysis and provides sufficient budgetary and "rule-of-thumb" data for a preliminary analysis of your own needs. Paper published with permission.
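Cube utilization, the ratio the paper optimizes, is simply stored product volume over building volume. A back-of-the-envelope sketch for drum storage follows; SORA's programs are proprietary, so every number below is a hypothetical illustration, not data from the paper:

```python
def cube_utilization(stored_units, unit_volume_ft3, clear_height_ft, floor_area_ft2):
    """Fraction of the building cube actually occupied by product."""
    return stored_units * unit_volume_ft3 / (floor_area_ft2 * clear_height_ft)

# A 55-gal drum is roughly 7.4 ft3. Assume 2,000 floor pallet positions,
# pallets stacked 4 high, 4 drums per pallet, 24 ft clear height, 40,000 ft2.
before = cube_utilization(2000 * 4 * 4, 7.4, 24.0, 40000.0)
# Re-racking to 5-high stacks with narrower aisles adds floor positions.
after = cube_utilization(2400 * 5 * 4, 7.4, 24.0, 40000.0)
print(f"cube utilization: {before:.1%} -> {after:.1%}")   # ~24.7% -> ~37.0%
```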
9

Langa, Claudiu. "USING ICT IN THE EDUCATION PROCESS BY PRE-UNIVERSITY TEACHERS AND PRE-SERVICE TEACHERS - COMPARATIVE STUDY". In eLSE 2012. Editura Universitara, 2012. http://dx.doi.org/10.12753/2066-026x-12-031.

Full text
Abstract
Practice has often proven that in many cases multimedia technological applications (internet, media editing hardware, and software platforms) can raise the quality of the educational act by optimizing information transfer, developing pupils'/students' specific competences, developing creativity, and encouraging involvement in joint activities with the help of the internet. Nevertheless, a series of studies reveals that teachers consider that the student-teacher relation could not be replaced, even partially, by technology. The present paper presents the results of a study run in 14 gymnasiums and high schools from Arges county and at the University of Pitesti. The purpose of the study is to identify the perception of social-humanistic discipline teachers in pre-university education and of students at the social-humanistic faculties (psychology, history, journalism) who wish to become teachers on the use of ICT resources in the didactical process. Among the objectives of this study is analyzing the perception of the teachers and students in the group on the efficiency of multimedia means in facilitating the transmission of knowledge and the shaping of pupils' competences, in the judicious management of the time allocated to class teaching, in the differentiated treatment of pupils, and in mediating the pupil-teacher relationship. The research hypotheses of this study are: there are differences in students' and teachers' perception of the utility of multimedia means in facilitating the application of teaching-learning methods which are attractive and interactive for pupils; there are differences in students' and teachers' perception of the judicious management of the time allocated to class teaching through the use of multimedia means; there are differences in students' and teachers' perception of the individualization of training through the application of multimedia methods; and there are differences in students' and teachers' perception of the mediation of the pupil-teacher relationship through social media means. The methodology used in this investigative approach is a questionnaire applied to the social-humanistic discipline teachers in pre-university education and to those students who attend the courses of the psycho-pedagogical studies program with certification for a teaching career and who have evaluated the university didactic staff. The study was carried out on a group of 41 socio-humanistic teachers and 56 third-year students of the University of Piteşti in the social and human sciences field. Only third-year students were included in the group because Computer Assisted Instruction was a subject studied during that year; also, during the third year students undertake pedagogical training and have access to schools in order to attend lessons and to teach a certain number of hours. The findings of this study aimed at verifying the research hypotheses. The results were processed and interpreted through descriptive and inferential statistical analysis. Three of the hypotheses were confirmed and one was not confirmed following data processing and interpretation. The conclusions of the research provide relevant data for improving the ICT use methodology in the teaching of social-humanistic disciplines.
10

Tikhonov, Vadim S. and Alexander I. Safronov. "Analysis of Postbuckling Drillstring Vibrations in Rotary Drilling of Extended-Reach Wells". In ASME 2009 28th International Conference on Ocean, Offshore and Arctic Engineering. ASMEDC, 2009. http://dx.doi.org/10.1115/omae2009-79086.

Full text
Abstract
One of the most serious concerns in extended-reach drilling is the dynamic behavior of the drillstring and the cleaning of the well. Good cleaning requires an increased angular speed. However, at higher rotary speeds, the drillstring sections lying in the horizontal sections of the borehole tend to buckle, first in the form of a "snake," sliding up and down the borehole bottom wall, and then in the form of whirling as the angular velocity increases. This paper presents a 3D nonlinear dynamic model of a drillstring in a wellbore of 3D profile. The model allows for possible contact/lift-off of drill pipes with/from the wellbore wall. The interaction of lateral, torsional, and axial vibrations is taken into account. The relation between the normal component of the contact force and the deformation of the wellbore wall is taken as quadratic-elastic. The friction force is described by a hysteretic dynamic model, which also accounts for the transition from sliding to whirling. The equations of drillstring dynamics are solved numerically using the method of lines. The DYNTUB computer program is developed to analyze drillstring time-varying processes under different loading. The program is used to study the effects of angular velocity, compression load, torque, friction factor, well profile, and availability of connectors on the drillstring dynamic behavior.
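The method of lines named in the abstract replaces the spatial derivatives with finite differences, leaving a system of ODEs in time. Below is a drastically simplified 1D illustration (a damped, pinned vibrating string, not the paper's 3D drillstring model); all parameter values are arbitrary:

```python
# Method-of-lines sketch for w_tt = c^2 * w_xx - k * w_t on a pinned string.
import numpy as np
from scipy.integrate import solve_ivp

L, c, k, n = 100.0, 50.0, 0.05, 101          # length, wave speed, damping, nodes
x = np.linspace(0.0, L, n)
dx = x[1] - x[0]

def rhs(t, y):
    w, v = y[:n], y[n:]                      # displacement and velocity
    w_xx = np.zeros(n)
    w_xx[1:-1] = (w[2:] - 2 * w[1:-1] + w[:-2]) / dx**2
    a = c**2 * w_xx - k * v                  # acceleration at each node
    a[0] = a[-1] = 0.0                       # pinned ends stay fixed
    return np.concatenate([v, a])

w0 = np.sin(np.pi * x / L)                   # initial first-mode deflection
y0 = np.concatenate([w0, np.zeros(n)])
sol = solve_ivp(rhs, (0.0, 2.0), y0, t_eval=np.linspace(0.0, 2.0, 5))
print(sol.y[n // 2, :])                      # midpoint deflection over time
```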

Reports on the topic "Time Series Processor (Computer program)"

1

Bigorre, Sebastien P., Raymond Graham, and Matthias Lankhorst. The Northwest Tropical Atlantic Station (NTAS): NTAS-21 Mooring Turnaround Cruise Report Cruise On Board RV Ronald H. Brown October 6-25, 2022 Bridgetown, Barbados – Bridgetown, Barbados. Woods Hole Oceanographic Institution, May 2023. http://dx.doi.org/10.1575/1912/66127.

Full text
Abstract
The Northwest Tropical Atlantic Station (NTAS) was established to address the need for accurate air-sea flux estimates and upper ocean measurements in a region with strong sea surface temperature anomalies and the likelihood of significant local air-sea interaction on interannual to decadal timescales. The approach is to maintain a surface mooring outfitted for meteorological and oceanographic measurements at a site near 15°N, 51°W through successive mooring turnarounds. These observations are used to investigate air-sea interaction processes related to climate variability. The NTAS Ocean Reference Station (ORS NTAS) is supported by the National Oceanic and Atmospheric Administration's (NOAA) Global Ocean Monitoring and Observing (GOMO) Program (formerly Ocean Observing and Monitoring Division). This report documents the recovery of NTAS-20, the final mooring of the NTAS time series. The NTAS moorings use Surlyn foam buoys as the surface element. These buoys were outfitted with two Air-Sea Interaction Meteorology (ASIMET) systems. Each system measures, records, and transmits via satellite the surface meteorological variables necessary to compute air-sea fluxes of heat, moisture, and momentum. The upper 160 m of the mooring line were outfitted with oceanographic sensors for the measurement of temperature, salinity, and velocity. The mooring recovery was done by the Upper Ocean Processes Group of the Woods Hole Oceanographic Institution (WHOI) and Drew Cole, onboard R/V Ronald H. Brown, Cruise RB-22-04. The cruise took place between October 6 and 25, 2022. Other operations during the cruise consisted of an intercomparison between ship and NTAS buoy measurements, the turnaround of the Meridional Overturning Variability Experiment (MOVE) subsurface mooring array, CTD casts, and four Argo float deployments. MOVE is designed to monitor the integrated deep meridional flow in the tropical North Atlantic. This report describes these operations.
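As a rough illustration of what computing air-sea fluxes from such meteorological variables involves, here are the constant-coefficient bulk formulas. WHOI's actual processing uses the far more complete COARE algorithm; the exchange coefficients and sample values below are assumptions, not NTAS data:

```python
# First-order bulk flux sketch: momentum, sensible heat, latent heat.
RHO_AIR, CP, LV = 1.2, 1004.0, 2.5e6         # kg/m3, J/(kg K), J/kg
CD, CH, CE = 1.2e-3, 1.1e-3, 1.2e-3          # assumed constant exchange coeffs

def bulk_fluxes(wind_ms, t_sea_c, t_air_c, q_sea, q_air):
    tau = RHO_AIR * CD * wind_ms**2                                  # N/m2
    q_sensible = RHO_AIR * CP * CH * wind_ms * (t_sea_c - t_air_c)   # W/m2
    q_latent = RHO_AIR * LV * CE * wind_ms * (q_sea - q_air)         # W/m2
    return tau, q_sensible, q_latent

# Typical trade-wind values: 7 m/s wind, warm sea surface, moist air.
print(bulk_fluxes(7.0, 28.5, 26.5, 0.024, 0.017))
```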
2

Modlo, Yevhenii O., Serhiy O. Semerikov, Stanislav L. Bondarevskyi, Stanislav T. Tolmachev, Oksana M. Markova and Pavlo P. Nechypurenko. Methods of using mobile Internet devices in the formation of the general scientific component of bachelor in electromechanics competency in modeling of technical objects. [б. в.], February 2020. http://dx.doi.org/10.31812/123456789/3677.

Full text
Abstract
An analysis of the experience of professional training of bachelors of electromechanics in Ukraine and abroad made it possible to determine that one of the leading trends in its modernization is the synergistic integration of various engineering branches (mechanical, electrical, electronic engineering, and automation) into mechatronics for the purpose of design, manufacture, operation, and maintenance of electromechanical equipment. Teaching mechatronics provides for the meaningful integration of various disciplines of professional and practical training of bachelors of electromechanics based on the concept of modeling, and the technological integration of various organizational forms and teaching methods based on the concept of mobility. Within this approach, the leading learning tools for bachelors of electromechanics are mobile Internet devices (MIDs): multimedia mobile devices that provide wireless access to information and communication Internet services for collecting, organizing, storing, processing, transmitting, and presenting all kinds of messages and data. The authors reveal the main possibilities of using MIDs in learning: ensuring equal access to education, personalized learning, instant feedback and evaluation of learning outcomes, mobile learning, productive use of time spent in classrooms, creating mobile learning communities, supporting situated learning, developing continuous seamless learning, bridging the gap between formal and informal learning, minimizing educational disruption in conflict and disaster areas, assisting learners with disabilities, improving the quality of communication and institutional management, and maximizing cost-efficiency. Bachelor of electromechanics competency in modeling of technical objects is a personal and vocational ability which includes a system of knowledge, skills, experience in learning and research activities on modeling mechatronic systems, and a positive value attitude towards it; a bachelor of electromechanics should be ready and able to use methods and software/hardware modeling tools for process analysis, system synthesis, and evaluation of their reliability and effectiveness when solving practical problems in the professional field. The competency structure of the bachelor of electromechanics in the modeling of technical objects is reflected in three groups of competencies: general scientific, general professional, and specialized professional. The technique of using MIDs in teaching bachelors of electromechanics the modeling of technical objects is implemented as a methodology whose components are partial methods for using MIDs in the formation of the general scientific component of the bachelor of electromechanics competency in modeling of technical objects; these are illustrated using the example academic disciplines "Higher mathematics", "Computers and programming", "Engineering mechanics", and "Electrical machines". The leading tools for the formation of the general scientific component of the bachelor in electromechanics competency in modeling of technical objects are augmented reality mobile tools (to visualize objects' structure and modeling results), mobile computer mathematical systems (universal tools used at all stages of modeling learning), cloud-based spreadsheets (as modeling tools) and text editors (to write the program description of a model), mobile computer-aided design systems (to create and view the physical properties of models of technical objects), and mobile communication tools (to organize joint activity in modeling).
3

Ayers, Dotson and Alexander. L52332 Offshore Pipeline Damage Emergency Response Guidelines. Chantilly, Virginia: Pipeline Research Council International, Inc. (PRCI), July 2012. http://dx.doi.org/10.55274/r0010016.

Full text
Abstract
Subsea pipelines and flow lines are periodically subjected to damaging events such as anchor impacts that result in massive pipeline movements, dropped object damage, internal/external corrosion damage, etc. Knowing how to assess these damage events is often challenging, especially considering the potential for product release. The cost of production shut-ins can be significant, and avoiding unnecessary shut-ins is desirable. While most pipeline operators have company-level procedures and programs in place for responding to pipeline emergencies, at the current time there is no single resource providing guidance for the pipeline industry. This project developed emergency response guidelines for operators to respond to offshore pipeline damage emergencies in an effective and timely manner. One unique feature of this project is that SES utilized a series of workshops spaced over a year to collectively build the Decision/Task Tree, which is the key feature of this work. Further, a collaborative effort was continued to develop detailed input for the report. This Collaborative Workshop Model of conducting project work combines the best minds available on the subject, rather than having our customers merely serve as observers and evaluators, as is done traditionally. A second unique feature is that this report is formatted as a computer-based entry portal, a "front door", to existing proprietary documents that each company has assembled for use in responding to an offshore pipeline damage incident. Often the treasured company documents are in dusty notebooks that should be scanned for incorporation with this front-door document. This guideline document in its final form can provide live links to the proprietary company documents in Adobe Acrobat format, along with the materials developed for the project. This front door is intended for use on a computer that is linked to the internet. The contents of this report are organized to place traditional introductory topics, which would detract from operational use of the report during actual offshore emergencies, in appendices near the back. This report provides insights on the critical elements required for effectively responding to pipeline emergencies. PART A of this report contains the traditional introductory material, while PART B is named the Field Manual, for offshore emergency use; PART B can be used alone as an emergency response field manual without the introductory material. PART A contains the Executive Summary, Introduction, and Background, while PART B contains the Preface to the Field Manual, How to Use This Report, the Detailed Task/Decision Matrix, the Resource Sheets referred to in the Matrix, In-House Company Processes Needed, the Table of Preferred Consultants and Service Providers, and the SPIM 3-1 Detailed Repair Investigation Checklist.
4

Galili, Naftali, Roger P. Rohrbach, Itzhak Shmulevich, Yoram Fuchs and Giora Zauberman. Non-Destructive Quality Sensing of High-Value Agricultural Commodities Through Response Analysis. United States Department of Agriculture, October 1994. http://dx.doi.org/10.32747/1994.7570549.bard.

Full text
Abstract
The objectives of this project were to develop nondestructive methods for detection of internal properties and firmness of fruits and vegetables. One method was based on a soft piezoelectric film transducer developed at the Technion for analysis of fruit response to low-energy excitation. The second method was a dot-matrix piezoelectric transducer of North Carolina State University, developed for contact-pressure analysis of fruit during impact. Two research teams, one in Israel and the other in North Carolina, coordinated their research effort according to the specific objectives of the project, to develop and apply the two complementary methods for quality control of agricultural commodities.
In Israel: An improved firmness testing system was developed and tested with tropical fruits. The new system included an instrumented fruit-bed of three flexible piezoelectric sensors and miniature electromagnetic hammers, which served as fruit support and low-energy excitation device, respectively. Resonant frequencies were detected for determination of a firmness index. Two new acoustic parameters were developed for evaluation of fruit firmness and maturity: a damping ratio and a centroid of the frequency response. Experiments were performed with avocado and mango fruits. The internal damping ratio, which may indicate fruit ripeness, increased monotonically with time, while resonant frequencies and firmness indices decreased with time. Fruit samples were tested daily by a destructive penetration test. A fairly high correlation was found in tropical fruits between the penetration force and the new acoustic parameters; a lower correlation was found between these parameters and the conventional firmness index. Improved table-top firmness testing units, Firmalon, with a data-logging system and on-line data analysis capacity have been built. The new device was used for the full-scale experiments in the next two years, ahead of the original program and BARD timetable. Close cooperation was initiated with local industry for development of both off-line and on-line sorting and quality control of more agricultural commodities. Firmalon units were produced and operated in major packaging houses in Israel, Belgium, and Washington State, on mango, avocado, apples, pears, tomatoes, melons, and some other fruits, to gain field experience with the new method. The accumulated experimental data from all these activities is still being analyzed to improve firmness sorting criteria and shelf-life prediction curves for the different fruits. The test program in commercial CA storage facilities in Washington State included seven apple varieties: Fuji, Braeburn, Gala, Granny Smith, Jonagold, Red Delicious, Golden Delicious, and the D'Anjou pear variety. FI master-curves could be developed for the Braeburn, Gala, Granny Smith, and Jonagold apples. These fruits showed a steady ripening process during the test period. Yet more work should be conducted to reduce scatter in the data and to determine the confidence limits of the method. The nearly constant FI in Red Delicious and the fluctuations of FI in the Fuji apples should be re-examined. Three sets of experiments were performed with Flandria tomatoes. Despite the complex structure of the tomatoes, the acoustic method could be used for firmness evaluation and to follow the ripening evolution with time. Close agreement was achieved between the auction expert evaluation and that of the nondestructive acoustic test, where a firmness index of 4.0 or more indicated grade-A tomatoes. More work is being performed to refine the sorting algorithm and to develop a general ripening scale for automatic grading of tomatoes for the fresh fruit market. Galia melons were tested in Israel under simulated export conditions. It was concluded that the Firmalon is capable of detecting the ripening of melons nondestructively, and it sorted out the defective fruits from the export shipment. The cooperation with local industry resulted in development of an automatic on-line prototype of the acoustic sensor that may be incorporated into the export quality control system for melons. More interesting is the development of the remote firmness sensing method for sealed CA cool-rooms, where most of the full-year fruit yield is stored for off-season consumption. Hundreds of ripening monitor systems have been installed in major fruit storage facilities and are now being evaluated by the consumers. If successful, the new method may cause a major change in long-term fruit storage technology. More uses of the acoustic test method have been considered: monitoring fruit maturity and harvest time, testing fruit samples or each individual fruit when entering the storage facilities, packaging house, and auction, and in the supermarket. This approach may result in a full line of equipment for nondestructive quality control of fruits and vegetables, from the orchard or the greenhouse, through the entire sorting, grading, and storage process, up to the consumer's table. The developed technology offers a tool to determine the maturity of the fruits nondestructively by monitoring their acoustic response to a mechanical impulse on the tree. A special device was built and preliminarily tested on mango fruit. More development is needed to produce a portable, hand-operated sensing method for this purpose.
In North Carolina: An analysis method based on an autoregressive (AR) model was developed for detecting the first resonance of fruits from their response to a mechanical impulse. The algorithm included a routine that detects the first resonant frequency from as many sensors as possible. Experiments on Red Delicious apples were performed and their firmness was determined. The AR method allowed the detection of the first resonance and could be fast enough to be utilized in a real-time sorting machine; yet further study is needed to improve the search algorithm of the method. An impact contact-pressure measurement system and a neural network (NN) identification method were developed to investigate the relationships between surface pressure distributions on selected fruits and their respective internal textural qualities. A piezoelectric dot-matrix pressure transducer was developed for the purpose of acquiring time-sampled pressure profiles during impact. The acquired data were transferred to a personal computer and accurate visualization of animated data was presented. A preliminary test with 10 apples was performed. Measurements were made by the contact-pressure transducer in two different positions. Complementary measurements were made on the same apples using the Firmalon and Magness-Taylor (MT) testers. A three-layer neural network was designed. Two thirds of the contact-pressure data were used as training input data and the corresponding MT data as training target data; the remaining data were used as NN checking data. Six samples randomly chosen from the ten measured samples and their corresponding Firmalon values were used as the NN training and target data, respectively. The data of the remaining four samples were input to the NN. The NN results were consistent with the firmness tester values, so if more training data were obtained, the output should be more accurate. In addition, the firmness tester values were not consistent with the MT firmness tester values. The NN method developed in this study appears to be a useful tool to emulate the MT firmness test results without destroying the apple samples. To get a more accurate estimation of MT firmness, a much larger training data set is required. When the larger sensitive area of the pressure sensor being developed in this project becomes available, the entire contact 'shape' will provide additional information and the neural network results will be more accurate. It has been shown that the impact information can be utilized in the determination of internal quality factors of fruit. Until now,
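The AR-based first-resonance detection described for the North Carolina work can be sketched as follows: fit AR coefficients via the Yule-Walker equations and read the resonance off the dominant pole angle. The report's actual model order and search routine are not given, so this is only an illustration of the technique on a synthetic impulse response:

```python
import numpy as np
from scipy.linalg import solve_toeplitz

def first_resonance(signal, fs, order=8):
    sig = signal - signal.mean()
    n = len(sig)
    # Biased autocorrelation estimates r[0..order]
    r = np.array([np.dot(sig[:n - k], sig[k:]) / n for k in range(order + 1)])
    # Yule-Walker: solve the symmetric Toeplitz system for the AR coefficients
    a = solve_toeplitz((r[:-1], r[:-1]), r[1:])
    poles = np.roots(np.concatenate(([1.0], -a)))
    freqs = np.angle(poles) * fs / (2.0 * np.pi)
    pos = freqs > 0
    # The strongest resonance is the positive-frequency pole nearest |z| = 1
    best = np.argmax(np.abs(poles[pos]))
    return freqs[pos][best]

fs = 4000.0
t = np.arange(0.0, 0.5, 1.0 / fs)
rng = np.random.default_rng(0)
# Synthetic impulse response: a 120 Hz resonance decaying into light noise
x = np.exp(-12.0 * t) * np.sin(2.0 * np.pi * 120.0 * t)
x += 0.01 * rng.standard_normal(t.size)
print(f"estimated first resonance: {first_resonance(x, fs):.1f} Hz")
```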
