Journal articles on the topic 'Linear programming Data processing'




Consult the top 50 journal articles for your research on the topic 'Linear programming Data processing.'


Where available in the metadata, you can also download the full text of each publication as a PDF and read its abstract online.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Dzyubetsʹka, M. O., and P. O. Yahanov. "Data processing modules for automated systems building management." Electronics and Communications 16, no. 3 (March 28, 2011): 92–100. http://dx.doi.org/10.20535/2312-1807.2011.16.3.266193.

Abstract:
New software modules for the ACS «The Intelligent Building» were developed that apply regression methods, neural networks, queueing systems, linear programming, and fuzzy logic to data processing. Control panels for the ACS operator were developed using the virtual-instrument technologies of the LabVIEW graphical programming environment. The developed modules are convenient to use and can be easily integrated into an existing ACS.
2

Shevchenko, Hanna, and Mykola Petrushenko. "Managing change in nature-based tourism: A decision-making model using linear programming." Problems and Perspectives in Management 20, no. 2 (May 4, 2022): 199–219. http://dx.doi.org/10.21511/ppm.20(2).2022.17.

Abstract:
In conditions of forced isolation, nature-based tourism meets the needs of safe and comfortable recreation and travel combined with the solution of acute issues of medical treatment and rehabilitation during the pandemic and post-pandemic periods. This study aims to develop a model for decision-making on change management in nature tourism based on the approach of linear economic and mathematical programming. The paper formalized changes in the variability of objective function parameters of the model and the system of its restrictions, following the structure of assets of nature-based tourism, balanced by the sustainability principle. The algorithm for implementing the model includes four stages: collection and processing of relevant data on nature-based tourism; considering changes in the objective function and the system of its limitations; linear programming with variability tests using the simplex method; defining ranges/limits in which decisions are made. The initial data are summarized and averaged based on the primary data analysis on the functioning of sanatoriums and other tourist and recreational facilities in Ukraine. Short-term nature-based tourism is considered, the services of which are classified according to the criterion of the primary purpose of travel: “wow-effect” tourism, sports tourism, health tourism, traditional recreation, and green tourism. The results make it possible to substantiate decisions on changes in recreational land areas and human resources, on the limits of changes in income due to the dynamics of service prices, as well as determine the price range while maintaining income structure and sustainability limits for natural and human assets of nature-based tourism. 
Acknowledgment: The paper contains the results of a study conducted under the National Academy of Sciences of Ukraine’s grant Formation and Use of Natural-Resource Assets of the Recreational and Tourism Sphere (0120U100159) and the Nominal Scholarship of the Verkhovna Rada of Ukraine for Young Scientists-Doctors of Sciences for 2021 (0121U113482).
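As a toy illustration of the kind of decision model the abstract describes (maximizing tourism income under resource limits, solved by linear programming), a sketch using SciPy's `linprog`; the two services, the land and labour constraints, and every coefficient are invented for illustration and are not the paper's data:

```python
from scipy.optimize import linprog

# Decision variables: x0, x1 = units of two tourism services offered per season.
c = [-40.0, -25.0]            # negated profits per unit: linprog minimizes
A_ub = [[2.0, 1.0],           # hectares of recreational land used per unit
        [1.0, 3.0]]           # staff-days required per unit
b_ub = [100.0, 90.0]          # available land and labour

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None), (0, None)], method="highs")
print("plan:", res.x, "income:", -res.fun)
```

Sensitivity of the plan to the resource limits `b_ub` is the kind of "range in which decisions are made" that the abstract's fourth stage refers to.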
3

KODAMA, KOICHI, KOHEI SUENAGA, and NAOKI KOBAYASHI. "Translation of tree-processing programs into stream-processing programs based on ordered linear type." Journal of Functional Programming 18, no. 3 (May 2008): 333–71. http://dx.doi.org/10.1017/s0956796807006570.

Abstract:
There are two ways to write a program for manipulating tree-structured data such as XML documents: One is to write a tree-processing program focusing on the logical structure of the data and the other is to write a stream-processing program focusing on the physical structure. While tree-processing programs are easier to write than stream-processing programs, tree-processing programs are less efficient in memory usage since they use trees as intermediate data. Our aim is to establish a method for automatically translating a tree-processing program to a stream-processing one in order to take the best of both worlds. We first define a programming language for processing binary trees and a type system based on ordered linear type, and show that every well-typed program can be translated to an equivalent stream-processing program. We then extend the language and the type system to deal with XML documents. We have implemented an XML stream processor generator based on our algorithm, and obtained promising experimental results.
4

Indriyanti, Aries Dwi, Dedy Rahman Prehanto, Ginanjar Setyo Permadi, Chamdan Mashuri, and Tanhella Zein Vitadiar. "Using Fuzzy Time Series (FTS) and Linear Programming for Production Planning and Planting Pattern Scheduling Red Onion." E3S Web of Conferences 125 (2019): 23007. http://dx.doi.org/10.1051/e3sconf/201912523007.

Abstract:
This study discusses a production planning and shallot planting-pattern scheduling system that uses fuzzy time series and linear programming methods. Fuzzy time series is used to predict the number of requests, and its predictions become one of the variables in the linear programming calculation. Demand data, production data, employment data, land area data, production profit data, data on the number of seedlings, and planting time data are the indicators used by the system. The system recommends cropping patterns and the number of seeds that must be planted in one period. Shallots are harvested roughly 3-4 months after planting; the number of seeds planted is adjusted to the demand predicted by the fuzzy time series, and the cropping pattern is calculated using linear programming. The results of this system are recommendations for farmers on seed planting, planting schedules, and harvest schedules to meet market demand.
5

Shaikhha, Amir, Mathieu Huot, Jaclyn Smith, and Dan Olteanu. "Functional collection programming with semi-ring dictionaries." Proceedings of the ACM on Programming Languages 6, OOPSLA1 (December 8, 2022): 1–33. http://dx.doi.org/10.1145/3527333.

Abstract:
This paper introduces semi-ring dictionaries, a powerful class of compositional and purely functional collections that subsume other collection types such as sets, multisets, arrays, vectors, and matrices. We developed SDQL, a statically typed language that can express relational algebra with aggregations, linear algebra, and functional collections over data such as relations and matrices using semi-ring dictionaries. Furthermore, thanks to the algebraic structure behind these dictionaries, SDQL unifies a wide range of optimizations commonly used in databases (DB) and linear algebra (LA). As a result, SDQL enables efficient processing of hybrid DB and LA workloads, by putting together optimizations that are otherwise confined to either DB systems or LA frameworks. We show experimentally that a handful of DB and LA workloads can take advantage of the SDQL language and optimizations. SDQL can be competitive with or outperforms a host of systems that are state of the art in their own domain: in-memory DB systems Typer and Tectorwise for (flat, not nested) relational data; SciPy for LA workloads; sparse tensor compiler taco; the Trance nested relational engine; and the in-database machine learning engines LMFAO and Morpheus for hybrid DB/LA workloads over relational data.
6

Han, Zhi Yong, and Zhi Gang Han. "Data Processing with Transportation Cost Allocation Method of Formwork and Scaffold Leasing Companies Based on Supply Chain." Advanced Materials Research 977 (June 2014): 520–24. http://dx.doi.org/10.4028/www.scientific.net/amr.977.520.

Abstract:
Formwork and scaffold leasing companies have become one of the important links in the industry supply chain, and their sustainable development requires alliances and further cooperation between leasing companies. This paper studies how to decrease the alliance cost and achieve reasonable cost allocation through linear programming and cooperative games, in the sense of transportation cost allocation. Furthermore, it puts forward a data processing method that minimizes the ratio difference of cost saving. Based on this study, it can be concluded that the method is more rational for cost allocation.
7

Volkova, Elena, and Vladimir Gisin. "Privacy-Preserving Two-Party Computation of Parameters of a Fuzzy Linear Regression." Voprosy kiberbezopasnosti, no. 3(43) (2021): 11–19. http://dx.doi.org/10.21681/2311-3456-2021-3-11-19.

Abstract:
Purpose: to describe a two-party computation of fuzzy linear regression with horizontally partitioned data, while maintaining data confidentiality. Methods: the computation is designed using a transformational approach; the optimization problems of the two participants are transformed and combined into a common problem, whose solution can be found by one of the participants. Results: a protocol is proposed that allows two users to obtain a fuzzy linear regression model based on the combined data. Each user has a set of observational data containing the values of the explanatory variables and the values of the response variable. The data structure is shared: both users use the same set of explanatory variables and a common criterion. Regression coefficients are sought as symmetric triangular fuzzy numbers by solving the corresponding linear programming problem. It is assumed that both users are semi-honest (honest but curious, or passive and curious), i.e., they execute the protocol but may try to extract information about the partner's source data by applying arbitrary processing methods, not provided for by the protocol, to the received data. The protocol describes the transformed linear programming problem, whose solution can be found by one of the users. The number of observations of each user is known to both users; the observation data remain confidential. The correctness of the protocol is proved and its security is justified. Keywords: fuzzy numbers, collaborative solution of a linear programming problem, two-party computation, transformational approach, cloud computing, federated machine learning.
8

Gilles, Jean-François, and Thomas Boudier. "TAPAS: Towards Automated Processing and Analysis of multi-dimensional bioimage data." F1000Research 9 (July 2, 2021): 1278. http://dx.doi.org/10.12688/f1000research.26977.2.

Abstract:
Modern microscopy is based on reproducible quantitative analysis: image data should be batch-processed by a standardized system that can be shared and easily reused by others. Furthermore, such a system should require no or minimal programming from the users. We developed TAPAS (Towards an Automated Processing and Analysis System). The goal is to design an easy system for describing and exchanging processing workflows. The protocols are simple text files comprising a linear list of commands used to process and analyse the images. An extensive set of 60 modules is already available, mostly based on the tools proposed in the 3D ImageJ Suite. We propose a wizard, called TAPAS menu, to help the user design the protocol by listing the available modules and their associated parameters. Most modules have default parameter values for the most common tasks. Once the user has designed the protocol, he/she can apply it to a set of images stored either locally or in an OMERO database. Extensive documentation, including the list of modules, various tutorials, and a link to the source code, is available at https://imagej.net/TAPAS.
9

Gilles, Jean-François, and Thomas Boudier. "TAPAS: Towards Automated Processing and Analysis of multi-dimensional bioimage data." F1000Research 9 (October 28, 2020): 1278. http://dx.doi.org/10.12688/f1000research.26977.1.

Abstract:
Modern microscopy is based on reproducible quantitative analysis: image data should be batch-processed by a standardized system that can be shared and easily reused by others. Furthermore, such a system should require no or minimal programming from the users. We developed TAPAS (Towards an Automated Processing and Analysis System). The goal is to design an easy system for describing and exchanging processing workflows. The protocols are simple text files comprising a linear list of commands used to process and analyse the images. An extensive set of 60 modules is already available, mostly based on the tools proposed in the 3D ImageJ Suite. We propose a wizard, called TAPAS menu, to help the user design her protocol by listing the available modules and their associated parameters. Most modules have default parameter values for the most common tasks. Once the user has designed her protocol, she can apply it to a set of images stored either locally or in an OMERO database. Extensive documentation, including the list of modules, various tutorials, and a link to the source code, is available at https://imagej.net/TAPAS.
10

Pan, Ying Hui, Shi Yin Fang, and Bo Guo. "An Algorithm of Adaptive Variable Spacing for Arc Fitting Non-Circular Curve." Advanced Materials Research 753-755 (August 2013): 1960–65. http://dx.doi.org/10.4028/www.scientific.net/amr.753-755.1960.

Abstract:
Numerical Control (NC) systems provide only linear and circular interpolation functions, so non-circular curves must be fitted with arcs before being processed. There are many algorithms for fitting curves with arcs, but most have shortcomings: the numbers of fitting data points and arc segments are too large, error control during arc fitting is difficult, and the resulting programs struggle to meet real-time NC processing requirements. This paper proposes a new algorithm that quickly and automatically determines the best nodes for fitting a curve with arcs according to the change of curvature and the machining error, ensuring machining precision and efficiency. The algorithm can be programmed and executed directly with the NC's macro program function, without any other auxiliary software. The detailed algorithm process and the application programming are given.
11

Mauluddin, Yusuf. "Three-stage flow-shop scheduling model with batch processing machine and discrete processing machine." MATEC Web of Conferences 197 (2018): 14002. http://dx.doi.org/10.1051/matecconf/201819714002.

Abstract:
This study discusses the development of a three-stage flow-shop scheduling model involving a Batch Processing Machine (BPM) and Discrete Processing Machines (DPM). Each job passes through several stages of the production process: the first stage on DPM-a, the second stage on the BPM, and the third stage on DPM-b. Jobs are processed sequentially on DPM-a and DPM-b; on the BPM, all jobs in a batch are processed simultaneously. The capacity of the BPM is determined by a maximum weight. The study uses a mathematical modeling approach, and the resulting model is a Mixed Integer Linear Programming (MILP) model whose objective function minimizes total completion time. The model is tested using numerical examples with several data scenarios. The results showed that the model produced the optimum solution and a unique schedule pattern. Future research can formulate a heuristic model.
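The batch machine's weight-capacity constraint can be sketched as a tiny 0/1 MILP (a knapsack-style selection of which jobs share one batch). This is not the paper's scheduling model, and every job weight and value below is invented for illustration:

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

values = np.array([10.0, 7.0, 4.0, 2.0])   # benefit of batching each job now
weights = np.array([5.0, 4.0, 3.0, 1.0])   # job weights
capacity = 8.0                              # maximum batch weight of the BPM

c = -values                                 # milp minimizes, so negate
weight_limit = LinearConstraint(weights.reshape(1, -1), ub=capacity)
res = milp(c, constraints=weight_limit,
           integrality=np.ones(4),          # all variables integer
           bounds=Bounds(0, 1))             # ... and binary
picked = res.x.round().astype(int)
print("jobs in batch:", picked, "value:", -res.fun)
```

The full MILP in the paper would add sequencing variables for the discrete machines; the point here is only how an integrality-constrained LP is posed and solved.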
12

Lin, Tsong Der, and Alan C. Lin. "Using Interpolation Points to Rectify Cutter Paths of Five-Axis Machining." Applied Mechanics and Materials 275-277 (January 2013): 2629–34. http://dx.doi.org/10.4028/www.scientific.net/amm.275-277.2629.

Abstract:
When executing the 5-axis linear cutting command G01, the controller automatically inserts point data between two NC blocks in a linearly distributed fashion and then drives the cutter along these inserted points. If simultaneous 5-axis machining is initiated, CL data is transformed one-to-one into NC data, and cutting is performed directly without the 5-axis controller inserting any points between NC blocks, the processing path will deviate from the originally plotted path: it is no longer a straight line but a curve. This study proposes a method that adds CL interpolation points to automatically rectify the path deviation. The scope of this study comprises: (1) transforming CL data into NC data; (2) programming tool orientations; (3) obtaining all CL interpolation points and their tool orientations; and (4) generating NC data.
13

Gong, Yu Sheng, Li Ping Zhang, and Qian Han. "Research into the Method of Accuracy Optimization in Survey Programming of MATLAB." Advanced Materials Research 446-449 (January 2012): 3150–54. http://dx.doi.org/10.4028/www.scientific.net/amr.446-449.3150.

Abstract:
In survey data processing, the problem to be solved can usually be reduced to a linear system of the form Ax = b, but in some circumstances the system may be ill-conditioned, and if conventional MATLAB methods are still used in programming, the computed solution may not be the true one. To solve this problem, a method of introducing symbolic calculation into the programming is proposed, in order to mitigate the ill-conditioning of the equations and improve the stability and accuracy of the parameters. Its superiority in GNSS elevation fitting has been demonstrated with a third-order polynomial curve-fitting method.
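The failure mode the abstract describes can be reproduced with the classic ill-conditioned Hilbert matrix: a naive floating-point solve of Ax = b loses accuracy, while exact rational arithmetic (a stand-in here for the symbolic computation the paper proposes; this sketch does not use the paper's data) recovers the true solution exactly:

```python
from fractions import Fraction
import numpy as np

n = 12
# Hilbert matrix in floating point; true solution is the all-ones vector.
A = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])
b = A.sum(axis=1)
print(f"cond(A) = {np.linalg.cond(A):.1e}")
x_float = np.linalg.solve(A, b)
print("max float error:", np.abs(x_float - 1.0).max())

# Same system in exact rationals: Gaussian elimination, then back substitution.
M = [[Fraction(1, i + j + 1) for j in range(n)] for i in range(n)]
rhs = [sum(row) for row in M]
for k in range(n):
    for i in range(k + 1, n):
        f = M[i][k] / M[k][k]
        M[i] = [a - f * p for a, p in zip(M[i], M[k])]
        rhs[i] -= f * rhs[k]
x_exact = [Fraction(0)] * n
for k in reversed(range(n)):
    s = sum(M[k][j] * x_exact[j] for j in range(k + 1, n))
    x_exact[k] = (rhs[k] - s) / M[k][k]
print("exact solution all ones:", all(v == 1 for v in x_exact))
```

No pivoting is needed because the Hilbert matrix is positive definite; a general symbolic solver would of course pivot.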
14

Noskov, Sergey I. "CONSTRUCTION OF EXPERT-STATISTICAL MODELS FROM INCOMPLETE DATA." T-Comm 15, no. 6 (2021): 33–39. http://dx.doi.org/10.36724/2072-8735-2021-15-6-33-39.

Abstract:
The article deals with the problem of constructing a linear regression model based on incomplete data containing gaps, using statistical and expert information. The reasons for the gaps in the data can be, in particular, a temporary malfunction (failure) of the measuring equipment when taking various technical characteristics, or negligence in the work of statistical services when fixing the reporting indicators. Very often, gaps arise when processing various kinds of sociological information in the form of questionnaires, when respondents refuse to answer a specific question (but answer others) or give an inadmissible, in particular, evasive answer. The approach proposed in the work involves filling the gaps with intervals, the boundaries of which are formed by experts, guided by both their experience and knowledge about the object of research, and using the well-known methods of point filling in the gaps. After that, the estimation of the parameters of the model, depending on the nature of the initial uncertainty in the data, is reduced to solving problems of linear or partially Boolean linear programming. The case is considered when the solution of the formalizing uncertainty in the initial data of the interval system of linear algebraic equations is not unique. The problem of constructing a linear regression equation for the influence of the volume of export of large-tonnage containers and the freight turnover of the PRC railway transport on the volume of import of large-capacity containers at the Zabaikalsk-Manchuria railway checkpoint is solved.
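One standard way regression parameter estimation reduces to linear programming, as in the abstract's final step, is least-absolute-deviations fitting. The sketch below is a generic illustration with invented, perfectly linear data, not the paper's interval-filled formulation:

```python
import numpy as np
from scipy.optimize import linprog

# Fit y ≈ a + b*x minimizing sum |residual_i| as an LP.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 3.0, 5.0, 7.0, 9.0])   # exactly y = 1 + 2x
m = len(x)

# Variables: [a, b, e_0..e_{m-1}] where e_i bounds |y_i - a - b*x_i|.
c = np.concatenate([np.zeros(2), np.ones(m)])
X = np.column_stack([np.ones(m), x])
#  a + b*x_i - e_i <= y_i   and   -a - b*x_i - e_i <= -y_i
A_ub = np.block([[X, -np.eye(m)], [-X, -np.eye(m)]])
b_ub = np.concatenate([y, -y])
bounds = [(None, None)] * 2 + [(0, None)] * m

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
a, b = res.x[:2]
print("intercept:", a, "slope:", b)
```

Replacing the point data `y` with expert-supplied intervals, as the paper does, changes the constraints but not the overall LP reduction.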
15

Archanjo, Gabriel A., and Fernando J. Von Zuben. "Genetic Programming for Automating the Development of Data Management Algorithms in Information Technology Systems." Advances in Software Engineering 2012 (July 5, 2012): 1–14. http://dx.doi.org/10.1155/2012/893701.

Abstract:
Information technology (IT) systems are present in almost all fields of human activity, with emphasis on processing, storage, and handling of datasets. Automated methods to provide access to data stored in databases have been proposed mainly for tasks related to knowledge discovery and data mining (KDD). However, for this purpose, the database is used only to query data in order to find relevant patterns associated with the records. Processes modelled on IT systems should manipulate the records to modify the state of the system. Linear genetic programming for databases (LGPDB) is a tool proposed here for automatic generation of programs that can query, delete, insert, and update records on databases. The obtained results indicate that the LGPDB approach is able to generate programs for effectively modelling processes of IT systems, opening the possibility of automating relevant stages of data manipulation, and thus allowing human programmers to focus on more complex tasks.
16

Caswell-Midwinter, Benjamin, Elizabeth M. Doney, Meisam K. Arjmandi, Kelly N. Jahn, Barbara S. Herrmann, and Julie G. Arenberg. "The Relationship Between Impedance, Programming and Word Recognition in a Large Clinical Dataset of Cochlear Implant Recipients." Trends in Hearing 26 (January 2022): 233121652110609. http://dx.doi.org/10.1177/23312165211060983.

Abstract:
Cochlear implant programming typically involves measuring electrode impedance, selecting a speech processing strategy and fitting the dynamic range of electrical stimulation. This study retrospectively analyzed a clinical dataset of adult cochlear implant recipients to understand how these variables relate to speech recognition. Data from 425 implanted post-lingually deafened ears with Advanced Bionics devices were analyzed. A linear mixed-effects model was used to infer how impedance, programming and patient factors were associated with monosyllabic word recognition scores measured in quiet. Additional analyses were conducted on subsets of data to examine the role of speech processing strategy on scores, and the time taken for the scores of unilaterally implanted patients to plateau. Variation in basal impedance was negatively associated with word score, suggesting importance in evaluating the profile of impedance. While there were small, negative bivariate correlations between programming level metrics and word scores, these relationships were not clearly supported by the model that accounted for other factors. Age at implantation was negatively associated with word score, and duration of implant experience was positively associated with word score, which could help to inform candidature and guide expectations. Electrode array type was also associated with word score. Word scores measured with traditional continuous interleaved sampling and current steering speech processing strategies were similar. The word scores of unilaterally implanted patients largely plateaued within 6-months of activation. However, there was individual variation which was not related to initially measured impedance and programming levels.
17

Risfendra, Risfendra, Asfinaldi Asfinaldi, Habibullah Habibullah, and Julisardi Julisardi. "Sistem Pergerakan Robot Kiper Beroda Menggunakan Metode Wall Follower Berbasis Image Processing." ELKHA 12, no. 1 (October 10, 2020): 1. http://dx.doi.org/10.26418/elkha.v12i1.35245.

Abstract:
One of the Indonesian Robot Contest divisions is the Indonesian wheeled soccer robot contest. There are three players, called the striker, defense, and goalkeeper robots, which are driven by wheels controlled according to the three aforementioned positions. This study aims to build a goalkeeper robot equipped with image processing to detect the ball using a camera sensor installed in the robot system. The image processing is constructed using the Python programming language with the OpenCV library. The results of the image processing are used as input data controlled by an Arduino Mega 2560, which is connected serially to the PC's USB port. The results show that the maximum linear velocity that can be achieved is 1.59 m/s. Furthermore, the efficiency ratio of the analysed data to the actual distance is 86.77%.
18

de Lima, Roberto X., Ernesto F. Nobre Júnior, and Pedro G. P. S. Fernandes. "Optimization of Earthmoving Operations Planning: A Novel Approach Considering Interferences." Journal of Engineering, Project, and Production Management 11, no. 2 (October 20, 2020): 158–68. http://dx.doi.org/10.2478/jeppm-2021-0016.

Abstract:
The purpose of this paper is to present an optimization model for planning the distribution of materials in earthmoving operations, considering possible interferences between cut-and-fill sections such as rivers, vegetation, topographical features, or expropriations. The earth allocation problem incorporating interferences was modeled as a linear programming problem, aiming to minimize the total earthmoving cost while considering the constraints related to volume balance, construction project duration, and time for the release of traffic. The proposed linear programming model was run by an integrated system, using Excel for data analysis and IBM CPLEX as the optimizer. The mathematical model was evaluated by a sensitivity analysis and validated by a real-world project of a dam access road in the state of Ceará, Brazil. The unit costs and productivity rates used in the fictional example and in the real-world application followed the referential cost system created by Ceará’s Secretariat of Infrastructure (SEINFRA-CE). The proposed optimization model achieved reasonable processing times for all tested applications, presenting itself as a viable and efficient option for planning earthmoving operations. Furthermore, the linear programming approach provided a 2.12% cost reduction for the real-world case study, when comparing the optimized solution and original budget. This study explored the problem of earth allocation with interferences using a linear programming approach, while avoiding complex modeling issues found in recent literature. As a result, this paper proposes a user-friendly optimization system that can be easily utilized by construction companies and departments.
19

Asadov, M. M., T. S. Abbasova, and E. N. Alieva. "Analysis of non-linear processes in the primary processing unit and calculation of gasoline fraction properties." Azerbaijan Oil Industry, no. 3 (March 15, 2020): 44–47. http://dx.doi.org/10.37474/0365-8554/2020-3-44-47.

Abstract:
The paper analyzes the operation of the atmospheric block of the electric demineralization unit in primary oil processing (ELOU-AVT-6). A linear programming model was selected for the linear processes, and an algorithm for calculating the block's optimum operating modes was developed. The processes in the atmospheric block were treated as non-linear problems and solved using gradient and optimization methods. Experimental data on the gasoline fraction were used to select the step size and constraint conditions in the gradient method. A mathematical model and control algorithm were developed, with the calculation input parameters generated from experiment. Considering the limitations on the gasoline fraction yield, an optimum value of 34.67% was obtained. The initial boiling temperatures of the gasoline fractions were determined under laboratory conditions. According to the technical conditions, the boiling point of the gasoline fraction should be 40-60 °C; the value obtained from the calculated-experimental data is 70 °C. Moreover, the fraction's boiling should end in the interval of 180-190 °C; from the calculated-experimental data it is 180 °C.
20

Sedona, Rocco, Gabriele Cavallaro, Jenia Jitsev, Alexandre Strube, Morris Riedel, and Jón Benediktsson. "Remote Sensing Big Data Classification with High Performance Distributed Deep Learning." Remote Sensing 11, no. 24 (December 17, 2019): 3056. http://dx.doi.org/10.3390/rs11243056.

Abstract:
High-Performance Computing (HPC) has recently been attracting more attention in remote sensing applications due to the challenges posed by the increased amount of open data that are produced daily by Earth Observation (EO) programs. The unique parallel computing environments and programming techniques that are integrated in HPC systems are able to solve large-scale problems such as the training of classification algorithms with large amounts of Remote Sensing (RS) data. This paper shows that the training of state-of-the-art deep Convolutional Neural Networks (CNNs) can be efficiently performed in distributed fashion using parallel implementation techniques on HPC machines containing a large number of Graphics Processing Units (GPUs). The experimental results confirm that distributed training can drastically reduce the amount of time needed to perform full training, resulting in near linear scaling without loss of test accuracy.
21

Makarichev, Victor, and Vyacheslav Kharchenko. "Application of dynamic programming approach to computation of atomic functions." RADIOELECTRONIC AND COMPUTER SYSTEMS, no. 4 (November 29, 2021): 36–45. http://dx.doi.org/10.32620/reks.2021.4.03.

Abstract:
The special class of atomic functions is considered. An atomic function is a compactly supported solution of a linear differential functional equation with constant coefficients and linear transformations of the argument. The functions considered are used in discrete atomic compression (DAC) of digital images. The DAC algorithm is lossy and provides better compression than JPEG, the de facto standard for compression of digital photos, at the same result quality. High-precision values of atomic functions can improve the efficiency of DAC and support the development of new technologies for data processing and analysis. This paper aims to develop a low-complexity algorithm for computing precise values of the atomic functions considered. Precise values of atomic functions at the points of dense grids are the subject matter of this paper. Formulas of V. O. Rvachev and their generalizations are used; applying them directly to computation on dense grids leads to multiple calculations of a great number of similar expressions, which should be reduced. In this research, the required reduction is provided. The goal is to develop an algorithm based on V. O. Rvachev’s formulas and their generalizations. The following tasks are solved: converting these formulas to reduce the number of arithmetic operations, and developing a verification procedure for checking the results. Methods of atomic function theory and principles of dynamic programming algorithm development are applied. A numerical scheme is obtained for computing atomic functions at the points of a grid whose step is smaller than any predetermined positive real number, and a dynamic algorithm based on it is developed. A verification procedure based on the properties of atomic functions is also introduced.
The following results are obtained: 1) the algorithm developed provides faster computation than direct application of the corresponding formulas; 2) the algorithm proposed provides precise computation of atomic functions values; 3) procedure of verification has linear complexity in the number of values to be checked. Moreover, the algorithms proposed are implemented using Python programming language and a set of tables of atomic functions values are obtained. Conclusions: results of this research are expected to improve existing data processing technologies based on atomic functions, especially the algorithm DAC, and accelerate the development of new ones.
APA, Harvard, Vancouver, ISO, and other styles
22

Chen, Jiahao, Yujiao Jiang, and Guang Wang. "Bifuzzy-Bilevel Programming Model: Solution and Application." Symmetry 13, no. 9 (August 26, 2021): 1572. http://dx.doi.org/10.3390/sym13091572.

Full text
Abstract:
Bi-level programming is widely used for various problems, but it cannot deal with the complex and fuzzy information those problems contain. To better solve problems with intricate and vague information, which bifuzzy theory handles efficiently, a bifuzzy–bilevel programming model whose parameters are set to bifuzzy variables is proposed in this paper; it can process complex realistic data more accurately and improves the feasibility and validity of bi-level programming models. To ensure the solvability of the model, an equivalent form of the bifuzzy–bilevel programming model is obtained using the expected value operator. Depending on whether the model is linear or nonlinear, the Karush–Kuhn–Tucker condition or the particle swarm optimization algorithm, respectively, is employed to handle the problem. Finally, taking a supplier's distribution center location problem as an example, the bifuzzy–bilevel programming model is applied in practice to balance highly intricate customer demands against corporate cost minimization; feasible solutions of the upper- and lower-level functions are obtained, and the bifuzzy information in the problem is processed well, which proves the effectiveness of the proposed methodology.
APA, Harvard, Vancouver, ISO, and other styles
23

Golikov, Ruslan Yu. "Piecewise linear approximation of a highly noisy signal waveform using least squares method." Journal Of Applied Informatics 17, no. 5 (October 21, 2022): 116–24. http://dx.doi.org/10.37791/2687-0649-2022-17-5-116-124.

Full text
Abstract:
The growing use of computer technology makes digital signal processing (DSP) techniques for signals converted into numerical data sets particularly relevant. For the most part, such techniques are quite complex, and their use is not always justified for a wide range of applications. This sustains interest in heuristic algorithms that are based on simplified approaches and quickly yield approximate estimates with the least amount of work. This paper discusses a method for the mathematical processing of a pulsed (single) aperiodic signal with a high level of noise by approximating its shape with a piecewise linear function whose parameters are determined using the method of least squares. A brief justification of the method is given, based on an analysis of the stochastic nature of the noise component. A numerical analysis of the signal's spectral composition before and after processing is performed, as well as a comparison with other common methods: filtering and coherent averaging. It is shown that piecewise linear approximation of the waveform can effectively separate the useful signal from the noise component, does not require complex algorithmic constructions, and can be implemented in any high-level programming language. The developed method is applicable to all types of signals and is most effective for processing single aperiodic pulses that cannot be repeated. The proposed approach can also be used in the educational process when studying the basics of programming, and for solving economic problems based on determining trend lines by parametric methods.
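The core idea of the abstract, fitting each segment of a noisy pulse with an ordinary least-squares line, can be sketched in a few lines of Python. This is a minimal illustration on synthetic data; the equally spaced segment boundaries are an assumption of the sketch, not necessarily the paper's scheme.

```python
import numpy as np

def piecewise_linear_fit(t, y, n_segments):
    """Approximate a noisy signal y(t) by a piecewise linear function.

    Each segment's slope and intercept are found by ordinary least
    squares, which averages out zero-mean noise within the segment.
    """
    bounds = np.linspace(0, len(t), n_segments + 1, dtype=int)
    fitted = np.empty_like(y, dtype=float)
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        a, b = np.polyfit(t[lo:hi], y[lo:hi], deg=1)  # least-squares line
        fitted[lo:hi] = a * t[lo:hi] + b
    return fitted

# Noisy single aperiodic pulse: exponential decay plus Gaussian noise
rng = np.random.default_rng(0)
t = np.linspace(0, 5, 500)
clean = np.exp(-t)
noisy = clean + rng.normal(0, 0.3, t.size)
approx = piecewise_linear_fit(t, noisy, n_segments=10)

# The piecewise fit should lie much closer to the clean signal than the raw data
print(np.mean((approx - clean) ** 2) < np.mean((noisy - clean) ** 2))
```

With 50 samples per segment, the least-squares averaging cuts the noise variance by roughly the segment length, at the cost of a small bias where the pulse is strongly curved.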
APA, Harvard, Vancouver, ISO, and other styles
24

Zhao, Hui, Huaxiao Yan, Congwang Zhang, Xiaodong Liu, Yanhui Xue, Yingyun Qiao, Yuanyu Tian, and Song Qin. "Pyrolytic Characteristics and Kinetics of Phragmites australis." Evidence-Based Complementary and Alternative Medicine 2011 (2011): 1–6. http://dx.doi.org/10.1155/2011/408973.

Full text
Abstract:
The pyrolytic kinetics of Phragmites australis was investigated using the thermogravimetric analysis (TGA) method with a linear temperature programming process under an inert atmosphere. Kinetic expressions for the degradation rate in the devolatilization and combustion steps have been obtained for P. australis with the Dollimore method. The values of the apparent activation energy, the most probable mechanism functions, and the corresponding preexponential factors were determined. The results show that the model agrees well with the experimental data and provides useful information for the design of pyrolytic processing systems using P. australis as feedstock to produce biofuel.
APA, Harvard, Vancouver, ISO, and other styles
25

Ding, Feng Qin, Yi Yu, and Zhi Yi Miao. "The Research of Linear Rotation-Axis Control Machining CNC Rotary Cam Surface Transformation." Applied Mechanics and Materials 163 (April 2012): 138–42. http://dx.doi.org/10.4028/www.scientific.net/amm.163.138.

Full text
Abstract:
A general-purpose CNC milling machine was specially retrofitted so that, without changing the internal control software of its mid- or low-end numerical control system, it gains a linked linear-and-rotary motion capability that the original system lacked. By converting the numerical control of one linear axis into control of a rotary axis, the two-axis linear linkage of the CNC system is cleverly extended, thereby enabling CNC machining of the track surface of a cylindrical rotary cam. Using a part that is difficult to machine on a CNC mill as an application example, the paper explains the design principles and methods of the milling machine retrofit, as well as the conversion of design parameters into programming data, and analyzes the errors arising from the data conversion together with the measures for eliminating them.
APA, Harvard, Vancouver, ISO, and other styles
26

Gibb, Christopher M., Robert Jackson, Sabah Mohammed, Jinan Fiaidhi, and Ingeborg Zehbe. "Pathogen–Host Analysis Tool (PHAT): an integrative platform to analyze next-generation sequencing data." Bioinformatics 35, no. 15 (December 18, 2018): 2665–67. http://dx.doi.org/10.1093/bioinformatics/bty1003.

Full text
Abstract:
Abstract Summary The Pathogen–Host Analysis Tool (PHAT) is an application for processing and analyzing next-generation sequencing (NGS) data as it relates to relationships between pathogens and their hosts. Unlike custom scripts and tedious pipeline programming, PHAT provides an integrative platform encompassing raw and aligned sequence and reference file input, quality control (QC) reporting, alignment and variant calling, linear and circular alignment viewing, and graphical and tabular output. This novel tool aims to be user-friendly for life scientists studying diverse pathogen–host relationships. Availability and implementation The project is available on GitHub (https://github.com/chgibb/PHAT) and includes convenient installers, as well as portable and source versions, for both Windows and Linux (Debian and RedHat). Up-to-date documentation for PHAT, including user guides and development notes, can be found at https://chgibb.github.io/PHATDocs/. We encourage users and developers to provide feedback (error reporting, suggestions and comments).
APA, Harvard, Vancouver, ISO, and other styles
27

Nolé, María Luisa, David Soler, Juan Luis Higuera-Trujillo, and Carmen Llinares. "Optimization of the Cognitive Processes in a Virtual Classroom: A Multi-objective Integer Linear Programming Approach." Mathematics 10, no. 7 (April 5, 2022): 1184. http://dx.doi.org/10.3390/math10071184.

Full text
Abstract:
A fundamental problem in the design of a classroom is to identify what characteristics it should have in order to optimize learning. This is a complex problem because learning is a construct related to several cognitive processes. The aim of this study is to maximize learning, represented by the processes of attention, memory, and preference, depending on six classroom parameters: height, width, color hue, color saturation, color temperature, and illuminance. Multi-objective integer linear programming with three objective functions and 56 binary variables was used to solve this optimization problem. Virtual reality tools were used to gather the data; novel software was used to create variations of virtual classrooms for a sample of 112 students. Using an interactive method, more than 4700 integer linear programming problems were optimally solved to obtain 13 efficient solutions to the multi-objective problem, which allowed the decision maker to analyze all the information and make a final choice. The results showed that achieving the best cognitive processing performance involves using different classroom configurations. The use of a multi-objective interactive approach is interesting because in human behavioral studies, it is important to consider the judgement of an expert in order to make decisions.
APA, Harvard, Vancouver, ISO, and other styles
28

Mohd Azman, Nur Zafira, Nurul Akmal Mohamed, Nurul Farihan Mohamed, and Muzirah Musa. "Application of the simplex method on profit maximization in Baker's Cottage." Indonesian Journal of Electrical Engineering and Computer Science 27, no. 2 (August 1, 2022): 1034. http://dx.doi.org/10.11591/ijeecs.v27.i2.pp1034-1042.

Full text
Abstract:
Linear programming is an operational research technique widely used to identify and optimize management decisions, and its application helps businesses increase their output. In practice, however, many organizations still rely on trial and error, and as a result find it challenging to distribute scarce resources in a manner that maximizes profit. This study focuses on implementing linear programming to optimize the profit of a manufacturing unit based on the optimized (best possible, efficient) use of raw materials. The study uses data gathered from Baker's Cottage reports on five market bread types: chicken floss, spicy floss, Frank Cheese, Mexico bun, and doughnut. The problem was formulated mathematically as a linear programming problem and solved using Excel software. The result showed that the Baker's Cottage unit had to produce 332 loaves of Chicken Floss and 196 loaves of Frank Cheese, as these products contributed most to the profit. In contrast, the other types of bread did not have to be produced, as their values turned to zero to achieve the maximum monthly profit.
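A small sketch of the kind of product-mix linear program the abstract describes. The profit and resource coefficients below are invented for illustration (the paper's data are not given in the abstract) and were chosen so that the optimum reproduces the reported mix of 332 and 196 loaves; a tiny vertex-enumeration solver stands in for Excel's simplex routine, since the simplex method's optimum always lies at a vertex of the feasible region.

```python
import numpy as np
from itertools import combinations

# Hypothetical two-product illustration: maximize profit p.x subject to
# resource constraints A x <= b and x >= 0.
p = np.array([2.0, 3.0])          # profit per loaf: Chicken Floss, Frank Cheese
A = np.array([[1.0, 2.0],         # flour (kg) per loaf
              [3.0, 1.0]])        # labour (min) per loaf
b = np.array([724.0, 1192.0])     # available flour and labour (invented)

# The simplex method walks the vertices of the feasible polytope; for a
# tiny problem we can enumerate all constraint intersections directly.
G = np.vstack([A, -np.eye(2)])    # encode x >= 0 as -x <= 0
h = np.concatenate([b, np.zeros(2)])

best_x, best_profit = None, -np.inf
for i, j in combinations(range(len(G)), 2):
    M = G[[i, j]]
    if abs(np.linalg.det(M)) < 1e-12:
        continue                              # parallel constraints: no vertex
    x = np.linalg.solve(M, h[[i, j]])
    if np.all(G @ x <= h + 1e-9):             # keep only feasible vertices
        profit = p @ x
        if profit > best_profit:
            best_x, best_profit = x, profit

print(best_x, best_profit)  # optimum at the intersection of both resource limits
```

Here the optimal mix is x = (332, 196) with profit 1252: both resource constraints are binding, and producing anything else leaves slack capacity that earns less.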
APA, Harvard, Vancouver, ISO, and other styles
29

Haddad, Henrique Moreira Dabien, Lucas Rezende Gomide, Bruno Rogério Cruz, and Sérgio Teixeira Da Silva. "AN INTEGER LINEAR PROGRAMMING APPROACH APPLIED TO THE CERRADO (SAVANNA) MANAGEMENT." FLORESTA 44, no. 1 (January 8, 2014): 1. http://dx.doi.org/10.5380/rf.v44i1.31055.

Full text
Abstract:
Cerrado presents great potential for the use of its resources, whether timber or non-timber, such as fruits, firewood and charcoal. This study aimed to test the use of the type I forest regulation model in a cerrado remnant, applying integer linear programming. The studied area was a remnant of cerrado sensu stricto located in São Romão, MG. The forest management regime applied was strip cutting, with subsequent conduction of the regeneration. The type I model was applied, generating 8 scenarios over a planning horizon of 14 years. The tested scenarios considered area control, volume control, and both controls in the same model, the objective function being to maximize the present value of revenues (PVR). After data processing it was observed that scenario 5 was the best, because it had the lowest amplitude of variation in the limits of the exploited area (425–575 ha/year) and volume (18,000–21,000 m³/year), presenting a PVR of R$4,004,561.58. It can be concluded that the developed models were able to regulate the volumetric yield in constant flows over the planning horizon, representing a promising alternative for the sustainable planning of the wood resources of the cerrado.
Keywords: mathematical programming; forest regulation; forest management.
Resumo: Uma abordagem aplicada da programação linear inteira no manejo do cerrado. O cerrado apresenta um grande potencial de uso de seus recursos, sejam eles madeireiros ou não madeireiros, como frutos, lenha e carvão vegetal. Assim, o trabalho teve como objetivo testar o uso do modelo tipo I de regulação florestal em um remanescente de cerrado utilizando a programação linear inteira. A área de estudo foi um remanescente de cerrado sensu stricto localizado em São Romão, MG. O regime de manejo elaborado foi o corte em faixas com posterior condução da regeneração. O modelo tipo I foi aplicado gerando 8 cenários em um horizonte de planejamento de 14 anos. Os cenários testados consideraram o controle por área, por volume, bem como ambos no mesmo modelo, sendo a função objetivo o valor presente das receitas (VPR) a ser maximizado. Após o processamento dos modelos observou-se que o cenário 5 foi o melhor, possuindo limites de área explorada (425–575 ha/ano) e volumétricos (18.000–21.000 m³/ano) com menor variação de amplitude, apresentando R$4.004.561,58 de VPR. Pôde-se concluir que os modelos formulados foram capazes de regular a produção volumétrica em fluxos constantes ao longo do horizonte de planejamento, em sua grande maioria, constituindo-se de uma alternativa promissora ao planejamento sustentável dos recursos madeireiros do cerrado.
Palavras-chave: programação matemática; regulação florestal; manejo florestal.
APA, Harvard, Vancouver, ISO, and other styles
30

Zuo, Wen-Jin, Deng-Feng Li, and Gao-Feng Yu. "A GENERAL MULTI-ATTRIBUTE MULTI-SCALE DECISION MAKING METHOD BASED ON DYNAMIC LINMAP FOR PROPERTY PERCEIVED SERVICE QUALITY EVALUATION." Technological and Economic Development of Economy 26, no. 5 (June 23, 2020): 1052–73. http://dx.doi.org/10.3846/tede.2020.12726.

Full text
Abstract:
The scientific evaluation of property perceived service quality (PPSQ) needs multi-stage, multi-source, large-group perception information, making it a decision problem involving dynamic, heterogeneous and large-scale data processing. To address this problem, we propose a general multi-attribute multi-scale (MAMS) method based on the dynamic linear programming technique for multi-dimensional analysis of preference (LINMAP). In the dynamic LINMAP model, the classic MAMS matrix is introduced and extended into a general form. The dynamic LINMAP model is constructed by defining dynamic consistency and dynamic inconsistency, and the time-series weights are determined by the Orness method. The new method meets the requirements of modern PPSQ evaluation. Finally, we verify the feasibility and effectiveness of the dynamic LINMAP method by analyzing a PPSQ evaluation example. The new method improves on traditional PPSQ evaluation and provides a perspective on large-scale data processing with a classic decision method.
APA, Harvard, Vancouver, ISO, and other styles
31

Osipov, Pavel, and Arkady Borisov. "Practice of Web Data Mining Methods Application." Scientific Journal of Riga Technical University. Computer Sciences 40, no. 1 (January 1, 2009): 101–7. http://dx.doi.org/10.2478/v10143-010-0014-x.

Full text
Abstract:
Recent growth of information on the Internet imposes high demands on the effectiveness of processing algorithms. This paper discusses some algorithms from the field of Web Data Mining which have proved effective in many existing applications. The paper is divided into two logical parts: the first provides a theoretical description of the algorithms, while the second contains examples of their successful use in solving real problems. Algorithms for finding near-duplicate documents are currently in active use by all the leading search engines in the world. The paper describes the following algorithms: shingling, signature methods, and image-based algorithms. Such classification methods as fuzzy c-means (FCM) clustering and ant colony clustering (Standard Ant Clustering Algorithm, SACA) are considered. In conclusion, the paper describes the successful application of fuzzy clustering in conjunction with the software toolkit DataEngine to improve the efficiency of "BCI Bank", as well as the use of ant colony clustering in conjunction with linear genetic programming to increase the efficiency of predicting the load on the servers of the high-load Internet portal Monash Institut.
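The FCM algorithm mentioned above alternates two updates: cluster centres are recomputed as membership-weighted means, and memberships are recomputed from distances to the centres. A minimal sketch of the standard formulation follows; the data and parameters are illustrative only, not from the paper.

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, iters=100, seed=0):
    """Minimal fuzzy c-means: alternate membership and centre updates."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)            # memberships sum to 1 per point
    for _ in range(iters):
        W = U ** m                               # fuzzified memberships
        centres = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
        U = 1.0 / d ** (2.0 / (m - 1.0))         # standard FCM membership rule
        U /= U.sum(axis=1, keepdims=True)
    return centres, U

# Two well-separated blobs; FCM should place one centre near each mean
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.1, (50, 2)), rng.normal(5, 0.1, (50, 2))])
centres, U = fuzzy_c_means(X, c=2)
print(np.sort(np.round(centres[:, 0])))
```

Unlike hard k-means, each point keeps a graded membership in every cluster, which is what makes FCM attractive for the vague customer-segmentation data described in the article.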
APA, Harvard, Vancouver, ISO, and other styles
32

Kim, Junhyung Lyle, George Kollias, Amir Kalev, Ken Wei, and Anastasios Kyrillidis. "Fast Quantum State Reconstruction via Accelerated Non-Convex Programming." Photonics 10, no. 2 (January 22, 2023): 116. http://dx.doi.org/10.3390/photonics10020116.

Full text
Abstract:
We propose a new quantum state reconstruction method that combines ideas from compressed sensing, non-convex optimization, and acceleration methods. The algorithm, called Momentum-Inspired Factored Gradient Descent (MiFGD), extends the applicability of quantum tomography for larger systems. Despite being a non-convex method, MiFGD converges provably close to the true density matrix at an accelerated linear rate asymptotically in the absence of experimental and statistical noise, under common assumptions. With this manuscript, we present the method, prove its convergence property and provide the Frobenius norm bound guarantees with respect to the true density matrix. From a practical point of view, we benchmark the algorithm performance with respect to other existing methods, in both synthetic and real (noisy) experiments, performed on the IBM’s quantum processing unit. We find that the proposed algorithm performs orders of magnitude faster than the state-of-the-art approaches, with similar or better accuracy. In both synthetic and real experiments, we observed accurate and robust reconstruction, despite the presence of experimental and statistical noise in the tomographic data. Finally, we provide a ready-to-use code for state tomography of multi-qubit systems.
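The driving idea of MiFGD, gradient descent on a low-rank factor with a momentum (acceleration) step, can be illustrated outside the quantum setting. The sketch below is a simplified stand-in, not the authors' algorithm: it recovers a rank-1 positive semidefinite matrix from random linear measurements, with the step size and momentum coefficient tuned by hand.

```python
import numpy as np

# Recover a rank-1 PSD matrix X* = u u^T from linear measurements
# y_i = <A_i, X*>, iterating on the factor U directly (factored gradient
# descent) with a momentum look-ahead step.
rng = np.random.default_rng(0)
n, m = 8, 120
u_true = rng.normal(size=(n, 1))
X_true = u_true @ u_true.T
A = rng.normal(size=(m, n, n))
A = (A + A.transpose(0, 2, 1)) / 2            # symmetric sensing matrices
y = np.einsum('mij,ij->m', A, X_true)

eta, mu = 2e-3, 0.8                            # hand-tuned step size and momentum
U = rng.normal(size=(n, 1)) * 0.1
Z = U.copy()                                   # extrapolated (look-ahead) iterate
for _ in range(3000):
    r = np.einsum('mij,ij->m', A, Z @ Z.T) - y          # measurement residuals
    grad = 4 * np.einsum('m,mij->ij', r, A) @ Z / m     # grad of ||A(ZZ^T)-y||^2/m
    U_new = Z - eta * grad
    Z = U_new + mu * (U_new - U)               # momentum step
    U = U_new

err = np.linalg.norm(U @ U.T - X_true) / np.linalg.norm(X_true)
print(err)
```

Working on the factor U instead of the full matrix keeps the iterate PSD and low-rank by construction, which is the non-convex trick that lets this scale to larger systems than convex semidefinite formulations.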
APA, Harvard, Vancouver, ISO, and other styles
33

Lu, Ming, Jun Jie Liu, and Xiao Zheng Han. "Linear Calibration of Oxygen Detection in Coal Mine Explosion Monitoring." Applied Mechanics and Materials 303-306 (February 2013): 161–65. http://dx.doi.org/10.4028/www.scientific.net/amm.303-306.161.

Full text
Abstract:
Mine fires continue to be one of the most enduring problems faced by the mining industry. The purpose of this study is to assess strategies for detecting the most important quantity in coal mine gas explosion-proof monitoring, the oxygen content, so as to best foresee and predict mine fires. The oxygen concentration within the atmosphere of a sealed gas-explosion-hazard safety zone should be less than 8%, and in accordance with the national standard the mine requires an oxygen detection accuracy of 1%. The performance of an oxygen content monitoring system is influenced by four elements: the gas line, the sensor, signal conditioning and processing, and the terminal handling system. In this work, using computer control technology, the oxygen content monitoring system first underwent a hardware-based linear open-loop calibration and then a software-based linear closed-loop calibration; finally, the OriginPro data analysis system was used to analyze and interpret the sampling data. Programmed planning and real-time information correction of the software-based linear closed-loop calibration were achieved, so that an oxygen content monitoring sensitivity limit of less than 0.36% was obtained in the system.
APA, Harvard, Vancouver, ISO, and other styles
34

Rizki, Fido, M. Izman Herdiansyah, and Darius Antoni. "Model Optimasi Biaya Produksi Pada Jaringan Rantai Pasok Karet Rakyat Menggunakan Pemrograman Linier." JURNAL MEDIA INFORMATIKA BUDIDARMA 5, no. 2 (April 25, 2021): 447. http://dx.doi.org/10.30865/mib.v5i2.2805.

Full text
Abstract:
The purpose of this research is to optimize the smallholder rubber supply chain, using sample data from Kabupaten Musirawas. The main problems in the smallholder rubber plantations of Musirawas Regency that have not yet been resolved include a supply chain system that is neither optimal nor efficient. Based on the authors' observations and interviews at the Plantation Office of Kabupaten Musirawas, there are three supply chain channels in the regency: the first runs Planters - Village Collectors - Subdistrict Collectors - Bokar Processing Factory; the second, Planters - Subdistrict Collectors - Bokar Processing Factory; and the third, Planters - UPPB - Bokar Processing Factory. This study uses a linear programming method to determine the most efficient and profitable supply chain in Kabupaten Musirawas, with optimization testing carried out in the LINGO 13.0 software. The test results obtained with LINGO show that the optimal and efficient supply chain is channel 3.
APA, Harvard, Vancouver, ISO, and other styles
35

Burel, Gilles, Anthony Fiche, and Roland Gautier. "A Modulated Wideband Converter Model Based on Linear Algebra and Its Application to Fast Calibration." Sensors 22, no. 19 (September 28, 2022): 7381. http://dx.doi.org/10.3390/s22197381.

Full text
Abstract:
In the context of cognitive radio, smart cities and Internet-of-Things, the need for advanced radio spectrum monitoring becomes crucial. However, surveillance of a wide frequency band without using extremely expensive high sampling rate devices is a challenging task. The recent development of compressed sampling approaches offers a promising solution to these problems. In this context, the Modulated Wideband Converter (MWC), a blind sub-Nyquist sampling system, is probably the most realistic approach and was successfully validated in real-world conditions. The MWC can be realized with existing analog components, and there exist calibration methods that are able to integrate the imperfections of the mixers, filters and ADCs, hence allowing its use in the real world. The MWC underlying model is based on signal processing concepts such as filtering, modulation, Fourier series decomposition, oversampling and undersampling, spectrum aliasing, and so on, as well as in-flow data processing. In this paper, we develop an MWC model that is entirely based on linear algebra, matrix theory and block processing. We show that this approach has many interests: straightforward translation of mathematical equations into simple and efficient software programming, suppression of some constraints of the initial model, and providing a basis for the development of an extremely fast system calibration method. With a typical MWC acquisition device, we obtained a speed-up of the calibration computation time by a factor greater than 20 compared with a previous implementation.
APA, Harvard, Vancouver, ISO, and other styles
36

Supardi, Reno. "PENERAPAN METODE REGRESI LINEAR DALAM MEMPREDIKSI DATA PENJUALAN BARANG DI TOKO BANGUNAN VITA VIYA." Journal of Technopreneurship and Information System (JTIS) 3, no. 1 (April 4, 2020): 11–18. http://dx.doi.org/10.36085/jtis.v3i1.629.

Full text
Abstract:
Vita Viya lumberyard is a lumberyard located on Semangka Street, Bengkulu city. Until now, the processing of sales data at the lumberyard has been done conventionally, using the sales notes issued every day; the note data are then compiled into bookkeeping to find out how many goods are sold every month. Inventory management likewise only considers the remaining items available in the lumberyard, without considering their sales. A recurring problem at the Vita Viya lumberyard is inventory management, because building materials have a limited usage period after which they are no longer suitable for sale. An application for predicting sales data at the Vita Viya lumberyard was built in the Visual Basic.NET programming language by applying the linear regression method. The application can be used to predict the sales of goods in the following months based on the sales data of the last 2 years, from January 2017 to December 2018. These 2 years of data serve as the trend data for determining the predicted value of goods sold 1 year into the future. The testing that has been done shows that the application is able to predict the sale of goods for the next year based on previously entered sales data. Based on error testing using the MAPE and MSE methods, the MAPE of the predictions obtained from the linear regression method was 0.1244575, while the MSE was 14.25. Keywords: linear regression method; prediction; Vita Viya lumberyard.
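The prediction scheme described, a least-squares trend line fitted to 24 months of history and extrapolated 12 months ahead with MAPE reported as a fraction, can be sketched as follows. The monthly sales figures are invented for illustration; they are not the store's data.

```python
import numpy as np

# Hypothetical monthly sales for a 24-month history window (the abstract's
# Jan 2017 - Dec 2018): a rising trend plus noise.
months = np.arange(1, 25)
sales = 100 + 5 * months + np.random.default_rng(3).normal(0, 8, 24)

# Fit a least-squares trend line y = a*month + b
a, b = np.polyfit(months, sales, deg=1)

# Extrapolate the trend 12 months ahead (1 year into the future)
future = np.arange(25, 37)
forecast = a * future + b

# Error of the fit on the historical window
fitted = a * months + b
mape = np.mean(np.abs((sales - fitted) / sales))   # fraction, as in the abstract
mse = np.mean((sales - fitted) ** 2)
print(forecast.shape, mape, mse)
```

MAPE expresses the error relative to the actual values (scale-free), while MSE penalizes large misses in the original units squared, which is why the two figures in the abstract are on such different scales.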
APA, Harvard, Vancouver, ISO, and other styles
37

Lv, Peng Wei, and Jian Qing Xiao. "Parallelization and Locality Optimization Based the Polyhedral Model." Applied Mechanics and Materials 713-715 (January 2015): 2045–48. http://dx.doi.org/10.4028/www.scientific.net/amm.713-715.2045.

Full text
Abstract:
Current trends in microarchitecture are towards larger numbers of processing elements on a single chip, and it is challenging to tap the peak performance of such processors. The most promising way to address this issue is automatic parallelization, which does not demand much effort from the programmer in the process of parallelizing programs. The polyhedral model is a mathematical framework based on powerful integer linear programming; it provides an abstraction for representing nested loop computations and data-access dependences as integer points in a polyhedron. We propose an automatic transformation framework based on the polyhedral model to optimize nested loops with affine dependences for parallelism and locality.
APA, Harvard, Vancouver, ISO, and other styles
38

Ding, Haitao, Chu Sun, and Jianqiu Zeng. "Fuzzy Weighted Clustering Method for Numerical Attributes of Communication Big Data Based on Cloud Computing." Symmetry 12, no. 4 (April 3, 2020): 530. http://dx.doi.org/10.3390/sym12040530.

Full text
Abstract:
To improve numerical attribute mining of communication big data, the clustering of its numerical attribute feature information must be optimized; to this end, a big data clustering algorithm based on cloud computing is proposed. A cloud-extended distributed feature-fitting method is used for linear programming of the numerical attributes of communication big data, and the mutual-information features of the numerical attributes are extracted. Combining fuzzy C-means clustering with linear regression analysis, a statistical analysis of the numerical attribute features is carried out, and a sample set of associated attributes for the cloud-grid distribution of the numerical attributes is constructed. Cloud computing and adaptive quantitative recurrent classifiers are used for data classification, and block template matching is combined with multi-sensor information fusion to search for the clustering centre automatically and improve the convergence of the clustering. The simulation results show that, with this method, information fusion during clustering performs better, the automatic search for the data clustering centre is stronger, frequency-domain equalization control is effective, the bit error rate is low, energy consumption is small, and the fuzzy weighted clustering retrieval of the numerical attributes of communication big data is effectively improved.
APA, Harvard, Vancouver, ISO, and other styles
39

Yar, Morteza Husainy, Vahid Rahmati, and Hamid Reza Dalili Oskouei. "A Survey on Evolutionary Computation: Methods and Their Applications in Engineering." Modern Applied Science 10, no. 11 (August 9, 2016): 131. http://dx.doi.org/10.5539/mas.v10n11p131.

Full text
Abstract:
Evolutionary computation is now an inseparable branch of artificial intelligence: smart methods based on evolutionary algorithms, aimed at solving different real-world problems through natural procedures inspired by living creatures. It is based on randomized methods and on the regeneration, selection, modification, and replacement of data within a system such as a personal computer (PC), cloud, or other data center. This paper briefly surveys different evolutionary computation techniques used in several applications, specifically image processing, cloud computing, and grid computing. These methods are generally categorized as evolutionary algorithms and swarm intelligence; each of these subfields contains a variety of algorithms and techniques, which are presented with their applications. This work demonstrates the benefits of the field by presenting real-world applications of these methods that have already been implemented. Among these applications is the cloud computing scheduling problem, improved by genetic algorithms, ant colony optimization, and the bees algorithm. Other applications include improvement of grid load balancing, image processing, an improved bi-objective dynamic cell formation problem, robust machine cells for dynamic part production, integrated mixed-integer linear programming, robotic applications, and power control in wind turbines.
APA, Harvard, Vancouver, ISO, and other styles
40

Pticin, S. O., D. O. Zaytcev, D. A. Pavlov, and V. V. Shmelev. "Model of process of processing rapid telemetrated parameters in real time." Issues of radio electronics, no. 10 (December 16, 2020): 31–37. http://dx.doi.org/10.21778/2218-5453-2020-10-31-37.

Full text
Abstract:
The paper considers the generalized problem of processing telemetry data. The need to solve this problem in real time stems from the requirement of the operational processing stage to produce a report on the flight of rocket and space hardware. At the operational stage only 10% of the total number of telemetry parameters is processed; consequently, the results obtained are insufficient for an operational and reliable analysis of the technical condition of the on-board systems of rocket and space hardware. To eliminate this insufficiency, the completeness of the telemetry processing results must be increased. A conceptual formulation of the problem of processing rapidly changing telemetry parameters is given, taking into account the requirement to include such parameters in the operational processing stage. A model of processing rapidly changing parameters in real time is constructed, based on the method of discrete linear mathematical programming. Restrictions on processing time, processing nodes, and completeness of the processing result are defined. Taking these constraints into account, the Pareto-optimal set of acceptable solutions is determined, and a narrowing of this set based on nonlinear partial quality indicators is described. Conclusions are drawn about the expected results of the solution, as well as about further research.
41

Borissova, Daniela, Ivan Mustakerov, and Lyubka Doukovska. "Predictive Maintenance Sensors Placement by Combinatorial Optimization." International Journal of Electronics and Telecommunications 58, no. 2 (June 1, 2012): 153–58. http://dx.doi.org/10.2478/v10177-012-0022-6.

Abstract:
The strategy of predictive maintenance monitoring is important for successful detection of system damage. Maintenance monitoring utilizes dynamic response information to identify possible damage. The basic factors of fault detection analysis relate to the properties of the structure under inspection, the collection of signals, and appropriate signal processing. In vibration control, sensing of the structural response is limited by the number of sensors or the number of input channels of the data acquisition system. An essential problem in predictive maintenance monitoring is therefore optimal sensor placement. The paper addresses that problem by solving mixed integer linear programming tasks. The proposed optimal sensor location approach is based on the difference between the sensor information when a sensor is present and the information calculated by linear interpolation when it is not. The task results define the optimal sensor locations for a given number of sensors, reproducing the curve of the structure's dynamic response function as closely as possible. The proposed approach is implemented in an algorithm for predictive maintenance, and the numerical results indicate that, together with intelligent signal processing, it could be suitable for practical application.
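The placement criterion in the abstract (the gap between the measured response and its linear interpolation from the chosen sensors) can be sketched with a brute-force search. The paper solves a mixed integer LP; exhaustive search and the response values below are stand-ins to keep the sketch self-contained.

```python
# Brute-force version of the sensor-placement criterion described above:
# choose k interior sensor positions so that linearly interpolating the
# dropped positions from the kept ones deviates least from the full
# response curve. The response samples are invented for illustration.
from itertools import combinations

def interp_error(response, kept):
    """Sum of |measured - linearly interpolated| over dropped positions."""
    kept = sorted(kept)
    err = 0.0
    for i in range(len(response)):
        if i in kept:
            continue
        left = max(p for p in kept if p < i)
        right = min(p for p in kept if p > i)
        frac = (i - left) / (right - left)
        estimate = response[left] + frac * (response[right] - response[left])
        err += abs(response[i] - estimate)
    return err

def best_placement(response, k):
    """Endpoints are always kept; choose k interior sensors minimizing error."""
    interior = range(1, len(response) - 1)
    ends = (0, len(response) - 1)
    return min(
        (ends + combo for combo in combinations(interior, k)),
        key=lambda kept: interp_error(response, kept),
    )

response = [0.0, 0.8, 1.5, 1.4, 0.9, 0.3, 0.0]  # sampled dynamic response
print(sorted(best_placement(response, 2)))
```

The MILP formulation in the paper scales to realistic structures where this exhaustive enumeration would not.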
42

Zhang, Yue, and Stephen Clark. "Syntactic Processing Using the Generalized Perceptron and Beam Search." Computational Linguistics 37, no. 1 (March 2011): 105–51. http://dx.doi.org/10.1162/coli_a_00037.

Abstract:
We study a range of syntactic processing tasks using a general statistical framework that consists of a global linear model, trained by the generalized perceptron together with a generic beam-search decoder. We apply the framework to word segmentation, joint segmentation and POS-tagging, dependency parsing, and phrase-structure parsing. Both components of the framework are conceptually and computationally very simple. The beam-search decoder only requires the syntactic processing task to be broken into a sequence of decisions, such that, at each stage in the process, the decoder is able to consider the top-n candidates and generate all possibilities for the next stage. Once the decoder has been defined, it is applied to the training data, using trivial updates according to the generalized perceptron to induce a model. This simple framework performs surprisingly well, giving accuracy results competitive with the state-of-the-art on all the tasks we consider. The computational simplicity of the decoder and training algorithm leads to significantly higher test speeds and lower training times than their main alternatives, including log-linear and large-margin training algorithms and dynamic-programming for decoding. Moreover, the framework offers the freedom to define arbitrary features which can make alternative training and decoding algorithms prohibitively slow. We discuss how the general framework is applied to each of the problems studied in this article, making comparisons with alternative learning and decoding algorithms. We also show how the comparability of candidates considered by the beam is an important factor in the performance. We argue that the conceptual and computational simplicity of the framework, together with its language-independent nature, make it a competitive choice for a range of syntactic processing tasks and one that should be considered for comparison by developers of alternative approaches.
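The two components of the framework (a beam-search decoder over incremental decisions and a generalized perceptron update) can be sketched on a toy tagging task. The feature templates, tag set, and training data below are invented for illustration; the paper's feature sets are far richer.

```python
# Toy sketch of the framework described above: beam search keeps the top-n
# partial tag sequences scored by a global linear model; the perceptron
# rewards gold features and penalizes predicted ones when decoding errs.
from collections import defaultdict

TAGS = ["N", "V"]

def features(word, prev_tag, tag):
    return [("w-t", word, tag), ("bigram", prev_tag, tag)]

def decode(words, weights, beam_size=2):
    """Keep the top-n partial tag sequences at each step."""
    beam = [((), 0.0)]
    for word in words:
        expanded = []
        for seq, score in beam:
            prev = seq[-1] if seq else "<s>"
            for tag in TAGS:
                gain = sum(weights[f] for f in features(word, prev, tag))
                expanded.append((seq + (tag,), score + gain))
        beam = sorted(expanded, key=lambda x: -x[1])[:beam_size]
    return list(beam[0][0])

def perceptron_update(words, gold, weights):
    """Trivial update: +1 for gold features, -1 for predicted features."""
    pred = decode(words, weights)
    if pred == gold:
        return
    for i, word in enumerate(words):
        for f in features(word, gold[i - 1] if i else "<s>", gold[i]):
            weights[f] += 1.0
        for f in features(word, pred[i - 1] if i else "<s>", pred[i]):
            weights[f] -= 1.0

weights = defaultdict(float)
sentence, gold = ["dogs", "bark"], ["N", "V"]
for _ in range(5):  # a few perceptron iterations
    perceptron_update(sentence, gold, weights)
print(decode(sentence, weights))
```

As the abstract notes, only the decomposition into per-step decisions is task-specific; segmentation or parsing plug into the same decode/update loop.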
43

Zhao, Ligang, Guofeng Xia, Yuhu Shi, and Aisheng Wu. "Three-dimensional topography simulation research of diamond-wire sawing based on MATLAB." Industrial Lubrication and Tribology 72, no. 3 (October 12, 2019): 325–31. http://dx.doi.org/10.1108/ilt-07-2019-0247.

Abstract:
Purpose: The purpose of this paper is to study the influence of the processing parameters of diamond wire sawing on surface morphology and roughness.
Design/methodology/approach: First, a wire saw cutting model is established to determine the positional relationship between a wire saw and the machined surface of the workpiece, and the abrasive grain cutting trajectory is generated. Through the data processing of the cutting trajectory, the simulation of the three-dimensional surface topography of the slice and the calculation of the surface roughness are realized using the GUI programming of MATLAB. Finally, different surface roughness values are obtained by changing the machining parameters (saw wire speed and workpiece feed speed).
Findings: The surface roughness of the slice is larger when the feed speed is higher and smaller when the linear speed is higher.
Originality/value: Diamond wire saw cutting is the first process of chip processing, and its efficiency and quality have an important impact on subsequent processing. This paper focuses on the influence of the sawing parameters (sawing wire speed and workpiece feed speed) on the surface roughness, in order to optimize the processing parameters and obtain smaller surface roughness values. Through MATLAB three-dimensional simulation, the surface morphology can be observed more intuitively, which provides a theoretical basis for improving the processing quality.
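The roughness-calculation step can be illustrated with the standard arithmetic mean roughness Ra, the mean absolute deviation of the profile from its mean line. The paper works in MATLAB on simulated grain trajectories; the sinusoidal profile below is a synthetic stand-in, with amplitude playing the role of feed per pass.

```python
# Illustration of the roughness step: Ra = mean |z - mean line|. The
# profile is synthetic; a larger amplitude mimics the paper's finding
# that a higher feed speed yields a rougher surface.
import math

def roughness_ra(profile):
    mean_line = sum(profile) / len(profile)
    return sum(abs(z - mean_line) for z in profile) / len(profile)

def profile(amplitude, n=1000):
    """Synthetic periodic profile standing in for the cutting trajectory."""
    return [amplitude * math.sin(2 * math.pi * i / 50) for i in range(n)]

slow_feed = roughness_ra(profile(amplitude=0.5))
fast_feed = roughness_ra(profile(amplitude=2.0))
print(slow_feed < fast_feed)  # larger effective feed -> larger Ra
```

For a pure sinusoid of amplitude A, Ra approaches 2A/π, which makes the scaling with amplitude easy to check.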
44

Papathanasiou, I., B. Manos, M. Vlachopoulou, and I. Vassiliadou. "A decision support system for farm regional planning." Yugoslav Journal of Operations Research 15, no. 1 (2005): 109–24. http://dx.doi.org/10.2298/yjor0501109p.

Abstract:
This paper presents a Decision Support System (DSS) for the planning of farm regions in Greece. The DSS is based on the development possibilities of the agricultural sector in relation to the agricultural processing industries of the region and aims at the development of farm regions through a better utilization of available agricultural resources and agricultural industries. The DSS uses Linear and Goal Programming models and provides, for different goals, alternative production plans that optimize the use of available resources. The alternative plans also achieve a better utilization of the existing agricultural processing industries or propose their expansion, taking into account the supply and demand of agricultural products in the region. The DSS is computerized and supported by a set of relational databases. The corresponding software has been developed on the Microsoft Windows platform, using Microsoft Visual Basic, Microsoft Access and LINDO. For demonstration purposes, the paper includes an application of the proposed DSS in the region of Servia Kozanis in Northern Greece.
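The LP core of such a DSS can be sketched with a toy two-crop allocation. The DSS itself solves much larger models with LINDO; here the two-variable LP is solved by enumerating vertices of the feasible region, and all coefficients are invented for illustration.

```python
# Toy farm-planning LP: allocate land between two crops to maximize gross
# margin subject to land and labour limits. Solved by enumerating the
# vertices of the feasible polygon (valid for 2 variables only).
from itertools import combinations

# maximize 300*x1 + 500*x2   (gross margin per ha of crop 1 and crop 2)
# subject to:  x1 +   x2 <= 100   (land, ha)
#             2*x1 + 5*x2 <= 320  (labour, person-days)
#             x1 >= 0, x2 >= 0
constraints = [
    (1.0, 1.0, 100.0),
    (2.0, 5.0, 320.0),
    (-1.0, 0.0, 0.0),   # -x1 <= 0
    (0.0, -1.0, 0.0),   # -x2 <= 0
]

def intersect(c1, c2):
    """Intersection point of two constraint boundary lines, or None."""
    a1, b1, r1 = c1
    a2, b2, r2 = c2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        return None
    return ((r1 * b2 - r2 * b1) / det, (a1 * r2 - a2 * r1) / det)

def feasible(pt):
    return all(a * pt[0] + b * pt[1] <= r + 1e-9 for a, b, r in constraints)

vertices = [
    p for c1, c2 in combinations(constraints, 2)
    if (p := intersect(c1, c2)) and feasible(p)
]
best = max(vertices, key=lambda p: 300 * p[0] + 500 * p[1])
print(best)  # the optimum sits at a vertex of the feasible region
```

The optimum here lands on the intersection of the land and labour constraints, the same "binding resources" reading a farm planner would take from the DSS output.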
45

Wang, Wei. "Algorithm for Key Classification Feature Selection of Big Data Based on Henie Theorem." International Journal of Circuits, Systems and Signal Processing 15 (August 31, 2021): 1208–13. http://dx.doi.org/10.46300/9106.2021.15.131.

Abstract:
With the extensive application of database systems, the available data of enterprises and individuals are expanding, and existing technology struggles to meet the data analysis requirements of the big data age. Therefore, key classification features of big data need to be selected. However, when key classification features are selected by current algorithms, the distance between samples cannot be given accurately, and there is a large classification error. To solve this problem, a key classification feature selection algorithm based on the Henie theorem is proposed. In this algorithm, a quadratic programming formulation first uses the weighted intra-class and inter-class distances as the quadratic and linear terms of the objective function, balancing the relationship between data features and the different categories. The optimized vector is used as the weight vector to measure the contribution of each feature to classification. According to feature importance, redundant features are gradually deleted, and the problem of selecting the key classification features of big data is fused, via the resolution principle, into the Henie theorem. The function limit and sequence limit of the key classification features are obtained, and on this basis the key classification features of big data are selected. Experimental simulation shows that the proposed algorithm has higher classification accuracy and can effectively meet the needs of data analysis in the era of big data.
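The selection idea (weight features by how well they separate classes relative to their within-class spread, then drop low-weight ones) can be sketched directly. The paper obtains the weight vector from a quadratic program; the direct score below is a simplification, and the toy data are invented.

```python
# Sketch of distance-based feature weighting: score each feature by the
# gap between its between-class mean distance and its within-class
# spread, then keep the top-k and drop the redundant rest.
def feature_scores(class_a, class_b):
    n_features = len(class_a[0])
    scores = []
    for j in range(n_features):
        a = [row[j] for row in class_a]
        b = [row[j] for row in class_b]
        mean_a, mean_b = sum(a) / len(a), sum(b) / len(b)
        between = abs(mean_a - mean_b)
        within = (sum(abs(x - mean_a) for x in a) / len(a)
                  + sum(abs(x - mean_b) for x in b) / len(b))
        scores.append(between - within)
    return scores

def select_features(class_a, class_b, k):
    scores = feature_scores(class_a, class_b)
    ranked = sorted(range(len(scores)), key=lambda j: -scores[j])
    return sorted(ranked[:k])

# Feature 0 separates the classes; feature 1 is noise; feature 2 is constant.
class_a = [[0.0, 5.0, 1.0], [0.2, 1.0, 1.0], [0.1, 3.0, 1.0]]
class_b = [[1.0, 2.0, 1.0], [1.2, 6.0, 1.0], [1.1, 4.0, 1.0]]
print(select_features(class_a, class_b, 1))
```

The noisy feature scores below zero (its within-class spread swamps the class gap), so it is the first to be discarded.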
46

Sun, Dajun, Zixuan Jia, Tingting Teng, and Cong Huang. "Strong coherent interference suppression based on second- order cone null steering beamforming." MATEC Web of Conferences 283 (2019): 07004. http://dx.doi.org/10.1051/matecconf/201928307004.

Abstract:
For active sonar systems, coherent interference such as the direct-path wave and strong multipath can mask the weak target echo signal, which is one of the main factors degrading target detection performance. Null steering beamforming is an important signal processing method for suppressing strong interference. In this paper, the second-order cone programming (SOCP) method is used to perform null steering beamforming on a uniform linear array: a beam null is formed in the direction of the interference to suppress strong coherent interference during active sonar detection. However, two limitations can cause a decline in algorithm performance. A norm-control method is proposed to limit the response near the end-fire direction of the linear array, and an array manifold compensation method is introduced to solve the array manifold mismatch. Simulated and experimental data were used to evaluate the performance and verify the feasibility of the methods.
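The null-steering idea itself can be sketched in pure Python for a uniform linear array: start from look-direction weights and project out the interference steering vector, which forces a beam-pattern null at the interference angle. The paper formulates this as an SOCP with additional norm constraints; the orthogonal projection below is a simpler stand-in, with made-up angles.

```python
# Pure-Python null-steering sketch for a half-wavelength uniform linear
# array: w = a_s minus its projection onto a_i, so <a_i, w> = 0 and the
# beam pattern has a null at the interference angle.
import cmath, math

N = 8                 # array elements
D_OVER_LAMBDA = 0.5   # element spacing in wavelengths

def steering(theta_deg):
    phase = 2 * math.pi * D_OVER_LAMBDA * math.sin(math.radians(theta_deg))
    return [cmath.exp(1j * n * phase) for n in range(N)]

def inner(u, v):
    """Hermitian inner product <u, v> = sum conj(u_k) v_k."""
    return sum(x.conjugate() * y for x, y in zip(u, v))

def null_steer(look_deg, null_deg):
    a_s, a_i = steering(look_deg), steering(null_deg)
    coef = inner(a_i, a_s) / inner(a_i, a_i)
    return [s - coef * i for s, i in zip(a_s, a_i)]

def response(w, theta_deg):
    return abs(inner(w, steering(theta_deg)))

w = null_steer(look_deg=0.0, null_deg=20.0)
print(response(w, 20.0))      # ~0: interference direction suppressed
print(response(w, 0.0) > 1.0)  # look direction preserved
```

The SOCP formulation in the paper additionally bounds the weight norm and the end-fire response, which this bare projection cannot do.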
47

Puspita, Fitri Maya, Bella Juwita Rezky, Arden Naser Yustian Simarmata, Evi Yuliza, and Yusuf Hartono. "Improved incentive pricing-based quasi-linear utility function of wireless networks." Indonesian Journal of Electrical Engineering and Computer Science 22, no. 3 (June 1, 2021): 1467. http://dx.doi.org/10.11591/ijeecs.v22.i3.pp1467-1475.

Abstract:
A model of an incentive pricing scheme based on a quasi-linear utility function in wireless networks is designed. Previous research seldom focuses on user satisfaction while using the network. Therefore, a model is set up that is derived from a modification of bundling and reverse-charging models and maintains the quality of service to users by utilizing a quasi-linear utility function. The pricing schemes are then applied to local data server traffic. The model is a mathematical programming problem that can be solved with the LINGO 13.0 program as an optimization tool to obtain the optimal solution. The optimal results show that the improved incentive pricing achieves a better solution than the original reverse charging, with models obtained for flat-fee, usage-based, and two-part tariff strategies for homogeneous consumers.
48

Zhdanov, Valery, Elena Logacheva, Viktor Yarosh, and Alexander Ivashina. "Optimisation of repair and maintenance costs for electrical equipment in agricultural enterprises." BIO Web of Conferences 37 (2021): 00103. http://dx.doi.org/10.1051/bioconf/20213700103.

Abstract:
The application of mathematical methods of cost optimization for the repair and maintenance of electrical equipment in agro-industrial enterprises is an important and promising direction for increasing the efficiency of equipment operation management in agriculture. Mathematical programming systems use graphical and related attributive information in solving optimization problems. As graphical information, these systems use maps, plans, diagrams and schedules of preventive measures, from which the list of equipment for certain types of repair and maintenance, its labor intensity for individual objects and types of equipment, and the totals for the enterprise are established. Databases of electrical equipment are used as attributive information to describe the electrical equipment of agro-industrial enterprises. Due to the joint processing of graphical and attributive information in optimization systems, all stages of work with spatial data become more efficient: from spatial data search, selection and analysis, a specific decision can be made during the operational control of electrical equipment. This article considers maintenance and repair (MR) as a mathematical programming task with cost optimization and deals with three approaches to organizing this task. The expediency of each solution method is analyzed, and the structural schemes, the equations describing the mathematical models, and the advantages and disadvantages of the presented models are given. The prospect of using linear programming software to solve the given optimization problem by means of the inverse matrix method, i.e. the modified simplex method, with a standard computational sequence of operations is noted.
49

Darmawan, Bagus. "SUMBER PENINGKATAN PRODUKTIVITAS PERUSAHAAN GARMEN DI INDONESIA DENGAN ADANYA PENANAMAN MODAL ASING PERIODE 2007-2013" [Sources of productivity improvement of garment companies in Indonesia in the presence of foreign direct investment, 2007-2013]. Jurnal Ekonomi dan Bisnis 22, no. 1 (November 1, 2017): 9–22. http://dx.doi.org/10.24123/jeb.v22i1.1642.

Abstract:
This study analyzes the sources of total factor productivity in relation to the existence of foreign direct investment in the Indonesian Textile Industry for the period 2007-2013. The Textile Industry has ISIC codes 170 and 130. This research applies quantitative analysis to survey data on medium- and large-scale manufacturing companies, collected by the Indonesian Central Bureau of Statistics. Data processing uses two methods: a non-parametric linear programming method and a panel regression method. The number of observed companies is 325 over seven years. The results indicate that the primary source of productivity improvement in the Textile Industry is Technical Efficiency Change (TEC); technological change increased over the period of observation. Foreign Direct Investment has a positive spillover impact on Technical Efficiency Change, but a negative impact on Scale Efficiency Change (SEC) and Technological Change (TC).
50

Addis, Addisu H., Hugh T. Blair, Paul R. Kenyon, Stephen T. Morris, and Nicola M. Schreurs. "Optimization of Profit for Pasture-Based Beef Cattle and Sheep Farming Using Linear Programming: Model Development and Evaluation." Agriculture 11, no. 6 (June 4, 2021): 524. http://dx.doi.org/10.3390/agriculture11060524.

Abstract:
A linear programming optimization tool is useful to assist farmers with optimizing resource allocation and profitability. This study developed a linear programming profit optimization model with a silage supplement scenario. Utilizable kilograms of pasture dry matter (kg DM) of the total pasture mass was derived using minimum and maximum pasture mass available for beef cattle and sheep and herbage utilization percentage. Daily metabolizable energy (MJ ME/head) requirements for the various activities of beef cattle and sheep were estimated and then converted to kg DM/head on a bi-monthly basis. Linear programming was employed to identify the optimum carrying capacity of beef cattle and sheep, the most profitable slaughtering ages of beef cattle, the number of prime lambs (sold to meat processing plants), and sold store lambs (sold to other farmers for finishing). Gross farm revenue (GFR) and farm earnings before tax (EBT) per hectare and per stock unit, as well as total farm expenditure (TFE), were calculated and compared to the average value of Taranaki-Manawatu North Island intensive finishing sheep and beef Class 5 farming using Beef and Lamb New Zealand (B+LNZ) data. The modeled farm ran 46% more stock units (a stock unit consumed 550 kg DM/year) than the average value of Class 5 farms. At this stocking rate, 83% of the total feed supplied for each species was consumed, and pasture supplied 95% and 98% of beef cattle and sheep feed demands, respectively. More than 70% of beef cattle were finished before the second winter. This enabled the optimized system to return 53% and 188% higher GFR/ha and EBT/ha, respectively, compared to the average values for a Class 5 farm. This paper did not address risk, such as pasture growth and price fluctuations. To understand this, several additional scenarios could be examined using this model. 
Further studies to include alternative herbages and crops for feed supply during summer and winter are required to expand the applicability of the model for different sheep and beef cattle farm systems.
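The feed-demand conversion described in the abstract can be illustrated directly: a daily metabolizable-energy requirement (MJ ME/head) becomes pasture dry-matter demand (kg DM/head) by dividing by the feed's energy content, and annual demand becomes stock units using 550 kg DM/year as in the paper. The ME figures below are assumed for illustration, not the paper's values.

```python
# Feed-demand conversion sketch: MJ ME/head/day -> kg DM/head/day, then
# annual kg DM -> stock units (550 kg DM/year per stock unit, per the
# paper). The daily ME requirement and pasture quality are assumptions.
def me_to_kg_dm(me_requirement_mj, feed_energy_mj_per_kg_dm):
    return me_requirement_mj / feed_energy_mj_per_kg_dm

def stock_units(annual_kg_dm, kg_dm_per_stock_unit=550.0):
    return annual_kg_dm / kg_dm_per_stock_unit

daily_me = 110.0   # MJ ME/head/day for a growing animal (assumed)
pasture_me = 11.0  # MJ ME/kg DM pasture quality (assumed)
daily_dm = me_to_kg_dm(daily_me, pasture_me)
print(daily_dm)                     # 10.0 kg DM/head/day
print(stock_units(daily_dm * 365))  # ~6.6 stock units
```

These per-head demands, tallied bi-monthly, are what feed the constraint rows of the linear program the paper optimizes.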
