To see the other types of publications on this topic, follow the link: Pareto graph.

Journal articles on the topic 'Pareto graph'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the top 50 journal articles for your research on the topic 'Pareto graph.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Luo, Dongqi, Binqiang Si, Saite Zhang, Fan Yu, and Jihong Zhu. "Near-Optimal Graph Signal Sampling by Pareto Optimization." Sensors 21, no. 4 (February 18, 2021): 1415. http://dx.doi.org/10.3390/s21041415.

Full text
Abstract:
In this paper, we focus on the bandlimited graph signal sampling problem. To sample graph signals, we need to find a small subset of nodes that minimizes the optimal reconstruction error. We formulate this problem as a subset selection problem and propose an efficient Pareto Optimization for Graph Signal Sampling (POGSS) algorithm. Since evaluating the objective function is very time-consuming, a novel acceleration algorithm that speeds up the evaluation of any solution is proposed in this paper as well. Theoretical analysis shows that POGSS finds the desired solution in quadratic time while guaranteeing nearly the best known approximation bound. Empirical studies on both Erdős–Rényi graphs and Gaussian graphs demonstrate that our method outperforms state-of-the-art greedy algorithms.
APA, Harvard, Vancouver, ISO, and other styles
2

Hamann, Michael, and Ben Strasser. "Graph Bisection with Pareto Optimization." ACM Journal of Experimental Algorithmics 23 (November 15, 2018): 1–34. http://dx.doi.org/10.1145/3173045.

3

Chandrasekhar, Arjun, and Saket Navlakha. "Neural arbors are Pareto optimal." Proceedings of the Royal Society B: Biological Sciences 286, no. 1902 (May 2019): 20182727. http://dx.doi.org/10.1098/rspb.2018.2727.

Abstract:
Neural arbors (dendrites and axons) can be viewed as graphs connecting the cell body of a neuron to various pre- and post-synaptic partners. Several constraints have been proposed on the topology of these graphs, such as minimizing the amount of wire needed to construct the arbor (wiring cost), and minimizing the graph distances between the cell body and synaptic partners (conduction delay). These two objectives compete with each other—optimizing one results in poorer performance on the other. Here, we describe how well neural arbors resolve this network design trade-off using the theory of Pareto optimality. We develop an algorithm to generate arbors that near-optimally balance these two objectives, and demonstrate that this algorithm improves over previous algorithms. We then use this algorithm to study how close neural arbors are to being Pareto optimal. Analysing 14 145 arbors across numerous brain regions, species and cell types, we find that neural arbors are much closer to being Pareto optimal than would be expected by chance and other reasonable baselines. We also investigate how the location of the arbor on the Pareto front, and the distance from the arbor to the Pareto front, can be used to discriminate between some arbor types (e.g. axons versus dendrites, or different cell types), highlighting a new potential connection between arbor structure and function. Finally, using this framework, we find that another biological branching structure—plant shoot architectures used to collect and distribute nutrients—is also Pareto optimal, suggesting shared principles of network design between two systems separated by millions of years of evolution.
4

He, Qinghai, and Weili Kong. "Structure of Pareto Solutions of Generalized Polyhedral-Valued Vector Optimization Problems in Banach Spaces." Abstract and Applied Analysis 2013 (2013): 1–10. http://dx.doi.org/10.1155/2013/619206.

Abstract:
In general Banach spaces, we consider a vector optimization problem (SVOP) in which the objective is a set-valued mapping whose graph is the union of finitely many polyhedra or the union of finitely many generalized polyhedra. Dropping the compactness assumption, we establish some results on the structure of the weak Pareto solution set, Pareto solution set, weak Pareto optimal value set, and Pareto optimal value set of (SVOP), and on the connectedness of the Pareto solution set and Pareto optimal value set of (SVOP). In particular, we improve and generalize the classical results of Arrow, Barankin, and Blackwell in Euclidean spaces and those of Zheng and Yang in general Banach spaces.
5

Igarashi, Ayumi, and Dominik Peters. "Pareto-Optimal Allocation of Indivisible Goods with Connectivity Constraints." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 2045–52. http://dx.doi.org/10.1609/aaai.v33i01.33012045.

Abstract:
We study the problem of allocating indivisible items to agents with additive valuations, under the additional constraint that bundles must be connected in an underlying item graph. Previous work has considered the existence and complexity of fair allocations. We study the problem of finding an allocation that is Pareto-optimal. While it is easy to find an efficient allocation when the underlying graph is a path or a star, the problem is NP-hard for many other graph topologies, even for trees of bounded pathwidth or of maximum degree 3. We show that on a path, there are instances where no Pareto-optimal allocation satisfies envy-freeness up to one good, and that it is NP-hard to decide whether such an allocation exists, even for binary valuations. We also show that, for a path, it is NP-hard to find a Pareto-optimal allocation that satisfies maximin share, but show that a moving-knife algorithm can find such an allocation when agents have binary valuations that have a non-nested interval structure.
6

Vora, Jay, Rakesh Chaudhari, Chintan Patel, Danil Yurievich Pimenov, Vivek K. Patel, Khaled Giasin, and Shubham Sharma. "Experimental Investigations and Pareto Optimization of Fiber Laser Cutting Process of Ti6Al4V." Metals 11, no. 9 (September 15, 2021): 1461. http://dx.doi.org/10.3390/met11091461.

Abstract:
In the current study, laser cutting of Ti6Al4V was accomplished using Taguchi’s L9 orthogonal array (OA). Laser power, cutting speed, and gas pressure were selected as input process parameters, whereas surface roughness (SR), kerf width, dross height, and material removal rate (MRR) were considered as output variables. The effects of input variables were analyzed through the analysis of variance (ANOVA), main effect plots, residual plots, and contour plots. A heat transfer search algorithm was used to optimize the parameters for the single objective function including higher MRR, minimum SR, minimum dross, and minimum kerf. A multi-objective heat transfer search algorithm was used to create non-dominant optimal Pareto points, giving unique optimal solutions with the corresponding input parameters. For better understanding and ease of selection of input parameters in industry and by scientists, a Pareto graph (2D and 3D graph) is generated from the Pareto points.
7

Bugaev, Yu V., and S. V. Chikunov. "Power estimation of the full set of alternatives to Paret's subgraphs in a graph." Proceedings of the Voronezh State University of Engineering Technologies 80, no. 2 (October 2, 2018): 73–76. http://dx.doi.org/10.20914/2310-1202-2018-2-73-76.

Abstract:
In practice, the problem of constructing an optimal subgraph of a certain type within a given graph arises frequently. Possible applications include finding the optimal structure of technological networks, designing computer architectures, modeling artificial intelligence, and many others. Multicriteria variants of these problems are becoming increasingly relevant. An essential factor limiting the improvement of multicriteria optimization methods on graphs is their exponential computational complexity, caused by the large dimension of the problem. A number of results indicate that the theoretical complexity estimates constructed for exhaustive-search methods do not hold, and the conclusions drawn from them lack sufficient justification. Among effective solution concepts, the so-called complete set of alternatives, whose cardinality can be orders of magnitude smaller than that of the Pareto set, is of the greatest interest. Taking these facts into account, this work presents an upper bound on the cardinality of the complete set of alternatives for the problem of finding Pareto-optimal subgraphs of a given graph.
8

Vianna, Dalessandro Soares, José Elias Claudio Arroyo, Pedro Sampaio Vieira, and Thiago Ribeiro de Azeredo. "Parallel strategies for a multi-criteria GRASP algorithm." Production 17, no. 1 (April 2007): 84–93. http://dx.doi.org/10.1590/s0103-65132007000100006.

Abstract:
This paper proposes different strategies for parallelizing a multi-criteria GRASP (Greedy Randomized Adaptive Search Procedure) algorithm. The parallel GRASP algorithm is applied to the multi-criteria minimum spanning tree problem, which is NP-hard. In this problem, a vector of costs is defined for each edge of the graph and the goal is to find all the efficient or Pareto-optimal spanning trees (Pareto-optimal solutions). Each process finds a subset of efficient solutions. These subsets are joined using different strategies to obtain the final set of efficient solutions. The multi-criteria GRASP algorithm with the different parallel strategies is tested on complete graphs with n = 20, 30 and 50 nodes and r = 2 and 3 criteria. The computational results show that the proposed parallel algorithms reduce the execution time and improve the results obtained by the sequential version.
9

Saha, Soma, Rajeev Kumar, and Gyan Baboo. "Characterization of graph properties for improved Pareto fronts using heuristics and EA for bi-objective graph coloring problem." Applied Soft Computing 13, no. 5 (May 2013): 2812–22. http://dx.doi.org/10.1016/j.asoc.2012.06.021.

10

Zou, Lei, and Lei Chen. "Pareto-Based Dominant Graph: An Efficient Indexing Structure to Answer Top-K Queries." IEEE Transactions on Knowledge and Data Engineering 23, no. 5 (May 2011): 727–41. http://dx.doi.org/10.1109/tkde.2010.240.

11

Parodi, Pietro, and Peter Watson. "PROPERTY GRAPHS – A STATISTICAL MODEL FOR FIRE AND EXPLOSION LOSSES BASED ON GRAPH THEORY." ASTIN Bulletin 49, no. 2 (March 27, 2019): 263–97. http://dx.doi.org/10.1017/asb.2019.4.

Abstract:
It is rare that the severity loss distribution for a specific line of business can be derived from first principles. One such example is the use of generalised Pareto distribution for losses above a large threshold (or more accurately: asymptotically), which is dictated by extreme value theory. Most popular distributions, such as the lognormal distribution or the Maxwell-Boltzmann-Bose-Einstein-Fermi-Dirac (MBBEFD), are convenient heuristics with no underlying theory to back them. This paper presents a way to derive a severity distribution for property losses based on modelling a property as a weighted graph, that is, a collection of nodes and weighted arcs connecting these nodes. Each node v (to which a value can also be assigned) corresponds to a room or a unit of the property where a fire can occur, while an arc (v, v′; p) between vertices v and v′ signals that the probability of the fire propagating from v to v′ is p. The paper presents two simple models for fire propagation (the random graph approach and the random time approach) and a model for explosion risk that allow one to calculate the loss distribution for a given property from first principles. The MBBEFD model is shown to be a good approximation for the simulated distribution of losses based on property graphs for both the random graph and the random time approach.
12

Gurina, R. V., E. V. Morozova, and V. V. Kosheva. "Rank analysis in evaluating the validity of olympiad tasks." Professional education in the modern world 10, no. 4 (January 30, 2021): 4302–9. http://dx.doi.org/10.20913/2618-7515-2020-4-14.

Abstract:
Rank analysis (the cenological approach), which rests on a rigorous mathematical apparatus, allows the state of educational systems and processes to be diagnosed on an objective scientific basis. Rank analysis is based on applying the law of hyperbolic rank distribution of objects in systems-cenoses, which is a refined form of the Pareto 20/80 law (the Pareto-Kudrin law). Using rank analysis to evaluate the validity of the olympiad tasks of the all-Russian Multidisciplinary Engineering Olympiad «Star», the authors demonstrate the necessity of implementing rank analysis in the practice of education quality management. They show that this law is important and appropriate to apply in education quality management. Optimization of educational systems and processes using rank analysis consists in eliminating anomalous deviations from the hyperbolic law, which is possible only with graphical visualization of the rank distribution and its approximation. The rank distribution graph provides visibility and insight into the nature of the rank decrease. Shortcomings in the content of olympiad tasks (as well as tests), which degrade their validity and reliability, appear as distortions of the hyperbolic graph (humps, depressions, tails, degeneration of the hyperbola into other dependencies). Rank analysis makes it possible to objectively identify several levels of validity of olympiad or test tasks on a scientific basis. The paper presents graphs of rank distributions illustrating examples of high (natural science) and below-average (history) validity of olympiad tasks. Educational systems are cenoses; the ranked objects in them are students, classes, schools, etc., and their ranked parameters are academic performance, rating in points, performance indicators, etc. Using the simple mathematical tools of rank analysis together with computer software brings education quality assessment technologies to a higher level, meeting the challenges of the time.
13

Menczer, Filippo, Melania Degeratu, and W. Nick Street. "Efficient and Scalable Pareto Optimization by Evolutionary Local Selection Algorithms." Evolutionary Computation 8, no. 2 (June 2000): 223–47. http://dx.doi.org/10.1162/106365600568185.

Abstract:
Local selection is a simple selection scheme in evolutionary computation. Individual fitnesses are accumulated over time and compared to a fixed threshold, rather than to each other, to decide who gets to reproduce. Local selection, coupled with fitness functions stemming from the consumption of finite shared environmental resources, maintains diversity in a way similar to fitness sharing. However, it is more efficient than fitness sharing and lends itself to parallel implementations for distributed tasks. While local selection is not prone to premature convergence, it applies minimal selection pressure to the population. Local selection is, therefore, particularly suited to Pareto optimization or problem classes where diverse solutions must be covered. This paper introduces ELSA, an evolutionary algorithm employing local selection, and outlines three experiments in which ELSA is applied to multiobjective problems: a multimodal graph search problem, and two Pareto optimization problems. In all these experiments, ELSA significantly outperforms other well-known evolutionary algorithms. The paper also discusses scalability, parameter dependence, and the potential distributed applications of the algorithm.
14

Chung, Hun, and John Duggan. "Directional equilibria." Journal of Theoretical Politics 30, no. 3 (July 2018): 272–305. http://dx.doi.org/10.1177/0951629818775515.

Abstract:
We propose the solution concept of directional equilibrium for the multidimensional model of voting with general spatial preferences. This concept isolates alternatives that are stable with respect to forces applied by all voters in the directions of their gradients, and it extends a known concept from statistics for Euclidean preferences. We establish connections to the majority core and Pareto optimality, prove existence and closed-graph results, and provide non-cooperative foundations in terms of a local contest game played by voters.
15

Ensor, Andrew, and Felipe Lillo. "Colored-Edge Graph Approach for the Modeling of Multimodal Transportation Systems." Asia-Pacific Journal of Operational Research 33, no. 01 (February 2016): 1650005. http://dx.doi.org/10.1142/s0217595916500056.

Abstract:
Many networked systems involve multiple modes of transport. Such systems are called multimodal, and examples include logistic networks, biomedical phenomena and telecommunication networks. Existing techniques for determining minimal paths in multimodal networks have either required heuristics or else application-specific constraints to obtain tractable problems, removing the multimodal traits of the network during analysis. In this paper, weighted colored-edge graphs are introduced for modeling multimodal networks, where colors represent the modes of transportation. Minimal paths are selected using a partial order that compares the weights in each color, resulting in a Pareto set of minimal paths. Although the computation of minimal paths is theoretically intractable and NP-complete, the approach is shown to be tractable through experimental analyses without the need to apply heuristics or constraints.
16

Lyu, Naesung, and Kazuhiro Saitou. "Decomposition-Based Assembly Synthesis of a Three-Dimensional Body-in-White Model for Structural Stiffness." Journal of Mechanical Design 127, no. 1 (January 1, 2005): 34–48. http://dx.doi.org/10.1115/1.1799551.

Abstract:
This paper presents an extension of our previous work on decomposition-based assembly synthesis for structural stiffness, where the three-dimensional finite element model of a vehicle body-in-white (BIW) is optimally decomposed into a set of components considering (1) stiffness of the assembled structure under given loading conditions, (2) manufacturability, and (3) assembleability of components. Two case studies, each focusing on the decomposition of a different portion of a BIW, are discussed. In the first case study, the side frame is decomposed for the minimum distortion of front door frame geometry under global bending. In the second case study, the side/floor frame and floor panels are decomposed for the minimum floor deflections under global bending. In each case study, a multiobjective genetic algorithm with graph-based crossover, combined with finite element analyses, is used to obtain Pareto optimal solutions. Representative designs are selected from the Pareto front and trade-offs among stiffness, manufacturability, and assembleability are discussed.
17

Nasrollahi, Meisam, Mehdi Safaei, and Nooshin Mahmoodi. "A Mathematical Model for Integrated Green Healthcare Supply Network Design." International Journal of Business Analytics 8, no. 1 (January 2021): 58–86. http://dx.doi.org/10.4018/ijban.2021010104.

Abstract:
This research presents a novel integrated mathematical programming model for green healthcare supply network design. A multi-graph integrated supply network was designed to reflect real-world situations. Minimizing total cost and minimizing delivery time are considered as the objective functions. Since the considered problem is NP-hard and the presented mathematical model is very complex and highly constrained, an innovative non-dominated ranked genetic algorithm (NRGA), called M-NRGA, is developed to solve real-world problems. Three different selection procedures were implemented to improve the quality and diversity of solutions in the Pareto front resulting from M-NRGA. Several numerical examples and a case study are solved to validate the model and evaluate the performance of the solution algorithm. Four different performance metrics are implemented for this evaluation, covering the quality and the diversity of the solutions in the Pareto front. The results are compared with two other meta-heuristic algorithms.
18

Aparicio, Juan, Eva M. Garcia-Nove, Magdalena Kapelko, and Jesus T. Pastor. "Graph productivity change measure using the least distance to the pareto-efficient frontier in data envelopment analysis." Omega 72 (October 2017): 1–14. http://dx.doi.org/10.1016/j.omega.2016.10.005.

19

Li, Junhui, Shuai Wang, Hu Zhang, and Aimin Zhou. "A Multi-Objective Evolutionary Algorithm Based on KNN-Graph for Traffic Network Attack." Electronics 9, no. 10 (September 28, 2020): 1589. http://dx.doi.org/10.3390/electronics9101589.

Abstract:
The study of vulnerability in complex networks plays a key role in many real-world applications. However, most existing work focuses on static topological indexes of vulnerability and ignores network functions. This paper addresses network attack problems by considering both topological and functional indexes. Firstly, a network attack problem is converted into a multi-objective optimization network vulnerability problem (MONVP). Secondly, to deal with MONVPs, a multi-objective evolutionary algorithm is proposed. In the new approach, a k-nearest-neighbor graph method is used to extract the structure of the Pareto set. With the obtained structure, similar parent solutions are chosen to generate offspring solutions. Statistical experiments on some benchmark problems demonstrate that the new approach shows higher search efficiency than some compared algorithms. Furthermore, the experiments on a subway system also suggest that the multi-objective optimization model can help to achieve better attack plans than a model that considers only a single index.
20

Avena-Koenigsberger, Andrea, Joaquín Goñi, Richard F. Betzel, Martijn P. van den Heuvel, Alessandra Griffa, Patric Hagmann, Jean-Philippe Thiran, and Olaf Sporns. "Using Pareto optimality to explore the topology and dynamics of the human connectome." Philosophical Transactions of the Royal Society B: Biological Sciences 369, no. 1653 (October 5, 2014): 20130530. http://dx.doi.org/10.1098/rstb.2013.0530.

Abstract:
Graph theory has provided a key mathematical framework to analyse the architecture of human brain networks. This architecture embodies an inherently complex relationship between connection topology, the spatial arrangement of network elements, and the resulting network cost and functional performance. An exploration of these interacting factors and driving forces may reveal salient network features that are critically important for shaping and constraining the brain's topological organization and its evolvability. Several studies have pointed to an economic balance between network cost and network efficiency with networks organized in an ‘economical’ small-world favouring high communication efficiency at a low wiring cost. In this study, we define and explore a network morphospace in order to characterize different aspects of communication efficiency in human brain networks. Using a multi-objective evolutionary approach that approximates a Pareto-optimal set within the morphospace, we investigate the capacity of anatomical brain networks to evolve towards topologies that exhibit optimal information processing features while preserving network cost. This approach allows us to investigate network topologies that emerge under specific selection pressures, thus providing some insight into the selectional forces that may have shaped the network architecture of existing human brains.
21

Echtle, K., I. Eusgeld, and D. Hirsch. "Genetic multistart algorithm for the design of fault-tolerant systems." Proceedings of the Institution of Mechanical Engineers, Part O: Journal of Risk and Reliability 222, no. 1 (March 1, 2008): 17–29. http://dx.doi.org/10.1243/1748006xjrr70.

Abstract:
This paper presents a new approach to the multiobjective design of fault-tolerant systems. The design objectives are fault tolerance and cost. Reducing the cost is of particular importance for fault-tolerant systems because the overhead caused by redundant components is considerable. The new design method consists of a special genetic algorithm that is tailored to the particular issues of fault-tolerant systems. The interface of the present tool ePADuGA (elitist and Pareto-based Approach to Design fault-tolerant systems using a Genetic Algorithm) allows for adaptation to various fields of application. The degree of fault tolerance is measured by the number of tolerated faults rather than traditional reliability metrics, because reliability numbers are mostly unknown during early design phases. The special features of the genetic algorithm comprise a graph-oriented representation of systems (which are the individuals during the evolutionary process), a simple yet expressive fault model, a very efficient procedure for fault-tolerance evaluation, and a Pareto-oriented fitness function. In a genetic algorithm generating thousands of individuals, a very fast evaluation of each individual is mandatory. For this purpose, state-space-oriented evaluation methods have been cut down to an extremely simple function which is still sufficient to assess the fault tolerance of individuals. An innovative aspect is also a multistart technique to find a Pareto solution set, which is independent of any parameters. In this paper, experimental results are presented showing the feasibility of the approach as well as the usefulness of the final fault-tolerant architectures, particularly in the field of mechatronic systems.
22

Tykkyläinen, Markku. "Accessibility in the provinces of Finland." Fennia - International Journal of Geography 159, no. 2 (November 28, 2013): 361–96. http://dx.doi.org/10.11143/fennia.9158.

Abstract:
Relative, mean and integral accessibility in the existing provinces of Finland and in the proposed new provinces was studied applying graph theory, through evaluating the shapes of the provinces by reference to polygonal graphs, and, in a more detailed analysis, through using flow graphs to represent the accessibility between locations. The sensitivity of the accessibility structure was tested by means of two simulation experiments. The poorest accessibility levels existed in the more northerly provinces, although some provinces weak in this respect could also be found in Southern and Central Finland, especially when accessibility was studied in relation to the size of the province. This correlates with a certain backwardness in the development of the administrative regions in relation to population development. The proposed province revisions have not corrected this situation entirely. The locations of the provincial capitals are close to the (Pareto-)optimum in general, the non-optimal cases being ones in which the most significant centres of population growth have been located elsewhere than in the provincial capital or its immediate surroundings. Integral accessibility as a measure of optimal location is not sensitive to changes in spatial structure, and extremely large movements of population would be required to cause any substantial alteration in the location of the optimal centre.
23

Meyer, Jan, and Clifford A. Whitcomb. "Numerical Method for Design Margin Selection." Journal of Ship Production 20, no. 01 (February 1, 2004): 24–42. http://dx.doi.org/10.5957/jsp.2004.20.1.24.

Abstract:
Design margins on speed or weight growth, to name just two, are used to ensure performance of a ship design given uncertainties in the design models or the production process. Although facing a high probability of meeting all performance requirements is very desirable, this assurance also comes at a price. When designers have the choice to set multiple margins, which all have some interacting effect on the vessel's performance, the problem of choosing a profit-maximizing margin combination becomes nontrivial. Using a mathematical model of the design process in conjunction with a Monte Carlo simulation, this paper demonstrates how the theoretical approach to margin selection presented in Meyer and Whitcomb (2004) can be applied. From the theory the profit-maximizing margin combination can be found where the Pareto frontier in the construction cost versus expected penalty graph touches the highest iso-profit line. The results from the numerical simulation show that such a Pareto frontier does form and a good (close to optimal) margin selection can be selected. Furthermore, limiting the risk level for failing to meet the performance requirements appears to have only a small negative effect on profit. Finally, a Bayesian approach is suggested, when insufficient data are available for meaningful statistics.
24

Bensaci, Mohammed Abdellatif, and Aoul Elias Hadjadj. "The Impact of Integrated Maintenance Actions Optimization on Strategic Machines Lifetime." Acta Universitatis Sapientiae, Electrical and Mechanical Engineering 12, no. 1 (December 1, 2020): 46–65. http://dx.doi.org/10.2478/auseme-2020-0004.

Abstract:
This article is based on a quantitative study of reliability, maintainability and availability (RMA), on the Pareto diagram (ABC) and on the analysis of failure modes and their impact on criticality (FMECA). The developed approach aims, first, at implementing two optimization procedures (code development, Simulink modelling in MATLAB) and using two methods (graphical, analytic) for estimating the Weibull parameters. The results obtained via the code (which is based on the two-parameter Weibull graph method) are somewhat divergent: the evaluation criteria are rather far from those of the model (which is based on maximum-likelihood estimation of the three-parameter Weibull distribution). This makes it possible to locate the critical elements and to determine the systematic preventive replacement times. An application case validates the proposed study objective.
25

Liu, Ruochen, Jiangdi Liu, and Manman He. "A multi-objective ant colony optimization with decomposition for community detection in complex networks." Transactions of the Institute of Measurement and Control 41, no. 9 (May 2, 2019): 2521–34. http://dx.doi.org/10.1177/0142331218804002.

Abstract:
Community detection in complex networks plays an important role in mining and analyzing the structure and function of networks. However, traditional algorithms for community detection based on graph partitioning and hierarchical clustering usually face expensive computational costs or require specific conditions when dealing with complex networks. Recently, community detection based on intelligent optimization has attracted increasing attention because of its effectiveness. In this paper, a new multi-objective ant colony optimization with decomposition (MACOD) for community detection in complex networks is proposed. Firstly, a new multi-objective ant colony algorithm framework, designed specifically for complex network clustering, is developed, in which the two-objective optimization problem is decomposed into a series of subproblems; each ant is responsible for a single-objective subproblem and targets a particular point on the Pareto front. Secondly, a problem-specific graph-based individual encoding strategy is proposed. Moreover, a new efficient local search mechanism is designed to improve the stability of the algorithm. The proposed MACOD has been compared with four other state-of-the-art algorithms on two benchmark networks and seven real-world networks, including three large-scale networks. Experimental results show that MACOD performs competitively on community detection problems.
APA, Harvard, Vancouver, ISO, and other styles
26

Abd Elaziz, Mohamed, Esraa Osama Abo Zaid, Mohammed A. A. Al-qaness, and Rehab Ali Ibrahim. "Automatic Superpixel-Based Clustering for Color Image Segmentation Using q-Generalized Pareto Distribution under Linear Normalization and Hunger Games Search." Mathematics 9, no. 19 (September 25, 2021): 2383. http://dx.doi.org/10.3390/math9192383.

Full text
Abstract:
Superpixel segmentation is one of the most efficient image segmentation approaches and is widely used in different applications. In this paper, we developed an image segmentation method based on superpixels and automatic clustering using the q-generalized Pareto distribution under linear normalization (q-GPDL), called ASCQPHGS. The proposed method uses the superpixel algorithm to segment the given image; then density peaks clustering (DPC) is applied to the results obtained from the superpixel algorithm to produce a decision graph. The Hunger Games search (HGS) algorithm is employed as a clustering method to segment the image. The proposed method is evaluated on two different datasets, collected from the Berkeley segmentation dataset and benchmark (BSDS500) and the Stanford background dataset (SBD). Moreover, the proposed method is compared to several methods to verify its performance and efficiency. Overall, the proposed method showed significant performance and outperformed all compared methods on well-known performance metrics.
APA, Harvard, Vancouver, ISO, and other styles
27

Xu, Meng, Qing Zhong Li, and Li Zhen Cui. "Performance and Cost Aware Application Placement in Cloud." Applied Mechanics and Materials 631-632 (September 2014): 204–9. http://dx.doi.org/10.4028/www.scientific.net/amm.631-632.204.

Full text
Abstract:
With the development of Internet technology, cloud computing has been widely applied to various industries as a new service delivery model in recent years. Cloud service providers must serve many customers at the same time, so a large number of different applications must be deployed, and the application deployment problem becomes increasingly important. How applications are deployed according to their different performance requirements has an important effect on improving the quality of service, enhancing user experience, and reducing service cost. However, for service providers, improving service quality and reducing service cost are contradictory goals. In this paper, the application deployment problem is modeled as an application deployment graph. Then, using the idea of Pareto optimality, a multi-objective optimization algorithm is proposed. It enables service providers to deliver better service quality to users at lower cost.
APA, Harvard, Vancouver, ISO, and other styles
28

Bullinger, Martin, Warut Suksompong, and Alexandros A. Voudouris. "Welfare Guarantees in Schelling Segregation." Journal of Artificial Intelligence Research 71 (May 29, 2021): 143–74. http://dx.doi.org/10.1613/jair.1.12771.

Full text
Abstract:
Schelling’s model is an influential model that reveals how individual perceptions and incentives can lead to residential segregation. Inspired by a recent stream of work, we study welfare guarantees and complexity in this model with respect to several welfare measures. First, we show that while maximizing the social welfare is NP-hard, computing an assignment of agents to the nodes of any topology graph with approximately half of the maximum welfare can be done in polynomial time. We then consider Pareto optimality, introduce two new optimality notions based on it, and establish mostly tight bounds on the worst-case welfare loss for assignments satisfying these notions as well as the complexity of computing such assignments. In addition, we show that for tree topologies, it is possible to decide whether there exists an assignment that gives every agent a positive utility in polynomial time; moreover, when every node in the topology has degree at least 2, such an assignment always exists and can be found efficiently.
APA, Harvard, Vancouver, ISO, and other styles
29

Ruan, Lang, Jin Chen, Qiuju Guo, Xiaobo Zhang, Yuli Zhang, and Dianxiong Liu. "Group Buying-Based Data Transmission in Flying Ad-Hoc Networks: A Coalition Game Approach." Information 9, no. 10 (October 15, 2018): 253. http://dx.doi.org/10.3390/info9100253.

Full text
Abstract:
In scenarios such as natural disasters and military strikes, it is common for unmanned aerial vehicles (UAVs) to form groups to execute reconnaissance and surveillance. To ensure the effectiveness of UAV communications, repeated resource acquisition issues and transmission mechanism designs need to be addressed urgently. Since large-scale UAV deployments generate high transmission overhead due to overlapping resource requirements, in this paper we propose a resource allocation optimization method based on distributed data content in a flying ad hoc network (FANET). The resource allocation problem, with the goal of throughput maximization, is formulated within a coalition game framework. Firstly, a data transmission mechanism is designed for UAVs to exchange information within coalitions. Secondly, a novel group-buying-based coalition selection mechanism is investigated for UAV coalitions to acquire data from the central UAV. The data transmission and coalition selection problems are modeled as a coalition graph game and a coalition formation game, respectively. Through the design of the utility function, we prove that both games have stable solutions, and we prove the convergence of the proposed approach under both coalition order and Pareto order. Based on simulation results, a coalition order based coalition selection algorithm (CO-CSA) and a Pareto order based coalition selection algorithm (PO-CSA) are proposed to explore stable coalition partitions of the system model. CO-CSA and PO-CSA achieve higher data throughput than the baseline one-time coalition selection algorithm (Onetime-CSA), with gains of at least 34.5% and 16.9%, respectively. Although PO-CSA has a relatively lower throughput gain, it converges in on average 50.9% fewer iterations than CO-CSA, which means that the choice of algorithm is scenario-dependent.
APA, Harvard, Vancouver, ISO, and other styles
30

Zischg, Jonatan, Christopher Klinkhamer, Xianyuan Zhan, P. Suresh C. Rao, and Robert Sitzenfrei. "A Century of Topological Coevolution of Complex Infrastructure Networks in an Alpine City." Complexity 2019 (January 6, 2019): 1–16. http://dx.doi.org/10.1155/2019/2096749.

Full text
Abstract:
In this paper, we used complex network analysis approaches to investigate topological coevolution over a century for three different urban infrastructure networks. We applied network analyses to a unique time-stamped network data set of an Alpine case study, representing the historical development of the town and its infrastructure over the past 108 years. The analyzed infrastructure includes the water distribution network (WDN), the urban drainage network (UDN), and the road network (RN). We use the dual representation of the network by using the Hierarchical Intersection Continuity Negotiation (HICN) approach, with pipes or roads as nodes and their intersections as edges. The functional topologies of the networks are analyzed based on the dual graphs, providing insights beyond a conventional graph (primal mapping) analysis. We observe that the RN, WDN, and UDN all exhibit heavy-tailed node degree distributions P(k) with high dispersion around the mean. In 50 percent of the investigated networks, P(k) can be approximated with truncated Pareto (power-law) functions, as they are known for scale-free networks. Structural differences between the three evolving network types resulting from different functionalities and system states are reflected in the P(k) and other complex network metrics. Small-world tendencies are identified by comparing the networks with their random and regular lattice network equivalents. Furthermore, we show the remapping of the dual network characteristics to the spatial map and the identification of criticalities among different network types through co-location analysis and discuss possibilities for further applications.
APA, Harvard, Vancouver, ISO, and other styles
31

Gottlob, G., G. Greco, and F. Scarcello. "Pure Nash Equilibria: Hard and Easy Games." Journal of Artificial Intelligence Research 24 (September 1, 2005): 357–406. http://dx.doi.org/10.1613/jair.1683.

Full text
Abstract:
We investigate complexity issues related to pure Nash equilibria of strategic games. We show that, even in very restrictive settings, determining whether a game has a pure Nash Equilibrium is NP-hard, while deciding whether a game has a strong Nash equilibrium is SigmaP2-complete. We then study practically relevant restrictions that lower the complexity. In particular, we are interested in quantitative and qualitative restrictions of the way each player's payoff depends on moves of other players. We say that a game has small neighborhood if the utility function for each player depends only on (the actions of) a logarithmically small number of other players. The dependency structure of a game G can be expressed by a graph DG(G) or by a hypergraph H(G). By relating Nash equilibrium problems to constraint satisfaction problems (CSPs), we show that if G has small neighborhood and if H(G) has bounded hypertree width (or if DG(G) has bounded treewidth), then finding pure Nash and Pareto equilibria is feasible in polynomial time. If the game is graphical, then these problems are LOGCFL-complete and thus in the class NC2 of highly parallelizable problems.
APA, Harvard, Vancouver, ISO, and other styles
32

Monteiro, Mayra Keroly Sales, Victor Rafael Leal Oliveira, Francisco Klebson Gomes dos Santos, Eduardo Lins Barros Neto, Leite Ricardo Henrique de Lima, Edna Maria Mendes Aroucha, and Karyn Nathallye de Oliveira Silva. "Increase of the Elastic Modulus of Cassava Starch Films with Modified Clay through Factorial Planning." Materials Science Forum 958 (June 2019): 81–86. http://dx.doi.org/10.4028/www.scientific.net/msf.958.81.

Full text
Abstract:
In this study, the optimization of the factors that significantly influence the improvement of the mechanical properties of cassava starch nanocomposites was investigated through a complete 2³ factorial design. The factors analyzed were the cassava starch (A), glycerol (B), and modified clay (C) contents. The clay had its surface modified by anion exchange in the presence of a quaternary ammonium salt. The factorial analysis suggested a regression model capable of predicting the optimal mechanical property of the cassava starch biofilm by maximizing the elastic modulus. The reliability of the regression model was tested against the experimental data through the following statistical analyses: Pareto chart and response surface. The response surface showed the combination of factor settings that achieves the best response, and SEM analysis of the thermoplastic cassava starch biofilms under both the best and the worst elasticity conditions was performed to visualize the structure of the biopolymeric matrix in each case. The order of statistical significance of the investigated effects on the elastic modulus is therefore C > B > A > BC > AC.
APA, Harvard, Vancouver, ISO, and other styles
33

Zheng, Hao, Yixiong Feng, Jianrong Tan, and Zixian Zhang. "An integrated modular design methodology based on maintenance performance consideration." Proceedings of the Institution of Mechanical Engineers, Part B: Journal of Engineering Manufacture 231, no. 2 (August 8, 2016): 313–28. http://dx.doi.org/10.1177/0954405415573060.

Full text
Abstract:
Maintenance nowadays not only plays a crucial role in the usage phase but is fast becoming a primary focus of the design stage, especially with the generally increased emphasis on product service. The modularization of maintenance has rarely been explored by previous researchers, despite its significant potential benefit, and existing life-cycle modular design methods do not sufficiently improve maintenance performance as a whole. In an effort to remedy this, this article considers relevant maintenance issues at early stages of product development and presents a novel modular methodology based on the simultaneous consideration of maintenance and modularity characteristics. The proposed method first employs the design structure matrix to analyze the comprehensive correlation among components. Next, based on graph theory, initial modules with high cohesion and low coupling are generated. After that, a maintenance performance multi-objective model is established for further optimization, to minimize maintenance cost, minimize differences in the maintenance cycle, and maximize system availability. Finally, an improved Strength Pareto Evolutionary Algorithm 2 (SPEA2) is used for modular optimization. The complete methodology is demonstrated in a case study with a hydraulic press, where results reveal that the optimized modules can reduce maintenance cost while keeping modular performance approximately constant.
APA, Harvard, Vancouver, ISO, and other styles
34

Rucitra, Andan Linggar, and S. Fadiah. "PENERAPAN STATISTICAL QUALITY CONTROL (SQC) PADA PENGENDALIAN MUTU MINYAK TELON (STUDI KASUS DI PT.X)." AGROINTEK 13, no. 1 (March 5, 2019): 72. http://dx.doi.org/10.21107/agrointek.v13i1.4920.

Full text
Abstract:
Telon oil is a traditional medicine in the form of a liquid preparation that serves to provide a sense of warmth to the wearer. PT. X is one of the companies that produce telon oil. To maintain the quality of PT. X's telon oil, overall quality control is required, starting from quality control of raw materials, through process quality control, to quality control of the final product. The purpose of this research is to examine the application of statistical quality control (SQC) in controlling the quality of telon oil at PT. X. Final product quality is one measure of the success of a process, so good quality control is needed. The SQC methods used in this research are the Pareto diagram and the cause-and-effect diagram. A Pareto diagram is a bar graph that ranks problems from the most to the least frequent. A cause-and-effect diagram, often called a fishbone diagram, is a tool for identifying potential causes of an effect or problem. The results of applying these methods indicate that 80% of defects are caused by unsuitable volume and by incorrect expired date (ED) codes. The defects are caused by several factors, namely method, labor, and machine, while volume conformity is the most promising factor for reducing the number of defective products.
APA, Harvard, Vancouver, ISO, and other styles
35

Ab Rashid, Mohd Fadzil Faisae, Ashutosh Tiwari, and Windo Hutabarat. "Integrated optimization of mixed-model assembly sequence planning and line balancing using Multi-objective Discrete Particle Swarm Optimization." Artificial Intelligence for Engineering Design, Analysis and Manufacturing 33, no. 03 (May 6, 2019): 332–45. http://dx.doi.org/10.1017/s0890060419000131.

Full text
Abstract:
Recently, interest in integrated assembly sequence planning (ASP) and assembly line balancing (ALB) has picked up because of its numerous benefits, such as a larger search space that leads to better solution quality, a reduced error rate in planning, and expedited product time-to-market. However, existing research is limited to the simple assembly problem that runs only one homogeneous product. This paper therefore models and concurrently optimizes the integrated mixed-model ASP and ALB using multi-objective discrete particle swarm optimization (MODPSO). This is a new variant of the integrated assembly problem. The integrated mixed-model ASP and ALB is modeled using a task-based joint precedence graph. To test the performance of MODPSO on the integrated mixed-model ASP and ALB, an experiment using a set of 51 test problems with different difficulty levels was conducted. In addition, MODPSO coefficient tuning was conducted to identify the best settings for optimizing the problem. The results indicate that the MODPSO algorithm yields a significant improvement in solution quality toward the Pareto optimum and demonstrates the ability to explore extreme solutions in the mixed-model assembly optimization search space. The originality of this research lies in the new variant of the integrated ASP and ALB problem: this is the first published research to model and optimize integrated ASP and ALB for the mixed-model assembly problem.
APA, Harvard, Vancouver, ISO, and other styles
36

Rahmadi, Ridho, Perry Groot, Marieke HC van Rijn, Jan AJG van den Brand, Marianne Heins, Hans Knoop, and Tom Heskes. "Causality on longitudinal data: Stable specification search in constrained structural equation modeling." Statistical Methods in Medical Research 27, no. 12 (June 28, 2017): 3814–34. http://dx.doi.org/10.1177/0962280217713347.

Full text
Abstract:
A typical problem in causal modeling is the instability of model structure learning, i.e., small changes in finite data can result in completely different optimal models. The present work introduces a novel causal modeling algorithm for longitudinal data that is robust for finite samples, based on recent advances in stability selection using subsampling and selection algorithms. Our approach uses exploratory search but allows incorporation of prior knowledge, e.g., the absence of a particular causal relationship between two specific variables. We represent causal relationships using structural equation models. Models are scored along two objectives: the model fit and the model complexity. Since both objectives are often conflicting, we apply a multi-objective evolutionary algorithm to search for Pareto optimal models. To handle the instability of small finite data samples, we repeatedly subsample the data and select those substructures (from the optimal models) that are both stable and parsimonious. These substructures can be visualized through a causal graph. Our more exploratory approach achieves performance at least comparable to, and often significantly better than, state-of-the-art alternative approaches on a simulated data set with a known ground truth. We also present the results of our method on three real-world longitudinal data sets on chronic fatigue syndrome, Alzheimer disease, and chronic kidney disease. The findings obtained with our approach are generally in line with results from more hypothesis-driven analyses in earlier studies and suggest some novel relationships that deserve further research.
APA, Harvard, Vancouver, ISO, and other styles
37

Zhuang, Hongchao, Kailun Dong, Yuming Qi, Ning Wang, and Lei Dong. "Multi-Destination Path Planning Method Research of Mobile Robots Based on Goal of Passing through the Fewest Obstacles." Applied Sciences 11, no. 16 (August 11, 2021): 7378. http://dx.doi.org/10.3390/app11167378.

Full text
Abstract:
To effectively solve the inefficient path planning of mobile robots traveling to multiple destinations, a multi-destination global path planning algorithm based on the optimal obstacle value is proposed. A grid map is built to simulate the real working environment of mobile robots. Based on the rules of the live chess game in Go, the grid map is optimized and reconstructed, and the obstacle values of the grid environment between each pair of destination points are obtained. Using a simulated annealing strategy combined with the obstacle values between destination points, the multi-destination arrival sequence for the mobile robot is optimized and the optimal sequence of nodes for path planning is obtained. According to the Q-learning algorithm, the parameters of the reward function are optimized to obtain the Q-value of the path, and the optimal path through multiple destinations is acquired when the mobile robot can pass through the fewest obstacles. The multi-destination path planning of the mobile robot is simulated in MATLAB (R2016b, MathWorks, Natick, MA, USA) under multiple working conditions, and Pareto numerical graphs are obtained. Comparing multi-destination global planning with single-destination path planning under these conditions, the path length in multi-destination global planning is reduced by 22% compared with the average path length of the single-destination algorithm. The results show that the multi-destination global path planning method based on the optimal obstacle value is reasonable and effective, and the proposed method helps improve the terrain adaptability of mobile robots.
APA, Harvard, Vancouver, ISO, and other styles
38

Sales, Priscila Ferreira de, Yago Ribeiro de Oliveira Silva, Leonardo Silva Santos Lapa, and Francisco Hélcio Canuto Amaral. "Caracterização e aplicação de filmes biodegradáveis de amido de incorporados de extrato de própolis-verde." ForScience 9, no. 2 (August 20, 2021): e00958. http://dx.doi.org/10.29069/forscience.2021v9n2.e958.

Full text
Abstract:
Biodegradable films were synthesized using different amounts of corn starch, glycerin, and green propolis extract in a complete factorial design with three central points. The materials obtained were analyzed for two groups of properties: physical (thickness, density, grammage, and swelling index) and optical (transparency). For every property analyzed, the Pareto chart was used to verify the effect of the variables and of their interactions, and response surfaces were built for the interactions and regressions that were significant in tests conducted at 95% confidence. The results showed that, among the evaluated parameters, corn starch most influenced thickness, density, grammage, and transparency, which is explained by the larger quantity added during synthesis and by its role as the film-forming agent. In the determination of the swelling index, on the other hand, glycerin was the most influential factor, since it is added as a plasticizer. After evaluation, the films with the most suitable characteristics were selected for application in the preservation of silver bananas (Musa spp.). The results were promising in that the packaged fruits showed less mass loss than the unpackaged sample, indicating that the biofilms produced are suitable for hindering oxidation and food-spoilage reactions. Keywords: Biofilms. Chemometric tools. Banana conservation.
APA, Harvard, Vancouver, ISO, and other styles
39

Lapa, Leonardo Silva Santos, Yago Ribeiro de Oliveira Silva, Priscila Ferreira de Sales, Karina Carvalho Guimarães, and Marali Vilela Dias. "Avaliação das propriedades mecânicas de filmes biodegradáveis e sua aplicação em embalagens para acondicionamento de plantas." ForScience 9, no. 2 (August 12, 2021): e00970. http://dx.doi.org/10.29069/forscience.2021v9n2.e970.

Full text
Abstract:
In this work, the casting technique was used to produce films. Different amounts of corn starch, glycerin, and green propolis extract were used, and the mechanical properties (Young's modulus, elongation, and tensile strength at break) of the materials obtained were evaluated. The effect of the variables and of their interactions was verified with the Pareto chart, and response surfaces were constructed in tests conducted at 95% confidence. The analysis of the results indicated that corn starch was the most influential factor in determining Young's modulus. For elongation, on the other hand, glycerin, used as a plasticizing agent, was the most relevant factor. In the analysis of tensile strength at break, the interaction between glycerin and the green propolis extract was the most significant. The film presenting the most suitable characteristics for packaging lettuce seedlings was then selected: the sample produced from 4 g of corn starch, 1 g of glycerin, and 0.12 g of green propolis extract proved the most appropriate. Regarding the packaging of the lettuce seedling, a visual and macroscopic analysis was performed to verify the resistance of the biodegradable film once the substrate and the seedling were placed in the package. The synthesized and selected material showed adequate retention capacity, which supports its use, since, being biodegradable, it can become a substitute for conventional plastics. Keywords: Casting technique. Young's modulus. Elasticity. Tensile strength at break.
APA, Harvard, Vancouver, ISO, and other styles
40

Pulkin, I. S., and A. V. Tatarintsev. "Sufficient statistics for the Pareto distribution parameter." Russian Technological Journal 9, no. 3 (June 28, 2021): 88–97. http://dx.doi.org/10.32362/2500-316x-2021-9-3-88-97.

Full text
Abstract:
The task of estimating the parameters of the Pareto distribution, first of all its shape index, for a given sample is relevant. This article establishes that, for this estimate, it is sufficient to know the product of the sample elements; it is proved that this product is a sufficient statistic for the Pareto distribution parameter. On the basis of the maximum likelihood method, the shape index is estimated; this estimate is shown to be biased, and a formula eliminating the bias is justified. For the product of the sample elements, considered as a random variable, the distribution function and probability density are found; the mathematical expectation, higher moments, and differential entropy are calculated, and the corresponding graphs are plotted. In addition, it is noted that any function of this product is a sufficient statistic, in particular the geometric mean. For the geometric mean, also considered as a random variable, the distribution function, probability density, and mathematical expectation are found, the higher moments and differential entropy are calculated, and the corresponding graphs are plotted. It is further proved that the geometric mean of the sample is, from a practical point of view, a more convenient sufficient statistic than the product of the sample elements. On the basis of the Rao–Blackwell–Kolmogorov theorem, efficient estimates of the Pareto distribution parameter are constructed. In conclusion, as an example, the technique developed here is applied to the exponential distribution, for which both the sum and the arithmetic mean of the sample can serve as sufficient statistics.
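The sufficiency result described here can be sketched concretely. For a Pareto distribution with known scale x_m, the log-likelihood depends on the data only through the sum of logarithms (equivalently, the product or geometric mean of the sample), and the standard bias correction for the maximum-likelihood shape estimate replaces n by n−1. This is an illustrative sketch of that textbook computation, not the authors' code:

```python
import math

def pareto_shape_mle(sample, xm, unbiased=True):
    """Maximum-likelihood estimate of the Pareto shape alpha for known scale xm.

    The likelihood depends on the sample only through sum(log x_i), i.e.
    through the product of the sample elements: a sufficient statistic."""
    n = len(sample)
    s = sum(math.log(x / xm) for x in sample)   # n * log(geometric mean / xm)
    alpha_hat = n / s                           # plain MLE
    if unbiased:
        alpha_hat *= (n - 1) / n                # removes the MLE's bias
    return alpha_hat
```

Drawing a sample from a Pareto law via the inverse CDF, x = x_m · u^(−1/α) with u uniform on (0, 1), should recover the true shape to within sampling error.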
APA, Harvard, Vancouver, ISO, and other styles
41

Tikhonova, I. V., Yu E. Titova, A. E. Gvozdev, and E. V. Ageev. "THE USE OF PARETO DIAGRAMS TO ANALYZE THE QUALITY OF TEACHING IN HIGHER EDUCATION." Proceedings of the Southwest State University 21, no. 5 (October 28, 2017): 27–37. http://dx.doi.org/10.21869/2223-1560-2017-21-5-27-37.

Full text
Abstract:
The Pareto principle, and the corresponding charts showing the contribution of individual causes to the overall functioning of a system, are widely known to professionals in quality-system analysis. The Pareto chart and the associated cumulative Lorenz curve are used to illustrate the dominant alternatives relative to their total number; the popularity of Pareto charts lies in the clarity they provide when analyzing the functioning of complex systems. The aim of this work was to determine whether Pareto analysis can be used to identify the causes that affect the quality of students' knowledge acquisition. Students were asked to analyze their own learning using Pareto charts of outcomes and causes. In the first stage, the students computed from their grade books the expected average score in all disciplines included in the blocks above; this information was used to build a Pareto chart of learning outcomes. In the second stage, the students formulated a list of the most probable reasons for low (or insufficiently high) progress in the most problematic block identified earlier and evaluated the significance of each reason on a 100-point scale. Some of the identified causes of failure cannot be addressed by the student alone and require improvement of the curriculum and better logistical support of the educational process. Pareto analysis, used successfully in industrialized countries to assess product quality, is also effective for analyzing the quality of academic performance and reveals problem areas of the educational process.
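The computation behind a Pareto chart of causes, as used in this and several other abstracts above, reduces to ranking causes by frequency and accumulating their percentage share. A minimal sketch (the cause names and counts below are invented for illustration):

```python
def pareto_analysis(cause_counts):
    """Rank causes by frequency and attach the cumulative percentage
    (the ordinate of a Pareto chart's cumulative Lorenz curve)."""
    total = sum(cause_counts.values())
    ranked = sorted(cause_counts.items(), key=lambda kv: kv[1], reverse=True)
    rows, cum = [], 0.0
    for cause, count in ranked:
        cum += 100.0 * count / total
        rows.append((cause, count, round(cum, 1)))
    return rows

def vital_few(cause_counts, threshold=80.0):
    """Causes accounting for the first `threshold` percent of occurrences:
    the '80/20' vital few of the Pareto principle."""
    out = []
    for cause, count, cum in pareto_analysis(cause_counts):
        out.append(cause)
        if cum >= threshold:
            break
    return out
```

With counts such as `{"volume": 50, "ED code": 30, "label": 12, "cap": 8}`, the vital few are the causes whose cumulative share first reaches 80%.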
APA, Harvard, Vancouver, ISO, and other styles
42

Tkachyov, A. "Test simulation experiments of mesh algorithm of Pareto efficiency routes in graphs." Актуальные направления научных исследований XXI века: теория и практика 3, no. 5 (December 2, 2015): 207–10. http://dx.doi.org/10.12737/16242.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Weber, Tiago Oliveira, and Wilhelmus A. M. Van Noije. "Analog circuit synthesis performing fast Pareto frontier exploration and analysis through 3D graphs." Analog Integrated Circuits and Signal Processing 73, no. 3 (August 12, 2012): 861–71. http://dx.doi.org/10.1007/s10470-012-9939-z.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Yu, Chong, Bo Huang, and Jiangen Hao. "Multiobjective Heuristic Scheduling of Automated Manufacturing Systems Based on Petri Nets." International Journal of Machine Learning and Computing 11, no. 2 (March 2021): 176–81. http://dx.doi.org/10.18178/ijmlc.2021.11.2.1032.

Full text
Abstract:
In practice, automated manufacturing systems (AMSs) usually have multiple, incommensurate, and conflicting objectives to achieve. To deal with them, this paper proposes an extended Petri net formulation for the multiobjective scheduling of AMSs. In addition, a multiobjective heuristic A* search within the reachability graphs of the extended Petri nets is proposed to schedule these nets. The method can obtain all Pareto-optimal schedules for the underlying systems if admissible heuristic functions are used. Finally, the effectiveness of the method is illustrated on some experimental systems.
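A search that returns all Pareto-optimal schedules must prune states with a dominance test rather than a single scalar cost. A minimal sketch of that test (not the paper's implementation; the schedule data below is invented for illustration):

```python
def dominates(a, b):
    """True if cost vector a is at least as good as b in every objective
    and strictly better in at least one (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(vectors):
    """Return the non-dominated subset of a list of cost vectors."""
    return [v for v in vectors
            if not any(dominates(w, v) for w in vectors if w != v)]

# Example: (makespan, energy cost) for four candidate schedules.
schedules = [(10, 5), (8, 7), (12, 4), (9, 9)]
print(pareto_front(schedules))  # (9, 9) is dominated by (8, 7) and is dropped
```

In a multiobjective A*, the same dominance check is applied to f-vectors (g-vector plus an admissible heuristic vector) when deciding whether to expand or discard a reachability-graph node.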
APA, Harvard, Vancouver, ISO, and other styles
45

DE VICO FALLANI, FABRIZIO, JLENIA TOPPI, CLAUDIA DI LANZO, GIOVANNI VECCHIATO, LAURA ASTOLFI, GIANLUCA BORGHINI, DONATELLA MATTIA, FEBO CINCOTTI, and FABIO BABILONI. "REDUNDANCY IN FUNCTIONAL BRAIN CONNECTIVITY FROM EEG RECORDINGS." International Journal of Bifurcation and Chaos 22, no. 07 (July 2012): 1250158. http://dx.doi.org/10.1142/s0218127412501581.

Full text
Abstract:
The concept of redundancy is a critical resource of the brain enhancing the resilience to neural damages and dysfunctions. In the present work, we propose a graph-based methodology to investigate the connectivity redundancy in brain networks. By taking into account all the possible paths between pairs of nodes, we considered three complementary indexes characterizing the network redundancy: (i) at the global level, i.e. the scalar redundancy; (ii) across different path lengths, i.e. the vectorial redundancy; and (iii) between node pairs, i.e. the matricial redundancy. We used this procedure to investigate the functional connectivity estimated from a dataset of high-density EEG signals in a group of healthy subjects during a no-task resting state. The statistical comparison with a benchmark dataset of random networks, having the same number of nodes and links as the EEG nets, revealed a significant (p < 0.05) difference for all three indexes. In particular, the redundancy in the EEG networks, for each frequency band, appears radically higher than in random graphs, thus revealing a natural tendency of the brain to present multiple parallel interactions between different specialized areas. Notably, the matricial redundancy showed a high (p < 0.05) redundancy between the scalp sensors over the parieto-occipital areas in the Alpha range of EEG oscillations (7.5–12.5 Hz), which is known to be the most responsive channel during resting state conditions.
APA, Harvard, Vancouver, ISO, and other styles
46

Ren, Yong Chang, De Yi Jiang, Tao Xing, and Ping Zhu. "Research on Quality Control Flows and Techniques of Software Project Management." Advanced Materials Research 328-330 (September 2011): 820–23. http://dx.doi.org/10.4028/www.scientific.net/amr.328-330.820.

Full text
Abstract:
Software project quality management is important work throughout the entire software life cycle, and the effective implementation of software product quality control is an important means of improving software quality and reducing cost. Given the difficulty of quality control, this paper studies its flows and techniques. First, it describes the quality control process, covering prior (preventive), in-process, and after-the-fact quality control. Then it examines the quality control flows through graphs. Finally, combining graphs with textual descriptions, it examines four commonly used quality control techniques: the cause-effect diagram, Pareto diagram, control chart, and run chart. The results show that studying quality control flows and techniques provides methods to support quality control and improves the scientific rigor of quality management.
APA, Harvard, Vancouver, ISO, and other styles
47

Salmalian, K., N. Nariman-Zadeh, H. Gharababei, H. Haftchenari, and A. Varvani-Farahani. "Multi-objective evolutionary optimization of polynomial neural networks for fatigue life modelling and prediction of unidirectional carbon-fibre-reinforced plastics composites." Proceedings of the Institution of Mechanical Engineers, Part L: Journal of Materials: Design and Applications 224, no. 2 (April 1, 2010): 79–91. http://dx.doi.org/10.1243/14644207jmda260.

Full text
Abstract:
In this article, evolutionary algorithms (EAs) are employed for multi-objective Pareto optimum design of group method of data handling (GMDH)-type neural networks that have been used for fatigue life modelling and prediction of unidirectional (UD) carbon-fibre-reinforced plastics (CFRP) composites using input-output experimental data. The input parameters used for such modelling are stress ratio, cyclic strain energy, fibre orientation angle, maximum stress, and failure stress level in one cycle. In this way, EAs with a new encoding scheme are first presented for evolutionary design of the generalized GMDH-type neural networks, in which the connectivity configurations in such networks are not limited to adjacent layers. Second, multi-objective EAs with a new diversity preserving mechanism are used for Pareto optimization of such GMDH-type neural networks. The important conflicting objectives of GMDH-type neural networks that are considered in this work are training error (TE), prediction error (PE), and number of neurons (N). Different pairs of these objective functions are selected for two-objective optimization processes. Therefore, optimal Pareto fronts of such models are obtained in each case, which exhibit the trade-offs between the corresponding pair of conflicting objectives and, thus, provide different non-dominated optimal choices of GMDH-type neural network model for fatigue life of UD CFRP composites. Moreover, all three objectives are considered in a three-objective optimization process, which consequently leads to some more non-dominated choices of GMDH-type models representing the trade-offs among the TE, PE, and N (complexity of network), simultaneously. The comparison graphs of these Pareto fronts also show that the three-objective results include those of the two-objective results and, thus, provide more optimal choices for the multi-objective design of GMDH-type neural networks.
APA, Harvard, Vancouver, ISO, and other styles
48

MAHAPATRA, N. K., and M. MAITI. "PRODUCTION–INVENTORY MODEL FOR A DETERIORATING ITEM WITH IMPRECISE PREPARATION TIME FOR PRODUCTION IN A FINITE TIME HORIZON." Asia-Pacific Journal of Operational Research 23, no. 02 (June 2006): 171–92. http://dx.doi.org/10.1142/s0217595906000826.

Full text
Abstract:
In this paper, realistic production-inventory models with shortages for a deteriorating item with imprecise preparation time for production (hereafter called preparation time) have been formulated, and an inventory policy is proposed for maximum profit in a finite time horizon. Here, the rate of production is constant; demand depends on selling price, marketing cost, and inventory level; and setup cost depends on preparation time. The imprecise preparation time is assumed to be represented by a fuzzy number and is first transformed to a corresponding interval number; then, following interval mathematics, the objective function for total profit over the finite time horizon is changed to the respective multi-objective functions. These functions are maximized and solved for a set of Pareto-optimal solutions with the help of the weighted sum method using the generalized reduced gradient technique. Different case studies have been made depending upon the occurrence of shortages. The models have been illustrated by numerical data. Pareto-optimal solutions for different weights are obtained and presented in tables and graphs.
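The weighted sum method the abstract mentions turns each weight vector into a single-objective problem; sweeping the weights traces out (supported) Pareto-optimal choices. A hedged sketch over a discrete candidate set (the two profit terms and their values below are illustrative, not the paper's model):

```python
# Hypothetical candidates: each tuple holds two profit components to maximize.
candidates = [(120.0, 80.0), (100.0, 100.0), (90.0, 105.0)]

def weighted_best(weights, points):
    """Maximize the weighted sum of objectives over a discrete candidate set."""
    return max(points, key=lambda p: sum(w * x for w, x in zip(weights, p)))

# Sweep the weight w between the two objectives and collect the optima.
pareto_choices = {weighted_best((w, 1.0 - w), candidates)
                  for w in (0.0, 0.2, 0.4, 0.6, 0.8, 1.0)}
```

Each distinct element of `pareto_choices` is a supported Pareto-optimal point; in the paper the inner maximization is a continuous problem solved by the generalized reduced gradient technique rather than a lookup over a finite list.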
APA, Harvard, Vancouver, ISO, and other styles
49

Subba Rao, R., V. Ravindranath, A. V. Dattatreya Rao, G. Prasad, and P. Ravi Kishore. "Wrapped Lomax Distribution: A New Circular Probability Model." International Journal of Engineering & Technology 7, no. 3.31 (August 24, 2018): 150. http://dx.doi.org/10.14419/ijet.v7i3.31.18285.

Full text
Abstract:
The Lomax distribution (Pareto Type IV) is fitted to lifetime random variables and can be applied to data from actuarial science, medical diagnosis, queuing theory, etc. Time-of-day events observed in cycles, such as hourly, daily, weekly, monthly, or yearly events, follow circular distributions. By adopting the technique of wrapping, an attempt is made to derive a new circular probability model, the Wrapped Lomax Distribution. The concept of a circular model is introduced and the wrapping strategy is given for the Lomax distribution. The PDF and CDF of the Wrapped Lomax Distribution are derived and their graphs are studied. The trigonometric moments and characteristic function of the Wrapped Lomax Distribution are obtained and their graphs are also depicted. Characteristics such as the mean, variance, skewness, kurtosis, and circular standard deviation are derived in this paper for various values of the location and scale parameters.
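The wrapping construction the abstract describes folds the density of a non-negative Lomax variable onto the circle by summing it over full turns, g(θ) = Σₖ f(θ + 2πk). A sketch under assumed parameter values and a finite truncation of the sum (both are illustrative choices, not the paper's):

```python
import math

def lomax_pdf(x, alpha=2.0, lam=1.0):
    """Lomax density on [0, inf): (alpha/lam) * (1 + x/lam)^-(alpha+1)."""
    return (alpha / lam) * (1.0 + x / lam) ** (-(alpha + 1.0))

def wrapped_lomax_pdf(theta, alpha=2.0, lam=1.0, terms=200):
    """Wrapped density on [0, 2*pi): sum the line density over each turn.
    The infinite sum is truncated; with alpha=2 the tail beyond 200 turns
    is negligible."""
    return sum(lomax_pdf(theta + 2.0 * math.pi * k, alpha, lam)
               for k in range(terms))

# Sanity check: the wrapped density should integrate to ~1 over one circle.
n = 2000
step = 2.0 * math.pi / n
mass = sum(wrapped_lomax_pdf((i + 0.5) * step) * step for i in range(n))
print(round(mass, 4))
```

The same wrapping recipe applies to any density supported on the half-line; only the `lomax_pdf` term is specific to this model.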
APA, Harvard, Vancouver, ISO, and other styles
50

Stanojevic, Milan, Mirko Vujoševic, and Bogdana Stanojevic. "Computation Results of Finding All Efficient Points in Multiobjective Combinatorial Optimization." International Journal of Computers Communications & Control 3, no. 4 (December 1, 2008): 374. http://dx.doi.org/10.15837/ijccc.2008.4.2405.

Full text
Abstract:
The number of efficient points in criteria space of multiple objective combinatorial optimization problems is considered in this paper. It is concluded that under certain assumptions, that number grows polynomially although the number of Pareto optimal solutions grows exponentially with the problem size. In order to perform experiments, an original algorithm for obtaining all efficient points was formulated and implemented for three classical multiobjective combinatorial optimization problems. Experimental results with the shortest path problem, the Steiner tree problem on graphs and the traveling salesman problem show that the number of efficient points is much lower than a polynomial upper bound.
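The distinction this abstract relies on, between efficient points in criteria space and Pareto-optimal solutions, can be made concrete: many solutions may map to the same criteria vector, so the number of distinct efficient points can stay small while the number of solutions explodes. A small illustration with invented data (not from the paper):

```python
# Hypothetical solutions mapped to (cost1, cost2) criteria vectors.
solutions = {
    "tour A": (10, 7),
    "tour B": (10, 7),    # a different solution with the same criteria vector
    "tour C": (8, 9),
    "tour D": (12, 12),   # dominated by tours A/B
}

def efficient_points(value_map):
    """Distinct non-dominated criteria vectors (minimization)."""
    points = set(value_map.values())  # collapse duplicate vectors first
    def dominated(p):
        return any(all(qi <= pi for qi, pi in zip(q, p)) and q != p
                   for q in points)
    return {p for p in points if not dominated(p)}

print(sorted(efficient_points(solutions)))  # [(8, 9), (10, 7)]
```

Here four solutions yield only two efficient points; the paper's experiments measure how this gap scales on the shortest path, Steiner tree, and traveling salesman problems.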
APA, Harvard, Vancouver, ISO, and other styles