Journal articles on the topic 'EDA solutions'

Consult the top 50 journal articles for your research on the topic 'EDA solutions.'

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Yin, Peng-Yeng, and Hsi-Li Wu. "Cyber-EDA: Estimation of Distribution Algorithms with Adaptive Memory Programming." Mathematical Problems in Engineering 2013 (2013): 1–11. http://dx.doi.org/10.1155/2013/132697.

Full text
Abstract:
The estimation of distribution algorithm (EDA) aims to explicitly model the probability distribution of the quality solutions to the underlying problem. By iteratively filtering quality solutions from competing ones, the probability model eventually approximates the distribution of global optimum solutions. In contrast to classic evolutionary algorithms (EAs), the EDA framework is flexible and able to handle inter-variable dependence, which usually imposes difficulties on classic EAs. The success of EDA relies on effective and efficient building of the probability model. This paper enhances EDA with strategies from the adaptive memory programming (AMP) domain, which has produced several improved forms of EAs under the Cyber-EA framework. The experimental results on benchmark TSP instances support our anticipation that the AMP strategies can enhance the performance of classic EDA by deriving a better approximation of the true distribution of the target solutions.
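
For readers who want to see the loop the abstract refers to, here is a minimal univariate EDA (UMDA-style) sketch on a toy OneMax problem. It is a generic illustration only, not the paper's Cyber-EDA; the adaptive memory (AMP) strategies are omitted.

    # Minimal univariate EDA: sample from a probability model, keep the best
    # solutions, re-estimate the model from them, and repeat.
    import random

    def onemax(x):                      # toy fitness: number of 1-bits
        return sum(x)

    def umda(n_bits=30, pop_size=100, n_select=30, generations=50):
        probs = [0.5] * n_bits          # initial probability model
        best = None
        for _ in range(generations):
            pop = [[1 if random.random() < p else 0 for p in probs]
                   for _ in range(pop_size)]
            pop.sort(key=onemax, reverse=True)
            elite = pop[:n_select]      # the "quality solutions" kept for modelling
            if best is None or onemax(elite[0]) > onemax(best):
                best = elite[0]
            # re-estimate the marginal probability of each bit from the elite set
            probs = [sum(ind[i] for ind in elite) / n_select for i in range(n_bits)]
        return best

    print(onemax(umda()))               # should approach 30
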
2

Nijimbere, Dieudonné, Songzheng Zhao, Haichao Liu, Bo Peng, and Aijun Zhang. "A Hybrid Metaheuristic of Integrating Estimation of Distribution Algorithm with Tabu Search for the Max-Mean Dispersion Problem." Mathematical Problems in Engineering 2019 (July 1, 2019): 1–16. http://dx.doi.org/10.1155/2019/7104702.

Full text
Abstract:
This paper presents a hybrid metaheuristic that combines estimation of distribution algorithm with tabu search (EDA-TS) for solving the max-mean dispersion problem. The proposed EDA-TS algorithm essentially alternates between an EDA procedure for search diversification and a tabu search procedure for search intensification. The designed EDA procedure maintains an elite set of high quality solutions, based on which a conditional preference probability model is built for generating new diversified solutions. The tabu search procedure uses a fast 1-flip move operator for solution improvement. Experimental results on benchmark instances with variables ranging from 500 to 5000 disclose that our EDA-TS algorithm competes favorably with state-of-the-art algorithms in the literature. Additional analysis on the parameter sensitivity and the merit of the EDA procedure as well as the search balance between intensification and diversification sheds light on the effectiveness of the algorithm.
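
The diversification/intensification alternation described above can be illustrated with a self-contained toy: an elite-set probability model proposes diversified solutions, and a simple 1-flip tabu search intensifies each one. The random pseudo-Boolean objective, elite-set size, and tabu rule below are stand-ins, not the authors' conditional preference model or move evaluation.

    import random

    n = 20
    W = [[random.uniform(-1, 1) for _ in range(n)] for _ in range(n)]
    def f(x):                                        # toy objective to maximise
        return sum(W[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

    def tabu_1flip(x, steps=30, tenure=5):
        x, tabu = x[:], {}
        best, best_val = x[:], f(x)
        for t in range(steps):
            moves = [i for i in range(n) if tabu.get(i, -1) < t]   # non-tabu flips
            i = max(moves, key=lambda j: f(x[:j] + [1 - x[j]] + x[j + 1:]))
            x[i] = 1 - x[i]
            tabu[i] = t + tenure
            if f(x) > best_val:
                best, best_val = x[:], f(x)
        return best

    elite = sorted(([random.randint(0, 1) for _ in range(n)] for _ in range(20)),
                   key=f, reverse=True)[:10]
    for _ in range(12):
        probs = [sum(e[i] for e in elite) / len(elite) for i in range(n)]  # EDA model
        cand = [1 if random.random() < p else 0 for p in probs]           # diversify
        cand = tabu_1flip(cand)                                           # intensify
        elite = sorted(elite + [cand], key=f, reverse=True)[:10]
    print(f(elite[0]))
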
3

Cáceres, Mercedes, Alvaro Lobato, Nubia J. Mendoza, Laura J. Bonales, and Valentín G. Baonza. "Local, solvation pressures and conformational changes in ethylenediamine aqueous solutions probed using Raman spectroscopy." Physical Chemistry Chemical Physics 18, no. 37 (2016): 26192–98. http://dx.doi.org/10.1039/c6cp03857c.

Full text
4

Villarroel, Josselyne A., Alex Palma-Cando, Alfredo Viloria, and Marvin Ricaurte. "Kinetic and Thermodynamic Analysis of High-Pressure CO2 Capture Using Ethylenediamine: Experimental Study and Modeling." Energies 14, no. 20 (October 19, 2021): 6822. http://dx.doi.org/10.3390/en14206822.

Full text
Abstract:
One of the alternatives to reduce CO2 emissions from industrial sources (mainly the oil and gas industry) is CO2 capture. Absorption with chemical solvents (alkanolamines in aqueous solutions) is the most widely used conventional technology for CO2 capture. Despite the competitive advantages of chemical solvents, the technological challenge in improving the absorption process is to apply alternative solvents, reducing energy demand and increasing the CO2 captured per unit of solvent mass. This work presents an experimental study related to the kinetic and thermodynamic analysis of high-pressure CO2 capture using ethylenediamine (EDA) as a chemical solvent. EDA has two amine groups that can increase the CO2 capture capacity per unit of solvent. A non-stirred experimental setup was installed and commissioned for CO2 capture testing. Tests of the solubility of CO2 in water were carried out to validate the experimental setup. CO2 capture testing was accomplished using EDA in aqueous solutions (0, 5, 10, and 20 wt.% in amine). Finally, a kinetic model involving two steps was proposed, including a rapid absorption step and a slow diffusion step. EDA accelerated the CO2 capture performance. Sudden temperature increases were observed during the initial minutes. The CO2 capture was triggered after the absorption of a minimal amount of CO2 (~10 mmol) into the liquid solutions, and could correspond to the “lean amine acid gas loading” in a typical sweetening process using alkanolamines. At equilibrium, there was a linear relationship between the CO2 loading and the EDA concentration. The CO2 capture behavior obtained adapts accurately (AAD < 1%) to the kinetic mechanism.
5

Liu, Lei, Yeguo Sun, Yihong Liu, Rachel Edita O. Roxas, and Rodolfo C. Raga. "Research and Implementation of Text Generation Based on Text Augmentation and Knowledge Understanding." Computational Intelligence and Neuroscience 2022 (September 10, 2022): 1–10. http://dx.doi.org/10.1155/2022/2988639.

Full text
Abstract:
Text generation has always been limited by the lack of corpus data required for language model (LM) training and the low quality of the generated text. Researchers have proposed some solutions, but these solutions are often complex and greatly increase the consumption of computing resources. Drawing on the current main solutions, this paper proposes a lightweight language model (EDA-BoB) based on text augmentation technology and a knowledge understanding mechanism. Experiments show that the EDA-BoB model can not only expand the scale of the training data set but also ensure data quality, while consuming few computing resources. Moreover, our model is shown to combine the contextual semantics of sentences to generate rich and accurate texts.
6

Garashchenko, Anton Vitalievich, Daria Sergeevna Lashina, Svyatoslav Aleksandrovich Nikitin, Artyom Valerievich Nikolaev, Evgeny Andreevich Prokopev, Fedor Mikhailovich Putrya, and Bulat Namsaraevich Tsyrenzhapov. "The practice and prospects of using open and proprietary software solutions in the verification route of SoC." Proceedings of the Institute for System Programming of the RAS 34, no. 5 (2022): 23–42. http://dx.doi.org/10.15514/ispras-2022-34(5)-2.

Full text
Abstract:
This article discusses the experience of, and evaluates the feasibility of, using free, open-source, and internal proprietary EDA tools in a billion-gate SoC verification flow initially based on the commercial EDA of the “Big 3”. The article suggests an approach to assessing the suitability of a particular EDA tool for a certain stage of the verification flow, based on a formal description of the stage and the requirements for the tool. It presents our internal tools implemented in the company, which are used as alternatives to commercial tools or as unique solutions. Based on the proposed approach, the existing automation tools were analysed for their applicability in the SoC verification flow.
7

Firouzi, Farshad, Bahar Farahani, Mohamed Ibrahim, and Krishnendu Chakrabarty. "Keynote Paper: From EDA to IoT eHealth: Promises, Challenges, and Solutions." IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems 37, no. 12 (December 2018): 2965–78. http://dx.doi.org/10.1109/tcad.2018.2801227.

Full text
8

Hamity, M., and R. H. Lema. "The photochemistry of electron donor–acceptor (EDA) complexes in micellar solutions. I. The stilbene–methylviologen system." Canadian Journal of Chemistry 66, no. 7 (July 1, 1988): 1552–57. http://dx.doi.org/10.1139/v88-252.

Full text
Abstract:
The association constants for the electron donor–acceptor (EDA) complexes formed between both cis-stilbene (cS) and trans-stilbene (tS) as donors and methylviologen (MV+2) as acceptor were determined in ethanol and micellar sodium dodecyl sulfate (SDS) solutions in the range of SDS concentration 0.05–0.1 M. The values obtained in micellar solutions were much higher than those in ethanol and were heavily dependent upon SDS concentration. This effect is due to an increase in the local concentration of the reactants in the micellar pseudophase. The tS fluorescence quenching by MV+2 was also studied in the same solvent media. In ethanol, the Stern–Volmer plot was found to be linear, with quenching constant (KSV) similar to the association constant determined by the absorption method. In micellar solutions, although upward curvature of the Stern–Volmer plots was observed, a reaction scheme based on static quenching via ground state EDA complex is proposed, which explains the experimental results. Irradiation in the absorption band of the EDA complexes formed by tS or cS and MV+2 was carried out in ethanol and SDS solutions, in the absence of oxygen. Only cis–trans isomerization of cS in SDS solution was observed, with a quantum yield value of Φcis = 0.012.
9

Lenin, K. "MINIMIZATION OF REAL POWER LOSS BY ENHANCED GREAT DELUGE ALGORITHM." International Journal of Research -GRANTHAALAYAH 5, no. 8 (August 31, 2017): 207–16. http://dx.doi.org/10.29121/granthaalayah.v5.i8.2017.2215.

Full text
Abstract:
This paper presents an Enhanced Great Deluge Algorithm (EDA) for solving the reactive power problem. Like other local search methods, the algorithm replaces the current solution (fresh_Config) with the best result found so far (most excellent_Config), and this process continues until the stopping condition is met. New solutions are selected from the neighbourhood of the current one, using a selection strategy that differs from other approaches. To evaluate the validity of the proposed algorithm, it has been tested on the standard IEEE 118-bus and practical 191-bus test systems and compared with other reported algorithms. Results show that the Enhanced Great Deluge Algorithm reduces real power loss while keeping voltage profiles within limits.
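
For context, the acceptance rule that the great deluge family is built on can be sketched generically as below; the toy objective, the random-step neighbourhood, and the geometric decay of the water level are illustrative choices, not the paper's power-system formulation.

    import random

    def great_deluge(cost, neighbour, x0, level_decay=0.999, iters=10000):
        x, best = x0, x0
        level = cost(x0)                     # initial "water level"
        for _ in range(iters):
            y = neighbour(x)                 # candidate from the neighbourhood
            if cost(y) <= level:             # accept anything below the water level
                x = y
                if cost(x) < cost(best):
                    best = x
            level *= level_decay             # slowly lower the level
        return best

    # toy usage: minimise a 1-D quadratic by random +/-0.5 steps
    print(great_deluge(lambda v: (v - 3) ** 2,
                       lambda v: v + random.uniform(-0.5, 0.5), 10.0))
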
10

Spyrou, Tom. "What are the challenges and solutions for parallel processing in EDA applications?" ACM SIGDA Newsletter 39, no. 2 (February 2009): 1. http://dx.doi.org/10.1145/1862888.1862889.

Full text
11

Dardouri, Mokhtar, Fayçel Ammari, and Faouzi Meganem. "Aminoalkylated Merrifield Resins Reticulated by Tris-(2-chloroethyl) Phosphate for Cadmium, Copper, and Iron (II) Extraction." International Journal of Polymer Science 2015 (2015): 1–6. http://dx.doi.org/10.1155/2015/782841.

Full text
Abstract:
We aimed to synthesize novel substituted polymers bearing functional groups to chelate heavy metals in depollution applications. Three polyamine-functionalized Merrifield resins were prepared via ethylenediamine (EDA), diethylenetriamine (DETA), and triethylenetetramine (TETA) modifications, named, respectively, MR-EDA, MR-DETA, and MR-TETA. The aminoalkylated polymers were subsequently reticulated by tris-(2-chloroethyl) phosphate (TCEP) to obtain new polymeric resins called, respectively, MR-EDA-TCEP, MR-DETA-TCEP, and MR-TETA-TCEP. The obtained resins were characterized via attenuated total reflectance Fourier transform infrared spectroscopy (ATR-FTIR), elemental analysis (EA), and thermogravimetric (TGA), differential thermal (DTA), and differential thermogravimetric (DTG) analysis. The synthesized resins were then assayed to evaluate their efficiency in extracting metallic ions such as Cd2+, Cu2+, and Fe2+ from aqueous solutions.
12

Ren, Aihong, Yuping Wang, and Fei Jia. "A Hybrid Estimation of Distribution Algorithm and Nelder-Mead Simplex Method for Solving a Class of Nonlinear Bilevel Programming Problems." Journal of Applied Mathematics 2013 (2013): 1–10. http://dx.doi.org/10.1155/2013/378568.

Full text
Abstract:
We propose a hybrid algorithm based on the estimation of distribution algorithm (EDA) and the Nelder-Mead simplex method (NM) to solve a class of nonlinear bilevel programming problems where the follower’s problem is linear with respect to the lower-level variable. Bilevel programming is an NP-hard optimization problem, for which EDA-NM is applied as a new tool aimed at obtaining global optimal solutions. In fact, EDA-NM is very easy to implement since it does not require gradient information. Moreover, the hybrid algorithm intends to produce faster and more accurate convergence. In the proposed approach, for a fixed upper-level variable, we make use of the optimality conditions of linear programming to deal with the follower’s problem and obtain its optimal solution. Further, the leader’s objective function is taken as the fitness function. Based on these schemes, the hybrid algorithm is designed by combining EDA with NM. To verify the performance of EDA-NM, simulations on some test problems are made, and the results demonstrate that the proposed algorithm has a better performance than the compared algorithms. Finally, the proposed approach is used to solve a practical example about a pollution charges problem.
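
The nested evaluation described above (solve the follower's linear problem exactly for each fixed upper-level variable, then score the leader's objective) can be sketched on a toy instance as follows; SciPy's linprog and Nelder-Mead routines are used as stand-ins for the LP optimality conditions and the NM component, and the EDA part of the hybrid is omitted for brevity.

    import numpy as np
    from scipy.optimize import linprog, minimize

    def follower_best_response(x):
        # toy follower: maximise y (i.e. minimise -y) subject to 0 <= y <= x
        res = linprog(c=[-1.0], A_ub=[[1.0]], b_ub=[x], bounds=[(0, None)])
        return res.x[0] if res.success else 0.0

    def leader_fitness(xs):
        x = xs[0]
        y = follower_best_response(x)          # lower level solved exactly
        return (x - 2.0) ** 2 + (y - 1.0) ** 2  # leader's objective as fitness

    result = minimize(leader_fitness, x0=[0.5], method="Nelder-Mead")
    print(result.x, result.fun)                # expect x close to 1.5
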
13

Abdollahzadeh, Asaad, Alan Reynolds, Mike Christie, David Corne, Glyn Williams, and Brian Davies. "Estimation of Distribution Algorithms Applied to History Matching." SPE Journal 18, no. 03 (April 17, 2013): 508–17. http://dx.doi.org/10.2118/141161-pa.

Full text
Abstract:
Summary The topic of automatically history-matched reservoir models has seen much research activity in recent years. History matching is an example of an inverse problem, and there is significant active research on inverse problems in many other scientific and engineering areas. While many techniques from other fields, such as genetic algorithms, evolutionary strategies, differential evolution, particle swarm optimization, and the ensemble Kalman filter have been tried in the oil industry, more recent and effective ideas have yet to be tested. One of these relatively untested ideas is a class of algorithms known as estimation of distribution algorithms (EDAs). EDAs are population-based algorithms that use probability models to estimate the probability distribution of promising solutions, and then to generate new candidate solutions. EDAs have been shown to be very efficient in very complex high-dimensional problems. An example of a state-of-the-art EDA is the Bayesian optimization algorithm (BOA), which is a multivariate EDA employing Bayesian networks for modeling the relationships between good solutions. The use of a Bayesian network leads to relatively fast convergence as well as high diversity in the matched models. Given the relatively limited number of reservoir simulations used in history matching, EDA-BOA offers the promise of high-quality history matches with a fast convergence rate. In this paper, we introduce EDAs and describe BOA in detail. We show results of the EDA-BOA algorithm on two history-matching problems. First, we tune the algorithm, demonstrate convergence speed, and search diversity on the PUNQ-S3 synthetic case. Second, we apply the algorithm to a real North Sea turbidite field with multiple wells. In both examples, we show improvements in performance over traditional population-based algorithms.
14

Hamity, M., and R. H. Lema. "The photochemistry of EDA complexes in micellar solutions. II. The stilbene–methylviologen system in the presence of oxygen." Canadian Journal of Chemistry 69, no. 1 (January 1, 1991): 146–50. http://dx.doi.org/10.1139/v91-023.

Full text
Abstract:
The photolyses of electron donor–acceptor (EDA) complexes formed between both cis-stilbene (cS) and trans-stilbene (tS) as donors and methylviologen (MV+2) as acceptor have been carried out in homogeneous ethanolic and in micellar sodium dodecyl sulfate (SDS) solutions, in the presence of oxygen. The stilbene loss quantum yields (Φ−S) have been determined in both media. Quantum yields were observed to be dependent upon methylviologen concentration and the acidity of the media. A reaction scheme is proposed, which accounts for the experimental results and can be related to the photochemistry of MV+2 and its cation radical MV•+. Key words: EDA complexes, micelles, methyl viologen, stilbene.
15

Geng, Shu Ya, Li Min Dong, Chen Wang, and Tong Xiang Liang. "Surface Functionalization of Ordered Mesoporous Carbon by Immobilization of Diamine for Cobalt-Ion Adsorption." Key Engineering Materials 602-603 (March 2014): 300–303. http://dx.doi.org/10.4028/www.scientific.net/kem.602-603.300.

Full text
Abstract:
Highly ordered mesoporous carbon (CMK-3) was fabricated for the adsorption of cobalt from aqueous solutions. With a high surface area of 1112.7 m2/g and a pore size of 17.2 nm, its abundant mesopores provide channels for liquid transport. In order to improve the adsorption properties, CMK-3 was modified by hydroxylation and amination. Fourier transform infrared (FTIR) spectroscopy showed that the amino group was successfully grafted onto the highly ordered mesoporous CMK-3. The functionalized ordered mesoporous carbon (CMK-3-EDA), CMK-3, and CMK-3-OX were used as adsorbents for the adsorption of Co(II) from aqueous solution. The results showed that CMK-3-EDA was more than twice as effective in adsorbing Co(II) as CMK-3, indicating that CMK-3-EDA has great potential for the adsorption of Co(II).
16

Urbán Rivero, Luis Eduardo, Jonás Velasco, and Javier Ramírez Rodríguez. "A Simple Greedy Heuristic for Site Specific Management Zone Problem." Axioms 11, no. 7 (June 29, 2022): 318. http://dx.doi.org/10.3390/axioms11070318.

Full text
Abstract:
In agriculture, soil properties influence the productivity and quality of crops. The farmer expects that, in a specific area of the land, the physicochemical characteristics of the soil will be homogeneous, so that the selected crop attains the desired quality and the use of fertilizers is minimized. There are three approaches in the state of the art to determining the correct delimitation of the land. The first one (k-means and fuzzy k-means) is impractical for current agricultural technology. The second approach is based on integer linear programming and a pre-processing step, and it limits the shapes of the delimited zones to rectangles. The third approach extends the solution search space and generates orthogonal regions using Estimation of Distribution Algorithms (EDA). In this work, we generate orthogonal regions with a different approach from the EDA: a greedy construction heuristic. Our heuristic produces feasible solutions with a reasonable running time compared with the running times of EDA.
17

Li, Yannan, Jun Cheng, Leiqing Hu, Niu Liu, Junhu Zhou, and Kefa Cen. "Regulating crystal structures of EDA-carbamates in solid–liquid phase-changing CO2 capture solutions." Fuel 252 (September 2019): 47–54. http://dx.doi.org/10.1016/j.fuel.2019.04.051.

Full text
18

Cheriet, Abdelhakim, Foudil Cherif, and Abdelmalik Taleb-Ahmed. "Fast Solutions Enhancing using a Copula-based EDA and SVM for many-objective problems." IFAC-PapersOnLine 49, no. 12 (2016): 781–86. http://dx.doi.org/10.1016/j.ifacol.2016.07.869.

Full text
19

Tajgardan, Mahjoubeh, Habib Izadkhah, and Shahriar Lotfi. "Software Systems Clustering Using Estimation of Distribution Approach." Journal of Applied Computer Science Methods 8, no. 2 (December 1, 2016): 99–113. http://dx.doi.org/10.1515/jacsm-2016-0007.

Full text
Abstract:
Software clustering is usually used for program understanding. Since software clustering is an NP-complete problem, a number of Genetic Algorithms (GAs) have been proposed for solving it. In the literature, there are two well-known GAs for software clustering, namely Bunch and DAGC, which use genetic operators such as crossover and mutation to better search the solution space and generate better solutions during the evolutionary process. The major drawbacks of these operators are (1) the difficulty of defining the operators, (2) the difficulty of determining their probability rates, and (3) the lack of a guarantee that building blocks are maintained. Estimation of Distribution (EDA) based approaches, by removing the crossover and mutation operators and maintaining building blocks, can be used to overcome these problems. This approach builds probabilistic models from individuals to generate new populations during the evolutionary process, aiming to achieve more success in solving the problems. The aim of this paper is to recast EDA for the software clustering problem, which can overcome the existing genetic operators’ limitations. To achieve this aim, we propose a new probability distribution function and a new EDA-based algorithm for software clustering. To the best of the authors’ knowledge, EDA has not previously been investigated for the software clustering problem. The proposed EDA has been compared with two well-known genetic algorithms on twelve benchmarks. Experimental results show that the proposed approach provides more accurate results, improves the speed of convergence, and provides better stability when compared against existing genetic algorithms such as Bunch and DAGC.
20

Vasile, Floriana, Anna Vizziello, Natascia Brondino, and Pietro Savazzi. "Stress State Classification Based on Deep Neural Network and Electrodermal Activity Modeling." Sensors 23, no. 5 (February 23, 2023): 2504. http://dx.doi.org/10.3390/s23052504.

Full text
Abstract:
Electrodermal Activity (EDA) has become of great interest in the last several decades, due to the advent of new devices that allow for recording a lot of psychophysiological data for remotely monitoring patients’ health. In this work, a novel method of analyzing EDA signals is proposed with the ultimate goal of helping caregivers assess the emotional states of autistic people, such as stress and frustration, which could cause aggression onset. Since many autistic people are non-verbal or suffer from alexithymia, the development of a method able to detect and measure these arousal states could be useful to aid with predicting imminent aggression. Therefore, the main objective of this paper is to classify their emotional states to prevent these crises with proper actions. Several studies were conducted to classify EDA signals, usually employing learning methods, where data augmentation was often performed to countervail the lack of extensive datasets. Differently, in this work, we use a model to generate synthetic data that are employed to train a deep neural network for EDA signal classification. This method is automatic and does not require a separate step for features extraction, as in EDA classification solutions based on machine learning. The network is first trained with synthetic data and then tested on another set of synthetic data, as well as on experimental sequences. In the first case, an accuracy of 96% is reached, which becomes 84% in the second case, thus demonstrating the feasibility of the proposed approach and its high performance.
21

Santana, Roberto. "Estimation of Distribution Algorithms with Kikuchi Approximations." Evolutionary Computation 13, no. 1 (March 2005): 67–97. http://dx.doi.org/10.1162/1063656053583496.

Full text
Abstract:
The question of finding feasible ways for estimating probability distributions is one of the main challenges for Estimation of Distribution Algorithms (EDAs). To estimate the distribution of the selected solutions, EDAs use factorizations constructed according to graphical models. The class of factorizations that can be obtained from these probability models is highly constrained. Expanding the class of factorizations that could be employed for probability approximation is a necessary step for the conception of more robust EDAs. In this paper we introduce a method for learning a more general class of probability factorizations. The method combines a reformulation of a probability approximation procedure known in statistical physics as the Kikuchi approximation of energy, with a novel approach for finding graph decompositions. We present the Markov Network Estimation of Distribution Algorithm (MN-EDA), an EDA that uses Kikuchi approximations to estimate the distribution, and Gibbs Sampling (GS) to generate new points. A systematic empirical evaluation of MN-EDA is done in comparison with different Bayesian network based EDAs. From our experiments we conclude that the algorithm can outperform other EDAs that use traditional methods of probability approximation in the optimization of functions with strong interactions among their variables.
22

Hauschild, M. W., M. Pelikan, K. Sastry, and D. E. Goldberg. "Using Previous Models to Bias Structural Learning in the Hierarchical BOA." Evolutionary Computation 20, no. 1 (March 2012): 135–60. http://dx.doi.org/10.1162/evco_a_00056.

Full text
Abstract:
Estimation of distribution algorithms (EDAs) are stochastic optimization techniques that explore the space of potential solutions by building and sampling explicit probabilistic models of promising candidate solutions. While the primary goal of applying EDAs is to discover the global optimum or at least its accurate approximation, besides this, any EDA provides us with a sequence of probabilistic models, which in most cases hold a great deal of information about the problem. Although using problem-specific knowledge has been shown to significantly improve performance of EDAs and other evolutionary algorithms, this readily available source of problem-specific information has been practically ignored by the EDA community. This paper takes the first step toward the use of probabilistic models obtained by EDAs to speed up the solution of similar problems in the future. More specifically, we propose two approaches to biasing model building in the hierarchical Bayesian optimization algorithm (hBOA) based on knowledge automatically learned from previous hBOA runs on similar problems. We show that the proposed methods lead to substantial speedups and argue that the methods should work well in other applications that require solving a large number of problems with similar structure.
23

Chen, Wei, Zhongfei Li, and Jinchao Guo. "A VNS-EDA Algorithm-Based Feature Selection for Credit Risk Classification." Mathematical Problems in Engineering 2020 (April 27, 2020): 1–14. http://dx.doi.org/10.1155/2020/4515480.

Full text
Abstract:
Many quantitative credit scoring models have been developed for credit risk assessment. Irrelevant and redundant features may deteriorate the performance of credit risk classification. Feature selection with metaheuristic techniques can be applied to excavate the most significant features. However, metaheuristic techniques suffer from various issues such as being trapped in local optimum and premature convergence. Therefore, in this article, a hybrid variable neighborhood search and estimation of distribution technique with the elitist population strategy is proposed to identify the optimal feature subset. Variable neighborhood search with the elitist population strategy is used to direct its local searching in order to optimize the ergodicity, avoid premature convergence, and jump out of the local optimum in the searching process. The probabilistic model attempts to capture the probability distribution of the promising solutions which are biased towards the global optimum. The proposed technique has been tested on both publicly available credit datasets and a real-world credit dataset in China. Experimental analysis demonstrates that it outperforms existing techniques in large-scale credit datasets with high dimensionality, making it well suited for feature selection in credit risk classification.
24

Davis Cross, Mai’a K. "The European Defence Agency and the Member States: Public and Hidden Transcripts." European Foreign Affairs Review 20, Special Issue (July 1, 2015): 83–102. http://dx.doi.org/10.54648/eerr2015026.

Full text
Abstract:
The European Defence Agency (EDA) was founded in 2004 with the aim of improving the EU’s defence capabilities through promoting collaboration, common initiatives, and innovative solutions to the EU’s security needs. This article examines the nature of the relationship between European Union Member States and the EDA a decade after its founding. The agency has solidified a clear body of norms that it seeks Member States to implement. To a surprising extent, Member States have publicly embraced these norms as necessary for the future viability of European security. But at the same time they resist implementing these norms in certain fundamental ways. Building upon the framework article of this special issue, the author applies the concepts of ‘public’ and ‘hidden’ transcripts to shed light on how Member States simultaneously embrace and resist norms in a climate of supranational pressure.
25

Hur, Do Haeng, Myung Sik Choi, Eun Hee Lee, and Uh Chul Kim. "Copper dissolution and electrochemical behavior in EDTA- and EDA-based solutions for steam generator chemical cleaning." Nuclear Engineering and Design 224, no. 2 (September 2003): 207–12. http://dx.doi.org/10.1016/s0029-5493(03)00100-6.

Full text
26

Robles, V., P. Larrañaga, and C. Bielza. "Estimation of Distribution Algorithms as Logistic Regression Regularizers of Microarray Classifiers." Methods of Information in Medicine 48, no. 03 (2009): 236–41. http://dx.doi.org/10.3414/me9223.

Full text
Abstract:
Objectives: The “large k (genes), small N (samples)” phenomenon complicates the problem of microarray classification with logistic regression. The indeterminacy of the maximum likelihood solutions, multicollinearity of predictor variables and data over-fitting cause unstable parameter estimates. Moreover, computational problems arise due to the large number of predictor (gene) variables. Regularized logistic regression excels as a solution. However, the difficulties found here involve an objective function that is hard to optimize from a mathematical viewpoint and that requires careful tuning of the regularization parameters. Methods: Those difficulties are tackled by introducing a new way of regularizing the logistic regression. Estimation of distribution algorithms (EDAs), a kind of evolutionary algorithm, emerge as natural regularizers. Obtaining the regularized estimates of the logistic classifier amounts to maximizing the likelihood function via our EDA, without having to be penalized. Likelihood penalties add a number of difficulties to the resulting optimization problems, which vanish in our case. Simulation of new estimates during the evolutionary process of EDAs is performed in such a way that guarantees their shrinkage while maintaining the probabilistic dependence relationships learnt. The EDA process is embedded in an adapted recursive feature elimination procedure, thereby providing the genes that are the best markers for the classification. Results: The consistency with the literature and excellent classification performance achieved with our algorithm are illustrated on four microarray data sets: Breast, Colon, Leukemia and Prostate. Details on the last two data sets are available as supplementary material. Conclusions: We have introduced a novel EDA-based logistic regression regularizer. It implicitly shrinks the coefficients during the EDA evolution process while optimizing the usual likelihood function. The approach is combined with a gene subset selection procedure and automatically tunes the required parameters. Empirical results on microarray data sets provide sparse models with confirmed genes that perform better in classification than other competing regularized methods.
27

Gnilenko, Alexey. "HARDWARE IMPLEMENTATION DESIGN OF A SPIKING NEURON." System technologies 1, no. 132 (March 1, 2021): 116–23. http://dx.doi.org/10.34185/1562-9945-1-132-2021-10.

Full text
Abstract:
The hardware implementation of an artificial neuron is a key problem in the design of neuromorphic chips, which are promising new architectural solutions for massively parallel computing. In this paper, an analog neuron circuit design is presented to be used as a building element of spiking neural networks. The design of the neuron is performed at the transistor level, based on the Leaky Integrate-and-Fire neuron model. The neuron is simulated using an EDA tool to verify the design. Signal waveforms at key nodes of the neuron are obtained and the neuron’s functionality is demonstrated.
28

Xie, Shu Tong, and Li Fang Pan. "Optimization of Machining Parameters for Parallel Turnings Using Estimation of Distribution Algorithms." Advanced Materials Research 753-755 (August 2013): 1192–95. http://dx.doi.org/10.4028/www.scientific.net/amr.753-755.1192.

Full text
Abstract:
Optimal machining parameters can lead to considerable savings in manufacturing. In this paper, to deal with the nonlinear optimization problem of machining parameters that aims to minimize the unit production cost (UC) in parallel turnings, we propose a novel optimization approach that divides this complicated problem into several sub-problems. An estimation of distribution algorithm (EDA) is then developed to search for the optimal result of each sub-problem. Computer simulations show that the proposed approach is efficient in finding optimal solutions that significantly reduce the unit production cost.
29

Batishchev, Denis, Andrey Kosarev, Ivan Vasyukov, and Andrey Zhivodernikov. "Application of the EDA- software LTspice for Computer Simulation of Transient Modes in Electrical Circuits." Известия высших учебных заведений. Электромеханика 65, no. 3 (2022): 10–19. http://dx.doi.org/10.17213/0136-3360-2022-3-10-19.

Full text
Abstract:
The purpose of this article is to evaluate the reliability of particular solutions obtained in practice for stiff systems of ordinary differential equations using the LTspice software. Testing was carried out in two directions: experimental testing of the software on known practical problems with a known experimental solution, and mathematical testing of the software solver on known mathematical problems with a known asymptotic or analytical solution. The results of calculations for test problems of circuit analysis from the set of CircuitSim90 circuits that are difficult to solve and the set of tests for stiff and non-stiff INdAM-Bari ODEs are presented. A comparative assessment of the obtained results against the test ones was made. The effectiveness of using the LTspice software for solving such problems is shown by the example of calculating a high-voltage pulsed power source for electron beam welding, with the solver parameters set to obtain reliable and accurate results of numerical simulation.
30

Molele, L. S., T. Magadzu, and A. A. Ambushe. "Removal of Pb2+ from contaminated water using modified multiwalled carbon nanotubes." Digest Journal of Nanomaterials and Biostructures 16, no. 3 (July 2021): 995–1009. http://dx.doi.org/10.15251/djnb.2021.163.995.

Full text
Abstract:
The effects of ethylenediamine (EDA)-, poly(amidoamine) (PAMAM) dendrimer- and polyvinyl alcohol (PVA)-modified multi-walled carbon nanotubes (MWCNTs) were investigated for the adsorption of Pb2+ ions in synthetic water and, subsequently, in wastewater samples. The prepared MWCNT nanocomposites were confirmed by Fourier transform infrared (FTIR) spectroscopy, X-ray diffraction (XRD), thermogravimetric analysis (TGA), transmission electron microscopy (TEM) and scanning electron microscopy (SEM). The concentrations of Pb2+ ions were measured by flame atomic absorption spectroscopy (F-AAS) and inductively coupled plasma mass spectrometry (ICP-MS). Removal of Pb2+ ions from synthetic solutions was carried out by varying parameters such as pH, initial metal ion concentration, contact time and adsorbent dosage. The maximum removal of Pb2+ ions was achievable at a pH of 6.5. The removal efficiency increased (more than 90%) with metal ion concentration and reached equilibrium at 40 mg/L Pb2+ ions in all nanocomposites. Equilibrium was reached within the first 30 minutes (at a low adsorbent dosage of 0.03 g) and remained constant in all nanocomposites. The EDA-MWCNTs nanocomposite yielded more than 90% Pb2+ ion removal from synthetic solutions as compared to both the PAMAM dendrimer and polyvinyl alcohol materials. The analysis of wastewater indicated that the concentration of Pb2+ before treatment ranged from 4.09 to 35.73 µg/L. The nanocomposite was able to remove 99% of Pb2+ from wastewater. Concentrations of Pb2+ varied from 0.200 to 0.234 µg/L after adsorption, which are below the acceptable level recommended by the World Health Organisation (WHO), i.e., 10 µg/L Pb2+.
31

Park, Heechun, Bon Woong Ku, Kyungwook Chang, Da Eun Shim, and Sung Kyu Lim. "Pseudo-3D Physical Design Flow for Monolithic 3D ICs: Comparisons and Enhancements." ACM Transactions on Design Automation of Electronic Systems 26, no. 5 (June 5, 2021): 1–25. http://dx.doi.org/10.1145/3453480.

Full text
Abstract:
Studies have shown that monolithic 3D (M3D) ICs outperform the existing through-silicon-via (TSV)-based 3D ICs in terms of power, performance, and area (PPA) metrics, primarily due to the orders of magnitude denser vertical interconnections offered by the nano-scale monolithic inter-tier vias. In order to facilitate faster industry adoption of the M3D technologies, physical design tools and methodologies are essential. Recent academic efforts in developing EDA algorithms for 3D ICs, mainly targeting placement using TSVs, are inadequate to provide commercial-quality GDS layouts. Lately, pseudo-3D approaches have been devised, which utilize commercial 2D IC EDA engines with tricks that help them operate as an efficient 3D IC CAD tool. In this article, we provide thorough discussions and fair comparisons (both qualitative and quantitative) of the state-of-the-art pseudo-3D design flows, with analysis of the limitations in each design flow and solutions to improve their PPA metrics. Moreover, we suggest a hybrid pseudo-3D design flow that achieves both benefits. Our enhancements and the inter-mixed design flow provide up to an additional 26% wirelength, 10% power consumption, and 23% power-delay-product improvement.
32

Trapani, Mariachiara, Antonino Mazzaglia, Anna Piperno, Annalaura Cordaro, Roberto Zagami, Maria Angela Castriciano, Andrea Romeo, and Luigi Monsù Scolaro. "Novel Nanohybrids Based on Supramolecular Assemblies of Meso-tetrakis-(4-sulfonatophenyl) Porphyrin J-aggregates and Amine-Functionalized Carbon Nanotubes." Nanomaterials 10, no. 4 (April 2, 2020): 669. http://dx.doi.org/10.3390/nano10040669.

Full text
Abstract:
The ability of multiwalled carbon nanotubes (MWCNTs) covalently functionalized with polyamine chains of different lengths (ethylenediamine, EDA, and tetraethylenepentamine, EPA) to induce the J-aggregation of meso-tetrakis(4-sulfonatophenyl)porphyrin (TPPS) was investigated under different experimental conditions. Under mild acidic conditions, protonated amino groups allow for the assembly by electrostatic interaction with the diacid form of TPPS, leading to hybrid nanomaterials. The presence of only one pendant amino group per chain in EDA does not lead to any aggregation, whereas EPA (with four amine groups per chain) is effective in inducing J-aggregation using different mixing protocols. These nanohybrids have been characterized through UV/Vis extinction, fluorescence emission, resonance light scattering and circular dichroism spectroscopy. Their morphology and chemical composition have been elucidated through transmission electron microscopy (TEM) and scanning transmission electron microscopy (STEM). TEM and STEM analysis evidence single or bundled MWCNTs in contact with TPPS J-aggregate nanotubes. The nanohybrids are quite stable for days, even in aqueous solutions mimicking physiological medium (NaCl 0.15 M). This property, together with their peculiar optical features in the therapeutic window of the visible spectrum, makes them potentially useful for biomedical applications.
33

Saif, Sherif M., Mohamed Dessouky, M. Watheq El-Kharashi, Hazem Abbas, and Salwa Nassar. "A Platform for Placement of Analog Integrated Circuits Using Satisfiability Modulo Theories." Journal of Circuits, Systems and Computers 25, no. 05 (February 25, 2016): 1650047. http://dx.doi.org/10.1142/s021812661650047x.

Full text
Abstract:
Satisfiability modulo theories (SMT) is an area concerned with checking the satisfiability of logical formulas over one or more theories. SMT can be well tuned to solve several of the most intriguing problems in electronic design automation (EDA). Analog placers use physical constraints to automatically generate small sections of layout. The work presented in this paper shows that SMT solvers can be used for the automation of analog placement, given some physical constraints. We propose a tool that uses Microsoft Z3 SMT solver to find valid placement solutions for the given analog blocks. Accordingly, it generates multiple layouts that fulfill some given constraints and provides a variety of alternative layouts. The user has the option to choose one of the feasible solutions. The proposed system uses the quantifier-free linear real arithmetic (QFLRA), which makes the problem decidable. The proposed system is able to generate valid placement solutions for benchmarks. For benchmarks that have many constraints and few geometries, the proposed system achieves a speedup that is 10 times faster than other recently used approaches.
34

Youngmann, Brit, Sihem Amer-Yahia, and Aurelien Personnaz. "Guided exploration of data summaries." Proceedings of the VLDB Endowment 15, no. 9 (May 2022): 1798–807. http://dx.doi.org/10.14778/3538598.3538603.

Full text
Abstract:
Data summarization is the process of producing interpretable and representative subsets of an input dataset. It is usually performed following a one-shot process with the purpose of finding the best summary. A useful summary contains k individually uniform sets that are collectively diverse enough to be representative. Uniformity addresses interpretability and diversity addresses representativity. Finding such a summary is a difficult task when data is highly diverse and large. We examine the applicability of Exploratory Data Analysis (EDA) to data summarization and formalize Eda4Sum, the problem of guided exploration of data summaries that seeks to sequentially produce connected summaries with the goal of maximizing their cumulative utility. Eda4Sum generalizes one-shot summarization. We propose to solve it with one of two approaches: (i) Top1Sum, which chooses the most useful summary at each step; (ii) RLSum, which trains a policy with Deep Reinforcement Learning that rewards an agent for finding a diverse and new collection of uniform sets at each step. We compare these approaches with one-shot summarization and top-performing EDA solutions. We run extensive experiments on three large datasets. Our results demonstrate the superiority of our approaches for summarizing very large data, and the need to provide guidance to domain experts.
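
A toy sketch of the greedy "choose the most useful next summary" policy (in the spirit of Top1Sum) is given below; the candidate generator and the utility function, which crudely trades coverage against group size, are invented stand-ins rather than the paper's uniformity and diversity measures.

    def guided_summarization(summary, candidates_of, utility, steps=10):
        for _ in range(steps):
            options = candidates_of(summary)     # connected next summaries
            if not options:
                break
            summary = max(options, key=utility)  # Top-1 greedy choice
        return summary

    # toy data: a summary is a set of groups; utility rewards covering many
    # distinct elements with small (more uniform) groups
    groups = [frozenset({1, 2}), frozenset({2, 3}), frozenset({4, 5, 6}), frozenset({7})]
    def utility(summary):
        coverage = len(set().union(*summary)) if summary else 0
        return coverage - 0.5 * sum(len(g) for g in summary) / max(len(summary), 1)
    def candidates_of(summary):
        return [summary | {g} for g in groups if g not in summary]

    final = guided_summarization(frozenset(), candidates_of, utility)
    print(sorted(sorted(g) for g in final), round(utility(final), 2))
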
35

Pérez-Rodríguez, Ricardo. "A Hybrid Estimation of Distribution Algorithm for the Quay Crane Scheduling Problem." Mathematical and Computational Applications 26, no. 3 (September 10, 2021): 64. http://dx.doi.org/10.3390/mca26030064.

Full text
Abstract:
The aim of the quay crane scheduling problem (QCSP) is to identify the best sequence of discharging and loading operations for a set of quay cranes. This problem is solved with a new hybrid estimation of distribution algorithm (EDA). The approach is proposed to tackle the drawbacks of EDAs, i.e., the lack of diversity of solutions and poor exploitation ability. The hybridization approach used in this investigation combines a distance-based ranking model and the moth-flame algorithm. The distance-based ranking model is in charge of modelling the solution-space distribution, through an exponential function, by measuring the distance between solutions; meanwhile, the moth-flame heuristic determines the offspring, with a spiral function that identifies the new locations for the new solutions. Based on the results, the proposed scheme, called QCEDA, works to enhance the performance of those other EDAs that use complex probability models. The dispersion of the results of the QCEDA scheme is lower than that of the other algorithms used in the comparison section. This means that the solutions found by the QCEDA are more concentrated around the best value than those of other algorithms, i.e., the average of the QCEDA solutions converges better to the best found value than other approaches. Finally, as a conclusion, the hybrid EDAs perform better than, or at least as well as, the so-called pure EDAs.
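
The moth-flame "spiral" move mentioned above follows the standard MFO update; a minimal continuous-space sketch is given below, detached from the paper's crane-task permutation encoding and its distance-based ranking model.

    import numpy as np

    rng = np.random.default_rng(1)

    def spiral_move(moth, flame, b=1.0):
        """Logarithmic spiral of a candidate (moth) around an elite solution (flame)."""
        d = np.abs(flame - moth)                  # per-dimension distance to the flame
        t = rng.uniform(-1.0, 1.0, size=moth.shape)
        return d * np.exp(b * t) * np.cos(2 * np.pi * t) + flame

    moth = rng.random(5)                          # current candidate
    flame = np.array([0.9, 0.1, 0.5, 0.5, 0.2])   # a good solution acting as attractor
    print(spiral_move(moth, flame))
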
36

Abidin, Zainil, and Deni Erpian. "Sistem Analisis Kerusakan Sepeda Motor Karburator dengan Metode Forward Chaining." Cakrawala Repositori IMWI 4, no. 1 (July 12, 2021): 111–20. http://dx.doi.org/10.52851/cakrawala.v4i1.68.

Full text
Abstract:
Motorcycles are a means of land transportation that is very popular, not only in this country but also in various parts of the world. In today's era of rapidly advancing technology, the authors propose an application for diagnosing motorcycle carburetor damage using the forward chaining method, an expert-system approach intended to improve the quality of motorcycle servicing, especially in repair workshops. The application is designed so that motorcycle users can easily obtain solutions to the problems they experience. It is built with Visual Basic, modelled with UML (Unified Modeling Language), and uses a Microsoft Access database for input data.
37

Bao, Lin, Xiaoyan Sun, Yang Chen, Guangyi Man, and Hui Shao. "Restricted Boltzmann Machine-Assisted Estimation of Distribution Algorithm for Complex Problems." Complexity 2018 (November 1, 2018): 1–13. http://dx.doi.org/10.1155/2018/2609014.

Full text
Abstract:
A novel algorithm, called restricted Boltzmann machine-assisted estimation of distribution algorithm, is proposed for solving computationally expensive optimization problems with discrete variables. First, the individuals are evaluated using expensive fitness functions of the complex problems, and some dominant solutions are selected to construct the surrogate model. The restricted Boltzmann machine (RBM) is built and trained with the dominant solutions to implicitly extract the distributed representative information of the decision variables in the promising subset. The visible layer’s probability of the RBM is designed as the sampling probability model of the estimation of distribution algorithm (EDA) and is updated dynamically along with the update of the dominant subsets. Second, according to the energy function of the RBM, a fitness surrogate is developed to approximate the expensive individual fitness evaluations and participates in the evolutionary process to reduce the computational cost. Finally, model management is developed to train and update the RBM model with newly dominant solutions. A comparison of the proposed algorithm with several state-of-the-art surrogate-assisted evolutionary algorithms demonstrates that the proposed algorithm effectively and efficiently solves complex optimization problems with smaller computational cost.
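
The two roles the RBM plays in such a scheme, a sampling model over candidate solutions and an energy-based fitness surrogate, can be sketched in a few lines of NumPy; the weights below are random placeholders, and the contrastive-divergence training on the dominant solutions is omitted.

    import numpy as np

    rng = np.random.default_rng(0)
    n_visible, n_hidden = 20, 8
    W = rng.normal(0, 0.1, (n_hidden, n_visible))   # weights (would be trained on elites)
    b = np.zeros(n_visible)                          # visible biases
    c = np.zeros(n_hidden)                           # hidden biases
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    def sample_candidates(seed_pop, n_new):
        """One Gibbs step from elite solutions -> new candidate solutions."""
        v = seed_pop[rng.integers(0, len(seed_pop), n_new)]
        h = (rng.random((n_new, n_hidden)) < sigmoid(v @ W.T + c)).astype(float)
        p_v = sigmoid(h @ W + b)                     # visible-layer sampling probabilities
        return (rng.random((n_new, n_visible)) < p_v).astype(float)

    def energy_surrogate(v):
        """RBM free energy of a visible vector, usable as a cheap fitness proxy."""
        return -(v @ b) - np.sum(np.logaddexp(0.0, v @ W.T + c), axis=1)

    elite = (rng.random((10, n_visible)) < 0.5).astype(float)
    candidates = sample_candidates(elite, 5)
    print(energy_surrogate(candidates))
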
38

Малышев, Н. М., and С. В. Рыбкин. "ОСОБЕННОСТЬ РАЗРАБОТКИ САПР ДЛЯ ПРОЕКТИРОВАНИЯ И ВЕРИФИКАЦИИ КОНФИГУРАЦИИ ПЛИС." NANOINDUSTRY Russia 96, no. 3s (June 15, 2020): 270–76. http://dx.doi.org/10.22184/1993-8578.2020.13.3s.270.276.

Full text
Abstract:
An end-to-end EDA (Electronic Design Automation) module has been created for designing, debugging, and verifying configurations of field-programmable gate arrays (FPGAs). Based on this module, an HDL-code syntax analyzer was built that forms a parse tree and compiles it into internal objects. In addition, the paper covers the synthesis of HDL abstractions into library components of the vendor's devices, as well as the methods used to develop the synthesizer.
39

Beniazza, Redouane, Natalia Bayo, Florian Molton, Carole Duboc, Stéphane Massip, Nathan McClenaghan, Dominique Lastécouères, and Jean-Marc Vincent. "Effective ascorbate-free and photolatent click reactions in water using a photoreducible copper(II)-ethylenediamine precatalyst." Beilstein Journal of Organic Chemistry 11 (October 21, 2015): 1950–59. http://dx.doi.org/10.3762/bjoc.11.211.

Full text
Abstract:
The search for copper catalysts able to perform effectively click reactions in water in the absence of sodium ascorbate is an active area of current research with strong potential for applications in bioconjugation. The water-soluble and photoreducible copper(II)–EDA (EDA = ethylenediamine) complex 1, which has two 4-benzoylbenzoates acting as both counterion and photosensitizer, has been synthesized and characterized by different techniques including single crystal X-ray diffraction. Highly efficient photoreduction was demonstrated when solutions of 1 in hydrogen atom donating solvents, such as THF or MeOH, were exposed to UVA radiation (350–400 nm) provided by a low pressure mercury lamp (type TLC = thin-layer chromatography, 365 nm), or by a 23 W fluorescent bulb, or by ambient/sunlight. In water, a much poorer hydrogen atom donating solvent, the photoreduction of 1 proved inefficient. Interestingly, EPR studies revealed that complex 1 could nonetheless be effectively photoreduced in water when alkynes were present in solution. The catalytic activity of 1 for click reactions involving a range of water-soluble alkynes and azides, in particular saccharides, was tested under various illumination conditions. Complex 1 was found to exhibit a photolatent character, the photogenerated copper(I) being very reactive. On irradiating aqueous reaction mixtures containing 1 mol % of 1 at 365 nm (TLC lamp) for 1 h, click reactions were shown to proceed to full conversion.
40

Pawlaczyk, Mateusz, Michał Cegłowski, Rafał Frański, Joanna Kurczewska, and Grzegorz Schroeder. "The Electrospray (ESI) and Flowing Atmosphere-Pressure Afterglow (FAPA) Mass Spectrometry Studies of Nitrophenols (Plant Growth Stimulants) Removed Using Strong Base-Functionalized Materials." Materials 14, no. 21 (October 25, 2021): 6388. http://dx.doi.org/10.3390/ma14216388.

Full text
Abstract:
The functional silica-based materials functionalized with a strong nitrogen base TBD (SiO2-TBD) deposited via a linker or with a basic poly(amidoamine) dendrimer containing multiple terminal amine groups -NH2 (SiO2-EDA) and functional polymers containing a strong phosphazene base (Polymer-Phosphazene) or another basic poly(amidoamine) dendrimer (PMVEAMA-PAMAM) were tested as sorbents dedicated to a mixture of nitrophenols (p-nitrophenol and 2-methoxy-5-nitrophenol), which are analogs of nitrophenols used in plant growth biostimulants. The adsorptive potential of the studied materials reached 0.102, 0.089, 0.140, and 0.074 g of the nitrophenols g−1, for SiO2-TBD, SiO2-EDA, polymer-phosphazene, and PMVEAMA-PAMAM, respectively. The sorptive efficiency of the analytes, i.e., their adsorption on the functional materials, the desorption from the obtained [(sorbent)H+ − nitrophenolates–] complexes, and interactions with the used soil, were monitored using mass spectrometry (MS) technique with electrospray (ESI) and flowing atmosphere-pressure afterglow (FAPA) ionizations, for the analysis of the aqueous solutions and the solids, respectively. The results showed that the adsorption/desorption progress is determined by the structures of the terminal basic domains anchored to the materials, which are connected with the strength of the proton exchange between the sorbents and nitrophenols. Moreover, the conducted comprehensive MS analyses, performed for both solid and aqueous samples, gave a broad insight into the interactions of the biostimulants and the presented functional materials.
41

Uppuluri, Srinivas, Steven E. Keinath, Donald A. Tomalia, and Petar R. Dvornic. "Rheology of Dendrimers. I. Newtonian Flow Behavior of Medium and Highly Concentrated Solutions of Polyamidoamine (PAMAM) Dendrimers in Ethylenediamine (EDA) Solvent." Macromolecules 31, no. 14 (July 1998): 4498–510. http://dx.doi.org/10.1021/ma971199b.

Full text
42

Jardak, Mohamed, and Olivier Talagrand. "Ensemble variational assimilation as a probabilistic estimator – Part 2: The fully non-linear case." Nonlinear Processes in Geophysics 25, no. 3 (August 24, 2018): 589–604. http://dx.doi.org/10.5194/npg-25-589-2018.

Full text
Abstract:
The method of ensemble variational assimilation (EnsVAR), also known as ensemble of data assimilations (EDA), is implemented in fully non-linear conditions on the Lorenz-96 chaotic 40-parameter model. In the case of strong-constraint assimilation, it requires association with the method of quasi-static variational assimilation (QSVA). It then produces ensembles which possess as much reliability and resolution as in the linear case, and its performance is at least as good as that of ensemble Kalman filter (EnKF) and particle filter (PF). On the other hand, ensembles consisting of solutions that correspond to the absolute minimum of the objective function (as identified from the minimizations without QSVA) are significantly biased. In the case of weak-constraint assimilation, EnsVAR is fully successful without need for QSVA.
APA, Harvard, Vancouver, ISO, and other styles
43

Sun, J., J. M. Garibaldi, N. Krasnogor, and Q. Zhang. "An Intelligent Multi-Restart Memetic Algorithm for Box Constrained Global Optimisation." Evolutionary Computation 21, no. 1 (March 2013): 107–47. http://dx.doi.org/10.1162/evco_a_00068.

Full text
Abstract:
In this paper, we propose a multi-restart memetic algorithm framework for box constrained global continuous optimisation. In this framework, an evolutionary algorithm (EA) and a local optimizer are employed as separate building blocks. The EA is used to explore the search space for very promising solutions (e.g., solutions in the attraction basin of the global optimum) through its exploration capability and previous EA search history, and local search is used to improve these promising solutions to local optima. An estimation of distribution algorithm (EDA) combined with a derivative-free local optimizer, called NEWUOA (M. Powell, Developments of NEWUOA for minimization without derivatives. Journal of Numerical Analysis, 28:649–664, 2008), is developed based on this framework and empirically compared with several well-known EAs on a set of 40 commonly used test functions. The main components of the specific algorithm include: (1) an adaptive multivariate probability model, (2) a multiple sampling strategy, (3) decoupling of the hybridisation strategy, and (4) a restart mechanism. The adaptive multivariate probability model and the multiple sampling strategy are designed to enhance the exploration capability. The restart mechanism attempts to make the search escape from local optima by resorting to previous search history. Comparison results show that the algorithm is comparable with the best known EAs, including the winner of the 2005 IEEE Congress on Evolutionary Computation (CEC2005), and significantly better than the others in terms of both solution quality and computational cost.
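To make the EDA-plus-local-search idea concrete, the following minimal Python sketch alternates between sampling from a multivariate Gaussian model, re-estimating the model from the elite samples, and refining the best sample with a derivative-free local optimizer. SciPy's Powell method is used only as a stand-in for NEWUOA, and the population size, selection ratio, and number of generations are illustrative assumptions rather than the paper's settings.

import numpy as np
from scipy.optimize import minimize

def eda_with_local_search(f, bounds, pop_size=60, elite_frac=0.3, generations=40, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    dim = lo.size
    mean, cov = (lo + hi) / 2.0, np.diag(((hi - lo) / 4.0) ** 2)
    best_x, best_f = None, np.inf
    for _ in range(generations):
        # Sample candidates from the current probability model and clip to the box.
        pop = rng.multivariate_normal(mean, cov, size=pop_size).clip(lo, hi)
        scores = np.apply_along_axis(f, 1, pop)
        elite = pop[np.argsort(scores)[: int(elite_frac * pop_size)]]
        # Refine the most promising sample with a derivative-free local optimizer.
        res = minimize(f, elite[0], method="Powell", bounds=list(zip(lo, hi)))
        if res.fun < best_f:
            best_x, best_f = res.x, res.fun
        # Re-estimate the model from the elite set; the small diagonal term keeps it from collapsing.
        mean, cov = elite.mean(axis=0), np.cov(elite, rowvar=False) + 1e-8 * np.eye(dim)
    return best_x, best_f

# Example on the sphere function in a [-5, 5]^5 box.
x, fx = eda_with_local_search(lambda v: float(np.sum(v ** 2)), [(-5, 5)] * 5)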
APA, Harvard, Vancouver, ISO, and other styles
44

Kavoosi, Masoud, Maxim A. Dulebenets, Olumide Abioye, Junayed Pasha, Oluwatosin Theophilus, Hui Wang, Raphael Kampmann, and Marko Mikijeljević. "Berth scheduling at marine container terminals." Maritime Business Review 5, no. 1 (November 18, 2019): 30–66. http://dx.doi.org/10.1108/mabr-08-2019-0032.

Full text
Abstract:
Purpose: Marine transportation has been faced with an increasing demand for containerized cargo during the past decade. Marine container terminals (MCTs), as the facilities connecting seaborne and inland transportation, are expected to handle the increasing amount of containers delivered by vessels. Berth scheduling plays an important role in the total throughput of MCTs as well as the overall effectiveness of MCT operations. This study aims to propose a novel island-based metaheuristic algorithm to solve the berth scheduling problem and minimize the total cost of serving the arriving vessels at the MCT.
Design/methodology/approach: A universal island-based metaheuristic algorithm (UIMA) was proposed in this study, aiming to solve the spatially constrained berth scheduling problem. The UIMA population was divided into four sub-populations (i.e., islands). Unlike canonical island-based algorithms that execute the same metaheuristic on each island, four different population-based metaheuristics are adopted within the developed algorithm to search the islands: an evolutionary algorithm (EA), particle swarm optimization (PSO), an estimation of distribution algorithm (EDA), and differential evolution (DE). The adopted population-based metaheuristics rely on different operators, which facilitate the search for superior solutions on the UIMA islands.
Findings: The conducted numerical experiments demonstrated that the developed UIMA algorithm returned near-optimal solutions for the small-size problem instances. For the large-size problem instances, UIMA was found to be superior to the EA, PSO, EDA, and DE algorithms executed in isolation, in terms of the objective function values obtained at termination. Furthermore, the developed UIMA algorithm outperformed various single-solution-based metaheuristics (including variable neighborhood search, tabu search, and simulated annealing) in terms of solution quality. The maximum UIMA computational time did not exceed 306 s.
Research limitations/implications: Some of the previous berth scheduling studies modeled uncertain vessel arrival and/or handling times, while this study assumed the vessel arrival and handling times to be deterministic.
Practical implications: The developed UIMA algorithm can be used by MCT operators as an efficient decision support tool and assist with a cost-effective design of berth schedules within an acceptable computational time.
Originality/value: A novel island-based metaheuristic algorithm is designed to solve the spatially constrained berth scheduling problem. The proposed island-based algorithm adopts several types of metaheuristics to cover different areas of the search space, and the considered metaheuristics rely on different operators. Such a feature is expected to facilitate the search for superior solutions.
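The island principle described above can be sketched generically: several sub-populations are advanced by different operators and periodically exchange their best members. The toy Python sketch below uses only two illustrative operators (a Gaussian-mutation step and a DE-style step) as stand-ins for the paper's EA/PSO/EDA/DE islands; the migration interval, island size, and search bounds are assumptions, and nothing here is specific to berth scheduling or to the UIMA implementation.

import numpy as np

def gaussian_island_step(pop, scores, f, rng):
    # Mutate each member around the island's current best solution.
    best = pop[np.argmin(scores)]
    cand = best + rng.normal(0.0, 0.3, size=pop.shape)
    cand_scores = np.apply_along_axis(f, 1, cand)
    better = cand_scores < scores
    pop[better], scores[better] = cand[better], cand_scores[better]
    return pop, scores

def de_island_step(pop, scores, f, rng):
    # Simplified DE-style mutation with greedy replacement.
    idx = rng.permutation(len(pop))
    a, b, c = pop[idx], pop[np.roll(idx, 1)], pop[np.roll(idx, 2)]
    cand = a + 0.5 * (b - c)
    cand_scores = np.apply_along_axis(f, 1, cand)
    better = cand_scores < scores
    pop[better], scores[better] = cand[better], cand_scores[better]
    return pop, scores

def island_search(f, dim, steps=200, island_size=20, migrate_every=25, seed=0):
    rng = np.random.default_rng(seed)
    islands = [rng.uniform(-5, 5, (island_size, dim)) for _ in range(2)]
    step_fns = [gaussian_island_step, de_island_step]
    scores = [np.apply_along_axis(f, 1, p) for p in islands]
    for t in range(steps):
        for i, step in enumerate(step_fns):
            islands[i], scores[i] = step(islands[i], scores[i], f, rng)
        if t % migrate_every == 0:
            # Migration: each island's best member replaces its neighbour's worst member.
            for i in range(len(islands)):
                j = (i + 1) % len(islands)
                worst, best = np.argmax(scores[j]), np.argmin(scores[i])
                islands[j][worst], scores[j][worst] = islands[i][best], scores[i][best]
    i = int(np.argmin([s.min() for s in scores]))
    return islands[i][np.argmin(scores[i])], float(scores[i].min())

# Example on the sphere function.
x, fx = island_search(lambda v: float(np.sum(v ** 2)), dim=5)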
APA, Harvard, Vancouver, ISO, and other styles
45

Echegoyen, Carlos, Alexander Mendiburu, Roberto Santana, and Jose A. Lozano. "On the Taxonomy of Optimization Problems Under Estimation of Distribution Algorithms." Evolutionary Computation 21, no. 3 (September 2013): 471–95. http://dx.doi.org/10.1162/evco_a_00095.

Full text
Abstract:
Understanding the relationship between a search algorithm and the space of problems is a fundamental issue in the optimization field. In this paper, we lay the foundations to elaborate taxonomies of problems under estimation of distribution algorithms (EDAs). By using an infinite population model and assuming that the selection operator is based on the rank of the solutions, we group optimization problems according to the behavior of the EDA. Through the definition of an equivalence relation between functions, it is possible to partition the space of problems into equivalence classes in which the algorithm has the same behavior. We show that only the probabilistic model is able to generate different partitions of the set of possible problems and hence it predetermines the number of different behaviors that the algorithm can exhibit. As a natural consequence of our definitions, all the objective functions are in the same equivalence class when the algorithm does not impose restrictions on the probabilistic model. The taxonomy of problems, which is also valid for finite populations, is studied in depth for a simple EDA that considers independence among the variables of the problem. We provide the necessary and sufficient condition to decide the equivalence between functions and then develop the operators to describe and count the members of a class. In addition, we show the intrinsic relation between univariate EDAs and the neighborhood system induced by the Hamming distance by proving that all the functions in the same class have the same number of local optima and that they are in the same ranking positions. Finally, we carry out numerical simulations in order to analyze the different behaviors that the algorithm can exhibit for the functions defined over the search space [Formula: see text].
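As a concrete reference point for the kind of univariate EDA analysed above, the following minimal Python sketch implements a standard UMDA on binary strings with truncation (rank-based) selection. It shows only the algorithmic object of study, not the paper's equivalence-class analysis, and the population size, truncation ratio, and probability clipping are illustrative assumptions.

import numpy as np

def umda(f, n_bits, pop_size=100, truncation=0.5, generations=50, seed=0):
    rng = np.random.default_rng(seed)
    probs = np.full(n_bits, 0.5)                 # univariate model: one marginal per bit
    for _ in range(generations):
        pop = (rng.random((pop_size, n_bits)) < probs).astype(int)
        scores = np.apply_along_axis(f, 1, pop)
        order = np.argsort(-scores)              # rank-based (truncation) selection, maximizing f
        selected = pop[order[: int(truncation * pop_size)]]
        probs = selected.mean(axis=0).clip(0.05, 0.95)   # re-estimate marginals, keep them off 0 and 1
    best = pop[np.argmax(scores)]
    return best, f(best)

# Example: OneMax, whose optimum is the all-ones string.
best, value = umda(lambda bits: int(bits.sum()), n_bits=20)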
APA, Harvard, Vancouver, ISO, and other styles
46

Abdollahzadeh, Asaad, Alan Reynolds, Mike Christie, David Corne, Brian Davies, and Glyn Williams. "Bayesian Optimization Algorithm Applied to Uncertainty Quantification." SPE Journal 17, no. 03 (August 23, 2012): 865–73. http://dx.doi.org/10.2118/143290-pa.

Full text
Abstract:
Prudent decision making in subsurface assets requires reservoir uncertainty quantification. In a typical uncertainty-quantification study, reservoir models must be updated using the observed response from the reservoir by a process known as history matching. This involves solving an inverse problem: finding reservoir models that produce, under simulation, a response similar to that of the real reservoir. However, this requires multiple expensive multiphase-flow simulations. Thus, uncertainty-quantification studies employ optimization techniques to find acceptable models to be used in prediction. Different optimization algorithms and search strategies are presented in the literature, but they are generally unsatisfactory because of slow convergence to the optimal regions of the global search space and, more importantly, failure to find multiple acceptable reservoir models. In this context, a new approach is offered by estimation-of-distribution algorithms (EDAs). EDAs are population-based algorithms that use models to estimate the probability distribution of promising solutions and then generate new candidate solutions. This paper explores the application of EDAs, including univariate and multivariate models. We discuss two histogram-based univariate models and one multivariate model, the Bayesian optimization algorithm (BOA), which employs Bayesian networks for modeling. By considering possible interactions between variables and exploiting explicitly stored knowledge of such interactions, EDAs can accelerate the search process while preserving search diversity. Unlike most existing approaches applied to uncertainty quantification, the Bayesian network allows the BOA to build solutions using flexible rules learned from the models obtained, rather than fixed rules, leading to better solutions and improved convergence. The BOA is naturally suited to finding good solutions in complex high-dimensional spaces, such as those typical in reservoir-uncertainty quantification. We demonstrate the effectiveness of EDAs by applying them to the well-known synthetic PUNQ-S3 case with multiple wells. This allows us to verify the methodology in a well-controlled case. Results show better estimation of uncertainty when compared with some other traditional population-based algorithms.
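The simplest ingredient mentioned above, a histogram-based univariate EDA, can be sketched as follows: for each uncertain parameter, a binned distribution is estimated from the lowest-misfit models and new candidates are sampled from it. This is a hedged illustration of that idea only, not the paper's BOA (which additionally learns a Bayesian network over the parameters); the toy misfit function, bounds, bin count, and population settings are assumptions.

import numpy as np

def histogram_eda(misfit, bounds, pop_size=80, elite_frac=0.25, n_bins=10,
                  generations=30, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    dim = lo.size
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    best_x, best_m = None, np.inf
    for _ in range(generations):
        m = np.apply_along_axis(misfit, 1, pop)
        if m.min() < best_m:
            best_x, best_m = pop[np.argmin(m)].copy(), float(m.min())
        elite = pop[np.argsort(m)[: int(elite_frac * pop_size)]]
        new_pop = np.empty_like(pop)
        for d in range(dim):
            # Histogram of the elite values of parameter d, smoothed so no bin dies out.
            counts, edges = np.histogram(elite[:, d], bins=n_bins, range=(lo[d], hi[d]))
            p = (counts + 1.0) / (counts + 1.0).sum()
            bins = rng.choice(n_bins, size=pop_size, p=p)
            new_pop[:, d] = rng.uniform(edges[bins], edges[bins + 1])
        pop = new_pop
    return best_x, best_m

# Example: recover two toy "reservoir" parameters from a quadratic misfit.
x, m = histogram_eda(lambda v: float((v[0] - 0.3) ** 2 + (v[1] - 0.7) ** 2),
                     bounds=[(0, 1), (0, 1)])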
APA, Harvard, Vancouver, ISO, and other styles
47

Sauvage, Laurent, Maxime Nassar, Sylvain Guilley, Florent Flament, Jean-Luc Danger, and Yves Mathieu. "Exploiting Dual-Output Programmable Blocks to Balance Secure Dual-Rail Logics." International Journal of Reconfigurable Computing 2010 (2010): 1–12. http://dx.doi.org/10.1155/2010/375245.

Full text
Abstract:
FPGA design of side-channel analysis countermeasures using unmasked dual-rail with precharge logic appears to be a great challenge. Indeed, the robustness of such a solution relies on careful differential placement and routing, whereas neither FPGA layouts nor FPGA EDA tools are developed for such purposes. However, assessing the security level that can be achieved with them is an important issue, as it is directly related to the suitability of using commercial FPGAs instead of proprietary custom FPGAs for this kind of protection. In this article, we give experimental evidence that differential placement and routing of an FPGA implementation can be done with a granularity fine enough to improve the security gain. However, so far, this gain has turned out to be lower for FPGAs than for ASICs. The solutions demonstrated in this article exploit the dual-output blocks of modern FPGAs to achieve a better balance of dual-rail interconnections. However, we expect that an in-depth analysis of the power consumption of routing resources could still help reduce the differential leakage of the interconnect.
APA, Harvard, Vancouver, ISO, and other styles
48

Sinthong, Phanwadee, Dhaval Patel, Nianjun Zhou, Shrey Shrivastava, Arun Iyengar, and Anuradha Bhamidipaty. "DQDF." Proceedings of the VLDB Endowment 15, no. 4 (December 2021): 949–57. http://dx.doi.org/10.14778/3503585.3503602.

Full text
Abstract:
Data quality assessment is an essential part of any data analysis workflow, including machine learning. The process is time-consuming, as it involves multiple independent data quality checks that are performed iteratively, at scale, on data that evolves through exploratory data analysis (EDA). Existing solutions that provide computational optimizations for data quality assessment often separate the data structure from its data quality, which then requires users to explicitly maintain state-like information. They also demand a certain level of distributed-systems knowledge from data analysts, who should instead be focusing on analyzing the data, in order to achieve high-level pipeline optimizations. We therefore propose data-quality-aware dataframes, a data quality management system embedded in a data analyst's familiar data structure, such as a Python dataframe. The framework automatically detects changes in a dataset's metadata and exploits the context of each quality check to provide efficient data quality assessment on ever-changing data. Our experiments demonstrate that this approach can reduce the overall data quality evaluation runtime by 40-80% in both local and distributed setups, with less than a 10% increase in memory usage.
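The caching idea behind such a system can be illustrated in a few lines of Python: quality-check results are stored against a fingerprint of the dataframe, so unchanged data is never re-validated. This is a minimal sketch of the general mechanism, not the DQDF implementation; the class name, the content-hash fingerprint, and the example check are assumptions for illustration.

import pandas as pd
from pandas.util import hash_pandas_object

class QualityAwareFrame:
    def __init__(self, df: pd.DataFrame):
        self.df = df
        self._cache = {}          # (check name, data fingerprint) -> cached result

    def _fingerprint(self) -> int:
        # Cheap content hash; a real system might instead track finer-grained metadata
        # (per-column versions, appended row ranges, etc.) to avoid rehashing everything.
        return int(hash_pandas_object(self.df, index=True).sum())

    def check(self, name, fn):
        key = (name, self._fingerprint())
        if key not in self._cache:            # only recompute when the data has changed
            self._cache[key] = fn(self.df)
        return self._cache[key]

# Example: a missing-value check is evaluated once, then served from the cache
# until the underlying data actually changes.
qdf = QualityAwareFrame(pd.DataFrame({"a": [1, None, 3]}))
missing = qdf.check("missing_ratio", lambda d: d.isna().mean().to_dict())
missing_again = qdf.check("missing_ratio", lambda d: d.isna().mean().to_dict())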
APA, Harvard, Vancouver, ISO, and other styles
49

Sandic, Zvjezdana, and Aleksandra Nastasovic. "Functionalized macroporous copolymer of glycidyl methacrylate: The type of ligand and porosity parameters influence on Cu(II) ion sorption from aqueous solutions." Chemical Industry 63, no. 3 (2009): 269–73. http://dx.doi.org/10.2298/hemind0903269s.

Full text
Abstract:
The removal of heavy metals from hydrometallurgical and other industrial wastewaters, their safe storage, and their possible recovery from wastewater streams are among the major ecological problems of modern society. Conventional methods, such as precipitation, adsorption and biosorption, electrowinning, membrane separation, solvent extraction and ion exchange, are often ineffective, expensive, and can generate secondary pollution. On the other hand, chelating polymers, consisting of a crosslinked copolymer as a solid support and a functional group (ligand), are capable of selectively loading different metal ions from aqueous solutions. In a relatively simple process, the chelating copolymer is contacted with the contaminated solution, loaded with metal ions, and stripped with an appropriate eluent. Important properties of chelating polymers are high capacity, high selectivity and fast kinetics, combined with mechanical stability and chemical inertness. Macroporous hydrophilic copolymers of glycidyl methacrylate and ethylene glycol dimethacrylate modified with different amines show outstanding efficiency and selectivity in the sorption of precious and heavy metals from aqueous solutions. In this study, poly(GMA-co-EGDMA) copolymers were synthesized with different porosity parameters and functionalized in reactions with ethylene diamine (EDA), diethylene triamine (DETA) and triethylene tetramine (TETA). Under non-competitive conditions, in batch experiments at room temperature, the rate of sorption of Cu(II) ions from aqueous solutions, and the influence of pH on it, was determined for four samples of amino-functionalized poly(GMA-co-EGDMA). The sorption of Cu(II) on the amino-functionalized samples was found to be very rapid. The sorption half time, t1/2, defined as the time required to reach 50% of the total sorption capacity, was between 1 and 2 min. The maximum sorption capacity for copper (2.80 mmol/g) was obtained for the SGE-10/12-deta sample. The sorption capacity for Cu(II) ions increases with increasing pH and reaches a maximum at pH ~5. Within the experimental pH range, the maximum sorption capacity for Cu(II) ions is again reached on SGE-10/12-deta. By comparing literature data with the obtained results, it can be concluded that amino-functionalized macroporous copolymers based on glycidyl methacrylate are efficient for the sorption of Cu(II) ions from aqueous solutions, and that the sorption capacity for copper depends mostly on the type of amine with which the base copolymer is functionalized.
APA, Harvard, Vancouver, ISO, and other styles
50

Novack, Ari, Matt Streshinsky, Ran Ding, Yang Liu, Andy Eu-Jin Lim, Guo-Qiang Lo, Tom Baehr-Jones, and Michael Hochberg. "Progress in silicon platforms for integrated optics." Nanophotonics 3, no. 4-5 (August 1, 2014): 205–14. http://dx.doi.org/10.1515/nanoph-2013-0034.

Full text
Abstract:
Rapid progress has been made in recent years repurposing CMOS fabrication tools to build complex photonic circuits. As the field of silicon photonics becomes more mature, foundry processes will be an essential piece of the ecosystem for eliminating process risk and allowing the community to focus on adding value through clever design. Multi-project wafer runs are a useful tool to promote further development by providing inexpensive, low-risk prototyping opportunities to academic and commercial researchers. Compared to dedicated silicon manufacturing runs, multi-project-wafer runs offer cost reductions of 100× or more. Through OpSIS, we have begun to offer validated device libraries that allow designers to focus on building systems rather than modifying device geometries. The EDA tools that will enable rapid design of such complex systems are under intense development. Progress is also being made in developing practical optical and electronic packaging solutions for the photonic chips, in ways that eliminate or sharply reduce development costs for the user community. This paper will provide a review of the recent developments in silicon photonic foundry offerings with a focus on OpSIS, a multi-project-wafer foundry service offering a silicon photonics platform, including a variety of passive components as well as high-speed modulators and photodetectors, through the Institute of Microelectronics in Singapore.
APA, Harvard, Vancouver, ISO, and other styles