Journal articles on the topic "Representative Surrogate Problems"

To see the other types of publications on this topic, follow the link: Representative Surrogate Problems.

Create a correct reference in APA, MLA, Chicago, Harvard, and several other citation styles.


Consult the top 45 journal articles for your research on the topic "Representative Surrogate Problems".

Next to each source in the list of references there is an "Add to bibliography" button. Click on it, and we will automatically generate the bibliographic reference for the chosen work in the citation style of your choice: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the scholarly publication in PDF format and read its abstract online whenever this information is included in the metadata.

Browse journal articles on a wide variety of disciplines and organize your bibliography correctly.

1

Bao, Lin, Xiaoyan Sun, Yang Chen, Guangyi Man, and Hui Shao. "Restricted Boltzmann Machine-Assisted Estimation of Distribution Algorithm for Complex Problems." Complexity 2018 (1 November 2018): 1–13. http://dx.doi.org/10.1155/2018/2609014.

Abstract:
A novel algorithm, called restricted Boltzmann machine-assisted estimation of distribution algorithm, is proposed for solving computationally expensive optimization problems with discrete variables. First, the individuals are evaluated using expensive fitness functions of the complex problems, and some dominant solutions are selected to construct the surrogate model. The restricted Boltzmann machine (RBM) is built and trained with the dominant solutions to implicitly extract the distributed representative information of the decision variables in the promising subset. The visible layer’s probability of the RBM is designed as the sampling probability model of the estimation of distribution algorithm (EDA) and is updated dynamically along with the update of the dominant subsets. Second, according to the energy function of the RBM, a fitness surrogate is developed to approximate the expensive individual fitness evaluations and participates in the evolutionary process to reduce the computational cost. Finally, model management is developed to train and update the RBM model with newly dominant solutions. A comparison of the proposed algorithm with several state-of-the-art surrogate-assisted evolutionary algorithms demonstrates that the proposed algorithm effectively and efficiently solves complex optimization problems with smaller computational cost.
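The surrogate-assisted EDA loop summarized in this abstract can be sketched in miniature. The sketch below is an illustration only, not the cited method: a univariate probability vector stands in for the RBM, the model log-likelihood stands in for the energy-based fitness surrogate, and the OneMax objective, population sizes, and 0.3 learning rate are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def expensive_fitness(x):
    # Stand-in for an expensive evaluation: OneMax (count of ones).
    return int(x.sum())

def surrogate_fitness(x, p):
    # Cheap surrogate: log-likelihood of x under the current sampling model.
    # (The paper derives its surrogate from an RBM energy function instead.)
    eps = 1e-9
    return float(np.sum(x * np.log(p + eps) + (1 - x) * np.log(1 - p + eps)))

def eda(n_vars=20, pop=40, top=10, gens=30):
    p = np.full(n_vars, 0.5)                       # sampling probability model
    for _ in range(gens):
        cand = (rng.random((2 * pop, n_vars)) < p).astype(int)
        # Pre-screen with the surrogate; spend expensive evaluations
        # only on the more promising half of the sampled candidates.
        scores = np.array([surrogate_fitness(x, p) for x in cand])
        cand = cand[np.argsort(-scores)[:pop]]
        fit = np.array([expensive_fitness(x) for x in cand])
        dominant = cand[np.argsort(-fit)[:top]]    # dominant solutions
        p = 0.7 * p + 0.3 * dominant.mean(axis=0)  # model management / update
    return p

p = eda()
best = (p > 0.5).astype(int)   # should drift toward the all-ones string
```

Under these assumptions the probability model converges toward the optimum while only half of the sampled candidates ever reach the expensive evaluator, which is the cost-saving mechanism the abstract describes.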
2

Wu, Mengtian, Jin Xu, Pengjie Hu, Qianyi Lu, Pengcheng Xu, Han Chen, and Lingling Wang. "An Adaptive Surrogate-Assisted Simulation-Optimization Method for Identifying Release History of Groundwater Contaminant Sources." Water 14, no. 10 (23 May 2022): 1659. http://dx.doi.org/10.3390/w14101659.

Abstract:
The simulation-optimization method, integrating the numerical model and the evolutionary algorithm, is increasingly popular for identifying the release history of groundwater contaminant sources. However, due to the usage of computationally intensive evolutionary algorithms, traditional simulation-optimization methods always require thousands of simulations to find appropriate solutions. Such methods yield a prohibitive computational burden if the simulation involved is time-consuming. To reduce general computation, this study proposes a novel simulation-optimization method for solving the inverse contaminant source identification problems, which uses surrogate models to approximate the numerical model. Unlike many existing surrogate-assisted methods using the pre-determined surrogate model, this paper presents an adaptive surrogate technique to construct the most appropriate surrogate model for the current numerical model. Two representative cases about identifying the release history of contaminant sources are used to investigate the accuracy and robustness of the proposed method. The results indicate that the proposed adaptive surrogate-assisted method effectively identifies the release history of groundwater contaminant sources with a higher degree of accuracy and shorter computation time than traditional methods.
3

Zhao, Hongbo, Lin Zhang, Jiaolong Ren, Meng Wang, and Zhiqiang Meng. "AdaBoost-Based Back Analysis for Determining Rock Mass Mechanical Parameters of Claystones in Goupitan Tunnel, China." Buildings 12, no. 8 (22 July 2022): 1073. http://dx.doi.org/10.3390/buildings12081073.

Abstract:
The back analysis is an effective tool to determine the representative values of rock mass mechanical properties in rock engineering. The surrogate model is widely used in back analyses since analytical or numerical models are usually unavailable for practical engineering problems. This study proposes a novel back analysis framework by adopting the AdaBoost algorithm for deriving the surrogate model. Moreover, the simplicial homology global optimization (SHGO) algorithm, which is robust and applicable for a black-box global problem, is also integrated into the framework. To evaluate the performance, an experimental tunnel in Goupitan Hydropower Station, China, is introduced, and the representative rheological properties of the surrounding rock are obtained by applying the proposed framework. Then the computed displacements based on the acquired properties via both surrogate and numerical models are compared with field measurements. By taking triple-day data, the discrepancy between the calculated and field-measured displacements is less than 0.5 mm. This validates the reliability of the obtained properties and the feasibility of the proposed framework. As an AdaBoost-based method, the proposed framework is sensitive to noise and outliers in the data, the elimination of which is recommended before application.
4

Sala, Ramses, Niccolò Baldanzini, and Marco Pierini. "Representative surrogate problems as test functions for expensive simulators in multidisciplinary design optimization of vehicle structures." Structural and Multidisciplinary Optimization 54, no. 3 (4 May 2016): 449–68. http://dx.doi.org/10.1007/s00158-016-1410-9.

5

Lin, Dalue, Haogan Huang, Xiaoyan Li, and Yuejiao Gong. "Empirical Study of Data-Driven Evolutionary Algorithms in Noisy Environments." Mathematics 10, no. 6 (15 March 2022): 943. http://dx.doi.org/10.3390/math10060943.

Abstract:
For computationally intensive problems, data-driven evolutionary algorithms (DDEAs) are advantageous for low computational budgets because they build surrogate models based on historical data to approximate the expensive evaluation. Real-world optimization problems are highly susceptible to noisy data, but most of the existing DDEAs are developed and tested on ideal and clean environments; hence, their performance is uncertain in practice. In order to discover how DDEAs are affected by noisy data, this paper empirically studied the performance of DDEAs in different noisy environments. To fulfill the research purpose, we implemented four representative DDEAs and tested them on common benchmark problems with noise simulations in a systematic manner. Specifically, the simulation of noisy environments considered different levels of noise intensity and probability. The experimental analysis revealed the association relationships among noisy environments, benchmark problems and the performance of DDEAs. The analysis showed that noise will generally cause deterioration of the DDEA’s performance in most cases, but the effects could vary with different types of problem landscapes and different designs of DDEAs.
6

Gilbody, Simon M., Allan O. House, and Trevor A. Sheldon. "Outcomes research in mental health." British Journal of Psychiatry 181, no. 1 (July 2002): 8–16. http://dx.doi.org/10.1192/bjp.181.1.8.

Abstract:
Background: Outcomes research involves the secondary analysis of data collected routinely by clinical services, in order to judge the effectiveness of interventions and policy initiatives. It permits the study of large databases of patients who are representative of 'real world' practice. However, there are potential problems with this observational design. Aims: To establish the strengths and limitations of outcomes research when applied in mental health. Method: A systematic review was made of the application of outcomes research in mental health services research. Results: Nine examples of outcomes research in mental health services were found. Those that used insurance claims data have information on large numbers of patients but use surrogate outcomes that are of questionable value to clinicians and patients. Problems arise when attempting to adjust for important confounding variables using routinely collected claims data, making results difficult to interpret. Conclusions: Outcomes research is unlikely to be a quick or cheap means of establishing evidence for the effectiveness of mental health practice and policy.
7

Jones, D. A., and K. J. Sene. "A Bayesian approach to flow record infilling and extension for reservoir design." Hydrology and Earth System Sciences 3, no. 4 (31 December 1999): 491–503. http://dx.doi.org/10.5194/hess-3-491-1999.

Abstract:
A Bayesian approach is described for dealing with the problem of infilling and generating stochastic flow sequences using rainfall data to guide the flow generation process, and including bounded (censored) observed flow and rainfall data to provide additional information. Solutions are obtained using a Gibbs sampling procedure. Particular problems discussed include developing new procedures for fitting transformations when bounded values are available, coping with additional information in the form of values, or bounds, for totals of flows across several sites, and developing relationships between annual flow and rainfall data. Examples are shown of both infilled values of unknown past river flows, with assessment of uncertainty, and realisations of flows representative of what might occur in the future. Several procedures for validating the model output are described and the central estimates of flows, taken as a surrogate for historical observed flows, are compared with long term regional flow and rainfall data.
8

Bronstert, A., and A. Bárdossy. "The role of spatial variability of soil moisture for modelling surface runoff generation at the small catchment scale." Hydrology and Earth System Sciences 3, no. 4 (31 December 1999): 505–16. http://dx.doi.org/10.5194/hess-3-505-1999.

9

Dalton, James P., Benedict Uy, Narisa Phummarin, Brent R. Copp, William A. Denny, Simon Swift, and Siouxsie Wiles. "Effect of common and experimental anti-tuberculosis treatments on Mycobacterium tuberculosis growing as biofilms." PeerJ 4 (22 November 2016): e2717. http://dx.doi.org/10.7717/peerj.2717.

Abstract:
Much is known regarding the antibiotic susceptibility of planktonic cultures of Mycobacterium tuberculosis, the bacterium responsible for the lung disease tuberculosis (TB). As planktonically-grown M. tuberculosis are unlikely to be entirely representative of the bacterium during infection, we set out to determine how effective a range of anti-mycobacterial treatments were against M. tuberculosis growing as a biofilm, a bacterial phenotype known to be more resistant to antibiotic treatment. Light levels from bioluminescently-labelled M. tuberculosis H37Rv (strain BSG001) were used as a surrogate for bacterial viability, and were monitored before and after one week of treatment. After treatment, biofilms were disrupted, washed and inoculated into fresh broth and plated onto solid media to rescue any surviving bacteria. We found that in this phenotypic state M. tuberculosis was resistant to the majority of the compounds tested. Minimum inhibitory concentrations (MICs) increased by 20-fold to greater than 1,000-fold, underlying the potential of this phenotype to cause significant problems during treatment.
10

Salmina, A. V. "ON THE NEED FOR SOCIOLOGICAL RESEARCH ON THE INTRODUCTION OF ASSISTED REPRODUCTIVE TECHNOLOGIES IN BELARUS." Journal of the Grodno State Medical University 19, no. 4 (12 September 2021): 451–56. http://dx.doi.org/10.25298/2221-8785-2021-19-4-451-456.

Abstract:
Background. At present there are no scientifically substantiated data on the problems of introducing assisted reproductive technologies (ART) in the Republic of Belarus. The relevance of developing approaches, organizing opinion polls and processing data on a representative sample of Belarusians does not raise doubts in view of the relationship between social attitudes in the field of reproductive health and the national security of the country. Purpose. Substantiation of medical and sociological study of ART in the population of the Republic of Belarus. Material and methods. The bibliographic analysis included the study of Russian and foreign experience in assessing the sociological aspects of reproductology (materials of Springer Link, Oxford University Press, The New England Journal of Medicine, The British Medical Journal, the SCOPUS database of Elsevier, the EBSCO platform), as well as the analysis of the legislation of the Republic of Belarus in the field of reproductology. Results. In the country, the interests of the party that wants to become a parent (surrogate motherhood, donation) are respected as much as possible. The medicalized approach to the definition of ART methods in Belarus is typical, as in other post-Soviet countries, which are characterized by a classical (nuclear) understanding of the family. Taking into account the current trends in the development of the market for reproductive technologies and those techniques that are used in reproductive centers of the Republic of Belarus, it is necessary in the legislative framework to provide for the rules and possibilities of using such methods as hatching (dissection of the embryo membrane), intracytoplasmic sperm injection, intracytoplasmic sperm injection after selection according to morphological criteria, preimplantation diagnostics. Conclusions. 
The following areas are relevant for Belarus: 1) study of the awareness of the population of the Republic of Belarus about ART; 2) assessment of social trust in ART on the part of the population; 3) development of technologies for positive reproductive attitudes in society, including the use of ART.
11

Kim, Yong-Hyuk, Alberto Moraglio, Ahmed Kattan, and Yourim Yoon. "Geometric Generalisation of Surrogate Model-Based Optimisation to Combinatorial and Program Spaces." Mathematical Problems in Engineering 2014 (2014): 1–10. http://dx.doi.org/10.1155/2014/184540.

Abstract:
Surrogate models (SMs) can profitably be employed, often in conjunction with evolutionary algorithms, in optimisation in which it is expensive to test candidate solutions. The spatial intuition behind SMs makes them naturally suited to continuous problems, and the only combinatorial problems that have been previously addressed are those with solutions that can be encoded as integer vectors. We show how radial basis functions can provide a generalised SM for combinatorial problems which have a geometric solution representation, through the conversion of that representation to a different metric space. This approach allows an SM to be cast in a natural way for the problem at hand, without ad hoc adaptation to a specific representation. We test this adaptation process on problems involving binary strings, permutations, and tree-based genetic programs.
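The geometric idea in the abstract above, casting an RBF surrogate over a combinatorial space by swapping the Euclidean metric for one native to the representation, can be sketched for binary strings with a Hamming-distance kernel. The kernel width, the tiny ridge term, and the toy objective below are assumptions made for the illustration, not details from the paper.

```python
import numpy as np

def hamming(a, b):
    # Metric on the combinatorial space (here: binary strings).
    return int(np.sum(a != b))

def rbf_fit(X, y, gamma=0.5):
    # Gram matrix of RBF values computed from Hamming distances.
    n = len(X)
    K = np.array([[np.exp(-gamma * hamming(X[i], X[j])) for j in range(n)]
                  for i in range(n)])
    # Solve K w = y for the RBF weights (tiny ridge for numerical safety).
    return np.linalg.solve(K + 1e-8 * np.eye(n), y)

def rbf_predict(X, w, x_new, gamma=0.5):
    k = np.array([np.exp(-gamma * hamming(x, x_new)) for x in X])
    return float(k @ w)

rng = np.random.default_rng(1)
X = np.unique(rng.integers(0, 2, size=(40, 6)), axis=0)  # distinct 6-bit strings
y = X.sum(axis=1).astype(float)    # toy objective standing in for an expensive one
w = rbf_fit(X, y)
pred = rbf_predict(X, w, X[0])     # interpolates the training data
```

Because exp(-γ·d) with the Hamming distance is a positive definite kernel on binary strings, the same surrogate machinery carries over once an appropriate metric (for instance, an edit distance for permutations or trees) is substituted.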
12

Tonolo, Sara. "Adoption v. Surrogacy: New Perspectives on the Parental Projects of Same-Sex Couples." Italian Review of International and Comparative Law 1, no. 1 (15 October 2021): 132–45. http://dx.doi.org/10.1163/27725650-01010007.

Abstract:
In Italy all forms of surrogacy are forbidden, whether it be traditional or gestational, commercial or altruistic. Act n. 40 of 19/2/2004, entitled "Rules about medically-assisted reproduction", introduces a prohibition on employing gametes from donors, and specifically incriminates not only intermediary agencies and clinics practising surrogacy, but also the intended parents and the surrogate mother. Other penal consequences are provided by the Criminal Code concerning the registration of a birth certificate where the parents are the intended ones, as provided by the lex loci actus (art. 567 of the Italian Criminal Code, on the false representation or concealment of status). Apart from the mentioned criminal problems, several aspects of private international law are involved. In cases where national rules forbid the transcription of birth certificates for public policy reasons, specifically the prohibition of surrogacy, Italian judges often seek solutions to enforce the status filiationis. In this case, the Italian Supreme Court intervenes in the debate, allowing the recognition of a foreign adoption order related to a procedure of surrogate motherhood in favour of a same-sex couple. Focusing on the recent evolution of the notion of international public policy, the Supreme Court affirms that the adoptive parental status acquired by a homogenitorial couple is not contrary to international public policy when the effects of the act from which this status derives are not incompatible with the insuperable limits constituted by the founding principles of the relational choices between intended parents and child (Article 2 of the Constitution, Article 8 ECHR), by the best interest of the child as codified in the Italian Law 219/2012, by the principle of non-discrimination, and by the principle of solidarity that is the basis of social parenting.
By splitting the question of surrogacy from the adoption order to be recognized in this case, and by narrowing the public policy exception, there is an evident risk of suggesting to same-sex couples that they realize their parental projects by carrying out surrogacy in legal systems where it is simultaneously possible to adopt the child born as a result of this procedure.
13

Senne, Edson Luiz França, Luiz Antonio Nogueira Lorena, and Silvely Nogueira de Almeida Salomão. "Métodos de geração de colunas para problemas de atribuição." Production 17, no. 1 (April 2007): 71–83. http://dx.doi.org/10.1590/s0103-65132007000100005.

Abstract:
This paper presents column generation methods for two important assignment problems: the Generalized Assignment Problem (PGA) and the problem of assigning antennas to switches (PAAC). The PGA is one of the most representative problems in Combinatorial Optimization and consists of optimizing the assignment of n tasks to m agents, so that each task is assigned to exactly one agent and the capacity of each agent is respected. The PAAC consists of assigning n antennas to m switches in a cellular telephone network, so as to minimize the cabling costs between antennas and switches and the costs of transferring calls between switches. The traditional column generation approach is compared with the ones proposed in this work, which use the Lagrangean/surrogate relaxation. Computational tests are presented that demonstrate the effectiveness of the proposed algorithms.
14

Hamdaoui, Mohamed, Guénhaël Le Quilliec, Piotr Breitkopf, and Pierre Villon. "Surrogate POD Models for Parametrized Sheet Metal Forming Applications." Key Engineering Materials 554-557 (June 2013): 919–27. http://dx.doi.org/10.4028/www.scientific.net/kem.554-557.919.

Abstract:
The aim of this work is to present a POD (Proper Orthogonal Decomposition) based surrogate approach for sheet metal forming parametrized applications. The final displacement field for the stamped work-piece computed using a finite element approach is approximated using the method of snapshots for POD mode determination and kriging for POD coefficients interpolation. An error analysis, performed using a validation set, shows that the accuracy of the surrogate POD model is excellent for the representation of finite element displacement fields. A possible use of the surrogate to assess the quality of the stamped sheet is considered. The Green-Lagrange strain tensor is derived and forming limit diagrams are computed on the fly for any point of the design space. Furthermore, the minimization of a cost function based on the surrogate POD model is performed showing its potential for solving optimization problems.
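The snapshot-POD pipeline summarized above can be sketched end to end. In this illustration a cheap analytic field replaces the finite element solve and plain linear interpolation replaces kriging for the POD coefficients, so every specific below (the field, the mode count, the parameter sampling) is an assumption rather than the paper's setup.

```python
import numpy as np

# Snapshot matrix: each column is a "field" computed at one parameter value
# (an analytic stand-in for one finite element stamping simulation).
params = np.linspace(0.0, 1.0, 10)
x = np.linspace(0.0, 1.0, 50)
snapshots = np.column_stack([np.sin(np.pi * x * (1.0 + p)) for p in params])

# Method of snapshots: POD modes are the left singular vectors.
U, S, Vt = np.linalg.svd(snapshots, full_matrices=False)
modes = U[:, :3]                 # keep the 3 dominant modes
coeffs = modes.T @ snapshots     # POD coefficients of each snapshot

def predict(p_new):
    # Surrogate: interpolate each POD coefficient over the parameter
    # (linear here; the paper interpolates with kriging).
    c = np.array([np.interp(p_new, params, coeffs[i]) for i in range(3)])
    return modes @ c

approx = predict(0.55)               # field at an unseen parameter value
truth = np.sin(np.pi * x * 1.55)     # reference for this toy problem
```

Once the surrogate returns a full field rather than a scalar, derived quantities (the strain tensor and forming limit diagrams in the paper) can be recomputed from the predicted field at any point of the design space.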
15

Kang, Zhao, Chong Peng, Jie Cheng, and Qiang Cheng. "LogDet Rank Minimization with Application to Subspace Clustering." Computational Intelligence and Neuroscience 2015 (2015): 1–10. http://dx.doi.org/10.1155/2015/824289.

Abstract:
Low-rank matrix is desired in many machine learning and computer vision problems. Most of the recent studies use the nuclear norm as a convex surrogate of the rank operator. However, all singular values are simply added together by the nuclear norm, and thus the rank may not be well approximated in practical problems. In this paper, we propose using a log-determinant (LogDet) function as a smooth and closer, though nonconvex, approximation to rank for obtaining a low-rank representation in subspace clustering. Augmented Lagrange multipliers strategy is applied to iteratively optimize the LogDet-based nonconvex objective function on potentially large-scale data. By making use of the angular information of principal directions of the resultant low-rank representation, an affinity graph matrix is constructed for spectral clustering. Experimental results on motion segmentation and face clustering data demonstrate that the proposed method often outperforms state-of-the-art subspace clustering algorithms.
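The contrast the abstract draws between the nuclear norm and a LogDet surrogate for rank can be made concrete numerically. The smoothed form sum(log(1 + s_i^2/delta)) used below is one common LogDet-style surrogate, chosen here for illustration; the exact function and delta in the paper may differ.

```python
import numpy as np

def rank_surrogates(M, delta=1e-3):
    s = np.linalg.svd(M, compute_uv=False)
    nuclear = float(s.sum())                            # convex surrogate: sum of singular values
    logdet = float(np.sum(np.log(1.0 + s**2 / delta)))  # smooth nonconvex surrogate
    return s, nuclear, logdet

# A rank-1 matrix with a large leading singular value: the nuclear norm scales
# with the magnitude, while the LogDet surrogate grows only logarithmically,
# tracking the rank more closely.
u = np.array([[1.0], [2.0], [3.0]])
M = 10.0 * (u @ u.T)
s, nuc, ld = rank_surrogates(M)
```

Here `M` has rank 1 and `nuc` equals the single large singular value (140), so scaling `M` by 10 would multiply `nuc` by 10 but only add a constant to `ld`; this is why simply adding all singular values can approximate the rank poorly, as the abstract notes.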
16

Franz, M., S. Pfingst, M. Zimmermann, and S. Wartzack. "Estimation of Composite Laminate Ply Angles Using an Inverse Bayesian Approach Based on Surrogate Models." Proceedings of the Design Society 2 (May 2022): 1569–78. http://dx.doi.org/10.1017/pds.2022.159.

Abstract:
A digital twin (DT) relies on a detailed, virtual representation of a physical product. Since uncertainties and deviations can lead to significant changes in the functionality and quality of products, they should be considered in the DT. However, valuable product properties are often hidden and thus difficult to integrate into a DT. In this work, a Bayesian inverse approach based on surrogate models is applied to infer hidden composite laminate ply angles from strain measurements. The approach is able to find the true values even for ill-posed problems and shows good results up to 6 plies.
17

Dadkhahi, Hamid, Jesus Rios, Karthikeyan Shanmugam, and Payel Das. "Fourier Representations for Black-Box Optimization over Categorical Variables." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 9 (28 June 2022): 10156–65. http://dx.doi.org/10.1609/aaai.v36i9.21255.

Abstract:
Optimization of real-world black-box functions defined over purely categorical variables is an active area of research. In particular, optimization and design of biological sequences with specific functional or structural properties have a profound impact in medicine, materials science, and biotechnology. Standalone search algorithms, such as simulated annealing (SA) and Monte Carlo tree search (MCTS), are typically used for such optimization problems. In order to improve the performance and sample efficiency of such algorithms, we propose to use existing methods in conjunction with a surrogate model for the black-box evaluations over purely categorical variables. To this end, we present two different representations, a group-theoretic Fourier expansion and an abridged one-hot encoded Boolean Fourier expansion. To learn such representations, we consider two different settings to update our surrogate model. First, we utilize an adversarial online regression setting where Fourier characters of each representation are considered as experts and their respective coefficients are updated via an exponential weight update rule each time the black box is evaluated. Second, we consider a Bayesian setting where queries are selected via Thompson sampling and the posterior is updated via a sparse Bayesian regression model (over our proposed representation) with a regularized horseshoe prior. Numerical experiments over synthetic benchmarks as well as real-world RNA sequence optimization and design problems demonstrate the representational power of the proposed methods, which achieve competitive or superior performance compared to state-of-the-art counterparts, while substantially improving the computation cost and/or sample efficiency.
18

Faith, Daniel P., P. A. Walker, and C. R. Margules. "Some future prospects for systematic biodiversity planning in Papua New Guinea - and for biodiversity planning in general." Pacific Conservation Biology 6, no. 4 (2000): 325. http://dx.doi.org/10.1071/pc010325.

Abstract:
We describe three challenges for biodiversity planning, which arise from a study in Papua New Guinea, but apply equally to biodiversity planning in general. These are 1) the best use of available data for providing biodiversity surrogate information, 2) the integration of representativeness and persistence goals into the area prioritization process, and 3) implications for the implementation of a conservation plan over time. Each of these problems is linked to the effective use of complementarity. Further, we find that a probabilistic framework for calculating persistence-based complementarity values over time can contribute to resolving each challenge. Probabilities allow for the exploration of a range of possible complementarity values over different planning scenarios, and provide a way to evaluate biodiversity surrogates. The integration of representativeness and persistence goals, via estimated probabilities of persistence, facilitates the crediting of partial protection provided by sympathetic management. For the selection of priority areas and land use allocation, partial protection may be a "given" or implied by an allocated land use. Such an integration also allows the incorporation of vulnerability/threat information at the level of attributes or areas, incorporating persistence values that may depend on reserve design. As an example of the use of persistence probabilities, we derive an alternative proposed priority area set for PNG. This is based on 1) a goal of 0.99 probability of persistence of all biodiversity surrogate attributes used in the study, 2) an assumption of a 0.10 probability of persistence in the absence of any form of formal protection, and 3) a 0.90 probability of persistence for surrogate attributes in proposed priority areas, assuming formal protection is afforded to them. The calculus of persistence also leads to a proposed system of environmental levies based on biodiversity complementarity values.
The assigned levy for an area may change to reflect its changing complementarity value in light of changes to protection status of other areas. We also propose a number of complementarity-based options for a carbon credits framework. These address required principles of additionality and collateral benefits from biodiversity protection. A related biodiversity credits scheme, also based on complementarity, encourages investments in those areas that make greatest ongoing contributions to regional biodiversity representation and persistence. All these new methods point to a new "systematic conservation planning" that is not focused only on selecting sets of areas but utilizes complementarity values and changes in probabilities of persistence for a range of decision making processes. The cornerstone of biodiversity planning, complementarity, no longer reflects only relative amounts of biodiversity but also relative probabilities of persistence.
19

Chen, Liming, Enying Li, and Hu Wang. "Time-based reflow soldering optimization by using adaptive Kriging-HDMR method." Soldering & Surface Mount Technology 28, no. 2 (4 April 2016): 101–13. http://dx.doi.org/10.1108/ssmt-07-2015-0021.

Abstract:
Purpose: The reflow soldering process is an important step of surface mount technology. The purpose of this paper is to minimize the maximum warpage of the shielding frame by controlling the reflow soldering control parameters. Design/methodology/approach: Compared with other reflow-related design methods, both the time and temperature of each extracted time region are considered. Therefore, the number of design variables is increased. To solve the high-dimensional problem, a surrogate-assisted optimization (SAO) called adaptive Kriging high-dimensional model representation (HDMR) is used. Findings: The warpage of the shield frame is significantly reduced. Moreover, the correlations of the design variables are also disclosed. Originality/value: Compared with the original Kriging HDMR, the expected improvement (EI) criterion is used and a new projection strategy is suggested to improve the efficiency of the optimization method. The application suggests that the adaptive Kriging HDMR has the potential capability to solve such complicated engineering problems.
20

Koziel, Slawomir, and Adrian Bekasiewicz. "On deterministic procedures for low-cost multi-objective design optimization of miniaturized impedance matching transformers." Engineering Computations 34, no. 2 (18 April 2017): 403–19. http://dx.doi.org/10.1108/ec-01-2016-0046.

Abstract:
Purpose This paper aims to investigate deterministic strategies for low-cost multi-objective design optimization of compact microwave structures, specifically, impedance matching transformers. The considered methods involve surrogate modeling techniques and variable-fidelity electromagnetic (EM) simulations. In contrary to majority of conventional approaches, they do not rely on population-based metaheuristics, which permit lowering the design cost and improve reliability. Design/methodology/approach There are two algorithmic frameworks presented, both fully deterministic. The first algorithm involves creating a path covering the Pareto front and arranged as a sequence of patches relocated in the course of optimization. Response correction techniques are used to find the Pareto front representation at the high-fidelity EM simulation level. The second algorithm exploits Pareto front exploration where subsequent Pareto-optimal designs are obtained by moving along the front by means of solving appropriately defined local constrained optimization problems. Numerical case studies are provided demonstrating feasibility of solving real-world problems involving expensive EM-simulation models of impedance transformer structures. Findings It is possible, by means of combining surrogate modeling techniques and constrained local optimization, to identify the set of alternative designs representing Pareto-optimal solutions, in a realistic time frame corresponding to a few dozen of high-fidelity EM simulations of the respective structures. Multi-objective optimization for the considered class of structures can be realized using deterministic approaches without defaulting to evolutionary methods. Research limitations/implications The present study can be considered a step toward further studies on expedited optimization of computationally expensive simulation models for miniaturized microwave components. 
Originality/value The proposed algorithmic solutions proved useful for expedited multi-objective design optimization of miniaturized microwave structures. The problem is extremely challenging when using conventional methods, in particular evolutionary algorithms. To the authors’ knowledge, this is one of the first attempts to investigate deterministic surrogate-assisted multi-objective optimization of compact components at the EM-simulation level.
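The Pareto-front exploration step described in the abstract (obtaining each subsequent Pareto-optimal design by solving a local constrained optimization problem) can be sketched on a toy bi-objective problem. The objectives, step size and optimizer choice below are illustrative assumptions, not the paper's EM-driven, surrogate-corrected setup:

```python
import numpy as np
from scipy.optimize import minimize, NonlinearConstraint

# Toy bi-objective problem standing in for the expensive EM simulations.
def f1(x):
    return x[0] ** 2 + x[1] ** 2

def f2(x):
    return (x[0] - 1.0) ** 2 + (x[1] - 1.0) ** 2

# Start from one endpoint of the front (the minimizer of f1), then
# repeatedly minimize f2 subject to a small allowed worsening of f1:
# each local constrained problem yields the next Pareto-optimal design.
x = np.array([0.0, 0.0])
front = [(f1(x), f2(x))]
for _ in range(5):
    budget = f1(x) + 0.05                      # prescribed step along the front
    con = NonlinearConstraint(f1, -np.inf, budget)
    x = minimize(f2, x, constraints=[con]).x
    front.append((f1(x), f2(x)))

print(front)   # f2 decreases while f1 grows: a sampled Pareto front
```

Each step is a cheap local solve started from the previous design, which is what makes the procedure deterministic and inexpensive compared with population-based search.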
Styles APA, Harvard, Vancouver, ISO, etc.
21

Birolleau, Alexandre, Gaël Poëtte et Didier Lucor. « Adaptive Bayesian Inference for Discontinuous Inverse Problems, Application to Hyperbolic Conservation Laws ». Communications in Computational Physics 16, no 1 (juillet 2014) : 1–34. http://dx.doi.org/10.4208/cicp.240113.071113a.

Texte intégral
Résumé :
Abstract Various works from the literature have aimed at accelerating Bayesian inference in inverse problems. Stochastic spectral methods have recently been proposed as surrogate approximations of the forward uncertainty propagation model over the support of the prior distribution. These representations are efficient because they allow affordable simulation of a large number of samples from the posterior distribution. Unfortunately, they do not perform well when the forward model exhibits strong nonlinear behavior with respect to its input. In this work, we first relate the fast (exponential) L2-convergence of the forward approximation to the fast (exponential) convergence (in terms of Kullback-Leibler divergence) of the approximate posterior. In particular, we prove that when the prior distribution is uniform, the posterior converges at least twice as fast as the forward model in those norms. The Bayesian inference strategy is developed in the framework of a stochastic spectral projection method. The predicted convergence rates are then demonstrated for simple nonlinear inverse problems of varying smoothness. We then propose an efficient numerical approach for the Bayesian solution of inverse problems presenting strongly nonlinear or discontinuous system responses. This comes with the improvement of the forward model, which is adaptively approximated by an iterative generalized polynomial chaos-based representation. The numerical approximations and predicted convergence rates of the former approach are compared to the new iterative numerical method for nonlinear time-dependent test cases of varying dimension and complexity, which are relevant to our hydrodynamics motivations and therefore to hyperbolic conservation laws and the appearance of discontinuities in finite time.
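The basic idea of replacing the forward model with a stochastic spectral (polynomial chaos style) surrogate before the Bayesian update can be sketched for a smooth one-parameter model. The forward map, noise level and grid-based posterior below are illustrative assumptions, not the paper's hydrodynamics test cases; the abstract's point is precisely that this works well in the smooth case and degrades for discontinuous responses:

```python
import numpy as np

# Smooth scalar forward model G and a synthetic observation.
G = lambda t: np.sin(np.pi * t / 2.0)
sigma = 0.1
d_obs = G(0.3)

# Spectral surrogate: Legendre approximation of G over the uniform
# prior's support [-1, 1], built from a modest number of forward runs.
t_fit = np.linspace(-1.0, 1.0, 64)
gpc = np.polynomial.legendre.Legendre.fit(t_fit, G(t_fit), deg=7, domain=[-1, 1])

# Grid-based Bayesian update: posterior ~ Gaussian likelihood x uniform prior.
t = np.linspace(-1.0, 1.0, 2001)
def posterior_mean(forward):
    like = np.exp(-0.5 * ((forward(t) - d_obs) / sigma) ** 2)
    w = like / like.sum()
    return float((w * t).sum())

# Nearly indistinguishable here, because the spectral expansion of a
# smooth G converges fast; a shock-like G would break this.
print(posterior_mean(G), posterior_mean(gpc))
```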
Styles APA, Harvard, Vancouver, ISO, etc.
22

Tucci, Mauro, Sami Barmada, Alessandro Formisano et Dimitri Thomopulos. « A Regularized Procedure to Generate a Deep Learning Model for Topology Optimization of Electromagnetic Devices ». Electronics 10, no 18 (7 septembre 2021) : 2185. http://dx.doi.org/10.3390/electronics10182185.

Texte intégral
Résumé :
The use of behavioral models based on deep learning (DL) to accelerate electromagnetic field computations has recently been proposed to solve complex electromagnetic problems. Such problems usually require time-consuming numerical analysis, while DL allows achieving the topologically optimized design of electromagnetic devices using desktop-class computers and reasonable computation times. An unparametrized bitmap representation of the geometries to be optimized, which is a highly desirable feature needed to discover completely new solutions, is perfectly managed by DL models. On the other hand, optimization algorithms do not easily cope with high-dimensional input data, particularly because it is difficult to enforce the searched solutions as feasible and make them belong to expected manifolds. In this work, we propose the use of a variational autoencoder as a data regularization/augmentation tool in the context of topology optimization. The optimization was carried out using a gradient descent algorithm, and the DL neural network was used as a surrogate model to accelerate the resolution of single trial cases in the course of optimization. The variational autoencoder and the surrogate model were simultaneously trained in a multi-model custom training loop that minimizes the total loss, which is the combination of the two models' losses. In this paper, using the TEAM 25 problem (a benchmark problem for the assessment of electromagnetic numerical field analysis) as a test bench, we will provide a comparison between the computational times and design quality for a "classical" approach and the DL-based approach. Preliminary results show that the variational autoencoder manages to regularize the resolution process and transforms a constrained optimization into an unconstrained one, improving both the quality of the final solution and the performance of the resolution process.
Styles APA, Harvard, Vancouver, ISO, etc.
23

Das, Rajeev, et Azzedine Soulaimani. « Non-Deterministic Methods and Surrogates in the Design of Rockfill Dams ». Applied Sciences 11, no 8 (20 avril 2021) : 3699. http://dx.doi.org/10.3390/app11083699.

Texte intégral
Résumé :
The parameters of the constitutive models used in the design of rockfill dams are associated with a high degree of uncertainty. This occurs because rockfill dams are comprised of numerous zones, each with different soil materials, and it is not feasible to extract materials from such structures to accurately ascertain their behavior or their respective parameters. The general approach involves laboratory tests using small material samples or empirical data from the literature. However, such measures lack an accurate representation of the actual scenario, resulting in uncertainties. This limits the suitability of the model in the design process. Inverse analysis provides an option to better understand dam behavior. This procedure involves the use of real monitored data, such as deformations and stresses, from the dam structure via installed instruments. Fundamentally, it is a non-destructive approach that considers optimization methods and actual performance data to determine the values of the parameters by minimizing the differences between simulated and observed results. This paper considers data from an actual rockfill dam and proposes a surrogate assisted non-deterministic framework for its inverse analysis. A suitable error/objective function that measures the differences between the actual and simulated displacement values is defined first. Non-deterministic algorithms are used as the optimization technique, as they can avoid local optima and are more robust when compared to the conventional deterministic methods. Three such approaches, the genetic algorithm, differential evolution, and particle swarm optimization are evaluated to identify the best strategy in solving problems of this nature. A surrogate model in the form of a polynomial regression is studied and recommended in place of the actual numerical model of the dam to reduce computation cost. 
Finally, this paper presents the relevant dam parameters estimated by the analysis and provides insights into the performance of the three procedures to solve the inverse problem.
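The surrogate-assisted inverse-analysis loop described above (replace the expensive numerical model with a polynomial regression, then minimize the misfit with a non-deterministic global optimizer such as differential evolution) can be sketched on a one-parameter calibration problem. The stand-in model, parameter and bounds are hypothetical, not the dam's actual finite-element model or its zone-wise constitutive parameters:

```python
import numpy as np
from scipy.optimize import differential_evolution

# Hypothetical stand-in for the expensive finite-element dam model:
# displacement (mm) as a function of one constitutive parameter E (MPa).
def fe_model(E):
    return 2500.0 / E + 0.01 * E

E_true = 60.0
observed = fe_model(E_true)          # plays the role of monitored data

# Step 1: polynomial-regression surrogate fitted to a handful of FE runs.
E_samples = np.linspace(30.0, 120.0, 12)
y_samples = fe_model(E_samples)
coeffs = np.polyfit(E_samples, y_samples, deg=3)

# Step 2: misfit between simulated (surrogate) and observed displacement,
# minimized by differential evolution, a non-deterministic global method
# that is less prone to local optima than gradient-based search.
def misfit(x):
    return (np.polyval(coeffs, x[0]) - observed) ** 2

res = differential_evolution(misfit, bounds=[(30.0, 120.0)], seed=1)
print(res.x[0])   # recovered parameter, close to E_true
```

Every optimizer evaluation hits the cheap polynomial instead of the FE solver, which is the computational saving the paper quantifies.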
Styles APA, Harvard, Vancouver, ISO, etc.
24

Li, Enying, Fan Ye et Hu Wang. « Alternative Kriging-HDMR optimization method with expected improvement sampling strategy ». Engineering Computations 34, no 6 (7 août 2017) : 1807–28. http://dx.doi.org/10.1108/ec-06-2016-0208.

Texte intégral
Résumé :
Purpose The purpose of this study is to overcome the misestimation of the standard deviation derived from the expected improvement (EI) criterion. Compared with other popular methods, a quantitative model assessment and analysis tool, termed high-dimensional model representation (HDMR), is suggested to be integrated with an EI-assisted sampling strategy. Design/methodology/approach To predict the standard deviation directly, Kriging is employed. Furthermore, to compensate for the underestimation of error in the Kriging predictor, a Pareto frontier (PF)-EI (PFEI) criterion is also suggested. Compared with other surrogate-assisted optimization methods, the distinctive characteristic of HDMR is to disclose the correlations among component functions. If only low-correlation terms are considered, the number of function evaluations for HDMR grows only polynomially with the number of input variables and correlative terms. Findings To validate the suggested method, various nonlinear and high-dimensional mathematical functions are tested. The results show the suggested method has potential for solving complicated real engineering problems. Originality/value In this study, the authors aim to integrate the strengths of PFEI and HDMR to improve optimization performance.
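For context, the standard EI criterion that the abstract builds on depends directly on the surrogate's predictive standard deviation, which is why its under- or misestimation matters. A minimal sketch of EI for minimization (the generic formula, not the paper's PFEI variant):

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_best):
    """EI for minimization, from a surrogate's predictive mean and
    standard deviation at candidate points."""
    mu = np.asarray(mu, dtype=float)
    sigma = np.asarray(sigma, dtype=float)
    imp = f_best - mu
    with np.errstate(divide="ignore", invalid="ignore"):
        z = imp / sigma
        ei = imp * norm.cdf(z) + sigma * norm.pdf(z)
    ei[sigma == 0.0] = 0.0      # no predictive uncertainty, no expected gain
    return ei

# A candidate with a worse mean but a large predictive standard deviation
# can outrank a near-certain small gain, so a biased sigma estimate
# directly distorts which point gets sampled next.
ei = expected_improvement(mu=[0.9, 1.2], sigma=[0.01, 0.5], f_best=1.0)
print(ei)
```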
Styles APA, Harvard, Vancouver, ISO, etc.
25

Stankevičienė, Virginija, Viorika Šestakova et Daiva Zavistanavičienė. « Meaning equivalence issue of English and Lithuanian terms of finance ». Journal of Language and Cultural Education 9, no 2 (1 septembre 2021) : 58–68. http://dx.doi.org/10.2478/jolace-2021-0011.

Texte intégral
Résumé :
Abstract The intensification of language contacts has led to increasing problems of compliance between terms in different languages in the translation of subject-specific texts. Dictionaries are the main tool for clarifying term meanings and trying to find the most appropriate version which could be perceived by the representatives of different languages equally. More accurate dissemination and usage of the term equivalent allows the addressee (recipient) to better comprehend the contents of the language. Not only are meaning explanation and consistency of definition formulation significant aspects in compiling bilingual dictionaries but also the determination of term equivalents. Although terms of economics have old traditions in various languages, it is possible to notice cases of meaning discrepancy and different perception. The choice and usage of an appropriate term remain crucial in communicating about various finance-related operations. The more accurate and clearer the term, the better the perception of the subtleties of the other language. The purpose of this article is to determine the extent to which the concepts of a semantic group representing finance are equivalent, i.e. how similar and different their features are in English and Lithuanian. The research revealed that the majority of the analysed terms are partially equivalent in both languages. Hence, partial equivalents and surrogates in particular may cause confusion and discrepancy in term meaning perception.
Styles APA, Harvard, Vancouver, ISO, etc.
26

Abdullah, Talal A. A., Mohd Soperi Mohd Zahid, Waleed Ali et Shahab Ul Hassan. « B-LIME : An Improvement of LIME for Interpretable Deep Learning Classification of Cardiac Arrhythmia from ECG Signals ». Processes 11, no 2 (16 février 2023) : 595. http://dx.doi.org/10.3390/pr11020595.

Texte intégral
Résumé :
Deep Learning (DL) has gained enormous popularity recently; however, it is an opaque technique that is regarded as a black box. To trust the validity of the model's predictions, it is necessary to explain them. The well-known local interpretable model-agnostic explanations (LIME) method uses surrogate techniques to approximate a given ML model with reasonable precision and provide explanations for it. However, LIME explanations are limited to tabular, textual, and image data. They cannot be provided for signal data features that are temporally interdependent. Moreover, LIME suffers from critical problems such as instability and poor local fidelity that prevent its implementation in real-world environments. In this work, we propose Bootstrap-LIME (B-LIME), an improvement of LIME, to generate meaningful explanations for ECG signal data. B-LIME combines heartbeat segmentation and bootstrapping techniques to improve the model's explainability while considering the temporal dependencies between features. Furthermore, we investigate the main cause of instability and lack of local fidelity in LIME. We then propose modifications to the functionality of LIME, including the data generation technique, the explanation method, and the representation technique, to generate stable and locally faithful explanations. Finally, the performance of B-LIME in a hybrid deep-learning model for arrhythmia classification was investigated and validated in comparison with LIME. The results show that the proposed B-LIME provides more meaningful and credible explanations than LIME for cardiac arrhythmia signal data, considering the temporal dependencies between features.
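The core LIME mechanism that B-LIME modifies can be sketched as: perturb the instance, weight the perturbations by proximity, and fit a weighted linear surrogate whose slopes act as local attributions. The black-box model, kernel width and sampling scale below are illustrative assumptions; this is plain LIME, not the proposed B-LIME with heartbeat segmentation and bootstrapping:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical black-box classifier (stands in for the DL model).
def black_box(X):
    return 1.0 / (1.0 + np.exp(-(3.0 * X[:, 0] - 2.0 * X[:, 1])))

x0 = np.array([0.5, -0.2])                       # instance to explain

# 1) Perturb the instance, 2) weight samples by proximity to x0,
# 3) fit a weighted linear surrogate; its slopes are the attributions.
Z = x0 + rng.normal(scale=0.3, size=(500, 2))
y = black_box(Z)
w = np.exp(-((Z - x0) ** 2).sum(axis=1) / 0.25)  # kernel width is a free choice

A = np.hstack([np.ones((len(Z), 1)), Z - x0]) * np.sqrt(w)[:, None]
coef, *_ = np.linalg.lstsq(A, y * np.sqrt(w), rcond=None)
print(coef[1:])   # signs follow the black box: positive for x1, negative for x2
```

The instability the abstract criticizes comes from the random perturbation step: re-running with a different seed yields different samples and, potentially, different attributions.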
Styles APA, Harvard, Vancouver, ISO, etc.
27

Koziel, Slawomir, et Adrian Bekasiewicz. « Fast multi-objective design optimization of microwave and antenna structures using data-driven surrogates and domain segmentation ». Engineering Computations 37, no 2 (30 août 2019) : 753–88. http://dx.doi.org/10.1108/ec-01-2019-0004.

Texte intégral
Résumé :
Purpose The purpose of this paper is to investigate the strategies and algorithms for expedited design optimization of microwave and antenna structures in a multi-objective setup. Design/methodology/approach Formulation of the multi-objective design problem oriented toward execution of the population-based metaheuristic algorithm within the segmented search space is investigated. The described algorithmic framework exploits variable-fidelity modeling, physics- and approximation-based representation of the structure and model correction techniques. The considered approach is suitable for handling various problems pertinent to the design of microwave and antenna structures. Numerical case studies are provided demonstrating the feasibility of the segmentation-based framework for the design of real-world structures in setups with two and three objectives. Findings Formulation of an appropriate design problem enables identification of the search-space region containing the Pareto front, which can be further divided into a set of compartments characterized by a small combined volume. An approximation model of each segment can be constructed using a small number of training samples and then optimized, at a negligible computational cost, using population-based metaheuristics. Introduction of a segmentation mechanism to the multi-objective design framework is important to facilitate low-cost optimization of many-parameter structures represented by numerically expensive computational models. Further reduction of the design cost can be achieved by enforcing equal volumes of the search-space segments. Research limitations/implications The study summarizes recent advances in low-cost multi-objective design of microwave and antenna structures.
The investigated techniques exceed the capabilities of conventional design approaches involving direct evaluation of physics-based models for determination of trade-offs between the design objectives, particularly in terms of reliability and reduction of the computational cost. Studies on the scalability of the segmentation mechanism indicate that the computational benefits of the approach decrease with the number of search-space segments. Originality/value The proposed design framework proved useful for the rapid multi-objective design of microwave and antenna structures characterized by complex and multi-parameter topologies, which is extremely challenging when using conventional methods driven by population-based metaheuristic algorithms. To the authors' knowledge, this is the first work that summarizes segmentation-based approaches to multi-objective optimization of microwave and antenna components.
Styles APA, Harvard, Vancouver, ISO, etc.
28

Li, Enying, Zheng Zhou, Hu Wang et Kang Cai. « A global sensitivity analysis-assisted sequential optimization tool for plant-fin heat sink design ». Engineering Computations 37, no 2 (31 juillet 2019) : 591–614. http://dx.doi.org/10.1108/ec-12-2018-0590.

Texte intégral
Résumé :
Purpose This study aims to suggest and develop a global sensitivity analysis-assisted multi-level sequential optimization method for the heat transfer problem. Design/methodology/approach Compared with other surrogate-assisted optimization methods, the distinctive characteristic of the suggested method is to decompose the original problem into several layers according to the global sensitivity index. The optimization starts with the several most important design variables, using the support vector regression-based efficient global optimization method. Then, as the optimization process progresses, the remaining (filtered) design variables are involved in the optimization one by one or in preset groups. Therefore, in each layer, the design space is reduced according to the previous optimization result. To improve the accuracy of the global sensitivity index, a novel global sensitivity analysis method based on the variance-based method incorporating a random sampling high-dimensional model representation is introduced. Findings The advantage of this method lies in its capability to solve complicated problems with a limited number of sample points. Moreover, to enhance the reliability of the optimum, the support vector regression-based efficient global optimization is used to optimize in each layer. Practical implications The developed optimization tool is built in MATLAB and can be integrated with commercial software such as ABAQUS and COMSOL. Lastly, this tool is integrated with COMSOL and applied to the plant-fin heat sink design. Compared with the initial temperature, the temperature after design is reduced by over 49°. Moreover, the relationships among all design variables are also disclosed clearly. Originality/value The D-MORPH-HDMR is integrated to obtain the coupling relationships among the design variables efficiently. The suggested method can be decomposed into multiple layers according to the GSI.
The SVR-EGO is used to optimize each sub-problem because of its modeling robustness.
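The layer ordering described above rests on variance-based global sensitivity indices. A minimal pick-freeze (Saltelli-type) estimator of first-order Sobol indices on a cheap stand-in model shows how the most influential variables would be identified for the first optimization layer; the test function is an assumption, not the paper's RS-HDMR-based estimator:

```python
import numpy as np

rng = np.random.default_rng(0)

# Cheap stand-in model: variable importance is 4 > 2 > 0.5 by construction.
def model(X):
    return 4.0 * X[:, 0] + 2.0 * X[:, 1] + 0.5 * X[:, 2]

n, d = 100_000, 3
A = rng.uniform(size=(n, d))
B = rng.uniform(size=(n, d))
fA, fB = model(A), model(B)
var = fA.var()

# Pick-freeze estimator of first-order Sobol indices: freeze column i of
# A inside B and correlate the outputs (Saltelli 2010 form).
S = []
for i in range(d):
    ABi = B.copy()
    ABi[:, i] = A[:, i]
    S.append(float(np.mean(fA * (model(ABi) - fB)) / var))

print(S)   # descending importance: x1 would be optimized in the first layer
```

For this additive model the analytic indices are 16/20.25, 4/20.25 and 0.25/20.25, so the estimator recovers the ranking that drives the layer decomposition.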
Styles APA, Harvard, Vancouver, ISO, etc.
29

Pérez, Victor M., John E. Renaud et Layne T. Watson. « Reduced sampling for construction of quadratic response surface approximations using adaptive experimental design ». Engineering Computations 25, no 8 (14 novembre 2008) : 764–82. http://dx.doi.org/10.1108/02644400810909607.

Texte intégral
Résumé :
Purpose To reduce the computational complexity per step from O(n²) to O(n) for optimization based on quadratic surrogates, where n is the number of design variables. Design/methodology/approach Applying nonlinear optimization strategies directly to complex multidisciplinary systems can be prohibitively expensive when the complexity of the simulation codes is large. Increasingly, response surface approximations (RSAs), and specifically quadratic approximations, are being integrated with nonlinear optimizers in order to reduce the CPU time required for the optimization of complex multidisciplinary systems. For evaluation by the optimizer, RSAs provide a computationally inexpensive lower-fidelity representation of the system performance. The curse of dimensionality is a major drawback in the implementation of these approximations, as the amount of required data grows quadratically with the number n of design variables in the problem. In this paper a novel technique to reduce the magnitude of the sampling from O(n²) to O(n) is presented. Findings The technique uses prior information to approximate the eigenvectors of the Hessian matrix of the RSA and only requires the eigenvalues to be computed by response surface techniques. The technique is implemented in a sequential approximate optimization algorithm and applied to engineering problems of variable size and characteristics. Results demonstrate that a reduction in the data required per step from O(n²) to O(n) points can be accomplished without significantly compromising the performance of the optimization algorithm. Originality/value A reduction in the time (number of system analyses) required per step from O(n²) to O(n) is significant, even more so as n increases. The novelty lies in how only O(n) system analyses can be used to approximate a Hessian matrix whose estimation normally requires O(n²) system analyses.
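The key trick in the abstract, using prior information for the Hessian eigenvectors so that only the n eigenvalues need sampling, can be illustrated on a small quadratic. With one central-difference pair per (assumed-known) eigendirection, 2n + 1 evaluations suffice instead of the O(n²) points a full quadratic fit would need. The toy function and the use of exact eigenvectors are illustrative assumptions:

```python
import numpy as np

H_true = np.array([[4.0, 1.0],
                   [1.0, 3.0]])

def f(x):                       # stand-in for an expensive system analysis
    return 0.5 * x @ H_true @ x

n = 2
x0 = np.zeros(n)
f0 = f(x0)
h = 0.1

# Prior information is assumed to supply approximate Hessian eigenvectors V
# (here the exact ones, for illustration); only the n eigenvalues are then
# estimated, one central-difference pair per direction: 2n + 1 analyses,
# i.e. O(n), versus the O(n^2) a full quadratic fit would need.
_, V = np.linalg.eigh(H_true)
lam = np.array([(f(x0 + h * V[:, i]) - 2.0 * f0 + f(x0 - h * V[:, i])) / h**2
                for i in range(n)])

H_approx = V @ np.diag(lam) @ V.T   # reconstructed surrogate Hessian
print(H_approx)
```

For an exactly quadratic f the central difference along an eigendirection returns the eigenvalue exactly, so the reconstruction matches H_true; for a general f the quality depends on how good the prior eigenvector estimate is.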
Styles APA, Harvard, Vancouver, ISO, etc.
30

Humphreys, David, A. Kupresanin, M. D. Boyer, J. Canik, C. S. Chang, E. C. Cyr, R. Granetz et al. « Advancing Fusion with Machine Learning Research Needs Workshop Report ». Journal of Fusion Energy 39, no 4 (août 2020) : 123–55. http://dx.doi.org/10.1007/s10894-020-00258-1.

Texte intégral
Résumé :
Abstract Machine learning and artificial intelligence (ML/AI) methods have been used successfully in recent years to solve problems in many areas, including image recognition, unsupervised and supervised classification, game-playing, system identification and prediction, and autonomous vehicle control. Data-driven machine learning methods have also been applied to fusion energy research for over 2 decades, including significant advances in the areas of disruption prediction, surrogate model generation, and experimental planning. The advent of powerful and dedicated computers specialized for large-scale parallel computation, as well as advances in statistical inference algorithms, have greatly enhanced the capabilities of these computational approaches to extract scientific knowledge and bridge gaps between theoretical models and practical implementations. Large-scale commercial success of various ML/AI applications in recent years, including robotics, industrial processes, online image recognition, financial system prediction, and autonomous vehicles, have further demonstrated the potential for data-driven methods to produce dramatic transformations in many fields. These advances, along with the urgency of need to bridge key gaps in knowledge for design and operation of reactors such as ITER, have driven planned expansion of efforts in ML/AI within the US government and around the world. The Department of Energy (DOE) Office of Science programs in Fusion Energy Sciences (FES) and Advanced Scientific Computing Research (ASCR) have organized several activities to identify best strategies and approaches for applying ML/AI methods to fusion energy research. This paper describes the results of a joint FES/ASCR DOE-sponsored Research Needs Workshop on Advancing Fusion with Machine Learning, held April 30–May 2, 2019, in Gaithersburg, MD (full report available at https://science.osti.gov/-/media/fes/pdf/workshop-reports/FES_ASCR_Machine_Learning_Report.pdf). 
The workshop drew on broad representation from both the FES and ASCR scientific communities, and identified seven Priority Research Opportunities (PROs) with high potential for advancing fusion energy. In addition to the PRO topics themselves, the workshop identified research guidelines to maximize the effectiveness of ML/AI methods in fusion energy science, which include focusing on uncertainty quantification, methods for quantifying regions of validity of models and algorithms, and applying highly integrated teams of ML/AI mathematicians, computer scientists, and fusion energy scientists with domain expertise in the relevant areas.
Styles APA, Harvard, Vancouver, ISO, etc.
31

Lvova, O. L., et I. R. Ivaniv. « The moral and legal foundations of bioethics in the context of human rights : legal theory and international practice ». Almanac of law : The role of legal doctrine in ensuring of human rights 11, no 11 (août 2020) : 327–33. http://dx.doi.org/10.33663/2524-017x-2020-11-55.

Texte intégral
Résumé :
Modern processes of globalization taking place in the field of law pose a great challenge to the idea of human nature, which is recognized in Ukraine as the highest social value, as well as to the concept and essence of law itself. In our opinion, this is a threat on a global scale and necessitates the search for an adequate response to the threat that scientific and technical progress in the field of biomedicine poses both to the natural (physical) existence of man and to the preservation of his moral identity. In fact, these foundations have become the prerequisites for the development of the science of bioethics. Bioethics studies controversial and ambiguous issues and proposes a humanitarian examination, which aims to assess the arguments in favor of the development of human creativity, health and prevention of premature death, and arguments in favor of preserving human identity in its spiritual and physical integrity. The purpose of the article is to study the essence of controversial bioethical problems, the reasons for their occurrence and the prospects for solving them. Bioethical issues usually include the ethical issues of abortion; contraception and new reproductive technologies (artificial insemination, surrogacy); conducting experiments on humans and animals; obtaining informed consent and ensuring patients' rights; determination of death, suicide and euthanasia; problems in relation to dying patients (hospices); demographic policy and family planning; genetics (including problems of genome research, genetic engineering and gene therapy); transplantology; health equity; human cloning; manipulation of stem cells; and others. These issues, related to the progress of genetics, genomics, pharmacology, transplantation, biotechnology and cloning, are becoming increasingly important as a direction of international law in the context of ensuring and protecting human rights.
The legal literature indicates the formation of "biolaw", "bioethical legislation" and "bioethical human rights", combining the possibilities and purposes of medicine and law. In our article, we have explored only some of these issues, those that are currently the most relevant and debatable and therefore require detailed analysis. These include, in our view, the legal status of the embryo, therapeutic and reproductive cloning, abortion, the use of assisted reproductive technologies and organ transplantation. In order to adequately cover these issues, we compare the rules of law governing these debatable questions with the views of church representatives and scholars. We also propose changes to the legislation of Ukraine so that the rules of law governing these issues meet moral and ethical principles. In conclusion, since bioethics, as a science concerned with survival, combines biological knowledge with universal human values, natural human rights, honour and dignity can be considered the moral and legal principles of bioethics, which law itself must defend, in particular with the aim of ensuring the natural (physical) existence of the human being and preserving his or her moral identity. Keywords: human rights, moral, bioethics, abortion, reproductive technologies, cloning.
Styles APA, Harvard, Vancouver, ISO, etc.
32

Bunn, Frances, Claire Goodman, Peter Reece Jones, Bridget Russell, Daksha Trivedi, Alan Sinclair, Antony Bayer, Greta Rait, Jo Rycroft-Malone et Chris Burton. « Managing diabetes in people with dementia : a realist review ». Health Technology Assessment 21, no 75 (décembre 2017) : 1–140. http://dx.doi.org/10.3310/hta21750.

Texte intégral
Résumé :
Background Dementia and diabetes mellitus are common long-term conditions that coexist in a large number of older people. People living with dementia and diabetes may be at increased risk of complications such as hypoglycaemic episodes because they are less able to manage their diabetes. Objectives To identify the key features or mechanisms of programmes that aim to improve the management of diabetes in people with dementia and to identify areas needing further research. Design Realist review, using an iterative, stakeholder-driven, four-stage approach. This involved scoping the literature and conducting stakeholder interviews to develop initial programme theories, systematic searches of the evidence to test and develop the theories, and the validation of programme theories with a purposive sample of stakeholders. Participants Twenty-six stakeholders (user/patient representatives, dementia care providers, clinicians specialising in dementia or diabetes and researchers) took part in interviews and 24 participated in a consensus conference. Data sources The following databases were searched from 1990 to March 2016: MEDLINE (PubMed), Cumulative Index to Nursing and Allied Health Literature, Scopus, The Cochrane Library (including the Cochrane Database of Systematic Reviews), Database of Abstracts of Reviews of Effects, the Health Technology Assessment (HTA) database, NHS Economic Evaluation Database, AgeInfo (Centre for Policy on Ageing – UK), Social Care Online, the National Institute for Health Research (NIHR) portfolio database, NHS Evidence, Google (Google Inc., Mountain View, CA, USA) and Google Scholar (Google Inc., Mountain View, CA, USA). Results We included 89 papers. Ten papers focused directly on people living with dementia and diabetes, and the rest related to people with dementia or diabetes or other long-term conditions.
We identified six context–mechanism–outcome (CMO) configurations that provide an explanatory account of how interventions might work to improve the management of diabetes in people living with dementia. This includes embedding positive attitudes towards people living with dementia, person-centred approaches to care planning, developing skills to provide tailored and flexible care, regular contact, family engagement and usability of assistive devices. A general metamechanism that emerges concerns the synergy between an intervention strategy, the dementia trajectory and social and environmental factors, especially family involvement. A flexible service model for people with dementia and diabetes would enable this synergy in a way that would lead to the improved management of diabetes in people living with dementia. Limitations There is little evidence relating to the management of diabetes in people living with dementia, although including a wider literature provided opportunities for transferable learning. The outcomes in our CMOs are largely experiential rather than clinical. This reflects the evidence available. Outcomes such as increased engagement in self-management are potential surrogates for better clinical management of diabetes, but this is not proven. Conclusions This review suggests that there is a need to prioritise quality of life, independence and patient and carer priorities over a more biomedical, target-driven approach. Much current research, particularly that specific to people living with dementia and diabetes, identifies deficiencies in, and problems with, current systems. Although we have highlighted the need for personalised care, continuity and family-centred approaches, there is much evidence to suggest that this is not currently happening.
Future research on the management of diabetes in older people with complex health needs, including those with dementia, needs to look at how organisational structures and workforce development can be better aligned to the needs of people living with dementia and diabetes. Study registration This study is registered as PROSPERO CRD42015020625. Funding The NIHR HTA programme.
Styles APA, Harvard, Vancouver, ISO, etc.
33

Mehrmohamadi, Mahya, Mohammad S. Esfahani, Joanne Soo, Florian Scherer, Joseph G. Schroers-Martin, Binbin Chen, David M. Kurtz et al. « Distinct Chromatin Accessibility Profiles of Lymphoma Subtypes Revealed By Targeted Cell Free DNA Profiling ». Blood 132, Supplement 1 (29 novembre 2018) : 672. http://dx.doi.org/10.1182/blood-2018-99-119361.

Texte intégral
Résumé :
Abstract Background Diffuse large B-cell lymphomas (DLBCL) can be divided into subtypes that relate to their cell-of-origin: germinal center B-cell (GCB) and activated B-cell (ABC). These clinically distinct subtypes were originally discovered based on their unique transcriptional signatures, but they also harbor distinct genetic aberrations and epigenetic profiles. We previously showed the utility of genetic mutations in circulating tumor DNA (ctDNA) for DLBCL subtyping using cancer personalized profiling by deep sequencing (CAPP-Seq) [Scherer et al., Sci Transl Med 2016, Newman et al., Nat Med 2014]. Here, we assess epigenetic information encoded in cell-free DNA (cfDNA) for addressing this distinction. Methods We extended recent observations revealing nucleosome depletion at transcription start sites (TSS) of highly expressed genes within cfDNA [Snyder et al. 2016 Cell and Ulz et al. 2016 Nat Genetics]. Specifically, to overcome the inherent sensitivity limitations of cfDNA whole genome sequencing (WGS), we focused on 32 genes differentially expressed between DLBCL subtypes (as well as 70 control genes). We targeted 2 kb regions flanking each TSS for these genes by CAPP-Seq [Newman et al. 2014 Nature Med]. Plasma samples were collected from 41 individuals and cfDNA subjected to ultra-deep sequencing (~4000x) by CAPP-Seq. DLBCL cases were labeled as either ABC or GCB based on gene expression in tumor biopsies. Results We observed significant heterogeneity in the TSS profiles of individual genes. Despite this variation, many genes in our targeted panel exhibited discriminatory profiles. For example, CD20 (MS4A1) reliably distinguished DLBCL cases from control subjects by nucleosome depletion at the TSS, as expected (p = 1.5e-05, Fig 1a). Remarkably, when considering 32 genes differentially expressed between DLBCL subtypes, we observed significant differences in their TSS profiles in plasma from patients with GCB vs ABC DLBCL (p = 0.0002, Fig 1b).
For the majority of the subtype-specific genes, the pattern of nucleosome depletion at the TSS was consistent with expected expression (Fig 1b). Specifically, genes typically over-expressed in GCB DLBCL (e.g., CD10/MME, LMO2, SERPINA9), had significantly more TSS nucleosome depletion in cfDNA from GCB-like DLBCL patients. Conversely, we observed significantly more nucleosome depletion in cfDNA of ABC DLBCL patients at the TSS of genes known to be transcriptionally more active in ABC DLBCL (e.g., IRF4, PIM1, CCND2 and IL16). When aggregating the epigenetic profiles across these genes into a single cell-of-origin score, we observed significant stratification of patients with distinct DLBCL subtypes (Fig 1c, p=0.008). Across the cohort, this epigenetic cell-of-origin score from cfDNA showed significant correlation (Spearman rho=0.67, p=0.05) with a score derived from somatic mutations genotyped from ctDNA [Scherer et al. 2016 Sci Transl Med]. Conclusions Lymphoma subtypes exhibit distinct chromatin accessibility profiles that can be captured by targeted cell free DNA profiling, and that can serve as surrogates for expected gene expression differences. These epigenetic features can be used to noninvasively classify DLBCL molecular subtypes using blood plasma and should be readily extensible to other cancer classification problems. Extension of this approach to other epigenetic features could serve to further refine potential applications. Figure legend. Figure 1. (A) Sequencing coverage around the TSS of the B-cell antigen CD20 (MS4A1) is shown for three representative DLBCL patients (red) and three normal controls (gray). The boxplot shows summary of normalized depth across all samples (n=10 DLBCL, n=31 control). (B) The y-axis shows 32 genes with known ABC-favoring (blue) and GCB-favoring (orange) expression. The x-axis shows a delta score defined as log2( mean normalized depth across GCB samples - mean normalized depth across GCB samples). 
(C) Boxplot of the GCBness score obtained by subtracting the overall depth around the TSSs of a set of GCB-genes from the same metric for a set of ABC-genes. Disclosures No relevant conflicts of interest to declare.
Styles APA, Harvard, Vancouver, ISO, etc.
34

Archvadze, Joseph. « TRANSFORMATION OF THE FORMAT OF STUDY AND WORK IN THE CONTEXT OF THE CORONAVIRUS PANDEMIC ». Economic Profile 15, no 20 (25 décembre 2020) : 8–21. http://dx.doi.org/10.52244/ep.2020.20.01.

Texte intégral
Résumé :
Despite the fact that the coronavirus pandemic caused an economic crisis and a significant reduction in demand and supply, it gave a strong impetus to the development and massive use of information technology, in the beginning of a new long wave of the economic cycle. The pandemic is not a challenge only to the world community, but it also tests it - to what extent it is able to quickly and efficiently digitize the economy, transfer production to a new technological level, and ultimately, implement the fourth technological revolution. The Internet, telecommunications already have the opportunity to improve their qualifications in their own or promising specialties of interested persons through appropriate online courses. In countries with developed market economies and before the pandemic, the number of students in such courses was almost equal to the number of university students, and in the coming years, in all likelihood, it will significantly exceed it. This fully fits into the life-long learning trend caused by rapid technological changes, which nullifies the "eternal status" of the acquired profession. In less than a year since the beginning of the global coronavirus pandemic, there have been significant changes in the organization of forms of study and teaching in universities. The latter have to seriously revise their teaching methods in the "rapid chess" mode and abandon outdated forms of teaching. COVID-19 has paved the way for distance learning and work, online lessons, lectures, video conferences. However, the distance, online form of training and work has not only advantages, but also disadvantages, the ultimate level of efficiency. At the same time, at present, not all students, students and even educational institutions have the technical and material capabilities to provide online learning. Knowledge is a multifunctional tool with the help of which a person copes with certain tasks. 
Taking this into account, the university should equip young people with such knowledge that will have not only informational load, but also great applied value. In the context of the pandemic, universities are faced with serious challenges, most of them are like a downward spiral orbit: on the one hand, online teaching saves money on the maintenance and operation of classrooms, but on the other hand, this means reducing the costs of students (their parents, sponsors, etc. etc.), the appropriate adjustment of their mobility plans, which makes the administration and the state the need to pursue a new, changed educational and economic policy. In the context of a pandemic, the implementation of these tasks faces a serious danger: traditional forms of study and knowledge transfer are disrupted and the market requirements become not entirely clear due to its significant narrowing. The question of the need to reduce the dependence of the cost of studying at universities on the contributions of the students themselves (their parents, sponsors, etc.) is becoming increasingly acute. Such expenses in state universities should be completely borne by the state, which will practically reduce to zero the motivation of the universities to maintain undisciplined students within the walls of the university and will significantly increase the demand towards them (and, accordingly, the quality of education). Which place can online learning take in a pandemic period? In general (without taking into account the mention in the context of the pandemic), the advantage of online education is unambiguous for those specialties, the perspective of which presupposes predominantly the same format. However, it should be remembered that the online format is a specific, actually surrogate form of social relations and professional and personal self-confidence and communication skills are practically not based on it. 
So, instead of a tug of war between online and offline forms, it is necessary to find the optimal balance between the forms of their complementarity and not interchangeability. Online courses for stakeholders are no less mobilizing. Listening to a lecture in this format is possible at a convenient time (at home, while traveling, while waiting for a bus or waiting in line for purchases, etc.). These courses are a subjective choice of everyone and are not a forced form for gaining some number of so-called credits. Due to high motivation, the degree of development is more stable and higher. Over time, the knowledge gained through such courses will become more valuable in the selection of an employee for a vacant position. Distance learning, in spite of the fact that it has a number of advantages (convenient, economical - it saves costs on transport, time, egalitarian, etc.) is still a substitute - it only partially replaces, but does not replace fully relations between the lecturer and student (as well as between academic staff). It does not replace such aspects of full-fledged student life as the living relations of students, their joint participation in public events. In the online format, it is possible to transfer knowledge, but not the nuances of culture, behavior, creativity, national and regional sensitivity, humor, etc. Online learning and relationships is, figuratively speaking, a movie shot on a flat screen, while classroom learning is a movie in 3D. The focus on distance learning has one very significant side effect: it creates a danger to the level and frequency of mutual communication of academic staff. Because of this, departments and social and professional unions can undergo a serious deformation of their significance. Accordingly, this will increase the degree of alienation of the academic staff from the universities that provided them with jobs. 
The problem of socialization, from a slightly different perspective, is also faced by persons engaged exclusively in scientific work - how will scientific links, "chained" by common ideas and goals, be formed? The final effectiveness of the chosen training format (online vs. offline format) is determined by the effectiveness of the training process. The latter is largely determined by compliance with the requirements of the labor market. In turn, this compliance is largely predetermined by the systemic (instead of episodic) contact of potential employers with universities. The coronavirus pandemic is directly reflected in the scale and format of employment - it reduces the overall contingent and significantly changes the employment situation, "throws" a significant part of the employees who have retained their jobs online. Remote forms of work, which were already pioneered by higher and general education schools, will develop even more, new, more sophisticated forms of intensification of work of employees employed at remote locations will appear. Here, as in other spheres of economic and social life, the boundaries between the traditional division of working and non-working (free) time will actually be erased. However, online work has refrained level of efficiency. For example, it is very problematic to create a workplace at home - not everyone has the opportunity to organize a separate office, a desk for such work, which creates psychological and, often, physical discomfort. The fact is that online work has a concomitant negative effect. It causes the atomization of the collective, the corrosion of its unity. To the extent that representatives of higher education and the academic sphere can lead the students who have switched online, not only in their studies, but also in the development of skills and feelings of socialization and empathy in them, the future and the maturity of civil society will largely depend on this. 
It is necessary to find a kind of "golden mean" of relations both between students and academic staff, and with representatives of "their own cluster", in which direct live relationships may not be as intense and daily as before. The rest of the time will be completely transferred to their self-organizing and self-fulfilling "box of time". The final victory over the pandemic will eliminate or significantly alleviate most of the above problems, however, the need to adapt everyday life, study, work to the online format of relationships will remain highly relevant.
Styles APA, Harvard, Vancouver, ISO, etc.
35

Shankar Bhattacharjee, Kalyan, Hemant Kumar Singh et Tapabrata Ray. « Multi-Objective Optimization With Multiple Spatially Distributed Surrogates ». Journal of Mechanical Design 138, no 9 (18 juillet 2016). http://dx.doi.org/10.1115/1.4034035.

Texte intégral
Résumé :
In engineering design optimization, evaluation of a single solution (design) often requires running one or more computationally expensive simulations. Surrogate assisted optimization (SAO) approaches have long been used for solving such problems, in which approximations/surrogates are used in lieu of computationally expensive simulations during the course of search. Existing SAO approaches often use the same type of approximation model to represent all objectives and constraints in all regions of the search space. The selection of a type of surrogate model over another is nontrivial and an a priori choice limits flexibility in representation. In this paper, we introduce a multi-objective evolutionary algorithm (EA) with multiple adaptive spatially distributed surrogates. Instead of a single global surrogate, local surrogates of multiple types are constructed in the neighborhood of each offspring solution and a multi-objective search is conducted using the best surrogate for each objective and constraint function. The proposed approach offers flexibility of representation by capitalizing on the benefits offered by various types of surrogates in different regions of the search space. The approach is also immune to illvalidation since approximated and truly evaluated solutions are not ranked together. The performance of the proposed surrogate assisted multi-objective algorithm (SAMO) is compared with baseline nondominated sorting genetic algorithm II (NSGA-II) and NSGA-II embedded with global and local surrogates of various types. The performance of the proposed approach is quantitatively assessed using several engineering design optimization problems. The numerical experiments demonstrate competence and consistency of SAMO.
Styles APA, Harvard, Vancouver, ISO, etc.
36

Kalina, Karl A., Lennart Linden, Jörg Brummund et Markus Kästner. « FE$${}^\textrm{ANN}$$ : an efficient data-driven multiscale approach based on physics-constrained neural networks and automated data mining ». Computational Mechanics, 8 février 2023. http://dx.doi.org/10.1007/s00466-022-02260-0.

Texte intégral
Résumé :
AbstractHerein, we present a new data-driven multiscale framework called FE$${}^\textrm{ANN}$$ ANN which is based on two main keystones: the usage of physics-constrained artificial neural networks (ANNs) as macroscopic surrogate models and an autonomous data mining process. Our approach allows the efficient simulation of materials with complex underlying microstructures which reveal an overall anisotropic and nonlinear behavior on the macroscale. Thereby, we restrict ourselves to finite strain hyperelasticity problems for now. By using a set of problem specific invariants as the input of the ANN and the Helmholtz free energy density as the output, several physical principles, e. g., objectivity, material symmetry, compatibility with the balance of angular momentum and thermodynamic consistency are fulfilled a priori. The necessary data for the training of the ANN-based surrogate model, i. e., macroscopic deformations and corresponding stresses, are collected via computational homogenization of representative volume elements (RVEs). Thereby, the core feature of the approach is given by a completely autonomous mining of the required data set within an overall loop. In each iteration of the loop, new data are generated by gathering the macroscopic deformation states from the macroscopic finite element simulation and a subsequently sorting by using the anisotropy class of the considered material. Finally, all unknown deformations are prescribed in the RVE simulation to get the corresponding stresses and thus to extend the data set. The proposed framework consequently allows to reduce the number of time-consuming microscale simulations to a minimum. It is exemplarily applied to several descriptive examples, where a fiber reinforced composite with a highly nonlinear Ogden-type behavior of the individual components is considered. Thereby, a rather high accuracy could be proved by a validation of the approach.
Styles APA, Harvard, Vancouver, ISO, etc.
37

Eriksson, Lise. « Outsourcing problems or regulating altruism ? Parliamentary debates on domestic and cross-border surrogacy in Finland and Norway ». European Journal of Women's Studies, 3 mai 2021, 135050682110099. http://dx.doi.org/10.1177/13505068211009936.

Texte intégral
Résumé :
This article employs the concept of respectability and the discursive representation of gender equality policies to discuss how surrogacy is represented in Nordic parliamentary debates and policy documents. The article’s objective is to study how respectability, problems and equality are represented and discursively and rhetorically produced through a comparative study of Finnish and Norwegian political discourses on domestic unpaid surrogacy and cross-border commercial surrogacy. The article uses rhetorical and discursive analysis to analyse the Finnish and Norwegian Parliaments’ bills, members’ initiatives and proceedings from 2002 to 2018. Finland’s policy on surrogacy has evolved from an unregulated and permissive approach towards a more restrictive one, with discourses focusing on medicalisation, equality, altruism and safety concerning domestic surrogacy and problems and risks concerning cross-border surrogacy. Norway’s policy on surrogacy has been restrictive consistently, with discourses focusing on surrogacy as a transnational social problem involving exploitation of women and children, and biocentrism. Analysing surrogacy regulation in Nordic welfare states, the author concludes that policies and parliamentary debates in both countries have expressed expectations for inclusive health policies and social security for families. Cross-border surrogacy is characterised as an unwanted consequence of globalisation and marketisation of reproduction. Surrogate mothers’ respectability is constructed through rhetoric on differences in terms of nationality, class and binary representations of female caring and instrumentalism.
Styles APA, Harvard, Vancouver, ISO, etc.
38

Wang, Z., J. He, W. J. Milliken et X. H. Wen. « Fast History Matching and Optimization Using a Novel Physics-BasedData-Driven Model : An Application to a Diatomite Reservoir ». SPE Journal, 1 juin 2021, 1–20. http://dx.doi.org/10.2118/200772-pa.

Texte intégral
Résumé :
Summary Full-physics models in history matching (HM) and optimization can be computationally expensive because these problems usually require hundreds of simulations or more. In a previous study, a physics-baseddata-driven network model was implemented with a commercial simulator that served as a surrogate without the need to build a 3D geological model. In this paper, the network model is reconstructed to account for complex reservoir conditions of mature fields and successfully apply it to a diatomite reservoir in the San Joaquin Valley, California, for rapid HM and optimization. The reservoir is simplified into a network of 1D connections between well perforations. These connections are discretized into gridblocks, and the grid properties are calibrated to historical production data. Elevation change, saturation distribution, capillary pressure, and relative permeability are accounted for to best represent the mature field conditions. To simulate this physics-based network model through a commercial simulator, an equivalent Cartesian model is designed where rows correspond to the previously mentioned connections. Thereafter, the HM can be performed with the ensemble smoother with multiple data assimilation (ESMDA) algorithm under a sequential iterative process. A representative model after HM is then used for well control optimization. The network model methodology has been successfully applied to the waterflood optimization for a 56-well sector model of a diatomite reservoir in the San Joaquin Valley. HM results show that the network model matches with field level production history and gives reasonable matches for most of the wells, including pressure and volumetric data. The calibrated posterior ensemble of HM yields a satisfactory production prediction that is verified by the remaining historical data. For well control optimization, the P50 model is selected to maximize the net present value (NPV) in 5 years under provided well/field constraints. 
This confirms that the calibrated network model is accurate enough for production forecasts and optimization. The use of a commercial simulator in the network model provided flexibility to account for complex physics, such as elevation difference between wells, saturation nonequilibrium, and strong capillary pressure. Unlike the traditional big-loop workflow that relies on a detailed characterization of geological models, the proposed network model only requires production data and can be built and updated rapidly. The model also runs much faster (tens of seconds) than a full-physics model because of the use of much fewer gridblocks. To our knowledge, this is the first time this physics-baseddata-driven network model is applied with a commercial simulator on a field waterflood case. Unlike approaches developed with analytic solutions, the use of a commercial simulator makes it feasible to be further extended for complex processes (e.g., thermal or compositional flow). It serves as a useful surrogate model for both fast and reliable decision-making in reservoir management.
Styles APA, Harvard, Vancouver, ISO, etc.
39

Bhattacharjee, Kalyan Shankar, Hemant Kumar Singh et Tapabrata Ray. « Multiple Surrogate-Assisted Many-Objective Optimization for Computationally Expensive Engineering Design ». Journal of Mechanical Design 140, no 5 (23 mars 2018). http://dx.doi.org/10.1115/1.4039450.

Texte intégral
Résumé :
Engineering design often involves problems with multiple conflicting performance criteria, commonly referred to as multi-objective optimization problems (MOP). MOPs are known to be particularly challenging if the number of objectives is more than three. This has motivated recent attempts to solve MOPs with more than three objectives, which are now more specifically referred to as “many-objective” optimization problems (MaOPs). Evolutionary algorithms (EAs) used to solve such problems require numerous design evaluations prior to convergence. This is not practical for engineering applications involving computationally expensive evaluations such as computational fluid dynamics and finite element analysis. While the use of surrogates has been commonly studied for single-objective optimization, there is scarce literature on its use for MOPs/MaOPs. This paper attempts to bridge this research gap by introducing a surrogate-assisted optimization algorithm for solving MOP/MaOP within a limited computing budget. The algorithm relies on principles of decomposition and adaptation of reference vectors for effective search. The flexibility of function representation is offered through the use of multiple types of surrogate models. Furthermore, to efficiently deal with constrained MaOPs, marginally infeasible solutions are promoted during initial phases of the search. The performance of the proposed algorithm is benchmarked with the state-of-the-art approaches using a range of problems with up to ten objective problems. Thereafter, a case study involving vehicle design is presented to demonstrate the utility of the approach.
Styles APA, Harvard, Vancouver, ISO, etc.
40

Hou, Chun Kit Jeffery, et Kamran Behdinan. « Dimensionality Reduction in Surrogate Modeling : A Review of Combined Methods ». Data Science and Engineering, 21 août 2022. http://dx.doi.org/10.1007/s41019-022-00193-5.

Texte intégral
Résumé :
AbstractSurrogate modeling has been popularized as an alternative to full-scale models in complex engineering processes such as manufacturing and computer-assisted engineering. The modeling demand exponentially increases with complexity and number of system parameters, which consequently requires higher-dimensional engineering solving techniques. This is known as the curse of dimensionality. Surrogate models are commonly used to replace costly computational simulations and modeling of complex geometries. However, an ongoing challenge is to reduce execution and memory consumption of high-complexity processes, which often exhibit nonlinear phenomena. Dimensionality reduction algorithms have been employed for feature extraction, selection, and elimination for simplifying surrogate models of high-dimensional problems. By applying dimensionality reduction to surrogate models, less computation is required to generate surrogate model parts while retaining sufficient representation accuracy of the full process. This paper aims to review the current literature on dimensionality reduction integrated with surrogate modeling methods. A review of the current state-of-the-art dimensionality reduction and surrogate modeling methods is introduced with a discussion of their mathematical implications, applications, and limitations. Finally, current studies that combine the two topics are discussed and avenues of further research are presented.
Styles APA, Harvard, Vancouver, ISO, etc.
41

Oldenburg, Jan, Finja Borowski, Alper Öner, Klaus-Peter Schmitz et Michael Stiehm. « Geometry aware physics informed neural network surrogate for solving Navier–Stokes equation (GAPINN) ». Advanced Modeling and Simulation in Engineering Sciences 9, no 1 (21 juin 2022). http://dx.doi.org/10.1186/s40323-022-00221-z.

Texte intégral
Résumé :
AbstractMany real world problems involve fluid flow phenomena, typically be described by the Navier–Stokes equations. The Navier–Stokes equations are partial differential equations (PDEs) with highly nonlinear properties. Currently mostly used methods solve this differential equation by discretizing geometries. In the field of fluid mechanics the finite volume method (FVM) is widely used for numerical flow simulation, so-called computational fluid dynamics (CFD). Due to high computational costs and cumbersome generation of the discretization they are not widely used in real time applications. Our presented work focuses on advancing PDE-constrained deep learning frameworks for more real-world applications with irregular geometries without parameterization. We present a Deep Neural Network framework that generate surrogates for non-geometric boundaries by data free solely physics driven training, by minimizing the residuals of the governing PDEs (i.e., conservation laws) so that no computationally expensive CFD simulation data is needed. We named this method geometry aware physics informed neural network—GAPINN. The framework involves three network types. The first network reduces the dimensions of the irregular geometries to a latent representation. In this work we used a Variational-Auto-Encoder (VAE) for this task. We proposed the concept of using this latent representation in combination with spatial coordinates as input for PINNs. Using PINNs we showed that it is possible to train a surrogate model purely driven on the reduction of the residuals of the underlying PDE for irregular non-parametric geometries. Furthermore, we showed the way of designing a boundary constraining network (BCN) to hardly enforce boundary conditions during training of the PINN. We evaluated this concept on test cases in the fields of biofluidmechanics. The experiments comprise laminar flow (Re = 500) in irregular shaped vessels. 
The main highlight of the presented GAPINN is the use of PINNs on irregular non-parameterized geometries. Despite that we showed the usage of this framework for Navier Stokes equations, it should be feasible to adapt this framework for other problems described by PDEs.
Styles APA, Harvard, Vancouver, ISO, etc.
42

Zhang, Qi, Yizhong Wu, Li Lu et Ping Qiao. « An Adaptive Dendrite-HDMR Metamodeling Technique for High-Dimensional Problems ». Journal of Mechanical Design 144, no 8 (24 mars 2022). http://dx.doi.org/10.1115/1.4053526.

Texte intégral
Résumé :
Abstract High-dimensional model representation (HDMR), decomposing the high-dimensional problem into summands of different order component terms, has been widely researched to work out the dilemma of “curse-of-dimensionality” when using surrogate techniques to approximate high-dimensional problems in engineering design. However, the available one-metamodel-based HDMRs usually encounter the predicament of prediction uncertainty, while current multi-metamodels-based HDMRs cannot provide simple explicit expressions for black-box problems, and have high computational complexity in terms of constructing the model by the explored points and predicting the responses of unobserved locations. Therefore, aimed at such problems, a new stand-alone HDMR metamodeling technique, termed as Dendrite-HDMR, is proposed in this study based on the hierarchical Cut-HDMR and the white-box machine learning algorithm, Dendrite Net. The proposed Dendrite-HDMR not only provides succinct and explicit expressions in the form of Taylor expansion but also has relatively higher accuracy and stronger stability for most mathematical functions than other classical HDMRs with the assistance of the proposed adaptive sampling strategy, named KKMC, in which k-means clustering algorithm, k-Nearest Neighbor classification algorithm and the maximum curvature information of the provided expression are utilized to sample new points to refine the model. Finally, the Dendrite-HDMR technique is applied to solve the design optimization problem of the solid launch vehicle propulsion system with the purpose of improving the impulse-weight ratio, which represents the design level of the propulsion system.
Styles APA, Harvard, Vancouver, ISO, etc.
43

Wang, Qian. « AN ITERATIVE HDMR APPROACH FOR ENGINEERING RELIABILITY ANALYSIS ». Proceedings of International Structural Engineering and Construction 8, no 1 (juillet 2021). http://dx.doi.org/10.14455/isec.2021.8(1).rad-04.

Texte intégral
Résumé :
Engineering reliability analysis has long been an active research area. Surrogate models, or metamodels, are approximate models that can be created to replace implicit performance functions in the probabilistic analysis of engineering systems. Traditional 1st-order or second-order high dimensional model representation (HDMR) methods are shown to construct accurate surrogate models of response functions in an engineering reliability analysis. Although very efficient and easy to implement, 1st-order HDMR models may not be accurate, since the cross-effects of variables are neglected. Second-order HDMR models are more accurate; however they are more complicated to implement. Moreover, they require much more sample points, i.e., finite element (FE) simulations, if FE analyses are employed to compute values of a performance function. In this work, a new probabilistic analysis approach combining iterative HDMR and a first-order reliability method (FORM) is investigated. Once a performance function is replaced by a 1st-order HDMR model, an alternate FORM is applied. In order to include higher-order contributions, additional sample points are generated and HDMR models are updated, before FORM is reapplied. The analysis iteration continues until the reliability index converges. The novelty of the proposed iterative strategy is that it greatly improves the efficiency of the numerical algorithm. As numerical examples, two engineering problems are studied and reliability analyses are performed. Reliability indices are obtained within a few iterations, and they are found to have a good accuracy. The proposed method using iterative HDMR and FORM provides a useful tool for practical engineering applications.
Styles APA, Harvard, Vancouver, ISO, etc.
44

Montes de Oca Zapiain, David, Apaar Shanker et Surya R. Kalidindi. « Convolutional Neural Networks for the Localization of Plastic Velocity Gradient Tensor in Polycrystalline Microstructures ». Journal of Engineering Materials and Technology 144, no 1 (28 mai 2021). http://dx.doi.org/10.1115/1.4051085.

Texte intégral
Résumé :
Abstract Recent work has demonstrated the potential of convolutional neural networks (CNNs) in producing low-computational cost surrogate models for the localization of mechanical fields in two-phase microstructures. The extension of the same CNNs to polycrystalline microstructures is hindered by the lack of an efficient formalism for the representation of the crystal lattice orientation in the input channels of the CNNs. In this paper, we demonstrate the benefits of using generalized spherical harmonics (GSH) for addressing this challenge. A CNN model was successfully trained to predict the local plastic velocity gradient fields in polycrystalline microstructures subjected to a macroscopically imposed loading condition. Specifically, it is demonstrated that the proposed approach improves significantly the accuracy of the CNN models when compared with the direct use of Bunge–Euler angles to represent the crystal orientations in the input channels. Since the proposed approach implicitly satisfies the expected crystal symmetries in the specification of the input microstructure to the CNN, it opens new research directions for the adoption of CNNs in addressing a broad range of polycrystalline microstructure design and optimization problems.
APA, Harvard, Vancouver, ISO, etc. styles
45

Wilson, Jason A. "Performance, anxiety". M/C Journal 5, no. 2 (May 1, 2002). http://dx.doi.org/10.5204/mcj.1952.

Full text
Abstract:
In a recent gaming anthology, Henry Jenkins cannot help contrasting his son's cramped, urban, media-saturated existence with his own idyllic, semi-rural childhood. After describing his own Huck Finn meanderings over "the spaces of my boyhood" including the imaginary kingdoms of Jungleoca and Freedonia, Jenkins relates his version of his son's experiences: My son, Henry, now 16, has never had a backyard. He has grown up in various apartment complexes, surrounded by asphalt parking lots with, perhaps, a small grass buffer from the street… Once or twice, when I became exasperated by my son's constant presence around the house I would … tell him he should go out and play. He would look at me with confusion and ask, where? … Who wouldn't want to trade in the confinement of your room for the immersion promised by today's video games? … Perhaps my son finds in his video games what I found in the woods behind the school, on my bike whizzing down the hills of suburban backstreets, or settled into my treehouse with a good adventure novel: intensity of experience, escape from adult regulation; in short, "complete freedom of movement". (Jenkins 1998, 263-265) Games here are connected with a shrinking availability of domestic and public space, and a highly mediated experience of the world. Despite his best intentions, creeping into Jenkins's piece is a sense that games act as a poor substitute for the natural spaces of a "healthy" childhood. Although "Video games did not make backyard play spaces disappear", they "offer children some way to respond to domestic confinement" (Jenkins 1998, 266). They emerge, then, as a palliation for the claustrophobic circumstances of contemporary urban life, though they offer only unreal spaces, replete with "lakes of fire … cities in the clouds … [and] dazzling neon-lit Asian marketplaces" (Jenkins 1998, 263), where the work of the childish imagination is already done.
Despite Jenkins's assertion that games do offer "complete freedom of movement", it is hard to shake the feeling that he considers his own childhood far richer in exploratory and imaginative opportunities: Let me be clear: I am not arguing that video games are as good for kids as the physical spaces of backyard play culture. As a father, I wish that my son would come home covered in mud or with scraped knees rather than carpet burns ... The psychological and social functions of playing outside are as significant as the impact of "sunshine and good exercise" upon our physical well-being. (Jenkins 1998, 266) Throughout the piece, games are framed by a romantic, anti-urban discourse: the expanding city is imagined as engulfing space and perhaps destroying childhood itself, such that "'sacred' places are now occupied by concrete, bricks or asphalt" (Jenkins 1998, 263). Games are complicit in this alienation of space and experience. If this is not quite Paul Virilio's recent dour contention that modern mass media forms work mainly to immobilise the body of the consumer -- Virilio, luckily, has managed to escape the body-snatchers -- games here are produced as a feeble response to an already-effected urban imprisonment of the young. Strikingly, Jenkins seems concerned about his son's "unhealthy" confinement to private, domestic space, and his inability to imaginatively possess a slice of the world outside. Jenkins's description of his son's confinement to the world of "carpet burns" rather than the great outdoors of "scraped knees" and "mud" implicitly leaves intact the distinction between domestic and public, internal and external, and even the imagined passivity of the domestic sphere as against the activity of the public. For those of us who see games as productive activities, which generate particular, unique kinds of pleasure in their own right, rather than as anaemic replacements for lost spaces, this seems to reduce a central cultural form.
For those of us who have at least some sympathy with writers on the urban environment like Raban (1974) and Young (1990), who see the city's theatrical and erotic possibilities, Jenkins's fears might seem to erase the pleasures and opportunities that city life provides. Rather than seeing gamers and children (the two groups only partially overlap) as unwitting agents in their own confinement, we can arrive at a slightly more complex view of the relationship between games and urban space. By looking at the video games arcade as it is situated in urban retail space, we can see how gameplay simultaneously acts to regulate urban space, mediates a unique kind of urban performance, and allows sophisticated representations, manipulations and appropriations of differently conceived urban spaces. Despite being a long-standing feature of the urban and retail environment, and despite also being a key site for the "exhibition" of a by-now central media form, the video game arcade has a surprisingly small literature devoted to it. Its prehistory in pinball arcades and pachinko parlours has been noted (by, for example, Steven Poole 2000) but seldom deeply explored, and its relations with a wider urban space have been given no real attention at all. The arcade's complexity, both in terms of its positioning and functions, may contribute to this. The arcade is a space of conflicting, contradictory uses and tendencies, though this is precisely what makes it as important a space as the cinema or penny theatre before it. Let me explain why I think so. The arcade is always simultaneously a part of and apart from the retail centres to which it tends to attach itself.1 If it is part of a suburban shopping mall, it is often located on the ground floor near the entrance, or is semi-detached as cinema complexes often are, so that the player has to leave the mall's main building to get there, or never enter. 
If it is part of a city or high street shopping area, it is often in a side street or a street parallel to the main retail thoroughfare, or requires the player to mount a set of stairs into an off-street arcade. At other times the arcade is located in a space more strongly marked as liminal in relation to the city -- the seaside resort, sideshow alley or within the fences of a theme park. Despite this, the videogame arcade's interior is usually wholly or mostly visible from the street, arcade or thoroughfare that it faces, whether this visibility is effected by means of glass walls, a front window or a fully retractable sliding door. This slight distance from the mainstream of retail activity and the visibility of the arcade's interior are in part related to the economics of the arcade industry. Arcade machines involve relatively low margins -- witness the industry's recent feting and embrace of redemption (i.e. low-level gambling) games that offer slightly higher turnovers -- and are hungry for space. At the same time, arcades are dependent on street traffic, relentless technological novelty and their de facto use as gathering space to keep the coins rolling in. A balance must be found between affordability, access and visibility, hence their positioning at a slight remove from areas of high retail traffic. The story becomes more complicated, though, when we remember that arcades are heavily marked as deviant, disreputable spaces, whether in the media, government reports or in sociological and psychological literature. As a visible, public, urban space where young people are seen to mix with one another and unfamiliar and novel technologies, the arcade is bound to give rise to adult anxieties. 
As John Springhall (1998) puts it: More recent youth leisure… occupies visible public space, is seen as hedonistic and presents problems within the dominant discourse of 'enlightenment' … [T]he most popular forms of entertainment among the young at any given historical moment tend also to provide the focus of the most intense social concern. A new medium with mass appeal, and with a technology best understood by the young… almost invariably attracts a desire for adult or government control (160-161, emphasis mine) Where discourses of deviant youth have also been employed in extending the surveillance and policing of retail space, it is unsurprising that spaces seen as points for the concentration of such deviance will be forced away from the main retail thoroughfares, in the process effecting a particular kind of confinement, and opportunity for surveillance. Michel Foucault writes, in Discipline and Punish, about the classical age's refinements of methods for distributing and articulating bodies, and the replacement of spectacular punishment with the crafting of "docile bodies". Though historical circumstances have changed, we can see arcades as disciplinary spaces that reflect aspects of those that Foucault describes. The efficiency of arcade games in distributing bodies in rows, and side by side, demonstrates that "even if the compartments it assigns become purely ideal, the disciplinary space is always, basically, cellular" (Foucault 1977, 143). The efficiency of games from Pong (Atari: 1972) to Percussion Freaks (Konami: 1999) in articulating bodies in play, in demanding specific and often spectacular bodily movements and competencies, means that "over the whole surface of contact between the body and the object it handles, power is introduced, fastening them to one another. It constitutes a body weapon, body-tool, body-machine complex" (Foucault 1977, 153).
What is extraordinary is the extent to which the articulation of bodies proceeds only through a direct engagement with the game. Pong's instructions famously read only "avoid missing ball for high score"--a whole economy of movement, arising from this effort, is condensed into six words. The distribution and articulation of bodies also entails a confinement in the space of the arcade, away from the main areas of retail trade, and renders occupants easily observable from the exterior. We can see that games keep kids off the streets. On the other hand, the same games mediate spectacular forms of urban performance and allow particular kinds of reoccupation of urban space. Games descended or spun off from Dance Dance Revolution (Konami: 1998) require players to dance, in time with thumping (if occasionally cheesy) techno, and in accordance with on-screen instructions, in more and more complex sequences on lit footpads. These games occupy a lot of space, and the newest instalment (DDR has just issued its "7th Mix") is often installed at the front of street level arcades. When played with flair, games such as these are apt to attract a crowd of onlookers to gather, not only inside, but also on the footpath outside. Indeed games such as these have given rise to websites like http://www.dancegames.com/au which tell fans not only when and where new games are arriving, but whether or not the positioning of arcades and games within them will enable a player to attract attention to their performance. This mediation of cyborg performance and display -- where success both achieves and exceeds perfect integration with a machine in urban space -- is particularly important to Asian-Australian youth subcultures, which are often marginalised in other forums for youthful display, like competitive sport.
International dance gamer websites like Jason Ho's http://www.ddrstyle.com , which is emblazoned with the slogan "Asian Pride", explicitly make the connection between Asian youth subcultures and these new kinds of public performance. Games like those in the Time Crisis series, which may seem less innocuous, might be seen as effecting important inversions in the representation of urban space. Initially Time Crisis, which puts a gun in the player's hand and requires them to shoot at human figures on screen, might even be seen to live up to the dire claims made by figures like Dave Grossman that such games effectively train perpetrators of public violence (Grossman 1995). What we need to keep in mind, though, is that first, as "cops", players are asked to restore order to a representation of urban space, and second, that they are reacting to images of criminality. When criminality and youth are so often closely linked in public discourse (not to mention criminality and Asian ethnicity) these games stage a reversal whereby the young player is responsible for performing a reordering of the unruly city. In a context where the ideology of privacy has progressively marked public space as risky and threatening,2 games like Time Crisis allow, within urban space, a performance aimed at the resolution of risk and danger in a representation of the urban which nevertheless involves and incorporates the material spaces that it is embedded in. This is a different kind of performance to DDR, involving different kinds of image and bodily attitude, that nevertheless articulates itself on the space of the arcade, a space which suddenly looks more complex and productive. The manifest complexity of the arcade as a site in relation to the urban environment -- both regulating space and allowing spectacular and sophisticated types of public performance -- means that we need to discard simplistic stories about games providing surrogate spaces.
We reify game imagery wherever we see it as a space apart from the material spaces and bodies with which gaming is always involved. We also need to adopt a more complex attitude to urban space and its possibilities than any narrative of loss can encompass. The abandonment of such narratives will contribute to a position where we can recognise the difference between the older and younger Henrys' activities, and still see them as having a similar complexity and richness. With work and luck, we might also arrive at a material organisation of society where such differing spaces of play -- seen now by some as mutually exclusive -- are more easily available as choices for everyone. NOTES 1 Given the almost total absence of any spatial study of arcades, my observations here are based on my own experience of arcades in the urban environment. Many of my comments are derived from Brisbane, regional Queensland and urban-Australian arcades (this is where I live), but I have observed the same tendencies in many other urban environments. Even where the range of services and technologies in the arcades are different (in Madrid and Lisbon they serve espresso and alcohol (!); in Saigon they often consist of a bank of TVs equipped with pirated PlayStation games which are hired by the hour), their location (slightly to one side of major retail areas) and their openness to the street are maintained. 2 See Spigel, Lynn (2001) for an account of the effects and transformations of the ideology of privacy in relation to media forms. See Furedi, Frank (1997) and Douglas, Mary (1992) for accounts of the contemporary discourse of risk and its effects. References Douglas, M. (1992) Risk and Blame: Essays in Cultural Theory. London; New York: Routledge. Foucault, M. (1979) Discipline and Punish: The Birth of the Prison. Trans. Alan Sheridan. Harmondsworth, England: Penguin. Furedi, F. (1997) Culture of Fear: Risk-taking and the Morality of Low Expectation. London; Washington: Cassell.
Grossman, D. (1995) On Killing: The Psychological Cost of Learning to Kill in War and Society. Boston: Little, Brown. Jenkins, H. (1998) Complete freedom of movement: video games as gendered play spaces. In Jenkins, Henry and Justine Cassell (eds) From Barbie to Mortal Kombat : Gender and Computer Games. Cambridge, Mass.: MIT Press. Poole, S. (2000) Trigger Happy: The Inner Life of Videogames. London: Fourth Estate. Raban, J. (1974) Soft City. London: Hamilton. Spigel, L. (2001) Welcome to the Dreamhouse: Popular Media and the Postwar Suburbs. Durham and London: Duke University Press. Springhall, J. (1998) Youth, Popular Culture and Moral Panics : Penny Gaffs to Gangsta-rap, 1830-1996. New York: St. Martin's Press. Young, I.M. (1990) Justice and the Politics of Difference. Princeton, NJ: Princeton University Press. Websites http://www.yesterdayland.com/popopedia/s... (Time Crisis synopsis and shots) http://www.dancegames.com/au (Site for a network of fans revealing something about the culture around dancing games) http://www.ddrstyle.com (website of Jason Ho, who connects his dance game performances with pride in his Asian identity). http://www.pong-story.com (The story of Pong, the very first arcade game) Games Dance Dance Revolution, Konami: 1998. Percussion Freaks, Konami: 1999. Pong, Atari: 1972. Time Crisis, Namco: 1996. Links http://www.dancegames.com/au http://www.yesterdayland.com/popopedia/shows/arcade/ag1154.php http://www.pong-story.com http://www.ddrstyle.com Citation reference for this article MLA Style Wilson, Jason A.. "Performance, anxiety" M/C: A Journal of Media and Culture 5.2 (2002). [your date of access] < http://www.media-culture.org.au/0205/performance.php>. Chicago Style Wilson, Jason A., "Performance, anxiety" M/C: A Journal of Media and Culture 5, no. 2 (2002), < http://www.media-culture.org.au/0205/performance.php> ([your date of access]). APA Style Wilson, Jason A.. (2002) Performance, anxiety. M/C: A Journal of Media and Culture 5(2). 
< http://www.media-culture.org.au/0205/performance.php> ([your date of access]).
APA, Harvard, Vancouver, ISO, etc. styles
