Academic literature on the topic 'DECISION METAMODEL'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'DECISION METAMODEL.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "DECISION METAMODEL"

1

Soeteman, Djøra I., Stephen C. Resch, Hawre Jalal, Caitlin M. Dugdale, Martina Penazzato, Milton C. Weinstein, Andrew Phillips, et al. "Developing and Validating Metamodels of a Microsimulation Model of Infant HIV Testing and Screening Strategies Used in a Decision Support Tool for Health Policy Makers." MDM Policy & Practice 5, no. 1 (January 2020): 238146832093289. http://dx.doi.org/10.1177/2381468320932894.

Abstract:
Background. Metamodels can simplify complex health policy models and yield instantaneous results to inform policy decisions. We investigated the predictive validity of linear regression metamodels used to support a real-time decision-making tool that compares infant HIV testing/screening strategies. Methods. We developed linear regression metamodels of the Cost-Effectiveness of Preventing AIDS Complications Pediatric (CEPAC-P) microsimulation model used to predict life expectancy and lifetime HIV-related costs/person of two infant HIV testing/screening programs in South Africa. Metamodel performance was assessed with cross-validation and Bland-Altman plots, showing between-method differences in predicted outcomes against their means. Predictive validity was determined by the percentage of simulations in which the metamodels accurately predicted the strategy with the greatest net health benefit (NHB) as projected by the CEPAC-P model. We introduced a zone of indifference and investigated the width needed to produce between-method agreement in 95% of the simulations. We also calculated NHB losses from “wrong” decisions by the metamodel. Results. In cross-validation, linear regression metamodels accurately approximated CEPAC-P-projected outcomes. For life expectancy, Bland-Altman plots showed good agreement between CEPAC-P and the metamodel (within 1.1 life-months difference). For costs, 95% of between-method differences were within $65/person. The metamodels predicted the same optimal strategy as the CEPAC-P model in 87.7% of simulations, increasing to 95% with a zone of indifference of 0.24 life-months (∼7 days). The losses in health benefits due to “wrong” choices by the metamodel were modest (range: 0.0002–1.1 life-months). Conclusions. For this policy question, linear regression metamodels offered sufficient predictive validity for the optimal testing strategy as compared with the CEPAC-P model. Metamodels can simulate different scenarios in real time, based on sets of input parameters that can be depicted in a widely accessible decision-support tool.
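
To make the mechanics concrete, here is a minimal, hypothetical sketch of this kind of workflow: one linear regression metamodel per strategy, validated out of sample, with a zone of indifference applied when comparing predicted strategies. The data, coefficients, and the 0.24-month zone width are illustrative stand-ins, not the CEPAC-P model.

```python
# Illustrative sketch (not the CEPAC-P model): fit linear regression
# metamodels to simulation outputs and check how often they pick the
# same optimal strategy, with a "zone of indifference" for near-ties.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)

# Hypothetical simulation data: input parameters -> life expectancy
# (months) for two testing strategies A and B.
X = rng.uniform(size=(500, 6))                      # 6 input parameters
le_a = 600 + 30 * X[:, 0] - 20 * X[:, 1] + rng.normal(0, 2, 500)
le_b = 595 + 45 * X[:, 2] - 10 * X[:, 3] + rng.normal(0, 2, 500)

# One metamodel per strategy, validated out-of-sample.
pred_a = cross_val_predict(LinearRegression(), X, le_a, cv=5)
pred_b = cross_val_predict(LinearRegression(), X, le_b, cv=5)

# Agreement on the optimal strategy, with a zone of indifference:
# differences smaller than `zone` (in months) count as agreement.
zone = 0.24
sim_best = np.sign(le_a - le_b)
meta_diff = pred_a - pred_b
agree = (np.sign(meta_diff) == sim_best) | (np.abs(meta_diff) < zone)
print(f"agreement rate: {agree.mean():.1%}")
```
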
2

Bracco, A., J. D. Neelin, H. Luo, J. C. McWilliams, and J. E. Meyerson. "High dimensional decision dilemmas in climate models." Geoscientific Model Development Discussions 6, no. 2 (May 8, 2013): 2731–67. http://dx.doi.org/10.5194/gmdd-6-2731-2013.

Abstract:
Abstract. An important source of uncertainty in climate models is linked to the calibration of model parameters. Interest in systematic and automated parameter optimization procedures stems from the desire to improve the model climatology and to quantify the average sensitivity associated with potential changes in the climate system. Neelin et al. (2010) used a quadratic metamodel to objectively calibrate an atmospheric circulation model (AGCM) around four adjustable parameters. The metamodel accurately estimates global spatial averages of common fields of climatic interest, from precipitation, to low and high level winds, from temperature at various levels to sea level pressure and geopotential height, while providing a computationally cheap strategy to explore the influence of parameter settings. Here, guided by the metamodel, the ambiguities or dilemmas related to the decision making process in relation to model sensitivity and optimization are examined. Simulations of current climate are subject to considerable regional-scale biases. Those biases may vary substantially depending on the climate variable considered, and/or on the performance metric adopted. Common dilemmas are associated with model revisions yielding improvement in one field or regional pattern or season, but degradation in another, or improvement in the model climatology but degradation in the interannual variability representation. Challenges are posed to the modeler by the high dimensionality of the model output fields and by the large number of adjustable parameters. The use of the metamodel in the optimization strategy helps visualize trade-offs at a regional level, e.g. how mismatches between sensitivity and error spatial fields yield regional errors under minimization of global objective functions.
3

Bracco, A., J. D. Neelin, H. Luo, J. C. McWilliams, and J. E. Meyerson. "High dimensional decision dilemmas in climate models." Geoscientific Model Development 6, no. 5 (October 15, 2013): 1673–87. http://dx.doi.org/10.5194/gmd-6-1673-2013.

Abstract:
Abstract. An important source of uncertainty in climate models is linked to the calibration of model parameters. Interest in systematic and automated parameter optimization procedures stems from the desire to improve the model climatology and to quantify the average sensitivity associated with potential changes in the climate system. Building upon the smoothness of the response of an atmospheric circulation model (AGCM) to changes of four adjustable parameters, Neelin et al. (2010) used a quadratic metamodel to objectively calibrate the AGCM. The metamodel accurately estimates global spatial averages of common fields of climatic interest, from precipitation, to low and high level winds, from temperature at various levels to sea level pressure and geopotential height, while providing a computationally cheap strategy to explore the influence of parameter settings. Here, guided by the metamodel, the ambiguities or dilemmas related to the decision making process in relation to model sensitivity and optimization are examined. Simulations of current climate are subject to considerable regional-scale biases. Those biases may vary substantially depending on the climate variable considered, and/or on the performance metric adopted. Common dilemmas are associated with model revisions yielding improvement in one field or regional pattern or season, but degradation in another, or improvement in the model climatology but degradation in the interannual variability representation. Challenges are posed to the modeler by the high dimensionality of the model output fields and by the large number of adjustable parameters. The use of the metamodel in the optimization strategy helps visualize trade-offs at a regional level, e.g., how mismatches between sensitivity and error spatial fields yield regional errors under minimization of global objective functions.
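
As an editorial illustration of the quadratic-metamodel idea described here, the sketch below fits a degree-2 response surface over four parameters of a toy objective and then optimizes over the cheap surface. The function, sample sizes, and bounds are invented for the example and have nothing to do with the actual AGCM.

```python
# Illustrative sketch: a quadratic response-surface metamodel over four
# adjustable parameters, with a toy function standing in for the model.
import numpy as np
from scipy.optimize import minimize
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(1)

def model_error(p):
    """Stand-in for a global objective function of 4 model parameters."""
    return np.sum((p - 0.3) ** 2, axis=-1) + 0.1 * np.sin(5 * p).sum(axis=-1)

# Sample the 4-parameter space and fit a quadratic metamodel.
P = rng.uniform(size=(200, 4))
y = model_error(P)
meta = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
meta.fit(P, y)

# Cheap optimization over the metamodel instead of the full model.
res = minimize(lambda p: meta.predict(p.reshape(1, -1))[0],
               x0=np.full(4, 0.5), bounds=[(0, 1)] * 4)
print("metamodel-optimal parameters:", res.x.round(3))
```
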
4

Gayathri, Rajakumaran, Shola Usha Rani, Lenka Čepová, Murugesan Rajesh, and Kanak Kalita. "A Comparative Analysis of Machine Learning Models in Prediction of Mortar Compressive Strength." Processes 10, no. 7 (July 15, 2022): 1387. http://dx.doi.org/10.3390/pr10071387.

Abstract:
Predicting the mechanical properties of cement-based mortars is essential in understanding the life and functioning of structures. Machine learning (ML) algorithms in this regard can be especially useful in prediction scenarios. In this paper, a comprehensive comparison of nine ML algorithms, i.e., linear regression (LR), random forest regression (RFR), support vector regression (SVR), AdaBoost regression (ABR), multi-layer perceptron (MLP), gradient boosting regression (GBR), decision tree regression (DT), hist gradient boosting regression (hGBR) and XGBoost regression (XGB), is carried out. A multi-attribute decision making method called TOPSIS (technique for order of preference by similarity to ideal solution) is used to select the best ML metamodel. A large dataset on cement-based mortars consisting of 424 sample points is used. The compressive strength of cement-based mortars is predicted based on six input parameters, i.e., the age of specimen (AS), the cement grade (CG), the metakaolin-to-total-binder ratio (MK/B), the water-to-binder ratio (W/B), the superplasticizer-to-binder ratio (SP) and the binder-to-sand ratio (B/S). XGBoost regression is found to be the best ML metamodel while simple metamodels like linear regression (LR) are found to be insufficient in handling the non-linearity in the process. This mapping of the compressive strength of mortars using ML techniques will be helpful for practitioners and researchers in identifying suitable mortar mixes.
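
The TOPSIS step the abstract mentions is easy to reproduce in outline. The sketch below ranks three hypothetical metamodels on three criteria; the scores, weights, and benefit/cost labels are made up for illustration and are not the paper's data.

```python
# Illustrative TOPSIS sketch for ranking candidate metamodels on
# several performance criteria (all numbers invented).
import numpy as np

# Rows: models (LR, RFR, XGB); columns: criteria (R^2, RMSE, speed).
scores = np.array([[0.70, 8.0, 0.9],
                   [0.92, 4.1, 0.5],
                   [0.95, 3.6, 0.6]])
benefit = np.array([True, False, True])   # RMSE is a cost criterion
weights = np.array([0.5, 0.3, 0.2])

# 1) vector-normalize, 2) weight, 3) ideal/anti-ideal, 4) closeness.
norm = scores / np.linalg.norm(scores, axis=0)
v = norm * weights
ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
d_plus = np.linalg.norm(v - ideal, axis=1)
d_minus = np.linalg.norm(v - anti, axis=1)
closeness = d_minus / (d_plus + d_minus)
print("TOPSIS ranking (best first):", np.argsort(-closeness))
```
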
5

Suljic, Mirza, Edin Osmanbegovic, and Željko Dobrović. "Common Metamodel of Questionnaire Model and Decision Tree Model." Research in Applied Economics 10, no. 3 (September 25, 2018): 106. http://dx.doi.org/10.5296/rae.v10i3.13540.

Abstract:
The subject of this paper is metamodeling and its application in the field of scientific research. The main goal is to explore the possibilities of integrating two methods: questionnaires and decision trees. The questionnaire method is established as one of the methods for data collection, while the decision tree method represents an alternative way of presenting and analyzing decision-making situations. These two methods are not completely independent; on the contrary, there is a strong natural bond between them. The result is a common metamodel that, through shared concepts and the use of metamodeling, connects the questionnaire and decision tree methods. The obtained results can be used to create a CASE tool or a repository suitable for exchange between different systems. The proposed metamodel is not necessarily the final product; it could be further developed by adding entities that capture additional data.
6

Meidt, Gregory J., and Kenneth W. Bauer. "PCRSM: A decision support system for simulation metamodel construction." SIMULATION 59, no. 3 (September 1992): 183–91. http://dx.doi.org/10.1177/003754979205900307.

7

Rojas, Luz Andrea Rodríguez, Juan Manuel Cueva Lovelle, Giovanny Mauricio Tarazona Bermúdez, Carlos Enrique Montenegro, Elena Giménez de Ory, and Rubén Arístides González Crespo. "Metamodel to support decision-making from open government data." Journal of Ambient Intelligence and Humanized Computing 9, no. 3 (February 6, 2017): 553–63. http://dx.doi.org/10.1007/s12652-016-0443-7.

8

Mahmud, Istiak, Md Mohsin Kabir, M. F. Mridha, Sultan Alfarhood, Mejdl Safran, and Dunren Che. "Cardiac Failure Forecasting Based on Clinical Data Using a Lightweight Machine Learning Metamodel." Diagnostics 13, no. 15 (July 31, 2023): 2540. http://dx.doi.org/10.3390/diagnostics13152540.

Abstract:
Accurate prediction of heart failure can help prevent life-threatening situations. Several factors contribute to the risk of heart failure, including underlying heart diseases such as coronary artery disease or heart attack, diabetes, hypertension, obesity, certain medications, and lifestyle habits such as smoking and excessive alcohol intake. Machine learning approaches to predict and detect heart disease hold significant potential for clinical utility but face several challenges in their development and implementation. This research proposes a machine learning metamodel for predicting a patient’s heart failure based on clinical test data. The proposed metamodel was developed based on Random Forest Classifier, Gaussian Naive Bayes, Decision Tree models, and k-Nearest Neighbor as the final estimator. The metamodel is trained and tested utilizing a combined dataset comprising five well-known heart datasets (Statlog Heart, Cleveland, Hungarian, Switzerland, and Long Beach), all sharing 11 standard features. The study shows that the proposed metamodel can predict heart failure more accurately than other machine learning models, with an accuracy of 87%.
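
The stacking arrangement is specified precisely enough in the abstract to sketch: Random Forest, Gaussian Naive Bayes, and Decision Tree base learners with a k-Nearest Neighbors final estimator. The synthetic 11-feature data below stands in for the combined heart dataset, so accuracy will not match the reported 87%.

```python
# Minimal sketch of the stacking metamodel the abstract describes,
# trained on synthetic data rather than the combined heart datasets.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=11, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

meta = StackingClassifier(
    estimators=[("rf", RandomForestClassifier(random_state=0)),
                ("gnb", GaussianNB()),
                ("dt", DecisionTreeClassifier(random_state=0))],
    final_estimator=KNeighborsClassifier())
meta.fit(X_tr, y_tr)
print(f"held-out accuracy: {meta.score(X_te, y_te):.2f}")
```
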
9

Shevchenko, Igor, Denis Vasiliev, Nataly Rylova, and Natalia Sharonova. "CONCEPTUAL MODELS OF THE DECISION SUPPORT SYSTEM FOR MANAGING SET OF THE MUNICIPAL SPHERE PROJECTS." Transactions of Kremenchuk Mykhailo Ostrohradskyi National University, no. 3(128) (June 11, 2021): 57–62. http://dx.doi.org/10.30929/1995-0519.2021.3.57-62.

Abstract:
Purpose. As a rule, modern organizational systems in any industry contain many elements, connected by a complex scheme of relationships. In the municipal sphere, various programs and projects are implemented simultaneously. To increase the efficiency of work with projects, it is necessary to solve operations management problems: coordinate the allocation of resources, assess the quality of implementation of individual stages of each project, take into account the joint use of contractors, assess risks, and eliminate problem situations. The purpose of the research is to improve the conceptual models of decision support systems in order to create information technology for managing the implementation of many programs and projects in the municipal sphere. Methodology. Operational management of many projects calls for appropriate information technology and tools, and the development of such technologies requires a certain methodological approach. The difficulties of structural synthesis of executive systems are associated with the uncertainty of the design task conditions: it is impossible to fully specify the algorithms of functioning and interaction of executive structures. Results. An improved ontological metamodel of the problem area and an improved model of the decision support system for managing multiple projects have been developed. Originality. The ontological metamodel of the problem area differs in that it has several strata, each of which identifies the relevant problems and aspects of the organization. The model of the decision support system has been improved by adding a formal static description of the process of implementing many projects and their relationship with many attributes and aspects. Practical value. The ontological metamodel allows hierarchical causal relationships to be established between different problems, which is especially relevant when implementing many projects involving many departments. The model of the decision support system allows operational decisions to be made taking into account many factors and their interrelations across various aspects of activity. All this contributes to the main task: the development of information technology to support decision-making in the organizational processes of municipal bodies and enterprises.
10

Idier, Déborah, Axel Aurouet, François Bachoc, Audrey Baills, José Betancourt, Fabrice Gamboa, Thierry Klein, et al. "A User-Oriented Local Coastal Flooding Early Warning System Using Metamodelling Techniques." Journal of Marine Science and Engineering 9, no. 11 (October 27, 2021): 1191. http://dx.doi.org/10.3390/jmse9111191.

Abstract:
Given recent scientific advances, coastal flooding events can be properly modelled. Nevertheless, such models are computationally expensive (requiring many hours), which prevents their use for forecasting and warning. In addition, there is a gap between the model outputs and the information actually needed by decision makers. The present work aims to develop and test a method capable of forecasting coastal flood information adapted to users’ needs. The method must be robust and fast and must integrate the complexity of coastal flood processes. The explored solution relies on metamodels, i.e., mathematical functions that precisely and efficiently (within minutes) estimate the results that the numerical model would provide. While the principle of relying on metamodel solutions is not new, the originality of the present work is to tackle and validate the entire process, from the identification of user needs to the establishment and validation of the rapid forecast and early warning system (FEWS), while relying on numerical modelling, metamodelling, the development of indicators, and information technologies. The development and validation are performed at the study site of Gâvres (France). This site is subject to wave overtopping, so the numerical phase-resolving SWASH model is used to build the learning dataset required for the metamodel setup. Gaussian process- and random forest classifier-based metamodels are used and post-processed to estimate 14 indicators of interest for FEWS users. These metamodelling and post-processing schemes are implemented in an FEWS prototype, which is employed by local users and exhibited good warning skill during the validation period. Based on this experience, we provide recommendations for the improvement and/or application of this methodology and its individual steps to other sites.
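
To illustrate the two metamodel families named above, here is a minimal, hypothetical sketch pairing a Gaussian process regressor (for a continuous flood indicator) with a random forest classifier (for a binary warning flag). The inputs, the toy response, and the warning threshold are invented; the real system is trained on SWASH simulations.

```python
# Illustrative sketch of the two metamodel families mentioned above,
# on toy data rather than the SWASH-based learning set.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(2)
X = rng.uniform(size=(300, 3))            # e.g. surge, wave height, period
volume = 50 * X[:, 0] * X[:, 1] ** 2 + rng.normal(0, 1, 300)
warning = (volume > 15).astype(int)       # hypothetical warning threshold

gp = GaussianProcessRegressor(kernel=RBF(), normalize_y=True).fit(X, volume)
clf = RandomForestClassifier(random_state=0).fit(X, warning)

x_new = [[0.8, 0.9, 0.5]]
vol_hat, vol_std = gp.predict(x_new, return_std=True)
print(f"predicted volume: {vol_hat[0]:.1f} ± {vol_std[0]:.1f}",
      "| warning:", bool(clf.predict(x_new)[0]))
```
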

Dissertations / Theses on the topic "DECISION METAMODEL"

1

Penadés Plà, Vicent. "Life-cycle sustainability design of post-tensioned box-girder bridge obtained by metamodel-assisted optimization and decision-making under uncertainty." Doctoral thesis, Universitat Politècnica de València, 2021. http://hdl.handle.net/10251/147480.

Abstract:
Currently, there is a trend towards sustainability, especially in developed countries, where the concerns of society about environmental degradation and social problems have increased. Following this trend, the construction sector is one of the most influential sectors due to its high economic, environmental, and social impacts. At the same time, there is an increase in the demand for transport, which drives a need to develop and maintain the necessary infrastructure for this purpose. Taking all these factors into account, bridges become a key structure and therefore assessment of sustainability throughout their whole life-cycle is essential. The main objective of this thesis is to propose a methodology that allows assessment of the sustainability of a bridge under uncertain initial conditions (subjectivity of the decision-maker or variability of initial parameters) and optimization of the design to obtain a robust optimal bridge. To this end, an extensive bibliographic review of all the works that perform assessments of the sustainability of bridges through the valuation of criteria related to their main pillars (economic, environmental, or social) has been carried out. In this review, it has been observed that the most comprehensive way to evaluate the environmental and social pillars is through the use of life-cycle impact assessment methods. These methods allow sustainability assessment to be performed for the whole life-cycle of the bridge. This process provides valuable information to decision-makers for the assessment and selection of the most sustainable bridge. However, the decision-makers' subjective assessments of the relative importance of the criteria influence the final assessment of sustainability. For this reason, it is necessary to create a methodology that reduces the associated uncertainty and seeks robust solutions according to the opinion of decision-makers. In addition, for bridges, the design and decision-making are conditioned by the initially defined parameters. This leads to solutions that may be sensitive to small changes in these initial conditions. A robust optimal design makes it possible to obtain optimal solutions and structurally stable designs under variations of the initial conditions as well as sustainable designs that are not influenced by the preferences of the stakeholders who are part of the decision-making process. Thus, obtaining a robust optimal design becomes a probabilistic optimization process that has a high computational cost. For this reason, the use of metamodels has been integrated into the proposed methodology. Specifically, Latin hypercube sampling is used for the definition of the initial sample and a kriging model is used for the definition of the mathematical approximation. In this way, kriging-based heuristic optimization reduces the computational cost by more than 90% with respect to conventional heuristic optimization while obtaining very similar results. This thesis provides, first of all, an extensive bibliographic review of both the criteria used for the assessment of sustainability of bridges and the different methods of life-cycle impact assessment to obtain a complete profile of the environmental and social pillars. Subsequently, a methodology is defined for the full assessment of sustainability, using life-cycle impact assessment methods.
Likewise, an approach is proposed that makes it possible to obtain structures with little influence from the structural parameters, as well as from the preferences of the different decision-makers regarding the sustainability criteria. The methodology provided in this thesis is applicable to any other type of structure.
I would like to acknowledge the economic support of the Spanish Ministry of Economy and Competitiveness, formerly called the Spanish Ministry of Science and Innovation. This thesis has been possible thanks to the FPI fellowship and the financial support of BRIDLIFE (Research Project BIA2014-56574-R) and DIMALIFE (Project BIA2017-85098-R).
Penadés Plà, V. (2020). Life-cycle sustainability design of post-tensioned box-girder bridge obtained by metamodel-assisted optimization and decision-making under uncertainty [Tesis doctoral no publicada]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/147480
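
The surrogate pipeline described in this thesis (a Latin hypercube sample, a kriging fit, heuristic optimization over the cheap model) can be sketched in a few lines. Everything below is a stand-in under those stated assumptions: the cost function, dimension, sample size, and kernel are invented, not the thesis's structural model.

```python
# Minimal sketch of a Latin-hypercube + kriging surrogate loop, with a
# toy cost function standing in for the expensive structural analysis.
import numpy as np
from scipy.optimize import differential_evolution
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def expensive_cost(x):
    """Stand-in for a full structural/life-cycle evaluation."""
    return np.sum((x - 0.6) ** 2, axis=-1) + 0.05 * np.cos(10 * x).sum(axis=-1)

# Initial design: Latin hypercube sample of the design space.
sampler = qmc.LatinHypercube(d=5, seed=0)
X = sampler.random(n=80)
y = expensive_cost(X)

# Kriging (Gaussian process) surrogate fitted to the design.
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
gp.fit(X, y)

# Heuristic optimization runs on the surrogate (milliseconds per call).
res = differential_evolution(lambda x: gp.predict(x.reshape(1, -1))[0],
                             bounds=[(0, 1)] * 5, seed=0)
print("surrogate optimum:", res.x.round(3), f"| predicted cost: {res.fun:.4f}")
```
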
2

Audoux, Yohann. "Développement d’une nouvelle méthode de réduction de modèle basée sur les hypersurfaces NURBS (Non-Uniform Rational B-Splines)." Thesis, Paris, ENSAM, 2019. http://www.theses.fr/2019ENAM0016/document.

Abstract:
Despite undeniable progress achieved in computer sciences over the last decades, some problems remain intractable, either because of their numerical complexity (optimisation problems, …) or because they are subject to specific constraints such as real-time processing (virtual and augmented reality, …). In this context, metamodeling techniques can minimise the computational effort needed to realize complex multi-field and/or multi-scale simulations. The metamodeling process consists of setting up a metamodel that needs fewer resources to be evaluated than the complex model from which it is extracted, while guaranteeing a minimal accuracy. Current methods generally require either the user’s expertise or arbitrary choices. Moreover, they are often tailored for a specific application and can hardly be transposed to other fields. Thus, even if it is not the best, our approach aims at obtaining a metamodel that remains a good one whatever the problem at hand. The developed strategy relies on NURBS hypersurfaces and stands out from existing ones by avoiding the use of empiric criteria to set its parameters. To do so, a metaheuristic (a genetic algorithm) able to deal with optimisation problems defined over a variable number of optimisation variables sets all the hypersurface parameters automatically, so that the complexity is not transferred to the user.
3

Choi, Hae-Jin. "A Robust Design Method for Model and Propagated Uncertainty." Diss., Georgia Institute of Technology, 2005. http://hdl.handle.net/1853/7508.

Abstract:
One of the important factors to be considered in designing an engineering system is uncertainty, which emanates from natural randomness, limited data, or limited knowledge of systems. In this study, a robust design methodology is established in order to design multifunctional materials, employing multi-time and length scale analyses. The Robust Concept Exploration Method with Error Margin Index (RCEM-EMI) is proposed for design incorporating non-deterministic system behavior. The Inductive Design Exploration Method (IDEM) is proposed to facilitate distributed, robust decision-making under propagated uncertainty in a series of multiscale analyses or simulations. These methods are verified in the context of Design of Multifunctional Energetic Structural Materials (MESM). The MESM is being developed to replace the large amount of steel reinforcement in a missile penetrator, for light weight, high energy release, and sound structural integrity. In this example, the methods enable state-of-the-art design capabilities: robust MESM design under (a) random microstructure changes and (b) propagated uncertainty in a multiscale analysis chain. The methods are designed to facilitate effective and efficient materials design; however, they generalize to the design of any complex engineering system that involves computationally intensive simulations or expensive experiments, non-deterministic models, accumulated uncertainty in multidisciplinary analyses, and distributed, collaborative decision-making.
4

Nixon, Janel Nicole. "A Systematic Process for Adaptive Concept Exploration." Diss., Georgia Institute of Technology, 2006. http://hdl.handle.net/1853/13952.

Abstract:
This thesis presents a method for streamlining the process of obtaining and interpreting quantitative data for the purpose of creating a low-fidelity modeling and simulation environment. By providing a more efficient means for obtaining such information, quantitative analyses become much more practical for decision-making in the very early stages of design, where traditionally they are viewed as too expensive and cumbersome for concept evaluation. The method developed to address this need is a Systematic Process for Adaptive Concept Exploration (SPACE). In the SPACE method, design space exploration occurs in a sequential fashion; as data is acquired, the sampling scheme adapts to the specific problem at hand. Previously gathered data is used to make inferences about the nature of the problem so that future samples can be taken from the more interesting portions of the design space. Furthermore, the SPACE method identifies those analyses that have significant impacts on the relationships being modeled, so that effort can be focused on acquiring only the most pertinent information. The results show that the combination of a tailored data set and an informed model structure provides a meaningful quantitative representation of the system while relying on only a small amount of resources to generate that information. In comparison to more traditional modeling and simulation approaches, the SPACE method provides a more accurate representation of the system using fewer resources. For this reason, the SPACE method acts as an enabler for decision making in the very early design stages, where the desire is to base design decisions on quantitative information without wasting valuable resources obtaining unnecessarily high-fidelity information about all the candidate solutions. Thus, the approach enables concept selection to be based on parametric, quantitative data so that informed, unbiased decisions can be made.
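
A toy version of this kind of sequential, adaptive sampling is sketched below: each round refits a surrogate and draws the next samples where the surrogate is most uncertain. Uncertainty-driven selection is our stand-in heuristic for "the more interesting portions of the design space"; the response function and sample counts are invented, not SPACE itself.

```python
# Hypothetical sketch of sequential, adaptive sampling: resample where
# the current surrogate is least certain (toy 2-D response).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(3)
truth = lambda x: np.sin(6 * x[:, 0]) + x[:, 1] ** 2   # stand-in analysis

X = rng.uniform(size=(10, 2))                  # small initial design
y = truth(X)
for _ in range(5):                             # five adaptive rounds
    gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)
    cand = rng.uniform(size=(200, 2))          # candidate pool
    _, std = gp.predict(cand, return_std=True)
    X_new = cand[np.argsort(std)[-5:]]         # most uncertain candidates
    X, y = np.vstack([X, X_new]), np.append(y, truth(X_new))
print(f"final design size: {len(X)} points")
```
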
5

Qian, Zhiguang. "Computer Experiments: Design, Modeling and Integration." Diss., Georgia Institute of Technology, 2006. http://hdl.handle.net/1853/11480.

Abstract:
The use of computer modeling is fast increasing in almost every scientific, engineering, and business arena. This dissertation investigates some challenging issues in the design, modeling, and analysis of computer experiments and consists of four major parts. In the first part, a new approach is developed to combine data from approximate and detailed simulations to build a surrogate model based on some stochastic models. In the second part, we propose some Bayesian hierarchical Gaussian process models to integrate data from different types of experiments. The third part concerns the development of latent variable models for computer experiments with multivariate response, with application to data center temperature modeling. The last part is devoted to the development of nested space-filling designs for multiple experiments with different levels of accuracy.
6

PRAKASH, DEEPIKA. "ELICITING INFORMATION REQUIREMENTS FOR DATA WAREHOUSES." Thesis, 2016. http://dspace.dtu.ac.in:8080/jspui/handle/repository/14844.

Abstract:
Data Warehouse support in terms of Requirements Engineering models and techniques has been extensively provided for the operational level of decision making. However, it is increasingly being recognized that other forms of decision making exist in an organization; these are strategic in nature and ‘above’ operational decision making. This thesis addresses the issue of providing decision-making support to both strategic and operational decision making in the same DW system. The solution starts by defining two broad categories of decisions for which decision support is needed: one for policy enforcement rule (PER) formulation decisions and the other for operational decisions. Both kinds of decisions are structured based on a generic decision meta-model developed here. The process starts by developing two Data Warehouses, one for policy enforcement rules and the other for operational decisions. In order to identify the information needed to support decision making, a set of generic techniques for eliciting information is proposed. This information is stored in the DW. Again, the structure of information for the two DWs is based on a generic information meta-model developed here. The two DWs are integrated upstream, in the requirements engineering phase, using an integration life cycle proposed in this thesis. It is argued that integration is needed because common information and differing refresh times between the two Data Warehouses can otherwise cause inconsistency and loss of business control. Further, three tools were developed to provide computer support for arriving at information for (a) PERs, (b) operational decisions, and (c) integrating information. This process was validated using AYUSH policies.
7

Faustino, Bruno. "Evidence-based clinical decision-making : Conceptual and empirical foundations for an integrative psychological and neurobiological transtheoretical metamodel." Doctoral thesis, 2021. http://hdl.handle.net/10451/52489.

Abstract:
The dialogue between psychotherapy and neuroscience is ongoing. Previous meta-analytic research suggests that 35% of psychotherapy outcome variance is not fully explained, whereas 30% is attributed to patient variables, 15% to the therapeutic relationship, 10% to specific therapeutic techniques, 7% to therapist variables, and 3% to other factors (Norcross & Wampold, 2019). Several authors emphasize the need for integrative, metatheoretical or transtheoretical approaches to enhance conceptual understanding of clinical phenomena, augmenting psychotherapy responsiveness to patients’ significant variables, such as maladaptive patterns, states of mind, relational styles, emotional difficulties, neurocognitive deficits, and psychological needs. The present doctoral proposal aims to respond to these claims through the establishment of preliminary conceptual and empirical foundations for an Integrative Psychological and Neurobiological Transtheoretical Metamodel. First, an extensive literature review of the relationships between psychotherapy and neuroscience was performed to establish theoretical and conceptual integration of different components of the presently proposed model. Second, several methodological aspects were described to systematize the complex data acquisition process. Third, seven studies were conducted, and implications of the results were discussed. Fourth, an integrative discussion was elaborated, emphasizing the major and general implications of the results for clinical practice and future research. The first empirical study aimed to develop and/or adapt self-report assessment measures to evaluate several psychological variables (e.g., metacognition, states of mind), which resulted in five scientific articles. Thus, the Metacognitive Self-assessment Scale (Pedone et al., 2017) and the Inventory of Interpersonal Problems – 32 (IIP-32; Barkham et al., 1998) were validated and adapted to European Portuguese. The State of Mind Questionnaire (SMQ; Faustino et al., 2021b), the Emotional Processing Difficulties Scale – R (EPDS-R; Faustino et al., in press), and the Clinical Decision-Making Inventory (Faustino & Vasco, in press) were developed. All instruments showed satisfactory psychometric properties. Nevertheless, the SMQ showed low reliability in the composite scales in smaller subsamples. For the second empirical study, the main aims were to explore the complex relationships between early disorder determinants, maladaptive schemas and states of mind, defensive maneuvers and critical consequences, mental skills and processes, and adaptive self-domains. This was performed with Structural Equation Modeling (SEM). Results showed significant sequential and mediational models between maladaptive schemas, defensive maneuvers and dysfunctional consequences, mental abilities and processes, and adaptive self-domains with psychological needs. Maladaptive schemas and states of mind were both predictors and mediators in several models. However, the relationship between maladaptive schematic functioning and symptomatology had less significant mediations with the same variables. For the third study, the main aims were to explore the relationships of early disorder determinants, maladaptive schematic functioning and states of mind, defensive maneuvers and dysfunctional consequences, mental abilities and processes, and adaptive self-domains, with several neurocognitive variables.
Executive functions were negatively correlated with maladaptive schematic functioning and with defensive maneuvers and dysfunctional consequences. Memory only correlated with psychological needs, self-confidence, and dysfunctional interpersonal cycles. These results reinforce previous suggestions that there is a difference between self-report questionnaires and neuropsychological assessment measures, which may complicate the integrated study of psychological and neurocognitive processes. The fourth study aimed to explore the associations of affective subliminal processing with dispositional states and contextual states, defined in the present work as early disorder determinants, schematic functioning, defensive maneuvers and dysfunctional consequences, mental abilities and processes, and adaptive self-domains. Results showed strong correlations between maladaptive schematic functioning, coping responses, emotional processing difficulties, and expressive suppression with behavioral responses. Dispositional traits and contextual states seem to be associated with affective processing, especially when it comes to the neutral valence of the subliminal stimuli. ERP waveforms showed an amplitude modulation with a temporal progression: in the first 100 msec, the waveform amplitude was highest for the negative condition; later, in the time windows after 350 msec, the neutral condition was the one that elicited the highest ERP amplitude. This indexes a cascade of reactions: first, priority given to nonconscious negative stimulation, and after that, a later processing phase of affective-cognitive interpretation (350 msec) in which neutral stimuli acquire meaning according to schemas. The fifth study explored the diagnostic and/or transdiagnostic potential of early disorder determinants, maladaptive schematic functioning and states of mind, defensive maneuvers and dysfunctional consequences, mental abilities and processes, and adaptive self-domains. Results showed that only early complex trauma and expressive suppression were not statistically different between the two subsamples. Individuals in the low-symptoms subsample reported lower levels of maladaptive schematic functioning, defensive maneuvers, and psychological inflexibility than individuals in the higher-symptoms subsample. The sixth study focused on the temporal stability of maladaptive schematic functioning and states of mind, defensive maneuvers and dysfunctional consequences, mental abilities, and adaptive self-domains. Results showed significant differences between moments one and two, with a descending pattern in the mean scores of the dysfunctional variables; an inverse pattern was found for the adaptive variables. However, differences in the mean scores of some variables, such as early maladaptive schemas, emotional schemas, psychological needs, and cognitive reappraisal, were not statistically significant. The seventh study aimed to explore associations of early disorder determinants, maladaptive schemas and states of mind, defensive maneuvers and critical consequences, mental skills and processes, and adaptive self-domains with an empirically based clinical profile (e.g., psychotherapy and motivational stage, coping styles). Results showed significant negative correlations between maladaptive schematic functioning and stage process, motivational stage, therapeutic relationship, attachment style, reactance, and coping style. An inverse pattern was found for the adaptive variables.
These preliminary results seem to support a theoretically- and empirically-based integrative and transtheoretical metamodel focused on unifying psychotherapy and neuroscience into a coherent framework. Further research is required to augment and enhance the presently proposed model.

Books on the topic "DECISION METAMODEL"

1

Van Gigch, John P., ed. Decision making about decision making: Metamodels and metasystems. Cambridge, Mass.: Abacus Press, 1987.

2

Bigelow, J. H., and Rand Corporation, eds. Motivated metamodels: Synthesis of cause-effect reasoning and statistical metamodeling. Santa Monica, CA: RAND, 2003.

3

Gigch, J. Van. Decision Making About Decision Making: Metamodels and Metasystems (Cybernetics and Systems). Taylor & Francis, 1987.


Book chapters on the topic "DECISION METAMODEL"

1

Mihalache, Sabina-Cristiana. "Decision Rules: A Metamodel to Organize Information." In Lecture Notes in Electrical Engineering, 299–308. Boston, MA: Springer US, 2009. http://dx.doi.org/10.1007/978-0-387-85437-3_28.

2

Capilla, Rafael, Olaf Zimmermann, Uwe Zdun, Paris Avgeriou, and Jochen M. Küster. "An Enhanced Architectural Knowledge Metamodel Linking Architectural Design Decisions to other Artifacts in the Software Engineering Lifecycle." In Software Architecture, 303–18. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-23798-0_33.

3

Barbier, Guillaume, Véronique Cucchi, François Pinet, and David R. C. Hill. "CMF." In Advances in Systems Analysis, Software Engineering, and High Performance Computing, 181–95. IGI Global, 2013. http://dx.doi.org/10.4018/978-1-4666-4217-1.ch006.

Abstract:
This chapter shows how Model Driven Engineering (MDE) can contribute to the production of crop models. The ITK firm works in agronomy; it designs digital models and Decision Support Systems for croppers. Common model development at ITK relies on a dual implementation: the first, in Matlab®, is usually produced by agronomists, and for industrial purposes, software engineers then translate this model into Java. This leads to duplicated implementation and maintenance, and heavy production costs. To deal with this efficiency problem, the authors use an MDE approach to produce a Crop Model Factory (CMF). For this factory they propose a DSML (Domain Specific Modeling Language) formalized by a metamodel. In this chapter, the authors present this DSML, the concrete syntax retained for the factory, and its implementation in a tool enabling automatic code generation. The resulting Crop Model Factory (CMF) prototype is the fruit of an interdisciplinary collaboration, and the authors also present feedback on this working experience.
4

"Digital Citizenship." In Metamodernism and Changing Literacy, 225–62. IGI Global, 2020. http://dx.doi.org/10.4018/978-1-7998-3534-9.ch009.

Abstract:
In metamodern culture, digital citizens are required to make multitudinous decisions about consuming and producing information around the clock. Technology surrounds us at home, at work, in our communities, in our schools and libraries with increased expectations of instant global communication. In networked society, our responsibilities for digital citizenship have become essential. Metaliteracy is key to digital citizenship and critical to education as learners acquire, produce, and share knowledge in collaborative online communities and social media platforms. As both physical citizens of local communities and virtual citizens of global communities, the oscillation between physical and virtual space epitomizes metamodernism, laying a foundation for the future. This chapter concludes with a look at future trends for the metamodern metaliterate learner.
5

Gunn, Eldon, and Corinne MacDonald. "Neural Networks in Manufacturing Operations." In Artificial Neural Networks in Finance and Manufacturing, 165–81. IGI Global, 2006. http://dx.doi.org/10.4018/978-1-59140-670-9.ch010.

Abstract:
This chapter provides some examples from the literature of how feed-forward neural networks are used in three different contexts in manufacturing operations. Operational design problems involve the determination of design parameters, such as number of kanbans, in order to optimize the performance of the system. Operational-system decision support refers to the use of neural networks as decision-support mechanisms in predicting system performance in response to certain settings of system parameters and current environmental factors. Operational-system-control problems are distinguished from decision support in that the consequences of a control decision are both an immediate return and putting the system in a new state from which another control decision needs to be taken. In operational control, new ideas are emerging using neural networks in approximate dynamic programming. Manufacturing systems can be very complex. There are many factors that may influence the performance of these systems; yet in many cases, the true relationship between these factors and the system outcomes is not fully understood. Neural networks have been given a great deal of attention in recent years with their ability to learn complex mappings even when presented with a partial, and even noisy, set of data. This has resulted in their being considered as a means to study and perhaps even optimize the performance of manufacturing operations. This chapter provides some examples from the literature of how neural networks are used in three different contexts in manufacturing systems. The categories (1) operational design, (2) operational decision-support systems, and (3) operational control are distinguished by the time context within which the models are used. Some examples make use of simulation models to produce training data, while some use actual production data. In some applications, the network is used to simply predict performance or outcomes, while in others the neural network is used in the determination of optimal parameters or to recommend good settings. Readers who wish to explore further examples of neural networks in manufacturing can examine Udo (1992), Zhang and Huang (1995), and Wang, Tang, and Roze (2001). We begin with two areas in which neural networks have found extensive use in manufacturing. Operational-system design has seen considerable use of neural networks as metamodels that can stand in place of the system, as we attempt to understand its behavior and optimize design parameters. Operational-system decision support refers to the use of neural networks as decision-support mechanisms in predicting system performance in response to certain settings of system parameters. We close with a short introduction to an area where we anticipate seeing growing numbers of applications, namely the use of approximate dynamic programming methods to develop real-time controllers for manufacturing systems.

Conference papers on the topic "DECISION METAMODEL"

1

"METAMODEL-BASED DECISION SUPPORT SYSTEM FOR DISASTER MANAGEMENT." In 5th International Conference on Software and Data Technologies. SciTePress - Science and and Technology Publications, 2010. http://dx.doi.org/10.5220/0003006504030412.

2

Hamzane, Ibrahim, and Badr EL Khalyly. "Towards an IT Governance of DevOps Metamodel." In 2021 International Conference on Decision Aid Sciences and Application (DASA). IEEE, 2021. http://dx.doi.org/10.1109/dasa53625.2021.9682334.

3

Franke, Ulrik, Johan Ullberg, Teodor Sommestad, Robert Lagerstrom, and Pontus Johnson. "Decision support oriented Enterprise Architecture metamodel management using classification trees." In 2009 13th Enterprise Distributed Object Computing Conference Workshops, EDOCW. IEEE, 2009. http://dx.doi.org/10.1109/edocw.2009.5331975.

4

"A Generic Workflow Metamodel to Support Resource-aware Decision Making." In 15th International Conference on Enterprise Information Systems. SciTePress - Science and and Technology Publications, 2013. http://dx.doi.org/10.5220/0004417002430250.

5

Ramete, G. Mace, J. Lamothe, M. Lauras, and F. Benaben. "A road crisis management metamodel for an information decision support system." In 2012 6th IEEE International Conference on Digital Ecosystems and Technologies (DEST) - Complex Environment Engineering. IEEE, 2012. http://dx.doi.org/10.1109/dest.2012.6227934.

6

Li, Mian. "An Improved Kriging Assisted Multi-Objective Genetic Algorithm." In ASME 2010 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2010. http://dx.doi.org/10.1115/detc2010-28543.

Abstract:
Although Genetic Algorithms (GAs) and Multi-Objective Genetic Algorithms (MOGAs) have been widely used in engineering design optimization, the important challenge still faced by researchers in using these methods is their high computational cost due to the population-based nature of these methods. For these problems it is important to devise MOGAs that can significantly reduce the number of simulation calls compared to a conventional MOGA. We present an improved kriging assisted MOGA, called Circled Kriging MOGA (CK-MOGA), in which kriging metamodels are embedded within the computation procedure of a traditional MOGA. In the proposed approach, the decision as to whether the original simulation or its kriging metamodel should be used for evaluating an individual is based on a new objective switch criterion and an adaptive metamodeling technique. The effect of the possible estimated error from the metamodel is mitigated by applying the new switch criterion. Three numerical and engineering examples with different degrees of difficulty are used to illustrate applicability of the proposed approach. The results show that, on the average, CK-MOGA outperforms both a conventional MOGA and our developed Kriging MOGA in terms of the number of simulation calls.
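
The abstract does not spell out its switch criterion, so the sketch below illustrates one plausible uncertainty-aware variant: trust the kriging prediction only when its estimated error band is small, otherwise call the simulation. The 2-sigma threshold, the toy simulation, and all names are our assumptions, not CK-MOGA's actual rule.

```python
# Hedged sketch of a metamodel/simulation switch driven by the kriging
# model's own error estimate (illustrative only).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(4)
simulate = lambda x: float(np.sum(x ** 2))       # stand-in simulation

X = rng.uniform(size=(40, 3))
y = np.array([simulate(x) for x in X])
gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)

def evaluate(x, threshold):
    """Use the metamodel unless its 2-sigma band exceeds `threshold`."""
    mu, std = gp.predict(x.reshape(1, -1), return_std=True)
    if 2 * std[0] > threshold:
        return simulate(x), "simulation"
    return float(mu[0]), "metamodel"

val, source = evaluate(rng.uniform(size=3), threshold=0.05)
print(f"{val:.3f} via {source}")
```
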
7

Lafi, Lamine, Jamel Feki, and Slimane Hammoudi. "M2BenchMatch: An assisting tool for metamodel matching." In 2013 International Conference on Control, Decision and Information Technologies (CoDIT). IEEE, 2013. http://dx.doi.org/10.1109/codit.2013.6689586.

8

Conner, Landon, Clarence L. Worrell, James P. Spring, and Jun Liao. "Machine Learned Metamodeling of a Computationally Intensive Accident Simulation Code." In 2021 28th International Conference on Nuclear Engineering. American Society of Mechanical Engineers, 2021. http://dx.doi.org/10.1115/icone28-66619.

Abstract:
Abstract The nuclear power industry is increasingly identifying applications of machine learning to reduce design, engineering, manufacturing, and operational costs. In some cases, applications have been deployed and are realizing value, in particular in the higher volume and data rich manufacturing areas of the nuclear industry. In this paper, we use machine learning to develop metamodel approximations of a computationally intense safety analysis code used to simulate a postulated loss-of-coolant accident (LOCA). The benefit of an accurate metamodel is that it runs at a fraction of the computational cost (milliseconds) compared to the LOCA analysis code. Metamodels can therefore support applications requiring a high volume of runs, for example optimization, uncertainty analysis, and probabilistic decision analysis, which would otherwise not be possible using the computationally intense code. We first generate training data by running the safety analysis code over a design of experiment. We then perform exploratory data analysis and an initial fitting of several model forms, including neighbor-based models, tree-based models, support vector machines, and artificial neural networks. We select neural network as the most promising candidate and perform hyperparameter optimization using a genetic algorithm. We discuss the resulting model, its potential applications, and areas for further research.
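
A scaled-down version of this workflow is easy to sketch: generate design-of-experiment data from an expensive code (a toy function here), fit a small neural-network metamodel, and tune its hyperparameters. The paper uses a genetic algorithm for that last step; a plain grid search stands in below, and every name and number is illustrative.

```python
# Illustrative sketch: a neural-network metamodel trained on DOE runs
# of an expensive code (toy function), queried in milliseconds.
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
X = rng.uniform(size=(400, 4))                      # DOE over 4 inputs
y = np.exp(-X[:, 0]) * np.sin(4 * X[:, 1]) + X[:, 2] * X[:, 3]

pipe = make_pipeline(StandardScaler(),
                     MLPRegressor(max_iter=5000, random_state=0))
search = GridSearchCV(pipe,
                      {"mlpregressor__hidden_layer_sizes": [(32,), (64, 32)],
                       "mlpregressor__alpha": [1e-4, 1e-2]},
                      cv=3)
search.fit(X, y)
print("best config:", search.best_params_,
      f"| CV R^2: {search.best_score_:.2f}")
```
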
9

Llesol, Mel, and Grace Lorraine Intal. "Development and Validation of a Disaster Response Decision Metamodel in the Philippine Context." In 2019 IEEE 6th International Conference on Industrial Engineering and Applications (ICIEA). IEEE, 2019. http://dx.doi.org/10.1109/iea.2019.8714905.

10

Li, Mian, Genzi Li, and Shapour Azarm. "A Kriging Metamodel Assisted Multi-Objective Genetic Algorithm for Design Optimization." In ASME 2006 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2006. http://dx.doi.org/10.1115/detc2006-99316.

Abstract:
The high computational cost of population-based optimization methods, such as multi-objective genetic algorithms, has been preventing applications of these methods to realistic engineering design problems. The main challenge is to devise methods that can significantly reduce the number of computationally intensive simulation (objective/constraint function) calls. We present a new multi-objective design optimization approach in which kriging-based metamodeling is embedded within a multi-objective genetic algorithm. The approach is called Kriging assisted Multi-Objective Genetic Algorithm, or K-MOGA. The key difference between K-MOGA and a conventional MOGA is that in K-MOGA some of the design points or individuals are evaluated by kriging metamodels, which are computationally inexpensive, instead of by the simulation. The decision as to whether the simulation or the kriging metamodels should be used for evaluating an individual is based on checking a simple condition: whether using the kriging metamodels for that individual changes the non-dominated set in the current generation. If this set is changed, then the simulation is used for evaluating the individual; otherwise, the corresponding kriging metamodels are used. Seven numerical and engineering examples with different degrees of difficulty are used to illustrate applicability of the proposed K-MOGA. The results show that on the average, K-MOGA converges to the Pareto frontier with about 50% fewer simulation calls compared to a conventional MOGA.
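
The K-MOGA switch condition is stated plainly enough to sketch: evaluate an individual with the cheap kriging estimate first, and only run the simulation if that estimate would change the current non-dominated set. The helper functions and the two-objective numbers below are ours, for illustration only.

```python
# Sketch of the K-MOGA decision condition for two minimized objectives.
import numpy as np

def dominates(a, b):
    """True if point a Pareto-dominates point b (minimization)."""
    return np.all(a <= b) and np.any(a < b)

def changes_front(front, candidate):
    """Would adding `candidate` alter the non-dominated set?"""
    if any(dominates(f, candidate) for f in front):
        return False              # candidate is dominated: front unchanged
    return True                   # candidate enters (and may displace points)

front = [np.array([1.0, 4.0]), np.array([2.0, 2.0]), np.array([4.0, 1.0])]
kriging_estimate = np.array([1.5, 1.8])   # cheap metamodel evaluation

if changes_front(front, kriging_estimate):
    print("front would change -> run the expensive simulation")
else:
    print("front unchanged -> keep the kriging estimate")
```
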