Journal articles on the topic "DECISION METAMODEL"

Consult the top 50 journal articles for research on the topic "DECISION METAMODEL".


1

Soeteman, Djøra I., Stephen C. Resch, Hawre Jalal, Caitlin M. Dugdale, Martina Penazzato, Milton C. Weinstein, Andrew Phillips et al. "Developing and Validating Metamodels of a Microsimulation Model of Infant HIV Testing and Screening Strategies Used in a Decision Support Tool for Health Policy Makers". MDM Policy & Practice 5, no. 1 (January 2020): 238146832093289. http://dx.doi.org/10.1177/2381468320932894.

Abstract:
Background. Metamodels can simplify complex health policy models and yield instantaneous results to inform policy decisions. We investigated the predictive validity of linear regression metamodels used to support a real-time decision-making tool that compares infant HIV testing/screening strategies. Methods. We developed linear regression metamodels of the Cost-Effectiveness of Preventing AIDS Complications Pediatric (CEPAC-P) microsimulation model used to predict life expectancy and lifetime HIV-related costs/person of two infant HIV testing/screening programs in South Africa. Metamodel performance was assessed with cross-validation and Bland-Altman plots, showing between-method differences in predicted outcomes against their means. Predictive validity was determined by the percentage of simulations in which the metamodels accurately predicted the strategy with the greatest net health benefit (NHB) as projected by the CEPAC-P model. We introduced a zone of indifference and investigated the width needed to produce between-method agreement in 95% of the simulations. We also calculated NHB losses from “wrong” decisions by the metamodel. Results. In cross-validation, linear regression metamodels accurately approximated CEPAC-P-projected outcomes. For life expectancy, Bland-Altman plots showed good agreement between CEPAC-P and the metamodel (within 1.1 life-months difference). For costs, 95% of between-method differences were within $65/person. The metamodels predicted the same optimal strategy as the CEPAC-P model in 87.7% of simulations, increasing to 95% with a zone of indifference of 0.24 life-months ( ∼ 7 days). The losses in health benefits due to “wrong” choices by the metamodel were modest (range: 0.0002–1.1 life-months). Conclusions. For this policy question, linear regression metamodels offered sufficient predictive validity for the optimal testing strategy as compared with the CEPAC-P model. Metamodels can simulate different scenarios in real time, based on sets of input parameters that can be depicted in a widely accessible decision-support tool.
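The validation idea above (fit a linear regression metamodel to simulator outputs, then check between-method agreement with Bland-Altman limits) can be sketched in a few lines. The sketch below uses synthetic data in place of CEPAC-P runs; the toy response and variable names are assumptions for illustration only.

```python
# Sketch: linear regression metamodel of a (placeholder) simulator, with
# Bland-Altman limits of agreement between simulator and metamodel predictions.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)

# Placeholder "simulator": life expectancy as a smooth function of 3 input parameters.
X = rng.uniform(0.0, 1.0, size=(500, 3))
life_exp = 60 + 8 * X[:, 0] - 5 * X[:, 1] + 2 * X[:, 2] + rng.normal(0, 0.5, 500)

metamodel = LinearRegression()
# Cross-validated metamodel predictions, as in the validation step of the abstract.
pred = cross_val_predict(metamodel, X, life_exp, cv=10)

# Bland-Altman statistics: mean difference (bias) and 95% limits of agreement.
diff = pred - life_exp
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)
print(f"bias = {bias:.3f}, 95% limits of agreement = [{bias - loa:.3f}, {bias + loa:.3f}]")
```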
2

Bracco, A., J. D. Neelin, H. Luo, J. C. McWilliams and J. E. Meyerson. "High dimensional decision dilemmas in climate models". Geoscientific Model Development Discussions 6, no. 2 (May 8, 2013): 2731–67. http://dx.doi.org/10.5194/gmdd-6-2731-2013.

Abstract:
Abstract. An important source of uncertainty in climate models is linked to the calibration of model parameters. Interest in systematic and automated parameter optimization procedures stems from the desire to improve the model climatology and to quantify the average sensitivity associated with potential changes in the climate system. Neelin et al. (2010) used a quadratic metamodel to objectively calibrate an atmospheric circulation model (AGCM) around four adjustable parameters. The metamodel accurately estimates global spatial averages of common fields of climatic interest, from precipitation, to low and high level winds, from temperature at various levels to sea level pressure and geopotential height, while providing a computationally cheap strategy to explore the influence of parameter settings. Here, guided by the metamodel, the ambiguities or dilemmas related to the decision making process in relation to model sensitivity and optimization are examined. Simulations of current climate are subject to considerable regional-scale biases. Those biases may vary substantially depending on the climate variable considered, and/or on the performance metric adopted. Common dilemmas are associated with model revisions yielding improvement in one field or regional pattern or season, but degradation in another, or improvement in the model climatology but degradation in the interannual variability representation. Challenges are posed to the modeler by the high dimensionality of the model output fields and by the large number of adjustable parameters. The use of the metamodel in the optimization strategy helps visualize trade-offs at a regional level, e.g. how mismatches between sensitivity and error spatial fields yield regional errors under minimization of global objective functions.
3

Bracco, A., J. D. Neelin, H. Luo, J. C. McWilliams and J. E. Meyerson. "High dimensional decision dilemmas in climate models". Geoscientific Model Development 6, no. 5 (October 15, 2013): 1673–87. http://dx.doi.org/10.5194/gmd-6-1673-2013.

Abstract:
Abstract. An important source of uncertainty in climate models is linked to the calibration of model parameters. Interest in systematic and automated parameter optimization procedures stems from the desire to improve the model climatology and to quantify the average sensitivity associated with potential changes in the climate system. Building upon on the smoothness of the response of an atmospheric circulation model (AGCM) to changes of four adjustable parameters, Neelin et al. (2010) used a quadratic metamodel to objectively calibrate the AGCM. The metamodel accurately estimates global spatial averages of common fields of climatic interest, from precipitation, to low and high level winds, from temperature at various levels to sea level pressure and geopotential height, while providing a computationally cheap strategy to explore the influence of parameter settings. Here, guided by the metamodel, the ambiguities or dilemmas related to the decision making process in relation to model sensitivity and optimization are examined. Simulations of current climate are subject to considerable regional-scale biases. Those biases may vary substantially depending on the climate variable considered, and/or on the performance metric adopted. Common dilemmas are associated with model revisions yielding improvement in one field or regional pattern or season, but degradation in another, or improvement in the model climatology but degradation in the interannual variability representation. Challenges are posed to the modeler by the high dimensionality of the model output fields and by the large number of adjustable parameters. The use of the metamodel in the optimization strategy helps visualize trade-offs at a regional level, e.g., how mismatches between sensitivity and error spatial fields yield regional errors under minimization of global objective functions.
4

Gayathri, Rajakumaran, Shola Usha Rani, Lenka Čepová, Murugesan Rajesh and Kanak Kalita. "A Comparative Analysis of Machine Learning Models in Prediction of Mortar Compressive Strength". Processes 10, no. 7 (July 15, 2022): 1387. http://dx.doi.org/10.3390/pr10071387.

Abstract:
Predicting the mechanical properties of cement-based mortars is essential in understanding the life and functioning of structures. Machine learning (ML) algorithms in this regard can be especially useful in prediction scenarios. In this paper, a comprehensive comparison of nine ML algorithms, i.e., linear regression (LR), random forest regression (RFR), support vector regression (SVR), AdaBoost regression (ABR), multi-layer perceptron (MLP), gradient boosting regression (GBR), decision tree regression (DT), hist gradient boosting regression (hGBR) and XGBoost regression (XGB), is carried out. A multi-attribute decision making method called TOPSIS (technique for order of preference by similarity to ideal solution) is used to select the best ML metamodel. A large dataset on cement-based mortars consisting of 424 sample points is used. The compressive strength of cement-based mortars is predicted based on six input parameters, i.e., the age of specimen (AS), the cement grade (CG), the metakaolin-to-total-binder ratio (MK/B), the water-to-binder ratio (W/B), the superplasticizer-to-binder ratio (SP) and the binder-to-sand ratio (B/S). XGBoost regression is found to be the best ML metamodel while simple metamodels like linear regression (LR) are found to be insufficient in handling the non-linearity in the process. This mapping of the compressive strength of mortars using ML techniques will be helpful for practitioners and researchers in identifying suitable mortar mixes.
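The TOPSIS step used above to pick the best metamodel is straightforward to reproduce. A minimal sketch, assuming an illustrative metric matrix (R², RMSE, MAE) and weights rather than the paper's actual values:

```python
# Sketch of TOPSIS ranking of candidate metamodels from a table of performance
# metrics. The metric values and weights below are illustrative placeholders.
import numpy as np

models = ["LR", "RFR", "SVR", "GBR", "XGB"]
# Columns: R^2 (benefit), RMSE (cost), MAE (cost)
M = np.array([
    [0.71, 6.2, 4.9],
    [0.90, 3.7, 2.8],
    [0.85, 4.5, 3.4],
    [0.92, 3.3, 2.5],
    [0.94, 3.0, 2.2],
])
weights = np.array([0.4, 0.3, 0.3])
benefit = np.array([True, False, False])

# 1) vector-normalize, 2) weight, 3) ideal/anti-ideal points, 4) closeness coefficient
N = M / np.linalg.norm(M, axis=0)
V = N * weights
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
d_plus = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - anti, axis=1)
closeness = d_minus / (d_plus + d_minus)

for m, c in sorted(zip(models, closeness), key=lambda t: -t[1]):
    print(f"{m}: {c:.3f}")
```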
5

Suljic, Mirza, Edin Osmanbegovic and Željko Dobrović. "Common Metamodel of Questionnaire Model and Decision Tree Model". Research in Applied Economics 10, no. 3 (September 25, 2018): 106. http://dx.doi.org/10.5296/rae.v10i3.13540.

Abstract:
The subject of this paper is metamodeling and its application in the field of scientific research. The main goal is to explore the possibilities of integration of two methods: questionnaires and decision trees. The questionnaire method was established as one of the methods for data collecting, while the decision tree method represents an alternative way of presenting and analyzing decision making situations. These two methods are not completely independent, but on the contrary, there is a strong natural bond between them. Therefore, the result reveals a common meta-model that over common concepts and with the use of metamodeling connects the methods: questionnaires and decision trees. The obtained results can be used to create a CASE tool or create repository that can be suitable for exchange between different systems. The proposed meta-model is not necessarily the final product. It could be further developed by adding more entities that will keep some other data.
6

Meidt, Gregory J., and Kenneth W. Bauer. "PCRSM: A decision support system for simulation metamodel construction". SIMULATION 59, no. 3 (September 1992): 183–91. http://dx.doi.org/10.1177/003754979205900307.

7

Rojas, Luz Andrea Rodríguez, Juan Manuel Cueva Lovelle, Giovanny Mauricio Tarazona Bermúdez, Carlos Enrique Montenegro, Elena Giménez de Ory and Rubén Arístides González Crespo. "Metamodel to support decision-making from open government data". Journal of Ambient Intelligence and Humanized Computing 9, no. 3 (February 6, 2017): 553–63. http://dx.doi.org/10.1007/s12652-016-0443-7.

8

Mahmud, Istiak, Md Mohsin Kabir, M. F. Mridha, Sultan Alfarhood, Mejdl Safran and Dunren Che. "Cardiac Failure Forecasting Based on Clinical Data Using a Lightweight Machine Learning Metamodel". Diagnostics 13, no. 15 (July 31, 2023): 2540. http://dx.doi.org/10.3390/diagnostics13152540.

Abstract:
Accurate prediction of heart failure can help prevent life-threatening situations. Several factors contribute to the risk of heart failure, including underlying heart diseases such as coronary artery disease or heart attack, diabetes, hypertension, obesity, certain medications, and lifestyle habits such as smoking and excessive alcohol intake. Machine learning approaches to predict and detect heart disease hold significant potential for clinical utility but face several challenges in their development and implementation. This research proposes a machine learning metamodel for predicting a patient’s heart failure based on clinical test data. The proposed metamodel was developed based on Random Forest Classifier, Gaussian Naive Bayes, Decision Tree models, and k-Nearest Neighbor as the final estimator. The metamodel is trained and tested utilizing a combined dataset comprising five well-known heart datasets (Statlog Heart, Cleveland, Hungarian, Switzerland, and Long Beach), all sharing 11 standard features. The study shows that the proposed metamodel can predict heart failure more accurately than other machine learning models, with an accuracy of 87%.
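The described architecture maps naturally onto a stacked ensemble. A minimal sketch with scikit-learn, using random placeholder data instead of the combined five-dataset cohort (the 11 features and hyperparameters are assumptions):

```python
# Sketch: a stacking "metamodel" with random forest, Gaussian NB and decision
# tree base learners and a k-NN final estimator, as described in the abstract.
# The data here are random placeholders for the 11 clinical features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 11))                  # 11 clinical features (placeholder)
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(0, 1, 1000) > 0).astype(int)

stack = StackingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
        ("gnb", GaussianNB()),
        ("dt", DecisionTreeClassifier(max_depth=5, random_state=0)),
    ],
    final_estimator=KNeighborsClassifier(n_neighbors=7),
    cv=5,
)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
stack.fit(X_tr, y_tr)
print("held-out accuracy:", stack.score(X_te, y_te))
```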
9

Shevchenko, Igor, Denis Vasiliev, Nataly Rylova and Natalia Sharonova. "CONCEPTUAL MODELS OF THE DECISION SUPPORT SYSTEM FOR MANAGING SET OF THE MUNICIPAL SPHERE PROJECTS". Transactions of Kremenchuk Mykhailo Ostrohradskyi National University, no. 3(128) (June 11, 2021): 57–62. http://dx.doi.org/10.30929/1995-0519.2021.3.57-62.

Abstract:
Purpose. As a rule modern organizational systems in any industry contain many elements. These elements are connected by a complex scheme of relationships. In the municipal sphere, various programs and projects are implemented simultaneously. To increase the efficiency of work with projects, it is necessary to solve operations management problems. It is necessary to coordinate the allocation of resources, assess the quality of implementation of individual stages of each project, take into account the joint use of contractors, assess risks and eliminate problem situations. The purpose of the research is to improve the conceptual models of decision support systems to create information technology for managing the implementation of many programs and projects in the municipal sphere. Methodology. For the implementation of operational management of many projects, it is advisable to have the appropriate information technology and tools. The development of such technologies requires a certain methodological approach. The difficulties of structural synthesis of executive systems are associated with the uncertainty of the design tasks conditions. It is impossible to fully specify the algorithms of functioning and interaction of executive structures. Results. An improved ontological metamodel of the problem area and an improved model of the decision support system for managing multiple projects have been developed. Originality. The ontological metamodel of the problem area differs in that it has several strata, each of which identifies the relevant problems and aspects of the organization. The model of the decision support system has been improved by adding a formal static description of the implementation process of many projects and their relationship with many attributes and aspects. Practical value. An ontological metamodel allows to establish hierarchical causal relationships between different problems. This is especially true when implementing many projects involving many departments. The model of the decision support system allows making operational decisions taking into account many factors and their interrelations on various aspects of activity. All this contributes to the main task – the development of information technology to support decision-making in the organizational processes of municipal bodies and enterprises. References 12, figures 1.
10

Idier, Déborah, Axel Aurouet, François Bachoc, Audrey Baills, José Betancourt, Fabrice Gamboa, Thierry Klein et al. "A User-Oriented Local Coastal Flooding Early Warning System Using Metamodelling Techniques". Journal of Marine Science and Engineering 9, no. 11 (October 27, 2021): 1191. http://dx.doi.org/10.3390/jmse9111191.

Abstract:
Given recent scientific advances, coastal flooding events can be properly modelled. Nevertheless, such models are computationally expensive (requiring many hours), which prevents their use for forecasting and warning. In addition, there is a gap between the model outputs and information actually needed by decision makers. The present work aims to develop and test a method capable of forecasting coastal flood information adapted to users’ needs. The method must be robust and fast and must integrate the complexity of coastal flood processes. The explored solution relies on metamodels, i.e., mathematical functions that precisely and efficiently (within minutes) estimate the results that would provide the numerical model. While the principle of relying on metamodel solutions is not new, the originality of the present work is to tackle and validate the entire process from the identification of user needs to the establishment and validation of the rapid forecast and early warning system (FEWS) while relying on numerical modelling, metamodelling, the development of indicators, and information technologies. The development and validation are performed at the study site of Gâvres (France). This site is subject to wave overtopping, so the numerical phase-resolving SWASH model is used to build the learning dataset required for the metamodel setup. Gaussian process- and random forest classifier-based metamodels are used and post-processed to estimate 14 indicators of interest for FEWS users. These metamodelling and post-processing schemes are implemented in an FEWS prototype, which is employed by local users and exhibits good warning skills during the validation period. Based on this experience, we provide recommendations for the improvement and/or application of this methodology and individual steps to other sites.
11

Sacha, Krzysztof. "On the Semantics of Architectural Decisions". International Journal of Software Engineering and Knowledge Engineering 26, no. 02 (March 2016): 333–46. http://dx.doi.org/10.1142/s0218194016500145.

Abstract:
The architecture of a software system results from decisions made by the developers throughout the software life cycle. Any decision pertaining to software architecture is called an architectural decision. Architectural decision modelling captures the dependencies that exist between the decisions and serves as a foundation for knowledge management and reuse. Several models have been described in the literature, using natural language to explain the basic notions and class diagrams to show relations between them. However, a formal definition of an architectural decision is still missing. This paper analyzes existing architectural decision models and provides a formal background for the basic notions that all the models have consensus on. The major contribution of this paper is twofold: to propose a set-theoretic definition of the semantics of architectural decisions; and to show an explicit interpretation of basic relationships that exist in the architectural knowledge. The formalization can help in understanding the meaning of architectural decisions and the meaning of relations that exist between the decision elements. UML-based metamodel for architectural design decisions is also presented.
12

Kalita, Kanak, Ranjan Kumar Ghadai and Ankur Bansod. "Sensitivity Analysis of GFRP Composite Drilling Parameters and Genetic Algorithm-Based Optimisation". International Journal of Applied Metaheuristic Computing 13, no. 1 (January 2022): 1–17. http://dx.doi.org/10.4018/ijamc.290539.

Abstract:
In this article, a genetic algorithm (GA) is used for optimizing a metamodel of surface roughness (R_a ) in drilling glass-fibre reinforced plastic (GFRP) composites. A response surface methodology (RSM) based three levels (-1, 0, 1) design of experiments is used for developing the metamodel. Analysis of variance (ANOVA) is undertaken to determine the importance of each process parameter in the developed metamodel. Subsequently, after detailed metamodel adequacy checks, the insignificant terms are dropped to make the established metamodel more rigorous and make accurate predictions. A sensitivity analysis of the independent variables on the output response helps in determining the most influential parameters. It is observed that f is the most crucial parameter, followed by the t and D. The optimization results depict that the R_a increases as the f increases and a minor value of drill diameter is the most appropriate to attain minimum surface roughness. Finally, a robustness test of the predicted GA solution is carried out.
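A compact sketch of the workflow described above: fit a quadratic response-surface metamodel of Ra over coded factors and minimize it. Differential evolution is used here as a stand-in for the paper's genetic algorithm, and the design points and response values are synthetic placeholders:

```python
# Sketch: quadratic response-surface metamodel of surface roughness (Ra) over
# three coded drilling parameters, minimized with a population-based optimizer.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from scipy.optimize import differential_evolution

rng = np.random.default_rng(2)
# Three coded factors in [-1, 1]: feed f, thickness t, drill diameter D (placeholders).
X = rng.uniform(-1, 1, size=(27, 3))
ra = (3.0 + 1.2 * X[:, 0] + 0.5 * X[:, 1] - 0.4 * X[:, 2]
      + 0.6 * X[:, 0] ** 2 + rng.normal(0, 0.05, 27))

rsm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
rsm.fit(X, ra)

# Minimize the fitted metamodel over the coded design space.
res = differential_evolution(lambda x: float(rsm.predict(x.reshape(1, -1))[0]),
                             bounds=[(-1, 1)] * 3, seed=0)
print("optimal coded settings:", res.x, "predicted Ra:", res.fun)
```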
13

Kaliszewski, I., and D. Podkopaev. "Simple additive weighting—A metamodel for multiple criteria decision analysis methods". Expert Systems with Applications 54 (July 2016): 155–61. http://dx.doi.org/10.1016/j.eswa.2016.01.042.

14

Lin, Li, Jeffery K. Cochran and Joseph Sarkis. "A metamodel-based decision support system for shop floor production control". Computers in Industry 18, no. 2 (1992): 155–68. http://dx.doi.org/10.1016/0166-3615(92)90110-9.

15

Di Tria, Francesco, Ezio Lefons and Filippo Tangorra. "Metadata for Approximate Query Answering Systems". Advances in Software Engineering 2012 (September 3, 2012): 1–13. http://dx.doi.org/10.1155/2012/247592.

Abstract:
In business intelligence systems, data warehouse metadata management and representation are getting more and more attention by vendors and designers. The standard language for the data warehouse metadata representation is the Common Warehouse Metamodel. However, business intelligence systems include also approximate query answering systems, since these software tools provide fast responses for decision making on the basis of approximate query processing. Currently, the standard meta-model does not allow to represent the metadata needed by approximate query answering systems. In this paper, we propose an extension of the standard metamodel, in order to define the metadata to be used in online approximate analytical processing. These metadata have been successfully adopted in ADAP, a web-based approximate query answering system that creates and uses statistical data profiles.
16

Kaptan, Kubilay. "An Organizational Metamodel for Hospital Emergency Departments". Disaster Medicine and Public Health Preparedness 8, no. 5 (October 2014): 436–44. http://dx.doi.org/10.1017/dmp.2014.101.

Abstract:
Abstract. I introduce an organizational model describing the response of the hospital emergency department. The hybrid simulation/analytical model (called a “metamodel”) can estimate a hospital’s capacity and dynamic response in real time and incorporate the influence of damage to structural and nonstructural components on the organizational ones. The waiting time is the main parameter of response and is used to evaluate the disaster resilience of health care facilities. Waiting time behavior is described by using a double exponential function and its parameters are calibrated based on simulated data. The metamodel covers a large range of hospital configurations and takes into account hospital resources in terms of staff and infrastructures, operational efficiency, and the possible existence of an emergency plan; maximum capacity; and behavior both in saturated and overcapacitated conditions. The sensitivity of the model to different arrival rates, hospital configurations, and capacities and the technical and organizational policies applied during and before a disaster were investigated. This model becomes an important tool in the decision process either for the engineering profession or for policy makers. (Disaster Med Public Health Preparedness. 2014;8:436-444)
17

He, Lei, Jian Yao and Yong Lin Lei. "Air-Combat Decision Modeling Method Based on DSM". Applied Mechanics and Materials 536-537 (April 2014): 416–20. http://dx.doi.org/10.4028/www.scientific.net/amm.536-537.416.

Abstract:
Air-combat decision modeling in effectiveness simulation has to be concerned with the important feature of decision making, such as complexity, diversity, flexibility. So Several challenges have to be mastered, including: improving the abstract level of modeling, providing friendly modeling language, validating concept model and generated code (or executive model) automatically. In this paper, domain-specific modeling (DSM) method is applied in air-combat decision simulation modeling to cope with those challenges. A graphical and textual domain-specific modeling language (DSML) of air-combat decision is designed through metamodel based on an open source tool, Generic Modeling Environment (GME). A code generator is developed to implement users decision model based on python script.
18

Abd Sahrin, Mohammad Sahrul Akmal, and Mohd Faisal Abdul Khanan. "GEOSPATIAL METAMODEL FOR LANDSLIDE DISASTER MANAGEMENT IN MALAYSIA: CURRENT PRACTICES". Journal of Information System and Technology Management 7, no. 25 (March 7, 2022): 65–82. http://dx.doi.org/10.35631/jistm.725005.

Abstract:
Most of time, landslide in Malaysia was triggered by heavy rainfall during monsoon season. Landslides disaster in Malaysia are managed by Public Work Department (JKR) via Slope Engineering Branch (CKC), Department of Mineral and Geosciences (JMG), and Malaysian Space Agency (Agensi Angkasa Malaysia). JKR was critically engaged in slope remediation activities and the establishment of slope management. JMG and MYSA contribution was informing the government areas prone to landslides via landslide mapping. National Institution, expert practices, and researcher in landslide management in Malaysia is actively engaged in disaster management (DM) activities through the implementation National Slope Master Plan 2009 – 2023 (NSMP 2009 – 2023). With due respect, some issues such as no evasive approach where experts might use for decision-making process according to site suitability, and appropriate mitigation measure, lack of study in landslide disaster management practices in Malaysia, and the availability landslide historical data. Hence, this paper systematically reviews the current practices of geospatial metamodel for landslide disaster management in Malaysia. Findings show that, only few researchers deeply explore the potential of geospatial metamodel in facilitating landslide management and monitoring activities in Malaysia. Geospatial metamodel gives a benefit in determined the completeness of any landslide DM activities either before, during, or after the incident.
19

Haberlandt, U. "From hydrological modelling to decision support". Advances in Geosciences 27 (August 23, 2010): 11–19. http://dx.doi.org/10.5194/adgeo-27-11-2010.

Abstract:
Abstract. Decision support for planning and management of water resources needs to consider many target criteria simultaneously like water availability, water quality, flood protection, agriculture, ecology, etc. Hydrologic models provide information about the water balance components and are fundamental for the simulation of ecological processes. Objective of this contribution is to discuss the suitability of classical hydrologic models on one hand and of complex eco-hydrologic models on the other hand to be used as part of decision support systems. The discussion is based on results from two model comparison studies. It becomes clear that none of the hydrologic models tested fulfils all requirements in an optimal sense. Regarding the simulation of water quality parameters like nitrogen leaching a high uncertainty needs to be considered. Recommended for decision support is a hybrid metamodel approach, which comprises a hydrologic model, empirical relationships for the less dynamic processes and makes use of simulation results from complex eco-hydrologic models through second-order modelling at a generalized level.
20

Rulik, Sebastian, Włodzimierz Wróblewski and Daniel Frączek. "Metamodel-Based Optimization of the Labyrinth Seal". Archive of Mechanical Engineering 64, no. 1 (March 1, 2017): 75–91. http://dx.doi.org/10.1515/meceng-2017-0005.

Abstract:
Abstract The presented paper concerns CFD optimization of the straight-through labyrinth seal with a smooth land. The aim of the process was to reduce the leakage flow through a labyrinth seal with two fins. Due to the complexity of the problem and for the sake of the computation time, a decision was made to modify the standard evolutionary optimization algorithm by adding an approach based on a metamodel. Five basic geometrical parameters of the labyrinth seal were taken into account: the angles of the seal’s two fins, and the fin width, height and pitch. Other parameters were constrained, including the clearance over the fins. The CFD calculations were carried out using the ANSYS-CFX commercial code. The in-house optimization algorithm was prepared in the Matlab environment. The presented metamodel was built using a Multi-Layer Perceptron Neural Network which was trained using the Levenberg-Marquardt algorithm. The Neural Network training and validation were carried out based on the data from the CFD analysis performed for different geometrical configurations of the labyrinth seal. The initial response surface was built based on the design of the experiment (DOE). The novelty of the proposed methodology is the steady improvement in the response surface goodness of fit. The accuracy of the response surface is increased by CFD calculations of the labyrinth seal additional geometrical configurations. These configurations are created based on the evolutionary algorithm operators such as selection, crossover and mutation. The created metamodel makes it possible to run a fast optimization process using a previously prepared response surface. The metamodel solution is validated against CFD calculations. It then complements the next generation of the evolutionary algorithm.
21

Yousefi, Milad, and Moslem Yousefi. "Human resource allocation in an emergency department". Kybernetes 49, no. 3 (June 19, 2019): 779–96. http://dx.doi.org/10.1108/k-12-2018-0675.

Abstract:
Purpose The complexity and interdisciplinarity of healthcare industry problems make this industry one of the attention centers of computer-based simulation studies to provide a proper tool for interaction between decision-makers and experts. The purpose of this study is to present a metamodel-based simulation optimization in an emergency department (ED) to allocate human resources in the best way to minimize door to doctor time subject to the problem constraints which are capacity and budget. Design/methodology/approach To obtain the objective of this research, first the data are collected from a public hospital ED in Brazil, and then an agent-based simulation is designed and constructed. Afterwards, three machine-learning approaches, namely, adaptive neuro-fuzzy inference system (ANFIS), feed forward neural network (FNN) and recurrent neural network (RNN), are used to build an ensemble metamodel through adaptive boosting. Finally, the results from the metamodel are applied in a discrete imperialist competitive algorithm (ICA) for optimization. Findings Analyzing the results shows that the yellow zone section is considered as a potential bottleneck of the ED. After 100 executions of the algorithm, the results show a reduction of 24.82 per cent in the door to doctor time with a success rate of 59 per cent. Originality/value This study fulfils an identified need to optimize human resources in an ED with less computational time.
22

SIMONSSON, MÅRTEN, PONTUS JOHNSON, MATHIAS EKSTEDT and WALDO ROCHA FLORES. "IT GOVERNANCE DECISION SUPPORT USING THE IT ORGANIZATION MODELING AND ASSESMENT TOOL". International Journal of Innovation and Technology Management 08, no. 02 (June 2011): 167–89. http://dx.doi.org/10.1142/s0219877011002325.

Abstract:
This paper describes the information technology (IT) organization modeling and assessment tool (ITOMAT) and how it can be used for IT governance decision making. The ITOMAT consists of an enterprise architecture metamodel that describes IT organizations. Further, ITOMAT contains a Bayesian network for making predictions on how changes to IT organization models will affect the IT governance performance as perceived by business stakeholders. Thorough case studies at 20 different companies have been conducted in order to calibrate the network. Finally, the paper describes a case study where ITOMAT was used to analyze the future impact of two IT organization change scenarios in a medium-sized engineering company.
23

Chattopadhyay, Ritwika, Partha Protim Das and Shankar Chakraborty. "Development of a Rough-MABAC-DoE-based Metamodel for Supplier Selection in an Iron and Steel Industry". Operational Research in Engineering Sciences: Theory and Applications 5, no. 1 (April 20, 2022): 20–40. http://dx.doi.org/10.31181/oresta190222046c.

Abstract:
In the context of supply chain management, supplier selection can be defined as the process by which organizations score and evaluate a range of alternative suppliers to choose the best possible one who can provide superior quality of raw materials at cheaper rate and lesser lead time. It is a decision making process with multiple trade-offs between various conflicting criteria which in turn helps the organizations identify the suitable suppliers that would establish a robust supply chain assisting in maintaining a competitive edge. The main objective of supplier selection is thus focused on reducing purchase risk, maximizing overall value to the organization, and developing closeness and long-term relationships between the suppliers and the organization. In this paper, while selecting the most suitable supplier for gearboxes in an Indian iron and steel industry, assessments of three decision makers on the performance of five candidate suppliers with respect to five evaluation criteria are first aggregated using rough numbers. The definitive distances of those rough numbers are then treated as the inputs to a 25 full-factorial design plan with the corresponding multi-attributive border approximation area comparison (MABAC) scores as the output variables. Finally, a design of experiments (DoE)-based metamodel is formulated to interlink the computed MABAC scores with the considered criteria. The competing suppliers are ranked based on this rough-MABAC-DoE-based metamodel, which also easies out the computational steps when new suppliers are included in the decision making process.
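The MABAC scoring stage can be sketched on a crisp decision matrix (the paper first aggregates the three experts' judgements with rough numbers; the matrix, weights, and criterion directions below are illustrative assumptions):

```python
# Sketch of MABAC scoring on a crisp 5-supplier x 5-criterion decision matrix.
import numpy as np

X = np.array([  # rows: suppliers S1..S5; columns: criteria C1..C5 (illustrative)
    [7, 6, 8, 5, 7],
    [8, 7, 6, 6, 6],
    [6, 8, 7, 7, 5],
    [9, 5, 7, 6, 8],
    [7, 7, 7, 8, 6],
], dtype=float)
w = np.array([0.25, 0.20, 0.20, 0.15, 0.20])
benefit = np.array([True, True, True, False, True])   # C4 treated as a cost criterion

# Min-max normalization (direction depends on benefit/cost), then weighting.
lo, hi = X.min(axis=0), X.max(axis=0)
N = np.where(benefit, (X - lo) / (hi - lo), (hi - X) / (hi - lo))
V = w * (N + 1.0)

# Border approximation area: geometric mean of weighted values per criterion.
G = V.prod(axis=0) ** (1.0 / X.shape[0])
scores = (V - G).sum(axis=1)        # MABAC score = summed distance from the BAA
print(dict(zip(["S1", "S2", "S3", "S4", "S5"], scores.round(4))))
```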
24

Gilliams, S., J. Van Orshoven, B. Muys, H. Kros, G. W. Heil and W. Van Deursen. "AFFOREST sDSS: a metamodel based spatial decision support system for afforestation of agricultural land". New Forests 30, no. 1 (July 2005): 33–53. http://dx.doi.org/10.1007/s11056-004-0761-z.

25

García-Segura, Tatiana, Vicent Penadés-Plà and Víctor Yepes. "Sustainable bridge design by metamodel-assisted multi-objective optimization and decision-making under uncertainty". Journal of Cleaner Production 202 (November 2018): 904–15. http://dx.doi.org/10.1016/j.jclepro.2018.08.177.

26

Lima, Gustavo Meirelles, Bruno Melo Brentan, Daniel Manzi and Edevar Luvizotto. "Metamodel for nodal pressure estimation at near real-time in water distribution systems using artificial neural networks". Journal of Hydroinformatics 20, no. 2 (December 8, 2017): 486–96. http://dx.doi.org/10.2166/hydro.2017.036.

Abstract:
Abstract The development of computational models for analysis of the operation of water supply systems requires the calibration of pipes' roughness, among other parameters. Inadequate values of this parameter can result in inaccurate solutions, compromising the applicability of the model as a decision-making tool. This paper presents a metamodel to estimate the pressure at all nodes of a distribution network based on artificial neural networks (ANNs), using a set of field data obtained from strategically located pressure sensors. This approach aims to increase the available pressure data, reducing the degree of freedom of the calibration problem. The proposed model uses the inlet flow of the district metering area and pressure data monitored in some nodes, as input data to the ANN, obtaining as output, the pressure values for nodes that were not monitored. Two case studies of real networks are presented to validate the efficiency and accuracy of the method. The results ratify the efficiency of ANN as state forecaster, showing the high applicability of the metamodel tool to increase a database or to identify abnormal events during an operation.
27

Walter, Maximilian, Sebastian Hahner, Tomáš Bureš, Petr Hnětynka, Robert Heinrich and Ralf Reussner. "Architecture-based attack propagation and variation analysis for identifying confidentiality issues in Industry 4.0". at - Automatisierungstechnik 71, no. 6 (June 1, 2023): 443–52. http://dx.doi.org/10.1515/auto-2022-0135.

Abstract:
Abstract Exchanging data between entities is an essential part of Industry 4.0. However, the data exchange should not affect the confidentiality. Therefore, data should only be shared with the intended entities. In exceptional scenarios, it is unclear whether data should be shared or not and what the impact of the access decision is. Runtime access control systems such as role-based access control often do not consider the impact on the overall confidentiality. Static design-time analyses often provide this information. We use architectural design-time analyses together with an uncertainty variation metamodel mitigating uncertainty to calculate impact properties of attack paths. Runtime access control approaches can then use this information to support the access control decision. We evaluated our approach on four case studies based on real-world examples and research cases.
28

Mauro, Francesco, Luca Braidotti and Giorgio Trincas. "A Model for Intact and Damage Stability Evaluation of CNG Ships during the Concept Design Stage". Journal of Marine Science and Engineering 7, no. 12 (December 8, 2019): 450. http://dx.doi.org/10.3390/jmse7120450.

Abstract:
To face the design of a new ship concept, the evaluation of multiple feasible solutions concerning several aspects of naval architecture and marine engineering is necessary. Compressed natural gas technologies are in continuous development; therefore, there are no available databases for existing ships to use as a basis for the design process of a new unit. In this sense, the adoption of a modern multi-attribute decision-based method can help the designer for the study of a completely new ship prototype. A database of compressed natural gas ships was generated starting from a baseline hull, varying six hull-form parameters by means of the design of experiment technique. Between the attributes involved in the concept design process, stability is for sure one of the most relevant topics, both for intact and damaged cases. This work describes two approaches to identify the compliance of a ship with the intact stability regulations based on the ship main geometrical quantities. Moreover, a metamodel based on the maximum floodable length concept (damage stability) allows determining the main internal subdivision of the ship. The metamodel outcomes were compared with results from direct calculations on a ship external to the database, highlighting the adequate accuracy given by the developed methods.
29

Gumienny, Grzegorz, Barbara Kacprzyk, Barbara Mrzygłód and Krzysztof Regulski. "Data-Driven Model Selection for Compacted Graphite Iron Microstructure Prediction". Coatings 12, no. 11 (November 4, 2022): 1676. http://dx.doi.org/10.3390/coatings12111676.

Abstract:
Compacted graphite iron (CGI), having a specific graphite form with a large matrix contact surface, is a unique casting material. This type of cast iron tends to favor direct ferritization and is characterized by a complex of very interesting properties. Intelligent computing tools such as artificial neural networks (ANNs) are used as predictive modeling tools, allowing their users to forecast the microstructure of the tested cast iron at the level of computer simulation. This paper presents the process of the development of a metamodel for the selection of a neural network appropriate for a specific chemical composition. Predefined models for the specific composition have better precision, and the initial selection provides the user with automation of reasoning and prediction. Automation of the prediction is based on the rules obtained from the decision tree, which classifies the type of microstructure. In turn, the type of microstructure was obtained by clustering objects of different chemical composition. The authors propose modeling the prediction of the volume fraction of phases in the CGI microstructure in a three-step procedure. In the first phase, k-means, unsupervised segmentation techniques were used to determine the metamodel (DT), which in the second phase enables the selection of the appropriate ANN submodel (third phase).
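The first two steps of the proposed three-step procedure (k-means segmentation of compositions, then a decision tree that reproduces the cluster label and can route a new melt to the matching ANN submodel) can be sketched as follows; the composition data and feature list are placeholders:

```python
# Sketch: cluster chemical compositions with k-means, then fit a decision tree
# that reproduces the cluster label so it can later route a new composition to
# the matching ANN submodel. Feature names and data are placeholders.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(3)
# Placeholder chemical compositions: C, Si, Mn, Cu, Ti contents (wt. %).
X = rng.normal(loc=[3.6, 2.4, 0.3, 0.5, 0.02], scale=[0.2, 0.3, 0.1, 0.2, 0.01],
               size=(300, 5))

clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

router = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, clusters)
print(export_text(router, feature_names=["C", "Si", "Mn", "Cu", "Ti"]))
# A new composition would then be passed to router.predict(...) to pick the submodel.
```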
30

Kaz, M. S., and E. A. Akerman. "The Method of Real Options and the Business Model «Lean Canvas» in the Practice of Performance Evaluation of IT Projects". Vestnik NSUEM, no. 4 (January 1, 2022): 80–92. http://dx.doi.org/10.34020/2073-6495-2021-4-080-092.

Abstract:
The relevance of the study is due to the active implementation of IT technologies in various aspects of companies, which gives special importance to the development of a methodology for assessing the effectiveness of projects in a highly uncertain environment. The paper presents the methodology and assesses the effectiveness of IT projects using binomial «decision tree» model and iterative risk assessment metamodel «Lean Canvas». The comparative assessment of IT project efficiency using discounted cash flow method, binomial «decision tree» model and Black–Scholes model was carried out. The results have shown the advantage of option-based approach to the evaluation of IT project efficiency in comparison with the traditional DCF method, which allows to build flexibility in the planning and management of the project, assess its potential and consider the uncertainties as additional opportunities for profit.
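As a reference point for the binomial "decision tree" model mentioned above, here is a generic Cox-Ross-Rubinstein lattice valuing an option to defer an IT investment; all figures are illustrative and not taken from the paper:

```python
# Sketch of a binomial (Cox-Ross-Rubinstein) lattice used in real-options
# analysis: value of an option to defer an IT investment.
import numpy as np

V0, I = 100.0, 95.0          # present value of project cash flows, investment cost
sigma, r, T, n = 0.40, 0.05, 2.0, 24

dt = T / n
u = np.exp(sigma * np.sqrt(dt))
d = 1.0 / u
p = (np.exp(r * dt) - d) / (u - d)          # risk-neutral up-move probability

# Project values at the final nodes and option payoff (deferral = call on the project).
j = np.arange(n + 1)
values = V0 * u ** j * d ** (n - j)
option = np.maximum(values - I, 0.0)

# Backward induction through the lattice.
for _ in range(n):
    option = np.exp(-r * dt) * (p * option[1:] + (1 - p) * option[:-1])

print("real-option value of deferral:", round(float(option[0]), 2))
```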
31

Gomolka, Zbigniew, Ewa Dudek-Dyduch and Ewa Zeslawska. "Generalization of ALMM Based Learning Method for Planning and Scheduling". Applied Sciences 12, no. 24 (December 12, 2022): 12766. http://dx.doi.org/10.3390/app122412766.

Abstract:
This paper refers to a machine learning method for solving NP-hard discrete optimization problems, especially planning and scheduling. The method utilizes a special multistage decision process modeling paradigm referred to as the Algebraic Logical Metamodel based learning methods of Multistage Decision Processes (ALMM). Hence, the name of the presented method is the ALMM Based Learning method. This learning method utilizes a specifically built local multicriterion optimization problem that is solved by means of scalarization. This paper describes both the development of such local optimization problems and the concept of the learning process with the fractional derivative mechanism itself. It includes proofs of theorems showing that the ALMM Based Learning method can be defined for a much broader problem class than initially assumed. This significantly extends the range of the prime learning method applications. New generalizations for the prime ALMM Based Learning method, as well as some essential comments on a comparison of Reinforcement Learning with the ALMM Based Learning, are also presented.
32

Jiang, Tao, and Weihong Zhou. "An Approach of Defining Domain Constraints for Domain-Specific Modeling Language". International Journal of Pattern Recognition and Artificial Intelligence 35, no. 09 (April 10, 2021): 2153002. http://dx.doi.org/10.1142/s0218001421530025.

Abstract:
Many Domain-Specific Modeling Languages (DSML) cannot formally define their semantics, leading to difficulties in identifying user-defined domain constraints. In this study, we propose a user-defined mechanism of domain constraints based on the formalization of structural semantics of DSML. First, we formally define concepts and decision methods of consistency and validity of domain constraints. Subsequently, we establish concepts and reasoning methods of domain-based model consistency. Thus, several domain constraint instances are defined and different models instances’ consistency are reasoned based on formalization of software architecture domain metamodel to illustrate our approach. Finally, our formal definition mechanism of domain constraint is added to our automatic translator for formalizing DSML and its models to automatically reason about domain constraints built based on DSML.
33

Nguyen, Duc Nam, Thanh-Phong Dao, Ngoc Le Chau and Van Anh Dang. "Hybrid Approach of Finite Element Method, Kigring Metamodel, and Multiobjective Genetic Algorithm for Computational Optimization of a Flexure Elbow Joint for Upper-Limb Assistive Device". Complexity 2019 (January 27, 2019): 1–13. http://dx.doi.org/10.1155/2019/3231914.

Abstract:
Modeling for robotic joints is actually complex and may lead to wrong Pareto-optimal solutions. Hence, this paper develops a new hybrid approach for multiobjective optimization design of a flexure elbow joint. The joint is designed for the upper-limb assistive device for physically disable people. The optimization problem considers three design variables and two objective functions. An efficient hybrid optimization approach of central composite design (CDD), finite element method (FEM), Kigring metamodel, and multiobjective genetic algorithm (MOGA) is developed. The CDD is used to establish the number of numerical experiments. The FEM is developed to retrieve the strain energy and the reaction torque of joint. And then, the Kigring metamodel is used as a black-box to find the pseudoobjective functions. Based on pseudoobjective functions, the MOGA is applied to find the optimal solutions. Traditionally, an evolutionary optimization algorithm can only find one Pareto front. However, the proposed approach can generate 6 Pareto-optimal solutions, as near optimal candidates, which provides a good decision-maker. Based on the user’s real-work problem, one of the best optimal solutions is chosen. The results found that the optimal strain energy is about 0.0033 mJ and the optimal torque is approximately 588.94 Nm. Analysis of variance is performed to identify the significant contribution of design variables. The sensitivity analysis is then carried out to determine the effect degree of each parameter on the responses. The predictions are in a good agreement with validations. It confirms that the proposed hybrid optimization approach has an effectiveness to solve for complex optimization problems.
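A rough sketch of the surrogate-assisted stage: Gaussian-process (Kriging-type) models are fitted to sampled responses, and non-dominated candidates are extracted from their predictions. The responses are synthetic stand-ins for the FEM results, both objectives are minimized purely for illustration, and an exhaustive Pareto filter over random candidates replaces the paper's MOGA:

```python
# Sketch: Gaussian-process surrogates of two competing responses over the design
# space, followed by a simple non-dominated (Pareto) filter on a dense sample.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(4)
X = rng.uniform(0, 1, size=(30, 3))                            # 3 coded design variables
f1 = (X ** 2).sum(axis=1) + 0.01 * rng.normal(size=30)         # placeholder response 1
f2 = ((X - 1) ** 2).sum(axis=1) + 0.01 * rng.normal(size=30)   # placeholder response 2

kernel = ConstantKernel() * RBF(length_scale=[0.3, 0.3, 0.3])
gp1 = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, f1)
gp2 = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, f2)

cand = rng.uniform(0, 1, size=(2000, 3))
F = np.column_stack([gp1.predict(cand), gp2.predict(cand)])

# Keep candidates not dominated by any other (both objectives minimized here).
dominated = (F[:, None, :] >= F[None, :, :]).all(axis=2) & \
            (F[:, None, :] > F[None, :, :]).any(axis=2)
pareto = cand[~dominated.any(axis=1)]
print("non-dominated candidates found:", len(pareto))
```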
34

Chacón, Lorena, Miguel Chen Austin and Carmen Castaño. "A Multiobjective Optimization Approach for Retrofitting Decision-Making towards Achieving Net-Zero Energy Districts: A Numerical Case Study in a Tropical Climate". Smart Cities 5, no. 2 (March 26, 2022): 405–32. http://dx.doi.org/10.3390/smartcities5020023.

Abstract:
Buildings are among the main reasons for the deterioration of the world environment as they are responsible for a large percentage of CO2 emissions related to energy. For this reason, it is necessary to find solutions to this problem. This research project consists of constructing the metamodel of an urbanization located in Panama, Herrera province. The classification and systematization of its main elements, using the software DesignBuilder and SysML diagrams, were carried out for its subsequent implementation in an optimization analysis that seeks to approach the NZED standard. The main objectives of the optimization are reducing the energy consumption at the lowest possible price while maintaining or improving thermal comfort. In this study, it was possible to reduce electricity consumption to at least 60% of the original value and about 10% of the renewable energy generation capacity by implementing optimization techniques within the retrofit category related to the envelope of the buildings and the occupant’s behavior.
35

Geyer, Philipp, and Arno Schlüter. "Automated metamodel generation for Design Space Exploration and decision-making – A novel method supporting performance-oriented building design and retrofitting". Applied Energy 119 (April 2014): 537–56. http://dx.doi.org/10.1016/j.apenergy.2013.12.064.

36

Blanco, Carlos, Guzmán de, Eduardo Fernández-Medina and Juan Trujillo. "An MDA approach for developing secure OLAP applications: Metamodels and transformations". Computer Science and Information Systems 12, no. 2 (2015): 541–65. http://dx.doi.org/10.2298/csis140617007b.

Abstract:
Decision makers query enterprise information stored in Data Warehouses (DW) by using tools (such as On-Line Analytical Processing (OLAP) tools) which employ specific views or cubes from the corporate DW or Data Marts, based on multidimensional modelling. Since the information managed is critical, security constraints have to be correctly established in order to avoid unauthorized access. In previous work we defined a Model-Driven based approach for developing a secure DW repository by following a relational approach. Nevertheless, it is also important to define security constraints in the metadata layer that connects the DW repository with the OLAP tools; that is, over the same multidimensional structures that end users manage. This paper incorporates a proposal for developing secure OLAP applications within our previous approach: it improves a UML profile for conceptual modelling; it defines a logical metamodel for OLAP applications; and it defines and implements transformations from conceptual to logical models, as well as from logical models to secure implementation in a specific OLAP tool (SQL Server Analysis Services).
37

Hariri-Ardebili, Mohammad Amin, S. Mahdi Seyed-Kolbadi and Mohammad Noori. "Response Surface Method for Material Uncertainty Quantification of Infrastructures". Shock and Vibration 2018 (July 5, 2018): 1–14. http://dx.doi.org/10.1155/2018/1784203.

Abstract:
Recently, probabilistic simulations became an inseparable part of risk analysis. Managers and stakeholders prefer to make their decision knowing the existing uncertainties in the system. Nonlinear dynamic analysis and design of infrastructures are affected by two main uncertainty sources, i.e., epistemic and aleatory. In the present paper, the epistemic uncertainty is addressed in the context of material randomness. An old ultra-high arch dam is selected as a vehicle for numerical analyses. Four material properties are selected as random variables in the coupled dam-reservoir-foundation system, i.e., concrete elasticity, mass density, compressive (and tensile) strength, and the rock modulus of elasticity. The efficient Box-Behnken experimental design is adopted to minimize the required simulations. A response surface metamodel is developed for the system based on different outputs, i.e., displacement and damage index. The polynomial-based response surface model is subsequently validated with a large number of simulations based on Latin Hypercube sampling. Results confirm the high accuracy of proposed technique in material uncertainty quantification.
38

Zhang, Haochuan, Jie Han, Xiaojun Zhou and Yuxuan Zheng. "Robust Optimization with Interval Uncertainties Using Hybrid State Transition Algorithm". Electronics 12, no. 14 (July 11, 2023): 3035. http://dx.doi.org/10.3390/electronics12143035.

Abstract:
Robust optimization is concerned with finding an optimal solution that is insensitive to uncertainties and has been widely used in solving real-world optimization problems. However, most robust optimization methods suffer from high computational costs and poor convergence. To alleviate the above problems, an improved robust optimization algorithm is proposed. First, to reduce the computational cost, the second-order Taylor series surrogate model is used to approximate the robustness indices. Second, to strengthen the convergence, the state transition algorithm is studied to explore the whole search space for candidate solutions, while sequential quadratic programming is adopted to exploit the local area. Third, to balance the robustness and optimality of candidate solutions, a preference-based selection mechanism is investigated which effectively determines the promising solution. The proposed robust optimization method is applied to obtain the optimal solutions of seven examples that are subject to decision variables and parameter uncertainties. Comparative studies with other robust optimization algorithms (robust genetic algorithm, Kriging metamodel-assisted robust optimization method, etc.) show that the proposed method can obtain accurate and robust solutions with less computational cost.
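The second-order Taylor surrogate idea can be illustrated with a small numpy sketch: build the quadratic approximation of an objective around a nominal design from finite differences, then scan the interval uncertainty box cheaply on the surrogate instead of the true function. The objective and interval widths are illustrative assumptions:

```python
# Sketch: second-order Taylor surrogate of an objective around a nominal design,
# used to estimate a worst-case (robustness) value over an interval uncertainty box.
import numpy as np

def f(x):                      # placeholder objective
    return (x[0] - 1.0) ** 2 + 0.5 * x[0] * x[1] + np.exp(0.3 * x[1])

def grad_hess(f, x0, h=1e-4):
    # Central finite differences for the gradient and Hessian at x0.
    n = len(x0)
    g = np.zeros(n)
    H = np.zeros((n, n))
    for i in range(n):
        ei = np.zeros(n); ei[i] = h
        g[i] = (f(x0 + ei) - f(x0 - ei)) / (2 * h)
        for j in range(n):
            ej = np.zeros(n); ej[j] = h
            H[i, j] = (f(x0 + ei + ej) - f(x0 + ei - ej)
                       - f(x0 - ei + ej) + f(x0 - ei - ej)) / (4 * h * h)
    return g, H

x0 = np.array([1.2, 0.4])
delta = np.array([0.1, 0.2])                  # interval half-widths of the uncertainties
g, H = grad_hess(f, x0)

def taylor(dx):                               # quadratic surrogate of f(x0 + dx)
    return f(x0) + g @ dx + 0.5 * dx @ H @ dx

# Monte Carlo scan of the surrogate over the box: cheap, no further calls to f needed.
rng = np.random.default_rng(5)
U = rng.uniform(-1, 1, size=(20000, 2)) * delta
worst = max(taylor(u) for u in U)
print("nominal:", round(f(x0), 4), "surrogate worst case:", round(worst, 4))
```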
39

Degan, Germano, Luca Braidotti, Alberto Marinò and Vittorio Bucci. "LCTC Ships Concept Design in the North Europe-Mediterranean Transport Scenario Focusing on Intact Stability Issues". Journal of Marine Science and Engineering 9, no. 3 (March 4, 2021): 278. http://dx.doi.org/10.3390/jmse9030278.

Abstract:
In late years, the size of RoRo cargo ships has continuously increased, leading to the so-called Large Car Truck Carriers (LCTC). The design of these vessels introduced new challenges that shall be considered during the ship design since the conceptual stage, which has a very strong impact on the technical and economic performances of the vessel during all its life-cycle. In this work, the concept design of an LCTC is presented based on Multi-Attribute Decision Making (MADM). A large set of design alternatives have been generated and compared in order to find out the most promising feasible designs. The proposed approach is based on a Mathematical Design Model (MDM) capable to assess all the main technical and economic characteristics for each design. Among the others, here focus has been done on the ship stability to assure the compliance with statutory rules within the MDM. A new stability metamodel has been developed capable to define the cross curves of stability at the concept design stage. The proposed MADM methodology has been applied to North Europe-Mediterranean transport scenario highlighting the impact of main particulars describing hull geometry on the technical and economic performances of an LCTC ship.
40

Havinga, Jos, Pranab K. Mandal and Ton van den Boogaard. "Exploiting data in smart factories: real-time state estimation and model improvement in metal forming mass production". International Journal of Material Forming 13, no. 5 (July 24, 2019): 663–73. http://dx.doi.org/10.1007/s12289-019-01495-2.

Abstract:
Abstract Modern production systems have numerous sensors that produce large amounts of data. This data can be exploited in many ways, from providing insight into the manufacturing process to facilitating automated decision making. These opportunities are still underexploited in the metal forming industry, due to the complexity of these processes. In this work, a probabilistic framework is proposed for simultaneous model improvement and state estimation in metal forming mass production. Recursive Bayesian estimation is used to simultaneously track the evolution of process state and to estimate the deviation between the physics-based model and the real process. A sheet bending mass production process is used to test the proposed framework. A metamodel of the process is built using proper orthogonal decomposition and radial basis function interpolation. The model is extended with a deviation model in order to account for the difference between model and real process. Particle filtering is used to track the state evolution and to estimate the deviation model parameters simultaneously. The approach is tested and analysed using a large number of simulations, based on pseudo-data obtained from a numerical sheet bending model.
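A minimal sketch of the metamodel construction described above, assuming synthetic snapshots in place of sheet-bending simulations: proper orthogonal decomposition of the snapshot matrix followed by radial-basis-function interpolation from process parameters to POD coefficients:

```python
# Sketch: POD of simulation snapshots plus RBF interpolation from process
# parameters to POD coefficients. Snapshots here are synthetic stand-ins.
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(6)
params = rng.uniform(0, 1, size=(40, 2))            # e.g. tool stroke, sheet thickness
grid = np.linspace(0, 1, 200)
# Synthetic "field" snapshots (one row per simulation run).
snapshots = np.array([np.sin(2 * np.pi * (grid - p[0])) * (0.5 + p[1]) for p in params])

# POD via SVD; keep the first r modes.
mean = snapshots.mean(axis=0)
U, s, Vt = np.linalg.svd(snapshots - mean, full_matrices=False)
r = 3
modes = Vt[:r]                                       # spatial modes
coeffs = (snapshots - mean) @ modes.T                # modal coefficients per run

# RBF metamodel: process parameters -> POD coefficients.
rbf = RBFInterpolator(params, coeffs, kernel="thin_plate_spline")

new_param = np.array([[0.3, 0.7]])
field_pred = mean + rbf(new_param) @ modes           # reconstructed field prediction
print("predicted field shape:", field_pred.shape)
```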
APA, Harvard, Vancouver, ISO, etc. styles
41

MacCalman, Alex D., and Simon R. Goerger. "Leveraging a design of experiments methodology to enhance impacts of modeling and simulations for engineered resilient systems". Journal of Defense Modeling and Simulation: Applications, Methodology, Technology 15, no. 4 (May 4, 2018): 383–97. http://dx.doi.org/10.1177/1548512918765701.

Full text source
Abstract:
System engineers rely on a variety of models and simulations to help understand multiple perspectives in several domains throughout a system’s life-cycle. These domain models include operational simulations, life-cycle cost models, physics-based computational models, and many more. Currently, there is a technical gap with regard to our ability to untangle system design drivers within system life-cycle domains. This article provides a procedural workflow that addresses this technical gap by leveraging the methods of experimental design in order to clearly identify tradable variables and narrow the search for viable system variants. Our purpose is to illuminate trade decisions across several different viewpoints by integrating metamodels that approximate the behavior of multiple domain models; a metamodel is a statistical function that acts as a surrogate to a model. Model inputs often represent value properties that define a system alternative configuration or environmental conditions that represent uncertain factors within the system boundary. Model outputs represent measures of performance or effectiveness that allow us to compare alternatives and understand the tradeoffs among several objectives. In order to illuminate the tradeoffs that exist in a complex system design problem we propose an approach that approximates model input and output behavior using the functional form of statistical metamodels. After performing an experimental design, we can fit a metamodel with a functional form known as a response surface. We utilize contour profilers that show horizontal cross sections of multiple response surfaces to visualize where key trade decisions exist. Our research supports the tradespace analytics pillar for the development of the engineered resilient system (ERS) architecture. The article concludes with instructions on how to perform simulation experiments to construct a dynamic dashboard that illuminates system tradeoffs.
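A small sketch of the response-surface idea referred to above: evaluate a stand-in "expensive" model on a designed set of input points, fit a full quadratic metamodel by least squares, and query it as a cheap surrogate. The factors, design, and model are placeholders, not the ERS domain models.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)

def quad_features(X):
    """Build [1, x_i, x_i*x_j, x_i^2] features for a full second-order response surface."""
    cols = [np.ones(len(X))]
    cols += [X[:, i] for i in range(X.shape[1])]
    cols += [X[:, i] * X[:, j] for i, j in combinations(range(X.shape[1]), 2)]
    cols += [X[:, i] ** 2 for i in range(X.shape[1])]
    return np.column_stack(cols)

def expensive_model(x):          # stand-in for a long-running domain simulation
    return np.sin(x[0]) + 0.5 * x[1] ** 2 + 0.3 * x[0] * x[1]

X = rng.uniform(-2, 2, size=(40, 2))            # space-filling-style design (random here)
y = np.array([expensive_model(x) for x in X])
beta, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)

def surrogate(x):
    return quad_features(np.atleast_2d(x)) @ beta

print(surrogate([0.5, -1.0]), expensive_model([0.5, -1.0]))
```

Once fit, a surrogate like this can be evaluated thousands of times to trace contours and explore tradeoffs interactively, which is what makes the dashboard approach described above practical.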
APA, Harvard, Vancouver, ISO, etc. styles
42

Stefanovic, Nenad. "Proactive Supply Chain Performance Management with Predictive Analytics". Scientific World Journal 2014 (2014): 1–17. http://dx.doi.org/10.1155/2014/528917.

Full text source
Abstract:
Today's business climate requires supply chains to be proactive rather than reactive, which demands a new approach that incorporates data mining and predictive analytics. This paper introduces a predictive supply chain performance management model which combines process modelling, performance measurement, data mining models, and web portal technologies into a unique model. It presents a supply chain modelling approach based on a specialized metamodel which allows modelling of any supply chain configuration at different levels of detail. The paper also presents the supply chain semantic business intelligence (BI) model which encapsulates data sources and business rules and includes the data warehouse model with specific supply chain dimensions, measures, and KPIs (key performance indicators). Next, the paper describes two generic approaches for designing the KPI predictive data mining models based on the BI semantic model. KPI predictive models were trained and tested with a real-world data set. Finally, a specialized analytical web portal which offers collaborative performance monitoring and decision making is presented. The results show that these models give very accurate KPI projections and provide valuable insights into newly emerging trends, opportunities, and problems. This should lead to more intelligent, predictive, and responsive supply chains capable of adapting to the future business environment.
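As a toy illustration of KPI prediction (not the paper's BI models), the sketch below builds lagged features from a simulated weekly KPI series and fits an ordinary least-squares one-step-ahead predictor; the KPI, its dynamics, and the lag count are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical weekly "order fill rate" KPI with trend, seasonality, and noise.
t = np.arange(200)
kpi = 0.9 + 0.0002 * t + 0.03 * np.sin(2 * np.pi * t / 52) + rng.normal(0, 0.01, 200)

def make_lags(series, n_lags=4):
    """Feature rows are the n_lags values preceding each target y_t (oldest first)."""
    X = np.column_stack([series[i:len(series) - n_lags + i] for i in range(n_lags)])
    y = series[n_lags:]
    return X, y

X, y = make_lags(kpi)
A = np.column_stack([np.ones(len(X)), X])            # add an intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)         # ordinary least-squares fit

next_kpi = np.concatenate([[1.0], kpi[-4:]]) @ coef  # one-step-ahead KPI forecast
print(f"forecast next-week KPI: {next_kpi:.4f}")
```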
APA, Harvard, Vancouver, ISO, etc. styles
43

Kashmar, Nadine, Mehdi Adda, and Hussein Ibrahim. "HEAD Access Control Metamodel: Distinct Design, Advanced Features, and New Opportunities". Journal of Cybersecurity and Privacy 2, no. 1 (February 14, 2022): 42–64. http://dx.doi.org/10.3390/jcp2010004.

Full text source
Abstract:
Access control (AC) policies are a set of rules administering decisions in systems, and they are increasingly used for implementing flexible and adaptive systems to control access in today's internet services, networks, security systems, and others. The emergence of the current generation of networking environments, with digital transformation, such as the internet of things (IoT), fog computing, cloud computing, etc., with their different applications, brings out new trends, concepts, and challenges to integrate more advanced and intelligent systems in critical and heterogeneous structures. This fact, in addition to the COVID-19 pandemic, has prompted a greater need than ever for AC due to widespread telework and the need to access resources and data related to critical domains such as government, healthcare, and industry, where any successful cyber or physical attack can disrupt operations or even deny critical services to society. Moreover, various declarations have announced that the world of AC is changing fast, and the pandemic has made AC feel more essential than in the past. To minimize the security risks of any unauthorized access to physical and logical systems, before and during the pandemic, several AC approaches have been proposed to find a common specification for security policy where AC is implemented in various dynamic and heterogeneous computing environments. Unfortunately, the proposed AC models and metamodels have limited features and are insufficient to meet current access control requirements. In this context, we have developed a Hierarchical, Extensible, Advanced, and Dynamic (HEAD) AC metamodel with substantial features that is able to encompass the heterogeneity of AC models, overcome the limitations of existing AC metamodels, and keep pace with ongoing technological progress. In this paper, we explain the distinct design of the HEAD metamodel, from the metamodel development phase through to the policy enforcement phase. We describe the remaining steps and how they can be employed to develop more advanced features, in order to open new opportunities and address the challenges of technological progress and the impact of the pandemic in the domain. As a result, we present a novel approach in five main phases: metamodel development, deriving models, generating policies, policy analysis and assessment, and policy enforcement. This approach can be employed to assist security experts and system administrators in designing secure systems that comply with the organizational security policies related to access control.
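Purely to illustrate the metamodel-to-model-to-policy pipeline (this is not the HEAD metamodel or its tooling), the sketch below instantiates a few generic access-control concepts into a small role-based model and evaluates a default-deny policy decision; every class, attribute, and rule is invented.

```python
from dataclasses import dataclass, field

# Metamodel-level concepts (hypothetical, highly simplified).
@dataclass(frozen=True)
class Rule:
    attribute: str      # which subject attribute the rule tests, e.g. "role"
    value: str          # required attribute value
    resource: str
    action: str
    effect: str = "permit"

@dataclass
class AccessControlModel:
    rules: list = field(default_factory=list)

    def decide(self, subject_attrs: dict, resource: str, action: str) -> str:
        for r in self.rules:
            if (subject_attrs.get(r.attribute) == r.value
                    and r.resource == resource and r.action == action):
                return r.effect
        return "deny"   # default-deny enforcement

# Derive a small RBAC-like model instance from the generic concepts.
rbac = AccessControlModel(rules=[
    Rule("role", "nurse", "patient_record", "read"),
    Rule("role", "physician", "patient_record", "write"),
])

print(rbac.decide({"role": "nurse"}, "patient_record", "read"))    # permit
print(rbac.decide({"role": "nurse"}, "patient_record", "write"))   # deny
```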
APA, Harvard, Vancouver, ISO, etc. styles
44

Hansen, Jan-Cédric, Paul Barach, Donald Donahue, Stefan Göbbels, John Quinn, and Frank Van Trimpont. "Strengthening Global Systems to Prevent and Respond to High-Consequence Biological Threats". Prehospital and Disaster Medicine 38, S1 (May 2023): s1. http://dx.doi.org/10.1017/s1049023x23000535.

Full text source
Abstract:
Introduction: The world is facing the devastating impact a biological event can have on human health, economies, and political stability. COVID-19 has revealed that national governments and the international community are woefully unprepared to respond to pandemics, underscoring our shared vulnerability to future catastrophic biological threats that could meet or exceed the severe consequences of the current pandemic. This study examines potential threats related to deliberate Russian military use or misuse of the tools of modern biology, or to an accident caused by a CBRN event, evolving rapidly in the highly volatile political environment in and around Ukraine and other conflicts. Method: A participatory foresight, co-creative, future- and transformation-oriented methodology was used to structure a transformative model for a disciplined exploration of scenarios to confront complex challenges and facilitate improved outcomes. Foresight helps to evaluate current policy priorities and potential new policy directions; see how the impact of possible policy decisions may combine with other developments; inform, support and link policy-making in and across a range of sectors; identify future directions, emerging technologies, new societal demands and challenges; and anticipate future developments, disruptive events, risks and opportunities. Results: The study found that the "mitigation scenarios" are based on the "Confront, Regulate, Overcome" metamodel combined with the "Security, Rescue, Care" response modalities. These require the cooperation and coordination of law enforcement, military forces, fire departments and civil security resources, and hospital and first-line responder teams, in order to appropriately address the population, asset, and territorial issues raised by the identified threat, which drives key decision-makers' tasks at the strategic level. Conclusion: The participatory foresight exercise demonstrated gaps in national and international biosecurity and pandemic preparedness architectures highlighted by the challenges of the Ukraine war. It explored opportunities for better cooperation to improve prevention and response capabilities for high-consequence biological events and generated actionable recommendations for the international community.
APA, Harvard, Vancouver, ISO, etc. styles
45

Reisenthel, P. H., J. F. Love, D. J. Lesieutre, and R. E. Childs. "Cumulative global metamodels with uncertainty — a tool for aerospace integration". Aeronautical Journal 110, no. 1108 (June 2006): 375–84. http://dx.doi.org/10.1017/s0001924000001299.

Full text source
Abstract:
The integration of multidisciplinary data is key to supporting decisions during the development of aerospace products. Multidimensional metamodels can be automatically constructed using limited experimental or numerical data, including data from heterogeneous sources. Recent progress in multidimensional response surface technology, for example, provides the ability to interpolate between sparse data points in a multidimensional parameter space. These analytical representations act as surrogates that are based on and complement higher fidelity models and/or experiments, and can include technical data from multiple fidelity levels and multiple disciplines. Most importantly, these representations can be constructed on-the-fly and are cumulatively enriched as more data become available. The purpose of the present paper is to highlight applications of these cumulative global metamodels (CGM), their ease of construction, and the role they can play in aerospace integration. A simple metamodel implementation based on a radial basis function network is presented. This model generalises multidimensional data while simultaneously furnishing an estimate of the uncertainty on the prediction. Four examples are discussed. The first two illustrate the efficiency of surrogate-based design/optimisation. The third applies CGM concepts to a data fusion application. The last example is used to visualise extrapolation uncertainty, based on computational fluid dynamics data.
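A compact sketch of a Gaussian radial basis function interpolant with a simple uncertainty indicator; the uncertainty here is a distance-to-data heuristic rather than the estimator used in the paper, and the sample points are made up.

```python
import numpy as np

def fit_rbf(X, y, eps=1.0):
    """Fit a Gaussian RBF interpolant through the data points (X, y)."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    Phi = np.exp(-(eps * d) ** 2)
    w = np.linalg.solve(Phi, y)

    def predict(x):
        x = np.atleast_2d(x)
        dx = np.linalg.norm(x[:, None, :] - X[None, :, :], axis=-1)
        value = np.exp(-(eps * dx) ** 2) @ w
        # Heuristic uncertainty: grows with distance from the nearest training point.
        uncertainty = 1.0 - np.exp(-(eps * dx.min(axis=1)) ** 2)
        return value, uncertainty

    return predict

# Hypothetical sparse data pooled from heterogeneous "sources".
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]])
y = np.array([0.0, 1.0, 1.0, 2.0, 1.1])
predict = fit_rbf(X, y)
print(predict([0.5, 0.5]))   # near the data: low uncertainty
print(predict([2.0, 2.0]))   # extrapolation: high uncertainty
```

New points can simply be appended to X and y and the interpolant refit, which mirrors the cumulative enrichment idea described above.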
APA, Harvard, Vancouver, ISO, etc. styles
46

Flynn, Barbara B., and John P. van Gigch. "Decision Making About Decision Making: Metamodels and Metasystems". Administrative Science Quarterly 33, no. 2 (June 1988): 327. http://dx.doi.org/10.2307/2393069.

Full text source
APA, Harvard, Vancouver, ISO, etc. styles
47

Olson, David L. "Decision making about decision making: Metamodels and metasystems". European Journal of Operational Research 33, no. 1 (January 1988): 132–33. http://dx.doi.org/10.1016/0377-2217(88)90268-8.

Full text source
APA, Harvard, Vancouver, ISO, etc. styles
48

Snoeck, André, and Matthias Winkenbach. "A Discrete Simulation-Based Optimization Algorithm for the Design of Highly Responsive Last-Mile Distribution Networks". Transportation Science 56, no. 1 (January 2022): 201–22. http://dx.doi.org/10.1287/trsc.2021.1105.

Full text source
Abstract:
Online and omnichannel retailers are proposing increasingly tight delivery deadlines, moving toward instant on-demand delivery. To operate last-mile distribution systems with such tight delivery deadlines efficiently, defining the right strategic distribution network design is of paramount importance. However, this problem exceeds the complexity of the strategic design of traditional last-mile distribution networks for two main reasons: (1) the reduced time available for order handling and delivery and (2) the absence of a delivery cut-off time that clearly separates order collection and delivery periods. This renders state-of-the-art last-mile distribution network design models inappropriate, as they assume periodic order fulfillment based on a delivery cutoff. In this study, we propose a metamodel simulation-based optimization (SO) approach to strategically design last-mile distribution networks with tight delivery deadlines. Our methodology integrates an in-depth simulator with traditional optimization techniques by extending a traditional black-box SO algorithm with an analytical model that captures the underlying structure of the decision problem. Based on a numerical study inspired by the efforts of a global fashion company to introduce on-demand distribution with tight delivery deadlines in Manhattan, we show that our approach outperforms contemporary SO approaches as well as deterministic and stochastic programming methods. In particular, our method systematically yields network designs with superior expected cost performance. Furthermore, it converges to good solutions with a lower computational budget and is more consistent in finding high-quality solutions. We show how congestion effects in the processing of orders at facilities negatively impact the network performance through late delivery of orders and reduced potential for consolidation. In addition, we show that the sensitivity of the optimal network design to congestion effects in order processing at the facilities increases as delivery deadlines become increasingly tight.
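To give a flavour of metamodel simulation-based optimization (this is not the authors' algorithm), the sketch below lets a noisy stand-in "simulator" cost candidate facility counts, refits an analytical-plus-correction metamodel to all samples, and picks the next candidate at the metamodel minimum; the cost structure and constants are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_cost(n_facilities):
    """Stand-in stochastic simulator: facility cost vs. lateness penalties under congestion."""
    base = 40.0 * n_facilities + 900.0 / n_facilities         # facility + routing cost
    congestion = 250.0 / n_facilities ** 2                     # late deliveries when too few sites
    return base + congestion + rng.normal(0.0, 15.0)           # simulation noise

candidates = np.arange(2, 21)                 # feasible numbers of facilities
sampled = [3, 18]                             # start from two arbitrary designs
costs = [simulate_cost(n) for n in sampled]

for _ in range(10):                           # metamodel-based search loop
    # Analytical structure a*n + b/n plus a constant correction term c.
    A = np.column_stack([sampled, 1.0 / np.array(sampled), np.ones(len(sampled))])
    theta, *_ = np.linalg.lstsq(A, np.array(costs), rcond=None)
    pred = theta[0] * candidates + theta[1] / candidates + theta[2]
    n_next = int(candidates[np.argmin(pred)])    # next design suggested by the metamodel
    sampled.append(n_next)
    costs.append(simulate_cost(n_next))

best = sampled[int(np.argmin(costs))]
print("best sampled design:", best)
```

The analytical term encodes prior structure about the decision problem, so far fewer expensive simulations are needed than with a purely black-box search, which is the core argument of the paper above.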
APA, Harvard, Vancouver, ISO, etc. styles
49

Olsina, Luis, Pablo Becker, Denis Peppino, and Guido Tebes. "Specifying the Process Model for Systematic Reviews: An Augmented Proposal". Journal of Software Engineering Research and Development 7 (December 21, 2019): 7. http://dx.doi.org/10.5753/jserd.2019.460.

Full text source
Abstract:
Context: Systematic Literature Review (SLR) is a research methodology intended to obtain evidence from scientific articles stored in digital libraries. SLRs can be performed on primary and secondary studies. Although there are guidelines for the SLR process in Software Engineering, the SLR process is not yet fully and rigorously specified. Moreover, a lack of clear separation of concerns between what to do (process) and how to do it (methods) can often be observed. Objective: To specify the SLR process in a more detailed and rigorous manner by considering different process modeling perspectives, such as functional, behavioral, organizational and informational. The main objective of this work is to specify the SLR activities rather than their methods. Method: The SPEM (Software & Systems Process Engineering Metamodel) language is used to model the SLR process from different perspectives. In addition, we illustrate aspects of the proposed process by using a recently conducted SLR on software testing ontologies. Results: Our SLR process model specifications favor a clear identification of which tasks and activities should be performed, in which order, by whom, and which artifacts are consumed and produced, as well as their inner structures. We also explicitly specify activities related to the SLR pilot test and analyze the gains. Conclusion: The proposed SLR process applies the principles and benefits of process modeling with greater rigor, helping SLRs to be more systematic, repeatable and auditable for researchers and practitioners. In fact, the rigor provided by process modeling, in which several perspectives are combined but can also be viewed independently, provides greater expressiveness in sequences and decision flows, while representing different levels of granularity in the work definitions, such as activity, sub-activity and task.
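As a toy illustration of combining the functional, behavioural, organizational, and informational perspectives (not SPEM and not the authors' model), the sketch below encodes a few SLR activities with their performers, inputs, and outputs, and checks that no activity consumes an artifact before an earlier activity produces it; all names are illustrative.

```python
from dataclasses import dataclass, field

@dataclass
class Activity:
    name: str
    role: str                                   # organizational perspective: who performs it
    inputs: set = field(default_factory=set)    # informational perspective: consumed artifacts
    outputs: set = field(default_factory=set)   # informational perspective: produced artifacts

# Hypothetical fragment of an SLR process, in its declared (behavioural) order.
process = [
    Activity("Define research questions", "Review lead", set(), {"protocol draft"}),
    Activity("Design search strategy", "Researcher", {"protocol draft"}, {"search strings"}),
    Activity("Pilot the search", "Researcher", {"search strings"}, {"pilot report"}),
    Activity("Select studies", "Review team", {"search strings"}, {"included studies"}),
]

def check_artifact_flow(activities):
    """Every input must be produced by some earlier activity in the declared order."""
    available = set()
    for act in activities:
        missing = act.inputs - available
        if missing:
            return f"'{act.name}' needs {missing} before it is produced"
        available |= act.outputs
    return "artifact flow is consistent"

print(check_artifact_flow(process))
```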
APA, Harvard, Vancouver, ISO, etc. styles
50

Huang, Wei, Xuanyu Zhang, Haofan Cheng, and Jiemin Xie. "Metamodel-Based Optimization Method for Traffic Network Signal Design under Stochastic Demand". Journal of Advanced Transportation 2023 (May 27, 2023): 1–15. http://dx.doi.org/10.1155/2023/3917657.

Full text source
Abstract:
Traffic network design problems (NDPs) play an important role in urban planning. Since uncertainties exist in real urban traffic networks, neglecting them may lead to unreasonable decisions. This paper considers the transportation network signal design problem under stochastic origin-destination (OD) demand. In general, solving this stochastic problem requires a large computational budget to calculate the equilibrium flow corresponding to a given demand distribution, which limits its practical application. To reduce the computational time needed to calculate the equilibrium flow under stochastic demand, this paper proposes a metamodel-based optimization method. First, a combined metamodel that integrates a physical modeling component and a generic model-bias component is developed. The metamodel is used to approximate the time-consuming computation of the average equilibrium flow, thereby improving computational efficiency. To further improve the convergence and solution optimality of the metamodel-based optimization, the gradient of the traffic flow with respect to the signal plan is incorporated in the optimization model, and a gradient-based metamodel algorithm is proposed. In the numerical example, a six-node test network is used to examine the proposed metamodel-based optimization method. The proposed combined metamodel is compared with a benchmark method to investigate the importance of incorporating the model-bias component and the traffic flow gradient information. Although there is some loss of solution optimality, since the metamodel is an approximation of the original model, the metamodel methods greatly improve computational efficiency (the computational time is reduced by a factor of 4.84 to 13.47 for different initial points). By incorporating the model bias, the combined metamodel better approximates the original optimal solution. Moreover, incorporating the gradient information of the traffic flow in the optimization search algorithm further improves the solution performance. Numerical results show that the gradient-based metamodel method can effectively improve computational efficiency while only slightly reducing solution optimality (with an increase of 0.09% in the expected total travel cost).
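A minimal sketch of the combined-metamodel idea, a cheap physical component plus a fitted bias term, followed by one finite-difference gradient step on a signal-timing variable; the stand-in "equilibrium simulator", delay functions, and constants are hypothetical, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

def simulator_delay(g):
    """Stand-in for the expensive equilibrium computation: delay vs. green split g in (0, 1)."""
    return 1.0 / g + 1.3 / (1.0 - g) + rng.normal(0.0, 0.02)   # stochastic demand noise

def physical_delay(g):
    """Cheap physical component of the metamodel (deliberately mis-specified)."""
    return 1.0 / g + 1.0 / (1.0 - g)

# Fit a low-order polynomial bias term so that physical + bias matches the simulator samples.
g_samples = np.linspace(0.2, 0.8, 7)
bias_obs = np.array([simulator_delay(g) - physical_delay(g) for g in g_samples])
B = np.column_stack([np.ones_like(g_samples), g_samples, g_samples ** 2])
beta, *_ = np.linalg.lstsq(B, bias_obs, rcond=None)

def metamodel(g):
    return physical_delay(g) + beta @ np.array([1.0, g, g ** 2])

# One gradient-descent step on the metamodel (finite-difference gradient).
g, step, h = 0.5, 0.01, 1e-4
grad = (metamodel(g + h) - metamodel(g - h)) / (2 * h)
g_new = np.clip(g - step * grad, 0.05, 0.95)
print(f"updated green split: {g_new:.3f}, metamodel delay: {metamodel(g_new):.3f}")
```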
APA, Harvard, Vancouver, ISO, etc. styles