Journal articles on the topic 'Prediction theory Mathematical models'

Consult the top 50 journal articles for your research on the topic 'Prediction theory Mathematical models.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Majda, Andrew, and Nan Chen. "Model Error, Information Barriers, State Estimation and Prediction in Complex Multiscale Systems." Entropy 20, no. 9 (August 28, 2018): 644. http://dx.doi.org/10.3390/e20090644.

Abstract:
Complex multiscale systems are ubiquitous in many areas. This research expository article discusses the development and applications of a recent information-theoretic framework as well as novel reduced-order nonlinear modeling strategies for understanding and predicting complex multiscale systems. The topics include the basic mathematical properties and qualitative features of complex multiscale systems, statistical prediction and uncertainty quantification, state estimation or data assimilation, and coping with the inevitable model errors in approximating such complex systems. Here, the information-theoretic framework is applied to rigorously quantify the model fidelity, model sensitivity and information barriers arising from different approximation strategies. It also succeeds in assessing the skill of filtering and predicting complex dynamical systems and overcomes the shortcomings in traditional path-wise measurements such as the failure in measuring extreme events. In addition, information theory is incorporated into a systematic data-driven nonlinear stochastic modeling framework that allows effective predictions of nonlinear intermittent time series. Finally, new efficient reduced-order nonlinear modeling strategies combined with information theory for model calibration provide skillful predictions of intermittent extreme events in spatially-extended complex dynamical systems. The contents here include the general mathematical theories, effective numerical procedures, instructive qualitative models, and concrete models from climate, atmosphere and ocean science.
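The information-theoretic framework sketched in this abstract rests on the relative entropy between truth and model statistics. As a minimal illustration (assuming scalar Gaussian statistics, a simplification relative to the paper's full framework), the model error splits into a signal part (mean bias) and a dispersion part (variance mismatch):

```python
import numpy as np

def gaussian_relative_entropy(m_truth, v_truth, m_model, v_model):
    """KL divergence P_truth || P_model for 1-D Gaussian statistics,
    split into non-negative 'signal' and 'dispersion' contributions."""
    signal = 0.5 * (m_truth - m_model) ** 2 / v_model
    dispersion = 0.5 * (v_truth / v_model - 1.0 + np.log(v_model / v_truth))
    return signal + dispersion

# A model with the correct mean but half the true variance still carries
# an information deficit through the dispersion term (~0.153 nats).
print(gaussian_relative_entropy(0.0, 2.0, 0.0, 1.0))
```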
2

Zhang, Qian, Bee Lan Oo, and Benson Teck Heng Lim. "Validating and Applying the Mathematical Models for Predicting Corporate Social Responsibility Behavior in Construction Firms: A Roadmap." Buildings 12, no. 10 (October 12, 2022): 1666. http://dx.doi.org/10.3390/buildings12101666.

Abstract:
The prevalence of the sophisticated doctrine of corporate social responsibility (CSR) is increasing, given the perennial environmental concerns and social demands in the construction industry worldwide. Firms’ CSR implementation has been influenced by a broad spectrum of external impetuses and internal motives, yet fragmented assessments of such influences make the prediction and implementation of CSR in construction problematic. This study aimed to validate and apply mathematical models for predicting CSR practices in construction firms. Mobilizing integrated institutional theory, stakeholder theory, and self-determination theory, a questionnaire survey among top-tier construction contractors was undertaken. Eight mathematical models were developed to predict the key dimensions of CSR practices, such as “government commitment” and “environmental preservation”, and validated by five subject matter expert interviews. The results demonstrated the comprehensiveness, practicality, and robustness of the CSR prediction models in the construction industry. The results also highlighted the perceived importance of CSR practices; external coercive and normative forces, together with internal organizational culture, were the most influential factors directly enhancing construction firms’ CSR implementation. Conceptually, the findings refined CSR practice prediction in a construction management context. The proposed CSR assessment checklists can help practitioners improve the often-tenuous overall CSR performance and spur competitiveness in the construction market.
3

Prasanna, S. V. S. N. D. L., K. Sandeep Reddy, Chandrasekhar, S. Sai Shivani, and E. Divya. "Prediction and Comparison of Rainfall-Runoff Using Mathematical Model." IOP Conference Series: Earth and Environmental Science 1130, no. 1 (January 1, 2023): 012044. http://dx.doi.org/10.1088/1755-1315/1130/1/012044.

Abstract:
Runoff assessment is a crucial parameter in understanding the urban flooding scenario. This estimation becomes the deciding factor because of the uneven distribution of rainfall. Physics-based models for the simulation of runoff from catchments are composite, in contrast to models based on learning algorithms. The application of models to water resource problems is complex due to the tremendous spatial variability of watershed characteristics and precipitation forms. Among the pattern-learning algorithms, fuzzy-based algorithms, Artificial Neural Networks (ANNs), etc., have gained wide recognition in simulating Rainfall-Runoff (RR) with comparable accuracy. In the present study, RR modeling is carried out targeting the application and estimation of runoff using mathematical modeling. The investigations were carried out for the Malkajgiri catchment, adopting 16 years of daily data from 2005 to 2021. A statistical learning theory-based pattern-learning algorithm is further utilized to evaluate the runoff for the year 2021. The results were found to be in fair accordance with the analytical outcomes.
4

RATH, S., P. P. SENGUPTA, A. P. SINGH, A. K. MARIK, and P. TALUKDAR. "MATHEMATICAL-ARTIFICIAL NEURAL NETWORK HYBRID MODEL TO PREDICT ROLL FORCE DURING HOT ROLLING OF STEEL." International Journal of Computational Materials Science and Engineering 02, no. 01 (March 2013): 1350004. http://dx.doi.org/10.1142/s2047684113500048.

Abstract:
Accurate prediction of roll force during hot strip rolling is essential for model-based operation of hot strip mills. Traditionally, mathematical models based on the theory of plastic deformation have been used for the prediction of roll force. In the last decade, data-driven models like artificial neural networks have been tried for the prediction of roll force. Pure mathematical models have accuracy limitations, whereas data-driven models have difficulty in convergence when applied to industrial conditions. Hybrid models integrating the traditional mathematical formulations and data-driven methods are being developed in different parts of the world. This paper discusses the methodology of development of an innovative hybrid mathematical-artificial neural network model. In the mathematical model, the most important factor influencing accuracy is the flow stress of steel. Coefficients of the standard flow stress equation, calculated by a parameter estimation technique, have been used in the model. The hybrid model has been trained and validated with input and output data collected from the finishing stands of the Hot Strip Mill, Bokaro Steel Plant, India. It has been found that model accuracy is improved with the use of the hybrid model over the traditional mathematical model.
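The hybrid architecture described above (a physics baseline corrected by a data-driven term) can be sketched generically. The snippet below illustrates the pattern only; the baseline formula, network size and data are synthetic placeholders, not the authors' roll-force model:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def physics_baseline(x):
    """Placeholder mathematical model (stands in for a plastic-deformation
    roll-force formula driven by an empirical flow stress equation)."""
    return 3.0 * x[:, 0] ** 1.5 + 2.0 * x[:, 1]

# Synthetic mill data: measured force = physics + unmodelled effects + noise.
X = rng.uniform(0.5, 2.0, size=(200, 2))
y_measured = physics_baseline(X) + 0.4 * np.sin(4 * X[:, 0]) + rng.normal(0, 0.05, 200)

# Train the ANN on the *residual*, so the network only learns what
# the mathematical model misses.
residual_net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
residual_net.fit(X, y_measured - physics_baseline(X))

def hybrid_predict(x):
    return physics_baseline(x) + residual_net.predict(x)
```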
5

Lima, E. A. B. F., J. T. Oden, D. A. Hormuth, T. E. Yankeelov, and R. C. Almeida. "Selection, calibration, and validation of models of tumor growth." Mathematical Models and Methods in Applied Sciences 26, no. 12 (October 25, 2016): 2341–68. http://dx.doi.org/10.1142/s021820251650055x.

Abstract:
This paper presents general approaches for addressing some of the most important issues in predictive computational oncology concerned with developing classes of predictive models of tumor growth: first, the process of developing mathematical models of vascular tumors evolving in the complex, heterogeneous macroenvironment of living tissue; second, the selection of the most plausible models among these classes, given relevant observational data; third, the statistical calibration and validation of models in these classes; and finally, the prediction of key Quantities of Interest (QOIs) relevant to patient survival and the effect of various therapies. The most challenging aspect of this endeavor is that all of these issues often involve confounding uncertainties: in observational data, in model parameters, in model selection, and in the features targeted in the prediction. Our approach can be referred to as “model agnostic” in that no single model is advocated; rather, a general approach that explores powerful mixture-theory representations of tissue behavior while accounting for a range of relevant biological factors is presented, which leads to many potentially predictive models. Representative classes are then identified which provide a starting point for the implementation of the Occam Plausibility Algorithm (OPAL), which enables the modeler to select the most plausible models (for given data) and to determine if the model is a valid tool for predicting tumor growth and morphology (in vivo). All of these approaches account for uncertainties in the model, the observational data, the model parameters, and the target QOI. We demonstrate these processes by comparing a list of models for tumor growth, including reaction–diffusion models, phase-field models, and models with and without mechanical deformation effects, for glioma growth measured in murine experiments. Examples are provided that exhibit quite acceptable predictions of tumor growth in laboratory animals while demonstrating successful implementations of OPAL.
6

Ivanov, L. M., A. D. Kirwan, Jr., and O. V. Melnichenko. "Prediction of the stochastic behaviour of nonlinear systems by deterministic models as a classical time-passage probabilistic problem." Nonlinear Processes in Geophysics 1, no. 4 (December 31, 1994): 224–33. http://dx.doi.org/10.5194/npg-1-224-1994.

Abstract:
Assuming that the behaviour of a nonlinear stochastic system can be described by a Markovian diffusion approximation and that the evolution equations can be reduced to a system of ordinary differential equations, a method for the calculation of prediction time is developed. In this approach, the prediction time depends upon the accuracy of prediction, the intensity of turbulence, the accuracy of the initial conditions, the physics contained in the mathematical model, the measurement errors, and the number of prediction variables. A numerical application to zonal channel flow illustrates the theory. Some possible generalizations of the theory are also discussed.
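Read as a first-passage problem, the prediction time is the mean time for the forecast error to exceed a tolerance. A minimal Monte Carlo sketch for a scalar linearized error model (all parameters illustrative, not the zonal channel-flow application):

```python
import numpy as np

rng = np.random.default_rng(1)
lam, sigma = 0.5, 0.3   # error growth rate and turbulence intensity (illustrative)
e0, tol = 0.01, 0.5     # initial-condition error and prediction tolerance
dt, n_paths = 0.01, 2000

err = np.full(n_paths, e0)
t_pass = np.full(n_paths, np.nan)
for k in range(5000):   # integrate dE = lam*E dt + sigma dW over an ensemble
    err += lam * err * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)
    hit = np.isnan(t_pass) & (np.abs(err) >= tol)
    t_pass[hit] = (k + 1) * dt
    if not np.isnan(t_pass).any():
        break

# Mean first time the forecast error leaves the tolerance band.
print("estimated prediction time:", np.nanmean(t_pass))
```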
7

Vlachos, D., and Y. A. Tolias. "Neuro-fuzzy modeling in bankruptcy prediction." Yugoslav Journal of Operations Research 13, no. 2 (2003): 165–74. http://dx.doi.org/10.2298/yjor0302165v.

Abstract:
For the past 30 years the problem of bankruptcy prediction has been thoroughly studied. From the paper of Altman in 1968 to the recent papers of the '90s, the progress in prediction accuracy has not been satisfactory. This paper investigates an alternative modeling of the system (firm), combining neural networks and fuzzy controllers, i.e., using neuro-fuzzy models. Classical modeling is based on mathematical models that describe the behavior of the firm under consideration. The main idea of fuzzy control, on the other hand, is to build a model of a human control expert who is capable of controlling the process without thinking in terms of a mathematical model. This control expert specifies his control actions in the form of linguistic rules. These control rules are translated into the framework of fuzzy set theory, providing a calculus which can simulate the behavior of the control expert and enhance its performance. The accuracy of the model is studied using datasets from previous research papers.
8

Geng, Xinyu, Yufei Liu, Wei Zheng, Yongbin Wang, and Meng Li. "Prediction of Crushing Response for Metal Hexagonal Honeycomb under Quasi-Static Loading." Shock and Vibration 2018 (September 13, 2018): 1–10. http://dx.doi.org/10.1155/2018/8043410.

Abstract:
To provide a theoretical basis for metal honeycombs used for buffering and crashworthy structures, this study investigated the out-of-plane crushing of metal hexagonal honeycombs with various cell specifications. The mathematical models of mean crushing stress and peak crushing stress for metal hexagonal honeycombs were predicted on the basis of simplified super element theory. The experimental study was carried out to check the accuracy of mathematical models and verify the effectiveness of the proposed approach. The presented theoretical models were compared with the results obtained from experiments on nine types of honeycombs under quasi-static compression loading in the out-of-plane direction. Excellent correlation has been observed between the theoretical and experimental results.
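For orientation, a classical baseline for the out-of-plane mean crushing stress of a regular hexagonal honeycomb is the Wierzbicki-type expression below; note this is the textbook formula, not necessarily the authors' simplified-super-element result:

```python
def mean_crushing_stress(sigma_y, t, D):
    """Wierzbicki-type estimate sigma_m = 16.56 * sigma_y * (t/D)**(5/3),
    with foil thickness t and cell size D in the same units."""
    return 16.56 * sigma_y * (t / D) ** (5.0 / 3.0)

# e.g. aluminium foil: yield strength ~110 MPa, t = 0.06 mm, D = 4.8 mm
print(mean_crushing_stress(110e6, 0.06e-3, 4.8e-3) / 1e6, "MPa")  # ~1.2 MPa
```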
9

Guseynov, Sharif, and Eugene Kopytov. "Complex Mathematical Models for Analysis, Evaluation and Prediction of Aqueous and Atmospheric Environment of Latvia." Transport and Telecommunication Journal 13, no. 1 (January 1, 2012): 57–74. http://dx.doi.org/10.2478/v10244-012-0006-8.

Abstract:
In the present paper the authors consider the complete statements of initial-boundary problems for the modelling of various aspects of aqueous (3 models) and atmospheric systems (2 models) in Latvia. All the proposed models are described in terms of differential equations theory (using both ordinary differential equations and partial differential equations) and are regarded as evolutionary models. Two of the three aqueous system models being studied are intended to describe the natural aquatic media ecosystems, while the other models are aimed at studying environmental pollution processes.
10

Zhou, Ai Zhao, Wei Wei Gu, and Wei Wang. "Study on Prediction Models for Time-Dependent Settlement of Soft Road Foundation." Applied Mechanics and Materials 204-208 (October 2012): 1880–85. http://dx.doi.org/10.4028/www.scientific.net/amm.204-208.1880.

Abstract:
The characteristics of soft clay roadbed settlement prediction models are studied in this paper. Firstly, based on one-dimensional soil consolidation theory, the shape of the time-dependent settlement process curve is analysed. Then, a mathematical analysis of traditional settlement models, including the Gompertz model and the Logistic model, is conducted, and a mathematical deficiency shared by these two traditional models is pointed out: the settlement corresponding to the inflection point has a fixed ratio to the ultimate settlement. The Weibull model is then proposed to describe the time-dependent settlement process of the roadbed. This proposed model overcomes the deficiency of the two traditional models, and the exponential model is one of its degenerate forms. Moreover, it can predict the total settlement process under both instantaneous load and ramp load conditions. Finally, using a group of settlement observation data, the prediction results of the different models are compared, and the Weibull model shows good agreement.
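A sketch of the Weibull-type settlement curve discussed above, fitted by least squares; parameter names and data are illustrative, and the paper's exact formulation may differ:

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_settlement(t, s_ult, a, b):
    """S(t) = S_ult * (1 - exp(-(t/a)**b)); reduces to the exponential
    model for b = 1, and its inflection-point ratio is not fixed, unlike
    the Gompertz and Logistic curves."""
    return s_ult * (1.0 - np.exp(-(t / a) ** b))

t_obs = np.array([10, 30, 60, 100, 150, 220, 300.0])   # days
s_obs = np.array([8, 21, 35, 46, 54, 60, 63.0])        # settlement (mm)

popt, _ = curve_fit(weibull_settlement, t_obs, s_obs, p0=(70, 100, 1.0))
s_ult, a, b = popt
print(f"ultimate settlement ~ {s_ult:.1f} mm, a = {a:.1f}, b = {b:.2f}")
```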
11

Ghaffari Laleh, Narmin, Chiara Maria Lavinia Loeffler, Julia Grajek, Kateřina Staňková, Alexander T. Pearson, Hannah Sophie Muti, Christian Trautwein, Heiko Enderling, Jan Poleszczuk, and Jakob Nikolas Kather. "Classical mathematical models for prediction of response to chemotherapy and immunotherapy." PLOS Computational Biology 18, no. 2 (February 4, 2022): e1009822. http://dx.doi.org/10.1371/journal.pcbi.1009822.

Abstract:
Classical mathematical models of tumor growth have shaped our understanding of cancer and have broad practical implications for treatment scheduling and dosage. However, even the simplest textbook models have been barely validated in real-world data of human patients. In this study, we fitted a range of differential equation models to tumor volume measurements of patients undergoing chemotherapy or cancer immunotherapy for solid tumors. We used a large dataset of 1472 patients with three or more measurements per target lesion, of which 652 patients had six or more data points. We show that the early treatment response shows only moderate correlation with the final treatment response, demonstrating the need for nuanced models. We then perform a head-to-head comparison of six classical models which are widely used in the field: the Exponential, Logistic, Classic Bertalanffy, General Bertalanffy, Classic Gompertz and General Gompertz model. Several models provide a good fit to tumor volume measurements, with the Gompertz model providing the best balance between goodness of fit and number of parameters. Similarly, when fitting to early treatment data, the General Bertalanffy and Gompertz models yield the lowest mean absolute error to forecasted data, indicating that these models could potentially be effective at predicting treatment outcome. In summary, we provide a quantitative benchmark for classical textbook models and state-of-the-art models of human tumor growth. We publicly release an anonymized version of our original data, providing the first benchmark set of human tumor growth data for evaluation of mathematical models.
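As an illustration of the kind of fitting benchmarked in the study, the following fits the classic Gompertz law through its closed-form solution; the data points are synthetic stand-ins for tumor volume measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, v0, k, alpha):
    """Closed-form Gompertz solution V(t) = K * exp(ln(V0/K) * exp(-alpha*t)),
    i.e. the solution of dV/dt = alpha * V * ln(K/V) with carrying capacity K."""
    return k * np.exp(np.log(v0 / k) * np.exp(-alpha * t))

t = np.array([0, 20, 40, 60, 80, 100.0])         # days
v = np.array([0.5, 2.1, 5.8, 9.7, 12.4, 13.6])   # volume (cm^3), synthetic

popt, _ = curve_fit(gompertz, t, v, p0=(0.5, 15.0, 0.05))
print("fitted V0, K, alpha =", popt)
```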
12

Benzekry, Sébastien, Clare Lamont, Afshin Beheshti, Amanda Tracz, John M. L. Ebos, Lynn Hlatky, and Philip Hahnfeldt. "Classical Mathematical Models for Description and Prediction of Experimental Tumor Growth." PLoS Computational Biology 10, no. 8 (August 28, 2014): e1003800. http://dx.doi.org/10.1371/journal.pcbi.1003800.

13

Qi, Tianqi, Wei Zhou, Xinghong Liu, Qiao Wang, and Sifan Zhang. "Predictive Hydration Model of Portland Cement and Its Main Minerals Based on Dissolution Theory and Water Diffusion Theory." Materials 14, no. 3 (January 27, 2021): 595. http://dx.doi.org/10.3390/ma14030595.

Abstract:
Efficient and accurate cement hydration simulation is an important issue for predicting and analyzing concrete’s performance evolution. A large number of models have been proposed to describe cement hydration. Some models can simulate test results with high accuracy by constructing reasonable functions, but they are based on mathematical regression and lack a physical background and predictive ability. Other models, such as the well-known HYMOSTRUC and CEMHYD3D models, can predict the hydration rate and microstructure evolution of cement based on its initial microstructure. However, this kind of prediction model also has some limitations, such as the inability to fully consider the properties of the cement slurry, or being too complicated for use in finite element analysis (FEA). In this study, the hydration mechanisms of the main minerals in Portland cement (PC) are expounded, and the corresponding hydration model is built. Firstly, a modified particle hydration model of tricalcium silicate (C3S) and alite is proposed based on moisture diffusion theory and the calcium silicate hydrate (C-S-H) barrier layer hypothesis, which can predict the hydration degree of C3S and alite throughout the curing age. Taking the hydration model of C3S as a reference, the hydration model of dicalcium silicate (C2S) is established, and the synergistic hydration effect of C3S and C2S is calibrated by analyzing published test results. The hydration model of the tricalcium aluminate (C3A)–gypsum system is then designed by combining the theories of dissolution and diffusion. This model can reflect the hydration characteristics of C3A in different stages, and quantify the response of the hydration process of C3A to different gypsum contents, water–cement ratios, and particle size distributions. Finally, several correction coefficients are introduced into the hydration models of the main minerals, to consider the synergistic hydration effect among the minerals to some extent and realize the prediction of the hydration of PC.
14

Gubarev, Vyacheslav. "NEW TRENDS IN CONTROL THEORY." International Scientific Technical Journal "Problems of Control and Informatics" 67, no. 3 (June 1, 2022): 5–21. http://dx.doi.org/10.34229/2786-6505-2022-3-1.

Abstract:
The article outlines the conceptual foundations of new trends in control theory that have been intensively developing in recent years. Unlike classical control theory, which was formed in the last century and is based on well-known mathematical models of controlled processes in the form of local equations, new approaches to linear stationary systems use input-output relations that follow directly from the Cauchy formula for both continuous and discrete systems. On the basis of the same description, it is possible to substantiate and obtain so-called data-based models, which are directly linked to data that form, over the observation intervals, the trajectories of already implemented past processes and of future ones for which control is to be synthesized. This approach is focused primarily on finding control from the prediction model. At the same time, the current measurements carried out at the plant make it possible to implement feedback and, in case of discrepancies between the forecast and the real process, to correct the predictive control, i.e., to stabilize it. Control by a trajectory prediction model makes it possible to dispense with model identification from trajectory data and to control directly on their basis. Since the data contain errors, the most important issue in the considered approach is the robustness of the chosen control. A large number of published works are dedicated to this problem, where the guaranteed approach, focused on the worst case in the data, is the most in demand. In most cases, control synthesis is reduced to solving various optimization problems, mainly on a finite prediction horizon. Considerable attention in the article is paid to methods for solving synthesis problems based on the SVD decomposition. To reduce the complexity of the tasks to be solved, it is proposed to reduce them to terminal control on a horizon of short duration. An iterative control strategy is then implemented which, due to feedback, ensures the attainment of the global control goal.
15

DOUGHERTY, EDWARD R., and ULISSES BRAGA-NETO. "EPISTEMOLOGY OF COMPUTATIONAL BIOLOGY: MATHEMATICAL MODELS AND EXPERIMENTAL PREDICTION AS THE BASIS OF THEIR VALIDITY." Journal of Biological Systems 14, no. 01 (March 2006): 65–90. http://dx.doi.org/10.1142/s0218339006001726.

Abstract:
Knowing the roles of mathematics and computation in experimental science is important for computational biology because these roles determine to a great extent how research in this field should be pursued and how it should relate to biology in general. The present paper examines the epistemology of computational biology from the perspective of modern science, the underlying principle of which is that a scientific theory must have two parts: (1) a structural model, which is a mathematical construct that aims to represent a selected portion of physical reality and (2) a well-defined procedure for relating consequences of the model to quantifiable observations. We also explore the contingency and creative nature of a scientific theory. Among the questions considered are: Can computational biology form the theoretical core of biology? What is the basis, if any, for choosing one particular model over another? And what is the role of computation in science, and in biology in particular? We examine how this broad epistemological framework applies to important statistical methodologies pertaining to computational biology, such as expression-based phenotype classification, gene regulatory networks, and clustering. We consider classification in detail, as the epistemological issues raised by classification are related to all computational-biology topics in which statistical prediction plays a key role. We pay particular attention to classifier-model validity and its relation to estimation rules.
16

Delort, Thierry. "Theory of Dark Matter and Dark Energy." Applied Physics Research 10, no. 5 (September 27, 2018): 1. http://dx.doi.org/10.5539/apr.v10n5p1.

Abstract:
In this article, we propose a new model of dark matter. According to this new model, dark matter is a substance, that is, a new physical element not constituted of classical particles, called dark substance, filling the Universe. Assuming some very simple physical properties of this dark substance, we theoretically justify the flat rotation curves of galaxies and the baryonic Tully-Fisher law. We then study, according to our new theory of dark matter, the different possible distributions of dark matter in galaxies and in galaxy clusters, and the velocities of galaxies in galaxy clusters. Using the new model of dark matter, we are naturally led to propose a new geometrical model of the Universe, finite, that is different from all geometrical models proposed by the Standard Cosmological Model (SCM). Although our theory of dark matter is compatible with the SCM, we then expose a new cosmological model based on this new geometrical form of the Universe and on the interpretation of the CMB Rest Frame (CRF), which has no physical interpretation in the SCM and which we call the local cosmological frame. We then propose two possible mathematical models of expansion inside the new cosmological model. The first mathematical model is based on General Relativity, as in the SCM, and gives the same theoretical predictions for distances and for the Hubble constant as the SCM. The second mathematical model of expansion of the Universe is mathematically much simpler than the mathematical model of expansion used in the SCM, but we will see that its theoretical predictions are in agreement with astronomical observations. Moreover, this second mathematical model of expansion does not need to introduce the existence of a dark energy, contrary to the mathematical model of expansion of the SCM. Finally, we study the evolution of the temperature of the dark substance in the Universe and show the existence of a dark energy due to our model of dark matter.
17

Manusov, Vadim, and Javod Ahyoev. "Technical Diagnostics of Electric Equipment with the Use of Fuzzy Logic Models." Applied Mechanics and Materials 792 (September 2015): 324–29. http://dx.doi.org/10.4028/www.scientific.net/amm.792.324.

Abstract:
A possible method for the technical diagnostics of electric power systems (EPS), in particular the electric equipment of substations and electric networks, is considered, using the mathematical apparatus of fuzzy sets and fuzzy logic theory. It is shown that, on the basis of fuzzy expert estimates, possible causes of failures can be predicted if preliminary estimates of the available failure signs are known.
18

Wang, Debby D., Haoran Xie, and Hong Yan. "Proteo-chemometrics interaction fingerprints of protein–ligand complexes predict binding affinity." Bioinformatics 37, no. 17 (February 27, 2021): 2570–79. http://dx.doi.org/10.1093/bioinformatics/btab132.

Abstract:
Motivation: Reliable predictive models of protein–ligand binding affinity are required in many areas of biomedical research. Accurate prediction based on current descriptors or molecular fingerprints (FPs) remains a challenge. We develop novel interaction FPs (IFPs) to encode protein–ligand interactions and use them to improve the prediction. Results: Proteo-chemometrics IFPs (PrtCmm IFPs) were formed by combining extended connectivity fingerprints (ECFPs) with the proteo-chemometrics concept. Combining PrtCmm IFPs with machine-learning models led to efficient scoring models, which were validated on the PDBbind v2019 core set and CSAR-HiQ sets. The PrtCmm IFP Score outperformed several other models in predicting protein–ligand binding affinities. In addition, conventional ECFPs were simplified to generate new IFPs, which provided consistent but faster predictions. The relationship between the base atom properties of ECFPs and the accuracy of predictions was also investigated. Availability: PrtCmm IFP has been implemented in the IFP Score Toolkit on GitHub (https://github.com/debbydanwang/IFPscore). Supplementary data are available at Bioinformatics online.
19

Panotopoulos, Grigorios P., and Ziyad S. Haidar. "Mathematical Modeling for Pharmaco-Kinetic and -Dynamic Predictions from Controlled Drug Release NanoSystems: A Comparative Parametric Study." Scientifica 2019 (January 6, 2019): 1–5. http://dx.doi.org/10.1155/2019/9153876.

Abstract:
Predicting pharmacokinetics, based on the theory of dynamic systems, for an administered drug (whether intravenously, orally, intramuscularly, etc.) is an industrial and clinical challenge. Often, mathematical modeling of pharmacokinetics is performed using only a measured concentration–time profile of a drug administered in plasma and/or in blood. Yet, in dynamic systems, (linear) mathematical modeling uses both a mathematically described drug administration and a mathematically described body response to the administered drug. In the present work, we compare several mathematical models well known in the literature for simulating controlled drug release kinetics, using available experimental data sets obtained in real systems with different drugs and nanosized carriers. We employed the χ2 minimization method and concluded that the Korsmeyer–Peppas model (or power-law model) provides the best fit in all cases (the minimum value of χ2 per degree of freedom, χ2min/d.o.f. = 1.4183, with 2 free parameters, m = 2). Hence, (i) a better understanding of the exact mass transport mechanisms involved in drug release and (ii) quantitative prediction of drug release can be computed and simulated. We anticipate that this work will help devise optimal pharmacokinetic and dynamic release systems, with measured variable properties, at the nanoscale, characterized to target specific diseases and conditions.
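The Korsmeyer–Peppas (power-law) model singled out here, Mt/M∞ = k·t^n, is simple to fit by χ2 minimization. A minimal sketch with illustrative release data and an assumed uniform measurement error (the model conventionally applies to the early portion of the release curve, Mt/M∞ ≲ 0.6):

```python
import numpy as np
from scipy.optimize import minimize

t = np.array([0.5, 1, 2, 4, 8.0])                # hours
frac = np.array([0.12, 0.18, 0.26, 0.38, 0.55])  # released fraction Mt/M_inf
sigma = 0.02                                     # assumed measurement error

def chi2(params):
    k, n = params
    return np.sum(((frac - k * t ** n) / sigma) ** 2)

res = minimize(chi2, x0=(0.15, 0.5), method="Nelder-Mead")
k, n = res.x
dof = len(t) - 2                                 # 2 free parameters, m = 2
print(f"k = {k:.3f}, n = {n:.3f}, chi2/dof = {res.fun / dof:.2f}")
# n ~ 0.5 suggests Fickian diffusion; 0.5 < n < 1 anomalous transport.
```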
20

Hang, Cheng, Xiu Lan Sun, Yu Zhu Shang, and Mao Lei. "The Stochastic Simulation Prediction Method for Landslide." Advanced Materials Research 971-973 (June 2014): 2255–58. http://dx.doi.org/10.4028/www.scientific.net/amr.971-973.2255.

Abstract:
Addressing the high frequency of natural disasters such as landslides and debris flows in mountainous and hilly areas under rainfall conditions, this paper establishes empirical landslide early-warning and forecast models based on rainfall infiltration theory, slope stability analysis and the finite element method, and adopts the advanced rainfall Intensity–Duration statistical curve, used at home and abroad, as the forecast method. The results show that the fitted curve takes the form of a piecewise function whose expression changes with rainfall intensity, in accordance with actual working conditions.
21

Li, Li, Yue Qiang, Shaohong Li, and Zhongchao Yang. "Research on Slope Deformation Prediction Based on Fractional-Order Calculus Gray Model." Advances in Civil Engineering 2018 (October 14, 2018): 1–9. http://dx.doi.org/10.1155/2018/9526216.

Abstract:
Slope deformation prediction has important significance for slope prevention and control. Based on a historical time series, the trend of displacement variation can be predicted in advance, and according to the development trend, risk warnings and treatment measures can be proposed. The use of mathematical models to predict slope deformation has been proved feasible by many studies; the choice of the predictive model and its practicality are therefore crucial issues in slope deformation prediction, and the mathematical prediction model used should not be overly complicated, considering the practicality of the model. For slope deformation prediction, a fractional-order calculus gray model based on the coupling of gray theory and the fractional derivative method is proposed, taking a deep foundation pit slope in Chongqing, Southwest China, as the study object. The fractional-order gray model is compared with traditional gray models. The results show that the accuracy of slope deformation prediction based on the gray coupling model of cumulative displacement and fractional calculus is significantly higher than that of the conventional gray model, and its error is in an acceptable range compared with the actual monitoring data, which can meet the needs of engineering application. Compared with the traditional gray theory method, the gray coupling model of fractional-order calculus only adds the fractional derivative order, which is verified to be feasible and can be used as a reference method for slope deformation prediction. It has a solid theoretical basis and good application prospects in slope deformation prediction.
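For context, the conventional integer-order GM(1,1) gray model that the paper generalizes can be sketched as follows (the fractional-order variant replaces the first-order cumulative sum with a fractional-order accumulation; data and function names here are illustrative):

```python
import numpy as np

def gm11(x0, horizon=3):
    """Classical GM(1,1): first-order accumulation (AGO), least-squares
    estimate of (a, b) in the whitened equation dx1/dt + a*x1 = b, then
    inverse accumulation to recover the fitted and forecast series."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                          # accumulated series
    z1 = 0.5 * (x1[1:] + x1[:-1])               # background values
    B = np.column_stack([-z1, np.ones_like(z1)])
    (a, b), *_ = np.linalg.lstsq(B, x0[1:], rcond=None)
    k = np.arange(1, len(x0) + horizon)
    x1_hat = np.concatenate([[x0[0]], (x0[0] - b / a) * np.exp(-a * k) + b / a])
    return np.concatenate([[x0[0]], np.diff(x1_hat)])

disp = np.array([12.1, 13.4, 14.9, 16.1, 17.6])  # monitored displacement (mm)
print(gm11(disp))                                # fitted values + 3 forecasts
```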
22

Carlsson, Leo S., Mikael Vejdemo-Johansson, Gunnar Carlsson, and Pär G. Jönsson. "Fibers of Failure: Classifying Errors in Predictive Processes." Algorithms 13, no. 6 (June 23, 2020): 150. http://dx.doi.org/10.3390/a13060150.

Abstract:
Predictive models are used in many different fields of science and engineering and are always prone to make faulty predictions. These faulty predictions can be more or less malignant depending on the model application. We describe fibers of failure (FiFa), a method to classify failure modes of predictive processes. Our method uses Mapper, an algorithm from topological data analysis (TDA), to build a graphical model of input data stratified by prediction errors. We demonstrate two ways to use the failure mode groupings: either to produce a correction layer that adjusts predictions by similarity to the failure modes; or to inspect members of the failure modes to illustrate and investigate what characterizes each failure mode. We demonstrate FiFa on two scenarios: a convolutional neural network (CNN) predicting MNIST images with added noise, and an artificial neural network (ANN) predicting the electrical energy consumption of an electric arc furnace (EAF). The correction layer on the CNN model improved its prediction accuracy significantly while the inspection of failure modes for the EAF model provided guiding insights into the domain-specific reasons behind several high-error regions.
23

AOUNALLAH, Naceur, and Ali KHALFA. "Analysis Study of Radar Probability of Detection for Fluctuating and Non-fluctuating Targets." Algerian Journal of Signals and Systems 2, no. 1 (March 15, 2017): 12–20. http://dx.doi.org/10.51485/ajss.v2i1.28.

Abstract:
The radar analyst can develop and use mathematical and statistical techniques that lead to accurate prediction or adaptive models for estimating target detection performance. In radar detection theory, detection probability, false alarm probability, the number of samples non-coherently integrated for a detection test, and the signal-to-noise ratio (SNR) are closely interrelated. The present paper is intended to provide an overview of the calculation of the radar probability of detection and its related parameters. The main methods and procedures for predicting the detection performance of either non-fluctuating or fluctuating targets are described. A performance analysis of the studied models is included, along with some graphical simulation examples.
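Two standard single-pulse results surveyed in such analyses are easy to reproduce: the non-fluctuating (Swerling 0) case via the Marcum Q-function, and the Swerling I closed form. A sketch assuming a square-law detector and a normalized threshold:

```python
import numpy as np
from scipy.stats import ncx2

def pd_nonfluctuating(snr_db, pfa):
    """Swerling 0, single pulse: Pd = Q1(sqrt(2*SNR), sqrt(2*T)) with
    normalized threshold T = -ln(Pfa); Q1 is expressed here through the
    noncentral chi-square survival function."""
    snr = 10 ** (snr_db / 10)
    T = -np.log(pfa)
    return ncx2.sf(2 * T, df=2, nc=2 * snr)

def pd_swerling1(snr_db, pfa):
    """Swerling I closed form: Pd = Pfa**(1 / (1 + SNR))."""
    snr = 10 ** (snr_db / 10)
    return pfa ** (1 / (1 + snr))

# Fluctuation costs detection probability at equal SNR and Pfa.
print(pd_nonfluctuating(13, 1e-6), pd_swerling1(13, 1e-6))
```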
24

Collette, Timothy W., and Adam J. Szladow. "Use of Rough Sets and Spectral Data for Building Predictive Models of Reaction Rate Constants." Applied Spectroscopy 48, no. 11 (November 1994): 1379–86. http://dx.doi.org/10.1366/0003702944028047.

Abstract:
A model for predicting the log of the rate constants for alkaline hydrolysis of organic esters has been developed with the use of gas-phase mid-infrared library spectra and a rule-building software system based on the mathematical theory of rough sets. A diverse set of 41 esters was used as training compounds. The model is an advance in the development of a generalized system for predicting environmentally important reactivity parameters based on spectroscopic data. By comparison to a previously developed model using the same training set with multiple linear regression (MLR), the rough-sets model provided better predictive power, was more widely applicable, and required less spectral data manipulation. [For the previous MLR model, a standard error of prediction (SEP) of 0.59 was calculated for 88% of the training set data under leave-one-out cross-validation. In the present study using rough sets, an SEP of 0.52 was calculated for 95% of the data set.] More importantly, analysis of the decision rules generated by rough-sets analysis can lead to a better understanding of both the reaction process under study and important trends in the spectral data, as well as underlying relationships between the two.
25

Oden, J. Tinsley. "Adaptive multiscale predictive modelling." Acta Numerica 27 (May 1, 2018): 353–450. http://dx.doi.org/10.1017/s096249291800003x.

Abstract:
The use of computational models and simulations to predict events that take place in our physical universe, or to predict the behaviour of engineered systems, has significantly advanced the pace of scientific discovery and the creation of new technologies for the benefit of humankind over recent decades, at least up to a point. That ‘point’ in recent history occurred around the time that the scientific community began to realize that true predictive science must deal with many formidable obstacles, including the determination of the reliability of the models in the presence of many uncertainties. To develop meaningful predictions one needs relevant data, itself possessing uncertainty due to experimental noise; in addition, one must determine model parameters, and concomitantly, there is the overriding need to select and validate models given the data and the goals of the simulation. This article provides a broad overview of predictive computational science within the framework of what is often called the science of uncertainty quantification. The exposition is divided into three major parts. In Part 1, philosophical and statistical foundations of predictive science are developed within a Bayesian framework. There the case is made that the Bayesian framework provides, perhaps, a unique setting for handling all of the uncertainties encountered in scientific prediction. In Part 2, general frameworks and procedures for the calculation and validation of mathematical models of physical realities are given, all in a Bayesian setting. But beyond Bayes, an introduction to information theory, the maximum entropy principle, model sensitivity analysis and sampling methods such as MCMC are presented. In Part 3, the central problem of predictive computational science is addressed: the selection, adaptive control and validation of mathematical and computational models of complex systems. The Occam Plausibility Algorithm, OPAL, is introduced as a framework for model selection, calibration and validation. Applications to complex models of tumour growth are discussed.
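The arithmetic at the heart of OPAL-style Bayesian model selection is posterior model plausibility: each model's evidence (marginal likelihood) is weighted by its prior plausibility and normalized over the candidate set. A minimal sketch with made-up log-evidences:

```python
import numpy as np

def posterior_plausibilities(log_evidences, priors=None):
    """rho_j proportional to pi_j * p(data | M_j), computed stably in
    log space to avoid underflow of tiny evidences."""
    log_ev = np.asarray(log_evidences, dtype=float)
    if priors is None:
        priors = np.ones_like(log_ev) / len(log_ev)  # uniform prior over models
    log_post = log_ev + np.log(priors)
    log_post -= log_post.max()
    w = np.exp(log_post)
    return w / w.sum()

# e.g. three candidate tumour-growth models with computed log-evidences
print(posterior_plausibilities([-1052.3, -1049.8, -1060.1]))
```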
26

Danilova, Evgeniya, Igor Kochegarov, Nikolay Yurkov, Mikhail Miheev, and Normunds Kante. "Models of Printed Circuit Boards Conductive Pattern Defects." Applied Computer Systems 23, no. 2 (December 1, 2018): 128–34. http://dx.doi.org/10.2478/acss-2018-0016.

Abstract:
A number of PCB defects, though having passed the defect identification procedure successfully, can potentially grow into critical defects under various external and (or) internal influences. The complex nature of the development of defects leading to PCB failures demands the development and updating of data measuring systems not only for the detection but also for the prediction of the future development of PCB defects, considering the external influences. To solve this problem, it is necessary to analyse models of defect development, which will allow predicting defect growth and working out mathematical models for their study. The study uses the methods of system analysis, the theory of mathematical and simulation modelling, and the analysis of technological systems. The article presents four models for determining the theoretical stress concentration factor for several types of common defects, considering the strength loss of PCB elements. For each model, an evaluation of the parameters determining its quality is also given. Formulas are given that link the geometry of defects and the stress concentration factor for the four types of defects; these formulas are necessary for determining the number of cycles and time to failure, and the fatigue strength coefficient. The chosen models for determining the values of the stress concentration factor can be used as a database for identifying PCB defects. The proposed models are used in the software implementation of optical image inspection systems.
27

Sanbonmatsu, David M., and William A. Johnston. "Redefining Science: The Impact of Complexity on Theory Development in Social and Behavioral Research." Perspectives on Psychological Science 14, no. 4 (June 11, 2019): 672–90. http://dx.doi.org/10.1177/1745691619848688.

Abstract:
Disciplinary differences in the development of scientific generalizations and explanations are reviewed in this article. The social and behavioral sciences have identified fewer laws, established fewer “paradigms,” and developed “worse” theories than the physical sciences. We argue that the variability in the theoretical attainments of disciplines is due primarily to differences in the complexity of the topics studied. Accounts suggesting that differences in the maturity of disciplines are responsible for the variability are dismissed. In the study of complex phenomena, there is an extreme trade-off between generality and precision in which basic theories do not make the precise predictions needed for the development of applications and in which applied models are lacking in generality. The examination of proximal determinants and the generation of context-specific mathematical models are essential for prediction and application in complex disciplines. The impossibility of developing exacting theories of complex phenomena suggests that we need to redefine our conceptions of “good” and “bad” theories and “real” and “fake” science.
28

Foley, D. "Mathematical Formalism and Political-Economic Content." Voprosy Ekonomiki, no. 7 (July 20, 2012): 82–95. http://dx.doi.org/10.32609/0042-8736-2012-7-82-95.

Abstract:
Mathematical methods are only one moment in a layered process of theory generation in political economy, which starts from Schumpeterian vision, progresses to the identification of relevant abstractions, the development of mathematical and quantitative models, and the confrontation of theories with empirical data through statistical methods. But today the relevant abstract problems of political economy are modified to fit available mathematical tools. The role of empirical research in disciplining theoretical speculation, on which the scientific tradition's integrity rests, was undermined by specific limitations of nascent econometric methods, and usurped by ex cathedra methodological fiats of theorists. These developments systematically favored certain ideological predispositions of economics as a discipline. There is abundant room for New Thinking in political economy starting from the vision of the capitalist economy as a complex, adaptive system far from equilibrium, including the development of the theory of statistical fluctuations for economic interactions, redirection of macroeconomics and financial economics from path prediction toward an understanding of the qualitative properties of the system, introduction of constructive and computable methods into economic modeling, and the critical reconstruction of econometric statistical methods.
29

Kong, Gangqiang, Hanlong Liu, Qing Yang, Robert Y. Liang, and Hang Zhou. "Mathematical Model and Analysis of Negative Skin Friction of Pile Group in Consolidating Soil." Mathematical Problems in Engineering 2013 (2013): 1–11. http://dx.doi.org/10.1155/2013/956076.

Abstract:
In order to calculate the negative skin friction (NSF) of pile groups embedded in a consolidating soil, dragload calculation formulas for a single pile were established by considering Davis's one-dimensional nonlinear consolidation soil settlement and hyperbolic load transfer at the pile-soil interface. Based on effective influence area theory, a simple semiempirical mathematical model for predicting the group effect of a pile group under dragload is described. The accuracy and reliability of the mathematical models built in this paper were verified by comparative analysis against practical engineering. Case studies were examined, and the predicted values were found to be in good agreement with the measured values. The influence factors, such as the soil consolidation degree, the initial volume compressibility coefficient, and the stiffness of the bearing soil, were then analyzed and discussed. The results show that the mathematical models considering nonlinear soil consolidation and the group effect can reflect the practical NSF of pile groups effectively and accurately. The results of this paper can provide a reference for the design and calculation of practical pile groups embedded in consolidating soil under NSF.
30

Twarock, Reidun. "Mathematical virology: a novel approach to the structure and assembly of viruses." Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 364, no. 1849 (October 19, 2006): 3357–73. http://dx.doi.org/10.1098/rsta.2006.1900.

Abstract:
Understanding the structure and life cycle of viruses is a fascinating challenge with a crucial impact on the public health sector. In the early 1960s, Caspar & Klug (Caspar & Klug 1962 Cold Spring Harbor Symp. Quant. Biol . 27 , 1–24) established a theory for the prediction of the surface structures of the protein shells, called viral capsids, which encapsulate and hence provide protection for the viral genome. It is of fundamental importance in virology, with a broad spectrum of applications ranging from the image analysis and classification of experimental data to the construction of assembly models. However, experimental results have provided evidence for the fact that it is incomplete and, in particular, cannot account for the structures of Papovaviridae , which are of particular interest because they contain cancer-causing viruses. This gap has recently been closed by the viral tiling theory, which describes the locations of the protein subunits and inter-subunit bonds in viral capsids based on mathematical tools from the area of quasicrystals. The predictions and various recent applications of the new theory are presented, and it is discussed how further research along these lines may lead to new insights in virology and the design of anti-viral therapeutics.
31

Liao, David, and Thea D. Tlsty. "Evolutionary game theory for physical and biological scientists. II. Population dynamics equations can be associated with interpretations." Interface Focus 4, no. 4 (August 6, 2014): 20140038. http://dx.doi.org/10.1098/rsfs.2014.0038.

Abstract:
The use of mathematical equations to analyse population dynamics measurements is being increasingly applied to elucidate complex dynamic processes in biological systems, including cancer. Purely ‘empirical’ equations may provide sufficient accuracy to support predictions and therapy design. Nevertheless, interpretation of fitting equations in terms of physical and biological propositions can provide additional insights that can be used both to refine models that prove inconsistent with data and to understand the scope of applicability of models that validate. The purpose of this tutorial is to assist readers in mathematically associating interpretations with equations and to provide guidance in choosing interpretations and experimental systems to investigate based on currently available biological knowledge, techniques in mathematical and computational analysis and methods for in vitro and in vivo experiments.
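A compact example of associating a population-dynamics equation with an interpretation is the replicator equation, dx_i/dt = x_i((Ax)_i - x.Ax), where each strategy grows according to its fitness relative to the population mean. A sketch with an illustrative two-strategy payoff matrix:

```python
import numpy as np
from scipy.integrate import solve_ivp

A = np.array([[1.0, 3.0],    # illustrative payoff matrix: entry (i, j) is the
              [2.0, 1.0]])   # payoff of strategy i against strategy j

def replicator(t, x):
    """dx_i/dt = x_i * ((A x)_i - x.A.x): growth by fitness relative to the mean."""
    fitness = A @ x
    return x * (fitness - x @ fitness)

sol = solve_ivp(replicator, (0, 30), [0.9, 0.1])
print(sol.y[:, -1])  # this anti-coordination game converges to (2/3, 1/3)
```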
32

Costa, Paulo C. S., Joel S. Evangelista, Igor Leal, and Paulo C. M. L. Miranda. "Chemical Graph Theory for Property Modeling in QSAR and QSPR—Charming QSAR & QSPR." Mathematics 9, no. 1 (December 29, 2020): 60. http://dx.doi.org/10.3390/math9010060.

Abstract:
Quantitative structure-activity relationship (QSAR) and Quantitative structure-property relationship (QSPR) are mathematical models for the prediction of the chemical, physical or biological properties of chemical compounds. Usually, they are based on structural (grounded on fragment contribution) or calculated (centered on QSAR three-dimensional (QSAR-3D) or chemical descriptors) parameters. Hereby, we describe a Graph Theory approach for generating and mining molecular fragments to be used in QSAR or QSPR modeling based exclusively on fragment contributions. Merging of Molecular Graph Theory, Simplified Molecular Input Line Entry Specification (SMILES) notation, and the connection table data allows a precise way to differentiate and count the molecular fragments. Machine learning strategies generated models with outstanding root mean square error (RMSE) and R2 values. We also present the software Charming QSAR & QSPR, written in Python, for the property prediction of chemical compounds while using this approach.
33

Lima, Enzo Lenine. "Models, explanation, and the pitfalls of empirical testing." Estudos Internacionais: revista de relações internacionais da PUC Minas 6, no. 3 (December 19, 2018): 82–97. http://dx.doi.org/10.5752/p.2317-773x.2018v6n3p82.

Abstract:
Formal models constitute an essential part of contemporary political science and International Relations. Their recent history is tightly tied to the developments of Rational Choice Theory, which is considered to be the only deductive theory in the social sciences. This unique character, especially its manifestation through mathematical symbolisms, has caused profound schisms and criticisms in the discipline. Formal models have constantly been accused of being built on unrealistic assumptions of human behaviour and social structure, rendering as a result either trivial predictions or no empirical prediction at all. Nevertheless, these criticisms frequently ignore essential elements of the concept of explanation and how it is applicable to formal modelling. In this paper, I provide an approach to mathematical modelling that considers the challenges of designing and performing empirical tests of predictions generated by formal models. Rather than disqualifying or falsifying models, empirical tests are paramount to the tailoring of more grounded explanations of political phenomena and should be seen as a tool to enhance modelling. In this sense, I scrutinise two examples of formal modelling in IR, and derive lessons for the empirical testing of models in the discipline.
34

LEI, G., P. C. DONG, S. Y. MO, S. H. GAI, and Z. S. WU. "A NOVEL FRACTAL MODEL FOR TWO-PHASE RELATIVE PERMEABILITY IN POROUS MEDIA." Fractals 23, no. 02 (May 28, 2015): 1550017. http://dx.doi.org/10.1142/s0218348x15500176.

Abstract:
Multiphase flow in porous media is very important in various scientific and engineering fields. It has been shown that relative permeability plays an important role in the determination of flow characteristics for multiphase flow. The accurate prediction of multiphase flow in porous media is hence highly important. In this work, a novel predictive model for relative permeability in porous media is developed based on fractal theory. The predictions of two-phase relative permeability by the current mathematical models have been validated by comparison with available experimental data. The predictions by the proposed model show the same variation trend as the available experimental data and are in good agreement with the existing experiments. Every parameter in the proposed model has a clear physical meaning. The proposed relative permeability is expressed as a function of the immobile liquid film thickness, pore structural parameters (pore fractal dimension Df and tortuosity fractal dimension DT) and the fluid viscosity ratio. The effects of these parameters on the relative permeability of porous media are discussed in detail.
35

Khosravani, Ali, Jacky M. A. Desbiolles, Chris Saunders, Mustafa Ucgul, and John M. Fielke. "Prediction of Single Disc Seeding System Forces, Using a Semi-Analytical and Discrete Element Method (DEM) Considering Rotation Effects." Agriculture 13, no. 1 (January 13, 2023): 202. http://dx.doi.org/10.3390/agriculture13010202.

Abstract:
Disc seeders are commonly used in no-till farming systems, and their performance evaluations generally rely on expensive and time-consuming field experiments. Mathematical models can help speed up force-related evaluations and improve the understanding of soil-disc interactions, to assist the performance optimisation process. Previous analytical force prediction models of disc blades have not accounted for the free rotation of the disc blade. This paper develops an analytical force prediction model from the wide blade failure theory, adapted to suit rotating flat disc blades operating at different sweep and tilt angles, and compares its predictions with Discrete Element Method (DEM) simulations. To validate the two models, experiments were performed in a remoulded sandy soil condition using a rotating flat disc set at two tilt angles (0° and 20°) and four sweep angles (6°, 26°, 45° and 90°), and the three-dimensional force components of draught, vertical and side forces were measured. Results showed that a higher coefficient of determination (R2 = 0.95) was obtained with the analytical model predictions than with the DEM predictions (R2 = 0.85) in their agreement with the test results. It was found that both the developed analytical approach and the DEM model can be used to predict tillage forces acting on a rotating flat disc blade at different sweep and tilt angles.
APA, Harvard, Vancouver, ISO, and other styles
36

Whiteaker, Brian, and Peter Gerstoft. "Reducing echo state network size with controllability matrices." Chaos: An Interdisciplinary Journal of Nonlinear Science 32, no. 7 (July 2022): 073116. http://dx.doi.org/10.1063/5.0071926.

Full text
Abstract:
Echo state networks are a fast-training variant of recurrent neural networks excelling at approximating nonlinear dynamical systems and time series prediction. These machine learning models act as nonlinear fading memory filters. While these models benefit from quick training and low complexity, the computation demanded by a large reservoir matrix is a bottleneck. Using control theory, a reduced-size replacement reservoir matrix is found. Starting from a large, task-effective reservoir matrix, we form a controllability matrix whose rank indicates the active sub-manifold and the candidate replacement reservoir size. The resulting speed-ups and reduced memory usage come with minimal error increase in chaotic climate reconstruction or short-term prediction. Experiments are performed on simple time series signals and on the Lorenz-1963 and Mackey–Glass complex chaotic signals. Observation of low-error models shows variation of the active rank and memory along a sequence of predictions.
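A minimal sketch of the controllability-matrix test, under standard linear-control assumptions: for a reservoir matrix A (n × n) and input weights B (n × m), stack [B, AB, A²B, ...] and take its rank as the candidate size of the reduced reservoir. The random matrices below stand in for a trained echo state network.

```python
import numpy as np

def controllability_rank(A, B, tol=1e-10):
    """Rank of the controllability matrix [B, AB, ..., A^(n-1) B]."""
    n = A.shape[0]
    blocks, Ak_B = [B], B
    for _ in range(n - 1):
        Ak_B = A @ Ak_B
        blocks.append(Ak_B)
    C = np.hstack(blocks)                 # n x (n*m) controllability matrix
    return np.linalg.matrix_rank(C, tol=tol)

rng = np.random.default_rng(0)
n = 50
A = rng.normal(scale=1.0 / np.sqrt(n), size=(n, n))  # stand-in reservoir matrix
B = rng.normal(size=(n, 1))                          # stand-in input weights
print("candidate replacement reservoir size:", controllability_rank(A, B))
```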
APA, Harvard, Vancouver, ISO, and other styles
37

Poroseva, Svetlana V., Nathan Lay, and M. Yousuff Hussaini. "Multimodel Approach Based on Evidence Theory for Forecasting Tropical Cyclone Tracks." Monthly Weather Review 138, no. 2 (February 1, 2010): 405–20. http://dx.doi.org/10.1175/2009mwr2733.1.

Full text
Abstract:
In this paper a new multimodel approach for forecasting tropical cyclone tracks is presented. The approach is based on the Dempster–Shafer theory of evidence. At each forecast period, the multimodel forecast is given as an area where the tropical cyclone position is likely to occur. Each area includes a quantitative assessment of the credibility (degree of belief) of the prediction. The multimodel forecast is obtained by combining individual model forecasts into a single prediction by Dempster's rule. Mathematical requirements associated with Dempster's rule are discussed. Particular attention is given to the requirement of independence of evidence sources, which, for tropical cyclone track forecasting, are the model and best-track data. The origin of this requirement is explored, and it is shown that for forecasting tropical cyclone tracks this requirement is excessive. The influence of the number of models included in the multimodel approach on forecasting ability is also studied. Data produced by the models of the Navy Operational Global Atmospheric Prediction System, the European Centre for Medium-Range Weather Forecasts, and the National Centers for Environmental Prediction are used to produce two-, three-, and four-model forecasts. The forecasting ability of the multimodel approach is evaluated using the best-track database of the tropical cyclones that occurred in the eastern and western North Pacific and South Indian Ocean basins in the year 2000.
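Dempster's rule itself is compact enough to sketch. Below, focal elements are frozensets of candidate track areas; the labels and mass values are illustrative, not the paper's forecast regions.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic mass assignments by Dempster's rule of combination."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:                          # agreeing evidence reinforces
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:                              # disjoint evidence counts as conflict
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# two single-model "forecasts": mass on areas A, B, and their union
m_model1 = {frozenset("A"): 0.6, frozenset("AB"): 0.4}
m_model2 = {frozenset("B"): 0.3, frozenset("AB"): 0.7}
print(dempster_combine(m_model1, m_model2))
```

The normalisation by (1 − conflict) is exactly the step whose independence requirement the paper argues is excessive for track forecasting.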
APA, Harvard, Vancouver, ISO, and other styles
38

Zimin, M., Yu. Vokhmina, T. Gavrilenko, and Yuriy Popov. "Mathematical basis of the global instability of biosytems." Complexity. Mind. Postnonclassic 3, no. 1 (April 7, 2014): 49–62. http://dx.doi.org/10.12737/3397.

Full text
Abstract:
The article presents features of the modelling of complex systems (complexity). Specific systems of the third type show no repetition of distribution functions (or their measures), no repetition of initial states (whereas probability theory requires initial states to be repeated, and experiments to be repeated many times), and no certainty in predicting the future state of a complex system; that is, the system is constantly changing and unpredictable. The question arises: how should we work with such systems, and how can we describe and predict them? Deterministic and stochastic approaches are useless here. I.R. Prigogine considered such unique systems to lie outside the scope of science. It is very important that the chaos demonstrated by the regulation of tremor, tapping, heart rate, myograms and electroencephalograms does not fit the Lorenz attractor or the model chaos of chaos theory, because there are no fixed initial states, and the parameters of the quasi-attractors of the same subject (a human) are not repeatable. In models of traditional chaos these parameters are repeatable, the Lyapunov exponents can be calculated, and the autocorrelation functions coincide. In our case everything is the opposite: the autocorrelation functions are unstable, the Lyapunov exponents cannot be calculated, and there are no models. The authors suggest calculating the volumes of quasi-attractors.
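The proposed quantity, the quasi-attractor volume, admits a simple reading: the product of the observed ranges of each phase coordinate of the embedded signal. A minimal sketch follows; the random-walk signal is a placeholder for a real tremor or heart-rate recording.

```python
import numpy as np

def quasi_attractor_volume(X):
    """X: (n_samples, n_dims) trajectory in phase space.
    Volume of the bounding box spanned by the observed trajectory."""
    ranges = X.max(axis=0) - X.min(axis=0)
    return float(np.prod(ranges))

rng = np.random.default_rng(1)
signal = np.cumsum(rng.normal(size=1000))        # placeholder biosignal
# 2-D phase plane: the signal against its first difference
X = np.column_stack([signal[:-1], np.diff(signal)])
print("quasi-attractor volume:", quasi_attractor_volume(X))
```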
APA, Harvard, Vancouver, ISO, and other styles
39

Abu El-Maaty, Ahmed Ebrahim, Amr M. El-Kholy, and Ahmed Yousry Akal. "Modeling schedule overrun and cost escalation percentages of highway projects using fuzzy approach." Engineering, Construction and Architectural Management 24, no. 5 (September 18, 2017): 809–27. http://dx.doi.org/10.1108/ecam-03-2016-0084.

Full text
Abstract:
Purpose: Modeling represents the art of translating problems from an application area into tractable mathematical formulations whose theoretical and numerical analysis provides insight, answers and guidance useful for the originating application. The purpose of this paper is to determine the causal factors of schedule overrun and cost escalation in highway projects in Egypt, to be used as independent variables in mathematical models for predicting the percentages of schedule overrun and cost escalation of such projects. Design/methodology/approach: A survey of randomly selected samples yielded responses from 40 owners, 15 consultants and 56 contractors. The survey covers 38 schedule overrun factors and 26 cost escalation factors. The effectiveness degree of the identified factors was determined using the triangular fuzzy approach. Findings: The results of the survey show that "contractor's technical staff is insufficient and ineligible to accomplish the project" is the most important cause of schedule overrun, while the major cause of cost escalation is inadequate preparation of the project with respect to planning and execution. Originality/value: The main contribution of this study is predicting the percentages of schedule overrun and cost escalation of highway projects in Egypt. Through the application of linear regression analysis and statistical fuzzy theory, four predictive models have been developed, and it was found that the linear regression-based models show better prediction accuracy than the statistical fuzzy-based models in predicting percentages of schedule overrun and cost escalation.
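A minimal sketch of scoring a factor's effectiveness degree with triangular fuzzy numbers, a common device in construction-management surveys; the 5-point linguistic scale and the defuzzification rule below are assumed conventions, not necessarily the authors' exact choices.

```python
import numpy as np

TFN = {  # linguistic response -> triangular fuzzy number (low, mode, high)
    "very low":  (0.0, 0.0, 0.25),
    "low":       (0.0, 0.25, 0.5),
    "medium":    (0.25, 0.5, 0.75),
    "high":      (0.5, 0.75, 1.0),
    "very high": (0.75, 1.0, 1.0),
}

def effectiveness_degree(responses):
    """Average the respondents' fuzzy numbers, then defuzzify."""
    tfns = np.array([TFN[r] for r in responses])
    l, m, u = tfns.mean(axis=0)
    return (l + 2 * m + u) / 4.0   # weighted-mean defuzzification

survey = ["high", "very high", "medium", "high", "very high"]
print(f"effectiveness degree: {effectiveness_degree(survey):.3f}")
```

Factors ranked this way then serve as the independent variables of the regression models the abstract describes.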
APA, Harvard, Vancouver, ISO, and other styles
40

Meng, Fandi, Ying Liu, Li Liu, Ying Li, and Fuhui Wang. "Studies on Mathematical Models of Wet Adhesion and Lifetime Prediction of Organic Coating/Steel by Grey System Theory." Materials 10, no. 7 (June 28, 2017): 715. http://dx.doi.org/10.3390/ma10070715.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Zhang, Zheming. "Prediction of Economic Operation Index Based on Support Vector Machine." Mobile Information Systems 2022 (August 2, 2022): 1–11. http://dx.doi.org/10.1155/2022/3232271.

Full text
Abstract:
Economic forecasting is not only an important field of economic research but also attracts extensive public attention. The results of such studies bear directly on how accurately the economic situation is understood, which in turn affects the rational formulation of macroeconomic policies. Traditional estimation methods, however, are often limited by expert experience and simple mathematical models, handle nonlinear models poorly, and do not meet the objective requirements for predicting macroeconomic performance indicators. The support vector machine (SVM) is a popular data mining technique; benefiting from a solid theoretical foundation and good generalization performance, it has become a focus of research in recent years. Combining support vector machine research, fuzzy theory and macroeconomic performance estimation, this paper develops a method for predicting macroeconomic performance indicators based on support vector machines and extends the theory and applications of SVMs. In addition, empirical evaluations of earlier financial forecasts are conducted to connect theory with practical data. The characteristics of the traditional evaluation system are discussed in detail, and a standard model of the prediction system is established, covering classical and modern forecasting theories and their forecasting systems. Statistical learning theory and the theory and basic features of support vector machines are also discussed. For the analysis of the internal relationship between the distribution pattern, the SVM and the forecast of macroeconomic performance indicators, economic activity forecasting can be treated as a distribution system. Combining the SVM with economic forecasting, SVM-based intelligent forecasting of the economic operation index is used for the first time, automatic selection of the model parameters is realized, and several algorithms and in-depth analysis methods are provided. The experimental results show that prediction of economic operation indicators based on support vector machine technology is 57% faster than the traditional prediction method, with a prediction accuracy as high as 98.56%.
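A minimal sketch of support-vector regression applied to an economic index, in the spirit of the approach above; the synthetic trend-plus-cycle series and the lag framing are placeholders for real macroeconomic data, not the paper's setup.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
t = np.arange(120, dtype=float)
# placeholder monthly index: trend + cycle + noise
index = 100 + 0.5 * t + 5 * np.sin(t / 6) + rng.normal(0, 1, t.size)

# supervised framing: predict the next value from the last 4 values
lags = 4
X = np.column_stack([index[i:len(index) - lags + i] for i in range(lags)])
y = index[lags:]

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(X[:-12], y[:-12])                 # hold out the last 12 months
print("held-out predictions:", model.predict(X[-12:]).round(1))
```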
APA, Harvard, Vancouver, ISO, and other styles
42

McManus, Michael L., Michael C. Long, Abbot Cooper, and Eugene Litvak. "Queuing Theory Accurately Models the Need for Critical Care Resources." Anesthesiology 100, no. 5 (May 1, 2004): 1271–76. http://dx.doi.org/10.1097/00000542-200405000-00032.

Full text
Abstract:
Background: Allocation of scarce resources presents an increasing challenge to hospital administrators and health policy makers. Intensive care units can present bottlenecks within busy hospitals, but their expansion is costly and difficult to gauge. Although mathematical tools have been suggested for determining the proper number of intensive care beds necessary to serve a given demand, the performance of such models has not been prospectively evaluated over significant periods. Methods: The authors prospectively collected 2 years' admission, discharge, and turn-away data in a busy, urban intensive care unit. Using queuing theory, they then constructed a mathematical model of patient flow, compared predictions from the model to observed performance of the unit, and explored the sensitivity of the model to changes in unit size. Results: The queuing model proved to be very accurate, with predicted admission turn-away rates correlating highly with those actually observed (correlation coefficient = 0.89). The model was useful in predicting both monthly responsiveness to changing demand (mean monthly difference between observed and predicted values, 0.4 ± 2.3%; range, 0-13%) and the overall 2-yr turn-away rate for the unit (21% vs. 22%). Both in practice and in simulation, turn-away rates increased exponentially when utilization exceeded 80-85%. Sensitivity analysis using the model revealed rapid and severe degradation of system performance with even the small changes in bed availability that might result from sudden staffing shortages or admission of patients with very long stays. Conclusions: The stochastic nature of patient flow may falsely lead health planners to underestimate resource needs in busy intensive care units. Although the nature of arrivals for intensive care deserves further study, when demand is random, queuing theory provides an accurate means of determining the appropriate supply of beds.
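The textbook queuing model for this setting is M/M/c/c (Erlang loss): c staffed beds, no waiting room, blocked arrivals turned away. A minimal sketch follows; the arrival rate and mean stay are illustrative numbers, not the study's data.

```python
from math import factorial

def erlang_b(offered_load, c):
    """Blocking (turn-away) probability for offered load a = lambda/mu, c beds."""
    a = offered_load
    numer = a ** c / factorial(c)
    denom = sum(a ** k / factorial(k) for k in range(c + 1))
    return numer / denom

arrivals_per_day = 3.2        # hypothetical lambda
mean_stay_days = 4.0          # hypothetical 1/mu
a = arrivals_per_day * mean_stay_days
for beds in (14, 16, 18, 20):
    p_block = erlang_b(a, beds)
    print(f"{beds} beds: utilization ~ {a * (1 - p_block) / beds:.0%}, "
          f"turn-away rate ~ {p_block:.1%}")
```

Running the loop shows the behaviour the study observed: turn-away rates climb steeply once utilization passes roughly 80-85%.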
APA, Harvard, Vancouver, ISO, and other styles
43

Dibdin, George H. "Mathematical Modeling of Biofilms." Advances in Dental Research 11, no. 1 (April 1997): 127–32. http://dx.doi.org/10.1177/08959374970110010301.

Full text
Abstract:
A set of mathematical equations constitutes a mathematical model if it aims to represent a real system and is based on some theory of that system's operation. By this definition, mathematical models, some very simple, are everywhere in science. A complex system like a biofilm requires modeling by numerical methods and, because of inevitable uncertainties in its theoretical basis, may not be able to make precise predictions. Nevertheless, such models almost always give new insight into the mechanisms involved and stimulate further investigation. The way in which diffusion coefficients are measured for use in a model, particularly whether they include the effects of reversible reaction, is a key element of the modeling. Reasons are given for separating diffusion from reversible-reaction effects and dealing with them in a separate subroutine of the model.
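The separation argued for above can be sketched as an explicit 1-D diffusion step with reversible (instantaneous, linear) binding handled as a separate retardation step. All parameter values below are illustrative assumptions, not measured biofilm data.

```python
import numpy as np

D = 5e-10          # free diffusion coefficient, m^2/s (assumed)
K = 2.0            # linear binding constant: bound = K * free (assumed)
L, n = 100e-6, 50  # biofilm thickness (m) and grid points
dx = L / (n - 1)
dt = 0.4 * dx**2 / D           # time step within explicit-stability limit
c = np.zeros(n); c[0] = 1.0    # fixed concentration at the outer face

def step(c):
    c_new = c.copy()
    # diffusion subroutine: FTCS scheme on interior nodes
    c_new[1:-1] += D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
    c_new[-1] = c_new[-2]      # no-flux boundary at the base of the biofilm
    # reversible-reaction subroutine: linear binding retards the change,
    # equivalent to an effective diffusivity D / (1 + K)
    return c + (c_new - c) / (1 + K)

for _ in range(20000):
    c = step(c)
print(c.round(3))              # quasi-steady concentration profile
```

Keeping the reaction in its own step, as the paper recommends, makes it easy to swap in a different (e.g. nonlinear) binding model without touching the diffusion code.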
APA, Harvard, Vancouver, ISO, and other styles
44

Olive, David J. "Prediction intervals for regression models." Computational Statistics & Data Analysis 51, no. 6 (March 2007): 3115–22. http://dx.doi.org/10.1016/j.csda.2006.02.006.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Zaitsev, V. V., Val. V. Zaitsev, D. V. Zaitsev, and V. V. Lukashova. "Prediction of dynamics and seaworthiness of amphibious hovercraft." Automation of ship technical facilities 27, no. 1 (November 25, 2021): 22–32. http://dx.doi.org/10.31653/1819-3293-2021-1-27-22-32.

Full text
Abstract:
The creation of hovercraft projects is a very expensive procedure, and the emergence of new ships of this type, and their evolution, proceeds at a very slow pace. Although hovercraft have been in use for decades and a large body of literature covers the theory of such ships and the practice of their design, a number of problems associated with these ships remain unresolved. Various methods, theories and publications make it possible to calculate and design the complexes and systems of hovercraft, and as a result to create vessels that are successful in operation and have good seaworthiness. Nevertheless, the main outstanding problem is achieving sufficient accuracy in predicting the operational characteristics of hovercraft under various modes. It is therefore not uncommon for a successful hovercraft design to emerge only after a series of unsuccessful trials and errors. The present work describes a methodology for creating simulations of hovercraft for the various modes of their operation. Different types of simulations of such ships, and ways of implementing them in working software, are considered. The main mathematical models involved in such simulations are described, and their structure is shown for various hovercraft. The described complex of mathematical models, embodied in the program, allows in-depth analysis of the dynamics and operational characteristics of hovercraft before their design is complete and before construction begins. It is argued that the most difficult simulation, but the one suited to real practical problems of analysing hovercraft dynamics, is a simulation of the ship with six degrees of freedom. The study leads to the conclusion that the described methodology for creating a complex of mathematical models makes it possible to predict hovercraft dynamics with high accuracy and to simulate any hovercraft with six degrees of freedom, reducing the cost of designing and building a lead hovercraft with accurately predictable seaworthiness and operational characteristics.
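A minimal skeleton of the six-degree-of-freedom rigid-body integrator such a simulation needs: the state holds position, Euler angles, and body-frame linear and angular velocities. The force model here is a toy placeholder (an exponential cushion-lift plus damping), not a real hovercraft model; mass, inertia and damping values are invented.

```python
import numpy as np

m, g = 20000.0, 9.81                        # mass (kg) and gravity, assumed
I = np.diag([4.0e5, 9.0e5, 1.1e6])          # inertia tensor (kg m^2), assumed

def forces_and_moments(state):
    z = state[2]
    lift = m * g * np.exp(-z / 0.5)         # toy cushion force vs. hover height
    F = np.array([0.0, 0.0, lift - m * g]) - 2.0e5 * state[6:9]  # plus damping
    M = -5.0e4 * state[9:12]                # toy rotational damping
    return F, M

def step(state, dt=0.01):
    F, M = forces_and_moments(state)
    v, w = state[6:9], state[9:12]
    acc = F / m                              # translational dynamics
    ang_acc = np.linalg.solve(I, M - np.cross(w, I @ w))  # Euler's equations
    new = state.copy()
    new[0:3] += v * dt                       # (small-angle: body ~ world frame)
    new[3:6] += w * dt
    new[6:9] += acc * dt
    new[9:12] += ang_acc * dt
    return new

state = np.zeros(12); state[2] = 0.3         # start 0.3 m above equilibrium hover
for _ in range(500):
    state = step(state)
print("height above equilibrium after 5 s:", round(state[2], 3), "m")
```

A production simulation would replace the toy forces with the cushion, skirt, aerodynamic and wave models the paper describes, and the forward-Euler step with a higher-order integrator.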
APA, Harvard, Vancouver, ISO, and other styles
46

Bogdanov, A. I., B. S. Mongush, V. A. Kuzmin, D. A. Orekhov, G. S. Nikitin, A. N. Baryshev, A. B. Aidiev, and E. A. Gulyukin. "Analysis of models of the mathematical theory of epidemics and recommendations on the use of deterministic and stochastic models." Legal regulation in veterinary medicine, no. 4 (January 4, 2023): 37–42. http://dx.doi.org/10.52419/issn2782-6252.2022.4.37.

Full text
Abstract:
A model of the epidemic/epizootic process should, in principle, reflect the interaction of all its components: the source of the infectious agent, the mechanism of its transmission and the susceptible individuals. The purpose of the article is to analyse models of the mathematical theory of epidemics (MTE) and provide recommendations on the areas of use of deterministic and stochastic MTE models. Depending on the research objectives, relatively simple deterministic or stochastic models, or more complex computer simulation models, are used. Since each stochastic model of the mathematical theory of epidemics/epizootics has a deterministic counterpart, it is of interest to analyse the errors that arise when the stochastic essence of the epidemic/epizootic process is ignored and deterministic models are used in various situations. As an example for error analysis, a widely used general stochastic model was chosen, whose deterministic analogue is the Kermack–McKendrick model (W.O. Kermack and A.G. McKendrick) [6]. The article discusses the principles of constructing deterministic and stochastic models of the mathematical theory of epidemics/epizootics (MTE). A comparative study of the results of using deterministic and stochastic models is carried out by simulation modelling. Recommendations on the areas of application of deterministic and stochastic models are given. The studies show that the choice between deterministic and stochastic models is determined by the population size, the stage of epidemic development, the set of parameters and the required accuracy of the mathematical modelling. It is concluded that mathematical modelling systems are designed to obtain a quantitative forecast of the development of the epidemic/epizootic process in order to assess the effectiveness of anti-epidemic/anti-epizootic measures, analyse risk and estimate possible economic damage. The possibilities of predictive and retrospective modelling of the spread of infectious diseases are shown.
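The deterministic/stochastic pairing the article analyses can be sketched directly: the Kermack–McKendrick SIR equations against their stochastic counterpart, a Gillespie simulation. The rates beta and gamma and the population size are illustrative values.

```python
import numpy as np

beta, gamma = 0.3, 0.1         # transmission and recovery rates, assumed
N, I0, T = 1000, 5, 160.0      # population, initial infected, horizon (days)

def sir_deterministic(dt=0.01):
    """Euler integration of the Kermack-McKendrick SIR equations."""
    S, I, t = N - I0, I0, 0.0
    while t < T:
        S, I = S - beta * S * I / N * dt, I + (beta * S * I / N - gamma * I) * dt
        t += dt
    return N - S - I           # final epidemic size (recovered)

def sir_gillespie(rng):
    """Event-by-event stochastic SIR (Gillespie algorithm)."""
    S, I, t = N - I0, I0, 0.0
    while I > 0 and t < T:
        rate_inf, rate_rec = beta * S * I / N, gamma * I
        total = rate_inf + rate_rec
        t += rng.exponential(1.0 / total)          # time to next event
        if rng.random() < rate_inf / total:
            S, I = S - 1, I + 1                    # infection
        else:
            I -= 1                                 # recovery
    return N - S - I

rng = np.random.default_rng(2)
runs = [sir_gillespie(rng) for _ in range(200)]
print("deterministic final size:", round(sir_deterministic()))
print("stochastic mean +/- sd:", round(np.mean(runs)), round(np.std(runs)))
```

The spread of the stochastic runs around the deterministic value illustrates the errors the article attributes to ignoring stochasticity, which grow as the population or the initial number of infected shrinks.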
APA, Harvard, Vancouver, ISO, and other styles
47

Hussain, Nur Huzeima Mohd, et al. "Machine Learning of the Reverse Migration Models for Population Prediction: A Review." Turkish Journal of Computer and Mathematics Education (TURCOMAT) 12, no. 5 (April 10, 2021): 1830–38. http://dx.doi.org/10.17762/turcomat.v12i5.2197.

Full text
Abstract:
Human migration from rural to urban areas has historically been prominent in the urbanisation process, which is associated with the economic development that drives city growth. However, the dwindling supply of natural resources and pressure from the pandemic have threatened economic growth and produced changes in human migration, now from urban to rural areas. This anecdotal evidence of reverse migration needs to be examined and predicted in relation to the challenges and expansion of sustainable development. Predictions of human migration, related to population size and growth, are important for policy on strategy, planning and industry. Moreover, predicting population mobility makes it possible to sense the direction of migratory flows in advance and to take effective preventive measures, such as crowd evacuation during epidemic disease. However, migration predictions are notoriously error-prone, time-consuming, complex and challenging. Therefore, in line with IR 4.0, this study adopts a machine learning approach, which can predict data intelligently across a broad dataset, as a significant way to minimise prediction errors. This paper presents an investigation of the significant machine learning models for developing reverse migration prediction; the aim of this study is thus to identify machine learning models for reverse migration through systematic literature review (SLR) screening. As SLR is recognised to provide a reliable review, this paper weighs reviews from both Scopus and Google Scholar in determining the signature algorithms for the models. The findings highlight decision trees, random forests and linear regression as the proposed algorithms for developing the machine learning models for reverse migration in Malaysia.
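A minimal sketch of the model family the review converges on, applied to a synthetic reverse-migration dataset; the features and coefficients below are invented placeholders, not Malaysian migration statistics.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(3)
n = 500
X = np.column_stack([
    rng.uniform(0, 1, n),      # rural job-availability index (hypothetical)
    rng.uniform(0, 1, n),      # urban cost-of-living index (hypothetical)
    rng.uniform(0, 1, n),      # pandemic-pressure index (hypothetical)
])
# synthetic target: migrants moving urban -> rural per district
y = 200 * X[:, 1] + 150 * X[:, 2] - 100 * X[:, 0] + rng.normal(0, 10, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
for name, model in [("linear regression", LinearRegression()),
                    ("random forest", RandomForestRegressor(random_state=0))]:
    model.fit(X_tr, y_tr)
    print(f"{name}: MAE = {mean_absolute_error(y_te, model.predict(X_te)):.1f}")
```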
APA, Harvard, Vancouver, ISO, and other styles
48

Li, Cong, Yaonan Zhang, and Xupeng Ren. "Modeling Hourly Soil Temperature Using Deep BiLSTM Neural Network." Algorithms 13, no. 7 (July 17, 2020): 173. http://dx.doi.org/10.3390/a13070173.

Full text
Abstract:
Soil temperature (ST) plays a key role in the processes and functions of almost all ecosystems and is an essential parameter for applications such as agricultural production and geothermal development and utilization. Although numerous machine learning models have been used to predict ST, with good results, most current studies have focused on daily or monthly ST predictions, while hourly ST predictions are scarce. This paper presents a novel scheme for forecasting hourly ST using weather forecast data. The method treats the hourly ST prediction as the superposition of two parts: the daily average ST prediction and the ST amplitude (the difference between the hourly ST and the daily average ST) prediction. Based on correlation analysis, nine meteorological parameters combined with two temporal parameters were selected as the input vector for predicting the daily average ST; for predicting the ST amplitude, seven meteorological parameters and one temporal parameter were selected. Two submodels were constructed using a deep bidirectional long short-term memory network (BiLSTM). For hourly ST prediction at five soil depths at 30 sites, located in five common climates of the United States, the proposed method performed best at all depths at 30 stations (100%) for root mean square error (RMSE), 27 stations (90%) for mean absolute error (MAE) and 30 stations (100%) for the coefficient of determination (R²). Moreover, the method displays a stronger ST prediction ability than traditional methods under all climate types involved in the experiment; the hourly ST it produces can be used as a driving parameter for high-resolution biogeochemical, land surface and hydrological models and can inform the analysis of other time series data.
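A minimal sketch of a deep bidirectional LSTM regressor of the kind described; the input shape, layer sizes and synthetic data are assumptions, not the paper's configuration (which uses nine weather parameters plus temporal features and two such submodels).

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

timesteps, n_features = 24, 8   # assumed: one day of hourly weather inputs
model = keras.Sequential([
    layers.Input(shape=(timesteps, n_features)),
    layers.Bidirectional(layers.LSTM(64, return_sequences=True)),
    layers.Bidirectional(layers.LSTM(32)),
    layers.Dense(1),            # predicted soil temperature (or amplitude)
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])

# placeholder training data: 200 sequences of past weather -> next-hour ST
X = np.random.rand(200, timesteps, n_features).astype("float32")
y = np.random.rand(200, 1).astype("float32")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
print(model.predict(X[:3], verbose=0).ravel())
```

In the paper's scheme one such submodel would target the daily average ST and a second the hourly amplitude, with the two outputs summed to give the hourly forecast.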
APA, Harvard, Vancouver, ISO, and other styles
49

Viswanathan, S., et al. "Study of Students' Performance Prediction Models Using Machine Learning." Turkish Journal of Computer and Mathematics Education (TURCOMAT) 12, no. 2 (April 11, 2021): 3085–91. http://dx.doi.org/10.17762/turcomat.v12i2.2351.

Full text
Abstract:
Educational Data Mining (EDM) is a novel field concerned with developing methods for exploring the specific types of data produced by educational settings and using those approaches to better understand students and the environments in which they learn. Prediction attempts to model trends that allow results or learning outcomes to be forecast from available data. Predicting student success has become an appealing challenge for researchers, who develop understandable and efficient models using supervised and unsupervised EDM techniques; this assists decision-makers in improving student performance. The task of deciding on the best model has led to the emergence of various techniques from both categories. This paper discusses the research models used to address student success prediction with educational data mining. Its primary purpose is to explain the methodology for implementing the proposed solution for student performance prediction and to present the findings of a study evaluating the performance of various data mining classification algorithms on the given dataset, in order to assess their potential usefulness for achieving the stated goal and objectives.
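A minimal sketch of the kind of classifier comparison the paper describes; the synthetic dataset stands in for real student records (grades, attendance, and so on), and the three classifiers are common EDM choices, not necessarily the paper's exact set.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

# placeholder dataset: 400 students, 10 features, pass/fail labels
X, y = make_classification(n_samples=400, n_features=10, n_informative=5,
                           random_state=0)
for name, clf in [("decision tree", DecisionTreeClassifier(random_state=0)),
                  ("naive Bayes", GaussianNB()),
                  ("k-NN", KNeighborsClassifier())]:
    scores = cross_val_score(clf, X, y, cv=5)   # 5-fold cross-validation
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```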
APA, Harvard, Vancouver, ISO, and other styles
50

Rahimbakhsh, Aria, Morteza Sabeti, and Farshid Torabi. "An Improved Mathematical Model for Accurate Prediction of the Heavy Oil Production Rate during the SAGD Process." Processes 8, no. 2 (February 19, 2020): 235. http://dx.doi.org/10.3390/pr8020235.

Full text
Abstract:
Steam-assisted gravity drainage (SAGD) is one of the most successful thermal enhanced oil recovery (EOR) methods for cold, viscous oils. Several analytical and semi-analytical models have been theorized, yet further study is needed to improve quick production-rate predictions. Following the exponential-geometry theory developed for finding the oil production rate, an upgraded predictive model is presented in this study. Unlike the exponential model, the current model divides the steam-oil interface into several segments, and heat and mass balances are applied to each segment. By manipulating the basic equations, formulas for estimating the oil drainage rate, the location of the interface, the heat-penetration depth of steam ahead of the interface, and the steam required for the operation are obtained theoretically. The output of the proposed model is then validated against experimental data and finalized with data from the real SAGD process in phase B of the underground test facility (UTF) project. According to the results, the model, with a suitable heat-penetration-depth correlation, produces fairly accurate outputs, so the idea of using this model in field operations is convincing.
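For orientation, the classical Butler drainage equation (in one of its common forms) is the starting point that interface-geometry models such as this one refine. The sketch below implements that closed form; the reservoir properties are illustrative guesses, not UTF phase-B values.

```python
from math import sqrt

def butler_rate(k, g, alpha, phi, dSo, h, m, nu_s, L):
    """Butler-type SAGD oil drainage rate (m^3/s) for a well pair of length L.
    q = 2 L sqrt(2 k g alpha phi dSo h / (m nu_s)) -- one common form."""
    return 2.0 * L * sqrt(2.0 * k * g * alpha * phi * dSo * h / (m * nu_s))

q = butler_rate(
    k=1.0e-12,      # effective oil permeability, m^2 (assumed)
    g=9.81,         # gravity, m/s^2
    alpha=7.0e-7,   # thermal diffusivity, m^2/s (assumed)
    phi=0.33,       # porosity (assumed)
    dSo=0.6,        # displaceable oil saturation (assumed)
    h=20.0,         # drainage height, m (assumed)
    m=3.5,          # viscosity-temperature exponent (assumed)
    nu_s=8.0e-6,    # oil kinematic viscosity at steam temperature, m^2/s (assumed)
    L=500.0,        # horizontal well length, m (assumed)
)
print(f"predicted drainage rate: {q * 86400:.0f} m^3/day")
```

The segmented-interface model in the paper replaces this single global balance with per-segment heat and mass balances, which is what lets it track the interface location and the heat-penetration depth over time.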
APA, Harvard, Vancouver, ISO, and other styles