Journal articles on the topic 'Deterministic factor analysis'

To see the other types of publications on this topic, follow the link: Deterministic factor analysis.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 journal articles for your research on the topic 'Deterministic factor analysis.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Liu, Haitao, Ioan Dzitac, and Sicong Guo. "Reduction of Conditional Factors in Causal Analysis." International Journal of Computers Communications & Control 13, no. 3 (May 27, 2018): 383–90. http://dx.doi.org/10.15837/ijccc.2018.3.3252.

Abstract:
Faced with the great number of conditional factors in big-data causal analysis, the reduction algorithm put forward in this paper can reasonably reduce the number of conditional factors. Compared with previous reduction methods, we take into consideration not only the influence of the conditional factors on the result factors but also the relationships among the conditional factors themselves. The basic idea of the proposed algorithm is to establish the matrix of mutual deterministic degrees between conditional factors. If a conditional factor f has a greater deterministic degree with respect to another conditional factor h, we delete the factor h; if instead h has the greater deterministic degree with respect to f, we delete the factor f. With this reduction, we can ensure that the conditional factors participating in causal analysis are as mutually irrelevant as possible, which is a reasonable requirement for causal analysis.
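The pairwise deletion rule described in this abstract can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code: the matrix `D` of mutual deterministic degrees and the `threshold` below are hypothetical.

```python
def reduce_factors(factors, D, threshold=0.8):
    """Keep only weakly inter-determined conditional factors.

    D[i][j] is the deterministic degree of factor i with respect to
    factor j (how strongly i determines j). The threshold deciding
    when a pair counts as related is an illustrative assumption.
    """
    kept = set(range(len(factors)))
    for i in range(len(factors)):
        for j in range(i + 1, len(factors)):
            if i not in kept or j not in kept:
                continue
            if max(D[i][j], D[j][i]) < threshold:
                continue  # pair is sufficiently unrelated; keep both
            # delete the factor that is more strongly determined by
            # the other, i.e. the redundant one
            kept.discard(j if D[i][j] >= D[j][i] else i)
    return [factors[k] for k in sorted(kept)]
```

With this sketch, a factor that is largely determined by another drops out, so the surviving factors are as mutually irrelevant as the abstract requires.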
2

Utsugi, Akio, and Toru Kumagai. "Bayesian Analysis of Mixtures of Factor Analyzers." Neural Computation 13, no. 5 (May 1, 2001): 993–1002. http://dx.doi.org/10.1162/08997660151134299.

Abstract:
For Bayesian inference on the mixture of factor analyzers, natural conjugate priors on the parameters are introduced, and a Gibbs sampler that generates parameter samples following the posterior is then constructed. In addition, a deterministic estimation algorithm is derived by taking modes instead of samples from the conditional posteriors used in the Gibbs sampler; this can be regarded as a maximum a posteriori estimation algorithm with hyperparameter search. The behaviors of the Gibbs sampler and the deterministic algorithm are compared in a simulation experiment.
3

Wu, Weiqiang, Ning Huang, and Zhitao Wu. "Traffic chaotic dynamics modeling and analysis of deterministic network." Modern Physics Letters B 30, no. 18 (July 10, 2016): 1650285. http://dx.doi.org/10.1142/s0217984916502857.

Abstract:
Network traffic is an important and direct factor in network reliability and performance. To understand the behavior of network traffic, chaotic dynamics models have been proposed and have greatly helped the analysis of nondeterministic networks. Previous research held that chaotic behavior was caused by random factors and that deterministic networks, lacking such factors, would not exhibit it. In this paper, we first adopted chaos theory to analyze traffic data collected from a typical deterministic network testbed, avionics full-duplex switched Ethernet (AFDX), and found that chaotic behavior also exists in deterministic networks. Then, to explore the chaos-generating mechanism, we applied mean-field theory to construct a traffic dynamics equation (TDE) that models deterministic network traffic without any random network factors. Through studying the derived TDE, we propose that chaotic dynamics is one of the natural properties of network traffic and can be viewed as the effect of the TDE control parameters. A network simulation was performed, and the results verified that network congestion produces the chaotic dynamics in a deterministic network, consistent with the expectation from the TDE. Our research will be helpful for analyzing the complicated dynamic behavior of traffic in deterministic networks and will contribute to network reliability design and analysis.
4

Anitas, Eugen Mircea, Giorgia Marcelli, Zsolt Szakacs, Radu Todoran, and Daniela Todoran. "Structural Properties of Vicsek-like Deterministic Multifractals." Symmetry 11, no. 6 (June 18, 2019): 806. http://dx.doi.org/10.3390/sym11060806.

Abstract:
Deterministic nano-fractal structures have recently emerged, displaying huge potential for the fabrication of complex materials with predefined physical properties and functionalities. Exploiting the structural properties of fractals, such as symmetry and self-similarity, could greatly extend the applicability of such materials. Analyses of small-angle scattering (SAS) curves from deterministic fractal models with a single scaling factor have allowed valuable fractal properties to be obtained, but they are insufficient to describe non-uniform structures with rich scaling properties, such as fractals with multiple scaling factors. To extract additional information about this class of fractal structures, we performed an analysis of the multifractal spectra and SAS intensity of a representative fractal model with two scaling factors, termed a Vicsek-like fractal. We observed that the box-counting fractal dimension in the multifractal spectra coincides with the scattering exponent of the SAS curves in mass-fractal regions. Our analyses further revealed transitions from heterogeneous to homogeneous structures accompanied by changes from short- to long-range mass-fractal regions. These transitions are explained in terms of the relative values of the scaling factors.
5

Mirzaeian, Yousef, Kourosh Shahriar, and Mostafa Sharifzadeh. "Tunnel Probabilistic Structural Analysis Using the FORM." Journal of Geological Research 2015 (August 12, 2015): 1–9. http://dx.doi.org/10.1155/2015/394761.

Abstract:
In this paper, tunnel probabilistic structural analysis (TuPSA) was performed using the first-order reliability method (FORM). In TuPSA, a tunnel performance function is defined according to the boundary between structural stability and instability. The performance function is then transformed from the original space into the standard normal variable space to obtain the design point, the reliability index, and the probability of tunnel failure. In this method, it is possible to consider the design factors as dependent or independent random parameters with arbitrary probability distributions. A software code was developed to perform TuPSA using the FORM. For validation and verification, a typical tunnel example with random joint orientations as well as random mechanical properties was studied, and the results of TuPSA were compared with those obtained from Monte Carlo simulation. The results show that, although deterministic analysis indicates the rock blocks are stable, TuPSA predicts key-block failure with certain probabilities. Comparison between the probabilistic and deterministic analyses indicates that the probabilistic results, including the design point and probability of failure, are more rational than a deterministic factor of safety.
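The FORM workflow this abstract describes (transform to standard normal space, find the design point, compute the reliability index and failure probability) admits a closed form in the simplest case of a linear performance function g = R − S with independent normal variables. The sketch below covers only that case and is not the TuPSA code; nonlinear performance functions require an iterative design-point search such as HL-RF.

```python
import math

def form_linear(mu_r, sd_r, mu_s, sd_s):
    """FORM for the linear limit state g = R - S with independent
    normal resistance R and load S. Returns the reliability index,
    the failure probability Phi(-beta), and the design point (the
    most probable failure point) in the original variable space.
    """
    k = math.hypot(sd_r, sd_s)
    beta = (mu_r - mu_s) / k
    pf = 0.5 * (1.0 - math.erf(beta / math.sqrt(2.0)))  # Phi(-beta)
    # design point: x* = mu + sigma * (beta * alpha), with the unit
    # normal alpha = -grad(g)/|grad(g)| = (-sd_r, sd_s)/k
    r_star = mu_r - beta * sd_r ** 2 / k
    s_star = mu_s + beta * sd_s ** 2 / k
    return beta, pf, (r_star, s_star)
```

At the design point the performance function vanishes (r\* = s\*), which is a quick self-check on the transformation.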
6

Nagumanova, R. V., and A. Sabirova. "Using the deterministic factor systems in the analysis of return on equity." Journal of Fundamental and Applied Sciences 9, no. 2S (January 17, 2018): 903. http://dx.doi.org/10.4314/jfas.v9i2s.65.

7

Umar, Sujeet Kumar, Pijush Samui, and Sunita Kumari. "Reliability Analysis of Liquefaction for Some Regions of Bihar." International Journal of Geotechnical Earthquake Engineering 9, no. 2 (July 2018): 23–37. http://dx.doi.org/10.4018/ijgee.2018070102.

Abstract:
There are many deterministic and probabilistic liquefaction assessment measures for classifying whether or not soil liquefaction will take place. Different approaches give dissimilar safety factors and liquefaction probabilities, so reliability analysis is required to deal with these uncertainties. This paper describes a reliability technique for predicting the seismic liquefaction potential of soils in some areas of Bihar State, presenting a reliability approach for finding the probability of liquefaction. The proposed approach is formulated on the basis of reliability analyses of 234 field data. Using the deterministic simplified Idriss and Boulanger method, the factor of safety of the soil has been assessed, and the reliability index and corresponding probability of liquefaction have been determined using the First Order Second Moment (FOSM) method. The developed method can be used as a robust tool by engineers engaged in the estimation of liquefaction potential.
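The FOSM step this abstract describes can be sketched for a generic factor of safety FS = C/L (capacity over load): the mean and variance of FS follow from a first-order Taylor expansion about the means, and β = (μ_FS − 1)/σ_FS is one common convention. The function and its inputs are illustrative, not the paper's field data.

```python
import math

def fosm_liquefaction(mu_c, sd_c, mu_l, sd_l):
    """First Order Second Moment sketch for FS = C / L with
    independent C (capacity) and L (load). Mean and variance of FS
    come from a first-order Taylor expansion about the means.
    """
    mu_fs = mu_c / mu_l
    # partials at the means: dFS/dC = 1/mu_l, dFS/dL = -mu_c/mu_l**2
    var_fs = (sd_c / mu_l) ** 2 + (mu_c * sd_l / mu_l ** 2) ** 2
    sd_fs = math.sqrt(var_fs)
    beta = (mu_fs - 1.0) / sd_fs           # distance of mean FS from FS = 1
    pf = 0.5 * (1.0 - math.erf(beta / math.sqrt(2.0)))  # Phi(-beta)
    return mu_fs, beta, pf
```

The same mean FS can carry very different failure probabilities depending on the input scatter, which is the point of moving beyond a single deterministic safety factor.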
8

Rahman, Hafiz, Eri Besra, and Nurhayati Nurhayati. "The Mediating Effect of Emotive Factor on the Constructs That Influence Entrepreneurial Failure." DeReMa (Development Research of Management): Jurnal Manajemen 14, no. 1 (May 28, 2019): 1. http://dx.doi.org/10.19166/derema.v14i1.1113.

Abstract:
This paper examines the presence of an emotive factor that mediates the variables of voluntaristic, deterministic and opportunistic behaviour that influence entrepreneurial failure. The study is quantitative and uses causal analysis as its research approach, relating the constructs of the voluntaristic factor, the deterministic factor and opportunistic behaviour, with the mediation of the emotive factor, to entrepreneurial failure. The sample comprises 1541 nascent entrepreneurs in West Sumatra Province, Indonesia, who have experienced business failures. The analysis was undertaken using causal step analysis under the appropriate statistical protocols and rules. The study found, and argues, that the emotive factor of entrepreneurs is an individual psychological construct that partially mediates voluntaristic, deterministic and opportunistic behaviour in causing the entrepreneurial failure experienced by nascent entrepreneurs. The originality and value of the study lie in the framework used, which considers the construct of entrepreneurs' opportunistic behaviour as an independent variable that can cause entrepreneurial failure, and in the finding that the emotive factor mediates the voluntaristic, deterministic and opportunistic behaviour in causing entrepreneurial failure.
9

Gao, W. "Finite Element Analysis of Structures with Interval Parameters." Journal of Mechanics 23, no. 1 (March 2007): 79–85. http://dx.doi.org/10.1017/s1727719100001106.

Abstract:
This paper presents a new method, called the interval factor method, for the finite element analysis of truss structures with interval parameters. Using the interval factor method, the structural parameters and loads can be treated as interval variables, and the structural stiffness matrix can then be divided into the product of two parts corresponding to its deterministic value and the interval factors. Computational expressions for the lower and upper bounds, mean value, and interval change ratio of the structural displacement and stress responses are derived from the static governing equations by means of interval operations. The effect of the uncertainty of the structural parameters and loads on the structural static responses is demonstrated for truss structures.
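For a single-degree-of-freedom spring, the interval factor idea in this abstract reduces to a few lines: the stiffness is its deterministic value times an interval factor, and the displacement bounds follow by monotonicity. This is a hypothetical one-DOF sketch, not the paper's formulation for full truss stiffness matrices.

```python
def interval_response(f_load, k_det, factor_lo, factor_hi):
    """Interval factor method sketch for a one-DOF spring: stiffness
    k lies in [k_det * factor_lo, k_det * factor_hi]. Displacement
    u = F / k decreases monotonically in k, so its bounds come from
    the opposite stiffness bounds. Mean value and interval change
    ratio follow the usual interval definitions.
    """
    u_hi = f_load / (k_det * factor_lo)   # softest spring -> largest u
    u_lo = f_load / (k_det * factor_hi)   # stiffest spring -> smallest u
    u_mean = 0.5 * (u_lo + u_hi)
    change_ratio = (u_hi - u_lo) / (u_hi + u_lo)  # half-width / midpoint
    return u_lo, u_hi, u_mean, change_ratio
```

For the general truss case the same decomposition is applied to the assembled stiffness matrix rather than to a scalar stiffness.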
10

Hazledine, Saul, Jongho Sun, Derin Wysham, J. Allan Downie, Giles E. D. Oldroyd, and Richard J. Morris. "Nonlinear Time Series Analysis of Nodulation Factor Induced Calcium Oscillations: Evidence for Deterministic Chaos?" PLoS ONE 4, no. 8 (August 13, 2009): e6637. http://dx.doi.org/10.1371/journal.pone.0006637.

11

Muhammed, Jemal J. "Deterministic and Probabilistic Approaches in the Analysis of the Bearing Capacity of a Bridge Foundation on Undrained Clay Soil." Slovak Journal of Civil Engineering 27, no. 2 (June 1, 2019): 44–51. http://dx.doi.org/10.2478/sjce-2019-0015.

Abstract:
This study aims at evaluating deterministic and probabilistic approaches for an analysis of the bearing capacity of a highway bridge foundation on undrained clay soil. The analysis of a rectangular concrete footing was presented for the ultimate strength limit state of the bearing resistance according to the formulations in ES EN 1991:2015 and the ERA Bridge Design Manual, the Ethiopian design codes for foundation structures. In the deterministic analysis, the traditional total safety factor method recommended by ES EN 1991:2015 and ERA and the AASHTO LRFD method were implemented. It was assumed that design variables such as the soil parameters and loads follow normal and lognormal distribution functions. For the probabilistic methods, NESSUS-9.8, a statistical computer program, was used for the analysis. Comparisons were made between the results obtained from the traditional deterministic method and the reliability-based design approach. The evaluation shows that the probabilistic approach is a better tool than the deterministic one for assessing the safety and reliability of geotechnical structures, since it rationally accounts for uncertainties more than the conventional deterministic method does. Thus, the author recommends that the National Design Codes of Ethiopia be revised and calibrated based on a reliability design format.
12

Vongchavalitkul, Sanguan, and Swein Kumpangta. "Probabilistic Assessment of Soil Liquefaction by Using Seismic Chinese Code." Applied Mechanics and Materials 166-169 (May 2012): 2248–52. http://dx.doi.org/10.4028/www.scientific.net/amm.166-169.2248.

Abstract:
A deterministic safety factor was introduced by Z. Cao et al. (2008) according to the seismic Chinese code. The approach is a deterministic method that uses the standard penetration test (SPT) to evaluate soil liquefaction. With this method, liquefaction is predicted to occur if the factor of safety (FS), the ratio of the critical SPT-N value (resistance) to the measured SPT-N value (load), is less than or equal to one; if the factor of safety is greater than one, no liquefaction is predicted. Because of the significant uncertainties in the variables involved in the deterministic factor of safety, probabilistic methods are needed. Probabilistic safety factor calculations provide a means of evaluating the combined effects of uncertainties and a logical framework for choosing a factor of safety appropriate for the degree of uncertainty and the consequences of failure. A probabilistic assessment of soil liquefaction can then be performed in terms of the probability of failure and the reliability index. Using the most widely applied reliability analysis, the First Order Second Moment (FOSM) method, the results of a probabilistic assessment of soil liquefaction can be used for engineering decisions.
13

Huang, Lei, Andy Yat Fai Leung, Wenfei Liu, and Qiujing Pan. "Reliability of an engineered slope considering the Regression Kriging (RK)-based conditional random field." Special Issue with Awarded and Shortlisted Papers from the HKIE Outstanding Paper Award for Young Engineers/Researchers 2020 27, no. 4 (January 11, 2021): 183–94. http://dx.doi.org/10.33430/v27n4thie-2020-0004.

Abstract:
Many attempts have been made to apply random field theory to the slope reliability analysis in recent decades. However, there are only a few studies that consider real landslide cases by incorporating actual soil data in the probabilistic slope stability analysis with spatially variable soils. In this paper, an engineered slope located in Hong Kong was investigated using the probabilistic approach considering the Regression Kriging (RK)-based conditional random field. The slope had been assessed and considered to be safe by classical deterministic slope stability analyses but failed eventually. In this study, both deterministic slope stability analyses and probabilistic slope stability analyses were conducted, and the comparison was made between the probabilistic approach adopting RK-based conditional random field and that adopting Ordinary Kriging (OK)-based approach. The results show that the deterministic factor of safety (FS) for a slope may not be an adequate indicator of the safety margin. In particular, a slope with a higher deterministic FS may not always represent a lower probability of failure under the framework of probabilistic assessment, where the spatial variability of soil properties is explicitly considered. Besides, the critical portion of the slope could not be found using the OK-based approach that considers a constant trend structure.
15

Shrestha, Saurav, Indra Prasad Acharya, and Ranjan Kumar Dahal. "Deterministic and Probabilistic Analysis of Dasdhunga Soil Slope along Narayangarh-Mugling Road Section." Journal of Advanced College of Engineering and Management 6 (July 10, 2021): 187–98. http://dx.doi.org/10.3126/jacem.v6i0.38358.

Abstract:
Instability of slopes is usually governed by a combination of intrinsic and extrinsic factors, and the inherent variability of the parameters makes the problem probabilistic rather than deterministic. This research evaluates the stability of the Dasdhunga soil slope along the Narayangarh-Mugling road section by calculating the factor of safety under different rainfall conditions through a coupled finite element and limit equilibrium method in GeoStudio, and by determining the probability of failure by sliding, modelled as an infinite slope, using Monte Carlo simulation in R-Studio. The mean, standard deviation, minimum, and maximum values of parameters such as friction angle, cohesion, and unit weight were computed from eight samples of the slope. The pore water pressure developed, and its corresponding statistics for different rainfall conditions, were computed from an FEM-based SEEP/W simulation. These parameters are assumed to follow a truncated normal probability distribution, while geometric parameters such as height and slope angle are regarded as constant. It was observed that the safety factor for the slope is low in high-intensity, low-duration rainfalls, and the probability of failure is correspondingly high. The tendency to fail increases as the return period of rainfall increases, and vice versa. Sensitivity analysis performed for both the deterministic and probabilistic methods showed that the friction angle is the most sensitive parameter.
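The Monte Carlo step described in this abstract can be sketched for a generic infinite slope: sample the soil parameters, evaluate the infinite-slope factor of safety, and count the fraction of samples with FS < 1. The distributions and slope geometry below are illustrative placeholders, not the Dasdhunga site data, and pore pressure is omitted for brevity.

```python
import math
import random

def infinite_slope_pf(n=50_000, seed=1):
    """Monte Carlo sketch of infinite-slope failure probability:
    FS = (c + gamma*z*cos(b)**2 * tan(phi)) / (gamma*z*sin(b)*cos(b)).
    Cohesion and friction angle are sampled (cohesion truncated at
    zero, echoing the truncated normal assumption in the study);
    unit weight, depth, and slope angle are held fixed here.
    """
    rng = random.Random(seed)
    gamma, z, beta = 18.0, 3.0, math.radians(35)   # kN/m3, m, slope angle
    failures = 0
    for _ in range(n):
        c = max(0.0, rng.gauss(8.0, 2.0))          # cohesion, kPa
        phi = math.radians(rng.gauss(30.0, 3.0))   # friction angle
        resisting = c + gamma * z * math.cos(beta) ** 2 * math.tan(phi)
        driving = gamma * z * math.sin(beta) * math.cos(beta)
        if resisting / driving < 1.0:
            failures += 1
    return failures / n
```

The returned fraction estimates the probability of failure; a mean FS modestly above 1 can still carry a substantial failure probability once parameter scatter is included.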
16

Hartini, Entin, Roziq Himawan, and Mike Susmikanti. "FRACTURE MECHANICS UNCERTAINTY ANALYSIS IN THE RELIABILITY ASSESSMENT OF THE REACTOR PRESSURE VESSEL: (2D) SUBJECTED TO INTERNAL PRESSURE." JURNAL TEKNOLOGI REAKTOR NUKLIR TRI DASA MEGA 18, no. 2 (June 6, 2016): 55. http://dx.doi.org/10.17146/tdm.2016.18.2.2466.

Abstract:
The reactor pressure vessel (RPV) is the pressure boundary in a PWR-type reactor that serves to confine radioactive material during the chain reaction process. The integrity of the RPV must be guaranteed in both normal operation and accident conditions. In analyzing the integrity of the RPV, especially the behavior of cracks that could lead to vessel failure, a fracture mechanics approach should be taken. Because of the uncertainty of the inputs used in the assessment, such as mechanical properties and the physical environment, the assessment is not sufficient if performed only by a deterministic approach; an uncertainty approach should therefore be applied. The aim of this study is to analyze the uncertainty of fracture mechanics calculations in evaluating the reliability of a PWR reactor pressure vessel. The random character of the input quantities was generated using probabilistic principles and theories. The fracture mechanics analysis is solved by the Finite Element Method (FEM) with the MSC MARC software, while the input uncertainty analysis is based on probability density functions with Latin Hypercube Sampling (LHS) using a Python script. The output of MSC MARC is a J-integral value, which is converted into a stress intensity factor (SIF) for evaluating the reliability of the 2D RPV model. From the calculations it can be concluded that the SIF from the probabilistic method reaches the limit value of fracture toughness earlier than the SIF from the deterministic method: the probabilistic method yields an SIF of 105.240 MPa·m^0.5, whereas the deterministic method yields 100.876 MPa·m^0.5.
Keywords: uncertainty analysis, fracture mechanics, LHS, FEM, reactor pressure vessels
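Latin Hypercube Sampling, used in this study for the uncertain inputs, is easy to sketch: each sample occupies a distinct stratum of [0, 1) in every dimension, giving better coverage than plain random sampling for the same sample count. This is a generic stand-in for the paper's Python script, which is not shown.

```python
import random

def latin_hypercube(n, dims, seed=0):
    """Generate n points in [0, 1)**dims such that, along every
    dimension, exactly one point falls in each of the n equal
    strata [k/n, (k+1)/n). Map to physical variables afterwards
    via inverse CDFs of the chosen probability densities.
    """
    rng = random.Random(seed)
    samples = [[0.0] * dims for _ in range(n)]
    for d in range(dims):
        # one random point inside each stratum, then shuffle so the
        # strata are paired independently across dimensions
        strata = [(k + rng.random()) / n for k in range(n)]
        rng.shuffle(strata)
        for i in range(n):
            samples[i][d] = strata[i]
    return samples
```

Feeding these unit-cube points through inverse CDFs produces stratified draws of, e.g., mechanical-property inputs for the FEM runs.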
17

Anitas, Eugen Mircea. "Small-Angle Scattering and Multifractal Analysis of DNA Sequences." International Journal of Molecular Sciences 21, no. 13 (June 30, 2020): 4651. http://dx.doi.org/10.3390/ijms21134651.

Abstract:
The arrangement of A, C, G and T nucleotides in large DNA sequences of many prokaryotic and eukaryotic cells exhibits long-range correlations with fractal properties. Chaos game representation (CGR) of such DNA sequences, followed by a multifractal analysis, is a useful way to analyze the corresponding scaling properties. This approach provides a powerful visualization method to characterize their spatial inhomogeneity and allows discrimination between mono- and multifractal distributions. However, in some cases two different arbitrary point distributions may generate indistinguishable multifractal spectra. By using a new model based on multiplicative deterministic cascades, it is shown here that the small-angle scattering (SAS) formalism can be used to address this issue and to extract additional structural information. It is shown that the box-counting dimension given by the multifractal spectra can be recovered from the scattering exponent of the SAS intensity in the fractal region. This approach is illustrated for point distributions of CGR data corresponding to Escherichia coli, Phospholamban and mouse mitochondrial DNA, and it is shown that for the latter two cases SAS allows extraction of the fractal iteration number and the scaling factor corresponding to the "ACGT" square, or recovery of the number of bases. The results are compared with a model based on multiplicative deterministic cascades and with one that takes into account the existence of forbidden sequences in DNA. This allows a classification of the DNA sequences in terms of the random and deterministic fractal structures emerging in CGR.
18

Luo, Chao, Li Yu, and Jun Zheng. "Extending Stochastic Network Calculus to Loss Analysis." Scientific World Journal 2013 (2013): 1–8. http://dx.doi.org/10.1155/2013/918565.

Abstract:
Loss is an important parameter of Quality of Service (QoS). Though stochastic network calculus is a very useful tool for performance evaluation of computer networks, existing studies on stochastic service guarantees mainly focused on the delay and backlog. Some efforts have been made to analyse loss by deterministic network calculus, but there are few results to extend stochastic network calculus for loss analysis. In this paper, we introduce a new parameter named loss factor into stochastic network calculus and then derive the loss bound through the existing arrival curve and service curve via this parameter. We then prove that our result is suitable for the networks with multiple input flows. Simulations show the impact of buffer size, arrival traffic, and service on the loss factor.
19

Hassis, Hedi, Abir Jendoubi, Lioua Kolsi, and Mohamed Omri. "A Dynamic Analysis for Probabilistic/Possibilistic Problems Model Reduction Analysis Using Special Functions." Mathematics 10, no. 9 (May 5, 2022): 1554. http://dx.doi.org/10.3390/math10091554.

Abstract:
Information and data in mechanics, as in many other scientific disciplines, can be known with certainty up to an error-safety coefficient (deterministic), random with a known probability distribution (probabilistic), or random with an uncertainty factor in the information (possibilistic). When the information on the parameters is incomplete, probabilistic/possibilistic mechanical techniques attempt to provide an estimate of the solution. For various mechanical problems involving probabilistic/possibilistic parameters, a constraint that must be met is sometimes added, as in the case of reliability analysis. In this paper, an approach for probabilistic/possibilistic dynamic analysis is introduced and validated, and its extension to finite element structural analysis is presented.
20

Szabó, Norbert Péter, and Mihály Dobróka. "Robust estimation of reservoir shaliness by iteratively reweighted factor analysis." GEOPHYSICS 82, no. 2 (March 1, 2017): D69—D83. http://dx.doi.org/10.1190/geo2016-0393.1.

Abstract:
We suggest a statistical method for the simultaneous processing of electric, nuclear, and sonic-logging data using a robust iteratively reweighted factor analysis (IRFA). After giving a first estimate by Jöreskog’s approximate method, we refine the factor loadings and factor scores jointly in an iterative procedure, during which the deviation between the measured and calculated data is weighted in proportion to its magnitude for giving an outlier-free solution. We show a strong nonlinear relation between the first factor and the shale volume of multimineral hydrocarbon formations. We test the noise rejection capability of the new statistical procedure by making synthetic modeling experiments. The IRFA of simulated well-logging data including a high amount of noise gives a well log of the shale volume purified of large errors. Case studies from Hungary and the USA show that the results of factor analysis are consistent with that of independent deterministic modeling and core data. The statistical workflow can be effectively used for the processing of not normally distributed and extremely noisy well-logging data sets to evaluate the shale content and derived petrophysical properties more accurately in reservoir rocks.
APA, Harvard, Vancouver, ISO, and other styles
21

Borodin, Alex, Irina Mityushina, Elena Streltsova, Andrey Kulikov, Irina Yakovenko, and Anzhela Namitulina. "Mathematical Modeling for Financial Analysis of an Enterprise: Motivating of Not Open Innovation." Journal of Open Innovation: Technology, Market, and Complexity 7, no. 1 (March 1, 2021): 79. http://dx.doi.org/10.3390/joitmc7010079.

Full text
Abstract:
The article develops economic and mathematical models as a tool for conducting factor financial analysis of the prospects for the development of an industrial enterprise. The functioning of the developed economic and mathematical models is based on the DuPont model, which allows analyzing the dynamics of the company’s profitability in the course of two-factor and three-factor financial analysis. The proposed model tools are based on the convergence of deterministic financial analysis methods embedded in the DuPont model and simulation methods that allow analysis under the influence of random factors. The constructed economic and mathematical models for forecasting profitability use the company’s retrospective data on its financial condition: the amount of profit, revenue, assets, and equity. The constructed simulation models are implemented in the OMEGA software product and included in the computer technology for predicting the profitability of an industrial enterprise. The architecture of the proposed tools is presented, and the results of simulation experiments performed on models are demonstrated.
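As a rough illustration of the deterministic core on which such models rest, the classic three-factor DuPont identity decomposes return on equity into margin, turnover, and leverage. The sketch below uses hypothetical figures, not the article's enterprise data:

```python
# Hedged sketch of the deterministic DuPont core such models build on.
# All figures are hypothetical; the article's enterprise data are not used.

def dupont_three_factor(net_income, revenue, assets, equity):
    """Three-factor DuPont identity: ROE = margin * turnover * leverage."""
    margin = net_income / revenue    # net profit margin
    turnover = revenue / assets      # asset turnover
    leverage = assets / equity       # equity multiplier
    return margin, turnover, leverage, margin * turnover * leverage

m, t, l, roe = dupont_three_factor(net_income=120.0, revenue=1500.0,
                                   assets=2000.0, equity=800.0)
# By construction, roe equals net_income / equity
```

Simulation variants of the model, as in the article, would draw the inputs (profit, revenue, assets, equity) from fitted distributions instead of fixed values.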
APA, Harvard, Vancouver, ISO, and other styles
22

Malik, Mushtaq Ahmad, and Tariq Masood. "Analysis of Growth Accounting and Convergence in MENA Countries: Panel Cointegration Approach." South Asian Journal of Macroeconomics and Public Finance 9, no. 2 (November 9, 2020): 237–62. http://dx.doi.org/10.1177/2277978720968416.

Full text
Abstract:
The objective of this study is to investigate the sources of output growth and their convergence in the Middle East and North African countries over the period 1970–2017. Towards this end, the study employs Levin et al. (2002, Journal of Econometrics, vol. 108, pp. 1–24), Fisher-type (Choi, 2001, Journal of International Money and Finance, vol. 20, pp. 249–272) and Im et al. (2003, Journal of Econometrics, vol. 115, pp. 53–74) panel unit root tests and Pedroni (2004, Econometric Theory, vol. 20, pp. 597–625), Kao (1999, Journal of Econometrics, vol. 90, pp. 1–44) and Johansen–Fisher cointegration tests. After estimating the production function using random effects estimator to obtain the share of physical capital in output, we employed standard growth accounting approach to measure and decompose growth of total output into contributions from growth in physical capital, labour, human capital and total factor productivity (TFP). Further, the study discusses the existence of stochastic and deterministic convergence of real output per worker and its sources (physical capital per worker, human capital and TFP). The statistical results of the article can be summarized as follows: The contribution of physical capital to output growth is found to be positive and higher than the contribution of labour, whereas the contribution of TFP was negative across the region with the exception of Egypt, Morocco, Tunisia and Turkey. However, when the contribution of human capital is netted out, the contribution of TFP becomes negative in all the countries except for Tunisia. In addition, the study found no clear evidence of deterministic convergence in output per worker (but stochastic convergence), human capital and factor productivity. However, the statistical results provide overwhelming evidence for stochastic and deterministic convergence in physical capital per worker. JEL Classification: O4, O40, O47
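The growth accounting decomposition described above amounts to a one-line residual calculation; the growth rates and capital share below are hypothetical, not the study's estimates:

```python
def tfp_growth(g_output, g_capital, g_labor, g_human, alpha):
    """Growth accounting residual under Y = A * K^alpha * (h*L)^(1-alpha):
    TFP growth is output growth net of the factor contributions."""
    return g_output - alpha * g_capital - (1.0 - alpha) * (g_labor + g_human)

# Hypothetical annual growth rates (fractions) and capital share
g_tfp = tfp_growth(g_output=0.04, g_capital=0.06, g_labor=0.02,
                   g_human=0.01, alpha=0.4)
# With these illustrative numbers the residual is slightly negative,
# mirroring the sign pattern the study reports for most of the region
```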
APA, Harvard, Vancouver, ISO, and other styles
23

Gao, W., N. Zhang, J. Ma, and X. B. Wang. "Interval dynamic response analysis of structures with interval parameters." Proceedings of the Institution of Mechanical Engineers, Part C: Journal of Mechanical Engineering Science 222, no. 3 (March 1, 2008): 377–85. http://dx.doi.org/10.1243/09544062jmes804.

Full text
Abstract:
Dynamic response analysis of truss structures with interval parameters under interval loads is investigated using a new method called the interval factor method (IFM). Using the IFM, the structural physical parameters, geometric dimensions, and loads can be considered as interval variables. The structural stiffness and mass matrices can then each be described by the product of two parts, corresponding to the deterministic matrix and the interval factors of the structural parameters. The computational expressions for the midpoint value and the lower and upper bounds of the structural dynamic responses are derived by means of the mode superposition method and interval operations. The influences of the uncertainty of the structural parameters and loads on the structural dynamic responses are demonstrated using truss structures.
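A minimal sketch of the interval operations underlying the IFM, assuming intervals are stored as (lower, upper) pairs; the ±5% factor and stiffness value are illustrative, not taken from the paper:

```python
def interval_mul(a, b):
    """Product of two intervals given as (lower, upper) pairs: the result
    spans the extremes over all four endpoint products."""
    p = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(p), max(p))

def midpoint(a):
    """Midpoint value of an interval, as used for the central response."""
    return 0.5 * (a[0] + a[1])

# An assumed +/-5% interval factor applied to a deterministic stiffness entry,
# echoing the paper's split into a deterministic part times an interval factor
e_factor = (0.95, 1.05)
k_det = 200.0
k_bounds = interval_mul(e_factor, (k_det, k_det))
```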
APA, Harvard, Vancouver, ISO, and other styles
24

Carter, Bruce J., and Emery Z. Lajtai. "Rock slope stability and distributed joint systems." Canadian Geotechnical Journal 29, no. 1 (February 1, 1992): 53–60. http://dx.doi.org/10.1139/t92-006.

Full text
Abstract:
A deterministic (GEOSLIDE) and a probabilistic (PROSLIDE) microcomputer code are introduced to aid in performing rock wedge analyses based on the limit equilibrium method. The deterministic code evaluates the stability of a single rock wedge formed by discontinuities in rock through three-dimensional vector algebra. GEOSLIDE undertakes a full kinematic analysis (daylighting and obstruction), analyzes both wedge and plane sliding, and provides for anchor designs and sensitivity analyses (cohesion, friction, and water forces). Through multiple stability analyses, PROSLIDE evaluates the probability of failure for a rock slope by examining the distribution of the factors of safety from all the potential sliding wedges formed by the discontinuities of the rock mass. The probability of failure is expressed as the ratio of kinematically free wedges that have a factor of safety less than unity to the total number of wedges. PROSLIDE can form and analyze as many as 2000 different pairs of discontinuities in less than 30 min using a 25 MHz 486 IBM-compatible computer. In a worked example, the probability of failure for a fixed slope strike and loading condition is shown to vary with the slope angle, following the characteristic 'S' shape of a cumulative distribution function. The effect of an anchor force is to spread the distribution over a wider range of the factor of safety (SF), pushing many wedges into a potential upslide situation and splitting the distribution about the failure zone of the stability diagram (−1 < SF < 1). Key words: rock slope, rock wedge, stability analysis, factor of safety, probability of failure, Monte Carlo simulation.
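PROSLIDE's probability-of-failure definition can be sketched as a small Monte Carlo loop. The simplified planar factor of safety and the sampling distributions below are assumptions for illustration only, not the code's 3-D wedge vector algebra:

```python
import math
import random

def planar_sf(phi_deg, psi_deg):
    """Simplified dry, cohesionless planar sliding: SF = tan(phi)/tan(psi).
    (Illustrative only; PROSLIDE uses full 3-D wedge vector algebra.)"""
    return math.tan(math.radians(phi_deg)) / math.tan(math.radians(psi_deg))

def probability_of_failure(n_wedges=2000, seed=1):
    """Fraction of sampled sliding-prone wedges with SF < 1, echoing
    PROSLIDE's definition of the probability of failure."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_wedges):
        phi = rng.gauss(32.0, 4.0)      # friction angle, deg (assumed)
        psi = rng.uniform(25.0, 45.0)   # sliding-plane dip, deg (assumed)
        if planar_sf(phi, psi) < 1.0:
            failures += 1
    return failures / n_wedges

pf = probability_of_failure()
```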
APA, Harvard, Vancouver, ISO, and other styles
25

Makhutov, N. A., and D. O. Reznikov. "Comprehensive analysis of the strength and safety of potentially hazardous facilities subject to uncertainties." Dependability 20, no. 1 (March 30, 2020): 47–56. http://dx.doi.org/10.21683/1729-2646-2020-20-1-47-56.

Full text
Abstract:
Aim. This paper aims to compare the two primary approaches to ensuring the structural strength and safety of potentially hazardous facilities: the deterministic approach, which is based on ensuring standard values of a strength margin per primary limit state mechanisms, and the probabilistic approach, under which the strength criterion is that the target probabilities of damage per various damage modes do not exceed the standard maximum allowable values. The key problem of ensuring structural strength is the high level of uncertainties, conventionally subdivided into two types: (1) uncertainties due to the natural variation of the parameters that define the load-carrying ability of a system and the load it is exposed to, and (2) uncertainties due to the human factor (the limited nature of human knowledge of a system and the possibility of human error at various stages of system operation). The methods of uncertainty mitigation depend on the approach applied to strength assurance: under the deterministic approach, the random variables “load” and “carrying capacity” are replaced with deterministic values, i.e. their mathematical expectations, while the fulfillment of the strength conditions subject to uncertainties is ensured by requiring that the ratio of the mathematical expectations of the load-carrying capacity and the load exceed the standard value of the strength margin, which, in turn, must be greater than unity. Under the probabilistic approach, structural strength is assumed to be ensured if the estimated probability of damage per the given mechanism of limit state attainment does not exceed the standard value of the probability of damage. Conclusions. The two approaches (deterministic and probabilistic) can be deemed equivalent only in particular cases.
The disadvantage of both is the limited capability to mitigate uncertainties of the second type, defined by the effects of the human factor, as well as the absence of a correct procedure for accounting for the severity of the consequences caused by attainment of the limit state. These disadvantages can be overcome if risk-based methods are used in ensuring structural strength and safety. Such methods allow considering uncertainties of the second type and explicitly taking into consideration the criticality of the consequences of facility destruction.
APA, Harvard, Vancouver, ISO, and other styles
26

Sankar, V. A., and P. V. Y. Jayasree. "Design and Analysis of Novel Fractal Linear and Planar Array Antennas for Satellite Applications." Advanced Electromagnetics 5, no. 3 (December 1, 2016): 56. http://dx.doi.org/10.7716/aem.v5i3.400.

Full text
Abstract:
This article proposes a new geometric design methodology for the systematic expansion of fractal linear and planar array antennas. Using the proposed geometric design methodology, any deterministic polygon shape can be constructed. In this article, two-element fractal linear and triangular array antennas are examined using the proposed methodology up to four iterations for two expansion factors. Due to the repetitive nature of the proposed geometric design methodology, both the linear and the planar fractal arrays show multi-beam behavior with excellent array factor properties. The proposed arrays perform better than linear and planar fractal array antennas generated by a concentric circular ring sub-array geometric generator. The triangular planar fractal array with expansion factor two at the fourth iteration achieves a single-valued beam width of 3.8° with a −31.6 dB side lobe level. The suggested fractal arrays are analyzed and simulated by MATLAB-13 programming.
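For readers unfamiliar with the array factor the paper optimizes, a minimal sketch for a plain uniform linear array (not the article's fractal geometry) is:

```python
import cmath
import math

def array_factor(n, d_lambda, theta_deg, steer_deg=90.0):
    """|AF| of an n-element uniform linear array with element spacing
    d_lambda (in wavelengths), steered to steer_deg. Illustrates the
    quantity being analyzed; not the article's fractal generator."""
    psi = 2.0 * math.pi * d_lambda * (math.cos(math.radians(theta_deg))
                                      - math.cos(math.radians(steer_deg)))
    return abs(sum(cmath.exp(1j * i * psi) for i in range(n)))

# At broadside the element contributions add in phase, giving |AF| = n
af_broadside = array_factor(n=8, d_lambda=0.5, theta_deg=90.0)
```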
APA, Harvard, Vancouver, ISO, and other styles
27

Mohd Noh, Muhamad Husnain, Mohd Akramin Mohd Romlay, Chuan Zun Liang, Mohd Shamil Shaari, and Akiyuki Takahashi. "Analysis of stress intensity factor for fatigue crack using bootstrap S-version finite element model." International Journal of Structural Integrity 11, no. 4 (March 16, 2020): 579–89. http://dx.doi.org/10.1108/ijsi-10-2019-0108.

Full text
Abstract:
Purpose: Failure of a material occurs once the stress intensity factor (SIF) overtakes the material fracture toughness. At this level, the crack grows rapidly, resulting in unstable crack growth until complete fracture happens. The SIF of a material can be calculated by experimental, theoretical and numerical techniques. Prediction of the SIF is crucial to ensure safety against material failure. The aim of this simulation study is to evaluate the accuracy of SIF prediction using finite element analysis. Design/methodology/approach: The bootstrap resampling method is employed in the S-version finite element model (S-FEM) to generate the random variables in this simulation analysis. The SIF analysis is carried out with the bootstrap S-version finite element model (BootstrapS-FEM). The virtual crack closure-integral method (VCCM) is used to compute the energy release rate and the SIF. A semi-elliptical crack shape with different crack aspect ratios is applied in this simulation analysis. The BootstrapS-FEM produces predictions of SIFs for a tension model. Findings: The mean of BootstrapS-FEM is calculated from 100 samples by the resampling method. The bounds are computed from the lower and upper bounds of the hundred BootstrapS-FEM samples. The prediction of SIFs is validated against the Newman–Raju solution and deterministic S-FEM within 95 percent confidence bounds. All possible values of the SIF estimated by BootstrapS-FEM are plotted in a graph, and the mean of the BootstrapS-FEM is referred to as the point estimate. The Newman–Raju solution and deterministic S-FEM values lie within the 95 percent confidence bounds. Thus, the BootstrapS-FEM is considered valid for prediction with less than 6 percent error. Originality/value: The bootstrap resampling method is employed in S-FEM to generate the random variables in this simulation analysis.
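The bootstrap scheme described above (100 resamples, point estimate plus 95 percent bounds) can be sketched in a few lines; the SIF values below are hypothetical placeholders, not the paper's S-FEM results:

```python
import random

def bootstrap_bounds(samples, n_resamples=100, seed=7):
    """Resample with replacement, take each resample's mean, and report
    (lower bound, point estimate, upper bound) from the 2.5/97.5
    percentiles and the mean of means, mirroring the paper's 100-sample
    scheme in spirit."""
    rng = random.Random(seed)
    means = []
    for _ in range(n_resamples):
        resample = [rng.choice(samples) for _ in samples]
        means.append(sum(resample) / len(resample))
    means.sort()
    point = sum(means) / len(means)
    lower = means[int(0.025 * len(means))]
    upper = means[int(0.975 * len(means)) - 1]
    return lower, point, upper

# Hypothetical SIF values from repeated analyses (units of MPa*sqrt(m))
sifs = [14.2, 14.8, 15.1, 14.5, 15.3, 14.9, 14.6, 15.0, 14.7, 15.2]
lo, pt, hi = bootstrap_bounds(sifs)
```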
APA, Harvard, Vancouver, ISO, and other styles
28

Khalep, Yu N., and E. I. Volkogon. "ECONOMIC AND ENERGY EFFICIENCY OF MICROGUMIN USE IN THE CULTIVATION TECHNOLOGIES OF SPRING BARLEY." Agriciltural microbiology 13 (August 19, 2011): 124–36. http://dx.doi.org/10.35868/1997-3004.13.124-136.

Full text
Abstract:
The results of economic and energy efficiency studies of the application of the microbial preparation Microgumin in the cultivation technologies of spring barley are presented. The influence of the main factors on the formation of the efficiency indices was identified by means of deterministic factor analysis. The high efficiency of Microgumin application was ensured by the yield level growing at a higher rate than the costs.
APA, Harvard, Vancouver, ISO, and other styles
29

Guzhov, S. V. "About combining determinated and stochastic approaches for prediction of the heating balance of the building for water sports." Power engineering: research, equipment, technology 22, no. 1 (April 30, 2020): 103–12. http://dx.doi.org/10.30724/1998-9903-2020-22-1-103-112.

Full text
Abstract:
Forecasting the demand for thermal energy by the energy complexes of buildings and structures is an urgent task. To achieve the necessary calculation accuracy, it is customary to use various deterministic methods based on the available varying and slowly varying data about the object of study. At the same time, statistical data can also be used in analysis by stochastic methods. The purpose of this article is to analyze whether deterministic and stochastic approaches may be combined in order to increase the accuracy of the calculation. Formulas for calculating the components of the expenditure part of the heat balance are shown using the example of a building for water sports. Based on these formulas, a calculation with monthly discretization is carried out for the period from January 2009 until January 2019. An example is given of estimating the accuracy of the forecast of demand for thermal energy by multivariate regression analysis and by artificial neural networks. On the same data, an artificial neural network was trained on seven factors: six independent ones and, as the seventh, the idealized value of the building's heat loss through the building envelope. The analysis of the building for water sports shows the inadmissibility of the described approach if the same initial data are used in both the deterministic and the stochastic method. Results: the accuracy of the forecast made using regression analysis increases with the number of factors. However, using in the stochastic method an additional group of factors that are numerically processed climate data already used as initial data leads to an unjustified overestimation of the significance of the twice-used factor. The presence of collinearity and multicollinearity of variables in predictive models using artificial neural networks does not negatively affect the forecast.
Conclusion: combining the deterministic and stochastic approaches when preparing the predicted heat balance is unacceptable if the deterministic approach uses the same input data as the stochastic approach.
APA, Harvard, Vancouver, ISO, and other styles
30

Waśkowicz, Bartosz. "Statistical analysis and dimensioning of a wind farm energy storage system." Archives of Electrical Engineering 66, no. 2 (June 27, 2017): 265–77. http://dx.doi.org/10.1515/aee-2017-0020.

Full text
Abstract:
The growth in renewable power generation and stricter local regulations regarding power quality indices will make it necessary to use energy storage systems with renewable power plants in the near future. The capacity of storage systems can be determined using different methods, most of which can be classified as either deterministic or stochastic. Deterministic methods are often complicated, with numerous parameters and complex models for long-term prediction, often incorporating meteorological data. Stochastic methods use statistics for ESS (energy storage system) sizing, which is somewhat intuitive for dealing with the random element of wind speed variation. The method proposed in this paper stabilizes output power over one-minute intervals to reduce the negative influence of the wind farm on the power grid in order to meet local regulations. This paper shows the process of sizing the ESS for two selected wind farms, based on their levels of variation in generated power, and, for each, how the negative influences on the power grid in the form of voltage variation and the short-term flicker factor are decreased.
APA, Harvard, Vancouver, ISO, and other styles
31

Kjærland, Frode, Aras Khazal, Erlend Krogstad, Frans Nordstrøm, and Are Oust. "An Analysis of Bitcoin’s Price Dynamics." Journal of Risk and Financial Management 11, no. 4 (October 15, 2018): 63. http://dx.doi.org/10.3390/jrfm11040063.

Full text
Abstract:
This paper aims to enhance the understanding of which factors affect the price development of Bitcoin in order for investors to make sound investment decisions. Previous literature has covered only a small extent of the highly volatile period during the last months of 2017 and the beginning of 2018. To examine the potential price drivers, we use the Autoregressive Distributed Lag and Generalized Autoregressive Conditional Heteroscedasticity approach. Our study identifies the technological factor Hashrate as irrelevant for modeling Bitcoin price dynamics. This irrelevance is due to the underlying code that makes the supply of Bitcoins deterministic, and it stands in contrast to previous literature that has included Hashrate as a crucial independent variable. Moreover, the empirical findings indicate that the price of Bitcoin is affected by returns on the S&P 500 and Google searches, showing consistency with results from previous literature. In contrast to previous literature, we find the CBOE volatility index (VIX), oil, gold, and Bitcoin transaction volume to be insignificant.
APA, Harvard, Vancouver, ISO, and other styles
32

POURZAKI, ABBAS, KHALIL MAFINEJAD, and SAYYED HOSAIN KESHMIRI. "A NEW METHOD FOR CDMA SIGNAL MODELING IN NONLINEAR SYSTEMS." Journal of Circuits, Systems and Computers 15, no. 06 (December 2006): 833–48. http://dx.doi.org/10.1142/s0218126606003374.

Full text
Abstract:
When a CDMA signal is passed through an RF transmitter, nonlinear elements cause spectral regrowth, which results in a reduction of spectral efficiency. The CDMA signal has a pseudo-noise nature; hence its mathematical treatment is too complex to analyze. In this paper, it is first shown how to simplify the complex mathematics of the CDMA signal. Then a deterministic signal replaces the CDMA signal, and the system response to both of them is calculated. In this paper, for the first time, the ACPR is explicitly calculated for both the CDMA and deterministic signals. The ACPR is calculated in terms of the nonlinear system coefficients and the input power, and can therefore be used in design objectives. In addition, it is shown that if the input power of the deterministic signal is multiplied by [Formula: see text] (i.e., a correction factor), the ACPR error of these kinds of signals at -55 dBc is less than 2.2% for system nonlinearity orders up to 13. This correction factor is obtained by both theoretical and simulation methods.
APA, Harvard, Vancouver, ISO, and other styles
33

Filatov, E. A. "PRODUCTION PROFITABILITY ANALYSIS FOR SMALL CONSTRUCTION ENTERPRISES OF THE IRKUTSK REGION." Scientific Review: Theory and Practice 10, no. 6 (June 30, 2020): 1086–96. http://dx.doi.org/10.35679/2226-0226-2020-10-6-1086-1096.

Full text
Abstract:
Various indicators of profitability are widely used in many industries to assess companies' financial and economic performance. For example, production profitability and sales profitability indicators are used for comparative assessment of the performance of individual economic entities and industries that produce different volumes and types of products. To compare the amount of profit and the amount of funds used to achieve it in a sectoral economy, the production profitability indicator is used. Production profitability is one of the key parameters for determining the efficiency of an economy, and the indicator is very important for making current and strategic decisions. The author of the article has developed a 4-factor model for carrying out a factor analysis of production profitability. In deterministic factor analysis, the author's model of production profitability (the resulting criterion indicator) is represented by the product of 4 factors, of which three are well known and one was found by the author. The relationship between these four factors and production profitability is functional. The article reveals the influence of the factors driving the change in production profitability and gives the author's methodological approaches to its calculation (methods of factor analysis developed by E.A. Filatov). The article presents the author's analytical and systematized statistical material for the analysis of the key indicators that reveal the impact of the changes in the production profitability of small enterprises in the construction industry of the Irkutsk region of the Russian Federation.
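Deterministic factor analysis of a multiplicative model of this kind is commonly carried out by the method of chain substitutions. The sketch below illustrates the principle with hypothetical factor values; the article's own four factors are not reproduced here:

```python
def chain_substitutions(base, report):
    """Deterministic factor analysis by the method of chain substitutions
    for a multiplicative model Y = x1 * x2 * ... * xn: replace base-period
    values with reporting-period values one factor at a time; each step's
    change in Y is attributed to the factor just replaced."""
    def product(values):
        result = 1.0
        for v in values:
            result *= v
        return result

    influences = []
    current = list(base)
    for i in range(len(base)):
        before = product(current)
        current[i] = report[i]
        influences.append(product(current) - before)
    return influences

# Hypothetical base- and reporting-period factor values for a 4-factor model
base = [0.10, 1.2, 0.8, 2.0]
report = [0.12, 1.1, 0.9, 2.1]
effects = chain_substitutions(base, report)
# The individual influences always telescope to the total change in Y
```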
APA, Harvard, Vancouver, ISO, and other styles
34

Wang, Hui Ying, Jian Cai, and Guo Bin Bu. "Evaluation of Strong Column Factors for RC Frames Based on Seismic Vulnerability Analysis." Advanced Materials Research 639-640 (January 2013): 854–58. http://dx.doi.org/10.4028/www.scientific.net/amr.639-640.854.

Full text
Abstract:
Augmenting the flexural strength of columns in the seismic design of reinforced concrete (RC) moment resisting frames is a key measure among all the detailing procedures of seismic capacity design, which induces the desirable beam side-sway mechanism for the structure to dissipate energy during a strong earthquake. The objective of this paper is to assess the influence of various strong column factors which is employed to perform seismic vulnerability analysis to the seismic performance of a six-story deterministic RC frame structure. Seismic vulnerability analyses indicate that augmenting the flexural strength of columns is an effective measure to improve seismic performance of RC frame structures. Increasing strong column factor improves the displacement capacity of structure and induces the biggish grads between the different damage limit states, which provide caution to prevent the abrupt collapse of structure during a strong earthquake. Seismic vulnerability curves provide the quantitative criterion for evaluating the seismic performance of structure and choosing appropriate target strong column factor.
APA, Harvard, Vancouver, ISO, and other styles
35

Durkee, Ben Y., Yushen Qian, Erqi L. Pollom, Martin T. King, Sara A. Dudley, Jenny L. Shaffer, Daniel T. Chang, Iris C. Gibbs, Jeremy D. Goldhaber-Fiebert, and Kathleen C. Horst. "Cost-Effectiveness of Pertuzumab in Human Epidermal Growth Factor Receptor 2–Positive Metastatic Breast Cancer." Journal of Clinical Oncology 34, no. 9 (March 20, 2016): 902–9. http://dx.doi.org/10.1200/jco.2015.62.9105.

Full text
Abstract:
Purpose The Clinical Evaluation of Pertuzumab and Trastuzumab (CLEOPATRA) study showed a 15.7-month survival benefit with the addition of pertuzumab to docetaxel and trastuzumab (THP) as first-line treatment for patients with human epidermal growth factor receptor 2 (HER2) –overexpressing metastatic breast cancer. We performed a cost-effectiveness analysis to assess the value of adding pertuzumab. Patient and Methods We developed a decision-analytic Markov model to evaluate the cost effectiveness of docetaxel plus trastuzumab (TH) with or without pertuzumab in US patients with metastatic breast cancer. The model followed patients weekly over their remaining lifetimes. Health states included stable disease, progressing disease, hospice, and death. Transition probabilities were based on the CLEOPATRA study. Costs reflected the 2014 Medicare rates. Health state utilities were the same as those used in other recent cost-effectiveness studies of trastuzumab and pertuzumab. Outcomes included health benefits expressed as discounted quality-adjusted life-years (QALYs), costs in US dollars, and cost effectiveness expressed as an incremental cost-effectiveness ratio. One- and multiway deterministic and probabilistic sensitivity analyses explored the effects of specific assumptions. Results Modeled median survival was 39.4 months for TH and 56.9 months for THP. The addition of pertuzumab resulted in an additional 1.81 life-years gained, or 0.62 QALYs, at a cost of $472,668 per QALY gained. Deterministic sensitivity analysis showed that THP is unlikely to be cost effective even under the most favorable assumptions, and probabilistic sensitivity analysis predicted 0% chance of cost effectiveness at a willingness to pay of $100,000 per QALY gained. Conclusion THP in patients with metastatic HER2-positive breast cancer is unlikely to be cost effective in the United States.
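The decision-analytic Markov structure described above can be sketched as a small cohort simulation. The transition probabilities and utilities below are illustrative placeholders, NOT the CLEOPATRA-derived values used in the article, and discounting is omitted:

```python
def markov_cohort_qalys(trans, utilities, cycles, cycle_years):
    """Minimal Markov cohort sketch with states stable / progressing /
    dead: track state occupancy each cycle and accumulate QALYs as
    occupancy * utility * cycle length (discounting omitted)."""
    occupancy = [1.0, 0.0, 0.0]  # cohort starts in stable disease
    qalys = 0.0
    for _ in range(cycles):
        qalys += sum(o * u for o, u in zip(occupancy, utilities)) * cycle_years
        occupancy = [sum(occupancy[i] * trans[i][j] for i in range(3))
                     for j in range(3)]
    return qalys

# Illustrative weekly transition probabilities (rows: from-state) and
# health-state utilities for stable, progressing, dead
trans = [[0.990, 0.008, 0.002],
         [0.000, 0.985, 0.015],
         [0.000, 0.000, 1.000]]
utilities = [0.80, 0.55, 0.0]
qalys = markov_cohort_qalys(trans, utilities, cycles=260, cycle_years=1 / 52)
```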
APA, Harvard, Vancouver, ISO, and other styles
36

Zadmirzaei, Majid, Soleiman Mohammadi Limaei, Alireza Amirteimoori, and Leif Olsson. "Measuring the relative performance of forest management units: a chance-constrained DEA model in the presence of the nondiscretionary factor." Canadian Journal of Forest Research 49, no. 7 (July 2019): 788–801. http://dx.doi.org/10.1139/cjfr-2018-0229.

Full text
Abstract:
In this study, we develop a marginal chance-constrained data envelopment analysis (DEA) model in the presence of nondiscretionary inputs and hybrid outputs for the first time. We call it a stochastic nondiscretionary DEA model (SND-DEA), and it is developed to measure and compare the relative efficiency of forest management units under different environmental management systems. Furthermore, we apply an output-oriented DEA technology to both deterministic and stochastic scenarios. The required data are collected from 24 forest management plans (as decision-making units) and include four inputs and an equal number of outputs. The findings of this practical research show that the modified SND-DEA model at different probability levels gives clearly different results compared with the output from pure deterministic models. However, the correlation measures at these probability levels show a strong positive correlation between the stochastic and deterministic models. Therefore, approximately 40% of the forest management plans, based on the applied SND-DEA model, should substantially increase their average efficiency score. As the major conclusion, our developed SND-DEA model is a suitable improvement over previously developed models for discriminating the efficiency and (or) inefficiency of decision-making units in order to hedge against risk and uncertainty in this type of forest management problem.
APA, Harvard, Vancouver, ISO, and other styles
37

Ezersky, Valery, Pavel Monastyrev, and Ivan Ivanov. "THE ANALYSIS OF THERMAL PROPERTIES OF A WALL FRAGMENT MADE WITH 3D CONSTRUCTION TECHNOLOGY." International Journal for Computational Civil and Structural Engineering 15, no. 4 (December 29, 2019): 34–47. http://dx.doi.org/10.22337/2587-9618-2019-15-4-34-47.

Full text
Abstract:
The article presents the results of an analysis of the effect of the parameters of the external wall of a building constructed using 3D technology on its heat engineering properties. The dependence of the heat penetration coefficient Λ (function Y) of the wall on the following factors has been constructed: the thickness of the outer walls d1 (factor X1), the thickness of the longitudinal partitions between the voids d2 (factor X2), the number of voids in the wall cross section in the transverse direction m (factor X3), the number of voids in the wall section per 1 rm in the longitudinal direction n (factor X4), and the thermal conductivity coefficient of the heat-insulating material in the voids λ1 (factor X5), provided that the cross-sectional area of the load-bearing part of the wall is taken under the condition of ensuring strength. The data set for analysis was obtained by implementing a computational experiment. The analysis and optimization of the parameters were performed on the basis of a deterministic mathematical model that describes the presented dependence for the selected void-formation scheme in the wall. The information may be useful for scientists, designers and technologists involved in the development of structural solutions for buildings using 3D printing technology.
APA, Harvard, Vancouver, ISO, and other styles
38

Albers, Susanne, and Maximilian Janke. "Scheduling in the Random-Order Model." Algorithmica 83, no. 9 (June 9, 2021): 2803–32. http://dx.doi.org/10.1007/s00453-021-00841-8.

Full text
Abstract:
Makespan minimization on identical machines is a fundamental problem in online scheduling. The goal is to assign a sequence of jobs to m identical parallel machines so as to minimize the maximum completion time of any job. Already in the 1960s, Graham showed that Greedy is (2 - 1/m)-competitive. The best deterministic online algorithm currently known achieves a competitive ratio of 1.9201. No deterministic online strategy can obtain a competitiveness smaller than 1.88. In this paper, we study online makespan minimization in the popular random-order model, where the jobs of a given input arrive as a random permutation. It is known that Greedy does not attain a competitive factor asymptotically smaller than 2 in this setting. We present the first improved performance guarantees. Specifically, we develop a deterministic online algorithm that achieves a competitive ratio of 1.8478. The result relies on a new analysis approach. We identify a set of properties that a random permutation of the input jobs satisfies with high probability. Then we conduct a worst-case analysis of our algorithm, for the respective class of permutations. The analysis implies that the stated competitiveness holds not only in expectation but with high probability. Moreover, it provides mathematical evidence that job sequences leading to higher performance ratios are extremely rare, pathological inputs. We complement the results by lower bounds, for the random-order model. We show that no deterministic online algorithm can achieve a competitive ratio smaller than 4/3. Moreover, no deterministic online algorithm can attain a competitiveness smaller than 3/2 with high probability.
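Graham's Greedy baseline mentioned above is a few lines of code: each arriving job goes to the currently least-loaded machine. The job sequence below is an arbitrary example, not one from the paper:

```python
import heapq

def greedy_makespan(jobs, m):
    """Graham's Greedy (list scheduling): place each arriving job on the
    currently least-loaded of m identical machines; this is the
    (2 - 1/m)-competitive baseline the paper improves on in the
    random-order model."""
    loads = [0.0] * m
    heapq.heapify(loads)
    for p in jobs:
        least = heapq.heappop(loads)
        heapq.heappush(loads, least + p)
    return max(loads)

jobs = [3.0, 1.0, 4.0, 1.0, 5.0, 9.0, 2.0, 6.0]
ms = greedy_makespan(jobs, m=3)
# The makespan is always at least the average load and the largest job
```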
APA, Harvard, Vancouver, ISO, and other styles
39

Cikic, Jovana, and Zivojin Petrovic. "Knowledge as a factor of organic production improvement in farms in Vojvodina." Zbornik Matice srpske za drustvene nauke, no. 133 (2010): 87–96. http://dx.doi.org/10.2298/zmsdn1033087c.

Full text
Abstract:
The paper is focused on the analysis of the role of knowledge and the diffusion of knowledge and innovations as a factor of organic production improvement on Vojvodinian farms. The assessment of the influence of knowledge on the improvement of organic production is based upon the analysis of farmers' needs, their awareness of the significance and prospects of organic production, as well as the level of knowledge development on organic production. The empirical base for the analysis includes the results of a survey of agricultural extension agents' attitudes on factors for the development and improvement of organic production on farms, and data on the extension agents' activities towards improvement. The analysis aims to point out the position of knowledge in the deterministic frame which defines prospects for organic farming on Vojvodinian farms.
APA, Harvard, Vancouver, ISO, and other styles
40

Barger, Artem, and Dan Feldman. "Deterministic Coresets for k-Means of Big Sparse Data †." Algorithms 13, no. 4 (April 14, 2020): 92. http://dx.doi.org/10.3390/a13040092.

Full text
Abstract:
Let P be a set of n points in R^d, k ≥ 1 be an integer and ε ∈ (0, 1) be a constant. An ε-coreset is a subset C ⊆ P with appropriate non-negative weights (scalars) that approximates any given set Q ⊆ R^d of k centers. That is, the sum of squared distances over every point in P to its closest point in Q is the same, up to a factor of 1 ± ε, as the weighted sum from C to the same k centers. If the coreset is small, we can solve problems such as k-means clustering or its variants (e.g., discrete k-means, where the centers are restricted to be in P, or other restricted zones) on the small coreset to get faster provable approximations. Moreover, it is known that such coresets support streaming, dynamic and distributed data using the classic merge-reduce trees. The fact that the coreset is a subset implies that it preserves the sparsity of the data. However, existing such coresets are randomized and their size has at least linear dependency on the dimension d. We suggest the first such coreset of size independent of d. This is also the first deterministic coreset construction whose resulting size is not exponential in d. Extensive experimental results and benchmarks are provided on public datasets, including the first coreset of the English Wikipedia using Amazon’s cloud.
APA, Harvard, Vancouver, ISO, and other styles
41

Preti, Federico, and Tommaso Letterio. "Shallow landslide susceptibility assessment in a data-poor region of Guatemala (Comitancillo municipality)." Journal of Agricultural Engineering 46, no. 3 (October 1, 2015): 85. http://dx.doi.org/10.4081/jae.2015.450.

Full text
Abstract:
Although landslides are frequent natural phenomena in mountainous regions, the lack of data in emerging countries is a significant issue in the assessment of shallow landslide susceptibility. A key factor in risk-mitigation strategies is the evaluation of deterministic physical models for hazard assessment in these data-poor regions. Given the lack of physical information, input parameters to these data-intensive deterministic models have to be estimated, which has a negative impact on the reliability of the assessment. To address this problem, we examined shallow landslide hazard in Comitancillo municipality, Guatemala. Shallow landslides are here defined as small (less than two or three metre-deep) rotational or translational slides or earth flows. We based our hazard simulation on the stability index mapping model. The model’s input parameters were estimated from a statistical analysis of factors affecting landslides in the municipality obtained from a geodatabase. The outputs from the model were analysed and compared to an inventory of small-scale landslides. The results of the comparison show the effectiveness of the method developed to estimate input parameters for a deterministic model, in regions where physical data related to the assessment of shallow landslide susceptibility is lacking.
APA, Harvard, Vancouver, ISO, and other styles
42

Abdulai, Musah, and Mostafa Sharifzadeh. "Probability Methods for Stability Design of Open Pit Rock Slopes: An Overview." Geosciences 11, no. 8 (July 28, 2021): 319. http://dx.doi.org/10.3390/geosciences11080319.

Full text
Abstract:
Rock slope stability analysis can be performed using deterministic and probabilistic approaches. The deterministic analysis, based on the factor-of-safety concept, uses fixed representative values for each input parameter involved, without considering the variability and uncertainty of the rock mass properties. Probabilistic analysis, with the calculation of a probability of failure instead of a factor of safety against failure, is emerging in practice. Such analyses offer a more rational approach to quantifying risk by incorporating uncertainty in the input variables and evaluating the probability of failure of a system. In rock slope engineering, uncertainty and variability involve a large scatter of geo-structural data and varied geomechanical test results. There has been extensive reliability analysis of rock slope stability in the literature, and different reliability methods are employed for assessment of the probability of failure and the reliability of a slope. Probabilistic approaches include Monte Carlo simulation (MCS), the point estimate method (PEM), the response surface method (RSM), first- and second-order reliability methods (FORMs and SORMs), and the first-order second-moment method (FOSM). Although these methods may be complicated, they provide a more complete definition of risk. Probabilistic slope stability analysis is an option in most commercial software; however, the use of this method is not common in practice. This paper provides an overview of the literature on some of the main probabilistic reliability-based methods available for the design of rock slopes in open pit mining. To demonstrate its applicability, the paper investigates the stability of a rock slope in an open pit mine in the Goldfields region, Western Australia.
Two different approaches were adopted: deterministic stability analysis using two-dimensional limit equilibrium and finite element shear strength reduction methods, with SLIDE and RS2 software respectively, and probabilistic analysis applying the MCS and RSM methods within the limit equilibrium method. In this example, the slope stability analysis was performed using the Spencer method with Cuckoo search optimization to locate the critical slip surface. The results obtained were compared and commented on.
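For readers unfamiliar with Monte Carlo simulation (MCS) as used above, the following minimal sketch estimates a probability of failure for a textbook infinite-slope factor of safety with random shear-strength parameters. The model, parameter values and distributions are illustrative assumptions, not the paper's SLIDE/RS2 case study.

```python
# Minimal Monte Carlo sketch: sample shear-strength parameters, evaluate a
# textbook infinite-slope factor of safety, and estimate P(FS < 1).
import math
import random

def fs_infinite_slope(c, phi_deg, gamma=20.0, H=5.0, beta_deg=35.0):
    """Dry infinite-slope factor of safety (illustrative parameter values)."""
    beta, phi = math.radians(beta_deg), math.radians(phi_deg)
    tau = gamma * H * math.sin(beta) * math.cos(beta)   # driving shear stress
    sigma_n = gamma * H * math.cos(beta) ** 2           # normal stress
    return (c + sigma_n * math.tan(phi)) / tau

random.seed(0)
n = 100_000
failures = sum(
    fs_infinite_slope(random.gauss(10.0, 3.0), random.gauss(30.0, 4.0)) < 1.0
    for _ in range(n)
)
print(f"P(failure) ≈ {failures / n:.3f}")
```

The deterministic check at mean values gives FS ≈ 1.04 (nominally "safe"), yet the sampled probability of failure is substantial, which is exactly the kind of information a fixed-value analysis cannot provide.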
APA, Harvard, Vancouver, ISO, and other styles
43

Li, Wenwei, Baotian Wang, Jinyu Zuo, Bingsheng Zhou, and Haixia Zhang. "Reliability Analysis of Expansive Soil Slope Stability Based on the Three-Broken Line Model." Mathematical Problems in Engineering 2021 (March 31, 2021): 1–14. http://dx.doi.org/10.1155/2021/6665099.

Full text
Abstract:
Based on the characteristics of an expansive soil slope, the slip mass can be simplified to a simpler model with three-broken line rigid bodies. A solution was formulated to calculate the safety factors of the slope, and the results are similar to those based on the strength reduction method. However, similar to conventional methods to analyze the stability of slopes, the deterministic method to obtain the safety factors only calculates the safety factor using deterministic values without considering the randomness of soil parameters, which leads to unstable results. To improve the rationality of the calculated results, this paper aims to construct a reliability analysis method based on the simplified three-broken line model of a landslide. The reliability is calculated with the response surface method in a spreadsheet with efficiency and convenience. The designed program considers the changes in the strength of the shallow soil and the depth of the strongly weathered layer for different stages of the wetting-drying cycles and solves for the probability of failure of the sliding surface at the interface between the strong and weak weathered layers. Considering an expansive soil slope as an example, the reliability of the slope was analyzed based on laboratory test data and the proposed formula. The results show that multiple wetting-drying cycles significantly increase the probability of failure of an expansive soil slope and that the slope typically becomes unstable after six wetting-drying cycles. Slope cutting helps alleviate the adverse effects of wetting-drying cycles.
APA, Harvard, Vancouver, ISO, and other styles
44

MacKillop, Kevin, Gordon Fenton, David Mosher, Valerie Latour, and Perry Mitchelmore. "Assessing Submarine Slope Stability through Deterministic and Probabilistic Approaches: A Case Study on the West-Central Scotia Slope." Geosciences 9, no. 1 (December 28, 2018): 18. http://dx.doi.org/10.3390/geosciences9010018.

Full text
Abstract:
A simplified geostatistical approach was adopted to assess the effect of spatial variability of soil properties on slope stability analysis in order to understand continental margin geologic processes and potential geohazards for an area of the central Scotian Slope, offshore Nova Scotia, Canada. The analyses are conducted on piston core samples, thus are restricted to ~12 m sub-seabed; however, the approach provides insight into the general effects of spatial and temporal variability. Data processing using geostatistics and assessment of spatial correlation are used to characterize the current dataset. A deterministic assessment was performed for both non-spatially averaged and spatially averaged core sections. The results indicate that the estimated factor of safety increased by about 30% when spatially averaged values were used. A probabilistic model is introduced to assess reliability of the slope. The approach makes use of estimates of both the mean and variance of input random variables (e.g., Su and γb). The model uses an exact probabilistic formulation for the total stress stability analysis and a Taylor series approximation for the effective stress stability analysis. In both cases, the mean and variance of the factor of safety are computed, leading to estimates of failure probability. The results suggest that the deterministic analysis is conservative with respect to slope reliability, although they do not lead to an estimate of the probability of failure. While these results indicate sediment instability is largely unlikely under static conditions, the reality is that many examples of submarine slope failure are observed in the geologic record. These results suggest that cyclic loading (earthquakes) or pre-conditioning factors (elevation of pore pressures) are critical for slope instability on the Scotian Slope.
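The Taylor-series step described above can be sketched as a first-order second-moment (FOSM) calculation: propagate the means and standard deviations of the random inputs (e.g., Su and γb) through a factor-of-safety function, then read off a reliability index. The FS function, input statistics and normality assumption below are hypothetical illustrations, not the authors' exact formulation.

```python
# FOSM sketch: propagate means and standard deviations of random inputs
# through a factor-of-safety function via first-order Taylor expansion,
# then get a reliability index and P(FS < 1) assuming FS is normal.
import math

def fosm(fs, means, stds, h=1e-5):
    """Mean and std of FS by first-order Taylor expansion about the means."""
    mu_fs = fs(*means)
    var = 0.0
    for i, s in enumerate(stds):
        hi = list(means); hi[i] += h
        lo = list(means); lo[i] -= h
        dfdx = (fs(*hi) - fs(*lo)) / (2 * h)   # central-difference derivative
        var += (dfdx * s) ** 2
    return mu_fs, math.sqrt(var)

# Illustrative total-stress FS with random undrained strength Su (kPa) and
# bulk unit weight gamma_b (kN/m^3); slope height and angle held fixed.
def fs(su, gamma_b, H=10.0, beta=math.radians(20.0)):
    return su / (gamma_b * H * math.sin(beta) * math.cos(beta))

mu, sigma = fosm(fs, means=[50.0, 9.0], stds=[10.0, 0.5])
beta_r = (mu - 1.0) / sigma                          # reliability index
p_f = 0.5 * (1.0 - math.erf(beta_r / math.sqrt(2)))  # P(FS < 1)
print(f"mean FS = {mu:.2f}, P(failure) = {p_f:.4f}")
```

As in the abstract, the mean factor of safety alone looks comfortable, while the variance of the inputs turns it into a non-trivial probability of failure.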
APA, Harvard, Vancouver, ISO, and other styles
45

Riselo, Bianca, Larissa Passini, and Alessander Kormann. "Deterministic and probabilistic 3D analysis of slope rupture at BR 116 PR/SP highway in unsaturated soil." MATEC Web of Conferences 337 (2021): 03014. http://dx.doi.org/10.1051/matecconf/202133703014.

Full text
Abstract:
This research was developed with the purpose of presenting a deterministic and probabilistic assessment of the stability of a slope located in the state of São Paulo, Brazil, in the area denominated Serra Pelada, BR 116 PR/SP, with the incorporation of different suction scenarios in the unsaturated soil. The methodology comprised 2D and 3D modelling of the slope in SoilVision's SVSlope software, with the imposition of two water levels on the slope, one 6.5 meters deep and another 7.5 meters deep. The results demonstrate the variability of the probability of rupture, the safety factor, and the quantification of the mobilized mass volume in the six suction scenarios. It is possible to conclude from the analysis that the greater the surface suction in the unsaturated soil, the greater the safety factors of the slope and the lower the probability of rupture. It is also prudent to add that incorporating the variability of the geotechnical parameters in the probabilistic stability analysis, together with 3D modelling of the slope, allows a more reliable analysis, presenting results of greater applicability in subsequent analyses. Finally, in conclusion, the studied slope is safe regarding its global stability against rupture.
APA, Harvard, Vancouver, ISO, and other styles
46

Kostyuk, V. "METHODOLOGY OF THE ANALYTICAL MODELING AND FACTOR ANALYSIS OF THE PRODUCTION EQUIPMENT PRODUCTIVITY." Series: Economic science 5, no. 158 (September 25, 2020): 96–102. http://dx.doi.org/10.33042/2522-1809-2020-5-158-96-102.

Full text
Abstract:
The article deals with the methodology of modeling and factor analysis of the productivity of a production equipment unit. It is emphasized that productivity is an important generalizing indicator that reflects the efficiency of production equipment use. The final results of any enterprise’s activity directly depend on its absolute value and growth rate. The change of this indicator is influenced by various factors that characterize the availability, structure and use of the production equipment in terms of time and capacity. In this regard, the factor analysis of the given indicator, i.e. the study of the influence of individual factors on its change, is of particular importance. The article emphasizes that mathematical modeling of this indicator is an important way of solving economic and statistical tasks, in particular, of studying the influence of the most important factors on the change in the productivity of a production equipment unit. The calculation of the quantitative influence of the mentioned factors on the change in the productivity of a production equipment unit is proposed to be carried out on the basis of the chain substitutions method. In the process of modeling the factor systems of this indicator, it is proposed to implement a phased factor analysis of a production equipment unit’s productivity, i.e. to sequentially decompose the value of this index into a number of its initial indicators, which, depending on the goals and objectives of the enterprise, makes it possible to calculate the influence of those factors that are the most significant and relevant at the moment.
The methodology of the analytical modeling and factor analysis of production equipment productivity given in the article allows one to present this indicator in the form of deterministic multiplicative models, to determine the influence of the most important factors on its change, to investigate the regularities of such influence, and to justify appropriate management decisions regarding the further development of the enterprise.
Keywords: methodology, modeling, productivity, method, factor.
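The chain substitutions method mentioned above can be sketched in a few lines for a deterministic multiplicative model Y = x1 · x2 · … · xn; the productivity figures below are hypothetical.

```python
# Chain substitutions for a deterministic multiplicative model
# Y = x1 * x2 * ... * xn: substitute reporting-period values for
# base-period values one factor at a time; each step's change in Y is
# attributed to the factor just substituted.
from math import prod

def chain_substitutions(base, report):
    """Return each factor's contribution to the total change in Y."""
    effects = []
    current = list(base)
    for i in range(len(base)):
        y_before = prod(current)
        current[i] = report[i]            # substitute the i-th factor
        effects.append(prod(current) - y_before)
    return effects

# Hypothetical data: output = machine-hours worked * output per hour.
base, report = [200.0, 5.0], [210.0, 5.5]
effects = chain_substitutions(base, report)
print(effects)        # → [50.0, 105.0]
print(sum(effects))   # → 155.0, i.e. Y1 - Y0 = 210*5.5 - 200*5.0
```

The per-factor effects always sum exactly to the total change in Y, which is what makes the decomposition useful; note that, as is well known for this method, the attribution depends on the order in which factors are substituted.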
APA, Harvard, Vancouver, ISO, and other styles
47

Li, D. H., C. Y. Tang, M. Jie, Albert H. C. Tsang, and Y. C. Tsim. "Determination of the Optimum Partial Factors in Structural Reliability Analysis." International Journal of Reliability, Quality and Safety Engineering 05, no. 02 (June 1998): 169–80. http://dx.doi.org/10.1142/s0218539398000170.

Full text
Abstract:
In this paper, an optimization method is used to determine the values of partial factors in structural reliability analysis. Once the proper objective function is defined, a group of optimum partial factors, which enable the objective function to take its minimum value, will need to be determined. In the present study, two kinds of objective function are considered. The conditions that have to be satisfied for optimum partial factors of these two kinds of objective function are then derived. In both cases, the result shows that the partial factors of both dead and live loads should satisfy the same proportional expression and should be inversely proportional to the partial factor of resistance force. A simple beam is used as an example to illustrate the computations involved. It is found that the design concept proposed in this paper leads to a design criterion similar to that which applies to the conventional deterministic method. Thus, this concept can be easily used in practice. The illustrative example shows that the values of the dead load and live load have a significant effect on the reliability design criteria.
APA, Harvard, Vancouver, ISO, and other styles
48

Roshchyk, I. A., and A. T. Leshkevych. "PRODUCTIVITY OF PRODUCTION RESOURCES AS A FACTOR OF CONSTRUCTION ENTERPRISES’ PROFITABILITY IN UKRAINE." THEORETICAL AND APPLIED ISSUES OF ECONOMICS, no. 43 (2021): 45–54. http://dx.doi.org/10.17721/tppe.2021.43.5.

Full text
Abstract:
The article proposes a model of deterministic factor analysis which allows one to assess the impact of the production resources’ productivity, as well as indicators of the structure of the enterprise’s income and expenses, on the profitability of all of the enterprise’s activities. As a result of applying this model for factor analysis, it is possible to justify ways to increase the profitability of the enterprise in view of its various activities. The dynamics of the profitability of all and of operational activities in construction enterprises of Ukraine for 2010-2020 is analyzed. These indicators are compared with the corresponding indicators characterizing the average enterprise in the economy of Ukraine. The analysis of the profitability of all construction enterprises’ activities in view of their basic activity and size is carried out. It was found that construction companies, with the exception of 2018 and 2019, operated inefficiently, receiving a net loss. Such financial results are much worse than the average results of enterprises in Ukraine. Although operating activities were profitable, they did not reach the average values in the Ukrainian economy. Enterprises engaged in the organization or direct construction of buildings and communications were the least efficient. Small businesses were more likely to suffer losses than large and medium-sized ones. On the basis of the model proposed in the article, a deterministic factor analysis of the profitability of all construction enterprises’ activities for 2014-2020 was carried out by the method of chain substitutions. It is established that the production resources’ productivity of this type of activity decreased annually during the analyzed period, except for 2016. The total reduction in production resources’ productivity was 0.216 UAH/UAH. It is concluded that this factor was a stable reserve for increasing profitability.
It is concluded that in order to increase the profitability of all activities in construction enterprises of Ukraine, it is important to manage the production resources’ productivity.
APA, Harvard, Vancouver, ISO, and other styles
49

Liu, Keqiang, Yunjia Wang, Lixin Lin, and Guoliang Chen. "An Analysis of Impact Factors for Positioning Performance in WLAN Fingerprinting Systems Using Ishikawa Diagrams and a Simulation Platform." Mobile Information Systems 2017 (2017): 1–20. http://dx.doi.org/10.1155/2017/8294248.

Full text
Abstract:
Many factors influence the positioning performance of WLAN RSSI fingerprinting systems, and summarizing these factors is an important but challenging job. Moreover, impact analysis of non-algorithm factors is significant for system application and quality control, but little research has been conducted. This paper analyzes and summarizes the potential impact factors by using an Ishikawa diagram considering radio signal transmitting, propagating, receiving, and processing. A simulation platform was developed to facilitate the analysis experiment, and the paper classifies the potential factors into controllable, uncontrollable, nuisance, and held-constant factors considering simulation feasibility. It takes five non-algorithm controllable factors, including AP density, AP distribution, radio signal propagation attenuation factor, radio signal propagation noise, and RP density, into consideration and adopts the OFAT analysis method in the experiment. The positioning result was achieved by using deterministic and probabilistic algorithms, and the error is presented by RMSE and CDF. The results indicate that high AP density, signal propagation attenuation factor, and RP density, with a low signal propagation noise level, are favorable to better performance, while AP distribution has no particular impact pattern on the positioning error. Overall, this paper makes a potentially significant contribution to the quality control of WLAN fingerprinting solutions.
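As a minimal illustration of the deterministic algorithm family used above, the following sketch performs nearest-neighbor matching of an observed RSSI vector against a radio map of reference points; the coordinates and RSSI values are invented for the example.

```python
# Deterministic WLAN fingerprinting sketch: nearest-neighbor matching of an
# observed RSSI vector against a radio map of reference points (RPs).
import math

def nn_position(radio_map, observed):
    """Return the (x, y) of the RP whose stored RSSI vector is closest."""
    best_xy, best_d = None, float("inf")
    for xy, rssi in radio_map.items():
        d = math.dist(rssi, observed)    # Euclidean distance in signal space
        if d < best_d:
            best_xy, best_d = xy, d
    return best_xy

radio_map = {                 # RP coordinates (m) -> RSSI from three APs (dBm)
    (0.0, 0.0): [-40, -70, -80],
    (5.0, 0.0): [-70, -45, -75],
    (0.0, 5.0): [-75, -72, -42],
}
print(nn_position(radio_map, [-42, -68, -79]))   # → (0.0, 0.0)
```

The factors studied in the paper map directly onto this sketch: AP density sets the vector length, RP density sets the radio-map resolution, and propagation noise perturbs the observed vector.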
APA, Harvard, Vancouver, ISO, and other styles
50

Ridwan Harimansyah, Faikar, and Tukhas Shilul Imaroh. "AIRCRAFT SPARE PARTS INVENTORY MANAGEMENT ANALYSIS ON AIRFRAME PRODUCT USING CONTINUOUS REVIEW METHODS." Dinasti International Journal of Management Science 2, no. 1 (September 23, 2020): 81–90. http://dx.doi.org/10.31933/dijms.v2i1.528.

Full text
Abstract:
The research aims to find the factors that cause high inventory value and to increase forecasting precision, service level and cost efficiency with fishbone diagrams and the proposed methods. The research sample is 9 spare parts included in classification A of the ABC analysis and the 2018 maintenance list. The forecasting methods used are Moving Average, Single Exponential Smoothing and the Syntetos-Boylan Approximation, together with Mean Square Error calculation, deterministic inventory calculation and the Continuous Review method. The results of this study are an increase in logistics costs of $808.71 under the inventory management proposal, an increase in service level from 95% to 99%, and a smaller error value in the calculation using the proposed method. This study also found that the factor causing the high inventory value was inaccurate planning methods, so other comparative methods were needed that could increase the precision of demand forecasting.
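The deterministic inventory calculations named above can be sketched as an EOQ order quantity plus a continuous-review reorder point with safety stock; the demand, cost and lead-time figures below are hypothetical, not the paper's spare-parts data.

```python
# Continuous-review (s, Q) sketch: EOQ order quantity plus a reorder point
# whose safety stock targets a given service level.
import math

def eoq(annual_demand, order_cost, holding_cost):
    """Economic order quantity."""
    return math.sqrt(2.0 * annual_demand * order_cost / holding_cost)

def reorder_point(daily_mean, daily_std, lead_time_days, z):
    """Mean lead-time demand plus z-scaled safety stock."""
    safety_stock = z * daily_std * math.sqrt(lead_time_days)
    return daily_mean * lead_time_days + safety_stock

Q = eoq(annual_demand=1200.0, order_cost=50.0, holding_cost=6.0)
s = reorder_point(daily_mean=4.0, daily_std=1.5, lead_time_days=9.0, z=2.33)  # z≈2.33 for ~99%
print(f"order quantity Q ≈ {Q:.1f}, reorder point s ≈ {s:.1f}")
```

Raising the service-level target from 95% to 99%, as in the study, corresponds here to raising z from about 1.645 to about 2.33, which enlarges the safety stock and hence the reorder point.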
APA, Harvard, Vancouver, ISO, and other styles
