Academic literature on the topic 'Probabilistic-statistical method'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Probabilistic-statistical method.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Probabilistic-statistical method"

1

Bidyuk, Petro I., and Nataliia V. Kuznietsova. "Probabilistic-Statistical Method for Risk Assessment of Financial Losses." Research Bulletin of the National Technical University of Ukraine "Kyiv Politechnic Institute", no. 2 (June 12, 2018): 7–17. http://dx.doi.org/10.20535/1810-0546.2018.2.128989.

2

Holmes, C. C., and N. M. Adams. "A probabilistic nearest neighbour method for statistical pattern recognition." Journal of the Royal Statistical Society: Series B (Statistical Methodology) 64, no. 2 (May 2002): 295–306. http://dx.doi.org/10.1111/1467-9868.00338.

3

Liu, Yan. "Analysis and Research on the Application of Sampling Big Data Statistical Method." BCP Business & Management 13 (November 16, 2021): 164–69. http://dx.doi.org/10.54691/bcpbm.v13i.86.

Abstract:
Under current statistical conditions and technology, published statistical data lag behind events: reports take time to complete, which can delay judgments about the current economic situation. Real-time network analysis based on big data has therefore gradually become the main mode of data analysis. This paper puts forward a basic framework for statistical inference from non-probability samples. Sample selection can rely on sample matching, link-tracing sampling and similar techniques, so that the resulting non-probability samples behave like probability samples and the inference theory developed for probability samples can be applied. Both random and non-random sampling techniques still have many applicable settings, ranging from traditional sample surveys to modern, information-rich scenarios.
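
One way to picture the sample-matching idea mentioned above is the following minimal sketch, in which each unit of a large non-probability (web) sample is matched to its nearest neighbour in a small probability reference sample and borrows that unit's design weight. All data, variable names and the nearest-neighbour weighting scheme are illustrative assumptions, not the author's procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: a large non-probability (web) sample and a small
# probability reference sample sharing two covariates (e.g. age, income).
web_X = rng.normal(loc=[40.0, 30.0], scale=[12.0, 8.0], size=(5000, 2))
web_y = 2.0 + 0.05 * web_X[:, 0] + 0.02 * web_X[:, 1] + rng.normal(0.0, 1.0, 5000)

ref_X = rng.normal(loc=[45.0, 28.0], scale=[15.0, 9.0], size=(300, 2))
ref_w = rng.uniform(50.0, 150.0, size=300)   # design weights of the reference sample

# Standardise covariates so both dimensions contribute comparably to the distance.
mu, sd = web_X.mean(axis=0), web_X.std(axis=0)
web_Z, ref_Z = (web_X - mu) / sd, (ref_X - mu) / sd

# Match every web unit to its nearest reference unit and borrow that unit's weight.
dists = ((web_Z[:, None, :] - ref_Z[None, :, :]) ** 2).sum(axis=2)
borrowed_w = ref_w[dists.argmin(axis=1)]

# Weighted estimate that treats the matched web sample as if it were a probability sample.
print(f"matched-sample estimate of the population mean: "
      f"{np.average(web_y, weights=borrowed_w):.3f}")
```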
4

Pelizzola, Alessandro. "Cluster variation method in statistical physics and probabilistic graphical models." Journal of Physics A: Mathematical and General 38, no. 33 (August 3, 2005): R309–R339. http://dx.doi.org/10.1088/0305-4470/38/33/r01.

5

Smugala, S., and D. Kubečková. "Construction Process Duration Predicted by Statistical Method." IOP Conference Series: Materials Science and Engineering 1203, no. 3 (November 1, 2021): 032135. http://dx.doi.org/10.1088/1757-899x/1203/3/032135.

Abstract:
Many construction projects today are planned and managed using computer technology. An integral part of the management of these projects is sophisticated software, which includes statistical probabilistic methods. The main task in this area is direct verification of the validity of planned labour productivity values during the construction process against the recorded average performance values. Using selected statistical methods and analyses, a case study can document this type of undertaking, for example, in a selected masonry process in which the upper and lower limits of performance, i.e. the optimistic and pessimistic bounds, may be calculated with 95% probability. Evaluation of these performance parameters in the construction software used for this study showed a difference in time of 11 days at the end of the process. The figures indicated a 9.6% and 14.3% decrease in labour productivity, respectively, for the optimistic and pessimistic values compared to the construction software's planned values. Repeated evaluation of performance can aid in improving labour productivity and attaining project milestones and subsequent construction deadlines during the construction process. This paper aims to confirm or refute this theoretical balance using probabilistic statistical methods and to emphasize the importance of statistical analysis in the real construction process with the use of the software.
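
The 95% optimistic and pessimistic bounds described above can be illustrated with a generic confidence-interval calculation for mean productivity; the data and the t-interval below are an assumed sketch, not the construction software's algorithm.

```python
import numpy as np
from scipy import stats

# Hypothetical recorded average performance values for a masonry crew (m2 per day).
performance = np.array([14.2, 15.1, 13.8, 14.9, 15.6, 13.5, 14.4, 15.0, 14.7, 13.9])

n = performance.size
mean = performance.mean()
sem = performance.std(ddof=1) / np.sqrt(n)

# Two-sided 95% confidence interval for mean productivity: the lower bound plays
# the role of the pessimistic value, the upper bound the optimistic one.
lower, upper = stats.t.interval(0.95, df=n - 1, loc=mean, scale=sem)
print(f"mean = {mean:.2f}, pessimistic = {lower:.2f}, optimistic = {upper:.2f} m2/day")
```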
6

Hirohata, Kenji, Katsumi Hisano, Hiroyuki Takahashi, Minoru Mukai, Noriyasu Kawamura, Hideo Iwasaki, Takashi Kawakami, Qiang Yu, and Masaki Shiratori. "Proposal of the Method for Multidisciplinary Reliability Analysis Based on Statistical and Probabilistic Methods." Transactions of the Japan Society of Mechanical Engineers Series A 71, no. 705 (2005): 740–48. http://dx.doi.org/10.1299/kikaia.71.740.

7

Kurguzov, Konstantin V., Igor K. Fomenko, and Daria D. Shubina. "Probabilistic and statistical modeling of loads and forces." Vestnik MGSU, no. 9 (September 2020): 1249–61. http://dx.doi.org/10.22227/1997-0935.2020.9.1249-1261.

Abstract:
Introduction. At present, numerical methods enjoy widespread use in construction practice. They enable performing and analyzing complex non-linear, multi-factor models without excessive analytical procedures. However, as a rule, the most complex tasks, performed in a three-dimensional setting with account taken of physical, geometric and other nonlinearities, are performed in deterministic formulations without the analysis of the stochastic nature of physical processes. This seems particularly strange, given that numerical methods are well-suited for modeling stochastic processes. Numerical probabilistic and statistical approaches (PSA) can be applied to simulate and take into consideration various spatiotemporal aspects of the probabilistic nature of loads and forces, structural system resistances, materials and geological terrains. Even the most advanced numerical models of deterministic physical systems are merely a specific case of probabilistic and statistical modeling: they enable obtaining only one value (point) on the whole field of possible implementations, being unable to demonstrate an objective and exhaustive variety of probable outcomes. This article presents a case study of numerical probabilistic and statistical analyses of loads and forces. Methods of research. Materials from different sources, such as reference books, regulatory documents, laboratory test results, as well as available experimental data, were used as input parameters. The principal calculation and analysis of the integral function of loads was performed using the Monte Carlo numerical method of probabilistic and statistical modeling and various theoretical (statistical) and empirical distributions, followed by the quantitative assessment of design loads at various confidence probability values. Results. This study provides an example of the probabilistic and statistical calculation (determination) of the integral function of loads and forces with account taken of different origins of loads and varied input parameter distribution patterns, including empirical distributions. It has proven great importance of accurate description of initial distributions of a random value for the determination of reliable design load values. Conclusions. Probabilistic and statistical approaches have the ability to objectively assess the performance of structural systems based on the quantitative assessment of the probabilistic nature of load factors. These approaches have huge potential for increasing the reliability of buildings and structures and the cost effectiveness of construction projects.
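
A minimal sketch of the kind of Monte Carlo load analysis described above is given below. The marginal distributions, their parameters and the simple additive load combination are assumptions chosen for illustration only, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000   # number of Monte Carlo realisations

# Assumed marginal distributions of the individual loads (kN/m2).
dead = rng.normal(loc=5.0, scale=0.25, size=n)               # permanent load
live = rng.lognormal(mean=np.log(1.5), sigma=0.35, size=n)   # imposed load
snow = rng.gumbel(loc=0.8, scale=0.3, size=n)                # annual-maximum snow load

total = dead + live + snow   # simple additive combination, for illustration only

# Empirical distribution of the combined load and design values at several
# confidence (non-exceedance) probabilities.
for p in (0.95, 0.99, 0.999):
    print(f"design load at confidence {p:.3f}: {np.quantile(total, p):.2f} kN/m2")
```

The same loop structure extends to empirical input distributions: replace the parametric samplers with draws from measured data (for example by bootstrap resampling), and the quantiles of the simulated combined load again give design values at the chosen confidence probabilities.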
8

Wu, Yongxin, Yufeng Gao, Limin Zhang, and Jun Yang. "How distribution characteristics of a soil property affect probabilistic foundation settlement — from the aspect of the first four statistical moments." Canadian Geotechnical Journal 57, no. 4 (April 2020): 595–607. http://dx.doi.org/10.1139/cgj-2019-0089.

Abstract:
The effects of the first four statistical moments defining the statistical characteristics of the elastic modulus on probabilistic foundation settlement are investigated in this study. By combining the Hermite probability model and the spectral representation method, a method to simulate non-Gaussian homogeneous fields based on the first four statistical moments is proposed. Linear elastic finite element models are employed to study the total settlement and the differential settlement of a shallow foundation. Probabilistic measures of total and differential settlement obtained by Monte Carlo simulations are presented. For the cases considered, the effects of the skewness and kurtosis defining the probabilistic characteristics of the elastic modulus on the total and differential settlement are illustrated. The computed results show that skewness has a more significant effect on probabilistic foundation settlement than kurtosis, and the case with the smallest skewness is observed to be the most critical.
9

Užpolevičius, Benediktas. "Probabilistic and Statistical Methods for Designing and Analyzing Limit States / Tikimybinis ir statistinis ribinių būvių projektavimo ir analizės metodai." Journal of Civil Engineering and Management 7, no. 5 (October 31, 2001): 413–18. http://dx.doi.org/10.3846/13921525.2001.10531763.

Abstract:
The errors of the partial reliability factor (PRF) method used in current codes to provide the required reliability of construction works (foundations, structures and their systems) are discussed. It is pointed out that additional resistance (strength, stiffness, etc.) of the members is required to compensate for these errors, and that it amounts to well over 10%. The main cause of these shortcomings is that, for the sake of simplicity, the PRF method applies independent partial coefficients and only a limited number of them. Direct probabilistic and statistical methods (without partial coefficients and design values) are proposed, and it is demonstrated that they are free of the systematic errors characteristic of the known codified (standardised) probability calculations. The proposed methods form a unified approach for foundations and structures, based uniformly on the theoretical conclusions of probability theory and mathematical statistics. These economy-seeking methods are intended for effective use (without changing the codes) of available additional information, consisting of varying amounts of statistical measurements or observations: minimum values of the mechanical characteristics of soils and building materials, maximum values of loads, actions and their combinations during the operation period, geometrical dimensions of critical cross-sections, errors of the algorithms used to calculate resistance and action effects, control errors made by people during design, construction and operation of construction works, etc. When only a limited amount of information is available, the proposed statistical model is to be used. In addition, the method is adapted to make effective use of known reliability solutions for economic and social optimisation (in relation to the economic and social damage caused by the limit state, human safety and other factors), coordinating them with experience from the PRF code method and multidimensional statistical data on limit-state frequency. Areas of application and the economy of the proposed probabilistic and statistical methods are presented in Table 1. National and previously established specifications prepared for probabilistic and statistical design calculations and tests of foundations, structures and their systems are discussed, together with a practical and economic comparison of member design by the PRF and by the probabilistic and statistical methods (cross-sectional area and foundation pad base area, in % relative to values determined by SNiP codes).
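
For orientation, the 'direct' probabilistic calculation advocated here rests on the classical resistance versus action-effect formulation of structural reliability. In its simplest textbook form (not specific to this paper), with resistance R and action effect S treated as random variables,

```latex
P_f \;=\; P(R - S < 0) \;=\; \int_{-\infty}^{\infty} F_R(x)\, f_S(x)\, \mathrm{d}x ,
\qquad
\beta \;=\; \frac{\mu_R - \mu_S}{\sqrt{\sigma_R^{2} + \sigma_S^{2}}},
\qquad
P_f \;=\; \Phi(-\beta) \quad \text{(independent normal } R, S\text{)} .
```

The partial-factor format approximates this calculation with a limited set of fixed coefficients and design values, which is the source of the systematic errors the abstract refers to.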
10

Karakus, S., and K. Demırcı. "Statistical Convergence of Double Sequences on Probabilistic Normed Spaces." International Journal of Mathematics and Mathematical Sciences 2007 (2007): 1–11. http://dx.doi.org/10.1155/2007/14737.

Abstract:
The concept of statistical convergence was presented by Steinhaus in 1951 and was extended to double sequences by Mursaleen and Edely in 2003. Karakus has recently introduced the concept of statistical convergence of ordinary (single) sequences on probabilistic normed spaces. In this paper, we define statistical analogues of convergence and of the Cauchy condition for double sequences on probabilistic normed spaces. We then give an example showing that our notion of convergence is stronger than the usual convergence on probabilistic normed spaces. We also give a useful characterization of statistically convergent double sequences.
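
For orientation, the density-based notion underlying the paper can be stated in its standard form (a textbook formulation built on the double natural density of Mursaleen and Edely, not a quotation from the paper):

```latex
\delta_2(A) \;=\; \lim_{m,n\to\infty} \frac{\bigl|\{(j,k)\in A : j\le m,\ k\le n\}\bigr|}{mn},
\qquad
x = (x_{jk}) \ \text{is statistically convergent to}\ L
\;\Longleftrightarrow\;
\delta_2\bigl(\{(j,k) : |x_{jk}-L|\ge \varepsilon\}\bigr)=0 \ \ \text{for every}\ \varepsilon>0 .
```

On a probabilistic normed space the norm condition is replaced by a condition on the probabilistic norm, typically of the form N_{x_{jk}-L}(\varepsilon) \le 1-\lambda for each \lambda \in (0,1).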

Dissertations / Theses on the topic "Probabilistic-statistical method"

1

Sumner, Sarah. "A probabilistic and statistical method for analysis of MPEG-encoded video." Thesis, University of Ottawa (Canada), 2001. http://hdl.handle.net/10393/6084.

Abstract:
To achieve high rates of compression, the MPEG video compression standard provides methods for reducing both spatial and temporal redundancies in video data. The key step in lowering spatial redundancies is the application of the two-dimensional Discrete Cosine Transformation, while calculation of motion vectors is vital for temporal dependence reduction. Both the DCT coefficients and motion vectors can be used to provide information about the video data itself, without actually viewing or even completely decompressing the original data. After discussing different models for background noise, statistical hypothesis tests are developed to detect either the presence or motion of an object in a small area of one picture in the video sequence using either the DCT or motion vector data. The hypothesis tests are then combined into a unified cumulative sum procedure, based on the p-values of the hypothesis tests, which will signal an alarm at the point in time when an object appears in the video under investigation.
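
A generic cumulative-sum procedure on p-values, in the spirit of (but not identical to) the unified procedure described above, might look like the sketch below. The simulated p-values, the score transform and the alarm threshold are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated p-values from per-block hypothesis tests: uniform while no object is
# present, then stochastically small once an object appears at frame 300.
p_values = np.concatenate([rng.uniform(size=300), rng.beta(0.3, 1.0, size=200)])

# Under H0 a p-value is Uniform(0, 1), so -log(p) has mean 1 and the score
# s_t = -log(p_t) - 1 has zero drift before the change and positive drift after it.
scores = -np.log(np.clip(p_values, 1e-12, 1.0)) - 1.0

threshold = 10.0            # alarm threshold, tuned to the desired false-alarm rate
cusum, alarm_at = 0.0, None
for t, s in enumerate(scores):
    cusum = max(0.0, cusum + s)
    if cusum > threshold:
        alarm_at = t
        break

print(f"alarm raised at frame index: {alarm_at}")
```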
2

Ламнауер, Наталія Юріївна. "Забезпечення високої точності лінійних розмірів деталей на основі ймовірносно-статистичних методів оцінки якості обробки" [Ensuring high accuracy of the linear dimensions of parts on the basis of probabilistic-statistical methods for assessing machining quality]. Thesis, Українська інженерно-педагогічна академія, 2018. http://repository.kpi.kharkov.ua/handle/KhPI-Press/36083.

Abstract:
Dissertation for the degree of Doctor of Technical Sciences in speciality 05.02.08 – Manufacturing Engineering. National Technical University "Kharkiv Polytechnic Institute", Kharkiv, 2018. The thesis solves a problem of modern machine building of national economic importance: ensuring the machining quality of machine parts with respect to dimensional accuracy under batch production. Using probabilistic-statistical methods, a new model of the distribution of part dimensions is proposed and estimators of its parameters are obtained, which makes it possible to build a methodology for forecasting, analysing and controlling the quality of the machining process so that the required accuracy is achieved. The effectiveness of the proposed accuracy-control methods is studied comprehensively, and they are shown to be suitable for batch production. A new approach to grouping parts by size is also proposed, based on minimising the absolute deviation of a dimension from its modal value. The model for calculating a generalised product quality index, built from a discrete mixture of the distributions of the largest and smallest members of the sample, is improved. The results were validated on serial products and implemented at machine-building enterprises of Ukraine.
3

Ламнауер, Наталія Юріївна. "Забезпечення високої точності лінійних розмірів деталей на основі ймовірносно-статитичних методів оцінки якості обробки" [Ensuring high accuracy of the linear dimensions of parts on the basis of probabilistic-statistical methods for assessing machining quality]. Thesis, НТУ "ХПІ", 2018. http://repository.kpi.kharkov.ua/handle/KhPI-Press/35979.

Abstract:
Dissertation for the degree of Doctor of Technical Sciences in speciality 05.02.08 – Manufacturing Engineering. National Technical University "Kharkiv Polytechnic Institute", Kharkiv, 2018. The thesis solves a problem of modern machine building of national economic importance: ensuring the machining quality of machine parts with respect to dimensional accuracy under batch production. Using probabilistic-statistical methods, a new model of the distribution of part dimensions is proposed and estimators of its parameters are obtained, which makes it possible to build a methodology for forecasting, analysing and controlling the quality of the machining process so that the required accuracy is achieved. The effectiveness of the proposed accuracy-control methods is studied comprehensively, and they are shown to be suitable for batch production. A new approach to grouping parts by size is also proposed, based on minimising the absolute deviation of a dimension from its modal value. The model for calculating a generalised product quality index, built from a discrete mixture of the distributions of the largest and smallest members of the sample, is improved. The results were validated on serial products and implemented at machine-building enterprises of Ukraine.
4

Dubost, Julien. "Variabilité et incertitudes en géotechnique : de leur estimation à leur prise en compte." Thesis, Bordeaux 1, 2009. http://www.theses.fr/2009BOR13808/document.

Abstract:
The current evolution of geotechnical engineering places the management of risks of geotechnical origin at the heart of its objectives. Development projects are becoming more complex (through the cost, schedule and performance targets they must meet), while the sites chosen to host them increasingly present difficult geotechnical conditions. These unfavourable conditions usually mean strong variability of soil properties, which makes site investigation and data analysis more complex. This thesis deals with the characterisation of the natural variability of soils and of the uncertainties attached to geotechnical investigations, with the aim of taking them into account more effectively in the design of structures. It is set in the context of managing project risks of geotechnical origin. The main statistical tools used to describe data scatter and spatial structure (geostatistics), as well as the probabilistic methods that allow their results to be used in calculations, are presented from the point of view of their application in geotechnical engineering. The approach is applied to a railway platform project. This infrastructure was built on a geologically and geotechnically complex site and now shows significant deformations due to soil settlement. A new analysis of the geotechnical data was therefore undertaken. The data were first gathered in a database, which eased their statistical and geostatistical treatment. Their statistical and spatial variability was characterised, allowing a better understanding of the site. The geological and geotechnical model thus established was then used to compute settlements. A three-level approach is proposed (global, local and spatialised), giving estimates of the settlements and of their uncertainty at the scale of the whole site, at the borehole locations, and over the study area, respectively. The results clearly show the value of statistical and geostatistical methods for characterising complex sites and for building a suitable geological and geotechnical model. The proposed settlement analysis highlights the fact that parameter uncertainties propagate into the design calculations and explain the overall behaviour of the infrastructure. These results can be expressed as a probability of failure, which can then be used in decision-making and risk management. More broadly, this thesis contributes to the design and analysis of geotechnical investigation campaigns, with the aim of identifying, estimating and taking into account the variability and uncertainty of the data at the various stages of a project, in order to better control risks of geotechnical origin.
5

Tschumitschew, Katharina. "Statistical and Probabilistic Methods for Data Stream Mining." Supervised by Frank Klawonn. Braunschweig: Technische Universität Braunschweig, 2012. http://d-nb.info/1175823864/34.

6

Yiu, Wai-sing Boris (姚維勝). "A fast probabilistic method for vehicle detection and tracking with an explicit contour model." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2005. http://hub.hku.hk/bib/B35057178.

7

Brown, Andrew Michael. "Development of a probabilistic dynamic synthesis method for the analysis of non-deterministic structures." Diss., Georgia Institute of Technology, 1998. http://hdl.handle.net/1853/19065.

8

Verhein, Florian. "Generalised Interaction Mining: Probabilistic, Statistical and Vectorised Methods in High Dimensional or Uncertain Databases." Diss., lmu, 2010. http://nbn-resolving.de/urn:nbn:de:bvb:19-125388.

9

Yiu, Wai-sing Boris. "A fast probabilistic method for vehicle detection and tracking with an explicit contour model." Click to view the E-thesis via HKUTO, 2005. http://sunzi.lib.hku.hk/hkuto/record/B35057178.

10

Кузнєцова, Наталія Володимирівна. "Методи і моделі аналізу, оцінювання та прогнозування ризиків у фінансових системах" [Methods and models for the analysis, assessment and forecasting of risks in financial systems]. Doctoral thesis, Kyiv, 2018. https://ela.kpi.ua/handle/123456789/26340.

Abstract:
The work was carried out at the Institute for Applied System Analysis of the National Technical University of Ukraine "Igor Sikorsky Kyiv Polytechnic Institute".
The dissertation develops a systemic methodology for the analysis and assessment of financial risks, based on the principles of system analysis and risk management as well as on the proposed principles of adaptive and dynamic risk management. The methodology includes a combined method for handling incomplete and missing data, a probabilistic-statistical method for assessing the risk of financial losses, a dynamic risk assessment method that involves building various types of survival models, a method of structural-parametric adaptation, the application of a scoring card to the risk analysis of financial systems, and a neuro-fuzzy method for augmenting the sample with rejected applications. It also provides criteria for taking information risk into account, for assessing the quality of data, forecasts and decisions, a quadratic criterion of risk-treatment quality, and an integral characteristic for evaluating the effectiveness of risk management methods. The practical value of the results lies in the creation of an extended information technology and a decision-support information system based on the proposed systemic methodology.

Books on the topic "Probabilistic-statistical method"

1

Reliability: Probabilistic models and statistical methods. Englewood Cliffs, N.J.: Prentice Hall, 1995.

2

Neuenschwander, Daniel. Probabilistic and Statistical Methods in Cryptology. Berlin, Heidelberg: Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/b97045.

3

Mari, Jean-François. Probabilistic and statistical methods in computer science. Boston: Kluwer Academic Publishers, 2001.

4

Mari, Jean-François. Probabilistic and Statistical Methods in Computer Science. Boston, MA: Springer US, 2001.

5

Mari, Jean-François, and René Schott. Probabilistic and Statistical Methods in Computer Science. Boston, MA: Springer US, 2001. http://dx.doi.org/10.1007/978-1-4757-6280-8.

6

Eerola, Mervi. Probabilistic causality in longitudinal studies. New York: Springer-Verlag, 1994.

7

Sharov, Valery Dmitryevich, Vadim Vadimovich Vorobyov, and Dmitry Alexandrovich Zatuchny. Probabilistic-Statistical Methods for Risk Assessment in Civil Aviation. Singapore: Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-16-0092-0.

8

Stapor, Katarzyna. Introduction to Probabilistic and Statistical Methods with Examples in R. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-45799-0.

9

Neuenschwander, Daniel. Probabilistic and statistical methods in cryptology: An introduction by selected topics. Berlin: Springer, 2004.

10

Brown, Andrew M. Probabilistic component mode synthesis of nondeterministic substructures. Washington, DC: [National Aeronautics and Space Administration], 1997.


Book chapters on the topic "Probabilistic-statistical method"

1

Tanaka, Kazuyuki. "Probabilistic Computational Method in Image Restoration based on Statistical-mechanical Technique." In Soft Computing in Industrial Applications, 401–14. London: Springer London, 2000. http://dx.doi.org/10.1007/978-1-4471-0509-1_35.

2

Tanaka, Kazuyuki. "Review of Sublinear Modeling in Probabilistic Graphical Models by Statistical Mechanical Informatics and Statistical Machine Learning Theory." In Sublinear Computation Paradigm, 165–275. Singapore: Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-16-4095-7_10.

Abstract:
We review sublinear modeling in probabilistic graphical models by statistical mechanical informatics and statistical machine learning theory. Our statistical mechanical informatics schemes are based on advanced mean-field methods including loopy belief propagations. This chapter explores how phase transitions appear in loopy belief propagations for prior probabilistic graphical models. The frameworks are mainly explained for loopy belief propagations in the Ising model which is one of the elementary versions of probabilistic graphical models. We also expand the schemes to quantum statistical machine learning theory. Our framework can provide us with sublinear modeling based on the momentum space renormalization group methods.
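
As a concrete illustration of the loopy belief propagation discussed in this chapter, here is a minimal didactic sketch for the ferromagnetic Ising model on a small grid. The coupling J, field h, grid size and synchronous update schedule are arbitrary choices, not the authors' code.

```python
import itertools
import numpy as np

J, h = 0.4, 0.1            # pairwise coupling and external field (arbitrary values)
L = 4                      # 4 x 4 grid
states = np.array([-1.0, 1.0])

nodes = list(itertools.product(range(L), range(L)))

def neighbours(v):
    i, j = v
    cand = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
    return [u for u in cand if 0 <= u[0] < L and 0 <= u[1] < L]

# Local factor phi(x) = exp(h x) and pairwise factor psi(x, x') = exp(J x x').
phi = np.exp(h * states)
psi = np.exp(J * np.outer(states, states))

# Directed messages m_{u->v}(x_v), initialised uniformly.
messages = {(u, v): np.ones(2) / 2 for u in nodes for v in neighbours(u)}

for _ in range(100):       # synchronous loopy BP updates
    new = {}
    for (u, v) in messages:
        prod = phi.copy()
        for w in neighbours(u):
            if w != v:
                prod = prod * messages[(w, u)]
        msg = psi.T @ prod                  # sum over x_u of psi(x_u, x_v) * prod(x_u)
        new[(u, v)] = msg / msg.sum()
    messages = new

# Approximate marginal (belief) at the centre node.
centre = (L // 2, L // 2)
belief = phi.copy()
for w in neighbours(centre):
    belief = belief * messages[(w, centre)]
belief /= belief.sum()
print(f"BP marginal at {centre}: P(x=-1) = {belief[0]:.3f}, P(x=+1) = {belief[1]:.3f}")
```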
3

Bacharoudis, Konstantinos, Atanas Popov, and Svetan Ratchev. "Application of Advanced Simulation Methods for the Tolerance Analysis of Mechanical Assemblies." In IFIP Advances in Information and Communication Technology, 153–67. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-72632-4_11.

Abstract:
In the frame of a statistical tolerance analysis of complex assemblies, for example an aircraft wing, the capability to predict specified, very small quantiles of the distribution of the assembly key characteristic accurately and quickly becomes crucial. The problem is significantly magnified when the tolerance synthesis problem is considered, in which several tolerance analyses are performed and thus a reliability analysis problem is nested inside an optimisation one in a fully probabilistic approach. The need to reduce computational time while accurately estimating the specified probabilities is critical. Therefore, a systematic study of several state-of-the-art simulation methods is performed, and they are critically evaluated with respect to their efficiency in dealing with tolerance analysis problems. It is demonstrated that tolerance analysis problems are characterised by high dimensionality, highly non-linear state functions, disconnected failure domains, implicit state functions and small probability estimates, so the successful implementation of reliability methods becomes a formidable task. Advanced simulation methods are combined with in-house assembly models based on the Homogeneous Transformation Matrix method as well as off-the-shelf Computer Aided Tolerance tools. The main outcome of the work is that, by using an appropriate reliability method, computational time can be reduced while the probability of defective products is accurately predicted. Furthermore, the connection of advanced mathematical toolboxes with off-the-shelf 3D tolerance tools in a process integration framework brings benefits for dealing with the tolerance allocation problem in the future using dedicated and powerful computational tools.
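
To make the small-probability difficulty concrete, the following minimal sketch estimates the defect probability of a one-dimensional tolerance stack-up by crude Monte Carlo; the dimensions, tolerances and acceptance limit are invented for illustration, and the paper itself relies on Homogeneous Transformation Matrix assembly models and more advanced sampling schemes.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 2_000_000   # crude Monte Carlo needs many samples to resolve small probabilities

# Assumed stack: a housing must accommodate three stacked parts plus a minimum clearance.
housing = rng.normal(60.00, 0.03, n)                 # housing length (mm)
parts = rng.normal(19.95, 0.02, (3, n)).sum(axis=0)  # sum of the three part lengths (mm)
clearance = housing - parts                          # assembly key characteristic (mm)

required = 0.02                                      # minimum acceptable clearance (mm)
p_defect = np.mean(clearance < required)
se = np.sqrt(p_defect * (1.0 - p_defect) / n)        # binomial standard error of the estimate
print(f"estimated defect probability: {p_defect:.2e} +/- {1.96 * se:.1e}")
```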
4

Boukhaled, Mohamed Amine, and Jean-Gabriel Ganascia. "Probabilistic Anomaly Detection Method for Authorship Verification." In Statistical Language and Speech Processing, 211–19. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-11397-5_16.

5

Lucet, Corinne, and Jean-François Manouvrier. "Exact Methods to Compute Network Reliability." In Statistical and Probabilistic Models in Reliability, 279–94. Boston, MA: Birkhäuser Boston, 1999. http://dx.doi.org/10.1007/978-1-4612-1782-4_20.

6

Amari, Shun-Ichi. "Mathematical methods of neurocomputing." In Networks and Chaos — Statistical and Probabilistic Aspects, 1–39. Boston, MA: Springer US, 1993. http://dx.doi.org/10.1007/978-1-4899-3099-6_1.

7

Neuenschwander, Daniel. "Introduction." In Probabilistic and Statistical Methods in Cryptology, 1–7. Berlin, Heidelberg: Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/978-3-540-25942-8_1.

8

Neuenschwander, Daniel. "9 Differential Cryptanalysis." In Probabilistic and Statistical Methods in Cryptology, 115–23. Berlin, Heidelberg: Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/978-3-540-25942-8_10.

9

Neuenschwander, Daniel. "10 Semantic Security." In Probabilistic and Statistical Methods in Cryptology, 125–33. Berlin, Heidelberg: Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/978-3-540-25942-8_11.

10

Neuenschwander, Daniel. "11 *Algorithmic Complexity." In Probabilistic and Statistical Methods in Cryptology, 135–38. Berlin, Heidelberg: Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/978-3-540-25942-8_12.


Conference papers on the topic "Probabilistic-statistical method"

1

De Masi, Giulia, Matteo Mattioli, and Michele Drago. "Statistical method for cyclone probabilistic assessment." In OCEANS 2015 - Genova. IEEE, 2015. http://dx.doi.org/10.1109/oceans-genova.2015.7271589.

2

Korheeva, Nadezhda A., Anatoly V. Lykin, and Lola Sh Atabaeva. "Probabilistic and Statistical Method Application for Electric Power Losses Calculation." In 2018 XIV International Scientific-Technical Conference on Actual Problems of Electronics Instrument Engineering (APEIE). IEEE, 2018. http://dx.doi.org/10.1109/apeie.2018.8545038.

3

Tõns, Tõnis, Freeman Ralph, Sören Ehlers, and Ian J. Jordaan. "Probabilistic Design Load Method for the Northern Sea Route." In ASME 2015 34th International Conference on Ocean, Offshore and Arctic Engineering. American Society of Mechanical Engineers, 2015. http://dx.doi.org/10.1115/omae2015-41841.

Abstract:
A probabilistic design method allows statistical data from the vessel's operational area to be linked with design loads, enabling a more precise safety-level assessment, which is important for safe and sustainable ship transit in ice-covered waters. Statistical design methods are well established for open water, where spectral analysis is used and wave-induced loads are estimated by linking statistical load parameters to sea-state parameters. Statistical methods for estimating ice-induced loads are also available; however, current Polar Class rules do not consider probabilistic methods for determining ice-induced loads. This paper shows how such probabilistic methods can be used for the design of ice-going ships, especially ships operating along the Northern Sea Route (NSR). The method presented here combines available data from full-scale measurements performed in the Arctic with ice conditions defined using historical satellite data. The full-scale measurements are used to develop the parent distribution, which forms the basis for the extreme load prediction based on the number of expected interactions along the NSR. Historical satellite data are used to model ice conditions, e.g. ice type and ice concentration, along the route.
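
The parent-distribution argument mentioned above can be written compactly. In its simplest form, assuming independent and identically distributed per-interaction peak loads (the full method also accounts for ice conditions along the route),

```latex
F_{\max}(x) \;=\; \bigl[F(x)\bigr]^{\,n},
\qquad
x_{e} \;=\; F^{-1}\!\bigl((1 - p_{e})^{1/n}\bigr),
```

where F is the parent distribution of the local ice load in a single ship-ice interaction, n is the expected number of interactions over the exposure period, and x_e is the load exceeded with probability p_e over that period.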
4

Laptev, D. V., and Yu A. Pasynkov. "Comparison of statistical and probabilistic model of the frequency measurement by method coincidence." In 2014 12th International Conference on Actual Problems of Electronics Instrument Engineering (APEIE). IEEE, 2014. http://dx.doi.org/10.1109/apeie.2014.7040899.

5

Reddy, R. M., and B. N. Rao. "Probabilistic Fracture Mechanics Using Fractal Finite Element Method." In ASME 2008 Pressure Vessels and Piping Conference. ASMEDC, 2008. http://dx.doi.org/10.1115/pvp2008-61109.

Abstract:
This paper presents probabilistic fracture-mechanics analysis of linear-elastic cracked structures subjected to mixed-mode (modes I and II) loading conditions using fractal finite element method (FFEM). The method involves FFEM for calculating fracture response characteristics; statistical models of uncertainties in load, material properties, and crack geometry; and the first-order reliability method for predicting probabilistic fracture response and reliability of cracked structures. The sensitivity of fracture parameters with respect to crack size, required for probabilistic analysis, is calculated using continuum shape sensitivity analysis. Numerical examples based on mode-I and mixed-mode problems are presented to illustrate the proposed method. The results show that the predicted failure probability based on the proposed formulation of the sensitivity of fracture parameter is accurate in comparison with the Monte Carlo simulation results. Since all gradients are calculated analytically, reliability analysis of cracks can be performed efficiently using FFEM.
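
The first-order reliability method used here can be summarised by its standard formulation in uncorrelated standard normal space (textbook form, not the paper's notation):

```latex
\beta \;=\; \min_{\mathbf{u}} \lVert \mathbf{u} \rVert
\quad \text{subject to} \quad G(\mathbf{u}) = 0,
\qquad
P_f \;\approx\; \Phi(-\beta),
```

where G is the limit-state function (for example, fracture toughness minus the mixed-mode stress-intensity demand) expressed in transformed standard normal variables u, beta is the reliability index and Phi is the standard normal distribution function. The analytically computed sensitivities mentioned in the abstract supply the gradients of G needed to locate this minimum efficiently.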
6

Han, Zhi-Qiu, and Hang-Yu Qin. "An Incomplete Probabilistic Linguistic Multi-criteria Group Decision-making Method Based on Statistical Variance." In 2021 33rd Chinese Control and Decision Conference (CCDC). IEEE, 2021. http://dx.doi.org/10.1109/ccdc52312.2021.9601414.

7

Lee, Jae Bong, Jai Hak Park, Hong-Deok Kim, Han-Sub Chung, and Tae Ryong Kim. "Probability of Burst in Steam Generator Tubes Using Monte Carlo Method." In ASME 2006 Pressure Vessels and Piping/ICPVT-11 Conference. ASMEDC, 2006. http://dx.doi.org/10.1115/pvp2006-icpvt-11-93469.

Abstract:
A statistical assessment model for the structural integrity of steam generator tubes was proposed using the Monte Carlo method. The growth of flaws in steam generator tubes was predicted using statistical approaches, with the statistical parameters that characterise flaw growth and initiation derived from in-service inspection (ISI) non-destructive evaluation (NDE) data. Based on these approaches, flaw growth models were proposed and applied to predict the distribution of flaw size at the end of cycle (EOC). Because NDE measurements differ from the true flaw states in steam generator tubes, a simple method for predicting the physical number of flaws from periodic in-service inspection data was also proposed. The probabilistic flaw growth rate was calculated from the in-service inspection data, and the statistical growth of flaws was simulated using the Monte Carlo method. Probability distributions of flaw size and the probability of burst were obtained from repeated simulations with the proposed assessment model.
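
A minimal Monte Carlo sketch of this kind of flaw-growth simulation is given below. Every distribution, the additive one-cycle growth law and the critical depth are assumptions for illustration, not parameters derived from plant ISI/NDE data.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500_000   # simulated flaws

# Assumed statistical inputs (shapes and scales are illustrative, not plant data).
depth_bol = 20.0 * rng.weibull(2.0, n)                        # depth at beginning of cycle, % through-wall
growth = rng.lognormal(mean=np.log(10.0), sigma=0.6, size=n)  # growth over one cycle, % through-wall

depth_eoc = depth_bol + growth                                # predicted depth at end of cycle
critical = 75.0                                               # assumed burst criterion, % through-wall

p_exceed = np.mean(depth_eoc > critical)
print(f"estimated per-flaw probability of exceeding the burst criterion: {p_exceed:.2e}")
```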
8

Mascarenhas, Nelson D. A., and Andre H. Alves. "Application of probabilistic and dictionary-based relaxation techniques to a statistical method of edge detection." In SPIE's 1994 International Symposium on Optics, Imaging, and Instrumentation, edited by Su-Shing Chen. SPIE, 1994. http://dx.doi.org/10.1117/12.179227.

9

Wei, Zhigang, Burt Lin, Kay Ellinghaus, Markus Pieszkalla, D. Gary Harlow, and Kamran Nikbin. "Statistical and Probabilistic Analysis of Thermal-Fatigue Test Data Generated Using V-Shape Specimen Testing Method." In ASME 2013 Pressure Vessels and Piping Conference. American Society of Mechanical Engineers, 2013. http://dx.doi.org/10.1115/pvp2013-97628.

Abstract:
V-shape specimen testing is a relatively new, simple and useful technique to characterize the thermal-fatigue resistance of materials subjected to combined thermal/mechanical loadings, and to rank and select materials. However, the V-shape specimen test data, similar to many other life test data, always contain an inherent scatter not only because of material non-uniformity but also of the difficulties in operating control, such as loading, boundary conditions, and environment. Therefore, statistical and probabilistic approaches have to be used to interpret the test data in order to implement the observations into new product designs. In this paper, the V-shape specimen test data are selected, analyzed and the scatter properties of the test data are fitted using several continuous probability distribution functions. The results are compared, and the root failure mechanisms of the V-specimens are also discussed. Finally, the main observations are summarized, and a recommendation is provided.
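
The distribution comparison described above can be reproduced in a few lines with standard tools; the simulated lives and the candidate distributions below are illustrative assumptions, not the paper's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)

# Simulated thermal-fatigue lives (cycles to failure) standing in for specimen data.
lives = rng.lognormal(mean=np.log(2500.0), sigma=0.45, size=30)

candidates = {
    "Weibull": (stats.weibull_min, {"floc": 0.0}),
    "lognormal": (stats.lognorm, {"floc": 0.0}),
    "normal": (stats.norm, {}),
}

for name, (dist, fixed) in candidates.items():
    params = dist.fit(lives, **fixed)        # maximum-likelihood fit
    k = len(params) - len(fixed)             # number of free parameters
    loglik = np.sum(dist.logpdf(lives, *params))
    aic = 2 * k - 2 * loglik                 # Akaike information criterion
    print(f"{name:10s}  log-likelihood = {loglik:8.2f}   AIC = {aic:8.2f}")
```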
10

Lee, Ikjin, Kyung K. Choi, and Liu Du. "Alternative Methods for Reliability-Based Robust Design Optimization Including Dimension Reduction Method." In ASME 2006 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2006. http://dx.doi.org/10.1115/detc2006-99732.

Abstract:
The objective of reliability-based robust design optimization (RBRDO) is to minimize the product quality loss function subject to probabilistic constraints. Since the quality loss function is usually expressed in terms of the first two statistical moments, mean and variance, many methods have been proposed to accurately and efficiently estimate the moments. Among the methods, the univariate dimension reduction method (DRM), performance moment integration (PMI), and percentile difference method (PDM) are recently proposed methods. In this paper, estimation of statistical moments and their sensitivities are carried out using DRM and compared with results obtained using PMI and PDM. In addition, PMI and DRM are also compared in terms of how accurately and efficiently they estimate the statistical moments and their sensitivities of a performance function. In this comparison, PDM is excluded since PDM could not even accurately estimate the statistical moments of the performance function. Also, robust design optimization using DRM is developed and then compared with the results of RBRDO using PMI and PDM. Several numerical examples are used for the two comparisons. The comparisons show that DRM is efficient when the number of design variables is small and PMI is efficient when the number of design variables is relatively large. For the inverse reliability analysis of reliability-based design, the enriched performance measure approach (PMA+) is used.
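
For reference, the univariate dimension reduction method compared in the paper approximates the performance function by a sum of one-dimensional functions (the standard form of the decomposition; quadrature details vary):

```latex
g(X_1,\dots,X_N) \;\approx\;
\sum_{i=1}^{N} g(\mu_1,\dots,\mu_{i-1},\,X_i,\,\mu_{i+1},\dots,\mu_N)
\;-\;(N-1)\,g(\mu_1,\dots,\mu_N),
```

so that the mean and variance needed in the quality loss function reduce to sums of one-dimensional integrals, each of which can be evaluated with a few quadrature points with respect to the density of X_i.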