Academic literature on the topic 'Complex random process'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Complex random process.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Complex random process"

1

Small, Michael, Lvlin Hou, and Linjun Zhang. "Random complex networks." National Science Review 1, no. 3 (July 18, 2014): 357–67. http://dx.doi.org/10.1093/nsr/nwu021.

Full text
Abstract:
Exactly what is meant by a ‘complex’ network is not clear; however, what is clear is that it is something other than a random graph. Complex networks arise in a wide range of real social, technological and physical systems. In all cases, the most basic categorization of these graphs is their node degree distribution. Particular groups of complex networks may exhibit additional interesting features, including the so-called small-world effect or being scale-free. There are many algorithms with which one may generate networks with particular degree distributions (perhaps the most famous of which is preferential attachment). In this paper, we address what it means to randomly choose a network from the class of networks with a particular degree distribution, and in doing so we show that the networks one gets from the preferential attachment process are actually highly pathological. Certain properties (including robustness and fragility) which have been attributed to the (scale-free) degree distribution are actually more intimately related to the preferential attachment growth mechanism. We focus here on scale-free networks with power-law degree sequences—but our methods and results are perfectly generic.
APA, Harvard, Vancouver, ISO, and other styles
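
The preferential-attachment mechanism this abstract scrutinizes is easy to experiment with. Below is a minimal Python sketch (standard library only) of degree-proportional attachment; the function name and parameters are illustrative, not code from the paper.

```python
import random

def preferential_attachment(n_nodes, m=1, seed=0):
    """Grow a graph in which each new node attaches to m existing nodes
    chosen with probability proportional to their current degree."""
    rng = random.Random(seed)
    edges = [(0, 1)]      # seed graph: a single edge
    targets = [0, 1]      # each node appears once per unit of degree
    for new in range(2, n_nodes):
        chosen = {rng.choice(targets) for _ in range(m)}  # degree-biased picks
        for old in chosen:
            edges.append((new, old))
            targets.extend([new, old])  # update the degree-weighted pool
    return edges

edges = preferential_attachment(10_000)
# The degree histogram of the grown network is approximately power-law,
# i.e. scale-free, as discussed in the abstract.
```
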
2

Zhang, Hai Jun, Hong Fu Zuo, and Si Hong Zhu. "Study on Modelling Random Deterioration Process for Complex Repairable System." Advanced Materials Research 156-157 (October 2010): 1356–59. http://dx.doi.org/10.4028/www.scientific.net/amr.156-157.1356.

Full text
Abstract:
Traditional probability and statistics theory cannot obtain failure lifetime data through accelerated testing for expensive and complex systems or equipment in real-time operation. Owing to the variety of system failure modes, the randomness of the system deterioration process, and the fuzziness of the system maintenance threshold, it is difficult to estimate the random deterioration process of a complex repairable system with a single parameter. In order to describe system performance deterioration more objectively, this paper proposes a generalized proportional intensity model (GPIM) that simultaneously considers the effects of various covariates such as performance parameters, environmental stress, failure types, and maintenance history. The approach provides a new method for solving the maintenance decision-making problem of complex repairable systems. The CF6-80C2A5 aero-engine is used as a case study to illustrate the practical value of the proposed method.
APA, Harvard, Vancouver, ISO, and other styles
3

van Rijn, Monique A., Johan Marinus, Hein Putter, Sarah R. J. Bosselaar, G. Lorimer Moseley, and Jacobus J. van Hilten. "Spreading of complex regional pain syndrome: not a random process." Journal of Neural Transmission 118, no. 9 (February 18, 2011): 1301–9. http://dx.doi.org/10.1007/s00702-011-0601-1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Attias, H., and Y. Alhassid. "Gaussian random-matrix process and universal parametric correlations in complex systems." Physical Review E 52, no. 5 (November 1, 1995): 4776–92. http://dx.doi.org/10.1103/physreve.52.4776.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Decreusefond, L., E. Ferraz, H. Randriambololona, and A. Vergne. "Simplicial Homology of Random Configurations." Advances in Applied Probability 46, no. 02 (June 2014): 325–47. http://dx.doi.org/10.1017/s0001867800007114.

Full text
Abstract:
Given a Poisson process on a d-dimensional torus, its random geometric simplicial complex is the complex whose vertices are the points of the Poisson process and whose simplices are given by the Čech complex associated with the coverage of each point. By means of Malliavin calculus, we compute explicitly the first three moments of the number of k-simplices and provide a way to compute higher-order moments. We then derive the mean and the variance of the Euler characteristic. Using the Stein method, we estimate the speed of convergence of the number of occurrences of any connected subcomplex as it converges towards the Gaussian law when the intensity of the Poisson point process tends to infinity. We use a concentration inequality for Poisson processes to find bounds for the tail distribution of the first Betti number and the Euler characteristic in such simplicial complexes.
APA, Harvard, Vancouver, ISO, and other styles
6

Decreusefond, L., E. Ferraz, H. Randriambololona, and A. Vergne. "Simplicial Homology of Random Configurations." Advances in Applied Probability 46, no. 2 (June 2014): 325–47. http://dx.doi.org/10.1239/aap/1401369697.

Full text
Abstract:
Given a Poisson process on a d-dimensional torus, its random geometric simplicial complex is the complex whose vertices are the points of the Poisson process and whose simplices are given by the Čech complex associated with the coverage of each point. By means of Malliavin calculus, we compute explicitly the first three moments of the number of k-simplices and provide a way to compute higher-order moments. We then derive the mean and the variance of the Euler characteristic. Using the Stein method, we estimate the speed of convergence of the number of occurrences of any connected subcomplex as it converges towards the Gaussian law when the intensity of the Poisson point process tends to infinity. We use a concentration inequality for Poisson processes to find bounds for the tail distribution of the first Betti number and the Euler characteristic in such simplicial complexes.
APA, Harvard, Vancouver, ISO, and other styles
7

Farahmand, K., A. Grigorash, and P. Flood. "Real almost zeros of random polynomials with complex coefficients." Journal of Applied Mathematics and Stochastic Analysis 2005, no. 2 (January 1, 2005): 195–209. http://dx.doi.org/10.1155/jamsa.2005.195.

Full text
Abstract:
We present a simple formula for the expected number of times that a complex-valued Gaussian stochastic process has a zero imaginary part and the absolute value of its real part is bounded by a constant value M. We show that only some mild conditions on the stochastic process are needed for our formula to remain valid. We further apply this formula to a random algebraic polynomial with complex coefficients. We show how the above expected value in the case of random algebraic polynomials varies for different behaviour of M.
APA, Harvard, Vancouver, ISO, and other styles
8

Assimakopoulos, Nikitas A. "Random environmental processes for complex computer systems: a theoretical approach." Advances in Complex Systems 02, no. 02 (June 1999): 117–35. http://dx.doi.org/10.1142/s0219525999000072.

Full text
Abstract:
In this paper, we consider various computer inventory, computer queueing and reliability computer models where complexity due to interacting components of subsystems is apparent. In particular, our analysis focuses on a multi-item inventory computer model with stochastically dependent demands, a queueing computer network where there are dependent arrival and service processes, or a reliability computer model with stochastically dependent component lifetimes. We discuss cases where this dependence is induced only by a random environmental process which the system operates in. This process represents the sources of variation that affect all deterministic and stochastic parameters of the model. Thus, not only are the parameters of the model now stochastic processes, but they are all dependent due to the common environment they are all subject to. Our objective is to provide a convincing argument that, under fairly reasonable conditions, the analytical techniques used in these models as well as their solutions are not much more complicated than those where there is no environmental variation.
APA, Harvard, Vancouver, ISO, and other styles
9

Strahov, Eugene. "Dynamical correlation functions for products of random matrices." Random Matrices: Theory and Applications 04, no. 04 (October 2015): 1550020. http://dx.doi.org/10.1142/s2010326315500203.

Full text
Abstract:
We introduce and study a family of random processes with a discrete time related to products of random matrices. Such processes are formed by singular values of random matrix products, and the number of factors in a random matrix product plays a role of a discrete time. We consider in detail the case when the (squared) singular values of the initial random matrix form a polynomial ensemble, and the initial random matrix is multiplied by standard complex Gaussian matrices. In this case, we show that the random process is a discrete-time determinantal point process. For three special cases (the case when the initial random matrix is a standard complex Gaussian matrix, the case when it is a truncated unitary matrix, or the case when it is a standard complex Gaussian matrix with a source) we compute the dynamical correlation functions explicitly, and find the hard edge scaling limits of the correlation kernels. The proofs rely on the Eynard–Mehta theorem, and on contour integral representations for the correlation kernels suitable for an asymptotic analysis.
APA, Harvard, Vancouver, ISO, and other styles
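
The discrete-time singular-value process described in this abstract can be simulated directly. The following NumPy sketch is illustrative, not code from the paper: it multiplies an initial standard complex Gaussian (Ginibre) matrix by fresh Gaussian factors and tracks squared singular values as the number of factors, playing the role of time, grows.

```python
import numpy as np

rng = np.random.default_rng(1)
n, steps = 50, 5

def ginibre(n):
    """Standard complex Gaussian (Ginibre) matrix with unit-variance entries."""
    return (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)

Y = ginibre(n)                                 # initial random matrix
for t in range(1, steps + 1):
    sv = np.linalg.svd(Y, compute_uv=False)    # singular values at "time" t
    print(t, np.round(sv[:3] ** 2, 2))         # a few squared singular values
    Y = ginibre(n) @ Y                         # multiply by a fresh Gaussian factor
```
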
10

Qi, Xing Guang, Dan Dan Dou, and Hai Lun Zhang. "Complex Paper Disease Detection Based on Probabilistic Hough Transform." Advanced Materials Research 712-715 (June 2013): 2327–30. http://dx.doi.org/10.4028/www.scientific.net/amr.712-715.2327.

Full text
Abstract:
Although the traditional Hough transform offers good robustness and anti-interference in detecting complex paper defects such as streaks and folds, its heavy computational load limits its use in processes demanding high real-time performance. The probabilistic Hough transform fits straight lines by the Hough-transform approach while selecting fitting points through random sampling: combined with a vote counter, it outputs a boundary line through the randomly drawn points once those points satisfy a preset threshold. This reduces the computation and running time of the algorithm and therefore suits the detection of complex paper defects demanding high real-time performance.
APA, Harvard, Vancouver, ISO, and other styles
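
The probabilistic Hough transform summarized above is implemented in common vision libraries. The sketch below assumes OpenCV (cv2) and a synthetic test image rather than real paper-inspection data; it only illustrates the randomized line detection the paper builds on.

```python
import cv2
import numpy as np

# Synthetic binary image: one streak-like line plus sparse random noise.
img = np.zeros((200, 200), dtype=np.uint8)
cv2.line(img, (10, 20), (190, 160), 255, 1)
noise = (np.random.rand(200, 200) > 0.995).astype(np.uint8) * 255
img = cv2.bitwise_or(img, noise)

# Probabilistic Hough transform: it votes with random subsets of edge points,
# which cuts computation relative to the standard Hough transform.
lines = cv2.HoughLinesP(img, rho=1, theta=np.pi / 180, threshold=40,
                        minLineLength=50, maxLineGap=5)
if lines is not None:
    for x1, y1, x2, y2 in lines[:, 0]:
        print("detected segment:", (x1, y1), "->", (x2, y2))
```
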

Dissertations / Theses on the topic "Complex random process"

1

Петранова, Марина Юрiївна. "Випадковi гауссовi процеси зi стiйкими кореляцiйними функцiями." Doctoral thesis, Київ, 2021. https://ela.kpi.ua/handle/123456789/40592.

Full text
Abstract:
The work was carried out at the Department of Applied Mathematics of Vasyl Stus Donetsk National University, Ministry of Education and Science of Ukraine.
The dissertation is devoted to the study of random Gaussian processes with stable correlation functions and their properties.
APA, Harvard, Vancouver, ISO, and other styles
2

Dionigi, Pierfrancesco. "A random matrix theory approach to complex networks." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2019. http://amslaurea.unibo.it/18513/.

Full text
Abstract:
A formal mathematical approach to complex networks through random matrix theory (RMT) is presented. Wigner's semicircle law is introduced as a generalization of the central limit theorem for certain ensembles of random matrices. The main methods for computing the spectral distribution of random matrices are presented and their differences highlighted. The connection between RMT and free probability is then studied. Two apparently identical types of random graphs are shown to possess different spectral properties through analysis of their adjacency matrices; from this analysis some geometric and topological properties of the graphs can be deduced, and the statistical correlation between vertices can be analysed. A random walk on the graph is then constructed via Markov chains, with the transition matrix of the process defined through a suitably normalized adjacency matrix of the network. Finally, it is shown how the dynamical behaviour of the random walk is deeply connected with the eigenvalues of the transition matrix, and the main relations are presented.
APA, Harvard, Vancouver, ISO, and other styles
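
Wigner's semicircle law, which this thesis presents as a matrix analogue of the central limit theorem, can be checked numerically in a few lines. The NumPy sketch below is a generic illustration, not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Symmetric (GOE-like) random matrix, scaled so the spectrum fills [-2, 2].
A = rng.standard_normal((n, n))
H = (A + A.T) / np.sqrt(2 * n)
eigs = np.linalg.eigvalsh(H)

# Compare the empirical spectral density with the semicircle
# rho(x) = sqrt(4 - x^2) / (2 * pi) on [-2, 2].
hist, edges = np.histogram(eigs, bins=50, range=(-2, 2), density=True)
x = (edges[:-1] + edges[1:]) / 2
semicircle = np.sqrt(np.clip(4 - x**2, 0, None)) / (2 * np.pi)
print("max deviation:", np.max(np.abs(hist - semicircle)))  # small for large n
```
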
3

Grygierek, Jens Jan. "Random Geometric Structures." Doctoral thesis, 2020. https://repositorium.ub.uni-osnabrueck.de/handle/urn:nbn:de:gbv:700-202001302552.

Full text
Abstract:
We construct and investigate random geometric structures based on a homogeneous Poisson point process. We investigate the random Vietoris-Rips complex, constructed as the clique complex of the well-known Gilbert graph, as an infinite random simplicial complex, and prove that every realizable finite subcomplex occurs infinitely many times almost surely, both as an isolated complex and, in the percolation case, connected to the unique giant component. Similar results are derived for the Čech complex. We derive limit theorems for the f-vector of the Vietoris-Rips complex on the unit cube centered at the origin and provide a central limit theorem and a Poisson limit theorem based on the model parameters. Finally, we investigate random polytopes given as convex hulls of a Poisson point process in a smooth convex body. We establish a central limit theorem for certain linear combinations of intrinsic volumes. A multivariate limit theorem involving the sequence of intrinsic volumes and the number of i-dimensional faces is derived. We derive the asymptotic normality of the oracle estimator of minimal variance for estimating the volume of a convex body.
APA, Harvard, Vancouver, ISO, and other styles
4

Gabel, Alan. "Emergent phenomena and fluctuations in cooperative systems." Thesis, 2014. https://hdl.handle.net/2144/14298.

Full text
Abstract:
We explore the role of cooperativity and large deviations on a set of fundamental non-equilibrium many-body systems. In the cooperative asymmetric exclusion process, particles hop to the right at a constant rate only when the right neighboring site is vacant and hop at a faster rate when the left neighbor is occupied. In this model, a host of new heterogeneous density profile evolutions arise, including inverted shock waves and continuous compression waves. Cooperativity also drives the growth of complex networks via preferential attachment, where well-connected nodes are more likely to attract future connections. We introduce the mechanism of hindered redirection and show that it leads to network evolution by sublinear preferential attachment. We further show that no local growth rule can recreate superlinear preferential attachment. We also introduce enhanced redirection and show that the rule leads to networks with three unusual properties: (i) many macrohubs -- nodes whose degree is a finite fraction of the number of nodes in the network, (ii) a non-extensive degree distribution, and (iii) large fluctuations between different realizations of the growth process. We next examine large deviations in the diffusive capture model, where N diffusing predators initially all located at L 'chase' a diffusing prey initially at x
APA, Harvard, Vancouver, ISO, and other styles
5

(11009496), Andrew M. Thomas. "Stochastic Process Limits for Topological Functionals of Geometric Complexes." Thesis, 2021.

Find full text
Abstract:

This dissertation establishes limit theory for topological functionals of geometric complexes from a stochastic process viewpoint. Standard filtrations of geometric complexes, such as the Čech and Vietoris-Rips complexes, have a natural parameter r which governs the formation of simplices: this is the basis for persistent homology. However, the parameter r may also be considered the time parameter of an appropriate stochastic process which summarizes the evolution of the filtration.

Here we examine the stochastic behavior of two of the foremost classes of topological functionals of such filtrations: the Betti numbers and the Euler characteristic. There are also two distinct setups in which the points underlying the complexes are generated, where the points are distributed randomly in R^d according to a general density (the traditional setup) and where the points lie in the tail of a heavy-tailed or exponentially-decaying “noise” distribution (the extreme-value theory (EVT) setup).

These results constitute some of the first results combining topological data analysis (TDA) and stochastic process theory. The first collection of results establishes stochastic process limits for Betti numbers of Čech complexes of Poisson and binomial point processes for two specific regimes in the traditional setup: the sparse regime—when the parameter r governing the formation of simplices causes the Betti numbers to concentrate on components of the lowest order; and the critical regime—when the parameter r is of the order n^(-1/d) and the geometric complex becomes highly connected with topological holes of every dimension. The second collection of results establishes a functional strong law of large numbers and a functional central limit theorem for the Euler characteristic of a random geometric complex for the critical regime in the traditional setup. The final collection of results establishes functional strong laws of large numbers for geometric complexes in the EVT setup for the two classes of “noise” densities mentioned above.

APA, Harvard, Vancouver, ISO, and other styles
6

Kouvatsos, Demetres D., and Irfan U. Awan. "Entropy Maximisation and Open Queueing Networks with Priority and Blocking." 2003. http://hdl.handle.net/10454/3084.

Full text
Abstract:
A review is carried out on the characterisation and algorithmic implementation of an extended product-form approximation, based on the principle of maximum entropy (ME), for a wide class of arbitrary finite capacity open queueing network models (QNMs) with service and space priorities. A single server finite capacity GE/GE/1/N queue with R (R>1) distinct priority classes, compound Poisson arrival processes (CPPs) with geometrically distributed batches and generalised exponential (GE) service times is analysed via entropy maximisation, subject to suitable GE-type queueing theoretic constraints, under preemptive resume (PR) and head-of-line (HOL) scheduling rules combined with complete buffer sharing (CBS) and partial buffer sharing (PBS) management schemes stipulating a sequence of buffer thresholds {N=(N1,…,NR),0
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Complex random process"

1

Haghighi, Aliakbar Montazer. Advanced mathematics for engineers with applications in stochastic processes. New York: Nova Science Publishers, 2010.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Haghighi, Aliakbar Montazer. Advanced mathematics for engineers with applications in stochastic processes. Hauppauge, N.Y: Nova Science Publishers, 2009.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Wikle, Christopher K. Spatial Statistics. Oxford University Press, 2018. http://dx.doi.org/10.1093/acrefore/9780190228620.013.710.

Full text
Abstract:
The climate system consists of interactions between physical, biological, chemical, and human processes across a wide range of spatial and temporal scales. Characterizing the behavior of components of this system is crucial for scientists and decision makers. There is substantial uncertainty associated with observations of this system as well as our understanding of various system components and their interaction. Thus, inference and prediction in climate science should accommodate uncertainty in order to facilitate the decision-making process. Statistical science is designed to provide the tools to perform inference and prediction in the presence of uncertainty. In particular, the field of spatial statistics considers inference and prediction for uncertain processes that exhibit dependence in space and/or time. Traditionally, this is done descriptively through the characterization of the first two moments of the process, one expressing the mean structure and one accounting for dependence through covariability. Historically, there are three primary areas of methodological development in spatial statistics: geostatistics, which considers processes that vary continuously over space; areal or lattice processes, which are defined on a countable discrete domain (e.g., political units); and spatial point patterns (or point processes), which treat the locations of events in space as a random process. All of these methods have been used in the climate sciences, but the most prominent has been the geostatistical methodology. This methodology was discovered simultaneously in geology and in meteorology and provides a way to do optimal prediction (interpolation) in space and can facilitate parameter inference for spatial data. These methods rely strongly on Gaussian process theory, which is increasingly of interest in machine learning. These methods are common in the spatial statistics literature, but much development is still being done in the area to accommodate more complex processes and “big data” applications. Newer approaches are based on restricting models to neighbor-based representations or reformulating the random spatial process in terms of a basis expansion. There are many computational and flexibility advantages to these approaches, depending on the specific implementation. Complexity is also increasingly being accommodated through the use of the hierarchical modeling paradigm, which provides a probabilistically consistent way to decompose the data, process, and parameters corresponding to the spatial or spatio-temporal process. Perhaps the biggest challenge in modern applications of spatial and spatio-temporal statistics is to develop methods that are flexible yet can account for the complex dependencies between and across processes, account for uncertainty in all aspects of the problem, and still be computationally tractable. These are daunting challenges, yet it is a very active area of research, and new solutions are constantly being developed. New methods are also being rapidly developed in the machine learning community, and these methods are increasingly more applicable to dependent processes. The interaction and cross-fertilization between the machine learning and spatial statistics communities is growing, which will likely lead to a new generation of spatial statistical methods that are applicable to climate science.
APA, Harvard, Vancouver, ISO, and other styles
4

Spencer, Diana. Varro’s Roman Ways. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198768098.003.0003.

Full text
Abstract:
During a tumultuous phase in Roman Republican politics M. Terentius Varro’s De Lingua Latina developed a complex and nuanced hermeneutic toolkit for citizen self-fashioning. Cicero once complimented Varro as the man whose work had led Romans home to a knowledge of themselves as agents within a specific and acculturated space. This chapter explores how De Lingua Latina works to deliver to successful readers a particular form of self-knowledge as Latin speakers within Roman space. Moving through the city, and being evaluated as part of that process, is central to Roman political agency; yet Varro’s Rome produces a sense of solitary progress through ostensibly random slices of city life, thus, the dérive, and the idea of cartographies of influence, become important. Moreover, the rich semiotic quality of the topography accompanies a more pragmatic story, drawing in politics, consumption, and agribusiness to defamiliarize individual and collective diurnal city rhythms.
APA, Harvard, Vancouver, ISO, and other styles
5

Thurner, Stefan, Peter Klimek, and Rudolf Hanel. Introduction to the Theory of Complex Systems. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198821939.001.0001.

Full text
Abstract:
This book is a comprehensive introduction to quantitative approaches to complex adaptive systems. Practically all areas of life on this planet are constantly confronted with complex systems, be it ecosystems, societies, traffic, financial markets, opinion formation, epidemic spreading, or the internet and social media. Complex systems are systems composed of many elements that interact with each other, which makes them extremely rich dynamical systems showing a huge range of phenomena. Properties of complex systems that are of particular importance are their efficiency, robustness, resilience, and proneness to collapse. The quantitative tools and concepts needed to understand the co-evolutionary nature of networked systems and their properties are challenging. The intention of the book is to give a self-contained introduction to these concepts so that the reader will be equipped with a conceptual and mathematical toolset that allows her to engage in the science of complex systems. Topics covered include random processes and path-dependent processes, co-evolutionary dynamics, the statistics of driven nonequilibrium systems, dynamics of networks, the theory of scaling, and approaches from statistical mechanics and information theory. The book extends well beyond the early classical literature in the field of complex systems and summarizes the methodological progress over the past twenty years in a clear, structured, and comprehensive way. The book is intended for natural scientists and graduate students.
APA, Harvard, Vancouver, ISO, and other styles
6

Akemann, Gernot, Jinho Baik, and Philippe Di Francesco, eds. The Oxford Handbook of Random Matrix Theory. Oxford University Press, 2018. http://dx.doi.org/10.1093/oxfordhb/9780198744191.001.0001.

Full text
Abstract:
This handbook showcases the major aspects and modern applications of random matrix theory (RMT). It examines the mathematical properties and applications of random matrices and some of the reasons why RMT has been very successful and continues to enjoy great interest among physicists, mathematicians and other scientists. It also discusses methods of solving RMT, basic properties and fundamental objects in RMT, and different models and symmetry classes in RMT. Topics include the use of classical orthogonal polynomials (OP) and skew-OP to solve exactly RMT ensembles with unitary, and orthogonal or symplectic invariance respectively, all at finite matrix size; the supersymmetric and replica methods; determinantal point processes; Painlevé transcendents; the fundamental property of RMT known as universality; RNA folding; two-dimensional quantum gravity; string theory; and the mathematical concept of free random variables. In addition to applications to mathematics and physics, the book considers broader applications to other sciences, including economics, engineering, biology, and complex networks.
APA, Harvard, Vancouver, ISO, and other styles
7

Advanced mathematics for engineers with applications in stochastic processes. Hauppauge, New York, USA: Nova Science Publishers, Inc., 2013.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

Advanced mathematics for engineers with applications in stochastic processes. Nova Science Publishers, 2012.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Adams, Karen Ruth. The Causes of War. Oxford University Press, 2017. http://dx.doi.org/10.1093/acrefore/9780190846626.013.323.

Full text
Abstract:
The scientific study of war is a pressing concern for international politics. Given the destructive nature of war, ordinary citizens and policy makers alike are eager to anticipate if not outright avoid outbreaks of violence. Understanding the causes of war can be a complex process. Scholars of international relations must first define war, and then establish a universe of actors or conflicts in which both war and peace are possible. Next, they must collect data on the incidence of war in the entire universe of cases over a particular period of time, a random sample of relevant cases, a number of representative cases, or a set of cases relevant to independent variables in the theories they are testing. Finally, scholars must use this data to construct quantitative and qualitative tests of hypotheses about why actors fight instead of resolving their differences in other ways and, in particular, why actors initiate wars by launching the first attack. Instead of taking the inductive approach of inventorying the causes of particular wars and then attempting to find general rules, it is necessary for scholars to approach the problem deductively, developing theories about the environment in which states operate, deriving hypotheses about the incidence of war and attack, and using quantitative and qualitative methods to test these hypotheses.
APA, Harvard, Vancouver, ISO, and other styles
10

Bellhouse, David. Probability and Its Application in Britain during the 17th and 18th Centuries. Edited by Alan Hájek and Christopher Hitchcock. Oxford University Press, 2017. http://dx.doi.org/10.1093/oxfordhb/9780199607617.013.5.

Full text
Abstract:
In the 18th and 19th centuries, probability was a part of moral and natural sciences, rather than of mathematics. Still, since Laplace’s 1812 Théorie analytique des probabilités, specific analytic methods of probability aroused the interest of mathematicians, and probability began to develop a purely mathematical quality. In the 20th century the mathematical essence reached full autonomy and constituted “modern” probability. Significant in this development was the gradual introduction of a measure theoretic framework. In this way, the main subfields of modern probability, as axiomatics, weak and strong limit theorems, sequences of non-independent random variables, and stochastic processes, could be integrated into a well-connected complex until World War II.
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Complex random process"

1

Rodionov, Alexey S., Hyunseung Choo, Hee Yong Youn, Tai M. Chung, and Kiheon Park. "Efficient Random Process Generation for Reliable Simulation of Complex Systems." In Computational Science - ICCS 2001, 912–21. Berlin, Heidelberg: Springer Berlin Heidelberg, 2001. http://dx.doi.org/10.1007/3-540-45718-6_96.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Diestmann, Thomas, Nils Broedling, Benedict Götz, and Tobias Melz. "Surrogate Model-Based Uncertainty Quantification for a Helical Gear Pair." In Lecture Notes in Mechanical Engineering, 191–207. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-77256-7_16.

Full text
Abstract:
Competitive industrial transmission systems must perform most efficiently with reference to complex requirements and conflicting key performance indicators. This design challenge translates into a high-dimensional multi-objective optimization problem that requires complex algorithms and evaluation of computationally expensive simulations to predict physical system behavior and design robustness. Crucial for the design decision-making process is the characterization, ranking, and quantification of relevant sources of uncertainties. However, due to the strict time limits of product development loops, the overall computational burden of uncertainty quantification (UQ) may even drive state-of-the-art parallel computing resources to their limits. Efficient machine learning (ML) tools and techniques emphasizing high-fidelity simulation data-driven training will play a fundamental role in enabling UQ in the early-stage development phase. This investigation surveys UQ methods with a focus on noise, vibration, and harshness (NVH) characteristics of transmission systems. Quasi-static 3D contact dynamic simulations are performed to evaluate the static transmission error (TE) of meshing gear pairs under different loading and boundary conditions. TE indicates NVH excitation and is typically used as an objective function in the early-stage design process. The limited system size allows large-scale design of experiments (DoE) and enables numerical studies of various UQ sampling and modeling techniques where the design parameters are treated as random variables associated with tolerances from manufacturing and assembly processes. The model accuracy of generalized polynomial chaos expansion (gPC) and Gaussian process regression (GPR) is evaluated and compared. The results of the methods are discussed to conclude efficient and scalable solution procedures for robust design optimization.
APA, Harvard, Vancouver, ISO, and other styles
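
A Gaussian process regression surrogate of the kind this chapter evaluates can be prototyped with scikit-learn. The sketch below is a schematic stand-in, not the authors' pipeline: the one-parameter expensive_simulation function is a hypothetical placeholder for the quasi-static 3D contact simulation of the transmission error.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def expensive_simulation(x):
    """Hypothetical stand-in for the costly transmission-error simulation."""
    return np.sin(3 * x) + 0.1 * x**2

X_train = rng.uniform(-2, 2, size=(30, 1))       # small design of experiments
y_train = expensive_simulation(X_train).ravel()

gpr = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=1e-6)
gpr.fit(X_train, y_train)

# Uncertainty propagation: sample the toleranced design parameter and
# query the cheap surrogate instead of the expensive simulation.
X_mc = rng.normal(0.0, 0.5, size=(100_000, 1))
y_mc = gpr.predict(X_mc)
print("response mean:", y_mc.mean(), "std:", y_mc.std())
```
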
3

Colcombet, Thomas, Nathanaël Fijalkow, and Pierre Ohlmann. "Controlling a Random Population." In Lecture Notes in Computer Science, 119–35. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-45231-5_7.

Full text
Abstract:
Bertrand et al. introduced a model of parameterised systems, where each agent is represented by a finite state system, and studied the following control problem: for any number of agents, does there exist a controller able to bring all agents to a target state? They showed that the problem is decidable and EXPTIME-complete in the adversarial setting, and posed as an open problem the stochastic setting, where the agent is represented by a Markov decision process. In this paper, we show that the stochastic control problem is decidable. Our solution makes significant use of well quasi-orders, of the max-flow min-cut theorem, and of the theory of regular cost functions.
APA, Harvard, Vancouver, ISO, and other styles
4

De, Indranil, Andrei Shibkov, and Sharad Saxena. "Robust Method for Fast and Accurate Simulation of Random Dopant Fluctuation-Induced Vth Variation in MOSFETs with Arbitrary Complex Doping Distribution." In Simulation of Semiconductor Processes and Devices 2001, 368–71. Vienna: Springer Vienna, 2001. http://dx.doi.org/10.1007/978-3-7091-6244-6_84.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Silva, Eduardo L., Ana Filipa Sampaio, Luís F. Teixeira, and Maria João M. Vasconcelos. "Cervical Cancer Detection and Classification in Cytology Images Using a Hybrid Approach." In Advances in Visual Computing, 299–312. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-90436-4_24.

Full text
Abstract:
The high incidence of cervical cancer in women has prompted research into automatic screening methods. This work focuses on two of the steps present in such systems: the identification of cervical lesions and their classification. The development of automatic methods for these tasks faces some shortcomings, such as acquiring sufficient and representative clinical data. These limitations are addressed through a hybrid pipeline based on a deep learning model (RetinaNet) for the detection of abnormal regions, combined with random forest and SVM classifiers for their categorization, and complemented by the use of domain knowledge in its design. Additionally, the nuclei in each detected region are segmented, providing a set of nuclei-specific features whose impact on the classification result is also studied. Each module is individually assessed in addition to the complete system, with the latter achieving a precision, recall, and F1 score of 0.04, 0.20, and 0.07, respectively. Despite the low precision, the system demonstrates potential as an analysis support tool capable of increasing the overall sensitivity of the human examination process.
APA, Harvard, Vancouver, ISO, and other styles
6

Walrand, Jean. "Perspective and Complements." In Probability in Electrical Engineering and Computer Science, 271–307. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-49995-2_15.

Full text
Abstract:
We have explored a number of topics motivated by concrete applications. It is time to stitch together these ideas into a complete panorama. In addition, we provide some complements. Section 15.1 discusses the general question of inference: what can one deduce from observations? Section 15.2 explains the important notion of a sufficient statistic: what is the relevant data in a set of observations? Section 15.3 presents the theory of Markov chains where the number of states is infinite. Section 15.4 explains the Poisson process. Section 15.5 discusses the boosting algorithm for choosing among experts. What drug should you research further; what noisy channel should one use? These are examples of multi-armed bandit problems. In such problems one faces the trade-off between exploiting known possibilities and exploring potentially more rewarding but less well understood alternatives. Section 15.6 explains a key result for such multi-armed bandit problems. Information Theory studies the limits of communication systems: how fast can one transmit bits reliably over a noisy channel? How many bits should be transmitted to convey some information? Section 15.7 introduces some key concepts and results of Information Theory. When estimating the likelihood of errors or the reliability of estimates, one usually has to calculate bounds on the probability that a random variable exceeds a given value. Section 15.8 discusses some useful probability bounds. Section 15.9 explains the main ideas of the theory of martingales and shows how it provides a proof of the law of large numbers.
APA, Harvard, Vancouver, ISO, and other styles
7

Grenander, Ulf, and Michael I. Miller. "Jump Diffusion Inference in Complex Scenes." In Pattern Theory. Oxford University Press, 2006. http://dx.doi.org/10.1093/oso/9780198505709.003.0020.

Full text
Abstract:
This chapter explores random sampling algorithms for generating conditional expectations in hypothesis spaces in which there is a mixture of discrete, disconnected subsets. Random samples are generated via the direct simulation of a Markov process whose state moves through the hypothesis space, with the ergodic property that the transition distribution of the Markov process converges to the posterior distribution. This allows for the empirical generation of conditional expectations under the posterior. To accommodate the connected and disconnected nature of the state spaces, the Markov process is forced to satisfy jump–diffusion dynamics. Through the connected parts of the parameter space (Lie manifolds) the algorithm searches continuously, with sample paths corresponding to solutions of standard diffusion equations. Across the disconnected parts of parameter space the jump process determines the dynamics. The infinitesimal properties of these jump–diffusion processes are selected so that various sample statistics converge to their expectations under the posterior.
APA, Harvard, Vancouver, ISO, and other styles
8

Grenander, Ulf, and Michael I. Miller. "Markov Processes and Random Sampling." In Pattern Theory. Oxford University Press, 2006. http://dx.doi.org/10.1093/oso/9780198505709.003.0019.

Full text
Abstract:
The parameter spaces of natural patterns are so complex that inference must often proceed compositionally, successively building up more and more complex structures, as well as back-tracking, creating simpler structures from more complex versions. Inference is transformational in nature. The philosophical approach studied in this chapter is that the posterior distribution that describes the patterns contains all of the information about the underlying regular structure. Therefore, the transformations of inference are guided via the posterior in the sense that the algorithm for changing the regular structures will correspond to the sample path of a Markov process. The Markov process is constructed to push towards the posterior distribution in which the information about the patterns are stored. This provides the deep connection between the transformational paradigm of regular structure creation, and random sampling algorithms.
APA, Harvard, Vancouver, ISO, and other styles
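
In its simplest form, the posterior-guided Markov sampling this chapter describes is a Metropolis random walk. The Python sketch below is a generic illustration on a toy bimodal target, not the authors' pattern-theoretic implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_posterior(x):
    """Unnormalized log-density of a toy bimodal 'posterior'."""
    return np.logaddexp(-0.5 * (x - 2.0) ** 2, -0.5 * (x + 2.0) ** 2)

x, samples = 0.0, []
for _ in range(50_000):
    proposal = x + rng.normal(scale=1.0)  # local random move
    # Accept with the Metropolis ratio; the chain's transition
    # distribution then converges to the posterior.
    if np.log(rng.random()) < log_posterior(proposal) - log_posterior(x):
        x = proposal
    samples.append(x)

print(np.mean(np.array(samples) > 0))  # near 0.5 for the symmetric target
```
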
9

Yilmaz, Levent. "Applying Monte Carlo Simulation in New Tech." In Public Sector Crisis Management. IntechOpen, 2020. http://dx.doi.org/10.5772/intechopen.91264.

Full text
Abstract:
Monte Carlo simulation takes its name from Monte Carlo in Monaco; the simulation process involves generating chance variables and exhibiting the random behaviour found in nature. It is a powerful statistical analysis tool, widely used in both engineering and non-engineering fields to gain new perspectives. The method has been applied to diverse problems, ranging from the simulation of complex physical phenomena such as atomic collisions to the simulation of river boundary layers such as meanders and to Dow Jones forecasting. It can deal with many random variables, various distribution types, and highly nonlinear engineering models, which makes Monte Carlo simulation well suited to solving complex engineering problems involving randomly varying quantities. Here, Monte Carlo simulation is applied to the determination of hydrogen energy potential.
APA, Harvard, Vancouver, ISO, and other styles
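
As a concrete illustration of the chapter's subject, here is the textbook Monte Carlo estimate of pi in NumPy; it is a generic example, not drawn from the chapter's hydrogen-energy application.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000

# Sample points uniformly in the unit square and count how many fall
# inside the inscribed quarter circle of radius 1.
x, y = rng.random(n), rng.random(n)
inside = (x**2 + y**2) <= 1.0
pi_hat = 4.0 * inside.mean()

# The standard error shrinks like 1/sqrt(n) regardless of dimension,
# which is what makes Monte Carlo attractive for complex models.
se = 4.0 * inside.std() / np.sqrt(n)
print(f"pi = {pi_hat:.4f} +/- {se:.4f}")
```
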
10

NURITDINOV, S. N., A. A. MUMINOV, and F. U. BOTIROV. "On the method for the analysis of compulsive phase mixing and its application in cosmogony." In Astronomical and Astrophysical Transactions, Vol. 32, No. 2, 83–88. Cambridge Scientific Publishers, 2021. http://dx.doi.org/10.17184/eac.5234.

Full text
Abstract:
In this paper, we study the strongly non-stationary stochastic processes that take place in the phase space of self-gravitating systems at the early non-stationary stage of their evolution. The numerical calculations of the compulsive phase-mixing process were carried out according to the model of chaotic impacts, where the initially selected phase volume experiences random pushes of a diverse and complex nature. The method for studying random impacts on a volume element is applied in the case of three-dimensional space.
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Complex random process"

1

Colonnese, Stefania, Stefano Rinauro, and Gaetano Scarano. "Markov Random Fields using complex line process: An application to Bayesian image restoration." In 2011 3rd European Workshop on Visual Information Processing (EUVIP). IEEE, 2011. http://dx.doi.org/10.1109/euvip.2011.6045517.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Yeo, Jeongho, and Joon Ho Cho. "Asymptotically optimal low-complexity estimation of sampled improper-complex second-order cyclostationary random process." In 2013 IEEE Wireless Communications and Networking Conference (WCNC). IEEE, 2013. http://dx.doi.org/10.1109/wcnc.2013.6555012.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Rollot, Olivier, Maurice Pendola, Maurice Lemaire, and Igor Boutemy. "Reliability Indexes Calculation of Industrial Boilers Under Stochastic Thermal Process." In ASME 1998 Design Engineering Technical Conferences. American Society of Mechanical Engineers, 1998. http://dx.doi.org/10.1115/detc98/dac-5567.

Full text
Abstract:
This text summarizes research for the French electric utility EDF, which wants to know the influence of temperature variability on the reliability of some of its boilers. These boilers are very complex structures whose behavior has to be modeled by the finite element method (FEM). This work is an application of finite element methods in a reliability context, meaning the introduction of random data into a classical FEM in order to determine the reliability of the structures. These random data may concern the geometry, the material characteristics of the structures, or the loads the structures may carry. It is then necessary to employ new methods that take these stochastic approaches into account and obtain more efficient decision elements for better control of the boilers.
APA, Harvard, Vancouver, ISO, and other styles
4

Shulman, Ami, and Jorge Soto-Andrade. "A random walk in stochastic dance." In LINK 2021. Tuwhera Open Access, 2021. http://dx.doi.org/10.24135/link2021.v2i1.71.

Full text
Abstract:
Stochastic music, developed last century by Xenakis, has older avatars, like Mozart, who showed how to compose minuets by tossing dice, much as contemporary choreographer Cunningham took apart the structural elements of what was considered to be a cohesive choreographic work (including movement, sound, light, set and costume) and reconstructed them in random ways. We intend to explore an enactive and experiential analogue of stochastic music in the realm of dance, where the poetry of a choreographic spatial/floor pattern is elicited by a mathematical stochastic process, to wit a random walk – a stochastic dance of sorts. Among many possible random walks, we consider two simple examples, embodied in the following scenarios, proposed to the students/dancers:

- a frog, jumping randomly on a row of stones, choosing right and left as if tossing a coin;

- a person walking randomly on a square grid, starting at a given node and choosing each time randomly, equally likely, N, S, E or W, then walking non-stop along the corresponding edge up to the next node, and so on.

When the dancers encounter these situations, quite natural questions arise for the choreographer, like: where will the walker/dancer be after a while? Several ideas for a choreography emerge, which are more complex than just having one or more dancers perform the random walk, and which surprisingly turn our random process into a deterministic one!

For instance, for the first random walk, 16 dancers start at the same node of a discrete line on the stage and execute, each one, a different path among the 16 possible 4-jump paths the frog can follow. They would need to agree first on how to carry this out. Interestingly, they may proceed without a Magister Ludi handing out scripts to every dancer. After arriving at their end node/position, they could try to retrace their steps, to come back all to the starting node.

Analogously for the grid random walk, where we may now have 16 dancers enacting the 16 possible 2-edge paths of the walker. The dancers could also enter the stage (the grid or some other geometric pattern to walk around) one by one, sequentially, describing different random paths, or deterministic intertwined paths, in the spirit of Beckett's Quadrat. Also, the dancers could choose their direction ad libitum, after some spinning, each time, on a grid-free stage, but keeping the same step length, as in statistician Pearson's model for a random mosquito flight.

We are interested in various possible spin-offs of these choreographies, which intertwine dance and mathematical cognition. For instance, when the dancers each choose a different path, they will notice that their final distribution on the nodes is uneven (interesting shapes emerge). In this way, just by moving, choreographer and dancers can find a quantitative answer to the seemingly impossible question: where will the walker/dancer be after a while? Indeed, the percentage of dancers ending up at each node gives the probability of the random walker landing there.
APA, Harvard, Vancouver, ISO, and other styles
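
The counting argument in this abstract (16 dancers enacting the 16 possible 4-jump paths) can be verified in a few lines of Python; the sketch below is illustrative only.

```python
from itertools import product
from collections import Counter

# All 2**4 = 16 possible 4-jump paths of the 'frog' scenario: each jump
# moves one stone left (-1) or right (+1), starting from stone 0.
paths = list(product([-1, +1], repeat=4))
end_positions = Counter(sum(p) for p in paths)

# If 16 dancers each perform a different path, the fraction of dancers
# ending on each stone equals the probability that the walk lands there.
for stone in sorted(end_positions):
    print(f"stone {stone:+d}: {end_positions[stone]}/16 dancers")
# Output follows the binomial pattern 1, 4, 6, 4, 1 over stones -4..+4.
```
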
5

Almeida, Renuá M., Rodrigo M. Rodrigues, Denys M. F. Ribeiro, and Otávio N. Teixeira. "Fitness Value Curves Prediction in the Evolutionary Process of Genetic Algorithms Applied to Benchmark Function." In Escola Regional de Alto Desempenho Norte 2. Sociedade Brasileira de Computação, 2021. http://dx.doi.org/10.5753/erad-no2.2021.18673.

Full text
Abstract:
This work adopts the fitness-curve prediction for genetic algorithms (GAs) proposed in [Almeida et al. 2021] in the context of a more complex function, the Schwefel benchmark function. The prediction is performed with knowledge only of the GA initialization parameters, using a Random Forest model. This approach addresses the main gap in the original work and achieves good results, which makes it more promising.
APA, Harvard, Vancouver, ISO, and other styles
6

Court, Paul, and Omar Al-Azzam. "Test Automation for Quality Assurance: A Random Approach." In 10th International Conference on Foundations of Computer Science & Technology (FCST 2022). Academy and Industry Research Collaboration Center (AIRCC), 2022. http://dx.doi.org/10.5121/csit.2022.120804.

Full text
Abstract:
Testing is a necessary, but sometimes tedious, chore for finding faults in software. Finding faults is essential for ensuring the quality and reliability of software for industry; these are valuable traits consumers consider when investing capital, and they are therefore essential to the reputation and financial well-being of a business. This research examines an ongoing trade-off between time and computational resources when testing via random selection of test cases versus complex logical means of finding faults. Particular attention is devoted to an analysis of random test-case selection and whether the number of extra test cases run due to random selection is a viable alternative to the potential time spent fully evaluating the logic for coverage of a generic predicate. The reader will gain knowledge about the expected increase in test cases if randomized selection is employed at some point in the testing process.
APA, Harvard, Vancouver, ISO, and other styles
7

Li, Meng, Mohammad Kazem Sadoughi, Zhen Hu, and Chao Hu. "System Reliability Analysis Using Hybrid Gaussian Process Model." In ASME 2019 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2019. http://dx.doi.org/10.1115/detc2019-98173.

Full text
Abstract:
This paper proposes a system reliability analysis method based on the hybrid of multivariate Gaussian process (MGP) and univariate Gaussian process (UGP) models, named as hybrid Gaussian process-based system reliability analysis (HGP-SRA). MGP and UGP models are selectively constructed for the components of a complex engineered system: MGP models are constructed over the groups of highly interdependent components and the individual UGP models are built over the components which are relatively independent of one another. A nonlinear-dependence measure, namely the randomized dependence coefficient, is adopted to adaptively learn and quantify the pairwise dependencies of the components with both linear and nonlinear dependency patterns. In the proposed HGP-SRA method, initial hybrid Gaussian process (HGP) models are first constructed with a set of near-random samples and these surrogate models are then updated with new samples that are sequentially identified based on the acquisition function named as multivariate probability of improvement (MPI). The results of two mathematical and a real-world engineering case studies suggest that the proposed method can achieve better accuracy and efficiency in system reliability estimation than the benchmark surrogate-based methods.
APA, Harvard, Vancouver, ISO, and other styles
8

Cang, Ruijin, and Max Yi Ren. "Deep Network-Based Feature Extraction and Reconstruction of Complex Material Microstructures." In ASME 2016 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2016. http://dx.doi.org/10.1115/detc2016-59404.

Full text
Abstract:
Computational material design (CMD) aims to accelerate optimal design of complex material systems by integrating material science and design automation. For tractable CMD, it is required that (1) a feature space be identified to allow reconstruction of new designs, and (2) the reconstruction process be property-preserving. Existing solutions rely on the designer’s understanding of specific material systems to identify geometric and statistical features, which could be insufficient for reconstructing physically meaningful microstructures of complex material systems. This paper develops a feature learning mechanism that automates a two-way conversion between microstructures and their lower-dimensional feature representations. The proposed model is applied to four material systems: Ti-6Al-4V alloy, Pb-Sn alloy, Fontainebleau sandstone, and spherical colloids, to produce random reconstructions that are visually similar to the samples. This capability is not achieved by existing synthesis methods relying on the Markovian assumption of material systems. For Ti-6Al-4V alloy, we also show that the reconstructions preserve the mean critical fracture force of the system for a fixed processing setting. Source code and datasets are available.
APA, Harvard, Vancouver, ISO, and other styles
9

Van Eikema Hommes, Qi D. "Applying System Theoretical Hazard Analysis Method to Complex Automotive Cyber Physical Systems." In ASME 2012 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2012. http://dx.doi.org/10.1115/detc2012-70527.

Full text
Abstract:
Modern automobiles have become software-intensive, with electronic features replacing many traditionally mechanical systems and automating many of the driver's tasks. This transition brought new challenges to engineering design. The control-system software exhibits unprecedented complexity, whose states cannot be exhaustively tested. Software does not fail like hardware due to random noise factors. Electronics and software update and change rapidly. Engineers have limited engineering experience and historical data to draw upon. Automating traditionally manual driver tasks may also lead to accidents. Safety regulation for automotive electronics is in its infancy, and standards do not yet provide adequate safety assurance. Motivated by these challenges, this paper compares a number of hazard analysis methods for their ability to address the challenges posed by modern automotive electronics systems. The System-Theoretic Process Analysis (STPA) framework developed for system safety engineering presents a paradigm shift and is the most effective at identifying causes of hazards. As its first application to modern automotive electronic systems, STPA was applied to the Adaptive Cruise Control (ACC) feature. The outcome was compared with the ACC design standards and the actual vehicle implementation to illustrate the effectiveness of the method.
APA, Harvard, Vancouver, ISO, and other styles
10

Zhou, Jianhua, Yuwen Zhang, and J. K. Chen. "Numerical Simulation of Random Packing of Spherical Particles for Selective Laser Sintering Applications." In ASME 2008 International Mechanical Engineering Congress and Exposition. ASMEDC, 2008. http://dx.doi.org/10.1115/imece2008-67856.

Full text
Abstract:
Selective Laser Sintering (SLS) is an efficient and rapid manufacturing technique because it allows for making complex parts that are often unobtainable by traditional manufacturing processes. However, the application of this technique is limited by the balling phenomenon, which largely reduces manufacturing quality. Eliminating the structural defects is crucial to overcoming the balling phenomenon, so a better understanding of the packing structure details is urgently needed for SLS applications. In this study, the sequential-addition packing algorithm is employed to investigate the random packing of spherical particles with and without a shaking effect. The 3-D random packing structures are demonstrated by illustrative pictures and quantified in terms of the pair distribution function, coordination number, and packing density. The results are presented and discussed with the aim of producing optimal packing parameters for the SLS manufacturing process.
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Complex random process"

1

Wilson, D., Matthew Kamrath, Caitlin Haedrich, Daniel Breton, and Carl Hart. Urban noise distributions and the influence of geometric spreading on skewness. Engineer Research and Development Center (U.S.), November 2021. http://dx.doi.org/10.21079/11681/42483.

Full text
Abstract:
Statistical distributions of urban noise levels are influenced by many complex phenomena, including spatial and temporal variations in the source level, multisource mixtures, propagation losses, and random fading from multipath reflections. This article provides a broad perspective on the varying impacts of these phenomena. Distributions incorporating random fading and averaging (e.g., gamma and noncentral Erlang) tend to be negatively skewed on logarithmic (decibel) axes but can be positively skewed if the fading process is strongly modulated by source power variations (e.g., compound gamma). In contrast, distributions incorporating randomly positioned sources and explicit geometric spreading [e.g., exponentially modified Gaussian (EMG)] tend to be positively skewed with exponential tails on logarithmic axes. To evaluate the suitability of the various distributions, one-third octave band sound-level data were measured at 37 locations in the North End of Boston, MA. Based on the Kullback-Leibler divergence as calculated across all of the locations and frequencies, the EMG provides the most consistently good agreement with the data, which were generally positively skewed. The compound gamma also fits the data well and even outperforms the EMG for the small minority of cases exhibiting negative skew. The lognormal provides a suitable fit in cases in which particular non-traffic noise sources dominate.
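For readers who want to experiment with the distribution comparison, SciPy's exponnorm implements the exponentially modified Gaussian. The sketch below fits an EMG and a normal distribution to a synthetic, positively skewed sample of decibel levels (simulated stand-ins, not the Boston measurements) and compares log-likelihoods.

```python
# Fit an exponentially modified Gaussian (EMG) to sound levels in dB and
# compare its log-likelihood with a normal fit. Data are synthetic stand-ins
# for measured one-third octave band levels.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Positively skewed synthetic levels: Gaussian core plus exponential tail
levels_db = 55 + 4 * rng.standard_normal(2000) + rng.exponential(6, 2000)

K, loc, scale = stats.exponnorm.fit(levels_db)
mu, sigma = stats.norm.fit(levels_db)

ll_emg = stats.exponnorm.logpdf(levels_db, K, loc, scale).sum()
ll_norm = stats.norm.logpdf(levels_db, mu, sigma).sum()
print(f"skewness = {stats.skew(levels_db):.2f}")
print(f"log-likelihood: EMG {ll_emg:.1f} vs normal {ll_norm:.1f}")
```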
APA, Harvard, Vancouver, ISO, and other styles
2

Liu, Hongrui, and Rahul Ramachandra Shetty. Analytical Models for Traffic Congestion and Accident Analysis. Mineta Transportation Institute, November 2021. http://dx.doi.org/10.31979/mti.2021.2102.

Full text
Abstract:
In the US, over 38,000 people die in road crashes each year, and 2.35 million are injured or disabled, according to the 2020 statistics report from the Association for Safe International Road Travel (ASIRT). In addition, traffic congestion keeps Americans stuck on the road, wasting millions of hours and billions of dollars each year. Using statistical techniques and machine learning algorithms, this research developed accurate predictive models for traffic congestion and road accidents to increase understanding of the complex causes of these challenging issues. The research used US Accidents data consisting of 49 variables describing 4.2 million accident records from February 2016 to December 2020, and applied logistic regression, tree-based techniques such as the Decision Tree Classifier and Random Forest Classifier (RF), and Extreme Gradient Boosting (XGBoost) to process the data and train the models. These models will assist people in making smart real-time transportation decisions to improve mobility and reduce accidents.
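A minimal sketch of this kind of modeling pipeline in scikit-learn, using a synthetic stand-in for the 49-variable US Accidents data (the features and binary target below are placeholders, not the actual dataset):

```python
# Sketch of an accident-prediction pipeline: logistic regression and random
# forest baselines trained on a synthetic stand-in for the accident records.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Placeholder for the 49-variable accident records
X, y = make_classification(n_samples=5000, n_features=49, n_informative=10,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

for name, model in [
    ("logistic regression", LogisticRegression(max_iter=1000)),
    ("random forest", RandomForestClassifier(n_estimators=200, random_state=0)),
]:
    model.fit(X_tr, y_tr)
    print(name, accuracy_score(y_te, model.predict(X_te)))
```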
APA, Harvard, Vancouver, ISO, and other styles
3

Pettit, Chris, and D. Wilson. A physics-informed neural network for sound propagation in the atmospheric boundary layer. Engineer Research and Development Center (U.S.), June 2021. http://dx.doi.org/10.21079/11681/41034.

Full text
Abstract:
We describe what we believe is the first effort to develop a physics-informed neural network (PINN) to predict sound propagation through the atmospheric boundary layer. The PINN is a recent innovation in the application of deep learning to simulate physics. The motivation is to combine the strengths of data-driven models and physics models, thereby producing a regularized surrogate model that requires less data than a purely data-driven model. In a PINN, the data-driven loss function is augmented with penalty terms for deviations from the underlying physics, e.g., a governing equation or a boundary condition. Training data are obtained from Crank-Nicolson solutions of the parabolic equation with homogeneous ground impedance and Monin-Obukhov similarity theory for the effective sound speed in the moving atmosphere. Training data are random samples from an ensemble of solutions for combinations of parameters governing the impedance and the effective sound speed. PINN output is processed to produce realizations of transmission loss that look much like the Crank-Nicolson solutions. We describe the framework for implementing a PINN for outdoor sound, and we outline practical matters related to network architecture, the size of the training set, the physics-informed loss function, and the challenge of managing the spatial complexity of the complex pressure field.
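The loss construction described here can be sketched generically. The toy example below, an illustration under strong simplifying assumptions rather than the authors' model, fits u(x) to sampled data while penalizing the residual of a 1-D Helmholtz equation u'' + k^2 u = 0, using automatic differentiation for the physics term.

```python
# Toy PINN: fit u(x) to sampled data while penalizing the residual of the
# 1-D Helmholtz equation u'' + k^2 u = 0. The paper solves a parabolic
# equation in a refractive atmosphere; only the loss structure is shown here.
import torch
import torch.nn as nn

k = 2.0
net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(),
                    nn.Linear(32, 32), nn.Tanh(),
                    nn.Linear(32, 1))

x_data = torch.linspace(0, 3, 30).reshape(-1, 1)
u_data = torch.sin(k * x_data)               # "measurements": an exact solution
x_phys = torch.linspace(0, 3, 100).reshape(-1, 1).requires_grad_(True)

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(2000):
    u = net(x_phys)
    du = torch.autograd.grad(u, x_phys, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x_phys, torch.ones_like(du), create_graph=True)[0]
    residual = d2u + k**2 * u                # physics penalty term
    loss = nn.functional.mse_loss(net(x_data), u_data) + (residual**2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
print(f"final loss: {loss.item():.5f}")
```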
APA, Harvard, Vancouver, ISO, and other styles
4

Riveros, Guillermo, Felipe Acosta, Reena Patel, and Wayne Hodo. Computational mechanics of the paddlefish rostrum. Engineer Research and Development Center (U.S.), September 2021. http://dx.doi.org/10.21079/11681/41860.

Full text
Abstract:
Purpose – The rostrum of a paddlefish provides hydrodynamic stability during the feeding process, in addition to detecting food using receptors that are randomly distributed over the rostrum. The exterior tissue of the rostrum covers the cartilage that surrounds the interlocking, star-shaped bones. Design/methodology/approach – The aim of this work is to assess the mechanical behavior of four finite element models that vary the element formulation as follows: linear-reduced integration, linear-full integration, quadratic-reduced integration, and quadratic-full integration. Also presented are the load-transfer mechanisms of the bone structure of the rostrum. Findings – Conclusions are based on comparison among the four models. There is no significant difference between integration orders for the same element type. The quadratic-reduced integration formulation resulted in lower structural stiffness than the linear formulations, as seen in higher displacements and stresses. It is concluded that second-order elements with reduced integration can accurately model stress concentrations and distributions without overstiffening the general response. Originality/value – The use of advanced computational mechanics techniques to analyze the complex geometry and components of the paddlefish rostrum provides a viable avenue for gaining a fundamental understanding of the finite element formulation needed to successfully capture the system behavior and hot-spot locations.
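The effect of integration order can be illustrated on a single 1-D, three-node quadratic bar element (a generic finite element example, not the rostrum model; material and geometric values are arbitrary): a two-point Gauss rule integrates the stiffness exactly, while a one-point (reduced) rule under-integrates it and introduces a spurious zero-energy mode, the softening effect in its most extreme form.

```python
# Full vs reduced Gauss integration on a 1-D, 3-node quadratic bar element.
# Reduced integration under-integrates the stiffness, softening the element.
import numpy as np

E, A, h = 1.0, 1.0, 1.0          # modulus, area, element length (illustrative)
J = h / 2.0                      # Jacobian of the isoparametric map

def dN_dxi(xi):
    # Derivatives of the quadratic shape functions on the parent element [-1, 1]
    return np.array([xi - 0.5, -2.0 * xi, xi + 0.5])

def stiffness(points, weights):
    K = np.zeros((3, 3))
    for xi, w in zip(points, weights):
        B = dN_dxi(xi) / J       # strain-displacement row vector
        K += w * E * A * np.outer(B, B) * J
    return K

K_full = stiffness([-1/np.sqrt(3), 1/np.sqrt(3)], [1.0, 1.0])  # 2-point rule
K_red = stiffness([0.0], [2.0])                                 # 1-point rule

# The reduced matrix loses rank: an extra zero-energy (spurious) mode appears.
print("rank full:", np.linalg.matrix_rank(K_full))    # 2 (one rigid-body mode)
print("rank reduced:", np.linalg.matrix_rank(K_red))  # 1 (spurious mode added)
```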
APA, Harvard, Vancouver, ISO, and other styles
5

McPhedran, R., K. Patel, B. Toombs, P. Menon, M. Patel, J. Disson, K. Porter, A. John, and A. Rayner. Food allergen communication in businesses feasibility trial. Food Standards Agency, March 2021. http://dx.doi.org/10.46756/sci.fsa.tpf160.

Full text
Abstract:
Background: Clear allergen communication in food business operators (FBOs) has been shown to have a positive impact on customers’ perceptions of businesses (Barnett et al., 2013). However, the precise size and nature of this effect is not known: there is a paucity of quantitative evidence in this area, particularly in the form of randomised controlled trials (RCTs). The Food Standards Agency (FSA), in collaboration with Kantar’s Behavioural Practice, conducted a feasibility trial to investigate whether a randomised cluster trial – involving the proactive communication of allergen information at the point of sale in FBOs – is feasible in the United Kingdom (UK). Objectives: The trial sought to establish: ease of recruitment of businesses into trials; customer response rates for in-store outcome surveys; fidelity of intervention delivery by FBO staff; sensitivity of outcome survey measures to change; and appropriateness of the chosen analytical approach. Method: Following a recruitment phase – in which one of fourteen multinational FBOs was successfully recruited – the execution of the feasibility trial involved a quasi-randomised matched-pairs clustered experiment. Each of the FBO’s ten participating branches underwent pair-wise matching, with similarity of branches judged according to four criteria: Food Hygiene Rating Scheme (FHRS) score, average weekly footfall, number of staff and customer satisfaction rating. The allocation ratio for this trial was 1:1: one branch in each pair was assigned to the treatment group by a representative from the FBO, while the other continued to operate in accordance with their standard operating procedure. As a business-based feasibility trial, customers at participating branches throughout the fieldwork period were automatically enrolled in the trial. The trial was single-blind: customers at treatment branches were not aware that they were receiving an intervention. All customers who visited participating branches throughout the fieldwork period were asked to complete a short in-store survey on a tablet affixed in branches. This survey contained four outcome measures which operationalised customers’: perceptions of food safety in the FBO; trust in the FBO; self-reported confidence to ask for allergen information in future visits; and overall satisfaction with their visit. Results: Fieldwork was conducted from 3 – 20 March 2020, with cessation occurring prematurely due to the closure of outlets following the proliferation of COVID-19. n=177 participants took part in the trial across the ten branches; however, response rates (which ranged between 0.1% and 0.8%) were likely also adversely affected by COVID-19. Intervention fidelity was an issue in this study: while compliance with delivery of the intervention was relatively high in treatment branches (78.9%), erroneous delivery in control branches was also common (46.2%). Survey data were analysed using random-intercept multilevel linear regression models (due to the nesting of customers within branches). Despite the trial’s modest sample size, there was some evidence to suggest that the intervention had a positive effect for those suffering from allergies/intolerances on the ‘trust’ (β = 1.288, p<0.01) and ‘satisfaction’ (β = 0.945, p<0.01) outcome variables. Due to singularity within the fitted linear models, hierarchical Bayes models were used to corroborate the size of these interactions.
Conclusions: The results of this trial suggest that a fully powered clustered RCT would likely be feasible in the UK. In this case, the primary challenge in the execution of the trial was the recruitment of FBOs: despite high levels of initial interest from four chains, only one took part. However, it is likely that the proliferation of COVID-19 adversely impacted chain participation – two other FBOs withdrew during branch eligibility assessment and selection, citing COVID-19 as a barrier. COVID-19 also likely lowered the on-site survey response rate: a significant negative Pearson correlation was observed between daily survey completions and COVID-19 cases in the UK, highlighting a likely relationship between the two. Limitations: The trial was quasi-random: selection of branches, pair matching and allocation to treatment/control groups were not systematically conducted. These processes were undertaken by a representative from the FBO’s Safety and Quality Assurance team (with oversight from Kantar representatives on pair matching), as a result of the chain’s internal operational restrictions.
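For the analytical approach described, random-intercept multilevel models with customers nested within branches, here is a sketch using statsmodels on simulated data; the variable names, allocation, and effect sizes are placeholders, not the trial's data.

```python
# Sketch of a random-intercept multilevel model (customers nested within
# branches) with statsmodels, fitted to simulated placeholder data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n_branches, n_per = 10, 30
branch = np.repeat(np.arange(n_branches), n_per)
treatment = np.repeat(np.arange(n_branches) % 2, n_per)    # 1:1 allocation
branch_effect = rng.normal(0, 0.5, n_branches)[branch]     # random intercepts
trust = (5 + 1.0 * treatment + branch_effect
         + rng.normal(0, 1, n_branches * n_per))

df = pd.DataFrame({"trust": trust, "treatment": treatment, "branch": branch})
model = smf.mixedlm("trust ~ treatment", df, groups=df["branch"])
result = model.fit()
print(result.summary())
```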
APA, Harvard, Vancouver, ISO, and other styles
6

Galili, Naftali, Roger P. Rohrbach, Itzhak Shmulevich, Yoram Fuchs, and Giora Zauberman. Non-Destructive Quality Sensing of High-Value Agricultural Commodities Through Response Analysis. United States Department of Agriculture, October 1994. http://dx.doi.org/10.32747/1994.7570549.bard.

Full text
Abstract:
The objectives of this project were to develop nondestructive methods for detection of internal properties and firmness of fruits and vegetables. One method was based on a soft piezoelectric film transducer, developed at the Technion, for analysis of fruit response to low-energy excitation. The second method was a dot-matrix piezoelectric transducer of North Carolina State University, developed for contact-pressure analysis of fruit during impact. Two research teams, one in Israel and the other in North Carolina, coordinated their research effort according to the specific objectives of the project, to develop and apply the two complementary methods for quality control of agricultural commodities. In Israel: An improved firmness testing system was developed and tested with tropical fruits. The new system included an instrumented fruit-bed of three flexible piezoelectric sensors and miniature electromagnetic hammers, which served as fruit support and low-energy excitation device, respectively. Resonant frequencies were detected for determination of a firmness index. Two new acoustic parameters were developed for evaluation of fruit firmness and maturity: a damping ratio and a centroid of the frequency response. Experiments were performed with avocado and mango fruits. The internal damping ratio, which may indicate fruit ripeness, increased monotonically with time, while resonant frequencies and firmness indices decreased with time. Fruit samples were tested daily by a destructive penetration test. A fairly high correlation was found in tropical fruits between the penetration force and the new acoustic parameters; a lower correlation was found between the penetration force and the conventional firmness index. Improved table-top firmness testing units (Firmalon), with a data-logging system and on-line data analysis capacity, have been built. The new device was used for the full-scale experiments in the next two years, ahead of the original program and BARD timetable. Close cooperation was initiated with local industry for development of both off-line and on-line sorting and quality control of more agricultural commodities. Firmalon units were produced and operated in major packaging houses in Israel, Belgium and Washington State, on mango, avocado, apples, pears, tomatoes, melons and some other fruits, to gain field experience with the new method. The accumulated experimental data from all these activities are still being analyzed, to improve firmness sorting criteria and shelf-life prediction curves for the different fruits. The test program in commercial CA storage facilities in Washington State included seven apple varieties (Fuji, Braeburn, Gala, Granny Smith, Jonagold, Red Delicious, and Golden Delicious) and the D'Anjou pear variety. FI master curves could be developed for the Braeburn, Gala, Granny Smith and Jonagold apples. These fruits showed a steady ripening process during the test period. Yet, more work should be conducted to reduce scattering of the data and to determine the confidence limits of the method. The nearly constant FI in Red Delicious and the fluctuations of FI in the Fuji apples should be re-examined. Three sets of experiments were performed with Flandria tomatoes. Despite the complex structure of the tomatoes, the acoustic method could be used for firmness evaluation and to follow the ripening evolution with time. Close agreement was achieved between the auction expert evaluation and that of the nondestructive acoustic test, where a firmness index of 4.0 or more indicated grade-A tomatoes.
More work is being performed to refine the sorting algorithm and to develop a general ripening scale for automatic grading of tomatoes for the fresh fruit market. Galia melons were tested in Israel, in simulated export conditions. It was concluded that the Firmalon is capable of detecting the ripening of melons nondestructively and of sorting out the defective fruits from the export shipment. The cooperation with local industry resulted in the development of an automatic on-line prototype of the acoustic sensor that may be incorporated into the export quality control system for melons. More interesting is the development of the remote firmness sensing method for sealed CA cool-rooms, where most of the full-year fruit yield is stored for off-season consumption. Hundreds of ripening monitor systems have been installed in major fruit storage facilities and are now being evaluated by the consumers. If successful, the new method may cause a major change in long-term fruit storage technology. More uses of the acoustic test method have been considered: monitoring fruit maturity and harvest time, testing fruit samples or each individual fruit when entering the storage facilities, packaging house and auction, and in the supermarket. This approach may result in a full line of equipment for nondestructive quality control of fruits and vegetables, from the orchard or the greenhouse, through the entire sorting, grading and storage process, up to the consumer table. The developed technology offers a tool to determine the maturity of the fruits nondestructively by monitoring their acoustic response to a mechanical impulse on the tree. A special device was built and preliminarily tested on mango fruit. Further development is needed to produce a portable, hand-operated sensing device for this purpose. In North Carolina: An analysis method based on an Auto-Regressive (AR) model was developed for detecting the first resonance of fruit from their response to a mechanical impulse. The algorithm included a routine that detects the first resonant frequency from as many sensors as possible. Experiments on Red Delicious apples were performed and their firmness was determined. The AR method allowed the detection of the first resonance and could be fast enough to be utilized in a real-time sorting machine. Yet, further study is needed to improve the search algorithm of the method. An impact contact-pressure measurement system and a Neural Network (NN) identification method were developed to investigate the relationships between surface pressure distributions on selected fruits and their respective internal textural qualities. A piezoelectric dot-matrix pressure transducer was developed for the purpose of acquiring time-sampled pressure profiles during impact. The acquired data were transferred to a personal computer, and accurate visualizations of the animated data were presented. A preliminary test with 10 apples was performed. Measurements were made by the contact-pressure transducer in two different positions. Complementary measurements were made on the same apples using the Firmalon and Magness-Taylor (MT) testers. A three-layer neural network was designed. Two-thirds of the contact-pressure data were used as training input data and the corresponding MT data as training target data. The remaining data were used as NN checking data. Six samples randomly chosen from the ten measured samples and their corresponding Firmalon values were used as the NN training and target data, respectively.
The data from the remaining four samples were input to the NN, and the NN results were consistent with the firmness tester values; with more training data, the output should be more accurate. In addition, the firmness tester values were not consistent with the MT firmness tester values. The NN method developed in this study appears to be a useful tool to emulate the MT firmness test results without destroying the apple samples. To get a more accurate estimation of MT firmness, a much larger training data set is required. When the larger sensitive area of the pressure sensor being developed in this project becomes available, the entire contact 'shape' will provide additional information, and the neural network results should be more accurate. It has been shown that the impact information can be utilized in the determination of internal quality factors of fruit. Until now,
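The AR-based resonance detection described for the North Carolina work can be sketched generically: fit AR coefficients to an impulse response by least squares, then read the resonant frequency from the angle of the dominant complex pole. The signal below is synthetic, and the sampling rate and model order are illustrative assumptions, not the project's settings.

```python
# Sketch of AR-based detection of a fruit's first resonant frequency: fit an
# AR model to a synthetic impulse response by least squares, then recover the
# resonance from the angle of the dominant complex pole.
import numpy as np

fs = 2000.0                                   # sampling rate, Hz (illustrative)
t = np.arange(0, 0.5, 1 / fs)
f0 = 80.0                                     # "true" first resonance, Hz
x = (np.exp(-8 * t) * np.sin(2 * np.pi * f0 * t)
     + 0.01 * np.random.default_rng(3).standard_normal(t.size))

# Least-squares fit of an order-p AR model: x[n] = sum_k a[k] * x[n-k]
p = 6
X = np.column_stack([x[p - k - 1 : len(x) - k - 1] for k in range(p)])
a, *_ = np.linalg.lstsq(X, x[p:], rcond=None)

# Poles of the AR model; the dominant complex pole's angle gives the resonance
poles = np.roots(np.concatenate(([1.0], -a)))
pole = max((z for z in poles if z.imag > 0), key=abs)
f_est = np.angle(pole) * fs / (2 * np.pi)
print(f"estimated first resonance: {f_est:.1f} Hz (true {f0} Hz)")
```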
APA, Harvard, Vancouver, ISO, and other styles