Dissertations / Theses on the topic 'Criticality'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the top 50 dissertations / theses for your research on the topic 'Criticality.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.
Di Laudo, Umberto. "Deconfined quantum criticality." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2022. http://amslaurea.unibo.it/25125/.
Vanni, Fabio. "Criticality in Cooperative Systems." Thesis, University of North Texas, 2012. https://digital.library.unt.edu/ark:/67531/metadc271910/.
Stiansen, Einar B. "Criticality in Quantum Dissipative Systems." Doctoral thesis, Norges teknisk-naturvitenskapelige universitet, Institutt for fysikk, 2012. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-17475.
Pruessner, Gunnar. "Studies in self-organised criticality." Thesis, Imperial College London, 2004. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.407087.
Iberti, Massimo. "Ising-Kac models near criticality." Thesis, University of Warwick, 2018. http://wrap.warwick.ac.uk/109480/.
Boonzaaier, Leandro. "Self-organised criticality and seismicity." Thesis, Stellenbosch : Stellenbosch University, 2002. http://hdl.handle.net/10019.1/53047.
ENGLISH ABSTRACT: In this thesis we give an overview of self-organised criticality and its application to the study of seismicity. We recall some of the basic models and techniques for studying self-organised critical systems. We discuss one of these, the sandpile model, in detail and show how various properties of the model can be calculated using a matrix formulation thereof. A correspondence between self-organised critical systems and seismicity is then proposed. Finally, we consider the time-evolution of the sandpile model by using a time-to-failure analysis, originally developed in the study of seismicity, and obtain results for the sandpile model that show similarities with those of analyses of seismic data.
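The sandpile dynamics referred to in the abstract above can be sketched in a few lines. This is a minimal Bak-Tang-Wiesenfeld-style toppling simulation (illustrative only; the matrix formulation used in the thesis is not reproduced here), in which each grain drop triggers an avalanche whose size is the number of topplings:

```python
import random

def topple(grid, n, threshold=4):
    """Relax the lattice: any site holding >= threshold grains topples,
    sending one grain to each of its four neighbours; grains pushed past
    the open boundary are lost. Returns the avalanche size (topplings)."""
    size = 0
    unstable = [(i, j) for i in range(n) for j in range(n)
                if grid[i][j] >= threshold]
    while unstable:
        i, j = unstable.pop()
        if grid[i][j] < threshold:
            continue  # stale stack entry, already relaxed
        grid[i][j] -= threshold
        size += 1
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < n and 0 <= nj < n:
                grid[ni][nj] += 1
                unstable.append((ni, nj))
        if grid[i][j] >= threshold:
            unstable.append((i, j))
    return size

def drive(grid, n, drops, rng):
    """Slow driving: add one grain at a random site, relax fully, repeat."""
    sizes = []
    for _ in range(drops):
        grid[rng.randrange(n)][rng.randrange(n)] += 1
        sizes.append(topple(grid, n))
    return sizes

rng = random.Random(0)
n = 20
grid = [[0] * n for _ in range(n)]
sizes = drive(grid, n, 5000, rng)
assert all(cell < 4 for row in grid for cell in row)  # stable after relaxation
```

Plotting a histogram of `sizes` on log-log axes would show the broad, power-law-like avalanche-size distribution characteristic of self-organised criticality.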
Kahil, Rany. "Schedulability in Mixed-criticality Systems." Thesis, Université Grenoble Alpes (ComUE), 2019. http://www.theses.fr/2019GREAM023/document.
Real-time safety-critical systems must complete their tasks within a given time limit. Failure to perform their operations successfully, or missing a deadline, can have severe consequences such as destruction of property and/or loss of life. Examples of such systems include automotive systems, drones and avionics, among others. Safety guarantees must be provided before these systems can be deemed usable. This is usually done through certification performed by a certification authority. Safety evaluation and certification are complicated and costly even for smaller systems.
One answer to these difficulties is the isolation of the critical functionality. Executing tasks of different criticalities on separate platforms prevents non-critical tasks from interfering with critical ones, provides a higher guarantee of safety and simplifies the certification process by limiting it to only the critical functions. But this separation, in turn, introduces undesirable results: inefficient resource utilization and increases in cost, weight, size and energy consumption, which can put a system at a competitive disadvantage.
To overcome the drawbacks of isolation, Mixed Criticality (MC) systems can be used. These systems allow functionalities with different criticalities to execute on the same platform. In 2007, Vestal proposed a model to represent MC systems in which tasks have multiple Worst Case Execution Times (WCETs), one for each criticality level. In addition, correctness conditions for scheduling policies were formally defined, allowing lower-criticality jobs to miss deadlines or even be dropped in cases of failure or emergency. The introduction of multiple WCETs and different conditions for correctness increased the difficulty of the scheduling problem for MC systems. Conventional scheduling policies and schedulability tests proved inadequate, and the need for new algorithms arose.
Since then, a lot of work has been done in this field. In this thesis, we contribute to the study of schedulability in MC systems. The workload of a system is represented as a set of jobs that can describe the execution over the hyper-period of tasks or over a duration in time. This model allows us to study the viability of simulation-based correctness tests in MC systems. We show that simulation tests can still be used in mixed-criticality systems, but in this case the schedulability of the worst-case scenario is no longer sufficient to guarantee the schedulability of the system, even in the fixed-priority scheduling case. We show that scheduling policies are not predictable in general, and define the concept of weak predictability for MC systems. We prove that a specific class of fixed-priority policies is weakly predictable, and propose two simulation-based correctness tests that work for weakly predictable policies. We also demonstrate that, contrary to what was believed, testing for correctness cannot be done with only a linear number of preemptions.
The majority of the related work focuses on systems with two criticality levels due to the difficulty of the problem. But for automotive and airborne systems, industrial standards define four or five criticality levels, which motivated us to propose a scheduling algorithm that schedules mixed-criticality systems with, in theory, any number of criticality levels. We show experimentally that it has higher success rates compared to the state of the art. We illustrate how our scheduling algorithm, or any algorithm that generates a single time-triggered table for each criticality mode, can be used as a recovery strategy to ensure the safety of the system in case of certain failures. Finally, we propose a high-level concurrency language and a model for designing an MC system with coarse-grained multi-core interference.
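Vestal's model and the mode-switch behaviour described above can be sketched in a toy discrete-time simulation. All job parameters below are hypothetical, and the scheduler is a plain deadline-ordered pick, not any algorithm from the thesis:

```python
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    release: int
    deadline: int
    crit: str    # 'LO' or 'HI'
    wcet: dict   # budget per criticality level, e.g. {'LO': 2, 'HI': 4}

def simulate(jobs, exec_time, horizon):
    """Deadline-ordered, single-core, discrete-time simulation with one
    mode switch: start in LO mode; the first time a HI job runs beyond its
    LO budget, switch to HI mode and drop all LO-criticality jobs."""
    mode = 'LO'
    done = {}
    progress = {j.name: 0 for j in jobs}
    for t in range(horizon):
        ready = [j for j in jobs
                 if j.release <= t and j.name not in done
                 and (mode == 'LO' or j.crit == 'HI')]
        if not ready:
            continue
        j = min(ready, key=lambda job: job.deadline)
        progress[j.name] += 1
        if j.crit == 'HI' and progress[j.name] > j.wcet['LO']:
            mode = 'HI'  # LO-budget overrun triggers the mode switch
        if progress[j.name] >= exec_time[j.name]:
            done[j.name] = t + 1
    return mode, done

# Hypothetical two-job scenario: J1 (HI) overruns its LO budget of 2.
jobs = [Job('J1', 0, 5, 'HI', {'LO': 2, 'HI': 4}),
        Job('J2', 0, 8, 'LO', {'LO': 2, 'HI': 2})]
mode, done = simulate(jobs, {'J1': 3, 'J2': 2}, horizon=10)
assert mode == 'HI' and done['J1'] <= 5 and 'J2' not in done
```

In this scenario J1 exceeds its LO-mode budget, the system switches to HI mode, the low-criticality job J2 is dropped, and J1 still meets its deadline, which is exactly what the MC-correctness conditions permit.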
Pueyo Puntí, Salvador. "Irreversibility and Criticality in the Biosphere." Doctoral thesis, Universitat de Barcelona, 2003. http://hdl.handle.net/10803/1421.
I began by adding some new contributions to the thermodynamic approach to systemic ecology, but concluded that there is little scope for further progress of strictly ecological interest with this orientation. Instead, the key for a systemic ecology seems to lie in the "large number" effects that arise at the limit of many organisms and/or species, just like the whole scientific body of statistical physics stands on the general features that emerge at the limit of many particles. The concept of criticality seems to have a special importance within this context (criticality is the quality of lying at the critical point in which there is a second order phase transition).
Some specific issues that I analyze in depth, taking advantage of the concept of criticality and other concepts related to statistical physics, are:
· Wildland fire dynamics. Practical tools to predict and manage fire in boreal forests and in the Mediterranean. Limits to anthropogenic impacts on tropical rainforests before a major fire catastrophe unfolds. The possible generalization of the findings on wildland fires to other kinds of catastrophes, with emphasis on agricultural pests and epidemics.
· Diversity patterns. The origin of species abundance distributions and species-area relations. Their interpretation (and misinterpretation). The case of marine phytoplankton. The quantification of diversity for conservation purposes.
· The effects of diversity on stability. The sources of the apparent inconsistencies between theoretical models, both historical and current, and between theoretical expectations and some experimental results.
I conclude with a discussion of the relevance of these and other related findings from the point of view of ecological economics.
Küttler, Martin, Michael Roitzsch, Claude-Joachim Hamann, and Marcus Völp. "Probabilistic Analysis of Low-Criticality Execution." Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2018. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-233117.
Hawtin, Benjamin Charles. "Defect criticality of carbon fibre composites." Thesis, University of Bath, 2003. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.425875.
Jones, Thomas Berry. "Criticality Assessments for Improving Algorithmic Robustness." Thesis, The University of New Mexico, 2019. http://pqdtopen.proquest.com/#viewpdf?dispub=10980232.
Though computational models typically assume all program steps execute flawlessly, that does not imply all steps are equally important if a failure should occur. In the "Constrained Reliability Allocation" problem, sufficient resources are guaranteed for operations that prompt eventual program termination on failure, but those operations that only cause output errors are given a limited budget of some vital resource, insufficient to ensure correct operation for each of them.
In this dissertation, I present a novel representation of failures based on their timing and location, combined with criticality assessments, a method used to predict the behavior of systems operating outside their design criteria. I observe that strictly correct error measures hide interesting failure relationships, that failure importance is often determined by failure timing, and that recursion plays an important role in structuring output error. I employ these observations to improve the output error of two matrix multiplication methods through an economization procedure that moves failures from worse to better locations, thus providing a possible solution to the constrained reliability allocation problem. I show a 38% to 63% decrease in absolute value error on matrix multiplication algorithms, despite nearly identical failure counts between control and experimental studies. Finally, I show that efficient sorting algorithms are less robust at large scale than less efficient sorting algorithms.
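The idea that failure location determines output error can be illustrated with a small experiment (a sketch under assumed fault semantics, not the dissertation's actual procedure): inject a single dropped multiply-accumulate into a naive matrix multiplication at every possible step and compare the resulting absolute errors.

```python
import random

def matmul_with_fault(A, B, fault_step=None):
    """Naive matrix multiply that optionally drops one scalar
    multiply-accumulate (the injected failure), identified by its
    flat step index in the i-j-k loop nest."""
    n, m, p = len(A), len(B), len(B[0])
    C = [[0.0] * p for _ in range(n)]
    step = 0
    for i in range(n):
        for j in range(p):
            for k in range(m):
                if step != fault_step:
                    C[i][j] += A[i][k] * B[k][j]
                step += 1
    return C

def abs_error(C, D):
    return sum(abs(c - d) for rc, rd in zip(C, D) for c, d in zip(rc, rd))

rng = random.Random(1)
n = 4
A = [[rng.uniform(-1, 1) for _ in range(n)] for _ in range(n)]
B = [[rng.uniform(-1, 1) for _ in range(n)] for _ in range(n)]
exact = matmul_with_fault(A, B)
# Assess every possible failure location by the output error it causes.
errors = [abs_error(exact, matmul_with_fault(A, B, s)) for s in range(n ** 3)]
assert max(errors) > min(errors)  # failure location matters
```

Ranking `errors` gives exactly the kind of criticality assessment the abstract describes: an economization procedure would steer the available reliability budget toward the steps with the largest error contribution.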
Siapas, Athanassios G. "Criticality and parallelism in combinatorial optimization." Thesis, Massachusetts Institute of Technology, 1996. http://hdl.handle.net/1721.1/11009.
Includes bibliographical references (p. 60-63). Ph.D. dissertation by Athanassios G. Siapas.
Merchant, P. L. H. "Excitations and criticality in quantum magnets." Thesis, University College London (University of London), 2013. http://discovery.ucl.ac.uk/1388128/.
De Villiers, Anton Pierre. "Edge criticality in secure graph domination." Thesis, Stellenbosch : Stellenbosch University, 2014. http://hdl.handle.net/10019.1/95841.
ENGLISH ABSTRACT: The domination number of a graph is the cardinality of a smallest subset of its vertex set with the property that each vertex of the graph is in the subset or adjacent to a vertex in the subset. This graph parameter has been studied extensively since its introduction during the early 1960s and finds application in the generic setting where the vertices of the graph denote physical entities that are typically geographically dispersed and have to be monitored efficiently, while the graph edges model links between these entities which enable guards, stationed at the vertices, to monitor adjacent entities. In the above application, the guards remain stationary at the entities. In 2005, this constraint was, however, relaxed by the introduction of a new domination-related parameter, called the secure domination number. In this relaxed, dynamic setting, each unoccupied entity is defended by a guard stationed at an adjacent entity who can travel along an edge to the unoccupied entity in order to resolve a security threat that may occur there, after which the resulting configuration of guards at the entities is again required to be a dominating set of the graph. The secure domination number of a graph is the smallest number of guards that can be placed on its vertices so as to satisfy these requirements. In this generalised setting, the notion of edge removal is important, because one might seek the cost, in terms of the additional number of guards required, of protecting the complex of entities modelled by the graph if a number of edges in the graph were to fail (i.e. a number of links were to be eliminated from the complex, thereby disqualifying guards from moving along such disabled links). A comprehensive survey of the literature on secure graph domination is conducted in this dissertation. Descriptions of related, generalised graph protection parameters are also given.
The classes of graphs with secure domination number 1, 2 or 3 are characterised and a result on the number of defenders in any minimum secure dominating set of a graph without end-vertices is presented, after which it is shown that the decision problem associated with computing the secure domination number of an arbitrary graph is NP-complete. Two exponential-time algorithms and a binary programming problem formulation are presented for computing the secure domination number of an arbitrary graph, while a linear algorithm is put forward for computing the secure domination number of an arbitrary tree. The practical efficiencies of these algorithms are compared in the context of small graphs. The smallest and largest increase in the secure domination number of a graph are also considered when a fixed number of edges are removed from the graph. Two novel cost functions are introduced for this purpose. General bounds on these two cost functions are established, and exact values of or tighter bounds on the cost functions are determined for various infinite classes of special graphs. Threshold information is finally established in respect of the number of possible edge removals from a graph before increasing its secure domination number. The notions of criticality and stability are introduced and studied in this respect, focussing on the smallest number of arbitrary edges whose deletion necessarily increases the secure domination number of the resulting graph, and the largest number of arbitrary edges whose deletion necessarily does not increase the secure domination number of the resulting graph.
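The definitions above can be made concrete with a brute-force check for small graphs (illustrative only; the example graph and helper names are ours, and the thesis's exponential-time algorithms and binary programming formulation are far more refined):

```python
from itertools import combinations

def is_dominating(S, adj):
    """Every vertex is in S or has a neighbour in S."""
    return all(v in S or any(u in S for u in adj[v]) for v in adj)

def is_secure_dominating(S, adj):
    """S must dominate the graph, and every unoccupied vertex v must have
    a guard on some neighbour u whose move to v, giving (S - {u}) | {v},
    again leaves a dominating set."""
    S = set(S)
    if not is_dominating(S, adj):
        return False
    return all(v in S or any(u in S and is_dominating(S - {u} | {v}, adj)
                             for u in adj[v])
               for v in adj)

def secure_domination_number(adj):
    """Smallest cardinality of a secure dominating set (brute force)."""
    for k in range(1, len(adj) + 1):
        for S in combinations(adj, k):
            if is_secure_dominating(S, adj):
                return k

# Small example: the 5-cycle C5; the brute force finds the answer 3.
C5 = {i: {(i - 1) % 5, (i + 1) % 5} for i in range(5)}
assert secure_domination_number(C5) == 3
```

The security check is exactly the guard-movement condition of the abstract: removing the defending guard from u and placing it on v must leave a configuration that still dominates.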
Socci, Dario. "Scheduling of certifiable mixed-criticality systems." Thesis, Université Grenoble Alpes (ComUE), 2016. http://www.theses.fr/2016GREAM025/document.
Modern real-time systems tend to be mixed-critical, in the sense that they integrate, on the same computational platform, applications at different levels of criticality. Integration gives the advantages of reduced cost, weight and power consumption, which can be crucial for modern applications like Unmanned Aerial Vehicles (UAVs). On the other hand, it leads to major complications in system design. Moreover, such systems are subject to certification, and different criticality levels need to be certified at different levels of assurance. Among other aspects, the real-time scheduling of certifiable mixed-criticality systems has been recognized as a challenging problem. Traditional techniques require complete isolation between criticality levels or global certification to the highest level of assurance, which leads to resource waste, thus losing the advantage of integration. This led to a new wave of research in the real-time community, and many solutions were proposed. Among those, one of the most popular methods used to schedule such systems is the Audsley approach. However, this method has some limitations, which we discuss in this thesis. These limitations are more pronounced in the case of multiprocessor scheduling, where priority-based scheduling loses some important properties. For this reason, scheduling algorithms for multiprocessor mixed-critical systems are not as numerous in the literature as single-processor ones, and are usually built on restrictive assumptions. This is particularly problematic since industrial real-time systems strive to migrate from single-core to multi-core and many-core platforms. We therefore motivate and study a different approach that can overcome these problems. A restriction on the practical usability of many mixed-critical and multiprocessor scheduling algorithms is the assumption that jobs are independent. In reality they often have precedence constraints.
In the thesis we present the mixed-critical variant of the problem formulation and extend the system load metrics to the case of precedence-constrained task graphs. We also show that our proposed methodology and scheduling algorithm MCPI can be extended to the case of dependent jobs without major modification, showing performance similar to the independent-jobs case.
Another topic we treat in this thesis is time-triggered scheduling. This class of schedulers is important because it considerably reduces the uncertainty of job execution intervals, thus simplifying the certification of safety-critical systems. It also simplifies any auxiliary timing-based analyses that may be required to validate important extra-functional properties in embedded systems, such as interference on shared buses and caches, peak power dissipation, and electromagnetic interference. The trivial method of obtaining a time-triggered schedule is simulation of the worst-case scenario of an event-triggered algorithm. However, when applied directly, this method is not efficient for mixed-critical systems, as instead of one worst-case scenario they have multiple corner-case scenarios. For this reason, it was proposed in the literature to fold all scenarios into just a few tables, one per criticality mode. We call this scheduling approach Single Time Table per Mode (STTM) and propose a contribution in this context: we introduce a method that transforms practically any scheduling algorithm into an STTM one. It works optimally on a single core and shows good experimental results for multi-cores.
Finally, we study the problem of the practical realization of mixed-critical systems. Our effort in this direction is a design flow that we propose for multicore mixed-critical systems, in which the model of computation is a network of deterministic multi-periodic synchronous processes. Our approach is demonstrated using a publicly available toolset, an industrial application use case and a multi-core platform.
Küttler, Martin, Michael Roitzsch, Claude-Joachim Hamann, and Marcus Völp. "Probabilistic Analysis of Low-Criticality Execution." Technische Universität Dresden, 2017. https://tud.qucosa.de/id/qucosa%3A30798.
Zare, Marzieh. "Cooperation-induced Criticality in Neural Networks." Thesis, University of North Texas, 2013. https://digital.library.unt.edu/ark:/67531/metadc283813/.
Grilli, Jacopo. "Randomness and Criticality in Biological Interactions." Doctoral thesis, Università degli studi di Padova, 2015. http://hdl.handle.net/11577/3424011.
In this thesis we study, from a physics perspective, two problems related to biological interactions. In the first part of the thesis we consider ecological interactions, which shape ecosystems and determine their fate, and their relation to ecosystem stability. Using random matrix theory, we are able to identify the key quantities, the order parameters, that determine the stability of ecosystems. We then consider the problem of determining the persistence of a population living in a randomly fragmented landscape. Using techniques borrowed from random matrix theory applied to disordered systems, we manage to identify the key ingredients for persistence. The second part of the thesis is devoted to the observation that many living systems appear to be tuned precisely close to a critical point. We introduce a stochastic model, grounded in information theory, that predicts critical points as the natural outcome of a process of evolution and adaptation, without fine-tuning of parameters.
Namazi, Alireza. "Emergent behavior and criticality in online auctions." [S.l.] : [s.n.], 2005. http://deposit.ddb.de/cgi-bin/dokserv?idn=976716739.
Full textWang, Jingtao. "The nature of asymmetry in fluid criticality." College Park, Md. : University of Maryland, 2006. http://hdl.handle.net/1903/3815.
Thesis research directed by: Chemical Engineering. Title from t.p. of PDF. Includes bibliographical references. Published by UMI Dissertation Services, Ann Arbor, Mich. Also available in paper.
London, Mark Daniel. "Complexity and criticality in financial time series." Thesis, De Montfort University, 2003. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.434034.
Ford, Gary Nicholas. "Data criticality in through life engineering support." Thesis, University of Bristol, 2016. https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.761228.
Yeo, Dominic. "Self-organised criticality in random graph processes." Thesis, University of Oxford, 2016. https://ora.ox.ac.uk/objects/uuid:23af1abc-2128-4315-9b25-55ed8f290875.
Full textp(N) = | 1+λN-1/3 |
N |
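An edge probability of the form p(N) = (1 + λN^(-1/3))/N places an Erdos-Renyi random graph in its critical window. A quick simulation (illustrative, not from the thesis) compares the typical largest component below, inside, and above the window:

```python
import random

def largest_component(n, p, rng):
    """Sample G(n, p) and return its largest component size (union-find)."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                parent[find(i)] = find(j)
    sizes = {}
    for v in range(n):
        r = find(v)
        sizes[r] = sizes.get(r, 0) + 1
    return max(sizes.values())

rng = random.Random(2)
N, lam, runs = 400, 0.0, 20
p_crit = (1 + lam * N ** (-1 / 3)) / N   # inside the critical window
p_sub = 0.2 / N                          # well below it
p_sup = 3.0 / N                          # well above it
sub = sum(largest_component(N, p_sub, rng) for _ in range(runs)) / runs
crit = sum(largest_component(N, p_crit, rng) for _ in range(runs)) / runs
sup = sum(largest_component(N, p_sup, rng) for _ in range(runs)) / runs
assert sub < crit < sup  # roughly log N, N^(2/3) and order N respectively
```

Sweeping λ through the window interpolates between the subcritical and supercritical regimes, which is the scaling picture underlying critical random graph processes.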
Zhou, Luyuan. "Security Risk Analysis based on Data Criticality." Thesis, Linnéuniversitetet, Institutionen för datavetenskap och medieteknik (DM), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-93055.
Navas-Portella, Víctor. "Statistical modelling of avalanche observables: criticality and universality." Doctoral thesis, Universitat de Barcelona, 2020. http://hdl.handle.net/10803/670764.
Complex systems can be understood as entities composed of a large number of interacting elements whose global, emergent response cannot be derived from the particular laws characterising each of their constituents. The observables characterising these systems can be measured at different scales and often exhibit interesting properties such as the lack of characteristic scales and self-similarity. In this context, power-law functions play an important role in the description of these observables. The presence of power laws resembles the situation of equilibrium critical phenomena, where some thermodynamic quantities show a similar functional behaviour close to a critical point. Different complex systems can be grouped into the same universality class when the power-law functions characterising their observables have the same exponents. When driven externally, the response of some complex systems follows what is called an avalanche process: a collective response of the system characterised by intermittent dynamics, with sudden bursts of activity separated by periods of silence. Such out-of-equilibrium systems can be found in different disciplines such as seismology, astrophysics, ecology, epidemiology and finance, to mention a few. Avalanches are characterised by a set of observables such as size, energy and duration. When these observables lack characteristic scales, their probability distributions can be statistically modelled by power-law distributions. Avalanches whose observables can be characterised by such distributions are called critical avalanches.
In this sense, the concepts of criticality and universality, which are well defined for equilibrium phenomena, can be extended to the probability distributions describing avalanche observables in out-of-equilibrium systems. The main goal of this doctoral thesis is to provide robust statistical methods to characterise criticality and universality in avalanches corresponding to empirical data. Owing to limitations in data acquisition, empirical data often cover a small range of observation, making it difficult to establish a given power-law behaviour unambiguously. In order to discuss the concepts of criticality and universality in avalanches, two different systems are considered: earthquakes and the acoustic emission events generated during laboratory compression experiments on porous materials (labquakes). The techniques developed in this doctoral thesis focus mainly on the distribution of earthquake and labquake sizes, otherwise known as the Gutenberg-Richter law. These methods are, however, much more general and can be applied to any avalanche observable. The statistical techniques provided in this work can also help in earthquake forecasting. For years, Coulomb stress theory has been used in seismology to understand how earthquakes trigger the occurrence of new ones. Earthquake models that relate the aftershock occurrence rate to the Coulomb stress after a large event assume that the earthquake size distribution is not affected by the change in Coulomb stress. Several statistical analyses are applied to check whether the magnitude distribution is sensitive to the sign of the Coulomb stress.
The use of advanced statistical techniques in the analysis of complex systems has proven useful and necessary in order to bring rigour to empirical results and, in particular, to problems of risk analysis.
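The kind of power-law fitting of avalanche sizes discussed above can be sketched with the standard maximum-likelihood estimator for a continuous power-law exponent (a textbook method, not necessarily the exact procedure of the thesis): generate synthetic avalanche sizes and recover the exponent.

```python
import math
import random

def sample_power_law(alpha, x_min, n, rng):
    """Inverse-transform sampling from p(x) proportional to x^(-alpha)
    for x >= x_min; uses 1 - U to avoid a zero base."""
    return [x_min * (1 - rng.random()) ** (-1 / (alpha - 1)) for _ in range(n)]

def mle_exponent(xs, x_min):
    """Maximum-likelihood (Hill) estimate of a continuous power-law
    exponent: alpha_hat = 1 + n / sum(ln(x_i / x_min))."""
    return 1 + len(xs) / sum(math.log(x / x_min) for x in xs)

rng = random.Random(3)
xs = sample_power_law(alpha=2.5, x_min=1.0, n=20000, rng=rng)
alpha_hat = mle_exponent(xs, 1.0)
assert abs(alpha_hat - 2.5) < 0.1  # the exponent is recovered
```

On real catalogues the difficult part, as the abstract stresses, is choosing x_min and establishing the power-law regime unambiguously over a limited observation range; the estimator itself is the easy step.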
Naz, Sabiha. "Benchmark criticality calculations for one speed neutron transport." Diss., Columbia, Mo. : University of Missouri-Columbia, 2007. http://hdl.handle.net/10355/5927.
The entire dissertation/thesis text is included in the research.pdf file; the official abstract appears in the short.pdf file (which also appears in the research.pdf); a non-technical general description, or public abstract, appears in the public.pdf file. Title from title screen of research.pdf file (viewed on October 17, 2007). Vita. Includes bibliographical references.
Coetzer, Audrey. "Criticality of the lower domination parameters of graphs." Thesis, Link to the online version, 2007. http://hdl.handle.net/10019/1051.
Hasty, Jeff. "A renormalization group study of self-organized criticality." Diss., Georgia Institute of Technology, 1997. http://hdl.handle.net/1853/29887.
Peters, Ole Bjoern. "Approaches to criticality : rainfall and other relaxation processes." Thesis, Imperial College London, 2004. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.415268.
Stapleton, Matthew Alexander. "Self-organised criticality and non-equilibrium statistical mechanics." Thesis, Imperial College London, 2007. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.443792.
Hollinghurst, Joe. "Enabling software defined networking in high criticality networks." Thesis, University of Bristol, 2018. http://hdl.handle.net/1983/8ac68df0-62ba-4cf8-beee-b69ee807f43e.
Fleming, Thomas David. "Allocation and optimisation of mixed criticality cyclic executives." Thesis, University of York, 2017. http://etheses.whiterose.ac.uk/19031/.
Full textRoux, Adriana. "Vertex-criticality of the domination parameters of graphs." Thesis, Stellenbosch : University of Stellenbosch, 2011. http://hdl.handle.net/10019.1/6874.
Full textZamani, Farzaneh. "Local quantum criticality in and out of equilibrium." Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2016. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-213688.
Full textReis, Elohim Fonseca dos 1984. "Criticality in neural networks = Criticalidade em redes neurais." [s.n.], 2015. http://repositorio.unicamp.br/jspui/handle/REPOSIP/276917.
Full textDissertação (mestrado) - Universidade Estadual de Campinas, Instituto de Física Gleb Wataghin
Abstract: This work is divided into two parts. In the first part, a correlation network is built based on an Ising model at different temperatures (critical, subcritical and supercritical), using a Metropolis Monte-Carlo algorithm with single-spin-flip dynamics. This theoretical model is compared with a brain network built from the correlations of BOLD fMRI time series of brain-region activity. Network measures, such as the clustering coefficient, average shortest path length and degree distribution, are analysed. The same network measures are calculated for the network obtained from the time-series correlations of the spins in the Ising model. The results from the brain network are better explained by the theoretical model at the critical temperature, suggesting critical aspects in brain dynamics. In the second part, the temporal dynamics of the activity of a neuron population, that is, the activity of retinal ganglion cells recorded on a multi-electrode array, is studied. Many studies have focused on describing the activity of neural networks using disordered Ising models, with no regard to the dynamic nature. Treating time as an extra dimension of the system, the temporal dynamics of the activity of the neuron population is modeled. The maximum entropy principle is used to build an Ising model with pairwise interactions between the activities of different neurons at different times. Model fitting is performed by a combination of Metropolis Monte Carlo sampling and gradient descent methods. The system is characterized by the learned parameters; questions such as detailed balance and time reversibility are analysed, and thermodynamic variables, such as the specific heat, can be calculated to study critical aspects.
Master's
Physics
Master in Physics
2013/25361-6
FAPESP
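The abstract above describes Metropolis Monte-Carlo sampling with single-spin-flip dynamics; the update rule can be sketched with a minimal 2D Ising sampler (lattice size, temperatures and sweep counts here are arbitrary illustrations, not the thesis's settings):

```python
import math
import random

def metropolis_ising(L=16, T=2.27, sweeps=200, seed=1):
    """Metropolis single-spin-flip dynamics for a 2D Ising model
    (J = 1, k_B = 1) on an L x L periodic lattice; returns the mean
    magnetisation of the final configuration."""
    rng = random.Random(seed)
    spins = [[rng.choice((-1, 1)) for _ in range(L)] for _ in range(L)]
    for _ in range(sweeps * L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        nn = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
              + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
        dE = 2 * spins[i][j] * nn              # energy cost of flipping s_ij
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            spins[i][j] *= -1                  # accept the flip
    return sum(sum(row) for row in spins) / L ** 2

m_hot = metropolis_ising(T=100.0)   # far above T_c: |m| stays small
```

Near the critical temperature T_c ≈ 2.269 the spin-spin correlations become long-ranged, which is the regime the thesis turns into a correlation network.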
Gates, John Fitzgerald. "Leadership of changing universities : a case for criticality." Thesis, University College London (University of London), 2009. http://discovery.ucl.ac.uk/10019906/.
Full text
Peliz Pinto Teixeira, Filipe. "Criticality and its effect on other cortical phenomena." Thesis, Imperial College London, 2015. http://hdl.handle.net/10044/1/31608.
Full text
Pribyl, David James, 1963. "Nuclear excursions in criticality accidents with fissile solutions." Thesis, The University of Arizona, 1989. http://hdl.handle.net/10150/276965.
Full text
Romero de Mills, L. Patricia. "The development of criticality amongst undergraduate students of Spanish." Thesis, University of Southampton, 2008. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.494974.
Full text
Miller, Gael. "Measurements of criticality in self-organizing cellular automata models." Thesis, Heriot-Watt University, 2003. http://hdl.handle.net/10399/301.
Full text
Curtin, Oliver James. "Quantum criticality and emergent symmetry in coupled quantum dots." Thesis, Imperial College London, 2016. http://hdl.handle.net/10044/1/42499.
Full text
Scheben, Fynn. "Iterative methods for criticality computations in neutron transport theory." Thesis, University of Bath, 2011. https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.545319.
Full text
Lawley, Martyn Laurence. "Aspects of quantum criticality in itinerant electron ferromagnetic systems." Thesis, University of Birmingham, 2010. http://etheses.bham.ac.uk//id/eprint/536/.
Full text
Maurer, Simon. "Analysis and coordination of mixed-criticality cyber-physical systems." Thesis, University of Hertfordshire, 2018. http://hdl.handle.net/2299/21094.
Full text
Tadjfar, Nagisa. "Assessing the criticality of germanium as a by-product." Thesis, Massachusetts Institute of Technology, 2017. http://hdl.handle.net/1721.1/111354.
Full textCataloged from PDF version of thesis.
Includes bibliographical references (pages 37-38).
Although germanium production is currently nowhere near its supply potential, many sources cite germanium, a by-product material produced primarily from zinc and coal, as a critical metal. Current methods for assessing criticality include frameworks that rely on geopolitical risk metrics, geological reserves, substitutability, and processing limitations during extraction, among others, but there is a gap in understanding the complex supply and demand dynamics involved in the market for by-products. This thesis addresses this gap by assessing the supply risk of germanium using an econometric framework to generate estimates of price elasticities. Annual world production and price data for germanium over the years 1967-2014 were used to construct supply and demand models in order to obtain estimates of the price elasticities of supply and demand. Ordinary least squares (OLS) regression was used on an autoregressive distributed lag (ARDL) model for both supply and demand. The supply model was constructed with price, zinc production, and the 5-year interest rate as shifters, along with lag terms for germanium production, germanium price, the 5-year interest rate, and zinc production. The adjusted R2 was 0.761, and the long-term supply price elasticity was found to be 0.05, with an upper bound of 0.7 and a lower bound of -0.6, indicating that germanium supply is price inelastic. In a similar fashion, a demand model was constructed with two structural breaks accounting for fundamental changes in the market structure in 1991 and 2003, along with lag terms for germanium production, germanium price, and antimony price. The adjusted R2 value for the demand model was 0.683, and the price elasticity was 0.05, with an upper bound of 1 and a lower bound of -1, indicating that demand, too, is price inelastic. This creates an added risk of supply shortages, adding to the criticality of germanium.
However, the stabilizing behavior of its carriers, coal and zinc, reduces the likelihood of an actual shortage. This type of analysis improves upon existing methods and can lead to more accurate quantified estimates of long-term criticality.
by Nagisa Tadjfar.
S.B.
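The fits described above use OLS on an ARDL specification; the mechanics can be sketched on synthetic data with a known elasticity. The `ols` helper, the coefficients and the series below are hypothetical illustrations, not the thesis's model:

```python
import random

def ols(X, y):
    """Ordinary least squares via the normal equations (X'X) b = X'y,
    solved by Gauss-Jordan elimination; X is a list of rows."""
    k = len(X[0])
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    A = [row[:] + [b] for row, b in zip(XtX, Xty)]   # augmented system
    for c in range(k):
        piv = max(range(c, k), key=lambda r: abs(A[r][c]))
        A[c], A[piv] = A[piv], A[c]                  # partial pivoting
        for r in range(k):
            if r != c:
                f = A[r][c] / A[c][c]
                A[r] = [a - f * b for a, b in zip(A[r], A[c])]
    return [A[i][k] / A[i][i] for i in range(k)]

# Synthetic ARDL(1) log-supply: q_t = 0.5 q_{t-1} + 0.05 p_t + noise,
# i.e. short-run price elasticity 0.05, long-run 0.05 / (1 - 0.5) = 0.1.
random.seed(3)
q, p = [0.0], [random.gauss(0, 1)]
for _ in range(500):
    p.append(random.gauss(0, 1))
    q.append(0.5 * q[-1] + 0.05 * p[-1] + random.gauss(0, 0.01))
X = [[1.0, q[t - 1], p[t]] for t in range(1, len(q))]
coef = ols(X, q[1:])        # [intercept, lag coefficient, elasticity]
```

In an ARDL model of this form, the long-run elasticity follows from the short-run coefficient and the lag coefficient as coef[2] / (1 - coef[1]), which is how long-term elasticities are conventionally derived from such fits.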
Pinheiro Neto, Joao [Verfasser]. "Criticality and sampling in neural networks / Joao Pinheiro Neto." Göttingen : Niedersächsische Staats- und Universitätsbibliothek Göttingen, 2021. http://d-nb.info/1228364605/34.
Full text
Zhang, Ruoyu. "An Evaluation of Mixed Criticality Metric for Mechatronics Systems." Thesis, KTH, Maskinkonstruktion (Inst.), 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-201092.
Full text
Hinton, Michael Glenn. "Inter-Core Interference Mitigation in a Mixed Criticality System." BYU ScholarsArchive, 2020. https://scholarsarchive.byu.edu/etd/8648.
Full text
Klamser, Pascal. "Collective Information Processing and Criticality, Evolution and Limited Attention." Doctoral thesis, Humboldt-Universität zu Berlin, 2021. http://dx.doi.org/10.18452/23099.
Full textIn the first part, I focus on the self-organization to criticality (here an order-disorder phase transition) and investigate if evolution is a possible self-tuning mechanism. Does a simulated cohesive swarm that tries to avoid a pursuing predator self-tunes itself by evolution to the critical point to optimize avoidance? It turns out that (i) the best group avoidance is at criticality but (ii) not due to an enhanced response but because of structural changes (fundamentally linked to criticality), (iii) the group optimum is not an evolutionary stable state, in fact (iv) it is an evolutionary accelerator due to a maximal spatial self-sorting of individuals causing spatial selection. In the second part, I model experimentally observed differences in collective behavior of fish groups subject to multiple generation of different types of size-dependent selection. The real world analog to this experimental evolution is recreational fishery (small fish are released, large are consumed) and commercial fishing with large net widths (small/young individuals can escape). The results suggest that large harvesting reduces cohesion and risk taking of individuals. I show that both findings can be mechanistically explained based on an attention trade-off between social and environmental information. Furthermore, I numerically analyze how differently size-harvested groups perform in a natural predator and fishing scenario. In the last part of the thesis, I quantify the collective information processing in the field. The study system is a fish species adapted to sulfidic water conditions with a collective escape behavior from aerial predators which manifests in repeated collective escape dives. These fish measure about 2 centimeters, but the collective wave spreads across meters in dense shoals at the surface. I find that wave speed increases weakly with polarization, is fastest at an optimal density and depends on its direction relative to shoal orientation.
Joyce, Peter James. "Experimental investigation of defect criticality in FRP laminate composites /." Digital version accessible at:, 1999. http://wwwlib.umi.com/cr/utexas/main.
Full text