Dissertations / Theses on the topic 'Uncertainty principles'

To see the other types of publications on this topic, follow the link: Uncertainty principles.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'Uncertainty principles.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Erb, Wolfgang. "Uncertainty principles on Riemannian manifolds." kostenfrei, 2010. https://mediatum2.ub.tum.de/node?id=976465.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Pryor, Ronald L. "Principles of nonspecificity." Diss., Online access via UMI:, 2007.

Find full text
Abstract:
Thesis (Ph. D.)--State University of New York at Binghamton, Thomas J. Watson School of Engineering and Applied Science, Department of Systems Science and Industrial Engineering, 2007.
Includes bibliographical references.
APA, Harvard, Vancouver, ISO, and other styles
3

Mercer, Leah Gwenyth. "Complementarity and the uncertainty principle as aesthetic principles : the practice and performance of The Physics Project." Thesis, Queensland University of Technology, 2009. https://eprints.qut.edu.au/29938/1/Leah_Mercer_Thesis.pdf.

Full text
Abstract:
Using the generative processes developed over two stages of creative development and the performance of The Physics Project at the Loft at the Creative Industries Precinct at the Queensland University of Technology (QUT) from 5th – 8th April 2006 as a case study, this exegesis considers how the principles of contemporary physics can be reframed as aesthetic principles in the creation of contemporary performance. The Physics Project is an original performance work that melds live performance, video and web-casting and overlaps an exploration of personal identity with the physics of space, time, light and complementarity. It considers the acts of translation between the language of physics and the language of contemporary performance that occur via process and form. This exegesis also examines the devices in contemporary performance making and contemporary performance that extend the reach of the performance, including the integration of the live and the mediated and the use of metanarratives.
APA, Harvard, Vancouver, ISO, and other styles
4

Mercer, Leah Gwenyth. "Complementarity and the uncertainty principle as aesthetic principles : the practice and performance of The Physics Project." Queensland University of Technology, 2009. http://eprints.qut.edu.au/29938/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Gillio, Emanuele F. (Emanuele Filiberto) 1973. "Lean principles applied to a supply chain with demand uncertainty." Thesis, Massachusetts Institute of Technology, 2002. http://hdl.handle.net/1721.1/34722.

Full text
Abstract:
Thesis (S.M.)--Massachusetts Institute of Technology, Sloan School of Management; and (S.M.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering; in conjunction with the Leaders for Manufacturing Program at MIT, 2002.
Includes bibliographical references (leaves 74-75).
This thesis describes the work performed over a six and a half month internship at Eastman Kodak Company in Rochester, NY. The thesis focuses on the implementation of a lean manufacturing system, modeled after the Toyota Production System, in the Kodak color film business. The goal of the system is to systematically eliminate all forms of waste from the production process in an attempt to reduce costs and inventory. This thesis approaches the problem from two different points of view. On the one hand, it takes a high level view of the entire supply chain and describes how material and information should flow through the supply chain. It highlights where inventory buffers should be located and which operations should be improved in order to reduce the size of these buffers. Finally, this thesis highlights the importance of leveling the customer demand signal in order to implement a true pull system using Kanbans. On the other hand, this thesis describes the implementation of lean manufacturing tools such as Kanban systems and Heijunka boards in some Kodak operations. This work includes the use of tools such as visual signals, cellular manufacturing, Kanbans, Heijunka boards, etc. The work performed over the internship sets the foundation for the transformation of the Kodak supply chain into a lean supply chain capable of dealing with uncertain demand. Additionally, the work can easily be transferred and applied to other Kodak businesses such as paper and photochemicals.
APA, Harvard, Vancouver, ISO, and other styles
6

Essghaier, Fatma. "Collective decision making under qualitative possibilistic uncertainty : principles and characterization." Thesis, Toulouse 3, 2016. http://www.theses.fr/2016TOU30211/document.

Full text
Abstract:
This thesis raises the question of collective decision making under possibilistic uncertainty. We propose several collective qualitative decision rules and show that, in the context of a possibilistic representation of uncertainty, the use of an egalitarian pessimistic collective utility function allows us to get rid of the Timing Effect. Going a step further, we prove that if both the agents' preferences and the collective ranking of the decisions satisfy Dubois and Prade's axioms (1995, 1998), together with some additional axioms relative to collective choice, in particular Pareto unanimity, then egalitarian collective aggregation is compulsory. The picture is then completed by the proposition and characterization of an optimistic counterpart of this pessimistic decision rule. Our axiomatic system can be seen as an ordinal counterpart of Harsanyi's theorem (1955). We prove this result in a formalism based on the von Neumann and Morgenstern framework (1948), which compares possibilistic lotteries. Besides, we propose a first attempt to characterize collective qualitative decision rules in Savage's formalism (1972), where decisions are represented by acts rather than by lotteries. From an algorithmic standpoint, we consider strategy optimization in possibilistic decision trees using the decision rules characterized in the first part of this work. We provide an adaptation of the Dynamic Programming algorithm for criteria that satisfy the property of monotonicity, and propose a Multi-Dynamic Programming algorithm and a Branch and Bound algorithm for those that are not monotonic. Finally, we provide an empirical comparison of the different algorithms: the measured CPU times increase linearly with the size of the tree and remain affordable on average even for large trees. We then study how accurately Dynamic Programming approximates the exact algorithms. For the U-max ante criterion, the approximation of the exact Multi-Dynamic Programming result is not good, but this is not dramatic since Multi-Dynamic Programming is itself polynomial (and efficient in practice); for the U+min ante rule, the Dynamic Programming approximation is good, so a full Branch and Bound enumeration should be avoidable when computing optimal strategies.
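For orientation, the pessimistic and optimistic qualitative utilities referred to above are usually written as follows (a standard Dubois-Prade formulation given here for context, not quoted from the thesis): for a possibility distribution \(\pi\) over states and a utility function \(u\), both valued in a common ordinal scale identified with \([0,1]\),

\[
U^{-}(\pi) \;=\; \min_{s}\,\max\bigl(1-\pi(s),\,u(s)\bigr),
\qquad
U^{+}(\pi) \;=\; \max_{s}\,\min\bigl(\pi(s),\,u(s)\bigr).
\]

The egalitarian pessimistic collective rule discussed in the abstract then aggregates the individual utilities with a further min over the agents.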
APA, Harvard, Vancouver, ISO, and other styles
7

Adamcik, Martin. "Collective reasoning under uncertainty and inconsistency." Thesis, University of Manchester, 2014. https://www.research.manchester.ac.uk/portal/en/theses/collective-reasoning-under-uncertainty-and-inconsistency(7fab8021-8beb-45e7-8b45-7cb4fadd70be).html.

Full text
Abstract:
In this thesis we investigate some global desiderata for probabilistic knowledge merging given several possibly jointly inconsistent, but individually consistent knowledge bases. We show that the most naive methods of merging, which combine applications of a single expert inference process with the application of a pooling operator, fail to satisfy certain basic consistency principles. We therefore adopt a different approach. Following recent developments in machine learning where Bregman divergences appear to be powerful, we define several probabilistic merging operators which minimise the joint divergence between merged knowledge and given knowledge bases. In particular we prove that in many cases the result of applying such operators coincides with the sets of fixed points of averaging projective procedures - procedures which combine knowledge updating with pooling operators of decision theory. We develop relevant results concerning the geometry of Bregman divergences and prove new theorems in this field. We show that this geometry connects nicely with some desirable principles which have arisen in the epistemology of merging. In particular, we prove that the merging operators which we define by means of convex Bregman divergences satisfy analogues of the principles of merging due to Konieczny and Pino-Perez. Additionally, we investigate how such merging operators behave with respect to principles concerning irrelevant information, independence and relativisation which have previously been intensively studied in the case of single-expert probabilistic inference. Finally, we argue that two particular probabilistic merging operators which are based on Kullback-Leibler divergence, a special type of Bregman divergence, have overall the most appealing properties amongst merging operators hitherto considered. By investigating some iterative procedures we propose algorithms to compute them in practice.
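As background for the divergences named above (standard definitions, not specific to this thesis): for a strictly convex, differentiable function \(F\), the Bregman divergence between probability vectors \(p\) and \(q\) is

\[
B_{F}(p,q) \;=\; F(p) - F(q) - \bigl\langle \nabla F(q),\, p - q \bigr\rangle,
\]

and choosing \(F(p) = \sum_i p_i \log p_i\) (negative entropy) recovers the Kullback-Leibler divergence \(\mathrm{KL}(p \,\|\, q) = \sum_i p_i \log (p_i / q_i)\).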
APA, Harvard, Vancouver, ISO, and other styles
8

Aughenbaugh, Jason Matthew. "Managing Uncertainty in Engineering Design Using Imprecise Probabilities and Principles of Information Economics." Diss., Georgia Institute of Technology, 2006. http://hdl.handle.net/1853/11521.

Full text
Abstract:
The engineering design community recognizes that an essential part of the design process is decision making. Because decisions are generally made under uncertainty, engineers need appropriate methods for modeling and managing uncertainty. Two important characteristics of uncertainty in the context of engineering design are imprecision and irreducible uncertainty. In order to model both of these characteristics, it is valuable to use probabilities that are most generally imprecise and subjective. These imprecise probabilities generalize traditional, precise probabilities; when the available information is extensive, imprecise probabilities reduce to precise probabilities. An approach for comparing the practical value of different uncertainty models is developed. The approach examines the value of a model using the principles of information economics: value equals benefits minus costs. The benefits of a model are measured in terms of the quality of the product that results from the design process. Costs are measured not only in terms of direct design costs, but also the costs of creating and using the model. Using this approach, the practical value of using an uncertainty model that explicitly recognizes both imprecision and irreducible uncertainty is demonstrated in the context of a high-risk engineering design example in which the decision-maker has few statistical samples to support the decision. It is also shown that a particular imprecise probability model called probability bounds analysis generalizes sensitivity analysis, a process of identifying whether a particular decision is robust given the decision-maker's lack of complete information. An approach for bounding the value of future statistical data samples while collecting information to support design decisions is developed, and specific policies for making decisions in the presence of imprecise information are examined in the context of engineering.
APA, Harvard, Vancouver, ISO, and other styles
9

Björnemo, Erik. "Energy Constrained Wireless Sensor Networks : Communication Principles and Sensing Aspects." Doctoral thesis, Uppsala universitet, Institutionen för teknikvetenskaper, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-9519.

Full text
Abstract:
Wireless sensor networks are attractive largely because they need no wired infrastructure. But precisely this feature makes them energy constrained, and the consequences of this hard energy constraint are the overall topic of this thesis. We are in particular concerned with principles for energy efficient wireless communication and the energy-wise trade-off between sensing and radio communication. Radio transmission between sensors incurs both a fixed energy cost from radio circuit processing, and a variable energy cost related to the level of radiated energy. We here find that transmission techniques that are otherwise considered efficient consume too much processing energy. Currently available sensor node radios typically have a maximum output power that is too limited to benefit from transmission-efficient, but processing-intensive, techniques. Our results provide new design guidelines for the radio output power. With increasing transmission energy -- with increasing distance -- the considered techniques should be applied in the following order: output power control, polarisation receiver diversity, error correcting codes, multi-hop communication, and cooperative multiple-input multiple-output transmissions. To assess the measurement capability of the network as a whole, and to facilitate a study of the sensing-communication trade-off, we devise a new metric: the network measurement capacity. It is based on the number of different measurement sequences that a network can provide, and is hence a measure of the network's readiness to meet a large number of possible events. Optimised multi-hop routing under this metric reveals that the energy consumed for sensing has a decisive impact on the best multi-hop routes. We also find support for the use of hierarchical heterogeneous network structures. Model parameter uncertainties have a large impact on our results and we use probability theory as logic to include them consistently. Our analysis shows that common assumptions can give misleading results, and our analysis of radio channel measurements confirms the inadequacy of the Rayleigh fading channel model.
APA, Harvard, Vancouver, ISO, and other styles
10

Lantzberg, Daniel [Verfasser], Hans-Georg [Akademischer Betreuer] Stark, Peter [Gutachter] Maaß, and Hans-Georg [Gutachter] Stark. "Quantum Frames and Uncertainty Principles arising from Symplectomorphisms / Daniel Lantzberg ; Gutachter: Peter Maaß, Hans-Georg Stark ; Betreuer: Hans-Georg Stark." Bremen : Staats- und Universitätsbibliothek Bremen, 2019. http://d-nb.info/1186248688/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
11

Carper, Jayme Lee. "Verification and Validation of a Transient Heat Exchanger Model." Wright State University / OhioLINK, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=wright1441064582.

Full text
APA, Harvard, Vancouver, ISO, and other styles
12

Olson, Erik Davin. "Conceptual Design and Technical Risk Analysis of Quiet Commercial Aircraft Using Physics-Based Noise Analysis Methods." Diss., Georgia Institute of Technology, 2006. http://hdl.handle.net/1853/11486.

Full text
Abstract:
An approach was developed which allows for design studies of commercial aircraft using physics-based noise analysis methods while retaining the ability to perform the rapid tradeoff and risk analysis studies needed at the conceptual design stage. A prototype integrated analysis process was created for computing the total aircraft EPNL at the Federal Aviation Regulations Part 36 certification measurement locations using physics-based methods for fan rotor-stator interaction tones and jet mixing noise. The analysis process was then used in combination with design of experiments to create response surface equations (RSEs) for the engine and aircraft performance metrics, geometric constraints and takeoff and landing noise levels. In addition, Monte Carlo analysis was used to assess the expected variability of the metrics under the influence of uncertainty, and to determine how the variability is affected by the choice of engine cycle. Finally, the RSEs were used to conduct a series of proof-of-concept conceptual-level design studies demonstrating the utility of the approach. The study found that a key advantage to using physics-based analysis during conceptual design lies in the ability to assess the benefits of new technologies as a function of the design to which they are applied. The greatest difficulty in implementing the physics-based analysis proved to be the generation of design geometry at a sufficient level of detail for high-fidelity analysis.
APA, Harvard, Vancouver, ISO, and other styles
13

Duan, Kaifeng. "Uncertainty Principle : a study of the uncertain relationship between people and object." Thesis, Konstfack, Industridesign, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:konstfack:diva-3791.

Full text
Abstract:
When we observe or use something, its properties seem to change because of the way we establish relationships with it. Inspired by the Uncertainty Principle – a physical theory published by Heisenberg in 1927 – I take both people and objects to be always in an uncertain state. We cannot fully define objects; we can only try to understand them and live with them in a complex and constantly changing context. Three pieces of furniture are created to visualize how the relationships between people and objects could look from this viewpoint, exploring how far people can accept an imperfect but surprising experience of the world.
APA, Harvard, Vancouver, ISO, and other styles
14

Akten, Burcu Elif. "Generalized uncertainty relations /." Digital version accessible at:, 1999. http://wwwlib.umi.com/cr/utexas/main.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

Ekfeldt, Philip, and Anders Pettersson. "Experimental Demonstrator of the Uncertainty Principle." Thesis, KTH, Tillämpad fysik, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-129286.

Full text
Abstract:
The goal of this project was to create an intuitive and clear demonstrator of the defining properties of quantum mechanics using single slit diffraction of light, which has quantum mechanical properties because of light's wave-particle duality. In this report we will describe the process and thoughts behind our project of creating a portable demonstration of the uncertainty principle. By designing and building both a physical setup with a laser, a slit, mirrors, lenses, beam-splitters, attenuators and cameras, and developing software which displays images from the cameras in a clear user interface with calculations we hope that students from high schools and gymnasiums that visit Vetenskapens Hus at Alba Nova will learn something new while using the demonstrator.
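For context, the relation such a single-slit setup makes visible is the textbook uncertainty inequality (our summary, not taken from the thesis): a slit of width \(a\) confines the photon's transverse position to \(\Delta y \approx a\), while the first diffraction minimum at \(\sin\theta = \lambda/a\) gives a transverse momentum spread \(\Delta p_y \approx p \sin\theta = h/a\), so that

\[
\Delta y \,\Delta p_y \;\approx\; a \cdot \frac{h}{a} \;=\; h \;\geq\; \frac{\hbar}{2},
\]

consistent with the general bound \(\Delta y\, \Delta p_y \geq \hbar/2\): narrowing the slit broadens the diffraction pattern.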
APA, Harvard, Vancouver, ISO, and other styles
16

Eady, Dennis. "Miscarriages of justice : the uncertainty principle." Thesis, Cardiff University, 2009. http://orca.cf.ac.uk/54837/.

Full text
Abstract:
The thesis examines in detail the potential for error and distortion in the criminal justice process and the concept of case construction which may contribute to wrongful convictions. The effectiveness of post conviction procedures is then also considered. Three detailed case studies are utilised to illustrate case construction, post conviction issues and current social/cultural factors that may impact on miscarriages of justice. The thesis argues that the "Uncertainty Principle" permeates the criminal justice process such that wrongful convictions are an inevitable risk and moreover that, while there are certain safeguards that protect from some of the problems of the past, there remains a high potential for such events to occur. This potential is exacerbated by the current political "convictionist" rhetoric and policy framework and by trends and developments in the media world and the consequent social influence of this. Further concerns are expressed at the continuing reluctance of post conviction agencies, most notably the Court of Appeal, to fully recognise the risks inherent in the system. Consequently post-conviction procedures continue to function on the principle of finality within the system and prioritise the protection of the decisions of the lower courts. It is argued that the principle should not be finality but uncertainty and that the protection of the innocent rather than the protection of the image of the system should be the paramount concern. The thesis considers the often illusory nature of some of the principles of the criminal justice system and utilises notions of "magical legalism" (Cohen 2001) and other psychological processes that may be involved in maintaining the illusions. Some recommendations for change are proposed, focusing primarily on the philosophical change that is required to change the principles originally designed to protect the innocent from illusion into reality.
APA, Harvard, Vancouver, ISO, and other styles
17

Perrucci, Italo. "Minimal length scale and generalized uncertainty principle." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2022. http://amslaurea.unibo.it/25246/.

Full text
Abstract:
The existence of a minimal length, of the order of the Planck length l_Pl = 1.6 × 10⁻³⁵ m, places a limit on how small a distance we can measure. Such a minimal length can be obtained by modifying Heisenberg's uncertainty principle into a generalized uncertainty principle (GUP). In what follows, the various motivations suggesting the existence of a minimal length are first analysed: thought experiments, non-commutative geometry, and theories of quantum gravity. It is then shown how the GUP can be implemented starting from a modification of the canonical commutation relations, and how it acts on the Newtonian limit of general relativity. Finally, by examining the influence of the GUP on the Hamiltonian of a particle and on the temperature of Hawking radiation, which in turn implies a deformation of the Schwarzschild metric, it is shown how these results can be used to estimate the deformation parameter of the uncertainty principle and hence the minimal length scale.
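For reference, the GUP discussed here is most often written in the Kempf-Mangano-Mann form (a standard formulation quoted for context, not verbatim from the thesis): deforming the canonical commutator as

\[
[\hat{x},\hat{p}] \;=\; i\hbar\,\bigl(1+\beta \hat{p}^{2}\bigr)
\quad\Longrightarrow\quad
\Delta x\,\Delta p \;\geq\; \frac{\hbar}{2}\Bigl(1+\beta(\Delta p)^{2}+\beta\langle \hat{p}\rangle^{2}\Bigr),
\]

which yields a minimal position uncertainty \(\Delta x_{\min} = \hbar\sqrt{\beta}\), with the deformation parameter \(\beta\) setting the minimal length scale.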
APA, Harvard, Vancouver, ISO, and other styles
18

Chan, Chi Hung. "3-dimensional Heisenberg antiferromagnet in cubic lattice under time periodic magnetic field /." View abstract or full-text, 2009. http://library.ust.hk/cgi/db/thesis.pl?PHYS%202009%20CHANC.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Ríos, Zertuche R. Z. Rodolfo A. "The uncertainty principle for functions with sparse support and spectrum." Thesis, Uppsala University, Department of Mathematics, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-121380.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

Rodriquez, Roberta. "Decoherence in quantum information processing devices and the uncertainty principle." Thesis, University of Cambridge, 2007. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.612977.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

MELO, FLAVIO B. "Avaliação metrológica da incerteza na medição de vazão mássica de gases com tecnologias volumétrica e pressão diferencial." reponame:Repositório Institucional do IPEN, 2007. http://repositorio.ipen.br:8080/xmlui/handle/123456789/11553.

Full text
Abstract:
Thesis (Master's)--Instituto de Pesquisas Energeticas e Nucleares, IPEN/CNEN-SP, 2007.
APA, Harvard, Vancouver, ISO, and other styles
22

Ronel, Tahel. "Symmetry principles in polyadic inductive logic." Thesis, University of Manchester, 2016. https://www.research.manchester.ac.uk/portal/en/theses/symmetry-principles-in-polyadic-inductive-logic(6bd9665b-b236-435c-9aad-7edb3cfc399e).html.

Full text
Abstract:
We investigate principles of rationality based on symmetry in Polyadic Pure Inductive Logic. The aim of Pure Inductive Logic (PIL) is to determine how to assign probabilities to sentences of a language being true in some structure on the basis of rational considerations. This thesis centres on principles arising from instances of symmetry for sentences of first-order polyadic languages. We begin with the recently introduced Permutation Invariance Principle (PIP), and find that it is determined by a finite number of permutations on a finite set of formulae. We test the consistency of PIP with established principles of the subject and show, in particular, that it is consistent with Super Regularity. We then investigate the relationship between PIP and the two main polyadic principles thus far, Spectrum Exchangeability and Language Invariance, and discover there are close connections. In addition, we define the key notion of polyadic atoms as the building blocks of polyadic languages. We explore polyadic generalisations of the unary principle of Atom Exchangeability and prove that PIP is a natural extension of Atom Exchangeability to polyadic languages. In the second half of the thesis we investigate polyadic approaches to the unary version of Constant Exchangeability as invariance under signatures. We first provide a theory built on polyadic atoms (for binary and then general languages). We introduce the notion of a signature for non-unary languages, and principles of invariance under signatures, independence, and instantial relevance for this context, as well as a binary representation theorem. We then develop a second approach to these concepts using elements as alternative building blocks for polyadic languages. Finally, we introduce the concepts of homomorphisms and degenerate probability functions in Pure Inductive Logic. We examine which of the established principles of PIL are preserved by these notions, and present a method for reducing probability functions on general polyadic languages to functions on binary languages.
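For readers outside this literature, the probability functions in question are standardly defined as maps \(w\) from the sentences of the language to \([0,1]\) satisfying (the usual Gaifman conditions, summarized here for context, not quoted from the thesis):

\[
\models \theta \;\Rightarrow\; w(\theta)=1, \qquad
\models \neg(\theta\wedge\varphi) \;\Rightarrow\; w(\theta\vee\varphi) = w(\theta)+w(\varphi),
\]
\[
w\bigl(\exists x\,\theta(x)\bigr) \;=\; \lim_{n\to\infty} w\bigl(\theta(a_{1})\vee\theta(a_{2})\vee\cdots\vee\theta(a_{n})\bigr),
\]

where \(a_1, a_2, \ldots\) enumerate the constant symbols of the language.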
APA, Harvard, Vancouver, ISO, and other styles
23

Leal, Fernando Angelo Ribeiro. "Ônus de argumentação, relações de prioridade e decisão jurídica: mecanismos de controle e de redução da incerteza na subidealidade do sistema jurídico." Universidade do Estado do Rio de Janeiro, 2012. http://www.bdtd.uerj.br/tde_busca/arquivo.php?codArquivo=5168.

Full text
Abstract:
The goal of this thesis is to analyze the nature and functions of burdens of argumentation, within the context of their relationship with the structure of the legal system and their role in legal reasoning. The analysis understands law as a limited domain, subject to constraints that can be analytically represented by a three-level approach: law and legal reasoning are extrinsically, intrinsically and institutionally constrained. In this complex scenario, the argument of the thesis is twofold. On the one hand, it claims that burdens of argumentation are necessary components of a legal system that contains rules and principles. On the other hand, looking at their structure, it claims that these burdens can be understood as effects of rules and standards that establish normative priority relations. On the basis of these analyses, I argue that burdens of argumentation are mechanisms of control and stabilization of the uncertainty that characterizes the suboptimal character of law. First, they make it easier to justify the maintenance in concreto of a pre-existing relationship of priority between different principles. Second, they make it harder to invert these relationships of priority. Lastly, burdens of argumentation create stopping points in legal reasoning whenever there is uncertainty about whether the development of new chains of arguments is enough to justify the maintenance or the inversion, in a concrete case, of a given normative relationship of priority.
APA, Harvard, Vancouver, ISO, and other styles
24

Buswell, Richard A. "Uncertainty in the first principle model based condition monitoring of HVAC systems." Thesis, Loughborough University, 2001. https://dspace.lboro.ac.uk/2134/7559.

Full text
Abstract:
Model based techniques for automated condition monitoring of HVAC systems have been under development for some years. Results from the application of these methods to systems installed in real buildings have highlighted robustness and sensitivity issues. The generation of false alarms has been identified as a principal factor affecting the potential usefulness of condition monitoring in HVAC applications. The robustness issue is a direct result of the uncertain measurements and the lack of experimental control that are characteristic of HVAC systems. This thesis investigates the uncertainties associated with implementing a condition monitoring scheme based on simple first principles models in HVAC subsystems installed in real buildings. The uncertainties present in typical HVAC control system measurements are evaluated. A sensor validation methodology is developed and applied to a cooling coil subsystem installed in a real building. The uncertainty in steady-state analysis based on transient data is investigated. The uncertainties in the simplifications and assumptions associated with the derivation of simple first principles based models of heat-exchangers are established. A subsystem model is developed and calibrated to the test system. The relationship between the uncertainties in the calibration data and the parameter estimates are investigated. The uncertainties from all sources are evaluated and used to generate a robust indication of the subsystem condition. The sensitivity and robustness of the scheme is analysed based on faults implemented in the test system during summer, winter and spring conditions.
APA, Harvard, Vancouver, ISO, and other styles
25

Bang, Jang Young. "Generalized uncertainty principle and Gaussian wave packets in discrete quantum phase space." [Bloomington, Ind.] : Indiana University, 2009. http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqdiss&rft_dat=xri:pqdiss:3380062.

Full text
Abstract:
Thesis (Ph.D.)--Indiana University, Dept. of Physics, 2009.
Title from PDF t.p. (viewed on Jul 19, 2010). Source: Dissertation Abstracts International, Volume: 70-12, Section: B, page: 7630. Adviser: Micheal S. Berger.
APA, Harvard, Vancouver, ISO, and other styles
26

Howarth, Elizabeth. "New rationality principles in pure inductive logic." Thesis, University of Manchester, 2015. https://www.research.manchester.ac.uk/portal/en/theses/new-rationality-principles-in-pure-inductive-logic(e23d028f-c3e9-47b1-a5a0-289027f4d97f).html.

Full text
Abstract:
We propose and investigate several new principles of rational reasoning within the framework of Pure Inductive Logic, PIL, where probability functions defined on the sentences of a first-order language are used to model an agent's beliefs. The Elephant Principle is concerned with how learning, modelled by conditioning, may be uniquely `remembered'. The Perspective Principle requires that, from a given prior, conditioning on statistically similar experiences should result in similar assignments, and is found to be a necessary condition for Reichenbach's Axiom to hold. The Abductive Inference Principle and some variations are proposed as possible formulations of a restriction of C.S. Peirce's notion of hypothesis in the context of PIL, though characterization results obtained for these principles suggest that they may be too strong. The Finite Values Property holds when a probability function takes only finitely many values when restricted to sentences containing only constant symbols from some fixed finite set. This is shown to entail a certain systematic method of assigning probabilities in terms of possible worlds, and it is considered in this light as a possible principle of inductive reasoning. Classification results are given, stating which members of certain established families of probability functions satisfy each of these new principles. Additionally, we define the theory of a principle P of PIL to be the set of those sentences which are assigned probability 1 by every probability function which satisfies P. We investigate the theory of the established principle of Spectrum Exchangeability by finding separately the theories of heterogeneous and homogeneous functions. The theory of Spectrum Exchangeability is found to be equal to the theory of finite structures. The theory of Johnson's Sufficientness Postulate is also found. Consequently, we find that Spectrum Exchangeability, Johnson's Sufficientness Postulate and the Finite Values Property are all inconsistent with the principle of Super-Regularity: that any consistent sentence should be assigned non-zero probability.
APA, Harvard, Vancouver, ISO, and other styles
27

Rekuc, Steven Joseph. "Eliminating Design Alternatives under Interval-Based Uncertainty." Thesis, Georgia Institute of Technology, 2005. http://hdl.handle.net/1853/7218.

Full text
Abstract:
Typically, design is approached as a sequence of decisions in which designers select what they believe to be the best alternative in each decision. While this approach can be used to arrive at a final solution quickly, it is unlikely to result in the most-preferred solution. The reason for this is that all the decisions in the design process are coupled. To determine the most preferred alternative in the current decision, the designer would need to know the outcomes of all future decisions, information that is currently unavailable or indeterminate. Since the designer cannot select a single alternative because of this indeterminate (interval-based) uncertainty, a set-based design approach is introduced. The approach is motivated by the engineering practices at Toyota and is based on the structure of the Branch and Bound Algorithm. Instead of selecting a single design alternative that is perceived as being the most preferred at the time of the decision, the proposed set-based design approach eliminates dominated design alternatives: rather than selecting the best, eliminate the worst. Starting from a large initial design space, the approach sequentially reduces the set of non-dominated design alternatives until no further reduction is possible -- the remaining set cannot be rationally differentiated based on the available information. A single alternative is then selected from the remaining set of non-dominated designs. In this thesis, the focus is on the elimination step of the set-based design method: A criterion for rational elimination under interval-based uncertainty is derived. To be efficient, the criterion takes into account shared uncertainty -- uncertainty shared between design alternatives. In taking this uncertainty into account, one is able to eliminate significantly more design alternatives, improving the efficiency of the set-based design approach. Additionally, the criterion uses a detailed reference design to allow more elimination of inferior design sets without evaluating each alternative in that set. The effectiveness of this elimination is demonstrated in two examples: a beam design and a gearbox design.
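To make the elimination idea concrete, the following is a minimal sketch of naive interval-dominance elimination for a cost to be minimized. It is illustrative only: the criterion derived in the thesis additionally exploits uncertainty shared between alternatives, which this naive version ignores, and the function name and data are hypothetical.

def eliminate_dominated(alternatives):
    """alternatives: dict mapping a design name to an interval (lo, hi)
    bounding a cost to be minimized under interval uncertainty."""
    survivors = {}
    for name, (lo, hi) in alternatives.items():
        # Naive interval dominance: eliminate `name` if some other design's
        # worst case (hi) is still better than `name`'s best case (lo).
        dominated = any(other_hi < lo
                        for other, (_, other_hi) in alternatives.items()
                        if other != name)
        if not dominated:
            survivors[name] = (lo, hi)
    return survivors

# Example: B is eliminated because A performs better even in A's worst case.
print(eliminate_dominated({"A": (2.0, 5.0), "B": (6.0, 7.0), "C": (1.0, 8.0)}))
# -> {'A': (2.0, 5.0), 'C': (1.0, 8.0)}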
APA, Harvard, Vancouver, ISO, and other styles
28

Lee, Grace Sin Dam. "Uncertainty, risk and the (in)applicability of the precautionary principle : reassessing the scope of precaution and prevention in international environmental law." Thesis, University of Cambridge, 2018. https://www.repository.cam.ac.uk/handle/1810/277781.

Full text
Abstract:
While the basic premise of precaution has been widely endorsed in environmental treaties since its inclusion in the Rio Declaration on Environment and Development, as a legal principle, it has been framed in such vastly dissimilar ways that it continues to generate significant disagreement over its precise nature, standing and legal effect. Despite the rich and extensive scholarship aimed at clarifying its normative content and operation, the ongoing lack of consensus on when the precautionary principle is applicable and what its application entails points to fundamental definitional challenges as well as its overall limitations as a regulatory tool. This thesis attempts to move beyond this impasse by reassessing the precautionary principle in light of the distinction traditionally made in formal scientific discourse between risk and uncertainty. While this technical distinction is fundamental to defining the proper scope of the principle’s application, the thesis finds that much of the existing legal discourse has either overlooked or marginalised the risk/uncertainty dichotomy, which in turn has blurred the distinction between the principles of precaution and prevention. The thesis sets out what is meant by these analytically distinct concepts in the legal context, focusing on their implications for the processes of legal reasoning and regulatory decision-making. Having examined the conceptual underpinnings of the precautionary principle, and of the principle of prevention, the thesis proceeds to address a central research question – if uncertainty, as opposed to risk, determines the operational scope of the precautionary principle, to what extent do the current applications of the precautionary principle actually fall within its proper domain? To answer this, the thesis embarks on a deconstruction of the precautionary principle in practice by analysing how precaution has been deployed as an operational principle in particular treaty contexts. The treaty regimes examined here include: international fisheries; persistent organic pollutants; ocean dumping; sanitary and phytosanitary threats under the WTO; and atmospheric pollution and climate change. In each case, the thesis scrutinises the extent to which assumptions, obligations and measures contained therein are consistent with the theoretical underpinnings of precaution. Despite the pervasive use of the precautionary rhetoric in treaty texts and practice, the thesis ultimately finds that, for the most part, these instruments are in fact aimed at specific, scientifically-determined risks, and thus what is often upheld in the name of precaution is actually the prevention principle. The thesis argues that it is better to frame risk regulation through prevention, and not precaution, by considering the implications of abandoning the precautionary principle in those areas where the prevention principle is clearly at play. The thesis completes the analysis by addressing what is actually left for the precautionary principle and discussing some of the distinct ways in which precaution functions within its specific, circumscribed domain.
APA, Harvard, Vancouver, ISO, and other styles
29

Norheim, Stein. "Computer Support Simplifying Uncertainty Estimation using Patient Samples." Thesis, Linköpings universitet, Medicinsk informatik, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-68278.

Full text
Abstract:
In this work, a practical approach to assessing bias and uncertainty using patient samples in a clinical laboratory is presented. The scheme is essentially a split-sample setup where one instrument is appointed as the "master" instrument to which other instruments are compared. The software presented automatically collects test results from a Laboratory Information System in production and couples together the results of pairwise measurements. Partitioning of measurement results by user-defined criteria, and how this can facilitate isolation of variation sources, are also discussed. The logic and essential data model are described and the surrounding workflows outlined. The described software and workflow are currently in considerable practical use in several Swedish large-scale distributed laboratory organizations. With the appropriate IT support, split-sample testing can be a powerful complement to external quality assurance.
APA, Harvard, Vancouver, ISO, and other styles
30

Kliess, Malte Sebastian. "The principle of predicate exchangeability in pure inductive logic." Thesis, University of Manchester, 2014. https://www.research.manchester.ac.uk/portal/en/theses/the-principle-of-predicate-exchangeability-in-pure-inductive-logic(7483a787-d651-4734-8fdf-eda405fc48a6).html.

Full text
Abstract:
We investigate the Principle of Predicate Exchangeability in the framework of Pure Inductive Logic. While this principle was known to Rudolf Carnap, who started research in Inductive Logic, the principle has been somewhat neglected in the past. After providing the framework of Pure Inductive Logic, we will show Representation Theorems for probability functions satisfying Predicate Exchangeability, filling the gap in the list of Representation Theorems for functions satisfying certain rational principles. We then introduce a new principle, called the Principle of Strong Predicate Exchangeability, which is weaker than the well-known Principle of Atom Exchangeability, but stronger than Predicate Exchangeability and give examples of functions that satisfy this principle. Finally, we extend the framework of Inductive Logic to Second Order languages, which allows for increasing a rational agent’s expressive strength. We introduce Wilmers’ Principle, a rational principle that rational agents might want to adopt in this extended framework, and give a representation theorem for this principle.
APA, Harvard, Vancouver, ISO, and other styles
31

Donati, Alessandra. "Le principe de précaution en droit de l'Union européenne." Thesis, Paris 1, 2019. http://www.theses.fr/2019PA01D018.

Full text
Abstract:
Acknowledging the flexible and complex nature of the precautionary principle in EU law, the purpose of this work is to provide a polycentric interpretation of this principle, based on diversity rather than uniformity. To achieve this objective, a methodology derived from methodological pluralism is employed, which allows the "unitas multiplex" among the different definitions and applications of the precautionary principle to be sought. The core claim is that the polycentric interpretation of the precautionary principle can be built on two concepts: anticipation and action. In the first part of this study, I argue that anticipation implies the qualification by law, and the evaluation by science, of uncertain risks. In the second part, I consider how, once the time of action has been anticipated, decision-makers should act on the basis of the precautionary principle. The action undertaken has different meanings and consequences from the procedural and substantive perspectives: on the procedural side, decision-makers have an obligation to take the principle into account, while on the substantive side they remain free to decide whether to adopt a precautionary measure.
APA, Harvard, Vancouver, ISO, and other styles
32

MATSUBARA, TASSIANE C. M. "Estudo sobre a determinação de antimônio em amostras ambientais pelo método de análise por ativação com nêutrons. Validação da metodologia e determinação da incerteza da medição." reponame:Repositório Institucional do IPEN, 2011. http://repositorio.ipen.br:8080/xmlui/handle/123456789/10042.

Full text
Abstract:
Thesis (Master's)--Instituto de Pesquisas Energeticas e Nucleares, IPEN-CNEN/SP, 2011.
APA, Harvard, Vancouver, ISO, and other styles
33

Bahri, Oumayma. "A fuzzy framework for multi-objective optimization under uncertainty." Thesis, Lille 1, 2017. http://www.theses.fr/2017LIL10030/document.

Full text
Abstract:
This thesis is devoted to the study of multi-objective combinatorial optimization under uncertainty. In particular, we address multi-objective problems with fuzzy data, in which fuzziness is expressed by triangular fuzzy numbers. To handle such problems, our main idea is to extend the classical multi-objective concepts to the fuzzy context. We first propose a new Pareto dominance relation between fuzzy-valued objectives (i.e. vectors of triangular fuzzy numbers), and then extend Pareto-based metaheuristics to converge towards optimal fuzzy solutions. The proposed approach is illustrated on a bi-objective vehicle routing problem with fuzzy demands. In a second stage, we address the robustness aspect in the multi-objective fuzzy context by proposing a new methodology for evaluating the robustness of solutions. Finally, experimental results on fuzzy benchmarks of the vehicle routing problem demonstrate the effectiveness and reliability of our approach.
APA, Harvard, Vancouver, ISO, and other styles
34

Грабчук, О. М. "Фінансове прогнозування розвитку економіки України в умовах невизначеності." Thesis, Дніпропетровський національний університет імені Олеся Гончара, 2012. http://essuir.sumdu.edu.ua/handle/123456789/51367.

Full text
Abstract:
The thesis establishes the theoretical and methodological foundations of financial forecasting under conditions of uncertainty. The essence and content of financial forecasting, as well as the scientific-methodological and organizational-economic foundations of forecasting Ukraine's economic development, are specified. Principles of financial forecasting of economic development under uncertainty are formulated, and theoretical approaches to assessing the impact of the uncertainty of financial processes on the structural organization of the economy, together with classification systems for that uncertainty, are presented. Methodological foundations for financial forecasting of economic development under uncertainty are developed on the basis of the entropy of financial characteristics. The hypothesis of an interdependency between the direction of economic development and the entropy production of its financial characteristics is confirmed for the Ukrainian economy. The notion of financial homeostasis of economic development is proposed and formalized, the states of financial homeostasis for inflationary processes and for the financial characteristics of the external sector of Ukraine's economy are determined, and approaches are developed for deriving the forecast characteristics of the financial instruments needed to reach a state of financial homeostasis.
APA, Harvard, Vancouver, ISO, and other styles
35

Pham, Duong Hung. "Contributions to the analysis of multicomponent signals : synchrosqueezing and associated methods." Thesis, Université Grenoble Alpes (ComUE), 2018. http://www.theses.fr/2018GREAM044/document.

Full text
Abstract:
Many physical signals, including audio (music, speech), medical data (ECG, PCG), marine-mammal calls and gravitational waves, can be accurately modeled as a superposition of amplitude- and frequency-modulated waves (AM-FM modes), called multicomponent signals (MCSs). Time-frequency (TF) analysis plays a central role in characterizing such signals, and numerous methods have been proposed in that framework over the last decade. However, these methods suffer from an intrinsic limitation known as the uncertainty principle. In this regard, the reassignment method (RM) was developed to sharpen the TF representations (TFRs) given by the short-time Fourier transform (STFT) and the continuous wavelet transform (CWT). Unfortunately, it does not allow for mode reconstruction, in contrast to its recent variant known as the synchrosqueezing transform (SST). Nevertheless, many critical problems associated with the latter remain to be addressed, such as handling strong frequency modulation, retrieving the modes of an MCS from its downsampled STFT, or estimating the TF signatures of irregular and discontinuous signals. This dissertation deals with these problems in order to provide more powerful and accurate invertible TF methods for analyzing MCSs. It makes six contributions. The first introduces a second-order extension of the wavelet-based SST, along with a discussion of its theoretical analysis and practical implementation. The second puts forward a generalization of existing STFT-based synchrosqueezing techniques, known as the high-order STFT-based SST (FSSTn), that better handles a wide range of MCSs. The third proposes a new technique built on the second-order STFT-based SST (FSST2) and a demodulation procedure, called DSST2, that improves mode reconstruction. The fourth is a novel approach allowing the modes of an MCS to be retrieved from its downsampled STFT. The fifth presents an improved method developed in the reassignment framework, called adaptive contour representation computation (ACRC), for efficient estimation of the TF signatures of a larger class of MCSs. The last is a joint analysis of ACRC with non-negative matrix factorization (NMF) for effective denoising of phonocardiogram (PCG) signals.
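To make the synchrosqueezing idea concrete, here is a minimal first-order STFT-based sketch in Python (an illustrative simplification of standard SST, not the thesis's FSSTn, FSST2 or ACRC implementations; the window, hop and threshold choices are arbitrary). The derivative-window trick estimates the phase derivative, i.e. the instantaneous frequency, without finite differences, and each bin's energy is then reassigned to that estimate:

```python
import numpy as np

def synchrosqueeze(x, fs, win_len=256, hop=4):
    """Minimal sketch of STFT-based synchrosqueezing: the energy of each STFT
    bin is reassigned to the instantaneous-frequency estimate obtained from
    the ratio of the derivative-window STFT to the ordinary STFT."""
    n = np.arange(win_len) - win_len / 2
    sigma = win_len / 8.0
    g = np.exp(-0.5 * (n / sigma) ** 2)      # Gaussian analysis window
    dg = -(n / sigma**2) * g                 # its derivative w.r.t. sample index
    freqs = np.fft.rfftfreq(win_len, d=1.0 / fs)
    df = freqs[1] - freqs[0]
    frames = range(0, len(x) - win_len + 1, hop)
    S = np.zeros((len(freqs), len(frames)))
    for col, start in enumerate(frames):
        seg = x[start:start + win_len]
        V = np.fft.rfft(seg * g)
        Vd = np.fft.rfft(seg * dg)
        good = np.abs(V) > 1e-8 * np.abs(V).max()
        # Instantaneous-frequency estimate (Hz): eta - fs*Im(Vd/V)/(2*pi)
        eta = freqs[good] - fs * np.imag(Vd[good] / V[good]) / (2 * np.pi)
        rows = np.clip(np.round(eta / df).astype(int), 0, len(freqs) - 1)
        np.add.at(S, (rows, col), np.abs(V[good]) ** 2)   # squeeze the energy
    return S, freqs

# Two-tone test: the squeezed ridges should sit near 50 Hz and 120 Hz.
fs = 1000.0
t = np.arange(2048) / fs
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)
S, freqs = synchrosqueeze(x, fs)
print("dominant squeezed frequencies (Hz):",
      freqs[np.argsort(S.sum(axis=1))[-2:]])
```

On this test the squeezed energy concentrates near the two tone frequencies, which is the sharpening effect the abstract describes.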
APA, Harvard, Vancouver, ISO, and other styles
36

Ostapchuk, Mariia. "Determinants of market uptake of innovation in a situation of uncertainty about environmental and health risks : From BPA to nanotechnology." Thesis, Paris Sciences et Lettres (ComUE), 2017. http://www.theses.fr/2017PSLED058/document.

Full text
Abstract:
Uncertainty is inherent in every innovation. The uncertainty about the environmental and health risks that surround nanotechnology raises the question of innovation success. Due in part to a lack of consistent data, there is limited empirical literature on the determinants of the diffusion of nanotechnology. As part of a research program on nanotechnology, this thesis investigates the determinants of market uptake of innovation in a situation of uncertainty about environmental and health risks. As a first step, it seeks to provide a better understanding of the diffusion of a product that has been on the market for a long time: a widely used chemical, bisphenol A (BPA), chosen because of the lack of historical data on nanomaterials. Different econometric methods are applied to gain insight into the relationships between consumption, economic growth, new scientific knowledge about risk and other variables, using data on BPA. The results identify a set of factors that influence the consumption of BPA at the international level. As a second step, the results of the BPA study are compared to nanosilver; the comparison helps refine the interpretation of the main results and yields additional insights into the determinants of the uptake of nanosilver. An explanatory analysis sheds light on the actions that different stakeholders undertake in response to new scientific knowledge about risk and deepens our understanding of "nanoresponsible development". Keywords: innovation, diffusion of innovation, product life cycle, nanotechnology, bisphenol A, risk, uncertainty, environment, health, precautionary principle, Safer by Design, responsible development.
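As a hedged illustration of such an econometric exercise (everything below is synthetic: the variable names, data and coefficients are hypothetical stand-ins, not the thesis's BPA dataset), a simple OLS regression of consumption on GDP growth and a knowledge-stock proxy could look like this:

```python
import numpy as np

rng = np.random.default_rng(3)
T = 40  # years of synthetic annual data

# Hypothetical regressors: GDP growth and a cumulative stock of risk-related
# scientific publications (a stand-in for 'new scientific knowledge about risk').
gdp_growth = rng.normal(2.0, 1.0, T)
risk_knowledge = np.cumsum(rng.poisson(5, T)).astype(float)

# Synthetic consumption: grows with GDP, declines as risk knowledge accumulates.
log_consumption = (1.0 + 0.30 * gdp_growth - 0.02 * risk_knowledge
                   + rng.normal(0.0, 0.3, T))

X = np.column_stack([np.ones(T), gdp_growth, risk_knowledge])  # with constant
beta, *_ = np.linalg.lstsq(X, log_consumption, rcond=None)     # OLS fit
resid = log_consumption - X @ beta
sigma2 = resid @ resid / (T - X.shape[1])
se = np.sqrt(sigma2 * np.diag(np.linalg.inv(X.T @ X)))
print("coef:", np.round(beta, 3))   # should recover roughly [1.0, 0.30, -0.02]
print("s.e.:", np.round(se, 3))
```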
APA, Harvard, Vancouver, ISO, and other styles
37

Durham, Ian T. "Sir Arthur Eddington and the foundations of modern physics." Thesis, University of St Andrews, 2005. http://hdl.handle.net/10023/12933.

Full text
Abstract:
In this dissertation I analyze Sir Arthur Eddington's statistical theory as developed in the first six chapters of his posthumously published Fundamental Theory. In particular, I look at its mathematical structure, philosophical implications, and relevance to modern physics. This analysis is the only one of Fundamental Theory that compares it to modern quantum field theory, and it is the most comprehensive look at his statistical theory in four decades. Several major insights emerge from this analysis, including the fact that Eddington was able to derive Pauli's Exclusion Principle in part from Heisenberg's Uncertainty Principle. In addition, the most profound general conclusion of this research is that Fundamental Theory is, in fact, an early quantum field theory, something that has never before been suggested. Contrary to the majority of historical reports and some comments by his contemporaries, this analysis shows that Eddington's later work was neither mystical nor that far from the mainstream when it was published. My research reveals numerous profoundly deep ideas that were ahead of their time when Fundamental Theory was developed, but that have significant applicability at present. As such, this analysis presents several important questions to be considered by modern philosophers of science, physicists, mathematicians, and historians. In addition, it sheds new light on Eddington as a scientist and mathematician, in part indicating that his marginalization has been largely unwarranted.
APA, Harvard, Vancouver, ISO, and other styles
38

FERREIRA, ROBSON de J. "Desenvolvimento de metodologia para a caracterizacao de fontes radioativas seladas." reponame:Repositório Institucional do IPEN, 2010. http://repositorio.ipen.br:8080/xmlui/handle/123456789/9570.

Full text
Abstract:
Master's dissertation (Dissertacao de Mestrado), Instituto de Pesquisas Energeticas e Nucleares, IPEN-CNEN/SP (IPEN/D).
APA, Harvard, Vancouver, ISO, and other styles
39

Lieb, Florian [Verfasser], Hans-Georg [Akademischer Betreuer] Stark, Peter [Gutachter] Maaß, and Hans-Georg [Gutachter] Stark. "The Affine Uncertainty Principle, Associated Frames and Applications in Signal Processing / Florian Lieb ; Gutachter: Peter Maaß, Hans-Georg Stark ; Betreuer: Hans-Georg Stark." Bremen : Staats- und Universitätsbibliothek Bremen, 2018. http://d-nb.info/1166849767/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Claesson, Ida. "Business Restructuring : The applicability of the arm's length principle for intangibles with an uncertain value at the time of the restructuring." Thesis, Internationella Handelshögskolan, Högskolan i Jönköping, IHH, Rättsvetenskap, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:hj:diva-19074.

Full text
Abstract:
This thesis is based on the regulations found in the OECD Model and the OECD Transfer Pricing (TP) Guidelines concerning the arm's length principle. The core of the arm's length principle is that transactions between associated enterprises should be treated the same as transactions between independent enterprises. The principle can be found in Article 9 of the OECD Model. One transaction that may fall within the scope of Article 9 is business restructuring. Business restructuring was previously an unregulated TP area, but regulations have been formulated in the new OECD TP Guidelines from 2010. The aim of this thesis is therefore to examine how the arm's length principle should be applied under the new guidelines to business restructurings of intangibles with an uncertain value at the time of the restructuring. In order to answer this question, some of the factors that affect the application of the arm's length principle are examined separately. The first is the arm's length principle itself, the generally accepted TP method used by both taxpayers and tax administrations to find a fair price for transactions between associated enterprises. The principle seeks to identify the controlled transaction and then find a comparable uncontrolled transaction that is similar to the transaction performed between the associated enterprises. The second part examines the meaning of the term business restructuring according to the new guidelines, since there is no other legal or general definition. Business restructurings are defined as cross-border redeployments of functions, assets and risks performed by MNEs. As long as a transaction falls within this definition, it is subjected to the arm's length principle for tax purposes. The third part examines intangibles, which also lack a general definition. The identification and valuation of intangibles is a complex and uncertain task for both taxpayers and tax administrations. When applying the arm's length principle, however, the issue of identifying what constitutes an intangible may be unnecessary; what should be considered is instead the value of the intangible or, more precisely, the value that independent enterprises would have agreed upon in a similar situation. The applicability of the arm's length principle to business restructurings of intangibles with an uncertain value at the time of the restructuring should be determined by performing a comparability analysis. To perform a comparability analysis, the controlled transaction first has to be identified; thereafter, a comparable uncontrolled transaction needs to be found. An equivalent uncontrolled transaction may not be found in all cases, and in those cases it should be examined what independent enterprises would have done had they been in a comparable situation. The arm's length principle should be applied to business restructurings of intangibles with an uncertain value in the same manner as for any other controlled transaction. The issues for this type of transaction are the identification of what constitutes a business restructuring and the determination of a fair value for the intangibles. The OECD TP Guidelines lack guidance on the issues that arise when a comparable uncontrolled transaction cannot be found. This leaves both taxpayers and tax administrations with unsatisfactory guesswork when trying to determine what independent enterprises would have done in a similar situation, and creates unnecessary uncertainty in applying the arm's length principle.
APA, Harvard, Vancouver, ISO, and other styles
41

CARDOSO, VANDERLEI. "Estudo das covariâncias envolvidas no método ko de análise por ativação neutrônica." reponame:Repositório Institucional do IPEN, 2011. http://repositorio.ipen.br:8080/xmlui/handle/123456789/10074.

Full text
Abstract:
Doctoral thesis (Tese de Doutoramento), Instituto de Pesquisas Energeticas e Nucleares, IPEN-CNEN/SP (IPEN/T).
APA, Harvard, Vancouver, ISO, and other styles
42

Li, Weishuang. "Optimum Signal Design in UWB Communications." Thesis, Linnéuniversitetet, Institutionen för fysik och elektroteknik (IFE), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-78946.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

LUNA, Gabriela Coutinho. "Radiação Hawking de um buraco negro acústico não-comutativo." Universidade Federal de Campina Grande, 2016. http://dspace.sti.ufcg.edu.br:8080/jspui/handle/riufcg/2139.

Full text
Abstract:
The study of acoustic black holes, or acoustic analogues, parallels the gravitational case as follows: the Hawking radiation phenomenon is verified, an event horizon is present, the temperature (the Hawking temperature) can be calculated, and a metric describing the black-hole geometry is obtained. We introduce non-commutative theory into the acoustic metric in order to examine the corrections that result from it. In this work, we consider the generalized uncertainty principle, in the tunneling formalism via the Hamilton-Jacobi method, to determine the Hawking temperature and the corrected quantum entropy of non-commutative acoustic black holes in 2+1 dimensions. In our results we obtain an area entropy with a logarithmic correction term at leading order, a term at lower order proportional to the radiation temperature associated with commutative acoustic black holes, and an extra term that depends on a conserved charge. Thus, as in the gravitational case, there is no need to introduce an ultraviolet cut-off, and the divergences are eliminated.
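For orientation, a standard textbook form of the generalized uncertainty principle invoked here is sketched below in LaTeX; the thesis's exact conventions and correction terms may differ:

```latex
% A standard form of the generalized uncertainty principle (GUP); conventions
% vary, and the thesis's exact correction terms may differ from this one.
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}\left(1 + \beta\,(\Delta p)^{2}\right)
% Saturating the bound and solving the quadratic in \Delta p gives
\Delta p \;=\; \frac{\Delta x}{\hbar\beta}
   \left(1 - \sqrt{1 - \frac{\hbar^{2}\beta}{(\Delta x)^{2}}}\right)
% This is real only for \Delta x \ge \hbar\sqrt{\beta}: the GUP implies a
% minimal length, and expanding for small \beta recovers the ordinary
% relation \Delta p \approx \hbar/(2\,\Delta x). The extra \beta-dependent
% terms are what generate corrections to the Hawking temperature and entropy.
```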
APA, Harvard, Vancouver, ISO, and other styles
44

Haberstich, Cécile. "Adaptive approximation of high-dimensional functions with tree tensor networks for Uncertainty Quantification." Thesis, Ecole centrale de Nantes, 2020. http://www.theses.fr/2020ECDN0045.

Full text
Abstract:
Uncertainty quantification problems for numerical models require many simulations, which are often computationally costly (in time and/or memory). It is therefore essential to build surrogate models that are cheaper to evaluate. In practice, the output of a numerical model is represented by a function, and the objective is to construct an approximation of it. The aim of this thesis is to construct a controlled approximation of a function while using as few evaluations as possible. First, we propose a new method based on weighted least squares to construct the approximation of a function in a linear approximation space. We prove that the projection satisfies a numerical stability property almost surely and a quasi-optimality property in expectation. In practice, we observe that the sample size is closer to the dimension of the approximation space than with existing weighted least-squares methods. For high-dimensional approximation, and in order to exploit potential low-rank structures of functions, we consider the model class of functions in tree-based tensor formats. These formats admit a multilinear parametrization whose parameters form a tree network of low-order tensors, and they are therefore also called tree tensor networks. We propose an algorithm for approximating functions in tree-based tensor formats. It consists in constructing a hierarchy of nested subspaces associated with the different levels of the tree. The construction of these subspaces relies on principal component analysis extended to multivariate functions and on the new weighted least-squares method. To reduce the number of evaluations necessary to build the approximation to a given precision, we propose adaptive strategies for the control of the discretization error, the selection of the tree, the control of the ranks, and the estimation of the principal components.
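The weighted least-squares step can be sketched as follows (a simplified illustration with common choices, a Legendre basis on [-1, 1] and sampling from the Chebyshev (arcsine) density; the thesis's optimal sampling measure and its stability guarantees are more refined than this):

```python
import numpy as np
from numpy.polynomial import legendre

def weighted_lsq_fit(f, degree, n_samples, rng):
    """Sketch of a weighted least-squares projection onto Legendre polynomials.
    Samples come from the arcsine density on [-1, 1], a classical stabilising
    choice; the weights compensate for sampling from the 'wrong' measure."""
    x = np.cos(np.pi * rng.random(n_samples))       # arcsine-distributed points
    rho = 1.0 / (np.pi * np.sqrt(1.0 - x**2))       # sampling density
    w = 0.5 / rho            # weight = (uniform density) / (sampling density)
    V = legendre.legvander(x, degree)               # Legendre Vandermonde
    norms = np.sqrt(2.0 / (2.0 * np.arange(degree + 1) + 1.0))
    V = V / norms                                   # L2-normalised columns
    sw = np.sqrt(w)
    coef, *_ = np.linalg.lstsq(sw[:, None] * V, sw * f(x), rcond=None)
    return coef, norms

rng = np.random.default_rng(1)
f = lambda x: np.exp(x) * np.sin(3 * x)
coef, norms = weighted_lsq_fit(f, degree=10, n_samples=60, rng=rng)
xt = np.linspace(-0.99, 0.99, 5)
approx = legendre.legval(xt, coef / norms)          # back to standard Legendre
print("max error on test points:", np.max(np.abs(approx - f(xt))))
```

The point of the weighting is that a well-chosen sampling measure keeps the least-squares system well conditioned with a sample size close to the dimension of the approximation space, which is the property the abstract emphasises.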
APA, Harvard, Vancouver, ISO, and other styles
45

Haikola, Simon. "Bortom kontroll? : Den svenska kemikalieövervakningens logik." Doctoral thesis, Linköpings universitet, Tema teknik och social förändring, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-85662.

Full text
Abstract:
Chemical substances have become an inextricable feature of late-industrial society, deemed necessary for the welfare, technological development and economic growth that large parts of the world have come to expect. In Sweden, the identification in the 1960s and 1970s of DDT, PCB and mercury as serious environmental threats led to the establishment of a system of chemicals control that is widely held to be one of the most advanced in the world. The thesis examines this control system, its possibilities, its problems and its logic, through text analysis of state reports, government bills, reports of the Swedish Environmental Protection Agency (SEPA) and the Swedish Chemicals Agency (SCA), and interviews with employees at these agencies. The analysis shows chemicals control in Sweden to be a system pervaded by contradictions, which may be explained by an epistemological paradox at its core: the more knowledge that is accumulated about chemicals, the more uncertainty increases. The constant growth of chemicals production, in combination with the highly unpredictable behaviour of chemicals in the environment, puts the monitoring agencies in an impossible situation, always working against the tide. The thesis also shows, however, that the agencies themselves play an important part in maintaining a system of control that is to a large extent simulated: by its very existence, as well as by the circulation within it of regulatory concepts and principles that in fact lack much substance, the system always signals control and constitutes uncertainty as the exception, however widespread that uncertainty is.
APA, Harvard, Vancouver, ISO, and other styles
46

Setzer, Joana. "Panorama do princípio da precaução: o direito do ambiente face aos novos riscos e incertezas." Universidade de São Paulo, 2007. http://www.teses.usp.br/teses/disponiveis/90/90131/tde-11032008-103816/.

Full text
Abstract:
Introduction: Since the 1970s, contemporary society has been faced with risks and uncertainties with unprecedented characteristics. Landmark cases in the fields of health, safety and the environment are extensively debated by the media, non-governmental organizations, governments, corporations and civil society, and the law has been called upon to take a stance on them. To cope with these issues, international and environmental law has devised, over the last two decades, the precautionary principle. Objective: This research seeks to distinguish what the precautionary principle has been from what it is not and what it may come to be, thus contributing to the study of the legal dimensions of the Risk Society and of the relations between law and uncertainty. Bibliography: The study relies primarily on the French literature on the precautionary principle, on international case law, and on recent initiatives of the European Union and the World Health Organization; Brazilian case law, legal writings and legislation are also considered. Aspects addressed: The work addresses the framework of the Risk Society and how environmental law relates to its scientific and technological dimensions, and then studies the consolidation of the principle in its ethical and legal dimensions. Because it deals with current and controversial themes, the application of the precautionary principle is still contested, but the difficulties it faces and the criticism directed at it help reveal its potential. Conclusion: In Brazil, the understanding of what the precautionary principle is, or even of what it is not, is still incipient. Courts confuse precaution with prevention; the precautionary principle is used as a synonym for a general duty to preserve the environment or as a justification for inaction. A more effective incorporation of the principle requires knowledge of its theory and practice. Its application should be grounded in risk analysis, in the adoption of parameters capable of guiding its practice, and in the use of legal standards. The controversy raised by the precautionary principle encourages a reflective attitude towards science and strengthens, within the law and beyond it, decision-making that involves public opinion and the scientific community.
APA, Harvard, Vancouver, ISO, and other styles
47

Velho, Vitor Vidal Costa. "Análise comportamental de consumidores brasileiros: fatos estilizados por estratificação social e aplicações em modelos de projeção macro." reponame:Repositório Institucional do FGV, 2016. http://hdl.handle.net/10438/17834.

Full text
Abstract:
Survey indicators produced by FGV are economic information published relatively quickly and used as a 'thermometer' of Brazilian economic activity in the short term. The Consumer Confidence Index (CCI), like other indicators within the survey, is taken as a leading or coincident variable in models forecasting household consumption and other official quantitative variables. This work, however, presents evidence that information at the aggregate level does not always have the best predictive power. It analyzes consumer behaviour from a more disaggregated view of the survey in order to obtain greater correlation with, and robustness for, the target variables. There is evidence that some consumer groups are able to provide better assessments of a given topic than others. Using principal component analysis (PCA), we reduce the dimensionality of these better-performing disaggregated indicators to obtain robust forecasting scenarios. To complement these forecasts, we use the dispersion of responses as an uncertainty proxy, in an attempt to capture the subjective side intrinsic to survey data. The article concludes with an analysis of the stylized facts of the behaviour of the groups selected for each case.
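A minimal sketch of the PCA step on a synthetic panel of disaggregated confidence indicators (all numbers below are simulated; the FGV survey data are not reproduced here), with cross-group dispersion as the uncertainty proxy:

```python
import numpy as np

def pca_core(X):
    """First principal component of a standardised panel X
    (rows = months, columns = consumer sub-groups)."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    core = Z @ Vt[0]                     # common factor of the sub-groups
    share = s[0] ** 2 / np.sum(s ** 2)   # variance explained by the core
    return core, share

rng = np.random.default_rng(2)
months, groups = 120, 6
cycle = np.cumsum(rng.normal(size=months))        # shared confidence cycle
X = cycle[:, None] + rng.normal(scale=2.0, size=(months, groups))
core, share = pca_core(X)
uncertainty = X.std(axis=1)   # cross-group dispersion as an uncertainty proxy
print(f"variance explained by the first component: {share:.1%}")
```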
APA, Harvard, Vancouver, ISO, and other styles
48

Shiri-Garakani, Mohsen. "Finite Quantum Theory of the Harmonic Oscillator." Diss., Georgia Institute of Technology, 2004. http://hdl.handle.net/1853/5078.

Full text
Abstract:
We apply the Segal process of group simplification to the linear harmonic oscillator. The result is a finite quantum theory with three quantum constants instead of the usual one. We compare the classical (CLHO), quantum (QLHO), and finite (FLHO) linear harmonic oscillators and their canonical or unitary groups. The FLHO is isomorphic to a dipole rotator with N = l(l+1) states, where l is very large for the physically interesting cases. The position and momentum variables are quantized with uniform finite spectra. For fixed quantum constants and large N there are three broad classes of FLHO: soft, medium, and hard, corresponding respectively to cases where the ratio of potential energy to kinetic energy in the Hamiltonian is very small, close to one, or very large. The field oscillators responsible for infrared and ultraviolet divergences are soft and hard respectively. Medium oscillators approximate the QLHO: their low-lying states have nearly the same zero-point energy and level spacing as the QLHO and nearly obey the Heisenberg uncertainty principle and the equipartition principle. The corresponding rotators are nearly polarized along the z-axis. The soft and hard FLHOs have infinitesimal zero-point energy and grossly violate equipartition and the Heisenberg uncertainty principle; they do not resemble the QLHO at all. Their low-lying energy states correspond to rotators polarized along the x-axis or y-axis respectively. Soft oscillators have frozen momentum, because their maximum potential energy is too small to produce one quantum of momentum. Hard oscillators have frozen position, because their maximum kinetic energy is too small to excite one quantum of position.
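A toy version of the contraction can be coded directly (a plausible illustration in the spirit of the abstract, not the thesis's exact normalization or its three quantum constants): build spin-l matrices, scale Lx and Ly into position and momentum, and check that the low-lying spectrum of the dipole rotator approaches the QLHO ladder n + 1/2:

```python
import numpy as np

def angular_momentum(l):
    """Spin-l matrices Lx, Ly, Lz (hbar = 1), basis |l,m> ordered m = l..-l."""
    m = np.arange(l, -l - 1, -1)
    Lz = np.diag(m).astype(complex)
    # <l,m+1|L+|l,m> = sqrt(l(l+1) - m(m+1)); superdiagonal in this ordering.
    lp = np.sqrt(l * (l + 1.0) - m[1:] * (m[1:] + 1.0))
    Lp = np.diag(lp, k=1).astype(complex)
    Lx = (Lp + Lp.T.conj()) / 2.0
    Ly = (Lp - Lp.T.conj()) / 2.0j
    return Lx, Ly, Lz

l = 200                                    # large l: the 'medium' regime
Lx, Ly, Lz = angular_momentum(l)
x, p = Lx / np.sqrt(l), Ly / np.sqrt(l)    # contracted position and momentum
H = (x @ x + p @ p) / 2.0                  # dipole-rotator Hamiltonian
E = np.sort(np.linalg.eigvalsh(H))
# Here H = (L^2 - Lz^2)/(2l), so low-lying states have |m| near l, i.e. they
# are nearly polarized along the z-axis. Each level appears twice in this toy
# model (states near the two poles); the distinct levels approach n + 1/2.
print("distinct low levels:", np.unique(np.round(E, 4))[:4])
```

For l = 200 the distinct low levels come out close to 0.5, 1.5, 2.5, 3.5, with a slightly compressed spacing, mirroring the abstract's claim that medium oscillators nearly reproduce the QLHO zero-point energy and level spacing.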
APA, Harvard, Vancouver, ISO, and other styles
49

Mari, L. "ON SOME ASPECTS OF OSCILLATION THEORY AND GEOMETRY." Doctoral thesis, Università degli Studi di Milano, 2012. http://hdl.handle.net/2434/170621.

Full text
Abstract:
This thesis aims to discuss some of the relationships between oscillation theory for linear ordinary differential equations on the real line (ODEs for short) and the geometry of complete Riemannian manifolds. In this respect, we prove new results in both directions. For instance, we improve on classical oscillation and nonoscillation criteria for ODEs, and we find sharp spectral estimates for a number of geometric differential operators on Riemannian manifolds. We apply these results to obtain topological and geometric properties. In the first part of the thesis, we collect some material that often appears in the literature in various forms and for which we give, in some instances, new proofs according to our specific point of view.
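The link between oscillation criteria and explicit computation can be illustrated numerically (the equation, interval and constants below are mine, chosen around the classical Euler threshold c = 1/4, and are not taken from the thesis):

```python
import numpy as np
from scipy.integrate import solve_ivp

def sign_changes(c, t0=1.0, t1=1000.0):
    """Count sign changes of the solution of u'' + (c/t^2) u = 0 with
    u(t0) = 0, u'(t0) = 1. Classical result: oscillatory iff c > 1/4."""
    rhs = lambda t, y: [y[1], -(c / t**2) * y[0]]
    t_eval = np.geomspace(t0, t1, 20000)
    sol = solve_ivp(rhs, (t0, t1), [0.0, 1.0], t_eval=t_eval,
                    rtol=1e-9, atol=1e-12)
    u = sol.y[0]
    return int(np.sum(np.sign(u[:-1]) * np.sign(u[1:]) < 0))

# Just above c = 1/4 the zeros exist but appear only at astronomically large t
# (they are equally spaced in log t), so clearly oscillatory values are used.
for c in (0.10, 5.0, 25.0):
    print(f"c = {c:5.2f}: {sign_changes(c)} sign changes on [1, 1000]")
```

Below the threshold the solution never changes sign, while above it the count grows with c, which is the dichotomy that oscillation and nonoscillation criteria of the kind studied in the thesis make precise.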
APA, Harvard, Vancouver, ISO, and other styles
50

Sharp, Jesse A. "Numerical methods for optimal control and parameter estimation in the life sciences." Thesis, Queensland University of Technology, 2022. https://eprints.qut.edu.au/230762/1/Jesse_Sharp_Thesis.pdf.

Full text
Abstract:
This thesis concerns numerical methods in mathematical optimisation and inference; with a focus on techniques for optimal control, and for parameter estimation and uncertainty quantification. Novel methodological and computational developments are presented, with a view to improving the efficiency, effectiveness and accessibility of these techniques for practitioners. The numerical methods considered in this work are widely applied throughout the life sciences; in areas including ecology, epidemiology and oncology, and beyond the life sciences; in engineering, economics, aeronautics and other disciplines.
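The abstract does not name specific algorithms, so as one widely used point of reference (not necessarily the thesis's method), here is a minimal forward-backward sweep for a toy linear-quadratic control problem; the dynamics, cost and all parameters are illustrative:

```python
import numpy as np

def forward_backward_sweep(T=1.0, n=1001, iters=50, relax=0.5):
    """Sketch of the forward-backward sweep method for a toy problem:
    minimise  int_0^T (x^2 + u^2) dt  subject to  x' = u, x(0) = 1.
    Pontryagin gives u* = -lambda/2, costate lambda' = -2x, lambda(T) = 0."""
    t = np.linspace(0.0, T, n)
    h = t[1] - t[0]
    u = np.zeros(n)
    for _ in range(iters):
        # Forward pass for the state (explicit Euler suffices for a sketch).
        x = np.empty(n); x[0] = 1.0
        for k in range(n - 1):
            x[k + 1] = x[k] + h * u[k]
        # Backward pass for the costate, from the terminal condition.
        lam = np.empty(n); lam[-1] = 0.0
        for k in range(n - 1, 0, -1):
            lam[k - 1] = lam[k] + h * 2.0 * x[k]
        # Relaxed control update improves convergence in practice.
        u = (1 - relax) * u + relax * (-lam / 2.0)
    return t, x, u

t, x, u = forward_backward_sweep()
print(f"x(T) = {x[-1]:.4f}, u(0) = {u[0]:.4f}")  # analytic: u(0) = -tanh(1)
```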
APA, Harvard, Vancouver, ISO, and other styles
