To view the other types of publications on this topic, follow the link: Reaction rules.

Dissertations on the topic "Reaction rules"

Cite a source in APA, MLA, Chicago, Harvard, and other citation styles

Select a source type:

Consult the top 40 dissertations for your research on the topic "Reaction rules".

Next to every source in the list of references there is an "Add to bibliography" button. Use it, and a bibliographic reference to the chosen work will be generated automatically in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scientific publication in PDF format and read an online annotation of the work if the relevant parameters are available in the metadata.

Browse dissertations on a wide variety of disciplines and organise your bibliography correctly.

1

Schwarb, Hillary. "The importance of stimulus-response rules in sequence learning". Thesis, Atlanta, Ga.: Georgia Institute of Technology, 2008. http://hdl.handle.net/1853/28221.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
2

Ferga, Jumuaa. "UK monetary policy reaction functions, 1992-2014 : a cointegration approach using Taylor rules". Thesis, University of Huddersfield, 2016. http://eprints.hud.ac.uk/id/eprint/28564/.

Full text of the source
Annotation:
For more than two decades, the monetary policy of countries around the world has undergone significant transformation. The long-term stabilization and lowering of inflation is the primary target of central banks founded on the principles of transparency and credibility, and the achievement of inflation targeting and control is ultimately judged by the public's expectations about future inflation. This objective has focused central bank policy making on modern monetary principles and the adoption of one of their core elements, the monetary policy rule. The United Kingdom officially adopted an explicit inflation targeting monetary policy in October 1992, and the Bank of England gained operational independence in May 1997. In this study, we investigate the behaviour of the Bank of England under an inflation targeting framework; in other words, whether Taylor-type policy rules can be used to describe its behaviour. We specifically attempt to shed light on whether the Taylor rule (Taylor, 1993) adequately describes central bank behaviour, and whether the existence of formal targets has induced nonlinearity in this behaviour, over the period October 1992 to December 2014. The study uses time series estimations of Taylor-type reaction functions to characterise monetary policy conduct in the UK; we use time series data because the other studies in this area use and recommend that method, Österholm (2005), Nelson (2000), Adam et al. (2003) and Clarida et al. (2000) amongst others. In addition, this study uses a long data set, which is useful for time series analysis. The analysis uses a modified cointegration and error correction model that is robust to the stationarity properties of the data, as well as vector autoregression techniques; our methodology therefore employs three types of econometric tests, namely unit root tests, cointegration tests and error correction models.
We use monthly data for the UK over the period October 1992 to December 2014 and estimate Taylor-type policy rules for the UK in order to answer these questions. Our results indicate that the Bank of England has not been following the Taylor rule: the regression results clearly indicate that it did not follow the rule over the period 1992-2014, because all coefficients on the inflation gap and the output gap were statistically insignificant. We link these results to the New Consensus Macroeconomics, the criticism of inflation targeting and endogenous money theory. The main contribution of this study is an up-to-date analysis, and evidence that Bank of England policy does not follow Taylor rules. In addition, on the methodological level, most previous studies reviewed in the literature have measured the interest rate, inflation and the output gap using a single dependent variable to assess whether the Taylor rule is effective. This study fills that gap by using two measures of the interest rate, three measures of inflation and two measures of the output gap, constructed with the Hodrick-Prescott (HP) filter and moving averages, thereby assessing the rule with more than one dependent variable.
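The Taylor-type rules estimated in this abstract all build on the same closed form. As a reference point, here is a minimal sketch of Taylor's (1993) original parameterisation; the coefficients are the classic textbook ones, not estimates from this thesis:

```python
def taylor_rate(inflation, output_gap, r_star=2.0, pi_target=2.0,
                a_pi=0.5, a_y=0.5):
    """Taylor (1993) rule: nominal policy rate implied by current
    inflation and the output gap (all values in percent)."""
    return r_star + inflation + a_pi * (inflation - pi_target) + a_y * output_gap

# With inflation at 3% and a 1% positive output gap, the rule
# prescribes 2 + 3 + 0.5*(3-2) + 0.5*1 = 6.0%.
print(taylor_rate(3.0, 1.0))  # 6.0
```

Estimating a reaction function, as the thesis does, amounts to fitting `a_pi` and `a_y` (and often an interest-rate smoothing term) to observed policy rates and testing whether they are significant.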
APA, Harvard, Vancouver, ISO, and other citation styles
3

Albhbah, Atia M. "Dynamic web forms development using RuleML : building a framework using metadata driven rules to control Web forms generation and appearance". Thesis, University of Bradford, 2013. http://hdl.handle.net/10454/5719.

Full text of the source
Annotation:
Web forms development for Web based applications is often expensive, laborious, error-prone and time consuming. Web forms are used by many different people with different backgrounds and many demands, and there is a very high cost associated with updating Web application systems to meet these demands. A wide range of techniques and ideas to automate the generation of Web forms exist; these techniques, however, are not capable of generating the most dynamic behaviour of form elements, and make insufficient use of database metadata to control Web forms' generation and appearance. In this thesis different techniques are proposed that use RuleML and database metadata to build rulebases to improve the automatic and dynamic generation of Web forms. First, this thesis proposes the use of a RuleML-format rulebase using Reaction RuleML that can be used to support the development of automated Web interfaces. Database metadata can be extracted from system catalogue tables in typical relational database systems, and used in conjunction with the rulebase to produce appropriate Web form elements. Results show that this mechanism successfully insulates application logic from code, and suggest that the method can be extended from generic metadata rules to more domain-specific rules. Second, it proposes the use of common-sense and domain-specific rulebases in Reaction RuleML format, in conjunction with database metadata rules, to extend support for the development of automated Web forms. Third, it proposes the use of rules that involve code to implement more semantics for Web forms. Separation between content, logic and presentation of Web applications has become an important issue for faster development and easy maintenance.
Just as CSS is applied on the client side to control the overall presentation of Web applications, a set of rules can give a similar consistency to the appearance and operation of any set of forms that interact with the same database. We develop rules to order Web form elements and query forms using the Reaction RuleML format in conjunction with database metadata rules. The results show the potential of RuleML formats for representing database structural and active semantics. Fourth, it proposes the use of a RuleML-based approach to provide support for greater semantics, for example advanced domain support, even when this is not a DBMS feature. The approach is to specify most of the semantics associated with data stored in an RDBMS, to overcome some RDBMS limitations. RuleML could be used to represent database metadata as an external format.
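The metadata-driven idea can be illustrated with a small sketch that maps a column's catalogue metadata to a Web form element. The thesis expresses its rulebase in Reaction RuleML; the plain Python rules, field names and type labels below are hypothetical stand-ins for that format:

```python
# Hypothetical sketch: derive an HTML form element from relational
# column metadata, as might be read from a system catalogue table.
def form_element(column):
    name = column["name"]
    dtype = column["type"]
    required = "" if column["nullable"] else " required"
    if dtype == "boolean":
        return f'<input type="checkbox" name="{name}"{required}>'
    if dtype in ("integer", "numeric"):
        return f'<input type="number" name="{name}"{required}>'
    max_len = column.get("max_length")
    size = f' maxlength="{max_len}"' if max_len else ""
    return f'<input type="text" name="{name}"{size}{required}>'

print(form_element({"name": "age", "type": "integer", "nullable": True}))
# <input type="number" name="age">
```

The appeal of the approach described in the abstract is that rules like these live outside the application code, so changing the schema or the rulebase changes the generated forms without touching application logic.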
APA, Harvard, Vancouver, ISO, and other citation styles
4

Birchwood, Anthony. "Implementation of Taylor type rules in nascent money and capital markets under managed exchange rates". Thesis, Brunel University, 2011. http://bura.brunel.ac.uk/handle/2438/6447.

Full text of the source
Annotation:
We investigate the practical use of Taylor-type rules in Trinidad and Tobago, which is in the process of implementing market-based monetary policy and seeks to implement flexible inflation targeting in the presence of a managed exchange rate. This is motivated by the idea that normative Taylor rules can be shaped by the practical experience of developing countries. We find that the inflation-exchange rate nexus is strong, hence the country may be unwilling to allow the exchange rate to float freely. We contend that despite weak market development the Taylor rule can still be applied, as the central bank is able to use moral suasion to achieve full pass-through of the policy rate to the market rate. Our evidence rejects Galí and Monacelli's (2005) argument that the optimal monetary policy rule for the open economy is isomorphic to that for a closed economy. Rather, our evidence suggests that the rule for the open economy allows for lower variability when it is augmented by the real exchange rate, as in Taylor (2001). We also reject Galí and Monacelli's (2005) hypothesis that domestic inflation is optimal for inclusion in the Taylor-type rule; instead we find that core CPI inflation leads to lower variability. Additionally, our evidence suggests that the monetary rule, when applied to Trinidad and Tobago, is accommodating to the US Federal Reserve rate. Further, we expand the work of Martin and Milas (2010), which considered the pass-through of the policy rate to the interbank rate in the presence of risk and liquidity. By extending the transmission to the market lending rate, we are able to go beyond those disruptive factors by considering excess liquidity and spillovers of international economic disturbances. We find that these shocks are significant for Trinidad and Tobago, but not significant enough to disrupt the pass-through; full pass-through is robust to the presence of these disruptive factors.
APA, Harvard, Vancouver, ISO, and other citation styles
5

Witt, Johannes [Author]. "Modelling and Analysis of the NF-kappaB Signalling Pathway and Development of a Thermodynamically Consistent Modelling Approach for Reaction Rules / Johannes Witt". Aachen: Shaker, 2012. http://d-nb.info/105240832X/34.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
6

Albhbah, Atia Mahmod. "Dynamic web forms development using RuleML : building a framework using metadata driven rules to control Web forms generation and appearance". Thesis, University of Bradford, 2013. http://hdl.handle.net/10454/5719.

Full text of the source
Annotation:
Web forms development for Web based applications is often expensive, laborious, error-prone and time consuming. Web forms are used by many different people with different backgrounds and many demands, and there is a very high cost associated with updating Web application systems to meet these demands. A wide range of techniques and ideas to automate the generation of Web forms exist; these techniques, however, are not capable of generating the most dynamic behaviour of form elements, and make insufficient use of database metadata to control Web forms' generation and appearance. In this thesis different techniques are proposed that use RuleML and database metadata to build rulebases to improve the automatic and dynamic generation of Web forms. First, this thesis proposes the use of a RuleML-format rulebase using Reaction RuleML that can be used to support the development of automated Web interfaces. Database metadata can be extracted from system catalogue tables in typical relational database systems, and used in conjunction with the rulebase to produce appropriate Web form elements. Results show that this mechanism successfully insulates application logic from code, and suggest that the method can be extended from generic metadata rules to more domain-specific rules. Second, it proposes the use of common-sense and domain-specific rulebases in Reaction RuleML format, in conjunction with database metadata rules, to extend support for the development of automated Web forms. Third, it proposes the use of rules that involve code to implement more semantics for Web forms. Separation between content, logic and presentation of Web applications has become an important issue for faster development and easy maintenance.
Just as CSS is applied on the client side to control the overall presentation of Web applications, a set of rules can give a similar consistency to the appearance and operation of any set of forms that interact with the same database. We develop rules to order Web form elements and query forms using the Reaction RuleML format in conjunction with database metadata rules. The results show the potential of RuleML formats for representing database structural and active semantics. Fourth, it proposes the use of a RuleML-based approach to provide support for greater semantics, for example advanced domain support, even when this is not a DBMS feature. The approach is to specify most of the semantics associated with data stored in an RDBMS, to overcome some RDBMS limitations. RuleML could be used to represent database metadata as an external format.
APA, Harvard, Vancouver, ISO, and other citation styles
7

Kim, Sok Won. "Essays on monetary economics and financial economics". College Station, Tex.: Texas A&M University, 2006. http://hdl.handle.net/1969.1/ETD-TAMU-1770.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
8

Correia, Ana Filipa Bandeirinha Abrantes. "Regras de Taylor Uma aplicação à política monetária alemã". Master's thesis, Instituto Superior de Economia e Gestão, 2001. http://hdl.handle.net/10400.5/3947.

Full text of the source
Annotation:
Master's degree in Monetary and Financial Economics
In the literature on monetary policy, one much-discussed question is the use of monetary policy rules as an instrument for central banks to conduct and communicate the policy they follow. By a monetary policy rule one understands the authority's commitment to deliver a determined objective, or to establish a trajectory for the policy instrument, in a clear and transparent way. One widely discussed rule is the Taylor rule, which relates the policy instrument, the interest rate, to only two variables: inflation and the output gap. This work surveys the literature on this rule from several angles: the empirical fit of the rule to actual policy, assessed through the estimation of reaction functions; the study of the rule as an optimal one in simple macroeconomic models; and the analysis of its performance in models in which it was not derived in optimal terms. The second part of the work evaluates its use as a representation of German monetary policy, through the estimation of the Bundesbank's reaction function using a cointegration approach.
APA, Harvard, Vancouver, ISO, and other citation styles
9

Bertoldi, Adriana. "A eficiência das regras de política monetária nos bancos centrais dos Estados Unidos, do Japão e da União Européia, a partir da década de 1990". Universidade do Vale do Rio do Sinos, 2009. http://www.repositorio.jesuita.org.br/handle/UNISINOS/2772.

Full text of the source
Annotation:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior
This work investigates the monetary policy reaction function, following the Taylor rule approach, to evaluate the performance of the monetary policy conducted by the Federal Reserve (FED), the Bank of Japan (BOJ) and the European Central Bank (ECB) during the period selected for the research. For both the FED and the BOJ, the analysis covers January 1990 to June 2008; for the ECB, owing to the constitution of the euro area, it covers January 1998 to June 2008. The work first reviews the literature on discretion versus rules in monetary policy, presenting some empirical results on the use of rules in the conduct of monetary policy. It then examines how the central banks and payment systems of the selected countries are structured. In addition, it discusses the monetary and exchange rate regime of each economy and gives a brief retrospective of the
APA, Harvard, Vancouver, ISO, and other citation styles
10

Aguirre-Samboní, Giann Karlo. "Ecosystem Causal Analysis Using Petri Net Unfoldings". Electronic thesis or dissertation, université Paris-Saclay, 2023. http://www.theses.fr/2023UPASG105.

Full text of the source
Annotation:
Many verification problems for concurrent systems have been successfully addressed by a variety of methods over the years, in particular Petri net unfoldings. However, questions of long-term behaviour and stabilisation have received relatively little attention. For instance, crucial features of the long-term dynamics of ecosystems, such as basins of attraction and tipping points, remain difficult to identify and quantify with good coverage. A central reason for this is the focus, in ecological modelling, on continuous models, which provide refined simulations but do not in general allow one to survey how the system's evolution would be altered under additional events or in otherwise different situations. In this work we aim to provide a toolkit for modelling and analysing ecosystem dynamics. We advocate safe reset Petri nets for modelling, since they have the potential to give an exhaustive possibilistic overview of the different feasible evolution scenarios. The unfolding of Petri nets provides the right tools to determine system trajectories leading to collapse and/or survival, and ultimately to characterise the actions or inactions that help sustain ecosystem stabilisation. This characterisation of token production and consumption is used to separate minimally doomed configurations from free ones: executions that lead inevitably to the system's collapse, even though they are not identified a priori as bad, versus executions that keep the system stable by excluding bad or doomed states, respectively.
Both the unfolding of safe reset nets and the algorithm for finding minimally doomed configurations have been successfully implemented in a software tool called Ecofolder and tested on some intriguing examples.
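The reset-net semantics underlying this work can be sketched in a few lines. In a 1-safe net the marking is a set of marked places; a reset arc empties its place when the transition fires, regardless of whether it holds a token. The places, transitions and reset arc below are illustrative only, not taken from Ecofolder:

```python
# Minimal sketch of firing in a 1-safe Petri net with reset arcs.
# A transition is (pre, post, resets), each a set of place names.
def fire(marking, transition):
    """Return the marking after firing, or None if not enabled."""
    pre, post, resets = transition
    if not pre <= marking:          # enabled only if all pre-places marked
        return None
    return (marking - pre - resets) | post

grow = ({"resource"}, {"population"}, set())
# the reset arc on 'resource' wipes it out when the system collapses
collapse = ({"population"}, {"collapsed"}, {"resource"})

m = {"resource", "population"}
m = fire(m, collapse)
print(sorted(m))  # ['collapsed']
```

An unfolder enumerates all such firings into a branching structure of events; "doomed" configurations are then those from which every continuation eventually reaches a collapse marking.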
APA, Harvard, Vancouver, ISO, and other citation styles
11

Morton, David Robert. "Changing the rules : staff reactions to planned curriculum change". Thesis, University College London (University of London), 1994. http://discovery.ucl.ac.uk/10021521/.

Full text of the source
Annotation:
This study is an action research project concerned with the effect of a change initiative on primary teachers' behaviour. It involves trying out a change approach and then refining and testing that approach in a consciously conducted change experiment. The study has two investigative strands. Both of these build on previous research into change that I conducted at a school in which I was working in 1986. The 1986 research described difficulties I had in conducting school self evaluation and the development of a revised approach to change. The product of the 1986 study was a change model. One strand of this study is an investigation into the effectiveness of that model in supporting teachers moving along the path to change. The second investigative strand of the study is concerned with the wider effect of implementing the change model on staff relationships in primary schools. The phrase 'changing the rules' in the title of the study harks back to an article by Helen Simons (1987) in which she suggests that activities such as self evaluation are 'against the rules of schools as institutions'. One element of this second strand of the study is an investigation into the rules governing staff relationships. It examines whether the closed behaviours that initially undermined the 1986 initiative are more widely prevalent in primary schools. The 1986 change initiative appeared to leave a residual effect of increased openness and collaboration between staff. A further element of this strand of the study is therefore an examination of whether implementing the change model affects staff relationships in other primary schools. The study examines the extent to which the change model acts to dismantle closed patterns of interaction between staff and replace them with more open ones. During the time that has elapsed between setting out and concluding this research there has been a growing focus on staff relationships in schools. 
Reviewing research into school culture, Fullan (1991) suggests that "we have not yet made much headway in how to establish collaborative cultures in schools". This study is an investigation into a possible process by which the rules of schools I have known as a teacher, deputy headteacher and headteacher might be changed.
APA, Harvard, Vancouver, ISO, and other citation styles
12

Alegret Ramon, Núria. "Computations on fullerenes: finding rules, identifying products and disclosing reactions paths". Doctoral thesis, Universitat Rovira i Virgili, 2014. http://hdl.handle.net/10803/275957.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
13

Garg, Aditie. "Designing Reactive Power Control Rules for Smart Inverters using Machine Learning". Thesis, Virginia Tech, 2018. http://hdl.handle.net/10919/83558.

Full text of the source
Annotation:
Due to the increasing penetration of solar power generation, distribution grids are facing a number of challenges; frequent reverse active power flows can result in rapid fluctuations in voltage magnitudes. However, with the revised IEEE 1547 standard, smart inverters can actively control their reactive power injection to minimize voltage deviations and power losses in the grid. Reactive power control with globally optimal inverter coordination in real time is demanding in both computation and communication, whereas local Volt-VAR or Watt-VAR control rules are subpar for enhanced grid services. This thesis uses machine learning tools and poses reactive power control as a kernel-based regression task to learn policies and evaluate the reactive power injections in real time. This novel approach performs inverter coordination through non-linear control policies designed centrally by the operator on a slower timescale, using anticipated scenarios for load and generation. In real time, the inverters feed locally and/or globally collected grid data into the customized control rules. The developed models are highly adjustable to the available computation and communication resources. The developed control scheme is tested on the IEEE 123-bus system and is seen to efficiently minimize losses and regulate voltage within the permissible limits.
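The split described above, policies designed offline from anticipated scenarios and evaluated locally in real time, can be illustrated with a small kernel-regression sketch. The estimator here is plain Nadaraya-Watson smoothing, and the feature vectors and setpoint values are hypothetical, not the thesis's exact model:

```python
import math

# Each training pair maps local grid measurements (e.g. voltage
# magnitude, local active power) to a reactive power setpoint q
# computed offline by the operator for that anticipated scenario.
def kernel(x, z, bandwidth=0.1):
    d2 = sum((a - b) ** 2 for a, b in zip(x, z))
    return math.exp(-d2 / (2 * bandwidth ** 2))

def predict_q(x, training, bandwidth=0.1):
    """Real-time rule: weight the offline-optimized setpoints q_i by
    kernel similarity of the current measurement x to scenario x_i."""
    weights = [kernel(x, xi, bandwidth) for xi, _ in training]
    return sum(w * qi for w, (_, qi) in zip(weights, training)) / sum(weights)

scenarios = [((1.02, 0.8), -0.3),   # high voltage, high solar -> absorb VARs
             ((0.98, 0.2), 0.2)]    # low voltage, low solar -> inject VARs
print(round(predict_q((1.02, 0.8), scenarios), 3))  # -0.3
```

The inverter only ever evaluates `predict_q` on its own measurements, which is what keeps the real-time side cheap in computation and communication.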
Master of Science
APA, Harvard, Vancouver, ISO, and other citation styles
14

ChengLi, Katherine. "A Reactive Performance Monitoring Framework". Thesis, Université d'Ottawa / University of Ottawa, 2016. http://hdl.handle.net/10393/34839.

Full text of the source
Annotation:
With the ascendancy of data and the rise of interest in analytics, organizations are becoming more interested in using data to make their business processes more intelligent and reactive. BI applications are a common way for organizations to integrate analytics into their processes. However, it can be days, weeks or even months before a manual response is undertaken based on a human interpreting a report. Even when information technology supports automatic responses within an organization, it is often implemented in an ad hoc manner without following a systematic framework. In this thesis, we present a reactive performance monitoring (RPM) framework which aims at automating the link from the analytical aspects of a business (how well the operational is achieving the strategic) to the operational ones (the particular process steps implemented within an organization that determine its behavior), bypassing the strategic (the high-level, long-term goals an organization is trying to achieve) as needed, and reducing the latency between knowledge and action. Our RPM framework is composed of an architecture, a methodology, and a rule environment which permits the writing of rules with relevant conditions and actions. In addition, we present an OLAP rule engine which is demonstrated to be effective in our framework, where events are streamed in, reacted upon in real time, and stored in an OLAP database. To develop and evaluate our framework, two case studies were undertaken. The first used IBM technologies to implement an application that identifies patients at high risk of cancer recurrence. The second used open source technologies; with this second implementation, we created an application whose goal is to inform women from at-risk populations about the different stages of pregnancy on a weekly basis.
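The condition-action rule environment described above can be sketched as follows. The rule, metric names and alert text are hypothetical, and this in-memory version omits the OLAP backing store the thesis uses:

```python
# Minimal sketch of a reactive rule environment: each rule pairs a
# condition on an incoming event with an action to run when it matches.
rules = [
    {"condition": lambda e: e["metric"] == "recurrence_risk" and e["value"] > 0.8,
     "action": lambda e: f"alert: high-risk patient {e['patient']}"},
]

def react(event):
    """Apply every matching rule to an incoming event; return the actions."""
    return [r["action"](event) for r in rules if r["condition"](event)]

print(react({"metric": "recurrence_risk", "value": 0.9, "patient": "P42"}))
# ['alert: high-risk patient P42']
```

Evaluating rules as events stream in, rather than when a human reads a report, is what closes the knowledge-to-action latency gap the abstract describes.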
APA, Harvard, Vancouver, ISO, and other citation styles
15

Kakar, Tabassum. "MARAS: Multi-Drug Adverse Reactions Analytics System". Digital WPI, 2016. https://digitalcommons.wpi.edu/etd-theses/1236.

Full text of the source
Annotation:
Adverse Drug Reactions (ADRs) are a major cause of morbidity and mortality worldwide. Clinical trials, which are extremely costly, human labor intensive and specific to controlled human subjects, are ineffective at uncovering all ADRs related to a drug. There is thus a growing need for computing-supported methods that facilitate the automated detection of drug-related ADRs from large report data sets, especially ADRs that remain undiscovered during clinical trials but later arise due to drug-drug interactions or prolonged usage. For this purpose, the big data sets available through drug-surveillance programs and social media provide a wealth of longitudinal information and thus a huge opportunity. In this research, we design a system using machine learning techniques to discover severe unknown ADRs triggered by a combination of drugs, also known as a drug-drug interaction. Our proposed Multi-drug Adverse Reaction Analytics System (MARAS) adopts and adapts an association rule mining based methodology, incorporating contextual information to detect, highlight and visualize interesting drug combinations that are strongly associated with a set of ADRs. MARAS extracts non-spurious associations that are true representations of the combination of drugs taken and reported by patients. We demonstrate the utility of MARAS via case studies from the medical literature, and the usability of the MARAS system via a user study using real-world medical data extracted from the FDA Adverse Event Reporting System (FAERS).
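The association-rule idea behind this kind of system can be sketched with toy data: count how often each drug pair co-occurs with each reported reaction, then score rules by confidence. The reports, drug names and thresholds below are illustrative only, and MARAS's contextual filtering of spurious associations is not reproduced:

```python
from itertools import combinations
from collections import Counter

# Toy spontaneous reports: (set of drugs taken, set of reported reactions).
reports = [
    ({"drugA", "drugB"}, {"bleeding"}),
    ({"drugA", "drugB"}, {"bleeding", "nausea"}),
    ({"drugA"}, {"nausea"}),
    ({"drugB"}, set()),
]

pair_counts, rule_counts = Counter(), Counter()
for drugs, reactions in reports:
    for pair in combinations(sorted(drugs), 2):
        pair_counts[pair] += 1                 # support of the drug pair
        for adr in reactions:
            rule_counts[(pair, adr)] += 1      # support of pair -> reaction

for (pair, adr), n in sorted(rule_counts.items()):
    print(pair, "->", adr, f"confidence={n / pair_counts[pair]:.2f}")
```

Here the rule (drugA, drugB) -> bleeding has confidence 1.0 on the toy data, the kind of strong pairwise association a pharmacovigilance analyst would then inspect.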
APA, Harvard, Vancouver, ISO, and other citation styles
16

André, Malin. "Rules of Thumb and Management of Common Infections in General Practice". Doctoral thesis, Linköping University, Department of Medicine and Health Sciences, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-5183.

Full text of the source
Annotation:

This thesis deals with problem solving of general practitioners (GPs), which is explored with different methods and from different perspectives. The general aim was to explore and describe rules of thumb and to analyse the management of respiratory and urinary tract infections (RTI and UTI) in general practice in Sweden. The results are based upon focus group interviews concerning rules of thumb and a prospective diagnosis-prescription study concerning the management of patients allocated a diagnosis of RTI or UTI. In addition unpublished data are given from structured telephone interviews concerning specific rules of thumb in acute sinusitis and prevailing cough.

GPs were able to verbalize their rules of thumb, which could be called tacit knowledge. A specific set of rules of thumb was used for rapid assessment when emergency and psychosocial problems were identified. Somatic problems seemed to be the expected, normal state. Later in the consultation, the rules of thumb seemed to be used in a balancing act between the individual and the general perspective. There was considerable variation between the rules of thumb of different GPs for patients with acute sinusitis and prevailing cough. In their rules of thumb the GPs seemed to integrate their medical knowledge and practical experience of the consultation. A high number of near-patient antigen tests for Streptococcus pyogenes (Strep A tests) and C-reactive protein (CRP) tests were performed in patients for whom testing was not recommended. There was only a slight decrease in antibiotic prescribing in patients allocated a diagnosis of RTI and examined with CRP in comparison with patients not tested. In general, the GPs in Sweden adhered to current guidelines for antibiotic prescribing. Phenoxymethylpenicillin (PcV) was the preferred antibiotic for most patients allocated a diagnosis of respiratory tract infection.

In conclusion, the use of rules of thumb might explain why current practices prevail in spite of educational efforts. One way to change practice could be to identify and evaluate rules of thumb used by GPs and disseminate well-adapted rules. The use of diagnostic tests in patients with infectious illnesses in general practice needs critical appraisal before introduction, as well as continuing surveillance. The use of rules of thumb by GPs might be one explanation for variation in practice and irrational prescribing of antibiotics in patients with infectious conditions.


On the day of the public defence, the status of articles IV and V was: Accepted.
17

Le, Truong Giang. „Using Event-Based and Rule-Based Paradigms to Develop Context-Aware Reactive Applications“. PhD thesis, Conservatoire national des arts et metiers - CNAM, 2013. http://tel.archives-ouvertes.fr/tel-00953368.

Annotation:
Context-aware pervasive computing has attracted significant research interest from both academia and industry worldwide. It covers a broad range of applications that support many manufacturing and daily-life activities. For instance, industrial robots detect changes in the working environment of the factory to adapt their operations to the requirements. Automotive control systems may observe other vehicles, detect obstacles, and monitor the fuel level or the air quality in order to warn the drivers in case of emergency. Another example is power-aware embedded systems that need to work based on current power/energy availability, since power consumption is an important issue. Those kinds of systems can also be considered smart applications. In practice, successful implementation and deployment of context-aware systems depend on the mechanism to recognize and react to changes happening in the environment. In other words, we need a well-defined and efficient adaptation approach so that the systems' behavior can be dynamically customized at runtime. Moreover, concurrency should be exploited to improve the performance and responsiveness of the systems. All those requirements, along with the need for safety, dependability, and reliability, pose a big challenge for developers. In this thesis, we propose a novel programming language called INI, which supports both event-based and rule-based programming paradigms and is suitable for building concurrent and context-aware reactive applications. In our language, both events and rules can be defined explicitly, in a stand-alone way or in combination. Events in INI run in parallel (synchronously or asynchronously) in order to handle multiple tasks concurrently and may trigger the actions defined in rules. Besides, events can interact with the execution environment to adjust their behavior if necessary and respond to unpredictable changes.
We apply INI in both academic and industrial case studies, namely an object tracking program running on the humanoid robot Nao and an M2M gateway. This demonstrates the soundness of our approach as well as INI's capabilities for constructing context-aware systems. Additionally, since context-aware programs are widely applicable and more complex than regular ones, they pose a higher demand for quality assurance. Therefore, we formalize several aspects of INI, including its type system and operational semantics. Furthermore, we develop a tool called INICheck, which can convert a significant subset of INI to Promela, the input modeling language of the model checker SPIN. Hence, SPIN can be applied to verify properties or constraints that need to be satisfied by INI programs. Our tool gives programmers assurance about their code and its behavior.
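The general event/rule combination described above can be caricatured in plain Python (a hedged sketch of the paradigm only; this is not INI syntax, and the event names, rules and values are invented):

```python
import queue
import threading

# Thread-safe event queue: "events" arrive concurrently from sensors.
events = queue.Queue()

def sensor(name, value):
    events.put((name, value))

# Rules map an event name to an action, or None when the rule does not fire.
RULES = {
    "battery": lambda v: "enter low-power mode" if v < 0.2 else None,
    "obstacle": lambda v: "stop motors" if v else None,
}

def dispatch():
    """Drain the queue and collect the actions triggered by the rules."""
    fired = []
    while not events.empty():
        name, value = events.get()
        action = RULES[name](value)
        if action:
            fired.append(action)
    return fired

# Two "event sources" run in parallel, then the rule engine reacts.
threads = [threading.Thread(target=sensor, args=a)
           for a in [("battery", 0.1), ("obstacle", True)]]
for t in threads:
    t.start()
for t in threads:
    t.join()

actions = sorted(dispatch())
print(actions)
```

The point of a dedicated language like INI is to make this event/rule wiring declarative and verifiable rather than hand-coded.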
18

Le, Truong Giang. „Using Event-Based and Rule-Based Paradigms to Develop Context-Aware Reactive Applications“. Electronic Thesis or Diss., Paris, CNAM, 2013. http://www.theses.fr/2013CNAM0883.

Annotation:
Context-aware pervasive computing has attracted significant research interest from both academia and industry worldwide. It covers a broad range of applications that support many manufacturing and daily-life activities. For instance, industrial robots detect changes in the working environment of the factory to adapt their operations to the requirements. Automotive control systems may observe other vehicles, detect obstacles, and monitor the fuel level or the air quality in order to warn the drivers in case of emergency. Another example is power-aware embedded systems that need to work based on current power/energy availability, since power consumption is an important issue. Those kinds of systems can also be considered smart applications. In practice, successful implementation and deployment of context-aware systems depend on the mechanism to recognize and react to changes happening in the environment. In other words, we need a well-defined and efficient adaptation approach so that the systems' behavior can be dynamically customized at runtime. Moreover, concurrency should be exploited to improve the performance and responsiveness of the systems. All those requirements, along with the need for safety, dependability, and reliability, pose a big challenge for developers. In this thesis, we propose a novel programming language called INI, which supports both event-based and rule-based programming paradigms and is suitable for building concurrent and context-aware reactive applications. In our language, both events and rules can be defined explicitly, in a stand-alone way or in combination. Events in INI run in parallel (synchronously or asynchronously) in order to handle multiple tasks concurrently and may trigger the actions defined in rules. Besides, events can interact with the execution environment to adjust their behavior if necessary and respond to unpredictable changes.
We apply INI in both academic and industrial case studies, namely an object tracking program running on the humanoid robot Nao and an M2M gateway. This demonstrates the soundness of our approach as well as INI's capabilities for constructing context-aware systems. Additionally, since context-aware programs are widely applicable and more complex than regular ones, they pose a higher demand for quality assurance. Therefore, we formalize several aspects of INI, including its type system and operational semantics. Furthermore, we develop a tool called INICheck, which can convert a significant subset of INI to Promela, the input modeling language of the model checker SPIN. Hence, SPIN can be applied to verify properties or constraints that need to be satisfied by INI programs. Our tool gives programmers assurance about their code and its behavior.
19

Sanli, Ozgur. „Rule-based In-network Processing For Event-driven Applications In Wireless Sensor Networks“. PhD thesis, METU, 2011. http://etd.lib.metu.edu.tr/upload/12613389/index.pdf.

Annotation:
Wireless sensor networks are application-specific networks that necessitate the development of specific network and information processing architectures that can meet the requirements of the applications involved. The most important challenge related to wireless sensor networks is the limited energy and computational resources of the battery-powered sensor nodes. Although the central processing of information produces the most accurate results, it is not an energy-efficient method because it requires a continuous flow of raw sensor readings over the network. As communication operations are the most expensive in terms of energy usage, the distributed processing of information is indispensable for viable deployments of applications in wireless sensor networks. This method not only helps in reducing the total amount of packets transmitted and the total energy consumed by sensor nodes, but also produces scalable and fault-tolerant networks. Another important challenge associated with wireless sensor networks is that sensory data are often imperfect and imprecise. The requirement of precision necessitates employing expensive mechanisms such as redundancy or the use of sophisticated equipment. Therefore, approximate computing may need to be used instead of precise computing to conserve energy. This thesis presents two schemes that distribute information processing for event-driven reactive applications, which are interested in higher-level information rather than the raw sensory data of individual nodes, to appropriate nodes in sensor networks. Furthermore, based on these schemes, a fuzzy rule-based system is proposed that handles the imprecision inherently present in sensory data.
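As a rough illustration of the fuzzy rule-based idea, a node can grade imprecise readings with membership functions and combine them through a rule (the membership ranges and the rule below are invented for illustration, not taken from the thesis):

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Fuzzy sets over raw sensor readings (ranges are illustrative).
def temp_high(t):
    return tri(t, 30.0, 45.0, 60.0)

def smoke_dense(s):
    return tri(s, 0.3, 0.7, 1.1)

def fire_risk(t, s):
    """Rule: IF temperature is high AND smoke is dense THEN risk is high.
    Fuzzy AND is taken as min; the rule's firing strength is the risk score."""
    return min(temp_high(t), smoke_dense(s))

print(fire_risk(45.0, 0.7))  # both memberships peak -> 1.0
```

Only the graded risk score needs to travel over the network, instead of a continuous stream of raw readings.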
20

Fushitani, Mizuho. „Nuclear Spin Selection Rule in Photoinduced Reaction of Methyl Radical and Nuclear Spin Conversion of Methane in Solid Parahydrogen“. 京都大学 (Kyoto University), 2002. http://hdl.handle.net/2433/150022.

21

Wallis, Russell Mark. „The vagaries of British compassion : a contextualized analysis of British reactions to the persecution of Jews under Nazi rule“. Thesis, Royal Holloway, University of London, 2011. http://repository.royalholloway.ac.uk/items/e8de6ecc-ffbd-4004-9993-23bc98fbbf6a/9/.

Annotation:
This thesis explores British reactions to the persecution and mass murder of the Jews under Nazi rule. It uniquely provides a deep context by examining British responses to a number of man-made humanitarian disasters between 1914 and 1943. In doing so it takes into account changing context, the memory of previous atrocities, and the making and re-making of British national identity. It shows that although each reaction was distinctive, common strands bound British confrontation with foreign atrocity. Mostly, the British consciously reacted in accordance with a long 'tradition' of altruism for the oppressed. This tradition had become part and parcel of how the British saw themselves. The memory of past atrocity provided the framework for subsequent engagement with an increasingly dangerous and unpredictable world. By tracking the discursive pattern of the atrocity discourse, the evidence reveals that a variety of so-called 'others' were cast and recast in the British imagination. Therefore, a disparate group of 'foreign' victims were the beneficiaries of nationwide indignation, almost regardless of the way the government was eventually able to contain or accommodate public protest. When Jews were the victims, there was a break with this tradition. The thesis shows that atrocity was fully comprehended by Britons, but that Jews did not evoke the intensity or longevity of compassion meted out to others. In other words, the reaction to Jewish suffering was particular: Jews were subject to a hierarchy of compassion.
22

Nhapulo, Gerson Leonardo. „Assessing nonlinear dynamics of central bank reaction function: the case of Mozambique“. Master's thesis, Instituto Superior de Economia e Gestão, 2015. http://hdl.handle.net/10400.5/10197.

Annotation:
Master's in Monetary and Financial Economics
This dissertation sheds some light on the elements governing monetary policy-making in Mozambique during the 2000Q1-2015Q1 sample period, i.e., whether the monetary authority of the country, Banco de Moçambique (BM), might have behaved differently over time conditional on price pressures and output swings, switching between periods when inflation was the primary concern of policy and periods when it was not. There are several approaches to assessing the nonlinear dynamics of a central bank reaction function. First, we investigate whether the interest rate responses change with the sign of inflation and output deviations. Second, we evaluate the responsiveness of the short-term interest rate to the size of price and output shocks. Finally, we use a Markov switching model to estimate a time-varying Taylor-type rule for the BM. The general finding is that only changes in inflation bring about a reaction of the BM. The only element captured by the Markov switching model is a weak change in price stability from the 2000Q1-2006Q4 sub-period to the 2007Q1-2015Q1 sub-period.
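For orientation, a Taylor-type rule maps inflation and output deviations to a policy rate. The sketch below uses Taylor's textbook 0.5/0.5 coefficients and an invented two-regime variant in the spirit of a Markov switching rule; none of the numbers are estimates for the BM:

```python
def taylor_rate(inflation, output_gap, r_star=2.0, pi_target=1.5,
                a_pi=0.5, b_y=0.5):
    """Taylor-type rule: nominal rate as a function of inflation and the
    output gap, with Taylor's (1993) textbook coefficients as defaults."""
    return r_star + inflation + a_pi * (inflation - pi_target) + b_y * output_gap

# Two-regime variant: the inflation-response coefficient depends on the
# (invented) regime, mimicking what a Markov switching estimation detects.
A_PI = {"dovish": 0.25, "hawkish": 1.0}

def switching_rate(inflation, output_gap, regime):
    return taylor_rate(inflation, output_gap, a_pi=A_PI[regime])

print(taylor_rate(1.5, 0.0))                # inflation at target, no gap -> 3.5
print(switching_rate(4.0, 0.0, "hawkish"))  # stronger response to high inflation
```

A nonlinear reaction function is one in which the coefficients themselves shift with the regime, the sign, or the size of the deviations.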
23

Oztarak, Hakan. „An Energy-efficient And Reactive Remote Surveillance Framework Using Wireless Multimedia Sensor Networks“. PhD thesis, METU, 2012. http://etd.lib.metu.edu.tr/upload/12614328/index.pdf.

Annotation:
With the introduction of Wireless Multimedia Sensor Networks, large-scale remote outdoor surveillance applications where the majority of the cameras will be battery-operated are envisioned. These are applications where the frequency of incidents is too low to employ permanent staffing, such as the monitoring of land and marine borders, critical infrastructures, bridges, water supplies, etc. Given the inexpensive costs of wireless resource-constrained camera sensors, the size of these networks will be significantly larger than that of traditional multi-camera systems. While a large number of cameras may increase the coverage of the network, such a large size along with resource constraints poses new challenges, e.g., localization, classification, tracking or reactive behavior. This dissertation proposes a framework that transforms current multi-camera networks into low-cost and reactive systems which can be used in large-scale remote surveillance applications. Specifically, a remote surveillance system framework with three components is proposed: 1) localization and tracking of objects; 2) classification and identification of objects; and 3) reactive behavior at the base station. For each component, novel lightweight, storage-efficient and real-time algorithms, both at the computation and the communication level, are designed, implemented and tested under a variety of conditions. The results have indicated the feasibility of this framework working with limited energy while achieving high object localization/classification accuracy. The results of this research will facilitate the design and development of very large-scale remote border surveillance systems and improve such systems' effectiveness in dealing with intrusions with reduced human involvement and labor costs.
24

Grenier, Philippe. „Etude des fonctions de structure en spin du nucleon : l'experience e143 au slac“. Clermont-Ferrand 2, 1995. http://www.theses.fr/1995CLF21720.

Annotation:
This thesis describes the E143 experiment, in which 29 GeV electrons were deep-inelastically scattered off polarized NH3 and ND3 targets. It took place at the Stanford accelerator in California, and its goal was to measure the polarized structure functions g1 and g2 of the proton and the deuteron. These structure functions make it possible to test the Bjorken sum rule, a fundamental rule of quantum chromodynamics. They also inform us about the spin structure of the nucleon and, in particular, give access to the fraction of the nucleon spin carried by the quarks. Experimentally, the structure functions are determined by measuring cross-section asymmetries. For the first moment of g1, for both the proton and the deuteron, we obtain values that lie two to three standard deviations below the Ellis-Jaffe predictions, but our results confirm the validity of the Bjorken sum rule and allow the contribution of the quarks to the nucleon spin to be estimated at about 30%. Our results on g2 appear to be well described by the Wandzura-Wilczek expression.
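For orientation, the Bjorken sum rule tested by the experiment relates the first moments of the proton and neutron spin structure functions to the ratio of the nucleon's axial and vector weak coupling constants, with perturbative QCD corrections indicated schematically:

```latex
\int_0^1 \left[\, g_1^{\,p}(x) - g_1^{\,n}(x) \,\right] \mathrm{d}x
  \;=\; \frac{1}{6}\left|\frac{g_A}{g_V}\right|
  \left(1 - \frac{\alpha_s(Q^2)}{\pi} + \cdots \right)
```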
25

Hlísta, Juraj. „Reaktivní audit“. Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2010. http://www.nusl.cz/ntk/nusl-237105.

Annotation:
The thesis deals with the design and implementation of an extension to the audit system in Linux: reactive audit. It brings new functionality to auditing in the form of triggering reactions to certain audit events. Reactive audit is implemented within an audit plugin and its use is optional. Additionally, there is a second plugin which stores selected audit events and provides time-related statistics to the first plugin. As a result, the reactive audit mechanism does not only react to individual audit events; it is also able to reveal anomalies based on the statistical information and set off the appropriate reactions. It is a fairly general mechanism that can be useful in various situations.
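The reaction mechanism described can be sketched generically (illustrative Python only; the event fields, rules, reactions and threshold are invented and are not the actual Linux audit plugin interface):

```python
from collections import deque

# Rules pair a predicate over an audit event with a reaction. The event
# shape ("type", "user") and the reactions are invented for illustration.
RULES = [
    (lambda e: e["type"] == "LOGIN_FAILED",
     lambda e: f"lock account {e['user']}"),
]

recent = deque(maxlen=10)   # sliding window of matched events for statistics
THRESHOLD = 3               # this many matches in the window flags an anomaly

def handle(event):
    """Return the reactions set off by a single audit event."""
    actions = []
    for matches, react in RULES:
        if matches(event):
            actions.append(react(event))
            recent.append(event)
    if len(recent) >= THRESHOLD:
        actions.append("report anomaly: repeated failures")
    return actions

log = [handle({"type": "LOGIN_FAILED", "user": "root"}) for _ in range(3)]
print(log[-1])  # the third event also trips the anomaly check
```

The two-plugin split in the thesis corresponds to the rule matching on one side and the window statistics feeding the anomaly check on the other.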
26

Caetano, Sidney Martins. „Ensaios sobre política monetária e fiscal no Brasil“. reponame:Biblioteca Digital de Teses e Dissertações da UFRGS, 2007. http://hdl.handle.net/10183/12461.

Annotation:
This thesis presents three essays on monetary and fiscal policy under the current inflation-targeting regime. The first essay studies monetary-fiscal integration by determining an optimal monetary policy rule under a fiscal restriction, analyzing the effects of different preferences on the optimal rule as the weights given to deviations of the primary surplus/GDP ratio from its pre-established target are varied. The results show that the optimal rule obtained features a negative response of interest rates to shocks in the debt-to-GDP ratio. Moreover, larger primary surpluses would allow larger reductions in interest rates, proportional to the weight this objective variable has in the social loss function. From the traditional point of view of the monetary policy transmission mechanism, the positive response of interest rates to a real exchange rate depreciation and to a rise in the risk premium would be maintained. Therefore, the results suggest that the adoption of an explicit target for the primary surplus as a share of GDP has positive consequences for the optimal monetary policy rule and for the reduction of interest rates, as well as for the efficiency of the current monetary policy instrument. The second essay analyzes the default-risk relation through a beta regression model, as well as the impact that primary surpluses can have on the risk premium and, consequently, on the exchange rate. Anchored in the model of Blanchard (2004/2005), the estimates based on the beta regression model for the four relations proposed in the study show significant signs compatible with the theory.
An interesting fact in the results for the inflation-targeting period is that the estimates indicate a strong relation between the primary surplus/GDP ratio and the probability of default, evidence highlighting the importance of the indirect effects that the surplus can have on domestic interest rates. The third essay analyzes the discrete dynamics of the SELIC interest rate target defined at the meetings of the Brazilian Monetary Policy Committee (COPOM). Two methods were applied to study the possibility of the COPOM reducing, maintaining or increasing the target rate: binomial probit and multinomial probit. Deviations of inflation and the output gap proved to be relevant variables for explaining the COPOM's decisions. The binomial probit model, applied separately to increases and reductions of the SELIC target, showed that including a fiscal variable generates better results. For the aggregated case, estimated by multinomial probit, the results indicate that including the fiscal variable combined with inflation expectations generates the best results. Thus, fiscal results and market inflation expectations are the signals to which the COPOM responds and which should be watched by the market.
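For illustration, a binomial probit maps a linear index of the explanatory variables to a decision probability through the standard normal CDF; the coefficients and deviations below are invented, not the estimates from the essay:

```python
import math

def probit_prob(x, beta):
    """P(decision) under a binomial probit: Phi(x . beta), with the standard
    normal CDF Phi computed via the error function."""
    z = sum(b * xi for b, xi in zip(beta, x))
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Invented weights on (inflation deviation, output gap, fiscal deviation);
# a real study estimates them by maximum likelihood from COPOM decisions.
beta = (0.8, 0.4, 0.3)
x = (2.0, -0.5, 0.0)
print(round(probit_prob(x, beta), 3))  # Phi(1.4), about 0.919
```

The multinomial variant extends the same idea to the three-way reduce/maintain/increase choice by comparing several such latent indices.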
27

Catalá, Bolós Alejandro. „AGORAS: Augmented Generation of Reactive Ambients on Surfaces. Towards educational places for action, discussion and reflection to support creative learning on interactive surfaces“. Doctoral thesis, Universitat Politècnica de València, 2012. http://hdl.handle.net/10251/16695.

Annotation:
Creativity is a skill of special interest for human development, since it is one of the dimensions that allows individuals, and ultimately society, to face new problems and challenges successfully. Besides understanding creativity as a set of factors relating to the creative individual, it must be taken into account that the degree of intrinsic motivation, the environment and other social factors can have a relevant effect on the development of this important skill, which makes it interesting to explore in the context of information technology use. In particular, given that communicative processes, the exchange of ideas and collaborative interaction between individuals are a fundamental pillar of creative processes, and that all of these are characteristics largely facilitated by interactive tabletops, one of the main contributions of this thesis is precisely the exploration of the suitability of interactive surfaces for collaborative creative construction tasks with adolescent students. Building on this study, which provides empirical evidence for the suitability of interactive surfaces as a technology with potential for fostering creativity, the thesis presents AGORAS: a middleware for building ecosystems of 2D games for interactive tabletops, whose underlying idea is to understand richer learning activities as those that allow the creation of games and their subsequent consumption. In the context of this thesis, a basic toolkit for building user interfaces for interactive surfaces has also been developed, along with an ecosystem model based on entities that can be simulated according to physical laws; the model has further been given an approach based on behavior rules enriched with dataflow expressions, together with a corresponding surface-based editor.
Catalá Bolós, A. (2012). AGORAS: Augmented Generation of Reactive Ambients on Surfaces. Towards educational places for action, discussion and reflection to support creative learning on interactive surfaces [Tesis doctoral no publicada]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/16695
28

Miller, Daniel Paul. „Maintaining the Atom: U.S. Nuclear Power Plant Life and the 80-Year Maintenance Regulation Regime“. Diss., Virginia Tech, 2020. http://hdl.handle.net/10919/96561.

Annotation:
Large, ever more complex technological systems surround us and provide products and services that both construct and define much of what we consider modern society. Our societal bargain is the trade-off between the benefits of our technologies and our constant vigilance over the safe workings and the occasional failures of these often hazardous sociotechnical systems during their operating life. Failure of a system's infrastructure, whether a complex subsystem or a single component, can cause planes to crash, oil rigs to burn, or the release of radioactivity from a nuclear power plant. To prevent catastrophes, much depends not only on skilled and safe operations, but upon the effective maintenance of these systems. Using the commercial nuclear power industry of the United States as a case study, this dissertation examines how nuclear power plant maintenance functions to ensure the plants are reliable and can safely operate for, potentially, eighty years: the current government-regulation-defined limit of their functional life. This study explores the history of U.S. nuclear maintenance regulatory policy from its early Cold War political precursors, through the effect of the 1979 Three Mile Island reactor meltdown accident and its long development, to its implementation by nuclear power licensees as formal maintenance programs. By investigating the maintenance of nuclear power plants, this research also intends to expand the conceptual framework of large-technological-system (LTS) theory in general by adding a recognizable, and practically achievable, end-of-life (EOL) phase to the heuristic structure. The dissertation argues that maintenance is a knowledge-producing technology that not only keeps a sociotechnical system operating through comprehension, but can be a surveillance instrument to make system end-of-life legible, that is, both visible and understandable.
With a discernible and legible view of system end-of-life, operators, policy makers, and the public can make more informed decisions concerning a system's safety and its continued usefulness in society.
Doctor of Philosophy
Large, ever more complicated technical systems surround us and provide products and services that define much of what we consider modern society. Our societal bargain is the trade-off between the benefits of our technologies and our constant vigilance over the safe workings, and the occasional failures, of these often hazardous systems during their operating life. Failure of a system's infrastructure, whether a complex subsystem or a single component, can cause planes to crash, oil rigs to burn, or the release of radioactivity from a nuclear power plant. Preventing catastrophes depends not only on skilled and safe operations, but on the effective maintenance of these systems. Using the commercial nuclear power industry of the United States as a case study, this dissertation examines how nuclear power plant maintenance functions to ensure the plants are reliable and can safely operate for, potentially, eighty years, the current regulation-defined limit of their functional life. This study explores the history of U.S. nuclear maintenance regulatory policy from its early Cold War political precursors, through the effect of the 1979 Three Mile Island reactor meltdown accident and its long development, to its implementation by nuclear power licensees as formal maintenance programs. By investigating the maintenance of nuclear power plants, this research also intends to develop a method to determine when a nuclear power plant, or other large technological system, is approaching or has reached the end of its reliable and safe operational life. The dissertation presents maintenance as a technology of knowledge that not only keeps a system operating through understanding of its components, but can also be a general surveillance instrument that makes system end-of-life legible.
With a discernible and understandable view of end-of-life, operators, policy makers, and the public can make more informed decisions concerning a system's safety and its continued usefulness to society.
APA, Harvard, Vancouver, ISO and other citation styles
29

Canoas, Ana Carolina Garcia. „Logica nebulosa e tecnica de otimização particle swarm aplicados ao controle de tensão e de potencia reativa“. [s.n.], 2008. http://repositorio.unicamp.br/jspui/handle/REPOSIP/260661.

The full text of the source
Annotation:
Advisor: Carlos Alberto Favarin Murari
Doctoral thesis - Universidade Estadual de Campinas, Faculdade de Engenharia Eletrica e de Computação
Resumo: Devido ao crescente aumento da demanda de energia elétrica e ao retardo de investimento na expansão dos sistemas e energia elétrica (SEE), a operação destes sistemas está cada vez mais próxima de seus limites operacionais, contribuindo para maior complexidade dos SEE. Neste sentido, para satisfazer as rígidas condições de operação, um gerenciamento do perfil de tensão e fluxo de potência tem se tornado cada vez mais importante para as concessionárias, de modo a aumentar a segurança operacional dos sistemas e otimizar o uso de fontes de potência reativa, visando suprir aos consumidores energia dentro de determinados padrões de qualidade e confiabilidade. o objetivo principal desta pesquisa é o desenvolvimento de metodologias com o objetivo de monitorar o perfil de tensão, mantendo-o dentro dos limites operativos, visando não perder a qualidade de fornecimento de energia elétrica. O primeiro método trata-se de uma ferramenta de auxílio à tomada de decisão dos operadores nos centros de controle, baseada em um conjunto de regras nebulosas, o qual é a base do sistema de inferência fuzzy (ou nebulosa) que por sua vez se fundamenta na teoria de conjuntos nebulosos. Considerando que o problema de controle de tensão e de potência reativa apresenta características de natureza não-linear e que envolve variáveis de controle contínuas e discretas, foi desenvolvido um segundo método, o qual utiliza lógica nebulosa em conjunto com a técnica de otimização particle swarm. Este método mostra a possibilidade de incorporar lógica nebulosa em algoritmos baseados em busca, possibilitando a redução das perdas do sistema, satisfazendo as restrições de operação, e garantindo que o perfil de tensão mantenha-se dentro dos limites operativos com uma melhor utilização das fontes de potência reativa
Abstract: Due to the growing electrical power demand and the lag of transmission system infrastructure, the operation of transmission systems approaches its limits, which increases the complexity of system operation. In this context, in order to satisfy more rigid operating conditions, managing the voltage profile and power flow more effectively has become very important to power companies that aim to enhance operating conditions and optimize reactive power sources so as to provide consumers with adequate quality and reliability standards. The main objective of this research work is the development of methodologies for monitoring the voltage profile in order to keep it within operating limits while preserving the quality of the energy being served. The first method consists of a tool for supporting decision making by system operators in control centres. This method is based on a set of fuzzy rules, which form the base of a fuzzy inference system. Considering that voltage and reactive power control presents nonlinear characteristics and mixed continuous and discrete variables, a hybrid meta-heuristic method based on a fuzzy system and particle swarm optimization has been developed. The fuzzy system has been incorporated into the swarm intelligence algorithm to provide an operating point that reduces system losses while satisfying the operational limits and voltage constraints and making the best use of the reactive power sources.
Doctorate
Electrical Energy
Doctor of Electrical Engineering
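The particle swarm component of this hybrid method lends itself to a compact illustration. The sketch below is not the author's algorithm (which couples PSO with a fuzzy system and power-flow constraints); it is a generic, minimal PSO minimizing a toy stand-in for system losses, with all names and parameter values hypothetical:

```python
import random

def pso_minimize(f, dim, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimization of f over [-5, 5]^dim."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # best position seen by each particle
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Hypothetical quadratic stand-in for system losses as a function of two control settings:
loss = lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2
best, best_val = pso_minimize(loss, dim=2)
```

In the thesis, the fuzzy rules additionally steer the search toward operating points that respect voltage limits; here the objective is a plain quadratic for illustration only.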
APA, Harvard, Vancouver, ISO and other citation styles
30

Bernáth, František. „Rozptýlená výroba a jej vplyv na kvalitu dodávok elektrickej energie“. Doctoral thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2014. http://www.nusl.cz/ntk/nusl-233625.

The full text of the source
Annotation:
This work deals with the deployment of distributed power sources in the electric power grid. The nature of these sources means that massive integration may cause power quality problems in the local context, and the overall reliability of supply in interconnected power systems may also be endangered. The work focuses on local voltage quality problems, with special emphasis on the analysis and design of tools (e.g. reactive power compensation units of power plants or dynamic voltage control by 110/22 kV transformers) for voltage control in power distribution systems. These tools should be used as part of a uniform voltage control concept, which is proposed here.
APA, Harvard, Vancouver, ISO and other citation styles
31

Wylock, Christophe. „Contribution à l'étude des transferts de matière gaz-liquide en présence de réactions chimiques“. Doctoral thesis, Universite Libre de Bruxelles, 2009. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/210257.

The full text of the source
Annotation:
Le bicarbonate de soude raffiné, produit industriellement par la société Solvay, est fabriqué dans des colonnes à bulles de grande taille, appelées les colonnes BIR.

Dans ces colonnes, une phase gazeuse contenant un mélange d’air et dioxyde de carbone (CO2) est dispersée sous forme de bulles dans une solution aqueuse de carbonate et de bicarbonate de sodium (respectivement Na2CO3 et NaHCO3). Cette dispersion donne lieu à un transfert de CO2 des bulles vers la phase liquide. Au sein des colonnes, la phase gazeuse se répartit dans deux populations de bulles :des petites bulles (diamètre de quelques mm) et des grandes bulles (diamètre de quelques cm). Le transfert bulle-liquide de CO2 est couplé à des réactions chimiques prenant place en phase liquide, qui conduisent à la conversion du Na2CO3 en NaHCO3. Une fois la concentration de saturation dépassée le NaHCO3 précipite sous forme de cristaux et un mélange liquide-solide est recueilli à la sortie de ces colonnes.

Ce travail, réalisé en collaboration avec la société Solvay, porte sur l’étude et la modélisation mathématique des phénomènes de transfert de matière entre phases, couplés à des réactions chimiques, prenant place au sein d’une colonne BIR. L’association d’études sur des colonnes à bulles à l’échelle industrielle ou réduite (pilote) et d’études plus fondamentales sur des dispositifs de laboratoire permet de développer une meilleure compréhension du fonctionnement des colonnes BIR et d’en construire un modèle mathématique détaillé.

L’objectif appliqué de ce travail est la mise au point d’un modèle mathématique complet et opérationnel d’une colonne BIR. Cet objectif est supporté par trois blocs de travail, dans lesquels différents outils sont développés et exploités.

Le premier bloc est consacré à la modélisation mathématique du transfert bulle-liquide de CO2 dans une solution aqueuse de NaHCO3 et de Na2CO3. Ce transfert est couplé à des réactions chimiques en phase liquide qui influencent sa vitesse. Dans un premier temps, des modèles sont développés selon des approches unidimensionnelles classiquement rencontrées dans la littérature. Ces approches passent par une idéalisation de l’écoulement du liquide autour des bulles. Une expression simplifiée de la vitesse du transfert bulle-liquide de CO2, est également développée et validée pour le modèle de colonne BIR.

Dans un second temps, une modélisation complète des phénomènes de transport (convection et diffusion), couplés à des réactions chimiques, est réalisée en suivant une approche bidimensionnelle axisymétrique. L’influence de la vitesse de réactions sur la vitesse de transfert est étudiée et les résultats des deux approches sont également comparés.

Le deuxième bloc est consacré à l’étude expérimentale du transfert gaz-liquide de CO2 dans des solutions aqueuses de NaHCO3 et de Na2CO3. A cette fin, un dispositif expérimental est développé et présenté. Du CO2 est mis en contact avec des solutions aqueuses de NaHCO3 et de Na2CO3 dans une cellule transparente. Les phénomènes provoqués en phase liquide par le transfert de CO2 sont observés à l’aide d’un interféromètre de Mach-Zehnder.

Les résultats expérimentaux sont comparés à des résultats de simulation obtenus avec un des modèles unidimensionnels développés dans le premier bloc. De cette comparaison, il apparaît qu’une mauvaise estimation de la valeur de certains paramètres physico-chimiques apparaissant dans les équations de ce modèle conduit à des écarts significatifs entre les grandeurs observées expérimentalement et les grandeurs estimées par simulation des équations du modèle.

C’est pourquoi une méthode d’estimation paramétrique est également développée afin d’identifier les valeurs numériques de ces paramètres physico-chimiques sur base des résultats expérimentaux. Ces dernières sont également discutées.

Dans le troisième bloc, nous apportons une contribution à l’étude des cinétiques de précipitation du NaHCO3 dans un cristallisoir à cuve agitée. Cette partie du travail est réalisée en collaboration avec Vanessa Gutierrez (du service Matières et Matériaux de l’ULB).

Nous contribuons à cette étude par le développement de trois outils :une table de calcul Excel permettant de synthétiser les résultats expérimentaux, un ensemble de simulations de l’écoulement au sein du cristallisoir par mécanique des fluides numérique et une nouvelle méthode d’extraction des cinétiques de précipitation du NaHCO3 à partir des résultats expérimentaux. Ces trois outils sont également utilisés de façon combinée pour estimer les influences de la fraction massique de solide et de l’agitation sur la cinétique de germination secondaire du NaHCO3.

Enfin, la synthèse de l’ensemble des résultats de ces études est réalisée. Le résultat final est le développement d’un modèle mathématique complet et opérationnel des colonnes BIR. Ce modèle est développé en suivant l’approche de modélisation en compartiments, développée au cours du travail de Benoît Haut. Ce modèle synthétise les trois blocs d’études réalisées dans ce travail, ainsi que les travaux d’Aurélie Larcy (du service Transferts, Interfaces et Procédés de l’ULB) et de Vanessa Gutierrez. Les équations modélisant les différents phénomènes sont présentées, ainsi que la méthode utilisée pour résoudre ces équations. Des simulations des équations du modèle sont réalisées et discutées. Les résultats de simulation sont également comparés à des mesures effectuées sur une colonne BIR. Un accord raisonnable est observé.

A l’issue de ce travail, nous disposons donc d’un modèle opérationnel de colonne BIR. Bien que ce modèle doive encore être optimisé et validé, il peut déjà être utilisé pour étudier l’effet des caractéristiques géométriques des colonnes BIR et des conditions appliquées à ces colonnes sur le comportement des simulations des équations du modèle et pour identifier des tendances.

//

Refined sodium bicarbonate is produced by the Solvay company in large bubble columns, called the BIR columns.

In these columns, a gaseous phase containing an air-carbon dioxide mixture (CO2) is dispersed in the form of bubbles in an aqueous solution of sodium carbonate and sodium bicarbonate (Na2CO3 and NaHCO3, respectively). This dispersion leads to a CO2 transfer from the bubbles to the liquid phase. Inside these columns, the gaseous phase is distributed between two bubble populations: small bubbles (a few mm in diameter) and large bubbles (a few cm in diameter).

The bubble-liquid CO2 transfer is coupled with chemical reactions taking place in the liquid phase that lead to the conversion of Na2CO3 to NaHCO3. When the solution is supersaturated in NaHCO3, the NaHCO3 precipitates in the form of crystals and a liquid-solid mixture is extracted at the outlet of the BIR columns.

This work, carried out in collaboration with Solvay, aims to study and to model mathematically the mass transfer phenomena between the phases, coupled with chemical reactions, taking place inside a BIR column. The study of bubble columns at the industrial and pilot scales is combined with a more fundamental study at laboratory scale to improve the understanding of how the BIR columns work and to develop a detailed mathematical model.

The applied objective of this work is to develop a complete and operational mathematical model of a BIR column. This objective is supported by three blocks of work. In each block, several tools are developed and used.

The first block is devoted to the mathematical modeling of the bubble-liquid CO2 transfer in an NaHCO3 and Na2CO3 aqueous solution. This transfer is coupled with chemical reactions in the liquid phase, which affect the transfer rate.

First, mathematical models are developed following the classical one-dimensional approaches of the literature. These approaches idealize the liquid flow around the bubbles. A simplified expression of the bubble-liquid CO2 transfer rate is also developed and validated for the BIR column model.

Second, a complete model of the transport phenomena (convection and diffusion) coupled with chemical reactions is developed, following an axisymmetric two-dimensional approach. The influence of the chemical reaction rate on the bubble-liquid transfer rate is studied and the results of the two approaches are then compared.

The second block is devoted to the experimental study of the gas-liquid CO2 transfer to NaHCO3 and Na2CO3 aqueous solutions. An experimental set-up is developed and presented. CO2 is put in contact with NaHCO3 and Na2CO3 aqueous solutions in a transparent cell. The phenomena induced in the liquid phase by the CO2 transfer are observed using a Mach-Zehnder interferometer.

The experimental results are compared to simulation results obtained using one of the one-dimensional models developed in the first block. From this comparison, it appears that a wrong estimation of some physico-chemical parameter values leads to significant differences between the experimentally observed quantities and those estimated by simulation of the model equations. Therefore, a parametric estimation method is developed in order to estimate the numerical values of those parameters from the experimental results. The values found are then discussed.

The third block presents a contribution to the study of NaHCO3 precipitation kinetics in a stirred-tank crystallizer. This part of the work is carried out in collaboration with Vanessa Gutierrez (Chemicals and Materials Department of ULB).

Three tools are developed: tables in an Excel sheet to synthesize the experimental results, a set of simulations of the flow inside the crystallizer by computational fluid dynamics (CFD), and a new method to extract the NaHCO3 precipitation kinetics from the experimental measurements. These three tools are combined to estimate the influence of the solid mass fraction and the flow on the NaHCO3 secondary nucleation rate.

Finally, all these results are brought together. The final result is the development of a complete and operational mathematical model of BIR columns. This model is developed following the compartmental modeling approach developed in the PhD thesis of Benoît Haut. It synthesizes the three blocks of study carried out in this work, together with the studies of Aurélie Larcy (Transfers, Interfaces and Processes Department of ULB) and those of Vanessa Gutierrez. The equations modeling the phenomena taking place in a BIR column are presented, as well as the method used to solve them. The model equations are simulated and the results are discussed. The results are also compared to experimental measurements carried out on a BIR column. A reasonable agreement is observed.

At the end of this work, an operational model of a BIR column is thus available. Although this model still has to be optimized and validated, it can already be used to study the influence of the geometric characteristics of the BIR columns, and of the conditions applied to these columns, on the behaviour of the model equation simulations, and to identify tendencies.
Doctorate in Engineering Sciences
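The coupling between liquid-phase reaction and transfer rate discussed in the first block is classically summarized, in film theory, by an enhancement factor. As a generic illustration only (a pseudo-first-order approximation with invented parameter values, not the model developed in the thesis):

```python
import math

def enhancement_factor(k1, D, kL):
    """Film-theory enhancement factor for a pseudo-first-order reaction:
    Hatta number Ha = sqrt(k1 * D) / kL, enhancement E = Ha / tanh(Ha).
    k1: rate constant [1/s], D: diffusivity [m^2/s], kL: mass-transfer coeff [m/s]."""
    ha = math.sqrt(k1 * D) / kL
    return ha / math.tanh(ha)

# Illustrative values: slow reaction (Ha << 1, E ~ 1) vs fast reaction (Ha >> 1, E ~ Ha).
slow = enhancement_factor(k1=0.1, D=2e-9, kL=1e-4)
fast = enhancement_factor(k1=1e4, D=2e-9, kL=1e-4)
```

The thesis goes well beyond this textbook limit (two-dimensional transport around bubbles, a full carbonate reaction scheme), but the enhancement factor captures the basic idea that faster liquid-phase reactions accelerate gas-liquid transfer.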

APA, Harvard, Vancouver, ISO and other citation styles
32

Santos, José Maria Novaes dos. „Dispositivos adaptativos cooperantes: formulação e aplicação“. Universidade de São Paulo, 2014. http://www.teses.usp.br/teses/disponiveis/3/3141/tde-16112015-144910/.

The full text of the source
Annotation:
Com a crescente complexidade das aplicações e sistemas computacionais, atualmente tem se tornado importante o uso de formalismos de várias naturezas na representação e modelagem de problemas complexos, como os sistemas reativos e concorrentes. Este trabalho apresenta uma contribuição na Tecnologia Adaptativa e uma nova técnica no desenvolvimento de uma aplicação para execução de alguns tipos de jogos, (General Game Playing), cuja característica está associada à capacidade de o sistema tomar conhecimento das regras do jogo apenas em tempo de execução. Com esse trabalho, amplia-se a classe de problemas que podem ser estudados e analisados sob a perspectiva da Tecnologia Adaptativa, através dos Dispositivos Adaptativos Cooperantes. A aplicação desenvolvida como exemplo neste trabalho introduz uma nova ótica no desenvolvimento de aplicações para jogos gerais (GGP) e abre novos horizontes para a aplicação da Tecnologia Adaptativa, como a utilização das regras para extração de informação e inferência.
The complexity of computer applications has grown so much that formalisms of several different kinds have become important. Many systems (e.g. reactive and concurrent ones) employ such formalisms to represent and model complex real-world problems. This work contributes to the field of Adaptive Technology and proposes a new approach for developing a general game playing (GGP) system, whose defining feature is the capability to play a game while learning the game rules only at run time. This work expands the set of problems that can be studied and analyzed from the Adaptive Technology perspective by means of cooperating adaptive devices. The application developed here introduces a new approach to general game playing development and widens the application field of Adaptive Technology to subjects related to information extraction and inference based on the devices' rules.
APA, Harvard, Vancouver, ISO and other citation styles
33

Mazuy, Nicolas. „Hétérogénéités en Union monétaire : quelles implications pour la zone euros ?“ Thesis, Strasbourg, 2020. http://www.theses.fr/2020STRAB001.

The full text of the source
Annotation:
Cette thèse a pour objectif d’étudier l’implication des hétérogénéités structurelles dans le cadre des politiques économiques de la zone euro. Nous étudions d’abord dans quelles mesures ces hétérogénéités et l’introduction d’un objectif de stabilité financière attribué à la banque centrale affectent la stabilisation conjoncturelle suite aux chocs et la coordination entre les autorités monétaires et budgétaires. Nous montrons la pertinence croissante de la coordination avec le degré d’hétérogénéité et la pro-activité de la banque centrale suite à l’ajout de l’objectif de stabilité financière qui améliore/dégrade la stabilisation conjoncturelle selon le type de choc. Ensuite, nous étudions des fonctions de réaction budgétaire nationales qui démontrent l’hétérogénéité des comportements budgétaires des gouvernements et les différents déterminants des politiques budgétaires. Enfin, nous mettons en évidence l’impact hétérogène de la politique monétaire unique sur les pays membres. Ceci s’explique notamment par des caractéristiques structurelles hétérogènes dans les spécialisations productives, dans le fonctionnement des marchés financiers, marchés du travail etc. De même, nous posons la question de la pertinence d’une politique monétaire unique dans le cadre d’une union monétaire hétérogène, en l’absence de mécanisme d’ajustement
The aim of this thesis is to examine the implications of structural heterogeneities for the policy framework of the euro area. The first step is to analyse the extent to which structural heterogeneities, and the introduction of a financial stability objective assigned to the central bank, influence the coordination of monetary and fiscal authorities as well as economic stabilization after shocks. We show that coordination becomes more relevant as the degree of heterogeneity grows, and that the pro-activity of the central bank following the addition of a financial stability objective improves or degrades cyclical stabilization depending on the type of shock. The next step is to examine fiscal reaction functions in the euro area in order to demonstrate the heterogeneity of governments' fiscal behaviour on the one hand and the determinants of these fiscal policies on the other. Finally, we look at the impact of the single monetary policy on the euro area Member States and highlight a markedly heterogeneous transmission of monetary policy, caused in particular by structural heterogeneities in productive specializations and in the functioning of financial and labour markets, to name a few. Here, we question the relevance of a single monetary policy in the context of a heterogeneous monetary union without any adjustment mechanism.
APA, Harvard, Vancouver, ISO and other citation styles
34

Kang, Yul Hyoung Ryul. „Inferring Decision Rules from Evidence, Choice, and Reaction Times“. Thesis, 2018. https://doi.org/10.7916/D86T1084.

The full text of the source
Annotation:
When a decision is made based on noisy evidence, it is often a good strategy to take multiple samples of evidence, up to a threshold, before committing to a choice. Such a process, termed bounded evidence accumulation, has quantitatively explained human and nonhuman behavior (the speed and accuracy of choices) as well as neural recordings. In this thesis, we exploit the quantitative relationship between evidence, choice, and reaction times (the inverse of speed) to infer decision rules that are not reported directly. In Part I, we consider decisions based on one stream of evidence. In Chapter 2, we start by examining decisions that are not reported immediately but are felt to be made at some point. We show that, in a perceptual decision-making task, we can predict the proportion of choices from the reported timing of covert decisions. We suggest that the awareness of having decided corresponds to the threshold-crossing of the accumulated evidence, rather than to a post hoc inference or an arbitrary report. For the type of decisions reported in Chapter 2, and many others, it has been suggested that the terminating threshold is not constant but decreases over time. In Chapter 3, we propose a method that estimates the threshold without any assumption about its shape. As a step toward more complex decisions, in Part II we consider decisions based on two streams of evidence. In Chapter 4, we summarize the results of human psychophysics experiments involving simultaneous motion-color judgments. The results suggest that information bearing on two dimensions of a decision can be acquired in parallel, whereas incorporation of the information into a combined decision involves serial access to these parallel streams. Here, one natural question is how complete the seriality is. In Chapter 5, we propose a method to estimate the degree of seriality of two evidence accumulation processes.
Another question is whether the two streams are acquired in parallel even when the stimulus viewing duration is not limited, and hence there is no apparent advantage to parallel acquisition given the serial evidence accumulation stage. In Chapter 6, we propose a method to estimate the probability of simultaneous acquisition of two evidence streams given the choice and the evidence streams. Collectively, the work in this thesis presents new ways to study decision rules quantitatively given noninvasive measures such as the contents of the evidence stream(s), decision times, and choices.
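The bounded evidence-accumulation process underlying these analyses can be simulated directly. The sketch below (toy parameters, not the fitted models from the thesis) draws a choice and a reaction time from a random walk to symmetric bounds:

```python
import random

def ddm_trial(drift=0.2, bound=1.0, dt=0.01, sigma=1.0, seed=None):
    """One trial of evidence accumulation to a symmetric bound.
    Returns (choice, reaction_time); choice is +1/-1 for upper/lower bound."""
    rng = random.Random(seed)
    x, t = 0.0, 0.0
    while abs(x) < bound:
        # Euler step of a drift-diffusion process.
        x += drift * dt + sigma * (dt ** 0.5) * rng.gauss(0.0, 1.0)
        t += dt
    return (1 if x > 0 else -1), t

# Simulate many trials and summarize choice proportion and mean decision time.
trials = [ddm_trial(drift=0.3, seed=s) for s in range(2000)]
p_upper = sum(1 for c, _ in trials if c == 1) / len(trials)
mean_rt = sum(t for _, t in trials) / len(trials)
```

Fitting such a model to observed choices and reaction times, rather than merely simulating it, is what allows unreported decision rules (e.g. a time-varying bound) to be inferred.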
APA, Harvard, Vancouver, ISO and other citation styles
35

Liscai, Alessandro. „EU fiscal framework reform: discretionary policy reaction to the cycle and the role of fiscal rules“. Master's thesis, 2020. http://hdl.handle.net/10362/108429.

The full text of the source
Annotation:
European fiscal rules have failed to dampen shocks and are often criticized for being too complex, not transparent, and poorly enforced. Using EU country-level data, we find evidence of fiscal policy procyclicality for the key indicator of the current fiscal framework, the change in the structural balance. We show that an alternative measure, the cyclically adjusted government spending growth rate, is more effective in activating a countercyclical fiscal response. Moreover, by simulating the bindingness of the new expenditure rule, we verify that it would have triggered a different course of action by EU member states, changing the EU's fiscal history.
APA, Harvard, Vancouver, ISO and other citation styles
36

Cruz, Nuno Alexandre Neves. „Reactive hybrid knowledge bases“. Master's thesis, 2014. http://hdl.handle.net/10362/14058.

The full text of the source
Annotation:
Hybrid knowledge bases are knowledge bases that combine ontologies with non-monotonic rules, joining the best of both open-world ontologies and closed-world rules. Ontologies are a good mechanism to share knowledge on the Web that can be understood by both humans and machines; rules, on the other hand, can be used, e.g., to encode legal regulations or to map between sources of information. Given the dynamics of today's Web, it is important for these hybrid knowledge bases to capture all these dynamics and thus adapt themselves. To achieve that, it is necessary to create mechanisms capable of monitoring the information flow on the Web. To date, there are no such mechanisms that allow monitoring events and performing modifications of hybrid knowledge bases autonomously. The goal of this thesis is then to create a system that combines these hybrid knowledge bases with reactive rules, aiming to monitor events and perform actions over a knowledge base. To achieve this goal, a reactive system for the Semantic Web is developed in a logic-programming-based approach, accompanied by a language for heterogeneous rule base evolution based on the RIF Production Rule Dialect, a standard for exchanging rules over the Web.
APA, Harvard, Vancouver, ISO and other citation styles
37

Huang, Yu-Ting, and 黃郁庭. „Using Temporal Association Rules to Detect Unexpected Adverse Drug Reactions from Taiwan Population“. Thesis, 2017. http://ndltd.ncl.edu.tw/handle/z2m9my.

The full text of the source
Annotation:
Master's thesis
National Taiwan University
Graduate Institute of Information Management
Academic year 105 (2016/17)
The purpose of using drugs is to treat or prevent disease, but drugs may still have unexpected adverse drug reactions (ADRs) that are harmful to people's health. Although the government provides the National Adverse Drug Reactions Reporting System in Taiwan, many factors make the reporting system inefficient, and reporting takes a long time. Most previous research used data from the National Adverse Drug Reactions Reporting System as the analysis target. However, under-reporting often happens, and the source of the data lacks professional medical judgement. This research instead chooses the National Health Insurance Research Database, which is more reliable owing to its huge dataset and authoritative prescriptions by doctors. This research forms lists of suspect ADRs for the ingredients of single drugs by temporal association rules after the drugs appear on the market. In addition, it uses the chi-square test to filter the results and then establishes a surveillance process for new drugs as they appear on the market. The results can efficiently exclude common diseases from the suspect ADR lists and reduce the number of candidate ADRs through the ranking given by the temporal association rules and the chi-square test.
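The chi-square filtering step can be illustrated on a toy 2x2 drug-exposure/ADR contingency table. All counts below are invented for illustration; the thesis works on actual claims data:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table
    [[a, b],   rows: drug exposed / not exposed
     [c, d]]   cols: ADR observed / not observed"""
    n = a + b + c + d
    expected = [
        [(a + b) * (a + c) / n, (a + b) * (b + d) / n],
        [(c + d) * (a + c) / n, (c + d) * (b + d) / n],
    ]
    observed = [[a, b], [c, d]]
    return sum((observed[i][j] - expected[i][j]) ** 2 / expected[i][j]
               for i in range(2) for j in range(2))

# Hypothetical counts: 40/200 exposed patients show the event vs 30/600 unexposed.
stat = chi_square_2x2(40, 160, 30, 570)
# Compare against the 3.84 critical value (chi-square, 1 df, alpha = 0.05).
signal = stat > 3.84
```

Drug-event pairs whose statistic clears the critical value are kept on the suspect list; the rest, including many common diseases that occur equally with and without exposure, are filtered out.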
APA, Harvard, Vancouver, ISO and other citation styles
38

Lee, LiHua, and 李麗華. „Is Taylor's Rule Appropriate to be Called Monetary Reaction Function“. Thesis, 2003. http://ndltd.ncl.edu.tw/handle/47830283092297087484.

The full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
39

Lee, Menghsun, and 李孟勳. „Asset Prices and an Extended Taylor's Rule: the Study of Asymmetric Policy Reactions“. Thesis, 2011. http://ndltd.ncl.edu.tw/handle/17893706830319522735.

The full text of the source
Annotation:
Master's thesis
National Taipei University
Department of Economics
Academic year 99 (2010/11)
Macroeconomists have long been interested in modeling the central bank's reaction function, which plays an important role in a wide variety of macroeconomic analyses. Estimating the central bank's reaction function lets us understand how monetary policy is adjusted and forecast the effects that changes in the central bank's policy instruments have on other policy actions. This paper uses Taylor's rule and applies the Hansen (2000) threshold model to examine whether monetary policy asymmetries exist in the central bank's reaction function. The sample period is from 1990 to 2010. The main findings of this study are as follows. First, the linear model better describes the expectation of Taylor's rule when the model takes asset prices into account. Second, only the housing price growth rate has a significant threshold effect in the threshold model. Further, we find that the central bank focuses on price stability when the housing price growth rate is below the threshold value, but focuses on the output gap, stock prices and the housing price growth rate when it is above the threshold value.
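The threshold specification described above can be sketched generically: a Taylor-type rate that switches coefficient sets when housing price growth crosses a threshold. All coefficients and the threshold value below are illustrative placeholders, not the estimates from the thesis:

```python
def taylor_rate(inflation, output_gap, housing_growth,
                r_star=2.0, pi_star=2.0, threshold=5.0):
    """Piecewise (threshold) Taylor-type rule: below the housing-growth
    threshold the bank weights inflation; above it, the output gap and
    asset prices. All coefficients are illustrative only (percent units)."""
    if housing_growth <= threshold:
        # Regime 1: standard Taylor (1993) weights, inflation-focused.
        return r_star + inflation + 0.5 * (inflation - pi_star) + 0.5 * output_gap
    # Regime 2: responds more to activity and to asset prices.
    return (r_star + inflation + 0.25 * (inflation - pi_star)
            + 1.0 * output_gap + 0.2 * (housing_growth - threshold))

low = taylor_rate(inflation=3.0, output_gap=1.0, housing_growth=2.0)
high = taylor_rate(inflation=3.0, output_gap=1.0, housing_growth=8.0)
```

Estimation in the paper works the other way around: Hansen's (2000) method searches over candidate thresholds and tests whether the two-regime fit significantly beats the linear rule.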
APA, Harvard, Vancouver, ISO and other citation styles
40

Hantouche, Mireille. „Impact of Uncertainties in Reaction Rates and Thermodynamic Properties on Ignition Delay Time“. Diss., 2021. http://hdl.handle.net/10754/670061.

The full text of the source
Annotation:
Ignition delay time, τ_ign, is a key quantity of interest that is used to assess the predictability of a chemical kinetic model. This dissertation explores the sensitivity of τ_ign to uncertainties in (1) rate-rule kinetic rate parameters and (2) enthalpies and entropies of fuel and fuel radicals, using global and local sensitivity approaches. We begin by considering variability in τ_ign to uncertainty in rate parameters. We consider a 30-dimensional stochastic germ in which each random variable is associated with one reaction class, and build a surrogate model for τ_ign using polynomial chaos expansions. The adaptive pseudo-spectral projection technique is used for this purpose. First-order and total-order sensitivity indices characterizing the dependence of τ_ign on uncertain inputs are estimated. Results indicate that τ_ign is mostly sensitive to variations in four dominant reaction classes. Next, we develop a thermodynamic class approach to study variability in τ_ign of n-butanol due to uncertainty in thermodynamic properties of species of interest, and to define associated uncertainty ranges. A global sensitivity analysis is performed, again using surrogates constructed with an adaptive pseudo-spectral method. Results indicate that the variability of τ_ign is dominated by uncertainties in the classes associated with peroxy and hydroperoxide radicals. We also perform a combined sensitivity analysis of uncertainty in kinetic rates and thermodynamic properties, which reveals that uncertainties in thermodynamic properties can induce variabilities in ignition delay time as large as those associated with kinetic rate uncertainties. In the last part, we develop a tangent linear approximation (TLA) to estimate the sensitivity of τ_ign with respect to individual rate parameters and thermodynamic properties in detailed chemical mechanisms. Attention is focused on a gas mixture reacting under adiabatic, constant-volume conditions.
The proposed approach is based on integrating the linearized system of equations governing the evolution of the partial derivatives of the state vector with respect to individual random variables, and a linearized approximation is developed to relate ignition delay sensitivity to scaled partial derivatives of temperature. The computations indicate that TLA leads to robust local sensitivity predictions at a computational cost an order of magnitude smaller than that incurred by finite-difference approaches.
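The tangent linear idea can be illustrated on a toy one-equation "ignition" model; this is not a real chemical mechanism and not the dissertation's code. The sensitivity S = ∂T/∂k is integrated alongside the state, and the temperature sensitivity at the ignition threshold is converted into a delay-time sensitivity via dτ/dk = −S / (dT/dt), mirroring the relation between ignition delay sensitivity and scaled temperature derivatives.

```python
import math

def ignition_sensitivity(k, T0=1.0, T_ign=5.0, dt=1e-4):
    """Toy thermal-runaway model dT/dt = k*exp(T) (hypothetical).
    Integrates the tangent linear equation dS/dt = (df/dT)*S + df/dk
    with S = dT/dk, then converts the temperature sensitivity at the
    ignition threshold into a delay-time sensitivity:
        dtau/dk = -S / (dT/dt)."""
    T, S, t = T0, 0.0, 0.0
    while T < T_ign:
        f = k * math.exp(T)                           # dT/dt
        S += dt * (k * math.exp(T) * S + math.exp(T))  # tangent linear step
        T += dt * f                                    # forward Euler state step
        t += dt
    return t, -S / (k * math.exp(T))

tau, dtau_dk = ignition_sensitivity(1.0)
```

For this toy model the exact delay is τ = (e^(−T0) − e^(−T_ign))/k, so dτ/dk = −τ/k, which the linearized integration reproduces without any repeated perturbed runs, which is the cost advantage of TLA over finite differences.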
APA, Harvard, Vancouver, ISO and other citation styles
