Doctoral dissertations on the topic "Event-based evaluation"


Create an accurate reference in APA, MLA, Chicago, Harvard, and many other styles.


Consult the 22 best doctoral dissertations for your research on the topic "Event-based evaluation".

Next to every work in the list of references there is an "Add to bibliography" button. Press it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication in .pdf format and read the abstract of the work online, provided the relevant details are available in the metadata.

Browse doctoral dissertations from a variety of disciplines and compile an appropriate bibliography.

1

Kakarla, Sujana. "Partial evaluation based triple modular redundancy for single event upset mitigation". [Tampa, Fla.] : University of South Florida, 2005. http://purl.fcla.edu/fcla/etd/SFE0001146.

Full text available
2

Rathfelder, Christoph [Verfasser]. "Modelling Event-Based Interactions in Component-Based Architectures for Quantitative System Evaluation / Christoph Rathfelder". Karlsruhe : KIT Scientific Publishing, 2013. http://www.ksp.kit.edu.

Full text available
3

Hellberg, Simon, and Dominik Hollidt. "Evaluation of Camera Resolution in Optical Flow Estimation Using Event-Based Cameras". Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-280321.

Full text available
Abstract:
Developments in event-based camera technology and the recent increase in pixel count raised the question of whether resolution helps the accuracy and performance of algorithms. This thesis studies the impact of resolution on optical flow estimation for event-based cameras. For this purpose, we created a data set containing a mix of synthetic scenes and real camera recordings with ground truth available. For the modeling of low-resolution data, we designed three different downsampling algorithms. The camera used for the real scene recordings was the Prophesee (CSD3SVCD), which was determined to be the best of the current state-of-the-art cameras in a prestudy. The camera investigation evaluated the camera's performance in terms of temporal and spatial accuracy. In order to answer the question of whether resolution benefits the accuracy of optical flow estimation, we ran a total of 13 algorithm variations from four algorithm families (Lucas-Kanade [1, 2], Local-Planes fitting [2, 3], direction-selective filter [2, 4] and patch match [5]) on the data set. We then analysed their performance in terms of processing time, output density, angular error, endpoint error and relative endpoint error. The results show that no global correlation between resolution and accuracy can be identified across all algorithms. However, individual methods behave differently on different data. The best performing methods, the patch match algorithms, seemed to prefer the less dense downsampled data. The evaluation also showed that the specific characteristics of the data, rather than resolution, seemed to have a larger impact on accuracy. Thus, denoised data might increase accuracy more than a change of resolution.
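The endpoint, relative endpoint and angular errors named in this abstract have standard definitions in the optical flow literature; the sketch below is only a generic NumPy illustration of those definitions (it is not code from the thesis, and the array shapes are assumptions).

    import numpy as np

    def flow_errors(est, gt):
        """Mean endpoint error, relative endpoint error and angular error between an
        estimated and a ground-truth flow field, both arrays of shape (H, W, 2)."""
        epe = np.linalg.norm(est - gt, axis=-1)               # Euclidean distance per pixel
        rel_epe = epe / (np.linalg.norm(gt, axis=-1) + 1e-9)  # normalised by ground-truth magnitude
        # Angular error between the 3-D vectors (u, v, 1), in degrees.
        num = est[..., 0] * gt[..., 0] + est[..., 1] * gt[..., 1] + 1.0
        den = np.sqrt(est[..., 0] ** 2 + est[..., 1] ** 2 + 1.0) * np.sqrt(gt[..., 0] ** 2 + gt[..., 1] ** 2 + 1.0)
        ae = np.degrees(np.arccos(np.clip(num / den, -1.0, 1.0)))
        return epe.mean(), rel_epe.mean(), ae.mean()

    # Tiny self-check with a hypothetical 2x2 flow field.
    print(flow_errors(np.zeros((2, 2, 2)), np.ones((2, 2, 2))))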
4

Borowski, Jimmy. "Software Architecture Simulation : Performance evaluation during the design phase". Thesis, Blekinge Tekniska Högskola, Institutionen för programvaruteknik och datavetenskap, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-5882.

Full text available
Abstract:
Due to the increasing size and complexity of software systems, software architectures have become a crucial part of development projects. A lot of effort has been put into defining formal ways of describing architecture specifications using Architecture Description Languages (ADLs). Since no common ADL today offers tools for evaluating performance, an attempt was made to develop such a tool based on an event-based simulation engine. Common ADLs were investigated and the work was based on the fundamentals of the field of software architectures. The tool was evaluated both in terms of correctness of predictions and in terms of usability, to show that it actually is possible to evaluate performance using high-level architectures as models.
5

Askerud, Caroline, and Sara Wall. "Evaluation of bus terminals using microscopic traffic simulation". Thesis, Linköpings universitet, Kommunikations- och transportsystem, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-139028.

Full text available
Abstract:
Traffic simulation is a safe and efficient tool for investigating infrastructural changes as well as traffic conditions. This master thesis aims to analyse a microscopic traffic simulation method for evaluating bus terminal capacity. The evaluation is performed through a case study of the bus terminal at Norrköping travel centre. The analysed method, referred to as the terminal logic in the thesis, uses a combination of time-based and event-based simulation. Through the combination of time and event, it is possible to capture all movements of individual vehicles within the terminal. The simulation model is built in the software Vissim. A new travel centre for Norrköping is under development, motivated among other things by the railway project Ostlänken in the eastern part of Sweden. An evaluation of the bus terminal is interesting due to a suspicion of overcapacity and the opportunity for redesign. To investigate both the terminal capacity and the terminal logic, three scenarios were implemented:
Scenario 1: Current design and frequency
Scenario 2: Current design with higher frequency
Scenario 3: Decreased number of bus stops with current frequency
The results from the scenarios confirm the assumption of overcapacity. The capacity was evaluated based on several different measures, all indicating low utilization. Even so, the utilization was uneven over time, and the simulation showed congestion when several buses departed at the same time. The case study established the terminal logic to be useful for evaluating capacity at bus terminals. It provides a good understanding of how the terminal operates and captures the movements. However, it was time-consuming to adjust the logic to the studied terminal, which is a disadvantage when investigating more than one alternative. The thesis resulted in two main conclusions. Firstly, more optimised planning of the buses at Norrköping bus terminal would probably be achievable and would lead to less congestion at the exits. Secondly, the terminal logic is a good method for evaluating bus terminals, but it is not straightforward to implement.
6

Toresson, Gabriel. "Documenting and Improving the Design of a Large-scale System". Thesis, Linköpings universitet, Programvara och system, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-157733.

Full text available
Abstract:
As software systems become increasingly larger and more complex, the need to make them easily maintained increases, as large systems are expected to last for many years. It has been estimated that system maintenance accounts for a large part of many IT departments' software development costs. In order to design a complex system to be maintainable, it is necessary to introduce structure, often as models in the form of a system architecture and a system design. As development of complex large-scale systems progresses over time, the models may need to be reconstructed, perhaps because development has deviated from the initial plan, or because changes had to be made during implementation. This thesis presents a reconstructed documentation of a complex large-scale system, as well as suggestions for how to improve the existing design based on identified needs and insufficiencies. The work was performed primarily using a qualitative manual code review approach on the source code, and the proposal was generated iteratively. The proposed design was evaluated, and it was concluded that it does address the needs and insufficiencies and that it can be realistically implemented.
7

Montvida, Olga. "Evaluation of cardio-metabolic effects of treatment with incretin-based therapies in patients with type 2 diabetes". Thesis, Queensland University of Technology, 2018. https://eprints.qut.edu.au/122920/1/Olga_Montvida_Thesis.pdf.

Full text available
Abstract:
This dissertation provides a detailed exploration of, and valuable insights into, type 2 diabetes management in the real-world setting. Incretin-based therapies and thiazolidinedione were found to provide higher chances of sustainable glycaemic and cardiovascular risk factor control compared to older anti-diabetic treatment options. The project highlights alarming rates of existing cardio-metabolic burden at the population level. Proper control, in terms of timely intensification with anti-hyperglycaemic, anti-hypertensive, and anti-dyslipidemic therapies when needed, remains a key aspect of improving long-term outcomes in patients with type 2 diabetes.
8

Trogadas, Giorgos, and Larissa Ekonoja. "The effect of noise filters on DVS event streams : Examining background activity filters on neuromorphic event streams". Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-302514.

Full text available
Abstract:
Image classification using data from neuromorphic vision sensors is a challenging task that affects the use of dynamic vision sensor cameras in real-world environments. One impeding factor is noise in the neuromorphic event stream, which is often generated by the dynamic vision sensors themselves. This means that effective noise filtration is key to successful use of event-based data streams in real-world applications. In this paper we harness two feature representations of neuromorphic vision data in order to apply conventional frame-based image tools on the neuromorphic event stream. We use a standard noise filter to evaluate the effectiveness of noise filtration using a popular dataset converted to neuromorphic vision data. The two feature representations are the best-of-class standard Histograms of Averaged Time Surfaces (HATS) and a simpler grid matrix representation. To evaluate the effectiveness of the noise filter, we compare classification accuracies using various noise filter windows at different noise levels by adding additional artificially generated Gaussian noise to the dataset. Our performance metrics are reported as classification accuracy. Our results show that the classification accuracy using frames generated with HATS is not significantly improved by a noise filter. However, the classification accuracy of the frames generated with the more traditional grid representation is improved. These results can be refined and tuned for other datasets and may eventually contribute to on-the-fly noise reduction in neuromorphic vision sensors.
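Background-activity filters of the kind examined in this thesis are often reduced, in their simplest form, to keeping an event only when a nearby pixel has fired recently. The sketch below is a generic illustration of that idea, not the filter or parameters actually evaluated in the work; the event tuple layout and the 10 ms window are assumptions.

    import numpy as np

    def background_activity_filter(events, width, height, dt_us=10_000):
        """Keep an event only if some pixel in its 3x3 neighbourhood (including the pixel
        itself) produced an event within the last dt_us microseconds."""
        last_ts = np.full((height, width), -np.inf)     # most recent timestamp seen per pixel
        kept = []
        for x, y, t, polarity in events:                # events assumed sorted by timestamp
            y0, y1 = max(0, y - 1), min(height, y + 2)
            x0, x1 = max(0, x - 1), min(width, x + 2)
            if t - last_ts[y0:y1, x0:x1].max() <= dt_us:
                kept.append((x, y, t, polarity))        # supported by recent neighbouring activity
            last_ts[y, x] = t                           # update after the support test
        return kept

    # Hypothetical events as (x, y, timestamp_us, polarity): only the supported one survives.
    events = [(10, 10, 0, 1), (11, 10, 3_000, 1), (50, 50, 4_000, 0)]
    print(background_activity_filter(events, width=64, height=64))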
9

Yun, Changgeun. "THREE ESSAYS ON PUBLIC ORGANIZATIONS". UKnowledge, 2015. http://uknowledge.uky.edu/msppa_etds/15.

Full text available
Abstract:
Organizations play key roles in modern societies. The importance of organizations for a society requires an understanding of organizations. In order to fully understand public organizations, it is necessary to recognize how organizational settings affect subjects of organizations and organizing. Although public and private organizations interrelate with each other, the two types are not identical. In this dissertation, I attempt to describe public organizations in their own setting by discussing three important topics in public organization theory: (1) innovation adoption in the public sector; (2) representative bureaucracy; and (3) decline and death of public organizations. In Chapter II, I scrutinize early adoption of innovations at the organizational level and explore which public organizations become early adopters in the diffusion process. The adoption of an innovation is directly related to the motivation to innovate. That is, organizations performing poorly will have a motivation to seek new solutions. I estimate the strength of the motivation by observing prior performance. The main finding of the second chapter is that performance-based motivation has a twofold impact on early innovation adoption: negative for organizations with low performance, but positive for those with very high performance. This study estimates the top 3.8% as the turning point defining which organizations attain outstanding performance and shows the positive relationship between performance and innovation adoption. In Chapter III, I develop a theoretical framework for predicting and explaining active representation in bureaucracy and test two hypotheses from the framework to test its validity. First, active representation requires the loss of organizational rewards. Second, a minority group mobilizes external support to minimize the cost of active representation. These findings support the view that active representation is a political activity in which bargaining between formal and informal roles occurs. In addition, I add evidence to the literature demonstrating that the two prerequisites – policy discretion and a critical mass – must be satisfied for active representation to occur. In Chapter IV, I argue that organizational change is a result of a relationship between an organization and the environment, and I suggest and advance the theory of organizational ecology for examining environmental effects on organizational decline and death. The theory has been extensively studied in the business sector, so I advance the theory to be applicable to the public sector. First, I add political variables, such as change in the executive branch and the legislature and unified government, and hypothesize that (1) an organization established by a party other than the one in the executive branch in any given year will be more likely to be terminated or decline; that (2) an organization established by a party other than the one in the legislature in any given year will be more likely to be terminated or decline; and that (3) if an unfriendly party controls both the executive branch and the legislature, organizations established by other parties are more likely to be terminated or decline. Second, the effect of the economic environment on the life cycle of public organizations is not as straightforward and simple as its effect on business firms.
10

Ahrsjö, Carl. "Real-time event based visualization of multivariate abstract datasets : Implementing and evaluating a dashboard visualization prototype". Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-170395.

Full text available
Abstract:
As datasets in general grow in size and complexity over time while the human cognitive ability to interpret said datasets essentially stays the same, it becomes important to enable intuitive visualization methods for analysis. Based on previous research in the field of information visualization and visual analytics, a dashboard visualization prototype handling real-time event-based traffic was implemented and evaluated. The real-time data is collected by a script and sent to a self-implemented web server that opens a websocket connection with the dashboard client, where the data is then visualized. Said data consisted of transactions and related metadata of an e-commerce retail site applied to a real customer scenario. The dashboard was developed using an agile method, continuously involving the thesis supervisor in the design and functionality process. The final design also depended on the results of an interview with a representative from one of the two target groups. The two target groups consisted of 5 novice and 5 expert users in the field of information visualization and visual analytics. The intuitiveness of the dashboard visualization prototype was evaluated by conducting two user studies, one for each target group, where the test subjects were asked to interact with the dashboard visualization, answer some questions and lastly solve a predefined set of tasks. The time spent solving said tasks, the number of serious misinterpretations and the number of wrong answers were recorded and evaluated. The results from the user study showed that the use of colors, icons, level of animation, the choice of visualization method and level of interaction were the most important aspects for carrying out an efficient analytical process according to the test subjects. The test subjects desired to zoom in on each component, to filter the contents of the dashboard and to get additional information about the components on demand. The most important result of developing the dashboard was how to handle the scalability of the application. It is highly important that the websocket connection remains stable when scaling out to handle more concurrent HTTP requests. The research also concludes that the dashboard should use visualization methods that are intuitive for all users, that real-time data needs to be put in relation to historical data if one wishes to carry out a valid analytical process, and that real-time data can be used to discover trends and patterns at as early a stage as possible. Lastly, the research provides a set of guidelines for scalability, modularity, intuitiveness and relations between datasets.
11

Ames, Zegarra Carolina, and Ananthan Indukaladharan. "Simulation of Assembly cell : Simulation based evaluation of automation solutions in an assembly cell". Thesis, Jönköping University, JTH, Industriell produktutveckling, produktion och design, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:hj:diva-53862.

Full text available
Abstract:
Purpose: The primary purpose of the current thesis is to develop a virtual model using discrete event simulation (DES) that supports the decision-making process regarding automation solution proposals for SMEs.
Method: The research approach is positivist and considers quantitative and empirical information. A literature search is conducted to build the theoretical base required to answer the research questions; it traced relevant, peer-reviewed topics in automation, discrete event simulation, and production lines. A simulation scenario is then designed and studied based on empirical knowledge and on how automation would affect it, followed by the collection of information from the simulation iterations.
Findings & Analysis: Two scenarios are presented: a fully manually operated assembly line consisting of only human operators, and a semi-automated assembly line that includes robots in specific areas performing specific operations. The two scenarios are simulated to check to what extent the KPIs and parameters improve between them. The experiments conclude that introducing automation elements into the production line increases overall efficiency and throughput rate, and creates a considerable gap relative to the manual line in WIP.
Conclusions and recommendations: The results obtained from the experimentation in discrete event simulation software, together with previous research, show that discrete event simulation makes a significant contribution as a decision-making tool: it allows a specific scenario to be understood and studied by imitation, different solutions to be tried in the same production system, and several indicators from the scenarios to be checked for the extent to which they could be improved.
Delimitations: The current thesis includes several delimitations. First, it focuses only on the operational level. Also, this study considers a specific type of product with many variants, and finally, only two scenarios are studied: a fully manual scenario and a semi-automated scenario with the presence of robots.
12

Raupach, Staffan, and Fredrik Lindelöw. "Virtual Value Stream Mapping : Evaluation of simulation based value stream mapping using Plant Simulation". Thesis, Högskolan i Skövde, Institutionen för ingenjörsvetenskap, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-11176.

Full text available
13

Rusnock, Christina. "Simulation-Based Cognitive Workload Modeling and Evaluation of Adaptive Automation Invoking and Revoking Strategies". Doctoral diss., University of Central Florida, 2013. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/5857.

Full text available
Abstract:
In human-computer systems, such as supervisory control systems, large volumes of incoming and complex information can degrade overall system performance. Strategically integrating automation to offload tasks from the operator has been shown to increase not only human performance but also operator efficiency and safety. However, increased automation allows for increased task complexity, which can lead to high cognitive workload and degradation of situational awareness. Adaptive automation is one potential solution to resolve these issues, while maintaining the benefits of traditional automation. Adaptive automation occurs dynamically, with the quantity of automated tasks changing in real-time to meet performance or workload goals. While numerous studies evaluate the relative performance of manual and adaptive systems, little attention has focused on the implications of selecting particular invoking or revoking strategies for adaptive automation. Thus, evaluations of adaptive systems tend to focus on the relative performance among multiple systems rather than the relative performance within a system. This study takes an intra-system approach specifically evaluating the relationship between cognitive workload and situational awareness that occurs when selecting a particular invoking-revoking strategy for an adaptive system. The case scenario is a human supervisory control situation that involves a system operator who receives and interprets intelligence outputs from multiple unmanned assets, and then identifies and reports potential threats and changes in the environment. In order to investigate this relationship between workload and situational awareness, discrete event simulation (DES) is used. DES is a standard technique in the analysis of systems, and the advantage of using DES to explore this relationship is that it can represent a human-computer system as the state of the system evolves over time. Furthermore, and most importantly, a well-designed DES model can represent the human operators, the tasks to be performed, and the cognitive demands placed on the operators. In addition to evaluating the cognitive workload to situational awareness tradeoff, this research demonstrates that DES can quite effectively model and predict human cognitive workload, specifically for system evaluation. This research finds that the predicted workload of the DES models highly correlates with well-established subjective measures and is more predictive of cognitive workload than numerous physiological measures. This research then uses the validated DES models to explore and predict the cognitive workload impacts of adaptive automation through various invoking and revoking strategies. The study provides insights into the workload-situational awareness tradeoffs that occur when selecting particular invoking and revoking strategies. First, in order to establish an appropriate target workload range, it is necessary to account for both performance goals and the portion of the workload-performance curve for the task in question. Second, establishing an invoking threshold may require a tradeoff between workload and situational awareness, which is influenced by the task's location on the workload-situational awareness continuum. Finally, this study finds that revoking strategies differ in their ability to achieve workload and situational awareness goals. 
For the case scenario examined, revoking strategies based on duration are best suited to improve workload, while revoking strategies based on revoking thresholds are better for maintaining situational awareness.
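Discrete event simulation, the modelling technique this dissertation relies on, advances the clock from one scheduled event to the next instead of in fixed time steps. The toy sketch below illustrates only that core mechanism with a priority queue; it is not the task-network workload model built in the dissertation.

    import heapq, itertools

    _seq = itertools.count()

    def schedule(queue, time, handler):
        heapq.heappush(queue, (time, next(_seq), handler))   # the counter breaks timestamp ties

    def run(queue, horizon):
        """Minimal discrete event loop: pop the earliest event and let its handler
        schedule any follow-up events."""
        while queue:
            time, _, handler = heapq.heappop(queue)
            if time > horizon:
                break
            handler(time, queue)

    # Hypothetical workload: a task arrives every 5 time units and completes 2 units later.
    def arrival(t, queue):
        print(f"{t:5.1f}  task arrives")
        schedule(queue, t + 2, completion)
        schedule(queue, t + 5, arrival)

    def completion(t, queue):
        print(f"{t:5.1f}  task completed")

    queue = []
    schedule(queue, 0.0, arrival)
    run(queue, horizon=12)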
Ph.D.
Doctorate
Industrial Engineering and Management Systems
Engineering and Computer Science
Industrial Engineering
14

Wulff, Tobias. "Evaluation of and Mitigation against Malicious Traffic in SIP-based VoIP Applications in a Broadband Internet Environment". Thesis, University of Canterbury. Computer Science and Software Engineering, 2010. http://hdl.handle.net/10092/5120.

Full text available
Abstract:
Voice over IP (VoIP) telephony is becoming widespread and is often integrated into computer networks. Because of this, it is likely that malicious software will threaten VoIP systems the same way traditional computer systems have been attacked by viruses, worms, and other automated agents. While most users have become familiar with email spam and viruses in email attachments, spam and malicious traffic over telephony is currently a relatively unknown threat. VoIP networks are a challenge to secure against such malware as much of the network intelligence is focused on the edge devices and access environment. A novel security architecture is being developed which improves the security of a large VoIP network with many inexperienced users, such as non-IT office workers or telecommunication service customers. The new architecture establishes interaction between the VoIP backend and the end users, thus providing information about ongoing and unknown attacks to all users. An evaluation of the effectiveness and performance of different implementations of this architecture is done using virtual machines and network simulation software to emulate vulnerable clients and servers that provide apparent attack vectors.
15

Peredo, Ramirez Daniela. "Quels gains d’une modélisation hydrologique adaptée et d’une approche d’ensemble pour la prévision des crues rapides ?" Electronic Thesis or Diss., Sorbonne université, 2021. http://www.theses.fr/2021SORUS058.

Full text available
Abstract:
Flood forecasting plays a fundamental role in anticipating and implementing measures to protect lives and property. The objective of this thesis is to investigate our ability to improve the simulation and forecasting of major flash flood events in France. First, we analyse the limitations of the lumped hydrological modelling approach and the contribution of the semi-distributed hydrological model GRSD, with a fine mesh and an hourly time step, to improving the simulation of major flood events. We also propose a modification of the structure of the model, in order to make it better suited to reproducing the response of the catchments to high rainfall intensities. An adaptation of the model structure, based on the calculation of the production rate function, resulted in the introduction of a new parameter and the proposal of a new model (GRSDi) capable of better simulating the hydrological response to heavy rains that occur in autumn, after a dry summer period. Second, we explore the ability of a meteorological ensemble prediction approach, combined with the semi-distributed hydrological model, to better predict flash flood events and the amplitude and time of occurrence of peak flows, whether in gauged or ungauged basins. The results made it possible to identify, from a hydrological point of view, the strengths and weaknesses of the products evaluated. The work carried out constitutes a step forward towards the use of conceptual, continuous and semi-distributed hydrological models for the forecasting of major floods and flash floods in the Mediterranean context.
16

Wilson, Brittany Michelle. "Evaluating and Improving the SEU Reliability of Artificial Neural Networks Implemented in SRAM-Based FPGAs with TMR". BYU ScholarsArchive, 2020. https://scholarsarchive.byu.edu/etd/8619.

Full text available
Abstract:
Artificial neural networks (ANNs) are used in many types of computing applications. Traditionally, ANNs have been implemented in software, executing on CPUs and even GPUs, which capitalize on the parallelizable nature of ANNs. More recently, FPGAs have become a target platform for ANN implementations due to their relatively low cost, low power, and flexibility. Some safety-critical applications could benefit from ANNs, but these applications require a certain level of reliability. SRAM-based FPGAs are sensitive to single-event upsets (SEUs), which can lead to faults and errors in execution. However, there are techniques that can mask such SEUs and thereby improve the overall design reliability. This thesis evaluates the SEU reliability of neural networks implemented in SRAM-based FPGAs and investigates mitigation techniques against upsets for two case studies. The first was based on the LeNet-5 convolutional neural network and was used to test an implementation with both fault injection and neutron radiation experiments, demonstrating that our fault injection experiments could accurately evaluate the SEU reliability of the networks. SEU reliability was improved by selectively applying TMR to the most critical layers of the design, achieving a 35% improvement in reliability at a cost of 6.6% additional resources. The second was an existing neural network called BNN-PYNQ. While the base design was more sensitive to upsets than the previously tested CNN, the TMR technique improved its reliability by approximately 7× in fault injection experiments.
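Triple modular redundancy, which the thesis applies selectively to network layers, reduces at its core to majority voting across three redundant copies of a computation. The following sketch illustrates the voting principle in software only; it is not the FPGA implementation evaluated in the work.

    def tmr_vote(a, b, c):
        """Bitwise majority vote over three redundant copies of a result; a single
        upset that corrupts one copy is masked by the other two."""
        return (a & b) | (a & c) | (b & c)

    golden = 0b1011_0101
    faulty = golden ^ 0b0000_0100          # a single-event upset flips one bit in one copy
    assert tmr_vote(golden, faulty, golden) == golden
    print("voted output matches the fault-free value")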
17

Gu, Yan. "ROSENET: a remote server-based network emulation system". Diss., Atlanta, Ga. : Georgia Institute of Technology, 2008. http://hdl.handle.net/1853/22662.

Full text available
Abstract:
Thesis (Ph. D.)--Computing, Georgia Institute of Technology, 2008.
Committee Chair: Fujimoto, Richard; Committee Member: Ammar, Mostafa; Committee Member: Bader, David; Committee Member: Goldsman, David; Committee Member: Park, Haesun; Committee Member: Riley, George.
18

Yo, Janq-Lin, and 游政霖. "Construction of Event-Evaluation-Based Classifier System in the Stock Market Transaction Event of the Empirical Study". Thesis, 2010. http://ndltd.ncl.edu.tw/handle/29174956874473948241.

Full text available
Abstract:
Master's thesis
National Chiao Tung University
Information Management Group, Executive Master's Program, College of Management
Academic year: 98 (ROC calendar)
This study used a classifier system, an artificial intelligence methodology, to construct an intelligent financial investment decision system that is based on an Event-Evaluation model and has knowledge mining and knowledge evolution functions. We named it the "Event Classifier System" (ECS). First, we developed an Event-Evaluation model as the operational core of ECS. Second, we developed the ECS transaction system with a user-friendly interface and intelligent operational functions for traders. Third, for practical application, two trading modules were designed and developed: an institutional investors' chips event strategy trading module and an ex-dividend event strategy trading module. Finally, to verify the effectiveness of the system, we invested real money (about NTD 33 million) in Taiwan's stock market and traded on the ECS's signals in an empirical experiment lasting about 8 months. Comparing three performance indicators (rate of return, volatility and Sharpe ratio), the empirical results show that the performance of ECS was better than that of the Taiwan weighted stock index during the same period.
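Of the three performance indicators used in the comparison, the Sharpe ratio is the only composite one; the sketch below shows one conventional way to compute it from a series of periodic returns (the risk-free rate, annualisation factor and sample returns are illustrative assumptions, not values from the thesis).

    import numpy as np

    def sharpe_ratio(returns, risk_free=0.0, periods_per_year=252):
        """Annualised Sharpe ratio of a series of periodic (e.g. daily) returns."""
        excess = np.asarray(returns) - risk_free / periods_per_year
        return np.sqrt(periods_per_year) * excess.mean() / excess.std(ddof=1)

    # Hypothetical daily returns, not data from the study.
    print(sharpe_ratio([0.002, -0.001, 0.003, 0.001, -0.002]))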
19

Lee, Chih-Jung, and 李芷融. "Evaluation for designing audio-based early warning system of vehicle approaching event for improving pedestrian's safety". Thesis, 2017. http://ndltd.ncl.edu.tw/handle/9n8e62.

Full text available
Abstract:
Master's thesis
National Central University
Department of Communication Engineering, Executive Master's Program
Academic year: 105 (ROC calendar)
More and more people carry smartphones and wear headphones or headsets, listening to music while jogging or walking in suburban areas. This behavior can cause distraction or a temporary loss of awareness of background environmental sounds, and can lead to accidents. This work proposes and evaluates a simple design for an audio-based early warning system that detects vehicle-approaching events to improve pedestrian safety. Sound signals were collected by an external directional microphone connected to the smartphone. Multiple features, such as root mean square, zero crossings, spectral centroid, and spectral rolloff, were computed on short-time frames of the audio samples. Multiple machine learning classifiers, including K Nearest Neighbor, Multi-layer Perceptron, Decision Tree and Random Forest, were applied to classify the audio frames and detect vehicle-approaching sounds. The results showed the accuracy and feasibility of the system and also pointed out the circumstances in which it is not applicable.
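The short-time features listed in the abstract (root mean square, zero crossings, spectral centroid, spectral rolloff) have standard definitions; the following sketch is an illustrative single-frame NumPy implementation under assumed frame and sampling parameters, not the thesis code.

    import numpy as np

    def frame_features(frame, sample_rate, rolloff=0.85):
        """RMS, zero-crossing count, spectral centroid (Hz) and spectral rolloff (Hz)
        for a single short-time audio frame."""
        rms = np.sqrt(np.mean(frame ** 2))
        zero_crossings = int(np.sum(np.abs(np.diff(np.sign(frame))) > 0))
        spectrum = np.abs(np.fft.rfft(frame))
        freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
        centroid = np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12)
        cumulative = np.cumsum(spectrum)
        rolloff_hz = freqs[np.searchsorted(cumulative, rolloff * cumulative[-1])]
        return rms, zero_crossings, centroid, rolloff_hz

    # Hypothetical 64 ms frame of noise at 16 kHz.
    rng = np.random.default_rng(0)
    print(frame_features(rng.standard_normal(1024), sample_rate=16_000))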
20

Simeone, Davide. "Un modello di simulazione del comportamento umano negli edifici". Doctoral thesis, 2013. http://hdl.handle.net/11573/918645.

Full text available
Abstract:
Predicting future users' behaviour and their activities in a building is a cardinal and highly complex task that designers have to face during the design process. Even though architects and their clients have at their disposal several computational tools which can help them to predict and evaluate many aspects of building performance, such as cost, energy consumption, and structural integrity, they have no means to predict and evaluate how well the proposed design will perform from the users' point of view. Simulative approaches are gradually overcoming this shortcoming but, at present, they are limited to the representation of specific occurrences and behavioural performance aspects, such as emergency egress and crowd behaviour, while more extensive and comprehensive representations of human behaviour in built environments, able to simulate everyday life and activities in buildings, are still missing. Capitalizing on current developments in the video game industry, this research (partially developed at the Technion Israel Institute of Technology and at Berkeley University of California) aims at establishing a new approach to human behaviour simulation in built environments, based on a clear, reliable and precise formalization of the use processes as specific structures of active entities called Events. Their role is to comprise knowledge about the building users, the activities they perform and the spaces where those specific activities are performed. Equipped with AI engines, events control and coordinate the actors' behaviour during the simulation, providing a coherent representation of their interaction, cooperation and collaboration. The proposed model allows designers to test and evaluate the impact of their decisions on future users' life and activities in the design process, when it is still possible to intervene to improve the quality of the final product, to solve critical issues and to reduce time and costs. To test its reliability, the model has been applied to simulate the functioning of a hospital nursing ward, both in routine and emergency circumstances.
21

Pereira, Amâncio Lucas de Sousa. "Hardware and software platforms to deploy and evaluate non-intrusive load monitoring systems". Doctoral thesis, 2016. http://hdl.handle.net/10400.13/1501.

Full text available
Abstract:
The work in this PhD thesis addresses the practical implications of deploying and testing Non-Intrusive Load Monitoring (NILM) and eco-feedback solutions in real-world scenarios. The contributions to this topic are centered around the design and development of NILM frameworks that have been deployed in the wild, supporting long-term research in eco-feedback and also serving the purpose of producing real-world datasets and furthering the state of the art regarding the performance metrics used to evaluate NILM algorithms. This thesis consists of three main parts: i) the development of tools and datasets for NILM and eco-feedback research, ii) the design, implementation and deployment of NILM and eco-feedback technologies in real-world scenarios, and iii) an experimental comparison of performance metrics for event detection and event classification algorithms. In the first part we describe the Energy Monitoring and Disaggregation Data Format (EMD-DF) and the SustData and SustDataED public datasets. In the second part we discuss the development and deployment of two hardware and software platforms in real households to support eco-feedback research. We then report on more than five years of experience in deploying and maintaining such platforms. Our findings suggest that the main practical issues can be divided into two categories, technological (e.g., system installation) and social (e.g., maintaining a steady sample throughout the whole study). In the final part of this thesis we analyze experimentally the behavior of a number of performance metrics for event detection and event classification, identifying clusters and relationships between the different measures. Our results show considerable differences in the behavior of the performance metrics when applied to the different problems.
Fundação para a Ciência e a Tecnologia
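Event-detection algorithms of the kind compared in the third part are commonly scored by matching detected event timestamps to ground-truth timestamps within a tolerance and then computing precision, recall and F1. The sketch below is a generic illustration of that scoring procedure; the matching rule and tolerance are assumptions, not the exact metrics analysed in the thesis.

    def detection_scores(detected, truth, tolerance=1.0):
        """Match detected event times to ground-truth times one-to-one within `tolerance`
        seconds and return precision, recall and F1."""
        unmatched = sorted(truth)
        tp = 0
        for t in sorted(detected):
            match = next((g for g in unmatched if abs(g - t) <= tolerance), None)
            if match is not None:
                tp += 1
                unmatched.remove(match)
        fp = len(detected) - tp
        fn = len(truth) - tp
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
        return precision, recall, f1

    # Hypothetical detections against ground-truth switching events (times in seconds).
    print(detection_scores(detected=[1.0, 5.2, 9.9], truth=[1.3, 5.0, 12.0], tolerance=0.5))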
22

Αικατερινίδης, Ιωάννης. "Ανάπτυξη συστημάτων δημοσιεύσεων/συνδρομών σε δομημένα δίκτυα ομοτίμων εταίρων". Thesis, 2008. http://nemertes.lis.upatras.gr/jspui/handle/10889/751.

Full text available
Abstract:
In the past few years, continuous data stream applications have become particularly popular. With the continuously increasing rate at which new information arrives, it becomes imperative to develop appropriate infrastructures that offer users only the information they are interested in, filtering out large volumes of information that is irrelevant to each user. The content-based publish/subscribe model is capable of handling large volumes of data traffic in a distributed, fully decentralized manner. Our basic contribution in this research area is the coupling of the content-based publish/subscribe model with structured (DHT-based) peer-to-peer networks, offering users high expressiveness in stating their interests. The proposed infrastructure operates in a distributed and scalable environment. The proposed solutions in this thesis relate to the development and testing of (a) a number of algorithms for processing subscriptions in the network and (b) a number of algorithms for processing publication events. The proposed algorithms were developed and thoroughly tested through detailed simulation-based experimentation. The performance metrics are: the fairness of the load distribution across network nodes resulting from message traffic while processing subscriptions and publication events, the total number of messages generated, the total volume of additional information the algorithms require, and the time required to match publication events to subscriptions.
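In content-based publish/subscribe of the kind built here on top of a DHT, a subscription is a set of attribute predicates and a publication event matches when every predicate holds. The sketch below illustrates only that matching step with an assumed attribute/operator format; the thesis's actual contributions, the key-placement and routing algorithms, are not shown.

    import operator

    OPS = {"<": operator.lt, "<=": operator.le, "=": operator.eq,
           ">=": operator.ge, ">": operator.gt}

    def matches(event, subscription):
        """event: dict of attribute -> value; subscription: list of (attribute, op, value)
        predicates, all of which must hold for the event to be delivered."""
        return all(attr in event and OPS[op](event[attr], value)
                   for attr, op, value in subscription)

    # Hypothetical stock-quote event against a two-predicate subscription.
    event = {"symbol": "ABC", "price": 10.5, "volume": 3000}
    subscription = [("symbol", "=", "ABC"), ("price", "<", 12.0)]
    print(matches(event, subscription))    # True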
