Dissertations / Theses on the topic 'Real data model'
Frafjord, Christine. "Friction Factor Model and Interpretation of Real Time Data." Thesis, Norges teknisk-naturvitenskapelige universitet, Institutt for petroleumsteknologi og anvendt geofysikk, 2013. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-21772.
Granholm, George Richard 1976. "Near-real time atmospheric density model correction using space catalog data." Thesis, Massachusetts Institute of Technology, 2000. http://hdl.handle.net/1721.1/44899.
Includes bibliographical references (p. 179-184).
Several theories have been presented in regard to creating a neutral density model that is corrected or calibrated in near-real time using data from space catalogs. These theories are usually limited to a small number of frequently tracked "calibration satellites" about which information such as mass and cross-sectional area is known very accurately. This work, however, attempts to validate a methodology by which drag information from all available low-altitude space objects is used to update any given density model on a comprehensive basis. The basic update and prediction algorithms and a technique to estimate true ballistic factors are derived in detail. A full simulation capability is independently verified. The process is initially demonstrated using simulated range, azimuth, and elevation observations so that issues such as the required number and types of calibration satellites, the density of observations, and susceptibility to atmospheric conditions can be examined. Methods of forecasting the density correction models are also validated under different atmospheric conditions.
by George Richard Granholm.
S.M.
Bloodsworth, Peter Charles. "A generic model for real-time scheduling based on dynamic heterogeneous data." Thesis, Oxford Brookes University, 2006. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.432716.
Larsson, Daniel. "ARAVQ for discretization of radar data : An experimental study on real world sensor data." Thesis, Högskolan i Skövde, Institutionen för informationsteknologi, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-11114.
Robidoux, Jeff. "Real-Time Spatial Monitoring of Vehicle Vibration Data as a Model for TeleGeoMonitoring Systems." Thesis, Virginia Tech, 2005. http://hdl.handle.net/10919/32426.
Master of Science
Hu, Xiaoxiang. "Analysis of Time-related Properties in Real-time Data Aggregation Design." Thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-39046.
Hwang, Yuan-Chun. "Local and personalised models for prediction, classification and knowledge discovery on real world data modelling problems." Thesis, 2009. http://hdl.handle.net/10292/776.
Bergstrom, Sarah Elizabeth 1979. "An algorithm for reducing atmospheric density model errors using satellite observation data in real-time." Thesis, Massachusetts Institute of Technology, 2002. http://hdl.handle.net/1721.1/17537.
Vita.
Includes bibliographical references (p. 233-240).
Atmospheric density mismodeling is a large source of error in satellite orbit determination and prediction in the 200-600 kilometer range. Algorithms for correcting or "calibrating" an existing atmospheric density model to improve accuracy have been seen as a major way to reduce these errors. This thesis examines one particular algorithm, which does not require launching special "calibration satellites" or new sensor platforms. It relies solely on the large quantity of observations of existing satellites, which are already being made for space catalog maintenance. By processing these satellite observations in near real-time, a linear correction factor can be determined and forecast into the near future. As a side benefit, improved estimates of the ballistic coefficients of some satellites are also produced. In addition, statistics concerning the accuracy of the underlying density model can be extracted from the correction. This algorithm had previously been implemented, and the implementation had been partially validated using simulated data. This thesis describes the completion of the validation process using simulated data and the beginning of the real-data validation process. It is also intended to serve as a manual for using and modifying the implementation of the algorithm.
by Sarah Elizabeth Bergstrom.
S.M.
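Both MIT abstracts above come down to the same estimation step: fitting a near-real-time correction to a density model from routine observations of catalogued satellites. A minimal sketch of the linear-correction idea, fitting a single multiplicative factor by least squares; the drag numbers are invented stand-ins for catalog-derived observations:

```python
import numpy as np

# Hypothetical inputs: for each tracked object, the drag deceleration implied
# by its observed orbit decay, and the value predicted by the density model.
rng = np.random.default_rng(0)
predicted_drag = rng.uniform(1e-6, 1e-4, size=200)   # model-based drag, m/s^2
true_scale = 1.15                                    # simulated density-model bias
observed_drag = true_scale * predicted_drag * rng.lognormal(0.0, 0.05, 200)

# Least-squares estimate of one multiplicative correction factor:
# minimize sum (observed - k * predicted)^2  =>  k = <obs, pred> / <pred, pred>
k = observed_drag @ predicted_drag / (predicted_drag @ predicted_drag)
print(f"estimated correction factor: {k:.3f}")  # close to the simulated bias
```

A real implementation would estimate ballistic coefficients jointly and forecast the correction forward in time, as both theses describe.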
Byrnes, Denise Dianne. "Static scheduling of hard real-time control software using an asynchronous data-driven execution model /." The Ohio State University, 1992. http://rave.ohiolink.edu/etdc/view?acc_num=osu14877799148243.
Hajjam, Sohrab. "Real-time flood forecasting model intercomparison and parameter updating rain gauge and weather radar data." Thesis, University of Salford, 1997. http://usir.salford.ac.uk/43019/.
SELICATI, VALERIA. "Innovative thermodynamic hybrid model-based and data-driven techniques for real time manufacturing sustainability assessment." Doctoral thesis, Università degli studi della Basilicata, 2022. http://hdl.handle.net/11563/157566.
Lee, Kelvin Kai-wing. "A delay model approach to analysing the performance of wireless communications /." Thesis, Hong Kong University of Science and Technology, 2005. http://library.ust.hk/cgi/db/thesis.pl?COMP%202005%20LEE.
Shay, Nathan Michael. "Investigating Real-Time Employer-Based Ridesharing Preferences Based on Stated Preference Survey Data." The Ohio State University, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=osu1471587439.
Zhang, Yingying. "Algorithms and Data Structures for Efficient Timing Analysis of Asynchronous Real-time Systems." Scholar Commons, 2013. http://scholarcommons.usf.edu/etd/4622.
Reynolds, Curt Andrew 1960. "Estimating crop yields by integrating the FAO crop specific water balance model with real-time satellite data and ground-based ancillary data." Thesis, The University of Arizona, 1998. http://hdl.handle.net/10150/192102.
Schirber, Sebastian, Daniel Klocke, Robert Pincus, Johannes Quaas, and Jeffrey L. Anderson. "Parameter estimation using data assimilation in an atmospheric general circulation model: from a perfect toward the real world." American Geophysical Union (AGU), 2013. https://ul.qucosa.de/id/qucosa%3A13463.
Robinson-Mallett, Christopher. "Modellbasierte Modulprüfung für die Entwicklung technischer, softwareintensiver Systeme mit Real-Time Object-Oriented Modeling." PhD thesis, Universität Potsdam, 2005. http://opus.kobv.de/ubp/volltexte/2005/604/.
As a consequence of the increasing complexity of technical software systems, the demand for highly productive methods and tools is increasing, even in the field of safety-critical systems. In particular, object-oriented and model-based approaches to software development provide excellent abilities to develop large and highly complex systems. Therefore, it can be expected that in the near future these methods will find application even in the safety-critical area. The Unified Modeling Language Real-Time (UML-RT) is a software development method for technical systems, propagated by the Object Management Group (OMG). For the practical application of this method in the field of technical and safety-critical systems, it has to provide certain technical qualities, e.g. the applicability of temporal analyses. Furthermore, it needs to be integrated into the existing quality assurance process. An important aspect of the integration of UML-RT into a quality-oriented process model, e.g. the V-Model, is the availability of sophisticated concepts and methods for systematic unit testing.
Unit testing is the first quality assurance phase after implementation; it serves to reveal faults and to demonstrate the quality of each independently testable software component. During this phase, the systematic execution of test cases is the most important quality assurance task. Despite the fact that many sophisticated commercial methods and tools for model-based software development are available today, no convincing solutions exist for systematic model-based unit testing.
The use of executable models and automatic code generation are important concepts of model-based software development, which enable the constructive reduction of faults through automation of error-prone tasks. Consequently, these concepts should be transferred into the testing phases by a model-based quality assurance approach. Therefore, a major requirement of a model-based unit-testing method is a high degree of automation. In the best case, this should result in fully automatic test-case generation.
Model checking has already proven to be an efficient and flexible method for the automated generation of test cases from specifications in the form of finite state machines. The model checking approach was originally developed for the verification of communication protocols and has been applied successfully to a wide range of problems in the field of technical software modelling. The application of model checking demands a formal, state-based representation of the system. Therefore, the use of model checking for the generation of test cases is a beneficial approach to improving quality in model-based software development with executable, state-based models.
Although, in its current state, the specification of UML-RT provides only little information on the semantics of the formalism to be used for specifying a component's behaviour, it can be assumed that it will be compatible with Real-Time Object-Oriented Modeling (ROOM). All methods and results presented in this dissertation are therefore transferable to UML-RT.
For these reasons, this dissertation aims at improving analytical quality assurance in a model-based software development process. To achieve this goal, a new model-based approach to automated unit testing based on state-based behavioural models and CTL model checking is presented. The test-case generation can be largely automated to avoid faults due to error-prone human activities. Furthermore, the method can be integrated into the technical concepts of Model Driven Architecture and ROOM (respectively UML-RT), and into a quality-oriented process model such as the V-Model.
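The counterexample-as-test-case idea at the core of this method can be illustrated in a few lines: search a finite state machine for a state satisfying a target predicate and return the input sequence that reaches it as a test stimulus. The toy state machine and predicate below are invented for illustration; a CTL model checker, as used in the dissertation, handles far richer properties:

```python
from collections import deque

# Toy state machine: (state, input) -> next state.
transitions = {
    ("idle", "start"): "running",
    ("running", "pause"): "paused",
    ("paused", "start"): "running",
    ("running", "stop"): "idle",
}

def witness_trace(initial, target):
    """Breadth-first search for a state satisfying `target`; the input
    sequence reaching it can serve directly as a unit-test stimulus."""
    queue, seen = deque([(initial, [])]), {initial}
    while queue:
        state, inputs = queue.popleft()
        if target(state):
            return inputs
        for (src, inp), dst in transitions.items():
            if src == state and dst not in seen:
                seen.add(dst)
                queue.append((dst, inputs + [inp]))
    return None  # target unreachable: no test case exists

print(witness_trace("idle", lambda s: s == "paused"))  # ['start', 'pause']
```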
Quaranta, Giacomo. "Efficient simulation tools for real-time monitoring and control using model order reduction and data-driven techniques." Doctoral thesis, Universitat Politècnica de Catalunya, 2019. http://hdl.handle.net/10803/667474.
Numerical simulation, the use of computers to run a program that implements a mathematical model of a physical system, is an important part of today's technological world. In many fields of science and engineering it is necessary to study the behaviour of systems whose mathematical models are too complex to provide analytical solutions, and simulation makes the virtual evaluation of system responses possible (virtual twins). This drastically reduces the number of experimental tests needed for accurate designs of the real system that the numerical model represents. However, these virtual twins, based on classical methods that make use of a rich representation of the system (e.g. the finite element method), rarely allow real-time feedback, even when the computation is performed on high-performance platforms. In these circumstances, the real-time performance required in some applications is compromised. Indeed, virtual twins are static: they are used in the design of complex systems and their components, but they are not expected to accommodate or assimilate data so as to define dynamic data-driven application systems. Moreover, significant deviations between the observed response and the one predicted by the model are usually noticed, due to inaccuracies in the employed models, in the determination of the model parameters, or in their temporal evolution. This thesis proposes different methods to overcome these limitations in order to perform real-time monitoring and control. In the first part, model order reduction techniques are used to meet the real-time constraints; they compute a good approximation of the solution by simplifying the solution procedure rather than the model. The accuracy of the solution is not compromised, and efficient simulations can be performed (digital twins). In the second part, data-driven modelling is employed to fill the gap between the parametric solution, computed using non-intrusive model order reduction techniques, and the measured fields, in order to make dynamic data-driven application systems possible (hybrid twins).
Storoshchuk, Orest Lev. "Model based synchronization of monitoring and control systems /." *McMaster only, 2003.
Singh, Rahul. "A model to integrate Data Mining and On-line Analytical Processing: with application to Real Time Process Control." VCU Scholars Compass, 1999. https://scholarscompass.vcu.edu/etd/5521.
Ferreira, E. (Eija). "Model selection in time series machine learning applications." Doctoral thesis, Oulun yliopisto, 2015. http://urn.fi/urn:isbn:9789526209012.
Model selection is an essential part of solving any practical modelling problem. Since the true model underlying the phenomenon cannot be known, the aim of model selection is to choose, from a set of candidate models, the one closest to it. This dissertation addresses model selection in applications involving time-series data through four steps commonly followed in a machine learning process: data preprocessing, algorithm selection, feature selection, and validation. It examines how the properties and amount of the available data should be taken into account when selecting an algorithm, and how the data should be divided for model training, testing, and validation in order to optimise the model's generalisability and future performance. Specific constraints and requirements for applying conventional machine learning methods to time-series data are also discussed. A particular goal of the work is to highlight the problems of model over-fitting and over-selection that can result from careless or uninformed use of model selection methods. The practical results are based on applying machine learning methods to time-series modelling in three research areas: spot welding, estimation of energy expenditure during physical exercise, and modelling of cognitive load. Based on these results, the dissertation provides general guidelines that can be used as a starting point when tackling a new machine learning problem, particularly from the perspectives of the properties and amount of data, computational resources, and the possible time-series nature of the problem. The influence of practical considerations and constraints imposed by the model's final operating environment on the choice of algorithm is also discussed.
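One concrete guideline from this line of work is that data splitting for time-series models must respect temporal order, otherwise validation scores overstate future performance. A small sketch of walk-forward validation splits (the series, fold count, and sizes are illustrative assumptions):

```python
import numpy as np

def walk_forward_splits(n_samples, n_folds, min_train):
    """Yield (train, test) index pairs that respect temporal order:
    the model is always validated on data that lies after its training set."""
    fold = (n_samples - min_train) // n_folds
    for k in range(n_folds):
        split = min_train + k * fold
        yield np.arange(0, split), np.arange(split, split + fold)

series = np.sin(np.linspace(0, 20, 120))  # stand-in for a real time series
for train, test in walk_forward_splits(len(series), n_folds=4, min_train=40):
    print(f"train on [0, {train[-1]}], validate on [{test[0]}, {test[-1]}]")
```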
GARBARINO, DAVIDE. "Acknowledging the structured nature of real-world data with graphs embeddings and probabilistic inference methods." Doctoral thesis, Università degli studi di Genova, 2022. http://hdl.handle.net/11567/1092453.
Breithaupt, Krista J. "A comparison of the sample invariance of item statistics from the classical test model, item response model, and structural equation model, a case study of real response data." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2001. http://www.collectionscanada.ca/obj/s4/f2/dsk3/ftp04/NQ58266.pdf.
Ruchirat, Piyapathu. "State detection mechanism, productivity bias and a medium open economy real business cycle model for Thailand (1993-2005 data)." Thesis, Cardiff University, 2005. http://orca.cf.ac.uk/55568/.
Yoshida, Jiro 1970. "Effects of uncertainty on the investment decision : an examination of the option-based investment model using Japanese real estate data." Thesis, Massachusetts Institute of Technology, 1999. http://hdl.handle.net/1721.1/70726.
Includes bibliographical references (leaves 49-52).
This paper examines the validity of the option-based investment model as opposed to the neoclassical investment model in the decision-making of commercial real estate development, using aggregate real estate data from Japan. I particularly focus on the effect of uncertainty because it is the central difference between the two models. I specify a structural model in order to incorporate the interactions between supply and demand in the real estate asset market. In order to conduct detailed empirical tests over a long period of time, I construct three data series: the Long Series uses quarterly data covering 25 years, while Short Series 1 and Short Series 2 use monthly data covering about 15 years. I find strong evidence that supports the option-based investment model. Especially in the supply equation, total uncertainty has significant effects on the investment decision. A lag structure is found in the effect of total uncertainty. The parameters for other variables also generally favor the option-based model. In the demand equation, too, the results strongly support the option-based investment model. It should be concluded from these results that various kinds of real options must be incorporated in investment and economic models.
by Jiro Yoshida.
S.M.
Kasinathan, Gokulnath. "Data Transformation Trajectories in Embedded Systems." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-205276.
Mobile positioning works well for locating mobile phones as they move from one place to another. Location services include mobile positioning systems that can be used for a variety of customer services, such as position lookup, positioning on maps, navigation, road traffic management, and emergency calls with positioning. The Mobile Positioning System (MPS) supports complementary positioning methods for 2G, 3G and 4G/LTE (Long Term Evolution) networks. A mobile phone is known as UE (User Equipment) in LTE. This thesis proposes a prototype method for real-movement estimation of massive numbers of UEs in LTE networks. RSRP (Reference Signal Received Power) values and TA (Timing Advance) values are part of the LTE events for a UE. These specific LTE events can be streamed to a system from the eNodeB part of LTE in real time by activating measurements on the UEs in the network. AoA (Angle of Arrival) and TA values are used to calculate a UE's position, where the AoA calculation is performed using the RSRP values. The calculated UE position is filtered using a particle filter (PF) to estimate the movement. To identify real movements and computations for massive numbers of UEs, the LTE event streamer is modelled to produce several task units with event data from massive numbers of UEs. The task-modelled data structures are scheduled over an Arm Cortex-A15 based MPCore with multiple threads. Finally, for the real-movement computation of massive numbers of UEs, the IMSI (International Mobile Subscriber Identity) is used to satisfy the hidden Markov requirements of the particle filter's functionality while maintaining load balance across the four Arm A15 cores. This is achieved through serial and parallel performance techniques. Future work includes decentralized task-level scheduling with a hash function for IMSI, scaling to more cores, and a concentric-circles method for AoA accuracy.
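The movement-estimation step described above is a standard bootstrap particle filter. A minimal sketch, assuming a random-walk motion model and Gaussian measurement noise around a position fix (both are assumptions; the thesis derives its fixes from AoA and TA values):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 1000                                    # number of particles
particles = rng.normal(0.0, 50.0, (N, 2))   # initial 2D position guesses (m)

def pf_step(measured_xy, motion_std=5.0, meas_std=15.0):
    """One bootstrap particle-filter update: predict, weight, resample."""
    global particles
    particles += rng.normal(0.0, motion_std, particles.shape)  # predict
    d2 = np.sum((particles - measured_xy) ** 2, axis=1)
    weights = np.exp(-0.5 * d2 / meas_std**2)                  # weight
    weights /= weights.sum()
    particles = particles[rng.choice(N, size=N, p=weights)]    # resample
    return particles.mean(axis=0)                              # state estimate

print(pf_step(np.array([12.0, -7.0])))  # filtered position after one fix
```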
Rodittis, Katherine, and Patrick Mattingly. "USING MICROSOFT’S COMPONENT OBJECT MODEL (COM) TO IMPROVE REAL-TIME DISPLAY DEVELOPMENT FOR THE ADVANCED DATA ACQUISITION AND PROCESSING SYSTEM (ADAPS)." International Foundation for Telemetering, 2000. http://hdl.handle.net/10150/606801.
Full textMicrosoft’s Component Object Model (COM) allows us to rapidly develop display and analysis features for the Advanced Data Acquisition and Processing System (ADAPS).
Zámečníková, Eva. "FORMÁLNÍ MODEL ROZHODOVACÍHO PROCESU PRO ZPRACOVÁNÍ VYSOKOFREKVENČNÍCH DAT." Doctoral thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2017. http://www.nusl.cz/ntk/nusl-412586.
Brandão, Jesse Wayde. "Analysis of the truncated response model for fixed priority on HMPSoCs." Master's thesis, Universidade de Aveiro, 2014. http://hdl.handle.net/10773/14836.
With the ever more ubiquitous nature of embedded systems and their increasingly demanding applications, such as audio/video decoding and networking, the popularity of MultiProcessor Systems-on-Chip (MPSoCs) continues to increase. As such, their modern uses often involve the execution of multiple applications on the same system. Embedded systems often run applications that are faced with timing restrictions, such as deadlines, throughput and latency. The resources available to the applications running on these systems are finite and, therefore, applications need to share the available resources while guaranteeing that their timing requirements are met. These guarantees are established via schedulers, which may employ some of the many techniques devised for the arbitration of resource usage among applications. The main technique considered in this dissertation is Preemptive Fixed Priority (PFP) scheduling. Also, there is a growing trend in the usage of the dataflow computational model for the analysis of applications on MPSoCs. Dataflow graphs are functionally intuitive and have interesting and useful analytical properties. This dissertation furthers previous work on the temporal analysis of PFP scheduling of real-time applications on MPSoCs by implementing the truncated response model for PFP scheduling and analyzing its results. This response model promises tighter bounds on the worst-case response times of the actors in a low-priority dataflow graph by considering the worst-case response times over consecutive firings of an actor rather than just a single firing. As a follow-up to this work, we also introduce in this dissertation a burst analysis technique for actors in a dataflow graph.
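The truncated response model the thesis analyses extends the classic worst-case response-time recurrence for preemptive fixed-priority scheduling from a single activation to consecutive firings. As a baseline, a sketch of the standard single-activation analysis with implicit deadlines (the task set is invented):

```python
import math

def worst_case_response(tasks, i):
    """Fixed-point iteration of R = C_i + sum_j ceil(R / T_j) * C_j over all
    higher-priority tasks j; `tasks` holds (C, T) pairs, highest priority first."""
    C, T = tasks[i]
    higher = tasks[:i]
    R = C
    while True:
        R_next = C + sum(math.ceil(R / Tj) * Cj for Cj, Tj in higher)
        if R_next == R:
            return R                # converged: worst-case response time
        if R_next > T:
            return None             # misses its deadline (= period)
        R = R_next

tasks = [(1, 4), (2, 6), (3, 13)]   # (execution time, period)
print([worst_case_response(tasks, i) for i in range(len(tasks))])  # [1, 3, 10]
```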
Gonella, Philippe. "Business Process Management and Process Mining within a Real Business Environment: An Empirical Analysis of Event Logs Data in a Consulting Project." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2016. http://amslaurea.unibo.it/11799/.
Cusinato, Rafael Tiecher. "Ensaios sobre previsão de inflação e análise de dados em tempo real no Brasil." Biblioteca Digital de Teses e Dissertações da UFRGS, 2009. http://hdl.handle.net/10183/22654.
This thesis presents three essays on inflation forecasting and real-time data analysis in Brazil. Using a Phillips curve, the first essay presents an "evolutionary model" to forecast Brazilian inflation. The evolutionary model consists of a combination of a non-linear model (itself a combination of three artificial neural networks, ANNs) and a linear model (which also serves as a benchmark for comparison purposes). Some parameters of the evolutionary model, including the combination weight, evolve over time according to adjustments defined by three algorithms that evaluate the out-of-sample errors. The ANNs were estimated using a hybrid approach based on a genetic algorithm (GA) and a Nelder-Mead simplex algorithm. In a 3-, 6-, 9- and 12-steps-ahead out-of-sample forecasting experiment, the performance of the evolutionary model was compared to that of the benchmark linear model according to the root mean squared error (RMSE) and mean absolute error (MAE) criteria. The evolutionary model performed better than the linear model for all forecasting steps analyzed, according to both criteria. The second essay is motivated by recent literature on real-time data analysis, which has shown that several measures of economic activity go through important data revisions over time, implying important limitations to the use of these measures. We developed a real-time GDP data set for the Brazilian economy and analyzed the extent to which GDP growth and output gap series are revised over time. We showed that revisions to GDP growth (quarter-on-quarter) are economically relevant, although GDP growth revisions lose part of their importance as the aggregation period increases (for example, four-quarter growth). To analyze the output gap revisions, we applied four detrending methods: the Hodrick-Prescott filter, the linear trend, the quadratic trend, and the Harvey-Clark model of unobservable components. All methods showed revisions of economically relevant magnitude. In a general way, both GDP data revisions and the low accuracy of end-of-sample output trend estimates were relevant sources of output gap revisions. The third essay is also a study of real-time data, but focused on industrial production (IP) data and industrial production gap estimates. We showed that revisions to IP growth (month-on-month) and to IP quarterly moving average growth are economically relevant, although IP growth revisions become less important as the aggregation period increases (for example, twelve-month growth). To analyze the output gap revisions, we applied three detrending methods: the Hodrick-Prescott filter, the linear trend, and the quadratic trend. All methods showed revisions of economically relevant magnitude. In general, both IP data revisions and the low accuracy of end-of-sample IP trend estimates were relevant sources of IP gap revisions, although the results suggest some prevalence of revisions originated from the low accuracy of end-of-sample estimates.
Pai, Yu-Jou. "Risks in Financial Markets." University of Cincinnati / OhioLINK, 2020. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1584003500272517.
Cattenoz, Mathieu. "MIMO Radar Processing Methods for Anticipating and Preventing Real World Imperfections." Thesis, Paris 11, 2015. http://www.theses.fr/2015PA112077/document.
The MIMO radar concept promises numerous advantages compared to today's radar architectures: flexibility for transmit beampattern design, including wide scene illumination and fine resolution after processing, and reduced system complexity, through the use of fewer antennas and the possibility of transferring system control and calibration to the digital domain. However, the MIMO radar is still at the stage of theoretical concept, with insufficient consideration for the impacts of the waveforms' lack of orthogonality and of system hardware imperfections. The ambition of this thesis is to contribute to paving the way to the operational MIMO radar. In this perspective, this thesis work consists in anticipating and compensating for the imperfections of the real world with processing techniques. The first part deals with MIMO waveform design, and we show that phase-code waveforms are optimal in terms of spatial resolution. We also exhibit their limits in terms of sidelobe appearance at the matched filter output. The second part takes on the waveforms' intrinsic imperfections and proposes data-dependent processing schemes for the rejection of the induced residual sidelobes. We develop an extension of Orthogonal Matching Pursuit (OMP) that satisfies operational requirements, especially robustness to localization errors, low computational complexity, and no need for training data. The third part deals with processing robustness to signal model mismatch, especially how it can be prevented or anticipated to avoid performance degradation. In particular, we propose a digital method of transmitter phase calibration. The last part consists in carrying out experiments in real conditions with the Hycam MIMO radar testbed. We show that some unanticipated distortions encountered, even when limited at the matched filter output, can greatly impact the detection performance of the data-dependent processing methods.
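The sidelobe behaviour at the matched-filter output is easy to reproduce for a classic phase-coded waveform. A sketch using a Barker-13 code in the ideal, distortion-free, single-transmitter case (the code choice is illustrative; the thesis deals with sets of imperfectly orthogonal MIMO waveforms):

```python
import numpy as np

# Autocorrelation of a Barker-13 phase code: a 13x mainlobe over unit sidelobes.
barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1], dtype=float)
matched_output = np.correlate(barker13, barker13, mode="full")
peak = np.abs(matched_output).max()
sidelobes = np.abs(matched_output[matched_output != peak])
print(f"peak sidelobe level: {20 * np.log10(sidelobes.max() / peak):.1f} dB")
# about -22.3 dB; imperfect orthogonality between waveforms raises this floor
```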
Trapp, Matthias. "Analysis and exploration of virtual 3D city models using 3D information lenses." Master's thesis, Universität Potsdam, 2007. http://opus.kobv.de/ubp/volltexte/2008/1393/.
This diploma thesis covers real-time rendering techniques for 3D information lenses based on the focus-&-context metaphor. Their applicability to objects and structures of virtual 3D city models is analyzed, designed, implemented, and evaluated. In contrast to the domain of 3D terrain models, focus-&-context visualization for virtual 3D city models has barely been investigated, although the targeted visualization of context-related data for objects is of great importance for interactive exploration and analysis. Programmable graphics hardware allows the implementation of new lens techniques aiming to increase the perceptual and cognitive quality of the visualization compared to classical perspective projections. For a selection of 3D information lenses, integration into a 3D scene-graph system is carried out: occlusion lenses modify the appearance of virtual 3D city model objects in order to resolve occlusions and thus ease navigation; best-view lenses show city model objects in a priority-defined manner and convey meta-information about virtual 3D city models, thereby supporting their exploration and navigation; color and deformation lenses modify the appearance and geometry of regions of a 3D city model in order to enhance their perception. The techniques for 3D information lenses presented in this work, and their application to virtual 3D city models, illustrate their potential for interactive visualization and form a basis for further developments.
Kalapati, Raga S. "Analysis of Ozone Data Trends as an Effect of Meteorology and Development of Forecasting Models for Predicting Hourly Ozone Concentrations and Exceedances for Dayton, OH, Using MM5 Real-Time Forecasts." University of Toledo / OhioLINK, 2004. http://rave.ohiolink.edu/etdc/view?acc_num=toledo1091216133.
Vásquez Huanchi, Miriam Elizabeth. "El impacto del tipo de cambio real y su volatilidad en el desempeño de las exportaciones de América Latina durante el periodo 1989-2018." Bachelor's thesis, Universidad Peruana de Ciencias Aplicadas (UPC), 2020. http://hdl.handle.net/10757/653817.
This research examines the impact of real exchange rate volatility, as a proxy for exchange rate uncertainty, on the performance of total exports for a panel of Latin American countries over the period 1989-2018. Variables such as the export gap, the real exchange rate gap, the gross domestic product gap, the world demand gap, and the terms-of-trade gap are used. The behavior of real exchange rate volatility is estimated through GARCH models. A panel vector autoregression (VAR) model is estimated for a balanced sample of five Latin American countries (Argentina, Brazil, Chile, Mexico, and Peru) for the period 1989-2018. The results suggest that real exchange rate volatility has a negative effect on the exports of the selected countries. Additionally, this research is relevant because it provides empirical evidence from countries with different economic characteristics for understanding the effect of variations in the real exchange rate on export performance and, therefore, on the stability of economic growth.
Research paper
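The volatility proxy used in the study above comes from GARCH modelling. A sketch of the GARCH(1,1) conditional-variance recursion with illustrative parameters (in the study they would be fitted by maximum likelihood to real exchange-rate returns):

```python
import numpy as np

def garch11_vol(returns, omega, alpha, beta):
    """Conditional volatility from h_t = omega + alpha*r_{t-1}^2 + beta*h_{t-1}."""
    h = np.empty_like(returns)
    h[0] = returns.var()              # a common initialisation choice
    for t in range(1, len(returns)):
        h[t] = omega + alpha * returns[t - 1] ** 2 + beta * h[t - 1]
    return np.sqrt(h)

rng = np.random.default_rng(2)
r = rng.normal(0, 0.01, 500)          # stand-in for real exchange rate returns
print(garch11_vol(r, omega=1e-6, alpha=0.08, beta=0.90)[-5:])
```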
Wahlberg, Fredrik. "Parallel algorithms for target tracking on multi-core platform with mobile LEGO robots." Thesis, Uppsala universitet, Avdelningen för systemteknik, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-155537.
Allen, Brett. "Learning body shape models from real-world data /." Thesis, University of Washington, 2005. http://hdl.handle.net/1773/6969.
Truzzi, Stefano. "Event classification in MAGIC through Convolutional Neural Networks." Doctoral thesis, Università di Siena, 2022. http://hdl.handle.net/11365/1216295.
Pettersson, Emma, and Johanna Karlsson. "Design för översikt av kontinuerliga dataflöden : En studie om informationsgränssnitt för energimätning till hjälp för fastighetsbolag." Thesis, Linnéuniversitetet, Institutionen för informatik (IK), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-78510.
Software and interfaces are today a natural part of our everyday lives, and developing useful and successful interfaces is in companies' interest, as it can lead to more satisfied customers. The problem addressed in this report is based on user surveys conducted on an energy-monitoring information interface used by professionals in the real estate industry. The company that owns the software conducted a survey indicating that the software's usability needed improvement, and the project team was assigned to develop it further. The further development is based on DeLone and McLean's (2003) Information System Success Model as well as the concepts of information design, usability, and featuritis. The theoretical background built on these formed the basis for the qualitative interviews and questionnaires that were conducted, and it also provided the basis for the interface proposals that were finally presented in the project (see Figure 6). The results of the survey showed that users and support staff had rather different experiences of the software. The conclusions that could be drawn about how an information interface should be designed to support the user were the following: it should follow conventional design patterns; the design should be consistent throughout the software; it should use an adapted and clear language; and it should either be so clear and intuitive that anyone can understand the software or offer a clear manual.
Chen, Kui. "Modeling and estimation of degradation for PEM fuel cells in real conditions of use for mobile applications." Thesis, Bourgogne Franche-Comté, 2020. http://www.theses.fr/2020UBFCA022.
Proton exchange membrane fuel cells (PEMFC) are a clean energy source with merits such as high energy efficiency, low noise, low operating temperature, and zero pollutant emissions. However, the short lifetime caused by degradation has a great impact on the integration of PEMFC into transportation systems. Prognostics and health management is an important way to improve the performance and remaining useful life of PEMFC. This thesis proposes five degradation prognosis methods for PEMFC. The thesis considers the influence of the main operating conditions, including load current, temperature, hydrogen pressure, and relative humidity, on the degradation of PEMFC. The global degradation trend and reversible phenomena are analyzed on the basis of data from three PEMFC experiments conducted under different conditions of use (a fleet of 10 PEMFC vehicles and two laboratory test benches). First, a model-driven method based on the unscented Kalman filter algorithm and a voltage degradation model is presented to predict the degradation of PEMFC in fuel cell electric vehicles. Then, a hybrid method based on wavelet analysis, an extreme learning machine, and a genetic algorithm is proposed to build the degradation model of PEMFC. To forecast the degradation of PEMFC with limited experimental data, an improved data-driven method based on the combination of a grey neural network model, particle swarm optimization, and moving window methods is used to develop the third model. The fourth contribution is an aging prognosis model for PEMFC operating under different conditions, using a backpropagation neural network and an evolutionary algorithm. Finally, a degradation prognosis method for PEMFC based on a wavelet neural network and the cuckoo search algorithm is proposed to predict the remaining useful life of PEMFC.
Hartmann, Daniel. "Stock markets and real-time macroeconomic data /." Hamburg : Kovač, 2007. http://d-nb.info/985325682/04.
Rao, Ashwani Pratap. "Statistical information retrieval models: Experiments, evaluation on real time data." Thesis, University of Delaware, 2014. http://pqdtopen.proquest.com/#viewpdf?dispub=1567821.
We are all aware of the rise of the information age: heterogeneous sources of information and the ability to publish rapidly and indiscriminately are responsible for information chaos. In this work, we are interested in a system that can separate the "wheat" of vital information from the chaff within this information chaos. An efficient filtering system can accelerate the meaningful utilization of knowledge. Consider Wikipedia, an example of community-driven knowledge synthesis. Facts about topics on Wikipedia are continuously updated by users interested in a particular topic. Consider an automatic system (an invisible robot) to which a topic such as "President of the United States" can be fed. This system will work ceaselessly, filtering new information created on the web in order to provide the small set of documents about the "President of the United States" that are vital to keeping the Wikipedia page relevant and up-to-date. In this work, we present an automatic information filtering system for this task. While building such a system, we have encountered issues related to scalability, retrieval algorithms, and system evaluation; we describe our efforts to understand and overcome these issues.
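A minimal stand-in for the kind of topic filter the abstract describes is TF-IDF weighting with cosine scoring; the documents and topic below are invented, and the statistical retrieval models evaluated in the thesis are considerably more sophisticated:

```python
import math
from collections import Counter

docs = [
    "the president of the united states signed a bill",
    "a recipe for sourdough bread with rye flour",
    "white house press briefing by the president",
]
topic = "president of the united states"

def tf_idf(texts):
    """Plain TF-IDF vectors as {word: weight} dictionaries."""
    tokenized = [t.split() for t in texts]
    df = Counter(w for toks in tokenized for w in set(toks))
    n = len(texts)
    return [{w: c * math.log(n / df[w]) for w, c in Counter(toks).items()}
            for toks in tokenized]

def cosine(u, v):
    dot = sum(u[w] * v[w] for w in u.keys() & v.keys())
    norm = math.sqrt(sum(x * x for x in u.values())) \
         * math.sqrt(sum(x * x for x in v.values()))
    return dot / norm if norm else 0.0

*doc_vecs, query = tf_idf(docs + [topic])
print(sorted(range(len(docs)), key=lambda i: -cosine(doc_vecs[i], query)))
# [0, 2, 1]: the two presidential documents pass the filter first
```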
Pande, Anurag. "ESTIMATION OF HYBRID MODELS FOR REAL-TIME CRASH RISK ASSESSMENT ON FREEWAYS." Doctoral diss., University of Central Florida, 2005. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/3016.
Ph.D.
Department of Civil and Environmental Engineering
Engineering and Computer Science
Civil Engineering
Buchholz, Henrik. "Real-time visualization of 3D city models." PhD thesis, Universität Potsdam, 2006. http://opus.kobv.de/ubp/volltexte/2007/1333/.
Cheng, Andersson Penny Peng. "Yield curve estimation models with real market data implementation and performance observation." Thesis, Mälardalens högskola, Akademin för utbildning, kultur och kommunikation, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-52399.
Colombelli, Simona <1986>. "Early Warning For Large Earthquakes: Observations, Models and Real-Time Data Analysis." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2014. http://amsdottorato.unibo.it/6339/1/colombelli_simona_tesi.pdf.
Lauzeral, Nathan. "Reduced order and sparse representations for patient-specific modeling in computational surgery." Thesis, Ecole centrale de Nantes, 2019. http://www.theses.fr/2019ECDN0062.
This thesis investigates the use of model order reduction methods based on sparsity-related techniques for the development of real-time biophysical modeling. In particular, it focuses on embedding interactive biophysical simulation into patient-specific models of tissues and organs to enhance medical images and assist the clinician in the process of informed decision making. In this context, three fundamental bottlenecks arise. The first lies in embedding the shape parametrization into the parametric reduced order model to faithfully represent the patient's anatomy. A non-intrusive approach relying on a sparse sampling of the space of anatomical features is introduced and validated. Then, we tackle the problem of data completion and image reconstruction from partial or incomplete datasets based on physical priors. The proposed solution has the potential to perform scene registration in the context of augmented reality for laparoscopy. Quasi-real-time computations are reached by using a new hyperreduction approach based on a sparsity-promoting technique. Finally, the third challenge concerns the representation of biophysical systems under uncertainty of the underlying parameters. It is shown that traditional model order reduction approaches are not always successful in producing a low-dimensional representation of a model, in particular in the case of electrosurgery simulation. An alternative is proposed using a metamodeling approach. To this end, we successfully extend the use of sparse regression methods to the case of systems with stochastic parameters.
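The reduced-basis idea underlying the thesis can be sketched with plain SVD-based proper orthogonal decomposition (POD) on a snapshot matrix; the snapshots here are fabricated, and the thesis combines such bases with sparse sampling and hyperreduction:

```python
import numpy as np

rng = np.random.default_rng(3)
# Snapshot matrix: each column is a full-order solution for one parameter value
# (fabricated from two smooth spatial modes plus noise; 1000 dofs, 50 snapshots).
x = np.linspace(0, 1, 1000)[:, None]
snapshots = np.sin(np.pi * x) @ rng.normal(size=(1, 50)) \
          + 0.1 * np.sin(3 * np.pi * x) @ rng.normal(size=(1, 50)) \
          + 1e-3 * rng.normal(size=(1000, 50))

U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.9999)) + 1   # modes for 99.99% of the energy
basis = U[:, :r]                               # reduced basis: project the
print(f"{r} POD modes out of 50 snapshots")    # full model onto these columns
```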
De Wulf, Martin. "From timed models to timed implementations." Doctoral thesis, Université Libre de Bruxelles, 2006. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/210797.
Computer science is currently facing a grand challenge: finding good design practices for embedded systems. Embedded systems are essentially computers interacting with some physical process; you could find one in a braking system or in a nuclear power plant, for example. They present several design difficulties: first, they are reactive systems, interacting indefinitely with their environment. Second, they must satisfy real-time constraints specifying when they should respond, and not only how. Finally, their environment is often deeply continuous, presenting complex dynamics. The formal models of choice for specifying such systems are timed and hybrid automata, for which model checking is fairly well studied.
In the first part of this thesis, we study a complete design approach, including verification and code generation, for timed automata. We define a new semantics for timed automata, the AASAP semantics, which preserves the decidability properties for model checking and is at the same time implementable. Our notion of implementability is completely novel and relies on the simulation of a semantics that is obviously implementable on a real platform. We wrote tools for analysis and code generation and exemplify them on a case study about the well-known Philips Audio Control Protocol.
In the second part of this thesis, we study the problem of controller synthesis for an environment specified as a hybrid automaton. We give a new solution for discrete controllers having only imperfect information about the state of the system. In the process, we define a new algorithm, based on the monotonicity of the controllable predecessors operator, for efficiently finding a controller, and we show some promising applications on a classical problem: the universality test for finite automata.
Doctorat en sciences, Spécialisation Informatique
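De Wulf's synthesis procedure rests on iterating a monotone controllable-predecessors operator to a greatest fixpoint. A toy sketch on a finite safety game; the states, actions, and safe set are invented for illustration, whereas the thesis works on (imperfect-information) timed and hybrid systems:

```python
# Safety game: the controller picks an action, then the environment picks any
# successor of that action. moves: state -> action -> set of possible successors.
moves = {
    "s0": {"a": {"s1"}, "b": {"s2", "bad"}},
    "s1": {"a": {"s0"}},
    "s2": {"a": {"bad"}},
    "bad": {},
}

def winning_region(safe):
    """Greatest fixpoint of W = {s in W : some action keeps all successors in W}.
    Termination and correctness follow from the monotonicity of this operator."""
    W = set(safe)
    while True:
        W_next = {s for s in W
                  if any(succ <= W for succ in moves[s].values())}
        if W_next == W:
            return W
        W = W_next

print(winning_region({"s0", "s1", "s2"}))  # {'s0', 's1'}: s2 is forced into bad
```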