Dissertations / Theses on the topic 'Predictive Spectral Analysis'


Consult the top 31 dissertations / theses for your research on the topic 'Predictive Spectral Analysis.'


1

Chhatwal, Harprit Singh. "Spectral modelling techniques for speech signals based on linear predictive analysis." Thesis, Imperial College London, 1988. http://hdl.handle.net/10044/1/46996.

2

Losik, Len. "Adapting Fourier Analysis for Predicting Earth, Mars and Lunar Orbiting Satellite's Telemetry Behavior." International Foundation for Telemetering, 2011. http://hdl.handle.net/10150/595773.

Abstract:
ITC/USA 2011 Conference Proceedings / The Forty-Seventh Annual International Telemetering Conference and Technical Exhibition / October 24-27, 2011 / Bally's Las Vegas, Las Vegas, Nevada
Prognostic technology uses a series of algorithms that, combined, form a prognostic-based inference engine (PBIE) for identifying deterministic behavior embedded in completely normal-appearing telemetry from fully functional equipment. The algorithms used to define normal behavior in the PBIE, from which deterministic behavior is identified, can be adapted to quantify normal spacecraft telemetry behavior while in orbit about a moon or planet or during interplanetary travel. Time-series analog engineering data (telemetry) from orbiting satellites and interplanetary spacecraft are shaped by harmonic and non-harmonic influences. Spectrum analysis can be used to understand and quantify the fundamental behavior of spacecraft analog telemetry and to relate the frequency and phase of that behavior to its time-series behavior through Fourier analysis.
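The abstract gives no implementation, but the core step it describes (relating a telemetry channel's behavior to frequency and phase through Fourier analysis) can be sketched roughly as follows; the synthetic "telemetry" signal, sample interval, and orbital period are illustrative assumptions, not data from the paper.

```python
# Minimal sketch: estimate the dominant harmonic content of a telemetry-like time series.
import numpy as np

t = np.arange(0, 7 * 24 * 3600, 60)  # one week of samples, every 60 s (assumed cadence)
orbit_period = 5400.0                # assumed 90-minute orbital period
signal = (25.0                       # mean temperature-like level
          + 3.0 * np.sin(2 * np.pi * t / orbit_period)   # harmonic (orbital) influence
          + 0.5 * np.random.randn(t.size))               # non-harmonic noise

spectrum = np.fft.rfft(signal - signal.mean())
freqs = np.fft.rfftfreq(signal.size, d=60.0)             # Hz
amplitude = np.abs(spectrum) * 2.0 / signal.size
phase = np.angle(spectrum)

peak = np.argmax(amplitude[1:]) + 1                      # skip the DC bin
print(f"dominant period: {1.0 / freqs[peak] / 60.0:.1f} min, "
      f"amplitude: {amplitude[peak]:.2f}, phase: {phase[peak]:.2f} rad")
```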
3

Losik, Len. "Adapting Fourier Analysis for Predicting Earth, Mars and Lunar Orbiting Satellite's Telemetry Behavior." International Foundation for Telemetering, 2010. http://hdl.handle.net/10150/604279.

Abstract:
ITC/USA 2010 Conference Proceedings / The Forty-Sixth Annual International Telemetering Conference and Technical Exhibition / October 25-28, 2010 / Town and Country Resort & Convention Center, San Diego, California
Prognostic technology uses a series of algorithms that, combined, form a prognostic-based inference engine (PBIE) for identifying deterministic behavior embedded in completely normal-appearing telemetry from fully functional equipment. The algorithms used to define normal behavior in the PBIE, from which deterministic behavior is identified, can be adapted to quantify normal spacecraft telemetry behavior while in orbit about a moon or planet or during interplanetary travel. Time-series analog engineering data (telemetry) from orbiting satellites and interplanetary spacecraft are shaped by harmonic and non-harmonic influences. Spectrum analysis can be used to understand and quantify the fundamental behavior of spacecraft analog telemetry and to relate the frequency and phase of that behavior to its time-series behavior through Fourier analysis.
4

Losik, Len. "Using Telemetry Science, An Adaptation of Prognostic Algorithms for Predicting Normal Space Vehicle Telemetry Behavior from Space for Earth and Lunar Satellites and Interplanetary Spacecraft." International Foundation for Telemetering, 2009. http://hdl.handle.net/10150/606150.

Abstract:
ITC/USA 2009 Conference Proceedings / The Forty-Fifth Annual International Telemetering Conference and Technical Exhibition / October 26-29, 2009 / Riviera Hotel & Convention Center, Las Vegas, Nevada
Prognostic technology uses a series of algorithms that, combined, form a prognostic-based inference engine (PBIE) for identifying deterministic behavior embedded in completely normal-appearing telemetry from fully functional equipment. The algorithms used to define normal behavior in the PBIE, from which deterministic behavior is identified, can be adapted to quantify normal spacecraft telemetry behavior while in orbit about a moon or planet or during interplanetary travel. Time-series analog engineering data (telemetry) from orbiting satellites and interplanetary spacecraft are shaped by harmonic and non-harmonic influences. Spectrum analysis can be used to understand and quantify the fundamental behavior of spacecraft analog telemetry and to relate the frequency and phase of that behavior to its time-series behavior through Fourier analysis.
5

Guldemir, Hanifi. "Prediction of induction motor line current spectra from design data." Thesis, University of Nottingham, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.287180.

6

Kwag, Jae-Hwan. "A comparative study of LP methods in MR spectral analysis /." free to MU campus, to others for purchase, 1999. http://wwwlib.umi.com/cr/mo/fullcit?p9962536.

7

Wang, Nancy. "Spectral Portfolio Optimisation with LSTM Stock Price Prediction." Thesis, KTH, Matematisk statistik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-273611.

Abstract:
Nobel Prize-winning modern portfolio theory (MPT) is considered one of the most important and influential economic theories in finance and investment management. MPT assumes investors are risk-averse and uses the variance of asset returns as a proxy for risk when maximising the performance of a portfolio. Successful portfolio management thus relies on accurate risk estimates and asset return predictions. Risk estimates are commonly obtained through traditional asset pricing factor models, which allow the systematic risk to vary over the time domain but not across frequencies. This approach can impose limitations in, for instance, risk estimation. To tackle this shortcoming, interest in applying spectral analysis to financial time series has increased lately; examples include the novel spectral portfolio theory and the spectral factor model, which demonstrate enhanced portfolio performance through spectral risk estimation [1][11]. Moreover, stock price prediction has always been a challenging task due to its non-linearity and non-stationarity. Meanwhile, machine learning has been successfully implemented in a wide range of applications where the needed tasks are infeasible to accomplish traditionally. Recent research has demonstrated significant results in single stock price prediction by artificial LSTM neural networks [6][34]. This study aims to evaluate the combined effect of these two advancements in a portfolio optimisation problem and to optimise a spectral portfolio with stock prices predicted by LSTM neural networks. To do so, we began with mathematical derivation and theoretical presentation and then evaluated the portfolio performance generated by the spectral risk estimates and the LSTM stock price predictions, as well as the combination of the two. The result demonstrates that the LSTM predictions alone performed better than the combination, which in turn performed better than the spectral risk alone.
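As a rough illustration of the "spectral risk" idea (return variance decomposed across frequencies rather than summarized by a single number), the sketch below splits the sample variance of a simulated return series into frequency bands using a periodogram. The band edges and simulated returns are assumptions for illustration; this is not the thesis's estimator.

```python
# Minimal sketch: decompose return variance across frequency bands (spectral risk idea).
import numpy as np
from scipy.signal import periodogram

rng = np.random.default_rng(0)
returns = 0.001 + 0.02 * rng.standard_normal(1000)   # assumed daily returns

freqs, psd = periodogram(returns, fs=1.0)            # cycles per trading day
bands = [(0.0, 0.05), (0.05, 0.2), (0.2, 0.5)]       # long-, mid-, short-horizon bands (assumed)

total_var = np.trapz(psd, freqs)
for lo, hi in bands:
    mask = (freqs >= lo) & (freqs < hi)
    band_var = np.trapz(psd[mask], freqs[mask])
    print(f"band {lo:.2f}-{hi:.2f} cyc/day: {100 * band_var / total_var:.1f}% of variance")
```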
8

Bahrampouri, Mahdi. "Ground Motion Prediction Equations for Non-Spectral Parameters using the KiK-net Database." Thesis, Virginia Tech, 2017. http://hdl.handle.net/10919/87704.

Abstract:
The KiK-net ground motion database is used to develop ground motion prediction equations for Arias Intensity (Ia), 5-95% Significant Duration (Ds5-95), and 5-75% Significant Duration (Ds5-75). Relationships are developed both for shallow crustal earthquakes and for subduction zone earthquakes (hypocentral depth less than 45 km). The models consider site amplification using VS30 and the depth to a layer with VS=800 m/s (h800). We observe that the site effect is magnitude dependent. For Ds5-95 and Ds5-75, we also observe strong magnitude dependency in distance attenuation. We compare the results with previous GMPEs for Japanese earthquakes and find that the relationships are similar. The results of this study also allow a comparison between earthquakes in shallow crustal regions and subduction regions. This comparison shows that Arias Intensity has similar magnitude and distance scaling in both regions, and the Arias Intensity of shallow crustal motions is generally higher than that of subduction motions. On the other hand, the duration of shallow crustal motions is longer than that of subduction earthquakes, except for records from distant, small-magnitude causative earthquakes. Because small shallow crustal events saturate with distance, ground motions at large distances and small magnitudes have shorter durations for shallow crustal events than for subduction earthquakes.
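To make the regression idea concrete, here is a minimal sketch of fitting a GMPE-style functional form by least squares; the functional form, synthetic records, and coefficient values are illustrative assumptions, not the equations developed in the thesis.

```python
# Minimal sketch: fit ln(Ia) = c0 + c1*M + c2*ln(R) + c3*ln(VS30/760) on synthetic records.
import numpy as np

rng = np.random.default_rng(1)
n = 500
M = rng.uniform(4.0, 8.0, n)              # magnitude
R = rng.uniform(5.0, 300.0, n)            # distance, km
VS30 = rng.uniform(150.0, 900.0, n)       # site stiffness, m/s

true = np.array([-4.0, 1.5, -2.0, -0.6])  # assumed coefficients used to generate the data
X = np.column_stack([np.ones(n), M, np.log(R), np.log(VS30 / 760.0)])
ln_Ia = X @ true + 0.7 * rng.standard_normal(n)

coef, *_ = np.linalg.lstsq(X, ln_Ia, rcond=None)
print("estimated coefficients:", np.round(coef, 2))
```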
9

Winn, Olivia, and Sivaram Kiran Thekkemadathil. "Near-Infrared Spectral Measurements and Multivariate Analysis for Predicting Glass Contamination of Boiler Fuel." Thesis, Mälardalens högskola, Akademin för ekonomi, samhälle och teknik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-36058.

Abstract:
This degree project investigates how glass contamination in refuse-derived fuel for a fluidised bed boiler can be detected using near-infrared spectroscopy. It is motivated by the potential to reduce greenhouse gas emissions by replacing fossil fuels with refuse-derived fuel. The intent was to develop a multivariate predictive model of near-infrared spectral data to detect the presence of glass cullet against a background material that represents refuse-derived fuel. Existing literature was reviewed to confirm the usage of near-infrared spectroscopy as a sensing technology and to determine the necessity of glass detection. Four background materials were chosen to represent the main components in municipal solid waste: wood shavings, shredded coconut, dry rice and whey powder. Samples of glass mixed with the background material were imaged using near-infrared spectroscopy, and the resulting data were pre-processed and analysed using partial least squares regression. It was shown that a predictive model for quantifying coloured glass cullet content in any one of several background materials was reasonably accurate, with a validation coefficient of determination of 0.81 between the predicted and reference data. Models that used data from a single type of background material, wood shavings, were more accurate. Models for quantifying clear glass cullet content were significantly less accurate. These types of models could be applied to predict coloured glass content in different kinds of background materials. However, the presence of clear glass in municipal solid waste, and thus in refuse-derived fuel, limits the opportunities to apply these methods to the detection of glass contamination in fuel.
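A minimal sketch of the modelling step (partial least squares regression from NIR spectra to a glass-content reference value) using scikit-learn; the synthetic spectra, number of components, and variable names are assumptions, not the project's data or settings.

```python
# Minimal sketch: PLS regression from NIR spectra to glass content (synthetic data).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(2)
n_samples, n_wavelengths = 200, 256
X = rng.normal(size=(n_samples, n_wavelengths)).cumsum(axis=1)   # smooth, spectrum-like curves
y = X[:, 80] - 0.5 * X[:, 200] + rng.normal(scale=0.5, size=n_samples)  # assumed glass-content proxy

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
pls = PLSRegression(n_components=10).fit(X_tr, y_tr)
print("validation R^2:", round(r2_score(y_te, pls.predict(X_te).ravel()), 3))
```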
10

Badenhorst, Dirk Jakobus Pretorius. "Improving the accuracy of prediction using singular spectrum analysis by incorporating internet activity." Thesis, Stellenbosch : Stellenbosch University, 2013. http://hdl.handle.net/10019.1/80056.

Abstract:
Thesis (MComm)--Stellenbosch University, 2013.
ENGLISH ABSTRACT: Researchers and investors have been attempting to predict stock market activity for years. The possible financial gain that accurate predictions would offer lit a flame of greed and drive that inspired all kinds of researchers. However, after many of these researchers had failed, they started to hypothesize that such a goal is not only improbable but impossible. Previous predictions were based on historical data of the stock market activity itself and would often incorporate different types of auxiliary data. This auxiliary data ranged as far as imagination allowed in an attempt to find some correlation and some insight into the future that could in turn lead to the figurative pot of gold. More often than not, the auxiliary data did not prove helpful. However, with the birth of the internet, endless new sources of auxiliary data presented themselves. In this thesis I propose that the near-infinite amount of data available on the internet could provide us with information that improves stock market predictions. With this goal in mind, the different sources of information available on the internet are considered. Previous studies on similar topics presented possible ways in which we can measure internet activity that might relate to stock market activity. These studies also gave some insight into the advantages and disadvantages of using some of these sources. These considerations are investigated in this thesis. Since much of this work is based on the prediction of a time series, it was necessary to choose a prediction algorithm. Previously used linear methods seemed too simple for predicting stock market activity, and a newer non-linear method, called Singular Spectrum Analysis, is therefore considered. A detailed study of this algorithm is done to ensure that it is an appropriate prediction methodology to use. Furthermore, since we will be including auxiliary information, multivariate extensions of this algorithm are considered as well. Some of the inaccuracies and inadequacies of current multivariate extensions are studied, and an alternative multivariate technique is proposed and tested. This alternative approach addresses the inadequacies of existing methods. With the appropriate methodology and sources of auxiliary information chosen, a concluding chapter assesses whether predictions that include auxiliary information (obtained from the internet) improve on baseline predictions based simply on historical stock market data.
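For readers unfamiliar with the method, here is a minimal univariate Singular Spectrum Analysis sketch (embedding, SVD, and reconstruction by diagonal averaging); the window length and synthetic series are assumptions, and the thesis's multivariate extension is not shown.

```python
# Minimal sketch: univariate Singular Spectrum Analysis (SSA) decomposition.
import numpy as np

def ssa_components(series, window, n_components):
    """Return the leading SSA components of a 1-D series."""
    n = len(series)
    k = n - window + 1
    # Trajectory (Hankel) matrix: lagged copies of the series as columns.
    X = np.column_stack([series[i:i + window] for i in range(k)])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    comps = []
    for j in range(n_components):
        Xj = s[j] * np.outer(U[:, j], Vt[j])        # rank-one piece of the trajectory matrix
        # Diagonal averaging (Hankelisation) maps the matrix back to a series.
        comp = np.array([np.mean(Xj[::-1].diagonal(d)) for d in range(-window + 1, k)])
        comps.append(comp)
    return np.array(comps)

t = np.arange(300)
series = 0.02 * t + np.sin(2 * np.pi * t / 25) + 0.3 * np.random.randn(t.size)  # trend + cycle + noise
trend_and_cycle = ssa_components(series, window=60, n_components=3).sum(axis=0)
print("reconstruction error:", round(float(np.mean((series - trend_and_cycle) ** 2)), 3))
```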
11

Goetz, Ryan P. Rosenblad Brent L. "Study of the horizontal-to-vertical spectral ratio (HVSR) method for characterization of deep soils in the Mississippi Embayment." Diss., Columbia, Mo. : University of Missouri--Columbia, 2009. http://hdl.handle.net/10355/5334.

Abstract:
The entire thesis text is included in the research.pdf file; the official abstract appears in the short.pdf file; a non-technical public abstract appears in the public.pdf file. Title from PDF of title page (University of Missouri--Columbia, viewed on December 22, 2009). Thesis advisor: Dr. Brent L. Rosenblad. Includes bibliographical references.
12

Hernandez, Villapol Jorge Luis. "Spectrum Analysis and Prediction Using Long Short Term Memory Neural Networks and Cognitive Radios." Thesis, University of North Texas, 2017. https://digital.library.unt.edu/ark:/67531/metadc1062877/.

Abstract:
One statement we can make with certainty today is that wireless communication is the standard, de facto mode of communication. Cognitive radios are able to interpret the frequency spectrum and adapt to it. The aim of this work is to predict whether a frequency channel is going to be busy or free at a specific time in the future. To do this, the problem is modeled as a time series problem, where the usage of a channel is treated as a sequence of busy and free slots in a fixed time frame. For this time series problem, the method implemented is one of the latest, state-of-the-art techniques in machine learning for time series and sequence prediction: long short-term memory neural networks, or LSTMs.
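A minimal sketch of the kind of model described (an LSTM that maps a fixed window of past busy/free slots to the probability that the next slot is busy), written with tf.keras; the synthetic occupancy pattern, window length, and layer sizes are assumptions rather than the thesis's configuration.

```python
# Minimal sketch: predict the next busy/free slot of a channel from a window of past slots.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(3)
# Assumed synthetic occupancy: a periodic usage pattern plus random flips.
slots = ((np.arange(5000) % 20) < 8).astype("float32")
flip = rng.random(slots.size) < 0.05
slots[flip] = 1.0 - slots[flip]

window = 32
X = np.stack([slots[i:i + window] for i in range(slots.size - window)])[..., None]
y = slots[window:]

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(window, 1)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=3, batch_size=64, validation_split=0.2, verbose=0)
print("accuracy on the last 1000 slots:", round(model.evaluate(X[-1000:], y[-1000:], verbose=0)[1], 3))
```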
13

Boozer, Benjamin Bryan Permaloff Anne. "An analysis of economic efficiency in predicting legislative voting beyond a traditional liberal-conservative spectrum." Auburn, Ala, 2008. http://repo.lib.auburn.edu/EtdRoot/2008/SPRING/Political_Science/Dissertation/Boozer_Benjamin_34.pdf.

14

Loubaton, Philippe. "Prediction et representation markovienne des processus stationnaires vectoriels sur z::(2) : utilisation de techniques d'estimation spectrale 2-d en traitement d'antenne." Paris, ENST, 1988. http://www.theses.fr/1988ENST0012.

15

Dotto, André Carnieletto. "Espectroscopia do solo no Vis-IR: potencial predictivo e desenvolvimento de uma interface gráfica de usuário em R." Universidade Federal de Santa Maria, 2017. http://repositorio.ufsm.br/handle/1/11343.

Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - CAPES
This thesis presents a study of visible and near-infrared (Vis-NIR) spectroscopy applied to predicting soil properties. The purpose was to develop quantitative soil information in response to the demands of digital soil mapping, environmental monitoring and agricultural production, and to increase spatial information on soil. Soil spectroscopy emerges as an alternative with the potential to revolutionize soil monitoring, allowing rapid, low-cost, non-destructive, environmentally friendly, reproducible and repeatable analysis. To improve the efficiency of soil prediction using spectral data, several spectral preprocessing techniques and multivariate models were explored. A graphical user interface (GUI) in R, named Alrad Spectra, was developed to perform preprocessing, multivariate modeling and prediction using spectral data. The objectives were: i) to predict soil properties to improve soil information using spectral data, ii) to compare the performance of spectral preprocessing and multivariate calibration methods in the prediction of soil organic carbon, iii) to obtain reliable soil organic carbon predictions, and iv) to develop a graphical user interface that performs spectral preprocessing and prediction of soil properties using spectroscopic data. A total of 595 soil samples were collected in the central region of Santa Catarina State, Brazil. Soil spectral reflectance was obtained using a FieldSpec 3 spectroradiometer over a spectral range of 350–2500 nm with 1 nm spectral resolution. The outcomes of the thesis demonstrate the strong performance of Vis-NIR spectroscopy in predicting soil properties. Soil properties that are directly related to the chromophores, such as organic carbon, presented better prediction statistics than particle size. Spectral preprocessing applied to the soil spectra contributes to the development of high-quality prediction models. Comparing different spectral preprocessing techniques for soil organic carbon (SOC) prediction revealed that scatter-corrective preprocessing techniques gave better prediction results than spectral derivatives. Among the scatter-correction techniques, continuum removal is the most suitable preprocessing for SOC prediction. In the calibration modeling, except for random forest, all methods gave robust predictions, with the support vector machine method standing out. The systematic methodology applied in this study can improve the reliability of SOC estimation by examining how spectral preprocessing techniques and multivariate methods affect prediction performance in spectral analysis. The development of an easy-to-use graphical user interface may benefit a large number of users, who can take advantage of this useful chemometric analysis. Alrad Spectra is the first GUI of its kind, and the expectation is that this tool can expand the application of the spectroscopy technique.
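As an illustration of the kind of pipeline described (spectral preprocessing followed by a support vector machine calibration), here is a minimal sketch using Savitzky-Golay smoothing and epsilon-SVR from scikit-learn; the synthetic spectra and hyper-parameters are assumptions, and continuum removal, the preprocessing the thesis favours, is not implemented here.

```python
# Minimal sketch: smooth Vis-NIR spectra, then calibrate an SVR model for soil organic carbon.
import numpy as np
from scipy.signal import savgol_filter
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
n_samples, n_bands = 300, 2151                      # e.g. 350-2500 nm at 1 nm resolution
spectra = rng.normal(size=(n_samples, n_bands)).cumsum(axis=1)            # smooth, spectrum-like curves
soc = 0.01 * spectra[:, 500] - 0.02 * spectra[:, 1800] + rng.normal(scale=0.2, size=n_samples)

smoothed = savgol_filter(spectra, window_length=11, polyorder=2, axis=1)   # scatter/noise smoothing
model = SVR(kernel="rbf", C=10.0, epsilon=0.1)
scores = cross_val_score(model, smoothed, soc, cv=5, scoring="r2")
print("cross-validated R^2:", np.round(scores.mean(), 3))
```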
16

Li, Qi. "Intermittency of Global Solar Radiation over Reunion island : Daily Mapping Prediction Model and Multifractal Parameters." Thesis, La Réunion, 2018. http://www.theses.fr/2018LARE0016/document.

Abstract:
Due to the heterogeneous and rapidly changing cloudiness, tropical islands such as Reunion Island in the South-West Indian Ocean (SWIO) have a significant solar resource that is highly variable from day to day. In this study, we propose a new approach for deterministic prediction of daily surface solar radiation (SSR) maps based on four linear regression models: multiple linear regression (MLR), principal component regression (PCR), partial least squares regression (PLSR), and stepwise regression (SR), which we applied to the SARAH-E@5km satellite data (CM SAF) for the period 2007-2016. To improve the accuracy of prediction, the multifractal parameters (H, C_1 and α) are proposed as new predictors in the predictive model. These parameters are obtained from an analysis of SSR intermittency based on arbitrary-order Hilbert spectral analysis. This analysis is an extension of the Hilbert-Huang Transform (HHT) and is used to estimate the generalized scaling exponent ξ(q); it combines Empirical Mode Decomposition and Hilbert spectral analysis (EMD+HSA). In a first step, the multifractal analysis is applied to one-second SSR measurements from a SPN1 pyranometer in Moufia in 2016. The mean sub-daily, daily and seasonal daily multifractal patterns are derived, and the scaling exponent ξ(q) is analyzed. In a second step, the intermittency study is conducted on one-minute SSR measurements from a SPN1 network with 11 stations in 2014. The spatial patterns for all the stations with the multifractal parameters H, C_1 and α are shown. The variability of the singularity spectrum width is considered to study the spatial intermittency at the daily and seasonal scales. Based on this intermittency analysis from measurements at several stations, the universal multifractal parameters (H, C_1 and α) could be taken as new predictors indicating the multifractal properties of SSR.
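The thesis estimates ξ(q) with arbitrary-order Hilbert spectral analysis (EMD+HSA); as a much simpler stand-in for readers, the sketch below estimates scaling exponents from generalized structure functions of a synthetic signal. This is an alternative, classical estimator under the stated assumptions, not the method used in the thesis.

```python
# Minimal sketch: scaling exponents zeta(q) from generalized structure functions.
import numpy as np

rng = np.random.default_rng(5)
x = np.cumsum(rng.standard_normal(2 ** 14))       # assumed signal: Brownian-like walk (H ~ 0.5)

scales = np.unique(np.logspace(0, 3, 20).astype(int))
orders = np.array([1.0, 2.0, 3.0, 4.0])
zeta = []
for q in orders:
    # S_q(tau) = <|x(t + tau) - x(t)|^q> ~ tau^zeta(q) for a scaling process.
    sq = [np.mean(np.abs(x[tau:] - x[:-tau]) ** q) for tau in scales]
    slope = np.polyfit(np.log(scales), np.log(sq), 1)[0]
    zeta.append(slope)

print(dict(zip(orders, np.round(zeta, 2))))       # a monofractal walk gives zeta(q) ~ 0.5 * q
```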
17

Manero, Font Jaume. "Deep learning architectures applied to wind time series multi-step forecasting." Doctoral thesis, Universitat Politècnica de Catalunya, 2020. http://hdl.handle.net/10803/669283.

Abstract:
Forecasting is a critical task for the integration of wind-generated energy into electricity grids. Numerical weather models applied to wind prediction work with grid sizes too large to reproduce all the local features that influence wind, making time series of past observations a necessary tool for wind forecasting. This research work concerns the application of deep neural networks to multi-step forecasting using multivariate time series as input, to forecast wind speed 12 hours ahead. Wind time series are sequences of meteorological observations such as wind speed, temperature, pressure, humidity, and direction. Wind series have two statistically relevant properties, non-linearity and non-stationarity, which make modelling with traditional statistical tools very inaccurate. In this thesis we design, test and validate novel deep learning models for the wind energy prediction task, applying new deep architectures to the largest open wind data repository available, from the US National Renewable Energy Laboratory (NREL), with 126,692 wind sites evenly distributed over the US geography. The heterogeneity of the series, obtained from several data origins, allows us to draw conclusions about how well each model fits time series that range from highly stationary locations to variable sites in complex areas. We propose multi-layer, convolutional and recurrent networks as basic building blocks, combined into heterogeneous architectures with different variants, trained with optimisation strategies such as dropout and skip connections, early stopping, adaptive learning rates, and filters and kernels of different sizes, among others. The architectures are optimised using structured hyper-parameter setting strategies to obtain the best performing model across the whole dataset. The learning capabilities of the architectures applied to the various sites reveal relationships between site characteristics (terrain complexity, wind variability, geographical location) and model accuracy, establishing novel measures of site predictability that relate the fit of the models to indexes from spectral or stationarity analysis of the time series. The designed methods offer new, and superior, alternatives to traditional methods.
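A minimal sketch of the multi-step setup described (a multivariate meteorological window in, 12 hourly wind-speed values out), using a small convolutional network in tf.keras; the synthetic data, layer sizes, and horizon handling are assumptions and are much simpler than the architectures studied in the thesis.

```python
# Minimal sketch: 12-step-ahead wind speed forecasting from a multivariate window.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(6)
T, n_features = 5000, 5                    # assumed: wind speed, temperature, pressure, humidity, direction
data = rng.standard_normal((T, n_features)).cumsum(axis=0)

window, horizon = 48, 12                   # 48 past hours in, 12 future wind-speed values out
X = np.stack([data[i:i + window] for i in range(T - window - horizon)])
y = np.stack([data[i + window:i + window + horizon, 0] for i in range(T - window - horizon)])

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(window, n_features)),
    tf.keras.layers.Conv1D(32, kernel_size=5, activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(horizon),        # one output per forecast step
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, batch_size=64, validation_split=0.2, verbose=0)
print("12-step forecast for the last window:", np.round(model.predict(X[-1:], verbose=0)[0], 2))
```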
18

Koch, Tim [Verfasser], Michael [Akademischer Betreuer] Wicke, Kay [Akademischer Betreuer] Raum, and Claus-Peter [Akademischer Betreuer] Czerny. "Predicting the intramuscular fat content in porcine M. longissimus via ultrasound spectral analysis with consideration of structural and compositional traits / Tim Koch. Gutachter: Michael Wicke ; Kay Raum ; Claus-Peter Czerny. Betreuer: Michael Wicke." Göttingen : Niedersächsische Staats- und Universitätsbibliothek Göttingen, 2011. http://d-nb.info/1043719369/34.

19

Belovič, Boris. "Řešení složitých problémů s využitím evolučních algoritmů." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2009. http://www.nusl.cz/ntk/nusl-218189.

Abstract:
Difficult problems are tasks whose number of possible solutions grows exponentially or factorially. Applying common mathematical methods to find a proper solution in polynomial time is ineffective. Signal prediction is an example of a difficult problem: the signal is represented by a time series, and there is no explicit mathematical formula describing it. When genetic algorithms are applied, they try to discover hidden patterns in the time series that can be used for prediction. Implication rules are used to capture these hidden patterns, with each rule represented by one chromosome in the population. A rule consists of two parts: a conditional part and a result part. The rules in the population are compared against the time series and evaluated according to their success in prediction. After the evaluation, simulated evolution is run; the result of this evolution process is a group of rules that represent the most distinct patterns in the time series. These rules are then validated on a validation set. The application is implemented in the Java programming language.
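A minimal genetic-algorithm sketch in the spirit of the abstract: each chromosome is a simple implication rule (a condition on a lagged change and a predicted direction), rules are scored by their prediction success on the series, and the population is evolved by selection and mutation. The rule encoding, fitness, and operators are illustrative assumptions, not the thesis's design.

```python
# Minimal sketch: evolve "if the change lag steps ago exceeded a threshold, predict up/down" rules.
import numpy as np

rng = np.random.default_rng(7)
t = np.arange(600)
series = np.sin(2 * np.pi * t / 50) + 0.2 * rng.standard_normal(t.size)
changes = np.diff(series)

def fitness(rule, changes):
    lag, threshold, direction = int(rule[0]), rule[1], int(np.sign(rule[2]))
    past, future = changes[:-lag], changes[lag:]          # condition on past, predict future
    fires = past > threshold
    if not fires.any():
        return 0.0
    # Fraction of firings where the predicted direction matched the actual move.
    return float(np.mean(np.sign(future[fires]) == direction))

def random_rule():
    return np.array([rng.integers(1, 10), rng.normal(0.0, 0.3), rng.choice([-1.0, 1.0])])

population = [random_rule() for _ in range(40)]
for _ in range(30):
    scores = np.array([fitness(r, changes) for r in population])
    parents = [population[i] for i in np.argsort(scores)[-10:]]                   # keep the best rules
    mutants = [p + np.array([0.0, rng.normal(0.0, 0.05), 0.0]) for p in parents for _ in range(3)]
    population = parents + mutants

best = max(population, key=lambda r: fitness(r, changes))
print("best rule (lag, threshold, direction):", np.round(best, 2),
      "hit rate:", round(fitness(best, changes), 2))
```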
20

Janstad, Tobias. "Case study of a contract system : considering pulp prices from 1996-2006." Thesis, Växjö University, School of Mathematics and Systems Engineering, 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:vxu:diva-1684.

Abstract:

Södra Cell sells 1 900 000 tonnes of pulp every year. Of this, 490 000 tonnes are sold under a contract system based on a pricing index called PIX NBSK. This index was started in 1996 and reflects the price of pulp from coniferous forest. We study the NBSK PIX value of softwood from October 1996 to December 2006.

People working in this branch know that there is strong periodicity in the prices. We use predictive analysis to see whether clients can benefit from the periodicity and from the options in the contract system Södra offers today. We conclude that a drawback of the current contract system is that there are too many contracts in proportion to the duration time, which is one year for all contracts. Using a time series model called ARMA, we make successful predictions of the price difference between two contracts. Based on these predictions we switch between the contracts, reducing the price by 0.81% on average during 1997-2006. Given the total turnover, if all clients had used such predictions during 1997-2006, Södra's income would have been reduced by 2.77 million USD a year on average.

The prices used before PIX are called list prices, and they seem to behave like the PIX index. Supposing that the contract system we see in PIX today had been used in 1975-2006 with the list price as the base index, I made a prediction of the list prices from 1986-2006. Using these predictions, and under the stated assumptions, a client during this period would have been buying pulp at a price reduced by 0.57%.

If clients had known the PIX for 1996-2006 already in, say, 1995, Södra's contract system based on PIX would have given them a price reduction of 1.5% on average during 1996-2006. A price reduction is not possible every year, but when it occurs it can be as large as 3% of the price. Suppose the clients always choose the contract with the lowest price and thereby get a reduced price over time. Then, with 95% probability, the long-run price reduction lies somewhere between 0.4% and 2.7%.

To close off this price-reduction possibility for the clients there are two ways to go: either reduce the number of contracts or extend the duration time of the contracts.

To find a suitable duration time, we use spectral density estimation to get indications of which periods are most important. From this we see that the PIX index has a period of five years, the wavelet-approximated PIX index has a period of 3.4 years, and the list prices have a period of 5.6 years. This indicates that the current duration time of one year is too short. Therefore, if it would not affect Södra's clients adversely, an extension of the duration time from one to five years would be good.

If Södra does not extend the duration time of the contracts, my recommendation is to have fewer contracts. The possibility to change between the contracts 'average last three months' and 'average current month' every other year is the weakest point of today's system. Therefore I recommend that Södra stop selling pulp under the contract 'average PIX last three months'.

We cannot prove any long-term difference between the contracts, so if Södra chooses to keep just one contract it does not matter, from this point of view, which one they choose. However, it seems like a good idea to follow the global market, and therefore I recommend choosing 'average PIX current month' rather than 'average PIX last three months', which lags behind the market front. Since the price 'average current month' is available on the FOEX web page, I think Södra should choose this contract if they decide to have only one contract.
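The spectral-density step at the end of the abstract can be sketched as follows: estimate a periodogram of a (here synthetic) monthly price series and read off the dominant period. The simulated series and its five-year cycle are assumptions used only to illustrate the calculation, not the PIX data.

```python
# Minimal sketch: find the dominant period of a monthly price-index series from its periodogram.
import numpy as np
from scipy.signal import periodogram

rng = np.random.default_rng(8)
months = np.arange(360)                                    # 30 years of monthly observations
prices = 600 + 80 * np.sin(2 * np.pi * months / 60) + 20 * rng.standard_normal(months.size)

freqs, psd = periodogram(prices - prices.mean(), fs=12.0)  # fs in samples per year
peak = np.argmax(psd[1:]) + 1                              # skip the zero-frequency bin
print(f"dominant period: {1.0 / freqs[peak]:.1f} years")   # ~5 years for this synthetic series
```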


21

Hrušovský, Enrik. "Automatická klasifikace výslovnosti hlásky R." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2018. http://www.nusl.cz/ntk/nusl-377664.

Abstract:
This diploma thesis deals with automatic classification of the sound R. The purpose of the thesis is to create a program for detecting defective pronunciation of the sound R in children. The thesis covers speech production, speech therapy and dyslalia, and subsequently methods of speech signal processing and analysis. In the last part, software for automatic detection of the pronunciation of the sound R is designed. For recognition of the pronunciation, the MFCC algorithm is used to extract features. These features are subsequently classified by a neural network into the groups of correct or incorrect pronunciation, and the classification accuracy is evaluated.
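A minimal sketch of the described pipeline (MFCC features extracted from a recording, then a small neural network classifying the pronunciation as correct or incorrect); the file paths, labels, and network size are hypothetical placeholders, and librosa and scikit-learn are assumed to be available.

```python
# Minimal sketch: MFCC features from recordings, classified by a small neural network.
import numpy as np
import librosa
from sklearn.neural_network import MLPClassifier

def mfcc_features(path):
    """Average MFCC vector of one recording (path is a hypothetical .wav file)."""
    y, sr = librosa.load(path, sr=16000)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
    return mfcc.mean(axis=1)                              # one 13-dimensional vector per recording

# Hypothetical dataset: file paths and labels (1 = correct R, 0 = incorrect R).
paths = ["r_correct_001.wav", "r_incorrect_001.wav"]      # placeholders, not real files
labels = np.array([1, 0])

X = np.vstack([mfcc_features(p) for p in paths])
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500).fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```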
22

Kahaei, Mohammad Hossein. "Performance analysis of adaptive lattice filters for FM signals and alpha-stable processes." Thesis, Queensland University of Technology, 1998. https://eprints.qut.edu.au/36044/7/36044_Digitised_Thesis.pdf.

Abstract:
The performance of an adaptive filter may be studied through the behaviour of the optimal and adaptive coefficients in a given environment. This thesis investigates the performance of finite impulse response adaptive lattice filters for two classes of input signals: (a) frequency modulated signals with polynomial phases of order p in complex Gaussian white noise (as nonstationary signals), and (b) impulsive autoregressive processes with alpha-stable distributions (as non-Gaussian signals). Initially, an overview is given of linear prediction and adaptive filtering. The convergence and tracking properties of the stochastic gradient algorithms are discussed for stationary and nonstationary input signals. It is explained that the stochastic gradient lattice algorithm has many advantages over the least-mean-square algorithm. Some of these advantages are a modular structure, easily guaranteed stability, less sensitivity to the eigenvalue spread of the input autocorrelation matrix, and easy quantization of filter coefficients (normally called reflection coefficients). We then characterize the performance of the stochastic gradient lattice algorithm for the frequency modulated signals through the optimal and adaptive lattice reflection coefficients. This is a difficult task due to the nonlinear dependence of the adaptive reflection coefficients on the preceding stages and the input signal. To ease the derivations, we assume that the reflection coefficients of each stage are independent of the inputs to that stage. Then the optimal lattice filter is derived for the frequency modulated signals. This is performed by computing the optimal values of residual errors, reflection coefficients, and recovery errors. Next, we show the tracking behaviour of adaptive reflection coefficients for frequency modulated signals. This is carried out by computing the tracking model of these coefficients for the stochastic gradient lattice algorithm on average. The second-order convergence of the adaptive coefficients is investigated by modeling the theoretical asymptotic variance of the gradient noise at each stage. The accuracy of the analytical results is verified by computer simulations. Using the previous analytical results, we show a new property, the polynomial order reducing property of adaptive lattice filters. This property may be used to reduce the order of the polynomial phase of input frequency modulated signals. Considering two examples, we show how this property may be used in processing frequency modulated signals. In the first example, a detection procedure is carried out on a frequency modulated signal with a second-order polynomial phase in complex Gaussian white noise. We show that using this technique a better probability of detection is obtained for the reduced-order phase signals compared to that of the traditional energy detector. Also, it is empirically shown that the distribution of the gradient noise in the first adaptive reflection coefficients approximates the Gaussian law. In the second example, the instantaneous frequency of the same observed signal is estimated. We show that by using this technique a lower mean square error is achieved for the estimated frequencies at high signal-to-noise ratios in comparison to that of the adaptive line enhancer. The performance of adaptive lattice filters is then investigated for the second type of input signals, i.e., impulsive autoregressive processes with alpha-stable distributions.
The concept of alpha-stable distributions is first introduced. We discuss how the stochastic gradient algorithm, which gives desirable results for finite-variance input signals (like frequency modulated signals in noise), does not converge quickly for infinite-variance stable processes (because it uses the minimum mean-square error criterion). To deal with such problems, the minimum dispersion criterion, fractional lower-order moments, and recently developed algorithms for stable processes are introduced. We then study the possibility of using the lattice structure for impulsive stable processes. Accordingly, two new algorithms, the least-mean p-norm lattice algorithm and its normalized version, are proposed for lattice filters based on the fractional lower-order moments. Simulation results show that, using the proposed algorithms, faster convergence speeds are achieved for parameter estimation of autoregressive stable processes with low to moderate degrees of impulsiveness in comparison to many other algorithms. We also discuss the effect of the impulsiveness of stable processes in generating some misalignment between the estimated parameters and the true values. Due to the infinite variance of stable processes, the performance of the proposed algorithms is investigated only through extensive computer simulations.
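For readers unfamiliar with the structure, here is a minimal single-stage gradient adaptive lattice sketch: the forward and backward prediction errors are formed from a reflection coefficient that is adapted by stochastic gradient descent. The AR(1) test signal, step size, and initialisation are assumptions; the thesis's analysis covers multi-stage filters, FM signals, and the p-norm variants, none of which are shown here.

```python
# Minimal sketch: one-stage gradient adaptive lattice (GAL) predictor on an AR(1) signal.
import numpy as np

rng = np.random.default_rng(9)
a = 0.8                                           # assumed AR(1) coefficient
x = np.zeros(5000)
for n in range(1, x.size):
    x[n] = a * x[n - 1] + rng.standard_normal()

k = 0.0                                           # reflection coefficient to adapt
mu = 2e-4                                         # step size (assumed)
for n in range(1, x.size):
    f_prev, b_prev = x[n], x[n - 1]               # stage-0 forward/backward errors
    f = f_prev + k * b_prev                       # stage-1 forward prediction error
    b = b_prev + k * f_prev                       # stage-1 backward prediction error
    k -= mu * (f * b_prev + b * f_prev)           # stochastic gradient step on E[f^2 + b^2]

# For an AR(1) process the optimal first reflection coefficient is -r(1)/r(0) = -a.
print(f"adapted reflection coefficient: {k:.3f} (optimum is about {-a})")
```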
23

Hanzálek, Pavel. "Praktické ukázky zpracování signálů." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2019. http://www.nusl.cz/ntk/nusl-400849.

Abstract:
The thesis focuses on signal processing and uses practical examples to show how individual signal processing operations are used in practice. For each of the selected operations, an application with a graphical interface is created in MATLAB for easier use. Each chapter first analyses an operation from a theoretical point of view and then demonstrates, through a practical example, how the operation is used in practice. The individual applications are described mainly in terms of how they are operated and what results they can produce. The results of the practical part are presented in the appendix of the thesis.
24

Waddle, C. Allen. "Fast spectral multiplication for real-time rendering." Thesis, 2018. https://dspace.library.uvic.ca//handle/1828/9332.

Full text
Abstract:
In computer graphics, the complex phenomenon of color appearance, involving the interaction of light, matter and the human visual system, is modeled by the multiplication of RGB triplets assigned to lights and materials. This efficient heuristic produces plausible images because the triplets assigned to materials usually function as color specifications. To predict color, spectral rendering is required, but the O(n) cost of computing reflections with n-dimensional point-sampled spectra is prohibitive for real-time rendering. Typical spectra are well approximated by m-dimensional linear models, where m << n, but computing reflections with this representation requires O(m^2) matrix-vector multiplication. A method by Drew and Finlayson [JOSA A 20, 7 (2003), 1181-1193] reduces this cost to O(m) by “sharpening” an n x m orthonormal basis with a linear transformation, so that the new basis vectors are approximately disjoint. If successful, this transformation allows approximated reflections to be computed as the products of coefficients of lights and materials. Finding the m x m change of basis matrix requires solving m eigenvector problems, each needing a choice of wavelengths in which to sharpen the corresponding basis vector. These choices, however, are themselves an optimization problem left unaddressed by the method's authors. Instead, we pose a single problem, expressing the total approximation error incurred across all wavelengths as the sum of dm^2 squares for some number d, where, depending on the inherent dimensionality of the rendered reflectance spectra, m <= d << n, a number that is independent of the number of approximated reflections. This problem may be solved in real time, or nearly, using standard nonlinear optimization algorithms. Results using a variety of reflectance spectra and three standard illuminants yield errors at or close to the best lower bound attained by projection onto the leading m characteristic vectors of the approximated reflections. Measured as CIEDE2000 color differences, a heuristic proxy for image difference, these errors can be made small enough to be likely imperceptible using values of 4 <= m <= 9. An examination of this problem reveals a hierarchy of simpler, more quickly solved subproblems whose solutions yield, in the typical case, increasingly inaccurate approximations. Analysis of this hierarchy explains why, in general, the lowest approximation error is not attained by simple spectral sharpening, the smallest of these subproblems, unless the spectral power distributions of all light sources in a scene are sufficiently close to constant functions. Using the methods described in this dissertation, spectra can be rendered in real time as the products of m-dimensional vectors of sharp basis coefficients at a cost that is, in a typical application, a negligible fraction above the cost of RGB rendering.
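To make the cost argument concrete, the following Python sketch (with hypothetical Gaussian-bump spectra standing in for measured reflectances) builds an m-dimensional orthonormal basis and evaluates a reflection through the O(m^2) bilinear tensor of basis products; spectral sharpening seeks a change of basis under which this tensor is approximately diagonal, so that reflections reduce to O(m) coefficient-wise products. The sharpening optimization itself, the dissertation's contribution, is not reproduced here.

    import numpy as np

    rng = np.random.default_rng(1)
    n, m = 101, 6                          # wavelength samples, linear-model dimension
    lam = np.linspace(400, 700, n)         # nm

    def smooth_spectra(count):
        """Hypothetical smooth spectra: sums of three Gaussian bumps, clipped to [0, 1]."""
        s = np.zeros((count, n))
        for _ in range(3):
            c = rng.uniform(400, 700, (count, 1))
            w = rng.uniform(30, 80, (count, 1))
            a = rng.uniform(0, 1, (count, 1))
            s += a * np.exp(-0.5 * ((lam - c) / w) ** 2)
        return np.clip(s, 0, 1)

    # Orthonormal basis B (n x m) from the leading principal directions of sample spectra.
    data = smooth_spectra(200)
    _, _, vt = np.linalg.svd(data - data.mean(0), full_matrices=False)
    B = vt[:m].T

    light = smooth_spectra(1)[0]
    refl = smooth_spectra(1)[0]

    # Exact reflection and its best m-dimensional representation in the basis.
    exact = light * refl
    best = B @ (B.T @ exact)

    # O(m^2) route: bilinear tensor M[k, i, j] = <B_k, B_i * B_j>, precomputed once.
    M = np.einsum('nk,ni,nj->kij', B, B, B)
    c = np.einsum('kij,i,j->k', M, B.T @ light, B.T @ refl)
    approx = B @ c

    print("relative error of the O(m^2) model vs the best m-dim projection:",
          np.linalg.norm(approx - best) / np.linalg.norm(best))

Each output coefficient here is a full m x m bilinear form in the light and material coefficients, which is exactly the per-reflection cost that a successful sharpening transform collapses to a single product per coefficient.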
Graduate
APA, Harvard, Vancouver, ISO, and other styles
25

Chen, Chien-An, and 陳建安. "Streamflow Prediction Using Support Vector Regression and Higher Order Spectral Analysis." Thesis, 2015. http://ndltd.ncl.edu.tw/handle/98221480497689654894.

Full text
Abstract:
Doctoral dissertation
National Taiwan University
Graduate Institute of Civil Engineering
103
In this study, highly efficient support vector regression (SVR) and higher-order spectral analysis (HOSA) are used to develop streamflow prediction models. In addition, the runoff coefficient, the stage-discharge rating curve, and HEC-HMS are utilized to simulate and adjust storm hydrographs. First, the principle of third-order cumulants is introduced. The largest order of the autoregressive moving average (ARMA) model can be determined rapidly and accurately using singular value decomposition and hypothesis testing, which avoids the complex calculations and errors that arise when orders are determined with the autocorrelation and partial autocorrelation functions. Secondly, a lag time is established using the determined order, and a streamflow prediction model is built using SVR. To avoid the drawbacks of estimating SVR parameters by trial and error, simulated annealing (SA) is utilized to seek the optimal parameters. The results indicate that SA coupled with SVR predicts streamflow effectively and demonstrates the advantage of determining orders using HOSA. Lastly, the streamflow prediction model developed in this study has been successfully applied to three actual watersheds in Taiwan. Meanwhile, to deal with underestimated and missing data, the traditional stage-discharge rating curve method has been improved by adjusting storm discharge using the runoff coefficient, so as to represent the actual hydrological state and examine the reliability of the streamflow adjustment. The results are proven to be realistic and can be utilized as a reference for water resources policy, flood prevention, and decision-making.
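A minimal sketch of the SA-tuned SVR step is given below, using synthetic lagged streamflow data in place of the actual watershed records and scipy's dual_annealing as a generalized simulated annealing; the lag, bounds, and scoring choices are illustrative assumptions, not the settings used in the thesis.

    import numpy as np
    from scipy.optimize import dual_annealing
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVR

    rng = np.random.default_rng(2)

    # Hypothetical stand-in for lagged streamflow data: predict q(t) from the
    # previous `lag` values, the lag being the order suggested by the HOSA step.
    lag, n = 3, 500
    q = np.zeros(n)
    for t in range(1, n):
        q[t] = 0.7 * q[t - 1] + np.abs(rng.normal(0.0, 1.0))
    X = np.column_stack([q[i:n - lag + i] for i in range(lag)])
    y = q[lag:]

    def neg_cv_score(params):
        """Objective for simulated annealing: negative cross-validated R^2 of an RBF SVR."""
        log_c, log_gamma, eps = params
        model = SVR(C=10.0 ** log_c, gamma=10.0 ** log_gamma, epsilon=eps)
        return -cross_val_score(model, X, y, cv=3, scoring="r2").mean()

    bounds = [(-2, 3), (-4, 1), (0.001, 0.5)]      # log10(C), log10(gamma), epsilon
    result = dual_annealing(neg_cv_score, bounds, maxiter=50,
                            no_local_search=True,  # keep the demo quick
                            seed=3)
    print("best (log10 C, log10 gamma, epsilon):", result.x)
    print("cross-validated R^2:", -result.fun)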
APA, Harvard, Vancouver, ISO, and other styles
26

Morais, Mariana Francisca Lira de. "Conventional, spectral and non-linear analysis of external uterine contraction recordings in prediction of dystocia during labour." Master's thesis, 2015. https://repositorio-aberto.up.pt/handle/10216/89787.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Morais, Mariana Francisca Lira de. "Conventional, spectral and non-linear analysis of external uterine contraction recordings in prediction of dystocia during labour." Dissertation, 2015. https://repositorio-aberto.up.pt/handle/10216/89787.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Kuo, Hsuan-Wei, and 郭炫偉. "The Spectrum Analysis and Prediction of Phosphor-converted White LED with Green and Red Phosphors." Thesis, 2015. http://ndltd.ncl.edu.tw/handle/76794747574256952040.

Full text
Abstract:
Master's thesis
Minghsin University of Science and Technology
Master's Program, Department of Electro-Optical System Engineering
103
Phosphor-converted white light-emitting diode (LED) packages are typically fabricated by coating a blue chip with green and red phosphors. In such packages, however, re-absorption between the green and red phosphors introduces intensity deviations in the green and red emissions, so package engineers spend considerable effort and time achieving the correct chromaticity of the white LEDs. In this study, we provide a method to predict the spectrum of these white LEDs through experimental analysis. With the spectrum model, the deviations of the CIE 1931 x and y coordinates between the prediction and the real sample are less than 0.002, the deviation of the CCT is less than 20 K, and the deviation of the CRI is less than 3. The method thus predicts the spectrum of phosphor-converted white LEDs with green and red phosphors with high accuracy.
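The following Python sketch illustrates, under assumed Gaussian emission bands and a single re-absorption fraction, how re-absorption of the green emission by the red phosphor shifts the green/red balance of the combined spectrum; the band shapes, re-absorption fraction, and conversion efficiency are hypothetical and not the fitted model of this thesis.

    import numpy as np

    lam = np.linspace(380, 780, 401)       # wavelength grid, nm
    dl = lam[1] - lam[0]

    def band(center, width, power):
        """Gaussian emission band scaled to the given radiant power."""
        g = np.exp(-0.5 * ((lam - center) / width) ** 2)
        return power * g / (g.sum() * dl)

    # Hypothetical emission bands: blue chip, green phosphor, red phosphor.
    blue = band(450, 10, 1.0)
    green = band(530, 30, 0.8)
    red = band(620, 35, 0.6)

    # Assumed re-absorption model: a fraction `reabs` of the green emission is
    # absorbed by the red phosphor and re-emitted in the red band with
    # conversion efficiency `eta`.
    reabs, eta = 0.15, 0.85
    green_out = (1 - reabs) * green
    red_out = red + band(620, 35, reabs * eta * 0.8)
    white = blue + green_out + red_out      # predicted combined spectrum

    ratio = lambda a, b: a.sum() / b.sum()
    print("green/red power ratio without re-absorption:", ratio(green, red))
    print("green/red power ratio with re-absorption:   ", ratio(green_out, red_out))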
APA, Harvard, Vancouver, ISO, and other styles
29

Koch, Tim. "Predicting the intramuscular fat content in porcine M. longissimus via ultrasound spectral analysis with consideration of structural and compositional traits." Doctoral thesis, 2011. http://hdl.handle.net/11858/00-1735-0000-0006-AB2A-F.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

Bandarabadi, Mojtaba. "Low-complexity measures for epileptic seizure prediction and early detection based on classification." Doctoral thesis, 2015. http://hdl.handle.net/10316/27608.

Full text
Abstract:
Doctoral thesis in Information Science and Technology, presented to the Department of Informatics Engineering of the Faculty of Sciences and Technology of the University of Coimbra
This thesis concerns the problems of epileptic seizure prediction and detection. We analyzed multichannel intracranial electroencephalogram (iEEG) and surface electroencephalogram (sEEG) recordings of patients suffering from refractory epilepsy, to assess the brain state in real time using relevant EEG features and computational intelligence techniques, aiming to detect the pre-seizure state (in the case of prediction) or seizure onset times (in the case of detection). Our main original contribution is the development of a novel relative bivariate spectral power feature to track gradual transient changes prior to ictal events for real-time seizure prediction. Furthermore, a novel robust and generalized measure for early seizure detection is developed, intended for use in closed-loop neurostimulation systems. The development of a general platform embeddable in a transportable, low-power-budget device is of utmost importance for warning patients and their relatives in real time about an impending seizure or the beginning of an occurring seizure. The portable device can also be integrated with a closed-loop neurostimulation or fast-acting drug injection mechanism to disarm the impending seizure or suppress the just-occurring seizure. In this thesis we therefore pursue the dual objective of developing algorithms for seizure prediction and early seizure detection that provide high sensitivity and a low number of false alarms, fulfilling the requirements of clinical applications at low computational cost. For the first objective, a patient-specific seizure prediction method was developed based on the extraction of the novel relative bivariate spectral power features, which were then preprocessed, dimensionally reduced, and classified using a machine-learning algorithm. The introduced feature has low computational complexity and was discriminated using the support vector machine (SVM) classifier. We analyzed the preictal EEG dynamics across different brain regions and throughout several frequency bands, using the relative bivariate features to uncover the underlying mechanisms ending in epileptic seizures. The proposed prediction system was evaluated on long-term continuous sEEG and iEEG recordings of 24 patients, and produced statistically significant results with an average sensitivity of 75.8% and a false prediction rate of 0.1 per hour. Furthermore, a novel statistical method was developed for the proper selection of the preictal period, for the evaluation of the predictive capability of features, and for assessing the predictability of seizures. The method uses amplitude distribution histograms (ADHs) of the features extracted from the preictal and interictal iEEG and sEEG recordings, and then calculates a criterion of discriminability between the two classes. The method was evaluated on spectral power features extracted from monopolar and bipolar iEEG and sEEG recordings of 18 patients, comprising 94 epileptic seizures in total. For the objective of early seizure detection, we formulated the power spectral density (PSD) of the bipolar EEG signal as a measure of neuronal potential similarity (NPS) between two EEG signals. This measure encompasses the phase and amplitude similarities of two EEG channels simultaneously.
The NPS measure was then studied in several narrow frequency bands to find the most relevant sub-bands involved in seizure initiation, and the best-performing ratio of two NPS measures for seizure onset detection was determined. Evaluated on long-term continuous iEEG recordings of 11 patients with refractory partial epilepsy (1785 h and 183 seizures in total), the method showed high performance while requiring a very low computational cost. On average, we achieved a sensitivity of 86.3%, a low false detection rate (FDR) of 0.048/h, and a mean detection latency of 14.2 s from electrographic seizure onsets, while preceding clinical onsets by 1.1 s on average. Apart from the above-mentioned primary objectives, we introduced two new and robust methods for offline or real-time labelling of epileptic seizures in long-term continuous EEG recordings for further studies: the mean phase coherence estimated from bandpass-filtered iEEG signals in specific frequency bands, and the singular value decomposition (SVD) of bipolar iEEG signals. Both methods were evaluated on the same dataset employed in the previous study and demonstrated, on average, a sensitivity of 84.2% and an FDR of 0.09/h for sub-band mean phase coherence, and a sensitivity of 84.1% and an FDR of 0.05/h for bipolar SVD. Most of this work was carried out within the EPILEPSIAE project, which aimed at predicting pharmacoresistant epileptic seizures. The methods developed in this thesis were evaluated using long-term continuous multichannel EEG recordings of more than 275 patients with refractory epilepsy, known as the European Epilepsy Database. This database was collected by the three clinical centers involved in EPILEPSIAE and contains well-documented metadata. The results of this thesis support the hypothesis that most epileptic seizures are predictable using linear bivariate spectral-temporal brain dynamics. Moreover, the promising results of early seizure detection support the feasibility of integrating the proposed method with closed-loop neurostimulation systems. We hope the developed methods are a step towards the clinical application of seizure prediction and onset detection algorithms.
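A minimal sketch of the feature-plus-classifier pipeline is shown below; the exact definition of the relative bivariate spectral power feature is not reproduced in this abstract, so the ratio-of-relative-band-powers form, the synthetic EEG windows, and the band edges used here are assumptions for illustration only.

    import numpy as np
    from scipy.signal import welch
    from sklearn.svm import SVC

    rng = np.random.default_rng(4)
    fs, win = 256, 5 * 256                 # sampling rate (Hz), 5 s windows
    bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
             "beta": (13, 30), "gamma": (30, 45)}

    def rel_band_powers(x):
        """Relative spectral power per band (band power / total power) via Welch."""
        f, pxx = welch(x, fs=fs, nperseg=fs)
        return np.array([pxx[(f >= lo) & (f < hi)].sum() / pxx.sum()
                         for lo, hi in bands.values()])

    def bivariate_features(window):
        """Assumed feature form: ratios of relative band powers between channel pairs."""
        p = [rel_band_powers(ch) for ch in window]
        return np.concatenate([p[i] / (p[j] + 1e-12)
                               for i in range(len(p)) for j in range(i + 1, len(p))])

    # Synthetic windows standing in for interictal (0) and preictal (1) segments:
    # preictal windows get extra beta-band power on half of the channels.
    t = np.arange(win) / fs
    X, y = [], []
    for label in (0, 1):
        for _ in range(40):
            w = rng.normal(0.0, 1.0, (6, win))
            if label:
                w[:3] += 0.8 * np.sin(2 * np.pi * 20 * t)
            X.append(bivariate_features(w))
            y.append(label)

    clf = SVC(kernel="rbf", C=1.0).fit(np.array(X), np.array(y))
    print("training accuracy on the synthetic windows:", clf.score(np.array(X), np.array(y)))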
FCT - SFRH/BD/71497/2010
APA, Harvard, Vancouver, ISO, and other styles
31

Wu, Bing-Han, and 吳秉翰. "Analysis of the time series and multi-variate model for fault detection and prediction in the semiconductor plasma using wider range of spectrum as multi-sensor." Thesis, 2005. http://ndltd.ncl.edu.tw/handle/24850349965438907845.

Full text
Abstract:
Master's thesis
National Dong Hwa University
Department of Electrical Engineering
93
Controlling the relatively complicated plasma systems used at the stage of nano-scale device development is a great challenge in semiconductor manufacturing, especially for process-window stability, because the interactions among species in the plasma do not follow simple stoichiometry. Real-time fault detection helps nano-scale device manufacturing not only by detecting micro-changes inside the plasma, but also by supporting good critical-dimension control for the next generation of manufacturing technology. In this research, we apply principal component analysis (PCA), a multivariate statistical method, to the plasma's optical emission spectrum to correlate it with variations of the internal plasma parameters, and then develop a fault prediction system based on spectrum trends and time dynamics using a time series model, providing an effective plasma monitoring system. The detectable range of the plasma spectrum in this experiment is from 300 to 1000 nm. Using principal component analysis, we extract three dominant wavelengths, 751 nm, 764 nm and 812 nm, and establish a fault diagnostic system for each of them. The accuracy of this system corresponds to a 5% variation at an RF power of 50 W and a chamber pressure of 117 mtorr. We also establish a prediction system based on a time series model of these wavelengths. Nine faults were introduced and all of them were successfully detected with the model described in this thesis. All experimental data were collected at an RF power of 50 W and a chamber pressure of 117 mtorr.
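The following Python sketch illustrates the PCA step on synthetic emission spectra with lines at the three wavelengths reported above; the line shapes, fluctuation model, drift fault, and 3-sigma score threshold are assumptions for illustration, not the thesis's actual data or time series model.

    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(5)
    lam = np.arange(300, 1001)             # 300-1000 nm, 1 nm resolution

    def spectrum(drift=0.0):
        """Synthetic emission spectrum with lines near 751/764/812 nm; a common
        intensity fluctuation mimics normal variation, `drift` mimics a fault."""
        scale = 1.0 + 0.05 * rng.normal()
        s = 0.02 * rng.normal(size=lam.size)
        for center, amp in [(751, 1.0), (764, 0.8), (812, 0.9)]:
            s += (amp * scale + drift) * np.exp(-0.5 * (lam - center) ** 2)
        return s

    normal = np.array([spectrum() for _ in range(100)])                  # training data
    drifting = np.array([spectrum(drift=0.02 * k) for k in range(50)])   # test data

    pca = PCA(n_components=3).fit(normal)
    print("wavelengths with the largest PC1 loadings:",
          lam[np.argsort(np.abs(pca.components_[0]))[-3:]])

    # Simple fault indicator: deviation of each new spectrum's PCA scores from the
    # normal-condition scores, in units of the training standard deviation.
    ref = pca.transform(normal)
    z = np.abs(pca.transform(drifting) - ref.mean(0)) / ref.std(0)
    print("spectra flagged (any score beyond 3 sigma):", int((z > 3).any(axis=1).sum()))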
APA, Harvard, Vancouver, ISO, and other styles
