Dissertations / Theses on the topic 'Value anomaly'

Consult the top 22 dissertations / theses for your research on the topic 'Value anomaly.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Zhang, Lingsong; Marron, James Stephen; Zhu, Zhengyuan; Shen, Haipeng. "Functional singular value decomposition and multi-resolution anomaly detection." Chapel Hill, N.C.: University of North Carolina at Chapel Hill, 2007. http://dc.lib.unc.edu/u?/etd,1166.

Full text
Abstract:
Thesis (Ph. D.)--University of North Carolina at Chapel Hill, 2007.
Title from electronic title page (viewed Mar. 27, 2008). "... in partial fulfillment of the requirements for the degree of Doctor of Philosophy in the Department of Statistics and Operations Research." Discipline: Statistics and Operations Research; Department/School: Statistics and Operations Research.
2

Andrikopoulos, Panagiotis. "An investigation of the value anomaly in the UK stock market 1987-2000." Thesis, University of Portsmouth, 2002. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.247478.

Full text
3

Hepfer, Bradford Fitzgerald. "A closer examination of the book-tax difference pricing anomaly." Diss., University of Iowa, 2016. https://ir.uiowa.edu/etd/3096.

Full text
Abstract:
In this study, I examine whether the pricing of book-tax differences reflects mispricing or a priced risk factor. I provide new evidence that temporary book-tax differences are mispriced by developing portfolios that trade on the information in book-tax differences for future accruals and cash flows. I develop and test predictions on whether book-tax difference mispricing is the value-glamour anomaly in disguise. Both signals of mispricing relate to firm growth and, thus, both may capture mispricing due to over-extrapolation of realized growth to future growth. I find that the book-tax difference pricing anomaly is subsumed by the value-glamour anomaly. Specifically, trading on the information in book-tax differences does not yield incremental returns relative to a value-glamour trading strategy. Hence, mispricing associated with book-tax differences relates more generally to the mispricing of expected growth as extrapolated from past growth.
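For readers unfamiliar with this type of test, the hedge-portfolio construction described in the abstract can be sketched roughly as follows. This is a generic illustration, not Hepfer's actual research design; the column names `period`, `signal`, and `fwd_return` are hypothetical.

```python
import pandas as pd

def hedge_return(df: pd.DataFrame, signal: str = "signal",
                 ret: str = "fwd_return", n_bins: int = 5) -> float:
    """Long the top signal quintile, short the bottom quintile, and
    return the average equal-weighted hedge return across periods."""
    out = df.copy()
    # Rank firms into quintiles on the signal within each period.
    out["bin"] = out.groupby("period")[signal].transform(
        lambda s: pd.qcut(s, n_bins, labels=False, duplicates="drop"))
    top = out[out["bin"] == n_bins - 1].groupby("period")[ret].mean()
    bottom = out[out["bin"] == 0].groupby("period")[ret].mean()
    return float((top - bottom).mean())
```

A subsumption test of the kind reported above would then check whether sorting on book-tax differences still earns a positive hedge return after first conditioning on the value-glamour signal, for example with a double sort.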
4

Haboub, Ahmad. "Essays on equity valuation and accounting conservatism for insurance companies." Thesis, Brunel University, 2017. http://bura.brunel.ac.uk/handle/2438/15823.

Full text
Abstract:
This thesis contributes to the finance and accounting literature through its three empirical chapters. The first empirical chapter contributes to the literature on accounting conservatism in several ways. First, it investigates the accounting conservatism of US insurance companies using four measures, namely non-operating accruals, the skewness of earnings and cash flows, the book-to-market ratio, and asymmetric timeliness. Second, it compares these four measures in order to determine the associations and differences between them. Finally, the level of accounting conservatism of the insurance companies is compared to that of a sample of commercial banks to check whether they have similar levels of accounting conservatism. The results of the first chapter suggest that changes in accounting performance, as measured by return on assets, can be partly explained by accounting conservatism, whether conservatism is measured by the accumulation of non-operating accruals, the skewness of operating cash flows and accruals, the book-to-market ratio, the adjusted book-to-market ratio or Basu's asymmetric timeliness measure. All four measures give robust evidence that insurance companies' accounts tended to be conservative over the whole sample period, and that the level of conservatism has risen over the years. More interestingly, a t-test for differences in means suggests that accruals conservatism shows, on average, a higher level of accounting conservatism than book-value conservatism does. Finally, our results, based on a constant sample of 92 banks and 46 insurance companies whose data are available for all sample years, suggest that insurance companies and banks have similar levels of accounting conservatism, owing to their similar reporting characteristics.
The second empirical chapter contributes to the existing literature on equity valuation in two ways. First, it confirms the importance of imposing linear information dynamics when predicting the equity values of insurance companies, because the restricted models produce smaller error metrics. Second, it highlights the role of accrual components in the equity valuation of US insurance companies by demonstrating that incorporating accrual components into the residual income valuation model of Ohlson (1995) yields smaller error metrics than using aggregate net income. Our results are based on a sample of US insurance companies consisting of 718 firm-year observations over the period 2001 to 2012. They suggest that total accruals, changes in insurance reserves, changes in accounts receivable, and deferred acquisition costs have incremental ability to predict equity market value beyond abnormal earnings and book values. Furthermore, without imposing the LIM structures, the predictive ability of changes in insurance reserves is higher than that of changes in accounts receivable and changes in deferred acquisition costs; when the LIM structure is imposed, the predictive ability of changes in deferred acquisition costs is higher than that of both changes in accounts receivable and changes in insurance reserves.
The final empirical chapter contributes to the literature on accounting anomalies by investigating the value-to-price (V/P) anomaly, where the fundamental value (V) is estimated using the residual income valuation model. Motivated by the findings of Hwang and Lee (2013), Fama and French (2015), and Fama and French (2016), Chapter Four asks whether V/P strategies reflect risk factors or are better explained by market inefficiency, and whether Fama and French's five-factor model can explain the excess return of V/P. To answer these questions we use data merged from COMPUSTAT, CRSP, and I/B/E/S for all non-financial firms listed on AMEX, NYSE, and NASDAQ during the period 1987 to 2015. Our findings suggest that the V/P ratio is positively correlated with future stock returns after controlling for several firm characteristics that are known proxies for common risks, indicating that the omission of risk factors is unlikely to explain the V/P effect. To answer the second question, we compare the performance of different asset pricing models by calculating GRS F-statistics. Our findings clearly indicate that the Fama-French five-factor model performs better than either the CAPM or the traditional Fama-French three-factor model. These results confirm that the excess returns of the V/P strategy vary with differences in size, the B/M ratio, operating profit and betas across quintile portfolios. However, these factors cannot explain all the variation in excess returns; moreover, stocks in the high V/P portfolios may be riskier than stocks in the low V/P portfolios along certain other dimensions.
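The residual income valuation mentioned in the final chapter values equity as current book value plus the discounted stream of expected residual income. The sketch below illustrates that calculation under simplifying assumptions (a flat cost of equity, clean-surplus accounting with full earnings retention, and the final-year residual income continued as a flat perpetuity); it shows the general approach, not the exact specification used in the thesis.

```python
def residual_income_value(book0: float, roe_forecasts: list, r: float) -> float:
    """Fundamental value V = B0 + sum_t (ROE_t - r) * B_{t-1} / (1 + r)^t,
    with the final year's residual income continued as a flat perpetuity."""
    value, book = book0, book0
    ri = 0.0
    for t, roe in enumerate(roe_forecasts, start=1):
        ri = (roe - r) * book            # residual income earned in year t
        value += ri / (1.0 + r) ** t     # discount it back to today
        book *= 1.0 + roe                # clean surplus, no dividends assumed
    # Terminal value: last residual income treated as a perpetuity beyond the horizon.
    value += (ri / r) / (1.0 + r) ** len(roe_forecasts)
    return value

# Example: book value 10, three years of forecast ROE, 9% cost of equity.
print(residual_income_value(10.0, [0.15, 0.14, 0.13], 0.09))
```

A V/P sort then divides such an estimate by the market price and ranks firms into quintile portfolios, whose excess returns the competing factor models are asked to price.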
5

Westerlind, Simon. "Anomaly Detection for Portfolio Risk Management : An evaluation of econometric and machine learning based approaches to detecting anomalous behaviour in portfolio risk measures." Thesis, KTH, Nationalekonomi, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-232131.

Full text
Abstract:
Financial institutions manage numerous portfolios whose risk must be managed continuously, and the large amounts of data that have to be processed make this a considerable effort. As such, a system that autonomously detects anomalies in the risk measures of financial portfolios would be of great value. To this end, two econometric models, ARMA-GARCH and EWMA, and two machine learning based algorithms, LSTM and HTM, were evaluated for the task of performing unsupervised anomaly detection on streaming time series of portfolio risk measures. Three datasets of returns and Value-at-Risk series were synthesized, and one dataset of real-world Value-at-Risk series had labels handcrafted for the experiments in this thesis. The results revealed that the LSTM has great potential in this domain, due to its ability to adapt to different types of time series and its effectiveness at finding a wide range of anomalies. The EWMA had the benefit of being faster and more interpretable, but lacked the ability to capture anomalous trends. The ARMA-GARCH was found to have difficulties in finding a good fit to the time series of risk measures, resulting in poor performance, and the HTM was outperformed by the other algorithms in every regard, due to an inability to learn the autoregressive behaviour of the time series.
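As a rough illustration of the EWMA baseline evaluated above, one can keep exponentially weighted estimates of the level and variance of the risk-measure series and flag observations that deviate from the smoothed level by more than a chosen number of standard deviations. This is a generic sketch, not the thesis's implementation; the smoothing constant and the threshold are arbitrary choices.

```python
import numpy as np

def ewma_anomalies(x: np.ndarray, lam: float = 0.94, k: float = 3.0) -> np.ndarray:
    """Flag points whose deviation from the EWMA level exceeds k EWMA std devs."""
    mean = float(x[0])
    var = float(np.var(x[:20])) if len(x) >= 20 else 1.0  # warm-up variance
    flags = np.zeros(len(x), dtype=bool)
    for t in range(1, len(x)):
        resid = x[t] - mean
        flags[t] = abs(resid) > k * np.sqrt(var)
        # Update the exponentially weighted estimates after scoring the point.
        mean = lam * mean + (1.0 - lam) * x[t]
        var = lam * var + (1.0 - lam) * resid ** 2
    return flags
```

An LSTM-based detector would typically replace the smoothed level with a learned one-step-ahead forecast and flag large forecast errors in the same spirit.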
6

Abrahamsson, Isak, and Malin Karlsson. "Värdeinvestering – en hållbar strategi för överavkastning? : Ett test av investeringsstrategin F_SCORE på värdeaktier med hög book-to-market kvot." Thesis, Högskolan i Gävle, Företagsekonomi, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:hig:diva-26119.

Full text
Abstract:
Aim: The main aim is to test whether Piotroski's F_SCORE applied to stocks with a high book-to-market ratio can outperform the market portfolio and, as a consequence, to assess the degree of market efficiency. The secondary aim is to provide knowledge to business executives about the relevance of the book-to-market ratio. Method: This is a quantitative study that assumes a positivistic research philosophy with a hypothetico-deductive approach. Several regression analyses have been used to confirm the statistical significance of the estimated parameters. The empirical results answer two hypotheses based on the aim of this research. The empirical data have been collected from Thomson Reuters Datastream, compiled in Excel and analyzed with the statistical software Stata. Results & Conclusions: The empirical results show that the value portfolio has a higher return than the market index, and that the risk-adjusted return of the value portfolio is higher than that of the market portfolio. This indicates that the higher return of the value portfolio is not due to higher risk. Based on these results it is not possible to determine whether the market is fully efficient or not; it is only possible to rule out the strong and semi-strong forms of market efficiency. Suggestions for future research: We suggest further research on the weak form of market efficiency. Using historical data to predict future returns, as in the contrarian model, is one way to obtain further evidence of market (in)efficiency. Since the F_SCORE is approximately normally distributed, and because of the poor performance of low-F_SCORE firms, another suggestion is to short-sell these stocks to see whether the return can be increased. The field also needs further research on which factors cause the higher return for these stocks; the small-firm effect, liquidity and behavioral finance are just a few anomalies that may be related to excess returns. Contribution of the thesis: The investment strategy in this research delivers a higher excess return than the market index as well as a higher risk-adjusted return over the given period, which is a contribution both to investors and to the theoretical debate on the efficient market hypothesis. The F_SCORE distribution is approximately normal, and stocks with an F_SCORE of 5 or higher generally have a higher mean return. Another contribution is the relevance of the book-to-market ratio as a useful metric for valuing companies. The practical contribution is to give business executives a better understanding of the relevance of the book-to-market ratio when attracting investors.
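Piotroski's F_SCORE referred to above is simply the sum of nine binary fundamental signals computed from two consecutive annual reports. The sketch below condenses that scoring step; the dictionary field names are hypothetical, and the accruals signal is approximated by comparing cash flow from operations with net income (taken here as ROA times total assets).

```python
def f_score(cur: dict, prev: dict) -> int:
    """Sum of nine binary signals covering profitability, leverage/liquidity
    and operating efficiency, in the spirit of Piotroski (2000)."""
    signals = [
        cur["roa"] > 0,                                  # positive return on assets
        cur["cfo"] > 0,                                  # positive operating cash flow
        cur["roa"] > prev["roa"],                        # improving ROA
        cur["cfo"] > cur["roa"] * cur["assets"],         # CFO exceeds net income
        cur["leverage"] < prev["leverage"],              # lower long-term leverage
        cur["current_ratio"] > prev["current_ratio"],    # improving liquidity
        cur["shares_out"] <= prev["shares_out"],         # no new equity issued
        cur["gross_margin"] > prev["gross_margin"],      # improving gross margin
        cur["asset_turnover"] > prev["asset_turnover"],  # improving asset turnover
    ]
    return sum(signals)
```

The portfolio test described above then restricts the investment universe to high book-to-market stocks and tracks the firms with high scores against the market index.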
7

Warsi, Mohammed Ali. "Ebstein anomaly of the tricuspid valve in an adult cohort." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape8/PQDD_0003/MQ46203.pdf.

Full text
8

Jämtander, Jämtander. "Models explaining the average return on the Stockholm Stock Exchange." Thesis, Högskolan i Jönköping, Internationella Handelshögskolan, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:hj:diva-40360.

Full text
Abstract:
Using three different models, we examine the determinants of average stock returns on the Stockholm Stock Exchange during 2012-2016. Using time-series data, we find that a Fama-French three-factor model (directed at capturing size and the book-to-market ratio) works quite well in the Swedish stock market and explains the variation in returns better than the traditional CAPM. Additionally, we investigate whether adding a price/earnings variable to the Fama-French model increases the explanatory power for the expected returns of the different portfolios used as dependent variables. We conclude that the P/E ratio does not influence the expected returns in the sample we used.
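The time-series tests described above boil down to regressing each portfolio's excess return on the market, SMB and HML factors, optionally augmented with a P/E-based variable. Below is a minimal ordinary-least-squares sketch; the input arrays are assumed to be aligned return series and are not taken from the thesis.

```python
import numpy as np

def factor_regression(excess_ret, mkt_rf, smb, hml, extra=None):
    """OLS of portfolio excess returns on the FF3 factors, plus an optional
    extra factor (e.g. a P/E-based one). Returns (alpha, betas, R^2)."""
    cols = [np.ones_like(mkt_rf), mkt_rf, smb, hml]
    if extra is not None:
        cols.append(extra)
    X = np.column_stack(cols)
    coef, _, _, _ = np.linalg.lstsq(X, excess_ret, rcond=None)
    resid = excess_ret - X @ coef
    r2 = 1.0 - resid.var() / excess_ret.var()
    return coef[0], coef[1:], r2
```

Comparing the R^2 (and the significance of the extra coefficient) with and without the P/E variable is one way to judge whether it adds explanatory power.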
9

Boulfani, Fériel. "Caractérisation du comportement de systèmes électriques aéronautiques à partir d'analyses statistiques." Thesis, Toulouse 1, 2021. http://publications.ut-capitole.fr/43780/.

Full text
Abstract:
The characterization of electrical systems is an essential task in aeronautical design. It involves, in particular, sizing the electrical components, defining the requirements that electrical loads must satisfy, defining maintenance intervals, and identifying the root causes of aircraft failures. Today these computations rely on electrical engineering theory and simulated physical models. The aim of this thesis is to use statistical approaches based on flight data, together with machine learning models, to characterize the behavior of aeronautic electrical systems. In the first part, we estimate the maximal electrical consumption that the generator should deliver, in order to optimize the generator sizing and to better understand its real margin. Using extreme value theory, we estimate quantiles that we compare to the theoretical values computed by the electrical engineers. In the second part, we compare different regularized procedures to predict the oil temperature of an electrical generator in a functional data framework. In particular, this study makes it possible to understand the generator's behavior under extreme conditions that cannot be reproduced physically. Finally, in the last part, we develop a predictive maintenance model that detects abnormal behavior of a generator in order to anticipate failures. This model is based on variants of Invariant Coordinate Selection adapted to functional data.
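Extreme quantiles of the kind mentioned in the first part are typically obtained with a peaks-over-threshold approach: a generalized Pareto distribution is fitted to the exceedances over a high threshold and extrapolated to the target probability. The sketch below illustrates that generic recipe with scipy; the threshold and target probability are placeholders, and this is not necessarily the exact estimator used in the thesis.

```python
import numpy as np
from scipy.stats import genpareto

def pot_quantile(x: np.ndarray, threshold: float, p: float) -> float:
    """Estimate the p-quantile of x (p close to 1) from a generalized Pareto
    fit to the exceedances over `threshold` (peaks-over-threshold method)."""
    exceedances = x[x > threshold] - threshold
    xi, _, sigma = genpareto.fit(exceedances, floc=0.0)  # shape, loc (fixed), scale
    n, n_u = len(x), len(exceedances)
    tail_prob = (n / n_u) * (1.0 - p)
    if abs(xi) < 1e-9:                  # exponential-tail limit of the GPD
        return threshold - sigma * np.log(tail_prob)
    return threshold + (sigma / xi) * (tail_prob ** (-xi) - 1.0)
```

Comparing such an estimate (say, a very high quantile of in-flight consumption) with the deterministic sizing value gives the kind of margin discussed above.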
10

Silva, José Pedro da. "Nova técnica cirúrgica para a correção da anomalia de Ebstein: resultados imediatos e em longo prazo." Universidade de São Paulo, 2008. http://www.teses.usp.br/teses/disponiveis/5/5156/tde-28012009-154640/.

Full text
Abstract:
Background: The main operations for Ebstein's anomaly repair are conceived to reconstruct the tricuspid valve (TV) in a monocusp format, but their results are limited either by the need for valve replacement or by a high incidence of postoperative valve regurgitation. A new surgical technique was developed that performs an anatomical reconstruction of the tricuspid valve, achieving leaflet-to-leaflet coaptation at TV closure. The objective of this study is to assess the feasibility of this technique, evaluating its effects on clinical outcome, tricuspid valve function, right ventricle (RV) morphology and reverse remodeling of the heart. Methods: Retrospective study of 52 consecutive patients, mean age 18.5 ± 13.8 years, treated with a new surgical technique for Ebstein's anomaly repair (the cone technique) between November 1993 and December 2006. Its principal details are: the anterior and posterior tricuspid valve leaflets are mobilized from their anomalous attachments in the RV; the free edge of this complex is rotated clockwise to be sutured to the septal border of the anterior leaflet, creating a cone whose vertex remains fixed at the RV apex and whose base is sutured to the true tricuspid annulus, plicated to match the base of the cone. The septal leaflet is incorporated into the cone wall whenever possible. The atrialized chamber is reduced by longitudinal plication. The clinical and echocardiographic data and the patients' cardiothoracic ratios, collected in the preoperative, early and late postoperative periods, were analyzed. Results: There were two hospital deaths (3.8%) and two further deaths during long-term follow-up. Significant clinical improvement was evident in the change of the patients' functional class of heart failure (NYHA) from IV = 4, III = 27, II = 11 and I = 5 preoperatively to IV = 0, III = 1, II = 2 and I = 44 at a mean long-term follow-up of 57 months (p < 0.0001). Four patients required late TV re-repair. Atrioventricular block did not occur and there was no need for tricuspid valve replacement at any time. The cardiothoracic ratio decreased from 0.66 ± 0.09 preoperatively to 0.54 ± 0.06 at long-term follow-up (p < 0.001). Echocardiographic studies showed a significant reduction in TV insufficiency, the preoperative distribution of grade 1 = 0, grade 2 = 1, grade 3 = 15, grade 4 = 24 changing to grade 1 = 19, grade 2 = 17, grade 3 = 4, grade 4 = 0 in the early postoperative period (p < 0.001), with little change afterwards (grade 1 = 11, grade 2 = 22, grade 3 = 7, grade 4 = 0). Normal RV morphology was surgically restored, as indicated by the enlargement of the indexed functional RV area from 8.53 ± 7.02 cm2/m2 preoperatively to 21.01 ± 6.87 cm2/m2 in the early postoperative period (p < 0.001), remaining unchanged at 20.28 ± 5.26 cm2/m2 on long-term echocardiography (p > 0.05). Conclusions: This operative technique was feasible, with low hospital mortality and no need for TV replacement. There was improvement in the patients' clinical status and a low incidence of reoperations in long-term follow-up. The TV repair was efficacious and durable for the great majority of patients, and there was immediate restoration of RV morphology and reverse remodeling of the heart in long-term follow-up.
11

Schönenberger, Fabian. "Kennzahlen in Faktormodellen: Untersuchung von Anlagestrategien mit betriebswirtschaftlichen Kennzahlen basierend auf der Value-Anomalie." St. Gallen, 2008. http://www.biblio.unisg.ch/org/biblio/edoc.nsf/wwwDisplayIdentifier/02600088002/$FILE/02600088002.pdf.

Full text
12

Goix, Nicolas. "Apprentissage automatique et extrêmes pour la détection d'anomalies." Thesis, Paris, ENST, 2016. http://www.theses.fr/2016ENST0072/document.

Full text
Abstract:
Anomaly detection is not only a useful preprocessing step for training machine learning algorithms; it is also a crucial component of many real-world applications in fields such as finance, insurance, telecommunications, computational biology, health and environmental sciences. It is also more and more relevant in the modern world, as an increasing number of autonomous systems need to be monitored and diagnosed. Important research areas in anomaly detection include the design of efficient algorithms and their theoretical study, but also the evaluation of such algorithms, in particular when no labeled data are available, as in many industrial setups; in other words, model design and study, and model selection. In this thesis we address both of these aspects. We first propose an alternative to the existing mass-volume criterion for measuring the performance of a scoring function. Then we focus on extreme regions, which are of particular interest in anomaly detection for obtaining lower false alarm rates. Finally, two heuristic methods are proposed: the first to evaluate anomaly detection algorithms on high-dimensional data, the other to extend the use of random forests to the one-class setting.
13

Caussé, Brigitte. "Insuffisance mitrale et naissance anormale de la coronaire gauche : remplacement valvulaire mitral : à propos d'une observation." Bordeaux 2, 1990. http://www.theses.fr/1990BOR2M109.

Full text
14

Di Giacomo, Stefania. "Essays on financial markets and on effects of information and communication technology." Doctoral thesis, Università degli Studi di Roma "Tor Vergata", 2005. http://hdl.handle.net/2108/39.

Full text
Abstract:
The present dissertation is divided into four empirical essays. The first essay tests the performance of "value" and "growth" portfolio strategies formed on deviations between observed and discounted cash flow (DCF) fundamental values, using the four-factor CAPM model. The results show that, in both the American and European stock exchanges, "short-term DCF value" strategies (based on a monthly selection of the stocks with the lowest observed-to-fundamental ratio in the previous period) have mean monthly returns that are higher than not only the corresponding growth strategies but also passive buy-and-hold strategies on the total sample portfolio (the benchmark). The second essay studies how much "fundamental" and "non-fundamental" components matter in determining stock prices, according to differences in regulatory environments between countries and in the composition of financial market investors. Empirical results show that the "fundamental" P/E explains a significant share of the variation of the observed P/E, especially for US stocks (where there is more transparency of information and a more pervasive presence of pension funds), while evidence of insider trading is present only in the EU sample. The third essay analyzes the contribution of information and communication technology (ICT) to levels and growth of per capita GDP. The two hypotheses, that ICT adds value to traditional physical capital or removes the "bottlenecks" which limit access to knowledge, improve upon the classical MRW (1992)-Islam (1995) framework. The improvement of "within"-country significance in panel estimates documents that this approach captures two dimensions of time-varying, country-specific technological progress. The fourth essay studies, with a random coefficient model, the role of technology as a factor which, by affecting women's empowerment and productivity, has significant effects on fertility decisions. The empirical results show that ICT diffusion has a significant negative effect on fertility rates, after controlling for human capital and institutional quality. Moreover, this effect is highly heterogeneous across macro-areas (five subgroups of countries are optimally identified) because of three latent factors: the pro-fertility religious norms of Catholic and Islamic culture, the degree of secularization and education of a country, and the digital divide.
15

Chan, Shun-Hsing, and 詹順興. "The Study of Value-Glamour Anomaly." Thesis, 2007. http://ndltd.ncl.edu.tw/handle/12314644156632802185.

Full text
Abstract:
Master's thesis
Tunghai University
Department of Accounting
Academic year 95 (ROC calendar)
This thesis analyzes the value-glamour anomaly for firms listed on the Taiwan Stock Exchange from 1996 to 2004 and further investigates its source. The study has two parts. In the first, a value-glamour factor is added to a one-factor model and to the Fama and French (1993) three-factor model to analyze the value-glamour anomaly. In the second, factor-loading and cross-sectional tests are used to identify the source of the anomaly. Portfolios are formed controlling for size and the earnings-to-price ratio. The results show a reversed size effect and an earnings-to-price effect in Taiwan; the one-factor and four-factor model tests indicate the existence of a value-glamour anomaly, and the four-factor model explains it better than the three-factor model. Finally, the loading and cross-sectional tests suggest that the source of the value-glamour anomaly is mispricing, although a small portion attributable to risk cannot be excluded. This result is consistent with La Porta et al. (1997), Daniel and Titman (1997), and Skinner and Sloan (2002).
16

Dal Maso, Lorenzo. "Accounting anomalies, fundamental analysis and variables reduction." Doctoral thesis, 2014. http://hdl.handle.net/2158/1003710.

Full text
Abstract:
"It is inconceivable that accounting data can be analyzed without transferring it into ratios, in one way or another..." James O. Horrigan (July 1965). With this sentence, James Horrigan, then Assistant Professor at the University of Notre Dame, concluded his 1965 article "Some Empirical Bases of Financial Ratio Analysis". From that moment there was a genuine explosion of interest in financial statement analysis as a method for evaluating and comparing firm performance, no longer merely a tool for credit analysts to investigate creditworthiness (Horrigan, 1965). Many years later, however, the question remains open: which ratios should be used, given the huge number of computable indicators? Which of them are most meaningful for comparison and prediction? Over the years, numerous scholars have tried to answer this question by approaching the problem pragmatically, deductively or inductively. The pragmatic method consisted of a subjective classification of ratios based mostly on the writer's personal experience; the deductive approach sought instead to identify classification schemes deductively (e.g., the Du Pont triangle of 1919); while the inductive approach, of logical-positivist epistemological orientation, used ad hoc statistical methodologies to group the indicators empirically (Salmi and Martikainen, 1994). Starting from the contributions on the empirical classification of financial ratios, this work investigates whether a different classification of ratios emerges when a firm is labelled as value or growth. The motivation is the so-called "value anomaly", that is, the tendency of value companies (those with high book-to-market ratios) to outperform growth companies (those with low book-to-market ratios; Zacks, 2011). The literature has argued that this anomaly depends largely on investors' behavioral biases, so it is interesting to verify whether differences between the two types of companies exist at the level of ratio classification. The thesis is composed of the following parts: 1. The first chapter presents the theoretical background, the hypotheses, the research objective, the relevance of the study and its contributions to the literature. 2. The second chapter reviews the main literature underlying the work, in particular on (a) classification patterns of ratios and (b) fundamental analysis anomalies. 3. The third chapter presents the main statistical methodologies used in the empirical analysis, explaining how principal component analysis works and how it was applied to the sample under examination, and describing the sample, the variables and the main steps of the analysis. 4. The fourth and final chapter presents the results of the empirical analysis, along with its limitations and opportunities for future research on this topic.
17

Chen, Mu-Jen, and 陳睦仁. "How to Arbitrage on the Recovery of Fundamental Value-to-Price Anomaly?" Thesis, 2008. http://ndltd.ncl.edu.tw/handle/31529097755898219795.

Full text
Abstract:
Master's thesis
Yuan Ze University
Department of Finance
Academic year 96 (ROC calendar)
Shleifer and Vishny (1997) argue that arbitrage can be both costly and risky. As a result, arbitrageurs will not exploit arbitrage opportunities if the costs and risk of arbitrage exceed its benefits, thereby allowing mispricing to survive for long periods of time. Frankel and Lee (1998) document that the fundamental value-to-price (Vf/P) ratio predicts future abnormal returns for up to three years, where Vf is an estimate of fundamental value based on a residual income model that uses analyst earnings forecasts. Ali, Hwang and Trombley (2003a) further show that their results seem consistent with the mispricing explanation rather than with the risk explanation of the Vf/P effect. Thus, the Vf/P effect provides a good means to examine the limits of arbitrage. Wei and Zhang (2006) find that firm age, earnings quality, and divergence of opinion have incremental power beyond other measures of risk in explaining the cross-sectional variation in the Vf/P effect, which appears consistent with the limits-of-arbitrage argument, and they provide a corresponding arbitrage strategy. However, that strategy takes a long time to pay off, up to three years, and we hope to find one that is effective in the short term. Akhigbe, Larson and Madura (2002) treat a 10% daily rise or fall as a signal of overreaction or underreaction. We therefore examine the relation between large daily returns and Vf/P, and whether the stock price converges to its fundamental value quickly after a large daily return.
18

Lancastre, Pedro Jácome Henriques de. "The Relation Between Post-Earnings Announcement Drift and the Value Anomaly in the UK Stock Market." Master's thesis, 2017. https://repositorio-aberto.up.pt/handle/10216/108900.

Full text
19

Lancastre, Pedro Jácome Henriques de. "The Relation Between Post-Earnings Announcement Drift and the Value Anomaly in the UK Stock Market." Dissertação, 2017. https://repositorio-aberto.up.pt/handle/10216/108900.

Full text
20

Lin, Yu-Hsien, and 林祐賢. "Application of Predomination Period and Background Value Anomaly to Establish Earthquake Early Warning Model-Application in Chiufenershan Landslide." Thesis, 2011. http://ndltd.ncl.edu.tw/handle/15336556933593067565.

Full text
Abstract:
Master's thesis
National Chung Hsing University
Department of Civil Engineering
Academic year 99 (ROC calendar)
This study applies P-wave data from seismic monitoring stations to dangerous slopes and underground pipelines, and uses the slope's critical acceleration to establish early-warning criteria and to develop a prototype earthquake early warning system. Based on the P-wave predominant period concept, the study seeks the relation between the predominant period and peak ground acceleration for the Chiufenershan landslide. Seismic records from the past 17 years at four Central Weather Bureau monitoring stations near Chiufenershan were obtained for the correlation analysis, and a Matlab program was used to fit the relationship between predominant period and peak ground acceleration in all three spatial directions. In addition, the study uses the concept of background-value anomaly: constant background values for morning, afternoon and evening are combined with the average maximum displacement and the maximum displacement within 3 seconds of the P-wave arrival to derive a strength index, which is then regressed against the peak ground acceleration in the three spatial directions to find the best relationship. From the predominant-period results, warning formulas are obtained for the Chiufenershan landslide area and the Chung Hsing University area. For example, for intensity 5 the acceleration range is 80 gal to 250 gal, corresponding to estimated predominant-period ranges of 12.1 to 34.1 and 14.5 to 76.9, respectively. The background-anomaly analysis found that when the strength index (K) lies between 18.64 and 19.43, the slope is likely to reach the critical sliding acceleration that triggers a landslide. Besides the Chiufenershan warning criteria, the study also analyses two stations in the Chung Hsing University area (the Chung Hsiao and Chong Guang primary school stations). A Visual Basic 6.0 program together with the new Palert seismic instrument is used to establish a regional, automated earthquake early warning system prototype for the Chung Hsing University area.
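The P-wave parameters at the heart of this kind of early-warning work are commonly a predominant-period estimate (such as the tau_c parameter) and the peak displacement Pd measured over the first few seconds after the P arrival. The sketch below shows that generic computation only; the 3-second window, the input traces and the sampling step are assumptions, and the regression linking these parameters to peak ground acceleration, as calibrated in the thesis, is not reproduced here.

```python
import numpy as np

def p_wave_parameters(disp: np.ndarray, vel: np.ndarray, dt: float,
                      window_s: float = 3.0):
    """Predominant period tau_c = 2*pi*sqrt(sum(u^2)/sum(v^2)) and peak
    displacement Pd, both over the first `window_s` seconds after the P pick.
    `disp` and `vel` are displacement and velocity traces starting at the pick."""
    n = int(window_s / dt)
    u, v = disp[:n], vel[:n]
    # The sampling step cancels in the ratio of the two discrete integrals.
    tau_c = 2.0 * np.pi * np.sqrt(np.sum(u ** 2) / np.sum(v ** 2))
    pd_max = float(np.max(np.abs(u)))
    return tau_c, pd_max
```

An early-warning rule of the type described above then maps such parameters (or the strength index K) onto an expected peak ground acceleration and compares it with the slope's critical sliding acceleration.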
21

Sun, Le. "Data stream mining in medical sensor-cloud." Thesis, 2016. https://vuir.vu.edu.au/31032/.

Full text
Abstract:
Data stream mining has been studied in diverse application domains. In recent years, population aging has been stressing national and international health care systems, and with the advent of hundreds of thousands of health monitoring sensors, traditional wireless sensor networks and anomaly detection techniques cannot handle such huge amounts of information. Sensor-cloud makes the processing and storage of big sensor data much easier. Sensor-cloud is an extension of the cloud that connects wireless sensor networks (WSNs) and the cloud through sensor and cloud gateways, which consistently collect and process large amounts of data from sensors located in different areas. In this thesis, I focus on analysing a large volume of medical sensor data streams collected from Sensor-cloud. To analyse the medical data streams, I propose a medical data stream mining framework, which is targeted on tackling four main challenges ...
22

Dughetti, Francesca. "Studio sulle anomalie in metalli pesanti nelle diverse matrici ambientali della zona costiera tirrenica fra la valle della Bruna e del Cornia." Doctoral thesis, 2013. http://hdl.handle.net/2158/796878.

Full text
Abstract:
A geochemical and mineralogical study of the presence of arsenic and heavy metals in the different environmental matrices (soils, stream sediments and waters) of the Tyrrhenian coastal zone between the Bruna and Cornia valleys.