Dissertations / Theses on the topic 'Evaluation of extreme classifiers'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the top 49 dissertations / theses for your research on the topic 'Evaluation of extreme classifiers.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.
Legrand, Juliette. "Simulation and assessment of multivariate extreme models for environmental data." Electronic Thesis or Diss., université Paris-Saclay, 2022. http://www.theses.fr/2022UPASJ015.
Full text
Accurate estimation of the occurrence probabilities of extreme environmental events is a major issue for risk assessment. For example, in coastal engineering, the design of structures installed at or near the coast must be such that they can withstand the most severe events they may encounter in their lifetime. This thesis focuses on the simulation of multivariate extremes, motivated by applications to significant wave height, and on the evaluation of models predicting the occurrences of extreme events. In the first part of the manuscript, we propose and study a stochastic simulator that, given offshore conditions, jointly produces offshore and coastal extreme significant wave heights (Hs). We rely on bivariate peaks over threshold and develop a non-parametric simulation scheme for bivariate generalised Pareto distributions. From this joint simulator, we derive a conditional simulation model. Both simulation algorithms are applied to numerical experiments and to extreme Hs near the French Brittany coast. A further development addresses the marginal modelling of Hs: to take non-stationarities into account, we adapt the extended generalised Pareto model, letting the marginal parameters vary with the peak period and the peak direction. The second part of this thesis provides a more theoretical development. To evaluate different prediction models for extremes, we study the specific case of binary classifiers, which represent the simplest type of forecasting and decision-making situation: an extreme event did or did not occur. Risk functions adapted to binary classifiers of extreme events are developed, answering our second question. Their properties are derived under the framework of multivariate regular variation and hidden regular variation, which allows finer types of asymptotic independence to be handled. This framework is applied to extreme river discharges.
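As an illustration of the peaks-over-threshold building block used in this line of work, the sketch below fits a univariate generalised Pareto distribution to simulated exceedances by the method of moments. It is a toy stand-in, not the thesis's bivariate simulator; the parameter values and sample size are arbitrary choices for illustration.

```python
import numpy as np

def fit_gpd_moments(exceedances):
    """Method-of-moments fit of a generalised Pareto distribution (GPD):
    returns estimates of the shape xi and scale sigma."""
    m, v = exceedances.mean(), exceedances.var()
    xi = 0.5 * (1.0 - m * m / v)          # shape estimate
    sigma = 0.5 * m * (m * m / v + 1.0)   # scale estimate
    return xi, sigma

def sample_gpd(xi, sigma, n, rng):
    """Draw GPD exceedances by inverting the distribution function."""
    u = rng.uniform(size=n)
    if abs(xi) < 1e-12:                   # xi = 0 reduces to the exponential
        return -sigma * np.log(1.0 - u)
    return sigma / xi * ((1.0 - u) ** (-xi) - 1.0)

rng = np.random.default_rng(0)
# synthetic stand-in for wave-height exceedances over a high threshold
x = sample_gpd(0.1, 2.0, 50_000, rng)
xi_hat, sigma_hat = fit_gpd_moments(x)
```

With 50,000 samples the moment estimates land close to the true shape 0.1 and scale 2.0; real applications typically prefer maximum likelihood, but the moment fit keeps the sketch self-contained.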
Lavesson, Niklas. "Evaluation and Analysis of Supervised Learning Algorithms and Classifiers." Licentiate thesis, Karlskrona : Blekinge Institute of Technology, 2006. http://www.bth.se/fou/Forskinfo.nsf/allfirst2/c655a0b1f9f88d16c125714c00355e5d?OpenDocument.
Full text
Nygren, Rasmus. "Evaluation of hyperparameter optimization methods for Random Forest classifiers." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-301739.
Full text
To create a machine learning model, one often needs to choose various hyperparameters that configure the model's properties. The performance of such a model depends strongly on the choice of these hyperparameters, which makes it relevant to investigate how hyperparameter optimization can affect the classification accuracy of a machine learning model. In this study, we train and evaluate a Random Forest classifier whose hyperparameters are set to default values and compare it with a classifier whose hyperparameters are determined by three different hyperparameter optimization (HPO) methods: Random Search, Bayesian Optimization, and Particle Swarm Optimization. This is done on three different datasets, and each HPO method is evaluated based on the change in classification accuracy it yields across these datasets. We found that each HPO method resulted in an overall increase in classification accuracy of roughly 2-3% across all datasets compared with the accuracy the classifier achieved with the default hyperparameter values. Owing to limitations of time and data, we could not determine whether this positive effect generalises to a larger scale. The conclusion drawn instead was that the usefulness of hyperparameter optimization methods depends on the dataset they are applied to.
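The random-search HPO loop that this kind of study evaluates can be sketched generically as below. The search space and the `toy_accuracy` objective are hypothetical stand-ins; a real run would replace the objective with the cross-validated accuracy of a Random Forest trained with the sampled configuration.

```python
import random

# hypothetical search space for a Random Forest classifier
SPACE = {
    "n_estimators": [50, 100, 200, 400],
    "max_depth": [4, 8, 16, None],
    "min_samples_leaf": [1, 2, 5, 10],
}

def random_search(evaluate, space, n_iter, seed=0):
    """Plain random search: sample n_iter configurations uniformly
    from the space and keep the best-scoring one."""
    rng = random.Random(seed)
    best_cfg, best_score = None, float("-inf")
    for _ in range(n_iter):
        cfg = {k: rng.choice(v) for k, v in space.items()}
        score = evaluate(cfg)
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score

def toy_accuracy(cfg):
    """Stand-in for cross-validated accuracy: peaks at depth 8, 200 trees."""
    depth = cfg["max_depth"] or 32
    return 0.9 - 0.001 * abs(depth - 8) - 0.0001 * abs(cfg["n_estimators"] - 200)

best, score = random_search(toy_accuracy, SPACE, n_iter=200)
```

Bayesian optimization and particle swarm optimization replace the uniform sampling step with a model- or population-guided proposal, but share the same evaluate-and-keep-best skeleton.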
Dang, Robin, and Anders Nilsson. "Evaluation of Machine Learning classifiers for Breast Cancer Classification." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-280349.
Full text
Breast cancer is a common and deadly disease among women globally, and early detection is crucial to improving the prognosis of patients. In today's digital society, computers and complex algorithms can evaluate and diagnose diseases more efficiently and with greater certainty than experienced physicians. Several studies have been conducted to automate medical imaging techniques, using machine learning, to predict and detect breast cancer. In this report, the suitability of five different machine learning methods for classifying breast cancer as benign or malignant is evaluated and compared. Furthermore, we investigate how the methods' effectiveness, with respect to classification accuracy and execution time, is affected by the preprocessing method Principal Component Analysis and the ensemble method bootstrap aggregating. In theory, both preprocessing methods should favour certain machine learning methods and thus increase classification accuracy. The investigation is based on a well-known breast cancer dataset from Wisconsin, which is used to train the algorithms. The results are evaluated by applying statistical methods that take accuracy, sensitivity, and execution time into account, and the results are then compared across the different classifiers. The investigation showed that using neither Principal Component Analysis nor bootstrap aggregating resulted in any notable improvements in classification accuracy. However, the results showed that the Support Vector Machine classifiers with linear and RBF kernels performed best. Since the investigation was limited with respect to the number of datasets and the choice of evaluation methods with accompanying adjustments, it is uncertain whether the obtained results can be generalised to other datasets and populations.
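The PCA preprocessing step examined in this kind of study can be sketched with a plain SVD implementation. The data below are synthetic, not the Wisconsin dataset; the point is only that projecting onto the leading principal axes concentrates the variance in the first components.

```python
import numpy as np

def pca_transform(X, n_components):
    """Project X onto its first n_components principal axes,
    computed from the centred data via SVD."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

rng = np.random.default_rng(1)
# toy stand-in for a multi-feature dataset: 100 samples, 5 features,
# with most of the variance concentrated in the first feature
X = rng.normal(size=(100, 5)) * np.array([10.0, 1.0, 1.0, 1.0, 1.0])
Z = pca_transform(X, 2)
```

A classifier would then be trained on `Z` instead of `X`; whether that helps depends on how much of the discriminative signal lives in the high-variance directions, which is one reason PCA does not always improve accuracy.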
Fischer, Manfred M., Sucharita Gopal, Petra Staufer-Steinnocher, and Klaus Steinocher. "Evaluation of Neural Pattern Classifiers for a Remote Sensing Application." WU Vienna University of Economics and Business, 1995. http://epub.wu.ac.at/4184/1/WSG_DP_4695.pdf.
Full text
Series: Discussion Papers of the Institute for Economic Geography and GIScience
Alorf, Abdulaziz Abdullah. "Primary/Soft Biometrics: Performance Evaluation and Novel Real-Time Classifiers." Diss., Virginia Tech, 2020. http://hdl.handle.net/10919/96942.
Full textDoctor of Philosophy
The relevance of faces in our daily lives is indisputable. We learn to recognize faces as newborns, and faces play a major role in interpersonal communication. Faces probably represent the most accurate biometric trait in our daily interactions. It is therefore not surprising that so much effort from computer vision researchers has been invested in the analysis of faces. The automatic detection and analysis of faces within images has received much attention in recent years. The spectrum of computer vision research on face analysis includes, but is not limited to, face detection and facial attribute classification, which are the focus of this dissertation. The face is a primary biometric because it by itself reveals the subject's identity, while facial attributes (such as hair color and eye state) are soft biometrics because by themselves they do not reveal the subject's identity. Soft biometrics have many uses in the field of biometrics: (1) they can be utilized in a fusion framework to strengthen the performance of a primary biometric system; for example, fusing a face with voice accent information can boost face recognition performance. (2) They can also be used to create qualitative descriptions of a person, such as "an old bald male wearing a necktie and eyeglasses." Face detection and facial attribute classification are not easy problems because of many factors, such as image orientation, pose variation, clutter, facial expressions, occlusion, and illumination, among others. In this dissertation, we introduced novel techniques to classify more than 40 facial attributes in real time. Our techniques followed the general facial attribute classification pipeline, which begins by detecting a face and ends by classifying facial attributes. We also introduced a new facial attribute related to Middle Eastern headwear, along with its detector. The new facial attribute was fused with a face detector to improve detection performance.
In addition, we proposed a new method to evaluate the robustness of face detection, which is the first process in the facial attribute classification pipeline. Detecting the states of human facial attributes in real time is highly desired by many applications. For example, the real-time detection of a driver's eye state (open/closed) can prevent severe accidents. These systems are usually called driver drowsiness detection systems. For classifying 40 facial attributes, we proposed a real-time model that preprocesses faces by localizing facial landmarks to normalize the faces, and then crops them based on the intended attribute. The face was cropped only if the intended attribute was inside the face region. After that, 7 types of classical and deep features were extracted from the preprocessed faces. Lastly, these 7 feature sets were fused together to train three different classifiers. Our proposed model yielded 91.93% average accuracy, outperforming 7 state-of-the-art models. It also achieved state-of-the-art performance in classifying 14 out of 40 attributes. We also developed a real-time model that classifies the states of three human facial attributes: (1) eyes (open/closed), (2) mouth (open/closed), and (3) eyeglasses (present/absent). Our proposed method consisted of six main steps: (1) In the beginning, we detected the human face. (2) Then we extracted the facial landmarks. (3) Thereafter, we normalized the face, based on the eye location, to the full frontal view. (4) We then extracted the regions of interest (i.e., the regions of the mouth, left eye, right eye, and eyeglasses). (5) We extracted low-level features from each region and then described them. (6) Finally, we learned a binary classifier for each attribute to classify it using the extracted features.
Our developed model achieved 30 FPS with a CPU-only implementation, and our eye-state classifier achieved the top performance, while our mouth-state and glasses classifiers were tied as the top performers with deep learning classifiers. We also introduced a new facial attribute related to Middle Eastern headwear, along with its detector. After that, we fused it with a face detector to improve the detection performance. The traditional Middle Eastern headwear that men usually wear consists of two parts: (1) the shemagh or keffiyeh, which is a scarf that covers the head and usually has checkered and pure white patterns, and (2) the igal, which is a band or cord worn on top of the shemagh to hold it in place. The shemagh causes many unwanted effects on the face; for example, it usually occludes some parts of the face and adds dark shadows, especially near the eyes. These effects substantially degrade the performance of face detection. To improve the detection of people who wear the traditional Middle Eastern headwear, we developed a model that can be used as a head detector or combined with current face detectors to improve their performance. Our igal detector consists of two main steps: (1) learning a binary classifier to detect the igal and (2) refining the classifier by removing false positives. Due to the similarity in real-life applications, we compared the igal detector with state-of-the-art face detectors, where the igal detector significantly outperformed the face detectors while producing the fewest false positives. Face detection is the first process in any facial attribute classification pipeline. As a result, we reported a novel study that evaluates the robustness of current face detectors based on: (1) diffraction blur, (2) image scale, and (3) the IoU classification threshold. This study would enable users to pick a robust face detector for their intended applications.
Biometric systems that use face detection suffer from large performance fluctuations. For example, users of biometric surveillance systems that utilize face detection sometimes notice that state-of-the-art face detectors do not show good performance compared with outdated detectors. Although state-of-the-art face detectors are designed to work in the wild (i.e., with no need to retrain, revalidate, and retest), they still heavily depend on the datasets they were originally trained on. This condition in turn leads to variation in the detectors' performance when they are applied to a different dataset or environment. To overcome this problem, we developed a novel optics-based blur simulator that automatically introduces diffraction blur at different image scales/magnifications. We then evaluated different face detectors on the output images using different IoU thresholds. Users first choose their own values for these three settings and then run our model to identify the most efficient face detector under the selected settings. That means our proposed model would enable users of biometric systems to pick the most efficient face detector for their system setup. Our results showed that sometimes outdated face detectors outperform state-of-the-art ones under certain settings, and vice versa.
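The IoU classification threshold that this robustness study varies is computed per detection roughly as follows; the box coordinates and the 0.5 threshold below are illustrative values, not taken from the dissertation.

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)   # intersection corners
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    return inter / (area_a + area_b - inter)

# a detection counts as correct when its IoU with the ground-truth
# box exceeds the chosen threshold
det, gt = (10, 10, 50, 50), (20, 20, 60, 60)
match = iou(det, gt) >= 0.5
```

Raising the threshold makes the matching criterion stricter, so the same detector can look strong at IoU 0.3 and weak at IoU 0.7, which is why the study treats the threshold as an evaluation setting rather than a fixed constant.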
Ayhan, Tezer Bahar. "Damage evaluation of civil engineering structures under extreme loadings." PhD thesis, École normale supérieure de Cachan - ENS Cachan, 2013. http://tel.archives-ouvertes.fr/tel-00975488.
Full text
Zuzáková, Barbora. "Exchange market pressure: an evaluation using extreme value theory." Master's thesis, Vysoká škola ekonomická v Praze, 2013. http://www.nusl.cz/ntk/nusl-199589.
Full textBuolamwini, Joy Adowaa. "Gender shades : intersectional phenotypic and demographic evaluation of face datasets and gender classifiers." Thesis, Massachusetts Institute of Technology, 2017. http://hdl.handle.net/1721.1/114068.
Full text
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 103-116).
This thesis (1) characterizes the gender and skin type distribution of IJB-A, a government facial recognition benchmark, and Adience, a gender classification benchmark, (2) outlines an approach for capturing images with more diverse skin types, which is then applied to develop the Pilot Parliaments Benchmark (PPB), and (3) uses PPB to assess the classification accuracy of Adience, IBM, Microsoft, and Face++ gender classifiers with respect to gender, skin type, and the intersection of skin type and gender. The datasets evaluated are overwhelmingly lighter-skinned: 79.6% - 86.24%. IJB-A includes only 24.6% female and 4.4% darker female subjects, and features 59.4% lighter males. By construction, Adience achieves rough gender parity at 52.0% female but has only 13.76% darker skin. The Parliaments method for creating a more skin-type-balanced benchmark resulted in a dataset that is 44.39% female and 47% darker skin. An evaluation of four gender classifiers revealed that a significant gap exists when comparing the gender classification accuracies of females vs. males (9 - 20%) and darker skin vs. lighter skin (10 - 21%). Lighter males were in general the best classified group, and darker females were the worst classified group. 37% - 83% of classification errors resulted from the misclassification of darker females. Lighter males contributed the least to overall classification error (.4% - 3%). For the best-performing classifier, darker females were 32 times more likely to be misclassified than lighter males. To increase the accuracy of these systems, more phenotypically diverse datasets need to be developed. Benchmark performance metrics need to be disaggregated not just by gender or skin type but by the intersection of gender and skin type. At a minimum, human-focused computer vision models should report accuracy on four subgroups: darker females, lighter females, darker males, and lighter males.
The thesis concludes with a discussion of the implications of misclassification and the importance of building inclusive training sets and benchmarks.
by Joy Adowaa Buolamwini.
S.M.
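The subgroup-disaggregated accuracy reporting that the Gender Shades thesis calls for can be sketched as below; the records are hypothetical toy predictions, not PPB data.

```python
from collections import defaultdict

def subgroup_accuracy(records):
    """Accuracy disaggregated by (skin type, gender) subgroup.
    Each record is (skin, gender, predicted_label, actual_label)."""
    hits, totals = defaultdict(int), defaultdict(int)
    for skin, gender, pred, actual in records:
        key = (skin, gender)
        totals[key] += 1
        hits[key] += int(pred == actual)
    return {k: hits[k] / totals[k] for k in totals}

# hypothetical gender predictions for two PPB-style subgroups
records = [
    ("darker", "female", "male", "female"),
    ("darker", "female", "female", "female"),
    ("lighter", "male", "male", "male"),
    ("lighter", "male", "male", "male"),
]
acc = subgroup_accuracy(records)
```

An aggregate accuracy over these four records would read 75%, masking the fact that one subgroup sits at 50%; reporting per-subgroup numbers is exactly what exposes that gap.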
Pydipati, Rajesh. "Evaluation of classifiers for automatic disease detection in citrus leaves using machine vision." [Gainesville, Fla.] : University of Florida, 2004. http://purl.fcla.edu/fcla/etd/UFE0006991.
Full textLantz, Linnea. "Evaluation of the Robustness of Different Classifiers under Low- and High-Dimensional Settings." Thesis, Uppsala universitet, Statistiska institutionen, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-385554.
Full textYanko, William Andrew. "Experimental and numerical evaluation of concrete spalling during extreme thermal loading." [Gainesville, Fla.] : University of Florida, 2004. http://purl.fcla.edu/fcla/etd/UFE0006380.
Full textWilson, P. C. "Construction and evaluation of probabilistic classifiers to detect acute myocardial infarction from multiple cardiac markers." Thesis, Queen's University Belfast, 2004. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.411812.
Full textNyaupane, Narayan. "STATISTICAL EVALUATION OF HYDROLOGICAL EXTREMES ON STORMWATER SYSTEM." OpenSIUC, 2018. https://opensiuc.lib.siu.edu/theses/2300.
Full textWatkins, Bobby Gene II. "Materials selection and evaluation of Cu-W particulate composites for extreme electrical contacts." Diss., Georgia Institute of Technology, 2011. http://hdl.handle.net/1853/39494.
Full textPapšys, Kęstutis. "Methodology of development of cartographic information system for evaluation of risk of extreme events." Doctoral thesis, Lithuanian Academic Libraries Network (LABT), 2013. http://vddb.laba.lt/obj/LT-eLABa-0001:E.02~2013~D_20130220_160846-94374.
Full text
The dissertation describes a methodology for developing a cartographic information system for the evaluation of extreme events. Complex risk assessment systems existing around the world are analysed, and their shortcomings and advantages are highlighted. On the basis of this analysis, an original methodology for complex risk assessment based on multiple data sources is developed, and the information system designed by the author, which allows the threats and risks of extreme events to be assessed, is described. The methodology covers the development and deployment of the components of the cartographic information system. The types of data required for the system's operation and their collection are described, along with the principles for building up an extreme event database, and a model is created for calculating extreme event threats and combining several threats into one synthetic threat. The relationship between risk and threat and the risk assessment methodology are described. The dissertation also presents the design of the entire system, operating within the Lithuanian geographic information infrastructure and integrated into the Lithuanian spatial information portal. The system was tested with real, currently available spatial data sets in Lithuania. The experimental results are presented, showing areas of elevated geological and meteorological risk in Lithuania. The work concludes with methodological and practical conclusions on the applicability, reliability, and standards compliance of the methods and the system.
Kallio, Rebecca Mae. "Evaluation of Channel Evolution and Extreme Event Routing for Two-Stage Ditches in a Tri-State Region of the USA." The Ohio State University, 2010. http://rave.ohiolink.edu/etdc/view?acc_num=osu1275424336.
Full textKatsara, Maria-Alexandra [Verfasser], Michael [Gutachter] Nothnagel, Meaux Juliette [Gutachter] de, and Thomas [Gutachter] Wiehe. "Evaluation of a prior-incorporated statistical model and established classifiers for externally visible characteristics prediction / Maria-Alexandra Katsara ; Gutachter: Michael Nothnagel, Juliette de Meaux, Thomas Wiehe." Köln : Universitäts- und Stadtbibliothek Köln, 2021. http://d-nb.info/1237814405/34.
Full text
Burke, Susan Marie. "Striving for Credibility in the Face of Ambiguity: A Grounded Theory Study of Extreme Hardship Immigration Psychological Evaluations." Antioch University / OhioLINK, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=antioch1570121587640465.
Full textMugume, Seith Ncwanga. "Modelling and resilience-based evaluation of urban drainage and flood management systems for future cities." Thesis, University of Exeter, 2015. http://hdl.handle.net/10871/18870.
Full textKanbe, Yuichi. "Control of Alloy Composition and Evaluation of Macro Inclusions during Alloy Making." Doctoral thesis, KTH, Tillämpad processmetallurgi, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-27773.
Full textQC 20101222
Dogs, Carsten, and Timo Klimmer. "An Evaluation of the Usage of Agile Core Practices : How they are used in industry and what we can learn from their usage." Thesis, Blekinge Tekniska Högskola, Avdelningen för programvarusystem, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-4725.
Full textSimões, Ana Carolina Quirino. "Planejamento, gerenciamento e análise de dados de microarranjos de DNA para identificação de biomarcadores de diagnóstico e prognóstico de cânceres humanos." Universidade de São Paulo, 2009. http://www.teses.usp.br/teses/disponiveis/95/95131/tde-12092013-172649/.
Full textIn this PhD Thesis, we present our strategies to the development of a mathematical and computational environment aiming the analysis of large-scale microarray datasets. The analyses focused mainly on the identification of molecular markers for diagnosis and prognosis of human cancers. Here we show the results of several analyses implemented using this environment, which led to the development of a computational tool for automatic annotation of DNA microarray platforms and a tool for tracking the analysis within R environment. We also applied eXtreme Programming (XP) as a tool for planning and management of gene expression analyses projects. All data sets were obtained by our collaborators using two different microarray platforms. The first is enriched in non-coding human sequences, particularly intronic sequences. The second one represents exonic regions of human genes. Using the first platform, we evaluated gene expression profiles of prostate and kidney human tumors. Applying SAM to prostate tumor data revealed 49 potential molecular markers for prognosis of this disease. Gene expression in samples of sarcomas, epidermoid carcinomas and head and neck epidermoid carcinomas was investigated using the second platform. A set of 12 genes were identified as potential biomarkers for local aggressiveness and metastasis in sarcoma. In addition, the analyses of data obtained from head and neck epidermoid carcinomas allowed the identification of 7 potential biomarkers for lymph-nodal metastases.
Bernardini, Flávia Cristina. "Combinação de classificadores simbólicos utilizando medidas de regras de conhecimento e algoritmos genéticos." Universidade de São Paulo, 2006. http://www.teses.usp.br/teses/disponiveis/55/55134/tde-29092006-110806/.
Full textThe quality of hypotheses induced by most of the available supervised machine learning algorithms depends on the quantity and quality of the instances in the training set. However, several well known learning algorithms are not able to manipulate many instances making it difficult to induce good classifiers from large databases, as are needed in the Data Mining process. One approach to overcome this problem is to construct ensembles of classifiers. An ensemble is a set of classifiers whose decisions are combined in some way to classify new cases (instances). However, although ensembles improve learning algorithms power prediction, ensembles may use an undesired large set of classifiers. Furthermore, despite classifying new cases better than each individual classifier, ensembles are generally a sort of ?black-box? classifier, not being able to explain their classification decisions. To this end, in this work we propose an approach that uses symbolic learning algorithms to construct ensembles of symbolic classifiers that can explain their classification decisions so that the ensemble is as accurate as or more accurate than the individual classifiers. Furthermore, considering that symbolic learning algorithms use local search methods to induce classifiers while genetic algorithms use global search methods, we propose a second approach to learn symbolic concepts from large databases using genetic algorithms to evolve symbolic classifiers into only one symbolic classifier so that the evolved classifier is more accurate than the initial ones. Both proposals were implemented in two computational systems. Several experiments using different databases were conducted in order to evaluate both proposals. Results show that although both proposals are promising, the approach using genetic algorithms produces better results.
Olausson, Katrin. "On Evaluation and Modelling of Human Exposure to Vibration and Shock on Planing High-Speed Craft." Licentiate thesis, KTH, Marina system, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-159168.
Full textQC 20150126
Liu, Qiang. "Microstructure Evaluation and Wear-Resistant Properties of Ti-alloyed Hypereutectic High Chromium Cast Iron." Doctoral thesis, KTH, Tillämpad processmetallurgi, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-128532.
Full textQC 20130913
Michaelsson, Ludvig, and Sebastian Quiroga. "Design and evaluation of an adaptive dairy cow indoor positioning system : A study of the trade-off between position accuracy and energy consumption in mobile units with extreme battery life." Thesis, KTH, Maskinkonstruktion (Inst.), 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-190203.
Full text
With growing farm sizes, increasing workloads, pressure from society, and legislation mandating loose housing, health monitoring of farm animals is playing a larger role for farmers worldwide. One type of information that can be used to assess the health of dairy cows is positioning data. Since dairy cows spend a lot of time indoors, to be sheltered from the weather or to carry out other activities, GPS-based solutions are not suitable. In addition, the units the cows carry must have a long battery life to avoid frequent system maintenance. This master's thesis investigates potential system solutions for the indoor positioning of dairy cows in loose-housing barns. The chosen configuration is then optimized with respect to energy consumption. Thereafter, the relationship between dynamic energy consumption and localization accuracy is investigated; previous research has focused on one or the other. The development of systems with long battery life has likewise not been a priority. The proposed system uses proprietary radio technologies in the 433 MHz band to estimate the positions of the dairy cows. In addition, accelerometer data is used to adaptively adjust the estimation frequency in order to minimize energy consumption. After the optimization process, the proposed system has a battery life of at least two years, with an accuracy of roughly 7-8 m and a precision of 11-12 m, using only four anchor nodes in an experimental barn. The theorized correlation between localization accuracy and energy consumption could not be demonstrated. Keywords: indoor positioning, dairy cows, weighted non-linear least squares, energy consumption, agriculture, system design, optimization, localization accuracy, sub-GHz radio, battery life
Hmad, Ouadie. "Evaluation et optimisation des performances de fonctions pour la surveillance de turboréacteurs." Thesis, Troyes, 2013. http://www.theses.fr/2013TROY0029.
Full text
This thesis deals with monitoring systems for turbojet engines. The development of such systems requires a performance evaluation and optimization phase prior to their introduction into operation. The work focuses on this phase, and more specifically on the performance of the detection and prognostic functions of two systems. Performance metrics related to each of these functions, as well as their estimators, have been defined. The monitored systems are, on the one hand, the start sequence for the detection function and, on the other hand, the oil consumption for the prognostic function. The data used come from flights in operation without degradation, so simulations of degradation were necessary for the performance assessment. Optimization of detection performance was obtained by tuning a threshold on the decision statistic, taking into account the airlines' requirements in terms of good detection rate and false alarm rate. Two approaches have been considered and their performances compared in their best configurations. The prognostic performance for oil over-consumption, simulated using Gamma processes, has been assessed on the basis of the relevance of the maintenance decisions induced by the prognostic. This thesis has quantified and improved the performance of the two considered functions to meet the airlines' requirements; other possible improvements are proposed as prospects to conclude this thesis.
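The threshold-tuning idea, fixing an acceptable false-alarm rate on healthy data and reading off the detection rate on simulated degradations, can be sketched as below. The Gaussian score distributions and the 1% false-alarm budget are assumptions for illustration, not values from the thesis.

```python
import numpy as np

def tune_threshold(healthy_scores, degraded_scores, max_far=0.01):
    """Set the alarm threshold at the (1 - max_far) quantile of scores
    observed without degradation, then report the achieved false-alarm
    rate (FAR) and the detection rate (POD) on simulated degradations."""
    healthy = np.asarray(healthy_scores)
    thr = np.quantile(healthy, 1.0 - max_far)
    far = float(np.mean(healthy > thr))
    pod = float(np.mean(np.asarray(degraded_scores) > thr))
    return thr, far, pod

rng = np.random.default_rng(2)
healthy = rng.normal(0.0, 1.0, 10_000)    # decision statistic, no degradation
degraded = rng.normal(4.0, 1.0, 1_000)    # simulated degradations shift the score
thr, far, pod = tune_threshold(healthy, degraded, max_far=0.01)
```

Tightening `max_far` raises the threshold and trades detection rate for fewer false alarms, which is exactly the airline-requirement trade-off the abstract describes.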
Backman, Emil, and David Petersson. "Evaluation of methods for quantifying returns within the premium pension." Thesis, KTH, Matematisk statistik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-288499.
Full text
The Swedish Pensions Agency's current computation of the internal rate of return for 7.7 million pension savers is both time- and resource-consuming. This rate of return gives an overview of how well the funded part of the pension system is working. It is analysed internally but is also reported to the public every month, as well as annually, based on different selections of data. This thesis investigates the possibility of using other approaches to improve the performance of this type of computation. Furthermore, the study aims to verify the results derived from these computations and examine their stability. To investigate whether there are competing matrix methods, a selection of approaches is compared with the more classical numerical methods. The methods are compared in several different scenarios intended to reflect real-world practice. The stability of the results is analysed with stochastic modelling, where an error term is introduced to mimic possible errors that can arise in data handling. It is concluded that a combination of Halley's method and the Jacobi-Davidson algorithm is the most robust and best-performing method. The proposed method combines the speed of numerical methods with the reliability of matrix methods. The results show a performance improvement of 550% in time, while maintaining the same accuracy as seen in the existing server computations. The analysis of error propagation suggests that in 99 percent of cases the error is smaller than 0.12 percentage points in the case where the introduced error term is of large proportions. In this extreme case, the expected number of individuals with an error exceeding 1 percentage point is estimated to be 212 out of the whole population.
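Halley's method, one half of the combination the thesis recommends, can be sketched for the internal-rate-of-return root-finding problem as follows; the cashflow example and starting point are illustrative, not the agency's actual data.

```python
def npv_and_derivs(rate, cashflows):
    """NPV of (time_in_years, amount) cashflows, plus its first and
    second derivatives with respect to the rate."""
    f = f1 = f2 = 0.0
    for t, a in cashflows:
        d = (1.0 + rate) ** (-t)
        f += a * d
        f1 += -t * a * d / (1.0 + rate)
        f2 += t * (t + 1.0) * a * d / (1.0 + rate) ** 2
    return f, f1, f2

def halley_irr(cashflows, x0=0.05, tol=1e-12, max_iter=50):
    """Internal rate of return as the root of NPV(r) = 0 via Halley's
    method, which converges cubically near a simple root."""
    x = x0
    for _ in range(max_iter):
        f, f1, f2 = npv_and_derivs(x, cashflows)
        dx = 2.0 * f * f1 / (2.0 * f1 * f1 - f * f2)
        x -= dx
        if abs(dx) < tol:
            break
    return x

# deposit 100 at t=0, receive 121 two years later -> IRR is 10%
r = halley_irr([(0.0, -100.0), (2.0, 121.0)])
```

The second-derivative correction is what distinguishes Halley's update from Newton's; for smooth NPV curves it typically cuts the iteration count roughly in half, which matters when the root-finding is repeated millions of times.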
Souza, Crisla Serra. "Avaliação da produção de etanol em temperaturas elevadas por uma linhagem de S. cerevisiae." Universidade de São Paulo, 2009. http://www.teses.usp.br/teses/disponiveis/87/87131/tde-05082009-171501/.
Full textSurface response methodology was used to optimize the conditions to obtain higher ethanol production and viability for strain 63M of S. cerevisiae in batch culture, resulting in the conditions: 200 g.L-1 sucrose and 40 g.L-1 inoculum at 30 °C. Different types of processes were compared, and the process that presented the highest viability, productivity and yield was pulse fed-batch using five decreasing pulses of sucrose at 30 °C. Reducing the sucrose concentration was a strategy that allowed the temperature to be increased up to 37 °C without loss of viability. An industrial strain used in Brazilian distilleries was compared with strain 63M at high temperatures, and strain 63M achieved higher productivity and yield. Eight successive cycles of fermentation with reuse of cells of strain 63M were carried out in synthetic medium in a fed-batch process using sucrose pulses at 37 °C; a gradual loss of viability was observed, but the final ethanol concentration remained constant over the eight fermentation cycles.
Сальник, К. О. "Інформаційно-аналітична система адаптації навчального контенту до вимог ринку праці. Функціонування системи в режимі моніторингу." Master's thesis, Сумський державний університет, 2020. https://essuir.sumdu.edu.ua/handle/123456789/79560.
Full textFarvacque, Manon. "Evaluation quantitative du risque rocheux : de la formalisation à l'application sur les linéaires et les zones urbanisées ). How argest wildfire events in France? A Bayesian assessment based on extreme value theory ). Hows rockfall risk impacted by land-use and land-cover changes? Insights from the French Alps. Quantitative risk assessment in a rockfall-prone area: the case study of the Crolles municipality (Massif de la Chartreuse, French Alps)." Thesis, Université Grenoble Alpes, 2020. https://tel.archives-ouvertes.fr/tel-02860296.
Full textRockfalls are a common type of fast moving landslide, corresponding to the detachment of individual rocks and boulders of different sizes from a vertical or sub-vertical cliff, and to their travel down the slope by free falling, bouncing and/or rolling. Every year, in the Alpine environment, rockfalls reach urbanized areas, causing damage to structures and injuring people. Precise rockfall risk analysis has therefore become an essential tool for authorities and stakeholders in land-use planning. To this aim, quantitative risk assessment (QRA) procedures originally developed for landslides have been adapted to rockfall processes. In QRAs, rockfall risk for exposed elements is estimated by coupling the hazard, exposure and vulnerability components. However, in practice, the estimation of the different components of risk is challenging, and methods for quantifying risk in rockfall-prone regions remain scarce. Similarly, the few studies which have so far performed QRAs for rockfall assume stationarity, precluding reliable anticipation of the risk in a context where environmental and societal conditions are evolving rapidly and substantially. Moreover, rockfall risk remains, as for most natural hazards, defined as the loss expectation. This metric yields a single risk value, usually inconsistent with the short- and long-term constraints or trade-offs faced by decision-makers. On this basis, this PhD thesis therefore aims at (i) reinforcing the basis of QRA, (ii) assessing the effect of environmental changes on rockfall risk, and (iii) proposing methods for quantifying rockfall risk using risk measures other than the standard loss expectation. In that respect, we propose a QRA procedure where the rockfall risk is quantified by combining a rockfall simulation model with the physical vulnerability of potentially affected structures and a wide spectrum of rockfall volumes as well as release areas.
The practicability and interest of this procedure are illustrated on two real case studies, i.e. the municipality of Crolles, in the French Alps, and the Uspallata valley, in the central Andes mountains. Similarly, the effect of environmental changes on rockfall risk is considered by comparing rockfall risk values in different land-use and land-cover contexts. Last, we implement two quantile-based measures in our procedure, namely the value-at-risk and the expected shortfall, so as to assess rockfall risk for different risk-management horizon periods. All in all, this PhD thesis clearly demonstrates the added value of the QRA procedure in the field of rockfall, and reinforces its basis by implementing analytical, statistical or numerical models. The resulting panel of risk maps, also produced under non-stationary contexts, is of major interest for stakeholders in charge of risk management, and constitutes an appropriate basis for land-use planning and the prioritization of mitigation strategies.
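The two quantile-based measures mentioned in this abstract have simple empirical counterparts. A minimal sketch on a sample of simulated losses (the nearest-rank quantile convention is an assumption; other conventions interpolate):

```python
def var_es(losses, q=0.99):
    """Empirical value-at-risk and expected shortfall at level q.

    VaR is the empirical q-quantile of the loss sample; the expected
    shortfall is the mean of the losses at or beyond that quantile.
    """
    s = sorted(losses)
    k = min(int(q * len(s)), len(s) - 1)  # index of the q-quantile
    tail = s[k:]
    return s[k], sum(tail) / len(tail)
```

Because the expected shortfall averages the whole tail beyond the quantile, it always dominates the VaR at the same level, which is why the two measures support different risk-management horizons.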
Hamdi, Haykel. "Théorie des options et fonctions d'utilité : stratégies de couverture en présence des fluctuations non gaussiennes." Thesis, Paris 2, 2011. http://www.theses.fr/2011PA020006/document.
Full textThe traditional approach to derivatives involves, under certain clearly defined hypotheses, constructing hedging strategies with strictly zero risk. However, in the general case these perfect hedging strategies do not exist, and the theory must instead be based on the idea of risk minimization. In this case, the optimal hedging strategy depends on the amount of risk to be minimized. Within the options framework, we consider here a new measure of risk via the expected-utility approach, one that takes into account both the moment of order four, which is more sensitive to large fluctuations than the variance, and the option investor's aversion towards risk. Compared with delta hedging, variance optimization and optimization of the fourth-order moment, the hedging strategy obtained via the expected-utility approach reduces the sensitivity of the hedge to the underlying asset price. This is likely to reduce the associated transaction costs.
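The trade-off described here, penalizing the fourth moment alongside the variance, can be sketched with a toy grid search for a static hedge ratio (the risk functional, the penalty weight and the synthetic P&L below are illustrative assumptions, not the thesis's utility specification):

```python
def best_hedge_ratio(option_pnl, asset_pnl, lam=1.0, steps=400):
    """Grid-search the static hedge ratio phi minimizing
    Var(hedged P&L) + lam * fourth central moment of the hedged P&L."""
    def risk(phi):
        hedged = [o - phi * a for o, a in zip(option_pnl, asset_pnl)]
        m = sum(hedged) / len(hedged)
        var = sum((h - m) ** 2 for h in hedged) / len(hedged)
        m4 = sum((h - m) ** 4 for h in hedged) / len(hedged)
        return var + lam * m4
    grid = [4.0 * i / steps for i in range(steps + 1)]  # search phi in [0, 4]
    return min(grid, key=risk)
```

With `lam = 0` this reduces to plain variance-minimal hedging; a positive `lam` shifts the optimum towards ratios that also suppress tail fluctuations.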
Dočekal, Martin. "Porovnání klasifikačních metod." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2019. http://www.nusl.cz/ntk/nusl-403211.
Full textSimmons, Kenneth Rulon. "EXTREME HEAT EVENT RISK MAP CREATION USING A RULE-BASED CLASSIFICATION APPROACH." Thesis, 2012. http://hdl.handle.net/1805/2762.
Full textDuring a 2011 summer dominated by headlines about an earthquake and a hurricane along the East Coast, extreme heat that silently killed scores of Americans largely went unnoticed by the media and public. However, despite a violent spasm of tornadic activity that claimed over 500 lives during the spring of the same year, heat-related mortality annually ranks as the top cause of death incident to weather. Two major data groups used in researching vulnerability to extreme heat events (EHE) include socioeconomic indicators of risk and factors incident to urban living environments. Socioeconomic determinants such as household income levels, age, race, and others can be analyzed in a geographic information system (GIS) when formatted as vector data, while environmental factors such as land surface temperature are often measured via raster data retrieved from satellite sensors. The current research sought to combine the insights of both types of data in a comprehensive examination of heat susceptibility using knowledge-based classification. The use of knowledge classifiers is a non-parametric approach to research involving the creation of decision trees that seek to classify units of analysis by whether they meet specific rules defining the phenomenon being studied. In this extreme heat vulnerability study, data relevant to the deadly July 1995 heat wave in Chicago’s Cook County was incorporated into decision trees for 13 different experimental conditions. Populations vulnerable to heat were identified in five of the 13 conditions, with predominantly low-income African-American communities being particularly at-risk. Implications for the results of this study are given, along with direction for future research in the area of extreme heat event vulnerability.
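A rule-based (knowledge) classifier of the kind described can be sketched as a small decision function over tract attributes; the thresholds and field names below are illustrative assumptions, not the study's calibrated rules:

```python
def classify_tract(tract):
    """Toy decision rules flagging a census tract as heat-vulnerable.

    Expects a dict with illustrative keys: surface_temp_c (land surface
    temperature in Celsius), median_income (USD), pct_over_65 (0-100).
    """
    if tract["surface_temp_c"] < 35.0:
        return "not vulnerable"     # rule 1: no extreme heat exposure
    if tract["median_income"] < 30000:
        return "vulnerable"         # rule 2: exposed and low income
    if tract["pct_over_65"] > 15.0:
        return "vulnerable"         # rule 3: exposed and elderly
    return "not vulnerable"
```

Because each branch is an explicit rule rather than a fitted parameter, the resulting decision tree is non-parametric and directly auditable, which is the appeal of the knowledge-classifier approach.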
Tsao, Chin-Chen, and 曹晉誠. "An Evaluation of Service Quality and Leisure Benefit for the Extreme Sports Stadium." Thesis, 2016. http://ndltd.ncl.edu.tw/handle/85021493595839478149.
Full text亞洲大學
休閒與遊憩管理學系
104
Low utilization has become a problem of most extreme sports stadiums in Taiwan. In order to enhance the utilization of stadiums, government authorized external units or firms to operate some of extreme sports stadiums. It is critical to explore how the firm can enhance the utilization of the sports stadium for avoiding of becoming empty. If the firm can meet customers’ leisure demands by designing and offering leisure sports services with considering the characteristics of equipment and facilities in the stadium, it can pull people’s demands of extreme sports stadiums. This study attempts to offer the managerial strategies related to customer needs and service qualities to the firm for increasing the utilization of the sports stadiums. In order to understand whether the firm’s leisure services and service quality can create leisure benefits for customers and thus satisfy their demands, this study attempts to explore the relationship between the service quality and leisure benefit. Specifically, this study tries to understand the expected leisure benefits of various segments of customers (including various ages, genders, occupations etc.) before experiencing services. In addition, this study also investigates the relationship between the service quality and leisure benefit based on customers’ experiences for identifying important quality item. Based on the findings, the firm that operating the sports stadiums can design the customer-oriented leisure services for enhancing the utilization of the stadium.
Vila, Verde Francisca Viçoso. "Peer-to-peer lending: Evaluation of credit risk using Machine Learning." Master's thesis, 2021. http://hdl.handle.net/10362/127084.
Full textPeer-to-peer lenders have transformed the credit market by being an alternative to traditional financial services and taking advantage of the most advanced analytics techniques. Credit scoring and accurate assessment of borrower’s creditworthiness is crucial to managing credit risk and having the capacity of adapting to current market conditions. The Logistic Regression has long been recognised as the benchmark model for credit scoring, so this dissertation aims to evaluate and compare its capabilities to predict loan defaults with other parametric and non-parametric methods, to assess the improvement in predictive power between the most modern techniques and the traditional models in a peer-to-peer lending context. We compare the performance of four different algorithms, the single classifiers Decision Trees and K-Nearest Neighbours, and the ensemble classifiers Random Forest and XGBoost against a benchmark model, the Logistic Regression, using six performance evaluation measures. This dissertation also includes a review of related work, an explanation of the pre-processing involved, and a description of the models. The research reveals that both XGBoost and Random Forest outperform the benchmark’s predictive capacity and that the KNN and the Decision Tree models have weaker performance compared to the benchmark. Hence, it can be concluded that it still makes sense to use this benchmark model, however, the more modern techniques should also be taken into consideration.
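Several of the performance measures such a benchmark comparison relies on can be computed directly from the confusion matrix and a rank statistic. A self-contained sketch (the strict-inequality tie handling in the AUC is a simplifying assumption):

```python
def binary_metrics(y_true, y_prob, threshold=0.5):
    """Accuracy, precision, recall, F1 and AUC for a binary classifier."""
    y_pred = [1 if p >= threshold else 0 for p in y_prob]
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    # AUC via the Mann-Whitney rank statistic (ties ignored for brevity)
    pos = [p for t, p in zip(y_true, y_prob) if t == 1]
    neg = [p for t, p in zip(y_true, y_prob) if t == 0]
    auc = sum(pp > pn for pp in pos for pn in neg) / (len(pos) * len(neg))
    return {
        "accuracy": (tp + tn) / len(y_true),
        "precision": precision,
        "recall": recall,
        "f1": 2 * precision * recall / (precision + recall) if precision + recall else 0.0,
        "auc": auc,
    }
```

Threshold-free measures like AUC matter in credit scoring because default data are heavily imbalanced, so accuracy alone rewards the trivial "no default" classifier.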
Hong, Yu-Ting, and 洪郁婷. "Evaluation on the Influence Zone of rainfall induced Debris Flow under Extreme Climate Condition." Thesis, 2010. http://ndltd.ncl.edu.tw/handle/29674439554412281298.
Full text國立臺灣海洋大學
河海工程學系
98
Taiwan is located at the center of Circumpacific belt. Many faults and the fracture of geological conditions are need considering due to the Philippine Sea and the Eurasia plate action. When the typhoon season coming during June to Oct., the rainfall intensity is plentiful to concentrate and could be caused the different slope hazards. Practically, the hazard of debris flow will cause large range damages and impact at short duration. In recent years, due to climate changing, the frequency of extreme climate condition is double. To order to find out the relationship between extreme climate induce rainfall intensity and influence zone of debris flow, the program of FLO-2D is adopted to simulate the two-dimensional floods, and the influence zone and accumulation depth of debris flows. The study areas are five potential debris flow torrent, which located at Chen-Yu-Lan River in Nantou County. The rainfall data of Typhoon Mindulle, Toraji and Morakot are collected. All of these typhoons caused many times of debris flow and serious disaster. From the analysis of the properties of rainfall, we can indicate the main reason of Toraji typhoon induce-damage is highly rainfall intensity large than 100mm per hour. And Mindulle typhoon and Morakot typhoon induce damage are caused by huge accumulated rainfall. From the results with rainfall properties and watershed area, it could find the influence area and accumulation depth of debris flow are much correlated with rainfall intensity and watershed area. It also can find that the rainfall intensity effect is indistinct to a large watershed area.
Nai-YuYang and 楊乃玉. "An Evaluation Study for the Impact of Discretization Methods on the Performance of Naive Bayesian Classifiers with Prior Distributions." Thesis, 2010. http://ndltd.ncl.edu.tw/handle/40262687553999457567.
Full text國立成功大學
工業與資訊管理學系專班
98
Na?ve Bayesian classifiers are widely employed for classification tasks, because of their computational efficiency and competitive accuracy. Discretization is a major approach for processing continuous attributes for na?ve Bayesian classifiers. In addition, the prior distributions of attributes in the na?ve Bayesian classifier are implicitly or explicitly assumed to follow either Dirichlet or generalized Dirichlet distributions. Previous studies have found that discretization methods for continuous attributes do not have significant impact on the performance of the na?ve Bayesian classifier with noninformative Dirichlet priors. Since generalized Dirichlet distribution is a more appropriate prior for the na?ve Bayesian classifier, the purpose of this thesis is to investigate the impact of four well-known discretization methods, equal width, equal frequency, proportional, and minimization entropy, on the performance of na?ve Bayesian classifiers with either noninformative Dirichlet or noninformative generalized Dirichlet priors. The experimental results on 23 data sets demonstrate that the equal width, the equal frequency, and the proportional discretization methods can achieve a higher classification accuracy when priors follow generalize Dirichlet distributions. However, generalized Dirichlet and Dirichlet priors have similar performance for the minimization entropy discretization method. The experimental results suggest that noninformative generalized Dirichlet priors can be employed for the minimization entropy discretization method only when neither the number of classes nor the number of intervals is small.
Chih, Chen Ping, and 陳秉志. "The Evaluation of Value at Risk for Real Estate Investment Trusts with Extreme Value Model." Thesis, 2008. http://ndltd.ncl.edu.tw/handle/88380207208394416745.
Full text真理大學
財經研究所
96
Based on 6 domestic Real Estate Investment Trusts(Hence REITs), we furthermore compares the performance of Value-at-Risk and Extreme Value Model. We consider different volatilities model in Variance-Covariance Method. On the other side, we also apply GEV and GDP Extreme Value Models for estimating Value-at-Risk under the confidence level of 99%, 97.5% and 95% respectively. Then, we check the performances of all VaR models through Back testing. The empirical results show that Extreme Value Model is a substantial improvement for Variance-Covariance Method and the GDP’s performance is better than GEV.
Kao, Juo-Han, and 高若涵. "The Evaluation of Value at Risk for Automobile Physical Damage Insurance with Extreme Value Model." Thesis, 2007. http://ndltd.ncl.edu.tw/handle/06094839656768259872.
Full text真理大學
財經研究所
95
This research apply the extreme value model to evaluate the VaR of automobile physical damage insurance and hope to find out the distribution in automobile physical damage insurance and estimate its VaR through the extreme value theory. We first check the VaR of the automobile physical damage insurance via P-P Plot and Q-Q Plot, we compare normal, log- normal, index and student t distribution to discover the amount of losing money and probability above were unable to mix rightly. So, it can be work to put this model into practice of estimating VaR. We further to use the GEV model and GPD model to assess the VaR for automobile physical damage insurance. Empirical results reveal that the two models are quite close in some cases. In last, we use the backtesting to examine the performance of the two models, we also obtain different outcomes under different significant levels.
Huang, Zhao-Kai, and 黃照凱. "Extreme Learning Machine for Automatic License Plate Recognition With Imbalance Data Set and Performance Evaluation." Thesis, 2019. http://ndltd.ncl.edu.tw/cgi-bin/gs32/gsweb.cgi/login?o=dnclcdr&s=id=%22107NCHU5441070%22.&searchmode=basic.
Full text國立中興大學
電機工程學系所
107
This paper uses Extreme Learning Machines for license plate recognition. It also uses edge statistics, template comparison, Radial Basis Function, and Support Vector Machine (SVM) for comparison. The Extreme Learning Machine (ELM) only needs to propose image features for the neural network, and input the image features into the neural network for training, and obtain the output results, which can effectively improve the operation speed. In order to solve a large amount of training time, this study uses principal component analysis (PCA) to reduce the training time, so that the training data will be reduced from 288 dimensions to 192 dimensions, and deleting unnecessary feature vectors will greatly reduce the training time Using the Extreme Learning Machine (ELM), the calculation time is about 0.2 seconds, which is obviously superior to other methods in the recognition speed. The identification rate in the license plate recognition (EI-ELM) is up to 84.12%. In the experimental results, the confusion matrix is used. Compared with the performance evaluation, the ELM performance indicators are more advantageous than the ELM.
Nieh, Chih-Chien, and 聶至謙. "Evaluation of accuracy of the Anisotropic Analytical Algorithm (AAA) under extreme inhomogeneities using Monte Carlo (MC) simulations." Thesis, 2009. http://ndltd.ncl.edu.tw/handle/24944445854171147607.
Full textCope, Julia Lee. "Evaluation of Microbial Communities from Extreme Environments as Inocula in a Carboxylate Platform for Biofuel Production from Cellulosic Biomass." Thesis, 2013. http://hdl.handle.net/1969.1/151350.
Full textNewton, Brandi Wreatha. "An evaluation of winter hydroclimatic variables conducive to snowmelt and the generation of extreme hydrologic events in western Canada." Thesis, 2018. https://dspace.library.uvic.ca//handle/1828/9965.
Full textGraduate
"Evaluation of Flood Mitigation Strategies for the Santa Catarina Watershed using a Multi-model Approach." Master's thesis, 2016. http://hdl.handle.net/2286/R.I.38363.
Full textDissertation/Thesis
Masters Thesis Civil and Environmental Engineering 2016
"Athletic Surfaces’ Influence on the Thermal Environment: An Evaluation of Wet Bulb Globe Temperature in the Phoenix Metropolitan Area." Master's thesis, 2020. http://hdl.handle.net/2286/R.I.57303.
Full textDissertation/Thesis
Masters Thesis Geography 2020
Beerval, Ravichandra Kavya Urs. "Spatiotemporal analysis of extreme heat events in Indianapolis and Philadelphia for the years 2010 and 2011." Thesis, 2014. http://hdl.handle.net/1805/4083.
Full textOver the past two decades, northern parts of the United States have experienced extreme heat conditions. Some of the notable heat wave impacts occurred in Chicago in 1995, with over 600 reported deaths, and in Philadelphia in 1993, with over 180 reported deaths. The distribution of extreme heat events in Indianapolis has varied since the year 2000. The urban heat island effect has caused temperatures to rise unusually high during the summer months. Although the number of reported deaths in Indianapolis is smaller than in Chicago and Philadelphia, the heat wave in 2010 primarily affected the vulnerable population comprised of the elderly and lower socio-economic groups. Studying the spatial distribution of high temperatures in the vulnerable areas helps determine not only the extent of the heat-affected areas, but also how to devise strategies and methods to plan for, mitigate, and tackle extreme heat. In addition, examining spatial patterns of vulnerability can aid the development of a heat warning system to alert the populations at risk during extreme heat events. This study focuses on the qualitative and quantitative methods used to measure extreme heat events. Land surface temperatures obtained from Landsat TM images provide a useful means by which the spatial distribution of temperatures can be studied in relation to temporal changes and socioeconomic vulnerability. The percentile method used helps to determine the vulnerable areas and their extents. The maximum temperatures measured using LST conversion of the original digital-number values of the Landsat TM images are reliable in terms of identifying the heat-affected regions.
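The percentile thresholding described can be sketched over a flattened temperature raster (the 90th-percentile cutoff and the nearest-rank quantile convention are illustrative assumptions):

```python
def hotspot_mask(temps, pct=90.0):
    """Flag cells at or above the pct-th percentile of surface temperature.

    temps is a flat list of per-pixel land surface temperatures; the
    returned mask marks the candidate heat-vulnerable cells.
    """
    s = sorted(temps)
    k = min(len(s) - 1, int(round(pct / 100.0 * (len(s) - 1))))
    return [t >= s[k] for t in temps]
```

Overlaying such a mask with socioeconomic layers in a GIS is what turns a raw LST raster into the vulnerability extents the study maps.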
Ojumu, Adefolake Mayokun. "Transport of nitrogen oxides and nitric acid pollutants over South Africa and air pollution in Cape Town." Diss., 2013. http://hdl.handle.net/10500/11911.
Full textEnvironmental Sciences
M. Sc. (Environmental Science)