Theses on the topic « Weighting methods »
Consult the top 50 theses for your research on the topic « Weighting methods ».
Lu, Ling, and Bofeng Li. « Combining Different Feature Weighting Methods for Case Based Reasoning ». Thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-26603.
Choo, Wei-Chong. « Volatility forecasting with exponential weighting, smooth transition and robust methods ». Thesis, University of Oxford, 2008. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.489421.
Ferreira, Junior Valnir. « Improvements to Clause Weighting Local Search for Propositional Satisfiability ». Griffith University. Institute for Integrated and Intelligent Systems, 2007. http://www4.gu.edu.au:8080/adt-root/public/adt-QGU20070823.123257.
Ferreira, Junior Valnir. « Improvements to Clause Weighting Local Search for Propositional Satisfiability ». Thesis, Griffith University, 2007. http://hdl.handle.net/10072/365857.
Thesis (PhD Doctorate)
Doctor of Philosophy (PhD)
Institute for Integrated and Intelligent Systems
Hawley, Kevin J. « A comparative analysis of areal interpolation methods ». Connect to resource, 2005. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=osu1139949635.
Chen, Ziyue. « Generalizing Results from Randomized Trials to Target Population via Weighting Methods Using Propensity Score ». The Ohio State University, 2017. http://rave.ohiolink.edu/etdc/view?acc_num=osu1503007759352248.
Kaatz, Ewelina. « Development of benchmarks and weighting systems for building environmental assessment methods : opportunities of a participatory approach ». Master's thesis, University of Cape Town, 2001. http://hdl.handle.net/11427/4767.
Sustainable construction is a term that emerged with the introduction of the concept of sustainable development in construction. Therefore, sustainable construction embraces socio-economic, cultural, biophysical, technical and process-orientated aspects of construction practice and activities. The progress towards sustainability in construction may be assessed by the implementation of good practice in building developments. Building environmental assessment methods are therefore valuable tools for indicating such progress as well as promoting sustainable approaches in construction. An effective building environmental assessment method requires the definition of explicit benchmarks and weightings. These should take into account the environmental, social and economic contexts of building developments. As the existing building environmental assessment methods largely ignore socio-economic impacts of building developments, the implementation of a participatory approach in the development of benchmarks and weighting systems could greatly contribute to a more meaningful incorporation of social and economic aspects into the assessment process. Furthermore, the participation of stakeholders in establishing qualitative benchmarks and weights should increase the credibility of such a process. The participatory approach could allow for education of all stakeholders about the potential environmental, social and economic consequences of their decisions and actions, which is vital for achieving their commitment to strive towards sustainable construction.
Varma, Krishnaraj M. « Fast Split Arithmetic Encoder Architectures and Perceptual Coding Methods for Enhanced JPEG2000 Performance ». Diss., Virginia Tech, 2006. http://hdl.handle.net/10919/26519.
Ph. D.
Wong, Mark. « Comparison of heat maps showing residence price generated using interpolation methods ». Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-214110.
This report attempts to provide insights into how interpolation can be used to create heat maps of residence prices for different housing markets in Sweden. More specifically, three interpolation methods are implemented and then applied to three different Swedish housing markets, which differ in character with respect to size and type of housing. Housing sales data and the physical definitions of the housing markets were collected. Since housing sales are never identical, the sales are first processed in order to make them comparable. An external indicator, an extra parameter for the interpolation methods, was also examined; in this report, the distance to the nearest public transport was used as the external indicator. The interpolated heat maps were compared and evaluated using both a quantitative and a qualitative method. The results show that each interpolation method has its strengths and weaknesses, and that using an external indicator always produced a better heat map than using the residence price as the only indicator. Kriging is judged to be the most robust interpolation method and also interpolated the best heat maps for all housing markets, while at the same time being the most time-consuming method.
Schmidl, Ricarda. « Empirical essays on job search behavior, active labor market policies, and propensity score balancing methods ». PhD thesis, Universität Potsdam, 2014. http://opus.kobv.de/ubp/volltexte/2014/7114/.
Chapter 1 of the dissertation analyses the role of social networks as a determinant of the search behaviour of the unemployed. Based on the hypothesis that the unemployed generate information about job openings through their social network, individuals with large social networks should experience a higher productivity of their informal search and reduce their search through formal channels. Due to the higher search productivity, the reservation wage of these individuals should also rise. The theoretical predictions are tested empirically, with network information approximated by the number of good friends and the frequency of contact with former colleagues. The results show that the search behaviour of the unemployed is significantly influenced by the presence of social contacts. In particular, formal job search decreases with network size; the substitution is especially pronounced for passive formal search methods, i.e. information sources that generate rather unspecific types of job offers at low relative cost. In line with the predictions of the theoretical model, an increase in network size also has clearly positive effects on the reservation wage. Chapter 2 deals with the labour market effects of vacancy referrals (VI) in the early activation phase of unemployment. The use of VI could promise a "double dividend": on the one hand, early activation reduces the duration of unemployment and thus the need for later participation in active labour market programmes (ALMP); on the other hand, activation through information is associated with smaller "locking-in" effects than participation in ALMP. The aim of the analysis is to measure the effects of early VI on the speed of re-employment and on the probability of participating in ALMP; possible effects on the quality of employment are also examined.
The results show that VI significantly increase the probability of employment and, at the same time, significantly reduce the probability of participating in ALMP. For most of the subgroups considered, the long-term reduction in ALMP participation results from the faster re-employment. For some labour market groups there is also an early and temporary reduction, which indicates that measures with high and low "locking-in" effects are treated as interchangeable by caseworkers, which is questionable from an efficiency perspective. A small negative effect on weekly working hours in the first dependent employment after unemployment is observed. Chapter 3 estimates the long-term effects of ALMP for unemployed youths under 25. The ALMP examined are job-creation schemes, wage subsidies, short- and long-term vocational training measures, and measures promoting participation in vocational education. From entry into the measure, participants and non-participants are observed over a period of six years. The outcome variables are the probability of regular employment and participation in education. The results show that all programmes except job-creation schemes have positive and long-lasting effects on the employment probability of youths. In the short run, however, only short training measures have positive effects, since long training measures and wage subsidies are associated with significant "locking-in" effects. Measures promoting vocational education increase the probability of participating in education, while all other programmes have no or a negative effect on education participation. Youths with a higher level of education benefit more from programme participation.
However, longer-term wage subsidies also show strong positive effects for youths with little prior education. The relative benefit of training measures is higher in West than in East Germany. The evaluation studies in Chapters 2 and 3 use the semi-parametric weighting methods propensity score matching (PSM) and inverse probability weighting (IPW) to remove the influence of confounding factors that affect both programme participation and the outcome variables, and thus to identify causal effects of programme participation. While PSM and IPW are intuitively and methodologically very attractive, implementing them in practice is often a major challenge. The aim of Chapter 4 is therefore to provide practical guidance on the implementation of these methods. To this end, recent findings from the empirical and statistical literature are summarised and practice-oriented guidelines for applied research are derived. Based on a theoretical motivation and an outline of the practical implementation steps of PSM and IPW, these steps are presented chronologically, also drawing on practically relevant findings from methodological research. Subsequently, the topics of effect estimation, inference, sensitivity analysis and the combination of IPW and PSM with other statistical methods are discussed. Finally, new extensions of the methodology are listed.
Diop, Serigne Arona. « Comparing inverse probability of treatment weighting methods and optimal nonbipartite matching for estimating the causal effect of a multicategorical treatment ». Master's thesis, Université Laval, 2019. http://hdl.handle.net/20.500.11794/34507.
Covariate imbalances between treatment groups are often present in observational studies and can bias comparisons between treatments. This bias can be corrected using weighting or matching methods. These correction methods have rarely been compared in a setting with a multicategorical treatment (>2 categories). We conducted a simulation study to compare an optimal nonbipartite matching method, inverse probability of treatment weighting, and a modified, matching-like weighting (matching weights). These comparisons were carried out in a Monte Carlo simulation framework using an exposure variable with 3 groups. A simulation study using real data (plasmode) was also conducted, in which the treatment variable had 5 categories. Among all the methods compared, matching weights appears to be the most robust according to the mean squared error criterion. The results also show that inverse probability of treatment weighting can sometimes be improved by truncation. Moreover, the performance of weighting depends on the degree of overlap between the treatment groups. The performance of optimal nonbipartite matching, for its part, depends strongly on the maximum distance allowed for a pair to be formed (the caliper). However, choosing the optimal caliper is not easy and remains an open question. Furthermore, the results obtained with the plasmode simulation were positive, insofar as a substantial reduction in bias was observed. All methods were able to significantly reduce confounding bias.
Before using inverse probability of treatment weighting, it is recommended to check for violation of the positivity assumption and for the existence of overlap regions between the treatment groups.
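The inverse probability of treatment weighting idea compared in this thesis can be illustrated with a small simulation. The sketch below is illustrative only (not the author's code; the data-generating process and all parameters are invented): each subject is weighted by 1/P(T = t | x), so the three treatment groups become comparable on the confounder x.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated observational data: one confounder x drives both the
# 3-category treatment t and the outcome y, so raw group means are biased.
n = 20000
x = rng.normal(size=n)
logits = np.column_stack([np.zeros(n), 0.8 * x, -0.8 * x])
ps = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
u = rng.random(n)
# Draw t ~ Categorical(ps); the clip guards against float round-off.
t = np.minimum((u[:, None] > ps.cumsum(axis=1)).sum(axis=1), 2)
y = 1.0 * (t == 1) + 2.0 * (t == 2) + 1.5 * x + rng.normal(size=n)

# Inverse probability of treatment weights: w_i = 1 / P(T = t_i | x_i).
# The true propensities are used here for clarity; in practice they would
# be estimated, e.g. with a multinomial logistic regression.
w = 1.0 / ps[np.arange(n), t]

naive = [y[t == k].mean() for k in range(3)]                        # confounded
ipw = [np.average(y[t == k], weights=w[t == k]) for k in range(3)]  # corrected
```

With the weights applied, the weighted group means recover the true marginal outcomes (0, 1 and 2 here), while the raw group means remain confounded by x. Positivity and overlap between groups should be checked before weighting, since near-zero propensities produce extreme weights.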
Damesa, Tigist Mideksa [Verfasser], and Hans-Peter [Akademischer Betreuer] Piepho. « Weighting methods for variance heterogeneity in phenotypic and genomic data analysis for crop breeding / Tigist Mideksa Damesa ; Betreuer : Hans-Peter Piepho ». Hohenheim : Kommunikations-, Informations- und Medienzentrum der Universität Hohenheim, 2019. http://d-nb.info/1199440035/34.
Enoch, John. « Application of Decision Analytic Methods to Cloud Adoption Decisions ». Thesis, Högskolan i Gävle, Avdelningen för Industriell utveckling, IT och Samhällsbyggnad, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:hig:diva-25560.
Texte intégralMay, Michael. « Data analytics and methods for improved feature selection and matching ». Thesis, University of Manchester, 2012. https://www.research.manchester.ac.uk/portal/en/theses/data-analytics-and-methods-for-improved-feature-selection-and-matching(965ded10-e3a0-4ed5-8145-2af7a8b5e35d).html.
Johansson, Sven. « Active Control of Propeller-Induced Noise in Aircraft : Algorithms & Methods ». Doctoral thesis, Karlskrona, Ronneby : Blekinge Institute of Technology, 2000. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-00171.
Noise in our everyday environment can have a negative impact on our health. Low-frequency noise is common in many settings, for example in cars, boats and aircraft. Low-frequency noise is usually not harmful to hearing, but it can be tiring and make conversation difficult for people in an exposed environment. Reducing the noise level improves speech intelligibility and increases comfort. Attenuating low-frequency noise with traditional passive methods, such as absorbers and reflectors, is usually inefficient: large, bulky absorbers are needed to attenuate this type of noise, and heavy partitions are required to prevent the noise from being transmitted from one space to another. Active methods are better suited to attenuating low-frequency noise. They are based on the principle that a wave superimposed on another wave in antiphase cancels it out: noise reduction is obtained by generating a sound field that is as strong as the noise but in antiphase with it. Active noise control provides efficient attenuation of low-frequency noise while leaving the volume of, say, a car interior or a boat or aircraft cabin essentially unaffected. In addition, the weight of the vehicle can be reduced, which benefits fuel consumption. In most applications the character of the noise, i.e. its level and frequency content, varies. To track these variations, an adaptive (self-tuning) control system is required to steer the generation of the anti-noise. In propeller aircraft, the dominant frequencies of the cabin noise are related to the rotational speed of the propellers, so the frequencies to be attenuated are known. A tachometer signal is used to generate signals, so-called reference signals, at the frequencies to be attenuated. These are processed by a control system that drives the loudspeakers, which in turn generate the anti-noise.
To tune the loudspeaker signals so that efficient attenuation is obtained, microphones placed in the cabin measure the noise. Efficient noise attenuation in a room, for example an aircraft cabin, requires several loudspeakers and microphones, and hence an advanced control system. The doctoral thesis "Active Control of Propeller-Induced Noise in Aircraft" treats various methods for reducing propeller-induced cabin noise. Different controller structures are presented, together with algorithms for tuning the system. For large systems with many loudspeakers and microphones, and several frequencies to be attenuated, it is important that generating the anti-noise does not demand excessive computational capacity; the methods treated provide efficient attenuation at low computational cost. Parts of the material presented in the thesis were produced within an EU project on noise suppression in propeller aircraft, in which several European aircraft manufacturers participated. The thesis also treats active noise control in headsets used by helicopter pilots, where active noise control has been applied to improve speech intelligibility.
Pagliarani, Andrea. « New Markov chain based methods for single and cross-domain sentiment classification ». Master's thesis, Alma Mater Studiorum - Università di Bologna, 2015. http://amslaurea.unibo.it/8445/.
Kiichenko, Vladyslav Yuriiovych. « Use of surface interpolation methods for determination of dioxide hydroxide in the air of the city of Kyiv ». Thesis, National Aviation University, 2021. https://er.nau.edu.ua/handle/NAU/50612.
In geographic information systems, interpolation of surfaces by various methods is often used. Topics in this area are relevant today and promising for further study and practical research in the field of geoinformation using GIS technologies. The purpose of interpolation in GIS is to fill in the gaps between known measurement points and thus simulate a continuous distribution of a property (attribute). Interpolation is based on the assumption that spatially distributed objects are correlated in space, that is, adjacent objects have similar characteristics. Spatial interpolation of point data is based on the choice of analytical surface model.
Pehrson, Ida. « Integrating planetary boundaries into the life cycle assessment of electric vehicles : A case study on prioritising impact categories through environmental benchmarking in normalisation and weighting methods when assessing electric heavy-duty vehicles ». Thesis, KTH, Hållbar utveckling, miljövetenskap och teknik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-281862.
The transport sector faces major challenges in achieving development within the planetary boundaries. To date, LCA studies of heavy- and medium-duty vehicles have focused on the 'well-to-wheel' stages, i.e. fuel production (well-to-tank) and the consequences of vehicle use (tank-to-wheel), and on the climate change impact category. To understand a vehicle's total environmental impact, a holistic perspective covering several sustainability dimensions is needed. With new vehicle technologies such as battery electric vehicles, the environmental impact will possibly arise mainly in the production and end-of-life phases of the life cycle, so it is important to analyse a vehicle from cradle to grave. This thesis has analysed Scania's LCA results through normalisation and weighting, using normalisation and weighting methods based on the planetary boundaries and other thresholds for the Earth's carrying capacity. The normalised results show that for a diesel truck, climate change is a significant impact category, whereas for a BEV (battery electric vehicle) charged with EU electricity, freshwater ecotoxicity, stratospheric ozone formation and climate change are the most significant impact categories. The normalised results for a BEV charged with wind energy show freshwater ecotoxicity and climate change as the most significant impact categories. According to the chosen weighting method, climate change and fossil resource depletion are the most important impact categories for a diesel truck. For a BEV with the EU electricity mix, the most important are climate change and fossil resource depletion, followed by mineral resource scarcity. For a BEV charged with wind power, the most important impact categories are mineral resource scarcity, climate change and fossil resource depletion. The weighted results also show that the impact categories human toxicity (cancer), freshwater ecotoxicity, particulate matter and water resource scarcity should be taken into account in an LCA of a BEV.
Finally, more research is needed on linking the planetary boundaries to the LCA framework, and normalisation references and weighting factors based on company- and sector-level emission allowances need to be developed for a company to understand the absolute environmental impact of its products.
Sjöwall, Fredrik. « Alternative Methods for Value-at-Risk Estimation : A Study from a Regulatory Perspective Focused on the Swedish Market ». Thesis, KTH, Industriell ekonomi och organisation (Inst.), 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-146217.
The importance of sound financial risk management has become increasingly emphasised in recent years, particularly with the financial crisis of 2007-08. The Basel Committee sets international standards and regulations for banks and financial institutions, and for market risk in particular it prescribes the internal application of the Value-at-Risk measure. However, the most established non-parametric Value-at-Risk model, historical simulation, has been criticised for some of its unrealistic assumptions. This thesis investigates alternative approaches to estimating non-parametric Value-at-Risk by examining and comparing the performance of three counteracting weighting methods for historical simulation: an exponentially decaying time-weighting technique, a volatility-updating method, and a more general weighting approach that allows the central moments of a return distribution to be specified. The models are evaluated on real financial data from a performance-based perspective, in terms of accuracy and capital efficiency, but also with respect to their suitability under existing regulation, with a particular focus on the Swedish market. The empirical study shows that the performance of historical simulation improves considerably, from both performance perspectives, with the introduction of a weighting method. Moreover, the results mainly indicate that the volatility-updating model with a 500-day observation window is the most useful weighting method in all respects considered. The conclusions of this thesis contribute substantially both to existing research on Value-at-Risk and to the quality of banks' and financial institutions' internal market risk management.
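The exponentially decaying time-weighting technique for historical simulation can be sketched in a few lines. This is a minimal illustration under invented parameters, not the thesis's implementation: each historical return receives a weight proportional to lam^age, and VaR is read off the weighted empirical loss distribution.

```python
import numpy as np

def var_hist_sim(returns, alpha=0.99, lam=None):
    """Historical-simulation VaR; if lam is set, observations are
    age-weighted with exponentially decaying weights (newest first)."""
    r = np.asarray(returns, dtype=float)
    n = len(r)
    if lam is None:
        w = np.full(n, 1.0 / n)                    # plain historical simulation
    else:
        age = np.arange(n)[::-1]                   # 0 = most recent observation
        w = lam ** age * (1 - lam) / (1 - lam**n)  # weights sum to 1
    order = np.argsort(r)                          # sort returns: worst first
    cum = np.cumsum(w[order])
    # VaR = magnitude of the loss where cumulative weight reaches 1 - alpha
    return -r[order][np.searchsorted(cum, 1 - alpha)]
```

On i.i.d. standard-normal returns both variants should land near the theoretical 99% quantile of about 2.33; on real data the time-weighted version reacts faster to recent volatility because old observations are discounted.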
Al-Nashashibi, May Y. A. « Arabic Language Processing for Text Classification. Contributions to Arabic Root Extraction Techniques, Building An Arabic Corpus, and to Arabic Text Classification Techniques ». Thesis, University of Bradford, 2012. http://hdl.handle.net/10454/6326.
Petra University, Amman (Jordan)
Al-Nashashibi, May Yacoub Adib. « Arabic language processing for text classification : contributions to Arabic root extraction techniques, building an Arabic corpus, and to Arabic text classification techniques ». Thesis, University of Bradford, 2012. http://hdl.handle.net/10454/6326.
Texte intégralCalandruccio, Lauren. « Spectral weighting strategies for sentences measured by a correlational method ». Related electronic resource:, 2007. http://proquest.umi.com/pqdweb?did=1342726281&sid=1&Fmt=2&clientId=3739&RQT=309&VName=PQD.
Jeon, Byung Ho. « Proposed automobile steering wheel test method for vibration ». Thesis, Brunel University, 2010. http://bura.brunel.ac.uk/handle/2438/4623.
Asgeirsson, David J. « Development of a Monte Carlo re-weighting method for data fitting and application to measurement of neutral B meson oscillations ». Thesis, University of British Columbia, 2011. http://hdl.handle.net/2429/37021.
Texte intégralЯременко, Наталія Сергіївна, Наталья Сергеевна Яременко et Nataliia Serhiivna Yaremenko. « Метод рандомізованих зведених показників визначення вагових коефіцієнтів в таксономічних показниках ». Thesis, Запорізький національний університет, 2015. http://essuir.sumdu.edu.ua/handle/123456789/60125.
The method of randomized aggregated indicators is applied to find weighting coefficients when a convolution of standardized indicators is required and their contributions to the integral index are unequal.
De, la Rey Tanja. « Two statistical problems related to credit scoring / Tanja de la Rey ». Thesis, North-West University, 2007. http://hdl.handle.net/10394/3689.
Thesis (Ph.D. (Risk Analysis))--North-West University, Potchefstroom Campus, 2008.
Col, Juliana Sipoli. « Coerência, ponderação de princípios e vinculação à lei : métodos e modelos ». Universidade de São Paulo, 2012. http://www.teses.usp.br/teses/disponiveis/2/2139/tde-29082013-132628/.
The subject of this study is the rationality of judgments when there is a collision of principles or a conflict between principles and rules, that is, hard cases, since the legal system provides no predetermined solution that allows facts merely to be subsumed under a norm. Alternative methods are then examined. The first is the method of weighting and balancing proposed mainly by Robert Alexy, notwithstanding its variants. However, the difficulty in applying this method is the weightlessness between weighing and law binding, that is, the choice of the weights of principles and their untying from the Law. The second model, called the coherence model, aims to attain some rationality and to provide criteria that could explain choices between conflicting values underlying the Law, as well as the ascription of weights in the weighting and balancing method. Within the coherence model, its inferential version in particular is studied, which explores coherence between rules and principles through the abduction of principles from rules. These methods are tested on two decisions of the Brazilian Supreme Court in cases of collision of principles, the Ellwanger and anencephalic abortion cases. This does not allow a general account, but only specific outlines of the virtues and defects of these models of decision.
Gencturk, Bilgehan. « Nickel Resource Estimation And Reconciliation At Turkmencardagi Laterite Deposits ». Master's thesis, METU, 2012. http://etd.lib.metu.edu.tr/upload/12614978/index.pdf.
Texte intégralrkmenç
ardagi - Gö
rdes region. Since the nickel (Ni) grade recovered from drilling studies seem to be very low, a reconciliation pit having dimensions of 40 m x 40 m x 15 m in x-y-z directions was planned by Meta Nikel Kobalt Mining Company (META), the license owner of the mine, to produce nickel ore. 13 core drilling and 13 reverse circulation drilling (RC) and 26 column samplings adjacent to each drillholes were located in this area. Those three sampling results were compared to each other and as well as the actual production values obtained from reconciliation pit. On the other side 3D computer modeling was also used to model the nickel resource in Tü
rkmenç
ardagi - Gö
rdes laterites. The results obtained from both inverse distance weighting and kriging methods were compared to the results of actual production to find out the applicability of 3D modeling to laterite deposits. Modeling results showed that Ni grade of the reconciliation pit in Tü
rkmenç
ardagi - Gö
rdes, considering 0.5% Ni cut-off value, by using drillholes data, inverse distance weighting method estimates 622 tonnes with 0.553% Ni and kriging method estimates 749 tonnes with 0.527% Ni. The actual production pit results provided 4,882 tonnes of nickel ore with 0.649% Ni grade. These results show that grade values seem to be acceptable but in terms of tonnage, there are significant differences between theoretical estimated values and production values.
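The inverse distance weighting estimator compared with kriging above can be sketched as follows. This is an illustrative implementation under simplified assumptions (2D points, no search radius or anisotropy), not the software used in the thesis: each estimate is a weighted average of sample grades, with weights decaying as 1/d^p.

```python
import numpy as np

def idw(known_xy, known_val, query_xy, power=2.0):
    """Inverse distance weighting: each query point gets a weighted
    average of the known values, with weights 1 / distance**power."""
    known_xy = np.asarray(known_xy, dtype=float)
    known_val = np.asarray(known_val, dtype=float)
    query_xy = np.atleast_2d(np.asarray(query_xy, dtype=float))
    # Pairwise distances: rows = query points, columns = known samples.
    d = np.linalg.norm(query_xy[:, None, :] - known_xy[None, :, :], axis=2)
    out = np.empty(len(query_xy))
    for i, di in enumerate(d):
        hit = di < 1e-12
        if hit.any():                       # query coincides with a sample
            out[i] = known_val[hit][0]
        else:
            w = 1.0 / di ** power
            out[i] = np.sum(w * known_val) / np.sum(w)
    return out
```

For a query point equidistant from all samples the estimate reduces to their plain mean, and nearer samples dominate as the query moves; the `power` parameter controls how quickly distant samples lose influence.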
Alqasrawi, Yousef T. N. « Natural scene classification, annotation and retrieval : developing different approaches for semantic scene modelling based on Bag of Visual Words ». Thesis, University of Bradford, 2012. http://hdl.handle.net/10454/5523.
Hakala, Tim. « Settling-Time Improvements in Positioning Machines Subject to Nonlinear Friction Using Adaptive Impulse Control ». BYU ScholarsArchive, 2006. https://scholarsarchive.byu.edu/etd/1061.
Texte intégralWensveen, Paul J. « Detecting, assessing, and mitigating the effects of naval sonar on cetaceans ». Thesis, University of St Andrews, 2016. http://hdl.handle.net/10023/8684.
Černoch, Adam. « Vyhodnocování dopravního hluku a jeho modelování ». Master's thesis, Vysoké učení technické v Brně. Fakulta stavební, 2014. http://www.nusl.cz/ntk/nusl-226952.
Boutoux, Guillaume. « Sections efficaces neutroniques via la méthode de substitution ». PhD thesis, Bordeaux 1, 2011. http://tel.archives-ouvertes.fr/tel-00654677.
Texte intégralNodehi, Anahita. « Weighting Methods For Causal Inference With Survival Outcomes ». Doctoral thesis, 2023. https://hdl.handle.net/2158/1297881.
Wang, Zhen-Lin, and 王禎麟. « The Selection and Analysis of Weighting Methods for Technical Evaluation ». Thesis, 2009. http://ndltd.ncl.edu.tw/handle/44673125091499839201.
Texte intégral國立臺灣大學
機械工程學研究所
97
Weightings of the evaluation criteria are normally determined by evaluators and are important factors influencing the results of an evaluation. Through weightings, engineers can learn the preference among criteria from the consumer's viewpoint. The objective of this research is to find a method for estimating weightings for evaluation more accurately. A standard data set has been devised in this research for comparing various multiple criteria decision making methods. The comparison shows the benefits and drawbacks of each method along with its conditions of application, and concludes that the optimum weight model method is more accurate than the others. Weightings also change value over time, so time series analysis methods for historical data are introduced. The autoregressive moving average (ARMA) model was found to be more accurate for near-period weightings, and more accurate still given adequate data. The ARMA model is therefore a better method for estimating the historical trend of weightings.
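The idea of fitting a time series model to historical weightings and forecasting the near-period value can be sketched as follows. This is a minimal illustration using an AR(1) model, a special case of ARMA, on an invented weighting series; it is not the thesis's model or data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical weighting history drifting around 0.4 (illustrative data):
# w[t] = 0.4 + 0.8 * (w[t-1] - 0.4) + noise.
n = 200
w = np.empty(n)
w[0] = 0.4
for t in range(1, n):
    w[t] = 0.4 + 0.8 * (w[t - 1] - 0.4) + rng.normal(0, 0.01)

# Least-squares estimates of the AR(1) model w[t] = c + phi * w[t-1] + e[t].
X = np.column_stack([np.ones(n - 1), w[:-1]])
c, phi = np.linalg.lstsq(X, w[1:], rcond=None)[0]

forecast = c + phi * w[-1]   # one-step-ahead (near-period) weighting forecast
```

The fitted persistence `phi` and long-run level `c / (1 - phi)` should come out close to the values used to generate the series; a full ARMA fit would add moving-average terms for the noise, which a library such as statsmodels can estimate.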
Steele, Bonita Marie. « Reliability of composite test scores : an evaluation of teachers' weighting methods ». Thesis, 2000. http://hdl.handle.net/2429/10342.
Full Text
Yang, Sanky, and 楊森評. « Applying the Weighting Methods in Importance-Performance Analysis to Evaluate the Service Quality - Case Studies of Service Industries around the Campus ». Thesis, 2008. http://ndltd.ncl.edu.tw/handle/61476424009160002532.
Full Text
Asia University
Master's Program, Department of Information Science and Applications
96
This study has two main parts. The first is descriptive statistics of basic information, focused on the service industries around the campus; it gives the management of these businesses a deeper understanding of their customers. Moreover, by applying the traditional Importance-Performance Analysis (IPA), the major strengths can be found and maintained and the major weaknesses specifically improved, enhancing service quality. Traditional IPA requires importance information; when it is unavailable, the importance can be derived by weighting methods. The second part accordingly implements IPA with importance derived from a variety of weighting methods. Eight weighting methods are introduced in this study: the variance, sensitivity, entropy, coefficient-of-variation, correlation-coefficient, certainty, Gini–Simpson index (GSI), and mutual-information weighted methods. Case studies are provided, and the result of each weighting method is compared with the others and with traditional IPA. Percent agreement and the Kappa test are the two indicators used to measure consistency between the eight weighted methods and traditional IPA, and box plots are used to measure stability across a variety of IPA scenarios. The results show several essential points. Applying traditional IPA to convenience stores, the keys to competitive advantage are the shelf stands, store tidiness, and the attitude and politeness of the clerks; the neatness and placement of shelf stands, freshness of goods, speed of restocking, completeness of equipment, and prices are issues that need immediate improvement.
For the dormitory, the manager's responsiveness, crisis handling, and the relationship with and respect for residents are the essential advantages to be maintained, while the facilities and their improvement and repair are problems to be solved as soon as possible. For the library, the key advantages to preserve are the service of the librarians, the library itself, responsiveness, facilities, and environment, while the library collection and the seats and computers in the information-search area are vital parts that should be upgraded promptly. Comparing the consistency between IPA with importance computed from each weighted method and traditional IPA leads to the following conclusions: five weighted methods are suggested when overall satisfaction is unavailable (the variance, entropy, coefficient-of-variation, certainty, and Gini–Simpson index (GSI) weighted methods), and among them the certainty weighted method is most recommended by the consistency and stability criteria; three weighted methods are suggested when overall satisfaction is available (the sensitivity, correlation-coefficient, and mutual-information weighted methods), and among them IPA with importance computed from the mutual-information weighted method is most similar to traditional IPA.
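Of the eight weighting methods this abstract lists, the coefficient-of-variation method is the simplest to sketch: attributes whose ratings vary more across respondents (relative to their mean) receive larger derived importance. The ratings below are invented for illustration.

```python
import numpy as np

def cv_weights(scores):
    # Coefficient-of-variation weighting: derived importance is proportional
    # to each attribute's (sample std / mean) across respondents.
    X = np.asarray(scores, dtype=float)   # rows: respondents, cols: attributes
    cv = X.std(axis=0, ddof=1) / X.mean(axis=0)
    return cv / cv.sum()

# Hypothetical 5-point satisfaction ratings: four respondents, three attributes.
ratings = [[4, 5, 2],
           [5, 5, 3],
           [3, 4, 2],
           [4, 5, 4]]
print(cv_weights(ratings).round(3))
```

The third attribute, whose ratings are most dispersed, ends up with the largest derived importance.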
Chen, Shih-Keng, and 陳世耿. « Weighting Adjustment Method ». Thesis, 2004. http://ndltd.ncl.edu.tw/handle/7ack8j.
Full Text
Chung Yuan Christian University
Graduate Institute of Applied Mathematics
92
Classical inference methods for opinion surveys are generally inappropriate when a certain share of the sampled subjects refuse or are unable to respond. Under the assumption that the missing mechanism is missing at random, differential weights derived from poststratification, which is closely related to the Horvitz-Thompson estimator, can be used to adjust the estimation for bias; it is shown that under this assumption the estimator is unbiased. The method is primarily used to handle unit nonresponse, where a subset of sampled individuals does not complete the survey because of noncontact, refusal, or some other reason. We consider applications to three common sampling schemes: simple random sampling, stratified random sampling, and cluster sampling. Monte Carlo simulations show that the weighting adjustments significantly improve the accuracy of the estimators in many cases.
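The poststratification adjustment described here can be sketched in a few lines; the strata, responses, and population shares below are invented for illustration.

```python
from collections import defaultdict

def poststratified_mean(sample, pop_shares):
    # Poststratification: reweight respondents so each stratum contributes
    # according to its known population share (closely related to the
    # Horvitz-Thompson estimator).
    totals, counts = defaultdict(float), defaultdict(int)
    for stratum, y in sample:
        totals[stratum] += y
        counts[stratum] += 1
    # Estimate = sum over strata of (population share) * (stratum sample mean).
    return sum(pop_shares[s] * totals[s] / counts[s] for s in counts)

# Hypothetical survey with young respondents under-represented in the sample.
sample = [("young", 1), ("young", 1),
          ("old", 0), ("old", 0), ("old", 1), ("old", 0)]
pop_shares = {"young": 0.5, "old": 0.5}
print(poststratified_mean(sample, pop_shares))  # 0.5*1.0 + 0.5*0.25 = 0.625
```

The unweighted sample mean here would be 0.5; reweighting toward the true stratum shares moves the estimate to 0.625, illustrating the bias adjustment.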
Taljaard, Monica. « Non-response error in surveys ». Diss., 1997. http://hdl.handle.net/10500/16167.
Full Text
Mathematical Sciences
M. Com. (Statistics)
Chang, Yi-Te, and 張奕得. « A Dynamic Weighting Method and Analysis ». Thesis, 2016. http://ndltd.ncl.edu.tw/handle/52745826902692887589.
Full Text
National Chiao Tung University
Institute of Statistics
104
Markov chain Monte Carlo (MCMC) is a widely used method for numerical integration. This thesis discusses the dynamic weighting MCMC proposed by Wong and Liang (1997), which makes the Markov chain converge faster. The Metropolis-Hastings algorithm has been an important simulation method for decades, but it still has some drawbacks: for example, the movement of the process can be trapped by nodes with tiny transition probabilities, which directly affects the simulated estimates. Our main work is to review weighted MCMC and give theoretical proofs in some special cases, making the MCMC method more efficient.
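The dynamic weighting scheme reviewed here can be sketched with a simplified R-type move on a toy bimodal target. The target, step size, and theta below are illustrative assumptions rather than the thesis's setup, and the weight update follows our reading of Wong and Liang (1997): estimates are weighted averages, so the weight can absorb occasional moves through low-probability regions.

```python
import math
import random

def dynamic_weighting_sampler(log_pi, x0, n_steps, step=1.0, theta=1.0, seed=1):
    # R-type dynamic-weighting move: each state carries a weight w; a proposal
    # with Metropolis ratio r is accepted with probability a = w*r/(w*r+theta),
    # and w is updated (w*r/a on acceptance, w/(1-a) on rejection) so that
    # weighted averages remain correct.
    rng = random.Random(seed)
    x, w = x0, 1.0
    samples = []
    for _ in range(n_steps):
        y = x + rng.gauss(0.0, step)           # symmetric random-walk proposal
        r = math.exp(log_pi(y) - log_pi(x))    # Metropolis ratio
        a = w * r / (w * r + theta)
        if rng.random() < a:
            x, w = y, w * r + theta            # accepted: weight w*r/a
        else:
            w = w * (w * r + theta) / theta    # rejected: weight w/(1-a)
        samples.append((x, w))
    return samples

# Hypothetical bimodal target: an equal mixture of N(-3, 1) and N(3, 1).
def log_pi(x):
    a, b = -0.5 * (x - 3.0) ** 2, -0.5 * (x + 3.0) ** 2
    m = max(a, b)
    return m + math.log(math.exp(a - m) + math.exp(b - m))

samples = dynamic_weighting_sampler(log_pi, x0=0.0, n_steps=5000)
est = sum(w * x for x, w in samples) / sum(w for _, w in samples)
```

In practice the weights can be heavy-tailed, which is one of the issues such theoretical analyses of weighted MCMC address.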
Huang, Chien-Chung, and 黃建中. « The Objective Weighting Method of Life Cycle Impact Assessment ». Thesis, 2005. http://ndltd.ncl.edu.tw/handle/74554777745416017027.
Full Text
National Taiwan University
Graduate Institute of Environmental Engineering
93
Life cycle assessment (LCA), a powerful environmental management tool for design for environment (DfE), can consider all stages and aspects of a specific product. Although LCA is widely recognized in environmental management, some shortcomings still need to be overcome; the weighting process, a step that cannot be standardized, is one of them. In the past, the weighting process was usually implemented with subjective methodologies, but subjective weights cannot be reused across cases and the process of deriving them is not transparent. The goal of this thesis is to improve the valuation stage of LCA by combining LCA with an environmental indicator system through factor analysis and the simple additive weighting method. The new weighting method, "additive weighting based on environmental indicators" (AWBEI), is expected to increase objectivity and reflect spatial variability, enhancing the reliability of the assessment results. Two case studies are presented: the production of a coffee machine in Taiwan, and the global allocation of a green supply chain for LCD (liquid crystal display) manufacturing. These cases illustrate the processes, applications, and limitations of AWBEI.
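The simple additive weighting step named in the abstract combines normalized impact scores with a weight vector; the impact matrix and weights below are invented for illustration, not taken from the thesis's case studies.

```python
def saw_scores(matrix, weights):
    # Simple additive weighting: normalize each impact category by its
    # column maximum, then take the weighted sum for every alternative.
    n = len(weights)
    col_max = [max(row[j] for row in matrix) for j in range(n)]
    return [sum(w * row[j] / col_max[j] for j, w in enumerate(weights))
            for row in matrix]

# Hypothetical inventory results for two product designs over three impact
# categories, with weights as they might come from an indicator system.
matrix = [[20.0, 5.0, 0.8],
          [16.0, 8.0, 1.0]]
weights = [0.5, 0.3, 0.2]
print([round(s, 4) for s in saw_scores(matrix, weights)])  # → [0.8475, 0.9]
```

With larger-is-worse impact categories, the design with the lower aggregate score would be preferred; AWBEI's contribution is deriving the weight vector objectively from environmental indicators rather than expert judgment.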
Wu, Wen-Long, and 吳文龍. « Study of the weighting method for environmental impact assessment ». Thesis, 1990. http://ndltd.ncl.edu.tw/handle/98597498098397710447.
Full Text
Lin, Yi-Han, and 林意涵. « An e-Journal Recommendation Method by Considering Time Weighting ». Thesis, 2013. http://ndltd.ncl.edu.tw/handle/83772003250925368690.
Full Text
National Cheng Kung University
In-service Program, Department of Industrial and Information Management
101
With the explosive growth and popularity of the Internet, publishing research papers online has become the trend. The Institute for Scientific Information (ISI, now Thomson Reuters) maintains the most widely used online database, the Journal Citation Reports (JCR), to identify the most valuable journals; the JCR Science Edition contains data on over 8,000 journals in science and technology. However, the mass of content available on the Internet raises a problem of information overload and disorientation. Currently, teachers usually seek research papers by entering keywords into an academic search engine, but the number of papers returned is still very large. Many teachers are also used to target-based searching for journal papers online; since such searches do not consider time variation, it is difficult to find journal papers that match their current research. To overcome these problems, we propose a journal recommendation method that considers a time-weighting parameter. First, we utilize N-grams and term frequency (TF) to classify and categorize words. Second, we identify keywords using the Java Wikimedia API. Since newer papers are generally judged more important than older ones, a time weighting is set in our method according to the time factor of the journal papers so that suitable research topics can be extracted. Finally, we build a reference vector (RV) from the set of each teacher's research topics and use it to set up binary vectors of the research topics of teachers and journals; to reduce the complexity of the proposed method, similarity matching is performed in the binary vector space. This thesis thus proposes a time-aware journal recommendation method that finds appropriate journals for teachers.
Experimental results show that the proposed approach can efficiently improve the accuracy of the recommendation.
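The time-weighting idea can be sketched as an exponential decay on term frequencies by paper age; the half-life, decay form, and term counts below are our assumptions, not the thesis's exact formula.

```python
def time_weighted_tf(term_counts, age_years, half_life=3.0):
    # Decay term frequencies by paper age so that newer papers contribute
    # more to a journal's topic profile than older ones.
    decay = 0.5 ** (age_years / half_life)
    return {term: count * decay for term, count in term_counts.items()}

counts = {"recommendation": 4, "journal": 2}
recent = time_weighted_tf(counts, age_years=0.0)   # brand-new paper
older = time_weighted_tf(counts, age_years=6.0)    # six-year-old paper
print(recent["recommendation"], older["recommendation"])  # → 4.0 1.0
```

After this step the weighted frequencies would be thresholded into the binary topic vectors on which the similarity matching operates.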
Su, Yung-Yu, and 蘇永裕. « Applying Entropy Weighting Method and Gray Theory to Product Design ». Thesis, 2004. http://ndltd.ncl.edu.tw/handle/vhf894.
Full Text
National Cheng Kung University
Master's and Doctoral Program, Department of Industrial Design
92
The purpose of this study is to construct a product development design process. It relies on two major methods, entropy weighting and grey theory, with QFD, ACE, and the Structure Variation Method as supporting methods. Starting from a questionnaire, the entropy weighting method evaluates the importance of each factor of the product; the grey statistics method then determines customers' preferences among the design factors. QFD is further used to evaluate the relationship between evaluation factors and design factors. The results establish the essential design factors, and alternative designs are generated through a web questionnaire. The Structure Variation Method is used to analyze the alternative designs, and 3D MAX is used to present the product model. The study establishes a product development principle and uses an MP3 player design as the example; the combination of entropy weighting and grey theory leads to a successful product design method.
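The entropy weighting step can be sketched directly from its standard formula: a factor whose questionnaire scores are spread more unevenly across alternatives carries more information and receives a larger weight. The score matrix below is invented for illustration.

```python
import math

def entropy_weights(matrix):
    # Entropy weighting: compute each criterion's normalized entropy, then
    # turn the divergence (1 - entropy) into a weight.
    m, n = len(matrix), len(matrix[0])
    k = 1.0 / math.log(m)
    divergences = []
    for j in range(n):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [v / total for v in col]
        e = -k * sum(pi * math.log(pi) for pi in p if pi > 0)  # entropy in [0, 1]
        divergences.append(1.0 - e)
    s = sum(divergences)
    return [d / s for d in divergences]

# Hypothetical questionnaire scores: three alternatives rated on two factors.
scores = [[5, 3],
          [4, 4],
          [5, 8]]
print([round(w, 3) for w in entropy_weights(scores)])  # → [0.056, 0.944]
```

The second factor, whose scores differ sharply across alternatives, dominates the weight vector, while the nearly uniform first factor contributes little.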
Yuan, Hui-Wei, and 袁輝偉. « Applying the keyword weighting method to measure SQL statement complexities ». Thesis, 2013. http://ndltd.ncl.edu.tw/handle/sjg875.
Full Text
Chung Hua University
In-service Master's Program, Department of Information Management
101
Databases are extensively applied in our daily life, and almost all modern databases employ SQL as the interface for retrieving and maintaining data; coding SQL query statements efficiently has therefore become a necessary skill for programmers. To explore how to help programmers acquire this skill, a method for measuring the complexity of coding SQL statements is needed. Most existing research has used the Halstead complexity for this purpose; however, this study finds that the Halstead complexity cannot measure the complexity consistently when programmers use different SQL statements to answer the same query. This study therefore proposes a keyword weighting method in place of the Halstead complexity method. The results are as follows: (1) different keywords have different complexities; (2) the complexity of coding an SQL statement can be represented as a function of the complexities of the keywords within it; (3) the complexity of coding an SQL statement correlates with accuracy and confidence. The study concludes that the keyword weighting method surpasses the Halstead complexity method in measuring the complexity of coding SQL statements.
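Result (2) suggests a very simple functional form: score a statement by summing the weights of the keywords it contains. The weight table below is purely hypothetical (the thesis derives its weights empirically), but it shows the mechanism.

```python
import re

# Hypothetical keyword weights: keywords assumed harder to use get larger values.
KEYWORD_WEIGHTS = {
    "SELECT": 1, "FROM": 1, "WHERE": 2, "ORDER": 2, "GROUP": 3,
    "DISTINCT": 3, "HAVING": 4, "JOIN": 4, "UNION": 5, "EXISTS": 5,
}

def statement_complexity(sql):
    # Complexity of one SQL statement = sum of the weights of its keywords;
    # identifiers and literals contribute nothing.
    tokens = re.findall(r"[A-Za-z_]+", sql.upper())
    return sum(KEYWORD_WEIGHTS.get(t, 0) for t in tokens)

q1 = "SELECT name FROM staff WHERE dept = 'IT'"
q2 = "SELECT dept, COUNT(*) FROM staff GROUP BY dept HAVING COUNT(*) > 5"
print(statement_complexity(q1), statement_complexity(q2))  # → 4 9
```

Unlike Halstead's operator/operand counts, this score depends only on which keywords appear, so two equivalent formulations using the same keywords receive the same complexity.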
Yeh, Tien-Yu, and 葉天煜. « Research of Reducing Bit Error Rate by Gray Level Weighting Method ». Thesis, 2009. http://ndltd.ncl.edu.tw/handle/82043063228214877496.
Full Text
National Central University
Graduate Institute of Optical Sciences
97
The purpose of this study is to reduce the bit errors arising in a holographic data storage system, using the Reed-Solomon (RS) error-correction code to handle random noise effectively. After applying the gray level weighting method, the RS code can correct all error bits of the reduced coding pages, yielding a decoded figure without error bits. The gray level weighting method decreases error bits effectively even when the optical system suffers defocus or more serious lens aberrations, showing that it can lower the optical requirements of holographic data storage. Gaussian cumulative probability is used to obtain the mean and standard deviation describing the Gaussian distribution curve that matches the gray level distribution of the actual experiment, from which the theoretical bit error rate of the reduced figure is calculated. Using the Himax LCoS and a common lens, the optimal theoretical bit error rate is 4.19×10^-10.
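The theoretical bit-error-rate calculation described here can be sketched with two Gaussian gray-level distributions and a decision threshold; the means, standard deviations, and threshold below are invented stand-ins for the experimentally fitted values.

```python
import math

def q_function(x):
    # Tail probability Q(x) of the standard normal distribution.
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def threshold_ber(mu0, sigma0, mu1, sigma1, threshold):
    # With 'off' and 'on' pixel gray levels modeled as Gaussians, a bit is
    # read wrongly when an off-pixel exceeds the threshold or an on-pixel
    # falls below it; assuming equally likely bits, average the two rates.
    p01 = q_function((threshold - mu0) / sigma0)   # off pixel read as on
    p10 = q_function((mu1 - threshold) / sigma1)   # on pixel read as off
    return 0.5 * (p01 + p10)

# Hypothetical gray-level statistics as might be fitted from measured
# Gaussian cumulative probabilities.
ber = threshold_ber(mu0=50.0, sigma0=10.0, mu1=180.0, sigma1=15.0, threshold=110.0)
print(f"{ber:.2e}")
```

Widening the gap between the two gray-level distributions (the effect the weighting method aims for) drives both tail probabilities, and hence the BER, down sharply.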
Yang, Jin-De, and 楊金德. « Dividing a rectangular land using numerical method weighting with land price ». Thesis, 1990. http://ndltd.ncl.edu.tw/handle/45983060881380483961.
Full Text
Liu, Pin-Yen, and 劉品言. « An Edge Detection Method via the Tuning of RGB Weighting Ratio ». Thesis, 2018. http://ndltd.ncl.edu.tw/handle/bc99a5.
Texte intégralWu, Cheng-Mao, et 吳承懋. « Applying Entropy Weighting Method and Grey Theory to Optimize Multi-response Problems ». Thesis, 2015. http://ndltd.ncl.edu.tw/handle/42247124042956765273.
Full Text
National Chiao Tung University
Department of Industrial Engineering and Management
103
Facing the sharp competition of the twenty-first century, advanced technology and sophisticated manufacturing processes are necessary for manufacturers to meet consumers' requirements. Developing innovative products, improving product quality, and reducing production cost are effective ways to maintain market competitiveness. Therefore, finding the optimal factor-level combination for a multi-response process under restricted experimental cost, experimental time, and machine feasibility becomes a very important issue for manufacturers. Design of experiments (DOE) is often applied in industry to determine the optimal parameter settings of a process, but it can only be utilized to optimize a single response. Although many studies have developed optimization procedures for multi-response problems, they still have some shortcomings. The main purpose of this study is therefore to develop a method that optimizes multiple responses simultaneously using grey relational analysis (GRA), the entropy weight method, and dual response surface methodology (DRSM). Finally, a real case from a semiconductor factory in Taiwan is used to verify the effectiveness of the proposed procedure.
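The GRA step in such procedures collapses several responses into one grey relational grade per factor-level combination, which can then be optimized as a single response. The normalized response matrix below is invented; the distinguishing coefficient rho = 0.5 is the conventional choice.

```python
def grey_relational_grades(matrix, reference, rho=0.5):
    # Grey relational analysis: compare each alternative's normalized
    # response sequence with a reference (ideal) sequence, then average
    # the grey relational coefficients into one grade per alternative.
    deltas = [[abs(x - r) for x, r in zip(row, reference)] for row in matrix]
    d_min = min(min(row) for row in deltas)
    d_max = max(max(row) for row in deltas)
    coeffs = [[(d_min + rho * d_max) / (d + rho * d_max) for d in row]
              for row in deltas]
    return [sum(row) / len(row) for row in coeffs]

# Hypothetical normalized responses (larger-the-better, scaled to [0, 1])
# for three factor-level combinations and two quality responses.
matrix = [[1.0, 0.6],
          [0.8, 1.0],
          [0.4, 0.2]]
grades = grey_relational_grades(matrix, reference=[1.0, 1.0])
print([round(g, 3) for g in grades])  # → [0.75, 0.833, 0.367]
```

Here the equal averaging could be replaced by entropy-derived response weights, which is exactly where the abstract's entropy weight method plugs in.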
Hung, Chang-Yi, and 洪長義. « Study on The Weighting Analysis of Theft by Using GM(0,N) Method ». Thesis, 2014. http://ndltd.ncl.edu.tw/handle/50362618602878734820.
Full Text
Chienkuo Technology University
Department of Automation Engineering and Graduate Institute of Mechatronoptic Systems
102
Under the impact of the global economic recession and the financial tsunami, Taiwan has not been free from environment-wide effects. Because of the high unemployment rate, social security problems have become increasingly serious, and significant changes in interpersonal relationships and values mean that the original functions of social control are breaking down, leaving society more disordered and less safe. According to the statistics on major types of criminal cases compiled by the National Police Agency, among nationwide cases of theft, fraud, breach of trust, and violations of drug control laws, and the theft cases in Taipei, theft ranks at the top of the criminal cases, which shows the seriousness of the problem. This thesis is therefore based on the numbers of theft cases and applies the GM(0,N) model of grey system theory to find the weightings of the impact factors in theft cases, from which possible solutions and directions for improvement can be identified. The study provides another statistical method for police executives when proposing further policies.
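A GM(0,N) model regresses the accumulated (1-AGO) characteristic series on the accumulated factor series and reads relative factor importance from the coefficient magnitudes. The sketch below follows that common formulation under our own reading; the yearly counts and the two candidate factors are entirely hypothetical, not the thesis's data.

```python
import numpy as np

def gm0n_weights(y, factors):
    # GM(0,N) sketch: apply the first-order accumulated generating operation
    # (1-AGO, i.e. cumulative sums) to every series, fit
    # x1^(1) = a + sum_i b_i * xi^(1) by least squares, and derive relative
    # factor weightings from |b_i|.
    y1 = np.cumsum(np.asarray(y, dtype=float))
    X1 = np.column_stack([np.cumsum(np.asarray(f, dtype=float)) for f in factors])
    A = np.column_stack([np.ones(len(y1)), X1])
    coef, *_ = np.linalg.lstsq(A, y1, rcond=None)
    b = np.abs(coef[1:])
    return b / b.sum()

# Hypothetical yearly theft counts and two candidate impact factors
# (unemployment rate, drug-offence counts).
thefts = [120, 130, 150, 160, 180, 200]
unemployment = [4.0, 4.2, 4.8, 5.0, 5.5, 6.0]
drugs = [30, 31, 33, 32, 35, 36]
print(gm0n_weights(thefts, [unemployment, drugs]).round(3))
```

The AGO smoothing is what lets grey models extract a usable trend from the short, noisy series typical of yearly crime statistics.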