Dissertations / Theses on the topic 'Corpus method'

To see the other types of publications on this topic, follow the link: Corpus method.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'Corpus method.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses from a wide variety of disciplines and organise your bibliography correctly.

1

Tariq, Mariam. "A corpus-based method for ontology acquisition." Thesis, University of Surrey, 1994. http://epubs.surrey.ac.uk/843178/.

Full text
Abstract:
In this thesis we explore the acquisition of a domain ontology based on the characteristics of languages, in particular specialist languages. Our work is supported by the presumption that language can communicate information, specifically classification information, especially when it is employed within specialist domains of knowledge. Knowledge involves being familiar with the existence of important objects and the interrelationships between objects that make up a specific world, and language is often used as a medium to make this knowledge explicit. We examine the possibility of a local grammar for statements that convey ontological information. Assuming a correlation between the conceptual structure of a domain and a substantial collection of domain-specific documents, we propose a method to analyse such a collection in an attempt to elicit this conceptual structure, which may help in understanding the ontological commitment of the domain experts. We have developed a prototype to implement the proposed method.
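As an illustration of the general idea of harvesting classification statements from domain text, here is a minimal sketch; the pattern set is hypothetical and far cruder than the local grammar investigated in the thesis.

```python
import re

# Hypothetical Hearst-style patterns that surface candidate class/subclass
# statements from domain sentences; illustrative only, not the thesis's
# local grammar.
IS_A = re.compile(r"(\w[\w\- ]*?)\s+is a (?:kind|type) of\s+(\w[\w\- ]*)", re.I)
SUCH_AS = re.compile(r"(\w[\w\- ]*?)\s+such as\s+(\w[\w\-]*)", re.I)

def extract_candidate_relations(sentences):
    """Yield (subclass, superclass) candidates found in domain sentences."""
    for sent in sentences:
        for m in IS_A.finditer(sent):
            yield m.group(1).strip(), m.group(2).strip()
        for m in SUCH_AS.finditer(sent):
            # "X such as Y" suggests Y is a kind of X
            yield m.group(2).strip(), m.group(1).strip()

if __name__ == "__main__":
    demo = ["A futures contract is a kind of derivative.",
            "Instruments such as options are traded daily."]
    print(list(extract_candidate_relations(demo)))
```

The output is noisy candidate pairs (articles and modifiers are kept); a real acquisition method would filter and generalise these into ontology concepts.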
APA, Harvard, Vancouver, ISO, and other styles
2

Rayson, Paul Edward. "Matrix : a statistical method and software tool for linguistic analysis through corpus comparison." Thesis, Lancaster University, 2003. http://eprints.lancs.ac.uk/12287/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Norkevičius, Giedrius. "Method for creating phone duration models using very large, multi-speaker, automatically annotated speech corpus." Doctoral thesis, Lithuanian Academic Libraries Network (LABT), 2011. http://vddb.laba.lt/obj/LT-eLABa-0001:E.02~2011~D_20110201_144440-12017.

Full text
Abstract:
Two heretofore unanalyzed aspects are addressed in this dissertation: 1. Building a model capable of predicting phone duration in Lithuanian. All existing investigations of Lithuanian phone durations were performed by linguists; these investigations are usually of an exploratory-statistics kind and are limited to the analysis of a single factor affecting phone duration. In this work, phone duration dependencies on contextual factors were estimated by means of a machine learning method and written in explicit form (a decision tree). 2. Construction of a language-independent method for creating phone duration models using a very large, multi-speaker, automatically annotated speech corpus. Most researchers worldwide use speech corpora that are relatively small, single-speaker, and manually annotated or at least validated by experts. The reasons usually given are that multi-speaker speech corpora are inappropriate because different speakers have different pronunciation manners and speech rates, and that automatically annotated corpora lack accuracy. The created method for phone duration modeling enables the use of such corpora. The main components of the created method are the reduction of noisy data in the speech corpus and the normalization of speaker-specific phone durations using phone type clustering. The listening tests of synthesized speech showed that perceived naturalness is affected by the underlying phone durations; the use of contextual... [to full text]
Disertacijoje nagrinėjamos dvi iki šiol netyrinėtos problemos: 1. Lietuvių kalbos garsų trukmių prognozavimo modelių kūrimas Iki šiol visi darbai, kuriuose yra nagrinėjamos lietuvių kalbos garsų trukmės, yra atlikti kalbininkų, tačiau šie tyrimai yra daugiau aprašomosios statistikos pobūdžio ir apsiriboja pavienių požymių įtakos garso trukmei analize. Šiame darbe, mašininio mokymo algoritmo pagalba, požymių įtaka garsų trukmei yra išmokstama iš duomenų ir užrašoma sprendimo medžio pavidalu. 2. Nuo kalbos nepriklausomų garsų trukmių prognozavimo modelių kūrimo metodas, naudojant didelės apimties daugelio, kalbėtojų automatiškai, anotuotą garsyną. Dėl skirtingų kalbėtojų tarties specifikos ir dėl automatinio anotavimo netikslumų, kuriant garsų trukmės modelius visame pasaulyje yra apsiribojama vieno kalbėtojo ekspertų anotuotais nedidelės apimties garsynais. Darbe pasiūlyti skirtingų kalbėtojų tarties ypatybių normalizavimo ir garsyno duomenų triukšmo atmetimo algoritmai leidžia garsų trukmių modelių kūrimui naudoti didelės apimties, daugelio kalbėtojų automatiškai anotuotus garsynus. Darbo metu atliktas audicinis tyrimas, kurio pagalba parodoma, kad šnekos signalą sudarančių garsų trukmės turi įtakos klausytojų/respondentų suvokiamam šnekos signalo natūralumui; kontekstinės informacijos panaudojimas garsų trukmių prognozavimo uždavinio sprendime yra svarbus faktorius įtakojantis sintezuotos šnekos natūralumą; natūralaus šnekos signalo atžvilgiu, geriausiai vertinamas yra... [toliau žr. visą tekstą]
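For illustration, a minimal sketch of the kind of contextual decision-tree duration model with speaker normalization described in the abstract above, using scikit-learn; the feature names (phone, prev, next, stressed, speaker_mean) are hypothetical placeholders, not the dissertation's actual factor set.

```python
import pandas as pd
from sklearn.tree import DecisionTreeRegressor

def fit_duration_model(rows):
    """rows: dicts such as {'phone': 'a', 'prev': 's', 'next': 'k',
    'stressed': 1, 'speaker_mean': 0.085, 'duration': 0.102} (hypothetical)."""
    df = pd.DataFrame(rows)
    # Normalize away speaker-specific tempo: express each duration relative to
    # that speaker's mean duration for the phone's type cluster.
    df["rel_duration"] = df["duration"] / df["speaker_mean"]
    X = pd.get_dummies(df[["phone", "prev", "next", "stressed"]])
    tree = DecisionTreeRegressor(max_depth=8, min_samples_leaf=50)
    tree.fit(X, df["rel_duration"])  # the fitted tree is the explicit model
    return tree, list(X.columns)
```

The fitted tree can then be read off as explicit rules relating contextual factors to relative phone duration.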
APA, Harvard, Vancouver, ISO, and other styles
4

Lynn, Ethan Michael. "Getting All the Ducks in a Row: Towards a Method for the Consolidation of English Idioms." BYU ScholarsArchive, 2016. https://scholarsarchive.byu.edu/etd/6014.

Full text
Abstract:
Idioms play an important role in language acquisition but learners do not have sufficient time to learn all of them. Therefore, learners need to focus on the most frequently occurring idioms, which can be determined by corpus searches. Building off previous corpus studies, this study generated a comprehensive list of English idioms by combining lists from several sources and developed a methodology for organizing and sorting idioms within the list. In total, over 27,000 idiom forms were amalgamated and a portion of the list was compiled, which featured 2,697 core idioms and 5,559 variant idiom forms. It was found that over 35% of idioms varied structurally and thirteen types of idiom variation were highlighted. Additionally, issues concerning idiom boundaries were investigated. These results are congruent with previous findings which show that variation is a commonly occurring element of idioms. Furthermore, specific problematic elements for future corpus searches and English language learners are identified.
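A minimal sketch of one way to amalgamate idiom lists from several sources and group variant forms under a canonical key; the normalization rule below is illustrative only and much simpler than the consolidation method developed in the thesis.

```python
import re
from collections import defaultdict

def normalize(idiom: str) -> str:
    """Crude canonical form: lower-case, strip punctuation, and collapse
    possessive/pronoun slots so some variant forms group together."""
    s = re.sub(r"[^\w\s']", " ", idiom.lower())
    s = re.sub(r"\b(one's|someone's|his|her|their|my|your)\b", "POSS", s)
    return re.sub(r"\s+", " ", s).strip()

def consolidate(source_lists):
    """Map canonical form -> set of attested variant forms across sources."""
    groups = defaultdict(set)
    for source in source_lists:
        for idiom in source:
            groups[normalize(idiom)].add(idiom)
    return groups

if __name__ == "__main__":
    merged = consolidate([["get one's ducks in a row"],
                          ["get your ducks in a row", "have your ducks in a row"]])
    for key, variants in merged.items():
        print(key, sorted(variants))
```

Note the limits the abstract points to: lexical variation (get/have) is not collapsed by such a simple rule, which is exactly why a principled treatment of idiom variation and idiom boundaries is needed.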
APA, Harvard, Vancouver, ISO, and other styles
5

Theunissen, M. W. (Marthinus Wilhelmus). "Phoneme-based topic spotting on the switchboard corpus." Thesis, Stellenbosch : Stellenbosch University, 2002. http://hdl.handle.net/10019.1/52998.

Full text
Abstract:
Thesis (MScEng)--Stellenbosch University, 2002.
ENGLISH ABSTRACT: The field of topic spotting in conversational speech deals with the problem of identifying "interesting" conversations or speech extracts contained within large volumes of speech data. Typical applications where the technology can be found include the surveillance and screening of messages before referring to human operators. Closely related methods can also be used for data-mining of multimedia databases, literature searches, language identification, call routing and message prioritisation. The first topic spotting systems used words as the most basic units. However, because of the poor performance of speech recognisers, a large amount of topic-specific hand-transcribed training data is needed. It is for this reason that researchers started concentrating on methods using phonemes instead, because the errors then occur on smaller, and therefore less important, units. Phoneme-based methods consequently make it feasible to use computer generated transcriptions as training data. Building on word-based methods, a number of phoneme-based systems have emerged. The two most promising ones are the Euclidean Nearest Wrong Neighbours (ENWN) algorithm and the newly developed Stochastic Method for the Automatic Recognition of Topics (SMART). Previous experiments on the Oregon Graduate Institute of Science and Technology's Multi-Language Telephone Speech Corpus suggested that SMART yields a large improvement over ENWN which outperformed competing phoneme-based systems in evaluations. However, the small amount of data available for these experiments meant that more rigorous testing was required. In this research, the algorithms were therefore re-implemented to run on the much larger Switchboard Corpus. Subsequently, a substantial improvement of SMART over ENWN was observed, confirming the result that was previously obtained. In addition to this, an investigation was conducted into the improvement of SMART. This resulted in a new counting strategy with a corresponding improvement in performance.
AFRIKAANSE OPSOMMING: Die veld van onderwerp-herkenning in spraak het te doen met die probleem om "interessante" gesprekke of spraaksegmente te identifiseer tussen groot hoeveelhede spraakdata. Die tegnologie word tipies gebruik om gesprekke te verwerk voor dit verwys word na menslike operateurs. Verwante metodes kan ook gebruik word vir die ontginning van data in multimedia databasisse, literatuur-soektogte, taal-herkenning, oproep-kanalisering en boodskap-prioritisering. Die eerste onderwerp-herkenners was woordgebaseerd, maar as gevolg van die swak resultate wat behaal word met spraak-herkenners, is groot hoeveelhede hand-getranskribeerde data nodig om sulke stelsels af te rig. Dit is om hierdie rede dat navorsers tans foneemgebaseerde benaderings verkies, aangesien die foute op kleiner, en dus minder belangrike, eenhede voorkom. Foneemgebaseerde metodes maak dit dus moontlik om rekenaargegenereerde transkripsies as afrigdata te gebruik. Verskeie foneemgebaseerde stelsels het verskyn deur voort te bou op woordgebaseerde metodes. Die twee belowendste stelsels is die "Euclidean Nearest Wrong Neighbours" (ENWN) algoritme en die nuwe "Stochastic Method for the Automatic Recognition of Topics" (SMART). Vorige eksperimente op die "Oregon Graduate Institute of Science and Technology's Multi-Language Telephone Speech Corpus" het daarop gedui dat die SMART algoritme beter vaar as die ENWN-stelsel wat ander foneemgebaseerde algoritmes geklop het. Die feit dat daar te min data beskikbaar was tydens die eksperimente het daarop gedui dat strenger toetse nodig was. Gedurende hierdie navorsing is die algoritmes dus herimplementeer sodat eksperimente op die "Switchboard Corpus" uitgevoer kon word. Daar is vervolgens waargeneem dat SMART aansienlik beter resultate lewer as ENWN en dit het dus die geldigheid van die vorige resultate bevestig. Ter aanvulling hiervan, is 'n ondersoek geloods om SMART te probeer verbeter. Dit het tot 'n nuwe telling-strategie gelei met 'n meegaande verbetering in resultate.
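For orientation only, a simple phoneme-trigram topic scorer in the same family as the phoneme-based systems discussed in the abstract above; it is neither ENWN nor SMART, just an illustrative add-one-smoothed baseline.

```python
import math
from collections import Counter

def trigrams(phones):
    """phones: a list of phoneme symbols."""
    return zip(phones, phones[1:], phones[2:])

def train(topic_to_transcripts, alpha=1.0):
    """Build a smoothed trigram count model per topic."""
    models = {}
    for topic, transcripts in topic_to_transcripts.items():
        counts = Counter(t for tr in transcripts for t in trigrams(tr))
        models[topic] = (counts, sum(counts.values()), len(counts), alpha)
    return models

def spot_topic(models, phones):
    """Return the topic whose smoothed trigram model best explains the input."""
    best, best_lp = None, -math.inf
    for topic, (counts, total, vocab, alpha) in models.items():
        lp = sum(math.log((counts[t] + alpha) / (total + alpha * (vocab + 1)))
                 for t in trigrams(phones))
        if lp > best_lp:
            best, best_lp = topic, lp
    return best
```

The point of phoneme-based spotting, as the abstract notes, is that such models can be trained directly on automatically generated phoneme transcriptions rather than hand-transcribed words.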
APA, Harvard, Vancouver, ISO, and other styles
6

Huiza, Pereyra Eric Raphael. "Talking with signs: a simple method to detect nouns and numbers in a non annotated signs language corpus." Master's thesis, Pontificia Universidad Católica del Perú, 2020. http://hdl.handle.net/20.500.12404/16906.

Full text
Abstract:
People with deafness or hearing disabilities who aim to use computer-based systems rely on state-of-the-art video classification and human action recognition techniques that combine traditional movement pattern recognition and deep learning. In this work we present a pipeline for semi-automatic video annotation applied to a non-annotated Peruvian Sign Language (PSL) corpus, along with a novel method for progressive detection of PSL elements (nSDm). We produced a set of video annotations indicating sign appearances for a small set of nouns and numbers, along with a labeled PSL dataset. A model obtained by ensembling a 2D CNN trained on movement patterns extracted from the PSL dataset using Lucas-Kanade optical flow and an RNN with LSTM cells trained on raw RGB frames extracted from the PSL dataset reports state-of-the-art results on the PSL dataset for sign classification tasks in terms of AUC, precision and recall.
Trabajo de investigación
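A minimal sketch of extracting Lucas-Kanade movement patterns from a sign video with OpenCV, i.e. the kind of motion input that could feed the 2D CNN branch of the ensemble described above; the CNN and LSTM models themselves are omitted, the path is hypothetical, and this is an illustration rather than the author's pipeline.

```python
import cv2
import numpy as np

def lk_motion_features(video_path, max_corners=100):
    """Return one mean displacement vector per frame pair (rows x 2)."""
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    if not ok:
        return np.empty((0, 2))
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    p0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=max_corners,
                                 qualityLevel=0.01, minDistance=7)
    displacements = []
    while True:
        ok, frame = cap.read()
        if not ok or p0 is None:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        p1, st, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, p0, None)
        if p1 is not None and (st == 1).any():
            good_new, good_old = p1[st == 1], p0[st == 1]
            displacements.append((good_new - good_old).mean(axis=0))
            p0 = good_new.reshape(-1, 1, 2)   # track the surviving points
        prev_gray = gray
    cap.release()
    return np.array(displacements)
```

Sequences of such motion vectors (or dense flow maps) are what a movement-pattern classifier would consume, while the RGB frames go to the recurrent branch.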
APA, Harvard, Vancouver, ISO, and other styles
7

Lorenzi, Mikaela, and Sofia Bergström. ""I can tell a story that my dads friend tell me" : A corpus- and interview-based study on grammar education, with focus on verb forms." Thesis, Uppsala universitet, Institutionen för pedagogik, didaktik och utbildningsstudier, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-275268.

Full text
Abstract:
This study combines two methods: a textual analysis based on texts from The Uppsala Learner English Corpus (ULEC), and interviews with teachers. The textual analysis investigates errors made by students in year seven and year nine regarding the construction of different verb forms in written English essays. A potential difference between errors made in year seven and year nine is also examined. Moreover, the interview-based analysis investigates professional junior high school teachers' teaching methods and attitudes towards grammar. The errors investigated in the textual analysis are compared with the teachers' perceptions of common verb-form errors made by their students. The textual analysis showed that the most common errors concern spelling within the verb phrase, auxiliary verbs, subject-verb agreement, and irregular verbs, and that year seven had a higher frequency of errors than year nine in most categories, even if the differences were small. The analysis of the teacher interviews found that teachers, in general, enjoy grammar and aim to have a student-centered approach; however, the teachers also describe characteristics of traditional teacher-centered grammar teaching. We reason that traditional teacher-centered grammar teaching is firmly established, and that teachers today do not appear to acquire the tools to move away from the teacher-centered approach towards student-centered grammar teaching. We argue that the education of L2 teachers needs to be reformed to provide tools that help teachers achieve a student-centered approach, and thereby enable students to become more successful in grammar.
APA, Harvard, Vancouver, ISO, and other styles
8

Olsson, Fredrik. "Bootstrapping Named Entity Annotation by Means of Active Machine Learning: A Method for Creating Corpora." Doctoral thesis, SICS, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:ri:diva-22935.

Full text
Abstract:
This thesis describes the development and in-depth empirical investigation of a method, called BootMark, for bootstrapping the marking up of named entities in textual documents. The reason for working with documents, as opposed to for instance sentences or phrases, is that the BootMark method is concerned with the creation of corpora. The claim made in the thesis is that BootMark requires a human annotator to manually annotate fewer documents in order to produce a named entity recognizer with a given performance, than would be needed if the documents forming the basis for the recognizer were randomly drawn from the same corpus. The intention is then to use the created named entity recognizer as a pre-tagger and thus eventually turn the manual annotation process into one in which the annotator reviews system-suggested annotations rather than creating new ones from scratch. The BootMark method consists of three phases: (1) Manual annotation of a set of documents; (2) Bootstrapping – active machine learning for the purpose of selecting which document to annotate next; (3) The remaining unannotated documents of the original corpus are marked up using pre-tagging with revision. Five emerging issues are identified, described and empirically investigated in the thesis. Their common denominator is that they all depend on the realization of the named entity recognition task, and as such, require the context of a practical setting in order to be properly addressed. The emerging issues are related to: (1) the characteristics of the named entity recognition task and the base learners used in conjunction with it; (2) the constitution of the set of documents annotated by the human annotator in phase one in order to start the bootstrapping process; (3) the active selection of the documents to annotate in phase two; (4) the monitoring and termination of the active learning carried out in phase two, including a new intrinsic stopping criterion for committee-based active learning; and (5) the applicability of the named entity recognizer created during phase two as a pre-tagger in phase three. The outcomes of the empirical investigations concerning the emerging issues support the claim made in the thesis. The results also suggest that while the recognizer produced in phases one and two is as useful for pre-tagging as a recognizer created from randomly selected documents, the applicability of the recognizer as a pre-tagger is best investigated by conducting a user study involving real annotators working on a real named entity recognition task.
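A minimal sketch of the committee-based document selection step in phase two, using vote entropy as the disagreement measure and assuming each committee member is a callable that tags a token sequence; this illustrates the general technique only, not the BootMark implementation.

```python
import math
from collections import Counter

def vote_entropy(committee_labels):
    """committee_labels: one label per committee member for a single token."""
    counts = Counter(committee_labels)
    k = len(committee_labels)
    return -sum((c / k) * math.log(c / k) for c in counts.values())

def select_next_document(pool, committee):
    """Pick the unannotated document the committee disagrees on most,
    measured as mean per-token vote entropy."""
    def disagreement(doc_tokens):
        per_member = [tagger(doc_tokens) for tagger in committee]
        token_entropies = [vote_entropy(labels) for labels in zip(*per_member)]
        return sum(token_entropies) / max(len(doc_tokens), 1)
    return max(pool, key=disagreement)
```

After each selected document is manually annotated, the committee is retrained and the selection step repeats until a stopping criterion (such as the intrinsic criterion mentioned above) fires.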
APA, Harvard, Vancouver, ISO, and other styles
9

Do, Thi Ngoc Diep. "Extraction de corpus parallèle pour la traduction automatique depuis et vers une langue peu dotée." Phd thesis, Université de Grenoble, 2011. http://tel.archives-ouvertes.fr/tel-00680046.

Full text
Abstract:
Machine translation systems today obtain good results for certain language pairs such as English-French, English-Chinese, English-Spanish, etc. Empirical translation approaches, in particular the probabilistic (statistical) machine translation approach, allow a translation system to be built quickly provided that adequate data corpora are available. Indeed, probabilistic machine translation is based on learning models from large bilingual parallel corpora for the source and target languages. However, research on machine translation for so-called under-resourced language pairs has to face the challenge of data scarcity. We therefore addressed the problem of acquiring a large corpus of parallel bilingual texts in order to build a probabilistic machine translation system. The originality of our work lies in the fact that we focus on under-resourced languages, for which parallel bilingual text corpora do not exist in most cases. This manuscript presents our methodology for extracting a parallel training corpus from a comparable corpus, a richer and more diverse data resource available on the Internet. We propose three extraction methods. The first method follows the classical search approach, which uses general document characteristics as well as lexical information from the document to extract both comparable documents and parallel sentences. However, this method requires additional data about the language pair. The second method is a fully unsupervised method that requires no additional input data and can be applied to any language pair, even under-resourced ones. The last method is an extension of the second method that uses a third language to improve the extraction process for two language pairs. The proposed methods are validated by experiments on the under-resourced Vietnamese language and on French and English.
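As a generic illustration of lexicon-based parallel sentence extraction from a comparable corpus (in the spirit of the first, lexically informed method, though not the thesis's actual procedure), here is a minimal sketch assuming tokenised sentences and a source-to-target bilingual lexicon; the threshold is arbitrary.

```python
def pair_score(src_tokens, tgt_tokens, lexicon):
    """lexicon: dict mapping a source word to a set of target translations."""
    if not src_tokens or not tgt_tokens:
        return 0.0
    tgt_set = set(tgt_tokens)
    covered = sum(1 for w in src_tokens if lexicon.get(w, set()) & tgt_set)
    coverage = covered / len(src_tokens)
    length_ratio = (min(len(src_tokens), len(tgt_tokens)) /
                    max(len(src_tokens), len(tgt_tokens)))
    return coverage * length_ratio

def extract_parallel(src_sents, tgt_sents, lexicon, threshold=0.4):
    """Greedily pair each source sentence with its best-scoring target sentence."""
    pairs = []
    for s in src_sents:
        best = max(tgt_sents, key=lambda t: pair_score(s, t, lexicon))
        if pair_score(s, best, lexicon) >= threshold:
            pairs.append((s, best))
    return pairs
```

The unsupervised methods described in the thesis dispense with the bilingual lexicon, which is exactly the resource an under-resourced pair typically lacks.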
APA, Harvard, Vancouver, ISO, and other styles
10

Chan, Nok Chin Lydia. "Grammar "bores the crap out of me!": A mixed-method study on the XTYOFZ construction and its usage by ESL and ENL speakers." Thesis, Stockholms universitet, Engelska institutionen, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-194086.

Full text
Abstract:
Different from Generative Grammar which sees grammar as a formal system of how words are put together to form sentences, Construction Grammar suggests that grammar is more than just rules and surface forms; instead, grammar includes many form-and-meaning pairings which are called constructions. For years, Construction Grammarians have been investigating constructions with various approaches, including corpus-linguistics, pedagogical, second language acquisition and so on, yet there is still room for exploration. The present paper aims to further investigate the [V the Ntaboo-word out of]-construction (Hoeksema & Napoli, 2008; Haïk, 2012; Perek, 2016; Hoffmann, 2020) (e.g., I kick the hell out of him.) and propose a new umbrella construction, “X the Y out of Z” (XTYOFZ) construction, for it. Another aim is to examine the usage and comprehension of the XTYOFZ construction by English as a Second Language (ESL) and English as Native Language (ENL) speakers. The usage context, syntactic and semantic characteristics of the XTYOFZ construction were examined through corpus linguistic methodology. Furthermore, processing and understanding of the construction by ESL and ENL speakers were tested via an online timed Lexical Decision Task as well as an online follow-up survey consisting of questions on English acquisition and usage, and a short comprehension task on the XTYOFZ construction. Corpus data shows that in general, the combination of non-motion action verbs (e.g., scare, beat) as X and taboo terms (e.g., shit, hell) as Y was the most common. Also, it was found that the construction occurs mostly in non-academic contexts such as websites and TV/movies. On the other hand, results from the Lexical Decision Task show that ESL speakers access constructional meaning slightly more slowly than ENL speakers. The follow-up survey also reflects that ESL speakers seem to have a harder time to produce and comprehend the construction compared to ENL speakers. By investigating the features of a relatively less-discussed construction and its usage by ESL speakers, this study hopes to increase the knowledge base of Construction Grammar and ESL construction comprehension and usage, particularly on the constructions that are mainly used in more casual settings.
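A minimal sketch of the kind of pattern query used to retrieve XTYOFZ instances from raw text; a real study would query a part-of-speech-tagged corpus, and this regular expression is illustrative and will both over- and under-generate.

```python
import re

# Crude surface pattern for "X the Y out of Z"; Z is capped at three words.
XTYOFZ = re.compile(
    r"\b(\w+)\s+the\s+(\w+)\s+out\s+of\s+(\w+(?:\s+\w+){0,2})",
    re.IGNORECASE,
)

def find_xtyofz(lines):
    """Yield dicts with the X, Y and Z slots for every match in the corpus lines."""
    for line in lines:
        for m in XTYOFZ.finditer(line):
            yield {"X": m.group(1), "Y": m.group(2), "Z": m.group(3)}

if __name__ == "__main__":
    sample = ["Grammar bores the crap out of me!",
              "They scared the hell out of the new intern."]
    for hit in find_xtyofz(sample):
        print(hit)
```

Sorting the extracted X and Y slots by frequency is what supports observations such as non-motion action verbs and taboo nouns being the most common fillers.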
APA, Harvard, Vancouver, ISO, and other styles
11

Таценко, Наталія Віталіївна, Наталия Витальевна Таценко, and Nataliia Vitaliivna Tatsenko. "Методи корпусної лінгвістики в підготовці фахівців-філологів." Thesis, “Baltija Publishing”, 2020. https://essuir.sumdu.edu.ua/handle/123456789/81070.

Full text
Abstract:
У тезах здійснено аналіз методів корпусної лінгвістики в підготовці фахівців-філологів. Надано визначення поняття "корпус", розглянуто сучасні комп’ютерні корпуси та їхні функції. З’ясовано, що у процесі навчання фахівців-філологів корпусні ресурси забезпечують викладачів емпіричним матеріалом для підтвердження їхніх гіпотез, а також екстралінгвальною інформацією (вік, рід автора чи мовця, часові та просторові параметри походження тексту тощо).
В тезисах осуществлен анализ методов корпусной лингвистики в подготовке специалистов-филологов. Дано определение понятия "корпус", рассмотрены современные компьютерные корпусы и их функции. Установлено, что в процессе обучения специалистов-филологов корпусные ресурсы обеспечивают преподавателей эмпирическим материалом для подтверждения их гипотез, а также экстралингвистической информацией (возраст, род автора или говорящего, временные и пространственные параметры происхождения текста и т.д.).
The conference abstracts analyze the corpus linguistics methods in the training of philologists. The definition of the notion "corpus" is given, modern computer corpora and their functions are considered. It has been established that in the process of philologists’ teaching, corpus resources provide teachers with empirical material to confirm their hypotheses, as well as extralinguistic information (age, gender of the author or speaker, temporal and spatial parameters of the text origin, etc.).
APA, Harvard, Vancouver, ISO, and other styles
12

MATSUBARA, Shigeki, Yoshihide KATO, Seiji EGAWA, 茂樹 松原, 芳秀 加藤, and 誠二 江川. "構文木からの再帰構造の除去による文圧縮." 一般社団法人情報処理学会, 2008. http://hdl.handle.net/2237/15072.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

KORTE, MATTHEW. "Corpus Methods in Interlanguage Analysis." University of Cincinnati / OhioLINK, 2008. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1218835515.

Full text
APA, Harvard, Vancouver, ISO, and other styles
14

Совпенко, Маргарита Олексіївна. "Лінгвокультурний та прагматичний аспекти перекладу політкоректної лексики в англомовних медійних текстах." Master's thesis, Київ, 2018. https://ela.kpi.ua/handle/123456789/25779.

Full text
Abstract:
Магістерська дисертація складається зі вступу, трьох розділів, висновків до кожного з них, загальних висновків, та списку використаної літератури, який налічує 100 джерел, 4 додатки. Загальний обсяг роботи 223 сторінок. Однією з актуальних проблем сучасного мовознавства залишається дослідження співвідношення мови та культури, оскільки мова є дзеркалом культури, в якому відбивається суспільна самосвідомість, спосіб життя, традиції, мораль, система цінностей і світогляд. Протягом останніх десятиліть в світі, зокрема в англомовних країнах, надзвичайної популярності набув новий культурний і мовний феномен «політична коректність». Актуальність теми роботи зумовлена інтересом до дослідження політкоректної лексики в сучасній англійській мові, що, своєю чергою, пояснюється зростаючими процесами глобалізації та міжнародної інтеграції і співпраці в суспільстві. Теоретичною основою нашого дослідження є теорії та положення, розроблені сучасними вітчизняними та закордонними мовознавцями та перекладознавцями: М. Бауманом, С. Г. Тер-Мінасовою, О. Б. Сінькевичем, В. Б. Великородою, Т. Р. Кияком, О. В. Завадською, З. С. Трофімовою, Я. С. Бондарук, Р. Й. Вишнівським, Т. Лоусоном, Д. Герродом, М. Г. Комлєвим та Ф. Беквітом. Об’єктом дослідження є політкоректна лексика в англомовних медійних текстах. Предметом дослідження виступають лінгвокультурологічні та прагматичні особливості функціонування та перекладу політкоректної лексики в англомовних медійних текстах. Мета роботи полягає у вивченні функціональних особливостей політкоректної лексики в англомовних медійних текстах та шляхів її відтворення українською мовою з огляду на лінгвокультурологічні розбіжності. Досягнення поставленої мети передбачає вирішення таких завдань: 1) з’ясувати соціокультурні передумови виникнення явища політкоректності та розглянути політкоректність як лінгвістичний феномен; 2) вивчити особливості загального поняття «медійний текст»; 3) обґрунтувати вживання політкоректної мови в англомовному медійному тексті; 4) дослідити класифікації політкоректних одиниць та їхнє функціональне призначення в англомовних медійних текстах; 5) проаналізувати шляхи відтворення англомовних політкоректних термінів українською мовою. Наукова новизна дисертаційного дослідження полягає в комплексному описі політкоректних одиниць медійних текстів, зокрема в детальному аналізі їх перекладу. Вперше досліджено прагматичний потенціал політкоректної лексики за допомогою корпусного методу. Практичне значення здобутих результатів полягає у тому, що дисертація збагачує дослідницький досвід сучасної германістики новими знаннями про політкоректність її репрезентацію в сучасній англійській мові. Отримані результати поглиблюють уявлення про закономірність функціонування англомовних політкоректних одиниць, особливості їх перекладу українською мовою та можуть бути використані в освітньому процесі, зокрема в теоретичних та практичних курсах теорії та практики перекладу, мовної комунікації та лінгвокультурології (розділ ІІІ. Шляхи перекладу політкоректної лексики сучасних англомовних медійних текстів українською мовою). Матеріалом дослідження є 470 політкоректних дібраних методом суцільної вибірки з 90 медіатекстів, зокрема видань The Economist, The Guardian, The Daily Mail, Princeton Info, The Washington Post, The American Prospect, The Bolton News та текстів BBC та СNN. Методи дослідження. У роботі застосовано загальнонаукові (аналіз, синтез, індукція, дедукція, узагальнення) та власне лінгвістичні методи. 
За допомогою методу суцільної вибірки були виокремлені одиниці аналізу. Описовий метод дозволив здійснити таксономію та інтерпретацію політкоректних одиниць. Контекстуальний метод допоміг у визначенні лінгвальних і позалінгвальних особливостей мовних одиниць; кількісний аналіз – у встановленні частотності застосування перекладацьких трансформацій; корпусний – в обґрунтуванні прагматичного потенціалу політкоректної мови в медійному тексті, визначенні лексичних та семантичних особливостей політкоректних термінів та встановленні сучасних тенденцій вживання політкоректних слів. Апробація результатів дослідження. Основні методологічні, теоретичні результати і концептуальні положення дослідження обговорювалися на: ІІ Всеукраїнській студентській науково-практичній конференції «Наука ХХІ століття: виклики, пріоритети, перспективи досліджень»; IV Всеукраїнській студентській науково-практичній конференції «Студент як суб’єкт процесу модернізації вищої освіти ХХІ століття: візії, цінності, пріоритетні завдання». Публікації. Основні положення і результати дисертаційного дослідження висвітлено в 3 публікаціях, з яких: 1 стаття у науковому фаховому виданні України, 2 – у збірниках матеріалів всеукраїнських науково-практичних конференцій.
The master's dissertation consists of an introduction, three chapters, conclusions to each chapter, general conclusions, and list of references that includes 100 points and 4 applications. The paper amounts to 223 pages. One of the current issues of modern linguistics is the study of the relation of a language and culture since the language is a mirror of the culture, which reflects the social consciousness, lifestyle, traditions, morals, the system of values and worldview. In the world, particularly in the English-speaking countries, a new cultural and linguistic phenomenon "political correctness" has gained immense popularity over the last decades. The significant topicality is presented by the interest in the study of politically correct vocabulary in modern English, which, in its turn, is explained by the fast-growing processes of globalization and international integration and cooperation in society. The theoretical basis of our research includes the theories and statements developed by modern domestic and foreign linguists and translators: M. Bauman, S. G. Ter-Minasova, O. B. Sinkevich, V. B. Velikoroda, T. R. Kiyak, O. V. Zavadskaya, S. S. Trofimova, J. S. Bondaruk, R. Y. Vishnevsky, T. Lewisohn, D. Gerrod, M. G. Komlev, and F. Beckwith. The object of the study is politically correct terms in English media texts. The subject of the study is linguocultural and pragmatic peculiarities of the functioning and translation of politically correct terms in English media texts. The aim of the research is to study the functional features of politically correct terms in English media texts and ways of their translation in Ukrainian with regard to the linguocultural differences. Achieving this goal involves the solution of the following tasks: 1) to determine socio-cultural preconditions of occurrence of the phenomenon of political correctness and consider political correctness as a linguistic phenomenon; 2) to study the features of the general notion of "media text"; 3) to substantiate the use of politically correct language in the English media text; 4) to classify the politically correct units and their functional purpose in English media texts; 5) to analyze the ways of English politically correct terms translation in Ukrainian. The scientific novelty of the dissertation research lies in the comprehensive description of the politically correct units of media texts, in a detailed analysis of their translation in particular. For the first time, the pragmatic potential of the politically correct vocabulary is studied with the application of the corpus method. The practical value of the obtained results is that the dissertation enriches the research of contemporary Germanic literature with new knowledge about political correctness and its representation in modern English. The obtained results deepen the comprehension of the functioning pattern of English politically correct terms and the peculiarities of their translation into Ukrainian and can be used in the educational process, particularly in the theoretical and practical courses of the theory and practice of translation, communication and linguocultural studies (Chapter III. The Ways of translation of the politically correct terms of modern English media texts in Ukrainian). 
The research material amounts to 470 politically correct terms selected from 90 media texts, including The Economist, The Guardian, The Daily Mail, Princeton Info, The Washington Post, The American Prospect, The Bolton News and BBC and CNN, by a multistage sampling approach. Research methods. In this work, general (analysis, synthesis, induction, deduction, generalization) and linguistic methods are applied. Units of analysis were determined by means of a continuous sampling method. The descriptive method allowed to present the taxonomy and interpretation of politically correct units. The contextual method helped in the determination of the linguistic and extra-linguistic features of politically correct units; quantitative analysis - in the determination of the frequency of translation transformations application; corpusmethod facilitated the study of the pragmatic potential of the politically correct language in the media text, determining the lexical and semantic peculiarities of politically correct terms, and establishing modern trends in the politically correctness. The main methodological and theoretical results and conceptualities of the study were discussed at: II All-Ukrainian Student Scientific and Practical Conference "The Science of the 21st Century: Challenges, Priorities, Perspectives of Research"; IV All-Ukrainian student scientific and practical conference "Student as a subject of the process of modernization of higher education of the XXI century: visions, values, priority tasks". Publications. The main statements and results of the dissertation research are presented in 3 publications, of which: 1 article in the scientific professional edition of Ukraine, 2 - in collections of materials of all-Ukrainian scientific and practical conferences.
APA, Harvard, Vancouver, ISO, and other styles
15

Wennerberg, Pinar [Verfasser]. "Corpus based methods for ontology modularization in healthcare / Pinar Wennerberg." München : Verlag Dr. Hut, 2011. http://d-nb.info/1011441837/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
16

Mohamed, Ghada. "Text classification in the BNC using corpus and statistical methods." Thesis, Lancaster University, 2011. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.658020.

Full text
Abstract:
The main part of this thesis sets out to develop a system of categories within a text typology. Although there exist many different approaches to the classification of text into categories, this research fills a gap in the literature, as most work on text classification is based on features external to the text such as the text's purpose, the aim of discourse, and the medium of communication. Text categories that have been set up based on some external features are not linguistically defined. In consequence, texts that belong to the same type are not necessarily similar in their linguistic forms. Even Biber's (1988) linguistically-oriented work was based on externally defined registers. Further, establishing text categories based on text-external features favours theoretical and qualitative approaches to text classification. These approaches can be seen as top-down approaches where external features are defined functionally in advance, and subsequently patterns of linguistic features are described in relation to each function. In such a case, the process of linking texts with a particular type is not done in a systematic way. In this thesis, I show how a text typology based on similarities in linguistic form can be developed systematically using a multivariate statistical technique; namely, cluster analysis. Following a review of various possible approaches to multivariate statistical analysis, I argue that cluster analysis is the most appropriate for systematising the study of text classification, because it has the distinctive feature of placing objects into distinct groupings based on their overall similarities across multiple variables. Cluster analysis identifies these groupings algorithmically. The objects to be clustered in my thesis are the written texts in the British National Corpus (BNC). I will make use of the written part only, since results of previous research which attempted to classify texts of this dataset were not very beneficial. Takahashi (2006), for instance, identified merely a broad distinction between formal and informal styles in the written part; whereas in the spoken part, he could come up with insightful results. Thus, it seems justifiable to look at the part of the BNC which Takahashi found intractable, using a different multivariate technique, to see if this methodology allows patterns to emerge in the dataset. Further, there are two other reasons to use the written BNC. First, some studies (e.g. Akinnaso 1982; Chafe and Danielewicz 1987) suggest that distinctions between text varieties based on frequencies of linguistic features can be identified even within one mode of communication, i.e. writing. Second, analysing written text varieties has direct implications for pedagogy (Biber and Conrad 2009). The variables measured in the written texts of the BNC are linguistic features that have functional associations. However, any linguistic feature can be interpreted functionally; hence, we cannot say that there is an easy way to decide on a list of linguistic features to investigate text varieties. In this thesis, the list of linguistic features is informed by some aspects of Systemic Functional Theory (SFT) and characteristics identified in previous research on writing, as opposed to speech. SFT lends itself to the interpretation of how language is used through functional associations of linguistic features, treating meaning and form as two inseparable notions.
This characteristic of SFT can be one source to inform my research to some extent, which assumes that a model of text-types can be established by investigating not only the linguistic features shared in each type, but also the functions served by these linguistic features in each type. However, there is no commitment in this study to aspects of SFT other than those I have discussed here. Similarly, the linguistic features that reflect characteristics of speech and writing identified in previous research also have a crucial role in distinguishing between different texts. For instance, writing is elaborate, and this is associated with linguistic features such as subordinate clauses, prepositional phrases, adjectives, and so on. However, these characteristics do not only reflect the distinction between speech and writing; they can also distinguish between different spoken texts or different written texts (see Akinnaso 1982). Thus, the linguistic features seen as important from these two perspectives are included in my list of linguistic features. To make the list more principled and exhaustive, I also consult a comprehensive corpus-based work on the English language, along with some microscopic studies examining individual features in different registers. The linguistic features include personal pronouns, passive constructions, prepositional phrases, nominalisation, modal auxiliaries, adverbs, and adjectives. Computing a cluster analysis based on this data is a complex process with many steps. At each step, several alternative techniques are available. Choosing among the available techniques is a non-trivial decision, as multiple alternatives are in common use by statisticians. I demonstrate a process of testing several combinations of clustering methods in order to determine the most useful/stable clustering combination(s) for use in the classification of texts by their linguistic features. To test the robustness of the clustering techniques and to validate the cluster analysis, I use three validation techniques for cluster analysis, namely the cophenetic coefficient, the adjusted Rand index, and the AV p-value. The findings of the cluster analysis represent a plausible attempt to systematise the study of the diversity of texts by means of automatic classification. Initially, the cluster analysis resulted in 16 clusters/text types. However, a thorough investigation of those 16 clusters reveals that some clusters represent quite similar text types. Thus, it is possible to establish overall headings for similar types, reflecting their shared linguistic features. The resulting typology contains six major text types: persuasion, narration, informational narration, exposition, scientific exposition, and literary exposition. Cluster analysis thus proves to be a powerful tool for structuring the data, if used with caution. The way it is implemented in this study constitutes an advance in the field of text typology. Finally, a small-scale case study of the validity of the text typology is carried out. A questionnaire is used to find out whether and to what extent my taxonomy corresponds to native speakers' understanding of textual variability, that is, whether the taxonomy has some mental reality for native speakers of English.
The results showed that native speakers of English, on the one hand, are good at explicitly identifying the grammatical features associated with scientific exposition and narration; but on the other hand, they are not so good at identifying the grammatical features associated with literary exposition and persuasion. The results also showed that participants seem to have difficulties in identifying grammatical features of informational narration. The results of this small-scale case study indicate that the text typology in my thesis is, to some extent, a phenomenon that native speakers are aware of, and thus we can justify placing our trust in the results - at least in their general pattern, if not in every detail.
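A minimal sketch of the clustering and validation step, assuming each text has already been reduced to a vector of standardised linguistic-feature frequencies; the settings (Ward linkage on Euclidean distances, 16 clusters, cophenetic correlation as one validation measure) follow the description above, but the code is an illustration, not the thesis's implementation.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster, cophenet
from scipy.spatial.distance import pdist

def cluster_texts(freq_matrix, n_clusters=16, method="ward"):
    """freq_matrix: (texts x features) array of per-text feature frequencies."""
    X = np.asarray(freq_matrix, dtype=float)
    X = (X - X.mean(axis=0)) / X.std(axis=0)      # standardise each feature
    Z = linkage(X, method=method)                 # hierarchical clustering
    coph_coeff, _ = cophenet(Z, pdist(X))         # cophenetic correlation
    labels = fcluster(Z, t=n_clusters, criterion="maxclust")
    return labels, coph_coeff
```

Comparing several linkage/distance combinations on measures such as the cophenetic coefficient and the adjusted Rand index is the kind of robustness testing the abstract describes before settling on a final text typology.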
APA, Harvard, Vancouver, ISO, and other styles
17

Krull, Kirsten. "Lieber Gott, mach mich fromm ... : Zum Wort und Konzept “fromm” im Wandel der Zeit." Doctoral thesis, Umeå : Institutionen för moderna språk, Umeå univ, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-286.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Lindqvist, Ingemar. "Arabismos en el español cotidiano : Un estudio diacrónico de frecuencias." Thesis, Stockholms universitet, Romanska och klassiska institutionen, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-194102.

Full text
Abstract:
La larga presencia histórica de arabehablantes en la península ibérica tuvo como resultado la inclusión de préstamos léxicos, a menudo llamados arabismos, en el español. El objetivo de esta investigación ha sido comparar el uso de estos arabismos en el español cotidiano moderno con el del siglo XVI. Con este fin, se presenta una enumeración ordenada de los arabismos más frecuentes en el español moderno y se la compara con una lista correspondiente del español del siglo XVI. Las listas están basadas en dos corpus que maneja la Real Academia Española. Se realzan las semejanzas y diferencias entre las  dos listas y se discuten las posibles explicaciones de cambios en popularidad para los arabismos respectivos. Además, se presentan cálculos del porcentaje total de arabismos en el español de hoy y el del siglo XVI. Para este cálculo han sido usadas novelas de las dos épocas. Las novelas escogidas están todas arraigadas el las dos Castillas; por consiguiente, la comparación del porcentaje se limitará al español castellano. El estudio añade información cuantitativa que hasta ahora parece faltar respecto al conocimiento existente de arabismos en el español. El resultado de la investigación indica que la frecuencia de arabismos en la lengua cotidiana ha disminuido solo marginalmente desde el siglo XVI, mientras que el número de arabismos distintos en el uso corriente del español ha sufrido una reducción más pronunciada y el número de raíces hispanoárabes utilizadas ha decrecido aún más. Aproximadamente la mitad de los arabismos más frecuentes en el siglo XVI todavía mantienen esta posición; para la mayoría de los arabismos que han experimentado un aumento o reducción evidente en popularidad de uso existen explicaciones plausibles en forma de cambios en la sociedad.
The long-lasting historical presence of Arabic-speaking groups on the Iberian Peninsula resulted in various lexical loans, often referred to as arabisms, in Spanish. The objective of this investigation has been to compare the use of these arabisms in modern colloquial Spanish with that of the 16th century. For this purpose an ordered list of the most frequent arabisms found in modern Spanish is presented and compared with a similar list of arabisms found in texts from the 16th century. The lists are based on two corpus managed by the Royal Spanish Academy. Similarities and differences between the two lists are highlighted and possible explanations for the change in popularity of the respective arabisms are discussed. In addition, calculations of the total percentage of arabisms in current and 16th century Spanish are presented. Novels from the respective periods are used as a basis for these calculations. All the chosen novels are rooted in Castile; consequently, the percentage comparison is limited to Castilian Spanish. The study adds quantitative information that currently seems to be lacking to the existing knowledge of arabisms in Spanish. The result of the investigation indicates that the frequency of arabisms in colloquial language has diminished marginally since the 16th century, whereas the number of distinct arabisms in everyday usage of Spanish has suffered a more pronounced reduction and the number of hispanoarabic roots used has decreased even more. Approximately half of the most frequent arabisms in the 16th century still maintain this position; for a majority of the arabisms that have experienced an evident increase or decrease in popularity there exist plausible explanations in the form of changes in society.
APA, Harvard, Vancouver, ISO, and other styles
19

Sánchez-Martínez, Felipe. "Using unsupervised corpus-based methods to build rule-based machine translation systems." Doctoral thesis, Universidad de Alicante, 2008. http://hdl.handle.net/10045/13879.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

Henninger, Peggy. "Coping Challenges and Methods Among Parents of Children with Corpus Callosum Disorders." ScholarWorks, 2019. https://scholarworks.waldenu.edu/dissertations/6723.

Full text
Abstract:
Disorders of the corpus callosum (ADCC) present developmental challenges to children and adults. These disorders are characterized by symptoms of abnormal behaviors and/or thinking patterns. Because ADCC may exist in combination with other disabilities, individual IQs and the severity and problems vary from individual to individual. Using the double ABCx model of family adaptation to stress related to a family member with a disability, the purpose of this cross-sectional study was to provide the first evaluation of parental adaptation among parents of children with ADCC. The final sample, 265 mothers of children with ADCC, was recruited through online support groups for ADCC parents. Parent adaptation was operationally defined as quality of life and operationalized by scores on the World Health Organization (WHO) Quality of Life Questionnaire (QOL). The predictors were measured by the Questionnaire on Resources and Stress (QRS), Family Empowerment Scale (FES), Sense of Coherence Scale (SOC), and Coping Health Inventory for Parents (CHIP). Linear regressions were used to evaluate the predictors in the 4-factor double ABCx prediction model of parent adaptation. Except for parent stress level, family empowerment, sense of coherence, and coping styles were statistically significant predictors of parental quality of life. That is, mothers who reported experiences of empowerment, coherence, and positive coping also have high self-reported quality of life. The findings, the first for experiences of parents of children with ADCC, provide valuable information for further research, but also for other parents and those who may be instrumental in the development of supportive services for this population.
APA, Harvard, Vancouver, ISO, and other styles
21

Aljohani, Samirah. "Subsective gradience in 2nd participles : an aspectual approach to adjectival passives and attributive participles in English." Thesis, University of St Andrews, 2018. http://hdl.handle.net/10023/12987.

Full text
Abstract:
This study investigates the adjectival passive, in accordance with Beedham's (2005, 1982) analysis of the passive as an aspect, with the caveat that telicity is an optimal, not sufficient, condition. The affinity of the adjectival passive with attributive participles and the existence of implicit agents in adjectival passives has divided opinion amongst linguists. The thesis deploys grammaticality judgment questionnaires surveying 1043 2nd participles and a corpus-based study investigating 1035 2nd participles. A subsective gradience (Aarts 2007, 2006, 2004) is modelled on five morpho-syntactic properties of 2nd participles: attributive function without modification, attributive function with modification, adjectival, verbal and prepositional passive, measuring formally the ability of 2nd participles to function like adjectives. The thesis consists of seven chapters. Chapter one introduces the research questions, adjectival passives and theoretical background. Chapter two reviews the aspect analysis, telicity, offers a qualification, and sets the theoretical approach. Chapter three is about the data and methodology. Chapter four discusses the affinity between adjectival passive and attributive participles. Chapter five discusses subsective gradience. Chapter six discusses the implications of the findings. Chapter seven gives a summary and conclusion. The empirical findings in our study provide further evidence in support of a subsective gradience in 2nd participles indicative of how ‘adjectival' a participle can be, on a continuum or gradient ranging from ‘verby' 2nd participles – relatively low compatibility with adjectival properties – to very adjectival 2nd participles. 2nd participles in this study are shown to have an inherent meaning of ‘action + state'. 2nd participles which form adjectival passives function attributively and form verbal passives. However, a 2nd participle functioning attributively does not entail that it will form an adjectival passive. There is evidence that attributive un- participles can host manner adverbials. It was also found that the interpretation of attributive participles goes beyond a simple passive/perfect dichotomy, and there are cases whereby a 2nd participle modifies an NP that is not an argument of the corresponding verb. This study makes a contribution to the wider analysis of the adjectival passive and provides further support for the similarity between adjectival and verbal passives.
APA, Harvard, Vancouver, ISO, and other styles
22

Tsukamoto, Marcio Michiharu. "Desenvolvimento do método de partículas na representação de corpos flutuantes em ondas altamente não-lineares." Universidade de São Paulo, 2006. http://www.teses.usp.br/teses/disponiveis/3/3135/tde-19092006-111325/.

Full text
Abstract:
O método numérico para fluidos incompressíveis desenvolvido no presente estudo é o Moving Particle Semi-Implicit Method (MPS) que enxerga o domínio discretizado em partículas, é baseado em representação lagrangeana e não tem a necessidade de utilização de malhas. O método MPS tem como equações governantes uma forma particular da equação de Navier-Stokes e a equação da continuidade para fluidos incompressíveis e não viscosos. Os métodos de simulação de fluidos mais comumente utilizados são baseados em representação euleriana e utilizam malhas para descrever a geometria do domínio a ser simulado. Devido a essas diferenças, uma das grandes virtudes do método de partículas é a facilidade de investigação de fenômenos altamente não-lineares como o de superfície livre com quebra de ondas, de líquidos no interior de uma embarcação em movimento, de ondas batendo na parte externa do casco de um navio, etc. Em artigos já publicados, resultados de experimentos físicos mostraram boa aderência aos resultados numéricos de simulações realizadas com o método MPS. No presente trabalho, resultados das forças de excitação das simulações com ondas regulares foram comparados com os resultados do programa Wave Analysis MIT (WAMIT) que é um programa consagrado no meio científico. Houve uma boa concordância de resultados entre os dois programas. A otimização do cálculo de vizinhança forneceu uma grande economia de tempo computacional. A maior contribuição deste estudo foi a otimização da função que resolve o sistema linear implementando no código desenvolvido um código paralelizado de uso público existente chamado Portable, Extensible Toolkit for Scientific Computation (PETSc) que proporcionou um bom ganho de desempenho.
A numerical method called the Moving Particle Semi-implicit (MPS) method was developed in this study to analyze incompressible fluids. It is a particle method using a Lagrangian representation without any grid. The governing equations are the Navier-Stokes equation and the continuity equation for incompressible, non-viscous flow. Most computational fluid dynamics (CFD) methods are based on an Eulerian representation and use grids to describe the geometry of the simulated domain. These differences make it easier for the MPS method to analyze highly nonlinear phenomena such as free surfaces with wave breaking, sloshing, slamming, etc. In previously published articles, results of physical experiments had shown good agreement with numerical results obtained with the MPS method. In the present work, the computed exciting forces were compared with results obtained with a validated program called Wave Analysis MIT (WAMIT), and the two programs agreed well. The optimization of the neighborhood calculation saved considerable computational time. The greatest contribution of this study was the optimization of the linear system solver, achieved by implementing in the developed code a parallelized public library called the Portable, Extensible Toolkit for Scientific Computation (PETSc), which provided a good performance gain.
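For context, the continuity and momentum equations for incompressible, non-viscous flow named in the abstract, written in the Lagrangian form typical of particle methods such as MPS (a standard statement of the governing equations, not quoted from the thesis):

```latex
\nabla \cdot \mathbf{u} = 0, \qquad
\frac{D\mathbf{u}}{Dt} = -\frac{1}{\rho}\nabla p + \mathbf{g}
```

Here u is the velocity, ρ the density, p the pressure and g the gravitational acceleration; the semi-implicit step solves a pressure Poisson-type linear system, which is why the linear solver (here handled with PETSc) dominates the computational cost.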
APA, Harvard, Vancouver, ISO, and other styles
23

Cromieres, Fabien. "Using Scalable Run-Time Methods and Syntactic Structure in Corpus-Based Machine Translation." 京都大学 (Kyoto University), 2011. http://hdl.handle.net/2433/142121.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

Durand, Jacques. "Geometries de corps sedimentaires : concepts, methodes, exemples." Rennes 1, 1989. http://www.theses.fr/1989REN10073.

Full text
Abstract:
The logic of construction and preservation of sedimentary bodies explains their geometries, which are directly a function of tectonic, eustatic, glacio-eustatic or tectono-eustatic events. Through the study of various examples, we show how the geometries of sedimentary bodies can currently be reconstructed. The original geometries, which are rarely preserved, are explained by the effect of autocyclic factors alone. The preserved geometries are the consequence of interactions between autocyclic and allocyclic factors. The architectural modifications that occur during fossilization produce a more or less substantial re-spreading of the sand stock that makes up the sedimentary bodies.
APA, Harvard, Vancouver, ISO, and other styles
25

Levy, Marlow H. "Allocating non-monetary incentives for Navy Nurse Corps Officers menu method vs. bid method Combinatorial Retention Auction Mechanism (CRAM) /." Thesis, Monterey, California : Naval Postgraduate School, 2010. http://edocs.nps.edu/npspubs/scholarly/theses/2010/Mar/10Mar%5FLevy.pdf.

Full text
Abstract:
Thesis (M.S. in Management)--Naval Postgraduate School, March 2010.
Thesis Advisor(s): Gates, William R. ; Coughlan, Peter. "March 2010." Author(s) subject terms: Combinatorial Retention Auction Mechanism, auction mechanism, auction, Nurse Corps, Nurse Corps retention, retention, retention mechanism, Menu Method, Bid Method. Includes bibliographical references (p. 95-99). Also available in print.
APA, Harvard, Vancouver, ISO, and other styles
26

Simoneau, Craig L. (Craig Lance). "Alternative contracting methods in the U.S. Army Corps of Engineers." Thesis, Massachusetts Institute of Technology, 1992. http://hdl.handle.net/1721.1/45728.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Goldhahn, Dirk. "Quantitative Methoden in der Sprachtypologie: Nutzung korpusbasierter Statistiken." Doctoral thesis, Universitätsbibliothek Leipzig, 2013. http://nbn-resolving.de/urn:nbn:de:bsz:15-qucosa-130550.

Full text
Abstract:
This work addresses various aspects of the use of corpus-based statistics in quantitative typological investigations. The individual sections of the thesis can be viewed as parts of a language-independent processing chain that allows comprehensive studies of the different languages of the world. The steps considered range from the automated creation of the underlying resources, through mathematically grounded methods, to the finished results of the various typological analyses. The investigations focus first on the text corpora underlying the analysis, in particular on their acquisition and processing from a technical point of view. This is followed by discussions of the use of the corpora in the field of lexical language comparison, in which linguistic relationships are quantified by empirical means. In addition, the corpora serve as the basis for automated measurements of linguistic parameters: such measurable properties are both introduced and examined systematically with regard to their usefulness for typological studies. Finally, the relationships of these measurements to each other and to typological parameters are investigated using quantitative methods.
APA, Harvard, Vancouver, ISO, and other styles
28

Peacock, Jerry Edgar. "Marine Corps IT hardware: a method for categorizing and determining technology refreshment cycles." Thesis, Monterey, California: Naval Postgraduate School, 2015. http://hdl.handle.net/10945/45924.

Full text
Abstract:
Approved for public release; distribution is unlimited
Management of information technology (IT) assets within an enterprise is necessary to control organizational costs and ensure that the necessary business requirements are supported. For over 10 years, the Navy Marine Corps Intranet (NMCI) was charged with this task in Navy and Marine Corps IT systems. With the expiration of the NMCI contract, the Marine Corps is now managing its own IT assets. To understand the scope of IT assets to enable better management, this research explores items accounted for within the master data repository, which is aiding in the migration of legacy logistics systems to GCSS-MC. These items and their associated costs are divided into categories to provide a baseline view of Marine Corps IT hardware assets. An equivalent annual cost is applied to assets to suggest a refreshment cycle for laptops, desktops, and servers. This demonstrates a method that can provide IT managers with a means of determining when an asset should be refreshed.
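As an illustrative aside (not drawn from the thesis itself), a refreshment cycle of the kind described above is often chosen by minimizing an equivalent annual cost (EAC); the sketch below applies the standard EAC formula to a hypothetical laptop, with invented prices, maintenance costs, and discount rate, and a simple undiscounted maintenance average for brevity.

```python
def equivalent_annual_cost(purchase_cost, annual_maintenance, rate, years):
    """EAC = purchase_cost / annuity_factor + average annual maintenance over the horizon."""
    annuity_factor = (1 - (1 + rate) ** -years) / rate
    avg_maintenance = sum(annual_maintenance[:years]) / years  # simplification: plain average
    return purchase_cost / annuity_factor + avg_maintenance

# Hypothetical figures: a $1,200 laptop, maintenance rising as hardware ages, 7% discount rate.
maintenance = [50, 75, 120, 200, 350, 500]
for n in range(1, 7):
    print(n, "years:", round(equivalent_annual_cost(1200, maintenance, 0.07, n), 2))
# The suggested refresh cycle is the horizon with the lowest equivalent annual cost.
```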
APA, Harvard, Vancouver, ISO, and other styles
29

Herichon, Eliam. "Modélisation et simulation du déplacement de corps indéformables dans les écoulements diphasiques." Thesis, Aix-Marseille, 2014. http://www.theses.fr/2014AIXM4768.

Full text
Abstract:
This work deals with the modeling and numerical simulation of the effects of a rigid body moving through a compressible multiphase flow. It addresses situations in which several objects are in motion, or in which one object moves within a domain of complex geometry, so that the study cannot be carried out in a reference frame attached to the moving body. The model is based on a multiphase diffuse-interface method in which the phases are in mechanical equilibrium. The system governing the fluid flow is augmented with an advection equation acting on a level-set function whose zero level locates the moving body in space. Coupling terms are added to the right-hand sides of the momentum and total energy equations; they consist of a penalization factor and a velocity-relaxation factor. This new method makes it possible to simulate complex cases involving interactions between high-velocity moving bodies, shock waves, and liquid/gas interfaces.
APA, Harvard, Vancouver, ISO, and other styles
30

Shaterzadeh-Yazdi, Mohammad Hossein 1991. "Análise de contato entre dois corpos elásticos usando o Método dos Elementos de Contorno." [s.n.], 2015. http://repositorio.unicamp.br/jspui/handle/REPOSIP/265747.

Full text
Abstract:
Advisors: Paulo Sollero, Eder Lima de Albuquerque
Master's dissertation - Universidade Estadual de Campinas, Faculdade de Engenharia Mecânica
In contact problems between two elastic bodies, computing the stresses and strains in the components is of great importance. In particular cases the bodies are subjected to normal and tangential loads in the presence of friction, which increases the complexity of the problem. This work presents a study of the phenomenon and the modeling of the problem using the boundary element method (BEM). Because of friction and the contact constraints, the problem becomes nonlinear. The nonlinearity was handled by applying the load incrementally and using a solver for nonlinear systems of equations. The contact zone is one of the unknowns of the problem and may contain stick and slip states simultaneously. These states depend on the normal and tangential forces on the component and may change during the loading process, so each load increment can disturb the previous state; the variables must therefore be evaluated and the system of equations updated at every iteration. For this reason, a robust algorithm for determining the contact states is proposed. Since the resulting system of equations is nonlinear, an appropriate numerical method is required: Newton's method is applied, which allows the contact state to be checked at each increment. The analysis uses continuous quadratic elements and yields continuous, non-oscillatory results. Comparison of the results with the analytical solutions of Hertz and Mindlin-Cattaneo shows good agreement.
Mestrado
Mecanica dos Sólidos e Projeto Mecanico
Mestre em Engenharia Mecânica
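Purely as an illustration of the incremental-load Newton strategy described in the abstract above (not code from the dissertation; a made-up two-equation residual stands in for the BEM contact system), a minimal sketch:

```python
import numpy as np

def newton(residual, jacobian, x0, tol=1e-10, max_iter=50):
    """Generic Newton iteration: solve residual(x) = 0 starting from x0."""
    x = np.array(x0, dtype=float)
    for _ in range(max_iter):
        r = residual(x)
        if np.linalg.norm(r) < tol:
            break
        x -= np.linalg.solve(jacobian(x), r)
    return x

# Toy nonlinear system parameterized by a load factor lam, standing in for the
# incremental application of the external load in a contact formulation.
def residual(x, lam):
    return np.array([x[0] ** 3 + x[1] - lam, x[0] + 2.0 * x[1] - 0.5 * lam])

def jacobian(x, lam):
    return np.array([[3.0 * x[0] ** 2, 1.0], [1.0, 2.0]])

x = np.zeros(2)
for lam in np.linspace(0.1, 1.0, 10):          # load increments
    x = newton(lambda v: residual(v, lam),      # the previous increment's solution is
               lambda v: jacobian(v, lam), x)   # the starting guess for the next one
print("solution at full load:", x)
```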
APA, Harvard, Vancouver, ISO, and other styles
31

Njike, Fotzo Hermine. "Structuration Automatique de Corpus Textuels par Apprentissage Automatique : Automatically structuring textual corpora with machine learning methods." Paris 6, 2004. http://www.theses.fr/2004PA066567.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Vesterinen, Niklas. "Discrepancy of sequences and error estimates for the quasi-Monte Carlo method." Thesis, Karlstads universitet, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kau:diva-78525.

Full text
Abstract:
We present the notions of uniform distribution and discrepancy of sequences contained in the unit interval, as well as an important application of discrepancy in numerical integration by way of the quasi-Monte Carlo method. Some fundamental (and other interesting) results regarding these notions are presented, along with detailed and instructive examples and comparisons (some of which are not often provided by the literature). We go on to analytical and numerical investigations of the asymptotic behaviour of the discrepancy (in particular for the van der Corput sequence) and of the general error estimates for the quasi-Monte Carlo method. Using the findings from these investigations, we give a conditional proof of the van der Corput theorem. Furthermore, we illustrate that by using low-discrepancy sequences (such as the van der Corput sequence), a rather fast convergence rate of the quasi-Monte Carlo method may still be achieved, even in situations where the famous theoretical result, the Koksma inequality, has been rendered unusable.
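As an illustrative sketch only (not material from the thesis), the following generates the base-2 van der Corput sequence and uses it in a quasi-Monte Carlo estimate of a one-dimensional integral; the integrand is an arbitrary example.

```python
import math

def van_der_corput(n, base=2):
    """n-th term of the van der Corput sequence: reflect the base-b digits of n about the radix point."""
    q, bk = 0.0, 1.0 / base
    while n > 0:
        n, r = divmod(n, base)
        q += r * bk
        bk /= base
    return q

def qmc_integrate(f, n_points):
    """Quasi-Monte Carlo estimate of the integral of f over [0, 1]."""
    return sum(f(van_der_corput(k)) for k in range(1, n_points + 1)) / n_points

f = lambda x: math.exp(x)           # exact integral over [0, 1] is e - 1
for n in (100, 1000, 10000):
    est = qmc_integrate(f, n)
    print(n, est, abs(est - (math.e - 1)))
```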
APA, Harvard, Vancouver, ISO, and other styles
33

Allen, Bernal B. "Meteor burst communications for the U.S. Marine Corps Expeditionary Force." Thesis, Monterey, California. Naval Postgraduate School, 1989. http://hdl.handle.net/10945/27131.

Full text
Abstract:
Approved for public release; distribution is unlimited
Meteor Burst Communications (MBC) is explored in relation to its usefulness for Marine Expeditionary Force communications. A description of the physics and geometry of meteor trail propagation is presented, and communication techniques used to exploit the phenomenon are discussed. Current MBC circuits have operational ranges of 1200 miles without relay and maintain average data rates of 60 to 150 bits per second (bps). MBC is primarily limited by the physics and geometry of the propagation medium, and its usefulness is bounded by its slow data rate. Within these boundaries, however, several significant uses of MBC are identified.
APA, Harvard, Vancouver, ISO, and other styles
34

Souza, Daniel Câmara de. "Eletrodinâmica variacional e o problema eletromagnético de dois corpos." Universidade de São Paulo, 2014. http://www.teses.usp.br/teses/disponiveis/43/43134/tde-26012015-213657/.

Full text
Abstract:
We study Wheeler-Feynman electrodynamics using a variational principle for a finite action functional coupled to a boundary value problem. For piecewise C2 trajectories, the critical-point condition for this functional yields the Wheeler-Feynman equations of motion together with a continuity condition on partial momenta and partial energies known as the Weierstrass-Erdmann corner condition. We study in detail the simpler sub-case in which the boundary data have minimal length, and show that the critical-point condition then reduces to a two-point boundary value problem for a state-dependent, mixed-type neutral differential-delay equation. We solve this problem numerically using a shooting method and a fourth-order Runge-Kutta scheme. For the cases in which the minimal boundary data have discontinuous velocities, we developed a technique to solve the Weierstrass-Erdmann corner conditions together with the two-point boundary value problem. The trajectories with discontinuous velocities predicted by the variational method were verified by numerical experiments. In a second development, for the harder case of boundaries of arbitrary length, we implemented a weak-gradient minimization method for the variational principle and boundary value problem quoted above. Two numerical methods, both implemented in MATLAB, were developed to find solutions of the electromagnetic two-body problem. The first combines the finite element method with Newton's method to find solutions at which the weak gradient of the functional vanishes for generic boundaries; the second uses the steepest-descent method to find solutions that minimize the action. In both methods the trajectories are approximated within a finite-dimensional space generated by a Galerkin basis that supports discontinuous velocities. Many tests and numerical experiments were performed to verify the convergence of the numerically computed trajectories, and the numerically computed values of the functional were compared with known analytical results for circular orbits.
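As a generic illustration of the shooting-plus-Runge-Kutta strategy mentioned above (not the dissertation's delay-equation code; a simple second-order boundary value problem is used here for brevity):

```python
import numpy as np

def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def integrate(f, y0, t0, t1, n=200):
    t, y, h = t0, np.array(y0, float), (t1 - t0) / n
    for _ in range(n):
        y = rk4_step(f, t, y, h)
        t += h
    return y

# Toy BVP: y'' = -y on [0, pi/2] with y(0) = 0 and y(pi/2) = 1 (exact solution: sin t).
f = lambda t, y: np.array([y[1], -y[0]])
def endpoint_mismatch(slope):
    return integrate(f, [0.0, slope], 0.0, np.pi / 2)[0] - 1.0

# Shooting: find the unknown initial slope by bisection on the endpoint mismatch.
lo, hi = 0.0, 2.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if endpoint_mismatch(lo) * endpoint_mismatch(mid) <= 0:
        hi = mid
    else:
        lo = mid
print("initial slope found by shooting:", 0.5 * (lo + hi))  # exact value is 1.0
```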
APA, Harvard, Vancouver, ISO, and other styles
35

Delhommeau, Gérard. "Les problemes de diffraction-radiation et de resistance de vagues : etude theorique et resolution numerique par la methode des singularites." Nantes, 1987. http://www.theses.fr/1987NANT2032.

Full text
Abstract:
Study of the behaviour of multiple structures excited by waves in unlimited depth or in the presence of a horizontal bottom. A theory is presented for solving first-order hydrodynamic problems using the Kelvin singularity method. The wave-resistance problem is then studied and the Neumann-Kelvin problem is posed, with an examination of the difficulties that arise when the bodies intersect the free surface.
APA, Harvard, Vancouver, ISO, and other styles
36

Holland, Michael. "An Assessment of the U.S. Army Corps of Engineers' Environmental Plan Evaluation Methods." ScholarWorks@UNO, 2011. http://scholarworks.uno.edu/td/124.

Full text
Abstract:
The U.S. Army Corps of Engineers is a federal agency with a mission to develop water resource projects to benefit the nation. Some of its large-scale projects have been built to benefit cities, but through unintended consequences have caused economic and environmental damages. For example, its control of Mississippi River flooding has protected the City of New Orleans, but contributed to land loss in coastal Louisiana and, by some accounts, made the population more susceptible to hurricane damage. The agency has now embarked on a mission to restore some of the damaged environmental areas. This dissertation evaluates whether the policies and practices used by the agency to evaluate and select plans for implementation are logically flawed and could produce suboptimal project selection. The primary issue is the practice of including only implementation costs in the analysis while excluding other positive and negative economic impacts. A case study is performed using the method to evaluate a traditional economic development project for which optimal project selection has already been determined using widely accepted benefit-cost practices. The results show that the Corps' environmental project evaluation method would cause rejection of the most efficient plan. The loss of welfare that would result from using this technique is measured by comparing the welfare gain of the optimal project to the welfare gain of the suboptimal projects that could be selected using the flawed methodology. In addition, the dissertation evaluates whether suboptimal results could be produced using two other current Corps policies: selecting projects based on production efficiency, and the exclusion of environmental benefits from the discounting process. For the first policy, a simple counterexample shows how clearly inferior choices may come from including only supply considerations in investment choices. For the second policy, it is demonstrated mathematically that refraining from discounting benefits while discounting costs causes a bias towards selection of plans that take longer to build, are delayed in their implementation, or a combination of the two.
APA, Harvard, Vancouver, ISO, and other styles
37

Joyce, Karen E. "A method for mapping live coral cover using remote sensing /." [St. Lucia, Qld.], 2004. http://www.library.uq.edu.au/pdfserve.php?image=thesisabs/absthe18618.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Ghienne, Cécile. "L’opération de suppression comme indice de compétences dans des réécritures de textes narratifs de CM2 et de 6e." Electronic Thesis or Diss., Toulon, 2019. http://www.theses.fr/2019TOUL3001.

Full text
Abstract:
This research, conducted in two schools, made it possible to build a corpus of texts written by pupils in CM2 and 6e (fifth and sixth grade), together with their drafts and other intermediate versions, whether or not these were required by the teachers. Using the methodology of genetic criticism, a database was created to record the four revision operations found in some pupils' drafts (additions, deletions, substitutions and transpositions). The study then offers a more detailed analysis of the deletion operation. Whereas previous research has mostly associated deletion in pupils' texts with disavowal or failure, the quantitative and qualitative analyses in this thesis suggest that deletion can be the product of a genuine strategy on the pupil's part. Finally, peer-review activities were set up in the classes in order to study, on the one hand, the deletion suggestions made by peers and, on the other hand, the impact of these comments on the pupils' rewriting.
APA, Harvard, Vancouver, ISO, and other styles
39

FETHEDDINE, ABDELATIF. "Le statut de la logique dans le corpus d'averroes. Essai sur sa methode dans la jurisprudence et la science." Paris 10, 1995. http://www.theses.fr/1995PA100178.

Full text
Abstract:
This research focuses on Averroes' method in jurisprudence and in the science of logic. The aim is to determine whether the method found in his juridical writings can be recognized in his work as a commentator: how are the young Averroes and the great judge reflected in the intellectual profile of the commentator? Trained as a jurist, Averroes first devoted himself to problems of exegesis, writing the abridgement of the Muhtasar and the Bidaya. In these two works he concentrated on the operative analysis of the principal rules of juridical methodology, taking on the task of elaborating his own conception of juridical practice by specifying the laws and conditions that determine the role a judge must fulfil in order to reach the highest degree of competence. His juridical method is marked by a logical spirit and a philosophical preoccupation. In his commentaries on the Organon, logic is conceived as a set of arts that teaches us to reason correctly so as to evaluate the coherence and rigour of any argument we hear. He thus anticipates the possible philosophical consequences of the different positions Aristotle might take, and does not hesitate to consider several different yet admissible interpretations in order to identify the one that does justice to the Stagirite's intention.
APA, Harvard, Vancouver, ISO, and other styles
40

Silva, Marcos Valério Gebra da 1971. "Determinação das dimensões espaciais de corpos sólidos por técnicas ópticas de moiré." [s.n.], 2011. http://repositorio.unicamp.br/jspui/handle/REPOSIP/256864.

Full text
Abstract:
Advisors: Inacio Maria Dal Fabbro, Celina de Almeida
Master's dissertation - Universidade Estadual de Campinas, Faculdade de Engenharia Agrícola
The measurement of three-dimensional solids has received great attention from the scientific community because of its wide range of applications, for example in industrial quality control, in measurement of the human body for ergonomics, and in many other areas. Several methods and techniques exist for obtaining such measurements. This work demonstrates the moiré technique, a group of non-contact, non-destructive methods with a fast digitization process, in which the moiré fringes result from the subtraction of a grid projected onto the object of interest from the grid projected onto a reference plane; its measurement accuracy is comparable to that of other systems. The accuracy of the moiré techniques is examined, with emphasis on projection moiré with phase shifting and on the use of two types of grid, Ronchi and sinusoidal, and the possible errors of the various moiré techniques and of other metrological methods are discussed. The work establishes the best-performing grid types and grid-frequency variations, including several practical applications to regular and irregular solids (fruits), comparisons with other techniques for several agricultural engineering problems, and volume determination of regular and irregular solids. Body dimensions were also compared with conventional techniques such as water immersion and calliper measurement. Free software such as ImageJ, RisingSun Moiré, SCILAB/SIP and routines was employed, which was also a concern for the dissemination of the technique.
Mestrado
Maquinas Agricolas
Mestre em Engenharia Agrícola
APA, Harvard, Vancouver, ISO, and other styles
41

Blanchard, Jonathan Peter. "Rainwater Harvesting Storage Methods and Self Supply in Uganda." Scholar Commons, 2012. http://scholarcommons.usf.edu/etd/3979.

Full text
Abstract:
Self supply is an emerging approach to water supply which focuses on fostering household investment in incremental improvements to their water sources. When successful, it can lower costs and increase sustainability by offering users a larger share of ownership in their own supply, and harnessing the already existing strengths of a community rather than trying to impose an external perspective. In addition to well upgrading and source protection, one of the key self supply areas is rainwater harvesting. Uganda has a diverse selection of rainwater storage options, but many of them are scattered and disparate. The objective of this study was to create a comprehensive collection of well-established Ugandan rainwater storage options, and to demonstrate the geographical disparities in availability, particularly for Rakai District, where the author lived and worked as a Water and Sanitation Engineer for two years. Data was gathered by interviewing key stakeholders in rainwater harvesting at the national, regional, and district level in order to gather their collective knowledge in rainwater harvesting storage techniques. In order to understand the availability and pricing of manufactured products, a survey of Rakai District hardware stores determined the prices and range of volumes at which different manufactured products were available. The study found 11 distinct technologies widely used for rainwater storage: three informal or traditional, three manufactured, and five built-in-place by skilled artisans. The traditional/informal technologies consisted of clay pots, pots and basins, and brick mortar tanks. The manufactured products were plastic tanks ranging from 60 to 24,000 liters, corrugated iron tanks, and 55-gallon metal drums. The built-in-place tank technologies were mortar jars, tarpaulin tanks, ferrocement tanks, partially below ground ferrocement tanks, and interlocking stabilized soil brick tanks. The study also found that while the manufactured products are well distributed, built-in-place options have not spread beyond where they were originally introduced by NGO's trying to promote certain technologies. With regard to costs, tanks with storage volume less than 1,000 liters had costs that ranged from 182 to 724 UGX/liter, with small plastic tanks being least expensive. For volumes between 1,000 and 10,000 liters, costs ranged between 42 and 350 UGX/liter, with tarpaulin tanks providing the largest storage per unit cost. Above 10,000 liters of storage, tanks ranged from 35 to 341 UGX/liter, with tarpaulin tanks again ranking first by cost per unit volume. In order for self supply to flourish, these technologies need to be implemented in such a way that fosters a thriving private sector and independent uptake of rainwater harvesting. This research provides a starting point by laying out the technologies, costs, and volumes available.
APA, Harvard, Vancouver, ISO, and other styles
42

Slater, Richard S. "An analysis of credit card use as a method for making small purchases in the United States Marine Corps." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 1994. http://handle.dtic.mil/100.2/ADA297054.

Full text
Abstract:
Thesis (M.S. in Management)--Naval Postgraduate School, December 1994.
"December 1994." Thesis advisor(s): David V. Lamm, Louis G. Kalmar. Includes bibliographical references (p. 119-121). Also available online.
APA, Harvard, Vancouver, ISO, and other styles
43

Karaliūtė, Asta. "Statistiniai kolokacijų nustatymo metodai ir vertimo atitikmenys lygiagrečiajame grožinės literatūros tekstyne." Master's thesis, Lithuanian Academic Libraries Network (LABT), 2010. http://vddb.laba.lt/obj/LT-eLABa-0001:E.02~2010~D_20100617_111239-72584.

Full text
Abstract:
The main objective of this Master's thesis is collocations and collocation extraction methods. The aim of the research is to analyse collocation lists extracted by statistical methods from a parallel corpus of fiction and to determine the collocation equivalents in translation. Relevance of the thesis: collocation analysis can help linguists and other language specialists choose appropriate collocation extraction methods for both English and Lithuanian, and an understanding of the collocation translation process is important for translation analysis and for translators. The research consists of five parts. Chapter 2 presents the concept of collocation and possible collocation translation problems; this theoretical part also includes the characteristics of the four selected statistical methods: Mutual Information (MI), T-score, Dice and Log-likelihood ratio (LLR). In Chapter 3, collocation lists are extracted for each language, English and Lithuanian. The analysis reveals that the T-score and LLR methods extract grammatical collocations, while MI and Dice extract lexical ones. Further in this chapter, collocation boundaries and similarity coefficients for the methods are defined. Chapter 4 presents a list of the top 200 collocations for each language and method. The methods with the new collocation lists are compared in pairs according to similarity criteria: Dice with MI (lexical collocations) and T-score with LLR (grammatical collocations). Another distribution of bigrams according to frequency is identified, and both... [to full text]
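As a minimal illustrative sketch (not code from the thesis), the four association measures named above can be computed from bigram and unigram counts roughly as follows; the example counts are invented.

```python
import math

def association_scores(c_xy, c_x, c_y, n):
    """Collocation association measures from the joint count c_xy, the marginal counts
    c_x and c_y, and the corpus size n (total number of bigrams)."""
    expected = c_x * c_y / n
    mi = math.log2(c_xy / expected)                   # (pointwise) Mutual Information
    t_score = (c_xy - expected) / math.sqrt(c_xy)     # T-score
    dice = 2 * c_xy / (c_x + c_y)                     # Dice coefficient
    # Log-likelihood ratio over the 2x2 contingency table of the bigram.
    table = [(c_xy, c_x * c_y / n),
             (c_x - c_xy, c_x * (n - c_y) / n),
             (c_y - c_xy, (n - c_x) * c_y / n),
             (n - c_x - c_y + c_xy, (n - c_x) * (n - c_y) / n)]
    llr = 2 * sum(obs * math.log(obs / exp) for obs, exp in table if obs > 0)
    return {"MI": mi, "T-score": t_score, "Dice": dice, "LLR": llr}

# Hypothetical counts: the bigram occurs 30 times, its words 200 and 150 times, in 100,000 bigrams.
print(association_scores(c_xy=30, c_x=200, c_y=150, n=100_000))
```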
APA, Harvard, Vancouver, ISO, and other styles
44

Santos, Flávia Milo dos. "Impacto hidrodinâmico vertical de corpos axissimétricos através de uma abordagem variacional." Universidade de São Paulo, 2013. http://www.teses.usp.br/teses/disponiveis/3/3152/tde-22092014-110644/.

Full text
Abstract:
In terms of classical hydrodynamics, the hydrodynamic impact problem is a boundary value problem with moving boundaries whose position must be determined simultaneously with the solution of the field equation, a feature that makes analytical and numerical solutions difficult to obtain. In this sense, the purpose of this work is to present a variational method specifically designed for the hydrodynamic impact of rigid axisymmetric bodies on the free surface. The solution of the nonlinear dynamic equation of the impacting motion depends on determining the added mass tensor, and its derivative with respect to time, at each integration time step. This is done through a variational technique that yields a second-order error approximation for the added mass when a first-order error approximation is sought for the velocity potential. The method is an example of the desingularized numerical techniques, in which the velocity potential is approximated in a finite-dimensional subspace spanned by trial functions derived from elementary potential solutions, such as poles, dipoles and vortex rings, placed inside the body. The potential impact problem, characterized by the dominance of inertial forces, is formulated by treating the liquid surface as an equipotential, which allows an analogy with the infinite-frequency limit of the usual wave radiation problem for oscillating floating bodies. The method is applied to the vertical hydrodynamic impact of axisymmetric bodies within the so-called generalized von Kármán model (GvKM), in which the exact body boundary condition is satisfied but the pile-up of water at the jet roots along the intersection with the free surface is not taken into account. Numerical results for the added mass coefficient of a family of spheroids are presented and tabulated for ready use in analysis and design. In addition, the effect of the free-surface elevation is considered for the specific case of an impacting sphere, using analytical approaches found in the specialized literature.
APA, Harvard, Vancouver, ISO, and other styles
45

Melin, Håkan. "Automatic speaker verification on site and by telephone: methods, applications and assessment." Doctoral thesis, KTH, Tal, musik och hörsel, TMH, 2006. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-4242.

Full text
Abstract:
Speaker verification is the biometric task of authenticating a claimed identity by means of analyzing a spoken sample of the claimant's voice. The present thesis deals with various topics related to automatic speaker verification (ASV) in the context of its commercial applications, characterized by co-operative users, user-friendly interfaces, and requirements for small amounts of enrollment and test data. A text-dependent system based on hidden Markov models (HMM) was developed and used to conduct experiments, including a comparison between visual and aural strategies for prompting claimants for randomized digit strings. It was found that aural prompts lead to more errors in spoken responses and that visually prompted utterances performed marginally better in ASV, given that enrollment data were visually prompted. High-resolution flooring techniques were proposed for variance estimation in the HMMs, but results showed no improvement over the standard method of using target-independent variances copied from a background model. These experiments were performed on Gandalf, a Swedish speaker verification telephone corpus with 86 client speakers. A complete on-site application (PER), a physical access control system securing a gate in a reverberant stairway, was implemented based on a combination of the HMM and a Gaussian mixture model based system. Users were authenticated by saying their proper name and a visually prompted, random sequence of digits after having enrolled by speaking ten utterances of the same type. An evaluation was conducted with 54 out of 56 clients who succeeded to enroll. Semi-dedicated impostor attempts were also collected. An equal error rate (EER) of 2.4% was found for this system based on a single attempt per session and after retraining the system on PER-specific development data. On parallel telephone data collected using a telephone version of PER, 3.5% EER was found with landline and around 5% with mobile telephones. Impostor attempts in this case were same-handset attempts. Results also indicate that the distribution of false reject and false accept rates over target speakers are well described by beta distributions. A state-of-the-art commercial system was also tested on PER data with similar performance as the baseline research system.
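As an illustrative sketch only (not from the thesis; the score distributions are invented), the equal error rate quoted above can be computed from genuine and impostor verification scores roughly as follows:

```python
import numpy as np

def equal_error_rate(genuine_scores, impostor_scores):
    """Sweep a decision threshold and return the operating point where the false reject rate
    (genuine scores below the threshold) equals the false accept rate (impostor scores above it)."""
    thresholds = np.sort(np.concatenate([genuine_scores, impostor_scores]))
    best_gap, eer = 1.0, 0.0
    for t in thresholds:
        frr = np.mean(genuine_scores < t)
        far = np.mean(impostor_scores >= t)
        if abs(frr - far) < best_gap:
            best_gap, eer = abs(frr - far), 0.5 * (frr + far)
    return eer

rng = np.random.default_rng(1)
genuine = rng.normal(2.0, 1.0, 1000)    # hypothetical client (target) scores
impostor = rng.normal(0.0, 1.0, 1000)   # hypothetical impostor scores
print(f"EER = {100 * equal_error_rate(genuine, impostor):.1f}%")
```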
APA, Harvard, Vancouver, ISO, and other styles
46

Andrade, Adson Íkaro Silva Leite de. "Estudos analíticos e em Pspice de Transferência de calor em corpos cilíndricos." Universidade Federal da Paraíba, 2016. http://tede.biblioteca.ufpb.br:8080/handle/tede/8969.

Full text
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - CAPES
This thesis presents a proposal for solving engineering problems in the area of heat transfer using the network simulation method (NSM), which solves such problems through the analogy between thermal circuits and electrical circuits. The two-dimensional heat conduction problem in a cylinder is validated by comparing the solutions obtained by simulation with the analytical solution obtained via the Generalized Integral Transform Technique (GITT). In the simulations, the body under study is discretized with the aim of establishing a correlation between the number of cells (mesh refinement) and the ratio between the radius and the length of the cylinder for which the solution of the heat transfer problem in the body can be considered one-dimensional. The comparison is made between the analytical response and the simulated one, varying the number of divisions and the different ratios between the radius and length of the cylinder. Based on the solution validated by the proposed methodology, the work presents as an application a generic network built with the NSM and implemented in PSPICE, which can be used to solve a variety of heat conduction problems in cylindrical geometry.
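As an illustrative aside (not the thesis's PSPICE netlist), the thermal-electrical analogy behind the NSM can be sketched for one-dimensional radial conduction in a cylinder: each radial shell becomes roughly a resistor R = ln(r_out/r_in)/(2*pi*k*L) and a capacitor C = rho*c_p*V, and the resulting RC network is integrated in time. All material properties and dimensions below are invented example values.

```python
import math

# Hypothetical cylinder and material (steel-like values), split into radial shells.
k, rho, cp = 50.0, 7800.0, 500.0          # W/m.K, kg/m^3, J/kg.K
R_out, L, n = 0.05, 1.0, 20               # outer radius (m), length (m), number of shells
r = [R_out * i / n for i in range(n + 1)] # shell boundary radii

# Approximate thermal resistances between neighbouring nodes and heat capacity of each shell.
R = [math.log(r[i + 1] / r[i]) / (2 * math.pi * k * L) for i in range(1, n)]
C = [rho * cp * math.pi * (r[i + 1] ** 2 - r[i] ** 2) * L for i in range(n)]

T = [20.0] * n        # initial node temperatures (deg C)
T_wall = 100.0        # imposed outer-surface temperature
R_wall = math.log(r[n] / (0.5 * (r[n] + r[n - 1]))) / (2 * math.pi * k * L)

dt = 0.05
for _ in range(20000):                                        # explicit time stepping
    q = [(T[i + 1] - T[i]) / R[i] for i in range(n - 1)]      # heat flow between nodes
    q_wall = (T_wall - T[-1]) / R_wall                        # heat flow from the outer wall
    T[0] += dt * q[0] / C[0]
    for i in range(1, n - 1):
        T[i] += dt * (q[i] - q[i - 1]) / C[i]
    T[-1] += dt * (q_wall - q[-1]) / C[-1]
print("centre and surface node temperatures after 1000 s:", round(T[0], 2), round(T[-1], 2))
```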
APA, Harvard, Vancouver, ISO, and other styles
47

Robin, Caroline. "Fully self-consistent multiparticle-multihole configuration mixing method : applications to a few light nuclei." Thesis, Paris 11, 2014. http://www.theses.fr/2014PA112193/document.

Full text
Abstract:
This thesis takes part in the development of the multiparticle-multihole configuration mixing method, which aims to describe the structure of atomic nuclei. Based on a double variational principle, this approach determines the expansion coefficients of the wave function and the single-particle states at the same time. In this work we apply the fully self-consistent formalism of the multiparticle-multihole method for the first time to the description of a few p- and sd-shell nuclei, using the D1S Gogny interaction. A first study of the 12C nucleus is performed in order to test the doubly iterative convergence procedure when different types of truncation criteria are applied to select the many-body configurations included in the wave function. A detailed analysis of the effect caused by the orbital optimization is conducted, in particular of its impact on the one-body density and on the fragmentation of the ground-state wave function. A systematic study of sd-shell nuclei is then performed. A careful analysis of the correlation content of the ground state is conducted first, and observable quantities such as binding and separation energies, as well as charge radii, are calculated and compared with experimental data; satisfactory results are found. Spectroscopic properties are also studied. Excitation energies of low-lying states are found to be in very good agreement with experiment, and the magnetic dipole features are also satisfactory. Calculations of electric quadrupole properties, and in particular of the transition probabilities B(E2), however, reveal a clear lack of collectivity in the wave function, due to the reduced valence space used to select the many-body configurations. Although the renormalization of orbitals leads to an important fragmentation of the ground-state wave function, only a small effect is observed on the B(E2) probabilities; a tentative explanation is given. Finally, the structure description provided by the multiparticle-multihole configuration mixing method is used as the basic ingredient for the study of reaction mechanisms such as inelastic electron and proton scattering on sd-shell nuclei. Although the results also suffer from the lack of collectivity, the experimental trends are well reproduced and are improved by the orbital optimization.
APA, Harvard, Vancouver, ISO, and other styles
48

Austin, Bradley J. "Perspectives of weather and sensitivities to heat: Social media applications for cultural climatology." Kent State University / OhioLINK, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=kent1401710675.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Silva, Gabriel Hattori da. "Escolha de parametros para analise de contato entre corpos elasticos usando elementos finitos e redes neurais." [s.n.], 2009. http://repositorio.unicamp.br/jspui/handle/REPOSIP/265005.

Full text
Abstract:
Advisor: Alberto Luiz Serpa
Master's dissertation - Universidade Estadual de Campinas, Faculdade de Engenharia Mecanica
The objective of this project is to study the effect of the main parameters that affect the solution of the contact problem between elastic bodies. The commercial software ANSYS 11.0 was used to run the contact analyses. The influence of the main parameters considered by ANSYS in the contact problem, such as the normal contact stiffness, the penetration limit, the contact algorithms and the solution methods, is investigated in this work. The normal contact stiffness was observed to directly affect both convergence and the results obtained. Some examples with known (analytical or numerical) results were studied for comparison with the ANSYS solution, together with examples of greater practical interest, such as the contact problem at the small end of an automotive connecting rod. From the cases analysed, some recommendations were made for the choice of the contact parameters. However, some parameters depend on the user's experience or on preliminary tests, which in many situations requires more time to obtain results. As an alternative, the potential of neural networks to overcome this limitation was investigated. The neural networks were trained with results obtained from the solution of the contact problem (penetration and contact pressure variation) for simplified models, with the normal contact stiffness as the network output, which is then used to estimate the normal contact stiffness of more complex problems. The neural network implementation of the software MATLAB 7.0 was used for training and simulating the networks.
Mestrado
Mecanica dos Solidos
Mestre em Engenharia Mecânica
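As a rough, hypothetical illustration of the idea described in the abstract above (the dissertation used MATLAB's neural network implementation; here a scikit-learn regressor and synthetic data stand in for the real simulation results), one might map (penetration, contact pressure variation) to a normal contact stiffness like this:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in for simplified-model results: stiffness roughly inversely related to
# penetration and weakly related to the contact pressure variation (purely invented relation).
penetration = rng.uniform(1e-4, 1e-2, 500)
pressure_variation = rng.uniform(0.01, 0.30, 500)
stiffness = 1.0 / penetration * (1.0 + 0.5 * pressure_variation) + rng.normal(0, 5.0, 500)

X = np.column_stack([penetration, pressure_variation])
y = stiffness

scaler = StandardScaler().fit(X)
model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
model.fit(scaler.transform(X), y)

# Estimate the normal contact stiffness for a new (hypothetical) case.
new_case = np.array([[2e-3, 0.10]])
print("estimated normal contact stiffness:", model.predict(scaler.transform(new_case))[0])
```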
APA, Harvard, Vancouver, ISO, and other styles
50

Williams, Brian T. (Brian Thomas). "Developing flexibility through alternative project delivery methods for the U.S. Army Corps of Engineers project management business process." Thesis, Massachusetts Institute of Technology, 2018. http://hdl.handle.net/1721.1/118509.

Full text
Abstract:
Thesis: S.M. in Engineering and Management, Massachusetts Institute of Technology, System Design and Management Program, 2018.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 101-109).
Inflexibility, failure to adapt technology, and overly regulatory processes frustrate construction industry productivity and reduce the likelihood that large infrastructure projects will be delivered on-schedule and on-budget. Divergence from entrenched project delivery methods can provide flexibility to project managers and offers advantages for improving quality, collaboration, costs, and timeliness. The objective of this research is to provide the U.S. Army Corps of Engineers (USACE) recommendations for their Project Management Business Process (PMBP). This study reviews the current state of project management in USACE, conducts a structured systems architecture analysis of the PMBP, evaluates USACE project statistics, assesses alternative project delivery methods through a literature review, and provides case studies to consider the implementation impediments of alternative methods for public and private projects. USACE serves as the nation's largest public engineering agency with responsibilities in military construction, civil works, water navigation, environmental restoration, and disaster response. This research concludes with recommendations for selecting alternative project delivery methods best-fit to meet the distinct needs of each USACE business program. Explicitly, the application of Integrated Project Delivery is best suited for highly specialized, technical projects for military construction and interagency support, but also presents contractual challenges notyet adapted for USACE. Public Private Partnerships show promise for possible future implementation in civil works projects, but require further refinement through the USACE Pilot Program. Lastly, Construction Management at Risk is the most mature alternative method for USACE, and can provide Project Managers with additional options in fast-tracking and early contractor involvement. Essentially, the flexibility of PMBP project delivery should match the vast diversity of USACE's missions.
by Brian T. Williams.
S.M. in Engineering and Management
APA, Harvard, Vancouver, ISO, and other styles
