
Dissertations / Theses on the topic 'VIT MODEL'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'VIT MODEL.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Rafele, Antonio. "À rebours : motifs du regard et de l'expérience des modes de vie." Paris 5, 2008. http://www.theses.fr/2008PA05H070.

Full text
Abstract:
La mode est un regard qui se produit à partir de l'expérience de certains processus historiques : le développement de la métropole et des médias du XIXéme siècle. La naissance de ce regard est donc liée au pouvoir social acquis par la métropole et par les différents médias, qui, dans l'ensemble, constituent un groupe homogène d'images historiques parce qu'ils sont unis par une accélération du temps. L'accélération du temps ôte aux choses leur linéarité et leur durée. L'attention se déplace continuellement sur leur naissance et sur leur mort. Une continuelle construction et destruction de choses est exhibée et vécue. La splendeur et la caducité constituent, en même temps, le mécanisme et le but de la vie ; par contrecoup, cette dernière devient une succession d'instants vécus ou morts. Immersion, sortie et retour posthume constituent le processus à la fois experiencielle et mnémonique qui caractérise le rapport avec les choses. Les différentes choses du monde devront être alors perçues comme des illusions ou des accoutumances qui absorbent et occupent le temps du moi, mais seulement pour une période donnée. La vie s'accomplit non pas en tant qu'expérience vitale mais comme expérience de mort. C'est précisément dans ce sens-là, que la mode, en tant que grand mécanisme ludique de création et de destruction d'accoutumances, devient l'image clé à la fois pour l'expérience et pour le regard qui se produit sur celle-ci
Fashion and lifestyle are a gaze produced by the experience of specific historical processes: the development of the metropolis and of the media in the 19th century. The birth of this gaze is connected to the social power accumulated by metropolitan cities and the various media. Their homogeneous structure of historical images is determined by the phenomenon of time acceleration, which strips things of their linearity and duration. Attention is instead focused on their birth and death. Things are under continuous construction and destruction, experienced and exhibited. Splendour and nullity constitute, at the same time, the mechanism and the goal of life, which as a by-effect becomes a simple succession of instants, dead or alive. Immersion, exit and posthumous return constitute the at once experiential and mnemonic process characterizing our relation to things. Therefore, the different things of the world can only be perceived as short-lived illusions and habituations that absorb and occupy the self's time. Life achieves itself not as vital experience but as an experience of death. It is precisely in this sense that lifestyle, defined as a great ludic mechanism of creation and destruction of habituations, becomes the key image both for experience and for the gaze directed at it.
APA, Harvard, Vancouver, ISO, and other styles
2

Penna, Christiano Modesto. "Crescimento econômico via investimentos em capital: evidências empíricas para o Brasil." Universidade Federal do Ceará, 2007. http://www.teses.ufc.br/tde_busca/arquivo.php?codArquivo=1394.

Full text
Abstract:
Neste trabalho fica evidente que existe uma relação não linear entre a taxa de formação bruta de capital fixo e a taxa de crescimento econômico na economia brasileira e, devido a essa não linearidade, o teste de predições do modelo AK e neoclássico proposto por Jones (1995) passa a ser inconclusivo, pois existe espaço para as predições de ambos os modelos. Nosso modelo econométrico indica que, teoricamente, a produtividade marginal do capital se modifica de acordo com uma taxa de crescimento indicada pelo parâmetro threshold. Essa modificação pode aparentemente ser explicada devido à modificação do coeficiente de elasticidade de substituição entre capital e trabalho, ficando aqui uma proposta de novas investigações. Ao tratarmos de políticas públicas, constata-se que, por mais que se amplie a taxa de formação bruta de capital fixo chegaremos, no máximo, ao "catch-up" do crescimento do PIB das economias de renda média baixa e do crescimento econômico dos países do leste asiático e do Pacífico. O trabalho também sugere que o montante de recursos necessário para se concluir tais "catch-ups" é da ordem de R$ 786 bilhões.
In this work it is shown that a nonlinear relation exists between the gross fixed capital formation rate and the rate of economic growth in the Brazilian economy and that, owing to this nonlinearity, the test of the predictions of the AK and neoclassical models proposed by Jones (1995) becomes inconclusive, since there is room for the predictions of both models. Our econometric model indicates that, theoretically, the marginal productivity of capital changes at a growth rate indicated by the threshold parameter. This change can apparently be explained by a change in the coefficient of the elasticity of substitution between capital and labor, which we leave as a proposal for further investigation. Regarding public policy, we find that, however much the gross fixed capital formation rate is raised, we will at most "catch up" with the GDP growth of the lower-middle-income economies and with the economic growth of the countries of East Asia and the Pacific. The work also suggests that the amount of resources needed to complete such "catch-ups" is on the order of R$ 786 billion.
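The threshold effect the abstract describes can be illustrated with a grid search for a single breakpoint in an investment-growth regression. This is only a hedged sketch of the generic technique, not the authors' econometric specification; the data, the 15% breakpoint, and all names below are invented.

```python
import numpy as np

def fit_threshold(x, y):
    """Grid-search a single breakpoint in x, fitting separate OLS lines
    below and above each candidate; return the (threshold, SSE) minimiser."""
    best_t, best_sse = None, np.inf
    for t in np.unique(x)[5:-5]:           # keep a few points in each regime
        sse = 0.0
        for mask in (x <= t, x > t):
            X = np.column_stack([np.ones(mask.sum()), x[mask]])
            beta, *_ = np.linalg.lstsq(X, y[mask], rcond=None)
            resid = y[mask] - X @ beta
            sse += resid @ resid
        if sse < best_sse:
            best_t, best_sse = t, sse
    return best_t, best_sse

# Synthetic 'investment rate vs growth' data with a regime change at 15%.
rng = np.random.default_rng(0)
x = rng.uniform(0, 30, 200)
y = np.where(x <= 15, 0.20 * x, 3.0 + 0.05 * x) + rng.normal(0, 0.1, 200)
t_hat, _ = fit_threshold(x, y)
```

Because the marginal effect of x on y differs on each side of the breakpoint, the minimum-SSE split recovers a threshold close to the true 15%.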
APA, Harvard, Vancouver, ISO, and other styles
3

Pan, Juming. "Adaptive LASSO For Mixed Model Selection via Profile Log-Likelihood." Bowling Green State University / OhioLINK, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=bgsu1466633921.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Metzger, Thomas Anthony. "Detection of Latent Heteroscedasticity and Group-Based Regression Effects in Linear Models via Bayesian Model Selection." Diss., Virginia Tech, 2019. http://hdl.handle.net/10919/93226.

Full text
Abstract:
Standard linear modeling approaches make potentially simplistic assumptions regarding the structure of categorical effects that may obfuscate more complex relationships governing data. For example, recent work focused on the two-way unreplicated layout has shown that hidden groupings among the levels of one categorical predictor frequently interact with the ungrouped factor. We extend the notion of a "latent grouping factor" to linear models in general. The proposed work allows researchers to determine whether an apparent grouping of the levels of a categorical predictor reveals a plausible hidden structure given the observed data. Specifically, we offer Bayesian model-selection-based approaches to reveal latent group-based heteroscedasticity, regression effects, and/or interactions. Failure to account for such structures can produce misleading conclusions. Since the presence of latent group structures is frequently unknown a priori to the researcher, we use fractional Bayes factor methods and mixture g-priors to overcome the lack of prior information. We provide an R package, slgf, that implements our methodology, and we demonstrate its usage in practice.
Doctor of Philosophy
Statistical models are a powerful tool for describing a broad range of phenomena in our world. However, many common statistical models may make assumptions that are overly simplistic and fail to account for key trends and patterns in data. Specifically, we search for hidden structures formed by partitioning a dataset into two groups. These two groups may have distinct variability, statistical effects, or other hidden effects that are missed by conventional approaches. We illustrate the ability of our method to detect these patterns through a variety of disciplines and data layouts, and provide software for researchers to implement this approach in practice.
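The kind of question the slgf package answers, whether a two-group split of the residual variance fits better than a single shared variance, can be caricatured with a BIC comparison. This is a deliberately crude stand-in for the fractional-Bayes-factor and mixture g-prior machinery of the thesis; every name and number below is illustrative.

```python
import numpy as np

def bic_normal(resid, k=1):
    """BIC of a zero-mean normal model with MLE variance and k parameters."""
    n = resid.size
    sigma2 = np.mean(resid ** 2)
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    return -2 * loglik + k * np.log(n)

def prefers_two_variances(resid, groups):
    """True if 'one variance per group' beats 'one shared variance' on BIC."""
    pooled = bic_normal(resid)
    split = sum(bic_normal(resid[groups == g]) for g in np.unique(groups))
    return split < pooled

# Residuals whose spread clearly differs between two latent groups.
rng = np.random.default_rng(1)
groups = np.repeat([0, 1], 100)
resid = np.concatenate([rng.normal(0, 1, 100), rng.normal(0, 3, 100)])
```

With group standard deviations of 1 and 3, the per-group model wins by a wide BIC margin, mirroring the detection of latent group-based heteroscedasticity.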
APA, Harvard, Vancouver, ISO, and other styles
5

Correia, Fagner Cintra [UNESP]. "The standard model effective field theory: integrating UV models via functional methods." Universidade Estadual Paulista (UNESP), 2017. http://hdl.handle.net/11449/151703.

Full text
Abstract:
Made available in DSpace on 2017-09-27T19:37:47Z (GMT). No. of bitstreams: 1 correia_fc_dr_ift.pdf: 861574 bytes, checksum: 1829fcb0903e20303312d37d7c1e0ffc (MD5) Previous issue date: 2017-07-27
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
O Modelo Padrão Efetivo é apresentado como um método consistente de parametrizar Física Nova. Os conceitos de Matching e Power Counting são tratados, assim como a Expansão em Derivadas Covariantes introduzida como alternativa à construção do conjunto de operadores efetivos resultante de um modelo UV particular. A técnica de integração funcional é aplicada em casos que incluem o MP com Tripleto de Escalares e diferentes setores do modelo 3-3-1 na presença de Leptons pesados. Finalmente, o coeficiente de Wilson de dimensão-6 gerado a partir da integração de um quark-J pesado é limitado pelos valores recentes do parâmetro obliquo Y.
We present the principles behind the use of the Standard Model Effective Field Theory as a consistent method to parametrize New Physics. The concepts of Matching and Power Counting are covered, and a Covariant Derivative Expansion is introduced as an alternative for constructing the set of effective operators resulting from a particular integrated UV model. The technique is applied in examples including the SM with a new Scalar Triplet and different sectors of the 3-3-1 model in the presence of Heavy Leptons. Finally, the Wilson coefficient of a dimension-6 operator generated by integrating out a heavy J-quark is constrained by recent measurements of the oblique Y parameter.
CNPq: 142492/2013-2
CAPES: 88881.132498/2016-01
APA, Harvard, Vancouver, ISO, and other styles
6

Correia, Fagner Cintra. "The standard model effective field theory : integrating UV models via functional methods /." São Paulo, 2017. http://hdl.handle.net/11449/151703.

Full text
Abstract:
Orientador: Vicente Pleitez
Resumo: The Standard Model Effective Field Theory is presented as a consistent method of parametrizing New Physics. The concepts of Matching and Power Counting are treated, and the Covariant Derivative Expansion is introduced as an alternative for constructing the set of effective operators resulting from a particular UV model. The functional integration technique is applied in cases that include the SM with a Scalar Triplet and different sectors of the 3-3-1 model in the presence of heavy Leptons. Finally, the dimension-6 Wilson coefficient generated by integrating out a heavy J-quark is constrained by recent values of the oblique Y parameter.
Doutor
APA, Harvard, Vancouver, ISO, and other styles
7

Rashid, Zhiar, and Hawar Rustum. "Kvalitetssäkring vid leverans av IFC-modell : Vid export från Tekla Structures och Revit." Thesis, Högskolan i Borås, Akademin för textil, teknik och ekonomi, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:hb:diva-22306.

Full text
Abstract:
Användning av olika BIM-verktyg blir allt mer frekvent inom byggsektorn. Informationsutbyte av 3D-modeller mellan projektets olika aktörer sker kontinuerligt under projekteringsskedet. Eftersom olika aktörer använder sig av olika programvaror behövs det ett öppet filformat för leverans av 3D-modeller. IFC är ett neutralt filformat som möjliggör informationsutbyte mellan olika discipliner och samordning av BIM-modeller. Detta examensarbete är en förundersökning av kvalitativ och kvantitativ sort, med syftet att ta fram en intern arbetsmetod för kvalitetssäkring av IFC-modeller vid leverans. I de kvalitativa momenten ingår intervjuer. Intervjuer görs för att få en förståelse om vad de anställda på WSP Byggprojektering Göteborg har för uppfattning om olika program och hantering av IFC. Det har gett författarna en djupare inblick och förståelse av fördelar och utmaningar som svarande stöter på, som lyfts under avsnittet resultat. Ett praktiskt delmoment ingår i examensarbetet. Under de praktiska momenten modelleras det i Tekla Structures och Revit för att sedan analysera modellerna med hjälp av granskningsprogrammen Solibri Model Checker och Navisworks Manage. Två granskningsmetoder skapas i den här avhandlingen, en visuell granskning och en Excelrapport. Den visuella granskningsmetoden innebär att visuellt granska IFC-modellen i samma granskningsprogram som samordnaren. Den andra metoden är en prototyp som jämför olika rapporter från respektive programvaror. Slutsatsen i den här rapporten är enligt följande punkter: • Mänskliga faktorn vid modellering och användning av korrekta verktyg vid export till IFC. • IFC-formatet och de olika programmens förmåga att tolka IFC-informationen kan i vissa fall brista för icke standardiserade geometrier. • Granskningsprogram kan tolka IFC-modellen olika.
The use of different BIM tools is becoming increasingly frequent in the construction sector. Information exchange of 3D models between the various actors in a project takes place continuously during the design phase. Because different actors use different software, an open file format is needed for the delivery of 3D models. IFC is a neutral file format that enables information exchange between different disciplines and coordination of BIM models. This thesis is a preliminary study of a qualitative and quantitative kind, with the aim of developing an internal working method for quality assurance of IFC model delivery. The qualitative part includes interviews. Interviews were conducted to gain an understanding of what the employees at WSP Byggprojektering Göteborg think about different programs and about the management of IFC. This has given the authors a deeper insight into the benefits and challenges that respondents encounter, which are highlighted in the results section. A practical part is included in the degree project. In the practical steps, models are built in Tekla Structures and Revit and then analyzed with the help of the review programs Solibri Model Checker and Navisworks Manage. Two review methods are created in this thesis: a visual review and an Excel report. The visual inspection method involves visually reviewing the IFC model in the same review program as the coordinator. The second method is a prototype that compares different reports from the respective software. The conclusions of this report are as follows: • The human factor in modeling, and the use of correct tools when exporting to IFC. • The IFC format, and the ability of the various programs to interpret the IFC information, may in some cases fail for non-standard geometries. • Review programs can interpret the IFC model differently.
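The prototype that "compares different reports" could, in spirit, reduce to diffing per-class element counts extracted from the two review programs' exports. A hypothetical sketch: the report format, IFC class names, and counts below are invented, and this is not the thesis's actual tool.

```python
def compare_reports(report_a, report_b):
    """Diff per-IFC-class element counts from two review-program exports.

    Returns {ifc_class: (count_in_a, count_in_b)} for every mismatch, the
    kind of discrepancy an Excel-report comparison is meant to surface.
    """
    issues = {}
    for ifc_class in sorted(set(report_a) | set(report_b)):
        a = report_a.get(ifc_class, 0)
        b = report_b.get(ifc_class, 0)
        if a != b:
            issues[ifc_class] = (a, b)
    return issues

# Invented counts standing in for Solibri and Navisworks exports.
diff = compare_reports(
    {"IfcWall": 40, "IfcBeam": 12},
    {"IfcWall": 40, "IfcBeam": 10, "IfcDoor": 3},
)
```

A mismatch per IFC class is exactly the symptom of the conclusion that "review programs can interpret the IFC model differently."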
APA, Harvard, Vancouver, ISO, and other styles
8

Junior, Valdivino Vargas. "Modelagem de epidemias via sistemas de partículas interagentes." Universidade de São Paulo, 2010. http://www.teses.usp.br/teses/disponiveis/45/45133/tde-27052013-085717/.

Full text
Abstract:
Estudamos um sistema de partículas a tempo discreto cuja dinâmica é a seguinte. Considere que no instante inicial sobre cada inteiro não negativo há uma partícula, inicialmente inativa. A partícula da origem é ativada e instantaneamente ativa um conjunto aleatório contíguo de partículas que estão a sua direita. Como regra, no instante seguinte ao que foi ativada, cada partícula ativa realiza esta mesma dinâmica de modo independente de todo o resto. Dizemos que o processo sobrevive se em qualquer momento sempre há ao menos uma partícula ativa. Chamamos este processo de Firework, associando a dinâmica de ativação de uma partícula inativa a uma infecção ou explosão. Nosso interesse é estabelecer se o processo tem probabilidade positiva de sobrevivência e apresentar limites para esta probabilidade. Isto deve ser feito em função da distribuição da variável aleatória que define o raio de ação de uma partícula. Associando o processo de ativação a uma infecção, podemos pensar este modelo como um modelo epidêmico. Consideramos também algumas variações dessa dinâmica. Dentre elas, variantes com partículas distribuídas sobre a semirreta dos reais positivos (nesta vertente, existem condições para as distâncias entre partículas consecutivas) e também com as partículas distribuídas sobre vértices de árvores. Estudamos também para esses casos a transição de fase e probabilidade de sobrevivência. Nesta variante os resultados obtidos são funções da sequência de distribuições dos alcances das explosões e da estrutura dos lugares onde se localizam as partículas. Consideramos também variações do modelo onde cada partícula ao ser ativada, permanece ativa durante um tempo aleatório e nesse período emite explosões que ocorrem em instantes aleatórios.
We study a discrete-time particle system whose dynamics are as follows. At time zero there is a particle, initially inactive, on each non-negative integer. The particle at the origin is activated and instantly activates a contiguous random set of particles to its right. As a rule, at the instant after it has been activated, each active particle repeats this same dynamic independently of everything else. We say that the process survives if the number of particles activated along the process is infinite. We call this the Firework process, associating the activation of an inactive particle with an infection or explosion. Our interest is to establish whether the process has positive probability of survival and to present bounds for this probability, as functions of the distribution of the random variable that defines the radius of infection of each active particle. Associating the activation process with an infection, we can think of this model as an epidemic model. We also consider some variations of this dynamic, among them variants with particles distributed over the positive half-line (with conditions on the distances between consecutive particles) and with particles distributed over the vertices of a tree. For these cases we also study the phase transition and the corresponding survival probability; here the results depend on the sequence of probability distributions for the range of the explosions and on the placement of the particles. Finally, we consider a variation in which each particle, once activated, remains active for a random time, emitting explosions at random moments during that period.
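The survival question the abstract studies can be probed with a small Monte Carlo simulation. A sketch under stated assumptions: the radius laws, horizon, and run counts below are arbitrary choices, and a finite horizon only proxies "surviving forever."

```python
import random

random.seed(7)

def firework_survives(radius, horizon=2000):
    """One run of the Firework process on 0, 1, 2, ...

    Particle i, once activated, activates every particle in (i, i + R_i],
    with R_i drawn from `radius`. 'Survival' here means activation reached
    `horizon`, a finite proxy for surviving forever.
    """
    reach = radius()          # particles activated by the origin
    i = 1
    while i <= reach:
        if i >= horizon:
            return True
        reach = max(reach, i + radius())
        i += 1
    return False

def estimate_survival(radius, runs=300):
    return sum(firework_survives(radius) for _ in range(runs)) / runs

# Bounded radius with P(R = 0) > 0: the process dies out almost surely,
# because a long enough run of short radii eventually strands the frontier.
p_bounded = estimate_survival(lambda: random.choice([0, 2]))

# Heavy tail P(R >= k) = (k + 1)**-0.5: survival has positive probability.
def heavy_radius():
    u = 1.0 - random.random()              # uniform on (0, 1]
    return int(1.0 / (u * u)) - 1

p_heavy = estimate_survival(heavy_radius)
```

The contrast between the two estimates illustrates the phase transition: bounded radii are subcritical, while a sufficiently heavy tail yields a positive survival probability.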
APA, Harvard, Vancouver, ISO, and other styles
9

Kuoljok, Simon Pirak. "Konnektivitetsåtgärder i Emån : En fallstudie vid Högsby vattenkraftverk." Thesis, KTH, Hållbar utveckling, miljövetenskap och teknik, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-298253.

Full text
Abstract:
Den nationella planen för vattenverksamheter antogs år 2020 och innebär att vattenverksamheter ska förses med moderna miljövillkor för att öka vattenmiljönytta och att bibehålla den effektiva tillgången till vattenkraftsel. Emån som är ett Natura 2000-område är det största vattendraget i sydöstra Sverige och vattendraget är utifrån konnektiviteten klassificerat som måttlig. Högsby vattenkraftverk, som ligger i Emån och ägs av Uniper, ska omprövas utifrån miljövillkor enligt den nationella planen och idag har Högsby vattenkraftverk ingen fastställd minimitappning för sin anläggning. Målet med studien är att undersöka storleken på vattenflödet av minimitappningen som kan användas i den ursprungliga naturfåran i Högsby för att möjliggöra longitudinella spridningsmöjligheter i uppströms och nedströms riktning för de fiskarter som historiskt sett har kunnat vandra genom Högsby. En sektion i naturfåran är uppdämd av en tröskeldamm som bedöms vara den svåraste passagen i naturfåran med undantag från utskovsdammen. Metoden som används i rapporten är en endimensionell stationär modellering i HEC-RAS över den naturfåra som kan möjliggöra fiskvandring förbi vattenkraftverk. Två modeller har skapats – med och utan tröskeldamm. Utifrån resultatet är den högsta medelvattenhastigheten i sektionen nedanför tröskeldammen i båda scenarion, detsamma gäller medelvattendjupet.  De konnektivitetshöjande åtgärder som diskuteras är minimitappning och olika fiskvägar i naturfåran för att behålla den historiska forsmiljön som funnits. Både rivning av tröskeldamm och fiskväg runt dammen har diskuterats. Därutöver anses havsöring och lax vara fiskarter som med stor säkerhet har kunnat passera Högsby historiskt sett. Högsby är klass 1 i relativt reglerbidrag, vilket bör tas i beaktning vid bedömning av minimitappning. Det finns osäkerheter med modellen, såsom batymetri och randvillkor samt validering av modellen. 
Därför går det inte med säkerhet att säga att vattendjup eller vattenhastigheter stämmer överens med verkligheten. Slutsatsen är att det inte heller med säkerhet går att säga vilka fiskarter som historiskt sett har vandrat genom Högsby, inte heller en minimitappning kan fastställas som kan möjliggöra fiskvandring för alla fiskarter. Däremot kommer implementeringen av en fastställd minimitappning och fiskvägar öka konnektiviteten.
In 2020 the national plan for hydropower plants was implemented in Sweden. The national plan aims to raise the environmental standards for hydropower plants for the benefit of the water body while still maintaining efficient access to electricity from hydropower. Emån is the largest river in the south-eastern part of Sweden. The river is a Natura 2000 area, and its connectivity is classified as moderate according to the environmental norms for rivers and water bodies. The Högsby hydropower plant, located along Emån, is owned by Uniper and is included in the national plan, which means that it needs to meet the environmental standard. It currently has no established minimum discharge. The aim of the study is to investigate a minimum discharge for the natural, dry stream channel at Högsby that would enable higher connectivity for the fish species that have historically been able to migrate through Högsby. The natural channel lies downstream of the spillway dam in Högsby. Within the channel, a weir is considered the most difficult passage for fish migration, with the exception of the spillway dam. The method used in the report is a one-dimensional steady-flow model in HEC-RAS of the natural channel, used to investigate the possibility of fish migration past the hydropower plant. Two scenarios have been considered, with and without the weir present. Based on the model results, the highest mean water velocity occurs in the section below the weir in both scenarios, and the same applies to the mean water depth. The measures for increasing connectivity that are discussed are the implementation of a minimum discharge and various fishways in the natural channel, which would also preserve the rapids that have existed in the stream. The fishways discussed depend on whether the weir is kept or removed. The fish species that have, with great certainty, historically been able to migrate through Högsby are sea trout and salmon.
The Högsby hydropower plant has the highest regulation classification of the hydropower plants in Emån, which needs to be considered when determining a minimum discharge. There are a few uncertainties in the hydraulic model, such as the bathymetry and the boundary conditions. The model is not validated, which means that the water depths and velocities might not reflect the true values. It is not certain which fish species have migrated through Högsby historically, nor can an established minimum discharge be guaranteed to enable fish migration for all species. However, implementing a minimum discharge together with the fishways discussed will increase the connectivity at Högsby.
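HEC-RAS is a full hydraulic package, but the steady-flow core of such a study rests on relations like Manning's formula. The sketch below is a hedged illustration only, for a rectangular channel with invented dimensions, not the Emån geometry.

```python
import math

def manning_discharge(depth, width, slope, n):
    """Discharge (m³/s) of a rectangular channel via Manning's formula:
    Q = (1/n) · A · R^(2/3) · √S, with hydraulic radius R = A / P."""
    area = width * depth
    wetted_perimeter = width + 2 * depth
    radius = area / wetted_perimeter
    return area * radius ** (2 / 3) * math.sqrt(slope) / n

def normal_depth(q_target, width, slope, n, tol=1e-6):
    """Bisect for the depth at which the channel carries q_target;
    Q grows monotonically with depth, so bisection is safe."""
    lo, hi = 1e-6, 100.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if manning_discharge(mid, width, slope, n) < q_target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

q = manning_discharge(1.2, width=20.0, slope=0.001, n=0.035)
d = normal_depth(q, width=20.0, slope=0.001, n=0.035)
```

Given a candidate minimum discharge, a solver of this kind returns the corresponding flow depth, the quantity the thesis compares against fish-passage requirements.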
APA, Harvard, Vancouver, ISO, and other styles
10

Voigt, Rianne Ruth. "Model vir ervaringsleer in voedseldiensbestuur." Thesis, Cape Technikon, 1999. http://hdl.handle.net/20.500.11838/1887.

Full text
Abstract:
Thesis (MTech(Teacher Education))--Cape Technikon, Cape Town, 1999
The food service management industry is aware of the importance of an effective experiential learning programme. The technikons try to meet the demands of the food service management industry through the involvement of mentors and students. This study undertakes to investigate the deficiencies and problems concerning the experiential learning components of the food service management course, as this clearly forms an integral part of the food service management diploma. Although food service managers are aware of the experiential learning programme, no formal information, such as an employers'/mentors' manual, orientation on the procedures of the technikons or industry, comprehensive evaluation procedures, coordinators' visits and debriefing procedures, is explained or sent to the industry or the student. The researcher's involvement with experiential learning, and specifically the administration of the experiential learning programme, led to the hypothesis that a formally structured experiential learning programme is needed to clear up the confusion and uncertainty regarding the involvement of the student and the employer/mentor. As a starting point, a situation analysis was done to compare South African perspectives on experiential learning. This information was gathered to place the food service management experiential learning programme in perspective for technikons, students and the food service management industry. The second section is a comparative study of the international practice of experiential learning as currently conducted in the United States of America (USA), the United Kingdom (UK) and South Africa (SA). What follows is a comparative study testing the literature against reality as experienced in tertiary institutions in the USA, UK and SA; details are expounded in chapter 4 of this study. Findings are reported, and recommendations regarding the application of an efficient, productive, structured experiential learning programme for the course Food Service Management are noted in chapter 5. In the last section an explanation is given of the following proposed manuals: • Tutorial Manual • Student Manual • Employer/Mentor Manual • Coordinator Manual during student visits • Debriefing Manual. These proposed manuals should be used to satisfy the needs of all involved: employers, technikons and students.
APA, Harvard, Vancouver, ISO, and other styles
11

Liu, Shan. "Model checking in Tobit regression model via nonparametric smoothing." Kansas State University, 2012. http://hdl.handle.net/2097/13790.

Full text
Abstract:
Master of Science
Department of Statistics
Weixing Song
A nonparametric lack-of-fit test is proposed to check the adequacy of the presumed parametric form of the regression function in Tobit regression models by applying Zheng's device with weighted residuals. It is shown that testing the null hypothesis for the standard Tobit regression models is equivalent to testing a new null hypothesis for the classic regression models. An optimal weight function is identified to maximize the local power of the test. The proposed test statistic is shown to be asymptotically normal under the null hypothesis, consistent against some fixed alternatives, and to have nontrivial power against some local nonparametric alternatives. The finite sample performance of the proposed test is assessed by Monte Carlo simulations. An empirical study is conducted based on data from the University of Michigan Panel Study of Income Dynamics for the year 1975.
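Zheng's device, which the thesis adapts to the Tobit setting, pairs kernel smoothing with products of residuals. The sketch below implements only the classic uncensored version with a scalar covariate and no optimal weighting; all data and bandwidth choices are invented.

```python
import numpy as np

def zheng_statistic(x, resid, h):
    """Kernel-based conditional-moment statistic
    V = sum_{i != j} K_h(x_i - x_j) e_i e_j / (n (n - 1)),
    standardised by its estimated null variance (asymptotically N(0, 1))."""
    n = x.size
    K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2) / (h * np.sqrt(2 * np.pi))
    np.fill_diagonal(K, 0.0)
    Kee = K * np.outer(resid, resid)
    v = Kee.sum() / (n * (n - 1))
    var = 2.0 * np.sum(Kee ** 2) / (n * (n - 1)) ** 2
    return v / np.sqrt(var)

def ols_residuals(x, y):
    X = np.column_stack([np.ones(x.size), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

rng = np.random.default_rng(3)
x = rng.uniform(-2, 2, 200)
y_null = 1 + 2 * x + rng.normal(0, 0.5, 200)          # linear model is correct
y_alt = 1 + 2 * x + x ** 2 + rng.normal(0, 0.5, 200)  # quadratic lack of fit
t_null = zheng_statistic(x, ols_residuals(x, y_null), h=0.3)
t_alt = zheng_statistic(x, ols_residuals(x, y_alt), h=0.3)
```

Under the correct model the statistic behaves like a standard normal; under the misspecified fit, nearby residuals share the sign of the omitted term and the statistic blows up.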
APA, Harvard, Vancouver, ISO, and other styles
12

GOMES, Adriano José Oliveira. "Systematic model-based safety assessment via probabilistic model checking." Universidade Federal de Pernambuco, 2010. https://repositorio.ufpe.br/handle/123456789/2651.

Full text
Abstract:
Made available in DSpace on 2014-06-12T15:59:55Z (GMT). No. of bitstreams: 2 arquivo5803_1.pdf: 2496332 bytes, checksum: b4666e127bf620dbcb7437f9d83c2344 (MD5) license.txt: 1748 bytes, checksum: 8a4605be74aa9ea9d79846c1fba20a33 (MD5) Previous issue date: 2010
Faculdade de Amparo à Ciência e Tecnologia do Estado de Pernambuco
Safety assessment is a well-known process used to guarantee that the safety constraints of a critical system are met. Within it, quantitative safety analysis deals with these constraints in a numerical (probabilistic) context. Safety analysis methods, such as the traditional Fault Tree Analysis (FTA), are used in the quantitative safety assessment process, following certification guidelines (for example, ARP4761, the Aviation Recommended Practices guide). However, this method is usually costly and requires much time and effort to validate a system as a whole, since on average as many as 10,000 fault trees are built for an aircraft, and since it depends heavily on human skill to cope with the time constraints that restrict the scope and level of detail the analysis and its results can reach. The certification authorities also allow the use of Markov analysis and, although Markov models are more powerful than fault trees, industry rarely adopts them because they are more complex and harder to handle. As a result, FTA has been widely used in this process, mainly because it is conceptually simpler and easier to understand. As the complexity and time-to-market of systems increase, the interest in addressing safety issues during the early design phases, rather than in the intermediate or final phases, has made the adoption of model-based designs, tools and techniques commonplace. Simulink is the standard example currently used in the aeronautics industry. Even in this scenario, however, current solutions follow what engineers already did before.
Formal methods, on the other hand, i.e. languages, tools and methods based on logic and discrete mathematics that do not follow traditional engineering approaches, can provide innovative, low-cost solutions for engineers. This dissertation defines a strategy for quantitative safety assessment based on Markov analysis. Instead of dealing with Markov models directly, however, we use the formal language Prism (a Prism specification is semantically interpreted as a Markov model). Moreover, this Prism specification is systematically extracted from a high-level model (Simulink diagrams annotated with the system's fault logic) by applying translation rules. The quantitative verification of the system's safety requirements is carried out with the Prism model checker, in which the safety requirements become probabilistic temporal-logic formulas. The immediate goal of our work is to avoid the effort of creating several fault trees until a safety requirement is found to be violated. Prism does not build fault trees to reach this result; it simply checks at once whether a safety requirement is satisfied or not in the whole model. Finally, our strategy is illustrated with a simple but representative system (a pilot project) designed by Embraer.
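What a probabilistic model checker computes for a reachability query like P=? [ F "failed" ] on a discrete-time Markov chain reduces to solving a linear system for the transient states. A toy illustration: the four-state chain below is invented and has nothing to do with the Embraer pilot project.

```python
import numpy as np

# Toy DTMC: 0 = nominal, 1 = degraded, 2 = failed (absorbing), 3 = shut down (absorbing)
P = np.array([
    [0.90, 0.08, 0.01, 0.01],
    [0.00, 0.70, 0.10, 0.20],
    [0.00, 0.00, 1.00, 0.00],
    [0.00, 0.00, 0.00, 1.00],
])

def reach_probability(P, target, absorbing):
    """P(eventually reach `target`) from every state.

    On the transient states this solves x = P_tt x + P_t,target — the same
    linear system a model checker solves for an 'eventually target' query.
    """
    n = P.shape[0]
    transient = [s for s in range(n) if s not in absorbing]
    A = np.eye(len(transient)) - P[np.ix_(transient, transient)]
    b = P[np.ix_(transient, [target])].sum(axis=1)
    x = np.zeros(n)
    x[target] = 1.0
    x[transient] = np.linalg.solve(A, b)
    return x

x = reach_probability(P, target=2, absorbing={2, 3})
# From 'degraded', failure wins over shutdown with probability 0.10 / 0.30.
```

A single solve answers the safety question for the whole chain at once, which is the point the dissertation makes against building one fault tree per requirement.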
APA, Harvard, Vancouver, ISO, and other styles
13

Almeida, Priscila Todero de 1988. "Estudo do tunelamento da magnetização em magnetos moleculares de Mn 12 via q-histerons." [s.n.], 2013. http://repositorio.unicamp.br/jspui/handle/REPOSIP/278165.

Full text
Abstract:
Orientador: Kleber Roberto Pirota
Dissertação (mestrado) - Universidade Estadual de Campinas, Instituto de Física Gleb Wataghin
Made available in DSpace on 2018-08-23T16:30:39Z (GMT). No. of bitstreams: 1 Almeida_PriscilaToderode_M.pdf: 4163268 bytes, checksum: 28aa8c398a21212739c3588c9ce78593 (MD5) Previous issue date: 2013
Resumo: O principal objetivo desse trabalho consiste em uma nova abordagem para o tunelamento da magnetização do magneto molecular Mn12, embasado em uma ampliação do modelo de Preisach. Introduziremos novos operadores que levam em conta a possibilidade de efeitos quânticos. Implementamos esse novo modelo num programa de simulação que é capaz de simular curvas de histerese e curvas de relaxação magnética sem recorrer à resolução de hamiltonianas de spin. Além disso, este programa utiliza simulação estocástica, apresentando os resultados em poucos minutos. Os resultados obtidos concordam com os experimentos realizados de histerese e relaxação magnética. Apesar de ser um modelo de simulação simples, reproduz adequadamente a fenomenologia, pois introduz os dois ingredientes essenciais de um sistema com inversão da magnetização por efeito túnel termicamente ativado: a ativação térmica, descrita pela ocupação de níveis segundo a distribuição de Boltzmann, e a possibilidade do efeito túnel, descrita pelo modelo de Landau-Zener. A consistência física do modelo é estudada através da variação sistemática dos parâmetros do modelo
Abstract: The main objective of this work is a new approach to the tunneling of the magnetization of the molecular magnet Mn12, based on an extension of the Preisach model. We introduce new operators that take into account the possibility of quantum effects. We have implemented this new model in simulation software capable of simulating hysteresis curves and magnetic relaxation curves without resorting to the solution of spin Hamiltonians. The program uses stochastic simulation, producing results in only a few minutes. The results obtained agree with the hysteresis and relaxation experiments. Despite being a simple simulation model, it adequately reproduces the phenomenology, because it introduces the two key ingredients of a system whose magnetization reverses by thermally activated tunneling: thermal activation, described by the occupation of levels according to the Boltzmann distribution, and the possibility of tunneling, described by the Landau-Zener model. The physical consistency of the model is studied by systematically varying the model's parameters
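The two ingredients named in the abstract can be sketched directly. A minimal illustration follows, with all physical values (tunnel splitting, sweep rate, level energies) invented for the example rather than taken from Mn12:

```python
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
K_B = 1.380649e-23       # Boltzmann constant, J/K

def landau_zener_prob(gap, sweep_rate):
    """Landau-Zener transition probability at an avoided level crossing,
    in one common convention: `gap` is the tunnel splitting (J) and
    `sweep_rate` the rate of change of the level separation (J/s)."""
    return 1.0 - math.exp(-2.0 * math.pi * gap**2 / (HBAR * sweep_rate))

def boltzmann_occupation(energies, temperature):
    """Normalized thermal occupation of discrete levels (energies in J)."""
    weights = [math.exp(-e / (K_B * temperature)) for e in energies]
    z = sum(weights)
    return [w / z for w in weights]

# Slow sweeps make the transition nearly certain; fast sweeps suppress it.
p_slow = landau_zener_prob(gap=1e-24, sweep_rate=1e-16)
p_fast = landau_zener_prob(gap=1e-24, sweep_rate=1e-8)
occ = boltzmann_occupation([0.0, 1e-21], temperature=300.0)
```

A thermally activated tunneling step can then be simulated by drawing a level from `boltzmann_occupation` and flipping the magnetization with the Landau-Zener probability at that level's crossing.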
Mestrado
Física
Mestra em Física
APA, Harvard, Vancouver, ISO, and other styles
14

Victoria, Daniel de Castro. "Simulação hidrológica de bacias amazônicas utilizando o modelo de Capacidade de Infiltração Variável (VIC)." Universidade de São Paulo, 2010. http://www.teses.usp.br/teses/disponiveis/64/64135/tde-25032010-095240/.

Full text
Abstract:
Com 6 milhões de km2, a bacia Amazônica é o maior sistema hidrográfico do mundo, com descarga estimada de 209.000 m3 s-1, e a maior extensão contínua de floresta tropical. Porém, esta região é alvo de constantes ameaças, seja das pressões por desmatamento, ou por alterações climáticas. Neste contexto, compreender o funcionamento do sistema é essencial, seja para auxiliar na tomada de decisões ou estudos de cenários futuros. Este trabalho teve como objetivo avaliar e adaptar o modelo hidrológico de grandes bacias Variable Infiltration Capacity Model (VIC v.4.0.5), para as condições tropicais. Foram utilizados dados de descarga, precipitação, temperatura e velocidade do vento, e informações sobre tipo de solo e cobertura vegetal, para simular o ciclo hidrológico em 6 grandes bacias situadas na Amazônia: Santo Antônio do Içá, Japurá, Juruá, Negro, Madeira e Purus. O modelo foi calibrado a partir das descargas mensais, de 1980 a 1990, e seu funcionamento foi verificado para o período de 1990 a 2006. Não foi possível simular o ciclo hidrológico para as bacias com grande contribuição dos Andes, Santo A. Içá e Japurá, uma vez que a estimativa de precipitação nestas regiões é subestimada. Nas outras bacias, o modelo foi capaz de simular corretamente as vazões dos rios, apesar de apresentar problemas na estimativa da evapotranspiração (ET). Foram constatados problemas na partição da ET em seus componentes, transpiração da vegetação e evaporação da água interceptada. Uma possível correção foi avaliada, resultando em uma distribuição mais correta da ET em seus componentes porém, tal modificação resultou em redução da ET média simulada. Uma nova versão do modelo (v.4.1) acaba de ser lançada. Dentre as melhorias, destacam-se modificações na maneira como a ET é calculada, que visa corrigir os problemas aqui relatados. No entanto, tal versão ainda não foi avaliada nas condições tropicais
The Amazon river basin is the largest fluvial system in the world, discharging 209,000 m3 s-1 to the ocean, and it sustains the largest continuous tropical forest. However, the region is under constant pressure from deforestation and climate change. For such reasons, it is crucial to understand how the hydrological cycle functions; such tools can be used to evaluate future scenarios and guide decision making. The Variable Infiltration Capacity model (VIC) was evaluated and adapted to tropical conditions. Temperature, precipitation, wind speed, soil type and land cover maps were used to simulate the hydrological cycle in six sub-basins inside the Amazon: Santo Antônio do Içá, Japurá, Juruá, Negro, Madeira and Purus, covering the period from 1980 to 2006. Simulation was not possible for the basins with large drainage areas located in the Andes (Santo A. Içá and Japurá), due to underestimation of the precipitation there. For the other basins, simulated discharge agreed with observed records, even though the evapotranspiration (ET) estimates showed some problems. The partitioning of ET into its components, transpiration and canopy evaporation, showed severe discrepancies. A correction was applied to the model, fixing the partitioning problem, but it reduced the estimated mean ET. A new version of the model (v.4.1) has just been released, with changes in the way ET is estimated; however, this new version has not yet been tested in the Amazon
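The model's namesake is its variable-infiltration-capacity curve, which distributes point infiltration capacity within a grid cell. A minimal sketch follows; the parameter values are invented for illustration, while the curve itself is the published VIC formulation:

```python
def saturated_area_fraction(i, i_max, b):
    """Fraction A of a grid cell whose point infiltration capacity lies
    below i, obtained by inverting the VIC curve
    i = i_max * (1 - (1 - A)**(1/b)),
    where i_max is the maximum capacity and b a shape parameter."""
    return 1.0 - (1.0 - i / i_max) ** b

# Illustrative values: half of the maximum capacity, shape parameter 0.3.
frac_half = saturated_area_fraction(25.0, i_max=50.0, b=0.3)
```

Saturation-excess runoff in VIC is generated from this saturated fraction, which grows from 0 under dry conditions to 1 when the whole cell is at capacity.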
APA, Harvard, Vancouver, ISO, and other styles
15

Gustavsson, Sam, and Filip Johansson. "Kvalitetssäkring av objektsbaserade IFC-filer för mängdavtagningar och kostnadskalkyler : En fallstudie av levererade konstruktionsmodeller, vid Tyréns AB Stockholm." Thesis, KTH, Byggteknik och design, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-127029.

Full text
Abstract:
Att utföra kostnadskalkyler och mängdavtagningar utifrån objektbaserade 3D-modeller har visat sig vara problematiskt då modellerna inte uppfyller de förväntade kraven på informationsnivå. De bristande modellerna leder till en större osäkerhet då kalkylatorn skall upprätta kostnadskalkyler, vilket utgör bakgrunden till de frågeställningar som rapporten utgörs av. Projektet Biomedicum, som skall bli det toppmoderna laboratoriet åt Karolinska Institutet bedrivs som ett fullskaligt BIM-projekt, där de objektbaserade 3D-modellerna utgör underlaget för bland annat kostnadskalkyler i systemhandlingsskedet. Kalkylatorn i projektet, har uttryckt ett missnöje gällande kvaliteten i modellerna, och menar på att det finns stora brister som leder till att risken för att osäkra kalkyler upprättas ökar. Denna rapport kommer således att se över hur det är möjligt att kvalitetssäkra en objektbaserad 3D-modell i IFC-format i programvaran Solibri Model Checker, utifrån de projektspecifika kraven som råder i projektet. Rapporten utgörs av en fallstudie och en litteraturstudie, och till viss del även av intervjuer. I fallstudien undersöks det om Solibri Model Checker är kapabel till att kontrollera modellerna utifrån de projektspecifika krav som råder i projektet, samt att analysera resultatet av kontrollerna som har utförts. Intervjuerna i sin tur har legat till grund för ett antal problem som har ansetts vara relevanta för rapporten. Problem som kan härledas till juridiska svårigheter, avsaknad av en gemensam branschstandard, meningsskiljaktigheter kring innehållet i en objektbaserad 3D-modell etc.  I fallstudien har det visat sig att 3D-modellerna är bristande ur kalkylsynpunkt. Detta leder till svårigheter och osäkerheter då Skanska upprättar kalkylerna. Anledningarna till detta är flera. 
Det handlar bland annat om att konstruktören använder sig av en modelleringsteknik som inte är anpassad för kalkylering, men även att det inte finns tydliga riktlinjer om vilken information som bör finnas i objekten i systemhandlingsskedet, vilket har framgått under intervjuerna. Det har även framgått under intervjuerna, att ett avgörande problem som kan anknytas till att modellerna är av bristande kvalitet ur kalkylsynpunkt, är på grund av att det inte har ställts krav tillräckligt tidigt i processen. Vad som är viktigt för att få en fungerande process där de objektbaserade 3D-modellerna skall användas för mängdavtagningar och kostnadskalkyler, är att i ett tidigt skede fastslå vilken information som behövs i objekten för att trovärdiga kalkyler skall vara möjliga att upprätta. Vikten av att det finns en tydlig IT-handledning är även stor, då det är handledningen som ligger till underlag om vilken information som skall finnas med i modellerna.
Performing quantity takeoffs and cost estimates based on 3D models is problematic when the models do not meet the expected demands on information content. The deficient models lead to greater uncertainty when the cost estimator establishes estimates, which forms the background to the issues this thesis addresses. The Biomedicum project is run as a full-fledged BIM project, where the object-based 3D models are the basis for, among other things, cost estimates in the system documentation phase. The cost estimator in the project has expressed dissatisfaction with the quality of the models, and believes there are major flaws that increase the risk of unreliable estimates. For this reason, this thesis examines how the quality of an object-based 3D model in IFC format can be assured with the software Solibri Model Checker, based on the project-specific requirements. The thesis consists of a literature review, interviews and a case study. The case study examines whether Solibri Model Checker is capable of checking the models against the project-specific requirements, and analyzes the results of the checks that were carried out. The interviews, in turn, surfaced a number of problems considered relevant to the thesis: problems that can be traced to legal difficulties, the lack of a common industry standard, disagreements about the content of an object-based 3D model, and so on. The case study has shown that the 3D models are inadequate from a cost-estimating point of view, which leads to difficulties and uncertainties in the estimating process. There are several reasons for this: the project's designers used a modeling technique not suited to quantity takeoff and cost estimating, and, as emerged during the interviews, there are no clear guidelines as to what information should be included in the objects in the system documentation phase.
It also emerged during the interviews that a crucial problem behind the models' poor quality from a cost-estimating point of view is that requirements were not set early enough in the process. What is important for a working process, in which the object-based 3D models are to be used for quantity takeoffs and cost estimates, is to determine at an early stage what information the objects must contain for credible estimates to be possible. A clear IT manual is also important, since it is the manual that defines what information is to be included in the models.
APA, Harvard, Vancouver, ISO, and other styles
16

Cueva, Marcos Salles. "Estudo de VIM - Movimentos Induzidos por Vórtices em plataformas do tipo monocoluna: abordagem numérico-experimental e impactos no sistema de amarração." Universidade de São Paulo, 2011. http://www.teses.usp.br/teses/disponiveis/3/3135/tde-31052011-162748/.

Full text
Abstract:
A vibração induzida por vórtices - VIV é um assunto muito estudado na engenharia oceânica, tendo seu nome adaptado para VIM por ocasião de observação semelhante em plataformas do tipo Spar no Golfo do México. Com o início da utilização de plataformas monocolunas, especificamente com a instalação da Sevan Piranema no Nordeste Brasileiro, em 2007, questionou-se sobre a ocorrência deste fenômeno nestas unidades de seção cilíndricas, aplicando cargas não previstas aos sistemas de ancoragem, e eventualmente tornando estes subdimensionados. Em um primeiro momento, foi realizado estudo detalhado sobre os trabalhos já realizados na área, e tendo sido encontrado material insuficiente sobre o VIM em plataformas monocolunas, buscou-se a sua caracterização em termos de amplitudes e períodos de oscilação a partir de ensaios em tanques de prova, estratégia justificada pela dificuldade em se obter tais resultados de forma confiável via uma abordagem analítica e/ou numérica. Posteriormente, com o objetivo de se verificar o impacto do VIM nas cargas extremas, e principalmente na fadiga do sistema de ancoragem, foram realizadas análises com dois métodos distintos encontrados na literatura. Em linhas gerais, o primeiro método é descrito como aquele que considera a aplicação do VIM via movimentos equivalentes aos identificados experimentalmente e o segundo, por sua vez, via a consideração apenas das respectivas forças. Para esses estudos foi analisado o sistema de amarração em um cenário típico, sob condições ambientais de vento, onda e correnteza. Finalmente, dada a sua importância observada nos resultados, foram discutidas as relações entre o VIM e alguns fatores externos, tais como: a razão de aspecto da plataforma; a presença concomitante de ondas e o amortecimento viscoso atuante nas linhas de ancoragem, sendo apresentados para este último uma comparação dos resultados obtidos numericamente com aqueles experimentais.
Vortex-induced vibration (VIV) is a major subject of study in offshore engineering; the name was adapted to VIM after the phenomenon was observed in Spar platforms in the Gulf of Mexico. When the first monocolumn unit, the Sevan Piranema, was installed in Northeast Brazil in 2007, the occurrence of VIM in cylindrical units of this kind came into question, since the mooring system had been designed without this load. First, a detailed survey of the work developed in this field was carried out; as few publications on VIM in monocolumn platforms were found, the phenomenon was characterized in terms of oscillation amplitude and period through model tests, a strategy justified by the difficulty of obtaining reliable results by analytical or numerical approaches. Later, with the objective of verifying the impact of VIM on extreme loads and, mainly, on fatigue damage, analyses were performed with two different methods found in the literature. The first applies VIM to the numerical model as a motion, based on the experimental results, while the second applies the corresponding forces. For these studies the mooring system of a typical scenario was analyzed, under wind, wave and current environmental loads. Finally, given the importance observed in the results, the relations between VIM and some external factors were discussed, such as the platform's aspect ratio, the simultaneous presence of waves, and the viscous damping acting on the mooring lines; for the last of these, a comparison between numerical and experimental results is presented.
APA, Harvard, Vancouver, ISO, and other styles
17

Hamoud, Bassam, and Subeyr Yassin. "Användning av BIM-modeller vid kommunikation mellan olika parter." Thesis, KTH, Byggteknik och design, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-123772.

Full text
Abstract:
Trafikverket vill ta reda på hur de ska implementera BIM (Building Information Modeling) som ett arbetssätt i sin organisation vid projektering, byggande och förvaltning. Syftet med det här examensarbetet är att ta reda på hur Trafikverket kommunicerar med BIM-modeller idag och hur man kan utvecklas. Det är få människor i anläggningsbranschen som idag har erfarenhet av att arbeta med BIM. Även i de projekt som jobbar med BIM idag så behövs det alltid en BIM-samordnare som håller i trådarna. För att få ett bra slutresultat så måste kommunikationen i ett projekt fungera bra. En förutsättning för detta är att alla inblandade parter har samma mål med projektet, t.ex. att hålla tidplanen bättre. De som saknar erfarenhet av arbete med BIM-modeller måste ges en chans att förstå sig på det för att de ska kunna dela med sig av sina erfarenheter och kunskap. Kunskapsnivån måste höjas och det kommer ta sin tid. Det är viktigt att man inte fokuserar allt för mycket på tekniken eftersom BIM handlar om mer än bara teknik. Först och främst måste man vara tydliga med vilket mål det är som man har av arbete med BIM. Utifrån detta kan man sedan definiera en process och sist bestämma sig för vilken teknik som passar bäst för detta ändamål.
The Swedish Transport Administration (Trafikverket) wants to find out how to implement BIM (Building Information Modeling) as a way of working in their organization during design, construction and management. The purpose of this study is to find out how BIM models are used in communication today and how this can be developed. There are few people in the construction industry today who have experience of working with BIM. Even in projects that work with BIM there is always a need for a BIM coordinator who holds the strings. To get a good end result, the communication within the project must work well. A condition for this is that all the involved parties have the same goals for the project, such as keeping to the schedule. Those who lack experience of working with BIM models must be given a chance to understand it in order to be able to share their experience and knowledge with others. Knowledge must be enhanced, and that will take time. It is also important not to focus too much on technology, because BIM is about more than just technology. First and foremost, one must be clear about one's goals for working with BIM. From this, one can define a process and ultimately decide which technology is best suited to it.
APA, Harvard, Vancouver, ISO, and other styles
18

Himberg, Benjamin Evert. "Accelerating Quantum Monte Carlo via Graphics Processing Units." ScholarWorks @ UVM, 2017. http://scholarworks.uvm.edu/graddis/728.

Full text
Abstract:
An exact quantum Monte Carlo algorithm for interacting particles in the spatial continuum is extended to exploit the massive parallelism offered by graphics processing units. Its efficacy is tested on the Calogero-Sutherland model describing a system of bosons interacting in one spatial dimension via an inverse square law. Due to the long range nature of the interactions, this model has proved difficult to simulate via conventional path integral Monte Carlo methods running on conventional processors. Using Graphics Processing Units, optimal speedup factors of up to 640 times are obtained for N = 126 particles. The known results for the ground state energy are confirmed and, for the first time, the effects of thermal fluctuations at finite temperature are explored.
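The speedup comes from evaluating the long-range inverse-square interaction for many particles, and many configurations, simultaneously. A sketch of that data-parallel pattern follows, with NumPy broadcasting standing in for a GPU kernel; the batch size, particle count and coupling are arbitrary illustration values:

```python
import numpy as np

rng = np.random.default_rng(0)

def pair_energy(x, g=1.0):
    """Inverse-square pair interaction sum over i < j of g / (x_i - x_j)**2,
    evaluated for a whole batch of configurations at once (shape (batch, N)).
    This all-pairs, all-walkers evaluation is the data-parallel pattern a
    GPU kernel exploits; broadcasting stands in for it here."""
    diff = x[:, :, None] - x[:, None, :]          # (batch, N, N) pairwise separations
    iu, ju = np.triu_indices(x.shape[1], k=1)     # each unordered pair once
    return (g / diff[:, iu, ju] ** 2).sum(axis=1)

# Illustrative batch of 1000 configurations of N = 8 particles on a line,
# spaced so no two coordinates coincide.
x = np.sort(rng.normal(size=(1000, 8)), axis=1) + np.arange(8) * 0.5
energies = pair_energy(x)
```

On a GPU the same arithmetic would be mapped to one thread per pair (or per walker), which is where speedups of the reported magnitude come from for large batches.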
APA, Harvard, Vancouver, ISO, and other styles
19

Csonka, Kamilla. "Användaracceptans vid systemimplementering." Thesis, Linnéuniversitetet, Institutionen för datavetenskap, fysik och matematik, DFM, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-20389.

Full text
Abstract:
Användaracceptans är en viktig del i alla systemutvecklingsprojekt och är också en väldigt lättpåverkad variabel för en lyckad implementering. Därför är det viktigt att ha översikt på sådana faktorer som kan påverka användaracceptansen negativt, en sådan faktor är försening. Denna uppsats belyser sammanhanget mellan hur en försening påverkar användaracceptansen. Genom att låta en undersökningsgrupp genomgå en enkät utformad efter Technology Acceptance Model har jag samlat upp en generaliserad åsikt. Resultaten visar att försening som variabel inte påverkar användaracceptansen för fallstudien.
User acceptance is an important part of every system development project, and it is also an easily influenced variable that matters for a successful implementation. It is therefore important to keep track of factors that can affect user acceptance negatively; one such factor is delay. This thesis examines how a delay influences user acceptance. By having a study group answer a survey based on the Technology Acceptance Model, I have gathered the group's generalized opinion. The results show that, in this case study, delay as a variable does not influence user acceptance.
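The Technology Acceptance Model operationalizes acceptance through Likert-scale items grouped into constructs such as perceived usefulness and perceived ease of use. A toy sketch of scoring such a survey follows; the respondents, items and values are all invented:

```python
# Hypothetical 1-to-5 Likert responses; item groups follow the two core TAM
# constructs: perceived usefulness ("pu") and perceived ease of use ("peou").
responses = [
    {"pu": [4, 5, 4], "peou": [3, 4, 4]},
    {"pu": [2, 3, 3], "peou": [4, 4, 5]},
]

def construct_scores(responses, construct):
    """Mean item score per respondent for one TAM construct."""
    return [sum(r[construct]) / len(r[construct]) for r in responses]

pu_scores = construct_scores(responses, "pu")
```

Comparing such construct scores between a delayed and a non-delayed group is one simple way to test whether delay shifts acceptance.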
APA, Harvard, Vancouver, ISO, and other styles
20

Nassib, Ali Hussein. "Isar Target Reconstruction Via Dipole Modeling." University of Dayton / OhioLINK, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=dayton1461250663.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

Scibilia, Francesco. "Explicit Model Predictive Control:Solutions Via Computational Geometry." Doctoral thesis, Norges teknisk-naturvitenskapelige universitet, Institutt for teknisk kybernetikk, 2010. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-11627.

Full text
Abstract:
The thesis is mainly focused on issues involved with explicit model predictive control approaches. Conventional model predictive control (MPC) implementation requires at each sampling time the solution of an open-loop optimal control problem with the current state as the initial condition of the optimization. Formulating the MPC problem as a multi-parametric programming problem, the online optimization effort can be moved offline and the optimal control law given as an explicitly defined piecewise affine (PWA) function with dependence on the current state. The domain where the PWA function is defined corresponds to the feasible set which is partitioned into convex regions. This makes explicit MPC solutions into promising approaches to extend the scope of applicability of MPC schemes. The online computation reduces to simple evaluations of a PWA function, allowing implementations on simple hardware and with fast sampling rates. Furthermore, the closed form of the MPC solutions allows offline analysis of the performance, providing additional insight of the controller behavior. However, explicit MPC implementations may still be prohibitively costly for large optimization problems. The offline computational effort needed to solve the multiparametric optimization problem may be discouraging, and even the online computation needed to evaluate a complex PWA controller may cause difficulties if low-cost hardware is used. The first contribution of this thesis is to propose a technique for computing approximate explicit MPC solutions for the cases where optimal explicit MPC solutions are impractical due to the offline computational effort needed and their complexity for online evaluations. This technique is based on computational geometry, a branch of computer science which focuses heavily on computational complexity since the algorithms are intended to be used on large data-sets. 
The approximate solution is suboptimal only over the subregion of the feasible set where constraints are active. In this subregion, the ineffective optimal explicit MPC solution is replaced by an approximation based on Delaunay tessellations and is computed from a finite number of samples of the exact solution. Finer tessellations can be obtained in order to achieve a desired level of accuracy. Subsequently, the thesis presents a twofold contribution concerned with the computation of feasible sets for MPC and their suitable approximations. First, an alternative approach is suggested for computing the feasible set which uses set relations instead of the conventional orthogonal projection. The approach can be implemented incrementally on the length of the MPC prediction horizon, and proves to be computationally less demanding than the standard approach. Thereafter, an algorithm for computing suitable inner approximations of the feasible set is proposed, which constitutes the main contribution. Such approximations are characterized by simpler representations and preserve the essential properties of the feasible set, such as convexity, positive invariance, and inclusion of the set of expected initial states. This contribution is particularly important in the context of finding less complex suboptimal explicit MPC solutions, where the complexity of the feasible set plays a decisive role. The last part of the thesis is concerned with robustness of nominal explicit MPC solutions to model uncertainty. In the presence of model mismatch, when the controller designed using the nominal model is applied to the real plant, the feasible set may lose its invariance property, and this means violation of constraints. Also, since the PWA control law is designed only over the feasible set, there is the technical problem that the control action is undefined if the state moves outside of this set.
To deal with this issue, a tool is proposed to analyze how uncertainty on the model affects the PWA control law computed using the nominal model. Given the linear system describing the plant and the PWA control law, the algorithm presented considers the polytopic model uncertainty and constructs the maximal robust feasible set, i.e. the largest subset of the feasible set which is guaranteed to be feasible for any model in the family of models described by the polytopic uncertainty. The appendix of the thesis contains two additional contributions which are only marginally related to the main theme of the thesis. MPC approaches are often implemented as state feedback controllers. The state variables are not always measured, and in these cases a state estimation approach has to be adopted to obtain the state from the measurements. The two contributions deal with state estimation in two different applications, but not with the explicit goal of being used in MPC approaches.
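The online evaluation that makes explicit MPC cheap is just point location plus an affine map: find the polyhedral region containing the current state, then apply that region's feedback law. A one-dimensional toy sketch follows; the regions and gains are invented (shaped like a saturated control law), not taken from the thesis:

```python
import numpy as np

# Each region is a polyhedron {x : A x <= b} with an affine law u = F x + g.
# Region 0: x <= 1, unsaturated feedback u = -0.5 x.
# Region 1: x >= 1, saturated control u = -0.5.
regions = [
    (np.array([[1.0]]), np.array([1.0]), np.array([[-0.5]]), np.array([0.0])),
    (np.array([[-1.0]]), np.array([-1.0]), np.array([[0.0]]), np.array([-0.5])),
]

def pwa_control(x, regions):
    """Online evaluation of an explicit MPC law: locate the polyhedral
    region containing x, then apply that region's affine feedback."""
    for A, b, F, g in regions:
        if np.all(A @ x <= b + 1e-9):      # small tolerance on the boundary
            return F @ x + g
    raise ValueError("x is outside the feasible set")

u = pwa_control(np.array([0.4]), regions)
```

At the region boundary x = 1 both pieces give u = -0.5, reflecting the continuity of the optimal PWA law; the `ValueError` branch is exactly the undefined-outside-the-feasible-set problem the thesis discusses.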
APA, Harvard, Vancouver, ISO, and other styles
22

Alexander, Louis Cadmon. "PROSIM VII an enhanced production simulation model." Ohio : Ohio University, 1992. http://www.ohiolink.edu/etd/view.cgi?ohiou1171474745.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Lee, Christina (Christina Esther). "Latent variable model estimation via collaborative filtering." Thesis, Massachusetts Institute of Technology, 2017. http://hdl.handle.net/1721.1/113932.

Full text
Abstract:
Thesis: Ph. D., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2017.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (pages 255-264).
Similarity based collaborative filtering for matrix completion is a popular heuristic that has been used widely across industry in the previous decades to build recommendation systems, due to its simplicity and scalability. However, despite its popularity, there has been little theoretical foundation explaining its widespread success. In this thesis, we prove theoretical guarantees for collaborative filtering under a nonparametric latent variable model, which arises from the natural property of "exchangeability", i.e. invariance under relabeling of the dataset. The analysis suggests that similarity based collaborative filtering can be viewed as kernel regression for latent variable models, where the features are not directly observed and the kernel must be estimated from the data. In addition, while classical collaborative filtering typically requires a dense dataset, this thesis proposes a new collaborative filtering algorithm which compares larger radius neighborhoods of data to compute similarities, and shows that the estimate converges even for very sparse datasets, which has implications for sparse graphon estimation. The algorithms can be applied in a variety of settings, such as recommendations for online markets, analysis of social networks, or denoising crowdsourced labels.
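The kernel-regression view can be made concrete in a few lines: predict a missing entry as a similarity-weighted average of other users' ratings, with cosine similarity on co-rated items playing the role of the kernel. The ratings matrix below is invented, and this is the classical dense-regime heuristic, not the larger-radius algorithm the thesis proposes:

```python
import numpy as np

# Toy ratings matrix; 0 means unobserved. Entirely illustrative data.
R = np.array([
    [5.0, 4.0, 0.0, 1.0],
    [4.0, 5.0, 1.0, 1.0],
    [1.0, 1.0, 5.0, 4.0],
])

def predict(R, user, item):
    """Similarity-weighted average of other users' ratings of `item`,
    using cosine similarity over co-rated items as the kernel weight."""
    target = R[user]
    num = den = 0.0
    for v in range(R.shape[0]):
        if v == user or R[v, item] == 0:
            continue
        mask = (target > 0) & (R[v] > 0)   # items both users rated
        mask[item] = False                 # never use the entry being predicted
        if not mask.any():
            continue
        a, b = target[mask], R[v][mask]
        sim = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
        num += sim * R[v, item]
        den += abs(sim)
    return num / den if den else np.nan

r = predict(R, user=0, item=2)
```

For sparse data such direct overlaps vanish, which is exactly what motivates the thesis's comparison of larger-radius neighborhoods instead of co-rated entries.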
by Christina E. Lee.
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
24

Zhou, Yan. "Bayesian model comparison via sequential Monte Carlo." Thesis, University of Warwick, 2014. http://wrap.warwick.ac.uk/62064/.

Full text
Abstract:
The sequential Monte Carlo (smc) methods have been widely used for modern scientific computation, and Bayesian model comparison has been successfully applied in many fields. Yet there has been little research on the use of smc for the purpose of Bayesian model comparison. This thesis studies different smc strategies for Bayesian model comparison. In addition, various extensions and refinements of existing smc practices are proposed in this thesis. Through empirical examples, it will be shown that the smc strategies can be applied to many realistic applications which might be difficult for Markov chain Monte Carlo (mcmc) algorithms. The extensions and refinements lead to an automatic and adaptive strategy. This strategy is able to produce accurate estimates of the Bayes factor with minimal manual tuning of algorithms. Another advantage of smc algorithms over mcmc algorithms is that they can be parallelized in a straightforward way. This allows the algorithms to better utilize modern computer resources. This thesis presents work on the parallel implementation of generic smc algorithms. A C++ framework within which generic smc algorithms can be implemented easily on parallel computers is introduced. We show that with little additional effort, the implementations using this framework can provide significant performance speedup.
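The quantity behind a Bayes factor is a ratio of marginal likelihoods. The sketch below estimates each one by sampling from the prior and averaging the likelihood, which is the degenerate one-step case of an smc sampler (a full smc sampler would add a sequence of tempered intermediate distributions, move kernels and resampling). The Gaussian models and data value are invented so the answer can be checked against the closed form:

```python
import math
import random

random.seed(42)

def evidence_prior_sampling(y, sigma, tau, n=100_000):
    """Marginal likelihood Z = integral of N(y | theta, sigma^2) N(theta | 0, tau^2)
    over theta, estimated by sampling theta from the prior and averaging the
    likelihood. Analytically Z = N(y | 0, sigma^2 + tau^2)."""
    total = 0.0
    for _ in range(n):
        theta = random.gauss(0.0, tau)
        total += math.exp(-0.5 * ((y - theta) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))
    return total / n

y = 1.0
z1 = evidence_prior_sampling(y, sigma=1.0, tau=1.0)   # model 1: wide prior
z2 = evidence_prior_sampling(y, sigma=1.0, tau=0.1)   # model 2: tight prior
bayes_factor = z1 / z2
```

The smc refinement matters when the posterior is far from the prior: the plain estimator above then suffers from weight degeneracy, which the tempered sequence is designed to avoid.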
APA, Harvard, Vancouver, ISO, and other styles
25

Wu, Si Chen. "Research on legal issues of VIE model." Thesis, University of Macau, 2016. http://umaclib3.umac.mo/record=b3525479.

Full text
APA, Harvard, Vancouver, ISO, and other styles
26

Haapala, O. (Olli). "Application software development via model based design." Master's thesis, University of Oulu, 2015. http://urn.fi/URN:NBN:fi:oulu-201504021268.

Full text
Abstract:
This thesis set out to study the use of MathWorks' Simulink® in model-based application software development and its compatibility with the Vacon 100 inverter. The target was to identify the problems related to everyday use of this method and to write a white paper on how to carry out a model-based design that produces Vacon 100 compatible system software. Before this thesis was started, there was very little knowledge about the compatibility of this method; during the practical experiments, however, it quickly became clear that the method works very well with the Vacon 100, and the majority of the problems anticipated beforehand proved unfounded. The only permanent problem that came up was that the Vacon 100 supports only 32-bit floating-point precision, while Simulink uses 64-bit floating-point precision by default. Even though this data-type incompatibility prevents the use of some Simulink blocks, it is by no means a factor that restricts the use of the method. During the practical research, no problems were found that completely prevent the use of this method with the Vacon 100 inverter. The Simulink PLC Coder was in the spotlight during the practical research; even though the code generator worked better than expected, the biggest problems limiting the use of model-based design relate to the PLC Coder. All in all, based on this study, model-based design is a very usable tool for everyday design work. The method does not necessarily show its advantages in small-scale design work, and the implementation phase can take some time before the company has built up the model library it needs. In the future, however, software requirements will increase, which will increase the importance of model-based design. A well-executed model-based design process can significantly improve cost, schedule and quality
Tämän lopputyön tarkoituksena oli tutkia MathWorks:n Simulink®-ohjelmiston käyttöä mallipohjaisessa ohjelmistotuotannossa ja sen soveltuvuutta Vacon 100 -taajusmuuntajan ohjelmointiin. Tavoitteena oli identifioida kaikki ongelmakohdat, jotka vaikuttavat menetelmän jokapäiväisessä hyödyntämisessä, sekä luoda raportti, miten menetelmän avulla voidaan tehdä Vacon 100 yhteensopiva ohjelmisto. Ennen työn aloittamista menetelmän soveltuvuudesta ei ollut tarkkaa tietoa. Työn aikana suoritetut käytännönläheiset ohjelmistotuotantoesimerkit kuitenkin osoittivat nopeasti menetelmän toimivuuden. Ongelmakohdat, joita ajateltiin ennen työn aloittamista, osoittautuivat pääosin vääriksi. Ainoa pysyvä ongelmakohta, joka työn aikana tuli esille, on Vacon 100:n tuki vain 32-bit reaaliluvuille, kun taas Simulink käyttää oletuksena 64-bit reaalilukua. Vaikka datatyypistä aiheutuva ongelma estääkin muutaman Simulink-lohkon käytön, se ei kuitenkaan ole menetelmän käyttöä rajoittava ongelma. Työssä ei tullut vastaan yhtään ongelmaa, joka olisi estänyt mallipohjaisen suunnittelun käytön Vacon 100 -laitteen kanssa. Simulink:n koodigenerointityökalu eli Simulink PLC Coderon tärkeässä osassa työn tutkimuksen kannalta. Kaiken kaikkiaan koodigeneraattori toimi yli odotusten, mutta suurimmat ongelmat, jotka rajoittavat mallipohjaisen suunnittelun käyttöä, liittyvät kuitenkin PLC Coder:n toimintaan. Yhteenvetona työn perusteella voidaan todeta, että mallipohjainen ohjelmistotuotanto on nykyaikana erittäin käyttökelpoinen menetelmä. Tosin menetelmän tuomat hyödyt eivät välttämättä tule esille pienessä mittakaavassa ja ennen kuin yritykselle on muodostunut omaan tuotteeseen liittyvien mallien ja lohkojen tietokanta. Tulevaisuudessa kuitenkin suunnittelutyön vaatimusten kasvaessa, mallipohjaisen ohjelmistotuotannon merkitys tulee kasvamaan. Hyvin toteutettuna menetelmä parantaa huomattavasti suunnittelutyön tulosta niin taloudellisesti, ajallisesti kuin laadullisestikin
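The 32-bit versus 64-bit floating-point mismatch described in the abstract is a general numerical issue, not specific to Simulink or the Vacon 100. A minimal Python sketch (using NumPy, purely illustrative and not part of the thesis) shows the resolution gap between the two formats:

```python
import numpy as np

# Machine epsilon: the smallest relative step each format can resolve.
eps32 = float(np.finfo(np.float32).eps)   # about 1.2e-07
eps64 = float(np.finfo(np.float64).eps)   # about 2.2e-16

# A detail that survives in a 64-bit model is silently lost once the
# generated code runs on a 32-bit-only target:
signal = np.float64(1.0) + 1e-9               # representable in float64
assert np.float64(signal) != np.float64(1.0)
assert np.float32(signal) == np.float32(1.0)  # rounded away in float32
```

Blocks whose semantics depend on double precision are exactly the ones the abstract reports as unusable on such a target.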
APA, Harvard, Vancouver, ISO, and other styles
27

Xu, Ruoxi. "Regression Model Stochastic Search via Local Orthogonalization." The Ohio State University, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=osu1322589253.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Marais, Susan Maria. "'N Generiese model vir 'n effektiewe mentorskapprogram." Thesis, Pretoria : [s.n.], 2001. http://upetd.up.ac.za/thesis/available/etd-08272003-101859.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Filho, Nicolino Trompieri. "ANÁLISE DOS RESULTADOS DA AVALIAÇÃO DO SAEB/2003 VIA REGRESSÃO LINEAR MÚLTIPLA." Universidade Federal do Ceará, 2007. http://www.teses.ufc.br/tde_busca/arquivo.php?codArquivo=1993.

Full text
Abstract:
não há
O presente estudo teve como objetivo identificar nos resultados do SAEB/2003, no Ceará, um modelo multilinear com variáveis que garantam o maior grau de explicação do rendimento em matemática nos testes aplicados nas amostras de 4ª série da escola pública; 4ª série da escola particular; 8ª série da escola pública e 8ª série da escola particular. Tomando-se, entre as variáveis do diretor, do professor, do aluno e das condições físicas da escola, as variáveis que se relacionavam significativamente com o rendimento no teste de matemática em cada uma das amostras, realizou-se, com os recursos do software SPSS (Statistical Package for the Social Sciences), uma regressão linear em cada amostra, com o método "stepwise". Obtiveram-se quatro modelos lineares múltiplos com variáveis que permitem um maior grau de explicação do rendimento no teste em matemática em cada uma das amostras. Conclui-se que a análise desse tipo permite a formulação de políticas públicas capazes de superar a busca da melhoria do ensino por intervenções pontuais.
This study aimed to identify, in the SAEB/2003 results for Ceará, a multilinear model with variables that provide the greatest degree of explanation of mathematics achievement in the tests applied to the samples of fourth-year students in public schools, fourth-year students in private schools, eighth-year students in public schools and eighth-year students in private schools. Among the variables relating to the school director, the teacher, the student and the physical condition of the school, those significantly related to mathematics test achievement in each sample were selected, and a linear regression was run on each sample with the stepwise method, using the SPSS software (Statistical Package for the Social Sciences). Four multiple linear models were obtained, with variables that permit a greater degree of explanation of mathematics test achievement in each sample. The study concludes that this type of analysis permits the formulation of public policies capable of going beyond the pursuit of teaching improvement through isolated interventions.
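The stepwise selection procedure the study ran in SPSS can be sketched in a few lines. The following forward-selection toy (NumPy, synthetic data, my own illustration rather than the study's code) adds at each step the predictor that most reduces the residual sum of squares:

```python
import numpy as np

def fit_rss(X, y):
    """Residual sum of squares of an ordinary least-squares fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return float(resid @ resid)

def forward_stepwise(X, y, max_vars=3):
    """Greedy forward selection: at each step add the column whose
    inclusion most reduces the residual sum of squares."""
    n, p = X.shape
    selected, remaining = [], list(range(p))
    for _ in range(max_vars):
        best = min(remaining, key=lambda j: fit_rss(X[:, selected + [j]], y))
        selected.append(best)
        remaining.remove(best)
    return selected

# Synthetic data: achievement depends on predictors 1 and 3 only.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = 2.0 * X[:, 1] - 1.5 * X[:, 3] + 0.1 * rng.normal(size=200)
```

With this setup, `forward_stepwise(X, y, max_vars=2)` recovers the two informative predictors. Real stepwise procedures (as in SPSS) use F-tests or p-value thresholds to decide when to stop rather than a fixed variable count.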
APA, Harvard, Vancouver, ISO, and other styles
30

Campos, Josmar Furtado de. "Eficiência da análise estatística espacial na classificação de famílias do feijoeiro - estudo via simulação." Universidade Federal de Viçosa, 2011. http://locus.ufv.br/handle/123456789/4039.

Full text
Abstract:
The aim of this study was to evaluate the efficiency of spatial analysis, which considers spatially dependent errors, for classifying common bean families, compared with the traditional randomized-block and lattice analyses that assume independent errors. Different degrees of spatial dependence and experimental precision were considered. The results of seven trials installed in a simple square lattice design for genetic evaluation of grain yield (g/plot) of bean families and cultivars, from the winter and wet-season crops of 2007 and 2008, were taken as the reference for the simulation. From the simulation results it was possible to assess the quality of the respective experiments under the different analyses (block, lattice and spatial) and the simulated means of 100 families, across the scenarios of Spatial Dependence (DE) and Selective Accuracy (AS). In the simulation process, the mean yield (645 g/plot) and the residual variance (7744.00) were defined from the results of block analyses of trials of the UFV bean breeding program. To compose the simulated scenarios, four magnitudes of spatial dependence were considered (null, low, medium and high), corresponding to ranges of 0, 25, 50 and 100% of the maximum distance between plots. Three classes of selective accuracy were also simulated (0.95, 0.80 and 0.60), corresponding to very high, high and medium experimental precision, respectively. The true classification of the families was used to evaluate the efficiency of the tested analysis methods, through the Spearman correlation applied to the genotypic ranking orders and through the Selection Efficiency between the classifications based on the tested methodologies and the true classification, for selection of the best 10, 20 and 30% of families. To compare the goodness of fit of the tested models, the Akaike information criterion (AIC), based on likelihood, was used.
Spatial analysis provided estimates of residual variance very close to the simulated residual variance, and higher estimated selective accuracy in all scenarios, indicating greater experimental precision. As selective accuracy decreased and spatial dependence increased, the type of analysis had a greater influence on the classification of families, and spatial analysis showed the best results, providing more efficient selection of bean families than the traditional randomized-block and lattice analyses, mainly when fewer families were selected. The results for selective accuracy estimated from the F statistic were very close to those obtained with the Spearman correlation between estimated and simulated family means, indicating that selective accuracy should be used as a measure of experimental precision in genetic evaluation trials.
O objetivo deste trabalho foi avaliar a eficiência da análise Espacial, que considera erros dependentes espacialmente, para classificação de famílias de feijoeiro em relação às análises tradicionais em blocos casualizados e em látice que assumem erros independentes. Considerou-se diferentes graus de dependência espacial e de precisão experimental. Foram tomados como referência para simulação os resultados de sete ensaios instalados em látice quadrado simples para avaliação genética da produtividade de grãos (g/parcela) de famílias e cultivares de feijoeiro das safras de inverno e das águas de 2007 e 2008. A partir dos resultados apresentados na simulação, foi possível avaliar a qualidade dos respectivos experimentos com base nas diferentes análises (Bloco, Látice e Espacial) e médias simuladas das 100 famílias nos diferentes cenários para Dependência Espacial (DE) e Acurácia Seletiva (AS). No processo de simulação, a média de produção (645 g/parcela), bem como a variância residual (7744,00), foi definida com base nos resultados de análises em blocos de ensaios do programa de melhoramento do feijoeiro da UFV. Para a composição dos cenários simulados foram consideradas quatro magnitudes de dependência espacial (nula, baixa, média e alta), correspondendo aos alcances 0, 25, 50 e 100% da distância máxima entre parcelas. Também foram simuladas três classes de acurácia seletiva (0,95, 0,80 e 0,60), correspondente a precisão experimental muito alta, alta e média, respectivamente. A classificação real das famílias foi utilizada para avaliar a eficiência das metodologias de análise testadas através da correlação de Spearman aplicada às ordens de classificação genotípica e da Eficiência de Seleção entre classificações com base nas metodologias testadas e na classificação real, para a seleção de 10, 20 e 30% das melhores famílias. Para comparar a eficiência de ajuste dos modelos testados, foi utilizado o critério de Informação de Akaike (AIC), baseado em verossimilhança. 
A análise Espacial apresentou estimativas de variância residual muito próxima da variância residual simulada e maior acurácia seletiva estimada em todos os cenários, indicando maior precisão experimental. Com a redução na acurácia seletiva e aumento na dependência espacial, observou-se maior influência do tipo de análise sobre a classificação das famílias, sendo que a análise espacial apresentou os melhores resultados, proporcionando seleção mais eficiente das famílias do feijoeiro do que as análises tradicionais em Látice e em Blocos casualizados, principalmente, para seleção de menor número de famílias. Os resultados para acurácia seletiva estimada em função da estatística F foram muito próximos aos obtidos para a correlação de Spearman entre médias estimadas e simuladas para as famílias, indicando que a acurácia seletiva deve ser utilizada como medida de precisão experimental nos ensaios de avaliação genética.
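The two comparison measures used in the study above are easy to state concretely. A small sketch (NumPy; the numbers are invented and only mimic the study's setting of 100 family means around 645 g/plot):

```python
import numpy as np

def spearman(a, b):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    ra = np.argsort(np.argsort(a))
    rb = np.argsort(np.argsort(b))
    return float(np.corrcoef(ra, rb)[0, 1])

def aic(log_likelihood, n_params):
    """Akaike information criterion: lower means a better fit/complexity trade-off."""
    return 2 * n_params - 2 * log_likelihood

# Invented numbers: 100 true family means, estimated by a precise
# ("spatial") and a less precise ("block") analysis.
rng = np.random.default_rng(1)
true_means = rng.normal(645.0, 30.0, size=100)
spatial_est = true_means + rng.normal(0.0, 5.0, size=100)   # small error
block_est = true_means + rng.normal(0.0, 25.0, size=100)    # larger error

# The more precise analysis ranks the families closer to the truth:
assert spearman(true_means, spatial_est) > spearman(true_means, block_est)
```

This mirrors the study's logic: the analysis whose estimated means rank families closest to the true classification is the more useful one for selection, and AIC arbitrates between competing error models.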
APA, Harvard, Vancouver, ISO, and other styles
31

Keyser, Helane. "Ondersoek na 'n model vir die opleiding van leksikograwe vir verklarende woordeboeke." Thesis, Stellenbosch : Stellenbosch University, 2003. http://hdl.handle.net/10019.1/53539.

Full text
Abstract:
Thesis (MPhil)--Stellenbosch University, 2003.
ENGLISH ABSTRACT: In the past, work on theoretical lexicography was often done merely "incidentally", usually flowing from research in pure linguistics. In the course of time, theoretical lexicography and the compilation of dictionaries developed into a fully fledged discipline. The successful lexicographer requires particular characteristics and skills of a high quality. It appears from discussions with dictionary editors that even the best academic training does not necessarily produce an excellent lexicographer, as the compilation of dictionaries is a process that requires the skilful integration of theory, practice and language intuition. This study has tried to identify the knowledge, characteristics and skills needed to train prospective lexicographers. By way of a selective literature review and interviews with people engaged in the practice of lexicography, an attempt has been made to design a model for the training of lexicographers. The study has shown that the training of lexicographers should not focus only on theoretical aspects: in-service training or experience with a dictionary publisher is very important. A proposal is made on the subjects that should receive attention in the training of lexicographers. As part of the training model, every prospective lexicographer should receive grounding in the development of management skills. A model for the compilation of dictionaries should emphasise the importance of a well-thought-through conceptualisation of a dictionary plan as part of the lexicographical process. Lexicographers should be trained in the lexicographical process, from the collection of data to the preparation of the dictionary for publication. The purpose of the dictionary should also be identified as part of the planning phase.
Prospective lexicographers must, among other things, build up an extensive knowledge of metalexicography, the general theory of lexicography, the management and planning of a dictionary project, the lexicographical process (from the collection of data to the publication and evaluation of dictionaries), aspects regarding the theories of semantics, morphology, phonology and syntax, and computer lexicography. Finally, a recommendation is made that certain components of lexicography training should also be undertaken at technikons rather than only at universities - this proposition should be researched further.
AFRIKAANSE OPSOMMING: Werk in die teoretiese leksikografie is in die verlede dikwels bloot "toevallig" gedoen vanuit navorsing wat gewoonlik veral in die suiwer taalkunde gedoen is. Die teoretiese leksikografie en die skryf van woordeboeke het mettertyd ontwikkel tot 'n volwaardige dissipline. Sekere eienskappe en vaardighede van hoogstaande gehalte word benodig om 'n suksesvolle leksikograaf te wees. Uit gesprekke met woordeboekredakteurs blyk dit duidelik dat selfs die beste akademiese opleiding nie noodwendig 'n voortreflike leksikograaf saloplewer nie, aangesien die skryf van 'n woordeboek 'n proses is waardeur teorie, praktyk en taalintuïsie vaardig ineengevleg moet word. Hierdie studie het gepoog om die kennis, eienskappe en vaardighede te identifiseer wat nodig is om 'n voornemende leksikograaf te onderrig. Deur middel van 'n selektiewe literatuurstudie en persoonlike onderhoude met persone in die praktyk, is gepoog om 'n model daar te stel vir die opleiding van leksikograwe. Die studie het getoon dat leksikograwe se opleiding nie net op die teoretiese aspekte mag fokus nie, maar dat indiensopleiding of ondervinding by 'n woordeboekuitgewer van groot belang is. 'n Voorstel word gemaak oor die onderwerpe wat aandag behoort te geniet in die opleiding van 'n leksikograaf. As deel van die opleidingsmodel behoort elke voornemende leksikograaf onderlê te word in die ontwikkeling van bestuursvaardighede. 'n Model vir die samestelling van woordeboeke behoort klem te lê op die belangrikheid van 'n weldeurdagte woordeboekkonseptualiseringsplan as deel van die leksikografiese proses. Leksikograwe behoort in die leksikografiese proses onderrig te word, vanaf die verkryging van data tot die voorbereiding vir publikasie van die woordeboek. Die doel van die woordeboek moet ook geïdentifiseer word as deel van die beplanningsfase. 
Voornemende leksikograwe moet onder andere 'n deeglike kennis opdoen van die metaleksikografie, algemene leksikografieteorie, bestuur en beplanning van 'n woordeboekprojek, die leksikografiese proses (van die verkryging van data tot die uitgee van die woordeboek en die evaluering daarvan), aspekte van die teorie van semantiek, morfologie, fonologie en sintaksis en rekenaarleksikografie. Ten slotte word 'n aanbeveling gemaak dat bepaalde komponente van leksikografie-opleiding dalk ook by technikons en nie net by universiteite hoort nie - hierdie opmerking behoort verder nagevors te word.
APA, Harvard, Vancouver, ISO, and other styles
32

Mitchell, Cassie S. "Viewpoint aggregation via relational modeling and analysis." Diss., Atlanta, Ga. : Georgia Institute of Technology, 2009. http://hdl.handle.net/1853/28213.

Full text
Abstract:
Thesis (M. S.)--Biomedical Engineering, Georgia Institute of Technology, 2009.
Committee Chair: Lee, Robert; Committee Member: Kemp, Melissa; Committee Member: Prinz, Astrid; Committee Member: Ting, Lena; Committee Member: Wiesenfeld, Kurt.
APA, Harvard, Vancouver, ISO, and other styles
33

Adhikari, Bhisma. "Intelligent Simulink Modeling Assistance via Model Clones and Machine Learning." Miami University / OhioLINK, 2021. http://rave.ohiolink.edu/etdc/view?acc_num=miami1627040347560589.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Fjärdsjö, Johnny, and Zada Nasir Muhabatt. "Kvalitetssäkrad arbetsprocess vid 3D-modellering av byggnader : Baserat på underlag från ritning och 3D-laserskanning." Thesis, KTH, Byggteknik och design, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-148822.

Full text
Abstract:
Tidigare vid ombyggnation, försäljning och förvaltning av byggnader som var uppförda innan 80-talet utgick fastighetsägarna från enkla handritade pappersritningar. Det är en svår utmaning att hålla ritningen uppdaterad till verkliga förhållanden d.v.s. relationsritning. För ca 25 år sedan (i början på 80-talet) byttes papper och penna ut mot avancerade ritprogram (CAD) för framtagning av ritningar. Idag används CAD i stort sett för all nyprojektering och de senaste åren har utvecklingen gått mot en större användning av 3D-underlag än tidigare 2D-ritningar. Den stora fördelen med att projektera i 3D är att en virtuell modell skapas av hela byggnaden för att få en bättre kontroll av ingående byggdelsobjekt och även att fel kan upptäckas i tidigare skeden än på byggarbetsplatsen. Genom att börja bygga en virtuell byggnad i 3D från första skedet och succesivt fylla den med mer relevant information i hela livscykeln får man en komplett informationsmodell. Ett av kraven som ställs på fastighetsägarna vid ombyggnation och förvaltning är att tillhandahålla korrekt information och uppdaterade ritningar. Det skall vara enkelt för entreprenören att avläsa ritningarna. I rapporten beskrivs en effektiviserad arbetsprocess, metoder, verktyg och användningsområden för framtagning av 3D-modeller. Detta arbete avser att leda fram till en metodbeskrivning som skall användas för erfarenhetsåterföring. Arbetet skall också vara ett underlag som skall användas för att beskriva tillvägagångsättet att modellera från äldre ritningar till 3D-modeller. Metodbeskrivningen kommer att förenkla förståelsen för modellering för både fastighetsägaren och inom företaget, samt höja kvalitén på arbetet med att skapa CAD-modeller från de olika underlag som används för modellering.
Before the 1980s, property owners relied on simple hand-drawn paper drawings for the development, rebuilding, sale and management of buildings. The challenge, however, was to preserve the drawings and keep them updated to as-built condition. About 25 years ago (in the early 1980s), pen and paper were replaced by advanced drawing software (CAD) for producing drawings. Today CAD is used for virtually all new design work, and in recent years development has moved toward greater use of 3D models over 2D drawings. The major advantage of designing in 3D is that a virtual model of the entire building is created, giving better control of the constituent building elements and allowing errors to be detected at earlier stages than at the construction site. By building a virtual model in three dimensions from the first stage and gradually filling it with relevant information throughout the building's life cycle, a complete information model is obtained. One of the requirements placed on property owners during redevelopment and management is to provide accurate information and updated drawings that are simple for the contractor to read. This report describes a streamlined work process, together with methods, tools and applications, for the production of 3D models. The work leads to a method description intended for passing on experience, and serves as a basis for describing the approach of modelling from older drawings into 3D models. The method description will simplify the understanding of modelling for both property owners and the companies that create 3D models, and will raise the quality of the work of creating CAD models from the different source materials used for modelling.
APA, Harvard, Vancouver, ISO, and other styles
35

Tang, On-yee, and 鄧安怡. "Estimation for generalized linear mixed model via multipleimputations." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2005. http://hub.hku.hk/bib/B30687652.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Bester, Suzanne Elizabeth. "Die ontwerp van 'n postmodernistiese model vir beroepsvoorligting." Pretoria : [s.n.], 1999. http://upetd.up.ac.za/thesis/available/etd-11062006-150213.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

LACERDA, Estefane George Macedo de. "Model Selection of RBF Networks Via Genetic Algorithms." Universidade Federal de Pernambuco, 2003. https://repositorio.ufpe.br/handle/123456789/1845.

Full text
Abstract:
One of the main obstacles to the large-scale use of neural networks is the difficulty of setting values for their adjustable parameters. This work discusses how radial basis function networks (RBF networks) can have their adjustable parameters set by genetic algorithms (GAs). To this end, it first presents a broad overview of the problems involved and of the different approaches used to genetically optimize RBF networks. A genetic algorithm for RBF networks with a non-redundant genetic encoding based on clustering methods is also proposed. The work then addresses the problem of finding the adjustable parameters of a learning algorithm via GAs, also known as the model selection problem. Model selection techniques (e.g., cross-validation and bootstrap) are used as objective functions for the GA. The GA is adapted to this problem through heuristics such as Occam's razor and growing, among others. Some modifications exploit features of the GA, such as its ability to solve multi-objective optimization problems and to handle noisy objective functions. Experiments on a benchmark problem are carried out, and the results achieved with the proposed GA are compared with those achieved by other approaches. The proposed techniques are generic and can also be applied to a wide range of learning algorithms.
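As an illustration of the general idea (not the author's implementation), a tiny genetic search over two RBF hyperparameters, with k-fold cross-validation error as the fitness, can be sketched as:

```python
import numpy as np

rng = np.random.default_rng(42)

def rbf_design(X, centers, sigma):
    """Gaussian radial-basis design matrix."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def cv_error(X, y, n_centers, sigma, folds=5):
    """K-fold cross-validation MSE of an RBF net with linear output weights."""
    idx = np.arange(len(X)) % folds
    errs = []
    for k in range(folds):
        tr, te = idx != k, idx == k
        centers = X[tr][rng.choice(tr.sum(), n_centers, replace=False)]
        Phi = rbf_design(X[tr], centers, sigma)
        w, *_ = np.linalg.lstsq(Phi, y[tr], rcond=None)
        pred = rbf_design(X[te], centers, sigma) @ w
        errs.append(np.mean((pred - y[te]) ** 2))
    return float(np.mean(errs))

def genetic_search(X, y, pop=8, gens=5):
    """Tiny GA over (n_centers, sigma): truncation selection plus mutation,
    with cross-validation error as the fitness to minimize."""
    genomes = [(int(rng.integers(3, 15)), float(rng.uniform(0.1, 2.0)))
               for _ in range(pop)]
    for _ in range(gens):
        parents = sorted(genomes, key=lambda g: cv_error(X, y, *g))[: pop // 2]
        children = [(max(3, p[0] + int(rng.integers(-2, 3))),
                     max(0.05, p[1] + float(rng.normal(0.0, 0.2))))
                    for p in parents]
        genomes = parents + children
    return min(genomes, key=lambda g: cv_error(X, y, *g))

# Toy regression problem: learn sin(x) from noisy samples.
X = rng.uniform(-3.0, 3.0, size=(120, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.normal(size=120)
```

The thesis' actual encoding, heuristics (Occam's razor, growing) and multi-objective handling are far richer; this sketch only shows cross-validation acting as the GA's noisy objective function.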
APA, Harvard, Vancouver, ISO, and other styles
38

Bernard-Stoecklin, Sibylle. "Role of semen infected leukocytes in HIV mucosal transmission : Experimental model of SIVmac251 infection in Macaca fascicularis." Thesis, Paris 11, 2013. http://www.theses.fr/2013PA114818/document.

Full text
Abstract:
Aujourd’hui, plus de 80% des nouvelles infections par le virus de l’immunodéficience humaine (VIH) se produisent au cours d’un rapport sexuel, avec une transmission du virus par voie muqueuse. Le sperme constitue donc une source majeure de virus à l’échelle mondiale. Le sperme d’hommes infectés par le VIH contient le virus sous deux formes : des particules virales libres et des cellules infectées, principalement des leucocytes. Plusieurs hypothèses ont été proposées afin d’expliquer le passage du virus à travers la barrière muqueuse, qu’il s’agisse d’une muqueuse génitale (cervico-vaginale, pénienne ou urétrale) ou intestinale (muqueuse anale ou rectale). Toutefois, une grande majorité des études qui ont été menées jusqu’à présent se sont concentrées sur le rôle des particules virales libres, et celui des cellules infectées demeure mal compris. Une étude menée dans notre laboratoire a montré que des leucocytes infectés par le virus de l’immunodéficience simienne (VIS) sont capables de transmettre l’infection après inoculation vaginale. Le projet de cette thèse est d’étudier le rôle des leucocytes infectés présents dans le sperme de macaque dans la transmission muqueuse du VIS/VIH. Ainsi, trois axes d’étude principaux ont été définis : 1) l’étude des leucocytes présents dans le sperme de macaque cynomolgus, et de l’influence que peut avoir l’infection par le VIS sur eux ; 2) l’identification des cellules immunitaires infectées présentes dans le sperme de macaque, et l’étude de leur dynamique au cours de l’infection par le VIS ; 3) l’étude du pouvoir infectieux des deux principales cellules cibles pour le VIS/VIH : les lymphocytes CD4+ (LT CD4+) et les macrophages, in vitro et in vivo, après inoculation rectale et vaginale à des macaques cynomolgus. Le sperme de macaque contient toutes les cellules cibles du VIS/VIH : des lymphocytes T CD4+ (LTCD4+), des macrophages et des cellules dendritiques (dans une moindre proportion).
Les LTCD4+ et les macrophages du sperme présentent un phénotype d’activation, de différenciation et d’expression de marqueurs de migration typique des leucocytes résidant dans les tissus muqueux. L’infection par le VIS induit des changements significatifs dans leur phénotype et leur dynamique. Ces deux types cellulaires peuvent être infectés de façon productive et sont présents dans le sperme à tous les stades de l’infection. Ces données suggèrent que les LTCD4+ et les macrophages du sperme seraient capables de transmettre l’infection par voie muqueuse. Si le rôle des leucocytes infectés du sperme est confirmé in vivo, il sera important à l’avenir de prendre en compte ce mécanisme de transmission dans le développement de nouvelles stratégies préventives de l’infection par le VIH, notamment les microbicides
Human Immunodeficiency Virus (HIV) infection mostly spreads by the mucosal route: sexual transmission is the dominant mode of transmission, responsible for between 85% and 90% of cases of infection worldwide. These epidemiological data indicate that semen is one of the major sources of HIV-1 transmission. Semen, like other bodily secretions involved in HIV sexual transmission, contains the virus in two forms: cell-free viral particles and cell-associated virus, mostly in infected leukocytes. Although cell-to-cell HIV transmission has been extensively described as more efficient, rapid and resistant to host immune responses, very few studies have investigated the in vivo role of infected leukocytes in mucosal transmission of the virus. One such study was recently conducted in our lab and demonstrated that SIV-infected splenocytes are able to transmit infection to female macaques after vaginal exposure. However, all these studies used immune cells from peripheral blood or lymphoid tissues, such as spleen, and none has investigated the capacity of infected leukocytes in semen to transmit the infection in vivo. Indeed, the nature, phenotype and infectivity of HIV associated with semen leukocytes may differ from those of HIV from other sources. The objectives of this work are therefore, first, to study semen leukocytes and their dynamics during SIVmac251 infection in detail, then to investigate seminal factors that may influence semen infectiousness, and finally to test semen leukocyte infectivity in vitro and in vivo, using a model of mucosal exposure in cynomolgus macaques. Macaque semen contains all the target cells for HIV/SIV: CD4+ T cells, macrophages and, in lower proportions, dendritic cells. Semen CD4+ T cells and macrophages display activation, differentiation and migration-marker expression profiles typical of mucosal leukocytes. SIV infection induces significant changes in their phenotype and dynamics.
Both cell types can be productively infected and are found in semen at all stages of infection. These observations suggest that semen CD4+ T cells and macrophages may be able to transmit infection after mucosal exposure. If the role of semen-infected leukocytes in HIV/SIV mucosal transmission is confirmed in vivo, this mechanism will be important to consider in the design of future preventive strategies, such as microbicides
APA, Harvard, Vancouver, ISO, and other styles
39

Brenkman, Albertus Marinus Jan. "'n Konseptuele model vir 'n bestuursinligtingstelsel vir 'n streekkantoor / Albertus Marinus Jan Brenkman." Thesis, Potchefstroom University for Christian Higher Education, 1989. http://hdl.handle.net/10394/8980.

Full text
Abstract:
In the research for this thesis, which concerns a conceptual model for a management information system (MIS) for a regional office, it was found that the regional office can be seen as a typical mini-organisation. The regional office as an organisation needs educational leaders in order to function, and this educational management can be embodied in a pyramid structure comprising top management, middle management and first-line management. Management information is therefore essential for the effective management of a regional office. Effective management in a regional office can only take place when the educational leaders are supported in the execution of their different managerial tasks, which necessitates the timely supply of relevant management information. Research showed that a management information system is a computer-based system that provides information to educational leaders in support of their different management responsibilities. Not only the hardware and software, but also the guidelines and supporting staff, can be classified as components of the management information system. What an educational leader requires from a management information system comprises information on pupils, on college and school educators, and on physical matters. These three categories of information can be compiled into three integrated files in a database for each regional office, and the required management information can be provided by means of reports, graphs and tables. A management information system offers the educational leaders in regional offices a challenging and enriching experience in the management of education. It is therefore advisable that immediate attention be given to the development of management information systems for all regional offices.
Thesis (MEd)--PU vir CHO, 1990
APA, Harvard, Vancouver, ISO, and other styles
40

Pereira, Felipe Rateiro. "Investigação das vibrações induzidas pela emissão de vórtices em modelos reduzidos de riser lançados em catenária." Universidade de São Paulo, 2014. http://www.teses.usp.br/teses/disponiveis/3/3135/tde-04022016-183409/.

Full text
Abstract:
The study of vortex-induced vibrations (VIV) in risers is a topic of great relevance to ocean engineering. In recent years, numerous numerical and experimental results have been presented addressing the fundamentals of this fluid-structure interaction phenomenon, yet without reaching a conclusive and comprehensive understanding. Risers are exposed to the action of ocean currents and, consequently, to VIV. In this context, the present study provides results from an experimental approach to understanding VIV in risers deployed in a catenary configuration, which represents a large share of the geometries adopted in current installations. Performing riser experiments in a towing tank, however, presupposes the use of small-scale models, which is in itself a major research challenge. A methodology for modeling reduced-scale risers was therefore developed, focused on the similarity of key parameters that guarantee dynamic behavior equivalent to that at full scale. VIV experiments were then performed in the towing tank with catenaries designed and built according to this methodology. Techniques such as modal decomposition were used in the analysis of the results, allowing comparison with classical results from the literature and showing that the VIV response of the catenary closely resembles that observed in flexible cylinders. Furthermore, the extensive experimental database presented and discussed in this thesis can contribute to the future validation of mathematical models and numerical codes for VIV prediction in catenary risers, describing in detail the many behaviors embedded in this important engineering problem.
APA, Harvard, Vancouver, ISO, and other styles
41

Benghabrit, Walid. "A formal model for accountability." Thesis, Ecole nationale supérieure Mines-Télécom Atlantique Bretagne Pays de la Loire, 2017. http://www.theses.fr/2017IMTA0043/document.

Full text
Abstract:
Nowadays we are witnessing the democratization of cloud services. As a result, more and more end users (individuals and businesses) are using these services in their daily life. In such scenarios, personal data generally flows between several entities. End users need to be aware of the collection, processing, storage, and retention of their personal data, and must have the means to hold service providers accountable for the use of their data. Accountability designates the fact that a system or a person is responsible for its acts and their consequences. In this thesis we present an accountability framework called Accountability Laboratory (AccLab) that allows accountability to be taken into account from the design phase of a system through to its implementation. In order to reconcile the legal world and the computer science world, we developed a dedicated language called Abstract Accountability Language (AAL) for writing obligations and accountability policies. This language is based on a formal logic, First Order Linear Temporal Logic (FOTL), which makes it possible to check the coherence of accountability policies and the compliance between two policies. These policies are then translated into a distributed temporal logic that we call FO-DTL 3, which is associated with a monitoring technique based on formula rewriting. Finally, we developed a monitoring tool called Accountability Monitoring (AccMon), which provides the means to monitor accountability policies in the context of a real system. These policies are based on FO-DTL 3, and the framework can operate in centralized or distributed mode, both online and offline.
APA, Harvard, Vancouver, ISO, and other styles
42

Adhikari, Kiran. "Verifying a Quantitative Relaxation of Linearizability via Refinement." Thesis, Virginia Tech, 2013. http://hdl.handle.net/10919/23222.

Full text
Abstract:
Concurrent data structures have found increasingly widespread use in both multicore and distributed computing environments, thereby escalating the priority of verifying their correctness. The thread-safe behavior of these concurrent objects is often described using a formal semantics known as linearizability, which requires that every operation on a concurrent object appears to take effect between its invocation and response. Quasi linearizability is a quantitative relaxation of linearizability that allows more implementation freedom for performance optimization. However, ensuring the quantitative aspects of this new correctness condition is an arduous task. We propose the first method for formally verifying quasi linearizability of the implementation model of a concurrent data structure. The method is based on checking the refinement relation between the implementation model and a specification model via explicit-state model checking. It can directly handle multi-threaded programs where each thread can make infinitely many method calls, without requiring the user to manually annotate the linearization points. We have implemented and evaluated our method in the PAT model checking toolkit. Our experiments show that the method is effective in verifying quasi linearizability and in detecting its violations.
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
43

Nunes, Fábio Magalhães. "Análise da correlação entre o Ibovespa e o ativo petr4 : estimação via modelos Garch e modelos aditivos." reponame:Biblioteca Digital de Teses e Dissertações da UFRGS, 2009. http://hdl.handle.net/10183/22664.

Full text
Abstract:
The estimation and forecasting of asset volatility are of great importance to financial markets. Themes such as risk and uncertainty in economic theory have encouraged the search for methods capable of modeling a conditional variance that evolves over time. The main objective of this dissertation was to model the IBOVESPA index and the PETR4 asset via ARCH-GARCH models and additive models, in order to analyze the existence of correlation between the estimated volatilities. In the parametric approach, asset volatility was estimated with EGARCH models; in the nonparametric approach, additive models with 5 lags were used.
APA, Harvard, Vancouver, ISO, and other styles
44

Berveglieri, Adilson [UNESP]. "Classificação fuzzy de vertentes por krigagem e TPS com agregação de regiões via diagrama de Voronoi." Universidade Estadual Paulista (UNESP), 2011. http://hdl.handle.net/11449/88156.

Full text
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Slopes, as inclined surfaces, are geomorphological expressions shaped by natural factors (endogenous and exogenous) and by man himself. Their forms determine the flow or accumulation of water and represent fundamental characteristics for preventing and solving problems associated with relief, such as land use and civil construction, among others. Classifying a slope as concave, convex, or straight allows areas to be identified according to their declivity. Thus, from a regular rectangular grid, the basis of the digital terrain model, a mesh is generated by interpolation with estimator functions: thin-plate spline, which has smoothing characteristics, and kriging, which besides smoothness also considers spatial dependence. The classification is then performed by fuzzy inference based on membership functions that define classes from the calculation of the slope and of the concavity or convexity of the terrain. However, the result of this classification is tied to the resolution of the mesh and does not allow any pointwise correction, since small areas of little significance may be formed and need to be eliminated. In order to adjust the result, the Voronoi diagram, characterized by its coverage and proximity relations, is applied as a tool to aggregate previously classified regions, allowing a local adjustment and making the result more consistent with the study area when compared with the corresponding geomorphological maps.
APA, Harvard, Vancouver, ISO, and other styles
45

Ventrella, Domenico <1987&gt. "The Piglet as Biomedical Model: Physiological Investigations, New Techniques and Future Applications." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2017. http://amsdottorato.unibo.it/7852/1/ventrella_domenico_tesi.pdf.

Full text
Abstract:
An analysis of the literature suggests that the drive for new piglet models of disease has overridden the need for a deeper knowledge of the physiology of the animal. It is well known how difficult it is to interpret data when few or no reference standards are available, and how hard it is to apply techniques borrowed from other models, no matter how similar they may look. The aim of the present work was to gather knowledge and information about the piglet by studying its physiology and by validating new techniques. The experiments can be divided into two categories: physiological investigations, and new techniques and future applications. Blood and cerebrospinal fluid of piglets were analyzed both qualitatively and quantitatively. Thanks to extensive statistical analyses and a large sampled population, these studies provide important reference intervals that will allow a better understanding of several metabolic processes. In the second category, technical experiments were performed to find easier and relatively pain-free procedures for cerebrospinal fluid collection and intrathecal administration in piglets. Operators' skills are often a limiting factor for the feasibility of experimental protocols, and easier techniques are the best way to break down these walls. Moreover, by lowering mortality and improving welfare, such techniques lead to better results and higher ethical standards. The last experiment aimed to create a comprehensive map of CNS transduction upon intrathecal administration of adeno-associated viral vectors in piglets and to evaluate their potential toxicity. The results obtained will help in choosing the right serotype for the targeted cell population, thus avoiding preliminary studies and reducing the number of enrolled animals.
In conclusion, this thesis represents an additional step toward the standardization of the physiological piglet model and its refinement and reduction in experimental protocols.
APA, Harvard, Vancouver, ISO, and other styles
46

Attila, Can. "Identification of Pseudomonas aeruginosa via a poplar tree model." College Station, Tex.: Texas A&M University, 2008. http://hdl.handle.net/1969.1/ETD-TAMU-2326.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Tang, On-yee. "Estimation for generalized linear mixed model via multiple imputations." Click to view the E-thesis via HKUTO, 2005. http://sunzi.lib.hku.hk/hkuto/record/B30687652.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Lu, Qunfang Flora. "Bayesian forecasting of stock prices via the Ohlson model." Link to electronic thesis, 2005. http://www.wpi.edu/Pubs/ETD/Available/etd-050605-155155/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Sakizlis, Vassilis. "Design of of model-based controllers via parametric programming." Thesis, Imperial College London, 2003. http://hdl.handle.net/10044/1/8855.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Yoo, Dae Keun. "Model based wheel slip control via constrained optimal algorithm." RMIT University. Electrical and Computer Engineering, 2006. http://adt.lib.rmit.edu.au/adt/public/adt-VIT20070125.163142.

Full text
Abstract:
In the near future, passenger vehicles will be introduced with a revolutionary brake-by-wire system that replaces all the mechanical linkages and conventional hydraulic brake systems with completely 'dry' electrical components. One of the many potential benefits of a brake-by-wire system is increased braking performance due to the more accurate and continuous operation of the EMB actuators, which opens up more possibilities for control in an antilock brake system. The main focus of this thesis is the application of a model predictive control (MPC) method to devise an antilock brake control system for a brake-by-wire vehicle. Unlike traditional ABS control algorithms, which are based on trial and error, the MPC-based ABS algorithm uses the behavior of the model to optimize the wheel slip dynamics subject to system constraints. The final implementation of the wheel slip controller embraces a decentralized control architecture to independently control the brake torque at each of the four wheels. The performance of the wheel slip controller is validated through software-in-the-loop and hardware-in-the-loop simulation. To support the high computational demands and real-time constraints of the hardware-in-the-loop simulation, a novel multiprocessor real-time simulation system is developed using a reflective memory network and off-the-shelf hardware components.
APA, Harvard, Vancouver, ISO, and other styles
