Dissertations / Theses on the topic 'VIT MODEL'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the top 50 dissertations / theses for your research on the topic 'VIT MODEL.'
Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.
Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.
Rafele, Antonio. "À rebours : motifs du regard et de l'expérience des modes de vie." Paris 5, 2008. http://www.theses.fr/2008PA05H070.
Fashion and lifestyle are a gaze produced by the experience of specific historical processes: the development of the metropolis and the media in the 19th century. The birth of this gaze is connected to the social power accumulated by metropolitan cities and the various media. Their homogeneous structure of historical images is determined by the phenomenon of time acceleration, which removes the linearity and duration of things; attention is focused instead on their birth and death. Things are under continuous construction and destruction, experienced and exhibited. Splendour and nullity constitute at once the mechanism and the goal of a life which, as a by-effect, becomes a simple succession of instants, dead or alive. Immersion, exit and posthumous return constitute the existential and mnemonic process characterizing our relation to things. Different things can therefore only be perceived as short-lived illusions and habituations. Life achieves itself not as vital experience but as experience of death. It is precisely in this sense that lifestyle, defined as a ludic mechanism of creation and destruction of habituations, becomes the key image for experience and for the gaze directed on itself.
Penna, Christiano Modesto. "Crescimento econômico via investimentos em capital: evidências empíricas para o Brasil." Universidade Federal do Ceará, 2007. http://www.teses.ufc.br/tde_busca/arquivo.php?codArquivo=1394.
This work shows that a nonlinear relation exists between the rate of gross fixed capital formation and the rate of economic growth in the Brazilian economy, and that, owing to this nonlinearity, the test of the predictions of the AK and neoclassical models proposed by Jones (1995) becomes inconclusive, since there is room for the predictions of both models. Our econometric model indicates that, theoretically, the marginal productivity of capital changes at a growth rate indicated by the threshold parameter. This change can apparently be explained by a shift in the elasticity of substitution between capital and labor, which suggests a direction for new inquiries. Regarding public policy, it is shown that, however much the rate of gross fixed capital formation is expanded, the economy will at most catch up with the GDP growth of the lower-middle-income economies and with the economic growth of the countries of East Asia and the Pacific. The work also suggests that the amount of resources necessary to complete such catch-ups is of the order of R$ 786 billion.
Pan, Juming. "Adaptive LASSO For Mixed Model Selection via Profile Log-Likelihood." Bowling Green State University / OhioLINK, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=bgsu1466633921.
Metzger, Thomas Anthony. "Detection of Latent Heteroscedasticity and Group-Based Regression Effects in Linear Models via Bayesian Model Selection." Diss., Virginia Tech, 2019. http://hdl.handle.net/10919/93226.
Doctor of Philosophy
Statistical models are a powerful tool for describing a broad range of phenomena in our world. However, many common statistical models may make assumptions that are overly simplistic and fail to account for key trends and patterns in data. Specifically, we search for hidden structures formed by partitioning a dataset into two groups. These two groups may have distinct variability, statistical effects, or other hidden effects that are missed by conventional approaches. We illustrate the ability of our method to detect these patterns through a variety of disciplines and data layouts, and provide software for researchers to implement this approach in practice.
Correia, Fagner Cintra [UNESP]. "The standard model effective field theory: integrating UV models via functional methods." Universidade Estadual Paulista (UNESP), 2017. http://hdl.handle.net/11449/151703.
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
O Modelo Padrão Efetivo é apresentado como um método consistente de parametrizar Física Nova. Os conceitos de Matching e Power Counting são tratados, assim como a Expansão em Derivadas Covariantes introduzida como alternativa à construção do conjunto de operadores efetivos resultante de um modelo UV particular. A técnica de integração funcional é aplicada em casos que incluem o MP com Tripleto de Escalares e diferentes setores do modelo 3-3-1 na presença de Leptons pesados. Finalmente, o coeficiente de Wilson de dimensão-6 gerado a partir da integração de um quark-J pesado é limitado pelos valores recentes do parâmetro obliquo Y.
This work presents the principles behind the use of the Standard Model Effective Field Theory as a consistent method to parametrize New Physics. The concepts of Matching and Power Counting are covered, and a Covariant Derivative Expansion is introduced for the construction of the operator set coming from a particular integrated UV model. The technique is applied in examples including the SM with a new Scalar Triplet and different sectors of the 3-3-1 model in the presence of Heavy Leptons. Finally, the Wilson coefficient of a dimension-6 operator generated from the integration of a heavy J-quark is compared with the measurements of the oblique Y parameter.
CNPq: 142492/2013-2
CAPES: 88881.132498/2016-01
Rashid, Zhiar, and Hawar Rustum. "Kvalitetssäkring vid leverans av IFC-modell : Vid export från Tekla Structures och Revit." Thesis, Högskolan i Borås, Akademin för textil, teknik och ekonomi, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:hb:diva-22306.
The use of different BIM tools is becoming increasingly frequent in the construction sector. Information exchange of 3D models between a project's various actors takes place continuously during the design phase. Because different actors use different software, an open file format is needed for the delivery of 3D models. IFC is a neutral file format that enables information exchange between different disciplines and coordination of BIM models. This thesis is a preliminary study, both qualitative and quantitative, with the aim of developing an internal working method for quality assurance of IFC model delivery. The qualitative part consists of interviews, conducted to understand what the employees at WSP Byggprojektering Göteborg think about different programs and about the management of IFC. They gave the authors a deeper insight into the benefits and challenges the respondents encounter, which are highlighted in the results section. The degree project also includes a practical part: models are built in Tekla Structures and Revit and then analyzed with the help of the review programs Solibri Model Checker and Navisworks Manage. Two review methods are created in this thesis, a visual review and an Excel report. The visual method involves reviewing the IFC model visually in the same review program as the coordinator; the second method is a prototype that compares different reports from the respective software. The conclusions of this report are as follows: • The human factor matters in modeling, as does the use of correct tools when exporting to IFC. • The IFC format, and the ability of the various programs to interpret the IFC information, may in some cases fail for non-standard geometries. • Review programs can interpret the IFC model differently.
Junior, Valdivino Vargas. "Modelagem de epidemias via sistemas de partículas interagentes." Universidade de São Paulo, 2010. http://www.teses.usp.br/teses/disponiveis/45/45133/tde-27052013-085717/.
We study a discrete-time particle system with the following dynamic. At time zero, on each non-negative integer there is a particle, initially inactive. The particle placed at the origin is activated and instantly activates a contiguous random set of particles to its right. As a rule, from the moment it is activated, each active particle behaves in the same way, independently of the rest. We say that the process survives if the number of particles activated along the process is infinite. We call this the Firework process, associating the activation dynamic of a particle with an infection or explosion process. Our interest is to establish whether the process has positive probability of survival and to present bounds for this probability. This is done according to the distribution of the random variable that defines the radius of infection of each active particle; associating the activation process with an infection, we regard this model as an epidemic model. We also consider some variations of this dynamic, among them variants with particles distributed over the half-line (under conditions on the distances between consecutive particles) and with particles distributed over the vertices of a tree. We study phase transitions and the corresponding survival probability. In these variants the results depend on the sequence of probability distributions for the range of the explosions and on the particle displacements. We also consider a variation in which each particle, after being activated, remains active during a random time period, emitting explosions that occur at random moments.
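The Firework dynamic described in the abstract is easy to explore by simulation. The sketch below is illustrative only, not the author's model: it truncates the line to a finite number of sites and treats activation reaching the right edge as "survival", a finite-size proxy for the infinite activation the thesis studies.

```python
import random

def firework_survives(radius_sampler, n_particles=2000):
    """One run of a truncated Firework-style process on sites 0..n_particles-1.

    Site 0 starts active; an active particle at site i draws a radius R
    and activates every not-yet-activated site in (i, i + R].  We declare
    'survival' when activation reaches the right edge of the truncation.
    """
    frontier = 0      # rightmost activated site so far
    stack = [0]       # activated sites whose explosion is still pending
    while stack:
        i = stack.pop()
        r = radius_sampler()
        if i + r >= n_particles:
            return True
        for j in range(frontier + 1, i + r + 1):
            stack.append(j)
        frontier = max(frontier, i + r)
    return False

def survival_probability(radius_sampler, runs=500):
    """Monte Carlo estimate of the survival probability."""
    return sum(firework_survives(radius_sampler) for _ in range(runs)) / runs

# Example (stochastic): radii uniform on {0, 1, 2, 3}
# p = survival_probability(lambda: random.randint(0, 3))
```

With a constant radius of at least 1 the truncated process always reaches the edge, and with radius 0 it dies immediately; intermediate distributions give the nontrivial survival probabilities the thesis bounds.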
Kuoljok, Simon Pirak. "Konnektivitetsåtgärder i Emån : En fallstudie vid Högsby vattenkraftverk." Thesis, KTH, Hållbar utveckling, miljövetenskap och teknik, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-298253.
In 2020 the national plan for hydropower plants was implemented in Sweden. The national plan aims to raise the environmental standard of hydropower plants for the benefit of the water body while maintaining efficient access to electricity from hydropower. Emån is the largest river in south-eastern Sweden. The river is a Natura 2000 area and is classified as moderate with respect to connectivity according to the environmental norm for rivers and water bodies. Högsby hydropower plant, located along Emån, is owned by Uniper and is included in the national plan, which means that it needs to meet the environmental standard. Högsby hydropower plant currently has no established minimum discharge. The aim of the study is to investigate a minimum discharge for the natural dry stream located in Högsby, to enable higher connectivity for the fish species that have historically been able to migrate through Högsby. The natural stream is downstream of the spillway dam in Högsby. Within the stream, a weir is considered the most difficult passage for fish migration, with the exception of the spillway dam. The method used in the report is a one-dimensional steady-flow model of the natural stream in HEC-RAS, used to investigate the possibility of fish migration past the hydropower plant. Two scenarios have been considered: with and without the weir present. Based on the results from the model, the highest mean water velocity occurs in the section below the weir for both scenarios, and the same applies to the mean water depth. Implementation of a minimum discharge and various fishways in the natural stream are the connectivity measures discussed, measures that would also maintain the rapids that have existed in the stream. The fishways discussed depend on whether the weir is present or not. The fish species that with great certainty have historically been able to migrate through Högsby are sea trout and salmon.
Högsby hydropower plant has the highest classified regulation of the hydropower plants in Emån, which needs to be considered when determining a minimum discharge. There are a few uncertainties in the hydraulic model, such as the bathymetry and the boundary conditions. The model is not validated, which means that the water depth and the velocity might not reflect the true values. It is not certain which fish species have migrated through Högsby historically, nor can an established minimum discharge be guaranteed to enable fish migration. However, an implementation of a minimum discharge and the fishways that have been discussed will increase the connectivity in Högsby.
Voigt, Rianne Ruth. "Model vir ervaringsleer in voedseldiensbestuur." Thesis, Cape Technikon, 1999. http://hdl.handle.net/20.500.11838/1887.
The food service management industry is aware of the importance of an effective experiential learning programme. The technikons try to meet the demands of the food service management industry through the involvement of mentors and students. This study undertakes to examine the deficiencies and problems concerning the experiential learning components of the food service management course, as it is evident that these form an integral part of the food service management diploma. Although food service managers are aware of the experiential learning programme, no formal information, such as an employers'/mentors' manual, orientation on the procedures of the technikons or industry, comprehensive evaluation procedures, coordinators' visits and debriefing procedures, is explained or sent to the industry or student. The researcher's involvement with experiential learning, and specifically with the administration of the experiential learning programme, led to the hypothesis that there is a need for a formally structured experiential learning programme to clear up the confusion and uncertainty regarding the involvement of the student and employer/mentor. As a starting point, a situation analysis was done to compare South African perspectives on experiential learning. This information was gathered to place the food service management experiential learning programme in perspective for technikons, students and the food service management industry. The second section deals with a comparative study of the international procedures of experiential learning as currently practised in the United States of America (USA), the United Kingdom (UK) and South Africa (SA). What follows is a comparative study testing the literature against reality as experienced in tertiary institutions in the USA, UK and SA; details are expounded in chapter 4 of this study. Findings are reported, and recommendations regarding the application of an efficient, productive, structured experiential learning programme for the course Food Service Management are noted in chapter 5. In the last section an explanation is given of the following proposed manuals: • Tutorial Manual • Student Manual • Employer/Mentor Manual • Coordinator Manual during student visits • Debriefing Manual. These proposed manuals should be used to satisfy the needs of all involved: employers, technikons and students.
Liu, Shan. "Model checking in Tobit regression model via nonparametric smoothing." Kansas State University, 2012. http://hdl.handle.net/2097/13790.
Department of Statistics
Weixing Song
A nonparametric lack-of-fit test is proposed to check the adequacy of the presumed parametric form of the regression function in Tobit regression models, by applying Zheng's device with weighted residuals. It is shown that testing the null hypothesis for the standard Tobit regression model is equivalent to testing a new null hypothesis for the classic regression model. An optimal weight function is identified to maximize the local power of the test. The proposed test statistic is shown to be asymptotically normal under the null hypothesis, consistent against some fixed alternatives, and to have nontrivial power against some local nonparametric alternatives. The finite-sample performance of the proposed test is assessed by Monte Carlo simulations. An empirical study is conducted based on data from the University of Michigan Panel Study of Income Dynamics for the year 1975.
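Zheng's device mentioned in the abstract checks whether residuals from the parametric fit are locally correlated. The following is a minimal, unweighted sketch of one common standardization of Zheng's (1996) kernel statistic for ordinary regression, not the thesis's weighted Tobit version; the bandwidth and kernel choices are illustrative.

```python
import numpy as np

def zheng_test_statistic(x, resid, h):
    """Standardized Zheng-type lack-of-fit statistic.

    x     : 1-D array of covariates
    resid : residuals from the fitted parametric regression
    h     : kernel bandwidth

    Large positive values indicate lack of fit; under the null the
    statistic is asymptotically standard normal.
    """
    n = len(x)
    u = (x[:, None] - x[None, :]) / h
    K = np.exp(-0.5 * u ** 2) / np.sqrt(2.0 * np.pi)   # Gaussian kernel
    np.fill_diagonal(K, 0.0)                            # drop i == j terms
    V = resid @ K @ resid / (n * (n - 1) * h)           # degenerate U-statistic
    S2 = 2.0 * (resid**2 @ K**2 @ resid**2) / (n * (n - 1) * h)
    return n * np.sqrt(h) * V / np.sqrt(S2)
```

Fitting a straight line to data generated from a quadratic, for example, leaves spatially correlated residuals and pushes the statistic far into the right tail, while a correctly specified fit keeps it near zero.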
GOMES, Adriano José Oliveira. "Systematic model-based safety assessment via probabilistic model checking." Universidade Federal de Pernambuco, 2010. https://repositorio.ufpe.br/handle/123456789/2651.
Fundação de Amparo à Ciência e Tecnologia do Estado de Pernambuco (FACEPE)
Safety assessment is a well-known process used to guarantee that the safety constraints of a critical system are met. Within it, quantitative safety assessment deals with these constraints in a numerical (probabilistic) context. Safety analysis methods such as traditional Fault Tree Analysis (FTA) are used in the quantitative safety assessment process, following certification guidelines (for example, ARP4761, the aviation Recommended Practice guide). However, this method is generally costly and demands much time and effort to validate a system as a whole, since on average up to 10,000 fault trees end up being built for one aircraft, and since it depends heavily on human skill to cope with the time constraints that restrict the scope and level of detail the analysis and its results can reach. On the other hand, the certification authorities also allow the use of Markov analysis; although Markov models are more powerful than fault trees, industry rarely adopts them because they are more complex and harder to handle. Hence FTA has been widely used in this process, mainly because it is conceptually simpler and easier to understand. As the complexity and time-to-market of systems increase, the interest in addressing safety issues during the early design phases, rather than in the intermediate or final phases, has made the adoption of model-based designs, tools and techniques commonplace; Simulink is the de facto example currently used in the aeronautical industry. Even in this scenario, however, current solutions follow what engineers already did before.
On the other hand, formal methods, that is, languages, tools and methods based on logic and discrete mathematics that do not follow traditional engineering approaches, can provide innovative low-cost solutions for engineers. This dissertation defines a strategy for quantitative safety assessment based on Markov analysis. Instead of dealing with Markov models directly, however, we use the formal language Prism (a Prism specification is semantically interpreted as a Markov model). Moreover, this Prism specification is extracted systematically from a high-level model (Simulink diagrams annotated with the system's fault logic) through the application of translation rules. The quantitative verification of the system's safety requirements is carried out using the Prism model checker, in which the safety requirements become probabilistic temporal-logic formulas. The immediate goal of our work is to avoid the effort of creating several fault trees until a safety requirement is found to be violated. Prism does not build fault trees to reach this result; it simply checks at once whether a safety requirement is satisfied or not in the whole model. Finally, our strategy is illustrated with a simple, yet representative, pilot project designed by Embraer.
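Prism has its own modeling language; as a language-agnostic illustration of the kind of probabilistic query such a strategy answers, here is a minimal sketch of a bounded-reachability computation on a discrete-time Markov chain. The three states and transition probabilities are invented for illustration, not taken from the dissertation's case study.

```python
import numpy as np

# Toy 3-state discrete-time Markov chain (illustrative numbers only):
# 0 = nominal, 1 = degraded, 2 = failed (absorbing).
P = np.array([
    [0.995, 0.004, 0.001],
    [0.000, 0.980, 0.020],
    [0.000, 0.000, 1.000],
])

def prob_failure_within(P, start, fail, steps):
    """Bounded-reachability probability, i.e. a PRISM-style query of the
    form P=? [ F<=steps "fail" ].  Since the failure state is absorbing,
    the probability mass on it after `steps` transitions equals the
    probability of having reached it within `steps` steps."""
    dist = np.zeros(P.shape[0])
    dist[start] = 1.0
    for _ in range(steps):
        dist = dist @ P
    return float(dist[fail])

# e.g. probability of failure within 1000 steps from the nominal state:
# prob_failure_within(P, 0, 2, 1000)
```

A model checker evaluates such a query over the whole state space at once, which is exactly why no per-requirement fault tree needs to be built.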
Almeida, Priscila Todero de 1988. "Estudo do tunelamento da magnetização em magnetos moleculares de Mn 12 via q-histerons." [s.n.], 2013. http://repositorio.unicamp.br/jspui/handle/REPOSIP/278165.
Dissertação (mestrado) - Universidade Estadual de Campinas, Instituto de Física Gleb Wataghin
Resumo: O principal objetivo desse trabalho consiste em uma nova abordagem para o tunelamento da magnetização do magneto molecular Mn12, embasado em uma ampliação do modelo de Preisach. Introduziremos novos operadores que levam em conta a possibilidade de efeitos quânticos. Implementamos esse novo modelo num programa de simulação que é capaz de simular curvas de histerese e curvas de relaxação magnética sem recorrer à resolução de hamiltonianas de spin. Além disso, este programa utiliza simulação estocástica, apresentando os resultados em poucos minutos. Os resultados obtidos concordam com os experimentos realizados de histerese e relaxação magnética. Apesar de ser um modelo de simulação simples, reproduz adequadamente a fenomenologia, pois introduz os dois ingredientes essenciais de um sistema com inversão da magnetização por efeito túnel termicamente ativado: a ativação térmica, descrita pela ocupação de níveis segundo a distribuição de Boltzmann, e a possibilidade do efeito túnel, descrita pelo modelo de Landau-Zener. A consistência física do modelo é estudada através da variação sistemática de parâmetros do modelo.
Abstract: The main objective of this work is a new approach to the tunneling of the magnetization of the molecular magnet Mn12, based on an extension of the Preisach model. We introduce new operators that take into account the possibility of quantum effects. We have implemented this new model in simulation software capable of simulating hysteresis curves and magnetic relaxation curves without resorting to the resolution of spin Hamiltonians. Moreover, this program uses stochastic simulation, presenting the results in only a few minutes. The results obtained agree with the hysteresis and relaxation experiments. Despite being a simple simulation model, it adequately reproduces the phenomenology, because it introduces the two key ingredients of a system with inversion of magnetization by thermally activated tunneling: thermal activation, described by the occupation of levels according to the Boltzmann distribution, and the possibility of tunneling, described by the Landau-Zener model. The physical consistency of the model is studied by systematically varying the model's parameters.
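The Landau-Zener ingredient mentioned in the abstract has a closed form that is easy to experiment with. A minimal sketch in one common convention (illustrative only, not the author's code; units are arbitrary with ħ = 1):

```python
import math

def landau_zener_flip_probability(delta, sweep_rate, hbar=1.0):
    """Probability that the spin reverses (adiabatic transition) at an
    avoided level crossing, in one common convention of the
    Landau-Zener formula:

        P_flip = 1 - exp(-2 * pi * delta**2 / (hbar * |v|))

    delta      : tunnel splitting (off-diagonal coupling)
    sweep_rate : v, rate at which the diabatic energy difference is swept
    """
    return 1.0 - math.exp(-2.0 * math.pi * delta ** 2 / (hbar * abs(sweep_rate)))
```

Slow sweeps (small v) give near-certain reversal, fast sweeps almost none, which is the behaviour such a simulation combines with Boltzmann level occupation.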
Mestrado
Física
Mestra em Física
Victoria, Daniel de Castro. "Simulação hidrológica de bacias amazônicas utilizando o modelo de Capacidade de Infiltração Variável (VIC)." Universidade de São Paulo, 2010. http://www.teses.usp.br/teses/disponiveis/64/64135/tde-25032010-095240/.
The Amazon river basin is the largest fluvial system in the world, discharging 209,000 m3 s-1 to the ocean, and it sustains the largest continuous tropical forest system. However, the region is under constant pressure from deforestation and climate change. For such reasons, it is crucial to understand how its hydrological cycle functions, and modeling tools can be used to evaluate future scenarios and guide decision making. The Variable Infiltration Capacity (VIC) model was evaluated and adapted to tropical conditions. Temperature, precipitation, wind speed, soil type and land cover maps were used to simulate the hydrological cycle in sub-basins of the Amazon (Santo Antônio do Içá, Japurá, Juruá, Negro, Madeira and Purus), covering the period from 1980 to 2006. The simulation was not possible for basins with a large drainage area located in the Andes (Santo Antônio do Içá and Japurá), owing to underestimation of the precipitation. For the other basins, simulated discharge agreed with observed records, even though evapotranspiration (ET) estimates showed some problems. The partitioning of ET into its components, transpiration and canopy evaporation, showed severe discrepancies. A correction applied to the model fixed the partitioning problem but resulted in a reduction of estimated ET. A new version of the model (v.4.1) has just been released, with changes in the way ET is estimated; however, this new version has not yet been tested in the Amazon.
Gustavsson, Sam, and Filip Johansson. "Kvalitetssäkring av objektsbaserade IFC-filer för mängdavtagningar och kostnadskalkyler : En fallstudie av levererade konstruktionsmodeller, vid Tyréns AB Stockholm." Thesis, KTH, Byggteknik och design, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-127029.
Performing quantity takeoffs and cost estimates based on 3D models is problematic if the models do not meet the expected information requirements. The weaknesses of the models lead to uncertainty when the calculator establishes cost estimates, which forms the background to the issues this thesis addresses. The project Biomedicum is conducted as a full-fledged BIM project, where the object-based 3D models are the basis for cost estimates in the system phase. The calculator in the project has expressed dissatisfaction with the quality of the models and believes that there are major flaws that increase the risk of uncertain calculations. For this reason, this thesis analyzes the possibilities of ensuring the quality of an object-based 3D model in IFC format by using the software Solibri Model Checker. The thesis consists of a literature review, interviews and a case study. The case study examines whether Solibri Model Checker is capable of checking the models against the project-specific requirements, and analyzes the results of the checks carried out. The interviews have been the basis for the problems considered relevant for the thesis: problems that can be traced to legal difficulties, the lack of a common set of requirements, disagreements about the content of an object-based 3D model, and so on. The case study has shown that the 3D models are inadequate for cost estimating, which leads to difficulties and uncertainties in the calculation process. There are several reasons for this: the designers of the project have used a modeling technique unsuited to quantity takeoff and cost estimating, and, as emerged during the interviews, there are no clear guidelines as to what information should be included in the objects at the system-document stage.
It also emerged during the interviews that a crucial problem is that the models are of poor quality because quality has not been demanded early enough in the process. What is important for a working process, in which the object-based 3D models are used for quantity takeoffs and cost estimates, is to determine in the early stages what information is needed in the objects for credible estimates to be possible. A clear IT manual is also of great importance, as it is the manual that serves as the basis for the information to be included in the models.
Cueva, Marcos Salles. "Estudo de VIM - Movimentos Induzidos por Vórtices em plataformas do tipo monocoluna: abordagem numérico-experimental e impactos no sistema de amarração." Universidade de São Paulo, 2011. http://www.teses.usp.br/teses/disponiveis/3/3135/tde-31052011-162748/.
Vortex-induced vibration (VIV) is a main subject studied in offshore engineering, its name adapted to VIM (vortex-induced motion) after its occurrence in Spar platforms in the Gulf of Mexico. When the first monocolumn unit, the Sevan Piranema, started operating off Northeast Brazil in 2007, the occurrence of VIM in cylindrical units of this kind was an open question, and the mooring lines had been designed without this load. First, a detailed study of the work developed in this field was performed; since few publications about VIM in monocolumns were found, the aim became its characterization in terms of oscillation amplitude and period through model tests, given the difficulty of obtaining reliable results by analytical or numerical approaches. Later, with the objective of verifying the impact of VIM on extreme loads and, mainly, on fatigue damage, analyses with two different methods found in the literature were performed: the first applies VIM as a prescribed motion in the numerical model, based on the experimental results, and the second applies the observed forces. For these studies the mooring system for a typical scenario was analyzed under wave, wind and current environmental forces. Finally, given the importance observed in the results, the relation of VIM to external factors such as the spanwise ratio, the presence of waves and the viscous damping was discussed, and for this last factor a comparison of the numerical model results with the experimental ones is presented.
Hamoud, Bassam, and Subeyr Yassin. "Användning av BIM-modeller vid kommunikation mellan olika parter." Thesis, KTH, Byggteknik och design, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-123772.
The Swedish Transport Administration wants to find out how to implement BIM (Building Information Modeling) as a way of working in its organization during design, construction and management. The purpose of this study is to find out how BIM models are used in communication today and how this can be developed. Few people in the construction industry today have experience of working with BIM, and even in projects that work with BIM there is always a need for a BIM coordinator who holds the strings. To get a good end result the communication within the project must be good. A condition for this is that all the involved parties have the same goals for the project, such as keeping to the schedule. Those who lack experience of working with BIM models must be given a chance to understand them, in order to be able to share their experience and knowledge with others. Knowledge must be enhanced, and this will take time. It is important not to focus too much on technology, because BIM is about more than just technology. First and foremost one must be clear about one's goals in working with BIM; from these one can then define a process and ultimately decide which technology is best suited for that process.
Himberg, Benjamin Evert. "Accelerating Quantum Monte Carlo via Graphics Processing Units." ScholarWorks @ UVM, 2017. http://scholarworks.uvm.edu/graddis/728.
Full textCsonka, Kamilla. "Användaracceptans vid systemimplementering." Thesis, Linnéuniversitetet, Institutionen för datavetenskap, fysik och matematik, DFM, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-20389.
Full textUser acceptance is an important part of all system development projects and is also a very easily influenced variable for a successful implementation. That is why it is important to have an overview of the factors that can influence user acceptance negatively, one of them being delay. This thesis examines the connection between a delayed project and its influence on user acceptance. By letting a study group answer a survey based on the Technology Acceptance Model, I have gathered the generalized opinion of the group. The results show that delay as a variable does not influence user acceptance in this case study.
Nassib, Ali Hussein. "Isar Target Reconstruction Via Dipole Modeling." University of Dayton / OhioLINK, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=dayton1461250663.
Full textScibilia, Francesco. "Explicit Model Predictive Control:Solutions Via Computational Geometry." Doctoral thesis, Norges teknisk-naturvitenskapelige universitet, Institutt for teknisk kybernetikk, 2010. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-11627.
Full textAlexander, Louis Cadmon. "PROSIM VII an enhanced production simulation model." Ohio : Ohio University, 1992. http://www.ohiolink.edu/etd/view.cgi?ohiou1171474745.
Full textLee, Christina (Christina Esther). "Latent variable model estimation via collaborative filtering." Thesis, Massachusetts Institute of Technology, 2017. http://hdl.handle.net/1721.1/113932.
Full textThis electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (pages 255-264).
Similarity based collaborative filtering for matrix completion is a popular heuristic that has been used widely across industry over the past decades to build recommendation systems, due to its simplicity and scalability. However, despite its popularity, there has been little theoretical foundation explaining its widespread success. In this thesis, we prove theoretical guarantees for collaborative filtering under a nonparametric latent variable model, which arises from the natural property of "exchangeability", i.e. invariance under relabeling of the dataset. The analysis suggests that similarity based collaborative filtering can be viewed as kernel regression for latent variable models, where the features are not directly observed and the kernel must be estimated from the data. In addition, while classical collaborative filtering typically requires a dense dataset, this thesis proposes a new collaborative filtering algorithm which compares larger radius neighborhoods of data to compute similarities, and shows that the estimate converges even for very sparse datasets, which has implications for sparse graphon estimation. The algorithms can be applied in a variety of settings, such as recommendations for online markets, analysis of social networks, or denoising crowdsourced labels.
by Christina E. Lee.
Ph. D.
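The similarity-based heuristic the abstract analyzes can be illustrated with a minimal sketch: predict a missing rating as a similarity-weighted average over other users, with similarity computed from co-rated items (a kernel-regression-style estimate). The tiny rating matrix and all names below are illustrative, not taken from the thesis.

```python
# Minimal sketch of similarity-based collaborative filtering: cosine
# similarity over co-rated items, then a similarity-weighted average
# of the ratings other users gave to the target item.
import math

R = {  # user -> {item: rating}; missing entries are simply absent
    "u1": {"a": 5, "b": 3, "c": 4},
    "u2": {"a": 4, "b": 2, "c": 5},
    "u3": {"a": 1, "b": 5},
}

def similarity(u, v):
    """Cosine similarity over the items both users rated."""
    common = set(R[u]) & set(R[v])
    if not common:
        return 0.0
    num = sum(R[u][i] * R[v][i] for i in common)
    den = (math.sqrt(sum(R[u][i] ** 2 for i in common))
           * math.sqrt(sum(R[v][i] ** 2 for i in common)))
    return num / den

def predict(u, item):
    """Similarity-weighted average (kernel-regression flavor) of the
    ratings that other users gave to `item`."""
    pairs = [(similarity(u, v), R[v][item])
             for v in R if v != u and item in R[v]]
    w = sum(s for s, _ in pairs)
    return sum(s * r for s, r in pairs) / w if w else None

print(predict("u3", "c"))  # a value between u1's 4 and u2's 5
```

The thesis's contribution concerns guarantees for this kind of estimator under a latent variable model, and a variant that compares larger-radius neighborhoods for sparse data; this sketch only shows the classical dense-data heuristic.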
Zhou, Yan. "Bayesian model comparison via sequential Monte Carlo." Thesis, University of Warwick, 2014. http://wrap.warwick.ac.uk/62064/.
Full textWu, Si Chen. "Research on legal issues of VIE model." Thesis, University of Macau, 2016. http://umaclib3.umac.mo/record=b3525479.
Full textHaapala, O. (Olli). "Application software development via model based design." Master's thesis, University of Oulu, 2015. http://urn.fi/URN:NBN:fi:oulu-201504021268.
Full textThe purpose of this thesis was to study the use of MathWorks' Simulink® software in model-based software development and its suitability for programming the Vacon 100 frequency converter. The goal was to identify all the problem areas affecting everyday use of the method and to produce a report on how a Vacon 100 compatible application can be created with it. Before the work began there was no precise knowledge of the method's suitability, but the practical software development examples carried out during the work quickly demonstrated that the method works. The problem areas anticipated before the work turned out to be largely unfounded. The only permanent issue that emerged is that the Vacon 100 supports only 32-bit floating-point numbers, whereas Simulink uses 64-bit floating point by default. Although this data type issue prevents the use of a few Simulink blocks, it does not limit the use of the method, and no problem arose during the work that would have prevented the use of model-based design with the Vacon 100. Simulink's code generation tool, Simulink PLC Coder, played an important role in the study. Overall the code generator performed beyond expectations, but the main problems limiting the use of model-based design are related to the operation of PLC Coder. In summary, model-based software development is a highly usable method today, although its benefits may not become apparent on a small scale, or before a company has built up a database of models and blocks related to its own product. In the future, as the demands on design work grow, the importance of model-based software development will increase. Well implemented, the method considerably improves design outcomes in terms of cost, time and quality.
Xu, Ruoxi. "Regression Model Stochastic Search via Local Orthogonalization." The Ohio State University, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=osu1322589253.
Full textMarais, Susan Maria. "'N Generiese model vir 'n effektiewe mentorskapprogram." Thesis, Pretoria : [s.n.], 2001. http://upetd.up.ac.za/thesis/available/etd-08272003-101859.
Full textFilho, Nicolino Trompieri. "ANÃLISE DOS RESULTADOS DA AVALIAÃÃO DO SAEB/2003 VIA REGRESSÃO LINEAR MÃLTIPLA." Universidade Federal do CearÃ, 2007. http://www.teses.ufc.br/tde_busca/arquivo.php?codArquivo=1993.
Full textO presente estudo teve como objetivo identificar, nos resultados do SAEB/2003, no Ceará, um modelo multilinear com variáveis que garantam o maior grau de explicação do rendimento em matemática nos testes aplicados nas amostras de 4ª série da escola pública, 4ª série da escola particular, 8ª série da escola pública e 8ª série da escola particular. Tomando-se, entre as variáveis do diretor, do professor, do aluno e das condições físicas da escola, as variáveis que se relacionavam significativamente com o rendimento no teste de matemática em cada uma das amostras, realizou-se, com os recursos do software SPSS (Statistical Package for the Social Sciences), uma regressão linear em cada amostra, com o método "stepwise". Obtiveram-se quatro modelos lineares múltiplos com variáveis que permitem um maior grau de explicação do rendimento no teste de matemática em cada uma das amostras. Conclui-se que a análise desse tipo permite a formulação de políticas públicas capazes de superar a busca da melhoria do ensino por intervenções pontuais.
This study had as its objective the identification, in the results of SAEB/2003 here in Ceará, of a multilinear model with variables that give a greater degree of explanation of the mathematics results of tests given to random samples of fourth-year students in public schools, fourth-year students in private schools, eighth-year students in public schools and eighth-year students in private schools. The following variables were taken into account: the director, the teacher, the student and the physical state of the school, as variables with significant relevance to the mathematics test results in each of the samples. Using the software SPSS (Statistical Package for the Social Sciences), a linear regression was fitted to each sample with the "stepwise" method. Four multilinear models were obtained with variables that permit a greater degree of explanation of the mathematics test results in each sample. The study concluded that an analysis of this type permits the formulation of public policies capable of going beyond the pursuit of better teaching through one-off interventions.
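The "stepwise" procedure the abstract mentions (there run in SPSS) can be sketched as forward selection: at each step, add the candidate predictor that most reduces the residual sum of squares of an ordinary least-squares fit, stopping when no candidate helps. The toy data and all names below are illustrative, not the SAEB/2003 data.

```python
# Hedged sketch of forward stepwise variable selection for multiple
# linear regression, using OLS solved via the normal equations.

def ols_rss(X, y):
    """Fit y ~ X (with intercept) by Gauss-Jordan on the normal
    equations; return the residual sum of squares."""
    Z = [[1.0] + row for row in X]          # prepend intercept column
    k = len(Z[0])
    # Augmented normal equations: rows of [Z'Z | Z'y]
    A = [[sum(Z[r][i] * Z[r][j] for r in range(len(Z))) for j in range(k)]
         + [sum(Z[r][i] * y[r] for r in range(len(Z)))] for i in range(k)]
    for c in range(k):                      # elimination with partial pivoting
        p = max(range(c, k), key=lambda r: abs(A[r][c]))
        A[c], A[p] = A[p], A[c]
        for r in range(k):
            if r != c and A[c][c]:
                f = A[r][c] / A[c][c]
                A[r] = [a - f * b for a, b in zip(A[r], A[c])]
    b = [A[i][k] / A[i][i] for i in range(k)]
    fitted = [sum(bj * zj for bj, zj in zip(b, row)) for row in Z]
    return sum((yi - fi) ** 2 for yi, fi in zip(y, fitted))

def forward_stepwise(cols, y, min_gain=1e-6):
    """Greedily add columns (name -> list of values) while RSS drops."""
    chosen, rss = [], ols_rss([[] for _ in y], y)   # start: intercept only
    while True:
        best = None
        for name in cols:
            if name in chosen:
                continue
            X = [[cols[c][i] for c in chosen + [name]] for i in range(len(y))]
            r = ols_rss(X, y)
            if best is None or r < best[1]:
                best = (name, r)
        if best and rss - best[1] > min_gain:
            chosen.append(best[0])
            rss = best[1]
        else:
            return chosen, rss

cols = {"x1": [1, 2, 3, 4, 5], "x2": [5, 3, 4, 1, 2]}
y = [3, 5, 7, 9, 11]                        # exactly 2*x1 + 1
print(forward_stepwise(cols, y)[0])         # only x1 is selected
```

SPSS's stepwise procedure additionally uses F-to-enter/F-to-remove significance tests and can drop previously added variables; this sketch keeps only the core forward-selection idea.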
Campos, Josmar Furtado de. "Eficiência da análise estatística espacial na classificação de famílias do feijoeiro - estudo via simulação." Universidade Federal de Viçosa, 2011. http://locus.ufv.br/handle/123456789/4039.
Full textThe aim of this study was to evaluate the efficiency of spatial analysis, which considers spatially dependent errors, for the classification of common bean families, in relation to the traditional randomized block and lattice analyses, which assume independent errors. Different degrees of spatial dependence and experimental precision were considered. The results of seven trials installed in a simple square lattice for the genetic evaluation of grain yield (g/plot) of bean families and cultivars from the 2007 and 2008 winter and wet-season crops were taken as a reference for the simulation. From the simulation results, it was possible to assess the quality of the respective experiments based on the different analyses (Block, Lattice and Spatial) and on the simulated averages of 100 families under different scenarios of Spatial Dependence (SD) and Selective Accuracy (SA). In the simulation process, the average yield (645 g/plot) and the residual variance (7744.00) were defined based on the results of block analyses of trials from the UFV bean breeding program. To compose the simulated scenarios, four magnitudes of spatial dependence were considered (null, low, medium and high), corresponding to ranges of 0, 25, 50 and 100% of the maximum distance between plots. Three classes of selective accuracy (0.95, 0.80 and 0.60) were also simulated, corresponding to very high, high and medium experimental precision, respectively. The actual classification of the families was used to evaluate the efficiency of the tested analysis methods, through the Spearman correlation applied to the genotypic classification orders and through the Selection Efficiency between the classifications based on the tested methodologies and the actual classification, for the selection of the best 10, 20 and 30% of the families. To compare the goodness of fit of the tested models, the Akaike information criterion (AIC), based on likelihood, was used. 
Spatial analysis provided estimates of residual variance very close to the simulated residual variance, and the highest estimated selective accuracy in all scenarios, indicating greater experimental precision. With the reduction in selective accuracy and the increase in spatial dependence, the type of analysis had a greater influence on the classification of families, and the spatial analysis showed the best results, providing more efficient selection of bean families than the traditional randomized block and lattice analyses, mainly for the selection of fewer families. The results for selective accuracy estimated from the F statistic were very close to those obtained with the Spearman correlation between estimated and simulated family averages, indicating that selective accuracy should be used as a measure of experimental precision in genetic evaluation trials.
O objetivo deste trabalho foi avaliar a eficiência da análise Espacial, que considera erros dependentes espacialmente, para classificação de famílias de feijoeiro em relação às análises tradicionais em blocos casualizados e em látice que assumem erros independentes. Considerou-se diferentes graus de dependência espacial e de precisão experimental. Foram tomados como referência para simulação os resultados de sete ensaios instalados em látice quadrado simples para avaliação genética da produtividade de grãos (g/parcela) de famílias e cultivares de feijoeiro das safras de inverno e das águas de 2007 e 2008. A partir dos resultados apresentados na simulação, foi possível avaliar a qualidade dos respectivos experimentos com base nas diferentes análises (Bloco, Látice e Espacial) e médias simuladas das 100 famílias nos diferentes cenários para Dependência Espacial (DE) e Acurácia Seletiva (AS). No processo de simulação, a média de produção (645 g/parcela), bem como a variância residual (7744,00), foi definida com base nos resultados de análises em blocos de ensaios do programa de melhoramento do feijoeiro da UFV. Para a composição dos cenários simulados foram consideradas quatro magnitudes de dependência espacial (nula, baixa, média e alta), correspondendo aos alcances 0, 25, 50 e 100% da distância máxima entre parcelas. Também foram simuladas três classes de acurácia seletiva (0,95, 0,80 e 0,60), correspondente a precisão experimental muito alta, alta e média, respectivamente. A classificação real das famílias foi utilizada para avaliar a eficiência das metodologias de análise testadas através da correlação de Spearman aplicada às ordens de classificação genotípica e da Eficiência de Seleção entre classificações com base nas metodologias testadas e na classificação real, para a seleção de 10, 20 e 30% das melhores famílias. Para comparar a eficiência de ajuste dos modelos testados, foi utilizado o critério de Informação de Akaike (AIC), baseado em verossimilhança. 
A análise Espacial apresentou estimativas de variância residual muito próxima da variância residual simulada e maior acurácia seletiva estimada em todos os cenários, indicando maior precisão experimental. Com a redução na acurácia seletiva e aumento na dependência espacial, observou-se maior influência do tipo de análise sobre a classificação das famílias, sendo que a análise espacial apresentou os melhores resultados, proporcionando seleção mais eficiente das famílias do feijoeiro do que as análises tradicionais em Látice e em Blocos casualizados, principalmente, para seleção de menor número de famílias. Os resultados para acurácia seletiva estimada em função da estatística F foram muito próximos aos obtidos para a correlação de Spearman entre médias estimadas e simuladas para as famílias, indicando que a acurácia seletiva deve ser utilizada como medida de precisão experimental nos ensaios de avaliação genética.
Keyser, Helane. "Ondersoek na 'n model vir die opleiding van leksikograwe vir verklarende woordeboeke." Thesis, Stellenbosch : Stellenbosch University, 2003. http://hdl.handle.net/10019.1/53539.
Full textENGLISH ABSTRACT: In the past work on theoretical lexicography was often done simply "incidentally", usually flowing from research in pure linguistics. In the course of time theoretical lexicography and the compilation of dictionaries developed into a fully fledged discipline. The successful lexicographer requires particular characteristics and skills of a high quality. It appears from discussions with editors of dictionaries that even the best academic training does not necessarily produce an excellent lexicographer, as the compilation of dictionaries is a process that requires the skilful integration of theory, practice and language intuition. This study has tried to identify the knowledge, characteristics and skills that are necessary in order to train prospective lexicographers. An attempt has been made, by way of a selective literature review and conducting interviews with people engaged in the practice of lexicography, to design a model for the training of lexicographers. The study has shown that the training of lexicographers should not focus only on the theoretical aspects, but that in-service training or experience with a publisher of dictionaries is very important. A proposal is made on the subjects that should receive attention in the training of lexicographers. As part of the training model every prospective lexicographer should receive grounding in the development of management skills. A model for the compilation of dictionaries should emphasise the importance of a well thought-through conceptualisation of a plan for a dictionary as part of the lexicographical process. Lexicographers should be trained in the lexicographical process, from the collection of data to the preparation of the dictionary for publication. The purpose of the dictionary should also be identified as part of the planning phase. 
Prospective lexicographers must, among other things, build up an extensive knowledge of metalexicography, the general theory of lexicography, the management and planning of a dictionary project, the lexicographical process (from the collection of data to the publication and evaluation of dictionaries), aspects regarding the theories of semantics, morphology, phonology and syntax, and computer lexicography. Finally, a recommendation is made that certain components of lexicography training should also be undertaken at technikons rather than only at universities - this proposition should be researched further.
AFRIKAANSE OPSOMMING: Werk in die teoretiese leksikografie is in die verlede dikwels bloot "toevallig" gedoen vanuit navorsing wat gewoonlik veral in die suiwer taalkunde gedoen is. Die teoretiese leksikografie en die skryf van woordeboeke het mettertyd ontwikkel tot 'n volwaardige dissipline. Sekere eienskappe en vaardighede van hoogstaande gehalte word benodig om 'n suksesvolle leksikograaf te wees. Uit gesprekke met woordeboekredakteurs blyk dit duidelik dat selfs die beste akademiese opleiding nie noodwendig 'n voortreflike leksikograaf saloplewer nie, aangesien die skryf van 'n woordeboek 'n proses is waardeur teorie, praktyk en taalintuïsie vaardig ineengevleg moet word. Hierdie studie het gepoog om die kennis, eienskappe en vaardighede te identifiseer wat nodig is om 'n voornemende leksikograaf te onderrig. Deur middel van 'n selektiewe literatuurstudie en persoonlike onderhoude met persone in die praktyk, is gepoog om 'n model daar te stel vir die opleiding van leksikograwe. Die studie het getoon dat leksikograwe se opleiding nie net op die teoretiese aspekte mag fokus nie, maar dat indiensopleiding of ondervinding by 'n woordeboekuitgewer van groot belang is. 'n Voorstel word gemaak oor die onderwerpe wat aandag behoort te geniet in die opleiding van 'n leksikograaf. As deel van die opleidingsmodel behoort elke voornemende leksikograaf onderlê te word in die ontwikkeling van bestuursvaardighede. 'n Model vir die samestelling van woordeboeke behoort klem te lê op die belangrikheid van 'n weldeurdagte woordeboekkonseptualiseringsplan as deel van die leksikografiese proses. Leksikograwe behoort in die leksikografiese proses onderrig te word, vanaf die verkryging van data tot die voorbereiding vir publikasie van die woordeboek. Die doel van die woordeboek moet ook geïdentifiseer word as deel van die beplanningsfase. 
Voornemende leksikograwe moet onder andere 'n deeglike kennis opdoen van die metaleksikografie, algemene leksikografieteorie, bestuur en beplanning van 'n woordeboekprojek, die leksikografiese proses (van die verkryging van data tot die uitgee van die woordeboek en die evaluering daarvan), aspekte van die teorie van semantiek, morfologie, fonologie en sintaksis en rekenaarleksikografie. Ten slotte word 'n aanbeveling gemaak dat bepaalde komponente van leksikografie-opleiding dalk ook by technikons en nie net by universiteite hoort nie - hierdie opmerking behoort verder nagevors te word.
Mitchell, Cassie S. "Viewpoint aggregation via relational modeling and analysis." Diss., Atlanta, Ga. : Georgia Institute of Technology, 2009. http://hdl.handle.net/1853/28213.
Full textCommittee Chair: Lee, Robert; Committee Member: Kemp, Melissa; Committee Member: Prinz, Astrid; Committee Member: Ting, Lena; Committee Member: Wiesenfeld, Kurt.
Adhikari, Bhisma. "Intelligent Simulink Modeling Assistance via Model Clones and Machine Learning." Miami University / OhioLINK, 2021. http://rave.ohiolink.edu/etdc/view?acc_num=miami1627040347560589.
Full textFjärdsjö, Johnny, and Zada Nasir Muhabatt. "Kvalitetssäkrad arbetsprocess vid 3D-modellering av byggnader : Baserat på underlag från ritning och 3D-laserskanning." Thesis, KTH, Byggteknik och design, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-148822.
Full textBefore the 1980s, hand-drawn construction drawings were the only basis for development, rebuilding, sales and real estate management. The challenge, however, was to preserve the drawings and maintain their real condition. To make the work faster and easier, advanced drawing software (CAD) was introduced, replacing traditional hand-drawn designs. Today, CAD is used broadly for all new constructions with a great success rate, and with the new advanced technology many engineers and construction companies favor 3D models over 2D drawings. The major advantage of designing in 3D is that a virtual model of the entire building is created, giving better control of the input construction items, so errors can be detected at earlier stages than at the construction site. Buildings are modified in a virtual model in three dimensions already at the first stage, and the model is gradually filled with more relevant information throughout the building's life cycle to obtain a complete information model. One of the requirements from property owners in redevelopment and management is accurate information and updated drawings, and it should be simple for the contractor to read the drawings. This report describes streamlined work processes, methods, tools and applications for the production of 3D models. The work is intended to lead to a methodology, to be used for passing on experience, and to serve as a basis for describing the approach of producing 3D models from older drawings. The method description will simplify the understanding of the model both for property owners and for companies that create 3D models, and will also increase the quality of the work of creating CAD models from the different data used for modeling.
Tang, On-yee, and 鄧安怡. "Estimation for generalized linear mixed model via multiple imputations." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2005. http://hub.hku.hk/bib/B30687652.
Full textBester, Suzanne Elizabeth. "Die ontwerp van 'n postmodernistiese model vir beroepsvoorligting." Pretoria : [s.n.], 1999. http://upetd.up.ac.za/thesis/available/etd-11062006-150213.
Full textLACERDA, Estefane George Macedo de. "Model Selection of RBF Networks Via Genetic Algorithms." Universidade Federal de Pernambuco, 2003. https://repositorio.ufpe.br/handle/123456789/1845.
Full textOne of the main obstacles to the large-scale use of neural networks is the difficulty of defining values for their adjustable parameters. This work discusses how Radial Basis Function networks (RBF networks) can have their adjustable parameters defined by genetic algorithms (GAs). To this end, a comprehensive overview of the problems involved and of the different approaches used to genetically optimize RBF networks is first presented. A genetic algorithm for RBF networks with a non-redundant genetic encoding based on clustering methods is also proposed. The work then addresses the problem of finding the adjustable parameters of a learning algorithm via GAs, also known as the model selection problem. Model selection techniques (e.g., cross-validation and bootstrap) are used as the GA's objective functions. The GA is adapted to this problem through heuristics such as Occam's razor and growing, among others. Some modifications exploit features of the GA, such as its ability to solve multiobjective optimization problems and to handle noisy objective functions. Experiments on a benchmark problem are carried out, and the results obtained with the proposed GA are compared with those of other approaches. The proposed techniques are generic and can also be applied to a wide range of learning algorithms.
Bernard-Stoecklin, Sibylle. "Role of semen infected leukocytes in HIV mucosal transmission : Experimental model of SIVmac251 infection in Macaca fascicularis." Thesis, Paris 11, 2013. http://www.theses.fr/2013PA114818/document.
Full textHuman Immunodeficiency Virus (HIV) infection spreads mostly by the mucosal route: sexual transmission is the dominant mode of transmission, responsible for between 85% and 90% of cases of infection worldwide. These epidemiological data indicate that semen is one of the major sources of HIV-1 transmission. Semen, like other bodily secretions involved in HIV sexual transmission, contains the virus in two forms: cell-free viral particles and cell-associated virus, mostly in infected leukocytes. Although cell-to-cell HIV transmission has been extensively described as more efficient, rapid and resistant to host immune responses, very few studies have investigated the role of infected leukocytes in virus mucosal transmission in vivo. One such study was recently conducted in our lab and demonstrated that SIV-infected splenocytes are able to transmit infection to female macaques after vaginal exposure. However, all these studies used immune cells from peripheral blood or lymphoid tissues, such as spleen, and none has investigated the capacity of infected leukocytes in semen to transmit the infection in vivo. Indeed, the nature, phenotype and infectivity of HIV associated with semen leukocytes may differ from those of HIV from other sources. Therefore, the objectives of this work are, first, to study semen leukocytes and their dynamics during SIVmac251 infection in detail, then to investigate seminal factors that may influence semen infectiousness, and finally to test semen leukocyte infectivity in vitro and in vivo, using a model of mucosal exposure in cynomolgus macaques. Macaque semen contains all the target cells for HIV/SIV: CD4+ T cells, macrophages and, in lower proportions, dendritic cells. Semen CD4+ T cells and macrophages display an activation, differentiation and migration-marker expression profile typical of mucosal leukocytes. SIV infection induces significant changes in their phenotype and dynamics. 
Both cell types can be productively infected and are found in the semen at all stages of infection. These observations suggest that semen CD4+ T cells and macrophages may be able to transmit infection after mucosal exposure. If the role of semen infected leukocytes in HIV/SIV mucosal transmission is confirmed in vivo, this mechanism will be important to consider in the design of future preventive strategies, such as microbicides.
Brenkman, Albertus Marinus Jan. "'n Konseptuele model vir 'n bestuursinligtingstelsel vir 'n streekkantoor / Albertus Marinus Jan Brenkman." Thesis, Potchefstroom University for Christian Higher Education, 1989. http://hdl.handle.net/10394/8980.
Full textThesis (MEd)--PU vir CHO, 1990
Pereira, Felipe Rateiro. "Investigação das vibrações induzidas pela emissão de vórtices em modelos reduzidos de riser lançados em catenária." Universidade de São Paulo, 2014. http://www.teses.usp.br/teses/disponiveis/3/3135/tde-04022016-183409/.
Full textThe study of vortex-induced vibrations (VIV) in risers is a topic of great relevance in the context of ocean engineering. In recent years, several numerical and experimental results have been presented addressing the fundamentals of this fluid-structure interaction phenomenon, yet without a conclusive and comprehensive understanding. In this context, risers are exposed to the action of ocean currents and consequently to the VIV phenomenon. The present study therefore provides results of an experimental approach to understanding VIV in risers launched in a catenary configuration, which represents many of the geometries adopted today. Performing riser experiments in towing tanks, however, presupposes the use of small-scale models, which represents a major research challenge. A methodology for modeling reduced-scale risers was therefore developed, focused on the similarity of certain important parameters that guarantee an equivalent full-scale dynamic behavior. VIV experiments were then performed in the towing tank with the catenary designed and constructed on the basis of this methodology. Techniques such as modal decomposition were used in the analysis of the results, allowing comparison with classical literature paradigms and showing that the VIV response of the catenary is close to results obtained with flexible cylinders. Furthermore, the large experimental database presented and discussed in the study may contribute to the future validation of several mathematical models and numerical codes for VIV prediction in catenary risers, detailing several behaviors involved in this important engineering problem.
Benghabrit, Walid. "A formal model for accountability." Thesis, Ecole nationale supérieure Mines-Télécom Atlantique Bretagne Pays de la Loire, 2017. http://www.theses.fr/2017IMTA0043/document.
Full textNowadays we are witnessing the democratization of cloud services. As a result, more and more end-users (individuals and businesses) are using these services in their daily life. In such scenarios, personal data generally flows between several entities. End-users need to be aware of the management, processing, storage and retention of personal data, and to have the necessary means to hold service providers accountable for the use of their data. In this thesis we present an accountability framework called Accountability Laboratory (AccLab) that allows accountability to be considered from design time to implementation time of a system. In order to reconcile the legal world and the computer science world, we developed a language called Abstract Accountability Language (AAL) that allows one to write obligations and accountability policies. This language is based on a formal logic called First Order Linear Temporal Logic (FOTL), which allows checking the coherence of accountability policies and the compliance between two policies. These policies are translated into a temporal logic called FO-DTL 3, which is associated with a monitoring technique based on formula rewriting. Finally, we developed a monitoring tool called Accountability Monitoring (AccMon) which provides means to monitor accountability policies in the context of a real system. These policies are based on FO-DTL 3 logic, and the framework can act in both centralized and distributed modes and can run in on-line and off-line modes.
Adhikari, Kiran. "Verifying a Quantitative Relaxation of Linearizability via Refinement." Thesis, Virginia Tech, 2013. http://hdl.handle.net/10919/23222.
Full textof the implementation model of a concurrent data structure. The method is based on checking the refinement relation between the implementation model and a specification model via explicit state model checking. It can directly handle multi-threaded programs where each thread can make infinitely many method calls, without requiring the user to manually annotate for the linearization points. We have implemented and evaluated our method in the PAT model checking toolkit. Our experiments show that the method is effective in verifying quasi linearizability and in detecting its violations.
Master of Science
Nunes, Fábio Magalhães. "Análise da correlação entre o Ibovespa e o ativo petr4 : estimação via modelos Garch e modelos aditivos." reponame:Biblioteca Digital de Teses e Dissertações da UFRGS, 2009. http://hdl.handle.net/10183/22664.
Full textVolatility estimation and forecasting are very important matters for the financial markets. Themes like risk and uncertainty in modern economic theory have encouraged the search for methods that allow for the modeling of time-varying variances. The main objective of this dissertation was to estimate the volatility of the IBOVESPA index and the PETR4 asset through GARCH models and additive models, and to analyze the existence of correlation between the estimated volatilities. We use EGARCH models for the parametric estimation and additive models for the nonparametric estimation.
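The time-varying variance idea behind the GARCH family can be sketched with the plain GARCH(1,1) recursion (the dissertation fits the EGARCH variant, which models log-variance with asymmetry; this sketch keeps only the simpler symmetric form). The return series and parameter values below are illustrative, not estimates from the thesis.

```python
# Minimal GARCH(1,1) sketch:
#   sigma2[t] = omega + alpha * r[t-1]**2 + beta * sigma2[t-1]
# with the recursion started at the unconditional variance.
import math

def garch11_variances(returns, omega, alpha, beta):
    """Conditional variance path implied by GARCH(1,1)."""
    sigma2 = [omega / (1.0 - alpha - beta)]   # unconditional variance
    for r in returns[:-1]:
        sigma2.append(omega + alpha * r * r + beta * sigma2[-1])
    return sigma2

def gaussian_loglik(returns, sigma2):
    """Gaussian log-likelihood of the returns given the variance path
    (the objective one would maximize over omega, alpha, beta)."""
    return sum(-0.5 * (math.log(2 * math.pi * s) + r * r / s)
               for r, s in zip(returns, sigma2))

rets = [0.01, -0.02, 0.015, -0.005, 0.03]     # illustrative daily returns
s2 = garch11_variances(rets, omega=1e-5, alpha=0.1, beta=0.85)
print([round(v, 6) for v in s2])
```

In practice the parameters are found by maximizing `gaussian_loglik` numerically over (omega, alpha, beta) subject to alpha + beta < 1; EGARCH replaces the recursion with one on log(sigma2) that reacts asymmetrically to negative returns.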
Berveglieri, Adilson [UNESP]. "Classificação fuzzy de vertentes por krigagem e TPS com agregação de regiões via diagrama de Voronoi." Universidade Estadual Paulista (UNESP), 2011. http://hdl.handle.net/11449/88156.
Full textConselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Slopes, as inclined surfaces, are geomorphological expressions shaped by natural factors (endogenous and exogenous) and by man himself. Their shapes determine the flow or accumulation of water and represent fundamental characteristics for preventing and resolving problems associated with relief, such as land use and civil construction, among others. Classifying slopes as concave, convex or straight makes it possible to identify areas according to their declivity. Thus, from a regular rectangular grid, the basis of the digital terrain model, a mesh is interpolated by estimator functions: thin-plate spline, which has smoothing characteristics, and kriging, which besides smoothness also considers spatial dependence. The classification is then carried out by fuzzy inference based on membership functions that define classes from the calculated slope and the concavity or convexity of the terrain. However, the result of this classification is tied to the resolution of the mesh and does not allow any local correction, since small areas of little significance may be formed and need to be eliminated. To adjust the result, the Voronoi diagram, characterized by its coverage and proximity relations, is applied as a tool to aggregate previously classified regions, allowing a local adjustment and making the result more consistent with the study area when compared to the corresponding geomorphological maps.
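The fuzzy-inference step can be illustrated in one dimension: the second derivative (profile curvature) of the interpolated surface drives membership functions for the concave, straight and convex classes. The threshold `eps`, the triangular membership shapes and the sign convention below are hypothetical choices for illustration, not the thesis's actual functions.

```python
def curvature(z, h=1.0):
    """Second-order central differences along a 1-D elevation profile z
    sampled at spacing h (interior points only)."""
    return [(z[i-1] - 2*z[i] + z[i+1]) / (h*h) for i in range(1, len(z)-1)]

def memberships(c, eps=0.05):
    """Fuzzy membership degrees for curvature c.
    Convention here: positive curvature -> concave, negative -> convex,
    near-zero -> straight, with a linear transition of width eps."""
    straight = max(0.0, 1.0 - abs(c) / eps)
    concave = min(1.0, max(0.0, c / eps))
    convex = min(1.0, max(0.0, -c / eps))
    return {"concave": concave, "straight": straight, "convex": convex}

def classify(z):
    """Defuzzify by taking the class with the largest membership degree."""
    return [max(m, key=m.get) for m in map(memberships, curvature(z))]
```

On a parabolic profile such as z = x² every interior point has positive curvature and is labeled concave; a linear ramp has zero curvature everywhere and is labeled straight. In the thesis the same idea is applied on a 2-D grid, and the Voronoi-based aggregation then merges the resulting small regions.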
Ventrella, Domenico <1987>. "The Piglet as Biomedical Model: Physiological Investigations, New Techniques and Future Applications." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2017. http://amsdottorato.unibo.it/7852/1/ventrella_domenico_tesi.pdf.
Attila, Can. "Identification of Pseudomonas aeruginosa via a poplar tree model." College Station, Tex.: Texas A&M University, 2008. http://hdl.handle.net/1969.1/ETD-TAMU-2326.
Tang, On-yee. "Estimation for generalized linear mixed model via multiple imputations." Click to view the E-thesis via HKUTO, 2005. http://sunzi.lib.hku.hk/hkuto/record/B30687652.
Lu, Qunfang Flora. "Bayesian forecasting of stock prices via the Ohlson model." Link to electronic thesis, 2005. http://www.wpi.edu/Pubs/ETD/Available/etd-050605-155155/.
Sakizlis, Vassilis. "Design of model-based controllers via parametric programming." Thesis, Imperial College London, 2003. http://hdl.handle.net/10044/1/8855.
Yoo, Dae Keun. "Model based wheel slip control via constrained optimal algorithm." RMIT University. Electrical and Computer Engineering, 2006. http://adt.lib.rmit.edu.au/adt/public/adt-VIT20070125.163142.